AI Crash Course: Tokens, Prediction and Temperature


via Dev.to, by Kathryn Grayson Nanz

Read the first blog in this series: AI, ML, LLM, and More

We often describe AI models as “thinking,” but what is actually happening when an AI model “thinks”? When it’s drafting a response to us, how does it know what to say?

One of the most tempting (and common) misunderstandings about AI models is the perception that they “think” (or have awareness of any kind, for that matter). This is primarily a language problem: we humans like to use the words and experiences we’re most familiar with as shorthand for complex ideas. After all, how many times have you seen a webpage loading slowly and heard someone say, “hang on, it’s thinking about it”? We “wake” computers up from “sleep” mode, we initiate network “handshakes,” and we get annoyed with memory-“hungry” programs. In the same way, we often describe AI models as “thinking,” sometimes even including the directive to “take as much time as you need to think about this” when prompting them! But what is actually happening?
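To give a flavor of the “tokens, prediction and temperature” in the title: rather than thinking, a language model assigns a score to every possible next token and samples from the resulting probability distribution, with temperature controlling how sharp or flat that distribution is. Here is a minimal sketch, assuming a toy three-token vocabulary and made-up scores (real models score tens of thousands of tokens):

```python
import math
import random

def softmax_with_temperature(logits, temperature=1.0):
    """Turn raw token scores into probabilities.

    Lower temperature sharpens the distribution (the top token almost
    always wins); higher temperature flattens it (more varied output).
    """
    scaled = [score / temperature for score in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token scores for illustration only
tokens = ["cat", "dog", "bird"]
logits = [2.0, 1.0, 0.5]

cold = softmax_with_temperature(logits, temperature=0.2)
hot = softmax_with_temperature(logits, temperature=2.0)

# At low temperature the top-scoring token dominates; at high
# temperature the probabilities move toward uniform, so sampling
# produces more varied (and occasionally stranger) continuations.
next_token = random.choices(tokens, weights=hot)[0]
```

No awareness is involved at any point: the model is repeatedly doing this score-and-sample step, one token at a time.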

Continue reading on Dev.to


