LLMs contain a LOT of parameters. But what’s a parameter?
They’re the mysterious numbers that make your favorite AI models tick. What are they and what do they do?
By Will Douglas Heaven
January 7, 2026
Photo Illustration by Sarah Rogers/MITTR | Photos Getty
MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.
I am writing this because one of my editors woke up in the middle of the night and scribbled on a bedside notepad: “What is a parameter?” Unlike a lot of thoughts that hit at 4 a.m., it’s a really good question—one that goes right to the heart of how large language models work. And I’m not just saying that because he’s my boss. (Hi, Boss!)
A large language model’s parameters are often said to be the dials and levers that control how it behaves. Think of a planet-size pinball machine that sends its balls pinging from one end to the other via billions of paddles and bumpers set just so. Tweak those settings and the balls will behave in a different way.
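To make the "dials and levers" idea a little more concrete, here is a minimal, purely illustrative sketch in Python (my own toy example, not how any real LLM is built): the same input is pushed through a handful of numeric settings, and tweaking just one of them changes what comes out the other end.

```python
# A toy "pinball path": the input passes through a few settings
# (the parameters). This only illustrates the metaphor; it is not
# a real language model.

def toy_model(x, params):
    for weight, bias in params:   # each (weight, bias) pair is one "paddle"
        x = weight * x + bias     # the ball is nudged by that paddle
    return x

settings_a = [(0.5, 1.0), (2.0, -1.0), (1.5, 0.0)]
settings_b = [(0.5, 1.0), (-2.0, -1.0), (1.5, 0.0)]  # one "paddle" tweaked

print(toy_model(2.0, settings_a))  # 4.5
print(toy_model(2.0, settings_b))  # -7.5: same input, different behavior
```

A real model works the same way in spirit, except that it has billions of such settings and they are learned from data rather than set by hand.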
OpenAI’s GPT-3, released in 2020, had 175 billion parameters. Google DeepMind’s latest LLM, Gemini 3, may have at least a trillion—some think it’s probably more like 7 trillion—but the company isn’t saying. (With competition now fierce, AI firms no longer share information about how their models are built.)
But the basics of what parameters are and how they make LLMs do the remarkable things that they do are the same across different models. Ever wondered what makes an LLM really tick—what’s behind the colorful pinball-machine metaphors? Let’s dive in.
What is a parameter?
Think back to middle school algebra and an expression like 2a + b. Those letters are parameters: Assign them values and you get a result. In math or coding, parameters are used to set limits or determine output. The parameters inside LLMs work in a similar way, just on a mind-boggling scale.
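In code, that algebra example might look like the following (a minimal sketch of the analogy, nothing more): 2a + b written as a function whose parameters a and b determine the output.

```python
# The expression 2a + b as a function: a and b are its parameters.
# Assign them values and you get a result. An LLM is conceptually
# similar, but with billions of parameters learned during training
# rather than two chosen by hand.

def expression(a, b):
    return 2 * a + b

print(expression(a=3, b=1))    # 7
print(expression(a=10, b=-4))  # 16
```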
Editor’s Note: Read the rest of the story at the link below.
Continue/Read Original Article Here: LLMs contain a LOT of parameters. But what’s a parameter? | MIT Technology Review
