Artificial intelligence
LLMs contain a LOT of parameters. But what's a parameter?
They're the mysterious numbers that make your favorite AI models tick. What are they and what do they do?
By Will Douglas Heaven
January 7, 2026
Photo Illustration by Sarah Rogers/MITTR | Photos Getty
MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what's coming next. You can read more from the series here.
I am writing this because one of my editors woke up in the middle of the night and scribbled on a bedside notepad: "What is a parameter?" Unlike a lot of thoughts that hit at 4 a.m., it's a really good question, one that goes right to the heart of how large language models work. And I'm not just saying that because he's my boss. (Hi, Boss!)
A large language model's parameters are often said to be the dials and levers that control how it behaves. Think of a planet-size pinball machine that sends its balls pinging from one end to the other via billions of paddles and bumpers set just so. Tweak those settings and the balls will behave in a different way.
OpenAI's GPT-3, released in 2020, had 175 billion parameters. Google DeepMind's latest LLM, Gemini 3, may have at least a trillion, and some think it's probably more like 7 trillion, but the company isn't saying. (With competition now fierce, AI firms no longer share information about how their models are built.)
But the basics of what parameters are and how they make LLMs do the remarkable things they do are the same across different models. Ever wondered what makes an LLM really tick, and what's behind the colorful pinball-machine metaphors? Let's dive in.
What is a parameter?
Think back to middle-school algebra and an expression like 2a + b. Those letters are parameters: assign them values and you get a result. In math or coding, parameters are used to set limits or determine output. The parameters inside LLMs work in a similar way, just on a mind-boggling scale.
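Here's a rough sketch in Python, purely illustrative and not from the article itself, of that algebra example written as a tiny function: the letters become named parameters, and changing the values you plug in changes the result, just as tweaking an LLM's billions of parameters changes its behavior.

```python
# The algebra example 2a + b, written as a function.
# The letters a and b are the parameters: assign them values and you get a result.
def expression(a: float, b: float) -> float:
    return 2 * a + b

# Same formula, different parameter values, different output --
# like nudging the pinball machine's paddles and bumpers.
print(expression(a=3.0, b=1.0))   # 7.0
print(expression(a=3.0, b=-2.0))  # 4.0
```

An LLM works on the same principle, except that it has billions of such numbers, and they are set during training rather than by hand.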