#TinyLlama

Dr. Thompson (@rogt_x1997)
2025-05-30

📣 The LLM shift no one saw coming!
Top engineers are dropping GPT for TinyLlama—faster, cheaper, and surprisingly more effective in real-world tasks 💼⚡
Want to know why?

👇 Read the article and future-proof your GenAI strategy:
👉 medium.com/@rogt.x1997/why-sma



2025-05-27

Just trained my own language model offline.
No cloud. No APIs. Fine-tuned it on my data, merged it, and ran it with llama.cpp.
This is what real AI literacy looks like.

Documentation:
github.com/hassanhabib/AI.Llam

Video:
youtube.com/watch?v=FQr7VrK5RR

#AI #LLM #LoRA #OfflineAI #TinyLlama
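The fine-tune → merge → run loop described above can be sketched roughly as follows, assuming a Hugging Face-format base model, a LoRA adapter trained with peft, and a local llama.cpp checkout. All paths, model names, and the prompt are illustrative, not from the post:

```shell
# 1. Merge the trained LoRA adapter back into the base weights
#    (peft's merge_and_unload folds the adapter into plain weights)
python -c "
from peft import PeftModel
from transformers import AutoModelForCausalLM
base = AutoModelForCausalLM.from_pretrained('TinyLlama/TinyLlama-1.1B-Chat-v1.0')
merged = PeftModel.from_pretrained(base, './my-lora-adapter').merge_and_unload()
merged.save_pretrained('./merged-model')
"

# 2. Convert the merged model to GGUF with llama.cpp's converter script
python convert_hf_to_gguf.py ./merged-model --outfile tinyllama-ft.gguf

# 3. Optionally quantize to 4-bit so it runs comfortably on a laptop
./llama-quantize tinyllama-ft.gguf tinyllama-ft.q4_k_m.gguf Q4_K_M

# 4. Run it fully offline -- no cloud, no APIs
./llama-cli -m tinyllama-ft.q4_k_m.gguf -p "Summarize my notes"
```

The merge step is what makes the rest possible: once the adapter is folded in, the result is an ordinary checkpoint that llama.cpp's tooling can convert and serve like any other model.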

2025-05-21

I asked #tinyllama to generate me a bio for my new #mastodon account at the great #OhaiSocial instance.

In short: it is a verbose inventor of text.

Me: "write me a bio info for my social media profile where my skills are presented: my skills are software, society, rocks , hiking,"

At least it stated: Here's an example of how you might incorporate inline citations into your bio information for your social media profile...

2025-05-19

And I forgot to say: the rest is also wrong.

#expanse #ai #experiment #knowledge #tinyllama #ollama

The excuse is that TinyLlama is a VERY tiny model. llama3.2 is much smarter (but harder to test, because it already knows about The Expanse without anything being added to the "knowledge", so I need to invent something...)

2024-09-23

Here is how you can do #finetuning for a small language model that can be put on a #RaspberryPI or other edge-computing devices, or even wearables:
youtube.com/watch?v=DTYi7z4cLD

#TinyLLaMA #TinyDolphin #Ollama #AIonEdge #MachineLearning #AIModels #EdgeComputing #AI #LLM
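One reason LoRA-style fine-tuning suits edge devices is that only two small low-rank matrices are trained, and they can be folded back into the original weights afterward, so inference costs nothing extra. A minimal numpy sketch of that merge step (all sizes and names here are illustrative, not taken from the video):

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 64, 64, 4, 8      # r << d: the low-rank bottleneck
W = rng.normal(size=(d_out, d_in))        # frozen pretrained weight
A = rng.normal(size=(r, d_in))            # LoRA "down" projection
B = rng.normal(size=(d_out, r)) * 0.01    # LoRA "up" projection (zero-init in
                                          # real LoRA; pretend-trained here)

# During training the layer computes W @ x + (alpha/r) * B @ (A @ x).
# After training the adapter merges into one plain weight matrix:
W_merged = W + (alpha / r) * (B @ A)

x = rng.normal(size=(d_in,))
adapter_out = W @ x + (alpha / r) * (B @ (A @ x))
assert np.allclose(W_merged @ x, adapter_out)  # merged == base + adapter

# Trainable parameters shrink from d_out*d_in to r*(d_in + d_out):
print(W.size, A.size + B.size)  # 4096 vs 512
```

Because the merged result is just an ordinary weight matrix, the tiny device only ever stores and runs the base-sized model; the adapter overhead exists at training time alone.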

AI Daily | AI News & Videos (@aidaily)
2023-09-05

TinyLlama 1.1B: NEW LLAMA Model Size on 3 Trillion Tokens (Installation Tutorial)
Explore the future of language modeling with TinyLlama 🦙🌐! Unveiling a game-changing project with a colossal dataset of 3 trillion tokens, pushing AI boundaries! 🚀🤖
tweetclick.com/CnuQrhr5VM8

GripNews (@GripNews)
2023-09-04

🌗 GitHub - jzhang38/TinyLlama
➤ The TinyLlama project's features and applications, plus its training details and speed comparisons.
github.com/jzhang38/TinyLlama
This article introduces the TinyLlama project, which aims to pretrain a 1.1B Llama model on 3 trillion tokens. TinyLlama can be used in many open-source projects built on Llama. In addition, TinyLlama is compact, with only 1.1B parameters, making it suitable for many applications with limited compute and memory footprints.
+ This is a very useful project, especially for applications that need a compact and efficient language model.
+ The project trains very quickly and can be used in many open-source projects, which is a very nice feature.