
Textbooks, Not the Internet, Trained This Powerful AI
via Hackernoon · Microsoft
phi-1.5 is a 1.3-billion-parameter Transformer trained primarily on synthetic, textbook-quality data. Despite its small size, it matches or beats models many times larger on benchmarks for commonsense reasoning, grade-school math, and coding. The results suggest that data quality, not scale alone, drives reasoning ability in large language models.
Continue reading on Hackernoon


