You’ve heard a lot about LLMs, but have you heard about SLMs?
In simple terms, LLMs (Large Language Models) have tens or hundreds of billions of parameters, while SLMs (Small Language Models) typically have only a few billion or fewer.
While LLMs like GPT-4o excel at broad, general-purpose tasks, SLMs prioritize targeted performance with far lower computational demands.
They cost less to train, need fewer compute resources, and are often trained on domain-specific datasets.
And because they need so few resources compared with LLMs, they can even run on a mobile device or a laptop.
Some of the SLMs available today include Google Gemma and Microsoft Phi.
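To make that concrete, here is a minimal sketch of loading a small model on a single consumer machine. It assumes the Hugging Face transformers library and the "google/gemma-2-2b-it" checkpoint (which requires accepting the Gemma license on the Hub); any other small checkpoint would work the same way.

```python
# Minimal sketch: running a small language model locally with Hugging Face transformers.
# Assumes the "google/gemma-2-2b-it" checkpoint; swap in any small checkpoint you have access to.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-2b-it"  # ~2B parameters, small enough for a laptop
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain the difference between LLMs and SLMs in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```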
Hope this helps!!