Stability AI yesterday released an announcement introducing its Stable LM 3B language model, which it says is suitable for mobile devices and can bring a sustainable, high-performance experience to such hardware.
▲ Image source: Stability AI
The Stable LM 3B model has 3 billion parameters and focuses on text generation. It is an autoregressive model based on the Transformer decoder architecture and was trained on several large-scale open-source datasets.
▲ Image source: Stability AI
The 3-billion-parameter model was trained on 256 NVIDIA A100 40GB GPUs. Although it has fewer parameters than comparable large models, its performance remains impressive, and its smaller size and lower power consumption make it better suited to mobile platforms.
In addition, the model is compatible with multiple platforms and can be fine-tuned for specific needs. It has been open-sourced on the Hugging Face platform, making it easy for developers to use and improve; interested readers can find more details there.
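Since the model is distributed through Hugging Face, a minimal sketch of loading it with the `transformers` library might look like the following. The repository id `stabilityai/stablelm-3b-4e1t` and the generation settings are assumptions for illustration, not details from the announcement, and running the function downloads a multi-gigabyte checkpoint.

```python
# Hypothetical sketch: loading a Stable LM 3B checkpoint from Hugging Face
# with the `transformers` library. The repo id is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer


def generate(prompt: str, model_id: str = "stabilityai/stablelm-3b-4e1t") -> str:
    """Download the checkpoint (several GB) and generate a short completion."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Autoregressive decoding: the Transformer decoder predicts one token at a time.
    outputs = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Small language models are useful on mobile devices because"))
```

The same `from_pretrained` pattern also supports fine-tuning workflows, which is how the announcement suggests developers would adapt the model to specific needs.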