Stability AI Unveils New Small Language Model Stable LM 2 1.6B

Keywords: Stability AI, language model, multilingual training

Renowned artificial intelligence company Stability AI has launched a new small language model, Stable LM 2 1.6B. The 1.6-billion-parameter model is reportedly trained on multilingual data covering English, Spanish, German, Italian, French, Portuguese, and Dutch.

According to Stability AI, Stable LM 2 1.6B performs well on most benchmarks, outperforming other small language models with fewer than 2 billion parameters, such as TinyLlama 1.1B and Falcon 1B, and even Microsoft's larger Phi-2 (2.7B). This result marks a notable step forward for small-model development.

As a small language model, Stable LM 2 1.6B delivers strong performance while keeping compute requirements low, making it practical for enterprises and developers working with limited resources. Its multilingual training also gives it an advantage when handling data in multiple languages, promising better support for multilingual applications worldwide.
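The resource claim above can be made concrete with a back-of-envelope memory estimate. This is only a rough sketch of the weight footprint at different numeric precisions; real memory usage also depends on activations, the KV cache, and framework overhead:

```python
def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory (in GB) needed just to hold the model weights."""
    return n_params * bytes_per_param / 1e9

N = 1.6e9  # Stable LM 2 1.6B parameter count

# Weights only, ignoring activations and KV cache:
print(f"fp32: {model_memory_gb(N, 4.0):.1f} GB")  # ~6.4 GB
print(f"fp16: {model_memory_gb(N, 2.0):.1f} GB")  # ~3.2 GB
print(f"int4: {model_memory_gb(N, 0.5):.1f} GB")  # ~0.8 GB
```

At fp16, the weights alone fit comfortably within the memory of a single consumer GPU or a recent laptop, which is what makes models of this size attractive for small teams.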
Source: https://stability.ai/news/introducing-stable-lm-2
