**Headline: Large Language Model Inference Revolution: Groq’s LPU Leads Wave of AI Startups**
**Keywords:** Large language models, AI chips, startups
**Body:**
Groq CEO: Majority of AI Startups Will Adopt Faster LPUs by 2024
According to Wall Street CN, Groq, a Silicon Valley-based startup, is developing a novel AI chip (LPU) for large language model inference (making decisions or predictions from existing models, as opposed to training them). In a recent interview, Groq founder and CEO Jonathan Ross said that the majority of AI startups will be using faster LPUs by 2024.
Ross demonstrated an audio chatbot powered by Groq, which responded with record-breaking speed. He said that AI inference is expensive, so Groq provides a “blazing fast” and cheaper chip option specifically for large models.
“By the end of this year, we will likely be the infrastructure that most startups are using, and our pricing is very startup-friendly,” Ross claimed.
Groq’s LPU aims to address the speed and cost challenges in large language model inference. These models require immense computational power, making the inference process expensive and time-consuming. Groq’s LPU, by being specifically optimized for inference tasks, can significantly improve speed and reduce costs.
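To illustrate why inference throughput matters so much for both latency and cost, the rough sketch below converts a token-generation rate into per-reply numbers. All figures (tokens per second, price per million tokens) are hypothetical placeholders for comparison, not Groq benchmarks:

```python
def inference_stats(tokens: int, tokens_per_sec: float,
                    usd_per_million_tokens: float) -> tuple[float, float]:
    """Estimate latency (seconds) and cost (USD) to generate `tokens` output tokens."""
    latency = tokens / tokens_per_sec
    cost = tokens / 1_000_000 * usd_per_million_tokens
    return latency, cost

# Hypothetical comparison for a 500-token chatbot reply:
# a conventional GPU backend (~50 tok/s) vs. a faster inference chip (~500 tok/s).
gpu_latency, gpu_cost = inference_stats(500, 50, 1.00)
fast_latency, fast_cost = inference_stats(500, 500, 0.50)
print(f"GPU backend:  {gpu_latency:.1f} s, ${gpu_cost:.5f}")
print(f"Fast backend: {fast_latency:.1f} s, ${fast_cost:.5f}")
```

Under these made-up numbers, a 10x throughput gain turns a 10-second reply into a 1-second one, which is the kind of interactive-latency difference the article's chatbot demo highlights.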
Ross said that Groq’s LPU has garnered interest from numerous AI startups looking for faster and more affordable inference solutions. He believes that the demand for faster LPUs will only grow as AI technology advances.
Analysts suggest that Groq’s LPU is poised to offer an attractive option for AI startups, helping them reduce inference costs and improve model performance. If Groq can deliver on its promises, it could establish a notable presence in the AI chip market.
【来源】https://wallstreetcn.com/articles/3709133