Cerebras Systems Launches AI Development Tool
By Max A. Cherney
SAN FRANCISCO (Reuters) – Cerebras Systems launched a tool for AI developers that enables access to its large chips, offering a more affordable alternative to Nvidia processors.
Accessing Nvidia GPUs through cloud providers can be costly and challenging for running large AI models like OpenAI’s ChatGPT, a process known as inference.
Cerebras CEO Andrew Feldman stated, “We’re delivering performance that cannot be achieved by a GPU. We’re doing it at the highest accuracy, and we’re offering it at the lowest price.”
The inference market for AI is expected to grow rapidly, potentially worth billions as AI tools gain traction among consumers and businesses.
Cerebras will provide various inference products via a developer key and its cloud services, while also selling AI systems for those who want on-premises data centers.
Cerebras’ chips, called Wafer Scale Engines, are the size of an entire wafer and can hold a model’s data on a single chip, avoiding the overhead of linking many smaller processors and delivering faster performance.
The company plans to charge as little as 10 cents per million tokens, a unit used to measure the output of large language models.
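To put that price in perspective, per-token billing is simple arithmetic. The sketch below uses the 10-cents-per-million-tokens floor quoted above; the token counts are illustrative assumptions, not Cerebras pricing tiers.

```python
# Minimal sketch of per-token inference billing, assuming a flat rate.
# The $0.10 per million tokens figure is the floor price quoted in the
# article; the example token count below is purely illustrative.

def inference_cost_usd(tokens: int, usd_per_million_tokens: float = 0.10) -> float:
    """Return the dollar cost of generating `tokens` output tokens."""
    return tokens / 1_000_000 * usd_per_million_tokens

# Generating 5 million output tokens at the quoted floor price:
print(inference_cost_usd(5_000_000))  # 0.5 (fifty cents)
```

At that rate, even heavy workloads of billions of tokens would cost on the order of hundreds of dollars, which is the kind of economics the company is pitching against GPU cloud pricing.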
Cerebras is also preparing to go public, having filed a confidential prospectus with the SEC this month.