Nvidia announced at SIGGRAPH 2024 that it is powering a new inference-as-a-service offering from Hugging Face and enabling industrial generative AI use cases with a fresh batch of microservices. The AI computing giant introduced Nvidia NIM microservices, optimized containers with AI models that developers can integrate into their applications. The new Hugging Face service allows developers to prototype with open-source AI models and deploy them in production.
The Hugging Face inference service will run on Nvidia's DGX Cloud and use NIM microservices to deploy popular large language models. The NIM microservices are part of the Nvidia AI Enterprise software suite, available for $4,500 per GPU per year. The company aims to simplify AI application development and bring generative AI capabilities to industrial sectors such as manufacturing and robotics.
Isomorphic Labs, the AI drug discovery platform that was spun out of Google's DeepMind in 2021, has raised external capital for the first time. The $600
How to level up your teaching with AI. Discover how to use clones and GPTs in your classroom: personalized AI teaching is the future.
Trump's Third Term? AI already knows how this can be done. A study shows how OpenAI, Grok, DeepSeek & Google outline ways to dismantle U.S. democracy.
Sam Altman today revealed that OpenAI will release an open-weight artificial intelligence model in the coming months. "We are excited to release a powerful new open-weight language model with reasoning in the coming months," Altman wrote on X.