At Computex 2024, NVIDIA CEO Jensen Huang unveiled NVIDIA NIM, a set of inference microservices that package optimized models as containers for deployment on clouds, in data centers, or on workstations.

This enables the world’s 28 million developers to quickly build generative AI applications such as copilots and chatbots, cutting development time from weeks to minutes.

NIM simplifies the creation of complex generative AI applications, which often require multiple models for generating text, images, video, and speech. It boosts developer productivity and helps enterprises maximize infrastructure efficiency, allowing them to generate up to three times as many generative AI tokens on the same accelerated infrastructure.

Nearly 200 partners, including Cadence, Cloudera, and Hugging Face, are integrating NIM to expedite AI deployments across various domains. Huang emphasized that NIM democratizes generative AI, making it accessible to all enterprises, even those without dedicated AI teams.

NIM is available through the NVIDIA AI Enterprise platform, with free access for NVIDIA Developer Program members starting next month. Over 40 models, including Meta Llama 3 and Microsoft Phi-3, are available as NIM endpoints, making it easy for developers to deploy these models using platforms like Hugging Face.
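
As an illustration of what working with these endpoints looks like, the sketch below calls a hosted NIM model through an OpenAI-compatible chat completions interface. The base URL, model name, and environment variable are assumptions for illustration; check the model's page in the API catalog for the exact values.

```python
# Minimal sketch: calling a hosted NIM endpoint via its OpenAI-compatible
# interface. The base URL, model name, and NVIDIA_API_KEY variable are
# illustrative assumptions, not authoritative values.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # assumed hosted NIM base URL
    api_key=os.environ["NVIDIA_API_KEY"],            # assumed env var holding your API key
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",                 # one of the models offered as a NIM endpoint
    messages=[{"role": "user", "content": "Summarize NVIDIA NIM in one sentence."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```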

Enterprises are using NIM for diverse applications such as text, image, video, and speech generation, as well as digital biology. Healthcare companies leverage NIM for tasks like surgical planning and drug discovery. NVIDIA ACE NIM microservices enable the creation of interactive digital humans for customer service and other applications.

Major AI platform providers and tools, including Amazon SageMaker and Microsoft Azure AI, support NIM. Global system integrators and service providers, such as Accenture and Deloitte, offer NIM competencies to assist enterprises in deploying AI strategies.

Leading companies like Foxconn, Pegatron, Amdocs, Lowe’s, ServiceNow, and Siemens are using NIM for generative AI applications across various industries, including manufacturing, healthcare, and customer service.

NIM microservices can be deployed on NVIDIA-Certified Systems and major cloud platforms, with experimental access available at ai.nvidia.com.
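
For a self-hosted deployment, a NIM microservice typically exposes the same OpenAI-compatible API on the machine running it. The sketch below assumes a NIM container serving on localhost port 8000 with the model name shown; adjust both to match your actual deployment.

```python
# Minimal sketch: querying a NIM microservice deployed locally, e.g. on an
# NVIDIA-Certified System. The port, path, and model name are assumptions
# based on the OpenAI-compatible interface NIM containers expose.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",     # assumed local NIM endpoint
    json={
        "model": "meta/llama3-8b-instruct",          # assumed model served by the container
        "messages": [{"role": "user", "content": "Hello from a self-hosted NIM!"}],
        "max_tokens": 64,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```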
