Every second, businesses worldwide are making critical decisions. A logistics company decides which trucks to send where. A retailer figures out how to stock its shelves. An airline scrambles to reroute flights after a storm. These aren’t just routing choices — they’re high-stakes puzzles with millions of variables, and getting them wrong costs money and…
Scientists and engineers of all kinds can solve tough problems far faster with NVIDIA CUDA-X libraries powered by NVIDIA GB200 and GH200 superchips. As announced today at the NVIDIA GTC global AI conference, developers can now take advantage of tighter automatic integration and coordination between CPU and GPU resources — enabled by CUDA-X…
NVIDIA today unveiled partnerships with industry leaders T-Mobile, MITRE, Cisco, ODC, a portfolio company of Cerberus Capital Management, and Booz Allen Hamilton on the research and development of AI-native wireless network hardware, software and architecture for 6G.
General Motors and NVIDIA today announced they are collaborating on next-generation vehicles, factories and robots using AI, simulation and accelerated computing.
Physical AI is unlocking new possibilities at the intersection of autonomy and robotics — accelerating, in particular, the development of autonomous vehicles (AVs). The right technology and frameworks are crucial to ensuring the safety of drivers, passengers and pedestrians. That’s why NVIDIA today announced NVIDIA Halos — a comprehensive safety system bringing together NVIDIA’s lineup…
Qubits are inherently sensitive to noise, and even the most robust qubits are expected to exhibit noise levels orders of magnitude above what’s required for practical quantum applications. This noise problem is addressed with quantum error correction (QEC), a collection of techniques that identify and eliminate errors in a controlled way, so long as qubits can be…
Scikit-learn, the most widely used ML library, is popular for processing tabular data because of its simple API, diversity of algorithms, and compatibility with popular Python libraries such as pandas and NumPy. NVIDIA cuML now lets data scientists and machine learning engineers keep using familiar scikit-learn APIs and Python libraries while harnessing the power of CUDA on…
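Because cuML mirrors the scikit-learn API, ordinary scikit-learn code like the sketch below can, per the announcement, be GPU-accelerated without changes (for example via cuML's `cuml.accel` zero-code-change mode, where available). This is a minimal illustration using standard scikit-learn only; the dataset and hyperparameters are arbitrary choices for the example, not anything prescribed by cuML.

```python
# Plain scikit-learn workflow on synthetic tabular data.
# With cuML installed, running the same script under its accelerator
# (e.g. `python -m cuml.accel script.py`) can dispatch supported
# estimators to the GPU without modifying this code.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic tabular dataset: 1,000 rows, 20 features.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A familiar scikit-learn estimator; cuML provides a matching API.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
score = clf.score(X_test, y_test)
print(f"test accuracy: {score:.3f}")
```

The point is that the modeling code stays identical; only the runtime environment decides whether the estimator executes on CPU or GPU.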
AI is transforming how we experience our favorite games. It is unlocking new levels of visuals, performance, and gameplay possibilities with neural rendering and generative AI-powered characters. With game development becoming more complex, AI is also playing a role in helping artists and engineers realize their creative visions. At GDC 2025, NVIDIA is building upon NVIDIA RTX Kit…
Ahead of the Game Developers Conference (GDC), NVIDIA today announced groundbreaking enhancements to NVIDIA RTX™ neural rendering technologies.