Technology Hive

Nvidia Unveils “Rubin Ultra” and “Feynman” AI Chips for 2027 and 2028

by Linda Torries – Tech Writer & Digital Trends Analyst
March 19, 2025
in Technology

Introduction to Nvidia’s New AI-Accelerating GPUs

Nvidia’s CEO Jensen Huang revealed several new AI-accelerating GPUs at the company’s GTC 2025 conference in San Jose, California. These new GPUs are set to be released over the coming months and years, and they promise to deliver significant performance improvements for AI training and inference.

What is Vera Rubin?

The centerpiece announcement was Vera Rubin, a GPU named after the pioneering astronomer Vera Rubin. It is scheduled for release in the second half of 2026 and will pair tens of terabytes of memory with a custom Nvidia-designed CPU called Vera. According to Nvidia, Vera Rubin will deliver significant performance improvements over its predecessor, Grace Blackwell, particularly for AI training and inference.

Specifications of Vera Rubin

Vera Rubin features two GPUs together on one die that deliver 50 petaflops of FP4 inference performance per chip. When configured in a full NVL144 rack, the system delivers 3.6 exaflops of FP4 inference compute—3.3 times more than Blackwell Ultra’s 1.1 exaflops in a similar rack configuration. The Vera CPU features 88 custom ARM cores with 176 threads connected to Rubin GPUs via a high-speed 1.8 TB/s NVLink interface.
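Those rack-level figures can be checked against the per-chip numbers with a little arithmetic. A minimal sketch, assuming (the article does not say this explicitly) that "NVL144" counts 144 GPU dies, i.e. 72 dual-die packages:

```python
# Sanity-check the quoted Vera Rubin NVL144 rack figures.
# Assumption: "NVL144" = 144 GPU dies = 72 dual-die packages,
# each delivering 50 PFLOPS of FP4 inference.

PFLOPS_FP4_PER_PACKAGE = 50       # per dual-die Rubin chip
RACK_FP4_EXAFLOPS = 3.6           # quoted NVL144 rack total
BLACKWELL_ULTRA_EXAFLOPS = 1.1    # quoted comparison rack

# How many 50-PFLOPS packages does 3.6 EF imply?
packages = RACK_FP4_EXAFLOPS * 1000 / PFLOPS_FP4_PER_PACKAGE
print(packages)   # 72 packages, i.e. 144 dies

# Generational speed-up over Blackwell Ultra
print(round(RACK_FP4_EXAFLOPS / BLACKWELL_ULTRA_EXAFLOPS, 1))  # 3.3
```

The numbers are internally consistent: 72 packages at 50 PFLOPS each give exactly 3.6 exaflops, and 3.6 / 1.1 rounds to the 3.3× figure Nvidia quotes.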

What is Rubin Ultra?

Huang also announced Rubin Ultra, which will follow in the second half of 2027. Rubin Ultra will use the NVL576 rack configuration and feature individual GPUs with four reticle-sized dies, delivering 100 petaflops of FP4 precision per chip. At the rack level, Rubin Ultra will provide 15 exaflops of FP4 inference compute and 5 exaflops of FP8 training performance—about four times more powerful than the Rubin NVL144 configuration. Each Rubin Ultra GPU will include 1TB of HBM4e memory, with the complete rack containing 365TB of fast memory.
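The Rubin Ultra figures line up similarly. A rough consistency check, assuming (not stated outright in the article) that "NVL576" counts 576 reticle-sized dies grouped four to a GPU package:

```python
# Rough consistency check for the quoted Rubin Ultra NVL576 figures.
# Assumption: "NVL576" = 576 dies, 4 dies per GPU package,
# 100 PFLOPS of FP4 per package.

DIES_PER_RACK = 576
DIES_PER_GPU = 4
PFLOPS_FP4_PER_GPU = 100

gpus = DIES_PER_RACK // DIES_PER_GPU              # 144 GPU packages
rack_exaflops = gpus * PFLOPS_FP4_PER_GPU / 1000  # 14.4 EF, near the quoted 15 EF
print(gpus, rack_exaflops)

# "About four times more powerful" than Rubin NVL144
print(round(15 / 3.6, 2))  # ~4.17x
```

Note that 144 GPUs at 1TB of HBM4e each accounts for 144TB; the quoted 365TB of "fast memory" per rack presumably also includes CPU-attached memory, which the article does not break down.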

Conclusion

Nvidia’s new AI-accelerating GPUs, Vera Rubin and Rubin Ultra, promise substantial generational gains in AI training and inference performance, and the roadmap extends further with a successor architecture, Feynman, slated for 2028. If the quoted figures hold, each generation delivers a multi-fold jump in rack-level FP4 inference and FP8 training throughput for large-scale AI workloads.

FAQs

  • Q: What is Vera Rubin?
    A: Vera Rubin is a GPU named after the astronomer Vera Rubin, scheduled for release in the second half of 2026.
  • Q: What are the specifications of Vera Rubin?
    A: Vera Rubin features two GPUs together on one die that deliver 50 petaflops of FP4 inference performance per chip.
  • Q: What is Rubin Ultra?
    A: Rubin Ultra is a GPU that will follow in the second half of 2027, using the NVL576 rack configuration and featuring individual GPUs with four reticle-sized dies.
  • Q: What are the benefits of Nvidia’s new AI-accelerating GPUs?
    A: Nvidia’s new AI-accelerating GPUs promise to deliver significant performance improvements for AI training and inference, revolutionizing the field of artificial intelligence.
Linda Torries – Tech Writer & Digital Trends Analyst

Linda Torries is a skilled technology writer with a passion for exploring the latest innovations in the digital world. With years of experience in tech journalism, she has written insightful articles on topics such as artificial intelligence, cybersecurity, software development, and consumer electronics. Her writing style is clear, engaging, and informative, making complex tech concepts accessible to a wide audience. Linda stays ahead of industry trends, providing readers with up-to-date analysis and expert opinions on emerging technologies. When she's not writing, she enjoys testing new gadgets, reviewing apps, and sharing practical tech tips to help users navigate the fast-paced digital landscape.


© 2025 Technology Hive. All rights reserved.
