mGrowTech

Meta AI Introduces UMA (Universal Models for Atoms): A Family of Universal Models for Atoms

by Josh
July 14, 2025
in AI, Analytics and Automation


Density Functional Theory (DFT) serves as the foundation of modern computational chemistry and materials science. However, its high computational cost severely limits its usage. Machine Learning Interatomic Potentials (MLIPs) have the potential to closely approximate DFT accuracy while significantly improving performance, reducing computation time from hours to less than a second with O(n) versus O(n³) scaling. However, training MLIPs that generalize across different chemical tasks remains an open challenge, as traditional methods rely on smaller problem-specific datasets instead of using the scaling advantages that have driven significant advances in language and vision models.
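The practical gap between cubic and linear scaling can be made concrete with a back-of-the-envelope cost model. The constants below are made up purely to illustrate the trend; they are not measured DFT or MLIP timings.

```python
# Illustrative comparison of DFT's O(n^3) scaling vs an MLIP's O(n) scaling.
# The per-unit cost constant is hypothetical, chosen only to show the trend.

def dft_cost(n_atoms, c=1e-3):
    """Hypothetical DFT cost model: cubic in system size."""
    return c * n_atoms ** 3

def mlip_cost(n_atoms, c=1e-3):
    """Hypothetical MLIP cost model: linear in system size."""
    return c * n_atoms

for n in (100, 1_000, 10_000):
    ratio = dft_cost(n) / mlip_cost(n)  # speedup grows as n^2
    print(f"{n:>6} atoms: DFT/MLIP cost ratio = {ratio:,.0f}")
```

Because the ratio grows quadratically with system size, the advantage compounds exactly where DFT becomes infeasible.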

Existing attempts to address these challenges have focused on developing Universal MLIPs trained on larger datasets, with datasets like Alexandria and OMat24 leading to improved performance on the Matbench-Discovery leaderboard. Moreover, researchers have explored scaling relations to understand relationships between compute, data, and model size, taking inspiration from empirical scaling laws in LLMs that motivated training on more tokens with larger models for predictable performance improvements. These scaling relations help in determining optimal resource allocation between the dataset and model size. However, their application to MLIPs remains limited compared to the transformative impact seen in language modeling.
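The scaling-relation methodology borrowed from LLMs can be sketched in a few lines: a power law between compute and loss is linear in log-log space, so it can be fit by ordinary linear regression. The data points below are synthetic; a real study would fit losses from actual training runs.

```python
import numpy as np

# Sketch of fitting an empirical scaling law L(C) = a * C**(-b) between
# training compute C (FLOPs) and loss L, in the style of LLM scaling laws.
# The "observations" here are synthetic, generated from a known power law.

rng = np.random.default_rng(0)
compute = np.logspace(18, 22, 8)               # FLOPs per training run
true_a, true_b = 50.0, 0.12
loss = true_a * compute ** (-true_b) * np.exp(rng.normal(0, 0.01, 8))

# A power law is linear in log-log space: log L = log a - b * log C.
slope, intercept = np.polyfit(np.log(compute), np.log(loss), 1)
b_hat, a_hat = -slope, np.exp(intercept)
print(f"fitted exponent b ~ {b_hat:.3f}, prefactor a ~ {a_hat:.1f}")
```

Once fitted, such a relation predicts the loss attainable at a given compute budget and guides the split of that budget between data and model size.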

Researchers from FAIR at Meta and Carnegie Mellon University have proposed a family of Universal Models for Atoms (UMA) designed to test the limits of accuracy, speed, and generalization for a single model across chemistry and materials science. To address these challenges, they developed empirical scaling laws relating compute, data, and model size to determine optimal model sizing and training strategies, which helped balance accuracy and efficiency on an unprecedented dataset of ~500 million atomic systems. UMA performs on par with or better than specialized models in both accuracy and inference speed across a wide range of materials, molecular, and catalysis benchmarks, without fine-tuning to specific tasks.

The UMA architecture builds upon eSEN, an equivariant graph neural network, with crucial modifications to enable efficient scaling and to handle additional inputs, including total charge, spin, and DFT settings for emulation. It incorporates a new embedding scheme that allows UMA models to integrate charge, spin, and DFT-related tasks; each of these inputs generates an embedding of the same dimension as the spherical channels. Training follows a two-stage approach: the first stage directly predicts forces for faster training, and the second stage removes the force head and fine-tunes the model to predict conserving forces and stresses using auto-grad, ensuring energy conservation and smooth potential energy landscapes.
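The "conserving forces" idea in the second stage is that forces are not predicted by a separate head but obtained as the negative gradient of the predicted energy, F = -dE/dr, which guarantees energy conservation by construction. The toy pair potential below stands in for the network's energy output, and a numerical gradient stands in for auto-grad; neither is UMA's actual model.

```python
import numpy as np

# Sketch of conservative forces: F = -dE/dr, computed from the energy
# rather than a separate force head. A toy Lennard-Jones-style pair
# energy substitutes for the neural network's predicted energy.

def energy(r, eps=1.0, sigma=1.0):
    """Toy pair energy for two atoms at distance r (not UMA's model)."""
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

def force(r, h=1e-6):
    """Conservative force via a central-difference gradient: F = -dE/dr."""
    return -(energy(r + h) - energy(r - h)) / (2 * h)

# At the potential minimum r = 2**(1/6) * sigma the force should vanish,
# and at shorter range the force should be repulsive (positive).
r_min = 2 ** (1 / 6)
print(f"force at r_min: {force(r_min):.2e}")
print(f"force at r=1.05: {force(1.05):+.3f}")
```

Deriving forces this way makes the force field the exact gradient of a scalar energy, so molecular-dynamics trajectories do not artificially gain or lose energy.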

The results show that UMA models exhibit log-linear scaling behavior across the tested FLOP ranges. This indicates that greater model capacity is required to fit the UMA dataset, and these scaling relationships were used to select accurate model sizes and to show the advantages of MoLE (Mixture of Linear Experts) over dense architectures. In multi-task training, moving from 1 expert to 8 experts yields a significant improvement in loss, 32 experts bring smaller gains, and 128 experts give negligible further improvement. Moreover, UMA models demonstrate exceptional inference efficiency despite large parameter counts: UMA-S can simulate 1,000 atoms at 16 steps per second and fit system sizes up to 100,000 atoms in memory on a single 80GB GPU.
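A mixture of linear experts helps explain how a model can carry a large parameter count yet keep inference cheap: because each expert is linear, the gated combination of experts collapses into a single weight matrix before the forward pass. The sketch below is a generic MoLE layer with illustrative shapes, not UMA's implementation; the gating here is an arbitrary softmax standing in for gates derived from global context such as task, charge, and spin embeddings.

```python
import numpy as np

# Sketch of a Mixture of Linear Experts (MoLE) layer. K linear experts
# are mixed by gating weights; since the experts are linear, the mix can
# be collapsed into one matrix, so inference costs the same as a single
# dense layer regardless of K. Shapes and gating are illustrative.

rng = np.random.default_rng(1)
d_in, d_out, K = 16, 16, 8
experts = rng.normal(size=(K, d_out, d_in))   # K expert weight matrices
gate_logits = rng.normal(size=K)
gates = np.exp(gate_logits) / np.exp(gate_logits).sum()  # softmax gating

x = rng.normal(size=d_in)

# Naive route: run every expert, then mix the K outputs.
y_naive = sum(g * (W @ x) for g, W in zip(gates, experts))

# Merged route: mix the weights once, then apply a single matrix.
W_merged = np.einsum("k,koi->oi", gates, experts)
y_merged = W_merged @ x

print(np.allclose(y_naive, y_merged))
```

The equivalence of the two routes is what lets total parameters scale with the number of experts while per-step inference cost stays flat.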

In conclusion, researchers introduced a family of Universal Models for Atoms (UMA) that shows strong performance across a wide range of benchmarks, including materials, molecules, catalysts, molecular crystals, and metal-organic frameworks. It achieves new state-of-the-art results on established benchmarks such as AdsorbML and Matbench Discovery. However, it fails to handle long-range interactions due to the standard 6Å cutoff distance. Moreover, it uses separate embeddings for discrete charge or spin values, which limits generalization to unseen charges or spins. Future research aims to advance toward universal MLIPs and unlock new possibilities in atomic simulations, while highlighting the need for more challenging benchmarks to drive future progress.


Sajjad Ansari is a final year undergraduate from IIT Kharagpur. As a Tech enthusiast, he delves into the practical applications of AI with a focus on understanding the impact of AI technologies and their real-world implications. He aims to articulate complex AI concepts in a clear and accessible manner.
