
Analogue computers could train AI 1000 times faster and cut energy use

October 29, 2025


[Image: Analogue computers use less energy than digital ones. Credit: metamorworks/Getty Images]

Analogue computers that rapidly solve a key type of equation used in training artificial intelligence models could help rein in the growing energy consumption of data centres driven by the AI boom.

Laptops, smartphones and other familiar devices are known as digital computers because they store and process data as binary digits, either 0 or 1, and can be programmed to solve a wide range of problems. In contrast, analogue computers are normally designed to solve just one specific problem. They store and process data using quantities that can vary continuously, such as electrical resistance, rather than discrete 0s and 1s.

Analogue computers can excel at speed and energy efficiency, but have previously lacked the accuracy of their digital counterparts. Now, Zhong Sun at Peking University, China, and his colleagues have created a pair of analogue chips that work together to accurately solve matrix equations – a fundamental part of sending data over telecom networks, running large scientific simulations or training AI models.

The first chip outputs a low-precision solution to matrix calculations very rapidly, while a second runs an iterative refinement algorithm to analyse the error rates of the first chip and so improve accuracy. Sun says that the first chip produces results with an error rate of around 1 per cent, but that after three cycles of the second chip, this drops to 0.0000001 per cent – which he says matches the precision of standard digital calculations.
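The two-chip loop Sun describes is a classical numerical technique called iterative refinement: a fast, rough solver produces an answer, and the residual error is fed back through the fast solver to correct it. The following is a minimal NumPy sketch of that idea, where the analogue chip is modelled as an exact solve corrupted by roughly 1 per cent random error; the noise model and all names are illustrative assumptions, not the paper's actual circuit behaviour.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
A = rng.standard_normal((n, n)) + n * np.eye(n)  # a well-conditioned 16x16 system
b = rng.standard_normal(n)

def sloppy_solve(A, rhs):
    """Stand-in for the fast analogue chip: an exact solve corrupted
    by roughly 1 per cent relative error per component."""
    x = np.linalg.solve(A, rhs)
    return x * (1 + 0.01 * rng.standard_normal(x.shape))

x = sloppy_solve(A, b)              # rough first-pass solution (~1% error)
for _ in range(3):                  # three refinement cycles, as in the article
    r = b - A @ x                   # residual, computed at full precision
    x = x + sloppy_solve(A, r)      # correct using another fast, sloppy solve

exact = np.linalg.solve(A, b)
rel_err = np.linalg.norm(x - exact) / np.linalg.norm(exact)
print(rel_err)                      # far below the ~1 per cent first-pass error
```

Each cycle shrinks the remaining error by roughly the solver's own error factor, which is why a handful of cheap, imprecise solves can reach near-digital precision.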

So far, the researchers have built chips capable of solving 16 by 16 matrices, or those with 256 variables, which could have applications for some small problems. But Sun admits that tackling the questions used in today’s large AI models would require far larger circuits, perhaps a million by a million.

But one advantage analogue chips have over digital ones is that larger matrices take no longer to solve, whereas the time a digital chip needs grows steeply with matrix size (roughly cubically for standard direct solvers). That means the throughput – the amount of data crunched per second – of a 32 by 32 matrix chip would beat that of an Nvidia H100 GPU, one of the high-end chips used to train AI today.
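The scaling gap can be made concrete with the textbook operation count for a digital direct solve – roughly (2/3)n³ floating-point operations via LU factorisation. This is a generic cost model, not a benchmark of any particular chip; per the article, the analogue crossbar settles in roughly constant time regardless of n.

```python
# Textbook cost of a digital direct solve (LU factorisation): ~(2/3) n^3 flops.
# The analogue chip, per the article's claim, takes roughly constant time per solve.
def digital_solve_flops(n):
    return (2 / 3) * n ** 3

for n in (16, 32, 64, 1024):
    print(f"{n:5d} x {n:<5d} -> {digital_solve_flops(n):.3g} flops")
```

Doubling the matrix size multiplies the digital work by 8, which is why a fixed-latency analogue solve pulls ahead as matrices grow.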

Theoretically, scaling further could see throughput reach 1000 times that of digital chips like GPUs, while using 100 times less energy, says Sun. But he is quick to point out that real-world tasks may stray outside the extremely narrow capabilities of their circuits, leading to smaller gains.

“It’s only a comparison of speed, and for real applications, the problem may be different,” says Sun. “Our chip can only do matrix computations. If matrix computation occupies most of the computing task, it represents a very significant acceleration for the problem, but if not, it will be a limited speed-up.”

Sun says that because of this, the most likely outcome is the creation of hybrid chips, where a GPU features some analogue circuits that handle very specific parts of problems – but even that is likely some years away.

James Millen at King’s College London says that matrix calculations are a key process in training AI models and that analogue computing offers a potential boost.

“The modern world is built on digital computers. These incredible machines are universal computers, which means they can be used to calculate absolutely anything, but not everything can necessarily be computed efficiently or fast,” says Millen. “Analogue computers are tailored to specific tasks, and in this way can be incredibly fast and efficient. This work uses an analogue computing chip to speed up a process called matrix inversion, which is a key process in training certain AI models. Doing this more efficiently could help reduce the huge energy demands of our ever-growing reliance on AI.”
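As a generic textbook illustration of where such a solve appears in model fitting (not the specific workload in the paper), ordinary least squares finds model weights via the normal equations (XᵀX)w = Xᵀy – one matrix solve per fit. All data and names below are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))   # 100 samples, 5 features
w_true = np.arange(1.0, 6.0)        # illustrative "true" weights
y = X @ w_true                      # noiseless targets, for clarity

# Normal equations: solve (X^T X) w = X^T y rather than forming an explicit inverse.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)                            # recovers w_true on this noiseless data
```

A chip that performs this solve step natively would accelerate fits like this one; as Sun notes above, the speed-up for a whole training pipeline depends on how much of the work is matrix computation.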


Source link : https://www.newscientist.com/article/2500442-analogue-computers-could-train-ai-1000-times-faster-and-cut-energy-use/?utm_campaign=RSS%7CNSNS&utm_source=NSNS&utm_medium=RSS&utm_content=home

Publish date : 2025-10-29 12:00:00

Copyright for syndicated content belongs to the linked Source.
