Transformer architecture, the one innovation that supercharged AI: Best ideas of the century

January 19, 2026


New Scientist. Science news and long reads from expert journalists, covering developments in science, technology, health and the environment on the website and the magazine.

Today’s most powerful AI tools – the ones that can summarise documents, generate artwork, write poetry or predict how incredibly complex proteins fold – all stand on the shoulders of the “transformer”. This neural network architecture, first announced in 2017 at an unassuming conference centre in California, enables machines to process information in a way that reflects how humans think.

Previously, most state-of-the-art AI models relied on recurrent neural networks. These read text sequentially, left to right, carrying forward only a compressed memory of what came just before. That set-up worked well enough for short phrases. But in longer, more tangled sentences, the models had to squeeze too much context into their limited memory, and crucial details were lost. The ambiguity stumped them.
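
The bottleneck described above can be illustrated with a toy recurrent loop (a hypothetical sketch with random weights, not any particular published model): everything the network "remembers" about earlier words must be squeezed into one fixed-size hidden vector.

```python
import numpy as np

def rnn_summary(tokens, d_hidden=4):
    """Toy recurrent read: one fixed-size hidden state is updated
    token by token, strictly left to right. The whole sentence must
    be compressed into this single vector, so long-range details
    from early tokens tend to get overwritten along the way."""
    rng = np.random.default_rng(0)
    W_h = rng.normal(scale=0.5, size=(d_hidden, d_hidden))
    W_x = rng.normal(scale=0.5, size=(d_hidden, tokens.shape[1]))
    h = np.zeros(d_hidden)
    for x in tokens:                      # strict left-to-right order
        h = np.tanh(W_h @ h + W_x @ x)    # old memory mixed with the new word
    return h                              # the entire sentence, compressed

sentence = np.random.default_rng(1).normal(size=(20, 8))  # 20 "words", 8-dim each
print(rnn_summary(sentence).shape)  # (4,)
```

However long the input, the summary is always the same small vector, which is the limitation the article describes.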

Transformers threw out that approach and embraced something more radical: self-attention.

It’s surprisingly intuitive. We humans certainly don’t read and interpret text by scanning word by word in a strict order. We skim, we double back, we make guesses and corrections by weighing up the context. This kind of mental agility has long been the holy grail of natural language processing: teaching machines not just how to process language, but also how to understand it.

Transformers mimic that mental leap. Their self-attention mechanism allows them to compare every word in a sentence with every other word, all at once, spotting patterns and building meaning from the relationships between them. “You could leverage all this data from the internet or Wikipedia and use it for your task,” says AI researcher Sasha Luccioni at Hugging Face. “And that was hugely powerful.”
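In code, that all-at-once comparison can be sketched as scaled dot-product self-attention. This is a minimal illustration, assuming identity projections; real transformers add learned query, key and value matrices and multiple attention heads.

```python
import numpy as np

def self_attention(x):
    """Minimal scaled dot-product self-attention.

    x: (seq_len, d) array, one row per token.
    Returns a (seq_len, d) array where each output row is a
    context-aware mixture of every input row, weighted by similarity.
    """
    d = x.shape[1]
    # Compare every token with every other token at once:
    # scores[i, j] = how strongly token i attends to token j.
    scores = x @ x.T / np.sqrt(d)
    # Softmax turns each row of scores into weights summing to 1.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Each output token is a weighted blend of all input tokens.
    return weights @ x

tokens = np.random.default_rng(0).normal(size=(5, 8))  # 5 "words", 8-dim each
out = self_attention(tokens)
print(out.shape)  # (5, 8)
```

Because the pairwise scores are computed in a single matrix product, no left-to-right memory bottleneck is involved: a token at the end of the sentence can weigh a token at the start just as easily as its neighbour.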

This flexibility isn’t limited to text, either. Transformers now underpin tools that generate music, render images and even model molecules. AlphaFold, for instance, treats proteins – long strings of amino acids – like sentences. A protein’s function depends on how it folds, and that, in turn, depends on how its parts relate across long distances. Attention mechanisms let the model weigh those distant relationships with fine-grained precision.

In hindsight, the insight feels almost obvious: intelligence, whether human or artificial, depends on knowing what to focus on and when. The transformer didn’t just help machines grasp language. It gave them a way to navigate any structured data – much like humans navigating their own complex worlds.

Topics:

  • artificial intelligence
  • neural networks



Source link : https://www.newscientist.com/article/2510604-the-one-innovation-that-supercharged-ai-best-ideas-of-the-century/?utm_campaign=RSS%7CNSNS&utm_source=NSNS&utm_medium=RSS&utm_content=home

Author :

Publish date : 2026-01-19 16:00:00

Copyright for syndicated content belongs to the linked Source.

© 2022 NewsHealth.
