Technology

This robot learned to lip sync like humans by watching YouTube

By dailyguardian.ae | January 15, 2026 | 3 Mins Read

Researchers at Columbia Engineering have trained a human-like robot named Emo to lip-sync speech and songs by studying online videos, showing how machines can now learn complex human behaviour simply by observing it.

Emo is not a full humanoid body but a highly realistic robotic face built to explore how humans communicate. The face is covered with silicone skin and driven by 26 independently controlled facial motors that move the lips, jaw, and cheeks.

These motors allow Emo to form detailed mouth shapes that cover 24 consonants and 16 vowels, which is critical for natural speech and singing. The goal was to reduce the uncanny valley effect, where robots look almost human but still feel unsettling because their facial movements do not match their voice.

How Emo learned to lip sync like a human

The learning process happened in stages. First, Emo explored its own face by moving its motors while watching itself in a mirror. This helped the system understand how motor commands change facial shapes.
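
To make that self-modelling step concrete, here is a minimal sketch of how a robot might learn a forward model of its own face through "motor babbling". The landmark extraction, dimensions, and model choice are illustrative assumptions, not details from the Columbia work, and the mirror camera is faked with a fixed random map so the example runs on its own.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative dimensions: 26 facial motors (as described in the article),
# and a hypothetical set of tracked mouth/jaw landmarks.
NUM_MOTORS = 26
NUM_LANDMARKS = 20

def observe_landmarks(motor_commands: np.ndarray) -> np.ndarray:
    """Stand-in for the mirror camera: the real system would capture the
    face and extract landmark coordinates. Here a fixed random linear map
    plays that role so the sketch is self-contained."""
    rng = np.random.default_rng(0)
    mixing = rng.normal(size=(NUM_MOTORS, NUM_LANDMARKS * 2))
    return np.tanh(motor_commands @ mixing)

# "Motor babbling": issue random commands and record the resulting face shapes.
rng = np.random.default_rng(42)
commands = rng.uniform(-1.0, 1.0, size=(2000, NUM_MOTORS))
landmarks = observe_landmarks(commands)

# Learn the forward self-model: motor commands -> facial landmark positions.
self_model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=300)
self_model.fit(commands, landmarks)

# The robot can now predict how its face will look before it moves.
test_command = rng.uniform(-1.0, 1.0, size=(1, NUM_MOTORS))
predicted_shape = self_model.predict(test_command)
print(predicted_shape.shape)  # (1, NUM_LANDMARKS * 2)
```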

Researchers then introduced a learning pipeline that connects sound to movement. Emo watched hours of YouTube videos of people speaking and singing, while an AI model analysed the relationship between audio and visible lip motion.

Instead of focusing on language or meaning, the system studied the raw sounds of speech. A facial action transformer converted those learned patterns into real-time motor commands.
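
As a rough illustration of that idea, the sketch below maps a sequence of audio features to per-frame motor commands with a small transformer. The architecture, the 80-dimensional audio features, and the class name are assumptions made for demonstration, not the researchers' actual facial action transformer.

```python
import torch
import torch.nn as nn

NUM_MOTORS = 26   # matches the motor count described in the article
AUDIO_DIM = 80    # assumed mel-spectrogram feature size per audio frame

class AudioToMotorTransformer(nn.Module):
    """Illustrative stand-in for a facial action transformer: maps a
    sequence of audio frames to per-frame facial motor commands."""
    def __init__(self, d_model: int = 256, num_layers: int = 4, num_heads: int = 8):
        super().__init__()
        self.audio_proj = nn.Linear(AUDIO_DIM, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=num_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.motor_head = nn.Linear(d_model, NUM_MOTORS)

    def forward(self, audio_frames: torch.Tensor) -> torch.Tensor:
        # audio_frames: (batch, time, AUDIO_DIM) -> (batch, time, NUM_MOTORS)
        x = self.audio_proj(audio_frames)
        x = self.encoder(x)
        return torch.tanh(self.motor_head(x))  # motor commands in [-1, 1]

# Training would pair audio frames from online videos with the lip motion
# visible in the same frames; here a random batch just shows the shapes.
model = AudioToMotorTransformer()
dummy_audio = torch.randn(2, 100, AUDIO_DIM)  # 2 clips, 100 frames each
motor_commands = model(dummy_audio)
print(motor_commands.shape)                   # torch.Size([2, 100, 26])
```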

This approach allowed Emo to lip sync not only in English but also in languages it was never trained on, including French, Arabic, and Chinese. The same method worked for singing, which is harder because of stretched vowels and rhythm changes.

Researchers say this matters because future robots will need to communicate naturally if they are going to work alongside people. The advance arrives as interest in robots for homes and workplaces is climbing fast.

At CES 2026, that momentum was on full display, with demos ranging from Boston Dynamics’ Atlas humanoid, which is preparing to enter the workplace, to SwitchBot’s household-focused robot that can cook meals and do your laundry, and LG’s upcoming home assistant robot designed to make everyday life easier.

Add advances like artificial skin that gives robots human-like sensitivity, pair them with realistic lip syncing, and it is easy to see how robots are starting to feel less like machines and more like social companions. Emo is still a research project, but it shows how robots may one day learn human skills the same way we do: by watching and listening.
