
Watch the Mona Lisa Rap, Courtesy of Microsoft's AI: It's as Creepy as You'd Think

April 23, 2024



There are plenty of enduring mysteries of human existence: Why are we here? Who made the universe? And, most importantly, can the Mona Lisa rap? Microsoft has introduced a new artificial intelligence technology called VASA-1, which can turn a picture of a person's face and a snippet of their voice into a computer-generated live-action video. To show off that capability, the company created a video of Leonardo da Vinci's famous portrait, the Mona Lisa, performing Anne Hathaway's viral 2011 rap, Paparazzi.

> Microsoft just dropped VASA-1.
> This AI can make a single image sing and talk clearly. Similar to EMO from Alibaba
> 10 wild examples:
> 1. Mona Lisa rapping Paparazzi pic.twitter.com/LSGF3mMVnD
> — Min Choi (@minchoi) April 18, 2024

Microsoft's video is the latest example of the strides tech companies are making with AI tools, even as the rapping Mona Lisa straddles the line between quirky and creepy.

The software giant's new technology isn't alone in using AI to reinterpret and reanimate art, or the wider world of social media. Microsoft partner OpenAI, for instance, has created similarly striking videos with its text-to-video model, Sora, which can generate high-quality clips. Google has a comparable tool called Lumiere. Those tools remain limited in availability. As for AI tools you can use every day, be sure to check out CNET's reviews of image generators including Adobe Firefly, OpenAI's Dall-E 3 and Google's ImageFX, along with reviews of chatbots including ChatGPT, Gemini and Claude.

From deepfake to real

Microsoft's technology may seem new, but researchers have been demonstrating remarkably lifelike videos for years, often calling them deepfakes. One of the most notable examples comes from the Massachusetts Institute of Technology, which in 2019 used AI technology to make President Richard Nixon appear to deliver a speech he never gave, raising concerns about spreading disinformation. Those concerns haven't stopped software developers from delivering the kind of content Microsoft has created. Such programs have become so popular that internet safety researchers warn that photos people post online can be used to train the technology without the consent of the owner or the subject.

Microsoft believes these technologies can do more good than harm. "While acknowledging the possibility of misuse, it's imperative to recognize the full potential of our approach," the company wrote in a blog post announcing VASA-1.
"The benefits – such as promoting educational equity, improving accessibility for people with communication difficulties, offering companionship or care to those in need, among many others – underscore the importance of our research and related explorations," the post said. Microsoft added that it is "dedicated to developing AI responsibly, with the goal of improving people's lives." Editor's note: CNET used an AI engine to help create a number of stories, which are labeled accordingly. The note you're reading is attached to articles that substantively cover the topic of AI but were created entirely by our expert writers and editors. For more information, see our AI policy.

Author: OpenAI
