Genealogy website MyHeritage has launched a new service that allows you to create lifelike animations of faces in still photos.
The AI-powered service called Deep Nostalgia, launched last week, is free to try and is remarkably accurate in depicting how a person would look if captured on video. Their eyes blink, their head moves and their mouth forms a smile.
“You’ll have a ‘wow moment’ when you see a treasured family photo come to life with Deep Nostalgia,” Gilad Japhet, founder and CEO of MyHeritage, said in a statement.
“Seeing our beloved ancestors’ faces come to life in a video simulation lets us imagine how they might have been in reality, and provides a profound new way of connecting to our family history,” he added.
Created with Deep Nostalgia™ | Credit: John P. Mello Jr.
To create Deep Nostalgia, MyHeritage teamed up with D-ID, an Israeli company focused on deep learning and synthetic media.
“I like the partnership with them because both of us really care about the sensitivity of animating people who are no longer with us,” said D-ID cofounder and CEO Gil Perry.
He noted that it has taken his firm years to cross the “uncanny valley” in synthetic media. “With synthetic media, you reach a level where it looks good, but a viewer still senses something is wrong,” he told TechNewsWorld.
“We’ve managed to cross that uncanny valley,” he continued, “and create perfect synthetic media.”
Creepy?
MyHeritage explained that Deep Nostalgia uses several prerecorded driver videos prepared by the company, which direct the movements in the animation and consist of sequences of real human gestures.
A preferred driver is automatically selected for each face based on its orientation, and then seamlessly applied to the photo. To achieve optimal results, the photos are enhanced prior to animation using the MyHeritage Photo Enhancer, which brings blurry and low-resolution faces into focus and increases their resolution.
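The two-step process MyHeritage describes — match a driver clip to the face's orientation, then apply its gestures to the enhanced photo — can be sketched in miniature. This is an illustrative toy, not the company's actual pipeline; all names (`DriverVideo`, `select_driver`, `animate`) and the yaw-based matching rule are assumptions for the sake of the example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DriverVideo:
    """A prerecorded sequence of real human gestures."""
    name: str
    yaw_degrees: float   # head orientation the clip was recorded at
    gestures: List[str]  # e.g. ["blink", "head_turn", "smile"]

def select_driver(face_yaw: float, drivers: List[DriverVideo]) -> DriverVideo:
    """Pick the driver whose recorded orientation best matches the face."""
    return min(drivers, key=lambda d: abs(d.yaw_degrees - face_yaw))

def animate(face_yaw: float, drivers: List[DriverVideo]) -> List[str]:
    """Apply each gesture of the chosen driver to the (enhanced) photo."""
    driver = select_driver(face_yaw, drivers)
    # In a real system each gesture would warp the photo into a video
    # frame; here we only record which motion was applied.
    return [f"frame:{g}@{driver.name}" for g in driver.gestures]

drivers = [
    DriverVideo("frontal", 0.0, ["blink", "smile"]),
    DriverVideo("three_quarter", 30.0, ["blink", "head_turn", "smile"]),
]
frames = animate(face_yaw=25.0, drivers=drivers)
```

Here a face turned roughly 25 degrees is closer to the three-quarter clip than the frontal one, so that driver's gesture sequence is the one applied.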
Users can animate several photos for free, regardless of the number of faces in each photo. Beyond that, continued use requires a subscription. The video animation can be downloaded as an MP4 file and shared on social media.
While some folks may be charmed by adding new life to old photos, others may find the practice a bit creepy.
“Calling this creepy seems extreme,” said Daniel Castro, vice president of the Information Technology & Innovation Foundation, a research and public policy organization in Washington, D.C.
“We’ve seen a lot of interesting advances in AI that allow for reconstruction of images and videos, such as adding color to black-and-white photos or creating high resolution videos from old, grainy videos,” he told TechNewsWorld. “This latest example is another similar advance that allows for more advanced image generation.”
Tip of the Iceberg
“There’s no simple or single answer to whether this type of deep fake is creepy or fun,” added Jean-Claude Goldenstein, CEO of CREOpoint, a brand and reputation protection provider in San Francisco.
“As a son, I imagine I’d be in favor of a robot with a deep fake of my mother if she could not assist my dad who had Alzheimer’s,” he explained to TechNewsWorld.
“But imagine a deep fake of the CEOs of Citadel and Robinhood colluding against Reddit retail investors,” he continued. “That would be very far from fun in the run-up to Robinhood’s IPO.”
Castro noted that there are a lot of interesting applications for the technology in Deep Nostalgia, such as in video editing and video conferencing.
“Video conferencing services can reduce bandwidth usage if they can instead replicate certain movements on their own, or they can allow video interactions to appear seamless on a weak connection,” he explained.
Perry added that Deep Nostalgia is just the tip of the iceberg for his company’s technology.
For example, it can be used by filmmakers to reshoot scenes without bringing actors back to a set: the actors can be filmed at a remote location and inserted into the scene.
The company is also working on synthesizing mouth movements so when a film is dubbed, the actor will look as if they’re speaking in the dubbed language.
Deep Fake Threat
While there are many worthwhile applications for synthetic media, Castro asserted that deep fakes remain a serious concern. “But it is one that should be addressed through legislation aimed at particular harms, such as election misinformation, harassment, or other harmful acts — such as distributing fake nudes about someone without their consent,” he said.
Avivah Litan, a security and privacy analyst at Gartner, maintained that it’s relatively easy to create deep fakes now. “The services are available on the Internet,” she told TechNewsWorld. “You can just go get them.”
“Right now, it’s being mainly used in the porn industry for revenge porn,” she said.
“The porn industry always leads in innovation,” Litan joked.
Although far from mature, deep fakes have already proven to be hard to detect. “The detection rate will probably end up at 50 percent, but even right now Facebook has only been 65 percent accurate in catching deep fakes,” she observed.
“It’s a big threat looming on the horizon,” she warned.
Tool for Account Takeover
In a research note, Litan predicted that by 2023, 20 percent of successful account takeover attacks will use deep fakes to socially engineer users into turning over sensitive data or moving money into criminal accounts.
She explained that criminals take over user accounts to gain access to sensitive information and intellectual property, as well as to steal funds from financial accounts. Most theft of money and information starts with user account takeover, as criminals leverage weak security controls typically guarding these accounts.
“Enterprises are incented to raise the security bar and incorporate biometric assurance and verification into their multi-factor authentication applications,” she wrote. “This uses facial and voice verification from life images, voice recordings or videos that users present in a live session to the authenticating organization so that it can verify the legitimate user’s ‘liveness’ and authenticity.”
“Ominously,” Litan continued, “deep fakes will render most biometric authentication and verification useless as criminals can easily mimic user voices, images and videos.”
“Most existing biometric user authentication must be upgraded to incorporate technologies that can detect the presence of a deep fake imposter, and assure the presence of a legitimate user,” she added.
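The layered check Litan describes — a biometric match alone no longer suffices, so a deep-fake detection step must gate the result — can be sketched as follows. This is a minimal illustration, not any vendor's product: the function name, score scales, and thresholds are all assumptions.

```python
def verify_user(biometric_match: float, deepfake_score: float,
                match_threshold: float = 0.90,
                fake_threshold: float = 0.20) -> bool:
    """Accept a login only when the face/voice sample matches the
    enrolled user AND the sample does not appear synthetic.

    biometric_match: similarity to the enrolled template (0..1).
    deepfake_score:  estimated probability the sample is synthetic (0..1).
    """
    if deepfake_score > fake_threshold:
        # Likely a deep fake imposter: reject even a perfect match.
        return False
    return biometric_match >= match_threshold

# A near-perfect biometric match is rejected if the liveness/deep-fake
# detector flags the sample as synthetic.
legitimate = verify_user(biometric_match=0.97, deepfake_score=0.05)
spoofed = verify_user(biometric_match=0.99, deepfake_score=0.85)
```

The design point is the ordering: the synthetic-media check runs first, so a convincing deep fake cannot be rescued by a high similarity score.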