Genealogy website MyHeritage has unveiled a bizarre new online tool that can animate old photos of deceased family members.
The free deepfake technology, called Deep Nostalgia, takes any photo and animates the subject’s face – with strangely realistic and unsettling results.
Examples provided by MyHeritage show historical figures, including Queen Victoria, Mark Twain and Florence Nightingale, come to life.
MyHeritage says the tech gives history ‘a fresh new perspective’ by producing a depiction of how a person ‘could have moved and looked if captured on video’.
It’s been developed by researchers at Israel-based firm D-ID, which specialises in video reenactment using deep learning.
Anyone can use the tool on the Deep Nostalgia webpage by uploading or dragging and dropping an image – although to see the results you’ll need a MyHeritage account.
WHAT IS A DEEPFAKE?
Deepfakes are so named because they are made using deep learning, a form of artificial intelligence, to create fake videos of a target individual.
They are made by feeding a computer an algorithm, or set of instructions, as well as lots of images and audio of the target person.
The computer program then learns how to mimic the person’s facial expressions, mannerisms, voice and inflections.
With enough video and audio of someone, you can combine a fake video of a person with fake audio and get them to say anything you want.
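The training process described above is commonly structured as an autoencoder with one shared encoder and a separate decoder per identity: swapping decoders at generation time transfers one person’s expressions onto another’s face. The following is a heavily simplified, hypothetical sketch of that structure – the layers are linear, the “faces” are random vectors, and all names are illustrative; real deepfake systems train deep convolutional networks on thousands of frames.

```python
import numpy as np

# Toy sketch of the shared-encoder / per-identity-decoder structure
# behind face-swap deepfakes. "Faces" here are random 16-dim vectors
# and all layers are linear, purely to illustrate the training loop.
rng = np.random.default_rng(0)
dim, latent, lr = 16, 4, 0.05

faces_a = rng.normal(size=(50, dim))   # frames of person A (synthetic)
faces_b = rng.normal(size=(50, dim))   # frames of person B (synthetic)

E  = rng.normal(scale=0.1, size=(dim, latent))   # shared encoder
Da = rng.normal(scale=0.1, size=(latent, dim))   # decoder for identity A
Db = rng.normal(scale=0.1, size=(latent, dim))   # decoder for identity B

def mse(X, D):
    """Mean squared reconstruction error through encoder E and decoder D."""
    return float(np.mean((X @ E @ D - X) ** 2))

loss_start = mse(faces_a, Da)

for _ in range(500):
    for X, which in ((faces_a, "a"), (faces_b, "b")):
        D = Da if which == "a" else Db
        Z = X @ E
        err = Z @ D - X                     # reconstruction error
        gD = Z.T @ err / len(X)             # gradient w.r.t. this decoder
        gE = X.T @ (err @ D.T) / len(X)     # gradient w.r.t. shared encoder
        E -= lr * gE
        D -= lr * gD                        # in-place: updates Da or Db

loss_end = mse(faces_a, Da)

# The "swap": encode A's expression, decode with B's decoder.
fake_b = faces_a @ E @ Db
```

Because the encoder is shared across both identities, it learns an identity-agnostic representation of pose and expression, which is what makes the decoder swap produce a plausible hybrid.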
‘You’ll have a “wow moment” when you see a treasured family photo come to life with Deep Nostalgia,’ said Gilad Japhet, founder and CEO of MyHeritage.
‘Seeing our beloved ancestors’ faces come to life in a video simulation lets us imagine how they might have been in reality, and provides a profound new way of connecting to our family history.’
Deep Nostalgia uses several pre-recorded ‘driver videos’ – normal videos of living people performing simple gestures and face movements.
These driver videos direct the movements of the photos.
Each time someone uploads an old photo, a preferred driver is automatically selected for each face based on its orientation and then seamlessly applied to the photo.
The result is a short but high-quality video animation of an individual face that can smile, blink and move.
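MyHeritage has not published how the driver is chosen, but the matching step described above could plausibly work like the sketch below, which assumes each pre-recorded driver video is tagged with the head yaw (in degrees) of the person performing it, and picks the driver whose yaw is closest to that of the uploaded face. The driver names and angles are invented for illustration.

```python
# Hypothetical driver library: driver name -> head yaw (degrees) of the
# living person recorded performing the gestures. Names and angles are
# illustrative; the real selection criteria are not public.
DRIVERS = {
    "driver_frontal": 0.0,
    "driver_left_profile": -30.0,
    "driver_right_profile": 30.0,
}

def pick_driver(photo_yaw: float, drivers: dict = DRIVERS) -> str:
    """Return the driver whose recorded head yaw best matches the photo's."""
    return min(drivers, key=lambda name: abs(drivers[name] - photo_yaw))
```

For a face estimated at 22 degrees of yaw, `pick_driver(22.0)` would select `"driver_right_profile"`, the closest match.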
Photos are also enhanced prior to animation using the MyHeritage Photo Enhancer, which brings blurry and low-resolution faces into focus and increases their resolution.
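The Photo Enhancer is a trained super-resolution network whose details are not public; as a stand-in, this sketch shows the simplest possible upscaler – nearest-neighbour, via NumPy’s `repeat` – purely to make the resolution-increase step concrete.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour upscale: each pixel becomes a factor x factor block.
    A trivial stand-in for a learned super-resolution model."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

face = np.arange(9, dtype=np.uint8).reshape(3, 3)  # tiny 3x3 "photo"
bigger = upscale_nearest(face, 2)                  # now 6x6
```

A learned enhancer differs in that it hallucinates plausible high-frequency detail rather than merely duplicating pixels, which is why it can bring a blurry face ‘into focus’.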
Deep Nostalgia is the newest AI-based photo tool on MyHeritage, the Israeli genealogy platform that launched in 2003 and has soared in popularity since.
MyHeritage is well known for web services and products such as saliva-based DNA testing kits that claim to reveal ‘the ethnic groups and geographic regions you originate from’.
Last year, the firm released both the MyHeritage Photo Enhancer and MyHeritage In Color, which colourises black and white and faded colour photos.
These features have gone viral and have been used over 30 million times, according to the firm.
Deep Nostalgia is a freemium feature: users can animate several photos for free, regardless of the number of faces in each photo, but ‘continued use requires a subscription’.
The video animation can easily be shared with family and friends on Facebook, Twitter, WhatsApp and other social media.
‘This new product integration is an exciting collaboration between two innovative companies,’ said Gil Perry, co-founder and CEO of D-ID.
‘We’re thrilled that our technology will be accessible to millions of people on MyHeritage and hope many of them will enjoy the impact of video reenactment for historical photos.’
Deepfake technology isn’t without its controversies, however – a growing concern is its potential to contribute to the spread of fake news, identity theft and even fraud.
Some hyper-realistic videos depict a person appearing to say or do something they never did, fuelling the spread of misinformation and deceit.
Deepfake technology entered even murkier waters last year, when it emerged that photos shared by girls on their social media were being faked to appear nude by a deepfake bot on messaging app Telegram.
More than 100,000 non-consensual sexual images of 10,000 women and girls were shared online that were created using the bot between July 2019 and 2020, according to a report from deepfake detection firm Sensity.
More recently, Channel 4 broadcast a deepfake version of the Queen’s speech for its annual 2020 Alternative Christmas Message.
Ofcom decided not to investigate the 354 complaints it received about the video, which depicts ‘Deepfake Queen’ dancing on top of her desk and making jokes about toilet roll shortages due to Covid.
WHY IS FAKE CELEBRITY PORN MADE BY AN AI SO CONCERNING?
Back in December, it was discovered that Reddit users were creating fake pornography using celebrity faces pasted on to adult film actresses’ bodies.
The disturbing videos, created by Reddit user ‘deepfakes’, look strikingly real as a result of a sophisticated machine learning algorithm, which uses photographs to create human masks that are then overlaid on top of adult film footage.
Now, AI-assisted porn is spreading all over Reddit, thanks to an easy-to-use app that can be downloaded directly to your desktop computer, according to Motherboard.
To create the likeness of Gal Gadot, for example, the algorithm was trained on real porn videos and images of the actress, allowing it to create an approximation of her face that can be applied to the moving figure in the video.
As all of this is freely available information, it could be done without that person’s consent.
And, as Motherboard notes, people today are constantly uploading photos of themselves to various social media platforms, meaning someone could use such a technique to harass someone they know.
Source: Daily Mail | World News