Microsoft chief warns more deepfake threats could be coming soon


When it comes to deepfakes, what we’ve seen so far is just the tip of the iceberg. In the near future, we won’t be able to tell whether the person we’re speaking to on a video call is real or an impostor, and crooks won’t have trouble creating an entire chronology of fake videos to support their claims, or to trick people into believing an offer or campaign is legitimate.
These harrowing predictions come from Eric Horvitz, Microsoft’s chief science officer, in a new research paper titled “On the Horizon: Interactive and Compositional Deepfakes”.
Deepfakes are, essentially, “photoshopped” videos: using artificial intelligence (AI) and machine learning (ML), a threat actor can fabricate footage of a person saying things they never said. Now, according to Horvitz, crooks are ready to take it to the next level. Interactive deepfakes are just what you’d expect: real-time videos users can interact with which are, in reality, utterly fake.
Synthetic history
Compositional deepfakes, on the other hand, are described as “sets of deepfakes” designed to integrate over time with “observed, expected, and engineered world events to create persuasive synthetic histories.”
“Synthetic histories can be constructed manually but may one day be guided by adversarial generative explanation (AGE) techniques,” Horvitz adds.
He also says that in the near future, it will be almost impossible to distinguish fake videos and fabricated content from authentic material: “In the absence of mitigations, interactive and compositional deepfakes threaten to move us closer to a post-epistemic world, where fact cannot be distinguished from fiction.”
This absence of mitigations stems from the fact that threat actors can pit artificial intelligence against analysis tools and develop deepfake content that is able to fool even the most advanced detection systems.
“With this process at the foundation of deepfakes, neither pattern recognition techniques nor humans will be able to reliably recognize deepfakes,” Horvitz notes.
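To make that cat-and-mouse dynamic concrete, the sketch below shows the generic adversarial (GAN-style) training loop it rests on: a generator learns to produce fakes while a detector learns to flag them, and each update to the detector gives the generator a new target to beat. This is a toy illustration with placeholder tensor shapes and random data, not Horvitz’s method or any real deepfake pipeline.

```python
# Minimal sketch of an adversarial training loop: a toy "generator"
# (the synthesizer) trained against a toy "detector" (the analysis tool).
import torch
import torch.nn as nn

latent_dim, sample_dim = 16, 64

generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                          nn.Linear(128, sample_dim))
detector = nn.Sequential(nn.Linear(sample_dim, 128), nn.ReLU(),
                         nn.Linear(128, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(detector.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(32, sample_dim)            # stand-in for authentic media
    fake = generator(torch.randn(32, latent_dim)) # synthesized samples

    # Detector update: score authentic samples as 1, fakes as 0.
    d_loss = loss_fn(detector(real), torch.ones(32, 1)) + \
             loss_fn(detector(fake.detach()), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator update: produce fakes the detector scores as real.
    g_loss = loss_fn(detector(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

Because the generator is optimized directly against the detector’s verdicts, any fixed pattern-recognition defense eventually becomes a training signal for better fakes, which is exactly the failure mode Horvitz warns about.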
So, the next time a family member calls from abroad asking for money to pay the rent, make sure it’s not a fraudster impersonating your loved one.
Via: VentureBeat