Synthetic Media Is Here. Are You Ready?

Talk about serendipity.

In yesterday’s blog, I wrote about the advent of digital clothing, which lets people appear to be wearing a unique outfit tailored just for them.

Well, today on 60 Minutes, one of the segments was about synthetic media, also known as deepfakes.

I was hoping that fake news would begin to fade away, but it looks like it is only going to get worse. It will be harder and harder to know what’s “real”.

Here is one of the more famous examples of a deepfake:

After a few of these videos started to appear and people began wondering who was behind them, a modest 32-year-old Belgian visual effects artist named Chris Umé stepped forward to claim credit.

Umé says his work is made easier because he teamed up with a Tom Cruise impersonator whose voice, gestures, and hair are nearly identical to the real McCoy. Umé only deepfakes Cruise’s face and stitches that onto the real video and sound of the impersonator.

Umé notes that it all begins with training a deepfake model: he gathers all the face angles of Tom Cruise, all the expressions, all the emotions, and it takes time to build a really good model. The software then begins training, analyzing all the images of Tom Cruise and all of his expressions and comparing them to the impersonator. The computer teaches itself: when the impersonator smiles, it recreates Tom Cruise smiling, and so on.
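To make the workflow concrete, here is a minimal sketch of the per-frame swapping step Umé describes: only the face region of each frame is replaced, while the impersonator’s body, hair, and background pass through untouched. Everything here is hypothetical, not Umé’s actual pipeline: `generate_target_face` is a stand-in for the trained deepfake model, the “video” is synthetic noise, and the face box is fixed rather than coming from a real face detector.

```python
import numpy as np

def generate_target_face(impersonator_face: np.ndarray) -> np.ndarray:
    """Stand-in for the trained deepfake model: given the impersonator's
    face crop, return the target's face with the same expression.
    (Here we just brighten the crop so the sketch is runnable.)"""
    return np.clip(impersonator_face.astype(np.int16) + 40, 0, 255).astype(np.uint8)

def swap_face(frame: np.ndarray, box: tuple) -> np.ndarray:
    """Replace only the face region of one frame; everything else is kept."""
    top, left, height, width = box
    out = frame.copy()
    crop = frame[top:top + height, left:left + width]
    out[top:top + height, left:left + width] = generate_target_face(crop)
    return out

# Synthetic "video": 3 frames of 64x64 RGB noise, face assumed at a fixed box.
rng = np.random.default_rng(1)
video = rng.integers(0, 216, size=(3, 64, 64, 3), dtype=np.uint8)
face_box = (16, 16, 24, 24)  # (top, left, height, width); a detector would supply this
result = np.stack([swap_face(f, face_box) for f in video])
print(result.shape)  # (3, 64, 64, 3)
```

A real pipeline would add face detection and alignment per frame plus blending at the seam, but the stitch-only-the-face structure is the same.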

The U.S. military, law enforcement and intelligence agencies have kept a wary eye on deepfakes for years. At a 2019 hearing, Senator Ben Sasse of Nebraska asked if the U.S. is prepared for the onslaught of disinformation, fakery, and fraud.

Ben Sasse: When you think about the catastrophic potential to public trust and to markets that could come from deepfake attacks, are we organized in a way that we could possibly respond fast enough?

Dan Coats: We clearly need to be more agile. It poses a major threat to the United States and something that the intelligence community needs to be restructured to address. 

The technology behind deepfakes is artificial intelligence, which mimics the way humans learn. In 2014, researchers for the first time used computers to create realistic-looking faces using something called “generative adversarial networks,” or GANs.

In a GAN, you set up an adversarial game in which two AIs combat each other to create the best fake synthetic content. As the two networks compete, one trying to generate the most convincing image and the other trying to detect where it could be better, the output keeps improving.
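That adversarial loop can be shown in miniature. The sketch below is deliberately tiny and illustrative, nothing like the size of a face-generating GAN: a two-parameter generator learns to mimic a simple 1-D Gaussian “dataset,” a logistic-regression discriminator plays the adversary, and a small weight decay on the discriminator (an assumption added here for stability) keeps the alternating updates from oscillating.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

def sample_real(n):
    # The "real data" the generator must learn to imitate: N(mean=4.0, sd=0.5).
    return rng.normal(4.0, 0.5, n)

a, b = 1.0, 0.0   # generator G(z) = a*z + b, with noise z ~ N(0, 1)
w, c = 0.1, 0.0   # discriminator D(x) = sigmoid(w*x + c) = P(x is real)
lr, batch, decay = 0.05, 64, 0.1

for step in range(8000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    x_real = sample_real(batch)
    x_fake = a * rng.normal(0.0, 1.0, batch) + b
    p_real, p_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    grad_w = (-(1 - p_real) * x_real + p_fake * x_fake).mean() + decay * w
    grad_c = (-(1 - p_real) + p_fake).mean() + decay * c
    w, c = w - lr * grad_w, c - lr * grad_c

    # Generator step: move the fakes toward where the discriminator says "real".
    z = rng.normal(0.0, 1.0, batch)
    p_fake = sigmoid(w * (a * z + b) + c)
    grad_a = (-(1 - p_fake) * w * z).mean()
    grad_b = (-(1 - p_fake) * w).mean()
    a, b = a - lr * grad_a, b - lr * grad_b

fakes = a * rng.normal(0.0, 1.0, 10_000) + b
print(f"generated mean: {fakes.mean():.2f}  (real mean: 4.0)")
```

The generator never sees the real data directly; it improves only through the discriminator’s feedback, which is exactly the dynamic the 2014 GAN work introduced (there, with deep networks producing faces instead of numbers).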

The power of generative adversarial networks is on full display at a website called “”: every time you refresh the page, you see a new image of a person who does not exist, and these fake people look strikingly realistic.

Synthesia, based in London, is one of dozens of companies using deepfake technology to transform video and audio productions. Synthesia essentially replaces cameras with code, allowing it to do a lot of things that you wouldn’t be able to do with a normal camera. It’s still very early, but some people believe this will be a fundamental change in how media is created. Synthesia makes and sells “digital avatars,” using the faces of paid actors to deliver personalized messages in 64 languages… and allows corporate CEOs to address employees overseas.

You can try it for free; the free version does not allow you to choose your avatar, but you can have an avatar deliver a customized message. Here is what I came up with:

Can you imagine if I could have Barack Obama deliver such a message, or Bruce Springsteen? I’d have to hire a staff of people to help me with my blog…

So it is a whole new world out there, and it will get harder and harder to know what is real.

But you can always count on Borden’s Blather being the real deal; no one would want to associate a fake account with it…

You can watch the whole 60 Minutes segment by clicking here.

*image from NY Post

43 thoughts on “Synthetic Media Is Here. Are You Ready?”

  1. tell me something about your clowning and juggling if you are really jim….

    kind of scary, isn’t it? we’ll have an even harder time knowing what’s real from here on out –


  2. It’s scary stuff, Jim. It really is difficult to know just what is real and what isn’t. Except your video is a winner of course.


  3. Why Tom Cruise. That guy fuels a deep visceral reaction of hate in me every time I see his smug grin. Our AI has surpassed our intelligence. Soon, anyone is going to be able to make a video of Trump telling his Q-Anon followers to start an armed revolution. Doesn’t bode well for our future.


  4. So what you’re saying is that all those videos of Numpty Trumpty were actually fake nooz and it was really Putin doing impressions all along? I’d believe that 😊


  5. I’d think there would be legal liability for deepfaking someone’s image if you could find out who did it. You don’t need Tom Cruise to do an ad if you can deepfake it. Even if you could identify the culprit, the harm will have already been done, and the culprit may not have any assets to pay a judgment.


    1. here was a blurb from the show:

      “If you are wondering how all of this is legal, most deepfakes are considered protected free speech. Attempts at legislation are all over the map. In New York, commercial use of a performer’s synthetic likeness without consent is banned for 40 years after their death. California and Texas prohibit deceptive political deepfakes in the lead-up to an election. “


      1. Interesting and good to know. You could certainly harm a person’s reputation with deepfakes not to mention the harm to society in undermining our ability to believe what we see. The legal side has some catching up to do, as usual when it comes to new technology and scientific advances.

