Are you constantly self-conscious about what you do in public, always worried you'll be filmed doing something that will invite a Twitter mob to ruin your life? Well, guess what? Machines are now making those videos for you. "Deepfakes," videos fabricated by computers to show people doing and saying things they never did, could be a major threat to individuals, and to the world, in an era of mainstreamed conspiracy theories and fake news. The process involves taking an existing video and combining it with another. The technology behind deepfakes uses artificial intelligence to gather information about a subject's body movements and facial expressions, building a theoretically realistic portrayal of what a combination of the two videos would look like. Essentially, if you searched Google for images of a celebrity, you would eventually collect pictures of them at every possible angle. Eventually. Machines can do that a lot faster than humans. The result is an eerily realistic video of someone doing whatever you want them to.
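For the curious, the architecture most often described for this kind of face swapping is a shared encoder paired with one decoder per identity. The sketch below is a toy illustration of that data flow only: the weights are random and untrained, and the dimensions (`FACE_DIM`, `LATENT_DIM`) are made-up placeholders, not values from any real system.

```python
import numpy as np

# Toy sketch of the shared-encoder / two-decoder idea behind deepfake
# face swapping. Weights are random and untrained -- this illustrates
# the data flow, not a working model.

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64   # a flattened 64x64 grayscale face crop (assumed size)
LATENT_DIM = 128     # compressed representation of pose and expression

# One encoder is shared by both identities; each identity gets its own decoder.
encoder = rng.normal(size=(LATENT_DIM, FACE_DIM))
decoder_a = rng.normal(size=(FACE_DIM, LATENT_DIM))
decoder_b = rng.normal(size=(FACE_DIM, LATENT_DIM))

def encode(face):
    return encoder @ face

def swap_a_to_b(face_a):
    # The trick: encode a face of person A, then decode it with B's decoder.
    # After real training, this renders B's face with A's pose and expression.
    return decoder_b @ encode(face_a)

face_a = rng.normal(size=FACE_DIM)
fake_b = swap_a_to_b(face_a)
print(fake_b.shape)  # (4096,)
```

Because the encoder sees faces of both people during training, it learns features common to any face, while each decoder learns to paint one specific identity back on top of them.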
Deepfakes were initially used for putting celebrities' faces on people in porn videos, or adding Nicolas Cage to otherwise Nicolas Cage-less movie moments. Making Meghan Markle appear to be a porn star is certainly cruel enough, but consider this: the past two years have been awash in discussions of biased news reporting and Russian election hacking. Our country is more divided than ever before. What are the chances that someone will use this technology to undermine democracy? What if you could produce a video of journalists saying whatever you wanted them to? What if someone makes a video of Trump, or another world leader, declaring war on another nation? This could make the controversy over the Jim Acosta video look like nothing.
So how do we combat deepfakes? Well, they fall apart if you look closely enough. Often, the technique used to make deepfakes causes parts of the video to distort. Since many of the facial movements and expressions are based on still photos, even the best deepfakes don't show the subject blinking often enough to look human. They also fall into the "uncanny valley," the point at which a nearly-human imitation becomes deeply unsettling. Unfortunately, a quick look online will make it apparent that careful examination and critical thinking are not key characteristics of current political discourse. Many of these flaws also become less apparent when people watch the videos on their phones. And the artificial intelligence used to create deepfakes could eventually learn to correct some of them, producing even more realistic videos. Hopefully, facts will prevail in the end, but the threat posed by deepfakes should not be ignored. When the public can be misled so easily, the very concept of democracy is at risk.
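The blinking cue above can actually be measured. One common approach computes an "eye aspect ratio" (EAR) from six eye landmarks and flags a blink when the ratio drops. This is a minimal sketch of that check alone; it assumes the landmark coordinates have already been produced by some facial-landmark detector, and the `0.2` threshold is an illustrative assumption, not a tuned value.

```python
import math

# Minimal sketch of the eye-aspect-ratio (EAR) blink check sometimes used
# when hunting for deepfakes. Assumes the six standard eye landmarks
# p1..p6 (corners, then upper and lower lids) already exist; running an
# actual landmark detector is out of scope here.

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
    # Stays relatively high while the eye is open, drops toward 0 in a blink.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def looks_like_blink(ear, threshold=0.2):  # threshold is an assumed value
    return ear < threshold

open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0), (2, 0), (3, 0), (2, 0), (1, 0)]

print(looks_like_blink(eye_aspect_ratio(*open_eye)))    # False
print(looks_like_blink(eye_aspect_ratio(*closed_eye)))  # True
```

Count how often the EAR dips across a clip: if a talking head goes minutes without a single blink, that's a red flag.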
This week, packages containing explosive devices were found outside the homes of philanthropist George Soros, the Clintons, the Obamas, and others critical of Donald Trump, as well as outside the New York headquarters of CNN. Many considered these incidents evidence that right-wing hate speech, sentiments emboldened and often shared by President Trump, was being turned into action. Others said the left brought this on themselves through hostile rhetoric toward right-wingers and failure to support the President. Some even claimed the bombs were a "false flag," planted by the victims themselves to attract sympathy right before the midterms. All of this appeared in the comments section of a single YouTube video CNN posted about the bombing attempt on its own office. Political discourse has become so deeply fragmented that it's easy to become jaded, uncaring, and silent on major issues. What's the point of making your voice heard if it won't matter in such a chaotic en...