Philosophy In A World Of Mass Deception

Our current political situation is so horribly distressing that it is easy to lose sight of even more horrible things that may be on the horizon.

A recent article at Business Insider briefly surveys some of the forthcoming technological developments, some just a few years away, that threaten to put us in a persistent state of Cartesian doubt: “tools that will allow anyone to easily create fraudulent, photo-realistic video and audio.”

Thanks to advances in artificial intelligence (AI) and computer-generated imagery (CGI) technology, over the coming decade it will become trivial to produce fake media of public figures and ordinary people saying and doing whatever hoaxers can dream of—something that will have immense and worrying implications for society…

It will open up worrying new fronts in information warfare, as hostile governments weaponise the technology to sow falsehoods, propaganda, and mistrust in target populations. The tools will be a boon to malicious pranksters, giving them powerful new tools to bully and blackmail, and even produce synthetic “revenge porn” featuring their unwilling targets. And fraud schemes will become ever-more sophisticated and difficult to detect, creating uncertainty as to who is on the other end of any phone call or video-conference.

The article’s author, Rob Price, conferred with Gregory C. Allen, an adjunct fellow at the Center for a New American Security, about the technology and our unusual historical position:

These advances mean that humanity is rapidly approaching the end of a unique period in human history. We “live in an amazing time where the tech for documenting the truth is significantly more advanced than the tech for fabricating the truth. This was not always the case. If you think back to the invention of the printing press, and early newspapers, it was just as easy to lie in a newspaper as it was to tell the truth,” Allen said. “And with the invention of the photograph and the phonograph, or recorded audio, we now live in a new technological equilibrium where—provided you have the right instruments there—you can prove something occurred… we thought that was a permanent technological outcome, and it is now clear that is a temporary technological outcome. And that we cannot rely on this technological balance of truth favouring truth forever.”

Here are “Barack Obama,” “Donald Trump,” and “Hillary Clinton” discussing some of the technology developed by a start-up known as Lyrebird.

Thinking a bit further down the road, if holographic technology and built-in augmented reality become common, the problems may be even more severe.

I wonder what the effect of this technology will be on philosophy. I’m not talking about creating fake but convincing video footage of, say, notorious utilitarian Alastair Norcross endorsing the categorical imperative (though someone should totally do that). Rather, I’m interested both in philosophy’s possible roles in helping humanity live with technology that could have us thinking it likely—not just possible—that we are being fooled (that is, a life of regularly expecting deception), and in the ways such technology might change philosophy, if at all.

For further horrifying speculation about the future, watch Black Mirror.

(The name “Lyrebird,” fittingly, sounds like “liar.”)

Durval Menezes
6 years ago

As a Promethean, I think that the evils of technology could be solved by more technology.

In the case of fake videos, it would be relatively trivial (i.e., much less difficult than actually faking the videos) to embed a cryptographic signature of the person(s) supposedly being filmed, so that a video’s authenticity could be checked automatically by the video player, much as a web browser nowadays authenticates a bank’s web site.
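The core idea here can be sketched in a few lines. This is only an illustration, not the commenter’s actual proposal: it uses a symmetric MAC (HMAC) for brevity, whereas a real deployment would use public-key signatures (sign with the camera’s or subject’s private key, verify with a published public key, as in the browser analogy), and all names below are hypothetical:

```python
import hashlib
import hmac
import os

def sign_recording(video_bytes: bytes, key: bytes) -> bytes:
    """Produce a tag binding the recording to the key holder."""
    return hmac.new(key, video_bytes, hashlib.sha256).digest()

def verify_recording(video_bytes: bytes, tag: bytes, key: bytes) -> bool:
    """What a video player would do before displaying an 'authentic' badge."""
    expected = sign_recording(video_bytes, key)
    return hmac.compare_digest(expected, tag)

# Hypothetical usage: the device signs at capture time.
key = os.urandom(32)  # in reality, a private key kept in the device
original = b"raw frame data from the camera sensor"
tag = sign_recording(original, key)

print(verify_recording(original, tag, key))         # untampered video verifies
print(verify_recording(original + b"x", tag, key))  # any edit breaks the tag
```

The point the sketch makes concrete: verification is cheap and automatic, while forging a valid tag without the key is computationally infeasible, which is the asymmetry Menezes is relying on.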

Tristan Haze
Reply to  Durval Menezes
6 years ago

Could you explain this further, Durval? I’m intrigued, but I don’t understand the proposed solution. What would happen if I filmed someone committing a crime? Say I don’t know who they are, but they look different from everyone else in the world. Would your solution enable it to be shown that this video was genuine? Apologies if I’m being thick, but I bet I’m not the only reader who doesn’t completely understand your proposal.

Alan White
Reply to  Tristan Haze
6 years ago

If I understand this suggestion, it would require:

(i) all sources of digital record–phones, cameras, etc.–to embed a spatiotemporal-GPS unique identifier not just in an image but throughout the digital grid of an image so that cropped parts could be identified as used in cobbled fakes
(ii) all CGI would contain similar identifiers as a sort of “alibi” for anyone depicted in wholly fictional ways (that would require tracking of all real people in real spacetime to verify such alibis with something like implants)
(iii) implementation of tech that would disable/prevent any device from circumventing (i) and (ii)
(iv) some sort of rigidly enforced global/universal standard of (i) and (ii) and (iii) internationally

The weakness is not the tech that would be required for (i) and (ii)–that’s probably within our grasp already, and is what Menezes is talking about–the problem is the practical achievement of (i) and (ii) through (iii) and (iv) that would make it effective. I don’t realistically see how (iii) is achievable given the same Promethean assumption Menezes makes, or how (iv) jibes with multicultural legal systems–to say nothing of the darker sides of ingenious human nature that work against the entire proposal (like going old-school film as a basis to corrupt digitizing, etc.).

BTW Justin, Black Mirror is terrifyingly gripping. “Shut Up and Dance,” e.g.

Derek Bowman
6 years ago

I wouldn’t want to deny that this leap in technology is troubling, but I wonder if it’s really different in kind than the problems we already have. Video can already be easily edited and manipulated by selecting what is and isn’t shown. We still rely on written and printed documents for many purposes – e.g. tax returns, e-mail records, bank records, etc.

I guess I don’t see what this technology will allow that existing forms of spin and fake news don’t already allow. We have already proven very adept at filtering and interpreting information. The idea that video/audio recordings, or even eyewitness testimony, are some spin-free direct report of truth is already an illusion. See, for example, this Vice News interviewer’s conversation with one of the “Unite the Right” organizers about the video of the car driving into protesters and killing Heather Heyer.

Reply to  Derek Bowman
6 years ago

indeed, the problems are as old as our species…

Eric Campbell
6 years ago

I think it would be great if philosophy got to work much more on the problems around the fact that it is already the case that we are extremely likely–indeed certainly–being fooled by all manner of propaganda at many levels, from “vertical” corporate and government propaganda to “horizontal” propaganda within (sub) cultures and indeed all socio-moral groups. Actually I think it would be great if I got to work on that, so toodles!

DocF
6 years ago

The problem: (1) Who in the populace reads or cares about what philosophers say? (2) Why do we have a society in which that is the case? (3) Given (1) and (2), is there any hope that philosophers can make a difference?

My take is that we have to keep trying. But my gut tells me it is all in vain in a populace that found our current president their first choice. Let us, as philosophers, carry out Socrates’ mission: gadflies on the body politic.

Dale Miller
Reply to  DocF
6 years ago

But on the bright side, the public’s complete lack of interest in anything we say makes it unlikely that anyone will bother making fake videos of us.

Lord Philostronaut
6 years ago

A wee shout out to the University of Glasgow who have just established a position in Philosophy specifically focussed on questions arising from AR and VR tech (and whether they are genuinely novel).