In defense of Peter Daou's dumb app
Yes, it's ridiculous. Yes, it's ineffectual. But at least it's trying to solve a real problem.
Peter Daou is forever being dunked on. The former Clinton strategist has become a figurehead on Twitter for his absurd loyalty to Hillary, and his brash yet naïve political pronouncements have led to him being mercilessly mocked. The Outline called him "the weirdest man alive," while the New Republic said he is an embarrassment to Clinton.
So when Daou this week launched Verrit, a website dedicated to providing verified soundbites and facts for supporters of Clinton to combat the right, it was almost inevitable that mockery would follow. That doesn't mean it wasn't justified. Verrit aims to be a site for the 65.8 million people who voted for Hillary, and works by listing facts on small "cards" that purport to be verified with a seven-digit number and that include links to their sources. The point, according to Daou, is to arm supporters of Clinton with the truth against the accusation that has come to dominate online discourse thanks to the president: fake news.
The problems with the site, however, should be clear. The little cards, meant to be shared on social media, are easy to fake. The verification process appears to be little more than linking to the stories that are used as sources. And its actual effects are unclear: why would anyone who didn't care about certain "facts" before start caring now because of a seven-digit code? It is, in essence, a propaganda outlet, but one that isn't even very good at propagandizing.
All that said, Daou's attempt is in its own way understandable. Naïve as it might be, Verrit emerged out of a desire to verify what's true, to put into circulation trustworthy pieces of information against a backdrop of misinformation and constant conflict. The horizon of knowledge in the 21st century has become intensely clouded, in no small part because deep political polarization and fake news have found the perfect medium in the web. In that sense, Verrit gets the solution wrong, but still expresses a real anxiety about how to form consensus when we cannot agree on what constitutes the truth.
Of course, truth has always been in a state of flux, as in those now familiar controversies about the Enlightenment, Copernicus and Galileo, Darwin, or any number of scientific and philosophical inquiries. But particularly in the 20th and 21st centuries, the notion of truth as a fixed goal toward which we move ever closer shifted as we began to understand that deep questions about, say, morality or society have no fully objective answers and are instead rooted in arbitrary worldviews. Making matters more complicated, the rise of mass media culture and its circulation of images and ideas began to destabilize once-clear questions of authenticity and immanence; how might a war have the same impact when it feels little different from a movie? Once fanciful-sounding theories like Jean Baudrillard's notions of hyperreality or the simulacrum, in which representations of things come to occupy the same importance as "real" things, became commonplace and self-evident.
But digital technology and the web in particular seem to have exacerbated these conditions. Consider that just this week, 30 million people viewed a Facebook video falsely claiming it was from Hurricane Irma, when it was in fact from a tornado in Uruguay. False stories of Black Lives Matter protesters blocking emergency services during a hurricane ricocheted around right-wing sites.
The internet is filled with thousands upon thousands of such false or badly sourced pieces. And as Kyle Chayka has pointed out, because the aesthetics of social media and websites in general flatten difference — making a fake story and a real one reported on by four journalists look indistinguishable — it becomes more and more difficult for even sophisticated readers to determine what is and isn't credible. And that might be the biggest problem of all.
Returning to the old ways of determining the truth, as Verrit yearns to do, is probably impossible. People might not accept facts that clash with their worldviews even in the best of times. But the web and social media have nonetheless done something to corrupt our discourse.
This week, for instance, it was revealed that Facebook sold $100,000 in ads in 2016 to a Russian troll farm linked to election meddling. The ability to detect what is and is not true is thus not simply a question of being informed about the news, but also about recognizing how the structure of social media can be gamed to significant effect.
Peter Daou's dumb app might not be the answer to this problem, but it at least tries to address a simmering new anxiety. Hopefully someone else will try a little harder.