
What makes information feel true or compelling? In today’s digital society, what seems “true” can be performed, produced, scripted and dramatized. Authenticity itself is patterned on the algorithmic flows of capitalism and subject to variation, iteration and outside influence. Is it even possible to sift through the mis- and disinformation to find the authentic?

Simon Fraser University’s (SFU) Wendy Hui Kyong Chun and Anthony Glyn Burton’s new book, Algorithmic Authenticity, brings together different disciplinary understandings of “authenticity” to find alternative ways to approach mis- and disinformation beyond contemporary fact-checking. Linking cross-disciplinary research on the history and practices of algorithms and authenticity, they offer alternative ways to understand authenticity’s impact on social life and its role in current information disorders.

Chun is the Canada 150 Research Chair in New Media in the School of Communication at SFU and a Fellow of the Royal Society of Canada. She is the director of the Digital Democracies Institute, which she founded to address concerns in online spaces. This diverse group of scholars and stakeholders from around the world collaborates across disciplines, schools, industry and public sectors to research and create vibrant democratic technologies and cultures. Chun also leads the Data Fluencies Project, a Mellon-funded international research project combining analytical techniques from the arts and humanities with tools from data science to better understand and engage with our data-filled world.

Burton is a Ph.D. candidate at SFU's School of Communication and a Mellon-SFU Graduate Fellow at the Digital Democracies Institute. His research interests broadly include the networked development of epistemologies and ideology in technological and datafied environments, as well as internet political subcultures. His doctoral research focuses on the links between libertarianism in Silicon Valley, the alt-right, and the right's adoption of leftist social theory.

Burton is the first author of Algorithmic Authenticity, written with Wendy Chun, Liliana Bounegru, Melody Devries, Amy Harris, hannah holtzclaw, Ioana Jucan, Alex Juhasz, D. W. Kamish, Ganaele Langlois, Jasmine Proctor, Christine Tomlinson, Roopa Vasudevan, and Esther Weltevrede.


We spoke with Burton and Chun about their research.  


Our culture tends to appreciate truth and authenticity, yet the digital environment is rife with bias, conspiracies and “fake news.” What is meant by “algorithmic authenticity”?

Algorithmic authenticity helps us understand the persistence of misinformation within the digital environment. As many researchers have pointed out, fact-checking, although important, isn’t enough: corrections lag behind stories and, by repeating the original story, can also disseminate misinformation. More importantly, people spread stories they find authentic or true—regardless of their facticity. Tellingly, the 2016 U.S. elections were called both the “fake news elections” and the “authenticity elections”—with the winner scoring high on both. In Algorithmic Authenticity, we ask: what makes information feel compelling or true?

By stressing the algorithmic nature of authenticity, we also reveal how authenticity—something supposedly genuine and spontaneous—has become a method. There is an explosion of “how-to” guides that offer step-by-step rules for becoming an authentic influencer. The algorithmic nature of authenticity, we reveal, isn’t new—following the command “to thine own self be true” is fundamentally algorithmic. Algorithms are, broadly speaking, procedures that take a given input and produce an expected output, and we broaden this definition beyond computing environments to show how social media and the digital world have given an algorithmic tint to conceptions of human and cultural authenticity.
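That broad definition of an algorithm—a fixed procedure mapping a given input to an expected output—can be sketched in a few lines of Python. This is a purely illustrative example (the “authenticity checklist” below is hypothetical and not from the book); it simply shows how step-by-step “how-to” rules are algorithmic in exactly this input-to-output sense:

```python
# An algorithm, broadly defined: a fixed sequence of steps that
# transforms a given input into an expected output.
# The steps here mimic a hypothetical "how to seem authentic" guide.

def follow_recipe(post: str) -> str:
    """Apply step-by-step 'authenticity' rules to a social media post."""
    steps = [
        str.strip,                    # step 1: tidy the raw text
        str.lower,                    # step 2: adopt a casual register
        lambda s: s + " #nofilter",   # step 3: signal spontaneity
    ]
    for step in steps:
        post = step(post)
    return post

print(follow_recipe("  Just Woke Up Like This  "))
# → "just woke up like this #nofilter"
```

The point of the sketch is that the same input always yields the same output: the “spontaneity” is produced by a procedure, which is the sense in which authenticity becomes a method.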

A fascinating section of the book talks about “the authentic self.” But ironically, our lives are highly mediated by society and by the internet. We are adept at crafting online truths and personas. Is the “authentic self” real or an algorithm? Or is it both?

To answer your question: both. As mentioned previously, the clarion call of authenticity is “to thine own self be true.” This line is taken from Hamlet, which reveals that “thine own self” is less some eternal truth and more something performed, something that emerges in relation to others. On social media, we are given platforms to express ourselves—but these expressions, framed in terms of visibility or revealing our inner true self, are mediated by the platforms themselves (think of Meta's own knowledge that Instagram negatively affects teen mental health, chiefly because it is engineered to have people mostly express the glamorous parts of life). But it takes two to tango: user agency is still required to make these expressions exist in the first place.

How do you recommend that we arm ourselves against mis- and disinformation? How do we move beyond contemporary fact-checking in our digital environments?

As mentioned previously, fact-checking doesn't work on its own—humans are charged by emotions, by their own needs and concerns, and fact-checkers have not made the difference they were supposed to. So on the one hand, a critical consciousness is key—we have to constantly question what we read online. On the other hand, the very move for people to “do their own research” has partially gotten us into this mess in the first place, especially in terms of the persistence of conspiracy theories. There need to be interventions bigger than those centred around the self, because social media companies profit off of misinformation—it is content, and it provides data.

Removing the impact of the profit motive when it comes to the dissemination of news is a start, especially to prevent destructive actions like major social media platforms removing established news sources from people's newsfeeds, as has happened here in Canada and is about to happen in Australia. Also, as Naomi Oreskes has argued in her must-read book, Why Trust Science?, scientific and professional communities have to be more open and diverse, and engage in transparent and rigorous peer-review processes.

Tell us more about the work of the Digital Democracies Institute. Is there a current project that you are particularly excited about?

BURTON: As a young scholar, I am still in the afterglow of publishing Algorithmic Authenticity, but we are still doing work concerning the spread of mis- and disinformation. Right now a few of us are editing a special issue of the International Journal of Communication that extends some of the questions tackled in the book. It is centred around investigating the different roles that emotions (we are using the word “affect”) play in how mis- and disinformation intersect with these conceptions of authenticity. It is a rich field and there are many different angles from which we need to look at these issues, so there is much more work to be done here.

CHUN: I am currently most excited by the Data Fluencies Project. Moving beyond literacy, data fluencies combine the interpretative traditions of the arts and humanities with critical work in the data sciences to express, imagine and create innovative engagements with (and resistances to) our data-filled world. Our projects range from developing mixed qualitative-quantitative methods to study social media that are not based on user surveillance (i.e., constantly tracking and experimenting on users) to community-based efforts to rethink machine learning. This is a huge multi-institutional and multi-disciplinary project. I am also working on a new book, Sensing AI: Sentiment, Generative Models and the Returns of Critical Theory, that explores the odd resonances between theories in the humanities and those that ground the computational social sciences.


For more: Buy or download Algorithmic Authenticity, and visit the Digital Democracies Institute at: digitaldemocracies.org


SFU's Scholarly Impact of the Week series does not reflect the opinions or viewpoints of the university, but those of the scholars. The timing of articles in the series is chosen weeks or months in advance, based on a published set of criteria. Any correspondence with university or world events at the time of publication is purely coincidental.

For more information, please see SFU's Code of Faculty Ethics and Responsibilities and the statement on academic freedom.