The "Micro-Propaganda Machine"

Organisation: Independent (Medium) (United States)

Publication Date: 03/30/2017

Applicant(s)

Size of team/newsroom: small

Description

Starting in November 2016, I produced a series of major investigative data pieces on Medium concerning online propaganda, emergent technology, and "fake news." The project examines the structural changes taking place in the media ecosystem. Using large-scale hyperlink analysis, my work contributed empirical evidence to the debate around "post-truth" politics and coordinated misinformation campaigns. The project is a hybrid of academic research, open data, large-scale network visualization, and "deep" data-driven investigative storytelling.

Beyond the "fake news," "right-wing," and "left-wing" hyperlink-based ecosystem maps, the project also investigates the behavioral tracking and advertising technologies that underpin this politicized "micro-propaganda machine." In a separate adtech data scrape of close to 100 "fake news" websites, I found a stripped-down yet densely connected network of behavioral trackers, dodgy HTTP redirects, insecure data siphoning, and social-platform integrations. This weaponized "shadow" infrastructure appears to help misinformation move from propaganda sites and hyper-partisan blogs into social media platforms like Facebook. By revealing more about this "dark side" of propaganda networks, research could help curb the effects of coordinated misinformation campaigns (e.g., #Pizzagate).

The preliminary goal of my research was to spatially map the 2016 post-election news ecosystem beyond the walls of Facebook and Twitter. The initial analysis, however, revealed tens of thousands of intricate connections among highly politicized media actors across the internet: patterns linking news websites, search engines, and even "clusters" of individual social media profiles huddled around opinion leaders.
Journalists, academics, and political analysts have begun to default to biased data sources like Facebook, Twitter, and Google, as well as packaged information providers such as Wikileaks, with little regard for the underlying issues (e.g., doxing, the impact of automated accounts) or for adding context around the information they relay to the public. I believe this disconnect has helped widen the data-culture gap, and is part of the reason platforms like Twitter can be used strategically to obscure critically important issues in media, politics, and society.
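To make the hyperlink-analysis method concrete, the sketch below shows the general idea in stdlib-only Python: extract the domains of outbound links from a page's HTML and aggregate them into a weighted source-to-target edge list of the kind Gephi can import. The site names and the HTML snippet are hypothetical placeholders, not data from the project, and the actual scraping pipeline was certainly more involved.

```python
# Minimal sketch of hyperlink-ecosystem mapping: count which external
# domains a seed site links to. All site names here are made up.
from html.parser import HTMLParser
from urllib.parse import urlparse
from collections import Counter

class LinkExtractor(HTMLParser):
    """Collect the hostnames of every <a href> link on a page."""
    def __init__(self):
        super().__init__()
        self.domains = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            host = urlparse(dict(attrs).get("href", "")).netloc
            if host:
                self.domains.append(host)

def edge_list(source_site, html):
    """Return (source, target, weight) rows suitable for a Gephi edges CSV."""
    parser = LinkExtractor()
    parser.feed(html)
    counts = Counter(d for d in parser.domains if d != source_site)
    return [(source_site, target, n) for target, n in counts.items()]

page = ('<a href="https://example-blog.com/post">post</a>'
        '<a href="https://example-blog.com/about">about</a>')
print(edge_list("seed-site.com", page))
# [('seed-site.com', 'example-blog.com', 2)]
```

Running this across a seed list of sites, then feeding the accumulated rows to Gephi as a weighted directed edge list, is one plausible route to the kind of ecosystem maps described above.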

What makes this project innovative? What was its impact?

The reactive amplification of direct-to-audience media appears to be a stark new reality of the "post-truth" politics era. Yet by visualizing the structural connections between networked actors and locating the ideological clusters within these issue-based media ecosystems, research can lead the way to a better understanding of the fundamental nature of misinformation. For instance, in follow-up work I found nearly 80,000 YouTube news videos created by AI-based technology — a result I see as highly pertinent to showing how "fake news" can increasingly be spread with lower financial and human overhead.

In exploring what I've labeled the "micro-propaganda machine," the project charts the changing news ecosystem, demonstrating that what's "in the public interest" is increasingly the opposite of what the public is interested in. To make matters worse, much of the news media, and consequently the audiences who rely on these institutions for accurate information, are being led down the wrong paths in their sourcing and reporting efforts. In a sense, news organizations seem to be "following" the conversation instead of working to organize the information that the public needs to know in the first place.

The response to the "fake news" ecosystem project has been highly encouraging: my work was featured in Fortune magazine, appeared as a two-page full-color network-visualization spread in The Observer's "Google and Democracy" piece, was cited in a major #Pizzagate story in The Washington Post, was mentioned in Jane Mayer's 27 March 2017 New Yorker cover story, and appeared in nearly 70 other leading news outlets across the world. The project's introductory (background) piece, republished on LSE's USA Politics and Policy blog, was one of its top ten "most-read" articles of 2016.

Technologies used for this project:

I relied mostly on open-source software and tools for this project, including Gephi, a Python-based web-scraping tool, a JavaScript screen-capture extension, and heavy use of Google Sheets and Docs. I also shared most of the resulting data from each project story on Pastebin, ShareCSV, and GitHub, to make the case for the benefits of an "open" web for transparent reporting and research. I'd argue the main technologies, however, were human: deductive logic and hundreds of hours of time!
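As an illustration of the adtech audit mentioned in the project description, here is a minimal stdlib-only Python sketch that flags third-party and insecure (plain-HTTP) resources embedded in a page — the kind of signal that surfaces behavioral trackers and dodgy redirects. The hostnames and page snippet are invented for illustration and do not reflect the project's actual data or tooling.

```python
# Hedged sketch of a page-level adtech audit: record external script/iframe/img
# hosts and any resources loaded over plain http. All hostnames are fictional.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ResourceAudit(HTMLParser):
    def __init__(self, first_party):
        super().__init__()
        self.first_party = first_party
        self.third_party = []   # external hosts serving embedded resources
        self.insecure = []      # resources fetched over plain http

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "iframe", "img"):
            return
        src = dict(attrs).get("src", "")
        parts = urlparse(src)
        if parts.netloc and parts.netloc != self.first_party:
            self.third_party.append(parts.netloc)
        if parts.scheme == "http":
            self.insecure.append(src)

page = ('<script src="https://tracker.example-adnet.com/t.js"></script>'
        '<img src="http://pixel.example-ads.net/p.gif">')
audit = ResourceAudit("fakenews-site.example")
audit.feed(page)
print(audit.third_party)  # ['tracker.example-adnet.com', 'pixel.example-ads.net']
print(audit.insecure)     # ['http://pixel.example-ads.net/p.gif']
```

Aggregating these per-site results across many sites would yield a tracker-to-site network analogous to the hyperlink maps, which matches the "densely connected network of behavioral trackers" finding described above.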