Understanding Value in Media: Perspectives from Consumers and Industry

Understanding Value in Media: Perspectives from Consumers and Industry. World Economic Forum. April 2, 2020.

The disruption of the media industry, driven by the rise of social media, the digitization of content and the increase in mobile consumption, has changed traditional funding models beyond recognition.
The role of media has historically been central to the making of society and the construction of identity. At this dark moment for humanity, threatened by COVID-19 and with many people physically isolated, this role is vital in the search for information, stories and art to feed the human spirit and ignite the imagination to overcome the challenges ahead.
This report considers how different stakeholders in media – content creators, advertisers, marketing agencies and individual consumers – each value media content. By analysing this dynamic, the industry – and consumers – can make informed decisions about the future. [Note: contains copyrighted material].

[PDF format, 40 pages].

Detecting Malign or Subversive Information Efforts over Social Media: Scalable Analytics for Early Warning

Detecting Malign or Subversive Information Efforts over Social Media: Scalable Analytics for Early Warning. RAND Corporation. William Marcellino et al. March 16, 2020.

The United States has a capability gap in detecting malign or subversive information campaigns before these campaigns substantially influence the attitudes and behaviors of large audiences. Although there is ongoing research into detecting parts of such campaigns (e.g., compromised accounts and “fake news” stories), this report addresses a novel method to detect whole efforts. The authors adapted an existing social media analysis method, combining network analysis and text analysis to map, visualize, and understand the communities interacting on social media. As a case study, they examined whether Russia and its agents might have used Russia’s hosting of the 2018 World Cup as a launching point for malign and subversive information efforts. The authors analyzed approximately 69 million tweets, in three languages, about the World Cup in the month before and the month after the event, and they identified what appear to be two distinct Russian information efforts, one aimed at Russian-speaking and one at French-speaking audiences. Notably, the latter specifically targeted the populist gilets jaunes (yellow vests) movement; detecting this effort months before it made headlines illustrates the value of this method. To help others use and develop the method, the authors detail the specifics of their analysis and share lessons learned. Outside entities should be able to replicate the analysis in new contexts with new data sets. Given the importance of detecting malign information efforts on social media, it is hoped that the U.S. government can efficiently and quickly implement this or a similar method. [Note: contains copyrighted material].
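
The report describes its method only at a high level. As a rough, hypothetical illustration of the general approach it names (community detection on a sharing network combined with per-community text analysis), the short Python sketch below uses networkx on a toy set of tweet records; the data, field layout, and library choice are assumptions for illustration, not the authors’ actual pipeline.

```python
# A minimal sketch, not the RAND authors' pipeline: it illustrates the general idea of
# combining network analysis (community detection on a retweet graph) with text analysis
# (per-community term profiles). All records and names below are invented.
from collections import Counter, defaultdict

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical tweet records: (author, retweeted_author, text)
tweets = [
    ("alice", "carol", "great match at the world cup"),
    ("bob", "carol", "world cup final was rigged, wake up"),
    ("carol", "dave", "the referees are controlled by elites"),
    ("dave", "carol", "mainstream media hides the truth about the cup"),
    ("erin", "alice", "loved the world cup atmosphere"),
]

# 1) Network analysis: build a retweet graph and detect communities.
graph = nx.Graph()
for author, retweeted, _ in tweets:
    graph.add_edge(author, retweeted)
communities = greedy_modularity_communities(graph)

# 2) Text analysis: characterize each community by its most frequent terms.
texts_by_user = defaultdict(list)
for author, _, text in tweets:
    texts_by_user[author].append(text)

for i, members in enumerate(communities):
    terms = Counter()
    for user in members:
        for text in texts_by_user.get(user, []):
            terms.update(text.lower().split())
    print(f"community {i}: members={sorted(members)}, top terms={terms.most_common(5)}")
```

At real scale the same division of labor would run over millions of tweets with more robust community detection and language modeling, but the core sequence (map the network structure first, then profile each community’s language) is what the paragraph above describes.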

[PDF format, 66 pages].

Is Seeing Still Believing? The Deepfake Challenge to Truth in Politics

Is Seeing Still Believing? The Deepfake Challenge to Truth in Politics. Brookings Institution. William A. Galston. January 8, 2020.

On Nov. 25, an article headlined “Spot the deepfake. (It’s getting harder.)” appeared on the front page of The New York Times business section. The editors would not have placed this piece on the front page a year ago. If they had, few would have understood what its headline meant. Today, most do. This technology, one of the most worrying fruits of rapid advances in artificial intelligence (AI), allows those who wield it to create audio and video representations of real people saying and doing made-up things. As this technology develops, it becomes increasingly difficult to distinguish real audio and video recordings from fraudulent misrepresentations created by manipulating real sounds and images. “In the short term, detection will be reasonably effective,” says Subbarao Kambhampati, a professor of computer science at Arizona State University. “In the longer run, I think it will be impossible to distinguish between the real pictures and the fake pictures.” [Note: contains copyrighted material].

[HTML format, various paging].

Profiles of News Consumption: Platform Choices, Perceptions of Reliability, and Partisanship

Profiles of News Consumption: Platform Choices, Perceptions of Reliability, and Partisanship. RAND Corporation. Michael Pollard, Jennifer Kavanagh. December 10, 2019.

In this report, the authors use survey data to explore how U.S. media consumers interact with news platforms, finding that perceptions of news reliability are mixed and that consumer partisanship broadly shapes news consumption behavior. [Note: contains copyrighted material].

[PDF format, 110 pages].

Fighting Deepfakes When Detection Fails

Fighting Deepfakes When Detection Fails. Brookings Institution. Alex Engler. November 14, 2019.

Deepfakes intended to spread misinformation are already a threat to online discourse, and there is every reason to believe this problem will become more significant in the future. So far, most ongoing research and mitigation efforts have focused on automated deepfake detection, which will aid deepfake discovery for the next few years. However, the situation is worse than cybersecurity’s perpetual cat-and-mouse game: automated deepfake detection is likely to become impossible in the relatively near future, as the approaches that generate fake digital content improve considerably. In addition to supporting the near-term creation and responsible dissemination of deepfake detection technology, policymakers should invest in discovering and developing longer-term solutions. Policymakers should take actions that:

  • Support ongoing deepfake detection efforts with continued funding for DARPA’s MediFor program, as well as new grants that encourage collaboration between detection efforts and train journalists and fact-checkers to use these tools.
  • Create an additional stream of funding awards for the development of new tools, such as reverse video search or blockchain-based verification systems, that may better persist in the face of undetectable deepfakes (a simplified sketch of the verification idea appears after this list).
  • Encourage the release of large social media datasets for social science researchers to study and explore solutions to viral misinformation and disinformation campaigns. [Note: contains copyrighted material].
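
As a hypothetical illustration of the “verification systems” idea in the second recommendation (not any specific product or the report’s own proposal), the Python sketch below registers a cryptographic fingerprint of a video at capture time in a simple append-only log so a later copy can be checked against it; a real system would anchor these fingerprints in a distributed, tamper-evident ledger.

```python
# A minimal sketch of hash-based content registration for later verification.
# The in-memory list stands in for an append-only, tamper-evident ledger.
import hashlib
import time


def fingerprint(video_bytes: bytes) -> str:
    """Return a SHA-256 digest of the raw video file."""
    return hashlib.sha256(video_bytes).hexdigest()


class ProvenanceLedger:
    """Append-only record of content fingerprints registered at capture time."""

    def __init__(self):
        self._entries = []  # stand-in for a distributed ledger

    def register(self, video_bytes: bytes, source: str) -> str:
        digest = fingerprint(video_bytes)
        self._entries.append({"digest": digest, "source": source, "ts": time.time()})
        return digest

    def verify(self, video_bytes: bytes) -> bool:
        """True only if this exact file was registered before it circulated."""
        return any(e["digest"] == fingerprint(video_bytes) for e in self._entries)


# Usage: a newsroom registers footage at capture; a fact-checker later verifies a copy.
ledger = ProvenanceLedger()
original = b"...raw video bytes from the camera..."
ledger.register(original, source="newsroom-camera-01")
print(ledger.verify(original))                    # True: untouched copy
print(ledger.verify(b"...manipulated bytes..."))  # False: altered or unregistered
```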

[HTML format, various paging].

The Democracy Playbook: Preventing and Reversing Democratic Backsliding

The Democracy Playbook: Preventing and Reversing Democratic Backsliding. Brookings Institution. Norman Eisen et al. November 2019.

The Democracy Playbook sets forth strategies and actions that supporters of liberal democracy can implement to halt and reverse democratic backsliding and make democratic institutions work more effectively for citizens. The strategies are deeply rooted in the evidence: what the scholarship and practice of democracy teach us about what does and does not work. We hope that diverse groups and individuals will find the syntheses herein useful as they design tailored, context-specific strategies for contesting and resisting the illiberal toolkit. This playbook is organized into two principal sections: one dealing with actions that domestic actors can take within democracies, including retrenching ones, and the other addressing the role of international actors in supporting and empowering pro-democracy actors on the ground. [Note: contains copyrighted material].

[PDF format, 100 pages].

Fighting Disinformation Online: A Database of Web Tools

Fighting Disinformation Online: A Database of Web Tools. RAND Corporation. Jennifer Kavanagh, Hilary Reininger, Norah Griffin. November 12, 2019.

The rise of the internet and the advent of social media have fundamentally changed the information ecosystem, giving the public direct access to more information than ever before. But it’s often nearly impossible to distinguish between accurate information and low-quality or false content. This means that disinformation — false or intentionally misleading information that aims to achieve an economic or political goal — can become rampant, spreading further and faster online than it ever could in another format.

As part of its Truth Decay initiative, RAND is responding to this urgent problem. Researchers identified and characterized the universe of online tools developed by nonprofits and civil society organizations to target online disinformation. The tools in this database are aimed at helping information consumers, researchers, and journalists navigate today’s challenging information environment. Each tool is characterized on a number of dimensions, including the type of tool, the underlying technology, and the delivery format.

Hostile Social Manipulation: Present Realities and Emerging Trends

Hostile Social Manipulation: Present Realities and Emerging Trends. RAND Corporation. Michael J. Mazarr et al. September 4, 2019.

The role of information warfare in global strategic competition has become much more apparent in recent years. Today’s practitioners of what this report’s authors term hostile social manipulation employ targeted social media campaigns, sophisticated forgeries, cyberbullying and harassment of individuals, distribution of rumors and conspiracy theories, and other tools and approaches to cause damage to the target state. These emerging tools and techniques represent a potentially significant threat to U.S. and allied national interests. This report represents an effort to better define and understand the challenge by focusing on the activities of the two leading authors of such techniques — Russia and China. The authors conduct a detailed assessment of available evidence of Russian and Chinese social manipulation efforts, the doctrines and strategies behind such efforts, and evidence of their potential effectiveness. RAND analysts reviewed English-, Russian-, and Chinese-language sources; examined national security strategies and policies and military doctrines; surveyed existing public-source evidence of Russian and Chinese activities; and assessed multiple categories of evidence of effectiveness of Russian activities in Europe, including public opinion data, evidence on the trends in support of political parties and movements sympathetic to Russia, and data from national defense policies. The authors find a growing commitment to tools of social manipulation by leading U.S. competitors. The findings in this report are sufficient to suggest that the U.S. government should take several immediate steps, including developing a more formal and concrete framework for understanding the issue and funding additional research to understand the scope of the challenge. [Note: contains copyrighted material].

[PDF format, 302 pages].

State of the News Media: Data and Trends about Key Sectors in the U.S. News Media Industry

State of the News Media: Data and Trends about Key Sectors in the U.S. News Media Industry. Pew Research Center. July 9, 2019.

Since 2004, Pew Research Center has issued an annual report on key audience and economic indicators for a variety of sectors within the U.S. news media industry. These data speak to the shifting ways in which Americans seek out news and information, how news organizations get their revenue, and the resources available to American journalists as they seek to inform the public about important events of the day. The press is sometimes called the fourth branch of government, but in the U.S., it’s also very much a business – one whose ability to serve the public is dependent on its ability to attract eyeballs and dollars.

Over the years, the Center’s approach to these indicators has evolved along with the industry, carefully considering the metrics, sectors and format in which the data appear. Instead of a single summary report, our approach is to roll out a series of fact sheets showcasing the most important current and historical data points for each sector – in an easy-to-digest format – a few at a time. [Note: contains copyrighted material].

[HTML format, various paging].