
News Literacy Project Module 3: Viral Content and Hidden Ads

Module Three of the News Literacy Project is ‘Navigating Today’s Information Landscape,’ which covers viral content and hidden ads.

The third module of the News Literacy Project deals with how the online environment has changed the way we consume content.  From viral rumors to algorithms to creative and tricky ad campaigns, students work through examples and discussions that help them understand how to classify online content.

Virology: Dissecting Rumors

In this section, the program discusses viral content and digital rumors.  Students are told that this kind of content is designed to appeal to emotions and override reason.  The unit begins with four examples of viral content, posts claiming that:

  • A number of deaths in Italy were attributed to the flu vaccine
  • Muslims have actually been banned from the US under the 1952 Immigration and Nationality Act
  • ‘Pastafarians’ can wear colanders on their heads for driver’s license photos
  • Starbucks is offering free lifetime passes to anyone who shares the image

Students are asked to identify whether the rumors are fact or fiction (only the Pastafarians story is presented as true).  They are then asked to identify the emotions each evokes (fear, anger, curiosity, hope).  Next, students are told about four main reasons someone might create a viral rumor or share something unverified:

  • narrow self-interest at the expense of others
  • group interest, such as country, party, or religion
  • altruistic reasons, such as warning others
  • malicious reasons, to damage, upset, hurt, or provoke others

Students then apply those motives to the previous stories (the answers, in order, are altruism, malice, personal self-interest, and group interest).  Having done so, they are reminded that rumors are nearly impossible to kill, and thus still have power even after they’ve been debunked.  They are encouraged to stop the spread of viral rumors by:

  • fact-checking information before sharing
  • circulating corrections as widely as possible

Finally, this section offers three sites as places to check rumors, along with the advice to ‘do some research on your own’:

  • factcheck.org
  • snopes.com
  • politifact.com

Personalizing Information: The Role of Algorithms

The next section in the program deals with algorithms.  Here the program explains that digital technology is rapidly changing the way we create and take in information, and that algorithms were developed to help us find and sort the information most relevant to us.  It explains how platforms like Google, Facebook, YouTube, and Netflix shape our experiences using information about what we’ve clicked on in the past.  Our interfaces are tailored for us by algorithms that filter and determine what we want to see using some of the following signals (a rough sketch of how such filtering might work follows the list):

  • gender
  • age
  • location
  • online activity
  • social media activity
  • shopping habits
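
This kind of preference-driven filtering is easy to illustrate.  Below is a minimal sketch, in Python, of how a feed might be reordered using nothing more than a user’s past clicks.  It is a toy example of my own, not code from the program or from any actual platform; the topic tags and scoring rule are invented purely for illustration.

```python
# Toy illustration of preference-based filtering (not any platform's real algorithm):
# candidate articles are scored by how often their topics appear in the user's
# click history, so the feed drifts toward what the user already likes.
from collections import Counter

def rank_feed(candidates, click_history):
    """Order candidate articles by overlap with topics the user has clicked before."""
    interests = Counter(topic for item in click_history for topic in item["topics"])
    return sorted(candidates,
                  key=lambda art: sum(interests[t] for t in art["topics"]),
                  reverse=True)

history = [
    {"title": "Playoff recap", "topics": ["sports"]},
    {"title": "Trade deadline rumors", "topics": ["sports", "business"]},
]
candidates = [
    {"title": "Election analysis", "topics": ["politics"]},
    {"title": "Draft preview", "topics": ["sports"]},
    {"title": "Startup funding roundup", "topics": ["business"]},
]

for article in rank_feed(candidates, history):
    print(article["title"])
# Sports and business stories float to the top while the politics story sinks:
# the 'filter bubble' effect in miniature.
```

Real recommendation systems are vastly more sophisticated, but the basic feedback loop is the same: past behavior shapes what is shown next, which in turn shapes future behavior.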

While algorithms are useful tools, the program also discusses the pitfalls associated with them, namely that they can make it more difficult to encounter new things and new ideas.  It introduces students to the term ‘filter bubble’: ‘a self-contained online world where what you encounter is shaped, customized, and personalized for you based on what algorithms think you want to find’.

Likening the consumption of information to a diet, the program discusses the need for a balanced information diet: consuming not only what you want, but also what you need in order to be an informed citizen.  It states that it is healthy to see and hear others’ points of view to expand one’s perspective.  The exercises in this section were minimal, just an image with a slider that lets students flip between search results for various items with and without personalization enabled.  This was intended to show students that while filtering information for them can save time, it can also leave out important information they might need to know.

Branded Content: Master of Disguise

In this section, the program explores recent trends in advertising that make it much harder to identify which kinds of content are advertisements.  Students are shown eight examples of image, print, and video content and are asked which are advertising something (they all are, though some are difficult to classify).

The focus of this segment is transparency and credibility: being able to easily identify an ad as an ad, and being confident that news organizations are not allowing their advertising to be masked as news.  Students are asked to consider the following when they evaluate a piece of information:

  • who created the piece of information they encounter
  • what the purpose of that information is

The section briefly addresses the idea of a firewall between a newspaper’s ad sales department and its journalism department.  News organizations are supposed to protect their journalistic integrity, even when their reporting may upset advertisers, and the program discusses ways those organizations try to do that, such as placing the two departments in separate rooms or on separate floors of their offices.

Students are introduced to several terms that describe news-like ads:

  • branded content
  • native ads
  • sponsored content
  • content marketing
  • brand storytelling
  • advertorial
  • branded news
  • brand experiences

Students are told that the goal of these camouflaged ads is to blend in with traditional news content, gain the trust of consumers, and make the advertising harder to ignore by using storytelling techniques typically employed by reporters.

Student activities are as follows:

  • answering a poll question about how they feel about seeing ads
  • ranking a set of ads from most to least transparent, and drawing a line where they think the ads cross into an unacceptable lack of transparency
  • ranking a set of ads and hidden ads from most to least ethical, and drawing a line where they think the ads cross into unacceptable ethical territory

The examples included:

  • an ad for the show Southland printed on the front page of the LA Times that looked like a news story
  • an infographic/multimedia page on the NY Times about incarcerated women (an ad for the Netflix show Orange is the New Black)
  • a Scientology article on the Atlantic’s site that was produced by the church
  • a BuzzFeed listicle that was really a Samsung ad from Best Buy
  • a ‘year in pictures’ video of a woman who appears to have been abused
  • an OFA ad about registering to vote in 2012
  • a fake video of a boy appearing to rescue his sister after being shot, part of a campaign to raise awareness of the conflict in Syria
  • an anti-bullying cartoon ad

This isn’t too difficult a unit to evaluate.  The topics and skills are very important in developing a better awareness of online information.  Learning to give viral information a more critical look before sharing it is a habit everyone could benefit from acquiring.  Understanding how algorithms shape the content we see, and don’t see, can alert us to gaps in our feeds that can make us less informed about events that matter to us.  And developing the ability to spot advertising, even when it’s in disguise, can help us better understand the purpose and motivation behind much of the information we allow into our online worlds.

One concern here is the program’s expressed confidence in the fact-checking websites it recommends.  Each of these sites has offered faulty or incomplete information in the past, sometimes out of an apparent bias.  I would have felt much better about their being listed if the unit had also mentioned that trusting any one source as infallible is probably unwise.  The unit would have served students better by encouraging skepticism of all sources, even fact-checking sites, and even mainstream media sources.

Another concern centers on the filter bubble explanation and the ‘firewall’ in newsrooms.  Just last week the New York Times welcomed its newest conservative columnist, Bret Stephens.  This was met with howls of protest from several quarters, including climate scientists and the Times’ own Cairo bureau chief.  You can talk about ‘firewalls’ and ‘bubbles’ all you like, but if journalists and media personnel push the notion that there is ‘settled science’ or ‘correct thinking’ about the very complex problem of climate and the environment, that in itself constitutes a bubble, one they are apparently irate that Stephens is trying to pop.

And sometimes the bubble the traditional media live in is evident not in what they say, but in what they do not say.  A recently foiled terror attack in Gaza didn’t make the paper at all, but:

Here are some other stories for which the New York Times did somehow manage to find room in its print edition this week:

  • A full page (and more) about “a small subculture of surfers” who ride the waves at night.

  • A nearly 2000-word profile of a Los Angeles dermatologist who treats a lot of people in the movie and television business.

  • An article claiming, inaccurately, that “elevated drug paraphernalia and New Age-inflected styles have emerged as unlikely must-have items of the season.”

It’s unclear what these particular stories contain that would bump a story about cancer patients trying to sneak explosives into Israel.  But it’s a good reminder that bias and bubbling show up not only in what the media report, but also in what they deliberately choose not to report.

Thus, there’s an element of ‘do as I say, not as I do’ in this program: it champions ideals the media should aspire to, but the media fall short often enough to undercut the message.  This program would better serve the students, and the nation, by including and discussing some examples of news organizations getting it wrong or failing to live up to their own standards.  Major news organizations aren’t above falling for pranks, hoaxes, and other fake news.  An honest admission of this, coupled with a pledge to continue improving and encouragement for students to help hold them accountable, would have made me feel much better.  The last thing I want to see from a program like this is training students to trust only mainstream media and approved sources.  Hopefully the skepticism it is engendering in students will apply to those sources as well.