Teaching kids to spot misinformation on social media

The material comes from the News Literacy Project’s newsletter, the Sift, which takes the most recent viral rumors, conspiracy theories, hoaxes and journalistic ethics issues and turns them into timely lessons with discussion prompts and links. The Sift, which is published weekly during the school year, has more than 10,000 subscribers, most of them educators.

The News Literacy Project also offers a program called Checkology, a browser-based platform designed for students in grades six through 12 that helps prepare the next generation to easily identify misinformation. During the coronavirus pandemic, the project is offering access to Checkology Premium at no cost to educators and parents in the United States. More than 1,100 educators and parents in 49 states and the District of Columbia have registered to use the platform with as many as 90,000 students.

You can learn more about the News Literacy Project and all of the educational resources it provides in this piece, but here is a rundown:

Founded more than a decade ago by Alan Miller, a Pulitzer Prize-winning former reporter at the Los Angeles Times, the News Literacy Project is the leading provider of news literacy education.

It creates digital curriculums and other resources and works with educators and journalists to teach middle and high school students how to recognize which news and information to trust — and it provides them with the tools they need to be informed and engaged participants in a democracy. It uses the standards of high-quality journalism as an aspirational yardstick against which to measure all news and information. Just as important, it provides the next generation with an appreciation of the First Amendment and the role of a free press.

Here’s material from the Oct. 19 Sift:

As the U.S. presidential election draws near, social media companies are taking action against falsehoods and questionable content posted on their platforms, sparking fresh controversy over the timing and scope of such efforts.

YouTube announced on Oct. 15 that it is banning QAnon and other “harmful conspiracy theories” that target individuals. The decision follows similar recent efforts by Facebook, Twitter and other platforms to curb content related to QAnon, a sprawling system of conspiratorial beliefs. Other social media decisions restricting content also made headlines in rapid succession.

Those decisions came less than a week after Facebook said it would temporarily stop running political ads once polls close on Nov. 3. Twitter and Facebook also each took steps to slow the spread of a widely disputed Oct. 14 report by the New York Post, which included unverified claims based on purportedly hacked materials involving Democratic presidential nominee Joe Biden’s son Hunter. Twitter prevented users from sharing certain links and related images. (Amid pushback, the company soon reversed course and said it was changing its hacked-material policy.) Meanwhile, Facebook opted to reduce the reach of the New York Post piece while the company’s third-party fact-checkers reviewed it.

Note: There’s growing concern that online falsehoods could foment real-world violence around the election.

Also note: Despite a steady stream of content moderation efforts by Facebook in recent years, engagement with misinformation on the platform is higher today than before the 2016 election, according to a new study.

Discuss: Do you agree with the steps that social media companies have taken recently to prevent the spread of misinformation? Is banning Holocaust denial, QAnon content, anti-vaccination ads or post-election political ads censorship or responsible moderation? If you were the CEO of YouTube, Facebook, Twitter or another major platform, how would you respond to misinformation on your platform?

NO: This aerial photo does not show crowds at a rally for President Trump in Ocala, Fla.

YES: The photo shows a crowd of more than 1 million people at the 2018 Street Parade music festival in Zurich.

Note: This isn’t the first time this photo has gone viral in a false context. In August, it circulated online along with the false claim that it showed crowds protesting covid-19 restrictions in Berlin. There was such a protest in Berlin on Aug. 1, but the crowd was not nearly as large.

Tip: Be wary of aerial photos showing crowds from user-generated sources of information (especially anonymous people or strangers online). Viral rumors often present images of large crowds in false contexts to try to exaggerate support for a given position.

★ Featured rumor resource: Use these classroom-ready slides to turn this viral rumor into an engaging, fact-checking challenge with your students.

YES: Several iterations of this copy-and-paste rumor recently went viral on Facebook, gaining traction with voters outside South Carolina, where the claim appears to have originated.

Note: These kinds of rumors, in which blocks of text are copied and pasted — sometimes with slight alterations — are called “copypasta” in Internet parlance.

Also note: Aside from lacking evidence for this claim, the version of this rumor shown above contains two additional red flags: It attributes the shared text to “a very reliable good friend” who is unnamed, and encourages people to “PLEASE PLEASE PASS THIS ON!”