Scientific American: “Although children are prime targets, educators cannot figure out how best to teach them to separate fact from fiction.

When Amanda Gardner, an educator with two decades of experience, helped to start a new charter elementary and middle school outside of Seattle last year, she did not anticipate teaching students who denied that the Holocaust happened, argued that COVID is a hoax and told their teacher that the 2020 presidential election was rigged. Yet some children insisted that these conspiracy fantasies were true. Both misinformation, which includes honest mistakes, and disinformation, which involves an intention to mislead, have had “a growing impact on students over the past 10 to 20 years,” Gardner says, yet many schools do not focus on the issue. “Most high schools probably do some teaching to prevent plagiarism, but I think that’s about it.”

Children, it turns out, are ripe targets for fake news. Age 14 is when kids often start believing in unproven conspiratorial ideas, according to a study published in September 2021 in the British Journal of Developmental Psychology. Many teens also have trouble assessing the credibility of online information. In a 2016 study involving nearly 8,000 U.S. students, Stanford University researchers found that more than 80 percent of middle schoolers believed that an advertisement labeled as sponsored content was actually a news story. The researchers also found that less than 20 percent of high schoolers seriously questioned spurious claims in social media, such as a Facebook post that said images of strange-looking flowers, supposedly near the site of a nuclear power plant accident in Japan, proved that dangerous radiation levels persisted in the area. When college students in the survey looked at a Twitter post touting a poll favoring gun control, more than two thirds failed to note that the liberal antigun groups behind the poll could have influenced the data.
Disinformation campaigns often go directly after young users, steering them toward misleading content. A 2018 Wall Street Journal investigation found that YouTube’s recommendation algorithm, which offers personalized suggestions about what users should watch next, is skewed to recommend videos that are more extreme and far-fetched than what the viewer started with. For instance, when researchers searched for videos using the phrase “lunar eclipse,” they were steered to a video suggesting that Earth is flat. YouTube is one of the most popular social media sites among teens. After Zeynep Tufekci, an associate professor at the University of North Carolina, Chapel Hill, School of Information and Library Science, spent time searching for videos on YouTube and observed what the algorithm told her to watch next, she suggested that it was “one of the most powerful radicalizing instruments of the 21st century.”…