Learning from Shared News: When Abundant Information Leads to Belief Polarization / Renee Bowen, Danil Dmitriev, Simone Galperti.

Material type: Text
Series: Working Paper Series (National Bureau of Economic Research) ; no. w28465
Publication details: Cambridge, Mass. : National Bureau of Economic Research, 2021.
Description: 1 online resource : illustrations (black and white)
Available additional physical forms:
  • Hardcopy version available to institutional subscribers
Abstract: We study learning via shared news. Each period agents receive the same quantity and quality of first-hand information and can share it with friends. Some friends (possibly few) share selectively, generating heterogeneous news diets across agents akin to echo chambers. Agents are aware of selective sharing and update beliefs by Bayes' rule. Contrary to standard learning results, we show that beliefs can diverge in this environment leading to polarization. This requires that (i) agents hold misperceptions (even minor) about friends' sharing and (ii) information quality is sufficiently low. Polarization can worsen when agents' social connections expand. When the quantity of first-hand information becomes large, agents can hold opposite extreme beliefs resulting in severe polarization. Our results hold without media bias or fake news, so eliminating these is not sufficient to reduce polarization. When fake news is included, we show that it can lead to polarization but only through misperceived selective sharing. News aggregators can curb polarization caused by shared news.
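
As a rough illustration of the mechanism described in the abstract (not the paper's actual model), the following Python sketch simulates two Bayesian agents under assumed parameters: a binary state, binary signals of quality Q = 0.55, one friend per agent who passes on signals with value-dependent probabilities, and agents who misperceive those sharing rules as evenhanded. Although both agents update by Bayes' rule every period, their posteriors drift to opposite extremes when signal quality is low.

# A minimal, illustrative sketch (not the paper's model): two agents learn about
# a binary state from their own signals plus signals relayed by one friend each.
# Each friend shares selectively, and each agent misperceives the friend's
# sharing rule as evenhanded. Both agents update by Bayes' rule, yet their
# posteriors drift to opposite extremes when signal quality Q is low.
# All parameter values are assumptions chosen for illustration.

import math
import random

random.seed(0)

TRUE_STATE = 1      # the unknown state theta, here fixed to 1
Q = 0.55            # signal quality: P(signal == theta) = Q (low quality)
T = 2000            # number of periods (abundant information)

# True sharing rules: probability the friend passes on a signal, by signal value.
# Agent A's friend leans toward sharing 1-signals, agent B's friend the opposite.
SHARE_TRUE = {"A": {1: 0.9, 0: 0.1}, "B": {1: 0.1, 0: 0.9}}
# Misperception: both agents believe their friend shares evenhandedly.
SHARE_PERCEIVED = {"A": {1: 0.5, 0: 0.5}, "B": {1: 0.5, 0: 0.5}}


def perceived_log_lr(message, share):
    """log P(message | theta=1) / P(message | theta=0) under the agent's
    perceived sharing probabilities `share`; message None means silence."""
    if message is None:
        p1 = Q * (1 - share[1]) + (1 - Q) * (1 - share[0])
        p0 = (1 - Q) * (1 - share[1]) + Q * (1 - share[0])
    else:
        p1 = (Q if message == 1 else 1 - Q) * share[message]
        p0 = ((1 - Q) if message == 1 else Q) * share[message]
    return math.log(p1 / p0)


def draw_signal():
    """A first-hand signal that equals the true state with probability Q."""
    return TRUE_STATE if random.random() < Q else 1 - TRUE_STATE


log_odds = {"A": 0.0, "B": 0.0}  # both agents start from a 50/50 prior

for _ in range(T):
    for agent in ("A", "B"):
        # Correct Bayesian update on the agent's own first-hand signal.
        own = draw_signal()
        log_odds[agent] += math.log(Q / (1 - Q)) * (1 if own == 1 else -1)
        # The friend draws a signal and shares it according to the TRUE rule...
        sig = draw_signal()
        shared = sig if random.random() < SHARE_TRUE[agent][sig] else None
        # ...but the agent updates under the MISPERCEIVED sharing rule.
        log_odds[agent] += perceived_log_lr(shared, SHARE_PERCEIVED[agent])

for agent in ("A", "B"):
    belief = 1 / (1 + math.exp(-log_odds[agent]))
    print(f"Agent {agent}: posterior P(theta = 1) ≈ {belief:.3f}")
# Typically agent A's belief ends near 1 and agent B's near 0: polarization
# from misperceived selective sharing alone, with no media bias or fake news.
# Raising Q (higher-quality signals) makes both beliefs converge to the truth.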

February 2021.

System requirements: Adobe [Acrobat] Reader required for PDF files.

Mode of access: World Wide Web.

Print version record
