Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers.
Author:
Jiang, Shan.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2021.
Description:
146 p.
Notes:
Source: Dissertations Abstracts International, Volume: 83-01, Section: B.
Contained By:
Dissertations Abstracts International, 83-01B.
Subject:
Computer science.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28547107
ISBN:
9798516927362
Jiang, Shan. Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers. - Ann Arbor : ProQuest Dissertations & Theses, 2021. - 146 p.
Source: Dissertations Abstracts International, Volume: 83-01, Section: B.
Thesis (Ph.D.)--Northeastern University, 2021.
This item must not be sold to any third party vendors.
Misinformation, broadly defined as any false or inaccurate information, has been proliferating on social media, raising increasing societal concerns about its potential consequences, e.g., polarizing the public and eroding trust in institutions. Existing surveys and experiments across disciplines have investigated the misinformation problem from multiple perspectives, ranging from the socio-psychological foundations of audiences' susceptibility to algorithmic solutions aiding platforms' intervention against the spread of misinformation. Yet, large-scale empirical study is still needed to comprehensively understand how different players behave and interact in the misinformation ecosystem. To this end, the goal of this thesis is to study the misinformation ecosystem by measuring the behaviors of three key players: audiences, platforms, and storytellers.

Audiences receive and respond to misinformation, and their behaviors are therefore potentially influenced by such falsehoods or inaccuracies. The first part of the thesis investigates if and how audiences respond differently to misinformation. This part starts with an unsupervised exploration of user comments on misinformation posts on social media, where I observe significantly distinctive linguistic patterns when audiences comment on fabricated stories versus truthful ones, e.g., increased signals suggesting their awareness of misinformation and extensive usage of angry emojis and swear words. In light of this exploration, I then refocus on measuring to what extent audiences disbelieve or believe these stories. Applying supervised classifiers trained to identify (dis)belief, I estimate that 12%/15% of audiences express disbelief, and 26%/20% of them express belief, in true/false information.

Platforms play an essential role in how misinformation reaches its audiences. The second part of the thesis examines a specific platform operation - content moderation, the AI-human hybrid process of removing toxic content to maintain community standards. Using YouTube as a lens, this part investigates how the misinformation and partisanship of videos interact with moderation practices on their corresponding comments. I observe that videos containing verifiably false content have heavier moderation of their comments, especially when the comments are posted after a fact-check article. Additionally, I find no evidence to support allegations of political bias in content moderation on YouTube when justifiable factors (e.g., hate speech) are controlled.

Storytellers generate misinformation and then release it onto platforms. The third part of the thesis structures storytellers' behaviors and explores prevalent types of misinformation to date by rationalizing fact-check articles. My intuition is that the key phrases in a fact-check article that identify the misinformation type(s) (e.g., doctored images, urban legends) also act as rationales that determine the verdict of the fact-check (e.g., false). I experiment with rationalized models, using domain knowledge as weak supervision, to extract these phrases as rationales, and then cluster semantically similar rationales to summarize prevalent misinformation types. Using archived fact-check articles from Snopes.com, I identify ten types of misinformation stories. I discuss how these types have evolved over the last ten years and compare their prevalence between the 2016/2020 US presidential elections and the H1N1/COVID-19 pandemics.

Altogether, my work presents an overview of the misinformation ecosystem to date, as well as methodologies and tools for measuring it. The empirical findings in the thesis are derived from computational approaches based on observational data and are reproducible from repositories that I have publicly released. Ultimately, I hope that my research helps the public to understand misinformation and regain trust in authentic content online.
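The (dis)belief measurement described in the abstract - training a supervised classifier on annotated comments and aggregating its predictions to estimate the share of an audience expressing disbelief - can be sketched roughly as follows. The training data, features, and classifier below are illustrative placeholders only, not the thesis's actual datasets or models:

```python
# Illustrative sketch: supervised (dis)belief classification over comments.
# Toy data; the thesis corpus and model are not reproduced here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy annotations: 1 = comment expresses disbelief, 0 = it does not.
train_comments = [
    "this is fake news, totally made up",
    "I can't believe people fall for this hoax",
    "wow, thanks for sharing this",
    "great article, very informative",
    "lies, this never happened",
    "so true, exactly what I thought",
]
train_labels = [1, 1, 0, 0, 1, 0]

# Bag-of-ngrams features + a linear classifier, trained end to end.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_comments, train_labels)

# Apply the trained classifier to unlabeled comments on a story,
# then aggregate predictions into an estimated disbelief share.
new_comments = [
    "another hoax from the usual suspects",
    "interesting read",
    "this is completely made up",
]
preds = clf.predict(new_comments)
disbelief_share = preds.mean()  # fraction of comments flagged as disbelief
print(f"estimated disbelief share: {disbelief_share:.0%}")
```

Aggregating per-comment predictions into population-level shares is the general pattern behind the 12%/15% and 26%/20% estimates quoted above; the actual study would additionally require calibrated models and large labeled datasets.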
LDR    05207nmm a2200397 4500
001    2285090
005    20211124104348.5
008    220723s2021 ||||||||||||||||| ||eng d
020    $a 9798516927362
035    $a (MiAaPQ)AAI28547107
035    $a AAI28547107
040    $a MiAaPQ $c MiAaPQ
100 1  $a Jiang, Shan. $3 3172376
245 10 $a Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2021
300    $a 146 p.
500    $a Source: Dissertations Abstracts International, Volume: 83-01, Section: B.
500    $a Advisor: Wilson, Christo.
502    $a Thesis (Ph.D.)--Northeastern University, 2021.
506    $a This item must not be sold to any third party vendors.
520    $a Misinformation, broadly defined as any false or inaccurate information, has been proliferating on social media, raising increasing societal concerns about its potential consequences, e.g., polarizing the public and eroding trust in institutions. Existing surveys and experiments across disciplines have investigated the misinformation problem from multiple perspectives, ranging from the socio-psychological foundations of audiences' susceptibility to algorithmic solutions aiding platforms' intervention against the spread of misinformation. Yet, large-scale empirical study is still needed to comprehensively understand how different players behave and interact in the misinformation ecosystem. To this end, the goal of this thesis is to study the misinformation ecosystem by measuring the behaviors of three key players: audiences, platforms, and storytellers. Audiences receive and respond to misinformation, and their behaviors are therefore potentially influenced by such falsehoods or inaccuracies. The first part of the thesis investigates if and how audiences respond differently to misinformation. This part starts with an unsupervised exploration of user comments on misinformation posts on social media, where I observe significantly distinctive linguistic patterns when audiences comment on fabricated stories versus truthful ones, e.g., increased signals suggesting their awareness of misinformation and extensive usage of angry emojis and swear words. In light of this exploration, I then refocus on measuring to what extent audiences disbelieve or believe these stories. Applying supervised classifiers trained to identify (dis)belief, I estimate that 12%/15% of audiences express disbelief, and 26%/20% of them express belief, in true/false information. Platforms play an essential role in how misinformation reaches its audiences. The second part of the thesis examines a specific platform operation - content moderation, the AI-human hybrid process of removing toxic content to maintain community standards. Using YouTube as a lens, this part investigates how the misinformation and partisanship of videos interact with moderation practices on their corresponding comments. I observe that videos containing verifiably false content have heavier moderation of their comments, especially when the comments are posted after a fact-check article. Additionally, I find no evidence to support allegations of political bias in content moderation on YouTube when justifiable factors (e.g., hate speech) are controlled. Storytellers generate misinformation and then release it onto platforms. The third part of the thesis structures storytellers' behaviors and explores prevalent types of misinformation to date by rationalizing fact-check articles. My intuition is that the key phrases in a fact-check article that identify the misinformation type(s) (e.g., doctored images, urban legends) also act as rationales that determine the verdict of the fact-check (e.g., false). I experiment with rationalized models, using domain knowledge as weak supervision, to extract these phrases as rationales, and then cluster semantically similar rationales to summarize prevalent misinformation types. Using archived fact-check articles from Snopes.com, I identify ten types of misinformation stories. I discuss how these types have evolved over the last ten years and compare their prevalence between the 2016/2020 US presidential elections and the H1N1/COVID-19 pandemics. Altogether, my work presents an overview of the misinformation ecosystem to date, as well as methodologies and tools for measuring it. The empirical findings in the thesis are derived from computational approaches based on observational data and are reproducible from repositories that I have publicly released. Ultimately, I hope that my research helps the public to understand misinformation and regain trust in authentic content online.
590    $a School code: 0160.
650  4 $a Computer science. $3 523869
650  4 $a Communication. $3 524709
650  4 $a Sociolinguistics. $3 524467
653    $a Computational journalism
653    $a Computational social science
653    $a Fact-checking
653    $a Misinformation
653    $a Natural language processing
653    $a Social informatics
653    $a COVID-19 pandemic
690    $a 0984
690    $a 0459
690    $a 0636
710 2  $a Northeastern University. $b Computer Science. $3 1678818
773 0  $t Dissertations Abstracts International $g 83-01B.
790    $a 0160
791    $a Ph.D.
792    $a 2021
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28547107
Holdings:
Inventory Number: W9436823
Location: Electronic resource (電子資源)
Item Class: 11. Online reading (線上閱覽_V)
Material Type: E-book (電子書)
Call Number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0