Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers.
Record Type: Electronic resource : Monograph/item
Title/Author: Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers. / Jiang, Shan.
Author: Jiang, Shan.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2021
Pagination: 146 p.
Note: Source: Dissertations Abstracts International, Volume: 83-01, Section: B.
Contained By: Dissertations Abstracts International, 83-01B.
Subject: Computer science.
Electronic Resource: https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28547107
ISBN: 9798516927362
Thesis (Ph.D.)--Northeastern University, 2021.
This item must not be sold to any third party vendors.
Misinformation, broadly defined as any false or inaccurate information, has been proliferating on social media, raising increasing societal concerns about its potential consequences, e.g., polarizing the public and eroding trust in institutions. Existing surveys and experiments across disciplines have investigated the misinformation problem from multiple perspectives, ranging from the socio-psychological foundations of audiences' susceptibility to algorithmic solutions aiding platforms' interventions against the spread of misinformation. Yet, a large-scale empirical study is still needed to comprehensively understand how the different players behave and interact in the misinformation ecosystem. To this end, the goal of this thesis is to study the misinformation ecosystem by measuring the behaviors of three key players: audiences, platforms, and storytellers.

Audiences receive and respond to misinformation, and their behaviors are therefore potentially influenced by such falsehoods or inaccuracies. The first part of the thesis investigates if and how audiences respond differently under misinformation. It starts with an unsupervised exploration of user comments on misinformation posts on social media, where I observe significantly distinctive linguistic patterns when audiences comment on fabricated stories versus truthful ones, e.g., increased signals suggesting their awareness of misinformation and extensive usage of angry emojis and swear words. In light of this exploration, I then measure to what extent audiences disbelieve or believe these stories. Applying supervised classifiers trained to identify (dis)belief, I estimate that 12%/15% of audiences express disbelief, and 26%/20% express belief, in true/false information.

Platforms play an essential role in how misinformation reaches its audiences.
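The audience-side (dis)belief estimation described above can be sketched as a standard text-classification pipeline: train a classifier on comments labeled by the stance they express, then apply it to unlabeled comments and report the fraction in each class. The toy comments and the TF-IDF plus logistic-regression choice here are illustrative assumptions, not the thesis's actual models or data.

```python
# Sketch: estimate the share of comments expressing (dis)belief.
# Toy data and model choice are assumptions for illustration only.
from collections import Counter

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-labeled training set (hypothetical).
train_comments = [
    "this is fake, total hoax",
    "I don't buy this at all",
    "wow, I knew it was true",
    "finally the truth comes out",
    "interesting video",
    "nice weather today",
]
train_labels = ["disbelief", "disbelief", "belief", "belief",
                "neither", "neither"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(train_comments, train_labels)

# Apply to unlabeled comments and estimate class proportions.
unlabeled = ["sounds like a hoax to me", "so true, I believe this",
             "what a day", "fake news again"]
preds = clf.predict(unlabeled)
proportions = {k: v / len(preds) for k, v in Counter(preds).items()}
print(proportions)
```

At scale, the same proportions computed separately over comments on true versus false stories would yield estimates analogous to the 12%/15% and 26%/20% figures reported above.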
The second part of the thesis examines a specific platform practice: content moderation, the AI-human hybrid process of removing toxic content to maintain community standards. Using YouTube as a lens, this part investigates how the misinformation and partisanship of videos interact with moderation practices on their corresponding comments. I observe that videos containing verifiably false content have heavier moderation of their comments, especially for comments posted after a fact-check article. Additionally, I find no evidence to support allegations of political bias in content moderation on YouTube once justifiable factors (e.g., hate speech) are controlled for.

Storytellers generate misinformation and then release it onto platforms. The third part of the thesis structures storytellers' behaviors and explores prevalent types of misinformation to date by rationalizing fact-check articles. My intuition is that the key phrases in a fact-check article that identify the misinformation type(s) (e.g., doctored images, urban legends) also act as rationales that determine the verdict of the fact-check (e.g., false). I experiment with rationalized models using domain knowledge as weak supervision to extract these phrases as rationales, and then cluster semantically similar rationales to summarize prevalent misinformation types. Using archived fact-check articles from Snopes.com, I identify ten types of misinformation stories. I discuss how these types have evolved over the last ten years and compare their prevalence between the 2016/2020 US presidential elections and the H1N1/COVID-19 pandemics.

Altogether, my work presents an overview of the misinformation ecosystem to date, as well as methodologies and tools for measuring it. The empirical findings in the thesis are derived from computational approaches based on observational data, and are reproducible from repositories that I have publicly released. Ultimately, I hope that my research helps the public to understand misinformation and regain trust in authentic content online.
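The final clustering step described above, grouping semantically similar rationale phrases to surface recurring misinformation types, can be sketched roughly as follows. The phrases and the TF-IDF/KMeans combination are assumptions for illustration; the thesis uses its own rationalized models and fact-check data.

```python
# Sketch: cluster rationale phrases (assumed already extracted from
# fact-check articles) into groups that suggest misinformation types.
# Phrases and clustering setup are illustrative assumptions.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

rationales = [
    "doctored image of the candidate",
    "photoshopped image shared widely",
    "manipulated image went viral",
    "classic urban legend retold",
    "old urban legend about alligators",
    "decades-old urban legend resurfaces",
]

# Embed phrases as TF-IDF vectors and cluster them.
X = TfidfVectorizer().fit_transform(rationales)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Group phrases by assigned cluster for inspection.
clusters = {}
for phrase, label in zip(rationales, km.labels_):
    clusters.setdefault(int(label), []).append(phrase)
for label, members in sorted(clusters.items()):
    print(label, members)
```

In practice, each cluster would then be inspected and named (e.g., "doctored images", "urban legends"), which is how a small set of prevalent types can be summarized from thousands of rationales.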
LDR    05207nmm a2200397 4500
001    2285090
005    20211124104348.5
008    220723s2021 ||||||||||||||||| ||eng d
020    $a 9798516927362
035    $a (MiAaPQ)AAI28547107
035    $a AAI28547107
040    $a MiAaPQ $c MiAaPQ
100 1  $a Jiang, Shan. $3 3172376
245 10 $a Measuring the Misinformation Ecosystem: Audiences, Platforms, and Storytellers.
260  1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2021
300    $a 146 p.
500    $a Source: Dissertations Abstracts International, Volume: 83-01, Section: B.
500    $a Advisor: Wilson, Christo.
502    $a Thesis (Ph.D.)--Northeastern University, 2021.
506    $a This item must not be sold to any third party vendors.
590    $a School code: 0160.
650  4 $a Computer science. $3 523869
650  4 $a Communication. $3 524709
650  4 $a Sociolinguistics. $3 524467
653    $a Computational journalism
653    $a Computational social science
653    $a Fact-checking
653    $a Misinformation
653    $a Natural language processing
653    $a Social informatics
653    $a COVID-19 pandemic
690    $a 0984
690    $a 0459
690    $a 0636
710 2  $a Northeastern University. $b Computer Science. $3 1678818
773 0  $t Dissertations Abstracts International $g 83-01B.
790    $a 0160
791    $a Ph.D.
792    $a 2021
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28547107
Holdings (1 item):
Barcode: W9436823
Location: Electronic Resources
Circulation Category: 11. Online Reading
Material Type: E-book
Call Number: EB
Use Type: Normal
Loan Status: On shelf
Holds: 0