Generalized domain adaptation for sequence labeling in natural language processing.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Generalized domain adaptation for sequence labeling in natural language processing.
Author:
Xiao, Min.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2016.
Description:
100 p.
Notes:
Source: Dissertation Abstracts International, Volume: 77-10(E), Section: B.
Contained By:
Dissertation Abstracts International, 77-10B(E).
Subject:
Computer science.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10112383
ISBN:
9781339755373
LDR
:03620nmm a2200301 4500
001
2121901
005
20170830070054.5
008
180830s2016 ||||||||||||||||| ||eng d
020
$a
9781339755373
035
$a
(MiAaPQ)AAI10112383
035
$a
AAI10112383
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Xiao, Min.
$3
2016333
245
1 0
$a
Generalized domain adaptation for sequence labeling in natural language processing.
260
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2016
300
$a
100 p.
500
$a
Source: Dissertation Abstracts International, Volume: 77-10(E), Section: B.
500
$a
Adviser: Yuhong Guo.
502
$a
Thesis (Ph.D.)--Temple University, 2016.
520
$a
Sequence labeling tasks, such as part-of-speech tagging, syntactic chunking, and dependency parsing, have been widely studied in the natural language processing area. Most such systems are developed from a large amount of labeled training data via supervised learning. However, manually collecting labeled training data is time-consuming and expensive. As an alternative, to alleviate the issue of label scarcity, domain adaptation has recently been proposed to train a statistical machine learning model in a target domain, where there is not enough labeled training data, by exploiting existing freely available labeled training data in a different but related source domain. The natural language processing community has witnessed the success of domain adaptation in a variety of sequence labeling tasks.
520
$a
Although the labeled training data in the source domain are freely available, they can be very different from the test data in the target domain. Thus, naively applying supervised machine learning algorithms without considering domain differences may not fulfill the purpose. In this dissertation, we developed several novel representation learning approaches to address domain adaptation for sequence labeling in natural language processing. These representation learning techniques aim to induce latent generalizable features that bridge the domain divergence and enable cross-domain prediction.
520
$a
We first tackle a semi-supervised domain adaptation scenario, where the target domain has a small amount of labeled training data, and propose a distributed representation learning approach based on a probabilistic neural language model. We then relax the assumption that labeled training data are available in the target domain and study an unsupervised domain adaptation scenario, where the target domain has only unlabeled training data, proposing a task-informative representation learning approach based on dynamic dependency networks. Both works are developed in the setting where different domains contain sentences of different genres. We then extend and generalize domain adaptation to a more challenging scenario, where different domains contain sentences in different languages, and propose two cross-lingual representation learning approaches: one based on deep neural networks with auxiliary bilingual word pairs, and the other based on annotation projection with auxiliary parallel sentences. All four learning scenarios are extensively evaluated on different sequence labeling tasks. The empirical results demonstrate the effectiveness of these generalized domain adaptation techniques for sequence labeling in natural language processing.
590
$a
School code: 0225.
650
4
$a
Computer science.
$3
523869
690
$a
0984
710
2
$a
Temple University.
$b
Computer and Information Science.
$3
1065462
773
0
$t
Dissertation Abstracts International
$g
77-10B(E).
790
$a
0225
791
$a
Ph.D.
792
$a
2016
793
$a
English
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10112383
Items (1 record • Page 1)
Inventory Number: W9332517
Location Name: Electronic resources (電子資源)
Item Class: 01.外借(書)_YB
Material type: E-book (電子書)
Call number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0