Driver Behavior and Environment Interaction Modeling for Intelligent Vehicle Advancements.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Driver Behavior and Environment Interaction Modeling for Intelligent Vehicle Advancements./
Author:
Zheng, Yang.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2018.
Description:
160 p.
Notes:
Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B.
Contained By:
Dissertation Abstracts International, 79-12B(E).
Subject:
Automotive engineering.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10970420
ISBN:
9780438293441
Zheng, Yang.
Driver Behavior and Environment Interaction Modeling for Intelligent Vehicle Advancements.
- Ann Arbor : ProQuest Dissertations & Theses, 2018 - 160 p.
Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B.
Thesis (Ph.D.)--The University of Texas at Dallas, 2018.
With continued progress in artificial intelligence, vehicle technologies have advanced significantly from human-controlled driving towards fully automated driving. During this transition, the intelligent vehicle should be able to understand the driver's perception of the environment and control of the vehicle, as well as provide human-like interaction with the driver. Understanding the complicated driving task, which involves interaction among the driver, the vehicle, and the environment, requires naturalistic driving studies and autonomous driving perception experiments that capture in-vehicle and out-of-vehicle signals, process their dynamics, and migrate the driver's decision-making into the vehicle. This dissertation focuses on intelligent vehicle advancements in driver behavior analysis, environment perception, and advanced human-machine interfaces. First, using the UTDrive naturalistic driving corpus, the driver's lane-change events are detected from vehicle dynamic signals, achieving over 80% accuracy from CAN-bus signals alone; human factors affecting lane-change detection are also analyzed. Second, a high-definition road map corpus is leveraged to retrieve driving environment attributes and to provide road prior knowledge for drivable space segmentation in images. Combining environment attributes with vehicle dynamic signals improves lane-change recognition accuracy from 82.22%-88.46% to 92.50%-96.67%. The road prior mask generated from the map data is shown to be an additional source that can be fused with vision and laser sensors for autonomous driving road perception, and it also supports automatic annotation and compensation of virtual street views. Next, the vehicle dynamics sensing functionality is migrated to a mobile platform, Mobile-UTDrive, which allows a smartphone to be freely positioned in the vehicle; the smartphone-collected signals are then used for unsupervised driving performance assessment, giving the driver an objective rating score. Finally, a voice-based interface between the driver and the vehicle is simulated, and natural language processing tasks are investigated in the design of a navigation dialogue system, reaching 98.83% accuracy for intent detection (classifying whether a sentence is navigation-related) and 99.60% for semantic parsing (extracting useful context information). Taken collectively, these advancements contribute to improved driver-vehicle interaction modeling and improved safety, and thereby ease the transition from human-controlled to fully automated smart vehicles.
ISBN: 9780438293441
Subjects--Topical Terms: Automotive engineering.
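The abstract above frames lane-change detection as classification over short windows of CAN-bus vehicle dynamic signals. The dissertation's own features and models are not reproduced in this record, so the following is only a minimal illustrative sketch of that general pipeline under stated assumptions: the signal set (steering angle, yaw rate, speed), the window statistics, the synthetic data, the helper names synth_window and window_features, and the random-forest classifier are all hypothetical choices, not the author's method.

# Illustrative sketch (not from the dissertation): sliding-window lane-change
# detection from CAN-bus-style vehicle dynamic signals. Signal names, window
# length, and the classifier choice are assumptions; the data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def synth_window(lane_change, length=50):
    """Synthetic window of [steering_angle, yaw_rate, speed] samples (stand-in for real CAN data)."""
    t = np.linspace(0.0, 1.0, length)
    if lane_change:
        # A lane change shows up as a smooth steering excursion and return.
        steering = 0.3 * np.sin(2 * np.pi * t) + 0.02 * rng.standard_normal(length)
    else:
        steering = 0.02 * rng.standard_normal(length)
    yaw_rate = 10.0 * np.gradient(steering) + 0.01 * rng.standard_normal(length)
    speed = 25.0 + 0.5 * rng.standard_normal(length)
    return np.stack([steering, yaw_rate, speed], axis=1)

def window_features(window):
    """Summarize each signal in the window with simple per-channel statistics."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

# Build a labeled set of lane-change vs. lane-keep windows and train a classifier.
labels = rng.integers(0, 2, size=400)
features = np.array([window_features(synth_window(bool(y))) for y in labels])
X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("window-level lane-change detection accuracy:", accuracy_score(y_test, clf.predict(X_test)))

Per the abstract, the actual system reaches over 80% accuracy from CAN signals alone and 92.50%-96.67% once map-derived environment attributes are added as additional features.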
LDR    03705nmm a2200301 4500
001    2203253
005    20190528072654.5
008    201008s2018 ||||||||||||||||| ||eng d
020    $a 9780438293441
035    $a (MiAaPQ)AAI10970420
035    $a (MiAaPQ)0382vireo:445Zheng
035    $a AAI10970420
040    $a MiAaPQ $c MiAaPQ
100 1  $a Zheng, Yang. $3 1287177
245 10 $a Driver Behavior and Environment Interaction Modeling for Intelligent Vehicle Advancements.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2018
300    $a 160 p.
500    $a Source: Dissertation Abstracts International, Volume: 79-12(E), Section: B.
500    $a Adviser: John H.L. Hansen.
502    $a Thesis (Ph.D.)--The University of Texas at Dallas, 2018.
520    $a With continued progress in artificial intelligence, vehicle technologies have advanced significantly from human controlled driving towards fully automated driving. During the transition, the intelligent vehicle should be able to understand the driver's perception of the environment and controlling behavior of the vehicle, as well as provide human-like interaction with the driver. To understand the complicated driving task which incorporates the interaction among the driver, the vehicle, and the environment, naturalistic driving studies and autonomous driving perception experiments are necessary to capture the in-vehicle and out-of-vehicle signals, process their dynamics, and migrate the driver's decision-making into the vehicle. This dissertation is focused on intelligent vehicle advancements, which include driver behavior analysis, environment perception, and advanced human-machine interface. First, with the availability of UTDrive naturalistic driving corpus, the driver's lane-change event is detected from vehicle dynamic signals, achieving over 80% accuracies using CAN signals only. Human factors for the lane-change detection are analyzed. Second, a high-digits road map corpus is leveraged to retrieve driving environment attributes, as well as to provide the road prior knowledge for drivable space segmentation on images. Combining environment attributes with vehicle dynamic signals, the lane-change recognition accuracies are improved from 82.22%-88.46% to 92.50%-96.67%. The road prior mask generated from the map data is shown to be an additional source to fuse with vision/laser sensors for the autonomous driving road perception, and in addition, it also has the capability for automatic annotation and virtual street views compensation. Next, the vehicle dynamics sensing functionality is migrated into a mobile platform -- Mobile-UTDrive, which allows for a smartphone device to be freely positioned in the vehicle. As an application, the smartphone collected signals are employed for an unsupervised driving performance assessment, giving the driver's objective rating score. Finally, a voice-based interface between the driver and vehicle is simulated, and natural language processing tasks are investigated in the design of a navigation dialogue system. The accuracy for intent detection (i.e., classify whether a sentence is navigation-related or not) is achieved as 98.83%, and for semantic parsing (i.e., extract useful context information) is achieved as 99.60%. Taken collectively, these advancements contribute to improved driver-to-vehicle interaction modeling, improved safety, and therefore reduce the transition challenge between human controlled to fully automated smart vehicles.
590    $a School code: 0382.
650  4 $a Automotive engineering. $3 2181195
650  4 $a Electrical engineering. $3 649834
690    $a 0540
690    $a 0544
710 2  $a The University of Texas at Dallas. $b Electrical Engineering. $3 1679269
773 0  $t Dissertation Abstracts International $g 79-12B(E).
790    $a 0382
791    $a Ph.D.
792    $a 2018
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10970420
Items
Inventory Number: W9379802
Location Name: Electronic resources (電子資源)
Item Class: 11.線上閱覽_V (online reading)
Material type: E-book (電子書)
Call number: EB
Usage Class: Normal use (一般使用)
Loan Status: On shelf
No. of reservations: 0