Radar-Based Perception for Visually Degraded Environments.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Radar-Based Perception for Visually Degraded Environments. / Kramer, Andrew.
Author:
Kramer, Andrew.
Description:
1 online resource (144 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 83-03, Section: B.
Contained By:
Dissertations Abstracts International, 83-03B.
Subject:
Robotics.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28643745 (click for full text, PQDT)
ISBN:
9798538154005
Radar-Based Perception for Visually Degraded Environments. / Kramer, Andrew. - 1 online resource (144 pages)
Source: Dissertations Abstracts International, Volume: 83-03, Section: B.
Thesis (Ph.D.)--University of Colorado at Boulder, 2021.
Includes bibliographical references
Autonomous mobile robots are being deployed in ever more varied roles and environments. For instance, a robotic explorer was deployed just this year to Jezero Crater, the most challenging area of Mars yet explored. This might lead one to believe autonomous mobile robots are able to operate in almost any environment. But in fact there exists a broad range of environments that are inaccessible to autonomous robots. Environments with smoke, fog, darkness, and other visually challenging conditions, collectively referred to as visually-degraded environments (VDEs), are among the biggest hurdles to deploying autonomous mobile robots in the field. But a robot is likely to encounter such conditions almost anywhere outside of a well-lit indoor environment. So if autonomous mobile robots are to be truly useful in a broad range of environments, the problem of perception in VDEs must be addressed. In theory, one could avoid visual challenges simply by using a different type of sensor. Millimeter wave radar, for instance, is unaffected by most kinds of airborne particulates and it does not depend on ambient light. But using radar for robotic perception comes with its own challenges. Popular visual and lidar-based methods for perception and state estimation do not adapt well to radar. Additionally, radar measurements are subject to forms of noise unknown in other sensors. This dissertation will cover a number of novel developments in perception and state estimation using millimeter wave radar that address these issues including i) a radar-inertial method for state estimation in smoke and fog, ii) a radar-only occupancy mapping method, iii) a unique dataset for radar-based state estimation and perception, iv) an end-to-end learned method for 3D radar image alignment, and v) a new radar-only detector for moving obstacles in road environments.
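The radar-only occupancy mapping named in (ii) sits within the general occupancy-grid framework; a minimal log-odds sketch of that generic framework is given below for context. It is not the dissertation's method: the OccupancyGrid class, cell size, and log-odds increments are illustrative assumptions.

import numpy as np

# Generic 2D log-odds occupancy grid updated from radar range returns.
# Cell size, grid extent, and the hit/miss increments are assumptions
# for illustration; this is not the dissertation's specific algorithm.

L_OCC, L_FREE = 0.85, -0.4   # log-odds increments: return cell / traversed cells
CELL = 0.25                  # assumed grid resolution in metres

class OccupancyGrid:
    def __init__(self, size=400):
        self.logodds = np.zeros((size, size))
        self.origin = size // 2          # place the sensor at the grid centre

    def _to_cell(self, x, y):
        return (int(round(x / CELL)) + self.origin,
                int(round(y / CELL)) + self.origin)

    def update(self, sensor_xy, hit_xy):
        """Mark the cell holding a radar return as more occupied and the
        cells along the sensor-to-return ray as more free."""
        sx, sy = self._to_cell(*sensor_xy)
        hx, hy = self._to_cell(*hit_xy)
        steps = max(abs(hx - sx), abs(hy - sy), 1)
        for i in range(steps):           # coarse ray traversal: free space
            cx = sx + round(i * (hx - sx) / steps)
            cy = sy + round(i * (hy - sy) / steps)
            self.logodds[cx, cy] += L_FREE
        self.logodds[hx, hy] += L_OCC    # the return itself: occupied

    def probabilities(self):
        # Convert accumulated log-odds back to occupancy probabilities.
        return 1.0 / (1.0 + np.exp(-self.logodds))

grid = OccupancyGrid()
grid.update(sensor_xy=(0.0, 0.0), hit_xy=(5.0, 2.5))   # one hypothetical return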
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2023.
Mode of access: World Wide Web.
ISBN: 9798538154005
Subjects--Topical Terms: Robotics.
Subjects--Index Terms: Estimation
Index Terms--Genre/Form: Electronic books.
MARC record:
LDR  03246nmm a2200409K 4500
001  2354028
005  20230324111139.5
006  m o d
007  cr mn ---uuuuu
008  241011s2021 xx obm 000 0 eng d
020    $a 9798538154005
035    $a (MiAaPQ)AAI28643745
035    $a AAI28643745
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Kramer, Andrew. $3 3694367
245 10 $a Radar-Based Perception for Visually Degraded Environments.
264  0 $c 2021
300    $a 1 online resource (144 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertations Abstracts International, Volume: 83-03, Section: B.
500    $a Advisor: Heckman, Christoffer; Wong, Uland.
502    $a Thesis (Ph.D.)--University of Colorado at Boulder, 2021.
504    $a Includes bibliographical references
520    $a
Autonomous mobile robots are being deployed in ever more varied roles and environments. For instance, a robotic explorer was deployed just this year to Jezero Crater, the most challenging area of Mars yet explored. This might lead one to believe autonomous mobile robots are able to operate in almost any environment. But in fact there exists a broad range of environments that are inaccessible to autonomous robots. Environments with smoke, fog, darkness, and other visually challenging conditions, collectively referred to as visually-degraded environments (VDEs), are among the biggest hurdles to deploying autonomous mobile robots in the field. But a robot is likely to encounter such conditions almost anywhere outside of a well-lit indoor environment. So if autonomous mobile robots are to be truly useful in a broad range of environments, the problem of perception in VDEs must be addressed. In theory, one could avoid visual challenges simply by using a different type of sensor. Millimeter wave radar, for instance, is unaffected by most kinds of airborne particulates and it does not depend on ambient light. But using radar for robotic perception comes with its own challenges. Popular visual and lidar-based methods for perception and state estimation do not adapt well to radar. Additionally, radar measurements are subject to forms of noise unknown in other sensors. This dissertation will cover a number of novel developments in perception and state estimation using millimeter wave radar that address these issues including i) a radar-inertial method for state estimation in smoke and fog, ii) a radar-only occupancy mapping method, iii) a unique dataset for radar-based state estimation and perception, iv) an end-to-end learned method for 3D radar image alignment, and v) a new radar-only detector for moving obstacles in road environments.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538    $a Mode of access: World Wide Web
650  4 $a Robotics. $3 519753
650  4 $a Velocity. $3 3560495
650  4 $a Data collection. $3 3561708
650  4 $a Datasets. $3 3541416
650  4 $a Experiments. $3 525909
650  4 $a Calibration. $3 2068745
650  4 $a Sensors. $3 3549539
650  4 $a Electromagnetics. $3 3173223
650  4 $a Computer science. $3 523869
653    $a Estimation
653    $a Learning
653    $a Radar
653    $a Sensing
653    $a State
653    $a Visually-degraded environments
655  7 $a Electronic books. $2 lcsh $3 542853
690    $a 0771
690    $a 0984
690    $a 0607
710 2  $a ProQuest Information and Learning Co. $3 783688
710 2  $a University of Colorado at Boulder. $b Computer Science. $3 1018560
773 0  $t Dissertations Abstracts International $g 83-03B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28643745 $z click for full text (PQDT)
Items
Inventory Number: W9476384
Location Name: Electronic Resources (電子資源)
Item Class: 11. Online Reading (11.線上閱覽_V)
Material type: E-book (電子書)
Call number: EB
Usage Class: General use (Normal)
Loan Status: On shelf
No. of reservations: 0