A Framework for Dexterous Manipulation Through Tactile Perception.
Record type:
Bibliographic record - Electronic resource : Monograph/item
Title/Author:
A Framework for Dexterous Manipulation Through Tactile Perception.
Author:
Ganguly, Kanishka.
Extent:
1 online resource (149 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 84-12, Section: B.
Contained By:
Dissertations Abstracts International, 84-12B.
Subject:
Robotics.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=29996830 - click for full text (PQDT)
ISBN:
9798379753221
A Framework for Dexterous Manipulation Through Tactile Perception.
Ganguly, Kanishka.
A Framework for Dexterous Manipulation Through Tactile Perception. - 1 online resource (149 pages)
Source: Dissertations Abstracts International, Volume: 84-12, Section: B.
Thesis (Ph.D.)--University of Maryland, College Park, 2022.
Includes bibliographical references
A long-anticipated, yet hitherto unfulfilled, goal in robotics research has been to have robotic agents seamlessly integrate with humans in their natural environments and perform useful tasks alongside them. While tremendous progress has been made in allowing robots to perceive visually, and to understand and reason about a scene, the act of manipulating that environment remains a challenging and incomplete task.

For robotic agents to perform useful tasks in environments that are not specifically designed for their operation, it is crucial to have dexterous manipulation capabilities guided by some form of tactile perception. While visual perception provides a large-scale understanding of the environment, tactile perception allows fine-grained understanding of objects and textures. For truly useful robotic agents, a tightly coupled system comprising both visual and tactile perception is a necessity.

Tactile sensing hardware can be classified on a spectrum, organized by form factor on one end and sensing accuracy and robustness on the other, and most off-the-shelf sensors available today trade one of these features for the other. The tactile sensor used in this research, the BioTac SP, was selected for its anthropomorphic qualities, such as its shape and sensing mechanism, while compromising on the quality of its sensory outputs. The sensor provides a sensing surface and returns 24 tactile data points at each timestamp, along with pressure values.

We first present a novel method for contact and motion estimation through visual perception, in which we perform non-rigid registration of a human performing actions and compute dense motion-estimation trajectories. These are used to compute topological scene changes and are refined to obtain object and contact segmentation. We then ground the contact points and motion trajectories into an intermediate action graph, which can then be executed by a robotic agent.

Secondly, we introduce the concept of computational tactile flow, inspired by fMRI studies on humans in which it was discovered that the same parts of the brain that react to optical motion stimuli also react to tactile stimuli. We mathematically model the BioTac SP sensor and interpolate surfaces in two and three dimensions, on which we compute tactile flow fields. We demonstrate the flow fields on various surfaces and suggest several useful applications of tactile flow.

We next apply tactile feedback to a novel controller that is able to grasp objects without any prior knowledge of their shape, material, or weight. We use tactile flow to compute slippage during grasping and adjust the finger forces to maintain a stable grasp during motion. We demonstrate success on transparent and soft, deformable objects, alongside other regularly shaped samples.

Lastly, we take a different approach to processing tactile data, computing tactile events inspired by the neuromorphic computing literature. We compute spatio-temporal gradients on the raw tactile data to generate event surfaces, which are more robust and reduce sensor noise. This intermediate surface is then used to track contact regions over the BioTac SP sensor skin, and allows us to detect slippage, track spatial edge contours, and estimate the magnitude of applied forces.
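As an illustration of the computational tactile flow idea summarized above (interpolating the BioTac SP's 24 taxel readings into a dense surface and computing a flow field over it), the following minimal Python sketch interpolates sparse taxel values onto a regular grid and estimates a 2-D flow between consecutive frames. The electrode layout, grid resolution, and the use of OpenCV's Farneback optical flow are assumptions made for illustration only; they are not the dissertation's actual model, data, or code.

import numpy as np
from scipy.interpolate import griddata
import cv2

# Placeholder 2-D layout for the BioTac SP's 24 electrodes
# (illustrative only, not the sensor's real geometry).
rng = np.random.default_rng(0)
taxel_xy = rng.uniform(0.1, 0.9, size=(24, 2))

GRID = 64  # resolution of the interpolated contact image


def taxels_to_image(values, grid=GRID):
    """Interpolate 24 scalar taxel readings onto a dense 2-D grid."""
    xs, ys = np.meshgrid(np.linspace(0, 1, grid), np.linspace(0, 1, grid))
    img = griddata(taxel_xy, values, (xs, ys), method="cubic", fill_value=0.0)
    return np.nan_to_num(img).astype(np.float32)


def tactile_flow(prev_vals, curr_vals):
    """Dense flow field between two interpolated tactile frames
    (Farneback optical flow, used here as a stand-in flow estimator)."""
    def to_u8(v):
        img = taxels_to_image(v)
        return cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    prev_img, curr_img = to_u8(prev_vals), to_u8(curr_vals)
    # Arguments: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
    return cv2.calcOpticalFlowFarneback(prev_img, curr_img, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)


def contact_bump(center, width=0.02):
    """Toy contact patch: Gaussian pressure bump over the taxel positions."""
    return np.exp(-np.sum((taxel_xy - center) ** 2, axis=1) / width)


if __name__ == "__main__":
    frame_t0 = contact_bump(np.array([0.45, 0.5]))
    frame_t1 = contact_bump(np.array([0.55, 0.5]))  # same contact, shifted right
    flow = tactile_flow(frame_t0, frame_t1)          # shape (GRID, GRID, 2)
    print("mean tactile flow (dx, dy):", flow.mean(axis=(0, 1)))

A grasp controller in the spirit of the one described in the abstract might threshold the magnitude of such a flow field to flag incipient slip and increase finger forces accordingly; the dissertation's own controller and slip criterion are not reproduced here.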
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2023
Mode of access: World Wide Web
ISBN: 9798379753221
Subjects--Topical Terms:
Robotics.
Subjects--Index Terms:
Grasping
Index Terms--Genre/Form:
Electronic books.
A Framework for Dexterous Manipulation Through Tactile Perception.
LDR  04703nmm a2200397K 4500
001  2363151
005  20231116093808.5
006  m o d
007  cr mn ---uuuuu
008  241011s2022 xx obm 000 0 eng d
020  $a 9798379753221
035  $a (MiAaPQ)AAI29996830
035  $a AAI29996830
040  $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1 $a Ganguly, Kanishka. $3 3703900
245 1 2 $a A Framework for Dexterous Manipulation Through Tactile Perception.
264 0 $c 2022
300  $a 1 online resource (149 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertations Abstracts International, Volume: 84-12, Section: B.
500  $a Advisor: Aloimonos, Yiannis.
502  $a Thesis (Ph.D.)--University of Maryland, College Park, 2022.
504  $a Includes bibliographical references
520  $a A long-anticipated, yet hitherto unfulfilled, goal in robotics research has been to have robotic agents seamlessly integrate with humans in their natural environments and perform useful tasks alongside them. While tremendous progress has been made in allowing robots to perceive visually, and to understand and reason about a scene, the act of manipulating that environment remains a challenging and incomplete task. For robotic agents to perform useful tasks in environments that are not specifically designed for their operation, it is crucial to have dexterous manipulation capabilities guided by some form of tactile perception. While visual perception provides a large-scale understanding of the environment, tactile perception allows fine-grained understanding of objects and textures. For truly useful robotic agents, a tightly coupled system comprising both visual and tactile perception is a necessity. Tactile sensing hardware can be classified on a spectrum, organized by form factor on one end and sensing accuracy and robustness on the other, and most off-the-shelf sensors available today trade one of these features for the other. The tactile sensor used in this research, the BioTac SP, was selected for its anthropomorphic qualities, such as its shape and sensing mechanism, while compromising on the quality of its sensory outputs. The sensor provides a sensing surface and returns 24 tactile data points at each timestamp, along with pressure values. We first present a novel method for contact and motion estimation through visual perception, in which we perform non-rigid registration of a human performing actions and compute dense motion-estimation trajectories. These are used to compute topological scene changes and are refined to obtain object and contact segmentation. We then ground the contact points and motion trajectories into an intermediate action graph, which can then be executed by a robotic agent. Secondly, we introduce the concept of computational tactile flow, inspired by fMRI studies on humans in which it was discovered that the same parts of the brain that react to optical motion stimuli also react to tactile stimuli. We mathematically model the BioTac SP sensor and interpolate surfaces in two and three dimensions, on which we compute tactile flow fields. We demonstrate the flow fields on various surfaces and suggest several useful applications of tactile flow. We next apply tactile feedback to a novel controller that is able to grasp objects without any prior knowledge of their shape, material, or weight. We use tactile flow to compute slippage during grasping and adjust the finger forces to maintain a stable grasp during motion. We demonstrate success on transparent and soft, deformable objects, alongside other regularly shaped samples. Lastly, we take a different approach to processing tactile data, computing tactile events inspired by the neuromorphic computing literature. We compute spatio-temporal gradients on the raw tactile data to generate event surfaces, which are more robust and reduce sensor noise. This intermediate surface is then used to track contact regions over the BioTac SP sensor skin, and allows us to detect slippage, track spatial edge contours, and estimate the magnitude of applied forces.
533  $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538  $a Mode of access: World Wide Web
650 4 $a Robotics. $3 519753
650 4 $a Computer science. $3 523869
650 4 $a Remote sensing. $3 535394
653  $a Grasping
653  $a Manipulation
653  $a Tactile sensing
653  $a Natural environments
653  $a Tactile flow
655 7 $a Electronic books. $2 lcsh $3 542853
690  $a 0771
690  $a 0984
690  $a 0799
710 2 $a ProQuest Information and Learning Co. $3 783688
710 2 $a University of Maryland, College Park. $b Computer Science. $3 1018451
773 0 $t Dissertations Abstracts International $g 84-12B.
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=29996830 $z click for full text (PQDT)
Holdings
1 record • Page 1
Barcode: W9485507
Location: Electronic Resources
Circulation category: 11. Online Reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Hold status: 0
Notes:
Attachments: