Introduction to deep learning for healthcare
Record type:
Bibliographic - electronic resource : Monograph/item
Title / Author:
Introduction to deep learning for healthcare / by Cao Xiao, Jimeng Sun.
Author:
Xiao, Cao.
Other author:
Sun, Jimeng.
Publisher:
Cham : Springer International Publishing : Imprint: Springer, 2021.
Description:
xi, 232 p. : ill., digital ; 24 cm.
Contents note:
I Introduction -- I.1 Who should read this book? -- I.2 Book organization -- II Health Data -- II.1 The growth of EHR Adoption -- II.2 Health Data -- II.2.1 Life cycle of health data -- II.2.2 Structured Health Data -- II.2.3 Unstructured clinical notes -- II.2.4 Continuous signals -- II.2.5 Medical Imaging Data -- II.2.6 Biomedical data for in silico drug discovery -- II.3 Health Data Standards -- III Machine Learning Basics -- III.1 Supervised Learning -- III.1.1 Logistic Regression -- III.1.2 Softmax Regression -- III.1.3 Gradient Descent -- III.1.4 Stochastic and Minibatch Gradient Descent -- III.2 Unsupervised Learning -- III.2.1 Principal component analysis -- III.2.2 t-distributed stochastic neighbor embedding (t-SNE) -- III.2.3 Clustering -- III.3 Assessing Model Performance -- III.3.1 Evaluation Metrics for Regression Tasks -- III.3.2 Evaluation Metrics for Classification Tasks -- III.3.3 Evaluation Metrics for Clustering Tasks -- III.3.4 Evaluation Strategy -- III.4 Modeling Exercise -- III.5 Hands-On Practice -- IV Deep Neural Networks (DNN) -- IV.1 A Single Neuron -- IV.1.1 Activation Function -- IV.1.2 Loss Function -- IV.1.3 Train a Single Neuron -- IV.2 Multilayer Neural Network -- IV.2.1 Network Representation -- IV.2.2 Train a Multilayer Neural Network -- IV.2.3 Summary of the Backpropagation Algorithm -- IV.2.4 Parameters and Hyper-parameters -- IV.3 Readmission Prediction from EHR Data with DNN -- IV.4 DNN for Drug Property Prediction -- V Embedding -- V.1 Overview -- V.2 Word2Vec -- V.2.1 Idea and Formulation of Word2Vec -- V.2.2 Healthcare Application of Word2Vec -- V.3 Med2Vec: Two-level Embedding for EHR -- V.3.1 Med2Vec Method -- V.4 MiME: Embed Internal Structure -- V.4.1 Notations of MiME -- V.4.2 Description of MiME -- V.4.3 Experiment Results of MiME -- VI Convolutional Neural Networks (CNN) -- VI.1 CNN Intuition -- VI.2 Architecture of CNN -- VI.2.1 Convolution Layer - 1D -- VI.2.2 Convolution Layer - 2D -- VI.2.3 Pooling Layer -- VI.2.4 Fully Connected Layer -- VI.3 Backpropagation Algorithm in CNN* -- VI.3.1 Forward and Backward Computation for 1-D Data -- VI.3.2 Forward Computation and Backpropagation for 2-D Convolution Layer -- VI.3.3 Special CNN Architecture -- VI.4 Healthcare Applications -- VI.5 Automated Surveillance of Cranial Images for Acute Neurologic Events -- VI.6 Detection of Lymph Node Metastases from Pathology Images -- VI.7 Cardiologist-level Arrhythmia Detection and Classification in Ambulatory ECG -- VII Recurrent Neural Networks (RNN) -- VII.1 Basic Concepts and Notations -- VII.2 Backpropagation Through Time (BPTT) Algorithm -- VII.2.1 Forward Pass -- VII.2.2 Backward Pass -- VII.3 RNN Variants -- VII.3.1 Long Short-Term Memory (LSTM) -- VII.3.2 Gated Recurrent Unit (GRU) -- VII.3.3 Bidirectional RNN -- VII.3.4 Encoder-Decoder Sequence-to-Sequence Models -- VII.4 Case Study: Early Detection of Heart Failure -- VII.5 Case Study: Sequential Clinical Event Prediction -- VII.6 Case Study: De-identification of Clinical Notes -- VII.7 Case Study: Automatic Detection of Heart Disease from Electrocardiography (ECG) Data -- VIII Autoencoders (AE) -- VIII.1 Overview -- VIII.2 Autoencoders -- VIII.3 Sparse Autoencoders -- VIII.4 Stacked Autoencoders -- VIII.5 Denoising Autoencoders -- VIII.6 Case Study: "Deep Patient" via Stacked Denoising Autoencoders -- VIII.7 Case Study: Learning from Noisy, Sparse, and Irregular Clinical Data -- IX Attention Models -- IX.1 Overview -- IX.2 Attention Mechanism -- IX.2.1 Attention Based on Encoder-Decoder RNN Models -- IX.2.2 Case Study: Attention Model over Longitudinal EHR -- IX.2.3 Case Study: Attention Model over a Medical Ontology -- IX.2.4 Case Study: ICD Classification from Clinical Notes -- X Memory Networks -- X.1 Original Memory Networks -- X.2 End-to-End Memory Networks -- X.3 Case Study: Medication Recommendation -- X.4 EEG-RelNet: Memory Derived from Data -- X.5 Incorporate Memory from Unstructured Knowledge Base -- XI Graph Neural Networks -- XI.1 Overview -- XI.2 Graph Convolutional Networks -- XI.2.1 Basic Setting of GCN -- XI.2.2 Spatial Convolution on Graphs -- XI.2.3 Spectral Convolution on Graphs -- XI.2.4 Approximate Graph Convolution -- XI.2.5 Neighborhood Aggregation -- XI.3 Neural Fingerprinting: Drug Molecule Embedding with GCN -- XI.4 Decagon: Modeling Polypharmacy Side Effects with GCN -- XI.5 Case Study: Multiview Drug-drug Interaction Prediction -- XII Generative Models -- XII.1 Generative Adversarial Networks (GAN) -- XII.1.1 The GAN Framework -- XII.1.2 The Cost Function of Discriminator -- XII.1.3 The Cost Function of Generator -- XII.2 Variational Autoencoders (VAE) -- XII.2.1 Latent Variable Models -- XII.2.2 Objective Formulation -- XII.2.3 Objective Approximation -- XII.2.4 Reparameterization Trick -- XII.3 Case Study: Generating Patient Records -- XII.4 Case Study: Small Molecule Generation for Drug Discovery -- XIII Conclusion -- XIII.1 Model Setup -- XIII.2 Model Training -- XIII.3 Testing and Performance Evaluation -- XIII.4 Result Visualization -- XIII.5 Case Studies -- XIV Appendix -- XIV.1 Regularization* -- XIV.1.1 Vanishing or Exploding Gradient Problem -- XIV.1.2 Dropout -- XIV.1.3 Batch Normalization -- XIV.2 Stochastic Gradient Descent and Minibatch Gradient Descent* -- XIV.3 Advanced Optimization* -- XIV.3.1 Momentum -- XIV.3.2 Adagrad, Adadelta, and RMSprop -- XIV.3.3 Adam.
Contained By:
Springer Nature eBook
Subject:
Artificial intelligence -- Medical applications.
Electronic resource:
https://doi.org/10.1007/978-3-030-82184-5
ISBN:
9783030821845
Summary:
This textbook presents deep learning models and their healthcare applications. It focuses on rich health data and deep learning models that can effectively model such data. Healthcare data: Among all healthcare technologies, electronic health records (EHRs) have seen wide adoption and have had a significant impact on healthcare delivery in recent years. One crucial benefit of EHRs is that they capture patient encounters with rich multi-modality data. Healthcare data include both structured and unstructured information. Structured data include various medical codes for diagnoses and procedures, lab results, and medication information. Unstructured data contain 1) clinical notes as text, 2) medical imaging data such as X-rays, echocardiograms, and magnetic resonance imaging (MRI), and 3) time-series data such as the electrocardiogram (ECG) and electroencephalogram (EEG). Beyond the data collected during clinical visits, patient self-generated/reported data have started to grow thanks to the increasing use of wearable sensors. The authors present deep learning case studies on all of the data types described. Deep learning models: Neural network models are a class of machine learning methods with a long history. Deep learning models are neural networks with many layers, which can extract multiple levels of features from raw data. Deep learning applied to healthcare is a natural and promising direction with many early successes. The authors cover deep neural networks, convolutional neural networks, recurrent neural networks, embedding methods, autoencoders, attention models, graph neural networks, memory networks, and generative models. These models are presented alongside concrete healthcare case studies such as clinical predictive modeling, readmission prediction, phenotyping, X-ray classification, ECG diagnosis, sleep monitoring, automatic diagnosis coding from clinical notes, automatic de-identification, medication recommendation, drug discovery (drug property prediction and molecule generation), and clinical trial matching. This textbook targets graduate-level students focused on deep learning methods and their healthcare applications, and can also serve as an introduction to the concepts of deep learning and its applications. Researchers working in this field will also find the book valuable for their research. (A minimal illustrative sketch of such a multi-layer network follows this record.)
Standard No.: 10.1007/978-3-030-82184-5 (doi)
LC Class. No.: R859.7.A78 / X53 2021
Dewey Class. No.: 610.28563
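The summary above describes deep learning models as multi-layer neural networks that extract successive levels of features from raw data, with readmission prediction as one of the book's case studies. As a rough illustration only, the following minimal NumPy sketch shows a two-layer network scoring a hypothetical binary readmission task; the feature count, layer sizes, random data, and labels are invented for the example and are not taken from the book.

```python
# Illustrative sketch only: a tiny two-layer ("deep") network for a binary task
# such as readmission prediction. All sizes and data below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: 4 patients, each with 10 structured EHR features.
X = rng.normal(size=(4, 10))
y = np.array([0, 1, 0, 1])           # hypothetical readmission labels

# Layer parameters: 10 inputs -> 8 hidden units -> 1 output.
W1, b1 = rng.normal(scale=0.1, size=(10, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.1, size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: each layer computes a higher-level representation of the raw features.
h = np.maximum(0.0, X @ W1 + b1)      # hidden layer with ReLU activation
p = sigmoid(h @ W2 + b2).ravel()      # predicted readmission probability

# Binary cross-entropy loss, the usual objective for this kind of classifier.
loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
print(f"predicted probabilities: {p}")
print(f"binary cross-entropy loss: {loss:.4f}")
```

In practice the book's hands-on material would rely on an established deep learning framework and train the weights by gradient descent; the sketch only demonstrates the forward pass and loss computation described in the summary.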
MARC record
LDR 08700nmm 22003255a 4500
001 2258705
003 DE-He213
005 20211111121118.0
006 m d
007 cr nn 008maaau
008 220422s2021 sz s 0 eng d
020    $a 9783030821845 $q (electronic bk.)
020    $a 9783030821838 $q (paper)
024 7  $a 10.1007/978-3-030-82184-5 $2 doi
035    $a 978-3-030-82184-5
040    $a GP $c GP
041 0  $a eng
050 4  $a R859.7.A78 $b X53 2021
072 7  $a UBH $2 bicssc
072 7  $a MED000000 $2 bisacsh
072 7  $a UBH $2 thema
082 04 $a 610.28563 $2 23
090    $a R859.7.A78 $b X6 2021
100 1  $a Xiao, Cao. $3 3288523
245 10 $a Introduction to deep learning for healthcare $h [electronic resource] / $c by Cao Xiao, Jimeng Sun.
260    $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2021.
300    $a xi, 232 p. : $b ill., digital ; $c 24 cm.
505 0  $a I Introduction -- I.1 Who should read this book? -- I.2 Book organization -- II Health Data -- II.1 The growth of EHR Adoption -- II.2 Health Data -- II.2.1 Life cycle of health data -- II.2.2 Structured Health Data -- II.2.3 Unstructured clinical notes -- II.2.4 Continuous signals -- II.2.5 Medical Imaging Data -- II.2.6 Biomedical data for in silico drug discovery -- II.3 Health Data Standards -- III Machine Learning Basics -- III.1 Supervised Learning -- III.1.1 Logistic Regression -- III.1.2 Softmax Regression -- III.1.3 Gradient Descent -- III.1.4 Stochastic and Minibatch Gradient Descent -- III.2 Unsupervised Learning -- III.2.1 Principal component analysis -- III.2.2 t-distributed stochastic neighbor embedding (t-SNE) -- III.2.3 Clustering -- III.3 Assessing Model Performance -- III.3.1 Evaluation Metrics for Regression Tasks -- III.3.2 Evaluation Metrics for Classification Tasks -- III.3.3 Evaluation Metrics for Clustering Tasks -- III.3.4 Evaluation Strategy -- III.4 Modeling Exercise -- III.5 Hands-On Practice -- IV Deep Neural Networks (DNN) -- IV.1 A Single Neuron -- IV.1.1 Activation Function -- IV.1.2 Loss Function -- IV.1.3 Train a Single Neuron -- IV.2 Multilayer Neural Network -- IV.2.1 Network Representation -- IV.2.2 Train a Multilayer Neural Network -- IV.2.3 Summary of the Backpropagation Algorithm -- IV.2.4 Parameters and Hyper-parameters -- IV.3 Readmission Prediction from EHR Data with DNN -- IV.4 DNN for Drug Property Prediction -- V Embedding -- V.1 Overview -- V.2 Word2Vec -- V.2.1 Idea and Formulation of Word2Vec -- V.2.2 Healthcare Application of Word2Vec -- V.3 Med2Vec: Two-level Embedding for EHR -- V.3.1 Med2Vec Method -- V.4 MiME: Embed Internal Structure -- V.4.1 Notations of MiME -- V.4.2 Description of MiME -- V.4.3 Experiment Results of MiME -- VI Convolutional Neural Networks (CNN) -- VI.1 CNN Intuition -- VI.2 Architecture of CNN -- VI.2.1 Convolution Layer - 1D -- VI.2.2 Convolution Layer - 2D -- VI.2.3 Pooling Layer -- VI.2.4 Fully Connected Layer -- VI.3 Backpropagation Algorithm in CNN* -- VI.3.1 Forward and Backward Computation for 1-D Data -- VI.3.2 Forward Computation and Backpropagation for 2-D Convolution Layer -- VI.3.3 Special CNN Architecture -- VI.4 Healthcare Applications -- VI.5 Automated Surveillance of Cranial Images for Acute Neurologic Events -- VI.6 Detection of Lymph Node Metastases from Pathology Images -- VI.7 Cardiologist-level Arrhythmia Detection and Classification in Ambulatory ECG -- VII Recurrent Neural Networks (RNN) -- VII.1 Basic Concepts and Notations -- VII.2 Backpropagation Through Time (BPTT) Algorithm -- VII.2.1 Forward Pass -- VII.2.2 Backward Pass -- VII.3 RNN Variants -- VII.3.1 Long Short-Term Memory (LSTM) -- VII.3.2 Gated Recurrent Unit (GRU) -- VII.3.3 Bidirectional RNN -- VII.3.4 Encoder-Decoder Sequence-to-Sequence Models -- VII.4 Case Study: Early Detection of Heart Failure -- VII.5 Case Study: Sequential Clinical Event Prediction -- VII.6 Case Study: De-identification of Clinical Notes -- VII.7 Case Study: Automatic Detection of Heart Disease from Electrocardiography (ECG) Data -- VIII Autoencoders (AE) -- VIII.1 Overview -- VIII.2 Autoencoders -- VIII.3 Sparse Autoencoders -- VIII.4 Stacked Autoencoders -- VIII.5 Denoising Autoencoders -- VIII.6 Case Study: "Deep Patient" via Stacked Denoising Autoencoders -- VIII.7 Case Study: Learning from Noisy, Sparse, and Irregular Clinical Data -- IX Attention Models -- IX.1 Overview -- IX.2 Attention Mechanism -- IX.2.1 Attention Based on Encoder-Decoder RNN Models -- IX.2.2 Case Study: Attention Model over Longitudinal EHR -- IX.2.3 Case Study: Attention Model over a Medical Ontology -- IX.2.4 Case Study: ICD Classification from Clinical Notes -- X Memory Networks -- X.1 Original Memory Networks -- X.2 End-to-End Memory Networks -- X.3 Case Study: Medication Recommendation -- X.4 EEG-RelNet: Memory Derived from Data -- X.5 Incorporate Memory from Unstructured Knowledge Base -- XI Graph Neural Networks -- XI.1 Overview -- XI.2 Graph Convolutional Networks -- XI.2.1 Basic Setting of GCN -- XI.2.2 Spatial Convolution on Graphs -- XI.2.3 Spectral Convolution on Graphs -- XI.2.4 Approximate Graph Convolution -- XI.2.5 Neighborhood Aggregation -- XI.3 Neural Fingerprinting: Drug Molecule Embedding with GCN -- XI.4 Decagon: Modeling Polypharmacy Side Effects with GCN -- XI.5 Case Study: Multiview Drug-drug Interaction Prediction -- XII Generative Models -- XII.1 Generative Adversarial Networks (GAN) -- XII.1.1 The GAN Framework -- XII.1.2 The Cost Function of Discriminator -- XII.1.3 The Cost Function of Generator -- XII.2 Variational Autoencoders (VAE) -- XII.2.1 Latent Variable Models -- XII.2.2 Objective Formulation -- XII.2.3 Objective Approximation -- XII.2.4 Reparameterization Trick -- XII.3 Case Study: Generating Patient Records -- XII.4 Case Study: Small Molecule Generation for Drug Discovery -- XIII Conclusion -- XIII.1 Model Setup -- XIII.2 Model Training -- XIII.3 Testing and Performance Evaluation -- XIII.4 Result Visualization -- XIII.5 Case Studies -- XIV Appendix -- XIV.1 Regularization* -- XIV.1.1 Vanishing or Exploding Gradient Problem -- XIV.1.2 Dropout -- XIV.1.3 Batch Normalization -- XIV.2 Stochastic Gradient Descent and Minibatch Gradient Descent* -- XIV.3 Advanced Optimization* -- XIV.3.1 Momentum -- XIV.3.2 Adagrad, Adadelta, and RMSprop -- XIV.3.3 Adam.
520    $a This textbook presents deep learning models and their healthcare applications. It focuses on rich health data and deep learning models that can effectively model such data. Healthcare data: Among all healthcare technologies, electronic health records (EHRs) have seen wide adoption and have had a significant impact on healthcare delivery in recent years. One crucial benefit of EHRs is that they capture patient encounters with rich multi-modality data. Healthcare data include both structured and unstructured information. Structured data include various medical codes for diagnoses and procedures, lab results, and medication information. Unstructured data contain 1) clinical notes as text, 2) medical imaging data such as X-rays, echocardiograms, and magnetic resonance imaging (MRI), and 3) time-series data such as the electrocardiogram (ECG) and electroencephalogram (EEG). Beyond the data collected during clinical visits, patient self-generated/reported data have started to grow thanks to the increasing use of wearable sensors. The authors present deep learning case studies on all of the data types described. Deep learning models: Neural network models are a class of machine learning methods with a long history. Deep learning models are neural networks with many layers, which can extract multiple levels of features from raw data. Deep learning applied to healthcare is a natural and promising direction with many early successes. The authors cover deep neural networks, convolutional neural networks, recurrent neural networks, embedding methods, autoencoders, attention models, graph neural networks, memory networks, and generative models. These models are presented alongside concrete healthcare case studies such as clinical predictive modeling, readmission prediction, phenotyping, X-ray classification, ECG diagnosis, sleep monitoring, automatic diagnosis coding from clinical notes, automatic de-identification, medication recommendation, drug discovery (drug property prediction and molecule generation), and clinical trial matching. This textbook targets graduate-level students focused on deep learning methods and their healthcare applications, and can also serve as an introduction to the concepts of deep learning and its applications. Researchers working in this field will also find the book valuable for their research.
650  0 $a Artificial intelligence $x Medical applications. $3 900591
650  0 $a Machine learning. $3 533906
650 14 $a Health Informatics. $3 892928
650 24 $a Machine Learning. $3 3382522
650 24 $a Artificial Intelligence. $3 769149
700 1  $a Sun, Jimeng. $3 1286709
710 2  $a SpringerLink (Online service) $3 836513
773 0  $t Springer Nature eBook
856 40 $u https://doi.org/10.1007/978-3-030-82184-5
950    $a Computer Science (SpringerNature-11645)
Holdings (1 item):
Barcode: W9414312
Location: Electronic resources (電子資源)
Circulation category: 11.線上閱覽_V (online access)
Material type: E-book
Call number: EB R859.7.A78 X53 2021
Use type: Normal
Loan status: On shelf
Holds: 0