TY - JOUR
T1 - An Introductory Review of Deep Learning for Prediction Models With Big Data
AU - Emmert-Streib, Frank
AU - Yang, Zhen
AU - Feng, Han
AU - Tripathi, Shailesh
AU - Dehmer, Matthias
N1 - Publisher Copyright:
© 2020 Emmert-Streib, Yang, Feng, Tripathi and Dehmer.
PY - 2020/2/28
Y1 - 2020/2/28
N2 - Deep learning models represent a new learning paradigm in artificial intelligence (AI) and machine learning. Recent breakthrough results in image analysis and speech recognition have generated massive interest in this field, because applications in many other domains that provide big data also seem possible. On the downside, the mathematical and computational methodology underlying deep learning models is very challenging, especially for interdisciplinary scientists. For this reason, we present in this paper an introductory review of deep learning approaches, including Deep Feedforward Neural Networks (D-FFNN), Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), Autoencoders (AEs), and Long Short-Term Memory (LSTM) networks. These models form the major core architectures of deep learning currently in use and should belong in any data scientist's toolbox. Importantly, these core architectural building blocks can be composed flexibly, in an almost Lego-like manner, to build new application-specific network architectures. Hence, a basic understanding of these network architectures is important in order to be prepared for future developments in AI.
AB - Deep learning models represent a new learning paradigm in artificial intelligence (AI) and machine learning. Recent breakthrough results in image analysis and speech recognition have generated massive interest in this field, because applications in many other domains that provide big data also seem possible. On the downside, the mathematical and computational methodology underlying deep learning models is very challenging, especially for interdisciplinary scientists. For this reason, we present in this paper an introductory review of deep learning approaches, including Deep Feedforward Neural Networks (D-FFNN), Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), Autoencoders (AEs), and Long Short-Term Memory (LSTM) networks. These models form the major core architectures of deep learning currently in use and should belong in any data scientist's toolbox. Importantly, these core architectural building blocks can be composed flexibly, in an almost Lego-like manner, to build new application-specific network architectures. Hence, a basic understanding of these network architectures is important in order to be prepared for future developments in AI.
KW - artificial intelligence
KW - data science
KW - deep learning
KW - machine learning
KW - neural networks
KW - prediction models
UR - http://www.scopus.com/inward/record.url?scp=85117886897&partnerID=8YFLogxK
U2 - 10.3389/frai.2020.00004
DO - 10.3389/frai.2020.00004
M3 - Review article
C2 - 33733124
VL - 3
SP - 4
JO - Frontiers in Artificial Intelligence
JF - Frontiers in Artificial Intelligence
M1 - 4
ER -