## What is a Stacked Autoencoder?

A novel ensemble deep relevant learning soft sensor (EDRLSS) modeling framework based on stacked autoencoder (SAE), mutual information (MI), and a bagging-based strategy is proposed.^{[1]}

In this paper, an intelligent incipient fault diagnosis model based on a deep time series feature extraction network is proposed, which integrates a denoising autoencoder (DAE), an LSTM with layer normalization and dropout layers added (LD-LSTM), and a stacked autoencoder (SAE) to obtain efficient features.
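The recurring recipe behind the excerpts in this section is greedy layer-wise pretraining: train one autoencoder layer at a time on the previous layer's code, then use the deepest code as the extracted feature. A minimal NumPy sketch of that idea follows; all sizes, learning rates, and function names are illustrative and not drawn from any cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_ae_layer(X, n_hidden, lr=0.5, epochs=300):
    """Greedily train one tied-weight autoencoder layer with squared loss."""
    n_in = X.shape[1]
    W = rng.normal(0.0, 0.1, (n_in, n_hidden))
    b_enc, b_dec = np.zeros(n_hidden), np.zeros(n_in)
    for _ in range(epochs):
        H = sigmoid(X @ W + b_enc)        # encode
        R = sigmoid(H @ W.T + b_dec)      # decode with tied weights
        dR = (R - X) * R * (1.0 - R)      # delta at the reconstruction
        dH = (dR @ W) * H * (1.0 - H)     # delta at the hidden code
        W -= lr * (X.T @ dH + dR.T @ H) / len(X)   # both paths touch W
        b_enc -= lr * dH.mean(axis=0)
        b_dec -= lr * dR.mean(axis=0)
    return W, b_enc

def stacked_encode(X, layer_sizes):
    """Pretrain layer by layer; each layer encodes the previous layer's code."""
    H = X
    for size in layer_sizes:
        W, b = train_ae_layer(H, size)
        H = sigmoid(H @ W + b)
    return H

X = rng.uniform(0.2, 0.8, (64, 16))       # toy data scaled into (0, 1)
code = stacked_encode(X, [8, 4])
print(code.shape)                         # (64, 4): the compressed features
```

Supervised fine-tuning, which several excerpts add on top (e.g., with a softmax classifier), would then start back-propagation from these pretrained weights rather than from a random initialization.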

^{[2]}Stacked autoencoder (SAE) algorithm was used to extract color features in pixel-level, and the trained parameters were used as the initial parameters of the prediction model by using back propagation neural network (BPNN) algorithm.

^{[3]}This paper proposed a feature extraction method based on Stacked Autoencoder (SAE) in order to yield a robust signal representation.

^{[4]}Also, the proposed APBSO is utilized for training deep-stacked autoencoder to choose the optimal weights.

^{[5]}In this paper, a combined model based on stacked autoencoders (SAE) and FrFE (CSF) is proposed for hyperspectral AD.

^{[6]}This study proposes a novel machine learning-based model (SAE-RNN) that hybridizes the stacked autoencoder (SAE) with a recurrent neural network (RNN) for providing accurate and timely information to support emergency management in areas impacted by flood hazards.

^{[7]}In order to address this problem, this paper proposes an effective context-specific sentiment based stacked autoencoder (CSSAE) to learn the concrete preference of the user by merging the rating and reviews for a context-specific item into a stacked autoencoder.

^{[8]}Secondly, we adopt the stacked autoencoder (SAE) to reduce the dimensionality of Google search queries data.

^{[9]}We initially use a stacked autoencoder to extract meaningful high-order features from the original similarity matrix, and then perform feature interactive learning, and finally utilize an integrated model composed of multiple random forests and logistic regression to make comprehensive predictions.

^{[10]}With the motivation of comparing the performance and efficiency of classical and new algorithms, we extracted different feature descriptors (Projections, LBP, HOG), and implemented different classifiers (Ridge regression, SVM, AdaBoost, Stacked Autoencoders, and CNN) for open/closed eye classification.

^{[11]}The stacked autoencoder (SAE) is used to extract the low-dimensional features of the raw data, while the group method of data handling (GMDH) is applied for the sub-series forecasting.

^{[12]}Firstly, a stacked autoencoder (SAE) improved by a back propagation neural network (BPNN) with a softmax classifier is used to enhance the one-dimensional vibration signal.
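Several excerpts pair the pretrained SAE with a softmax classifier that is fine-tuned by back-propagation. On its own, the supervised head reduces to multinomial logistic regression on the encoded features; a hedged NumPy sketch follows, where the `codes` array is a synthetic stand-in for SAE outputs.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Synthetic stand-in for SAE codes, with a linearly decodable 2-class label.
codes = rng.normal(size=(120, 4))
labels = (codes[:, 0] + codes[:, 1] > 0.0).astype(int)
Y = np.eye(2)[labels]                      # one-hot targets

W, b = np.zeros((4, 2)), np.zeros(2)
for _ in range(500):                       # cross-entropy gradient descent
    P = softmax(codes @ W + b)
    G = (P - Y) / len(codes)               # gradient of mean cross-entropy
    W -= codes.T @ G
    b -= G.sum(axis=0)

acc = ((codes @ W + b).argmax(axis=1) == labels).mean()
print("train accuracy:", round(float(acc), 2))
```

In the fine-tuning step the excerpts describe, this same cross-entropy gradient would also flow back into the encoder weights instead of stopping at the softmax layer.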

^{[13]}Besides that, we investigate the importance of recent advances in machine learning, including deep kernel learning, as well as the various types of autoencoders (the basic autoencoder and the stacked autoencoder).

^{[14]}Motivated by this, an intensified iterative learning (IIL) model which is developed from the stacked autoencoder is proposed in this study.

^{[15]}SMALF first utilizes a stacked autoencoder to learn miRNA latent feature and disease latent feature from the original miRNA-disease association matrix.

^{[16]}Accordingly, the big data classification is progressed using a stacked autoencoder, which is trained by the Adaptive E-Bat algorithm.

^{[17]}The Stacked Autoencoder (SAE) network was then used to predict the location and volume of hemorrhage by conductivity reconstruction.

^{[18]}A new technique, namely Adaptive rag-Rider optimization algorithm (Adaptive rag-ROA) is presented to train deep-stacked autoencoder (Deep SAE) for discovering epileptic seizures.

^{[19]}First, we use AANE algorithm to extract low-dimensional features of circRNAs and diseases and then stacked autoencoder (SAE) to automatically extract in-depth features.

^{[20]}This paper extracts 549 features consisting of information such as DLL/API that can be checked from PE files commonly used in malicious code analysis, and compresses the data using an SAE (Stacked AutoEncoder).

^{[21]}The performance of the proposed method is compared with non-convolution-based methods that combine fully-connected NN structures (multi-layer perceptron (MLP)) and dimension-reduction techniques (principal component analysis (PCA) and stacked autoencoder (SAE)) in the benchmark egg model.

^{[22]}In view of the fault diagnosis problem of satellite attitude control system, a fault diagnosis method based on stacked autoencoder (SAE) network is proposed.

^{[23]}To achieve this goal, three machine learning techniques (artificial neural network, random forest, stacked autoencoders) were adopted to improve the conventional source localization approach.

^{[24]}Traditional deep learning like stacked autoencoder (SAE) only captures the feature representations by minimizing the global reconstruction errors, which causes a loss of the intrinsic geometric structure embedded in the raw data.

^{[25]}In our model, successive operations in a clustering algorithm are expressed as steps in a recurrent process, stacked on top of representations output by a Stacked Autoencoder (SAE).

^{[26]}Hierarchical neural network based one-class anomaly detection algorithms generally rely on stacked autoencoders (AEs) for feature learning.
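The reconstruction-error idea behind excerpts [24] and [26] can be shown in miniature: a model fitted to normal data reconstructs it with small error, so samples with large reconstruction error are flagged as anomalies. In the illustrative sketch below, a fixed linear projection stands in for a trained encoder/decoder pair; the data and the 3-sigma threshold rule are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

def reconstruct(X, basis):
    codes = X @ basis          # "encode": project to a low-dimensional code
    return codes @ basis.T     # "decode": map the code back to input space

# Normal data varies only along axis 0; the fixed basis keeps exactly that axis.
normal = rng.normal(0.0, 0.1, (200, 5))
normal[:, 0] += 1.0
basis = np.eye(5)[:, :1]

errors = np.linalg.norm(normal - reconstruct(normal, basis), axis=1)
threshold = errors.mean() + 3.0 * errors.std()   # illustrative 3-sigma rule

# An anomaly carries energy off axis 0, so it reconstructs poorly.
anomaly = np.array([[1.0, 3.0, 3.0, 3.0, 3.0]])
score = np.linalg.norm(anomaly - reconstruct(anomaly, basis), axis=1)[0]
print(score > threshold)      # True: flagged as anomalous
```

A trained stacked autoencoder plays the role of `basis` here: it learns which directions of variation are "normal" instead of having them specified by hand.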

^{[27]}The analysis involves using the stacked autoencoder (SAE) for training and testing based on an error backpropagation algorithm, and the data collected from engineered slopes as samples.

^{[28]}First, a stacked autoencoder (SAE) is used to extract the deep features.

^{[29]}Moreover, the classification of big data is performed in the MapReduce framework using a stacked autoencoder, which is trained using the proposed Adaptive E^{2}-Bat algorithm.

^{[30]}The results of this algorithm are compared with those of gated recurrent unit (GRU) and stacked autoencoders (SAEs), and the results show that this algorithm has the lowest traffic flow fitting error and the highest performance.

^{[31]}A dimensionality reduction of the spectral signatures or angular signatures is rapidly obtained by using a stacked autoencoder (SAE) trained on contaminated images only.

^{[32]}This article proposes a new data-driven health monitoring method, which uses multiobjective optimization and stacked autoencoder based health indicator.

^{[33]}Thus, this paper proposes a model for the Indian music classification system using the optimization‐based stacked autoencoder.

^{[34]}Therefore, an effective method named Sunflower Sine Cosine (SFSC)-based stacked autoencoder is designed to perform Electroencephalogram (EEG) signal classification using trust-aware routing in WSN.

^{[35]}The model extracts high-level features using stacked autoencoders from decomposed pressure signals (using complementary ensemble empirical mode decomposition with adaptive noise (CEEMDAN) algorithm).

^{[36]}But the stacked autoencoders model is usually trained by the BP algorithm, which has the problem of slow convergence.

^{[37]}And using the stacked autoencoder to encode the key evolution pattern of urban meteorological systems could provide important auxiliary information for

^{[38]}We also demonstrate that stacked autoencoders are more suitable for large-scale feature selection, whereas sparse autoencoders are beneficial when selecting a smaller number of features.

^{[39]}An electricity theft detection method based on stacked autoencoder (SAE) and the undersampling and re-sampling based random forest (UaRe-RF) algorithm is proposed in this work to formulate appropriate strategies for the practical electricity theft detection requirements of the power company.

^{[40]}Secondly, an adaptive pattern characterization is carried out by considering a stacked autoencoder.

^{[41]}In recent years, deep learning like stacked autoencoder (SAE) has been widely applied to soft sensor modeling for industrial process quality prediction.

^{[42]}Stacked autoencoder (SAE)-based DNN was used for the rapid identification of methicillin-resistant Staphylococcus aureus (MRSA) and methicillin-sensitive S. aureus.

^{[43]}

## deep neural network

By matching and screening three major datasets, LINCS-L1000, CTRP and Achilles, a stacked autoencoder deep neural network was used to extract the gene information.^{[1]}

In this paper, a deep neural network has been designed using a stacked autoencoder (SAE) for deep feature extraction from the time-frequency spectrum of single and combined PQ disturbances in an electrical power system network.

^{[2]}In the classification stage, a deep neural network (DNN) comprising an optimized stacked autoencoder (SAE) with Bayesian optimization and a softmax layer is used.

^{[3]}Finally, deep neural networks involving three stacked autoencoders and a softmax layer are adopted as the exterior layer for classification.

^{[4]}We also show that the presented model has reasonable performance when compared to stacked autoencoders deep neural networks.

^{[5]}This paper presents an innovative data-integration that uses an iterative-learning method, a deep neural network (DNN) coupled with a stacked autoencoder (SAE) to solve issues encountered with many-objective history matching.

^{[6]}Objective: In this paper, a stacked autoencoder deep neural network is proposed to extract the QRS complex from raw ECG signals without any conventional feature extraction phase.

^{[7]}In this paper, pap-smear images are analysed and cells are classified using stacked autoencoder based deep neural network.

^{[8]}We employed time–frequency analysis (TFA) and a stacked autoencoder (SAE), which is a deep neural network (DNN)-based learning algorithm, to assess the mobility and fall risk of the elderly according to the criteria of the timed up and go test (TUG).

^{[9]}In this paper, we design three kinds of deep neural networks: Long Short Term Memory (LSTM), Gated Recurrent Units (GRU), and Stacked Autoencoders (SAEs) to predict the position and the velocity of the forward vehicles.

^{[10]}Based on multi-sensor information, stacked autoencoder deep neural networks (DNNs) are utilized to realize the automatic locomotion transition and the corresponding control parameters’ switching.

^{[11]}BLS is shown to outperform standard learning algorithms (Least Absolute Shrinkage and Selection Operator (LASSO), shallow and deep neural networks, stacked autoencoders) not only in terms of training time, but also in terms of testing accuracy.

^{[12]}

## convolutional neural network

These methods include the convolutional neural network, deep belief network, recurrent neural network, and stacked autoencoder, and they are applied to identify plant species, diagnose plant diseases, etc.^{[1]}

Aiming to gain better performance for facial expression recognition (FER) systems, we propose a hybrid DL architecture based on convolutional neural network and Stacked AutoEncoder.

^{[2]}Based on 3D convolutional neural network (3D-CNN) and stacked autoencoder (SAE), a method called CSA-NAH is proposed to reduce the wraparound error under sparse measuring.

^{[3]}We introduce a hybrid pixel-based model that allows improving the unsupervised training with stacked autoencoders (SAE) by inserting convolutional neural networks (CNN) in the encoding and decoding steps.

^{[4]}This article is a review of various architectures such as Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Stacked Autoencoder (SAE), Long Short Term Memory (LSTM), or a combination of those architectures, for anomaly detection purpose in SCADA networks.

^{[5]}A convolutional neural network, a stacked autoencoder, a deep belief network and the aforementioned traditional ML algorithms are investigated.

^{[6]}Two deep learning algorithms of convolutional neural network (CNN) and stacked autoencoder were employed to classify the AE signals into the corresponding classes.

^{[7]}In this study, we propose two hybrid models using genetic algorithm (GA) and deep learning algorithms of Stacked Autoencoder (SAE) and Convolutional Neural Network (CNN) for the prediction of HGB-anemia, nutritional anemia, (iron deficiency anemia, B12 deficiency anemia, and folate deficiency anemia), and patients without anemia.

^{[8]}The deep ensemble approach using convolutional neural network (CNN) and stacked autoencoder (SAE) is employed to improve the prediction performance.

^{[9]}In order to explore new ways for this innovation, we have proposed two novel methods named SCA-DTIs and SCA-DTA, respectively to predict both drug-target interactions and drug-target binding affinities (DTAs) based on convolutional neural network (CNN) with stacked autoencoders (SAE).

^{[10]}

## support vector machine

The comparisons with several other traditional DCSCF methods, such as Back Propagation (BP), Stacked Autoencoder (SAE), Support Vector Machine (SVM) neural networks, and alike, show that by the proposed method, the DCSCF location can be detected more accurately in a shorter time.^{[1]}

The experimental results with the MNIST data, a number of large hyperspectral remote sensing data, and different types of data in different application areas, including many image and nonimage datasets, show that the WRBF and PMWNN can work well on both image and nonimage data and have very competitive accuracy compared to learning models, such as stacked autoencoders, deep belief nets, support vector machine (SVM), multilayer perceptron (MLP), LeNet-5, RBF network, recently proposed CDL, broad learning, gcForest, ERDK, and FDRK.

^{[2]}The utilized framework contains three main parts: capturing the topological information of the PPI network with NRL, denoising the gene feature with the participation of a stacked autoencoder (SAE), and optimizing a support vector machine (SVM) classifier to identify IS-related genes.

^{[3]}In this regard, this paper presents an evaluation of the four main techniques for novelty detection: k-Nearest Neighbor, Gaussian Mixture Models, One-Class Support Vector Machine, and Stacked Autoencoder.

^{[4]}Comparison with ten frequently used models, including k-nearest neighbor (k-NN), support vector machine (SVM), decision tree (DT), gradient boosting decision tree (GBDT), random forest (RF), stacked autoencoder (SAE), and four classic RNNs, also shows the proposed models outperform these state-of-the-art traffic prediction methods in terms of accuracy and stability.

^{[5]}Furthermore, the model performance is compared to that of logistic regression, support vector machine, random forest, LSTM classifier, isolation forest, and stacked autoencoder models.

^{[6]}For this estimation, we analyze the speckle using both standard methods (linear principal component analysis, support vector machine (SVM)) and a neural network, in the form of a sparse stacked autoencoder (SSAE) with a softmax classifier or with an SVM.

^{[7]}Condition indicators based on fault characteristic frequencies observed over the extended Park's vector modulus are fused with deep features extracted using stacked autoencoders to generate a multidimensional feature space for fault classification using support vector machine.

^{[8]}

## novel deep learning

Firstly, to improve the accuracy of the data-driven model, a novel deep learning model for the crystal growth process is established by combining a stacked autoencoder (SAE) and a long short-term memory network (LSTM) to extract the working condition information and the dynamic timing features in process data.^{[1]}

In this paper, a novel deep learning classification approach that fuses Elastic Net-Stacked Autoencoder (EN-SAE) with Kernel Density Estimation (KDE), named the ESK-model, is proposed based on DNA barcodes.

^{[2]}In this paper, we propose a novel deep learning method that fuses Elastic Net-Stacked Autoencoder (EN-SAE) with Kernel Density Estimation (KDE), named ESK model.

^{[3]}Hence, in the present study, a novel deep learning (DL) framework is proposed to fill the gap by balancing the three stages of fault feature extraction, fault detection, and parameter optimization based on the long short term memory- recurrent neural networks (RNN- LSTM), stacked autoencoders (SAE), and particle swarm optimization (PSO) techniques.

^{[4]}This research paper offers a novel deep learning framework for chronic kidney disease classification using stacked autoencoder model utilizing multimedia data with a softmax classifier.

^{[5]}

## deep learning model

The authors employ deep learning models such as deep neural networks, a stacked autoencoder, and long short-term memory models.^{[1]}

Therefore, this paper proposes a food market regulation method based on blockchain and a deep learning model: stacked autoencoders (SAEs).

^{[2]}Our results indicate that trafficBERT outperforms models trained using data for specific roads, as well as commonly used statistical and deep learning models, such as Stacked Autoencoder, and models based on long short-term memory, in terms of accuracy.

^{[3]}In recent years, deep learning models like stacked autoencoder and its variants have been utilized to handle this problem and perform well.

^{[4]}In view of this, the paper proposes a deep learning model based on stacked autoencoder (SAE) to predict electricity price.

^{[5]}

## long short term

By utilizing a deep learning technique, we construct a new stacked autoencoder long short-term memory (SAE-LSTM) network-based multitask learning model to extract state features from the monitoring data, and then perform multistep forecasting to obtain performance degradation and failure probability information.^{[1]}

A novel deep representative learning soft-sensor modeling approach is proposed based on stacked autoencoder (SAE), mutual information (MI), and long-short term memory (LSTM).

^{[2]}For fixing the optimal wavelet's layers in PM10 forecasting, an innovative coupled model based on WT, long short-term memory (LSTM), and SAE (stacked autoencoder) is proposed.

^{[3]}METHODS The proposed model is a combination of long short-term memory (LSTM) and stacked autoencoder (SAE).

^{[4]}

## deep belief network

The application architectures of these methods include multilayer perceptrons, stacked autoencoders, deep belief networks, two- or three-dimensional convolutional neural networks, recurrent neural networks, graph neural networks, and complex neural networks and are described from five perspectives: residue-level prediction, sequence-level prediction, three-dimensional structural analysis, interaction prediction, and mass spectrometry data mining.^{[1]}

The traditional deep learning algorithms [stacked noise autoencoder (SNAE), stacked autoencoder (SAE), stacked contractive autoencoder (SCAE), stacked sparse autoencoder (SSAE), deep belief network (DBN)] are introduced to carry out the comparative simulation with the model in this study.

^{[2]}Stacked autoencoders, deep belief networks, convolutional neural networks, recurrent neural networks, and generative adversarial networks are introduced according to applications of deep learning in other domains that are briefly illustrated based on the types of data, such as acoustic data, image data, and textual data.

^{[3]}

## extreme learning machine

Finally, extreme learning machine with stacked autoencoder (ELM-SA) based classifier is employed for the effective classification of BT.^{[1]}

A method for mechanical fault diagnoses of an HVCB based on a semisupervised stacked autoencoder (SSAE) and an integrated extreme learning machine (IELM) is proposed in this study.

^{[2]}A novel bagging-based multivariate ensemble deep learning approach integrating stacked autoencoder and kernel-based extreme learning machine (B-SAKE) is proposed to address these challenges in this study.

^{[3]}

## deep learning framework

In this paper, a deep learning framework consisting of a stacked autoencoder (SAE), a multi-scale ResNet and a stacked ensemble module, named DHNLDA, was constructed to predict lncRNA-disease associations; it integrates multiple biological data sources and constructs feature matrices.^{[1]}

This paper proposes a deep learning framework where wavelet transforms (WT), 2-dimensional Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) stacked autoencoders (SAE) are combined towards single-step time series prediction.

^{[2]}

## short term memory

The newly proposed model makes predictions by combining a stacked autoencoder and a long short-term memory network.^{[1]}

## Deep Stacked Autoencoder

Hence, this paper presents an occupancy detection approach for detecting the person’s count in the room or building using the proposed Chaotic Whale Spider Monkey (ChaoWSM) + Deep stacked autoencoder.^{[1]}

The extracted features of these three regions are concatenated and reduced in dimensionality using Deep Stacked AutoEncoders.

^{[2]}This paper presents a data-driven approach based on deep stacked autoencoders for the localization and characterization of acoustic emission sources in complex aerospace panels.

^{[3]}The fashion images are used as the input and are processed by a deep stacked autoencoder to produce latent feature representation, and the output of this autoencoder will be used as the input of the clustering task.

^{[4]}

## Sparse Stacked Autoencoder

Concretely, the sparse stacked autoencoder trained with unlabeled data is used to extract more robust features, and the fully-connected network trained with a small amount of labeled data is used for location and identification.^{[1]}

This paper proposes an embedded hybrid feature deep sparse stacked autoencoder ensemble method to solve this problem.

^{[2]}For this estimation, we analyze the speckle using both standard methods (linear principal component analysis, support vector machine (SVM)) and a neural network, in the form of a sparse stacked autoencoder (SSAE) with a softmax classifier or with an SVM.

^{[3]}
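The "sparse" variant in this section typically adds a penalty that pushes the mean activation of each hidden unit toward a small target rate ρ, commonly measured with a KL divergence. A small NumPy sketch of that standard penalty follows; the target ρ = 0.05 is an illustrative choice, not taken from the cited papers.

```python
import numpy as np

def kl_sparsity_penalty(H, rho=0.05):
    """KL(rho || rho_hat) summed over hidden units, rho_hat = mean activation."""
    rho_hat = np.clip(H.mean(axis=0), 1e-8, 1.0 - 1e-8)
    return float(np.sum(rho * np.log(rho / rho_hat)
                        + (1.0 - rho) * np.log((1.0 - rho) / (1.0 - rho_hat))))

on_target = np.full((10, 3), 0.05)   # units already fire at the target rate
too_dense = np.full((10, 3), 0.50)   # units fire far too often
print(kl_sparsity_penalty(on_target))   # 0.0: no penalty at the target
print(kl_sparsity_penalty(too_dense) > kl_sparsity_penalty(on_target))
```

During training, this term is added (with a weight) to the reconstruction loss, so the gradient trades reconstruction accuracy against keeping most hidden units near-silent.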

## Semisupervised Stacked Autoencoder

A method for mechanical fault diagnoses of an HVCB based on a semisupervised stacked autoencoder (SSAE) and an integrated extreme learning machine (IELM) is proposed in this study.^{[1]}

In this paper, we present a novel DL framework, namely, semisupervised stacked autoencoders (Semi-SAEs) with cotraining, for HSI classification.

^{[2]}

## Connected Stacked Autoencoder

The BAF consists of four parallel-connected stacked autoencoders (SAEs), and each of them uses a different activation function, including sigmoid, tanh, relu, and softplus.^{[1]}

The BAF consists of four parallel connected Stacked AutoEncoders (SAEs) with different activation functions: the Sigmoid, the Tanh, the ReLu, and the Softplus.

^{[2]}
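The branch activation functions named in these excerpts are standard; a small NumPy sketch evaluates the four of them side by side, as the parallel SAE branches would (the branch wiring and training are omitted, so this shows only the activations themselves).

```python
import numpy as np

# The four activations the BAF branches use, one per parallel SAE.
activations = {
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh": np.tanh,
    "relu": lambda x: np.maximum(0.0, x),
    "softplus": lambda x: np.log1p(np.exp(x)),
}

x = np.array([-1.0, 0.0, 1.0])
branches = {name: f(x) for name, f in activations.items()}
print(branches["relu"])       # [0. 0. 1.]
```

Running the same input through all four branches gives each SAE a differently shaped nonlinearity, which is the diversity the ensemble relies on.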

## Learning Stacked Autoencoder

The approach presented in this thesis extends GWAS and explores the use of deep learning stacked autoencoders (SAE) and association rule mining (ARM) to identify epistatic interactions between SNPs.^{[1]}

This framework includes traditional GWAS quality control, association analysis, deep learning stacked autoencoders, and a multilayer perceptron for classification.

^{[2]}

## Improved Stacked Autoencoder

We aimed to develop an improved stacked autoencoder (SAE) for metabolomic data classification.^{[1]}

The improved stacked autoencoder (I-SAE) based on the idea of partial data reconstruction is proposed to learn the high dimensional characteristics of operational data, which can enhance the separability of normal and abnormal operational data.

^{[2]}

## stacked autoencoder network

We use a stacked autoencoder network along with two advanced machine learning techniques to forecast the Indian summer monsoon.^{[1]}

Combined with the supervised fine-tuning classifier to train and update the network parameters, the FDIA detection model with stacked autoencoder network is constructed.

^{[2]}We apply the proposed ranked dropout to a stacked autoencoder network and compare it with standard dropout, gaussian dropout, uniform dropout and DropConnect on MNIST dataset.

^{[3]}The classification was done with a stacked autoencoder network.

^{[4]}Firstly for feature representation, a concise feature space is learned in an unsupervised way via stacked autoencoder network.

^{[5]}

## stacked autoencoder deep

By matching and screening three major datasets, LINCS-L1000, CTRP and Achilles, a stacked autoencoder deep neural network was used to extract the gene information.^{[1]}

Objective: In this paper, a stacked autoencoder deep neural network is proposed to extract the QRS complex from raw ECG signals without any conventional feature extraction phase.

^{[2]}Based on multi-sensor information, stacked autoencoder deep neural networks (DNNs) are utilized to realize the automatic locomotion transition and the corresponding control parameters’ switching.

^{[3]}

## stacked autoencoder model

Furthermore, the model performance is compared to that of logistic regression, support vector machine, random forest, LSTM classifier, isolation forest, and stacked autoencoder models.^{[1]}

This research paper offers a novel deep learning framework for chronic kidney disease classification using stacked autoencoder model utilizing multimedia data with a softmax classifier.

^{[2]}

## stacked autoencoder neural

In this paper, a new deep feature transfer-based stacked autoencoder neural network system is proposed for the automatic diagnosis of DME in fundus images.^{[1]}

Then, a stacked autoencoder neural network is used for compressing the feature depth.

^{[2]}