A Novel Framework Based on CNN-LSTM Neural Network for Prediction of Missing Values in Electricity Consumption Time-Series Datasets

  • Hussain, Syed Nazir;Aziz, Azlan Abd;Hossen, Md. Jakir;Aziz, Nor Azlina Ab;Murthy, G. Ramana;Mustakim, Fajaruddin Bin
    • Journal of Information Processing Systems
    • /
    • v.18 no.1
    • /
    • pp.115-129
    • /
    • 2022
  • Adopting Internet of Things (IoT)-based technologies in smart homes helps users analyze home appliances' electricity consumption for better overall cost monitoring. IoT applications such as the smart home system (SHS) can suffer from large gaps of missing values due to factors such as security attacks, sensor faults, or connection errors. In this paper, a novel framework is proposed to predict large gaps of missing values in SHS home-appliance electricity consumption time-series datasets. The framework follows a series of steps to detect, predict, and reconstruct the missing values in the input time-series datasets. A hybrid convolutional neural network-long short-term memory (CNN-LSTM) neural network is used to forecast the large gaps of missing values. A comparative experiment was conducted to evaluate the performance of the hybrid CNN-LSTM against its single variants, CNN and LSTM, in forecasting missing values. The experimental results indicate that the CNN-LSTM model outperforms the single CNN and LSTM neural networks.
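The framework's first step, detecting runs of missing values before forecasting them, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the helper name `find_gaps` is hypothetical.

```python
def find_gaps(series, missing=None):
    """Return (start_index, length) for each contiguous run of missing
    values in a time series, e.g. gaps left by sensor faults.
    `missing` is the sentinel used for absent readings (assumed None here)."""
    gaps, start = [], None
    for i, value in enumerate(series):
        if value is missing:
            if start is None:          # a new gap begins
                start = i
        elif start is not None:        # a gap just ended
            gaps.append((start, i - start))
            start = None
    if start is not None:              # series ends inside a gap
        gaps.append((start, len(series) - start))
    return gaps
```

Each detected gap would then be handed to the forecasting model (CNN-LSTM in the paper) to predict and reconstruct the missing span.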

A Delta- and Attention-based Long Short-Term Memory (LSTM) Architecture model for Rainfall-runoff Modeling

  • Ahn, Kuk-Hyun;Yoon, Sunghyun
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2022.05a
    • /
    • pp.35-35
    • /
    • 2022
  • Recently, many deep learning-based methods have shown meaningful results in hydrological modeling and prediction, but further research is still required. This study presents a modified Long Short-Term Memory (LSTM)-based model for rainfall-runoff modeling, the most representative modeling structure in water resources. Specifically, instead of directly modeling the response variable, runoff, the model is built on the delta, defined as its first derivative. In addition, an attention mechanism-based model is used to improve the accuracy of rainfall-runoff modeling. Finally, a density-based model is included to generate probabilistic predictions and account for their uncertainty, through which the relative quantification of epistemic and aleatory uncertainty is performed. To evaluate the utility and applicability of the proposed model, it was evaluated on daily data from a total of 507 basins located across the United States. The results confirm that, compared with the representative deep learning-based LSTM model, the proposed model provides not only higher accuracy but also a useful representation and quantification of uncertainty.
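The delta transformation at the core of this model, training on the first derivative of runoff rather than runoff itself, is a simple differencing step that is exactly invertible by cumulative summation. A minimal sketch (helper names hypothetical, not from the paper):

```python
def to_delta(y):
    """First-difference a series: delta_t = y_{t+1} - y_t.
    The model is trained on these deltas instead of the raw runoff."""
    return [b - a for a, b in zip(y, y[1:])]

def from_delta(y0, deltas):
    """Invert the transformation: reconstruct the original series from
    its initial value and the predicted deltas by cumulative summation."""
    out = [y0]
    for d in deltas:
        out.append(out[-1] + d)
    return out
```

Because the round trip is lossless, a model that predicts deltas accurately yields the runoff series directly, while the differenced target is typically closer to stationary.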

Novel Optimizer AdamW+ implementation in LSTM Model for DGA Detection

  • Awais Javed;Adnan Rashdi;Imran Rashid;Faisal Amir
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.11
    • /
    • pp.133-141
    • /
    • 2023
  • This work takes a deeper look at the implementation of Adaptive Moment Estimation (Adam) and Adam with Weight Decay (AdamW) in a real-world text classification problem, DGA malware detection. AdamW was introduced as an improved optimizer by decoupling weight decay from L2 regularization. This work introduces a novel AdamW variant, AdamW+, which further simplifies the weight decay implementation of AdamW. DGA malware detection LSTM model results for Adam, AdamW, and AdamW+ are evaluated on various DGA families/groups as multiclass text classification. The proposed AdamW+ optimizer shows improvement in all standard performance metrics over Adam and AdamW. Analysis of the outcome shows that the novel optimizer outperforms both Adam and AdamW on text classification-based problems.
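The distinction AdamW+ builds on is that AdamW applies weight decay directly to the parameters rather than folding it into the gradient as L2 regularization does in Adam. A minimal scalar-parameter sketch of a standard AdamW step (the paper's AdamW+ simplification itself is not specified here, so this shows only the baseline it modifies):

```python
import math

def adamw_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, wd=1e-2):
    """One AdamW update for a scalar parameter w with gradient g.
    m, v are the running first/second moment estimates; t is the step
    count (starting at 1). Weight decay wd*w is added to the update
    directly (decoupled), not mixed into g as in Adam + L2."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)          # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)          # bias-corrected second moment
    w = w - lr * (m_hat / (math.sqrt(v_hat) + eps) + wd * w)
    return w, m, v
```

Note that with a zero gradient the parameter still shrinks by `lr * wd * w`, which is the decoupling AdamW introduced and that AdamW+ further streamlines.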

A Study on Korean Sentiment Analysis Rate Using Neural Network and Ensemble Combination

  • Sim, YuJeong;Moon, Seok-Jae;Lee, Jong-Youg
    • International Journal of Advanced Culture Technology
    • /
    • v.9 no.4
    • /
    • pp.268-273
    • /
    • 2021
  • In this paper, we propose a sentiment analysis model that improves performance on small-scale data, and verify it through experiments. To this end, we propose Bagging-Bi-GRU, which combines Bi-GRU, a bidirectional form of the GRU (itself a variant of LSTM (Long Short-Term Memory) with excellent performance on sequential data), with bagging, one of the ensemble learning methods. To verify the performance of the proposed model, it is applied to both small-scale and large-scale data. Comparison with the existing machine learning algorithm Bi-GRU shows that the proposed model improves performance not only on small data but also on large data.
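The bagging half of Bagging-Bi-GRU is the generic bootstrap-aggregation recipe: train each base model on a resampled copy of the data, then combine their predictions by majority vote. A minimal sketch with hypothetical helper names (the base learners stand in for the paper's Bi-GRU classifiers):

```python
import random

def bootstrap_sample(data, rng):
    """Draw a sample of the same size as `data`, with replacement.
    Each base model is trained on a different such sample."""
    return [rng.choice(data) for _ in data]

def bagging_predict(models, x):
    """Aggregate an ensemble of classifiers by majority vote.
    Each model is any callable mapping an input to a class label."""
    votes = [m(x) for m in models]
    return max(set(votes), key=votes.count)
```

Resampling gives each base model a slightly different view of a small dataset, which is why bagging tends to stabilize variance-prone learners on small-scale data.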