• Title/Summary/Keyword: RNN

Search results: 454

Parameter Estimation of Recurrent Neural Networks Using an Unscented Kalman Filter Training Algorithm and Its Applications to Nonlinear Channel Equalization (언센티드 칼만필터 훈련 알고리즘에 의한 순환신경망의 파라미터 추정 및 비선형 채널 등화에의 응용)

  • Kwon Oh-Shin
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.15 no.5
    • /
    • pp.552-559
    • /
    • 2005
  • Recurrent neural networks (RNNs) trained with gradient-based algorithms such as real-time recurrent learning (RTRL) suffer from slow convergence. These algorithms also require derivative calculations that are not trivial in the error back-propagation process. In this paper, a derivative-free Kalman filter, the so-called unscented Kalman filter (UKF), is presented for training a fully connected RNN in a state-space formulation of the system. The derivative-free Kalman filter learning algorithm gives the RNN fast convergence and good tracking performance without any derivative computation. The performance of RNNs trained with the derivative-free Kalman filter learning algorithm is evaluated through nonlinear channel equalization experiments.
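
The state-space idea behind this abstract can be illustrated with a toy sketch: the network weights are treated as the filter state with identity dynamics, and the network output is the measurement that sigma points are pushed through. This is a minimal sketch with a single-unit RNN and a hand-rolled unscented transform, assumed for illustration only; it is not the paper's exact formulation.

```python
# Toy sketch: estimate the weights of a one-unit RNN with a UKF.
# The weight vector is the filter state (identity dynamics + small drift);
# the RNN output for the current input is the measurement.
import numpy as np

def rnn_step(w, u, h):
    """One step of a single-unit RNN: next hidden state / output."""
    return np.tanh(w[0] * u + w[1] * h + w[2])

def sigma_points(x, P, kappa=0.5):
    """Symmetric sigma points and weights for the unscented transform."""
    n = len(x)
    S = np.linalg.cholesky((n + kappa) * P)
    pts = np.vstack([x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)])
    wts = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    wts[0] = kappa / (n + kappa)
    return pts, wts

rng = np.random.default_rng(0)
true_w = np.array([0.8, -0.5, 0.1])        # weights of the "unknown" network
inputs = rng.uniform(-1.0, 1.0, 300)

w_est = np.zeros(3)                        # UKF state: the weight estimate
P = np.eye(3)                              # weight covariance
Q = 1e-5 * np.eye(3)                       # process noise (weights drift slowly)
R = 1e-3                                   # output noise variance
h_true = h_obs = 0.0

for u in inputs:
    h_true = rnn_step(true_w, u, h_true)
    y = h_true + rng.normal(0.0, np.sqrt(R))           # noisy measured output

    P = P + Q                                           # time update: identity dynamics
    X, wts = sigma_points(w_est, P)
    Y = np.array([rnn_step(x, u, h_obs) for x in X])    # push sigma points through the RNN
    y_hat = wts @ Y
    Pyy = wts @ (Y - y_hat) ** 2 + R
    Pxy = (X - w_est).T @ (wts * (Y - y_hat))
    K = Pxy / Pyy                                       # Kalman gain (scalar measurement)
    w_est = w_est + K * (y - y_hat)                     # measurement update of the weights
    P = P - np.outer(K, K) * Pyy
    h_obs = y                                           # teacher forcing: feed the observed output back

print("estimated weights:", np.round(w_est, 3), "true weights:", true_w)
```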

Psalm Text Generator Comparison Between English and Korean Using LSTM Blocks in a Recurrent Neural Network (순환 신경망에서 LSTM 블록을 사용한 영어와 한국어의 시편 생성기 비교)

  • Snowberger, Aaron Daniel;Lee, Choong Ho
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2022.10a
    • /
    • pp.269-271
    • /
    • 2022
  • In recent years, RNNs with LSTM blocks have been used extensively in machine learning tasks that process sequential data. These networks have proven particularly good at sequential language processing, predicting the next most likely word in a sequence more accurately than traditional neural networks. This study trained an RNN/LSTM network on three different translations of the 150 biblical Psalms, in both English and Korean. The resulting model is then fed an input word and a length, from which it automatically generates a new Psalm of the desired length based on the patterns it recognized during training. The results of training the network on English text and on Korean text are compared and discussed.
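
A generator of the kind described here can be sketched briefly: a word-level LSTM trained to predict the next word, then sampled repeatedly to produce text of a requested length. The file name "psalms.txt", the context window, and all hyperparameters below are illustrative assumptions, not the authors' actual setup.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# "psalms.txt" is a hypothetical plain-text file holding one translation of the Psalms.
corpus = open("psalms.txt", encoding="utf-8").read().lower().split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
ids = np.array([word2id[w] for w in corpus])

seq_len = 10                                # context window fed to the LSTM
X = np.array([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
y = ids[seq_len:]

model = tf.keras.Sequential([
    layers.Embedding(len(vocab), 64),
    layers.LSTM(128),
    layers.Dense(len(vocab), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=20, batch_size=64)

def generate(seed_word, length):
    """Sample `length` words, one at a time, from the trained model."""
    out = [word2id.get(seed_word, 0)]
    while len(out) < length:
        context = np.array([out[-seq_len:]])
        probs = model.predict(context, verbose=0)[0]
        probs = probs / probs.sum()          # guard against float rounding
        out.append(int(np.random.choice(len(vocab), p=probs)))
    return " ".join(vocab[i] for i in out)

print(generate("blessed", 40))
```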

Performance Comparison of PM10 Prediction Models Based on RNN and LSTM (RNN과 LSTM 기반의 PM10 예측 모델 성능 비교)

  • Jung, Yong-jin;Lee, Jong-sung;Oh, Chang-heon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2021.05a
    • /
    • pp.280-282
    • /
    • 2021
  • A particulate matter prediction model was designed using deep learning algorithms to address the problem of particulate matter forecasts relying on subjective judgment. RNN and LSTM were used among the deep learning algorithms, and the models were designed with optimal parameters found through a hyperparameter search. The prediction performance of the two models was evaluated through RMSE and forecast accuracy. The evaluation confirmed that there was no significant difference in RMSE and overall accuracy, but there was a difference in the detailed forecast accuracy.
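
The comparison described here can be sketched as two small regressors trained on sliding windows of PM10 readings and scored by RMSE. The data file, window size, and hyperparameters below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def make_windows(series, window=24):
    """Turn a 1-D series into (samples, timesteps, 1) windows and next-step targets."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    return X[..., None], series[window:]

pm10 = np.loadtxt("pm10_hourly.csv", delimiter=",")   # hypothetical hourly PM10 values
X, y = make_windows(pm10)
split = int(len(X) * 0.8)

def build(cell):
    model = tf.keras.Sequential([cell(32, input_shape=X.shape[1:]),
                                 layers.Dense(1)])
    model.compile(loss="mse", optimizer="adam")
    return model

for name, cell in [("RNN", layers.SimpleRNN), ("LSTM", layers.LSTM)]:
    model = build(cell)
    model.fit(X[:split], y[:split], epochs=30, batch_size=64, verbose=0)
    pred = model.predict(X[split:], verbose=0).ravel()
    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    print(f"{name}: RMSE = {rmse:.2f}")
```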

Comparison of High Concentration Prediction Performance of Particulate Matter by Deep Learning Algorithm (딥러닝 알고리즘별 미세먼지 고농도 예측 성능 비교)

  • Lee, Jong-sung;Jung, Yong-jin;Oh, Chang-heon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2021.10a
    • /
    • pp.348-350
    • /
    • 2021
  • When predicting the concentration of fine dust (particulate matter) with deep learning, the characteristics of high concentrations of 81㎍/m3 or more are not well reflected in the prediction model. In this paper, predictive performance was compared to see how well each deep learning algorithm reflects the characteristics of fine dust in the high-concentration range. The performance evaluation showed broadly similar results overall, but the RNN model showed higher accuracy than the other models at concentrations rated "very bad" on the AQI scale. This confirmed that the RNN algorithm reflected the characteristics of high concentrations better than the DNN and LSTM algorithms.
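
The per-grade check discussed here can be sketched as bucketing observed and predicted PM10 into AQI-style grades and measuring accuracy within each bucket. The grade boundaries below are assumptions for illustration (with "bad" starting at 81㎍/m3, matching the threshold mentioned above).

```python
import numpy as np

def grade(pm10):
    """Assumed PM10 grades (µg/m3): good <=30, normal <=80, bad <=150, very bad >150."""
    return np.digitize(pm10, np.array([30, 80, 150]))   # 0..3

def per_grade_accuracy(y_true, y_pred):
    """Accuracy of the predicted grade within each observed grade."""
    gt, gp = grade(np.asarray(y_true)), grade(np.asarray(y_pred))
    for g, name in enumerate(["good", "normal", "bad", "very bad"]):
        mask = gt == g
        if mask.any():
            acc = np.mean(gp[mask] == g)
            print(f"{name:9s}: {acc:.2%} of {mask.sum()} samples")

# usage with the outputs of any compared model (names here are hypothetical):
# per_grade_accuracy(y_test, rnn_predictions)
```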

A Study on Data Management by Applying LSTM Time Series Parameters (LSTM 시계열 매개변수 적용을 통한 효율적 데이터 관리)

  • Min, Youn A
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2022.07a
    • /
    • pp.537-538
    • /
    • 2022
  • LSTM is a type of deep-learning RNN, proposed to address the long-term data loss problem that is a weakness of RNNs. In this paper, when applying LSTM hyperparameters, parameters whose significance can be measured are applied to the neural-network processing that predicts the importance of the previous state and of subsequent states, with the goal of increasing precision and recall on the data. Compared with the conventional LSTM approach on the same dataset, precision and recall were confirmed to increase by more than 5%.

Design and Implementation of AI Recommendation Platform for Commercial Services

  • Jong-Eon Lee
    • International journal of advanced smart convergence
    • /
    • v.12 no.4
    • /
    • pp.202-207
    • /
    • 2023
  • In this paper, we discuss the design and implementation of a recommendation platform actually built in the field. We survey deep learning-based recommendation models that are effective at reflecting individual user characteristics; the recently proposed RNN-based sequential recommendation models reflect these characteristics well. The proposed recommendation platform has an architecture that can collect, store, and process big data from a company's commercial services, and it provides service providers with intuitive tools to evaluate and apply optimized recommendation models in a timely manner. In the model evaluation we performed, RNN-based sequential recommendation models achieved high scores.
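
An RNN-based sequential recommender of the kind referenced here can be sketched as a GRU over a user's recent item IDs that predicts the next item. The data file, catalogue size, and hyperparameters below are illustrative assumptions, not the platform's actual model.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

n_items = 10_000          # size of the item catalogue (assumed)
seq_len = 20              # how many recent interactions are fed to the model

# Hypothetical preprocessed log: (n_sessions, seq_len + 1) item IDs; last ID is the target.
sessions = np.load("sessions.npy")
X, y = sessions[:, :-1], sessions[:, -1]

model = tf.keras.Sequential([
    layers.Embedding(n_items, 64),
    layers.GRU(128),
    layers.Dense(n_items, activation="softmax"),     # score every candidate item
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam",
              metrics=["sparse_top_k_categorical_accuracy"])   # hit@k-style metric
model.fit(X, y, epochs=5, batch_size=256)

# recommend: top-10 items for one user's recent history
scores = model.predict(X[:1], verbose=0)[0]
print(np.argsort(scores)[::-1][:10])
```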

Korean Semantic Role Labeling using Input-feeding RNN Search Model with CopyNet (Input-feeding RNN Search 모델과 CopyNet을 이용한 한국어 의미역 결정)

  • Bae, Jangseong;Lee, Changki
    • Annual Conference on Human and Language Technology
    • /
    • 2016.10a
    • /
    • pp.300-304
    • /
    • 2016
  • In this paper, Korean semantic role labeling is approached not as a sequence labeling problem but as a sequence-to-sequence learning problem, and the work is carried out in an end-to-end manner that requires neither a syntactic parsing step nor feature engineering. A syllable-level RNN Search model is used to convert a sentence, input syllable by syllable, into word units (eojeols) annotated with semantic roles. In addition, the input-feeding and CopyNet techniques, which were developed to improve sequence-to-sequence performance, are applied to Korean semantic role labeling. Experiments on the Korean PropBank data show a label-level F1-score of 79.42% and a word-unit F1-score of 71.58%.

Control of an Electro-hydraulic Servosystem Using Neural Network with 2-Dimensional Iterative Learning Rule (2차원 반복 학습 신경망을 이용한 전기.유압 서보시스템의 제어)

  • Kwak D.H.;Lee J.K.
    • Transactions of The Korea Fluid Power Systems Society
    • /
    • v.1 no.1
    • /
    • pp.1-9
    • /
    • 2004
  • This paper addresses approximation and tracking control of an electro-hydraulic servo system with recurrent neural networks (RNNs) using a two-dimensional iterative learning algorithm. The two-dimensional learning rule is derived for a discrete system consisting of a nonlinear output function and a linear input. To control the position trajectory, two RNNs with the same network architecture were used. Simulation results show that the two RNNs trained with the 2-D learning algorithm approximate the plant output and the desired trajectory to a very high degree of accuracy, and that the control algorithm using the two identical RNNs was very effective for trajectory tracking control of the electro-hydraulic servo system.
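
The iterative-learning idea underlying this work can be illustrated with a much simpler sketch: repeat the same trajectory over trials and correct the input with the previous trial's tracking error. This is plain first-order iterative learning control on an assumed linear discrete plant, not the paper's two-dimensional RNN learning rule.

```python
import numpy as np

a, b = 0.9, 0.5                                 # assumed plant: y[t+1] = a*y[t] + b*u[t]
T = 50
y_ref = np.sin(np.linspace(0, 2 * np.pi, T))    # desired trajectory
u = np.zeros(T)
gamma = 0.8                                     # learning gain

for trial in range(30):
    y = np.zeros(T)
    for t in range(T - 1):
        y[t + 1] = a * y[t] + b * u[t]
    e = y_ref - y
    u[:-1] += gamma * e[1:]                     # u_{k+1}(t) = u_k(t) + gamma * e_k(t+1)
    print(f"trial {trial:2d}: max |error| = {np.abs(e).max():.4f}")
```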

Comparison of Performance between MLP and RNN Model to Predict Purchase Timing for Repurchase Product (반복 구매제품의 재구매시기 예측을 위한 다층퍼셉트론(MLP) 모형과 순환신경망(RNN) 모형의 성능비교)

  • Song, Hee Seok
    • Journal of Information Technology Applications and Management
    • /
    • v.24 no.1
    • /
    • pp.111-128
    • /
    • 2017
  • Existing studies on recommender systems have focused on recommending appropriate items based on customer preference. However, recommending the purchase timing of repurchased products has not yet been studied actively despite its importance. This study proposes MLP and RNN models based only on simple purchase history data to predict the timing of customer repurchase, and compares their performance in terms of prediction accuracy and quality. In the experiments, the RNN model showed outstanding performance compared to the MLP model. The proposed model can be used to develop a CRM system that offers SMS- or app-based promotions to customers at the right time. It can also be used to increase sales of repurchased products by balancing order levels as well as inducing customer repurchases.
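
The comparison described here can be sketched as predicting the next repurchase interval from a customer's past purchase intervals, once with an MLP on a fixed window of features and once with an LSTM treating the same window as a sequence. The data file, window size, and hyperparameters below are illustrative assumptions, not the study's actual setup.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical data: (customers, purchases) matrix of days between consecutive orders.
intervals = np.load("purchase_intervals.npy")
window = 5
X = intervals[:, -window - 1:-1]            # last `window` intervals as features
y = intervals[:, -1]                        # the next interval to predict
split = int(len(X) * 0.8)

mlp = tf.keras.Sequential([layers.Dense(32, activation="relu", input_shape=(window,)),
                           layers.Dense(1)])
rnn = tf.keras.Sequential([layers.LSTM(32, input_shape=(window, 1)),
                           layers.Dense(1)])

for name, model, x in [("MLP", mlp, X), ("RNN", rnn, X[..., None])]:
    model.compile(loss="mse", optimizer="adam")
    model.fit(x[:split], y[:split], epochs=30, batch_size=64, verbose=0)
    pred = model.predict(x[split:], verbose=0).ravel()
    mae = np.mean(np.abs(pred - y[split:]))
    print(f"{name}: mean absolute error = {mae:.1f} days")
```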
