• Title/Abstract/Keyword: Long Short-term Memory


The Effects of Interactive Metronome on Short-term Memory and Attention for Children With Mental Retardation (상호작용식 메트로놈(Interactive Metronome: IM) 훈련이 지적장애 아동의 집중력과 단기기억력에 미치는 영향)

  • Bak, Ah-Ream;Yoo, Doo-Han
    • The Journal of Korean Academy of Sensory Integration / v.14 no.1 / pp.19-30 / 2016
  • Objective : The purpose of this study was to identify the effects of Interactive Metronome (IM) training on the short-term memory and attention of children with mental retardation. Methods : A single-subject experimental study was conducted using an ABA design. Two children were observed twice a week for 9 weeks, 18 sessions in total. During the baseline A phase (three sessions), the children's brain waves were measured without intervention, along with performance on a pseudo-randomly selected short-term memory task. In the intervention phase, the children received 40-50 minutes of Interactive Metronome training twice a week, 12 sessions in total; the short-term memory test (long form) was administered after treatment, and brain waves were not measured during the short-form test. In the second baseline phase, data were collected using the same procedure as the first baseline A phase. Results : After the Interactive Metronome training, positive changes were observed in brain waves, attention, and short-term memory. Conclusion : These results suggest that IM training has potential for improving the cognitive functions of children with mental retardation. In addition, the results can serve as basic data for attention and short-term memory interventions in occupational therapy for children with mental retardation.

Implementation of Artificial Hippocampus Algorithm Using Weight Modulator (가중치 모듈레이터를 이용한 인공 해마 알고리즘 구현)

  • Chu, Jung-Ho;Kang, Dae-Seong
    • Journal of Institute of Control, Robotics and Systems / v.13 no.5 / pp.393-398 / 2007
  • In this paper, we propose an Artificial Hippocampus Algorithm (AHA) that models the working principles of the hippocampus, which handles auto-associative memory and controls the strengthening of long-term and short-term memory. We organize an auto-associative memory system based on four stages (EC, DG, CA3, and CA1) and improve learning speed by adding a modulator to long-term memory learning. In this hippocampus system, following three ordered steps, information is subjected to statistical deviation in the Dentate Gyrus region and labeled into responsive patterns through excitability adjustment. In the CA3 region, the pattern is reorganized by auto-associative memory. In the CA1 region, the connection weights used for long-term memory converge and are learned quickly by a neural network with an applied modulator. To measure the performance of the Artificial Hippocampus Algorithm, PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis) are applied to face images classified by pose, expression, and picture quality; feature vectors are then computed and learned by AHA, and the recognition rate is measured. The experimental results confirm that the proposed method is superior to the existing methods.
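The PCA feature-extraction step used in the evaluation above can be sketched generically as follows. This is a standard eigendecomposition-based PCA, not the paper's implementation; the function name, data, and dimensions are illustrative.

```python
import numpy as np

def pca_features(X, k):
    """Project samples (rows of X) onto the top-k principal components."""
    mu = X.mean(axis=0)
    Xc = X - mu                              # center the data
    cov = Xc.T @ Xc / (len(X) - 1)           # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k eigenvectors
    return Xc @ top                          # k-dimensional feature vectors

# Toy stand-in for flattened face images: 20 samples, 8 dimensions.
X = np.random.default_rng(0).normal(size=(20, 8))
F = pca_features(X, 3)
print(F.shape)  # (20, 3)
```

The resulting low-dimensional feature vectors would then be fed to the learning stage in place of the raw images.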

Performance Analysis and Identifying Characteristics of Processing-in-Memory System with Polyhedral Benchmark Suite (프로세싱 인 메모리 시스템에서의 PolyBench 구동에 대한 동작 성능 및 특성 분석과 고찰)

  • Jeonggeun Kim
    • Journal of the Semiconductor & Display Technology / v.22 no.3 / pp.142-148 / 2023
  • In this paper, we identify performance issues in executing compute kernels from PolyBench, a suite of kernels that form the core computational units of data-intensive workloads such as deep learning, on Processing-in-Memory (PIM) devices. Using our in-house simulator, we measured and compared various performance metrics of the workloads on traditional out-of-order and in-order processors and on PIM-based systems. As a result, the PIM-based system improves performance compared to the other computing models thanks to the short-term data reuse characteristic of PolyBench's computational kernels. However, some kernels perform poorly on PIM-based systems, which lack a multi-layer cache hierarchy, because of those kernels' long-term data reuse characteristics. Hence, our evaluation and analysis suggest that further research should consider dynamic, workload-pattern-adaptive approaches to overcome the performance degradation caused by computational kernels with long-term data reuse characteristics and hidden data locality.


Prediction of Oil and Gas Throughput Using Deep Learning

  • Sangseop Lim
    • Journal of the Korea Society of Computer and Information / v.28 no.5 / pp.155-161 / 2023
  • 97.5% of Korea's exports and 87.2% of its imports are transported by sea, making ports an important component of the Korean economy. To operate these ports efficiently, short-term prediction of port cargo throughput needs to be improved through scientific research methods. Previous research has mainly focused on long-term prediction for large-scale infrastructure investment and has largely concentrated on container throughput. In this study, short-term predictions of petroleum and liquefied-gas cargo throughput were performed for Ulsan Port, one of Korea's representative petroleum ports, and the prediction performance was confirmed using the deep learning model LSTM (Long Short-Term Memory). The results are expected to provide evidence for improving the efficiency of port operations by increasing the accuracy of demand predictions for petroleum and liquefied-gas cargo throughput. The study also confirmed that LSTM can be used to predict not only container throughput but also petroleum and liquefied-gas cargo throughput, and further research is expected to extend the approach to more generalized settings.
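The LSTM named above maintains a long-term cell state alongside a short-term hidden state, updated by input, forget, and output gates. A minimal single-step sketch in plain NumPy follows; the weights are random placeholders, not the paper's trained model.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; W, U, b stack the i, f, o, g parameters."""
    z = W @ x + U @ h + b                    # stacked pre-activations, shape (4H,)
    H = h.size
    i = 1 / (1 + np.exp(-z[:H]))             # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))          # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))        # output gate
    g = np.tanh(z[3*H:])                     # candidate cell state
    c = f * c + i * g                        # long-term memory update
    h = o * np.tanh(c)                       # short-term (hidden) output
    return h, c

rng = np.random.default_rng(1)
D, H = 3, 4                                  # input dim, hidden dim (arbitrary)
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h = c = np.zeros(H)
for x in rng.normal(size=(5, D)):            # run a 5-step toy sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (4,)
```

In practice a framework layer (e.g., a Keras or PyTorch LSTM) wraps exactly this recurrence and learns W, U, and b from data.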

Deep learning-based LSTM model for prediction of long-term piezoresistive sensing performance of cement-based sensors incorporating multi-walled carbon nanotube

  • Jang, Daeik;Bang, Jinho;Yoon, H.N.;Seo, Joonho;Jung, Jongwon;Jang, Jeong Gook;Yang, Beomjoo
    • Computers and Concrete / v.30 no.5 / pp.301-310 / 2022
  • Cement-based sensors have been widely used in structural health monitoring systems; however, their long-term sensing performance has not been actively investigated. In this study, a deep learning-based methodology is adopted to predict the long-term piezoresistive properties of cement-based sensors. Samples with different multi-walled carbon nanotube (MWCNT) contents (0.1, 0.3, and 0.5 wt.%) are fabricated, and piezoresistive tests are conducted over 10,000 loading cycles to obtain training data. Time-dependent degradation is predicted using a modified long short-term memory (LSTM) model. The effects of model variables, including the amount of training data, number of epochs, and dropout ratio, on prediction accuracy are analyzed. Finally, the effectiveness of the proposed approach is evaluated by comparing its predictions of long-term piezoresistive sensing performance with untrained experimental data. A sensitivity of 6% is experimentally observed in the sample containing 0.1 wt.% of MWCNTs, and predictions with accuracy up to 98% are obtained with the proposed LSTM model. Based on these results, the proposed model is expected to be applied in structural health monitoring systems to predict long-term piezoresistive sensing performance during the sensors' service life.

LSTM-based Sales Forecasting Model

  • Hong, Jun-Ki
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.4 / pp.1232-1245 / 2021
  • In this study, a model for predicting product sales in relation to temperature changes is proposed. The model uses long short-term memory (LSTM), which has shown excellent performance for time-series prediction. To verify the proposed sales prediction model, the sales of short pants, flip-flop sandals, and winter outerwear are predicted based on temperature changes and time-series sales data for clothing products collected from 2015 to 2019 (a total of 1,865 days). The sales predictions using the proposed model show increases in the sales of shorts and flip-flops as the temperature rises (a pattern similar to actual sales), while the sales of winter outerwear increase as the temperature decreases.
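Before an LSTM can be trained on a daily series like the one above, the series is typically framed as supervised (input window, next value) pairs. A generic sketch, not the paper's code; the numbers are made up.

```python
def make_windows(series, lookback):
    """Frame a series as (lookback-length window, next value) pairs."""
    X, y = [], []
    for t in range(len(series) - lookback):
        X.append(series[t:t + lookback])  # input: the last `lookback` values
        y.append(series[t + lookback])    # target: the value that follows
    return X, y

sales = [12, 15, 14, 20, 22, 25, 24]      # toy daily sales figures
X, y = make_windows(sales, lookback=3)
print(X[0], y[0])  # [12, 15, 14] 20
```

Each (window, target) pair becomes one training example; exogenous inputs such as temperature can be appended to each window element as an extra feature.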

LSTM Model based on Session Management for Network Intrusion Detection (네트워크 침입탐지를 위한 세션관리 기반의 LSTM 모델)

  • Lee, Min-Wook
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.20 no.3 / pp.1-7 / 2020
  • With the increase in cyber attacks, automated IDSs using machine learning are being studied. Recent research shows that IDSs using recurrent learning models achieve high detection performance. However, naively applying a recurrent model may fail to reflect per-session characteristics, as overlapping sessions can degrade performance. In this paper, we design a session management module and apply it to an LSTM (Long Short-Term Memory) recurrent model. For the experiment, the CSE-CIC-IDS 2018 dataset is used, with the normal-session ratio increased to reduce the correlation among malicious sessions. The results show that the proposed model maintains high detection performance even in environments where session relevance is difficult to find.
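A session management module of the kind described could, for illustration, group flow records into per-session sequences before feeding each sequence to the LSTM. This is a hypothetical sketch; the field names and the 5-tuple key are assumptions, not the paper's design.

```python
from collections import defaultdict

def group_by_session(records):
    """Group flow records into per-session feature sequences by 5-tuple key."""
    sessions = defaultdict(list)
    for r in records:
        key = (r["src"], r["dst"], r["sport"], r["dport"], r["proto"])
        sessions[key].append(r["feat"])   # preserve arrival order within a session
    return dict(sessions)

records = [
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 40000, "dport": 80,
     "proto": "tcp", "feat": [0.1, 0.2]},
    {"src": "10.0.0.3", "dst": "10.0.0.2", "sport": 40001, "dport": 80,
     "proto": "tcp", "feat": [0.5, 0.1]},
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 40000, "dport": 80,
     "proto": "tcp", "feat": [0.3, 0.4]},
]
sessions = group_by_session(records)
print(len(sessions))  # 2
```

Grouping this way keeps interleaved traffic from different sessions out of each other's input sequence, which is the failure mode the abstract attributes to a naive recurrent model.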

Effect of TV news camerawork and viewers' involvement on memory of news (TV뉴스의 카메라워크와 수용자의 관여도가 뉴스 기억에 미치는 영향)

  • Park, Dug-Chun
    • Journal of Digital Convergence / v.11 no.7 / pp.297-304 / 2013
  • This research explores the effect of TV news camerawork and viewers' involvement on memory of news through an experiment. Two groups of university students were exposed to different types of TV news and responded to survey questions, which were analyzed with SPSS. The research found that TV news camerawork affects long-term memory but not short-term memory. While viewers' involvement was found to have a positive effect on both short-term and long-term memory, no interaction effect between viewers' involvement and camerawork as a peripheral cue was found.

Neuropsychology of Memory (기억의 신경심리학)

  • Rhee, Min-Kyu
    • Sleep Medicine and Psychophysiology / v.4 no.1 / pp.1-14 / 1997
  • This paper reviewed models that explain memory and neuropsychological tests that assess it. Memory was explained from cognitive and neuroanatomical perspectives. The cognitive model describes memory in terms of structure and process. In the structure model, memory is divided into three systems: sensory memory, short-term memory (working memory), and long-term memory. In the process model, there are broadly three categories of memory process: encoding, storage, and retrieval; these processes operate within the memory structure. There are two prominent models of the neuroanatomy of memory, derived from the work of Mishkin and Appenzeller and that of Squire and Zola-Morgan. These two models are the most useful for the clinician, in part because they take into account the connections between the limbic and frontal cortical regions. The major difference between the two models concerns the role of the amygdala in memory processes: Mishkin and his colleagues believe that the amygdala plays a significant role, while Squire and his colleagues do not. The most popular and widely used tests of memory ability, such as the WMS-R, AVLT, CVLT, HVLT, RBMT, CFT, and BVRT-R, were reviewed.


A Survey on Neural Networks Using Memory Component (메모리 요소를 활용한 신경망 연구 동향)

  • Lee, Jihwan;Park, Jinuk;Kim, Jaehyung;Kim, Jaein;Roh, Hongchan;Park, Sanghyun
    • KIPS Transactions on Software and Data Engineering / v.7 no.8 / pp.307-324 / 2018
  • Recently, recurrent neural networks have been attracting attention for solving prediction problems on sequential data through structures that model time dependency. However, as the number of time steps in the sequential data increases, the vanishing-gradient problem occurs. Long short-term memory models have been proposed to solve this problem, but they are limited in storing large amounts of data and preserving it for a long time. Therefore, research on memory-augmented neural networks (MANNs), learning models that combine recurrent neural networks with memory elements, has been actively conducted. In this paper, we describe the structure and characteristics of MANN models, which have emerged as a hot topic in the deep learning field, and present the latest techniques and directions for future research utilizing MANNs.
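The vanishing-gradient problem mentioned above can be illustrated numerically: backpropagating through T recurrent tanh steps multiplies the gradient by roughly w · tanh′(a) at every step, so it decays geometrically whenever that factor is below 1. A toy sketch with arbitrary values, not tied to any paper in this list:

```python
import math

def grad_norm(T, w=0.9, a=0.5):
    """Magnitude of a gradient backpropagated through T tanh steps.

    Each step contributes a factor w * tanh'(a) for a fixed recurrent
    weight w and pre-activation a, so the product shrinks geometrically
    when |w * tanh'(a)| < 1.
    """
    dtanh = 1 - math.tanh(a) ** 2   # derivative of tanh at a
    g = 1.0
    for _ in range(T):
        g *= w * dtanh
    return abs(g)

print(grad_norm(5), grad_norm(50))
```

The 50-step gradient is orders of magnitude smaller than the 5-step one, which is why plain recurrent networks struggle with long sequences and why LSTM's additively updated cell state (and, further, external memory in MANNs) helps.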