• Title/Summary/Keyword: memory attention


A Psychological Model for Mathematical Problem Solving based on Revised Bloom Taxonomy for High School Girl Students

  • Hajibaba, Maryam;Radmehr, Farzad;Alamolhodaei, Hassan
    • Research in Mathematical Education / v.17 no.3 / pp.199-220 / 2013
  • The main objective of this study is to explore the relationship between psychological factors (i.e., math anxiety, attention, attitude, Working Memory Capacity (WMC), and field dependency) and students' mathematics problem solving based on the Revised Bloom Taxonomy. A sample of 169 K11 schoolgirls was tested on (1) Witkin's cognitive style (Group Embedded Figures Test), (2) the Digit Span Backwards Test, (3) the Mathematics Anxiety Rating Scale (MARS), (4) the modified Fennema-Sherman Attitude Scales, (5) the Mathematics Attention Test (MAT), and (6) mathematics questions based on the Revised Bloom Taxonomy (RBT). The results indicate that the effect of these factors on students' mathematical problem solving differs across the cognitive processes and levels of the knowledge dimension.

DG-based SPO tuple recognition using self-attention M-Bi-LSTM

  • Jung, Joon-young
    • ETRI Journal / v.44 no.3 / pp.438-449 / 2022
  • This study proposes a dependency grammar-based self-attention multilayered bidirectional long short-term memory (DG-M-Bi-LSTM) model for subject-predicate-object (SPO) tuple recognition from natural language (NL) sentences. To add recent knowledge to a knowledge base autonomously, it is essential to extract knowledge from large amounts of NL data. Therefore, this study proposes a high-accuracy SPO tuple recognition model that requires only a small amount of training data to extract knowledge from NL sentences. The accuracy of SPO tuple recognition using DG-M-Bi-LSTM is compared with that of an NL-based self-attention multilayered bidirectional LSTM, DG-based bidirectional encoder representations from transformers (BERT), and NL-based BERT to evaluate its effectiveness. The DG-M-Bi-LSTM model achieves the best recognition accuracy for extracting SPO tuples from NL sentences even though it has fewer deep neural network (DNN) parameters than BERT. In particular, its accuracy is better than that of BERT when the training data are limited. Additionally, its pretrained DNN parameters can be applied to other domains because it learns the structural relations in NL sentences.
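
Below is a minimal PyTorch sketch of the general pattern the abstract describes: self-attention applied over multilayered Bi-LSTM outputs, followed by a per-token tagging head. The dimensions, layer counts, head count, and the 4-way tag set are illustrative assumptions, not the authors' DG-M-Bi-LSTM configuration, and the dependency-grammar features are omitted.

```python
# Sketch of self-attention over multilayered Bi-LSTM outputs (assumed sizes).
import torch
import torch.nn as nn

class SelfAttentionBiLSTM(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden=256, layers=2, n_tags=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, num_layers=layers,
                              bidirectional=True, batch_first=True)
        self.attn = nn.MultiheadAttention(embed_dim=2 * hidden, num_heads=4,
                                          batch_first=True)
        self.tagger = nn.Linear(2 * hidden, n_tags)  # e.g. S/P/O/other per token

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))   # (batch, seq, 2*hidden)
        ctx, _ = self.attn(h, h, h)                 # self-attention over the sequence
        return self.tagger(ctx)                     # per-token tag logits

logits = SelfAttentionBiLSTM(vocab_size=10000)(torch.randint(0, 10000, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 4])
```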

The Influence of Rosemary Oil Inhalation on Memory, Attention and Autonomic Nerve System on the Elderly by Different Concentration (농도별 로즈마리 오일 흡입이 노인의 기억력, 집중력 및 자율신경계 반응에 미치는 영향)

  • Yang, In Suk;Park, Myung Sook
    • Journal of Convergence for Information Technology / v.9 no.3 / pp.56-67 / 2019
  • The aim of this study was to investigate the influence of rosemary oil inhalation on memory, attention, and the autonomic nervous system in the elderly according to concentration. The research design was a non-equivalent control group, non-synchronized design. Participants were 89 community-dwelling individuals aged 65 or older. Participants inhaled almond carrier oil (control group), 10% rosemary oil (experimental group A), or 100% rosemary oil (experimental group B). Memory, attention, and autonomic nervous system responses were measured. Data were analyzed with SPSS Win 24.0, and group-by-time differences were analyzed through repeated-measures ANOVA. There were no significant differences in immediate recall (F=.42, p=.656), delayed recall (F=.45, p=.639), recognition (F=1.45, p=.242), digit span forward (F=1.53, p=.223), digit span backward (F=.46, p=.636), sympathetic nervous system activity (LF) (F=.19, p=.828), parasympathetic nervous system activity (HF) (F=.37, p=.694), LF/HF (F=1.39, p=.256), systolic blood pressure (F=.37, p=.694), or diastolic blood pressure (F=1.25, p=.291). Inhalation of 10% and 100% rosemary oil for five minutes showed no significant effects on memory, attention, or the autonomic nervous system in the elderly.
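
For reference, a hedged sketch of the kind of group-by-time analysis the abstract mentions (repeated-measures ANOVA). The original analysis used SPSS Win 24.0; the data below are invented placeholders, and only the within-subject (time) factor is shown.

```python
# Repeated-measures ANOVA sketch with placeholder data (not the study's dataset).
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one recall score per subject per time point.
data = pd.DataFrame({
    "subject": [1, 1, 2, 2, 3, 3, 4, 4],
    "time":    ["pre", "post"] * 4,
    "recall":  [5, 6, 4, 4, 6, 7, 5, 5],
})

res = AnovaRM(data, depvar="recall", subject="subject", within=["time"]).fit()
print(res)  # F statistic and p-value for the time effect
```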

The Effect of White Noise on Memory and Attention of Local Community Elderly during Computer-Based Cognitive Rehabilitation Program (컴퓨터 기반 인지재활 프로그램 적용 시 백색소음이 지역사회 노인의 기억력과 주의력에 미치는 영향)

  • Kim, Gi-Di;Heo, Myoung
    • Journal of Korea Entertainment Industry Association / v.13 no.8 / pp.627-633 / 2019
  • The purpose of this study was to investigate the effect of white noise on the memory and attention of community-dwelling elderly people during a computer-based cognitive rehabilitation program (COMCOG). Thirty elderly subjects were recruited and conveniently allocated to experimental and control groups. Experimental group subjects carried out the COMCOG with white noise; control group subjects received the COMCOG alone, for 30 minutes, 3 times per week, for 6 weeks. A neurocognitive function test (CNT) was used to evaluate memory and attention before and after the intervention. Most CNT outcomes improved significantly from pre- to post-test in both groups (p<.05), and the experimental group differed significantly from the control group (p<.05). The results of this study show that COMCOG with white noise may be appropriate for improving memory and attention in the elderly. This will enable the application of COMCOG with white noise in the cognitive rehabilitation therapy of community elderly and will help guide therapists in selecting it as one therapeutic option.
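
A minimal sketch of the pre/post and between-group comparisons implied by the abstract, using simple t-tests as stand-ins for the study's actual statistics; the scores are invented placeholders, not CNT data.

```python
# Placeholder pre/post comparisons (within each group and between groups).
import numpy as np
from scipy import stats

pre_exp,  post_exp  = np.array([12, 10, 14, 11, 13]), np.array([15, 13, 16, 14, 15])
pre_ctrl, post_ctrl = np.array([11, 12, 13, 10, 12]), np.array([12, 13, 13, 11, 13])

print(stats.ttest_rel(post_exp, pre_exp))    # within experimental group
print(stats.ttest_rel(post_ctrl, pre_ctrl))  # within control group
print(stats.ttest_ind(post_exp - pre_exp,
                      post_ctrl - pre_ctrl))  # gain-score comparison between groups
```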

Electroencephalography of Learning and Memory (학습과 기억의 뇌파)

  • Jeon, Hyeonjin;Lee, Seung-Hwan
    • Korean Journal of Biological Psychiatry / v.23 no.3 / pp.102-107 / 2016
  • This review summarizes EEG studies of learning and memory by frequency band, including theta waves (4-7 Hz), gamma waves (> 30 Hz), and alpha waves (7-12 Hz). The authors searched PubMed and reviewed EEG papers focusing especially on learning and memory. Theta waves are associated with the acquisition of new information from stimuli. Gamma waves are connected with comparing and binding old information in preexisting memory with new information from stimuli. Alpha waves are linked with attention, which in turn mediates the learning and memory process. Although EEG studies of learning and memory still involve controversial issues, future EEG studies are expected to yield clinical benefits as methods continue to develop.
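
As a concrete illustration of the frequency bands discussed above, the sketch below computes theta (4-7 Hz), alpha (7-12 Hz), and gamma (>30 Hz) band power from a Welch periodogram; the signal, sampling rate, and gamma upper bound are assumptions, not data from the reviewed studies.

```python
# Band-power estimation over the theta/alpha/gamma bands from a synthetic signal.
import numpy as np
from scipy.signal import welch

fs = 256                                    # assumed sampling rate (Hz)
eeg = np.random.randn(30 * fs)              # 30 s of placeholder "EEG"
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
df = freqs[1] - freqs[0]                    # frequency resolution

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * df             # approximate integral of PSD over band

for name, (lo, hi) in {"theta": (4, 7), "alpha": (7, 12), "gamma": (30, 100)}.items():
    print(f"{name}: {band_power(lo, hi):.4f}")
```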

Emotion Classification based on EEG signals with LSTM deep learning method (어텐션 메커니즘 기반 Long-Short Term Memory Network를 이용한 EEG 신호 기반의 감정 분류 기법)

  • Kim, Youmin;Choi, Ahyoung
    • Journal of Korea Society of Industrial Information Systems / v.26 no.1 / pp.1-10 / 2021
  • This study proposed a Long Short-Term Memory network to consider changes in emotion over time and applied an attention mechanism to weight the emotional states that appear at specific moments. We used 32-channel EEG data from the DEAP database. A 2-level classification experiment (Low and High) and a 3-level classification experiment (Low, Middle, and High) were performed on the Valence and Arousal dimensions of the emotion model. As a result, the accuracy of the 2-level classification experiment was 90.1% for Valence and 88.1% for Arousal, and the accuracy of the 3-level classification was 83.5% for Valence and 82.5% for Arousal.
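
A rough sketch of the attention-weighted LSTM classifier idea described above, in which attention scores weight the hidden states of specific moments before the final valence/arousal prediction. The channel count follows DEAP (32 channels); everything else (hidden size, pooling form, 2-level output) is an assumption rather than the authors' exact configuration.

```python
# Attention pooling over LSTM hidden states for 2-level emotion classification.
import torch
import torch.nn as nn

class AttnLSTMEmotion(nn.Module):
    def __init__(self, n_channels=32, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)        # one attention score per time step
        self.out = nn.Linear(hidden, n_classes)  # Low/High valence or arousal

    def forward(self, x):                        # x: (batch, time, channels)
        h, _ = self.lstm(x)                      # (batch, time, hidden)
        w = torch.softmax(self.score(h), dim=1)  # attention weights over time
        pooled = (w * h).sum(dim=1)              # weighted sum of hidden states
        return self.out(pooled)

logits = AttnLSTMEmotion()(torch.randn(4, 128, 32))  # 4 trials, 128 time steps
print(logits.shape)  # torch.Size([4, 2])
```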

Robustness of Differentiable Neural Computer Using Limited Retention Vector-based Memory Deallocation in Language Model

  • Lee, Donghyun;Park, Hosung;Seo, Soonshin;Son, Hyunsoo;Kim, Gyujin;Kim, Ji-Hwan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.3 / pp.837-852 / 2021
  • Recurrent neural network (RNN) architectures have been used for language modeling (LM) tasks that require learning long-range word or character sequences. However, the RNN architecture still suffers from unstable gradients on long-range sequences. To address the issue of long-range sequences, attention mechanisms have been used, showing state-of-the-art (SOTA) performance on LM tasks. A differentiable neural computer (DNC) is a deep learning architecture using an attention mechanism. The DNC architecture is a neural network augmented with a content-addressable external memory. However, in the write operation, some information unrelated to the input word remains in memory. Moreover, DNCs have been found to perform poorly with low numbers of weight parameters. Therefore, we propose a robust memory deallocation method using a limited retention vector. The limited retention vector determines whether the network increases or decreases its usage of information in external memory according to a threshold. We experimentally evaluate the robustness of a DNC implementing the proposed approach according to the size of the controller and external memory on the enwik8 LM task. When we decreased the number of weight parameters by 32.47%, the proposed DNC showed a low bits-per-character (BPC) degradation of 4.30%, demonstrating the effectiveness of our approach in language modeling tasks.
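
A hedged NumPy sketch of the "limited retention vector" idea as read from the abstract: the standard DNC retention vector is thresholded so that memory slots are either fully retained or deallocated outright before the usage update. Shapes, the threshold value, and variable names are illustrative assumptions, not the paper's implementation.

```python
# Thresholded (limited) retention vector applied to the DNC usage update.
import numpy as np

def limited_retention(free_gates, read_weights, threshold=0.5):
    """free_gates: (num_reads,); read_weights: (num_reads, num_slots)."""
    # Standard DNC retention: psi = prod_i (1 - f_i * w^r_i)
    psi = np.prod(1.0 - free_gates[:, None] * read_weights, axis=0)
    # Limited retention (assumed reading): binarize against a threshold so a
    # slot is either kept as-is or deallocated outright.
    return np.where(psi >= threshold, 1.0, 0.0)

def update_usage(usage, write_weights, psi):
    # Usage update from the DNC, scaled by the (limited) retention vector.
    return (usage + write_weights - usage * write_weights) * psi

usage = np.full(8, 0.3)                   # 8 memory slots, moderate usage
write_w = np.eye(8)[2]                    # last write focused on slot 2
psi = limited_retention(np.array([0.9]), np.eye(8)[2][None, :])
print(update_usage(usage, write_w, psi))  # slot 2 deallocated (usage -> 0)
```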

ON HIPPOCAMPUS PROTOCOL BY A BRAIN WAVE ANALYSIS IN THE FIELD OF MEMORY FOR A MUSICAL THERAPY

  • Kengo-Shibata;Takashi-Azakami
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 1999.06a / pp.95-96 / 1999
  • In this paper, the authors consider the 1/f fluctuation of vital rhythms, using the $1/f^{\beta}$ spectrum of the $\alpha$ wave, in relation to the invigoration of learning memory, paying particular attention to the hippocampus protocol. In the first clinical experiment, data from a short-period recall test serve as the foundation for repeated memorization. This memory can be converted into long-period memory through the hippocampus by the superposition of the same memory-nerve circuits.
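
To make the $1/f^{\beta}$ notation concrete, the sketch below estimates the spectral exponent $\beta$ from the log-log slope of a Welch power spectrum; the synthetic signal and the 1-40 Hz fitting range are assumptions, not the authors' protocol.

```python
# Estimate beta in P(f) ~ 1/f^beta from the log-log slope of a power spectrum.
import numpy as np
from scipy.signal import welch

fs = 256
rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(60 * fs))   # brown-ish noise, beta ~ 2

freqs, psd = welch(signal, fs=fs, nperseg=4 * fs)
band = (freqs >= 1) & (freqs <= 40)                 # assumed fitting range
slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), 1)
print(f"estimated beta = {-slope:.2f}")
```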

Trends of the CCIX Interconnect and Memory Expansion Technology (CCIX 연결망과 메모리 확장기술 동향)

  • Kim, S.Y.;Ahn, H.Y.;Jun, S.I.;Park, Y.M.;Han, W.J.
    • Electronics and Telecommunications Trends / v.37 no.1 / pp.42-52 / 2022
  • With the advent of the big data era, the memory capacity required by computing systems is rapidly increasing, especially in high-performance computing systems. However, the number of DRAM modules that can be used in a computing node is limited by structural constraints of the hardware (for example, CPU specifications). Memory expansion technology has attracted attention as a means of overcoming this limitation: it expands memory capacity by leveraging external memory connected to the host system through hardware interfaces such as PCIe and CCIX. In this paper, we present an overview of memory expansion technology and describe its development trends. We also provide detailed descriptions and use cases of CCIX, which provides higher bandwidth and lower latency than PCIe.

Trends in Compute Express Link(CXL) Technology (CXL 인터커넥트 기술 연구개발 동향)

  • S.Y. Kim;H.Y. Ahn;Y.M. Park;W.J. Han
    • Electronics and Telecommunications Trends / v.38 no.5 / pp.23-33 / 2023
  • With the widespread demand from data-intensive tasks such as machine learning and large-scale databases, the amount of data processed in modern computing systems is increasing exponentially. Such data-intensive tasks require large amounts of memory to rapidly process and analyze massive data. However, existing computing system architectures face challenges when building large-scale memory owing to various structural issues such as CPU specifications. Moreover, large-scale memory may cause problems including memory overprovisioning. The Compute Express Link (CXL) allows computing nodes to use large amounts of memory while mitigating related problems. Hence, CXL is attracting great attention in industry and academia. We describe the overarching concepts underlying CXL and explore recent research trends in this technology.