• Title/Summary/Keyword: Attention Learning


Electroencephalogram Analysis on Learning Factors during Relaxed or Concentrated Attention according to the Color Temperatures of LED Illuminance (이완집중 및 긴장집중 시 LED 조명의 색온도에 따른 학습요인의 뇌파분석)

  • Jee, Soon-Duk;Kim, Chae-Bogk
    • Journal of the Korean Institute of Educational Facilities
    • /
    • v.21 no.6
    • /
    • pp.33-42
    • /
    • 2014
  • The objective of this study is to investigate learning factors (stability, attention, and activation) in school settings through electroencephalogram (theta, alpha, and beta wave) analysis during relaxed or concentrated attention. To measure the electroencephalograms, a Biopac MP150 and an ECI Electro-Cap were employed. Three color temperatures (3000K, 5000K, 7000K) were used, and 13 undergraduate and 12 graduate students were selected as experimental subjects. While subjects were relaxed during contemplation or concentrated during mental arithmetic, the stability, attention, and activation indices were compared. The results show that subjects were most stable at a color temperature of 5000K and paid the best attention at 7000K. Activation was highest at 3000K during relaxed attention, but at 7000K during concentrated attention.

Paying Attention to Students and Promoting Students' Mathematics Understanding

  • Li, Miao;Tang, Jian-Lan;Huang, Xiao-Xue
    • Research in Mathematical Education
    • /
    • v.12 no.1
    • /
    • pp.67-83
    • /
    • 2008
  • Promoting students' mathematics understanding is an important research theme in mathematics education. According to general theories of learning, mathematics understanding is closely related to active or meaningful learning. Thus, if teachers want to promote their students' mathematics understanding, they should pay attention to the students so that the students' thinking stays in an active state. In the first part of this paper, some Chinese high school mathematics teachers' ideas about paying attention to their students are presented, gathered by questionnaire and interview. In the second part, we give some teaching episodes showing how experienced mathematics teachers promote their students' mathematics understanding by paying attention to them.


The Convergence Influence of excessive smartphone use on attention deficit, learning environment, and academic procrastination in health college students (보건계열 대학생의 스마트폰 과다사용이 주의력결핍, 학습환경, 학업지연행동에 미치는 융합적 영향)

  • Im, In-Chul;Jang, Kyeung-Ae
    • Journal of the Korea Convergence Society
    • /
    • v.8 no.12
    • /
    • pp.129-137
    • /
    • 2017
  • The purpose of this study is to investigate the convergence influence of excessive smartphone use on attention deficit, learning environment, and academic procrastination in health college students. A self-reported questionnaire was completed by 255 college students in Busan from March 6 to June 12, 2017. The degrees of smartphone overuse, attention deficit, learning environment, and academic procrastination differed significantly with smartphone use characteristics: time spent on smartphones per day, awareness of smartphone addiction, and personal use of smartphones during class time (p<0.001). Smartphone overuse was positively correlated with attention deficit (r=0.870, p<0.01), learning environment (r=0.812, p<0.01), and academic procrastination (r=0.772, p<0.01); attention deficit was positively correlated with learning environment (r=0.918, p<0.01) and academic procrastination (r=0.798, p<0.01); and learning environment was positively correlated with academic procrastination (r=0.777, p<0.01). The factors influencing smartphone overuse were attention deficit (p<0.001), followed by academic procrastination (p<0.01). It is necessary to establish a healthy learning environment through prevention and proper use of smartphones.

Deep Learning-based Super Resolution Method Using Combination of Channel Attention and Spatial Attention (채널 강조와 공간 강조의 결합을 이용한 딥 러닝 기반의 초해상도 방법)

  • Lee, Dong-Woo;Lee, Sang-Hun;Han, Hyun Ho
    • Journal of the Korea Convergence Society
    • /
    • v.11 no.12
    • /
    • pp.15-22
    • /
    • 2020
  • In this paper, we proposed a deep learning-based super-resolution method that combines Channel Attention and Spatial Attention feature enhancement. In super-resolution processing, it is important to restore high-frequency components, such as texture and fine detail, that change sharply between neighboring pixels. Existing CNN (Convolutional Neural Network) based super-resolution methods have difficulty training deep networks and lack emphasis on high-frequency components, resulting in blurry contours and distortion. To solve this problem, we used a Residual Block together with an emphasis block that combines Channel Attention and Spatial Attention with Skip Connections. The emphasized feature map extracted by this method was upscaled through Sub-pixel Convolution to obtain the super-resolution result. As a result, PSNR improved by about 5% and SSIM by about 3% compared with the conventional SRCNN, and by about 2% and 1%, respectively, compared with VDSR.
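The channel/spatial emphasis described in this abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the real blocks use learned convolutional and fully connected layers, whereas here the gating weights are just sigmoids of pooled statistics, and all function names are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(fmap):
    # fmap: (C, H, W). Gate each channel by its global average response.
    desc = fmap.mean(axis=(1, 2))            # (C,) per-channel descriptor
    weights = sigmoid(desc)                  # (C,) gating weights in (0, 1)
    return fmap * weights[:, None, None]

def spatial_attention(fmap):
    # Pool across channels to get an (H, W) spatial descriptor, then gate.
    desc = fmap.mean(axis=0)
    weights = sigmoid(desc)
    return fmap * weights[None, :, :]

def emphasis_block(fmap):
    # Combine both emphases with a skip connection, as the abstract describes.
    return fmap + spatial_attention(channel_attention(fmap))
```

The skip connection keeps the original features flowing through the block, so the attention terms only add emphasis rather than replacing the signal, which is what lets such blocks stack deeply.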

A Study on Automatic Recommendation of Keywords for Sub-Classification of National Science and Technology Standard Classification System Using AttentionMesh (AttentionMesh를 활용한 국가과학기술표준분류체계 소분류 키워드 자동추천에 관한 연구)

  • Park, Jin Ho;Song, Min Sun
    • Journal of Korean Library and Information Science Society
    • /
    • v.53 no.2
    • /
    • pp.95-115
    • /
    • 2022
  • The purpose of this study is to transform the sub-classification terms of the National Science and Technology Standard Classification System into technical keywords by applying a machine learning algorithm. For this purpose, AttentionMeSH was used as a learning algorithm suitable for topic keyword recommendation. As source data, four years of research status files (2017 to 2020), refined by the Korea Institute of Science and Technology Planning and Evaluation, were used. For training, four attributes that represent the research content well were used: task name, research goal, research abstract, and expected effect. As a result, a MiF of 0.6377 was obtained at a threshold of 0.5. To utilize machine learning in actual work in the future and to secure technical keywords, it will be necessary to establish a term management system and secure data with a variety of attributes.
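The evaluation setup in this abstract — keep keywords whose predicted score clears a threshold, then score with micro-averaged F1 (MiF) — can be sketched as follows. The function names and data are illustrative, not from the paper.

```python
import numpy as np

def recommend_keywords(scores, vocab, threshold=0.5):
    # Keep every candidate keyword whose predicted score clears the threshold.
    return [kw for kw, s in zip(vocab, scores) if s >= threshold]

def micro_f1(true_sets, pred_sets):
    # Micro-averaged F1 (MiF): pool TP/FP/FN counts over all documents
    # before computing precision and recall.
    tp = sum(len(t & p) for t, p in zip(true_sets, pred_sets))
    fp = sum(len(p - t) for t, p in zip(true_sets, pred_sets))
    fn = sum(len(t - p) for t, p in zip(true_sets, pred_sets))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0
```

Micro-averaging weights frequent keywords more heavily than macro-averaging, which suits highly imbalanced keyword vocabularies like a national classification scheme.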

An Attention Method-based Deep Learning Encoder for the Sentiment Classification of Documents (문서의 감정 분류를 위한 주목 방법 기반의 딥러닝 인코더)

  • Kwon, Sunjae;Kim, Juae;Kang, Sangwoo;Seo, Jungyun
    • KIISE Transactions on Computing Practices
    • /
    • v.23 no.4
    • /
    • pp.268-273
    • /
    • 2017
  • Recently, deep learning encoder-based approaches have been actively applied to sentiment classification. However, the Long Short-Term Memory network encoder, the commonly used architecture, produces poor vector representations when documents are long. In this study, for effective classification of sentiment documents, we suggest an attention method-based deep learning encoder that generates a document vector representation as an importance-weighted sum of the Long Short-Term Memory network outputs. In addition, we propose two modifications that adapt the attention-based encoder to sentiment classification: a window attention method and an attention weight adjustment step. In the window attention method, weights are computed over window units to effectively recognize sentiment features that consist of more than one word. In the attention weight adjustment step, the learned weights are smoothed. Experimental results show that the proposed method outperforms the Long Short-Term Memory network encoder, achieving 89.67% accuracy.
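The weighted-sum encoding this abstract describes has a standard shape, sketched below in NumPy under simplifying assumptions: the LSTM outputs are given as a ready-made `(T, d)` array, and the attention score is a plain dot product with a query vector rather than the paper's learned scoring.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_encode(hidden_states, query):
    # hidden_states: (T, d) LSTM outputs; query: (d,) importance vector.
    scores = hidden_states @ query           # one importance score per step
    weights = softmax(scores)                # normalized attention weights
    return weights @ hidden_states           # (d,) document vector

def window_scores(hidden_states, query, window=2):
    # Window attention: score overlapping windows so multi-word sentiment
    # cues (e.g. "not good") receive a single weight.
    T = hidden_states.shape[0]
    pooled = np.stack([hidden_states[i:i + window].mean(axis=0)
                       for i in range(T - window + 1)])
    return softmax(pooled @ query)
```

The weight-adjustment step in the paper smooths these weights after learning; any monotone flattening (e.g. raising weights to a power below one and renormalizing) would play the same role in this sketch.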

Attentive Transfer Learning via Self-supervised Learning for Cervical Dysplasia Diagnosis

  • Chae, Jinyeong;Zimmermann, Roger;Kim, Dongho;Kim, Jihie
    • Journal of Information Processing Systems
    • /
    • v.17 no.3
    • /
    • pp.453-461
    • /
    • 2021
  • Many deep learning approaches have been studied for image classification in computer vision. In medical fields, however, there are not enough data to build accurate models, and many datasets are not annotated. This study presents a new method that can use both unlabeled and labeled data. The proposed method is applied to classify cervix images as normal versus cancerous, and we demonstrate the results. First, we use patch-based self-supervised learning on an unlabeled image dataset to learn the global context of the image. Second, we build a classifier model using the knowledge transferred from the self-supervised learning stage. We also apply attention learning to capture the local features of the image. The combined method outperforms state-of-the-art approaches in accuracy and sensitivity.
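One common patch-based self-supervised setup generates labels from the image itself by shuffling patches and asking the model to recover their ordering. The abstract does not specify which pretext task was used, so the sketch below is only an assumed example of the general pattern; every name in it is hypothetical.

```python
import numpy as np

def make_patch_task(image, patch=8, rng=None):
    # Pretext task (assumed): cut the image into non-overlapping patches,
    # shuffle them, and keep the permutation as a self-generated label.
    rng = rng if rng is not None else np.random.default_rng(0)
    H, W = image.shape
    patches = [image[r:r + patch, c:c + patch]
               for r in range(0, H, patch)
               for c in range(0, W, patch)]
    order = rng.permutation(len(patches))
    shuffled = [patches[i] for i in order]
    return shuffled, order  # model input and its free supervision signal
```

A network pretrained to predict `order` from `shuffled` must learn spatial context without any annotation; its weights then initialize the labeled-data classifier, which is the transfer step the abstract describes.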

Saliency Attention Method for Salient Object Detection Based on Deep Learning (딥러닝 기반의 돌출 객체 검출을 위한 Saliency Attention 방법)

  • Kim, Hoi-Jun;Lee, Sang-Hun;Han, Hyun Ho;Kim, Jin-Soo
    • Journal of the Korea Convergence Society
    • /
    • v.11 no.12
    • /
    • pp.39-47
    • /
    • 2020
  • In this paper, we proposed a deep learning-based detection method using Saliency Attention to detect salient objects in images. Salient object detection separates the object on which the human eye focuses from the background and identifies the most relevant part of the image. It is useful in various fields such as object tracking, detection, and recognition. Existing deep learning-based methods are mostly Autoencoder structures, and many feature losses occur both in the encoder, which compresses features while extracting them, and in the decoder, which decompresses and expands the extracted features. These losses cause the salient object area to be lost or the background to be detected as an object. In the proposed method, Saliency Attention reduces the feature loss and suppresses the background region within the Autoencoder structure. The influence of the feature values was determined using the ELU activation function, and attention was applied separately to the feature values in the normalized negative and positive regions. Through this attention method, the background area was suppressed and the salient object area was emphasized. Experimental results showed improved detection compared to existing deep learning methods.
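The ELU-based split this abstract describes — treating negative and positive activations as separate regions and gating each — can be sketched in NumPy. The real method normalizes the regions and learns the attention; here the gate is a fixed sigmoid, so this only shows the sign-split structure, not the trained behavior.

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU keeps positive values and maps negatives into (-alpha, 0).
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def saliency_attention(fmap):
    act = elu(fmap)
    pos = np.maximum(act, 0.0)   # object-leaning responses
    neg = np.minimum(act, 0.0)   # background-leaning responses
    # Gate each region separately: positive responses are passed through
    # with weights near 1, negative responses are damped toward zero.
    return pos * sigmoid(pos) + neg * sigmoid(neg)
```

Because ELU bounds the negative branch, the background term can never dominate, which is the suppression effect the abstract attributes to attending over the two regions separately.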

Factors which Hinder Attention in Online Classes and Solutions (온라인 수업에서 주의 집중을 저해하는 요인과 해결방안)

  • Shin, Soo-Bum
    • Journal of Creative Information Culture
    • /
    • v.6 no.3
    • /
    • pp.159-168
    • /
    • 2020
  • Unlike in-person classes such as classroom instruction, many variables affect learning effectiveness in online classes. In online classes, the teacher must recognize the learners' various circumstances in advance, and removing any variables that hinder learning can increase the learners' attention. It is therefore necessary to study which variables distract attention. In this study, factors that inhibit attention in online classes were selected, and a factor analysis was conducted using questionnaires. The results indicated class progression, environment, and device manipulation as the factors that hinder attention in online classes. Based on these results, plans to increase attention in online classes were suggested.

Linear-Time Korean Morphological Analysis Using an Action-based Local Monotonic Attention Mechanism

  • Hwang, Hyunsun;Lee, Changki
    • ETRI Journal
    • /
    • v.42 no.1
    • /
    • pp.101-107
    • /
    • 2020
  • For Korean language processing, morphological analysis is a critical component that requires extensive work. Morphological analysis can be conducted in an end-to-end manner, without a complicated feature design, using a sequence-to-sequence model. However, the sequence-to-sequence model has a time complexity of O(n²) for an input of length n when using the attention mechanism for high performance. In this study, we propose a linear-time Korean morphological analysis model using a local monotonic attention mechanism that relies on monotonic alignment, a characteristic of Korean morphological analysis. The proposed model shows a substantial speed improvement in a single-threaded environment and a high morpheme F1-measure, even for a hard attention variant that eliminates the soft attention computation.
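The reason local monotonic attention is linear-time can be seen in a toy NumPy sketch: each decoding step attends only to a fixed-width window around a position that advances monotonically, so total work grows with n rather than n². The scoring function and one-step-per-output advance below are simplifications, not the paper's learned model.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def local_monotonic_attention(enc, center, width=1):
    # Attend only to a (2*width+1)-wide window around `center`; per-step
    # cost is constant, so decoding the whole sequence is O(n).
    lo = max(0, center - width)
    hi = min(enc.shape[0], center + width + 1)
    window = enc[lo:hi]
    weights = softmax(window.mean(axis=1))   # toy scoring over the window
    return weights @ window

def decode(enc):
    # Monotonic alignment: the attended position advances one step per
    # output, mirroring how Korean morphemes align with the input surface.
    return np.stack([local_monotonic_attention(enc, t)
                     for t in range(enc.shape[0])])
```

Global soft attention would instead score all n encoder states at each of the n output steps, which is where the O(n²) cost of the standard sequence-to-sequence model comes from.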