• Title/Summary/Keyword: Knowledge distillation

Process Control Using a Neural Network Combined with the Conventional PID Controllers

  • Lee, Moonyong;Park, Sunwon
    • Transactions on Control, Automation and Systems Engineering
    • /
    • v.2 no.3
    • /
    • pp.196-200
    • /
    • 2000
  • A neural controller for process control is proposed that combines a conventional multi-loop PID controller with a neural network. The concept of a target signal based on the feedback error is used for on-line learning of the neural network. This controller is applied to distillation column control to illustrate its effectiveness. The results show that the proposed neural controller copes well with disturbances, strong interactions, and time delays without any prior knowledge of the process. (A minimal sketch of the feedback-error learning scheme appears below.)

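The entry above combines a fixed PID loop with a network trained on-line from the feedback error. The following is a minimal sketch of that feedback-error-learning idea under assumed choices (a one-hidden-layer network, hypothetical PID gains, and a toy first-order plant, none of which come from the paper): the network's feedforward action is added to the PID action, while the PID output itself serves as the network's on-line teaching signal, so the network gradually absorbs the control effort.

```python
import numpy as np

class PID:
    """Textbook discrete PID; the gains used below are hypothetical."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

class NeuralFF:
    """One-hidden-layer net updated by on-line gradient descent."""
    def __init__(self, n_in, n_hidden, lr):
        rng = np.random.default_rng(0)
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.W2 = rng.normal(0.0, 0.1, (1, n_hidden))
        self.lr = lr

    def forward(self, x):
        self.x = x
        self.h = np.tanh(self.W1 @ x)
        return (self.W2 @ self.h).item()

    def learn(self, fb_err):
        # Feedback-error learning: treat the PID output as the error of
        # the net's output, i.e. descend on 0.5 * fb_err**2.
        grad_out = -fb_err
        grad_h = grad_out * self.W2.ravel() * (1.0 - self.h ** 2)
        self.W2 -= self.lr * grad_out * self.h[None, :]
        self.W1 -= self.lr * np.outer(grad_h, self.x)

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
net = NeuralFF(n_in=2, n_hidden=8, lr=1e-2)
y, setpoint = 0.0, 1.0
for _ in range(500):
    u_fb = pid.step(setpoint - y)                # feedback action
    u_ff = net.forward(np.array([setpoint, y]))  # learned feedforward action
    net.learn(u_fb)                              # PID output = teaching signal
    y += 0.1 * (-y + u_fb + u_ff)                # toy first-order plant
```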

Process Control Using a Neural Network Combined with the Conventional PID Controllers

  • Lee, Moonyong;Park, Sunwon
    • Transactions on Control, Automation and Systems Engineering
    • /
    • v.2 no.2
    • /
    • pp.136-139
    • /
    • 2000
  • A neural controller for process control is proposed that combines a conventional multi-loop PID controller with a neural network. The concept of a target signal based on the feedback error is used for on-line learning of the neural network. This controller is applied to distillation column control to illustrate its effectiveness. The results show that the proposed neural controller copes well with disturbances, strong interactions, and time delays without any prior knowledge of the process.


A study of distillation column control by using a neural controller

  • Lee, Moonyong;Park, Sunwon
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1990.10a
    • /
    • pp.234-239
    • /
    • 1990
  • A neural controller for process control was proposed that combines a simple feedback controller with a neural network. This controller was applied to distillation control, with the feedback-error learning technique used for on-line learning. Important characteristics of the neural controller were analyzed. The proposed neural controller copes well with strong interactions, significant time delays, and sudden changes in process dynamics without any prior knowledge of the process. It was also shown that the neural controller has desirable features such as fault tolerance, an interpolation effect, and a random learning capability.


Shot Boundary Detection Model using Knowledge Distillation

  • Park, Sung Min;Yoon, Ui Nyoung;Jo, Geun-Sik
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2019.06a
    • /
    • pp.29-31
    • /
    • 2019
  • Shot boundary detection is an essential technique for video content analysis, and research has continued on accurately detecting the shot boundaries of videos edited in diverse ways. Previous studies, however, included non-learnable steps, such as fixed shot-boundary-detection algorithms or manual work, which limited performance improvement. This paper proposes an end-to-end model that removes such steps. The proposed model uses transfer learning on an action recognition dataset to improve spatio-temporal feature extraction, and combines an improved knowledge distillation technique to improve shot-boundary-detection performance. On the ClipShots dataset, the proposed model outperformed DeepSBD by 5.4% on cut transitions and 41.29% on gradual transitions, and achieved 1.3% higher cut-transition accuracy than DSM. (A minimal sketch of the generic distillation loss appears below.)

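The abstract mentions an improved knowledge distillation technique but does not specify it, so the sketch below shows only the generic teacher-student logit distillation (Hinton-style) that such work typically builds on; the temperature, weighting, and three-class setup are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft term: KL divergence between temperature-softened distributions,
    # rescaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)  # ground-truth term
    return alpha * soft + (1.0 - alpha) * hard

# Three classes could stand in for {no transition, cut, gradual}.
student_out = torch.randn(8, 3, requires_grad=True)
teacher_out = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
kd_loss(student_out, teacher_out, labels).backward()
```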

SqueezeNet-based Single Image Super Resolution using Knowledge Distillation

  • Seo, Yu lim;Kang, Suk-Ju
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2020.11a
    • /
    • pp.226-227
    • /
    • 2020
  • Recent super-resolution (SR) research has mainly improved performance by making networks deeper and wider. However, this simultaneously increases computation and memory consumption, which makes hardware implementation difficult. We therefore designed SqueezeSR, a network that reduces the parameter count through network optimization while minimizing the performance loss. We also propose a training method that uses knowledge distillation (KD) to improve performance without any additional parameters. During KD, comparing feature maps between the teacher and student networks let the teacher's performance transfer to the student more effectively and improved training efficiency. As a result, using KD we obtain a network that is faster than other SR networks, with minimal performance loss and no added parameters. (A sketch of feature-map distillation appears below.)

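The entry above improves distillation by comparing feature maps between teacher and student. SqueezeSR itself is not described in the abstract, so the sketch below uses tiny stand-in SR networks; the channel widths, the 1x1 adaptation conv, and the 0.1 loss weight are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySR(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU())
        # 12 channels -> PixelShuffle(2) -> 3-channel image at 2x size
        self.up = nn.Sequential(nn.Conv2d(ch, 12, 3, padding=1),
                                nn.PixelShuffle(2))

    def forward(self, x):
        feat = self.body(x)
        return self.up(feat), feat  # expose features for distillation

teacher, student = TinySR(64).eval(), TinySR(16)
adapt = nn.Conv2d(16, 64, 1)  # match student channels to the teacher's

lr_img = torch.randn(4, 3, 24, 24)   # low-resolution batch
hr_img = torch.randn(4, 3, 48, 48)   # ground-truth high-resolution batch
with torch.no_grad():
    _, t_feat = teacher(lr_img)
sr, s_feat = student(lr_img)
loss = F.l1_loss(sr, hr_img) + 0.1 * F.l1_loss(adapt(s_feat), t_feat)
loss.backward()
```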

Continuous Korean Named Entity Recognition Using Knowledge Distillation

  • Junseo Jang;Seongsik Park;Harksoo Kim
    • Annual Conference on Human and Language Technology
    • /
    • 2023.10a
    • /
    • pp.505-509
    • /
    • 2023
  • Named entity recognition (NER) is the task of identifying and extracting entities of specific types from a given text. Conventional deep-learning-based NER defines all entity types in advance and then trains a model. In real training environments, however, new entity types can keep appearing, and the data used to learn the existing types may no longer be accessible. Manually re-tagging new data with the existing entity types in order to retrain a model also costs considerable time and money. Several methods have been proposed as remedies, but they exhibit forgetting of previously learned entity knowledge while learning new types. This paper shows that continual learning with knowledge distillation is effective for Korean NER, reducing forgetting of existing knowledge while learning new knowledge. Experiments and evaluations on NER data provided by the National Institute of Korean Language demonstrate the method's strong performance. (A minimal sketch of distilling old-label predictions appears below.)

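A minimal sketch of the continual-learning-with-distillation idea from the entry above: the frozen previous-step model provides soft targets over the old entity labels while an expanded head learns all labels. The label counts, dimensions, and distillation weight are illustrative, not the paper's.

```python
import torch
import torch.nn.functional as F

old_n, new_n = 5, 3                        # e.g. 5 known tags, 3 new tags
tokens = torch.randn(2, 16, 256)           # (batch, seq_len, hidden) encodings

old_head = torch.nn.Linear(256, old_n).eval()    # frozen previous head
new_head = torch.nn.Linear(256, old_n + new_n)   # trainable expanded head

with torch.no_grad():
    teacher_logits = old_head(tokens)      # soft targets for old classes

logits = new_head(tokens)
gold = torch.randint(0, old_n + new_n, (2, 16))  # current-task labels

ce = F.cross_entropy(logits.reshape(-1, old_n + new_n), gold.reshape(-1))
kd = F.kl_div(
    F.log_softmax(logits[..., :old_n], dim=-1).reshape(-1, old_n),
    F.softmax(teacher_logits, dim=-1).reshape(-1, old_n),
    reduction="batchmean")
(ce + 1.0 * kd).backward()  # the KD term penalizes forgetting old classes
```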

A Study of Lightening Super-Resolution Networks Using Self-Distillation

  • Lee, Yeojin;Park, Hanhoon
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2022.06a
    • /
    • pp.221-223
    • /
    • 2022
  • Convolutional neural networks (CNNs) have recently shown excellent performance and are widely used in various computer vision fields, including super-resolution. However, CNNs are computationally intensive and require large amounts of memory, making them hard to deploy on hardware-constrained mobile or Internet of Things (IoT) devices. To overcome this limitation, lightweighting research that reduces a pre-trained deep CNN's depth or size while preserving as much of its performance as possible is being actively pursued. This paper applies self-distillation, a form of the network lightweighting technique knowledge distillation, to a super-resolution CNN and evaluates and analyzes its performance. Experimental results on quantitative metrics confirm that self-distillation can also yield a lightweight super-resolution model with good performance. (A sketch of one self-distillation setup appears below.)

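The abstract does not detail which self-distillation variant was used, so the following is one common construction, sketched under assumptions: a single SR network with an early exit, where the shallow path learns from both the ground truth and the same network's deeper (detached) output, so the shallow path can later be deployed alone.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfDistillSR(nn.Module):
    def __init__(self, ch=32):
        super().__init__()
        self.head = nn.Conv2d(3, ch, 3, padding=1)
        self.block1 = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU())
        self.block2 = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU())
        # 12 channels -> PixelShuffle(2) -> 3-channel image at 2x size
        self.exit_early = nn.Sequential(nn.Conv2d(ch, 12, 3, padding=1),
                                        nn.PixelShuffle(2))
        self.exit_final = nn.Sequential(nn.Conv2d(ch, 12, 3, padding=1),
                                        nn.PixelShuffle(2))

    def forward(self, x):
        f1 = self.block1(self.head(x))
        f2 = self.block2(f1)
        return self.exit_early(f1), self.exit_final(f2)

model = SelfDistillSR()
lr_img, hr_img = torch.randn(4, 3, 24, 24), torch.randn(4, 3, 48, 48)
early, final = model(lr_img)
loss = (F.l1_loss(final, hr_img)                   # deep path vs ground truth
        + F.l1_loss(early, hr_img)                 # shallow path vs ground truth
        + 0.5 * F.l1_loss(early, final.detach()))  # self-distillation term
loss.backward()
```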

A Study on Lightweight Transformer Based Super Resolution Model Using Knowledge Distillation

  • Dong-hyun Kim;Dong-hun Lee;Aro Kim;Vani Priyanka Galia;Sang-hyo Park
    • Journal of Broadcast Engineering
    • /
    • v.28 no.3
    • /
    • pp.333-336
    • /
    • 2023
  • Recently, the transformer model used in natural language processing has also been applied to image super-resolution, showing good performance. However, transformer-based models are complex, have many learnable parameters, and demand substantial hardware resources, which makes them hard to deploy on small mobile devices. This paper therefore proposes a knowledge distillation technique that effectively reduces the size of a transformer-based super-resolution model. Experiments confirm that applying the proposed technique to a student model with a reduced number of transformer blocks yields performance similar to, or higher than, that of the teacher model. (A minimal sketch of the block-reduction idea appears below.)
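A minimal sketch of the student construction described above: the same layer type but fewer transformer blocks, trained to match the teacher's output. nn.TransformerEncoder stands in for the actual super-resolution transformer, whose architecture is not given in the abstract; the depths, dimensions, and token shape are assumptions.

```python
import torch
import torch.nn as nn

def make_model(depth, dim=64):
    layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
    return nn.TransformerEncoder(layer, num_layers=depth)

teacher = make_model(depth=8).eval()   # full-size model, kept frozen
student = make_model(depth=2)          # same layers, 4x fewer blocks

tokens = torch.randn(4, 36, 64)        # e.g. flattened image patch tokens
with torch.no_grad():
    t_out = teacher(tokens)
loss = nn.functional.l1_loss(student(tokens), t_out)  # match teacher output
loss.backward()
```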

Semi-Supervised Domain Adaptation on LiDAR 3D Object Detection with Self-Training and Knowledge Distillation

  • Jungwan Woo;Jaeyeul Kim;Sunghoon Im
    • The Journal of Korea Robotics Society
    • /
    • v.18 no.3
    • /
    • pp.346-351
    • /
    • 2023
  • With the release of numerous open driving datasets, the demand for domain adaptation in perception tasks has increased, particularly when transferring knowledge from rich datasets to novel domains. However, two domain shifts are hard to resolve: 1) a sensor-domain shift caused by heterogeneous LiDAR sensors and 2) an environmental-domain shift caused by differing environmental factors. We overcome these domain differences in the semi-supervised setting with three-stage model parameter training. First, we pre-train the model on the source dataset with object scaling based on object-size statistics. Then we fine-tune the partially frozen model weights with copy-and-paste augmentation, in which the 3D points inside box labels are copied from one scene and pasted into other scenes. Finally, we update the student network with a knowledge distillation method that maintains the teacher network as a moving average, combined with self-training on pseudo labels. Test-time augmentation with varying z values is employed to produce the final predictions. Our method achieved 3rd place in the ECCV 2022 Workshop on 3D Perception for Autonomous Driving challenge. (A sketch of the moving-average teacher update appears below.)
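A sketch of the teacher update described in the entry above: the teacher is an exponential moving average (EMA) of the student, and its predictions on unlabeled target-domain data serve as pseudo labels for self-training. The detector is abstracted here as a generic module, and the momentum value and tensor shapes are assumptions.

```python
import copy
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    # teacher <- momentum * teacher + (1 - momentum) * student
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.mul_(momentum).add_(s_p, alpha=1.0 - momentum)

student = torch.nn.Linear(16, 4)       # stand-in for the 3D detector
teacher = copy.deepcopy(student).eval()

x = torch.randn(8, 16)                 # stand-in for target-domain features
with torch.no_grad():
    pseudo = teacher(x).argmax(dim=1)  # pseudo labels from the teacher
loss = F.cross_entropy(student(x), pseudo)
loss.backward()                        # train the student on pseudo labels
ema_update(teacher, student)           # then advance the EMA teacher
```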

Surface Treatment of Air Gap Membrane Distillation (AGMD) Condensation Plates: Techniques and Influences on Module Performance

  • Harianto, Rachel Ananda;Aryapratama, Rio;Lee, Seockheon;Jo, Wonjin;Lee, Heon Ju
    • Applied Science and Convergence Technology
    • /
    • v.23 no.5
    • /
    • pp.248-253
    • /
    • 2014
  • Air gap membrane distillation (AGMD) is one of several technologies that can be used to address fresh-water availability problems. AGMD exhibits several advantages, including low conductive heat loss and higher thermal efficiency, owing to the air gap between the membrane and the condensation wall. A previous study by Bhardwaj found that the condensation surface properties (material and contact angle) affected the total fresh water collected in solar distillation. However, the differences in process conditions between solar distillation and AGMD might lead to different condensation phenomena. In contrast, N. Miljkovic showed that a hydrophobic surface has higher condensation heat transfer. Moreover, to the best of our knowledge, no study has investigated how the condensation surface properties in AGMD affect overall process performance (i.e., flux and thermal efficiency). In this study, we therefore treated the AGMD condensation surface to make it hydrophobic or hydrophilic. The surface could be made hydrophilic by immersing and boiling the plate in deionized (DI) water, which formed hydrophilic aluminum hydroxide (AlOOH) nanostructures; afterwards, the treated plate was coated with hexamethyldisiloxane (HMDSO) by plasma-enhanced chemical vapor deposition (PECVD). The results indicated that the condensation surface properties do not significantly affect permeate flux or thermal efficiency. In general, the permeate flux and thermal efficiency of the treated plates were lower than those of the non-treated (pristine) plate, although at 1 mm and 3 mm air gaps the treated plates outperformed the pristine plate in permeate flux. Thus, although the effect of surface wettability was not significant, it still had a small influence.