• Title/Summary/Keyword: transformer model

Search Results: 611

Pulse Multiplication in Autotransformer Based AC-DC Converters using a Zigzag Connection

  • Singh, Bhim; Gairola, Sanjay
    • Journal of Power Electronics / v.7 no.3 / pp.191-202 / 2007
  • This paper deals with pulse multiplication in zigzag-connected autotransformer-based 12-pulse AC-DC converters feeding vector controlled induction motor drives (VCIMD), improving the power quality at the point of common coupling (PCC) without using a Zero-Sequence-Blocking-Transformer (ZSBT). The proposed 24-pulse AC-DC converter is based on the principle of the DC ripple re-injection technique for pulse multiplication and harmonic mitigation. The design of the autotransformer is carried out for the proposed AC-DC converter, and the effect of load variation on the VCIMD is also studied to demonstrate the converter's effectiveness. Test results from a laboratory-developed prototype, along with simulated results, are presented to validate the design and model of the proposed 24-pulse AC-DC converter.

A Current Compensation Algorithm for a CT Saturation (CT 포화 복원 알고리즘)

  • Yi, Xiao-Li; Kang, Sang-Hee; Lee, Dong-Gyu; Kang, Yong-Cheol
    • Proceedings of the KIEE Conference / 2003.11a / pp.88-90 / 2003
  • In this paper, an algorithm to compensate the distorted signals due to CT (Current Transformer) saturation is suggested. Firstly, the WT (Wavelet Transform) is used to detect the start point and the end point of saturation. Filter banks, which can be easily realized in real-time applications, are employed in detecting CT saturation. Secondly, a least-squares curve-fitting method is used to restore the distorted section of the secondary current. Fault simulations are performed on a power system model using EMTP (Electromagnetic Transients Program). A series of test results indicates that the WT achieves superior detection accuracy and that the proposed algorithm, which remains very stable under various levels of remanent flux, is also satisfactory.
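The restoration step described in this abstract, fitting the healthy portion of the secondary current and extrapolating through the saturated interval, can be sketched with an ordinary least-squares fit of a sinusoid. This is a generic illustration under assumed values (60 Hz signal, synthetic samples), not the paper's exact filter-bank implementation:

```python
import math

def fit_sinusoid(ts, ys, omega):
    """Least-squares fit of y(t) = a*sin(omega*t) + b*cos(omega*t)
    over the unsaturated (healthy) samples, via 2x2 normal equations."""
    Sss = sum(math.sin(omega * t) ** 2 for t in ts)
    Scc = sum(math.cos(omega * t) ** 2 for t in ts)
    Ssc = sum(math.sin(omega * t) * math.cos(omega * t) for t in ts)
    Sys = sum(y * math.sin(omega * t) for t, y in zip(ts, ys))
    Syc = sum(y * math.cos(omega * t) for t, y in zip(ts, ys))
    det = Sss * Scc - Ssc * Ssc
    a = (Sys * Scc - Syc * Ssc) / det
    b = (Syc * Sss - Sys * Ssc) / det
    return a, b

# Synthetic 60 Hz secondary current, healthy before saturation starts at ~4 ms.
omega = 2 * math.pi * 60
healthy_t = [k * 0.0002 for k in range(20)]   # 0 .. 3.8 ms, 0.2 ms steps
healthy_y = [5.0 * math.sin(omega * t) for t in healthy_t]
a, b = fit_sinusoid(healthy_t, healthy_y, omega)

# Restore (extrapolate) the distorted section from the fitted model.
restored = [a * math.sin(omega * t) + b * math.cos(omega * t)
            for t in (0.005, 0.006)]
```

In the paper the saturation window itself is located first by the wavelet-based detector; the fit is then taken only over samples outside that window.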


Dynamic Characteristic Analysis of Multi-bridge PWM Inverter SSSC (다중브리지 PWM 인버터로 구성된 SSSC의 동특성 분석)

  • Bae B.Y.; Park S.H.; Ha Y.C.; Kim H.J.; Han B.M.; Kim H.W.
    • Proceedings of the KIPE Conference / 2001.07a / pp.685-688 / 2001
  • This paper proposes an SSSC based on multi-bridge inverters. The dynamic characteristic of the proposed SSSC was analyzed by EMTP simulation and a scaled hardware model, assuming that the SSSC is inserted in the transmission line of a one-machine-infinite-bus power system. The proposed SSSC has 6 multi-bridge inverters per phase, which generate 13 pulses for each half period of the power frequency. The proposed SSSC generates a quasi-sinusoidal output voltage with a 90-degree phase shift relative to the line current. It does not require a coupling transformer for voltage injection and offers flexibility in operating voltage by increasing the number of series connections.


Analysis of Flyback Converter Transformer with Capacitor Model (Capacitor 모델을 이용한 플라이백 컨버터 변압기의 해석)

  • Kim, Chun-hui; Im, Hye-yeong; Shin, Yong-hwan; Shin, Hwi-beom
    • Proceedings of the KIPE Conference / 2012.07a / pp.520-521 / 2012
  • The flyback converter is one of the most widely used low-power converters because of its simple structure compared with other converters. Conventionally, the transformer in a flyback converter has been analyzed using an electrical equivalent circuit. However, as recent applications demand increasingly precise converter characteristics, an electric circuit that only approximates the transformer has limitations. This paper therefore applies gyrator-capacitor modeling to the flyback converter transformer to enable an analysis and design that are both intuitive and accurate.


A Theoretical Study on Voltage Drop of Auto-Transformer for Railway Vehicle Base (철도차량기지용 단권변압기의 전압강하에 대한 이론적 고찰)

  • Yu, Ki-Seong; Kim, Jae-Moon
    • The Transactions of The Korean Institute of Electrical Engineers / v.67 no.12 / pp.1723-1728 / 2018
  • In order to investigate the voltage-drop compensation effect of the autotransformer (AT) for a domestic railway vehicle base, the parameters governing the AT voltage drop of the railway vehicle base are Z3 (impedance of the feeder line), Xn (distance from the railway vehicle through the AT to the substation, SS), and Dn (distance between the two ATs of the railway vehicle base). In addition, when the equipment is installed in a sectioning post (SSP) for a railway vehicle base, there is no AT or feeder line in the vehicle base other than the SSP for the main line and the SSP for the vehicle base, so if these terms are set to zero or ignored, it can be confirmed that the feeding system reduces to an AC single-phase two-wire form.

DAB Converter Based on Unified High-Frequency Bipolar Buck-Boost Theory for Low Current Stress

  • Kan, Jia-rong; Yang, Yao-dong; Tang, Yu; Wu, Dong-chun; Wu, Yun-ya; Wu, Jiang
    • Journal of Power Electronics / v.19 no.2 / pp.431-442 / 2019
  • This paper proposes a unified high-frequency bipolar buck-boost (UHFBB) control strategy for a dual-active-bridge (DAB) converter, derived from the classical buck and boost DC/DC converters. It can achieve optimized current stress of the switches and soft switching over a wider range. The UHFBB control strategy includes multiple control variables, which are obtained through an algorithm derived from an accurate mathematical model. The design method for the parameters, such as the transformer turns ratio and the inductance, is shown. The current stress of the switches is analyzed for selecting an optimal inductor. The analysis is verified by experimental results on a 500 W prototype.

Explaining the Translation Error Factors of Machine Translation Services Using Self-Attention Visualization (Self-Attention 시각화를 사용한 기계번역 서비스의 번역 오류 요인 설명)

  • Zhang, Chenglong; Ahn, Hyunchul
    • Journal of Information Technology Services / v.21 no.2 / pp.85-95 / 2022
  • This study analyzed the translation error factors of machine translation services such as Naver Papago and Google Translate through Self-Attention path visualization. Self-Attention is a key mechanism of the Transformer and BERT NLP models and has recently been widely used in machine translation. We propose a method to explain the translation error factors of machine translation algorithms by comparing the Self-Attention paths of an ST (source text) and an ST' (a transformed ST whose meaning is unchanged but whose translation output is more accurate). This method provides explainability for the internal process of a machine translation algorithm, which is otherwise invisible, like a black box. In our experiment, it was possible to explore the factors that caused translation errors by analyzing the differences in the key words' attention paths. The study used the XLM-RoBERTa multilingual NLP model provided by exBERT for Self-Attention visualization, applied to two examples: Korean-Chinese and Korean-English translation.
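As a minimal illustration of the attention paths this study visualizes: scaled dot-product self-attention produces, for each token, a probability distribution over all tokens, and it is these rows that tools like exBERT render as paths. The toy embeddings below are assumptions for illustration, not values from the models in the study (which also include learned query/key projections, taken here as the identity):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention_weights(X, d):
    """Scaled dot-product self-attention weights softmax(X X^T / sqrt(d)).
    Each row is one token's attention distribution over all tokens."""
    rows = []
    for q in X:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        rows.append(softmax(scores))
    return rows

# Three toy token embeddings: tokens 0 and 2 are similar, so token 0's
# attention "path" leans toward token 2 rather than token 1.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 0.1]]
W = self_attention_weights(X, d=2)
```

Comparing such weight matrices for an ST and its meaning-preserving ST' is the core of the proposed error analysis: a shifted attention path for a key word flags a likely error factor.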

Comparative Study of Deep Learning Algorithm for Detection of Welding Defects in Radiographic Images (방사선 투과 이미지에서의 용접 결함 검출을 위한 딥러닝 알고리즘 비교 연구)

  • Oh, Sang-jin; Yun, Gwang-ho; Lim, Chaeog; Shin, Sung-chul
    • Journal of the Korean Society of Industry Convergence / v.25 no.4_2 / pp.687-697 / 2022
  • An automated system is needed for effective non-destructive testing. In order to utilize the radiographic testing data accumulated on film, welding defects were classified into nine types and the shapes of the defects were analyzed. The data were preprocessed for deep learning models with high performance in image classification, and combinations of one-stage/two-stage methods with convolutional neural network/Transformer backbones were compared to identify a model suitable for welding defect detection. The combination of a two-stage method, which can learn step by step, and a deep-layered CNN backbone showed the best performance, with a mean average precision of 0.868.
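The mean average precision (mAP) figure reported above is the mean, over defect classes, of each class's average precision computed from a ranked list of detections. A minimal sketch of per-class AP, with made-up confidence scores and labels rather than the paper's data:

```python
def average_precision(scores, labels):
    """AP for one class: mean of the precision values taken at each
    true positive, after ranking detections by confidence score.
    labels[i] is 1 if detection i is a correct detection, else 0."""
    ranked = sorted(zip(scores, labels), key=lambda p: -p[0])
    total_pos = sum(labels)
    tp = 0
    precisions = []
    for rank, (_, is_tp) in enumerate(ranked, start=1):
        if is_tp:
            tp += 1
            precisions.append(tp / rank)
    return sum(precisions) / total_pos

# Hypothetical detections for one defect class: confidences and correctness.
ap = average_precision([0.9, 0.8, 0.7, 0.6], [1, 0, 1, 1])
# mAP would then be the mean of such AP values over the nine defect classes.
```

Benchmark suites (e.g. COCO-style evaluation) additionally interpolate the precision-recall curve and average over IoU thresholds; the sketch above shows only the basic ranking-based definition.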

Technical Trends in Hyperscale Artificial Intelligence Processors (초거대 인공지능 프로세서 반도체 기술 개발 동향)

  • W. Jeon; C.G. Lyuh
    • Electronics and Telecommunications Trends / v.38 no.5 / pp.1-11 / 2023
  • The emergence of generative hyperscale artificial intelligence (AI) has enabled new services, such as image-generating AI and conversational AI based on large language models. Such services likely lead to the influx of numerous users, who cannot be handled using conventional AI models. Furthermore, the exponential increase in training data, computations, and high user demand of AI models has led to intensive hardware resource consumption, highlighting the need to develop domain-specific semiconductors for hyperscale AI. In this technical report, we describe development trends in technologies for hyperscale AI processors pursued by domestic and foreign semiconductor companies, such as NVIDIA, Graphcore, Tesla, Google, Meta, SAPEON, FuriosaAI, and Rebellions.

Korean Pre-trained Model KE-T5-based Automatic Paper Summarization (한국어 사전학습 모델 KE-T5 기반 자동 논문 요약)

  • Seo, Hyeon-Tae; Shin, Saim; Kim, San
    • Annual Conference on Human and Language Technology / 2021.10a / pp.505-506 / 2021
  • Recently, research on automatically summarizing the vast amount of text growing exponentially on the Internet has been actively conducted. Automatic text summarization has advanced considerably with the emergence of various pre-trained models. In particular, models based on T5 (Text-to-Text Transfer Transformer) show excellent performance on automatic text summarization and achieve the state of the art (SOTA) in the field. In this paper, we perform and evaluate automatic paper summarization using KE-T5, a pre-trained model trained on a large amount of Korean text.
