• Title/Summary/Keyword: 작업지능


AI/BIG DATA-based Smart Factory Technology Status Analysis for Effective Display Manufacturing (효과적인 디스플레이 제조를 위한 AI/BIG DATA 기반 스마트 팩토리 기술 현황 분석)

  • Jung, Sukwon;Lim, Huhnkuk
    • Journal of the Korea Institute of Information and Communication Engineering / v.25 no.3 / pp.471-477 / 2021
  • In the display field, a smart factory means manufacturing displays more efficiently by applying AI/BIG DATA technology not only to job automation but also to existing process management, transport facilities, process anomaly handling, and defect classification. In the past, when defects appeared in the display manufacturing process, classifying them and responding to process abnormalities were handled separately, and a great deal of time was consumed in doing so. Because display manufacturing requires advanced process equipment, the competitiveness of the display manufacturing industry depends on quickly identifying the cause of defects and raising the yield. In this paper, we summarize cases in which smart factory AI/BIG DATA technology has been applied to domestic display manufacturing and analyze the advantages that can be gained over existing methods. This information can serve as prior knowledge for developing improved smart factories in the display manufacturing field using AI/BIG DATA.
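
As an illustration of the defect-classification step mentioned in the abstract above, the following is a minimal sketch of a supervised classifier trained on logged process features. It assumes scikit-learn, and the synthetic feature set and defect categories are illustrative placeholders, not data from the paper.

```python
# Minimal sketch: classifying display panel defects from process sensor features.
# The feature names and defect labels are illustrative assumptions, not taken
# from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Stand-in for logged process data: temperature, pressure, deposition time, etc.
X = rng.normal(size=(1000, 4))
# Stand-in defect labels: 0 = normal, 1 = mura, 2 = particle, 3 = scratch
y = rng.integers(0, 4, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```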

Analysis of the Ripple Effect of the US Federal Reserve System's Quantitative Easing Policy on Stock Price Fluctuations (미국연방준비제도의 양적완화 정책이 주가 변동에 미치는 영향 분석)

  • Hong, Sunghyuck
    • Journal of Digital Convergence / v.19 no.3 / pp.161-166 / 2021
  • Macroeconomics describes the movement of a country's economy as a whole and affects the overall economic activities of businesses, government, and households. By looking at changes in national income, inflation, unemployment, currency, interest rates, and raw material prices, the macroeconomy shows how the actions and interactions of economic actors affect the prices of products and services. The US Federal Reserve System (FED) has been leading the world economy by offering various stimulus measures to overcome the COVID-19 recession. Although stock prices kept falling until March 20, 2020 because of the recession caused by COVID-19, the US S&P 500 index began rebounding after March 23 and, driven by the FED's powerful quantitative easing, rose to 3,694.62 as of December 15. Stock price movements are therefore influenced more by the FED's macroeconomic stimulus measures than by forecasts drawn from corporate financial statements. This study analyzes the FED's economic stimulus measures and their effect on stock prices in order to reduce losses in stock investment and establish sound investment practices.
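
As a hedged illustration of the kind of analysis the abstract describes, the sketch below relates the Fed's balance sheet to the S&P 500 over 2020. It assumes pandas_datareader with network access to FRED and uses the public series codes WALCL and SP500; it is not the authors' methodology.

```python
# Minimal sketch: co-movement of the Fed balance sheet and the S&P 500 around
# the 2020 quantitative-easing period. Assumes pandas_datareader and network
# access to FRED; WALCL = Fed total assets, SP500 = index level.
import pandas_datareader.data as web

start, end = "2020-01-01", "2020-12-31"
fed_assets = web.DataReader("WALCL", "fred", start, end)   # weekly, $ millions
sp500 = web.DataReader("SP500", "fred", start, end)        # daily index level

# Align on common dates and measure co-movement.
joined = fed_assets.join(sp500, how="inner").dropna()
print(joined.corr())
```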

Implementation of FPGA-based Accelerator for GRU Inference with Structured Compression (구조적 압축을 통한 FPGA 기반 GRU 추론 가속기 설계)

  • Chae, Byeong-Cheol
    • Journal of the Korea Institute of Information and Communication Engineering / v.26 no.6 / pp.850-858 / 2022
  • To deploy Gated Recurrent Units (GRU) on resource-constrained embedded devices, this paper presents a reconfigurable FPGA-based GRU accelerator that supports structured compression. First, a dense GRU model is significantly reduced in size by hybrid quantization and structured top-k pruning. Second, the energy consumed by external memory access is greatly reduced by the proposed reuse computing pattern. Finally, the accelerator can handle a structured sparse model, benefiting from an algorithm-hardware co-design workflow. Moreover, inference tasks can be performed flexibly for any feature dimension, sequence length, and number of layers. Implemented on the Intel DE1-SoC FPGA, the proposed accelerator achieves 45.01 GOPs on a structured sparse GRU network without batching. Compared to CPU and GPU implementations, the low-cost FPGA accelerator achieves 57x and 30x improvements in latency and 300x and 23.44x improvements in energy efficiency, respectively. The proposed accelerator thus serves as an early study toward real-time embedded applications and demonstrates potential for further development.
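
To make the compression ideas named in the abstract concrete, the following is a rough NumPy sketch of row-wise top-k pruning (one structured sparsity pattern) combined with simple 8-bit quantization. The paper's hybrid quantization and pruning scheme may differ in detail.

```python
# Rough NumPy illustration of structured top-k pruning and uniform int8
# quantization applied to a stand-in GRU weight matrix.
import numpy as np

def topk_prune_rows(weight: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude weights in each row."""
    pruned = np.zeros_like(weight)
    idx = np.argsort(np.abs(weight), axis=1)[:, -k:]   # top-k column indices per row
    rows = np.arange(weight.shape[0])[:, None]
    pruned[rows, idx] = weight[rows, idx]
    return pruned

def quantize_int8(weight: np.ndarray):
    """Symmetric uniform quantization to int8 with a per-tensor scale."""
    scale = np.abs(weight).max() / 127.0
    q = np.round(weight / scale).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
W = rng.normal(size=(128, 256))          # stand-in for one GRU gate matrix
W_sparse = topk_prune_rows(W, k=32)      # 87.5% of each row pruned
W_q, scale = quantize_int8(W_sparse)
print("sparsity:", 1 - np.count_nonzero(W_sparse) / W_sparse.size)
```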

Crowdsourcing based Local Traffic Event Detection Scheme (크라우드 소싱 기반의 지역 교통 이벤트 검출 기법)

  • Kim, Yuna;Choi, Dojin;Lim, Jongtae;Kim, Sanghyeuk;Kim, Jonghun;Bok, Kyoungsoo;Yoo, Jaesoo
    • The Journal of the Korea Contents Association / v.22 no.4 / pp.83-93 / 2022
  • Research is underway to solve traffic problems using crowdsourcing, in which drivers provide traffic information through their mobile devices. Using crowdsourcing for traffic event detection reduces the effort of collecting related data, which lowers the time cost and increases accuracy. In this paper, we propose a scheme that collects traffic-related data through crowdsourcing and detects events affecting traffic from it. The proposed scheme uses machine learning algorithms suited to processing large amounts of data to determine the event type of the collected data. In addition, to find out where an event occurred, a keyword indicating the location is extracted from the collected data and mapped to its administrative area. In this way, locations that are only broadly defined, or that are incorrect in the existing location information, can be resolved. Various performance evaluations are performed to prove the superiority and feasibility of the proposed scheme.
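
A minimal sketch of the two steps described in the abstract, assuming scikit-learn: classify a crowdsourced report into an event type with a simple text classifier, then match the report against a list of administrative-district keywords to resolve the location. The training sentences and district list are illustrative stand-ins.

```python
# Minimal sketch: event-type classification plus keyword-based location lookup
# for crowdsourced traffic reports. All texts and districts are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reports = [
    "accident blocking two lanes near city hall",
    "heavy congestion on the river bridge",
    "road construction closing the tunnel entrance",
    "multi-car accident at the main intersection",
]
labels = ["accident", "congestion", "construction", "accident"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(reports, labels)

districts = ["city hall", "river bridge", "tunnel entrance", "main intersection"]

def detect_event(report: str):
    """Return (event type, matched administrative-area keyword) for one report."""
    event_type = clf.predict([report])[0]
    location = next((d for d in districts if d in report), None)
    return event_type, location

print(detect_event("minor accident reported near the river bridge"))
```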

A Case Study on the Development of a Curriculum by using NCS (NCS를 활용한 교육과정 개발 사례 연구)

  • Baek, Jinwook
    • Journal of Creative Information Culture / v.7 no.4 / pp.217-224 / 2021
  • Because of the development of Fourth Industrial Revolution technologies such as artificial intelligence, and because of the COVID-19 situation, universities have been changing rapidly. As the college admissions environment also grows more difficult, departments and majors at universities are changing quickly. A university department therefore needs curricula for new or convergence majors that fit its internal and external environment. However, developing such a curriculum accurately is a very difficult and time-consuming task. This paper proposes a curriculum development method that uses the National Competency Standards (NCS), together with an application case. The proposed method consists of the traditional three steps of analysis, design, and development, with NCS used in each step. The method focuses on reducing the time needed for curriculum development, and an example of curriculum development is presented to show its usefulness.

Machine Learning Language Model Implementation Using Literary Texts (문학 텍스트를 활용한 머신러닝 언어모델 구현)

  • Jeon, Hyeongu;Jung, Kichul;Kwon, Kyoungah;Lee, Insung
    • The Journal of the Convergence on Culture Technology / v.7 no.2 / pp.427-436 / 2021
  • The purpose of this study is to implement a machine learning language model that learns from literary texts. An important characteristic of literary texts is that question-and-answer pairs are often not clearly distinguished. Literary texts also contain pronouns, figurative expressions, soliloquies, and similar devices, which make them difficult for learning algorithms to handle and have discouraged their use in machine learning. Yet algorithms that learn from literary texts can produce more human-friendly interactions than algorithms that learn from ordinary sentences. To this end, this paper proposes three text correction tasks that must precede the use of literary texts in a machine learning language model: pronoun processing, dialogue pair expansion, and data amplification. Training data for artificial intelligence should carry clear meanings so that machine learning is easier and more effective. Introducing special genres of text such as literature into natural language processing research is expected not only to expand the scope of machine learning but also to demonstrate a new method of language learning.
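
The sketch below illustrates, in very reduced form, the three text correction tasks the abstract names. The pronoun rules, dialogue pairing, and amplification strategy are simplified assumptions, not the paper's actual procedures.

```python
# Schematic sketch of pronoun processing, dialogue pair expansion, and data
# amplification for literary dialogue. All rules and texts are toy stand-ins.
import re

def resolve_pronouns(sentence: str, antecedents: dict) -> str:
    """Replace pronouns with the character names they refer to (toy rule)."""
    for pronoun, name in antecedents.items():
        sentence = re.sub(rf"\b{pronoun}\b", name, sentence)
    return sentence

def expand_dialogue_pairs(turns: list) -> list:
    """Turn a run of dialogue into adjacent (utterance, response) pairs."""
    return list(zip(turns, turns[1:]))

def amplify(pairs: list) -> list:
    """Naive data amplification: also pair each utterance with later responses."""
    out = []
    for i, (question, _) in enumerate(pairs):
        for _, answer in pairs[i:]:
            out.append((question, answer))
    return out

turns = ["Where has he gone?", "He left for the station.", "Then we should follow him."]
turns = [resolve_pronouns(t, {"he": "Holmes", "He": "Holmes", "him": "Holmes"}) for t in turns]
pairs = expand_dialogue_pairs(turns)
print(amplify(pairs))
```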

Text Data Analysis Model Based on Web Application (웹 애플리케이션 기반의 텍스트 데이터 분석 모델)

  • Jin, Go-Whan
    • The Journal of the Korea Contents Association / v.21 no.11 / pp.785-792 / 2021
  • Since the Fourth Industrial Revolution, advances in technologies such as artificial intelligence and big data have brought changes throughout society. The amount of data that can be collected while applying these key technologies tends to grow rapidly. In academia in particular, existing literature data are analyzed to grasp research trends; such analyses organize the flow of research, summarize research methodologies and themes, and, by identifying the subjects currently being discussed, contribute greatly to setting the direction of future research. However, without programming expertise it is difficult to collect the data needed for literature analysis and to carry out the analysis. In this paper, we propose a text mining-based topic modeling web application model. Through the proposed model, even users who lack specialized knowledge of data analysis methods can perform tasks such as collecting, storing, and text-analyzing research papers, and researchers can analyze prior studies and research trends. The model is expected to reduce the time and effort required for data analysis and for understanding research trends.
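
A minimal sketch of the core text-mining step such a web application would wrap: LDA topic modeling over collected abstracts using scikit-learn. The documents below are placeholders, and the collection, storage, and web front-end layers described in the paper are not shown.

```python
# Minimal sketch: LDA topic modeling over a few placeholder paper abstracts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "deep learning model for image defect classification",
    "big data platform for smart factory process monitoring",
    "topic modeling of research trends in artificial intelligence",
    "statistical analysis of manufacturing yield improvement",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the five highest-weighted terms for each discovered topic.
terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {top}")
```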

Ground Test of Docking Phase for Nanosatellite (초소형위성 지상 환경 도킹 시험)

  • Kim, Hae-Dong;Choi, Won-Sub;Kim, Min-Ki;Kim, Jin-Hyung;Kim, KiDuck;Kim, Ji-Seok;Cho, Dong-Hyun
    • Journal of Space Technology and Applications / v.1 no.1 / pp.7-22 / 2021
  • In this paper, we describe the results of a docking-phase test, conducted in a ground environment, of a rendezvous/docking technology verification satellite that is being developed for the first time in Korea. Rendezvous/docking is an advanced space technology and is essential for approaching and performing tasks on other objects in orbit. We describe ground test results in which the chaser docks with a fixed target on an air-bearing facility. Based on the thrust control algorithm for the docking phase and the algorithms for relative object recognition and relative distance estimation using vision-based sensors validated in this paper, we intend to extend the work to rendezvous/docking algorithms in three-dimensional space for testing in space.
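
As one hedged illustration of vision-based relative distance estimation mentioned in the abstract, the sketch below applies the pinhole-camera relation distance = f * W_real / w_pixels to a marker of known size. The marker size, focal length, and pixel measurement are assumed values, not figures from the test.

```python
# Minimal sketch: relative distance from apparent marker size (pinhole model).
# All numbers below are illustrative assumptions.

def relative_distance(marker_width_m: float, focal_px: float, width_px: float) -> float:
    """distance = f * W_real / w_pixels for a pinhole camera."""
    return focal_px * marker_width_m / width_px

# Example: a 0.10 m marker seen 80 px wide with an 800 px focal length
# corresponds to a 1.0 m separation between chaser and target.
print(relative_distance(0.10, 800.0, 80.0))
```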

Introduction and Analysis of Open Source Software Development Methodology (오픈소스 SW 개발 방법론 소개 및 분석)

  • Son, Kyung A;Yun, Young-Sun
    • Journal of Software Assessment and Valuation / v.16 no.2 / pp.163-172 / 2020
  • Recently, concepts of Fourth Industrial Revolution technologies such as artificial intelligence, big data, and cloud computing have been introduced, and the limits of development policies run by individuals or single teams are being reconsidered. The source code of many of the latest technologies has also been opened to the public, and related studies are being conducted on that basis. Meanwhile, companies are applying the strengths of the open source software development methodology to proprietary software development and publicly announcing their support for open source development. In this paper, we introduce several software development methodologies that have been actively discussed recently, namely the open source model, the inner source model, and the related DevOps model, and compare their characteristics and components. Rather than claiming the excellence of a specific model, we argue that if individuals or organizations establish their software development policies according to the benefits of each model, they will be able to improve software quality while satisfying customer requirements.

A Study on the Deep Learning-Based Textbook Questionnaires Detection Experiment (딥러닝 기반 교재 문항 검출 실험 연구)

  • Kim, Tae Jong;Han, Tae In;Park, Ji Su
    • KIPS Transactions on Software and Data Engineering / v.10 no.11 / pp.513-520 / 2021
  • Recently, research on edutech, which combines education and technology in the e-learning field covering learning, education, and training, has been active, but collecting and using data tailored to individual learners on the basis of learning activity data that can be gathered automatically from digital devices is still insufficient. This study therefore attempts to detect question items in textbooks or exam papers using artificial intelligence computer vision technology, which plays the same role as human eyes. The textbook and exam-item detection model proposed in this study can help collect, store, and analyze offline learning activity data in connection with intelligent education services, without requiring digital conversion of textbooks or exam papers, so that personalized learning services can be provided to learners even in offline learning.
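
A hedged sketch of how such an item-detection model could be set up, assuming PyTorch/torchvision: a Faster R-CNN whose box-predictor head is replaced to detect a single "question item" class on scanned pages. The paper's actual architecture and training data are not specified here.

```python
# Hedged sketch: torchvision Faster R-CNN adapted to one "question item" class.
# The random tensor stands in for a scanned textbook page; no trained weights
# for the new head are implied.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

num_classes = 2  # background + question item

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

model.eval()
page = torch.rand(3, 800, 600)            # stand-in for a scanned page image
with torch.no_grad():
    detections = model([page])[0]         # dict with boxes, labels, scores
print(detections["boxes"].shape)
```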