• Title/Summary/Keyword: Performance verification

Search Results: 2,568

Data analysis by Integrating statistics and visualization: Visual verification for the prediction model (통계와 시각화를 결합한 데이터 분석: 예측모형 대한 시각화 검증)

  • Mun, Seong Min;Lee, Kyung Won
    • Design Convergence Study
    • /
    • v.15 no.6
    • /
    • pp.195-214
    • /
    • 2016
  • Predictive analysis is based on probabilistic learning algorithms such as pattern recognition or machine learning. Consequently, users who want to extract more information from the data need a high level of statistical knowledge, and it is difficult to identify patterns and characteristics in the data. This study conducted statistical and visual data analyses to compensate for these weaknesses of predictive analysis, and found several implications that had not been identified in previous studies. First, data patterns could be found by adjusting the data selection according to the splitting criteria of the decision tree method. Second, it was possible to determine which types of data were included in the final prediction model. In the statistical analysis, we found relationships among multiple variables and derived a prediction model for high box office performance; in the visualization analysis, we proposed a visual analysis method with various interactive functions. Finally, through this study we verified the final prediction model and suggested an analysis method that extracts a variety of information from the data.
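  The abstract describes comparing decision-tree splitting criteria and checking which variables end up in the final model. The sketch below (not the authors' code; the movie-style feature names and the binary "high box office" label are hypothetical) shows one way to do that comparison with scikit-learn.

```python
# Minimal sketch: compare splitting criteria and list which variables the
# fitted decision tree actually uses. Features and labels are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
feature_names = ["screens", "star_power", "genre_score", "release_month"]
X = rng.random((500, len(feature_names)))           # hypothetical movie features
y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)     # hypothetical "high box office" label

for criterion in ("gini", "entropy"):               # two common splitting criteria
    tree = DecisionTreeClassifier(criterion=criterion, max_depth=3, random_state=0)
    tree.fit(X, y)
    used = [feature_names[i] for i in set(tree.tree_.feature) if i >= 0]
    print(criterion, "-> variables used in the fitted tree:", used)
```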

COVID-19-related Korean Fake News Detection Using Occurrence Frequencies of Parts of Speech (품사별 출현 빈도를 활용한 코로나19 관련 한국어 가짜뉴스 탐지)

  • Jihyeok Kim;Hyunchul Ahn
    • Journal of Intelligence and Information Systems
    • /
    • v.29 no.2
    • /
    • pp.267-283
    • /
    • 2023
  • The COVID-19 pandemic, which began in December 2019 and continues to this day, has left the public needing information to help them cope with it. However, COVID-19-related fake news on social media seriously threatens public health. In particular, if fake news related to COVID-19 spreads massively with similar content, the time required to verify whether it is genuine or fake is prolonged, posing a severe threat to society. In response, academics have been actively researching intelligent models that can quickly detect COVID-19-related fake news, but the data used in most existing studies are in English, and studies on Korean fake news detection are scarce. In this study, we collect Korean COVID-19-related fake news spread on social media and use it to propose an intelligent fake news detection model. The proposed model utilizes the occurrence frequencies of parts of speech, one of the linguistic characteristics, to improve the prediction performance of a fake news detection model based on Doc2Vec, a document embedding technique mainly used in prior studies. The empirical analysis shows that the proposed model identifies Korean COVID-19-related fake news more accurately, increasing the recall and F1 score compared to the comparison model.
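  The model combines Doc2Vec document embeddings with part-of-speech frequency features. The following is a minimal sketch under stated assumptions, not the authors' implementation: it assumes gensim for Doc2Vec and the konlpy Okt tagger for Korean POS tagging, and the sample documents, labels, and chosen tag subset are hypothetical placeholders.

```python
# Sketch: augment Doc2Vec embeddings with normalized POS-frequency features.
from collections import Counter
import numpy as np
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from konlpy.tag import Okt   # assumes a Korean morphological analyzer is installed

docs = ["코로나19 백신이 위험하다는 주장", "보건당국이 새 지침을 발표했다"]  # placeholder texts
labels = [1, 0]              # 1 = fake, 0 = real (hypothetical)

okt = Okt()
tagged = [TaggedDocument(words=[w for w, _ in okt.pos(d)], tags=[str(i)])
          for i, d in enumerate(docs)]
d2v = Doc2Vec(tagged, vector_size=50, min_count=1, epochs=20)

POS_TAGS = ["Noun", "Verb", "Adjective", "Adverb", "Josa"]    # subset of Okt tags
def features(text, idx):
    counts = Counter(tag for _, tag in okt.pos(text))
    total = max(sum(counts.values()), 1)
    pos_freq = [counts[t] / total for t in POS_TAGS]          # normalized POS frequencies
    return np.concatenate([d2v.dv[str(idx)], pos_freq])       # embedding + POS features

X = np.vstack([features(d, i) for i, d in enumerate(docs)])
# X and labels would then feed a downstream classifier (e.g., logistic regression).
```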

Study on Security Policy Distribute Methodology for Zero Trust Environment (제로 트러스트 환경을 위한 보안 정책 배포 방법에 대한 연구)

  • Sung-Hwa Han;Hoo-Ki Lee
    • Convergence Security Journal
    • /
    • v.22 no.1
    • /
    • pp.93-98
    • /
    • 2022
  • Information service technology continues to develop, and information services continue to expand with the IT convergence trend. The perimeter-based security model chosen by many organizations can increase the effectiveness of security technologies. However, in the perimeter-based security model it is very difficult to counter security threats that originate from within. To solve this problem, the zero trust model has been proposed. The zero trust model requires authentication of users and terminal environments, verification of the device security environment, and real-time monitoring and control functions. The operating environments of an information service may vary, and information security management should be able to respond effectively when security threats occur in various systems at the same time. In this study, we proposed an object reference security policy distribution system that can effectively distribute security policies to many systems. It was confirmed that the proposed object reference security policy distribution system can support all of the operating environments of the systems constituting the information service, and since its policy distribution performance was confirmed to be similar to that of other security systems, it was verified to be sufficiently effective. However, since this study assumed that the security threat targets were predefined, additional research is needed on how to identify the breached target for each security threat.
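  The abstract does not publish code, so the sketch below is only an illustration of what an "object reference" style of policy distribution could look like: enforcement points resolve a shared policy object by reference instead of each receiving a full copy. All class and method names here are hypothetical.

```python
# Illustrative sketch of reference-based policy distribution (hypothetical design).
from dataclasses import dataclass

@dataclass
class SecurityPolicy:
    policy_id: str
    rules: dict          # e.g. {user: set(allowed resources)}
    version: int = 1

class PolicyRegistry:
    """Central store; distributing a policy means publishing its reference."""
    def __init__(self):
        self._store: dict[str, SecurityPolicy] = {}

    def publish(self, policy: SecurityPolicy) -> str:
        self._store[policy.policy_id] = policy
        return policy.policy_id            # the reference handed to each system

    def resolve(self, ref: str) -> SecurityPolicy:
        return self._store[ref]

class EnforcementPoint:
    """A system that enforces whatever the referenced policy currently says."""
    def __init__(self, registry: PolicyRegistry, ref: str):
        self.registry, self.ref = registry, ref

    def allow(self, user: str, resource: str) -> bool:
        rules = self.registry.resolve(self.ref).rules
        return resource in rules.get(user, set())

registry = PolicyRegistry()
ref = registry.publish(SecurityPolicy("default", {"alice": {"db"}}))
print(EnforcementPoint(registry, ref).allow("alice", "db"))   # True
```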

Intelligent Motion Pattern Recognition Algorithm for Abnormal Behavior Detections in Unmanned Stores (무인 점포 사용자 이상행동을 탐지하기 위한 지능형 모션 패턴 인식 알고리즘)

  • Young-june Choi;Ji-young Na;Jun-ho Ahn
    • Journal of Internet Computing and Services
    • /
    • v.24 no.6
    • /
    • pp.73-80
    • /
    • 2023
  • The recent steep increase in the minimum hourly wage has raised the burden of labor costs, and the share of unmanned stores has grown in the aftermath of COVID-19. As a result, theft targeting unmanned stores is also increasing. To prevent such theft, "Just Walk Out" style systems rely on LiDAR sensors, weight sensors, and similar hardware, or on manual checks through continuous CCTV monitoring. However, the more expensive the sensors used, the higher the initial and operating costs of the store, and CCTV verification is of limited use because managers cannot monitor it around the clock. In this paper, we propose a low-cost AI image-processing fusion algorithm, usable in unmanned stores, that replaces these sensor- or human-dependent approaches, detects customers who perform abnormal behaviors such as theft, and provides cloud-based notifications. We verify the accuracy of each component algorithm on behavior-pattern data collected from unmanned stores, using motion capture with MediaPipe, object detection with YOLO, and the fusion algorithm, and demonstrate the performance of the combined algorithm through various scenario designs.
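  The abstract names MediaPipe pose estimation and YOLO object detection fused into one decision. The sketch below is a hedged illustration only: the fusion rule, the weights file ("yolov8n.pt"), the input frame, and the class names checked are assumptions, not the paper's method.

```python
# Sketch: combine MediaPipe pose landmarks with YOLO detections in one rule.
import cv2
import mediapipe as mp
from ultralytics import YOLO

pose = mp.solutions.pose.Pose(static_image_mode=True)
detector = YOLO("yolov8n.pt")                        # assumed pretrained weights

frame = cv2.imread("frame.jpg")                      # placeholder store frame
det = detector(frame)[0]                             # object detections
res = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))  # pose landmarks

def suspicious(pose_result, detections) -> bool:
    """Hypothetical fusion rule: a wrist landmark drops below hip level while
    a bag-like object is detected in the same frame."""
    if pose_result.pose_landmarks is None:
        return False
    lm = pose_result.pose_landmarks.landmark
    P = mp.solutions.pose.PoseLandmark
    wrist_below_hip = lm[P.RIGHT_WRIST].y > lm[P.RIGHT_HIP].y   # image y grows downward
    names = [detections.names[int(c)] for c in detections.boxes.cls]
    return wrist_below_hip and ("backpack" in names or "handbag" in names)

print("abnormal behaviour candidate:", suspicious(res, det))
```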

Development of a Signal Acquisition Device to Verify the Applicability of Millimeter Wave Tracking Radar Transmission and Receiving Components (밀리미터파 추적레이더 송·수신 구성품의 적용성 검증을 위한 신호획득장치 개발)

  • Jinkyu Choi;Youngcheol Shin;Soonil Hong;Han-Chun Ryu;Hongrak Kim;Jihan Joo
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.23 no.6
    • /
    • pp.185-190
    • /
    • 2023
  • Recently, millimeter wave tracking radars have been developed to acquire target information with high resolution in various environments. This requires developing transmission and receiving components suitable for millimeter wave tracking radar and verifying their applicability to the tracking radar. To verify the applicability of the developed transmitting and receiving components, a signal acquisition device is needed that can control them according to the operating concept of a tracking radar and check the status of the received signal. In this paper, we implemented a signal acquisition device that can confirm the applicability of components developed for a millimeter wave tracking radar. The signal acquisition device was designed to process, in real time, signals with an OOO MHz center frequency and OO MHz bandwidth input from four channels in order to verify the received signal. Component control applying the tracking radar operating concept was implemented through interfaces such as RS422, RS232, and SPI, together with generation of control signals for the transmission and reception timing. Finally, the implemented signal acquisition device was verified through a performance test.
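  One part of such a performance test is confirming that a captured channel's energy actually sits in the expected band. The sketch below is generic and not the authors' firmware; the sample rate, band edges, and test tone are hypothetical placeholders, since the paper's actual center frequency and bandwidth are not disclosed (shown as OOO MHz / OO MHz).

```python
# Generic sketch: check the in-band energy ratio of one captured channel.
import numpy as np

FS = 100e6            # hypothetical ADC sample rate [Hz]
BAND = (20e6, 30e6)   # hypothetical expected band [Hz]

t = np.arange(4096) / FS
channel = np.cos(2 * np.pi * 25e6 * t)               # placeholder captured samples

spectrum = np.abs(np.fft.rfft(channel * np.hanning(channel.size)))
freqs = np.fft.rfftfreq(channel.size, d=1 / FS)
in_band = (freqs >= BAND[0]) & (freqs <= BAND[1])
ratio = spectrum[in_band].sum() / spectrum.sum()
print(f"in-band energy ratio: {ratio:.2%}")           # near 100% for a clean capture
```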

Methodology of Test for sUAV Navigation System Error (소형무인항공기 항법시스템오차 시험평가 방법)

  • SungKwan Ku;HyoJung Ahn;Yo-han Ju;Seokmin Hong
    • Journal of Advanced Navigation Technology
    • /
    • v.25 no.6
    • /
    • pp.510-516
    • /
    • 2021
  • Recently, the range of applications of and demand for unmanned aerial vehicles (UAVs) has been continuously increasing, and research is under way on building a separate operating system for low-altitude UAVs through the development of a management system distinct from that for manned aircraft. Since low-altitude UAVs also fly in the airspace, it is essential to establish the technical standards and certification systems necessary for operating these aircraft, and research on this is also in progress. If operating standards and certification requirements for the aircraft are presented, a test method to confirm them should also be presented. In particular, the navigation accuracy required of a small UAV during flight is more precise than that of a manned aircraft or a large UAV, so a separate navigation error had to be calculated. In this study, we presented a test method for deriving navigation errors that can be applied to UAVs for which it is difficult to acquire long-term operational data, unlike existing manned aircraft, and conducted verification tests.
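  As an illustration of the kind of quantity such a test produces, the sketch below computes a navigation system error statistic by comparing logged positions against a reference track. It is not the paper's procedure; the sample tracks and the 95th-percentile metric are hypothetical choices.

```python
# Sketch: horizontal navigation error statistics from logged vs. reference positions.
import numpy as np

reported = np.array([[0.0, 0.0], [10.1, 0.2], [20.3, -0.1]])   # logged positions [m]
reference = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])   # surveyed/RTK truth [m]

errors = np.linalg.norm(reported - reference, axis=1)           # error per position fix
print("mean error [m]:", errors.mean())
print("95%  error [m]:", np.percentile(errors, 95))             # common accuracy statistic
```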

Investigation of thermal hydraulic behavior of the High Temperature Test Facility's lower plenum via large eddy simulation

  • Hyeongi Moon ;Sujong Yoon;Mauricio Tano-Retamale ;Aaron Epiney ;Minseop Song;Jae-Ho Jeong
    • Nuclear Engineering and Technology
    • /
    • v.55 no.10
    • /
    • pp.3874-3897
    • /
    • 2023
  • A high-fidelity computational fluid dynamics (CFD) analysis was performed using the large eddy simulation (LES) model for the lower plenum of the High-Temperature Test Facility (HTTF), a ¼-scale test facility of the modular high temperature gas-cooled reactor (MHTGR) managed by Oregon State University. In most next-generation nuclear reactors, thermal stress due to thermal striping is one of the risks that must be carefully considered. This is also true for HTGRs, especially since the exhaust helium gas temperature is high. To evaluate these risks and performance, organizations in the United States led by the OECD NEA are conducting a thermal hydraulic code benchmark for HTGRs, and the test facility used for this benchmark is the HTTF. The HTTF can perform experiments in both normal and accident conditions and provide high-quality experimental data. However, it is difficult to provide sufficient data for benchmarking through experiments alone, and the reliability of CFD results based on Reynolds-averaged Navier-Stokes models for analyzing thermal hydraulic behavior is questionable without verification. To address this, a high-fidelity 3-D CFD analysis was performed for the HTTF using the LES model, and it was verified via a unit cell test that provides experimental information that the LES model can properly simulate the jet mixing phenomenon. The analysis showed that the lower the dependency on the sub-grid scale model, the closer the analysis is to the actual result; the volume-averaged sub-grid scale model dependency was 13.0% for the unit cell test CFD analysis and 9.16% for the HTTF CFD analysis. The HTTF analysis provided quantitative data on the fluid inside the HTTF lower plenum. Qualitatively, the temperature was highest at the center of the lower plenum, while the temperature fluctuation was highest near the edge of the lower plenum wall. The power spectral density (PSD) of temperature was analyzed via the fast Fourier transform (FFT) at specific points on the center and side of the lower plenum. The FFT results did not reveal temperature fluctuations dominated by a specific frequency in the center, and it was confirmed that the temperature PSD at the top increased from the center toward the wake. The vortices were visualized using the well-known scalar Q-criterion; closer to the outlet duct, the influence of the mainstream grows, so the inflowing jet vortices are dissipated and mixed at the top of the lower plenum. FFT analysis was also performed on the support structure near the corner of the lower plenum, where temperature fluctuations are large, and it was confirmed that the temperature fluctuation of the flow does not have a significant effect near the corner wall. In addition, the vortices generated from the lower plenum to the outlet duct were identified. The quantitative and qualitative results presented in this paper are intended to serve as reference data for the benchmark.
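  The temperature PSD analysis mentioned above is a standard FFT-based post-processing step. The sketch below shows a minimal version of it on a synthetic time series; the signal, sampling rate, and Welch parameters are hypothetical stand-ins for LES output at one monitoring point.

```python
# Sketch: power spectral density of a temperature time series at one point.
import numpy as np
from scipy import signal

fs = 100.0                                    # hypothetical sampling rate [Hz]
t = np.arange(0, 60, 1 / fs)
temperature = 900 + 5 * np.random.randn(t.size) + 2 * np.sin(2 * np.pi * 3 * t)

freqs, psd = signal.welch(temperature - temperature.mean(), fs=fs, nperseg=1024)
peak = freqs[np.argmax(psd)]
print(f"dominant fluctuation frequency: {peak:.1f} Hz")
```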

Development of a deep learning-based cabbage core region detection and depth classification model (딥러닝 기반 배추 심 중심 영역 및 깊이 분류 모델 개발)

  • Ki Hyun Kwon;Jong Hyeok Roh;Ah-Na Kim;Tae Hyong Kim
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.16 no.6
    • /
    • pp.392-399
    • /
    • 2023
  • This paper proposes a deep learning model that determines the core region and depth of cabbages for robotic automation of the core removal process during kimchi manufacturing. Rather than predicting the measured depth of each cabbage as a continuous value, the proposed model converts depth into discrete classes and simultaneously detects the core region and classifies its depth. For training and verification of the deep learning model, RGB images of 522 harvested cabbages were obtained, and core-region and depth labeling and data augmentation were applied to the acquired images. mAP, IoU, accuracy, sensitivity, specificity, and F1-score were selected to evaluate the performance of the proposed YOLO-v4-based cabbage core region detection and classification model. As a result, the mAP and IoU values were 0.97 and 0.91, respectively, and the accuracy and F1-score for depth classification were 96.2% and 95.5%, respectively. These results confirm that the depth information of cabbages can be classified and can be used in the future development of a robot automation system for the cabbage core removal process.
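  Two pieces of the evaluation above are easy to illustrate: converting a measured depth into a discrete class, and computing IoU between predicted and ground-truth boxes. The sketch below is illustrative only; the depth bin edges and the example boxes are hypothetical, not the paper's values.

```python
# Sketch: depth-to-class discretization and a basic box IoU.
import numpy as np

DEPTH_BINS = [0, 30, 60, 90]                  # hypothetical class edges [mm]
def depth_class(depth_mm: float) -> int:
    return int(np.digitize(depth_mm, DEPTH_BINS) - 1)

def iou(box_a, box_b):
    """Boxes as (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(depth_class(45.0))                       # -> class 1 under these bins
print(iou((10, 10, 50, 50), (20, 20, 60, 60))) # -> ~0.39
```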

Analysis of the Effectiveness of Big Data-Based Six Sigma Methodology: Focus on DX SS (빅데이터 기반 6시그마 방법론의 유효성 분석: DX SS를 중심으로)

  • Kim Jung Hyuk;Kim Yoon Ki
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.13 no.1
    • /
    • pp.1-16
    • /
    • 2024
  • Over recent years, 6 Sigma has become a key methodology in manufacturing for quality improvement and cost reduction. However, challenges have arisen due to the difficulty in analyzing large-scale data generated by smart factories and its traditional, formal application. To address these limitations, a big data-based 6 Sigma approach has been developed, integrating the strengths of 6 Sigma and big data analysis, including statistical verification, mathematical optimization, interpretability, and machine learning. Despite its potential, the practical impact of this big data-based 6 Sigma on manufacturing processes and management performance has not been adequately verified, leading to its limited reliability and underutilization in practice. This study investigates the efficiency impact of DX SS, a big data-based 6 Sigma, on manufacturing processes, and identifies key success policies for its effective introduction and implementation in enterprises. The study highlights the importance of involving all executives and employees and researching key success policies, as demonstrated by cases where methodology implementation failed due to incorrect policies. This research aims to assist manufacturing companies in achieving successful outcomes by actively adopting and utilizing the methodologies presented.

Comparison Analysis of Patient Specific Quality Assurance Results using portal dose image prediction and Anisotropic analytical algorithm (Portal dose image prediction과 anisotropic analytical algorithm을 사용한 환자 특이적 정도관리 결과 비교 분석)

  • BEOMSEOK AHN;BOGYOUM KIM;JEHEE LEE
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.35
    • /
    • pp.15-21
    • /
    • 2023
  • Purpose: The purpose of this study is to compare the performance of the anisotropic analytical algorithm (AAA) and portal dose image prediction (PDIP) for patient-specific quality assurance based on an electronic portal imaging device, and to evaluate the clinical feasibility of portal dosimetry using AAA. Subjects and methods: We retrospectively selected a total of 32 patients, including 15 lung cancer patients and 17 liver cancer patients. Verification plans were generated using PDIP and AAA. We obtained gamma passing rates by comparing the calculated dose distribution with the measured distribution, and obtained MLC positional difference values. Results: The mean gamma passing rate for lung cancer patients using PDIP was 99.5% ± 1.1% for 3%/3 mm and 90.6% ± 5.8% for 1%/1 mm; using AAA, it was 98.9% ± 1.7% for 3%/3 mm and 87.8% ± 5.2% for 1%/1 mm. The mean gamma passing rate for liver cancer patients using PDIP was 99.9% ± 0.3% for 3%/3 mm and 96.6% ± 4.6% for 1%/1 mm; using AAA, it was 99.6% ± 0.5% for 3%/3 mm and 89.5% ± 6.4% for 1%/1 mm. The MLC positional difference was small at 0.013 mm ± 0.002 mm and showed no correlation with the gamma passing rate. Conclusion: The AAA algorithm can be used clinically as a portal dosimetry calculation algorithm for patient-specific quality assurance based on an electronic portal imaging device.
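  The 3%/3 mm and 1%/1 mm figures above are gamma passing rates. The sketch below is a simplified 1-D global gamma calculation, not the clinical software used in the study; the dose profiles are synthetic placeholders and serve only to show how such a passing rate is obtained.

```python
# Simplified 1-D global gamma index and passing rate (illustrative only).
import numpy as np

def gamma_passing_rate(x_mm, ref, eval_, dose_pct=3.0, dist_mm=3.0):
    """For each reference point, take the minimum gamma over all evaluated
    points, then report the percentage of points with gamma <= 1."""
    dose_tol = dose_pct / 100.0 * ref.max()            # global dose normalization
    gammas = []
    for xr, dr in zip(x_mm, ref):
        g = np.sqrt(((x_mm - xr) / dist_mm) ** 2 + ((eval_ - dr) / dose_tol) ** 2)
        gammas.append(g.min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

x = np.linspace(-50, 50, 201)                          # positions [mm]
reference = np.exp(-(x / 30) ** 2)                     # synthetic measured profile
evaluated = np.exp(-((x - 1.0) / 30) ** 2) * 1.01      # synthetic calculated profile
print(f"3%/3mm passing rate: {gamma_passing_rate(x, reference, evaluated):.1f}%")
```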
