• Title/Summary/Keyword: data process

Search Result 23,881

A PC-Based System for Gear Pitch Analysis and Monitoring in Gear Manufacturing Process (기어피치분석 및 공정관측을 위한 PC기반시스템 구축)

  • 김성준;지용수
    • Journal of Korean Society for Quality Management
    • /
    • v.30 no.3
    • /
    • pp.111-119
    • /
    • 2002
  • Gears are essential elements for mechanical power transmission, and geometric precision is the main factor characterizing gear grade and quality. Gear pitch, defined as the distance between two adjacent gear teeth, is one of the crucial measurements. It is well known that variability in gear pitches may cause wear-out and vibration noise. Therefore, keeping pitch errors at a low level plays a key role in assuring gear quality to customers. This paper is concerned with a case study that presents a computerized system for inspecting pitch errors in a gear machining process. The system consists of a PC and Windows-based programs. Although start and stop are performed manually, the measurement and analysis of pitch data are conducted automatically. Our purpose lies in reducing inspection cost and time as well as increasing test reliability. The system's operation is briefly illustrated by an example. A strong autocorrelation is sometimes observed in pitch data, so we also discuss a process monitoring scheme that takes autocorrelation into account.
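The monitoring issue raised in this abstract, autocorrelated pitch measurements, is commonly handled by charting the residuals of a fitted time-series model rather than the raw data. A minimal numpy sketch under that standard approach (the AR(1) coefficient and limits below are hypothetical, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
# simulated autocorrelated pitch errors: AR(1) with hypothetical phi = 0.6
n, phi = 200, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# fit the AR(1) coefficient by least squares and chart the residuals,
# which are approximately independent when the model fits
phi_hat = x[1:] @ x[:-1] / (x[:-1] @ x[:-1])
resid = x[1:] - phi_hat * x[:-1]
ucl = 3 * resid.std(ddof=1)          # conventional 3-sigma limit
out_of_control = np.abs(resid) > ucl
```

Charting raw autocorrelated data instead would inflate the false-alarm rate, which is why a model-based residual chart is the usual remedy.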

A Study on the Development of AHP Analytic Tools (AHP 분석도구 개발에 관한 연구)

  • Kim, Dong-Kil;Choi, Sung-Ho;Han, Sung-Soo
    • Journal of Information Technology Services
    • /
    • v.17 no.2
    • /
    • pp.101-110
    • /
    • 2018
  • The Analytic Hierarchy Process (AHP) is a systematic and objective decision method that stratifies multiple criteria to determine the priority order of various evaluation factors. Since AHP requires several stages of analysis with complicated attributes, practitioners normally rely on computer programs to execute such analyses. However, existing AHP analytic tools carry the inconvenience of having to repeatedly input the massive amounts of data generated at every stage. This repetitious data entry results in prolonged analysis times and a higher possibility of errors, leading to decreased reliability of the results. Thus, in this study, we develop an analytic tool that effectively simplifies the data entry process in AHP analysis, with the aim of reducing analysis times and increasing the reliability of the results.
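The core computation such a tool automates can be sketched in a few lines of numpy: a pairwise comparison matrix is reduced to a priority vector, and a consistency ratio guards against contradictory judgments. The matrix values below are hypothetical, and the geometric-mean method is one of several standard ways to extract priorities:

```python
import numpy as np

# hypothetical 3x3 pairwise comparison matrix: entry (i, j) states how much
# more important criterion i is than criterion j on Saaty's 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
n = A.shape[0]

# priority vector by the geometric-mean (row) method, normalized to sum to 1
w = np.prod(A, axis=1) ** (1.0 / n)
w /= w.sum()

# consistency ratio: CR below 0.1 is conventionally acceptable (RI = 0.58 for n = 3)
lam_max = np.mean((A @ w) / w)
cr = (lam_max - n) / (n - 1) / 0.58
```

In a full hierarchy this step repeats for every criterion level, which is exactly the repetitive data entry the study aims to streamline.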

Development of a Polishing Robot System Using NC Data and an Off-Line Program (NC데이타와 Off-Line Program을 이용한 연마 로봇 시스템 개발)

  • 오영섭;유범상;양균의
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1997.04a
    • /
    • pp.692-697
    • /
    • 1997
  • This paper presents a method for automating the grinding and polishing of precision dies after CNC machining. The method employs a robot system equipped with a pneumatic spindle and a special abrasive film pad. The robot program is automatically generated off-line on a PC and downloaded to the robot controller. Position and orientation data for the program are supplied from the cutter contact (CC) data of the NC machining process, which eliminates a separate robot teaching process. This paper aims at practical automation of the die finishing process, which is very time-consuming and suffers from a shortage of skilled workers. Time lost in changeover from one product to the next is eliminated by off-line programming that exploits the appropriate NC machining data. A dexterous 6-axis robot with a rigid wrist and simple tooling makes the process applicable to large, rather complex three-dimensional free-form surfaces.


A Basic Study on the Extension of Design Information to Improve Interoperability in BIM-based Collaborative Design Process (BIM 기반 협업에서의 상호운용성 향상을 위한 설계정보의 확장방안에 대한 기초적 연구)

  • Jung, Jae-Hwan;Kim, Jim-Man;Kim, Sung-Ah
    • Journal of KIBIM
    • /
    • v.5 no.1
    • /
    • pp.25-34
    • /
    • 2015
  • In the initial step of a BIM-based architectural design process, workloads increase and decision making becomes more complex than in the conventional design process. Technologies for the distribution, exchange, classification, and verification of BIM data are fundamental elements in constructing an environment for BIM-based information sharing. Interoperability of BIM model data is another issue in integrating BIM models. To improve interoperability in BIM-based collaboration, a model for utilizing formal and informal design information is suggested. Furthermore, the model is prototyped and practically tested to advance data exchange and enrich design data.

Neural Network-based Time Series Modeling of Optical Emission Spectroscopy Data for Fault Prediction in Reactive Ion Etching

  • Sang Jeen Hong
    • Journal of the Semiconductor & Display Technology
    • /
    • v.22 no.4
    • /
    • pp.131-135
    • /
    • 2023
  • Neural network-based time series models called time series neural networks (TSNNs) are trained by the error backpropagation algorithm and used to predict process shifts of parameters such as gas flow, RF power, and chamber pressure in reactive ion etching (RIE). The training data consists of process conditions, as well as principal components (PCs) of optical emission spectroscopy (OES) data collected in-situ. Data are generated during the etching of benzocyclobutene (BCB) in a SF6/O2 plasma. Combinations of baseline and faulty responses for each process parameter are simulated, and a moving average of TSNN predictions successfully identifies process shifts in the recipe parameters for various degrees of faults.
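The moving-average step described above can be illustrated independently of the neural network. Assuming the TSNN's prediction residuals behave roughly like white noise in the baseline condition and shift when a fault enters (all numbers below are hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical residuals between TSNN predictions and observed responses;
# a sustained process shift enters at t = 100
resid = rng.normal(0, 1, 200)
resid[100:] += 2.0

# a moving average smooths point-to-point noise so sustained shifts stand out
w = 10
ma = np.convolve(resid, np.ones(w) / w, mode="valid")
limit = 3 / np.sqrt(w)      # 3-sigma limit for the mean of w unit-variance points
alarm = np.abs(ma) > limit
```

Averaging trades a few samples of detection delay for far fewer false alarms, which is why the moving average of predictions is used rather than raw point-wise comparison.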


Abnormality Detection to Non-linear Multivariate Process Using Supervised Learning Methods (지도학습기법을 이용한 비선형 다변량 공정의 비정상 상태 탐지)

  • Son, Young-Tae;Yun, Deok-Kyun
    • IE interfaces
    • /
    • v.24 no.1
    • /
    • pp.8-14
    • /
    • 2011
  • Principal Component Analysis (PCA) reduces the dimensionality of a process by creating a new set of variables, principal components (PCs), which attempt to reflect the true underlying process dimension. However, for highly nonlinear processes this form of monitoring may not be efficient, since the process dimensionality cannot be represented by a small number of PCs. Examples include semiconductor, pharmaceutical, and chemical processes. Nonlinearly correlated process variables can be reduced to a set of nonlinear principal components through Kernel Principal Component Analysis (KPCA). Support Vector Data Description (SVDD), which has its roots in supervised learning theory, is a training algorithm based on structural risk minimization; its control limit does not depend on a distributional assumption but adapts to the real data. This paper therefore proposes a nonlinear process monitoring technique based on supervised learning methods and KPCA. Simulated examples show that the proposed monitoring chart is more effective than the $T^2$ chart for nonlinear processes.
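A minimal, dependency-free sketch of the KPCA step: project data onto the leading eigenvectors of a centered RBF kernel matrix. The simulated "process" (a noisy circle, which linear PCA cannot compress) and the kernel width are hypothetical choices, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical nonlinear process: in-control data lie on a noisy circle
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (200, 2))

# RBF kernel matrix, then centering in feature space
gamma = 2.0
sq_dist = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-gamma * sq_dist)
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

# nonlinear principal components = leading eigenvectors of the centered kernel
vals, vecs = np.linalg.eigh(Kc)
vals, vecs = vals[::-1], vecs[:, ::-1]                 # descending order
Z = vecs[:, :2] * np.sqrt(np.maximum(vals[:2], 0.0))   # first two KPC scores
```

An SVDD (equivalently, a one-class SVM with an RBF kernel) would then be trained on `Z`, giving a control boundary that adapts to the data instead of assuming a distribution.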

A New Measure of Process Capability for Non-Normal Process : $C_{psk}$ (비정규 공정에 대한 공정능력의 새로운 측도: $C_{psk}$)

  • 김홍준;송서일
    • Journal of Korean Society for Quality Management
    • /
    • v.26 no.1
    • /
    • pp.48-60
    • /
    • 1998
  • This paper proposes a fourth-generation index $C_{psk}$, constructed from ${C^*}_{psk}$ by introducing the factor $|\mu-T|$ in the numerator as an extra penalty for the departure of the process mean from the preassigned target value T. The motivation behind the introduction of $C_{psk}$ is that when $T \neq M$, process shifts away from the target are otherwise evaluated without respect to direction. All indices now in use assume normally distributed data, and any use of these indices on non-normal data results in inaccurate capability measurements. In this paper, the new process capability index $C_{psk}$ is introduced for non-normal processes. The Pearson and Johnson curves are selected for capability index calculation and data modeling, and the normal-based index $C_{psk}$ is used as the model for the non-normal process. A significant result of this research is the ranking of the six indices $C_{p}$, $C_{pk}$, $C_{pm}$, ${C^*}_{psk}$, $C_{pmk}$, and $C_{psk}$ in terms of sensitivity to departure of the process median from the target value: from most to least sensitive, they are $C_{psk}$, $C_{pmk}$, ${C^*}_{psk}$, $C_{pm}$, $C_{pk}$, $C_{p}$.
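For orientation, the earlier-generation indices in that ranking have standard normal-theory forms; the paper's $C_{psk}$ adds a further off-target penalty and is not reproduced here. A numpy sketch with hypothetical specification limits and data:

```python
import numpy as np

def capability(x, lsl, usl, target):
    """Cp, Cpk, Cpm, Cpmk from sample data (standard normal-theory formulas)."""
    mu, sigma = x.mean(), x.std(ddof=1)
    tau = np.sqrt(sigma**2 + (mu - target) ** 2)   # Taguchi-style off-target penalty
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    cpm = (usl - lsl) / (6 * tau)
    cpmk = min(usl - mu, mu - lsl) / (3 * tau)
    return cp, cpk, cpm, cpmk

rng = np.random.default_rng(0)
x = rng.normal(10.2, 0.5, 500)   # hypothetical process, mean shifted off target T = 10
cp, cpk, cpm, cpmk = capability(x, lsl=8.5, usl=11.5, target=10.0)
```

The ordering the abstract reports is visible in miniature here: each later-generation index penalizes the off-target mean more heavily, so $C_p \ge C_{pk} \ge C_{pmk}$ whenever the mean drifts from T.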


How Through-Process Optimization (TPO) Assists to Meet Product Quality

  • Klaus Jax;Yuyou Zhai;Wolfgang Oberaigner
    • Corrosion Science and Technology
    • /
    • v.23 no.2
    • /
    • pp.131-138
    • /
    • 2024
  • This paper introduces Primetals Technologies' Through-Process Optimization (TPO) Services and Through-Process Quality Control (TPQC) System, which integrate domain knowledge, software, and automation expertise to assist steel producers in achieving operational excellence. TPQC collects high-resolution process and product data from the entire production route, providing visualizations and facilitating quality assurance. It also enables the application of artificial intelligence techniques to optimize processes, accelerate steel grade development, and enhance product quality. The main objective of TPO is to grow and digitize operational know-how, increase profitability, and better meet customer needs. The paper describes the contribution of these systems to achieving operational excellence, with a focus on quality assurance. Transparent and traceable production data is used for manual and automatic quality evaluation, resulting in product quality status and guiding the product disposition process. Deviation management is supported by rule-based and AI-based assistants, along with monitoring, alarming, and reporting functions ensuring early recognition of deviations. Embedded root cause proposals and their corrective and compensatory actions facilitate decision support to maintain product quality. Quality indicators and predictive quality models further enhance the efficiency of the quality assurance process. Utilizing the quality assurance software package, TPQC acts as a "one-truth" platform for product quality key players.

Economic Performance of an EWMA Chart for Monitoring MMSE-Controlled Processes

  • Lee, Jae-Heon;Yang, Wan-Youn
    • Journal of the Korean Data and Information Science Society
    • /
    • v.15 no.2
    • /
    • pp.285-295
    • /
    • 2004
  • Statistical process control (SPC) and engineering process control (EPC) are two complementary strategies for quality improvement. An integrated process control (IPC) scheme can use EPC to reduce the effect of predictable quality variations and SPC to monitor the process for the detection of special causes. In this paper we assume an IMA(1,1) disturbance model with an occurrence of a level shift in the process, and we consider the economic performance of applying an EWMA chart to monitor MMSE-controlled processes. The numerical results suggest that the IPC scheme under an IMA(1,1) disturbance model does not give additional advantages in the economic aspect.
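The chart being evaluated can be sketched as follows: an EWMA applied to the output of an MMSE-controlled process (approximated here as unit-variance white noise when in control), with a level shift injected to mimic a special cause. The smoothing constant, limit width, and shift size are hypothetical:

```python
import numpy as np

def ewma(x, lam=0.2, L=3.0):
    """EWMA statistic z_t = lam*x_t + (1-lam)*z_{t-1}, with asymptotic limit."""
    z = np.empty(len(x))
    prev = 0.0
    for t, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev
        z[t] = prev
    limit = L * np.sqrt(lam / (2 - lam))   # for unit-variance observations
    return z, limit

rng = np.random.default_rng(2)
# in-control MMSE-controlled output approximated as white noise (hypothetical),
# with a level shift of 1.5 sigma injected at t = 50
x = rng.normal(0, 1, 100)
x[50:] += 1.5
z, ucl = ewma(x)
```

A small `lam` makes the chart sensitive to small sustained shifts like this one, which is the scenario the paper's economic comparison examines.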


Design and Implementation of XML-based Electronic Data Interchange Using Unified Modeling Language (UML을 이용한 XML/EDI 시스템 설계 및 구현)

  • 문태수;김호진
    • The Journal of Society for e-Business Studies
    • /
    • v.7 no.3
    • /
    • pp.139-158
    • /
    • 2002
  • Most companies in the area of B2B electronic commerce are striving to innovate their existing business processes into newly designed ones. XML-based electronic data interchange has the potential to reshape traditional EDI systems. This study suggests a prototype of XML-based electronic data interchange using the Unified Modeling Language, with a case study applied to the Korean automobile industry. To accomplish the research objectives, we employed UML as the standard modeling language. Four of its eight diagramming techniques, the use case diagram, sequence diagram, class diagram, and component diagram, are used to analyze the hierarchical business process. By applying the UML methodology, we design and develop XML/EDI applications efficiently. Our field test in the Korean automobile industry shows that UML-based data modeling for XML application design outperforms existing methodologies in representing the object schema of XML data and in the extensibility and interoperability of systems.
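The kind of message such an XML/EDI system exchanges can be sketched with Python's standard library; the element and attribute names below are purely illustrative, not the paper's actual schema:

```python
import xml.etree.ElementTree as ET

# hypothetical purchase-order message for an automotive-parts exchange
order = ET.Element("PurchaseOrder", attrib={"id": "PO-001"})
ET.SubElement(order, "Buyer").text = "Automobile Co."
item = ET.SubElement(order, "Item", attrib={"partNo": "A-100"})
ET.SubElement(item, "Quantity").text = "40"

# serialize to the XML text that would travel between trading partners
doc = ET.tostring(order, encoding="unicode")
```

In a UML-driven design, the class diagram's classes and associations map onto elements and nesting like the above, which is what makes the schema easier to extend than a flat traditional EDI segment.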
