Search results for keyword "Programming Error" (273 results)

R&D Status of Quantum Computing Technology (양자컴퓨팅 기술 연구개발 동향)

  • Baek, C.H.; Hwang, Y.S.; Kim, T.W.; Choi, B.S.
    • Electronics and Telecommunications Trends / v.33 no.1 / pp.20-33 / 2018
  • The calculation speed of quantum computing is expected to outperform that of existing supercomputers on certain problems, such as secure computing, optimization, searching, and quantum chemistry. Companies such as Google and IBM have been trying to build devices with 50 superconducting qubits, which are expected to demonstrate quantum supremacy, that is, a computational advantage over classical computers. However, for quantum computers to solve real-world problems with this superior computing power, large-scale quantum computing with many more qubits than the roughly 50 currently available will be required. To realize this, quantum error correction codes are first required so that computations can finish within a sufficient amount of time at tolerable accuracy. Next, a compiler is required so that the qubits encoded by quantum error correction codes can perform quantum operations. A large-scale quantum computer is therefore predicted to be composed of three essential components: a programming environment, layout mapping of qubits, and quantum processors. These components determine how many qubits are needed, how accurate the qubit operations must be, and where the qubits are placed and operated. In this paper, recent progress on large-scale quantum computing and the relationships among these components are introduced.
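
The role of error correction described above can be illustrated with the 3-bit repetition code, the classical ancestor of the quantum bit-flip code. This is a purely classical simulation under an assumed physical error rate, not an implementation from the paper:

```python
import random

def encode(bit):
    """3-bit repetition code: the classical ancestor of the quantum bit-flip code."""
    return [bit, bit, bit]

def add_noise(codeword, p, rng):
    """Flip each bit independently with probability p (the 'physical' error rate)."""
    return [b ^ (rng.random() < p) for b in codeword]

def decode(codeword):
    """Majority vote: corrects any single bit flip."""
    return 1 if sum(codeword) >= 2 else 0

rng = random.Random(0)
trials, p = 10000, 0.05
logical_errors = sum(decode(add_noise(encode(0), p, rng)) for _ in range(trials))
# A logical error needs two or more flips, so its rate is roughly 3*p**2,
# well below the physical rate p.
```

The same suppression principle, applied to quantum states via syndrome measurement, is what lets error-corrected qubits compute "within a sufficient amount of time with tolerable accuracy."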

New Response Surface Approach to Optimize Medium Composition for Production of Bacteriocin by Lactobacillus acidophilus ATCC 4356

  • RHEEM, SUNGSUE; SEJONG OH; KYOUNG SIK HAN; JEE YOUNG IMM; SAEHUN KIM
    • Journal of Microbiology and Biotechnology / v.12 no.3 / pp.449-456 / 2002
  • The objective of this study was to optimize the medium composition (initial pH, tryptone, glucose, yeast extract, and mineral mixture) for production of bacteriocin by Lactobacillus acidophilus ATCC 4356, using response surface methodology. A response surface approach including new statistical and plotting methods was employed for the design and analysis of the experiment. An interiorly augmented central composite design was used as the experimental design. A normal-distribution log-link generalized linear model based on a subset fourth-order polynomial ($R^2$=0.94, Mean Error Deviance=0.0065) was used as the analysis model. This model was statistically superior to the full second-order polynomial-based generalized linear model ($R^2$=0.80, Mean Error Deviance=0.0140). Nonlinear programming determined the optimum composition of the medium as initial pH 6.35, tryptone $1.21\%$, glucose $0.9\%$, yeast extract $0.65\%$, and mineral mixture $1.17\%$. A validation experiment confirmed that the optimized medium was comparable to the MRS medium in bacteriocin production, with the advantage of economy and practicality.
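
In the one-factor case, the nonlinear-programming step that locates the optimum of a fitted response surface reduces to finding the stationary point of the fitted polynomial. A minimal sketch with hypothetical coefficients (not the paper's fitted model):

```python
def quadratic_optimum(b0, b1, b2):
    """Stationary point of a fitted one-factor quadratic y = b0 + b1*x + b2*x**2.
    An interior maximum exists only when the curvature b2 is negative;
    its location is x = -b1 / (2*b2)."""
    if b2 >= 0:
        raise ValueError("no interior maximum: curvature must be negative")
    x_opt = -b1 / (2.0 * b2)
    y_opt = b0 + b1 * x_opt + b2 * x_opt ** 2
    return x_opt, y_opt

# Hypothetical fitted response of bacteriocin titre to initial pH
# (coefficients are made up for illustration).
x_opt, y_opt = quadratic_optimum(b0=0.0, b1=12.7, b2=-1.0)
# x_opt == 6.35: the stationary point of this toy quadratic.
```

With several factors and a fourth-order model, as in the paper, the same idea requires a numerical optimizer rather than a closed form, which is why nonlinear programming was used.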

Rapid Prototyping of Polymer Microfluidic Devices Using CAD/CAM Tools for Laser Micromachining

  • Iovenitti, Pio G.; Mutapcic, Emir; Hume, Richard; Hayes, Jason P.
    • International Journal of CAD/CAM / v.6 no.1 / pp.183-192 / 2006
  • A CAD/CAM system has been developed for rapid prototyping (RP) of microfluidic devices based on excimer laser micromachining. The system comprises two complementary software tools. One, the CAM tool, creates part programs from CAD models. The other, the Simulator tool, uses a part program to generate the laser tool path and 2D and 3D graphical representations of the machined microstructure. The CAM tool's algorithms use the 3D geometry of a microstructure, defined as an STL file exported from a CAD system, together with process parameters (laser fluence, pulse repetition frequency, number of shots per area, wall angle), to automatically generate Numerical Control (NC) part programs for the machine controller. The performance of the system has been verified and demonstrated by machining a particle transportation device. The CAM tool simplifies part programming and replaces the tedious trial-and-error approach to creating programs. The Simulator tool accepts manual or computer-generated part programs and displays the tool path and the machined structure. This enables error checking and editing of a program before machining, and the development of programs for complex microstructures. Combined, the tools provide a user-friendly CAD/CAM environment for rapid prototyping of microfluidic devices.

Output-error state-space identification of vibrating structures using evolution strategies: a benchmark study

  • Dertimanis, Vasilis K.
    • Smart Structures and Systems / v.14 no.1 / pp.17-37 / 2014
  • In this study, four widely accepted and used variants of Evolution Strategies (ES) are adapted and applied to the output-error state-space identification problem. The selection of ES is justified by strong prior indications of superior performance on similar problems over alternatives such as Genetic Algorithms (GA) or Evolutionary Programming (EP). The ES variants tested are (i) the $(1+1)$-ES, (ii) the $(\mu/\rho+\lambda)$-$\sigma$-SA-ES, (iii) the $(\mu_I,\lambda)$-$\sigma$-SA-ES, and (iv) the $(\mu_w,\lambda)$-CMA-ES. The study is based on a six-degree-of-freedom (DOF) structural model of a shear building characterized by light damping (up to 5%). The analysis is carried out through Monte Carlo experiments under two different excitation types (stationary / non-stationary), and the applied ES are assessed in terms of (i) accurate modal parameter extraction, (ii) statistical consistency, (iii) performance under noise-corrupted data, and (iv) performance under non-stationary data. The results suggest that ES are indeed competitive alternatives for the non-linear state-space estimation problem and deserve further attention.
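
The simplest variant listed above, the $(1+1)$-ES, can be sketched in a few lines. The sphere objective, step-size constants, and 1/5-success-rule schedule below are illustrative assumptions, not the paper's identification setup:

```python
import random

def one_plus_one_es(objective, x0, sigma=0.5, iterations=2000, seed=1):
    """Minimal (1+1)-ES: one parent, one offspring per generation,
    with the classic 1/5-success rule for step-size adaptation."""
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    successes = 0
    for t in range(1, iterations + 1):
        # Mutate every coordinate with isotropic Gaussian noise.
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = objective(y)
        if fy < fx:              # offspring replaces parent only if it is better
            x, fx = y, fy
            successes += 1
        if t % 50 == 0:          # adapt sigma every 50 generations
            rate = successes / 50.0
            sigma *= 1.22 if rate > 0.2 else 0.82  # 1/5-success rule
            successes = 0
    return x, fx

# Toy usage: minimize the 6-D sphere function, echoing the 6-DOF setting.
sphere = lambda v: sum(vi * vi for vi in v)
best, fbest = one_plus_one_es(sphere, [3.0] * 6)
```

In the paper's setting the objective would instead be the output error between measured and model-predicted responses over the state-space parameters; the self-adaptive and CMA variants replace the 1/5 rule with per-individual or covariance-based step-size control.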

A Study on an Automatical BKLS Measurement By Programming Technology

  • Shin, YeounOuk; Kim, KiBum
    • International journal of advanced smart convergence / v.7 no.3 / pp.73-78 / 2018
  • This study presents an IT program module for the BKLS measure, aimed at solving the capital-cost problem caused by information asymmetry between external investors and corporate executives. Barron et al. (1998) designed the BKLS measure to characterize how intermediary analysts guide the market; it is computed from changes in analyst forecast dispersion and the squared error of the analyst mean forecast. This study proposes an algorithmic model through which the BKLS measure can be delivered immediately to all investors by an IT program, so that it provides meaningful value in the domestic capital market. The method generates and analyzes real-time or non-real-time prediction models by transferring the predicted estimates delivered to a Big Data Log Analysis System, through a statistical DB, to a statistical forecasting engine. Because the BKLS measure does not prescribe a concrete computational procedure, it is very difficult to estimate in practice. The BKLS measure of Barron et al. (1998) introduced in this study, and the model of an IT module provided in real time, are expected to be the starting point for follow-up studies on introducing and realizing this IT technology in the future.
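
A minimal sketch of the two inputs the abstract names for the BKLS measure, forecast dispersion and the squared error of the mean forecast, using made-up analyst forecasts. This is only the raw ingredients, not the full Barron et al. (1998) construction of precision and consensus:

```python
def bkls_components(forecasts, actual):
    """Dispersion (sample variance of analyst forecasts) and the squared
    error of the mean forecast, the two quantities the BKLS measure builds on."""
    n = len(forecasts)
    mean_f = sum(forecasts) / n
    dispersion = sum((f - mean_f) ** 2 for f in forecasts) / (n - 1)
    squared_error = (mean_f - actual) ** 2
    return dispersion, squared_error

# Illustrative EPS forecasts from five analysts vs. a realized EPS of 1.02.
d, se = bkls_components([1.00, 1.10, 0.95, 1.05, 1.20], actual=1.02)
```

An automated module like the one the paper proposes would recompute these quantities whenever new forecasts or realizations arrive in the statistical DB.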

The Vision-based Autonomous Guided Vehicle Using a Virtual Photo-Sensor Array (VPSA) for a Port Automation (가상 포토센서 배열을 탑재한 항만 자동화 자율 주행 차량)

  • Kim, Soo-Yong; Park, Young-Su; Kim, Sang-Woo
    • Journal of Institute of Control, Robotics and Systems / v.16 no.2 / pp.164-171 / 2010
  • We have studied port-automation systems, motivated by the steep increase in the cost and complexity of freight handling. This paper introduces a new algorithm for navigating and controlling an Autonomous Guided Vehicle (AGV). A camera has inherent optical distortion and is sensitive to external light, weather, and shadows, but it is very cheap and flexible for building an automation system for a port. We therefore applied a CCD camera to the AGV for detecting and tracking the lane. To make the error stable and exact, this paper proposes a new concept and algorithm in which the error is generated by a Virtual Photo-Sensor Array (VPSA). VPSAs are implemented in software and are very easy to use in various autonomous systems. Because the computational load is light, the AGV utilizes the maximal performance of the CCD camera and the CPU can take on multiple tasks. We experimented with the proposed algorithm on a mobile robot and confirmed stable and exact lane-tracking performance.
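
One plausible reading of the VPSA idea, not the paper's actual implementation, treats a row of "virtual sensors" as thresholded pixel intensities and takes the lane centroid's offset from the array centre as the control error:

```python
def vpsa_error(sensor_row, threshold=128):
    """Lane-offset error from one virtual photo-sensor row.
    Each 'sensor' is a pixel intensity sampled from the camera image;
    the error is the offset of the bright-lane centroid from the array centre."""
    active = [i for i, v in enumerate(sensor_row) if v >= threshold]
    if not active:
        return None                      # lane not detected on this row
    centroid = sum(active) / len(active)
    centre = (len(sensor_row) - 1) / 2.0
    return centroid - centre             # negative: lane lies left of centre

# Illustrative 9-sensor row; bright pixels (>= 128) mark the lane.
err = vpsa_error([0, 0, 200, 220, 210, 0, 0, 0, 0])
# err == -1.0: the lane centroid sits one sensor left of centre.
```

Because each virtual row needs only a threshold and a centroid, the per-frame cost is tiny, which matches the abstract's claim that the light computational load leaves the CPU free for other tasks.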

Enhanced Communication Transport Protocol: Implementations and Experimentations (ECTP 멀티캐스트 전송 프로토콜: 구현 및 성능분석)

  • Park, Ki-Shik; Park, Juyoung; Koh, Seok-Joo; Jo, In-June
    • The Journal of Korean Institute of Communications and Information Sciences / v.28 no.10B / pp.876-890 / 2003
  • This paper proposes a protocol for reliable and QoS-aware multicast transport, called the Enhanced Communications Transport Protocol (ECTP). The ECTP has been developed and standardized in ITU-T SG17 and ISO/IEC JTC 1/SC 6. Unlike conventional reliable multicast, as seen in the IETF RMT WG, the ECTP additionally provides several distinct features such as tight control of the multicast session, tree-based error control, and QoS management. For tight control of multicast connections, the sender is at the heart of one-to-many group communications and is responsible for overall connection management, including connection creation/termination, pause/resumption, and join and leave operations. For tree-based reliability control, ECTP configures a hierarchical tree during connection creation. Error control is performed within each local group defined by a control tree, which was partly designed like the IETF TRACK approach: each parent retransmits lost data in response to retransmission requests from its children. For QoS management, ECTP supports QoS negotiation for resource reservation, and it also provides QoS monitoring and maintenance operations. ECTP has been implemented and tested on a Linux machine, along with Application Programming Interfaces based on Berkeley sockets. For basic testing of the ECTP functionality, we give preliminary experimental results comparing the performance of ECTP and TCP unicast transports. In conclusion, we describe the status of ECTP experimentation over the APAN/KOREN testbed networks.

A New Application of Human Visual Simulated Images in Optometry Services

  • Chang, Lin-Song; Wu, Bo-Wen
    • Journal of the Optical Society of Korea / v.17 no.4 / pp.328-335 / 2013
  • Due to the rapid advancement of auto-refractor technology, most optometry shops provide refraction services. Despite their speed and convenience, the measurement values provided by auto-refractors include a significant degree of error due to psychological and physical factors. Therefore, repetitive testing is needed to obtain a smaller mean error value; however, even repetitive testing might not be sufficient to ensure accurate measurements. Research is therefore needed on a measurement method that can complement auto-refractor measurements and confirm refraction results. The customized optometry model described herein satisfies these requirements. With existing technologies, using human-eye measurement devices to obtain the relevant individual optical feature parameters is no longer difficult, and these parameters allow us to construct an optometry model for individual eyeballs. They also allow us to compute visual images produced from the optometry model using the CODE V macro programming language, before recognizing the diffraction effects in the visual images with a neural network algorithm to obtain the accurate refractive diopter. This study combines the optometry model with a back-propagation neural network to achieve a double-check recognition effect that complements the auto-refractor. Results show that the accuracy achieved was above 98% and that this application can significantly enhance the service quality of refraction.

Machine Learning Methodology for Management of Shipbuilding Master Data

  • Jeong, Ju Hyeon; Woo, Jong Hun; Park, JungGoo
    • International Journal of Naval Architecture and Ocean Engineering / v.12 no.1 / pp.428-439 / 2020
  • The continuous development of information and communication technologies has resulted in an exponential increase in data. Consequently, technologies related to data analysis are growing in importance. The shipbuilding industry has high production uncertainty and variability, which has created an urgent need for data analysis techniques such as machine learning. In particular, the industry cannot effectively respond to changes in production-related standard time information, such as the basic cycle time and lead time, so improvement measures are necessary to enable a swift response to changes in the production environment. In this study, the lead times for fabrication, assembly of ship blocks, spool fabrication, and painting were predicted using machine learning to propose a new management method for the process lead time, based on a master data system for the time elements in the production data. Data preprocessing was performed in various ways using R and Python, which are open-source programming languages, and process variables were selected in light of their relationships with the lead time through correlation analysis and analysis of variables. Various machine learning, deep learning, and ensemble learning algorithms were applied to create the lead time prediction models. In addition, the applicability of the proposed machine learning methodology to standard work-hour prediction was verified by evaluating the prediction models using criteria such as the Mean Absolute Percentage Error (MAPE) and Root Mean Squared Logarithmic Error (RMSLE).
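
The two evaluation criteria named at the end have standard definitions, sketched below with made-up lead-time numbers (not the paper's data):

```python
import math

def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

def rmsle(actual, predicted):
    """Root Mean Squared Logarithmic Error; log1p tolerates zero values
    and penalizes under-prediction more than over-prediction."""
    return math.sqrt(sum((math.log1p(p) - math.log1p(a)) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

# Illustrative lead times (days): observed vs. predicted.
obs = [10.0, 20.0, 40.0]
pred = [12.0, 18.0, 44.0]
print(round(mape(obs, pred), 2))  # mean of 20%, 10%, 10% -> 13.33
```

RMSLE's logarithmic scale makes it a natural companion to MAPE when lead times span very different magnitudes across processes, as they do between block assembly and painting.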

Development of a new explicit soft computing model to predict the blast-induced ground vibration

  • Alzabeebee, Saif; Jamei, Mehdi; Hasanipanah, Mahdi; Amnieh, Hassan Bakhshandeh; Karbasi, Masoud; Keawsawasvong, Suraparb
    • Geomechanics and Engineering / v.30 no.6 / pp.551-564 / 2022
  • Fragmenting the rock mass is considered the most important task in open-pit mines. Ground vibration is the most hazardous effect of blasting and can cause critical damage to surrounding structures. This paper focuses on developing an explicit model to predict ground vibration through multi-objective evolutionary polynomial regression (MOGA-EPR). To this end, a database including 79 sets of data related to a quarry site in Malaysia was used. In addition, a gene expression programming (GEP) model and several empirical equations were employed to predict ground vibration, and their performances were then compared with the MOGA-EPR model using the mean absolute error (MAE), root mean square error (RMSE), mean ($\mu$), standard deviation of the mean ($\sigma$), coefficient of determination ($R^2$), and a20-index. Comparing the results, it was found that the MOGA-EPR model predicted the ground vibration more precisely than the GEP model and the empirical equations: the MOGA-EPR scored lower MAE and RMSE, $\mu$ and $\sigma$ closer to the optimum values, and a higher $R^2$ and a20-index. Accordingly, the proposed MOGA-EPR model can be introduced as a useful method for predicting ground vibration and has the capacity to be generalized to predict other blasting effects.
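
The a20-index is less common than MAE or RMSE; it is usually defined as the fraction of samples whose predicted-to-observed ratio falls within ±20%, which can be sketched as follows (the values are illustrative, not the paper's):

```python
def a20_index(actual, predicted):
    """Fraction of predictions whose ratio predicted/actual lies within +/-20%,
    i.e. in the interval [0.8, 1.2]; 1.0 means every prediction is 'close'."""
    within = sum(1 for a, p in zip(actual, predicted) if 0.8 <= p / a <= 1.2)
    return within / len(actual)

# Illustrative peak particle velocities (mm/s): observed vs. predicted.
obs = [5.0, 10.0, 20.0, 40.0]
pred = [5.5, 13.0, 19.0, 41.0]
print(a20_index(obs, pred))  # 3 of 4 ratios fall in [0.8, 1.2] -> 0.75
```

Unlike RMSE, the a20-index is scale-free and directly interpretable as a hit rate, which is why it appears alongside the error measures in comparisons like the one above.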