Title/Summary/Keyword: Database Parameter

Search results: 238

Electronic Catalogue Based Cutting Parameter Selection (전자 카탈로그식 절삭변수 선정의 자동화)

  • 이성열
    • Journal of Korea Society of Industrial Information Systems, v.6 no.4, pp.1-5, 2001
  • This study presents an electronic-catalogue-based cutting parameter selection system built with MS Access. The proposed system electronically selects proper cutting conditions from a stored database. The existing approach in most small and medium-sized companies is to rely on a manufacturing engineer's experience or to look up recommended values in a manufacturing engineering handbook; both are often time-consuming and inconsistent, especially when a new engineer is involved. This study therefore proposes a simple, quick, and consistent electronic-catalogue-based cutting parameter selection method, using MS Access for both the programming and the database implementation. The proposed system automatically generates the proper cutting conditions (feed, depth of cut, and cutting speed) as soon as the input data (information about the tool and work material) are given. Thanks to the simple structure and popularity of MS Access, an engineer can quickly become accustomed to the system and easily modify, insert, or delete database entries as necessary.
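
The paper implements the catalogue in MS Access; as a rough illustration of the same lookup pattern, the sketch below uses Python's sqlite3 standard library in place of Access, with a hypothetical table schema mapping (tool material, work material) to (feed, depth of cut, cutting speed). All names and values are illustrative.

```python
# Minimal sketch of an electronic-catalogue lookup; sqlite3 stands in for
# the paper's MS Access database, and the schema is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE cutting_catalogue (
    tool_material TEXT, work_material TEXT,
    feed REAL, depth_of_cut REAL, cutting_speed REAL)""")
conn.execute("INSERT INTO cutting_catalogue VALUES "
             "('carbide', 'SM45C', 0.25, 2.0, 180.0)")

def select_cutting_conditions(tool_material, work_material):
    """Return (feed, depth of cut, cutting speed) for the given pair."""
    return conn.execute(
        "SELECT feed, depth_of_cut, cutting_speed FROM cutting_catalogue "
        "WHERE tool_material = ? AND work_material = ?",
        (tool_material, work_material)).fetchone()  # None if no entry

print(select_cutting_conditions("carbide", "SM45C"))  # (0.25, 2.0, 180.0)
```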


A Study of Radionuclide Migration Behavior in Terms of Solubility at the Gyeongju Low- and Intermediate-Level Radioactive Waste (LILW) Repository

  • Park, Sang June; Byon, Jihyang; Lee, Jun-Yeop; Ahn, Seokyoung
    • Journal of Nuclear Fuel Cycle and Waste Technology (JNFCWT), v.19 no.1, pp.113-121, 2021
  • A safety assessment of radioactive waste repositories is mandatory because of the possible radiological hazards posed by radionuclide migration from the waste to the biosphere. For a reliable safety assessment, it is important to establish a parameter database that reflects the site-specific characteristics of the disposal facility and repository site. From this perspective, solubility, a major geochemical parameter, was chosen as an important parameter for modeling the migration behavior of radionuclides. Solubilities were derived for Am, Ni, Tc, and U, the major radionuclides in this study, and on-site groundwater data reflecting the operational conditions of the Gyeongju low- and intermediate-level radioactive waste (LILW) repository were applied to capture the site-specific characteristics. The radiation dose was derived by applying the solubility and radionuclide inventory data to the RESRAD-OFFSITE code, and a sensitivity analysis of the dose with respect to solubility variation was performed. Owing to the low radionuclide inventory, the dose variation was insignificant. The derived solubilities can be used as key input data for future safety assessments of the Gyeongju LILW repository.
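
The paper's dose calculations are performed with the RESRAD-OFFSITE code; the sketch below only illustrates the general shape of a one-at-a-time sensitivity scan over a solubility limit, with dose_model() as an invented placeholder and every number illustrative.

```python
# One-at-a-time sensitivity scan over solubility; dose_model() is a toy
# stand-in, NOT the RESRAD-OFFSITE model used in the paper.
def dose_model(solubility, inventory, transfer_factor=1e-6):
    # Toy behavior: release is solubility-limited until the inventory caps it.
    return min(solubility * transfer_factor, inventory) * 1e3

base_solubility = {"Am": 1e-9, "Ni": 1e-5, "Tc": 1e-3, "U": 1e-7}  # illustrative
inventory = {"Am": 0.01, "Ni": 0.5, "Tc": 0.2, "U": 0.05}          # illustrative

for nuclide, s in base_solubility.items():
    for factor in (0.1, 1.0, 10.0):  # vary an order of magnitude each way
        dose = dose_model(s * factor, inventory[nuclide])
        print(f"{nuclide}: solubility x{factor:g} -> dose {dose:.3e}")
```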

Automatic Identification of Database Workloads by using SVM Workload Classifier (SVM 워크로드 분류기를 통한 자동화된 데이터베이스 워크로드 식별)

  • Kim, So-Yeon; Roh, Hong-Chan; Park, Sang-Hyun
    • The Journal of the Korea Contents Association, v.10 no.4, pp.84-90, 2010
  • DBMSs are used for a range of applications, from data warehousing to on-line transaction processing, and have continued to grow in size to meet this demand. This growth makes manual performance tuning of a DBMS both important and difficult. DBMS tuning should be adaptive to the type of workload placed on the system, but identifying workloads in mixed database applications can be quite difficult, so a method is needed for identifying workloads in a mixed database environment. In this paper, we propose an SVM workload classifier that automatically identifies a DBMS workload. Database workloads were collected from the TPC-C and TPC-W benchmarks while varying the resource parameters, and the SVM hyperparameters, C and the kernel parameter, were chosen experimentally. The experiments revealed that the accuracy of the proposed SVM workload classifier is about 9% higher than that of Decision Tree, Naive Bayes, Multilayer Perceptron, and k-NN classifiers.
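
A minimal sketch of the classification step, assuming scikit-learn is available; the three-dimensional feature vectors are invented stand-ins for the resource statistics collected from TPC-C and TPC-W runs, and grid search with cross-validation is one systematic way to choose C and the kernel parameter, which the paper selects experimentally.

```python
# SVM workload classification sketch; features and class means are invented.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical per-interval measurements: [cpu_util, io_rate, lock_waits]
X = np.vstack([rng.normal([0.8, 0.2, 0.7], 0.1, (50, 3)),   # TPC-C-like
               rng.normal([0.3, 0.8, 0.1], 0.1, (50, 3))])  # TPC-W-like
y = np.array([0] * 50 + [1] * 50)  # 0 = TPC-C, 1 = TPC-W

# Cross-validated search over C and the RBF kernel parameter gamma
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
                    cv=5)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```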

Development of an Expert System for Optimizing Die and Mold Polishing-II (금형면 자동 다듬질 전문가 시스템 개발에 관한 연구-II-통합 연마 파라미터를 사용한 최적 가공 구현 및 전문가 시스템 구축-)

  • 민헌식; 이성환; 안유민; 조남규; 한창수
    • Transactions of the Korean Society of Machine Tool Engineers, v.11 no.1, pp.45-51, 2002
  • To reduce the cost and increase the reliability of die and mold products, automation of the finishing (polishing) process is essential. A major element of this automation is a reliable database and knowledge base of polishing status. This paper presents a polishing expert system that determines optimal polishing sequences and conditions using an empirical formula and an experimental database. The simplex method was used for curve fitting of the experimental results. A graphical user interface that visualizes the optimized results was also developed.
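
The paper fits its empirical polishing formula to experimental data with the simplex method; the sketch below does the same with SciPy's Nelder-Mead simplex on an invented power-law model and illustrative measurements, not the paper's actual formula or data.

```python
# Simplex (Nelder-Mead) curve fitting of an illustrative power-law model.
import numpy as np
from scipy.optimize import minimize

# Hypothetical measurements: abrasive grit size vs. achieved roughness
grit = np.array([200.0, 400.0, 800.0, 1500.0])
roughness = np.array([0.80, 0.45, 0.26, 0.15])

def sse(params):
    a, b = params
    return np.sum((roughness - a * grit ** b) ** 2)  # sum of squared errors

result = minimize(sse, x0=[1.0, -0.5], method="Nelder-Mead")  # simplex search
print(result.x)  # fitted (a, b)
```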

A Study on the Restoration of Feature Information in STEP AP224 to a Solid Model (STEP AP224에 표현된 특징형상 정보의 솔리드 모델 복원에 관한 연구)

  • 김야일; 강무진
    • Proceedings of the Korean Society of Precision Engineering Conference, 2001.04a, pp.367-372, 2001
  • Feature restoration reconstructs features into a 3D solid model using the feature information in STEP AP224. Features are essential in CAPP, but feature information is defined in a very complicated way in STEP AP224. This paper proposes an algorithm for extracting feature information from a physical STEP AP224 file. The program imports a STEP AP224 file; parses the geometric and topological information, tolerance data, and feature information line by line; and stores the parsed data in a database. The feature restoration module then analyzes the database, extracts the feature information (e.g., feature type and feature parameters), analyzes the relationships among features, and restores the features to a 3D solid model.
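
As a rough sketch of line-by-line parsing of a STEP physical file's DATA section, the snippet below extracts entity records with a regular expression and stores them in an in-memory map keyed by entity id; the entity names are illustrative, not the full AP224 schema, and the paper's actual parser is not reproduced.

```python
# Line-by-line extraction of STEP entity records into a simple "database".
import re

ENTITY = re.compile(r"#(\d+)\s*=\s*(\w+)\s*\((.*)\);")

def parse_step(lines):
    """Map entity id -> (entity type, raw argument string)."""
    database = {}
    for line in lines:
        m = ENTITY.match(line.strip())
        if m:
            eid, etype, args = m.groups()
            database[int(eid)] = (etype, args)
    return database

sample = [
    "#10 = CARTESIAN_POINT('', (0.0, 0.0, 0.0));",
    "#11 = ROUND_HOLE('hole1', #10, 5.0);",  # illustrative feature entity
]
db = parse_step(sample)
features = {k: v for k, v in db.items() if v[0] == "ROUND_HOLE"}
print(features)
```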


Design and analysis tool for optimal interconnect structures (DATOIS) (최적회로 연결선 구조를 위한 설계 및 해석도구 (DATOIS))

  • 박종흠; 김준희; 김석윤
    • Journal of the Korean Institute of Telematics and Electronics C, v.35C no.7, pp.20-29, 1998
  • As the packing density of recent submicron IC designs increases, interconnects gain importance. Because interconnects directly affect two major components of circuit performance, power dissipation and operating speed, circuit engineers are concerned with the optimal design of interconnects and with tools that aid in designing them. When circuit models of interconnects are given (including geometry and material information), analyzing the given structure is not an easy task, and it is much more difficult still to design an interconnect structure that meets given circuit characteristics. This paper focuses on the latter process, which has received little attention until now because of the complexity of the problem, and presents a design aid tool (DATOIS) for synthesizing interconnects. The tool stores circuit performance parameters for normalized interconnect geometries and has two operational modes: analysis mode and synthesis mode. In the analysis mode, circuit performance parameters are obtained by searching the internal database for a given geometry, interpolating results if necessary. In the synthesis mode, when a given circuit performance parameter matches a set of geometry conditions in the database, those geometry structures are output.
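
A rough sketch of the two operational modes under simple assumptions: analysis interpolates a stored performance parameter for a given geometry, and synthesis scans the stored table for geometries meeting a performance target. The one-dimensional table and its values are invented for illustration.

```python
# Analysis (lookup + interpolation) and synthesis (inverse search) over a
# stored table of normalized geometries; all numbers are illustrative.
import numpy as np

widths = np.array([0.5, 1.0, 2.0])   # normalized interconnect widths
delays = np.array([12.0, 7.5, 4.8])  # stored delay parameter per width

def analyze(width):
    """Analysis mode: interpolate the stored delay for a given geometry."""
    return np.interp(width, widths, delays)

def synthesize(max_delay):
    """Synthesis mode: list stored geometries meeting the delay target."""
    return widths[delays <= max_delay]

print(analyze(1.5))     # interpolated delay for width 1.5
print(synthesize(8.0))  # widths whose stored delay is at most 8.0
```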


An Object-Level Feature Representation Model for the Multi-target Retrieval of Remote Sensing Images

  • Zeng, Zhi; Du, Zhenhong; Liu, Renyi
    • Journal of Computing Science and Engineering, v.8 no.2, pp.65-77, 2014
  • To address the problem of multi-target retrieval (MTR) of remote sensing images, this study proposes a new object-level feature representation model. The model provides an enhanced image representation that improves the efficiency of MTR. Generating the model in our scheme includes processes such as object-oriented image segmentation, feature parameter calculation, and symbolic image database construction. The proposed model uses the spatial representation method of the extended nine-direction lower-triangular (9DLT) matrix to encode spatial relationships among objects, and organizes the image features according to the MPEG-7 standard. A similarity metric is proposed that improves the precision of similarity retrieval. Our method provides a trade-off strategy that supports flexible matching on the target features or on the spatial relationship between the query target and the image database. We implement this retrieval framework on a dataset of remote sensing images. Experimental results show that the proposed model achieves competitive, high retrieval precision.
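
A minimal sketch of the 9DLT idea as commonly described: each ordered pair of object centroids gets one of nine direction codes (eight compass directions plus "same position"), kept in a lower-triangular matrix. The code convention and centroids below are illustrative; the paper's extended variant may differ.

```python
# 9DLT-style direction coding between object centroids; conventions invented.
import math

def direction_code(p, q):
    if p == q:
        return 8  # same position
    angle = math.degrees(math.atan2(q[1] - p[1], q[0] - p[0])) % 360
    return int(((angle + 22.5) % 360) // 45)  # 0=E, 1=NE, 2=N, ... 7=SE

centroids = [(0, 0), (10, 0), (10, 10)]  # illustrative object centroids
matrix = [[direction_code(centroids[i], centroids[j]) for j in range(i)]
          for i in range(len(centroids))]
print(matrix)  # lower-triangular direction codes, e.g. [[], [4], [5, 6]]
```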

Development of Grinding Expert System by Fuzzy Model (Fuzzy 모델에 의한 연삭 가공의 전문가 시스템의 개발)

  • Kim, Nam-Gyeong; Kim, Geon-Hoe; Song, Ji-Bok
    • Journal of the Korean Society for Precision Engineering, v.8 no.3, pp.27-43, 1991
  • Grinding is a very effective machining method when high quality and high precision are required, but because the process is governed by many parameters, quantitative evaluation is difficult even under identical conditions; in practice, shop floors rely on the experience and skill of veteran operators rather than on scientific principles and engineering knowledge. To address this problem, this study adopts fuzzy theory and default theory so that the computer can approximate human reasoning, and develops an expert system for grinding (a system for setting optimal machining conditions and a troubleshooting system) that actively incorporates both the theoretical knowledge of experts and the sensory knowledge of skilled operators. In addition, so that the uncertainty and ambiguity of grinding data can be used effectively, machining data were regression-analyzed with fuzzy possibility theory and accumulated in a database of actual machining data for reuse. Running the developed system demonstrated its high practical utility.
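
As a small illustration of the fuzzy reasoning style such a system relies on, the sketch below uses triangular membership functions and two invented rules to recommend a grinding condition; the ranges, rules, and weighted-average defuzzification are illustrative, not the paper's actual knowledge base.

```python
# Two-rule fuzzy inference with triangular memberships; values are invented.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def recommend_wheel_speed(hardness):
    # Rule 1: IF workpiece is soft THEN wheel speed is low  (20 m/s)
    # Rule 2: IF workpiece is hard THEN wheel speed is high (35 m/s)
    w_soft = tri(hardness, 0, 20, 45)
    w_hard = tri(hardness, 35, 60, 80)
    if w_soft + w_hard == 0:
        return None  # no rule fires
    # Weighted-average defuzzification of the two rule consequents
    return (w_soft * 20.0 + w_hard * 35.0) / (w_soft + w_hard)

print(recommend_wheel_speed(40))  # blends the two rules -> 27.5
```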


Memory-based Pattern Completion in Database Semantics

  • Hausser, Roland
    • Language and Information, v.9 no.1, pp.69-92, 2005
  • Pattern recognition in cognitive agents is based on (i) the uninterpreted input data (e.g., parameter values) provided by the agent's hardware devices and (ii) interpreted patterns (e.g., templates) provided by the agent's memory. Computationally, the task consists in finding the memory data that best correspond to the input data, for any given input. Once the best-fitting memory data have been found, the input is recognized by applying to it the interpretation stored with the memorized pattern. This paper presents a fast-converging procedure that starts from a few initially recognized items and then analyzes the remainder of the input by systematically checking for items that memory shows to have been related to the initial items in previous encounters. In this way, known patterns are tried first, and only when they have been exhausted is an elementary exploration of the input commenced. Efficiency is improved further by choosing the candidate to be tested next according to frequency.
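
A schematic rendering of the completion loop described above, under invented data: recognition starts from a few items and tests memory-associated candidates first, most frequent first, before any brute-force exploration of the input would begin.

```python
# Memory-based pattern completion: expand from initially recognized items
# via co-occurrence counts; the association memory here is invented.
from collections import Counter

# memory[item] counts how often other items co-occurred with it before
memory = {
    "wheel": Counter({"car": 9, "bicycle": 4}),
    "door":  Counter({"car": 7, "house": 5}),
    "car":   Counter({"road": 6}),
}

def complete(initial, observed):
    recognized = set(initial)
    frontier = list(initial)
    while frontier:
        item = frontier.pop(0)
        # Try candidates previously related to this item, most frequent first
        for candidate, _ in memory.get(item, Counter()).most_common():
            if candidate not in recognized and candidate in observed:
                recognized.add(candidate)
                frontier.append(candidate)
    return recognized

# "wheel" and "door" were recognized initially; the rest follows from memory
print(complete({"wheel", "door"}, observed={"wheel", "door", "car", "road"}))
```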


Multi-Level Fusion Processing Algorithm for Complex Radar Signals Based on Evidence Theory

  • Tian, Runlan; Zhao, Rupeng; Wang, Xiaofeng
    • Journal of Information Processing Systems, v.15 no.5, pp.1243-1257, 2019
  • Because current algorithms cannot effectively fuse unknown complex radar signals for which no database exists, and their results are unstable, this paper presents a multi-level fusion processing algorithm for complex radar signals based on evidence theory. Specifically, a real-time database is first established, along with a similarity model based on parameter type, and a similarity matrix is calculated. Dempster-Shafer (D-S) evidence theory is then applied to fuse, in order, the parameter similarities of each signal and the trust values of each signal's target framework. The signals are finally combined and refined. Simulation experiments show that the proposed algorithm fuses unknown complex radar signals well, with higher efficiency and less processing time, and remains stable even with large numbers of samples.
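
A minimal sketch of the core Dempster-Shafer step the algorithm relies on: Dempster's rule combining two mass functions over a small frame of discernment. The frame and mass values are illustrative, not taken from the paper.

```python
# Dempster's rule of combination for two mass functions; masses are invented.
from itertools import product

def dempster(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}

A, B = frozenset({"emitter_A"}), frozenset({"emitter_B"})
m1 = {A: 0.7, B: 0.1, A | B: 0.2}  # evidence from one parameter's similarity
m2 = {A: 0.6, B: 0.3, A | B: 0.1}  # evidence from another parameter
print(dempster(m1, m2))  # combined belief masses, conflict renormalized
```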