• Title/Summary/Keyword: software engineering level (소프트웨어 공학수준)

Search Result 214, Processing Time 0.027 seconds

Proposal of Big Data Analysis and Visualization Technique Curriculum for Non-Technical Majors in Business Management Analysis (경영분석 업무에 종사하는 비 기술기반 전공자를 위한 빅데이터 분석 및 시각화 기법 교육과정 제안)

  • Hong, Pil-Tae;Yu, Jong-Pil
    • Journal of Practical Engineering Education
    • /
    • v.12 no.1
    • /
    • pp.31-39
    • /
    • 2020
  • Big data analysis is used across a variety of management and industrial settings and plays an important role in management decision making. The job competency of big data analysts engaged in management analysis work does not necessarily require mastery of low-level IT skills; rather, it requires the varied experience, humanities knowledge, and analytical skills of a data scientist. However, big data education offered by public educational and vocational training institutions based on the National Competency Standards (NCS) is taught from a software engineering perspective, and this teaching methodology can prove difficult and inefficient for non-technical majors. We therefore analyzed current big data platforms and their related technologies and defined which of them are requisite job competencies for field personnel. On this basis, courses on big data analysis and visualization techniques were organized for non-technical majors. This specialized curriculum was delivered to working-level staff of financial institutions engaged in management analysis and achieved better educational outcomes. The education methods presented in this study will help non-technical professionals carry out big data tasks effectively across industries and encourage visualization of big data analysis.

A Model of Time Dependent Design Value Engineering and Life Cycle Cost Analysis for Apartment Buildings (공동주택의 시간의존적 설계VE 및 LCC분석 모델)

  • Seo, Kwang-Jun;Choi, Mi-Ra;Shin, Nam-Soo
    • Korean Journal of Construction Engineering and Management
    • /
    • v.6 no.6 s.28
    • /
    • pp.133-141
    • /
    • 2005
  • In recent years, the importance of VE (value engineering) and LCC (life cycle cost) analysis for apartment building construction projects has been fully recognized. Accordingly, theoretical models, guidelines, and supporting software systems have been developed for value engineering and life cycle cost analysis in construction management, including large building systems. However, the level of consensus on VE and LCC analysis results is still low due to the lack of reliable maintenance data. This paper presents a value analysis method based on a time-dependent LCC model for rational investment decision making and design alternative selection in apartment building construction. The proposed method incorporates a time-dependent LCC model and a performance evaluation technique based on fuzzy logic theory to properly handle the uncertainties in statistical data and to analyze the value of alternatives more rationally. The presented time-dependent VE and LCC analysis procedure was applied to a real-world project, and this case study is discussed in the paper. The model and procedure presented in this study can contribute greatly to design VE alternative selection, life cycle cost estimation, and budget allocation for apartment building construction projects.
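The core of any LCC model is discounting future costs to present value; a minimal sketch of that arithmetic (illustrative costs and discount rate, not the paper's time-dependent fuzzy-logic formulation):

```python
def present_value(cost, rate, year):
    """Discount a cost incurred in `year` back to present value."""
    return cost / (1.0 + rate) ** year

def life_cycle_cost(initial_cost, annual_costs, rate):
    """LCC = initial cost + sum of discounted future costs.

    annual_costs: dict mapping year -> cost incurred in that year.
    """
    return initial_cost + sum(
        present_value(c, rate, t) for t, c in annual_costs.items()
    )

# Example: 100 up front, maintenance of 10 in years 1-3, 5% discount rate.
lcc = life_cycle_cost(100.0, {1: 10.0, 2: 10.0, 3: 10.0}, 0.05)
```

A time-dependent model would additionally let the maintenance costs themselves vary with the age of the building rather than stay constant.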

A Benchmark of Open Source Data Mining Package for Thermal Environment Modeling in Smart Farm(R, OpenCV, OpenNN and Orange) (스마트팜 열환경 모델링을 위한 Open source 기반 Data mining 기법 분석)

  • Lee, Jun-Yeob;Oh, Jong-wo;Lee, DongHoon
    • Proceedings of the Korean Society for Agricultural Machinery Conference
    • /
    • 2017.04a
    • /
    • pp.168-168
    • /
    • 2017
  • Despite the growing number of environmental sensors, imaging systems, and feeding-management systems in ICT-converged smart farms, techniques for making effective use of the data these devices produce remain insufficient. For pig houses, data analysis and modeling technology is needed that can monitor and predict animal welfare and growth in real time. This requires early detection of physiological and behavioral changes in livestock and real-time monitoring, analysis, and prediction of welfare status; a representative information-engineering approach to this is data mining. Among the many software tools available for data mining research, four open-source packages were compared: 1) R (https://cran.r-project.org), 2) OpenCV (http://opencv.org), 3) OpenNN (http://www.opennn.net), and 4) Orange (http://orange.biolab.si). The comparison focused on factors to be considered in data analysis for thermal-environment modeling in a smart pig house: the time required to derive an analysis algorithm, visualization features, and interoperability with other libraries. The benchmark system ran Linux Ubuntu 16.04.4 LTS (x64) with a 3.6 GHz CPU and 64 GB of memory. In terms of development languages, the packages support 1) R scripts, 2) C/C++, Python, and Java, 3) C++, and 4) C/C++, Python, and Cython, so C/C++ and Python were relatively advantageous. For analysis algorithms provided as source-level libraries, cross-platform development is possible, and results developed on one operating system can be used on others without a separate porting step. Among the built-in libraries, R provides the largest number of data mining algorithms, which appears to stem from R's open ecosystem, in which newly added libraries are shared online through the cloud. OpenCV's strength lies in image processing, and OpenNN's in releasing its neural-network learning libraries at the source-code level. Orange, unlike the other packages that focus on supplying library collections, integrates visualization and workflow composition into a unified user interface. Future work will study supplementary data-processing techniques to cope with the time complexity required by thermal-environment modeling, aiming to implement smart-farm thermal-environment modeling in real time.


Methods for Stabilizing QoS in Mobile Cloud Computing (모바일 클라우드 컴퓨팅을 위한 QoS 안정화 기법)

  • La, Hyun Jung;Kim, Soo Dong
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.8
    • /
    • pp.507-516
    • /
    • 2013
  • Mobile devices have limited computing power and resources. Since mobile devices are equipped with rich network connectivity, an approach that subscribes to cloud services can effectively remedy the problem; this is called Mobile Cloud Computing (MCC). Most work on MCC depends on offloading functional components at runtime. However, these works only consider a limited version of offloading to a pre-defined, designated node. Moreover, they are limited in managing the services subscribed to by applications. To provide a comprehensive and practical solution for MCC, in this paper we propose a self-stabilizing process and its management-related methods. The proposed process is based on the autonomic computing paradigm and works with diverse quality remedy actions such as migrating or replicating services. We also devise a practical offloading mechanism, still in an initial stage of study, based on our proposed MCC meta-model. By adopting the self-stabilization process for MCC, many of the technical issues are effectively resolved, and mobile cloud environments can maintain consistent levels of quality in an autonomous manner.

Quantification of Rockwool Substrate Water Content using a Capacitive Water Sensor (정전용량 수분센서의 배지 함수량 정량화)

  • Baek, Jeong-Hyeon;Park, Ju-Sung;Lee, Ho-Jin;An, Jin-Hee;Choi, Eun-Young
    • Journal of Bio-Environment Control
    • /
    • v.30 no.1
    • /
    • pp.27-36
    • /
    • 2021
  • A capacitive water sensor was developed to measure capacitance over a wide part of a substrate using an insulated electrode plate (30 cm × 10 cm) with copper and Teflon attached on either side of the substrate. This study aimed to convert the capacitance output of the condenser-type capacitance sensor into substrate water content. The quantification experiment measured the changes in substrate water weight and capacitance while a nutrient solution was supplied, and subsequently compared these values. The substrate water weight and capacitance were measured every 20 to 30 seconds using the sensor and a load cell with software developed specifically for this study. Using a curve-fitting program, substrate water content was estimated from the capacitance output, with the water weight and capacitance of the substrate as variables. When the amount of water supplied was increased, the capacitance tended to increase. The coefficient of variation (CV) in capacitance as a function of water weight in the substrate was greatest at 1.0 kg of water weight, compared with the other weights. The fitting was therefore performed for water weights above 1.0 kg, from 1.7 to 6.0 kg. The correlation coefficient between capacitance and water weight in the substrate was 0.9696. A calibration equation estimated the water content from the capacitance, and the estimate was compared with the substrate water weight measured by the load cell.
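The calibration step described above, fitting a curve from sensor output to load-cell weight and checking the correlation, can be sketched as follows. The (capacitance, weight) pairs here are invented for illustration and are not the paper's data; the study's actual calibration curve and 0.9696 correlation came from its own measurements.

```python
import numpy as np

# Hypothetical (capacitance, water-weight) calibration pairs spanning the
# 1.7-6.0 kg range the study fitted over; values are illustrative only.
capacitance = np.array([120.0, 180.0, 240.0, 300.0, 360.0])  # sensor output
water_kg    = np.array([1.7, 2.8, 3.9, 5.0, 6.0])            # load-cell weight

# Fit a calibration curve (here a straight line) mapping capacitance
# to water weight, then check the correlation coefficient.
coeffs = np.polyfit(capacitance, water_kg, deg=1)   # [slope, intercept]
estimate = np.polyval(coeffs, capacitance)          # fitted water weights
r = np.corrcoef(capacitance, water_kg)[0, 1]        # correlation coefficient
```

A higher-order polynomial can be substituted via `deg` if the sensor response is visibly nonlinear over the fitted range.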

A Hardware Implementation of Image Scaler Based on Area Coverage Ratio (면적 점유비를 이용한 영상 스케일러의 설계)

  • 성시문;이진언;김춘호;김이섭
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.40 no.3
    • /
    • pp.43-53
    • /
    • 2003
  • Unlike in analog display devices, the physical screen resolution of digital display devices is fixed at manufacturing time, which is a weak point of digital devices, since the resolution of the images to be displayed varies. Interpolation or decimation is therefore needed to make the number of input pixels match the screen resolution; this process is called image scaling. Much research has aimed to reduce the hardware cost and image distortion of image scaling algorithms. In this paper, we propose the Winscale algorithm, which recasts scale-up/down in the continuous domain as scale-up/down in the discrete domain, making it well suited to digital display devices. A hardware implementation of the image scaler was carried out in Verilog XL, and the chip was fabricated in a 0.5 μm Samsung SOG technology. The hardware cost and scalability are compared with those of conventional image scaling algorithms used in other software. The Winscale algorithm proves more scalable than other image scaling algorithms of similar hardware cost and can be used in various digital display devices that require image scaling.
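The area-coverage idea in the title, in which each output pixel is a weighted average of the input pixels it overlaps, with weights equal to the covered fractions, can be illustrated in one dimension. This is a generic area-coverage resampler for intuition, not the paper's Winscale hardware algorithm:

```python
def area_scale_1d(pixels, out_len):
    """Rescale a 1-D row of pixels by area-coverage weighting: each
    output pixel averages the input pixels it overlaps, weighted by
    the length of the overlap with the output pixel's span."""
    in_len = len(pixels)
    scale = in_len / out_len  # input pixels per output pixel
    out = []
    for i in range(out_len):
        start, end = i * scale, (i + 1) * scale  # span in input coordinates
        acc = 0.0
        j = int(start)
        while j < end and j < in_len:
            cover = min(end, j + 1) - max(start, j)  # overlap length
            acc += pixels[j] * cover
            j += 1
        out.append(acc / scale)  # normalize by the span length
    return out
```

For example, downscaling `[10, 20, 30, 40]` to two pixels averages each adjacent pair, giving `[15.0, 35.0]`; a 2-D scaler applies the same weighting along both axes.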

Methods to Enhance Service Scalability Using Service Replication and Migration (서비스 복제 및 이주를 이용한 서비스 확장성 향상 기법)

  • Kim, Ji-Won;Lee, Jae-Yoo;Kim, Soo-Dong
    • Journal of KIISE:Software and Applications
    • /
    • v.37 no.7
    • /
    • pp.503-517
    • /
    • 2010
  • Service-oriented computing, an effective paradigm for developing service applications from reusable services, has become popular. In service-oriented computing, the service consumer has no responsibility for managing services and simply invokes the services that providers publish. Service providers, on the other hand, must manage all resources and data so that consumers can use a service anytime and anywhere. However, it is hard for providers to manage service quality because the number of service consumers is unbounded. Service scalability, i.e. providing services at the quality levels specified in a service level agreement, therefore becomes a key problem in service-oriented computing. There has been much research on scalability in the networking, database, and distributed computing areas, but research on defining service scalability and on metrics for measuring it is still immature in service engineering. In this paper, we construct a service network that connects multiple service nodes and integrates all their resources for management. We also present a service scalability framework that manages scalability through service migration and replication. Section 3 presents the structure of the scalability management framework and its basic functionality. Section 4 proposes the scalability enhancement mechanisms needed to realize the framework's functionality. Section 5 designs and implements the framework using the proposed mechanisms. Section 6 demonstrates a case study that dynamically manages services in a multi-node environment by applying our framework, showing the applicability of our scalability management framework and mechanism.
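The migration side of such a framework boils down to a policy loop: detect an overloaded node and move a service to a less loaded one. A toy threshold-based sketch (invented node/service names and a deliberately naive policy, not the paper's framework):

```python
class ServiceNode:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.services = {}  # service name -> current load

    def load(self):
        return sum(self.services.values())

def rebalance(nodes, threshold=0.8):
    """Toy migration policy: while a node runs above `threshold` of its
    capacity (and hosts more than one service), migrate its most loaded
    service to the node with the lowest load-to-capacity ratio."""
    moves = []
    for node in nodes:
        while node.load() > threshold * node.capacity and len(node.services) > 1:
            target = min(nodes, key=lambda n: n.load() / n.capacity)
            if target is node:
                break  # nowhere better to move the service
            svc = max(node.services, key=node.services.get)
            load = node.services.pop(svc)
            target.services[svc] = load
            moves.append((svc, node.name, target.name))
    return moves
```

A replication policy differs only in copying the service (splitting its load) instead of popping it; either way the interesting design questions are the trigger condition and the target-selection metric.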

A Study on the Identification of Critical Intelligent Information Technologies and Their Application Areas in the Defense Sector (국방부문 핵심지능정보기술 식별 및 활용방안 연구)

  • 김화수;이승구
    • Proceedings of the Korea Inteligent Information System Society Conference
    • /
    • 2000.11a
    • /
    • pp.407-416
    • /
    • 2000
  • Managers in the defense sector need a basic grasp of the latest information technologies in order to manage defense information system projects efficiently, effectively, and successfully. The core and emerging technologies they should know to build, operate, and maintain low-cost, high-efficiency defense information systems include artificial intelligence, multimedia, virtual reality, simulation, telepresence, nanotechnology, databases, parallel processing, robotics, and software engineering. However, apart from ICT specialists, managers in the defense sector generally have a relatively low understanding of information technology, which makes it difficult to prepare, plan, and carry out defense projects efficiently. This study therefore presents the development of critical intelligent information technologies in the defense sector and their military applications in a form managers can readily understand, and investigates how efficient project management can be achieved. It identifies critical intelligent information technologies for defense and examines how the identified technologies can be applied to C4I (command, control, communications, and computer) systems, embedded weapon systems, education and training information systems, and resource management information systems, together with the expected effects of each application, so that defense-sector managers can refer to the results while coordinating, controlling, verifying, supervising, and planning projects, thereby building the capability to manage defense projects at low cost and high efficiency. The main contents of the development and application of critical intelligent information technologies in the defense sector are summarized and presented.


Nonlinear Impact Analysis for Eco-Pillar Debris Barrier with Hollow Cross-Section (중공트랙단면 에코필라 사방댐의 비선형 충돌해석)

  • Kim, Hyun-Gi;Kim, Bum-Joon
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.20 no.7
    • /
    • pp.430-439
    • /
    • 2019
  • In this study, a nonlinear impact analysis was performed to evaluate the safety and damage of an eco-pillar debris barrier with a hollow cross-section, which was proposed to improve constructability and economic efficiency. Construction of concrete eco-pillar debris barriers has increased recently. However, there are no design standards for debris barriers in Korea, and studies on their performance in extreme environments are hard to find. The eco-pillar debris barrier was therefore analyzed using the rock impact speed estimated from the debris flow velocity, with rock diameters determined according to ETAG 27. The impact position, impact angle, and rock diameter were considered as variables. A nonlinear concrete material model was applied, and damage was estimated using ABAQUS software. As a result, the damage ratio was below 1.0 for rock diameters of 0.3 m and 0.5 m, but reached 1.39 at a diameter of 0.7 m. This study can serve as basic data on impact force for the cross-sectional design of eco-pillar debris barriers.

A Method for Prediction of Quality Defects in Manufacturing Using Natural Language Processing and Machine Learning (자연어 처리 및 기계학습을 활용한 제조업 현장의 품질 불량 예측 방법론)

  • Roh, Jeong-Min;Kim, Yongsung
    • Journal of Platform Technology
    • /
    • v.9 no.3
    • /
    • pp.52-62
    • /
    • 2021
  • Quality control is critical at manufacturing sites and is key to predicting the risk of quality defects before manufacturing. However, the reliability of manual quality control methods is limited by human and physical constraints, because manufacturing processes vary across industries. These limitations become particularly obvious in domains with numerous manufacturing processes, such as the manufacture of major nuclear equipment. This study proposes a novel method for predicting the risk of quality defects using natural language processing and machine learning. Production data collected over six years at a factory that manufactures main equipment installed in nuclear power plants were used. In the text preprocessing stage, a mapping method was applied to the word dictionary so that domain knowledge was appropriately reflected, and a hybrid algorithm combining n-grams, Term Frequency-Inverse Document Frequency, and Singular Value Decomposition was constructed for sentence vectorization. In the experiment to classify risky processes leading to poor quality, k-fold cross-validation was applied to cases ranging from unigrams to cumulative trigrams. For objective experimental results, Naive Bayes and Support Vector Machine were used as classification algorithms, achieving a maximum accuracy of 0.7685 and an F1-score of 0.8641, which shows that the proposed method is effective. Its performance was also compared with the votes of field engineers, and the results revealed that the proposed method outperformed them. The method can thus be implemented for quality control at manufacturing sites.
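The vectorization step described above, cumulative n-grams weighted by TF-IDF, can be sketched in plain Python. This sketch omits the SVD dimensionality-reduction stage and the domain word-dictionary mapping, and the example documents are invented, so it illustrates only the n-gram/TF-IDF portion of the paper's hybrid pipeline:

```python
import math
from collections import Counter

def ngrams(tokens, n_max=3):
    """Cumulative n-gram features (unigram up to n_max-gram)."""
    feats = []
    for n in range(1, n_max + 1):
        feats += [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return feats

def tfidf(docs, n_max=3):
    """TF-IDF weights over cumulative n-grams for each document.

    idf = log(N / df): a gram present in every document scores 0.
    """
    grams = [Counter(ngrams(doc.split(), n_max)) for doc in docs]
    df = Counter(g for doc in grams for g in doc)  # document frequency
    n = len(docs)
    return [
        {g: tf * math.log(n / df[g]) for g, tf in doc.items()}
        for doc in grams
    ]
```

The resulting sparse weight dictionaries would then be reduced with SVD and fed to a Naive Bayes or SVM classifier under k-fold cross-validation, as in the study.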