• Title/Summary/Keyword: Systems Engineering Tool


The design of communication protocol for controlling efficiently modular medical instruments (모듈화된 의료장비들의 효율적 제어를 위한 통신 프로토콜 설계)

  • 신창민;김영길
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2000.10a
    • /
    • pp.284-287
    • /
    • 2000
  • Recently, medical devices have tended toward modularization to satisfy users' complex needs, since the most effective way to observe a patient's condition, make a diagnosis, and deliver treatment is to collect data from various devices and control their operation accordingly. Modularization is also popular because it makes many individual systems easier to manage as a whole. This study implemented a communication protocol that connects modular medical devices so they can be controlled by a single control system. The implemented system consists of one master module, which controls all modules and manages communication, and many slave modules. Communication between modules uses SPI (Serial Peripheral Interface), a synchronous serial method chosen for exact transmission and reception of data. All communication is carried out in a packet format, which allows errors to be detected. The protocol also provides a PnP (Plug and Play) function that automatically detects modules being connected or removed at run time. The protocol transmitted and received data reliably at speeds above 1 Mbps, and in a practical application to a ventilator it was confirmed to exchange real-time data. Various functions of the central control system are also implemented in this protocol.

  • PDF
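The packet-based error detection described in the abstract can be illustrated with a small framing sketch. The field layout below (start byte, module ID, length, payload, checksum) is an assumption for demonstration, not the paper's actual protocol:

```python
# Hypothetical master-slave packet format with a simple additive checksum.
# The layout (START, module ID, length, payload, checksum) is illustrative,
# not the protocol defined in the paper.

START = 0xAA

def build_packet(module_id: int, payload: bytes) -> bytes:
    body = bytes([module_id, len(payload)]) + payload
    checksum = (256 - sum(body)) % 256  # makes (sum(body) + checksum) % 256 == 0
    return bytes([START]) + body + bytes([checksum])

def parse_packet(packet: bytes):
    if packet[0] != START:
        raise ValueError("bad start byte")
    body, checksum = packet[1:-1], packet[-1]
    if (sum(body) + checksum) % 256 != 0:
        raise ValueError("checksum error")  # transmission error detected
    module_id, length = body[0], body[1]
    payload = body[2:2 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    return module_id, payload
```

A corrupted byte anywhere in the body changes the checksum sum, so `parse_packet` rejects the frame instead of passing bad data to a slave module.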

Energy Efficiency Enhancement of Macro-Femto Cell Tier (매크로-펨토셀의 에너지 효율 향상)

  • Kim, Jeong-Su;Lee, Moon-Ho
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.18 no.1
    • /
    • pp.47-58
    • /
    • 2018
  • The heterogeneous cellular network (HCN) is a key technology for future fifth-generation (5G) wireless networks. The heterogeneous network considered consists of randomly deployed macrocell base stations (MBSs) overlaid with femtocell base stations (FBSs). Stochastic geometry has been shown to be a very powerful tool to model, analyze, and design networks with random topologies such as wireless ad hoc networks, sensor networks, and multi-tier cellular networks. HCNs can be designed energy-efficiently by deploying various BSs belonging to different networks, which has drawn significant attention as one of the technologies for future 5G wireless networks. In this paper, we propose a switching off/on system that enables the BSs in the cellular network to consume power efficiently by introducing active/sleep modes, which reduces the interference and power consumption of the MBSs and FBSs on an individual basis as well as improving the energy efficiency of the cellular network. We formulate the minimization of the power consumption for the MBSs and FBSs as well as an optimization problem to maximize the energy efficiency subject to throughput outage constraints, which can be solved using the Karush-Kuhn-Tucker (KKT) conditions according to the femto-tier BS density. We also formulate and compare the coverage probability and the energy efficiency in HCN scenarios with and without coordinated multi-point (CoMP) to avoid coverage holes.
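A back-of-the-envelope sketch of why the active/sleep switching above saves power: total consumption falls as a fraction of femto BSs sleeps, so energy efficiency (throughput per watt) rises. All power and throughput numbers below are illustrative assumptions, not the paper's model:

```python
# Toy power model for a macro-femto tier with active/sleep modes.
# Per-BS power values (watts) are assumed for illustration only.
def network_power(n_macro, n_femto, sleep_fraction,
                  p_macro=130.0, p_femto_active=12.0, p_femto_sleep=3.0):
    active = n_femto * (1 - sleep_fraction)
    asleep = n_femto * sleep_fraction
    return n_macro * p_macro + active * p_femto_active + asleep * p_femto_sleep

throughput = 900.0                          # Mbps, assumed roughly preserved
p_always_on = network_power(5, 100, 0.0)    # all femto BSs active
p_with_sleep = network_power(5, 100, 0.4)   # 40% of femto BSs sleeping
print(p_always_on, p_with_sleep)            # → 1850.0 1490.0
print(throughput / p_always_on < throughput / p_with_sleep)  # → True
```

The paper's actual formulation optimizes this trade-off under throughput outage constraints via the KKT conditions; this fragment only shows the direction of the effect.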

Classification of Parent Company's Downward Business Clients Using Random Forest: Focused on Value Chain at the Industry of Automobile Parts (랜덤포레스트를 이용한 모기업의 하향 거래처 기업의 분류: 자동차 부품산업의 가치사슬을 중심으로)

  • Kim, Teajin;Hong, Jeongshik;Jeon, Yunsu;Park, Jongryul;An, Teayuk
    • The Journal of Society for e-Business Studies
    • /
    • v.23 no.1
    • /
    • pp.1-22
    • /
    • 2018
  • The value chain has been utilized as a strategic tool to improve competitive advantage, mainly at the enterprise and industrial levels. To conduct value chain analysis at the enterprise level, however, the client companies of the parent company must be classified according to whether they belong to its value chain. Establishing a value chain for a single company can be performed smoothly by experts, but building one that spans multiple companies takes a great deal of cost and time. Thus, this study proposes a model that automatically classifies the companies forming a value chain based on actual transaction data. A total of 19 transaction attribute variables were extracted from the transaction data and processed into input data for a machine learning method. The proposed model was constructed using the Random Forest algorithm. The experiment was conducted on an automobile parts company. The experimental results demonstrate that the proposed model can classify the client companies of the parent company automatically with 92% accuracy, 76% F1-score, and 94% AUC. The empirical study also confirms that a few transaction attributes, such as transaction concentration, transaction amount, and total sales per customer, are the main characteristics representing the companies that form a value chain.
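The classification setup described above can be sketched with scikit-learn's `RandomForestClassifier`. The synthetic features and labels below stand in for the paper's 19 transaction attributes and value-chain membership data, which are not public:

```python
# Minimal Random Forest classification sketch mirroring the abstract's setup:
# 19 features per company, binary value-chain membership, evaluated with
# accuracy, F1-score, and AUC. Data here is synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 19))           # 19 transaction attribute variables
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in membership label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

pred = clf.predict(X_te)
proba = clf.predict_proba(X_te)[:, 1]
print(accuracy_score(y_te, pred), f1_score(y_te, pred), roc_auc_score(y_te, proba))
```

`clf.feature_importances_` would then surface the dominant attributes, analogous to the paper's finding that transaction concentration and amount drive the classification.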

A study of statistical techniques for clinical data about cerebrovascular diseases (중풍임상자료(中風臨床資料)에 대한 통계적(統計的) 분석방법연구(分析方法硏究))

  • Kang, Hyo-Shin;Kwon, Young-Kyu;Park, Chang-Gook;Shin, Yang-Kyu;Kim, Sang-Chul
    • The Journal of Korean Medicine
    • /
    • v.17 no.1 s.31
    • /
    • pp.302-328
    • /
    • 1996
  • I. Objective and significance of the study: to design a data acquisition chart that facilitates data collection and analysis. The chart is also useful for solving problems that arise from personal variation in clinical symptoms and for filling the knowledge base of an expert system. II. Content and scope: 1. Collect the diagnostic knowledge of cerebrovascular diseases from doctors and analyze it. 2. Design a data acquisition chart. 3. Compare ODS and doctors with respect to their diagnosis results. 4. Select patients determined to suffer from cerebrovascular diseases using CT (computed tomography) scans, and collect clinical data from them. III. Results and application: the chart can be used for data collection and analysis in different medical hospitals. The results of the data analysis facilitate collecting clinical data about other diseases and implementing the knowledge base. The collected data also serves as a tool for medical education and for cooperative diagnosis by oriental and western medical doctors.

  • PDF

Gender Analysis in Elderly Speech Signal Processing (노인음성신호처리에서의 젠더 분석)

  • Lee, JiYeoun
    • Journal of Digital Convergence
    • /
    • v.16 no.10
    • /
    • pp.351-356
    • /
    • 2018
  • Changes in the vocal cords due to aging can change the frequency of speech, and the speech signals of the elderly can be automatically distinguished from normal speech signals through various analyses. The purpose of this study is to provide a tool that is easily accessible to the elderly and disabled, who can be excluded from a rapidly changing technological society, and to improve voice recognition performance. In the study, the gender of the subjects was reported as a sex analysis, and equal numbers of female and male voice samples were used. In addition, gender analysis was applied to target the voices of the elderly rather than voices of all ages. Finally, we applied a review methodology of standards and reference models to reduce gender differences. Ten Korean women and ten men aged 70 to 80 years participated in this study. Comparing the F0 value extracted directly from the waveform with the F0 extracted by the TF32 and WaveSurfer speech analysis programs, WaveSurfer analyzed the F0 of elderly voices better than TF32. However, there is still a need for a voice analysis program designed specifically for elderly people. In conclusion, analyzing the voices of the elderly will improve the speech recognition and synthesis capabilities of existing smart medical systems.
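The F0 (fundamental frequency) measurement that TF32 and WaveSurfer automate can be sketched with a basic autocorrelation pitch estimator. The synthetic 200 Hz "voice" below is an assumption for demonstration, not data from the study:

```python
# Simple autocorrelation-based F0 estimation, the kind of measurement
# speech analysis programs perform. Signal and pitch range are illustrative.
import numpy as np

def estimate_f0(signal: np.ndarray, sample_rate: int,
                fmin: float = 75.0, fmax: float = 400.0) -> float:
    x = signal - signal.mean()
    corr = np.correlate(x, x, mode="full")[len(x) - 1:]  # autocorrelation
    lo = int(sample_rate / fmax)   # shortest plausible pitch period
    hi = int(sample_rate / fmin)   # longest plausible pitch period
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sample_rate / lag

sr = 16000
t = np.arange(sr) / sr
voice = np.sin(2 * np.pi * 200 * t)      # synthetic 200 Hz tone
print(round(estimate_f0(voice, sr), 1))  # close to 200.0
```

Real elderly voices add jitter, shimmer, and noise, which is why the study compares dedicated analysis programs rather than relying on a raw estimator like this.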

A Study on the Architecture Modeling of Information System using Simulation (시뮬레이션을 이용한 정보시스템 아키텍쳐 모델링에 관한 연구)

  • Park, Sang-Kook;Kim, Jong-Bae
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2013.10a
    • /
    • pp.455-458
    • /
    • 2013
  • The conventional design of information system architecture, based on the designer's personal experience with information systems, has acted as a limit on appropriate resource allocation and performance improvement. Architecture design that depends on personal experience varies with the designer's experience, intellectual grasp of the related tasks and surroundings, and individual propensity, producing uneven architecture quality. Ultimately, these problems cause a waste of expensive hardware resources. In the workplace, diverse post-monitoring tools have been developed and are in operation to find bottlenecks and process problems in information system operation. However, there are no simulation tools or models for predicting and counteracting such problems at the early stage of architecture design. To solve these problems, we first develop a simulation model for designing information system architecture in pilot form and verify its validity. If the error rate falls within the permissible range, the simulation can be said to reflect the characteristics of the information system architecture. Once the model is developed to a level where it can be used in various ways, more accurate performance computation will become possible, moving away from the old reliance on manual calculations and preventing the idle resources and wasted expense that come from wrongly designed architecture.

  • PDF

Distance measurement System from detected objects within Kinect depth sensor's field of view and its applications (키넥트 깊이 측정 센서의 가시 범위 내 감지된 사물의 거리 측정 시스템과 그 응용분야)

  • Niyonsaba, Eric;Jang, Jong-Wook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2017.05a
    • /
    • pp.279-282
    • /
    • 2017
  • The Kinect depth sensor, a depth camera developed by Microsoft as a natural user interface for games, has emerged as a very useful tool in the computer vision field. In this paper, taking advantage of the Kinect depth sensor and its high frame rate, we developed a distance measurement system using the Kinect camera and tested it for unmanned vehicles, which need vision systems to perceive the surrounding environment as humans do in order to detect objects in their path. The Kinect depth sensor is used to detect objects in its field of view and to enhance the measurement of the distance from objects to the vision sensor. Each detected object is identified accurately to determine whether it is a real object or pixel noise, reducing processing time by ignoring pixels that are not part of a real object. Using depth segmentation techniques along with the OpenCV library for image processing, we can identify objects present within the Kinect camera's field of view and measure their distance from the sensor. Tests show promising results: the system could also be used in autonomous vehicles equipped with the Kinect camera as a low-cost range sensor, for further processing depending on the application type when they come within a certain distance of detected objects.

  • PDF
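The pipeline described above — segment the depth frame by distance, discard tiny pixel clusters as noise, report the distance to each remaining object — can be sketched as follows. The synthetic 2 m "object" against a 5 m background is an assumption; a real system would read frames from the Kinect SDK and could use OpenCV's connected-component functions instead of `scipy.ndimage`:

```python
# Depth segmentation sketch: threshold a depth frame (mm), label connected
# regions, reject small clusters as pixel noise, and return each object's
# nearest-point distance. Frame contents are synthetic, for illustration.
import numpy as np
from scipy import ndimage

def object_distances(depth_mm: np.ndarray, max_range_mm: int = 4000,
                     min_pixels: int = 50):
    mask = (depth_mm > 0) & (depth_mm < max_range_mm)  # in-range pixels only
    labels, n = ndimage.label(mask)                    # connected components
    distances = []
    for i in range(1, n + 1):
        region = labels == i
        if region.sum() < min_pixels:                  # reject pixel noise
            continue
        distances.append(int(depth_mm[region].min()))  # nearest point of object
    return distances

frame = np.full((480, 640), 5000, dtype=np.int32)  # 5 m background
frame[200:260, 300:360] = 2000                     # synthetic object at 2 m
frame[10, 10] = 1500                               # a lone noisy pixel
print(object_distances(frame))                     # → [2000]
```

The `min_pixels` rejection step corresponds to the abstract's point about ignoring pixels that are not part of a real object to reduce processing time.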

Impacts of wave and tidal forcing on 3D nearshore processes on natural beaches. Part I: Flow and turbulence fields

  • Bakhtyar, R.;Dastgheib, A.;Roelvink, D.;Barry, D.A.
    • Ocean Systems Engineering
    • /
    • v.6 no.1
    • /
    • pp.23-60
    • /
    • 2016
  • The major objective of this study was to develop further understanding of 3D nearshore hydrodynamics under a variety of wave and tidal forcing conditions. The main tool used was a comprehensive 3D numerical model - combining the flow module of Delft3D with the WAVE solver of XBeach - of nearshore hydro- and morphodynamics that can simulate flow, sediment transport, and morphological evolution. Surf-swash zone hydrodynamics were modeled using the 3D Navier-Stokes equations, combined with various turbulence models (${\kappa}-{\varepsilon}$, ${\kappa}-L$, ATM and H-LES). Sediment transport and resulting foreshore profile changes were approximated using different sediment transport relations that consider both bed- and suspended-load transport of non-cohesive sediments. The numerical set-up was tested against field data, with good agreement found. Different numerical experiments under a range of bed characteristics and incident wave and tidal conditions were run to test the model's capability to reproduce 3D flow, wave propagation, sediment transport and morphodynamics in the nearshore at the field scale. The results were interpreted according to existing understanding of surf and swash zone processes. Our numerical experiments confirm that the angle between the crest line of the approaching wave and the shoreline defines the direction and strength of the longshore current, while the longshore current velocity varies across the nearshore zone. The model simulates the undertow, hydraulic cell and rip-current patterns generated by radiation stresses and longshore variability in wave heights. Numerical results show that a non-uniform seabed is crucial for generation of rip currents in the nearshore (when bed slope is uniform, rips are not generated). Increasing the wave height increases the peaks of eddy viscosity and TKE (turbulent kinetic energy), while increasing the tidal amplitude reduces these peaks. 
Wave and tide interaction has the most striking effect on the foreshore profile, with the formation of the intertidal bar. High values of eddy viscosity, TKE and wave set-up spread offshore for coarser grain sizes. Beach profile steepness modifies the nearshore circulation pattern, significantly enhancing the vertical component of the flow. The local recirculation within the longshore current in the inshore region causes a transient offshore shift and strengthening of the longshore current. Overall, the analysis shows that, with reasonable hypotheses, it is possible to simulate the nearshore hydrodynamics subjected to oceanic forcing, consistent with existing understanding of this area. Part II of this work presents 3D nearshore morphodynamics induced by the tides and waves.

Detection and Analysis of the Liver Area and Liver Tumors in CT Scans (CT 영상에서의 간 영역과 간 종양 추출 및 분석)

  • Kim, Kwang-Baek
    • Journal of Intelligence and Information Systems
    • /
    • v.13 no.1
    • /
    • pp.15-27
    • /
    • 2007
  • In Korea, hepatoma is the third most frequent cause of death from cancer, accounting for 17.2% of all cancer deaths, with a death rate of about 21 persons per 100,000. This paper proposes an automatic method for extracting areas suspicious for hepatoma from CT scans and evaluates its usefulness as an auxiliary tool for the diagnosis of hepatoma. To detect tumors inside the liver in CT scans, the liver area is first extracted from the roughly 45 to 50 CT scans obtained at 2.5-mm intervals starting from the lower part of the chest. In extracting the liver area, after unrelated areas outside the ribs are removed, the areas of the internal organs are separated and enlarged using the intensity information of the CT scan. The liver area is then extracted from the separated areas using information on the position and morphology of the liver. Since hepatoma is a hypervascular tumor, the area corresponding to hepatoma appears brighter than its surroundings in contrast-enhanced CT scans, and when hepatoma shows expansile growth, the area has a spherical shape. Therefore, for the extraction of hepatoma areas, areas that are brighter than their surroundings and globe-shaped are selected as candidates within the liver area, and areas appearing at the same position in successive CT scans among the candidates are discriminated as hepatoma. For performance evaluation, the results obtained by applying the proposed method to CT scans were compared with diagnoses by radiologists. The evaluation showed that all liver areas and liver tumors were extracted exactly and that the proposed method has high utility as an auxiliary diagnostic tool for discriminating liver tumors.

  • PDF
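The final filtering step described above — keeping only candidate areas that recur at the same position in successive CT slices — can be sketched as a toy centroid-matching routine. Candidates are represented here as (x, y) centroids per slice; the coordinates and tolerance are illustrative, not from the paper:

```python
# Toy cross-slice persistence check: a candidate region (bright, roundish
# blob) is confirmed only if a nearby candidate appears in later slices.
# Centroids and tolerance are hypothetical values for demonstration.
def persistent_candidates(slices, tol=5.0, min_slices=2):
    """Return centroids from the first slice that recur within `tol`
    pixels in at least `min_slices` slices overall."""
    confirmed = []
    for x, y in slices[0]:
        count = 1
        for later in slices[1:]:
            if any(abs(x - u) <= tol and abs(y - v) <= tol for u, v in later):
                count += 1
        if count >= min_slices:
            confirmed.append((x, y))
    return confirmed

slice_a = [(120, 88), (40, 200)]   # candidates in slice n
slice_b = [(122, 90)]              # candidate recurring in slice n+1
print(persistent_candidates([slice_a, slice_b]))  # → [(120, 88)]
```

The isolated candidate at (40, 200) is dropped because it never recurs, mirroring how the paper discriminates true hepatoma from one-off bright regions.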

A Study on Patent Indexes for Characteristics Analysis of IP Portfolios (IP포트폴리오의 특성분석을 위한 특허지표 개발에 대한 연구)

  • Yoon, Jeong-Yoen;Ryu, Tae-Kyu;Yoon, Jang-Hyeok
    • Journal of Information Management
    • /
    • v.43 no.2
    • /
    • pp.67-83
    • /
    • 2012
  • Patents are sources reflecting technology development through research and development (R&D) as well as tools to secure economic benefits in the market, so using patent information is crucial for decision-making processes in formulating technology development strategies. Intellectual property (IP) portfolios, each comprising a set of patents related to products and individual technologies, are the basic unit with economic meaning in making national policies and technology strategies. Therefore, this research develops a total of 69 measures to identify the collective characteristics of IP portfolios ("characteristics indexes"), by incorporating patent indexes that have been widely used, patent indexes that have been developed recently, and concepts applied to patent analysis in interdisciplinary studies including economics and library and information science. The results of this research produced a characteristics index manual that helps experts identify the characteristics of technological innovation systems from various dimensions. We expect that the characteristics indexes can be used as a supportive tool for comparative analysis among IP portfolios in the technology policy-making process.