• Title/Summary/Keyword: 모듈설계 (Module Design)

Search Result 4,282, Processing Time 0.033 seconds

An Ontology Model for Public Service Export Platform (공공 서비스 수출 플랫폼을 위한 온톨로지 모형)

  • Lee, Gang-Won;Park, Sei-Kwon;Ryu, Seung-Wan;Shin, Dong-Cheon
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.149-161
    • /
    • 2014
  • The export of domestic public services to overseas markets contains many potential obstacles, stemming from different export procedures, the target services, and socio-economic environments. In order to alleviate these problems, the business incubation platform as an open business ecosystem can be a powerful instrument to support the decisions taken by participants and stakeholders. In this paper, we propose an ontology model and its implementation processes for the business incubation platform with an open and pervasive architecture to support public service exports. For the conceptual model of the platform ontology, export case studies are used for requirements analysis. The conceptual model shows the basic structure, with vocabulary and its meaning, the relationships between ontologies, and key attributes. For the implementation and testing of the ontology model, the logical structure is edited using the Protégé editor. The core engine of the business incubation platform is the simulator module, where the various contexts of export businesses should be captured, defined, and shared with other modules through ontologies. It is well known that an ontology, with which concepts and their relationships are represented using a shared vocabulary, is an efficient and effective tool for organizing meta-information to develop structural frameworks in a particular domain. The proposed model consists of five ontologies derived from a requirements survey of major stakeholders and their operational scenarios: service, requirements, environment, enterprise, and country. The service ontology contains several components that can find and categorize public services through a case analysis of the public service export. Key attributes of the service ontology are composed of categories including objective, requirements, activity, and service.
The objective category, which has sub-attributes including operational body (organization) and user, acts as a reference to search and classify public services. The requirements category relates to the functional needs at a particular phase of system (service) design or operation. Sub-attributes of requirements are user, application, platform, architecture, and social overhead. The activity category represents business processes during the operation and maintenance phase. The activity category also has sub-attributes including facility, software, and project unit. The service category, with sub-attributes such as target, time, and place, acts as a reference to sort and classify the public services. The requirements ontology is derived from the basic and common components of public services and target countries. The key attributes of the requirements ontology are business, technology, and constraints. Business requirements represent the needs of processes and activities for public service export; technology represents the technological requirements for the operation of public services; and constraints represent the business law, regulations, or cultural characteristics of the target country. The environment ontology is derived from case studies of target countries for public service operation. Key attributes of the environment ontology are user, requirements, and activity. A user includes stakeholders in public services, from citizens to operators and managers; the requirements attribute represents the managerial and physical needs during operation; the activity attribute represents business processes in detail. The enterprise ontology is introduced from a previous study, and its attributes are activity, organization, strategy, marketing, and time. 
The country ontology is derived from the demographic and geopolitical analysis of the target country, and its key attributes are economy, social infrastructure, law, regulation, customs, population, location, and development strategies. The priority list of target services for a certain country and/or the priority list of target countries for a certain public service are generated by a matching algorithm. These lists are used as input seeds to simulate the consortium partners and the government's policies and programs. In the simulation, the environmental differences between Korea and the target country can be customized through a gap analysis and work-flow optimization process. When the process gap between Korea and the target country is too large for a single corporation to cover, a consortium is considered an alternative choice, and various alternatives are derived from the capability index of enterprises. For financial packages, a mix of various foreign aid funds can be simulated during this stage. It is expected that the proposed ontology model and the business incubation platform can be used by various participants in the public service export market. It could be especially beneficial to small and medium businesses that have relatively fewer resources and less experience with public service export. We also expect that the open and pervasive service architecture in a digital business ecosystem will help stakeholders find new opportunities through information sharing and collaboration on business processes.
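
The five-ontology structure described above can be sketched as plain Python data objects. This is an illustrative rendering of the attribute lists given in the abstract, not the authors' actual Protégé implementation:

```python
from dataclasses import dataclass

# Sketch of the platform's five ontologies and their key attributes, as listed
# in the abstract. Plain Python classes stand in for OWL classes here.

@dataclass
class Ontology:
    name: str
    key_attributes: list

def build_platform_ontologies():
    """Return the five ontologies with the key attributes named in the paper."""
    return {
        "service": Ontology("service", ["objective", "requirements", "activity", "service"]),
        "requirements": Ontology("requirements", ["business", "technology", "constraints"]),
        "environment": Ontology("environment", ["user", "requirements", "activity"]),
        "enterprise": Ontology("enterprise", ["activity", "organization", "strategy", "marketing", "time"]),
        "country": Ontology("country", ["economy", "social infrastructure", "law", "regulation",
                                        "customs", "population", "location", "development strategies"]),
    }
```

A matching algorithm, as described above, would traverse these attribute sets to score service-country pairs.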

Implant Isolation Characteristics for 1.25 Gbps Monolithic Integrated Bi-Directional Optoelectronic SoC (1.25 Gbps 단일집적 양방향 광전 SoC를 위한 임플란트 절연 특성 분석)

  • Kim, Sung-Il;Kang, Kwang-Yong;Lee, Hai-Young
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.44 no.8
    • /
    • pp.52-59
    • /
    • 2007
  • In this paper, we analyzed and measured the implant isolation characteristics of a 1.25 Gbps monolithic integrated bi-directional (M-BiDi) optoelectronic system-on-a-chip, which is a key component of gigabit passive optical networks (PONs) for fiber-to-the-home (FTTH). We also derived an equivalent circuit of the implant structure under various DC bias conditions. The 1.25 Gbps M-BiDi transmit-receive SoC consists of a laser diode with a monitor photodiode as a transmitter and a digital photodiode as a digital data receiver on the same InP wafer. According to the IEEE 802.3ah and ITU-T G.983.3 standards, the receiver sensitivity of the digital receiver has to be below -24 dBm at BER = 10^-12. Therefore, the electrical crosstalk levels have to remain below -86 dB from DC to 3 GHz. From the analyzed and measured results of the implant structure, the M-BiDi SoC with an implant area of 20 μm width and more than 200 μm distance between the laser diode and monitor photodiode, and between the monitor photodiode and digital photodiode, satisfies the electrical crosstalk level. These implant characteristics can be used for the design and fabrication of an optoelectronic SoC, and extended to the mixed-mode SoC field.

Design and Implementation of Medical Information System using QR Code (QR 코드를 이용한 의료정보 시스템 설계 및 구현)

  • Lee, Sung-Gwon;Jeong, Chang-Won;Joo, Su-Chong
    • Journal of Internet Computing and Services
    • /
    • v.16 no.2
    • /
    • pp.109-115
    • /
    • 2015
  • New medical device technologies for bio-signal information and medical information have been developed in various forms and are increasing. Information-gathering techniques and the growing number of bio-signal devices are being used as primary sources of information for medical services in everyday life. Hence, the utilization of various bio-signals is increasing, but security is often not taken into account. Furthermore, the medical image information and bio-signals of a patient in the medical field are generated by individual devices, so they cannot be integrated and managed together. In order to solve this problem, in this paper we integrated the QR code signal associated with the medical image information, including the findings of the doctor, and the bio-signal information. The system implementation environment for medical imaging devices and bio-signal acquisition was configured with bio-signal measurement devices, smart devices and PCs. For the ROI extraction of bio-signals and the reception of image information transferred from the medical equipment or bio-signal measurement devices, the .NET Framework was used to operate the QR server module on the Windows Server 2008 operating system. The main function of the QR server module is to parse the DICOM files generated by the medical imaging devices and extract the identified ROI information to store and manage in the database. Additionally, patient health information from the EMR and OCS, together with the extracted ROI information needed for basic information and emergency situations, is managed by QR code. The QR codes, ROI data, and bio-signal information files are also stored and managed, depending on the size of the received bio-signal information, with a PID (patient identification) to be used by the bio-signal device.
If the received information exceeds the maximum size that can be converted into a QR code, the QR code carries URL information through which the bio-signal information can be accessed on the server. Likewise, the .NET Framework is installed to provide the information in the form of QR codes, so the client can check and find the relevant information through a PC or an Android-based smart device. Finally, the existing medical imaging information, bio-signal information and the health information of the patient are integrated through the application service in order to provide a medical information service suitable for the medical field.
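
The size-based fallback described above (embed the record when it fits, otherwise embed only a URL) can be sketched as follows. The 2,953-byte limit is the byte-mode capacity of a version-40 QR code at error-correction level L; the server URL scheme and function name are hypothetical, not from the paper:

```python
# Sketch of the QR payload decision: encode the bio-signal record directly if it
# fits in a QR code, otherwise encode a server URL for later retrieval.
# 2953 bytes = byte-mode capacity of a version-40 QR code at EC level L.
QR_MAX_BYTES = 2953

def qr_payload(pid: str, record: bytes, server: str = "https://example-emr.local") -> bytes:
    """Return the bytes to encode into the QR code for patient `pid`."""
    if len(record) <= QR_MAX_BYTES:
        return record                                 # small enough: embed the data itself
    return f"{server}/biosignal/{pid}".encode()       # too large: embed a lookup URL
```

A client scanning the code would then either read the record directly or follow the URL to the QR server module.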

A Study of a Non-commercial 3D Planning System, Plunc for Clinical Applicability (비 상업용 3차원 치료계획시스템인 Plunc의 임상적용 가능성에 대한 연구)

  • Cho, Byung-Chul;Oh, Do-Hoon;Bae, Hoon-Sik
    • Radiation Oncology Journal
    • /
    • v.16 no.1
    • /
    • pp.71-79
    • /
    • 1998
  • Purpose : The objective of this study is to introduce our installation of a non-commercial 3D planning system, Plunc, and confirm its clinical applicability in various treatment situations. Materials and Methods : We obtained the source code of Plunc, offered by the University of North Carolina, and installed it on a Pentium Pro 200 MHz (128 MB RAM, Millennium VGA) with the Linux operating system. To examine the accuracy of dose distributions calculated by Plunc, we input beam data for the 6 MV photon beam of our linear accelerator (Siemens MXE 6740), including tissue-maximum ratio, scatter-maximum ratio, attenuation coefficients and shapes of wedge filters. We then compared the dose distributions calculated by Plunc (percent depth dose, PDD; dose profiles with and without wedge filters; oblique incident beam; and dose distributions under air-gap) with measured values. Results : Plunc operated in almost real time, except for spending about 10 seconds on the full-volume dose distribution and dose-volume histogram (DVH) on the PC described above. Compared with measurements for irradiations at 90-cm SSD and 10-cm depth isocenter, the PDD curves calculated by Plunc did not exceed 1% inaccuracy except in the buildup region. For dose profiles with and without wedge filters, the calculated ones are accurate within 2%, except in the low-dose region outside the irradiated field, where Plunc showed a 5% dose reduction. For the oblique incident beam, it showed good agreement except in the low-dose region below 30% of the isocenter dose. In the case of dose distribution under air-gap, there were 5% errors in the central-axis dose. Conclusion : By comparing photon dose calculations using Plunc with measurements, we confirmed that Plunc showed acceptable accuracies of about 2-5% in typical treatment situations, which is comparable to commercial planning systems using correction-based algorithms. Plunc does not at present have a function for electron beam planning.
However, it is possible to implement electron dose calculation modules or more accurate photon dose calculation in the Plunc system. Plunc is shown to be useful for overcoming many limitations of 2D planning systems in clinics where a commercial 3D planning system is not available.
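
The tolerance check reported in the Results can be expressed as a simple comparison routine; the buildup-depth cutoff and sample values below are illustrative only, not the paper's measurements:

```python
# Sketch of a calculated-vs-measured PDD comparison: maximum absolute deviation
# (in percent-of-maximum-dose units), excluding the buildup region where larger
# differences are expected. Depths are in cm; the 1.5 cm cutoff is illustrative.
def max_pdd_deviation(measured, calculated, depths, buildup_depth_cm=1.5):
    """Largest |calculated - measured| PDD difference beyond the buildup region."""
    devs = [abs(c - m)
            for m, c, d in zip(measured, calculated, depths)
            if d >= buildup_depth_cm]
    return max(devs)

# Illustrative sample points (depth cm, measured %, calculated %):
depths     = [0.5, 1.5, 5.0, 10.0]
measured   = [80.0, 100.0, 86.0, 67.0]
calculated = [85.0, 100.5, 86.4, 66.5]
worst = max_pdd_deviation(measured, calculated, depths)  # buildup point excluded
```

With the sample numbers above, the 5% difference at 0.5 cm depth is excluded as buildup, and the remaining deviations stay within the 1% tolerance the paper reports.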


Design of Client-Server Model For Effective Processing and Utilization of Bigdata (빅데이터의 효과적인 처리 및 활용을 위한 클라이언트-서버 모델 설계)

  • Park, Dae Seo;Kim, Hwa Jong
    • Journal of Intelligence and Information Systems
    • /
    • v.22 no.4
    • /
    • pp.109-122
    • /
    • 2016
  • Recently, big data analysis has developed into a field of interest for individuals and non-experts as well as companies and professionals. Accordingly, it is utilized for marketing and solving social problems by analyzing data that is currently open or directly collected. In Korea, various companies and individuals are attempting big data analysis, but it is difficult from the initial stage of analysis due to limited disclosure of big data and collection difficulties. Nowadays, system improvements for big data activation and big data disclosure services are being carried out in Korea and abroad, and services for opening public data, such as the domestic Government 3.0 portal (data.go.kr), are mainly implemented. In addition to the efforts made by the government, services that share data held by corporations or individuals are running, but it is difficult to find useful data because of the lack of shared data. In addition, big data traffic problems can occur because it is necessary to download and examine the entire data set in order to grasp the attributes and basic information of the shared data. Therefore, a new system for big data processing and utilization is needed. First, big data pre-analysis technology is needed as a way to solve the big data sharing problem. Pre-analysis is a concept proposed in this paper in order to solve the problem of sharing big data; it means providing users with results generated by analyzing the data in advance. Through preliminary analysis, it is possible to improve the usability of big data by providing information with which the properties and characteristics of big data can be grasped when a data user searches for big data. In addition, by sharing the summary data or sample data generated through the pre-analysis, it is possible to solve the security problem that may occur when the original data is disclosed, thereby enabling big data sharing between the data provider and the data user.
Second, it is necessary to quickly generate appropriate preprocessing results according to the level of disclosure or network status of the raw data, and to provide the results to users through distributed big data processing using Spark. Third, in order to solve the big traffic problem, the system monitors the traffic of the network in real time. When preprocessing the data requested by the user, the data should be preprocessed to a size available on the current network and transmitted to the user so that no big traffic occurs. In this paper, we present various data sizes according to the level of disclosure through pre-analysis. This method is expected to show a low traffic volume when compared with the conventional method of sharing only raw data across a large number of systems. In this paper, we describe how to solve the problems that occur when big data is released and used, and how to facilitate sharing and analysis. The client-server model uses Spark for fast analysis and processing of user requests. It consists of a Server Agent and a Client Agent, deployed on the server and client sides respectively. The Server Agent is the agent required by the data provider; it performs preliminary analysis of big data to generate a Data Descriptor with information on the Sample Data, Summary Data, and Raw Data. In addition, it performs fast and efficient big data preprocessing through distributed big data processing and continuously monitors network traffic. The Client Agent is the agent placed on the data user side. It can search for big data through the Data Descriptor, which is the result of the pre-analysis, and can quickly find the data. The desired data can be requested from the server to download the big data according to the level of disclosure. The Server Agent and the Client Agent are separated so that data published by the data provider can be used by the user.
In particular, we focus on the big data sharing, distributed big data processing, and big traffic problems, construct the detailed modules of the client-server model, and present the design method of each module. In the system designed on the basis of the proposed model, a user who acquires data analyzes the data in the desired direction or preprocesses it into new data. By analyzing the newly processed data through the Server Agent, the data user changes roles and becomes a data provider. The data provider can also obtain useful statistical information from the Data Descriptor of the data it discloses and become a data user performing new analysis using the sample data. In this way, raw data is processed and the processed big data is utilized by users, thereby forming a natural shared environment. The roles of data provider and data user are not distinguished, which provides an ideal shared service that enables everyone to be both a provider and a user. The client-server model solves the problem of sharing big data, provides a free sharing environment for securely disclosing big data, and provides an ideal shared service for easily finding big data.
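
The Server Agent's pre-analysis step can be sketched as follows. The Data Descriptor field names are assumptions for illustration (the exact format is not given here), and a real deployment would build these with Spark rather than plain Python:

```python
# Sketch of pre-analysis: from raw records, build a Data Descriptor holding
# sample data and summary statistics, so a Client Agent can judge whether the
# data set is useful without downloading the raw data.
def build_data_descriptor(records, sample_size=3):
    """Return a descriptor with sample rows and summary statistics.

    `records` is assumed to be a list of dicts with a numeric "value" field;
    field names here are hypothetical, chosen for illustration.
    """
    values = [r["value"] for r in records]
    return {
        "sample_data": records[:sample_size],
        "summary_data": {
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "mean": sum(values) / len(values),
        },
    }
```

A Client Agent would search these descriptors, then request the raw data only when the summary looks relevant, which is how the model keeps big traffic low.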

Packaging Technology for the Optical Fiber Bragg Grating Multiplexed Sensors (광섬유 브래그 격자 다중화 센서 패키징 기술에 관한 연구)

  • Lee, Sang Mae
    • Journal of the Microelectronics and Packaging Society
    • /
    • v.24 no.4
    • /
    • pp.23-29
    • /
    • 2017
  • The packaged optical fiber Bragg grating sensors, networked by multiplexing the Bragg grating sensors with WDM technology, were investigated for application to the structural health monitoring of the marine trestle structure that transports ships. The optical fiber Bragg grating sensor was packaged in a cylindrical shape made of aluminum tubes. Furthermore, after the packaged optical fiber sensor was inserted in a polymeric tube, the tube was filled with epoxy so that the sensor has resistance and durability against sea water. The packaged optical fiber sensor component was tested under 0.2 MPa of hydraulic pressure and was found to be robust. The number and locations of the Bragg gratings attached to the trestle were determined where the trestle was subject to high displacement, as obtained by finite element simulation. The strain of the part of the trestle subjected to the maximum load was analyzed to be ~1,000 με, and thus the shift in the Bragg wavelength of the sensor caused by the maximum load of the trestle was found to be ~1,200 pm. According to the results of the finite element analysis, the Bragg wavelength spacings of the sensors were determined to be 3~5 nm, without overlapping of grating wavelengths between sensors when the trestle was under load; thus 50 grating sensors, with each module consisting of 5 sensors, could be networked within the 150 nm optical window at the 1550 nm wavelength of the Bragg wavelength interrogator. Shifts in the Bragg wavelengths of the 5 packaged optical fiber sensors attached to the mock trestle unit were well interrogated by the grating interrogator, which used an optical fiber loop mirror, and the maximum strain was measured to be about 235.650 με. The modeling results of the sensor packaging and networking were in good agreement with the experimental results.
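
The ~1,200 pm figure above is consistent with the standard FBG strain response Δλ ≈ λ(1 − p_e)ε. A quick back-of-envelope check, assuming a typical effective photo-elastic coefficient p_e ≈ 0.22 for silica fiber (a textbook value, not taken from the paper):

```python
# Sketch of the standard FBG strain-to-wavelength-shift relation:
#   delta_lambda = lambda * (1 - p_e) * strain
# with p_e ~ 0.22 for silica fiber (assumed typical value, not from the paper).
def bragg_shift_pm(wavelength_nm: float, strain_microstrain: float, p_e: float = 0.22) -> float:
    """Bragg wavelength shift in picometers for a given strain in microstrain."""
    strain = strain_microstrain * 1e-6          # microstrain -> dimensionless strain
    return wavelength_nm * (1 - p_e) * strain * 1000  # nm -> pm

shift = bragg_shift_pm(1550.0, 1000.0)  # ~1209 pm, consistent with the ~1,200 pm above
```

The same relation motivates the 3~5 nm channel spacing: shifts stay in the low-nanometer range, so adjacent sensors' wavelengths do not overlap under load.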

A Study of the Effect of the Permeability and Selectivity on the Performance of Membrane System Design (분리막 투과도와 분리도 인자의 시스템 설계 효과 연구)

  • Shin, Mi-Soo;Jang, Dongsoon;Lee, Yongguk
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.38 no.12
    • /
    • pp.656-661
    • /
    • 2016
  • Manufacturing membrane materials with both high selectivity and high permeability is quite desirable but practically impossible, since permeability and selectivity are usually inversely proportional. From the viewpoint of reducing the cost of $CO_2$ capture, module performance is even more important than the performance of the membrane material itself, and it is affected by the permeance of the membrane (P, stage cut) and the selectivity (S). As a typical example, when a mixture with a composition of 13% $CO_2$ and 87% $N_2$ is fed into the module with a 10% stage cut and selectivity 5, in the 10 parts of the permeate, $CO_2$ represents 4.28 parts and $N_2$ represents 5.72 parts. In this case, the $CO_2$ concentration in the permeate is 42.8% and the recovery rate of $CO_2$ in this first separation appears as 4.28/13 = 32.9%. When permeance and selectivity are doubled, however, from 10% to 20% and from 5 to 10, respectively, the $CO_2$ concentration in the permeate becomes 64.5% and the recovery rate is 12.9/13 = 99.2%. Since in this case most of the $CO_2$ is separated, this may be the ideal condition. For a given feed concentration, the $CO_2$ concentration in the separated gas decreases if the permeance is larger than the threshold value for complete recovery at a given selectivity. Conversely, for a given permeance, increasing the selectivity over the threshold value does not improve the process further. For a given initial feed gas concentration, if the permeance or selectivity is larger than that required for the complete separation of $CO_2$, the process becomes less efficient. From all these considerations, we can see that there exists an optimum design for a given set of conditions.
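
The first worked example above can be reproduced with the simple single-pass mixing rule implied by the text, where the permeate $CO_2$ mole fraction is Sx / (Sx + (1 − x)). This is a sketch of that arithmetic only, not the authors' full module model:

```python
# Sketch of the single-pass permeate split: for feed CO2 fraction x, stage cut
# (percent of feed permeating), and selectivity S, the permeate CO2 fraction is
# S*x / (S*x + (1 - x)).
def permeate_split(feed_co2_frac, stage_cut_pct, selectivity):
    """Return (CO2 parts, N2 parts, CO2 recovery) per 100 parts of feed."""
    x = feed_co2_frac
    co2_frac = selectivity * x / (selectivity * x + (1 - x))
    permeate_parts = stage_cut_pct           # parts of permeate per 100 parts feed
    co2_parts = co2_frac * permeate_parts
    n2_parts = permeate_parts - co2_parts
    recovery = co2_parts / (x * 100)         # fraction of feed CO2 recovered
    return co2_parts, n2_parts, recovery

co2, n2, rec = permeate_split(0.13, 10, 5)   # 4.28 parts CO2, 5.72 parts N2, 32.9% recovery
```

This reproduces the first example's 4.28/5.72 split and 32.9% recovery; the doubled-parameter case in the text involves further depletion of the feed and is not captured by this one-line rule.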

Fabrication of Portable Self-Powered Wireless Data Transmitting and Receiving System for User Environment Monitoring (사용자 환경 모니터링을 위한 소형 자가발전 무선 데이터 송수신 시스템 개발)

  • Jang, Sunmin;Cho, Sumin;Joung, Yoonsu;Kim, Jaehyoung;Kim, Hyeonsu;Jang, Dayeon;Ra, Yoonsang;Lee, Donghan;La, Moonwoo;Choi, Dongwhi
    • Korean Chemical Engineering Research
    • /
    • v.60 no.2
    • /
    • pp.249-254
    • /
    • 2022
  • With the rapid advance of semiconductor and information and communication technologies, remote environment monitoring technology, which can detect and analyze surrounding environmental conditions with various types of sensors and wireless communication technologies, is also drawing attention. However, since conventional remote environmental monitoring systems require external power supplies, they impose time and space limitations on convenient use. In this study, we propose the concept of a self-powered remote environmental monitoring system powered by a levitation electromagnetic generator (L-EMG), which is rationally designed to effectively harvest biomechanical energy in consideration of its mechanical characteristics. In this regard, the proposed L-EMG is designed to respond effectively to external vibration with a movable center magnet, considering the mechanical characteristics of biomechanical energy, such as the relatively low frequency and high amplitude of vibration. Hence the L-EMG, based on a fragile force equilibrium, can generate high-quality electrical energy to supply power. Additionally, the environmental detection sensor and wireless transmission module are built around a microcontroller unit (MCU) that minimizes the power required for electronic device operation by applying a sleep mode, resulting in extended operation time. Finally, in order to maximize user convenience, a mobile phone application was built to enable easy monitoring of the surrounding environment. Thus, the proposed concept not only verifies the possibility of establishing a self-powered remote environmental monitoring system using biomechanical energy but also suggests a design guideline.

A Study on the Interactive Narrative - Focusing on the analysis of VR animation <Wolves in the Walls> (인터랙티브 내러티브에 관한 연구 - VR 애니메이션 <Wolves in the Walls>의 분석을 중심으로)

  • Zhuang Sheng
    • Trans-
    • /
    • v.15
    • /
    • pp.25-56
    • /
    • 2023
  • VR is a dynamic image simulation technology with very high information density. Its spatial depth, temporality, and realism bring an unprecedented sense of immersion to the experience. However, because of its high information density, the information it contains is very easy to manipulate, creating an illusion of objectivity. Users need guidance to help them interpret this high density of dynamic image information. Just like navigation interfaces and interactivity in games, interactivity in virtual reality is a way to interpret virtual content. At present, domestic research on VR content is mainly focused on technology exploration and the visual aesthetic experience. However, there is still a lack of research on interactive storytelling design, which is an important part of VR content creation. In order to explore a better interactive storytelling model for virtual reality content, this paper analyzes the interactive storytelling features of the VR animated version of <Wolves in the Walls> through literature review and case study. We find that the following rules can be followed when creating VR content: 1. The VR environment should fully utilize the advantage of free movement for users, and users should not be viewed as mere observers. The user's sense of presence should be fully considered when designing interaction modules. Break down the "fourth wall" to encourage audience interaction in the virtual reality environment, and make the hot medium of VR "cool". 2. Provide a developer-driven narrative in the early stages of the work so that users are not confused about an ambiguous world situation when they first enter a virtual environment with a high degree of freedom. 3. Unlike games that guide users through text, guide them through a more natural interactive approach that adds natural dialogue between the user and story characters (NPCs).
Also, since gaze guidance is an important part of story progression, gaze-guidance elements should be placed within the spatial scene. For example, you can provide eye-following cues, motion cues, language cues, and more. By analyzing the interactive storytelling features and innovations of the VR animation <Wolves in the Walls>, I hope to summarize the main elements of interactive storytelling from its content. Based on this, I hope to explore how to better present interactive storytelling in virtual reality content and provide thoughts on future VR content creation.

u-EMS : An Emergency Medical Service based on Ubiquitous Sensor Network using Bio-Sensors (u-EMS : 바이오 센서 네트워크 기반의 응급 구조 시스템)

  • Kim, Hong-Kyu;Moon, Seung-Jin
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.13 no.7
    • /
    • pp.433-441
    • /
    • 2007
  • Bio-sensors, which sense the vital signs of the human body, are widely used in medical equipment. Recently, sensor network technology, which combines a sensor interface, processor, wireless communication module and battery in small-sized hardware, has been extended to the area of bio-sensor network systems thanks to advances in MEMS technology. In this paper we present the design and implementation of a health care information system (called u-EMS) using bio-sensor network technology, a combination of bio-sensors and sensor network technology. In the proposed system, we used the following vital-sign sensors: an EKG sensor, a blood pressure sensor, a heart rate sensor, a pulse oximeter sensor and a glucose sensor. We collected various vital-sign data through the sensor network module and processed the data to implement a health care measurement system. The measured data can be displayed on wireless terminals (PDA, cell phone) and a digital-frame display device. Finally, we conducted a series of tests that considered both the patient's vital signs and context-aware information in order to improve the effectiveness of u-EMS.
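
The kind of rule such a health-care measurement layer might apply can be sketched as follows; this is not the u-EMS implementation, and the normal-range thresholds are illustrative textbook values, not the paper's settings:

```python
# Sketch of an emergency-flagging rule over vital-sign readings collected from
# the bio-sensor network. Thresholds are illustrative textbook ranges only.
NORMAL_RANGES = {
    "heart_rate": (60, 100),   # beats per minute
    "spo2": (95, 100),         # pulse oximeter, percent oxygen saturation
    "glucose": (70, 140),      # mg/dL
}

def flag_abnormal(readings: dict) -> list:
    """Return the names of sensors whose readings fall outside normal ranges."""
    alerts = []
    for sensor, value in readings.items():
        low, high = NORMAL_RANGES[sensor]
        if not (low <= value <= high):
            alerts.append(sensor)
    return alerts
```

In a system like u-EMS, flagged readings would be pushed to the wireless terminals (PDA, cell phone) together with context-aware information.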