• Title/Summary/Keyword: 표준프로파일 (standard profile)

Search Result 298, Processing Time 0.024 seconds

A Method to Customize the Variability of EJB-Based Components (EJB 기반 컴포넌트의 가변성 맞춤화 기법)

  • Min Hyun-Gi;Kim Sung-Ahn;Lee Jin-Yeal;Kim Soo-Dong
    • Journal of KIISE:Software and Applications
    • /
    • v.33 no.6
    • /
    • pp.539-549
    • /
    • 2006
  • Component-Based Development (CBD) has emerged as an effective technology that reduces development cost and time-to-market by assembling reusable components when developing software. The degree of conformance to standards and common features in a domain largely determines the reusability of components. In addition, variability within that commonality should be modeled, and a customization mechanism for the variability should be designed into components. Enterprise JavaBeans (EJB) is considered one of the most suitable environments for implementing components. However, the reusability of EJB is limited because EJB does not have built-in variability design mechanisms. In this paper, we present efficient variability design techniques for implementing components in EJB. We propose a method to customize the variability of EJB-based components by applying three variability design mechanisms: selection, plug-in, and external profile. We also elaborate on the situations where each variability design mechanism is best applied, and conduct a technical comparison with other available approaches.
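
The three mechanisms the abstract names can be illustrated language-neutrally; a minimal Python sketch follows (all class and function names here are hypothetical illustrations, not the paper's API):

```python
# Hedged sketch of the three variability mechanisms named in the abstract:
# selection (choose among predefined variants), plug-in (variant logic
# supplied from outside), and external profile (variant named in config).

class TaxPlugin:                      # plug-in: variation point filled externally
    def compute(self, amount):
        raise NotImplementedError

class FlatTax(TaxPlugin):
    def compute(self, amount):
        return amount * 0.10

class ProgressiveTax(TaxPlugin):
    def compute(self, amount):
        return amount * (0.20 if amount > 1000 else 0.10)

# selection: the component ships with a fixed set of variants to choose from
BUILT_IN = {"flat": FlatTax, "progressive": ProgressiveTax}

def make_component(profile):
    # external profile: the variant is named in configuration data kept
    # outside the component, so customization needs no code change
    plugin_cls = BUILT_IN[profile["tax_mode"]]
    return plugin_cls()

component = make_component({"tax_mode": "progressive"})
print(component.compute(2000))  # 400.0
```

In an EJB setting the external profile would typically live in a deployment descriptor rather than an in-memory dictionary; the structure of the decision is the same.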

Mid-span Spectral Inversion System Applied with Dispersion Management with Different RDPS Determinations for Half Transmission Link (반 전송 링크의 RDPS 결정 방식이 다른 분산 제어가 적용된 Mid-span Spectral Inversion 시스템)

  • Lee, Seong-Real
    • Journal of Advanced Navigation Technology
    • /
    • v.26 no.5
    • /
    • pp.331-337
    • /
    • 2022
  • The length of optical fiber in a dispersion-managed link combined with optical phase conjugation, used to compensate for signal distortion caused by chromatic dispersion and the nonlinear Kerr effect, is a major factor determining the compensation effectiveness. The dispersion-managed link consists of several fiber spans in which standard single-mode fiber and dispersion-compensating fiber are arranged. In this paper, the compensation effect is investigated in a link that changes the residual dispersion per span (RDPS) only by adjusting the length of one type of optical fiber, with the adjusted fiber type differing between the first half link and the second half link with respect to the optical phase conjugator (OPC). It was confirmed that the best compensation for a 960 Gb/s wavelength division multiplexed signal is obtained in the dispersion-managed link in which the cumulative dispersion profile is symmetric around the OPC, with the cumulative dispersion all positive in the first half and all negative in the second half.
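
The link configuration the abstract finds best, a cumulative dispersion profile symmetric about the mid-span OPC, can be sketched numerically; the per-span values below are illustrative, not the paper's link parameters:

```python
# Hedged sketch: cumulative dispersion along a dispersion-managed link whose
# profile is symmetric about the mid-span OPC — positive RDPS in every span
# of the first half, mirrored negative RDPS in the second half.
def cumulative_dispersion(rdps_per_span):
    total, profile = 0.0, []
    for rdps in rdps_per_span:
        total += rdps
        profile.append(total)
    return profile

first_half = [50.0, 50.0, 50.0]       # ps/nm per span, all positive before the OPC
second_half = [-50.0, -50.0, -50.0]   # mirrored negative values after the OPC
profile = cumulative_dispersion(first_half + second_half)
print(profile)  # [50.0, 100.0, 150.0, 100.0, 50.0, 0.0]
```

The profile peaks at the OPC position and returns to zero at the receiver, i.e., zero net residual dispersion over the whole link.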

Development of NCS Based Vocational Curriculum Model for the Practical and Creative Human Resources (실전 창의형 인재 양성을 위한 NCS 기반 직업교육과정의 모형 개발)

  • Kim, Dong-Yeon;Kim, Jinsoo
    • Journal of Korean Institute of Industrial Education (대한공업교육학회지)
    • /
    • v.39 no.2
    • /
    • pp.101-121
    • /
    • 2014
  • This study aims to develop an NCS-based vocational curriculum model for practical and creative human resources. The study consists of domestic and international literature reviews, content analysis, case studies, consultation and review by nine experts, and in-depth interviews with three advisory members. The validity of the developed model is analyzed through the mean, standard deviation, and content validity ratio (CVR). The main results of the model development are as follows. First, the NCS-based vocational curriculum model for practical and creative human resources was developed from analyses of the NCS development manuals, the training standard utilization and training curriculum organization manuals, the NCS learning module development manual and case studies, NCS research reports, and NCS-based curriculum pilot development resources directed toward high schools and vocational schools, together with domestic and international literature on career training models such as NCS. Second, based on these analyses combined with the consultations with the expert and advisory committees, a total of 19 sub-factors across the steps and domains were extracted. The sub-factors in step 1 are the competency unit, definition of the competency unit, competency unit elements, performance criteria, range of variables, assessment guide, and key competencies; in step 2, the subject title, subject objectives, chapter titles, chapter objectives, pedagogical methods, assessment methods, and basic job competence; and in step 3, the NCS-based subject matrix table, NCS-based subject profile, NCS-based job training curriculum table, NCS-based subject organization flowchart, and NCS-based job training operation plan. Third, the final model, including the step 3 NCS-based subject profile, was developed by linking the organizational sub-factors of steps 1 and 2. Fourth, the validity tests for the final model by step and domain yielded a mean of 4.67 and a CVR value of 1.00, indicating high validity. The means of the individual sub-factors were all 4.33 or above with CVR values of 1.00, the means of the associated organizations within the model were also 4.33 or above with CVR values of 1.00, and standard deviations were all .50 or lower. Fifth, based on the validity test results and the in-depth interviews with the expert and advisory committees, the model was adjusted and complemented to establish the final NCS-based vocational curriculum model for practical and creative human resources.

Design and Implementation of Medical Information System using QR Code (QR 코드를 이용한 의료정보 시스템 설계 및 구현)

  • Lee, Sung-Gwon;Jeong, Chang-Won;Joo, Su-Chong
    • Journal of Internet Computing and Services
    • /
    • v.16 no.2
    • /
    • pp.109-115
    • /
    • 2015
  • New medical device technologies for bio-signal and medical information, developed in various forms, have been increasing. Information gathering techniques and the growing number of bio-signal devices are being used as primary sources of information for medical services in everyday life. Utilization of various bio-signals is thus increasing, but security is often not taken into account. Furthermore, the medical image information and bio-signals of a patient are generated by individual devices in the medical field, so they cannot be managed in an integrated way. To solve this problem, in this paper we integrate, via QR code, the medical image information, including the doctor's findings, with the bio-signal information. The system implementation environment for medical imaging devices and bio-signal acquisition was configured with bio-signal measurement devices, smart devices, and a PC. For ROI extraction from bio-signals and for receiving image information transferred from medical equipment or bio-signal measurement devices, the .NET Framework was used to operate the QR server module on the Windows Server 2008 operating system. The main function of the QR server module is to parse the DICOM files generated by the medical imaging devices and to extract the identified ROI information, which is stored and managed in a database. Additionally, patient health information such as EMR and OCS records, and the extracted ROI information needed as basic information in emergency situations, are managed by QR code. The QR code, the ROI information, and the bio-signal information files are stored and managed with a PID (patient identification) used by the bio-signal device, depending on the size of the received bio-signal information. If the received information exceeds the maximum size that can be converted into a QR code, the QR code instead carries URL information through which the bio-signal information can be accessed on the server. Likewise, the .NET Framework is used to provide the information in the form of a QR code, so clients can check and retrieve the relevant information through a PC or an Android-based smart device. Finally, the existing medical imaging information, bio-signal information, and patient health information are integrated in the resulting application service, in order to provide a medical information service suitable for the medical field.
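
The size-based fallback the abstract describes, embed small records directly, store oversized records server-side and encode only a URL, can be sketched as follows (the byte limit is the standard version-40/level-L QR byte-mode capacity; the store class and URL are assumptions for illustration, not the paper's implementation):

```python
# Hedged sketch of the QR payload decision described in the abstract.
MAX_QR_BYTES = 2953          # byte-mode capacity of a version-40, level-L QR code

class MemoryStore:
    """Stand-in for the paper's server-side database (hypothetical)."""
    def __init__(self):
        self.items = {}
    def save(self, pid, blob):
        key = f"{pid}-{len(self.items)}"
        self.items[key] = blob
        return key

def qr_payload(pid, data, store):
    record = f"PID:{pid};{data}".encode("utf-8")
    if len(record) <= MAX_QR_BYTES:
        return record.decode("utf-8")            # small record: embed directly
    key = store.save(pid, record)                # large record: store server-side
    return f"https://qr.example.org/bio/{key}"   # hypothetical retrieval URL

store = MemoryStore()
print(qr_payload("P001", "ECG:...", store))      # embedded record
print(qr_payload("P002", "X" * 5000, store))     # URL to the stored record
```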

A Study on the Application of Cross-Certification Technology for the Automatic Authentication of Charging Users in ISO 15118 Standard (ISO 15118 충전 사용자 자동인증을 위한 교차인증서 기술의 적용에 관한 연구)

  • Lee, Sujeong;Shin, Minho;Jang, Hyuk-soo
    • The Journal of Society for e-Business Studies
    • /
    • v.25 no.2
    • /
    • pp.1-14
    • /
    • 2020
  • ISO 15118 is an international standard that defines communication between electric vehicles and electric vehicle chargers. Plug & Charge (PnC) is defined in the standard as a technology to automatically authenticate users when using charging services: all processes such as electric vehicle user authentication, charging, and billing are handled automatically. According to the standard, certificates for chargers and CPSs (Certificate Provisioning Services) should be issued under the V2G (Vehicle to Grid) Root certificate. In Korea, however, the utility company operates its own PKI (Public Key Infrastructure), making it difficult to provision chargers under the V2G Root certificate. Therefore, a method is needed that allows authentication even between parties holding different Root certificates. This paper proposes applying cross-certificate technology to PnC authentication: the Root CA issues a cross-certificate for the other PKI's Root and includes it in the certificate chain, so that automatic authentication can proceed even across different Root certificates, enabling verification of certificates issued under other Roots. Both PnC automatic authentication and cross-certificate automatic authentication are implemented as a proof of concept showing that both methods work: development requirements, certificate profiles, and user authentication sequences are defined, implemented, and executed accordingly. The experiments confirm that both forms of automatic authentication are practicable, and in particular demonstrate the scalability of PnC automatic authentication using cross-certificates.
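
The chain-building idea behind cross-certification can be sketched with simplified name/issuer pairs (not ISO 15118 code; the certificate model and names below are assumptions): a cross-certificate issued by the trusted root for the other PKI's root lets a chain that starts under a foreign root terminate at the local trust anchor.

```python
# Hedged sketch of cross-certificate chain building. Real X.509 path
# validation also checks signatures, validity periods, and constraints,
# which are omitted here.
class Cert:
    def __init__(self, subject, issuer):
        self.subject, self.issuer = subject, issuer

def build_chain(leaf, certs_by_subject, trust_anchor):
    chain = [leaf]
    while chain[-1].issuer != trust_anchor:
        issuer_cert = certs_by_subject.get(chain[-1].issuer)
        if issuer_cert is None:
            return None                      # no path to the trust anchor
        chain.append(issuer_cert)
    return chain

v2g_root = "V2G Root CA"
pool = {
    # cross-certificate: the utility's root re-issued under the V2G Root
    "Utility Root CA": Cert("Utility Root CA", v2g_root),
    "Utility Sub CA":  Cert("Utility Sub CA", "Utility Root CA"),
}
leaf = Cert("Charger", "Utility Sub CA")
chain = build_chain(leaf, pool, v2g_root)
print([c.subject for c in chain])  # ['Charger', 'Utility Sub CA', 'Utility Root CA']
```

Without the cross-certificate in the pool, the same leaf has no path to the V2G Root and `build_chain` returns `None`, which is the interoperability gap the paper addresses.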

A Study on Light-weight Algorithm of Large scale BIM data for Visualization on Web based GIS Platform (웹기반 GIS 플랫폼 상 가시화 처리를 위한 대용량 BIM 데이터의 경량화 알고리즘 제시)

  • Kim, Ji Eun;Hong, Chang Hee
    • Spatial Information Research
    • /
    • v.23 no.1
    • /
    • pp.41-48
    • /
    • 2015
  • BIM technology captures data from the whole life cycle of a facility through 3D modeling. As a result, a single building produces a huge file with massive amounts of data. One such format is IFC, the standard format, and processing its large-scale geometry and object property data is an issue. Large-scale data slows rendering and burdens the graphics card, making it inefficient for on-screen visualization. Lightweighting of large-scale BIM data is therefore essential for the performance and quality of the program. This paper surveys and reviews lightweighting techniques from domestic and international research. To control and visualize large-scale BIM data effectively, we propose and verify a technique that optimizes the characteristics of BIM. For operating large-scale facility data on a web-based GIS platform, the quality of screen transitions from the user's perspective and effective memory management were secured.

Development of KML conversion technology for ENCs application (전자해도 활용을 위한 KML 변환기술 개발)

  • Oh, Se-Woong;Ko, Hyun-Joo;Park, Jong-Min;Lee, Moon-Jin
    • Proceedings of the Korean Institute of Navigation and Port Research Conference
    • /
    • 2010.04a
    • /
    • pp.135-138
    • /
    • 2010
  • The IMO adopted the revision of the SOLAS convention on carriage requirements for ECDIS and considers ECDIS a major system in the e-Navigation strategy for marine transportation safety and environment protection. The ENC (Electronic Navigational Chart), the base map of ECDIS, is a principal information infrastructure essential for navigation tasks. However, ENCs are not easy to utilize because they are encoded according to the ISO/IEC 8211 file format, and since they are used mainly for navigational purposes, there is demand to utilize them in marine GIS and various other marine applications. Meanwhile, Google Earth, the satellite map service provided by Google, is widely used across industries, providing local information including satellite imagery, maps, topography, and 3D building information. In this paper, we developed KML conversion technology for ENC application. The development consists of an ENC loading module and a KML conversion module. We applied this conversion technology to Korean ENCs and evaluated the results.
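
The KML conversion step can be sketched for a single point feature (the feature name and coordinates are illustrative; decoding the ISO/IEC 8211 ENC records themselves is outside this sketch):

```python
# Hedged sketch: emit a KML Placemark for one ENC point feature.
import xml.etree.ElementTree as ET

def feature_to_kml(name, lon, lat):
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    pm = ET.SubElement(doc, "Placemark")
    ET.SubElement(pm, "name").text = name
    point = ET.SubElement(pm, "Point")
    # KML coordinates are lon,lat,altitude in that order
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

print(feature_to_kml("Buoy", 126.6, 37.4))
```

A full converter would map each ENC object class to an appropriate KML geometry and style; the XML emission itself follows this pattern.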


VLSI Design of Interface between MAC and PHY Layers for Adaptive Burst Profiling in BWA System (BWA 시스템에서 적응형 버스트 프로파일링을 위한 MAC과 PHY 계층 간 인터페이스의 VLSI 설계)

  • Song Moon Kyou;Kong Min Han
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.42 no.1
    • /
    • pp.39-47
    • /
    • 2005
  • The range of hardware implementation increases in communication systems as high-speed processing is required for high data rates. In a broadband wireless access (BWA) system based on the IEEE 802.16 standard, the functions of the upper part of the MAC layer that provide the data needed for generating MAC PDUs are implemented in software, while the tasks from formatting MAC PDUs with those data to transmitting the messages in a modem are implemented in hardware. In this paper, interface hardware for efficient message exchange between the MAC and PHY layers in a BWA system is designed. The hardware performs the following functions, including those of the transmission convergence (TC) sublayer: (1) formatting TC PDUs (protocol data units) from/to MAC PDUs, (2) Reed-Solomon (RS) encoding/decoding, and (3) resolving the DL-MAP and UL-MAP, so that it controls transmission slots and uplink and downlink traffic according to the modulation scheme of the burst profile. It also provides various control signals for the PHY modem. In addition, the truncated binary exponential backoff (TBEB) algorithm is implemented in the subscriber station to avoid collisions in contention-based transmission of messages. The VLSI architecture performing all these functions is implemented and verified in VHDL.
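
The TBEB algorithm the abstract mentions has a simple software analogue; a sketch (window parameters are illustrative, not the paper's hardware values):

```python
# Hedged sketch of truncated binary exponential backoff (TBEB): the
# contention window doubles after each consecutive collision, capped at a
# maximum ("truncated"), and the station waits a uniformly random number
# of slots within the window before retransmitting.
import random

def tbeb_window(attempt, cw_min=16, cw_max=1024):
    """Contention window size after `attempt` consecutive collisions."""
    return min(cw_min * (2 ** attempt), cw_max)

def backoff_slots(attempt, rng=random):
    """Random backoff, uniform in [0, CW)."""
    return rng.randrange(tbeb_window(attempt))

print([tbeb_window(a) for a in range(8)])  # [16, 32, 64, 128, 256, 512, 1024, 1024]
```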

Development of an Algorithm for Automatic Quantity Take-off of Slab Rebar (슬래브 철근 물량 산출 자동화 알고리즘 개발)

  • Kim, Suhwan;Kim, Sunkuk;Suh, Sangwook;Kim, Sangchul
    • Korean Journal of Construction Engineering and Management
    • /
    • v.24 no.5
    • /
    • pp.52-62
    • /
    • 2023
  • The objective of this study is to propose an automated algorithm for the precise cutting length of slab rebar that complies with regulations such as anchorage length, standard hooks, and lap splice length. The algorithm aims to improve the traditional manual quantity take-off process, which is typically outsourced to external contractors. By providing accurate rebar quantity data at the BBS (Bar Bending Schedule) level from the bidding phase, uncertainty in quantity take-off can be eliminated and reliance on outsourcing reduced. In addition, the algorithm allows precise quantities to be determined early, enabling construction firms to prepare competitive, optimized bids and to increase profit margins during contract negotiations. The proposed algorithm not only streamlines redundant tasks across estimating, budgeting, and BBS generation, but also offers flexibility in handling post-contract changes to structural drawings. In particular, when combined with BIM, the algorithm can address the technical problems of using BIM in the early phases of construction, and its formulas and shape codes, built as Revit-based family files, can help save time and manpower.
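
The structure of such a cutting-length formula, span plus the anchorage, hook, and lap allowances the abstract names, can be sketched as follows. The coefficients below follow common reinforced-concrete detailing conventions (40d development, 12d hook, 40d lap) and are assumptions for illustration, not the paper's exact provisions:

```python
# Hedged sketch of a slab-rebar cutting-length computation. All lengths in mm.
def cutting_length(clear_span, bar_dia, n_laps=0, hooked_ends=2):
    anchorage = 40 * bar_dia   # assumed development length: 40d
    hook = 12 * bar_dia        # assumed standard 90-degree hook: 12d
    lap = 40 * bar_dia         # assumed lap splice length: 40d
    return clear_span + 2 * anchorage + hooked_ends * hook + n_laps * lap

# 8 m slab span, D13 bar (13 mm diameter), one lap splice, hooks at both ends
print(cutting_length(8000, 13, n_laps=1))  # 9872
```

A BBS-level take-off would run this per bar mark and multiply by bar counts and unit weight to get tonnage.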

An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing steps, velocity analysis is the most time-consuming and labor-intensive. For production seismic data processing, a good velocity analysis tool as well as a high-performance computer is required; the tool must give fast and accurate velocity analysis. There are two different approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point; generally the plot consists of a semblance contour, a super gather, and a stack panel, and the interpreter chooses the velocity function by analyzing the plot. The technique is highly dependent on the interpreter's skill and requires substantial human effort. As high-speed graphic workstations became more popular, various interactive velocity analysis programs were developed. Although these programs enabled faster picking of velocity nodes with the mouse, their main improvement was simply replacing the paper plot with the graphic screen. The velocity spectrum is highly sensitive to the presence of noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed; the analysis must also be carried out by carefully choosing the location of the analysis point and accurately computing the spectrum. The analyzed velocity function must be verified by mute and stack, and the sequence must usually be repeated many times. Therefore, an iterative, interactive, and unified velocity analysis tool is highly desirable. Such an interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes yield the final stack with a few mouse clicks, thereby enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique for the mute function between the T-X domain and the NMOC domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and the refracted wave, but it has two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. It references the Geobit utility libraries and can be installed in a Geobit-preinstalled environment. The program runs in the X-Window/Motif environment, with menus designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for making high-quality seismic sections.
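
The velocity spectrum the abstract discusses is built from a coherence measure; a common choice is semblance, the ratio of stacked energy to total energy over an NMO-corrected window. A minimal NumPy sketch on a toy gather (not the xva program's code):

```python
# Hedged sketch of semblance over an NMO-corrected gather window.
import numpy as np

def semblance(gather):
    """gather: (n_traces, n_samples) NMO-corrected window.
    Returns a value in [0, 1]; 1 means the traces stack perfectly."""
    stacked = gather.sum(axis=0)
    num = (stacked ** 2).sum()
    den = gather.shape[0] * (gather ** 2).sum()
    return num / den if den > 0 else 0.0

coherent = np.ones((6, 20))   # a perfectly flattened event
print(semblance(coherent))    # 1.0 when all traces are identical
```

Scanning trial velocities, applying NMO at each, and evaluating this ratio per time gate yields the semblance contour the interpreter picks from, which is why shallow coherent noise must be muted first: it stacks coherently at the wrong velocity and contaminates the spectrum.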
