• Title/Summary/Keyword: Information System Reliability


Security Requirements Analysis on IP Camera via Threat Modeling and Common Criteria (보안위협모델링과 국제공통평가기준을 이용한 IP Camera 보안요구사항 분석)

  • Park, Jisoo;Kim, Seungjoo
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.6 no.3
    • /
    • pp.121-134
    • /
    • 2017
  • With the rapid increase in the development and use of IoT devices, requirements for safe IoT devices and services, such as reliability and security, are also increasing. In security engineering, the Secure Development Life Cycle (SDLC) is applied to build trustworthy systems. The SDLC consists of four major steps (security requirements, design, implementation, and operation), each with its own goals and activities. Deriving security requirements, the first step of the SDLC, must be done accurately and objectively because it affects the rest of the life cycle. Threat modeling is used to derive accurate and objective security requirements, and its results can ensure the completeness of the scope of analysis and the traceability of threats. In many countries, academia and IT companies are actively researching how to derive security requirements systematically, but in Korea, awareness of and research on such systematic derivation are lacking. This paper therefore describes a method and process for deriving security requirements systematically through threat modeling, including DFD, STRIDE, an attack library, and attack trees. The derived security requirements are also expressed via the Common Criteria so that they carry an objective meaning and can be used broadly.
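As a rough illustration of the STRIDE step described above, the standard STRIDE-per-element table maps each DFD element type to the threat categories usually reviewed for it (the mapping below is the generic table, not the paper's attack library):

```python
# Minimal STRIDE-per-element sketch: for each DFD element type, list the
# threat categories conventionally considered for it (illustrative only).
STRIDE = {
    "S": "Spoofing", "T": "Tampering", "R": "Repudiation",
    "I": "Information disclosure", "D": "Denial of service",
    "E": "Elevation of privilege",
}

# Standard STRIDE-per-element applicability table.
APPLICABLE = {
    "external_entity": "SR",
    "process": "STRIDE",
    "data_flow": "TID",
    "data_store": "TRID",
}

def threats_for(element_type: str) -> list[str]:
    """Return the STRIDE threat names to review for a DFD element type."""
    return [STRIDE[c] for c in APPLICABLE[element_type]]

# Example: an IP camera's video stream is a data flow in the DFD.
print(threats_for("data_flow"))
# ['Tampering', 'Information disclosure', 'Denial of service']
```

Each threat found this way can then be expanded into an attack-tree node and traced forward to a security requirement, which is what gives the method its completeness and traceability properties.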

Analysis of Network Traffic with Urban Area Characteristics for Mobile Network Traffic Model (이동통신 네트워크 트래픽 모델을 위한 도시 지역 이동통신 트래픽 특성 분석)

  • Yoon, Young-Hyun
    • The KIPS Transactions:PartC
    • /
    • v.10C no.4
    • /
    • pp.471-478
    • /
    • 2003
  • Traditionally, analysis, simulation, and measurement have all been used to evaluate the performance of network protocols and functional entities that support mobile wireless services. Simulation is useful for testing complex systems with intricate interactions between components. A mobile call simulator used to examine, validate, and predict the performance of mobile wireless call procedures must have a teletraffic model that describes the mobile communication environment. A mobile teletraffic model consists of two sub-models: a traffic source model and a network traffic model. In this paper, we analyze network traffic data gathered from selected base stations (BSs) to define the mobile teletraffic model. We define four cell location types: residential, commercial, industrial, and afforested zones. We selected base stations representing these cell location types in Seoul and gathered real data from them. We then present the call rate per hour, the call distribution pattern per day, the busy hours, the loose hours, and the maximum and minimum numbers of calls for each cell location type. These parameters are very important for testing the performance and reliability of mobile communication systems, and they are useful for defining a mobile network traffic model or as input parameters for existing mobile simulation programs.
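Parameters such as the busy hour and the daily maximum/minimum call counts mentioned above can be read directly off hourly call totals; a minimal sketch with made-up counts for one hypothetical base station:

```python
# Hypothetical hourly call counts for one base station over a day
# (index 0 = 00:00-01:00, ..., index 23 = 23:00-24:00). Illustrative only.
hourly_calls = [12, 8, 5, 4, 3, 6, 20, 55, 80, 70, 65, 75,
                90, 85, 70, 68, 72, 95, 88, 60, 45, 35, 25, 18]

# Busy hour: the hour with the largest call count.
busy_hour = max(range(24), key=lambda h: hourly_calls[h])
max_calls, min_calls = max(hourly_calls), min(hourly_calls)
mean_rate = sum(hourly_calls) / 24  # mean calls per hour over the day

print(busy_hour, max_calls, min_calls)
# 17 95 3
```

Repeating this per cell location type (residential, commercial, industrial, afforested) yields exactly the kind of per-type parameter set the paper feeds into simulation programs.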

A Quality Assessment of Meta-Analyses Research in Social Work (국내 사회복지 관련 메타분석 연구의 질 평가)

  • Cho, Mi-Kyoung;Kim, Hee-Young
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.17 no.10
    • /
    • pp.158-167
    • /
    • 2016
  • This study was conducted to evaluate the quality of meta-analyses in social work research in South Korea using the Assessment of Multiple Systematic Reviews (AMSTAR). Electronic databases including the Korean Studies Information Service System (KISS), DBpia, and RISS4U were searched for 'meta-analysis', 'social work', and 'social welfare' from 2000 to December 2015, and 42 meta-analysis studies were included. Data were analyzed using descriptive statistics, t-tests, and ANOVA. The mean AMSTAR score was 4.766±1.66; 19 studies (45.2%) were classified as low quality and 22 (52.4%) as moderate quality. Quality assessment scores were analyzed by publication year, participants, number of included studies, number of databases, reporting of study quality, extraction diagram, and topic. The findings indicate that the following changes should be implemented to improve the quality and reliability of meta-analysis results in social work research: 1) common reporting guidelines should be provided for the social work field; 2) the quality of each included study should be analyzed to achieve a high level of evidence for the effectiveness of social work interventions; 3) the characteristics of the included studies should be provided; and 4) a consensus procedure based on at least two independent data extractors should be reported.
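The low/moderate classification above reflects the usual AMSTAR banding of the 11-item total score; a minimal sketch (the cut-offs shown are the commonly used ones, not stated in the abstract, and the scores are invented):

```python
def amstar_level(score: int) -> str:
    """Band an AMSTAR total score (0-11) into a quality level.
    Common convention: 0-3 low, 4-7 moderate, 8-11 high quality."""
    if not 0 <= score <= 11:
        raise ValueError("AMSTAR total ranges from 0 to 11")
    return "low" if score <= 3 else "moderate" if score <= 7 else "high"

scores = [3, 5, 4, 8, 2, 6]  # hypothetical per-study totals
levels = [amstar_level(s) for s in scores]
mean_score = sum(scores) / len(scores)
```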

A Use-case based Component Mining Approach for the Modernization of Legacy Systems (레거시 시스템을 현대화하기 위한 유스케이스 기반의 컴포넌트 추출 방법)

  • Kim, Hyeon-Soo;Chae, Heung-Seok;Kim, Chul-Hong
    • Journal of KIISE:Software and Applications
    • /
    • v.32 no.7
    • /
    • pp.601-611
    • /
    • 2005
  • Owing not only to their proven stability and reliability but also to significant investment and years of accumulated experience and knowledge, legacy systems have supported the core business applications of many organizations for years. The emergence of Web-based e-business environments, however, requires externalizing core business processes to the Web, which is a competitive advantage in the new economy. Consequently, organizations now need to mine the business value buried in their legacy systems for reuse in new e-business applications. In this paper we suggest a systematic approach to mining components that perform specific business services and that consist of legacy-system assets to be leveraged on a modern platform. The proposed activities are divided into several tasks. First, use cases that realize the business processes are captured. Second, a design model is constructed for each identified use case in order to integrate use cases with similar functionalities. Third, component candidates are identified from the design model and then adjusted by considering elements common to the candidates. The business components are further divided into three more fine-grained components so that they can be deployed onto J2EE/EJB environments. Finally, we define the component interfaces that expose the components' functionalities as operations.

A Study on Converting the Data of Probability of Hit(Ph) for OneSAF Model (OneSAF 모델을 위한 명중률 데이터 변환 방법)

  • Kim, Gun In;Kang, Tae Ho;Seo, Woo Duck;Pyun, Jae Jung
    • Journal of the Korea Society for Simulation
    • /
    • v.29 no.3
    • /
    • pp.83-91
    • /
    • 2020
  • To use the OneSAF model for defense M&S analysis, the most critical factor is the acquisition of input data. It is difficult for model users to determine input data such as the probability of hit (Ph) and the probability of kill (Pk). These data can be obtained directly by live fire during development and operational tests, but this requires considerable time and resources. In this paper, we review possible ways to obtain Ph and Pk and introduce several data-production methodologies. In particular, we present the error budget method for converting the Ph (%) data of the AWAM model into the error (mil) data of the OneSAF model, along with a conversion method that derives adjusted results from the JMEM. To demonstrate the usefulness of the proposed method, the probability of hit was calculated using the error budget method; more accurate data were obtained when the error budget method and the projected target area from published photographs were used together. The importance of the Ph calculation was demonstrated by a sensitivity analysis of Ph on combat effectiveness. This paper emphasizes the importance of determining Ph data and of improving the reliability of M&S systems through the steady collection and analysis of Ph data.
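The abstract does not give the error budget formulas, but the core idea of converting a hit probability into a dispersion error in mils can be illustrated by inverting a one-dimensional, zero-bias normal hit model. This toy model and all of its numbers are illustrative, not the paper's actual procedure:

```python
import math

def ph_to_sigma_mil(ph: float, target_width_m: float, range_m: float) -> float:
    """Toy inversion of a 1-D normal hit model: find the aim-error sigma
    (in mils) giving hit probability `ph` on a target of the given width.
    Assumes zero bias and the rough approximation 1 mil ~ 1 m at 1000 m."""
    half_w = target_width_m / 2.0
    # Solve Ph = erf(half_w / (sqrt(2) * sigma)) for sigma by bisection.
    lo, hi = 1e-6, 100.0  # sigma bounds in metres at the given range
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if math.erf(half_w / (math.sqrt(2) * mid)) > ph:
            lo = mid  # Ph too high means sigma is too small: increase it
        else:
            hi = mid
    sigma_m = (lo + hi) / 2.0
    return sigma_m * 1000.0 / range_m  # metres -> mils at this range

# Example: a 90% hit probability on a 2.3 m wide target at 1000 m
# corresponds to a dispersion of roughly 0.7 mil under this model.
print(round(ph_to_sigma_mil(0.9, 2.3, 1000.0), 2))
```

As expected, a lower hit probability maps to a larger dispersion error, which is the direction of the Ph-to-mil conversion the paper performs between the AWAM and OneSAF input formats.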

The GOCI-II Early Mission Marine Fog Detection Products: Optical Characteristics and Verification (천리안 해양위성 2호(GOCI-II) 임무 초기 해무 탐지 산출: 해무의 광학적 특성 및 초기 검증)

  • Kim, Minsang;Park, Myung-Sook
    • Korean Journal of Remote Sensing
    • /
    • v.37 no.5_2
    • /
    • pp.1317-1328
    • /
    • 2021
  • This study analyzes the early-mission marine fog detection results from the Geostationary Ocean Color Imager-II (GOCI-II). We investigate the optical characteristics of the GOCI-II spectral bands for marine fog between October 2020 and March 2021, during the overlapping mission period of GOCI and GOCI-II. For the Rayleigh-corrected reflectance (Rrc) at the 412 nm band, which serves as input to the GOCI-II marine fog algorithm, the inter-comparison between GOCI and GOCI-II data showed a small root mean square error (RMSE, 0.01) with a high correlation coefficient (0.988). Another input variable, the normalized local standard deviation (NLSD), also showed a reasonable correlation (0.798) between the GOCI and GOCI-II data with a small RMSE (0.007). The GOCI-II observations also revealed distinctive optical characteristics separating marine fog from clouds: for clouds, the Rrc values of all bands showed a narrower distribution centered at higher values than for marine fog. For actual cases, the GOCI-II marine fog detection distribution is similar to that of GOCI but more detailed, owing to the improvement in spatial resolution from 500 m to 250 m. Validation against visibility data from the automated synoptic observing system (ASOS) confirms the initial reliability of GOCI-II marine fog detection. The algorithm's performance is expected to improve further by adding enough samples to verify stable performance, by improving the post-processing step with real-time cloud input data, and by reducing false alarms with aerosol information.
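The RMSE and correlation figures quoted above are standard inter-comparison statistics between co-located samples from the two sensors; a minimal sketch with hypothetical Rrc(412 nm) values:

```python
import math

def rmse(a, b):
    """Root mean square error between two equal-length series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical co-located Rrc(412 nm) samples; not GOCI mission data.
goci  = [0.10, 0.12, 0.08, 0.15, 0.11]
goci2 = [0.11, 0.12, 0.09, 0.14, 0.11]
print(round(rmse(goci, goci2), 4), round(pearson(goci, goci2), 3))
```

A small RMSE with a correlation near 1, as reported for the 412 nm Rrc band, indicates that the GOCI-II input can stand in for the GOCI input in the fog detection algorithm.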

Frame security method in physical layer using OFB over Gigabit Ethernet Network (기가비트 이더넷 망에서 OFB 방식을 이용한 물리 계층 프레임 보안 기법)

  • Im, Sung-yeal
    • Journal of Internet Computing and Services
    • /
    • v.22 no.5
    • /
    • pp.17-26
    • /
    • 2021
  • This paper presents a physical-layer frame security technique that performs OFB-mode encryption/decryption with the AES algorithm on Gigabit Ethernet networks. We propose a data security technique at the physical layer that applies AES in OFB mode, which offers strong security, when sending and receiving data over a Gigabit Ethernet network. Gigabit Ethernet networks normally operate without security features; where data security is required, additional devices applying this technique can be installed to perform the security functions. Ethernet frames conform to the IEEE 802.3 specification, which includes, in addition to the data field, several fields that ensure proper reception of the frame at the receiving node. When encrypting, only the data field should be encrypted and transmitted in real time. We show that only the data field of the IEEE 802.3 frame is encrypted at the sending node and that only the data field is decrypted to recover the plaintext at the receiving node, confirming that encryption/decryption is carried out correctly. Installing devices that apply this technique can therefore increase system reliability when data security is required on Ethernet networks that otherwise operate without security features.
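OFB mode turns a block cipher into a stream cipher, so encryption and decryption are the same XOR-with-keystream operation, which is what allows the data field to be processed in real time without padding. The sketch below illustrates the payload-only idea; a hash-based toy function stands in for AES so the example stays self-contained (it is NOT secure), and the 14-byte header split assumes an untagged Ethernet frame:

```python
import hashlib

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    """Stand-in for AES-128 block encryption (illustrative, NOT secure)."""
    return hashlib.sha256(key + block).digest()[:16]

def ofb_xform(key: bytes, iv: bytes, data: bytes) -> bytes:
    """OFB mode: keystream = E(IV), E(E(IV)), ...; output = data XOR keystream.
    Encrypting and decrypting are the identical operation."""
    out = bytearray()
    feedback = iv
    for i in range(0, len(data), 16):
        feedback = toy_block_encrypt(key, feedback)
        out.extend(b ^ k for b, k in zip(data[i:i + 16], feedback))
    return bytes(out)

def protect_frame(frame: bytes, key: bytes, iv: bytes) -> bytes:
    """Encrypt only the data field of an (untagged) IEEE 802.3 frame,
    leaving the 14-byte header (dst MAC, src MAC, length/type) in the clear."""
    header, payload = frame[:14], frame[14:]
    return header + ofb_xform(key, iv, payload)

# Round trip: applying the same transform twice recovers the plaintext.
key, iv = b"k" * 16, b"i" * 16
frame = b"\xaa" * 6 + b"\xbb" * 6 + b"\x08\x00" + b"hello ethernet payload"
enc = protect_frame(frame, key, iv)
assert protect_frame(enc, key, iv) == frame
assert enc[:14] == frame[:14]  # header fields remain readable by switches
```

Because the header stays in the clear, ordinary Ethernet forwarding still works, which is why only the data field is encrypted in the proposed scheme.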

Case of Dynamic Performance Optimization for Hydraulic Drifter (유압 드리프터의 동적성능 최적화 사례)

  • Noh, Dae-kyung;Lee, Dae-Hee;Jang, Joo-Sup;Yun, Joo-Seop;Lee, Dong-Won
    • Journal of the Korea Society for Simulation
    • /
    • v.28 no.2
    • /
    • pp.35-48
    • /
    • 2019
  • Until now, domestic hydraulic drifters have been developed by benchmarking products from leading overseas companies. However, they do not deliver excellent impact performance because they are not suited to the characteristics (large flow rate and low pressure) of Korean hydraulic drill power packs, and research on optimum design has not made much headway. This study performs multi-objective optimization for a hydraulic drifter whose capacity has been redesigned to handle the large flow rate, with the aims of improving impact power and reducing supply and surge pressure. The study proceeds as follows: first, we set targets for impact power, supply pressure, and surge pressure and perform multi-objective optimization against them; we then establish the reliability of the optimized analytical model by comparing test results from a prototype built to the optimized design with the analysis results of the model. The study used SimulationX, a hydraulic system analysis package, and EasyDesign, a multi-objective optimization program. Through this research, we achieved results that satisfy the goal of developing a high-power drifter suitable for Korean hydraulic drills.

Improvement of Analysis Methods for Fatty Acids in Infant Formula by Gas Chromatography Flame-Ionization Detector (GC-FID를 이용한 조제유류 중 지방산 분석법 개선 연구)

  • Hwang, Keum Hee;Choi, Won Hee;Hu, Soo Jung;Lee, Hye young;Hwang, Kyung Mi
    • Journal of Food Hygiene and Safety
    • /
    • v.36 no.1
    • /
    • pp.34-41
    • /
    • 2021
  • The purpose of this research is to improve the analysis methods for determining the fatty acid content of infant formulas and follow-up formulas. A gas chromatography (GC) method was performed on a GC system coupled to a flame ionization detector with a fused silica capillary column (SP2560, 100 m×0.25 mm, 0.20 ㎛). The method was validated using a standard reference material (SRM, NIST 1849a). Performance parameters for method validation, such as specificity, linearity, limits of detection (LOD) and quantification (LOQ), accuracy, and precision, were examined. The standard solutions showed linearity with correlation coefficients higher than 0.999 in the range of 0.1-5 mg/mL. The LOD and LOQ were 0.01-0.06 mg/mL and 0.03-0.2 mg/mL, respectively. Recovery was confirmed using the standard reference material, and precision was between 0.8% and 2.9% relative standard deviation (RSD). The optimized methods were applied to sample analysis to verify their reliability. All tested products had acceptable fatty acid contents compared with the component specifications for nutrition labeling. The results of this research will provide efficient experimental information and strengthen the management of nutrients in infant formulas and follow-up formulas.
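The abstract does not state how the LOD and LOQ were computed; a common approach (the ICH-style calculation from the calibration curve's residual standard deviation and slope) can be sketched as follows, with hypothetical input values:

```python
def lod_loq(residual_sd: float, slope: float) -> tuple[float, float]:
    """ICH-style estimates from a calibration curve:
    LOD = 3.3 * sigma / S and LOQ = 10 * sigma / S, where sigma is the
    residual standard deviation of the curve and S is its slope."""
    return 3.3 * residual_sd / slope, 10.0 * residual_sd / slope

# Hypothetical calibration statistics (detector response vs. mg/mL).
lod, loq = lod_loq(residual_sd=0.5, slope=120.0)
print(round(lod, 4), round(loq, 4))  # concentrations in mg/mL
```

With these made-up numbers the estimates land in the same order of magnitude as the reported 0.01-0.06 mg/mL (LOD) and 0.03-0.2 mg/mL (LOQ) ranges, which is the plausibility check such a sketch is good for.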

ESG Management, Strategies for corporate sustainable growth : KT's company-wide goals and strategies (ESG 경영, 기업의 지속가능성장을 위한 전략 : KT의 전사적 목표와 전략)

  • Kang, Yoon Ji;Kim, Sanghoon
    • Journal of the Korea Convergence Society
    • /
    • v.13 no.4
    • /
    • pp.233-244
    • /
    • 2022
  • One of the most noteworthy topics in recent corporate management is ESG (Environmental, Social, and Governance). Although many companies have declared ESG management, KT declared full-fledged ESG management in 2021 and is sharing its sustainable management strategy with stakeholders. KT is also strengthening ESG management by issuing ESG bonds, the first in the domestic ICT industry. At a time when the information technology industry has become more important because of COVID-19, this study examines KT's ESG management goals and strategies in the environmental, social, and governance areas. In the environmental area, KT aims for environmental integrity through 'environmental management', 'green competence', 'energy resources', and 'eco-friendly projects'. In the social area, it pursues the genuine creation of social value through 'social contribution', 'co-growth', and 'human rights management'. Finally, in the governance area, it aims for a transparent corporate management system that ensures economic reliability through 'ethics and compliance' and 'risk management'. In particular, KT promotes its own brand of ESG management with strategies that use AI and big data technologies, reflecting its character as a digital platform company, to solve environmental and social problems. Through KT's case, this study derives implications for establishing ESG strategies and for the direction of ESG management development.