• Title/Summary/Keyword: Standardized network


Study on Method to Develop Case-based Security Threat Scenario for Cybersecurity Training in ICS Environment (ICS 환경에서의 사이버보안 훈련을 위한 사례 기반 보안 위협 시나리오 개발 방법론 연구)

  • GyuHyun Jeon;Kwangsoo Kim;Jaesik Kang;Seungwoon Lee;Jung Taek Seo
    • Journal of Platform Technology / v.12 no.1 / pp.91-105 / 2024
  • As the number of cases in which IT systems are connected to formerly isolated ICS (Industrial Control System) networks continues to grow, security threats in the ICS environment have increased rapidly. Security threat scenarios help in designing security strategies for cybersecurity training, including the analysis of, prediction of, and response to cyberattacks. For cybersecurity training to be successful, research is needed on developing valid and reliable security threat scenarios that make the training meaningful. Therefore, this paper proposes a case-based security threat scenario development methodology for cybersecurity training in the ICS environment. To this end, we develop a five-step methodology based on the analysis of actual cybersecurity incidents targeting ICS. Threat techniques are first standardized into a common form using objective data from the MITRE ATT&CK framework, and the CVEs and CWEs corresponding to each threat technique are then identified. Next, vulnerable functions in the programming languages used in ICS assets are analyzed and mapped to the identified CWEs. Finally, security threat scenarios for cybersecurity training on new ICS are developed from the data produced in the preceding steps. Verification through a comparative analysis between the proposed methodology and existing research confirmed that the proposed method is more effective than existing methods in terms of scenario validity, appropriateness of evidence, and the variety of scenarios that can be developed.
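
As a rough illustration of the data flow described above (standardized ATT&CK technique IDs linked to CVE/CWE entries and vulnerable functions, then assembled into a training scenario), a minimal Python sketch might look like the following; the mapping values and identifiers are hypothetical placeholders, not data from the paper.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative sketch only: incident-derived threat techniques, standardized as
# MITRE ATT&CK technique IDs, are linked to CVE/CWE entries and vulnerable
# functions. All mapping values below are hypothetical placeholders.
@dataclass
class ThreatStep:
    technique_id: str                     # standardized ATT&CK technique ID
    cves: List[str] = field(default_factory=list)
    cwes: List[str] = field(default_factory=list)
    vulnerable_functions: List[str] = field(default_factory=list)

TECHNIQUE_TO_WEAKNESS: Dict[str, Dict[str, List[str]]] = {
    "T0886": {"cves": ["CVE-XXXX-YYYY"],          # placeholder identifiers
              "cwes": ["CWE-287"],
              "functions": ["remote_login()"]},
}

def build_scenario(incident_techniques: List[str]) -> List[ThreatStep]:
    """Assemble an ordered training scenario from observed technique IDs."""
    scenario = []
    for tid in incident_techniques:
        mapping = TECHNIQUE_TO_WEAKNESS.get(tid, {})
        scenario.append(ThreatStep(
            technique_id=tid,
            cves=mapping.get("cves", []),
            cwes=mapping.get("cwes", []),
            vulnerable_functions=mapping.get("functions", []),
        ))
    return scenario
```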


A Study on Organizational Strategy and Operational Elements of Community-based Agricultural Management Bodies (마을단위 농업경영체 조직전략 및 운영요소 도출 연구)

  • Kim, Jong An;Kil, Cheong Soon;Kim, Gi Tae;Kim, Won Gyeong
    • Journal of Agricultural Extension & Community Development / v.20 no.3 / pp.777-822 / 2013
  • This study attempts to elicit the organizational strategy and operational elements of community-based agricultural management bodies as a new form of main farm management. We analyzed the latest discussions on cooperative community management in the Republic of Korea and Japan based on the theory of organizing regional agriculture, and also investigated organization management and business management in cooperative community farming. The main conclusions on the organizational strategy and operational elements of community-based agricultural management bodies are as follows. i) A community-based agricultural management body is a cooperative that manages individual agricultural resources as joint stock in pursuit of a compound goal: expanding agricultural income and maintaining farm productivity and rural society. ii) The domain of cooperative management focuses on secondary and tertiary industries, such as food processing, farm produce distribution, and rural experience programs, rather than on farm produce production alone. The study suggests a business promotion system for village-unit farmers' groups, elements of organization management such as an executive decision-making body, business management, operating factors for each stage, and management elements for cooperative farm work. iii) The policy direction for invigorating community-based agricultural management bodies is to provide facilitation tailored to each stage instead of standardized support.

Toward a Social Sciences Methodology for Electronic Survey Research on the Internet or Personal Computer (사회과학 연구에 있어 인터넷 및 상업용 통신망을 이용한 전자설문 조사방법의 활용)

  • Hong Yong-Gee;Lee Hong-Gee;Chae Su-Kyung
    • Management & Information Systems Review / v.3 / pp.287-316 / 1999
  • Cyberspace permits us to move beyond traditional face-to-face, mail, and telephone surveys, yet still to examine basic issues regarding the quality of data collection: sampling, questionnaire design, survey distribution, means of response, and database creation. This article addresses each of these issues by contrasting and comparing traditional (paper-and-pencil) survey methods with Internet or personal computer network-mediated (screen-and-keyboard) survey methods; it also introduces researchers to this innovative tool and outlines a variety of practical methods for using the Internet or personal computer networks. The revolution in telecommunications technology has fostered the rapid growth of the Internet all over the world. The Internet is a massive global network comprising many national and international networks of interconnected computers. The Internet or personal computer networks can serve as a comprehensive interactive tool that facilitates the development of survey research skills. They provide a virtual frontier that expands our access to information and increases our knowledge and understanding of public opinion, political behavior, social trends, and lifestyles through survey research. Comparable to other technological advancements, the Internet or personal computer networks present opportunities that will significantly affect the process and quality of survey research now and in the twenty-first century. There are trade-offs between traditional and Internet or personal computer network surveys. The Internet or personal computer networks are an important channel for obtaining information from target participants, and the savings in time, effort, and materials are substantial. Using these survey tools can improve the quality of the research environment. There are, however, several limitations to this survey approach. It requires the researcher to be familiar with Internet navigation and e-mail, which are essential to the process. The use of Listservs and Newsgroups results in a biased sample of the population of corporate trainers; however, it is this group that participates in technology and is at the forefront of shaping the new organizations of interest, so it consists of appropriate participants. If this survey method becomes popular and is used too frequently, potential respondents may become as annoyed with e-mail as they sometimes are with mail surveys and junk mail, although membership in the Listserv or Newsgroup may moderate that reaction. There is a need to determine efficient, effective ways for the researcher to strip identifiers from e-mail so that respondents remain anonymous, while simultaneously blocking a respondent from answering a particular survey instrument more than once. The optimum process would be one that is initiated by the researcher: simple, fast, and inexpensive to administer, and credible with respondents. This would protect the legitimacy of the sample and respondents' anonymity. Researchers should also create attractive Internet or personal computer network survey formats that build on the strengths of standardized structures while capitalizing on the dynamic and interactive capability of the medium. Without such innovations in survey design, it is difficult to imagine why potential survey respondents would use their time to answer questions.
More must be done to create diverse and engaging ways of building credibility between respondents and researchers on the Internet or personal computer networks. We believe that much exciting future research lies in electronic survey research. The ability to communicate across distance, time, and national boundaries offers great possibilities for studying the ways in which technology and technological discourse are shaped, used, and disseminated; the many recent doctoral dissertations that treat some aspect of electronic survey research testify to the increased focus on the Internet or personal computer networks. Thus, scholars should begin a serious conversation about the methodological issues of conducting research in cyberspace. Of all the disciplines, those that emphasize the relationship between technology and human communication should take the lead in considering research in cyberspace.
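
One way to realize the anonymity-plus-deduplication requirement mentioned above is to derive a one-way token from each respondent's e-mail address and store only the token. The following is a minimal Python sketch under that assumption; the key handling and storage are deliberately simplified for illustration.

```python
import hashlib
import hmac

# Hypothetical illustration: the researcher keeps a secret key, derives a
# non-reversible token from each respondent's e-mail address, stores answers
# keyed only by that token, and rejects a second submission with the same token.
SECRET_KEY = b"researcher-only-secret"   # assumption: held only by the researcher

def anonymous_token(email: str) -> str:
    """Derive a stable, non-reversible token from an e-mail address."""
    return hmac.new(SECRET_KEY, email.strip().lower().encode(), hashlib.sha256).hexdigest()

responses: dict[str, dict] = {}           # token -> answers (no identifiers stored)

def accept_response(email: str, answers: dict) -> bool:
    """Store answers anonymously; refuse duplicate submissions."""
    token = anonymous_token(email)
    if token in responses:
        return False                      # respondent already answered this instrument
    responses[token] = answers
    return True
```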


Analysis and Design of Profiling Adaptor for XML based Energy Storage System (XML 기반의 에너지 저장용 프로파일 어댑터 분석 및 설계)

  • Woo, Yongje;Park, Jaehong;Kang, Mingoo;Kwon, Kiwon
    • Journal of Internet Computing and Services / v.16 no.5 / pp.29-38 / 2015
  • An energy storage system stores electricity for later use. It can store electricity from legacy electric power systems or renewable energy systems in a battery when demand is low and release it when demand is high, enabling efficient energy usage and stable operation of the electric power system. It increases energy usage efficiency, stabilizes the power supply system, and increases the utilization of renewable energy. The recent increase in global interest in efficient energy consumption has raised the need for an energy storage system that satisfies both consumers' demand for a stable power supply and suppliers' demand for power demand normalization. In general, an energy storage system consists of a Power Conditioning System (PCS), a Battery Management System (BMS), battery cells, and peripheral devices. The specifications of these subsystems are manufacturer dependent, and because the core component interfaces are not standardized, it is difficult to assemble and operate an energy storage system. This paper presents the design of a profile structure for energy storage systems and the realization of a profiling system based on it. The profiling system accommodates the manufacturer-dependent settings of diverse components as well as the information needed for effective operation. The settings and operational information of various PCSs, BMSs, battery cells, and peripheral devices were analyzed to define the profile specification and structure, and profile adapter software applicable to energy storage systems was designed and implemented. The profiles generated by the profile authoring tool consist of a settings profile and an operation profile. The settings profile contains configuration information for the devices that compose the energy storage system, organized into three categories: information on the electric control module, on the subsystems, and on the communication interfaces between devices. The operation profile contains information on how the energy storage system is controlled. The profiles are based on the standard XML specification to accommodate future extensions. The profile system was verified by applying it to an energy storage system and testing charge and discharge operations.
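
To make the settings/operation profile split concrete, the following minimal Python sketch parses a hypothetical XML profile; the element and attribute names are illustrative assumptions, not the schema defined in the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical ESS profile: the element names (settingsProfile, controlModule,
# subSystem, interface, operationProfile, ...) are illustrative assumptions.
PROFILE_XML = """
<essProfile>
  <settingsProfile>
    <controlModule vendor="VendorA" model="PCS-100" ratedPowerKw="100"/>
    <subSystem type="BMS" vendor="VendorB" cellCount="240"/>
    <interface protocol="Modbus-TCP" address="192.168.0.10" port="502"/>
  </settingsProfile>
  <operationProfile>
    <charge startSoc="20" stopSoc="90" maxPowerKw="80"/>
    <discharge startSoc="90" stopSoc="20" maxPowerKw="80"/>
  </operationProfile>
</essProfile>
"""

def load_profile(xml_text: str) -> dict:
    """Parse the settings and operation sections into plain dictionaries."""
    root = ET.fromstring(xml_text)
    settings = {child.tag: dict(child.attrib)
                for child in root.find("settingsProfile")}
    operation = {child.tag: dict(child.attrib)
                 for child in root.find("operationProfile")}
    return {"settings": settings, "operation": operation}

if __name__ == "__main__":
    profile = load_profile(PROFILE_XML)
    print(profile["operation"]["charge"]["maxPowerKw"])   # prints "80"
```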

A Study on the Component-based GIS Development Methodology using UML (UML을 활용한 컴포넌트 기반의 GIS 개발방법론에 관한 연구)

  • Park, Tae-Og;Kim, Kye-Hyun
    • Journal of Korea Spatial Information System Society / v.3 no.2 s.6 / pp.21-43 / 2001
  • The environment for developing information systems, including GIS, has changed drastically in recent years with respect to software complexity and diversity, distributed processing, network computing, and so on. This has shifted the software development paradigm toward CBD (Component Based Development) built on object-oriented technology. To support this movement, the OGC has released abstract and implementation standards that enable services for heterogeneous geographic information processing. Developing GIS applications for municipal governments based on component technology is also a common trend in Korea. It is therefore imperative to adopt component technology in light of these movements, yet little related research has been carried out. This research proposes a component-based GIS development methodology, ATOM (Advanced Technology Of Methodology), and verifies its applicability through a case study. ATOM can be used as a methodology for developing components themselves as well as enterprise GIS, supporting the whole software development life cycle based on conventional reusable components. ATOM defines a stepwise development process comprising the activities and work units of each stage. It also provides inputs and outputs, standardized items and specifications for documentation, and detailed instructions for easy understanding of the development methodology. The major characteristic of ATOM is that it is a component-based development methodology that takes the numerous features of the GIS domain into account in order to generate components with a single function, the smallest possible size, and maximum reusability. The case study conducted to validate the applicability of ATOM showed that it is an efficient tool for generating components, providing relatively systematic and detailed guidelines for component development. ATOM should therefore improve the quality and productivity of application GIS software development and eventually contribute to the automated production of GIS software, our final goal.
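
As a loose illustration of the kind of small, single-function, reusable component such a methodology targets, the following Python sketch defines a minimal component interface and one trivial implementation; it is an assumption made for illustration, not an ATOM artifact.

```python
from typing import List, Protocol, Tuple

Coordinate = Tuple[float, float]

# Hypothetical single-purpose GIS component: a geometry buffer with a narrow,
# reusable interface that client applications can depend on.
class GeometryBuffer(Protocol):
    def buffer(self, ring: List[Coordinate], distance: float) -> List[Coordinate]:
        """Return an outline of the input ring expanded by `distance`."""
        ...

class NaiveSquareBuffer:
    """Trivial implementation: expands the axis-aligned bounding box."""
    def buffer(self, ring: List[Coordinate], distance: float) -> List[Coordinate]:
        xs = [x for x, _ in ring]
        ys = [y for _, y in ring]
        min_x, max_x = min(xs) - distance, max(xs) + distance
        min_y, max_y = min(ys) - distance, max(ys) + distance
        return [(min_x, min_y), (max_x, min_y), (max_x, max_y), (min_x, max_y)]
```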


An Empirical Study on the Effect of CRM System on the Performance of Pharmaceutical Companies (고객관계관리 시스템의 수준이 BSC 관점에서의 기업성과에 미치는 영향 : 제약회사를 중심으로)

  • Kim, Hyun-Jung;Park, Jong-Woo
    • Journal of Intelligence and Information Systems / v.16 no.4 / pp.43-65 / 2010
  • Facing the complex environment of the past decade, many companies have adopted new strategic frameworks such as customer relationship management (CRM) systems to achieve sustainable profitability and to survive intense competition. In many business areas, CRM systems have advanced considerably through continuous correction of defects and overall integration. However, pharmaceutical companies in Korea have been slow to adopt them, since they still tend to hold fast to traditional sales and marketing practices based on the personal networks of sales representatives. Against this background, this article empirically examines the current status of CRM systems and their effects on the performance of pharmaceutical companies, applying the four perspectives of the BSC method: financial, customer, learning and growth, and internal process. A survey was administered by e-mail and post to employers and employees working in pharmaceutical firms. Of the 140 responses collected, 113 were used for statistical analysis with the SPSS ver. 15 package; reliability analysis, factor analysis, and regression analysis were performed. The study revealed that, as expected, CRM systems had a significant effect on improving both the financial and non-financial performance of pharmaceutical companies. The proposed regression model fits the data well, and among its components the CRM marketing information system had a substantial impact on company outcomes in terms of profitability, growth, and investment. Useful analytical information from the CRM marketing information system appears to enable pharmaceutical firms to set up effective marketing and sales strategies, which lead to favorable financial performance by enhancing value for stakeholders, not to mention short-term profit and mid-term growth potential. The CRM system influenced not only the financial performance but also the non-financial performance of pharmaceutical companies. Further analysis of each component showed that the CRM marketing information system had a statistically significant effect on performance, consistent with the financial results. CRM systems are believed to provide companies with an efficient way of managing customers through standardized business processes and prompt responses to specific customer needs, which induces customer satisfaction and retention and improves performance over the long term; that is, a virtuous circle of value creation forms the cornerstone of sustainable growth. However, the research failed to find evidence supporting the hypotheses that the CRM sales representative records assessment system and the CRM customer analysis system favorably influence management performance. This result appears to reflect a gap between salespeople's actual work duties and the far-sighted goals of the strategic analysis framework as understood by the respondents. Ordinary salespeople seem to dedicate themselves to short-term goals, meeting sales targets and receiving incentive bonuses in a matter-of-fact style, and as such they tend to rely on personal networks and sales and promotional expenses rather than on the CRM system. The study findings propose a link between CRM information systems and performance, empirically indicating that pharmaceutical companies have been implementing CRM systems as an effective strategic business framework for achieving more balanced outcomes grounded in an understanding of both the CRM system and integrated performance.
This initial empirical evidence suggests a positive impact of a supportive CRM system on firm performance, especially in the pharmaceutical industry. It also brings out unmet needs for more practical system design, improved employee awareness, and increased system utilization in the field. On the basis of the insights from this exploratory study, confirmatory research with a more appropriate measurement tool and a larger sample size should be carried out.
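
For readers unfamiliar with the analysis workflow described above (survey scores regressed on performance measures), the following Python sketch shows the general shape of such a regression on purely synthetic data; the variable names and values are assumptions, not the study's data or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Purely synthetic illustration of regressing a performance measure on CRM
# subsystem scores; nothing here reproduces the study's data or findings.
rng = np.random.default_rng(0)
n = 113
df = pd.DataFrame({
    "crm_marketing_info": rng.normal(3.5, 0.6, n),     # hypothetical 5-point scale scores
    "crm_sales_assessment": rng.normal(3.2, 0.7, n),
    "crm_customer_analysis": rng.normal(3.4, 0.6, n),
})
df["financial_performance"] = 0.6 * df["crm_marketing_info"] + rng.normal(0, 0.5, n)

X = sm.add_constant(df[["crm_marketing_info",
                        "crm_sales_assessment",
                        "crm_customer_analysis"]])
model = sm.OLS(df["financial_performance"], X).fit()
print(model.summary())   # coefficients, R-squared, p-values
```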

A MVC Framework for Visualizing Text Data (텍스트 데이터 시각화를 위한 MVC 프레임워크)

  • Choi, Kwang Sun;Jeong, Kyo Sung;Kim, Soo Dong
    • Journal of Intelligence and Information Systems / v.20 no.2 / pp.39-58 / 2014
  • As the importance of big data and related technologies continues to grow in industry, visualizing the results of big data processing and analysis has become a highlighted concern. Visualization gives people an effective and clear understanding of analysis results, and it also serves as the GUI (Graphical User Interface) that supports communication between people and analysis systems. To make development and maintenance easier, these GUI parts should be loosely coupled from the parts that process and analyze data, and implementing such a loosely coupled architecture requires design patterns such as MVC (Model-View-Controller), which is designed to minimize coupling between the UI part and the data processing part. Big data can be classified into structured and unstructured data, and visualizing structured data is relatively easy compared to visualizing unstructured data. Nevertheless, as the use and analysis of unstructured data has spread, people usually develop a visualization system separately for each project to overcome the limitations of traditional visualization systems built for structured data. For text data, which makes up a huge part of unstructured data, visualization is even more difficult. This results from the complexity of the technologies for analyzing text data, such as linguistic analysis, text mining, and social network analysis, and from the fact that those technologies are not standardized. This situation makes it difficult to reuse the visualization system of one project in other projects. We assume that the reason is a lack of commonality in the design of visualization systems with a view to extending them to other systems. In this research, we suggest a common information model for visualizing text data and propose a comprehensive, reusable framework, TexVizu, for text visualization. We first survey representative research in the text visualization area, identify common elements of text visualization and common patterns among its various cases, and review and analyze these elements and patterns from three viewpoints: structural, interactive, and semantic. We then design an integrated model of text data that represents the elements needed for visualization. The structural viewpoint identifies structural elements of various text documents, such as title, author, and body. The interactive viewpoint identifies the types of relations and interactions between text documents, such as post, comment, and reply. The semantic viewpoint identifies semantic elements extracted through linguistic analysis of the text, represented as tags classifying entity types such as person, place or location, time, and event. We then extract and select common requirements for visualizing text data, categorized into four types: structure information, content information, relation information, and trend information. Each type of requirement comprises the required visualization techniques, the data, and the goal (what to know). These requirements are the key to designing a framework that keeps the visualization system loosely coupled from the data processing or analysis system.
Finally, we designed a common text visualization framework, TexVizu, which is reusable and extensible across various visualization projects by collaborating with various Text Data Loaders and Analytical Text Data Visualizers via common interfaces such as ITextDataLoader and IATDProvider. TexVizu comprises the Analytical Text Data Model, Analytical Text Data Storage, and Analytical Text Data Controller. In this framework, the external components are specified by the interfaces required for collaborating with the framework. As an experiment, we also applied this framework to two text visualization systems: a social opinion mining system and an online news analysis system.
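
A minimal Python sketch of the collaboration described above might look like the following; the interface names ITextDataLoader and IATDProvider come from the abstract, but every signature and field here is an illustrative assumption.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical rendering of the framework's collaboration points; only the
# interface names are taken from the abstract, the rest is assumed.
@dataclass
class AnalyticalTextData:
    """Integrated text model: structural, interactive, and semantic elements."""
    structure: Dict[str, str]                                        # e.g. title, author, body
    relations: List[Dict[str, str]] = field(default_factory=list)    # post/comment/reply links
    semantic_tags: List[Dict[str, str]] = field(default_factory=list)  # person/place/time/event tags

class ITextDataLoader(ABC):
    @abstractmethod
    def load(self, source: str) -> List[Dict[str, str]]:
        """Fetch raw text documents from an external source."""

class IATDProvider(ABC):
    @abstractmethod
    def analyze(self, raw_documents: List[Dict[str, str]]) -> List[AnalyticalTextData]:
        """Turn raw documents into analytical text data for visualization."""

class TexVizuController:
    """Controller wiring a loader and a provider to a view callback (MVC)."""
    def __init__(self, loader: ITextDataLoader, provider: IATDProvider):
        self.loader = loader
        self.provider = provider

    def render(self, source: str, view) -> None:
        raw = self.loader.load(source)
        model = self.provider.analyze(raw)
        view(model)    # the view only ever sees the model, never the loader/analyzer
```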

Permanent Preservation and Use of Historical Archives: Preservation Issues and Digitization of Historical Collections (역사기록물(Archives)의 항구적인 보존화 이용 : 보존전략과 디지털정보화)

  • Lee, Sang-min
    • The Korean Journal of Archival Studies / no.1 / pp.23-76 / 2000
  • In this paper, I examine what has been researched and decided in the Western archival community regarding preservation strategy and the selection of preservation media. Archivists have primarily been concerned with the 'preservation' and 'use' of archival materials worth preserving permanently. In the new information era, the preservation and use of archival materials face new challenges. The life expectancy of paper records has been shortened by the acidification and brittleness of modern papers, and the emergence of information technology is changing the traditional ways of preserving and using archival materials. User expectations have become so technology-oriented and so complicated that archivists must act like information managers using computer technology rather than practitioners of the traditional archival craft. Preservation strategy plays an important role in archival management as well as in information management; for cost-effective management of archives and archival institutions, a preservation strategy is a must. The preservation strategy encompasses all aspects of the archival preservation process and its practices, from the selection of archives, appraisal, inventorying, arrangement, description, and conservation to microfilming or digitization, archival buildings, and access services. These archival functions should be considered in relation to each other to ensure the proper preservation of archival materials. In an integrated preservation strategy, 'preservation' and 'use' should be combined and fulfilled without sacrificing either. Preservation strategy planning is essential for determining the policies by which archives keep their holdings safe and provide people with maximum access in the most effective ways. Preservation microfilming ensures the permanent preservation of the information held in important archival materials. To this end, detailed standards have been developed to guarantee the permanence of microfilm as well as its product quality. Silver gelatin film can last up to 500 years in an optimum storage environment and is the most viable option for a permanent preservation medium. ISO and ANSI have developed standards for the quality of microfilms and microfilming technology, and preservation microfilming guidelines have also been developed to ensure effective archival management and the picture quality of microfilms. It is essential to assess the need for preservation microfilming: limited resources always constrain preservation management, so appraisal (and selection) of what is to be preserved is the most important part of preservation microfilming. In addition, microfilms of standard quality can be scanned to produce quality digital images for instant use over the Internet. As information technology develops, archivists have begun to use it to make preservation easier and more economical and to promote the use of archival materials through computer communication networks. Digitization was introduced to provide easy and universal access to unique archives, and its large capacity for preserving archival data seems very promising. However, digitization, i.e., transferring images of records to electronic codes, still needs to be standardized. Digitized data are electronic records, and at present electronic records are very unstable and cannot be preserved permanently. Digital media, including optical disk materials, have not been proven to be reliable media for permanent preservation.
Because of their chemical coatings and their physical dependence on light, they are not stable and can be preserved for at most 100 years in an optimum storage environment; most CD-Rs last only 20 years. Furthermore, the obsolescence of hardware and software makes it hard to reproduce digital images made with earlier versions. Even when reformatting is possible, refreshing or upgrading digital images is very expensive, and the process has to be repeated at least every five to ten years. No standard for dealing with this obsolescence of hardware and software has yet come into being. In short, digital permanence is not a fact; it remains an uncertain possibility. Archivists must consider in their preservation planning both the risk of introducing new technology and its promising possibilities. In planning the digitization of historical materials, archivists should include plans for maintaining the digitized images and reformatting them for coming generations of new applications. Without such comprehensive planning, future use of the expensive digital images will become impossible, which means a loss of information and a final failure of both the 'preservation' and the 'use' of archival materials. As Peter Adelstein said, it is wise to be conservative when considerations of conservation are involved.

Recent Research for the Seismic Activities and Crustal Velocity Structure (국내 지진활동 및 지각구조 연구동향)

  • Kim, Sung-Kyun;Jun, Myung-Soon;Jeon, Jeong-Soo
    • Economic and Environmental Geology / v.39 no.4 s.179 / pp.369-384 / 2006
  • The Korean Peninsula, located on the southeastern part of the Eurasian plate, belongs to an intraplate region. Compared with interplate earthquakes, intraplate earthquakes are characterized by low and infrequent seismicity and a sparse, irregular distribution of epicenters. To evaluate seismic activity in an intraplate region accurately, long-term seismic data, including historical earthquake data, must be archived. Fortunately, about 2,000 years of historical earthquake records are available for the Korean Peninsula. Analysis of these historical and instrumental earthquake data shows that seismic activity was very high in the 16th-18th centuries and is more active in the Yellow Sea area than in the East Sea area. Given the similarly high seismic activity of northeastern China in the 16th-18th centuries, the seismic activity of the two regions appears to be closely related. The general trend of the epicenter distribution is SE-NW. On the Korean Peninsula, the first seismic station was installed at Incheon in 1905, and five additional stations were installed by 1943. There were no seismic stations from 1945 to 1962, but a World-Wide Standardized Seismograph station was installed in Seoul in 1963. In 1990, the Korea Meteorological Administration (KMA) established a centralized real-time seismic network consisting of 12 stations, and since then many institutes have expanded their own seismic networks on the Korean Peninsula. KMA now operates 35 velocity-type seismic stations and 75 accelerometers, while the Korea Institute of Geoscience and Mineral Resources operates 32 and 16 stations, respectively. The Korea Institute of Nuclear Safety and the Korea Electric Power Research Institute operate 4 and 13 stations, respectively, consisting of velocity-type seismometers and accelerometers. In and around the Korean Peninsula, 27 intraplate earthquake mechanisms since 1936 were analyzed to understand the regional stress orientation and tectonics. These are among the largest earthquakes of the past century in this region and may represent its characteristic seismicity. Their focal mechanisms show predominantly strike-slip faulting with a small thrust component, and the average P-axis is almost horizontal, oriented ENE-WSW. In northeastern China, strike-slip faulting is also dominant, with a nearly horizontal ENE-WSW average P-axis very similar to that of the Korean Peninsula. In the eastern part of the East Sea, on the other hand, thrust faulting is dominant and the average P-axis is horizontal, oriented ESE-WNW. This indicates that not only the Pacific Plate subducting in the east but also the indenting Indian Plate controls the earthquake mechanisms in the far east of the Eurasian Plate. A crustal velocity model is very important for determining the hypocenters of local earthquakes, but the crustal model in and around the Korean Peninsula remains unclear because sufficient seismic data have not been accumulated. To solve this problem, reflection and refraction seismic surveys and seismic wave analysis have been applied simultaneously to two long cross-sections traversing the southern Korean Peninsula since 2002. These surveys should be continued.