• Title/Summary/Keyword: Enterprise Cloud

Enhancement of Enterprise Security System Using Zero Trust Security (제로 트러스트 보안을 활용한 기업보안시스템 강화 방안)

  • Lee, Seon-a;Kim, Beom Seok;Lee, Hye in;Park, Won hyung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2021.10a / pp.214-216 / 2021
  • This paper proposes a Zero-Trust-based approach to overcoming the limitations of existing corporate security systems. With the advent of the Fourth Industrial Revolution, the security paradigm is changing, and as remote work expands with cloud computing and COVID-19, security issues arising from the changed IT environment have emerged. At the same time, with attack techniques becoming more intelligent and advanced, companies should further strengthen their current security systems by adopting zero trust security. Zero-trust security improves security by monitoring all data communications under the principle of trusting nothing by default, and by enforcing strict authentication and least-privilege access for every requestor. Accordingly, the paper introduces a zero trust security solution that strengthens the existing security system and presents the direction and rationale for its adoption by companies; a minimal access-decision sketch follows.
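As a rough illustration of the per-request verification described above, the sketch below applies a default-deny policy with strict authentication and least-privilege checks. The names (AccessRequest, evaluate_request) and policy data are illustrative assumptions, not the paper's solution.

```python
from dataclasses import dataclass

# Hypothetical policy data for illustration; a real zero-trust deployment would
# pull identity, device posture, and entitlements from live systems.
KNOWN_DEVICES = {"laptop-042"}
ROLE_PERMISSIONS = {"analyst": {"read"}, "admin": {"read", "write"}}

@dataclass
class AccessRequest:
    user: str
    role: str
    device_id: str
    mfa_passed: bool
    resource: str
    action: str

def evaluate_request(req: AccessRequest) -> bool:
    """Deny by default; grant only when every check passes (never trust, always verify)."""
    if not req.mfa_passed:                  # strict authentication on every request
        return False
    if req.device_id not in KNOWN_DEVICES:  # only enrolled devices are trusted
        return False
    allowed = ROLE_PERMISSIONS.get(req.role, set())
    return req.action in allowed            # least privilege: only explicitly granted actions

# An analyst may read but not write, even from a trusted, authenticated session.
print(evaluate_request(AccessRequest("alice", "analyst", "laptop-042", True, "crm-db", "read")))   # True
print(evaluate_request(AccessRequest("alice", "analyst", "laptop-042", True, "crm-db", "write")))  # False
```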

Strengthening Enterprise Security through the Adoption of Zero Trust Architecture - A Focus on Micro-segmentation Approach - (제로 트러스트 아키텍처 도입을 통한 기업 보안 강화 방안 - 마이크로 세그먼테이션 접근법 중심으로 -)

  • Seung-Hyun Joo;Jin-Min Kim;Dae-Hyun Kwon;Yong-Tae Shin
    • Convergence Security Journal / v.23 no.3 / pp.3-11 / 2023
  • Zero Trust, characterized by the principle of "Never Trust, Always Verify," represents a novel security paradigm. The proliferation of remote work and the widespread use of cloud services have led to Work From Anywhere (WFA) environments, where access to corporate systems is possible from any location. In such environments, the boundaries between internal and external networks have become increasingly ambiguous, rendering traditional perimeter security models inadequate against complex and diverse cyber threats and attacks. This paper introduces the implementation principles of Zero Trust and focuses on the micro-segmentation approach, highlighting its relevance in mitigating the limitations of perimeter security. By leveraging the risk management framework provided by the National Institute of Standards and Technology (NIST), the paper proposes a comprehensive procedure for the adoption of Zero Trust, with the aim of empowering organizations to enhance their security strategies. A minimal segmentation-policy sketch appears below.
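To make the micro-segmentation idea concrete, here is a minimal default-deny sketch in which east-west traffic is permitted only between explicitly whitelisted segments. The hosts, segments, and allow-list are hypothetical; real enforcement would sit in host firewalls, hypervisors, or an SDN controller rather than application code.

```python
# Hypothetical workload-to-segment map and flow allow-list for illustration only.
SEGMENT_OF = {
    "web-01": "dmz",
    "app-01": "app",
    "db-01":  "data",
}
# Only explicitly allowed (source segment, destination segment, destination port) flows pass.
ALLOWED_FLOWS = {
    ("dmz", "app", 8443),
    ("app", "data", 5432),
}

def is_flow_allowed(src_host: str, dst_host: str, dst_port: int) -> bool:
    """Default-deny east-west traffic; permit only whitelisted segment-to-segment flows."""
    src_seg, dst_seg = SEGMENT_OF.get(src_host), SEGMENT_OF.get(dst_host)
    if src_seg is None or dst_seg is None:
        return False                                  # unknown workloads are never trusted
    if src_seg == dst_seg:
        return True                                   # traffic stays inside its own segment
    return (src_seg, dst_seg, dst_port) in ALLOWED_FLOWS

print(is_flow_allowed("web-01", "app-01", 8443))  # True: whitelisted tier-to-tier flow
print(is_flow_allowed("web-01", "db-01", 5432))   # False: web tier may not reach the data tier directly
```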

Study on Automation of Comprehensive IT Asset Management (포괄적 IT 자산관리의 자동화에 관한 연구)

  • Wonseop Hwang;Daihwan Min;Junghwan Kim;Hanjin Lee
    • Journal of Information Technology Services / v.23 no.1 / pp.1-10 / 2024
  • The IT environment is changing as digital transformation accelerates in enterprises and organizations. This expansion of the digital space makes centralized cybersecurity controls more difficult, and cyberattacks such as ransomware and digital supply chain attacks are increasing in frequency, severity, and sophistication. Even in large organizations with numerous security personnel and systems, security incidents continue to occur because of unmanaged and unknown threats and vulnerabilities in IT assets. It is time to move beyond the current focus on detecting and responding to security threats toward managing the full range of cyber risks. This requires an asset inventory for comprehensive management, built by collecting and integrating all IT assets across the enterprise. IT Asset Management (ITAM) systems exist to identify and manage assets from a financial and administrative perspective, but the asset information they hold is incomplete and duplicated, and updates from sources such as network infrastructure, Active Directory, virtualization management, and cloud platforms are insufficient. In this study, the authors propose a new framework for automated Comprehensive IT Asset Management (CITAM) required for security operations, designing a process that automatically collects asset data, such as hostname, IP, MAC address, serial number, OS, installed software, and last-seen time, already distributed and stored in operating IT security systems. The CITAM framework classifies these records into unique device units through aggregation, normalization, deduplication, validation, and integration; a minimal sketch of such a deduplication step follows.
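The following sketch illustrates the normalization and deduplication steps listed above, merging records from several sources into unique device units keyed by MAC address. The record schema and sample data are assumptions for illustration, not the paper's CITAM implementation.

```python
# Hypothetical asset records as they might arrive from AD, an EDR, and a cloud
# inventory; field names are illustrative, not the paper's schema.
records = [
    {"hostname": "SRV-01", "mac": "AA:BB:CC:00:11:22", "ip": "10.0.0.5", "source": "active_directory", "last_seen": "2024-01-10"},
    {"hostname": "srv-01", "mac": "aa-bb-cc-00-11-22", "ip": "10.0.0.5", "source": "edr",              "last_seen": "2024-01-12"},
    {"hostname": "db-02",  "mac": "AA:BB:CC:33:44:55", "ip": "10.0.0.9", "source": "cloud_platform",   "last_seen": "2024-01-11"},
]

def normalize(rec: dict) -> dict:
    """Normalization: canonical case and separators so records from different sources can be compared."""
    return {
        **rec,
        "hostname": rec["hostname"].lower(),
        "mac": rec["mac"].replace("-", ":").upper(),
    }

def deduplicate(recs: list[dict]) -> dict[str, dict]:
    """Deduplication/integration: merge records sharing a MAC into one device unit,
    keeping the most recent last_seen value and accumulating the contributing sources."""
    devices: dict[str, dict] = {}
    for rec in map(normalize, recs):
        key = rec["mac"]                                   # MAC as a simple unique-device key
        if key not in devices:
            devices[key] = {**rec, "sources": {rec["source"]}}
        else:
            dev = devices[key]
            dev["sources"].add(rec["source"])
            if rec["last_seen"] > dev["last_seen"]:        # ISO dates compare lexicographically
                dev.update({k: rec[k] for k in ("hostname", "ip", "last_seen")})
    return devices

for mac, dev in deduplicate(records).items():
    print(mac, dev["hostname"], dev["last_seen"], sorted(dev["sources"]))
```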

A New Design and Implementation of Digital Evidence Container for Triage and Effective Investigation (디지털 증거 선별 조사의 효율성을 위한 Digital Evidence Container 설계 및 구현)

  • Lim, Kyung-Soo;Lee, Chang-Hoon;Lee, Sang-In
    • Journal of the Institute of Electronics Engineers of Korea CI / v.49 no.4 / pp.31-41 / 2012
  • Law enforcement agencies worldwide confiscate or retain computer systems involved in criminal or civil cases at the preliminary investigation stage, even when the case is not a cyber-crime; they collect digital evidence from suspects' systems and use it in the core investigative procedure. Collecting, duplicating, and analyzing disk images, however, takes considerable time in general crime cases, especially cases requiring a rapid response such as kidnapping and murder. In enterprise forensics, moreover, it is impractical to acquire and duplicate entire hard disk drives in mass-storage servers, database servers, and cloud environments. It is therefore more efficient and effective, in a triage investigation, to selectively collect only traces of user activity on the operating system or particular files of interest. On the other hand, when acquiring essential digital evidence from a target computer, collecting bare files alone is not forensically sound; a standard digital evidence container is needed to combine evidence from various sources and to prove its integrity and probative value. This article describes a new digital evidence container, called Xebeg, which selectively preserves collected digital evidence using general technologies such as XML and PKZIP compression, and which satisfies generality, integrity, unification, scalability, and security requirements. A minimal packaging sketch follows.
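A minimal sketch of the container idea, assuming only standard-library XML and ZIP (PKZIP-compatible) facilities: selected files are stored in a compressed archive together with an XML manifest of SHA-256 hashes so integrity can later be verified. Xebeg's actual schema and layout are not reproduced here.

```python
import hashlib
import zipfile
import xml.etree.ElementTree as ET
from pathlib import Path

def package_evidence(files: list[str], container: str = "evidence_container.zip") -> None:
    """Pack selected files into a ZIP archive alongside an XML manifest recording
    per-file SHA-256 hashes, so the integrity of each item can be checked later."""
    manifest = ET.Element("evidence_container")
    with zipfile.ZipFile(container, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in files:
            data = Path(path).read_bytes()
            item = ET.SubElement(manifest, "item")
            item.set("name", Path(path).name)
            item.set("sha256", hashlib.sha256(data).hexdigest())
            zf.writestr(Path(path).name, data)            # store the selected file itself
        zf.writestr("manifest.xml", ET.tostring(manifest, encoding="unicode"))

# Example: selectively package two hypothetical artifacts of user activity.
# package_evidence(["browser_history.db", "recent_docs.lnk"])
```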

A Study of 3D Modeling of Compressed Urban LiDAR Data Using VRML (VRML을 이용한 도심지역 LiDAR 압축자료의 3차원 표현)

  • Jang, Young-Woon;Choi, Yun-Woong;Cho, Gi-Sung
    • Journal of Korean Society for Geospatial Information Science / v.19 no.2 / pp.3-8 / 2011
  • Demand has recently been growing among enterprises that provide service maps and portal-site services built on 3D virtual city models for public users. As 3D information is delivered over the web or to mobile devices, data accuracy, transfer rate, and timely updates become increasingly important. With current technology, various 3D data can be viewed through the web, and VRML has been adopted actively because it can present a virtual world with full interaction in the browser and requires only a simple plug-in installed at no extra cost. LiDAR systems can obtain spatial data easily and accurately, as supported by numerous studies and applications, but LiDAR data is generally acquired as an irregular point cloud, so using the data without conversion demands heavy processing to render the 3D point data and to cope with its growing volume. This study represents urban LiDAR data in 3D from 2D raster data to which a compression algorithm was applied, in order to reduce storage space and processing load. For the 3D representation, an algorithm was developed that converts the compressed LiDAR data into VRML code, and the urban area was finally expressed in 3D with the ground and features rendered separately; a minimal VRML-generation sketch follows.
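As a rough illustration of the conversion step, the sketch below turns a small 2D height raster into a VRML 2.0 ElevationGrid node. The raster values, grid spacing, and file name are assumptions; the paper's compression scheme and ground/feature separation are not reproduced.

```python
# Hypothetical 2D raster of heights decoded from compressed LiDAR data;
# values and spacing are illustrative only.
raster = [
    [12.1, 12.3, 12.2, 12.4],
    [12.2, 15.8, 16.1, 12.5],
    [12.3, 15.9, 16.0, 12.6],
    [12.4, 12.5, 12.6, 12.7],
]

def raster_to_vrml(grid: list[list[float]], spacing: float = 1.0) -> str:
    """Emit a VRML 2.0 ElevationGrid node so a browser plug-in can render the raster as a surface."""
    rows, cols = len(grid), len(grid[0])
    heights = " ".join(f"{h:.2f}" for row in grid for h in row)
    return (
        "#VRML V2.0 utf8\n"
        "Shape {\n"
        "  geometry ElevationGrid {\n"
        f"    xDimension {cols}\n"
        f"    zDimension {rows}\n"
        f"    xSpacing {spacing}\n"
        f"    zSpacing {spacing}\n"
        f"    height [ {heights} ]\n"
        "  }\n"
        "}\n"
    )

# Writing the node to a .wrl file makes it viewable with a standard VRML plug-in.
with open("urban_ground.wrl", "w") as f:
    f.write(raster_to_vrml(raster))
```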

The Development of 1G-PON Reach Extender based on Wavelength Division Multiplexing for Reduction of Optical Core (국사 광역화와 광코어 절감을 위한 파장분할다중 기반의 1기가급 수동 광가입자망 Reach Extender 효율 극대화 기술 개발)

  • Lee, Kyu-Man;Kwon, Taek-Won
    • Journal of Digital Convergence / v.17 no.8 / pp.229-235 / 2019
  • As demand for broadband multimedia, including the Internet, increases explosively, advancing the subscriber network has become the biggest issue in the telecommunication industry owing to the surge in data traffic from new services such as smartphones, IPTV, VoIP, VOD, and cloud services. In this paper, a WDM (Wavelength Division Multiplexing) PON (passive optical network) based on a 1-Gigabit Reach Extender (RE) technique is developed to reduce the number of optical cores. In particular, to strengthen market competitiveness, low cost, miniaturization, integration, and low power consumption of the optical parts were considered. In addition, an integrated system was developed that combines all of these techniques for reliability and remote management, extending the transmission distance and increasing the capacity of the optical line by applying RE technology to the existing PON network. Through interworking with existing commercial 1G PON devices, the developed system supports consolidating central offices over a wider area while reducing the number of optical cores. Based on these results, the development of 10G PON technology is being studied.

An Efficient Dual Queue Strategy for Improving Storage System Response Times (저장시스템의 응답 시간 개선을 위한 효율적인 이중 큐 전략)

  • Hyun-Seob Lee
    • Journal of Internet of Things and Convergence / v.10 no.3 / pp.19-24 / 2024
  • Recent advances in large-scale data processing technologies such as big data, cloud computing, and artificial intelligence have increased the demand for high-performance storage devices in data centers and enterprise environments. In particular, the data response speed of storage devices is a key factor in overall system performance. Solid-state drives (SSDs) based on the Non-Volatile Memory Express (NVMe) interface are gaining traction, but new bottlenecks emerge when large data input and output requests from multiple hosts are handled simultaneously. SSDs typically process host requests by stacking them sequentially in an internal queue; when requests with long transfer lengths are processed first, shorter requests wait longer, increasing the average response time. Data transfer timeouts and data partitioning have been proposed to mitigate this, but they do not provide a fundamental solution. This paper proposes a dual queue based scheduling scheme (DQBS), which manages the data transfer order with one queue kept in request order and the other ordered by transfer length; request time and transfer length are then considered together to determine an efficient transmission order. This enables balanced processing of long and short requests and reduces the overall average response time. Simulation results show that the proposed method outperforms the existing sequential processing method. The study thus presents a scheduling technique that maximizes data transfer efficiency in a high-performance SSD environment and is expected to contribute to next-generation high-performance storage systems; a minimal scheduling sketch follows.
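A minimal sketch of a dual-queue policy in the spirit of DQBS, assuming a simple weighted score over waiting time and transfer length; the weighting factor, service-time model, and sample data are illustrative assumptions rather than the paper's algorithm.

```python
from collections import deque

# Hypothetical requests: (id, arrival_time, transfer_length); values and units are illustrative.
incoming = [
    ("r1", 0.0, 512), ("r2", 0.5, 8), ("r3", 1.0, 16), ("r4", 1.5, 1024), ("r5", 2.0, 4),
]

def dual_queue_schedule(reqs, alpha=0.5):
    """Dual-queue sketch: one queue keeps arrival (request) order, the other keeps
    transfer-length order; at each dispatch the two heads are compared with a score
    that weighs waiting time against transfer length (alpha is an assumed tuning knob)."""
    arrival_q = deque(sorted(reqs, key=lambda r: r[1]))   # request-order queue
    length_q = deque(sorted(reqs, key=lambda r: r[2]))    # transfer-length queue
    dispatched, order, clock = set(), [], 0.0

    def pending_head(q):
        while q and q[0][0] in dispatched:
            q.popleft()                                   # drop entries already served via the other queue
        return q[0] if q else None

    while True:
        head_a, head_l = pending_head(arrival_q), pending_head(length_q)
        if head_a is None:
            break
        # Lower score wins: long waits pull a request forward, long transfers push it back.
        def score(r):
            return alpha * (r[2] / 1024.0) - (1 - alpha) * max(clock - r[1], 0.0)
        nxt = min((head_a, head_l), key=score)
        dispatched.add(nxt[0])
        order.append(nxt[0])
        clock += nxt[2] / 256.0                           # toy service-time model
    return order

print(dual_queue_schedule(incoming))  # short requests overtake the long r1 and r4 transfers
```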