• Title/Summary/Keyword: Real Security


Analysis of Latency and Computation Cost for AES-based Whitebox Cryptography Technique (AES 기반 화이트박스 암호 기법의 지연 시간과 연산량 분석)

  • Lee, Jin-min; Kim, So-yeon; Lee, Il-Gu
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2022.05a / pp.115-117 / 2022
  • Whitebox cryptography protects encryption keys by embedding the key material into a software-based implementation of the encryption algorithm. It is attracting attention as a replacement for conventional hardware-based security techniques because it makes it difficult to infer confidential data and keys even when an attacker accesses memory through unauthorized reverse engineering. However, the encryption and decryption process relies on large lookup tables to hide intermediate results and encryption keys, which slows encryption and increases memory usage. In particular, whitebox cryptography is difficult to apply to low-cost, low-power, lightweight Internet of Things devices because of their limited memory and battery capacity. In network environments that require real-time service, the encryption/decryption speed of whitebox ciphers also increases response delay and degrades communication efficiency. Therefore, in this paper, we analyze experimentally whether the AES-based whitebox cipher (WBC-AES) proposed by S. Chow can satisfy practical speed and memory requirements.

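The lookup-table construction described in this abstract can be illustrated with a deliberately simplified sketch: folding the key-addition step into a precomputed substitution table, so the key never appears explicitly at run time. This toy 4-bit example is only an illustration of the idea, not Chow's actual WBC-AES construction; the S-box and key value are hypothetical.

```python
# Toy sketch (NOT the full Chow construction): fold a key nibble into an
# S-box lookup so the key never appears explicitly at run time.
TOY_SBOX = [0x6, 0xB, 0x5, 0x4, 0x2, 0xE, 0x7, 0xA,
            0x9, 0xD, 0xF, 0xC, 0x3, 0x1, 0x0, 0x8]  # invertible 4-bit S-box

def make_tbox(key_nibble):
    # Precompute T[x] = S(x XOR k); the table hides k the way WBC tables do.
    return [TOY_SBOX[x ^ key_nibble] for x in range(16)]

key = 0xA
tbox = make_tbox(key)

# Evaluating with the table needs no key material at run time:
assert all(tbox[x] == TOY_SBOX[x ^ key] for x in range(16))

# The memory cost the abstract describes comes from scaling this idea up:
# real WBC-AES replaces 16-byte round keys with hundreds of KB of tables.
print(len(tbox))  # 16 entries for this 4-bit toy; 256 per byte in real AES
```

The trade-off analyzed in the paper follows directly from this design: every key-dependent operation becomes a table lookup, which trades key secrecy for memory footprint and lookup latency.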

Data Central Network Technology Trend Analysis using SDN/NFV/Edge-Computing (SDN, NFV, Edge-Computing을 이용한 데이터 중심 네트워크 기술 동향 분석)

  • Kim, Ki-Hyeon; Choi, Mi-Jung
    • KNOM Review / v.22 no.3 / pp.1-12 / 2019
  • Recently, research using big data and AI has emerged as a major issue in the ICT field, but the volume of big data used for research is growing exponentially. With existing network-based transfer methods, sending and receiving big data can take longer than physically copying and shipping a hard disk. Researchers therefore require dynamic, flexible network technology that can transmit data at high speed and accommodate various network structures. SDN/NFV technologies allow a network to be programmed to meet users' needs, and can readily address the network's flexibility and security problems. In addition, centralized data processing for AI cannot guarantee real-time performance, and network delay occurs as traffic increases. To solve this problem, edge-computing technology, which moves away from the centralized approach, should be used. In this paper, we investigate the concepts and research trends of SDN, NFV, and edge-computing technologies, and analyze trends in data-centric network technologies that combine these three technologies.

3-Factor Authentication Using HMAC-based One-Time Password (HMAC 기반의 일회용 패스워드를 이용한 3-Factor 인증)

  • Kim, Ji-Hong; Oh, Sei-Woong
    • Journal of the Korea Society of Computer and Information / v.14 no.6 / pp.27-32 / 2009
  • As computer communication technology develops rapidly, most information services are now provided over networks, and the value of information on the network is increasing. At the same time, attacks that quietly intercept this information have developed alongside communication technology, so most financial institutions now use a one-time password (OTP), generated by a token as a new number each time a user authenticates to a server, rather than a static password. The 2-factor OTP generation method using an OTP token is the one most widely used by financial institutions. However, this method is vulnerable to real-world attacks in which the OTP token is stolen or lost. In this paper, we propose a 3-factor OTP scheme using HMAC to overcome these problems and analyze the security of the proposed scheme.
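The HMAC-based one-time password primitive that this paper builds on is standardized as HOTP in RFC 4226, and can be sketched as follows (the 3-factor scheme itself is the paper's contribution and is not reproduced here):

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password per RFC 4226."""
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector: secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # 755224
```

Because the server and token share the secret and counter, each authentication produces a fresh code; the security question the paper addresses is what happens when the token itself (one of the factors) is compromised.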

A Study on Improvement of the Calculation Methodology of Employee Invention Compensation (직무발명 보상액 산정 방법론의 개선 방안 연구)

  • Cho, Myunggeun; Lee, Hwansoo
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.12 / pp.101-110 / 2017
  • According to Statistics Korea in 2016, 56.9% of companies do not pay fair compensation for employee inventions, despite the increasing proportion of such inventions in corporations. One reason is that no objective method for calculating the contributions of the employee and the patent, and no clear standard of fair compensation, have been established. This study therefore proposes a new calculation method using DCF (discounted cash flow) and AHP (analytic hierarchy process) methodologies to determine a fair amount of employee invention compensation, and verifies it through real case examples. As a result, the calculated compensation was 2.3 times higher than under the previous approach. This study is meaningful in that it provides objective compensation criteria that better protect inventors in a situation where clear criteria for calculating fair compensation have not been established. The methodology is also expected to be applicable to employee invention compensation in SMEs.
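The DCF half of the proposed calculation can be sketched as a present-value sum over projected invention-related cash flows, scaled by an AHP-derived contribution weight. The cash flows, discount rate, and weight below are hypothetical illustration values, not figures from the paper:

```python
# Hedged sketch of the DCF side of the proposed method; all numbers here
# (cash flows, 10% discount rate, 0.15 inventor weight) are hypothetical.

def dcf(cash_flows, rate):
    # Present value of projected invention-related cash flows, year by year.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

projected = [100_000, 120_000, 90_000]   # yearly net revenue from the patent
pv = dcf(projected, rate=0.10)

# AHP would yield objective weights for the inventor's contribution;
# assume a 0.15 inventor share purely for illustration.
compensation = pv * 0.15
print(round(pv, 2), round(compensation, 2))
```

In the paper's framework, the AHP step replaces the ad-hoc contribution ratios of earlier practice with weights derived from pairwise comparisons, which is what drives the higher (2.3x) computed compensation.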

A Study on the Risk Perception differences by Age on Augmented Reality Game (증강현실 게임에서 연령에 따른 위험 인식 차이 연구)

  • Choi, Jieun; Kang, Juyoung; Lee, Hwansoo
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.3 / pp.401-410 / 2017
  • Augmented reality games differ from traditional games in that they involve physical risks arising in the real world. Although accidents related to Pokemon Go are increasing, academic discussion of these issues is rare. This study therefore explores the types of risk posed by augmented reality games and examines how perception of these risks affects intention to play. It also provides research implications through a comparative analysis between a youth group, which generally has a higher tendency toward game addiction, and an older group. According to the results, teenagers had lower usage intention as perceived financial risk increased, while for players over twenty, time risk was negatively related to usage intention. Physical risk had no significant effect on usage intention for teenagers, but a positive relationship was observed for those over twenty. These results imply the need for appropriate regulation to ensure safe game play.

Study on Remote Face Recognition System Using by Multi Thread on Distributed Processing Server (분산처리서버에서의 멀티 쓰레드 방식을 적용한 원격얼굴인식 시스템)

  • Kim, Eui-Sun; Ko, Il-Ju
    • The Journal of Korean Institute of Next Generation Computing / v.13 no.5 / pp.19-28 / 2017
  • With the spread of IP security cameras, various methods have been implemented to reduce the server load of performing face recognition remotely. In this paper, video from IP surveillance cameras at remote sites is fed through a DSP board equipped with a face detection function, which performs the face detection step. The detected facial region image is then transmitted to the server, where face recognition is carried out through distributed processing. As a result, the overall server system load and processing time are significantly reduced, and real-time face recognition can be performed with up to 256 linked cameras. This is achieved by performing 64-channel face recognition per server using distributed processing server technology, so that face search results from 256 camera channels can be processed when four distributed processing servers are in operation.
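The dispatching pattern described in this abstract, many camera channels fanned out over a fixed pool of recognition workers, can be sketched with a thread pool. The worker body below is a placeholder stand-in, not the paper's recognizer; the 4-server/64-channel split mirrors the figures in the abstract:

```python
# Hedged sketch of the multi-threaded distribution idea: frames from many
# camera channels are dispatched to a pool of recognition workers.
from concurrent.futures import ThreadPoolExecutor

NUM_SERVERS = 4
CHANNELS_PER_SERVER = 64          # 4 x 64 = 256 camera channels in total

def recognize(channel_id):
    # Placeholder: DSP-detected face crop -> recognition on one server.
    # Returns which server handles this channel (simple round-robin).
    return channel_id % NUM_SERVERS

with ThreadPoolExecutor(max_workers=NUM_SERVERS) as pool:
    assignments = list(pool.map(recognize, range(NUM_SERVERS * CHANNELS_PER_SERVER)))

# Each server ends up with an equal share of the 256 channels.
print(len(assignments), assignments.count(0))  # 256 64
```

The key design point the paper exploits is that detection runs on the camera-side DSP, so only small face crops (not full frames) cross the network to the worker pool.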

A Novel Two-Stage Training Method for Unbiased Scene Graph Generation via Distribution Alignment

  • Dongdong Jia; Meili Zhou; Wei WEI; Dong Wang; Zongwen Bai
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.12 / pp.3383-3397 / 2023
  • Scene graphs serve as semantic abstractions of images and play a crucial role in enhancing visual comprehension and reasoning. However, the performance of scene graph generation (SGG) is often compromised by biased data in real-world situations. Many existing systems use a single learning stage for both feature extraction and classification, while some employ class-balancing strategies such as re-weighting, data resampling, and head-to-tail transfer learning. In this paper, we propose a novel approach that decouples the feature extraction and classification phases of the scene graph generation process. For feature extraction, we leverage a transformer-based architecture and design an adaptive calibration function specifically for predicate classification, which dynamically adjusts the classification score for each predicate category. Additionally, we introduce a Distribution Alignment technique that balances the class distribution once the feature extraction phase reaches a stable state, thereby facilitating retraining of the classification head. Importantly, our Distribution Alignment strategy is model-independent and requires no additional supervision, making it applicable to a wide range of SGG models. Using the scene graph diagnostic toolkit on Visual Genome with several popular models, our model achieves significant improvements over previous state-of-the-art methods. Compared to the TDE model, it improves mR@100 by 70.5% for the PredCls task, 84.0% for SGCls, and 97.6% for SGDet.
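One common way to realize the kind of score calibration this abstract describes is a post-hoc adjustment of classification scores by the log of each class's frequency, so rare (tail) predicates are no longer drowned out by head classes. The sketch below shows that general long-tail recipe; the paper's exact adaptive calibration function may differ, and the counts are hypothetical:

```python
import math

# Hedged sketch: simple post-hoc score calibration for a long-tailed
# predicate distribution, in the spirit of Distribution Alignment.

def align_scores(logits, class_counts, tau=1.0):
    total = sum(class_counts)
    priors = [c / total for c in class_counts]
    # Down-weight head classes by subtracting tau * log(prior):
    # frequent classes lose more score than rare ones.
    return [z - tau * math.log(p) for z, p in zip(logits, priors)]

logits = [2.0, 2.0, 2.0]            # equal raw scores from the classifier
counts = [9000, 900, 100]           # head, body, tail predicate frequencies
adjusted = align_scores(logits, counts)

# After alignment, the rare (tail) predicate scores highest.
print(max(range(3), key=lambda i: adjusted[i]))  # 2
```

Adjustments of this form are model-independent and need no extra supervision, which matches the property the paper claims for its Distribution Alignment strategy.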

Designing Bigdata Platform for Multi-Source Maritime Information

  • Junsang Kim
    • Journal of the Korea Society of Computer and Information / v.29 no.1 / pp.111-119 / 2024
  • In this paper, we propose a big data platform that can collect information from various sources at sea. Currently operating ocean-related big data platforms focus on storing and sharing already-created data, leaving each data provider responsible for collection and preprocessing. Collecting and integrating data in a marine environment over communication networks that are poor compared to those on land is costly and inefficient, which makes the related infrastructure difficult to implement. In particular, fields that require real-time data collection and analysis, such as weather information and radar and sensor data, must consider a number of issues beyond the communication network itself, including data security, the characteristics of organizations and ships, and data collection costs. This paper first defines these problems and presents solutions. To design a big data platform that reflects them, we propose a data source, a hierarchical MEC, and a data flow structure, and then present an overall platform architecture that integrates them.

Enhanced ACGAN based on Progressive Step Training and Weight Transfer

  • Jinmo Byeon; Inshil Doh; Dana Yang
    • Journal of the Korea Society of Computer and Information / v.29 no.3 / pp.11-20 / 2024
  • Among generative models in Artificial Intelligence (AI), the Generative Adversarial Network (GAN) in particular has been successful in applications such as image processing, density estimation, and style transfer. While GAN models including Conditional GAN (CGAN), CycleGAN, and BigGAN have been extended and improved, researchers still face challenges in real-world applications in specific domains such as disaster simulation, healthcare, and urban planning, owing to data scarcity and unstable learning that causes image distortion. This paper proposes a new progressive learning methodology called Progressive Step Training (PST), based on the Auxiliary Classifier GAN (ACGAN), which discriminates class labels, and leveraging the progressive learning approach of the Progressive Growing of GAN (PGGAN). Compared to conventional methods, the PST model achieves 70.82% faster stabilization, a 51.3% lower standard deviation, stable convergence of loss values in the later high-resolution stages, and 94.6% faster loss reduction.

Development of Personalized Heart Disease Health Status Monitoring Web Service (개인별 맞춤형 심장질환 건강상태 모니터링 웹 서비스 개발)

  • Young-bok Cho
    • Journal of Practical Engineering Education / v.16 no.4 / pp.491-497 / 2024
  • Over the past five years, the proportion of arrhythmia patients among teenagers and people in their twenties has been increasing. Heart disease has consistently remained the second leading cause of death in Korea, and as case numbers grow, electrocardiogram (ECG) testing for arrhythmia has become important. However, specialized ECG medical devices are economically burdensome, and because of their large size and difficulty of operation they are kept in hospitals rather than owned individually, so testing requires a hospital visit. It is therefore essential to enable individuals to perform ECG self-examinations using an affordable, easy-to-use Arduino-based ECG sensor, so that arrhythmia can be identified through personal ECG measurement. In this study, data is measured using an ECG sensor (AD8232), and changes in bio-signals are presented visually through real-time monitoring, allowing users to make intuitive decisions while understanding their test results. To safeguard sensitive personal information, we developed a web service that provides individual heart disease and customized health guides, protecting personal data by addressing web vulnerabilities through session-based user authentication and SSL.