• Title/Summary/Keyword: Transaction Processing (트랜잭션 처리)

Search Results: 468

User Integrated Authentication System using EID in Blockchain Environment (블록체인 환경에서 EID를 이용한 사용자 통합 인증 시스템)

  • Kim, Jai-Yong;Jung, Yong-Hoon;Jun, Moon-Seog;Lee, Sang-Beon
    • Journal of the Korea Academia-Industrial cooperation Society / v.21 no.3 / pp.24-31 / 2020
  • Centralized systems in computing environments suffer from various problems, such as privacy infringement through hacking and the possibility of privacy violations in the event of system failure. Blockchain, one of the core technologies for the next generation of converged information, is expected to be an alternative to the existing centralized systems that have exhibited these problems. This paper proposes a blockchain-based user authentication system that can identify users in an online environment using an EID. Existing ID/password (PW) authentication methods require users to store personal information at multiple sites and to receive and manage a separate ID for each. With the proposed system, however, once an EID has been issued, users can authenticate at various sites without signing up for each of them. The proposed system issues an EID from a minimum of information, such as an e-mail address and a telephone number. A comparison of stability and efficiency against a centralized system showed the proposed integrated authentication system to be superior. To compare stability against existing systems, we selected attack methods and encroachments on the computing environment; to verify efficiency, the total throughput among the user's app, the issuing and certification authority's servers, and the service provider's servers was compared and analyzed in terms of processing time per transaction.
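
Below is a minimal sketch of the issue-once, authenticate-anywhere flow the abstract describes, assuming an Ed25519 key pair for the issuing authority (via the cryptography package); the token format and all names are illustrative, not the paper's implementation.

    # A minimal sketch of the issue-once / authenticate-anywhere EID flow
    # described above. All names are illustrative assumptions.
    import hashlib, json
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    issuer_key = Ed25519PrivateKey.generate()   # certification authority's key pair
    issuer_pub = issuer_key.public_key()        # published to all service providers

    def issue_eid(email: str, phone: str) -> dict:
        """Issue an EID from minimal personal information (e-mail and phone)."""
        eid = hashlib.sha256(f"{email}|{phone}".encode()).hexdigest()
        payload = json.dumps({"eid": eid}).encode()
        return {"payload": payload, "sig": issuer_key.sign(payload)}

    def service_provider_login(token: dict) -> bool:
        """Any site verifies the issuer's signature; no per-site signup or PW store."""
        try:
            issuer_pub.verify(token["sig"], token["payload"])
            return True
        except InvalidSignature:
            return False

    token = issue_eid("user@example.com", "010-1234-5678")
    assert service_provider_login(token)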

Improvement of Performance for Online Certificate Status Validation (실시간 인증서 상태검증의 성능개선)

  • Jung, Jai-Dong;Oh, Hae-Seok
    • The KIPS Transactions: Part C / v.10C no.4 / pp.433-440 / 2003
  • As real economic activities have moved into the cyber world and the problem of identifying a trading counterpart has emerged, digital signatures have come into wide use. Because the Certificate Revocation List (CRL), the original method of validating digital signatures, is weak at real-time validation, the Online Certificate Status Protocol (OCSP) was introduced. With OCSP, however, the entire workload of digital signature verification is concentrated on a single validation server node. This method is currently used for domestic financial transactions, but sooner or later its limitations will be revealed. This paper introduces a validation method that not only guarantees real-time validation but also lets the node requesting certificate validation maintain real-time certificate status information itself. In this method, the revocation management node pushes updated certificate status information to the validation nodes in real time as a certificate is revoked. Its characteristic feature is that the revocation management node must remember which validation nodes each certificate holder uses: when a certificate holder connects to a validation node for the first time, that node requests the certificate status information from the revocation management node, which records the validation node at that time. Thereafter, whenever a revocation request occurs, the revocation management node informs all registered validation nodes in real time. The benefits of this method are that validation time is reduced, because certificate validation can be completed at the validation node itself, and that requests for certificate status information are no longer concentrated on the revocation node.
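
A minimal sketch of this push-based scheme, with in-process objects standing in for network nodes; class and method names are illustrative only, not the paper's protocol messages.

    # The revocation manager remembers which validation nodes asked about a
    # certificate and pushes status changes to them, so later validations
    # complete locally at the validation node.
    from collections import defaultdict

    class ValidationNode:
        def __init__(self, name: str, manager: "RevocationManager"):
            self.name, self.manager = name, manager
            self.status = {}                       # local cache: cert_id -> "good"/"revoked"

        def validate(self, cert_id: str) -> bool:
            if cert_id not in self.status:         # first contact: pull once, get registered
                self.status[cert_id] = self.manager.query(cert_id, self)
            return self.status[cert_id] == "good"  # afterwards, answered locally

        def push_update(self, cert_id: str, status: str):
            self.status[cert_id] = status          # manager pushes revocations in real time

    class RevocationManager:
        def __init__(self):
            self.db = {}                           # authoritative certificate status
            self.subscribers = defaultdict(set)    # cert_id -> validation nodes to notify

        def query(self, cert_id: str, node: ValidationNode) -> str:
            self.subscribers[cert_id].add(node)    # remember who uses this certificate
            return self.db.get(cert_id, "good")

        def revoke(self, cert_id: str):
            self.db[cert_id] = "revoked"
            for node in self.subscribers[cert_id]: # broadcast to all registered nodes
                node.push_update(cert_id, "revoked")

    mgr = RevocationManager()
    node = ValidationNode("bank-A", mgr)
    assert node.validate("cert-42")                # completes locally after first pull
    mgr.revoke("cert-42")
    assert not node.validate("cert-42")            # revocation arrived by push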

A Study on DID-based Vehicle Component Data Collection Model for EV Life Cycle Assessment (전기차 전과정평가를 위한 DID 기반 차량부품 데이터수집 모델 연구)

  • Jun-Woo Kwon;Soojin Lee;Jane Kim;Seung-Hyun Seo
    • KIPS Transactions on Computer and Communication Systems / v.12 no.10 / pp.309-318 / 2023
  • Recently, countries have been moving to introduce LCA (Life Cycle Assessment) to regulate greenhouse gas emissions. An LCA is a means of measuring and evaluating the greenhouse gas emissions generated over the entire life cycle of a vehicle, and reliable data for each electric vehicle component are needed to increase the reliability of the LCA results. To this end, studies on life cycle assessment models using blockchain technology have been conducted. In existing models, however, key product information is exposed to the other participants, and every update to component data must be recorded on the blockchain ledger in the form of a transaction, which is inefficient. In this paper, we propose a DID (Decentralized Identity)-based data collection model for LCA that collects vehicle component data and verifies its validity effectively. The proposed model increases the reliability of the LCA by guaranteeing the validity and integrity of the collected data and by verifying the data's source. Because only user authentication information is shared on the blockchain ledger, the model prevents indiscriminate exposure of data while efficiently verifying and updating its source.
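
The split the abstract describes, component data off-chain and only authentication material on-chain, can be sketched roughly as follows, again with the cryptography package and a dictionary standing in for the ledger; the DID string and payload fields are hypothetical.

    # Component data stays off-chain and is signed by its producer, while only
    # the producer's DID and public key are anchored on the (mock) ledger.
    import json
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey, Ed25519PublicKey)
    from cryptography.exceptions import InvalidSignature

    ledger = {}                                    # mock blockchain: DID -> public key bytes

    def register_did(did: str, key: Ed25519PrivateKey):
        ledger[did] = key.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw)

    def sign_component_data(did: str, key: Ed25519PrivateKey, data: dict) -> dict:
        payload = json.dumps(data, sort_keys=True).encode()
        return {"did": did, "payload": payload, "sig": key.sign(payload)}

    def verify_for_lca(doc: dict) -> bool:
        """Check the data's source (DID on ledger) and integrity (signature)."""
        pub = Ed25519PublicKey.from_public_bytes(ledger[doc["did"]])
        try:
            pub.verify(doc["sig"], doc["payload"])  # data itself never touches the ledger
            return True
        except InvalidSignature:
            return False

    supplier = Ed25519PrivateKey.generate()
    register_did("did:example:battery-maker", supplier)
    doc = sign_component_data("did:example:battery-maker", supplier,
                              {"part": "battery-pack", "co2e_kg": 850})
    assert verify_for_lca(doc)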

Analysis and Evaluation of Frequent Pattern Mining Technique based on Landmark Window (랜드마크 윈도우 기반의 빈발 패턴 마이닝 기법의 분석 및 성능평가)

  • Pyun, Gwangbum;Yun, Unil
    • Journal of Internet Computing and Services / v.15 no.3 / pp.101-107 / 2014
  • With the development of online services, the prevailing form of database has changed from static database structures to dynamic stream database structures. Data mining techniques have traditionally served as decision-making tools in areas such as marketing strategy and DNA analysis, but areas of recent interest such as sensor networks, robotics, and artificial intelligence require the ability to analyze real-time data more quickly. Landmark window-based frequent pattern mining, one of the stream mining approaches, performs mining over parts of the database, or over individual transactions, instead of over all the data. In this paper, we analyze and evaluate two well-known landmark window-based frequent pattern mining algorithms, Lossy counting and hMiner. When Lossy counting mines frequent patterns from a set of new transactions, it performs union operations between the previous and current mining results. hMiner, a state-of-the-art algorithm based on the landmark window model, conducts a mining operation whenever a new transaction occurs. Because hMiner extracts frequent patterns as soon as a new transaction is entered, the latest mining results always reflect real-time information; for this reason, such algorithms are also called online mining approaches. We evaluate and compare the performance of the primitive algorithm, Lossy counting, and the latest one, hMiner. As criteria for the performance analysis, we first consider each algorithm's total runtime and average processing time per transaction. To compare the efficiency of their storage structures, we also evaluate maximum memory usage. Lastly, we show how stably the two algorithms mine databases in which the number of items gradually increases. In terms of mining time and transaction processing, hMiner is faster than Lossy counting: because hMiner stores candidate frequent patterns in a hash structure, it can access them directly, whereas Lossy counting stores them in a lattice and must traverse multiple nodes to reach a candidate pattern. On the other hand, hMiner performs worse than Lossy counting in terms of maximum memory usage: hMiner must keep the complete information for each candidate pattern in its hash buckets, while the lattice lets Lossy counting reduce that information, since items concurrently included in multiple patterns can be shared. However, hMiner shows better scalability, for the following reasons: as the number of items increases, fewer items are shared, which weakens Lossy counting's memory efficiency, and as the number of transactions grows, its pruning effect worsens. From the experimental results, we conclude that landmark window-based frequent pattern mining algorithms are suitable for real-time systems although they require a significant amount of memory; hence, their data structures need to be made more memory-efficient before they can also be used in resource-constrained environments such as WSNs (wireless sensor networks).
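
For reference, a minimal sketch of the classic Lossy counting bookkeeping for frequent single items (the algorithms compared above mine frequent patterns from transactions, but the bucket-and-prune idea is the same); epsilon is the allowed counting error.

    # Classic Lossy counting over a stream: count items in buckets of width
    # ceil(1/epsilon), pruning rare entries at each bucket boundary. Any item
    # with true frequency >= support*n is guaranteed to survive.
    import math

    def lossy_counting(stream, epsilon=0.01, support=0.05):
        width = math.ceil(1 / epsilon)             # bucket width
        counts, deltas = {}, {}                    # item -> count, item -> max undercount
        n = 0
        for item in stream:
            n += 1
            bucket = math.ceil(n / width)
            if item in counts:
                counts[item] += 1
            else:
                counts[item] = 1
                deltas[item] = bucket - 1          # occurrences possibly missed earlier
            if n % width == 0:                     # bucket boundary: prune rare items
                for it in [k for k in counts if counts[k] + deltas[k] <= bucket]:
                    del counts[it], deltas[it]
        # report items whose counted frequency clears (support - epsilon) * n
        return {it: c for it, c in counts.items() if c >= (support - epsilon) * n}

    freqs = lossy_counting(["a", "b", "a", "c", "a", "b"] * 100,
                           epsilon=0.1, support=0.4)
    print(freqs)                                   # 'a' and 'b' pass; 'c' is filtered out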

A Case Study on SK Telecom's Next Generation Marketing System Development (SK텔레콤의 차세대 마케팅 시스템 개발사례 연구)

  • Lee, Sang-Goo;Jang, Si-Young;Yang, Jung-Yeon
    • Journal of KIISE: Computing Practices and Letters / v.14 no.2 / pp.158-170 / 2008
  • In response to the changing demands of an ever more competitive market, SK Telecom has built a new marketing system that can support dynamic marketing campaigns and, at the same time, scale up to the large volumes of data and transactions of the next decade. The system, which employs a Unix-based client-server architecture (with Web browser interfaces), replaces the earlier mainframe-based COIS system. The project, named NGM (Next Generation Marketing), was unprecedentedly large in scale. However, both managerial and technical problems led the project into a crisis: the application framework, which depended on a software solution from a major global vendor, could not support the dynamic functionality required for the new system. In March 2005, SK Telecom declared the suspension of the NGM project. The second phase of the project started in May 2005 following a comprehensive replanning, in which it was decided that no single existing solution could cope with the complexity of the new system and that the system would therefore be custom-built. As such, a number of technical challenges emerged. In this paper, we report on three key dimensions of these technical challenges: middleware and application framework, database architecture and tuning, and system performance. The processes and approaches adopted in building the NGM system may be viewed as "best practices" in the telecom industry. The completed NGM system, now called the "U.key System," successfully came into operation on the ninth of October, 2006. This new infrastructure is expected to give birth to a series of innovative, fruitful, and customer-oriented applications in the near future.

Business Process Design to Apply ebXML Framework to the Port and Logistics Distribution Industry (ebXML 적용을 위한 항만물류산업 비즈니스 프로세스 설계)

  • Choi, Hyung-Rim;Park, Nam-Kyu;Lim, Ho-Seob;Lee, Hyun-Chul;Lee, Chang-Sup
    • Information Systems Review / v.4 no.2 / pp.209-222 / 2002
  • EDI (Electronic Data Interchange) has been widely used to support business activities because of its advantages, such as fast information transfer, less documentation work, and efficient information exchange. Recently, the e-business environment has pushed the traditional EDI system to move to the ebXML framework. To apply the ebXML framework to an industry, the Business Process (BP), Core Components (CC), Collaboration Protocol Profile (CPP), Collaboration Protocol Agreement (CPA), messaging system, and so on must be implemented. We selected the port and logistics industry as the target domain for applying the ebXML framework, since its EDI usage ratio is relatively higher than that of other industries. In this paper, we analyze the current status of the EDI system and transaction processes in the port and logistics industry. Using the UN/CEFACT modeling methodology, we define the business processes to be registered in the registry/repository, the main component of the ebXML framework, and represent the business collaborations, business transactions, business document flows, choreography, patterns, and so on in UML, so as to apply the ebXML framework to the port and logistics distribution industry. We also suggest a meta-methodology for applying the ebXML framework to other industries.

Study on Context-Aware SOA based on Open Service Gateway initiative platform (OSGi플렛폼 기반의 상황인식 서비스지향아키텍쳐에 관한 연구)

  • Choi, Sung-Wook;Oh, Am-Suk;Kwon, Oh-Hyun;Kang, Si-Hyeob;Hong, Soon-Goo;Choi, Hyun-Rim
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.11 / pp.2083-2090 / 2006
  • In the proposed context-aware SOA (Service-Oriented Architecture) based on the OSGi (Open Service Gateway initiative) platform, the service provider manages related kinds of services from various sensors in an integrated way, wraps each service in a SOAP (Simple Object Access Protocol) message, and registers them with the UDDI (Universal Description, Discovery and Integration) server serving as the service registry; the service requester then retrieves the desired kinds of services and calls them through the service provider. Most recent context-aware technologies for the ubiquitous home network put their emphasis mainly on RFID/USN and location-based technology, so research on service-oriented architecture has not been pursued enough. In an OSGi service platform environment, various context-aware services are dynamically mapped from various sensors, new services are offered at users' request, and existing services change; accordingly, data sharing between the provided services, management of the service life cycle, and easy service distribution are needed. Taking all these factors into consideration, this study suggests a context-aware SOA built with the Eclipse SOA Tools Platform on the OSGi platform, which achieved a transaction throughput of more than 546 TPS, derived from the distributional Little's Law in an ATAM (Architecture Tradeoff Analysis Method) evaluation, while remaining stable under the other conditions.
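
For reference, the throughput figure above is tied to Little's Law, which relates average concurrency L, throughput \lambda, and average response time W:

    L = \lambda W \quad\Longrightarrow\quad \lambda = \frac{L}{W}

Under a purely hypothetical average response time of W ≈ 0.1 s, sustaining \lambda = 546 TPS would imply roughly L ≈ 55 transactions in flight; the paper's actual measurement conditions are not reproduced here.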

A Study on the Application of Blockchain Technology to the Record Management Model (블록체인기술을 적용한 기록관리 모델 구축 방법 연구)

  • Hong, Deok-Yong
    • Journal of Korean Society of Archives and Records Management / v.19 no.3 / pp.223-245 / 2019
  • As the foundation of the Fourth Industrial Revolution, blockchain is becoming an essential core infrastructure and technology that creates new growth engines in various industries, and it is rapidly spreading into the environments of businesses and institutions worldwide. In this study, the characteristics and trends of blockchain technology were surveyed and organized, the need to apply it to the records management of public institutions was established, and the procedures and methods for building it into the records management field of public institutions were studied through the literature. Finally, blockchain technology was applied to records management to propose the Archivechain model and describe the expected effects. When the transactions that record the records management process of electronic documents are loaded onto the blockchain, all step information for the standard records management tasks, previously fragmented and unlinked, can be checked at once. If a blockchain function is built into the electronic records management system, the person who produces a document enters the metadata and information while acquiring and registering it, and all contents are stored and classified; this simplifies reporting on production status and provides real-time information through the original text information disclosure service. Archivechain is a model that applies cloud infrastructure as a Backend as a Service (BaaS) on a Hyperledger platform, on the assumption that the electronic document production system and the records management system are integrated. Making records management smart and electronic in this way brings scattered information together by placing the entire life cycle of public records on a blockchain.
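
A toy illustration of the hash-chained life cycle the abstract describes; this is not the paper's Hyperledger/BaaS design, and all field and step names are assumptions.

    # Each records-management step (production, registration, classification,
    # disposal, ...) is appended as a hash-chained transaction, so the whole
    # life cycle of a record can be checked at once and tampering is detectable.
    import hashlib, json, time

    class ArchiveChain:
        def __init__(self):
            self.blocks = [{"step": "genesis", "prev": "0" * 64}]

        def _digest(self, block: dict) -> str:
            return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

        def append_step(self, record_id: str, step: str, metadata: dict):
            self.blocks.append({
                "record_id": record_id, "step": step, "metadata": metadata,
                "time": time.time(), "prev": self._digest(self.blocks[-1]),
            })

        def lifecycle(self, record_id: str):
            """All step information for one record, gathered in a single pass."""
            return [b for b in self.blocks if b.get("record_id") == record_id]

        def verify(self) -> bool:
            return all(self.blocks[i]["prev"] == self._digest(self.blocks[i - 1])
                       for i in range(1, len(self.blocks)))

    chain = ArchiveChain()
    chain.append_step("doc-001", "produced", {"author": "records office"})
    chain.append_step("doc-001", "registered", {"class": "general"})
    assert chain.verify() and len(chain.lifecycle("doc-001")) == 2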