• Title/Summary/Keyword: identifier system


A Study on Derivation of Technical Elements for Joint Use of Copyright Rights Group Data by ISNI Korea Consortium (ISNI Korea 컨소시엄의 저작권 권리 단체 데이터 공동 활용을 위한 기술요소 도출 연구)

  • Park, Jin Ho;Kwak, Seung Jin;Lee, Seungmin;Oh, Sang Hee
    • Journal of the Korean BIBLIA Society for Library and Information Science, v.31 no.1, pp.379-392, 2020
  • The purpose of this study is to present technical elements for improving data utilization in the ISNI Korea Consortium, which is operated by the National Library of Korea, the agency responsible for registering Korean names. The ISNI Korea Consortium aims to register various creation-related information in addition to name information for bibliographic individuals and organizations. To this end, this study examines the ISNI Korea Consortium, a consultative body of data providers, focusing on the metadata status of copyright organizations and the linked data specification of ISNI. As a result, five technical elements for metadata management (data acquisition, refinement, storage, identifier management, and consortium metadata management) and eight technical elements for linked data (RDF data management/repository, RDF data issuance, RDF data retrieval, RDF data inquiry, RDF data download, ontology inquiry, standard term inquiry, and mapping information management) were derived.
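
As a minimal sketch of the linked-data elements listed above (RDF data issuance and RDF data inquiry), the Python snippet below publishes and queries a single name record with rdflib; the namespaces, the example ISNI value, and the properties are illustrative assumptions, not the consortium's actual schema.

```python
# Hedged sketch only: issuing and querying one RDF name record with rdflib.
# The ISNI value, namespaces, and properties are placeholders, not the ISNI Korea schema.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

ISNI = Namespace("http://isni.org/isni/")
SCHEMA = Namespace("http://schema.org/")

g = Graph()
person = ISNI["0000000000000000"]                      # placeholder ISNI
g.add((person, RDF.type, SCHEMA.Person))
g.add((person, SCHEMA.name, Literal("Hong Gildong", lang="ko")))

# "RDF data issuance": serialize the record (e.g., as Turtle) for publication.
print(g.serialize(format="turtle"))

# "RDF data inquiry": a SPARQL query over the same graph.
for row in g.query("SELECT ?name WHERE { ?s <http://schema.org/name> ?name }"):
    print(row.name)
```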

Adaptive Inter-Agent Communication Protocol for Large-Scale Mobile Agent Systems (대규모 이동 에이전트 시스템을 위한 적응적 에이전트간 통신 프로토콜)

  • Ahn Jin-Ho
    • The KIPS Transactions: Part A, v.13A no.4 s.101, pp.351-362, 2006
  • This paper proposes an adaptive inter-agent communication protocol that considerably reduces both the amount of agent location information maintained by each service node and the message delivery time, while avoiding dependency on the home node of a mobile agent. To this end, the protocol enables each mobile agent to autonomously leave its location information on only a few of the nodes it visits. It may also significantly reduce the agent cache update frequency of each service node by keeping the identifier of each agent's location manager in the node's smart agent location cache. Simulation results show that, compared with the traditional protocol, the proposed protocol reduces message delivery overhead by about 76-80% and the amount of agent location information each service node must maintain by about 76-79%.
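
As a rough illustration of the caching idea summarized above (not the paper's actual protocol), the Python sketch below has each service node cache only the identifier of an agent's location manager and forward messages through that manager, so cache entries stay usable across most agent migrations; all class and field names are invented for the example.

```python
# Toy sketch, under simplified assumptions: a service node caches the location
# manager's identifier per agent rather than the agent's current node.
class ServiceNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.location_cache = {}   # agent_id -> location_manager_node_id
        self.managed_agents = {}   # agent_id -> current_node_id (when acting as manager)

    def update_agent_location(self, agent_id, current_node_id):
        """Called on a location manager when one of its agents migrates."""
        self.managed_agents[agent_id] = current_node_id

    def deliver(self, agent_id, message, nodes):
        """Look up the cached manager, ask it for the agent's current node, then forward."""
        manager_id = self.location_cache.get(agent_id)
        if manager_id is None:
            return None                          # would fall back to a global lookup
        target = nodes[manager_id].managed_agents.get(agent_id)
        if target is not None:
            print(f"{self.node_id} -> {target}: {message!r} for agent {agent_id}")
        return target
```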

The study of the stereo X-ray system for automated X-ray inspection system using 3D-reconstruction shape information (3차원 형상복원 정보 기반의 검색 자동화를 위한 스테레오 X-선 검색장치에 관한 연구)

  • Hwang, Young-Gwan;Lee, Nam-Ho
    • Journal of the Korea Institute of Information and Communication Engineering, v.18 no.8, pp.2043-2050, 2014
  • Because most scanning systems developed to date provide radiation scan-plane images of the inspected objects, there has been a limitation in accurately judging the shape of objects inside a logistics container with only 2-D radiation image information. Since a radiation image contains only the density information of the scanned object, direct application of general stereo image processing techniques is inefficient, so we propose a new volume-based 3-D reconstruction algorithm. Experimental results show that the proposed volume-based reconstruction technique can provide more efficient visualization for X-ray inspection. To validate the proposed volume-based shape reconstruction algorithm, 15 samples were scanned with a stereo X-ray inspection system and their shapes were reconstructed. The reconstruction results show a high degree of accuracy, with deviations of 2.56% (width), 6.15% (height), and 7.12% (depth) from the values measured on the real objects. In addition, a detection efficiency of 97% is achieved using a K-means clustering algorithm. The shape information reconstructed with the volume-based algorithm provides the depth information of the inspected object from the stereo X-ray inspection; this depth information can be used as an identifier for automated search, and additional studies will proceed toward an X-ray inspection system that can greatly improve inspection efficiency.
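
As an illustrative sketch of the K-means detection step mentioned above (not the authors' implementation), the snippet below clusters per-pixel depth estimates from a stereo reconstruction to separate candidate objects from the background; the data and cluster count are assumptions made for the example.

```python
# Illustrative only: k-means over hypothetical stereo-derived depth values.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical per-pixel depth estimates (mm) from the volume-based reconstruction.
depth_map = rng.normal(loc=300.0, scale=40.0, size=(64, 64))

# Split the depth values into two clusters (background vs. candidate object).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(depth_map.reshape(-1, 1))
object_mask = labels.reshape(depth_map.shape).astype(bool)
print("pixels assigned to the second cluster:", int(object_mask.sum()))
```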

A Design Communication System for Message Protection in Next Generation Wireless Network Environment (차세대 무선 네트워크 환경에서 메시지 보호를 위한 통신 시스템 설계)

  • Min, So-Yeon;Jin, Byung-Wook
    • Journal of the Korea Academia-Industrial cooperation Society, v.16 no.7, pp.4884-4890, 2015
  • Today most people around the world possess one or two mobile devices, and the wireless network market is gradually expanding. Preference for Wi-Fi is increasing along with the growth in mobile device use. Many areas such as public agencies, health care, education and learning, content, manufacturing, and retail create new value based on Wi-Fi, and a global network is being built that provides complex services. However, the next-generation wireless network environment is exposed to attacks and vulnerabilities such as wireless device identifier vulnerabilities, illegal use of network resources through MAC forgery, wireless authentication key cracking, and unauthorized AP/device attacks. In addition, research on advanced security technologies, such as improved authentication and high-speed secure connections, has made little progress. Therefore, this paper designs a secure communication system for message protection in next-generation wireless network environments based on device identification together with content classification and storage protocols. The proposed protocol was analyzed for safety against the identified vulnerabilities and compared with existing cryptographic techniques used in wireless network environments. It runs about 0.72 times slower than the existing cipher system, WPA2-PSK, but provides greater security.
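
As a generic sketch of binding message protection to a device identifier (not the authors' protocol, whose details the abstract does not give), the snippet below derives a per-device key from a master secret and a device ID and attaches an HMAC tag to each message; all names and values are illustrative.

```python
# Hedged sketch: device-identifier-bound message authentication with the standard library.
import hashlib
import hmac
import os

def derive_device_key(master_secret: bytes, device_id: str) -> bytes:
    """Derive a per-device key from a master secret and a device identifier."""
    return hmac.new(master_secret, device_id.encode(), hashlib.sha256).digest()

def protect(message: bytes, device_key: bytes) -> bytes:
    """Append an HMAC-SHA256 tag to the message."""
    return message + hmac.new(device_key, message, hashlib.sha256).digest()

def verify(packet: bytes, device_key: bytes) -> bool:
    """Check the trailing 32-byte tag in constant time."""
    message, tag = packet[:-32], packet[-32:]
    return hmac.compare_digest(tag, hmac.new(device_key, message, hashlib.sha256).digest())

master = os.urandom(32)
key = derive_device_key(master, "AA:BB:CC:DD:EE:FF")   # placeholder device identifier
print(verify(protect(b"hello AP", key), key))          # True
```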

Video Scene Detection using Shot Clustering based on Visual Features (시각적 특징을 기반한 샷 클러스터링을 통한 비디오 씬 탐지 기법)

  • Shin, Dong-Wook;Kim, Tae-Hwan;Choi, Joong-Min
    • Journal of Intelligence and Information Systems, v.18 no.2, pp.47-60, 2012
  • Video data is unstructured and has a complex structure. As the importance of efficient management and retrieval of video data increases, studies on video parsing based on the visual features contained in video content are being conducted to reconstruct video data into a meaningful structure. Early studies on video parsing focused on splitting video data into shots, but detecting shot boundaries defined by physical boundaries does not consider the semantic association of video data. Recently, studies that use clustering methods to structure semantically associated video shots into video scenes defined by semantic boundaries have been actively pursued. Previous studies on video scene detection try to detect scenes by using clustering algorithms based on a similarity measure between video shots that depends mainly on color features. However, correctly identifying a video shot or scene and detecting gradual transitions such as dissolves, fades, and wipes is difficult because the color features of video data contain noise and change abruptly when an unexpected object intervenes. In this paper, to solve these problems, we propose the Scene Detector using Color histogram, corner Edge, and Object color histogram (SDCEO), which detects video scenes by clustering similar shots organizing the same event based on visual features including the color histogram, the corner edge, and the object color histogram. SDCEO is notable in that it uses the edge feature together with the color feature, and as a result it effectively detects gradual as well as abrupt transitions. SDCEO consists of the Shot Bound Identifier and the Video Scene Detector. The Shot Bound Identifier comprises a Color Histogram Analysis step and a Corner Edge Analysis step. In the Color Histogram Analysis step, SDCEO uses the color histogram feature to organize shot boundaries. The color histogram, which records the percentage of each quantized color among all pixels in a frame, is chosen for its good performance, as also reported in other work on content-based image and video analysis. To organize shot boundaries, SDCEO joins associated sequential frames into shots by measuring the similarity of the color histograms between frames. In the Corner Edge Analysis step, SDCEO identifies the final shot boundaries by using the corner edge feature: it detects associated shots by comparing the corner edge feature between the last frame of the previous shot and the first frame of the next shot. In the Key-frame Extraction step, SDCEO compares each frame with all other frames, measures their similarity using the histogram Euclidean distance, and selects the frame most similar to all frames in the same shot as the key-frame. The Video Scene Detector clusters associated shots organizing the same event by using hierarchical agglomerative clustering based on visual features including the color histogram and the object color histogram. After detecting video scenes, SDCEO organizes the final video scenes by repeated clustering until the similarity distance between shots is less than the threshold h. We constructed a prototype of SDCEO and carried out experiments with manually constructed baseline data; the experimental results, a shot boundary detection precision of 93.3% and a video scene detection precision of 83.3%, are satisfactory.
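
As a minimal sketch of the histogram-distance step in the Color Histogram Analysis described above, the snippet below marks a shot boundary wherever the Euclidean distance between the quantized color histograms of consecutive frames exceeds a threshold; the frame format, bin count, and threshold value are assumptions for the example, not SDCEO's actual parameters.

```python
# Minimal sketch: histogram-based shot boundary detection between consecutive frames.
# Frame format (H x W x 3, values 0-255), bin count, and threshold are assumptions.
import numpy as np

def color_histogram(frame: np.ndarray, bins: int = 8) -> np.ndarray:
    """Quantized RGB color histogram, normalized to sum to 1."""
    hist, _ = np.histogramdd(frame.reshape(-1, 3).astype(float),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def shot_boundaries(frames, threshold: float = 0.25):
    """Indices where the histogram Euclidean distance exceeds the threshold."""
    hists = [color_histogram(f) for f in frames]
    return [i for i in range(1, len(hists))
            if np.linalg.norm(hists[i] - hists[i - 1]) > threshold]
```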

A Study of Web Application Attack Detection extended ESM Agent (통합보안관리 에이전트를 확장한 웹 어플리케이션 공격 탐지 연구)

  • Kim, Sung-Rak
    • Journal of the Korea Society of Computer and Information, v.12 no.1 s.45, pp.161-168, 2007
  • Web attacks exploit structural, logical, and coding errors of the web application rather than vulnerabilities of the web server itself. The Open Web Application Security Project (OWASP) has published a list of about ten types of web application vulnerabilities that shows the causes of hacking, and the risk of hacking and the severity of the resulting damage are well known. The ability to detect and respond is important for dealing with web hacking. Filtering methods such as pattern matching and code modification are used for defense, but these methods cannot detect new types of attacks. Standalone security products such as an IDS or a web application firewall can be used, but they require considerable money and effort to operate and maintain and are likely to generate false positives. This research uses a profiling method that extracts the structure of the web application and the attributes of its input parameters, such as type and length; by building a structural database of the web application in advance, the lack of user input validation can be compensated for, and illegal requests can be verified and detected against the profiling identifiers in the database. An integrated security management (ESM) system is already used in most institutions, so this paper presents a model in which the security-monitoring log-gathering agent of the integrated security management system is combined with a web application attack detection function, allowing attacks against web applications to be detected even without deploying additional standalone security products.
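
As a rough sketch of the profiling check described above (the actual profile schema is not given in the abstract), the snippet below validates request parameters against a pre-built profile of expected types and lengths and flags anything outside it; the page, patterns, and limits are invented for the example.

```python
# Rough sketch: checking request parameters against a pre-built profile database.
# The page, parameter rules, and limits below are invented for illustration.
import re

PROFILE_DB = {
    "/board/view.php": {
        "id":    {"pattern": r"^\d+$",       "max_len": 10},
        "title": {"pattern": r"^[\w\s\-]*$", "max_len": 100},
    },
}

def is_suspicious(url: str, params: dict) -> bool:
    profile = PROFILE_DB.get(url)
    if profile is None:
        return True                                   # unknown page: flag it
    for name, value in params.items():
        rule = profile.get(name)
        if (rule is None
                or len(value) > rule["max_len"]
                or not re.fullmatch(rule["pattern"], value)):
            return True                               # unprofiled or malformed parameter
    return False

print(is_suspicious("/board/view.php", {"id": "1 OR 1=1"}))   # True
```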


Big Data Based Dynamic Flow Aggregation over 5G Network Slicing

  • Sun, Guolin;Mareri, Bruce;Liu, Guisong;Fang, Xiufen;Jiang, Wei
    • KSII Transactions on Internet and Information Systems (TIIS), v.11 no.10, pp.4717-4737, 2017
  • Today, smart grids, smart homes, smart water networks, and intelligent transportation are infrastructure systems that connect our world more than we ever thought possible, and they are associated with a single concept, the Internet of Things (IoT). The number of devices connected to the IoT, and hence the number of traffic flows, increases continuously, as do new applications. Although cutting-edge hardware technology can be employed for a fast implementation that handles these huge data streams, there will always be a limit on the size of traffic supported by a given architecture. Fortunately, recent cloud-based big data technologies offer an ideal environment for handling this issue. Moreover, the ever-increasing volume of traffic created on demand presents great challenges for flow management. As a solution, flow aggregation decreases the number of flows the network needs to process. Previous work in the literature shows that most aggregation strategies designed for smart grids aim at optimizing system operation performance: they aggregate traffic on each device using a common identifier, with each device having its own independent static aggregation policy. In this paper, we propose a dynamic approach that aggregates flows based on traffic characteristics and device preferences. Our algorithm runs on a big data platform to provide end-to-end network visibility of flows, performing high-speed, high-volume computations to identify clusters of similar flows and aggregate a massive number of mice flows into a few meta-flows. Compared with existing solutions, our approach dynamically aggregates a large number of such small flows into fewer flows based on traffic characteristics and access node preferences. Using this approach, we alleviate the problem of processing a large number of micro flows and significantly improve the accuracy of meeting access node QoS demands. We conducted experiments using a dataset of up to 100,000 flows and studied the performance of our algorithm analytically. The experimental results show the promising effectiveness and scalability of the proposed approach.
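
As an illustrative sketch of aggregating many small flows into a few meta-flows by clustering on traffic features (not the paper's exact algorithm or feature set), the snippet below groups hypothetical flows by packet rate and mean packet size with k-means; the features, data, and cluster count are assumptions.

```python
# Illustrative sketch: clustering mice flows into a handful of meta-flows.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical flows described by [packets_per_second, mean_packet_size_bytes].
flows = rng.uniform([1, 64], [500, 1500], size=(10_000, 2))

# Each cluster of similar flows becomes one meta-flow.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(flows)
meta_flows = {k: np.flatnonzero(labels == k) for k in range(8)}
print({k: len(v) for k, v in meta_flows.items()})
```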

A Study on Security Level-based Authentication for Supporting Multiple Objects in RFID Systems (다중 객체 지원을 위한 RFID 시스템에서 보안 레벨 기반의 인증 기법에 관한 연구)

  • Kim, Ji-Yeon;Jung, Jong-Jin;Jo, Geun-Sik;Lee, Kyoon-Ha
    • The Journal of Society for e-Business Studies, v.13 no.1, pp.21-32, 2008
  • RFID systems provide automatic object identification through wireless communication over invisible ranges and adapt to various circumstances. These advantages allow RFID systems to be applied in many fields of industry and daily life. However, as tags become more widespread, it is difficult to use them distinctively because a tag usually stores only one object identifier in common RFID applications. In addition, RFID systems often cause serious privacy violations through various attacks because of the weakness of radio frequency communication. Therefore, methods for sharing information among applications are necessary for the further development of RFID systems. In this paper, we propose an efficient RFID scheme. First, we design a new RFID tag structure that supports multiple object identifiers of different applications in a single tag and allows those applications to access them simultaneously. Second, we propose an authentication protocol to support the proposed tag structure. The proposed protocol is designed with robustness against various attacks in low-cost RFID systems in mind. In particular, it focuses on the efficiency of the authentication procedure by considering the security levels of applications: each application goes through one of several different authentication procedures according to its security level. Finally, we demonstrate the efficiency of the proposed scheme compared with other schemes through experiments and evaluation.
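
As a simplified sketch of the multi-object tag idea described above, the snippet below models a tag record that holds several application object identifiers, each with its own security level, and dispatches a different authentication routine by level; the levels and routines are illustrative, not the paper's protocol.

```python
# Simplified sketch: one tag holding per-application object identifiers and security levels.
from dataclasses import dataclass, field

@dataclass
class TagEntry:
    object_id: str
    security_level: int          # e.g. 0 = open read, 1 = keyed hash, 2 = challenge-response

@dataclass
class MultiObjectTag:
    tag_id: str
    entries: dict = field(default_factory=dict)   # application name -> TagEntry

    def add(self, app: str, object_id: str, level: int) -> None:
        self.entries[app] = TagEntry(object_id, level)

def authenticate(tag: MultiObjectTag, app: str) -> str:
    """Pick an authentication procedure according to the application's security level."""
    entry = tag.entries[app]
    if entry.security_level == 0:
        return "plain read"
    if entry.security_level == 1:
        return "hash-based check"
    return "challenge-response"

tag = MultiObjectTag("TAG-001")
tag.add("logistics", "urn:epc:id:sgtin:0614141.107346.2017", 1)   # placeholder identifier
print(authenticate(tag, "logistics"))
```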


A Study on the Comparative Analysis of the Description Rules of ISBD and KCR4 (ISBD 통합판과 KCR4 기술규칙 비교 연구)

  • Lee, Mihwa
    • Journal of the Korean BIBLIA Society for Library and Information Science, v.24 no.2, pp.185-203, 2013
  • This study suggests new rules for the revision of KCR4 by comparing the ISBD consolidated edition and KCR4. The method was to compare the rules element by element after mapping the description elements in each area of ISBD and KCR4. As a result: first, content forms and media types must be included for describing resource types. Second, rules are needed for the common title and the dependent title. Third, rules are needed for "parallel" elements such as the parallel title, parallel other title information, parallel statement of responsibility relating to title, parallel edition statement, parallel statement of responsibility relating to edition, parallel numbering system, parallel place of publication, production and distribution, and so on. Fourth, the rules about the material or type of resource specific area must be regulated in terms of the contents of the resource. Fifth, the home country principle must not be applied in describing the place of publication, production and distribution, for the sake of consistency. Sixth, the extent, other physical details, dimensions, and accompanying material statement should be regulated for all materials instead of describing materials separately according to material type. Seventh, the rule numbers of the notes must agree with the numbers of the main rules. Eighth, detailed rules about the resource identifier are needed. This study could contribute to the revision of KCR4.

AFTL: An Efficient Adaptive Flash Translation Layer using Hot Data Identifier for NAND Flash Memory (AFTL: Hot Data 검출기를 이용한 적응형 플래시 전환 계층)

  • Yun, Hyun-Sik;Joo, Young-Do;Lee, Dong-Ho
    • Journal of KIISE: Computer Systems and Theory, v.35 no.1, pp.18-29, 2008
  • NAND flash memory has become a popular storage device in recent years because of its low power consumption, fast access speed, shock resistance, and light weight. However, it has distinct characteristics such as an erase-before-write architecture, asymmetric read/write/erase speeds, and a limit on the number of erasures per block. Due to these limitations, various Flash Translation Layers (FTLs) have been proposed to use NAND flash memory effectively. Systems that adopt a conventional FTL may suffer severe performance degradation from hot data, that is, data that are frequently overwritten at the same logical address. In this paper, we propose a novel FTL algorithm called the Adaptive Flash Translation Layer (AFTL), which uses sector mapping for hot data and log-based block mapping for cold data. Our system removes redundant write operations and erase operations by separating hot data from cold data. Moreover, read performance is enhanced because sector translation tends to require only a few read operations. A series of experiments was organized to inspect the performance of the proposed method, and the results are very impressive.
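
As a toy sketch of the hot/cold split described above (not the paper's exact data structures), the snippet below counts writes per logical sector as a simple hot-data identifier, maps hot sectors per sector, and appends cold writes to a per-block log; the threshold and block size are placeholder assumptions.

```python
# Toy sketch: write-count-based hot-data identification with two mapping paths.
from collections import Counter, defaultdict

HOT_THRESHOLD = 4          # placeholder: writes after which a sector is treated as hot
SECTORS_PER_BLOCK = 64     # placeholder block geometry

class AFTLSketch:
    def __init__(self):
        self.write_counts = Counter()          # hot-data identifier (per logical sector)
        self.sector_map = {}                   # hot path: logical sector -> physical sector
        self.block_log = defaultdict(list)     # cold path: logical block -> appended writes
        self.next_physical = 0

    def write(self, logical_sector: int, data: bytes) -> None:
        self.write_counts[logical_sector] += 1
        if self.write_counts[logical_sector] >= HOT_THRESHOLD:
            # Hot data: fine-grained sector mapping, so the old copy is simply remapped.
            self.sector_map[logical_sector] = self.next_physical
        else:
            # Cold data: appended to the log block of its logical block.
            block = logical_sector // SECTORS_PER_BLOCK
            self.block_log[block].append((logical_sector, self.next_physical))
        self.next_physical += 1
```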