• Title/Summary/Keyword: Deletion Method (삭제방법)

637 search results

Deriving Priorities of Competences Required for Digital Forensic Experts using AHP (AHP 방법을 활용한 디지털포렌식 전문가 역량의 우선순위 도출)

  • Yun, Haejung;Lee, Seung Yong;Lee, Choong C.
    • The Journal of Society for e-Business Studies
    • /
    • v.22 no.1
    • /
    • pp.107-122
    • /
    • 2017
  • Nowadays, digital forensic experts are not only computer experts who restore and find deleted files, but also general experts who possess various capabilities, including knowledge of processes and laws, communication skills, and ethics. However, there have been few studies on the qualifications or competencies required of digital forensic experts relative to their importance. Therefore, in this study, AHP questionnaires were distributed to digital forensic experts and analyzed to derive priorities of competencies; the first tier consisted of knowledge, technology, and attitude, and the second tier comprised 20 items. Research findings showed that the most important competency was knowledge, followed by technology and attitude, but no significant difference was found among them. Among the 20 second-tier competencies, the most important was "digital forensics equipment/tool program utilization skill," followed by "data extraction and imaging skill from storage devices." Attitude-related items such as "judgment," "morality," "communication skill," and "concentration" followed. The least critical one was "substantial law related to actual cases." Previous studies on training and education for digital forensics experts focused on law, IT knowledge, and usage of analytic tools, while attitude-related competencies have not been given proper attention. We hope this study can provide helpful implications for designing curricula and qualifying exams to foster digital forensic experts.
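
A minimal sketch of how AHP priorities can be derived from a pairwise comparison matrix, assuming a hypothetical 3×3 matrix for the first-tier competencies (knowledge, technology, attitude); the comparison values and the geometric-mean approximation are illustrative, not the survey data or exact procedure of the paper.

```python
# Sketch of AHP priority derivation (geometric-mean approximation), with made-up data.
import numpy as np

criteria = ["knowledge", "technology", "attitude"]
# A[i][j] = how much more important criterion i is than criterion j (Saaty 1-9 scale)
A = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])

# Geometric-mean method: row geometric means, normalized to sum to 1
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency check via the principal eigenvalue (CR < 0.1 is usually acceptable)
lambda_max = np.max(np.linalg.eigvals(A).real)
n = A.shape[0]
CI = (lambda_max - n) / (n - 1)
RI = 0.58  # random index for n = 3
CR = CI / RI

for c, w in zip(criteria, weights):
    print(f"{c}: {w:.3f}")
print(f"consistency ratio: {CR:.3f}")
```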

Development of User Music Recognition System For Online Music Management Service (온라인 음악 관리 서비스를 위한 사용자 음원 인식 시스템 개발)

  • Sung, Bo-Kyung;Ko, Il-Ju
    • Journal of the Korea Society of Computer and Information
    • /
    • v.15 no.11
    • /
    • pp.91-99
    • /
    • 2010
  • Recently, recognizing user resources for personalized service has become necessary in digital content service fields. In particular, online music services need to recognize the user's music files in order to analyze user taste, recommend music, and serve music-related information. Such services are usually offered by recognizing user music from tag information, but recognition errors grow because tag information can be changed or removed. Content-based user music recognition techniques, which use the music signal itself, have been researched to solve these problems. In this paper, we propose user music recognition over the Internet using features extracted from the music signal. Features are extracted after suitable preprocessing for the structure of content-based user music recognition, and recognition proceeds on a music server that stores features of the same form. In this way, user music can be recognized independently of tag data. To evaluate the proposed method, 600 songs were collected and each was converted into five different qualities, and the resulting 3,000 experimental files were used for recognition experiments against a music server containing 300,000 songs. The average recognition rate was 85%. The proposed content-based music recognition overcomes the weak points of tag-based recognition, and its performance shows that it can be applied to online music services in practice.
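
A rough sketch of the content-based matching idea, assuming each track is reduced to a fixed-length spectral feature vector and matched by nearest neighbour; the feature and the search below are illustrative stand-ins, not the paper's actual preprocessing or server-side pipeline.

```python
# Sketch: coarse spectral fingerprint + nearest-neighbour lookup against a server database.
import numpy as np

def extract_feature(signal: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Average log-magnitude spectrum pooled into coarse frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    feature = np.log1p(np.array([b.mean() for b in bands]))
    return feature / (np.linalg.norm(feature) + 1e-9)   # normalize against volume/quality changes

def match(query: np.ndarray, server_db: dict[str, np.ndarray]) -> str:
    """Return the track id whose stored feature is closest to the query feature."""
    return min(server_db, key=lambda tid: np.linalg.norm(server_db[tid] - query))

# Usage: server_db maps track ids to precomputed features; the user's (possibly
# re-encoded) file is matched by signal content rather than by tag metadata.
```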

A Study on Selection Process of Web Services Based on the Multi-Attributes Decision Making (다중 속성 의사결정에 의한 웹 서비스 선정 프로세스에 관한 연구)

  • Seo Young-Jun;Song Young-Jae
    • The KIPS Transactions:PartD
    • /
    • v.13D no.4 s.107
    • /
    • pp.603-612
    • /
    • 2006
  • Recently the web service area has been growing rapidly as the next-generation IT paradigm because of increased interest in SOA (Service-Oriented Architecture) and growth of the B2B market. Since service discovery through UDDI (Universal Description, Discovery and Integration) is limited to functional requirements, it does not consider factors that affect how often a service is used or how reliable the relationship between the parties is. That is, quality, the non-functional aspect of a web service, is an important factor for success between consumer and provider, so a web service selection method that considers quality is necessary. This paper suggests an agent-based quality-broker architecture and a selection process that helps find, from the service consumer's point of view, the service providing the optimum quality the consumer needs. Agent theory is widely accepted and well suited to the proposed system architecture in a distributed and heterogeneous environment such as web services. We considered QoS and CoS in the evaluation process to address the limitations of existing research on web service selection, and as the evaluation method we used PROMETHEE (Preference Ranking Organization METHod for Enrichment Evaluations), which is the most suitable among MCDM approaches for web service selection. PROMETHEE has the advantage that pairwise comparisons do not need to be performed again when candidate services are added or deleted. This paper presents a case study with a service composition scenario in order to verify the selection process. In this case study, the decision-making problem was described on the basis of quality values evaluated from a consumer's point of view and the defined service level.
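
A minimal PROMETHEE II sketch for ranking candidate web services, assuming the simple "usual" (step) preference function and hypothetical QoS criteria, weights, and scores; none of the values come from the paper's case study.

```python
# Sketch of PROMETHEE II net-flow ranking over illustrative QoS data.
import numpy as np

services = ["S1", "S2", "S3"]
criteria = ["response_time", "availability", "cost"]   # lower is better for time and cost
maximize = np.array([False, True, False])
weights  = np.array([0.4, 0.4, 0.2])

# scores[i][j] = value of service i on criterion j (hypothetical)
scores = np.array([
    [120.0, 0.99, 10.0],
    [ 90.0, 0.95, 14.0],
    [150.0, 0.98,  8.0],
])

def preference(a: int, b: int) -> float:
    """Weighted preference of service a over service b (usual criterion: 0/1 per criterion)."""
    diff = np.where(maximize, scores[a] - scores[b], scores[b] - scores[a])
    return float(np.sum(weights * (diff > 0)))

n = len(services)
phi_plus  = np.array([sum(preference(a, b) for b in range(n) if b != a) / (n - 1) for a in range(n)])
phi_minus = np.array([sum(preference(b, a) for b in range(n) if b != a) / (n - 1) for a in range(n)])
net_flow  = phi_plus - phi_minus

for s, phi in sorted(zip(services, net_flow), key=lambda x: -x[1]):
    print(f"{s}: net flow {phi:+.3f}")
```

Because the flows are recomputed from per-criterion preference degrees rather than from a fixed pairwise-comparison matrix, adding or deleting a candidate service only requires rerunning this computation, which is the advantage the abstract cites.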

Effective Index and Backup Techniques for HLR System in Mobile Networks (이동통신 HLR 시스템에서의 효과적인 색인 및 백업 기법)

  • 김장환;이충세
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.9 no.1
    • /
    • pp.33-46
    • /
    • 2003
  • A Home Location Register (HLR) database system manages each subscriber's location information, which continuously changes in a cellular network. For this purpose, the HLR database system provides table management, index management, and backup management facilities. In this thesis, we propose a two-level index method for the mobile directory number (MDN) and a chained bucket hashing method for the electronic serial number (ESN); both the MDN and the ESN are used as keys in the HLR database system. We also propose an efficient backup method that takes into account the characteristics of HLR database transactions. The retrieval speed and the memory usage of the two-level index method are better than those of the R-tree index method, and the insertion and deletion overhead of the chained bucket hashing method is less than that of the modified linear hashing method. In the proposed backup method, we use two kinds of dirty flags in order to solve the performance degradation problem caused by frequent location-registration operations. For a million subscribers, the proposed techniques reduce memory size (by more than 62%), directory operations (2500,000 times), and backup operations (by more than 80%) compared with current techniques.
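
A minimal sketch of a chained-bucket hash table keyed by the ESN, assuming fixed-size buckets that chain into overflow buckets when full; the bucket size and record layout are illustrative assumptions, not the HLR system's actual structures.

```python
# Sketch of chained bucket hashing for ESN lookups.
BUCKET_SIZE = 4

class Bucket:
    def __init__(self):
        self.entries = []      # list of (esn, record) pairs, at most BUCKET_SIZE
        self.overflow = None   # chained overflow bucket

class ChainedBucketHash:
    def __init__(self, n_buckets: int = 1024):
        self.buckets = [Bucket() for _ in range(n_buckets)]

    def _home(self, esn: int) -> Bucket:
        return self.buckets[hash(esn) % len(self.buckets)]

    def insert(self, esn: int, record: dict) -> None:
        b = self._home(esn)
        while len(b.entries) >= BUCKET_SIZE:        # walk the chain until a bucket has room
            if b.overflow is None:
                b.overflow = Bucket()
            b = b.overflow
        b.entries.append((esn, record))

    def lookup(self, esn: int):
        b = self._home(esn)
        while b is not None:
            for key, record in b.entries:
                if key == esn:
                    return record
            b = b.overflow
        return None

    def delete(self, esn: int) -> None:
        b = self._home(esn)
        while b is not None:
            b.entries = [(k, r) for k, r in b.entries if k != esn]
            b = b.overflow
```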

Performance Enhancement Architecture for HLR System Based on Distributed Mobile Embedded System (분산 모바일 임베디드 시스템 기반의 새로운 위치정보 관리 시스템)

  • Kim Jang Hwan
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.29 no.12B
    • /
    • pp.1022-1036
    • /
    • 2004
  • In a mobile cellular network, the ever-changing location of a mobile host necessitates continuous tracking of its current position and efficient management of location information. A database called the Home Location Register (HLR) plays a major role in location management in this distributed environment, providing table management, index management, and backup management facilities. The objectives of this paper are to identify the problems of the current HLR system through rigorous analysis, to suggest solutions to them, and to propose a new architecture for the HLR system. In the HLR system, a main-memory database system is used to provide real-time access to and updates of subscribers' information. Thus, it is suggested that improvements be made to support better real-time facilities, to manage subscribers' information more reliably, and to accommodate more subscribers. In this paper, I propose an efficient backup method that takes into account the characteristics of HLR database transactions. The retrieval speed and the memory usage of the two-level index method are better than those of the T-tree index method, and the insertion and deletion overhead of the chained bucket hashing method is less than that of the modified linear hashing method. In the proposed backup method, I use two kinds of dirty flags in order to solve the performance degradation problem caused by frequent location-registration operations. Performance analysis has been performed to evaluate the proposed techniques based on a system with subscribers. The results show that, in comparison with the current techniques, the memory requirement is reduced by more than 62% and backup operations by more than 80%, with directory operations reduced as well.
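
A minimal sketch of a backup pass driven by two kinds of dirty flags, assuming (as an illustration, not the paper's exact design) that one flag marks the frequently updated location fields and the other marks the rest of the subscriber record, so that cheap location log entries are written far more often than full record copies.

```python
# Sketch: separate dirty flags for location updates vs. profile updates.
class SubscriberRecord:
    def __init__(self, mdn: str):
        self.mdn = mdn
        self.location = None
        self.profile = {}
        self.location_dirty = False   # set by frequent location-registration operations
        self.profile_dirty = False    # set by (rarer) subscriber-profile updates

    def register_location(self, cell_id: str) -> None:
        self.location = cell_id
        self.location_dirty = True

    def update_profile(self, key: str, value) -> None:
        self.profile[key] = value
        self.profile_dirty = True

def backup_pass(records, location_log: list, profile_store: dict) -> None:
    """Write only what changed: lightweight log entries for locations, full copies for profiles."""
    for rec in records:
        if rec.location_dirty:
            location_log.append((rec.mdn, rec.location))
            rec.location_dirty = False
        if rec.profile_dirty:
            profile_store[rec.mdn] = dict(rec.profile)
            rec.profile_dirty = False
```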

A study on the marginal fit of CAD/CAM 3-unit bridges (CAD/CAM 3-unit bridges의 변연 적합도에 관한 연구)

  • Lee, Ki-Hong;Yeo, In-Sung;Kim, Sung-Hun;Han, Jung-Suk;Lee, Jai-Bong;Yang, Jae-Ho
    • The Journal of Korean Academy of Prosthodontics
    • /
    • v.49 no.2
    • /
    • pp.101-105
    • /
    • 2011
  • The purpose of this study was to assess, in vitro, the marginal fit of three-unit bridges produced using the LAVA CAD/CAM (computer-aided design/computer-aided manufacturing) system and conventional PFG. Materials and methods: #11 and #13 resin teeth were prepared on a dentiform and then duplicated. Twenty resin models were fabricated, ten for PFG 3-unit bridges and ten for LAVA 3-unit bridges. Each bridge was cemented on its resin model. Marginal discrepancy was measured with a stereoscopic microscope (Nikon DS-Fi 1, Nikon, Japan) at a magnification of ×75. An independent t-test was used for the statistical analysis. Results: The mean marginal discrepancies and standard deviations of the PFG bridges were 97.1 ± 18.7 μm for incisors and 76.6 ± 21.8 μm for canines; those of the LAVA bridges were 90.4 ± 26.7 μm for incisors and 110.2 ± 30.2 μm for canines. The mean marginal discrepancy between PFG and LAVA for incisors did not show a significant difference (P>.05), but for canines the mean marginal discrepancy of PFG bridges was smaller than that of LAVA bridges (P<.05). Conclusion: The LAVA CAD/CAM 3-unit bridges and the PFG 3-unit bridges showed clinically acceptable marginal discrepancy.
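
A minimal sketch of the independent t-test comparison described in the methods, using made-up marginal-discrepancy values in micrometers rather than the study's measurements.

```python
# Sketch: independent two-sample t-test on hypothetical canine marginal-discrepancy data.
from scipy import stats

pfg_canine  = [76.6, 80.1, 74.3, 70.9, 79.5]     # illustrative values, micrometers
lava_canine = [110.2, 105.7, 114.9, 108.3, 112.0]

t_stat, p_value = stats.ttest_ind(pfg_canine, lava_canine)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")     # p < .05 indicates a significant difference
```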

ORTHODONTIC TREATMENT THROUGH EXTRACTION OF UPPER AND LOWER LATERAL TEETH (상하악 측절치 발거를 통한 전치부 총생의 치료)

  • Park, Sang-Hyun;Lee, Kwang-Hee;Kim, Dae-Eop;Lee, Jong-Seon
    • Journal of the korean academy of Pediatric Dentistry
    • /
    • v.28 no.4
    • /
    • pp.547-552
    • /
    • 2001
  • Extracting mandibular incisors for orthodontic treatment may adversely affect the occlusion. However, when properly used, extraction of mandibular incisors is an option for the correction of malocclusion. Generally, treatment for crowding requires a choice between non-extraction and four-premolar extraction. Approaches for crowded mandibular incisors include distal movement of posterior teeth, lateral movement of canines, labial movement of incisors, interproximal enamel reduction, removal of premolars, removal of one or two incisors, and various combinations of the above. Extraction of incisors is indicated in cases of crowding, anterior tooth-size discrepancy, absence of maxillary lateral incisors, and ectopic eruption; however, severe overjet, severe overbite, and spacing are contraindications. A patient presented with severe crowding of the upper anterior teeth, an impacted upper left lateral incisor, palatal ectopic eruption of the upper right incisor, and severe crowding of the lower anterior teeth. The lower lateral incisors were extracted for space availability and facial esthetics. We report this case of orthodontic treatment of upper and lower anterior crowding through extraction of the lateral incisors.


Accelerating GPU-based Volume Ray-casting Using Brick Vertex (브릭 정점을 이용한 GPU 기반 볼륨 광선투사법 가속화)

  • Chae, Su-Pyeong;Shin, Byeong-Seok
    • Journal of the Korea Computer Graphics Society
    • /
    • v.17 no.3
    • /
    • pp.1-7
    • /
    • 2011
  • Recently, various methods have been proposed to accelerate GPU-based volume ray-casting. However, they can cause several problems, such as a data-transmission bottleneck between CPU and GPU, additional video memory for hierarchical structures, and increased processing time whenever the opacity transfer function changes. In this paper, we propose an efficient GPU-based empty-space skipping technique to solve these problems. We store the maximum density of each brick of the volume dataset on a vertex element. We then delete the vertices regarded as transparent under the opacity transfer function in the geometry shader, and the remaining vertices are used to generate bounding boxes of non-transparent areas that help rays traverse efficiently. Although these vertices are independent of the viewing conditions, they need to be regenerated when the opacity transfer function changes. Our technique provides fast generation of opaque vertices for interactive processing, since the generation stage of the opaque vertices runs in the GPU pipeline. The rendering results of our algorithm are identical to those of general GPU ray-casting, but the performance can be more than 10 times faster.
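
A CPU-side sketch of the brick-based empty-space skipping idea, assuming per-brick maximum densities are precomputed and bricks whose maximum maps to zero opacity are culled; the geometry-shader stage of the paper is only emulated here in Python, and the brick size is an assumption.

```python
# Sketch: per-brick maximum density + transfer-function culling for empty-space skipping.
import numpy as np

BRICK = 8   # brick edge length in voxels (illustrative)

def brick_max_density(volume: np.ndarray) -> np.ndarray:
    """Maximum density per brick for a volume whose sides are multiples of BRICK."""
    z, y, x = volume.shape
    bricks = volume.reshape(z // BRICK, BRICK, y // BRICK, BRICK, x // BRICK, BRICK)
    return bricks.max(axis=(1, 3, 5))

def opaque_bricks(brick_max: np.ndarray, opacity_tf) -> list[tuple[int, int, int]]:
    """Keep only bricks that are non-transparent under the (vectorized) opacity transfer function."""
    keep = []
    for idx in np.argwhere(opacity_tf(brick_max) > 0.0):
        keep.append(tuple(int(i) for i in idx))   # brick coordinates -> bounding boxes for the rays
    return keep

# When the opacity transfer function changes, only this cheap culling step has to be
# re-run over the stored per-brick maxima; the volume data itself is untouched.
```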

Study on Remote Data Acquisition Methods Using OAuth Protocol of Android Operating System (안드로이드 환경의 OAuth 프로토콜을 이용한 원격지 데이터 수집 방법 연구)

  • Nam, Gi-hoon;Gong, Seong-hyeon;Seok, Byoung-jin;Lee, Changhoon
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.28 no.1
    • /
    • pp.111-122
    • /
    • 2018
  • Using the OAuth protocol, third-party applications on the Android operating system use the user's credentials or access tokens, which grant access to the user's resources, to obtain the user's account and personal information from account information providers. These credentials and tokens are stored on the device by the OAuth data management mechanism provided by the Android operating system. If this information is leaked, an attacker can use the leaked credential and token data to obtain the user's personal data without logging in. From an evidence-collection standpoint, the same feature enables a digital forensic investigator to collect data directly from the remote servers of the services used by the subject of an investigation. Evidence collected from a remote location can serve as a basis for secondary warrants and can be critically important when an attacker attempts to destroy evidence, for example by removing an application from an Android device. In this paper, we analyze how OAuth tokens are managed in various Android operating system and device environments and show how to collect data of various third-party applications using them. This paper thereby introduces a method of expanding the scope of data acquisition, from the viewpoint of digital forensics, by collecting remote data of the services used by the subject of an investigation.
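
A minimal sketch of remote acquisition with a recovered access token, assuming a generic REST endpoint that accepts a bearer token; the URL and endpoint below are placeholders, not any specific provider's API.

```python
# Sketch: querying a remote service with an access token recovered from the device.
import requests

def fetch_remote_data(api_url: str, access_token: str) -> dict:
    """Call the service's API with the bearer token extracted from the Android device."""
    response = requests.get(
        api_url,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

# Usage (hypothetical endpoint and token):
# data = fetch_remote_data("https://api.example.com/v1/me/files", recovered_token)
```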

DNA Watermarking Method based on Random Codon Circular Code (랜덤 코돈 원형 부호 기반의 DNA 워터마킹)

  • Lee, Suk-Hwan;Kwon, Seong-Geun;Kwon, Ki-Ryong
    • Journal of Korea Multimedia Society
    • /
    • v.16 no.3
    • /
    • pp.318-329
    • /
    • 2013
  • This paper proposes a DNA watermarking method for privacy protection and the prevention of illegal copying. The proposed method allocates codons to random circular angles using a random mapping table and selects triplet codons as embedding targets with the help of the Lipschitz regularity value of the local modulus maxima of the codon circular angles. The watermark is then embedded into the circular angles of the triplet codons without changing the amino acid codes of the DNA. The length and location of the target triplet codons depend on the random mapping table for the 64 codons, which includes the start and stop codons. This table is used as the watermark key and can be applied to any codon sequence regardless of its length. If the table is unknown, it is very difficult to detect the length and location of the target codons in order to extract the watermark. We evaluated our method against Heider's DNA-crypt watermarking under conditions of similar capacity. The evaluation results verified that our method has a lower base-changing rate than DNA-crypt and a lower bit error rate under point mutations and insertions/deletions. Furthermore, we verified that the entropy of the random mapping table and of the location of the triplet codons is high, meaning that the watermark security is at a high level.
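
A deliberately simplified sketch of keyed codon-level embedding, assuming a random mapping of codons to circular angles as the watermark key and bit embedding by choosing among synonymous codons; it illustrates the general idea only, omits the paper's Lipschitz-regularity target selection, and uses an intentionally partial codon table.

```python
# Simplified sketch: keyed codon-angle table + synonymous-codon substitution for watermark bits.
import math
import random

SYNONYMS = {           # partial codon -> amino-acid table, just enough for the example
    "CTT": "Leu", "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
    "GGT": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
}

def angle_table(key: int) -> dict:
    """Watermark key: a keyed random assignment of codons to circular angles."""
    rng = random.Random(key)
    codons = list(SYNONYMS)
    rng.shuffle(codons)
    return {c: 2 * math.pi * i / len(codons) for i, c in enumerate(codons)}

def embed(codons: list[str], bits: list[int], key: int) -> list[str]:
    """Replace codons by synonymous ones whose keyed angle half-circle encodes each bit."""
    angles = angle_table(key)
    out, bit_iter = [], iter(bits)
    for codon in codons:
        bit = next(bit_iter, None)
        if bit is None:
            out.append(codon)
            continue
        candidates = sorted(c for c in SYNONYMS if SYNONYMS[c] == SYNONYMS[codon])
        chosen = next((c for c in candidates
                       if int(angles[c] // math.pi) % 2 == bit), codon)
        out.append(chosen)          # the encoded amino-acid sequence is unchanged
    return out

print(embed(["CTT", "GGT", "CTG"], [1, 0, 1], key=42))
```

Without the keyed mapping table, an observer sees only synonymous codon choices, which is the sense in which the table acts as the watermark key.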