
Classifying a Strength of Dependency between classes by using Software Metrics and Machine Learning in Object-Oriented System (기계학습과 품질 메트릭을 활용한 객체간 링크결합강도 분류에 관한 연구)

  • Jung, Sungkyun;Ahn, Jaegyoon;Yeu, Yunku;Park, Sanghyun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.10
    • /
    • pp.651-660
    • /
    • 2013
  • Object-oriented design has improved productivity and software quality by adopting concepts such as inheritance and encapsulation. However, as software grows in volume, both the number of classes and the amount of object coupling increase. Object coupling between classes is closely related to software complexity, and high complexity degrades software quality. To address the coupling issue, researchers have adopted component-based development and software quality metrics: component-based development requires an explicit representation of the dependencies between classes, while software quality metrics evaluate the quality of the software. As part of this line of research, we aim to obtain basic data to be used in decomposing software. Whereas previous studies evaluated and accumulated the qualities of individual classes, we focus on the properties of the linkage between classes. Our method exploits machine learning to analyze these linkage properties and predict the strength of dependency between classes, offering a new perspective on analyzing software properties.
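As an illustration of the link-classification idea, the following minimal sketch treats each class pair as a feature vector of hypothetical coupling metrics (shared method calls, shared attribute accesses, an inheritance flag — none of these are necessarily the paper's actual metrics) and predicts a strength label with a simple k-nearest-neighbor learner standing in for the paper's machine learning technique:

```python
import math

# Hypothetical coupling-metric features for class-pair links:
# (shared method calls, shared attribute accesses, inheritance flag)
train = [
    ((12, 5, 1), "strong"),
    ((10, 4, 1), "strong"),
    ((1, 0, 0), "weak"),
    ((2, 1, 0), "weak"),
]

def classify_link(features, k=3):
    """k-NN over Euclidean distance -- a stand-in for the paper's learner."""
    dists = sorted((math.dist(features, f), label) for f, label in train)
    # majority vote among the k nearest links
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

print(classify_link((11, 6, 1)))  # a heavily coupled link -> "strong"
```

The point is only the pipeline shape: metric vectors per link in, a learned strength class out.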

Design of an Arm Gesture Recognition System Using Feature Transformation and Hidden Markov Models (특징 변환과 은닉 마코프 모델을 이용한 팔 제스처 인식 시스템의 설계)

  • Heo, Se-Kyeong;Shin, Ye-Seul;Kim, Hye-Suk;Kim, In-Cheol
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.10
    • /
    • pp.723-730
    • /
    • 2013
  • This paper presents the design of an arm gesture recognition system using the Kinect sensor. A variety of methods have been proposed for gesture recognition, ranging from Dynamic Time Warping (DTW) to Hidden Markov Models (HMMs). Our system learns a unique HMM for each arm gesture from a set of sequential skeleton data. Whenever the same gesture is performed, the trajectory of each joint captured by the Kinect sensor may differ considerably from previous ones, depending on the length and/or orientation of the subject's arm. To obtain robust performance independent of these conditions, the proposed system performs a feature transformation in which feature vectors of joint positions are transformed into vectors of angles between joints. To improve the computational efficiency of learning and using HMMs, our system also performs k-means clustering to convert the high-dimensional real-valued observation vectors into one-dimensional integer sequences suitable as inputs for discrete HMMs. This dimension reduction and discretization helps our system use HMMs efficiently to recognize gestures in real-time environments. Finally, we demonstrate the recognition performance of our system through experiments on two different datasets.
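The k-means discretization step can be sketched as follows; the tiny 2-D "joint angle" observations and the two-cluster setting are illustrative assumptions, not the paper's configuration:

```python
import math

def kmeans(points, k=2, iters=10):
    """Tiny k-means, standing in for the paper's observation quantizer."""
    centroids = [points[0], points[-1]]  # naive init (k=2 assumed here)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

def quantize(seq, centroids):
    """Map each observation vector to its nearest-centroid index, giving
    the one-dimensional integer sequence a discrete HMM consumes."""
    return [min(range(len(centroids)),
                key=lambda c: math.dist(p, centroids[c])) for p in seq]

# hypothetical 2-D joint-angle observations from two gesture phases
obs = [(0.1, 0.2), (0.2, 0.1), (0.9, 1.0), (1.0, 0.9)]
cents = kmeans(obs)
print(quantize(obs, cents))  # -> [0, 0, 1, 1]
```

Each gesture sequence thus becomes a short symbol string, which is what makes discrete HMM training and evaluation cheap.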

Counterfeit Money Detection Algorithm using Non-Local Mean Value and Support Vector Machine Classifier (비지역적 특징값과 서포트 벡터 머신 분류기를 이용한 위변조 지폐 판별 알고리즘)

  • Ji, Sang-Keun;Lee, Hae-Yeoun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.1
    • /
    • pp.55-64
    • /
    • 2013
  • Due to the popularization of high-performance digital capture equipment and the emergence of powerful image-editing software, it is easy for anyone to produce high-quality counterfeit money. However, the likelihood that the general public can detect a counterfeit banknote is extremely low. In this paper, we propose a counterfeit money detection algorithm using a general-purpose scanner. The algorithm identifies counterfeit money based on features that differ due to the printing process. After the non-local mean value is used to analyze the noise from each banknote, we extract statistical features from this noise by calculating a gray-level co-occurrence matrix. These features are then used to train and test a support vector machine classifier that distinguishes original from counterfeit money. In the experiments, we use a total of 324 images of original and counterfeit money, and we compare against noise features from previous research using the Wiener filter and the discrete wavelet transform. The accuracy of the algorithm in identifying counterfeit money was over 94%, and the accuracy in identifying the printing source was over 93%. The presented algorithm outperforms previous approaches.
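A minimal sketch of the co-occurrence step is shown below; the 4-level toy "noise residual" and the horizontal-offset-only GLCM are illustrative assumptions, and the contrast statistic is just one example of the kind of feature fed to the SVM:

```python
from collections import Counter

def glcm(img):
    """Normalized counts of horizontally adjacent gray-level pairs."""
    pairs = Counter()
    for row in img:
        for a, b in zip(row, row[1:]):
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    return {p: n / total for p, n in pairs.items()}

def contrast(g):
    """One statistical feature of the co-occurrence matrix."""
    return sum(p * (a - b) ** 2 for (a, b), p in g.items())

noise = [[0, 0, 1, 1], [3, 3, 2, 2]]  # toy quantized noise residual
g = glcm(noise)
print(round(contrast(g), 3))  # -> 0.333
```

A real pipeline would compute several such statistics (contrast, energy, homogeneity, ...) over multiple offsets and pass the resulting feature vector to the SVM.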

Design and Implementation of a Protocol for Interworking Open Web Application Store (개방형 웹 애플리케이션 스토어 연동을 위한 프로토콜의 설계 및 구현)

  • Baek, Jihun;Kim, Jihun;Nam, Yongwoo;Lee, HyungUk;Park, Sangwon;Jeon, Jonghong;Lee, Seungyoon
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.10
    • /
    • pp.669-678
    • /
    • 2013
  • As portable devices have become popular, it is now common for one person to carry several, and smartphone use continues to grow. Since smartphones spread rapidly, the total usage of smartphone applications has also increased. However, each application store still requires a different platform for developing and distributing applications. The market is dominated by two ecosystems, Android and Apple's, so developers must build each application on two different platforms, nearly doubling development cost. For platforms that still have a small market share, such as Bada, the main weakness is not cost but attracting enough developers for platform-specific application development. Web applications are emerging as a solution to these problems, reducing the cost and time of developing applications for every platform: because a web application is not bound to any particular store's platform, it can run properly on every portable device. All application markets could therefore be united into one large market through a protocol that interconnects web application stores. However, there is still no standard for web application stores, and no current web application store can interwork with others. In this paper, we propose such a protocol, develop a prototype, and show that the protocol can remedy these weaknesses.
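The interworking idea can be illustrated with a hypothetical store-to-store message exchange; the JSON message types and field names below are purely illustrative, since the paper's actual protocol schema is not reproduced here:

```python
import json

def make_search_request(keyword, origin_store):
    """One store forwards an app search to a peer web-app store."""
    return json.dumps({
        "type": "app-search",
        "keyword": keyword,
        "origin": origin_store,
    })

def handle_search_request(raw, catalog):
    """The peer store answers from its own catalog."""
    req = json.loads(raw)
    hits = [app for app in catalog if req["keyword"] in app["name"]]
    return json.dumps({"type": "app-search-result", "apps": hits})

catalog = [{"name": "memo-web"}, {"name": "photo-web"}]
resp = handle_search_request(make_search_request("memo", "store-A"), catalog)
print(json.loads(resp)["apps"])  # -> [{'name': 'memo-web'}]
```

With a shared message format like this, any conforming store can relay searches (and, by extension, downloads or metadata) to any other, which is the unification the paper argues for.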

Understanding of Structural Changes of Keyword Networks in the Computer Engineering Field (컴퓨터공학 분야 키워드네트워크의 구조적 변화 이해)

  • Kwon, Yung-Keun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.3
    • /
    • pp.187-194
    • /
    • 2013
  • Recently, there have been many attempts to analyze the characteristics of research trends through structural analysis of keyword networks in various fields. However, most previous studies have focused on the structural analysis of static networks, and there is a lack of research on how such network structures change over time. In this paper, we construct annual keyword networks from a database of papers published in international computer engineering journals from 2002 through 2011, and examine how they change. The results show that most keywords in a network are preserved in the next year's network, and that their degree of connectivity is higher, and the average weight of their connections smaller, than those of keywords that are not preserved. In addition, when a keyword network transitions to that of the next year, connections between keywords are more likely to be removed than preserved, and the average weight of the removed connections is higher than that of the preserved ones. These results imply that while the keywords themselves change little over time, their connections are very likely to change, and that there are clear differences between the preserved and removed groups of keywords and connections with respect to degree and connection weight. All of these results are observed consistently over the ten-year dataset, and they can serve as important principles for understanding the structural changes of keyword networks.
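The year-over-year comparison reduces to set operations over weighted connections; the two toy annual networks below are illustrative, not the paper's data:

```python
# edges of two annual keyword networks, weight = co-occurrence count
net_2010 = {("data", "mining"): 2, ("mining", "web"): 5, ("web", "agent"): 7}
net_2011 = {("data", "mining"): 6, ("data", "cloud"): 1}

kept = set(net_2010) & set(net_2011)      # connections preserved next year
removed = set(net_2010) - set(net_2011)   # connections that disappear

def avg_weight(edges, net):
    """Mean weight of a set of connections in a given year's network."""
    return sum(net[e] for e in edges) / len(edges)

print(len(kept) / len(net_2010))        # fraction of connections preserved
print(avg_weight(removed, net_2010))    # mean weight of removed connections
print(avg_weight(kept, net_2010))       # mean weight of preserved connections
```

In this toy example, as in the paper's finding, more connections are removed than preserved, and the removed ones carry the higher average weight.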

A Study of Standard eBook Contents Conversion (전자책 표준간의 컨텐츠 변환에 관한 연구)

  • Ko, Seung-Kyu;Sohn, Won-Sung;Lim, Soon-Bum;Choy, Yoon-Chul
    • The KIPS Transactions:PartD
    • /
    • v.10D no.2
    • /
    • pp.267-276
    • /
    • 2003
  • Many countries have established eBook standards suited to their environments. In the USA, OEB PS was announced for the distribution and display of eBooks; in Japan, JepaX was announced for storage and exchange; and in Korea, EBKS was created for the clear exchange of eBook contents. These diverse objectives lead to different content structures, and this variety of content structures causes problems when exchanging contents. To exchange eBook contents correctly, the content structure must be considered. In this paper, we therefore study methods for converting standard eBook contents, based on the Korean eBook standard, while taking content structure into account. To convert contents properly, the mapping relations must be clearly defined. To this end, we consider each standard's structure and extension mechanisms, and use path notations and namespaces for precise description. Moreover, through analysis of the mapping relationships, we classify conversion cases into automatic, semi-automatic, and manual conversions. Finally, we write conversion scripts and experiment with them.
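The automatic/semi-automatic/manual classification can be pictured as a mapping table over namespaced element names; the element names and mappings below are invented for illustration and are not the standards' real vocabularies:

```python
# hypothetical EBKS -> OEB element mapping with a conversion-mode label
MAPPING = {
    "ebks:title":  ("oebps:dc-title", "automatic"),
    "ebks:author": ("oebps:dc-creator", "automatic"),
    "ebks:verse":  ("oebps:p", "semi-automatic"),  # structure partly lost
    "ebks:annot":  (None, "manual"),               # no target counterpart
}

def convert(element, text):
    """Convert one source element; return the mode and target markup,
    or the bare text when only manual conversion is possible."""
    target, mode = MAPPING[element]
    if target is None:
        return (mode, text)
    local = target.split(":")[1]
    return (mode, f"<{local}>{text}</{local}>")

print(convert("ebks:title", "My Book"))
```

A real converter would work on full paths rather than single element names, which is why the paper relies on path notations and namespaces to keep the mapping unambiguous.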

A Distributed Method for Constructing a P2P Overlay Multicast Network using Computational Intelligence (지능적 계산법을 이용한 분산적 P2P 오버레이 멀티케스트 네트워크 구성 기법)

  • Park, Jaesung
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.11 no.6
    • /
    • pp.95-102
    • /
    • 2012
  • In this paper, we propose a method that can efficiently construct a P2P overlay multicast network composed of peers heterogeneous in communication bandwidth, processing power, and storage size, by selecting peers in a distributed fashion using ant-colony optimization, one of the computational intelligence methods. When selecting a parent for a newly joining node, the proposed method considers not only the capacity of a candidate peer but also the number of children peers it supports and the hop distance between the multicast source and the peer. Thus, the P2P multicast overlay network is constructed efficiently in the sense that the distances between the multicast source and the peers are kept small. In addition, the proposed method works in a distributed fashion, in that peers use only local information to find a parent node; compared to a centralized method in which a central server maintains and controls the overlay construction process, the proposed method scales well. Through simulations, we show that, by making a few high-capacity peers support many low-capacity peers, the proposed method can keep the overlay network compact even when there are a few thousand peers in the network.
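The ant-colony-style parent choice can be sketched as pheromone-weighted roulette-wheel selection over candidate parents; the scoring formula and weights below are illustrative assumptions, not the paper's actual rule:

```python
import random

def parent_score(peer, alpha=1.0, beta=1.0):
    """Favor high capacity, few existing children, and short hop distance,
    modulated by the pheromone level ants have deposited on this peer."""
    desirability = peer["capacity"] / (1 + peer["children"]) / peer["hops"]
    return (peer["pheromone"] ** alpha) * (desirability ** beta)

def choose_parent(peers, rng=random.random):
    """Roulette-wheel (proportional) selection over candidate parents."""
    scores = [parent_score(p) for p in peers]
    r = rng() * sum(scores)
    for p, s in zip(peers, scores):
        r -= s
        if r <= 0:
            return p
    return peers[-1]

peers = [
    {"id": "A", "capacity": 10, "children": 1, "hops": 1, "pheromone": 2.0},
    {"id": "B", "capacity": 2, "children": 4, "hops": 3, "pheromone": 1.0},
]
print(choose_parent(peers, rng=lambda: 0.5)["id"])  # -> "A"
```

Because selection is probabilistic rather than greedy, low-score peers are still occasionally chosen, which is what lets the colony keep exploring alternative overlay shapes.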

Efficient Methodology in Markov Random Field Modeling : Multiresolution Structure and Bayesian Approach in Parameter Estimation (피라미드 구조와 베이지안 접근법을 이용한 Markov Random Field의 효율적 모델링)

  • 정명희;홍의석
    • Korean Journal of Remote Sensing
    • /
    • v.15 no.2
    • /
    • pp.147-158
    • /
    • 1999
  • Remote sensing techniques have, for decades, offered a better understanding of our environment by providing a useful level of information on land cover. In many applications using remotely sensed data, digital image processing methodology has been employed to characterize features in the data and develop models. Random field models, especially Markov Random Field (MRF) models that exploit spatial relationships, have been successfully utilized in many problems such as texture modeling and region labeling. Remotely sensed images are usually very large, and the data volume grows greatly in problems requiring temporal data over a time period; the time required to process increasingly large images does not grow linearly. In this study, methodology for reducing the computational cost of utilizing Markov Random Fields is investigated. To this end, a multiresolution framework is explored, which provides convenient and efficient structures for the transition between local and global features. The computational requirements for parameter estimation of the MRF model also become excessive as image size increases; a Bayesian approach is investigated as an alternative estimation method to reduce the computational burden of estimating the parameters of large images.
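The multiresolution framework can be pictured as a simple image pyramid in which each level halves the resolution; the 2x2 block averaging below is one common construction, offered only as an illustrative sketch of the structure the paper exploits:

```python
def downsample(img):
    """One pyramid level: average non-overlapping 2x2 blocks."""
    return [
        [
            (img[r][c] + img[r][c + 1] + img[r + 1][c] + img[r + 1][c + 1]) / 4
            for c in range(0, len(img[0]), 2)
        ]
        for r in range(0, len(img), 2)
    ]

def pyramid(img, levels=2):
    """Fine-to-coarse stack used to cut MRF processing cost: global
    structure is resolved cheaply at coarse levels, then refined."""
    out = [img]
    for _ in range(levels):
        img = downsample(img)
        out.append(img)
    return out

img = [[1, 1, 3, 3], [1, 1, 3, 3], [5, 5, 7, 7], [5, 5, 7, 7]]
print(pyramid(img)[1])  # -> [[1.0, 3.0], [5.0, 7.0]]
```

MRF labeling solved at the coarse level can then initialize the finer level, so most iterations run on far fewer pixels.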

A Study on the Enhancement of DEM Resolution by Radar Interferometry (레이더 간섭기법을 이용한 수치고도모델 해상도 향상에 관한 연구)

  • Kim Chang-Oh;Kim Sang-Wan;Lee Dong-Cheon;Lee Yong-Wook;Kim Jeong Woo
    • Korean Journal of Remote Sensing
    • /
    • v.21 no.4
    • /
    • pp.287-302
    • /
    • 2005
  • Digital Elevation Models (DEMs) were generated by ERS-1/2 and JERS-1 SAR interferometry over the Daejeon area, Korea. The quality of the DEMs was evaluated using Ground Control Points (GCPs) determined by GPS surveys in the city area, while in the mountain area, which has no GCPs, a 1:25,000 digital map was used. To minimize errors due to inaccurate satellite orbit information and the phase unwrapping procedure, Differential InSAR (DInSAR) was implemented in addition to the traditional InSAR analysis for DEM generation. In addition, DEMs from GTOPO30, SRTM-3, and the 1:25,000 digital map were used to assess the resolution of the DEM generated by DInSAR. From InSAR analysis of one ERS tandem pair and six JERS-1 interferometric pairs, elevation errors of 5-6 meters were found in the flat area, regardless of the use or the resolution of a reference DEM. In the mountain area, however, DInSAR with DEMs from SRTM-3 and the digital map proved very effective in reducing errors from the phase unwrapping procedure. Errors due to the low signal-to-noise ratio of the radar images and to atmospheric effects were also attenuated in the DEMs generated by stacking the six JERS-1 pairs. SAR interferometry with multiple interferometric pairs and a low-resolution reference DEM can thus be used effectively to enhance DEM resolution in terms of data processing time and cost.
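The error attenuation from stacking can be sketched as a per-pixel average of several co-registered interferometric height estimates; the unweighted mean and the toy height values below are illustrative assumptions (a real stack would typically weight by interferogram quality):

```python
def stack(dems):
    """Average co-registered DEM height estimates pixel by pixel, so
    uncorrelated noise (SNR, atmosphere) partially cancels."""
    n = len(dems)
    return [
        [sum(dem[r][c] for dem in dems) / n for c in range(len(dems[0][0]))]
        for r in range(len(dems[0]))
    ]

# three noisy height estimates of the same 1x3 strip (meters)
dems = [[[100.0, 105.0, 110.0]],
        [[102.0, 103.0, 112.0]],
        [[98.0, 107.0, 108.0]]]
print(stack(dems))  # -> [[100.0, 105.0, 110.0]]
```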

A Study on the Concentration of Research Investment in National R&D Projects Using the Theil Index (타일(Theil) 지수를 이용한 국가연구개발사업의 연구비 집중도 분석)

  • Yang, Hyeonchae;Sung, Kyungmo;Kim, Yeonglin
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.8 no.9
    • /
    • pp.355-362
    • /
    • 2019
  • In the past, when research and development (R&D) resources were absolutely scarce, the so-called 'choice and concentration' strategy for national R&D projects was persuasive. Under the current situation, where various actors such as government-funded research institutes (GRIs) and universities, supported by far more abundant R&D resources, conduct national R&D projects, this strategy cannot be applied indiscriminately. To see how the strategy has worked, this paper analyzes the concentration of research funds allocated to the actors performing national R&D projects. Concentration is measured from the research funds provided by the government from 2002 to 2016, using the Theil index to break down the overall concentration of the national R&D program into the contributions of individual actors. The results from the Theil index were compared with concentrations computed using the Gini coefficient, a widely known indicator. The Theil index proved able to analyze both the overall concentration and the contributions of sub-components, such as universities and GRIs, that make up the national R&D system. The results also showed that GRIs had the highest concentration, followed by universities, though their concentration has decreased somewhat compared to ten years ago. Small companies, on the other hand, have maintained a certain level of funding without being highly concentrated. In other words, universities and GRIs tend to reduce the gap in research fund allocation among institutions, while small companies tend to distribute funds evenly.
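The property that makes the Theil index suitable here is its exact decomposition into within-group and between-group terms, which is what lets the overall concentration be attributed to sub-components such as universities and GRIs. A minimal sketch of the standard Theil T index and its decomposition (the funding numbers are illustrative, not the paper's data):

```python
import math

def theil(x):
    """Theil T index of a funding distribution."""
    mu = sum(x) / len(x)
    return sum((v / mu) * math.log(v / mu) for v in x) / len(x)

def theil_decomposed(groups):
    """Split overall concentration into within-group and between-group parts.
    `groups` maps an actor type (e.g. universities, GRIs) to its fund list."""
    all_x = [v for g in groups.values() for v in g]
    total, mu = sum(all_x), sum(all_x) / len(all_x)
    within = sum((sum(g) / total) * theil(g) for g in groups.values())
    between = sum((sum(g) / total) * math.log((sum(g) / len(g)) / mu)
                  for g in groups.values())
    return within, between

# illustrative funding amounts for two actor types
groups = {"universities": [4, 6], "GRIs": [30, 40]}
w, b = theil_decomposed(groups)
print(round(w + b, 9) == round(theil([4, 6, 30, 40]), 9))  # -> True
```

The Gini coefficient lacks such an additive decomposition, which is the comparison point the paper draws on.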