• Title/Summary/Keyword: Neighborhood search

Search Results: 113

High accuracy map matching method using monocular cameras and low-end GPS-IMU systems (단안 카메라와 저정밀 GPS-IMU 신호를 융합한 맵매칭 방법)

  • Kim, Yong-Gyun;Koo, Hyung-Il;Kang, Seok-Won;Kim, Joon-Won;Kim, Jae-Gwan
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.4
    • /
    • pp.34-40
    • /
    • 2018
  • This paper presents a new method to accurately estimate the pose of a moving object using a monocular camera and a low-end GPS-IMU sensor system. To this end, we adopt a deep neural network for semantic segmentation of the input images and compare the results with a semantic map of the neighborhood. In this map matching, weight tables are used to deal with label inconsistency effectively. Signals from the low-end GPS-IMU sensor system are used to limit the search space and to minimize the proposed function. For the evaluation, we added noise to the signals from a high-end GPS-IMU system. The results show that the pose can be recovered from the noisy signals. We also show that the proposed method is effective in non-open-sky situations.
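The weight-table comparison described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the label grids, the weight values, and all function names are assumptions, and the GPS-IMU prior is reduced to a short list of candidate offsets.

```python
# Weight table: reward for observing label `obs` where the map says `ref`.
# Off-diagonal entries soften label inconsistency (e.g. road vs. sidewalk).
# All labels and weights here are illustrative assumptions.
WEIGHTS = {
    ("road", "road"): 1.0,
    ("road", "sidewalk"): 0.5,
    ("sidewalk", "road"): 0.5,
    ("sidewalk", "sidewalk"): 1.0,
    ("building", "building"): 1.0,
}

def match_score(observed, reference):
    """Sum weight-table rewards over two aligned label grids."""
    return sum(
        WEIGHTS.get((o, r), 0.0)
        for o_row, r_row in zip(observed, reference)
        for o, r in zip(o_row, r_row)
    )

def best_offset(observed, semantic_map, candidates):
    """Evaluate only the candidate offsets allowed by the GPS-IMU prior."""
    def shifted(dx, dy):
        h, w = len(observed), len(observed[0])
        return [row[dx:dx + w] for row in semantic_map[dy:dy + h]]
    return max(candidates, key=lambda c: match_score(observed, shifted(*c)))
```

The GPS-IMU signals play the role of the `candidates` list: they bound where the vehicle can plausibly be, so only a small neighborhood of offsets is scored.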

Improved Shape Extraction Using Inward and Outward Curve Evolution (양방향 곡선 전개를 이용한 개선된 형태 추출)

  • Kim Ha-Hyoung;Kim Seong-Kon;Kim Doo-Young
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.1 no.1
    • /
    • pp.23-31
    • /
    • 2000
  • Iterative curve evolution techniques are powerful methods for image segmentation. Classical methods propose curve evolutions that guarantee closed contours at convergence and, combined with the level set method, easily handle changes in curve topology. In this paper, we present a new geometric active contour model, based on the level set methods introduced by Osher & Sethian, for detecting object boundaries or shapes, and we adopt anisotropic diffusion filtering to remove noise from the original image. Classical methods allow only one-way curve evolution: shrinking or expanding of the curve. Thus, the initial curve must encircle all the objects to be segmented, or several curves must be used, each one entirely inside one object. Our method, in contrast, allows two-way curve evolution: parts of the curve evolve in the outward direction while others evolve in the inward direction. This offers much more freedom in the initial curve position than a classical geodesic search method. Our algorithm performs accurate and precise segmentation of noisy images with complex objects (including sharp angles, deep concavities, or holes), and it easily handles changes in curve topology. To minimize processing time, we use the narrow band method, which allows us to perform calculations in the neighborhood of the contour rather than over the whole image.
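The narrow band method mentioned at the end of this abstract can be sketched as follows. The simple speed-driven update shown is an illustrative stand-in for the paper's full level set evolution equation; the function names, grid values, and band width are assumptions.

```python
def narrow_band(phi, width):
    """Indices of grid points whose |phi| lies within the band width,
    i.e. points close to the zero contour of the level set function."""
    return [
        (i, j)
        for i, row in enumerate(phi)
        for j, v in enumerate(row)
        if abs(v) <= width
    ]

def evolve(phi, speed, dt, width):
    """One explicit step phi -= dt * F * |grad phi|, restricted to the
    narrow band instead of the whole image (boundary rows/cols skipped)."""
    for i, j in narrow_band(phi, width):
        if 0 < i < len(phi) - 1 and 0 < j < len(phi[0]) - 1:
            gx = (phi[i][j + 1] - phi[i][j - 1]) / 2.0
            gy = (phi[i + 1][j] - phi[i - 1][j]) / 2.0
            phi[i][j] -= dt * speed(i, j) * (gx * gx + gy * gy) ** 0.5
    return phi
```

Restricting the update to the band is what keeps the cost proportional to the contour length rather than the image area.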

Study of Effects of Measurement Errors in Damage Detection (동적 측정오차가 손상탐지에 미치는 영향에 관한 연구)

  • Kim, Ki-Ook
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.39 no.3
    • /
    • pp.218-224
    • /
    • 2011
  • A modal method is presented for investigating the effects of measurement errors in damage detection for dynamic structural systems. Structural modifications to the baseline system result in response changes of the perturbed structure, which are measured to determine a unique system in the inverse problem of damage detection. If the numerical modal data are exact, mathematical programming techniques can be applied to obtain the accurate structural changes. In practice, however, the associated measurement errors are unavoidable to some extent and cause significant deviations from the correct perturbed system because of the intrinsic instability of the eigenvalue problem. Hence, a self-equilibrating inverse system is allowed to drift in the close neighborhood of the measured data. A numerical example shows that iterative procedures can be used to search for the damaged structural elements. A small set of selected degrees of freedom is employed for practical applicability and computational efficiency.

The Development of a Trial Curriculum Classification and Coding System Using Group Technology

  • Lee, Sung-Youl;Yu, Hwa-Young;Ahn, Jung-A;Park, Ga-Eun;Choi, Woo-Seok
    • Journal of Engineering Education Research
    • /
    • v.17 no.4
    • /
    • pp.43-47
    • /
    • 2014
  • The rapid development of science and technology and the globalization of society have accelerated the fractionation and specialization of academic disciplines. Accordingly, Korean colleges and universities are continually dropping antiquated courses to make room for new courses that better meet societal demands. With emphasis placed on providing students with a broader range of choices in course selection, compulsory courses have given way to elective courses. On average, four-year institutions of higher learning in Korea currently offer somewhere in the neighborhood of 1,000 different courses yearly. The classification of an ever-growing list of courses and the practical use of such data would not be possible without the aid of computers. For example, if we were able to show the prerequisite/postrequisite relationships among various courses as well as the commonalities in substance among courses, the data generated on the interrelationship of different courses would undoubtedly benefit students, as well as professors, during course registration. Furthermore, the GT system's relatively simple approach to course classification and coding obviates the need for developing a more complicated keyword-based search engine, and will hopefully contribute to the standardization of the course coding scheme in the future. Therefore, as a sample case project, this study uses GT to classify and code all courses offered at the College of Engineering of K University, thereby developing a system that facilitates searching for relevant courses.

Elite Ant System for Solving Multicast Routing Problem (멀티캐스트 라우팅 문제 해결을 위한 엘리트 개미 시스템)

  • Lee, Seung-Gwan
    • Journal of the Korea Society of Computer and Information
    • /
    • v.13 no.3
    • /
    • pp.147-152
    • /
    • 2008
  • The Ant System (AS) is a new metaheuristic for hard combinatorial optimization problems. It is a population-based approach that exploits positive feedback as well as greedy search, and it was first proposed for tackling the well-known Traveling Salesman Problem. In this paper, AS is applied to the multicast routing problem. Multicast routing is modeled as the NP-complete Steiner tree problem, which seeks the shortest paths from a source node to all destination nodes. We propose a new AS to solve this problem. In its state transition rule, the proposed method selects the next node from the neighborhood by considering both the cost of the edge and the cost of the next node. In addition, the edges selected by elite agents receive additional pheromone. Simulation results show that our proposed method converges faster and gives a lower total cost than the original AS and $AS_{elite}$.
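The state transition rule and the elite pheromone update in this abstract can be sketched as below. The graph encoding, parameter values, and function names are illustrative assumptions rather than the paper's code.

```python
import random

# Illustrative parameters: pheromone weight, cost weight, evaporation, elite bonus.
ALPHA, BETA, RHO, ELITE_BONUS = 1.0, 2.0, 0.5, 1.0

def transition_weights(current, unvisited, dist, tau):
    """Weight each candidate next node by pheromone^alpha * (1/cost)^beta."""
    return {
        j: (tau[(current, j)] ** ALPHA) * ((1.0 / dist[(current, j)]) ** BETA)
        for j in unvisited
    }

def choose_next(current, unvisited, dist, tau, rng):
    """Roulette-wheel selection proportional to the transition weights."""
    w = transition_weights(current, unvisited, dist, tau)
    r = rng.random() * sum(w.values())
    for j, wj in w.items():
        r -= wj
        if r <= 0:
            return j
    return j  # numerical fallback: last candidate

def elite_update(tau, best_tour, best_cost):
    """Evaporate all pheromone, then deposit extra on the elite tour's edges."""
    for e in tau:
        tau[e] *= (1.0 - RHO)
    for a, b in zip(best_tour, best_tour[1:]):
        tau[(a, b)] += ELITE_BONUS / best_cost
```

Depositing extra pheromone only on the best-so-far (elite) tour is what biases later ants toward low-cost edges and speeds up convergence.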

Generation of Pareto Sets based on Resource Reduction for Multi-Objective Problems Involving Project Scheduling and Resource Leveling (프로젝트 일정과 자원 평준화를 포함한 다목적 최적화 문제에서 순차적 자원 감소에 기반한 파레토 집합의 생성)

  • Jeong, Woo-Jin;Park, Sung-Chul;Yim, Dong-Soon
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.43 no.2
    • /
    • pp.79-86
    • /
    • 2020
  • To make a satisfactory decision regarding project scheduling, a trade-off between the resource-related cost and the project duration must be considered. A beneficial approach for decision makers is to provide a number of alternative schedules of diverse project durations with minimum resource cost. In terms of optimization, these alternative schedules are Pareto sets under the multiple objectives of project duration and resource cost. Assuming that resource cost is closely related to resource leveling, a heuristic algorithm for resource capacity reduction (HRCR) is developed in this study to generate the Pareto sets efficiently. The heuristic is based on the fact that resource leveling can be improved by systematically reducing the resource capacity. Once the reduced resource capacity is given, a schedule with minimum project duration can be obtained by solving a resource-constrained project scheduling problem. In HRCR, VNS (Variable Neighborhood Search) is implemented to solve the resource-constrained project scheduling problem. Extensive experiments to evaluate the HRCR performance are conducted with the standard benchmarking data sets PSPLIB. Considering 5 resource leveling objective functions, it is shown that HRCR outperforms the well-known multi-objective optimization algorithm SPEA2 (Strength Pareto Evolutionary Algorithm 2) in generating dominant Pareto sets. The number of approximate Pareto optima can also be extended by modifying the weight parameter used to reduce resource capacity in HRCR.
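The outer loop of HRCR can be sketched as follows, with the VNS-based resource-constrained scheduling step replaced by a stub function passed in as `solve`. All names and numbers are illustrative assumptions.

```python
def pareto_front(points):
    """Keep the (duration, cost) points not dominated by any other point,
    with both objectives minimized."""
    front = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

def hrcr(initial_capacity, solve, steps):
    """Systematically tighten the resource capacity; `solve` stands in for
    the VNS scheduling step and maps a capacity to a (duration, cost) pair."""
    results = []
    cap = initial_capacity
    for _ in range(steps):
        results.append(solve(cap))
        cap -= 1  # capacity reduction drives the duration/cost trade-off
    return pareto_front(results)
```

Each capacity level yields one schedule; filtering the collected schedules for non-dominance produces the approximate Pareto set handed to the decision maker.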

A Multimodal Profile Ensemble Approach to Development of Recommender Systems Using Big Data (빅데이터 기반 추천시스템 구현을 위한 다중 프로파일 앙상블 기법)

  • Kim, Minjeong;Cho, Yoonho
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.4
    • /
    • pp.93-110
    • /
    • 2015
  • A recommender system recommends products to customers who are likely to be interested in them. Based on automated information filtering technology, various recommender systems have been developed. Collaborative filtering (CF), one of the most successful recommendation algorithms, has been applied in a number of different domains, such as recommending Web pages, books, movies, music, and products. However, CF has a critical shortcoming: it finds neighbors whose preferences are like those of the target customer and recommends products those customers have most liked, so it works properly only when there is a sufficient number of ratings on common products. When customer ratings are scarce, the neighborhood formed by CF is inaccurate, resulting in poor recommendations. To improve the performance of CF-based recommender systems, most related studies have focused on developing novel algorithms under the assumption of a single profile, created from users' item ratings, purchase transactions, or Web access logs. With the advent of big data, companies have come to collect more data and to use a greater variety of large-scale information. Many companies now consider it important to utilize big data because it improves their competitiveness and creates new value. In particular, utilizing personal big data in recommender systems is on the rise, because personal big data facilitate more accurate identification of users' preferences and behaviors. The proposed recommendation methodology is as follows. First, multimodal user profiles are created from personal big data in order to grasp the preferences and behavior of users from various viewpoints. We derive five user profiles based on personal information such as ratings, site preferences, demographics, Internet usage, and topics in text. Next, the similarity between users is calculated based on the profiles, and neighbors are found from the results. One of three ensemble approaches is applied to calculate the similarity: the similarity of the combined profile, the average of the per-profile similarities, or the weighted average of the per-profile similarities. Finally, the products that members of the neighborhood prefer most are recommended to the target users. For the experiments, we used demographic data and a very large volume of Web log transactions for 5,000 panel users of a company specialized in analyzing the rankings of Web sites. R was used to implement the proposed recommender system, and SAS E-Miner was used to conduct the topic analysis with keyword search. To evaluate recommendation performance, we used 60% of the data for training and 40% for testing, and 5-fold cross validation was conducted to enhance the reliability of our experiments. The F1 metric, a widely used combination metric that gives equal weight to recall and precision, was employed for evaluation. The proposed methodology achieved a significant improvement over the single-profile-based CF algorithm. In particular, the ensemble approach using weighted average similarity showed the highest performance: the rate of improvement in F1 was 16.9 percent, compared with 8.1 percent for the ensemble approach using the average similarity of each profile. From these results, we conclude that the multimodal profile ensemble approach is a viable solution to the problems encountered when customer ratings are scarce. This study is significant in suggesting what kinds of information can be used to create profiles in a big data environment and how they can be combined and utilized effectively. However, our methodology should be studied further before real-world application. We need to compare the differences in recommendation accuracy by applying the proposed method to different recommendation algorithms and then identify which combination shows the best performance.
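The three ensemble options for computing user similarity described in this abstract can be sketched as below; the toy profile vectors, the cosine measure, and the weights are illustrative assumptions.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def combined_profile_sim(profiles_u, profiles_v):
    """Option 1: concatenate all profiles, then compute one similarity."""
    u = [x for p in profiles_u for x in p]
    v = [x for p in profiles_v for x in p]
    return cosine(u, v)

def average_sim(profiles_u, profiles_v):
    """Option 2: plain average of the per-profile similarities."""
    sims = [cosine(u, v) for u, v in zip(profiles_u, profiles_v)]
    return sum(sims) / len(sims)

def weighted_average_sim(profiles_u, profiles_v, weights):
    """Option 3 (best-performing in the study): weighted average of the
    per-profile similarities."""
    sims = [cosine(u, v) for u, v in zip(profiles_u, profiles_v)]
    return sum(w * s for w, s in zip(weights, sims)) / sum(weights)
```

In the full method each user would carry five profiles (rating, site preference, demographic, Internet usage, text topic); here two small vectors per user stand in for them.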

Noise-robust electrocardiogram R-peak detection with adaptive filter and variable threshold (적응형 필터와 가변 임계값을 적용하여 잡음에 강인한 심전도 R-피크 검출)

  • Rahman, MD Saifur;Choi, Chul-Hyung;Kim, Si-Kyung;Park, In-Deok;Kim, Young-Pil
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.18 no.12
    • /
    • pp.126-134
    • /
    • 2017
  • There have been numerous studies on extracting the R-peak from electrocardiogram (ECG) signals. However, most detection methods are complicated to implement in a real-time portable electrocardiograph and have the disadvantage of requiring a large amount of computation. R-peak detection requires pre-processing and post-processing related to baseline drift and the removal of commercial power-line noise from the ECG data. An adaptive filter technique is widely used for R-peak detection, but the R-peak cannot be detected when the input is lower than a threshold value; moreover, noise can lead to an erroneous threshold value that causes P-peaks and T-peaks to be detected instead. We propose a robust R-peak detection algorithm with low complexity and simple computation to solve these problems. The proposed scheme removes the baseline drift in ECG signals with an adaptive filter to solve the problems involved in threshold extraction. We also propose a technique to automatically extract an appropriate threshold value using the minimum and maximum values of the filtered ECG signal, and a threshold neighborhood search technique to detect the R-peak from the ECG signal. Through experiments, we confirmed the improved R-peak detection accuracy of the proposed method and achieved a detection speed suitable for a mobile system by reducing the amount of calculation. The experimental results show that the heart rate detection accuracy and sensitivity were very high (about 100%).
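The variable-threshold and neighborhood-search steps can be sketched as follows. The threshold ratio and window size are illustrative assumptions, not the paper's tuned values, and the input is assumed to be the already-filtered signal.

```python
def variable_threshold(signal, ratio=0.6):
    """Derive the threshold from the filtered signal's min/max range.
    The 0.6 ratio is an illustrative assumption."""
    lo, hi = min(signal), max(signal)
    return lo + ratio * (hi - lo)

def detect_r_peaks(signal, window=3):
    """Keep only above-threshold samples that are the maximum of their
    neighborhood, so nearby P/T waves and noise bumps are rejected."""
    thr = variable_threshold(signal)
    peaks = []
    for i, x in enumerate(signal):
        if x < thr:
            continue
        lo, hi = max(0, i - window), min(len(signal), i + window + 1)
        if x == max(signal[lo:hi]) and (not peaks or i - peaks[-1] > window):
            peaks.append(i)
    return peaks
```

Because the threshold is recomputed from the signal itself, the detector adapts to amplitude changes instead of relying on a fixed cutoff.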

Social Network : A Novel Approach to New Customer Recommendations (사회연결망 : 신규고객 추천문제의 새로운 접근법)

  • Park, Jong-Hak;Cho, Yoon-Ho;Kim, Jae-Kyeong
    • Journal of Intelligence and Information Systems
    • /
    • v.15 no.1
    • /
    • pp.123-140
    • /
    • 2009
  • Collaborative filtering recommends products using customers' preferences, so it cannot recommend products to a new customer who has no preference information. This paper proposes a novel approach to new customer recommendations using social network analysis, which is used to examine relationships among social entities in settings such as genetic networks, traffic networks, and organizational networks. The proposed recommendation method identifies the customers most likely to be neighbors of the new customer using centrality theory from social network analysis and recommends products those customers have liked in the past. The procedure of our method is divided into four phases: purchase similarity analysis, social network construction, centrality-based neighborhood formation, and recommendation generation. To evaluate the effectiveness of our approach, we conducted several experiments using a data set from a department store in Korea. Our method was compared with a best-seller-based method that uses the best-seller list to generate recommendations for the new customer. The experimental results show that our approach significantly outperforms the best-seller-based method as measured by the F1-measure.
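The centrality-based neighborhood formation can be sketched as below using degree centrality as the centrality measure; the similarity values, the threshold, and all names are illustrative assumptions.

```python
def build_network(similarity, threshold):
    """Phase 2: link customer pairs whose purchase similarity (phase 1)
    exceeds the threshold."""
    edges = {u: set() for u in similarity}
    for u, row in similarity.items():
        for v, s in row.items():
            if u != v and s >= threshold:
                edges[u].add(v)
                edges[v].add(u)
    return edges

def degree_centrality(edges):
    """Fraction of other customers each customer is connected to."""
    n = len(edges)
    return {u: len(nbrs) / (n - 1) for u, nbrs in edges.items()}

def neighbors_for_new_customer(edges, k):
    """Phase 3: the most central customers serve as the new customer's
    neighborhood, since no preference data exists for the newcomer."""
    cent = degree_centrality(edges)
    return sorted(cent, key=cent.get, reverse=True)[:k]
```

Phase 4 would then recommend the products these central customers have liked in the past.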

Comparison of Texture Images and Application of Template Matching for Geo-spatial Feature Analysis Based on Remote Sensing Data (원격탐사 자료 기반 지형공간 특성분석을 위한 텍스처 영상 비교와 템플레이트 정합의 적용)

  • Yoo Hee Young;Jeon So Hee;Lee Kiwon;Kwon Byung-Doo
    • Journal of the Korean earth science society
    • /
    • v.26 no.7
    • /
    • pp.683-690
    • /
    • 2005
  • As remote sensing imagery with high spatial resolution (e.g., a pixel resolution of 1 m or less) is widely used in specific application domains, the demand for advanced analysis methods for this imagery is increasing. Among the many applicable methods, texture image analysis, which characterizes the spatial distribution of the gray levels in a neighborhood, can be regarded as one useful approach. For the texture images, we compared and analyzed the different results obtained with various directions, kernel sizes, and parameter types of the GLCM algorithm, and we studied the spatial feature characteristics within each result image. In addition, a template matching program that can search for spatial patterns using template images selected from the original and texture images was implemented and applied, and probabilities were examined on the basis of the results. These results suggest effective applications for detecting and analyzing specifically shaped geological or other complex features using high-spatial-resolution imagery.
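The GLCM computation underlying the texture images can be sketched as follows for a single direction (offset); the tiny image and the choice of contrast as the derived parameter are illustrative, standing in for the directions, kernel sizes, and parameter types compared in the study.

```python
def glcm(image, dx, dy, levels):
    """Gray-level co-occurrence counts for the offset (dx, dy):
    m[i][j] counts how often level i has level j at that offset."""
    m = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[image[y][x]][image[ny][nx]] += 1
    return m

def contrast(m):
    """GLCM contrast: sum of (i - j)^2 weighted by the normalized counts.
    Uniform regions give 0; sharp gray-level transitions give large values."""
    total = sum(sum(row) for row in m)
    return sum(
        ((i - j) ** 2) * c / total
        for i, row in enumerate(m) for j, c in enumerate(row)
    )
```

Varying `(dx, dy)` reproduces the directional comparison in the study, and swapping `contrast` for other statistics (energy, homogeneity, entropy) reproduces the comparison of parameter types.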