• Title/Summary/Keyword: Processing speed


Matching Points Filtering Applied Panorama Image Processing Using SURF and RANSAC Algorithm (SURF와 RANSAC 알고리즘을 이용한 대응점 필터링 적용 파노라마 이미지 처리)

  • Kim, Jeongho;Kim, Daewon
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.51 no.4
    • /
    • pp.144-159
    • /
    • 2014
  • Techniques for creating a single panoramic image from multiple pictures are widely studied in areas such as computer vision and computer graphics. Panoramic images can be applied to fields like virtual reality and robot vision that require wide-angle shots, as a useful way to overcome the limitations in viewing angle, resolution, and internal information of an image taken with a single camera. A panoramic image is also meaningful in that it usually provides a better sense of immersion than a plain image. Although there are many ways to build a panoramic image, most of them extract feature points and matching points from each image, then use the RANSAC (RANdom SAmple Consensus) algorithm on the matching points together with a homography matrix to transform the images. The SURF (Speeded Up Robust Features) algorithm, used in this paper to extract feature points, relies on an image's grayscale information and local spatial information. SURF is widely used because it is robust to changes in image scale and viewpoint and is, additionally, faster than the SIFT (Scale Invariant Feature Transform) algorithm. However, SURF can produce erroneous matches when extracting feature points, which slows down the RANSAC algorithm and may increase CPU usage. Such matching errors are a critical cause of degraded accuracy and clarity in the resulting panoramic image. In this paper, to minimize matching-point errors, we use the RGB pixel values of the 3×3 region around each matching point's coordinates in an intermediate filtering step that removes wrong matches. We also present analysis and evaluation results on the improved processing speed for producing a panorama image, CPU usage, the reduction rate of extracted matching points, and accuracy.
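The intermediate filtering step described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold value and the mean-absolute-difference criterion are assumptions, and the patch comparison is done in plain NumPy rather than inside a SURF/RANSAC pipeline.

```python
import numpy as np

def filter_matches_by_color(img1, img2, pts1, pts2, threshold=30.0):
    """Keep only matches whose 3x3 RGB neighborhoods look similar.

    img1, img2: HxWx3 uint8 arrays; pts1, pts2: matched (x, y) coordinates.
    threshold: maximum mean absolute RGB difference (illustrative value).
    """
    kept = []
    for (x1, y1), (x2, y2) in zip(pts1, pts2):
        # Extract the 3x3 RGB patch centered on each matched coordinate.
        p1 = img1[y1 - 1:y1 + 2, x1 - 1:x1 + 2].astype(float)
        p2 = img2[y2 - 1:y2 + 2, x2 - 1:x2 + 2].astype(float)
        if p1.shape != (3, 3, 3) or p2.shape != (3, 3, 3):
            continue  # too close to the image border to compare
        # Mean absolute difference over the 27 RGB samples of the window.
        if np.abs(p1 - p2).mean() <= threshold:
            kept.append(((x1, y1), (x2, y2)))
    return kept
```

In a full pipeline this filter would run between SURF matching and RANSAC, so that fewer wrong correspondences reach the homography estimation.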

A Fluid Analysis Study on Centrifugal Pump Performance Improvement by Impeller Modification (원심펌프 회전차 Modification시 성능개선에 관한 유동해석 연구)

  • Lee, A-Yeong;Jang, Hyun-Jun;Lee, Jin-Woo;Cho, Won-Jeong
    • Journal of the Korean Institute of Gas
    • /
    • v.24 no.2
    • /
    • pp.1-8
    • /
    • 2020
  • A centrifugal pump transfers energy to a fluid through the centrifugal force generated by rotating an impeller at high speed. It is a major process facility at LNG production bases, used in vaporization seawater pumps and in industrial-water and fire-extinguishing pumps that use seawater. Pumps at LNG plant sites currently operate for long periods under conditions that vary with the supply demanded by customers. Pumps account for a large share of energy consumption at a plant site, and if optimal operating conditions are not maintained, enormous energy losses can accumulate over long-term operation. To solve this problem, performance-degradation factors must be identified through flow analysis and result analysis under fluctuating operating conditions, and the optimal operating efficiency determined. Evaluating operating efficiency experimentally incurs considerable time and cost, for example in reproducing on-site operating conditions and manufacturing test equipment. If a pump's performance does not suit the site and needs to be reduced, methods such as changing the rotation speed or handling a special liquid of high viscosity or solids content are used. In particular, to prevent disruptions in the operation of LNG production bases, a technology is required that satisfies the required performance conditions by machining the pump's existing impeller within a short time. Therefore, in this study, the modified 3D model of the pump impeller was analyzed with the ANSYS CFX program. In addition, the flow-analysis results were analyzed numerically with the curve fitting toolbox of the MATLAB program to verify the outer-diameter correction theory.
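The outer-diameter correction theory the study verifies rests on the classical impeller-trim affinity relations: at constant speed, flow scales with the diameter ratio, head with its square, and shaft power with its cube. A small sketch of those textbook relations (not the paper's CFD model) follows; variable names are illustrative.

```python
def trimmed_performance(q1, h1, p1, d1, d2):
    """Classical impeller-trimming affinity relations at constant speed.

    q1: flow rate, h1: head, p1: shaft power at original diameter d1.
    Returns the predicted (q2, h2, p2) after trimming the impeller to d2.
    """
    r = d2 / d1          # diameter ratio
    return q1 * r, h1 * r**2, p1 * r**3
```

Trimming a 400 mm impeller to 380 mm (ratio 0.95) thus predicts about 95% of the flow, 90% of the head, and 86% of the power; CFD analysis, as in the study, checks how far a real pump departs from these ideal ratios.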

Calculation of Dry Matter Yield Damage of Whole Crop Maize in Accordance with Abnormal Climate Using Machine Learning Model (기계학습 모델을 이용한 이상기상에 따른 사일리지용 옥수수 생산량 피해량)

  • Jo, Hyun Wook;Kim, Min Kyu;Kim, Ji Yung;Jo, Mu Hwan;Kim, Moonju;Lee, Su An;Kim, Kyeong Dae;Kim, Byong Wan;Sung, Kyung Il
    • Journal of The Korean Society of Grassland and Forage Science
    • /
    • v.41 no.4
    • /
    • pp.287-294
    • /
    • 2021
  • The objective of this study was to calculate the damage to whole crop maize under abnormal climate using a forage yield prediction model built through machine learning. The model was developed by collecting and processing whole crop maize and climate data with eight machine learning techniques, with Gyeonggi-do selected as the experimental area. Among these techniques, DeepCrossing showed the highest accuracy (R²=0.5442, RMSE=0.1769) and was used for the prediction model. The damage was calculated as the difference between the dry matter yields predicted under normal and abnormal climate. Under normal climate, the predicted dry matter yield varied by region within the range of 15,003~17,517 kg/ha. Under abnormal temperature, precipitation, and wind speed, the predicted dry matter yield differed by region and by the level of the abnormal climate, ranging from 14,947 to 17,571, 14,986 to 17,525, and 14,920 to 17,557 kg/ha, respectively. The corresponding damage was in the range of -68 to 89 kg/ha, -17 to 17 kg/ha, and -112 to 121 kg/ha, respectively, which could not be judged as meaningful damage. To calculate the damage of whole crop maize accurately, the amount of abnormal-climate data used in the forage yield prediction model needs to be increased.
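The damage calculation described above is simply the per-region difference between the two model predictions. A minimal sketch, with hypothetical region names and yields standing in for the model outputs:

```python
def yield_damage(pred_normal, pred_abnormal):
    """Damage (kg/ha) = predicted dry matter yield under normal climate
    minus that under abnormal climate, per region.

    Negative values mean the abnormal-climate prediction was higher,
    i.e. no loss can be inferred.
    """
    return {region: pred_normal[region] - pred_abnormal[region]
            for region in pred_normal}
```

As in the study, when the resulting ranges straddle zero (e.g. -68 to 89 kg/ha), the differences are within model noise and cannot be judged as damage.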

Real-time Color Recognition Based on Graphic Hardware Acceleration (그래픽 하드웨어 가속을 이용한 실시간 색상 인식)

  • Kim, Ku-Jin;Yoon, Ji-Young;Choi, Yoo-Joo
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.14 no.1
    • /
    • pp.1-12
    • /
    • 2008
  • In this paper, we present a real-time algorithm for recognizing vehicle color from indoor and outdoor vehicle images based on GPU (Graphics Processing Unit) acceleration. In the preprocessing step, we construct feature vectors from sample vehicle images of different colors. We then combine the feature vectors for each color and store them as a reference texture to be used in the GPU. Given an input vehicle image, the CPU constructs its feature vector, and the GPU compares it with the sample feature vectors in the reference texture. The similarities between the input feature vector and the sample feature vectors for each color are measured, and the result is transferred back to the CPU to recognize the vehicle color. The output is categorized into seven colors: three achromatic colors (black, silver, and white) and four chromatic colors (red, yellow, blue, and green). We construct feature vectors using histograms of hue-saturation pairs and hue-intensity pairs, with a weight factor applied to the saturation values. Our algorithm achieves a 94.67% color recognition rate by using a large number of sample images captured in various environments, generating feature vectors that distinguish different colors, and applying an appropriate likelihood function. We also accelerate color recognition by exploiting the parallel computation capability of the GPU. In the experiments, we constructed a reference texture from 7,168 sample images, with 1,024 images per color. The average time for generating a feature vector was 0.509 ms for a 150×113 resolution image. Once the feature vector is constructed, GPU-based color recognition takes 2.316 ms on average, which is 5.47 times faster than executing the algorithm on the CPU. Our experiments were limited to vehicle images, but the algorithm can be extended to input images of general objects.
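The hue-saturation histogram feature described in the abstract can be sketched roughly as below. The bin counts and the saturation weight are illustrative assumptions, not the paper's exact parameters, and this CPU-side NumPy version stands in for the GPU texture comparison.

```python
import numpy as np

def hs_histogram(hsv, h_bins=8, s_bins=4, s_weight=2.0):
    """Saturation-weighted hue-saturation histogram as a color feature.

    hsv: Nx3 array of (hue, saturation, value) components in [0, 1).
    Strongly saturated pixels vote with more weight, so chromatic
    colors dominate over near-gray pixels.
    """
    h_idx = np.minimum((hsv[:, 0] * h_bins).astype(int), h_bins - 1)
    s_idx = np.minimum((hsv[:, 1] * s_bins).astype(int), s_bins - 1)
    hist = np.zeros(h_bins * s_bins)
    # Each pixel votes into its (hue, saturation) cell with a
    # saturation-dependent weight.
    np.add.at(hist, h_idx * s_bins + s_idx, 1.0 + s_weight * hsv[:, 1])
    return hist / hist.sum()
```

Classification would then compare the input histogram against one stored reference histogram per color class, e.g. via a likelihood or similarity score evaluated in parallel on the GPU.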

Analysis and Evaluation of Frequent Pattern Mining Technique based on Landmark Window (랜드마크 윈도우 기반의 빈발 패턴 마이닝 기법의 분석 및 성능평가)

  • Pyun, Gwangbum;Yun, Unil
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.101-107
    • /
    • 2014
  • With the development of online services, databases have shifted from static structures to dynamic stream structures. Previous data mining techniques have served as decision-making tools for tasks such as establishing marketing strategies and DNA analysis; however, the ability to analyze real-time data quickly is required in areas of recent interest such as sensor networks, robotics, and artificial intelligence. Landmark window-based frequent pattern mining, one of the stream mining approaches, performs mining operations on parts of the database or on each transaction, instead of on all the data. In this paper, we analyze and evaluate two well-known landmark window-based frequent pattern mining algorithms, Lossy counting and hMiner. When Lossy counting mines frequent patterns from a set of new transactions, it performs union operations between the previous and current mining results. hMiner, a state-of-the-art algorithm based on the landmark window model, conducts mining operations whenever a new transaction occurs. Since hMiner extracts frequent patterns as soon as a new transaction is entered, its results reflect real-time information; for this reason, such algorithms are also called online mining approaches. We evaluate and compare the performance of the primitive algorithm, Lossy counting, and the latest one, hMiner. As performance criteria, we first consider total runtime and average processing time per transaction. To compare the efficiency of their storage structures, maximum memory usage is also evaluated. Lastly, we show how stably the two algorithms mine databases featuring gradually increasing numbers of items. In terms of mining time and transaction processing, hMiner is faster than Lossy counting: hMiner stores candidate frequent patterns in a hash structure and can access them directly, whereas Lossy counting stores them in a lattice and must traverse multiple nodes to reach a candidate pattern. On the other hand, hMiner performs worse than Lossy counting in terms of maximum memory usage. hMiner must keep complete information for each candidate frequent pattern in its hash buckets, while Lossy counting compresses that information through the lattice; since the lattice can share items that appear in multiple patterns, its memory usage is more efficient than hMiner's. However, hMiner shows better scalability than Lossy counting: as the number of items increases, fewer items are shared, weakening Lossy counting's memory efficiency, and as the number of transactions grows, its pruning effect worsens. From the experimental results, we conclude that landmark window-based frequent pattern mining algorithms are suitable for real-time systems even though they require a significant amount of memory. Their data structures therefore need to be made more efficient before they can also be deployed in resource-constrained environments such as WSNs (wireless sensor networks).
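For orientation, the classical Manku-Motwani Lossy counting scheme at the single-item level looks roughly like this (the surveyed algorithms extend the same idea to itemset patterns; this sketch is not the paper's implementation):

```python
def lossy_count(stream, epsilon):
    """Lossy counting over an item stream.

    Maintains {item: (count, max_error)}; any item's true frequency is
    undercounted by at most epsilon * N, where N is the stream length.
    """
    width = int(1 / epsilon)              # bucket width
    counts, n, bucket = {}, 0, 1
    for item in stream:
        n += 1
        if item in counts:
            f, delta = counts[item]
            counts[item] = (f + 1, delta)
        else:
            # delta bounds how many occurrences may have been missed
            # before this entry was created.
            counts[item] = (1, bucket - 1)
        if n % width == 0:                # bucket boundary: prune rare items
            counts = {k: (f, d) for k, (f, d) in counts.items()
                      if f + d > bucket}
            bucket += 1
    return counts
```

The periodic pruning is what keeps memory bounded; it is also the "pruning effect" that, per the evaluation above, weakens as the number of transactions grows.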

A Study on Brand Identity of TV Programs in the Digital Culture - Focusing on the comparative research of current issue programs, and development - (디지털 문화에서 TV 방송의 브랜드 아이덴티티 연구 -시사 교양 프로그램의 사례비교 및 개발을 중심으로-)

  • Jeong, Bong-Keum;Chang, Dong-Ryun
    • Archives of design research
    • /
    • v.18 no.4 s.62
    • /
    • pp.53-64
    • /
    • 2005
  • The emergence of digital, a new communication medium, is both a wonder and a source of cultural tension. Industrial technologies that dramatically expand human abilities are being developed much faster than humans can adapt to them. Without exception, they create new cultural contents and forms, shaking the very foundations of our notions about human beings. The Korean broadcasting environment has entered the era of multi-media and multi-channel as digital technology separated the media into network, cable, satellite, and internet. In this digital culture, broadcasting, as a medium of information delivery and communication, has a bigger influence than ever. These changes in the broadcasting environment have turned TV viewers into new consumers who participate and play the main role in active communication by choosing and using the media. This study attempts to systematize the question of the core identity of broadcasting through brand, now that consumers stand at the center of broadcasting with the power to select channels. Story schema theory can be applied as a cognitive-psychological tool for approaching active consumers, explaining the cognitive processes related to information processing; design with stories thus appears as a case of a brand's storytelling. The study covers current affairs and educational programs on network TV between May and August 2005, comparing Korean and foreign programs by the station on which each is broadcast. This study concludes that channel identity must be taken into consideration in the brand strategy of each program. In particular, a station's leading programs must not be treated as separate programs that have nothing to do with the station's identity; they must incorporate contents and forms that build the identity of the channel. The study also reconfirms that building the anchor person as a brand can be an important factor in the program's brand identity.


Modeling of Sensorineural Hearing Loss for the Evaluation of Digital Hearing Aid Algorithms (디지털 보청기 알고리즘 평가를 위한 감음신경성 난청의 모델링)

  • 김동욱;박영철
    • Journal of Biomedical Engineering Research
    • /
    • v.19 no.1
    • /
    • pp.59-68
    • /
    • 1998
  • Digital hearing aids offer many advantages over conventional analog hearing aids. With the advent of high-speed digital signal processing chips, new digital techniques have been introduced to digital hearing aids. However, evaluating new ideas in hearing aids necessarily involves intensive subject-based clinical tests, which demand much time and cost. In this paper, we present an objective method to evaluate and predict the performance of hearing aid systems without such subject-based tests. In the hearing impairment simulation (HIS) algorithm, a sensorineural hearing impairment model is established from auditory test data of the impaired subject being simulated. The nonlinear behavior of loudness recruitment is defined using hearing loss functions generated from the measurements. To transform natural input sound into its impaired counterpart, a frequency sampling filter is designed; the filter is continuously refreshed with the level-dependent frequency response function provided by the impairment model. To assess performance, the HIS algorithm was implemented in real time on a floating-point DSP. Signals processed by the real-time system were presented to normal-hearing subjects, and their auditory data as modified by the system were measured. The sensorineural hearing impairment was thus simulated and tested: hearing threshold and speech discrimination tests demonstrated the system's effectiveness for hearing impairment simulation. Using the HIS system, we evaluated three typical hearing aid algorithms.
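The loudness recruitment mentioned above can be illustrated with a toy input-output level function, purely as an assumption-laden sketch and not the paper's measured hearing loss functions: below the elevated threshold nothing is heard, and above it loudness grows abnormally fast, catching up with normal loudness near the uncomfortable level (UCL).

```python
def recruitment_level(level_db, hl_db, ucl_db=100.0):
    """Toy loudness-recruitment input-output curve (illustrative).

    level_db: input sound level; hl_db: hearing loss (threshold shift);
    ucl_db: uncomfortable loudness level, assumed equal for normal and
    impaired ears. Returns the simulated perceived level in dB.
    """
    if level_db <= hl_db:
        return 0.0                         # below the elevated threshold
    # Linear expansion from the elevated threshold up to the UCL, so that
    # perceived loudness reaches the normal UCL at the same input level.
    slope = ucl_db / (ucl_db - hl_db)
    return slope * (level_db - hl_db)
```

In the HIS framework, a per-frequency curve of this kind (derived from the subject's audiogram) drives the level-dependent frequency response that the filter is refreshed with.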


Increase in Anti-Oxidant Components and Reduction of Off-Flavors on Radish Leaf Extracts by Extrusion Process (압출성형 무청 분말 추출물의 항산화 물질 함량 증가 및 이취 감소)

  • Sung, Nak-Yun;Park, Woo-Young;Kim, Yi-Eun;Cho, Eun-Ji;Song, Hayeon;Jun, Hyeong-Kwang;Park, Jae-Nam;Kim, Mi-Hwan;Ryu, Gi-Hyung;Byun, Eui-Hong
    • Journal of the Korean Society of Food Science and Nutrition
    • /
    • v.45 no.12
    • /
    • pp.1769-1775
    • /
    • 2016
  • The aerial parts (leaves and stems) of radish are usually discarded because of the distinct undesirable flavors arising from inappropriate preparation, despite their many health benefits. In this study, we examined the role of extrusion processing in removing off-flavors and elevating antioxidant activity in radish (Raphanus sativus L.) leaves and stems. To optimize the extrusion conditions, we varied the barrel temperature (110, 120, and 130°C), screw speed (150, 200, 250, and 300 rpm), and moisture content (20, 25, and 30%). The polyphenol and flavonoid contents increased significantly in extruded radish leaves and stems (ER) under the optimum extrusion conditions (130°C, 250 rpm, and 20%). Under these conditions, we compared off-flavor levels (as the amount of sulfur-containing compounds) between ER and non-extruded radish leaves and stems (NER) using an electronic nose. A total of six peaks (sulfur-containing compounds) were detected in both ER and NER, with the ER showing reduced off-flavors. Levels of glucosinolates (μg/g), which can be hydrolyzed into off-flavor compounds during mastication or processing, were significantly decreased in the ER. These results show that extrusion processing can be an effective method to increase antioxidant activity and remove off-flavors from radish leaves and stems.

Evaluation of Reliability about Short TAT (Turn-Around Time) of Domestic Automation Equipment (Gamma Pro) (국산 자동화 장비(Gamma Pro)의 결과보고시간 단축에 대한 유용성 평가)

  • Oh, Yun-Jeong;Kim, Ji-Young;Seok, Jae-Dong
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.14 no.2
    • /
    • pp.197-202
    • /
    • 2010
  • Purpose: Recently, many hospitals have tried to increase outpatient satisfaction by completing blood collection, examination, result reporting, and processing within a single day. Laboratories use automated equipment to meet the demand for rapid result reporting and to increase reliability and efficiency. Current automated equipment is limited in shortening TAT (turn-around time) by restricted batch lists and its 1-tip/5-detector design. The Gamma Pro, a Korean-made automated system, was developed to improve on these shortcomings and complements existing equipment with the capacity to process a wide range of samples. In this study, we evaluated the usefulness and reliability of the shortened TAT by comparing the Gamma Pro with the current automated equipment. Materials and Methods: We studied the correlation between the Gamma Pro and the RIA-mat 280 using 100 specimens each of low and high concentration from patients for whom thyroid hormone tests (Total T3, TSH, and Free T4) were requested at Samsung Medical Center in September 2009. To evaluate the Gamma Pro, we first measured accuracy and carry-over for the tips. Second, the optimal incubation condition was determined by varying the RPM (revolutions per minute) and the revolution-axis diameter of the incubator. To analyze specimen-processing speed, TAT was investigated from the results reported within a given time. Results: The correlation coefficients (R²) between the Gamma Pro and the RIA-mat 280 showed good correlation: T3 (0.98), TSH (0.99), and FT4 (0.92). The coefficient of variation (CV) and accuracy were 0.38% and 98.3% for tip 1 and 0.39% and 98.6% for tip 2. Carry-over was 0.80% and 1.04% for tip 1 and tip 2, respectively, indicating that the tips caused no carry-over contamination. For the incubator, optimal incubation was obtained with a revolution-axis diameter of 1.0 mm or 1.5 mm at 500 RPM or 600 RPM. The Gamma Pro increased the number of runs to a maximum of 20 per day, compared with 6 per day on the current equipment, shortening the whole-process TAT from 4.20 hours to 2.19 hours. Conclusion: The Gamma Pro correlated well with the RIA-mat 280 and showed no carry-over contamination in the tips. The domestic automated equipment (Gamma Pro) decreases the whole-test TAT compared to the RIA-mat 280. These results demonstrate that the Gamma Pro offers good efficiency, reliability, and practical usefulness, and can contribute to processing large numbers of specimens.
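A common way to express tip carry-over as a percentage, consistent with the figures reported above, is the fraction of a high-concentration signal contaminating a subsequent low sample. The formula and the example numbers here are a generic illustration, not quoted from the paper:

```python
def carryover_percent(high_mean, low_after_high, low_baseline):
    """Carry-over (%): the share of the high signal that leaks into a
    low sample measured immediately after it.

    high_mean: mean of high-concentration replicates;
    low_after_high: low sample measured right after the high samples;
    low_baseline: low sample measured without preceding high samples.
    """
    return 100.0 * (low_after_high - low_baseline) / (high_mean - low_baseline)
```

Values well under the laboratory's acceptance limit (often around 1%) are taken, as in the study's 0.80% and 1.04% results, to mean the tips cause no meaningful contamination.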


Study on economic effects of outsourcing of food materials on the hotel kitchen - Focus on cooking Western food in the first class hotel - (식재료 아웃소싱이 경제적 주방에 미치는 영향에 관한 연구 - 특1급호텔 양식조리를 중심으로 -)

  • 성태종
    • Journal of Applied Tourism Food and Beverage Management and Research
    • /
    • v.13 no.2
    • /
    • pp.45-69
    • /
    • 2002
  • This study examines the feasibility and limitations of outsourcing in Western-food cooking in a hotel, interprets the importance of outsourcing (e.g., outside ordering, outside procurement, outside supply) in a broad sense as a way to reinforce the core capacity of the cooking department, and investigates whether cooking staff are used efficiently and how well chefs recognize the outsourcing of food materials. As many companies downsized through restructuring, the reduction in staff led Western-food cooking in hotels toward lower core capacities, lower quality, and lower efficiency, and the sagging morale of chefs undermined creativity. Changing from a traditional kitchen to an economic kitchen requires looking into the importance of outsourcing, the cognitive attitudes of chefs, and relations with outside suppliers. The findings are as follows. 1. Outsourcing companies are selected according to the variability of price conditions, the flexibility of contract conditions, and the popularity and reputation of the outsourcing company. 2. The importance of outsourcing in Western-food cooking divides into four factors: standards for selecting outsourcing companies, cooking-menu policies, cooking quality, and cooking quantity. 3. The most feasible area for outsourcing food materials is the dough-kneading process for bread; many Western-food chefs see high outsourcing potential in kneading because, for confectionery and bakery, many expert outside processing companies supply high-quality products. In order of outsourcing feasibility, sauce is followed by processed vegetables, main-dish garnishes, and soup. The least feasible area is the appetizer: an appetizer involves an element of improvisation and demands speed, and through its color, freshness, and sensory appeal it plays a key role in Western-food cooking. 4. The highest risk of outsourcing is the erosion of in-house cooking skills. Chefs in charge of Western food accordingly recognize internal problems (loss of accumulated cooking skills, instability from layoffs, loss of cooperation between departments) as well as external problems (inferior goods, difficulty in differentiating the menu, delivery delays, and expiration dates), and most consider the internal risks first. 5. Effective outsourcing requires appropriate selection of outsourcing companies, maintenance of credibility, active communication, and checking and managing hygiene. Regardless of position or career, Western-food chefs share the same cognitive attitude toward selecting successful outsourcing companies once the outsourcing system is in place. The core of cooking, the final stage of the full process of so-called artistic cooking, should remain insourced. Reducing several cooking processes shortens cooking time, increases efficiency, speeds service, cuts waiting lines, and ultimately makes more room for customers. The outsourcing system can reduce or eliminate the processes of buying, checking, storing, preparing, and processing various food materials. Especially in the Western-food cooking department of a hotel, outsourcing should be adopted to create and efficiently manage an economic kitchen. Now is the time to change from the traditional kitchen to an economic kitchen in hotel cooking departments. To that end, the cooking department should become a small but strong organization, outsourcing everything except its core work.
