• Title/Summary/Keyword: Real-time A* Algorithm (실시간 A* 알고리즘)


Speed-up Techniques for High-Resolution Grid Data Processing in the Early Warning System for Agrometeorological Disaster (농업기상재해 조기경보시스템에서의 고해상도 격자형 자료의 처리 속도 향상 기법)

  • Park, J.H.;Shin, Y.S.;Kim, S.K.;Kang, W.S.;Han, Y.K.;Kim, J.H.;Kim, D.J.;Kim, S.O.;Shim, K.M.;Park, E.W.
    • Korean Journal of Agricultural and Forest Meteorology / v.19 no.3 / pp.153-163 / 2017
  • The objective of this study is to enhance the speed of the models that estimate weather variables (e.g., minimum/maximum temperature, sunshine hours, and PRISM (Parameter-elevation Regression on Independent Slopes Model) based precipitation) for the Agrometeorological Early Warning System (http://www.agmet.kr). The current weather estimation process runs on high-performance multi-core CPUs with 8 physical cores and 16 logical threads. Nonetheless, the server is not dedicated even to a single county, meaning that very high overhead is involved in calculating the 10 counties of the Seomjin River Basin. To reduce this overhead, several caching and parallelization techniques were applied, and their performance and applicability were measured. The results are as follows: (1) for simple calculations such as Growing Degree Days accumulation, the time required for input and output (I/O) is significantly greater than that for computation, so a technique that reduces disk I/O bottlenecks is needed; (2) when there are many I/O operations, it is advantageous to distribute them across several servers, but each server must keep its own cache of input data so that the servers do not compete for the same resource; and (3) GPU-based parallel processing is most suitable for models with large computation loads, such as PRISM.
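A minimal sketch of findings (1) and (2), assuming synthetic temperature grids: each input raster is cached after its first read so disk I/O is not repeated, and per-county work is spread across worker processes, each with its own in-process cache. File names, grid sizes, and the base temperature are illustrative, not the system's actual configuration.

```python
# Sketch, not the authors' implementation: cache grid reads, parallelize counties.
from functools import lru_cache
from multiprocessing import Pool

import numpy as np

@lru_cache(maxsize=128)
def load_grid(path):
    # Disk reads dominate simple per-cell calculations, so each raster is
    # loaded once per worker and then served from this in-process cache,
    # mirroring finding (2): every server keeps its own input cache.
    return np.load(path)

def growing_degree_days(paths, base_temp=10.0):
    # GDD accumulation: sum of max(Tmean - Tbase, 0) over the season.
    gdd = 0.0
    for p in paths:
        gdd = gdd + np.maximum(load_grid(p) - base_temp, 0.0)
    return gdd

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    paths = []
    for day in range(30):  # synthetic daily mean-temperature grids
        path = f"tmean_day{day:02d}.npy"
        np.save(path, rng.normal(15.0, 5.0, size=(100, 100)))
        paths.append(path)
    counties = [paths] * 10  # 10 counties sharing the same toy inputs
    with Pool(processes=4) as pool:  # spread the I/O-heavy work across workers
        results = pool.map(growing_degree_days, counties)
    print(results[0].mean())
```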

Reconstruction of Stereo MR Angiography Optimized to View Position and Distance using MIP (최대강도투사를 이용한 관찰 위치와 거리에 최적화 된 입체 자기공명 뇌 혈관영상 재구성)

  • Shin, Seok-Hyun;Hwang, Do-Sik
    • Investigative Magnetic Resonance Imaging / v.16 no.1 / pp.67-75 / 2012
  • Purpose: We studied an enhanced method for viewing brain vessels using Magnetic Resonance Angiography (MRA). Noting that the Maximum Intensity Projection (MIP) image is often used to evaluate the arteries of the neck and brain, we propose a new method for viewing brain vessels as a stereo image in 3D space that is more flexible and more accurate than the conventional method. Materials and Methods: We used a 3T Siemens Tim Trio MRI scanner with a 4-channel head coil and acquired 3D MRA brain data by fixing the volunteer's head and applying a Phase Contrast pulse sequence. The MRA brain data are rotated in 3D according to the view angle of each eye. The optimal view angle (projection angle) is determined by the distance between the eye and the center of the data. The rotated MRA data are projected along the projection line, and only the highest values are displayed. The left-view and right-view MIP images are then integrated using the anaglyph imaging method, yielding an optimal stereoscopic MIP image. Results: The resulting images show that the proposed method enables viewing the MIP image from any direction of the MRA data, which is impossible with the conventional method. Moreover, by considering disparity and the distance from the viewer to the center of the MRA data in spherical coordinates, we can obtain a more realistic stereo image. In short, we obtain stereoscopic images optimized for the position the viewer wants to see from and the distance between the viewer and the MRA data. Conclusion: The proposed method overcomes the limitation of the conventional method, which shows only a specific projected image (z-axis projection), and provides optimal depth information by converting the mono MIP image into a stereoscopic image that takes the viewer's position into account. It can display any view of the MRA data in spherical coordinates. If an optimization algorithm and parallel processing are applied, it may provide useful medical information for diagnosis and treatment planning in real time.
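A minimal sketch of the projection step, assuming a NumPy volume as a stand-in for Phase Contrast MRA data: the volume is rotated to each eye's view angle, projected by keeping the maximum intensity along the view axis, and the two views are fused into a red-cyan anaglyph. The fixed disparity angle is a placeholder for the paper's distance-dependent optimal angle.

```python
import numpy as np
from scipy.ndimage import rotate

def mip(volume, angle_deg):
    # Rotate the volume to the desired view angle, then keep only the
    # maximum intensity along the projection axis.
    rotated = rotate(volume, angle_deg, axes=(0, 2), reshape=False, order=1)
    return rotated.max(axis=0)

def anaglyph(volume, view_angle, disparity_deg=3.0):
    # Left and right eyes see the volume from slightly different angles;
    # in the paper, the disparity derives from the viewing distance.
    left = mip(volume, view_angle - disparity_deg / 2)
    right = mip(volume, view_angle + disparity_deg / 2)
    rgb = np.stack([left, right, right], axis=-1)  # red = left, cyan = right
    return (255 * rgb / rgb.max()).astype(np.uint8)

vol = np.random.rand(64, 64, 64)  # stand-in for a PC-MRA volume
image = anaglyph(vol, view_angle=30.0)
print(image.shape)
```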

Investigating the Performance of Bayesian-based Feature Selection and Classification Approach to Social Media Sentiment Analysis (소셜미디어 감성분석을 위한 베이지안 속성 선택과 분류에 대한 연구)

  • Chang Min Kang;Kyun Sun Eo;Kun Chang Lee
    • Information Systems Review / v.24 no.1 / pp.1-19 / 2022
  • Social media-based communication has become a crucial part of our personal and professional lives. It is therefore no surprise that social media sentiment analysis has emerged as an important way for all kinds of companies to detect potential customers' sentiment trends. However, social media sentiment analysis suffers from the huge number of sentiment features generated in the course of the analysis. This study therefore proposes a novel method based on a Bayesian Network, in which MBFS (Markov Blanket-based Feature Selection) is used to reduce the number of sentiment features. To show the validity of the proposed model, we used online review data from Yelp, a popular social media platform for evaluating and recommending restaurants, bars, and beauty salons. We used several benchmark feature selection methods, such as correlation-based feature selection, information gain, and gain ratio. A number of machine learning classifiers were also used for the validation tasks, such as TAN, NBN, Sons & Spouses BN (Bayesian Network), and Augmented Markov Blanket. Furthermore, we conducted Bayesian Network-based what-if analysis to see how the knowledge map between the target node and related explanatory nodes yields a meaningful glimpse into the sentiments underlying the target dataset.
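The MBFS step itself requires Bayesian-network structure learning and is not reproduced here; the sketch below instead illustrates one of the benchmark selectors the abstract names (information gain, i.e. mutual information) on synthetic stand-in features, with GaussianNB standing in for the NBN classifier.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))  # stand-in for the sentiment feature matrix
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Keep only the 20 features with the highest mutual information with the label.
selector = SelectKBest(mutual_info_classif, k=20)
X_sel = selector.fit_transform(X, y)

print(cross_val_score(GaussianNB(), X_sel, y, cv=5).mean())
```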

Development of Measuring Technique for Milk Composition by Using Visible-Near Infrared Spectroscopy (가시광선-근적외선 분광법을 이용한 유성분 측정 기술 개발)

  • Choi, Chang-Hyun;Yun, Hyun-Woong;Kim, Yong-Joo
    • Food Science and Preservation / v.19 no.1 / pp.95-103 / 2012
  • The objective of this study was to develop models for predicting the milk properties (fat, protein, SNF, lactose, and MUN) of unhomogenized milk using visible and near-infrared (NIR) spectroscopy. A total of 180 milk samples were collected from dairy farms. To determine the optimal measurement temperature, the milk samples were kept at three temperature levels (5°C, 20°C, and 40°C). A spectrophotometer was used to measure the reflectance spectra of the milk samples. Multilinear regression (MLR) models with a stepwise method were developed to select the optimal wavelengths. Preprocessing methods were used to minimize spectroscopic noise, and partial least squares (PLS) models were developed to predict the milk properties of the unhomogenized milk. The PLS results showed a good correlation between the predicted and measured milk properties of the samples at 40°C over 400~2,500 nm. The optimal wavelength range for fat and protein was 1,600~1,800 nm, and normalization improved the prediction performance. SNF and lactose were optimized at 1,600~1,900 nm, and MUN at 600~800 nm. The best preprocessing methods for SNF, lactose, and MUN were smoothing, MSC, and the second derivative, respectively. The correlation coefficients between the predicted and measured fat, protein, SNF, lactose, and MUN were 0.98, 0.90, 0.82, 0.75, and 0.61, respectively. The results indicate that the models can be used to assess milk quality.
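A minimal sketch of the PLS step, assuming reflectance spectra in X and a reference fat measurement in y; the synthetic data stand in for the 180 farm samples, and the component count is illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
wavelengths = np.arange(400, 2500, 10)        # nm, matching the 400~2,500 nm range
X = rng.normal(size=(180, wavelengths.size))  # stand-in reflectance spectra
y = X[:, 120:140].mean(axis=1) + rng.normal(scale=0.1, size=180)  # stand-in "fat"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)

# Report the correlation between predicted and measured values, as the paper does.
r = np.corrcoef(pls.predict(X_te).ravel(), y_te)[0, 1]
print(f"correlation between predicted and measured: {r:.2f}")
```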

An Analysis of the Research Trends for Urban Study using Topic Modeling (토픽모델링을 이용한 도시 분야 연구동향 분석)

  • Jang, Sun-Young;Jung, Seunghyun
    • Journal of the Korea Academia-Industrial cooperation Society / v.22 no.3 / pp.661-670 / 2021
  • Research trends can be used to determine the importance of research topics by period, identify under-researched fields, and discover new fields. In this study, research trends concerning urban spaces, where various problems are occurring due to population concentration and urbanization, were analyzed by topic modeling. The analysis targets were the abstracts of papers listed in the Korea Citation Index (KCI) published between 2002 and 2019. Topic modeling is an algorithm-based text mining technique that can discover recurring patterns across an entire corpus and lends itself to clustering. In this study, keyword frequencies, trends by year, derived topics, clusters by topic, and trends by topic type were analyzed. Research on urban regeneration is increasing continuously and was identified as a field whose detailed topics could expand further; urban regeneration is now becoming an established research field. On the other hand, topics related to development/growth and energy/environment have entered a period of stagnation. This study is meaningful in that the correlations and trends between keywords were analyzed using topic modeling across all domestic urban studies.
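A minimal sketch of the topic modeling step; the paper does not state its toolkit, so scikit-learn's LDA and a few invented abstracts stand in for the KCI corpus.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

abstracts = [
    "urban regeneration policy for declining city centers",
    "energy and environment in sustainable cities",
    "urban regeneration projects and community participation",
    "population growth development and urban planning",
]  # stand-ins for the KCI abstracts (2002-2019)

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(abstracts)

# Fit LDA and list the top words per topic, the basis for trend analysis.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
vocab = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [vocab[i] for i in weights.argsort()[-4:][::-1]]
    print(f"topic {k}: {top}")
```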

Compression Sensing Technique for Efficient Structural Health Monitoring - Focusing on Optimization of CAFB and Shaking Table Test Using Kobe Seismic Waveforms (효율적인 SHM을 위한 압축센싱 기술 - Kobe 지진파형을 이용한 CAFB의 최적화 및 지진응답실험 중심으로)

  • Heo, Gwang-Hee;Lee, Chin-Ok;Seo, Sang-Gu;Jeong, Yu-Seung;Jeon, Joon-Ryong
    • Journal of the Korea institute for structural maintenance and inspection / v.24 no.2 / pp.23-32 / 2020
  • The compression sensing technology CAFB was developed to compress the raw signal of a target structure into a signal of the intended frequency range. For compression sensing, CAFB can be optimized with various reference signals depending on the desired frequency range of the target structure, and the optimized CAFB should be able to efficiently compress the effective structural responses of the target structure even in sudden or dangerous conditions such as earthquakes. In this paper, the target frequency range for efficient structural health monitoring of relatively flexible structures was set below 10 Hz, and the corresponding optimization method for CAFB and its response performance under seismic conditions were evaluated experimentally. To this end, CAFB was first optimized using the Kobe seismic waveform and embedded in our own wireless IDAQ system. Seismic response tests were then conducted on a two-span bridge using the Kobe seismic waveform. Finally, using the IDAQ system with the built-in CAFB, the seismic response of the two-span bridge was obtained wirelessly, and the compressed signal was cross-checked against the raw signal. The experimental results show that the compressed signal exhibited excellent response performance and data compression relative to the raw signal, and that CAFB could effectively compress and sense the effective structural response of the structure even in seismic situations. Finally, this paper presents the optimization method that fits CAFB to the intended frequency range (below 10 Hz), and CAFB proved to be an economical and efficient data compression sensing technology for the instrumentation and monitoring of seismic conditions.
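The CAFB algorithm itself is not given in this abstract, so the sketch below only illustrates the underlying idea as described: restrict a raw acceleration record to the target band (below 10 Hz) and resample it, so far fewer samples need to be transmitted. The sampling rate and test signal are invented.

```python
import numpy as np
from scipy.signal import butter, decimate, sosfiltfilt

fs = 1000.0  # hypothetical sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
# Stand-in raw record: a 3 Hz structural response plus 60 Hz noise.
raw = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 60 * t)

# Keep only the band of interest (below 10 Hz), then downsample 10x.
sos = butter(4, 10.0, btype="low", fs=fs, output="sos")
band_limited = sosfiltfilt(sos, raw)
compressed = decimate(band_limited, q=10)

print(raw.size, "->", compressed.size, "samples to transmit")
```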

Evaluating Reverse Logistics Networks with Centralized Centers : Hybrid Genetic Algorithm Approach (집중형센터를 가진 역물류네트워크 평가 : 혼합형 유전알고리즘 접근법)

  • Yun, YoungSu
    • Journal of Intelligence and Information Systems / v.19 no.4 / pp.55-79 / 2013
  • In this paper, we propose a hybrid genetic algorithm (HGA) approach to effectively solve the reverse logistics network with centralized centers (RLNCC). The proposed HGA approach uses a genetic algorithm (GA) as its main algorithm. For implementing the GA, a new bit-string representation scheme using 0 and 1 values is suggested, which makes it easy to build the initial GA population. As genetic operators, the elitist strategy in enlarged sampling space developed by Gen and Chang (1997), a new two-point crossover operator, and a new random mutation operator are used for selection, crossover, and mutation, respectively. For the hybrid concept of the GA, the iterative hill climbing method (IHCM) developed by Michalewicz (1994) is inserted into the HGA search loop. The IHCM is a local search technique that precisely explores the space to which the GA search has converged. The RLNCC consists of collection centers, remanufacturing centers, redistribution centers, and secondary markets in reverse logistics networks. Of these, exactly one collection center, remanufacturing center, redistribution center, and secondary market should be opened in the network. Several assumptions are made for effectively implementing the RLNCC. The RLNCC is represented by a mixed integer programming (MIP) model using indexes, parameters, and decision variables. The objective function of the MIP model minimizes the total cost, which consists of transportation cost, fixed cost, and handling cost. The transportation cost arises from transporting the returned products between the centers and secondary markets. The fixed cost results from the opening or closing decision at each center and secondary market: for example, if there are three collection centers whose opening costs are 10.5, 12.1, and 8.9, respectively, and collection center 1 is opened while the others are closed, the fixed cost is 10.5. The handling cost is the cost of treating the products returned from customers at the centers and secondary markets opened at each RLNCC stage. The RLNCC is solved by the proposed HGA approach. In the numerical experiments, the proposed HGA is compared with a conventional competing approach using various performance measures. As the competitor, the GA approach of Yun (2013) is used; it has no local search technique such as the IHCM used in the HGA approach. CPU time, the optimal solution, and the optimal setting are used as performance measures. Two types of the RLNCC with different numbers of customers, collection centers, remanufacturing centers, redistribution centers, and secondary markets are presented for comparing the HGA and GA approaches. The MIP models for the two RLNCC types are programmed in Visual Basic 6.0 on an IBM-compatible PC with a 3.06 GHz CPU and 1 GB RAM running Windows XP. The parameters of the HGA and GA approaches are: 10,000 generations in total, a population size of 20, a crossover rate of 0.5, a mutation rate of 0.1, and a search range of 2.0 for the IHCM. A total of 20 runs are made to eliminate the randomness of the HGA and GA searches.
With performance comparisons, network representations by opening/closing decisions, and convergence processes for the two RLNCC types, the experimental results show that the HGA performs significantly better than the GA in terms of the optimal solution, although the GA is slightly faster in terms of CPU time. In summary, the proposed HGA approach is more efficient than the conventional GA approach on the two RLNCC types, since the former combines a GA search process with an additional local search process, while the latter relies on the GA search alone. For future study, much larger RLNCC instances will be tested to verify the robustness of our approach.
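A toy sketch of the hybrid loop the abstract describes: a 0/1 bit-string GA with elitist selection, crossover, and random mutation, whose best individual is refined each generation by an iterative hill-climbing pass (the IHCM role). The fitness is a toy opening-cost function using the example costs from the abstract, not the paper's MIP model, and one-point crossover stands in for the paper's two-point operator.

```python
import random

random.seed(0)
COSTS = [10.5, 12.1, 8.9]  # opening costs of collection centers 1-3 (abstract)

def fitness(bits):
    # Toy objective: total opening cost, with a penalty when the RLNCC
    # constraint "exactly one center open" is violated.
    return sum(c for b, c in zip(bits, COSTS) if b) + 100.0 * abs(sum(bits) - 1)

def hill_climb(bits):
    # Iterative hill climbing: flip each bit and keep any improvement,
    # refining the best individual found by the GA.
    best = bits[:]
    for i in range(len(best)):
        cand = best[:]
        cand[i] ^= 1
        if fitness(cand) < fitness(best):
            best = cand
    return best

pop = [[random.randint(0, 1) for _ in COSTS] for _ in range(20)]
for _ in range(100):
    pop.sort(key=fitness)                  # elitist selection (minimization)
    elite = hill_climb(pop[0])             # local search inside the GA loop
    a, b = random.sample(pop[:10], 2)
    cut = random.randrange(1, len(COSTS))  # one-point crossover (toy)
    child = a[:cut] + b[cut:]
    if random.random() < 0.1:              # random mutation, rate 0.1
        child[random.randrange(len(child))] ^= 1
    pop = [elite] + pop[1:-1] + [child]    # keep the population size at 20
pop.sort(key=fitness)
print(pop[0], fitness(pop[0]))             # expect [0, 0, 1] -> 8.9
```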

An accuracy analysis of Cyberknife tumor tracking radiotherapy according to unpredictable change of respiration (예측 불가능한 호흡 변화에 따른 사이버나이프 종양 추적 방사선 치료의 정확도 분석)

  • Seo, Jung Min;Lee, Chang Yeol;Huh, Hyun Do;Kim, Wan Sun
    • The Journal of Korean Society for Radiation Therapy / v.27 no.2 / pp.157-166 / 2015
  • Purpose: The CyberKnife tumor tracking system builds a correlation model between the real-time respiratory signal, obtained from LED markers attached to the patient's body surface, and the position of the tumor that moves with respiration; it predicts the tumor position in advance and synchronizes the treatment device with the moving tumor in real time. The purpose of this study is to evaluate the accuracy of the tumor tracking radiotherapy system under sudden, unpredictable changes in breathing patterns, such as those caused by coughing or sleep. Materials and Methods: Breathing log files from patients who had received respiratory-gated radiotherapy and CyberKnife tracking radiosurgery were used to reconstruct two respiratory patterns: a sinusoidal pattern and a sudden-change pattern. The reconstructed log files were fed into a CyberKnife dynamic chest phantom, whose driving apparatus we extended with a program so that arbitrary respiratory waveforms could be reproduced. The target inside the phantom (Ball-cube target) was driven in the superior-inferior direction with displacements of 5 mm, 10 mm, and 20 mm according to the respiration size. Two EBT3 films were inserted crosswise in the target, and the End-to-End (E2E) test provided by the CyberKnife manufacturer was carried out five times for each breathing pattern. The accuracy of the tumor tracking system is indicated by the target error obtained by analyzing the inserted films; in addition, the correlation error was measured while each E2E test was in progress. Results: For the sinusoidal breathing pattern, the target errors for target motions of 5 mm, 10 mm, and 20 mm averaged 1.14 ± 0.13 mm, 1.05 ± 0.20 mm, and 2.37 ± 0.17 mm, respectively; for the sudden-change pattern, they averaged 1.87 ± 0.19 mm, 2.15 ± 0.21 mm, and 2.44 ± 0.26 mm. The correlation error, defined as the length of the displacement vector of the tracked target, averaged 0.84 ± 0.01 mm, 0.70 ± 0.13 mm, and 1.63 ± 0.10 mm for the sinusoidal pattern, and 0.97 ± 0.06 mm, 1.44 ± 0.11 mm, and 1.98 ± 0.10 mm for the sudden-change pattern. In both breathing patterns, larger correlation errors corresponded to larger target errors. For sinusoidal motion of 20 mm or more, the error exceeded 1.5 mm, the value recommended by the CyberKnife manufacturer. Conclusion: The target error tends to increase with the magnitude of the target motion and with the correlation error, and the errors are larger under sudden changes in respiration than under sinusoidal breathing; the larger and less regular the breathing motion, the lower the accuracy of the tumor tracking system.
When a sudden, unpredictable change in respiration occurs during treatment, for example because the patient coughs, the treatment should be stopped, the internal target validation process carried out again, and the breathing pattern readjusted. Treatment accuracy could be further improved by guiding patients toward a regular breathing pattern, for example by providing goggles or a monitor through which the patient can observe his or her own respiratory waveform.
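A loose, entirely synthetic illustration of the correlation-model concept described above (not the vendor's algorithm): an external marker signal is mapped to internal target position by a fitted linear model, and a sudden change in the internal/external relationship inflates the tracking error until the model is rebuilt.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)
marker = np.sin(2 * np.pi * t / 4)  # external LED marker signal (a.u.)
tumor = 10.0 * marker + rng.normal(scale=0.3, size=t.size)  # SI motion, mm

coef = np.polyfit(marker, tumor, deg=1)  # build the correlation model
pred = np.polyval(coef, marker)
print("tracking error (mm):", round(np.abs(pred - tumor).mean(), 2))

# After a sudden breathing change, the internal/external relationship shifts
# (here the amplitude ratio jumps from 10 to 15 mm per unit of marker motion),
# so the stale model mispredicts until it is revalidated.
shifted_tumor = 15.0 * marker + rng.normal(scale=0.3, size=t.size)
print("error after change (mm):", round(np.abs(pred - shifted_tumor).mean(), 2))
```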

BVOCs Estimates Using MEGAN in South Korea: A Case Study of June in 2012 (MEGAN을 이용한 국내 BVOCs 배출량 산정: 2012년 6월 사례 연구)

  • Kim, Kyeongsu;Lee, Seung-Jae
    • Korean Journal of Agricultural and Forest Meteorology / v.24 no.1 / pp.48-61 / 2022
  • South Korea is a vegetation-rich country, with forests covering 63% of its area and cropland 16%. Massive NOx emissions from megacities therefore readily combine with BVOCs emitted from forest and cropland areas to produce high ozone concentrations. BVOCs emissions have been estimated using well-known emission models such as BEIS (Biogenic Emission Inventory System) and MEGAN (Model of Emissions of Gases and Aerosols from Nature), which were developed using non-Korean emission factors. In this study, we ran the MEGAN v2.1 model to estimate BVOCs emissions in Korea. The MODIS land cover and LAI (Leaf Area Index) products over Korea were used to run the MEGAN model for June 2012. Isoprene and monoterpene emissions from the model were compared against enclosure chamber measurements from the Taehwa research forest in Korea taken on June 11 and 12, 2012. The initial results show that isoprene emissions from the MEGAN model were up to 6.4 times higher than those from the enclosure chamber measurements, while monoterpene emissions from the chamber measurements were up to 5.6 times higher than the MEGAN emissions. The differences between the two datasets, however, were much smaller during periods of high emissions. More inter-comparison results and the possibilities of improving MEGAN's modeling performance using local measurement data over Korea will be presented and discussed.
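MEGAN's core is an emission factor scaled by activity factors; the sketch below shows the canonical Guenther-style light and temperature responses that MEGAN builds on, with coefficients from Guenther et al. (1993). It is an illustration of the emission algorithm's shape, not the MEGAN v2.1 code the authors ran, and the emission factors are hypothetical.

```python
import numpy as np

def isoprene_light_factor(par, alpha=0.0027, c_l1=1.066):
    # Saturating light response; PAR in umol m-2 s-1, ~1 at PAR = 1000.
    return alpha * c_l1 * par / np.sqrt(1.0 + alpha**2 * par**2)

def isoprene_temp_factor(t_k, c_t1=95000.0, c_t2=230000.0, t_m=314.0, t_s=303.0):
    # Temperature response: rises with T, then falls above the optimum.
    r = 8.314
    num = np.exp(c_t1 * (t_k - t_s) / (r * t_s * t_k))
    den = 1.0 + np.exp(c_t2 * (t_k - t_m) / (r * t_s * t_k))
    return num / den

def monoterpene_factor(t_k, beta=0.09, t_s=303.0):
    # Monoterpenes: simple exponential temperature dependence.
    return np.exp(beta * (t_k - t_s))

# Hypothetical emission factors (ug m-2 h-1) at standard conditions.
emis_iso = 10.0 * isoprene_light_factor(1000.0) * isoprene_temp_factor(298.15)
emis_mono = 1.5 * monoterpene_factor(298.15)
print(f"isoprene: {emis_iso:.2f}, monoterpenes: {emis_mono:.2f} (ug m-2 h-1)")
```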

Development of Intelligent Job Classification System based on Job Posting on Job Sites (구인구직사이트의 구인정보 기반 지능형 직무분류체계의 구축)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.123-139 / 2019
  • The job classification systems of major job sites differ from site to site and also differ from the job classification system of the SQF (Sectoral Qualifications Framework) proposed for the SW field. Therefore, a new job classification system that SW companies, SW job seekers, and job sites can all understand is needed. The purpose of this study is to establish a standard job classification system that reflects market demand by analyzing the SQF against the job posting information of major job sites and the NCS (National Competency Standards). To this end, an association analysis between the occupations of major job sites is conducted, and association rules between the SQF and occupations are derived. Using these association rules, we propose an intelligent job classification system based on mapping the job classification systems of major job sites to the SQF. First, major job sites were selected to obtain information on the job classification systems in the SW market. We then identified ways to collect job postings from each site and collected the data through open APIs. Focusing on the relationships between the data, only job postings published on multiple job sites at the same time were kept; the other postings were discarded. Next, the job classification systems of the job sites were mapped using the association rules derived from the association analysis. After completing the mapping between these market systems and discussing it with experts, the SQF was mapped as well, and a new job classification system was finally proposed. In the experiment, more than 30,000 job postings were collected in XML format using the open APIs of WORKNET, JOBKOREA, and saramin, the main job sites in Korea. After filtering to the roughly 900 postings published simultaneously on multiple job sites, 800 association rules were derived by applying the Apriori algorithm, a frequent pattern mining method. Based on these 800 rules, the job classification systems of WORKNET, JOBKOREA, and saramin and the SQF job classification system were mapped and organized into up to four classification levels. In the new job taxonomy, the first primary class, covering IT consulting, computer systems, networks, and security, consists of three secondary, five tertiary, and five quaternary classifications. The second primary class, covering databases and system operation, consists of three secondary, three tertiary, and four quaternary classifications. The third primary class, covering web planning, web programming, web design, and games, consists of four secondary, nine tertiary, and two quaternary classifications. The last primary class, covering ICT management and computer and communication engineering technology, consists of three secondary and six tertiary classifications. Notably, the new job classification system has a relatively flexible depth of classification, unlike the existing systems: WORKNET subdivides jobs down to a third level, while JOBKOREA and saramin subdivide jobs down to a second level with keyword-based subdivisions below it. The newly proposed standard job classification system accepts some keyword-based jobs and treats some product names as jobs.
In the proposed system, some jobs stop at the second classification level while others are subdivided down to the fourth level, reflecting the idea that not all jobs can be broken down into the same number of steps. The system also combines rules derived from the collected market data and association analysis with experts' opinions. The newly proposed job classification system can therefore be regarded as a data-based intelligent job classification system that reflects market demand, unlike the existing systems. This study is meaningful in that it suggests a new job classification system that reflects market demand by mapping between occupations based on data, through association analysis between occupations, rather than on the intuition of a few experts. However, this study has the limitation that it cannot fully reflect market demand as it changes over time, because the data were collected at a single point in time. As market demands change over time, including seasonal factors and the timing of major corporate recruitment, continuous data monitoring and repeated experiments are needed to achieve more accurate matching. The results of this study can be used to suggest directions for improving the SQF in the SW industry, and the approach is expected to transfer to other industries building on its success in the SW industry.
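A toy illustration of the Apriori-style step, assuming a handful of invented job labels as transactions: itemsets above a minimum support are enumerated and rules with their confidence are derived. A real Apriori implementation prunes candidates level-wise; this brute-force version is only for clarity.

```python
from itertools import combinations

# Each "transaction" is the set of job labels a posting carried across sites.
postings = [
    {"web programming", "web design"},
    {"web programming", "database"},
    {"security", "network", "system operation"},
    {"web programming", "web design", "game"},
]
n = len(postings)

def support(itemset):
    # Fraction of postings that contain every item in the itemset.
    return sum(itemset <= p for p in postings) / n

# Enumerate 1- and 2-itemsets and keep those above the support threshold.
items = sorted(set().union(*postings))
frequent = [frozenset(c) for size in (1, 2)
            for c in combinations(items, size) if support(set(c)) >= 0.5]

# Derive rules A -> B with confidence = support(A u B) / support(A).
for itemset in (f for f in frequent if len(f) == 2):
    a, b = sorted(itemset)
    conf = support(itemset) / support({a})
    print(f"{a} -> {b}: support={support(itemset):.2f}, confidence={conf:.2f}")
```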