• Title/Summary/Keyword: Stop Accuracy

One-stop Evaluation Protocol of Ischemic Heart Disease: Myocardial Fusion PET Study (허혈성 심장 질환의 One-stop Evaluation Protocol: Myocardial Fusion PET Study)

  • Kim, Kyong-Mok;Lee, Byung-Wook;Lee, Dong-Wook;Kim, Jeong-Su;Jang, Yeong-Do;Bang, Chan-Seok;Baek, Jong-Hun;Lee, In-Su
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.14 no.2
    • /
    • pp.33-37
    • /
    • 2010
  • Purpose: In the early stages of PET/CT, the CT component was used mainly for attenuation correction, but with MDCT it is now also widely used for anatomical diagnosis. Our hospital improves the accuracy and convenience of diagnosing and evaluating coronary heart disease by running a myocardial perfusion SPECT examination, an FDG myocardial PET examination, and coronary CT angiography (coronary CTA) in a single session on a 64-slice PET/CT. This report presents the protocol and images based on about 400 coronary heart disease examinations performed since the 64-slice PET/CT was installed in July 2007. Materials and Methods: The equipment used was a Discovery VCT (DVCT, GE Healthcare), which combines a 64-slice CT with a PET scanner using BGO ($Bi_4Ge_3O_{12}$) scintillation crystals. To reduce patient waiting time and provide a rapid diagnosis and evaluation, myocardial perfusion SPECT with a pharmacologic stress test is performed first, immediately followed by myocardial FDG PET and coronary CTA without a break. The one-stop evaluation protocol for ischemic heart disease is as follows. 1) Myocardial perfusion SPECT with pharmacologic stress: the patient is injected with 10 mCi of $^{99m}Tc$-MIBI, avoids fatty food in preparation for the myocardial PET examination, drinks water with 100 mg of ursodeoxycholic acid, and the SPECT image is acquired one hour later. 2) Myocardial FDG PET: to lower blood fatty acid levels and increase FDG uptake, a modified oral glucose loading protocol is used, with insulin and Acipimox given according to the measured blood values. The patient is injected with 5 mCi of $^{18}F$-FDG to reduce radiation exposure, a gated image is acquired one hour later, and a delayed image is acquired when needed. 3) Coronary CTA: the most important points are controlling the heart rate and obtaining the patient's cooperation with breath-holding. To lower the heart rate below 65 beats per minute, the patient takes 50-200 mg of a beta blocker after consulting a physician, practices breath-holding, and then undergoes the examination. Immediately before the examination, isosorbide dinitrate is sprayed 3 to 5 times to relax and dilate the vessel walls, which improves depiction of the coronary anatomy. Because CT contrast is injected under high pressure during scanning, the patient practices sufficiently beforehand to avoid problems, and ECG-triggered X-ray tube current modulation is used to reduce radiation exposure. Results: Coronary artery stenosis is evaluated with coronary CTA, the correlation between stenosis and perfusion defects (culprit vessel identification) is studied by combining pharmacologic stress myocardial perfusion SPECT with coronary CTA, and the viability of infarcted or hibernating myocardium can be assessed with FDG PET. Conclusion: Because the lesion site, the severity, and the expected effect of treatment can be estimated, this examination helps set the direction of therapy (drug treatment, PCI, or CABG). In addition, it has the advantage that the entire series of examinations runs consecutively in a single visit and takes only about 3 hours. Therefore, this method is useful for one-stop evaluation of ischemic heart disease.

A Study on Development of Patent Information Retrieval Using Textmining (텍스트 마이닝을 이용한 특허정보검색 개발에 관한 연구)

  • Go, Gwang-Su;Jung, Won-Kyo;Shin, Young-Geun;Park, Sang-Sung;Jang, Dong-Sik
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.12 no.8
    • /
    • pp.3677-3688
    • /
    • 2011
  • The patent information retrieval system can serve a variety of purposes. In general, patent information is retrieved using a limited set of keywords, so identifying earlier technology and priority rights requires repeated effort. This study proposes a content-based retrieval method using text mining. With the proposed algorithm, each document is assigned a characteristic value, and these values are used to compare the similarity between query documents and database documents. The text analysis consists of three steps: stop-word removal, keyword analysis, and weight calculation. In the experiments, conventional retrieval and the proposed algorithm were compared using accuracy measurements. Because the result documents are ranked by similarity to the query document, users can work more efficiently by reviewing the most similar documents first. In addition, since the full text of a patent document can be used as the query, users unfamiliar with searching can use the system easily and quickly. Using content-based rather than keyword-based retrieval also extends the scope of the search and reduces the amount of missing results.
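
    To illustrate the kind of pipeline this abstract describes (stop-word removal, keyword weighting, and similarity ranking against a query document), here is a minimal Python sketch. The TF-IDF weighting, the toy corpus, and the variable names are illustrative assumptions, not the authors' exact formulation.

    ```python
    # Minimal sketch of content-based retrieval: stop-word removal, keyword
    # weighting, and similarity ranking. Illustrative assumptions throughout.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    database_docs = [
        "A battery electrode with a porous carbon coating for fast charging.",
        "Method for wireless charging of electric vehicles using resonant coils.",
        "A text mining system that indexes patent claims for prior-art search.",
    ]
    query_doc = "Prior-art retrieval of patent documents using full-text mining."

    # Stop-word removal and keyword weighting in one step (English stop list here;
    # a Korean analyzer would be substituted for Korean patent text).
    vectorizer = TfidfVectorizer(stop_words="english")
    doc_matrix = vectorizer.fit_transform(database_docs)
    query_vec = vectorizer.transform([query_doc])

    # Rank database documents by similarity to the query document.
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    for rank, idx in enumerate(scores.argsort()[::-1], start=1):
        print(f"{rank}. score={scores[idx]:.3f}  {database_docs[idx][:60]}")
    ```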

A Study on the Utilization of Photoballoon System for Database Generation of Small Areas (소규모 지역의 자료기반 구축을 위한 Photoballoon 시스템의 활용에 관한 연구)

  • 이재기;조재호;최석근;이재동
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.11 no.2
    • /
    • pp.7-15
    • /
    • 1993
  • In order to build a database, topographic information must be obtained quickly and accurately according to the intended purpose. In general, topographic information can be acquired from maps, satellite images, stereo models from aerial photographs, and so on, but the optimal method must be chosen in consideration of the size of the target area, the spatial resolution of the imagery, the required accuracy, and the cost. This study therefore aims to establish an efficient method of building a topographic database for a small target area by using the spatial layering techniques of a geo-spatial information system and by acquiring geo-information and producing base maps with a photoballoon system; the test area was a zone planned for a collective village. As a result of this study, we determined the camera f-stop and shutter speed needed to obtain accurate stereo models, and through accuracy analyses for varying flight heights and air bases we were able to obtain stereo photography and topographic data for the small area quickly and economically using the photoballoon system. The resulting database can be used for efficient planning and design, for example by overlaying plan drawings on the existing map.

Detection Scheme of Heart and Respiration Signals for a Driver of Car with a Doppler Radar (도플러 레이더 기반 차량 운전자의 심박 및 호흡 신호 검출 기법 연구)

  • Yun, Younguk;Lee, Jeongpyo;Kim, Jinmyung;Kim, Youngok
    • Journal of the Society of Disaster Information
    • /
    • v.16 no.1
    • /
    • pp.87-95
    • /
    • 2020
  • Purpose: In this paper, we propose an algorithm for detecting the respiratory rate and heart rate of a car driver using Doppler radar, and we verify its feasibility through experiments. Method: We propose a weighted peak detection technique based on peak frequency values. Tests were performed in both a stopped state and a driving state, and the results were analyzed with the two proposed algorithms. Result: The results showed accuracies of more than 95% and 96% for respiratory rate and heart rate, respectively, in the stopped state, and more than 72% and 84% even in the driving experiments. Conclusion: The proposed vital-sign detection scheme can be used for driver safety as well as for the prevention of large-scale car accidents.
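
    The weighted peak detection idea can be sketched in the frequency domain as follows. The band limits, the synthetic chest-displacement signal, and the magnitude-weighted averaging are assumptions made for illustration and do not reproduce the paper's exact algorithm.

    ```python
    # Sketch: estimate respiration and heart rate from a Doppler-radar
    # displacement signal by magnitude-weighted peaks in two frequency bands.
    import numpy as np

    fs = 50.0                              # sampling rate [Hz] (assumed)
    t = np.arange(0, 60, 1 / fs)           # 60 s observation window
    # Synthetic displacement: respiration (~0.25 Hz) + heartbeat (~1.2 Hz) + noise
    x = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)
    x += 0.05 * np.random.randn(t.size)

    spec = np.abs(np.fft.rfft(x * np.hanning(x.size)))
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)

    def weighted_peak(f_lo, f_hi, top_k=3):
        """Average the top_k spectral peaks in a band, weighted by magnitude."""
        band = (freqs >= f_lo) & (freqs <= f_hi)
        f_band, m_band = freqs[band], spec[band]
        top = np.argsort(m_band)[-top_k:]
        return float(np.average(f_band[top], weights=m_band[top]))

    resp_hz = weighted_peak(0.1, 0.5)      # respiratory band (assumed limits)
    heart_hz = weighted_peak(0.8, 2.0)     # cardiac band (assumed limits)
    print(f"respiration ~ {resp_hz * 60:.1f} /min, heart ~ {heart_hz * 60:.1f} /min")
    ```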

A Study on Applicability of Machine Learning for Book Classification of Public Libraries: Focusing on Social Science and Arts (공공도서관 도서 분류를 위한 머신러닝 적용 가능성 연구 - 사회과학과 예술분야를 중심으로 -)

  • Kwak, Chul Wan
    • Journal of the Korean BIBLIA Society for library and Information Science
    • /
    • v.32 no.1
    • /
    • pp.133-150
    • /
    • 2021
  • The purpose of this study is to examine the applicability of machine learning to book classification in public libraries, using book titles as input. Data analysis was performed with Python's scikit-learn library in a Jupyter notebook on the Anaconda platform, and the KoNLPy Okt class was used for Korean morpheme analysis. The units of analysis were 2,000 title fields and their KDC main class numbers (300 and 600) extracted from the KORMARC records of public libraries. Analysis of the data with six machine learning models showed that applying machine learning to book classification is feasible; among the models used, the neural network model achieved the highest title classification accuracy. The study points to the need to improve title classification accuracy and calls for further research on book titles, title tokenization, and stop words.
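
    A minimal sketch of the described setup, Okt morpheme analysis feeding a scikit-learn neural-network classifier over title text, is shown below. It assumes KoNLPy (and its JVM dependency) is installed; the sample titles and labels are invented for illustration, whereas the study used 2,000 KORMARC title fields.

    ```python
    # Sketch: KDC class prediction from Hangul titles with Okt + scikit-learn.
    from konlpy.tag import Okt
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    okt = Okt()

    titles = ["현대 사회학의 이해", "경제학 원론", "서양 미술사 이야기", "한국 전통 음악의 이해"]
    labels = [300, 300, 600, 600]     # KDC main classes: 300 social science, 600 arts

    # Tokenize Hangul titles into morphemes before TF-IDF weighting.
    pipeline = make_pipeline(
        TfidfVectorizer(tokenizer=okt.morphs, token_pattern=None),
        MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
    )
    pipeline.fit(titles, labels)
    print(pipeline.predict(["한국 음악 이야기"]))   # a title sharing arts-like tokens
    ```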

An Ensemble Approach to Detect Fake News Spreaders on Twitter

  • Sarwar, Muhammad Nabeel;UlAmin, Riaz;Jabeen, Sidra
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.5
    • /
    • pp.294-302
    • /
    • 2022
  • Detection of fake news is a complex and challenging task. The generation of fake news is very hard to stop; only steps to control its circulation can help minimize its impact. Humans tend to believe misleading false information, which can mislead individuals and organizations and cause serious failures and financial losses. Researchers initially focused on categorizing content on social media sites as real or fake, and automatic detection of false information circulating on social media has become an emerging area of research, gaining the attention of both industry and academia since the 2016 US presidential elections. Fake news has severe negative effects on individuals and organizations, and its hostile effects extend to society at large, so timely prediction of fake news is important. This research focuses on the detection of fake news spreaders. In total, six models were developed, trained, and tested on the PAN 2020 dataset: four N-gram based models and a user-statistics based model, trained with different hyperparameter values, plus an ensemble model. An extensive grid search with cross-validation was applied to each machine learning model. For the N-gram based models, out of the many available algorithms, this research focused on those reported to perform well in the state-of-the-art literature: Random Forest, Logistic Regression, SVM, and XGBoost, each trained with cross-validated grid-search hyperparameters. The advantages of this research over previous work are the user-statistics based model and the ensemble learning model, which were designed to classify Twitter users as fake news spreaders or not with the highest reliability. The user-statistics model uses 17 features to categorize a Twitter user as malicious. A new dataset was then constructed from the predictions of the machine learning models, and three combination techniques, a simple mean, logistic regression, and random forest, were applied in the ensemble model. Logistic regression in the ensemble model gave the best training and testing results, achieving an accuracy of 72%.
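
    The two-stage structure described here, cross-validated grid search over n-gram models followed by a logistic-regression ensemble over their predictions, can be sketched as follows. The toy tweets, labels, and parameter grids are illustrative assumptions rather than the PAN 2020 configuration.

    ```python
    # Sketch: grid-searched n-gram base models, then an ensemble combiner.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline

    tweets = ["breaking shocking cure they hide", "local council meets tuesday",
              "share before they delete this", "weather service issues storm warning"] * 10
    labels = np.array([1, 0, 1, 0] * 10)   # 1 = fake-news spreader, 0 = not

    def tuned(model, grid):
        # Word/bigram TF-IDF features, hyperparameters chosen by cross-validated grid search.
        pipe = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), model)
        return GridSearchCV(pipe, grid, cv=5).fit(tweets, labels)

    svm = tuned(LinearSVC(), {"linearsvc__C": [0.1, 1, 10]})
    lr = tuned(LogisticRegression(max_iter=1000), {"logisticregression__C": [0.1, 1, 10]})

    # Stage 2: stack base-model predictions and learn an ensemble combiner.
    stacked = np.column_stack([svm.predict(tweets), lr.predict(tweets)])
    ensemble = LogisticRegression().fit(stacked, labels)
    print("ensemble training accuracy:", ensemble.score(stacked, labels))
    ```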

Simulation of Micro-SMES System using PSCAD/EMTDC (PSCAD/EMTDC를 이용한 Micro-SMES의 시뮬레이션)

  • Kim, Bong-Tae;Park, Min-Won;Seong, Ki-Chul;Yu, In-Keun
    • Proceedings of the KIEE Conference
    • /
    • 2002.07b
    • /
    • pp.1361-1363
    • /
    • 2002
  • Micro-SMES (Superconducting Magnetic Energy Storage) has been studied as an impulsive high-power supply for industrial applications. The reliability of electric power in Korea has recently improved, but problems remain with short-duration variations such as instantaneous and momentary interruptions and voltage sags caused by natural calamities (typhoon, lightning, snow, etc.). In addition, power quality degrades because of harmonics from power electronic equipment. Even a power quality disturbance lasting a few seconds can cause controller malfunctions, machinery stoppages, and the loss of important data in precision control systems. For these reasons, battery-based UPS systems have been used, but they have several disadvantages: long charge and discharge times, environmental problems from acid and heavy metals, and short lifetimes. Micro-SMES is an alternative that can settle the problems mentioned above. However, verifying system efficiency and stability while considering the size of the micro-SMES, the converter type, and various operating conditions would require large system apparatus. This paper presents a cost-effective simulation method for a micro-SMES and its power converter, and a design for a micro-SMES based system, using PSCAD/EMTDC.

Restoration of Landsat ETM+ SLC-off Gaps Using SPOT Image (SPOT 영상을 이용한 Landsat-7의 SLC-off 영상 복원)

  • Kim Hye-Jin;Yu Ki-Yun;Kim Yong-Il
    • Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry, and Cartography Conference
    • /
    • 2006.04a
    • /
    • pp.229-234
    • /
    • 2006
  • On May 31, 2003, Landsat 7 experienced an anomaly that caused the Scan Line Corrector (SLC) to stop functioning normally. With the SLC off, individual scan lines alternately overlap and leave large gaps toward the edges of the image. Many scientists with ongoing experience using ETM+ data have evaluated the scientific usability and validity of Landsat 7 products containing the SLC anomaly. The best reference scene for gap filling is another SLC-on Landsat scene with the same resolution, few surface changes, and a similar acquisition date, but the reception of Landsat imagery is not stable in Korea. A SPOT image can therefore be an alternative, since it is a steadily available multispectral satellite image like Landsat. In this study, we filled the SLC-off gaps of bands 2, 3, and 4 using a SPOT image and a local regression technique, and assigned optimal spectral values to the gaps of bands 1, 5, and 7 based on spectral adjacency. Through this process, we were able to restore the Landsat SLC-off image and evaluate the accuracy of the results.
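
    A minimal sketch of gap filling by local regression between a co-registered reference image (standing in for SPOT) and an SLC-off band is given below. The window size, the synthetic arrays, and the function name are assumptions for illustration, not the paper's exact formulation.

    ```python
    # Sketch: fill gap pixels by regressing the Landsat band on a reference
    # image over valid neighbours in a local window.
    import numpy as np

    def fill_gaps_local_regression(landsat, reference, gap_mask, win=7):
        """For each gap pixel, fit y = a*x + b on valid neighbours and predict."""
        filled = landsat.astype(float).copy()
        half = win // 2
        rows, cols = landsat.shape
        for r, c in zip(*np.nonzero(gap_mask)):
            r0, r1 = max(r - half, 0), min(r + half + 1, rows)
            c0, c1 = max(c - half, 0), min(c + half + 1, cols)
            valid = ~gap_mask[r0:r1, c0:c1]
            if valid.sum() < 3:
                continue                            # not enough neighbours to fit
            x = reference[r0:r1, c0:c1][valid].ravel().astype(float)
            y = landsat[r0:r1, c0:c1][valid].ravel().astype(float)
            slope, intercept = np.polyfit(x, y, 1)  # local linear model
            filled[r, c] = slope * reference[r, c] + intercept
        return filled

    # Toy example: a smooth scene with a simulated SLC-off gap stripe.
    ref = np.tile(np.linspace(50, 200, 50), (50, 1))
    lan = 0.8 * ref + 10
    mask = np.zeros_like(lan, dtype=bool)
    mask[:, 20:23] = True                           # simulated gap columns
    lan_gapped = np.where(mask, 0, lan)
    print(np.allclose(fill_gaps_local_regression(lan_gapped, ref, mask)[mask], lan[mask]))
    ```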

One-stop System Model of Port and Logistics using SCM (공급사슬망체계하에서의 수출입화물 원스톱서비스 시스템 개발에 관한 연구)

  • 박남규;김현수;조재형
    • Proceedings of the Korean Institute of Navigation and Port Research Conference
    • /
    • 2000.11a
    • /
    • pp.43-51
    • /
    • 2000
  • Since 1996, EDI-based document exchange has been used in the port and logistics industries to support declarations to the Pusan Port Authority and Customs. In spite of these efforts, users such as shipping companies, shipping agents, and freight forwarders have complained about the inconvenience of using EDI systems. The major reasons can be summarized as long transfer times, inconvenient EDI software, and problems with confirming message receipt. To solve these problems we developed an Internet-based EDI system for Port-MIS users, but its practical implementation failed because of the government's lack of readiness, the complexity of the system, and its separation from in-house systems. The authors therefore changed the direction of the research to applying the concept of SCM to the logistics system through XML/EDI. In this paper, a prototype system that integrates the processes of shipping companies, the Port Authority, Customs, and stevedoring companies is proposed. The new EDI method offers the advantages of accurate cargo data, integration of processes among firms, and continued cargo tracing services.

Weighted Latin Hypercube Sampling to Estimate Clearance-to-stop for Probabilistic Design of Seismically Isolated Structures in Nuclear Power Plants

  • Han, Minsoo;Hong, Kee-Jeung;Cho, Sung-Gook
    • Journal of the Earthquake Engineering Society of Korea
    • /
    • v.22 no.2
    • /
    • pp.63-75
    • /
    • 2018
  • This paper proposes an extension of Latin Hypercube Sampling (LHS) in which intervals with different probability areas are used, removing the need for intervals of equal probability. The method is called Weighted Latin Hypercube Sampling (WLHS). This paper describes the equations and the detailed procedure needed to apply a weight function in WLHS. WLHS is verified through numerical examples by comparing the estimated distribution parameters with those obtained from other methods such as Random Sampling and Latin Hypercube Sampling. WLHS provides a more flexible way of selecting samples than LHS, and the accuracy of its distribution parameter estimates depends on the choice of weight function. The proposed WLHS is applied to seismically isolated structures in nuclear power plants. In this application, the clearance-to-stops (CSs) calculated using the LHS procedure proposed by Huang et al. [1] and the WLHS proposed in this paper are compared to investigate the effect of the choice of sampling technique.
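
    As a rough illustration of the contrast between equal-probability strata (LHS) and strata whose probability areas follow a weight function, the following Python sketch draws weighted samples and keeps each stratum's probability as an estimation weight. The weight function, the normal target distribution, and the estimator are assumptions and do not reproduce the paper's WLHS equations.

    ```python
    # Sketch: LHS with equal-probability strata vs. a weighted variant whose
    # strata probabilities follow a weight function; samples carry stratum
    # probabilities as weights so the weighted mean stays unbiased.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 20

    def lhs_standard(n):
        # n strata of equal probability 1/n; one uniform draw per stratum.
        u = (np.arange(n) + rng.uniform(size=n)) / n
        return stats.norm.ppf(u), np.full(n, 1.0 / n)

    def lhs_weighted(n, weight=lambda p: 1.0 + 4.0 * p):
        # Strata probabilities proportional to a weight function on (0, 1),
        # concentrating samples where the weight is large (here, the upper tail).
        w = weight((np.arange(n) + 0.5) / n)
        prob = w / w.sum()
        edges = np.concatenate(([0.0], np.cumsum(prob)))
        u = edges[:-1] + rng.uniform(size=n) * prob   # one draw per stratum
        return stats.norm.ppf(u), prob

    for name, (x, p) in [("LHS", lhs_standard(n)), ("WLHS", lhs_weighted(n))]:
        mean = np.sum(p * x)                          # probability-weighted mean
        print(f"{name}: estimated mean = {mean:+.3f} (true value 0)")
    ```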