• Title/Summary/Keyword: Multi-tool


Quantitative Assessment Technology of Small Animal Myocardial Infarction PET Image Using Gaussian Mixture Model (다중가우시안혼합모델을 이용한 소동물 심근경색 PET 영상의 정량적 평가 기술)

  • Woo, Sang-Keun;Lee, Yong-Jin;Lee, Won-Ho;Kim, Min-Hwan;Park, Ji-Ae;Kim, Jin-Su;Kim, Jong-Guk;Kang, Joo-Hyun;Ji, Young-Hoon;Choi, Chang-Woon;Lim, Sang-Moo;Kim, Kyeong-Min
    • Progress in Medical Physics
    • /
    • v.22 no.1
    • /
    • pp.42-51
    • /
    • 2011
  • Nuclear medicine images (SPECT, PET) are widely used tools for the assessment of myocardial viability and perfusion. However, it is difficult to define the accurate myocardial infarct region. The purpose of this study was to investigate a methodological approach for automatic measurement of rat myocardial infarct size using a polar map with an adaptive threshold. A rat myocardial infarction model was induced by ligation of the left circumflex artery. PET images were obtained after intravenous injection of 37 MBq 18F-FDG. After a 60 min uptake period, each animal was scanned for 20 min with ECG gating. PET data were reconstructed using 2D ordered-subset expectation maximization (OSEM). To automatically delineate the myocardial contour and generate the polar map, we used QGS software (Cedars-Sinai Medical Center). The reference infarct size was defined as the infarcted percentage of the total left myocardium measured by TTC staining. We compared three threshold methods: a predefined threshold, Otsu's method, and a multi-Gaussian mixture model (MGMM). The predefined threshold method is commonly used in other studies; we applied threshold values from 10% to 90% in steps of 10%. Otsu's algorithm selects the threshold that maximizes the between-class variance. The MGMM method estimates the distribution of image intensity using multiple Gaussian mixture models (MGMM2, …, MGMM5) and calculates an adaptive threshold. The infarct size in the polar map was calculated as the percentage of the total polar map area falling below the threshold. The infarct size measured with each threshold method was evaluated by comparison with the reference infarct size. The mean differences between the polar map defect size at predefined thresholds (20%, 30%, and 40%) and the reference infarct size were 7.04±3.44%, 3.87±2.09%, and 2.15±2.07%, respectively; for Otsu's method the difference was 3.56±4.16%, and for the MGMM methods it was 2.29±1.94%. The predefined threshold (30%) showed the smallest mean difference from the reference infarct size. However, MGMM was more accurate than the predefined threshold for reference infarct sizes under 10% (MGMM: 0.006%, predefined threshold: 0.59%). In this study, we evaluated myocardial infarct size in the polar map using a multiple Gaussian mixture model. The MGMM method provides an adaptive threshold for each subject and will be useful for automatic measurement of infarct size.
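
The between-class-variance criterion this abstract attributes to Otsu's algorithm can be sketched in a few lines. This is an illustrative NumPy implementation on synthetic bimodal intensities, not the authors' code; the intensity values and the "fraction below threshold" infarct estimate are assumptions for the example.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Illustrative Otsu's method: pick the threshold that maximizes
    the between-class variance of a two-class split of the histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()            # probability per bin
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                              # weight of the lower class
    w1 = 1.0 - w0                                  # weight of the upper class
    mu = np.cumsum(p * centers)                    # cumulative first moment
    mu_total = mu[-1]
    valid = (w0 > 0) & (w1 > 0)
    # between-class variance: (mu_total*w0 - mu)^2 / (w0*w1)
    sigma_b = np.zeros_like(w0)
    sigma_b[valid] = (mu_total * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(sigma_b)]

# Synthetic "polar map" intensities: infarct near 0.2, viable myocardium near 0.8.
rng = np.random.default_rng(0)
vals = np.concatenate([rng.normal(0.2, 0.03, 500), rng.normal(0.8, 0.03, 500)])
t = otsu_threshold(vals)
infarct_pct = 100.0 * np.mean(vals < t)  # percentage of area below threshold
```

On well-separated bimodal data the threshold lands between the two modes, and the fraction below it plays the role of the polar map defect size described above.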

A Research Regarding the Application and Development of Web Contents Data in Home Economics (가정과 수업의 웹 콘텐츠 자료 활용 및 개발에 관한 연구)

  • Kim Mi-Suk;Wee Eun-Hah
    • Journal of Korean Home Economics Education Association
    • /
    • v.18 no.1 s.39
    • /
    • pp.49-64
    • /
    • 2006
  • The objective of this research is to examine the current status of the application and development of web contents data, and to suggest ways to improve their application and development in middle school home economics classes. The respondents were 312 middle school home economics teachers from all over the nation, and the instrument was a questionnaire consisting of 22 questions on the respondents' general status and their perceptions and demands regarding the application and development of web contents data. The major findings were as follows: 1) 88.5% of the sample responded that they accurately grasped the meaning of a class employing web contents data, as well as its effect on the preparation of professional study. 2) Most of the teachers were making good use of materials from the web in their classes and responded that doing so maximized the efficiency of students' learning. Some did not use web contents in their classes; the main reasons for low usage were that the classrooms were not properly equipped (43.2%) and that it took a long time to create web contents (37.8%). 3) The kinds of web contents data used most were presentations (48.4%), multimedia teaching materials (23.7%), and moving pictures (19.9%). 4) Teachers particularly wanted improvements in materials on family life and home, administration and environment of resources, and clothing preparation and administration. As for lessons, teachers wanted the development of lesson contents, motivational materials, and evaluations to be done by individual teachers or curriculum researchers' societies, while 30.8% preferred the Korea Education & Research Information Service (KERIS).


Water Quality and Ecosystem Health Assessments in Urban Stream Ecosystems (도심하천 생태계에서의 수질 및 생태건강성 평가)

  • Kim, Hyun-Mac;Lee, Jae-Hoon;An, Kwang-Guk
    • Korean Journal of Environmental Biology
    • /
    • v.26 no.4
    • /
    • pp.311-322
    • /
    • 2008
  • The objectives of this study were to analyze chemical water quality and physical habitat characteristics in two urban streams (Miho and Gap streams), along with evaluations of fish community structures and ecosystem health, through fish composition and guild analyses during 2006–2007. Concentrations of BOD and COD averaged 3.5 and 5.7 mg L⁻¹ in the urban streams, while TN and TP averaged 5.1 mg L⁻¹ and 274 μg L⁻¹, indicating a eutrophic state. Organic pollution and eutrophication were most intense in the downstream reaches of both streams. A total of 34 fish species were found in the two streams, and the most abundant species was Zacco platypus (32–42% of the total). In both streams, the relative abundance of sensitive species was low (23%), while tolerant species and omnivores were high (45% and 52%), indicating tolerance and trophic guilds typical of urban streams in Korea. According to multi-metric models of the Stream Ecosystem Health Assessment (SEHA), model values were 19 and 24 in Miho Stream and Gap Stream, respectively. Habitat analysis showed that QHEI (Qualitative Habitat Evaluation Index) values were 123 and 135 in the two streams, respectively. The minimum SEHA and QHEI values were observed in the downstream reaches of both streams, mainly attributable to chemical pollution, as shown by the water quality parameters. The SEHA model values were strongly correlated with conductivity (r=-0.530, p=0.016), BOD (r=-0.578, p<0.01), COD (r=-0.603, p<0.01), and nutrients (TN, TP: r>0.40, p<0.05). The model applied in this study appears to be a useful tool that reflects the chemical water quality of urban streams. Overall, this study suggests that consistent ecological monitoring is required in urban streams for their conservation, along with ecological restoration of the degraded downstream reaches.

GPU Based Feature Profile Simulation for Deep Contact Hole Etching in Fluorocarbon Plasma

  • Im, Yeon-Ho;Chang, Won-Seok;Choi, Kwang-Sung;Yu, Dong-Hun;Cho, Deog-Gyun;Yook, Yeong-Geun;Chun, Poo-Reum;Lee, Se-A;Kim, Jin-Tae;Kwon, Deuk-Chul;Yoon, Jung-Sik;Kim, Dae-Woong;You, Shin-Jae
    • Proceedings of the Korean Vacuum Society Conference
    • /
    • 2012.08a
    • /
    • pp.80-81
    • /
    • 2012
  • Recently, one of the critical issues in the etching processes of nanoscale devices is achieving an ultra-high aspect ratio contact (UHARC) profile without anomalous behaviors such as sidewall bowing and twisting. To achieve this goal, fluorocarbon plasmas, with their major advantage of sidewall passivation, have commonly been used with numerous additives to obtain ideal etch profiles. However, they still face formidable challenges, such as tight limits on sidewall bowing and control of randomly distorted features in nanoscale etch profiles. Furthermore, the absence of available plasma simulation tools has made it difficult to develop revolutionary technologies, including novel plasma chemistries and plasma sources, to overcome these process limitations. As an effort to address these issues, we performed fluorocarbon surface kinetic modeling based on experimental plasma diagnostic data for a silicon dioxide etching process under inductively coupled C4F6/Ar/O2 plasmas. For this work, the SiO2 etch rates were investigated with bulk plasma diagnostic tools such as a Langmuir probe, a cutoff probe, and a quadrupole mass spectrometer (QMS). The surface chemistries of the etched samples were measured by X-ray photoelectron spectroscopy. To measure plasma parameters, a self-cleaned RF Langmuir probe was used to cope with polymer deposition on the probe tip, and the results were double-checked with the cutoff probe, which is known to be a precise diagnostic tool for electron density measurement. In addition, neutral and ion fluxes from the bulk plasma were monitored with appearance methods using the QMS signal. Based on these experimental data, we propose a phenomenological, realistic two-layer surface reaction model of the SiO2 etch process under the overlying polymer passivation layer, considering the material balance of deposition and etching through a steady-state fluorocarbon layer. The predicted surface reaction modeling results showed good agreement with the experimental data. Building on these studies of the plasma surface reaction, we developed a 3D topography simulator using a multi-layer level set algorithm and a new memory-saving technique suitable for 3D UHARC etch simulation. Ballistic transport of neutral and ion species inside the feature profile was treated by deterministic and Monte Carlo methods, respectively. For ultra-high aspect ratio contact hole etching, it is well known that a huge computational burden is required to treat this ballistic transport realistically. To address this issue, the relevant computational codes were efficiently parallelized for GPU (graphics processing unit) computing, so that the total computation time was improved by more than a few hundred times compared to the serial version. Finally, the 3D topography simulator was integrated with the ballistic transport module and the etch reaction model, and realistic etch-profile simulations accounting for the sidewall polymer passivation layer were demonstrated.


Evaluate the implementation of Volumetric Modulated Arc Therapy QA in the radiation therapy treatment according to Various factors by using the Portal Dosimetry (용적변조회전 방사선치료에서 Portal Dosimetry를 이용한 선량평가의 재현성 분석)

  • Kim, Se Hyeon;Bae, Sun Myung;Seo, Dong Rin;Kang, Tae Young;Baek, Geum Mun
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.27 no.2
    • /
    • pp.167-174
    • /
    • 2015
  • Purpose: To analyze whether pre-treatment QA using Portal Dosimetry for volumetric modulated arc therapy (VMAT) maintains its reproducibility under various factors. Materials and Methods: A TrueBeam STx™ (Ver.1.5, Varian, USA) was used for the tests. The Varian Eclipse treatment planning system (TPS) was used to establish Portal Dosimetry QA plans for a total of seven patients, including head and neck cancer, lung cancer, prostate cancer, and cervical cancer cases. To measure these plans, the Portal Dosimetry application (Ver.10, Varian) and the Portal Vision aS1000 imager were used. QA measurement points were divided into before and after the morning treatments and after the afternoon treatments ended (after 4 hours). Calibration of the EPID (dark field correction, flood field correction, dose normalization) was performed before every QA measurement point. The MLC was initialized after each QA point and the QA was repeated. In addition, before the QA measurements, the beam output at each QA point was measured using a water phantom and an ionization chamber (IBA Dosimetry, Germany). Results: The mean gamma pass rates (GPR, 3%/3 mm) over all patients in the morning, afternoon, and evening were 97.3%, 96.1%, and 95.4%, and those of the patient showing the maximum difference were 95.7%, 94.2%, and 93.7%. The mean GPR values before and after EPID calibration were 95.94% and 96.01%. The mean beam output values were 100.45%, 100.46%, and 100.59% at the respective QA points. The mean GPR values before and after MLC initialization were 95.83% and 96.40%. Conclusion: Maintaining the reproducibility of Portal Dosimetry as a VMAT QA tool requires management of the various factors that can affect the dosimetry.


Development of a Climate Change Vulnerability Index on the Health Care Sector (기후변화 건강 취약성 평가지표 개발)

  • Shin, Hosung;Lee, Suehyung
    • Journal of Environmental Policy
    • /
    • v.13 no.1
    • /
    • pp.69-93
    • /
    • 2014
  • The aim of this research was to develop a climate change vulnerability index at the district level (Si, Gun, Gu) for the health care sector in Korea. The climate change vulnerability index was estimated based on the four major causes of climate-related illness: vectors, floods, heat waves, and air pollution/allergies. The vulnerability assessment framework consists of six layers, all based on the IPCC vulnerability concepts (exposure, sensitivity, and adaptive capacity) and the pathways of direct and indirect impacts of climate change modulators on health. We collected proxy variables based on the conceptual framework of climate change vulnerability. Data were standardized using the min-max normalization method. We applied analytic hierarchy process (AHP) weights and aggregated the variables using a non-compensatory multi-criteria approach. To verify the index, a sensitivity analysis was conducted using another aggregation method (a geometric transformation method, as applied to the index of multiple deprivation in the UK) and weights calculated by the budget allocation method. The results showed that the developed climate change vulnerability assessment index makes it possible to identify vulnerable areas. The climate change vulnerability index could thus be a valuable tool in setting climate change adaptation policies in the health care sector.

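
The normalization-and-aggregation pipeline described in this abstract can be sketched as follows. The indicator names, district values, and weights are all hypothetical, and the weighted sum shown here is a simplification of the non-compensatory multi-criteria aggregation the study actually used.

```python
def min_max(values):
    """Min-max normalization: rescale an indicator to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical district-level proxy indicators (illustrative names only):
districts   = ["A", "B", "C", "D"]
heat_days   = [32, 25, 40, 28]           # exposure proxy
elderly_pct = [18.0, 12.5, 21.0, 15.0]   # sensitivity proxy
clinics_10k = [4.1, 6.3, 2.8, 5.0]       # adaptive-capacity proxy

# AHP-style weights (assumed values); adaptive capacity reduces vulnerability.
w_exp, w_sen, w_cap = 0.4, 0.35, 0.25
n_exp, n_sen, n_cap = min_max(heat_days), min_max(elderly_pct), min_max(clinics_10k)

vulnerability = {
    d: w_exp * e + w_sen * s - w_cap * c
    for d, e, s, c in zip(districts, n_exp, n_sen, n_cap)
}
most_vulnerable = max(vulnerability, key=vulnerability.get)
```

District C scores highest here because it combines the worst exposure and sensitivity with the weakest adaptive capacity, which is exactly the kind of ranking the index is meant to surface.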

Automatic Interpretation of F-18-FDG Brain PET Using Artificial Neural Network: Discrimination of Medial and Lateral Temporal Lobe Epilepsy (인공신경회로망을 이용한 뇌 F-18-FDG PET 자동 해석: 내.외측 측두엽간질의 감별)

  • Lee, Jae-Sung;Lee, Dong-Soo;Kim, Seok-Ki;Park, Kwang-Suk;Lee, Sang-Kun;Chung, June-Key;Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine
    • /
    • v.38 no.3
    • /
    • pp.233-240
    • /
    • 2004
  • Purpose: We developed a computer-aided classifier using an artificial neural network (ANN) to discriminate the cerebral metabolic patterns of medial and lateral temporal lobe epilepsy (TLE). Materials and Methods: We studied brain F-18-FDG PET images of 113 epilepsy patients surgically and pathologically proven to have medial TLE (left 41, right 42) or lateral TLE (left 14, right 16). PET images were spatially transformed onto a standard template and normalized to the mean counts of cortical regions. Asymmetry indices for 17 predefined regions mirrored across the hemispheric midline, and those for the medial and lateral temporal lobes, were used as input features for the ANN. The ANN classifier was composed of three independent multi-layered perceptrons (one for left/right lateralization and two for medial/lateral discrimination) and trained to interpret metabolic patterns and produce one of four diagnoses (left/right medial TLE or left/right lateral TLE). Eight randomly selected images from each group were used to train the ANN classifier, and the remaining 51 images were used as test sets. The accuracy of the ANN diagnosis was estimated by averaging the agreement rates of 50 independent trials and compared to that of nuclear medicine experts. Results: The accuracy in lateralization was 89% for the human experts and 90% for the ANN classifier. Overall accuracy in localization of epileptogenic zones by the ANN classifier was 69%, comparable to that of the human experts (72%). Conclusion: We conclude that the ANN classifier performed as well as human experts and could be a potentially useful supporting tool for the differential diagnosis of TLE.
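
The asymmetry-index features that feed the ANN can be sketched as below. The abstract does not give the exact formula, so a commonly used definition (left-right difference as a percentage of the mean regional count) is assumed here, and the regional counts are made up for illustration.

```python
def asymmetry_index(left, right):
    """Left-right asymmetry as a percentage of the mean regional count
    (a common definition; the paper's exact formula is not stated in the abstract)."""
    return 100.0 * (left - right) / ((left + right) / 2.0)

# Hypothetical normalized FDG counts for mirrored regions of one patient.
regions = {
    "medial_temporal":  (0.72, 0.95),  # (left, right): left hypometabolism
    "lateral_temporal": (0.91, 0.93),
}
features = {name: asymmetry_index(l, r) for name, (l, r) in regions.items()}
# A strongly negative medial-temporal index points to left medial hypometabolism,
# the sort of pattern the lateralization perceptron is trained on.
```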

Use of Human Serum Albumin Fusion Tags for Recombinant Protein Secretory Expression in the Methylotrophic Yeast Hansenula polymorpha (메탄올 자화효모 Hansenula polymorpha에서의 재조합 단백질 분비발현을 위한 인체 혈청 알부민 융합단편의 활용)

  • Song, Ji-Hye;Hwang, Dong Hyeon;Oh, Doo-Byoung;Rhee, Sang Ki;Kwon, Ohsuk
    • Microbiology and Biotechnology Letters
    • /
    • v.41 no.1
    • /
    • pp.17-25
    • /
    • 2013
  • The thermotolerant methylotrophic yeast Hansenula polymorpha is an attractive model organism for various fundamental studies, such as the genetic control of enzymes involved in methanol metabolism, peroxisome biogenesis, nitrate assimilation, and resistance to heavy metals and oxidative stresses. In addition, H. polymorpha has been highlighted as a promising recombinant protein expression host, especially due to the availability of strong and tightly regulatable promoters. In this study, we investigated the possibility of employing human serum albumin (HSA) as a fusion tag for the secretory expression of heterologous proteins in H. polymorpha. A set of four expression cassettes was constructed, each containing the methanol oxidase (MOX) promoter, a translational HSA fusion tag, and the MOX terminator. The expression cassettes were also designed to contain sequences for accessory elements, including a His8-tag, 2×(Gly4Ser1) linkers, tobacco etch virus protease recognition sites (Tev), multi-cloning sites, and strep-tags. To determine the effect of HSA fusion tag size on the secretory expression of the target protein, each cassette contained an HSA gene fragment truncated at a specific position based on its domain structure. Using the green fluorescent protein gene as a reporter, the properties of each expression cassette were compared under various conditions. Our results suggest that the translational HSA fusion tag is an efficient tool for the secretory expression of recombinant proteins in H. polymorpha.

Monitoring and Risk Assessment of Pesticide Residues on Stalk and Stem Vegetables Marketed in Incheon Metropolitan Area (인천광역시 유통 엽경채류 농산물의 잔류농약 실태조사 및 위해성 평가)

  • Park, Byung-Kyu;Jung, Seung-Hye;Kwon, Sung-Hee;Ye, Eun-Young;Lee, Han-Jung;Seo, Soon-Jae;Joo, Kwang-Sig;Heo, Myung-Je
    • Journal of Food Hygiene and Safety
    • /
    • v.35 no.4
    • /
    • pp.365-374
    • /
    • 2020
  • This study was conducted to monitor residual pesticides on a total of 320 stalk and stem vegetables from January 2019 to December 2019 in the Incheon metropolitan area. Pesticide residues in the samples were analyzed by a multi-residue method covering 373 pesticides using GC-MS/MS, LC-MS/MS, GC-ECD, GC-NPD, and HPLC-UVD. A risk assessment was also carried out based on the amount of stalk and stem vegetables consumed. The linearity correlation coefficient of the calibration curves ranged from 0.9951 to 1.0000, the LOD from 0.002 to 0.022 mg/kg, and the LOQ from 0.005 to 0.066 mg/kg, and recovery was 82.0 to 108.0%. Pesticide residues were detected in 36 (11.3%) of the 320 samples, and 3 samples (0.9%) exceeded the maximum residue limits. The detection frequency for Chinese chives and Welsh onion was higher than that for other stalk and stem vegetables. The most frequently detected pesticides were etofenprox, procymidone, fludioxonil, and pendimethalin. As a risk assessment of the consumption of agricultural products with detectable pesticides, the ratio of estimated daily intake (EDI) to acceptable daily intake (ADI) was calculated to be in the range of 0.0062-24.1423%. These results indicate that there is no particular health risk from the consumption of commercial stalk and stem vegetables in which pesticide residues were detected.
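
The EDI/ADI ratio used in this risk assessment is simple arithmetic and can be sketched as follows. The residue level, daily intake, body weight, and ADI in the example are hypothetical values chosen only to show the calculation, not figures from the study.

```python
def percent_adi(residue_mg_kg, intake_kg_day, body_weight_kg, adi_mg_kg_bw):
    """Estimated daily intake (EDI, mg per kg body weight per day)
    expressed as a percentage of the acceptable daily intake (ADI)."""
    edi = residue_mg_kg * intake_kg_day / body_weight_kg
    return 100.0 * edi / adi_mg_kg_bw

# Hypothetical example: 0.05 mg/kg residue on a vegetable, 20 g daily intake,
# 60 kg adult, ADI of 0.01 mg/kg bw/day (all values assumed).
ratio = percent_adi(0.05, 0.020, 60.0, 0.01)
```

A ratio well below 100% of the ADI, as in this example and in the study's reported 0.0062-24.1423% range, is read as an acceptable dietary exposure.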

Computerized Multiple 15-hue tests for Quantifying Color Vision Acuity (색각 능력의 정량적 평가를 위한 전산화된 다중 15-색상 배열 검사법)

  • Ko S.T.;Hong S.C.;Choi M.J.
    • Journal of Biomedical Engineering Research
    • /
    • v.21 no.3 s.61
    • /
    • pp.321-331
    • /
    • 2000
  • Multiple 15-hue tests were designed and implemented on a PC in this study to quickly and quantitatively evaluate color vision acuity. The difficulty of each test was controlled by the value of the CDBACC (color difference between adjacent color chips), calculated using a CIELAB formula. The multiple 15-hue tests consist of eight hue tests (tests 3-10) and three basic color (red, green, blue) tests (tests 11-13). The 15 colors used for the hue tests were specified by 15 color coordinates located at a constant distance (d = 2, 3, 5, 7, 10, 20, 30, 40) from the white reference in the CIE chromaticity coordinate system and separated by a constant color difference (CDBACC = 0.75, 1.1, 1.8, 2.5, 3.5, 7.5, 11, 14) from the adjacent chips. The color coordinates of the 15 chips for the basic color tests were the same as those of 15 points spaced equally by a constant color difference (6.87 for the green color test, 7.27 for the red color test, 7.86 for the blue color test) from the white reference along the red, green, and blue axes. Thirty normal subjects who were not color blind underwent the multiple 15-hue tests. Most of the subjects correctly arranged the color chips for tests with CDBACC greater than 5, whereas no one answered correctly for those with CDBACC less than 2. Rapid changes in the number of subjects arranging correctly took place when the CDBACC of the test was between 2 and 4.5. In the basic color tests, unlike the hue tests with similar CDBACC values, the subjects arranged the color chips even less correctly. The JNCD (just noticeable color difference), a measure of color vision acuity, was found to be about 3 on average for the subjects. The JNCD was chosen as the CDBACC value of the test for which about 50% of the subjects failed to arrange the color chips successfully. The ERCCA (error rate of color chip arrangement) for the test with CDBACC equal to the JNCD was about 20%. It is expected that the multiple 15-hue tests implemented on a PC in this study will be an economical tool to quickly and quantitatively evaluate color vision acuity, and, accordingly, that the tests can be used for early screening of the many potential patients with diseases (e.g., diabetes, glaucoma) that may induce changes in color vision acuity.

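
The color difference between adjacent chips (CDBACC) that sets test difficulty can be computed with the CIE76 formula, ΔE*ab = √(ΔL*² + Δa*² + Δb*²). A minimal sketch, assuming the CIE76 variant (the abstract says only "a CIELAB formula") and using made-up chip coordinates:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference between two CIELAB points (L*, a*, b*)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Two hypothetical adjacent chips; a (0, 3, 4) offset gives Delta E = 5,
# above the ~3 average JNCD reported for normal subjects in the study.
chip_a = (60.0, 10.0, 20.0)
chip_b = (60.0, 13.0, 24.0)
cdbacc = delta_e76(chip_a, chip_b)
```

A test built from chips at this spacing should be solvable by most normal subjects, while spacings below the JNCD of about 3 approach the region where arrangement errors become common.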