• Title/Summary/Keyword: Evaluation Criteria and Techniques (평가기준 및 기법)

Effectiveness Assessment on Jaw-Tracking in Intensity Modulated Radiation Therapy and Volumetric Modulated Arc Therapy for Esophageal Cancer (식도암 세기조절방사선치료와 용적세기조절회전치료에 대한 Jaw-Tracking의 유용성 평가)

  • Oh, Hyeon Taek;Yoo, Soon Mi;Jeon, Soo Dong;Kim, Min Su;Song, Heung Kwon;Yoon, In Ha;Back, Geum Mun
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.31 no.1
    • /
    • pp.33-41
    • /
    • 2019
  • Purpose: To evaluate the effectiveness of the jaw-tracking (JT) technique in intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) for esophageal cancer by analyzing the volume dose to surrounding normal organs, with attention to the low-dose volume regions. Materials and Methods: A total of 27 patients who received radiation therapy for esophageal cancer on a VitalBeam™ (Varian Medical Systems, USA) in our hospital were selected. Using the Eclipse system (ver. 13.6, Varian, USA), radiation treatment plans were generated with the jaw-tracking technique (JT) and without it (NJT) for patients with a T-shaped planning target volume (PTV) including the supraclavicular lymph nodes (SCL). PTVs were classified by whether the celiac area was included, to identify its influence on the radiation field. To compare the treatment plans, the organs at risk (OAR) were defined as both lungs, the heart, and the spinal cord, and the conformity index (CI) and homogeneity index (HI) were evaluated. Portal dosimetry with an electronic portal imaging device (EPID) was performed to verify clinical applicability, and gamma analysis was performed with low-dose thresholds of 0 %, 5 %, and 10 % of the radiation field as a parameter. Results: All treatment plans achieved gamma pass rates of 95 % under the 3 mm/3 % criteria. For a threshold of 10 %, both JT and NJT passed at more than 95 %, and in IMRT both gamma passing rates decreased by more than 1 % as the low-dose threshold was lowered to 5 % and 0 %. For JT in IMRT with a PTV excluding the celiac area, V5 and V10 of both lungs decreased by 8.5 % and 5.3 % on average, respectively, and by up to 14.7 %. The mean dose (Dmean) decreased by 72.3±51 cGy, and the dose reduction was larger for PTVs including the celiac area. The Dmean of the heart decreased by 68.9±38.5 cGy and that of the spinal cord by 39.7±30 cGy. For JT in VMAT, V5 of the lungs decreased by 2.5 % on average, with small reductions in the heart and spinal cord; the dose reduction from JT in VMAT also increased when the PTV included the celiac area. Conclusion: In radiation treatment planning for esophageal cancer, IMRT showed a significant decrease in V5 and V10 of both lungs when JT was applied, and the dose reduction was greater when the irradiated low-dose region was larger. Therefore, IMRT benefits more from JT than VMAT for radiation therapy of esophageal cancer and can better protect the normal organs from MLC leakage and transmission doses in the low-dose region.
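
The gamma analysis described above scores agreement between two dose distributions under a 3 %/3 mm criterion while excluding points below a low-dose threshold (0 %, 5 %, or 10 % here). The NumPy sketch below shows this computation on a 2D grid; the brute-force window search, global normalization, and 1 mm grid spacing are illustrative assumptions, not the EPID vendor's implementation.

```python
import numpy as np

def gamma_pass_rate(ref, eva, spacing_mm=1.0, dose_tol=0.03,
                    dist_tol_mm=3.0, low_dose_threshold=0.10):
    """Brute-force 2D gamma analysis with global normalization.

    ref, eva : reference and evaluated dose arrays on the same grid
    (an assumption of this sketch). Points below
    low_dose_threshold * max dose are excluded, mirroring the
    0/5/10 % thresholds studied in the paper.
    """
    d_max = ref.max()
    ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]]
    window = int(np.ceil(2 * dist_tol_mm / spacing_mm))  # local search radius
    passes = []
    for i, j in zip(*np.nonzero(ref >= low_dose_threshold * d_max)):
        i0, i1 = max(0, i - window), min(ref.shape[0], i + window + 1)
        j0, j1 = max(0, j - window), min(ref.shape[1], j + window + 1)
        # dose difference term, normalized to 3 % of the global maximum
        dd = (eva[i0:i1, j0:j1] - ref[i, j]) / (dose_tol * d_max)
        # distance-to-agreement term, normalized to 3 mm
        dr = np.hypot(ys[i0:i1, j0:j1] - i,
                      xs[i0:i1, j0:j1] - j) * spacing_mm / dist_tol_mm
        passes.append(np.sqrt(dd ** 2 + dr ** 2).min() <= 1.0)
    return 100.0 * np.mean(passes)
```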

Probability-based Pre-fetching Method for Multi-level Abstracted Data in Web GIS (웹 지리정보시스템에서 다단계 추상화 데이터의 확률기반 프리페칭 기법)

  • 황병연;박연원;김유성
    • Spatial Information Research
    • /
    • v.11 no.3
    • /
    • pp.261-274
    • /
    • 2003
  • In Web GISs (geographic information systems), an effective probability-based tile pre-fetching algorithm and a collaborative cache replacement algorithm can reduce the response time for user requests by transferring, in advance, tiles that are likely to be used, and by choosing the tiles to evict from a client's limited cache space according to their future access probabilities. Web GISs keep multi-level abstracted data so that zoom-in and zoom-out queries can be answered quickly, but the previous pre-fetching algorithm works only on a two-dimensional pre-fetching space and does not consider the expanded space that multi-level abstracted data require. This thesis proposes a probability-based pre-fetching algorithm for multi-level abstracted data in Web GISs. The algorithm expands the previous two-dimensional pre-fetching space into a three-dimensional one, so that tiles of upper or lower levels can also be pre-fetched. The effect of the proposed algorithm was evaluated by simulation: the response time for user requests improved by 1.8-21.6 % on average. Consequently, in Web GISs with multi-level abstracted data, the proposed pre-fetching algorithm together with the collaborative cache replacement algorithm can substantially reduce the response time for user requests.
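
As a rough illustration of the idea, the Python sketch below keeps per-tile transition counts over a three-dimensional (level, x, y) tile space, so that zoom-in and zoom-out tiles compete with pan neighbors for prefetching; all class and method names are hypothetical, not taken from the thesis.

```python
from collections import defaultdict

class TilePrefetcher:
    """Probability-based prefetcher over a 3D (level, x, y) tile space.

    Access counts on the transitions out of each tile (the eight pan
    neighbors at the same level, plus the zoom-out parent and the four
    zoom-in children) estimate the probability that each candidate is
    requested next; the top-k candidates are prefetched.
    """
    def __init__(self, k=3):
        self.k = k
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def record_access(self, tile):          # tile = (level, x, y)
        if self.prev is not None:
            self.transitions[self.prev][tile] += 1
        self.prev = tile

    def candidates(self, tile):
        level, x, y = tile
        pans = [(level, x + dx, y + dy)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]
        zoom_out = [(level - 1, x // 2, y // 2)]
        zoom_in = [(level + 1, 2 * x + dx, 2 * y + dy)
                   for dx in (0, 1) for dy in (0, 1)]
        return pans + zoom_out + zoom_in

    def prefetch(self, tile):
        counts = self.transitions[tile]
        cands = self.candidates(tile)
        total = sum(counts.get(c, 0) for c in cands) or 1
        ranked = sorted(cands, key=lambda c: counts.get(c, 0) / total,
                        reverse=True)
        return ranked[:self.k]
```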

Error Analysis of Delivered Dose Reconstruction Using Cone-beam CT and MLC Log Data (콘빔 CT 및 MLC 로그데이터를 이용한 전달 선량 재구성 시 오차 분석)

  • Cheong, Kwang-Ho;Park, So-Ah;Kang, Sei-Kwon;Hwang, Tae-Jin;Lee, Me-Yeon;Kim, Kyoung-Joo;Bae, Hoon-Sik;Oh, Do-Hoon
    • Progress in Medical Physics
    • /
    • v.21 no.4
    • /
    • pp.332-339
    • /
    • 2010
  • We aimed to set up an adaptive radiation therapy platform using cone-beam CT (CBCT) and multileaf collimator (MLC) log data, and to analyze the trend of dose calculation errors in the procedure through a phantom study. We took CT and CBCT images of a Catphan-600 phantom (The Phantom Laboratory, USA) and made a simple step-and-shoot intensity-modulated radiation therapy (IMRT) plan based on the CT. The original plan doses were recalculated on the CT (CTplan) and the CBCT (CBCTplan). The delivered monitor-unit weights and leaf positions for each MLC segment were extracted from the MLC log data, and the delivered doses were reconstructed from this information on the CT (CTrecon) and the CBCT (CBCTrecon), respectively. Dose calculation errors were evaluated by two-dimensional dose discrepancies (with CTplan as the benchmark), the gamma index, and dose-volume histograms (DVHs). The dose differences and DVHs suggested that the delivered dose was slightly, but insignificantly, greater than the planned dose. The gamma index results showed that dose calculation errors on the CBCT, whether from planned or reconstructed data, were larger than for the CT-based calculations. There were also notable discrepancies at the edges of each beam, although these were smaller than the errors due to the inconsistency between CT and CBCT. CBCTrecon showed the coupled effect of these two kinds of errors; the total error nevertheless decreased, even though the overall uncertainty in evaluating the delivered dose on the CBCT increased. It is therefore necessary to evaluate dose calculation errors separately as the setup error, the dose calculation error due to CBCT image quality, and the reconstructed dose error, which is what we actually want to know.
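
Since the comparison between planned and log-reconstructed doses rests on DVHs, here is a minimal NumPy sketch of a cumulative DVH; the 3D dose grid in cGy and the boolean structure mask are assumed inputs, not the authors' data format.

```python
import numpy as np

def cumulative_dvh(dose, structure_mask, bin_width_cgy=10.0):
    """Cumulative DVH for one structure, as used to compare planned
    (CTplan/CBCTplan) against log-file-reconstructed doses.

    dose           : 3D dose grid in cGy (illustrative assumption)
    structure_mask : boolean array of the same shape selecting the organ
    """
    voxels = dose[structure_mask]
    bins = np.arange(0.0, voxels.max() + bin_width_cgy, bin_width_cgy)
    # fraction of the structure volume receiving at least each dose level
    volume_pct = np.array([(voxels >= d).mean() * 100.0 for d in bins])
    return bins, volume_pct
```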

Evaluation of Reference Intervals of Some Selected Chemistry Parameters using Bootstrap Technique in Dogs (Bootstrap 기법을 이용한 개의 혈청검사 일부 항목의 참고범위 평가)

  • Kim, Eu-Tteum;Pak, Son-Il
    • Journal of Veterinary Clinics
    • /
    • v.24 no.4
    • /
    • pp.509-513
    • /
    • 2007
  • Parametric and nonparametric methods coupled with a bootstrap simulation technique were used to re-evaluate previously defined reference intervals for serum chemistry parameters. A population-based study was performed on 100 clinically healthy dogs retrieved from the medical records of Kangwon National University Animal Hospital during 2005-2006. The data came from 52 males and 48 females (1 to 8 years old, 2.2-5.8 kg body weight). The chemistry parameters examined were blood urea nitrogen (BUN) (mg/dl), cholesterol (mg/dl), calcium (mg/dl), aspartate aminotransferase (AST) (U/L), alanine aminotransferase (ALT) (U/L), alkaline phosphatase (ALP) (U/L), and total protein (g/dl), measured with an Ektachem DT 60 analyzer (Johnson & Johnson). All distributions except calcium were highly skewed. Outliers were common, particularly in the enzyme parameters (5-9 % of the samples, versus only 1-2 % for the rest). Regardless of the distribution type of each analyte, the nonparametric methods gave better estimates for clinical chemistry use than the parametric methods. The means and reference intervals estimated by the nonparametric bootstrap for BUN, cholesterol, calcium, AST, ALT, ALP, and total protein were 14.7 (7.0-24.2), 227.3 (120.7-480.8), 10.9 (8.1-12.5), 25.4 (11.8-66.6), 25.5 (11.7-68.9), 87.7 (31.1-240.8), and 6.8 (5.6-8.2), respectively. This study indicates that bootstrap methods can be a useful statistical tool for establishing population-based reference intervals for serum chemistry parameters, since many laboratory values do not conform to a normal distribution. The results also emphasize the confidence intervals of the analytical parameters, which show distribution-related variation.
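
The nonparametric bootstrap described here resamples the measured values with replacement and takes the 2.5th and 97.5th percentiles of each resample. A minimal NumPy sketch follows, assuming a 1D array of one analyte's measurements; the number of resamples and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

def bootstrap_reference_interval(values, n_boot=2000, level=0.95):
    """Nonparametric bootstrap estimate of a population reference
    interval (the 2.5th-97.5th percentiles for level=0.95), as applied
    to the canine serum chemistry values in the paper.

    values : 1D array of one analyte's measurements, e.g. BUN in mg/dl.
    Returns (mean, lower bound, upper bound) averaged over resamples.
    """
    lo_pct = 100.0 * (1.0 - level) / 2.0
    hi_pct = 100.0 - lo_pct
    means, lows, highs = [], [], []
    for _ in range(n_boot):
        sample = rng.choice(values, size=len(values), replace=True)
        means.append(sample.mean())
        lows.append(np.percentile(sample, lo_pct))
        highs.append(np.percentile(sample, hi_pct))
    return np.mean(means), np.mean(lows), np.mean(highs)

# Usage with hypothetical data:
# bun = np.array([12.1, 15.3, 9.8, ...])
# mean, low, high = bootstrap_reference_interval(bun)
```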

Preliminary Results of 3-Dimensional Conformal Radiotherapy for Primary Unresectable Hepatocellular Carcinoma (절제 불가능한 원발성 간암의 입체조형 방사선치료의 초기 임상 결과)

  • Keum Ki Chang;Park Hee Chul;Seong Jinsil;Chang Sei Kyoung;Han Kwang Hyub;Chon Chae Yoon;Moon Young Myoung;Kim Gwi Eon;Suh Chang Ok
    • Radiation Oncology Journal
    • /
    • v.20 no.2
    • /
    • pp.123-129
    • /
    • 2002
  • Purpose: To determine the potential role of three-dimensional conformal radiotherapy (3D-CRT) in the treatment of primary unresectable hepatocellular carcinoma; preliminary results on the efficacy and toxicity of 3D-CRT are reported. Materials and Methods: Seventeen patients were enrolled in this prospective study conducted from January 1995 to June 1997. The exclusion criteria were extrahepatic metastasis, liver cirrhosis of Child-Pugh class C, tumors occupying more than two thirds of the entire liver, and a performance status of more than 3 on the ECOG scale. Two patients were treated with radiotherapy alone, while the remaining 15 also received transcatheter arterial chemoembolization. Radiotherapy was delivered with a 3D-CRT technique to a field covering the tumor plus a 1.5 cm margin. The radiation dose ranged from 36 to 60 Gy (median 59.4 Gy). Tumor response was assessed radiologically (CT scan, MR imaging, and hepatic artery angiography) 4-8 weeks after the completion of treatment, and the acute and subacute toxicities were monitored. Results: An objective response was observed in 11 of 17 patients, a response rate of 64.7 %. The actuarial survival rate at 2 years was 21.2 % from the start of radiotherapy (median survival 19 months). Six patients developed distant metastasis: lung metastasis in five and bone metastasis in one. The complications related to 3D-CRT were gastro-duodenitis (grade 2 or higher) in two patients; there were no treatment-related deaths and no radiation-induced hepatitis. Conclusion: These preliminary results show that 3D-CRT is a reliable and effective treatment modality for primary unresectable hepatocellular carcinoma compared with other conventional modalities. Further studies are needed to evaluate the definitive role of the 3D-CRT technique in the treatment of primary unresectable hepatocellular carcinoma.

The Research on Recommender for New Customers Using Collaborative Filtering and Social Network Analysis (협력필터링과 사회연결망을 이용한 신규고객 추천방법에 대한 연구)

  • Shin, Chang-Hoon;Lee, Ji-Won;Yang, Han-Na;Choi, Il Young
    • Journal of Intelligence and Information Systems
    • /
    • v.18 no.4
    • /
    • pp.19-42
    • /
    • 2012
  • Consumer consumption patterns are shifting rapidly as buyers migrate from offline markets to e-commerce channels such as TV shopping and internet shopping malls. In offline markets consumers go shopping, inspect the goods, and choose among them; increasingly, they prefer to buy at shopping sites free of constraints of time and place. However, as e-commerce markets expand, customers complain that shopping online is becoming a bigger hassle: shoppers have very limited information about the products, and the delivered product can differ from what they wanted, which often results in purchase cancellations. Because this happens frequently, consumers rely on other consumers' reviews, and companies should pay attention to this voice of the customer. E-commerce is an important marketing tool for suppliers: it can recommend products to customers and connect them directly with suppliers at the click of a button. Recommender systems have been studied in various forms, the more prominent being recommendation based on best-sellers and demographics, content filtering, and collaborative filtering. These systems share two weaknesses: they cannot recommend products at a personal level, and they cannot recommend products to new consumers with no buying history. One remedy is to collect demographics and preference ratings through questionnaires, but consumers find questionnaires burdensome and are unlikely to provide accurate information. This study investigates combining collaborative filtering with the centrality measures of social network analysis; centrality provides the information needed to infer the preferences of new consumers from the shopping histories of existing and previous ones. While past research focused on existing consumers with similar shopping patterns, this study tries to improve recommendation accuracy by using all shopping information, covering dissimilar as well as similar patterns. The data used is the MovieLens data set, built by the GroupLens Research Project at the University of Minnesota for collaborative-filtering movie recommendation, consisting of questionnaires from 943 respondents who gave preference ratings on 1,684 movies. The 100,000 ratings were ordered by time, with the first 50,000 treated as existing customers and the latter 50,000 as new customers. The proposed recommender consists of three systems: a [+] group recommender system, a [-] group recommender system, and an integrated recommender system. The [+] group system treats customers with similar buying patterns as 'neighbors', whereas the [-] group system treats customers with opposite buying patterns as 'contraries'. The integrated system recommends the movies that both of the other systems pick. Studying the three systems lets us find the recommender that best optimizes accuracy and customer satisfaction. Our analysis showed that the integrated recommender system is the best of the three, followed by the [-] group recommender system and the [+] group recommender system, which conforms to the intuition that recommendation accuracy improves when all the relevant information is used.
We provided contour maps and graphs to make the accuracy of each recommender system easy to compare. Although the integrated recommender system improved accuracy, this research is based on static data with no live customers; consumers did not actually see the movies recommended by the system. The system may also not work well for products other than movies, so recommendation systems need particular calibration for specific product and customer types.
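
A minimal sketch of the three-system design follows, assuming a users × items rating matrix with NaN for unrated items; the Pearson-correlation neighbor selection, the sign flip for 'contraries', and the set intersection for the integrated system are a plain reading of the abstract, not the authors' code.

```python
import numpy as np

def pearson(u, v):
    """Correlation over co-rated items; 0.0 when fewer than 2 overlap
    or when either vector is constant on the overlap."""
    mask = ~np.isnan(u) & ~np.isnan(v)
    if mask.sum() < 2 or u[mask].std() == 0 or v[mask].std() == 0:
        return 0.0
    return float(np.corrcoef(u[mask], v[mask])[0, 1])

def recommend(ratings, target, n_neighbors=5, n_top=10, mode="integrated"):
    """ratings: users x items matrix, NaN for unrated items.

    The [+] group system scores items by the mean rating of the most
    positively correlated users ('neighbors'); the [-] group system
    uses the most negatively correlated users ('contraries') and flips
    their scores; 'integrated' returns the items both systems pick.
    """
    sims = np.array([pearson(ratings[target], ratings[u])
                     for u in range(ratings.shape[0])])
    sims[target] = 0.0                 # exclude the target user
    order = np.argsort(sims)
    unseen = np.isnan(ratings[target])

    def top(users, flip):
        scores = np.nanmean(ratings[users], axis=0)
        if flip:
            scores = -scores           # contraries' dislikes become likes
        scores = np.where(np.isnan(scores) | ~unseen, -np.inf, scores)
        return set(np.argsort(scores)[::-1][:n_top].tolist())

    plus = top(order[-n_neighbors:], flip=False)   # [+] group
    minus = top(order[:n_neighbors], flip=True)    # [-] group
    return {"plus": plus, "minus": minus,
            "integrated": plus & minus}[mode]
```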

The Framework of Research Network and Performance Evaluation on Personal Information Security: Social Network Analysis Perspective (개인정보보호 분야의 연구자 네트워크와 성과 평가 프레임워크: 소셜 네트워크 분석을 중심으로)

  • Kim, Minsu;Choi, Jaewon;Kim, Hyun Jin
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.177-193
    • /
    • 2014
  • Over the past decade, the rapid diffusion of electronic commerce and the rising number of interconnected networks have escalated security threats and privacy concerns. Electronic commerce has a built-in trade-off between the necessity of providing at least some personal information to complete an online transaction and the risk of negative consequences from providing that information. More recently, frequent disclosures of private information have raised concerns about privacy and its impacts, motivating researchers in many fields to explore information privacy issues. Accordingly, the need for information privacy policies and for technologies for collecting and storing data has grown, as has information privacy research in fields such as medicine, computer science, business, and statistics. The occurrence of various information security incidents has made finding experts in the information security field an important issue, and objective measures for finding such experts are required, as the process is currently rather subjective. Based on social network analysis, this paper proposes a framework for evaluating the process of finding experts in the information security field. We collected data from the National Discovery for Science Leaders (NDSL) database, initially gathering about 2,000 papers covering 2005 to 2013; after dropping outliers and irrelevant papers, 784 papers remained for testing the suggested hypotheses. The co-authorship network data (co-author relationships, publisher, affiliation, and so on) were analyzed with social network measures including centrality and structural holes. The results of our model estimation are as follows. With the exception of Hypothesis 3, on the relationship between eigenvector centrality and performance, all hypotheses were supported. Degree centrality (H1) had a positive influence on researchers' publishing performance (p<0.001), indicating that publishing performance increased as the degree of cooperation increased. Closeness centrality (H2) was also positively associated with publishing performance (p<0.001), suggesting that performance increased with the efficiency of information acquisition. This paper identified differences in publishing performance among researchers; the analysis can be used to identify core experts and evaluate their performance in the information privacy research field. The co-authorship network for information privacy aids in understanding the deep relationships among researchers, and by extracting characteristics of publishers and affiliations, the paper shows how social network measures can help find experts in the field. Social concern about securing the objectivity of experts has increased, because experts in the information privacy field frequently take part in political consultation and in supporting and evaluating business education. In practical terms, this research suggests an objective framework for identifying experts in the information privacy field and is useful for those in charge of managing research human resources. The study has some limitations, which offer opportunities and suggestions for future research.
The small sample size makes it difficult to generalize differences in information diffusion according to media and proximity; further studies could increase the sample size and media diversity and explore in more detail how information diffusion differs with media type and information proximity. Moreover, previous network research has commonly assumed a causal relationship between the independent and dependent variables (Kadushin, 2012); here, degree centrality as an independent variable might have a causal relationship with performance as a dependent variable, but in network analysis research the network indices can only be computed after the network relationships have formed. An annual analysis could help mitigate this limitation.
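
For the centrality measures tested in H1-H3, the sketch below builds a small co-authorship graph with networkx and computes degree, closeness, and eigenvector centrality; the toy paper list is a stand-in for the cleaned NDSL records used in the study.

```python
import networkx as nx

# Build the co-authorship network: nodes are researchers, and an edge
# joins two researchers who co-authored a paper. `papers` is an
# illustrative stand-in for the 784 cleaned NDSL records.
papers = [["Kim", "Choi"], ["Kim", "Park", "Lee"], ["Choi", "Lee"]]

G = nx.Graph()
for authors in papers:
    for i, a in enumerate(authors):
        for b in authors[i + 1:]:
            G.add_edge(a, b)

# The three centralities tested as predictors of publishing performance
degree = nx.degree_centrality(G)            # H1: breadth of cooperation
closeness = nx.closeness_centrality(G)      # H2: efficiency of information access
eigenvector = nx.eigenvector_centrality(G)  # H3: ties to well-connected peers

for name in sorted(G.nodes):
    print(name, round(degree[name], 2), round(closeness[name], 2),
          round(eigenvector[name], 2))
```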

Development of 3D Impulse Calculation Technique for Falling Down of Trees (수목 도복의 3D 충격량 산출 기법 개발)

  • Kim, Chae-Won;Kim, Choong-Sik
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.51 no.2
    • /
    • pp.1-11
    • /
    • 2023
  • This study set out to develop a technique for quantitatively and three-dimensionally predicting the potential failure zone and the impulse that can occur when a tree falls. The main outcomes of this study are as follows. First, the study established the potential failure zone and an impulse calculation formula in order to quantify the risks generated when trees fall. The potential failure zone was calculated as 1.5 times the tree height, reflecting the likelihood of the tree both falling and slipping. For the fall direction, a 360° range centered on the root collar was used for trees that grow upright, and a 180° range from the inclined direction for trees that grow inclined. The angular momentum was calculated from the rotational motion about the root collar as the tree falls, and the impulse was obtained by converting it into linear momentum. Second, a program to calculate the potential failure zone and impulse was developed using Rhino3D and Grasshopper. Three-dimensional models of the topography, buildings, and trees were created in Rhino3D and connected to Grasshopper to construct the spatial information. In the risk calculation stage, the algorithm implements the calculation formula using the trees' growth information, such as height, inclination, and weight, and the surrounding environment, including adjacent trees, damage targets, and analysis ranges. In the risk inquiry stage, the results are summarized and visualized as a three-dimensional model; for instance, risk degrees are classified into different colors so that dangerous trees and dangerous areas can be identified efficiently.
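
The angular-to-linear momentum conversion can be illustrated as follows for a tree idealized as a uniform rigid rod pivoting about the root collar; the uniform-rod moment of inertia and the energy balance are added assumptions of this sketch, not the paper's exact formula.

```python
import math

def falling_tree_impulse(height_m, mass_kg, g=9.81):
    """Impulse estimate for a tree falling as a uniform rigid rod
    pivoting about the root collar (a simplified reading of the
    paper's angular-to-linear momentum conversion; the rod inertia
    I = m L^2 / 3 is an assumption of this sketch).

    Returns (failure_zone_radius_m, impulse_N_s).
    """
    L = height_m
    inertia = mass_kg * L ** 2 / 3.0
    # Energy balance from upright to the ground: m g L/2 = 1/2 I w^2
    omega = math.sqrt(mass_kg * g * L / inertia)    # = sqrt(3 g / L)
    v_cg = omega * L / 2.0                          # centre-of-mass speed at impact
    impulse = mass_kg * v_cg                        # linear momentum destroyed at impact
    failure_zone = 1.5 * L                          # paper's 1.5 x height rule
    return failure_zone, impulse

# e.g. a 10 m, 400 kg tree:
zone_m, impulse_Ns = falling_tree_impulse(10.0, 400.0)
```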

Deriving Key Risk Sub-Clauses which the Engineer of FIDIC Red Book Shall Agree or Determine according to Sub-Clause 3.7 -based on FIDIC Conditions of Contract for Construction, Second Edition 2017- (FIDIC Red Book의 Engineer가 합의 또는 결정해야할 핵심 리스크 세부조항 도출 -FIDIC Red Book 2017년 개정판 기준으로-)

  • Jei, Jae Yong;Hong, Seong Yeoll;Seo, Sung Chul;Park, Hyung Keun
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.43 no.2
    • /
    • pp.239-247
    • /
    • 2023
  • The FIDIC Red Book is an international standard condition of contract in which the Employer designs and the Contractor performs the construction. The Engineer under the FIDIC Red Book shall agree or determine any matter or Claim in accordance with Sub-Clause 3.7 neutrally, not as an agent of the Employer. This study used the Delphi method to derive the Key Risk Sub-Clauses among the 49 Sub-Clauses that the Engineer shall agree or determine under Sub-Clause 3.7 of the FIDIC Red Book, which was recently revised after 18 years. A panel of 35 experts, each with more than 10 years of experience and expertise in international construction contracts, was formed, and a total of three Delphi surveys were run to prevent errors and biases in the judgment process and improve reliability. As for the research method, the 49 Sub-Clauses subject to agreement or determination under Sub-Clause 3.7 were first identified through an analysis of the conditions of contract. To evaluate the probability and impact of the contractual risk of each of the 49 Sub-Clauses, the Delphi survey repeated a closed-type questionnaire three times on a 10-point Likert scale: the results of the first round were delivered to the panel during the second round, and the second-round results during the third, so that the re-evaluation converged toward expert consensus. The reliability of the third-round results was verified with the coefficient of variation (COV). Applying a PI Risk Matrix to the mean risk probability and impact of each of the 49 Sub-Clauses, nine Key Risk Sub-Clauses falling within the extreme risk range were finally derived.
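
The consensus check and PI-matrix screening might look like the sketch below, which takes one Sub-Clause's 10-point Likert ratings for probability and impact; the COV cut-off and the 'extreme' band are illustrative values, since the paper's exact thresholds are not given in the abstract.

```python
import numpy as np

def delphi_risk_screen(prob_scores, impact_scores, cov_cut=0.5):
    """Consensus check and PI Risk Matrix screening for one Sub-Clause.

    prob_scores / impact_scores : the expert panel's 10-point Likert
    ratings for risk probability and impact. Consensus is accepted
    when the coefficient of variation (COV = std/mean) of both rating
    sets falls below cov_cut, and the clause is flagged 'extreme'
    when the P x I product lands in the top band of a simple PI Risk
    Matrix. Both cut-offs are illustrative assumptions.
    """
    p = np.asarray(prob_scores, dtype=float)
    i = np.asarray(impact_scores, dtype=float)
    cov_p = p.std(ddof=1) / p.mean()
    cov_i = i.std(ddof=1) / i.mean()
    consensus = cov_p < cov_cut and cov_i < cov_cut
    score = p.mean() * i.mean()      # PI product on a 1-100 scale
    extreme = score >= 49.0          # e.g. mean P and mean I both >= 7
    return consensus, score, extreme
```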

Development of Automated Region of Interest for the Evaluation of Renal Scintigraphy : Study on the Inter-operator Variability (신장 핵의학 영상의 정량적 분석을 위한 관심영역 자동설정 기능 개발 및 사용자별 분석결과의 변화도 감소효과 분석)

  • 이형구;송주영;서태석;최보영;신경섭
    • Progress in Medical Physics
    • /
    • v.12 no.1
    • /
    • pp.41-50
    • /
    • 2001
  • The quantitative analysis of renal scintigraphy is strongly affected by the location, shape, and size of the regions of interest (ROIs). When ROIs are drawn manually, they are not reproducible, because of each operator's subjective point of view, and can lead to inconsistent results even when the same data are analyzed. This study investigated the effect of ROI variation on the analysis of renal scintigraphy when ROIs are drawn manually, developed methods for automated ROI definition to obtain more consistent results, and analyzed the results of applying the developed methods. The relative renal function, glomerular filtration rate, and mean transit time were selected as the clinical parameters for analyzing the ROI effect, and the analysis tools were built in the IDL 5.2 programming language. For the renal scintigraphy, 99mTc-DTPA was injected into 11 healthy adults, and nine operators performed the analyses to study the inter-operator variability. A threshold computed from pixel gradient values and a border-tracing technique were used to define the renal ROI, after which the background ROI and aorta ROI were defined automatically from anatomical information and pixel values. The automatic renal-ROI methods were classified into four groups according to how far they exclude the operator's subjectivity. Compared with the manual method, these automatic methods reduced the inter-operator variability remarkably and proved to be effective tools for obtaining reasonable and consistent results in the quantitative analysis of renal scintigraphy.
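
A present-day approximation of the automatic ROI step is sketched below in Python with scikit-image, where an Otsu threshold and a marching-squares contour stand in for the paper's gradient-based threshold and border tracing (the original tools were written in IDL 5.2).

```python
import numpy as np
from skimage import filters, measure

def kidney_roi(frame):
    """Automatic ROI on one renogram frame, loosely following the
    paper's threshold-plus-border-tracing approach.

    frame : 2D count array from the gamma camera (assumed input).
    Returns the longest detected contour as the kidney outline.
    """
    smoothed = filters.gaussian(frame.astype(float), sigma=1.0)
    level = filters.threshold_otsu(smoothed)          # stand-in for the
    contours = measure.find_contours(smoothed, level)  # gradient threshold
    # keep the longest closed border as the kidney outline
    return max(contours, key=len)
```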
