• Title/Summary/Keyword: Performance Verification (성능 검증)


The Development of Quality Assurance Program for CyberKnife (사이버나이프의 품질관리 절차서 개발)

  • Jang, Ji-Sun;Kang, Young-Nam;Shin, Dong-Oh;Kim, Moon-Chan;Yoon, Sei-Chul;Choi, Ihl-Bohng;Kim, Mi-Sook;Cho, Chul-Koo;Yoo, Seong-Yul;Kwon, Soo-Il;Lee, Dong-Han
    • Radiation Oncology Journal
    • /
    • v.24 no.3
    • /
    • pp.185-191
    • /
    • 2006
• Purpose: A standardized quality assurance (QA) program for CyberKnife suited to circumstances in Korea has not been established. In this research, we developed a QA program for CyberKnife and evaluated its feasibility in application. Materials and Methods: Considering the composition of the CyberKnife system and its therapeutic methodology, a list of quality control (QC) items was established and divided according to the period of operation. All of the developed QC items were then categorized into three groups according to the purpose of QA: basic QC, delivery-specific QC, and patient-specific QC. To verify the validity of the established QA program, the QC lists were applied at two CyberKnife centers. The acceptance tolerances were based on the acceptance-inspection list from the CyberKnife manufacturer and the QC results of the two Korean CyberKnife centers over the preceding three years. The acquired measurements were evaluated to analyze the current QA status and to verify the propriety of the developed QA program. Results: The current QA status of the two CyberKnife centers was evaluated from the accuracy of all measurements made under the established QA program. Every measurement showed good agreement within the acceptance tolerance limits of the developed QA program. Conclusion: The QA program developed in this research can establish standardized QC methods for CyberKnife and confirm the accuracy and stability of image-guided stereotactic radiotherapy.
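
The grouping of QC items by purpose and inspection period lends itself to a simple checklist structure. The following is a minimal Python sketch of how such a QC list with acceptance tolerances might be organized and checked; the item name, grouping, and tolerance value are illustrative assumptions, not values from the paper.

```python
from dataclasses import dataclass

@dataclass
class QCItem:
    name: str         # QC test name
    group: str        # "basic", "delivery-specific", or "patient-specific"
    period: str       # inspection period, e.g. "daily", "monthly", "annual"
    tolerance: float  # acceptance tolerance, in the unit of the measurement

def within_tolerance(item: QCItem, measured: float, baseline: float) -> bool:
    """True if the measured deviation from baseline is within tolerance."""
    return abs(measured - baseline) <= item.tolerance

# Hypothetical item: an end-to-end targeting accuracy check with a 1 mm
# tolerance (a common order of magnitude for robotic radiosurgery QA).
e2e = QCItem("end-to-end targeting accuracy (mm)", "basic", "monthly", 1.0)
print(within_tolerance(e2e, measured=0.6, baseline=0.0))  # True
```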

Evaluating applicability of metal artifact reduction algorithm for head & neck radiation treatment planning CT (Metal artifact reduction algorithm의 두경부 CT에 대한 적용 가능성 평가)

  • Son, Sang Jun;Park, Jang Pil;Kim, Min Jeong;Yoo, Suk Hyun
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.26 no.1
    • /
    • pp.107-114
    • /
    • 2014
• Purpose: The purpose of this study is to evaluate the applicability of O-MAR (Metal Artifact Reduction for Orthopedic Implants, ver. 3.6.0, Philips, Netherlands) to head and neck radiation treatment planning CT containing metal artifacts created by dental implants. Materials and Methods: All CT images in this study were scanned on a Brilliance Big Bore CT (Philips, Netherlands) at 120 kVp with a 2 mm slice thickness, and metal artifacts were reduced by O-MAR. The original and reconstructed CT images were compared in the RTPS (Eclipse ver. 10.0.42, Varian, USA). To test the basic performance of O-MAR, a phantom was made to create metal artifacts from dental implants, and other phantoms were used to produce artifact-free images. To measure the difference in HU between images with and without artifacts, homogeneous and inhomogeneous phantoms were used with Cerrobend rods, and the HU differences were compared in ROIs. In addition, for one patient case, the original CT, the O-MAR-applied CT, and a density-corrected CT were evaluated for dose distribution with SNC Patient (Sun Nuclear Co., USA). Results: In the head and neck phantom case, the dose distributions of the original and O-MAR-applied CT images showed a 99.8% gamma passing rate (criteria 2 mm/2%); in the patient case, the rate was 98.5% among the original CT, the O-MAR CT, and the density-corrected CT. The difference in total dose distribution was less than 2% in both the phantom and patient studies. Although the dose deviations were small, it remains a matter of concern that they were concentrated locally. The quality of all O-MAR-applied images was improved. Unexpectedly, an increase in maximum HU was found in the air cavities of the O-MAR images compared with the original images, and incorrect corrections also appeared. Conclusion: In a case assumed to be unfavorable for O-MAR, with metal near the skin and low-density regions, image distortion and artifact correction appeared simultaneously. In the O-MAR CT, air-cavity regions were even converted to tissue HU by incorrect correction. Consequently, the O-MAR algorithm does not appear to distinguish perfectly between air cavities and photon-starvation artifacts. Nevertheless, the differences in HU and dose distribution are not large enough to make it unsuitable for clinical use, and it offers clinical advantages: improved quality of CT images and DRRs, more precise contouring of OARs and tumors, and correction of artifact regions. Therefore, the original and O-MAR CTs should be used together in the clinic for more accurate treatment planning.
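
The phantom portion of the comparison reduces to measuring mean HU inside matched ROIs of the images with and without artifact reduction. Below is a minimal sketch of that step, assuming the two CT slices are already loaded as NumPy arrays on the same grid; the arrays and ROI coordinates here are synthetic stand-ins, not the study's data.

```python
import numpy as np

def mean_hu(image: np.ndarray, roi: tuple[slice, slice]) -> float:
    """Mean Hounsfield value inside a rectangular ROI."""
    return float(image[roi].mean())

# Synthetic 512x512 slices in HU; in practice these would come from DICOM.
original = np.random.normal(40, 10, (512, 512))  # stand-in for artifact image
omar     = np.random.normal(42, 10, (512, 512))  # stand-in for O-MAR image

roi = (slice(240, 260), slice(240, 260))         # illustrative ROI
delta = mean_hu(omar, roi) - mean_hu(original, roi)
print(f"HU difference in ROI: {delta:.1f}")
```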

A Study on the Nutrition Contents and Blood Glucose Response Effect of Diabetic-Oriented Convenience Food prepared Medicinal Plants and Chicken (생약재와 닭고기를 이용하여 개발된 편의 당뇨식사의 영양성분 및 혈당반응)

  • 한종현;박성혜
    • Journal of the East Asian Society of Dietary Life
    • /
    • v.12 no.2
    • /
    • pp.91-99
    • /
    • 2002
• This study was carried out to develop a diabetic-oriented convenience food using seven medicinal plants (Schisandra chinensis, Coix lachryma-jobi, Dioscorea batatas, Ophiopogon japonicus, Lycium chinense, Houttuynia cordata, Polygonatum sibiricum) and chicken. The portion size was 310 g and the total energy was 551.6 kcal, with carbohydrate, lipid, and protein contributing 53.0%, 20.9%, and 26.1%, respectively. The calcium, zinc, and iron contents were 268.9 mg, 5.4 mg, and 6.1 mg, respectively, and the crude fiber content was 22.9 g. In sensory evaluation, the scores for taste, color, texture, and overall acceptability were higher than those of a normal diabetic meal. The hypoglycemic effect of the devised meal in diabetic subjects was excellent compared with that of a normal diabetic meal. These results indicate that the seven medicinal plants can be used as functional ingredients for the diabetic-oriented convenience food industry, and that the devised meal can also be used as a ready-prepared food for weight control.
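
As a quick arithmetic check, the reported energy fractions can be converted back to gram amounts with the standard Atwater factors (4 kcal/g for carbohydrate and protein, 9 kcal/g for fat); the factors are a textbook assumption, not stated in the paper.

```python
# Energy fractions reported in the abstract
total_kcal = 551.6
fractions = {"carbohydrate": 0.530, "lipid": 0.209, "protein": 0.261}
atwater = {"carbohydrate": 4, "lipid": 9, "protein": 4}  # kcal per gram

for nutrient, frac in fractions.items():
    kcal = total_kcal * frac
    grams = kcal / atwater[nutrient]
    print(f"{nutrient}: {kcal:.1f} kcal -> {grams:.1f} g")
```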


Cyclic Behavior of Wall-Slab Joints with Lap Splices of Coldly Straightened Re-bars and with Mechanical Splices (굽힌 후 편 철근의 겹침 이음 및 기계적 이음을 갖는 벽-슬래브 접합부의 반복하중에 대한 거동)

  • Chun, Sung-Chul;Lee, Jin-Gon;Ha, Tae-Hun
    • Journal of the Korea Concrete Institute
    • /
    • v.24 no.3
    • /
    • pp.275-283
    • /
    • 2012
• The Steel Plate for Rebar Connection was recently developed to splice rebars in delayed slab-wall joints in high-rise buildings, slurry wall-slab joints, temporary openings, etc. It consists of several couplers and a thin steel plate with a shear key. Cyclic loading tests on slab-wall joints were conducted to verify the structural behavior of joints having the Steel Plate for Rebar Connection. For comparison, joints with a Rebend Connection and joints without splices were also tested. The joints with the Steel Plate for Rebar Connection showed typical flexural behavior in the sequence of tension rebar yielding, sufficient flexural deformation, crushing of the compression concrete, and compression rebar buckling. However, the joints with the Rebend Connection developed more bond cracks on the slab faces and spalling of the side cover concrete, even though their elastic behavior was similar to that of the joints with the Steel Plate for Rebar Connection. Consequently, the joints with the Rebend Connection had lower strength and deformation capacity than the joints with the Steel Plate for Rebar Connection. In addition, the stiffness of the joints with the Rebend Connection degraded more rapidly than that of the other joints as cyclic loads were applied. This may be caused by the low elastic modulus of re-straightened rebars and the re-straightening of kinked bars. For two bar diameters (13 mm and 16 mm) and two grades (SD300 and SD400) of rebar, the joints with the Steel Plate for Rebar Connection had higher strength than the nominal strength calculated from actual material properties. On the contrary, the strength of the joints with the Rebend Connection decreased as the bar diameter increased and as the grade became higher. Therefore, the Rebend Connection should be used with caution in design and construction.
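
The nominal strengths mentioned above come from the usual rectangular-stress-block flexural model for reinforced concrete. A short sketch of that calculation follows, with purely illustrative section and material values, since the abstract does not give the specimens' dimensions.

```python
def nominal_moment(As, fy, fck, b, d):
    """Nominal flexural strength Mn of a singly reinforced section using
    the rectangular stress block: a = As*fy / (0.85*fck*b),
    Mn = As*fy*(d - a/2).  Inputs in N and mm; returns kN*m."""
    a = As * fy / (0.85 * fck * b)
    Mn = As * fy * (d - a / 2)
    return Mn / 1e6

# Illustrative values: five D13 bars (126.7 mm^2 each), SD400 (fy = 400 MPa),
# fck = 24 MPa, a 1000 mm slab strip, 150 mm effective depth.
print(f"Mn = {nominal_moment(5 * 126.7, 400, 24, 1000, 150):.1f} kN*m")
```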

Selection Model of System Trading Strategies using SVM (SVM을 이용한 시스템트레이딩전략의 선택모형)

  • Park, Sungcheol;Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.2
    • /
    • pp.59-71
    • /
    • 2014
• System trading has recently become more popular among Korean traders. System traders use automatic order systems based on system-generated buy and sell signals, which come from predetermined entry and exit rules coded by the traders. Most research on system trading has focused on designing profitable entry and exit rules using technical indicators. However, market conditions, strategy characteristics, and money management also influence the profitability of system trading. Unexpected price deviations from the predetermined trading rules can incur large losses, so most professional traders use strategy portfolios rather than a single strategy. Building a good strategy portfolio is important because trading performance depends on it; yet despite this importance, rule-of-thumb methods have been used to select trading strategies. In this study, we propose an SVM-based strategy portfolio management system. SVM, introduced by Vapnik, is known to be effective in the data mining area and can build good portfolios within a very short time. Since SVM minimizes structural risk, it is well suited to the futures market, in which prices do not move exactly as they did in the past. Our system trading strategies include a moving-average cross system, MACD cross system, trend-following system, buy-dips-and-sell-rallies system, DMI system, Keltner channel system, Bollinger Bands system, and Fibonacci system. These strategies are well known and frequently used by many professional traders. We programmed these strategies to generate automated entry and exit signals, and we propose an SVM-based strategy selection system together with a portfolio construction and order routing system. The strategy selection system is a portfolio training system: it generates training data and builds an SVM model from the optimal portfolio. We construct an $m \times n$ data matrix by dividing the KOSPI 200 index futures data into equal periods, derive the optimal strategy portfolio by analyzing the performance of each strategy, and generate the SVM model from these data and the optimal portfolio. We use 80% of the data for training and the remaining 20% for testing. For training, we select the two strategies that show the highest profit on the next day; selection method 1 always selects two strategies, while method 2 selects at most two strategies showing a profit of more than 0.1 point. We use the one-against-all method, which has a fast processing time. We analyze the daily data of KOSPI 200 index futures contracts from January 1990 to November 2011; price change rates over 50 days are used as SVM input. The training period runs from January 1990 to March 2007 and the test period from March 2007 to November 2011. We suggest three benchmark portfolios: BM1 holds two contracts of KOSPI 200 index futures over the testing period; BM2 consists of the two strategies with the largest cumulative profit during the 30 days before testing starts; BM3 consists of the two strategies with the best profits during the testing period, i.e., the best benchmark obtainable in hindsight. Trading costs include brokerage commission and slippage. The proposed strategy portfolio management system earns more than double the profit of the benchmark portfolios: after deducting trading costs, BM1 shows 103.44 points of profit, BM2 shows 488.61 points, and BM3 shows 502.41 points, whereas proposed system 1 shows 706.22 points and proposed system 2 shows 768.95 points. The equity curves for the entire period show a stable pattern. With higher profit and stability, the proposed system suggests a good trading direction for system traders, and adding a money management module could make the portfolios even more stable and profitable.
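
The heart of the selection system is a multi-class SVM trained on windows of price-change rates whose label is the best-performing strategy for the following day. A minimal sketch of that setup with scikit-learn's one-vs-rest (one-against-all) SVM is shown below; the features and labels are synthetic stand-ins for the KOSPI 200 futures data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in: each sample is 50 daily price-change rates, each label
# the index of the strategy that is most profitable on the next day.
X = rng.normal(0, 0.01, size=(2000, 50))
y = rng.integers(0, 8, size=2000)               # 8 candidate strategies

split = int(0.8 * len(X))                       # 80% train / 20% test, as in the paper
model = OneVsRestClassifier(SVC(kernel="rbf"))  # one-against-all SVM
model.fit(X[:split], y[:split])

picked = model.predict(X[split:])
print("strategies selected for first 5 test days:", picked[:5])
```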

The NCAM Land-Atmosphere Modeling Package (LAMP) Version 1: Implementation and Evaluation (국가농림기상센터 지면대기모델링패키지(NCAM-LAMP) 버전 1: 구축 및 평가)

  • Lee, Seung-Jae;Song, Jiae;Kim, Yu-Jung
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.18 no.4
    • /
    • pp.307-319
    • /
    • 2016
• A Land-Atmosphere Modeling Package (LAMP) for supporting agricultural and forest management was developed at the National Center for AgroMeteorology (NCAM). The package comprises two components: the Weather Research and Forecasting (WRF) modeling system coupled with the Noah-MultiParameterization options (Noah-MP) Land Surface Model (LSM), and an offline one-dimensional LSM. The objective of this paper is to briefly describe the two components of the NCAM-LAMP and to evaluate their initial performance. The coupled WRF/Noah-MP system is configured with a parent domain over East Asia and three nested domains with a finest horizontal grid size of 810 m. The innermost domain covers the two Gwangneung KoFlux sites, deciduous (GDK) and coniferous (GCK). The model is integrated for about 8 days with initial and boundary conditions taken from the National Centers for Environmental Prediction (NCEP) Final Analysis (FNL) data. The verification variables for the coupled system are 2-m air temperature, 10-m wind, 2-m humidity, and surface precipitation. Skill scores are calculated for each domain and for two dynamic vegetation options from the differences between observations from the Korea Meteorological Administration (KMA) and the simulations of the WRF/Noah-MP coupled system. The accuracy of the precipitation simulation is examined using a contingency table, from which the Probability of Detection (POD) and the Equitable Threat Score (ETS) are computed. The standalone LSM simulation is conducted for one year with the original settings and is compared with the KoFlux site observations of net radiation, sensible heat flux, latent heat flux, and soil moisture. According to the results, the innermost domain (810 m resolution) showed the minimum root mean square error for 2-m air temperature, 10-m wind, and 2-m humidity among all domains. Turning on dynamic vegetation tended to reduce the 10-m wind simulation errors in all domains. The first nested domain (7,290 m resolution) showed the highest precipitation scores, but dynamic vegetation provided little additional advantage there. Meanwhile, the offline one-dimensional Noah-MP LSM simulation captured the observed pattern and magnitude of the radiative fluxes and soil moisture, leaving room for further improvement by supplementing the leaf area index input and finding a proper combination of model physics options.
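
POD and ETS follow directly from the 2x2 contingency table of forecast versus observed precipitation events. A small sketch of both scores is given below; the counts are illustrative, not the study's.

```python
def pod(hits, misses):
    """Probability of Detection: fraction of observed events that were forecast."""
    return hits / (hits + misses)

def ets(hits, misses, false_alarms, correct_negatives):
    """Equitable Threat Score: threat score corrected for random hits."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

# Illustrative contingency counts for one domain and one threshold
print(pod(hits=42, misses=18))                                    # 0.70
print(ets(hits=42, misses=18, false_alarms=25, correct_negatives=915))
```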

Annotation Method based on Face Area for Efficient Interactive Video Authoring (효과적인 인터랙티브 비디오 저작을 위한 얼굴영역 기반의 어노테이션 방법)

  • Yoon, Ui Nyoung;Ga, Myeong Hyeon;Jo, Geun-Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.83-98
    • /
    • 2015
• Many TV viewers mainly use portal sites to retrieve information related to a broadcast while watching TV. However, finding the desired information takes a long time because the internet presents too much irrelevant material, so this process cannot satisfy users who want to consume information immediately. Interactive video is being actively investigated to solve this problem. An interactive video provides clickable objects, areas, or hotspots for interacting with users: when users click an object in the video, they instantly see additional information related to it. Making an interactive video with an authoring tool involves three basic steps: (1) create an augmented object; (2) set the object's area and the time it is displayed in the video; (3) set an interactive action linked to pages or hyperlinks. Users of existing authoring tools such as Popcorn Maker and Zentrick spend a lot of time on step (2). With wireWAX, users can save substantial time setting an object's location and display time because it uses a vision-based annotation method, but they must wait while objects are detected and tracked. It is therefore desirable to reduce the time spent on step (2) by effectively combining the benefits of manual and vision-based annotation. This paper proposes a novel annotation method that allows the annotator to annotate easily based on face areas, in two steps: a pre-processing step and an annotation step. Pre-processing detects shots so that users can easily find the content of interest, as follows: 1) extract shots from the video frames using a color-histogram-based shot boundary detection method; 2) cluster shots by similarity and align them into shot sequences; and 3) detect and track faces in all shots of each shot sequence and save the results with each shot in the shot sequence metadata. After pre-processing, the user annotates objects as follows: 1) the annotator selects a shot sequence and then a keyframe of a shot in that sequence; 2) the annotator places objects at positions relative to the actor's face in the selected keyframe, and the same objects are then annotated automatically through the end of the shot sequence wherever a face area was detected; and 3) the user assigns additional information to the annotated objects. In addition, this paper designs a feedback model to compensate for defects that may occur after annotation, such as wrongly aligned shots, wrongly detected faces, and inaccurate locations; users can also interpolate the positions of objects deleted by the feedback. After feedback, the annotated object data are saved to the interactive object metadata. Finally, the paper presents an interactive video authoring system implemented to verify the performance of the proposed annotation method. The experiments analyze object annotation time and include a user evaluation. On average, the proposed tool annotated objects twice as fast as existing authoring tools, although annotation occasionally took longer when wrong shots were detected in pre-processing.
The usefulness and convenience of the system were measured through a user evaluation aimed at users experienced with interactive video authoring systems: 19 recruited experts answered 11 questions drawn from the CSUQ (Computer System Usability Questionnaire), which was designed by IBM for evaluating systems. The evaluation showed that the proposed tool scored about 10% higher than the other interactive video authoring systems for usefulness in authoring interactive video.
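
Step 1 of the pre-processing, color-histogram-based shot boundary detection, can be sketched with OpenCV by comparing the histograms of consecutive frames against a correlation threshold. The threshold value and video path below are assumptions, and the paper's exact similarity measure is not specified.

```python
import cv2

def detect_shot_boundaries(path: str, threshold: float = 0.5) -> list[int]:
    """Return frame indices where the HSV-histogram correlation between
    consecutive frames drops below the threshold (i.e., a likely cut)."""
    cap = cv2.VideoCapture(path)
    boundaries, prev_hist, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
        cv2.normalize(hist, hist)
        if prev_hist is not None:
            corr = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if corr < threshold:
                boundaries.append(idx)
        prev_hist, idx = hist, idx + 1
    cap.release()
    return boundaries

print(detect_shot_boundaries("example.mp4"))  # hypothetical video file
```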

Response Modeling for the Marketing Promotion with Weighted Case Based Reasoning Under Imbalanced Data Distribution (불균형 데이터 환경에서 변수가중치를 적용한 사례기반추론 기반의 고객반응 예측)

  • Kim, Eunmi;Hong, Taeho
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.29-45
    • /
    • 2015
• Response modeling is a well-known research issue for those trying to achieve superior performance in predicting customers' responses to marketing promotions. A response model reduces marketing cost by identifying prospective customers in a very large customer database and predicting the purchasing intention of the selected customers, whereas a promotion derived from an undifferentiated marketing strategy incurs unnecessary cost. In addition, the big data environment has accelerated the development of response models with data mining techniques such as CBR, neural networks, and support vector machines. CBR is one of the major tools in business because it is simple and robust to apply to response modeling, and it remains an attractive technique for business data mining applications even though it has not shown high performance compared with other machine learning techniques. Thus many studies have tried to improve CBR for business data mining with enhanced algorithms or with the support of other techniques such as genetic algorithms, decision trees, and AHP (Analytic Hierarchy Process). Ahn and Kim (2008) used logit, neural networks, and CBR to predict which customers would purchase the items promoted by the marketing department, and optimized the number k of nearest neighbors with a genetic algorithm to improve the performance of the integrated model. Hong and Park (2009) noted that an integrated approach combining CBR with logit, neural networks, and Support Vector Machines (SVM) predicted customer response to marketing promotions better than each of those models alone. This paper presents an approach to predicting customers' responses to marketing promotions with Case Based Reasoning, in which the model applies a different weight to each feature. We fitted a logit model on a database containing the promotion and purchasing data of bath soap, and the resulting coefficients were used as the feature weights of CBR. We empirically compared the performance of the proposed weighted CBR model with neural networks and a pure CBR model, and found that the weighted CBR model outperformed pure CBR. Imbalanced data is a common problem when building classification models on real data, as in bankruptcy prediction, intrusion detection, fraud detection, churn management, and response modeling: the number of instances in one class is remarkably small or large compared with the other classes. A classification model such as a response model then has trouble learning the pattern, because it tends to ignore the minority class while classifying the majority class correctly. Sampling, categorized into under-sampling and over-sampling, is the most representative approach to the problem of imbalanced data distribution. CBR, however, is not sensitive to the data distribution because, unlike machine learning algorithms, it does not learn from the data.
In this study, we investigated the robustness of the proposed model while changing the ratio of responding to non-responding customers, because the customers who respond to a promotion are always a small fraction of the non-responders in the real world. We simulated the proposed model 100 times to validate its robustness under different response ratios in the imbalanced data distribution. We found that the proposed CBR-based model outperformed the compared models on the imbalanced data sets. Our study is expected to improve the performance of response models for promotion programs with CBR under the imbalanced data distributions of the real world.
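
The weighting scheme described above, fitting a logit model and reusing its coefficients as CBR feature weights, can be sketched compactly by rescaling features before a k-nearest-neighbor retrieval. The data below is synthetic, since the soap-promotion database is not public, and treating k-NN as the CBR retrieval step is a simplifying assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))  # customer features (synthetic)
# Imbalanced response label: only a small fraction of customers respond.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=500) > 1.5).astype(int)

# Step 1: fit a logit model; use |coefficients| as feature weights.
logit = LogisticRegression().fit(X, y)
w = np.abs(logit.coef_[0])

# Step 2: weighted CBR retrieval as k-NN on rescaled features.
# Scaling each feature by sqrt(w) makes Euclidean distance weighted by w.
knn = KNeighborsClassifier(n_neighbors=5).fit(X * np.sqrt(w), y)
print(knn.predict(X[:3] * np.sqrt(w)))  # predicted response for 3 cases
```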

A Hardware Implementation of the Underlying Field Arithmetic Processor based on Optimized Unit Operation Components for Elliptic Curve Cryptosystems (타원곡선을 암호시스템에 사용되는 최적단위 연산항을 기반으로 한 기저체 연산기의 하드웨어 구현)

  • Jo, Seong-Je;Kwon, Yong-Jin
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.8 no.1
    • /
    • pp.88-95
    • /
    • 2002
• In recent years, the security of hardware and software systems has become one of the most essential factors for a safe network community. Because Elliptic Curve Cryptosystems, proposed independently by N. Koblitz and V. Miller in 1985, require fewer bits for the same security as existing cryptosystems such as RSA, there is a net reduction in cost, size, and time. In this thesis, we propose an efficient hardware architecture of an underlying field arithmetic processor for Elliptic Curve Cryptosystems, and a practical method for implementing the architecture, especially the multiplicative inverse operator over $GF(2^m)$, on an FPGA and furthermore in VLSI, where the method is based on optimized unit operation components. We optimize the arithmetic processor for speed so that it can be implemented with a reasonable number of gates. The proposed architecture can be applied to any finite field $GF(2^m)$. According to the simulation results, although the number of gates increases by a factor of 8.8, the multiplication speed and inversion speed are improved 150 times and 480 times, respectively, compared with the design presented by Sarwono Sutikno et al. [7]. The designed underlying arithmetic processor can also be applied to implementing other crypto-processors and various finite field applications.
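
The two field operations whose speeds are compared are multiplication and inversion over $GF(2^m)$ in a polynomial basis. A bit-level software sketch follows, using the small illustrative field $GF(2^8)$ with the well-known reduction polynomial $x^8 + x^4 + x^3 + x + 1$; this is not the field size or the hardware algorithm of the paper.

```python
M = 8        # field degree m (illustrative)
POLY = 0x11B # x^8 + x^4 + x^3 + x + 1, a reduction polynomial for GF(2^8)

def gf_mul(a: int, b: int) -> int:
    """Carry-less multiply of a and b, reduced modulo POLY."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a >> M:  # degree reached m: reduce by the field polynomial
            a ^= POLY
    return r

def gf_inv(a: int) -> int:
    """Multiplicative inverse via Fermat's little theorem: a^(2^m - 2)."""
    r, e = 1, (1 << M) - 2
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

x = 0x53
print(hex(gf_inv(x)), hex(gf_mul(x, gf_inv(x))))  # 0xca 0x1
```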

Debris flow characteristics and sabo dam function in urban steep slopes (도심지 급경사지에서 토석류 범람 특성 및 사방댐 기능)

  • Kim, Yeonjoong;Kim, Taewoo;Kim, Dongkyum;Yoon, Jongsung
    • Journal of Korea Water Resources Association
    • /
    • v.53 no.8
    • /
    • pp.627-636
    • /
    • 2020
• Debris flow disasters primarily occur in mountainous terrain far from cities, so they have been underestimated as causing relatively little damage compared with other natural disasters. However, with urbanization, many residential areas and major facilities have been built in mountainous regions, and the frequency of debris flow disasters is steadily increasing as rainfall grows with environmental and climate change; the risk of debris flow is therefore rising. Nevertheless, only a few studies have explored the flooding characteristics of, and reduction measures for, debris flow in areas designated as steep slopes. It is thus necessary to develop independent disaster prevention technology suited to the environment and topographical characteristics of South Korea, and to update and improve disaster prevention information. Accordingly, this study aimed to calculate the amount of debris flow according to the disaster prevention performance targets for regions designated as steep slopes in South Korea, and to develop an independent model that both evaluates the impact of debris flow and identifies debris barriers optimal for mitigating damage. To validate the reliability of the two-dimensional debris flow model developed for evaluating debris barriers, the model's results were compared with those of a hydraulic model. A 2-D debris flow model was then constructed, reflecting the regional characteristics around the steep slopes, to analyze the flow characteristics of debris that directly reaches the damaged area. The flow characteristics of the debris delivered downstream were further analyzed according to the specifications (height) and installation locations of the debris barriers used to reduce damage. The results showed that the reliability of the developed model is satisfactory; the study also confirmed significant performance degradation of debris barriers installed on slopes of 20° or more, the slope range in which debris flows occur.