• Title/Summary/Keyword: Software V&V

Development of Model Plans in Three Dimensional Conformal Radiotherapy for Brain Tumors (뇌종양 환자의 3차원 입체조형 치료를 위한 뇌내 주요 부위의 모델치료계획의 개발)

  • Pyo Hongryull;Lee Sanghoon;Kim GwiEon;Keum Kichang;Chang Sekyung;Suh Chang-Ok
    • Radiation Oncology Journal
    • /
    • v.20 no.1
    • /
    • pp.1-16
    • /
    • 2002
  • Purpose: Three-dimensional conformal radiotherapy planning is widely used for the treatment of patients with brain tumors. However, developing an optimal treatment plan takes much time, so it is difficult to apply this technique to all patients. To increase the efficiency of this technique, standard radiotherapy plans for each site of the brain are needed. We therefore developed several 3-dimensional conformal radiotherapy plans (3D plans) for tumors at each site of the brain, compared them with each other and with 2-dimensional radiotherapy plans, and finally selected model plans for each site of the brain. Materials and Methods: Imaginary tumors, with sizes commonly observed in the clinic, were designed for each site of the brain and drawn on CT images. The planning target volumes (PTVs) were as follows: temporal tumor, 5.7 × 8.2 × 7.6 cm; suprasellar tumor, 3 × 4 × 4.1 cm; thalamic tumor, 3.1 × 5.9 × 3.7 cm; frontoparietal tumor, 5.5 × 7 × 5.5 cm; and occipitoparietal tumor, 5 × 5.5 × 5 cm. Plans using 2 parallel opposed portals and/or 3 portals including a fronto-vertex and 2 lateral fields were developed manually as the conventional 2D plans, and 3D noncoplanar conformal plans were developed using the beam's eye view and the automatic block drawing tool. The total tumor dose was 54 Gy for the suprasellar tumor and 59.4 Gy and 72 Gy for the other tumors. All dose plans (including 2D plans) were calculated using 3D planning software. The developed plans were compared with each other using dose-volume histograms (DVH), normal tissue complication probabilities (NTCP) and various dose statistics (minimum, maximum and mean dose, D5, V83, V85 and V95). Finally, the best radiotherapy plan for each site of the brain was selected. Results: 1) Temporal tumor: the NTCPs and DVHs of normal tissue were superior for all 3D plans compared with the 2D plans, and this trend was more definite when the total dose was escalated to 72 Gy (NTCPs of normal brain, 2D plans: 27%, 8% → 3D plans: 1%, 1%). The various dose statistics did not show any consistent trend. A 3D plan using 3 noncoplanar portals was selected as the model radiotherapy plan. 2) Suprasellar tumor: the NTCPs of the 3D and 2D plans did not differ significantly because the total dose for this tumor was only 54 Gy. The DVHs of normal brain and brainstem differed significantly between plans. D5, V85, V95 and the mean dose showed a consistent trend compatible with the DVH. All 3D plans were superior to the 2D plans, even when 3 portals (fronto-vertex and 2 lateral fields) were used for the 2D plans. A 3D plan using 7 portals was worse than plans using fewer portals. A 3D plan using 5 noncoplanar portals was selected as the model plan. 3) Thalamic tumor: the NTCPs of all 3D plans were lower than those of the 2D plans when the total dose was elevated to 72 Gy. The DVHs of normal tissues showed similar results. V83, V85 and V95 showed some consistent differences between plans, but not between 3D plans. A 3D plan using 5 noncoplanar portals was selected as the model plan. 4) Parietal (fronto- and occipito-) tumors: all NTCPs of the normal brain in the 3D plans were lower than in the 2D plans. The DVHs showed the same results. V83, V85 and V95 showed trends consistent with the NTCP and DVH. 3D plans using 5 portals for the frontoparietal tumor and 6 portals for the occipitoparietal tumor were selected as model plans. Conclusion: NTCP and DVH showed reasonable differences between plans and were thought to be useful for comparing plans. All 3D plans were superior to the 2D plans. The best 3D plan was selected for tumors at each site of the brain using NTCP and DVH, and finally by the planner's decision.
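
The plan comparison above relies on dose-volume histograms and normal tissue complication probabilities. As a rough illustration of how these quantities are derived from a voxel dose distribution, the sketch below computes a cumulative DVH and a Lyman-Kutcher-Burman NTCP in Python; the dose array and the TD50, m, and n parameters are illustrative assumptions, not the planning system or parameter set used in the paper.

```python
import numpy as np
from scipy.stats import norm

def cumulative_dvh(dose_gy, bin_width=0.5):
    """Cumulative DVH: fraction of volume receiving at least each dose level."""
    edges = np.arange(0.0, dose_gy.max() + bin_width, bin_width)
    volume_fraction = np.array([(dose_gy >= d).mean() for d in edges])
    return edges, volume_fraction

def lkb_ntcp(dose_gy, td50=60.0, m=0.15, n=0.25):
    """Lyman-Kutcher-Burman NTCP from a voxel dose array (equal voxel volumes).
    td50, m and n are organ-specific parameters; the values here are placeholders."""
    geud = (np.mean(dose_gy ** (1.0 / n))) ** n          # generalized EUD
    t = (geud - td50) / (m * td50)
    return norm.cdf(t)

# Hypothetical usage with a random dose distribution for normal-brain voxels
rng = np.random.default_rng(0)
normal_brain_dose = rng.normal(30, 10, size=10000).clip(0, 72)
edges, vol = cumulative_dvh(normal_brain_dose)
print(f"V50 = {vol[np.searchsorted(edges, 50.0)]:.2%}")   # volume receiving >= 50 Gy
print(f"NTCP = {lkb_ntcp(normal_brain_dose):.3f}")
```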

A Feasibility Study on the Development of Multifunctional Radar Software using a Model-Based Development Platform (모델기반 통합 개발 플랫폼을 이용한 다기능 레이다 소프트웨어 개발의 타당성 연구)

  • Seung Ryeon Kim;Duk Geun Yoon;Sun Jin Oh;Eui Hyuk Lee;Sa Won Min;Hyun Su Oh;Eun Hee Kim
    • Journal of the Korea Society for Simulation
    • /
    • v.32 no.3
    • /
    • pp.23-31
    • /
    • 2023
  • Software development involves a series of stages, including requirements analysis, design, implementation, unit testing, and integration testing, similar to those used in the systems engineering process. This study used MathWorks' model-based design platform to develop multi-function radar software and evaluated its feasibility and efficiency. Because conventional radar software is developed as individual unit algorithms rather than in an integrated form, additional effort is required to manage the integrated software, such as requirements analysis and integrated testing. The model-based platform applied in this paper provides an integrated development environment for requirements analysis and allocation, algorithm development through simulation, automatic code generation for deployment, integrated requirements testing, and result management. With the platform, we developed multi-level models of the multi-function radar software, verified them using test harnesses, managed requirements, and transformed the models into hardware-deployable code using the automatic code generation tool. We expect this model-based integrated development to reduce errors from miscommunication and other human factors and to save on development schedule and cost.
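
As an aside, the requirement-to-test traceability that such a model-based platform automates can be pictured with an ordinary unit-test harness. The sketch below is a minimal, hypothetical Python example: the moving-average "unit algorithm", the requirement IDs, and the expected values are invented for illustration and are not part of the radar software described above.

```python
import unittest
import numpy as np

def moving_average(x, window=4):
    """Hypothetical unit algorithm: simple moving-average filter."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

class TestMovingAverage(unittest.TestCase):
    """Each test is traced to a (hypothetical) requirement ID, mirroring the
    requirement-to-test linkage a model-based platform manages automatically."""

    def test_req_sw_001_constant_input(self):
        # REQ-SW-001: a constant input shall pass through unchanged.
        out = moving_average(np.full(16, 3.0))
        np.testing.assert_allclose(out, 3.0)

    def test_req_sw_002_output_length(self):
        # REQ-SW-002: output length shall be len(x) - window + 1.
        self.assertEqual(len(moving_average(np.zeros(16), window=4)), 13)

if __name__ == "__main__":
    unittest.main()
```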

Development of an Analytic Software Using Pencil Beam Scanning Proton Beam

  • Jeong, Seonghoon;Yoon, Myonggeun;Chung, Kwangzoo;Han, Youngyih;Lim, Do Hoon;Choi, Doo Ho
    • Progress in Medical Physics
    • /
    • v.28 no.1
    • /
    • pp.22-26
    • /
    • 2017
  • We have developed analysis software that can easily analyze the spot position and width of beams from proton therapy nozzles during periodic quality assurance. The developed software consists of an image processing method that analyzes the geometric center of each spot and a Gaussian fitting method that analyzes spots through Gaussian fitting. Using the software, an analysis of 210 proton spots with energies of 150, 190, and 230 MeV showed a deviation of approximately 3% from the mean. The software we developed to analyze proton spot positions and widths provides an accurate analysis and reduces the time required for analysis.
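
To illustrate the Gaussian fitting step described above, the following is a minimal sketch that estimates a spot's center position and width from a 2D fluence image by fitting 1D Gaussians to its axis profiles; the function names, pixel size, and synthetic image are assumptions, not the authors' software.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_1d(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

def spot_position_and_width(image, pixel_mm=0.5):
    """Estimate spot center and sigma (in mm) by fitting 1D Gaussians to the
    image profiles summed along each axis."""
    results = {}
    for axis, name in ((0, "x"), (1, "y")):
        profile = image.sum(axis=axis).astype(float)
        coords = np.arange(profile.size) * pixel_mm
        p0 = [profile.max() - profile.min(),
              coords[np.argmax(profile)], 5 * pixel_mm, profile.min()]
        (amp, mu, sigma, off), _ = curve_fit(gaussian_1d, coords, profile, p0=p0)
        results[name] = (mu, abs(sigma))       # (center position, width) in mm
    return results

# Hypothetical usage: a synthetic 2D Gaussian spot, 0.5 mm pixels
yy, xx = np.mgrid[0:200, 0:200]
img = np.exp(-0.5 * (((xx - 90) / 12) ** 2 + ((yy - 110) / 12) ** 2))
print(spot_position_and_width(img, pixel_mm=0.5))
```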

KTM TOKAMAK OPERATION SCENARIOS SOFTWARE INFRASTRUCTURE

  • Pavlov, V.;Baystrukov, K.;Golobokov, Yu.;Ovchinnikov, A.;Mezentsev, A.;Merkulov, S.;Lee, A.;Tazhibayeva, I.;Shapovalov, G.
    • Nuclear Engineering and Technology
    • /
    • v.46 no.5
    • /
    • pp.667-674
    • /
    • 2014
  • One of the largest problems for tokamak devices such as the Kazakhstan Tokamak for Material Testing (KTM) is the development and execution of operation scenarios. Operation scenarios may change often, so a convenient hardware and software solution is required for scenario management and execution. Dozens of diagnostic and control subsystems with numerous configuration settings may be used in an experiment, so the subsystem configuration process must be automated to coordinate changes to related settings and to prevent errors. Most of the diagnostic and control subsystem software at KTM was unified using an extra software layer that describes the hardware abstraction interface. The experiment sequence was described using a command language. The whole infrastructure was brought together by a universal communication protocol supporting various media, including Ethernet and serial links. The operation sequence execution infrastructure was used at KTM to carry out plasma experiments.
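
The hardware abstraction layer plus scenario command language described above can be pictured with a small sketch. The Python example below is hypothetical: the Subsystem interface, the DummyProbe, and the command syntax are invented for illustration and do not reflect the actual KTM implementation.

```python
from abc import ABC, abstractmethod

class Subsystem(ABC):
    """Hardware abstraction interface: every diagnostic/control subsystem
    exposes the same configure/arm/trigger operations."""

    @abstractmethod
    def configure(self, settings: dict) -> None: ...
    @abstractmethod
    def arm(self) -> None: ...
    @abstractmethod
    def trigger(self) -> None: ...

class DummyProbe(Subsystem):
    def configure(self, settings): print("probe configured:", settings)
    def arm(self): print("probe armed")
    def trigger(self): print("probe triggered")

def run_scenario(script: str, subsystems: dict) -> None:
    """Minimal command-language interpreter: '<command> <subsystem> [key=value ...]'."""
    for line in script.strip().splitlines():
        cmd, name, *args = line.split()
        settings = dict(a.split("=", 1) for a in args)
        target = subsystems[name]
        {"configure": lambda: target.configure(settings),
         "arm": target.arm,
         "trigger": target.trigger}[cmd]()

# Hypothetical scenario run against a dummy subsystem
run_scenario("configure probe gain=10\narm probe\ntrigger probe",
             {"probe": DummyProbe()})
```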

Evaluating the settlement of lightweight coarse aggregate in self-compacting lightweight concrete

  • Mazloom, Moosa;Mahboubi, Farzan
    • Computers and Concrete
    • /
    • v.19 no.2
    • /
    • pp.203-210
    • /
    • 2017
  • The purpose of this paper is to evaluate the settlement of the lightweight coarse aggregate of self-compacting lightweight concrete (SCLC) after the concrete is placed in its final position. To investigate this issue, sixteen concrete mixes were made. The water to cementitious materials ratios of the mixes were 0.35 and 0.4. In addition to the workability tests of self-compacting concrete (SCC), such as the slump flow, V-funnel and L-box tests, a laboratory experiment was conducted to examine the segregation of the lightweight coarse aggregate in the concrete. Because of the difficulties of this test, the image processing technique of MATLAB software was used to check this segregation as well. Moreover, the fuzzy logic technique of MATLAB software was utilized to improve the clarity of the borders between the coarse aggregate and the paste of the mixtures. Finally, the results of the segregation tests and software analyses are given and the accuracy of the software analyses is evaluated. It is worth noting that the minimum and maximum differences between the results of the laboratory tests and the software analyses were 1.2% and 9.19%, respectively. This means that the results of the image processing technique are accurate enough for estimating the segregation of lightweight coarse aggregate in SCLC.
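
To illustrate the kind of image analysis described above, the sketch below estimates aggregate segregation by thresholding a grayscale cross-section image and comparing the aggregate area fraction in the top and bottom halves; the threshold value, the segregation measure, and the synthetic image are assumptions, not the paper's MATLAB procedure.

```python
import numpy as np

def segregation_index(gray_image, threshold=120):
    """Estimate lightweight-aggregate segregation from a grayscale cross-section.

    Pixels brighter than `threshold` are treated as coarse aggregate (the
    threshold is an assumed value; in practice it would be tuned, e.g. after
    fuzzy edge enhancement as in the paper). Returns the aggregate area
    fraction in the top and bottom halves and their relative difference."""
    binary = gray_image > threshold
    half = binary.shape[0] // 2
    top = binary[:half].mean()      # aggregate fraction in the upper half
    bottom = binary[half:].mean()   # aggregate fraction in the lower half
    return top, bottom, abs(top - bottom) / max(top, bottom)

# Hypothetical usage with a synthetic image (random speckle, brighter at the top
# to mimic lightweight aggregate rising after placement)
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(400, 300))
img[:200] = np.minimum(img[:200] + 30, 255)
print(segregation_index(img))
```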

Testing Environment based on TTCN-3 for Network-based Embedded Software (TTCN-3를 이용한 네트워크 기반 임베디드 소프트웨어 테스팅 환경 구축)

  • Chae, Hochang;Jin, Xiulin;Cho, Jeonghun;Lee, Seonghun
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.5 no.1
    • /
    • pp.29-38
    • /
    • 2010
  • Increasingly complex embedded software is required to deliver the high performance and multiple functions of modern systems, which inevitably increases the number of errors; therefore, embedded software testing has recently become important. There is no general testing method that can be applied to every embedded system, but through this research we introduce a testing method for embedded systems based on TTCN-3, a testing standard. A testing environment for network-based embedded software is implemented considering the features of TTCN-3 testing, which is based on message exchange. In addition to the TTCN-3 test system, the testing environment has two further parts: a network analyzer to access the network-based systems and the communication interface suggested for embedded systems in previous work; we implemented the whole testing environment by making these two parts interact. Beyond the normal testing domain, single-node testing corresponding to unit testing in the V-model, we suggest another concept for testing multiple nodes in a network. This is achieved by adding keywords such as supervisor and object, which describe the features of the TTCN-3 test component, and by generating TTCN-3 executable code that contains the new keywords. Testing was performed for embedded software based on a CAN network, and a demonstration of the testing environment is presented in this paper.
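
The message-exchange style of TTCN-3 testing described above can be pictured with a small sketch. The Python example below mimics the send/receive/setverdict pattern of a single-node test over a socket; the address, the PING/PONG protocol, and the verdict handling are hypothetical and are not the authors' environment or real TTCN-3 syntax.

```python
import socket

def single_node_test(host="192.0.2.10", port=9000, timeout=2.0):
    """Send a stimulus to the system under test (SUT) and set a verdict based on
    the reply, mirroring TTCN-3's send/receive/setverdict pattern."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(b"PING\n")          # stimulus (hypothetical protocol)
            reply = sock.recv(64)            # expected response from the SUT
    except (socket.timeout, OSError):
        return "inconc"                      # SUT unreachable or silent
    return "pass" if reply.startswith(b"PONG") else "fail"

if __name__ == "__main__":
    # Without a reachable SUT this prints "inconc"; against a real device the
    # verdict would be "pass" or "fail" depending on the reply.
    print("verdict:", single_node_test())
```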

Q-omics: Smart Software for Assisting Oncology and Cancer Research

  • Lee, Jieun;Kim, Youngju;Jin, Seonghee;Yoo, Heeseung;Jeong, Sumin;Jeong, Euna;Yoon, Sukjoon
    • Molecules and Cells
    • /
    • v.44 no.11
    • /
    • pp.843-850
    • /
    • 2021
  • The rapid increase in collateral omics and phenotypic data has enabled data-driven studies for the fast discovery of cancer targets and biomarkers. Thus, it is necessary to develop convenient tools for general oncologists and cancer scientists to carry out customized data mining without computational expertise. For this purpose, we developed innovative software that enables user-driven analyses assisted by knowledge-based smart systems. Publicly available data on mutations, gene expression, patient survival, immune score, drug screening and RNAi screening were integrated from the TCGA, GDSC, CCLE, NCI, and DepMap databases. The optimal selection of samples and other filtering options were guided by the smart function of the software for data mining and visualization on Kaplan-Meier plots, box plots and scatter plots of publication quality. We implemented unique algorithms for both data mining and visualization, thus simplifying and accelerating user-driven discovery activities on large multiomics datasets. The present Q-omics software program (v0.95) is available at http://qomics.sookmyung.ac.kr.
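
One of the visualizations mentioned above is the Kaplan-Meier plot. As a plain illustration of the underlying estimate (computed by hand so that no assumptions are made about the Q-omics interface), the sketch below derives the survival-curve points from survival times and censoring flags; the example data are invented.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimator: survival probability after each distinct event time.
    `events` is 1 for an observed event (death) and 0 for a censored observation."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    survival, curve = 1.0, []
    for t in np.unique(times):
        deaths = int(((times == t) & (events == 1)).sum())
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= int((times == t).sum())   # drop everyone leaving at time t
    return curve

# Hypothetical usage: survival times in months, 0 = censored
print(kaplan_meier([5, 8, 8, 12, 16, 23], [1, 1, 0, 1, 0, 1]))
```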

Optimization of Extraction Conditions to Obtain Functional Components from Buckwheat (Fagopyrum esculentum M.) Sprouts, using Response Surface Methodology (반응표면분석법에 의한 메밀(Fagopyrum esculentum M.) 새싹 기능성분의 추출 조건 최적화)

  • Park, Kee-Jai;Lim, Jeong-Ho;Kim, Bum-Keun;Jeong, Jin-Woong;Kim, Jong-Chan;Lee, Myung-Heon;Cho, Young-Sim;Jung, Hee-Yong
    • Food Science and Preservation
    • /
    • v.16 no.5
    • /
    • pp.734-741
    • /
    • 2009
  • Response surface methodology (RSM) was used to optimize extraction conditions for functional components of buckwheat (Fagopyrum esculentum). A central composite design was applied to investigate the effects of three independent variables, namely extraction temperature (X1), extraction time (X2), and ethanol concentration (X3), on responses including extraction yield (Y1), total phenolic content in the extract (Y2), α-glucosidase inhibition activity (Y3), and acetylcholine esterase (ACE) inhibition activity (Y4). Data were analyzed using an expert design strategy and statistical software. The maximum yield was 24.95% (w/w) at an extraction temperature of 55.75°C, an extraction time of 8.75 h, and 15.65% (v/v) ethanol. The maximum total phenolic yield was 222.45 mg/100 g under the conditions of 28.11°C extraction temperature, 8.65 h extraction time, and 81.72% (v/v) ethanol. The maximum α-glucosidase inhibition activity was 85.38% at 9.62°C, 7.86 h, and 57.58% (v/v) ethanol. The maximum ACE inhibition activity was 86.91% under extraction conditions of 10.12°C, 4.86 h, and 44.44% (v/v) ethanol. Based on superimposition of a four-dimensional RSM with respect to the levels of total phenolics, α-glucosidase inhibition activity, and ACE inhibition activity obtained under various extraction conditions, the optimum ranges of conditions were an extraction temperature of 0-70°C, an extraction time of 2-8 h, and an ethanol concentration of 30-80% (v/v).
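
As an illustration of the response-surface step described above, the sketch below fits a second-order (quadratic plus interaction) model to a central-composite-style design by least squares in Python; the coded design points and yield values are invented for illustration and are not the paper's data.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, x1..x3, their squares, and pairwise interactions
    (the standard second-order RSM model)."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# Hypothetical central-composite-style design (coded levels) and yield responses
X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
              [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
              [0, 0, 0], [0, 0, 0], [1.68, 0, 0], [-1.68, 0, 0],
              [0, 1.68, 0], [0, -1.68, 0], [0, 0, 1.68], [0, 0, -1.68]], float)
y = np.array([18.2, 21.5, 19.0, 22.8, 20.1, 23.9, 21.0, 24.6,
              24.9, 25.1, 23.0, 17.5, 22.2, 19.8, 21.4, 20.3])

coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
predict = lambda x: quadratic_design_matrix(np.atleast_2d(x)) @ coeffs
print("fitted coefficients:", np.round(coeffs, 3))
print("predicted yield at centre point:", predict([0, 0, 0])[0])
```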

FLUENT MODELLING OF CAVITATION IN POPPET VALVES (포펫트밸브내에서의 캐비테이션에 관한 FLUENT 모델링)

  • Chung-Do, Nam
    • Journal of the Korean Society of Marine Environment & Safety
    • /
    • v.5 no.2
    • /
    • pp.113-123
    • /
    • 1999
  • The aim of this paper was to expand on work already carried out on the modelling of the flow through a poppet valve using CFD software FLUENT V4.22. Several different models were run on FLUENT for various lifts of the poppet cone and various back pressures. The results for pressure and velocity obtained were interpreted. The results revealed the presence of cavitation downstream of the orifice around the cone tip, and the presence of a high velocity jet stream along the centre line. These results confirm what has been found to happen in practice.
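
As a side note to the cavitation result above, a common dimensionless measure of how prone a flow is to cavitate is the cavitation number. The sketch below computes it from assumed hydraulic-oil properties and an assumed jet velocity; the numbers are illustrative only and are not taken from the paper.

```python
def cavitation_number(p_downstream_pa, p_vapour_pa, density, velocity):
    """Cavitation number sigma = (p - p_v) / (0.5 * rho * v^2).
    Low values (roughly sigma < 1, geometry dependent) indicate that the local
    pressure can drop to the vapour pressure, so cavitation is likely."""
    return (p_downstream_pa - p_vapour_pa) / (0.5 * density * velocity**2)

# Hypothetical values: 2 bar back pressure, mineral-oil density, fast jet
print(cavitation_number(2e5, 400.0, 870.0, 35.0))
```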
