• Title/Summary/Keyword: OPTIMIZATION


Is Diabetes a Contraindication to Lower Extremity Flap Reconstruction? An Analysis of Threatened Lower Extremities in the NSQIP Database (2010-2020)

  • Amy Chen;Shannon R. Garvey;Nimish Saxena;Valeria P. Bustos;Emmeline Jia;Monica Morgenstern;Asha D. Nanda;Arriyan S. Dowlatshahi;Ryan P. Cauley
    • Archives of Plastic Surgery
    • /
    • v.51 no.2
    • /
    • pp.234-250
    • /
    • 2024
  • Background: The impact of diabetes on complication rates following free flap (FF), pedicled flap (PF), and amputation (AMP) procedures on the lower extremity (LE) is examined. Methods: Patients who underwent LE PF, FF, and AMP procedures were identified from the 2010 to 2020 American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP®) database using Current Procedural Terminology and International Classification of Diseases-9/10 codes, excluding cases for non-LE pathologies. The cohort was divided into diabetics and nondiabetics. Univariate and adjusted multivariable logistic regression analyses were performed. Results: Among 38,998 patients undergoing LE procedures, 58% were diabetic. Among diabetics, 95% underwent AMP, 5% underwent PF, and <1% underwent FF. Across all procedure types, noninsulin-dependent (NIDDM) and insulin-dependent diabetes mellitus (IDDM) were associated with significantly greater all-cause complication rates compared with absence of diabetes, and IDDM generally carried a higher risk than NIDDM. Among diabetics, complication rates were not significantly different across procedure types (IDDM: p = 0.5969; NIDDM: p = 0.1902). On adjusted subgroup analysis by diabetic status, flap procedures were not associated with higher odds of complications compared with amputation for IDDM and NIDDM patients. Length of stay > 30 days was statistically associated with IDDM, particularly in those undergoing FF (AMP: 5%, PF: 7%, FF: 14%, p = 0.0004). Conclusion: Our study highlights the importance of preoperative diabetic optimization prior to LE procedures. For diabetic patients, there were few significant differences in complication rates across procedure type, suggesting that diabetic patients are not at higher risk of complications when attempting limb salvage instead of amputation.
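
For readers unfamiliar with the method, the adjusted analysis described above is a standard multivariable logistic regression; the minimal Python sketch below illustrates it on a synthetic cohort. The column names and simulated data are illustrative stand-ins, not the NSQIP variables or results.

```python
# Minimal sketch of an adjusted multivariable logistic regression on a
# synthetic cohort. Column names and simulated effects are illustrative
# placeholders, not the actual NSQIP variables or findings.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2_000
df = pd.DataFrame({
    "diabetes": rng.choice(["none", "NIDDM", "IDDM"], size=n, p=[0.42, 0.30, 0.28]),
    "procedure": rng.choice(["AMP", "PF", "FF"], size=n, p=[0.90, 0.08, 0.02]),
    "age": rng.normal(65, 10, size=n).round(),
})
# Synthetic outcome with higher complication odds for diabetics.
logit_p = -1.5 + 0.3 * (df["diabetes"] == "NIDDM") + 0.6 * (df["diabetes"] == "IDDM")
df["complication"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Adjusted model: odds of any complication by diabetes status,
# controlling for procedure type and age.
model = smf.logit(
    "complication ~ C(diabetes, Treatment('none'))"
    " + C(procedure, Treatment('AMP')) + age",
    data=df,
).fit(disp=False)

print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```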

A Simulation Study of the Inset-fed 2-patch Microstrip Array Antenna for X-band Applications (X-band 대역용 2-패치 마이크로스트립 인셋 급전 어레이 안테나 시뮬레이션 연구)

  • Nkundwanayo Seth;Gyoo-Soo Chae
    • Advanced Industrial Science
    • /
    • v.3 no.2
    • /
    • pp.31-37
    • /
    • 2024
  • This paper presents a single and a 2-patch microstrip array antenna operating at a frequency of 10.3 GHz (X-band). It outlines the process of designing a microstrip patch array antenna using CST MWS. Initially, a single microstrip antenna was designed and then optimized using CST MWS to attain optimal return loss and gain. Subsequently, the design was expanded to create a 2×1 inset-fed microstrip array antenna for X-band applications. The substrate material is Rogers RO4350B, with a thickness of h = 0.79 mm and a relative permittivity of εr = 3.54. The achieved results include an S11 of -18 dB at the resonant frequency (10.3 GHz), a gain of 9.82 dBi, a bandwidth of 0.165 GHz, and 3-dB beamwidths of 30° and 121° in the azimuth (φ = 0°) and elevation (φ = 90°) planes, respectively. Future work involves fabricating this array antenna and expanding it further to a 4×4 microstrip antenna array for practical X-band applications.
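
For orientation, the first-cut patch dimensions behind such a design can be estimated with the standard transmission-line model before any CST MWS optimization. The sketch below applies those textbook formulas to the substrate quoted above (RO4350B, h = 0.79 mm, εr = 3.54) at 10.3 GHz; the resulting dimensions are only a starting point, not the paper's optimized geometry.

```python
# First-cut patch dimensions from the standard transmission-line model,
# using the substrate quoted above (RO4350B, h = 0.79 mm, eps_r = 3.54)
# and f0 = 10.3 GHz. These values are only a starting point; the paper's
# final geometry comes from CST MWS optimization, not from these formulas.
import math

c = 3.0e8        # speed of light, m/s
f0 = 10.3e9      # design frequency, Hz
eps_r = 3.54     # relative permittivity of RO4350B
h = 0.79e-3      # substrate thickness, m

# Patch width for efficient radiation.
W = c / (2 * f0) * math.sqrt(2 / (eps_r + 1))

# Effective permittivity and the length extension due to fringing fields.
eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h / W) ** -0.5
dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / (
    (eps_eff - 0.258) * (W / h + 0.8)
)

# Resonant patch length.
L = c / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL

print(f"patch width  W ≈ {W * 1e3:.2f} mm")
print(f"patch length L ≈ {L * 1e3:.2f} mm")
```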

A Tracer Study on Mankyeong River Using Effluents from a Sewage Treatment Plant (하수처리장 방류수를 이용한 추적자 시험: 만경강 유역에 대한 사례 연구)

  • Kim Jin-Sam;Kim Kang-Joo;Hahn Chan;Hwang Gab-Soo;Park Sung-Min;Lee Sang-Ho;Oh Chang-Whan;Park Eun-Gyu
    • The Sea:JOURNAL OF THE KOREAN SOCIETY OF OCEANOGRAPHY
    • /
    • v.11 no.2
    • /
    • pp.82-91
    • /
    • 2006
  • We investigated the possibility of using effluents from a municipal sewage treatment plant (STP) as a tracer for hydrologic studies of rivers. The possibility was checked in a 12-km-long reach downstream of the Jeonju Municipal Sewage Treatment Plant (JSTP). Time-series monitoring of the water chemistry revealed that the chemical composition of the effluent from the JSTP fluctuated within a relatively wide range during the sampling period. In addition, the signals from the plant were observed at the downstream stations consecutively, with increasing time lags, especially in the concentrations of the conservative chemical parameters (concentrations of chloride and sulfate, total concentration of major cations, and electrical conductivity). Based on this observation, we could estimate the stream flow (Q), velocity (v), and dispersion coefficient (D). A 1-D nonreactive solute-transport model with automated optimization schemes was used for this study. The values of Q, v, and D estimated from this study varied from 6.4 to 9.0 m³/sec (at the downstream end of the reach), from 0.06 to 0.10 m/sec, and from 0.7 to 6.4 m²/sec, respectively. The results show that the effluent from a large-scale municipal STP frequently provides good, multiple natural tracers for hydrologic studies.
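
The kind of 1-D nonreactive transport model with automated optimization mentioned above can be illustrated with a minimal curve fit: the analytical pulse solution of the advection-dispersion equation is fitted to a downstream breakthrough curve to recover v and D. The breakthrough data below are synthetic placeholders, not the Mankyeong River measurements.

```python
# Sketch of estimating v and D by fitting the analytical pulse solution
# of the 1-D advection-dispersion equation to a downstream breakthrough
# curve. The "observed" data are synthetic placeholders, not the
# Mankyeong River measurements.
import numpy as np
from scipy.optimize import curve_fit

x = 12_000.0  # distance from the source to the downstream station, m

def pulse(t, m, v, d):
    """C(x, t) = m / sqrt(4*pi*d*t) * exp(-(x - v*t)^2 / (4*d*t))."""
    t = np.asarray(t, dtype=float)
    return m / np.sqrt(4 * np.pi * d * t) * np.exp(-((x - v * t) ** 2) / (4 * d * t))

# Synthetic breakthrough curve (time in seconds since the pulse release).
rng = np.random.default_rng(0)
t_obs = np.linspace(6.0e4, 3.0e5, 60)
c_obs = pulse(t_obs, 5.0e3, 0.08, 3.0) + rng.normal(0.0, 5e-4, t_obs.size)

# Fit the lumped mass parameter, velocity (m/s), and dispersion (m^2/s).
(m_fit, v_fit, d_fit), _ = curve_fit(pulse, t_obs, c_obs, p0=(1.0e3, 0.07, 2.0))
print(f"v ≈ {v_fit:.3f} m/s, D ≈ {d_fit:.2f} m^2/s")
```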

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services
    • /
    • v.14 no.6
    • /
    • pp.71-84
    • /
    • 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, the realization of flexible storage expansion functions for processing a massive amount of unstructured log data and executing a considerable number of functions to categorize and analyze the stored unstructured log data is difficult in existing computer environments. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to process using the existing computing infrastructure's analysis tools and management system. The proposed system uses the IaaS (Infrastructure as a Service) cloud environment to provide a flexible expansion of computing resources and includes the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or a rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when a real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. Furthermore, because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions for the system to continually operate after it recovers from a malfunction. Finally, by establishing a distributed database using NoSQL-based MongoDB, the proposed system provides methods of effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data. Further, strict schemas like those of relational databases cannot expand nodes in the case wherein the stored data are distributed to various nodes when the amount of data rapidly increases. NoSQL does not provide the complex computations that relational databases may provide but can easily expand the database through node dispersion when the amount of data increases rapidly; it is a non-relational database with a structure appropriate for processing unstructured data. The data models of NoSQL are usually classified as Key-Value, column-oriented, and document-oriented types. Of these, MongoDB, a representative document-oriented data model with a free schema structure, is used in the proposed system. MongoDB is introduced to the proposed system because it makes it easy to process unstructured log data through a flexible schema structure, facilitates flexible node expansion when the amount of data is rapidly increasing, and provides an Auto-Sharding function that automatically expands storage.
The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to the type of log data and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the results of the log analysis of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and provided in real time by the log graph generator module. The aggregated log data per unit time are stored in the MongoDB module and plotted in a graph according to the user's various analysis conditions. The aggregated log data in the MongoDB module are parallel-distributed and processed by the Hadoop-based analysis module. A comparative evaluation is carried out against a log data processing system that uses only MySQL, measuring log data insertion and query performance; this evaluation demonstrates the proposed system's superiority. Moreover, an optimal chunk size is confirmed through the log data insert performance evaluation of MongoDB for various chunk sizes.
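
A minimal sketch of the log-collector routing idea is shown below, using current pymongo syntax: each unstructured record is classified by type and stored schema-free in MongoDB, and a simple aggregation produces per-hour counts for the graph module. Database, collection, and field names are illustrative placeholders, not the schema used in the paper.

```python
# Minimal sketch of the log-collector routing idea: classify each record
# by type, store it schema-free in MongoDB, and aggregate per-hour counts
# for the graph module. Names below are illustrative placeholders.
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed MongoDB endpoint
db = client["bank_logs"]                           # hypothetical database name

def collect(raw_record: dict) -> None:
    """Route one unstructured log record to a per-type collection."""
    log_type = raw_record.get("type", "unknown")   # e.g. "transaction", "auth"
    db[log_type].insert_one({
        "received_at": datetime.now(timezone.utc),
        "payload": raw_record,                      # free-schema document
    })

collect({"type": "transaction", "branch": "A01", "amount": 125_000})

# Per-hour counts of one log type, as the graph generator might request.
pipeline = [
    {"$group": {
        "_id": {"$dateToString": {"format": "%Y-%m-%d %H:00", "date": "$received_at"}},
        "count": {"$sum": 1},
    }},
    {"$sort": {"_id": 1}},
]
for row in db["transaction"].aggregate(pipeline):
    print(row["_id"], row["count"])
```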

Evaluation of Dose Distributions Recalculated with Per-field Measurement Data under the Condition of Respiratory Motion during IMRT for Liver Cancer (간암 환자의 세기조절방사선치료 시 호흡에 의한 움직임 조건에서 측정된 조사면 별 선량결과를 기반으로 재계산한 체내 선량분포 평가)

  • Song, Ju-Young;Kim, Yong-Hyeob;Jeong, Jae-Uk;Yoon, Mee Sun;Ahn, Sung-Ja;Chung, Woong-Ki;Nam, Taek-Keun
    • Progress in Medical Physics
    • /
    • v.25 no.2
    • /
    • pp.79-88
    • /
    • 2014
  • The dose distributions within the real volumes of tumor targets and critical organs during internal target volume-based intensity-modulated radiation therapy (ITV-IMRT) for liver cancer were recalculated by applying the effects of actual respiratory organ motion, and the dosimetric features were analyzed through comparison with gating IMRT (Gate-IMRT) plan results. The ITV was created using MIM software, and a moving phantom was used to simulate respiratory motion. The doses were recalculated with a 3-dimensional dose-volume histogram (3DVH) program based on the per-field data measured with a MapCHECK2 2-dimensional diode detector array. Although a sufficient prescription dose covered the PTV during ITV-IMRT delivery, the dose homogeneity in the PTV was inferior to that with the Gate-IMRT plan. We confirmed that there were higher doses to the organs-at-risk (OARs) with ITV-IMRT, as expected when using an enlarged field, but the increased dose to the spinal cord was not significant, and the increased doses to the liver and kidney could be considered minor when the reinforced constraints were applied during IMRT plan optimization. Because the Gate-IMRT method also has disadvantages, such as unexpected dosimetric variations when applying the gating system and an increased treatment time, it is better to perform a prior analysis of the patient's respiratory condition and of the importance and fulfillment of the IMRT plan dose constraints in order to select an optimal IMRT method with which to correct for the effect of respiratory organ motion.
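
The plan comparisons above are based on cumulative dose-volume histograms; the short sketch below shows how such a DVH (and a D95-style coverage metric) can be computed from a 3-D dose grid and a structure mask. The arrays are synthetic placeholders, not output from 3DVH or the treatment planning system.

```python
# Minimal sketch of a cumulative dose-volume histogram (DVH), the metric
# behind the plan comparisons above. The dose grid and structure mask are
# synthetic placeholders, not 3DVH output.
import numpy as np

rng = np.random.default_rng(0)
dose = rng.normal(50.0, 5.0, size=(40, 40, 40)).clip(min=0.0)  # Gy, toy grid
ptv_mask = np.zeros_like(dose, dtype=bool)
ptv_mask[15:25, 15:25, 15:25] = True                           # toy PTV

def cumulative_dvh(dose_grid, mask, bin_width=0.5):
    """Return dose bins and the % of the structure receiving at least that dose."""
    d = dose_grid[mask]
    bins = np.arange(0.0, d.max() + bin_width, bin_width)
    volume_pct = np.array([(d >= b).mean() * 100.0 for b in bins])
    return bins, volume_pct

bins, vol = cumulative_dvh(dose, ptv_mask)
d95 = bins[vol >= 95.0].max()   # highest dose level still covering 95% of the PTV
print(f"D95 ≈ {d95:.1f} Gy")
```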

Evaluation of Contralateral Breast Surface Dose in FIF (Field In Field) Tangential Irradiation Technique for Patients Undergone Breast Conservative Surgery (보존적 유방절제 환자의 방사선치료 시 종속조사면 병합방법에 따른 반대편 유방의 표면선량평가)

  • Park, Byung-Moon;Bang, Dong-Wan;Bae, Yong-Ki;Lee, Jeong-Woo;Kim, You-Hyun
    • Journal of radiological science and technology
    • /
    • v.31 no.4
    • /
    • pp.401-406
    • /
    • 2008
  • The aim of this study is to evaluate the contralateral breast (CLB) surface dose with the Field-in-Field (FIF) technique for breast-conserving surgery patients. For the evaluation of surface dose with the FIF technique, we compared it with other techniques, namely open fields (Open), metal wedge (MW), and enhanced dynamic wedge (EDW), under the same geometric conditions and prescribed dose. A three-dimensional treatment planning system was used for dose optimization. For verification of the dose calculation, measurements using MOSFET detectors with an Anderson Rando phantom were performed. The measurement points for the four techniques were at depths of 0 cm (epidermis) and 0.5 cm bolus (dermis), spaced 2 cm, 4 cm, 6 cm, 8 cm, and 10 cm from the edge of the medial tangential beam. The dose calculations were done at 0.25 cm grid resolution with the modified Batho method for inhomogeneity correction. In the planning results, compared with Open, the surface doses at the epidermis and dermis differed in the ranges of 19.6-36.9% and 33.2-138.2% for MW, 1.0-7.9% and 1.6-37.4% for EDW, and for FIF, respectively. In the measurements, compared with Open, the surface doses at the epidermis and dermis differed in the ranges of 11.1-71% and 22.9-161% for MW, 4.1-15.5% and 8.2-37.9% for EDW, and by 4.9% for FIF, respectively. The surface doses were underestimated in the planning calculations compared with the MOSFET measurements. The surface dose with FIF was the lowest among the techniques, even when compared with the Open method. We conclude that the FIF technique can produce an optimal dose distribution in the breast target while effectively reducing the probability of secondary carcinogenesis due to undesirable scattered radiation to the contralateral breast.


HW/SW Partitioning Techniques for Multi-Mode Multi-Task Embedded Applications (멀티모드 멀티태스크 임베디드 어플리케이션을 위한 HW/SW 분할 기법)

  • Kim, Young-Jun;Kim, Tae-Whan
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.34 no.8
    • /
    • pp.337-347
    • /
    • 2007
  • An embedded system is called a multi-mode embedded system if it performs multiple applications by dynamically reconfiguring the system functionality. Further, the embedded system is called a multi-mode multi-task embedded system if it additionally supports multiple tasks to be executed in a mode. In this paper, we address the HW/SW partitioning problem for multi-mode multi-task embedded applications with timing constraints on tasks. The objective of the optimization problem is to find a minimum total system cost for the allocation and mapping of processing resources to the functional modules in tasks, together with a schedule that satisfies the timing constraints. Success in solving the problem is closely related to how much of the potential parallelism among module executions can be utilized. However, because the search space of this parallelism is inherently very large, and in order to keep schedulability analysis simple, prior HW/SW partitioning methods have not been able to fully exploit the potential parallel execution of modules. To overcome this limitation, we propose a set of comprehensive HW/SW partitioning techniques that solve the three subproblems of the partitioning problem simultaneously: (1) allocation of processing resources, (2) mapping of the processing resources to the modules in tasks, and (3) determination of an execution schedule of the modules. Specifically, based on a precise measurement of the parallel execution and schedulability of modules, we develop a stepwise refinement partitioning technique for single-mode multi-task applications. The proposed technique is then extended to solve the HW/SW partitioning problem of multi-mode multi-task applications. Experiments with a set of real-life applications show that the proposed techniques reduce the implementation cost by 19.0% and 17.0% for single-mode and multi-mode multi-task applications, respectively, compared with the conventional method.
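
To make the cost/timing trade-off at the heart of the partitioning problem concrete, the sketch below uses a deliberately simple greedy heuristic: modules are moved from software to hardware, best time-saved-per-cost first, until the task deadline is met. This is not the stepwise-refinement technique proposed in the paper, and the module data are invented for illustration.

```python
# Deliberately simple greedy sketch of the HW/SW cost/timing trade-off:
# keep moving the module with the best time-saved-per-cost ratio to
# hardware until the task deadline is met. This is NOT the paper's
# stepwise-refinement technique; module data are invented.
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    sw_time: float   # execution time on the processor (ms)
    hw_time: float   # execution time on dedicated hardware (ms)
    hw_cost: float   # area/cost of the hardware implementation

modules = [
    Module("fft",    12.0, 2.0, 40.0),
    Module("filter",  8.0, 3.0, 25.0),
    Module("ctrl",    2.0, 1.5, 30.0),
    Module("codec",  15.0, 4.0, 60.0),
]
deadline = 25.0      # ms, timing constraint for one (sequential) task

in_hw: set[str] = set()

def task_time() -> float:
    return sum(m.hw_time if m.name in in_hw else m.sw_time for m in modules)

while task_time() > deadline:
    candidates = [m for m in modules if m.name not in in_hw]
    best = max(candidates, key=lambda m: (m.sw_time - m.hw_time) / m.hw_cost)
    in_hw.add(best.name)

hw_cost = sum(m.hw_cost for m in modules if m.name in in_hw)
print(f"modules in HW: {sorted(in_hw)}")
print(f"task time = {task_time():.1f} ms, HW cost = {hw_cost:.0f}")
```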

Reconstruction of Stereo MR Angiography Optimized to View Position and Distance using MIP (최대강도투사를 이용한 관찰 위치와 거리에 최적화 된 입체 자기공명 뇌 혈관영상 재구성)

  • Shin, Seok-Hyun;Hwang, Do-Sik
    • Investigative Magnetic Resonance Imaging
    • /
    • v.16 no.1
    • /
    • pp.67-75
    • /
    • 2012
  • Purpose: We studied an enhanced method for viewing brain vessels using Magnetic Resonance Angiography (MRA). Noting that Maximum Intensity Projection (MIP) images are often used to evaluate the arteries of the neck and brain, we propose a new method for viewing brain vessels as a stereo image in 3D space that is more accurate than the conventional method. Materials and Methods: We used a 3T Siemens Tim Trio MRI scanner with a 4-channel head coil and acquired 3D MRA brain data by fixing the volunteer's head and applying a phase-contrast pulse sequence. The MRA brain data are rotated in 3D according to the view angle of each eye. The optimal view angle (projection angle) is determined by the distance between the eye and the center of the data. The rotated MRA data are projected along the projection line, displaying only the highest values. The left-view and right-view MIP images are combined using the anaglyph imaging method to obtain an optimal stereoscopic MIP image. Results: The resulting images show that the proposed method enables viewing MIP images of the MRA data from any direction, which is impossible with the conventional method. Moreover, by considering the disparity and the distance from the viewer to the center of the MRA data in spherical coordinates, we can obtain a more realistic stereo image. In conclusion, we can obtain optimal stereoscopic images according to the position the viewer wants to see and the distance between the viewer and the MRA data. Conclusion: The proposed method overcomes the limitation of the conventional method, which shows only a specific projected image (z-axis projection), and provides optimal depth information by converting the mono MIP image to a stereoscopic image that accounts for the viewer's position. It can also display any view of the MRA data in spherical coordinates. If an optimization algorithm and parallel processing are applied, it may provide useful medical information for diagnosis and treatment planning in real time.
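
A minimal sketch of the stereoscopic MIP idea follows: the volume is rotated by a slightly different azimuth for each eye, projected with maximum intensity, and the two views are merged into a red-cyan anaglyph. The volume, disparity angle, and channel assignment are illustrative assumptions, not the paper's optimized view-angle calculation.

```python
# Minimal sketch of stereoscopic MIP: rotate the volume by a slightly
# different azimuth per eye, take the maximum intensity along the viewing
# axis, and merge the two views into a red-cyan anaglyph. The volume and
# disparity below are synthetic illustrations, not MRA data.
import numpy as np
from scipy.ndimage import rotate

rng = np.random.default_rng(1)
volume = rng.random((64, 64, 64)) ** 8        # toy volume with sparse bright voxels

def mip_view(vol: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Rotate about the vertical axis, then project with maximum intensity."""
    rotated = rotate(vol, azimuth_deg, axes=(0, 2), reshape=False, order=1)
    return rotated.max(axis=2)                # MIP along the viewing axis

disparity_deg = 4.0                            # angular separation of the two eyes
left = mip_view(volume, -disparity_deg / 2)
right = mip_view(volume, +disparity_deg / 2)

# Red-cyan anaglyph: left view in the red channel, right view in green/blue.
anaglyph = np.stack([left, right, right], axis=-1)
anaglyph /= anaglyph.max()
print(anaglyph.shape)                          # (64, 64, 3) RGB image
```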

A Study on the Automation and Optimization of 9-(4-[18F]Fluoro-3-hydroxymethylbutyl)guanine Synthesis (9-(4-[18F]Fluoro-3-hydroxymethylbutyl)guanine 합성의 자동화와 최적화에 관한 연구)

  • An, Jae-Seok;Hong, Sung-Tack;Kang, Se-Hun;Won, Woo-Jae
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.15 no.2
    • /
    • pp.72-75
    • /
    • 2011
  • Purpose: The HSV1-tk reporter gene system is the most widely used system because, in the case of HSV1-tk suicide gene therapy, it can be monitored directly without the introduction of a separate reporter gene. This study was performed to automate the synthesis of 9-(4-[18F]fluoro-3-hydroxymethylbutyl)guanine ([18F]FHBG), which is widely used as a substrate for imaging the HSV1-tk reporter gene in living organisms with positron emission tomography (PET), and to find the optimized conditions of synthesis. Materials and Methods: Fully automated synthesis of [18F]FHBG was performed using an Explora-RN (CTI, USA) module. We varied the reaction time (3, 5, 10 min) and temperature (110, 120, 130°C) to find the optimized synthesis conditions. We also experimented to find the optimal amount of precursor (5, 7, 10 mg). Results: [18F]FHBG was purified by an HPLC system and collected at around 10-12 min. Synthesis using the Explora-RN module showed a radiochemical yield of 32.0±1.2% (decay corrected), and the purity was greater than 98%. The entire synthesis time was less than 48 min. The highest synthesis yield was obtained at a temperature of 130°C, a reaction time of 5 minutes, and a precursor amount of 10 mg (the amount recommended in the manual) (n=36). In contrast to the radiochemical yield with 10 mg of precursor (32±1.2%), the yields with 5 and 7 mg of precursor were unstable. Conclusion: Automation of [18F]FHBG synthesis on the Explora-RN module has been completed. In addition, we were able to obtain the optimized reaction time, temperature, and amount of precursor. Therefore, this study can provide a more rapid synthesis and a higher radiochemical yield.
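
As a worked example of the decay-corrected yield quoted above: the end-of-synthesis activity is corrected back to the start of synthesis using the 109.77-minute half-life of F-18 and the roughly 48-minute synthesis time. The activities below are hypothetical numbers chosen only so that the result lands near the reported ~32%.

```python
# Worked example of a decay-corrected radiochemical yield. The F-18
# half-life is 109.77 min; the ~48 min synthesis time comes from the
# abstract; the activities are hypothetical, chosen so the result lands
# near the reported ~32%.
import math

T_HALF_F18 = 109.77        # minutes
synthesis_time = 48.0      # minutes

start_activity = 37.0      # GBq of [18F]fluoride at start of synthesis (assumed)
end_activity = 8.8         # GBq of [18F]FHBG at end of synthesis (assumed)

# Correct the product activity back to the start of synthesis, then
# express it as a fraction of the starting activity.
decay_factor = math.exp(math.log(2) * synthesis_time / T_HALF_F18)
corrected_yield = end_activity * decay_factor / start_activity * 100.0

print(f"decay-corrected yield ≈ {corrected_yield:.1f} %")
```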


Verification of Gated Radiation Therapy: Dosimetric Impact of Residual Motion (여닫이형 방사선 치료의 검증: 잔여 움직임의 선량적 영향)

  • Yeo, Inhwan;Jung, Jae Won
    • Progress in Medical Physics
    • /
    • v.25 no.3
    • /
    • pp.128-138
    • /
    • 2014
  • In gated radiation therapy (gRT), due to residual motion, beam delivery is intended to irradiate not only the true extent of disease but also neighboring normal tissues. It is desired that the delivery covers the true extent (i.e., the clinical target volume or CTV) as a minimum, although the target moves during dose delivery. The objectives of our study are to verify whether the intended dose is actually delivered to the true target in gRT and to quantitatively understand the trend of dose delivery to it and to the neighboring normal tissues as the gating window (GW), motion amplitude (MA), and CTV size change. To fulfill these objectives, experimental and computational studies were designed and performed. A custom-made phantom with rectangle- and pyramid-shaped targets (CTVs) on a moving platform was scanned for four-dimensional imaging. Various GWs were selected, and image integration was performed to generate targets (internal target volumes or ITVs) for planning that included the CTVs and internal margins (IMs). Planning was done conventionally for the rectangle target, and IMRT optimization was done for the pyramid target. Dose evaluation was then performed on a diode array aligned perpendicularly to the gated beams, through measurements and computational modeling of dose delivery under motion. This study has quantitatively demonstrated and analytically interpreted the impact of residual motion, including penumbral broadening for both targets, perturbed but secure dose coverage of the CTV, and significant doses delivered to the neighboring normal tissues. Dose-volume histogram analyses also demonstrated and interpreted the trend of dose coverage: for the ITV, it increased as the GW or MA decreased or the CTV size increased; for the IM, it increased as the GW or MA decreased; for the neighboring normal tissue, the opposite trend to that of the IM was observed. This study has provided a clear understanding of the impact of residual motion and has shown that, if breathing is reproducible, gRT is reliable despite discontinuous delivery and target motion. The procedures and computational model can be used for commissioning, routine quality assurance, and patient-specific validation of gRT. More work needs to be done on patient-specific dose reconstruction on CT images.
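
The penumbral broadening analyzed above can be pictured with the simplest possible model: the dose to a moving target is the static profile convolved with the probability density of the residual motion inside the gating window. The 1-D profile and uniform motion PDF below are synthetic illustrations, not the study's measured or modeled data.

```python
# Simplest 1-D model of penumbral broadening from residual motion: the
# dose to a moving target is the static profile convolved with the
# probability density of the residual position inside the gating window.
# Profile and motion PDF are synthetic illustrations.
import numpy as np

x = np.arange(-50.0, 50.0, 0.5)                     # position, mm
static = ((x > -20) & (x < 20)).astype(float)       # idealized flat static field

residual = 5.0                                      # residual motion of +/- 5 mm
pdf = (np.abs(x) <= residual).astype(float)         # uniform residual-motion PDF
pdf /= pdf.sum()

blurred = np.convolve(static, pdf, mode="same")     # dose profile under motion

def left_penumbra(profile: np.ndarray) -> float:
    """Distance between the 20% and 80% levels on the left field edge."""
    left = x < 0
    x20 = x[left & (profile >= 0.2)].min()
    x80 = x[left & (profile >= 0.8)].min()
    return x80 - x20

print(f"static penumbra  ≈ {left_penumbra(static):.1f} mm")
print(f"blurred penumbra ≈ {left_penumbra(blurred):.1f} mm")
```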