• Title/Summary/Keyword: Threshold setting


The Diagnostic Performance of the Length of Tumor Capsular Contact on MRI for Detecting Prostate Cancer Extraprostatic Extension: A Systematic Review and Meta-Analysis

  • Tae-Hyung Kim;Sungmin Woo;Sangwon Han;Chong Hyun Suh;Soleen Ghafoor;Hedvig Hricak;Hebert Alberto Vargas
    • Korean Journal of Radiology
    • /
    • v.21 no.6
    • /
    • pp.684-694
    • /
    • 2020
  • Objective: The purpose was to review the diagnostic performance of the length of tumor capsular contact (LCC) on magnetic resonance imaging (MRI) for detecting prostate cancer extraprostatic extension (EPE). Materials and Methods: PubMed and EMBASE databases were searched up to March 24, 2019. We included diagnostic accuracy studies that evaluated LCC on MRI for EPE detection using radical prostatectomy specimen histopathology as the reference standard. Quality of studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies-2 tool. Sensitivity and specificity were pooled and graphically presented using hierarchical summary receiver operating characteristic (HSROC) plots. Meta-regression and subgroup analyses were conducted to explore heterogeneity. Results: Thirteen articles with 2136 patients were included. Study quality was generally good. Summary sensitivity and specificity were 0.79 (95% confidence interval [CI] 0.73-0.83) and 0.67 (95% CI 0.60-0.74), respectively. Area under the HSROC was 0.81 (95% CI 0.77-0.84). Substantial heterogeneity was present among the included studies according to Cochran's Q-test (p < 0.01) and Higgins I2 (62% and 86% for sensitivity and specificity, respectively). In terms of heterogeneity, measurement method (curvilinear vs. linear), prevalence of Gleason score ≥ 7, MRI readers' experience, and endorectal coils were significant factors (p ≤ 0.01), whereas method to determine the LCC threshold, cutoff value, magnet strength, and publication year were not (p = 0.14-0.93). Diagnostic test accuracy estimates were comparable across all assessed MRI sequences. Conclusion: Greater LCC on MRI is associated with a higher probability of prostate cancer EPE. Due to heterogeneity among the studies, further investigation is needed to establish the optimal cutoff value for each clinical setting.
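The pooled estimates above combine per-study sensitivities and specificities with a hierarchical (HSROC) model; as a minimal sketch of the per-study metrics being pooled, with hypothetical 2x2 counts chosen only to illustrate (not taken from any included study):

```python
# Illustrative only: sensitivity/specificity from a single diagnostic
# 2x2 table. The counts below are made up for demonstration; the paper
# pools such per-study values across 13 studies with an HSROC model.

def sensitivity_specificity(tp, fp, fn, tn):
    """Return (sensitivity, specificity) for one diagnostic 2x2 table."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity, specificity

# Hypothetical study: 79 of 100 EPE-positive patients detected,
# 67 of 100 EPE-negative patients correctly ruled out.
sens, spec = sensitivity_specificity(tp=79, fp=33, fn=21, tn=67)
print(round(sens, 2), round(spec, 2))  # 0.79 0.67
```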

Development of a Baseline Setting Model Based on Time Series Structural Changes for Priority Assessment in the Korea Risk Information Surveillance System (K-RISS) (식·의약 위해 감시체계(K-RISS)의 우선순위 평가를 위한 시계열 구조변화 기반 기준선 설정 모델 개발)

  • Hyun Joung Jin;Seong-yoon Heo;Hunjoo Lee;Boyoun Jang
    • Journal of Environmental Health Sciences
    • /
    • v.50 no.2
    • /
    • pp.125-137
    • /
    • 2024
  • Background: The Korea Risk Information Surveillance System (K-RISS) was developed to enable the early detection of food and drug safety-related issues. Its goal is to deliver real-time risk indicators generated from ongoing food and drug risk monitoring. However, the existing K-RISS system suffers from several limitations. Objectives: This study aims to augment K-RISS with more detailed indicators and to establish a severity standard that takes into account structural changes in the daily time series of K-RISS values. Methods: First, a Delphi survey was conducted to derive the required weights. Second, a control chart, commonly used in statistical process control, was utilized to detect outliers and establish caution, attention, and serious levels for K-RISS values. Furthermore, Bai and Perron's method was employed to detect structural changes in the K-RISS time series. Results: The study incorporated 'closeness to life' and 'sustainability' indicators into K-RISS. It obtained the necessary weights through a survey of experts for integrating variables, combining indicators by data source, and aggregating sub-K-RISS values. We defined caution, attention, and serious levels for both the average and maximum values of daily K-RISS. Furthermore, when structural changes were detected, leading to significant variations in daily K-RISS values across different periods, the study systematically verified these changes and derived separate severity levels for each period. Conclusions: This study enhances the existing K-RISS system and introduces more advanced indicators. K-RISS is now more comprehensively equipped to serve as a risk warning index. The study has paved the way for an objective determination of whether the food safety risk index surpasses predefined thresholds through the application of severity levels.
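The control-chart severity levels described above can be sketched minimally as thresholds at mean plus k standard deviations, a common statistical-process-control convention; the k multipliers and the daily values below are illustrative assumptions, not the actual K-RISS settings:

```python
# Minimal control-chart sketch: caution/attention/serious thresholds at
# mean + k * stdev. The k values and daily data are hypothetical.
from statistics import mean, stdev

def severity_levels(series, k_caution=1.0, k_attention=2.0, k_serious=3.0):
    m, s = mean(series), stdev(series)
    return {"caution": m + k_caution * s,
            "attention": m + k_attention * s,
            "serious": m + k_serious * s}

def classify(value, levels):
    if value >= levels["serious"]:
        return "serious"
    if value >= levels["attention"]:
        return "attention"
    if value >= levels["caution"]:
        return "caution"
    return "normal"

daily_kriss = [10, 12, 11, 9, 13, 10, 11, 12]  # hypothetical daily index
levels = severity_levels(daily_kriss)
print(classify(30, levels))  # far above mean + 3*stdev -> "serious"
```

A structural-change step (Bai and Perron's method) would recompute these levels separately for each detected regime.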

The Effect of Relaxation Technique on Reduction of Postoperative Pain (이완술 사용이 수술후 동통 감소에 미치는 영향)

  • 박정숙
    • Journal of Korean Academy of Nursing
    • /
    • v.15 no.1
    • /
    • pp.76-96
    • /
    • 1985
  • Postoperative pain is one of the most frequently occurring kinds of pain in hospitals, but it has been underestimated because it is only a part of the postoperative physiological process and may disappear in time. It is necessary for nurses to use relaxation techniques, planned and implemented independently by themselves, to reduce this postoperative pain. This study is aimed at showing the effect of a relaxation technique on the reduction of postoperative pain and at exploring the factors influencing postoperative pain. Fifty-seven patients with abdominal surgery admitted to D Medical Center, attached to K University in Daegu, were studied: twenty-nine formed the experimental group and the remaining twenty-eight the control group. Data were collected through interviews and observation from August 23 to October 24, 1984. Two tools, both developed by the researcher from a review of the literature, were used: a Postoperative Pain Scale, and a relaxation technique designed for adequate use in the postoperative setting. After confirming that there were no significant differences between the two groups, the hypotheses were statistically verified by the χ²-test, t-test, and Pearson correlation coefficient. The results of this study are summarized as follows. The main hypothesis, that the experimental group using the relaxation technique would have less postoperative pain than the control group not using it, was divided into three sub-hypotheses. 1. The first sub-hypothesis, that the experimental group would have a lower postoperative pain score than the control group, was accepted (t=7.810, p<.01). Even after controlling for pain threshold, which differed somewhat between the two groups, the experimental group had a lower postoperative pain score than the control group, which confirms the acceptance of the first sub-hypothesis more strongly. 2. The second sub-hypothesis, that the experimental group would use analgesics less frequently than the control group, was accepted (χ²=9.85, p<.01). 3. The third sub-hypothesis, that the experimental group would show less variation in pulse, respiration, and blood pressure between the pre- and postoperative periods than the control group, was rejected. This hypothesis was therefore re-verified by comparing the variation in pulse, respiration, and blood pressure before and after changing position, to measure the pure effect of the relaxation technique. Pulse and respiration were significantly lower in the experimental group (t=7.209, p<.01; t=3.473, p<.01), but systolic and diastolic blood pressure did not differ significantly between the two groups (t=1.309, p>.05; t=1.727, p>.05). Therefore the third sub-hypothesis was partially accepted. In conclusion, the researcher believes that nurses should provide patients with relaxation techniques, both to reduce postoperative pain and to increase the independence of nursing.
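The group comparisons above rest on the two-sample t statistic; as a minimal sketch of that computation (the two pain-score samples below are invented for illustration, not the study's data):

```python
# Pooled-variance two-sample t statistic; sample data are hypothetical.
from statistics import mean, variance

def t_statistic(a, b):
    """t statistic for the difference of two sample means."""
    na, nb = len(a), len(b)
    # Pooled sample variance.
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

control = [7, 8, 6, 7, 9, 8]   # hypothetical pain scores, no relaxation
relaxed = [4, 5, 3, 4, 5, 4]   # hypothetical pain scores, with relaxation
print(round(t_statistic(control, relaxed), 2))  # 6.32
```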


Feasibility study of the beating cancellation during the satellite vibration test

  • Bettacchioli, Alain
    • Advances in aircraft and spacecraft science
    • /
    • v.5 no.2
    • /
    • pp.225-237
    • /
    • 2018
  • The difficulties of satellite vibration testing are due to the commonly expressed qualification requirements being incompatible with the limited performance of the entire controlled system (satellite + interface + shaker + controller). Two features cause the problem: firstly, the main satellite modes (i.e., the first structural mode and the high and low tank modes) are very weakly damped; secondly, the controller is just too basic to achieve the expected performance in such cases. The combination of these two issues results in oscillations around the notching levels and high amplitude beating immediately after the mode. The beating overshoots are a major risk source because they can result in the test being aborted if the qualification upper limit is exceeded. Although the abort is, in itself, a safety measure protecting the tested satellite, it increases the risk of structural fatigue, firstly because the abort threshold has been already reached, and secondly, because the test must restart at the same close-resonance frequency and remain there until the qualification level is reached and the sweep frequency can continue. The beat minimum relates only to small successive frequency ranges in which the qualification level is not reached. Although they are less problematic because they do not cause an inadvertent test shutdown, such situations inevitably result in waiver requests from the client. A controlled-system analysis indicates an operating principle that cannot provide sufficient stability: the drive calculation (which controls the process) simply multiplies the frequency reference (usually called cola) and a function of the following setpoint, the ratio between the amplitude already reached and the previous setpoint, and the compression factor. This function value changes at each cola interval, but it never takes into account the sensor signal phase. 
Because of these limitations, we first examined whether it was possible to empirically determine, using a series of tests with a very simple dummy, a controller setting process that significantly improves the results. As the attempt failed, we performed simulations seeking an optimum adjustment by minimizing the least mean square of the difference between the reference and response signals. The simulations showed a significant improvement during the notch beat and a small reduction in the beat amplitude. However, the small improvement in this process was not useful, because it highlighted the need to change the reference at each cola interval, sometimes with setpoints almost twice the qualification level. Another uncertainty regarding the consequences of such an approach involves the impact of differences between the estimated model (used in the simulation) and the actual system. As limitations in the current controller were identified in different approaches, we considered the feasibility of a new controller that takes into account an estimated single-input multi-output (SIMO) model, whose parameters were estimated from a very low-level sweep. Against this backdrop, we analyzed the feasibility of LQG control in cancelling beating, and this article highlights the relevance of such an approach.
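The LQG approach mentioned above combines a state estimator with optimal state feedback; as a minimal sketch of the feedback (LQR) half for a scalar discrete-time plant, with purely illustrative numbers rather than the satellite/shaker model of the paper:

```python
# Minimal LQR sketch for a scalar system x[k+1] = a*x[k] + b*u[k];
# a, b, q, r below are illustrative, not the paper's identified model.

def dlqr_scalar(a, b, q, r, iters=500):
    """Iterate the discrete Riccati equation to the steady-state gain k,
    so that u = -k*x minimizes sum(q*x^2 + r*u^2)."""
    p = q
    for _ in range(iters):
        k = (b * p * a) / (r + b * p * b)   # feedback gain
        p = q + a * p * (a - b * k)         # Riccati update
    return k

k_gain = dlqr_scalar(a=0.98, b=0.1, q=1.0, r=0.01)
# The closed-loop pole a - b*k must have magnitude < 1 (stable).
print(abs(0.98 - 0.1 * k_gain) < 1.0)  # True
```

A full LQG design would pair this gain with a Kalman filter estimating the state from the noisy sensor signals.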

Design and Implementation of Static Program Analyzer Finding All Buffer Overrun Errors in C Programs (C 프로그램의 버퍼 오버런(buffer overrun) 오류를 찾아 주는 정적 분석기의 설계와 구현)

  • Yi Kwang-Keun;Kim Jae-Whang;Jung Yung-Bum
    • Journal of KIISE:Software and Applications
    • /
    • v.33 no.5
    • /
    • pp.508-524
    • /
    • 2006
  • We present our experience of combining, in a realistic setting, a static analyzer with a statistical analysis. This combination is intended to reduce the inevitable false alarms from a domain-unaware static analyzer. Our analyzer, named Airac (Array Index Range Analyzer for C), collects all the true buffer-overrun points in ANSI C programs. Soundness is maintained, and the analysis' cost-accuracy improvement is achieved by techniques that the static analysis community has long accumulated. For the still inevitable false alarms (e.g., Airac raised 970 buffer-overrun alarms in commercial C programs of 5.3 million lines, and 737 of the 970 alarms were false), which inevitably arise for particular C programs, we use a statistical post-analysis. The statistical analysis, given the analysis results (alarms), sifts out probable false alarms and prioritizes true alarms by estimating the probability of each alarm being true. The probabilities are used in two ways: 1) only the alarms whose true-alarm probabilities exceed a threshold are reported to the user; 2) the alarms are sorted by probability before reporting, so that the user can check highly probable errors first. In our experiments with Linux kernel sources, when the risk of missing a true error was set to be about 3 times greater than that of raising a false alarm, 74.83% of false alarms could be filtered, and only 15.17% of false alarms were encountered before the user had observed 50% of the true alarms.
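The two reporting steps above (threshold filtering, then sorting by probability) can be sketched minimally as follows; the alarm locations and probabilities are hypothetical, not Airac's actual estimates:

```python
# Minimal sketch of the statistical post-analysis reporting step:
# keep alarms whose estimated true-alarm probability exceeds a
# threshold, then report them most-probable first. Data are made up.

def report_alarms(alarms, threshold=0.5):
    """alarms: list of (location, p_true). Return probable alarms,
    sorted by descending probability."""
    probable = [a for a in alarms if a[1] >= threshold]
    return sorted(probable, key=lambda a: a[1], reverse=True)

alarms = [("foo.c:42", 0.91), ("bar.c:7", 0.12), ("baz.c:99", 0.66)]
for loc, p in report_alarms(alarms):
    print(f"{loc}: p(true) = {p}")   # bar.c:7 is filtered out
```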

Design of Curve Road Detection System by Convergence of Sensor (센서 융합에 의한 곡선차선 검출 시스템 설계)

  • Kim, Gea-Hee;Jeong, Seon-Mi;Mun, Hyung-Jin;Kim, Chang-Geun
    • Journal of Digital Convergence
    • /
    • v.14 no.8
    • /
    • pp.253-259
    • /
    • 2016
  • Research on lane recognition has continued so that vehicles can navigate autonomously and traffic accidents can be prevented, and lane recognition and detection have developed remarkably as new algorithms have appeared. Those studies were based on vision systems, and the recognition rate improved; however, when driving at night or in rain, the recognition rate has not reached a satisfactory level. To improve on the weaknesses of vision-system-based lane recognition and detection, and to apply sensor convergence technology to the response after an accident has happened, this study focused on curve road detection among the studies on lane recognition. In road detection, not only straight roads but also curve roads should be detected, and the result can be used in traffic accident investigation. With the threshold value of curvature, which expresses the degree of the curve, set between 0.001 and 0.06, the system was shown to be able to detect the curve road.
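The curvature-threshold test above can be sketched minimally as follows; the fitted-lane coefficients are hypothetical, and the curvature formula is the standard one for a parabola y = a·x² + b·x + c, which the paper does not necessarily use in this exact form:

```python
# Minimal sketch: classify a detected lane as a curve when its
# curvature falls in the paper's 0.001-0.06 threshold range.
# Lane-fit coefficients are hypothetical.

def curvature(a, b, x):
    """Curvature of y = a*x^2 + b*x + c at position x."""
    return abs(2 * a) / (1 + (2 * a * x + b) ** 2) ** 1.5

def is_curve(kappa, low=0.001, high=0.06):
    return low <= kappa <= high

kappa = curvature(a=0.005, b=0.0, x=0.0)  # curvature 0.01 at the vertex
print(is_curve(kappa))  # True: within the threshold range
```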

Extraction of Renal Glomeruli Region using Genetic Algorithm (유전적 알고리듬을 이용한 신장 사구체 영역의 추출)

  • Kim, Eung-Kyeu
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.46 no.2
    • /
    • pp.30-39
    • /
    • 2009
  • Extraction of the glomeruli region plays a very important role in diagnosing nephritis automatically. However, it is not easy to extract the glomeruli region correctly, because the difference between the glomeruli region and other regions is not obvious, and because unevenness is introduced in the sampling and imaging processes. In this study, a new method for extracting the renal glomeruli region using a genetic algorithm is proposed. First, low- and high-resolution images are obtained using a Laplacian-Gaussian filter with σ = 2.1 and σ = 1.8, and binary images are then obtained by setting the threshold value to zero. Border edges are detected from the low-resolution image, and the border of a glomerulus is expressed as a closed B-spline curve. The parameters that determine this closed curve are searched for with a genetic algorithm, which prevents noise and border lines broken off in the middle from disrupting the extraction. Next, to obtain more precise glomerular border edges, the number of node points is increased and corrected in order from eight to sixteen and then thirty-two using the high-resolution images. Finally, the validity of the proposed method is shown by applying it to real images.
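The zero-threshold binarization step above can be sketched minimally as follows; the tiny "filtered" array is made up, standing in for a real Laplacian-of-Gaussian response at σ = 2.1 or 1.8:

```python
# Minimal sketch of binarization at threshold zero: after
# Laplacian-of-Gaussian filtering, split pixels at 0 to form a
# binary image. The response values below are hypothetical.

def binarize_at_zero(filtered):
    """Binary image: 1 where the filter response is positive, else 0."""
    return [[1 if v > 0 else 0 for v in row] for row in filtered]

log_response = [[-0.3, 0.2, 0.0],
                [0.5, -0.1, 0.4]]
print(binarize_at_zero(log_response))  # [[0, 1, 0], [1, 0, 1]]
```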

A Study on the Automatic Generation of Test Case Based on Source Code for Quality Improvement (소프트웨어 품질향상을 위한 소스코드 기반의 테스트 케이스 자동 생성에 관한 연구)

  • Son, Ung-Jin;Lee, Seung-Ho
    • Journal of IKEEE
    • /
    • v.19 no.2
    • /
    • pp.186-192
    • /
    • 2015
  • This paper proposes a technology for automatically generating test cases based on the APIs in source code, for software quality improvement. The proposed technology comprises four processes: analyzing source code using the Doxygen open-source tool, defining an API specification from the analysis results, creating a test design, and generating test cases by applying the pairwise test technique. Analyzing source code with Doxygen is the phase in which API information in the source code, such as API names, input parameters, and return parameters, is extracted. Defining the API specification is the phase in which the API information needed to generate test cases is stored in database form, using SQLite, on the basis of the extracted API information. Creating the test design is the phase in which scenarios are designed and stored in the database by defining thresholds for input and return parameters and setting limitations based on the defined APIs. Generating test cases with the pairwise technique is the phase in which real test cases are created and stored in the database on the basis of the test design information. To evaluate the efficiency of the proposed technology, it was compared with specification-based test case creation. The result shows wider test coverage, meaning that more cases were created in a similar amount of time. By changing the quality improvement process in software development from manual handwork to automatic test case generation based on the APIs in source code, a reduction in manpower and development time is expected.
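The pairwise technique named above can be sketched with a simple greedy generator: every pair of values between every two parameters must appear in at least one case. The parameter names and values are hypothetical, and a production tool would use a stronger algorithm than this greedy pass:

```python
# Minimal greedy sketch of pairwise (all-pairs) test case generation.
# Parameters and values are hypothetical.
from itertools import combinations, product

def pairwise(parameters):
    """parameters: list of value lists. Return cases covering every
    value pair between every two parameters at least once."""
    idx_pairs = list(combinations(range(len(parameters)), 2))
    uncovered = set()
    for i, j in idx_pairs:
        uncovered |= {(i, a, j, b)
                      for a in parameters[i] for b in parameters[j]}
    all_cases = list(product(*parameters))
    chosen = []
    while uncovered:
        # Greedily pick the case covering the most uncovered pairs.
        best = max(all_cases, key=lambda c: len(
            {(i, c[i], j, c[j]) for i, j in idx_pairs} & uncovered))
        chosen.append(best)
        uncovered -= {(i, best[i], j, best[j]) for i, j in idx_pairs}
    return chosen

cases = pairwise([["GET", "POST"], [200, 404], ["json", "xml"]])
print(len(cases))  # 4 cases cover all pairs vs. 8 exhaustive combinations
```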

Time Series Data Analysis and Prediction System Using PCA (주성분 분석 기법을 활용한 시계열 데이터 분석 및 예측 시스템)

  • Jin, Young-Hoon;Ji, Se-Hyun;Han, Kun-Hee
    • Journal of the Korea Convergence Society
    • /
    • v.12 no.11
    • /
    • pp.99-107
    • /
    • 2021
  • We live amid a myriad of data. Data are created in every situation in which we work, and we discover their meaning through big data technology; many efforts are underway to find meaningful data. This paper introduces an analysis technique, based on principal component analysis, that enables humans to make better choices through the trend and prediction of time series data. Principal component analysis constructs a covariance matrix from the input data and yields eigenvectors and eigenvalues from which the direction of the data can be inferred. The proposed method computes a reference axis over a time series data set with similar directionality, and predicts the directionality of the data in the next section through the angle between the reference axis and the directionality of each time series constituting the data set. In this paper, we compare and verify the accuracy of the proposed algorithm against LSTM (Long Short-Term Memory) on cryptocurrency trends. In this comparison, on data with high volatility the proposed method recorded relatively few transactions and high returns (112%) compared to LSTM. This suggests that the signal was analyzed and predicted relatively accurately, and better results are expected through more accurate threshold setting.
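The covariance/eigenvector step above can be sketched for two-dimensional points with the closed-form solution for a 2x2 symmetric matrix; the sample points are invented, and a real pipeline over longer series would use a numerical eigensolver:

```python
# Minimal sketch: principal direction (dominant eigenvector of the
# covariance matrix) of 2-D points, using the 2x2 closed form.
# Sample data are hypothetical.
import math

def principal_direction(points):
    """Unit eigenvector for the largest eigenvalue of the 2x2
    covariance matrix of (x, y) points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]].
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    vx, vy = lam - syy, sxy          # un-normalized eigenvector
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Points rising roughly along y = x: principal axis close to 45 degrees.
direction = principal_direction([(0, 0.1), (1, 0.9), (2, 2.1), (3, 2.9)])
angle = math.degrees(math.atan2(direction[1], direction[0]))
print(round(angle))  # close to 45 degrees
```

The angle between such a per-series direction and the reference axis is what the method thresholds when predicting the next section.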

An Estimation on Average Service Life of Public Buildings in South Korea: In Case of RCC (우리나라 공공건물의 내용연수 추정: RCC를 중심으로)

  • Jung-Hoon Kwon;Jin-Hyung Cho;Hyun-Seung Oh;Sae-Jae Lee
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.46 no.1
    • /
    • pp.84-90
    • /
    • 2023
  • ASL estimation for a public building depends on how appropriately the maximum age of the asset is derived from the age records in the statistical data owned by public institutions, because that number yields a 'constrained' ASL. This is especially true because other studies have assumed that buildings follow an Iowa R3 curve. In this study, a survival rate of 1% is used as the threshold at which the survival curve and the predicted-life curve almost coincide. Rather than a theoretical basis, this setting follows the national statistical survey, in which the residual asset value is recognized as 10% of the acquisition value when the average service life has elapsed and 1% when twice the average service life has elapsed. The biggest constraint in fitting the statistical data to the Iowa curve is that the maximum ASL is selected at R3 150%, and the 'constrained' ASL is calculated proportionally on the assumption that the Iowa curve is followed. Further constraints were considered in the same manner. First, the R3 disposal curve for RCC (reinforced cement concrete) buildings was prepared according to the discarding method in the 2000 work; although we worked jointly with the National Statistical Office to secure the maximum amount of vintage data, the small sample size must be acknowledged. Even after that, the National Statistical Office and the Bank of Korea have been working on estimating an Iowa curve for each asset class in the I-O table. Another limitation is that the asset classification uses the broad classification of buildings as a subcategory. Second, if there were assets with a lifespan of 115 years, acquired in 1905 and disposed of in 2020, such discard data would be omitted from this ASL calculation.
Third, it is difficult to estimate the correct Iowa curve from the stub curve even where disposal data exist, because Korea has a relatively short construction history and has accumulated economic wealth only since the 1980s. In other words, the 'constrained' ASL under-estimates the true ASL. Considering that Korea was an economically developing country in the past and then underwent rapid economic development, environmental factors such as asset accumulation and economic capacity should be taken into account. Korea's period of accumulated economic wealth is short, and its history of 'proper' architecture faithful to building regulations and principles is short as well; as a result, buildings 'not built properly' and 'proper' architecture are mixed. In this study, the ASL of RCC public buildings was estimated at 70 years.
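The 1% survival threshold above can be sketched minimally as a walk down a survival curve; the curve below is hypothetical, not an actual Iowa R3 curve:

```python
# Minimal sketch of the 1% survival threshold: report the age at which
# the surviving fraction first falls to or below the threshold.
# The survival curve below is made up for illustration.

def age_at_survival(curve, threshold=0.01):
    """curve: list of (age, surviving fraction), ascending by age.
    Returns the first age at or below the threshold, else None."""
    for age, surviving in curve:
        if surviving <= threshold:
            return age
    return None  # curve truncated before the threshold (a stub curve)

survival = [(0, 1.00), (20, 0.95), (40, 0.70), (60, 0.30),
            (80, 0.05), (100, 0.008)]
print(age_at_survival(survival))  # 100
```

A truncated record, like the stub curves discussed above, returns None: the threshold age (and hence the maximum ASL) cannot be read off directly.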