• Title/Summary/Keyword: accuracy index


A Comparison of a New Infant Chest Compression Method: A Randomized Crossover Simulation Study Using a Manikin (새로운 영아 가슴압박법의 비교: 마네킨을 이용한 랜덤화 교차 시뮬레이션 연구)

  • Yun, Seong-Woo
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2019.05a
    • /
    • pp.525-527
    • /
    • 2019
  • Cardiac arrest is a condition in which the heart stops, regardless of the cause. One of the only ways to save a patient's life in the event of cardiac arrest is cardiopulmonary resuscitation (CPR), which is critically important because it maintains circulation, and high-quality CPR affects the survival rate and neurological prognosis of the patient. For infant cardiopulmonary resuscitation, two fingers are used to compress the chest. However, this method can make it difficult to reach the chest compression depth recommended by the American Heart Association because of anatomically increased fatigue of the fingers and the difficulty of applying vertical pressure. This study aims to verify the effects of a new chest compression method during infant cardiopulmonary resuscitation. The study showed significant differences in chest compression depth and average compression rate (p < 0.001). Based on these results, the new chest compression method increases the accuracy of chest compressions during infant cardiopulmonary resuscitation and improves compression depth, thereby improving the quality index of chest compressions.


Lifesaver: Android-based Application for Human Emergency Falling State Recognition

  • Abbas, Qaisar
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.8
    • /
    • pp.267-275
    • /
    • 2021
  • In this paper, a smart application (Lifesaver) is developed on an Android-based platform to automatically determine a human emergency state using the mobile device's sensors. In practice, Lifesaver has many applications and can easily be combined with other applications to detect human emergencies. For example, if an elderly person falls for medical reasons, the application automatically detects the person's state and calls a contact from the emergency contact list. Likewise, if a car crashes in an accident, the Lifesaver application helps call a person on the emergency contact list to save the occupant's life. The main objective of this project is therefore to develop an application that can save human lives. The proposed Lifesaver application assists a person in getting immediate attention when no help is present, in four different situations. To develop the Lifesaver system, GPS is integrated to obtain the exact location of the person in an emergency, and an emergency list of friends and authorities is also maintained. To test and evaluate the Lifesaver system, data from 50 different people in the 40-70 age range were collected, and the performance of the application was evaluated and compared with other state-of-the-art applications. On average, the Lifesaver system achieved 95.5% detection accuracy and a value of 91.5 on the emergency index metric, outperforming other applications in this domain.
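The abstract does not specify the fall-detection algorithm; a minimal sketch of the common accelerometer-based approach (a near free-fall phase followed by an impact spike, detected by thresholding the acceleration magnitude) might look like the following. The thresholds and function names are illustrative assumptions, not taken from the paper.

```python
import math

# Illustrative thresholds (assumptions, not from the paper):
# near free-fall when |a| drops well below 1 g, impact when it spikes.
FREE_FALL_G = 0.4   # fraction of 1 g
IMPACT_G = 2.5      # multiples of 1 g
G = 9.81            # m/s^2

def magnitude(sample):
    """Euclidean norm of a 3-axis accelerometer sample (ax, ay, az) in m/s^2."""
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples):
    """Return True if a free-fall phase is followed by an impact spike."""
    free_fall_seen = False
    for s in samples:
        g_units = magnitude(s) / G
        if g_units < FREE_FALL_G:
            free_fall_seen = True
        elif free_fall_seen and g_units > IMPACT_G:
            return True
    return False

# A fall: normal standing (~1 g), free fall (~0 g), then a hard impact.
fall = [(0, 0, 9.8), (0, 0, 1.0), (0, 0, 1.2), (5.0, 20.0, 25.0)]
walk = [(0, 0, 9.8), (0.5, 0.3, 9.6), (0.2, 0.1, 10.1)]
print(detect_fall(fall))  # True
print(detect_fall(walk))  # False
```

A real application would run this over a sliding sensor window and trigger the GPS lookup and emergency call only after the user fails to cancel an alert.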

Structural similarity based efficient keyframes extraction from multi-view videos (구조적인 유사성에 기반한 다중 뷰 비디오의 효율적인 키프레임 추출)

  • Hussain, Tanveer;Khan, Salman;Muhammad, Khan;Lee, Mi Young;Baik, Sung Wook
    • The Journal of Korean Institute of Next Generation Computing
    • /
    • v.14 no.6
    • /
    • pp.7-14
    • /
    • 2018
  • Salient information extraction from multi-view videos is very challenging because of inter-view and intra-view correlations and computational complexity. Several techniques have been developed for keyframe extraction from multi-view videos, but with very high computational complexity. In this paper, we present a keyframe extraction approach for multi-view videos that uses the entropy and complexity information present inside each frame. In the first step, we extract representative shots of the whole video from each view based on the structural similarity index measure (SSIM) difference value between frames. In the second step, entropy and complexity scores are computed for all frames of the shots in the different views. Finally, the frames with the highest entropy and complexity scores are selected as keyframes. The proposed system is subjectively evaluated on an available office benchmark dataset, and the results are convincing in terms of accuracy and time complexity.
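The two-step pipeline described above can be sketched as follows. The sketch uses mean absolute frame difference as a stand-in for the SSIM difference (a full windowed SSIM is omitted for brevity) and Shannon histogram entropy as the per-frame score; the threshold and names are illustrative assumptions.

```python
import numpy as np

def frame_entropy(frame):
    """Shannon entropy of an 8-bit grayscale frame's intensity histogram."""
    hist, _ = np.histogram(frame, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def split_shots(frames, threshold=30.0):
    """Split a view into shots where the inter-frame difference spikes.
    Mean absolute difference stands in for the SSIM difference here."""
    shots, current = [], [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        if np.mean(np.abs(cur.astype(float) - prev.astype(float))) > threshold:
            shots.append(current)
            current = []
        current.append(cur)
    shots.append(current)
    return shots

def extract_keyframes(frames):
    """One keyframe per shot: the frame with the highest entropy score."""
    return [max(shot, key=frame_entropy) for shot in split_shots(frames)]

# Toy example: flat frames form shots; the noisy frame has high entropy.
rng = np.random.default_rng(0)
dark = np.zeros((32, 32), dtype=np.uint8)
noisy = rng.integers(0, 256, (32, 32), dtype=np.uint8)
bright = np.full((32, 32), 200, dtype=np.uint8)
keys = extract_keyframes([dark, noisy, bright, bright])
print(len(keys))  # 3 shots -> 3 keyframes
```

The paper additionally scores frame "complexity" and works across views; this sketch shows only the single-view shot-split-then-score skeleton.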

A study on the Filtering of Spam E-mail using n-Gram indexing and Support Vector Machine (n-Gram 색인화와 Support Vector Machine을 사용한 스팸메일 필터링에 대한 연구)

  • 서정우;손태식;서정택;문종섭
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.14 no.2
    • /
    • pp.23-33
    • /
    • 2004
  • Because of the rapid growth of the Internet environment, message exchange by e-mail is also increasing rapidly. However, despite the convenience of e-mail, spam mail has become a big issue, wasting the time and money of individuals and enterprises. Many kinds of solutions have been studied to counter the harmful effects of spam mail. Typical methods include pattern matching using representative keywords and probabilistic methods such as Naive Bayes. In this paper, to compensate for the problems of existing research, we propose a method for classifying spam mails and normal mails using a Support Vector Machine, which has excellent performance in pattern classification problems. In particular, the proposed method performs the learning procedure efficiently with a word dictionary whose index is generated by n-grams. In conclusion, we verified the proposed method by comparing the spam mail separation accuracy of an existing approach and the proposed scheme.

A Performance Comparison of Histogram Equalization Algorithms for Cervical Cancer Classification Model (평활화 알고리즘에 따른 자궁경부 분류 모델의 성능 비교 연구)

  • Kim, Youn Ji;Park, Ye Rang;Kim, Young Jae;Ju, Woong;Nam, Kyehyun;Kim, Kwang Gi
    • Journal of Biomedical Engineering Research
    • /
    • v.42 no.3
    • /
    • pp.80-85
    • /
    • 2021
  • We developed models to classify cervical images as cancerous or not using deep learning on cervical images to which histogram equalization algorithms had been applied, and compared the performance of each model. A total of 4,259 images were used for this study, of which 1,852 were normal and 2,407 were abnormal. This paper applied Image Sharpening (IS), Histogram Equalization (HE), and Contrast Limited Adaptive Histogram Equalization (CLAHE) to the original images. The Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index Measure (SSIM) were used to assess image quality objectively. As a result, IS showed a PSNR of 81.75 dB and an SSIM of 0.96, the best image quality. CLAHE and HE showed PSNRs of 62.67 dB and 62.60 dB, respectively, while the SSIM of CLAHE was 0.86, closer to 1 than HE's 0.75. Using a ResNet-50 model with transfer learning, the digitally processed images were classified as normal or abnormal. In conclusion, the classification accuracy of each model was as follows: 90.77% for IS, the highest, 90.26% for CLAHE, and 87.60% for HE. As this study shows, applying proper digital image processing to cervical images in Computer-Aided Diagnosis (CAD) can help with both screening and diagnosis.
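The two image-quality metrics used above are standard; a minimal NumPy sketch of PSNR and a simplified global SSIM (the usual implementation is windowed over local patches, so this global variant is only illustrative) might be:

```python
import math
import numpy as np

def psnr(original, processed, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB between two 8-bit images."""
    mse = np.mean((original.astype(float) - processed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

def ssim_global(x, y, max_val=255.0):
    """Simplified global SSIM: one window spanning the whole image."""
    x = x.astype(float)
    y = y.astype(float)
    c1 = (0.01 * max_val) ** 2  # standard stabilizing constants
    c2 = (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

a = np.zeros((8, 8), dtype=np.uint8)
b = np.full((8, 8), 10, dtype=np.uint8)
print(round(psnr(a, b), 1))         # 28.1 (dB)
print(round(ssim_global(a, a), 3))  # 1.0 for identical images
```

Both metrics compare a processed image against the original, which is how the paper quantifies what each equalization algorithm does to image quality.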

Effects of feather processing methods on quantity of extracted corticosterone in broiler chickens

  • Ataallahi, Mohammad;Nejad, Jalil Ghassemi;Song, Jun-Ik;Kim, Jin-Soo;Park, Kyu-Hyun
    • Journal of Animal Science and Technology
    • /
    • v.62 no.6
    • /
    • pp.884-892
    • /
    • 2020
  • Corticosterone is known as a biological stress index in many species, including birds. Feather corticosterone concentration (FCC) has increasingly been used as a measure of chronic stress status in broiler chickens. As sample preparation is the first step of the analytical process, different techniques of feather matrix disruption need to be validated to obtain better results when analyzing corticosterone extraction. The current study validated pulverizing feathers with a bead beater (BB) and processing them with surgical scissors (SS) prior to corticosterone extraction from broiler chicken feathers. The type of feather processing prior to hormone extraction may alter the final output; finding a standard method suited to the laboratory's facilities is therefore pivotal. This study was carried out to determine the effects of feather pulverization methods on the amount of corticosterone extracted in broiler chickens. Feathers were sampled from four-week-old Ross 308 broiler chickens (n = 12 birds). All broiler chickens were kept under the same environmental conditions and had access to feed and water. Feather samples were assigned to one of the following processing methods: 1) pulverizing with a BB, or 2) chopping into tiny pieces with SS. Each sample was duplicated into two wells during enzyme immunoassay (EIA) analysis to improve the accuracy of the obtained data. The results showed lower standard errors and more consistent FCC output with the BB method than with the SS method. An overall comparison showed a significantly higher (p < 0.001) FCC in the BB group than in the SS group. Overall, the BB method is recommended over the SS method for feather processing because of its ability to homogenize a large number of samples simultaneously, its ease of use, and its greater extraction of feather corticosterone.

Deriving the Effective Atomic Number with a Dual-Energy Image Set Acquired by the Big Bore CT Simulator

  • Jung, Seongmoon;Kim, Bitbyeol;Kim, Jung-in;Park, Jong Min;Choi, Chang Heon
    • Journal of Radiation Protection and Research
    • /
    • v.45 no.4
    • /
    • pp.171-177
    • /
    • 2020
  • Background: This study aims to determine the effective atomic number (Zeff) from dual-energy image sets obtained using a conventional computed tomography (CT) simulator. The estimated Zeff can be used to derive the stopping power and for material decomposition of CT images, thereby improving dose calculations in radiation therapy. Materials and Methods: An electron-density phantom was scanned using a Philips Brilliance CT Big Bore at 80 and 140 kVp. The estimated Zeff values were compared with those obtained using the calibration phantom by applying the Rutherford, Schneider, and Joshi methods. The fitting parameters were optimized using a nonlinear least squares regression algorithm. The fitting curve and mass attenuation data were obtained from the National Institute of Standards and Technology. The fitting parameters were validated by estimating the residual errors between the reference and calculated Zeff values. Next, the calculation accuracy of Zeff was evaluated by comparing the calculated values with the reference Zeff values of the insert plugs. The exposure levels of patients under additional CT scanning at 80, 120, and 140 kVp were evaluated by measuring the weighted CT dose index (CTDIw). Results and Discussion: The residual errors of the fitting parameters were lower than 2%. The best and worst Zeff values were obtained using the Schneider and Joshi methods, respectively. The maximum differences between the reference and calculated values were 11.3% (lung during inhalation), 4.7% (adipose tissue), and 9.8% (lung during inhalation) when applying the Rutherford, Schneider, and Joshi methods, respectively. Under dual-energy scanning (80 and 140 kVp), the patient exposure level was approximately twice that of general single-energy scanning (120 kVp). Conclusion: Zeff was calculated from two image sets scanned by a conventional single-energy CT simulator. The results obtained using the three different methods were compared. The Zeff calculation based on single-energy CT image sets exhibited appropriate feasibility.
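To sketch the calibration idea: for dual-energy methods of this family, the ratio of low- to high-energy attenuation varies with Zeff roughly as a power law, and its parameters can be fitted to calibration inserts by linearized least squares. The data and the power-law form below are illustrative assumptions, not the paper's actual fitting functions (Rutherford, Schneider, and Joshi each use their own parameterization).

```python
import numpy as np

def fit_power_law(z_ref, ratio):
    """Fit ratio = a * Zeff**b by least squares on log-transformed data."""
    b, log_a = np.polyfit(np.log(z_ref), np.log(ratio), 1)
    return np.exp(log_a), b

def zeff_from_ratio(r, a, b):
    """Invert the calibration curve to estimate Zeff from a measured ratio."""
    return (r / a) ** (1.0 / b)

# Hypothetical calibration inserts: known Zeff values and a synthetic
# low/high-kVp attenuation ratio following ratio = 0.8 * Z**1.9.
z_ref = np.array([6.0, 7.4, 8.0, 11.0, 13.0])
ratio = 0.8 * z_ref ** 1.9
a, b = fit_power_law(z_ref, ratio)
print(round(a, 3), round(b, 3))  # recovers the synthetic 0.8 and 1.9
print(round(zeff_from_ratio(0.8 * 10.0 ** 1.9, a, b), 2))  # ~10.0
```

Residual errors between reference and fitted Zeff over the calibration inserts, as in the paper, would validate the fit before applying it to patient scans.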

Bending analysis of functionally graded plates using a new refined quasi-3D shear deformation theory and the concept of the neutral surface position

  • Hachemi, Houari;Bousahla, Abdelmoumen Anis;Kaci, Abdelhakim;Bourada, Fouad;Tounsi, Abdeldjebbar;Benrahou, Kouider Halim;Tounsi, Abdelouahed;Al-Zahrani, Mesfer Mohammad;Mahmoud, S.R.
    • Steel and Composite Structures
    • /
    • v.39 no.1
    • /
    • pp.51-64
    • /
    • 2021
  • This paper presents a high-order shear and normal deformation theory for the bending of FGM plates. The number of unknowns and governing equations of the present theory is reduced, which makes it simple to use. Unlike other theories, the number of unknown functions involved in the displacement field is only four, as against five or more in other shear and normal deformation theories. Based on the novel shear and normal deformation theory, the position of the neutral surface is determined and the governing equilibrium equations based on the neutral surface are derived. There is no stretching-bending coupling effect in the neutral-surface-based formulation; consequently, the governing equations of functionally graded plates based on the neutral surface have the same simple forms as those of isotropic plates. A Navier-type analytical solution is obtained for a functionally graded plate subjected to a transverse load with simply supported boundary conditions. The accuracy of the present theory is verified by comparing the obtained results with other quasi-3D higher-order theories reported in the literature. Other numerical examples are also presented to show the influences of the volume fraction distribution, geometrical parameters, and power-law index on the bending responses of FGM plates.
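For context, the power-law index mentioned above typically enters through the standard rule-of-mixtures grading of material properties across the plate thickness; a common form (an assumption about notation, not necessarily the exact one used in this paper) is:

```latex
% Effective property P(z) graded through thickness h by power-law index p,
% where P_c, P_m are the ceramic and metal properties and z \in [-h/2, h/2]:
P(z) = \left(P_c - P_m\right)\left(\frac{z}{h} + \frac{1}{2}\right)^{p} + P_m
```

Larger p concentrates the ceramic phase near one face, which shifts the neutral surface away from the geometric mid-plane and motivates the neutral-surface formulation.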

Correlation between gray values of cone-beam computed tomograms and Hounsfield units of computed tomograms: A systematic review and meta-analysis

  • Selvaraj, Abirami;Jain, Ravindra Kumar;Nagi, Ravleen;Balasubramaniam, Arthi
    • Imaging Science in Dentistry
    • /
    • v.52 no.2
    • /
    • pp.133-140
    • /
    • 2022
  • Purpose: The aim of this review was to systematically analyze the available literature on the correlation between the gray values (GVs) of cone-beam computed tomography (CBCT) and the Hounsfield units (HUs) of computed tomography (CT) for assessing bone mineral density. Materials and Methods: A literature search was carried out in PubMed, the Cochrane Library, Google Scholar, Scopus, and LILACS for studies published through September 2021. In vitro, in vivo, and animal studies that analyzed the correlation between the GVs of CBCT and the HUs of CT were included in this review. The review was prepared according to the PRISMA checklist for systematic reviews, and the risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool. A quantitative analysis was performed using a fixed-effects model. Results: The literature search identified a total of 5,955 studies, of which 14 were included in the qualitative analysis and 2 in the quantitative analysis. A positive correlation was observed between the GVs of CBCT and the HUs of CT. Of the 14 studies, 100% had a low risk of bias for the domains of patient selection, index test, and reference standards, while 95% had a low risk of bias for the domain of flow and timing. The fixed-effects meta-analysis of Pearson correlation coefficients between CBCT and CT showed a moderate positive correlation (r = 0.669; 95% CI, 0.388 to 0.836; p < 0.05). Conclusion: The available evidence showed a positive correlation between the GVs of CBCT and the HUs of CT.
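Fixed-effects pooling of Pearson correlations is conventionally done on the Fisher z scale with a weight of n - 3 per study; a minimal sketch (the study data shown are made up for illustration, not the review's actual inputs) might be:

```python
import math

def pool_correlations(studies):
    """Fixed-effect pooled Pearson r via the Fisher z-transform.
    studies: list of (r, n) pairs; each study is weighted by n - 3."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    z = num / den
    se = 1.0 / math.sqrt(den)
    lo, hi = z - 1.96 * se, z + 1.96 * se
    # Back-transform the pooled z and its 95% CI to the correlation scale.
    return math.tanh(z), (math.tanh(lo), math.tanh(hi))

# Hypothetical studies: (correlation coefficient, sample size).
r_pooled, (ci_lo, ci_hi) = pool_correlations([(0.60, 30), (0.72, 25)])
print(round(r_pooled, 3), round(ci_lo, 3), round(ci_hi, 3))
```

The z-transform makes the sampling distribution approximately normal, which is what justifies the simple inverse-variance weighting of the fixed-effects model.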

A Method of Reducing the Processing Cost of Similarity Queries in Databases (데이터베이스에서 유사도 질의 처리 비용 감소 방법)

  • Kim, Sunkyung;Park, Ji Su;Shon, Jin Gon
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.4
    • /
    • pp.157-162
    • /
    • 2022
  • Today, most data is stored in a database (DB). In the DB environment, users request the DB to find the data they want. A similarity query has a predicate expressed in terms of a similarity function. However, when processing a similarity query it is difficult to use an index that can narrow the range of records examined, so the cost of calculating the similarity for every record in the table is incurred each time. To solve this problem, this paper defines a lightweight similarity function. The lightweight similarity function has lower data-filtering accuracy than the original similarity function, but costs less to compute. We present a method for reducing similarity query processing cost using the properties of the lightweight similarity function. Chebyshev distance is then presented as a lightweight similarity function for the Euclidean distance function, and the processing cost of a query using the existing similarity function is compared with that of a query using the lightweight similarity function. Experiments confirm that similarity query processing cost is reduced when Chebyshev distance is applied as a lightweight similarity function for Euclidean similarity.
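The pairing works because Chebyshev distance never exceeds Euclidean distance, so it can safely pre-filter candidates for a range query without losing any true matches. A minimal sketch of this filter-then-refine pattern (names and data are illustrative, not the paper's implementation) might be:

```python
import math

def chebyshev(p, q):
    """L-infinity distance: cheap, and always <= Euclidean distance."""
    return max(abs(a - b) for a, b in zip(p, q))

def euclidean(p, q):
    """L2 distance: the exact (more expensive) similarity function."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def range_query(records, target, radius):
    """Filter-then-refine: discard records whose Chebyshev distance
    already exceeds the radius, then verify survivors with Euclidean.
    Since chebyshev(p, q) <= euclidean(p, q), no true match is lost."""
    candidates = [r for r in records if chebyshev(r, target) <= radius]
    return [r for r in candidates if euclidean(r, target) <= radius]

records = [(0.0, 0.0), (1.0, 1.0), (3.0, 4.0), (0.5, 0.2)]
target = (0.0, 0.0)
print(range_query(records, target, 1.0))  # [(0.0, 0.0), (0.5, 0.2)]
```

Here (1.0, 1.0) survives the cheap filter but is rejected by the exact check, while (3.0, 4.0) is discarded without ever paying for a square root: that saved refinement work is the cost reduction the paper measures.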