• Title/Summary/Keyword: Normalization approach

Search Results: 115

Spacecraft Attitude Estimation by Unscented Filtering (고른 필터를 이용한 인공위성의 자세 추정)

  • Leeghim, Hen-Zeh;Choi, Yoon-Hyuk;Bang, Hyo-Choong;Park, Jong-Oh
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.14 no.9
    • /
    • pp.865-872
    • /
    • 2008
  • Spacecraft attitude estimation using the nonlinear unscented filter is addressed to fully utilize the capabilities of the unscented transformation. To reduce the significant computational load, an efficient technique is proposed that reasonably removes correlation between random variables. This modification yields a considerable reduction in the number of sigma points and in the computational burden of the matrix square-root calculation for most nonlinear systems. The unscented filter makes use of a set of sample points to predict the mean and covariance. The general QUEST (QUaternion ESTimator) algorithm preserves the quaternion normalization explicitly, whereas the extended Kalman filter (EKF) obeys the constraint only implicitly. For quaternion-based spacecraft attitude estimation, an approach to computing quaternion means from sampled quaternions that guarantees the quaternion norm constraint is introduced, applying a constrained optimization technique. Finally, the performance of the new approach is demonstrated using star-tracker and rate-gyro measurements.
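
The norm-preserving quaternion averaging the abstract describes can be sketched with the standard eigenvector-based mean: maximizing q^T M q subject to ||q|| = 1, whose solution is the dominant eigenvector of the weighted outer-product matrix. This is one common way to solve that constrained optimization, not necessarily the paper's exact algorithm; the uniform default weights are an assumption for illustration.

```python
import numpy as np

def quaternion_mean(quats, weights=None):
    """Average unit quaternions while preserving the norm constraint.

    Solves max_q q^T M q s.t. ||q|| = 1, where M = sum_i w_i q_i q_i^T;
    the maximizer is the eigenvector of M with the largest eigenvalue,
    so the result is a unit quaternion by construction.
    """
    quats = np.asarray(quats, dtype=float)
    if weights is None:
        weights = np.ones(len(quats)) / len(quats)  # uniform weights assumed
    M = sum(w * np.outer(q, q) for w, q in zip(weights, quats))
    eigvals, eigvecs = np.linalg.eigh(M)            # ascending eigenvalues
    mean_q = eigvecs[:, -1]                         # dominant eigenvector
    return mean_q / np.linalg.norm(mean_q)
```

Note that this mean is defined up to sign, which is consistent with the quaternion double cover of rotations.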

A novel story on rock slope reliability, by an initiative model that incorporated the harmony of damage, probability and fuzziness

  • Wang, Yajun
    • Geomechanics and Engineering
    • /
    • v.12 no.2
    • /
    • pp.269-294
    • /
    • 2017
  • This study aimed to formulate fuzzy stochastic damage so as to describe reliability more essentially, analyzing the harmony of the damage conception, probability, and fuzzy degree of membership in the interval [0,1]. Two kinds of fuzzy behavior of damage development were deduced. Fuzzy stochastic damage models were established based on the fuzzy membership functional and equivalent normalization theory. A fuzzy stochastic damage finite element method was developed as the approach to reliability simulation. The three-dimensional fuzzy stochastic damage mechanical behaviors of the Jianshan mine slope were analyzed and examined based on this approach. The comprehensive results, including the displacement, stress, damage, and their stochastic characteristics, consistently indicate that the failure foci of the Jianshan mine slope are the slope-cutting areas where, with a maximal failure probability of 40%, hazardous domino effects will trigger sliding of the neighboring rock bodies.
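
As a minimal illustration of a fuzzy degree of membership on [0,1] for a damage variable, a piecewise-linear membership function can be used; the thresholds d0 and d1 below are illustrative assumptions, not values from the paper, whose actual membership functionals are not reproduced here.

```python
import numpy as np

def damage_membership(d, d0=0.2, d1=0.8):
    """Piecewise-linear fuzzy membership of a damage variable d in [0, 1].

    Below d0 the material is treated as effectively intact (membership 0),
    above d1 as fully damaged (membership 1), with a linear transition
    between; d0 and d1 are illustrative thresholds only.
    """
    return float(np.clip((d - d0) / (d1 - d0), 0.0, 1.0))
```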

Adaptable Center Detection of a Laser Line with a Normalization Approach using Hessian-matrix Eigenvalues

  • Xu, Guan;Sun, Lina;Li, Xiaotao;Su, Jian;Hao, Zhaobing;Lu, Xue
    • Journal of the Optical Society of Korea
    • /
    • v.18 no.4
    • /
    • pp.317-329
    • /
    • 2014
  • In vision measurement systems based on structured light, the key to detection precision is accurately determining the central position of the projected laser line in the image. The purpose of this research is to extract laser-line centers based on a decision function generated to distinguish the real centers from candidate points with a high recognition rate. First, preprocessing of the image using a difference-image method is conducted to segment the laser line. Second, feature points at the integer-pixel level are selected as the initial line centers by the eigenvalues of the Hessian matrix. Third, according to the light-intensity distribution of a laser line, which obeys a Gaussian distribution in the transverse section and a constant distribution in the longitudinal section, a normalized model of the Hessian-matrix eigenvalues for the candidate centers is presented to reasonably balance the two eigenvalues, which indicate the variation tendencies of the second-order partial derivatives of the Gaussian function and the constant function, respectively. The proposed model integrates a Gaussian recognition function and a sinusoidal recognition function. The Gaussian recognition function captures the characteristic that one eigenvalue approaches zero, and enhances the sensitivity of the decision function to that characteristic, which corresponds to the longitudinal direction of the laser line. The sinusoidal recognition function evaluates the feature that the other eigenvalue is negative with a large absolute value, making the decision function more sensitive to that feature, which is related to the transverse direction of the laser line. The proposed decision function thus synthetically assigns higher values to the real centers, considering the properties in both the longitudinal and transverse directions of the laser line.
Moreover, this method provides a decision value from 0 to 1 for arbitrary candidate centers, which yields a normalized measure across different laser lines in different images. Pixels whose normalized results are close to 1 are determined to be the real centers by progressive scanning of the image columns. Finally, the zero point of a second-order Taylor expansion in the eigenvector's direction is employed to further refine the extracted central points at the subpixel level. The experimental results show that the method based on this normalization model accurately extracts the coordinates of laser-line centers and obtains a higher recognition rate in two groups of experiments.
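
The shape of such a decision function can be sketched as below: a Gaussian term rewarding the eigenvalue near zero (longitudinal direction) multiplied by a sinusoidal term rewarding a strongly negative second eigenvalue (transverse direction). The exact functional forms, the scale parameters sigma and lam_max, and the clipping are illustrative assumptions, not the paper's formulas.

```python
import numpy as np

def center_decision(l1, l2, sigma=1.0, lam_max=10.0):
    """Decision value in [0, 1] for a laser-line center candidate.

    l1, l2: Hessian eigenvalues ordered so that |l1| <= |l2|. At a real
    center, l1 ~ 0 along the line and l2 is strongly negative across it.
    sigma and lam_max are illustrative scale parameters.
    """
    gauss = np.exp(-(l1 ** 2) / (2.0 * sigma ** 2))  # rewards l1 close to 0
    ratio = np.clip(-l2 / lam_max, 0.0, 1.0)         # only negative l2 counts
    sine = np.sin(0.5 * np.pi * ratio)               # rewards large |l2|, l2 < 0
    return gauss * sine
```

Both factors lie in [0, 1], so their product is a normalized score comparable across different laser lines and images, matching the abstract's description of a 0-to-1 decision value.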

Truncation Artifact Reduction Using Weighted Normalization Method in Prototype R/F Chest Digital Tomosynthesis (CDT) System (프로토타입 R/F 흉부 디지털 단층영상합성장치 시스템에서 잘림 아티팩트 감소를 위한 가중 정규화 접근법에 대한 연구)

  • Son, Junyoung;Choi, Sunghoon;Lee, Donghoon;Kim, Hee-Joung
    • Journal of the Korean Society of Radiology
    • /
    • v.13 no.1
    • /
    • pp.111-118
    • /
    • 2019
  • Chest digital tomosynthesis has become a practical imaging modality because it can solve the problem of anatomical overlap in conventional chest radiography. However, because of both the limited scan angle and the finite-size detector, a portion of the chest cannot be represented in some or all of the projections. This produces a discontinuity in intensity across the field-of-view boundaries in the reconstructed slices, which we refer to as truncation artifacts. The purpose of this study was to reduce truncation artifacts using a weighted normalization approach and to investigate the performance of this approach for our prototype chest digital tomosynthesis system. The source-to-image distance was 1100 mm, and the center of rotation of the X-ray source was located 100 mm above the detector surface. After obtaining 41 projection views over ±20°, tomosynthesis slices were reconstructed with the filtered back-projection algorithm. For quantitative evaluation, peak signal-to-noise ratio and structural similarity index values were evaluated against a reference image reconstructed in simulation, and the mean value along a specific direction was evaluated using real data. The simulation results showed that both the peak signal-to-noise ratio and the structural similarity index were improved, and the experimental results showed that the effect of the artifact on the directional mean value of the reconstructed image was reduced. In conclusion, the weighted normalization method improves image quality by reducing truncation artifacts, and these results suggest that it could improve the image quality of chest digital tomosynthesis.
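
The paper's exact weighting scheme is not given in this abstract; one common way to suppress the intensity step at the field-of-view boundary, sketched below under that assumption, is to accumulate only the measured (in-field-of-view) backprojected values per voxel and normalize by each voxel's own coverage count rather than by the total number of views.

```python
import numpy as np

def weighted_backproject(projections, in_fov_masks, eps=1e-6):
    """Coverage-weighted accumulation of (filtered) backprojected views.

    projections: list of 2-D backprojected slabs, one per view.
    in_fov_masks: matching boolean masks, True where the voxel was
    actually measured by that view's finite detector.
    Voxels seen by only a few views are divided by their own coverage
    count instead of the full view count, which suppresses the
    intensity discontinuity at the field-of-view boundary.
    """
    acc = np.zeros_like(projections[0], dtype=float)
    cov = np.zeros_like(projections[0], dtype=float)
    for p, m in zip(projections, in_fov_masks):
        acc += np.where(m, p, 0.0)
        cov += m.astype(float)
    return acc / np.maximum(cov, eps)  # avoid division by zero coverage
```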

The Effect of the Bobath Approach on Balance and Motor Ability in Mentally Retarded Child (보바스 접근방법이 정신지체 아동의 균형 및 운동능력에 미치는 영향: 단일사례연구)

  • Ro, Hyo-Lyun
    • Journal of Korean Physical Therapy Science
    • /
    • v.15 no.2
    • /
    • pp.79-86
    • /
    • 2008
  • Background: The purpose of this study was to present a practical method of medical treatment to improve the balance and motor ability of a mentally retarded child, using a single-subject design. Methods: The subject of the study was a 39-month-old mentally retarded female. The study included a 2-week baseline period and a 13-week treatment period. The treatment method was based on the Bobath approach. The Gross Motor Function Measure (GMFM) was used to examine changes in motor ability, and the Pediatric Balance Scale (PBS) was used to measure changes in balance ability. The treatment program was composed of normalization of muscle tone, strengthening of leg endurance and muscular strength, improvement of trunk alignment, and enhancement of balance. The visual rate of change was used to examine the results. Results: Balance ability increased on the Pediatric Balance Scale (PBS) by 24 points, and motor function increased on the Gross Motor Function Measure (GMFM) by 6.9% (18 points). Standing increased by 41% (16 points), and walking, running, and jumping increased by 31.9% (23 points) compared to the baseline period. Therefore, the Bobath approach appears to be an appropriate method to improve balance and motor ability in mentally retarded children. Conclusion: Aggressive intervention by physical therapists and occupational therapists, and a follow-up study, are required to develop motor ability in mentally retarded children.


Hybrid model-based and deep learning-based metal artifact reduction method in dental cone-beam computed tomography

  • Jin Hur;Yeong-Gil Shin;Ho Lee
    • Nuclear Engineering and Technology
    • /
    • v.55 no.8
    • /
    • pp.2854-2863
    • /
    • 2023
  • Objective: To present a hybrid approach that incorporates a constrained beam-hardening estimator (CBHE) and deep learning (DL)-based post-refinement for metal artifact reduction in dental cone-beam computed tomography (CBCT). Methods: Constrained beam-hardening estimator (CBHE) is derived from a polychromatic X-ray attenuation model with respect to X-ray transmission length, which calculates associated parameters numerically. Deep-learning-based post-refinement with an artifact disentanglement network (ADN) is performed to mitigate the remaining dark shading regions around a metal. Artifact disentanglement network (ADN) supports an unsupervised learning approach, in which no paired CBCT images are required. The network consists of an encoder that separates artifacts and content and a decoder for the content. Additionally, ADN with data normalization replaces metal regions with values from bone or soft tissue regions. Finally, the metal regions obtained from the CBHE are blended into reconstructed images. The proposed approach is systematically assessed using a dental phantom with two types of metal objects for qualitative and quantitative comparisons. Results: The proposed hybrid scheme provides improved image quality in areas surrounding the metal while preserving native structures. Conclusion: This study may significantly improve the detection of areas of interest in many dentomaxillofacial applications.
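
The final blending step described in the abstract, restoring the CBHE-estimated metal regions into the DL-refined reconstruction, can be sketched as a simple mask-based composite. This is a minimal sketch of that step only; the CBHE and ADN stages themselves are not reproduced, and the function name is illustrative.

```python
import numpy as np

def blend_metal_regions(refined, metal_image, metal_mask):
    """Paste estimated metal voxels back into the refined slice.

    Outside the metal mask the artifact-refined content is kept;
    inside it, the metal values recovered by the earlier estimation
    stage are restored.
    """
    return np.where(metal_mask, metal_image, refined)
```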

Enhancing Heart Disease Prediction Accuracy through Soft Voting Ensemble Techniques

  • Byung-Joo Kim
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.16 no.3
    • /
    • pp.290-297
    • /
    • 2024
  • We investigate the efficacy of ensemble learning methods, specifically the soft voting technique, for enhancing heart disease prediction accuracy. Our study uniquely combines Logistic Regression, SVM with RBF Kernel, and Random Forest models in a soft voting ensemble to improve predictive performance. We demonstrate that this approach outperforms individual models in diagnosing heart disease. Our research contributes to the field by applying a well-curated dataset with normalization and optimization techniques, conducting a comprehensive comparative analysis of different machine learning models, and showcasing the superior performance of the soft voting ensemble in medical diagnosis. This multifaceted approach allows us to provide a thorough evaluation of the soft voting ensemble's effectiveness in the context of heart disease prediction. We evaluate our models based on accuracy, precision, recall, F1 score, and Area Under the ROC Curve (AUC). Our results indicate that the soft voting ensemble technique achieves higher accuracy and robustness in heart disease prediction compared to individual classifiers. This study advances the application of machine learning in medical diagnostics, offering a novel approach to improve heart disease prediction. Our findings have significant implications for early detection and management of heart disease, potentially contributing to better patient outcomes and more efficient healthcare resource allocation.
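
The soft voting rule itself is simple: average the class-probability outputs of the member classifiers (here Logistic Regression, RBF-kernel SVM, and Random Forest) and pick the class with the highest mean probability. A minimal sketch of that rule is below; in practice scikit-learn's VotingClassifier with voting='soft' implements it for fitted estimators, and the uniform default weights here are an assumption.

```python
import numpy as np

def soft_vote(prob_list, weights=None):
    """Combine classifiers by averaging their class-probability outputs.

    prob_list: per-model probability matrices, shape (n_models,
    n_samples, n_classes). Returns the averaged probabilities and the
    per-sample argmax class labels.
    """
    probs = np.asarray(prob_list, dtype=float)
    if weights is None:
        weights = np.ones(len(probs)) / len(probs)  # equal weights assumed
    avg = np.tensordot(weights, probs, axes=1)      # (n_samples, n_classes)
    return avg, avg.argmax(axis=1)
```

For example, three models giving one sample the class-1 probabilities 0.4, 0.6, and 0.7 yield a mean of about 0.57, so the ensemble predicts class 1 even though one member disagrees.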

Improving Multinomial Naive Bayes Text Classifier (다항시행접근 단순 베이지안 문서분류기의 개선)

  • 김상범;임해창
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.3_4
    • /
    • pp.259-267
    • /
    • 2003
  • Though naive Bayes text classifiers are widely used because of their simplicity, techniques for improving the performance of these classifiers have rarely been studied. In this paper, we propose and evaluate some general and effective techniques for improving the performance of the naive Bayes text classifier. We suggest document-model-based parameter estimation and document length normalization to alleviate the problems in the traditional multinomial approach to text classification. In addition, a mutual-information-weighted naive Bayes text classifier is proposed to increase the effect of highly informative words. Our techniques are evaluated on the Reuters-21578 and 20 Newsgroups collections, and significant improvements are obtained over the existing multinomial naive Bayes approach.
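
One simple way to realize document length normalization in a multinomial naive Bayes classifier, sketched below, is to rescale each document's term-frequency vector to a fixed target length before scoring it against the per-class log word probabilities. The target length of 100 and the exact scaling are illustrative assumptions, not the paper's specific formulation.

```python
import numpy as np

def length_normalized_loglik(counts, log_theta, target_len=100.0):
    """Multinomial NB class scores with document-length normalization.

    counts: (n_terms,) raw term frequencies of one document.
    log_theta: (n_classes, n_terms) log word probabilities per class.
    Scaling counts to a common target length keeps long documents from
    producing disproportionately extreme class scores.
    """
    total = counts.sum()
    scaled = counts * (target_len / total) if total > 0 else counts
    return log_theta @ scaled  # (n_classes,) unnormalized log-scores
```

With this normalization, two documents with the same term proportions but different lengths receive identical class scores.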

A Missing Data Imputation by Combining K Nearest Neighbor with Maximum Likelihood Estimation for Numerical Software Project Data (K-NN과 최대 우도 추정법을 결합한 소프트웨어 프로젝트 수치 데이터용 결측값 대치법)

  • Lee, Dong-Ho;Yoon, Kyung-A;Bae, Doo-Hwan
    • Journal of KIISE:Software and Applications
    • /
    • v.36 no.4
    • /
    • pp.273-282
    • /
    • 2009
  • Missing data is one of the common problems in building analysis or prediction models from software project data. Imputation methods are known to handle missing data more effectively than deletion methods for small software project data sets. While K nearest neighbor imputation is a suitable imputation method for software project data, it cannot use the non-missing information of incomplete project instances. In this paper, we propose an approach to missing data imputation for numerical software project data that combines K nearest neighbor imputation with maximum likelihood estimation; we also extend the average absolute error measure by normalization for accurate evaluation. Our approach overcomes the limitation of K nearest neighbor imputation and outperforms it on our real data sets.
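
The K nearest neighbor half of such a hybrid can be sketched as below: distances are computed only over the features observed in the incomplete row, and the missing entries are filled with the mean of the k nearest complete rows. This is a simplified sketch; the paper's maximum likelihood refinement step, which also exploits incomplete instances, is omitted.

```python
import numpy as np

def knn_impute(data, k=2):
    """Fill NaNs with the mean of the k nearest complete rows.

    Distance uses only the features observed in the incomplete row,
    so rows are compared on their shared non-missing dimensions.
    """
    data = np.array(data, dtype=float)
    complete = data[~np.isnan(data).any(axis=1)]
    for row in data:                       # rows are views; edits persist
        miss = np.isnan(row)
        if not miss.any():
            continue
        d = np.sqrt(((complete[:, ~miss] - row[~miss]) ** 2).sum(axis=1))
        nearest = complete[np.argsort(d)[:k]]
        row[miss] = nearest[:, miss].mean(axis=0)
    return data
```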

Generic Training Set based Multimanifold Discriminant Learning for Single Sample Face Recognition

  • Dong, Xiwei;Wu, Fei;Jing, Xiao-Yuan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.1
    • /
    • pp.368-391
    • /
    • 2018
  • Face recognition (FR) with a single sample per person (SSPP) is common in real-world face recognition applications. In this scenario, it is hard to predict the intra-class variations of query samples from gallery samples due to the lack of sufficient training samples. Inspired by the fact that similar faces have similar intra-class variations, we propose a virtual sample generating algorithm called k nearest neighbors based virtual sample generating (kNNVSG) to enrich the intra-class variation information of the training samples. Furthermore, in order to use the intra-class variation information of the virtual samples generated by the kNNVSG algorithm, we propose the image set based multimanifold discriminant learning (ISMMDL) algorithm. ISMMDL learns a projection matrix for each manifold modeled by the local patches of the images of each class, aiming to minimize the intra-manifold margins and maximize the inter-manifold margins simultaneously in a low-dimensional feature space. Finally, by combining the kNNVSG and ISMMDL algorithms, we propose the k nearest neighbor virtual image set based multimanifold discriminant learning (kNNMMDL) approach for single sample face recognition (SSFR) tasks. Experimental results on the AR, Multi-PIE and LFW face datasets demonstrate that our approach has promising abilities for SSFR with expression, illumination and disguise variations.
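
The core idea behind kNNVSG, that similar faces share similar intra-class variations, can be sketched as a variation-transfer step: find the k generic faces nearest to the single gallery face and add their variation offsets (variant minus neutral) to it. The feature-vector representation, Euclidean distance, and function signature below are illustrative assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def knn_virtual_samples(gallery_face, generic_faces, generic_variants, k=3):
    """kNN-based virtual sample generation sketch.

    gallery_face: (n_features,) the single enrolled sample.
    generic_faces / generic_variants: (n_generic, n_features) neutral
    images and their varied counterparts (expression, lighting, etc.).
    Returns (k, n_features) virtual samples carrying transferred
    intra-class variation.
    """
    d = np.linalg.norm(generic_faces - gallery_face, axis=1)
    idx = np.argsort(d)[:k]                           # k nearest generic faces
    offsets = generic_variants[idx] - generic_faces[idx]
    return gallery_face + offsets                     # transfer the variations
```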