• Title/Summary/Keyword: linear algorithm


A screening of Alzheimer's disease using basis synthesis by singular value decomposition from Raman spectra of platelet (혈소판 라만 스펙트럼에서 특이값 분해에 의한 기저 합성을 통한 알츠하이머병 검출)

  • Park, Aaron;Baek, Sung-June
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.14 no.5
    • /
    • pp.2393-2399
    • /
    • 2013
  • In this paper, we propose a method for screening Alzheimer's disease (AD) from Raman spectra of platelets, using basis spectra synthesized by singular value decomposition (SVD). Raman spectra of platelets from AD transgenic mice are preprocessed by denoising, background removal, and normalization. The column vectors of each data matrix consist of the Raman spectra of AD and normal (NR) samples. Each matrix is factorized using the SVD algorithm, and the basis spectra of AD and NR are taken as the 12 leading column vectors of each factorized matrix. Classification is performed by selecting the class that minimizes the root-mean-square error between the validation spectrum and the linear synthesis of the basis spectra. In experiments involving 278 Raman spectra, the proposed method gave a classification rate of about 97.6%, about 6.1% better than a multi-layer perceptron (MLP) trained on features extracted by principal component analysis (PCA). The results show that basis spectra obtained by SVD are well suited for the diagnosis of AD from platelet Raman spectra.
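The SVD basis-synthesis classification described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the "spectra" are synthetic Gaussian-peak mixtures standing in for preprocessed platelet Raman data, while the 12-vector basis and the minimum-RMSE decision rule follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)

# Hypothetical stand-ins for preprocessed Raman spectra: each class is a
# distinct set of Gaussian peaks plus measurement noise.
def spectra(centers, n):
    base = sum(np.exp(-0.5 * ((x - c) / 0.03) ** 2) for c in centers)
    return np.column_stack([base + 0.05 * rng.standard_normal(x.size)
                            for _ in range(n)])

ad_train = spectra([0.2, 0.5, 0.8], 40)   # "AD" data matrix (columns = spectra)
nr_train = spectra([0.3, 0.6, 0.9], 40)   # "NR" data matrix

def basis(mat, k=12):
    # Factorize by SVD; the k leading left singular vectors form the basis.
    u, _, _ = np.linalg.svd(mat, full_matrices=False)
    return u[:, :k]

bases = {"AD": basis(ad_train), "NR": basis(nr_train)}

def classify(spec):
    # Pick the class whose basis gives the smallest RMSE between the
    # validation spectrum and its best linear synthesis from that basis.
    def rmse(b):
        coef, *_ = np.linalg.lstsq(b, spec, rcond=None)
        return np.sqrt(np.mean((spec - b @ coef) ** 2))
    return min(bases, key=lambda c: rmse(bases[c]))

test_ad = spectra([0.2, 0.5, 0.8], 1)[:, 0]
test_nr = spectra([0.3, 0.6, 0.9], 1)[:, 0]
```

A spectrum is then assigned to whichever class reconstructs it with the smaller residual, e.g. `classify(test_ad)`.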

Design and Performance Analysis of a Parallel Cell-Based Filtering Scheme using Horizontally-Partitioned Technique (수평 분할 방식을 이용한 병렬 셀-기반 필터링 기법의 설계 및 성능 평가)

  • Chang, Jae-Woo;Kim, Young-Chang
    • The KIPS Transactions:PartD
    • /
    • v.10D no.3
    • /
    • pp.459-470
    • /
    • 2003
  • Research on high-dimensional index structures is required to retrieve high-dimensional data efficiently, because attribute vectors in data warehousing and feature vectors in multimedia databases are inherently high-dimensional. Many high-dimensional index structures have been proposed, but they suffer from the so-called 'curse of dimensionality': retrieval performance degrades sharply as dimensionality increases. To address this, the cell-based filtering (CBF) scheme was proposed, but its performance still decreases linearly with dimensionality. Parallel processing techniques are therefore needed. In this paper, we propose a parallel CBF scheme that uses horizontal partitioning for declustering. To maximize retrieval performance, we build the proposed parallel CBF scheme on a shared-nothing (SN) cluster architecture. In addition, we present data insertion, range query processing, and k-NN query processing algorithms suited to the SN cluster architecture. Finally, we show that our parallel CBF scheme achieves retrieval performance that improves in proportion to the number of servers in the SN cluster, outperforming the conventional CBF scheme.
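The horizontal-partitioning idea can be illustrated with a toy shared-nothing k-NN query. The round-robin declustering and the coordinator-side candidate merge below are assumptions in the spirit of the abstract, not the paper's exact algorithms, and the per-server work is simulated sequentially.

```python
import numpy as np

rng = np.random.default_rng(1)
vectors = rng.random((1000, 16))   # high-dimensional feature vectors

# Horizontal partitioning (declustering): rows are spread round-robin
# across the servers of a hypothetical shared-nothing (SN) cluster.
n_servers = 4
partitions = [vectors[i::n_servers] for i in range(n_servers)]

def local_knn(part, query, k):
    # Each server answers the k-NN query on its own partition only.
    d = np.linalg.norm(part - query, axis=1)
    return np.sort(d)[:k]

def parallel_knn(query, k):
    # The coordinator merges the per-server candidate lists; the global
    # k nearest neighbours are guaranteed to be among these candidates.
    candidates = np.concatenate([local_knn(p, query, k) for p in partitions])
    return np.sort(candidates)[:k]

query = rng.random(16)
nearest = parallel_knn(query, 5)
```

Because every server returns its own k best candidates, the merged answer is identical to a brute-force scan of the whole dataset, while each server touches only 1/n of the rows.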

Construction of T$_1$ Map Image (T1 이완시간의 영상화)

  • 정은기;서진석;이종태;추성실;이삼현;권영길
    • Progress in Medical Physics
    • /
    • v.6 no.2
    • /
    • pp.83-92
    • /
    • 1995
  • T1 mapping of human anatomy can provide characteristic contrast among various tissues and between normal and abnormal tissue. Here, a methodology for constructing a T1 map from several images with different TRs is described using non-linear curve fitting. The general curve-fitting algorithm requires initial trial values T1t and M0t for the fitted variables. Three different methods of supplying the trial T1t and M0t are suggested and compared for efficiency and accuracy. The curve-fitting routine was written in ANSI C and executed on a SUN workstation. Several distilled-water phantoms with various concentrations of Gd-DTPA were prepared to examine the accuracy of the curve-fitting program. An MR image was used as the true proton-density image without any random noise, and several images with different TRs were generated with theoretical T1 relaxation times of 250, 500, and 1000 msec. Random noise of 1, 5, and 10% was embedded into the simulated images. These images were used to generate T1 maps, and the resulting maps for each T1 were analyzed to study the effect of random noise on the T1 map.
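The per-pixel fit can be sketched in a saturation-recovery form. The model M(TR) = M0·(1 - exp(-TR/T1)), the TR values, and the coarse-grid search (standing in for the paper's iterative fitter with trial values T1t and M0t) are all illustrative assumptions; for a fixed trial T1 the best M0 has a closed-form least-squares solution, so only T1 needs to be scanned.

```python
import numpy as np

rng = np.random.default_rng(2)
trs = np.array([100.0, 250.0, 500.0, 1000.0, 2000.0, 4000.0])   # ms

def signal(m0, t1, tr):
    # Assumed saturation-recovery model: M(TR) = M0 * (1 - exp(-TR/T1)).
    return m0 * (1.0 - np.exp(-tr / t1))

# Simulated pixel with 1% multiplicative noise, as in the paper's study.
true_m0, true_t1 = 1000.0, 500.0
meas = signal(true_m0, true_t1, trs) * (1 + 0.01 * rng.standard_normal(trs.size))

def fit_t1(trs, meas):
    # Scan trial T1 values; for each, M0 = (f.meas)/(f.f) minimizes the
    # sum of squared errors, where f = 1 - exp(-TR/T1).
    best = (np.inf, None, None)
    for t1 in np.linspace(50.0, 3000.0, 2951):
        f = 1.0 - np.exp(-trs / t1)
        m0 = (f @ meas) / (f @ f)
        sse = np.sum((meas - m0 * f) ** 2)
        if sse < best[0]:
            best = (sse, t1, m0)
    return best[1], best[2]

t1_est, m0_est = fit_t1(trs, meas)
```

Applying `fit_t1` to every pixel of a stack of different-TR images yields the T1 map.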


Reliability and Data Integration of Duplicated Test Results Using Two Bioelectrical Impedence Analysis Machines in the Korean Genome and Epidemiology Study

  • Park, Bo-Young;Yang, Jae-Jeong;Yang, Ji-Hyun;Kim, Ji-Min;Cho, Lisa-Y.;Kang, Dae-Hee;Shin, Chol;Hong, Young-Seoub;Choi, Bo-Youl;Kim, Sung-Soo;Park, Man-Suck;Park, Sue-K.
    • Journal of Preventive Medicine and Public Health
    • /
    • v.43 no.6
    • /
    • pp.479-485
    • /
    • 2010
  • Objectives: The Korean Genome and Epidemiology Study (KoGES), a multicenter, multi-cohort study, has collected body composition data using two different bioelectrical impedance analysis (BIA) machines. The aim of this study was to evaluate whether test values measured by the two BIA machines can be integrated through a statistical adjustment algorithm, given excellent inter-rater reliability. Methods: We selected two centers to measure the inter-rater reliability of the two BIA machines. We set the two machines up side by side and measured subjects' body composition between October and December 2007, collecting duplicated test values from 848 subjects. Pearson and intra-class correlation coefficients for inter-rater reliability were estimated from the results of the two machines. To assess the feasibility of data integration, we constructed statistical compensation models using linear regression with residual analysis and $R^2$ values. Results: All correlation coefficients indicated excellent reliability except for mineral mass. However, models using only the duplicated body composition values were not feasible for data integration, owing to relatively low $R^2$ values of about 0.8 for mineral mass and target weight. To integrate the body composition data, models adjusted for four empirical variables (age, sex, weight, and height) were most suitable (all $R^2$ > 0.9). Conclusions: The test values measured with the two BIA machines in the KoGES show excellent reliability for the nine body composition values. On this basis, the values can be integrated through statistical adjustment using regression equations that include age, sex, weight, and height.
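A compensation model of the kind the Results describe can be sketched with synthetic data. The relationship between the two machines and the covariate effects below are invented for illustration; only the modelling step (regressing machine-B values on machine-A values plus age, sex, weight, and height, then judging feasibility by R²) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 848  # number of subjects with duplicated measurements in the study

# Hypothetical duplicated data: machine B reads a scaled version of
# machine A plus covariate effects (all coefficients invented).
age = rng.integers(40, 70, n).astype(float)
sex = rng.integers(0, 2, n).astype(float)
weight = rng.normal(65, 10, n)
height = rng.normal(165, 8, n)
a_val = rng.normal(30, 5, n)   # one body-composition value from machine A
b_val = (1.1 * a_val + 0.05 * age - 0.5 * sex + 0.02 * weight
         + 0.01 * height + rng.normal(0, 0.3, n))

# Statistical compensation model: machine-B value regressed on the
# machine-A value and the four empirical covariates.
X = np.column_stack([np.ones(n), a_val, age, sex, weight, height])
coef, *_ = np.linalg.lstsq(X, b_val, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((b_val - pred) ** 2) / np.sum((b_val - b_val.mean()) ** 2)
```

With the covariates included, R² clears the feasibility threshold the paper uses (R² > 0.9), which is the criterion for accepting the integration equation.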

Shoreline-change Rates of the Barrier Islands in Nakdong River Estuary Using Aerial Photography and SPOT-5 Image (항공사진과 SPOT-5 위성영상을 이용한 낙동강 하구역 울타리섬들의 해안선 변화율)

  • Jeong, Sang-Hun;Khim, Boo-Keun;Kim, Beack-Oon;Lee, Sang-Ryong
    • Ocean and Polar Research
    • /
    • v.35 no.1
    • /
    • pp.1-14
    • /
    • 2013
  • Shoreline data for the barrier islands in the Nakdong River Estuary over the last three decades were assembled from six sets of aerial photographs and seven sets of satellite images. The Canny algorithm was applied to the raw data to obtain the wet-dry boundary as a proxy shoreline. The Digital Shoreline Analysis System (DSAS 4.0) was used to estimate the rate of shoreline change in terms of five statistical variables: SCE (Shoreline Change Envelope), NSM (Net Shoreline Movement), EPR (End Point Rate), LRR (Linear Regression Rate), and LMS (Least Median of Squares). The shoreline of Jinwoodo varied from place to place during the last three decades: the west tail advanced (i.e., seaward or southward), the west part retreated, the south part advanced, and the east part retreated. After the 2000s, the rate of shoreline change (-2.5~6.7 m/yr) increased and the east part advanced. The shoreline of Shinjado shows a counterclockwise movement: the west part has advanced, while the east part has retreated. Since Shinjado took its present form, the west part has become stable, but the east part has retreated faster. The rate of shoreline change in Shinjado (-16.0~12.0 m/yr) is greater than that of Jinwoodo. The shoreline of Doyodeung has advanced at a rate of 31.5 m/yr. Since Doyodeung took its present form, its south part has retreated at -18.2 m/yr, while its east and west parts have advanced at 13.5~14.3 m/yr. Based on the digital shoreline analysis, shoreline changes of the barrier islands in the Nakdong River Estuary have varied both temporally and spatially, although the exact causes of the changes require further investigation.
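The change-rate statistics DSAS reports can be reproduced for a single transect. The survey years and shoreline positions below are hypothetical; only the statistic definitions (NSM, EPR, LRR, SCE) follow the abstract.

```python
import numpy as np

# Hypothetical record along one DSAS transect: shoreline position
# (metres, seaward positive) at each survey date.
years = np.array([1975.0, 1982.0, 1990.0, 1997.0, 2004.0, 2010.0])
position = np.array([0.0, 8.0, 14.0, 25.0, 30.0, 41.0])

# NSM (Net Shoreline Movement): oldest-to-newest displacement.
nsm = position[-1] - position[0]

# EPR (End Point Rate): NSM divided by the elapsed time.
epr = nsm / (years[-1] - years[0])          # m/yr

# LRR (Linear Regression Rate): slope of a least-squares line through
# all surveys, so every shoreline contributes, not just the endpoints.
slope, intercept = np.polyfit(years, position, 1)
lrr = slope                                  # m/yr

# SCE (Shoreline Change Envelope): largest excursion over the record.
sce = position.max() - position.min()
```

A positive rate means the transect advanced seaward, matching the sign convention used for the rates quoted in the abstract.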

Active Stabilization for Surge Motion of Moored Vessel in Irregular Head Waves (불규칙 선수파랑 중 계류된 선박의 전후동요 제어)

  • Lee, Sang-Do;Truong, Ngoc Cuong;Xu, Xiao;You, Sam-Sang
    • Journal of the Korean Society of Marine Environment & Safety
    • /
    • v.26 no.5
    • /
    • pp.437-444
    • /
    • 2020
  • This study focused on stabilizing the surge motion of a moored vessel under irregular head seas. A two-point moored vessel shows strong non-linearity even in regular seas, owing to its inherently non-linear restoring force. When a long-crested irregular wave acts on the vessel system, the displacement and velocities exhibit more complex nonlinear behavior than under regular waves. Sliding mode control (SMC) is implemented on the moored vessel to control both surge displacement and surge velocity. SMC provides a closed-loop system with good performance and robustness against parameter uncertainties and disturbances; however, chattering is its main drawback in implementation. Chattering is minimized, and accurate state convergence achieved, by using a quasi-sliding mode that approximates the discontinuous switching function with a continuous sigmoid function. Numerical simulations were conducted to validate the effectiveness of the proposed control algorithm.
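The chattering-suppression idea can be sketched on a toy one-degree-of-freedom surge model. The vessel parameters, cubic restoring term, disturbance, and gains below are all illustrative; only the key step from the abstract is retained: the discontinuous sign(s) term of classical SMC is replaced by the continuous sigmoid s/(|s|+ε).

```python
import numpy as np

# Toy surge dynamics: m*a = u + d(t) - c*v - k1*x - k3*x**3
# (cubic k3 term mimics the non-linear mooring restoring force).
m, c, k1, k3 = 1.0, 0.2, 1.0, 0.5

def disturbance(t):
    # Crude long-crested irregular-wave stand-in: two incommensurate sines.
    return 0.3 * np.sin(0.7 * t) + 0.2 * np.sin(1.3 * t + 1.0)

lam, eta, eps = 2.0, 1.5, 0.05   # surface slope, switching gain, boundary layer

def control(x, v):
    # Quasi-sliding mode: s = v + lam*x; sign(s) is smoothed by the
    # sigmoid s/(|s| + eps) to suppress chattering inside the boundary layer.
    s = v + lam * x
    u_eq = c * v + k1 * x + k3 * x**3 - m * lam * v   # model-based equivalent control
    return u_eq - eta * s / (abs(s) + eps)

# Forward-Euler simulation from an initial surge offset of 1 m.
dt, x, v = 0.01, 1.0, 0.0
for i in range(5000):
    t = i * dt
    u = control(x, v)
    a = (u + disturbance(t) - c * v - k1 * x - k3 * x**3) / m
    x, v = x + dt * v, v + dt * a
```

With the switching gain larger than the disturbance bound, the state is driven into a small boundary layer around s = 0 and the surge offset decays to a bounded residual instead of chattering.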

Decision Algorithm of Natural Algae Coagulant Dose to Control Algae from the Influent of Water Works (정수장 유입조류 전처리를 위한 천연조류제거제(W.H.)의 최적주입농도 결정)

  • Jang, Yeo-Ju;Jung, Jin-Hong;Lim, Hyun-Man;Yoon, Young H.;Ahn, Kwang-Ho;Chang, Hyang-Youn;Kim, Weon-Jae
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.38 no.9
    • /
    • pp.482-496
    • /
    • 2016
  • Algal blooms of cyanobacteria (blue-green algae) caused by the eutrophication of rivers and lakes can lead not only to damage from biological toxins but also to economic losses in drinking water treatment. The natural algae coagulant W.H., a commercial product containing algicidal and allelopathic material derived from oak, can proactively control algal problems through a coagulation-flotation process. However, because the process has not previously been applied as pre-treatment in drinking water plants, no report on the optimum injection dose of W.H. was available. In this study, we conducted several sets of jar tests, varying the W.H. dose and the chl-a concentration, for (1) Han River samples and (2) subcultured cyanobacteria samples, and closely monitored the algae removal mechanisms. Based on these jar-test results, two linear equations with chl-a and turbidity as variables were derived by multiple regression analysis using IBM SPSS to predict the optimal W.H. dose. Prototypes of automatic control logic are also suggested for injecting the optimal W.H. dose promptly in response to variations in water quality.
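The dose-prediction step reduces to a two-variable linear regression. The jar-test numbers and coefficients below are fabricated for illustration (the study's actual equations are not given in the abstract); only the modelling form, optimal dose as a linear function of chl-a and turbidity fitted by multiple regression, follows the text.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical jar-test records: (chl-a ug/L, turbidity NTU) -> optimal
# coagulant dose (mg/L) found in each jar test.
chla = rng.uniform(10, 200, 40)
turb = rng.uniform(2, 50, 40)
dose = 0.08 * chla + 0.15 * turb + 1.5 + rng.normal(0, 0.2, 40)

# Multiple linear regression (the abstract's IBM SPSS step):
# dose = b0 + b1*chl-a + b2*turbidity, fitted by least squares.
X = np.column_stack([np.ones(chla.size), chla, turb])
b, *_ = np.linalg.lstsq(X, dose, rcond=None)

def predict_dose(chla_in, turb_in):
    # The fitted equation, usable online as new water-quality readings arrive.
    return b[0] + b[1] * chla_in + b[2] * turb_in
```

Once such an equation is fitted, the automatic control logic only needs current chl-a and turbidity readings to set the injection dose.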

A Case Study of Profit Optimization System Integration with Enhanced Security (관리보안이 강화된 수익성 최적화 시스템구축 사례연구)

  • Kim, Hyoung-Tae;Yoon, Ki-Chang;Yu, Seung-Hun
    • Journal of Distribution Science
    • /
    • v.13 no.11
    • /
    • pp.123-130
    • /
    • 2015
  • Purpose - Owing to highly elevated levels of competition, many companies today face decreasing profits even while their sales volumes increase. This phenomenon is common among companies that focus heavily on quantitative rather than qualitative growth; the two aspects must be well balanced for a company to create a sustainable business model. For supply chain management (SCM) planners, the optimized, quantified flow of resources was the major interest for decades. This trend is rapidly changing, however, so that managers can strike the appropriate balance between sales volume and sales quality, the latter evaluated from the profit margin. Profit optimization is a methodology that enables companies to pursue solutions focused more on profitability than on sales volume. In this study, we attempt to provide executional insight for companies considering implementation of a profit optimization system to enhance their business profitability. Research design, data, and methodology - We present a comprehensive treatment of profit optimization, including its fundamental concepts, the most common profit-optimization algorithm (linear programming), the business functional scope of a profit optimization system, the major success factors for implementing such a system in a business organization, and the weekly detailed business processes needed to manage system performance effectively toward the system's goals. Additionally, to provide more realistic and practical information, we carefully investigate a profit optimization system implementation project carried out for company S. The project lasted about eight months, with four full-time system development consultants deployed for the period. To guarantee the project's success, the organization adopted a proven system implementation methodology, SCM six-sigma, originally developed by consultants within Samsung SDS through focused efforts to synthesize SCM and six-sigma and thereby improve and innovate SCM operations across the entire Samsung organization. Results - Profit optimization can enable a company to create sales and production plans focused on more profitable products and customers, resulting in sustainable growth. In this study, we explain the concept of profit optimization and the prerequisites for successful implementation of the system. Furthermore, efficient system security administration, one of the hottest topics today, is also addressed. Conclusion - This case study can benefit the many companies searching for ways to break through their current profitability levels. We cannot guarantee that the decision to deploy a profit optimization system will bring success, but with the help of our study, companies implementing profit optimization systems can minimize the various risks that arise across the system implementation phases. The actual implementation case of the profit optimization project at company S can provide valuable lessons for both business organizations and research communities.
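Linear programming, named in the abstract as the most common profit-optimization algorithm, can be illustrated with a toy two-product plan. The products, margins, and capacities are invented, and the vertex-enumeration solver below is a teaching substitute for a production LP solver (valid here because a two-variable LP attains its optimum at a vertex of the feasible region).

```python
import itertools
import numpy as np

# Toy profit-optimization LP (all numbers illustrative):
#   maximize  40*x1 + 30*x2          (unit margins)
#   s.t.      x1 +   x2 <= 100       (assembly hours)
#           2*x1 +   x2 <= 160       (machine hours)
#             x1, x2 >= 0
A = np.array([[1.0, 1.0], [2.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([100.0, 160.0, 0.0, 0.0])
margin = np.array([40.0, 30.0])

# Enumerate vertices: each vertex is the intersection of two constraint
# lines; keep only feasible ones and pick the highest-margin plan.
best_x, best_val = None, -np.inf
for i, j in itertools.combinations(range(len(A)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                       # parallel constraints, no vertex
    x = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ x <= b + 1e-9):      # feasibility check
        val = margin @ x
        if val > best_val:
            best_x, best_val = x, val
```

Here the profit-optimal plan fills both capacity constraints exactly, illustrating how the LP shifts the mix toward the more profitable product only up to the shared-capacity limits.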

Implementation of Web-based Remote Multi-View 3D Imaging Communication System Using Adaptive Disparity Estimation Scheme (적응적 시차 추정기법을 이용한 웹 기반의 원격 다시점 3D 화상 통신 시스템의 구현)

  • Ko Jung-Hwan;Kim Eun-Soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.1C
    • /
    • pp.55-64
    • /
    • 2006
  • In this paper, a new web-based remote 3D imaging communication system employing an adaptive matching algorithm is suggested. In the proposed method, feature values are extracted from the stereo image pair by estimating the disparity and the similarities between pixels of the stereo images. The matching window size for disparity estimation is then adaptively selected according to the magnitude of this feature value. Finally, the detected disparity map and the left image are transmitted to the client over the network channel. On the client side, the right image is reconstructed and intermediate views are synthesized in real time by a linear combination of the left and right images using interpolation. In experiments on real-time web-based transmission and intermediate-view synthesis using two stereo image pairs, 'Joo' and 'Hoon', captured with a real camera, the PSNRs of the intermediate views reconstructed with the proposed transmission scheme measured 30 dB for 'Joo' and 27 dB for 'Hoon', and the delay required to obtain the 4-view intermediate image was kept to a fast 67.2 ms on average.
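The client-side synthesis step reduces to a weighted blend of the two views. In this sketch the random arrays stand in for the reconstructed left and right images, and the disparity-based warping the paper performs before blending is omitted; only the linear-combination interpolation itself is shown.

```python
import numpy as np

rng = np.random.default_rng(5)
left = rng.random((48, 64))    # stand-in for the transmitted left view
right = rng.random((48, 64))   # stand-in for the client-reconstructed right view

def intermediate_view(left, right, alpha):
    # Intermediate view at camera position alpha in [0, 1]: a linear
    # combination (interpolation) of the left and right images.
    # alpha = 0 reproduces the left view, alpha = 1 the right view.
    return (1.0 - alpha) * left + alpha * right

# Synthesize three in-between views, as for a multi-view display.
views = [intermediate_view(left, right, a) for a in (0.25, 0.5, 0.75)]
```

In the full system, each pixel would first be shifted by its (scaled) disparity before this blend, so that the interpolated view corresponds to a physically intermediate camera position.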

Why Gabor Frames? Two Fundamental Measures of Coherence and Their Role in Model Selection

  • Bajwa, Waheed U.;Calderbank, Robert;Jafarpour, Sina
    • Journal of Communications and Networks
    • /
    • v.12 no.4
    • /
    • pp.289-307
    • /
    • 2010
  • The problem of model selection arises in a number of contexts, such as subset selection in linear regression, estimation of structures in graphical models, and signal denoising. This paper studies non-asymptotic model selection for the general case of arbitrary (random or deterministic) design matrices and arbitrary nonzero entries of the signal. In this regard, it generalizes the notion of incoherence in the existing literature on model selection and introduces two fundamental measures of coherence, termed the worst-case coherence and the average coherence, among the columns of a design matrix. It uses these two measures of coherence to provide an in-depth analysis of a simple, model-order agnostic one-step thresholding (OST) algorithm for model selection, and proves that OST is feasible for exact as well as partial model selection as long as the design matrix obeys an easily verifiable property, termed the coherence property. One of the key insights offered by the ensuing analysis is that OST can successfully carry out model selection even when methods based on convex optimization, such as the lasso, fail because of rank deficiency of the submatrices of the design matrix. In addition, the paper establishes that if the design matrix has reasonably small worst-case and average coherence, then OST performs near-optimally when either (i) the energy of any nonzero entry of the signal is close to the average signal energy per nonzero entry, or (ii) the signal-to-noise ratio in the measurement system is not too high. Finally, two other key contributions of the paper are that (i) it provides bounds on the average coherence of Gaussian matrices and Gabor frames, and (ii) it extends the results on model selection using OST to low-complexity, model-order agnostic recovery of sparse signals with arbitrary nonzero entries. In particular, this part of the analysis implies that an Alltop Gabor frame together with OST can successfully carry out model selection and recovery of sparse signals irrespective of the phases of the nonzero entries, even if the number of nonzero entries scales almost linearly with the number of rows of the Alltop Gabor frame.
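The one-step thresholding (OST) algorithm the paper analyzes amounts to a single correlate-and-threshold pass over the columns of the design matrix. In this sketch a Gaussian design stands in for an Alltop Gabor frame, the sizes and amplitudes are illustrative, and the model order k is assumed known for simplicity (the paper's OST is model-order agnostic via a data-driven threshold).

```python
import numpy as np

rng = np.random.default_rng(6)
n, p, k = 512, 1024, 3

# Gaussian design with (near-)unit-norm columns; in the paper an Alltop
# Gabor frame plays this role, with bounded worst-case/average coherence.
X = rng.standard_normal((n, p)) / np.sqrt(n)

# Sparse signal with arbitrary nonzero entries on a random support.
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = rng.choice([-5.0, 5.0], size=k)
y = X @ beta + 0.01 * rng.standard_normal(n)

# One-step thresholding: correlate the measurement with every column
# once, then keep the largest |correlations| as the selected model.
corr = np.abs(X.T @ y)
selected = np.argsort(corr)[-k:]
```

Because the algorithm never solves an optimization problem, its cost is a single matrix-vector product, which is the "low-complexity" property the paper contrasts with lasso-type convex methods.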