• Title/Summary/Keyword: Data Reduction Technique


Data Visualization using Linear and Non-linear Dimensionality Reduction Methods

  • Kim, Junsuk;Youn, Joosang
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.12
    • /
    • pp.21-26
    • /
    • 2018
  • As large amounts of data can now be stored efficiently, methods for extracting meaningful features from big data have become important. In particular, techniques that convert high-dimensional data to low-dimensional data are crucial for data visualization. In this study, principal component analysis (PCA; a linear dimensionality reduction technique) and Isomap (a non-linear dimensionality reduction technique) are introduced and applied to neural big data obtained by functional magnetic resonance imaging (fMRI). First, we investigate how well the physical properties of the stimuli are maintained after dimensionality reduction. We then compare the residual variance of the two methods to quantify the amount of information left unexplained. As a result, dimensionality reduction using Isomap retains more information than PCA. Our results demonstrate that it is necessary to consider not only linear but also non-linear characteristics in big data analysis.
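As a sketch of the linear half of this comparison, residual variance after PCA can be computed from the singular values of the centered data. The data below are random stand-ins for the fMRI features, and the Isomap side (e.g. via `sklearn.manifold.Isomap`, which reports residual variance similarly) is omitted to keep the example dependency-light.

```python
import numpy as np

# Stand-in data for the fMRI feature vectors (illustrative, not the
# paper's data): 100 samples, 10 variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
Xc = X - X.mean(axis=0)                 # center each variable

# PCA via SVD: squared singular values give per-component variance.
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
var = s ** 2 / (len(X) - 1)

# Residual variance after keeping the first k components, k = 1..10;
# the smaller it is, the less information the reduction discarded.
residual = 1.0 - np.cumsum(var) / var.sum()
print(residual.round(3))
```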

An efficient parallel solution algorithm on the linear second-order partial differential equations with large sparse matrix being based on the block cyclic reduction technique (Block Cyclic Reduction 기법에 의한 대형 Sparse Matrix 선형 2계편미분방정식의 효율적인 병렬 해 알고리즘)

  • 이병홍;김정선
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.15 no.7
    • /
    • pp.553-564
    • /
    • 1990
  • The coefficient matrix of a linear second-order partial differential equation in general form is partitioned into (n-1)×(n-1) submatrices and transformed into a block tridiagonal system. The cyclic odd-even reduction technique is then applied to this system with large-grain data granularity, and a block cyclic reduction algorithm for solving the unknown vectors of the system is derived. However, this block cyclic reduction technique is not well suited to parallel processing systems because its degree of parallelism changes at every computing stage. A new algorithm for solving linear second-order partial differential equations is therefore presented, based on a block cyclic reduction technique modified to keep its parallelism constant and to greatly reduce execution time. Both algorithms are compared and studied.
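The odd-even reduction scheme the abstract describes can be sketched serially for a single tridiagonal block; this is a hedged illustration rather than the paper's parallel algorithm. It assumes n = 2**s − 1, and the inner loop at each level is the independent work a parallel system would distribute across processors.

```python
import numpy as np

def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system by odd-even cyclic reduction.
    a: sub-diagonal, b: diagonal, c: super-diagonal, d: right-hand
    side.  Requires n = 2**s - 1."""
    n = len(b)
    a, b, c, d = (np.asarray(v, dtype=float).copy() for v in (a, b, c, d))
    # Forward phase: each sweep eliminates the odd-indexed unknowns,
    # halving the active system; iterations of the inner loop are
    # mutually independent (the parallel work).
    step = 1
    while 2 * step <= n:
        for i in range(2 * step - 1, n - step + 1, 2 * step):
            alpha = -a[i] / b[i - step]
            beta = -c[i] / b[i + step]
            b[i] += alpha * c[i - step] + beta * a[i + step]
            d[i] += alpha * d[i - step] + beta * d[i + step]
            a[i] = alpha * a[i - step]
            c[i] = beta * c[i + step]
        step *= 2
    # Back-substitution phase: recover the eliminated unknowns level
    # by level, from the single central equation outward.
    x = np.zeros(n)
    while step >= 1:
        for i in range(step - 1, n, 2 * step):
            s = d[i]
            if i - step >= 0:
                s -= a[i] * x[i - step]
            if i + step < n:
                s -= c[i] * x[i + step]
            x[i] = s / b[i]
        step //= 2
    return x

# Demo on the Poisson-type system tridiag(-1, 2, -1), n = 7 = 2**3 - 1:
n = 7
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
x_true = np.arange(1.0, n + 1)
print(np.allclose(cyclic_reduction(a, b, c, A @ x_true), x_true))  # True
```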


A Data-line Sharing Method for Lower Cost and Lower Power in TFT-LCDs

  • Park, Haeng-Won;Moon, Seung-Hwan;Kang, Nam-Soo;Lee, Sung-Yung;Park, Jin-Hyuk;Kim, Sang-Soo
    • Proceedings of the Korean Information Display Society Conference
    • /
    • 2005.07a
    • /
    • pp.531-534
    • /
    • 2005
  • This paper presents a new data-line sharing technique for TFT-LCD panels. The technique halves the number of data driver ICs by having two adjacent pixels share the same data line. This in turn doubles the number of gate lines, which are integrated directly on the amorphous-silicon glass substrate for further cost reduction and compactness. The proposed technique, with its new pixel array structure, was applied to 15.4-inch WXGA TFT-LCD panels and showed that the number of driver ICs was halved, with nearly 41% circuit cost reduction and a 5.3% reduction in power consumption, without degrading image quality.


A Classification Method Using Data Reduction

  • Uhm, Daiho;Jun, Sung-Hae;Lee, Seung-Joo
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.12 no.1
    • /
    • pp.1-5
    • /
    • 2012
  • Data reduction has been widely used in data mining for convenient analysis. Principal component analysis (PCA) and factor analysis (FA) are popular techniques. PCA and FA reduce the number of variables to avoid the curse of dimensionality, which causes computing time to grow exponentially with the number of variables; many methods have therefore been published for dimension reduction. Data augmentation is another approach to analyzing data efficiently. The support vector machine (SVM) is a representative technique for dimension augmentation: the SVM maps original data into a high-dimensional feature space to find the optimal decision plane. Both data reduction and augmentation have been used to solve diverse problems in data analysis. In this paper, we compare the strengths and weaknesses of dimension reduction and augmentation for classification, and propose a classification method using data reduction. We carry out comparative experiments to verify the performance of the proposed method.
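The contrast between reduction and augmentation can be illustrated with a toy numpy sketch (synthetic data, not the paper's experiments): two concentric rings are not linearly separable in the original 2-D space, but the explicit feature map (x, y, x²+y²), the idea behind the SVM kernel trick, adds a dimension in which a flat plane separates them, whereas a PCA projection down to one dimension would discard the separating information.

```python
import numpy as np

# Two concentric rings: class 0 at radius 1, class 1 at radius 3.
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2 * np.pi, 200)
r = np.where(np.arange(200) < 100, 1.0, 3.0)
X = np.c_[r * np.cos(theta), r * np.sin(theta)]
y = (r > 2.0).astype(int)

# "Dimension augmentation": lift each point to 3-D; the added
# coordinate is the squared radius x^2 + y^2.
Z = np.c_[X, (X ** 2).sum(axis=1)]

# In the lifted space the flat plane z = 5 separates the classes:
# squared radii are 1 (inner ring) and 9 (outer ring).
pred = (Z[:, 2] > 5.0).astype(int)
print((pred == y).mean())  # 1.0
```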

Analysis of the Complementary Clipping Transform technique for the PAPR reduction of OFDM system (OFDM PAPR reduction을 위한 Complementary Clipping Transform 성능 분석)

  • Won, Seong-Ho
    • Proceedings of the Korea Electromagnetic Engineering Society Conference
    • /
    • 2005.11a
    • /
    • pp.57-62
    • /
    • 2005
  • In spite of the many advantages of OFDM, a major drawback for implementation is the non-linear distortion in the high-power amplifier (HPA) caused by the high PAPR. In this paper, the Complementary Clipping Transform (CCT) technique for PAPR reduction in OFDM systems is analyzed for QPSK- and QAM-mapped data. BER performance and the PSD before and after the HPA are analytically demonstrated.
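The quantity the clipping technique attacks can be sketched in a few lines of numpy (parameters are illustrative: N = 64 subcarriers of random QPSK data, not the paper's setup). The OFDM time signal is the IFFT of the mapped symbols; PAPR is the peak instantaneous power over the mean power, and clipping-based methods such as CCT cap the peak before the HPA.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                     # subcarriers (illustrative)

# Random QPSK mapping: unit-energy symbol on each subcarrier.
qpsk = (rng.choice([-1.0, 1.0], N)
        + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)

# OFDM time-domain signal (unitary scaling keeps average power at 1).
x = np.fft.ifft(qpsk) * np.sqrt(N)

power = np.abs(x) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())
print(round(papr_db, 2))   # several dB; the worst case is 10*log10(N)
```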


Identification of flutter derivatives of bridge decks using stochastic search technique

  • Chen, Ai-Rong;Xu, Fu-You;Ma, Ru-Jin
    • Wind and Structures
    • /
    • v.9 no.6
    • /
    • pp.441-455
    • /
    • 2006
  • A more applicable optimization model for extracting flutter derivatives of bridge decks is presented, which accommodates time-varying weights for fitting errors and different lengths of vertical bending and torsional free-vibration data. A stochastic search technique for finding the optimal solution of the optimization problem is developed, which is easier to understand and program than the alternate iteration technique and is shown to be a valid and efficient method in two numerical examples. Based on the section model test of the Sutong Bridge deck, the flutter derivatives are extracted by the stochastic search technique and compared with the identification results of the modified least-squares method. The Empirical Mode Decomposition method is employed to eliminate noise, trends, and zero excursion from the collected free-vibration data of vertical bending and torsional motion, which improves the identification precision of the flutter derivatives.
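A minimal stochastic search of the accept-if-better kind can be sketched as follows; the objective here is a stand-in least-squares fitting error for a damped free-vibration record (the amplitude, decay, and frequency values are made up for illustration, not the paper's model or data).

```python
import numpy as np

rng = np.random.default_rng(0)

def random_search(f, x0, sigma=0.5, iters=2000):
    """Keep a Gaussian perturbation of the current point whenever it
    lowers the objective f (a simple stochastic search)."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        cand = x + rng.normal(scale=sigma, size=x.shape)
        fc = f(cand)
        if fc < fx:               # accept only improvements
            x, fx = cand, fc
    return x, fx

# Stand-in fitting error: least squares against a synthetic damped
# free-vibration record A * exp(-c t) * cos(w t).
t = np.linspace(0.0, 5.0, 200)
record = 1.5 * np.exp(-0.3 * t) * np.cos(4.0 * t)

def fit_error(p):
    A, c, w = p
    return ((A * np.exp(-c * t) * np.cos(w * t) - record) ** 2).sum()

p_best, e_best = random_search(fit_error, [1.0, 0.1, 3.0])
print(p_best.round(2))
```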

Operative Treatment of Tongue Type Intra-articular Calcaneal Fractures: Comparison of the Open Reduction and Essex-Lopresti Technique (관절면을 침범한 설상형 종골골절의 수술적 치료: 관혈적 및 Essex-Lopresti 술식에 따른 비교)

  • Shin, Dong-Eun;Yoon, Hyung-Ku;Han, Soo-Hong;Choi, Woo-Jin;Ahn, Chang-Soo;Ok, Hyun-Soo
    • Journal of Korean Foot and Ankle Society
    • /
    • v.14 no.2
    • /
    • pp.151-156
    • /
    • 2010
  • Purpose: To analyze the clinical and radiological results of operative treatment in patients with tongue-type intra-articular calcaneal fractures, and to compare open reduction with the Essex-Lopresti technique. Materials and Methods: We examined a consecutive series of 42 patients who received surgical treatment for tongue-type calcaneal fracture (24 cases of open reduction and 18 cases of the Essex-Lopresti technique), and the postoperative data were compared with a minimum one-year follow-up. Clinical outcome was analyzed using the American Orthopaedic Foot and Ankle Society (AOFAS) hindfoot scale and Salama's criteria. The preoperative, postoperative, and last follow-up changes in the Bohler angle were radiologically analyzed. Results: There were no significant differences between the two groups in the clinical and radiological results at the last follow-up. However, for Sanders type 3 and 4 fractures, the open reduction group showed more improvement in AOFAS score and less reduction loss in the Bohler angle. Conclusion: Although the clinical results were good irrespective of surgical technique, open reduction and internal fixation can improve clinical outcome and reduce reduction loss compared with the Essex-Lopresti technique in comminuted tongue-type calcaneal fractures.

An AHP Approach to Select the Task Related Technique for Work Efficiency Improvement in Shipbuilding Enterprise (AHP에 의한 조선기업의 작업능률향상을 위한 과업관련기법의 선택)

  • Kim, Tae-Soo;Lee, Kang-Woo
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.30 no.2
    • /
    • pp.67-74
    • /
    • 2007
  • The objective of this research is to select the most effective technique among task related techniques (motion & time study, job redesign, physical environment improvement) for improving work efficiency in a shipbuilding enterprise. The study consists of several principal steps. The first step is to design the critical criteria for evaluating work efficiency in shipbuilding enterprises. The second step is to develop sub-criteria of the critical criteria. The third step is to develop a four-level AHP (Analytic Hierarchy Process) structure using the critical criteria, sub-criteria, and the task related techniques. The fourth step is to develop the pairwise comparison matrix at each level of the AHP structure, based on survey data collected at the H heavy industry. The last step is to select the most effective technique among the task related techniques using AHP analysis. The result of the AHP analysis shows clear differences in priority among the task related techniques in terms of work efficiency: the reduction of normal time is more important than the reduction of allowance time in improving work efficiency. Motion & time study is the most important technique for the reduction of normal time, and physical environment improvement is the most important technique for the reduction of allowance time.
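The core AHP computation can be sketched with a hypothetical 3×3 pairwise comparison matrix over the three task related techniques (the judgment values below are made up; the paper's matrices come from the H heavy industry survey). Priorities are the normalized principal eigenvector, and Saaty's consistency ratio checks that the judgments are acceptably consistent.

```python
import numpy as np

# Hypothetical pairwise comparisons of the three techniques
# (row/column order: motion & time study, job redesign, physical
# environment improvement).  Values are illustrative only.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])

# Priority vector: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
priority = np.abs(eigvecs[:, k].real)
priority /= priority.sum()

# Consistency: CI = (lambda_max - n) / (n - 1); random index
# RI = 0.58 for n = 3, and CR < 0.1 is the usual acceptance bound.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(priority.round(3), round(cr, 3))
```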

Development of Image Processing Technique for Determining Wood Drying Schedules

  • Lee, Hyoung-Woo;Kim, Byung-Nam
    • Journal of the Korean Wood Science and Technology
    • /
    • v.31 no.6
    • /
    • pp.15-21
    • /
    • 2003
  • An image processing technique was adapted to explore more convenient ways of investigating the drying characteristics of wood. The acquisition of information about drying characteristics is indispensable for the development or improvement of dry-kiln schedules. A small internal-fan-type wood dry kiln was combined with image-processing and data-acquisition systems to continuously monitor the formation of checks and the reduction of moisture during drying. All the images and data were analyzed to improve or estimate dry-kiln schedules and to predict the drying time required to dry green wood to 10% moisture content in an internal-fan-type kiln. Samples of 20 mm- and 50 mm-thick Metasequoia glyptostroboides, Paulownia coreana Uyeki, Pinus densiflora Sieb. et Zucc., Platanus occidentalis L., Quercus acutissima, and Robinia pseudoacacia were used to verify the potential of this technique.

Issues and Empirical Results for Improving Text Classification

  • Ko, Young-Joong;Seo, Jung-Yun
    • Journal of Computing Science and Engineering
    • /
    • v.5 no.2
    • /
    • pp.150-160
    • /
    • 2011
  • Automatic text classification has a long history and many studies have been conducted in this field. In particular, many machine learning algorithms and information retrieval techniques have been applied to text classification tasks. Even though much technical progress has been made in text classification, there is still room for improvement in text classification. In this paper, we will discuss remaining issues in improving text classification. In this paper, three improvement issues are presented including automatic training data generation, noisy data treatment and term weighting and indexing, and four actual studies and their empirical results for those issues are introduced. First, the semi-supervised learning technique is applied to text classification to efficiently create training data. For effective noisy data treatment, a noisy data reduction method and a robust text classifier from noisy data are developed as a solution. Finally, the term weighting and indexing technique is revised by reflecting the importance of sentences into term weight calculation using summarization techniques.