• Title/Abstract/Keyword: Sparse data

408 search results

Sparse 복원 알고리즘을 이용한 HRRP 및 ISAR 영상 형성에 관한 연구 (A Study on the Formulation of High Resolution Range Profile and ISAR Image Using Sparse Recovery Algorithm)

  • 배지훈;김경태;양은정
    • 한국전자파학회논문지, Vol. 25, No. 4, pp. 467-475, 2014
  • This paper introduces a sparse recovery algorithm that applies a compressive sensing (CS)-based radar signal model to form the high resolution range profile (HRRP), a one-dimensional radar signature, and the ISAR image, a two-dimensional radar signature. If data losses occur in the observed radar cross section (RCS) data samples, the conventional discrete Fourier transform (DFT) approach cannot produce correct high-resolution radar signatures. Even with such data losses, however, the sparse recovery algorithm can successfully reconstruct high-resolution radar signatures whose resolution remains equivalent to that obtained from the original wideband RCS data. Therefore, as the results in this paper show, even when the collected RCS data suffer losses caused by unwanted interference or jamming signals, the sparse recovery algorithm, unlike the conventional DFT approach, successfully reconstructs high-resolution radar signatures.
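
A minimal numerical sketch of the comparison described above, assuming a toy one-dimensional scene and using scikit-learn's Orthogonal Matching Pursuit as a stand-in for the paper's CS-based recovery algorithm (the scene, sampling pattern, and sparsity level are all hypothetical):

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
N = 128                                        # range bins (hypothetical)
x = np.zeros(N)
x[[20, 45, 46, 90]] = [1.0, 0.8, 0.6, 0.5]     # a few point scatterers

F = np.fft.fft(np.eye(N)) / np.sqrt(N)         # DFT matrix
keep = np.sort(rng.choice(N, size=64, replace=False))   # half the samples survive
y = F[keep] @ x                                # observed (lossy) frequency data

# Conventional route: zero-fill the missing samples and invert the DFT.
y_full = np.zeros(N, dtype=complex)
y_full[keep] = y
hrrp_dft = np.abs(np.fft.ifft(y_full) * np.sqrt(N))

# Sparse recovery: solve y ~= A x under a sparsity constraint (real-valued stack).
A = np.vstack([F[keep].real, F[keep].imag])
b = np.concatenate([y.real, y.imag])
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=8, fit_intercept=False).fit(A, b)
hrrp_cs = np.abs(omp.coef_)

print("zero-filled DFT error:", np.linalg.norm(hrrp_dft - x))
print("sparse recovery error:", np.linalg.norm(hrrp_cs - x))
```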

DATA MINING AND PREDICTION OF SAI TYPE MATRIX PRECONDITIONER

  • Kim, Sang-Bae;Xu, Shuting;Zhang, Jun
    • Journal of Applied Mathematics & Informatics, Vol. 28, No. 1-2, pp. 351-361, 2010
  • The solution of large sparse linear systems is one of the most important problems in large-scale scientific computing. Among the many methods developed, preconditioned Krylov subspace methods are considered the preferred ones. Selecting a suitable preconditioner with appropriate parameters for a specific sparse linear system is a challenging task for many application scientists and engineers who have little knowledge of preconditioned iterative methods. The prediction of ILU-type preconditioners was considered in [27], where a support vector machine (SVM), as a data mining technique, is used to classify large sparse linear systems and predict the best preconditioners. In this paper, we apply the data mining approach to sparse approximate inverse (SAI) type preconditioners to find parameters with which the preconditioned Krylov subspace method shows the best performance on the linear systems.
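
The abstract above does not list the matrix features or SAI parameters actually used, so the following sketch only illustrates the general data-mining setup with made-up features and labels: describe each sparse system by a few cheap structural statistics and train an SVM to predict a preconditioner parameter class.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.svm import SVC

def matrix_features(A):
    """A few cheap structural features of a sparse matrix (illustrative choice)."""
    A = A.tocsr()
    n = A.shape[0]
    diag = np.abs(A.diagonal())
    row_abs_sum = np.asarray(abs(A).sum(axis=1)).ravel()
    return [
        A.nnz / (n * n),                                        # density
        float(np.mean(diag / np.maximum(row_abs_sum, 1e-12))),  # diagonal dominance
        float(abs(A - A.T).sum() / max(abs(A).sum(), 1e-12)),   # asymmetry
    ]

X, y = [], []
for seed in range(200):
    rng = np.random.default_rng(seed)
    n = int(rng.integers(50, 150))
    A = sparse_random(n, n, density=float(rng.uniform(0.01, 0.1)), random_state=seed)
    feats = matrix_features(A)
    X.append(feats)
    # Stand-in label: pretend denser systems prefer a larger SAI sparsity pattern.
    y.append(int(feats[0] > 0.05))

clf = SVC(kernel="rbf").fit(X, y)
print("predicted SAI parameter class for the first system:", clf.predict([X[0]]))
```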

Sparse Autoencoder의 데이터 특징 추출과 ProGReGA-KF를 결합한 새로운 부하 분산 알고리즘 (Combining data representation by Sparse Autoencoder and the well-known load balancing algorithm, ProGReGA-KF)

  • 김차영;박정민;김혜영
    • 한국게임학회 논문지, Vol. 17, No. 5, pp. 103-112, 2017
  • In massively multiplayer online games (MMOGs) enjoyed by many users at once, the expansion of the IoT keeps increasing the load on servers, and all data are turning into big data. This paper therefore combines the Sparse Autoencoder, one of the most widely used deep learning techniques, with the well-known load balancing algorithm ProGReGA-KF. The proposed algorithm was compared with the original ProGReGA-KF in terms of movement stability, and simulations showed that the proposed algorithm is more stable and scalable in a big data environment.
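
A minimal PyTorch sketch of a sparse autoencoder of the kind mentioned above (the network size, sparsity weight, and training data are illustrative assumptions; the ProGReGA-KF load-balancing stage itself is not reproduced here):

```python
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, n_in=64, n_hidden=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        h = self.encoder(x)
        return self.decoder(h), h

torch.manual_seed(0)
data = torch.rand(512, 64)            # stand-in for per-player/server load records
model = SparseAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
l1_weight = 1e-3                      # sparsity strength (hypothetical value)

for epoch in range(200):
    recon, hidden = model(data)
    # Reconstruction loss plus an L1 penalty on hidden activations for sparsity.
    loss = nn.functional.mse_loss(recon, data) + l1_weight * hidden.abs().mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

features = model.encoder(data).detach()   # compact sparse representation
print(features.shape, "mean activation:", features.mean().item())
```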

Distributed Video Compressive Sensing Reconstruction by Adaptive PCA Sparse Basis and Nonlocal Similarity

  • Wu, Minghu;Zhu, Xiuchang
    • KSII Transactions on Internet and Information Systems (TIIS), Vol. 8, No. 8, pp. 2851-2865, 2014
  • To improve the rate-distortion performance of distributed video compressive sensing (DVCS), this paper jointly exploits an adaptive sparse basis and the nonlocal similarity of video to reconstruct the video signal. Because motion information between frames is lacking and the reference frames contain noise, a sparse dictionary constructed from examples extracted directly from the reference frames no longer yields a good sparse representation of the interpolated block. This paper proposes a method for constructing the sparse dictionary: first, an example-based data matrix is built using the motion information between frames; then principal component analysis (PCA) is used to compute the significant principal components of the data matrix; finally, the sparse dictionary is constructed from these significant principal components. The merit of the proposed dictionary is that it not only adapts to the spatial-temporal characteristics of the video but also suppresses noise. In addition, because sparse priors alone cannot preserve the edges and textures of video frames well, a nonlocal similarity regularization term is introduced into the reconstruction model. Experimental results show that the proposed algorithm improves the objective and subjective quality of the video frames and achieves better rate-distortion performance for the DVCS system at the cost of some additional computational complexity.
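
A rough sketch of the dictionary-construction step described above, with random patches standing in for the motion-compensated examples and scikit-learn's PCA and SparseCoder standing in for the paper's exact routines:

```python
import numpy as np
from sklearn.decomposition import PCA, SparseCoder

rng = np.random.default_rng(0)
patch_dim = 8 * 8
examples = rng.normal(size=(500, patch_dim))       # example-based data matrix
examples += 0.1 * rng.normal(size=examples.shape)  # reference-frame noise

# Keep only the significant principal components as dictionary atoms.
pca = PCA(n_components=32).fit(examples)
dictionary = pca.components_                       # rows are unit-norm atoms

# Sparse-code a block of the interpolated frame over the adaptive dictionary.
block = rng.normal(size=(1, patch_dim))
coder = SparseCoder(dictionary=dictionary,
                    transform_algorithm="omp",
                    transform_n_nonzero_coefs=5)
codes = coder.transform(block)
reconstruction = codes @ dictionary
print("nonzero coefficients:", np.count_nonzero(codes))
```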

Majorization-Minimization-Based Sparse Signal Recovery Method Using Prior Support and Amplitude Information for the Estimation of Time-varying Sparse Channels

  • Wang, Chen;Fang, Yong
    • KSII Transactions on Internet and Information Systems (TIIS), Vol. 12, No. 10, pp. 4835-4855, 2018
  • In this paper, we study sparse signal recovery that uses information about both the support and the amplitude of the sparse signal. A convergent iterative algorithm for sparse signal recovery is developed using Majorization-Minimization-based Non-convex Optimization (MM-NcO). Furthermore, it is shown that the sparse signals recovered by the proposed iterative algorithm are typically not globally optimal and that the performance of the algorithm depends on the initial point. Therefore, a modified MM-NcO-based iterative algorithm is developed that uses prior information about both the support and the amplitude of the sparse signal to enhance recovery performance. Finally, the modified MM-NcO-based iterative algorithm is used to estimate time-varying sparse wireless channels with temporal correlation. The numerical results show that the new algorithm performs better than related algorithms.
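
The exact MM-NcO updates are not given in the abstract, so the sketch below only illustrates the general idea with a standard majorization-minimization (iteratively reweighted least-squares) loop in NumPy, where indices believed to lie in the support receive smaller sparsity weights:

```python
import numpy as np

def mm_sparse_recovery(A, y, prior_support=(), iters=50, lam=0.05, eps=1e-6):
    m, n = A.shape
    w = np.ones(n)
    w[list(prior_support)] = 0.1          # trust the prior support more
    x = np.zeros(n)
    for _ in range(iters):
        # Majorize the weighted l1 penalty by a quadratic around the current x,
        # then minimize the resulting surrogate in closed form (ridge-like system).
        d = w / (np.abs(x) + eps)
        x = np.linalg.solve(A.T @ A + lam * np.diag(d), A.T @ y)
    return x

rng = np.random.default_rng(3)
n, m, k = 200, 80, 8
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.normal(size=m)

x_hat = mm_sparse_recovery(A, y, prior_support=support[:4])   # partial prior info
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```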

Sparse Data Cleaning using Multiple Imputations

  • Jun, Sung-Hae;Lee, Seung-Joo;Oh, Kyung-Whan
    • International Journal of Fuzzy Logic and Intelligent Systems, Vol. 4, No. 1, pp. 119-124, 2004
  • Real-world data such as web log files tend to be incomplete, yet we must extract useful knowledge from them for optimal decision making. Web log data contain much useful information, such as hyperlink structure and the usage patterns of connected users. However, web data are too large to use directly for effective knowledge discovery and, to make matters worse, they are very sparse. We overcome this sparsity problem by using the Markov chain Monte Carlo (MCMC) method for multiple imputation. This missing-value imputation turns sparse web data into complete data. Our study can serve as a useful tool for discovering knowledge from sparse data sets; the sparser the data, the greater the benefit of MCMC imputation. We verified our approach through experiments on data from the UCI machine learning repository.
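
As a rough illustration of multiple imputation for a sparse, incomplete matrix, the sketch below uses scikit-learn's IterativeImputer with posterior sampling as a stand-in for the paper's MCMC procedure (the data and missingness pattern are synthetic):

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
X = rng.poisson(2.0, size=(100, 8)).astype(float)   # stand-in for web usage counts
mask = rng.random(X.shape) < 0.4                    # 40% of entries go missing
X_missing = X.copy()
X_missing[mask] = np.nan

imputations = []
for m in range(5):   # five imputed data sets, as in multiple imputation
    imputer = IterativeImputer(sample_posterior=True, random_state=m, max_iter=10)
    imputations.append(imputer.fit_transform(X_missing))

pooled = np.mean(imputations, axis=0)
print("mean absolute imputation error:", np.abs(pooled[mask] - X[mask]).mean())
```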

Constrained Sparse Concept Coding algorithm with application to image representation

  • Shu, Zhenqiu;Zhao, Chunxia;Huang, Pu
    • KSII Transactions on Internet and Information Systems (TIIS), Vol. 8, No. 9, pp. 3211-3230, 2014
  • Recently, sparse coding has achieved remarkable success in image representation tasks. In practice, the performance of clustering can be significantly improved if limited label information is incorporated into sparse coding. To this end, this paper proposes a novel semi-supervised algorithm for image representation, called constrained sparse concept coding (CSCC). CSCC incorporates limited label information into graph embedding as additional hard constraints and hence obtains embedding results that are consistent with the label information and the manifold structure of the original data. Therefore, CSCC provides a sparse representation that explicitly exploits prior knowledge of the data to improve discriminative power in clustering. In addition, a kernelized version of CSCC, namely kernel constrained sparse concept coding (KCSCC), is developed to deal with nonlinear data, which leads to more effective clustering performance. Experimental evaluations on the MNIST, PIE, and Yale image sets show the effectiveness of the proposed algorithms.
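
The CSCC/KCSCC algorithms themselves are not reproduced here; the sketch below only shows the unconstrained baseline they build on, namely sparse coding followed by clustering in the code space, using scikit-learn's digits data as a small stand-in for MNIST/PIE/Yale:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

X, y = load_digits(return_X_y=True)
X = X / 16.0

# Learn a dictionary and obtain a sparse code for each image.
dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0, random_state=0)
codes = dico.fit_transform(X)

# Cluster in the sparse-code space and compare with the true labels.
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(codes)
print("ARI of clustering on sparse codes:", adjusted_rand_score(y, labels))
```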

Optimized Entity Attribute Value Model: A Search Efficient Representation of High Dimensional and Sparse Data

  • Paul, Razan;Latiful Hoque, Abu Sayed Md.
    • Interdisciplinary Bio Central, Vol. 3, No. 3, pp. 9.1-9.5, 2011
  • Entity Attribute Value (EAV) is a widely used solution for representing high-dimensional and sparse data, but EAV is not search efficient for knowledge extraction. In this paper, we propose a search-efficient data model, Optimized Entity Attribute Value (OEAV), for the physical representation of high-dimensional and sparse data as an alternative to the widely used EAV. We implemented both the EAV and OEAV models in a data warehousing environment and performed various relational and warehouse queries on both models. The experimental results show that OEAV is dramatically more search efficient and occupies less storage space than EAV.
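
A small sqlite3 sketch of the plain EAV layout that the paper starts from (the proposed OEAV encoding is not specified in the abstract, so it is not reproduced): each observed (entity, attribute, value) triple becomes one row, which stores sparse data compactly but makes attribute-wise search comparatively awkward.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE eav (entity INTEGER, attribute TEXT, value REAL)")

# Only the observed cells of a sparse, high-dimensional record are stored.
rows = [
    (1, "age", 34), (1, "blood_pressure", 120),
    (2, "age", 29),
    (3, "cholesterol", 180), (3, "age", 51),
]
cur.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

# A simple attribute query: entities whose 'age' exceeds 30.
cur.execute("SELECT entity, value FROM eav WHERE attribute = 'age' AND value > 30")
print(cur.fetchall())   # [(1, 34.0), (3, 51.0)]
conn.close()
```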

Enhanced and applicable algorithm for Big-Data by Combining Sparse Auto-Encoder and Load-Balancing, ProGReGA-KF

  • Kim, Hyunah;Kim, Chayoung
    • International Journal of Advanced Culture Technology, Vol. 9, No. 1, pp. 218-223, 2021
  • The pervasive growth of the Internet of Things (IoT) in distributed massively multiplayer online architectures has resulted in massive growth of big data in terms of server load. There has been previous work on overcoming server overload, but commonly applicable methods are lacking. We therefore propose combining a Sparse Auto-Encoder with the ProGReGA load-balancing algorithm for big data server loads. In the Sparse Auto-Encoder stage, less relevant feature patterns can be eliminated from the big data during feature selection. In the load-balancing stage, ProGReGA can then exploit the less redundant feature patterns, mitigating performance degradation, so that the most relevant big data representation is used. The performance evaluation shows that the proposed method is more broadly applicable and more stable.
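
As a hedged follow-up to the autoencoder sketch given for the related 2017 entry above, one simple (illustrative, not the paper's exact rule) way to drop less relevant feature patterns is to rank input features by the total absolute encoder weight assigned to them and keep only the strongest ones:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_hidden = 32, 8
W_enc = rng.normal(size=(n_hidden, n_inputs))   # stand-in for trained encoder weights
W_enc[:, 20:] *= 0.05                           # pretend some inputs got tiny weights

relevance = np.abs(W_enc).sum(axis=0)           # per-input-feature relevance score
keep = np.argsort(relevance)[::-1][:16]         # keep the 16 most relevant features
print("kept feature indices:", np.sort(keep))

# Downstream, only these columns of the load records would be passed on to the
# load-balancing stage.
```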

Optimization Driven MapReduce Framework for Indexing and Retrieval of Big Data

  • Abdalla, Hemn Barzan;Ahmed, Awder Mohammed;Al Sibahee, Mustafa A.
    • KSII Transactions on Internet and Information Systems (TIIS), Vol. 14, No. 5, pp. 1886-1908, 2020
  • With technical advances, the amount of big data is increasing day by day, such that traditional software tools face a burden in handling it. Additionally, the presence of imbalanced data in big data is a major concern for the research community. In order to ensure the effective management of big data and to deal with imbalanced data, this paper proposes a new indexing algorithm for retrieving big data in the MapReduce framework. In the mappers, data clustering is performed with the Sparse Fuzzy c-means (Sparse FCM) algorithm. The reducer combines the clusters generated by the mappers and again performs data clustering with Sparse FCM. Two-level query matching is performed to determine the requested data: the first level identifies the cluster, and the second level accesses the requested data within it. The ranking of data is performed using the proposed Monarch chaotic whale optimization algorithm (M-CWOA), designed by combining Monarch butterfly optimization (MBO) [22] with the chaotic whale optimization algorithm (CWOA) [21]. Here, the Parametric Enabled-Similarity Measure (PESM) is adopted for matching the similarity between two datasets. The proposed M-CWOA outperformed other methods, with a maximal precision of 0.9237, recall of 0.9371, and F1-score of 0.9223.
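
A loose sketch of the two-level matching idea, with standard k-means standing in for Sparse FCM and cosine similarity standing in for the PESM measure (so the numbers are purely illustrative): the query is first matched to a cluster centroid, then only that cluster's records are ranked.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
data = rng.random((1000, 20))                    # indexed records
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(data)

query = rng.random((1, 20))

# Level 1: pick the cluster whose centroid best matches the query.
best_cluster = cosine_similarity(query, km.cluster_centers_).argmax()

# Level 2: rank only the records inside that cluster.
members = np.where(km.labels_ == best_cluster)[0]
scores = cosine_similarity(query, data[members]).ravel()
top = members[np.argsort(scores)[::-1][:5]]
print("top-5 record ids:", top)
```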