• Title/Summary/Keyword: Information matrix


Identifying Core Robot Technologies by Analyzing Patent Co-classification Information

  • Jeon, Jeonghwan;Suh, Yongyoon;Koh, Jinhwan;Kim, Chulhyun;Lee, Sanghoon
    • Asian Journal of Innovation and Policy
    • /
    • v.8 no.1
    • /
    • pp.73-96
    • /
    • 2019
  • This study suggests a new approach for identifying core robot technologies based on technological cross-impact. Specifically, the approach applies data mining techniques and multi-criteria decision-making methods to the co-classification information of registered robot patents. First, a cross-impact matrix is constructed from confidence values obtained by applying association rule mining (ARM) to the co-classification information of the patents. Next, the analytic network process (ANP) is applied to the co-classification frequency matrix to derive weights for each robot technology. Then, the technique for order preference by similarity to ideal solution (TOPSIS) is applied to the derived cross-impact matrix and weights to identify core robot technologies from an overall cross-impact perspective. The proposed approach is expected to help robot technology managers formulate strategy and policy for technology planning in the robot area.
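
A minimal sketch of the first step, assuming each patent is simply a set of co-assigned classification codes; the toy patent data and code names are illustrative, and the ANP/TOPSIS stages are omitted:

```python
from itertools import permutations

# Toy co-classification data: each patent is a set of technology (IPC) codes.
patents = [
    {"B25J", "G05B"}, {"B25J", "G06N"}, {"B25J", "G05B", "G06N"},
    {"G05B", "G06N"}, {"B25J", "G05B"},
]

codes = sorted(set().union(*patents))
support = {c: sum(c in p for p in patents) for c in codes}

# Cross-impact matrix of ARM confidence values:
# confidence(A -> B) = P(B | A) = count(A and B) / count(A).
confidence = {
    (a, b): sum(a in p and b in p for p in patents) / support[a]
    for a, b in permutations(codes, 2)
}

for (a, b), c in sorted(confidence.items()):
    print(f"conf({a} -> {b}) = {c:.2f}")
```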

Conceptional Approach for Assembly Reconfiguration of Papering Robot Modules (Conceptual Approach to the Combinatorial Optimal Design of Robot Modules for Papering Vertical Hull Walls)

  • Chung W.J.;Kim S.H.;Kim K.K.;Kim H.G.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.2015-2018
    • /
    • 2005
  • In this paper, we lay the groundwork for a combinatorial optimization of papering robot modules using a genetic algorithm. We define the position status of the end-effector (terminal link module) in a standardized form (G, M(G), A(G), and so on). As a preparatory step, the link and joint modules must be reorganized, for example by enumerating the kinematically identical assembly groups of links and joints. We therefore first draw a directed graph G, which encodes the path information between adjacent link and joint modules. From the directed graph G, we derive the incidence matrix M(G), which records the contact information of the link (joint) modules and the type of each link (joint). Finally, we generalize the modular information as a matrix A(G), from which a population of assembly states can be generated; this population is the final output of this paper.
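
As a rough illustration of the graph-to-matrix step, the sketch below builds the incidence matrix M(G) of a small directed graph of link/joint modules; the module names and connections are hypothetical, not taken from the paper:

```python
# Nodes are link (L) and joint (J) modules; edges are their connections.
nodes = ["L1", "J1", "L2", "J2", "L3"]
edges = [("L1", "J1"), ("J1", "L2"), ("L2", "J2"), ("J2", "L3")]

# M(G)[i][j] = -1 if edge j leaves node i, +1 if it enters node i, else 0.
M = [[0] * len(edges) for _ in nodes]
for j, (src, dst) in enumerate(edges):
    M[nodes.index(src)][j] = -1
    M[nodes.index(dst)][j] = +1

for name, row in zip(nodes, M):
    print(name, row)
```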


Rock Fracture Centerline Extraction based on Hessian Matrix and Steger algorithm

  • Wang, Weixing;Liang, Yanjie
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.12
    • /
    • pp.5073-5086
    • /
    • 2015
  • The detection of rock fractures by image analysis is significant for fracture measurement and assessment in engineering. This paper proposes a novel image segmentation algorithm for tracing the centerline of a rock fracture, based on the multi-scale Hessian matrix and the Steger algorithm. A traditional fracture detection method, which performs edge detection first, then image binarization, and finally noise removal and fracture gap linking, is difficult to apply to images of rough rock surfaces. To overcome this problem, the new algorithm extracts the centerlines directly from a gray-level image. It comprises three steps: (1) the Hessian matrix and the Frangi filter are adopted to enhance curvilinear structures; after image binarization, spurious fractures and noise are removed by jointly considering area, circularity, and rectangularity; (2) on the binary image, the Steger algorithm is used to detect fracture centerline points, and the centerline points or segments are linked according to gap distance and angle differences; and (3) starting from this rough centerline detection, centerline points are searched in the original image within a local window along the direction perpendicular to the normal of the centerline, and these points are linked. A number of rock fracture images have been tested, and the results show that, compared with traditional algorithms, the proposed algorithm extracts rock fracture centerlines accurately.
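
A minimal sketch of steps (1) and (2), assuming scikit-image is available; it uses a stock test image in place of a rock fracture photograph, the area criterion alone for noise removal, and skeletonization as a simple stand-in for Steger's sub-pixel centerline detector:

```python
import numpy as np
from skimage import data, filters, morphology

# Grayscale test image (stand-in for a rock fracture photograph).
image = data.camera() / 255.0

# Step 1: enhance curvilinear (fracture-like) structures at multiple scales
# with the Frangi vesselness filter, which is built on the Hessian matrix.
enhanced = filters.frangi(image, sigmas=range(1, 6))

# Binarize and drop small spurious blobs (area criterion only; the paper
# also uses circularity and rectangularity).
binary = enhanced > filters.threshold_otsu(enhanced)
binary = morphology.remove_small_objects(binary, min_size=50)

# Step 2 (simplified): skeletonize to one-pixel-wide centerlines.
centerlines = morphology.skeletonize(binary)
print("centerline pixels:", int(np.count_nonzero(centerlines)))
```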

Evaluation criterion for different methods of multiple-attribute group decision making with interval-valued intuitionistic fuzzy information

  • Qiu, Junda;Li, Lei
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.7
    • /
    • pp.3128-3149
    • /
    • 2018
  • A number of effective methods for multiple-attribute group decision making (MAGDM) with interval-valued intuitionistic fuzzy numbers (IVIFNs) have been proposed in recent years. However, different methods frequently yield different, sometimes even contradictory, results for the same problem. In this paper, a novel criterion for determining the advantages and disadvantages of different methods is proposed. First, the decision-making process is divided into three parts: translation of experts' preferences, aggregation of experts' opinions, and comparison of the alternatives. The aggregation of experts' preferences is considered the core step, and the quality of the collective matrix is considered the most important evaluation index for the aggregation methods. Then, methods for calculating the similarity measure, correlation, correlation coefficient, and energy of intuitionistic fuzzy matrices are proposed and employed to evaluate the collective matrix; the optimal method can thus be selected by comparing the collective matrices whenever the methods yield different results. Finally, a novel approach for aggregating experts' preferences with IVIFNs is presented. In this approach, experts' preferences are mapped as points onto two-dimensional planes, and the plant growth simulation algorithm (PGSA) is employed to calculate the optimal rally points, which are inversely mapped to IVIFNs to establish the collective matrix. Four different methods are applied to one example problem to illustrate the feasibility and effectiveness of the proposed approach.
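
As a hedged illustration of comparing collective matrices, the sketch below applies a generic distance-based similarity measure to two toy IVIFN matrices; the exact measures proposed in the paper are not reproduced here:

```python
import numpy as np

def similarity(A, B):
    """Generic distance-based similarity between two IVIFN matrices.

    Each entry is ([mu_l, mu_u], [nu_l, nu_u]); the measure is the mean
    absolute difference over all interval endpoints, mapped to [0, 1].
    This is an assumed, standard-style measure, not the paper's own.
    """
    A, B = np.asarray(A, float), np.asarray(B, float)  # shape (m, n, 2, 2)
    return 1.0 - np.mean(np.abs(A - B))

M1 = [[([0.5, 0.6], [0.2, 0.3]), ([0.3, 0.4], [0.4, 0.5])]]
M2 = [[([0.5, 0.7], [0.1, 0.3]), ([0.2, 0.4], [0.4, 0.6])]]
print(f"similarity = {similarity(M1, M2):.3f}")
```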

Robust Digital Watermarking for High-definition Video using Steerable Pyramid Transform, Two Dimensional Fast Fourier Transform and Ensemble Position-based Error Correcting

  • Jin, Xun;Kim, JongWeon
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.7
    • /
    • pp.3438-3454
    • /
    • 2018
  • In this paper, we propose a robust blind watermarking scheme for high-definition video. In the embedding process, the luminance component of each frame is transformed by the 2-dimensional fast Fourier transform (2D FFT). A secret key is used to generate a matrix of random numbers to secure the watermark information, and this matrix is transformed by the inverse steerable pyramid transform (SPT). We embed the watermark into the low- and mid-frequency 2D FFT coefficients using the transformed matrix. In the extraction process, the 2D FFT coefficients of each frame and the transformed matrix are each transformed by the SPT to produce two oriented sub-bands, and we extract the watermark from each frame by cross-correlating them. If a video is degraded by attacks, the watermarks of some frames contain errors; we therefore use an ensemble position-based error-correcting algorithm to estimate and correct these errors. The experimental results show that the proposed watermarking algorithm is imperceptible and robust against various attacks. After embedding 64 bits of watermark into each frame, the average peak signal-to-noise ratio between the original and embedded frames is 45.7 dB.
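
A heavily simplified sketch of spread-spectrum embedding and detection in the 2D FFT domain; the steerable pyramid transform, frequency-band selection, and error correction are omitted, and the frame and strength parameter are illustrative:

```python
import numpy as np

rng = np.random.default_rng(seed=42)       # secret key -> random pattern

frame = rng.random((256, 256))             # stand-in luminance frame
pattern = rng.standard_normal((256, 256))  # key-generated random matrix
alpha = 5.0                                # embedding strength

# Embed the pattern into the frame's FFT coefficients.
F = np.fft.fft2(frame)
marked = np.real(np.fft.ifft2(F + alpha * pattern))

# Blind detection: correlate the marked frame's spectrum with the pattern.
corr = np.sum(np.real(np.fft.fft2(marked)) * pattern) / pattern.size
print(f"detector response: {corr:.2f}")    # clearly positive -> watermark present
```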

Adaptive Selective Compressive Sensing based Signal Acquisition Oriented toward Strong Signal Noise Scene

  • Wen, Fangqing;Zhang, Gong;Ben, De
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.9
    • /
    • pp.3559-3571
    • /
    • 2015
  • This paper addresses the problem of acquiring a signal with a sparse representation in a given orthonormal basis from fewer noisy measurements. The authors formulate the problem of random measurement under strong signal noise. The impact of white Gaussian signal noise on recovery performance is analyzed to provide a theoretical basis for the reasonable design of the measurement matrix. Based on the idea that the measurement matrix can be adapted for noise suppression in an adaptive CS system, an adaptive selective compressive sensing (ASCS) scheme is proposed whose measurement matrix can be updated according to the noise information fed back by the processing center. In terms of recovery quality, failure rate, and mean-square error (MSE), the scheme is compared with nonadaptive methods and existing CS measurement approaches. Extensive numerical experiments show that the proposed scheme has better noise suppression performance and improves the support recovery of sparse signals. The proposed scheme shows great potential for broadband-signal applications such as biological signal measurement and radar signal detection.
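
A minimal compressive-sensing sketch under signal noise, using a plain random Gaussian measurement matrix and a basic orthogonal matching pursuit recovery; the adaptive matrix update driven by fed-back noise information is not modeled:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse signal corrupted by signal noise, measured as y = Phi (x + w).
n, m, k = 256, 64, 5
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
w = 0.05 * rng.standard_normal(n)              # white Gaussian signal noise

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # measurement matrix
y = Phi @ (x + w)

def omp(Phi, y, k):
    """Greedy support recovery: pick the column most correlated with the residual."""
    support, r = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ r))))
        xs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ xs
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = xs
    return x_hat

x_hat = omp(Phi, y, k)
print(f"MSE = {np.mean((x - x_hat) ** 2):.4f}")
```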

A Theoretical Framework for Closeness Centralization Measurements in a Workflow-Supported Organization

  • Kim, Min-Joon;Ahn, Hyun;Park, Min-Jae
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.9
    • /
    • pp.3611-3634
    • /
    • 2015
  • In this paper, we build a theoretical framework for quantitatively measuring and graphically representing the degrees of closeness centralization among performers assigned to enact a workflow procedure. The degree of closeness centralization of a workflow performer reflects how near the performer is to the other performers in enacting a corresponding workflow model designed for workflow-supported organizational operations. The proposed framework comprises three procedural phases (discovery, analysis, and quantitation) and four functional transformations: ICN-to-WsoN, WsoN-to-SocioMatrix, SocioMatrix-to-DistanceMatrix, and DistanceMatrix-to-CCV. We develop a series of algorithmic formalisms for the procedural phases and their transformative functionalities, and verify the proposed framework through an operational example. Finally, we elaborate on the functional expansion of the closeness centralization formulas so that the theoretical framework can handle a group of workflow procedures (a workflow package) with organization-wide workflow performers.
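
A minimal sketch of the SocioMatrix-to-DistanceMatrix-to-CCV tail of the pipeline, assuming a small hypothetical sociomatrix of four performers; the ICN-to-WsoN discovery phase is omitted:

```python
import numpy as np

INF = np.inf
S = np.array([[0, 1, 1, 0],        # S[i][j] = 1 if performers i and j are
              [1, 0, 1, 0],        # directly connected in the WsoN
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)

# Floyd-Warshall all-pairs shortest paths over the sociomatrix.
D = np.where(S > 0, 1.0, INF)
np.fill_diagonal(D, 0.0)
n = len(D)
for k in range(n):
    D = np.minimum(D, D[:, [k]] + D[[k], :])

# Closeness of each performer: (n - 1) / sum of distances to the others.
closeness = (n - 1) / D.sum(axis=1)
print(np.round(closeness, 3))
```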

Document Clustering Using Semantic Features and Fuzzy Relations

  • Kim, Chul-Won;Park, Sun
    • Journal of information and communication convergence engineering
    • /
    • v.11 no.3
    • /
    • pp.179-184
    • /
    • 2013
  • Traditional clustering methods are usually based on the bag-of-words (BOW) model. A disadvantage of the BOW model is that it ignores the semantic relationships among terms in the data set. To resolve this problem, ontology or matrix factorization approaches are usually used. However, a major problem of the ontology approach is that it is usually difficult to find a comprehensive ontology covering all the concepts mentioned in a collection. This paper proposes a new document clustering method using semantic features and fuzzy relations that addresses the problems of the ontology and matrix factorization approaches. The proposed method can improve the quality of document clustering because the fuzzy relation values between semantic features and terms distinguish clearly among dissimilar documents in clusters. The selected cluster label terms can better represent the inherent structure of a document set by using semantic features derived from non-negative matrix factorization. The experimental results demonstrate that the proposed method achieves better performance than other document clustering methods.
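
A minimal sketch of the NMF half of the idea, assuming scikit-learn; a tiny illustrative corpus is factored into semantic features and each document is assigned to its dominant feature, without the paper's fuzzy-relation refinement:

```python
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus (illustrative only).
docs = [
    "robot arm control and motion planning",
    "motion planning for mobile robots",
    "stock market prediction with news data",
    "financial news and market analysis",
]

A = TfidfVectorizer().fit_transform(docs)          # documents x terms
model = NMF(n_components=2, init="nndsvd", random_state=0)
W = model.fit_transform(A)                         # document-feature weights
labels = W.argmax(axis=1)                          # hard cluster assignment
print(labels)                                      # e.g. [0 0 1 1]
```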

Preparations and Photovoltaic Properties of Dye-Sensitized Solar Cells Using Polymer Electrolytes

  • Kim, Mi-Ra;Shin, Won-Suk;Jin, Sung-Ho;Lee, Jin-Kook
    • Korean Society for New and Renewable Energy: Conference Proceedings
    • /
    • 2006.06a
    • /
    • pp.175-178
    • /
    • 2006
  • Solid-state dye-sensitized solar cells were fabricated using a polymer matrix in the electrolyte, with the aim of improving the durability of the dye-sensitized solar cell. In these cells, the polymer electrolyte, consisting of $I_2$, LiI, an ionic liquid, ethylene carbonate/propylene carbonate, and the polymer matrix, was cast onto a $TiO_2$ electrode impregnated with a ruthenium complex dye as the photosensitizer. The photovoltaic properties of solid-state dye-sensitized solar cells using a polymer matrix (PMMA, PEG, or PAN) were investigated. By comparing the photovoltaic performance of cells using hole-conducting polymers (BE or 6P) in place of the polymer matrix, we assessed the suitability of the solid-state polymer electrolyte for dye-sensitized solar cells.


Audio Fingerprint Retrieval Method Based on Feature Dimension Reduction and Feature Combination

  • Zhang, Qiu-yu;Xu, Fu-jiu;Bai, Jian
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.2
    • /
    • pp.522-539
    • /
    • 2021
  • In order to solve the problems of existing audio fingerprint methods when extracting fingerprints from long speech segments, such as excessively large fingerprint dimensions, poor robustness, and low retrieval accuracy and efficiency, a robust audio fingerprint retrieval method based on feature dimension reduction and feature combination is proposed. First, the Mel-frequency cepstral coefficients (MFCC) and linear prediction cepstrum coefficients (LPCC) of the original speech are extracted, and the MFCC and LPCC feature matrices are combined. Second, an information-entropy-based method is used for column dimension reduction, and the reduced feature matrix then undergoes row dimension reduction based on an energy feature. Finally, the audio fingerprint is constructed from the dimension-reduced combined feature matrix. For user retrieval of speech, the normalized Hamming distance algorithm is used for matching. Experimental results show that the proposed method produces smaller audio fingerprints with better robustness for long speech segments, and achieves higher retrieval efficiency while maintaining high recall and precision.
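
A minimal sketch of the entropy-based column reduction and Hamming-distance matching, using a random matrix as a stand-in for the combined MFCC/LPCC features; the energy-based row reduction is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
F = rng.random((40, 200))                       # frames x feature columns (stand-in)

def column_entropy(col, bins=16):
    """Shannon entropy of a feature column, estimated from a histogram."""
    p, _ = np.histogram(col, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

# Information-entropy column reduction: keep the 64 most informative columns.
keep = np.argsort([column_entropy(c) for c in F.T])[-64:]
reduced = F[:, np.sort(keep)]

# Binary fingerprint: 1 where a value exceeds its column median.
fp = (reduced > np.median(reduced, axis=0)).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance between two binary fingerprints."""
    return np.mean(a != b)

query = fp.copy()
query[0, :5] ^= 1                               # simulate slight degradation
print(f"distance = {hamming(fp, query):.4f}")   # small -> match
```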