• Title/Summary/Keyword: Vectorization

A cross-domain access control mechanism based on model migration and semantic reasoning

  • Ming Tan;Aodi Liu;Xiaohan Wang;Siyuan Shang;Na Wang;Xuehui Du
    • KSII Transactions on Internet and Information Systems (TIIS) / v.18 no.6 / pp.1599-1618 / 2024
  • Access control has always been one of the most effective methods of protecting data security. However, in new computing environments such as big data, data resources are distributed, shared across domains, massive, and dynamic, and traditional access control mechanisms struggle to meet these security needs. This paper proposes CACM-MMSR to solve the distributed cross-domain access control problem for massive resources. The method uses blockchain and smart contracts as a link between different security domains. A permission decision model migration method based on access control logs is designed; it migrates historical policies to address access control heterogeneity among different security domains and the updating of old and new policies within the same security domain. Meanwhile, a semantic reasoning-based permission decision method for unstructured text data is designed, which achieves flexible permission decisions through similarity thresholding. Experimental results show that the proposed method reduces the decision time cost of distributed access control to less than 28.7% of that of a single node, the permission decision model migration method achieves a high decision accuracy of 97.4%, and the semantic reasoning-based permission decision method outperforms the reference methods in vectorization and indexing time cost.
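
As a rough illustration of a similarity-threshold permission decision on vectorized text (not the paper's CACM-MMSR implementation; the example corpus, the TF-IDF representation, and the threshold value are assumptions), a minimal Python sketch:

```python
# Illustrative sketch of a permission decision via vectorization and a similarity threshold.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical policy: descriptions of resources the subject is allowed to access.
allowed_resource_texts = [
    "quarterly financial report for internal audit",
    "network maintenance schedule for operations team",
]
requested_text = "financial audit report for the second quarter"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(allowed_resource_texts + [requested_text])

# Compare the requested resource against each allowed resource.
similarities = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

THRESHOLD = 0.3  # assumed value; a real system would tune its own threshold
decision = "permit" if similarities.max() >= THRESHOLD else "deny"
print(similarities, decision)
```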

Real-Time Object Tracking Algorithm based on Minimal Contour in Surveillance Networks (서베일런스 네트워크에서 최소 윤곽을 기초로 하는 실시간 객체 추적 알고리즘)

  • Kang, Sung-Kwan;Park, Yang-Jae
    • Journal of Digital Convergence / v.12 no.8 / pp.337-343 / 2014
  • This paper proposes a minimal contour tracking algorithm that reduces data transmission for tracking mobile objects in surveillance networks, in terms of both detection and communication load. The algorithm performs detection for object tracking and, when transmitting image data from the camera to the server, minimizes the communication load by reducing the quantity of transmitted data. It uses a minimal tracking area based on the kinematics of the object: modeling the object's kinematics allows pruning away the parts of the tracking area that cannot be mechanically visited by the mobile object within the scheduled time. In applications that detect objects in real time, this makes it possible to reduce the transmission load that would otherwise arise from sending large amounts of image data.
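
A minimal sketch of the kinematics-based pruning idea (the frame size, maximum speed, time step, and margin below are assumptions, not values from the paper): only the region the object can physically reach before the next frame is cropped and transmitted.

```python
import numpy as np

def reachable_region(last_xy, max_speed, dt, frame_shape, margin=5):
    """Bounding box of positions the object can reach within dt seconds."""
    radius = int(max_speed * dt) + margin
    x, y = last_xy
    h, w = frame_shape[:2]
    x0, x1 = max(0, x - radius), min(w, x + radius)
    y0, y1 = max(0, y - radius), min(h, y + radius)
    return x0, y0, x1, y1

frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for a camera frame
x0, y0, x1, y1 = reachable_region(last_xy=(320, 240), max_speed=200, dt=0.1,
                                  frame_shape=frame.shape)
crop = frame[y0:y1, x0:x1]                        # only this patch is transmitted
print(crop.shape)                                  # far smaller than the full frame
```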

Improve the Performance of People Detection using Fisher Linear Discriminant Analysis in Surveillance (서베일런스에서 피셔의 선형 판별 분석을 이용한 사람 검출의 성능 향상)

  • Kang, Sung-Kwan;Lee, Jung-Hyun
    • Journal of Digital Convergence / v.11 no.12 / pp.295-302 / 2013
  • Many reported methods assume that the people in an image or an image sequence have already been identified and localized. People detection is one of the most important factors affecting system performance, as it is a basis technology for detecting other objects, for interaction between people and computers, and for motion recognition. In this paper, we present an efficient linear discriminant for multi-view people detection. Our approach is based on the Fisher linear discriminant, which provides an efficient learning method for the training data. People detection is considerably difficult because it is influenced by people's poses and by changes in illumination. The proposed idea solves the multi-view, multi-scale people detection problem quickly and efficiently, making it well suited to detecting people automatically. We extract people using a Fisher linear discriminant with hierarchical models that are invariant to pose and background, and we estimate the pose of the detected people. The purpose of this paper is to classify people and non-people using the Fisher linear discriminant.
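
A minimal sketch of classifying candidate windows as people vs. non-people with a Fisher linear discriminant, using scikit-learn's LinearDiscriminantAnalysis on stand-in feature vectors; this is not the paper's hierarchical multi-view detector.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
people_features = rng.normal(loc=1.0, size=(100, 64))       # label 1 (person)
non_people_features = rng.normal(loc=-1.0, size=(100, 64))  # label 0 (non-person)

X = np.vstack([people_features, non_people_features])
y = np.array([1] * 100 + [0] * 100)

# Fisher linear discriminant: projects features to maximize class separation.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

window = rng.normal(loc=1.0, size=(1, 64))  # a candidate detection window
print(lda.predict(window))                  # 1 -> person, 0 -> non-person
```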

A Stereo Image Recognition-Based Method for measuring the volume of 3D Object (스테레오 영상 인식에 기반한 3D 물체의 부피계측방법)

  • Jeong, Yun-Su;Lee, Hae-Won;Kim, Jin-Seok;Won, Jong-Un
    • The KIPS Transactions:PartB / v.9B no.2 / pp.237-244 / 2002
  • In this paper, we propose a stereo image recognition-based method for measuring the volume of a rectangular parallelepiped. The method measures the volume from two images captured with two CCD (charge-coupled device) cameras through sequential processes of ROI (region of interest) extraction, feature extraction, and stereo matching-based vertex recognition. The proposed method makes it possible to measure the volume of the 3D object at high speed because only a few features are used in the stereo matching process. Experimental results demonstrate that this method is very effective for measuring the volume of a rectangular parallelepiped at high speed.
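
A toy sketch of the underlying stereo geometry (the focal length, baseline, pixel coordinates, and disparities are invented, and this is not the paper's pipeline): back-project matched vertices into 3D camera coordinates and multiply the three edge lengths.

```python
import numpy as np

FOCAL = 800.0     # focal length in pixels (assumed)
BASELINE = 0.12   # distance between the two cameras in meters (assumed)

def to_3d(u, v, disparity, cx=320.0, cy=240.0):
    """Back-project a pixel (u, v) with a given disparity into camera coordinates."""
    z = FOCAL * BASELINE / disparity
    x = (u - cx) * z / FOCAL
    y = (v - cy) * z / FOCAL
    return np.array([x, y, z])

# Vertices adjacent to one corner of the box, matched across the stereo pair.
corner = to_3d(300, 200, disparity=40.0)
along_length = to_3d(420, 200, disparity=40.0)
along_width = to_3d(300, 200, disparity=48.0)
along_height = to_3d(300, 120, disparity=40.0)

length = np.linalg.norm(along_length - corner)
width = np.linalg.norm(along_width - corner)
height = np.linalg.norm(along_height - corner)
print(length * width * height)  # volume of the rectangular parallelepiped
```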

A Systematic Evaluation of Thinning Algorithms for Automatic Vectorization of Cartographic Maps (지리도면의 자동 벡터화를 위한 영상 세선화 알고리즘의 체계적인 성능평가)

  • Lee, Kyung-Ho;Kim, Kyong-Ho;Cho, Sung-Bae;Choy, Yoon-Chul
    • The Transactions of the Korea Information Processing Society / v.4 no.12 / pp.2960-2970 / 1997
  • In a variety of fields, there has recently been growing interest in Geographic Information Systems (GIS), which facilitate efficient storage and retrieval of geographic information. Choosing an efficient input method is of great importance, because input accounts for most of the time and cost of constructing a GIS. Among the several processing steps, thinning the input image to produce a skeleton of unit width is a prerequisite for the automatic input of geographic maps. In this paper, we systematically evaluate the performance of representative thinning algorithms on geographic maps such as contour, cadastral, and water-and-sewer maps, and suggest an appropriate algorithm for each type of map. A thorough experiment indicates that Arcelli's method is best for contour maps, Holt's method for cadastral maps, and Chen's method for water-and-sewer maps.
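
As an illustration of the thinning step that precedes vectorization, here is a small sketch using scikit-image's skeletonize on a synthetic binary stroke; this is a generic skeletonization routine, not one of the algorithms evaluated in the paper.

```python
import numpy as np
from skimage.morphology import skeletonize

# Synthetic stand-in for a scanned map line: a thick horizontal stroke.
binary_map = np.zeros((50, 200), dtype=bool)
binary_map[20:27, 10:190] = True

# Thin the stroke down to a skeleton of unit width.
skeleton = skeletonize(binary_map)

print(binary_map.sum(), skeleton.sum())  # the skeleton keeps far fewer pixels
# The unit-width skeleton can then be traced into polylines (vector data).
```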

The automatic Lexical Knowledge acquisition using morpheme information and Clustering techniques (어절 내 형태소 출현 정보와 클러스터링 기법을 이용한 어휘지식 자동 획득)

  • Yu, Won-Hee;Suh, Tae-Won;Lim, Heui-Seok
    • The Journal of Korean Association of Computer Education / v.13 no.1 / pp.65-73 / 2010
  • This study proposes an unsupervised lexical knowledge acquisition model to overcome the limitations of manually built lexical knowledge in supervised approaches to natural language processing. The proposed model automatically obtains lexical knowledge from the input lexical entries through a process of vectorization, clustering, and lexical knowledge acquisition. We show how the size and characteristics of the resulting lexical knowledge dictionary change as the model's parameters are varied. The experimental results suggest that automatic construction of a machine-readable dictionary is feasible, since the number of lexical-class clusters was observed to converge to a constant; moreover, building a lexical dictionary that includes left- and right-morphosyntactic information reflects the characteristics of Korean.
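
A rough sketch of the vectorize-then-cluster step, using character n-gram counts and k-means on hypothetical English word forms in place of the paper's Korean eojeol morpheme features; it is not the paper's model.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans

# Hypothetical word forms standing in for Korean eojeol entries.
entries = ["walked", "walking", "walks", "talked", "talking",
           "house", "houses", "housing", "mouse", "mice"]

# Vectorization: represent each entry by its character n-gram counts.
vectorizer = CountVectorizer(analyzer="char", ngram_range=(2, 3))
X = vectorizer.fit_transform(entries)

# Clustering: group entries into candidate lexical classes.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

# Each cluster approximates a lexical class; its members share surface-form cues.
for cluster in range(3):
    print(cluster, [w for w, l in zip(entries, labels) if l == cluster])
```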

Abusive Detection Using Bidirectional Long Short-Term Memory Networks (양방향 장단기 메모리 신경망을 이용한 욕설 검출)

  • Na, In-Seop;Lee, Sin-Woo;Lee, Jae-Hak;Koh, Jin-Gwang
    • The Journal of Bigdata / v.4 no.2 / pp.35-45 / 2019
  • Recently, the social cost of the damage caused by malicious comments has been increasing, as seen in news reports of celebrities committing suicide as a result of such comments. The damage from malicious comments, including abusive language and slang, is increasing and spreading in various types and forms throughout society. In this paper, we propose a technique for detecting abusive language using a bidirectional long short-term memory (LSTM) neural network model. We collected comments from the web with a web crawler and removed stopwords and unused tokens such as English alphabet characters and special characters. For the preprocessed comments, a bidirectional LSTM model that considers both the preceding and following words of a sentence was used to detect abusive language. To train the bidirectional LSTM, the collected comments were morphologically analyzed and vectorized, and each word was labeled as abusive or not. Experimental results showed a performance of 88.79% on a total of 9,288 collected comments.
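
A minimal Keras sketch of a bidirectional LSTM classifier for abusive comments; the vocabulary size, sequence length, layer sizes, and dummy data below are assumptions rather than the paper's settings, and real inputs would be morpheme-analyzed, vectorized comments.

```python
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 20000   # assumed vocabulary after morphological analysis
MAX_LEN = 50         # assumed maximum comment length in tokens

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),                # token vectorization
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),   # reads both directions
    tf.keras.layers.Dense(1, activation="sigmoid"),            # abusive vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy data standing in for the crawled, preprocessed comments.
x = np.random.randint(0, VOCAB_SIZE, size=(32, MAX_LEN))
y = np.random.randint(0, 2, size=(32, 1))
model.fit(x, y, epochs=1, verbose=0)
print(model.predict(x[:1], verbose=0))  # probability that the comment is abusive
```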

Clustering Technique Using Relevance of Data and Applied Algorithms (데이터와 적용되는 알고리즘의 연관성을 이용한 클러스터링 기법)

  • Han Woo-Yeon;Nam Mi-Young;Rhee PhillKyu
    • The KIPS Transactions:PartB / v.12B no.5 s.101 / pp.577-586 / 2005
  • Many algorithms have been proposed for face recognition, which is one of the most successful applications in the image processing, pattern recognition, and computer vision fields. Research into which facial attributes make the target harder or easier to recognize has recently been active. In this paper, we propose a method to improve recognition performance by exploiting the relevance between face data and the applied algorithms, because the recognition performance of each algorithm changes with facial attributes such as illumination and expression. In the experiments, we use an n-tuple classifier, PCA, and Gabor wavelets as recognition algorithms, and we propose three vectorization methods. First, after clustering the data with the k-means algorithm, we estimate the fitness of the three recognition algorithms for each cluster, and then compose new clusters by merging clusters that select the same algorithm. For test data, we estimate the similarity to the new clusters and recognize the target using the nearest cluster's algorithm. As a result, we observe that the recognition performance improves over that of a single algorithm applied without clustering.
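
A simplified sketch of clustering the data and selecting the best-fitting algorithm per cluster; the generic scikit-learn classifiers and random features below stand in for the paper's n-tuple / PCA / Gabor recognizers and its face data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))        # stand-in face feature vectors
y = rng.integers(0, 5, size=300)      # stand-in identity labels

algorithms = [KNeighborsClassifier(3), LogisticRegression(max_iter=500),
              DecisionTreeClassifier(random_state=0)]

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
best_per_cluster = {}
for c in range(4):
    idx = kmeans.labels_ == c
    # Fitness of each algorithm on this cluster (training accuracy as a crude proxy).
    scores = [algo.fit(X[idx], y[idx]).score(X[idx], y[idx]) for algo in algorithms]
    best_per_cluster[c] = int(np.argmax(scores))

# At recognition time, route a sample to the algorithm chosen by its nearest cluster.
sample = X[:1]
cluster = int(kmeans.predict(sample)[0])
print("cluster", cluster, "-> algorithm", best_per_cluster[cluster])
```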

A Vectorization Technique at Object Code Level (목적 코드 레벨에서의 벡터화 기법)

  • Lee, Dong-Ho;Kim, Ki-Chang
    • The Transactions of the Korea Information Processing Society / v.5 no.5 / pp.1172-1184 / 1998
  • ILP (Instruction Level Parallelism) processors use code-reordering algorithms to expose parallelism in a given sequential program. When applied to a loop, such an algorithm produces a software-pipelined loop, in which each iteration contains a sequence of parallel instructions composed of data-independent instructions collected across several iterations. For vector loops, however, the software-pipelining technique cannot expose the maximum parallelism because it schedules the program based only on data dependencies. This paper proposes scheduling vector loops differently. We develop an algorithm to detect vector loops at the object code level and suggest a new vector scheduling algorithm for them. Our vector scheduling improves performance because it can schedule based not only on data dependencies but also on loop structure and iteration conditions at the object code level. We compare the resulting schedules with those produced by software-pipelining techniques in terms of performance.
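
The paper works on real object code; as a toy illustration of what distinguishes a vector loop, the sketch below (with a made-up instruction format, not the paper's detection algorithm) checks a loop body for loop-carried dependences, the property that lets all iterations run in parallel.

```python
def is_vector_loop(body):
    """True if no source reads an array element written in a different iteration."""
    writes = {dest for dest, _ in body}
    for _, srcs in body:
        for src in srcs:
            for dest in writes:
                # Same array, different offset relative to the loop index i
                # -> a loop-carried dependence, so this is not a vector loop.
                if dest[0] == src[0] and dest[1] != src[1]:
                    return False
    return True

# Each instruction is (dest, srcs); operands are (array, offset) relative to index i.
# A[i] = B[i] + C[i]   -> no loop-carried dependence: a vector loop
vector_body = [(("A", 0), [("B", 0), ("C", 0)])]
# A[i] = A[i-1] + B[i] -> recurrence on A: not a vector loop
recurrence_body = [(("A", 0), [("A", -1), ("B", 0)])]

print(is_vector_loop(vector_body))      # True
print(is_vector_loop(recurrence_body))  # False
```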

Impact of rock microstructures on failure processes - Numerical study based on DIP technique

  • Yu, Qinglei;Zhu, Wancheng;Tang, Chun'an;Yang, Tianhong
    • Geomechanics and Engineering / v.7 no.4 / pp.375-401 / 2014
  • It is generally accepted that material heterogeneity has a great influence on the deformation, strength, damage, and failure modes of rock. This paper presents a numerical simulation of the rock failure process based on characterizing rock heterogeneity with a digital image processing (DIP) technique. The actual heterogeneity of rock at the mesoscopic scale (characterized as minerals) is retrieved by a vectorization transformation method applied to a digital image of the rock surface, and it is imported into the well-established numerical code Rock Failure Process Analysis (RFPA) in order to examine the effect of rock heterogeneity on the rock failure process. In this way, the numerical model of rock can be built from the actual characterization of its heterogeneity at the meso-scale. Images of granite are then taken as an example to illustrate the implementation of the DIP technique in simulating the rock failure process. Three numerical examples are presented to demonstrate the impact of actual rock heterogeneity, due to the spatial distribution of constituent mineral grains (e.g., feldspar, quartz, and mica), on the macro-scale mechanical response, and the associated rock failure mechanism at the meso-scale is clarified. The numerical results indicate that the shape and distribution of the constituent mineral grains have a pronounced impact on stress distribution and concentration, which may further control the failure process of granite. The proposed method provides an efficient tool for studying the mechanical behavior of heterogeneous rock and rock-like materials whose failure processes are strongly influenced by material heterogeneity.
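
A schematic sketch of the digital image processing step described above: mapping pixel intensities of a rock-surface image to mineral IDs and then to per-element material properties that a numerical model could import. The intensity thresholds and modulus values are placeholders, not the paper's calibration.

```python
import numpy as np

# Stand-in for a grayscale photograph of a granite surface.
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(100, 100))

# Assumed intensity ranges: dark -> mica, mid -> feldspar, bright -> quartz.
material_id = np.zeros_like(image)
material_id[image < 85] = 1                       # mica
material_id[(image >= 85) & (image < 170)] = 2    # feldspar
material_id[image >= 170] = 3                     # quartz

# Each mineral gets its own elastic modulus (placeholder values, in Pa).
modulus_by_id = {1: 30e9, 2: 60e9, 3: 90e9}
modulus_grid = np.vectorize(modulus_by_id.get)(material_id)

print(np.unique(material_id, return_counts=True))
print(modulus_grid.shape)  # one mechanical property per pixel/element
```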