• Title/Summary/Keyword: Font Classification Rule

Standardization Study of Font Shape Classification for Hangul Font Registration System (한글 글꼴 등록 시스템을 위한 글꼴 모양 분류체계 표준화 연구)

  • Kim, Hyun-Young; Lim, Soon-Bum
    • Journal of Korea Multimedia Society / v.20 no.3 / pp.571-580 / 2017
  • Recently, much text-based communication software has appeared on various smart devices. Unlike traditional print publishing, mobile publishing and SNS tools tend to use more decorative or more emotional fonts so that users can convey feeling through their content, and font providers have released new fonts that address these market requirements. Even though many new fonts are released, general users rarely adopt them, because fonts can be searched only by font name or provider name, which leaves users no way to discover new ones. In this study, we propose font shape classification rules for a font registration system based on font design features. We verified the validity of the classification standard through experiments with 50 commercial fonts. The result of this study was also provided to the Korea Telecommunications Technology Association and adopted as a Korean industrial standard.
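
The abstract does not list the concrete shape attributes of the adopted standard, so the following is only a minimal Python sketch of how a registration system of this kind might tag fonts with design-feature attributes and let users search by shape rather than by name or provider. The attribute names, the FontRecord structure, and the example font are hypothetical, not the standard's categories.

```python
from dataclasses import dataclass, field

# Hypothetical shape attributes; the adopted standard's actual categories
# are not given in the abstract.
SHAPE_ATTRIBUTES = {
    "stroke_style": {"serif", "sans-serif", "brush", "decorative"},
    "weight": {"thin", "regular", "bold"},
    "mood": {"formal", "casual", "emotional"},
}

@dataclass
class FontRecord:
    name: str
    provider: str
    shape: dict = field(default_factory=dict)  # attribute name -> value, validated on registration

def register(registry: list, record: FontRecord) -> None:
    """Accept a font only if its shape tags use values defined by the classification rules."""
    for attr, value in record.shape.items():
        if value not in SHAPE_ATTRIBUTES.get(attr, set()):
            raise ValueError(f"unknown {attr}: {value}")
    registry.append(record)

def search_by_shape(registry: list, **criteria) -> list:
    """Find fonts by design features instead of by font or provider name."""
    return [f for f in registry if all(f.shape.get(k) == v for k, v in criteria.items())]

registry = []
register(registry, FontRecord("SampleBrush", "ExampleFoundry",
                              shape={"stroke_style": "brush", "mood": "emotional"}))
print([f.name for f in search_by_shape(registry, stroke_style="brush")])
```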

Block Classification of Document Images by Block Attributes and Texture Features (블록의 속성과 질감특징을 이용한 문서영상의 블록분류)

  • Jang, Young-Nae; Kim, Joong-Soo; Lee, Cheol-Hee
    • Journal of Korea Multimedia Society / v.10 no.7 / pp.856-868 / 2007
  • We propose an effective method for block classification in document images. The gray-level document image is converted to a binary image for block segmentation, and the binary image is smoothed to find the location and size of each block; during this smoothing the inner block height of each block is also obtained. The gray-level image is then divided into blocks using this location information. A SGLDM (spatial gray level dependence matrix) is computed for each gray-level document block, and seven second-order statistical texture features that reflect the document attributes are extracted from the (0,1)-direction SGLDM. Document image blocks are first classified into two groups, text and non-text, by the inner block height using the nearest-neighbor rule. The seven texture features extracted from the SGLDM are then used to separate five detailed categories: small-font, large-font, table, graphic, and photo blocks. The classified blocks are useful not only for structure analysis in document recognition but also for various other applications.
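
As a rough illustration of the texture stage described above, the sketch below builds the (0,1)-direction SGLDM (a gray level co-occurrence matrix) for a gray-level block and classifies blocks with a 1-nearest-neighbor rule, using scikit-image and scikit-learn as stand-ins. The paper's exact seven second-order features and the inner-block-height text/non-text pre-classification are not reproduced; the property list below is an assumption.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier

# GLCM properties used as stand-ins for the paper's seven second-order statistics.
PROPS = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation", "ASM"]

def block_texture_features(block: np.ndarray) -> np.ndarray:
    """GLCM/SGLDM features for one uint8 gray-level block; distance 1, angle 0 is the (0,1) offset."""
    glcm = graycomatrix(block, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return np.array([graycoprops(glcm, p)[0, 0] for p in PROPS])

def train_block_classifier(blocks, labels):
    """1-NN over texture features; labels are e.g. 'small font', 'large font', 'table', 'graphic', 'photo'."""
    X = np.stack([block_texture_features(b) for b in blocks])
    return KNeighborsClassifier(n_neighbors=1).fit(X, labels)

# Usage with segmented blocks:
#   clf = train_block_classifier(train_blocks, train_labels)
#   predicted = clf.predict([block_texture_features(new_block)])
```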

Hangul Recognition Using a Hierarchical Neural Network (계층구조 신경망을 이용한 한글 인식)

  • 최동혁;류성원;강현철;박규태
    • Journal of the Korean Institute of Telematics and Electronics B / v.28B no.11 / pp.852-858 / 1991
  • An adaptive hierarchical classifier (AHCL) for Korean character recognition using neural networks is designed. The classifier consists of two neural networks: a USACL (Unsupervised Adaptive Classifier) and a SACL (Supervised Adaptive Classifier). The USACL has an input layer and an output layer, which are fully connected; the nodes of the output layer are generated during learning by an unsupervised, nearest-neighbor learning rule. The SACL has an input layer, a hidden layer, and an output layer; the input and hidden layers are fully connected, while the hidden and output layers are partially connected. The nodes of the SACL are generated during learning by a supervised, nearest-neighbor learning rule. The USACL provides a pre-attentive effect, performing a partial search instead of a full search during SACL classification to improve processing speed. The input to both the USACL and the SACL is a directional edge feature obtained with a directional receptive field. To test the performance of the AHCL, various multi-font printed Hangul characters were used for learning and testing, and its processing speed and classification rate were compared with those of a conventional LVQ (Learning Vector Quantizer), which uses the nearest-neighbor learning rule.
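
The abstract does not give the node-generation thresholds or the directional edge feature extractor, so the Python sketch below only mirrors the two-stage structure: an unsupervised nearest-neighbor stage that grows coarse prototype nodes (the pre-attentive role of the USACL), followed by a supervised nearest-prototype search restricted to the coarse group selected by the first stage. The radius threshold and the feature vectors are placeholders, not the authors' network.

```python
import numpy as np

class HierarchicalNNClassifier:
    """Structural sketch of the AHCL idea: coarse unsupervised prototypes limit the
    supervised nearest-prototype search to a partial candidate set."""

    def __init__(self, coarse_radius=2.0):
        self.coarse_radius = coarse_radius   # placeholder node-generation threshold
        self.coarse_nodes = []               # unsupervised prototypes (USACL-like stage)
        self.fine_nodes = []                 # (coarse_index, prototype, label) triples (SACL-like stage)

    def fit(self, features, labels):
        for x, y in zip(features, labels):
            ci = self._nearest_coarse(x)
            if ci is None or np.linalg.norm(x - self.coarse_nodes[ci]) > self.coarse_radius:
                self.coarse_nodes.append(np.asarray(x, dtype=float))  # generate a new coarse node
                ci = len(self.coarse_nodes) - 1
            self.fine_nodes.append((ci, np.asarray(x, dtype=float), y))
        return self

    def predict(self, x):
        ci = self._nearest_coarse(x)  # pre-attentive coarse selection
        candidates = [(p, y) for c, p, y in self.fine_nodes if c == ci]
        _, label = min(candidates, key=lambda py: np.linalg.norm(x - py[0]))
        return label

    def _nearest_coarse(self, x):
        if not self.coarse_nodes:
            return None
        return int(np.argmin([np.linalg.norm(x - c) for c in self.coarse_nodes]))
```

With real data the feature vectors would be the directional edge responses the paper describes; here any fixed-length numpy arrays work, e.g. `HierarchicalNNClassifier().fit(train_vectors, train_labels).predict(test_vector)`.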
