• Title/Summary/Keyword: classification of expressions


A Study on Definition and Classification of Expressions Dealt with in Elementary Mathematics (초등학교 수학에서 취급하는 식의 정의와 분류에 관한 연구)

  • Ko, Jun Seok;Kim, Ji Won;Park, Kyo
    • School Mathematics
    • /
    • v.16 no.2
    • /
    • pp.303-315
    • /
    • 2014
  • Even though a variety of expressions is dealt with in Korean elementary mathematics, the systematization of this subject matter is still insufficient. This is basically due to the failure to clearly establish the identity of the expressions dealt with in elementary mathematics. In this paper, as groundwork for improving this situation, signs were first classified as the elements constituting expressions, from a standpoint that takes into account elementary mathematics' use of transitional signs such as □ and △ and of words or phrases within expressions; expressions were then defined and classified based on that classification of signs. It can be presented as the conclusion that the following four judgments, which help promote the systematization of the subject matter of expressions, are possible through this definition and classification. First, by clarifying the identity of expressions, any mathematical clause or sentence can be judged to be an expression or not. Second, the forms of expressions can be identified. Third, the subject matter of expressions can be identified systematically. Fourth, the hierarchy of the subject matter of expressions can be identified.


Facial Expression Classification through Covariance Matrix Correlations

  • Odoyo, Wilfred O.;Cho, Beom-Joon
    • Journal of information and communication convergence engineering
    • /
    • v.9 no.5
    • /
    • pp.505-509
    • /
    • 2011
  • This paper attempts to classify known facial expressions and to establish the correlation between two facial regions (eyes + eyebrows and mouth) in identifying the six prototypic expressions. Covariance is used as a region texture descriptor that captures facial features for classification; the captured texture exhibits the patterns observed during the execution of particular expressions. Feature matching is done by a simple distance measure between the probe and the modeled representations of the eye and mouth components. We use the JAFFE database in this experiment to validate our claim. A high classification rate is observed from the mouth component and from the correlation between the two (eye and mouth) components, whereas the eye component exhibits a lower classification rate when used independently.
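The descriptor-and-distance pipeline the abstract describes can be sketched as follows. This is a minimal, generic region-covariance example, not the paper's exact feature set: the per-pixel features and the Frobenius distance are assumptions (a geodesic metric on symmetric positive-definite matrices is another common choice).

```python
import numpy as np

def region_covariance(patch):
    """Covariance descriptor of an image region.

    Each pixel contributes a feature vector [x, y, intensity, |dI/dy|, |dI/dx|];
    the region is summarized by the 5x5 covariance of these vectors.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(patch.astype(float))
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(gy).ravel(), np.abs(gx).ravel()], axis=1)
    return np.cov(feats, rowvar=False)

def descriptor_distance(c1, c2):
    """Simple (Frobenius) distance between two covariance descriptors."""
    return np.linalg.norm(c1 - c2)

# Match the mouth region of a probe face against stored expression models
# (random arrays stand in for real mouth crops).
rng = np.random.default_rng(0)
probe = rng.random((32, 48))
models = {"happy": rng.random((32, 48)), "sad": rng.random((32, 48))}
cp = region_covariance(probe)
best = min(models, key=lambda k: descriptor_distance(cp, region_covariance(models[k])))
print(best)
```

The same routine applied to the eye+eyebrow region would give a second vote, which is how the two components can be combined or compared.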

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun;Sohn, Sun-Ju;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.5
    • /
    • pp.617-627
    • /
    • 2012
  • Objective: The aim of the study was to develop a Korean dictionary of widely used emotion-expressing idioms. This is anticipated to assist the development of software technology that recognizes and responds to verbally expressed human emotions. Method: Through rigorous and strategic classification processes, the idiomatic expressions included in this dictionary were rated in terms of nine different emotions (i.e., happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain) for the meaning and intensity associated with each expression. Result: The Korean dictionary of emotion-expressing idioms included 427 expressions, with approximately two thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study rests primarily in the development of a practical language tool that contains Korean idiomatic expressions of emotions, provides information on meaning and intensity, and identifies idioms connoting two or more emotions. Application: Study findings can be utilized in emotion recognition research, particularly in identifying primary and secondary emotions as well as in understanding the intensity associated with various idioms used to express emotion. In clinical settings, information provided by this research may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
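A dictionary of this shape lends itself to a simple lookup structure. The sketch below is an assumed data layout, not the actual dictionary: the two idioms and their intensity ratings are illustrative placeholders.

```python
# Hypothetical structure: each idiom maps to one or more of the nine emotions
# with an intensity score in [0, 1]. Entries below are illustrative only.
EMOTIONS = ("happiness", "sadness", "fear", "anger", "surprise",
            "disgust", "interest", "boredom", "pain")

idiom_dict = {
    "가슴이 찢어지다": {"sadness": 0.9, "pain": 0.6},  # "one's heart is torn"
    "어깨가 으쓱하다": {"happiness": 0.7},              # "one's shoulders lift"
}

def primary_emotion(idiom):
    """Return the highest-intensity emotion for an idiom, or None if unknown."""
    ratings = idiom_dict.get(idiom)
    if not ratings:
        return None
    return max(ratings, key=ratings.get)

print(primary_emotion("가슴이 찢어지다"))  # sadness
```

Idioms with two or more rated emotions (like the first entry) are exactly the multi-emotion cases the abstract highlights; the secondary emotion is the next-highest rating.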

A Topic Classification System Based on Clue Expressions for Person-Related Questions and Passages (단서표현 기반의 인물관련 질의-응답문 문장 주제 분류 시스템)

  • Lee, Gyoung Ho;Lee, Kong Joo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.4 no.12
    • /
    • pp.577-584
    • /
    • 2015
  • In general, a Q&A system retrieves passages by matching the terms of a question in order to find an answer to it. However, it is difficult for a Q&A system to find a correct answer because too many passages are retrieved, and term matching alone is not enough to rank them by their relevance to the question. To alleviate this problem, we introduce a topic for each sentence and adopt it for ranking in a Q&A system. We define a set of person-related topic classes and clue expressions that can indicate the topic of a sentence. The topic classification system proposed in this paper determines a target topic for an input sentence by using clue expressions, which are manually collected from a corpus. We explain the architecture of the topic classification system and evaluate the performance of its components.
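The clue-expression idea reduces to matching curated phrases against a sentence. A minimal sketch, with assumed topic classes and English clue lists standing in for the paper's manually collected Korean ones:

```python
# Assumed person-related topic classes and clue expressions; the paper's actual
# clue lists are collected manually from a corpus.
TOPIC_CLUES = {
    "birth":      ["was born", "born in"],
    "education":  ["graduated from", "studied at"],
    "occupation": ["worked as", "served as"],
}

def classify_topic(sentence):
    """Assign a person-related topic to a sentence by matching clue expressions."""
    s = sentence.lower()
    for topic, clues in TOPIC_CLUES.items():
        if any(clue in s for clue in clues):
            return topic
    return "other"

print(classify_topic("He was born in Seoul in 1920."))                  # birth
print(classify_topic("She graduated from Seoul National University."))  # education
```

In a Q&A pipeline, passages whose sentence topics agree with the question's topic would be ranked above those that merely share terms.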

Classification of Microarray Gene Expression Data by MultiBlock Dimension Reduction

  • Oh, Mi-Ra;Kim, Seo-Young;Kim, Kyung-Sook;Baek, Jang-Sun;Son, Young-Sook
    • Communications for Statistical Applications and Methods
    • /
    • v.13 no.3
    • /
    • pp.567-576
    • /
    • 2006
  • In this paper, we applied multiblock dimension reduction methods to the classification of tumors based on microarray gene expression data. The procedure involves clustering selected genes, multiblock dimension reduction, and classification using linear discriminant analysis and quadratic discriminant analysis.
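The final LDA/QDA step can be sketched from first principles: both fit class-conditional Gaussians, differing only in whether the covariance is pooled across classes. This is a generic sketch on synthetic low-dimensional data, standing in for gene expressions after the clustering and multiblock reduction steps.

```python
import numpy as np

def fit_gaussian_classifier(X, y, shared_cov=True):
    """Fit class-conditional Gaussians: a shared (pooled) covariance gives LDA,
    per-class covariances give QDA."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    if shared_cov:
        pooled = sum(np.cov(X[y == c], rowvar=False) * (np.sum(y == c) - 1)
                     for c in classes) / (len(y) - len(classes))
        covs = {c: pooled for c in classes}
    else:
        covs = {c: np.cov(X[y == c], rowvar=False) for c in classes}
    return classes, means, covs

def predict(X, model):
    classes, means, covs = model
    scores = []
    for c in classes:
        d = X - means[c]
        inv = np.linalg.inv(covs[c])
        _, logdet = np.linalg.slogdet(covs[c])
        # log Gaussian density up to an additive constant
        scores.append(-0.5 * (np.einsum('ij,jk,ik->i', d, inv, d) + logdet))
    return classes[np.argmax(scores, axis=0)]

# Two synthetic tumor classes in a 3-dimensional reduced space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 3)), rng.normal(2, 1, (40, 3))])
y = np.repeat([0, 1], 40)
lda = fit_gaussian_classifier(X, y, shared_cov=True)
qda = fit_gaussian_classifier(X, y, shared_cov=False)
print((predict(X, lda) == y).mean(), (predict(X, qda) == y).mean())
```

Dimension reduction matters here because QDA needs one invertible covariance matrix per class, which is hopeless with thousands of genes and tens of samples.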

Expression of p53, CD44v6 and VEGF in Gastric Adenocarcinomas (위선암종의 예후인자로서 p53, CD44v6과 VEGF 단백 발현)

  • Park, Eon-Sub;Lee, Chang-Young;Lee, Tae-Jin;Kim, Mi-Kyung;Yoo, Jae-Hyung
    • Journal of Gastric Cancer
    • /
    • v.1 no.1
    • /
    • pp.10-16
    • /
    • 2001
  • Purpose: p53 is a tumor suppressor gene, and its mutation is associated with biological aggressiveness. CD44v6, a member of the CD44 family, is a cell-surface glycoprotein that plays a role in cancer invasion and metastasis. Vascular endothelial growth factor (VEGF) is another recently identified growth factor with significant angiogenic properties. The purpose of this study was to investigate p53, CD44v6, and VEGF expression to determine whether the degree of expression was related to pathological parameters such as Lauren's classification, depth of invasion, and lymph node metastasis. Materials and Methods: Immunohistochemical staining for p53, CD44v6, and VEGF was performed on formalin-fixed, paraffin-embedded tissue sections of 125 gastric adenocarcinomas. Results: The overall expression rates of p53, CD44v6, and VEGF were 54.4% (68/125), 36.8% (46/125), and 48.0% (60/125), respectively. Expression of p53, but not of CD44v6 or VEGF, was higher in intestinal-type gastric carcinomas by Lauren's classification. The expression of p53, CD44v6, and VEGF was statistically correlated with the depth of tumor invasion. The expression of CD44v6 was higher in the lymph node-positive group than in the node-negative group, and p53 expression was significantly associated with VEGF expression. Conclusions: These data suggest that the expression of p53, CD44v6, and VEGF is biologically related to malignancy. The p53 and CD44v6 expressions are independent; however, p53 gene mutation is one of the contributing factors to VEGF expression in gastric adenocarcinoma.

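Associations like these between a dichotomized marker (positive/negative staining) and a pathological parameter are commonly assessed with a Pearson chi-square test on a contingency table. The abstract does not name the test used, and the 2x2 counts below are illustrative, not the paper's data.

```python
def chi2_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    With 1 degree of freedom, a statistic above 3.84 is significant at p < 0.05.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row = (a + b, c + d)
    col = (a + c, b + d)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            observed = table[i][j]
            stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative table: rows = CD44v6 positive/negative, columns = node metastasis yes/no.
table = [[30, 16], [35, 44]]
print(round(chi2_2x2(table), 2))
```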

Review and proposed improvements for Romanization and English expressions of rubrics in the WHO ICD-11 beta version traditional medicine chapter (세계보건기구 국제질병분류 11판 베타버전 중 한의학 고유 상병의 로마자 표기 및 영문표현 검토연구)

  • Kim, Jin Youp;Yin, Chang Shik;Jo, Hee Jin;Kim, Kyu Ri;Kang, Da Hyun;Lee, Jong Ran;Kim, Yong Suk
    • Journal of Acupuncture Research
    • /
    • v.32 no.4
    • /
    • pp.47-68
    • /
    • 2015
  • Objectives : The purpose of this study is to review, and propose improvements for, the Romanization and English expressions in the WHO International Classification of Diseases 11th revision beta version (ICD-11b) traditional medicine chapter. Methods : ICD-11b as of October 5, 2015, was reviewed. Romanization and English expressions were analyzed with reference to existing standards such as the Basic Principles of Romanization stipulated by the National Institute of Korean Language and the Korean Standard Classification of Diseases (KCD), and improvements were then suggested. Results : According to the Basic Principles of Romanization, 131 ICD-11b rubrics need improvement in the Romanization of Korean. When compared to comparable KCD-6 rubrics, 161 ICD-11b rubrics are the same and 64 are different. When compared to comparable KCD-7 rubrics, 118 ICD-11b rubrics are the same and 51 are different. In KCD-6, 127 rubrics do not match any item in ICD-11b; in KCD-7, 123 rubrics do not match. Conclusions : ICD-11b may be improved by correcting the Romanization and by considering the English expressions suggested in this study.
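The rubric comparison described above amounts to counting entries that match, differ, or are missing between two classifications. A minimal sketch; the codes and strings below are placeholders, not actual ICD-11b or KCD entries.

```python
# Placeholder rubric tables: code -> English expression. Not real classification data.
icd11b = {"TM1": "gan stagnation pattern",
          "TM2": "bi stagnation pattern",
          "TM3": "qi deficiency pattern"}
kcd = {"TM1": "gan stagnation pattern",
       "TM2": "pi stagnation pattern",
       "U99": "other pattern"}

# Rubrics present in both with identical expressions, present but worded differently,
# and KCD rubrics with no ICD-11b counterpart.
same = [k for k in icd11b if kcd.get(k) == icd11b[k]]
different = [k for k in icd11b if k in kcd and kcd[k] != icd11b[k]]
unmatched_in_kcd = [k for k in kcd if k not in icd11b]
print(len(same), len(different), len(unmatched_in_kcd))  # 1 1 1
```

Running this kind of comparison against KCD-6 and KCD-7 separately yields the four counts reported in the Results.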

Facial Expression Classification Using Deep Convolutional Neural Network

  • Choi, In-kyu;Ahn, Ha-eun;Yoo, Jisang
    • Journal of Electrical Engineering and Technology
    • /
    • v.13 no.1
    • /
    • pp.485-492
    • /
    • 2018
  • In this paper, we propose facial expression recognition using a CNN (convolutional neural network), one of the deep learning technologies. The proposed structure has general classification performance for any environment or subject. For this purpose, we collect a variety of databases and organize them into six expression classes: 'expressionless', 'happy', 'sad', 'angry', 'surprised', and 'disgusted'. Pre-processing and data augmentation techniques are applied to improve training efficiency and classification performance. Starting from an existing CNN structure, the optimal structure that best expresses the features of the six facial expressions is found by adjusting the number of feature maps in the convolutional layers and the number of nodes in the fully connected layer. The experimental results show good classification performance compared to the state of the art in cross-validation and cross-database experiments. Compared to other conventional models, the proposed structure is also confirmed to be superior in classification performance with less execution time.
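The kind of architecture search the abstract describes — varying feature-map counts and fully connected layer sizes over a fixed convolutional skeleton — can be sketched in PyTorch. The layer sizes and 48x48 grayscale input below are assumptions for illustration, not the paper's tuned network.

```python
import torch
import torch.nn as nn

CLASSES = ["expressionless", "happy", "sad", "angry", "surprised", "disgusted"]

class ExpressionCNN(nn.Module):
    """Small conv net for six-class facial expression classification.

    The feature-map counts (16, 32) and hidden size (64) are the knobs one
    would vary when searching for the best-performing structure.
    """
    def __init__(self, n_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 12 * 12, 64), nn.ReLU(),  # 48x48 input -> 12x12 after two pools
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = ExpressionCNN()
logits = model(torch.randn(4, 1, 48, 48))  # batch of 4 grayscale 48x48 faces
print(logits.shape)  # torch.Size([4, 6])
```

Cross-database evaluation then means training on one expression database and testing on another, which is what probes the "any environment or subject" claim.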