• Title/Summary/Keyword: modal experiment (모달실험)

Search results: 163 (processing time: 0.02 seconds)

The Interactive Effect of Translational Drift and Torsional Deformation on Shear Force and Torsional Moment (전단력 및 비틀림 모멘트에 의한 병진 변형 및 비틀림 변형의 상호 작용 효과)

  • Kim, In-Ho;Abegaz, Ruth A.
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.35 no.5
    • /
    • pp.277-286
    • /
    • 2022
  • The elastic and inelastic responses obtained from the experimental and analytical results of two RC building structures under the service level earthquake (SLE) and maximum considered earthquake (MCE) in Korea were used to investigate the characteristics of the mechanisms resisting shear and torsional behavior in torsionally unbalanced structures. Equations representing the interactive effect of translational drift and torsional deformation on the shear force and torsional moment were proposed. Because the elastic and inelastic forces and strains show no correlation in their behavior, the incremental shear forces and incremental torsional moments were analyzed in terms of their corresponding incremental drifts and incremental torsional deformations with respect to the yield, unloading, and reloading phases around the maximum edge-frame drift. In the elastic combination of the two dominant modes, the translational drift contributes mainly to the shear force, whereas the torsional deformation contributes significantly to the overall torsional moment. However, this relationship is largely altered in the inelastic response, in which the incremental translational drift contributes to both the incremental shear forces and the incremental torsional moments. In addition, the proposed equation accounts for all observed phenomena, such as the reduction in torsional eccentricity, the degradation of torsional stiffness, and the apparent energy generation in the inelastic response.
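For context on the coupling this abstract describes, the elastic response of a one-story torsionally unbalanced system is commonly written with both the translational drift and the torsional deformation contributing to both resultants. The sketch below is standard textbook structural dynamics, not the equations proposed in the paper, and the symbols are illustrative assumptions:

$$V = K_y\,u_y + K_y e_s\,\theta, \qquad T = K_y e_s\,u_y + K_\theta\,\theta$$

where $u_y$ is the translational drift, $\theta$ the torsional deformation, $K_y = \sum_i k_i$ the lateral stiffness, $e_s = \sum_i k_i x_i / K_y$ the stiffness eccentricity, and $K_\theta = \sum_i k_i x_i^2$ the torsional stiffness of resisting elements with stiffnesses $k_i$ at distances $x_i$ from the reference axis. The off-diagonal terms $K_y e_s$ are what make each deformation component contribute to both the shear force $V$ and the torsional moment $T$; this is the elastic form of the interaction that the abstract discusses.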

Artificial Intelligence for Assistance of Facial Expression Practice Using Emotion Classification (감정 분류를 이용한 표정 연습 보조 인공지능)

  • Kim, Dong-Kyu;Lee, So Hwa;Bong, Jae Hwan
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.17 no.6
    • /
    • pp.1137-1144
    • /
    • 2022
  • In this study, an artificial intelligence (AI) was developed to assist facial expression practice for expressing emotions. The developed AI used multimodal inputs consisting of sentences and facial images for deep neural networks (DNNs). The DNNs calculated the similarity between the emotion predicted from the sentence and the emotion predicted from the facial image. The user practiced facial expressions based on the situation given by a sentence, and the AI provided the user with numerical feedback based on the similarity between the emotion predicted from the sentence and the emotion predicted from the facial expression. A ResNet34 structure was trained on the public FER2013 dataset to predict emotions from facial images. To predict emotions from sentences, a KoBERT model was trained in a transfer-learning manner using the conversational speech dataset for emotion classification released to the public by AIHub. The DNN that predicts emotions from facial images achieved 65% accuracy, which is comparable to human emotion-classification ability, and the DNN that predicts emotions from sentences achieved 90% accuracy. The performance of the developed AI was evaluated through facial-expression-change experiments in which an ordinary person participated.
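To make the pipeline in this abstract concrete, a minimal sketch is given below: an image branch built on a torchvision ResNet34 with its classifier head replaced for the 7 FER2013 emotion classes, a placeholder for the fine-tuned KoBERT sentence classifier (not reproduced here), and a cosine-similarity feedback score between the two predicted emotion distributions. The class ordering, preprocessing, and the choice of cosine similarity are assumptions; the paper's exact training setup and similarity measure are not specified in the abstract.

```python
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# FER2013 has 7 emotion classes; this ordering is an assumption.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# Image branch: ResNet34 backbone with its final layer replaced for 7 classes,
# as described in the abstract. Weight loading / fine-tuning is not shown here.
image_model = models.resnet34(weights=None)
image_model.fc = torch.nn.Linear(image_model.fc.in_features, len(EMOTIONS))
image_model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def predict_image_emotion(image_path: str) -> torch.Tensor:
    """Return a probability distribution over EMOTIONS for a face image."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return F.softmax(image_model(x), dim=1).squeeze(0)

def predict_text_emotion(sentence: str) -> torch.Tensor:
    """Placeholder for the fine-tuned KoBERT classifier (not reproduced here);
    it is assumed to return a probability distribution over EMOTIONS as well."""
    raise NotImplementedError

def feedback_score(sentence: str, image_path: str) -> float:
    """Numerical feedback: similarity between the two predicted emotion
    distributions. Cosine similarity is one plausible choice; the paper's
    actual similarity measure is not given in the abstract."""
    p_text = predict_text_emotion(sentence)
    p_image = predict_image_emotion(image_path)
    return F.cosine_similarity(p_text.unsqueeze(0), p_image.unsqueeze(0)).item()
```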

Exploring the Effects of Passive Haptic Factors When Interacting with a Virtual Pet in Immersive VR Environment (몰입형 VR 환경에서 가상 반려동물과 상호작용에 관한 패시브 햅틱 요소의 영향 분석)

  • Kim, Donggeun;Jo, Dongsik
    • Journal of the Korea Computer Graphics Society
    • /
    • v.30 no.3
    • /
    • pp.125-132
    • /
    • 2024
  • Recently, immersive virtual reality (IVR) technologies have been applied to various services such as education, training, entertainment, industry, healthcare, and remote collaboration. In particular, research on visualizing and interacting with virtual humans is being actively pursued, and research on virtual pets in IVR is also emerging. For interaction with a virtual pet, as in real-world interaction scenarios, the most important element is physical contact such as haptic feedback and non-verbal interaction (e.g., gestures). This paper investigates the effects of passive haptic factors (e.g., shape and texture) provided by mapping physical props to the virtual pet. Experimental results show significant differences in immersion, co-presence, realism, and friendliness depending on the level of the texture element when interacting with the virtual pet through passive haptic feedback. Additionally, as a main finding from the statistical interaction between the two variables, we found an uncanny valley effect in terms of friendliness. We expect our results to provide guidelines for creating interactive content with virtual pets in immersive VR environments.
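The abstract reports a statistical interaction between two passive-haptic variables. A minimal sketch of how such an interaction is typically tested with a two-way ANOVA is shown below; the factor names, levels, and ratings are hypothetical, and the paper's actual analysis method is not specified in the abstract.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical questionnaire data: two passive-haptic factors (shape fidelity,
# texture fidelity) and a friendliness rating per trial (illustrative only).
df = pd.DataFrame({
    "shape":   ["low", "low", "high", "high"] * 6,
    "texture": ["low", "high", "low", "high"] * 6,
    "friendliness": [3.1, 4.0, 3.4, 5.6, 2.9, 4.2, 3.6, 5.1,
                     3.3, 3.8, 3.2, 5.4, 3.0, 4.1, 3.5, 5.2,
                     2.8, 3.9, 3.7, 5.5, 3.2, 4.3, 3.1, 5.0],
})

# Fit a model with main effects and their interaction, then run a type-II ANOVA.
model = ols("friendliness ~ C(shape) * C(texture)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)  # the C(shape):C(texture) row tests the interaction effect
```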