• Title/Summary/Keyword: Term specificity

Improvement of Classification Accuracy of Different Finger Movements Using Surface Electromyography Based on Long Short-Term Memory (LSTM을 이용한 표면 근전도 분석을 통한 서로 다른 손가락 움직임 분류 정확도 향상)

  • Shin, Jaeyoung;Kim, Seong-Uk;Lee, Yun-Sung;Lee, Hyung-Tak;Hwang, Han-Jeong
    • Journal of Biomedical Engineering Research / v.40 no.6 / pp.242-249 / 2019
  • Forearm electromyography (EMG) generated by wrist movements has been widely used to develop electrical prosthetic hands, but EMG generated by finger movements has rarely been used even though 20% of amputees lose fingers. The goal of this study is to improve the classification performance of different finger movements using a deep learning algorithm and thereby contribute to the development of a high-performance finger-based prosthetic hand. Ten participants took part in this study, and each performed seven different finger movements (thumb, index, middle, ring, little, fist, and rest) forty times while EMG was measured from the back of the right hand using four bipolar electrodes. We extracted the mean absolute value (MAV), root mean square (RMS), and mean (MEAN) of the measured EMG in each trial as features, and 5x5-fold cross-validation was performed to estimate the classification performance for the seven finger movements. A long short-term memory (LSTM) model was used as the classifier, and linear discriminant analysis (LDA), a classifier widely used in previous studies, was used for comparison. The best performance of the LSTM model (sensitivity: 91.46 ± 6.72%; specificity: 91.27 ± 4.18%; accuracy: 91.26 ± 4.09%) significantly outperformed that of LDA (sensitivity: 84.55 ± 9.61%; specificity: 84.02 ± 6.00%; accuracy: 84.00 ± 5.87%). Our results demonstrate the feasibility of a deep learning algorithm (LSTM) for improving the classification of different finger movements using EMG.
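
As a rough illustration of the feature set named in the abstract (MAV, RMS, and MEAN per channel), here is a minimal NumPy sketch; the array shapes and the synthetic data are assumptions for illustration, not the authors' code.

```python
import numpy as np

def emg_features(trial: np.ndarray) -> np.ndarray:
    """Compute MAV, RMS, and MEAN per channel for one EMG trial.

    trial: array of shape (n_channels, n_samples), e.g. 4 bipolar channels.
    Returns a flat feature vector of length 3 * n_channels.
    """
    mav = np.mean(np.abs(trial), axis=1)        # mean absolute value
    rms = np.sqrt(np.mean(trial ** 2, axis=1))  # root mean square
    mean = np.mean(trial, axis=1)               # simple mean
    return np.concatenate([mav, rms, mean])

# Example with synthetic data: 4 channels, 1 s at 1 kHz
rng = np.random.default_rng(0)
fake_trial = rng.normal(size=(4, 1000))
print(emg_features(fake_trial).shape)  # (12,)
```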

Study of regularization of long short-term memory(LSTM) for fall detection system of the elderly (장단기 메모리를 이용한 노인 낙상감지시스템의 정규화에 대한 연구)

  • Jeong, Seung Su;Kim, Namg Ho;Yu, Yun Seop
    • Journal of the Korea Institute of Information and Communication Engineering / v.25 no.11 / pp.1649-1654 / 2021
  • In this paper, we introduce regularization of a long short-term memory (LSTM) based fall detection system, implemented in TensorFlow, that can detect falls occurring in the elderly. Fall detection uses data from a 3-axis acceleration sensor attached to the body of an elderly person and learns behavior patterns, of which 7 are patterns that occur in daily life and the remaining 3 are fall patterns. During training, a normalization process is performed to effectively reduce the loss function: maximum-minimum normalization is applied to the data and L2 regularization to the loss function. The optimal regularization conditions of the LSTM, using several fall parameters obtained from the 3-axis accelerometer, are explained. When the normalization value and the regularization rate λ for the sum vector magnitude (SVM) are 127 and 0.00015, respectively, the best sensitivity, specificity, and accuracy are 98.4%, 94.8%, and 96.9%, respectively.
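
A minimal sketch of the ingredients the abstract names (sum vector magnitude from 3-axis acceleration, maximum-minimum normalization, and an L2-regularized LSTM in TensorFlow); the window length, layer width, and class count are illustrative assumptions rather than details from the paper.

```python
import numpy as np
import tensorflow as tf

def sum_vector_magnitude(acc: np.ndarray) -> np.ndarray:
    """Sum vector magnitude (SVM) of 3-axis acceleration; acc has shape (n_samples, 3)."""
    return np.sqrt(np.sum(acc ** 2, axis=1))

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    """Maximum-minimum normalization of the data to the [0, 1] range."""
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

# A small LSTM classifier with L2 regularization on its input weights; the
# regularization rate reuses the λ quoted in the abstract.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(100, 1)),  # windows of normalized SVM values
    tf.keras.layers.LSTM(32, kernel_regularizer=tf.keras.regularizers.l2(0.00015)),
    tf.keras.layers.Dense(10, activation="softmax"),  # behavior classes (ADL + falls)
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```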

A Study on Term Life Cycle for Science & Technology Terms -Focused on 'ETNEWS' Corpus- (과학기술 용어에 대한 용어 생명주기 고찰 -전자신문 말뭉치를 중심으로-)

  • Jung, Han-Min;Sung, Won-Kyung
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.84-89 / 2006
  • Keeping pace with the rapid development of the science & technology domain, domain terms continuously repeat a cycle of creation and extinction. This study defines the term life cycle and analyzes terms extracted from a large corpus from this viewpoint. We chose the 'ETNEWS' corpus, which contains about 17 million Eojeols spanning 12 years, because it makes it easy to inspect transitions in the term life cycle and because it represents the computer, IT, and electrotechnology domains. The study reaches several useful conclusions, including a relation between the specificity and the lifetime of terms. We expect the term life cycle to contribute to analyzing the competition among similar technologies and to deciding which terms should be registered in a general dictionary.
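
A loose sketch of how a term's life cycle might be traced from yearly frequency counts; the threshold and the example counts are invented for illustration and do not reflect the paper's actual procedure.

```python
from typing import Dict, Optional, Tuple

def term_life_cycle(yearly_freq: Dict[int, int],
                    min_freq: int = 5) -> Optional[Tuple[int, int, int]]:
    """Return (birth_year, peak_year, death_year) for a term, or None.

    yearly_freq maps year -> occurrence count in that year's corpus slice.
    A term counts as 'alive' in years where it appears at least min_freq
    times; min_freq is an illustrative threshold.
    """
    alive = {y: f for y, f in yearly_freq.items() if f >= min_freq}
    if not alive:
        return None
    birth, death = min(alive), max(alive)
    peak = max(alive, key=alive.get)
    return birth, peak, death

# Example: a term that appears in the mid-1990s and fades out
print(term_life_cycle({1994: 2, 1995: 12, 1996: 40, 1997: 18, 1998: 3}))
# -> (1995, 1996, 1997)
```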

A Leveling and Similarity Measure using Extended AHP of Fuzzy Term in Information System (정보시스템에서 퍼지용어의 확장된 AHP를 사용한 레벨화와 유사성 측정)

  • Ryu, Kyung-Hyun;Chung, Hwan-Mook
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.2 / pp.212-217 / 2009
  • Rule-based learning and statistics-based learning are representative methods for learning hierarchical relations between domain terms. In this paper, we propose a leveling and similarity measure for fuzzy terms in an information system using an extended AHP. In the proposed method, we extract fuzzy terms from documents, organize them into an ontology structure, and level the priority of the fuzzy terms with the extended AHP according to their specificity. The extended AHP integrates multiple decision-makers' judgments of the weights and relative importance of the fuzzy terms. We then compute the semantic similarity of the fuzzy terms using the min operation on fuzzy sets, Dice's coefficient, and a combined Min+Dice's coefficient method, determine the final alternative fuzzy term, and compare the three similarity measures. The results show that the proposed method yields more definite classification performance than the conventional methods and can be applied in the natural language processing field.
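
A rough sketch of the two similarity ingredients the abstract names (the fuzzy min operation and Dice's coefficient); how the fuzzy membership vectors are built and combined here is an illustrative assumption, not the paper's method.

```python
from typing import Dict, Set

def fuzzy_min_similarity(a: Dict[str, float], b: Dict[str, float]) -> float:
    """Similarity from the fuzzy min operation: sum of min memberships
    divided by sum of max memberships over the shared feature space."""
    keys = set(a) | set(b)
    inter = sum(min(a.get(k, 0.0), b.get(k, 0.0)) for k in keys)
    union = sum(max(a.get(k, 0.0), b.get(k, 0.0)) for k in keys)
    return inter / union if union else 0.0

def dice_coefficient(a: Set[str], b: Set[str]) -> float:
    """Classic Dice's coefficient over two term feature sets."""
    if not a and not b:
        return 0.0
    return 2 * len(a & b) / (len(a) + len(b))

# Example: two fuzzy terms described by weighted context features
t1 = {"network": 0.9, "protocol": 0.6, "security": 0.2}
t2 = {"network": 0.7, "security": 0.8}
print(fuzzy_min_similarity(t1, t2))
print(dice_coefficient(set(t1), set(t2)))
```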

The Effect of Asset Specificity, Information Sharing, and a Collaborative Environment on Supply Chain Management (SCM): An Integrated SCM Performance Formation Model (자산전용성과 협업환경하에서의 정보공유가 공급사슬에 미치는 영향 : 통합적 SCM 성과형성 모델)

  • Kim, Tae-Ryong;Song, Jang-Gwen
    • Journal of Distribution Science / v.11 no.4 / pp.51-60 / 2013
  • Purpose - The objective of this paper is to investigate the effect of asset specificity, the level of information sharing, the importance of information sharing, and an integrated collaborative environment on supply chain performance. Research design, data, and methodology - Data collection was implemented as follows: questionnaires were distributed to 250 companies that have business ties with Halla Climate Control Corporation. The empirical study to test our hypotheses was based on statistical analysis (using SPSS 18.0 and AMOS 18.0). The hypothesis of this paper is that the asset specificity variable has positive effects on the following variables: the level of information sharing, the importance of information sharing, and the integrated collaborative environment. Moreover, the level of information sharing and the importance of information sharing are strongly influenced by the integrated collaborative environment, and these, when combined, have an effect on the dependent variable, supply chain performance. We tested our hypothesized model utilizing path analysis with latent variables. Results - According to the results of our analysis, hypothesis H1, which tests whether there is a relationship between asset specificity and the integrated collaborative environment, is supported at the 0.01 level. Hypotheses H2 and H3 were also confirmed: asset specificity had positive effects (+) on the level of information sharing and on the importance of information sharing, both statistically significant at the 0.01 level. Hypotheses H4 and H5, which posited that the integrated collaborative environment would have a positive effect on the level of information sharing and on the importance of information sharing, were strongly supported statistically with significant p-values. Moreover, the level of information sharing (H6) and the importance of information sharing (H7) also had a statistically significant influence on supply chain performance. As a result, the existence of a collaborative system between companies would influence supply chain performance by strengthening real-time information access and information sharing. Thus, it is important to construct a collaborative environment where information sharing among companies and cooperation are possible. Conclusions - First, with rapid changes in the business environment, it becomes necessary for enterprises to acquire the right information in order to properly implement SCM. For successful SCM, firms should understand the importance of collaboration with supply chain partners and of an internally built collaboration system, which in turn will better promote a partnership commitment with suppliers as well as collaborative integration with buyers. A collaborative system, as we suggest in this paper, facilitates the maintenance of a long-term relationship of trust and can help reinforce information sharing. Second, it is necessary to increase information sharing over time via a collaborative system so that employees of the suppliers become aware of the system. The more proactive and positive the attitudes of the managerial group toward such a collaborative system are, the higher the level of information sharing among the users will be. Successful SCM performance is achieved by information sharing through a collaborative environment rather than by investing only in setting up an information system.

Prediction of Transient Ischemia Using ECG Signals (심전도 신호를 이용한 일시적 허혈 예측)

  • Han-Go Choi;Roger G. Mark
    • Journal of the Institute of Convergence Signal Processing / v.5 no.3 / pp.190-197 / 2004
  • This paper presents automated prediction of transient ischemic episodes using a neural network (NN) based pattern matching method. The learning algorithm used to train the multilayer networks is a modified backpropagation algorithm that updates the parameters of the nonlinear function in each neuron as well as the connecting weights between neurons, in order to improve learning speed. The performance of the method was evaluated using ECG signals from the MIT/BIH long-term database. Experimental results for 15 records (237 ischemic episodes) show that the average sensitivity and specificity of ischemic episode prediction are 85.71% and 71.11%, respectively. The proposed method also predicts real ischemia an average of 45.53 seconds in advance. These results indicate that the NN approach as a pattern matching classifier can be a useful tool for the prediction of transient ischemic episodes.
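
The "modified backpropagation" described above trains the neuron's nonlinear-function parameters along with the weights. The toy NumPy sketch below (a single layer with a trainable sigmoid slope, under a squared-error loss) illustrates that idea; it is not the authors' implementation and the network size is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z, a):
    # Logistic function with a trainable slope parameter a
    return 1.0 / (1.0 + np.exp(-a * z))

n_in, n_out, lr = 4, 1, 0.1
W = rng.normal(scale=0.5, size=(n_out, n_in))  # connecting weights
b = np.zeros(n_out)                            # biases
a = np.ones(n_out)                             # trainable activation slope

x = rng.normal(size=n_in)   # one input pattern (illustrative)
t = np.array([1.0])         # its target

for _ in range(100):
    z = W @ x + b
    y = sigmoid(z, a)
    err = y - t                   # dE/dy for squared error E = 0.5*(y - t)^2
    dy_dz = a * y * (1 - y)       # derivative w.r.t. the pre-activation
    dy_da = z * y * (1 - y)       # derivative w.r.t. the slope parameter
    W -= lr * np.outer(err * dy_dz, x)
    b -= lr * err * dy_dz
    a -= lr * err * dy_da         # the "modified" step: adapt the nonlinearity
```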

Long-Term Wildfire Reconstruction: In Need of Focused and Dedicated Pre-Planning Efforts

  • Harris, William S.;Choi, Jin Ouk;Lim, Jaewon;Lee, Yong-Cheol
    • International conference on construction engineering and project management / 2022.06a / pp.923-928 / 2022
  • Wildfire disasters in the United States impact lives and livelihoods by destroying private homes, businesses, community facilities, and infrastructure. Disaster victims suffer from damaged houses, inadequate shelters, inoperable civil infrastructure, and homelessness, coupled with long-term recovery and reconstruction processes. Cities and their neighboring communities require an enormous commitment for a full recovery for as long as disaster recovery processes last. State, county, and municipal governments inherently have the responsibility to establish and provide governance and public services for the benefit and well-being of community members. Municipal governments' comprehensive and emergency response plans are the artifacts of planning efforts that guide accomplishing those duties. Typically, these plans include preparation for and response to natural disasters, including wildfires. Standard wildfire planning includes and outlines (1) a wildfire hazard assessment, (2) response approaches to prevent human injury and minimize damage to physical property, and (3) near- and long-term recovery and reconstruction efforts. There is often a high level of detail in the assessment section, but the level of detail and specificity lessens significantly to general approaches in the long-term recovery subsection. This paper aims to document the extent of wildfire preparedness at the county level in general, focusing on the long-term recovery subsections of municipal plans. Based on the identified challenges, the researchers provide recommendations for better long-term recovery and reconstruction opportunities: 1) building permit requirements, 2) exploration of the use of modular construction, 3) relief from legislative requirements, and 4) early and simple funding and aid application processes.

A Meta-analysis of the Timed Up and Go test for Predicting Falls (낙상 위험 선별검사 Timed Up and Go test의 예측 타당도 메타분석)

  • Park, Seong-Hi;Lee, On-Seok
    • Quality Improvement in Health Care / v.22 no.2 / pp.27-40 / 2016
  • Purpose: Globally, falls are a major public health problem. This study aimed to evaluate the predictive validity of the Timed Up and Go test (TUGT) as a screening tool for fall risk. Methods: An electronic search was performed in Medline, EMBASE, CINAHL, the Cochrane Library, KoreaMed, the National Digital Science Library, and other databases, using the following keywords: 'fall', 'fall risk assessment', 'fall screening', 'mobility scale', and 'risk assessment tool'. QUADAS-II was applied to assess the internal validity of the diagnostic studies. Thirteen studies were analyzed by meta-analysis with MetaDisc 1.4. Results: The 13 selected studies reporting the predictive validity of the TUGT for fall risk, with a combined sample size of 1004 and high methodological quality, were meta-analyzed. The overall predictive validity of the TUGT was as follows: pooled sensitivity 0.72 (95% confidence interval [CI]: 0.67-0.77), pooled specificity 0.58 (95% CI: 0.54-0.63), and sROC AUC 0.75. Heterogeneity among studies was at a moderate level for sensitivity. Conclusion: The TUGT's predictive validity for fall risk is at a moderate level. Although interpretation of the results is limited by heterogeneity between studies, the TUGT is an appropriate tool to apply to all patients at potential risk of an accidental fall in a hospital or long-term care facility.
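
As a simplified illustration of what pooled sensitivity and specificity mean, the sketch below sums per-study 2x2 counts; this naive pooling ignores between-study variance and is not the MetaDisc procedure, and the study counts are invented.

```python
from typing import List, Tuple

def pooled_sens_spec(tables: List[Tuple[int, int, int, int]]) -> Tuple[float, float]:
    """Pool sensitivity and specificity by summing 2x2 counts.

    Each table is (TP, FN, FP, TN) for one study.
    """
    tp = sum(t[0] for t in tables)
    fn = sum(t[1] for t in tables)
    fp = sum(t[2] for t in tables)
    tn = sum(t[3] for t in tables)
    return tp / (tp + fn), tn / (tn + fp)

# Invented per-study counts, for illustration only
studies = [(30, 10, 20, 40), (25, 12, 15, 48), (40, 14, 30, 36)]
sens, spec = pooled_sens_spec(studies)
print(f"pooled sensitivity={sens:.2f}, specificity={spec:.2f}")
```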

Validation of Instruments to Classify the Frailty of the Elderly in Community (지역사회 거주 노인의 허약선별도구 타당도 평가)

  • Lee, In-Sook;Park, Young-Im;Park, Eun-Ok;Lee, Soon-Hee;Jeong, Ihn-Sook
    • Research in Community and Public Health Nursing / v.22 no.3 / pp.302-314 / 2011
  • Purpose: This study aimed to validate instruments for classifying the frailty of elderly Koreans living in the community. Methods: For this study, 632 elders were selected from community-based elderly houses and home-visiting registries, and data on frailty were collected with three instruments during November 2008. The Korean Frail Scale (KFS) is composed of 10 domains with a maximum score of 20. The Edmonton Frail Scale (EFS) has 10 domains with a maximum score of 17. The 25_Japan Frail Scale (25_JFS) is composed of 6 domains with a maximum score of 25. Internal consistency was measured with Cronbach's α. Sensitivity, specificity, and the area under the ROC curve (AUC) were measured to assess validity, with long-term care insurance grade as the gold standard. Results: Cronbach's α was .72 for the KFS, .55 for the EFS, and .80 for the 25_JFS. Sensitivity, specificity, and AUC were 70.0%, 83.2%, and .83, respectively, at a cutting point of 10.5 for the KFS; 50.0%, 80.9%, and .66, respectively, at 8.5 for the EFS; and 80.0%, 85.9%, and .86, respectively, at 12.5 for the 25_JFS. Conclusion: The KFS and the 25_JFS showed favorable internal consistency and predictive validity. Further longitudinal studies are recommended to confirm predictive validity.
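
A minimal sketch of how sensitivity, specificity, and AUC at a scale cut point can be computed against a binary gold standard; the scores and labels below are synthetic, and the cutoff simply reuses the KFS value quoted above for illustration.

```python
import numpy as np
from typing import Tuple
from sklearn.metrics import roc_auc_score

def sens_spec_at_cutoff(scores: np.ndarray, frail: np.ndarray,
                        cutoff: float) -> Tuple[float, float]:
    """Sensitivity/specificity of 'score >= cutoff' against a binary gold standard."""
    pred = scores >= cutoff
    sens = np.mean(pred[frail == 1])    # true positive rate
    spec = np.mean(~pred[frail == 0])   # true negative rate
    return float(sens), float(spec)

# Invented frailty scores and gold-standard labels, for illustration only
rng = np.random.default_rng(7)
frail = rng.integers(0, 2, size=200)
scores = frail * 4 + rng.normal(10, 3, size=200)   # frail elders tend to score higher
print(sens_spec_at_cutoff(scores, frail, cutoff=10.5))
print("AUC:", roc_auc_score(frail, scores))
```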

Forecasting Short-Term KOSPI using Wavelet Transforms and Fuzzy Neural Network (웨이블릿 변환과 퍼지 신경망을 이용한 단기 KOSPI 예측)

  • Shin, Dong-Kun;Chung, Kyung-Yong
    • The Journal of the Korea Contents Association / v.11 no.6 / pp.1-7 / 2011
  • The methodology of KOSPI forecasting has been considered one of the most difficult to develop accurately, since the short-term KOSPI is correlated with various factors including politics and economics. In this paper, we present a methodology for forecasting short-term trends of the stock price over five days using a feature selection method based on a neural network with weighted fuzzy membership functions (NEWFM). The distributed non-overlap area measurement method selects a minimal number of input features by removing the worst input features one by one. In the first step, technical indicators are selected to preprocess the KOSPI data. In the second step, thirty-nine input features are produced by wavelet transforms. Twelve input features are then selected from the thirty-nine by the non-overlap area distribution measurement method. The proposed method achieves sensitivity, specificity, and accuracy rates of 72.79%, 74.76%, and 73.84%, respectively.
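
A small sketch of producing candidate input features from a price series with a discrete wavelet transform, the preprocessing step the abstract describes; the wavelet family, decomposition level, and summary statistics are illustrative assumptions, and the NEWFM feature-selection step itself is not shown.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_features(prices: np.ndarray, wavelet: str = "haar",
                     level: int = 3) -> np.ndarray:
    """Build candidate input features from a price series via a discrete
    wavelet transform; wavelet family and level are illustrative choices."""
    coeffs = pywt.wavedec(prices, wavelet, level=level)
    feats = []
    for c in coeffs:
        # Summarize each sub-band with simple statistics as candidate features
        feats.extend([c.mean(), c.std(), np.abs(c).max()])
    return np.array(feats)

# Example with a synthetic 64-day closing-price window
rng = np.random.default_rng(3)
window = 100 + np.cumsum(rng.normal(size=64))
print(wavelet_features(window).shape)
```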