• Title/Summary/Keyword: Dichotomy Method

A Study on Stage Classification of Eight Constitution Questionnaire (팔체질 진단을 위한 단계별 설문지 개발 연구)

  • Lee, Joo-Ho;Kim, Min-Yong;Kim, Hee-Ju;Shin, Young-Sup;Oh, Hwan-Sup;Park, Young-Bae;Park, Young-Jae
    • The Journal of the Society of Korean Medicine Diagnostics / v.16 no.2 / pp.59-70 / 2012
  • Objectives: Pulse diagnosis by an expert is the only way to classify the eight constitutions, so questionnaire-based methods to supplement this classification have been developed and refined, and the ECM-32 system was designed in 2010. However, analysis with a decision tree produced many nodes, and 32 important questions were omitted during data processing. This study therefore first classified eight-constitution patients into two groups and then analyzed their characteristics in consecutive order. Methods: The participants were 1,027 patients who had been classified into one of the eight constitutions by pulse diagnosis and answered 251 questionnaire items in 2010. They were divided into a sympathetic-nerve-acceleration constitution group and a parasympathetic-nerve-acceleration constitution group and analyzed with a decision tree. Results: The questionnaire responses were analyzed with four coding methods: an interval scale from 0 to 5; NA/Low (1, 2)/Medium (3)/High (4, 5); the average value; and a Yes/No dichotomy. The average-value coding showed no significance. 1. With the interval-scale coding, six questions with seven nodes (F5e, B1d, F7f, F2a, F1b, C4L) were significant; the accuracy was 92.5%. 2. With the L/M/H coding, seven questions with seven nodes (F5e, B1d, F7f, F1a, B1c, C4L, P3d) were significant; the accuracy was 92.5%. 3. With the Y/N dichotomy, nine questions with nine nodes (F5e, B1d, F7f, F1a, B1c, C4L, B1b, P1i, B2a) were significant; the accuracy was 93.18%. Conclusions: Among the four methods, the Yes/No dichotomy was the most significant and best categorized the patients. Unlike previous studies that used only the interval-scale method, the Y/N dichotomy was more statistically significant as a questionnaire-based supplement to pulse diagnosis. By applying the decision-tree analysis in consecutive order in further studies, patients may be divided into the eight constitutions with higher significance using fewer questions.
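
A minimal sketch (not the authors' code) of the Y/N coding compared in the Results: Likert-scored items are collapsed to a Yes/No dichotomy and used to fit a decision tree with scikit-learn. The number of items, the "score of 3 or more means Yes" threshold, and the randomly generated data and labels are hypothetical stand-ins.

```python
# Hedged sketch: dichotomize Likert-style questionnaire items and fit a
# decision tree, roughly mirroring the Y/N coding compared in the study.
# Item count, threshold, and data below are hypothetical placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1027                                   # sample size taken from the abstract
likert = rng.integers(0, 6, size=(n, 10))  # 10 stand-in items scored 0..5
y = rng.integers(0, 2, size=n)             # placeholder labels: 0 = sympathetic, 1 = parasympathetic

X_yn = (likert >= 3).astype(int)           # Y/N dichotomy: treat scores 3..5 as "Yes"

X_tr, X_te, y_tr, y_te = train_test_split(X_yn, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```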

An Automatic Urban Function District Division Method Based on Big Data Analysis of POI

  • Guo, Hao;Liu, Haiqing;Wang, Shengli;Zhang, Yu
    • Journal of Information Processing Systems / v.17 no.3 / pp.645-657 / 2021
  • Along with the rapid development of the economy, urban areas have expanded rapidly, leading to the formation of different types of urban function districts (UFDs), such as central business, residential, and industrial districts. Recognizing the spatial distribution of these districts is of great significance for managing the evolving role of urban planning and for developing reliable urban planning programs. In this paper, we propose an automatic UFD division method based on big data analysis of point of interest (POI) data. Considering that POI data are unevenly distributed in geographic space, a dichotomy-based data retrieval method is used to improve the efficiency of the data crawling process. Further, a POI spatial feature analysis method based on the mean shift algorithm is proposed, in which data points with similar attributive characteristics are clustered to form function districts. The proposed method was thoroughly tested in an actual urban case scenario, and the results show its superior performance; the suitability of fit to practical situations reaches 88.4%, demonstrating a reasonable UFD division result.
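
The dichotomy-based retrieval step can be illustrated with a short sketch (an assumption-laden illustration, not the paper's implementation): when a bounding-box query would exceed the API's page limit, the box is bisected along its longer side and each half is queried recursively, so dense areas are split more often. `fetch_count`, `fetch_pois`, and `PAGE_LIMIT` are hypothetical placeholders, not the map service actually used in the paper.

```python
# Hedged sketch of dichotomy-based POI crawling: bisect any region whose
# result count exceeds the page limit. fetch_count/fetch_pois stand in for
# a hypothetical map API.
PAGE_LIMIT = 400

def crawl(bbox, fetch_count, fetch_pois):
    """Recursively collect POIs inside bbox = (min_lon, min_lat, max_lon, max_lat)."""
    if fetch_count(bbox) <= PAGE_LIMIT:
        return fetch_pois(bbox)
    min_lon, min_lat, max_lon, max_lat = bbox
    if (max_lon - min_lon) >= (max_lat - min_lat):      # split the longer side in two
        mid = (min_lon + max_lon) / 2
        halves = [(min_lon, min_lat, mid, max_lat), (mid, min_lat, max_lon, max_lat)]
    else:
        mid = (min_lat + max_lat) / 2
        halves = [(min_lon, min_lat, max_lon, mid), (min_lon, mid, max_lon, max_lat)]
    return [poi for half in halves for poi in crawl(half, fetch_count, fetch_pois)]
```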

Ensemble of Nested Dichotomies for Activity Recognition Using Accelerometer Data on Smartphone (Ensemble of Nested Dichotomies 기법을 이용한 스마트폰 가속도 센서 데이터 기반의 동작 인지)

  • Ha, Eu Tteum;Kim, Jeongmin;Ryu, Kwang Ryel
    • Journal of Intelligence and Information Systems / v.19 no.4 / pp.123-132 / 2013
  • As smartphones are equipped with various sensors such as the accelerometer, GPS, gravity sensor, gyroscope, ambient light sensor, proximity sensor, and so on, there have been many research efforts on making use of these sensors to create valuable applications. Human activity recognition is one such application, motivated by various welfare uses such as support for the elderly, measurement of calorie consumption, analysis of lifestyles, analysis of exercise patterns, and so on. One of the challenges faced when using smartphone sensors for activity recognition is that the number of sensors used should be minimized to save battery power. When the number of sensors used is restricted, it is difficult to realize a highly accurate activity recognizer or classifier because it is hard to distinguish between subtly different activities relying on only limited information. The difficulty gets especially severe when the number of different activity classes to be distinguished is very large. In this paper, we show that a fairly accurate classifier can be built that distinguishes ten different activities by using data from only a single sensor, i.e., the smartphone accelerometer. The approach that we take to dealing with this ten-class problem is the ensemble of nested dichotomies (END) method, which transforms a multi-class problem into multiple two-class problems. END builds a committee of binary classifiers in a nested fashion using a binary tree. At the root of the binary tree, the set of all classes is split into two subsets of classes by a binary classifier. At a child node of the tree, a subset of classes is again split into two smaller subsets by another binary classifier. Continuing in this way, we obtain a binary tree in which each leaf node contains a single class. This binary tree can be viewed as a nested dichotomy that can make multi-class predictions. Depending on how a set of classes is split into two subsets at each node, the final tree that we obtain can be different. Since some classes can be correlated, a particular tree may perform better than the others. However, we can hardly identify the best tree without deep domain knowledge. The END method copes with this problem by building multiple dichotomy trees randomly during learning and then combining the predictions made by each tree during classification. The END method is generally known to perform well even when the base learner is unable to model complex decision boundaries. As the base classifier at each node of the dichotomy, we have used another ensemble classifier, the random forest. A random forest is built by repeatedly generating decision trees, each time with a different random subset of features, using a bootstrap sample. By combining bagging with random feature-subset selection, a random forest enjoys the advantage of having more diverse ensemble members than simple bagging. As an overall result, our ensemble of nested dichotomies can be seen as a committee of committees of decision trees that can deal with a multi-class problem with high accuracy. The ten classes of activities that we distinguish in this paper are 'Sitting', 'Standing', 'Walking', 'Running', 'Walking Uphill', 'Walking Downhill', 'Running Uphill', 'Running Downhill', 'Falling', and 'Hobbling'. The features used for classifying these activities include not only the magnitude of the acceleration vector at each time point but also the maximum, minimum, and standard deviation of the vector magnitude within a time window covering the last 2 seconds. For experiments comparing the performance of END with those of other methods, the accelerometer data were collected every 0.1 seconds for 2 minutes for each activity from 5 volunteers. Among the 5,900 ($=5{\times}(60{\times}2-2)/0.1$) data points collected for each activity (the data for the first 2 seconds are discarded because they lack time-window data), 4,700 have been used for training and the rest for testing. Although 'Walking Uphill' is often confused with some other similar activities, END has been found to classify all ten activities with a fairly high accuracy of 98.4%. In comparison, the accuracies achieved by a decision tree, a k-nearest-neighbor classifier, and a one-versus-rest support vector machine were 97.6%, 96.5%, and 97.6%, respectively.
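
The END construction described in the abstract can be sketched compactly (an illustrative reimplementation under assumptions, not the authors' code): each tree randomly splits the remaining class set in two, trains a binary random forest at that node, and recursion continues until each leaf holds a single class; a committee of such trees averages its probability estimates. The sketch assumes numpy arrays and that every class appears in the training data; all names are illustrative.

```python
# Hedged sketch of an ensemble of nested dichotomies (END) with random-forest
# base classifiers. Not the authors' code; hyperparameters are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class NestedDichotomy:
    """One randomly built nested dichotomy; an END averages several of these."""

    def __init__(self, classes, rng):
        self.classes = list(classes)
        if len(self.classes) > 1:
            order = rng.permutation(len(self.classes))
            k = int(rng.integers(1, len(self.classes)))          # random split point
            left = [self.classes[i] for i in order[:k]]
            self.right_classes = [self.classes[i] for i in order[k:]]
            self.clf = RandomForestClassifier(n_estimators=50,
                                              random_state=int(rng.integers(1 << 30)))
            self.left_child = NestedDichotomy(left, rng)
            self.right_child = NestedDichotomy(self.right_classes, rng)

    def fit(self, X, y):
        if len(self.classes) > 1:
            mask = np.isin(y, self.classes)                      # samples relevant to this node
            Xs, ys = X[mask], y[mask]
            side = np.isin(ys, self.right_classes).astype(int)   # 0 = left subset, 1 = right subset
            self.clf.fit(Xs, side)                               # assumes both subsets are present
            self.left_child.fit(Xs, ys)
            self.right_child.fit(Xs, ys)
        return self

    def predict_proba(self, X, class_list):
        proba = np.zeros((len(X), len(class_list)))
        if len(self.classes) == 1:                               # leaf: a single class
            proba[:, class_list.index(self.classes[0])] = 1.0
            return proba
        p_right = self.clf.predict_proba(X)[:, 1]
        proba += (1.0 - p_right)[:, None] * self.left_child.predict_proba(X, class_list)
        proba += p_right[:, None] * self.right_child.predict_proba(X, class_list)
        return proba

def end_predict(X_train, y_train, X_test, n_trees=5, seed=0):
    """Hypothetical usage: average a few random dichotomy trees, then take argmax."""
    rng = np.random.default_rng(seed)
    class_list = sorted(set(y_train))
    trees = [NestedDichotomy(class_list, rng).fit(X_train, y_train) for _ in range(n_trees)]
    avg = sum(t.predict_proba(X_test, class_list) for t in trees) / n_trees
    return [class_list[i] for i in avg.argmax(axis=1)]
```

Because each tree's random class split differs, the committee hedges against any single unlucky dichotomy, which is the motivation the abstract gives for ensembling the trees.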

Dewey's Pragmatic Conception of Value (듀이의 실용주의적 가치 개념)

  • Kook, Soon-ah
    • Journal of Korean Philosophical Society / v.137 / pp.1-31 / 2016
  • The aim of this paper is to set out the significance that Dewey's naturalistic theory of value has today by examining how value arises from experience. This discussion is necessary because logical positivists raised the problem of the fact/value dichotomy and went further to deny the possibility of intelligent discussion of value judgments. In this situation, the task for any discussion of value is to move beyond the fact/value dichotomy and to confer objectivity upon value judgments. Within the stream of analytic philosophy, the significance of Dewey's theory of value is revealed by how Putnam and Johnson receive it. To overcome the dichotomy, Putnam asserts that fact and value are entangled because value arises from criticism through scientific inquiry. Johnson, drawing on research in cognitive science, shows that Dewey's moral deliberation as valuation is wedded to cognition, feeling, and imagination, and that Dewey's theory of value is not relativistic because it rests on shared experience. So, even if no absolute value is given to us, Dewey's theory of value shows how value is made through open inquiry, and it proposes a direction for the theory of value today.

Fabrication and Characterization of Electro-photonic Performance of Nanopatterned Organic Optoelectronics

  • Nil, Ri-Swi;Han, Ji-Yeong;Gwon, Hyeon-Geun;Lee, Gyu-Tae;Go, Du-Hyeon
    • Proceedings of the Korean Vacuum Society Conference / 2014.02a / pp.134.2-134.2 / 2014
  • Photonic crystal solar cells have the potential to address the disparate length scales in polymer photovoltaic materials, thereby confronting the major challenge in solar cell technology: efficiency. One must simultaneously achieve efficient absorption of photons and effective carrier extraction. Unfortunately, the two processes have opposing requirements: efficient absorption of light calls for thicker PV active layers, whereas carrier transport always benefits from thinner ones, and this dichotomy is at the heart of an efficiency/cost conundrum that has kept solar energy expensive relative to fossil fuels. The dichotomy persists over the entire solar spectrum, but increasingly so near a semiconductor's band edge, where absorption is weak. We report a 2-D photonic crystal morphology that enhances the efficiency of organic photovoltaic cells relative to conventional planar cells. The morphology is developed by patterning an organic photoactive bulk heterojunction blend of Poly(3-(2-methyl-2-hexylcarboxylate) thiophene-co-thiophene) and PCBM via PRINT, a nano-embossing method that lends itself to large-area fabrication of nanostructures. The photonic crystal cell morphology increases photocurrents generally, and particularly through the excitation of resonant modes near the band edge of the organic PV material. The device performance of the photonic crystal cell showed nearly double the efficiency of conventional planar cell designs. Photonic crystals can also enhance the performance of other optoelectronic devices, including organic lasers.

Public Vehicle Routing Problem Algorithm (공공차량 경로문제 해법연구)

  • 장병만;박순달
    • Journal of the Korean Operations Research and Management Science Society / v.14 no.2 / pp.53-66 / 1989
  • The Public Vehicle Routing Problem (PVRP) is to find minimum-total-cost routes for M or fewer public vehicles that traverse the required arcs (streets) at least once and return to their starting depot on a directed network. In this paper, first, a mathematical model is formulated as a minimal-cost-flow model with illegal-subtour elimination constraints and with the fixed cost and routing cost as the objective function. Second, an efficient branch-and-bound algorithm is developed to obtain an exact solution. The subproblem in this method is a minimal-cost-flow problem obtained by relaxing the illegal-subtour elimination constraints. The branching strategy is a variable dichotomy on the entering non-required arcs that are candidates to enter an illegal subtour. To accelerate the fathoming process, a tighter lower bound for a candidate subproblem is calculated using the minimum reduced cost of the entering non-required arcs. Computational results on randomly generated networks show that the developed algorithm is efficient.
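
The overall search strategy can be pictured with a small skeleton (a sketch under assumptions, not the paper's algorithm or code): each node solves the relaxed min-cost-flow problem for a lower bound, feasible solutions update the incumbent, and infeasible nodes are branched by a variable dichotomy on one arc, forced in or forced out. `solve_relaxation`, `find_illegal_subtour`, `pick_branching_arc`, and `node.fix_arc` are hypothetical placeholders for the paper's components.

```python
# Hedged branch-and-bound skeleton with variable-dichotomy branching.
# All callables and the subproblem object are hypothetical placeholders.
import heapq

def branch_and_bound(root, solve_relaxation, find_illegal_subtour, pick_branching_arc):
    best_cost, best_sol = float("inf"), None
    heap = [(0.0, 0, root)]                   # (lower bound, tiebreak, subproblem)
    counter = 1
    while heap:
        bound, _, node = heapq.heappop(heap)
        if bound >= best_cost:                # fathom: cannot improve the incumbent
            continue
        cost, flow = solve_relaxation(node)   # min-cost flow with subtour constraints relaxed
        if cost >= best_cost:
            continue
        subtour = find_illegal_subtour(flow)
        if subtour is None:                   # relaxation solution is feasible for the PVRP
            best_cost, best_sol = cost, flow
            continue
        arc = pick_branching_arc(subtour)     # variable dichotomy on one non-required arc
        for child in (node.fix_arc(arc, used=True), node.fix_arc(arc, used=False)):
            heapq.heappush(heap, (cost, counter, child))   # parent's cost is a valid child bound
            counter += 1
    return best_cost, best_sol
```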

Interpreting a Single Antistreptolysin O Test: A Comparison of the 'Upper Limit of Normal' and Likelihood Ratio Methods

  • Gray Gregory C.;Struewing Jeffery P.;Hyams Kenneth C.;Escamilla Joel;Tupponce Alan K.;Kaplan Edward L.
    • Korean Society for Preventive Medicine Conference Proceedings (대한예방의학회 학술대회논문집) / 1994.02b / pp.164-168 / 1994
  • Single serologic tests may occasionally influence clinicians in making diagnoses. The antistreptolysin O (ASO) test is a frequently used tool for detecting recent Streptococcus pyogenes infection and is helpful in the diagnosis of diseases such as rheumatic fever. Using data from a 1989 prospective study of 600 healthy male military recruits, in which 43% experienced S. pyogenes upper respiratory tract infection (a 2-dilution rise in ASO), this report compared two methods of interpreting a single ASO titer. Under the 'upper limit of normal' (80th percentile) method, recruits with an ASO titer greater than 400 were taken as showing evidence of recent S. pyogenes infection. This method had a sensitivity and specificity of only 65.9% and 81.9%, respectively. In contrast to the yes/no dichotomy of the 'upper limit of normal' method, the likelihood ratio method's statistics were ASO-value specific, more consistent with clinical judgment, and better emphasized the caution clinicians must use in interpreting a single ASO test.
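
The arithmetic behind the contrast can be sketched briefly (an illustration, not the study's analysis): the reported sensitivity and specificity of the yes/no cutoff yield a single pair of likelihood ratios, whereas the paper's method assigns a separate likelihood ratio to each ASO value. The pre-test probability below reuses the cohort's 43% attack rate; everything else is standard likelihood-ratio algebra.

```python
# Hedged sketch of likelihood-ratio arithmetic for the dichotomous ULN cutoff.
# Sensitivity, specificity, and the pre-test probability come from the abstract;
# the paper's actual method uses value-specific likelihood ratios instead.
sensitivity, specificity = 0.659, 0.819        # ULN cutoff (titer > 400) performance

lr_positive = sensitivity / (1 - specificity)  # ~3.6: a positive result multiplies the odds by this
lr_negative = (1 - sensitivity) / specificity  # ~0.42: a negative result multiplies the odds by this

pretest_prob = 0.43                            # attack rate reported in the cohort
pretest_odds = pretest_prob / (1 - pretest_prob)
posttest_odds = pretest_odds * lr_positive
posttest_prob = posttest_odds / (1 + posttest_odds)
print(f"LR+={lr_positive:.2f}  LR-={lr_negative:.2f}  post-test P={posttest_prob:.2f}")
```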

A Study on the Expressivity of Covering and Exposing of Architecture Surface after Modern Architecture - Focused on the Tectonic Concept through Semper's Theory "Dressing" - (근·현대 건축표면의 가림과 드러냄의 표현성에 관한 연구 - 젬퍼의 피복론을 통한 텍토닉개념을 중심으로 -)

  • Oh, Sang-Eun
    • Korean Institute of Interior Design Journal / v.23 no.3 / pp.29-38 / 2014
  • This paper analyzes the covering and exposing elements of the architectural surface in the spirit of each era through the meaning of the relationship between structure and symbol (ornament) in Gottfried Semper's theory of dressing. In other words, the purpose is to illuminate how the complementary tectonics of structure and symbol on the architectural surface are expressed according to the biased requirements of the era's paradigm. The advancement of new tectonic methods and new aesthetic tastes is deeply related to reconsidering the dichotomous classification that debates which of structure and symbol (ornament) holds the dominant position. Surface expression representative of an era emerges from the combined interpretation of technology, structure, and the non-physical culture and art of the community and the era.

The Role of Online Product Information in the Relationship between Quality, Preference and Customer's Purchase Intention (품질, 취향 및 소비자 구매 의도 간의 관계에 있어 온라인 상품 정보의 역할)

  • Lee, Jung;Lee, Jae-Nam
    • Journal of Information Technology Services / v.8 no.2 / pp.205-228 / 2009
  • This paper examines how online product information changes customers' purchase intentions from a subjectivity-objectivity dichotomy perspective. Quality and preference are proposed as product evaluation criteria, and their marginal changes under product information differentiation are hypothesized. An experimental survey was conducted with 57 subjects, and the hypotheses were partially supported through a PLS path comparison method. The study contributes to IS research by proposing a simple and effective product evaluation framework and by isolating the impact of product information from other factors. Finally, we suggest utilizing product information while optimizing the cost-benefit structure between information provision and purchase intention.

An algorithm for simulation of cyclic eccentrically-loaded RC columns using fixed rectangular finite elements discretization

  • Sadeghi, Kabir;Nouban, Fatemeh
    • Computers and Concrete / v.23 no.1 / pp.25-36 / 2019
  • In this paper, an algorithm is presented to simulate numerically the reinforced concrete (RC) columns having any geometric form of section, loaded eccentrically along one or two axes. To apply the algorithm, the columns are discretized into two macro-elements (MEs) globally and the critical sections of columns are discretized into fixed rectangular finite elements locally. A proposed triple simultaneous dichotomy convergence method is applied to find the equilibrium state in the critical section of the column considering the three strains at three corners of the critical section as the main characteristic variables. Based on the proposed algorithm a computer program has been developed for simulation of the nonlinear behavior of the eccentrically-loaded columns. A good agreement has been witnessed between the results obtained applying the proposed algorithm and the experimental test results. The simulated results indicate that the ultimate strength and stiffness of the RC columns increase with the increase in axial force value, but large axial loads reduce the ductility of the column, make it brittle, impose great loss of material, and cause early failure.