• Title/Summary/Keyword: analysis of algorithms

Performance Enhancement of Algorithms based on Error Distributions under Impulsive Noise (충격성 잡음하에서 오차 분포에 기반한 알고리듬의 성능향상)

  • Kim, Namyong;Lee, Gyoo-yeong
    • Journal of Internet Computing and Services / v.19 no.3 / pp.49-56 / 2018
  • The Euclidean distance (ED) between the error distribution and the Dirac delta function has been used as an efficient performance criterion in impulsive noise environments, owing to the outlier-cutting effect of the Gaussian kernel applied to the error signal. The gradient of ED for its minimization has two components: $A_k$, built from the kernel function of error pairs, and $B_k$, built from the kernel function of the errors themselves. This paper shows that the first component governs gathering the error samples close together, while the second, $B_k$, drives the concentration of the error samples at zero. Based on this analysis, it is proposed to normalize $A_k$ and $B_k$ by the input power, modified by the kernelled error pairs or kernelled errors respectively, in order to reinforce their roles of narrowing the error gap and drawing the error samples toward zero. These roles and the efficiency of the proposed normalization method are verified by comparing the steady-state MSE fluctuation and the minimum MSE value in simulations of multipath equalization under impulsive noise.
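
To make the two gradient components concrete, below is a minimal Python sketch of an ED-type gradient and the proposed normalized update, assuming a Gaussian kernel, a block of N recent errors, and hypothetical kernel-weighted input powers `P_A` and `P_B`; the exact weighting used in the paper may differ.

```python
import numpy as np

def gaussian_kernel(e, sigma):
    """Gaussian kernel; its fast decay is what cuts impulsive outliers."""
    return np.exp(-e ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def ed_gradient(errors, inputs, sigma):
    """A_k gathers error samples together (kernelled error *pairs*);
    B_k concentrates error samples on zero (kernelled single errors)."""
    N = len(errors)
    A = np.zeros_like(inputs[0], dtype=float)
    B = np.zeros_like(inputs[0], dtype=float)
    for i in range(N):
        for j in range(N):
            de = errors[i] - errors[j]
            A += gaussian_kernel(de, sigma) * de * (inputs[j] - inputs[i])
        B += gaussian_kernel(errors[i], sigma) * errors[i] * inputs[i]
    return A / (sigma ** 2 * N ** 2), B / (sigma ** 2 * N)

def normalized_update(w, errors, inputs, sigma, mu, eps=1e-8):
    """Sketch of the proposed idea: normalize each component by an input
    power weighted by the corresponding kernelled errors or error pairs."""
    A, B = ed_gradient(errors, inputs, sigma)
    N = len(errors)
    P_A = sum(gaussian_kernel(errors[i] - errors[j], sigma) * inputs[i] @ inputs[i]
              for i in range(N) for j in range(N)) / N ** 2
    P_B = sum(gaussian_kernel(e, sigma) * x @ x
              for e, x in zip(errors, inputs)) / N
    return w - mu * (A / (P_A + eps) + B / (P_B + eps))
```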

Sensitivity Identification Method for New Words of Social Media based on Naive Bayes Classification (나이브 베이즈 기반 소셜 미디어 상의 신조어 감성 판별 기법)

  • Kim, Jeong In;Park, Sang Jin;Kim, Hyoung Ju;Choi, Jun Ho;Kim, Han Il;Kim, Pan Koo
    • Smart Media Journal / v.9 no.1 / pp.51-59 / 2020
  • From the PC-communication era to the development of the internet, new terms have been coined on social media; with the spread of smartphones a social media culture has formed, and newly coined words have become part of that culture. With social networking sites and smartphones serving as a bridge, the amount of data has increased in real time. Using new words has several advantages, including shorter sentences that work around the character limits of various messengers and reduce data volume. However, new words have no dictionary meaning, which limits and degrades algorithms such as data mining. Therefore, in this paper, the opinion of a document is determined by collecting data through web crawling, extracting the new words contained in the text, and building a sentiment classification. The experiment proceeds in three stages. First, new words collected from social media are trained as positive or negative. Next, to derive and verify sentiment values using standard-language documents, TF-IDF is used to score noun sensibilities and assign sentiment values to the data. As with the new words, the classified sentiment values are applied to verify that sentiment is correctly classified in standard-language documents. Finally, the newly coined words and the standard-language sentiment values are combined for a comparative analysis of the proposed technique.
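
As a rough illustration of the score-then-classify pipeline described above, the sketch below feeds TF-IDF features into a multinomial Naive Bayes classifier with scikit-learn; the toy four-document corpus and binary positive/negative labels are placeholders for the crawled social-media data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# toy corpus standing in for crawled posts; 1 = positive, 0 = negative
docs = ["꿀잼 really fun", "노잼 so boring", "fun and great day", "boring and bad"]
labels = [1, 0, 1, 0]

# TF-IDF weighting feeds a multinomial Naive Bayes classifier
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, labels)
print(model.predict(["really fun day"]))  # expected: [1]
```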

Drug Use Evaluation of Antihypertensive Agents by JNC VI Guidelines (고혈압 치료 지침 VI에 의한 항고혈압제의 사용평가)

  • Kim, Kyung Hwa;Lee, Suk Hyang
    • Korean Journal of Clinical Pharmacy / v.12 no.1 / pp.29-38 / 2002
  • Hypertension is an important public health problem because it increases the risk of stroke, angina, myocardial infarction, heart failure, and end-stage renal disease. If it is not actively treated, morbidity and mortality from hypertension-induced complications increase and quality of life decreases. This study evaluated the use of antihypertensive drugs and blood pressure changes, and compared the algorithms chosen for the 1st- and 2nd-line therapy of hypertension against the JNC VI recommendations. The medical charts of 222 patients with essential hypertension at St. Vincent's Hospital in Suwon from January 1997 to January 2000 were reviewed retrospectively. Data collection and analysis included baseline BP, underlying diseases and complications, administered antihypertensives, BP changes, changes of antihypertensive regimen, and adverse effects of treatment. The higher the patients' BP, the more frequently they had target-organ damage and clinical cardiovascular disease. The mean duration to reduce blood pressure below 140/90 mmHg was 8 weeks in 85.3% of the patients, and the rate of BP control was 82.4% at 6 months. The major antihypertensive drugs prescribed as 1st-line monotherapy were calcium channel blockers (61.8%), ACE inhibitors (19.1%), β-blockers (13.7%), and diuretics (5.3%). The 1st-line therapy was monotherapy in 59% of cases and combination therapy in 41%. The blood pressure change was significantly greater for combination therapy than for monotherapy (-26.2±21.4 vs. -18.56±16.7 mmHg for systolic blood pressure, p<0.003; -16.9±13.2 vs. -9.2±12.8 mmHg for diastolic blood pressure, p<0.001). When blood pressure was not completely controlled with the first antihypertensive selected, the 2nd-line therapy had 4 options: addition of a 2nd agent from a different class (66.2%), substitution with another drug (21.9%), dose increase (11.9%), or continuation of the first regimen (27.9%). Calcium channel blockers were the most frequently prescribed agents, which is not consistent with the JNC VI guideline recommending diuretics and β-blockers as 1st-line therapy. Most patients achieved the goal BP and maintained it for 6 months, but the remaining patients should be controlled more tightly through a combination of lifestyle modification, patient education, and pharmacotherapy.

Out-of-Plane Buckling Analysis of Curved Beams Considering Rotatory Inertia Using DQM (미분구적법(DQM)을 이용 회전관성을 고려한 곡선 보의 외평면 좌굴해석)

  • Kang, Ki-jun
    • Journal of the Korea Academia-Industrial cooperation Society / v.17 no.10 / pp.300-309 / 2016
  • Curved beams are increasingly used in buildings, vehicles, ships, and aircraft, which has resulted in considerable effort toward developing an accurate method for analyzing the dynamic behavior of such structures. The stability behavior of elastic curved beams has been the subject of many investigations. Solutions to the relevant differential equations have traditionally been obtained by the standard finite difference or finite element methods. However, these techniques require a great deal of computer time for the large number of discrete nodes demanded by complex geometry and loading conditions. One efficient procedure for the solution of partial differential equations is the differential quadrature method (DQM). This method has been applied in many cases to overcome the difficulty of complex algorithms and the high storage requirements of complex geometry and loading conditions. Here, the out-of-plane buckling of curved beams with rotatory inertia was analyzed using DQM under uniformly distributed radial loads. Critical loads were calculated for members with various parameter ratios, boundary conditions, and opening angles. The results were compared with exact results from other methods for the available cases. The DQM uses only a limited number of grid points and shows very good agreement with the exact results (less than 0.3% error). New results for diverse parameter variations are also presented; these parameters play important roles in the buckling behavior of curved beams, and the results can be used for comparison with other numerical solutions or with experimental test data.
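
The core of DQM is replacing the derivative at a grid point by a weighted sum of the function values at all grid points, which is why so few points suffice. Below is a minimal sketch of the standard first-derivative weighting matrix obtained from Lagrange interpolation; for the buckling problem, such matrices are assembled into a generalized eigenvalue problem for the critical load.

```python
import numpy as np

def dqm_weights(x):
    """First-derivative DQM weighting matrix A: f'(x_i) ~ sum_j A[i,j] f(x_j).
    Off-diagonal: A[i,j] = M(x_i) / ((x_i - x_j) * M(x_j)), where
    M(x_i) = prod_{k != i} (x_i - x_k); diagonal: A[i,i] = -sum_{j != i} A[i,j]."""
    n = len(x)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        A[i, i] = -A[i].sum()
    return A

# sanity check: the weights differentiate f(x) = x**3 exactly on 7 points
x = np.cos(np.pi * np.arange(7) / 6)        # Chebyshev-Gauss-Lobatto grid
A = dqm_weights(x)
print(np.allclose(A @ x ** 3, 3 * x ** 2))  # True
```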

The Accuracy Analysis of Methods to solve the Geodetic Inverse Problem (측지 역 문제 해석기법의 정확도 분석)

  • Lee, Yong-Chang
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.29 no.4 / pp.329-341 / 2011
  • The object of this paper is to compare the accuracy and characteristics of various methods for solving the geodetic inverse problem, for geodesic lines in the standard case and in special cases (antipodal, near-antipodal, equatorial, and near-equatorial) on the WGS84 reference ellipsoid. To this end, various algorithms (classical and recent solutions) for the geodetic inverse problem are examined and programmed in order to evaluate each method's ability to determine geodesics precisely. The main outputs of the geodetic inverse problem, the distance and the forward azimuths between two points on the sphere (or ellipsoid), are determined by 18 different geodetic inverse solutions. The results from the other 17 methods, in both the standard and special cases, are then compared with those from the Karney method as a reference. In the standard case, for geodesics not exceeding 100 km, all of the methods perform essentially as well as the Karney method, whereas for geodesics longer than 4,000 km only two methods (Vincenty and Pittman) show ability similar to the Karney method. In the special cases, all methods except the modified Vincenty method proved unsuitable for solving the geodetic inverse problem in the comparison with the Karney method. Therefore, the algorithm of each method needs to be modified and compensated by examining the various behaviors of geodesics in these special regions.
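
For reference, Karney's algorithm, used above as the benchmark, is available directly through his geographiclib package; a minimal usage example on WGS84 follows (the Seoul-Sydney endpoints are arbitrary).

```python
# pip install geographiclib
from geographiclib.geodesic import Geodesic

# Karney's solution of the geodetic inverse problem on the WGS84 ellipsoid
g = Geodesic.WGS84.Inverse(37.5665, 126.9780, -33.8688, 151.2093)
print(g["s12"])   # geodesic distance between the points, metres
print(g["azi1"])  # forward azimuth at the first point, degrees
print(g["azi2"])  # forward azimuth at the second point, degrees
```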

The Development and Application of Biotop Value Assessment Tool(B-VAT) Based on GIS to Measure Landscape Value of Biotop (GIS 기반 비오톱 경관가치 평가도구(B-VAT)의 개발 및 적용)

  • Cho, Hyun-Ju;Ra, Jung-Hwa;Kwon, Oh-Sung
    • Journal of Korean Society of Rural Planning / v.18 no.4 / pp.13-26 / 2012
  • The purpose of this study is to select a study area, which will be developed into Daegu Science Park as a national industrial complex, to assess its landscape value based on a biotop classification with polygons of different forms, and to develop and computerize a GIS-based Biotop Value Assessment Tool (B-VAT). The results are as follows. First, based on an analysis of preliminary data, a field study, and a literature review, a total of 13 biotop groups, such as forest biotops, and a total of 63 biotop types were classified. Second, building on previous research on landscape-value assessment models for biotops, we developed the biotop value assessment tool using the Visual Basic programming language on ArcGIS. The first application of B-VAT assigned 19 types, including riverside forest (BE), to the first grade; 12 types, including artificial plantation (ED), to the second grade; and 12, 2, and 18 types to the third, fourth, and fifth grades, respectively. Based on these results, a second evaluation delineated 31 areas of special meaning for landscape conservation (1a, 1b) and 34 areas of meaning for landscape conservation (2a, 2b, 2c). The biotop type classification and landscape value evaluation suggested in this study will help scientifically establish the landscape value of a target site before reckless development is undertaken, and will provide important preliminary data for restoring landscapes damaged by development and for landscape planning in the future. In particular, we expect the GIS-based B-VAT to help overcome the limited applicability of current value evaluation models, which rest on complicated algorithms, and to contribute greatly to convenience and wider adoption, while saving time and improving accuracy compared with hand counting. However, this study was limited to the aesthetic-visual part of biotop assessment; future research should conduct a comprehensive assessment that also covers conservation and recreation.

A Study on Improvement of Image Quality Decrease due to Tooth Restoration in Facial CT (안면부 CT 검사 시 치아 충전물에 의한 화질 저하 개선 방안에 관한 연구)

  • Kim, Hyeon ju;Yoon, Joon
    • Journal of the Korean Society of Radiology / v.12 no.4 / pp.497-503 / 2018
  • The purpose of this study was to investigate, by quantitative and qualitative analysis, the degree of image degradation caused by the density difference between dental filling material and the surrounding anatomical structures during facial CT examination, and ways to improve image quality. The teeth were scanned using a 64-MDCT scanner (Discovery 750 HD, GE Healthcare, Milwaukee, USA) and the images were compared according to tube voltage, silicone application, and MAR (metal artifact reduction) application. As a result, the CT value decreased by 10.36% at 140 kVp and by 5.81% with the silicone material applied. In the qualitative evaluation, 7 of the 10 observers and 3 of the acceptors favored applying the MAR algorithm. Therefore, by adjusting the scan parameters currently used in clinical practice and applying the various algorithms that can reduce high-density artifacts, it is possible to reduce both the unnecessary radiation exposure burden and the loss of image data caused by high-density artifacts, and more image information can be expected.

Development of Decision Support System for the Design of Steel Frame Structure (강 프레임 구조물 설계를 위한 의사 결정 지원 시스템의 개발)

  • Choi, Byoung Han
    • Journal of Korean Society of Steel Construction / v.19 no.1 / pp.29-41 / 2007
  • Structural design, like other complex decision problems, involves many trade-offs among competing criteria. Although mathematical programming models are becoming increasingly realistic, they often have design limitations; that is, there are often relevant issues that cannot easily be captured in a model. From an understanding of these limitations, a decision-support system was developed that can generate useful alternatives as well as a single optimum value in the optimization of steel frame structures. The alternatives produced by this system are "good" with respect to the modeled objectives, yet "different", and often better, with respect to interesting objectives not present in the model. In this study, we created a decision-support system for designing the most cost-effective moment-resisting steel frame structures that resist lateral loads without compromising overall stability. The proposed approach considers the cost of steel products and the cost of connections within the design process. The system makes use of an optimization formulation, modified to generate alternatives around the optimum value that results from the trade-off between the number of moment connections and total cost. This trade-off was achieved by reducing the number of moment connections and rearranging them, combining analysis based on the LRFD code with an optimization scheme based on genetic algorithms. To evaluate the usefulness of the system, the alternatives were examined with respect to various design aspects.
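
Below is a compact sketch of the kind of genetic-algorithm loop described above, with a binary chromosome marking which joints receive moment connections; the fitness function is a hypothetical placeholder for the LRFD-based analysis, not the paper's actual cost model.

```python
import random

N_JOINTS = 20          # hypothetical frame with 20 beam-column joints
CONN_COST = 1.0        # relative cost of one moment connection

def fitness(chromo):
    """Hypothetical objective: connection cost plus a drift penalty that
    grows as moment connections are removed (stands in for LRFD checks)."""
    n_moment = sum(chromo)
    drift_penalty = 50.0 / (n_moment + 1)   # placeholder for analysis results
    return n_moment * CONN_COST + drift_penalty

def evolve(pop_size=40, generations=100, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(N_JOINTS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                          # minimization
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_JOINTS)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (random.random() < p_mut) for g in child]  # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))   # few connections, but not so few that drift blows up
```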

Bargaining Game using Artificial agent based on Evolution Computation (진화계산 기반 인공에이전트를 이용한 교섭게임)

  • Seong, Myoung-Ho;Lee, Sang-Yong
    • Journal of Digital Convergence / v.14 no.8 / pp.293-303 / 2016
  • The analysis of bargaining games using evolutionary computation has dealt with important issues in game theory in recent years. In this paper, we investigate the interaction and coevolution process among heterogeneous artificial agents using evolutionary computation (EC) in the bargaining game. We present three kinds of strategy-evolving agents participating in bargaining games, based on genetic algorithms (GA), particle swarm optimization (PSO), and differential evolution (DE). The co-evolutionary processes among the three kinds of artificial agents (GA-agent, PSO-agent, and DE-agent) are tested to observe which EC-agent performs best in the bargaining game. The simulation results show that the PSO-agent outperforms the GA-agent and the DE-agent, and that the GA-agent outperforms the DE-agent, with respect to co-evolution in the bargaining game. To understand why the PSO-agent is the best of the three, we examined the agents' strategies after the games were completed. The results indicate that the PSO-agent evolves toward a strategy that gains as much as possible at the risk of gaining nothing if the transaction fails, while the GA-agent and the DE-agent evolve toward strategies that complete the transaction regardless of the amount gained.
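
The observed difference in evolved strategies can be illustrated with a toy ultimatum-style bargaining round; the numeric strategy profiles below are hypothetical, chosen only to mirror the reported behavior (PSO: demand much and risk failure; GA/DE: always close the deal).

```python
def payoff(proposer, responder, pie=1.0):
    """One round of a simple ultimatum-style bargaining game: the proposer
    keeps `demand`; the responder accepts only if its share meets
    `threshold`. Failure leaves both with nothing, which is the risk
    the PSO agents were observed to accept."""
    demand, threshold = proposer["demand"], responder["threshold"]
    if pie - demand >= threshold:
        return demand, pie - demand      # deal closes
    return 0.0, 0.0                      # deal fails

# two hypothetical evolved strategy profiles
aggressive = {"demand": 0.8, "threshold": 0.4}   # PSO-like: high demand
cautious   = {"demand": 0.5, "threshold": 0.1}   # GA/DE-like: close the deal

print(payoff(aggressive, cautious))      # (0.8, 0.2): deal closes
print(payoff(aggressive, aggressive))    # (0.0, 0.0): deal fails
```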

A Study on Prediction of EPB shield TBM Advance Rate using Machine Learning Technique and TBM Construction Information (머신러닝 기법과 TBM 시공정보를 활용한 토압식 쉴드TBM 굴진율 예측 연구)

  • Kang, Tae-Ho;Choi, Soon-Wook;Lee, Chulho;Chang, Soo-Ho
    • Tunnel and Underground Space / v.30 no.6 / pp.540-550 / 2020
  • Machine learning has been actively used in the field of automation thanks to the development and establishment of AI technology. The important point in utilizing machine learning is that the appropriate algorithm depends on the data characteristics, so the dataset must be analyzed before applying machine learning techniques. In this study, the advance rate is predicted using geotechnical and machine data from a TBM tunnel section passing through soil ground below a stream. Although there was no problem applying statistical techniques in the linear regression model, its coefficient of determination was only 0.76, while the ensemble model and the support vector machine achieved a predictive performance of 0.88 or higher, indicating that the model best suited to predicting the advance rate of the EPB shield TBM on the analyzed dataset was the support vector machine. In conclusion, a prediction model using data that combines machine data and ground information is judged to be highly suitable. In addition, further research is needed to increase the diversity of ground conditions and the amount of data.
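
A minimal scikit-learn sketch of the model comparison described above (linear regression vs. an ensemble vs. a support vector machine, scored by the coefficient of determination); the synthetic features stand in for the TBM machine data and ground information, which are not public.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# synthetic stand-in for the machine + ground dataset (thrust, torque, ...)
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))
y = X @ rng.normal(size=6) + 0.3 * np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)

# fit each model and report R^2 on a held-out split
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(), RandomForestRegressor(random_state=0), SVR()):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, round(r2_score(y_te, model.predict(X_te)), 3))
```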