• Title/Summary/Keyword: Experimental and Calculation Analysis Method

Search Result 422

A New SPW Scheme for PAPR Reduction in OFDM Systems by Using Genetic Algorithm (유전자 알고리즘을 적용한 SPW에 의한 새로운 OFDM 시스템 PAPR 감소 기법)

  • Kim Sung-Soo;Kim Myoung-Je;Kee Jong-Hae
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.16 no.11 s.102
    • /
    • pp.1131-1137
    • /
    • 2005
  • An orthogonal frequency division multiplexing (OFDM) system suffers from a high peak-to-average power ratio (PAPR) caused by the superposition of many sub-carriers. To improve PAPR performance, this paper proposes a new genetic sub-block phase weighting (GA-SPW) scheme based on the SPW technique. Selective mapping (SLM), partial transmit sequence (PTS), and the previously proposed SPW all become more effective as the number of sub-blocks and phase elements increases. However, all of them limit the usable number of sub-blocks in practice, because the phase search grows exponentially with that number. The proposed GA-SPW therefore reduces the amount of calculation by using a genetic algorithm (GA): the number of calculations in the iterative phase search depends on the population size and the number of generations rather than on the number of sub-blocks and phase elements. Experimental results and analysis demonstrate the superiority of the proposed method.
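The iterative phase search described above can be illustrated with a toy sketch; this is not the authors' GA-SPW implementation, and all sizes, rates, and the naive IDFT are illustrative assumptions. Binary phase factors, one per sub-block, are evolved by a small genetic algorithm to minimize the PAPR of the time-domain signal, so the search cost scales with population size times generations rather than with the 2^M exhaustive search.

```python
import cmath
import random

def idft(X):
    """Naive inverse DFT of a frequency-domain OFDM symbol."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def papr(x):
    """Peak-to-average power ratio of a time-domain signal."""
    powers = [abs(v) ** 2 for v in x]
    return max(powers) / (sum(powers) / len(powers))

def apply_phases(X, phases):
    """Multiply each of len(phases) equal sub-blocks of X by its phase factor."""
    L = len(X) // len(phases)
    return [X[k] * phases[k // L] for k in range(len(X))]

def ga_spw(X, M=4, pop=8, gens=10, pmut=0.1, seed=0):
    """Search +/-1 phase factors for M sub-blocks with a tiny GA."""
    rng = random.Random(seed)
    fitness = lambda ph: papr(idft(apply_phases(X, ph)))
    # seed the population with the identity phasing so we never do worse than X
    population = [[1] * M] + [[rng.choice([1, -1]) for _ in range(M)]
                              for _ in range(pop - 1)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[:pop // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, M)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < pmut:              # point mutation
                i = rng.randrange(M)
                child[i] = -child[i]
            children.append(child)
        population = parents + children
    best = min(population, key=fitness)
    return best, fitness(best)
```

Because the top half of each generation is carried over unchanged, the best PAPR found is monotone non-increasing over generations.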

Vibration Analysis of Combined Deck Structure-Car System of Car Carriers (자동차운반선(自動車運搬船)의 갑판-차량(甲板-車輛) 연성계(聯成系)의 진동해석(振動解析))

  • S.Y.,Han;K.C.,Kim
    • Bulletin of the Society of Naval Architects of Korea
    • /
    • v.27 no.2
    • /
    • pp.63-77
    • /
    • 1990
  • The combined deck structure-car system of a car carrier is especially sensitive to hull girder vibrations caused by mechanical excitations and wave loads. For free and forced vibration analysis of the system, analytical methods based on the receptance method, together with two schemes for their efficient application, are presented. The methods are especially suited to dynamic reanalysis of the system after design modification, or to dynamic optimization. The deck-car system is modelled as a combined system consisting of a stiffened plate representing the deck primary structure, and attached subsystems such as pillars, additional stiffeners, and damped spring-mass systems representing cars and trucks. For response calculations of the system subjected to displacement excitations along its boundaries, a support displacement transfer ratio, conceptually similar to the receptance, is introduced. Numerical and experimental investigations are carried out to verify the accuracy and computational efficiency of the proposed methods.
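The receptance-coupling idea behind the analysis can be shown on the simplest possible case, an illustrative sketch rather than the paper's stiffened-plate model: for two subsystems rigidly joined at one coordinate, the coupled natural frequencies satisfy α(ω) + β(ω) = 0, where α and β are the point receptances of each subsystem at the junction.

```python
def receptance_grounded(k, m):
    # point receptance of a mass m on a spring of stiffness k to ground
    return lambda w: 1.0 / (k - m * w * w)

def receptance_free_mass(m):
    # point receptance of a free rigid mass (its own stiffness ignored)
    return lambda w: -1.0 / (m * w * w)

def coupled_frequency(alpha, beta, w_lo, w_hi, tol=1e-10):
    # frequency equation of two subsystems rigidly joined at one coordinate:
    # alpha(w) + beta(w) = 0, solved by bisection inside a sign-changing bracket
    f = lambda w: alpha(w) + beta(w)
    for _ in range(200):
        w_mid = 0.5 * (w_lo + w_hi)
        if f(w_lo) * f(w_mid) <= 0.0:
            w_hi = w_mid
        else:
            w_lo = w_mid
        if w_hi - w_lo < tol:
            break
    return 0.5 * (w_lo + w_hi)
```

For a grounded spring-mass joined to a free mass (a crude stand-in for a car on a deck), the bisection root reproduces the analytical result ω² = k/(m + m_b).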


A Variant of Improved Robust Fuzzy PCA (잡음 민감성이 개선된 변형 퍼지 주성분 분석 기법)

  • Kim, Seong-Hoon;Heo, Gyeong-Yong;Woo, Young-Woon
    • Journal of the Korea Society of Computer and Information
    • /
    • v.16 no.2
    • /
    • pp.25-31
    • /
    • 2011
  • Principal component analysis (PCA) is a well-known method for dimensionality reduction and feature extraction. Although PCA has been applied successfully in many areas, it is sensitive to outliers because it minimizes the sum of squared errors. Several variants of PCA have been proposed to resolve this noise sensitivity, and among them, improved robust fuzzy PCA (RF-PCA2) has demonstrated promising results. RF-PCA2, however, can still fall into a local optimum because it assigns equal initial membership values to all data points, and because it remains based on the sum of squared errors even though fuzzy memberships are incorporated. In this paper, a variant of RF-PCA2 called RF-PCA3 is proposed. The proposed algorithm starts from the objective function of RF-PCA2, augments it with the objective function of PCA, and calculates initial memberships from the data distribution, which gives RF-PCA3 a better chance of converging on a good solution than RF-PCA2. Experimental results demonstrate that RF-PCA3 outperforms RF-PCA2.
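The role fuzzy memberships play in robust PCA variants can be sketched as follows. This is a generic membership-weighted PCA on 2-D data (power iteration on the weighted covariance matrix), not the RF-PCA2/RF-PCA3 objective itself: down-weighting a suspected outlier pulls the principal direction back toward the clean data.

```python
def weighted_pca_direction(data, memberships, iters=200):
    """First principal direction of membership-weighted 2-D data.
    A sketch of the fuzzy-weighting idea only, not the RF-PCA3 objective."""
    wsum = sum(memberships)
    cx = sum(w * x for w, (x, y) in zip(memberships, data)) / wsum
    cy = sum(w * y for w, (x, y) in zip(memberships, data)) / wsum
    # membership-weighted covariance matrix entries
    sxx = sum(w * (x - cx) ** 2 for w, (x, y) in zip(memberships, data)) / wsum
    syy = sum(w * (y - cy) ** 2 for w, (x, y) in zip(memberships, data)) / wsum
    sxy = sum(w * (x - cx) * (y - cy) for w, (x, y) in zip(memberships, data)) / wsum
    # power iteration on the 2x2 covariance matrix
    vx, vy = 1.0, 0.0
    for _ in range(iters):
        ux, uy = sxx * vx + sxy * vy, sxy * vx + syy * vy
        norm = (ux * ux + uy * uy) ** 0.5
        vx, vy = ux / norm, uy / norm
    return vx, vy
```

On points lying along y = x plus one gross outlier, near-zero membership for the outlier recovers a direction close to (1, 1)/√2, while equal memberships let the outlier dominate.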

Binding Mode Analysis of Bacillus subtilis Obg with Ribosomal Protein L13 through Computational Docking Study

  • Lee, Yu-No;Bang, Woo-Young;Kim, Song-Mi;Lazar, Prettina;Bahk, Jeong-Dong;Lee, Keun-Woo
    • Interdisciplinary Bio Central
    • /
    • v.1 no.1
    • /
    • pp.3.1-3.6
    • /
    • 2009
  • Introduction: GTPases known as translation factors play a vital role as ribosomal subunit assembly chaperones. The bacterial Obg proteins (Spo0B-associated GTP-binding proteins) belong to the subfamily of P-loop GTPases and are now considered a new target for antibacterial drugs. The majority of bacterial Obgs have been found to associate with the ribosome, implying that these proteins may play a fundamental role in ribosome assembly or maturation. In addition, experimental evidence suggests that the Bacillus subtilis Obg (BsObg) protein binds to the L13 ribosomal protein (BsL13), known to be one of the early assembly proteins of the 50S ribosomal subunit in Escherichia coli. To investigate the binding mode between BsObg and BsL13, a protein-protein docking simulation was carried out after generating a 3D structure of BsL13 by homology modeling. Materials and Methods: The homology model of BsL13 was generated using the EcL13 crystal structure as a template. Protein-protein docking of BsObg with BsL13 was performed with DOT, a macromolecular docking program, to predict a reasonable binding mode. A solvated energy minimization of the docked conformation was then carried out to refine the structure. Results and Discussion: The probable binding conformation of BsL13 with the activated Obg fold of BsObg was predicted by the computational docking study, and the final structure was obtained from the solvated energy minimization. From the analysis, three important H-bond interactions between the Obg fold and L13 were detected: Obg:Tyr27-L13:Glu32, Obg:Asn76-L13:Glu139, and Obg:Ala136-L13:Glu142. The interface between the BsObg and BsL13 structures was also analyzed by electrostatic potential calculations, from which the key residues for hydrogen bonding and hydrophobic interaction between the two proteins were predicted. Conclusion and Prospects: In this study, we focused on the binding mode of the BsObg protein with the ribosomal BsL13 protein. The interaction between the activated Obg and its target protein was investigated with protein-protein docking calculations. The predicted binding pattern can serve as a basis for structure-based design of novel antibacterial drugs.

A Study on the Basic Investigation for the Fire Risk Assessment of Education Facilities (교육시설 화재위험성 평가를 위한 기초조사에 관한 연구)

  • Lee, Sung-Il;Ham, Eun-Gu
    • Journal of the Society of Disaster Information
    • /
    • v.17 no.2
    • /
    • pp.351-364
    • /
    • 2021
  • Purpose: A fire load analysis was conducted to secure basic data for evaluating the fire risk of educational facilities. To calculate the fire load through a preliminary survey, basic data related to the fire load of school facilities were collected. Method: The basic data comprised the definition and types of fire loads and heat-of-combustion data for calculating them. The fire load was evaluated by multiplying the heat of combustion by the weight of the combustibles in each compartment. Result: Among the fixed combustibles of A elementary school, the floors of the classrooms, music room, and school office were mainly wood, chosen for comfort and safety, while the remaining compartments had stone floors. The ceilings and walls were gypsum board and concrete and thus noncombustible. The typical movable combustibles were desks, chairs, and lockers in the classrooms; laboratory equipment boxes and experimental tool boxes in the science room; and books, bookshelves, and reading equipment in the library. Conclusion: The fire loads of A elementary school, according to the combustibles loaded, were, in descending order: library, computer room, English learning room, teachers' office, general classroom, science hall, and music room.
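The Method section's calculation, fire load as the sum of combustible weight times heat of combustion, is a one-liner. The masses and heat-of-combustion values below are illustrative assumptions (e.g. wood at roughly 18 MJ/kg), not the survey's measured data.

```python
def fire_load(items):
    """Total fire load Q [MJ]: sum of mass [kg] x heat of combustion [MJ/kg].
    Heat-of-combustion values passed in are the caller's assumptions."""
    return sum(mass * heat for mass, heat in items)

def fire_load_density(items, floor_area_m2):
    """Fire load density [MJ/m^2] over the compartment floor area."""
    return fire_load(items) / floor_area_m2
```

For example, 100 kg of wood at 18 MJ/kg plus 50 kg of plastics at 20 MJ/kg in a 70 m² classroom gives 2800 MJ, or 40 MJ/m².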

A Ranking Algorithm for Semantic Web Resources: A Class-oriented Approach (시맨틱 웹 자원의 랭킹을 위한 알고리즘: 클래스중심 접근방법)

  • Rho, Sang-Kyu;Park, Hyun-Jung;Park, Jin-Soo
    • Asia pacific journal of information systems
    • /
    • v.17 no.4
    • /
    • pp.31-59
    • /
    • 2007
  • We frequently use search engines to find relevant information on the Web but still end up with too much of it. To solve this problem of information overload, ranking algorithms have been applied in various domains. As more information becomes available, ranking search results effectively and efficiently will become even more critical. In this paper, we propose a ranking algorithm for Semantic Web resources, specifically RDF resources. Traditionally, the importance of a particular Web page is estimated from the number of keywords found in the page, which is easy to manipulate. In contrast, link analysis methods such as Google's PageRank capitalize on the information inherent in the link structure of the Web graph. PageRank considers a page highly important if many other pages refer to it, and its importance increases further when the referring pages are themselves important. Kleinberg's algorithm is another link-structure-based ranking algorithm for Web pages. Unlike PageRank, it uses two kinds of scores: the authority score and the hub score. A page with a high authority score is an authority on a given topic and is referred to by many pages; a page with a high hub score links to many authoritative pages. Link-structure-based ranking has thus played an essential role in the World Wide Web (WWW), and its effectiveness and efficiency are now widely recognized. Meanwhile, since the Resource Description Framework (RDF) data model forms the foundation of the Semantic Web, any information in the Semantic Web can be expressed as an RDF graph, which makes ranking algorithms for RDF knowledge bases greatly important. The RDF graph consists of nodes and directional links similar to the Web graph, so link-structure-based ranking seems highly applicable to ranking Semantic Web resources.
However, the information space of the Semantic Web is more complex than that of the WWW. For instance, the WWW can be considered one huge class, a collection of Web pages with only a recursive 'refers to' property corresponding to the hyperlinks. The Semantic Web, by contrast, encompasses various kinds of classes and properties, so ranking methods used in the WWW must be modified to reflect this more complex information space. Previous research addressed the problem of ranking query results retrieved from RDF knowledge bases. Mukherjea and Bamba modified Kleinberg's algorithm to rank Semantic Web resources. They defined the objectivity score and the subjectivity score of a resource, corresponding to Kleinberg's authority score and hub score, respectively. Focusing on the diversity of properties, they introduced property weights to control the influence of one resource on another depending on the property linking the two. A node with a high objectivity score is the object of many RDF triples, and a node with a high subjectivity score is the subject of many RDF triples. They developed several Semantic Web systems to validate their technique and reported experimental results verifying its applicability to the Semantic Web. Despite their efforts, however, some limitations remained, which they reported in their paper. First, their algorithm is useful only when a Semantic Web system represents most of the knowledge pertaining to a certain domain; in other words, the ratio of links to nodes should be high, or overall resources should be described in reasonable detail, for their algorithm to work properly.
Second, the Tightly-Knit Community (TKC) effect, in which pages that are less important yet densely connected score higher than pages that are more important but sparsely connected, remains problematic. Third, a resource may receive a high score not because it is actually important, but simply because it is very common and consequently has many links pointing to it. In this paper, we examine such ranking problems from a novel perspective and propose a new algorithm that can solve the problems observed in the previous studies. Our proposed method is based on a class-oriented approach. In contrast to the predicate-oriented approach taken by the previous research, under our approach a user determines the weight of a property by comparing its relative significance to the other properties when evaluating the importance of resources in a specific class. This approach stems from the idea that most queries are meant to find resources belonging to the same class in the Semantic Web, which consists of many heterogeneous classes in RDF Schema. It closely reflects the way people evaluate things in the real world, and turns out to be superior to the predicate-oriented approach for the Semantic Web. Our proposed algorithm resolves the TKC effect and can further shed light on the other limitations posed by the previous research. In addition, we propose two ways to incorporate data-type properties, which previous work did not employ even when they bear on resource importance. We designed an experiment to show the effectiveness of the proposed algorithm and the validity of its ranking results, which had not been attempted in previous research. We also conducted a comprehensive mathematical analysis, overlooked in previous research, which enabled us to simplify the calculation procedure.
Finally, we summarize our experimental results and discuss further research issues.
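The PageRank iteration the abstract summarizes can be sketched in a few lines. The damping factor and graph are illustrative; dangling nodes spread their rank uniformly, which is one common convention rather than anything this paper prescribes.

```python
def pagerank(links, d=0.85, iters=100):
    """Power iteration for PageRank. links: dict node -> list of out-links."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - d) / n for v in nodes}   # teleport term
        for v, outs in links.items():
            if outs:
                share = d * rank[v] / len(outs)   # split rank over out-links
                for w in outs:
                    new[w] += share
            else:                                 # dangling node convention:
                for w in nodes:                   # spread its rank uniformly
                    new[w] += d * rank[v] / n
        rank = new
    return rank
```

A node referred to by many others (here 'a') ends up with a higher rank than one nobody links to, and the total rank stays normalized to 1.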

Application of Westgard Multi-Rules for Improving Nuclear Medicine Blood Test Quality Control (핵의학 검체검사 정도관리의 개선을 위한 Westgard Multi-Rules의 적용)

  • Jung, Heung-Soo;Bae, Jin-Soo;Shin, Yong-Hwan;Kim, Ji-Young;Seok, Jae-Dong
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.16 no.1
    • /
    • pp.115-118
    • /
    • 2012
  • Purpose: The Levey-Jennings chart flags measurement values that deviate from the tolerance limits (mean ±2SD or ±3SD). The Westgard multi-rules, in contrast, are actively recommended as a more efficient, specialized form of internal quality control for hospital accreditation. Applying the Westgard multi-rules requires a credible quality control substance and target value. However, because laboratory tests commonly use the quality control substance provided within the test kit, calculating a target value is difficult owing to frequent changes in concentration and the insufficient credibility of the control substance. This study attempts to improve the professionalism and credibility of quality control by applying the Westgard multi-rules and calculating a credible target value using a commercialized quality control substance. Materials and Methods: This study used Immunoassay Plus Control Levels 1, 2, and 3 of Company B as the quality control substance for Total T3, a thyroid test performed at the hospital. The target value was established as the mean of 295 cases collected over one month, excluding values deviating more than ±2SD, and was entered into the hospital's quality control program. The 12s, 22s, 13s, 2 of 32s, R4s, 41s, 10x̄, and 7T Westgard multi-rules were applied to the Total T3 test, which was run 194 times over 20 days in August. Based on the applied rules, the data were classified into random error and systematic error for analysis. Results: The target values of quality control substances 1, 2, and 3 were established as 84.2 ng/dl, 156.7 ng/dl, and 242.4 ng/dl for Total T3, with standard deviations of 11.22 ng/dl, 14.52 ng/dl, and 14.52 ng/dl, respectively.
In the error-type analysis performed after applying the Westgard multi-rules to these target values, the random-error rules were flagged as follows: 12s 48 times, 13s 13 times, and R4s 6 times; the systematic-error rules were flagged as follows: 22s 10 times, 41s 11 times, 2 of 32s 17 times, and 10x̄ 10 times, while 7T was never triggered. For uncontrollable random errors, the entire experimental process was rechecked and greater emphasis was placed on re-testing. For controllable systematic errors, the cause of the error was investigated, recorded in the action form, and reported to the internal quality control committee when necessary. Conclusions: This study applied the Westgard multi-rules using a commercialized control substance and established target values. As a result, precise analysis of random and systematic errors was achieved through the 12s, 22s, 13s, 2 of 32s, R4s, 41s, 10x̄, and 7T rules, and all data within the ±3SD range were analyzed. In this regard, quality control based on the systematic application of the Westgard multi-rules is more effective than the Levey-Jennings chart alone and can maximize error detection.
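Several of the Westgard multi-rules used above reduce to simple checks on the z-scores of a control run. The sketch below covers the 12s, 13s, 22s, R4s, 41s, and 10x̄ rules with their usual threshold definitions; the 7T and 2 of 32s rules are omitted for brevity, and this is a generic illustration rather than the hospital's quality control program.

```python
def westgard_violations(values, mean, sd):
    """Flag a subset of the Westgard multi-rules on a run of control values."""
    z = [(v - mean) / sd for v in values]
    flags = []
    if abs(z[-1]) > 2:
        flags.append('1-2s')                  # warning rule
    if abs(z[-1]) > 3:
        flags.append('1-3s')                  # random error
    if len(z) >= 2 and (min(z[-2:]) > 2 or max(z[-2:]) < -2):
        flags.append('2-2s')                  # systematic error
    if len(z) >= 2 and abs(z[-1] - z[-2]) > 4:
        flags.append('R-4s')                  # random error (range rule)
    if len(z) >= 4 and (min(z[-4:]) > 1 or max(z[-4:]) < -1):
        flags.append('4-1s')                  # systematic error
    if len(z) >= 10 and (min(z[-10:]) > 0 or max(z[-10:]) < 0):
        flags.append('10-x')                  # systematic shift about the mean
    return flags
```

With a target of 100 and SD of 5 (illustrative numbers), a single value of 111 triggers only the 12s warning, two consecutive high values add 22s, and a swing from 89 to 111 adds the R4s range rule.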


Content-based Recommendation Based on Social Network for Personalized News Services (개인화된 뉴스 서비스를 위한 소셜 네트워크 기반의 콘텐츠 추천기법)

  • Hong, Myung-Duk;Oh, Kyeong-Jin;Ga, Myung-Hyun;Jo, Geun-Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.3
    • /
    • pp.57-71
    • /
    • 2013
  • News is generated around the world minute by minute. Some news can be anticipated, but most arises from unexpected events such as natural disasters, accidents, and crimes. People spend much time watching the huge amount of news delivered by many media outlets, because they want to understand what is happening now, to predict what might happen in the near future, and to share and discuss the news; watching news and obtaining useful information from it helps people make better daily decisions. However, it is difficult for people to choose news suited to them and to extract useful information from it, because there are so many news media, such as portal sites and broadcasters, and most news articles consist of gossip and breaking news. User interests also change over time, and many people have no interest in outdated news, so a personalized news service should reflect users' recent interests; that is, it should manage user profiles dynamically. In this paper, a content-based news recommendation system is proposed to provide such a personalized news service. Personalization necessarily requires the user's personal information, and a social network service is used to extract it. The proposed system constructs a dynamic user profile from recent user information on Facebook, one of the major social network services. The user information contains personal information, recent articles, and Facebook Page information. Facebook Pages are used by businesses, organizations, and brands to share their content and connect with people, and Facebook users can add a Page to indicate their interest in it. The proposed system uses this Page information to create the user profile and to match user preferences to news topics.
However, some Pages are not directly matched to a news topic, because a Page deals with individual objects and does not provide topic information suitable to news. Freebase, a large collaborative database of well-known people, places, and things, is therefore used to match Pages to news topics through the hierarchy information of its objects. By using recent Page information and articles of Facebook users, the proposed system maintains a dynamic user profile, which is used to measure user preferences on news. To generate news profiles, the news categories predefined by the news media are used, and keywords of news articles are extracted after analyzing news contents including title, category, and script. The TF-IDF technique, which reflects how important a word is to a document in a corpus, is used to identify the keywords of each article. The same format is used for user profiles and news profiles so that the similarity between user preferences and news can be measured efficiently. The proposed system calculates all similarity values between user profiles and news profiles. Existing similarity calculations in the vector space model do not cover synonyms, hypernyms, and hyponyms, because they handle only the given words; the proposed system applies WordNet to the similarity calculation to overcome this limitation. The top-N news articles with the highest similarity values for a target user are then recommended. To evaluate the proposed system, user profiles were generated from Facebook accounts with the participants' consent, and a Web crawler was implemented to extract news information from PBS, a non-profit public broadcasting television network in the United States, to construct news profiles. We compare the performance of the proposed method with two benchmark algorithms: a traditional method based on TF-IDF, and the 6Sub-Vectors method, which divides the points for obtaining keywords into six parts.
Experimental results demonstrate that the proposed system provides useful news to users by applying the user's social network information and WordNet functions, in terms of the prediction error of the recommended news.
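The TF-IDF profile matching described above, without the WordNet extension, can be sketched as follows; the token lists are illustrative and the weighting (raw term frequency normalized by document length, natural-log IDF) is one common variant, not necessarily the paper's exact formula.

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """docs: list of token lists -> one {term: tf-idf weight} dict per doc."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))   # document frequency
    idf = {t: math.log(n / df[t]) for t in df}
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (tf[t] / len(doc)) * idf[t] for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity of two sparse term-weight dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

Two weather-related articles score closer to each other than to an election article; this literal word matching is exactly the limitation WordNet is brought in to soften.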

Assessment of statistical errors of articles published in the Journal of the Korean Academy of Prosthodontics: 2006 - 2010 (대한치과보철학회지에서 볼 수 있는 통계적 오류의 고찰(2006 - 2010))

  • Kang, Dong-Wan;Seo, Yunam;Oh, Nam-Sik;Lim, Hoi-Jeong
    • The Journal of Korean Academy of Prosthodontics
    • /
    • v.50 no.4
    • /
    • pp.258-270
    • /
    • 2012
  • Purpose: Use of inappropriate statistical methods may lead to incorrect conclusions and a waste of valuable resources. The goal of this study was to assess the frequency and types of several common statistical errors in articles published in the Journal of the Korean Academy of Prosthodontics (JKAP) over a five-year period. Materials and methods: Of 336 articles published in the JKAP from 2006 to 2010, 255 articles using statistics were reviewed and classified by statistical method and year. The frequency and types of the statistical methods were examined, and statistical errors were evaluated with respect to the appropriateness of the experimental design, assumption checks, independence of outcomes, adequate sample size, and suitable use of the statistical method. Statistical guidelines were drawn up based on these criteria. Results: Of the 255 articles using statistics, 193 (75.9%) used inferential statistics and 153 (60.0%) used the SPSS statistical software. Among the articles using inferential statistics, the three most frequently used methods were ANOVA (41.5%), the t-test (20.0%), and nonparametric methods (16.9%). The overall rate of statistical errors was 61.2 percent, similar to the rates reported in several studies of medical journals. Conclusion: After an overall analysis of differences among groups, post-hoc tests for pairwise comparisons are required, and an optimal sample size calculation is an essential part of the study protocol. To minimize the occurrence of statistical errors, statistical guidelines were developed for each statistical test procedure and will contribute to academic improvement in the JKAP.
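The omnibus test the conclusion refers to, a one-way ANOVA across groups before any post-hoc pairwise comparisons, reduces to a short F-statistic calculation; the data below are illustrative, not from the reviewed articles.

```python
def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA over k groups.
    F = (between-group mean square) / (within-group mean square)."""
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F only says that some group means differ; identifying which pairs differ is the job of the post-hoc tests (e.g. Tukey's HSD) that the review found frequently missing.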

Development of tracer concentration analysis method using drone-based spatio-temporal hyperspectral image and RGB image (드론기반 시공간 초분광영상 및 RGB영상을 활용한 추적자 농도분석 기법 개발)

  • Gwon, Yeonghwa;Kim, Dongsu;You, Hojun;Han, Eunjin;Kwon, Siyoon;Kim, Youngdo
    • Journal of Korea Water Resources Association
    • /
    • v.55 no.8
    • /
    • pp.623-634
    • /
    • 2022
  • Due to river maintenance projects such as the creation of waterfront areas and the Four Major Rivers Project, the flow characteristics of rivers are continuously changing, and the risk of water quality accidents due to the inflow of various pollutants is increasing. In the event of a water quality accident, the effect on the downstream side must be minimized by predicting the concentration and arrival time of pollutants in consideration of the river's flow characteristics. Tracking the behavior of these pollutants requires calculating the diffusion and dispersion coefficients for each river section; among these, the dispersion coefficient is used to analyze the spreading range of soluble pollutants. Existing experimental approaches to pollutant tracking require considerable manpower and cost, and spatially high-resolution data are difficult to obtain with limited equipment. Recently, contaminants have been tracked using RGB drones, but RGB images likewise collect only limited spectral information. In this study, to overcome these limitations, a hyperspectral sensor was mounted on a drone-based remote sensing platform to collect data at higher temporal and spatial resolution than conventional contact measurement. Using the collected spatio-temporal hyperspectral images, the tracer concentration was calculated and the transverse dispersion coefficient was derived. By overcoming the limitations of the drone platform in future research and improving the dispersion coefficient calculation technique, it is expected that various pollutants leaking into the water system, as well as changes in various water quality parameters and river factors, can be detected.
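One standard way to derive a dispersion coefficient from tracer concentration snapshots is the method of moments, D = (1/2) dσ²/dt, where σ² is the spatial variance of the concentration profile. Whether the authors used this exact estimator is an assumption, and the profiles below are synthetic.

```python
def variance_of_profile(positions, concentrations):
    """Spatial variance of a tracer concentration profile via its moments."""
    m0 = sum(concentrations)                      # zeroth moment (total mass)
    mean = sum(x * c for x, c in zip(positions, concentrations)) / m0
    return sum(c * (x - mean) ** 2
               for x, c in zip(positions, concentrations)) / m0

def dispersion_coefficient(profile_t1, profile_t2, t1, t2):
    """Method of moments: D = (1/2) * d(variance)/dt between two snapshots.
    Each profile is a (positions, concentrations) pair."""
    var1 = variance_of_profile(*profile_t1)
    var2 = variance_of_profile(*profile_t2)
    return 0.5 * (var2 - var1) / (t2 - t1)
```

With drone imagery, each snapshot's concentrations would come from the hyperspectral concentration retrieval, sampled along a transect across the channel for the transverse coefficient.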