• Title/Summary/Keyword: weighted average


Deterioration Evaluation Method of Noise Barriers for Managements of Highway (고속도로 방음벽 유지관리를 위한 방음벽 노후도 평가 방안)

  • Kim, Sangtae;Shin, Ilhyoung;Kim, Kyoungsu;Kim, Daae;Kim, Heungrae;Im, Jahae;Lee, Jajun
    • Journal of Environmental Impact Assessment
    • /
    • v.28 no.4
    • /
    • pp.387-399
    • /
    • 2019
  • This research aimed to establish a classification of damage types and a damage rating system for expressway noise barriers, and to develop a deterioration evaluation method that reflects them. A noise barrier consists of soundproof panels, foundations, and posts, and the soundproof panels, made from 10 different types of materials, are used in single or mixed form. Because panel damage can be single or composite, a deterioration evaluation model was developed that can reflect these characteristics of noise barriers. The materials mainly used for soundproof walls were divided into five material types: metal, plastic, timber, transparent, and concrete. Damage types were classified into corrosion, discoloration, deformation, spalling, and dislocation, and were subdivided according to the barrier's components and materials. Damage ratings of good, minor, normal, and severe were assigned to each major part of the noise barrier (soundproof panel, foundation, and post). The deterioration degree of the whole noise barrier was then evaluated comprehensively using a weighted average across these parts. The resulting method can systematically assess noise barriers with single or mixed soundproof panels and with single or complex damage types. Through such an evaluation system, the deterioration status of installed noise barriers can be systematically understood and utilized in planning and implementing efficient maintenance for repair and improvement.
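
The abstract's whole-barrier score can be sketched as a simple weighted average over component ratings. The numeric rating scale and the component weights below are illustrative assumptions, not the paper's calibrated values.

```python
# Hypothetical sketch: combine per-component damage ratings into an overall
# deterioration score with a weighted average. The 0-3 scale (good=0 ... severe=3)
# and the component weights are assumptions for illustration only.

RATING = {"good": 0, "minor": 1, "normal": 2, "severe": 3}

def overall_deterioration(ratings, weights):
    """ratings/weights: dicts keyed by component name (panel, post, foundation)."""
    total_w = sum(weights[c] for c in ratings)
    return sum(RATING[r] * weights[c] for c, r in ratings.items()) / total_w

score = overall_deterioration(
    {"panel": "normal", "post": "minor", "foundation": "good"},
    {"panel": 0.5, "post": 0.3, "foundation": 0.2},
)
print(round(score, 2))  # (2*0.5 + 1*0.3 + 0*0.2) / 1.0 = 1.3
```

With weights summing to 1 the division is a no-op, but normalizing keeps the formula valid for arbitrary weights.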

A Recidivism Prediction Model Based on XGBoost Considering Asymmetric Error Costs (비대칭 오류 비용을 고려한 XGBoost 기반 재범 예측 모델)

  • Won, Ha-Ram;Shim, Jae-Seung;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.127-137
    • /
    • 2019
  • Recidivism prediction has been a subject of constant research by experts since the early 1970s, and it has become more important as crimes committed by recidivists steadily increase. In particular, after the US and Canada adopted the 'Recidivism Risk Assessment Report' as a decisive criterion during trial and parole screening in the 1990s, research on recidivism prediction became more active, and empirical studies on recidivism factors began in Korea in the same period. Although most recidivism prediction studies have so far focused on the factors of recidivism or the accuracy of prediction, it is also important to minimize the misclassification cost, because recidivism prediction has an asymmetric error cost structure. In general, the cost of misclassifying people who will not recidivate as likely to recidivate is lower than the cost of misclassifying people who will recidivate as unlikely to: the former increases only additional monitoring costs, while the latter incurs large social and economic costs. Therefore, in this paper, we propose a recidivism prediction model based on XGBoost (eXtreme Gradient Boosting; XGB) that considers asymmetric error costs. In the first step of the model, XGB, recognized as a high-performance ensemble method in the field of data mining, was applied, and its results were compared with various prediction models such as LOGIT (logistic regression), DT (decision trees), ANN (artificial neural networks), and SVM (support vector machines). In the next step, the classification threshold is optimized to minimize the total misclassification cost, which is the weighted average of FNE (False Negative Error) and FPE (False Positive Error). To verify the usefulness of the model, it was applied to a real recidivism prediction dataset. As a result, the XGB model not only showed better prediction accuracy than the other models but also reduced the misclassification cost most effectively.
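
The threshold-optimization step described above can be sketched generically: scan candidate thresholds over predicted probabilities and keep the one minimizing total cost, with false negatives weighted more heavily. The cost values and toy data are assumptions, not the paper's.

```python
# Illustrative sketch (not the paper's code): pick the classification threshold
# that minimizes total misclassification cost when false negatives (missed
# recidivists) are costlier than false positives. Cost ratios are assumed.

import numpy as np

def best_threshold(y_true, y_prob, cost_fn=5.0, cost_fp=1.0):
    """Scan candidate thresholds; return the one with the minimum total cost."""
    thresholds = np.linspace(0.01, 0.99, 99)
    costs = []
    for t in thresholds:
        y_pred = (y_prob >= t).astype(int)
        fn = np.sum((y_true == 1) & (y_pred == 0))  # missed recidivists
        fp = np.sum((y_true == 0) & (y_pred == 1))  # extra monitoring
        costs.append(cost_fn * fn + cost_fp * fp)
    return thresholds[int(np.argmin(costs))]

y_true = np.array([1, 1, 0, 0, 1, 0])
y_prob = np.array([0.9, 0.4, 0.3, 0.2, 0.6, 0.55])
print(best_threshold(y_true, y_prob))
```

A high false-negative cost pushes the chosen threshold below the default 0.5, trading extra false positives for fewer misses.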

Granite Dike Swarm and U-Pb Ages in the Ueumdo, Hwaseong City, Korea (경기도 화성시 우음도 일원의 화강암 암맥군과 U-Pb 연령)

  • Chae, Yong-Un;Kang, Hee-Cheol;Kim, Jong-Sun;Park, Jeong-Woong;Ha, Sujin;Lim, Hyoun Soo;Shin, Seungwon;Kim, Hyeong Soo
    • Journal of the Korean earth science society
    • /
    • v.43 no.5
    • /
    • pp.618-638
    • /
    • 2022
  • The Middle Jurassic granite dike swarm intruding the Paleoproterozoic banded gneiss is pervasively observed in Ueumdo, Hwaseong City, in the mid-western Gyeonggi Massif. Based on cross-cutting relationships in a representative outcrop, there are four dikes (UE-A, UE-C, UE-D, UE-E), and, by orientation, three granite dike groups: NW-trending (UE-A), NW- to WNW-trending (UE-C), and NE-trending (UE-D and UE-E). These granite dikes are massive, medium- to coarse-grained biotite granites, and their relative ages observed in outcrop are, in order, UE-A, UE-D (=UE-E), and UE-C. Geometric analysis of the dikes indicates that the UE-A and UE-C dikes intruded under an approximately NE-SW-trending horizontal minimum stress field. The UE-A dike, which showed a relatively low average SiO2 content in major element analysis, is a product of earlier magma differentiation than the other dikes, consistent with the relative ages. The 206Pb/238U weighted mean ages obtained from SHRIMP zircon U-Pb dating were 167 Ma (UE-A), 164 Ma (UE-C), 167 Ma (UE-D), and 167 Ma (UE-E). The UE-A, UE-D, and UE-E samples thus showed very similar ages, while the UE-C dike is the youngest, consistent with the relative ages in outcrop and the major element analysis. Therefore, the granite dikes intruded during the Middle Jurassic (approximately 167-164 Ma), coinciding with the Middle Jurassic plutons that are geographically widely distributed across the Gyeonggi Massif. This result indicates that the wide occurrence of Middle Jurassic plutons on the Gyeonggi Massif formed as igneous activity migrated northwestward while the subduction angle of the subducting oceanic plate shallowed during the Jurassic.
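
The "weighted mean ages" quoted above are, in standard geochronology practice, inverse-variance weighted means of individual zircon spot ages. A minimal sketch, with made-up ages and errors rather than the paper's SHRIMP data:

```python
# Illustrative sketch: an inverse-variance weighted mean, the usual way zircon
# U-Pb spot ages are combined into a single "weighted mean age". The ages and
# 1-sigma errors below are invented example values, not the paper's data.

def weighted_mean(ages, sigmas):
    weights = [1.0 / s**2 for s in sigmas]       # inverse-variance weights
    mean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5            # 1-sigma error of the mean
    return mean, err

ages = [166.2, 167.5, 167.1, 166.8]   # Ma (hypothetical spot ages)
sigmas = [1.1, 0.9, 1.3, 1.0]         # Ma, 1-sigma (hypothetical)
mean, err = weighted_mean(ages, sigmas)
print(f"{mean:.1f} ± {err:.1f} Ma")
```

More precise spots get proportionally more weight, and the quoted uncertainty shrinks as spots are added.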

Analysis of Spatial Changes in the Forest Landscape of the Upper Reaches of Geum River Dam Basin according to Land Cover Change (토지피복변화에 따른 금강 상류 댐 유역 산림 경관의 구조적 변화 분석)

  • Kyeong-Tae Kim;Hyun-Jung Lee;Whee-Moon Kim;Won-Kyong Song
    • Korean Journal of Environment and Ecology
    • /
    • v.37 no.4
    • /
    • pp.289-301
    • /
    • 2023
  • Forests within watersheds are essential to maintaining ecosystems and are the central infrastructure of an ecological network system. However, due to indiscriminate development projects carried out over past decades, forest fragmentation and land-use change have accelerated, and original forest functions have been lost. Since a forest's structural pattern directly impacts the ecological processes and functions of forest ecosystems, identifying and analyzing patterns of change is essential. Therefore, this study analyzed structural changes in the forest landscape according to time-series land cover changes, using the FRAGSTATS model for the dam watershed of the upper Geum River. Land cover change detection showed that, from the 1980s to the 2010s, forests increased by 33.12 square kilometers (0.62%) and urbanized dry areas by 67.26 square kilometers (1.26%), while agricultural areas decreased by 148.25 square kilometers (2.79%). Whole-landscape (non-sampled) analysis within the watershed indicated that percentage of landscape (PLAND), area-weighted contiguity index (CONTIG_AM), mean core area (CORE_MN), and percentage of like adjacencies (PLADJ) increased, while the number of patches (NP), landscape shape index (LSI), and cohesion index (COHESION) decreased. Moving window analysis of structural change patterns showed that the forest landscape was relatively well preserved in Sangju City (Gyeongsangbuk Province), Boeun County (Chungcheongbuk Province), and Jinan County (Jeollabuk Province), but fragmentation was ongoing along the border between Okcheon and Yeongdong Counties (Chungcheongbuk Province) and Geumsan County (Chungcheongnam Province), and in forest areas adjacent to Muju and Jangsu Counties (Jeollabuk Province). These results indicate that afforestation projects for fragmented areas should be established when preparing future regional forest management strategies. This study identified areas where fragmentation of the forest landscape is expected, and the results may be used as basic data for assessing the health of watershed forests and establishing management plans.

Research on ITB Contract Terms Classification Model for Risk Management in EPC Projects: Deep Learning-Based PLM Ensemble Techniques (EPC 프로젝트의 위험 관리를 위한 ITB 문서 조항 분류 모델 연구: 딥러닝 기반 PLM 앙상블 기법 활용)

  • Hyunsang Lee;Wonseok Lee;Bogeun Jo;Heejun Lee;Sangjin Oh;Sangwoo You;Maru Nam;Hyunsik Lee
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.12 no.11
    • /
    • pp.471-480
    • /
    • 2023
  • Construction order volume in South Korea grew significantly, from 91.3 trillion won in public orders in 2013 to a total of 212 trillion won in 2021, particularly in the private sector. As the domestic and overseas markets grew, the scale and complexity of EPC (Engineering, Procurement, Construction) projects increased, and risk management of projects and of ITB (Invitation to Bid) documents became a critical issue. The time granted to construction companies in the bidding process following an EPC project award is limited, and reviewing all the risk terms in an ITB document is extremely challenging due to manpower and cost constraints. Previous research attempted to categorize the risk terms in EPC contract documents and detect them with AI, but limitations of the data, such as restricted labeled-data availability and class imbalance, prevented practical use. Therefore, this study aims to develop an AI model that categorizes contract terms in detail based on the FIDIC Yellow 2017 (Federation Internationale des Ingenieurs-Conseils) standard, rather than defining and classifying risk terms as in previous research. A multi-class text classification function is necessary because the contract terms that need detailed review may vary with the scale and type of the project. To enhance the performance of the classification model, we developed an ELECTRA-based PLM (Pre-trained Language Model) capable of efficiently learning the context of text data from the pre-training stage, and conducted a four-step experiment to validate its performance. As a result, an ensemble of the self-developed ITB-ELECTRA model and Legal-BERT achieved the best performance, with a weighted average F1-score of 76% in the classification of 57 contract terms.
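
The reported metric, a weighted average F1-score, is the per-class F1 weighted by each class's support. A minimal hand-rolled sketch (toy labels standing in for the 57 clause classes):

```python
# Minimal sketch of a support-weighted average F1-score, the metric quoted for
# the clause classifier. The labels are toy stand-ins, not FIDIC clause data.

from collections import Counter

def weighted_f1(y_true, y_pred):
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in set(y_true):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        score += support[c] / total * f1   # weight per-class F1 by support
    return score

print(round(weighted_f1([0, 0, 1, 1, 1, 2], [0, 1, 1, 1, 0, 2]), 3))  # 0.667
```

This matches `sklearn.metrics.f1_score(..., average="weighted")`, which is the conventional implementation of the metric.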

A Study on Web-based Technology Valuation System (웹기반 지능형 기술가치평가 시스템에 관한 연구)

  • Sung, Tae-Eung;Jun, Seung-Pyo;Kim, Sang-Gook;Park, Hyun-Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.1
    • /
    • pp.23-46
    • /
    • 2017
  • Although there have been cases of evaluating the value of specific companies or projects, centered on developed countries in North America and Europe, since the early 2000s, systems and methodologies for estimating the economic value of individual technologies or patents have been activated only gradually. There already exist several online systems that qualitatively evaluate a technology's grade or patent rating, such as 'KTRS' of KIBO and 'SMART 3.1' of the Korea Invention Promotion Association. Recently, however, a web-based technology valuation system, the 'STAR-Value system', which calculates quantitative values of a subject technology for various purposes such as business feasibility analysis, investment attraction, and tax/litigation, has been officially opened and is spreading. In this study, we introduce the types of methodologies and evaluation models, the reference information supporting these theories, and how the associated databases are utilized, focusing on the modules and frameworks embedded in the STAR-Value system. In particular, there are six valuation methods, including the discounted cash flow (DCF) method, a representative income-approach method that discounts anticipated future economic income to present value, and the relief-from-royalty method, which calculates the present value of royalties, treating the contribution of the subject technology to the created business value as the royalty rate. We examine how these models and their supporting information (technology life, corporate financial information, discount rate, industrial technology factors, etc.) can be used and linked in an intelligent manner. Based on classifications such as the International Patent Classification (IPC) or the Korea Standard Industry Classification (KSIC) of the technology to be evaluated, the STAR-Value system automatically returns metadata such as technology cycle time (TCT), the sales growth rate and profitability of similar companies or industry sectors, the weighted average cost of capital (WACC), and indices of industrial technology factors, and applies adjustment factors to them, so that the calculated technology value has high reliability and objectivity. Furthermore, if the information on the potential market size of the target technology and the market share of the commercializing entity draws on data-driven sources, or if estimated value ranges of similar technologies by industry sector are provided from evaluation cases already completed and accumulated in the database, the STAR-Value system is anticipated to present highly accurate value ranges in real time by intelligently linking its various support modules. Beyond the valuation models and primary variables presented in this paper, the STAR-Value system is intended to be utilized more systematically and in a data-driven way through an optimal model selection guideline module, an intelligent technology value range reasoning module, and a market share prediction module based on similar-company selection. The development of the web-based STAR-Value system is significant in that it widely disseminates a system through which the theory of technology valuation can be validated and applied in practice, and it is expected to be utilized in various fields of technology commercialization.
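
The WACC the system returns as a discount rate, and its use in the DCF method, follow the standard formulas. A minimal sketch with purely illustrative numbers:

```python
# Hedged sketch of the standard WACC formula and its use as the DCF discount
# rate. All input values (capital structure, costs, tax rate, cash flows) are
# illustrative assumptions, not STAR-Value system data.

def wacc(equity, debt, cost_equity, cost_debt, tax_rate):
    """Cost of equity and after-tax cost of debt, weighted by capital shares."""
    total = equity + debt
    return (equity / total) * cost_equity + (debt / total) * cost_debt * (1 - tax_rate)

rate = wacc(equity=600, debt=400, cost_equity=0.12, cost_debt=0.06, tax_rate=0.25)
print(f"{rate:.4f}")  # 0.6*0.12 + 0.4*0.06*0.75 = 0.0900

# DCF: present value of projected yearly net cash flows at that discount rate.
cashflows = [100, 120, 140]  # illustrative
pv = sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))
print(round(pv, 2))
```

The tax term reflects the deductibility of interest; the DCF sum discounts each year's cash flow back by one more period.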

Assessing the Damage: An Exploratory Examination of Electronic Word of Mouth (손해평고(损害评估): 대전자구비행소적탐색성고찰(对电子口碑行销的探索性考察))

  • Funches, Venessa Martin;Foxx, William;Park, Eun-Joo;Kim, Eun-Young
    • Journal of Global Scholars of Marketing Science
    • /
    • v.20 no.2
    • /
    • pp.188-198
    • /
    • 2010
  • This study examines the influence of negative word of mouth (NWOM) in an online context. It specifically focuses on the impact of the service failure description and the perceived intention of the communication provider on consumer evaluations of firm competence, attitude toward the firm, positive word of mouth, and behavioral intentions. Studies of communication persuasiveness focus on "who says what; to whom; in which channel; with what effect" (Chiu 2007). In this study, we examine electronic web postings, focusing particularly on two aspects of the "what": the level of service failure communicated and the perceived intention of the individual posting. It stands to reason that electronic NWOM that appears to be trying to damage a product's or firm's reputation will be viewed as more biased and thus considered less credible. According to attribution theory, people search for the causes of events, especially those that are negative and unexpected (Weiner 2006). Hennig-Thurau and Walsh (2003) state that "since the reader has only limited knowledge and trust of the author of an online articulation, the quality of the contribution could be expected to serve as a potent moderator of the articulation-behavior relationship." We therefore posit the following hypotheses: H1. Subjects exposed to electronic NWOM describing a high level of service failure will provide lower scores on measures of (a) firm competence, (b) attitude toward the firm, (c) positive word of mouth, and (d) behavioral intention than will subjects exposed to electronic NWOM describing a low level of service failure. H2. Subjects exposed to electronic NWOM with a warning intent will provide lower scores on measures of (a) firm competence, (b) attitude toward the firm, (c) positive word of mouth, and (d) behavioral intention than will subjects exposed to electronic NWOM with a vengeful intent. H3.
Level of service failure in electronic NWOM will interact with the perceived intention of the electronic NWOM, such that mean responses on measures of (a) firm competence, (b) attitude toward the firm, (c) positive word of mouth, and (d) behavioral intention will decrease from electronic NWOM with a warning intent to that with a vengeful intent. The main study involved a 2 (service failure severity) × 2 (NWOM with warning versus vengeful intent) factorial experiment. Stimuli were presented to subjects online using a mock web posting. The scenario described a service failure associated with the non-acceptance of a gift card in a brick-and-mortar retail establishment. A national sample was recruited through an online research firm; 113 subjects participated, and 104 surveys were analyzed. The scenario was perceived to be realistic, with 92.3% of subjects rating it above average. Manipulations were satisfactory; measures were pre-tested and validated, and items were analyzed and found reliable and valid. MANOVA results found the multivariate interaction not significant, allowing interpretation to proceed to the main effects. Significant main effects were found for post intent and service failure severity. The post intent main effect was attributable to attitude toward the firm, positive word of mouth, and behavioral intention; the service failure severity main effect was attributable to all four dependent variables: firm competence, attitude toward the firm, positive word of mouth, and behavioral intention. Specifically, firm competence and attitude toward the firm were lower for electronic NWOM describing a high severity of service failure than for that describing a low severity.
Positive word of mouth and behavioral intention were likewise lower for electronic NWOM describing a high severity of service failure than for that describing a low severity. Therefore, H1a, H1b, H1c, and H1d were all supported. In addition, attitude toward the firm, positive word of mouth, and behavioral intention were lower for electronic NWOM with a warning intent than for that with a vengeful intent. Thus, H2b, H2c, and H2d were supported; H2a was not supported, though results were in the hypothesized direction. There was no significant multivariate service failure severity by post intent interaction, nor a significant univariate interaction for any of the four dependent variables, so H3 was not supported. This study has research and managerial implications. The findings support prior research showing that service failure severity impacts consumer perceptions, attitude, positive word of mouth, and behavioral intentions (Weun et al. 2004). Of further relevance, this response is evidenced in the online context, suggesting the need for firms to engage in serious, focused service recovery efforts. With respect to the perceived intention of electronic NWOM, the findings support prior research suggesting that readers' attributions of the intentions of a source influence the strength of its impact on perceptions, attitude, positive word of mouth, and behavioral intentions.
The implication for managers is that while consumers do find online communications credible and influential, not all communications are weighted the same. A benefit of electronic WOM, even when it is potentially damaging, is that it can be monitored for potential problems and additionally offers the possibility of redress.

A Folksonomy Ranking Framework: A Semantic Graph-based Approach (폭소노미 사이트를 위한 랭킹 프레임워크 설계: 시맨틱 그래프기반 접근)

  • Park, Hyun-Jung;Rho, Sang-Kyu
    • Asia pacific journal of information systems
    • /
    • v.21 no.2
    • /
    • pp.89-116
    • /
    • 2011
  • In collaborative tagging systems such as Delicious.com and Flickr.com, users assign keywords or tags to their uploaded resources, such as bookmarks and pictures, for future use or sharing. The collection of resources and tags generated by a user is called a personomy, and the collection of all personomies constitutes the folksonomy. The most significant need of folksonomy users is to efficiently find useful resources or experts on specific topics, and an excellent ranking algorithm would assign higher rank to more useful resources or experts. What resources are considered useful in a folksonomic system? Does a standard superior to frequency or freshness exist? A resource recommended by more users with more expertise should be worthier of attention. This ranking paradigm can be implemented through a graph-based ranking algorithm; two well-known representatives are PageRank by Google and HITS (Hypertext Induced Topic Selection) by Kleinberg. Both PageRank and HITS assign a higher evaluation score to pages linked to by more higher-scored pages, HITS differing from PageRank in that it uses two kinds of scores: authority and hub scores. Their ranking objects are limited to Web pages, whereas the ranking objects of a folksonomic system are heterogeneous (users, resources, and tags). Therefore, uniformly applying the voting notion of PageRank and HITS to the links of a folksonomy would be unreasonable. In a folksonomic system, each link corresponding to a property can have an opposite direction, depending on whether the property is in the active or passive voice. The current research stems from the idea that a graph-based ranking algorithm could be applied to the folksonomic system using the concept of mutual interactions between entities, rather than the voting notion of PageRank or HITS.
The concept of mutual interactions, proposed for ranking Semantic Web resources, enables the calculation of importance scores of various resources unaffected by link directions. The weights of a property representing the mutual interaction between classes are assigned according to the relative significance of the property to the resource importance of each class. This class-oriented approach is based on the fact that the Semantic Web contains many heterogeneous classes, so applying a different appraisal standard to each class is more reasonable. This is similar to the way humans evaluate: different items are assigned specific weights, which are then summed to determine a weighted average. We can also check for missing properties more easily with this approach than with predicate-oriented approaches. A user of a tagging system usually assigns more than one tag to the same resource, and there can be more than one tag with the same subjectivity and objectivity. When many users assign similar tags to the same resource, it becomes necessary to grade the users differently depending on the order of assignment. This idea comes from studies in psychology in which expertise involves the ability to select the most relevant information for achieving a goal. An expert should be someone who not only has a large collection of documents annotated with a particular tag, but also tends to add documents of high quality to his or her collection; such documents are identified by the number, as well as the expertise, of users who hold the same documents in their collections. In other words, there is a relationship of mutual reinforcement between the expertise of a user and the quality of a document. In addition, there is a need to rank entities related more closely to a certain entity, and, since the popularity of a topic in social media is temporary, recent data should carry more weight than old data.
We propose a comprehensive folksonomy ranking framework in which all these considerations are addressed and which can be easily customized to each folksonomy site for ranking purposes. To examine the validity of our ranking algorithm and show the mechanism of adjusting property, time, and expertise weights, we first use a dataset designed for analyzing the effect of each ranking factor independently, and then show the ranking results of a real folksonomy site with the ranking factors combined. Because the ground truth of a given dataset is not known when it comes to ranking, we inject simulated data whose ranking results can be predicted into the real dataset and compare the ranking results of our algorithm with those of a previous HITS-based algorithm. Our semantic ranking algorithm based on the concept of mutual interaction seems preferable to the HITS-based algorithm as a flexible folksonomy ranking framework. Some concrete points of difference are as follows. First, with the time concept applied to the property weights, our algorithm shows superior performance in lowering the scores of older data and raising the scores of newer data. Second, applying the time concept to the expertise weights as well as to the property weights, our algorithm controls the conflicting influence of expertise weights and enhances the overall consistency of time-valued ranking; the expertise weights of the previous study can act as an obstacle to time-valued ranking because the number of followers increases as time goes on. Third, many new properties and classes can be included in our framework, whereas the previous HITS-based algorithm, based on the voting notion, loses ground when the domain consists of more than two classes or when other important properties, such as "sent through Twitter" or "registered as a friend," are added to the domain. Fourth, there is a big difference in calculation time and memory use between the two kinds of algorithms: while the multiplication of two matrices has to be executed twice for the previous HITS-based algorithm, this is unnecessary with ours. In our ranking framework, various folksonomy ranking policies can be expressed with the ranking factors combined, and our approach works even if the folksonomy site is not implemented with Semantic Web languages. Above all, the time weight proposed in this paper will be applicable to various domains, including social media, where time value is considered important.
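
The mutual-interaction idea can be sketched as score propagation over a weighted, undirected graph of heterogeneous entities, so link direction is irrelevant. This is a toy illustration under assumed weights, not the authors' algorithm.

```python
# Toy sketch of direction-free score propagation over a heterogeneous
# entity graph (users, tags, resources). Property weights are assumptions;
# the paper's framework additionally applies time and expertise weights.

def mutual_rank(adj, iters=50):
    """adj: dict node -> {neighbor: property weight}. Returns L1-normalized scores."""
    scores = {n: 1.0 for n in adj}
    for _ in range(iters):
        # Each node's new score is the weight-sum of its neighbors' scores.
        new = {n: sum(w * scores[m] for m, w in adj[n].items()) for n in adj}
        norm = sum(new.values())
        scores = {n: v / norm for n, v in new.items()}
    return scores

graph = {  # symmetric weights: direction does not matter
    "user:alice": {"tag:python": 1.0, "doc:1": 0.5},
    "tag:python": {"user:alice": 1.0, "doc:1": 1.0},
    "doc:1": {"user:alice": 0.5, "tag:python": 1.0},
}
print(mutual_rank(graph))
```

This is power iteration on a symmetric weight matrix; it converges to the dominant eigenvector, ranking the most strongly connected entity highest.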

Resolving the 'Gray sheep' Problem Using Social Network Analysis (SNA) in Collaborative Filtering (CF) Recommender Systems (소셜 네트워크 분석 기법을 활용한 협업필터링의 특이취향 사용자(Gray Sheep) 문제 해결)

  • Kim, Minsung;Im, Il
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.2
    • /
    • pp.137-148
    • /
    • 2014
  • Recommender systems have become one of the most important technologies in e-commerce today. The ultimate reason to shop online, for many consumers, is to reduce the effort of information search and purchase, and recommender systems are a key technology serving these needs. Many past studies of recommender systems have been devoted to developing and improving recommendation algorithms, and collaborative filtering (CF) is known to be the most successful one. Despite its success, however, CF has several shortcomings, such as the cold-start, sparsity, and gray sheep problems. In order to generate recommendations, ordinary CF algorithms require evaluations or preference information directly from users; for new users who have no such information, CF cannot come up with recommendations (the cold-start problem). As the numbers of products and customers increase, the scale of the data increases exponentially and most of the data cells are empty; this sparse dataset makes computation for recommendation extremely hard (the sparsity problem). Since CF is based on the assumption that there are groups of users sharing common preferences or tastes, CF becomes inaccurate if there are many users with rare and unique tastes (the gray sheep problem). This study proposes a new algorithm that utilizes Social Network Analysis (SNA) techniques to resolve the gray sheep problem. We utilize 'degree centrality' in SNA to identify users with unique preferences (gray sheep). Degree centrality in SNA refers to the number of direct links to and from a node. In a network of users who are connected through common preferences or tastes, those with unique tastes have fewer links to other users (nodes) and are isolated from other users. Therefore, gray sheep can be identified by calculating the degree centrality of each node. We divide the dataset into two, gray sheep and others, based on the degree centrality of the users.
Then, different similarity measures and recommendation methods are applied to these two datasets. The detailed algorithm is as follows: Step 1: Convert the initial data, which is a two-mode network (user to item), into a one-mode network (user to user). Step 2: Calculate the degree centrality of each node and separate those nodes having degree centrality values lower than a pre-set threshold. The threshold value is determined by simulation such that the accuracy of CF for the remaining dataset is maximized. Step 3: An ordinary CF algorithm is applied to the remaining dataset. Step 4: Since the separated dataset consists of users with unique tastes, an ordinary CF algorithm cannot generate recommendations for them; a 'popular item' method is used to generate recommendations for these users instead. The F measures of the two datasets are weighted by the numbers of nodes and summed to be used as the final performance metric. In order to test the performance improvement of this new algorithm, an empirical study was conducted using a publicly available dataset: the MovieLens data by the GroupLens research team. We used 100,000 evaluations by 943 users on 1,682 movies. The proposed algorithm was compared with an ordinary CF algorithm utilizing the 'best-N-neighbors' and cosine similarity methods. The empirical results show that the F measure was improved by about 11% on average when the proposed algorithm was used.

    Past studies to improve CF performance typically used additional information beyond users' evaluations, such as demographic data. Some studies applied SNA techniques as a new similarity metric. This study is novel in that it uses SNA to separate the dataset. It shows that the performance of CF can be improved, without any additional information, when SNA techniques are used as proposed. This study has several theoretical and practical implications. It empirically shows that the characteristics of a dataset can affect the performance of CF recommender systems, which helps researchers understand the factors affecting CF performance. It also opens a door for future studies applying SNA to CF to analyze the characteristics of datasets. In practice, this study provides guidelines for improving the performance of CF recommender systems with a simple modification.
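The split-and-weight scheme described in the steps above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names (`split_gray_sheep`, `weighted_f`), the dict-of-sets input format, and the strict less-than threshold convention are all assumptions.

```python
from collections import defaultdict

def split_gray_sheep(ratings, threshold):
    """Split users into gray sheep and others by degree centrality.

    ratings: dict mapping user -> set of rated items (a two-mode network).
    Two users are linked in the one-mode projection when they share at
    least one rated item; degree centrality here is the count of such links.
    """
    # Step 1: project the two-mode (user-item) network onto users.
    item_to_users = defaultdict(set)
    for user, items in ratings.items():
        for item in items:
            item_to_users[item].add(user)

    neighbors = {user: set() for user in ratings}
    for users in item_to_users.values():
        for u in users:
            neighbors[u] |= users - {u}

    # Step 2: users whose degree falls below the threshold are gray sheep.
    gray = {u for u, nbrs in neighbors.items() if len(nbrs) < threshold}
    others = set(ratings) - gray
    return gray, others

def weighted_f(f_gray, n_gray, f_others, n_others):
    """Final metric: the two F measures weighted by the users in each split."""
    return (f_gray * n_gray + f_others * n_others) / (n_gray + n_others)
```

In the full procedure, an ordinary CF algorithm would then be run on `others` (Step 3) and a popular-item recommender on `gray` (Step 4), with `weighted_f` combining the two evaluation scores.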

  • Problems in the Korean National Family Planning Program (한국가족계획사업(韓國家族計劃事業)의 문제점(問題點))

    • Hong, Jong-Kwan
      • Clinical and Experimental Reproductive Medicine
      • /
      • v.2 no.2
      • /
      • pp.27-36
      • /
      • 1975
    • The success of the family planning program in Korea is reflected in the decrease in the growth rate from 3.0% in 1962 to 2.0% in 1971, and in the decrease in the fertility rate from 43/1,000 in 1960 to 29/1,000 in 1970. However, it would be erroneous to attribute these reductions entirely to the family planning program. Other socio-economic factors, such as the increasing age at marriage and the increasing use of induced abortion, definitely had an impact on the lowered growth and fertility rates. Despite the relative success of the program to date in meeting its goals, there is no room for complacency. Meeting the goal of a further reduction in the population growth rate to 1.3% by 1981 is a much more difficult task than any faced in the past. Not only must fertility be lowered further, but the size of the target population itself will expand tremendously in the late seventies, due to the post-war baby boom of the 1950s reaching reproductive age. Furthermore, it is doubtful that the age at marriage will continue to rise as in the past or that the incidence of induced abortion will continue to increase. Consequently, future reductions in fertility will be more dependent on the performance of the national family planning program, with less assistance from these non-program factors. This paper describes various approaches to help solve these current problems. 1. PRACTICE RATE IN FAMILY PLANNING In 1973, the attitude (approval) and knowledge rates were quite high: 94% and 98%, respectively. But a large gap exists between these and the actual practice rate, which is only 36%. Two factors must be considered in attempting to close the KAP-gap. The first is that social norms still favor a larger family, so increasing the practice rate cannot be done very quickly. The second point to consider is that the family planning program has not yet reached all eligible women.
A 1973 study determined that a large portion, 30% in fact, of all eligible women do not want more children but are not practicing family planning. Thus, future efforts to help close the KAP-gap must focus attention and services on this important, large group of potential acceptors. 2. CONTINUATION RATES Dissatisfaction with the loop and pill has resulted in high discontinuation rates. For example, a 1973 survey revealed that within the first six months of initial loop acceptance, nearly 50% were dropouts, and that within the first four months of initial pill acceptance, nearly 50% were dropouts. These discontinuation rates have risen over the past few years. The high rate of discontinuance obviously decreases contraceptive effectiveness, and has resulted in many unwanted births, which is directly related to the increase in induced abortions. In the future, the family planning program must emphasize improved quality of initial and follow-up services, rather than greater quantity, in order to ensure higher continuation rates and thus more effective contraceptive protection. 3. INDUCED ABORTION As noted earlier, the use of induced abortion has increased yearly. For example, in 1960, the average number of abortions was 0.6 per woman in the 15-44 age range. By 1970, that had increased to 2 abortions per woman. In 1966, 13% of all women between 15-44 had experienced at least one abortion. By 1971, that figure had jumped to 28%. In 1973 alone, the total number of abortions was 400,000. Besides the ever-increasing number of induced abortions, another change is that the population using abortion has shifted since 1965 to include not only the middle class, but also rural and low-income women. In the future, in response to the demand for abortion services among rural and low-income women, the government must provide and support abortion services for these women as a part of the national family planning program. 4.
TARGET SYSTEM Since 1962, the nationwide target system has been used to set a target for each method, and the target number of acceptors is then apportioned out to various sub-areas according to the number of eligible couples in each area. Because these targets are set without consideration for demographic factors, particular tastes, prejudices, and previous patterns of acceptance in the area, a high discontinuation rate for all methods and a high wastage rate for the oral pill and condom result. In the future, to alleviate these problems of the method-based target system, an alternative, such as the weighted-credit system, should be adopted on a nationwide basis. In this system, each contraceptive method is assigned a specific number of points based upon the couple-years of protection (CYP) provided by the method, and no specific targets for each method are given. 5. INCREASE OF STERILIZATION TARGET Two special projects, the hospital-based family planning program and the armed forces program, have greatly contributed to the increasing acceptance of female and male sterilization, respectively. From January to September 1974, 28,773 sterilizations were performed. During the same period in 1975, 46,894 were performed, a 63% increase. If this trend continues, by the end of 1975 approximately 70,000 sterilizations will have been performed. Sterilization is a much better method than both the loop and pill, in terms of more effective contraceptive protection and an almost zero dropout rate. In the future, the family planning program should continue to stress the special programs which make more sterilizations possible. In particular, it should seek to add laparoscopic techniques to facilitate female sterilization acceptance. 6. INCREASE NUMBER OF PRIVATE ACCEPTORS Among current family planning users, approximately 1/3 are in the private sector and thus do not require government subsidy.
The number of private acceptors increases with increasing urbanization and economic growth. To speed this process, the government initiated the special hospital-based family planning program, which is utilized mostly by the private sector. However, in the future, to further hasten the increase of private acceptors, the government should encourage doctors in private practice to provide family planning services, and should provide the contraceptive supplies. This way, those who utilize the private medical system will also be able to receive family planning services and pay for them. Another means of increasing the number of private acceptors is to greatly expand the commercial outlets for pills and condoms beyond the existing service points of drugstores, hospitals, and health centers. 7. IE&C PROGRAM The currently preferred family size is nearly twice as high as needed to achieve a stable population. Also, a strong preference for boys hinders a small family size, as nearly all couples feel they must have at least one or more sons. The IE&C program must, in the future, strive to emphasize the values of the small family and equality of the sexes. A second problem for the IE&C program to work with in the future is the large group of people who approve of family planning and want no more children, but do not practice. The IE&C program must work to motivate these people to accept family planning. And finally, for those who already practice, a future IE&C program must stress continuation of use. The IE&C campaign, to ensure the highest effectiveness, should be based on a detailed factor analysis of contraceptive discontinuance. In conclusion, Korea faces a seriously unfavorable socio-demographic situation in the future unless the population growth rate can be curtailed. In the future, the decrease in fertility will depend solely on the family planning program, as the effect of other socio-economic factors has already been felt to its maximum.
A second serious factor to consider is the increasing number of eligible women due to the 1950's baby boom. Thus, to meet these challenges, the program target must be increased and the program must improve the effectiveness of its current activities and develop new programs.
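The weighted-credit system proposed under the target-system discussion reduces to simple weighted arithmetic: each method earns points in proportion to the couple-years of protection it provides, and program performance is the credit total rather than per-method acceptor counts. A minimal sketch follows; the point values are hypothetical placeholders, since the abstract does not give the actual CYP weights used in the Korean program.

```python
# Hypothetical CYP point values per acceptor; the real 1970s program
# weights are not stated in the abstract.
CYP_POINTS = {
    "sterilization": 10.0,      # long-term protection, highest weight
    "iud": 2.5,                 # multi-year protection per insertion
    "pill_cycle": 1.0 / 13,     # roughly 13 cycles per couple-year
    "condom_gross": 1.0 / 100,  # many units per couple-year
}

def total_credits(acceptors):
    """Sum weighted credits: each method contributes points x acceptors."""
    return sum(CYP_POINTS[method] * n for method, n in acceptors.items())
```

Under such a scheme, field workers are free to promote whichever mix of methods suits local preferences, since only the weighted total counts toward the target.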

    • PDF