• Title/Summary/Keyword: Weighted factor


Relationship between Environmental Exposure and Biological Monitoring Values in Workers Exposed to Styrene (스타이렌 폭로 근로자의 환경중 폭로농도와 생물학적 모니터링에 관한 연구)

  • Paik, Jong-Min;Lee, Jong-Yung;Kim, Jung-Man
    • Journal of Korean Society of Occupational and Environmental Hygiene
    • /
    • v.7 no.2
    • /
    • pp.161-170
    • /
    • 1997
  • This study examined how biological monitoring values change with levels of styrene exposure among industrial workers. It was conducted on 108 workers (64 male, 44 female) employed at FRP, dipping, and coating factories. An improved passive monitor (organic vapor monitor; OVM) was employed to determine exposure levels. Biological monitoring, performed on collected blood and urine samples, included blood styrene concentration, urinary mandelic acid (MA), and urinary phenylglyoxylic acid (PGA). The mean styrene exposure measured by OVM was 21.0 ppm. The highest exposure levels were observed among boat factory workers, laminating workers, and processing workers, in that order (p<0.01). For 11% of the subjects, exposure exceeded the 50 ppm time-weighted average (TWA). The correlation coefficients between biological specimens and exposure level were 0.62 for blood styrene concentration, 0.58 for creatinine-corrected MA, and 0.70 for creatinine-corrected PGA (p<0.01). Regression analyses found exposure level to be relatively important in explaining variance in biological monitoring values; in addition, gender was a significant factor in explaining variance of MA and MA+PGA. Almost half (49%) of the variance in blood styrene concentration was explained by the predictors exposure level, age, gender, duration of employment, and drinking volume during the previous week (p<0.01). A very high correlation (higher than 0.95) was found when the three correction methods (uncorrected, specific gravity, and creatinine) were compared. In conclusion, these findings suggest that the OVM can represent levels of styrene exposure for industrial workers, and the possible use of specific-gravity-corrected samples for biological monitoring is discussed. Exposure level may be predicted from urinary MA and PGA, which could thus serve as biological monitoring indices.
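The correlation and regression analysis described above can be sketched as follows. All data values are invented for illustration; they are not the study's measurements.

```python
import numpy as np

# Hypothetical exposure (ppm styrene, from OVM passive monitors) and
# creatinine-corrected urinary PGA for a handful of workers.
exposure = np.array([5.0, 12.0, 21.0, 35.0, 50.0, 64.0])   # ppm
pga_corrected = np.array([0.2, 0.5, 0.9, 1.4, 2.1, 2.5])   # g/g creatinine

# Pearson correlation coefficient (the study reports r = 0.70 for PGA)
r = np.corrcoef(exposure, pga_corrected)[0, 1]

# Simple linear regression: predict exposure level from the biomarker
slope, intercept = np.polyfit(pga_corrected, exposure, 1)
print(f"r = {r:.2f}, exposure ~ {slope:.1f} * PGA + {intercept:.1f}")
```

With real survey data, the same two calls give both the reported correlation and a prediction equation for exposure from the urinary metabolite.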


Factors Influencing the Adoption of Location-Based Smartphone Applications: An Application of the Privacy Calculus Model (스마트폰 위치기반 어플리케이션의 이용의도에 영향을 미치는 요인: 프라이버시 계산 모형의 적용)

  • Cha, Hoon S.
    • Asia pacific journal of information systems
    • /
    • v.22 no.4
    • /
    • pp.7-29
    • /
    • 2012
  • Smartphones and their applications (i.e., apps) are increasingly penetrating consumer markets. According to a recent report from the Korea Communications Commission, nearly 50% of mobile subscribers in South Korea are smartphone users, which amounts to over 25 million people. In particular, the importance of the smartphone has risen as a geospatially aware device that provides various location-based services (LBS) through its GPS capability. Popular LBS include map and navigation, traffic and transportation updates, shopping and coupon services, and location-sensitive social network services. Overall, the emerging location-based smartphone apps (LBA) offer significant value by providing greater connectivity, personalization, and information and entertainment in a location-specific context. However, the rapid growth of LBA and their benefits have been accompanied by concerns over the collection and dissemination of individual users' personal information through ongoing tracking of their location, identity, preferences, and social behaviors. The majority of LBA users tend to agree and consent to the LBA provider's terms and privacy policy on the use of location data in order to get the immediate services. This tendency further increases the potential risks of unprotected exposure of personal information and serious invasions and breaches of individual privacy. To address the complex issues surrounding LBA, particularly from the user's behavioral perspective, this study applied the privacy calculus model (PCM) to explore the factors that influence the adoption of LBA. According to PCM, consumers are engaged in a dynamic adjustment process in which privacy risks are weighed against the benefits of information disclosure.
Consistent with the principal notion of PCM, we investigated how individual users make a risk-benefit assessment in which personalized service and locatability act as benefit-side factors and information privacy risks act as a risk-side factor accompanying LBA adoption. In addition, we consider the moderating role of trust in the service providers on the prohibiting effects of privacy risks on user intention to adopt LBA. Further, we include perceived ease of use and usefulness as additional constructs to examine whether the technology acceptance model (TAM) can be applied in the context of LBA adoption. The research model with ten (10) hypotheses was tested using data gathered from 98 respondents through a quasi-experimental survey method. During the survey, each participant was asked to navigate a website where an experimental simulation of a LBA allowed the participant to purchase time- and location-sensitive discounted tickets for nearby stores. Structural equation modeling using partial least squares validated the instrument and the proposed model. The results showed that six (6) out of ten (10) hypotheses were supported. On the subject of the core PCM, H2 (locatability ${\rightarrow}$ intention to use LBA) and H3 (privacy risks ${\rightarrow}$ intention to use LBA) were supported, while H1 (personalization ${\rightarrow}$ intention to use LBA) was not supported. Further, we could not find any interaction effects (personalization X privacy risks, H4, and locatability X privacy risks, H5) on the intention to use LBA. In terms of privacy risks and trust, as mentioned above we found a significant negative influence of privacy risks on intention to use (H3), but a positive influence of trust, which supported H6 (trust ${\rightarrow}$ intention to use LBA). The moderating effect of trust on the negative relationship between privacy risks and intention to use LBA was tested and confirmed by supporting H7 (privacy risks X trust ${\rightarrow}$ intention to use LBA).
The two hypotheses regarding the TAM, H8 (perceived ease of use ${\rightarrow}$ perceived usefulness) and H9 (perceived ease of use ${\rightarrow}$ intention to use LBA), were supported; however, H10 (perceived usefulness ${\rightarrow}$ intention to use LBA) was not supported. Results of this study offer the following key findings and implications. First, the application of PCM was found to be a good analysis framework in the context of LBA adoption. Many of the hypotheses in the model were confirmed, and the high value of $R^2$ (i.e., 51%) indicated a good fit of the model. In particular, locatability and privacy risks are found to be the appropriate PCM-based antecedent variables. Second, the existence of a moderating effect of trust in the service provider suggests that the same marginal change in the level of privacy risks may differentially influence the intention to use LBA. That is, while privacy risks increasingly become an important social issue and will negatively influence the intention to use LBA, it is critical for LBA providers to build consumer trust and confidence to successfully mitigate this negative impact. Lastly, we could not find sufficient evidence that the intention to use LBA is influenced by perceived usefulness, which has been very well supported in most previous TAM research. This may suggest that future research should examine the validity of applying TAM, and further extend or modify it, in the context of LBA or other similar smartphone apps.
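The moderation test behind H7 (trust buffering the negative effect of privacy risks) can be illustrated with an interaction term in a regression. The study used PLS structural equation modeling on survey scales; the sketch below uses simulated standardized scores with invented coefficients, purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 98  # sample size matching the study; the data itself is simulated

# Simulated standardized constructs (hypothetical, not the survey data)
privacy_risk = rng.normal(size=n)
trust = rng.normal(size=n)
# Intention: a negative risk effect, a positive trust effect, and a
# positive risk x trust interaction (trust dampens the risk effect)
intention = (-0.4 * privacy_risk + 0.3 * trust
             + 0.2 * privacy_risk * trust
             + rng.normal(scale=0.5, size=n))

# Moderated regression: the interaction column carries the moderation test
X = np.column_stack([np.ones(n), privacy_risk, trust, privacy_risk * trust])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
print(f"risk={beta[1]:.2f}, trust={beta[2]:.2f}, risk x trust={beta[3]:.2f}")
```

A significant positive interaction coefficient is what "trust moderates the privacy-risk effect" means operationally: at higher trust, the slope of intention on privacy risk is less negative.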


Comparative Analysis of GNSS Precipitable Water Vapor and Meteorological Factors (GNSS 가강수량과 기상인자의 상호 연관성 분석)

  • Kim, Jae Sup;Bae, Tae-Suk
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.33 no.4
    • /
    • pp.317-324
    • /
    • 2015
  • GNSS was first proposed for application in weather forecasting in the mid-1980s. Its practical use in GNSS meteorology has since been demonstrated, and related research is ongoing. Precipitable Water Vapor (PWV), calculated from the GNSS signal delays caused by the Earth's troposphere, represents the amount of water vapor in the atmosphere, and it is therefore widely used in the analysis of various weather phenomena such as the monitoring of weather conditions and the detection of climate change. In this study we calculated the PWV using meteorological information from an Automatic Weather Station (AWS) as well as GNSS data processing of a Continuously Operating Reference Station (CORS) in order to analyze the heavy snowfall in the Ulsan area in early 2014. Song's model was adopted for the weighted mean temperature (Tm), which is the most important parameter in the calculation of PWV. The study period is a total of 56 days (February 2013 and February 2014). The average PWV for February 2014 was determined to be 11.29 mm, which is 11.34% lower than that of the heavy snowfall period, and the average PWV for February 2013 was determined to be 10.34 mm, which is 8.41% lower than that of the period without heavy snowfall. In addition, certain meteorological factors obtained from the AWS were compared as well, resulting in a very low correlation of 0.29 with the saturated vapor pressure calculated using the empirical formula of Magnus. The behavioral pattern of PWV tends to change depending on the precipitation type, specifically snow or rain. The PWV showed a sudden increase and a subsequent rapid drop about 6.5 hours before precipitation. It can be concluded that pattern analysis of GNSS PWV is an effective method for analyzing precursor phenomena of precipitation.
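The abstract does not spell out its conversion formulas, but the standard GNSS-meteorology step from zenith wet delay (ZWD) to PWV, using the weighted mean temperature Tm, can be sketched with the widely used Bevis-style constants. The ZWD and Tm values below are illustrative, not the paper's data.

```python
# Conversion of Zenith Wet Delay to Precipitable Water Vapor.
# Constants follow the common Bevis et al. (1992) formulation.
RHO_W = 1000.0        # density of liquid water, kg/m^3
RV = 461.5            # specific gas constant of water vapor, J/(kg K)
K2P = 22.1 * 1e-2     # k2' = 22.1 K/hPa, converted to K/Pa
K3 = 3.739e5 * 1e-2   # k3 = 3.739e5 K^2/hPa, converted to K^2/Pa

def pwv_from_zwd(zwd_m: float, tm_k: float) -> float:
    """PWV (m) from zenith wet delay (m) and weighted mean temperature (K)."""
    pi = 1e6 / (RHO_W * RV * (K3 / tm_k + K2P))  # dimensionless, ~0.15
    return pi * zwd_m

# Example: a 70 mm wet delay at Tm = 275 K gives roughly 11 mm of PWV,
# the order of magnitude reported above for Ulsan in February.
pwv = pwv_from_zwd(0.070, 275.0)
print(f"{pwv * 1000:.2f} mm")
```

The only site-dependent input besides ZWD is Tm, which is why the paper's choice of Song's Tm model matters: the conversion factor pi varies with Tm by a few percent across seasons.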

A Study on the Development of Assessment Index for Catastrophic Incident Warning Signs at Refinery and Petrochemical Plants (정유 및 석유화학플랜트 중대사고 전조신호 평가지표 개발에 관한 연구)

  • Yun, Yong Jin;Park, Dal Jae
    • Korean Chemical Engineering Research
    • /
    • v.57 no.5
    • /
    • pp.637-651
    • /
    • 2019
  • Major accidents such as explosions at refineries and petrochemical plants have caused serious losses of life and property and have had a great impact on the insurance market. For catastrophic incidents occurring in process industries such as refinery and petrochemical plants, only the proximate causes of loss have typically been identified and studied, by property insurers' inspectors or claims adjusters, incident cause investigators, and national forensic service workers. Root cause analysis (RCA), identifying the factors that contributed to the failure and establishing preventive measures before a catastrophic incident occurs, has not been carried out well. In this study, the warning sign criteria of the CCPS catastrophic incident warning sign self-assessment tool, which were derived through the RCA method and the contributing factor analysis method based on the Swiss cheese model principle, were reviewed first. Secondly, in order to determine the major incident warning signs in actual chemical plants, 614 recommendations issued over the last 17 years by loss control engineers of global reinsurers were analyzed. Finally, to facilitate the assessment index for catastrophic incident warning signs, the criteria for the catastrophic incident warning sign index at chemical plants were grouped by type and classified into upper and lower categories. A catastrophic incident warning sign index for a chemical plant was then developed using the weights of each category, derived by applying the analytic hierarchy process (pairwise comparison method) through a questionnaire answered by relevant experts at the chemical plant. It is expected that the final assessment index for catastrophic incident warning signs can be utilized by refinery and petrochemical plants' internal as well as external auditors to assess vulnerability levels related to incident warning signs, and to identify the elements of incident warning signs that need to be tracked and managed to prevent serious incidents in the future.
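The AHP weighting step mentioned above can be sketched as follows. The pairwise judgment matrix is hypothetical, not the experts' questionnaire data; it only illustrates how category weights and a consistency check are derived.

```python
import numpy as np

# Hypothetical pairwise comparisons for three warning-sign categories on
# Saaty's 1-9 scale: A[i][j] = how much more important category i is than j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Approximate priority weights: normalize each column, then average rows
col_normalized = A / A.sum(axis=0)
weights = col_normalized.mean(axis=1)   # sums to 1.0
print(np.round(weights, 3))

# Consistency check: estimate lambda_max from weighted row sums
lambda_max = float((A @ weights / weights).mean())
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58  # Saaty's random index for a 3x3 matrix; CR < 0.1 is acceptable
```

In the paper's setting, one such matrix is filled in per expert and per category level; the resulting weights are what turn the grouped warning-sign criteria into a single weighted index.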

Discount Presentation Framing & Bundle Evaluation: The Effects of Consumption Benefit and Perceived Uncertainty of Quality (묶음제품 가격 할인 제시 프레이밍 효과: 지각된 소비 혜택과 품질 불확실성의 영향을 중심으로)

  • Im, Meeja
    • Asia Marketing Journal
    • /
    • v.14 no.1
    • /
    • pp.53-81
    • /
    • 2012
  • Constructing attractive bundle offers depends on more than an understanding of the distribution of consumer preferences. Consumers are also sensitive to the framing of price information in a bundle offer. In classical economic theory, consumers' utility should not change as long as the total price paid stays the same. However, even when total prices are identical, consumers' preferences toward a bundle product can differ depending on the format of price presentation and the locus of the price discount. A weighted additive model predicts that the impact of a price discount on the overall evaluation of the bundle will be greater when the discount is assigned to the more important product in the bundle (Yadav 1995). Meanwhile, a reference-dependent model asserts that it is better to assign a price discount to a tie-in component that has a negative valuation at its current offer price than to a focal product that has a positive valuation at its current offer price (Janiszewski and Cunha 2004). This paper expands previous research on price discount presentation formats, investigating the reasons for the mixed results of prior research and presenting new mechanisms for the price discount framing effect. Prior research has hypothesized that bundling is used to sell a tie-in component with an offer price above the consumer's reference price together with a focal product whose offer price equals its reference price (e.g., Janiszewski and Cunha 2004). However, this study suggests that a bundling strategy can be used to increase a product's attractiveness through the synergy between components even when the offer prices of the bundle components equal their reference prices. In this context, this study employed in the experiment various realistic bundle sets in which offer prices equal reference prices. Hamilton and Srivastava (2008) demonstrated that when evaluating different partitions of the same total price, consumers prefer partitions in which the price of the high-benefit component is higher. This study determined that their mechanism can be applied to price discount presentation formats. It hypothesized that the price discount framing effect depends not on the negative perception of a tie-in component with an offer price above the reference price, but rather on the consumers' perceived consumption benefit of the bundle product. This research also hypothesized that the mechanism behind the preference for discounting the low-benefit component is that perceived consumption benefit reduces price sensitivity. Furthermore, this study investigated how consumers' concern for quality under a price discount, a factor not considered in previous research, influences price discount framing. Yadav (1995)'s experiment used only one magazine bundle of relatively low quality uncertainty and could not show the influence of perceived uncertainty of quality. This study assumed that as perceived uncertainty of quality increases, the price sensitivity mechanism for assigning the discount to the low-benefit component will strengthen. Further, this research investigated the moderating effect of uncertainty of quality in price discount framing. The results of the experiment showed that when evaluating different partitions of the same total price and the same amount of discount, the partition that discounts the price of the low-benefit component is preferred to the partition that decreases the price of the high-benefit component. This implies that the price discount framing effect depends on the perceived consumption benefit. The results also demonstrated that consumers are more price sensitive to the low-benefit component and less price sensitive to the high-benefit component. Furthermore, the results showed that the influence of the price discount presentation format on the evaluation of the bundle product varies with the perceived uncertainty of quality of the high-consumption-benefit component. As perceived uncertainty of quality gradually increases, the preference for discounts in the price of the low-consumption-benefit component decreases. The results also demonstrate that as perceived uncertainty of quality gradually increases, the effect of price sensitivity on consumption benefit also increases. This paper integrates prior research through a new mechanism of perceived consumption benefit and the moderating effect of perceived quality uncertainty, thus providing a clearer explanation of the price discount framing effect.
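The weighted additive prediction (Yadav 1995) discussed above can be made concrete in a few lines: the bundle evaluation is a weighted sum of component evaluations, so a fixed evaluation lift from a discount counts for more when framed on the more important component. All numbers here are invented for illustration; this sketch formalizes only the baseline model, not the paper's consumption-benefit mechanism.

```python
def weighted_additive_eval(evals, weights):
    """Overall bundle evaluation as a weighted sum of component evaluations."""
    return sum(w * v for w, v in zip(weights, evals))

weights = (0.7, 0.3)     # hypothetical importance: focal product vs. tie-in
base_evals = [5.0, 4.0]  # hypothetical component evaluations before discount
lift = 1.0               # evaluation lift produced by the same discount

discount_on_important = weighted_additive_eval(
    [base_evals[0] + lift, base_evals[1]], weights)
discount_on_minor = weighted_additive_eval(
    [base_evals[0], base_evals[1] + lift], weights)
print(discount_on_important, discount_on_minor)
```

The paper's experimental finding runs opposite to this baseline when consumption benefit differs across components, which is exactly why it proposes perceived consumption benefit, rather than importance weights alone, as the operative mechanism.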


A Folksonomy Ranking Framework: A Semantic Graph-based Approach (폭소노미 사이트를 위한 랭킹 프레임워크 설계: 시맨틱 그래프기반 접근)

  • Park, Hyun-Jung;Rho, Sang-Kyu
    • Asia pacific journal of information systems
    • /
    • v.21 no.2
    • /
    • pp.89-116
    • /
    • 2011
  • In collaborative tagging systems such as Delicious.com and Flickr.com, users assign keywords or tags to their uploaded resources, such as bookmarks and pictures, for future use or sharing purposes. The collection of resources and tags generated by a user is called a personomy, and the collection of all personomies constitutes the folksonomy. The most significant need of folksonomy users is to efficiently find useful resources or experts on specific topics. An excellent ranking algorithm would assign higher rankings to more useful resources or experts. What resources are considered useful in a folksonomic system? Does a standard superior to frequency or freshness exist? The resource recommended by more users with more expertise should be worthy of attention. This ranking paradigm can be implemented through a graph-based ranking algorithm. Two well-known representatives of such a paradigm are PageRank by Google and HITS (Hypertext Induced Topic Selection) by Kleinberg. Both PageRank and HITS assign a higher evaluation score to pages linked to by more higher-scored pages. HITS differs from PageRank in that it utilizes two kinds of scores: authority and hub scores. The ranking objects of these algorithms are limited to Web pages, whereas the ranking objects of a folksonomic system are heterogeneous (i.e., users, resources, and tags). Therefore, uniform application of the voting notion of PageRank and HITS based on the links of a folksonomy would be unreasonable. In a folksonomic system, each link corresponding to a property can have an opposite direction, depending on whether the property is in an active or a passive voice. The current research stems from the idea that a graph-based ranking algorithm could be applied to the folksonomic system using the concept of mutual interactions between entities, rather than the voting notion of PageRank or HITS. The concept of mutual interactions, proposed for ranking Semantic Web resources, enables the calculation of importance scores of various resources unaffected by link directions. The weights of a property representing the mutual interaction between classes are assigned depending on the relative significance of the property to the resource importance of each class. This class-oriented approach is based on the fact that, in the Semantic Web, there are many heterogeneous classes; thus, applying a different appraisal standard for each class is more reasonable. This is similar to the evaluation method of humans, where different items are assigned specific weights, which are then summed up to determine the weighted average. We can check for missing properties more easily with this approach than with other predicate-oriented approaches. A user of a tagging system usually assigns more than one tag to the same resource, and there can be more than one tag with the same subjectivity and objectivity. In the case that many users assign similar tags to the same resource, grading the users differently depending on the assignment order becomes necessary. This idea comes from studies in psychology wherein expertise involves the ability to select the most relevant information for achieving a goal. An expert should be someone who not only has a large collection of documents annotated with a particular tag, but also tends to add documents of high quality to his/her collections. Such documents are identified by the number, as well as the expertise, of users who have the same documents in their collections. In other words, there is a relationship of mutual reinforcement between the expertise of a user and the quality of a document. In addition, there is a need to rank entities related more closely to a certain entity. Considering the property of social media that the popularity of a topic is temporary, recent data should have more weight than old data. We propose a comprehensive folksonomy ranking framework in which all these considerations are dealt with and that can be easily customized to each folksonomy site for ranking purposes. To examine the validity of our ranking algorithm and show the mechanism of adjusting property, time, and expertise weights, we first use a dataset designed for analyzing the effect of each ranking factor independently. We then show the ranking results of a real folksonomy site, with the ranking factors combined. Because the ground truth of a given dataset is not known when it comes to ranking, we inject simulated data whose ranking results can be predicted into the real dataset and compare the ranking results of our algorithm with those of a previous HITS-based algorithm. Our semantic ranking algorithm based on the concept of mutual interaction seems to be preferable to the HITS-based algorithm as a flexible folksonomy ranking framework. Some concrete points of difference are as follows. First, with the time concept applied to the property weights, our algorithm shows superior performance in lowering the scores of older data and raising the scores of newer data. Second, by applying the time concept to the expertise weights as well as to the property weights, our algorithm controls the conflicting influence of expertise weights and enhances the overall consistency of time-valued ranking. The expertise weights of the previous study can act as an obstacle to time-valued ranking because the number of followers increases as time goes on. Third, many new properties and classes can be included in our framework. The previous HITS-based algorithm, based on the voting notion, loses ground in situations where the domain consists of more than two classes, or where other important properties, such as "sent through twitter" or "registered as a friend," are added to the domain. Fourth, there is a big difference in the calculation time and memory use between the two kinds of algorithms. While the multiplication of two matrices has to be executed twice for the previous HITS-based algorithm, this is unnecessary with our algorithm. In our ranking framework, various folksonomy ranking policies can be expressed with the ranking factors combined, and our approach can work even if the folksonomy site is not implemented with Semantic Web languages. Above all, the time weight proposed in this paper will be applicable to various domains, including social media, where time value is considered important.
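The HITS baseline the paper compares against can be sketched in a few lines. The tiny link graph below is made up for illustration; it shows the two mutually reinforcing score vectors and the pair of matrix multiplications per iteration that the paper's mutual-interaction approach avoids.

```python
import numpy as np

# Made-up link graph: adj[i][j] = 1 means node i links to node j.
adj = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

hubs = np.ones(4)
for _ in range(50):
    auths = adj.T @ hubs            # authorities: pointed to by good hubs
    auths /= np.linalg.norm(auths)
    hubs = adj @ auths              # hubs: point to good authorities
    hubs /= np.linalg.norm(hubs)

# Node 2, with the most inlinks, ends up with the top authority score.
print(np.round(auths, 3), np.round(hubs, 3))
```

Each iteration multiplies by the adjacency matrix twice (once transposed), which is the per-step cost the abstract contrasts with its single-pass mutual-interaction scoring.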

Problems in the Korean National Family Planning Program (한국가족계획사업(韓國家族計劃事業)의 문제점(問題點))

  • Hong, Jong-Kwan
    • Clinical and Experimental Reproductive Medicine
    • /
    • v.2 no.2
    • /
    • pp.27-36
    • /
    • 1975
  • The success of the family planning program in Korea is reflected in the decrease in the growth rate from 3.0% in 1962 to 2.0% in 1971, and in the decrease in the fertility rate from 43/1,000 in 1960 to 29/1,000 in 1970. However, it would be erroneous to attribute these reductions entirely to the family planning program. Other socio-economic factors, such as the increasing age at marriage and the increasing use of induced abortion, definitely had an impact on the lowered growth and fertility rates. Despite the relative success of the program to date in meeting its goals, there is no room for complacency. Meeting the goal of a further reduction in the population growth rate to 1.3% by 1981 is a much more difficult task than any faced in the past. Not only must fertility be lowered further, but the size of the target population itself will expand tremendously in the late seventies, due to the post-war baby boom of the 1950's reaching reproductive ages. Furthermore, it is doubtful that the age at marriage will continue to rise as in the past or that the incidence of induced abortion will continue to increase. Consequently, future reductions in fertility will be more dependent on the performance of the national family planning program, with less assistance from these non-program factors. This paper describes various approaches to the solution of these current problems. 1. PRACTICE RATE IN FAMILY PLANNING In 1973, the attitude (approval) and knowledge rates were quite high, 94% and 98% respectively, but a large gap exists between these and the actual practice rate, which is only 36%. Two factors must be considered in attempting to close this KAP-gap. The first is that social norms still favor a larger family, so increasing the practice rate cannot be done very quickly. The second point to consider is that the family planning program has not yet reached all eligible women. A 1973 study determined that a large portion, 30% in fact, of all eligible women do not want more children but are not practicing family planning. Thus, future efforts to help close the KAP-gap must focus attention and services on this important large group of potential acceptors. 2. CONTINUATION RATES Dissatisfaction with the loop and pill has resulted in high discontinuation rates. For example, a 1973 survey revealed that within the first six months after initial loop acceptance nearly 50% were dropouts, and that within the first four months after initial pill acceptance nearly 50% were dropouts. These discontinuation rates have risen over the past few years. The high rate of discontinuance obviously decreases contraceptive effectiveness, and has resulted in many unwanted births, which is directly related to the increase in induced abortions. In the future, the family planning program must emphasize improved quality of initial and follow-up services, rather than more quantity, in order to ensure higher continuation rates and thus more effective contraceptive protection. 3. INDUCED ABORTION As noted earlier, the use of induced abortion has increased yearly. For example, in 1960, the average number of abortions was 0.6 per woman in the 15-44 age range; by 1970, that had increased to 2 abortions per woman. In 1966, 13% of all women between 15-44 had experienced at least one abortion; by 1971, that figure had jumped to 28%. In 1973 alone, the total number of abortions was 400,000. Besides the ever-increasing number of induced abortions, another change is that those who use abortion have shifted since 1965 to include not only the middle class, but also rural and low-income women. In the future, in response to the demand for abortion services among rural and low-income women, the government must provide and support abortion services for these women as a part of the national family planning program. 4. TARGET SYSTEM Since 1962, the nationwide target system has been used to set a target for each method, and the target number of acceptors is then apportioned out to various sub-areas according to the number of eligible couples in each area. Because these targets are set without consideration for demographic factors, particular tastes, prejudices, and previous patterns of acceptance in the area, a high discontinuation rate for all methods and a high wastage rate for the oral pill and condom result. In the future, to alleviate these problems of the method-based target system, an alternative, such as the weighted-credit system, should be adopted on a nationwide basis. In this system, each contraceptive method is assigned a specific number of points based upon the couple-years of protection (CYP) provided by the method, and no specific targets for each method are given. 5. INCREASE OF STERILIZATION TARGET Two special projects, the hospital-based family planning program and the armed forces program, have greatly contributed to the increasing acceptance of female and male sterilization respectively. From January to September 1974, 28,773 sterilizations were performed. During the same period in 1975, 46,894 were performed, a 63% increase. If this trend continues, by the end of 1975 approximately 70,000 sterilizations will have been performed. Sterilization is a much better method than both the loop and pill, in terms of more effective contraceptive protection and an almost zero dropout rate. In the future, the family planning program should continue to stress the special programs which make more sterilizations possible. In particular, it should seek to add laparoscopic techniques to facilitate female sterilization acceptance rates. 6. INCREASE NUMBER OF PRIVATE ACCEPTORS Among current family planning users, approximately 1/3 are in the private sector and thus do not require government subsidy. The number of private acceptors increases with increasing urbanization and economic growth. To speed this process, the government initiated the special hospital-based family planning program, which is utilized mostly by the private sector. However, in the future, to further hasten the increase of private acceptors, the government should encourage doctors in private practice to provide family planning services, and should provide the contraceptive supplies. This way, those who utilize the private medical system will also be able to receive family planning services and pay for them. Another means of increasing the number of private acceptors is to greatly expand the commercial outlets for pills and condoms beyond the existing service points of drugstores, hospitals, and health centers. 7. IE&C PROGRAM The current preferred family size is nearly twice as high as needed to achieve a stable population. Also, a strong boy preference hinders a small family size, as nearly all couples feel they must have at least one or more sons. The IE&C program must, in the future, strive to emphasize the values of the small family and the equality of the sexes. A second problem for the IE&C program to work with in the future is the large group of people who approve of family planning and want no more children, but do not practice. The IE&C program must work to motivate these people to accept family planning. And finally, for those who already practice, the IE&C program must stress continuation of use. The IE&C campaign, to ensure the highest effectiveness, should be based on a detailed factor analysis of contraceptive discontinuance. In conclusion, Korea faces a seriously unfavorable socio-demographic situation in the future unless the population growth rate can be curtailed. In the future, the decrease in fertility will depend solely on the family planning program, as the effect of other socio-economic factors has already been maximally felt. A second serious factor to consider is the increasing number of eligible women due to the 1950's baby boom. Thus, to meet these challenges, the program target must be increased and the program must improve the effectiveness of its current activities and develop new programs.
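The weighted-credit (CYP-based) target system described in section 4 can be sketched as follows. The point values per method are invented for illustration; the paper specifies only that points are proportional to the couple-years of protection each method provides.

```python
# Hypothetical CYP point values per acceptor of each method. Under a
# weighted-credit system, sub-areas earn credit for total protection
# delivered rather than meeting per-method acceptor quotas.
CYP_POINTS = {"sterilization": 10.0, "loop": 2.5, "pill": 1.0, "condom": 0.5}

def program_credit(acceptors: dict) -> float:
    """Total weighted credit for a sub-area's mix of accepted methods."""
    return sum(CYP_POINTS[method] * n for method, n in acceptors.items())

# A sub-area's hypothetical yearly acceptor counts
area_a = {"sterilization": 30, "loop": 100, "pill": 400, "condom": 200}
print(program_credit(area_a))  # 30*10 + 100*2.5 + 400*1 + 200*0.5 = 1050.0
```

Because long-acting methods earn proportionally more credit, such a scheme removes the incentive to push high-wastage methods like the pill and condom just to hit method-specific targets, which is the problem the paper attributes to the 1962 target system.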
