• Title/Summary/Keyword: Relevant Use

Search Results: 1,718

Discourse Analysis of Business Chinese and the Comparison of Negotiation Culture between Korea and China - Focused on Business Emails Related to 'Napkin Holder' Imports - (무역 중국어 담화 고찰과 한중 협상문화 비교 - '냅킨꽂이' 수입 관련 비즈니스 이메일을 중심으로 -)

  • Choi, Tae-Hoon
    • Cross-Cultural Studies
    • /
    • v.50
    • /
    • pp.103-130
    • /
    • 2018
  • This research explores the linguistic features and functions of Chinese used for business trading purposes, based on a discourse analysis of a case in which a Korean buyer and a Chinese supplier exchanged Internet-based e-mails. The research questions concern, first, the linguistic functions and characteristics of Chinese identified in the e-mails of this trade case; second, the use of trade-specific Chinese terms; and third, the negotiation strategies, shaped by each party's cultural value system, that were used to resolve conflicts of interest between the buyer and the supplier in the course of negotiating the business contract. The participants are a Korean buyer, James, and a Chinese supplier, Sonya. The data consist of 74 e-mails exchanged between the two parties to initiate and complete the trade of a single item, napkin holders. The study is based on discourse analysis and empirically analyzes models of Chinese linguistic functions and features. The findings are as follows. First, the Chinese functions used and sequenced in this trade case are procedure, request, informing, negotiation, and persuasion. Second, the essential trade terms used in this business interaction involve 1) ordering and price negotiation, 2) marking the origin of the products, 3) arranging product examination and customs declaration for the imported items, 4) preparing the necessary legal documents, and 5) packaging and transporting the products. Third, the impact of similarities and differences between Korean and Chinese cultural value systems on negotiation and conflict resolution is discussed in terms of culturally based techniques such as face-saving and uncertainty-avoiding strategies intended to prevent misunderstandings between the parties. The concluding part of the study discusses implications for practical Chinese language education drawing on the linguistic functions, features, and language strategies useful in business trading, and the importance of intercultural communication styles based on the similar or different cultural values identified between the two parties.

The Records and Archives Administrative Reform in China in the 1930s (1930년대 중국 문서당안 행정개혁론의 이해)

  • Lee, Won-Kyu
    • The Korean Journal of Archival Studies
    • /
    • no.10
    • /
    • pp.276-322
    • /
    • 2004
  • Historical interest in 1930s China has mostly focused on the political character of the National Government (國民政府), which was established by the KMT (中國國民黨) as a result of national unification. China certainly had a chance to construct a modern state through the establishment of this unified revolutionary government. But it was also a time of expanding national crises that threatened the existence of the country, such as the Manchurian Incident and the Sino-Japanese War, as well as domestic turmoil. There is therefore good reason to examine the character and pattern of the responses of the political powers of those days. As recent studies show, the way a revolutionary regime exercises political power can be grasped by understanding its internal operating system. This study starts from the fact that the Nationalist Government carried out an administrative reform aimed at "administrative efficiency" in the mid-1930s, but it emphasizes the seriousness of the problem and its solution rather than the political background or results. The "Committee on Administrative Efficiency (行政效率委員會)", the center of the administrative reform movement established in 1934, examined plans to execute the reform through legislation by the Executive Council (行政院) on the basis of relevant studies. The committee claimed that the construction of a modern state should no longer be pursued through political revolution but through gradual improvement and daily reform, and that the operation of the government should become modern, scientific, and efficient. Among the many fields of administrative reform, records and archives administration (文書檔案行政) was studied intensively from the initial stage, because the subject had already been discussed at length. Reformers recognized that records and archives were the basic tools of work performance and general activity but remained an inefficient field despite large numbers of staff, and, above all, that archival reform would provoke fewer conflicts than reforms of finance, organization, or personnel. In the field of records administration, the key points that records should be written simply, that the process of record handling should be clear, and that delays should be prevented had already been presented at a records administration meeting in 1922. That is, no unified law on record management had been established, so each government organization followed conventional custom or pursued improvement independently. It was at another records administration workshop of the Nationalist Government in 1933 that a new trend toward unified system improvement appeared. Participants decided to unify the format of official records, to use markers and sections, to unify the registration of incoming and outgoing records, and to strengthen the examination of record handling. But the method of record handling was still not unified, so the key point of the records administrative reform was to establish a unified, standard record management system that would prevent duplication by simplifying handling procedures and concentrate handling in exclusive organizations. From the foundation of the Republic of China to the 1930s, there was no major change in archives administration, and archives management methods were prescribed differently even within the same section and the same department.
Therefore, the tasks were to centralize the scattered management systems operated in each section, to establish unified standards for filing and retention periods, and to improve retrieval through classification and proper numbering. In particular, the dual operation of record registration and archives registration meant that numbering and classification systems produced different results, making strict management through mutual comparison, retrieval, and application impossible. Various other problems, such as filing tools, arrangement methods, preservation facilities and equipment, lending services, and methods of use, were also raised. In the course of studying improvements to this system, reformers recognized that records and archives are essentially the same thing and devised a continuous management method for records and archives, the "Records and Archives Chain Management Method (文書檔案連鎖法)", as a potential alternative. Its principles were that records and archives should be managed in a unified way in each organization by a general record reception section and a general archives section under the principle of task centralization; that a consistent classification scheme, decided in advance according to organizational structure and work functions, should be used; and that an identical numbering system should be used in both the record management stage and the archives management stage by means of a card-type register. Although this "Records and Archives Chain Management Method" reached the stage of test application in several organizations, it was not adopted as a regular system and was discontinued, because the administrative reform of the Nationalist Government itself was cut short by the outbreak of the Sino-Japanese War. Even though the administrative reform of the mid-1930s produced little more than an experiment rather than practical results, it showed that the Nationalist Government's reform of tradition and custom, aimed at constructing a modern state, was not confined to politics, and that the weak basis of government operation was an obstacle to the realization of the revolutionary regime's political power. Though records and archives administrative reform was postponed, the consciousness of modern records and archives administration, and comprehensive studies of it, should be understood as beginning with this examination of administrative reform.

A Study on the Protection and Management System of the Southwestern Coast Tidal Flat for Inscription in the World Heritage List (서남해안 갯벌의 세계유산 등재를 위한 보호 및 관리체계 연구)

  • Moon, Kyong-O
    • Korean Journal of Heritage: History & Science
    • /
    • v.48 no.3
    • /
    • pp.80-95
    • /
    • 2015
  • The purpose of this study is to establish an effective protection and management system for the World Heritage (WH) nomination of the Southwestern Coast Tidal Flat (SCTF) by proposing a protection and management model. The SCTF has the potential to become a representative best practice for achieving sustainable development for human society. Because the SCTF has potential Outstanding Universal Value (OUV) for WH nomination, harmony between humans and nature in the wise use of natural resources needs to be pursued. It is necessary to present the SCTF's protection and management system and to analyze the present status of the regions on the Tentative List by comparison with cases already inscribed as WH. For better protection of the nominated areas, the SCTF should be expanded through additional protective designations. For the management system, the two separate structures under the Department of Culture & Tourism and the Department of Oceans & Fisheries need to be reconciled; because of this overlapping management structure, management of the nominated sites has been inefficient and a long-term management plan is lacking. Therefore, it is necessary to integrate the conflicting management systems of each local government and to establish a long-term, integrated management plan. For efficient and sustainable protection and management, it is essential to set up a collaborative system integrating stakeholders such as central and local governments, academic organizations, local residents, and NGOs. As in the case of the Wadden Sea, a combined community system of these stakeholders should be established. Because it is essential for local residents to understand the basic concepts of protection and management, capacity-building for local people is necessary. The protection and management structure should be set up through bottom-up processes; that is, it should be based on thorough research on local society as well as thorough communication with local residents in making relevant laws and policies. This study also proposes a plan for better conservation and management of the SCTF.

Extraction of Essential Design Elements for Urban Parks - Based on the Analysis of 2017 Satisfaction Survey of Park Use in Seoul - (도시공원의 필수 설계요소 추출 - 2017년 서울시 공원이용 만족도 조사의 결과 분석을 바탕으로 -)

  • Lee, Jae Ho;Kim, Soonki
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.46 no.6
    • /
    • pp.41-48
    • /
    • 2018
  • The aim of this study is to provide foundational knowledge on how to enhance user satisfaction with urban parks. The study seeks to identify essential factors that influence user satisfaction and to provide better design strategies for future parks as well as for the reorganization of existing ones. Using data from a survey conducted by the city of Seoul in 2017, factor analysis was applied to extract essential factors: facility conditions, landscape and scenery, safety, and kindness. A regression analysis was then used to infer causal relationships between these independent variables and the dependent variable, user satisfaction. The results revealed that the variable most significantly and positively related to user satisfaction in urban parks was safety (${\beta}=0.276$, p<.000), followed by landscape and scenery (${\beta}=0.230$, p<.000), facility conditions (${\beta}=0.215$, p<.000), and kindness (${\beta}=0.208$, p<.000). The results indicate that, in future urban park designs, planners and designers should prioritize safety by adopting crime prevention through environmental design (CPTED). In addition, they should weigh heavily the selection and provision of facilities relevant to the intended use, as well as well-arranged and well-managed plants and trees. Based on the results of an importance-performance analysis (IPA), the factor most urgently in need of improvement appeared to be kindness; however, the influence of kindness on user satisfaction was smaller than that of safety and of landscape and scenery. This study demonstrates that, to maximize user satisfaction in urban park design processes and to provide more valuable spaces for users, it is necessary to secure park safety and to create a well-composed landscape and scenery. Future research should provide more detailed urban park design strategies corresponding to the importance of the factors identified in this study.
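The two-step analysis described in this abstract (factor extraction from survey items, then regression of overall satisfaction on the extracted factors) can be sketched as follows. This is a minimal illustration, not the authors' code: the CSV file, the column names, and the four factor labels are hypothetical placeholders.

```python
# Minimal sketch of factor analysis followed by OLS regression on factor scores.
# File name and column names are hypothetical; real factor labels must be assigned
# by inspecting the loadings, not assumed as below.
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

survey = pd.read_csv("seoul_park_survey_2017.csv")   # hypothetical survey file
items = survey.filter(like="item_")                  # Likert-scale question columns
overall = survey["overall_satisfaction"]             # dependent variable

# Extract four latent factors from the standardized items.
scores = FactorAnalysis(n_components=4, random_state=0).fit_transform(
    StandardScaler().fit_transform(items)
)
factors = pd.DataFrame(scores, columns=["facility", "landscape", "safety", "kindness"])

# Regress overall satisfaction on the factor scores; the coefficients indicate the
# relative influence of each factor on user satisfaction.
model = sm.OLS(overall, sm.add_constant(factors)).fit()
print(model.summary())
```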

A Study on the Revitalization of BIM in the Field of Architecture Using AHP Method (AHP 기법을 이용한 건축분야 BIM 활성화 방안 연구)

  • Kim, Jin-Ho;Hwang, Chan-Gyu;Kim, Ji-Hyung
    • Journal of the Korea Institute of Building Construction
    • /
    • v.22 no.5
    • /
    • pp.473-483
    • /
    • 2022
  • BIM (Building Information Modeling) is a technology that can manage information throughout the entire life cycle of the construction industry and serves as a platform for improving productivity and integrating the industry as a whole. BIM is now actively applied in developed countries, and its use at overseas construction sites is increasing because it shortens construction periods and saves budgets. In the domestic construction sector, however, a lack of institutional foundations and technical limitations have kept the utilization of BIM low, so various activation measures and an institutional framework will need to be established for its early adoption in Korea. As part of the research toward the domestic settlement and revitalization of BIM, this study derived the key factors necessary for the development of the construction industry through brainstorming and expert surveys using the AHP (Analytic Hierarchy Process) and analyzed the relative importance of each factor. Preliminary surveys of an expert group yielded 3 items at level 1, 9 items at level 2, and 27 items at level 3, and priorities were analyzed through pairwise comparisons. The AHP analysis found that the relative importance weight of the policy aspect was highest at level 1, and that the policy factors at level 2 and the cost-basis and incentive-system introduction factors at level 3 were considered most important. These findings show that policy guidance and institutions underlying the activation of BIM are relatively more important than research and development or corporate innovation, and that the preparation of policy plans by public institutions should be the first priority. Therefore, the development of a policy system or guidelines must precede the next stage of activation. The use of BIM technologies will contribute not only to improving the productivity of the construction industry but also to the overall development and growth of the industry. The results of this study are expected to serve as useful information when the central government, relevant local governments, and related public institutions establish policies for activating BIM.
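The core arithmetic of the AHP used in this study, deriving priority weights from a pairwise comparison matrix via its principal eigenvector and checking judgment consistency, can be sketched as follows. This is a generic illustration, not the authors' survey data; the three level-1 criteria and the judgment values are hypothetical.

```python
# Minimal AHP sketch: priority weights from the principal eigenvector of a pairwise
# comparison matrix, plus the consistency ratio (CR < 0.1 is the usual acceptance rule).
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale for three hypothetical level-1 criteria
# (e.g., policy vs. technology vs. cost).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```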

A Study on the Meaning and Strategy of Keyword Advertising Marketing

  • Park, Nam Goo
    • Journal of Distribution Science
    • /
    • v.8 no.3
    • /
    • pp.49-56
    • /
    • 2010
  • At the initial stage of Internet advertising, banner advertising came into fashion. As the Internet became a central part of daily life and competition in the online advertising market grew fierce, there was not enough space for banner advertising, which rushed only to portal sites, and advertising prices surged. Consequently, the high-cost, low-efficiency problems of banner advertising were raised, which led to the emergence of keyword advertising as a new type of Internet advertising to replace it. In the early 2000s, when Internet advertising became active, display advertising, including banner advertising, dominated the Net. However, display advertising gradually declined and registered negative growth in 2009, whereas keyword advertising grew rapidly and began to outdo display advertising from 2005. Keyword advertising is the technique of exposing relevant advertisements at the top of search-result pages when a user searches for a keyword. Instead of exposing advertisements to unspecified individuals as banner advertising does, keyword advertising, a targeted advertising technique, shows advertisements only when customers search for a desired keyword, so that only highly prospective customers see them; in this context it is also referred to as search advertising. It is regarded as more aggressive advertising with a higher hit rate than earlier forms in that, instead of the seller discovering customers and running advertisements for them as with TV, radio, or banner advertising, it exposes advertisements to customers who are already visiting. Keyword advertising makes it possible for a company to seek publicity online simply by making use of a single word and to achieve maximum efficiency at minimum cost. Its strong point is that customers can directly reach the products in question, making it more efficient than mass-media advertising such as TV and radio. Its weak point is that a company must register its advertisement on each and every portal site, finds it hard to exercise substantial supervision over its advertisements, and risks advertising expenses exceeding profits. Keyword advertising serves as one of the most appropriate advertising methods for the sales and publicity of small and medium enterprises, which need maximum advertising effect at low advertising cost. At present, keyword advertising is divided into CPC advertising and CPM advertising. The former, known as the most efficient technique, is based on a metered-rate system: a company pays for the number of clicks on a searched keyword. It is representatively adopted by Overture, Google's AdWords, Naver's Clickchoice, and Daum's Clicks. CPM advertising depends on a flat-rate payment system, making a company pay on the basis of the number of exposures rather than the number of clicks; the price is fixed per 1,000 exposures, and it is mainly adopted by Naver's Timechoice, Daum's Speciallink, and Nate's Speedup. At present, the CPC method is most frequently adopted.
The weak point of the CPC method is that advertising costs can rise through repeated clicks from the same IP. If a company makes good use of strategies for maximizing the strong points of keyword advertising and complementing its weak points, it is highly likely to turn visitors into prospective customers. Accordingly, an advertiser should analyze customers' behavior, approach them in a variety of ways to find out what they want, and use multiple keywords when running ads. When first running an ad, the advertiser should give priority to keyword selection, considering how many search-engine users will click the keyword in question and how much the advertisement will cost. Because the popular keywords that search-engine users frequently use are expensive in terms of cost per click, advertisers without a large budget at the initial phase should pay attention to detailed keywords suited to their budget. Detailed keywords, also referred to as peripheral or extension keywords, are combinations of major keywords. Most keyword advertisements are text-based. The biggest strength of text-based advertising is that it looks like search results and so causes little antipathy, but it fails to attract much attention precisely because most keyword advertising is text. Image-embedded advertising is easier to notice because of its images, but it is exposed on the lower part of a web page and is clearly perceived as an advertisement, which leads to a low click-through rate; its strength is that its prices are lower than those of text-based advertising. If a company owns a logo or a product that people easily recognize, it is well advised to make good use of image-embedded advertising to attract Internet users' attention. Advertisers should analyze their logs and examine customers' responses to site events and product composition as a way of monitoring their behavior in detail. Keyword advertising also allows them to analyze the advertising effects of exposed keywords through log analysis. Log analysis refers to a close analysis of the current situation of a site based on information about visitors, drawn from the number of visitors and page views and from cookie values. A user's IP, the pages used, the time of use, and cookie values are stored in the log files generated by each Web server. Because log files contain a huge amount of data and direct analysis is practically impossible, they are analyzed using log-analysis solutions. The generic information that can be extracted from log-analysis tools includes the total number of page views, the average number of page views per day, the number of basic page views, the number of page views per visit, the total number of hits, the average number of hits per day, the number of hits per visit, the number of visits, the average number of visits per day, the net number of visitors, the average number of visitors per day, one-time visitors, visitors who have come more than twice, and average usage hours.
Such data are also useful for analyzing the situation and current status of rival companies and for benchmarking. Because keyword advertising exposes advertisements exclusively on search-result pages, competition among advertisers attempting to preoccupy popular keywords is very fierce. Some portal sites keep giving priority to existing advertisers, whereas others give all advertisers a chance to purchase the keywords in question once the advertising contract is over. On sites that give priority to established advertisers, an advertiser who relies on keywords sensitive to seasons and timing may as well purchase a vacant advertising slot in advance so as not to miss the appropriate timing. Naver, however, does not give priority to existing advertisers for any keyword advertisements; in this case, one can preoccupy keywords by entering into a contract after confirming the contract period. This study examines marketing for keyword advertising and presents effective strategies for keyword advertising marketing. At present, the Korean CPC advertising market is virtually monopolized by Overture, whose strong points are that it is based on the CPC charging model and that its advertisements are registered at the top of the most representative portal sites in Korea; these advantages make it the most appropriate medium for small and medium enterprises. However, the CPC method of Overture has weak points as well: it is not the only, or a perfect, advertising model among search advertisements in the online market. It is therefore necessary that small and medium enterprises, including independent shopping malls, complement the weaknesses of the CPC method and make good use of strategies for maximizing its strengths so as to increase their sales and create points of contact with customers.
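The difference between the CPC and CPM charging models described above comes down to simple cost arithmetic, sketched below with hypothetical prices and traffic figures (none of these numbers come from the article).

```python
# Hypothetical comparison of the two keyword-advertising pricing models:
# CPC bills per click, CPM bills per 1,000 exposures regardless of clicks.
def cpc_cost(clicks: int, price_per_click: float) -> float:
    """Metered-rate (CPC) cost: pay only for clicks on the keyword ad."""
    return clicks * price_per_click

def cpm_cost(impressions: int, price_per_thousand: float) -> float:
    """Flat-rate (CPM) cost: pay per 1,000 exposures, independent of clicks."""
    return impressions / 1000 * price_per_thousand

impressions = 200_000            # times the ad was shown on search-result pages
ctr = 0.02                       # assumed click-through rate (2%)
clicks = int(impressions * ctr)

print("CPC cost:", cpc_cost(clicks, price_per_click=300))          # e.g. 300 KRW per click
print("CPM cost:", cpm_cost(impressions, price_per_thousand=500))  # e.g. 500 KRW per 1,000 views
```

Which model is cheaper for a given campaign depends on the click-through rate and the unit prices, which is why the abstract stresses choosing keywords and pricing models to fit the advertiser's budget.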


Application of Westgard Multi-Rules for Improving Nuclear Medicine Blood Test Quality Control (핵의학 검체검사 정도관리의 개선을 위한 Westgard Multi-Rules의 적용)

  • Jung, Heung-Soo;Bae, Jin-Soo;Shin, Yong-Hwan;Kim, Ji-Young;Seok, Jae-Dong
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.16 no.1
    • /
    • pp.115-118
    • /
    • 2012
  • Purpose: The Levey-Jennings chart flags measurement values that deviate from the tolerance range (mean ${\pm}2SD$ or ${\pm}3SD$). The more advanced Westgard Multi-Rules, by contrast, are actively recommended as a more efficient and specialized form of internal quality control for hospital certification. To apply the Westgard Multi-Rules to quality control, a credible quality control substance and target value are required. However, because specimen tests commonly use the quality control substances provided within the test kit, calculating target values is difficult owing to frequent changes in concentration values and the insufficient credibility of the control substances. This study attempts to improve the professionalism and credibility of quality control by applying the Westgard Multi-Rules and calculating credible target values with a commercialized quality control substance. Materials and Methods: This study used Immunoassay Plus Control Levels 1, 2, and 3 of Company B as the quality control substance for Total T3, the thyroid test performed at our hospital. The target value was established as the mean of 295 measurements collected over one month, excluding values that deviated from ${\pm}2SD$, and was entered using the hospital's quality control program. The 12s, 22s, 13s, 2 of 32s, R4s, 41s, $10\bar{x}$, and 7T Westgard Multi-Rules were applied to the Total T3 test, which was conducted 194 times over 20 days in August. Based on the applied rules, the data were classified into random errors and systematic errors for analysis. Results: The target values of Total T3 for quality control substances 1, 2, and 3 were established as 84.2 ng/$dl$, 156.7 ng/$dl$, and 242.4 ng/$dl$, with standard deviations of 11.22 ng/$dl$, 14.52 ng/$dl$, and 14.52 ng/$dl$, respectively. Error-type analysis after applying the Westgard Multi-Rules to these target values gave the following results: for random errors, 12s was detected 48 times, 13s 13 times, and R4s 6 times; for systematic errors, 22s was detected 10 times, 41s 11 times, 2 of 32s 17 times, and $10\bar{x}$ 10 times, while 7T did not occur. For uncontrollable random errors, the entire experimental process was rechecked and greater emphasis was placed on re-testing. For controllable systematic errors, the cause of the error was investigated, recorded in the action form, and reported to the internal quality control committee if necessary. Conclusions: This study applied the Westgard Multi-Rules using a commercialized control substance and established target values. As a result, precise analysis of random and systematic errors was achieved through the 12s, 22s, 13s, 2 of 32s, R4s, 41s, $10\bar{x}$, and 7T rules, and ideal quality control was achieved by analyzing all data within the ${\pm}3SD$ range. In this regard, quality control based on the systematic application of the Westgard Multi-Rules is more effective than the Levey-Jennings chart alone and can maximize error detection.
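The Westgard multi-rules named in this abstract are threshold checks on control values expressed as deviations from the target mean in SD units. The sketch below is a simplified single-control-level illustration, not the hospital's QC program; the measurement history is hypothetical, while the target mean and SD are the level-1 values reported in the abstract.

```python
# Minimal sketch of checking a run of control values against a few Westgard rules.
# Simplified to one control level (R4s is approximated within that level).
import numpy as np

def westgard_flags(values, mean, sd):
    """Return the rules violated by the most recent control value."""
    z = (np.asarray(values, dtype=float) - mean) / sd
    flags = []
    if abs(z[-1]) > 2:
        flags.append("12s (warning)")
    if abs(z[-1]) > 3:
        flags.append("13s (random error)")
    if len(z) >= 2 and (all(z[-2:] > 2) or all(z[-2:] < -2)):
        flags.append("22s (systematic error)")
    if len(z) >= 2 and (z[-2:].max() - z[-2:].min()) > 4:
        flags.append("R4s (random error)")
    if len(z) >= 4 and (all(z[-4:] > 1) or all(z[-4:] < -1)):
        flags.append("41s (systematic error)")
    if len(z) >= 10 and (all(z[-10:] > 0) or all(z[-10:] < 0)):
        flags.append("10x (systematic error)")
    return flags

# Hypothetical Total T3 control level 1 run: target 84.2 ng/dl, SD 11.22 ng/dl.
history = [83.0, 86.1, 95.5, 108.0, 109.2]   # most recent value last
print(westgard_flags(history, mean=84.2, sd=11.22))
```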


Research Trend Analysis Using Bibliographic Information and Citations of Cloud Computing Articles: Application of Social Network Analysis (클라우드 컴퓨팅 관련 논문의 서지정보 및 인용정보를 활용한 연구 동향 분석: 사회 네트워크 분석의 활용)

  • Kim, Dongsung;Kim, Jongwoo
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.1
    • /
    • pp.195-211
    • /
    • 2014
  • Cloud computing services provide IT resources as on-demand services. Cloud computing is considered a key concept that will lead a shift from an ownership-based paradigm to a pay-per-use paradigm, which can reduce the fixed cost of IT resources and improve flexibility and scalability. As IT services, cloud services have evolved from earlier, similar computing concepts such as network computing, utility computing, server-based computing, and grid computing, so research into cloud computing is closely related to and combined with various relevant computing research areas. To identify promising research issues and topics in cloud computing, it is necessary to understand its research trends more comprehensively. In this study, we collect bibliographic and citation information for cloud computing research papers published in major international journals from 1994 to 2012, and analyze macroscopic trends and network changes in the citation relationships among papers and the co-occurrence relationships of keywords using social network analysis measures. Through the analysis, we identify the relationships and connections among research topics in cloud computing related areas and highlight new potential research topics. In addition, we visualize dynamic changes of research topics relating to cloud computing using a proposed cloud computing "research trend map." A research trend map visualizes the positions of research topics in a two-dimensional space whose dimensions are the frequency of keywords (X-axis) and the rate of increase in the degree centrality of keywords (Y-axis). Based on the values of these two dimensions, the space is divided into four areas: maturation, growth, promising, and decline. An area with high keyword frequency but a low rate of increase in degree centrality is defined as a mature technology area; the area where both keyword frequency and the rate of increase in degree centrality are high is a growth technology area; the area where keyword frequency is low but the rate of increase in degree centrality is high is a promising technology area; and the area where both are low is a declining technology area. Based on this method, cloud computing research trend maps make it possible to easily grasp the main research trends in cloud computing and to explain the evolution of research topics. According to the analysis of citation relationships, research papers on security, distributed processing, and optical networking for cloud computing rank at the top by the PageRank measure. From the analysis of keywords in research papers, cloud computing and grid computing showed high centrality in 2009, and keywords dealing with main elemental technologies such as data outsourcing, error detection methods, and infrastructure construction showed high centrality in 2010~2011. In 2012, security, virtualization, and resource management showed high centrality. Moreover, interest in the technical issues of cloud computing was found to increase gradually. From the annual cloud computing research trend maps, it was verified that security is located in the promising area, virtualization has moved from the promising area to the growth area, and grid computing and distributed systems have moved to the declining area.
The results indicate that distributed systems and grid computing received much attention as similar computing paradigms in the early stage of cloud computing research, a period focused on understanding and investigating cloud computing as an emergent technology and linking it to established computing concepts. After this early stage, security and virtualization technologies became the main issues in cloud computing, which is reflected in their movement from the promising area to the growth area in the research trend maps. Moreover, this study reveals that current research in cloud computing has rapidly shifted from a focus on technical issues to a focus on application issues, such as SLAs (Service Level Agreements).
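The research trend map described above places each keyword by its frequency and by the change in its degree centrality in the keyword co-occurrence network, then assigns it to one of the four areas. A minimal sketch of that idea is shown below, using the networkx library, toy keyword data, and arbitrary cut-off values; this is not the authors' dataset or code.

```python
# Toy sketch of the "research trend map" quadrant assignment: frequency (X) and the
# change in degree centrality between two periods (Y, a simple proxy for the rate of
# increase) decide maturation / growth / promising / decline.
import networkx as nx

def cooccurrence_graph(papers):
    """Build a keyword co-occurrence graph from per-paper keyword lists."""
    g = nx.Graph()
    for kws in papers:
        for i, a in enumerate(kws):
            for b in kws[i + 1:]:
                g.add_edge(a, b)
    return g

period1 = [["cloud", "grid computing"], ["cloud", "virtualization"]]
period2 = [["cloud", "security"], ["cloud", "virtualization"], ["security", "SLA"]]

c1 = nx.degree_centrality(cooccurrence_graph(period1))
c2 = nx.degree_centrality(cooccurrence_graph(period2))
freq = {}
for kws in period2:
    for k in kws:
        freq[k] = freq.get(k, 0) + 1

def quadrant(kw, freq_cut=1, growth_cut=0.0):
    growth = c2.get(kw, 0.0) - c1.get(kw, 0.0)   # change in degree centrality
    high_f, high_g = freq[kw] > freq_cut, growth > growth_cut
    return {(True, True): "growth", (True, False): "maturation",
            (False, True): "promising", (False, False): "decline"}[(high_f, high_g)]

for kw in freq:
    print(kw, "->", quadrant(kw))
```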

An Integrated Method of New Casuistry and Specified Principlism as Nursing Ethics Methodology (새로운 간호윤리학 방법론;통합된 사례방법론)

  • Um, Young-Rhan
    • Journal of Korean Academy of Nursing Administration
    • /
    • v.3 no.1
    • /
    • pp.51-64
    • /
    • 1997
  • The purpose of this study was to introduce an integrated approach of new casuistry and specified principlism for resolving ethical problems and studying nursing ethics. There is no systematic research method for studying clinical ethics and nursing ethics. While nurses often experience ethical dilemmas in practice, much previous research on nursing ethics has focused merely on describing existing problems, and ethicists have presented theoretical analysis and critique rather than specific problem-solving strategies. Clinical situations call for an integrated method that can provide both an objective description of existing problem situations and specific problem-solving methods. We inherit two distinct ways of discussing ethical issues. One frames these issues in terms of principles, rules, and other general ideas; the other focuses on the specific features of particular kinds of moral cases. In the first, general ethical rules relate to specific moral cases in a theoretical manner, with universal rules serving as "axioms" from which particular moral judgments are deduced as theorems. In the second, this relation is frankly practical, with general moral rules serving as "maxims" that can be fully understood only in terms of the paradigmatic cases that define their meaning and force. Theoretical arguments are structured in ways that free them from any dependence on the circumstances of their presentation and ensure them a validity that is not affected by the practical context of use. In formal arguments, particular conclusions are deduced from ("entailed by") the initial axioms or universal principles at the apex of the argument, so the truth or certainty attached to those axioms flows downward to the specific instances to be "proved." In the language of formal logic, the axioms are major premises, the facts specifying the present instance are minor premises, and the conclusion to be "proved" follows necessarily from the initial premises. Practical arguments, by contrast, involve a wider range of factors than formal deductions and are read with an eye to their occasion of use. Instead of aiming at strict entailments, they draw on the outcomes of previous experience, carrying over the procedures used to resolve earlier problems and reapplying them in new problematic situations. Practical arguments depend for their power on how closely the present circumstances resemble those of the earlier precedent cases for which this type of argument was originally devised. So, in practical arguments, the truths and certitudes established in the precedent cases pass sideways to provide "resolutions" of later problems. In the language of rational analysis, the facts of the present case define the grounds on which any resolution must be based, while the general considerations that carried weight in similar situations provide warrants that help settle future cases. The resolution of any problem therefore holds good presumptively; its strength depends on the similarities between the present case and the precedents, and its soundness can be challenged (or rebutted) in situations recognized as exceptional. Jonsen and Toulmin (1988) and Jonsen (1991) introduced new casuistry as such a practical method.
The Oxford English Dictionary defines casuistry quite accurately as "that part of ethics which resolves cases of conscience, applying the general rules of religion and morality to particular instances in which circumstances alter cases or in which there appears to be a conflict of duties." They modified the casuistry of the medieval ages for use in clinical situations, characterized by "the typology of cases and analogy as an inference method." A case is the unit of analysis, and its structure arises from the interaction of situation and moral rules: the situation is what surrounds or stands around, while the moral rule is the essence of the case. The analogy can be objective because "the grounds, the warrants, the theoretical backing, the modal qualifiers" are identified in the cases. Specified principlism is the method by which DeGrazia (1992) integrated principlism with the specification introduced by Richardson (1990); a principle is specified by adding information about the limitations of its scope and restricting its range, and these must be substantive qualifications. The integrated method is a combination of new casuistry and specified principlism. For example, in the study "Ethical problems experienced by nurses in the care of terminally ill patients" (Um, 1994), semi-structured in-depth interviews were conducted with fifteen nurses who mainly cared for terminally ill patients. In the first stage, twenty-one cases were identified as relevant to the topic and classified into four types of problems; one of these types was the patient's refusal of care. In the second stage, the ethical problems in each case were defined and the case was analyzed, that is, the reasons, the ethical values, and the related ethical principles in the cases were examined, and an interpretation was made by integrating the results of the analysis with the situation. The third stage was the ordering of the cases according to the results of the interpretation and the principles common to the cases. The first two stages follow the methodology of new casuistry, and the final stage follows the methodology of specified principlism. The common principles were the principle of autonomy and the principle of caring. The principle of autonomy was specified: when competent patients refuse care, nurses should discontinue the care out of respect for the patients' decision. The principle of caring was also specified: when competent patients refuse care, nurses should continue to provide the care in spite of the refusal in order to preserve the patients' lives. These specifications may lead to opposite behaviors, which emphasizes the importance of nurses' will and intention in making decisions in clinical situations.


A Folksonomy Ranking Framework: A Semantic Graph-based Approach (폭소노미 사이트를 위한 랭킹 프레임워크 설계: 시맨틱 그래프기반 접근)

  • Park, Hyun-Jung;Rho, Sang-Kyu
    • Asia pacific journal of information systems
    • /
    • v.21 no.2
    • /
    • pp.89-116
    • /
    • 2011
  • In collaborative tagging systems such as Delicious.com and Flickr.com, users assign keywords or tags to their uploaded resources, such as bookmarks and pictures, for future use or for sharing. The collection of resources and tags generated by a user is called a personomy, and the collection of all personomies constitutes the folksonomy. The most significant need of folksonomy users is to efficiently find useful resources or experts on specific topics, and an excellent ranking algorithm would assign higher rank to more useful resources or experts. What resources are considered useful in a folksonomic system? Does a standard superior to frequency or freshness exist? A resource recommended by more users with more expertise should be worthy of attention. This ranking paradigm can be implemented through a graph-based ranking algorithm; two well-known representatives of such a paradigm are PageRank by Google and HITS (Hypertext Induced Topic Selection) by Kleinberg. Both PageRank and HITS assign a higher evaluation score to pages linked to by higher-scored pages. HITS differs from PageRank in that it uses two kinds of scores: authority and hub scores. The ranking objects of these algorithms are limited to Web pages, whereas the ranking objects of a folksonomic system are heterogeneous (i.e., users, resources, and tags). Therefore, uniformly applying the voting notion of PageRank and HITS to the links of a folksonomy would be unreasonable: in a folksonomic system, each link corresponding to a property can have an opposite direction, depending on whether the property is in the active or the passive voice. The current research stems from the idea that a graph-based ranking algorithm can be applied to a folksonomic system using the concept of mutual interactions between entities, rather than the voting notion of PageRank or HITS. The concept of mutual interactions, proposed for ranking Semantic Web resources, enables the calculation of importance scores of various resources unaffected by link directions. The weights of a property representing the mutual interaction between classes are assigned depending on the relative significance of the property to the resource importance of each class. This class-oriented approach is based on the fact that the Semantic Web contains many heterogeneous classes, so applying a different appraisal standard to each class is more reasonable. This is similar to human evaluation, where different items are assigned specific weights that are then summed up to a weighted average. We can also check for missing properties more easily with this approach than with other predicate-oriented approaches. A user of a tagging system usually assigns more than one tag to the same resource, and there can be more than one tag with the same subjectivity and objectivity. When many users assign similar tags to the same resource, it becomes necessary to grade the users differently depending on the assignment order. This idea comes from studies in psychology in which expertise involves the ability to select the most relevant information for achieving a goal. An expert should be someone who not only has a large collection of documents annotated with a particular tag, but also tends to add documents of high quality to his or her collection; such documents are identified by the number, as well as the expertise, of the users who have the same documents in their collections.
In other words, there is a relationship of mutual reinforcement between the expertise of a user and the quality of a document. In addition, there is a need to rank entities related more closely to a certain entity. Considering that in social media the popularity of a topic is temporary, recent data should carry more weight than old data. We propose a comprehensive folksonomy ranking framework in which all these considerations are handled and which can be easily customized for ranking on each folksonomy site. To examine the validity of our ranking algorithm and show the mechanism of adjusting property, time, and expertise weights, we first use a dataset designed for analyzing the effect of each ranking factor independently, and then show the ranking results of a real folksonomy site with the ranking factors combined. Because the ground truth of a given dataset is not known when it comes to ranking, we inject simulated data whose ranking results can be predicted into the real dataset and compare the ranking results of our algorithm with those of a previous HITS-based algorithm. Our semantic ranking algorithm based on the concept of mutual interaction seems preferable to the HITS-based algorithm as a flexible folksonomy ranking framework. Some concrete points of difference are as follows. First, with the time concept applied to the property weights, our algorithm shows superior performance in lowering the scores of older data and raising the scores of newer data. Second, applying the time concept to the expertise weights as well as to the property weights, our algorithm controls the conflicting influence of expertise weights and enhances the overall consistency of time-valued ranking; the expertise weights of the previous study can act as an obstacle to time-valued ranking because the number of followers increases as time goes on. Third, many new properties and classes can be included in our framework, whereas the previous HITS-based algorithm, based on the voting notion, loses ground when the domain consists of more than two classes or when other important properties, such as "sent through Twitter" or "registered as a friend," are added to the domain. Fourth, there is a big difference in calculation time and memory use between the two kinds of algorithms: the multiplication of two matrices has to be executed twice for the previous HITS-based algorithm, but this is unnecessary with ours. In our ranking framework, various folksonomy ranking policies can be expressed by combining the ranking factors, and our approach works even if the folksonomy site is not implemented with Semantic Web languages. Above all, the time weight proposed in this paper will be applicable to various domains, including social media, where time value is considered important.
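The mutual reinforcement between user expertise and document quality mentioned above is essentially a HITS-style power iteration on a bipartite user-document graph. The sketch below illustrates only that reinforcement idea on a toy bookmark matrix; it is not the paper's full mutual-interaction framework, which additionally adjusts property, time, and expertise weights.

```python
# Toy HITS-style iteration: a user's expertise grows with the quality of the documents
# in his or her collection, and a document's quality grows with the expertise of the
# users who collected it.
import numpy as np

# rows = users, columns = documents; 1 means the user bookmarked/tagged the document.
bookmarks = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])

expertise = np.ones(bookmarks.shape[0])
quality = np.ones(bookmarks.shape[1])
for _ in range(50):                              # power iteration until convergence
    quality = bookmarks.T @ expertise            # documents held by experts look better
    quality /= np.linalg.norm(quality)
    expertise = bookmarks @ quality              # users holding good documents look expert
    expertise /= np.linalg.norm(expertise)

print("user expertise:", np.round(expertise, 3))
print("document quality:", np.round(quality, 3))
```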