• Title/Summary/Keyword: Y-system


Accuracy of the Registered Cause of Death in a County and its Related Factors (일개 군 사망신고자료에 기재된 사인의 정확성과 관련요인)

  • Shin, Hee-Young;Shin, Jun-Ho;Nam, Hae-Sung;Ryu, So-Yeon;Im, Jeong-Soo;Rhee, Jung-Ae;Chung, Eun-Kyung
    • Journal of Preventive Medicine and Public Health
    • /
    • v.35 no.2
    • /
    • pp.153-159
    • /
    • 2002
  • Objectives: To evaluate the accuracy of the registered cause of death in a county and its related factors. Methods: The data were based on 504 deaths registered between January and December 1998 in a county of Chonnam province. The study subjects consisted of 388 of the 504 cases; their causes of death were established by interview surveys of next of kin or neighbors and by medical record review. We compared the registered cause of death with the confirmed cause of death determined by the surveys and medical records, and evaluated the factors associated with the accuracy of the registered cause of death. Results: 62.6% of the deaths were concordant at the level of the 19-chapter classification of cause of death. External causes of mortality; endocrine, nutritional and metabolic diseases; neoplasms; and diseases of the circulatory system showed good agreement between the registered and confirmed causes of death. The factors related to the accuracy of the registered cause of death were a doctor's diagnosis of the cause of death (adjusted odds ratio: 2.67, 95% confidence interval: 1.21-5.89) and the grade of the public official in charge of the death registry (adjusted odds ratio: 0.30, 95% CI: 0.12-0.78). Conclusions: The accuracy of the registered cause of death was not high. It could be improved by using doctors' diagnoses of death and by improving the job specification for public officials who handle death registration.
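The adjusted odds ratios above come from a multivariable model; as a rough illustration of the underlying arithmetic only, a crude odds ratio and its 95% confidence interval can be computed from a 2x2 table (the counts below are invented for demonstration and are not the study's data):

```python
# Illustrative crude odds ratio and 95% CI from a 2x2 table.
# The paper reports *adjusted* ORs from a multivariable model;
# the counts used here are hypothetical.
import math

def odds_ratio(a, b, c, d):
    """a, b: exposed concordant/discordant; c, d: unexposed ditto."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: deaths diagnosed by a doctor vs. not,
# concordant vs. discordant registered cause
or_, lo, hi = odds_ratio(120, 60, 80, 100)
```

An interval excluding 1 (as in both reported factors) indicates a statistically significant association at the 5% level.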

Genetic Relationship and Characteristics Using Microsatellite DNA Loci in Horse Breeds. (Microsatellite DNA를 이용한 말 집단의 유전적 특성 및 유연 관계)

  • Cho, Gil-Jae
    • Journal of Life Science
    • /
    • v.17 no.5 s.85
    • /
    • pp.699-705
    • /
    • 2007
  • The present study was conducted to investigate the genetic characteristics and to establish a parentage verification system for the Korean native horse (KNH). A total of 192 horses from six breeds, including the KNH, were genotyped at 17 microsatellite loci using a multiplex PCR procedure. The number of alleles per locus varied from 5 to 10, with a mean of 7.35 in the KNH. Expected and observed heterozygosity ranged from 0.387 to 0.841 (mean 0.702) and from 0.429 to 0.905 (mean 0.703), respectively. The total exclusion probability of the 17 microsatellite loci was 0.9999. Of the 17 markers, AHT4, AHT5, CA425, HMS2, HMS3, HTG10, LEX3 and VHL20 had relatively high PIC values (>0.7). The study found population-specific alleles in the KNH compared with the other horse populations: the P allele at AHT5, the Q and R alleles at ASB23, the H allele at CA425, the S allele at HMS3, and the J alleles at HTG10 and LEX3. The results also showed two distinct clusters: a Korean native horse cluster (Korean native horse, Mongolian horse) and a European cluster (Jeju racing horse, Thoroughbred horse). These results provide basic information for detecting genetic markers of the KNH and have high potential for parentage verification and individual identification of the KNH.
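The expected heterozygosity reported above is the standard 1 - sum(p_i^2) statistic over allele frequencies. As an illustrative sketch (not the authors' actual genotyping pipeline), observed and expected heterozygosity at a single locus can be computed like this:

```python
# Illustrative per-locus heterozygosity calculation; genotype data
# below are made up, not the study's microsatellite data.
from collections import Counter

def heterozygosities(genotypes):
    """genotypes: list of (allele1, allele2) tuples for one locus."""
    n = len(genotypes)
    # Observed heterozygosity: fraction of heterozygous individuals
    ho = sum(1 for a, b in genotypes if a != b) / n
    # Allele frequencies over the 2n gene copies
    counts = Counter(a for g in genotypes for a in g)
    freqs = [c / (2 * n) for c in counts.values()]
    # Expected heterozygosity under Hardy-Weinberg: 1 - sum(p_i^2)
    he = 1 - sum(p * p for p in freqs)
    return ho, he

genos = [("A", "B"), ("A", "A"), ("B", "C"), ("A", "C"), ("B", "B")]
ho, he = heterozygosities(genos)
```

Averaging these per-locus values across the 17 loci would give the mean values of the kind reported in the abstract.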

Decrease of Aflatoxin M1 Level in Raw Cow’s Milk using the Hazard Analysis and Critical Control Points (HACCP) System (HACCP 제도에 의한 우유의 아플라톡신 M1의 저감화)

  • Kim, Ki-Hwan;Nam, Myoung Soo
    • Journal of Life Science
    • /
    • v.26 no.2
    • /
    • pp.190-197
    • /
    • 2016
  • Aflatoxin M1 can appear in cow's milk when cows eat contaminated feed. Milk is a major food source for infants and children, who have weak immunity, so the detection of aflatoxin M1 for risk assessment is necessary in order to reduce its level in milk. In this study, the aflatoxin M1 level was monitored for one year in raw milk samples obtained from Chungnam Province, Korea. The milk samples were divided into three categories: 1. samples from a standard general farm, 2. samples from a HACCP-controlled farm, and 3. samples from farms supplied with aflatoxin M1-reducing fodder. The average concentrations of aflatoxin M1 in milk were 0.023±0.005 ug/l for the standard general farm, 0.017±0.004 ug/l for the HACCP-controlled farm, and 0.013±0.003 ug/l for the farms with aflatoxin M1-reducing fodder. Milk from the farms with aflatoxin M1-reducing fodder had the lowest level of aflatoxin M1. However, when efficiency and economic aspects are considered, the most effective way of reducing aflatoxin M1 may be taking milk from the HACCP-controlled farm and implementing good feed management. Institutional support from the government, careful management of dairy farming, and a strict farm sanitation program are required to lower the level of aflatoxin M1 in milk.

Isolation and Identification of Fatty Acid and Volatile Compounds from Tuna Fish Oil with Supercritical Carbon Dioxide (초임계 이산화탄소를 이용한 참치안구유로부터 지방산 및 휘발성 성분의 분리 동정)

  • Roh, Hyung-Seob;Youn, Hyun-Seok;Park, Ji-Yeon;Sin, Sang-Kyu;Lee, Min-Kyung;Back, Sung-Sin;Chun, Byung-Soo
    • Journal of Marine Bioscience and Biotechnology
    • /
    • v.1 no.2
    • /
    • pp.105-118
    • /
    • 2006
  • Isolation and identification of fatty acids and volatile compounds in tuna fish oil were successfully carried out using supercritical carbon dioxide. Samples of the oil were extracted in a 56 ml semi-batch stainless steel vessel under conditions ranging from 80 to 200 bar and 40 to $60^{\circ}C$, with a carbon dioxide flow of 10 ml/min. Volatiles in the oil extracted from the samples with supercritical carbon dioxide were analyzed by gas chromatography with a mass detector and a canister system. The extracts contained various fatty acids: 57.0% unsaturated fatty acids such as docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA), and 43.0% saturated fatty acids. The aroma compounds in the oil showed over 129 peaks, of which 100 compounds were identified. Volatile components included 2,4-heptadienal (fishy), dimethyl disulfide (unpleasant), dimethyl trisulfide (unpleasant) and 2-nonenal (fatty). The isolation efficiency of the volatile compounds from the samples was 99.4% at $50^{\circ}C$ and 200 bar.


Characterization of HtrA2-deficient Mouse Embryonic Fibroblast Cells Based on Morphology and Analysis of their Sensitivity in Response to Cell Death Stimuli. (HtrA2 유전자가 결손된 mouse embryonic fibroblast 세포주의 형태학적 특징 및 세포사멸 자극에 대한 감수성 조사)

  • Lee, Sang-Kyu;Nam, Min-Kyung;Kim, Goo-Young;Rhim, Hyang-Shuk
    • Journal of Life Science
    • /
    • v.18 no.4
    • /
    • pp.522-529
    • /
    • 2008
  • High-temperature requirement A2 (HtrA2) is a human homologue of bacterial HtrA, which has a molecular chaperone function. HtrA2 is a mitochondrial serine protease that plays a significant role in regulating apoptosis; however, its physiological function still remains elusive. To establish an experimental system for investigating the function of HtrA2 in mammalian cells, we first obtained $HtrA2^{+/+}$ and $HtrA2^{-/-}$ MEF cell lines and characterized them based on the expression pattern and subcellular localization of HtrA2, using immunoblot and biochemical assays. Additionally, we observed that the morphological characteristics of $HtrA2^{-/-}$ MEF cells differ from those of $HtrA2^{+/+}$ MEF cells, showing a rounded shape instead of a typical fibroblast-like shape. The growth rate of $HtrA2^{-/-}$ MEF cells was also 1.4-fold higher than that of $HtrA2^{+/+}$ MEF cells at 36 hours. Furthermore, we verified that both MEF cell lines underwent caspase-dependent cell death in response to apoptotic stimuli such as heat shock, staurosporine, and rotenone. The relationship between HtrA2 and heat shock-induced cell death is demonstrated here for the first time in the HtrA2 research field. Our study suggests that these MEF cell lines are suitable tools for further investigating the molecular mechanism by which HtrA2 regulates the balance between cell death and survival.

Understanding the Relationship between Value Co-Creation Mechanism and Firm's Performance based on the Service-Dominant Logic (서비스지배논리하에서 가치공동창출 매커니즘과 기업성과간의 관계에 대한 연구)

  • Nam, Ki-Chan;Kim, Yong-Jin;Yim, Myung-Seong;Lee, Nam-Hee;Jo, Ah-Rha
    • Asia pacific journal of information systems
    • /
    • v.19 no.4
    • /
    • pp.177-200
    • /
    • 2009
  • In the advanced economies, the services industry has become a dominant sector. Evidently, the services sector has grown at a much faster rate than any other. For instance, in developed countries such as the U.S., the proportion of the services sector in GDP is greater than 75%. Even in developing countries, including India and China, the share of the services sector in GDP is growing rapidly. The increasing dependence on services gives rise to new initiatives including service science and service-dominant logic. These new initiatives propose a new theoretical prism to promote a better understanding of the changing economic structure. From the new perspective, service is no longer regarded as a transaction or exchange, but rather as the co-creation of value through interaction among service users, providers, and other stakeholders including partners, external environments, and customer communities. The purpose of this study is the following. First, we review previous literature on service, service innovation, and service systems, and integrate the studies based on service-dominant logic. Second, we categorize the ten propositions of service-dominant logic into conceptual propositions and those directly related to service provision; the conceptual propositions are left out of the research model. With the selected propositions, we define the research constructs for this study. Third, we develop measurement items for the new service concepts, including service provider network, customer network, value co-creation, and convergence of service with product. We then propose a research model to explain the relationships among the factors that affect the value creation mechanism. Finally, we empirically investigate the effects of these factors on firm performance.
Through this research, we aim to show the value creation mechanism of service systems in which various participants in service provision interact with related parties in a joint effort to create value. To test the proposed hypotheses, we developed measurement items and distributed survey questionnaires to domestic companies; 500 questionnaires were distributed, and 180 were returned, of which 171 were usable. The results of the empirical test can be summarized as follows. First, the service provider network, which helps offer required services to customers, is found to affect the customer network, while it does not have a significant effect on value co-creation or product-service convergence. Second, the customer network, on the other hand, appears to influence both value co-creation and product-service convergence. Third, value co-creation accomplished through the collaboration of service providers and customers is found to have a significant effect on both product-service convergence and firm performance. Finally, product-service convergence appears to affect firm performance. Interpreting the results from the value creation mechanism perspective, a service provider network well established to support the customer network has a significant effect on the customer network, which in turn facilitates value co-creation in service provision and product-service convergence, leading to greater firm performance. The results have some enlightening implications for practitioners. If companies want to transform themselves into service-centered business enterprises, they have to consider the four factors suggested in this study: service provider network, customer network, value co-creation, and product-service convergence. That is, companies becoming service-oriented organizations need to understand what the four factors are and how the factors interact with one another in their business context.
They may then want to devise a better tool to analyze the value creation mechanism and apply the four factors to their own environment. This research contributes to the literature in the following ways. First, this study is one of the very first empirical studies on service-dominant logic, as it has categorized the fundamental propositions into conceptual and empirically testable ones and tested the proposed hypotheses against data collected through the survey method. Most of the propositions are found to work as Vargo and Lusch have suggested. Second, by providing a testable set of relationships among the research variables, this study may provide policy makers and decision makers with theoretical grounds for their decisions on service innovation and management. Finally, this study incorporates the concept of value co-creation through the interaction between customers and service providers into the proposed research model and empirically tests the validity of the concept. The results of this study will help establish a value creation mechanism in the service-based economy, which can be used to develop and implement new service provision.

Dynamics of Technology Adoption in Markets Exhibiting Network Effects

  • Hur, Won-Chang
    • Asia pacific journal of information systems
    • /
    • v.20 no.1
    • /
    • pp.127-140
    • /
    • 2010
  • The benefit that a consumer derives from the use of a good often depends on the number of other consumers purchasing the same good or other compatible items. This property, known as network externality, is significant in many IT-related industries. Over the past few decades, network externalities have been recognized in the context of physical networks such as the telephone and railroad industries. Today, as many products are provided as systems consisting of compatible components, the appreciation of network externality is becoming increasingly important. Network externalities have been extensively studied by economists seeking to explain new phenomena resulting from rapid advancements in ICT (Information and Communication Technology). As a result of these efforts, a new body of theories for the 'New Economy' has been proposed. The theoretical bottom-line argument of such theories is that technologies subject to network effects exhibit multiple equilibria and will finally lock into a monopoly, with one standard cornering the entire market. They emphasize that such "tippiness" is a typical characteristic of networked markets, describing that multiple incompatible technologies rarely coexist and that the switch to a single, leading standard occurs suddenly. Moreover, it is argued that this standardization process is path dependent, and the ultimate outcome is unpredictable. With incomplete information about other actors' preferences, there can be excess inertia, as consumers only moderately favor the change, and hence are themselves insufficiently motivated to start the bandwagon rolling, but would get on it once it did start to roll. This startup problem can prevent the adoption of any standard at all, even if it is preferred by everyone. Conversely, excess momentum is another possible outcome, for example, if a sponsoring firm uses low prices during early periods of diffusion.
The aim of this paper is to analyze the dynamics of the adoption process in markets exhibiting network effects by focusing on two factors: switching and agent heterogeneity. Switching is an important factor that should be considered in analyzing the adoption process. An agent's switching invokes switching by other adopters, which brings about a positive feedback process that can significantly complicate the adoption process. Agent heterogeneity also plays an important role in shaping the early development of the adoption process, which has a significant impact on its later development. The effects of these two factors are analyzed by developing an agent-based simulation model. ABM is a computer-based simulation methodology that offers many advantages over traditional analytical approaches. The model is designed such that agents have diverse preferences regarding technology and are allowed to switch their previous choice. The simulation results showed that adoption processes in a market exhibiting network effects are significantly affected by the distribution of agents and the occurrence of switching. In particular, it was found that both weak heterogeneity and strong network effects cause agents to start to switch early, which expedites the emergence of 'lock-in.' When network effects are strong, agents are easily affected by changes in early market shares. This causes agents to switch earlier and in turn speeds up the market's tipping. The same effect is found in the case of highly homogeneous agents. When agents are highly homogeneous, the market starts to tip toward one technology rapidly, and its choice is not always consistent with the population's initial inclination. Increased volatility and faster lock-in increase the possibility that the market will reach an unexpected outcome.
The primary contribution of this study is the elucidation of the role of parameters characterizing the market in the development of the lock-in process, and identification of conditions where such unexpected outcomes happen.
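The switching-plus-heterogeneity dynamics described above can be sketched as a minimal agent-based model. This is a simplified illustration under assumed parameters (Gaussian preferences, a linear network term, synchronous updating), not the paper's actual simulation:

```python
# Minimal agent-based sketch of technology adoption with network
# effects and switching; parameters and functional form are assumed
# for illustration, not taken from the paper.
import random

def simulate(n_agents=200, steps=50, network_strength=2.0,
             heterogeneity=1.0, seed=0):
    rng = random.Random(seed)
    # Each agent has an intrinsic preference for tech A over tech B.
    prefs = [rng.gauss(0.0, heterogeneity) for _ in range(n_agents)]
    choices = [rng.choice("AB") for _ in range(n_agents)]
    for _ in range(steps):
        share_a = choices.count("A") / n_agents
        for i in range(n_agents):
            # Utility difference: intrinsic preference plus a network
            # term that rewards the currently larger installed base.
            diff = prefs[i] + network_strength * (2 * share_a - 1)
            choices[i] = "A" if diff > 0 else "B"  # agents may switch
    return choices.count("A") / n_agents

final_share = simulate()
```

With strong network effects the balanced state is unstable, so small early imbalances are amplified until the market tips toward one technology; weakening `network_strength` or raising `heterogeneity` slows or prevents this lock-in, in line with the findings summarized above.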

The Impact of Service Level Management(SLM) Process Maturity on Information Systems Success in Total Outsourcing: An Analytical Case Study (토털 아웃소싱 환경 하에서 IT서비스 수준관리(Service Level Management) 프로세스 성숙도가 정보시스템 성공에 미치는 영향에 관한 분석적 사례연구)

  • Cho, Geun Su;An, Joon Mo;Min, Hyoung Jin
    • Asia pacific journal of information systems
    • /
    • v.23 no.2
    • /
    • pp.21-39
    • /
    • 2013
  • As the utilization of information technology and the turbulence of technological change increase in organizations, the adoption of IT outsourcing also grows as a means of managing IT resources more effectively and efficiently. In this way of managing IT, the service level management (SLM) process becomes critical to deriving success from outsourcing from the viewpoint of end users in the organization. Even though much research on service level management and agreements has been done during the last decades, the performance of the SLM process has not been evaluated in terms of the final objectives of the management effort, or success from the view of end users. This study explores the relationship between SLM maturity and IT outsourcing success from the users' point of view through an analytical case study of four client organizations of an IT outsourcing vendor, which is a member company of a major Korean conglomerate. To set up a model for the analysis, previous research on SLM process maturity and information systems success is reviewed. In particular, information systems success from the users' point of view is reviewed based on DeLone and McLean's study, which is currently accepted as a comprehensively tested model of information systems success. The model proposed in this study argues that SLM process maturity influences information systems success, which is evaluated in terms of the information quality, systems quality, service quality, and net effect proposed by DeLone and McLean. SLM process maturity can be measured in the planning process, the implementation process, and the operation and evaluation process. Instruments for measuring the factors in the proposed constructs of information systems success and SLM process maturity were collected from previous research and evaluated for reliability and validity, using appropriate statistical methods and pilot tests, before the case study was explored.
Four cases from four different companies under one vendor were utilized for the analysis. All of the cases had been contracted under SLAs (Service Level Agreements) and had implemented ITIL (IT Infrastructure Library), Six Sigma, and BSC (Balanced Scorecard) methods for the last several years, which means that all the client organizations pursued concerted efforts to acquire quality services from IT outsourcing from the organizational and users' points of view. To compare the differences among the four organizations in IT outsourcing success, t-tests and non-parametric analyses were applied to the data collected from the organizations using the survey instruments. The process maturities of the planning and implementation phases of SLM were found not to influence any dimension of information systems success from the users' point of view. It was found that SLM maturity in the operation and evaluation phase could influence systems quality only, from the users' view. This result seems quite contrary to the arguments in IT outsourcing practice, which usually emphasize the importance of the planning and implementation processes up front in IT outsourcing projects. According to an after-the-fact observation by an expert in one organization participating in the study, the clients' needs and motivations for the outsourcing contracts had already been quite familiar to the vendor as a long-term partner under the same conglomerate, so that maturity in the planning and implementation phases seems not to be a differentiating factor for the success of IT outsourcing. This study will be a foundation for future research in the area of IT outsourcing management and success, in particular in service level management.
It could also guide practicing managers of IT outsourcing to focus on the SLM process in the operation and evaluation stage, especially for long-term outsourcing contracts in a very particular context like Korean IT outsourcing projects. This study has some limitations in generalization because the sample size is small and the context itself is confined to a unique environment. For future exploration, survey-based research could be designed and implemented.


Analysis of Respiratory Motion Artifacts in PET Imaging Using Respiratory Gated PET Combined with 4D-CT (4D-CT와 결합한 호흡게이트 PET을 이용한 PET영상의 호흡 인공산물 분석)

  • Cho, Byung-Chul;Park, Sung-Ho;Park, Hee-Chul;Bae, Hoon-Sik;Hwang, Hee-Sung;Shin, Hee-Soon
    • The Korean Journal of Nuclear Medicine
    • /
    • v.39 no.3
    • /
    • pp.174-181
    • /
    • 2005
  • Purpose: The reduction of respiratory motion artifacts in PET images was studied using respiratory-gated PET (RGPET) with a moving phantom. In particular, a method of generating simulated helical CT images from 4D-CT datasets was developed and applied to respiratory-phase-specific RGPET images for more accurate attenuation correction. Materials and Methods: Using a motion phantom with a period of 6 seconds and a linear motion amplitude of 26 mm, PET/CT (Discovery ST; GEMS) scans with and without respiratory gating were obtained for one syringe and two vials with volumes of 3, 10, and 30 ml, respectively. RPM (Real-Time Position Management, Varian) was used to track motion during PET/CT scanning. Ten datasets of RGPET and 4D-CT, corresponding to 10% phase intervals, were acquired. From the positions, sizes, and uptake values of each subject in the resulting phase-specific PET and CT datasets, the correlations between motion artifacts in PET and CT images and the size of the motion relative to the size of the subject were analyzed. Results: The center positions of the three vials in RGPET and 4D-CT agreed well with the actual positions within the estimated error. However, the volumes of subjects in non-gated PET images increased in proportion to the relative motion size and were overestimated by as much as 250% when the motion amplitude was twice the size of the subject. Conversely, the corresponding maximal uptake value was reduced to about 50%. Conclusion: RGPET is demonstrated to remove respiratory motion artifacts in PET imaging; moreover, more precise image fusion and more accurate attenuation correction are possible by combining it with 4D-CT.

Increase of Tc-99m RBC SPECT Sensitivity for Small Liver Hemangioma using Ordered Subset Expectation Maximization Technique (Tc-99m RBC SPECT에서 Ordered Subset Expectation Maximization 기법을 이용한 작은 간 혈관종 진단 예민도의 향상)

  • Jeon, Tae-Joo;Bong, Jung-Kyun;Kim, Hee-Joung;Kim, Myung-Jin;Lee, Jong-Doo
    • The Korean Journal of Nuclear Medicine
    • /
    • v.36 no.6
    • /
    • pp.344-356
    • /
    • 2002
  • Purpose: RBC blood pool SPECT has been used to diagnose focal liver lesions such as hemangioma owing to its high specificity. However, low spatial resolution is a major limitation of this modality. Recently, ordered subset expectation maximization (OSEM) has been introduced to obtain tomographic images for clinical application. We compared this modified iterative reconstruction method, OSEM, with conventional filtered back projection (FBP) in the imaging of liver hemangioma. Materials and Methods: Sixty-four projections were acquired using a dual-head gamma camera in 28 lesions of 24 patients with cavernous hemangioma of the liver, and these raw data were transferred to a Linux-based personal computer. After replacement of the header file as Interfile, OSEM was performed under various combinations of subsets (1, 2, 4, 8, 16, and 32) and iteration numbers (1, 2, 4, 8, and 16) to obtain the best setting for liver imaging; the best condition in our investigation was considered to be 4 iterations and 16 subsets. All images were then processed by both FBP and OSEM. Three experts reviewed the images without any clinical information. Results: According to the blind review of the 28 lesions, OSEM images revealed the same or better image quality than FBP in nearly all cases. Although there was no significant difference in the detection of large lesions over 3 cm, 5 lesions 1.5 to 3 cm in diameter were detected only by OSEM. However, both techniques failed to depict the 4 small lesions less than 1.5 cm. Conclusion: OSEM revealed better contrast and definition in the depiction of liver hemangioma, as well as higher sensitivity in the detection of small lesions. Furthermore, this reconstruction method does not require a high-performance computer system or a long reconstruction time; therefore, OSEM appears to be a good method for RBC blood pool SPECT in the diagnosis of liver hemangioma.
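The OSEM reconstruction compared above can be illustrated on a toy problem. The sketch below implements the ordered-subset multiplicative (MLEM-style) update for a tiny 1D system; the system matrix and data are invented for demonstration, and this is not the clinical implementation:

```python
# Toy OSEM (ordered-subset MLEM) reconstruction on a tiny 1D problem;
# an illustrative sketch with made-up data, not a scanner algorithm.

def osem(y, A, n_subsets=2, n_iters=8):
    """y: measured projections, A: system matrix (rows = projections)."""
    m, n = len(A), len(A[0])
    x = [1.0] * n  # uniform initial image
    # Interleaved ordered subsets of the projection rows
    subsets = [list(range(s, m, n_subsets)) for s in range(n_subsets)]
    for _ in range(n_iters):
        for sub in subsets:  # one multiplicative update per subset
            # Forward-project the current estimate on this subset
            fp = {i: sum(A[i][j] * x[j] for j in range(n)) for i in sub}
            for j in range(n):
                norm = sum(A[i][j] for i in sub)
                if norm == 0:
                    continue
                # Backproject the measured/estimated ratio and rescale
                ratio = sum(A[i][j] * y[i] / fp[i] for i in sub)
                x[j] *= ratio / norm
    return x

# Simulate noiseless projections of a known 2-pixel "image"
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 1.0]]
x_true = [2.0, 3.0]
y = [sum(A[i][j] * x_true[j] for j in range(2)) for i in range(4)]
x_rec = osem(y, A, n_subsets=2, n_iters=8)
```

Because each of the `n_subsets` sub-iterations uses only a fraction of the projections, OSEM converges roughly `n_subsets` times faster per pass through the data than plain MLEM, which is why settings such as 16 subsets and 4 iterations are practical on modest hardware, as the abstract notes.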