• Title/Summary/Keyword: Technologies

Search Results: 18,600

Caffeine treatment during in vitro maturation improves developmental competence of morphologically poor oocytes after somatic cell nuclear transfer in pigs (돼지 난자의 체외성숙에서 Caffeine 처리가 난자 성숙과 체세포 핵이식 배아의 체외발육에 미치는 영향)

  • Lee, Joohyeong;You, Jinyoung;Lee, Hanna;Shin, Hyeji;Lee, Geun-Shik;Lee, Seung Tae;Lee, Eunsong
    • Journal of Embryo Transfer
    • /
    • v.32 no.3
    • /
    • pp.131-138
    • /
    • 2017
  • In most mammals, metaphase II (MII) oocytes with high maturation promoting factor (MPF) activity are considered good-quality oocytes and are used for assisted reproductive technologies, including somatic cell nuclear transfer (SCNT). Caffeine increases MPF activity in mammalian oocytes by inhibiting p34cdc2 phosphorylation. The objective of this study was to investigate the effects of caffeine treatment during in vitro maturation (IVM) on oocyte maturation and on embryonic development after SCNT in pigs. To this end, morphologically good (MGCOCs) and morphologically poor (MPCOCs) oocytes, classified by the thickness of the cumulus cell layer, were left untreated or were treated with 2.5 mM caffeine during 22-42, 34-42, or 38-42 h of IVM, according to the experimental design. Caffeine treatment for 20 h (22-42 h of IVM) significantly inhibited nuclear maturation compared with no treatment. Blastocyst formation of SCNT embryos was not influenced by caffeine treatment during 38-42 h of IVM in MGCOCs (41.1-42.1%) but was significantly improved in MPCOCs compared with no treatment (43.4 vs. 30.1%, P<0.05). No significant effects of caffeine treatment were observed on embryo cleavage (78.7-88.0%) or on mean cell number per blastocyst (38.7-43.5 cells). The MPF activity of MII oocytes, measured as p34cdc2 kinase activity, was not influenced by caffeine treatment in MGCOCs (160.4 vs. 194.3 pg/ml) but was significantly increased in MPCOCs (133.9 vs. 204.8 pg/ml). Our results demonstrate that caffeine treatment during 38-42 h of IVM improves the developmental competence of SCNT embryos derived from MPCOCs by influencing cytoplasmic maturation, including increased MPF activity, in IVM pig oocytes.

Application of Enhanced Coagulation for Nakdong River Water Using Aluminium and Ferric Salt Coagulants (낙동강 원수를 대상으로 Al염계 및 Fe염계 응집제를 이용한 고도응집의 적용)

  • Moon, Sin-Deok;Son, Hee-Jong;Yeom, Hoon-Sik;Choi, Jin-Taek;Jung, Chul-Woo
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.34 no.9
    • /
    • pp.590-596
    • /
    • 2012
  • Enhanced coagulation is one of the best available technologies for treating natural organic matter (NOM) in water to produce clean drinking water. In this research, comparison experiments between conventional coagulation (CC) and enhanced coagulation (EC) were performed using four coagulants, i.e., ferric chloride (FeCl₃), aluminium sulphate (alum), poly aluminium sulphate organic magnesium (PSOM), and poly aluminium chloride (PACl), in terms of surrogate parameters such as dissolved organic carbon (DOC), trihalomethane formation potential (THMFP), haloacetic acid formation potential (HAAFP), and zeta potential variation, in order to find the most effective coagulant and conditions for Nakdong River water. In the EC process, turbidity removal efficiency did not increase gradually with coagulant dose, in contrast to the CC process; rather, turbidity removal decreased further as more coagulant was added, whereas the removal efficiencies of DOC, THMFP, and HAAFP increased by 13~18%, 9~18%, and 9~18%, respectively, compared with the CC process. With FeCl₃ and PACl, turbidity removal remained relatively high over the entire pH range, while with alum and PSOM steady removal efficiency was obtained between pH 5 and pH 8. In terms of the DOC surrogate, all four coagulants showed high removal efficiency between pH 5 and pH 7. The removal efficiencies of dissolved organic matter (DOM) fractions below 1 kDa and above 10 kDa increased in EC by 11~21% and 16%, respectively, compared with the CC process, and the removal efficiencies of hydrophobic and hydrophilic organic matter increased by 27~38% and 11~15%, respectively. In conclusion, the most effective coagulant for EC of Nakdong River water proved to be FeCl₃, followed by PSOM, PACl, and alum, in that order.

Effect of Organic Matter and Moisture Content on Reduction of Cr(VI) in Soils by Zerovalent Iron (영가철에 의한 토양 Cr(VI) 환원에 미치는 유기물 및 수분함량 영향)

  • Yang, Jae-E.;Lee, Su-Jae;Kim, Dong-Kuk;Oh, Sang-Eun;Yoon, Sung-Hwan;Ok, Yong-Sik
    • Korean Journal of Environmental Agriculture
    • /
    • v.27 no.1
    • /
    • pp.60-65
    • /
    • 2008
  • Current soil remediation principles for toxic metals have limitations, even though they vary across technologies. An alternative technology that transforms hazardous substances into nonhazardous ones would be environmentally beneficial. The objective of this research was to assess the optimum conditions for Cr(VI) reduction in soils as influenced by zero-valent iron (ZVI), organic matter, and moisture content. The reduction ratio of Cr(VI) increased from 37 to 40% as organic matter content increased from 1.07 to 1.75%. In addition, Cr(VI) concentration decreased as soil moisture content increased, but the direct effect of soil moisture on Cr(VI) reduction was less than 5% of the Cr(VI) reduction ratio. However, the combined treatment of ZVI (5%), organic matter (1.75%), and soil moisture (30%) effectively reduced over 95% of the initial Cr(VI) within 5 days, and nearly 100% after 30 days, by increasing the oxidation of ZVI and the concurrent reduction of Cr(VI) to Cr(III). The overall results demonstrate that ZVI is effective in remediating Cr(VI)-contaminated soils and that its efficiency is synergistic with the combined treatment of soil moisture and organic matter.

Case Study on the Enterprise Microblog Usage: Focusing on Knowledge Management Strategy (기업용 마이크로블로그의 사용행태에 대한 사례연구: 지식경영전략을 중심으로)

  • Kang, Min Su;Park, Arum;Lee, Kyoung-Jun
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.47-63
    • /
    • 2015
  • As knowledge draws attention as a new production factor that generates added value, studies continue to apply knowledge management to the business environment. In addition, as ICT (Information and Communication Technology) has been grafted onto the business environment, it has increased the task efficiency and productivity of individual workers. Accordingly, the way a business achieves its goals has changed to one in which its individual members willingly take part in the organization and share information to create new value (Han, 2003), and studies on systems and services to support this transition are being carried out. Recently, a new concept called 'Enterprise 2.0' has appeared: the extension of Web 2.0 and its technologies, which focus on participation, sharing, and openness, to the work environment of a business (Jung, 2013). Enterprise 2.0 is used as a collaborative tool to support individual creativity and group brain power by combining Web 2.0 technologies such as blogs, wikis, RSS, and tags with business software (McAfee, 2006). As Twitter became popular, the Enterprise Microblog (EMB), an example of Enterprise 2.0 for business, was developed as the business-world equivalent of Twitter, and SaaS (Software as a Service) offerings such as Yammer were introduced. Studies of EMB mainly focus on demonstrating its usability for intra-firm communication and knowledge management. However, existing studies lean too heavily toward large companies and particular departments rather than a company as a whole, and few studies have been conducted on small and medium-sized enterprises (SMEs), which have difficulty allocating separate resources and a dedicated workforce to introduce knowledge management. In this respect, the present study focuses its analysis on a small company actually equipped with an EMB to see how it is used.
Knowledge management strategy can be classified into codification strategy and personalization strategy (Hansen et al., 1999), and how to manage the two strategies has long been studied; however, current studies of knowledge management strategy mostly target major companies, leaving a lack of studies on how it can be applied to SMEs. Based on a review of the preceding studies and literature on knowledge management strategy and EMB, the hypothesis was established that "as a company grows, its main use of EMB shifts from codification strategy to personalization strategy." To test the hypothesis, an SME that has used the EMB 'Yammer' since its foundation was analyzed. The case study employed a longitudinal analysis that divides the period of EMB use into successive stages and examines the contents of the posts.
As a result, this study suggests practical implications regarding the application of a knowledge management strategy, and of a knowledge management system, suitable for SMEs.

Personalized Recommendation System for IPTV using Ontology and K-medoids (IPTV환경에서 온톨로지와 k-medoids기법을 이용한 개인화 시스템)

  • Yun, Byeong-Dae;Kim, Jong-Woo;Cho, Yong-Seok;Kang, Sang-Gil
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.3
    • /
    • pp.147-161
    • /
    • 2010
  • With the recent convergence of broadcasting and communication, communication has been joined to TV, and TV viewing has undergone many changes. IPTV (Internet Protocol Television) provides information services, movie contents, and broadcasts over the internet, combining live programs with VOD (Video on Demand), and, running over the communication network, has become a new business issue. In addition, new technical issues have arisen: imaging technology for the service, networking technology free of video cuts, security technologies to protect copyright, and so on. Over the IPTV network, users can watch the programs they want, whenever they want. However, IPTV makes it difficult to find programs by search or by menu navigation: the menu approach takes a long time to reach a desired program, the search approach fails when the title, genre, or actors' names are unknown, and entering characters with a remote control is cumbersome. The bigger problem is that users are often unaware of the services available to them. Thus, to resolve the difficulty of selecting VOD services in IPTV, a personalized recommendation service is proposed that enhances users' satisfaction and uses their time efficiently. This paper provides programs suited to each individual, saving search time, to overcome IPTV's shortcomings through a filtering and recommendation system. The proposed system collects TV program information and each user's preferred genres and detailed genres, channels, watched programs, and viewing times, based on the individual's IPTV viewing records. To find similarities between programs, an ontology of TV programs is used, because the distance between programs can be measured by similarity comparison. The TV program ontology we use is extracted from TV-Anytime metadata, which represents semantic properties.
The ontology also expresses contents and features numerically. Vocabulary similarity is determined through WordNet: all words describing the programs are expanded into upper and lower classes for the word-similarity decision, and the average over the descriptive keywords is taken. Using the resulting distances, similar programs are grouped by the K-medoids partitioning method, a clustering technique that divides items into groups with similar characteristics. The K-medoids method selects K representative objects (medoids); each item is assigned to its nearest medoid to form temporary clusters, and, starting from the initial selection, the optimal medoids are found through repeated trials until the n items are divided into K stable clusters of similar programs. When programs are selected through this cluster analysis, weights are applied to the recommendation as follows. Each cluster recommends to the user the programs closest to its representative object; the distance formula is the same as the similarity distance measure and provides the base figure that determines the ranking of recommended programs. A weight derived from the size of the viewing list is also used: the more programs it contains, the higher the weight, defined here as the cluster weight. From this, representative TV programs of the clusters are selected and a preliminary ranking is determined. However, the cluster-representative TV programs include errors, so weights reflecting TV program viewing preference are added to determine the final ranks, and the resulting contents are recommended to users. An experiment with the proposed method was carried out in a controlled environment.
The experiment shows the superiority of the proposed method compared with existing approaches.
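The K-medoids grouping step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the pairwise program distances are hypothetical values standing in for the ontology/WordNet similarity distances the paper computes.

```python
import random

def k_medoids(dist, k, max_iter=100, seed=0):
    """Simple K-medoids on a precomputed distance matrix dist[i][j]."""
    n = len(dist)
    medoids = random.Random(seed).sample(range(n), k)
    for _ in range(max_iter):
        # Assignment step: attach every item to its nearest medoid.
        clusters = {m: [] for m in medoids}
        for i in range(n):
            clusters[min(medoids, key=lambda m: dist[i][m])].append(i)
        # Update step: within each cluster, pick the member that
        # minimizes the total distance to the other members.
        new_medoids = [min(members, key=lambda c: sum(dist[c][j] for j in members))
                       for members in clusters.values()]
        if set(new_medoids) == set(medoids):
            break  # converged: representative objects are stable
        medoids = new_medoids
    return medoids, clusters

# Hypothetical distances among four TV programs: programs 0 and 1
# are similar to each other, as are programs 2 and 3.
d = [[0, 1, 9, 9],
     [1, 0, 9, 9],
     [9, 9, 0, 1],
     [9, 9, 1, 0]]
medoids, clusters = k_medoids(d, k=2)  # one medoid per similar pair
```

In a real deployment the matrix `d` would be filled with the ontology-based program distances, and the items nearest each medoid would seed the ranked recommendation list.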

Design of MAHA Supercomputing System for Human Genome Analysis (대용량 유전체 분석을 위한 고성능 컴퓨팅 시스템 MAHA)

  • Kim, Young Woo;Kim, Hong-Yeon;Bae, Seungjo;Kim, Hag-Young;Woo, Young-Choon;Park, Soo-Jun;Choi, Wan
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.2
    • /
    • pp.81-90
    • /
    • 2013
  • During the past decade, many changes and attempts have been made, and continue to be made, in developing new technologies in the computing area. The brick walls in computing, especially the power wall, have shifted the computing paradigm from computing hardware, including processor and system architecture, to the programming environment and application usage. The high performance computing (HPC) area in particular has experienced dramatic changes and is now considered a key to national competitiveness. In the late 2000s, many leading countries rushed to develop Exascale supercomputing systems, and as a result systems of tens of PetaFLOPS are prevalent today. Korea's ICT is well developed and Korea is considered one of the leading countries in the world, but not in the supercomputing area. In this paper, we describe the architectural design of the MAHA supercomputing system, which aims at a 300 TeraFLOPS system for bioinformatics applications such as human genome analysis and protein-protein docking. The MAHA supercomputing system consists of four major parts: computing hardware, file system, system software, and bio-applications. It is designed to utilize heterogeneous computing accelerators (co-processors such as GPGPUs and MICs) to obtain better performance/$, performance/area, and performance/power. To provide high-speed data movement and large capacity, the MAHA file system is designed with an asymmetric cluster architecture and consists of a metadata server, data servers, and client file systems on top of SSD and MAID storage servers. The MAHA system software is designed for user-friendliness and ease of use, based on integrated system management components such as bio-workflow management, integrated cluster management, and heterogeneous resource management. The MAHA supercomputing system was first installed in December 2011, with a theoretical performance of 50 TeraFLOPS and a measured performance of 30.3 TeraFLOPS on 32 computing nodes.
The MAHA system will be upgraded to 100 TeraFLOPS in January 2013.
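The gap between the quoted theoretical (50 TeraFLOPS) and measured (30.3 TeraFLOPS) figures is a routine HPC calculation. The sketch below shows how a theoretical peak and the achieved efficiency are derived; the per-node specification is a hypothetical example, not MAHA's actual hardware.

```python
def peak_flops(nodes, sockets_per_node, cores_per_socket,
               clock_ghz, flops_per_cycle):
    """Theoretical peak = nodes x sockets x cores x clock x FLOPs/cycle."""
    return (nodes * sockets_per_node * cores_per_socket
            * clock_ghz * 1e9 * flops_per_cycle)

# Hypothetical 32-node cluster: 2 sockets/node, 8 cores/socket,
# 2.6 GHz, 8 double-precision FLOPs per cycle per core.
peak = peak_flops(32, 2, 8, 2.6, 8)   # ~10.6 TFLOPS for this example spec

# Efficiency of the MAHA installation as quoted in the paper:
efficiency = 30.3e12 / 50.0e12        # measured / theoretical peak
```

Sustained-to-peak ratios around 60% are typical for benchmark workloads; accelerator-heavy designs like MAHA's trade some of this ratio for better performance per dollar and per watt.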

Modern Paper Quality Control

  • Olavi Komppa
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference
    • /
    • 2000.06a
    • /
    • pp.16-23
    • /
    • 2000
  • The increasing functional needs of top-quality printing papers and packaging paperboards, and especially the rapid development of electronic printing processes and various computer printers during the past few years, set new targets and requirements for modern paper quality. Most of these paper grades today have relatively high filler content, are moderately or heavily calendered, and have many coating layers for the best appearance and performance. In practice, this means that many of the traditional quality assurance methods, mostly designed to measure papers made of pure, native pulp only, cannot reliably (or at all) be used to analyze or rank the quality of modern papers. Hence, the introduction of new measurement techniques is necessary to assure and further develop paper quality today and in the future. Paper formation, i.e. the small-scale (millimeter-scale) variation of basis weight, is the most important quality parameter in papermaking due to its influence on practically all other quality properties of paper. An ideal paper would be completely uniform, so that the basis weight of every small point (area) measured would be the same. In practice, of course, this is not possible, because relatively large local variations always exist in paper. These small-scale basis weight variations are the major cause of many other quality problems, including calender blackening, uneven coating results, uneven printing results, etc. Traditional visual inspection or optical measurement of paper does not give a reliable picture of the material variations in the paper, because in the modern papermaking process the optical behavior of paper is strongly affected by, e.g., fillers, dyes, or coating colors. Furthermore, the opacity (optical density) of the paper changes at different process stages such as wet pressing and calendering.
The greatest advantage of using the beta transmission method to measure paper formation is that it can be very reliably calibrated to measure the true basis weight variation of all kinds of paper and board, independently of sample basis weight or paper grade. This makes it possible to measure, compare, and judge papers made of different raw materials or colors, and even heavily calendered, coated, or printed papers. Scientific research in paper physics has shown that the orientation of the fibers in the top layer (paper surface) of the sheet plays the key role in paper curling and cockling, causing the typical practical problems (paper jams) with modern fax and copy machines, electronic printing, etc. On the other hand, the fiber orientation in the surface and middle layers of the sheet controls the bending stiffness of paperboard. Therefore, a reliable measurement of paper surface fiber orientation gives us a magnificent tool to investigate and predict paper curling and cockling tendency, and provides the information needed to fine-tune the manufacturing process for optimum quality. Many papers, especially heavily calendered and coated grades, strongly resist liquid and gas penetration, lying beyond the measurement range of traditional instruments or requiring inconveniently long measuring times per sample. The increased surface hardness and the use of filler minerals and mechanical pulp make a reliable, non-leaking sample contact with the measurement head a challenge of its own. Paper surface coating, as expected, produces a layer whose permeability characteristics are completely different from those of the other layers of the sheet.
The latest developments in sensor technology have made it possible to reliably measure gas flow under well-controlled conditions, allowing us to investigate the gas penetration of open structures, such as cigarette paper, tissue, or sack paper, and, in the low-permeability range, to analyze even fully greaseproof papers, silicone papers, heavily coated papers and boards, or even to detect defects in barrier coatings. Even nitrogen or helium may be used as the gas, giving completely new possibilities to rank products or to find correlations with critical process or converting parameters. All modern paper machines include many on-line measuring instruments that supply the information needed by automatic process control systems; hence, the reliability of the information obtained from the different sensors is vital for good optimization and process stability. If any of these on-line sensors does not operate exactly as planned (having even a small measurement error or malfunction), the process control will set the machine to operate away from the optimum, resulting in loss of profit or eventual problems in quality or runnability. To assure optimum operation of paper machines, a novel quality assurance policy for the on-line measurements has been developed, including control procedures utilizing traceable, accredited standards for the best reliability and performance.
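Formation, i.e. small-scale basis weight variation, is commonly summarized as the coefficient of variation of local basis weight readings such as those a beta transmission gauge produces. The sketch below illustrates that statistic with hypothetical readings; it does not describe any particular instrument.

```python
from statistics import mean, stdev

def formation_cv(local_basis_weights):
    """Coefficient of variation (%) of local basis weight samples;
    a lower CV indicates a more uniform (better-formed) sheet."""
    return 100.0 * stdev(local_basis_weights) / mean(local_basis_weights)

# Hypothetical millimeter-scale readings in g/m^2:
uniform_sheet = [80.1, 79.9, 80.0, 80.2, 79.8]   # well-formed sheet
cloudy_sheet = [70.0, 92.0, 78.0, 88.0, 72.0]    # "cloudy" formation
# formation_cv(uniform_sheet) is well under 1%; the cloudy sheet is ~12%.
```

Because the beta transmission method measures true basis weight regardless of fillers, dyes, or coatings, such a statistic can be compared across grades, which is exactly the advantage the text claims over optical inspection.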

Efficient Remediation of Petroleum Hydrocarbon-Contaminated Soils through Sequential Fenton Oxidation and Biological Treatment Processes (펜톤산화 및 생물학적 연속처리를 통한 유류오염토양의 효율적 처리)

  • Bae, Jae-Sang;Kim, Jong-Hyang;Choi, Jung-Hye;Ekpeghere, Kalu I.;Kim, Soo-Gon;Koh, Sung-Cheol
    • Korean Journal of Microbiology
    • /
    • v.47 no.4
    • /
    • pp.356-363
    • /
    • 2011
  • Accidental releases of total petroleum hydrocarbons (TPH) due to oil spills frequently end in soil and groundwater pollution. TPH can be degraded through physicochemical and biological processes in the environment, but at relatively slow rates. In this study, an attempt was made to develop an integrated chemical and biological treatment technology in order to establish an efficient and environment-friendly restoration technology for TPH-contaminated soils. A Fenton-like reaction was employed as the preceding chemical treatment process, and a bioaugmentation process utilizing a diesel-fuel-degrading consortium was subsequently applied as the biological treatment process. Efficient chemical removal of TPH from the soils occurred when the surfactant OP-10S (0.05%) and the oxidants (FeSO₄ 4% and H₂O₂ 5%) were used. Bioaugmentation of the degrader consortium into the soil slurry increased its population density by at least two orders of magnitude, indicating good survival of the degradative populations in the contaminated soils (10⁸-10⁹ CFU/g slurry). TPH removal efficiencies for the Fenton-treated soils increased by at least 57% when the soils were subjected to bioaugmentation with the degradative consortium. However, relatively lower TPH treatment efficiencies (79-83%) were observed in the soils treated with both Fenton oxidation and the degraders, as opposed to the control (95%) that received no Fenton treatment. This appeared to be due to free radicals and other oxidative products generated during the Fenton treatment, which may inhibit the degraders' activity. The findings of this study will contribute to the development of efficient bioremediation technologies for TPH-contaminated soils and sediments in the environment.

A Study on the Factors Affecting Health Promoting Lifestyles of Workers in the Small Scale Industries (소형 사업장 근로자들의 건강증진 생활양식에 영향을 미치는 요인)

  • Jang Yong-Nam;Lee Eun-Kyoung;Chong Myong-Soo;Jun Sun-Young;Kim Sang-Deok;Jeoung Jae-Yul;Jahng Doo-Sub;Song Yung-Sun;Lee Ki-Nam
    • Journal of Society of Preventive Korean Medicine
    • /
    • v.5 no.1
    • /
    • pp.10-30
    • /
    • 2001
  • Oriental medicine needs to be armed with theories on the health-improvement concept and with basic data matching its views in order to participate in health-improvement services in industrial workplaces. The Oriental medicine health-improvement program defines the factors that determine individuals' lifestyles and provides information and techniques for workers to practice in daily life. To that end, this research compares and analyzes the health-improvement concept and health care, and defines the relations between individuals' health state and their lifestyle, as basic data needed to carry out health-improvement work for workers. 1. The subjects of this research are categorized as follows: by gender, males 52.1% and females 47.9%, with no big difference between them; by age, 20s 6.1%, 30s 33.9%, 40s 34.1%, and 50s 24.8%, with ages 30-50 accounting for most; by marital status, unmarried 7.1% and married 79.1%, with most married; by income, under 1 million won 3.0%, 1-2 million won 26.4%, 2-2.49 million won 11.2%, and above 2.5 million won 11.2%, with 1-2.5 million won the majority; by dwelling, owned houses 65.4%, rented houses 14.7%, and monthly-rented 9.5%; and by education, elementary and middle school 16.9%, high school and its dropouts 22.6%, and junior college and above 51.6%, with high school and above the majority. 2. By job, office and managerial workers represent 12.3%, part-timers 21.0%, manual workers 11.4%, jobless 0.6%, professionals 35.6%, service workers 0.6%, housewives 8.4%, and equipment/machinery operators/assemblers 10.1%. Of these, the jobless and part-timers, totaling three, were dropped from this research. By years worked, 0-3.9 years represents 9.7%, 4-7.9 years 6.7%, 8-14.9 years 18.4%, over 15 years 28.7%, and no response 36.5%. 3.
The subjects' degree of practicing a health-improvement lifestyle, on a scale of 1 to 4, averaged 2.69 overall: personal relations 3.04, self-realization 2.92, stress management 2.76, nutrition 2.73, responsibility for health 2.47, and athletic activities 2.18, with personal relations earning the highest score and athletic activities the lowest. As for factors influencing a health-improvement lifestyle, there is no significant difference by gender, age, or marital status, while there are significant differences by income, dwelling pattern, and education level: higher income brackets, owned houses, rented houses, and monthly-rented houses, in this order, and the higher-educated show higher averages in health-improvement lifestyle. By job, housewives, manual workers, office workers, professionals, equipment/machinery operators/assemblers, and part-timers, in this order, show higher scores, while there is no significant difference by years worked. 4. The factors that affect a health-improvement lifestyle are as follows: self-realization is influenced by age, marital status, type of dwelling, and level of education; responsibility for health by type of dwelling; athletic activities by gender and age; nutrition by age, marital status, and type of dwelling; personal relations by marital status; and stress management by type of dwelling. 5. High-scoring areas by job are as follows: in self-realization, office workers, manual workers, housewives, professionals, and equipment/machinery operators/assemblers, in this order, show significant differences; in responsibility for health, manual workers, housewives, equipment/machinery operators/assemblers, professionals, office workers, and part-timers, in this order, do.
In athletic activities, manual workers, housewives, office workers, professionals, equipment/machinery operators/assemblers, and part-timers, in this order, show significant differences; in nutrition, housewives, office workers, manual workers, professionals, equipment/machinery operators/assemblers, and part-timers, in this order; and in stress management, housewives, office workers, manual workers, professionals, equipment/machinery operators/assemblers, and part-timers, in this order. By years worked, more years worked means higher scores in responsibility for health and nutrition; in athletic activities, over 15 years, 4-8 years, below 4 years, and 8-14 years, in this order, show higher scores; and no differences appear in self-realization, personal relations, or stress management. 6. To examine the correlation between the overall and the divisional degrees of health-improvement practice, Pearson's correlation coefficient was used. Self-realization, responsibility for health, athletic activities, nutrition, support in personal relations, and stress management show significant correlations among the subdivisions, and the overall health-improvement lifestyle shows significant correlations with all six subdivisions.
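Step 6 relies on Pearson's correlation coefficient between the overall lifestyle score and each subscale. As a reminder of what that statistic computes, here is a small sketch with made-up scores on the study's 1-4 scale; the numbers are illustrative only, not the study's data.

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical scores (1-4 scale) for six workers:
self_realization = [2.9, 3.1, 2.5, 3.4, 2.8, 3.0]
overall_lifestyle = [2.7, 3.0, 2.4, 3.2, 2.6, 2.9]
r = pearson_r(self_realization, overall_lifestyle)  # strong positive correlation
```

A value of r near 1 between a subscale and the overall score is what the study reports as a "significant correlation" (subject, of course, to a significance test against the sample size).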


A Meta Analysis of Using Structural Equation Model on the Korean MIS Research (국내 MIS 연구에서 구조방정식모형 활용에 관한 메타분석)

  • Kim, Jong-Ki;Jeon, Jin-Hwan
    • Asia pacific journal of information systems
    • /
    • v.19 no.4
    • /
    • pp.47-75
    • /
    • 2009
  • Recently, research on Management Information Systems (MIS) has laid out theoretical foundations and academic paradigms by introducing diverse theories, themes, and methodologies. In particular, the academic paradigms of MIS encourage a user-friendly approach, developing technologies from the users' perspective, which reflects the existence of strong causal relationships between information systems and user behavior. As in other areas of social science, the use of structural equation modeling (SEM) has increased rapidly in recent years, especially in the MIS area. The SEM technique is important because it provides powerful ways to address key IS research problems: it has a unique ability to examine a series of causal relationships simultaneously while analyzing multiple independent and dependent variables at the same time. Despite the many benefits it offers MIS researchers, the analytical technique also has potential pitfalls. The research objective of this study is to provide guidelines for the appropriate use of SEM, based on an assessment of current practice in MIS research. This study focuses on several statistical issues related to the use of SEM in MIS research. The selected articles are assessed in three parts through a meta analysis: the first part concerns the initial specification of the theoretical model of interest; the second, data screening prior to model estimation and testing; and the last, the estimation and testing of theoretical models on empirical data. This study reviewed the use of SEM in 164 empirical research articles published in four major MIS journals in Korea (APJIS, ISR, JIS, and JITAM) from 1991 to 2007. APJIS, ISR, JIS, and JITAM accounted for 73, 17, 58, and 16 of the total number of applications, respectively, and the number of published applications has increased over time.
LISREL was the SEM software most frequently used by MIS researchers (97 studies, 59.15%), followed by AMOS (45 studies, 27.44%). In the first part, regarding the initial specification of the theoretical model of interest, all of the studies used cross-sectional data; studies using cross-sectional data may be better able to explain their structural model as a set of relationships. Most of the SEM studies employed confirmatory-type analysis (146 articles, 89%). Regarding model formulation, 159 of the studies (96.9%) specified a full structural equation model; in only 5 studies was SEM used for a measurement model with a set of observed variables. The average sample size across all models was 365.41, with samples as small as 50 and as large as 500. The second part concerns data screening prior to model estimation and testing. Data screening is important for researchers, particularly in defining how they deal with missing values. Overall, data screening was discussed in 118 of the studies (71.95%), while no study discussed evidence of multivariate normality for its models. In the third part, concerning the estimation and testing of theoretical models on empirical data, assessing model fit is one of the most important issues because it provides adequate statistical power for research models. Multiple fit indices were used in the SEM applications. The chi-square test was reported in most of the studies (146, 89%), whereas the normed chi-square was reported less frequently (65 studies, 39.64%); note that a normed chi-square of 3 or lower is required for adequate model fit. The most popular model fit indices were GFI (109, 66.46%), AGFI (84, 51.22%), NFI (44, 47.56%), RMR (42, 25.61%), CFI (59, 35.98%), RMSEA (62, 37.80%), and NNFI (48, 29.27%).
Regarding tests of construct validity, convergent validity was examined in 109 studies (66.46%) and discriminant validity in 98 (59.76%); 81 studies (49.39%) reported the average variance extracted (AVE). However, there was little discussion of direct (47, 28.66%), indirect, and total effects in the SEM models. Based on these findings, we suggest general guidelines for the use of SEM and propose recommendations concerning latent variable models, raw data, sample size, data screening, reporting of parameter estimates, model fit statistics, multivariate normality, confirmatory factor analysis, reliabilities, and the decomposition of effects.
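The convergent-validity statistic mentioned above, average variance extracted (AVE), is straightforward to compute from standardized factor loadings. The sketch below uses hypothetical loadings; the 0.5 cutoff is the commonly cited rule of thumb for adequate convergent validity.

```python
def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings of a construct's
    indicators; AVE >= 0.5 is the usual convergent-validity cutoff."""
    return sum(l * l for l in loadings) / len(loadings)

# Hypothetical standardized loadings for one latent construct:
loadings = [0.82, 0.76, 0.88, 0.71]
ave = average_variance_extracted(loadings)   # ~0.63, above the 0.5 cutoff
```

Reporting AVE alongside the fit indices listed above (GFI, CFI, RMSEA, etc.) is one of the practices the guidelines proposed in this study would encourage.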