• Title/Summary/Keyword: 정보학회 (Informatics Society)


Ecological Health Assessments on Turbidwater in the Downstream After a Construction of Yongdam Dam (용담댐 건설후 하류부 하천 생태계의 탁수영향 평가)

  • Kim, Ja-Hyun;Seo, Jin-Won;Na, Young-Eun;An, Kwang-Guk
    • Korean Journal of Ecology and Environment / v.40 no.1 / pp.130-142 / 2007
  • This study examined the impacts of turbid water on the fish community downstream of Yongdam Dam from June to October 2006. Six sampling sites were selected in the field: two control sites with no influence of turbid water from the dam and four sites for assessing potential turbidity effects. We evaluated integrative health conditions through the application of several models: a necropsy-based fish health assessment (FHA), the Index of Biological Integrity (IBI) based on fish assemblages, and the Qualitative Habitat Evaluation Index (QHEI). Laboratory exposure tests at 400 NTU, examined by scanning electron microscopy (SEM), were performed to determine the impact of turbid water. Fine solid particles clogged the gills of the treated fish, whereas no particles were found in the controls, indicating that abrupt increases in inorganic turbidity may cause mechanical abrasion of the gills or respiratory blockage. Stream health, based on IBI values, ranged from 38 to 48 (average 42), an "excellent" to "good" condition according to the US EPA (1993) criteria. Physical habitat condition, based on the QHEI, ranged from 97 to 187 (average 154), indicating a "suboptimal" condition. These biological outcomes were compared with the chemical dataset: IBI values were more strongly correlated with QHEI (r=0.526, p<0.05, n=18) than with chemical water quality based on turbidity (r=0.260, p>0.05, n=18). The FHA indicated "excellent" individual health, and the QHEI showed no disturbance of habitat (especially bottom substrate and embeddedness), food web, or spawning grounds. Consequently, we concluded that the ecological health downstream of Yongdam Dam was not impaired by the turbid water.
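
As a rough illustration of the correlation comparison reported above, the sketch below computes Pearson correlations of IBI scores against QHEI and against turbidity. The arrays are hypothetical placeholders sized to n=18, not the study's data.

```python
# Minimal sketch: comparing how strongly IBI correlates with QHEI vs. turbidity.
# The values below are hypothetical placeholders, not data from the study.
from scipy.stats import pearsonr

ibi       = [38, 40, 42, 44, 46, 48, 39, 41, 43, 45, 47, 42, 40, 44, 46, 38, 43, 45]
qhei      = [97, 110, 130, 150, 160, 187, 105, 125, 140, 155, 170, 135, 120, 148, 165, 100, 138, 158]
turbidity = [12, 30, 8, 15, 5, 3, 25, 10, 18, 7, 4, 9, 22, 11, 6, 28, 13, 8]  # NTU, illustrative

r_qhei, p_qhei = pearsonr(ibi, qhei)
r_turb, p_turb = pearsonr(ibi, turbidity)
print(f"IBI vs QHEI:      r={r_qhei:.3f}, p={p_qhei:.3f}")
print(f"IBI vs turbidity: r={r_turb:.3f}, p={p_turb:.3f}")
```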

The Analysis of the Fish Assemblage Characteristics by Wetland Type (River and lake) of National Wetland Classification System of Wetlands in Gyeongsangnam-do (국가습지유형분류체계의 습지 유형 (하천형과 호수형)에 따른 경남지역 습지의 어류군집 특성 분석)

  • Kim, Jeong-Hui;Yoon, Ju-Duk;Im, Ran-Young;Kim, Gu-Yeon;Jo, Hyunbin
    • Korean Journal of Ecology and Environment / v.51 no.2 / pp.149-159 / 2018
  • Twenty-nine wetlands in Gyeongsangnam-do (20 river-type and 9 lake-type wetlands) were investigated to characterize fish assemblages by wetland type and to suggest management strategies. On average (±SD), 10.3±4.8 species were collected from river-type wetlands and 9.1±4.1 species from lake-type wetlands; thus, the number of species did not differ significantly between the two types (Mann-Whitney U test, P>0.05). However, the species composing the fish assemblages differed significantly between the two wetland types (PERMANOVA, Pseudo-F=2.9555, P=0.007). The species contributing most to each assemblage type were Zacco koreanus (river type, 28.51%) and Lepomis macrochirus (lake type, 23.21%) (SIMPER). NMDS analysis of the fish assemblages by site classified the species into three groups (river type, lake type, and others). Current wetland management focuses only on endangered species, but this study shows that fish assemblages differ by wetland type; therefore, a management system based on information about endemic species, exotic species, and major contributing species should be provided. Furthermore, the classification of some wetlands based solely on present topography was found to be ambiguous, and wetland classification using living organisms can serve as a complementary method. This study is limited because only two wetland types were analyzed, so a detailed management method representative of every wetland type should be prepared through future research on all wetland types.
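
As a minimal sketch of the species-richness comparison described above, the following applies scipy's Mann-Whitney U test to hypothetical per-wetland species counts (20 river-type, 9 lake-type); the numbers are illustrative, not the survey data.

```python
# Minimal sketch: two-sided Mann-Whitney U test on species counts per wetland.
# Counts are hypothetical placeholders (20 river-type, 9 lake-type wetlands).
from scipy.stats import mannwhitneyu

river_type = [6, 8, 9, 10, 11, 12, 14, 7, 9, 10, 13, 15, 5, 8, 11, 12, 16, 9, 10, 11]
lake_type  = [4, 6, 8, 9, 10, 11, 12, 13, 9]

u_stat, p_value = mannwhitneyu(river_type, lake_type, alternative="two-sided")
print(f"U={u_stat:.1f}, P={p_value:.3f}")  # P > 0.05 would indicate no significant difference
```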

A Short Review on the Acupoints Used in the Studies about Morphine Addiction (모르핀 중독의 침 연구에 사용된 경혈(經穴)에 대한 소고(小考))

  • Lee, Bong-Hyo;Lim, Sung-Chul;Kim, Jae-Su;Lee, Yun-Kyu;Lee, Hyun-Jong;Jung, Tae-Young;Jung, Hyun-Jung;Kam, Chul-Woo
    • Korean Journal of Acupuncture / v.29 no.2 / pp.179-187 / 2012
  • Objectives : Since acupuncture was accepted as a useful therapy for drug addiction, many studies on acupuncture have been carried out. This study was performed to review the articles about morphine addiction that used acupuncture as a treatment and to interpret the use of acupoints from the viewpoint of Six-meridian (Yuk Gyeong, three yin and three yang) theory. Methods : The authors searched 255 articles in PubMed with the keywords "morphine, acupuncture" and 629 articles in KISS (Koreanstudies Information Service System) with the keyword "morphine". Only articles written in English were included. Only articles related to morphine (abuse, dependence, sensitization, addiction, intake, withdrawal sign, withdrawal syndrome, reinstatement, craving) were included. Only articles that used manual or electro-acupuncture were included; auricular acupuncture was excluded. Both clinical and experimental studies were reviewed. Results : The most frequently used acupoint was ST36-SP6 (electroacupuncture), the second was HT7, and the third was LI4; BL23 and PC6 were also used. Conclusions : The acupoints used in the morphine studies appear to influence the brain through diverse mechanisms, and control of the stress response appears to be related to these mechanisms.

Genetic Diversity and Relationship of the Walleye Pollock, Theragra chalcogramma Based on Microsatellite Analysis (Microsatellite marker 분석을 이용한 명태(Theragra chalcogramma) 5 집단의 유전적 다양성 및 유연관계 분석)

  • Dong, Chun Mae;Kang, Jung-Ha;Byun, Soon-Gyu;Park, Kie-Young;Park, Jung Youn;Kong, Hee Jeong;An, Cheul Min;Kim, Gun-Do;Kim, Eun-Mi
    • Journal of Life Science / v.26 no.11 / pp.1237-1244 / 2016
  • A comprehensive analysis of the genetic diversity and relationships of the walleye pollock (Theragra chalcogramma), a cold-water species and the most abundant economically important fishery resource in the East Sea of Korea, had not been carried out despite its importance. The present study assessed the genetic diversity and relationships among five walleye pollock populations (Korean, Russian, USA, and Japanese populations) using eight microsatellite DNA (msDNA) markers, to provide scientific data for the preservation and management of the pollock fishery resource. Analysis of 186 individuals revealed 7.13-10.63 alleles per locus (mean number of alleles = 9.05). The mean observed heterozygosity (H_O) and expected heterozygosity (H_E) were 0.732 and 0.698, respectively. The genetic distance, pairwise F_ST, UPGMA (un-weighted pair-group method with arithmetic average) phylogenetic tree, and PCA (principal coordinate analysis) results pointed to significant, although small, differences between the Korean, Russian, USA, and Japanese populations (p<0.05). These results shed light on the genetic diversity and relationships of T. chalcogramma and can be utilized in research on the evaluation and conservation of Korean T. chalcogramma as a genetic resource.
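
As a minimal sketch of the diversity statistics mentioned above, the following computes expected heterozygosity, H_E = 1 - Σ p_i², from allele counts at a single hypothetical microsatellite locus; the allele sizes and counts are illustrative, not the study's genotypes.

```python
# Minimal sketch: expected heterozygosity H_E = 1 - sum(p_i^2) at a single locus,
# computed from hypothetical allele counts (not the study's genotypes).
from collections import Counter

# Hypothetical allele sizes at one msDNA locus (two alleles per diploid individual)
alleles = [150, 150, 152, 154, 154, 154, 156, 158, 150, 152, 160, 162, 154, 156, 158, 150]

counts = Counter(alleles)
total = sum(counts.values())
h_exp = 1.0 - sum((n / total) ** 2 for n in counts.values())
# Observed heterozygosity (H_O) would instead be the fraction of individuals
# whose two alleles at this locus differ.
print(f"Expected heterozygosity H_E = {h_exp:.3f}")
```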

A Case Study - IT Outsourcing of the Korea Development Bank (산업은행: 금융 IT 아웃소싱 - 공동협력으로 안전한 문을 연다)

  • Kang, Ju-Young;Lee, Jae-Kyu
    • Information Systems Review / v.7 no.2 / pp.229-255 / 2005
  • The Korea Development Bank promoted a total outsourcing of IT operations in 1999, the first such case in the Korean banking industry. The Korea Development Bank became the center of public attention because most banks were unwilling to outsource to external providers out of concern over financial operation accidents, security, and the threat of strikes. After introducing total IT outsourcing, the Korea Development Bank continuously diagnosed the problems of the IT outsourcing and adopted various complementary measures to enhance it. As a result of these enhancements, the IT outsourcing of the Korea Development Bank advanced into a joint liability operation period after going through an outsourcing operation period and a co-operation period. Joint liability operation is the most advanced outsourcing arrangement and was adopted by the Korea Development Bank for the first time in the banking industry. Through joint liability operation, the Korea Development Bank could adopt the most up-to-date IT, concentrate internal manpower on its core capabilities, and secure flexibility of manpower. The bank also changed its relationship with the external provider from a one-sided relationship between a producer and a consumer to a joint liability relationship in which both sides are responsible for operation, and it could integrate its internal capabilities with the professional know-how of the external IT outsourcing company. In this paper, we examine the soundness and validity of banks' concerns about total IT outsourcing to external providers, and we summarize the advantages and outcomes of total IT outsourcing to external providers compared with outsourcing to internal sources. Moreover, we expect that this work can help improve the closed financial IT outsourcing industry structure and raise the global competitiveness of domestic IT outsourcing companies by correcting misconceptions about IT outsourcing to external providers.

Using the METHONTOLOGY Approach to a Graduation Screen Ontology Development: An Experiential Investigation of the METHONTOLOGY Framework

  • Park, Jin-Soo;Sung, Ki-Moon;Moon, Se-Won
    • Asia Pacific Journal of Information Systems / v.20 no.2 / pp.125-155 / 2010
  • Ontologies have been adopted in various business and scientific communities as a key component of the Semantic Web. Despite the increasing importance of ontologies, ontology developers still perceive construction tasks as a challenge. A clearly defined and well-structured methodology can reduce the time required to develop an ontology and increase the probability of success of a project. However, no reliable knowledge-engineering methodology for ontology development currently exists; every methodology has been tailored toward the development of a particular ontology. In this study, we developed a Graduation Screen Ontology (GSO). The graduation screen domain was chosen for several reasons. First, the graduation screen process is a complicated task requiring a complex reasoning process. Second, GSO may be reused by other universities because the graduation screen process is similar at most universities. Finally, GSO can be built within a given period because the size of the selected domain is reasonable. No standard ontology development methodology exists; thus, one of the existing ontology development methodologies had to be chosen. The most important considerations for selecting the ontology development methodology for GSO included whether it can be applied to a new domain, whether it covers a broad set of development tasks, and whether it gives sufficient explanation of each development task. We evaluated various ontology development methodologies based on the evaluation framework proposed by Gómez-Pérez et al. We concluded that METHONTOLOGY was the most applicable to the building of GSO for this study. METHONTOLOGY was derived from the experience of developing the Chemical Ontology at the Polytechnic University of Madrid by Fernández-López et al. and is regarded as the most mature ontology development methodology. METHONTOLOGY describes a very detailed approach for building an ontology at the conceptual level under a centralized development environment. This methodology consists of three broad processes, each containing specific sub-processes: management (scheduling, control, and quality assurance); development (specification, conceptualization, formalization, implementation, and maintenance); and support (knowledge acquisition, evaluation, documentation, configuration management, and integration). An ontology development language and an ontology development tool for GSO construction also had to be selected. We adopted OWL-DL as the ontology development language. OWL was selected because of its computational support for consistency checking and classification, which is crucial in developing coherent and useful ontological models for very complex domains. In addition, Protégé-OWL was chosen as the ontology development tool because it is supported by METHONTOLOGY and is widely used owing to its platform-independent characteristics. Based on the researchers' experience developing GSO, several issues relating to METHONTOLOGY, OWL-DL, and Protégé-OWL were identified. We focus on presenting the drawbacks of METHONTOLOGY and discussing how each weakness could be addressed. First, METHONTOLOGY insists that domain experts who do not have ontology construction experience can easily build ontologies. However, it is still difficult for such domain experts to develop a sophisticated ontology, especially if they have insufficient background knowledge related to the ontology.
Second, METHONTOLOGY does not include a development stage called the "feasibility study." This pre-development stage helps developers ensure not only that a planned ontology is necessary and sufficiently valuable to begin an ontology building project, but also whether the project will be successful. Third, METHONTOLOGY excludes an explanation of the use and integration of existing ontologies; if an additional stage for considering reuse were introduced, developers might share the benefits of reuse. Fourth, METHONTOLOGY fails to address the importance of collaboration. The methodology needs to explain how specific tasks are allocated to different developer groups and how those tasks are combined once each job is completed. Fifth, METHONTOLOGY does not sufficiently describe the methods and techniques applied in the conceptualization stage; introducing methods for extracting concepts from multiple informal sources or for identifying relations may enhance the quality of ontologies. Sixth, METHONTOLOGY does not provide an evaluation process to confirm whether WebODE correctly transforms a conceptual ontology into a formal ontology, nor does it guarantee that the outcomes of the conceptualization stage are completely reflected in the implementation stage. Seventh, METHONTOLOGY needs to add criteria for user evaluation of the actual use of the constructed ontology in user environments. Eighth, although METHONTOLOGY allows continual knowledge acquisition during the ontology development process, consistent updates can be difficult for developers. Ninth, METHONTOLOGY demands that developers complete various documents during the conceptualization stage and can therefore be considered a heavy methodology; adopting an agile approach would reinforce active communication among developers and reduce the burden of documentation. Finally, this study concludes with contributions and practical implications. No previous research has addressed issues related to METHONTOLOGY from empirical experience; this study is an initial attempt. In addition, several lessons learned from the development experience are discussed. This study also offers insights for researchers who want to design a more advanced ontology development methodology.
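
As an illustration of the kind of OWL-DL modeling the abstract describes, here is a minimal sketch using the owlready2 Python library. The ontology IRI and the classes and properties (Student, Course, hasCompleted, creditValue) are hypothetical stand-ins, not the authors' actual GSO content.

```python
# Minimal sketch of an OWL-DL style ontology in Python with owlready2.
# The IRI, classes, and properties below are hypothetical placeholders,
# not the actual Graduation Screen Ontology (GSO).
from owlready2 import get_ontology, Thing, ObjectProperty, DataProperty

onto = get_ontology("http://example.org/gso-sketch.owl")

with onto:
    class Student(Thing): pass
    class Course(Thing): pass

    class hasCompleted(ObjectProperty):
        domain = [Student]
        range  = [Course]

    class creditValue(DataProperty):
        domain = [Course]
        range  = [int]

# Create a few individuals and save the ontology as an OWL (RDF/XML) file.
alice = onto.Student("alice")
db101 = onto.Course("db101")
db101.creditValue = [3]
alice.hasCompleted = [db101]

onto.save(file="gso_sketch.owl")
```

A reasoner such as the one bundled with Protégé could then check consistency and classify the resulting model, which is the kind of OWL-DL support the abstract cites as a selection criterion.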

The Analysis on the Relationship between Firms' Exposures to SNS and Stock Prices in Korea (기업의 SNS 노출과 주식 수익률간의 관계 분석)

  • Kim, Taehwan;Jung, Woo-Jin;Lee, Sang-Yong Tom
    • Asia Pacific Journal of Information Systems / v.24 no.2 / pp.233-253 / 2014
  • Can the stock market really be predicted? Stock market prediction has attracted much attention from many fields, including business, economics, statistics, and mathematics. Early research on stock market prediction was based on random walk theory (RWT) and the efficient market hypothesis (EMH). According to the EMH, stock markets are largely driven by new information rather than by present and past prices; since such information is unpredictable, stock prices will follow a random walk. Despite these theories, Schumaker [2010] asserted that people keep trying to predict the stock market using artificial intelligence, statistical estimates, and mathematical models. Mathematical approaches include percolation methods, log-periodic oscillations, and wavelet transforms for modeling future prices. Artificial intelligence approaches dealing with optimization and machine learning include genetic algorithms, support vector machines (SVM), and neural networks. Statistical approaches typically predict the future using past stock market data. Recently, financial engineers have started to predict stock price movement patterns using SNS data. SNS is a place where people's opinions and ideas flow freely and affect others' beliefs. Through word-of-mouth in SNS, people share product usage experiences, subjective feelings, and the accompanying sentiment or mood with others. An increasing number of empirical analyses of sentiment and mood are based on textual collections of public user-generated data on the web. Opinion mining is a data mining domain that extracts public opinions expressed in SNS. There have been many studies on opinion mining from web sources such as product reviews, forum posts, and blogs. In relation to this literature, we try to understand the effects of firms' SNS exposures on stock prices in Korea. Similarly to Bollen et al. [2011], we empirically analyze the impact of SNS exposures on stock return rates. We use Social Metrics by Daum Soft, an SNS big data analysis company in Korea. Social Metrics provides trends and public opinions in Twitter and blogs using natural language processing and analysis tools. It collects sentences circulating on Twitter in real time, breaks them down into word units, and extracts keywords. In this study, we classify firms' SNS exposures into two groups: positive and negative. To test the correlation and causal relationship between SNS exposures and stock price returns, we first collected 252 firms' stock prices and the KRX100 index from the Korea Stock Exchange (KRX) from May 25, 2012 to September 1, 2012. We also gathered public attitudes (positive, negative) toward these firms from Social Metrics over the same period. We conducted regression analysis between stock prices and the number of SNS exposures. Having checked the correlation between the two variables, we performed a Granger causality test to determine the direction of causation. The results show that the number of total SNS exposures is positively related to stock market returns. The number of positive mentions also has a positive relationship with stock market returns. Conversely, the number of negative mentions has a negative relationship with stock market returns, but this relationship is not statistically significant. This means that the impact of positive mentions is statistically larger than the impact of negative mentions.
We also investigate whether these impacts are moderated by industry type and firm size. We find that the SNS exposure impacts are larger for IT firms than for non-IT firms, and larger for small firms than for large firms. The Granger causality test shows that changes in stock price returns are caused by SNS exposures, while causation in the other direction is not significant; thus, the relationship between SNS exposures and stock prices has unidirectional causality. The more a firm is exposed in SNS, the more likely its stock price is to increase, while stock price changes may not cause more SNS mentions.
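
As a minimal sketch of the Granger causality step described above, the following uses statsmodels' grangercausalitytests on hypothetical daily series of stock returns and SNS mention counts; the simulated data generation is purely illustrative and does not reproduce the study's dataset.

```python
# Minimal sketch: testing whether SNS mention counts Granger-cause stock returns.
# The two daily series below are simulated placeholders, not the study's data.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 70  # roughly one trading quarter of daily observations

mentions = rng.poisson(lam=50, size=n).astype(float)    # daily SNS exposures
noise = rng.normal(scale=0.01, size=n)
returns = 0.0004 * np.roll(mentions, 1) + noise         # returns loosely follow lagged mentions
returns[0] = noise[0]                                   # discard the wrap-around lag

# grangercausalitytests expects a 2-column array ordered [effect, cause]
data = pd.DataFrame({"returns": returns, "mentions": mentions})
results = grangercausalitytests(data[["returns", "mentions"]], maxlag=3)

p_value = results[1][0]["ssr_ftest"][1]  # F-test p-value at lag 1
print(f"Granger causality (mentions -> returns), lag 1: p = {p_value:.4f}")
```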

Comparisons of Popularity- and Expert-Based News Recommendations: Similarities and Importance (인기도 기반의 온라인 추천 뉴스 기사와 전문 편집인 기반의 지면 뉴스 기사의 유사성과 중요도 비교)

  • Suh, Kil-Soo;Lee, Seongwon;Suh, Eung-Kyo;Kang, Hyebin;Lee, Seungwon;Lee, Un-Kon
    • Asia Pacific Journal of Information Systems / v.24 no.2 / pp.191-210 / 2014
  • As mobile devices that can be connected to the Internet have spread and networking has become possible whenever and wherever, the Internet has become central to the dissemination and consumption of news. Accordingly, the ways news is gathered, disseminated, and consumed have changed greatly. In traditional news media such as magazines and newspapers, expert editors determined what events were worthy of deploying their staffs or freelancers to cover and which stories from newswires or other sources would be printed. Furthermore, they determined how these stories would be displayed in their publications in terms of page placement, space allocation, type sizes, photographs, and other graphic elements. In turn, readers, the news consumers, judged the importance of news not only by its subject and content, but also through subsidiary information such as its location and how it was displayed. Their judgments reflected their acceptance of an assumption that these expert editors had the knowledge and ability not only to serve as gatekeepers in determining what news was valuable and important but also to rank its value and importance. News assembled, dispensed, and consumed in this manner can be said to be expert-based recommended news. However, in the era of Internet news, the gatekeeping role of expert editors has been greatly diminished. Many Internet news sites offer a huge volume of news on diverse topics from many media companies, in many cases eliminating the gatekeeper role of expert editors. One result has been to turn news users from passive receptacles into active searchers for news that reflects their interests or tastes. To solve the problem of information overload and enhance the efficiency of news users' searches, Internet news sites have introduced numerous recommendation techniques. Recommendations based on popularity constitute one of the most frequently used of these techniques. This popularity-based approach shows a list of the news items that have been read and shared by many people, based on user behavior such as clicks, evaluations, and sharing. The "most-viewed list," "most-replied list," and "real-time issue" features found on news sites belong to this approach. Given that collective intelligence serves as the premise of these popularity-based recommendations, they could be considered highly important because stories that have been read and shared by many people are presumably more likely to be better than those preferred by only a few people. However, these recommendations may reflect a popularity bias, because stories judged likely to be more popular have been placed where they will be most noticeable; as a result, such stories are more likely to be continuously exposed and included in popularity-based recommended news lists. Popular news stories are not necessarily those that are most important to readers. Given that many people use popularity-based recommended news and that the popularity-based recommendation approach greatly affects patterns of news use, reviewing whether popularity-based news recommendations actually reflect important news is an indispensable step. Therefore, in this study, the popularity-based news recommendations of an Internet news portal were compared with the top placements of news in printed newspapers, and news users' judgments of which stories were personally and socially important were analyzed. The study was conducted in two stages.
In the first stage, content analyses were used to compare the content of the popularity-based news recommendations of an Internet news site with that of the expert-based news recommendations of printed newspapers. Five days of news stories were collected. The "most-viewed list" of the Naver portal site was used as the popularity-based recommendation; the expert-based recommendations were represented by the top news items from five major daily newspapers: the Chosun Ilbo, the JoongAng Ilbo, the Dong-A Daily News, the Hankyoreh Shinmun, and the Kyunghyang Shinmun. In the second stage, along with the news stories collected in the first stage, some Internet news stories and some printed-newspaper stories that the two channels did not have in common were randomly extracted and used in online questionnaire surveys asking about the importance of the selected stories. According to our analysis, only 10.81% of the popularity-based news recommendations were similar in content to the expert-based news judgments; the content of popularity-based recommendations therefore appears to be quite different from that of expert-based recommendations. The differences in importance between these two groups of news stories were analyzed, and the results indicated that whereas the two groups did not differ significantly in stories of personal importance, the expert-based recommendations ranked higher in social importance. This study is important for theory in its examination of popularity-based news recommendations from the two theoretical viewpoints of collective intelligence and popularity bias and in its use of both qualitative (content analysis) and quantitative (questionnaire) methods. It also sheds light on the differences in role between media channels that fulfill an agenda-setting function and Internet news sites that treat news from the viewpoint of markets.
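
As a simple illustration of the first-stage overlap comparison, the sketch below computes the percentage of popularity-based items that also appear among the expert-based top stories; the story identifiers are hypothetical placeholders, not the collected headlines.

```python
# Minimal sketch: share of popularity-based items that overlap with
# expert-based (front-page) items. Story IDs are hypothetical placeholders.
popularity_based = {"story_01", "story_02", "story_03", "story_04", "story_05",
                    "story_06", "story_07", "story_08", "story_09", "story_10"}
expert_based     = {"story_03", "story_11", "story_12", "story_13", "story_14"}

overlap = popularity_based & expert_based
overlap_rate = 100 * len(overlap) / len(popularity_based)
print(f"Overlap: {overlap_rate:.2f}% of popularity-based items")  # e.g., 10.00%
```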

Effect of Species and Tedding Frequency on the Quality of Annual Legume Hay in Spring (초종 및 반전횟수가 봄철 일년생 콩과목초 건초의 품질에 미치는 영향)

  • Kim, J.D.;Kwon, C.H.;Kim, H.J.;Kim, M.G.
    • Journal of Animal Science and Technology / v.46 no.3 / pp.451-458 / 2004
  • No comprehensive study of the forage quality of annual legumes harvested and field-cured in spring had been conducted in Korea. Therefore, this experiment was carried out to gain information on the quality of crimson clover (Trifolium incarnatum L.), bolta balansa clover (Trifolium balansae L.), and persian clover (Trifolium resupinatum L.) during field curing in spring. The dry matter content of crimson clover at harvest was 24.7%, while bolta balansa and persian clovers had 20.4 and 18.8%, respectively. The moisture content of persian clover was lowest on the final curing day, but all species took 4 days to reach a moisture content under 20%. Tedding frequency did not affect moisture content, although consistent trends were observed during field curing. Persian clover tended to show a higher leaf-to-stem ratio than crimson and bolta balansa clovers on a dry matter basis. The crude protein content of persian clover (19.5%) was higher than that of the other legumes. The percentage of crude protein decreased from 17.8 to 16.5% as tedding frequency increased. The neutral detergent fiber (NDF) and acid detergent fiber (ADF) contents of persian clover were lower than those of the other legumes. Comparing tedding frequencies, the NDF and ADF contents for three teddings were higher than those for one or two teddings. The relative feed value (RFV) of persian clover hay was the highest (178) and was classified as Grade Prime in the forage quality standard. Crimson and bolta balansa clover hays also ranked as high quality (Grade 1) by RFV. The RFV of legume hay decreased from 150 to 140 as tedding frequency increased. The results indicate that the hay quality of persian clover was higher than that of the other clovers, owing to its high leaf-to-stem ratio, hollow stems, and late maturity stage. Thus, tedding annual legume hay twice appears sufficient to maintain quality.
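
As a rough illustration of the relative feed value figures quoted above, the sketch below uses the commonly cited RFV equations (DDM = 88.9 - 0.779 × ADF%, DMI = 120 / NDF%, RFV = DDM × DMI / 1.29); the ADF and NDF inputs are hypothetical, and the paper may have used a slightly different formulation.

```python
# Minimal sketch: relative feed value (RFV) from ADF and NDF percentages,
# using the commonly cited equations; the input values are hypothetical.
def relative_feed_value(adf_pct: float, ndf_pct: float) -> float:
    ddm = 88.9 - 0.779 * adf_pct   # digestible dry matter, % of DM
    dmi = 120.0 / ndf_pct          # dry matter intake, % of body weight
    return ddm * dmi / 1.29

# Hypothetical fiber contents for a high-quality legume hay
print(round(relative_feed_value(adf_pct=27.0, ndf_pct=38.0), 1))  # about 166, Grade Prime range (RFV > 151)
```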

Comparison of Results between Cytogenetic Technique and Molecular Genetic Technique in Colorectal Carcinoma Patients (대장암환자의 염색체 결실에서 세포유전학적 기법과 분자유전학적 기법의 결과 비교)

  • Park, Cheolin;Lee, Jae Sik
    • Korean Journal of Clinical Laboratory Science / v.49 no.3 / pp.285-293 / 2017
  • Globally, 1.3 million people develop colon cancer every year, and 600,000 people die from it each year. In Korea, colorectal carcinoma had the highest death rate among solid cancers in 2015, accounting for 8,380 deaths. Among the various methods for the diagnosis and study of colorectal carcinoma, we compared the results obtained by cytogenetic and molecular genetic methods. With CGH, the detection rate was 47% at 18q, 40% at 17p, 27% at 22q, and 17% at 10q; with LOH, the detection rate was 57% at D18S59, 50% at D18S68, 50% at TP53CA, 47% at D18S69, 40% at D22S274, 37% at D22S283, 27% at D10S187, and 23% at D10S541. Microsatellite marker matching rates were 100% at D22S274, 100% at D22S283, 100% at D10S186, 100% at D10S187, 100% at D10S541, 93% at D18S69, 93% at D18S68, 92% at TP53CA, and 89% at D18S59. The agreement rate between the two methods, based on positive CGH results, was 94.4%. Taking advantage of CGH, which provides information on the entire tumor genome at once, this experiment could identify regions with significant deletions using CGH and then examine the more limited regions with LOH, a completely different approach. LOH at 18q21 in the high-risk recurrence group was helpful in selecting treatment modalities and estimating prognosis, as well as in making the most appropriate treatment decisions. Therefore, it is suggested that LOH analysis of surgical-site tissues could be one of the methods for managing the high-risk recurrence group among patients with colorectal carcinoma.
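
As a simple illustration of the agreement figure reported above, the sketch below computes percent agreement between CGH and LOH calls over a set of loci, conditioned on CGH-positive results as in the abstract; the per-locus boolean calls are hypothetical placeholders.

```python
# Minimal sketch: percent agreement between CGH-positive calls and LOH calls.
# The per-locus boolean results below are hypothetical, not the study's data.
cgh_positive = [True, True, True, False, True, True, False, True, True, True]
loh_positive = [True, True, True, False, True, False, False, True, True, True]

cgh_pos_indices = [i for i, call in enumerate(cgh_positive) if call]
agree = sum(loh_positive[i] for i in cgh_pos_indices)
agreement_rate = 100 * agree / len(cgh_pos_indices)
print(f"Agreement (based on CGH-positive loci): {agreement_rate:.1f}%")
```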