• Title/Summary/Keyword: 민감한 정보 (sensitive information)

Search Results: 1,891 (processing time: 0.037 seconds)

Increase of Tc-99m RBC SPECT Sensitivity for Small Liver Hemangioma using Ordered Subset Expectation Maximization Technique (Tc-99m RBC SPECT에서 Ordered Subset Expectation Maximization 기법을 이용한 작은 간 혈관종 진단 예민도의 향상)

  • Jeon, Tae-Joo;Bong, Jung-Kyun;Kim, Hee-Joung;Kim, Myung-Jin;Lee, Jong-Doo
    • The Korean Journal of Nuclear Medicine / v.36 no.6 / pp.344-356 / 2002
  • Purpose: RBC blood pool SPECT has been used to diagnose focal liver lesions such as hemangioma owing to its high specificity. However, low spatial resolution is a major limitation of this modality. Recently, ordered subset expectation maximization (OSEM) has been introduced to obtain tomographic images for clinical application. We compared this modified iterative reconstruction method, OSEM, with conventional filtered back projection (FBP) in the imaging of liver hemangioma. Materials and Methods: Sixty-four projections were acquired with a dual-head gamma camera in 28 lesions of 24 patients with cavernous hemangioma of the liver, and the raw data were transferred to a Linux-based personal computer. After converting the header files to Interfile format, OSEM was performed under various combinations of subsets (1, 2, 4, 8, 16, and 32) and iteration numbers (1, 2, 4, 8, and 16) to find the best setting for liver imaging; in our investigation, 4 iterations with 16 subsets proved best. All images were then processed by both FBP and OSEM, and three experts reviewed them blinded to any clinical information. Results: In the blind review of the 28 lesions, OSEM images showed image quality equal or superior to FBP in nearly all cases. Although there was no significant difference in the detection of large lesions (more than 3 cm), five lesions 1.5 to 3 cm in diameter were detected by OSEM only. Both techniques, however, failed to depict four small lesions of less than 1.5 cm. Conclusion: OSEM provided better contrast and definition in the depiction of liver hemangioma as well as higher sensitivity in the detection of small lesions. Furthermore, this reconstruction method does not require a high-performance computer system or a long reconstruction time; OSEM therefore appears to be a good method for RBC blood pool SPECT in the diagnosis of liver hemangioma.
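For readers who want to see the reconstruction step concretely, here is a minimal NumPy sketch of the OSEM update (multiplicative EM updates applied subset by subset), using the 16-subset, 4-iteration setting the authors found best. The idealized dense system matrix and all names are illustrative assumptions, not the clinical implementation.

```python
import numpy as np

def osem(projections, system_matrix, n_subsets=16, n_iters=4):
    """OSEM reconstruction sketch.

    projections:   measured counts, shape (n_bins,)
    system_matrix: shape (n_bins, n_voxels); entry (i, j) models the
                   chance a photon emitted in voxel j hits bin i.
    """
    n_bins, n_voxels = system_matrix.shape
    x = np.ones(n_voxels)                          # uniform start image
    subsets = np.array_split(np.arange(n_bins), n_subsets)
    for _ in range(n_iters):
        for idx in subsets:                        # one EM step per subset
            A = system_matrix[idx]
            expected = A @ x + 1e-12               # forward projection
            ratio = projections[idx] / expected    # measured / expected
            x *= (A.T @ ratio) / (A.sum(axis=0) + 1e-12)
    return x
```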

Diagnostic Availability of Estrogen Receptor Alpha mRNA on Cervical Cancer Tissue (자궁경부암 조직에서 에스트로겐 수용체 알파 mRNA의 진단적 유용성)

  • Kim, Geehyuk;Yu, Kwangmin;Kim, Jungho;Kim, Seoyong;Park, Sunyoung;Ahn, Sungwoo;Lee, Ji-Young;Kim, Sunghyun;Park, Ho-Hyun;Lee, Dongsup
    • Korean Journal of Clinical Laboratory Science / v.50 no.4 / pp.449-456 / 2018
  • Cervical cancer is the fourth most frequently diagnosed cancer in women worldwide. In countries with a lower Human Development Index, it has the second-highest incidence and mortality among cancers in women. Therefore, better diagnostic and treatment systems are needed. Estrogen receptor alpha (ER-α) mRNA expression has been analyzed by RT-qPCR, since several studies have reported that ER-α is necessary for the maturation of the uterus and is related to cervical cancer. In this study, ER-α quantitative analysis was performed on various lesion and normal tissue samples. Based on the receiver operating characteristic (ROC) curve, sensitivity and specificity were 85% and 75%, respectively, results higher than or similar to those of conventional HPV tests. In addition, the expression level was analyzed against clinical information. In a regression analysis, the R-squared value between ER-α mRNA expression level and menopause status was 0.5041, indicating a strong correlation. This work was performed as a pilot study and suggests that ER-α is related to carcinogenesis. Future studies will examine other hormones and menopausal factors with a larger sample size.
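As a worked illustration of how a sensitivity/specificity pair falls out of an ROC analysis, here is a small scikit-learn sketch; the labels and ER-α expression scores below are invented toy values, not the study's data (which yielded 85% sensitivity and 75% specificity).

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Toy data: 1 = cancer tissue, 0 = normal; scores stand in for
# ER-alpha mRNA quantities from RT-qPCR (illustrative values).
y_true  = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
y_score = np.array([8.2, 7.1, 6.5, 3.3, 2.9, 5.8, 6.0, 1.7, 7.9, 3.6])

fpr, tpr, thr = roc_curve(y_true, y_score)
j = np.argmax(tpr - fpr)            # Youden's J: best sens/spec trade-off
print("AUC        :", roc_auc_score(y_true, y_score))
print("threshold  :", thr[j])
print("sensitivity:", tpr[j])
print("specificity:", 1 - fpr[j])
```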

Fabrication of Portable Self-Powered Wireless Data Transmitting and Receiving System for User Environment Monitoring (사용자 환경 모니터링을 위한 소형 자가발전 무선 데이터 송수신 시스템 개발)

  • Jang, Sunmin;Cho, Sumin;Joung, Yoonsu;Kim, Jaehyoung;Kim, Hyeonsu;Jang, Dayeon;Ra, Yoonsang;Lee, Donghan;La, Moonwoo;Choi, Dongwhi
    • Korean Chemical Engineering Research / v.60 no.2 / pp.249-254 / 2022
  • With the rapid advance of semiconductor and information and communication technologies, remote environmental monitoring technology, which can detect and analyze surrounding environmental conditions with various types of sensors and wireless communication, is drawing attention. However, because conventional remote environmental monitoring systems require external power supplies, they impose temporal and spatial limits on convenient use. In this study, we propose a self-powered remote environmental monitoring system whose power is supplied by a levitation electromagnetic generator (L-EMG) rationally designed to harvest biomechanical energy effectively. The L-EMG uses a movable center magnet to respond to external vibration, reflecting the mechanical characteristics of biomechanical energy, namely its relatively low frequency and high amplitude of vibration. Based on this delicate force equilibrium, the L-EMG can generate electrical energy of sufficient quality to serve as the power supply. In addition, the environmental sensor and wireless transmission module are managed by a microcontroller unit (MCU) that applies a sleep mode to minimize the power required for device operation, extending the operating time. Finally, to maximize user convenience, a mobile phone application was built for easy monitoring of the surrounding environment. The proposed concept thus not only verifies the feasibility of a self-powered remote environmental monitoring system using biomechanical energy but also suggests a design guideline.
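As a rough illustration of the sleep-mode strategy described above, here is a MicroPython-style sketch of a duty-cycled sense-and-transmit loop. The sensor and radio drivers (read_environment, radio_send) are hypothetical placeholders, and the wake interval is an arbitrary example; only machine.deepsleep is a standard MicroPython call.

```python
# MicroPython-style sketch (runs on a device port, not desktop Python).
import machine

WAKE_INTERVAL_MS = 60_000  # sleep dominates each cycle to save energy

def read_environment():
    ...  # hypothetical driver: poll temperature/humidity/etc. sensors

def radio_send(payload):
    ...  # hypothetical driver: burst-transmit to the phone application

sample = read_environment()          # brief active window
radio_send(sample)
machine.deepsleep(WAKE_INTERVAL_MS)  # power down the MCU; on wake-up the
                                     # board resets and the script reruns
```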

Comparative assessment and uncertainty analysis of ensemble-based hydrologic data assimilation using airGRdatassim (airGRdatassim을 이용한 앙상블 기반 수문자료동화 기법의 비교 및 불확실성 평가)

  • Lee, Garim;Lee, Songhee;Kim, Bomi;Woo, Dong Kook;Noh, Seong Jin
    • Journal of Korea Water Resources Association / v.55 no.10 / pp.761-774 / 2022
  • Accurate hydrologic prediction is essential for analyzing the effects of drought, flood, and climate change on flow rates, water quality, and ecosystems. Disentangling the uncertainty of the hydrological model is one of the important issues in hydrology and water resources research. Hydrologic data assimilation (DA), a technique that updates the states or parameters of a hydrological model to produce the most likely estimates of its initial conditions, is one way to minimize uncertainty in hydrological simulations and improve predictive accuracy. In this study, two ensemble-based sequential DA techniques, the ensemble Kalman filter and the particle filter, are comparatively analyzed for daily discharge simulation at the Yongdam catchment using airGRdatassim. The results showed that the Kling-Gupta efficiency (KGE) improved from 0.799 in the open-loop simulation to 0.826 with the ensemble Kalman filter and to 0.933 with the particle filter. In addition, we analyzed the effects of hyper-parameters of the assimilation methods, such as the precipitation and potential evaporation forcing error parameters and the selection of perturbed and updated states. Across the forcing error conditions, the particle filter was superior to the ensemble Kalman filter in terms of the KGE index, and the optimal forcing noise was relatively smaller for the particle filter. Moreover, with more state variables included in the updating step, the performance of data assimilation improved, implying that the selection of updated states can itself be treated as a hyper-parameter. The simulation experiments in this study indicate that DA hyper-parameters need to be carefully optimized to exploit the potential of DA methods.
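For reference, the KGE metric used to score the simulations combines correlation, a variability ratio, and a bias ratio, following the standard Gupta et al. (2009) definition. A minimal NumPy sketch, with toy discharge series standing in for the Yongdam data:

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta efficiency (Gupta et al., 2009); 1 = perfect fit."""
    r = np.corrcoef(sim, obs)[0, 1]        # linear correlation
    alpha = np.std(sim) / np.std(obs)      # variability ratio
    beta = np.mean(sim) / np.mean(obs)     # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([3.1, 4.0, 9.8, 7.2, 5.0])          # observed discharge (toy)
open_loop = np.array([2.5, 3.1, 7.9, 8.8, 6.0])    # no assimilation
assimilated = np.array([3.0, 3.8, 9.2, 7.6, 5.2])  # after a DA update
print(kge(open_loop, obs), kge(assimilated, obs))  # DA should raise KGE
```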

Dynamic Equilibrium Position Prediction Model for the Confluence Area of Nakdong River (낙동강 합류부 삼각주의 동적 평형 위치 예측 모델: 감천-낙동강 합류점 중심 분석 연구)

  • Minsik Kim;Haein Shin;Wook-Hyun Nahm;Wonsuck Kim
    • Economic and Environmental Geology / v.56 no.4 / pp.435-445 / 2023
  • A delta is a depositional landform formed when sediment transported by a river is deposited in a relatively low-energy environment, such as a lake, a sea, or a main channel. Among deltas, one formed at the confluence of rivers is of particular importance to river management and research because it strongly affects the hydraulic and sedimentological characteristics of the river. Recently, the equilibrium state of the confluence area of the Nakdong River was disrupted by large-scale dredging and levee construction; owing to the river's natural recovery, however, the confluence area is returning to its pre-dredging state through ongoing sedimentation. Time-series data show that the confluence delta grew steadily after the dredging, but once it reached a certain size it repeated cycles of growth and retreat with little change in overall size. In this study, we developed a model to explain the sedimentation-erosion processes in the confluence area based on the assumption that the confluence delta reaches a dynamic equilibrium. The model rests on two fundamental principles: sedimentation due to supply from the tributary and erosion due to the main channel. The erosion coefficient representing the Nakdong River confluence areas was obtained using data from the tributaries of the Nakdong River. Sensitivity analyses with the developed model showed how the confluence delta responds to changes in the sediment and water discharges of the tributary and the main channel, respectively. We then used the annual average discharges of the Nakdong River's tributaries to predict the dynamic equilibrium positions of the confluence deltas, and finally simulated the development of the Gamcheon-Nakdong River delta using recorded daily discharge. Although it is a simple model, it accurately predicted the dynamic equilibrium positions of the confluence deltas in the Nakdong River, both where a delta had not yet formed and where one already existed, and it reproduced the trend of the Gamcheon-Nakdong River delta's response. The actual retreat of the Gamcheon-Nakdong River delta, however, was not fully captured, owing to errors and the limitations of the simplification process. The insights from this study provide basic information on sediment supply through the confluence areas of the Nakdong River and can serve as a basic model for river maintenance and management.
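The two competing principles lend themselves to a toy balance model. The sketch below is only an illustrative linear form consistent with the abstract (deposition proportional to tributary sediment supply, erosion growing with main-channel discharge and delta size); the coefficients and functional forms are assumptions, not the authors' calibrated model.

```python
# Toy sedimentation-erosion balance for a confluence delta.
# Assumed linear forms; k_e plays the role of the erosion coefficient.
def simulate_delta(L0, Qs_trib, Q_main, k_e, dt=1.0, n_steps=1000):
    L = L0
    for _ in range(n_steps):
        growth = Qs_trib              # deposition from tributary supply
        erosion = k_e * Q_main * L    # erosion by the main channel
        L += (growth - erosion) * dt
    return L

# Dynamic equilibrium: growth == erosion, so L_eq = Qs_trib / (k_e * Q_main).
Qs_trib, Q_main, k_e = 2.0, 100.0, 0.001
print(simulate_delta(0.0, Qs_trib, Q_main, k_e))  # converges toward 20.0
print(Qs_trib / (k_e * Q_main))                   # analytic equilibrium: 20.0
```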

Analysis of Changes in Pine Forests According to Natural Forest Dynamics Using Time-series NFI Data (시계열 국가산림자원조사 자료 기반 자연적 임분동태 변화에 따른 소나무림의 감소 특성 평가)

  • Eun-Sook Kim;Jong Bin Jung;Sinyoung Park
    • Journal of Korean Society of Forest Science / v.113 no.1 / pp.40-50 / 2024
  • Pine forests are continuously declining through competition with broadleaf trees such as oaks, a consequence of changes in the natural dynamics of the forest ecosystem. This natural decline creates a risk of losing the various benefits pine trees have provided to people in the past. It is therefore necessary to prepare future forest management directions that take into account the state of pine decline in each region. The goal of this study was to understand the characteristics of pine forest change under forest dynamics and to predict future regional changes. For this purpose, we evaluated the trend of change in pine forests and, from time-series National Forest Inventory (NFI) data, extracted the variables (topography, forest stand type, disturbance, and climate) that affect the change. Using the selected key variables, we then developed a model to predict future changes in pine forests. The results showed that the importance of pine trees in forests across the country decreased overall over the past 10 years: 75% of the sample points representing pine remained unchanged, while the remaining 25% changed to mixed forest, mainly in areas with good moisture conditions or with disturbance factors inside and outside the forest. Over the next 10 years, approximately 14.2% of current pine forests were predicted to convert to mixed forest through changes in natural forest dynamics. Regionally, the rate of pine forest change was highest in Jeju (42.8%) and Gyeonggi (26.9%) and lowest in Gyeongbuk (8.8%) and Gangwon (13.8%), and pine forests were predicted to face a high risk of decline in the western areas of the Korean Peninsula, including Gyeonggi, Chungcheong, and Jeonnam. These results can be used to draw up management plans for pine forests throughout the country.
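The abstract does not name the prediction algorithm, so the following is only a shape-of-the-solution sketch: a random-forest classifier on a fabricated plot table whose columns echo the variable groups named above (topography, stand type, disturbance, climate). Every value and name here is invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Fabricated NFI-like plot table: [slope, stand_density, disturbance, moisture]
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
# y = 1: pine plot converts to mixed forest within 10 years (synthetic rule)
y = (X[:, 3] + X[:, 2] + rng.normal(scale=0.5, size=500) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
print("variable importances:", clf.feature_importances_)
```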

Extension Method of Association Rules Using Social Network Analysis (사회연결망 분석을 활용한 연관규칙 확장기법)

  • Lee, Dongwon
    • Journal of Intelligence and Information Systems / v.23 no.4 / pp.111-126 / 2017
  • Recommender systems based on association rule mining contribute significantly to sellers' sales by reducing the time consumers spend searching for the products they want. Recommendations based on the frequency of transactions such as orders can effectively single out the products that are statistically marketable among many products. A product with high sales potential, however, can be omitted from the recommendations if it records an insufficient number of transactions at the beginning of its sale. Products missing from the association-based recommendations lose exposure to consumers, which leads to fewer transactions; diminished transactions in turn mean fewer chances to be recommended, creating a vicious circle in which initial sales remain stagnant for a certain period. Products susceptible to fashion or seasonality, such as clothing, may be affected most. This study aimed to expand association rules so that the recommendation list includes products whose initial transaction frequency is low despite their potential for high sales. The specific purpose is to predict the strength of the direct connection between two unconnected items from the properties of the paths located between them. An association between two items revealed in transactions can be interpreted as an interaction between them, which can be expressed as a link in a social network whose nodes are items. The first step calculates the centralities of the nodes lying on the paths that indirectly connect the two nodes lacking a direct connection. The next step identifies the number of such paths and the shortest among them. These extracted measures are used as independent variables in a regression analysis to predict the future connection strength between the nodes. The connection strength between the two nodes, which the model defines by the number of nodes between them, is measured after a certain period of time. The regression results confirm that the number of paths between two products, the length of the shortest path, and the number of neighboring items connected to the products are significantly related to their potential connection strength. This study used actual order transaction data collected over three months, from February to April 2016, from an online commerce company. To reduce the complexity of the analytics as the network grows, the analysis was performed only on miscellaneous goods. Two consecutively purchased items were chosen from each customer's transactions to obtain an antecedent-consequent pair, which provides a link for constituting the social network; the direction of each link was determined by the order in which the goods were purchased. The social network of associated items was built from all but the last ten days of the collection period for the extraction of the independent variables, and the model predicts which links will form in those final ten days. Of the 5,711 previously unconnected links, 611 were newly connected during the last ten days. In the experiments the proposed model demonstrated excellent predictive performance: of the 571 links it predicted, 269 were confirmed to have been connected, 4.4 times the average of 61 that would be found without any prediction model.
This study is expected to be useful for industries that launch new products quickly with short life cycles, since exposure time is critical for them. It can also be used to detect diseases that are rarely found in the early stages of medical treatment because of their low incidence. Since the complexity of social network analysis is sensitive to the number of nodes and links that make up the network, this study was conducted on a single category of miscellaneous goods; future research should consider that this condition may limit the opportunity to detect unexpected associations between products belonging to different categories.
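The path-based features described above map naturally onto standard graph tooling. Below is a small NetworkX sketch of extracting them for a currently unconnected item pair on a toy directed graph; the edges, function name, and exact feature set are illustrative, not the paper's code.

```python
import networkx as nx

# Toy directed item graph: an edge A -> B means B was bought after A.
G = nx.DiGraph([("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"), ("C", "E")])

def link_features(G, u, v, cutoff=4):
    """Path-based features for a currently unconnected pair (u, v)."""
    paths = list(nx.all_simple_paths(G, u, v, cutoff=cutoff))
    betweenness = nx.betweenness_centrality(G)
    intermediates = {n for p in paths for n in p[1:-1]}
    return {
        "n_paths": len(paths),                       # number of indirect paths
        "shortest": min((len(p) - 1 for p in paths), default=None),
        "mid_centrality": sum(betweenness[n] for n in intermediates),
        "neighbors": G.out_degree(u) + G.in_degree(v),
    }

# These become the independent variables of the regression that predicts
# whether (u, v) will connect in the following period.
print(link_features(G, "A", "C"))
```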

Pareto Ratio and Inequality Level of Knowledge Sharing in Virtual Knowledge Collaboration: Analysis of Behaviors on Wikipedia (지식 공유의 파레토 비율 및 불평등 정도와 가상 지식 협업: 위키피디아 행위 데이터 분석)

  • Park, Hyun-Jung;Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems / v.20 no.3 / pp.19-43 / 2014
  • The Pareto principle, also known as the 80-20 rule, states that roughly 80% of the effects come from 20% of the causes for many events, including natural phenomena. It has been recognized as a golden rule in business, with wide application of findings such as 20 percent of customers accounting for 80 percent of total sales. On the other hand, the Long Tail theory, which points out that "the trivial many" produce more value than "the vital few," has gained popularity in recent times as the development of ICT (Information and Communication Technology) dramatically reduced distribution and inventory costs. This study set out to illuminate how these two primary business paradigms, the Pareto principle and the Long Tail theory, relate to the success of virtual knowledge collaboration. The importance of virtual knowledge collaboration is soaring in this era of globalization and virtualization transcending geographical and temporal constraints. Many previous studies on knowledge sharing have focused on the factors that affect it, seeking to boost individual knowledge sharing and to resolve the social dilemma arising from the fact that rational individuals are likely to consume rather than contribute knowledge. Knowledge collaboration can be defined as the creation of knowledge not only by sharing knowledge but also by transforming and integrating it. From this perspective, the relative distribution of knowledge sharing among participants can count as much as the absolute amount of individual knowledge sharing. In particular, whether a greater contribution by the upper 20 percent of participants enhances the efficiency of overall knowledge collaboration is a question of interest. This study examines the effect of this distribution of knowledge sharing on the efficiency of knowledge collaboration and extends the analysis to reflect task characteristics. All analyses were conducted on actual data instead of self-reported questionnaire surveys. More specifically, we analyzed the collaborative behaviors of the editors of 2,978 English Wikipedia featured articles, the best quality grade of articles in English Wikipedia. We adopted the Pareto ratio, the ratio of the number of knowledge contributions made by the upper 20 percent of participants to the total number of contributions made by all participants of an article group, to examine the effect of the Pareto principle. In addition, the Gini coefficient, which represents the inequality of income within a group of people, was applied to reveal the effect of inequality of knowledge contribution. Hypotheses were set up on the assumption that a higher ratio of contribution by more highly motivated participants leads to higher collaboration efficiency, but that if the ratio becomes too high, collaboration efficiency deteriorates because overall informational diversity is threatened and the contribution of less motivated participants is discouraged. Cox regression models were formulated for each of the focal variables, the Pareto ratio and the Gini coefficient, with seven control variables such as the number of editors involved in an article, the average time between successive edits of an article, the number of sections a featured article has, and so on. The dependent variable of the Cox models is the time from article initiation to promotion to featured-article status, indicating the efficiency of knowledge collaboration.
To examine whether the effects of the focal variables vary with the characteristics of a group task, we classified the 2,978 featured articles into two categories, academic and non-academic, where academic articles refer to at least one paper published in an SCI, SSCI, A&HCI, or SCIE journal. We assumed that academic articles are more complex, entail more information processing and problem solving, and thus require more skill variety and expertise. The analysis results indicate the following. First, the Pareto ratio and the inequality of knowledge sharing relate in a curvilinear fashion to collaboration efficiency in an online community, promoting it up to an optimal point and undermining it thereafter. Second, this curvilinear effect of the Pareto ratio and of inequality of knowledge sharing on collaboration efficiency is more pronounced for more academic tasks.
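Both focal measures are simple to compute. A minimal NumPy sketch, using a fabricated per-editor edit-count list in place of the Wikipedia data:

```python
import numpy as np

def pareto_ratio(contribs):
    """Share of total contributions made by the top 20% of contributors."""
    c = np.sort(np.asarray(contribs, dtype=float))[::-1]
    top = max(1, int(np.ceil(0.2 * len(c))))
    return c[:top].sum() / c.sum()

def gini(contribs):
    """Gini coefficient of the contribution distribution (0 = perfect equality)."""
    c = np.sort(np.asarray(contribs, dtype=float))   # ascending order
    n = len(c)
    return (2 * np.arange(1, n + 1) - n - 1) @ c / (n * c.sum())

edits = [120, 40, 15, 9, 5, 4, 3, 2, 1, 1]  # edits per editor (toy data)
print(pareto_ratio(edits), gini(edits))
```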

The Effects of Self-regulatory Resources and Construal Levels on the Choices of Zero-cost Products (자아조절자원 및 해석수준이 공짜대안 선택에 미치는 영향)

  • Lee, Jinyong;Im, Seoung Ah
    • Asia Marketing Journal / v.13 no.4 / pp.55-76 / 2012
  • Most people prefer to choose zero-cost products that they can get without paying any money. This 'zero-cost effect' can be explained with a 'zero-cost model' in which consumers attach special value to zero-cost products, in a departure from general economic models (Shampanier, Mazar, and Ariely 2007). If two products at the regular prices of ₩200 and ₩400 simultaneously offer ₩200 discounts, the prices change to ₩0 and ₩200, respectively. Despite the unchanged ₩200 price gap between the two products after the discounts, people are much more likely to select the free alternative than the same product at the price of ₩200. Whereas prior studies have examined the 'zero-cost effect' in isolation from other factors, this study investigates the moderating effects of self-regulatory resources and construal levels on the selection of free products. Self-regulatory resources enable people to control or regulate their behavior; however, because these resources are limited, they are easily depleted when exerted (Muraven, Tice, and Baumeister 1998). Without the resources, consumers tend to become less sensitive to price changes and to spend money more extravagantly (Vohs and Faber 2007). Under this condition, they are also likely to invest less effort in their information processing and to make more intuitive decisions (Pocheptsova, Amir, Dhar, and Baumeister 2009). Context effects such as price changes and zero-cost effects are therefore less likely under resource depletion. In addition, construal levels have profound effects on the way information is processed (Trope and Liberman 2003, 2010). At a high construal level, people tend to attune their minds to core features and desirability aspects, whereas at a low construal level they are more likely to process information based on secondary features and feasibility aspects (Khan, Zhu, and Kalra 2010). The perceived value of a product is more related to desirability, whereas a zero cost or a price level is more associated with feasibility. Thus, context effects and reliance on feasibility (for instance, the zero-cost effect) should be diminished under high-level construal but may remain under low-level construal. These two factors can therefore influence the magnitude of the 'zero-cost effect' when people make decisions. This study ran two experiments to investigate the effects of self-regulatory resources and construal levels on the selection of a free product. Kisses and Ferrero-Rocher, the alternatives adopted in the prior study (Shampanier et al. 2007), were also used in Experiments 1 and 2. Experiment 1 was designed to test whether self-regulatory resource depletion moderates the zero-cost effect. The level of self-regulatory resources was manipulated with two different tasks: a Sudoku task in the depletion condition and a diagram-drawing task in the non-depletion condition. Upon completing the manipulation task, subjects were randomly assigned to either a decision set with a zero-cost option (Kisses ₩0 and Ferrero-Rocher ₩200) or a set without one (Kisses ₩200 and Ferrero-Rocher ₩400). The pairs of alternatives in the two decision sets share the same ₩200 price gap between the low-priced Kisses and the high-priced Ferrero-Rocher. Subjects in the non-depletion condition selected Kisses over Ferrero-Rocher more often when Kisses was free (71.88%) than when it was priced at ₩200 (34.88%).
However, the zero-cost effect disappeared when subjects' self-regulatory resources were depleted. Experiment 2 was conducted to investigate whether construal levels influence the magnitude of the 'zero-cost effect'. To manipulate construal levels, four successive 'why' (high construal level condition) or 'how' (low construal level condition) questions about health management were asked. Subjects were presented with four boxes connected by downward arrows; the top box contained the question 'Why do I maintain good physical health?' or 'How do I maintain good physical health?', and subjects entered a response, repeating the task for the second, third, and fourth responses. After the manipulation task, subjects were randomly assigned either to a decision set with a zero-cost option or to a set without one, as in Experiment 1. When a low construal level was primed with 'how', subjects chose free Kisses over Ferrero-Rocher more often (60.66%) than they chose ₩200 Kisses over ₩400 Ferrero-Rocher (42.19%). In contrast, the zero-cost effect was no longer observed when a high construal level was primed with 'why'.


Tissue Culture Method as a Possible Tool to Study Herbicidal Behaviour and Herbicide Tolerance Screening (조직배양(組織培養) 방법(方法)을 이용(利用)한 제초제(除草劑) 작용성(作用性) 및 제초제(除草劑) 저항성(抵抗性) 검정방법(檢定方法) 연구(硏究))

  • Kim, S.C.;Lee, S.K.;Chung, G.S.
    • Korean Journal of Weed Science / v.6 no.2 / pp.174-190 / 1986
  • A series of laboratory and greenhouse experiments was conducted at the Yeongnam Crop Experiment Station from 1985 to 1986 to explore the potential of tissue culture and cell culture methods as tools for studying herbicidal behaviour and for herbicide tolerance screening. For dehulled-rice culture, pure agar medium was the most appropriate for rice growth compared with the other media used for plant tissue culture. All media except the pure agar medium retarded growth by approximately 50%, and this effect was more pronounced on root growth than on shoot growth. Herbicidal phytotoxicity was enhanced under light conditions for butachlor, 2,4-D, and propanil, while this effect was reversed for DPX F-5384 and CGA 142464. In addition, butachlor, chlornitrofen, oxadiazon, and BAS-514 were more phytotoxic when both the shoot and root of rice were exposed to the herbicide than with root exposure only, whereas the other herbicides showed no significant difference between the two exposure regimes. A similar response was obtained from Echinochloa crusgalli, although the degree of growth retardation was much greater. In particular, butachlor, 2,4-D, chlornitrofen, oxadiazon, pyrazolate, and BAS-514 totally inhibited chlorophyll biosynthesis even with root contact alone. Apparent cultivar differences in herbicide response were observed with the young seedling culture method: Japonica-type rice cultivars were more tolerant than the other types to butachlor, pretilachlor, perfluidone, and oxadiazon, while Tongil-type rice cultivars were more tolerant to DPX F-5384, NC-311, pyrazolate, and pyrazoxyfen. For dehulled rice culture, on the other hand, the Japonica-type cultivars were less tolerant to butachlor, propanil, chlornitrofen, and oxadiazon, a trend reversed from the young seedling culture test. Cultivar differences were also exhibited within the same cultivar type. In general, the relatively more tolerant cultivars were Milyang 42, Cheongcheongbyeo, Samgangbyeo, and Chilseoungbyeo for the Tongil type, Somjinbyeo for the Japonica type, and IR50 for the Indica type. The response of callus growth was similar to that of the dehulled rice culture method for all herbicides regardless of the property variables, although the concentration response was much more sensitive in callus. Concentrations in the range of $10^{-9}$ M to $10^{-8}$ M were appropriate for distinguishing differences between herbicides in E. crusgalli callus growth; among the herbicides used, BAS-514 was the most effective against E. crusgalli callus growth. Based on the above results, tissue culture methods could be used successfully as tools for studying herbicidal behaviour and for herbicide tolerance screening.
