• Title/Summary/Keyword: Low risk


Improvement of Proliferation Capacity of Non-adapted CHO Cells Subcultured Using Serum Free Media in Long-term Culture (무혈청 배지에서 계대배양한 비적응 CHO(Chinese Hamster Ovary) 세포의 증식력 개선에 관한 연구)

  • Lee, Seung-Sun;Lee, Jin-Sung;Byun, Soon-Hyu;Park, Hong-Woo;Choe, Tae-Boo
    • KSBB Journal / v.21 no.4 / pp.248-254 / 2006
  • Animal cell culture has a large market and one of the highest growth rates in the biological industry. Chinese hamster ovary (CHO) cells are the most widely used cell lines for recombinant protein production. They resist infection by polio, herpes, hepatitis B, HIV, measles, adenovirus, and other viruses. Moreover, they are easy to transfect with recombinant genes and can be grown in suspension culture. Serum-free media are among the most important factors in protein production, because serum is problematic: its composition has not been fully defined, and it contains numerous proteins, lipids, carbohydrates, and unknown molecules that carry a risk of infection and raise the cost of product purification. CHO cell lines cultured in serum-free media have been the basis of a very successful method for producing (glyco-)proteins in mammalian cells for use as pharmaceutical products. The low protein content of the developed medium also facilitates downstream processing and product purification. However, non-adapted CHO cells show limited proliferation in serum-free media, and adapting non-adapted cells to serum-free media takes a very long time. There are several causes of this limited proliferation. The absence of growth factors and growth-stimulating molecules is a major one, since these generate growth signals and drive the cell cycle; increased cellular stress, which raises the intracellular ROS concentration, is another. The purpose of this study is to improve the proliferation capacity of non-adapted CHO cells cultured in serum-free media without an adaptation process.

Analysis of source localization of P300 in college students with schizotypal traits (조현형 인격 성향을 가진 대학생의 P300 국소화 분석)

  • Jang, Kyoung-Mi;Kim, Bo-Mi;Na, Eun-Chan;An, Eun-Ji;Kim, Myung-Sun
    • Korean Journal of Cognitive Science / v.28 no.1 / pp.1-26 / 2017
  • This study investigated the cortical generators of P300 in college students with schizotypal traits by using an auditory oddball paradigm, event-related potentials (ERPs), and the standardized low resolution brain electromagnetic tomography (sLORETA) model. We also investigated the relationship between the current density of P300 and the clinical symptoms of schizophrenia. Based on the scores of the Schizotypal Personality Questionnaire (SPQ), schizotypal trait (n=37) and control (n=42) groups were selected. For the measurement of P300, an auditory oddball paradigm was used, in which frequent standard tones (1000Hz) and rare target tones (1500Hz) were presented randomly. Participants were required to count the number of target tones during the task and report it at the end of the experiment. The two groups did not differ significantly in the accuracy of the oddball task. The schizotypal trait group showed significantly smaller P300 amplitudes than the control group. In terms of source localization, both groups showed P300 current density over bilateral frontal, parietal, temporal, and occipital lobes. However, compared to the control group, the schizotypal trait group showed significantly reduced activations in the left superior temporal gyrus and the right middle temporal gyrus, but increased activations in the left inferior frontal gyrus and the right superior frontal gyrus. Furthermore, a negative correlation between the current density of the right superior frontal gyrus and the SPQ disorganization score was found in the schizotypal trait group. These findings indicate that individuals with schizotypal traits have dysfunctions of frontal and temporal areas, which are known to be the sources of P300, as observed in patients with schizophrenia. In addition, the present results indicate that the disorganization score, rather than the total score, of the SPQ is useful in predicting the risk of future schizophrenia.

Allele Distribution and Frequency of Human Surfactant Protein-A1 in Korean Neonates (한국 신생아의 폐 표면 활성제 단백-A1 (Human Surfactant Protein-A1) 유전자 대립형질의 분포와 빈도)

  • Lee, Kyung Shin;Kim, Young Hee;Suk, Jung Su;Ko, Jung Ho;Yoo, Ook Joon;Lee, In Kyu;Oh, Myung Ho;Bae, Chong Woo
    • Clinical and Experimental Pediatrics / v.45 no.12 / pp.1497-1502 / 2002
  • Purpose : We evaluated the allele frequencies and distribution of surfactant protein A1 (SP-A1) in Korean neonates in order to estimate the prevalence of RDS, to find new SP-A alleles, and to establish new steroid therapy. Methods : Genomic DNA was extracted from 100 neonates and served as a template in PCR for genotype analysis. SP-A gene-specific amplifications and gene-specific allele determinations were performed using PCR-RFLP methods. Results : The alleles of the SP-A1 gene found in the study population were $6A$, $6A^2$, $6A^3$, $6A^4$, $6A^8$, $6A^9$, $6A^{10}$, $6A^{11}$, $6A^{12}$, $6A^{13}$, $6A^{14}$, $6A^{15}$, $6A^{16}$, $6A^{17}$, $6A^{18}$, and $6A^{20}$. The specific frequencies of the SP-A1 alleles in the study population were $6A^2=21%$, $6A^3=45%$, $6A^4=11%$, $6A^8=9%$, and $6A^{14}=8%$. Conclusion : The frequency of $6A^3$ was higher than that of the other SP-A1 alleles in Korean neonates. This finding suggests that the prevalence of RDS in Korea may be low compared with other countries. However, it also suggests that Korean neonates have a high risk of infection.

Heart Rate Variability and Autonomic Activity in Patients Affected with Rett Syndrome (Rett 증후군 환자에서의 자율신경 활성도 및 심박수 변이도 측정)

  • Choi, Deok Young;Chang, Jin Ha;Chung, Hee Jung
    • Clinical and Experimental Pediatrics / v.46 no.10 / pp.996-1002 / 2003
  • Purpose : In Rett syndrome patients, the incidence of sudden death is greater than in the general population, and cardiac electrical instability, including fatal cardiac arrhythmia, is a main suspected cause. In this study, we sought a possible cause of the higher risk of sudden death in Rett patients by evaluating heart rate variability, a marker of cardiac autonomic activity, and corrected QT intervals. Methods : The diagnosis of Rett syndrome was made by molecular genetic study (MECP2 gene) or by the clinical diagnostic criteria of Rett syndrome. Heart rate variability and corrected QT intervals were measured by 24-hour Holter study in 12 Rett patients and in 30 age-matched healthy children with chief complaints of chest pain or suspected heart murmurs, and the patients' values were compared with those of the age-matched controls. Results : Patients with total Rett syndrome, classic Rett syndrome, and Rett variants had significantly lower heart rate variability (especially rMSSD) (P<0.05) and longer corrected QT intervals than the age-matched healthy children (P<0.05). Sympathovagal balance, expressed by the ratio of low to high frequency power (LF/HF ratio), also showed statistically significant differences between the three groups considered (P<0.05). Conclusion : A significant reduction of heart rate variability, a marker of autonomic disarray, suggests a possible explanation of the cardiac dysfunction underlying sudden death associated with Rett syndrome.
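The time-domain marker used above, rMSSD, is the root mean square of successive differences between consecutive RR intervals. A minimal sketch of the standard formula, using made-up RR series for illustration (the data are hypothetical, not from the study):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between consecutive
    RR intervals (ms), a time-domain index of cardiac vagal activity."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR series: a steadier heart rhythm yields a lower rMSSD,
# which is the pattern the study reports in Rett patients.
steady = [800, 802, 799, 801, 800, 798]
variable = [800, 850, 780, 860, 790, 840]
print(rmssd(steady) < rmssd(variable))  # prints True
```

Lower rMSSD in the patient group is what the abstract summarizes as reduced heart rate variability.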

A Study on Clinical Variables Contributing to Differentiation of Delirium and Non-Delirium Patients in the ICU (중환자실 섬망 환자와 비섬망 환자 구분에 기여하는 임상 지표에 관한 연구)

  • Ko, Chanyoung;Kim, Jae-Jin;Cho, Dongrae;Oh, Jooyoung;Park, Jin Young
    • Korean Journal of Psychosomatic Medicine / v.27 no.2 / pp.101-110 / 2019
  • Objectives : It is not clear which clinical variables are most closely associated with delirium in the Intensive Care Unit (ICU). By comparing clinical data of ICU delirium and non-delirium patients, we sought to identify variables that most effectively differentiate delirium from non-delirium. Methods : Medical records of 6,386 ICU patients were reviewed. Random Subset Feature Selection and Principal Component Analysis were utilized to select a set of clinical variables with the highest discriminatory capacity. Statistical analyses were employed to determine the separation capacity of two models: one using just the selected few clinical variables and the other using all clinical variables associated with delirium. Results : There was a significant difference between delirium and non-delirium individuals across 32 clinical variables. The Richmond Agitation Sedation Scale (RASS), urinary catheterization, vascular catheterization, the Hamilton Anxiety Rating Scale (HAM-A), blood urea nitrogen, and the Acute Physiology and Chronic Health Evaluation II most effectively differentiated delirium from non-delirium. Multivariable logistic regression analysis showed that, with the exception of vascular catheterization, these clinical variables were independent risk factors associated with delirium. The separation capacity of the logistic regression model using just the 6 clinical variables was measured with the Receiver Operating Characteristic curve, with an Area Under the Curve (AUC) of 0.818. The same analyses were performed using all 32 clinical variables; the AUC was 0.881, denoting a very high separation capacity. Conclusions : The six aforementioned variables most effectively separate delirium from non-delirium. This highlights the importance of close monitoring of patients who received invasive medical procedures and were rated with very low RASS and HAM-A scores.
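The AUC figures above quantify how well a model's predicted probabilities rank delirium cases above non-delirium cases. A minimal sketch of that computation via the Mann-Whitney interpretation of the AUC, using hypothetical model scores (not the study's data):

```python
def roc_auc(scores, labels):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive (delirium) case scores higher than a
    randomly chosen negative (non-delirium) case; ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted delirium probabilities from a small model.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0]
print(roc_auc(scores, labels))  # prints 0.8888888888888888
```

An AUC of 0.5 means no discrimination; the study's values of 0.818 (6 variables) and 0.881 (32 variables) indicate strong separation.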

Agricultural Policies and Geographical Specialization of Farming in England (영국의 농업정책이 지리적 전문화에 미친 영향 연구)

  • Kim, Ki-Hyuk
    • Journal of the Korean association of regional geographers / v.5 no.1 / pp.101-120 / 1999
  • The purpose of this study is to analyze the impact of agricultural policies on the change of regional structure based on specialization during the productivist period. The analysis is carried out by comparing the distribution of farming in the 1950s with that in 1997. Since the 1950s, government policy has played a leading role in shaping the pattern of farming in Great Britain. A range of British measures has also been employed in an attempt to improve the efficiency of agriculture and raise farm incomes. Three fairly distinct phases can be identified in the developing relationship between government policies and British agriculture in the postwar period. In the first phase, the Agricultural Act of 1947 laid the foundations for agricultural productivism in Great Britain until membership of the EC. This was to be achieved through a system of price support and guaranteed prices and by means of a series of grants and subsidies. Guaranteed prices encouraged farmers to intensify production and specialize in either cereal farming or milk-beef enterprises; the former favoured eastern areas, whereas the latter favoured western areas. Various grants and subsidies were made available to farmers during this period, again as a way of increasing efficiency and farm incomes. Many schemes, such as the Calf Subsidy, the Ploughing Grant, the Hill Cow and Hill Sheep Schemes, and the Hill Farming and Livestock Rearing Grant, were provided. Some of these favoured the western uplands, whilst others were biased towards the Lake District. Concentration of farms occurred especially near the London Metropolitan Area and in the southern part of Scotland. In the second phase, after membership of the EC, very high guaranteed prices created a relatively risk-free environment, so farmers intensified production, and levels of self-sufficiency for most agricultural products rose considerably.
As farmers were being paid high prices for as much as they could produce, the policy favoured areas of larger-scale farming in eastern Britain. As a result of increasing regional disparities in agriculture, the CAP became more geographically sensitive in 1975 with the setting up of the Less Favoured Areas (LFAs). But these were biased towards larger farms, because such farms have more crops and/or livestock, while small farms with low incomes are most in need of support. Specialization in cereals such as wheat and barley occurred, but these two crops have experienced rather different trends since the 1950s. Under the CAP, farmers have been paid higher guaranteed prices for wheat than for barley because of the relative shortage of wheat in the EC, and more barley was cultivated as home-grown feed for livestock. In the 1950s, dairying was already declining in what was to become the arable areas of southern and eastern England. By the mid-1980s, the pastoral core had maintained its dominance, but the pastoral periphery had easily surpassed arable England as the second most important dairying district. Pig farming had become increasingly concentrated in intensive units in the main cereal areas of eastern England. These results show that agricultural policy measures implicitly induced concentration and specialization. Measures for increasing demand, reducing supply, or raising farm incomes favoured large-scale farming, and price support induced specialization of farming. Technologies for specialization then diffused and induced geographical specialization. This is the process by which regional structure changed through specialization.


Relation between Health Examination Outcome and Intake of Soy Food and Isoflavone among Adult Male in Seoul (서울 거주 성인 남자의 대두식품 및 이소플라본 섭취와 각종 건강지표와의 관련성 분석)

  • Lee, Min-June;Sohn, Chun-Young;Kim, Ji-Hyang
    • Journal of Nutrition and Health / v.41 no.3 / pp.254-263 / 2008
  • This study was conducted to analyze the effect of isoflavone intake on the prevention of chronic disease in middle-aged and older men. We used a food frequency questionnaire (FFQ); the isoflavone intake of the subjects was 25.10 mg per day. We divided the subjects into three groups (high, medium, and low isoflavone intake) and investigated the relation between isoflavone intake level and clinical/anthropometric characteristics. Isoflavone intake was inversely related to body fat in the male subjects. We also divided the subjects into two groups with normal and abnormal clinical/anthropometric risk factors. The isoflavone intake of the abnormal group, with high TG, high WHR, and high body fat, was lower than that of the normal group. The main food sources of isoflavone were soybean curd, bean sprouts, soybean paste, soybean, and soy milk, and we also investigated the relation between the frequency of soybean food intake and anthropometric and clinical variables. The frequencies of soybean curd, soybean paste, soybean broth, soy milk, bean sprouts, peanuts, soybean, and dambuk, as well as isoflavone intake, were inversely correlated with some anthropometric and clinical variables such as blood pressure, TG, BMI, % body fat, and waist-hip ratio, and positively correlated with HDL cholesterol, muscle mass, and bone density. We suggest that high consumption of soy products and isoflavone is associated with decreased blood lipids and body fat in middle-aged and older men and might be useful for the prevention of cardiovascular diseases. From this study, we obtained valuable basic information on recommended isoflavone intake levels and guidelines for the prevention of some chronic diseases and health problems.

Acute Ecotoxicity Evaluation of 3 Emulsifiable Concentrates Containing Garlic Extract, Zanthoxylum Extract, and Lemon Grass Oil Originated from Plant (식물추출물 마늘 추출액, 잔톡실럼 정유, 레몬그라스 정유 함유 유제 3종의 생태독성평가)

  • You, Are-Sun;Hong, Soon-Sung;Jeong, Mihye;Park, Kyung-Hun;Chang, Hee-Seop;Lee, Je Bong;Park, Jae-Yup
    • The Korean Journal of Pesticide Science / v.16 no.4 / pp.376-382 / 2012
  • Environment-friendly agro-materials have recently come to be preferred over chemical insecticides. For this reason, many studies have been conducted to develop environment-friendly insecticides containing natural materials. This study was conducted to assess the ecotoxicity of emulsifiable concentrates (EC) containing 30% of garlic extract or one of two plant essential oils (Zanthoxylum, lemongrass), which are expected to protect against pests and be used as agro-materials. The species used to assess acute toxicity were an invertebrate (Daphnia magna), a fish (Oryzias latipes), the honeybee (Apis mellifera L.), and the earthworm (Eisenia fetida). The $EC_{50}$ values of garlic extract 30% EC, Zanthoxylum oil 30% EC, and lemongrass oil 30% EC for Daphnia magna were 3.3, 10, and $10mg\;L^{-1}$, respectively. Garlic extract 30% EC was categorized as moderately toxic, while Zanthoxylum oil 30% EC and lemongrass oil 30% EC, with $EC_{50}$ values above $10mg\;L^{-1}$, were categorized as slightly toxic according to USEPA standards. In the acute toxicity test on fish, the $LC_{50}$ of garlic extract 30% EC was $3.3mg\;L^{-1}$, while Zanthoxylum oil 30% EC and lemongrass oil 30% EC showed $LC_{50}$ > $10mg\;L^{-1}$. The acute toxicity of all test substances was classified according to the Korean criteria. Acute contact and oral toxicity tests on honeybees were conducted: the $LD_{50}$ values of all test substances were more than 100 a.i. ${\mu}g\;bee^{-1}$ in the acute contact test, while in the oral test the $LD_{50}$ of garlic extract 30% EC was 4.4 a.i. ${\mu}g\;bee^{-1}$ and those of Zanthoxylum oil 30% EC and lemongrass oil 30% EC were more than 100 a.i. ${\mu}g\;bee^{-1}$. In the acute toxicity test on earthworms, the $LC_{50}$ values of garlic extract 30% EC, Zanthoxylum oil 30% EC, and lemongrass oil 30% EC were 267, 592, and $430mg\;kg^{-1}$, respectively.
In conclusion, if their safety for earthworms is confirmed, these substances are expected to be usable as environment-friendly insecticide materials with low risk to the ecosystem and to contribute to the development of environment-friendly agro-materials.

The change of validity of blood zinc protoporphyrin test by different cut-off level in lead workers (연취급 근로자들의 혈중 ZPP 농도 선별기준에 따른 정확도의 변화)

  • Kim, Yong-Bae;Ahn, Hyun-Cheol;HwangBo, Young;Lee, Gap-Soo;Lee, Sung-Soo;Ahn, Kyu-Dong;Lee, Byung-Kook
    • Journal of Preventive Medicine and Public Health / v.30 no.4 s.59 / pp.741-751 / 1997
  • Measurements of blood lead (PbB) and blood zinc protoporphyrin (ZPP) are the most common biological indices used to identify individuals at risk of excessive lead exposure or its health consequences. Because PbB is the most important and reliable index of lead exposure, it is often regarded as the gold standard for detecting lead exposure. In Korea, however, PbB is in most cases a secondary test item of the detailed health check-up performed after a positive screening finding. Our lead standard requires all lead workers to undergo health check-ups twice a year to investigate health effects of lead exposure. Blood ZPP is one of the most important screening indices for detecting high lead absorption in lead workers. Blood ZPP is known to correlate well with PbB under steady-state exposure in most lead workers, and it is often used as the primary screening test for high lead absorption because of its simplicity, ease of use, portability, and low cost. The current cut-off criterion of blood ZPP for further detailed health check-up is $100{\mu}g/d\ell$, which is supposed to match the PbB level of $40{\mu}g/d\ell$ under our standard. The authors investigated the validity of the current cut-off level ($100{\mu}g/d\ell$) of blood ZPP, and whether another cut-off level would better detect lead workers whose PbB exceeds $40{\mu}g/d\ell$. The subjects were 212 male workers in three small-scale storage battery industries. Blood ZPP, PbB, and hemoglobin (Hb) were selected as the indices of lead exposure. The results were as follows. 1. The means of blood ZPP, PbB, and Hb in lead workers were $79.5{\pm}46.7{\mu}g/d\ell$, $38.7{\pm}15.1{\mu}g/d\ell$, and $14.8{\pm}1.2g/d\ell$, respectively. There were significant differences in blood ZPP, PbB, and Hb by industry (P<0.01). 2.
The percentages of lead workers whose blood ZPP was above $100{\mu}g/d\ell$ in the groups with work duration below 1, 1-4, 5-9, and 10 or more years were 8.6%, 17.2%, 47.6%, and 50.0%, respectively; the percentages of lead workers whose PbB was above $40{\mu}g/d\ell$ in those groups were 31.4%, 40.4%, 71.4%, and 86.4%, respectively. 3. The percentages of lead workers whose PbB was below $40{\mu}g/d\ell$, $40-59{\mu}g/d\ell$, and above $60{\mu}g/d\ell$ were 54.7%, 34.9%, and 10.4%, respectively. Those of lead workers whose blood ZPP was below $100{\mu}g/d\ell$, $100-149{\mu}g/d\ell$, and above $150{\mu}g/d\ell$ were 79.2%, 13.7%, and 7.1%, respectively. 4. Simple linear regression of PbB on blood ZPP was statistically significant (P<0.01); at a PbB of $40{\mu}g/d\ell$, blood ZPP was $82.1{\mu}g/d\ell$. 5. While the highest sensitivity and the highest specificity of the blood ZPP test for detecting lead workers with PbB over $40{\mu}g/d\ell$ were observed at cut-off levels of $50{\mu}g/d\ell$ and $100{\mu}g/d\ell$ of blood ZPP respectively, the highest validity (sensitivity + specificity) was observed at a cut-off level of around $70{\mu}g/d\ell$. But even at the optimal cut-off level of around $70{\mu}g/d\ell$, 25.0% false negatives and 20.7% false positives were still found. Based on these results, lowering the current blood ZPP cut-off of our lead standard from $100{\mu}g/d\ell$ to a somewhat lower level, around $70{\mu}g/d\ell$, and including PbB measurement as a primary screening test for lead workers, are highly recommended for the effective prevention of lead poisoning in lead workers.
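The cut-off selection described above amounts to scanning candidate ZPP thresholds and picking the one that maximizes sensitivity + specificity against the PbB gold standard. A minimal sketch of that procedure, using invented paired measurements (the data below are hypothetical, not the study's):

```python
def validity_by_cutoff(zpp, pbb, cutoffs, pbb_threshold=40.0):
    """For each candidate ZPP cutoff (ug/dL), compute sensitivity and
    specificity of the ZPP screen for detecting workers whose PbB is at
    or above pbb_threshold, and return the (cutoff, sens, spec) tuple
    that maximizes sensitivity + specificity (the study's 'validity')."""
    results = []
    for c in cutoffs:
        tp = sum(1 for z, p in zip(zpp, pbb) if z >= c and p >= pbb_threshold)
        fn = sum(1 for z, p in zip(zpp, pbb) if z < c and p >= pbb_threshold)
        tn = sum(1 for z, p in zip(zpp, pbb) if z < c and p < pbb_threshold)
        fp = sum(1 for z, p in zip(zpp, pbb) if z >= c and p < pbb_threshold)
        results.append((c, tp / (tp + fn), tn / (tn + fp)))
    return max(results, key=lambda r: r[1] + r[2])

# Hypothetical paired measurements (ZPP ug/dL, PbB ug/dL).
zpp = [45, 60, 72, 85, 95, 110, 130, 55, 65, 150]
pbb = [25, 30, 42, 45, 50, 55, 60, 35, 38, 70]
print(validity_by_cutoff(zpp, pbb, cutoffs=[50, 70, 100]))  # prints (70, 1.0, 1.0)
```

With the study's real data the trade-off is less clean: even the best cutoff near 70 ug/dL left 25.0% false negatives and 20.7% false positives.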


Coronary Artery Bypass Surgery Using Retrograde Cardioplegics (역행성 심정지액을 이용한 관상동맥 우회술)

  • Mun, Hyeon-Jong;Kim, Gi-Bong;No, Jun-Ryang
    • Journal of Chest Surgery / v.30 no.1 / pp.27-33 / 1997
  • Retrograde myocardial protection is widely accepted in CABG operations because of the limitations of the antegrade method in the presence of coronary arterial stenosis. We analyzed 76 cases of retrograde myocardial protection among 96 CABG operations performed between April 1994 and August 1995. There were 48 males and 25 females, and the mean age was 58.2 $\pm$ 8.3 years. 53 patients (70%) were operated on for unstable angina, 14 (18%) for stable angina, 6 (8%) for post-infarct angina, 1 (1%) for acute myocardial infarction, and 2 (3%) for failed PTCA. Preoperative coronary angiography revealed 3-vessel disease in 42 cases, 2-vessel disease in 11, 1-vessel disease in 10, and left main disease in 13 cases. We used SVG (63 cases), LIMA (69 cases), RIMA (11 cases), the radial artery (6 cases), and the gastroepiploic artery (1 case) for the grafts. The mean number of anastomoses was 3.2 $\pm$ 1.1. We protected the myocardium with antegrade induction and retrograde maintenance in all cases except one case of retrograde induction and maintenance. During aortic cross-clamping, blood cardioplegia was administered intermittently in 19 cases and continuously in 57. In 39 cases, we used retrograde cardioplegia and antegrade perfusion of the RCA graft simultaneously. There was no operative mortality. Perioperative complications were arrhythmia in 15 cases, perioperative myocardial infarction in 10, low cardiac output syndrome in 8, transient neurologic problems in 7, transient psychiatric problems in 6, ARF in 3, bleeding in 2, pneumonia in 2, wound infection in 1, and duodenal ulcer perforation in 1. In this series, we performed 76 CABG operations with retrograde myocardial protection at acceptable operative risk and without operative mortality.
