• Title/Summary/Keyword: Choi Donghoon


A Study on the Insulation Performance of Composite Multilayer Insulation by Application of Heat Storage Tank (축열조용 복합 다층 단열재의 단열 성능 연구)

  • Choi, Gyuhong;Hwang, Seung Sik;Shin, Donghoon;Park, Woo Sung;Park, Dae Woong;Son, Seung Kil;Chung, Tae Yong
    • Journal of Energy Engineering, v.23 no.3, pp.82-87, 2014
  • MLI (Multi-Layer Insulation) is widely used in cryogenic systems to achieve high insulation performance and reduce heat loads, and the thermal performance of satellite MLI varies with its materials and lamination method. In this study, the insulation performance of a composite multilayer insulation (CMI) for heat storage tanks was compared while varying the materials and the lamination method. Using the experimental method of KS C 9805, the insulation performance of the CMI was compared with that of EPS. The experimental results were analyzed by comparing the equivalent thickness and the thermal conductance of the CMI. As a result, the equivalent thickness and thermal conductance of the composite multilayer insulation were smaller than those of EPS, indicating superior thermal performance. In addition, it was confirmed that the overall heat transfer coefficient varies with the configuration of the composite multilayer insulation materials and the lamination method.
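The two comparison metrics named in the abstract can be illustrated from steady-state heat flow; this is a hypothetical sketch (function names and all numerical values are invented for the example, not taken from the paper):

```python
def thermal_conductance(q, area, delta_t):
    """Overall heat transfer coefficient U = q / (A * dT), in W/(m^2*K)."""
    return q / (area * delta_t)

def equivalent_thickness(k_ref, q, area, delta_t):
    """Thickness of a reference material (conductivity k_ref) that would
    pass the same heat flow q under the same temperature difference:
    t_eq = k_ref * A * dT / q."""
    return k_ref * area * delta_t / q

# Invented example: 12 W through 1 m^2 at a 40 K difference,
# referenced to EPS with k ~ 0.033 W/(m*K)
U = thermal_conductance(12.0, 1.0, 40.0)
t_eq = equivalent_thickness(0.033, 12.0, 1.0, 40.0)
```

A smaller conductance and a smaller equivalent thickness both indicate better insulation, which is the comparison the study draws between the CMI and EPS.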

Investigations on Techniques and Applications of Text Analytics (텍스트 분석 기술 및 활용 동향)

  • Kim, Namgyu;Lee, Donghoon;Choi, Hochang;Wong, William Xiu Shun
    • The Journal of Korean Institute of Communications and Information Sciences, v.42 no.2, pp.471-492, 2017
  • The demand and interest in big data analytics are increasing rapidly. The concepts around big data include not only existing structured data, but also various kinds of unstructured data such as text, images, videos, and logs. Among the various types of unstructured data, text data have gained particular attention because it is the most representative method to describe and deliver information. Text analysis is generally performed in the following order: document collection, parsing and filtering, structuring, frequency analysis, and similarity analysis. The results of the analysis can be displayed through word cloud, word network, topic modeling, document classification, and semantic analysis. Notably, there is an increasing demand to identify trending topics from the rapidly increasing text data generated through various social media. Thus, research on and applications of topic modeling have been actively carried out in various fields since topic modeling is able to extract the core topics from a huge amount of unstructured text documents and provide the document groups for each different topic. In this paper, we review the major techniques and research trends of text analysis. Further, we also introduce some cases of applications that solve the problems in various fields by using topic modeling.
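The processing order the abstract describes (parsing and filtering, structuring into a bag of words, frequency analysis, similarity analysis) can be sketched with the standard library; the two toy documents and function names below are invented for illustration:

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Parsing/filtering step: lowercase alphabetic word tokens only."""
    return re.findall(r"[a-z]+", text.lower())

def term_freq(text):
    """Structuring/frequency-analysis step: bag-of-words counts."""
    return Counter(tokenize(text))

def cosine_similarity(tf_a, tf_b):
    """Similarity-analysis step: cosine of the two count vectors."""
    shared = set(tf_a) & set(tf_b)
    dot = sum(tf_a[w] * tf_b[w] for w in shared)
    norm_a = math.sqrt(sum(v * v for v in tf_a.values()))
    norm_b = math.sqrt(sum(v * v for v in tf_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

docs = ["text analysis of big data", "topic modeling of text data"]
tfs = [term_freq(d) for d in docs]
sim = cosine_similarity(tfs[0], tfs[1])
```

Topic modeling, word networks, and document classification build on exactly this kind of term-frequency structuring, which is why the survey treats structuring as the pivotal step of the pipeline.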

Reactivated Timings of Inje Fault since the Mesozoic Era (인제단층의 중생대 이 후 재활동 연대)

  • Khulganakhuu, Chuluunbaatar;Song, Yungoo;Chung, Donghoon;Park, Changyun;Choi, Sung-Ja;Kang, Il-Mo;Yi, Keewook
    • Economic and Environmental Geology, v.48 no.1, pp.41-49, 2015
  • The recently developed illite-age-analysis (IAA) approach was applied to determine the fault-reactivation events of the Inje fault, which cuts through Precambrian biotite granitic gneiss with a NNE-SSW trend in the middle of the Korean peninsula. Three distinct fault-reactivation events in the shallow crustal regime were recognized using the combined approach of optimized illite-polytype quantification and K-Ar age-dating of clay fractions separated from 4 fault clay samples: 87.0±0.12 Ma, 65.5±0.05 and 66.6±1.38 Ma, and 45.6±0.15 Ma, respectively. In addition, 2M1 illite ages of 193~196 Ma and 254.3±6.96 Ma were discernible, which may be related to fault activation in the relatively deep crust. The results suggest that the Inje fault first formed at 254.3±6.96 Ma and was sporadically reactivated in the shallow regime since about 87 Ma. These shallow-regime reactivation events might be due to the Bulguksa orogeny, which strongly influenced the Korean peninsula at that time.
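The core idea of illite-age-analysis can be sketched as a mixing-line extrapolation: each clay fraction's K-Ar age is treated as a mixture of detrital 2M1 and authigenic 1M/1Md illite, and a line fitted through (2M1 fraction, age) points is extrapolated to 0% (authigenic end-member, approximating the fault age) and 100% (detrital end-member). This is a simplified, hypothetical version (real IAA extrapolates e^(λt)−1 rather than raw ages, and all data values here are invented):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented data: (2M1 polytype fraction, measured K-Ar age in Ma)
fractions = [0.2, 0.4, 0.6, 0.8]
ages = [120.0, 160.0, 200.0, 240.0]

slope, intercept = fit_line(fractions, ages)
authigenic_age = intercept               # extrapolation to 0% 2M1
detrital_age = slope * 1.0 + intercept   # extrapolation to 100% 2M1
```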

Transfection of Mesenchymal Stem Cells with the FGF-2 Gene Improves Their Survival Under Hypoxic Conditions

  • Song, Heesang;Kwon, Kihwan;Lim, Soyeon;Kang, Seok-Min;Ko, Young-Guk;Xu, ZhengZhe;Chung, Ji Hyung;Kim, Byung-Soo;Lee, Hakbae;Joung, Boyoung;Park, Sungha;Choi, Donghoon;Jang, Yangsoo;Chung, Nam-Sik;Yoo, Kyung-Jong;Hwang, Ki-Chul
    • Molecules and Cells, v.19 no.3, pp.402-407, 2005
  • Bone marrow mesenchymal stem cells (MSCs) have shown potential for cardiac repair following myocardial injury, but this approach is limited by their poor viability after transplantation. To reduce cell loss after transplantation, we introduced the fibroblast growth factor-2 (FGF-2) gene ex vivo before transplantation. The isolated MSCs produced colonies with a fibroblast-like morphology in 2 weeks; over 95% expressed CD71, and 28% expressed the cardiomyocyte-specific transcription factor Nkx2.5, as well as α-skeletal actin, Nkx2.5, and GATA4. In hypoxic culture, the FGF-2-transfected MSCs (FGF-2-MSCs) secreted increased levels of FGF-2 and displayed a threefold increase in viability, as well as increased expression of the anti-apoptotic gene Bcl2 and reduced DNA laddering. They had functional adrenergic receptors, like cardiomyocytes, and exposure to norepinephrine led to phosphorylation of ERK1/2. Viable cells persisted 4 weeks after implantation of 5.0×10⁵ FGF-2-MSCs into infarcted myocardia. Expression of cardiac troponin T (cTnT) and a voltage-gated Ca²⁺ channel (CaV2.1) increased, and new blood vessels formed. These data suggest that genetic modification of MSCs before transplantation could be useful for treating myocardial infarction and end-stage cardiac failure.

Cell-Based Screen Using Amyloid Mimic β23 Expression Identifies Peucedanocoumarin III as a Novel Inhibitor of α-Synuclein and Huntingtin Aggregates

  • Ham, Sangwoo;Kim, Hyojung;Hwang, Seojin;Kang, Hyunook;Yun, Seung Pil;Kim, Sangjune;Kim, Donghoon;Kwon, Hyun Sook;Lee, Yun-Song;Cho, MyoungLae;Shin, Heung-Mook;Choi, Heejung;Chung, Ka Young;Ko, Han Seok;Lee, Gum Hwa;Lee, Yunjong
    • Molecules and Cells, v.42 no.6, pp.480-494, 2019
  • Aggregates of disease-causing proteins dysregulate cellular functions, thereby causing neuronal cell loss in diverse neurodegenerative diseases. Although many in vitro or in vivo studies of protein aggregate inhibitors have been performed, a therapeutic strategy to control aggregate toxicity has not been earnestly pursued, partly due to the limitations of available aggregate models. In this study, we established a tetracycline (Tet)-inducible nuclear aggregate (β23) expression model to screen potential lead compounds inhibiting β23-induced toxicity. High-throughput screening identified several natural compounds as nuclear β23 inhibitors, including peucedanocoumarin III (PCIII). Interestingly, PCIII accelerates disaggregation and proteasomal clearance of both nuclear and cytosolic β23 aggregates and protects SH-SY5Y cells from toxicity induced by β23 expression. Of translational relevance, PCIII disassembled fibrils and enhanced clearance of cytosolic and nuclear protein aggregates in cellular models of huntingtin and α-synuclein aggregation. Moreover, cellular toxicity was diminished with PCIII treatment for polyglutamine (PolyQ)-huntingtin expression and α-synuclein expression in conjunction with 6-hydroxydopamine (6-OHDA) treatment. Importantly, PCIII not only inhibited α-synuclein aggregation but also disaggregated preformed α-synuclein fibrils in vitro. Taken together, our results suggest that a Tet-Off β23 cell model could serve as a robust platform for screening effective lead compounds inhibiting nuclear or cytosolic protein aggregates. Brain-permeable PCIII or its derivatives could be beneficial for eliminating established protein aggregates.

A Study on Generating Virtual Shot-Gathers from Traffic Noise Data (교통차량진동 자료에 대한 최적 가상공통송신원모음 제작 연구)

  • Woohyun Son;Yunsuk Choi;Seonghyung Jang;Donghoon Lee;Snons Cheong;Yonghwan Joo;Byoung-yeop Kim
    • Geophysics and Geophysical Exploration, v.26 no.4, pp.229-237, 2023
  • The use of artificial sources such as explosives and mechanical vibrations for seismic exploration in urban areas poses challenges, as the vibrations and noise generated can lead to complaints. As an alternative to artificial sources, the surface waves generated by traffic noise can be used to investigate the subsurface properties of urban areas. However, traffic noise takes the form of plane waves moving continuously at a constant speed. To apply existing surface wave processing/inversion techniques to traffic noise, the recorded data need to be transformed into a virtual shot gather format using seismic interferometry. In this study, various seismic interferometry methods were applied to traffic noise data, and the optimal method was derived by comparing the results in the Radon and F-K domains. Additionally, data acquired with various receiver arrays were processed using seismic interferometry, and the results were compared and analyzed to determine the optimal receiver array direction for exploration.
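The seismic-interferometry step the abstract describes amounts to cross-correlating a continuous noise record at a reference receiver with the records at the other receivers, which turns the reference receiver into a virtual source. This is a minimal, hypothetical sketch (the traces, geometry, and function names are invented, and real workflows add windowing, whitening, and stacking):

```python
def cross_correlate(ref, trace, max_lag):
    """Cross-correlation of two equal-length traces for lags 0..max_lag."""
    n = len(ref)
    return [sum(ref[i] * trace[i + lag] for i in range(n - lag))
            for lag in range(max_lag + 1)]

def virtual_shot_gather(records, ref_index, max_lag):
    """One correlated trace per receiver, with the reference receiver
    acting as the virtual source."""
    ref = records[ref_index]
    return [cross_correlate(ref, tr, max_lag) for tr in records]

# Invented example: a noise spike arriving 2 samples later at the
# farther receiver.
records = [
    [0.0, 1.0, 0.0, 0.0, 0.0, 0.0],  # reference receiver
    [0.0, 0.0, 0.0, 1.0, 0.0, 0.0],  # receiver 2 samples farther away
]
gather = virtual_shot_gather(records, ref_index=0, max_lag=3)
```

The correlated trace for the second receiver peaks at lag 2, i.e., the inter-receiver traveltime, which is the arrival a real shot at the reference position would produce.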

Association Between Body Mass Index and Clinical Outcomes According to Diabetes in Patients Who Underwent Percutaneous Coronary Intervention

  • Byung Gyu Kim;Sung-Jin Hong;Byeong-Keuk Kim;Yong-Joon Lee;Seung-Jun Lee;Chul-Min Ahn;Dong-Ho Shin;Jung-Sun Kim;Young-Guk Ko;Donghoon Choi;Myeong-Ki Hong;Yangsoo Jang
    • Korean Circulation Journal, v.53 no.12, pp.843-854, 2023
  • Background and Objectives: We evaluated the effect of diabetes on the relationship between body mass index (BMI) and clinical outcomes in patients following percutaneous coronary intervention (PCI) with drug-eluting stent implantation. Methods: A total of 6,688 patients who underwent PCI were selected from five different registries led by the Korean Multicenter Angioplasty Team. They were categorized according to their BMI into the following groups: underweight (<18.5 kg/m²), normal weight (18.5-24.9 kg/m²), and overweight to obese (≥25.0 kg/m²). Major adverse cardiac and cerebrovascular events (MACCE), defined as a composite of death, nonfatal myocardial infarction, stroke, and target-vessel revascularization, were compared according to the BMI categories (underweight, normal, and overweight-to-obese groups) and diabetic status. All subjects completed 1-year follow-up. Results: Among the 6,688 patients, 2,561 (38%) had diabetes. The underweight group had a higher 1-year MACCE rate than the normal weight group in both non-diabetic (adjusted hazard ratio [HR], 2.24; 95% confidence interval [CI], 1.04-4.84; p=0.039) and diabetic patients (adjusted HR, 2.86; 95% CI, 1.61-5.07; p<0.001). The overweight-to-obese group had a lower MACCE rate than the normal weight group in diabetic patients (adjusted HR, 0.67 [0.49-0.93]) but not in non-diabetic patients (adjusted HR, 1.06 [0.77-1.46]), with a significant interaction (p-interaction=0.025). Conclusions: Between the underweight and normal weight groups, the association between BMI and clinical outcomes was consistent regardless of the presence of diabetes. However, better outcomes in the overweight-to-obese group than in the normal weight group were observed only in diabetic patients. These results suggest that the association between BMI and clinical outcomes may differ according to diabetic status.
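The BMI grouping the study uses can be written out directly from the cutoffs in the abstract (underweight <18.5, normal 18.5-24.9, overweight to obese ≥25.0 kg/m²); the function names are my own:

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_group(bmi_value):
    """Categories per the cutoffs stated in the abstract."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25.0:
        return "normal"
    return "overweight to obese"
```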

A Method for Evaluating News Value based on Supply and Demand of Information Using Text Analysis (텍스트 분석을 활용한 정보의 수요 공급 기반 뉴스 가치 평가 방안)

  • Lee, Donghoon;Choi, Hochang;Kim, Namgyu
    • Journal of Intelligence and Information Systems, v.22 no.4, pp.45-67, 2016
  • Given the recent development of smart devices, users are producing, sharing, and acquiring a variety of information via the Internet and social network services (SNSs). Because users tend to use multiple media simultaneously according to their goals and preferences, domestic SNS users use around 2.09 media concurrently on average. Since the information provided by such media is usually textually represented, recent studies have been actively conducting textual analysis in order to understand users more deeply. Earlier studies using textual analysis focused on analyzing a document's contents without substantive consideration of the diverse characteristics of the source medium. However, current studies argue that analytical and interpretive approaches should be applied differently according to the characteristics of a document's source. Documents can be classified into the following types: informative documents for delivering information, expressive documents for expressing emotions and aesthetics, operational documents for inducing the recipient's behavior, and audiovisual media documents for supplementing the above three functions through images and music. Further, documents can be classified according to their contents, which comprise facts, concepts, procedures, principles, rules, stories, opinions, and descriptions. Documents have unique characteristics according to the source media by which they are distributed. In terms of newspapers, only highly trained people tend to write articles for public dissemination. In contrast, with SNSs, various types of users can freely write any message and such messages are distributed in an unpredictable way. Again, in the case of newspapers, each article exists independently and does not tend to have any relation to other articles. However, messages (original tweets) on Twitter, for example, are highly organized and regularly duplicated and repeated through replies and retweets. 
There have been many studies focusing on the different characteristics between newspapers and SNSs. However, it is difficult to find a study that focuses on the difference between the two media from the perspective of supply and demand. We can regard the articles of newspapers as a kind of information supply, whereas messages on various SNSs represent a demand for information. By investigating traditional newspapers and SNSs from the perspective of supply and demand of information, we can explore and explain the information dilemma more clearly. For example, there may be superfluous issues that are heavily reported in newspaper articles despite the fact that users seldom have much interest in these issues. Such overproduced information is not only a waste of media resources but also makes it difficult to find valuable, in-demand information. Further, some issues that are covered by only a few newspapers may be of high interest to SNS users. To alleviate the deleterious effects of information asymmetries, it is necessary to analyze the supply and demand of each information source and, accordingly, provide information flexibly. Such an approach would allow the value of information to be explored and approximated on the basis of the supply-demand balance. Conceptually, this is very similar to the price of goods or services being determined by the supply-demand relationship. Adopting this concept, media companies could focus on the production of highly in-demand issues that are in short supply. In this study, we selected Internet news sites and Twitter as representative media for investigating information supply and demand, respectively. We present the notion of News Value Index (NVI), which evaluates the value of news information in terms of the magnitude of Twitter messages associated with it. In addition, we visualize the change of information value over time using the NVI. We conducted an analysis using 387,014 news articles and 31,674,795 Twitter messages. 
The analysis results revealed interesting patterns: most issues show a lower NVI than the average across all issues, whereas a few issues show a steadily higher NVI than the average.
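The abstract does not give the paper's exact NVI formula, so the sketch below is a hypothetical demand/supply ratio in the spirit it describes: Twitter messages about an issue (demand) divided by news articles about it (supply), normalized by the overall ratio so that 1.0 means an issue is valued in proportion to its coverage. All counts and names are invented:

```python
def news_value_index(demand, supply):
    """demand/supply: dicts mapping issue -> message/article counts.
    Returns each issue's demand-to-supply ratio relative to the
    overall ratio across all issues."""
    total_ratio = sum(demand.values()) / sum(supply.values())
    return {issue: (demand[issue] / supply[issue]) / total_ratio
            for issue in supply}

articles = {"election": 100, "festival": 20}   # supply side (news)
tweets = {"election": 5000, "festival": 7000}  # demand side (SNS)
nvi = news_value_index(tweets, articles)
```

Here "festival" is under-supplied relative to demand, so its index is high, exactly the kind of in-demand, short-supply issue the paper suggests media companies should prioritize.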

Truncation Artifact Reduction Using Weighted Normalization Method in Prototype R/F Chest Digital Tomosynthesis (CDT) System (프로토타입 R/F 흉부 디지털 단층영상합성장치 시스템에서 잘림 아티팩트 감소를 위한 가중 정규화 접근법에 대한 연구)

  • Son, Junyoung;Choi, Sunghoon;Lee, Donghoon;Kim, Hee-Joung
    • Journal of the Korean Society of Radiology, v.13 no.1, pp.111-118, 2019
  • Chest digital tomosynthesis has become a practical imaging modality because it can solve the problem of overlapping anatomy in conventional chest radiography. However, because of both the limited scan angle and the finite-size detector, a portion of the chest is not represented in some or all of the projections. This causes a discontinuity in intensity across the field-of-view boundaries in the reconstructed slices, which we refer to as truncation artifacts. The purpose of this study was to reduce truncation artifacts using a weighted normalization approach and to investigate the performance of this approach on our prototype chest digital tomosynthesis system. The source-to-image distance of the system was 1100 mm, and the center of rotation of the X-ray source was located 100 mm above the detector surface. After obtaining 41 projection views over ±20°, tomosynthesis slices were reconstructed with the filtered back-projection algorithm. For quantitative evaluation, peak signal-to-noise ratio and structural similarity index values were evaluated against a reference image reconstructed in simulation, and the mean pixel value along a specific direction was evaluated using real data. The simulation results showed that both the peak signal-to-noise ratio and the structural similarity index were improved. The experimental results showed that the effect of the artifacts on the mean value along the specific direction of the reconstructed image was reduced. In conclusion, the weighted normalization method improves image quality by reducing truncation artifacts. These results suggest that the weighted normalization method could improve the image quality of chest digital tomosynthesis.
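The peak signal-to-noise ratio metric used for the quantitative evaluation is standard and can be sketched directly; this version takes images as flat lists of pixel values and assumes a caller-supplied peak value (the paper's actual evaluation pipeline is not given in the abstract):

```python
import math

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)
```

A higher PSNR against the simulated reference slice indicates that the weighted normalization brought the reconstruction closer to the truncation-free ideal.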

Transcriptomic Analysis of Triticum aestivum under Salt Stress Reveals Change of Gene Expression (RNA sequencing을 이용한 염 스트레스 처리 밀(Triticum aestivum)의 유전자 발현 차이 확인 및 후보 유전자 선발)

  • Jeon, Donghyun;Lim, Yoonho;Kang, Yuna;Park, Chulsoo;Lee, Donghoon;Park, Junchan;Choi, Uchan;Kim, Kyeonghoon;Kim, Changsoo
    • KOREAN JOURNAL OF CROP SCIENCE, v.67 no.1, pp.41-52, 2022
  • As a cultivar of Korean wheat, the 'Keumgang' variety has a fast growth period and can be grown stably. Hexaploid wheat (Triticum aestivum) has moderately high salt tolerance compared to tetraploid wheat (Triticum turgidum L.). However, the molecular mechanisms underlying the salt tolerance of hexaploid wheat have not been elucidated yet. In this study, candidate genes related to salt tolerance were identified by investigating the genes differentially expressed between the Keumgang variety and the salt-tolerant mutant '2020-s1340'. A total of 85,771,537 reads were obtained after quality filtering using NextSeq 500 Illumina sequencing technology. A total of 23,634,438 reads were aligned to the NCBI Campala Lr22a pseudomolecule v5 reference genome (Triticum aestivum). A total of 282 differentially expressed genes (DEGs) were identified between the two Triticum aestivum materials. These DEGs have functions related to salt-tolerance traits, such as 'wall-associated receptor kinase-like 8', 'cytochrome P450', and '6-phosphofructokinase 2'. In addition, the identified DEGs were classified into three categories (biological process, molecular function, and cellular component) using gene ontology analysis. These DEGs were significantly enriched for terms such as 'copper ion transport', 'oxidation-reduction process', and 'alternative oxidase activity'. These results, obtained using RNA-seq analysis, will improve our understanding of the salt tolerance of wheat. Moreover, this study will be a useful resource for breeding wheat varieties with improved salt tolerance using molecular breeding technology.
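How DEGs such as the 282 reported here are typically called can be sketched as a log2 fold-change cutoff between the two conditions. This is a hypothetical illustration, not the paper's pipeline (real RNA-seq DEG calling also tests statistical significance, e.g. with DESeq2 or edgeR); the gene names echo the abstract but the expression values and the |log2FC| ≥ 1 threshold are invented:

```python
import math

def log2_fold_change(expr_a, expr_b, pseudocount=1.0):
    """log2((a + c) / (b + c)); the pseudocount avoids division by zero."""
    return math.log2((expr_a + pseudocount) / (expr_b + pseudocount))

def call_degs(control, treated, lfc_cutoff=1.0):
    """Genes whose |log2 fold change| between conditions meets the cutoff."""
    return {g for g in control
            if abs(log2_fold_change(treated[g], control[g])) >= lfc_cutoff}

control = {"WAK8": 10.0, "CYP450": 50.0, "PFK2": 40.0}  # invented counts
treated = {"WAK8": 43.0, "CYP450": 52.0, "PFK2": 9.5}
degs = call_degs(control, treated)
```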