• Title/Summary/Keyword: Multi-step


Severe choline deficiency induces alternative splicing aberrance in optimized duck primary hepatocyte cultures

  • Zhao, Lulu;Cai, Hongying;Wu, Yongbao;Tian, Changfu;Wen, Zhiguo;Yang, Peilong
    • Animal Bioscience
    • /
    • v.35 no.11
    • /
    • pp.1787-1799
    • /
    • 2022
  • Objective: Choline deficiency, a main trigger of nonalcoholic fatty liver disease (NAFLD), is closely related to disordered lipid metabolism. Previous studies of choline-deficient models have largely focused on gene expression rather than gene structure, and studies of alternative splicing (AS) are especially sparse. Primary hepatocyte culture technology facilitates such studies because it can accurately imitate liver activity in vitro, but traditional hepatocyte culture methods are limited in efficiency and operability. This study pursued an optimized culture method for duck primary hepatocytes in order to explore AS in a choline-deficient model. Methods: We established an optimized culture method for duck primary hepatocytes using a multi-step digestion procedure on Pekin duck embryos. A NAFLD model was then constructed with choline-free medium. RNA-seq followed by rMATS analysis was performed to identify alterations in AS events in choline-deficient duck primary hepatocytes. Results: Embryonic day 13 (E13) to E15 was suitable for obtaining hepatocytes, and viability exceeded 95% by trypan blue exclusion assay. The primary hepatocytes retained their biological functions, as shown by Periodic Acid-Schiff staining and a glucose-6-phosphate dehydrogenase activity assay. Expression of the alb and afp genes and of albumin protein was detected to verify the cultured hepatocytes, and immunofluorescence indicated a purity of up to 90%. On this basis, a choline-deficient model was constructed and displayed a significant increase in intracellular triglyceride and cholesterol, as reported previously. Intriguingly, our data suggested that AS events in the choline-deficient model, acting as an aberrant layer of transcriptional regulation, were implicated in pivotal biological processes; 16 of the affected genes were involved in lipid metabolism and were highly enriched in glycerophospholipid metabolism. Conclusion: An effective and rapid protocol for obtaining duck primary hepatocytes was established, and our findings show that choline deficiency can induce lipid accumulation and aberrant AS events in hepatocytes, providing novel insight into the role of AS in choline metabolism.
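As a rough, hypothetical illustration of the splicing-analysis step described in this abstract, the sketch below filters an rMATS output table for significant AS events in Python. The file name SE.MATS.JC.txt and the FDR/IncLevelDifference columns are standard rMATS outputs, but the thresholds and paths are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical post-processing of rMATS output (not the authors' exact pipeline).
# rMATS writes one table per AS type, e.g. SE.MATS.JC.txt for skipped exons,
# with FDR and IncLevelDifference columns used to call significant events.
import pandas as pd

def significant_as_events(rmats_table, fdr_cutoff=0.05, dpsi_cutoff=0.1):
    """Return AS events passing FDR and |delta PSI| thresholds (assumed cutoffs)."""
    df = pd.read_csv(rmats_table, sep="\t")
    mask = (df["FDR"] < fdr_cutoff) & (df["IncLevelDifference"].abs() > dpsi_cutoff)
    return df.loc[mask, ["GeneID", "geneSymbol", "FDR", "IncLevelDifference"]]

if __name__ == "__main__":
    events = significant_as_events("SE.MATS.JC.txt")   # skipped-exon events
    print(f"{len(events)} significant skipped-exon events")
```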

A Thoracic Spine Segmentation Technique for Automatic Extraction of VHS and Cobb Angle from X-ray Images (X-ray 영상에서 VHS와 콥 각도 자동 추출을 위한 흉추 분할 기법)

  • Ye-Eun, Lee;Seung-Hwa, Han;Dong-Gyu, Lee;Ho-Joon, Kim
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.12 no.1
    • /
    • pp.51-58
    • /
    • 2023
  • In this paper, we propose an organ segmentation technique for the automatic extraction of medical diagnostic indicators from X-ray images. To calculate diagnostic indicators of heart disease and spinal disease, such as the VHS (vertebral heart scale) and the Cobb angle, the thoracic spine, carina, and heart must be accurately segmented in a chest X-ray image. We adopted a deep neural network model in which a high-resolution representation of the image at each layer is connected in parallel with branches converted into low-resolution feature maps. This structure allows relative position information in the image to be effectively reflected in the segmentation process. We show that learning performance can be improved by combining an OCR module, in which pixel information and object information interact in a multi-step process, with a channel attention module, which allows each channel of the network to be weighted differently. In addition, a method of augmenting the training data is presented to provide robust performance against changes in the position, shape, and size of the subject in the X-ray image. The effectiveness of the proposed method was evaluated through an experiment using 145 human chest X-ray images and 118 animal X-ray images.
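As a rough illustration of the channel attention idea mentioned in this abstract (not the authors' exact module), a squeeze-and-excitation style block in PyTorch might look like the following sketch; the channel count and reduction ratio are arbitrary assumptions.

```python
# Minimal squeeze-and-excitation style channel attention block (illustrative only;
# the paper's exact module and hyperparameters are not reproduced here).
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(                      # excitation: per-channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                  # reweight each feature channel

feat = torch.randn(2, 64, 32, 32)                     # dummy feature map
print(ChannelAttention(64)(feat).shape)               # torch.Size([2, 64, 32, 32])
```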

A Study on the Electrical Characteristics of Ge2Sb2Te5/Ti/W-Ge8Sb2Te11 Structure for Multi-Level Phase Change Memory (다중준위 상변환 메모리를 위한 Ge2Sb2Te5/Ti/W-Ge8Sb2Te11 구조의 전기적 특성 연구)

  • Oh, Woo-Young;Lee, Hyun-Yong
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers
    • /
    • v.35 no.1
    • /
    • pp.44-49
    • /
    • 2022
  • In this paper, we investigated the current (I)- and voltage (V)-sweeping properties of a double-stack structure, Ge2Sb2Te5/Ti/W-doped Ge8Sb2Te11, a candidate medium for multilevel phase-change memory. 200-nm-thick Ge2Sb2Te5 and W-doped Ge8Sb2Te11 films were deposited on p-type Si(100) substrates using a magnetron sputtering system, and the sheet resistance was measured using the four-point-probe method. The sheet resistance of the amorphous-phase W-doped Ge8Sb2Te11 film was about one order of magnitude larger than that of the Ge2Sb2Te5 film. The I- and V-sweeping properties were measured using a source meter, a pulse generator, and a digital multimeter. The speed of amorphous-to-multilevel crystallization was evaluated from a graph of resistance versus pulse duration (t) at a fixed applied voltage (12 V). All the double-stack cells exhibited a two-step phase-change process with multilevel memory states of high, middle, and low resistance (HR-MR-LR). In particular, a stable MR state is required to guarantee the reliability of multilevel phase-change memory. For the Ge2Sb2Te5 (150 nm)/Ti (20 nm)/W-Ge8Sb2Te11 (50 nm) stack, the HR→MR and MR→LR phase transformations were observed at t < 30 ns and t < 65 ns, respectively. We believe that a high-speed and stable multilevel phase-change memory can be optimized using a double-stack structure of appropriate Ge-Sb-Te films separated by a barrier metal (Ti).
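For reference, the sheet resistance of a thin film measured with a collinear four-point probe is commonly computed as Rs = (π/ln 2)·V/I. The minimal sketch below only illustrates that conversion with made-up numbers, not the paper's measurements.

```python
# Sheet resistance from a collinear four-point-probe measurement on a thin film:
# R_s = (pi / ln 2) * V / I  (geometric correction factors are neglected here).
import math

def sheet_resistance(voltage_v: float, current_a: float) -> float:
    return (math.pi / math.log(2)) * voltage_v / current_a   # ohms per square

# Illustrative numbers only (not measurements from the paper).
print(f"{sheet_resistance(0.10, 1e-3):.1f} ohm/sq")
```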

A Style Study on the Iranian Vampire Film (이란-뱀파이어 영화 <밤을 걷는 뱀파이어 소녀> 스타일 연구)

The Mediating Effects of Depression on Loneliness and Suicidal Ideation in Elderly Living Alone with Diabetes Mellitus: A Secondary Data Analysis (독거 당뇨병 노인의 외로움이 자살사고에 미치는 영향에 대한 우울의 매개효과 검증: 2차 자료 분석)

  • Moonhee Gang;Yujin Ahn
    • Journal of Industrial Convergence
    • /
    • v.21 no.5
    • /
    • pp.51-58
    • /
    • 2023
  • The purpose of this study was to analyze the suicidal ideation of elderly people with diabetes living alone and to investigate the relationships among suicidal ideation, loneliness, and depression. The study was conducted as a secondary data analysis of a complete survey of the elderly living alone in O province. The subjects were 466 elderly people living alone who had been diagnosed with diabetes. The data were analyzed with descriptive statistics, t-tests, Pearson's correlation coefficients, multiple linear regression, and the three-step mediation-testing procedure of Baron & Kenny, using SPSS 26.0. The subjects' mean loneliness and depression scores were 4.52±3.30 and 4.88±4.03, respectively, and 27.3% of the subjects reported suicidal ideation. Loneliness (β=.20, p=.005) and depression (β=.30, p<.001) were significant predictors of suicidal ideation. In addition, depression was found to partially mediate the relationship between loneliness and suicidal ideation. These results show that loneliness and depression are important factors related to suicidal ideation in elderly people with diabetes living alone, and that interventions to lower loneliness and depression are needed to reduce suicidal ideation in this population.
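To make the three-step procedure concrete, here is a minimal sketch of Baron & Kenny's mediation test using Python's statsmodels; the variable names and the data frame are hypothetical assumptions (the study itself used SPSS 26.0).

```python
# Baron & Kenny's three-step mediation test, sketched with statsmodels OLS.
# Variable names (loneliness, depression, suicidal_ideation) and the dataframe
# are hypothetical, not the study's actual data.
import pandas as pd
import statsmodels.formula.api as smf

def baron_kenny(df: pd.DataFrame):
    # Step 1: predictor -> outcome
    m1 = smf.ols("suicidal_ideation ~ loneliness", data=df).fit()
    # Step 2: predictor -> mediator
    m2 = smf.ols("depression ~ loneliness", data=df).fit()
    # Step 3: predictor + mediator -> outcome; partial mediation is indicated
    # when the loneliness coefficient shrinks but remains significant.
    m3 = smf.ols("suicidal_ideation ~ loneliness + depression", data=df).fit()
    return m1, m2, m3

# m1, m2, m3 = baron_kenny(survey_df)   # survey_df: hypothetical survey dataframe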

Evaluation of accuracy for measurement of Dioxins (PCDDs/PCDFs) by using certified reference material (CRM) (인증표준물질(Certified reference materials, CRM)을 이용한 다이옥신류(PCDDs/PCDFs) 측정의 정확도 평가)

  • Youn, Yeu Young;Park, Deok Hie;Lim, Young Hee;Cho, Hye Sung
    • Analytical Science and Technology
    • /
    • v.22 no.5
    • /
    • pp.376-385
    • /
    • 2009
  • In this study, we evaluated the accuracy of measuring the seventeen 2,3,7,8-substituted PCDDs/PCDFs in a certified reference material (CRM), a homogeneous sediment sample from an area known to have significant chemical contamination, particularly by PCBs (polychlorinated biphenyls). The methodology followed the official method for unintentionally produced persistent organic pollutants (UPOPs) announced by the Ministry of Environment of the Republic of Korea in 2007, with one modification: an additional purification step using an activated carbon column was added, because the interferences in the sample were not sufficiently removed when only a multi-silica column and an alumina column were used. The |En| number proposed by the Korea Research Institute of Standards and Science was used as the criterion for accuracy. When a DB-5MS column and an SP-2331 column were used together, the |En| numbers of all seventeen 2,3,7,8-substituted PCDDs/PCDFs were 1 or below, so the results were judged to pass. Because 1,2,3,7,8-PeCDD and #169-HxCB were not separated on the DB-5MS column, the ions of 1,2,3,7,8-PeCDD were monitored at M/M+2 instead of the M+2/M+4 suggested by EPA Method 1613, which made it possible to distinguish them in the HRGC/HRMS analysis.
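For reference, the |En| number compares a laboratory result with a certified value through their expanded uncertainties, En = (x_lab − x_ref) / √(U_lab² + U_ref²), with |En| ≤ 1 taken as a pass. The sketch below illustrates the calculation with made-up numbers, not the paper's data.

```python
# E_n number for comparing a measured value against a certified reference value:
# E_n = (x_lab - x_ref) / sqrt(U_lab**2 + U_ref**2), where U are expanded
# uncertainties; |E_n| <= 1 is taken as a satisfactory ("Pass") result.
import math

def en_number(x_lab, x_ref, u_lab, u_ref):
    return (x_lab - x_ref) / math.sqrt(u_lab**2 + u_ref**2)

# Illustrative values only (not the paper's measurements).
print(abs(en_number(10.4, 10.0, 0.6, 0.5)) <= 1.0)   # True -> "Pass"
```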

Implementation of Git's Commit Message Classification Model Using GPT-Linked Source Change Data

  • Ji-Hoon Choi;Jae-Woong Kim;Seong-Hyun Park
    • Journal of the Korea Society of Computer and Information
    • /
    • v.28 no.10
    • /
    • pp.123-132
    • /
    • 2023
  • Git commit messages record the history of source changes during project development and operation. By utilizing this historical data, project risks and project status can be identified, thereby reducing costs and improving time efficiency. Much related research is in progress, and one such area classifies commit messages by software-maintenance type. Among published studies, the highest reported classification accuracy is 95%. In this paper, we set out to build solutions based on a commit-classification model and to remove the limitation that the most accurate existing model can only be applied to programs written in the Java language. To this end, we designed and implemented an additional step that standardizes source change data into natural language using GPT. This paper explains the process of extracting commit messages and source change data from Git, standardizing the source change data with GPT, and training a DistilBERT model on the result. Verification yielded an accuracy of 91%. The proposed model was implemented and verified to maintain accuracy while classifying commits without depending on a specific programming language. In the future, we plan to study a classification model using Bard and a project-management tool based on the proposed classification model.
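As a minimal sketch of the classification step, the code below runs a GPT-standardized change description through a DistilBERT sequence classifier with Hugging Face transformers. The label set and the base checkpoint are assumptions, and the classification head shown here would still need the fine-tuning described in the paper.

```python
# Minimal sketch of classifying a (GPT-standardized) change description with
# DistilBERT via Hugging Face transformers. The label set and checkpoint name
# are assumptions; the paper fine-tunes its own model on Git-derived data.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["corrective", "adaptive", "perfective"]   # assumed maintenance types

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(LABELS))  # untrained head; needs fine-tuning

text = "Renamed variable x to total_count and updated all call sites."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[int(logits.argmax(dim=-1))])
```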

Assessment of Landslide Susceptibility in Jecheon Using Deep Learning Based on Exploratory Data Analysis (데이터 탐색을 활용한 딥러닝 기반 제천 지역 산사태 취약성 분석)

  • Sang-A Ahn;Jung-Hyun Lee;Hyuck-Jin Park
    • The Journal of Engineering Geology
    • /
    • v.33 no.4
    • /
    • pp.673-687
    • /
    • 2023
  • Exploratory data analysis is the process of observing and understanding data collected from various sources to identify their distributions and correlations through their structures and characteristics. This process can be used to identify correlations among conditioning factors and to select the most effective factors for analysis, which helps the assessment of landslide susceptibility because landslides are usually triggered by multiple factors whose impacts vary by region. This study compared two stages of exploratory data analysis to examine the impact of the data exploration procedure on the performance of a landslide prediction model with respect to factor selection. The deep-learning-based landslide susceptibility analysis used either combinations of selected factors or all 23 factors. During the data exploration phase, we used a Pearson correlation coefficient heat map and a histogram of random forest feature importance. We then assessed the accuracy of the deep-learning-based landslide susceptibility analysis using a confusion matrix. Finally, a landslide susceptibility map was generated from the landslide susceptibility index derived from the proposed analysis. The analysis revealed that using all 23 factors resulted in low accuracy (55.90%), whereas using the 13 factors selected in one stage of exploration improved the accuracy to 81.25%, and using only the nine conditioning factors selected in both stages of data exploration further improved it to 92.80%. Therefore, exploratory data analysis selected the conditioning factors most suitable for landslide susceptibility analysis and thereby improved the performance of the analysis.
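A minimal sketch of the two-stage factor screening described in this abstract is given below in Python; the CSV file, column names, and the number of retained factors are illustrative assumptions, not the study's data.

```python
# Illustrative two-stage factor screening: Pearson correlation heat map, then
# random-forest feature importance. File, column names, and cutoffs are assumed.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("conditioning_factors.csv")      # hypothetical factor table
X, y = df.drop(columns="landslide"), df["landslide"]

# Stage 1: inspect pairwise correlations and drop one factor from each
# highly correlated pair.
sns.heatmap(X.corr(method="pearson"), cmap="coolwarm", center=0)
plt.show()

# Stage 2: rank the remaining factors by random-forest feature importance.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
importance = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
print(importance.head(9))                         # e.g. keep the top-ranked factors
```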

Interactive analysis tools for the wide-angle seismic data for crustal structure study (Technical Report) (지각 구조 연구에서 광각 탄성파 자료를 위한 대화식 분석 방법들)

  • Fujie, Gou;Kasahara, Junzo;Murase, Kei;Mochizuki, Kimihiro;Kaneda, Yoshiyuki
    • Geophysics and Geophysical Exploration
    • /
    • v.11 no.1
    • /
    • pp.26-33
    • /
    • 2008
  • The analysis of wide-angle seismic reflection and refraction data plays an important role in lithospheric-scale crustal structure studies. However, it is extremely difficult to derive an appropriate velocity structure model directly from the observed data, and the structure model must be improved step by step, because crustal structure analysis is an intrinsically non-linear problem. Several subjective processes are involved in wide-angle crustal structure modelling, such as phase identification and trial-and-error forward modelling. Because these subjective processes reduce the uniqueness and credibility of the resultant models, it is important to reduce subjectivity in the analysis procedure. From this point of view, we describe two software tools, PASTEUP and MODELING, for developing crustal structure models. PASTEUP is an interactive application that facilitates the plotting of record sections, the analysis of wide-angle seismic data, and the picking of phases. It is equipped with various filters and analysis functions to enhance the signal-to-noise ratio and to help phase identification. MODELING is an interactive application for editing velocity models and for ray tracing. Synthetic traveltimes computed by MODELING can be compared directly with the observed waveforms in PASTEUP. This reduces subjectivity in crustal structure modelling because traveltime picking, one of the most subjective processes in crustal structure analysis, is not required. MODELING can also convert an editable layered structure model into two-way traveltimes that can be compared with time sections of multi-channel seismic (MCS) reflection data. Direct comparison of the structure model from the wide-angle data with the reflection data gives the model more credibility. In addition, both PASTEUP and MODELING are efficient tools for handling large datasets. These software tools help us develop more plausible lithospheric-scale structure models from wide-angle seismic data.
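As a simplified analogue of converting a layered depth model into two-way traveltimes, the sketch below sums 2h/v over layers; this vertical-incidence approximation is an illustrative assumption only, since MODELING performs full ray tracing.

```python
# Vertical two-way traveltime for a simple layered velocity model, as a rough
# analogue of converting a depth model to a time section (illustrative only;
# PASTEUP/MODELING perform ray tracing, not this simplification).
def two_way_time(thicknesses_km, velocities_km_s):
    """Sum of 2*h/v over the layers above the reflector (seconds)."""
    return sum(2.0 * h / v for h, v in zip(thicknesses_km, velocities_km_s))

# Hypothetical three-layer crustal model.
print(f"{two_way_time([2.0, 5.0, 10.0], [2.5, 5.0, 6.5]):.2f} s")
```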

A Study on the Applicability of Soil Remediation Technology for Contaminated Sediment in an Agro-livestock Reservoir (농축산저수지 오염퇴적토의 토양정화기술에 대한 적용성 연구)

  • Jung, Jaeyun;Chang, Yoonyoung
    • Journal of Environmental Impact Assessment
    • /
    • v.29 no.3
    • /
    • pp.157-181
    • /
    • 2020
  • Sediments in rivers, lakes, and marine ports serve as end points for pollutants discharged into the water and, at the same time, as sources of pollutants that are continuously released back into the water. Until now, contaminated sediments have been landfilled or dumped at sea; however, landfilling is expensive, and dumping at sea has been completely banned under the London Convention. This study therefore applied soil remediation methods to the contaminated sediment of an agro-livestock reservoir ('Royal Palace Livestock Complex'). Overseas application cases and domestically applicable technologies were reviewed, and pretreatment, composting, soil washing, electrokinetics, and thermal desorption were applied. A site survey of pollutant characteristics showed that dissolved oxygen (DO), suspended solids (SS), chemical oxygen demand (COD), total nitrogen (TN), and total phosphorus (TP) exceeded the discharge water quality standards; SS, COD, TN, and TP in particular exceeded the standards by factors of tens to hundreds. The soil showed high concentrations of copper and zinc, which are added to pig feed to promote growth, and cadmium exceeded the Region 1 standard of the Soil Environment Conservation Act. In the pretreatment step, a hydrocyclone was used for particle-size separation, and more than 80% of the fine soil was separated. Composting was performed on organic and total petroleum hydrocarbon (TPH) contaminated soils: TPH was treated to within the level of concern, E. coli counts were high in the organic matter, and applying the optimum composting conditions at 70℃ satisfied the fertilizer specification except that the organic matter content remained below the specification. Sequential extraction showed that Cd in the fine soil was present mainly in the residual fraction (stage 5), whereas Cu and Zn were mostly in the ion-exchangeable (stage 1), carbonate (stage 2), and iron/manganese oxide (stage 3) fractions, which are comparatively easy to separate. Applying acid dissolution and multi-stage washing step by step, hydrochloric acid at 1.0 M, a 1:3 ratio, 200 rpm, and 60 min were identified as the optimal washing conditions, and most of the washed sediments satisfied the standards of the Soil Environment Conservation Act. Therefore, the applicability tests of this study showed that soil with high heavy-metal contamination can be used as aggregate after pretreatment and soil washing, and that organic and oil-contaminated soil can be used as mature compost once the contaminants and E. coli have been removed by composting.
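As a small, hypothetical illustration of how washing conditions can be compared, the sketch below computes heavy-metal removal efficiency for two trial conditions; the concentrations are made-up values, not the study's measurements.

```python
# Hypothetical comparison of soil-washing conditions by heavy-metal removal
# efficiency; the concentrations below are illustrative, not the study's data.
def removal_efficiency(before_mg_kg: float, after_mg_kg: float) -> float:
    return 100.0 * (before_mg_kg - after_mg_kg) / before_mg_kg

trials = {
    "HCl 0.5 M, 1:3, 60 min": (250.0, 120.0),
    "HCl 1.0 M, 1:3, 60 min": (250.0, 60.0),
}
best = max(trials, key=lambda k: removal_efficiency(*trials[k]))
print(best, f"{removal_efficiency(*trials[best]):.0f}% removal")
```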