• Title/Summary/Keyword: computational materials science


Inferring B-cell derived T-cell receptor induced multi-epitope-based vaccine candidate against enterovirus 71: a reverse vaccinology approach

  • Subrat Kumar Swain;Subhasmita Panda;Basanta Pravas Sahu;Soumya Ranjan Mahapatra;Jyotirmayee Dey;Rachita Sarangi;Namrata Misra
    • Clinical and Experimental Vaccine Research, v.13 no.2, pp.132-145, 2024
  • Purpose: Enterovirus 71, the pathogen that causes hand-foot-and-mouth disease (HFMD), is currently regarded as an emerging neurotropic virus in Asia and can cause severe complications in pediatric patients, along with blister-like sores or rashes on the hands, feet, and mouth. Notwithstanding the significant burden of the disease, no licensed vaccine is available, and previously developed attenuated and inactivated vaccines lose their value over time owing to changes in the viral genome. Materials and Methods: A novel vaccine construct was designed using B-cell derived T-cell epitopes from the virulent polyprotein that were found to induce a possible immune response. To boost the immune response, a beta-defensin 1 preproprotein adjuvant was added at the N-terminal end of the vaccine sequence via an EAAAK linker. Results: The designed, refined, and validated three-dimensional structure of the multi-epitope vaccine was found to be highly immunogenic, with non-allergenic and antigenic properties. The vaccine candidate bound to toll-like receptor 3 in a molecular docking analysis, and its efficacy in generating a strong immune response was assessed through in silico immune simulation. Conclusion: Computational analysis indicates that the proposed multi-epitope vaccine is likely safe for use in humans and can elicit an immune response.
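The construct described above (an adjuvant joined to the N-terminus via an EAAAK linker, followed by a chain of epitopes) amounts to simple sequence assembly. A minimal sketch follows; the adjuvant fragment, epitope strings, and the GPGPG inter-epitope linker are illustrative placeholders, not the authors' actual sequences — only the EAAAK adjuvant linker comes from the abstract.

```python
def assemble_construct(adjuvant, epitopes, adjuvant_linker="EAAAK",
                       epitope_linker="GPGPG"):
    """Join the adjuvant to the N-terminus with the adjuvant linker,
    then chain the epitopes with the inter-epitope linker."""
    return adjuvant + adjuvant_linker + epitope_linker.join(epitopes)

# Hypothetical toy sequences (placeholders, not from the paper)
adjuvant = "GIINTLQKYYCRVRGGRCAV"   # stand-in for a beta-defensin 1 fragment
epitopes = ["SIRIKRATL", "YPTFGEHKQ", "KLTDPPFQL"]
construct = assemble_construct(adjuvant, epitopes)
print(construct)
```

The same assembly pattern extends to multi-linker designs by calling the function with different linker arguments per epitope class.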

Conclusions and Suggestions on Low-Dose and Low-Dose Rate Radiation Risk Estimation Methodology

  • Sakai, Kazuo;Yamada, Yutaka;Yoshida, Kazuo;Yoshinaga, Shinji;Sato, Kaoru;Ogata, Hiromitsu;Iwasaki, Toshiyasu;Kudo, Shin'ichi;Asada, Yasuki;Kawaguchi, Isao;Haeno, Hiroshi;Sasaki, Michiya
    • Journal of Radiation Protection and Research, v.46 no.1, pp.14-23, 2021
  • Background: For radiological protection and control, the International Commission on Radiological Protection (ICRP) provides nominal risk coefficients for radiation exposure, which are derived from the excess relative risk and excess absolute risk obtained from the Life Span Study of atomic bomb survivors in Hiroshima and Nagasaki, adjusted by the dose and dose-rate effectiveness factor (DDREF). Materials and Methods: Since it is impossible to estimate radiation risk at doses below approximately 100 mSv directly from epidemiological knowledge and data alone, support from radiation biology is indispensable, and several national and international bodies have therefore advocated bridging knowledge between biology and epidemiology. Since the accident at the Tokyo Electric Power Company (TEPCO)'s Fukushima Daiichi Nuclear Power Station in 2011, public exposure to radiation has become a major concern, and it was considered that radiation risk estimation should be made more realistic to cope with the prevailing exposure situation. Results and Discussion: To discuss these issues from a broad range of perspectives related to radiological protection, and to bridge knowledge between biology and epidemiology, we established a research group on low-dose and low-dose-rate radiation risk estimation methodology with the permission of the Japan Health Physics Society. Conclusion: The aim of the research group was to clarify the current situation and issues in risk estimation for low-dose and low-dose-rate radiation exposure from the viewpoints of different research fields, such as epidemiology, biology, modeling, and dosimetry, and to identify a future strategy and roadmap toward a more realistic estimation of risk from low-dose and low-dose-rate radiation exposure.
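The DDREF adjustment mentioned above amounts to dividing a risk coefficient estimated from acute, high-dose-rate exposure by a fixed factor (ICRP recommends DDREF = 2) when extrapolating to low doses and dose rates. A minimal worked sketch, where the ERR-per-Gy value is illustrative and not a figure from the Life Span Study:

```python
def low_dose_risk(err_per_gy, dose_gy, ddref=2.0):
    """Scale an excess relative risk (ERR) coefficient from acute
    high-dose-rate exposure down to low-dose/low-dose-rate conditions
    by dividing through the DDREF (linear extrapolation assumed)."""
    return err_per_gy * dose_gy / ddref

# Illustrative numbers only: ERR of 0.5 per Gy, a 100 mGy dose, DDREF = 2
risk = low_dose_risk(err_per_gy=0.5, dose_gy=0.1, ddref=2.0)
print(risk)  # 0.025
```

The linear-no-threshold shape assumed here is exactly the modeling choice the abstract says the research group wants to scrutinize with biological data.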

Internal Defect Evaluation of Spot Weld Parts and Carbon Composites using the Non-contact Air-coupled Ultrasonic Transducer Method (비접촉 초음파 탐상기법을 이용한 스폿용접부 및 탄소복합체의 내부 결함평가)

  • Kwak, Nam-Su;Lee, Seung-Chul
    • Journal of the Korea Academia-Industrial cooperation Society, v.15 no.11, pp.6432-6439, 2014
  • The NAUT (non-contact air-coupled ultrasonic testing) technique is an ultrasonic testing method that enables non-contact inspection by compensating for the energy loss caused by the acoustic impedance mismatch of air, using an ultrasonic pulser-receiver, a pre-amplifier, and a high-sensitivity transducer. Because NAUT maintains stable ultrasonic transmission and reception, it can inspect materials at high or low temperatures and specimens with rough surfaces or narrow parts, which could not be tested with conventional contact-type techniques. In this study, internal defects in spot welds, which are widely used in auto parts, and in CFRP parts were examined to assess the practicality of commercializing the NAUT technique. A soundly spot-welded region has high ultrasonic transmissivity and therefore appeared red, whereas a region with an internal defect contains a layer of air, has low transmissivity, and appeared blue. In addition, the color sharpness varied with the PRF (pulse repetition frequency), an important factor that determines measurement speed. From the images obtained from the CFRP specimens with an imaging device, the shape, size, and position of internal defects could be identified within a short time. These experiments confirmed that both internal defect detection and imaging of the defects are possible with the NAUT technique, and that NAUT can be applied to detecting internal defects in spot-welded or CFRP parts and commercialized in various fields.
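The red/blue imaging logic described above (high transmission through a sound weld nugget, low transmission across an air-filled gap) can be sketched as a simple amplitude threshold over a scan grid. The amplitudes and the threshold below are invented for illustration; the paper's actual amplitude scale and color mapping are not specified in the abstract.

```python
def classify_scan(amplitudes, threshold=0.5):
    """Map per-point through-transmission amplitudes to a defect image:
    'R' (red, good transmission) vs. 'B' (blue, air gap / defect)."""
    return [["R" if a >= threshold else "B" for a in row]
            for row in amplitudes]

# Toy 3x4 C-scan grid with a low-transmission (defective) patch in the middle
cscan = [
    [0.9, 0.8, 0.9, 0.9],
    [0.9, 0.2, 0.1, 0.8],
    [0.9, 0.9, 0.8, 0.9],
]
image = classify_scan(cscan)
print(image[1])  # ['R', 'B', 'B', 'R']
```

Counting connected 'B' cells in such a map is how the shape, size, and position of a defect would be read off the image.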

Development of Network Based MT Data Processing System (네트워크에 기반한 MT자료의 처리기술 개발 연구)

  • Lee Heuisoon;Kwon Byung-Doo;Chung Hojoon;Oh Seokhoon
    • Geophysics and Geophysical Exploration, v.3 no.2, pp.53-60, 2000
  • A server/client system using web protocols and a network-based distributed computing environment was applied to MT data processing, built on Java technology. With this network-based system, users obtain consistent and stable results because the system uses standardized analysis methods and has been tested by many users over the internet. Users can check the MT data processing at any time and obtain results during a survey, reducing exploration time and cost. Pure/enterprise Java technology provides the facilities needed to develop such a network-based MT data processing system; web-based socket communication and RMI were each tested to produce an effective and practical client application. Intrinsically, the interpretation of MT data, which involves inversion and data processing, is computationally demanding, so we adopted the MPI parallel processing technique to meet the needs of users in the field and expect it to be effective for the maintenance and upgrading of the program codes.
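The per-frequency nature of MT processing is what makes it easy to distribute: each frequency's impedance estimate can be converted to an apparent resistivity independently, using the standard relation ρ_a = |Z|² / (μ₀ω). A sketch follows, with a thread pool standing in for the MPI distribution the paper describes and hypothetical impedance values:

```python
import math
from concurrent.futures import ThreadPoolExecutor

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def apparent_resistivity(freq_z):
    """Standard MT apparent resistivity: rho_a = |Z|^2 / (mu0 * omega)."""
    freq, z = freq_z
    omega = 2.0 * math.pi * freq
    return abs(z) ** 2 / (MU0 * omega)

# Hypothetical impedance tensor elements (ohm) at three frequencies (Hz)
data = [(0.01, 0.02 + 0.01j), (0.1, 0.05 + 0.03j), (1.0, 0.2 + 0.1j)]

# Distribute the per-frequency work across a worker pool, in the spirit of
# the MPI-based distribution in the paper (MPI itself is not reproduced here).
with ThreadPoolExecutor(max_workers=3) as pool:
    rhos = list(pool.map(apparent_resistivity, data))
print(len(rhos))
```

Because each frequency is independent, the same `map` call scales to any number of workers or MPI ranks without changing the per-frequency function.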


Data-centric XAI-driven Data Imputation of Molecular Structure and QSAR Model for Toxicity Prediction of 3D Printing Chemicals (3D 프린팅 소재 화학물질의 독성 예측을 위한 Data-centric XAI 기반 분자 구조 Data Imputation과 QSAR 모델 개발)

  • ChanHyeok Jeong;SangYoun Kim;SungKu Heo;Shahzeb Tariq;MinHyeok Shin;ChangKyoo Yoo
    • Korean Chemical Engineering Research, v.61 no.4, pp.523-541, 2023
  • As accessibility to 3D printers increases, exposure to the chemicals associated with 3D printing is becoming more frequent. However, research on the toxicity and harmfulness of the chemicals generated by 3D printing is insufficient, and the performance of toxicity prediction using in silico techniques is limited by missing molecular structure data. In this study, a quantitative structure-activity relationship (QSAR) model based on a data-centric AI approach was developed to predict the toxicity of new 3D printing materials by imputing missing values in molecular descriptors. First, the MissForest algorithm was used to impute missing values in the molecular descriptors of hazardous 3D printing materials. Then, based on four machine learning models (decision tree, random forest, XGBoost, and SVM), a machine learning (ML)-based QSAR model was developed to predict the bioconcentration factor (Log BCF), the octanol-air partition coefficient (Log Koa), and the partition coefficient (Log P). Furthermore, the reliability of the data-centric QSAR model was validated with Tree-SHAP (SHapley Additive exPlanations), an explainable artificial intelligence (XAI) technique. The proposed MissForest-based imputation enlarged the molecular structure data approximately 2.5-fold compared with the existing data. Based on the imputed molecular descriptor dataset, the developed data-centric QSAR model achieved prediction performance of approximately 73%, 76%, and 92% for Log BCF, Log Koa, and Log P, respectively. Lastly, Tree-SHAP analysis demonstrated that the data-centric QSAR model achieved its high prediction performance by identifying the key molecular descriptors highly correlated with the toxicity indices. The proposed QSAR model based on the data-centric XAI approach can therefore be extended to predict the toxicity of potential pollutants in emerging printing chemicals and in chemical, semiconductor, or display processes.
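MissForest works by initializing missing cells with column statistics and then iteratively re-predicting each missing cell from the other descriptors with a random forest. A stdlib-only sketch of that loop follows; a trivial 1-nearest-row predictor stands in for MissForest's random forests, and the descriptor table is a toy example, so this is an illustration of the iteration pattern rather than the authors' implementation.

```python
def impute(matrix, n_iter=3):
    """MissForest-style iterative imputation sketch: fill missing cells
    (None) with column means, then repeatedly re-predict each missing cell
    from the most similar row.  A 1-NN predictor stands in for the random
    forests used by the real MissForest algorithm."""
    miss = [(i, j) for i, row in enumerate(matrix)
            for j, v in enumerate(row) if v is None]
    filled = [row[:] for row in matrix]          # do not mutate the input
    ncols = len(matrix[0])
    for j in range(ncols):                       # initial fill: column means
        obs = [row[j] for row in matrix if row[j] is not None]
        mean = sum(obs) / len(obs)
        for i, jj in miss:
            if jj == j:
                filled[i][j] = mean
    for _ in range(n_iter):                      # iterative refinement
        for i, j in miss:
            def dist(r):                         # distance over other columns
                return sum((filled[i][k] - filled[r][k]) ** 2
                           for k in range(ncols) if k != j)
            nearest = min((r for r in range(len(filled)) if r != i), key=dist)
            filled[i][j] = filled[nearest][j]
    return filled

# Toy descriptor table: row 1 is missing its second descriptor
table = [
    [1.0, 10.0],
    [2.0, None],
    [2.1, 21.0],
    [5.0, 50.0],
]
result = impute(table)
print(result[1][1])  # 21.0 (copied from the nearest row, [2.1, 21.0])
```

The refinement loop is where the enlargement reported in the abstract comes from: rows that would otherwise be dropped for incompleteness become usable training data.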

Geological Achievements of the 20th Century and Their Influence on Geological Thinking (20세기에 이룩된 지질과학 업적과 이것이 지질과학 사고방식에 끼친 영향)

  • Chang, Soon-Keun;Lee, Sang-Mook
    • Journal of the Korean earth science society, v.21 no.5, pp.635-646, 2000
  • The geological achievements of the 20th century revolutionized our geological understanding and concepts. A good example is continental drift, suggested early in the 20th century and later explained in terms of seafloor spreading and plate tectonics. Our understanding of the composition of the materials forming the Earth also improved during the 20th century. Radioactive and stable isotopes, together with biostratigraphy and sequence stratigraphy, allow us to interpret the evolution of sedimentary basins in terms of plate movement and sedimentation processes. The Deep Sea Drilling Project, initiated in the 1960s and continued as the Ocean Drilling Program in the 1980s, is one of the most successful international research programs, and new developments in computational techniques have provided a wholly new view of the Earth's interior. Most of the geological features and phenomena observed in the deep sea and around continental margins are now explained in terms of global tectonic processes, such as superplumes rising from the interior of our planet and interacting with supercontinents such as Rodinia, Pannotia, and Nena back in Precambrian time. The space explorations that began in the late 1950s opened a new path to astrogeology, astrobiology, and astropaleontology. The impact theory, rooted in the discovery of iridium and associated phenomena in the 1980s, revived Cuvier's catastrophism as a possible explanation for the extinctions of biotas found in the geological record of this planet. Owing to the geological achievements of the 20th century, we now have a better understanding of geologic time and of processes too long to be grasped from human records.


Probable Volcanic Flood of the Cheonji Caldera Lake Triggered by Volcanic Eruption of Mt. Baekdusan (백두산 화산분화로 인해 천지에서 발생 가능한 화산홍수)

  • Lee, Khil-Ha;Kim, Sung-Wook;Yoo, Soon-Young;Kim, Sang-Hyun
    • Journal of the Korean earth science society, v.34 no.6, pp.492-506, 2013
  • Historical accounts and geological survey observations suggest that Mt. Baekdusan is now showing signs of waking from a long slumber. In response to a volcanic eruption of Mt. Baekdusan, water stored in the Cheonji caldera lake may be released. Such a volcanic flood is crucial in that it carries enormous potential energy, capable of destroying all kinds of man-made structures, and its velocity can reach up to 100 km/hr over hundreds of kilometers downstream of Lake Cheonji. The ultimate goal of this research is to estimate the level of damage caused by a volcanic flood from the Cheonji caldera lake. As a preliminary study, a scenario-based numerical analysis was performed to build hydrographs as a function of time. The analysis was performed for each scenario (breach, magma uplift, combination of uplift and breach, precipitation, etc.), and the parameters required by the model structure were chosen on the basis of the historical records of other volcanoes. This study considers only the amount of water at the rim site as a function of time; the downstream routing process is not considered.

Radiomics Analysis of Gray-Scale Ultrasonographic Images of Papillary Thyroid Carcinoma > 1 cm: Potential Biomarker for the Prediction of Lymph Node Metastasis (Radiomics를 이용한 1 cm 이상의 갑상선 유두암의 초음파 영상 분석: 림프절 전이 예측을 위한 잠재적인 바이오마커)

  • Hyun Jung Chung;Kyunghwa Han;Eunjung Lee;Jung Hyun Yoon;Vivian Youngjean Park;Minah Lee;Eun Cho;Jin Young Kwak
    • Journal of the Korean Society of Radiology, v.84 no.1, pp.185-196, 2023
  • Purpose This study aimed to investigate radiomics analysis of ultrasonographic images to develop a potential biomarker for predicting lymph node metastasis in patients with papillary thyroid carcinoma (PTC). Materials and Methods This study included 431 PTC patients from August 2013 to May 2014, classified into training and validation sets. A total of 730 radiomics features were obtained, including texture features from the gray-level co-occurrence matrix and gray-level run-length matrix, the single-level discrete two-dimensional wavelet transform, and other functions. The least absolute shrinkage and selection operator (LASSO) method was used to select the most predictive features in the training data set. Results Lymph node metastasis was associated with the radiomics score (p < 0.001), as well as with clinical variables such as young age (p = 0.007) and large tumor size (p = 0.007). The area under the receiver operating characteristic curve was 0.687 (95% confidence interval: 0.616-0.759) for the training set and 0.650 (95% confidence interval: 0.575-0.726) for the validation set. Conclusion This study showed the potential of ultrasonography-based radiomics to predict cervical lymph node metastasis in patients with PTC; thus, ultrasonography-based radiomics can act as a biomarker for PTC.
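The areas under the ROC curve reported above can be computed directly from a score and the true labels, without any plotting, via the Mann-Whitney rank statistic: the AUC is the fraction of (positive, negative) pairs that the score ranks correctly. A self-contained sketch, using toy scores rather than the study's radiomics scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    with ties counted as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy radiomics scores: metastatic (positive) vs. non-metastatic (negative)
pos = [0.9, 0.7, 0.6, 0.4]
neg = [0.5, 0.3, 0.3, 0.2]
print(auc(pos, neg))  # 0.9375
```

An AUC of 0.687, as in the training set above, therefore means the radiomics score ranks a randomly chosen metastatic case above a randomly chosen non-metastatic case about 69% of the time.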

Prediction of Lung Cancer Based on Serum Biomarkers by Gene Expression Programming Methods

  • Yu, Zhuang;Chen, Xiao-Zheng;Cui, Lian-Hua;Si, Hong-Zong;Lu, Hai-Jiao;Liu, Shi-Hai
    • Asian Pacific Journal of Cancer Prevention, v.15 no.21, pp.9367-9373, 2014
  • In the diagnosis of lung cancer, rapid distinction between small cell lung cancer (SCLC) and non-small cell lung cancer (NSCLC) tumors is very important. Serum markers, including lactate dehydrogenase (LDH), C-reactive protein (CRP), carcino-embryonic antigen (CEA), neuron-specific enolase (NSE), and Cyfra21-1, are reported to reflect lung cancer characteristics. In this study, lung tumors were classified on the basis of these biomarkers (measured in 120 NSCLC and 60 SCLC patients) by building optimal joint biomarker models with a powerful computerized tool, gene expression programming (GEP). GEP is a learning algorithm that combines the advantages of genetic programming (GP) and genetic algorithms (GA). It focuses on the relationships between variables in data sets, builds models to explain those relationships, and has been successfully used in formula finding and function mining. As a basis for defining a GEP environment for SCLC and NSCLC prediction, three explicit predictive models were constructed. CEA and NSE are frequently used lung cancer markers in clinical trials, and CRP, LDH, and Cyfra21-1 are also meaningful in lung cancer, so starting from CEA and NSE we set up three GEP models: GEP1 (CEA, NSE, Cyfra21-1), GEP2 (CEA, NSE, LDH), and GEP3 (CEA, NSE, CRP). The best classification result was obtained when CEA, NSE, and Cyfra21-1 were combined: 128 of 135 subjects in the training set and 40 of 45 subjects in the test set were classified correctly, for accuracy rates of 94.8% in the training set and 88.9% in the test set. With GEP2, accuracy decreased by 1.5% in the training set and 6.6% in the test set; with GEP3, by 0.82% and 4.45%, respectively. Serum Cyfra21-1 is a useful and sensitive serum biomarker for discriminating between NSCLC and SCLC, and GEP modeling is a promising and excellent tool in the diagnosis of lung cancer.
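The distinctive machinery of GEP is its chromosome encoding: a candidate formula is stored as a flat string in Karva notation, whose symbols are laid out level by level, so the expression tree is recovered by a breadth-first fill. A sketch of that decode-and-evaluate step follows; the operator set and terminal names are illustrative, not the biomarker formulas the study evolved.

```python
ARITY = {"+": 2, "-": 2, "*": 2, "/": 2}

def eval_karva(gene, env):
    """Decode a GEP chromosome written in Karva notation and evaluate it.
    Symbols are laid out level by level, so the tree is rebuilt
    breadth-first: each function node claims the next unused symbols
    as its children."""
    nodes = [[s] for s in gene]       # node = [symbol, child, child, ...]
    queue = [nodes[0]]
    next_idx = 1
    while queue:
        node = queue.pop(0)
        for _ in range(ARITY.get(node[0], 0)):
            child = nodes[next_idx]
            next_idx += 1
            node.append(child)
            queue.append(child)

    def ev(node):                     # recursive evaluation of the tree
        sym = node[0]
        if sym not in ARITY:
            return env[sym]
        a, b = ev(node[1]), ev(node[2])
        return {"+": a + b, "-": a - b, "*": a * b, "/": a / b}[sym]

    return ev(nodes[0])

# K-expression "+a*bc" decodes breadth-first to a + (b * c)
print(eval_karva("+a*bc", {"a": 2.0, "b": 3.0, "c": 4.0}))  # 14.0
```

In a full GEP run, populations of such genes are mutated and recombined, and each is scored by how well its decoded formula separates SCLC from NSCLC cases.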

A Study on GPU-based Iterative ML-EM Reconstruction Algorithm for Emission Computed Tomographic Imaging Systems (방출단층촬영 시스템을 위한 GPU 기반 반복적 기댓값 최대화 재구성 알고리즘 연구)

  • Ha, Woo-Seok;Kim, Soo-Mee;Park, Min-Jae;Lee, Dong-Soo;Lee, Jae-Sung
    • Nuclear Medicine and Molecular Imaging, v.43 no.5, pp.459-467, 2009
  • Purpose: Maximum likelihood-expectation maximization (ML-EM) is a statistical reconstruction algorithm derived from a probabilistic model of the emission and detection processes. Although ML-EM has many advantages in accuracy and utility, its use is limited by the computational burden of the iterative processing on a CPU (central processing unit). In this study, we developed a parallel computing technique for the ML-EM algorithm on a GPU (graphics processing unit). Materials and Methods: Using a GeForce 9800 GTX+ graphics card and CUDA (compute unified device architecture), NVIDIA's parallel computing technology, the projection and backprojection steps of the ML-EM algorithm were parallelized. The computation times for projection, for the errors between measured and estimated data, and for backprojection in each iteration were measured; the total time included the latency of data transfers between RAM and GPU memory. Results: The total computation times of the CPU- and GPU-based ML-EM with 32 iterations were 3.83 and 0.26 sec, respectively, a speed-up of about 15 times on the GPU. When the number of iterations increased to 1,024, the CPU- and GPU-based computations took 18 min and 8 sec in total, respectively, an improvement of about 135 times, attributable to delays in the CPU-based computation after a certain number of iterations. By contrast, the GPU-based computation showed very little variation in time per iteration, owing to its use of shared memory. Conclusion: GPU-based parallel computation significantly improved the computing speed and stability of ML-EM. The developed GPU-based ML-EM algorithm can easily be modified for other imaging geometries.
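The projection/backprojection loop that the paper moves onto the GPU is the standard ML-EM update: forward-project the current image, take the ratio of measured to estimated counts, back-project that ratio, and normalize by the sensitivity image. A serial, pure-Python sketch of that iteration (the CUDA kernels themselves are not reproduced), with a toy system matrix rather than a real scanner geometry:

```python
def ml_em(A, y, n_iter=32):
    """Serial ML-EM reconstruction loop.  A[i][j] is the probability that
    an emission from image voxel j is detected in bin i; y[i] are the
    measured counts.  Each iteration: forward-project, compare with the
    measurement, back-project the ratio, normalize by the sensitivity."""
    n_bins, n_vox = len(A), len(A[0])
    lam = [1.0] * n_vox                          # uniform initial image
    sens = [sum(A[i][j] for i in range(n_bins)) for j in range(n_vox)]
    for _ in range(n_iter):
        proj = [sum(A[i][j] * lam[j] for j in range(n_vox))
                for i in range(n_bins)]          # forward projection
        ratio = [y[i] / proj[i] for i in range(n_bins)]
        lam = [lam[j] / sens[j] *
               sum(A[i][j] * ratio[i] for i in range(n_bins))
               for j in range(n_vox)]            # back-project and update
    return lam

# Toy 2-bin / 2-voxel system with an identity geometry and a known answer
A = [[1.0, 0.0],
     [0.0, 1.0]]
y = [4.0, 9.0]
print(ml_em(A, y))  # [4.0, 9.0]
```

The per-bin sums in `proj` and the per-voxel sums in the update are the independent loops that map naturally onto one GPU thread per bin or per voxel, which is the parallelization the paper implements with CUDA.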