• Title/Summary/Keyword: A key technique

Search Results: 1,745

A Methodology for Quality Control of Railroad Trackbed Fills Using Compressional Wave Velocities : I. Preliminary Investigation (압축파 속도를 이용한 철도 토공노반의 품질관리 방안 : I. 예비연구)

  • Park, Chul-Soo;Mok, Young-Jin;Choi, Chan-Yong;Lee, Tai-Hee
    • Journal of the Korean Geotechnical Society / v.25 no.9 / pp.45-55 / 2009
  • The quality of railroad trackbed fills has been controlled by field measurements of density and the bearing resistance of plate-load tests. These control measures are compatible with the design procedures whose design parameter is $k_{30}$ for both ordinary-speed and high-speed railways. However, one of the fatal flaws of these design procedures is that there is no simple laboratory measurement procedure for the design parameters ($k_{30}$, or $E_{v2}$ and $E_{v2}/E_{v1}$) at the design stage. To overcome this defect, the compressional wave velocity was adopted as a control measure, in parallel with the advent of the new design procedure, and its measurement technique was proposed in this preliminary investigation. The key concept of the quality control procedure is that the target value for field compaction control is the compressional wave velocity determined at optimum moisture content using the modified compaction test, and the direct-arrival method is used for field measurements during construction, which is simple and reliable enough for practicing engineers to use. The direct-arrival method is well suited to such shallow and homogeneous fill lifts in terms of applicability and cost effectiveness. The sensitivity of direct-arrival test results to compaction quality was demonstrated at a test site, and it was concluded that the compressional wave velocity can be used effectively as a quality control measure. The experimental background for the companion study (Park et al., 2009) was established through field and laboratory measurements of the compressional wave velocity.
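
The quality-control idea above reduces to comparing a field-measured velocity against a laboratory target. A minimal Python sketch, with entirely hypothetical spacing, arrival time, and acceptance ratio (the abstract does not publish these values):

```python
def p_wave_velocity(spacing_m, arrival_time_s):
    """Compressional (P-) wave velocity from a direct-arrival measurement:
    first-arrival travel time over a known source-receiver spacing."""
    if arrival_time_s <= 0:
        raise ValueError("arrival time must be positive")
    return spacing_m / arrival_time_s

def passes_qc(field_velocity, target_velocity, ratio=0.95):
    """Accept the compacted lift if the field velocity reaches a chosen
    fraction of the lab target determined at optimum moisture content."""
    return field_velocity >= ratio * target_velocity

# Hypothetical reading: 3.0 m spacing, 10 ms first arrival -> about 300 m/s
v = p_wave_velocity(3.0, 0.010)
ok = passes_qc(v, target_velocity=310.0)
```

The 0.95 acceptance ratio is an illustrative stand-in; an actual specification would set the target value from the modified compaction test.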

Artificial Neural Network with Firefly Algorithm-Based Collaborative Spectrum Sensing in Cognitive Radio Networks

  • Velmurugan, S.;P. Ezhumalai;E.A. Mary Anita
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.7 / pp.1951-1975 / 2023
  • Recent advances in Cognitive Radio Networks (CRN) have elevated them to the status of a critical instrument for overcoming spectrum limits and meeting stringent future wireless communication requirements. Because spectrum sensing is an essential part of CRNs, collaborative spectrum sensing is presented for efficient channel selection. This study presents an innovative cooperative spectrum sensing (CSS) model built on the Firefly Algorithm (FA) together with an artificial neural network (ANN). The system makes use of user grouping strategies to improve detection performance dramatically while lowering collaboration costs. Cooperative sensing is applied only after cognitive radio users have been correctly identified using energy data samples and an ANN model; the resulting cooperative user base is secure, low-effort, and reliable. The suggested method's purpose is to choose the best transmission channel. Clustering is utilized by the suggested ANN-FA model to reduce spectrum sensing inaccuracy. The transmission channel with the highest weight is chosen using the provided channel-weight computation, which takes three sets of input parameters: PU utilization, CR count, and channel capacity. Using an improved evolutionary algorithm, the key parameters of the ANN-FA scheme are optimized to boost the overall efficiency of the CRN channel selection technique. This work focuses primarily on sensing the optimal secondary-user channel and reducing the spectrum handoff delay in wireless networks. Several benchmark functions are utilized to evaluate the efficacy of this strategy.
According to experimental findings, ANN-FA is 22.72 percent more robust and effective than the other metaheuristic algorithms. The proposed ANN-FA model is simulated using the NS2 simulator, and the results are evaluated in terms of average interference ratio, spectrum opportunity utilization, packet delivery ratio (PDR), end-to-end delay, and average throughput for a variety of CRs found in the network.
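
The abstract pairs a firefly search with a channel weight computed from PU utilization, CR count, and channel capacity. The paper's exact weight formula and FA settings are not given here, so the following Python sketch uses an assumed illustrative weight and a textbook firefly loop on a benchmark (sphere) function:

```python
import math, random

def channel_weight(pu_util, cr_count, capacity):
    """Illustrative channel weight (an assumption, not the paper's formula):
    prefer low primary-user activity, low contention, high capacity."""
    return capacity * (1.0 - pu_util) / (1.0 + cr_count)

def firefly_minimize(f, dim, n=15, iters=60, beta0=1.0, gamma=0.01, alpha=0.2, seed=1):
    """Textbook firefly algorithm: dimmer fireflies move toward brighter
    (lower-cost) ones; attraction decays with squared distance."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    cost = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:          # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    cost[i] = f(pop[i])
        alpha *= 0.97                          # anneal the random step
    best = min(range(n), key=lambda k: cost[k])
    return pop[best], cost[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = firefly_minimize(sphere, dim=2)
```

Fireflies with lower cost are "brighter"; the rest drift toward them with an annealed random step, which is the mechanism the ANN-FA scheme exploits to tune its parameters.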

Added Value of Chemical Exchange-Dependent Saturation Transfer MRI for the Diagnosis of Dementia

  • Jang-Hoon Oh;Bo Guem Choi;Hak Young Rhee;Jin San Lee;Kyung Mi Lee;Soonchan Park;Ah Rang Cho;Chang-Woo Ryu;Key Chung Park;Eui Jong Kim;Geon-Ho Jahng
    • Korean Journal of Radiology / v.22 no.5 / pp.770-781 / 2021
  • Objective: Chemical exchange-dependent saturation transfer (CEST) MRI is sensitive for detecting solid-like proteins and may detect changes in the levels of mobile proteins and peptides in tissues. The objective of this study was to evaluate the characteristics of chemical exchange proton pools using the CEST MRI technique in patients with dementia. Materials and Methods: Our institutional review board approved this cross-sectional prospective study and informed consent was obtained from all participants. This study included 41 subjects (19 with dementia and 22 without dementia). Complete CEST data of the brain were obtained using a three-dimensional gradient and spin-echo sequence to map CEST indices, such as amide, amine, hydroxyl, and magnetization transfer ratio asymmetry (MTRasym) values, using six-pool Lorentzian fitting. Statistical analyses of CEST indices were performed to evaluate group comparisons, their correlations with gray matter volume (GMV) and Mini-Mental State Examination (MMSE) scores, and receiver operating characteristic (ROC) curves. Results: Amine signals (0.029 for non-dementia, 0.046 for dementia, p = 0.011 at hippocampus) and MTRasym values at 3 ppm (0.748 for non-dementia, 1.138 for dementia, p = 0.022 at hippocampus), and 3.5 ppm (0.463 for non-dementia, 0.875 for dementia, p = 0.029 at hippocampus) were significantly higher in the dementia group than in the non-dementia group. Most CEST indices were not significantly correlated with GMV; however, except amide, most indices were significantly correlated with the MMSE scores. The classification power of most CEST indices was lower than that of GMV, but adding one of the CEST indices to GMV improved the classification between the subject groups. The largest improvement was seen in the MTRasym values at 2 ppm in the anterior cingulate (area under the ROC curve = 0.981), with a sensitivity of 100% and a specificity of 90.91%.
Conclusion: CEST MRI can potentially image alterations in the Alzheimer's disease brain noninvasively, without injecting isotopes, to monitor different disease states, and may provide a new imaging biomarker in the future.
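
For readers unfamiliar with the MTRasym index reported above: it is the normalized difference between the z-spectrum signals at the negative and positive frequency offsets. A minimal Python sketch with made-up signal values (not patient data):

```python
def mtr_asym(z_spectrum, s0, offset_ppm):
    """Magnetization transfer ratio asymmetry at a given offset:
    MTRasym(dw) = (S(-dw) - S(+dw)) / S0."""
    return (z_spectrum[-offset_ppm] - z_spectrum[offset_ppm]) / s0

# Toy saturated signals keyed by offset in ppm (hypothetical values)
z = {-3.5: 0.80, 3.5: 0.72, -3.0: 0.82, 3.0: 0.75}
asym_35 = mtr_asym(z, s0=1.0, offset_ppm=3.5)   # about 0.08
```

A positive MTRasym at 3.5 ppm, as here, reflects extra saturation transfer on the amide side of the spectrum.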

Bending analysis of nano-Fe2O3 reinforced concrete slabs exposed to temperature fields and supported by viscoelastic foundation

  • Zouaoui R. Harrat;Mohammed Chatbi;Baghdad Krour;Sofiane Amziane;Mohamed Bachir Bouiadjra;Marijana Hadzima-Nyarko;Dorin Radu;Ercan Isik
    • Advances in concrete construction / v.17 no.2 / pp.111-126 / 2024
  • During the clinkering stages of cement production, the chemical composition of fine raw materials such as limestone and clay, which include iron oxide (Fe2O3), silicon dioxide (SiO2) and aluminum oxide (Al2O3), significantly influences the quality of the final product. Specifically, the chemical interaction of Fe2O3 with CaO, SiO2 and Al2O3 during clinkerisation plays a key role in determining the chemical reactivity and overall quality of the final cement, shaping the properties of the concrete produced. As an extension, this study aims to investigate the physical effects of incorporating nanosized Fe2O3 particles as fillers in concrete matrices, and their impact on concrete structures, namely slabs. To accurately model the reinforced concrete (RC) slabs, a refined trigonometric shear deformation theory (RTSDT) is used. Additionally, the stochastic Eshelby's homogenization approach is employed to determine the thermoelastic properties of nano-Fe2O3 infused concrete slabs. To ensure comprehensive coverage in the study, the RC slabs undergo various mechanical loads and are exposed to temperature fields to assess their thermo-mechanical performance. Furthermore, the slabs are assumed to rest on a three-parameter viscoelastic foundation, comprising the Winkler elastic springs, Pasternak shear layer and a damping parameter. The equilibrium governing equations of the system are derived using the principle of virtual work and subsequently solved using Navier's technique. The findings indicate that while ferric oxide nanoparticles enhance the mechanical properties of concrete against mechanical loading, they have less favorable effects on its performance against thermal exposure. However, the viscoelastic foundation contributes to mitigating these effects, improving the concrete's overall performance in both scenarios. 
These results highlight the trade-offs between mechanical and thermal performance when using Fe2O3 nanoparticles in concrete and underscore the importance of optimizing nanoparticle content and loading conditions to improve the structural performance of concrete structures.
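
The solution strategy described (Navier's technique for a simply supported slab on a Winkler-Pasternak foundation) can be illustrated with classical thin-plate theory in place of the paper's RTSDT; the damping parameter drops out of the static response. All numbers below are hypothetical:

```python
import math

def navier_deflection(q, a, b, D, kw=0.0, kp=0.0, x=None, y=None, terms=25):
    """Static deflection of a simply supported Kirchhoff plate under a
    uniform load q, resting on a Winkler (kw) + Pasternak (kp) foundation.
    Classical plate theory only -- a simplification of the paper's RTSDT."""
    x = a / 2 if x is None else x
    y = b / 2 if y is None else y
    w = 0.0
    for m in range(1, 2 * terms, 2):          # odd harmonics carry a uniform load
        for n in range(1, 2 * terms, 2):
            lam = (m * math.pi / a) ** 2 + (n * math.pi / b) ** 2
            qmn = 16.0 * q / (math.pi ** 2 * m * n)
            w += qmn / (D * lam ** 2 + kw + kp * lam) * \
                 math.sin(m * math.pi * x / a) * math.sin(n * math.pi * y / b)
    return w

# Hypothetical 4 m x 4 m slab, D = 1e7 N.m, uniform load q = 1e4 N/m^2
w_free = navier_deflection(q=1e4, a=4.0, b=4.0, D=1e7)
w_found = navier_deflection(q=1e4, a=4.0, b=4.0, D=1e7, kw=5e6, kp=1e5)
```

Both foundation terms stiffen the denominator of every Fourier coefficient, which is why the viscoelastic foundation mitigates deflections in the study's results.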

Text Mining-Based Emerging Trend Analysis for the Aviation Industry (항공산업 미래유망분야 선정을 위한 텍스트 마이닝 기반의 트렌드 분석)

  • Kim, Hyun-Jung;Jo, Nam-Ok;Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems / v.21 no.1 / pp.65-82 / 2015
  • Recently, there has been a surge of interest in finding core issues and analyzing emerging trends for the future. This reflects efforts to devise national strategies and policies based on the selection of promising areas that can create economic and social added value. Existing studies, including those dedicated to the discovery of future promising fields, have mostly depended on qualitative research methods such as literature review and expert judgement. Deriving results from large amounts of information under this approach is both costly and time consuming. Efforts have been made to compensate for the weaknesses of the conventional qualitative approach to selecting key promising areas through the discovery of future core issues and emerging trend analysis in various areas of academic research. A paradigm shift is needed toward implementing qualitative research methods alongside quantitative methods such as text mining in a mutually complementary manner, so as to ensure objective and practical emerging trend analysis results based on large amounts of data. Even such studies, however, have had shortcomings related to their dependence on simple keywords for analysis, which makes it difficult to derive meaning from the data. Moreover, although quantitative trend analysis has recently been applied in various areas such as the steel industry, the information and communications technology industry, and the construction industry in architectural engineering, no study has so far discovered core issues and analyzed emerging trends in special domains like the aviation industry. This study focused on retrieving aviation-related core issues and emerging trends from research papers pertaining to aviation through text mining, one of the big data analysis techniques.
In this manner, the promising future areas for the air transport industry are selected based on objective data from aviation-related research papers. In order to compensate for the difficulties in grasping the meaning of single words in emerging trend analysis at keyword levels, this study will adopt topic analysis, which is a technique used to find out general themes latent in text document sets. The analysis will lead to the extraction of topics, which represent keyword sets, thereby discovering core issues and conducting emerging trend analysis. Based on the issues, it identified aviation-related research trends and selected the promising areas for the future. Research on core issue retrieval and emerging trend analysis for the aviation industry based on big data analysis is still in its incipient stages. So, the analysis targets for this study are restricted to data from aviation-related research papers. However, it has significance in that it prepared a quantitative analysis model for continuously monitoring the derived core issues and presenting directions regarding the areas with good prospects for the future. In the future, the scope is slated to expand to cover relevant domestic or international news articles and bidding information as well, thus increasing the reliability of analysis results. On the basis of the topic analysis results, core issues for the aviation industry will be determined. Then, emerging trend analysis for the issues will be implemented by year in order to identify the changes they undergo in time series. Through these procedures, this study aims to prepare a system for developing key promising areas for the future aviation industry as well as for ensuring rapid response. Additionally, the promising areas selected based on the aforementioned results and the analysis of pertinent policy research reports will be compared with the areas in which the actual government investments are made. 
The results from this comparative analysis are expected to make useful reference materials for future policy development and budget establishment.
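
At its simplest, the keyword-level trend analysis the study wants to move beyond can be sketched as term counts per year, ranking terms by growth; topic analysis then groups such keywords into latent themes. A toy Python sketch on an invented corpus (terms and years are illustrative only):

```python
from collections import Counter

def yearly_term_freq(docs):
    """docs: list of (year, text) pairs. Returns {year: Counter of terms}."""
    freq = {}
    for year, text in docs:
        freq.setdefault(year, Counter()).update(text.lower().split())
    return freq

def emerging_terms(freq, top=3):
    """Rank terms by frequency growth between the earliest and latest year."""
    years = sorted(freq)
    first, last = freq[years[0]], freq[years[-1]]
    growth = {t: last[t] - first[t] for t in set(first) | set(last)}
    return sorted(growth, key=growth.get, reverse=True)[:top]

docs = [  # hypothetical snippets of abstract text
    (2010, "air transport safety regulation"),
    (2010, "airport capacity regulation"),
    (2014, "uav drone traffic management"),
    (2014, "uav drone safety"),
]
rising = emerging_terms(yearly_term_freq(docs))
```

The shortcoming the abstract notes is visible here: single keywords like "uav" and "drone" rise independently even though they belong to one theme, which is exactly what topic analysis is meant to capture.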

Efficient Linear Path Query Processing using Information Retrieval Techniques for Large-Scale Heterogeneous XML Documents (정보 검색 기술을 이용한 대규모 이질적인 XML 문서에 대한 효율적인 선형 경로 질의 처리)

  • 박영호;한욱신;황규영
    • Journal of KIISE:Databases / v.31 no.5 / pp.540-552 / 2004
  • We propose XIR-Linear, a novel method for processing partial match queries on large-scale heterogeneous XML documents using information retrieval (IR) techniques. XPath queries are written as path expressions on the tree structure representing an XML document, and an XPath query in its major form is a partial match query. The objective of XIR-Linear is to support this type of query efficiently for large-scale documents with heterogeneous schemas. XIR-Linear is based on schema-level methods using relational tables and drastically improves their efficiency and scalability with an inverted index technique. The method indexes the labels in label paths as keywords in texts, and finds the label paths matching a query far more efficiently than the string matching used in conventional methods. We demonstrate the efficiency and scalability of XIR-Linear by comparing it with XRel and XParent using XML documents crawled from the Internet. The results show that XIR-Linear is more efficient than both XRel and XParent by several orders of magnitude for linear path expressions as the number of XML documents increases.
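
The core idea as described, treating the labels of a label path like keywords in a document and intersecting their posting lists, can be sketched as follows. This is an illustrative Python reconstruction, not the authors' implementation:

```python
from collections import defaultdict

def build_label_path_index(label_paths):
    """Inverted index from each label to the ids of label paths containing it,
    analogous to indexing keywords in text."""
    index = defaultdict(set)
    for pid, path in enumerate(label_paths):
        for label in path.split("/"):
            if label:
                index[label].add(pid)
    return index

def match_partial_path(index, label_paths, query):
    """Resolve a partial-match query like '//book//title': candidate paths must
    contain every query label, then are verified by in-order traversal."""
    labels = [l for l in query.split("/") if l]
    candidates = set.intersection(*(index[l] for l in labels)) if labels else set()
    hits = []
    for pid in candidates:
        steps = iter(l for l in label_paths[pid].split("/") if l)
        if all(l in steps for l in labels):   # labels appear in query order
            hits.append(label_paths[pid])
    return sorted(hits)

paths = ["/bib/book/title", "/bib/book/author/name", "/bib/article/title"]
idx = build_label_path_index(paths)
```

Intersecting per-label posting sets prunes candidates before the cheap in-order check, which is where the speedup over conventional string matching comes from.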

Analysis of Area Type Classification of Seoul Using Geodemographics Methods (Geodemographics의 연구기법을 활용한 서울시 지역유형 분석 연구)

  • Woo, Hyun-Jee;Kim, Young-Hoon
    • Journal of the Korean association of regional geographers / v.15 no.4 / pp.510-523 / 2009
  • Geodemographics (GD) can be defined as an analytical approach to socio-economic and behavioral data about people for investigating geographical patterns. GD is based on the assumption that the demographic and behavioral characteristics of people who live in the same neighborhood are similar, so that neighborhoods can be categorized with geographical classifications. Thus, to examine the applicability of GD-style geographical classification, this paper applies the concepts of geodemographics to areas of the city of Seoul using Korean census data sets that contain the key characteristics of the demographic profiles of each area. The paper then attempts to explain each area classification profile by using clustering techniques with Ward's and k-means statistical methods. For this purpose, it employs the 2005 Census dataset released by the Korea National Statistics Office; the neighborhood unit is the Dong level, the smallest administrative boundary unit in Korea. After variables were selected and standardized, the areas were categorized by the clustering techniques into 13 distinctive cluster profiles. These cluster profiles were used to create a short description and expand on the cluster names. Finally, the results of the classification propose a reasonable judgement of target area types, which benefits people who make spatial decisions for their spatial problem-solving.
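
One of the two clustering techniques named above, k-means, can be sketched in a few lines of Python. The two-variable "profiles" below are invented stand-ins for standardized census variables, and the real study derives 13 clusters rather than 2:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means (Lloyd's algorithm): assign each point to its nearest
    center, then move each center to the mean of its group."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical standardized Dong profiles: (share of single households, share aged 65+)
dongs = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
centers, groups = kmeans(dongs, k=2)
```

In the study's workflow the resulting cluster profiles, not individual points, are what receive the short descriptions and cluster names.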

COMPARISON OF TRAMADOL/ACETAMINOPHEN AND CODEINE/ACETAMINOPHEN/IBUPROFEN IN ONSET OF ANALGESIA AND ANALGESIC EFFICACY FOR POSTOPERATIVE ACUTE PAIN (수술후 급성 동통에 대한 Tramadol/Acetaminophen과 Codeine/Acetaminophen/Ibuprofen의 효과 발현시점과 진통효과의 비교)

  • Jung, Young-Soo;Kim, Dong-Kee;Kim, Moon-Key;Kim, Hyung-Jun;Cha, In-Ho;Han, Moo-Young;Lee, Eui-Wung
    • Journal of the Korean Association of Oral and Maxillofacial Surgeons / v.30 no.2 / pp.143-149 / 2004
  • Background: Some clinical trials have reported that a new analgesic combination of tramadol and acetaminophen provides good efficacy in various pain models. For wider clinical use of this agent, comparisons of its onset of analgesia and analgesic efficacy in acute pain with other drugs known as strong analgesics were needed. Purpose: The goal of this study was to compare the times to onset of analgesia and the other analgesic efficacy measures of 75 mg tramadol/650 mg acetaminophen and 20 mg codeine/500 mg acetaminophen/400 mg ibuprofen in the treatment of acute pain after oral surgery. Patients and Methods: Using a randomized, single-dose, parallel-group, single-center, active-controlled design, this clinical study compared the times to onset of analgesia, using a two-stopwatch technique, and the other analgesic efficacy measures of single-dose tramadol/acetaminophen and codeine/acetaminophen/ibuprofen. These were assessed in 128 healthy subjects with pain from oral surgical procedures involving extraction of one or more impacted third molars requiring bone removal. From the time of pain development, the times to onset of perceptible and meaningful pain relief, pain intensity, pain relief, an overall assessment, and adverse events of the study medications were recorded for 6 hours. Results: The demographic distribution and baseline pain data in the two groups were statistically similar. The median times to onset of perceptible pain relief were 21.0 and 24.4 minutes in the tramadol/acetaminophen and codeine/acetaminophen/ibuprofen groups, respectively, and those to onset of meaningful pain relief were 56.4 and 57.3 minutes, which were statistically similar. The other efficacy variables, such as mean total pain relief (TOTPAR) and the sum of pain intensity differences (SPID), were also similar in the early period after pain development and drug dosing.
Tramadol/acetaminophen was well tolerated, and its safety was comparable to that of codeine/acetaminophen/ibuprofen. Conclusions: In this acute dental pain model, the onset of analgesia and analgesic efficacy of tramadol/acetaminophen were comparable to those of codeine/acetaminophen/ibuprofen. These results show that tramadol/acetaminophen can be recommended for fast and effective treatment in the management of postoperative acute pain.

A Review of Time Series Analysis for Environmental and Ecological Data (환경생태 자료 분석을 위한 시계열 분석 방법 연구)

  • Mo, Hyoung-ho;Cho, Kijong;Shin, Key-Il
    • Korean Journal of Environmental Biology / v.34 no.4 / pp.365-373 / 2016
  • Much of the data used in the analysis of environmental and ecological systems is obtained over time. If the number of time points is small, the data do not provide enough information, so repeated measurements or data from multiple survey points should be used to perform a comprehensive analysis. The methods used in that case are longitudinal data analysis and mixed model analysis. However, if the amount of information is sufficient because the number of time points is large, repeated measurements are not needed and the data are analyzed using time series techniques. In particular, with a large number of time points, when we want to estimate how the variables affect one another, or what trends to expect in the future, we should analyze the data using time series analysis techniques. In this study, we introduce univariate time series analysis, the intervention time series model, the transfer function model, and the multivariate time series model, and review research papers published in Korea. We also introduce the error correction model, which can be used to analyze environmental and ecological data.
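
The univariate case reviewed above can be made concrete with an AR(1) model fitted by least squares. The toy series below is generated without noise purely so the sketch recovers its own parameters; it is not from any ecological dataset:

```python
def fit_ar1(series):
    """Least-squares fit of the AR(1) model x_t = c + phi * x_{t-1} + e_t."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    c = my - phi * mx
    return c, phi

def forecast(series, c, phi, steps=3):
    """Iterate the fitted recursion forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Noise-free toy series from x_t = 1 + 0.5 * x_{t-1}; the fit recovers c and phi
xs = [0.0]
for _ in range(20):
    xs.append(1 + 0.5 * xs[-1])
c, phi = fit_ar1(xs)
```

With |phi| < 1 the forecasts settle at the stationary mean c / (1 - phi), here 2, which is the kind of trend statement the reviewed methods aim to support.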

Concurrent Weekly Docetaxel Chemotherapy in Combination with Radiotherapy for Stage III and IVA-B Nasopharyngeal Carcinoma

  • Wei, Wei-Hong;Cai, Xiu-Yu;Xu, Tao;Zhang, Guo-Yi;Wu, Yong-Feng;Feng, Wei-Neng;Lin, Li;Deng, Yan-Ming;Lu, Qiu-Xia;Huang, Zhe-Li
    • Asian Pacific Journal of Cancer Prevention / v.13 no.3 / pp.785-789 / 2012
  • Background and Purpose: Cisplatin is the most common chemotherapeutic agent for loco-regionally advanced nasopharyngeal carcinoma (NPC); however, toxicity is a limiting factor for some patients. We retrospectively compared the efficacy and toxicity of weekly docetaxel-based and cisplatin-based concurrent chemoradiotherapy in loco-regionally advanced NPC. Methods and Materials: Eighty-four patients with Stage III and IVA-B NPCs, treated between 2007 and 2008, were retrospectively analyzed. Thirty received weekly docetaxel-based concurrent chemotherapy, and 43 were given weekly cisplatin-based concurrent chemotherapy. Radiotherapy was administered using a conventional technique (seven weeks, 2.0 Gy per fraction, total dose 70-74 Gy) with 6-8 Gy boosts for some patients with locally advanced disease. Results: Median follow-up time was 42.3 months (range, 8.6-50.8 months). There were no significant differences in the 3-year loco-regional failure-free survival (85.6% vs. 92.3%; p=0.264), distant failure-free survival (87.0% vs. 92.5%; p=0.171), progression-free survival (85.7% vs. 88.4%; p=0.411) or overall survival (86.5% vs. 92.5%, p=0.298) of patients treated concurrently with docetaxel or cisplatin. Severe toxicity was not common in either group. Conclusions: Weekly docetaxel-based concurrent chemoradiotherapy is potentially effective and has a tolerable toxicity; however, further investigations are required to determine if docetaxel is superior to cisplatin for advanced stage NPC.