• Title/Summary/Keyword: Civil-engineering dataset


Analysis of river flow using the ADCP post-processing software (adcptools) (ADCP 후처리 소프트웨어(adcptools)를 이용한 하천 흐름 분석)

  • Lee, Chanjoo; Kim, Jong Pil; Park, Edward; Kastner, Karl
    • Journal of The Geomorphological Association of Korea / v.23 no.1 / pp.103-115 / 2016
  • At present, the acoustic Doppler current profiler (ADCP) is one of the most suitable tools for measuring three-dimensional flow characteristics in rivers. Together with post-processing software tools, the resulting data can be used for flow visualization and velocity mapping. Among these tools, 'adcptools' is the most recent and provides a more realistic velocity distribution in the cross-section because it uses the velocities measured along each beam direction. In this study, flow analyses were carried out with 'adcptools' on Amazon River and Han River datasets. Discharge was recalculated, and the accuracy of the discharge and velocity estimates was evaluated. The streamwise velocity distribution and secondary flow pattern in cross-sections were visualized, and the geo-referenced velocity distribution was mapped. A brief summary and future prospects of 'adcptools' for studies on fluvial geomorphology are given.
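The discharge recalculation described in this abstract ultimately rests on a velocity-area integration over the measured cross-section. The Python sketch below is not from adcptools; it is a minimal illustration of the standard mid-section discharge computation on hypothetical verticals.

```python
import numpy as np

# Hypothetical cross-section: distances from the bank (m), depths (m),
# and depth-averaged streamwise velocities (m/s) at each vertical.
distance = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
depth    = np.array([0.0, 1.2, 2.1, 2.4, 1.5, 0.0])
velocity = np.array([0.0, 0.45, 0.80, 0.92, 0.55, 0.0])

def midsection_discharge(distance, depth, velocity):
    """Velocity-area (mid-section) discharge: Q = sum(v_i * d_i * w_i),
    where w_i is the width assigned to each vertical."""
    widths = np.empty_like(distance)
    # Each interior vertical represents half the spacing to each neighbour.
    widths[1:-1] = (distance[2:] - distance[:-2]) / 2.0
    widths[0] = (distance[1] - distance[0]) / 2.0
    widths[-1] = (distance[-1] - distance[-2]) / 2.0
    return float(np.sum(velocity * depth * widths))

print(f"Q = {midsection_discharge(distance, depth, velocity):.2f} m^3/s")
```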

Three-dimensional geostatistical modeling of subsurface stratification and SPT-N Value at dam site in South Korea

  • Mingi Kim; Choong-Ki Chung; Joung-Woo Han; Han-Saem Kim
    • Geomechanics and Engineering / v.34 no.1 / pp.29-41 / 2023
  • The 3D geospatial modeling of geotechnical information can aid in understanding the geotechnical characteristic values of the continuous subsurface at construction sites. In this study, a geostatistical optimization model for the three-dimensional (3D) mapping of subsurface stratification and the SPT-N value based on a trial-and-error rule was developed and applied to a dam emergency spillway site in South Korea. Geospatial database development for a geotechnical investigation, reconstitution of the target grid volume, and detection of outliers in the borehole dataset were implemented prior to the 3D modeling. For the site-specific subsurface stratification of the engineering geo-layer, we developed an integration method for the borehole and geophysical survey datasets based on the geostatistical optimization procedure of ordinary kriging and sequential Gaussian simulation (SGS) by comparing their cross-validation-based prediction residuals. We also developed an optimization technique based on SGS for estimating the 3D geometry of the SPT-N value. This method involves quantitatively testing the reliability of SGS and selecting the realizations with a high estimation accuracy. Boring tests were performed for validation, and the proposed method yielded more accurate prediction results and reproduced the spatial distribution of geotechnical information more effectively than the conventional geostatistical approach.
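The abstract's key selection step, ranking interpolators and simulated realizations by cross-validation prediction residuals, can be illustrated with a much simpler stand-in. The sketch below computes leave-one-out residuals for an inverse-distance-weighted interpolator on hypothetical borehole data; it is not the paper's kriging/SGS workflow, only the residual-based ranking idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical borehole data: (x, y, depth) coordinates and SPT-N values.
coords = rng.uniform(0, 100, size=(30, 3))
spt_n = 10 + 0.3 * coords[:, 2] + rng.normal(0, 2, size=30)

def idw_predict(train_xyz, train_val, query_xyz, power=2.0):
    """Inverse-distance-weighted interpolation (a simple stand-in for kriging)."""
    d = np.linalg.norm(train_xyz - query_xyz, axis=1)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return float(np.sum(w * train_val) / np.sum(w))

# Leave-one-out cross-validation residuals, as used to rank interpolators
# (and, analogously, to rank simulated realizations by estimation accuracy).
residuals = []
for i in range(len(spt_n)):
    mask = np.arange(len(spt_n)) != i
    pred = idw_predict(coords[mask], spt_n[mask], coords[i])
    residuals.append(spt_n[i] - pred)

rmse = np.sqrt(np.mean(np.square(residuals)))
print(f"Leave-one-out RMSE: {rmse:.2f}")
```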

Extrapolation of wind pressure for low-rise buildings at different scales using few-shot learning

  • Yanmo Weng; Stephanie G. Paal
    • Wind and Structures / v.36 no.6 / pp.367-377 / 2023
  • This study proposes a few-shot learning model for extrapolating the wind pressure of scaled experiments to full-scale measurements. The proposed ML model can use scaled experimental data and a few full-scale tests to accurately predict the remaining full-scale data points (for new specimens). The model focuses on extrapolating predictions across scales, whereas existing approaches in the wind engineering domain cannot accurately extrapolate from scaled data to full-scale data; the scaling issue observed in wind tunnel tests can thus be partially resolved by the proposed approach. The proposed model obtained a low mean-squared error and a high coefficient of determination for the mean and standard-deviation wind pressure coefficients of the full-scale dataset. A parametric study was carried out to investigate the influence of the number of selected shots. To the authors' knowledge, this is the first time an ML model has been used in the wind engineering field to deal with extrapolation in wind performance prediction. With the advantages of the few-shot learning model, physical wind tunnel experiments can be reduced to a great extent. The few-shot learning model yields a robust, efficient, and accurate alternative for extrapolating the prediction performance of structures from various model scales to full scale.
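One common way to realize this kind of few-shot extrapolation is to pre-train a regressor on the plentiful scaled data and then fine-tune it on the handful of full-scale shots. The sketch below uses scikit-learn's MLPRegressor with warm_start for that two-stage fit on synthetic data; it is a generic stand-in, not the authors' model, and all arrays and coefficients are invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Hypothetical data: tap position (x, y) -> mean pressure coefficient Cp.
# Scaled wind-tunnel data are plentiful; full-scale measurements are few ("shots").
X_scaled = rng.uniform(0, 1, size=(500, 2))
y_scaled = -0.5 + 0.8 * X_scaled[:, 0] - 0.3 * X_scaled[:, 1] + rng.normal(0, 0.05, 500)

X_full = rng.uniform(0, 1, size=(40, 2))
y_full = -0.6 + 0.9 * X_full[:, 0] - 0.25 * X_full[:, 1] + rng.normal(0, 0.05, 40)

n_shots = 5                                   # only a few full-scale points available
shots = rng.choice(len(X_full), n_shots, replace=False)
rest = np.setdiff1d(np.arange(len(X_full)), shots)

# Pre-train on the scaled dataset, then continue training on the few shots.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                     warm_start=True, random_state=0)
model.fit(X_scaled, y_scaled)                 # source domain (model scale)
model.fit(X_full[shots], y_full[shots])       # few-shot adaptation (full scale)

pred = model.predict(X_full[rest])
print("MSE on unseen full-scale points:", float(np.mean((pred - y_full[rest]) ** 2)))
```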

Probabilistic bearing capacity assessment for cross-bracings with semi-rigid connections in transmission towers

  • Zhengqi Tang; Tao Wang; Zhengliang Li
    • Structural Engineering and Mechanics / v.89 no.3 / pp.309-321 / 2024
  • In this paper, the effect of semi-rigid connections on the stability bearing capacity of cross-bracings in steel tubular transmission towers is investigated. A prediction method based on a hybrid model combining particle swarm optimization (PSO) and a back-propagation neural network (BPNN) is proposed to accurately predict the stability bearing capacity of cross-bracings with semi-rigid connections and to efficiently conduct its probabilistic assessment. First, a finite element (FE) model of cross-bracings with semi-rigid connections is established on the basis of the mechanical model. Then, a dataset of 7425 samples generated by the FE model is used to train and test the PSO-BPNN model, and the accuracy of the proposed method is evaluated. Finally, the probabilistic assessment of the stability bearing capacity of cross-bracings with semi-rigid connections is conducted using the proposed method and Monte Carlo simulation, in which geometric and material properties, including the outer diameter and thickness of the cross-sections and the yield strength of the steel, are treated as random variables. The results indicate that the proposed PSO-BPNN-based method predicts the stability bearing capacity of cross-bracings with semi-rigid connections with high accuracy. Moreover, semi-rigid connections enhance the stability bearing capacity of cross-bracings, and the reliability of cross-bracings increases significantly once semi-rigid connections are taken into account.
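The PSO-BPNN hybrid amounts to using particle swarm optimization to search over back-propagation network settings that are scored by validation error. The sketch below runs a small PSO over two MLP hyperparameters on synthetic data standing in for the FE samples; the fitness function, bounds, swarm size, and data are all assumptions made for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Hypothetical stand-in for the FE samples: inputs are outer diameter (mm),
# wall thickness (mm) and yield strength (MPa); output is bearing capacity.
X = rng.uniform([60, 3, 235], [300, 12, 460], size=(1000, 3))
y = 0.004 * X[:, 0] * X[:, 1] * X[:, 2] ** 0.5 + rng.normal(0, 5, 1000)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)

def fitness(p):
    """Validation MSE of a BP network whose hyperparameters come from a particle."""
    hidden, log_lr = int(round(p[0])), p[1]
    net = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=10 ** log_lr,
                       max_iter=300, random_state=0)
    net.fit(X_tr, y_tr)
    return float(np.mean((net.predict(X_va) - y_va) ** 2))

# Minimal particle swarm over (hidden units, log10 learning rate).
lo, hi = np.array([4, -4.0]), np.array([64, -1.0])
pos = rng.uniform(lo, hi, size=(6, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(4):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("Best (hidden units, log10 lr):", gbest, "validation MSE:", pbest_f.min())
```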

ACCURACY IMPROVEMENT OF LOBLOLLY PINE INVENTORY DATA USING MULTI SENSOR DATASETS

  • Kim, Jin-Woo; Kim, Jong-Hong; Sohn, Hong-Gyoo; Heo, Joon
    • Proceedings of the KSRS Conference / v.2 / pp.590-593 / 2006
  • Timber inventory management includes measuring and updating forest attributes, which are crucial information for private companies and public organizations in property assessment and environmental monitoring. Field measurement is accurate but time-consuming and inefficient; for this reason, remote sensing has become an economical alternative to field measurement. Among the available sensors, LiDAR and radar interferometry are known to be efficient for forest monitoring because they are less influenced by weather and light conditions and provide reasonably accurate vertical and horizontal measurements over a large area in a short period. For example, the Shuttle Radar Topography Mission (SRTM) and the National Elevation Dataset (NED) in the U.S. can provide tree height information and a DSM, while LiDAR DSM (first return) and DEM (last return) can also yield tree height estimates. For a loblolly pine plantation site in Louisiana in the U.S., the accuracy of tree heights estimated from SRTM C-band data was assessed against the LiDAR approaches; SRTM X-band and NED results were also compared. Plantation year in the inventory GIS, which is directly related to forest age, is highly correlated with the difference between the SRTM C-band surface and the NED. As a byproduct, several stands with age mismatches could be recognized using an outlier detection algorithm, and optical satellite imagery (ETM+) was used to verify the mismatches. The findings of this study were (1) confirmation of the usefulness of the SRTM DSM for forest monitoring and (2) that multiple sensors (radar, LiDAR, ETM+, and MODIS) can be used together to improve the accuracy of forest inventory GIS.
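The core height estimate in this abstract is simply the difference between a radar surface model and a bare-earth elevation model, followed by outlier screening. The sketch below reproduces that arithmetic on hypothetical co-registered rasters; the grid values and the z-score threshold are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical co-registered rasters over a loblolly pine stand (values in metres).
ned_dem = 30 + rng.normal(0, 0.5, size=(50, 50))            # bare-earth elevation
true_height = rng.uniform(8, 22, size=(50, 50))              # canopy height
srtm_dsm = ned_dem + 0.8 * true_height + rng.normal(0, 1.5, size=(50, 50))

# Tree-height proxy: C-band surface model minus bare-earth model.
height_est = srtm_dsm - ned_dem

# Stand-level summary and a simple z-score outlier flag, analogous to the
# age-mismatch screening described in the abstract.
stand_mean = height_est.mean()
z = (height_est - stand_mean) / height_est.std()
outlier_fraction = float(np.mean(np.abs(z) > 3))
print(f"Estimated mean canopy height: {stand_mean:.1f} m, outliers: {outlier_fraction:.1%}")
```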


Predictive modeling of the compressive strength of bacteria-incorporated geopolymer concrete using a gene expression programming approach

  • Mansouri, Iman; Ostovari, Mobin; Awoyera, Paul O.; Hu, Jong Wan
    • Computers and Concrete / v.27 no.4 / pp.319-332 / 2021
  • The performance of gene expression programming (GEP) in predicting the compressive strength of bacteria-incorporated geopolymer concrete (GPC) was examined in this study. Ground-granulated blast-furnace slag (GGBS), new bacterial strains, fly ash (FA), silica fume (SF), metakaolin (MK), and manufactured sand were used as ingredients in the concrete mixture. For the geopolymer preparation, an 8 M sodium hydroxide (NaOH) solution was used, and the ambient curing temperature (28℃) was maintained for all mixtures. The ratio of sodium silicate (Na2SiO3) to NaOH was 2.33, and the ratio of alkaline liquid to binder was 0.35. Based on experimental data collected from the literature, an evolutionary algorithm (GEP) was used to develop new predictive models for estimating the compressive strength of GPC containing bacteria. Data were split into training and testing sets to obtain a closed-form solution using GEP. The independent variables of the model were the constituent materials of GPC, such as FA, MK, SF, and Bacillus bacteria. A total of six GEP formulations were developed for predicting the compressive strength of bacteria-incorporated GPC at 1, 3, 7, 28, 56, and 90 days of curing, with 80% of the data used for training and 20% for testing. R2 values between 0.9747 and 0.9950 (across the training and test datasets) were obtained for the concrete samples, which showed that GEP can predict the compressive strength of GPC containing bacteria with minimal error. Moreover, the GEP models were in good agreement with the experimental datasets and were robust and reliable. The developed models could serve as a tool for concrete constructors using geopolymers within the framework of this research.
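Gene expression programming evolves closed-form expressions from data; a closely related and readily available stand-in is tree-based genetic programming as implemented in the gplearn package. The sketch below fits a SymbolicRegressor to hypothetical mix-design data with an 80/20 split, mirroring the abstract's setup but not reproducing its actual models or dataset.

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor  # GP-based symbolic regression

rng = np.random.default_rng(4)

# Hypothetical mix proportions: [FA, MK, SF, bacteria dosage] -> 28-day strength (MPa).
X = rng.uniform([300, 0, 0, 0], [450, 100, 50, 2], size=(200, 4))
y = 0.08 * X[:, 0] + 0.12 * X[:, 1] + 0.15 * X[:, 2] + 3.0 * X[:, 3] + rng.normal(0, 1.5, 200)

X_train, y_train = X[:160], y[:160]     # 80% for training
X_test, y_test = X[160:], y[160:]       # 20% for testing

model = SymbolicRegressor(population_size=500, generations=20,
                          function_set=('add', 'sub', 'mul', 'div'),
                          parsimony_coefficient=0.001, random_state=0)
model.fit(X_train, y_train)

print("Evolved expression:", model._program)
print("Test R^2:", model.score(X_test, y_test))
```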

A hierarchical semantic segmentation framework for computer vision-based bridge damage detection

  • Jingxiao Liu; Yujie Wei; Bingqing Chen; Hae Young Noh
    • Smart Structures and Systems / v.31 no.4 / pp.325-334 / 2023
  • Computer vision-based damage detection enables non-contact, efficient, and low-cost bridge health monitoring, reducing the need for labor-intensive manual inspection or for large numbers of on-site sensing instruments. By leveraging recent semantic segmentation approaches, we can detect regions of critical structural components and identify damage at the pixel level in images. However, existing methods perform poorly when detecting small and thin damages (e.g., cracks), and the problem is exacerbated by imbalanced samples. To this end, we incorporate domain knowledge and introduce a hierarchical semantic segmentation framework that imposes a hierarchical semantic relationship between component categories and damage types. For instance, certain types of concrete cracks occur only on bridge columns, so the non-column region can be masked out when detecting such damage. In this way, the damage detection model focuses on extracting features from relevant structural components and avoids features from irrelevant regions. We also use multi-scale augmentation to preserve the contextual information of each image without losing the ability to handle small and/or thin damage. In addition, our framework employs importance sampling, in which images with rare components are sampled more often, to address sample imbalance. We evaluated our framework on a public synthetic dataset consisting of 2,000 railway bridges. Our framework achieves a mean intersection over union (IoU) of 0.836 for structural component segmentation and 0.483 for damage segmentation, improvements of 5% and 18%, respectively, over the best-performing baseline model.
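The hierarchy rule in this framework, that certain damage types can only occur on certain components, can be expressed as a simple mask applied to the damage predictions. The sketch below applies such a column mask to hypothetical per-pixel outputs; the class layout, probabilities, and threshold are assumptions for illustration, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(5)
H, W = 64, 64

# Hypothetical per-pixel predictions from two segmentation heads.
component_probs = rng.dirichlet(np.ones(3), size=(H, W))  # [background, column, deck]
crack_probs = rng.uniform(0, 1, size=(H, W))              # probability of "column crack"

COLUMN = 1
component_label = component_probs.argmax(axis=-1)

# Hierarchy rule from the abstract: this damage type only occurs on columns,
# so non-column pixels are masked out before thresholding the damage map.
column_mask = component_label == COLUMN
crack_mask = (crack_probs > 0.5) & column_mask

print("Column pixels:", int(column_mask.sum()),
      "predicted crack pixels:", int(crack_mask.sum()))
```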

Computing machinery techniques for performance prediction of TBM using rock geomechanical data in sedimentary and volcanic formations

  • Hanan Samadi; Arsalan Mahmoodzadeh; Shtwai Alsubai; Abdullah Alqahtani; Abed Alanazi; Ahmed Babeker Elhag
    • Geomechanics and Engineering / v.37 no.3 / pp.223-241 / 2024
  • Evaluating the performance of tunnel boring machines (TBMs) is a pivotal task in hard-rock mechanized tunneling, essential for achieving both a dependable construction timeline and utilization rate. In this investigation, three artificial neural networks, namely the gated recurrent unit (GRU), back-propagation neural network (BPNN), and simple recurrent neural network (SRNN), were developed to predict the TBM rate of penetration (ROP). The study drew on a dataset of 1125 data points collected during the construction of the Alborze Service Tunnel. Initially, five geomechanical parameters were examined for their influence on TBM-ROP; subsequent statistical analyses narrowed the effective parameters down to three: uniaxial compressive strength (UCS), peak slope index (PSI), and Brazilian tensile strength (BTS). Among the methodologies employed, GRU emerged as the most robust model, showing the best predictive performance for TBM-ROP on the testing subset (R2 = 0.87, NRMSE = 6.76E-04, MAD = 2.85E-05). The proposed models present viable solutions for analogous ground and TBM tunneling scenarios, particularly on routes predominantly composed of volcanic and sedimentary rock formations. Using the forecasted parameters holds promise for enhancing both machine efficiency and construction safety in TBM tunneling.
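A GRU-based regressor mapping the three retained parameters (UCS, PSI, BTS) to rate of penetration can be sketched in a few lines of Keras. The example below trains on synthetic records reshaped as length-one sequences purely for illustration; the data, architecture sizes, and training settings are assumptions and do not reproduce the study's models.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(6)

# Hypothetical records: [UCS (MPa), PSI, BTS (MPa)] -> rate of penetration (m/h).
X = rng.uniform([50, 20, 4], [200, 80, 15], size=(1125, 3)).astype("float32")
y = (8.0 - 0.02 * X[:, 0] + 0.03 * X[:, 1] - 0.1 * X[:, 2]
     + rng.normal(0, 0.2, 1125)).astype("float32")

# GRU layers expect sequences; each record is treated as a length-one sequence
# purely for illustration.
X_seq = X.reshape(-1, 1, 3)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1, 3)),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_seq[:900], y[:900], epochs=10, batch_size=32, verbose=0)

mse = model.evaluate(X_seq[900:], y[900:], verbose=0)
print("Test MSE:", mse)
```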

CEOP Annual Enhanced Observing Period Starts

  • Koike, Toshio
    • Proceedings of the KSRS Conference / 2002.10a / pp.343-346 / 2002
  • Toward more accurate determination of the water cycle in association with climate variability and change, as well as baseline data on the impacts of this variability on water resources, the Coordinated Enhanced Observing Period (CEOP) was launched on July 1, 2001. The preliminary data period, EOP-1, was implemented from July to September 2001. The first annual enhanced observing period, EOP-3, is going to start on October 1, 2002. CEOP is seeking to build a database of common measurements from both in situ and satellite remote sensing, model output, and four-dimensional data analyses (4DDA; including global and regional reanalyses) for a specified period. In this context, a number of carefully selected reference stations are linked closely with the existing network of observing sites involved in the GEWEX Continental Scale Experiments, which are distributed across the world. The initial step of CEOP is to develop a pilot, globally consistent hydro-climatological dataset that can be used to help validate satellite hydrology products and to evaluate, develop, and eventually predict water and energy cycle processes in global and regional models. Based on this dataset, the second step will address studies on the inter-comparison and inter-connectivity of the monsoon systems and regional water and energy budgets, and a path to downscaling from the global climate to local water resources.


Automated Prioritization of Construction Project Requirements using Machine Learning and Fuzzy Logic System

  • Hassan, Fahad ul; Le, Tuyen; Le, Chau; Shrestha, K. Joseph
    • International conference on construction engineering and project management / 2022.06a / pp.304-311 / 2022
  • Construction inspection is a crucial stage that ensures that all contractual requirements of a construction project are verified. The construction inspection capabilities of state highway agencies have been greatly affected by budget reductions. As a result, efficient inspection practices, such as risk-based inspection, are required to optimize the use of limited resources without compromising inspection quality. Automated prioritization of textual requirements according to their criticality would be extremely helpful, since contractual requirements are typically presented in unstructured natural language in voluminous text documents. The current study introduces a novel model for predicting the risk level of requirements using machine learning (ML) algorithms. The ML algorithms tested in this study included naïve Bayes, support vector machines, logistic regression, and random forest. The training data consist of requirement texts labeled with risk levels (very low, low, medium, high, very high) by a fuzzy logic system. The fuzzy model treats three risk factors (severity, probability, detectability) as fuzzy input variables and applies fuzzy inference rules to determine the label of each requirement. The performance of the model was examined on a labeled dataset created with the fuzzy inference rules and three different membership functions. The developed requirement risk prediction model yielded a precision, recall, and F-score of 78.18%, 77.75%, and 75.82%, respectively. The proposed model is expected to provide construction inspectors with a means for automatically prioritizing voluminous requirements by their importance, thus helping to maximize the effectiveness of inspection activities under resource constraints.
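The pipeline in this abstract has two stages: a fuzzy inference system that labels requirements with risk levels from severity, probability, and detectability, and ML classifiers trained on the labeled text. The sketch below substitutes a crisp scoring rule for the fuzzy system and a TF-IDF plus logistic-regression classifier for the compared ML models; the requirement texts, ratings, and thresholds are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical requirement texts with expert ratings (1-5) for severity,
# probability and detectability; a crisp rule stands in for the fuzzy system.
requirements = [
    ("Contractor shall submit concrete mix design for approval.", 2, 3, 2),
    ("All welds on primary girders shall be ultrasonically tested.", 5, 3, 4),
    ("Temporary signage shall be maintained during night work.", 2, 4, 2),
    ("Bridge bearings shall be installed within 2 mm of design elevation.", 5, 2, 5),
    ("Erosion control devices shall be inspected weekly.", 3, 4, 2),
    ("Post-tensioning ducts shall be grouted within 7 days of stressing.", 5, 3, 4),
]

def risk_label(severity, probability, detectability):
    """Crisp stand-in for the fuzzy inference rules: average the three factors."""
    score = (severity + probability + detectability) / 3.0
    return "high" if score >= 3.5 else "medium" if score >= 2.5 else "low"

texts = [r[0] for r in requirements]
labels = [risk_label(*r[1:]) for r in requirements]

# Text classifier standing in for the ML models compared in the paper.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["All anchor bolts shall be torque tested before deck placement."]))
```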
