• Title/Summary/Keyword: traditional experiments

Search Results: 1,064

A Case Study of Contemporary Textile Art in Loewe Craft Prize

  • Hyojeong Park;Jinyoung Kim
    • Journal of Fashion Business / v.27 no.6 / pp.99-109 / 2023
  • The Loewe Craft Prize is currently the most influential craft contest. Contemporary craftworks of outstanding aesthetic value are selected as finalists, yet there are not enough studies that treat them as research subjects. This study investigated the contemporary textile pieces featured in the Loewe Foundation Craft Prize, organized by the fashion brand Loewe, and elucidated their expressive characteristics. The methodology was qualitative, deriving the expressive characteristics of the works within the scope of the study through case analysis combined with theoretical review. The research subjects were 22 textile pieces among the finalists of the prize's first six editions, beginning in 2016. The analysis of the textile pieces showed, first, an emphasis on traditional expression; second, the development of new expressive techniques for the material; and third, a pictorial character revealed in flat pieces. From these results, the expressive characteristics of contemporary textile art in the Loewe Foundation Craft Prize were derived as, first, confirmation of the unique capabilities of craft through the inheritance of tradition; second, rediscovery of textile properties through material experimentation; and third, the possibility of expanding the field of textiles through its pictorial character.

Design of Smart City Considering Carbon Emissions under The Background of Industry 5.0

  • Fengjiao Zhou;Rui Ma;Mohamad Shaharudin bin Samsurijan;Xiaoqin Xie
    • KSII Transactions on Internet and Information Systems (TIIS) / v.18 no.4 / pp.903-921 / 2024
  • Industry 5.0 puts forward higher requirements for smart cities, including being low-carbon, sustainable, and people-oriented, which poses challenges for smart-city design. In response, this study introduces the cyber-physical-social system (CPSS) and parallel system theory into smart-city design and constructs a smart-city framework based on parallel system theory. On this basis, to enhance the security of smart cities, a sustainable patrol subsystem is established. The intelligent patrol system uses a drone platform, and trajectory planning for the drone is a key problem to be solved. Therefore, a mathematical model was established that considers multiple objectives, including minimizing carbon emissions, minimizing noise impact, and maximizing coverage area, while taking into account the flight performance constraints of drones. In addition, an improved metaheuristic algorithm based on the ant colony optimization (ACO) algorithm was designed for trajectory planning of patrol drones; a minimal sketch of this idea follows. Finally, a digital environmental map was built from real urban scenes and simulation experiments were conducted. The results show that, compared with three other metaheuristic algorithms, the algorithm designed in this study performs best.
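The abstract does not give the authors' exact formulation, so the following is only a minimal sketch of the underlying idea, assuming a simplified setting in which the drone visits a fixed set of patrol waypoints and the tour length serves as a stand-in for the combined emission/noise/coverage cost; the waypoint coordinates, pheromone parameters, and cost function are illustrative assumptions, not values from the paper.

```python
# Minimal ACO sketch for ordering patrol waypoints (illustrative only; not the
# authors' improved algorithm). Cost = closed-tour length, used as a proxy for
# a weighted emission/noise objective.
import numpy as np

rng = np.random.default_rng(0)
waypoints = rng.uniform(0, 1000, size=(12, 2))   # hypothetical patrol points (m)
n = len(waypoints)
dist = np.linalg.norm(waypoints[:, None] - waypoints[None, :], axis=-1)

alpha, beta, rho, n_ants, n_iter = 1.0, 3.0, 0.5, 20, 100
tau = np.ones((n, n))                            # pheromone matrix
eta = 1.0 / (dist + np.eye(n))                   # heuristic desirability

best_tour, best_cost = None, np.inf
for _ in range(n_iter):
    tours = []
    for _ant in range(n_ants):
        tour = [0]
        unvisited = set(range(1, n))
        while unvisited:
            i = tour[-1]
            cand = np.array(sorted(unvisited))
            w = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)
            j = int(rng.choice(cand, p=w / w.sum()))
            tour.append(j)
            unvisited.remove(j)
        cost = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
        tours.append((tour, cost))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    tau *= (1 - rho)                             # evaporation
    for tour, cost in tours:                     # pheromone deposit
        for k in range(n):
            tau[tour[k], tour[(k + 1) % n]] += 1.0 / cost

print("best patrol order:", best_tour, "cost (m):", round(best_cost, 1))
```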

Hyperspectral Image Classification using EfficientNet-B4 with Search and Rescue Operation Algorithm

  • S.Srinivasan;K.Rajakumar
    • International Journal of Computer Science & Network Security / v.23 no.12 / pp.213-219 / 2023
  • In recent years, the popularity of deep learning (DL) has increased due to its ability to extract features from hyperspectral images (HSIs). The lack of discriminative power in features produced by traditional machine learning algorithms has resulted in poor classification results. How to obtain excellent classification results from limited samples without overfitting is also an open question for HSIs. These issues are addressed by a new learning network structure developed in this study, an EfficientNet-B4-based convolutional network (EN-B4), which maintains a balanced ratio between network resolution, width, and depth. The weights of the proposed model are optimized by the Search and Rescue Operations (SRO) algorithm, which is inspired by the explorations carried out by humans during search and rescue processes. Tests were conducted on two datasets, Indian Pines (IP) and University of Pavia (UP), to verify the efficacy of EN-B4 (a minimal sketch of the backbone adaptation follows). Experiments show that EN-B4 outperforms other state-of-the-art approaches in terms of classification accuracy.
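The paper's exact EN-B4 configuration and the SRO weight-optimization loop are not given in the abstract, so the sketch below only shows a commonly used adaptation of a torchvision EfficientNet-B4 backbone to hyperspectral patch classification: the stem convolution is replaced to accept the band count and the classifier head is resized to the number of land-cover classes. The band count (200, as in Indian Pines), class count, and patch size are assumptions; SRO itself is not implemented here.

```python
# Hedged sketch: adapting EfficientNet-B4 to hyperspectral patch classification.
# Band/class counts are placeholders; the SRO optimizer from the paper is not shown.
import torch
import torch.nn as nn
from torchvision.models import efficientnet_b4

n_bands, n_classes = 200, 16          # e.g. Indian Pines (assumed values)

model = efficientnet_b4(weights=None) # train from scratch on HSI patches
# Replace the stem conv so it accepts n_bands input channels instead of 3.
stem = model.features[0][0]
model.features[0][0] = nn.Conv2d(
    n_bands, stem.out_channels,
    kernel_size=stem.kernel_size, stride=stem.stride,
    padding=stem.padding, bias=False,
)
# Resize the classification head to the number of land-cover classes.
in_features = model.classifier[1].in_features
model.classifier[1] = nn.Linear(in_features, n_classes)

# Sanity check with a batch of 27x27 spatial patches (patch size is an assumption).
x = torch.randn(4, n_bands, 27, 27)
print(model(x).shape)                 # -> torch.Size([4, 16])
```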

Experimental study on the generation of ultrafine-sized dry fog and removal of particulate matter (초미세 크기의 마른 안개 생성과 이를 이용한 미세먼지 제거 연구)

  • Kiwoong Kim
    • Journal of the Korean Society of Visualization / v.22 no.1 / pp.34-39 / 2024
  • Fine particulate matter (PM) poses a serious threat to public health and the environment, and ultrafine PM in particular can cause serious problems. This study investigates the effectiveness of a submicron dry fog system in removing fine PM. Two methods are used to create fine dust particles: burning incense and using an aerosol generator. Results indicate that the dry fog system effectively removes fine dust particles, with removal efficiencies of up to 81.9% for PM10 and 61.9% for PM2.5 after 30 minutes of operation (the standard efficiency definition is sketched below). The dry fog, with a mean droplet size of approximately 1.5 ㎛, outperforms traditional water spraying methods, owing to reduced water consumption and an increased contact probability between water droplets and dust particles. Furthermore, experiments with uniform-sized particles of 1 ㎛ and 2 ㎛ demonstrate the system's capability to remove ultrafine PM. The proposed submicron dry fog system shows promise for mitigating fine dust pollution in various industrial settings, offering advantages such as low energy consumption and enhanced safety for workers and equipment.
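The abstract reports removal efficiencies directly; for reference, the conventional definition computed from concentrations before and after operation can be written as the short helper below. The concentration values used here are made-up placeholders, not measurements from the paper.

```python
# Removal efficiency in % from initial and final PM concentrations (µg/m^3).
def removal_efficiency(c_initial: float, c_final: float) -> float:
    return (c_initial - c_final) / c_initial * 100.0

# Placeholder numbers only: a drop from 150 to 27 µg/m^3 gives 82% removal,
# comparable in magnitude to the PM10 figure reported in the abstract.
print(round(removal_efficiency(150.0, 27.0), 1))   # 82.0
```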

Application of a comparative analysis of random forest programming to predict the strength of environmentally-friendly geopolymer concrete

  • Ying Bi;Yeng Yi
    • Steel and Composite Structures / v.50 no.4 / pp.443-458 / 2024
  • The construction industry, one of the biggest producers of greenhouse gas emissions, is under considerable pressure as a result of growing concern about how climate change may affect local communities. Geopolymer concrete (GPC) has emerged as a feasible construction material because of the environmental issues connected with cement manufacture. The findings of this study contribute to the development of machine learning methods for estimating the properties of eco-friendly concrete, which might be used in lieu of traditional concrete to reduce CO2 emissions in the building industry. In the present work, the compressive strength (fc) of GPC, in which natural zeolite (NZ) and silica fume (SF) replace ground granulated blast-furnace slag (GGBFS), is predicted using random forest regression (RFR). A thorough set of experiments on GPC samples was compiled from the literature, totaling 254 data rows. The RFR was integrated with the artificial hummingbird algorithm (AHA), the black widow optimization algorithm (BWOA), and the chimp optimization algorithm (ChOA), abbreviated ARFR, BRFR, and CRFR, respectively (a minimal modeling sketch follows). The RFR models demonstrated satisfactory performance across all evaluation metrics in the prediction procedure. For the R2 metric, the CRFR model reached 0.9988 and 0.9981 on the training and test data sets, higher than BRFR (0.9982 and 0.9969), followed by ARFR (0.9971 and 0.9956). Other error and distribution metrics showed a roughly 50% improvement for CRFR with respect to ARFR.
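The dataset and the three metaheuristics are not available from the abstract, so the sketch below only illustrates the underlying pipeline: a random forest regressor predicting compressive strength from mix-design features, with a plain random search standing in for AHA/BWOA/ChOA hyperparameter tuning. The synthetic data and search ranges are placeholders, not the paper's setup.

```python
# Hedged sketch: RFR for GPC compressive strength, with random hyperparameter
# search standing in for the paper's metaheuristic tuning (AHA/BWOA/ChOA).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# Synthetic placeholder data: 254 rows, features such as GGBFS/NZ/SF contents.
X = rng.uniform(0, 1, size=(254, 6))
y = 30 + 40 * X[:, 0] - 15 * X[:, 1] + 10 * X[:, 2] + rng.normal(0, 2, 254)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

best_r2, best_params = -np.inf, None
for _ in range(30):                               # stand-in for metaheuristic search
    params = {
        "n_estimators": int(rng.integers(100, 500)),
        "max_depth": int(rng.integers(3, 20)),
        "min_samples_leaf": int(rng.integers(1, 6)),
    }
    model = RandomForestRegressor(random_state=0, **params).fit(X_tr, y_tr)
    r2 = r2_score(y_te, model.predict(X_te))
    if r2 > best_r2:
        best_r2, best_params = r2, params

print("best params:", best_params, "test R2:", round(best_r2, 4))
```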

Analyzing the Influence of Spatial Sampling Rate on Three-dimensional Temperature-field Reconstruction

  • Shenxiang Feng;Xiaojian Hao;Tong Wei;Xiaodong Huang;Pan Pei;Chenyang Xu
    • Current Optics and Photonics / v.8 no.3 / pp.246-258 / 2024
  • In aerospace and energy engineering, the reconstruction of three-dimensional (3D) temperature distributions is crucial. Traditional methods such as algebraic iterative reconstruction and filtered back-projection depend on voxel division for resolution. Our algorithm, which blends deep learning with computer-graphics rendering, converts 2D projections into light rays for uniform sampling and uses a fully connected neural network to represent the 3D temperature field (a minimal sketch of such a coordinate network follows). Although effective in capturing internal details, it demands multiple cameras for projections from varied angles, increasing cost and computational needs. We assess the impact of camera count on reconstruction accuracy and efficiency, conducting butane-flame simulations with different camera setups (6 to 18 cameras). The results show improved accuracy with more cameras, with 12 cameras achieving the best computational efficiency (1.263) and low error rates. Verification experiments with 9, 12, and 15 cameras, using thermocouples, confirm the 12-camera setup as the best, balancing efficiency and accuracy. This offers a feasible, cost-effective solution for real-world applications such as engine testing and environmental monitoring, improving accuracy and resource management in temperature measurement.
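The abstract only states that a fully connected network represents the temperature field; the sketch below shows one common form of such a coordinate network, mapping a 3D point to a temperature value. The layer sizes, sampling, and training details are chosen arbitrarily rather than taken from the paper.

```python
# Hedged sketch: a fully connected network T(x, y, z) -> temperature, the kind
# of coordinate representation the abstract describes (sizes are assumptions).
import torch
import torch.nn as nn

class TemperatureField(nn.Module):
    def __init__(self, hidden: int = 128, depth: int = 4):
        super().__init__()
        layers, in_dim = [], 3
        for _ in range(depth):
            layers += [nn.Linear(in_dim, hidden), nn.ReLU()]
            in_dim = hidden
        layers.append(nn.Linear(hidden, 1))      # scalar temperature output
        self.net = nn.Sequential(*layers)

    def forward(self, xyz: torch.Tensor) -> torch.Tensor:
        return self.net(xyz)

field = TemperatureField()
pts = torch.rand(1024, 3)                        # points sampled along camera rays
pred_T = field(pts)                              # predicted temperature per point
print(pred_T.shape)                              # torch.Size([1024, 1])
# In the full method, per-point predictions would be integrated along each ray
# and compared against the 2D projections to train the network.
```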

Token-Based Classification and Dataset Construction for Detecting Modified Profanity (변형된 비속어 탐지를 위한 토큰 기반의 분류 및 데이터셋)

  • Sungmin Ko;Youhyun Shin
    • The Transactions of the Korea Information Processing Society / v.13 no.4 / pp.181-188 / 2024
  • Traditional profanity detection methods have limitations in identifying intentionally altered profanities. This paper introduces a new method based on named entity recognition, a subfield of natural language processing. We developed a profanity detection technique using sequence labeling, for which we constructed a dataset by labeling profanities in Korean malicious comments and conducted experiments (the labeling scheme is sketched below). Additionally, to enhance the model's performance, we augmented the dataset by labeling parts of a Korean hate speech dataset with a large language model, ChatGPT, and retrained the model. During this process, we confirmed that having humans filter the dataset created by the large language model could further improve performance, which suggests that human oversight is still necessary in the dataset augmentation process.
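The abstract does not specify the tag set, so the sketch below just illustrates the standard BIO scheme for sequence labeling applied to a toy masked-profanity example; the sample tokens, the B-PROF/I-PROF tag names, and the helper function are illustrative placeholders, not the authors' dataset format.

```python
# Hedged sketch: BIO-style sequence labels for token-level profanity detection.
# The example tokens and tags are placeholders, not entries from the paper's dataset.
from typing import List, Tuple

def spans_to_bio(tokens: List[str], profane_spans: List[Tuple[int, int]]) -> List[str]:
    """Convert token-index spans (start, end exclusive) to B-PROF/I-PROF/O tags."""
    tags = ["O"] * len(tokens)
    for start, end in profane_spans:
        tags[start] = "B-PROF"
        for i in range(start + 1, end):
            tags[i] = "I-PROF"
    return tags

tokens = ["this", "comment", "uses", "a", "mas", "##ked", "slur"]   # toy tokens
tags = spans_to_bio(tokens, profane_spans=[(4, 6)])
for tok, tag in zip(tokens, tags):
    print(f"{tok:8s} {tag}")
# A token-classification model (e.g. a BERT encoder with a per-token softmax)
# is then trained to predict these tags, which is what lets it catch altered
# variants that keyword filters miss.
```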

Acoustic Characteristics of the Haegeum Body (해금 몸체의 음향학적 특성에 관한 연구)

  • Noh, Jung-Uk;Park, Sang-Ha;Sung, Koeng-Mo
    • The Journal of the Acoustical Society of Korea / v.26 no.7 / pp.317-322 / 2007
  • This paper is a first step in studying the acoustic characteristics of the Haegeum, a Korean traditional bowed-string instrument. We measured acoustic transfer functions of a Haegeum body using the impulse response method (a minimal computation sketch follows); all measurements were performed in the anechoic chamber of INMC, SNU. We examined the resonant characteristics of the Haegeum body from the obtained transfer functions. We then performed additional studies, Chladni pattern experiments and air-cavity resonance calculations, to verify the relations between the resonant peaks of the transfer functions and the resonances of each component, such as the top plate and the air cavity. As a result, we can explain the acoustic characteristics of a Haegeum body and its components.
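The measurement chain is not detailed in the abstract; the snippet below is only a generic sketch of how a transfer function is estimated from an excitation and a measured response with the impulse response method, using synthetic signals and an arbitrary sampling rate in place of the actual recordings.

```python
# Hedged sketch: estimating a transfer function H(f) = Y(f) / X(f) from an
# excitation x (e.g. an impulse) and a measured response y (body vibration).
# Signals here are synthetic stand-ins for the actual measurements.
import numpy as np

fs = 48_000                                   # sampling rate (assumed), Hz
t = np.arange(0, 0.5, 1 / fs)

x = np.zeros_like(t)
x[0] = 1.0                                    # idealized impulse excitation
# Fake "body" response: two decaying resonances (placeholder for real data).
y = (np.exp(-20 * t) * np.sin(2 * np.pi * 450 * t)
     + 0.5 * np.exp(-30 * t) * np.sin(2 * np.pi * 1200 * t))

X = np.fft.rfft(x)
Y = np.fft.rfft(y)
H = Y / X                                     # transfer function estimate
freqs = np.fft.rfftfreq(len(t), 1 / fs)

peak = freqs[np.argmax(np.abs(H))]
print(f"strongest resonance near {peak:.0f} Hz")   # ~450 Hz for this toy signal
```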

Two-dimensional concrete meso-modeling research based on pixel matrix and skeleton theory

  • Jingwei Ying;Yujun Jian;Jianzhuang Xiao
    • Computers and Concrete / v.33 no.6 / pp.671-688 / 2024
  • The modeling efficiency of concrete meso-models that are close to real concrete is one of the important issues limiting the accuracy of mechanical simulation. To improve modeling efficiency and bring the numerical aggregate shape closer to real aggregate, this paper proposes a method for generating a two-dimensional concrete meso-model based on a pixel matrix and skeleton theory. First, an initial concrete model (a container for placing aggregate) is generated using a pixel matrix. Then, the skeleton curve of the residual space, that is, the model after excluding the existing aggregate, is obtained using a thinning algorithm (a minimal sketch of this step follows). Finally, the final model is obtained by placing aggregate at the branching points of the curve. Compared with the traditional Monte Carlo placement method, the proposed method reduces the number of overlaps between aggregates by up to 95%, and placement efficiency does not decrease significantly with increasing aggregate content. The developed model is close to actual concrete experiments in terms of aggregate gradation, aspect ratio, asymmetry, concavity and convexity, old-new mortar ratio, cracking form, and stress-strain curve. In addition, the cracking process of concrete under uniaxial compression is explained at the mesoscale.
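The paper's actual pixel-matrix construction is not reproducible from the abstract; the fragment below only sketches the skeleton step it describes, thinning the residual (aggregate-free) space of a toy binary image with scikit-image and locating branch points as skeleton pixels with three or more skeleton neighbours. The toy image and the branch-point rule are illustrative assumptions.

```python
# Hedged sketch: skeletonize the residual space of a toy 2D "concrete" image and
# find branch points, the locations where new aggregate would be placed.
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

rng = np.random.default_rng(1)
model = np.ones((200, 200), dtype=bool)             # True = residual (mortar) space
for _ in range(8):                                  # stamp a few "aggregates" (discs)
    cy, cx, r = rng.integers(30, 170, 2).tolist() + [int(rng.integers(10, 25))]
    yy, xx = np.ogrid[:200, :200]
    model[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = False

skeleton = skeletonize(model)                       # thinning of the residual space

# Branch points: skeleton pixels with >= 3 skeleton neighbours (8-connectivity).
kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
branch_points = np.argwhere(skeleton & (neighbours >= 3))

print("skeleton pixels:", int(skeleton.sum()), "branch points:", len(branch_points))
```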

An Empirical Study on Improving the Accuracy of Demand Forecasting Based on Multi-Machine Learning (다중 머신러닝 기법을 활용한 무기체계 수리부속 수요예측 정확도 개선에 관한 실증연구)

  • Myunghwa Kim;Yeonjun Lee;Sangwoo Park;Kunwoo Kim;Taehee Kim
    • Journal of the Korea Institute of Military Science and Technology / v.27 no.3 / pp.406-415 / 2024
  • As military equipment has become more advanced and expensive, the cost of securing spare parts is constantly increasing along with the growth in equipment assets. Forecasting demand for spare parts is one of the important management tasks in the military, and the accuracy of these forecasts is directly related to military operations and cost management. However, because the demand for spare parts is intermittent and irregular, it is often difficult to make accurate predictions using traditional statistical methods or a single statistical or machine learning model. In this paper, we propose a model that increases the accuracy of demand forecasting for irregular spare-parts demand patterns by combining statistical and machine learning algorithms (a minimal ensemble sketch follows), and we verify it through experiments on Cheonma spare parts demand data.
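The specific statistical and machine learning components are not named in the abstract, so the sketch below simply averages the forecasts of a few off-the-shelf regressors on synthetic intermittent demand, as a generic stand-in for the paper's multi-machine-learning combination; the lag features, model choices, and data are all assumptions.

```python
# Hedged sketch: averaging several regressors on synthetic intermittent demand,
# a generic stand-in for the paper's multi-machine-learning combination.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)
# Intermittent monthly demand: mostly zeros with occasional positive orders.
demand = rng.poisson(0.4, 120) * rng.integers(1, 5, 120)

# Lag features: use the previous 6 months to predict the next month.
lags = 6
X = np.array([demand[i:i + lags] for i in range(len(demand) - lags)])
y = demand[lags:]
split = int(0.8 * len(X))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

models = [
    RandomForestRegressor(n_estimators=200, random_state=0),
    GradientBoostingRegressor(random_state=0),
    Ridge(alpha=1.0),
]
preds = np.mean([m.fit(X_tr, y_tr).predict(X_te) for m in models], axis=0)
print("ensemble MAE:", round(mean_absolute_error(y_te, preds), 3))
```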