• Title/Summary/Keyword: Processing


The Contact and Parallel Analysis of Smoothed Particle Hydrodynamics (SPH) Using Polyhedral Domain Decomposition (다면체영역분할을 이용한 SPH의 충돌 및 병렬해석)

  • Moonho Tak
    • Journal of the Korean GEO-environmental Society
    • /
    • v.25 no.4
    • /
    • pp.21-28
    • /
    • 2024
  • In this study, a polyhedral domain decomposition method for Smoothed Particle Hydrodynamics (SPH) analysis is introduced. SPH, a meshless method, is a numerical technique for simulating fluid flow, and it is useful for analyzing fluidic soil and fluid-structure interaction problems. Because SPH is particle-based, increasing the particle count generally improves accuracy but reduces numerical efficiency. To recover efficiency, parallel processing algorithms are commonly employed with Cartesian coordinate-based domain decomposition. However, for parallel analysis of complex geometries or fluidic problems under dynamic boundary conditions, Cartesian decomposition may not be suitable. The introduced polyhedral domain decomposition technique improves parallel efficiency in such problems by allowing the domain to be partitioned into various 3D polyhedral elements that better fit the problem geometry. Physical properties of SPH particles are calculated from the information of neighboring particles within the smoothing length. Methods are presented for sharing particle information across partition boundaries and at cross-points, where parallel efficiency would otherwise diminish. In numerical examples, the proposed method's parallel efficiency approached 95% for up to 12 cores; as the number of cores increases further, parallel efficiency decreases due to the growth of information sharing among cores.
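The neighbour-based evaluation the abstract describes — each particle's properties computed from neighbours within the smoothing length — can be sketched as a density summation. The sketch below is illustrative only (brute-force neighbour search, a standard 3D cubic spline kernel with support 2h), not the paper's parallel polyhedral implementation; all names are ours.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    # standard 3D cubic spline (B-spline) smoothing kernel with support 2h
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def density(positions, masses, h):
    # each particle's density is a kernel-weighted sum over the
    # neighbours that lie within the smoothing support (2h)
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        r = np.linalg.norm(positions - positions[i], axis=1)
        mask = r < 2.0 * h
        rho[i] = np.sum(masses[mask] * cubic_spline_kernel(r[mask], h))
    return rho
```

In a production code the brute-force O(n²) search would be replaced by a cell list or tree, which is exactly where domain decomposition and the sharing of boundary particles between cores enter.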

Comparative analysis of wavelet transform and machine learning approaches for noise reduction in water level data (웨이블릿 변환과 기계 학습 접근법을 이용한 수위 데이터의 노이즈 제거 비교 분석)

  • Hwang, Yukwan;Lim, Kyoung Jae;Kim, Jonggun;Shin, Minhwan;Park, Youn Shik;Shin, Yongchul;Ji, Bongjun
    • Journal of Korea Water Resources Association
    • /
    • v.57 no.3
    • /
    • pp.209-223
    • /
    • 2024
  • In the context of the Fourth Industrial Revolution, data-driven decision-making has become increasingly pivotal. However, the integrity of data analysis is compromised if data quality is not adequately ensured, potentially leading to biased interpretations. This is particularly critical for water level data, essential for water resource management, which often suffers from quality issues such as missing values, spikes, and noise. This study addresses noise-induced data quality deterioration, which complicates trend analysis and may produce anomalous outliers. To mitigate this issue, we propose a noise removal strategy employing the Wavelet Transform, a technique renowned for its efficacy in signal processing and noise elimination. Its advantage lies in operational efficiency: it reduces both time and cost because it does not require the true values of the collected data. This study conducted a comparative performance evaluation between our Wavelet Transform-based approach and the Denoising Autoencoder, a prominent machine learning method for noise reduction. The findings demonstrate that the Coiflets wavelet function outperforms the Denoising Autoencoder across several metrics, including Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and Mean Squared Error (MSE). The superiority of the Coiflets function suggests that selecting a wavelet function tailored to the specific application environment can effectively address data quality issues caused by noise. This study underscores the potential of the Wavelet Transform as a robust tool for enhancing the quality of water level data, thereby contributing to the reliability of water resource management decisions.
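The wavelet denoising idea — transform, threshold the detail coefficients where noise concentrates, invert — can be sketched in a few lines, together with the evaluation metrics named above. The study uses the Coiflets wavelet family; to stay self-contained the sketch below uses a one-level Haar transform with soft thresholding instead, so it illustrates the principle rather than reproducing the paper's method. All names are ours.

```python
import numpy as np

def haar_denoise(x, thresh):
    # one-level Haar transform: approximation (a) and detail (d) coefficients
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    # soft-threshold the detail coefficients, where most noise lives
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    # inverse transform
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

# evaluation metrics used in the study
def mae(true, pred):
    return np.mean(np.abs(true - pred))

def mape(true, pred):
    return np.mean(np.abs((true - pred) / true)) * 100.0

def mse(true, pred):
    return np.mean((true - pred) ** 2)
```

A library such as PyWavelets would supply the Coiflets filters and multi-level decomposition; the threshold-and-invert structure stays the same.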

An Exploratory Study of the Determinants of Global Sourcing Intention in Korean Clothing Sewing Industry: Focusing on Women's Knit Wear Production (국내 의류봉제 산업의 글로벌소싱 의향 고려요인 연구: 여성니트복종(women's knit wear) 생산을 중심으로)

  • Dabin Yoo;Sunwook Chung
    • Asia-Pacific Journal of Business
    • /
    • v.14 no.4
    • /
    • pp.67-85
    • /
    • 2023
  • Purpose - This study investigates the determinants of global sourcing intention in the clothing sewing industry, with a focus on women's knit wear production. Design/methodology/approach - The study collected a unique set of qualitative data through 31 in-depth interviews with fashion brands, promotion agencies, and sewing factories between July 2023 and October 2023, and analyzed the dataset using MAXQDA to complement the research findings. Findings - We have two findings. First, the interviewees commonly cited the following reasons for considering global sourcing: human factors (aging of skilled technicians and labor shortages), financial factors (the gap between domestic and overseas production unit prices), relational factors (lack of novelty), and physical factors (loss of production infrastructure and networks); as reasons for continuing domestic sourcing they cited human factors (a skilled workforce), production factors (delivery dates and product quality), and relational factors (timely communication and mutual trust). Additional code analysis of the interviews supports this finding. Second, there was a subtle difference between buyers (brands) and suppliers (promotion agencies and processing plants): buyers considered exact delivery dates critical so that trend-sensitive women's knit wear reaches the market on time, while suppliers weighed financial factors such as production costs, labor costs, and labor shortages more heavily. Research implications or Originality - This study provides a richer and more balanced view than the existing literature, which has tended to treat global sourcing uniformly across the clothing industry despite the diversity within it. Through qualitative research, we show that sourcing decisions in the sewing industry are shaped by complex factors, and by identifying and categorizing the determinants of global sourcing, we complement existing survey-centered research on the clothing sewing industry. On a practical note, the study shows that buyers (brands) and suppliers (promotion agencies and sewing factories) view domestic and global sourcing differently, suggesting practical implications for revitalizing networks and deriving win-win cooperation network models among industry members in the future.

Enhancing Throughput and Reducing Network Load in Central Bank Digital Currency Systems using Reinforcement Learning (강화학습 기반의 CBDC 처리량 및 네트워크 부하 문제 해결 기술)

  • Yeon Joo Lee;Hobin Jang;Sujung Jo;GyeHyun Jang;Geontae Noh;Ik Rae Jeong
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.34 no.1
    • /
    • pp.129-141
    • /
    • 2024
  • Amid the acceleration of digital transformation across various sectors, the financial market is increasingly focusing on the development of digital and electronic payment methods, including currency. Among these, Central Bank Digital Currencies (CBDC) are emerging as future digital currencies that could replace physical cash: they are stable, not subject to value fluctuation, and can be exchanged one-to-one with existing physical currencies. Research and development efforts on CBDCs are currently underway both domestically and internationally. However, current CBDC systems face scalability issues such as delays in processing large transaction volumes, long response times, and network congestion. To build a universal CBDC system, it is crucial to resolve these scalability issues, including the low throughput and network overload inherent in existing blockchain technologies. This study therefore proposes a reinforcement learning-based solution for handling large-scale data in a CBDC environment, aiming to improve throughput and reduce network congestion. The proposed technique increases throughput by more than 64 times and reduces network congestion by over 20% compared to existing systems.

Feasibility of Emotional Freedom Techniques in Patients with Posttraumatic Stress Disorder: a pilot study

  • Yujin Choi;Yunna Kim;Do-Hyung Kwon;Sunyoung Choi;Young-Eun Choi;Eun Kyoung Ahn;Seung-Hun Cho;Hyungjun Kim
    • Journal of Pharmacopuncture
    • /
    • v.27 no.1
    • /
    • pp.27-37
    • /
    • 2024
  • Objectives: Posttraumatic stress disorder (PTSD) is a prevalent mental health condition, and techniques that use sensory stimulation in processing traumatic memories have gained attention. Emotional Freedom Techniques (EFT) is a psychotherapy that combines tapping on acupoints with exposure and cognitive reframing. This pilot study aimed to assess the feasibility of EFT as a treatment for PTSD by answering the following research questions: 1) What are the compliance and completion rates of patients with PTSD with the EFT protocol, and is the dropout rate reasonable? 2) Is the effect size of the EFT protocol for PTSD sufficient to justify a future trial? Methods: Thirty participants diagnosed with PTSD were recruited. They received weekly EFT sessions for five weeks, in which they repeated a statement acknowledging the problem and accepting themselves while tapping the SI3 acupoint on the side of the hand. PTSD symptoms were evaluated using the PTSD Checklist for DSM-5 (PCL-5) before and after the intervention. Results: Of the 30 PTSD patients (mean age: 34.1 ± 9.1, 80% female), 96.7% showed over 80% compliance with the EFT sessions, and 86.7% completed the entire study process. The mean PCL-5 total score decreased significantly after the intervention, with a large effect size (change from baseline: -14.33 [95% CI: -19.79, -8.86], p < 0.0001, d = 1.06). Conclusion: The study suggests that EFT is a feasible treatment for PTSD, with high session compliance and low dropout rates. The observed effect size supports a larger trial in the future to further investigate EFT as a treatment for PTSD. However, the lack of a control group and the use of a self-rated questionnaire for PTSD symptoms are limitations of this study. The findings of this pilot study can be used to plan a future trial.
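The reported change score, confidence interval, and Cohen's d follow standard paired-sample formulas, sketched below. This is an illustrative reconstruction, not the study's analysis code; it uses a normal-approximation 95% CI for brevity (a real analysis would use the t distribution), and the example data are invented.

```python
import math

def paired_effect(pre, post):
    # change scores (post - pre); negative means symptom reduction
    diff = [b - a for a, b in zip(pre, post)]
    n = len(diff)
    mean = sum(diff) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in diff) / (n - 1))
    se = sd / math.sqrt(n)
    # normal-approximation 95% CI; a real analysis would use the t distribution
    ci = (mean - 1.96 * se, mean + 1.96 * se)
    d = abs(mean) / sd  # Cohen's d for paired samples
    return mean, ci, d
```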

Simulation analysis and evaluation of decontamination effect of different abrasive jet process parameters on radioactively contaminated metal

  • Lin Zhong;Jian Deng;Zhe-wen Zuo;Can-yu Huang;Bo Chen;Lin Lei;Ze-yong Lei;Jie-heng Lei;Mu Zhao;Yun-fei Hua
    • Nuclear Engineering and Technology
    • /
    • v.55 no.11
    • /
    • pp.3940-3955
    • /
    • 2023
  • A new method for numerically predicting and evaluating the decontamination effect of abrasive jet decontamination on radioactively contaminated metal is proposed. Based on a coupled Computational Fluid Dynamics and Discrete Element Model (CFD-DEM) simulation, the motion patterns and distribution of abrasives can be predicted, and the decontamination effect can be evaluated with image processing and recognition technology. The impact of three key parameters (impact distance, inlet pressure, and abrasive mass flow rate) on the decontamination effect is revealed, and reliability verification experiments for the decontamination effect and the numerical simulation method were conducted. The results show that 60Co and other homogeneous solid-solution radioactive pollutants can be removed by abrasive jet, with an average Co removal rate exceeding 80%. The proposed numerical simulation and evaluation method is reliable given the good agreement between predicted and actual values: the predicted and actual abrasive distribution diameters are Ф57 and Ф55, the total coverage rates 26.42% and 23.50%, and the average impact velocities 81.73 m/s and 78.00 m/s. Further analysis shows that the impact distance significantly affects the distribution of abrasive particles on the target surface: the coverage rate of the core area first increases and then decreases with increasing nozzle impact distance, reaching a maximum of 14.44% at 300 mm. An impact distance of around 300 mm is recommended, because the core-area coverage of the abrasive is then largest and the impact velocity is stable at its highest value of 81.94 m/s. The nozzle inlet pressure affects the decontamination effect mainly through the impact kinetic energy of the abrasive and has little effect on the distribution. The greater the inlet pressure, the greater the impact kinetic energy and the stronger the decontamination ability of the abrasive, but the energy consumption is higher, too. For the decontamination of radioactively contaminated metals, a nozzle inlet pressure of around 0.6 MPa is recommended, since most of the Co can be removed at this pressure. Appropriately increasing the abrasive mass and flow can enhance decontamination effectiveness; a total abrasive mass of 50 g per unit decontamination area is suggested, because the core-area coverage rate is relatively large under this condition and the nozzle wear is acceptable.
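The coverage-rate evaluation by image processing amounts to binarising a post-impact image and measuring the fraction of covered pixels, both over the whole target and over a central core region. The sketch below is an illustrative reconstruction under our own assumptions (a simple brightness threshold, a square core region), not the paper's recognition pipeline; all names are ours.

```python
import numpy as np

def coverage_rates(image, thresh, core):
    # binarise the post-impact image: pixels brighter than thresh
    # count as abrasive impact marks
    mask = image > thresh
    total = mask.mean() * 100.0
    # coverage restricted to a central square "core" region [lo:hi, lo:hi]
    lo, hi = core
    core_cov = mask[lo:hi, lo:hi].mean() * 100.0
    return total, core_cov
```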

Digital Library Interface Research Based on EEG, Eye-Tracking, and Artificial Intelligence Technologies: Focusing on the Utilization of Implicit Relevance Feedback (뇌파, 시선추적 및 인공지능 기술에 기반한 디지털 도서관 인터페이스 연구: 암묵적 적합성 피드백 활용을 중심으로)

  • Hyun-Hee Kim;Yong-Ho Kim
    • Journal of the Korean Society for Information Management
    • /
    • v.41 no.1
    • /
    • pp.261-282
    • /
    • 2024
  • This study proposed and evaluated electroencephalography (EEG)-based and eye-tracking-based methods to determine relevance by utilizing users' implicit relevance feedback while navigating content in a digital library. For this, EEG/eye-tracking experiments were conducted on 32 participants using video, image, and text data. To assess the usefulness of the proposed methods, deep learning-based artificial intelligence (AI) techniques were used as a competitive benchmark. The evaluation results showed that EEG component-based methods (av_P600 and f_P3b components) demonstrated high classification accuracy in selecting relevant videos and images (faces/emotions). In contrast, AI-based methods, specifically object recognition and natural language processing, showed high classification accuracy for selecting images (objects) and texts (newspaper articles). Finally, guidelines for implementing a digital library interface based on EEG, eye-tracking, and artificial intelligence technologies have been proposed. Specifically, a system model based on implicit relevance feedback has been presented. Moreover, to enhance classification accuracy, methods suitable for each media type have been suggested, including EEG-based, eye-tracking-based, and AI-based approaches.

Study on Customer Satisfaction Performance Evaluation through e-SCM-based OMS Implementation (e-SCM 기반 OMS 구현을 통한 고객 만족 성과평가에 관한 연구)

  • Hyungdo Zun;ChiGon Kim;KyungBae Yoon
    • The Journal of the Convergence on Culture Technology
    • /
    • v.10 no.3
    • /
    • pp.891-899
    • /
    • 2024
  • The Fourth Industrial Revolution centers on a personalized demand-fulfillment economy, requiring transformation and flexible processing that can deliver what customers want in real time across space and time. This paper implements and evaluates the construction and operation of a packaging platform that can instantly procure the required packaging products based on real-time orders. The components of customer satisfaction are flexible and situation-dependent, which requires efficient management of enterprise operational processes based on an e-SCM platform. An OMS optimized for these conditions plays an important role in maximizing and differentiating the efficiency of a company's operations and improving its cost advantage. The OMS is a mass-customization system that provides efficient MOT (Moment of Truth) logistics services to address the eco-friendly concerns of many individual customers and to achieve optimized logistics operation goals, enhancing repurchase intentions and sustainable business. It precisely analyzes the collected data to support information and decision-making related to efficiency, productivity, and cost, and provides accurate reports. It uses data visualization tools to present data visually and suggests directions for improving the operational process through statistical and predictive analysis.

Proposals to Revise the Occupational Exposure Limits for Chromium and Its Compounds in Korea (국내 크롬 및 그 화합물의 노출실태 및 노출기준 개정 제안)

  • Seung Won Kim;Young Gyu Phee;Yong-Joon Baek;Taejin Chung;Jeong-Hee Han
    • Journal of Korean Society of Occupational and Environmental Hygiene
    • /
    • v.34 no.2
    • /
    • pp.166-178
    • /
    • 2024
  • Objectives: The 12 occupational exposure limits (OELs) for chromium and its compounds in Korea were set by adopting the American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Values (TLVs). However, after the existing TLVs were consolidated and withdrawn in 2018, the Korean OELs diverged significantly from the TLVs, so a revision needs to be reviewed. Methods: Documents related to chromium OELs were reviewed, including the ACGIH TLV Documentation for chromium and its compounds. A field survey was conducted at workplaces handling chromium and its compounds. Based on this, revised OELs were proposed and a socio-economic evaluation was conducted. Results: The OELs for chromium compounds in Korea were first enacted in 2002; in 2007, the OEL for chromium (hexavalent) compounds (insoluble) was lowered from 0.05 mg/m3 to 0.01 mg/m3. In 2008, an OEL of 0.0005 mg/m3 was newly established for strontium chromate, and in 2018 an OEL of 0.001 mg/m3 for calcium chromate. Total chromium and hexavalent chromium were measured in 6 samples each at 2 welding sites, 4 plating sites, and 2 spray coating sites. Comparing the averages measured by ICP (a total chromium analysis method) with those measured by IC (a hexavalent chromium analysis method), only workplace 4 gave the same result; otherwise total chromium was measured higher, at 0.0004 to 0.0027 mg/m3, while hexavalent chromium ranged from non-detection to 0.0014 mg/m3. Amendment ①: The OELs for hexavalent chromium are no longer divided into water-soluble, insoluble, chromium ore processing, and other hexavalent chromium compounds, but are integrated at 0.01 mg/m3, the level for chromium (hexavalent) compounds (insoluble); the OELs for chromium (metal) and chromium (trivalent) compounds are integrated into chromium (trivalent) compounds, and the exposure level is maintained. Amendment ②: The OELs are integrated as in amendment ①, but the level is lowered to 0.005 mg/m3, the OSHA OEL, with a grace period of 4 years. Amendment ③: The OELs are integrated as in amendment ①, but the level is lowered to 0.0002 mg/m3, the ACGIH exposure limit, with a grace period of 5 years. Conclusions: Amendment ①: The change in the OELs is insignificant, so the cost required is small, and the benefit/cost ratio is greater than 1, so there is no problem in applying the amendment. Amendment ②: In all scenarios except chromium(VI) (insoluble), the benefit/cost ratio is greater than 1, so no major problem is expected in applying the amendment. Amendment ③: Since the benefit/cost ratio is less than 1 in all scenarios, the total social benefit obtainable from applying the amendment is not large.

An Empirical Study on Consumers' Dissatisfaction, Attribution and Complaint Behavior (소비자의 구매 후 불만족과 귀인 및 불평행동에 대한 실증적 연구)

  • Koh, In-Kon
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship
    • /
    • v.19 no.3
    • /
    • pp.69-79
    • /
    • 2024
  • Companies should resolve consumer dissatisfaction and increase brand loyalty by actively identifying the sources of dissatisfaction and proactively responding to expected complaint behavior so as to induce repurchase. This is a management goal that should be pursued regardless of company size. The specific purposes of this study are to determine whether the degree of dissatisfaction differs according to the comparison between consumers' expected performance before purchase and their perceived performance after purchase; whether the degree of dissatisfaction affects the type of subsequent complaint behavior; whether attribution has a moderating effect in this process; and whether the persistence of the outcome and the controllability of the cause determine the locus of attribution. Compared with general companies, venture companies are more likely to overload managers' information-processing capacity and to make various irrational errors in decision-making, so this study has important academic and practical implications. The analysis showed that the negative-inconsistency group had the highest degree of dissatisfaction, and the greater the inconsistency, the higher the dissatisfaction. The attribution of unsatisfied consumers had a moderating effect on the degree of dissatisfaction: dissatisfaction was significantly higher in the external-attribution group than in the internal-attribution group, a statistically significant difference. On the other hand, the persistence of the outcome had a statistically significant effect on the locus of attribution, but the controllability of the cause did not. The degree of attribution and dissatisfaction did not affect the type of complaint behavior, showing limited influence. Along with the interpretation of these results, this study presents various implications, especially for small and medium-sized/venture companies that provide new durable products.
