• Title/Summary/Keyword: Deterministic trend

Loss Estimation in Southeast Korea from a Scenario Earthquake using the Deterministic Method in HAZUS

  • Kim, Kwang-Hee;Kang, Su-Young
    • Proceedings of the Korean Society of Hazard Mitigation Conference
    • /
    • 2009.02b
    • /
    • pp.43-50
    • /
    • 2009
  • A strong ground motion attenuation relationship represents the overall trend of ground shaking at a site as a function of distance from the source, geology, local soil conditions, and other factors. Reliable seismic hazard/risk assessment requires an attenuation relationship developed with careful consideration of the characteristics of the target area. In this study, observed ground motions from the January 2007 magnitude 4.9 Odaesan earthquake and events occurring in the Gyeongsang provinces are compared with previously proposed attenuation relationships for the Korean Peninsula to select the most appropriate one. HAZUS also provides several strong ground motion attenuation relationships designed for the Western United States and the Central and Eastern United States. The relationship selected from those for the Korean Peninsula was compared with the attenuation relationships available in HAZUS, and the Western United States relation proposed by Sadigh et al. (1997) for Site Class B was selected for this study; using an appropriate attenuation relation improves the reliability of the assessment. It was then used for earthquake loss estimation of the Gyeongju area in southeast Korea using the deterministic method in HAZUS with a scenario earthquake (M=6.7). Our preliminary estimates show 15.6% damage to houses, shelter needs for about three thousand residents, and 75 fatalities in the study area for a scenario event occurring at 2 A.M. Approximately 96% of hospitals would be in normal operation within 24 hours of the proposed event. Losses related to houses would exceed 114 million US dollars. Application of the improved loss-estimation methodology in Korea will help decision makers plan disaster responses and hazard mitigation. (A generic attenuation-relation form is sketched after this entry.)

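As a rough illustration of what an attenuation relationship does in this kind of loss estimation, the Python sketch below evaluates a generic ground-motion form in which shaking grows with magnitude and decays with distance. The functional shape is a common textbook one and the coefficients c1-c4 are placeholders, not the Sadigh et al. (1997) Site Class B coefficients used in the paper.

```python
import math

def ln_pga(magnitude, rupture_dist_km, c1=-1.0, c2=1.0, c3=-1.8, c4=0.4):
    """Illustrative rock-site attenuation form: ln(PGA) grows with magnitude
    and decays with distance. Coefficients are placeholders, NOT the
    Sadigh et al. (1997) values used in the study."""
    # Magnitude scaling plus geometric spreading with a magnitude-dependent
    # near-source saturation term, a shape shared by many attenuation relations.
    r_term = rupture_dist_km + math.exp(c4 * magnitude)
    return c1 + c2 * magnitude + c3 * math.log(r_term)

# Scenario-like event (M=6.7) evaluated at a few source-to-site distances.
for r in (5, 20, 50):
    print(f"R={r:3d} km  ->  PGA ~ {math.exp(ln_pga(6.7, r)):.3f} g (illustrative)")
```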

Ensuring Data Confidentiality and Privacy in the Cloud using Non-Deterministic Cryptographic Scheme

  • John Kwao Dawson;Frimpong Twum;James Benjamin Hayfron Acquah;Yaw Missah
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.7
    • /
    • pp.49-60
    • /
    • 2023
  • The amount of data generated by electronic systems through e-commerce, social networks, and data computation has risen. However, securing this data has always been a challenge: the problem is not the quantity of data but how to ensure its confidentiality and privacy. Although there are several studies on cloud data security, this study proposes a security scheme with the lowest execution time. The approach employs a non-linear time complexity to achieve data confidentiality and privacy. A symmetric algorithm dubbed the Non-Deterministic Cryptographic Scheme (NCS) is proposed to address the increased execution time of existing cryptographic schemes. NCS has linear time complexity with a low and unpredictable trend of execution times. It achieves confidentiality and privacy of data on the cloud by converting the plaintext into ciphertext with a small number of iterations, thereby decreasing the execution time while maintaining high security. The algorithm is based on good prime numbers, a Linear Congruential Generator (LCG), a Sliding Window Algorithm (SWA), and an XOR gate. For the implementation in C, thirty different execution times were measured and their average was taken. A comparative analysis of the NCS was performed against the AES, DES, and RSA algorithms on data sizes of 128 kb, 256 kb, and 512 kb using a dataset from Kaggle. The results showed that the proposed NCS execution times were lower than those of AES, which in turn had better execution times than DES, with RSA being the slowest. Contrary to the existing knowledge that execution time is relative to data size, the results obtained from the experiment indicated otherwise for the proposed NCS algorithm: for data sizes of 128 kb, 256 kb, and 512 kb, the execution times in milliseconds were 38, 711, and 378 respectively. This validates the NCS as a non-deterministic cryptographic algorithm. The study findings hence support the argument that data size does not determine the execution time. (A toy sketch of the LCG/XOR building blocks follows below.)
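
To make the named building blocks concrete (a linear congruential generator feeding an XOR stage), here is a toy Python sketch. It is not the authors' NCS, the LCG constants are classic textbook values, and the construction offers no real security; it only illustrates how a symmetric XOR keystream round-trips data.

```python
def lcg_keystream(seed, length, a=1103515245, c=12345, m=2**31):
    """Toy linear congruential generator emitting one key byte per step.
    The constants are textbook LCG parameters, not the NCS design."""
    state = seed
    out = bytearray()
    for _ in range(length):
        state = (a * state + c) % m
        out.append(state & 0xFF)
    return bytes(out)

def xor_transform(data: bytes, seed: int) -> bytes:
    """Symmetric XOR of data with the LCG keystream; applying it twice
    with the same seed recovers the plaintext."""
    keystream = lcg_keystream(seed, len(data))
    return bytes(b ^ k for b, k in zip(data, keystream))

msg = b"cloud record"
ct = xor_transform(msg, seed=20230707)
assert xor_transform(ct, seed=20230707) == msg
```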

Task offloading scheme based on the DRL of Connected Home using MEC (MEC를 활용한 커넥티드 홈의 DRL 기반 태스크 오프로딩 기법)

  • Ducsun Lim;Kyu-Seek Sohn
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.23 no.6
    • /
    • pp.61-67
    • /
    • 2023
  • The rise of 5G and the proliferation of smart devices have underscored the significance of multi-access edge computing (MEC). Amid this trend, interest in effectively processing computation-intensive and latency-sensitive applications has increased. This study investigated a novel task offloading strategy for a probabilistic MEC environment to address these challenges. First, we considered the frequency of dynamic task requests and the unstable conditions of wireless channels to propose a method for minimizing vehicle power consumption and latency. Subsequently, we investigated a deep reinforcement learning (DRL) based offloading technique, offering a way to balance local computation against offloading transmission power. We analyzed the power consumption and queuing latency of vehicles using the deep deterministic policy gradient (DDPG) and deep Q-network (DQN) techniques. Finally, we derived and validated the optimal performance enhancement strategy in a vehicle-based MEC environment. (A simple energy/latency cost sketch follows below.)
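
A hedged sketch of the kind of cost such an offloading agent would minimize is given below: a weighted sum of local-computation energy, transmission energy, and delay. All parameter names (kappa, rate_bps, offload_fraction, the weights) are hypothetical; the paper's actual DDPG/DQN state, action, and reward design is not reproduced here.

```python
def offloading_cost(local_cycles, cpu_freq_hz, kappa, data_bits, rate_bps,
                    tx_power_w, offload_fraction, w_energy=0.5, w_delay=0.5):
    """Weighted energy/latency cost for splitting a task between local
    execution and MEC offloading. All parameters are hypothetical; this is
    only a generic shape for the objective a DRL agent might minimize."""
    # Local part: dynamic CPU energy kappa*f^2 per cycle, delay cycles/f.
    local = (1.0 - offload_fraction) * local_cycles
    e_local = kappa * (cpu_freq_hz ** 2) * local
    t_local = local / cpu_freq_hz
    # Offloaded part: transmission energy and delay over the wireless link.
    tx_time = offload_fraction * data_bits / rate_bps
    e_tx = tx_power_w * tx_time
    # Local computing and transmission are assumed to proceed in parallel.
    delay = max(t_local, tx_time)
    return w_energy * (e_local + e_tx) + w_delay * delay

print(offloading_cost(local_cycles=1e9, cpu_freq_hz=1e9, kappa=1e-27,
                      data_bits=8e6, rate_bps=2e7, tx_power_w=0.1,
                      offload_fraction=0.6))
```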

Analysis of the Secular Trend of the Annual and Monthly Precipitation Amount of South Korea (우리나라 월 및 연강수량의 경년변동 분석)

  • Kim, Gwang-Seob;Yim, Tae-Kyung;Park, Chan-Hee
    • Journal of the Korean Society of Hazard Mitigation
    • /
    • v.9 no.6
    • /
    • pp.17-30
    • /
    • 2009
  • In this study, the existence of a possible deterministic long-term trend in precipitation amount, monthly maximum precipitation, the number of rain days, and the number of rain days exceeding 20 mm, 30 mm, and 80 mm was analyzed using the Mann-Kendall rank test and data from 62 stations in South Korea between 1905 and 2004. Results indicate that the annual and monthly rainfall amounts increase and that the number of rain days with more than 80 mm of rainfall a day increases, whereas the total number of rain days decreases. Monthly trend analysis shows that precipitation amount and monthly maximum precipitation increase in Jan., May, Jun., Jul., Aug., and Sep. and decrease in Mar., Apr., Oct., Nov., and Dec. The monthly number of rain days exceeding 20 mm, 30 mm, and 80 mm increases in Jun., Jul., Aug., and Sep. However, the Mann-Kendall test demonstrated that the ratio of stations with a meaningful long-term trend at the 90% and 95% significance levels is very low, which means that the random variability of the analyzed precipitation-related data is much greater than their linear trend. (A minimal Mann-Kendall test sketch follows below.)
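
For reference, a minimal Python sketch of the Mann-Kendall rank test (without tie correction) is given below; the example series is made up and is not taken from the 62-station dataset.

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S statistic
    and the normal-approximation Z score. Positive Z suggests an upward trend."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Example: a short precipitation-like series; |Z| > 1.645 would be
# significant at the 90% level used in the study.
s, z = mann_kendall([1200, 1150, 1300, 1280, 1350, 1330, 1420])
print(s, round(z, 2))
```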

Robust Predictive Control of Robot Manipulators with Uncertainties (불확실 로봇 매니퓰레이터의 견실 예측 제어기 설계)

  • 김정관;한명철
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.10 no.1
    • /
    • pp.10-14
    • /
    • 2004
  • We present a predictive control algorithm combined with robust robot control constructed on the Lyapunov min-max approach. Since the control design of a real manipulator system is often based on imperfect knowledge of the model, it is an important trend to design a robust control law that guarantees the desired properties of the manipulator under uncertain elements. In the preceding robust control work, several control parameters must be tuned within the admissible set where the desired stability can be achieved. By introducing an optimal predictive control technique into robust control, we can obtain a much more deterministic controller for both the stability and the performance of manipulators. A new class of robust control combined with optimal predictive control is constructed. We apply it to a simple 2-link robot manipulator and show through computer simulation that the desired performance can be achieved. (A generic form of the Lyapunov min-max robust term is sketched below.)
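
For context, the sketch below shows a generic Lyapunov min-max (unit-vector) robust control term with boundary-layer smoothing, a textbook form only; it is not the authors' combined predictive controller, and e, P, B, and rho are assumed inputs.

```python
import numpy as np

def robust_term(e, P, B, rho, eps=1e-3):
    """Generic Lyapunov min-max (unit-vector) robust control term with a
    boundary layer for smoothness. e: tracking error, P: Lyapunov matrix,
    B: input matrix, rho: bound on the uncertainty. Textbook form, not the
    paper's combined predictive controller."""
    w = B.T @ P @ e
    norm_w = np.linalg.norm(w)
    if norm_w > eps:
        return -rho * w / norm_w
    return -rho * w / eps  # continuous approximation inside the boundary layer

# Toy call with hypothetical 2-state, 1-input matrices.
e = np.array([0.1, -0.05])
P = np.eye(2)
B = np.array([[0.0], [1.0]])
print(robust_term(e, P, B, rho=2.0))
```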

Research Trend in Ultra-Low Latency Networking for Fourth Industrial Revolution (제4차 산업혁명 시대를 위한 초저지연 네트워킹 기술 동향)

  • Kang, T.K.;Kang, Y.H.;Ryoo, Y.C.;Cheung, T.S.
    • Electronics and Telecommunications Trends
    • /
    • v.34 no.6
    • /
    • pp.108-122
    • /
    • 2019
  • Ultra-low latency networking is a technology that reduces the end-to-end latency of transporting time-sensitive or mission-critical traffic across a network. As the fourth industrial revolution and 5G mobile communications proliferate, ultra-low latency networking is emerging as an essential technology for supporting various network applications such as industrial control, tele-surgery, and unmanned vehicles. In this report, we introduce the ultra-low-latency networking technologies currently in progress, categorized by application area, and examine their up-to-date standardization status.

A Study on the Utility of Statistical Power Balance Method for Efficient Electromagnetic Analysis of Large and Complex Structures (복잡한 대형 구조물의 효율적인 전자파 해석을 위한 통계적인 PWB 방법의 유용성에 관한 연구)

  • Lee, Young-Seung;Park, Seung-Keun
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.24 no.2
    • /
    • pp.189-197
    • /
    • 2013
  • With the trend of technological advances in electronic communications and the advent of ubiquitous environments, the density of electronic equipment in our surroundings is increasing significantly. It is therefore of great importance to study numerically efficient and fast algorithms for complex and large structures in order to identify the electromagnetic compatibility and interference characteristics of equipment installed in those structures. This paper introduces a statistics-based power balance method (PWB) for the analysis of these problems and considers its practical utility. The two-dimensional lossy rectangular cavity was numerically revisited to clarify its relationship with classical deterministic analysis solutions based on Maxwell's equations. It can be shown that the statistical assumptions and analysis results of the power balance method correspond to the volume average over the realistic deterministic domain. This statistical power balance approach should be a sufficiently practical alternative for electromagnetic problems in complex and large environments, since full-wave analysis methods face severe computational burdens in such situations. (A minimal power-balance calculation is sketched below.)
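
A minimal sketch of the power-balance bookkeeping is given below, assuming steady state so that injected power equals the mean power density times the sum of loss cross-sections; the cavity values are placeholders, not results from the paper.

```python
def pwb_mean_power_density(p_injected_w, cross_sections_m2):
    """Statistical power balance: in steady state the injected power equals
    the mean power density times the sum of all loss cross-sections, so
    S = P_in / sum(sigma_i). Returns S and the power absorbed per mechanism."""
    sigma_total = sum(cross_sections_m2.values())
    s = p_injected_w / sigma_total
    dissipated = {name: s * sigma for name, sigma in cross_sections_m2.items()}
    return s, dissipated

# Hypothetical cavity with wall, aperture, and equipment absorption (m^2).
s, per_mech = pwb_mean_power_density(
    1.0, {"walls": 0.05, "aperture": 0.01, "equipment": 0.02})
print(f"mean power density ~ {s:.2f} W/m^2", per_mech)
```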

Use of Space-time Autocorrelation Information in Time-series Temperature Mapping (시계열 기온 분포도 작성을 위한 시공간 자기상관성 정보의 결합)

  • Park, No-Wook;Jang, Dong-Ho
    • Journal of the Korean association of regional geographers
    • /
    • v.17 no.4
    • /
    • pp.432-442
    • /
    • 2011
  • Climatic variables such as temperature and precipitation tend to vary both in space and in time simultaneously. Thus, it is necessary to include space-time autocorrelation in conventional spatial interpolation methods for reliable time-series mapping. This paper introduces and applies space-time variogram modeling and space-time kriging to generate time-series temperature maps using hourly Automatic Weather System (AWS) temperature observations for a one-month period. First, temperature observations are decomposed into a deterministic trend component and a stochastic residual component. For trend modeling, elevation data, which are reasonably correlated with temperature, are used as secondary information to generate a trend component with topographic effects. Then, space-time variograms of the residual components are estimated and modeled with a product-sum space-time variogram model to account not only for autocorrelation in space and in time but also for their interaction. In a case study, space-time kriging outperforms both conventional space-only ordinary kriging and regression-kriging, which indicates the importance of using space-time autocorrelation information as well as elevation data. Space-time kriging is expected to be a useful tool when a space-poor but time-rich dataset is analyzed. (The product-sum variogram form is sketched after this entry.)

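For reference, the product-sum space-time variogram referred to above combines marginal spatial and temporal variograms as gamma(hs, ht) = gamma_s(hs) + gamma_t(ht) - k * gamma_s(hs) * gamma_t(ht). The Python sketch below uses exponential marginals with hypothetical sills, ranges, and k, not the values fitted to the AWS temperature data.

```python
import math

def exp_variogram(h, sill, rng):
    """Exponential marginal variogram with practical range rng."""
    return sill * (1.0 - math.exp(-3.0 * h / rng))

def product_sum_variogram(hs, ht, sill_s=1.0, rng_s=50.0,
                          sill_t=1.0, rng_t=12.0, k=0.5):
    """Product-sum space-time variogram,
    gamma(hs,ht) = gamma_s(hs) + gamma_t(ht) - k*gamma_s(hs)*gamma_t(ht),
    valid for 0 < k <= 1/max(sill_s, sill_t). All parameters here are
    hypothetical placeholders."""
    gs = exp_variogram(hs, sill_s, rng_s)   # spatial lag, e.g. km
    gt = exp_variogram(ht, sill_t, rng_t)   # temporal lag, e.g. hours
    return gs + gt - k * gs * gt

print(product_sum_variogram(hs=10.0, ht=3.0))
```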

A Study on the Change of Concept in Architectural Space following the Aesthetic Cognition of Space (미학적 공간인식에 따른 건축공간개념의 변화에 관한 연구)

  • 이용재;윤도근
    • Korean Institute of Interior Design Journal
    • /
    • no.16
    • /
    • pp.22-28
    • /
    • 1998
  • The purpose of this study is to analyze how architectural space in modern and contemporary architecture has been changed by the aesthetic cognition of space. The intention of considering architectural space aesthetically is to convert the viewpoint that sees space as a simple physical structure into one that regards 'space' as 'cultural place'; however, this does not mean applying aesthetic theory directly to architectural space. The aesthetic cognition of space has been one of the main subjects of artistic expression from antiquity to today, but the emergence of the concept of space in architectural aesthetics was accelerated by G. Semper's theory after the latter half of the 19th century. From the standpoint of perpetuity in architecture, the aesthetics of scientific rationalism in modernism, grounded in rational thinking, regards the variety of inherent characteristics of architectural space as 'Transferential Space'. On the other hand, with regard to architectural trends, the nature of architectural space has been considered as 'Existential Space', starting from the conscious construction of environments to support human existence in existentialism. The conclusions are as follows: first, the concept of spatial structure in architecture has changed from Enclosed Space to Topological Space; second, the concept of architectural space has changed and developed into Deterministic, Profound, Dissipative, and Recognizable Space according to changes of expression in architecture.

Bootstrap estimation of long-run variance under strong dependence (장기간 의존 시계열에서 붓스트랩을 이용한 장기적 분산 추정)

  • Baek, Changryong;Kwon, Yong
    • The Korean Journal of Applied Statistics
    • /
    • v.29 no.3
    • /
    • pp.449-462
    • /
    • 2016
  • This paper considers long-run variance estimation using a block bootstrap method under strong dependence, also known as long-range dependence. We extend currently available methods in two ways. First, we extend bootstrap methods for short-range dependence to long-range dependence. Second, to accommodate the observation that strong dependence may arise from deterministic-trend-plus-noise models, we propose utilizing residuals obtained from nonparametric kernel estimation with a bimodal kernel. The simulation study shows that our method works well; in addition, a data illustration is presented for practitioners. (A plain moving-block bootstrap sketch follows below.)
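
A plain moving-block bootstrap sketch of the long-run variance of the sample mean is given below, assuming the residuals have already been detrended; the paper's bimodal-kernel detrending and long-range-dependence adjustments are not reproduced here.

```python
import random
import statistics

def moving_block_bootstrap_lrv(residuals, block_len, n_boot=500, seed=1):
    """Moving-block bootstrap estimate of the long-run variance of the mean:
    sigma2_LR ~ n * Var*(xbar*). A plain short-memory sketch that only
    illustrates the resampling idea."""
    rng = random.Random(seed)
    n = len(residuals)
    blocks = [residuals[i:i + block_len] for i in range(n - block_len + 1)]
    n_blocks = -(-n // block_len)  # ceiling division: blocks per bootstrap series
    boot_means = []
    for _ in range(n_boot):
        sample = []
        for _ in range(n_blocks):
            sample.extend(rng.choice(blocks))  # resample overlapping blocks
        boot_means.append(statistics.fmean(sample[:n]))
    return n * statistics.pvariance(boot_means)

# Toy usage with made-up residuals.
resid = [rnd for rnd in (random.Random(7).gauss(0, 1) for _ in range(200))]
print(moving_block_bootstrap_lrv(resid, block_len=10))
```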