• Title/Summary/Keyword: Timing Metric


The timing of unprecedented hydrological drought under climate change

  • Yusuke Satoh;Hyungjun Kim
    • Proceedings of the Korea Water Resources Association Conference / 2023.05a / pp.48-48 / 2023
  • The intensified droughts under climate change are expected to threaten stable water resource availability. Droughts exceeding the magnitude of historical variability could occur increasingly frequently under future climate conditions. It is crucial to understand how drought will evolve over time because the assumption of hydrological stationarity that held over past decades would be inappropriate for future water resources management. However, the timing of the emergence of unprecedented drought conditions under climate change has rarely been examined. Here, using multimodel hydrological simulations, we investigate the changes in the frequency of hydrological drought (defined as abnormally low river discharge) under high and low greenhouse gas concentration scenarios and with existing water resources management, and estimate the timing of the first emergence of unprecedented regional drought conditions that persist for several consecutive years. This metric enables a new quantification of the urgency of adaptation and mitigation with regard to drought under climate change. The times are detected for several sub-continental-scale regions, and three regions, namely southwestern South America, Mediterranean Europe, and northern Africa, exhibit particularly robust and earlier critical times under the high-emission scenario. These three regions are expected to confront unprecedented conditions within the next 30 years with high likelihood, regardless of the emission scenario. In addition, the results demonstrate the benefits of the lower-emission pathway in reducing the likelihood of emergence; the Paris Agreement goals are shown to be effective in reducing the likelihood to the unlikely level in most regions. Nevertheless, appropriate prior adaptation measures are indispensable when facing unprecedented drought conditions. The results of this study underscore the importance of improving drought preparedness within the considered time horizons. (A hedged illustrative sketch of this first-emergence detection appears after this entry.)

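The entry above does not give the exact detection rule, but the core idea, flagging the first year in which regional drought frequency stays above anything seen in the historical baseline for several consecutive years, can be sketched as follows. This is a minimal, hypothetical Python sketch; the persistence window, baseline period, and drought index are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def first_emergence_year(drought_freq, years, baseline_mask, persist=5):
    """Return the first year in which regional drought frequency stays above
    the historical maximum for `persist` consecutive years (None if never).

    drought_freq  : 1-D array, annual regional drought frequency (e.g. fraction
                    of months with abnormally low discharge) from a simulation
    years         : 1-D array of calendar years, same length
    baseline_mask : boolean array marking the historical reference years
    persist       : required number of consecutive exceedance years (assumed)
    """
    historical_max = drought_freq[baseline_mask].max()
    exceed = drought_freq > historical_max          # unprecedented years
    run = 0
    for i, flag in enumerate(exceed):
        run = run + 1 if flag else 0
        if run >= persist:
            return int(years[i - persist + 1])      # first year of the run
    return None

# Illustrative use with synthetic data (not model output):
years = np.arange(1971, 2100)
freq = np.clip(0.1 + 0.002 * (years - 1971) + 0.05 * np.random.rand(years.size), 0, 1)
baseline = years <= 2005
print(first_emergence_year(freq, years, baseline))
```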

Area-Power Trade-Offs for Flexible Filtering in Green Radios

  • Michael, Navin;Moy, Christophe;Vinod, Achutavarrier Prasad;Palicot, Jacques
    • Journal of Communications and Networks / v.12 no.2 / pp.158-167 / 2010
  • The energy efficiency of wireless infrastructure and terminals has been drawing renewed attention of late due to their significant environmental cost. Emerging green communication paradigms, such as cognitive radios, are also imposing the additional requirement of flexibility. This dual requirement of energy efficiency and flexibility poses new design challenges for implementing radio functional blocks. This paper focuses on the area vs. power trade-offs for the type of channel filters required in the digital front-end of a flexible, energy-efficient radio. In traditional CMOS circuits, increased area was traded for reduced dynamic power consumption. With leakage power emerging as the dominant mode of power consumption in nanoscale CMOS, these trade-offs must be revisited because of the strong correlation between area and leakage power. The current work discusses how the increased timing slacks obtained by increasing parallelism can be exploited for overall power reduction even in nanoscale circuits. In this context, the paper introduces the notion of 'area efficiency' and a metric for evaluating it. The proposed metric is also used to compare the area efficiencies of different classes of time-shared filters.
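
The trade-off the abstract describes can be made concrete with a back-of-the-envelope model: duplicating a filter datapath adds area (and hence leakage), but the extra timing slack lets each copy run at a lower clock or supply voltage, cutting dynamic power. The numbers, the supply-voltage values, and the leakage/dynamic split below are illustrative assumptions, not figures or formulas from the paper.

```python
# Hypothetical first-order power model for an N-way parallel filter datapath.
# Dynamic power scales with C * V^2 * f per copy; leakage scales with area
# (i.e. with the number of copies). Throughput is held constant.
def total_power(n_copies, v_dd, f_clk, c_eff=20e-12, leak_per_copy=1e-3):
    p_dyn = n_copies * c_eff * v_dd**2 * (f_clk / n_copies)  # each copy runs slower
    p_leak = n_copies * leak_per_copy                        # leakage grows with area
    return p_dyn + p_leak

# Parallelism allows the supply voltage to drop (assumed values):
print(total_power(1, 1.0, 500e6))   # serial datapath
print(total_power(2, 0.8, 500e6))   # 2-way parallel: dynamic savings outweigh leakage
print(total_power(4, 0.7, 500e6))   # 4-way parallel: leakage starts to dominate again
```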

Triple Helix and the Circle of Innovation

  • Phillips, Fred
    • Journal of Contemporary Eastern Asia / v.13 no.1 / pp.57-68 / 2014
  • This paper positions the Triple Helix as a meso-level notion, an epicycle within a grander circle of technological change, institutional change, and psychological change. Because of the differing speeds of these several kinds of change, speed is proposed as a high-level system metric. This implies that what we commonly call bridging agencies or facilitators - lawyers, venture capitalists, incubators, etc. - are better called buffering agencies, as they help to engage entities that change at different speeds. They use human judgment as well as information technologies to choose feasible timing for these engagements. The paper highlights implications for thinking about innovation diffusion: the grand cycle of socio-technical change means we should instead think in terms of innovation reinforcement, or a circle of innovation.

A Simple Technique on Estimating Delay Time Considering Crosstalk Noise in RC-class Interconnects Under Saturated Ramp Input (램프 입력에 대한 RC-class 연결선의 누화잡음을 고려한 지연시간 예측 기법)

  • Kim Ki-Young;Oh Kyung-Mi;Kim Seok-Yoon
    • The Transactions of the Korean Institute of Electrical Engineers C / v.54 no.7 / pp.299-303 / 2005
  • This paper proposes an analytic method that can estimate the delay time at an arbitrary node of RC-class interconnects under a saturated ramp input, considering crosstalk noise, using a simple closed-form expression. For single interconnects, an algebraic expression presented in existing research estimates the delay time under a ramp input from the delay time under a step input, and we apply it to estimate the delay time in the presence of crosstalk noise. As a result, we can provide an intuitive analysis of the signal integrity of circuits that include crosstalk noise while significantly reducing computational complexity.
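
The abstract does not reproduce the paper's closed-form expression, but the general idea of deriving a ramp-input delay from a step-input delay can be illustrated with the common first-order RC approximations below. The Elmore-style step delay and the half-transition-time ramp correction are textbook rules of thumb used here as assumptions, not the authors' exact formula, and crosstalk terms are omitted.

```python
import math

def step_delay_50(r_total, c_total):
    """50% delay of a single RC line under a step input (first-order model):
    t_50 = R * C * ln(2), an Elmore-based approximation."""
    return r_total * c_total * math.log(2.0)

def ramp_delay_50(r_total, c_total, t_rise):
    """Approximate the 50% delay under a saturated ramp input by adding half
    the input transition time to the step-input delay (common rule of thumb,
    reasonable when the transition time is comparable to the RC constant)."""
    return step_delay_50(r_total, c_total) + 0.5 * t_rise

# Illustrative values (ohms, farads, seconds), not taken from the paper:
print(ramp_delay_50(r_total=200.0, c_total=150e-15, t_rise=50e-12))
```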

A Bit-level ACSU of High Speed Viterbi Decoder

  • Kim, Min-Woo;Cho, Jun-Dong
    • JSTS: Journal of Semiconductor Technology and Science / v.6 no.4 / pp.240-245 / 2006
  • A Viterbi decoder is composed of a BMU (Branch Metric Unit), an ACSU (Add-Compare-Select Unit), and an SMU (Survivor path Memory Unit). For high-speed Viterbi decoders, the ACSU is the main bottleneck due to its compare-select and feedback operation, and many studies have addressed this problem; the M-step look-ahead technique and the Minimized method are typical high-speed algorithms. In this paper, we designed a bit-level ACSU (K=3, R=1/2, 4-bit soft decision) based on those algorithms and switched the matrix product order in the backward direction of the Minimized method so as to apply a Code-Optimized-Array and reduce the area complexity. For experimentation, we synthesized our design using the Synopsys Design Compiler with a TSMC 0.18 um library and verified the timing using Cadence Verilog-XL.
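
For readers unfamiliar with why the ACSU is the recursive bottleneck, the sketch below shows one add-compare-select step for a state with two predecessor paths. It is a generic word-level illustration of the operation, not the bit-level, K=3, R=1/2 design or the look-ahead/Minimized reformulations described in the paper.

```python
def acs_step(path_metric_a, branch_metric_a, path_metric_b, branch_metric_b):
    """One add-compare-select step: extend two competing paths into the same
    trellis state and keep the survivor. The new metric feeds back into the
    next symbol period, which is what limits the clock rate of the decoder."""
    cand_a = path_metric_a + branch_metric_a   # add
    cand_b = path_metric_b + branch_metric_b   # add
    if cand_a <= cand_b:                       # compare (lower metric wins)
        return cand_a, 0                       # select path from predecessor A
    return cand_b, 1                           # select path from predecessor B

# Example: survivor metric and decision bit for one state in one symbol period
new_metric, decision = acs_step(10, 3, 9, 5)
print(new_metric, decision)                    # -> 13 0
```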

Research on data augmentation algorithm for time series based on deep learning

  • Shiyu Liu;Hongyan Qiao;Lianhong Yuan;Yuan Yuan;Jun Liu
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.6 / pp.1530-1544 / 2023
  • Data monitoring is an important foundation of modern science. In most cases, the monitored data are time series, which have high application value. Deep learning algorithms have a strong nonlinear fitting capability, which enables the recognition of time series by capturing anomalous information in them, so research on deep-learning-based time series recognition is especially important for data monitoring. Deep learning algorithms require a large amount of data for training. However, abnormal samples are rare in time series, and this class imbalance can seriously degrade the accuracy of the recognition algorithm. In order to increase the number of abnormal samples, a data augmentation method called GANBATS (GAN-based Bi-LSTM and Attention for Time Series) is proposed. In GANBATS, a Bi-LSTM is introduced to extract timing features, which are then transferred to the generator network of GANBATS. GANBATS also modifies the discriminator network by adding an attention mechanism to achieve global attention over the time series, and appends an average pooling layer at the end of the discriminator, which merges temporal features to boost operational efficiency. In this paper, four time series datasets and five data augmentation algorithms are used for comparison experiments. The generated data are measured by PRD (Percent Root Mean Square Difference) and DTW (Dynamic Time Warping). The experimental results show that GANBATS reduces the PRD metric by up to 26.22 and the DTW metric by up to 9.45. In addition, this paper uses different algorithms to reconstruct the datasets and compares them by classification accuracy, which is improved by 6.44%-12.96% on the four time series datasets.
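
The abstract names the main architectural pieces (Bi-LSTM feature extraction, an attention-augmented discriminator, and a final average pooling layer) without giving layer sizes or training details. The PyTorch sketch below wires those pieces together in the described order; all dimensions, layer counts, and the exact placement of each module are assumptions for illustration, not the published GANBATS configuration, and the GAN training loop is omitted.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Bi-LSTM feature extractor feeding a projection that emits a synthetic
    time series from a noise sequence (dimensions are illustrative)."""
    def __init__(self, noise_dim=16, hidden=32):
        super().__init__()
        self.bilstm = nn.LSTM(noise_dim, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, 1)

    def forward(self, z):                      # z: (batch, seq_len, noise_dim)
        feats, _ = self.bilstm(z)              # timing features
        return self.proj(feats)                # (batch, seq_len, 1) synthetic series

class Discriminator(nn.Module):
    """Discriminator with self-attention for global context over the series,
    followed by average pooling that merges temporal features."""
    def __init__(self, hidden=32, heads=4):
        super().__init__()
        self.embed = nn.Linear(1, hidden)
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.pool = nn.AdaptiveAvgPool1d(1)    # average pooling over time
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, seq_len, 1)
        h = self.embed(x)
        h, _ = self.attn(h, h, h)              # global attention over time steps
        h = self.pool(h.transpose(1, 2)).squeeze(-1)   # (batch, hidden)
        return torch.sigmoid(self.out(h))      # real/fake score

# Shape check with random data (no training loop shown):
g, d = Generator(), Discriminator()
fake = g(torch.randn(8, 96, 16))
print(fake.shape, d(fake).shape)               # (8, 96, 1) and (8, 1)
```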

Fault Coverage Metric for Delay Fault Testing (지연 고장 테스팅에 대한 고장 검출율 메트릭)

  • Kim, Myeong-Gyun;Gang, Seong-Ho;Han, Chang-Ho;Min, Hyeong-Bok
    • Journal of the Institute of Electronics Engineers of Korea SD / v.38 no.4 / pp.266-276 / 2001
  • Due to the rapid development of semiconductor technology, the complexity of VLSI circuits has increased heavily. With the increased densities of integrated circuits, several different types of faults can occur, so testing such circuits is becoming a severe problem. Delay testing can detect system timing failures caused by delay faults. However, the conventional delay fault coverage, defined in terms of the number of detected faults, may not be an effective measure of delay testing because, unlike a stuck-at fault, the impact of a delay fault depends on its delay defect size rather than on its mere existence. Thus, the effectiveness of delay testing depends on the propagation delay of the path to be tested, the delay defect size, and the system clock interval. This paper proposes a new delay defect fault coverage that considers both the propagation delay of the path to be tested and the additional delay defect size, and analyzes the relationship between this coverage and the defect level. (A hedged illustrative sketch of such a size-weighted coverage appears after this entry.)

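The abstract argues that a delay fault only matters when the path's propagation delay plus the defect size exceeds the clock interval, so a coverage metric should account for how much of each fault's harmful defect-size range a test actually detects. The sketch below encodes that reasoning as a hypothetical size-weighted coverage; the weighting scheme and the numbers are assumptions for illustration, not the formula proposed in the paper.

```python
def weighted_delay_coverage(faults, t_clk, defect_sizes, size_prob):
    """Hypothetical size-weighted delay fault coverage.

    faults       : list of (path_delay, detection_threshold) tuples, where
                   detection_threshold is the smallest extra delay the applied
                   test can observe on that path (inf if the fault is untested)
    t_clk        : system clock interval
    defect_sizes : candidate defect sizes (same time unit as t_clk)
    size_prob    : probability of each defect size (sums to 1)

    A defect of size s on a path with delay d causes a failure when d + s > t_clk;
    it counts as detected when s >= detection_threshold. Coverage is the detected
    share of all failure-causing (fault, size) combinations.
    """
    causing = detected = 0.0
    for path_delay, threshold in faults:
        for s, p in zip(defect_sizes, size_prob):
            if path_delay + s > t_clk:        # defect actually breaks timing
                causing += p
                if s >= threshold:            # and the applied test catches it
                    detected += p
    return detected / causing if causing else 1.0

# Illustrative numbers in nanoseconds, not from the paper:
faults = [(8.0, 1.0), (6.0, 3.0), (9.5, float("inf"))]
print(weighted_delay_coverage(faults, t_clk=10.0,
                              defect_sizes=[0.5, 1.0, 2.0, 4.0],
                              size_prob=[0.4, 0.3, 0.2, 0.1]))
```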

A Novel Video Quality Degradation Monitoring Scheme over an IPTV Service with Packet Loss (IPTV 서비스에서 패킷손실에 의한 비디오품질 열화 모니터링 방법)

  • Kwon, Jae-Cheol;Oh, Seoung-Jun;Suh, Chang-Ryul;Chin, Young-Min
    • Journal of Broadcast Engineering / v.14 no.5 / pp.573-588 / 2009
  • In this paper, we propose a novel video quality degradation monitoring scheme, titled VR-VQMS (Visual Rhythm based Video Quality Monitoring Scheme), for an IPTV service prone to packet losses during network transmission. The proposed scheme quantifies the amount of quality degradation due to packet losses and can be classified as an RR (reduced-reference) quality measurement scheme that exploits, as feature information, the visual rhythm data of H.264-encoded video frames at a media server and of the reconstructed frames at a set-top box. Two scenarios, on-line and off-line VR-VQMS, are proposed as practical solutions. We define the NPSNR (Networked Peak-to-peak Signal-to-Noise Ratio), a modification of the well-known PSNR, as a new objective quality metric, along with several additional objective and subjective metrics based on it, to obtain statistics on the timing, duration, occurrence, and amount of quality degradation. Simulation results show that the proposed method closely approximates the results obtained from full 2D video frames and gives a good estimation of the subjective quality (i.e., MOS (mean opinion score)) assessed by 10 test observers. We expect that the proposed scheme can serve as a practical solution for monitoring the video quality experienced by individual customers in a commercial IPTV service and can be implemented as a small, lightweight agent program running on a resource-limited set-top box.
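
Visual rhythm, as used in the abstract above, reduces each frame to a thin strip of sampled pixels so that a whole video becomes a single 2-D image that is cheap to transmit as reduced-reference feature data. The sketch below extracts a simple visual rhythm (one pixel column per frame, an assumed sampling pattern) and compares server-side and set-top-box-side rhythms with a PSNR-style score; it does not reproduce the actual VR-VQMS feature layout or the NPSNR definition.

```python
import numpy as np

def visual_rhythm(frames, col=None):
    """Build a visual-rhythm image from a video: take one pixel column per
    frame and stack the columns over time. `frames` is (T, H, W) luminance."""
    t, h, w = frames.shape
    col = w // 2 if col is None else col        # assumed sampling line: center column
    return frames[:, :, col].T                  # (H, T) rhythm image

def rhythm_psnr(ref_rhythm, deg_rhythm, peak=255.0):
    """PSNR between the reference rhythm (media-server side) and the degraded
    rhythm (set-top-box side); a stand-in for a full-frame quality score."""
    mse = np.mean((ref_rhythm.astype(float) - deg_rhythm.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Synthetic example: a packet loss corrupts frames 40-60 on the degraded side
ref = np.random.randint(0, 256, size=(100, 240, 320), dtype=np.uint8)
deg = ref.copy()
deg[40:60] = 0                                   # simulated loss-affected frames
print(rhythm_psnr(visual_rhythm(ref), visual_rhythm(deg)))
```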