• Title/Summary/Keyword: adaptive model

Analytical Models and Performance Evaluations of SNMP and Mobile Agent (SNMP와 이동에이전트의 해석적 모델 및 성능 평가)

  • 이정우;윤완오;신광식;최상방
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.8B
    • /
    • pp.716-729
    • /
    • 2003
  • As the public Internet and private intranets have grown from small networks into large infrastructures, the need to manage the large number of network components within these networks more systematically has grown more important as well. The rapid growth of network size has brought into question the scalability of existing centralized models such as SNMP (Simple Network Management Protocol) and CMIP (Common Management Information Protocol). Thus, for efficient network management, research on mobile agents has also been conducted recently. This paper presents analytical models of a centralized approach based on the SNMP protocol, a distributed approach based on mobile agents, and a mixed mode that makes up for the shortcomings of both. We compare the performance of these analytical models in terms of network management response time. Experimental results show that the performance of the mobile agent and the mixed mode is less sensitive to delay in a WAN environment, whereas SNMP is more efficient for a simple network environment such as a LAN. Based on the results of the analytical models, we also propose an adaptive network management algorithm that takes network delay, task, and the number of nodes into consideration. The results show that the adaptive algorithm can reduce the network management response time by 10% compared with either the mobile agent or the mixed mode algorithm.
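
A minimal sketch of such an adaptive selector follows. The cost functions are illustrative stand-ins for the paper's analytical response-time models, which the abstract does not reproduce; all constants and the `agent_overhead` parameter are assumptions.

```python
# Hypothetical adaptive manager: pick the cheaper strategy per task.
# The cost models below are illustrative stand-ins, not the paper's equations.

def snmp_response_time(num_nodes: int, delay: float, task_time: float) -> float:
    """Centralized SNMP: one request/response per node, each paying the link delay."""
    return num_nodes * (2 * delay + task_time)

def agent_response_time(num_nodes: int, delay: float, task_time: float,
                        agent_overhead: float = 0.05) -> float:
    """Mobile agent: pays the WAN delay once, then hops from node to node."""
    return 2 * delay + num_nodes * (task_time + agent_overhead)

def choose_strategy(num_nodes: int, delay: float, task_time: float) -> str:
    t_snmp = snmp_response_time(num_nodes, delay, task_time)
    t_agent = agent_response_time(num_nodes, delay, task_time)
    return "SNMP" if t_snmp <= t_agent else "mobile agent"

# LAN-like delays favor SNMP; WAN-like delays favor the mobile agent.
print(choose_strategy(num_nodes=50, delay=0.001, task_time=0.01))  # SNMP
print(choose_strategy(num_nodes=50, delay=0.200, task_time=0.01))  # mobile agent
```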

An Active Queue Management Method Based on the Input Traffic Rate Prediction for Internet Congestion Avoidance (인터넷 혼잡 예방을 위한 입력율 예측 기반 동적 큐 관리 기법)

  • Park, Jae-Sung;Yoon, Hyun-Goo
    • 전자공학회논문지 IE
    • /
    • v.43 no.3
    • /
    • pp.41-48
    • /
    • 2006
  • In this paper, we propose a new active queue management (AQM) scheme that exploits the predictability of Internet traffic. The proposed scheme predicts the future traffic input rate using an auto-regressive (AR) time series model and determines the future congestion level by comparing the predicted input rate with the service rate. If congestion is expected, the packet drop probability is dynamically adjusted to avoid the anticipated congestion. Unlike previous AQM schemes, which use the queue length variation as the congestion measure, the proposed scheme uses the variation of the traffic input rate. By predicting the network congestion level, the proposed scheme adapts more rapidly to changing network conditions and stabilizes the average queue length and its variation even when the traffic input level varies widely. Through an ns-2 simulation study in varying network environments, we compare the performance of RED, Adaptive RED (ARED), REM, Predicted AQM (PAQM), and the proposed scheme in terms of average queue length and packet drop rate, and show that the proposed scheme is more adaptive to varying network conditions and has a shorter response time.
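
The predict-then-adjust loop the abstract describes can be sketched as follows. The AR order, the adjustment step, and the control law are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def ar_predict(rates: np.ndarray, order: int = 3) -> float:
    """One-step-ahead AR(p) forecast of the traffic input rate, fitted by
    least squares on the most recent samples."""
    X = np.column_stack([rates[i:len(rates) - order + i] for i in range(order)])
    y = rates[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(rates[-order:] @ coef)

def update_drop_prob(drop_p: float, predicted_rate: float, service_rate: float,
                     step: float = 0.02) -> float:
    """Raise the drop probability when congestion is predicted, relax it otherwise."""
    if predicted_rate > service_rate:
        drop_p += step * (predicted_rate / service_rate - 1.0)
    else:
        drop_p -= step
    return min(max(drop_p, 0.0), 1.0)

# Per measurement interval: predict the input rate, then adjust the drop probability.
rates = np.array([80.0, 85.0, 90.0, 96.0, 103.0, 111.0])  # Mb/s samples (made up)
p = update_drop_prob(0.01, ar_predict(rates), service_rate=100.0)
```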

Adaptive Control of End Milling Machine to Improve Machining Straightness (직선도 개선을 위한 엔드밀링머시인의 적응제어)

  • 김종선;정성종;이종원
    • Transactions of the Korean Society of Mechanical Engineers
    • /
    • v.9 no.5
    • /
    • pp.590-597
    • /
    • 1985
  • A recursive geometric adaptive control method is developed to compensate for the machining straightness error in the finished surface caused by tool deflection and guideway error in the end milling process. The relationship between tool deflection and feedrate is modeled by a modified Taylor's tool life equation. Without a priori knowledge of the variations of the cutting parameters, the time-varying parameters are estimated by an exponentially windowed recursive least squares method using only post-process measurements of the straightness error. The location error is controlled by shifting the milling bed in the direction perpendicular to the finished surface and adding a certain amount of feedrate according to the tool deflection model before cutting. The waviness error is compensated by adjusting the feedrate during machining. Experimental results show that the location error is controlled within the range of the fixturing error of the bed on the guideway, and that about a 60% reduction in the waviness error can be achieved within a few steps of parameter adaptation over wide ranges of cutting conditions, even if the parameters do not converge to fixed values.
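
The estimator the abstract names, recursive least squares with an exponential window (forgetting factor), is standard; a minimal sketch follows. The regressor vector `phi` and the tool-deflection model it would encode are not given in the abstract, so they remain placeholders.

```python
import numpy as np

class ForgettingRLS:
    """Recursive least squares with exponential forgetting: recent
    post-process measurements dominate older ones, so time-varying
    cutting parameters can be tracked."""

    def __init__(self, n_params: int, lam: float = 0.95):
        self.theta = np.zeros(n_params)      # parameter estimates
        self.P = 1e3 * np.eye(n_params)      # large initial covariance
        self.lam = lam                       # forgetting factor (0 < lam <= 1)

    def update(self, phi: np.ndarray, y: float) -> np.ndarray:
        """phi: regressor vector; y: measured straightness error."""
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta = self.theta + gain * (y - phi @ self.theta)
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam
        return self.theta
```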

ECG Signal Compression based on Adaptive Multi-level Code (적응적 멀티 레벨 코드 기반의 심전도 신호 압축)

  • Kim, Jungjoon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.23 no.6
    • /
    • pp.519-526
    • /
    • 2013
  • An ECG signal has the feature that the P, Q, R, S, and T waves repeat in a cycle, and it is generally sampled at a high sampling frequency. By exploiting this periodicity, compression efficiency should be maximized while minimizing the loss of information important for diagnosis. However, the amplitude and period of the signal are not constant across measurement times and patients; even when measured at the same time, different patients display different periodic intervals. In this paper, an adaptive multi-level coding scheme is presented that adaptively codes the dominant and non-dominant signal intervals of the ECG signal. The proposed method maximizes compression efficiency by using a multi-level code that applies different compression ratios to the dominant and non-dominant signal intervals in consideration of information loss. For long measurements, the method maximizes the compression ratio compared with existing methods that do not exploit the periodicity of the ECG signal, and because the non-dominant signal intervals are coded losslessly, they can be stored without loss of information. The effectiveness of the compression is demonstrated through experiments on ECG signals from the MIT-BIH arrhythmia database.
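
As a rough illustration of the split-then-code idea only (not the paper's actual code design), the sketch below keeps dominant intervals verbatim and delta-codes non-dominant intervals losslessly; the interval mask is assumed to come from some upstream QRS/interval detector.

```python
import zlib
import numpy as np

def two_level_compress(signal: np.ndarray, dominant_mask: np.ndarray):
    """Toy two-level coder: dominant samples are kept verbatim, non-dominant
    samples are delta-coded (small deltas compress well), and both streams
    are entropy-coded with zlib, so no information is discarded."""
    dominant = signal[dominant_mask].astype(np.int16)
    rest = signal[~dominant_mask].astype(np.int16)
    rest_deltas = np.diff(rest, prepend=np.int16(0))  # invertible via cumsum
    return zlib.compress(dominant.tobytes()), zlib.compress(rest_deltas.tobytes())
```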

A Study on Projection Image Restoration by Adaptive Filtering (적응적 필터링에 의한 투사영상 복원에 관한 연구)

  • 김정희;김광익
    • Journal of Biomedical Engineering Research
    • /
    • v.19 no.2
    • /
    • pp.119-128
    • /
    • 1998
  • This paper describes a filtering algorithm that employs a priori information about SPECT lesion detectability for filtering degraded projection images prior to backprojection reconstruction. In this algorithm, we determine m minimum detectable lesion sizes (MDLSs) by assuming m object contrasts chosen uniformly in the range 0.0-1.0, based on a signal/noise model that expresses the detection capability of SPECT in terms of physical factors. A best estimate of a given projection image is formed as a weighted combination of the subimages from m optimal filters, each designed to maximize the local S/N ratio for lesions of its MDLS. These subimages show relatively larger resolution recovery and relatively smaller noise reduction as the MDLS decreases, and the weight on each subimage is controlled by the difference between that subimage and the maximum-resolution-recovered projection image. The proposed filtering algorithm was tested on SPECT image reconstruction problems and produced good results. In particular, the algorithm showed an adaptive effect: it approximately averages the filter outputs in homogeneous areas, while in textured lesion areas of the reconstructed image it depends sensitively on each filter's strength in preserving or enhancing contrast.
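
The weighted-combination step can be caricatured as below. Gaussian blurs of different widths stand in for the m MDLS-matched optimal filters, and the weights follow the abstract's rule of favoring subimages close to the maximum-resolution-recovered image; everything here is an assumed stand-in for the actual filter design.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def adaptive_combine(projection: np.ndarray, sigmas=(0.5, 1.0, 2.0, 4.0)):
    """Weighted combination of m filtered subimages; each weight is inversely
    proportional to the subimage's difference from the reference, so nearly
    identical subimages (homogeneous areas) get roughly averaged."""
    subimages = [gaussian_filter(projection, s) for s in sigmas]
    reference = projection  # stand-in for the max-resolution-recovered image
    weights = [1.0 / (np.abs(s - reference) + 1e-6) for s in subimages]
    total = np.sum(weights, axis=0)
    return sum(w * s for w, s in zip(weights, subimages)) / total
```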

Improved VFM Method for High Accuracy Flight Simulation (고정밀 비행 시뮬레이션을 위한 개선 VFM 기법 연구)

  • Lee, Chiho;Kim, Mukyeom;Lee, Jae-Lyun;Jeon, Kwon-Su;Tyan, Maxim;Lee, Jae-Woo
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.49 no.9
    • /
    • pp.709-719
    • /
    • 2021
  • Recent progress in analysis and flight simulation methods enables wider use of virtual certification and reduces the number of certification flight tests. An aerodynamic database (AeroDB) is one of the most important components of a flight simulation; it is composed of aerodynamic coefficients over a range of flight conditions and control deflections. This paper proposes an efficient method for constructing an AeroDB that combines Gaussian-process-based variable fidelity modeling with an adaptive sampling algorithm. A case study of the virtual certification of an F-16 fighter is presented. Four AeroDBs were constructed using different numbers and distributions of high-fidelity data points. The constructed databases were then used to simulate gliding, short-period pitch, and roll responses, and compliance with certification regulations was checked. The case study demonstrates that the proposed method can significantly reduce the number of high-fidelity data points while maintaining high simulation accuracy.
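
Both named ingredients, a Gaussian-process variable fidelity model and adaptive sampling, can be sketched with scikit-learn. This is a generic additive-bridge formulation under stated assumptions, not necessarily the paper's exact VFM; `lf_model` is a hypothetical low-fidelity solver callable.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def fit_discrepancy_gp(X_hf: np.ndarray, y_hf: np.ndarray, lf_model):
    """Fit a GP to the discrepancy between sparse high-fidelity data and a
    cheap low-fidelity model evaluated at the same points."""
    residual = y_hf - lf_model(X_hf)              # high minus low fidelity
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X_hf, residual)
    return gp

def vfm_predict(gp, lf_model, X: np.ndarray) -> np.ndarray:
    """Corrected prediction: low-fidelity trend plus learned discrepancy."""
    return lf_model(X) + gp.predict(X)

def next_sample(gp, candidates: np.ndarray) -> np.ndarray:
    """Adaptive sampling: run the next high-fidelity case where the
    discrepancy GP is least certain."""
    _, std = gp.predict(candidates, return_std=True)
    return candidates[np.argmax(std)]
```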

Optimal Spatial Scale for Land Use Change Modelling : A Case Study in a Savanna Landscape in Northern Ghana (지표피복변화 연구에서 최적의 공간스케일의 문제 : 가나 북부지역의 사바나 지역을 사례로)

  • Nick van de Giesen;Paul L. G. Vlek;Park Soo Jin
    • Journal of the Korean Geographical Society
    • /
    • v.40 no.2 s.107
    • /
    • pp.221-241
    • /
    • 2005
  • Land Use and Land Cover Changes (LUCC) occur over a wide range of space and time scales, and involve complex natural, socio-economic, and institutional processes. Therefore, modelling and predicting LUCC demands an understanding of how various measured properties behave when considered at different scales. Understanding spatial and temporal variability of driving forces and constraints on LUCC is central to understanding the scaling issues. This paper aims to 1) assess the heterogeneity of land cover change processes over the landscape in northern Ghana, where intensification of agricultural activities has been the dominant land cover change process during the past 15 years, 2) characterise dominant land cover change mechanisms for various spatial scales, and 3) identify the optimal spatial scale for LUCC modelling in a savanna landscape. A multivariate statistical method was first applied to identify land cover change intensity (LCCI), using four time-sequenced NDVI images derived from LANDSAT scenes. Three proxy land use change predictors (distance from roads, distance from surface water bodies, and a terrain characterisation index) were regressed against the LCCI using a multi-scale hierarchical adaptive model to identify scale dependency and spatial heterogeneity of LUCC processes. High spatial associations between the LCCI and land use change predictors were mostly limited to moving windows smaller than 10$\times$10 km. With increasing window size, LUCC processes within the window tend to be too diverse to establish clear trends, because changes in one part of the window are compensated elsewhere. This results in a reduced correlation between LCCI and land use change predictors at a coarser spatial extent. The spatial coverage of 5-10 km is incidentally equivalent to a village or community area in the study region. In order to reduce spatial variability of land use change processes for regional or national level LUCC modelling, we suggest that the village level is the optimal spatial investigation unit in this savanna landscape.
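
The scale sweep at the heart of the analysis can be approximated with a much simpler stand-in than the hierarchical adaptive model: compute a windowed association between the LCCI raster and one predictor raster at several window sizes and watch where it peaks. The function below is only that simplified illustration; names like `lcci` and `dist_roads` are hypothetical.

```python
import numpy as np

def windowed_r2(lcci: np.ndarray, predictor: np.ndarray, win: int) -> float:
    """Mean squared correlation between two rasters over non-overlapping
    win x win windows; a crude stand-in for the paper's multi-scale model,
    meant only to show the scale sweep."""
    scores = []
    for i in range(0, lcci.shape[0] - win + 1, win):
        for j in range(0, lcci.shape[1] - win + 1, win):
            a = lcci[i:i + win, j:j + win].ravel()
            b = predictor[i:i + win, j:j + win].ravel()
            if a.std() > 0 and b.std() > 0:
                scores.append(np.corrcoef(a, b)[0, 1] ** 2)
    return float(np.mean(scores)) if scores else float("nan")

# Sweep window sizes to find where the association peaks:
# for win in (4, 8, 16, 32):
#     print(win, windowed_r2(lcci, dist_roads, win))
```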

Development and Testing of a Machine Learning Model Using 18F-Fluorodeoxyglucose PET/CT-Derived Metabolic Parameters to Classify Human Papillomavirus Status in Oropharyngeal Squamous Carcinoma

  • Changsoo Woo;Kwan Hyeong Jo;Beomseok Sohn;Kisung Park;Hojin Cho;Won Jun Kang;Jinna Kim;Seung-Koo Lee
    • Korean Journal of Radiology
    • /
    • v.24 no.1
    • /
    • pp.51-61
    • /
    • 2023
  • Objective: To develop and test a machine learning model for classifying the human papillomavirus (HPV) status of patients with oropharyngeal squamous cell carcinoma (OPSCC) using 18F-fluorodeoxyglucose (18F-FDG) PET-derived parameters and an appropriate combination of machine learning methods. Materials and Methods: This retrospective study enrolled 126 patients (118 male; mean age, 60 years) with newly diagnosed, pathologically confirmed OPSCC who underwent 18F-FDG PET-computed tomography (CT) between January 2012 and February 2020. Patients were randomly assigned to training and internal validation sets in a 7:3 ratio. An external test set of 19 patients (16 male; mean age, 65.3 years) was recruited sequentially from two other tertiary hospitals. Model 1 used only PET parameters, Model 2 used only clinical features, and Model 3 used both PET and clinical parameters. Multiple feature transforms, feature selection methods, oversampling methods, and training models were investigated. The external test set was used to test the three models that performed best on the internal validation set, and the area under the receiver operating characteristic curve (AUC) was compared between models. Results: In the external test set, the ExtraTrees-based Model 3, which uses two PET-derived parameters and three clinical features with a combination of MinMaxScaler, mutual information selection, and the adaptive synthetic sampling approach, showed the best performance (AUC = 0.78; 95% confidence interval, 0.46-1). Model 3 outperformed Model 1, which used PET parameters alone (AUC = 0.48, p = 0.047), and Model 2, which used clinical parameters alone (AUC = 0.52, p = 0.142), in predicting HPV status. Conclusion: Using oversampling and mutual information selection, an ExtraTrees-based HPV status classifier was developed by combining metabolic parameters derived from 18F-FDG PET/CT with clinical parameters in OPSCC, and it exhibited higher performance than models using either PET or clinical parameters alone.
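
The winning pipeline's named components map directly onto standard scikit-learn/imbalanced-learn pieces; a sketch under that assumption follows. Hyperparameters such as `k=5` (chosen here to mirror the two PET plus three clinical features) and the random seeds are guesses, not the study's settings.

```python
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import ExtraTreesClassifier
from imblearn.over_sampling import ADASYN
from imblearn.pipeline import Pipeline

# Model 3 as described: scaling, mutual-information feature selection,
# adaptive synthetic oversampling (applied only during fit), ExtraTrees.
model3 = Pipeline(steps=[
    ("scale", MinMaxScaler()),
    ("select", SelectKBest(mutual_info_classif, k=5)),
    ("oversample", ADASYN(random_state=0)),
    ("clf", ExtraTreesClassifier(random_state=0)),
])
# model3.fit(X_train, y_train)
# probs = model3.predict_proba(X_test)[:, 1]  # for the AUC comparison
```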

Collaborative Planning Model for Brownfield Regeneration (브라운필드 재생을 위한 협력적 계획 모델 연구)

  • Kim, Eujin Julia;Miller, Patrick
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.43 no.3
    • /
    • pp.92-100
    • /
    • 2015
  • Unlike most other planning processes, brownfield planning generally requires a high level of technical and legal expertise due to potential site contamination. To successfully engage in inclusionary decision making, an adaptive collaboration strategy for brownfield planning is therefore critical. This study examines how a communicative planning approach can be used to overcome the challenge of enabling experts from different fields to work alongside lay people from the local community to achieve a properly balanced collaboration in brownfield planning. After identifying appropriate indicators for collaboration through a literature review of established communicative planning theory, these indicators are applied to the brownfield planning process, highlighting critical points of collaboration such as site prioritization, assessment, remediation, and redevelopment. The results suggest the critical need for an adaptive model focusing on three aspects: 1. facilitation of a balanced dialogue between experts with social, cultural, and design-based knowledge and those with scientific and engineering-based knowledge; 2. preparation of an appropriate tool for risk communication with lay people; and 3. development of a decision support system for integrating expert-oriented technical data and public-opinion-oriented subjective data.

A Hierarchical Group-Based CAVLC Decoder (계층적 그룹 기반의 CAVLC 복호기)

  • Ham, Dong-Hyeon;Lee, Hyoung-Pyo;Lee, Yong-Surk
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.45 no.2
    • /
    • pp.26-32
    • /
    • 2008
  • Video compression schemes have been developed and used for many years; currently, H.264/AVC is the most efficient video coding standard. The H.264/AVC baseline profile adopts CAVLC (Context-Adaptive Variable Length Coding) as its entropy coding method. CAVLC achieves better compression ratios than conventional VLC (Variable Length Coding). However, because a CAVLC decoder uses many VLC tables, it requires a large area in hardware; conversely, because it must search those tables, it performs poorly in software. In this paper, we propose a new hierarchical grouping method for the VLC tables. An index into the reconstructed VLC tables can be obtained by simple arithmetic operations, so the tables are accessed just once when decoding a symbol. We modeled the proposed algorithm in C, compiled it under ARM ADS1.2, and simulated it with the ARMulator. Experimental results show that the proposed algorithm reduces execution time by about 80% and 15% compared with the H.264/AVC reference program JM (Joint Model) 10.2 and a recently proposed arithmetic-operation algorithm, respectively.
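
The core trick, one arithmetic index computation into a reorganized table instead of repeated table searches, can be shown schematically. The grouping below is invented for illustration (the actual hierarchical grouping of the H.264 CAVLC tables is the paper's contribution), and Python stands in for the paper's C implementation.

```python
# Schematic only: group layout, offsets, and suffix width are invented.

GROUP_BASE = (0, 4, 12, 28)        # hypothetical start offset of each group
FLAT_TABLE = tuple(range(44))      # stand-in for the reconstructed VLC table

def leading_zeros(bits: str) -> int:
    """Leading-zero count of the next codeword; here it selects the code
    group directly, so no table needs to be searched."""
    return len(bits) - len(bits.lstrip("0"))

def decode_symbol(bits: str) -> int:
    """Index = group base + suffix value: a couple of arithmetic operations
    and a single table access per symbol."""
    group = leading_zeros(bits)
    suffix = bits[group + 1:group + 3]            # two suffix bits (invented)
    return FLAT_TABLE[GROUP_BASE[group] + int(suffix or "0", 2)]

print(decode_symbol("00110"))  # group 2, suffix "10" -> FLAT_TABLE[12 + 2] = 14
```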