• Title/Summary/Keyword: Propose technological

Search results: 333

Prefix Cuttings for Packet Classification with Fast Updates

  • Han, Weitao; Yi, Peng; Tian, Le
    • KSII Transactions on Internet and Information Systems (TIIS), v.8 no.4, pp.1442-1462, 2014
  • Packet classification is a key Internet technology that lets routers classify arriving packets into different flows according to predefined rulesets. Previous packet classification algorithms have mainly focused on search speed and memory usage while overlooking update performance. In this paper, we propose PreCuts, which drastically improves update speed. Exploiting the characteristics of the IP fields, we implement three heuristics to build a 3-layer decision tree. In the first layer, we group rules that share the same highest byte of the source and destination IP addresses. In the second layer, we cluster rules that share the same IP prefix length. Finally, we use an information entropy-based bit-partition heuristic to choose specific bits of the IP prefix and split the ruleset into subsets. The heuristics of PreCuts introduce no rule duplication, so incremental updates do not degrade time or space performance. Experiments with ClassBench show that, compared with BRPS and EffiCuts, the proposed algorithm not only improves time and space performance but also greatly increases update speed.
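
The third-layer heuristic above can be illustrated with a small, hedged sketch: given rules represented as (IP value, prefix length) pairs, score each bit position by the entropy of how the rules fall on it (0, 1, or wildcard) and keep the most informative bits for splitting. The rule representation, the 32-bit source-IP scope, and all names below are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of entropy-based bit selection for splitting a ruleset.
# Rules are assumed to be (value, prefix_len) pairs over a 32-bit IP field;
# this is an illustration, not the paper's PreCuts implementation.
from collections import Counter
from math import log2

def bit_entropy(rules, bit):
    """Entropy of how rules fall on one IP bit: 0, 1, or wildcard (*)."""
    def label(value, prefix_len):
        if bit >= prefix_len:          # bit not covered by the prefix
            return '*'
        return (value >> (31 - bit)) & 1
    counts = Counter(label(v, p) for v, p in rules)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def choose_split_bits(rules, k=2):
    """Pick the k bits whose value distribution is most informative."""
    scored = sorted(range(32), key=lambda b: bit_entropy(rules, b), reverse=True)
    return scored[:k]

# Example: /8, /16 and /24 source prefixes as (value, prefix_len)
rules = [(0x0A000000, 8), (0x0A010000, 16), (0xC0A80100, 24)]
print(choose_split_bits(rules))
```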

Simulation Based Method for Mid-and-Long Term Technological Forecasting (중장기 기술예측을 위한 시뮬레이션 기반 방법론)

  • Yu, Sung-Yeol
    • The Journal of the Korea Contents Association, v.10 no.1, pp.372-380, 2010
  • In this study, we consider a mid- and long-term technological forecasting method based on simulation. We first gather, through a Delphi survey, the expected appearance times of technologies that will be developed in the future and the influence relationships among those technologies. We then propose a simulation-based heuristic that uses the Delphi data to search for the key technologies needed to attain a normative objective. Whereas a traditional Delphi survey records an expert's estimate of occurrence time as a single point, we work with a range of occurrence times for each technology and define key technologies on that basis. The key technologies identified by this procedure provide priorities for R&D planning and aid the R&D planner or project manager in resource allocation.
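
As a rough, assumption-laden illustration of the simulation idea (not the paper's actual procedure), the sketch below samples each technology's appearance time from a Delphi-style range, forces technologies to appear no earlier than the technologies that influence them, and counts how often each one delays a target objective. The ranges, influence links, and counting rule are all made up.

```python
# Monte Carlo illustration of occurrence-time ranges and influence links.
# All values, the precedence model, and names are assumptions for illustration.
import random

ranges = {"A": (2026, 2030), "B": (2028, 2034), "target": (2030, 2040)}
influences = {"B": ["A"], "target": ["B"]}   # technology -> prerequisite technologies

def realized_time(tech, sample, memo):
    """A technology cannot appear before the technologies that influence it."""
    if tech not in memo:
        t = sample[tech]
        for pre in influences.get(tech, []):
            t = max(t, realized_time(pre, sample, memo))
        memo[tech] = t
    return memo[tech]

def key_technology(trials=10_000):
    """Count how often each technology pushes the target past its own sampled time."""
    delay_counts = {tech: 0 for tech in ranges if tech != "target"}
    for _ in range(trials):
        sample = {k: random.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
        memo = {}
        realized_time("target", sample, memo)
        for tech in delay_counts:
            if memo.get(tech, sample[tech]) > sample["target"]:
                delay_counts[tech] += 1
    return max(delay_counts, key=delay_counts.get)

print(key_technology())
```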

Simple Fuzzy Rule Based Edge Detection

  • Verma, O.P.; Jain, Veni; Gumber, Rajni
    • Journal of Information Processing Systems, v.9 no.4, pp.575-591, 2013
  • Most edge detection methods in the literature are gradient based and then apply thresholding to obtain the final edge map of an image. In this paper, we propose a novel fuzzy-logic method for edge detection in gray images that uses neither the gradient nor thresholding. Fuzzy logic is a mathematical logic that attempts to solve problems by assigning values to an imprecise spectrum of data in order to arrive at the most accurate conclusion possible; here it is used to decide whether or not a pixel is an edge pixel. The proposed technique begins by fuzzifying the gray value of each pixel into two fuzzy variables, black and white. Fuzzy rules are then defined to find the edge pixels in the fuzzified image. The resulting edge map may contain extraneous edges, which are removed by separately examining pixels in the intermediate intensity range. Finally, the edge map is improved by recovering missed edge pixels through a new membership function for pixels whose entire 8-neighbourhood is classified as white. We compare the proposed method with standard edge detection operators from the image processing literature, and give a quantitative analysis in terms of entropy.
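
A minimal sketch of the fuzzification-and-rule idea, assuming linear black/white memberships and a single illustrative rule ("a pixel is an edge if its 3x3 neighbourhood contains both a strongly black and a strongly white pixel"). The paper's actual membership functions and rule base are not reproduced here, and the cut-off value tau is an added simplification.

```python
# Hedged sketch of fuzzy-rule edge detection: linear memberships and one
# illustrative rule; not the paper's exact rule base.
import numpy as np

def memberships(img):
    g = img.astype(float) / 255.0
    return 1.0 - g, g          # mu_black, mu_white (assumed linear memberships)

def fuzzy_edges(img, tau=0.6):
    mu_b, mu_w = memberships(img)
    h, w = img.shape
    edges = np.zeros((h, w), dtype=bool)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            nb_b = mu_b[i-1:i+2, j-1:j+2]
            nb_w = mu_w[i-1:i+2, j-1:j+2]
            # Rule: IF the neighbourhood contains a strongly black pixel AND a
            # strongly white pixel THEN the centre pixel is an edge pixel.
            edges[i, j] = min(nb_b.max(), nb_w.max()) > tau
    return edges

# Example on a synthetic step image: edges appear along the intensity step
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 255
print(fuzzy_edges(img).astype(int))
```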

Unified Psycholinguistic Framework: An Unobtrusive Psychological Analysis Approach Towards Insider Threat Prevention and Detection

  • Tan, Sang-Sang; Na, Jin-Cheon; Duraisamy, Santhiya
    • Journal of Information Science Theory and Practice, v.7 no.1, pp.52-71, 2019
  • An insider threat is a threat that comes from people within the organization being attacked. It can be described as a function of the motivation, opportunity, and capability of the insider. Compared to managing the dimensions of opportunity and capability, assessing one's motivation in committing malicious acts poses more challenges to organizations because it usually involves a more obtrusive process of psychological examination. The existing body of research in psycholinguistics suggests that automated text analysis of electronic communications can be an alternative for predicting and detecting insider threat through unobtrusive behavior monitoring. However, a major challenge in employing this approach is that it is difficult to minimize the risk of missing any potential threat while maintaining an acceptable false alarm rate. To deal with the trade-off between the risk of missed catches and the false alarm rate, we propose a unified psycholinguistic framework that consolidates multiple text analyzers to carry out sentiment analysis, emotion analysis, and topic modeling on electronic communications for unobtrusive psychological assessment. The user scenarios presented in this paper demonstrated how the trade-off issue can be attenuated with different text analyzers working collaboratively to provide more comprehensive summaries of users' psychological states.
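
A toy sketch of the consolidation idea: several independent analyzers score the same message stream and their outputs are merged into one per-user summary, so no single analyzer's threshold decides alone. The keyword counters below merely stand in for real sentiment and emotion models; all word lists, names, and scores are assumptions.

```python
# Toy stand-ins for the framework's analyzers: keyword counters instead of
# real sentiment/emotion/topic models. Names and word lists are assumptions.
from collections import defaultdict

NEGATIVE = {"angry", "unfair", "revenge"}
DISGRUNTLED = {"quit", "ignored", "underpaid"}

def sentiment_score(text):
    words = text.lower().split()
    return sum(w in NEGATIVE for w in words) / max(len(words), 1)

def emotion_score(text):
    words = text.lower().split()
    return sum(w in DISGRUNTLED for w in words) / max(len(words), 1)

def summarize(messages_by_user):
    """Merge both analyzers' scores into one per-user psychological summary."""
    summary = defaultdict(dict)
    for user, msgs in messages_by_user.items():
        summary[user]["sentiment"] = sum(map(sentiment_score, msgs)) / len(msgs)
        summary[user]["emotion"] = sum(map(emotion_score, msgs)) / len(msgs)
    return dict(summary)

print(summarize({"alice": ["I feel ignored and underpaid", "this is unfair"]}))
```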

Deep Learning based Human Recognition using Integration of GAN and Spatial Domain Techniques

  • Sharath, S; Rangaraju, HG
    • International Journal of Computer Science & Network Security, v.21 no.8, pp.127-136, 2021
  • Real-time human recognition is a challenging task because images are captured in an unconstrained environment with different poses, makeup, and styles. This limitation is addressed by generating several facial images with varied poses, makeup, and styles from a single reference image of a person using Generative Adversarial Networks (GAN). In this paper, we propose deep learning-based human recognition that integrates GAN with spatial domain techniques. The approach generates several dissimilar face images from a single reference face image using a Domain Transfer Generative Adversarial Network (DT-GAN), combined with feature extraction techniques such as Local Binary Pattern (LBP) and histograms. Euclidean Distance (ED) is used in the matching stage to compare features and evaluate the method's performance. A database covering millions of people with a single reference face image per person, rather than multiple reference images, is created and stored on a centralized server, reducing the server's memory load. The recognition accuracy is 100% for smaller datasets and slightly lower for larger datasets, and results are compared with existing methods to show the superiority of the proposed method.
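
The matching stage can be sketched as follows, assuming a radius-1, 8-neighbour LBP and a 256-bin normalized histogram compared by Euclidean distance; the DT-GAN image generation step is omitted and all names are illustrative, not the paper's implementation.

```python
# Hedged sketch of LBP-histogram extraction and Euclidean-distance matching.
# Radius-1, 8-neighbour LBP is an assumption; the GAN stage is omitted.
import numpy as np

def lbp_histogram(img):
    """Radius-1, 8-neighbour LBP codes as a normalized 256-bin histogram."""
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]
    codes = np.zeros_like(c)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (di, dj) in enumerate(offsets):
        nbr = img[1 + di:img.shape[0] - 1 + di, 1 + dj:img.shape[1] - 1 + dj]
        codes |= ((nbr >= c).astype(np.int32) << bit)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def match(query, references):
    """Return the reference id with the smallest Euclidean distance to the query."""
    q = lbp_histogram(query)
    dists = {rid: np.linalg.norm(q - lbp_histogram(ref)) for rid, ref in references.items()}
    return min(dists, key=dists.get)

# Example: compare a query face crop against two stored references
rng = np.random.default_rng(0)
refs = {"person_1": rng.integers(0, 256, (64, 64)), "person_2": rng.integers(0, 256, (64, 64))}
print(match(refs["person_1"], refs))   # expected: person_1
```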

Building Efficient Multi-level Wireless Sensor Networks with Cluster-based Routing Protocol

  • Shwe, Hnin Yu; Kumar, Arun; Chong, Peter Han Joo
    • KSII Transactions on Internet and Information Systems (TIIS), v.10 no.9, pp.4272-4286, 2016
  • In resource-constrained sensor networks, the use of efficient routing protocols can have a significant impact on energy dissipation. To save energy, we propose an energy-efficient routing protocol. Our approach integrates clustering and routing in sensor networks and performs network coding during data routing to achieve additional power savings at the cluster head nodes. The efficacy of the proposed method in terms of throughput and end-to-end delay is demonstrated through simulation results, and a significantly longer network lifetime is achieved compared with other techniques.
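
A minimal sketch of the network-coding intuition at a cluster head, assuming the simplest two-packet XOR scheme; the protocol's actual coding and clustering details are not described in the abstract and are not reproduced here.

```python
# Two-packet XOR coding at a cluster head: one transmission carries two
# member packets, and a node that already holds one can recover the other.
# The scheme and packet contents are illustrative assumptions.
def xor_combine(pkt_a: bytes, pkt_b: bytes) -> bytes:
    """Cluster head encodes two equal-length member packets into one."""
    return bytes(a ^ b for a, b in zip(pkt_a, pkt_b))

def decode(coded: bytes, known: bytes) -> bytes:
    """A node that already overheard one packet recovers the other."""
    return xor_combine(coded, known)

pkt_a, pkt_b = b"temp=21C", b"hum=55%%"
coded = xor_combine(pkt_a, pkt_b)
assert decode(coded, pkt_a) == pkt_b
```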

Development of Correlation Based Feature Selection Method by Predicting the Markov Blanket for Gene Selection Analysis

  • Adi, Made; Yun, Zhen; Keong, Kwoh-Chee
    • Proceedings of the Korean Society for Bioinformatics Conference, 2005.09a, pp.183-187, 2005
  • In this paper, we propose a heuristic feature selection method, the Two-Phase Markov Blanket-based (TPMB) algorithm. The first (filtering) phase of TPMB removes obviously redundant features, using a non-linear correlation measure based on information theory as the metric for feature redundancy [1]. In the second (approximating) phase, the Markov Blanket (MB) of the system is estimated by employing the concept of cross entropy. We perform experiments on microarray data and report results on two popular datasets, AML-ALL [3] and colon tumor [4]. The experimental results show that the TPMB algorithm can significantly reduce the number of features while maintaining classifier accuracy.
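
The filtering phase can be sketched roughly as follows, using symmetric uncertainty as one possible information-theoretic redundancy measure and a greedy drop rule; the paper's exact metric and the second, Markov-blanket approximating phase differ, so this is an illustration under stated assumptions only.

```python
# Hedged sketch of a correlation-based redundancy filter over discretized
# features. Symmetric uncertainty and the greedy rule are assumptions.
import numpy as np
from sklearn.metrics import mutual_info_score

def entropy(x):
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def symmetric_uncertainty(x, y):
    mi = mutual_info_score(x, y) / np.log(2)      # convert nats to bits
    hx, hy = entropy(x), entropy(y)
    return 2 * mi / (hx + hy) if hx + hy > 0 else 0.0

def filter_features(X, y):
    """Greedily keep features more correlated with the class than with any kept feature."""
    order = sorted(range(X.shape[1]),
                   key=lambda j: symmetric_uncertainty(X[:, j], y), reverse=True)
    selected = []
    for j in order:
        su_jy = symmetric_uncertainty(X[:, j], y)
        redundant = any(symmetric_uncertainty(X[:, j], X[:, k]) >= su_jy for k in selected)
        if not redundant:
            selected.append(j)
    return selected

# Toy usage: column 1 duplicates column 0, column 2 carries no class information
X = np.array([[0, 0, 1], [0, 0, 0], [1, 1, 1], [1, 1, 0]])
y = np.array([0, 0, 1, 1])
print(filter_features(X, y))   # expected: [0]
```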

Dynamics of Nanosciences and Technologies: Policy Implication

  • Laredo, Philippe; Delemarle, Aurelie; Kahane, Bernard
    • STI Policy Review, v.1 no.1, pp.43-62, 2010
  • Whatever the country, nanotechnology features as a key priority of most national research and innovation policies. This focus reflects the promise of nanotechnology as a general-purpose technology and a new technological wave. Since 'one size does not fit all', policies supporting its development cannot simply adopt the 'best practices' of the preceding wave. We argue that the specific ongoing dynamics of nanoscience and technology production justify dedicated nanotechnology policies, and that they also call into question the portfolio of instruments mobilized and their balance. In this article, we discuss policies developed for the preceding technological waves and, based on the characteristics of nanosciences and technologies, propose five dimensions of policy to be considered for their governance at the country and cluster levels.

A Technology Analysis Model using Dynamic Time Warping

  • Choi, JunHyeog; Jun, SungHae
    • Journal of the Korea Society of Computer and Information, v.20 no.2, pp.113-120, 2015
  • Technology analysis examines technological data such as patents and papers for a given technology field. From its results, we can gain new knowledge for R&D planning and management. Diverse statistical methods can be used for technology analysis. Time series analysis is one efficient approach, because most technologies are researched and developed over time, so much technological data takes the form of time series. In this paper, we propose a technology forecasting methodology based on dynamic time warping (DTW), a time series analysis technique. To illustrate how to apply the methodology to a real problem, we perform a case study on patent documents in a target technology field. This research will contribute to R&D planning and technology management.
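
For reference, a minimal dynamic time warping implementation that aligns two yearly series of different lengths and returns the cumulative alignment cost; the patent-count values are invented for illustration.

```python
# Minimal DTW: cumulative alignment cost between two series of different
# lengths. The yearly patent counts below are made-up example data.
def dtw_distance(s, t):
    n, m = len(s), len(t)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(s[i - 1] - t[j - 1])
            # best of match, insertion, deletion
            D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
    return D[n][m]

# Yearly patent counts for two hypothetical technology fields
field_a = [3, 5, 9, 14, 20, 26]
field_b = [2, 4, 8, 15, 22]
print(dtw_distance(field_a, field_b))
```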

Efficient Message Scheduling for WDM Optical Networks with Minimizing Flow Time

  • Huang, Xiaohong; Ma, Maode
    • Journal of Communications and Networks, v.6 no.2, pp.147-155, 2004
  • In this paper, we propose an efficient sequencing technique, minimum flow time scheduling (MFTS), to manage variable-length message transmissions in single-hop passive star-coupled WDM optical networks. By considering not only the message length but also the state of the receivers and the tuning latency, the proposed protocol can greatly reduce the average delay of the network. The paper also introduces a new channel assignment technique, latency minimizing scheduling (LMS), which aims to reduce the scheduling latency. We evaluate the proposed algorithm with extensive discrete-event simulations, comparing its performance with the shortest job first (SJF) algorithm, and find that MFTS achieves a significant improvement in average delay. By combining the proposed message sequencing technique with the channel selection technique, the performance of the optical network can be further improved.
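
A simplified, single-channel sketch of the sequencing idea: at each step transmit the pending message that would finish earliest given its length, when its receiver becomes free, and a fixed tuning latency. The paper's MFTS and LMS algorithms schedule over multiple WDM channels; the constants, message data, and names below are assumptions.

```python
# Greedy earliest-finish sequencing over one channel, accounting for message
# length, receiver availability, and a fixed tuning latency (all assumed values).
TUNING_LATENCY = 1.0   # assumed fixed transceiver tuning time (arbitrary units)

def finish_time(msg, clock, receiver_free):
    _, receiver, length = msg
    start = max(clock, receiver_free[receiver]) + TUNING_LATENCY
    return start + length

def schedule(messages, receiver_free):
    """messages: list of (msg_id, receiver, length); returns the transmission order."""
    clock, order, pending = 0.0, [], list(messages)
    while pending:
        # greedily pick the message that would complete (flow out) earliest
        best = min(pending, key=lambda m: finish_time(m, clock, receiver_free))
        pending.remove(best)
        clock = finish_time(best, clock, receiver_free)
        receiver_free[best[1]] = clock
        order.append(best[0])
    return order

msgs = [("m1", "r1", 5.0), ("m2", "r2", 2.0), ("m3", "r1", 1.0)]
print(schedule(msgs, {"r1": 0.0, "r2": 3.0}))   # expected: ['m3', 'm2', 'm1']
```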