• Title/Summary/Keyword: network overlap


A Study of Arrow Performance using Artificial Neural Network (Artificial Neural Network를 이용한 화살 성능에 대한 연구)

  • Jeong, Yeongsang;Kim, Sungshin
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.24 no.5
    • /
    • pp.548-553
    • /
    • 2014
  • To evaluate the performance of manufactured arrows, producers have traditionally relied on personal experience: that of hunters who have used bows and arrows for a long time, technicians who produce leisure and sports equipment, and experts in related industries. The tightness of the arrow's impact-point grouping obtained from repeated shooting experiments is also an important indicator of arrow performance. Some ongoing research evaluates arrow performance using the impact-point grouping and flight images of the arrow captured with a high-speed camera. However, research on the relationship between the distribution of impact points and the characteristics of the arrow (length, weight, spine, overlap, straightness) remains insufficient. This paper therefore proposes both a system that expresses the distribution of the arrow's impact points numerically and a correlation model between arrow characteristics and impact points. The inputs of the model are arrow characteristics (spine, straightness), and the output is the MAD (mean absolute distance) of the triangular set of coordinates obtained from three repeated shots, rotating the nock by 120 degrees between shots. Input-output data were collected to train the correlation model, which was implemented with an ANN (artificial neural network).
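The MAD statistic described above can be sketched in a few lines. This is a minimal interpretation, not the paper's code: it assumes "mean absolute distance" means the average distance of the three impact points from their centroid, and the function name and sample coordinates are hypothetical.

```python
import math

def mad_of_impacts(points):
    """Mean absolute distance of impact points from their centroid.

    `points` is a list of (x, y) impact coordinates, e.g. the three
    shots taken while rotating the nock by 120 degrees.
    """
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

# Three hypothetical impact points forming a near-equilateral triangle
shots = [(0.0, 2.0), (1.5, -1.0), (-1.5, -1.0)]
print(round(mad_of_impacts(shots), 3))  # smaller MAD = tighter grouping
```

A value like this would serve as the ANN's regression target, with spine and straightness as inputs.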

Automatic detection of periodontal compromised teeth in digital panoramic radiographs using faster regional convolutional neural networks

  • Thanathornwong, Bhornsawan;Suebnukarn, Siriwan
    • Imaging Science in Dentistry
    • /
    • v.50 no.2
    • /
    • pp.169-174
    • /
    • 2020
  • Purpose: Periodontal disease causes tooth loss and is associated with cardiovascular diseases, diabetes, and rheumatoid arthritis. The present study proposes a deep learning-based object detection method to identify periodontally compromised teeth on digital panoramic radiographs. A faster regional convolutional neural network (faster R-CNN), a state-of-the-art deep detection network, was adapted from the natural image domain using a small annotated clinical dataset. Materials and Methods: In total, 100 digital panoramic radiographs of periodontally compromised patients were retrospectively collected from our hospital's information system and augmented. The periodontally compromised teeth found in each image were annotated by experts in periodontology to obtain the ground truth. The Keras library, which is written in Python, was used to train and test the model on a single NVIDIA 1080Ti GPU. The faster R-CNN model used a pretrained ResNet architecture. Results: The average precision rate of 0.81 demonstrated that there was a significant region of overlap between the predicted regions and the ground truth. The average recall rate of 0.80 showed that the regions generated by the detection method largely excluded healthy tooth areas. In addition, the model achieved a sensitivity of 0.84, a specificity of 0.88 and an F-measure of 0.81. Conclusion: The faster R-CNN trained on a limited amount of labeled imaging data performed satisfactorily in detecting periodontally compromised teeth. Applying a faster R-CNN to assist in the detection of periodontally compromised teeth may reduce diagnostic effort by saving assessment time and allowing automated screening documentation.
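The "region of overlap" and F-measure reported above are standard detection metrics. As a generic illustration (not the study's evaluation code; box coordinates and function names are hypothetical), intersection-over-union and F-measure can be computed as:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes:
    overlap area divided by the area of the union."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def f_measure(precision, recall):
    """Harmonic mean of precision and recall (F1 score)."""
    return 2 * precision * recall / (precision + recall)

# A predicted box overlapping a ground-truth box by 1/7 of their union
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))
```

Predictions whose IoU with a ground-truth box exceeds a threshold count as true positives, from which precision, recall, and the F-measure follow.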

The World as Seen from Venice (1205-1533) as a Case Study of Scalable Web-Based Automatic Narratives for Interactive Global Histories

  • NANETTI, Andrea;CHEONG, Siew Ann
    • Asian review of World Histories
    • /
    • v.4 no.1
    • /
    • pp.3-34
    • /
    • 2016
  • This introduction is both a statement of a research problem and an account of the first research results for its solution. As more historical databases come online and overlap in coverage, we need to discuss the two main issues that have prevented 'big' results from emerging so far. Firstly, historical data are seen by computer scientists as unstructured; that is, historical records cannot be easily decomposed into unambiguous fields, as in population (birth and death records) and taxation data. Secondly, machine-learning tools developed for structured data cannot be applied as-is to historical research. We propose a complex-network, narrative-driven approach to mining historical databases. In such a time-integrated network obtained by overlaying records from historical databases, the nodes are actors, while the links are actions. In the case study that we present (the world as seen from Venice, 1205-1533), the actors are governments, while the actions are limited to war, trade, and treaty to keep the case study tractable. We then identify key periods, key events, and hence key actors and key locations through a time-resolved examination of the actions. This tool allows historians to deal with historical data issues (e.g., source provenance identification, event validation, trade-conflict-diplomacy relationships, etc.). On a higher level, this automatic extraction of key narratives from a historical database allows historians to formulate hypotheses on the courses of history, and also to test these hypotheses on other actions or additional data sets. Our vision is that this narrative-driven analysis of historical data can lead to the development of multiple-scale agent-based models, which can be simulated on a computer to generate ensembles of counterfactual histories that would deepen our understanding of how our actual history developed the way it did. The generation of such narratives, automatically and in a scalable way, will revolutionize the practice of history as a discipline, because historical knowledge, that is, the treasure of human experiences (i.e. the heritage of the world), will become something that can be inherited by machine learning algorithms and used in smart cities to highlight and explain present ties and illustrate potential future scenarios.

An Efficient Load-Sharing Scheme for Internet-Based Clustering Systems (인터넷 기반 클러스터 시스템 환경에서 효율적인 부하공유 기법)

  • 최인복;이재동
    • Journal of Korea Multimedia Society
    • /
    • v.7 no.2
    • /
    • pp.264-271
    • /
    • 2004
  • A load-sharing algorithm must deal with load imbalance caused by the characteristics of the network and the heterogeneity of nodes in Internet-based clustering systems. This paper proposes an efficient load-sharing algorithm. The algorithm creates a scheduler based on the WF (Weighted Factoring) algorithm and then allocates tasks using an adaptive granularity strategy and a refined fixed-granularity algorithm for better performance. In the adaptive granularity strategy, the master node reallocates tasks from relatively slower nodes to faster ones, and the refined fixed-granularity algorithm overlaps the slave nodes' computation time with network communication time. For the simulation, matrix multiplication using PVM was performed in a heterogeneous clustering environment consisting of two different networks. Compared to other algorithms such as Send, GSS, and Weighted Factoring, the proposed algorithm improves performance by 75%, 79%, and 17%, respectively.
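The weighted-factoring idea underlying the scheduler can be sketched as follows. This is a simplified illustration under stated assumptions, not the paper's implementation: each round hands out a fraction of the remaining tasks, split in proportion to per-node weights (e.g. relative node speeds); the function name, weights, and the 0.5 factor are hypothetical.

```python
def weighted_factoring(total_tasks, weights, factor=0.5):
    """Return a list of (node, chunk_size) allocations.

    Each round distributes `factor` of the remaining tasks among the
    nodes in proportion to their weights, so chunk sizes shrink over
    time and faster nodes receive larger chunks.
    """
    remaining = total_tasks
    schedule = []
    wsum = sum(weights.values())
    while remaining > 0:
        batch = min(remaining, max(len(weights), int(remaining * factor)))
        for node, w in weights.items():
            chunk = min(max(1, int(batch * w / wsum)), remaining)
            if chunk == 0:
                continue  # nothing left for this node this round
            schedule.append((node, chunk))
            remaining -= chunk
    return schedule

# Hypothetical cluster: one node three times faster than the other
plan = weighted_factoring(100, {"fast": 3, "slow": 1})
print(plan[:4])
```

Decreasing chunk sizes balance the final load, while the proportional split keeps slow nodes from becoming stragglers.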


Boundary Zone Overlapping Scheme for Fast Handoff Based on Session Key Reuse (AAA MIP 환경에서 공유영역 기반 세션키 재사용을 통한 고속 핸드오프 방식 연구)

  • Choi, Yu-Mi;Chung, Min-Young;Choo, Hyun-Seung
    • The KIPS Transactions:PartC
    • /
    • v.12C no.4 s.100
    • /
    • pp.481-488
    • /
    • 2005
  • Mobile IP provides an efficient and scalable mechanism for host mobility within the Internet. However, mobility implies higher security risks than static operation in fixed networks. In this paper, Mobile IP is adapted to the AAA protocol, which supports authentication, authorization, and accounting (AAA) for security and collects accounting information on network usage by mobile nodes (MNs). To this end, we propose a boundary-zone-overlapped network structure while strengthening the security of MN authentication. That is, the proposed scheme delivers the session keys over the wired link instead of the wireless one for the MN's security, thus providing a fast and seamless handoff mechanism. According to the analysis of the modeling results, the proposed mechanism performs up to about 40% better than the existing session key reuse method in terms of the normalized surcharge for the handoff failure rate.

Wavelet-Based Minimized Feature Selection for Motor Imagery Classification (운동 형상 분류를 위한 웨이블릿 기반 최소의 특징 선택)

  • Lee, Sang-Hong;Shin, Dong-Kun;Lim, Joon-S.
    • The Journal of the Korea Contents Association
    • /
    • v.10 no.6
    • /
    • pp.27-34
    • /
    • 2010
  • This paper presents a methodology for classifying left and right motor imagery using a neural network with weighted fuzzy membership functions (NEWFM) and wavelet-based feature extraction. In the first step, wavelet coefficients are extracted from the electroencephalogram (EEG) signal by wavelet transforms. In the second step, sixty initial features are extracted from the wavelet coefficients using the frequency distribution and the amount of variability in the frequency distribution. The distributed non-overlap area measurement method selects a minimal feature set by removing the worst input feature one at a time; six features are finally selected with the highest performance. The proposed methodology achieves an accuracy rate of 86.43% with these six features.
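The one-at-a-time feature removal described above is a greedy backward elimination. The sketch below shows the generic loop only; in NEWFM the scoring role is played by the distributed non-overlap area measurement, whereas here `score` is an arbitrary callable and the function name is hypothetical.

```python
def backward_eliminate(features, score, target_count):
    """Greedy backward elimination: repeatedly drop the feature whose
    removal hurts the score least, until `target_count` remain.

    `score(subset)` returns a performance estimate for a feature
    subset; higher is better.
    """
    selected = list(features)
    while len(selected) > target_count:
        # The "worst" feature is the one the remaining set scores best without.
        worst = max(selected, key=lambda f: score([g for g in selected if g != f]))
        selected.remove(worst)
    return selected

# Toy scorer: a subset's value is just the sum of its (numeric) features,
# so elimination should discard the smallest values first.
print(backward_eliminate([5, 1, 3, 2], sum, 2))
```

Starting from sixty wavelet features and running this loop down to six mirrors the selection procedure the abstract describes.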

Clustering for Improved Actor Connectivity and Coverage in Wireless Sensor and Actor Networks (무선 센서 액터 네트워크에서 액터의 연결성과 커버리지를 향상시키기 위한 클러스터 구성)

  • Kim, Young-Kyun;Jeon, Chang-Ho
    • Journal of the Korea Society of Computer and Information
    • /
    • v.19 no.8
    • /
    • pp.63-71
    • /
    • 2014
  • This paper proposes an algorithm that forms clusters on the basis of hop distance in order to improve actor coverage and connectivity in sink-based wireless sensor and actor networks. The proposed algorithm forms evenly distributed clusters in the target area by electing CHs (Cluster Heads) at regular hop intervals from the sink. The CHs are elected sequentially, starting from the sink, to ensure connectivity between the sink and the actors located at the CHs. In addition, CHs are elected from areas of higher sensor density in order to improve actor coverage. By forming regularly distributed clusters and minimizing their overlap, the proposed algorithm reduces both the number of clusters created in the target area and the number of actors placed at the CH positions. Simulations verify that the proposed algorithm constructs an actor network that is connected to the sink. Moreover, we show that the proposed algorithm improves actor coverage and therefore reduces the number of actors to be deployed in the region by 9~20% compared to the IDSC algorithm.
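The hop-distance clustering idea can be illustrated with a breadth-first search from the sink. This is a minimal sketch, not the paper's algorithm: it assumes CH candidates are simply nodes whose hop distance is a multiple of a fixed interval, and it omits the density-based tie-breaking the abstract mentions; the function name and graph are hypothetical.

```python
from collections import deque

def elect_cluster_heads(adj, sink, interval=2):
    """BFS from the sink over adjacency dict `adj`; nodes whose hop
    distance is a positive multiple of `interval` become CH candidates,
    so CHs form rings at regular hop distances from the sink."""
    dist = {sink: 0}
    queue = deque([sink])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return sorted(n for n, d in dist.items() if d > 0 and d % interval == 0)

# Hypothetical 5-node chain topology: 0 (sink) - 1 - 2 - 3 - 4
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(elect_cluster_heads(chain, sink=0))  # CHs every 2 hops
```

Spacing CHs at regular hop intervals is what keeps the clusters evenly distributed and their overlap small.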

Adjusting the Retry Limit for Congestion Control in an Overlapping Private BSS Environment

  • Park, Chang Yun
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.8 no.6
    • /
    • pp.1881-1900
    • /
    • 2014
  • Since 802.11 wireless LANs are so widely used, it has become common for numerous access points (APs) to overlap in a region, where most of those APs are managed individually without any coordinated control. This pattern of wireless LAN usage is called the private OBSS (Overlapping Basic Service Set) environment in this paper. Due to frame collisions across BSSs, each BSS in the private OBSS environment suffers severe performance degradation. This study approaches the problem from the perspective of congestion control rather than noise or collision resolution. The retry limit, one of the 802.11 attributes, could be used for traffic control in conjunction with TCP. Reducing the retry limit causes early discard of a frame, and it has a similar effect of random early drops at a router, well known in the research area of congestion control. It makes the shared link less crowded with frames, and then the benefit of fewer collisions surpasses the penalty of less strict error recovery. As a result, the network-wide performance improves and so does the performance of each BSS eventually. Reducing the retry limit also has positive effects of merging TCP ACKs and reducing HOL-like blocking time at the AP. Extensive experiments have validated the idea that in the OBSS environment, reducing the retry limit provides better performance, which is contrary to the common wisdom. Since our strategy is basically to sacrifice error recovery for congestion control, it could yield side-effects in an environment where the cost of error recovery is high. Therefore, to be useful in general network and traffic environments, adaptability is required. To prove the feasibility of the adaptive scheme, a simple method to dynamically adjust the value of the retry limit has been proposed. Experiments have shown that this approach could provide comparable performance in unfriendly environments.
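The trade-off behind reducing the retry limit can be made concrete with a simple probabilistic model. This is an illustrative sketch only, not the paper's analysis: it assumes each transmission attempt fails independently with a fixed collision probability, and the function names are hypothetical.

```python
def delivery_probability(p_collision, retry_limit):
    """Probability a frame is eventually delivered when each attempt
    fails independently with `p_collision` and the sender gives up
    after `retry_limit` retransmissions (retry_limit + 1 tries total)."""
    return 1 - p_collision ** (retry_limit + 1)

def expected_attempts(p_collision, retry_limit):
    """Mean transmissions per frame: success at attempt k costs k,
    and giving up after all tries costs the full try count."""
    tries = retry_limit + 1
    success_cost = sum(k * (1 - p_collision) * p_collision ** (k - 1)
                       for k in range(1, tries + 1))
    return success_cost + tries * p_collision ** tries

# With a 50% per-attempt collision rate, cutting retries from 6 to 1
# sharply reduces channel load at a modest delivery-probability cost.
for limit in (6, 1):
    print(limit, delivery_probability(0.5, limit), expected_attempts(0.5, limit))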

The Amplification of the Morse Codes, which Cho Ji-Hoon's Poem Silent Night 1 Leaves in the Human Body

  • Park, In-Kwa
    • International Journal of Advanced Culture Technology
    • /
    • v.6 no.1
    • /
    • pp.42-49
    • /
    • 2018
  • In this study, we sought to reveal the state of stillness in Cho Ji-Hoon's poem "Silent Night 1" as a healing modifier. The language of the poem is synaptically linked to the calm emotions of the human body, following a principle that leads to a state of healing. This study was therefore carried out with the aim of applying that principle to a literary therapy program. The silent signal embedded in the poem is encoded into sound signals as it synapses to the human body. The encoding of the auditory nerves by the poem's lines is like a Morse code that words leave in the human body. The action potential of the auditory nerve is further activated by the potential difference between words represented in the neural network, like a Morse code, which reaches the human body by such a path. An amplified potential difference arises between words perceived through sound synapsing to the human body and through silence synapsing to the human body. The phenomenon in which words approach the human body, establish the absence of sound, and amplify sound occurs because the words amplify Morse codes in the human neural network. At this time, the signals overlap each other, and the poem thereby increases the amplitude of the sound. This overlapping of auditory signals appears and amplifies catharsis. If the principle of Cho Ji-Hoon's poem is applied to literary therapy programs in the future, more effective treatment will be achieved.

Minimized Stock Forecasting Features Selection by Automatic Feature Extraction Method (자동 특징 추출기법에 의한 최소의 주식예측 특징선택)

  • Lee, Sang-Hong;Lim, Joon-S.
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.19 no.2
    • /
    • pp.206-211
    • /
    • 2009
  • This paper presents a methodology for 1-day-ahead forecasting of a stock index using an automatic feature extraction method based on the neural network with weighted fuzzy membership functions (NEWFM). The distributed non-overlap area measurement method selects a minimal number of input features by automatically removing the worst input feature one at a time. CPP_{n,m} (Current Price Position of day n: the percentage difference between the price of day n and the moving average from day n-1 to day n-m) and the two wavelet-transformed coefficients from the most recent 32 days of CPP_{n,m} are selected as the minimized features using bounded sums of weighted fuzzy membership functions (BSWFMs). For the data sets from 1989 to 1998, the proposed method achieves a forecast rate of 60.93%.
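The CPP_{n,m} feature defined above can be computed directly from its definition. This is a minimal sketch based on that definition alone; the function name, price indexing convention, and sample prices are hypothetical.

```python
def cpp(prices, n, m):
    """CPP_{n,m}: percentage difference between the closing price on
    day n and the m-day moving average over days n-1 .. n-m.
    `prices` is indexed so that prices[n] is day n's close."""
    moving_avg = sum(prices[n - m:n]) / m
    return (prices[n] - moving_avg) / moving_avg * 100

# Hypothetical closes: day 3's price vs the average of days 0..2
prices = [100, 102, 104, 106]
print(round(cpp(prices, n=3, m=3), 2))  # positive = price above its average
```

A positive CPP means the index sits above its recent moving average; such values (and their wavelet coefficients) feed the NEWFM forecaster.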