• Title/Summary/Keyword: Network Performance


Development of a Remotely Sensed Image Processing/Analysis System : GeoPixel Ver. 1.0 (JAVA를 이용한 위성영상처리/분석 시스템 개발 : GeoPixel Ver. 1.0)

  • 안충현;신대혁
    • Korean Journal of Remote Sensing / v.13 no.1 / pp.13-30 / 1997
  • Recent improvements in satellite remote sensing sensors, represented by hyperspectral imaging sensors and high-spatial-resolution sensors, provide large amounts of data, typically several hundred megabytes per scene. Moreover, increasing information exchange via the Internet and the information superhighway requires the development of more active service systems that process and analyze remote sensing data in order to provide value-added products. In this sense, an advanced satellite data processing system is being developed to achieve high performance in computing speed and efficiency when processing huge volumes of data, and to enable network computing as well as easy improvement, upgrading, and management of the system. The JAVA internet programming language provides several advantages for developing such software, including object-oriented programming, multi-threading, and robust memory management. Using these features, a satellite data processing system named GeoPixel has been developed in JAVA. GeoPixel adopts newly developed techniques, including an object-pipe connection method between processes and a multi-threaded structure. As a result, the system is platform independent and processes huge volumes of remote sensing data efficiently and robustly. In the evaluation of data processing capability, satisfactory results were obtained in terms of computer resource utilization (CPU and memory) and processing speed.
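
The object-pipe and multi-threading design described in the abstract can be illustrated with a minimal, conceptual sketch: each processing stage runs in its own thread and hands image tiles to the next stage through a queue. This is not the authors' Java implementation of GeoPixel; the stage operations and tile format below are hypothetical stand-ins.

```python
import threading
import queue

def stage(transform, inbox, outbox):
    """One processing stage: read tiles from inbox, apply transform,
    push results to outbox; a None sentinel shuts the stage down."""
    while True:
        tile = inbox.get()
        if tile is None:
            outbox.put(None)      # propagate shutdown downstream
            break
        outbox.put(transform(tile))

# Hypothetical per-tile operations standing in for real image-processing steps.
radiometric = lambda t: [v * 1.02 for v in t]                     # toy gain correction
clip        = lambda t: [min(255, max(0, round(v))) for v in t]   # toy clipping

q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
threads = [threading.Thread(target=stage, args=(radiometric, q_in, q_mid)),
           threading.Thread(target=stage, args=(clip, q_mid, q_out))]
for t in threads:
    t.start()

for tile in ([10, 20, 30], [200, 250, 300]):   # toy "image tiles"
    q_in.put(tile)
q_in.put(None)                                  # end of stream

while (result := q_out.get()) is not None:
    print(result)
for t in threads:
    t.join()
```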

Comparative assessment of frost event prediction models using logistic regression, random forest, and LSTM networks (로지스틱 회귀, 랜덤포레스트, LSTM 기법을 활용한 서리예측모형 평가)

  • Chun, Jong Ahn;Lee, Hyun-Ju;Im, Seul-Hee;Kim, Daeha;Baek, Sang-Soo
    • Journal of Korea Water Resources Association / v.54 no.9 / pp.667-680 / 2021
  • We investigated changes in frost days and frost-free periods and comparatively assessed frost event prediction models developed using logistic regression (LR), random forest (RF), and long short-term memory (LSTM) networks. The meteorological variables for model development were collected from the Suwon, Cheongju, and Gwangju stations for the period 1973-2019, for spring (March - May) and fall (September - November). The developed models were then evaluated using precision, recall, and F1 score, together with graphical methods such as the AUC and reliability diagrams. The results showed significant decreases (at the 0.01 significance level) in the frequency of frost days at all three stations in both spring and fall. Overall, the evaluation metrics showed that the performance of RF was highest, while that of LSTM was lowest. Although high AUC values (above 0.9) were found at the three stations, the reliability diagrams showed inconsistent reliability. Further study is suggested to improve both the predictability of frost events and of the first and last frost days and the reliability of the models. It would be beneficial to replicate this study at more stations in other regions.
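
A minimal sketch of the kind of model comparison described above, using scikit-learn for the logistic regression and random forest classifiers and the precision/recall/F1/AUC metrics; the synthetic predictors and labels below merely stand in for the meteorological data, which are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score, f1_score, roc_auc_score

# Synthetic stand-in for daily meteorological predictors (e.g. minimum
# temperature, humidity, wind speed) and a binary frost/no-frost label.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) < -0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("LR", LogisticRegression(max_iter=1000)),
                    ("RF", RandomForestClassifier(n_estimators=200, random_state=0))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    prob = model.predict_proba(X_te)[:, 1]
    print(name,
          "precision=%.3f" % precision_score(y_te, pred),
          "recall=%.3f" % recall_score(y_te, pred),
          "f1=%.3f" % f1_score(y_te, pred),
          "AUC=%.3f" % roc_auc_score(y_te, prob))
```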

A Problematic Bubble Detection Algorithm for Conformal Coated PCB Using Convolutional Neural Networks (합성곱 신경망을 이용한 컨포멀 코팅 PCB에 발생한 문제성 기포 검출 알고리즘)

  • Lee, Dong Hee;Cho, SungRyung;Jung, Kyeong-Hoon;Kang, Dong Wook
    • Journal of Broadcast Engineering / v.26 no.4 / pp.409-418 / 2021
  • Conformal coating is a technology that protects PCBs (Printed Circuit Boards) and minimizes PCB failures. Since defects in the coating lead to PCB failures, the coating surface is inspected for air bubbles to confirm that the conformal coating has been applied successfully. In this paper, we propose an algorithm that detects problematic, high-risk bubbles by applying image signal processing. The algorithm consists of finding candidates for problematic bubbles and then verifying those candidates. Bubbles do not appear in visible-light images but can be visually distinguished under a UV (Ultra Violet) light source. In particular, the center of a problematic bubble is dark and its border is bright. In this paper, these brightness characteristics are called valley and mountain features, and areas where both characteristics appear at the same time are taken as candidates for problematic bubbles. However, candidate verification is necessary because some candidates may not be bubbles. In the candidate verification phase, we used convolutional neural network models, and ResNet performed best among the models compared. The proposed algorithm achieved a precision of 0.805, a recall of 0.763, and an F1 score of 0.767; these results show sufficient potential for automating the bubble inspection.
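
A minimal sketch of the candidate-search step described above: a pixel is flagged when a dark "valley" appears at its centre and a bright "mountain" ring appears around it. The window sizes and brightness thresholds are illustrative assumptions, not the values used in the paper; the flagged candidates would then be passed to the CNN (e.g. ResNet) verification stage.

```python
import numpy as np
from scipy.ndimage import minimum_filter, maximum_filter

def bubble_candidates(gray, inner=5, outer=15, valley_thr=60, mountain_thr=180):
    """Flag pixels whose neighbourhood looks like a problematic bubble:
    a dark 'valley' at the centre surrounded by a bright 'mountain' ring.
    Window sizes and thresholds here are illustrative, not the paper's."""
    centre_min = minimum_filter(gray, size=inner)   # darkest value near the pixel
    ring_max = maximum_filter(gray, size=outer)     # brightest value in a wider window
    return (centre_min < valley_thr) & (ring_max > mountain_thr)

# Toy UV-light image: mid-grey background with one synthetic "bubble".
img = np.full((64, 64), 120, dtype=np.uint8)
img[26:38, 26:38] = 220      # bright border region (mountain)
img[30:34, 30:34] = 30       # dark centre (valley)

mask = bubble_candidates(img)
print("candidate pixels:", int(mask.sum()))
```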

A study on combination of loss functions for effective mask-based speech enhancement in noisy environments (잡음 환경에 효과적인 마스크 기반 음성 향상을 위한 손실함수 조합에 관한 연구)

  • Jung, Jaehee;Kim, Wooil
    • The Journal of the Acoustical Society of Korea / v.40 no.3 / pp.234-240 / 2021
  • In this paper, mask-based speech enhancement is improved for effective speech recognition in noisy environments. In mask-based speech enhancement, the enhanced spectrum is obtained by multiplying the noisy speech spectrum by an estimated mask. The VoiceFilter (VF) model is used for mask estimation, and the Spectrogram Inpainting (SI) technique is used to remove residual noise from the enhanced spectrum. We propose a combined loss to further improve speech enhancement: to effectively remove residual noise in the speech, the positive part of the triplet loss is used together with the component loss. For the experiments, the TIMIT database was reconstructed using NOISEX92 noise and background music samples under various Signal-to-Noise Ratio (SNR) conditions. Source-to-Distortion Ratio (SDR), Perceptual Evaluation of Speech Quality (PESQ), and Short-Time Objective Intelligibility (STOI) were used as performance metrics. When the VF model was trained with the mean squared error and the SI model was trained with the combined loss, SDR, PESQ, and STOI improved by 0.5, 0.06, and 0.002, respectively, compared to the system trained only with the mean squared error.
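
A hedged sketch of what such a combined loss could look like: a mean-squared (component) term on the enhanced spectrum plus the positive (hinge) part of a triplet loss, with the clean spectrum as the positive and the noisy spectrum as the negative. The weighting, margin, and the exact definitions of the paper's loss terms are assumptions here.

```python
import torch
import torch.nn.functional as F

def combined_loss(enhanced, clean, noisy, margin=1.0, alpha=0.5):
    """Illustrative combination: a spectrum-matching (component) loss plus the
    positive (hinge) part of a triplet loss that pushes the enhanced spectrum
    closer to the clean spectrum than to the noisy one. Values are hypothetical."""
    component = F.mse_loss(enhanced, clean)
    d_pos = F.mse_loss(enhanced, clean, reduction="none").mean(dim=-1)
    d_neg = F.mse_loss(enhanced, noisy, reduction="none").mean(dim=-1)
    triplet_pos = torch.clamp(d_pos - d_neg + margin, min=0).mean()
    return component + alpha * triplet_pos

# Toy magnitude spectra: a batch of 8 frames with 257 frequency bins each.
clean = torch.rand(8, 257)
noisy = clean + 0.3 * torch.rand(8, 257)
enhanced = noisy * torch.rand(8, 257)      # stand-in for a mask-based estimate
print(float(combined_loss(enhanced, clean, noisy)))
```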

A Study on Korean Firms' Outward FDIs to China (중국 내 순차적 직접투자와 경영 전략적 특성에 관한 연구)

  • Yim, Hyung-Rok;Chung, Wonjin
    • International Area Studies Review / v.18 no.3 / pp.47-66 / 2014
  • A noticeable aspect of Korean firms' outward FDIs to China is that they occur sequentially, which means that the firms implement their outward FDIs to China with a long-term perspective. To analyze the strategic advantages of sequential investment, we introduce a Cournot-type quantity competition model. According to the model, three important implications are derived. First, sequential FDIs enhance the Korean parents' production capabilities. Second, the parents are more likely to establish new Chinese subsidiaries the longer they stay in China. Third, the production effect of sequential investments induces further sequential investments. Regression models are tested to verify these predictions, and three important empirical results are found. First, the initial entry mode affects the size expansion of the Korean parents. Second, the longer the duration of the initial subsidiary in China, the more sequential investment takes place. Third, sequential investments are positively associated with the productivity of the Korean parents.
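
For context, the textbook Cournot quantity-competition setup is sketched below; the paper's own specification may differ, but the sketch shows why a production-capability gain, modeled as a lower marginal cost, raises a firm's equilibrium output.

```latex
% Textbook Cournot duopoly (illustrative only; not the paper's exact model)
p = a - b\,(q_1 + q_2), \qquad \pi_i = (p - c_i)\,q_i
% Best response and equilibrium output of firm i (with rival j \ne i):
q_i^{BR}(q_j) = \frac{a - c_i - b\,q_j}{2b}, \qquad
q_i^{*} = \frac{a - 2c_i + c_j}{3b}
```

A lower marginal cost c_i raises q_i*, which is consistent with the first implication above that sequential FDIs enhance the Korean parents' production capabilities.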

The Effects of Medium and Small-sized Venture Firms' Liability of Foreignness on Business Performance - Comparison of Taiwanese and Korean Firms - (대만과 한국 중소벤처기업의 외국비용이 경영성과에 미치는 영향)

  • Cho, Dae-Woo
    • International Area Studies Review / v.12 no.3 / pp.293-319 / 2008
  • Medium and small-sized venture firms, as well as multinational companies, incur liabilities of foreignness. We defined these costs as three factors: liability of handicaps (lack of time, money, and experience, and increased financial risk), overseas market entry costs (information gathering costs, network building costs, marketing costs, channeling costs, and monitoring costs), and internationalization preparation costs (forecasting and market research of local markets, and ex-ante cooperation with local firms), and then empirically tested how each of these factors affects business performance. The more importance both Taiwanese and Korean firms place on the liability of handicaps, the higher the overseas market entry costs they pay (H1). Conversely, the more importance they place on overseas entry costs, the more they focus on internationalization preparation (H4) and the better their business performance (H5). The more importance Korean firms place on the liability of handicaps, the more they focus on internationalization preparation, whereas Taiwanese firms show the opposite tendency (H2). Both Taiwanese and Korean firms rejected Hypotheses 3 and 6, which means that neither the liability of handicaps nor internationalization preparation is related to their business performance.

Improving Non-Profiled Side-Channel Analysis Using Auto-Encoder Based Noise Reduction Preprocessing (비프로파일링 기반 전력 분석의 성능 향상을 위한 오토인코더 기반 잡음 제거 기술)

  • Kwon, Donggeun;Jin, Sunghyun;Kim, HeeSeok;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology / v.29 no.3 / pp.491-501 / 2019
  • In side-channel analysis, which exploits physical leakage from a cryptographic device, deep-learning-based attacks have attracted significant interest in recent years. However, most state-of-the-art methods have focused on classifying side-channel information in a profiled scenario, where attackers can obtain labels for the training data. In this paper, we propose a new deep-learning-based method to improve non-profiled side-channel attacks such as Differential Power Analysis and Correlation Power Analysis. The proposed method is a signal preprocessing technique that reduces the noise in a trace by adapting the Auto-Encoder framework to the context of side-channel analysis. Previous work on Denoising Auto-Encoders trained the network on noise randomly added by the attacker; here, the proposed model trains the Auto-Encoder on the noise present in real data by using noise-reduced labels. In addition, the proposed method allows a non-profiled attack to be performed by training only a single neural network. We validate the noise-reduction performance of the proposed method on real traces collected from a ChipWhisperer board and demonstrate that it outperforms classic preprocessing methods such as Principal Component Analysis and Linear Discriminant Analysis.
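
A minimal sketch of the training setup described above, assuming the "noise-reduced labels" are clean reference traces (for example, per-group averages of traces that share the same input); the trace length, network sizes, and data here are hypothetical, and PyTorch is used purely for illustration.

```python
import torch
import torch.nn as nn

class TraceAutoEncoder(nn.Module):
    """Auto-encoder mapping a raw power trace to a denoised trace."""
    def __init__(self, n_samples=1000, latent=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_samples, 256), nn.ReLU(),
                                     nn.Linear(256, latent), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(latent, 256), nn.ReLU(),
                                     nn.Linear(256, n_samples))

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Toy data: traces grouped by a hypothetical plaintext byte; the per-group
# "signal" plays the role of the noise-reduced label.
groups = torch.randint(0, 16, (128,))
signal = torch.randn(16, 1000)
raw = signal[groups] + 0.5 * torch.randn(128, 1000)   # signal + measurement noise
labels = signal[groups]                               # noise-reduced labels (toy)

model = TraceAutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                                     # a few illustrative epochs
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(raw), labels)
    loss.backward()
    opt.step()

denoised = model(raw).detach()   # traces that would then be fed to DPA/CPA
```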

A proposal on a proactive crawling approach with analysis of state-of-the-art web crawling algorithms (최신 웹 크롤링 알고리즘 분석 및 선제적인 크롤링 기법 제안)

  • Na, Chul-Won;On, Byung-Won
    • Journal of Internet Computing and Services / v.20 no.3 / pp.43-59 / 2019
  • Today, with the spread of smartphones and the development of social networking services, structured and unstructured big data are being accumulated at an exponential rate. If analyzed well, these data yield useful information that can support prediction. Large amounts of data must first be collected in order to analyze big data, and the web is the repository where most of these data reside. However, because the data are so large, many pages carry no useful information while others do, which makes it important to collect data efficiently: pages with unnecessary information should be filtered out and only pages with useful information collected. Web crawlers cannot download all pages due to constraints such as network bandwidth, operational time, and data storage, so they should avoid visiting pages that are irrelevant to the target and download only important pages as soon as possible. This paper seeks to help resolve these issues. First, we introduce basic web crawling algorithms; for each algorithm, the time complexity and the pros and cons are described, compared, and analyzed. Next, we introduce state-of-the-art web crawling algorithms that improve on the shortcomings of the basic ones. In addition, recent research trends show that web crawling algorithms with special purposes, such as collecting sentiment words, are being actively studied. As an example of such special-purpose algorithms, we introduce a sentiment-aware web crawling technique, which is a proactive web crawling technique. The results showed that the larger the data, the higher the performance and the more storage space is saved.
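
A minimal sketch of a best-first (priority-queue) crawler of the kind discussed above, with page priority given by a count of sentiment-bearing words. The toy in-memory link graph replaces real HTTP fetching, and scoring not-yet-downloaded pages from their own text is a simplification (a real focused crawler would score from anchor text or the parent page). This is not the authors' sentiment-aware algorithm.

```python
import heapq

# Toy link graph: url -> (page text, outgoing links). A real crawler would
# download pages over HTTP instead of looking them up in this dictionary.
pages = {
    "a": ("great awesome product", ["b", "c"]),
    "b": ("terrible slow service", ["c", "d"]),
    "c": ("neutral text here", ["d"]),
    "d": ("awesome fast great", []),
}
SENTIMENT_WORDS = {"great", "awesome", "terrible", "slow", "fast"}

def score(text):
    """Priority of a page: number of sentiment-bearing words it contains."""
    return sum(w in SENTIMENT_WORDS for w in text.split())

def best_first_crawl(seed, budget=3):
    """Best-first crawl: always expand the highest-scoring known URL next,
    stopping after `budget` downloads."""
    frontier = [(-score(pages[seed][0]), seed)]   # max-heap via negated score
    visited, order = set(), []
    while frontier and len(order) < budget:
        _, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        order.append(url)
        _, links = pages[url]
        for nxt in links:
            if nxt not in visited:
                heapq.heappush(frontier, (-score(pages[nxt][0]), nxt))
    return order

print(best_first_crawl("a"))   # -> ['a', 'b', 'd'] for this toy graph
```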

SOURCE-FREQUENCY PHASE-REFERENCING OBSERVATION OF AGNS WITH KAVA USING SIMULTANEOUS DUAL-FREQUENCY RECEIVING

  • Zhao, Guang-Yao;Jung, Taehyun;Sohn, Bong Won;Kino, Motoki;Honma, Mareki;Dodson, Richard;Rioja, Maria;Han, Seog-Tae;Shibata, Katsunori;Byun, Do-Young;Akiyama, Kazunori;Algaba, Juan-Carlos;An, Tao;Cheng, Xiaopeng;Cho, Ilje;Cui, Yuzhu;Hada, Kazuhiro;Hodgson, Jeffrey A.;Jiang, Wu;Lee, Jee Won;Lee, Jeong Ae;Niinuma, Kotaro;Park, Jong-Ho;Ro, Hyunwook;Sawada-Satoh, Satoko;Shen, Zhi-Qiang;Tazaki, Fumie;Trippe, Sascha;Wajima, Kiyoaki;Zhang, Yingkang
    • Journal of The Korean Astronomical Society / v.52 no.1 / pp.23-30 / 2019
  • The KVN (Korean VLBI Network)-style simultaneous multi-frequency receiving mode has been demonstrated to be promising for mm-VLBI observations. Recently, other Very Long Baseline Interferometry (VLBI) facilities around the globe have started to implement compatible optics systems, so simultaneous dual-/multi-frequency VLBI observations at mm wavelengths on international baselines are now possible. In this paper, we present the results of the first successful simultaneous 22/43 GHz dual-frequency observation with KaVA (KVN and VERA Array), including images and astrometric results. Our analysis shows that the newly implemented simultaneous receiving system significantly extends the coherence time of the 43 GHz visibility phases on the international baselines. The astrometric results obtained with KaVA are consistent with those obtained from an independent analysis of the KVN data. Our results thus confirm the good performance of the simultaneous receiving systems at the non-KVN stations. Future simultaneous observations with more global stations will bring even higher sensitivity and micro-arcsecond-level astrometric measurements of the targets.
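
A simplified schematic of the frequency phase transfer idea behind the simultaneous dual-frequency technique (the full source-frequency phase-referencing calibration involves further steps with a second source):

```latex
% Decomposition of the visibility phase at observing frequency \nu (schematic)
\phi_{\nu} = \phi^{\mathrm{geo}}_{\nu} + \phi^{\mathrm{tropo}}_{\nu}
           + \phi^{\mathrm{iono}}_{\nu} + \phi^{\mathrm{inst}}_{\nu},
\qquad \phi^{\mathrm{tropo}}_{\nu} \propto \nu, \quad \phi^{\mathrm{iono}}_{\nu} \propto \nu^{-1}
% Frequency phase transfer: scale the simultaneously observed 22 GHz phase by
% R = \nu_{43}/\nu_{22} (\approx 2) and subtract it from the 43 GHz phase:
\phi_{43} - R\,\phi_{22}
```

Because the rapidly varying tropospheric term is non-dispersive (it scales linearly with frequency), it largely cancels in this difference, which is why the simultaneous receiving system extends the coherence time of the 43 GHz visibility phases.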

A Study on the Vulnerability Management of Internet Connection Devices based on Internet-Wide Scan (인터넷 와이드 스캔 기술 기반 인터넷 연결 디바이스의 취약점 관리 구조 연구)

  • Kim, Taeeun;Jung, Yong Hoon;Jun, Moon-Seog
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.9 / pp.504-509 / 2019
  • Recently, both wireless communication technology and the performance of small devices have advanced rapidly, and the number of services using various types of Internet of Things (IoT) devices has increased massively in line with ongoing technological and environmental changes. Furthermore, ever more devices that were previously used offline, including small sensors and CCTV cameras, are being connected to the Internet due to the huge increase in IoT services. However, many IoT devices are not equipped with security functions and use vulnerable open-source software as it is. In addition, conventional network equipment, such as switches and gateways, operates with vulnerabilities because users tend not to update it regularly. Recently, attackers have exploited simple vulnerabilities in IoT devices to build large botnets and launch distributed denial-of-service (DDoS) attacks. This paper proposes a system that can quickly identify Internet-connected devices and analyze and manage their vulnerabilities using Internet-wide scan technology. The vulnerability analysis rate of the proposed technology was verified using collected banner information. In the future, we plan to automate and upgrade the proposed system so that it can be used as a technology to prevent cyber attacks.
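
A minimal sketch of the banner-collection step that such a system relies on: connect to a port, read whatever the service announces first, and match it against a list of known-vulnerable version strings. The signature list below is hypothetical; a real system would consult a CVE database, and scanning must only target hosts one is authorized to test.

```python
import socket

def grab_banner(host, port, timeout=3.0):
    """Connect to host:port and return whatever the service sends first
    (its 'banner'); returns '' on timeout or connection failure."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.settimeout(timeout)
            return s.recv(1024).decode(errors="replace")
    except OSError:
        return ""

# Hypothetical mapping from banner substrings to known issues; a real system
# would match banners against a vulnerability (CVE) database instead.
VULNERABLE_SIGNS = {
    "OpenSSH_6.6": "outdated SSH daemon",
    "MiniHttpd/1.0": "embedded web server with known flaws",
}

def assess(host, ports=(21, 22, 80, 443)):
    """Collect banners from common ports and flag matches against the list."""
    findings = {}
    for port in ports:
        banner = grab_banner(host, port)
        for sign, issue in VULNERABLE_SIGNS.items():
            if sign in banner:
                findings[port] = issue
    return findings

# Example (only scan hosts you are authorized to test):
# print(assess("192.0.2.10"))
```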