• Title/Summary/Keyword: network computing


Hallym Jikimi: A Remote Monitoring System for Daily Activities of Elders Living Alone (한림 지킴이: 독거노인 일상 활동 원격 모니터링 시스템)

  • Lee, Seon-Woo;Kim, Yong-Joong;Lee, Gi-Sup;Kim, Byung-Jung
    • Journal of KIISE: Computing Practices and Letters / v.15 no.4 / pp.244-254 / 2009
  • This paper describes a remote system for monitoring the circadian behavioral patterns of elders who live alone. The proposed system was designed and implemented to provide the functionality required of a remote monitoring system for elders more conveniently and reliably, building on the first-phase prototype [2]. The developed system consists of an in-house sensing system and a server system. The in-house sensing system is a set of wireless sensor nodes, each equipped with a pyroelectric infrared (PIR) sensor to detect an elder's motion. Each sensing node sends its detection signal to a home gateway over a wireless link, and the home gateway stores the received signals in a remote database. The server system is composed of a database server and a web server, which provide a web-based monitoring service to caregivers (friends, family, and social workers) for more cost-effective intelligent care. The improved second-phase system adds 'automatic diagnosis', 'going-out detection', and an enhanced user interface. We evaluated the first- and second-phase monitoring systems in field experiments in which they operated continuously for 3 and 4 months in 9 and 15 elders' houses, respectively. The experimental results show promising potential for estimating an elder's behavioral patterns and current status despite the simplicity of the sensing capability.
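
A minimal sketch of the gateway-side ingestion step described above: wireless nodes report PIR detections, and the home gateway writes them to a database for later analysis of activity patterns. The event format, node IDs, and schema are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch only: a home-gateway loop that receives PIR detection
# events from wireless sensor nodes and stores them in a database.
# The event format, node IDs, and schema are assumptions for illustration.
import sqlite3
import time
from dataclasses import dataclass

@dataclass
class PirEvent:
    node_id: int        # which sensing node reported motion
    timestamp: float    # UNIX time of the detection

def open_store(path="activity.db"):
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS detections (node_id INTEGER, ts REAL)"
    )
    return conn

def ingest(conn, event: PirEvent):
    # Persist each detection so the server side can later derive
    # circadian activity patterns (e.g., detection counts per hour per node).
    conn.execute(
        "INSERT INTO detections (node_id, ts) VALUES (?, ?)",
        (event.node_id, event.timestamp),
    )
    conn.commit()

if __name__ == "__main__":
    conn = open_store()
    # Simulated detections from three rooms.
    for node in (1, 2, 3):
        ingest(conn, PirEvent(node_id=node, timestamp=time.time()))
```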

A Centralized Deployment Protocol with Sufficient Coverage and Connectivity Guarantee for WSNs (무선 센서 네트워크에서 유효 커버리지 및 접속성 보장을 위한 중앙 집중형 배치 프로토콜)

  • Kim, Hyun-Tae;Zhang, Gui-Ping;Kim, Hyoung-Jin;Joo, Young-Hoon;Ra, In-Ho
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.6 / pp.683-690 / 2006
  • Reducing power consumption to extend network lifetime is one of the most important challenges in designing wireless sensor networks. One promising approach to conserving system energy is to keep only a minimal number of sensors active and put the others into low-power sleep mode, while the active sensors maintain a connected cover set for the target area. Computing such a minimum working sensor set is NP-hard. In this paper, a centralized Voronoi tessellation (CVT) based approximation algorithm is proposed to construct a near-optimal cover set. When a sensor's communication radius is at least twice its sensing radius, the cover set is connected as well; when the communication radius is smaller than twice the sensing radius, a connection scheme is proposed to compute the assistant nodes needed to make the cover set connected. Finally, the performance of the proposed algorithm is evaluated through theoretical analysis and extensive numerical experiments. Experimental results show that the proposed algorithm outperforms the greedy algorithm in terms of runtime and the size of the constructed connected cover set.
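
A minimal sketch of the connectivity condition mentioned in the abstract: when the communication radius is at least twice the sensing radius, a covering set is automatically connected, so an explicit connectivity check (a simple BFS over the communication graph, shown below) is only needed in the smaller-radius case. The positions and radii are invented; this is not the paper's CVT construction.

```python
# Illustrative sketch: verifying that a chosen set of active sensors forms a
# connected set under a communication radius rc. Positions and radii are
# invented for the example; this is not the paper's CVT algorithm itself.
import math
from collections import deque

def connected(nodes, rc):
    """nodes: list of (x, y) positions of the active sensors."""
    if not nodes:
        return True
    adj = {i: [] for i in range(len(nodes))}
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            if math.dist(nodes[i], nodes[j]) <= rc:
                adj[i].append(j)
                adj[j].append(i)
    seen = {0}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(nodes)

# If the communication radius is at least twice the sensing radius,
# a set that covers the region is automatically connected, so this
# check only matters when rc < 2 * rs (as in this toy example).
rs, rc = 10.0, 15.0
active = [(0, 0), (12, 0), (24, 0)]
print(connected(active, rc))  # True: each hop is within rc
```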

A Data Aggregation Scheme for Enhancing the Efficiency of Data Aggregation and Correctness in Wireless Sensor Networks (무선 센서 네트워크에서 데이터 수집의 효율성 및 정확성 향상을 위한 데이터 병합기법)

  • Kim, Hyun-Tae;Yu, Tae-Young;Jung, Kyu-Su;Jeon, Yeong-Bae;Ra, In-Ho
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.5 / pp.531-536 / 2006
  • Recently, with rapid advances in sensor and wireless communication technologies, many researchers have studied data-processing-oriented middleware for wireless sensor networks. In a wireless sensor network, the middleware should handle data loss at intermediate sensor nodes caused by instantaneous data bursts, so that sensed data can be processed efficiently and delivered quickly. To handle this problem, a simple data-discarding or data-compression policy is typically used to reduce the total amount of data to be transferred. However, a discarding policy decreases the correctness of the collected data, while a compression policy adds processing overhead due to the complexity of the algorithm. This paper proposes a data-averaging method that improves the efficiency and correctness of data aggregation when sensed data must be delivered with only limited computing power and energy. With the proposed method, unnecessary transfers of overlapping data are eliminated, and data correctness is enhanced by the averaging scheme when an instantaneous data burst occurs. Finally, TOSSIM simulation results on TinyBB show that the correctness of the transferred data is enhanced.
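
A minimal sketch of the averaging idea described above: instead of discarding readings during a burst, an intermediate node collapses each window of overlapping readings into one averaged value before forwarding. The window size and reading format are assumptions, not the paper's parameters.

```python
# Illustrative sketch: averaging bursty, overlapping readings at an
# intermediate node before forwarding, instead of discarding them.
# The window size and reading format are assumed for the example.
from statistics import mean

def aggregate_burst(readings, window=4):
    """Collapse each window of readings into a single averaged value."""
    forwarded = []
    for start in range(0, len(readings), window):
        chunk = readings[start:start + window]
        forwarded.append(mean(chunk))
    return forwarded

burst = [21.0, 21.2, 20.9, 21.1, 24.8, 25.0, 24.9, 25.1]
print(aggregate_burst(burst))  # two averaged values forwarded instead of eight
```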

Development of Korea Ocean Satellite Center (KOSC): System Design on Reception, Processing and Distribution of Geostationary Ocean Color Imager (GOCI) Data (해양위성센터 구축: 통신해양기상위성 해색센서(GOCI) 자료의 수신, 처리, 배포 시스템 설계)

  • Yang, Chan-Su;Cho, Seong-Ick;Han, Hee-Jeong;Yoon, Sok;Kwak, Ki-Yong;Yhn, Yu-Whan
    • Korean Journal of Remote Sensing / v.23 no.2 / pp.137-144 / 2007
  • In KORDI (Korea Ocean Research and Development Institute), the KOSC (Korea Ocean Satellite Center) construction project is underway for the acquisition, processing, and distribution of sensor data received via L-band from the GOCI (Geostationary Ocean Color Imager) instrument aboard COMS (Communication, Ocean and Meteorological Satellite), which will be launched in 2008. Ansan (the headquarters of KORDI) was selected as the location of KOSC among five proposed sites because it offers the best radio reception conditions. The data acquisition system is divided into the antenna and the RF module. The antenna is designed as a 9 m diameter Cassegrain antenna with a G/T of 19.35 dB/K at 1.67 GHz. The RF module consists of an LNA (low noise amplifier) and a down converter, which are designed to pass only the horizontal polarization to the modem. The existing building has been redesigned and arranged according to the KOSC operation concept: a computing room, an electrical room, a data processing room, and an operation room. Hardware and network facilities have been designed for the efficiency of each function. The distribution system, one of the most important subsystems, will be built mainly on the Internet; an external data distribution system based on a web hosting service is also under consideration so that received data can be offered to users within an hour.

A Study on the Research Trends for Smart City using Topic Modeling (토픽 모델링을 활용한 스마트시티 연구동향 분석)

  • Park, Keon Chul;Lee, Chi Hyung
    • Journal of Internet Computing and Services / v.20 no.3 / pp.119-128 / 2019
  • This study aims to analyze research trends on Smart City and to present implications for policy makers, industry professionals, and researchers. Cities around the globe have undergone rapid urbanization and a consequent dramatic increase in urban dwellings over the past few decades, and have faced many urban problems in areas such as transportation, environment, and housing. They are hurrying to introduce Smart City initiatives to pursue the common goal of solving these urban problems and improving the quality of life. However, the variety of conceptual approaches to the smart city causes uncertainty in setting policy goals and establishing a direction for implementation. The study collected 11,527 papers titled "Smart City (cities)" from the Scopus and Springer databases, and then analyzed research status, topics, and trends based on abstracts and publication year using LDA-based topic modeling. Research topics are classified into three categories (Services, Technologies, and User Perspective) and eight constituent topics. Of the eight topics, citizen-driven innovation is the most frequently referenced. Additional topic network analysis reveals that data and privacy/security are the most prevalent topics affecting the others. This study is expected to help readers understand the trends of Smart City research and anticipate future research.
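
A minimal sketch of LDA-based topic modeling on paper abstracts, in the spirit of the study above. The sample abstracts and the number of topics are placeholders; the study itself applied LDA to 11,527 abstracts and identified eight topics.

```python
# Illustrative sketch: discover topics in a tiny set of sample abstracts
# with LDA. The abstracts and topic count are placeholders, not the
# study's corpus or settings.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "smart city traffic management using iot sensors",
    "privacy and security challenges in smart city data platforms",
    "citizen participation and open innovation in urban services",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(abstracts)

# The study used eight topics over a much larger corpus; three suffices here.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(doc_term)

# Print the top words of each discovered topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {k}: {top}")
```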

A proposal on a proactive crawling approach with analysis of state-of-the-art web crawling algorithms (최신 웹 크롤링 알고리즘 분석 및 선제적인 크롤링 기법 제안)

  • Na, Chul-Won;On, Byung-Won
    • Journal of Internet Computing and Services / v.20 no.3 / pp.43-59 / 2019
  • Today, with the spread of smartphones and the development of social networking services, structured and unstructured big data have accumulated exponentially. If analyzed well, these data yield useful information for predicting the future. To analyze big data, large amounts of data must first be collected, and the web is the repository where most of these data are stored. However, because of the sheer volume, there are as many pages containing unnecessary information as pages containing useful information. This makes it important to collect data efficiently, filtering out unnecessary pages and collecting only useful ones. Web crawlers cannot download all pages due to constraints such as network bandwidth, operation time, and data storage, which is why a crawler should avoid visiting pages that are not relevant to its goal and download only important pages as soon as possible. This paper seeks to help resolve these issues. First, we introduce basic web crawling algorithms; for each algorithm, the time complexity, advantages, and disadvantages are described, compared, and analyzed. Next, we introduce state-of-the-art web crawling algorithms that improve on the shortcomings of the basic algorithms. Recent research trends also show that special-purpose crawling algorithms, such as those for collecting sentiment words, are being actively studied; we introduce a sentiment-aware web crawling technique, a proactive crawling technique, as an example of such a special-purpose algorithm. The results show that the larger the data, the higher the performance and the greater the space savings.
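
A minimal sketch of a focused crawler that prioritizes pages by a relevance score, in the spirit of the special-purpose crawling discussed above. The scoring function and the fetch/extract stubs are placeholders, not the paper's sentiment-aware method.

```python
# Illustrative sketch of a focused crawler: visit high-priority pages first
# so that important pages are downloaded as early as possible within a budget.
# The relevance score and the fetch/extract callables are placeholders.
import heapq

def score(url, text):
    # Placeholder relevance score: count occurrences of target terms.
    targets = ("good", "bad", "excellent", "terrible")
    return sum(text.lower().count(t) for t in targets)

def crawl(seed_urls, fetch, extract_links, budget=100):
    """fetch(url) -> page text; extract_links(text) -> list of URLs."""
    frontier = [(0, url) for url in seed_urls]   # (negated priority, url)
    heapq.heapify(frontier)
    visited, collected = set(), []
    while frontier and len(visited) < budget:
        _, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        text = fetch(url)
        collected.append((url, score(url, text)))
        for link in extract_links(text):
            if link not in visited:
                # Links found on higher-scoring pages rise in the queue.
                heapq.heappush(frontier, (-score(link, text), link))
    return collected

# Toy usage with an in-memory "web" of two pages and no outgoing links.
pages = {"a": "excellent page", "b": "terrible page"}
print(crawl(["a", "b"], fetch=pages.get, extract_links=lambda text: []))
```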

RDP-based Lateral Movement Detection using PageRank and Interpretable System using SHAP (PageRank 특징을 활용한 RDP기반 내부전파경로 탐지 및 SHAP를 이용한 설명가능한 시스템)

  • Yun, Jiyoung;Kim, Dong-Wook;Shin, Gun-Yoon;Kim, Sang-Soo;Han, Myung-Mook
    • Journal of Internet Computing and Services / v.22 no.4 / pp.1-11 / 2021
  • As the Internet has developed, various and complex cyber attacks have emerged. Various detection systems have been deployed at the network perimeter to defend against attacks, but systems and studies for detecting attackers already inside the network have been remarkably rare, which is a serious problem because internal attackers go undetected. To solve this problem, studies on lateral movement detection systems that track and detect an attacker's movements have begun to emerge. In particular, methods based on the Remote Desktop Protocol (RDP) are simple but show very good results. Nevertheless, previous studies did not consider the effects and relationships of each logon host itself, and the proposed features performed poorly in some models. The models also could not explain why they made their predictions, which raised reliability and robustness issues. To address these problems, this study proposes an interpretable RDP-based lateral movement detection system using the PageRank algorithm and SHAP (SHapley Additive exPlanations). Using the PageRank algorithm and various statistical techniques, we create features that can be used in a range of models, and we explain model predictions using SHAP. The generated features show higher performance in most models than previous studies, and their contributions are explained with SHAP.
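
A minimal sketch in the spirit of the approach above: derive PageRank and degree features from an RDP logon graph, train a classifier, and explain its predictions with SHAP. The logon events, labels, and feature set are synthetic examples, not the paper's data.

```python
# Illustrative sketch: PageRank-based host features for lateral movement
# detection plus SHAP explanations. Events and labels are synthetic.
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
import shap

# Each tuple is (source_host, destination_host) for one RDP logon.
logons = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "C"), ("D", "A")]

graph = nx.DiGraph()
graph.add_edges_from(logons)
pr = nx.pagerank(graph)                     # importance of each host
in_deg = dict(graph.in_degree())
out_deg = dict(graph.out_degree())

hosts = sorted(graph.nodes())
X = np.array([[pr[h], in_deg[h], out_deg[h]] for h in hosts])
y = np.array([0, 0, 0, 1])                  # toy labels: host D flagged

model = RandomForestClassifier(random_state=0).fit(X, y)

# SHAP values show how much each feature pushed each prediction.
explainer = shap.Explainer(model)
explanation = explainer(X)
print(np.abs(explanation.values).mean(axis=0))  # mean |SHAP| per feature
```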

An IoT based Green Home Architecture for Green Score Calculation towards Smart Sustainable Cities

  • Kumaran, K. Manikanda;Chinnadurai, M.;Manikandan, S.;Murugan, S. Palani;Elakiya, E.
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.7 / pp.2377-2398 / 2021
  • In the modernized world, the utilization of natural resources (renewable and non-renewable) is increasing drastically due to people's sophisticated lifestyles. The over-consumption of non-renewable resources causes pollution, which leads to global warming. Consequently, government agencies have been taking several initiatives to control the over-consumption of non-renewable natural resources and to encourage the production of renewable energy. In this regard, we introduce an IoT-powered integrated framework called green home architecture (GHA) for green score calculation based on the use of natural resources for household purposes. The green score is a credit score (out of 10 points) for a household, calculated once a month based on energy consumption, renewable energy production, and pollution caused; it can be improved by reducing energy consumption, generating renewable energy, and preventing pollution. The main objective of GHA is to monitor the day-to-day usage of resources and calculate the green score using the proposed green score algorithm, which gives positive credits for economical consumption of resources and the production of renewable energy, and negative credits for pollution caused. We also recommend a green-score-based tax calculation system that grants tax exemptions according to the green score. This direct-benefit model rewards and encourages citizens to consume fewer natural resources and to prevent pollution. Rather than simply granting a subsidy, the proposed system allows the subsidy scheme to be monitored periodically and encourages proper operation through tax exemption rewards. GHA also monitors household appliances, vehicles, windmills, the electricity meter, the water re-treatment plant, and pollution levels, reading consumption and production in appropriate units with suitable sensors. These values are stored on a mass storage platform such as the cloud for green score calculation and are also used for billing by government agencies. This integrated platform can replace manual billing and directly benefits the government.
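
A minimal sketch of a monthly green score computation along the lines described above: positive credit for economical consumption and renewable production, a penalty for pollution, clipped to a 0-10 scale. The baselines and weights are invented; the paper's algorithm may differ.

```python
# Illustrative sketch of a monthly household green score: credits for saving
# energy and producing renewable energy, a penalty for pollution, on a 0-10
# scale. Baseline and weights are invented, not the paper's values.
def green_score(energy_kwh, renewable_kwh, pollution_index,
                baseline_kwh=300.0):
    saving_credit = max(0.0, (baseline_kwh - energy_kwh) / baseline_kwh) * 4.0
    renewable_credit = min(renewable_kwh / baseline_kwh, 1.0) * 4.0
    pollution_penalty = min(pollution_index, 1.0) * 2.0
    score = 2.0 + saving_credit + renewable_credit - pollution_penalty
    return round(max(0.0, min(10.0, score)), 1)

# A household that used 250 kWh, produced 60 kWh of solar power,
# and had a low pollution index for the month.
print(green_score(energy_kwh=250, renewable_kwh=60, pollution_index=0.2))
```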

Analysis of Feature Map Compression Efficiency and Machine Task Performance According to Feature Frame Configuration Method (피처 프레임 구성 방안에 따른 피처 맵 압축 효율 및 머신 태스크 성능 분석)

  • Rhee, Seongbae;Lee, Minseok;Kim, Kyuheon
    • Journal of Broadcast Engineering / v.27 no.3 / pp.318-331 / 2022
  • With the recent development of hardware computing devices and software frameworks, machine tasks using deep learning networks are expected to be used in various industrial fields and personal IoT devices. However, to overcome the limitations of the costly devices needed to run deep learning networks, and the fact that users may not receive the results they requested when only machine task results are transmitted from the server, Collaborative Intelligence (CI) proposes transmitting feature maps as a solution. In this paper, an efficient compression method for feature maps, which have vast data sizes, is analyzed and presented through experiments to support the CI paradigm. The method applies feature map reordering to increase redundancy and thereby improve compression efficiency in traditional video codecs, and a feature frame configuration is proposed that improves compression efficiency while maintaining machine task performance by using image and video compression formats together. Experimental results show that the proposed method achieves a 14.29% BD-rate gain in terms of BPP and mAP compared to the MPEG-VCM feature compression anchor.
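
A minimal sketch of packing feature map channels into a single 2D feature frame so that a conventional image or video codec can compress it, with a simple channel reordering to increase spatial redundancy. The tiling layout and the mean-based ordering are assumptions, not the paper's scheme.

```python
# Illustrative sketch: tile the channels of a feature map into one 2D
# "feature frame" for compression by a standard codec, after reordering
# channels to place similar ones next to each other. Layout and ordering
# criterion are assumptions for illustration only.
import numpy as np

def pack_feature_frame(feature_map, cols=8):
    """feature_map: array of shape (channels, height, width)."""
    c, h, w = feature_map.shape
    # Sort channels by mean activation so neighboring tiles look similar,
    # which increases redundancy the codec can exploit.
    order = np.argsort(feature_map.mean(axis=(1, 2)))
    rows = int(np.ceil(c / cols))
    frame = np.zeros((rows * h, cols * w), dtype=feature_map.dtype)
    for slot, ch in enumerate(order):
        r, col = divmod(slot, cols)
        frame[r * h:(r + 1) * h, col * w:(col + 1) * w] = feature_map[ch]
    return frame, order

features = np.random.rand(64, 16, 16).astype(np.float32)
frame, order = pack_feature_frame(features)
print(frame.shape)  # (128, 128): ready to be quantized and fed to a codec
```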

Estimation of KOSPI200 Index option volatility using Artificial Intelligence (이기종 머신러닝기법을 활용한 KOSPI200 옵션변동성 예측)

  • Shin, Sohee;Oh, Hayoung;Kim, Jang Hyun
    • Journal of the Korea Institute of Information and Communication Engineering / v.26 no.10 / pp.1423-1431 / 2022
  • Volatility is one of the variables that the Black-Scholes model requires for option pricing. It is unknown at the present time; however, since option prices can be observed in the market, implied volatility can be derived from an option's price at any given point in time and represents the market's expectation of future volatility. Although volatility in the Black-Scholes model is constant, when calculating implied volatility it is common to observe a volatility smile, which shows that implied volatility differs across strike prices. We implement supervised learning with implied volatility as the target, adding V-KOSPI as a feature to mitigate the volatility smile. We examine the estimation performance for the implied volatility of KOSPI200 index options using various machine learning algorithms such as linear regression, decision trees, support vector machines, KNN, and deep neural networks. Training accuracy was highest (99.9%) for the decision tree model, and test accuracy was highest (96.9%) for the random forest model.
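
A minimal sketch of comparing regressors for implied volatility estimation, in the spirit of the study above. The features (moneyness, time to maturity, V-KOSPI level) and the synthetic data are placeholders, not the paper's dataset or exact feature set.

```python
# Illustrative sketch: compare several regressors on a synthetic implied
# volatility surface with a smile. Data and features are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 2000
moneyness = rng.uniform(0.8, 1.2, n)          # strike / underlying
ttm = rng.uniform(0.05, 1.0, n)               # time to maturity in years
vkospi = rng.uniform(10, 40, n)               # volatility index level
# Synthetic "smile": higher IV away from the money, scaled by V-KOSPI.
iv = 0.01 * vkospi + 0.5 * (moneyness - 1.0) ** 2 + rng.normal(0, 0.01, n)

X = np.column_stack([moneyness, ttm, vkospi])
X_tr, X_te, y_tr, y_te = train_test_split(X, iv, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("tree", DecisionTreeRegressor(random_state=0)),
                    ("forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    print(name, round(model.score(X_te, y_te), 3))   # R^2 on the test split
```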