• Title/Summary/Keyword: Computer System

Why Gabor Frames? Two Fundamental Measures of Coherence and Their Role in Model Selection

  • Bajwa, Waheed U.;Calderbank, Robert;Jafarpour, Sina
    • Journal of Communications and Networks / v.12 no.4 / pp.289-307 / 2010
  • The problem of model selection arises in a number of contexts, such as subset selection in linear regression, estimation of structures in graphical models, and signal denoising. This paper studies non-asymptotic model selection for the general case of arbitrary (random or deterministic) design matrices and arbitrary nonzero entries of the signal. In this regard, it generalizes the notion of incoherence in the existing literature on model selection and introduces two fundamental measures of coherence, termed the worst-case coherence and the average coherence, among the columns of a design matrix. It utilizes these two measures of coherence to provide an in-depth analysis of a simple, model-order agnostic one-step thresholding (OST) algorithm for model selection and proves that OST is feasible for exact as well as partial model selection as long as the design matrix obeys an easily verifiable property, termed the coherence property. One of the key insights offered by the ensuing analysis is that OST can successfully carry out model selection even when methods based on convex optimization, such as the lasso, fail due to rank deficiency of the submatrices of the design matrix. In addition, the paper establishes that if the design matrix has reasonably small worst-case and average coherence, then OST performs near-optimally when either (i) the energy of any nonzero entry of the signal is close to the average signal energy per nonzero entry or (ii) the signal-to-noise ratio in the measurement system is not too high. Finally, two other key contributions of the paper are that (i) it provides bounds on the average coherence of Gaussian matrices and Gabor frames, and (ii) it extends the results on model selection using OST to low-complexity, model-order agnostic recovery of sparse signals with arbitrary nonzero entries. In particular, this part of the analysis implies that an Alltop Gabor frame together with OST can successfully carry out model selection and recovery of sparse signals, irrespective of the phases of the nonzero entries, even if the number of nonzero entries scales almost linearly with the number of rows of the Alltop Gabor frame.
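  • A minimal, hypothetical sketch of the one-step thresholding idea summarized above (not the paper's exact procedure; the paper ties the threshold to the coherence and noise parameters, whereas the design matrix, sparsity, and threshold below are ad hoc illustrative choices): correlate the noisy measurement with every column of the design matrix and keep the columns whose correlation magnitude exceeds the threshold. The whole selection step costs a single matrix-vector product.

    import numpy as np

    def one_step_thresholding(X, y, threshold):
        """Sketch of OST: keep the columns of X whose correlation with y is large."""
        correlations = X.conj().T @ y                      # matched-filter statistics
        return np.flatnonzero(np.abs(correlations) > threshold)

    # Illustrative usage with a random Gaussian design (columns roughly unit norm).
    rng = np.random.default_rng(0)
    n, p, k = 200, 400, 3
    X = rng.standard_normal((n, p)) / np.sqrt(n)
    beta = np.zeros(p)
    true_support = rng.choice(p, size=k, replace=False)
    beta[true_support] = 1.0
    y = X @ beta + 0.05 * rng.standard_normal(n)

    estimated_support = one_step_thresholding(X, y, threshold=0.6)  # ad hoc threshold
    print(sorted(true_support), sorted(estimated_support))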

Flow analysis of the Sump Pump (흡수정의 유동해석)

  • Jung, Han-Byul;Noh, Seung-Hee
    • Journal of the Korea Academia-Industrial cooperation Society / v.18 no.3 / pp.673-680 / 2017
  • A sump pump is a system that draws in water stored in a dam or reservoir. Such pumps are used to move large amounts of water for the cooling systems of large power plants, such as thermal and nuclear plants. However, if the ratio of the sump size to the flow is small, the flow velocity increases around the inlet port, which causes turbulent vortices or swirl flows. The turbulent flow reduces the pump performance and can cause failure. Various methods have been devised to solve this problem, but a proper solution has not been found for low water levels. The most efficient solution is to install an anti-vortex device (AVD) or increase the length of the sump inlet, which makes the flow uniform. This paper presents a computational fluid dynamics (CFD) analysis of the flow characteristics in a sump pump for different sump inlet lengths and AVD types. Modeling was performed in three stages based on the pump intake, sump, and pump. For accurate analysis, the grid was made denser in the intake part, and the grids for the sump pump and AVD were also made dense. 1.2-1.5 million grid elements were generated using ANSYS ICEM-CFD 14.5 with a mixture of tetra and prism elements. The analysis was done using the SST turbulence model of ANSYS CFX 14.5, a commercial CFD program. The conditions were as follows: H.W.L 6.0 m, L.W.L 3.5 m, Qmax 4.000 kg/s, Qavg 3.500 kg/s, Qmin 2.500 kg/s. The results of the analysis of the vortex angle and velocity distribution are as follows. A sump pump with an Ext E-type AVD was acceptable at the high water level. However, further studies are needed for the low water level, using the Ext E-type AVD as a base.

Digital Forensic Investigation of HBase (HBase에 대한 디지털 포렌식 조사 기법 연구)

  • Park, Aran;Jeong, Doowon;Lee, Sang Jin
    • KIPS Transactions on Computer and Communication Systems / v.6 no.2 / pp.95-104 / 2017
  • As smart device technology grows and Social Network Services (SNS) become more common, the amount of data that is difficult to process with existing RDBMSs is increasing. As a result, NoSQL databases are becoming popular as an alternative for processing the massive, unstructured data generated in real time. Although digital investigation techniques for databases have been researched mainly for RDBMSs, the demand for digital investigation techniques for NoSQL databases is growing as more businesses introduce NoSQL databases into their systems. New digital forensic investigation techniques are needed because a NoSQL database has no schema to normalize and its storage method differs depending on the type of database and the operating environment. Research on document-based NoSQL databases has been done, but it is not directly applicable to other types of NoSQL databases. Therefore, this paper presents the operation method and data model, the operating environment, the collection and analysis of artifacts, and a recovery technique for deleted data in HBase, a column-based NoSQL database. The proposed digital forensic investigation technique for HBase is also verified with an experimental scenario.

Improving Haskell GC-Tuning Time Using Divide-and-Conquer (분할 정복법을 이용한 Haskell GC 조정 시간 개선)

  • An, Hyungjun;Kim, Hwamok;Liu, Xiao;Kim, Yeoneo;Byun, Sugwoo;Woo, Gyun
    • KIPS Transactions on Computer and Communication Systems / v.6 no.9 / pp.377-384 / 2017
  • The performance improvement of single-core processors has reached its limit since circuit density cannot be increased any longer due to overheating. Therefore, multicore and manycore architectures have emerged as viable approaches, and parallel programming has become more important. Haskell, a purely functional language, is getting popular in this situation since it naturally supports parallel programming owing to its beneficial features, including implicit parallelism in evaluating expressions and monadic tools supporting parallel constructs. However, the performance of Haskell parallel programs is strongly influenced by the performance of the run-time system, including the garbage collector. Although a memory profiling tool named GC-tune has been suggested, a more systematic way to use it is needed. Since GC-tune finds the optimal memory size by executing the target program with all the different possible GC options, the GC-tuning time takes too long. This paper suggests a basic divide-and-conquer method that reduces the number of GC-tune executions by shrinking the search area to one quarter at every search step. Applying this method to two parallel programs, a maximal independent set program and a K-means program, reduced the memory tuning time by a factor of 7.78 with 98% accuracy on average.
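  • A hypothetical sketch of the quarter-shrinking search described above (not the authors' implementation; the two tuned parameters are assumed here to be GHC's allocation-area size -A and suggested heap size -H, and the target program path is a placeholder): the two parameters span a rectangle, each step measures the centre of the four quadrants, and the search recurses into the best quadrant, so the remaining area is one quarter of the previous step's.

    import subprocess
    import time

    def measure(alloc_kb, heap_kb, program="./parallel_prog"):
        """Hypothetical measurement: run the target Haskell program with the given
        GHC RTS options and return its wall-clock time (lower is better)."""
        cmd = [program, "+RTS", f"-A{alloc_kb}k", f"-H{heap_kb}k", "-RTS"]
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    def quarter_search(a_lo, a_hi, h_lo, h_hi, min_span=512):
        """Divide-and-conquer over the 2D (allocation, heap) space: evaluate the
        centre of each quadrant and keep only the best quadrant for the next step."""
        while (a_hi - a_lo) > min_span or (h_hi - h_lo) > min_span:
            a_mid, h_mid = (a_lo + a_hi) // 2, (h_lo + h_hi) // 2
            quadrants = [(a_lo, a_mid, h_lo, h_mid), (a_mid, a_hi, h_lo, h_mid),
                         (a_lo, a_mid, h_mid, h_hi), (a_mid, a_hi, h_mid, h_hi)]
            # Keep the quadrant whose centre point runs fastest.
            a_lo, a_hi, h_lo, h_hi = min(
                quadrants,
                key=lambda q: measure((q[0] + q[1]) // 2, (q[2] + q[3]) // 2))
        return (a_lo + a_hi) // 2, (h_lo + h_hi) // 2   # tuned (-A, -H) in KB

    # Illustrative usage: allocation area 64 KB..512 MB, suggested heap 1 MB..1 GB.
    # best_a, best_h = quarter_search(64, 512 * 1024, 1024, 1024 * 1024)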

Time Series Analysis for Traffic Flow Using Dynamic Linear Model (동적 선형 모델을 이용한 교통 흐름 시계열 분석)

  • Kim, Hong Geun;Park, Chul Young;Shin, Chang Sun;Cho, Yong Yun;Park, Jang Woo
    • KIPS Transactions on Computer and Communication Systems / v.6 no.4 / pp.179-188 / 2017
  • Analyzing the traffic flow in a city is very challenging because of the many traffic accidents, intersections, and pedestrians. Nowadays, even mid-size cities have deployed Bus Information Systems (BIS), which offer passengers forecasts of bus arrival times at the stations. A BIS also provides further information such as the current locations and the departure and arrival times of buses. In this paper, we perform a time-series analysis of the traffic flow using the average travel time and average speed between stations extracted from the BIS. In mid-size cities, BIS data can play an important role in the prediction and analysis of traffic flow. We used the Dynamic Linear Model (DLM) to build a time-series forecasting model that analyzes and predicts the average speeds at given locations, which appear to be representative of the traffic in the city. In particular, we analyze travel times for weekdays and weekends separately. We think this study can help forecast traffic jams, congested areas, and more accurate bus arrival times.
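  • A minimal sketch of a dynamic linear model in the spirit of the paper, not the authors' exact specification: a local-level DLM in which the observed average speed equals a latent level plus observation noise and the level follows a random walk. The Kalman filter below uses hypothetical variance parameters, and the simulated series stands in for BIS travel-speed data.

    import numpy as np

    def local_level_dlm(y, obs_var=4.0, level_var=0.5):
        """Kalman filter for the local-level DLM:
            y_t  = mu_t + v_t,       v_t ~ N(0, obs_var)
            mu_t = mu_{t-1} + w_t,   w_t ~ N(0, level_var)
        Returns the filtered level and the one-step-ahead forecast."""
        mu, P = y[0], 1.0                # initial state estimate and its variance
        filtered = []
        for obs in y:
            P = P + level_var            # predict: random-walk level
            K = P / (P + obs_var)        # Kalman gain
            mu = mu + K * (obs - mu)     # update with the new observation
            P = (1.0 - K) * P
            filtered.append(mu)
        return np.array(filtered), mu    # the last mu is the one-step-ahead forecast

    # Illustrative usage with simulated average speeds (km/h) between two stations.
    rng = np.random.default_rng(1)
    speeds = 30 + np.cumsum(rng.normal(0, 0.5, 200)) + rng.normal(0, 2.0, 200)
    level, next_speed = local_level_dlm(speeds)
    print(f"one-step-ahead speed forecast: {next_speed:.1f} km/h")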

Improved CS-RANSAC Algorithm Using K-Means Clustering (K-Means 클러스터링을 적용한 향상된 CS-RANSAC 알고리즘)

  • Ko, Seunghyun;Yoon, Ui-Nyoung;Alikhanov, Jumabek;Jo, Geun-Sik
    • KIPS Transactions on Software and Data Engineering / v.6 no.6 / pp.315-320 / 2017
  • Efficiently estimating the correct pose of augmented objects on the real camera view is one of the most important problems in the image tracking area. In computer vision, a homography is used for camera pose estimation in markerless augmented reality systems. To estimate the homography, features such as SURF are extracted from the images, and the homography is estimated from the extracted features. For this purpose, the RANSAC algorithm is widely used, and the DCS-RANSAC algorithm, which dynamically applies constraints based on the Constraint Satisfaction Problem, has been researched to improve performance. In DCS-RANSAC, however, the dataset is grouped manually by the pattern of feature distribution of the images, so the algorithm cannot classify input images whose feature-distribution pattern is not recognized, which reduces its performance. To solve this problem, we propose the KCS-RANSAC algorithm, which applies K-means clustering to CS-RANSAC to cluster the images automatically by their feature-distribution pattern and applies constraints to each image group. The experimental results show that our KCS-RANSAC algorithm outperformed the DCS-RANSAC algorithm in terms of speed, accuracy, and inlier rate.
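  • A hypothetical sketch of the clustering step described above (the feature detector, the distribution descriptor, and the per-cluster constraint sets are illustrative assumptions; ORB is used here in place of SURF, which requires the non-free OpenCV contrib build): each image is summarized by a grid histogram of its keypoint locations, the histograms are clustered with K-means, and a constraint set is looked up by cluster label before running CS-RANSAC.

    import cv2
    import numpy as np
    from sklearn.cluster import KMeans

    def feature_distribution(image, grid=4):
        """Summarize where keypoints fall in the image as a normalized grid histogram."""
        keypoints = cv2.ORB_create(nfeatures=500).detect(image, None)
        h, w = image.shape[:2]
        hist = np.zeros((grid, grid))
        for kp in keypoints:
            x, y = kp.pt
            hist[min(int(y / h * grid), grid - 1), min(int(x / w * grid), grid - 1)] += 1
        return (hist / max(hist.sum(), 1)).ravel()

    def cluster_images(images, n_clusters=3):
        """Cluster images by their feature-distribution pattern using K-means."""
        descriptors = np.stack([feature_distribution(img) for img in images])
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(descriptors)
        return km, km.labels_

    # Hypothetical constraint sets chosen per cluster for the CS-RANSAC sampling step.
    CONSTRAINTS = {0: "features spread over the image",
                   1: "features concentrated in the centre",
                   2: "features concentrated on one side"}

    # Usage sketch: images = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in paths]
    # km, labels = cluster_images(images); constraints = [CONSTRAINTS[l] for l in labels]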

A Study on Big Data Based Non-Face-to-Face Identity Proofing Technology (빅데이터 기반 비대면 본인확인 기술에 대한 연구)

  • Jung, Kwansoo;Yeom, Hee Gyun;Choi, Daeseon
    • KIPS Transactions on Computer and Communication Systems / v.6 no.10 / pp.421-428 / 2017
  • Various approaches to non-face-to-face identification technology for registering and authenticating users online are required because of the growth of online financial services and the rapid development of financial technology. In general, non-face-to-face approaches can be exposed to more threats than face-to-face approaches. Therefore, identification policies and technologies that verify users by using various factors and channels are being studied in order to mitigate these risks and provide more reliable non-face-to-face identification methods. One of these new approaches is to collect and verify a large amount of personal information about the user. Therefore, we propose a big-data based non-face-to-face identity proofing method that verifies identity online based on a large and varied amount of user information. The proposed method also provides an identification information management scheme that collects and verifies only the user information required for the identity verification level demanded by the service. In addition, we propose an identity information sharing model that can provide the information to other service providers so that users can reuse verified identity information. Finally, we demonstrate the approach by implementing a system that verifies and manages only the identity assurance level required by the service through enhanced user verification in the non-face-to-face identity proofing process.
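  • A hypothetical sketch of the level-based collection idea described above (the level names, required attributes, and data values are illustrative assumptions, not the paper's scheme): each identity assurance level maps to the minimum set of user attributes, and the service collects and verifies only that set, so no more personal information is gathered than the requested level requires.

    # Hypothetical mapping from identity assurance level to the minimum attributes
    # that must be collected and verified for that level.
    REQUIRED_ATTRIBUTES = {
        "low":    {"phone_number"},
        "medium": {"phone_number", "name", "date_of_birth"},
        "high":   {"phone_number", "name", "date_of_birth", "bank_account", "id_document"},
    }

    def collect_for_level(user_record: dict, level: str) -> dict:
        """Return only the attributes needed for the requested assurance level."""
        needed = REQUIRED_ATTRIBUTES[level]
        missing = needed - set(user_record)
        if missing:
            raise ValueError(f"cannot reach level '{level}', missing: {sorted(missing)}")
        return {k: v for k, v in user_record.items() if k in needed}

    # Usage sketch: a service that only needs "medium" assurance never receives the
    # user's bank account or ID document, even if the user has supplied them.
    user = {"phone_number": "010-1234-5678", "name": "Hong Gildong",
            "date_of_birth": "1990-01-01", "bank_account": "123-456-789"}
    print(collect_for_level(user, "medium"))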

REMOTE SENSING AND GIS INTEGRATION FOR HOUSE MANAGEMENT

  • Wu, Mu-Lin;Wang, Yu-Ming;Wong, Deng-Ching;Chiou, Fu-Shen
    • Proceedings of the KSRS Conference / v.2 / pp.551-554 / 2006
  • House management is very important for water resource protection in order to provide sustainable drinking water for about four million people in northern Taiwan. House management can be a simple job that is done without any ingredient of remote sensing or geographic information systems. However, integrating remote sensing and GIS for house management can provide more efficient management prescriptions when land use enforcement, soil and water conservation, sewage management, garbage collection, and reforestation have to be managed simultaneously. The objective of this paper was to integrate remote sensing and GIS to manage houses in a water resource protection district. More than four thousand houses have been surveyed and compiled into a house database. A site map of every single house and very detailed information consisting of address, ownership, date of creation, building materials, acreage floor by floor, parcel information, and type of house condition were recorded; some houses also have photos taken from different directions. Each house has its own card containing this information, and these attributes were entered into the house database. The site maps of all houses were created in the same coordinate system as the parcel maps, topographic maps, sewage maps, and city planning maps. Visual Basic.NET and Visual C#.NET were used to develop computer programs for house information inquiry and for map overlay between the house maps and other GIS map layers. Remote sensing techniques were implemented to generate the background information of a single house over the past 15 years. Digital orthophoto maps at a scale of 1:5000 overlaid with house site maps are very useful for determining whether a house existed in a given year. Satellite images, if their resolution is good enough, are also very useful in this type of daily government operation. The developed house management system can work with commercial GIS software such as ArcView and ArcPad. Remote sensing provided image information on whether a single house existed in a given year, while GIS provided overlay and inquiry functions to automatically extract the attributes of a given house by ownership, address, and so on when certain house management prescriptions have to be made by a government agency. The file format is the key component that makes remote sensing and GIS integration smooth. The developed house management system is user friendly and can be modified to meet the needs encountered in a single task of a government technician.
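  • A hypothetical sketch of the overlay-and-inquiry workflow described above, written with present-day open-source tooling (GeoPandas) rather than the Visual Basic.NET / Visual C#.NET programs the authors developed; the file names and column names are illustrative assumptions. House site points are spatially joined to parcel polygons so that a house and its parcel attributes can be looked up by ownership or address.

    import geopandas as gpd
    import pandas as pd

    # Illustrative layers: house sites (points) and cadastral parcels (polygons),
    # assumed to share one coordinate system, as the paper's map layers do.
    houses = gpd.read_file("houses.shp")    # assumed columns: address, owner, geometry
    parcels = gpd.read_file("parcels.shp")  # assumed columns: parcel_id, zoning, geometry

    # Overlay: attach the attributes of the parcel that each house site falls within.
    houses_with_parcels = gpd.sjoin(houses, parcels, how="left", predicate="within")

    def inquire(df, owner=None, address=None):
        """Inquiry: return the houses matching the given owner and/or address."""
        mask = pd.Series(True, index=df.index)
        if owner is not None:
            mask &= df["owner"] == owner
        if address is not None:
            mask &= df["address"] == address
        return df[mask]

    # Usage sketch (attribute values are placeholders):
    # print(inquire(houses_with_parcels, owner="..."))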

Effect of UV-LED Irradiation on Respiration and Ethylene Production of Cherry Tomatoes (방울토마토의 호흡 및 에틸렌 발생에 미치는 자외선 LED의 효과)

  • Kim, Nam-Yong;Lee, Dong-Sun;Lee, Hyuk-Jae;An, Duck-Soon
    • Food Science and Preservation / v.19 no.1 / pp.47-53 / 2012
  • UV light irradiation is known to have beneficial effects on fresh produce preservation. A container system equipped with UV-LEDs was fabricated for storing cherry tomatoes under computer-controlled conditions of intermittent on-off cycles (1 hour on / 1 hour off). The wavelength (365 and 405 nm) and the physical location of the LED (2 and 5 cm above the fruit) were studied as variables affecting the respiration, ethylene production, and quality preservation of the fruits at 10 and $20^{\circ}C$. The 365 nm wavelength gave a much higher radiation intensity than 405 nm, and the intensity on the fruit surface decreased in inverse proportion to the square of the distance from the LED. Compared to the non-irradiated control, UV-LED irradiation decreased the respiration by 5-10% at $10^{\circ}C$, while there was no obvious effect at $20^{\circ}C$. Ethylene production was reduced when the fruits were placed at a 5 cm distance, while there was no significant difference from the control at the 2 cm location. The reduction of ethylene production at 5 cm was more pronounced at $20^{\circ}C$. UV-LED irradiation delayed the increase in, or lowered the concentration of, carotenoids compared to the control treatment. No negative effect of UV-LED irradiation on ascorbic acid content or firmness was observed.
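  • For reference, the inverse-square relation noted above implies the following relative surface intensities at the two LED heights (a point-source idealization):

    $$I(d) \propto \frac{1}{d^{2}}, \qquad \frac{I(2\,\mathrm{cm})}{I(5\,\mathrm{cm})} = \left(\frac{5}{2}\right)^{2} = 6.25$$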

Analysis of Real Ship Operation Data using a Smart Ship Platform (스마트선박 플랫폼을 활용한 실운항 데이터 분석 연구)

  • Kang, Jin-Hui;Lee, Hyun-Ho;Lee, Won-Ju;Lee, In-Ho;Kim, Jae-Woo;Park, Cheong-Hee
    • Journal of the Korean Society of Marine Environment & Safety / v.25 no.6 / pp.649-657 / 2019
  • An essential part of the development of an autonomous ship, which has recently become a focus of the shipbuilding and shipping industries, is supporting technology that can effectively check and diagnose the operational status of the ship from the shore control center on land. In this paper, we present a smart ship solution that operates, as a single system, a data collection platform that gathers ship operation data and a service platform that provides various services. When this smart ship solution was applied to an operating ship, a greater variety of high-quality data could be collected than with existing ship data collection systems. In addition, it was shown that, of the collected operation data, analysis of the parameters related to the main engine can be used to determine the overall state of the ship by deriving valid results and visualizing patterns. In conclusion, a ship's operational status could be checked more effectively and a comprehensive evaluation would be possible at the shore control center if the results of this study were extended to various items of ship equipment and analyzed together with operational environment data.