• Title/Summary/Keyword: iterative process


Iterative Deep Convolutional Grid Warping Network for Joint Depth Upsampling (반복적인 격자 워핑 기법을 이용한 깊이 영상 초해상화 기술)

  • Kim, Dongsin;Yang, Yoonmo;Oh, Byung Tae
    • Journal of Broadcast Engineering / v.25 no.6 / pp.965-972 / 2020
  • Depth maps carry distance information about objects and play an important role in organizing 3D information. Color and depth images are often obtained simultaneously; however, depth images have lower resolution than color images due to hardware limitations, so it is useful to upsample depth maps to the resolution of the color images. In this paper, we propose a novel method that upsamples a depth map by shifting pixel positions instead of compensating pixel values. The approach moves pixels near an edge toward the center of the edge, and this process is carried out in several steps to restore the blurred depth map. Experimental results show that the proposed method improves both quantitative and visual quality compared to existing methods.
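The pixel-shifting idea can be sketched with a hand-crafted 1-D analogue: a shock filter that pushes each sample toward the nearer side of a blurred edge over several iterations. The paper's network learns this warping; everything below is an illustrative stand-in.

```python
import numpy as np

def shock_step(d):
    """One iteration: push each sample toward the nearer plateau of the edge.
    A hand-crafted stand-in for the learned grid warping in the paper."""
    out = d.copy()
    for i in range(1, len(d) - 1):
        lap = d[i - 1] - 2 * d[i] + d[i + 1]   # discrete Laplacian
        if lap > 0:                            # below the edge midpoint: erode
            out[i] = min(d[i - 1], d[i + 1])
        elif lap < 0:                          # above the midpoint: dilate
            out[i] = max(d[i - 1], d[i + 1])
    return out

# Blurred depth edge (e.g. after naive upsampling of a low-res depth map)
depth = np.array([0.0, 0.0, 0.1, 0.35, 0.65, 0.9, 1.0, 1.0])
for _ in range(4):                             # several warping steps
    depth = shock_step(depth)
print(depth)                                   # edge restored to a sharp step
```

After a few iterations the blurred ramp collapses into a sharp step, which is the effect the abstract describes for blurred depth boundaries.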

A Study on the Development Method of Android App GUI Test Automation Tool (안드로이드 앱 GUI 테스트 자동화 툴 개발 방법에 관한 연구)

  • Park, Se-jun;Kim, Kyu-jung
    • The Journal of the Korea Contents Association / v.21 no.8 / pp.403-412 / 2021
  • As the number of mobile apps increases exponentially, automating the tests performed during app development is becoming more important. Until an app is released, iterative verification is performed through various kinds of tests; this study focuses on the GUI test among them. The study is meaningful in that it can contribute to stable app distribution by suggesting a development direction for GUI testing. To develop an Android GUI test tool, we first collected basic data by researching Android's UI controls and the Material Design guidelines. We then studied six existing GUI test automation tools: two based on screen-capture testing and four based on source-code analysis. This showed that existing GUI test tools do not consider visual design, usability, or component arrangement. To supplement these shortcomings, a new development method for a GUI test automation tool is presented based on the basic data studied earlier.

Performance Optimization and Analysis on P2P Mobile Communication Systems Accelerated by MEC Servers

  • Liang, Xuesong;Wu, Yongpeng;Huang, Yujin;Ng, Derrick Wing Kwan;Li, Pei;Yao, Yingbiao
    • KSII Transactions on Internet and Information Systems (TIIS) / v.16 no.1 / pp.188-210 / 2022
  • As a promising technique to efficiently support tremendous numbers of Internet of Things devices and a variety of applications, mobile edge computing (MEC) has attracted extensive study recently. In this paper, we consider an MEC-assisted peer-to-peer (P2P) mobile communication system in which MEC servers are deployed at access points to accelerate communication between mobile terminals. To capture the tradeoff between the time delay and the energy consumption of the system, a cost function is introduced to facilitate the optimization of the computation and communication resources. The formulated optimization problem is non-convex and is tackled by an iterative block coordinate descent algorithm that decouples the original problem into two subproblems and alternately optimizes the computation and communication resources. Moreover, the MEC-assisted P2P communication system is compared with the conventional P2P system, and a closed-form condition is provided for when the MEC-assisted system performs better. Simulation results show that the advantage of the system is enhanced as the computing capability of the receiver increases, whereas it is reduced as the computing capability of the transmitter increases. In addition, the performance of the system improves significantly when the signal-to-noise ratio of hop-1 exceeds that of hop-2.
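The alternation described in this abstract can be illustrated with a minimal block coordinate descent on a toy convex cost; the paper's actual subproblems concern computation and communication resources, and the cost below is an assumed stand-in.

```python
def bcd(n_iter=50):
    """Block coordinate descent on a toy convex cost
        f(x, y) = x**2 + y**2 + x*y - 3*x - 3*y,
    alternately solving each one-variable subproblem in closed form.
    This mirrors the paper's alternation between the computation and
    communication subproblems; the cost here is purely illustrative."""
    x, y = 0.0, 0.0
    for _ in range(n_iter):
        x = (3 - y) / 2      # argmin over x with y fixed (df/dx = 0)
        y = (3 - x) / 2      # argmin over y with x fixed (df/dy = 0)
    return x, y

print(bcd())  # converges to the joint minimizer (1, 1)
```

Each half-step solves one subproblem exactly with the other block frozen, and the iterates converge to the stationary point of the coupled cost.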

AutoFe-Sel: A Meta-learning based methodology for Recommending Feature Subset Selection Algorithms

  • Irfan Khan;Xianchao Zhang;Ramesh Kumar Ayyasam;Rahman Ali
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.7 / pp.1773-1793 / 2023
  • Automated machine learning, often referred to as "AutoML," is the process of automating the time-consuming, iterative procedures associated with building machine learning models. Significant contributions have been made across several stages of a data-mining task, including model selection, hyper-parameter optimization, and preprocessing method selection. Among them, preprocessing method selection is a relatively new and fast-growing research area. The current work focuses on the recommendation of preprocessing methods, specifically feature subset selection (FSS) algorithms. One limitation of existing studies on FSS algorithm recommendation is the use of a single learner for meta-modeling, which restricts the meta-model's capabilities. Moreover, meta-modeling in existing studies is typically based on a single group of data characterization measures (DCMs). Yet there are a number of complementary DCM groups, and combining them leverages their diversity for improved meta-modeling. This study addresses these limitations by proposing an architecture for preprocessing method selection, AutoFE-Sel, that uses ensemble learning for meta-modeling. To evaluate the proposed method, we performed an extensive experimental evaluation involving 8 FSS algorithms, 3 groups of DCMs, and 125 datasets. Results show that the proposed method outperforms three baseline methods. The architecture can also be easily extended to other preprocessing method selections, e.g., noise-filter selection and imbalance-handling method selection.
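The ensemble meta-modeling idea, one learner per DCM group with a vote over their recommendations, can be sketched as follows. The meta-dataset, DCM values, and algorithm names (ReliefF, mRMR) are hypothetical placeholders, and a 1-NN learner stands in for the paper's ensemble members.

```python
from collections import Counter

# Hypothetical meta-dataset: per training dataset, three groups of data
# characterization measures (DCMs) and the best-performing FSS algorithm.
meta_db = [
    ({"stat": (0.2, 1.1), "info": (0.7,), "model": (0.3,)}, "ReliefF"),
    ({"stat": (0.9, 0.4), "info": (0.2,), "model": (0.8,)}, "mRMR"),
    ({"stat": (0.3, 1.0), "info": (0.6,), "model": (0.4,)}, "ReliefF"),
    ({"stat": (0.8, 0.5), "info": (0.3,), "model": (0.7,)}, "mRMR"),
]

def dist(a, b):
    """Squared Euclidean distance between two DCM vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def recommend(dcms):
    """Ensemble meta-model: one 1-NN learner per DCM group, majority vote."""
    votes = []
    for group in ("stat", "info", "model"):
        nearest = min(meta_db, key=lambda row: dist(row[0][group], dcms[group]))
        votes.append(nearest[1])
    return Counter(votes).most_common(1)[0][0]

query = {"stat": (0.25, 1.05), "info": (0.65,), "model": (0.35,)}
print(recommend(query))   # majority vote across the three DCM groups
```

Each DCM group contributes an independent vote, so the combination exploits the complementarity between groups that the abstract highlights.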

User-Centered Design in Virtual Reality Safety Education Contents - Disassembly Training for City Gas Governor - (VR 안전교육콘텐츠에서의 사용자 중심 디자인(UCD) - 도시가스 정압기 분해점검 훈련을 소재로 -)

  • Min-Soo Park;Sun-Hee Chang;Ji-Woo Jung;Jung-Chul Suh;Chan-Young Park;Duck-Hun Kim;Jung-Hyun Yoon
    • Journal of the Korean Institute of Gas / v.28 no.2 / pp.84-92 / 2024
  • This study applied User-Centered Design (UCD) to develop effective VR safety training content for specific users. The UCD-based design was tailored to the VR medium, facilitating efficient design activities. The UCD process comprises key activities: deriving design concepts from user needs, designing with VR features, developing prototypes, conducting comprehensive evaluations with experts and users, and finalizing the content. Unlike traditional UCD, this flexible approach allows iterative cycles, enhancing the quality of VR safety training content and user satisfaction.

Design Vessel Selection of Maritime Bridges using Collision Risk Allocation Model (충돌위험분배모델을 이용한 해상교량의 설계선박 선정)

  • Lee, Seong-Lo;Lee, Byung Hwa;Bae, Yong-Gwi;Shin, Ho-Sang
    • Journal of the Korea institute for structural maintenance and inspection / v.10 no.3 / pp.123-134 / 2006
  • In this study, ship collision risk analysis is performed to determine the design vessel for the collision impact analysis of a maritime bridge. Method II, a probability-based analysis procedure, is used to select the design vessel from the risk analysis results. The procedure, an iterative process in which the computed annual frequency of collapse (AF) is compared to an acceptance criterion, includes a method for allocating the acceptance criterion among bridge components. AF allocation by weights appears more reasonable than the pylon-concentration allocation method, because it takes the design parameter characteristics into account quantitatively; the pylon-concentration method yields more economical results only when the overestimated design collision strength of the piers, relative to that of the pylon, is moderately modified. From the assessment of collision risk for each pier exposed to ship collision, a representative design vessel for all bridge components is selected. The design vessel size varies considerably within the same bridge structure depending on the vessel traffic characteristics.
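The iterative comparison of a computed AF against an acceptance criterion can be sketched as a loop over candidate vessel sizes. The traffic data, the AF model, and the criterion value below are illustrative assumptions, not AASHTO Method II itself.

```python
# Toy version of iterative design-vessel selection: enlarge the candidate
# design vessel until the computed annual frequency of collapse (AF)
# meets the acceptance criterion allocated to the bridge component.
AF_CRITERION = 1e-4   # illustrative acceptance criterion

traffic = [  # (vessel size in DWT, annual collision frequency, P(collapse))
    (1000, 5e-3, 0.01),
    (5000, 2e-3, 0.05),
    (20000, 5e-4, 0.20),
    (50000, 1e-4, 0.50),
]

def annual_frequency_of_collapse(design_dwt, traffic):
    """Toy AF model: piers designed for `design_dwt` resist impacts from
    vessels up to that size; larger classes still contribute to AF."""
    return sum(f * pc for dwt, f, pc in traffic if dwt > design_dwt)

candidates = [0] + [dwt for dwt, _, _ in traffic]
design = next(d for d in candidates
              if annual_frequency_of_collapse(d, traffic) <= AF_CRITERION)
print(design)   # smallest design vessel meeting the criterion
```

The loop stops at the smallest vessel class for which the computed AF drops below the allocated criterion, mirroring the iterative accept/enlarge cycle the abstract describes.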

Comparison of Effectiveness about Image Quality and Scan Time According to Reconstruction Method in Bone SPECT (영상 재구성 방법에 따른 Bone SPECT 영상의 질과 검사시간에 대한 실효성 비교)

  • Kim, Woo-Hyun;Jung, Woo-Young;Lee, Ju-Young;Ryu, Jae-Kwang
    • The Korean Journal of Nuclear Medicine Technology / v.13 no.1 / pp.9-14 / 2009
  • Purpose: In nuclear medicine today, many efforts are being made to reduce scan time, as well as the waiting time between injection of the radiopharmaceutical and the examination. Several approaches are used clinically, such as developing new radiopharmaceuticals that are absorbed into target organs more quickly and shortening acquisition time by increasing the number of gamma camera detectors. Each equipment manufacturer has also improved its image processing techniques to reduce scan time. In this paper, we analyze the differences in image quality between the commercialized, clinically applied FBP and 3D OSEM reconstruction methods and the Astonish method (a fast iterative reconstruction method from Philips), as well as the effect of scan time on image quality. Materials and Methods: We studied 32 patients who underwent Bone SPECT between June and July 2008 at the department of nuclear medicine, ASAN Medical Center, Seoul. Images were acquired at 40 sec/frame and 20 sec/frame using a Philips PRECEDENCE 16 gamma camera and reconstructed with the Astonish, 3D OSEM, and FBP methods. For qualitative analysis, a blinded reading of all reconstructed images was performed by the interpreting physicians. For quantitative analysis, we measured the target-to-nontarget ratio by drawing ROIs centered on the lesions, using the same ROI location and size for each image. Results: In the qualitative analysis, there was no significant difference in image quality with acquisition time. In the quantitative analysis, images reconstructed with Astonish showed good quality, with better sharpness and clearer distinction between lesions and surrounding tissue. The mean ± standard deviation of the target-to-nontarget ratio for the 40 sec/frame and 20 sec/frame images was: Astonish (40 sec, 13.91±5.62; 20 sec, 13.88±5.92), 3D OSEM (40 sec, 10.60±3.55; 20 sec, 10.55±3.64), and FBP (40 sec, 8.30±4.44; 20 sec, 8.19±4.20). Comparing the 20 sec and 40 sec images, none of the methods showed a statistically significant difference in image quality with acquisition time: Astonish (t=0.16, p=0.872), 3D OSEM (t=0.51, p=0.610), FBP (t=0.73, p=0.469), although for FBP some individual images differed between 40 sec/frame and 20 sec/frame due to various factors. Conclusions: While solutions to reduce nuclear medicine scan time are sought, advances in equipment hardware have slowed while software has marched forward relentlessly. Thanks to developments in computer hardware, image reconstruction time has fallen, and increased reconstruction capacity enables iterative methods that previously could not be performed due to technical limits. As image processing techniques have developed, scan time has been reduced while image quality remains at a similar level. Maintaining exam quality while reducing scan time lessens patient discomfort and perceived waiting time, improves the accessibility of nuclear medicine exams, and provides better service to patients and referring physicians, thereby improving the standing of the department of nuclear medicine. Concurrent Imaging, a new function that sets up individual acquisition parameters, also enables images to be acquired simultaneously with various parameters in a single examination.
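For readers unfamiliar with the iterative reconstruction family compared against FBP here, a minimal MLEM sketch (the un-subset ancestor of OSEM) shows the core multiplicative update. The system matrix and data below are toy values, and this is not the vendor's Astonish algorithm.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Minimal MLEM sketch: iteratively refine the image estimate x so
    that the forward projection A @ x matches the measured counts y.
    OSEM accelerates this by cycling over subsets of the projections."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image (A^T 1)
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)  # measured / estimated counts
        x = x / sens * (A.T @ ratio)          # multiplicative update
    return x

# Tiny system: 2-pixel "image", 3 projection bins
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
y = A @ x_true                                # noise-free measurements
print(mlem(A, y))                             # recovers x_true
```

With consistent noise-free data the iterates converge to the true activity; with real count data the same update maximizes the Poisson likelihood.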


Development of e-navigation shipboard technical architecture (e-navigation 선상시스템을 위한 기술적 아키텍처 개발)

  • Shim, Woo-Seong;Kim, Sun-Young;Lee, Sang-Jeong
    • Journal of Navigation and Port Research / v.37 no.1 / pp.9-14 / 2013
  • E-navigation, being developed at IMO, is a strategy to provide user-oriented services for safe navigation and environmental protection, based on an architecture and related services that comply with user needs. At the NAV 57th meeting in 2011, the overarching e-navigation architecture was approved; it represents only the overall relationship between onboard and ashore elements, so a more detailed technical architecture for each element must be developed for implementation in terms of services and systems. Considering the continuous and iterative verification of the e-navigation development process required by IMO, the relationship and traceability between the outcomes of that process and the elements of the architecture should be taken into consideration. In this paper, we surveyed the literature on user needs, gap analysis results, and practical solutions to address them, and defined the architecture elements and their relationships considering the three views of DoDAF, the Architecture Framework of the US Department of Defense. As a result, we propose an e-navigation shipboard technical architecture.

Image Restoration and Segmentation for PAN-sharpened High Multispectral Imagery (PAN-SHARPENED 고해상도 다중 분광 자료의 영상 복원과 분할)

  • Lee, Sanghoon
    • Korean Journal of Remote Sensing / v.33 no.6_1 / pp.1003-1017 / 2017
  • Multispectral image data of high spatial resolution is required to obtain accurate information on the ground surface, but multispectral data has lower resolution than panchromatic data. The PAN-sharpening fusion technique produces multispectral data at the higher resolution of the panchromatic image. Recently, the object-based approach has been applied to high-spatial-resolution data more often than the conventional pixel-based one. Object-based image analysis requires image segmentation, which produces objects as groups of pixels. Image segmentation can be achieved effectively by merging, step by step, two neighboring regions in a RAG (Region Adjacency Graph). In satellite remote sensing, the operational environment of the satellite sensor causes image degradation during acquisition. This degradation increases the variation of pixel values within homogeneous areas and deteriorates the accuracy of image segmentation. An iterative approach that reduces the difference between the values of neighboring pixels in the same area is employed to alleviate this variation. The size of the segmented regions is associated with the quality of the segmentation and is decided by a stopping rule in the merging process. In this study, the image restoration and segmentation were quantitatively evaluated using simulation data and were also applied to three PAN-sharpened high-resolution multispectral images: Dubaisat-2 data of 1 m panchromatic resolution from LA, USA, and KOMPSAT3 data of 0.7 m panchromatic resolution from Daejeon and Chungcheongnam-do in the Korean peninsula. The experimental results imply that the proposed method can improve analytical accuracy in remote sensing applications of high-resolution PAN-sharpened multispectral imagery.
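The step-by-step merging of neighboring regions with a stopping rule can be sketched on a 1-D signal, where the RAG degenerates to a chain. The threshold-based stopping rule below is a simplified stand-in for the one used in the study.

```python
# Minimal sketch of hierarchical region merging on a 1-D "image":
# start from single-pixel regions and repeatedly merge the adjacent
# pair whose mean values differ least, stopping when the most similar
# pair still differs by more than a threshold.
def segment(pixels, threshold):
    regions = [[p] for p in pixels]            # one region per pixel
    def mean(r):
        return sum(r) / len(r)
    while len(regions) > 1:
        # adjacent pair with the most similar means (the RAG edges in 1-D)
        i = min(range(len(regions) - 1),
                key=lambda k: abs(mean(regions[k]) - mean(regions[k + 1])))
        if abs(mean(regions[i]) - mean(regions[i + 1])) > threshold:
            break                              # stopping rule reached
        regions[i] = regions[i] + regions.pop(i + 1)
    return regions

signal = [1.0, 1.1, 0.9, 5.0, 5.2, 4.9, 9.1, 9.0]
print(segment(signal, threshold=1.0))          # three homogeneous segments
```

Lowering the threshold stops merging earlier and yields smaller regions, which is exactly how the stopping rule controls segment size in the merging process.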

Improvement of Procedures for Reasonable Implementation of TMDL (수질오염총량관리제의 합리적인 시행을 위한 시행절차 개선방안)

  • Kim, Young-Il;Yi, Sang-Jin
    • Journal of Korean Society of Environmental Engineers / v.33 no.8 / pp.617-622 / 2011
  • The total maximum daily load (TMDL) policy was introduced to keep the wasteload within the loading capacity needed to achieve water quality standards in a watershed. While the TMDL has been implemented, institutional and technical corrections have improved the procedure, though largely through trial and error and despite various problems; a more fundamental improvement of the implementation procedure is still needed. This study offers a new viewpoint on improving the procedure for reasonable implementation of the TMDL. First, water quality and flowrate monitoring of the tributaries should be implemented through the establishment of a monitoring system specifying the scope, time period, water quality parameters, and follow-up frequency. A basic plan for each watershed should then be developed, based on water quality parameters and standards set for water use and ecological purposes according to the monitoring results. An implementation plan for water quality improvement should be established in watersheds that exceed the targeted water quality standards. The performance of the TMDL should be assessed every year against the attainment of the water quality standards. Finally, if the water quality standards cannot be attained, or the water quality parameters and standards must be changed, the implementation procedure is repeated as an iterative process; conversely, the TMDL is concluded in watersheds where the water quality standards have met the goal.