• Title/Summary/Keyword: Efficient handling

Search results: 467

An Accelerated IK Solver for Deformation of 3D Models with Triangular Meshes (삼각형 메쉬로 이루어진 3D 모델의 변형을 위한 IK 계산 가속화)

  • Park, Hyunah;Kang, Daeun;Kwon, Taesoo
    • Journal of the Korea Computer Graphics Society
    • /
    • v.27 no.5
    • /
    • pp.1-11
    • /
    • 2021
  • The purpose of our research is to efficiently deform a 3D model composed of a triangular mesh and a skeleton. We designed a novel inverse kinematics (IK) solver that calculates the updated positions of mesh vertices with fewer computing operations. Through our user interface, one or more markers are selected on the surface of the model and their target positions are set; the system then updates the positions of the surface vertices to construct a deformed model. The IK solving process for updating vertex positions involves many computations for obtaining the transformations of the markers, their affecting joints, and their parent joints, and many of these computations are redundant. We precompute the redundant terms in advance, so the 3-nested loop computation structure is reduced to a 2-nested loop structure and the computation time per deformation is greatly reduced. This IK solver can be adopted for efficient performance in various research fields, such as handling 3D models implemented with the linear blend skinning (LBS) method, or markerless object tracking.
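The loop-hoisting idea behind the speed-up can be illustrated with a minimal numerical sketch (hypothetical shapes and weights; not the authors' actual solver): terms that depend only on the marker/joint pair are precomputed once, so the per-vertex loop no longer recomputes them and the triple loop collapses into a double loop plus a flat loop.

```python
import numpy as np

def deform_naive(markers, joints, vertices, weight):
    # 3-nested loop: for every vertex, every marker, and every joint,
    # the marker/joint term is recomputed from scratch each time.
    out = np.zeros_like(vertices)
    for v in range(len(vertices)):
        for m in range(len(markers)):
            for j in range(len(joints)):
                term = weight[m, j] * (markers[m] - joints[j])  # redundant per vertex
                out[v] += term * 0.001
    return out

def deform_precomputed(markers, joints, vertices, weight):
    # The marker/joint terms do not depend on the vertex, so they are
    # precomputed once; the per-vertex update then becomes a flat loop.
    precomp = np.zeros(vertices.shape[1])
    for m in range(len(markers)):
        for j in range(len(joints)):
            precomp += weight[m, j] * (markers[m] - joints[j])
    out = np.zeros_like(vertices)
    for v in range(len(vertices)):
        out[v] += precomp * 0.001
    return out
```

Both functions produce identical results; only the number of times the shared term is evaluated changes.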

Designing a Molecular Diagnostic Laboratory for Testing Highly Pathogenic Viruses (고병원성 바이러스 검사를 위한 분자진단검사실 구축)

  • Jung, Tae Won;Jung, Jaeyoung;Kim, Sunghyun;Kim, Young-Kwon
    • Korean Journal of Clinical Laboratory Science
    • /
    • v.53 no.2
    • /
    • pp.143-150
    • /
    • 2021
  • The recent spread of novel, highly variant pathogenic viruses, including the coronavirus (SARS-CoV-2), has increased the demand for diagnostic testing for rapid confirmation. This has prompted investigation of the functional requirements of each laboratory space and the preparation of facility guidelines to secure the safety of medical technologists. Viral testing requires negative-pressure facilities and separation of workflows during sample pre-treatment and before nucleic acid amplification, so the space composition needs to be planned with unidirectional air flow in mind. These safety management facilities are classified as biosafety level 2, and personal protective equipment is provided accordingly. Dangerous materials need to be handled inside the biosafety cabinet, and sterilizers are required for the suitable disposal of infectious agents. Common features of domestic laboratories are maintaining the sample pre-treatment space at a negative pressure of -2.5 Pa or less and arranging separate pre-treatment and reagent preparation spaces in the test process. We believe the data generated in this study are meaningful, offering an efficient direction and a detailed flow for separating the inspection process and space functions. Moreover, this study introduces the construction of a laboratory that applies these safety management standards.

A Study on Effective Real Estate Big Data Management Method Using Graph Database Model (그래프 데이터베이스 모델을 이용한 효율적인 부동산 빅데이터 관리 방안에 관한 연구)

  • Kim, Ju-Young;Kim, Hyun-Jung;Yu, Ki-Yun
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.25 no.4
    • /
    • pp.163-180
    • /
    • 2022
  • Real estate data can be considered big data: its volume is growing rapidly, it interacts with various fields such as the economy, law, and crowd psychology, and it is structured in complex data layers. Existing relational databases tend to have difficulty handling the varied relationships in real estate big data because they have a fixed schema and are only vertically scalable. To overcome these limitations, this study stores real estate data in a graph database and verifies its usefulness. As the research method, we modeled various real estate data both on MySQL, one of the most widely used relational databases, and on Neo4j, one of the most widely used graph databases. We then collected real estate questions used in real life and selected 9 different questions to compare query times on each database. As a result, Neo4j showed constant performance even on queries with multiple JOIN statements and inferences over various relationships, whereas MySQL showed a rapid increase in query time. From this result, we conclude that a graph database such as Neo4j is more efficient for real estate big data with its many relationships. We expect the real estate graph database to be used in predicting real estate price factors and in answering real estate queries through AI speakers.
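As a toy illustration of why graph traversal can beat repeated relational self-joins, here is a minimal in-memory property graph in Python. The node names, edge types, and question are all hypothetical; a real deployment would express this in Neo4j/Cypher rather than Python.

```python
# Hypothetical mini real-estate graph: nodes with typed, directed edges.
graph = {
    "apt_101": [("LOCATED_IN", "district_A"), ("OWNED_BY", "owner_1")],
    "owner_1": [("ALSO_OWNS", "apt_202")],
    "apt_202": [("LOCATED_IN", "district_B")],
    "district_A": [],
    "district_B": [],
}

def neighbors(node, edge_type):
    # Follow one relationship type from a node, as a graph query would.
    return [dst for (etype, dst) in graph.get(node, []) if etype == edge_type]

# "Which districts contain other properties of apt_101's owner?"
# In SQL this needs several self-JOINs; here it is a chain of traversals.
districts = [
    d
    for owner in neighbors("apt_101", "OWNED_BY")
    for prop in neighbors(owner, "ALSO_OWNS")
    for d in neighbors(prop, "LOCATED_IN")
]
print(districts)  # ['district_B']
```

Each traversal step follows stored adjacency directly, so multi-hop questions do not require re-scanning and matching whole tables as a JOIN does.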

Study on the Efficient Integration of Long-term Care Facilities and Geriatric Hospitals by Using NHIC Survey Data (실태조사를 통한 장기요양시설과 요양병원의 효율적 연계방안)

  • Choi, In-Duck;Lee, Eun-Mi
    • Journal of the Korean Gerontological Society (한국노년학)
    • /
    • v.30 no.3
    • /
    • pp.855-869
    • /
    • 2010
  • The purpose of this study is to identify how to efficiently integrate long-term care facilities with geriatric hospitals. Between October and November 2009, we surveyed the current operations and medical services of 192 long-term care facilities and 168 geriatric hospitals in Korea. Descriptive statistics and chi-square tests were applied to the collected data using the SPSS 13.0/Win program. The two facility types differed in the co-payment levels for food services. Both types cited budget deficits as their major management problem. Ease of access and the surrounding environment were critical factors in selecting the location of both types of facilities. Facility users benefited from discounted co-payments at both facility types, but wanted more frequent visits and support from their family members during their stay. Users of long-term care facilities stayed longer, often until death, compared with their counterparts in geriatric hospitals. The two types of facilities provided their services entirely separately, and users of both are poorly supported and cared for by their families. This study suggests that setting reasonable service fees, paying caretakers, introducing an integrated facility, strengthening facility assessment standards, introducing a family doctor system, and allowing geriatric hospitals to handle long-term care insurance would make the integration of long-term care facilities and geriatric hospitals beneficial.
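For reference, the chi-square test that the study runs through SPSS computes the Pearson statistic sketched below. The contingency table here uses made-up counts, not the study's data.

```python
def chi_square_statistic(table):
    # Pearson chi-square for an r x c contingency table:
    # sum over cells of (observed - expected)^2 / expected, where
    # expected = row_total * column_total / grand_total.
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table, e.g. facility type vs. a yes/no survey answer.
print(chi_square_statistic([[10, 20], [20, 10]]))  # 6.666...
```

The statistic is then compared against the chi-square distribution with (r-1)(c-1) degrees of freedom to obtain a p-value.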

A Study on Operating Vertiport Cooperative Decision Making (버티포트 협력적 의사결정지원체계 운용방안연구)

  • Jae-wook Chun;Ye-seung Hwang;Gang-san Kim;Eui Jang;Yeong-min Sim;Woo-choon Moon
    • Journal of Advanced Navigation Technology
    • /
    • v.27 no.6
    • /
    • pp.690-698
    • /
    • 2023
  • Information sharing and decision making between airport stakeholders became possible after the introduction of airport collaborative decision making (A-CDM), which optimized aircraft handling times and increased the efficiency of aircraft operations. Recent technological advances have led to the development of urban air mobility (UAM), small aircraft that take off and land vertically. UAM is emerging as a future air transportation system because it saves time and relieves congestion in urban areas. This study suggests how a vertiport cooperative decision making system (V-CDM) should be managed for efficient UAM operation. By establishing decision-making procedures based on the vertiport ecosystem and UAM aircraft, unnecessary flight delays or cancellations can be minimized and the efficiency of UAM operation improved.

Efficient Culture Method for Early Passage hESCs after Thawing (초기 계대 인간 배아줄기세포의 해동 후 효율적인 배양 방법)

  • Baek, Jin-Ah;Kim, Hee-Sun;Seol, Hye-Won;Seo, Jin;Jung, Ju-Won;Yoon, Bo-Ae;Park, Yong-Bin;Oh, Sun-Kyung;Ku, Seung-Yup;Kim, Seok-Hyun;Choi, Young-Min;Moon, Shin-Yong
    • Clinical and Experimental Reproductive Medicine
    • /
    • v.36 no.4
    • /
    • pp.311-319
    • /
    • 2009
  • Objective: Human embryonic stem cells (hESCs) have the capacity to differentiate into all cell types and therefore hold promise for cell therapeutic applications. To exploit this potential, currently used technologies for handling and manipulating hESCs must be improved. Cryopreservation of hESC colonies has been performed successfully using vitrification and the slow freezing-rapid thawing method; however, most hESC colonies showed extensive spontaneous differentiation after freezing and thawing. In this study, we aimed to rapidly collect undifferentiated early-passage hESCs of the SNUhES11 cell line, which showed a high rate of spontaneous differentiation after thawing. Methods: Four days after plating, the spontaneously differentiated parts of each hESC colony were cut off with a finely drawn-out dissecting pipette, a mechanical separation method. Results: After separation of the spontaneously differentiated cells, we observed that the removed areas were repopulated by undifferentiated cells. Furthermore, repeated mechanical separation was more efficient for hESC expansion after thawing. The recovery rates after removing the differentiated parts of hESC colonies were 55.0%, 74.5%, and 71.1% over the three passages to which we applied this method. Conclusion: Mechanical separation is highly effective for rapidly collecting large volumes of undifferentiated cells after thawing cryopreserved early-passage hESCs.

A Laboratory-Scale Study of the Applicability of a Halophilic Sediment Bioelectrochemical System for in situ Reclamation of Water and Sediment in Brackish Aquaculture Ponds: Effects of Operational Conditions on Performance

  • Pham, Hai The;Vu, Phuong Ha;Nguyen, Thuy Thu Thi;Bui, Ha Viet Thi;Tran, Huyen Thanh Thi;Tran, Hanh My;Nguyen, Huy Quang;Kim, Byung Hong
    • Journal of Microbiology and Biotechnology
    • /
    • v.29 no.10
    • /
    • pp.1607-1623
    • /
    • 2019
  • Sediment bioelectrochemical systems (SBESs) can be integrated into brackish aquaculture ponds for in-situ bioremediation of the pond water and sediment. Such an in-situ system offers advantages including reduced treatment cost, reusability, and simple handling. To realize this application potential of the SBES, in this laboratory-scale study we investigated the effects of several controllable and uncontrollable operational factors on the in-situ bioremediation performance of a tank model of a brackish aquaculture pond with an integrated SBES, compared with a natural-degradation control model. Performance was evaluated in terms of electricity generation by the SBES and chemical oxygen demand (COD) and nitrogen removal from both the tank water and the tank sediment. Real-life operating conditions were also tested to understand the system's most close-to-practice responses. The results showed predictable effects of the controllable parameters, including external resistance and electrode spacing, similar to those reported previously for BESs, but with exceptions. While increasing the electrode spacing reduced the current densities but generally improved COD and nitrogen removal, increasing the external resistance could decrease COD removal while increasing nitrogen removal and decreasing current densities. However, the maximum electricity generation and COD removal efficiency difference of the SBES (versus the control) was reached with an external resistance of 100 Ω, not with the lowest resistance of 10 Ω. The effects of uncontrollable parameters such as ambient temperature, salinity, and pH of the pond (tank) water were rather unpredictable. Temperatures higher than 35°C seemed to accelerate natural degradation more than bioelectrochemical processes. Changing salinity seriously changed the electricity generation but did not clearly affect the bioremediation performance of the SBES, although at 2.5% salinity the SBES removed nitrogen from the water significantly more efficiently than the control. Varying the pH to practically extreme levels (5.5 and 8.8) increased electricity generation but worsened the performance of the SBES (versus the control) in removing COD and nitrogen. Altogether, the results suggest distinct responses of the SBES under brackish conditions and imply that COD and nitrogen removal in the system are not completely linked to bioelectrochemical processes, although electrochemically enriched bacteria can still perform non-bioelectrochemical COD and nitrogen removal more efficiently than natural communities. The results confirm the application potential of the SBES in brackish aquaculture bioremediation and help propose efficient practices to ensure the success of such applications in real-life scenarios.

Evaluating Reverse Logistics Networks with Centralized Centers : Hybrid Genetic Algorithm Approach (집중형센터를 가진 역물류네트워크 평가 : 혼합형 유전알고리즘 접근법)

  • Yun, YoungSu
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.4
    • /
    • pp.55-79
    • /
    • 2013
  • In this paper, we propose a hybrid genetic algorithm (HGA) approach to effectively solve the reverse logistics network with centralized centers (RLNCC). The proposed HGA uses a genetic algorithm (GA) as its main algorithm. For implementing the GA, a new bit-string representation scheme using 0 and 1 values is suggested, which makes it easy to generate the GA's initial population. As genetic operators, the elitist strategy in enlarged sampling space developed by Gen and Chang (1997), a new two-point crossover operator, and a new random mutation operator are used for selection, crossover, and mutation, respectively. For the hybrid concept, the iterative hill climbing method (IHCM) developed by Michalewicz (1994) is inserted into the HGA search loop. The IHCM is a local search technique that precisely explores the space to which the GA search has converged. The RLNCC is composed of collection centers, remanufacturing centers, redistribution centers, and secondary markets; of these, only one collection center, remanufacturing center, redistribution center, and secondary market should be opened in the reverse logistics network. Some assumptions are made to implement the RLNCC effectively. The RLNCC is represented by a mixed integer programming (MIP) model using indexes, parameters, and decision variables. The objective function of the MIP model minimizes the total cost, which consists of transportation cost, fixed cost, and handling cost. The transportation cost arises from transporting the returned products between the centers and secondary markets. The fixed cost is determined by the opening or closing decision at each center and secondary market: for example, if there are three collection centers (with opening costs of 10.5, 12.1, and 8.9 for collection centers 1, 2, and 3, respectively), and collection center 1 is opened while the others are closed, the fixed cost is 10.5. The handling cost is the cost of treating the products returned from customers at each center and secondary market opened at each RLNCC stage. The RLNCC is solved by the proposed HGA approach. In a numerical experiment, the proposed HGA and a conventional competing approach are compared using various measures of performance. The competing approach is the GA approach of Yun (2013), which includes no local search technique such as the IHCM used in the proposed HGA. As measures of performance, CPU time, optimal solution, and optimal setting are used. Two types of RLNCC with different numbers of customers, collection centers, remanufacturing centers, redistribution centers, and secondary markets are presented for comparing the HGA and GA approaches. The MIP models for the two RLNCC types are programmed in Visual Basic 6.0, and the computing environment is an IBM-compatible PC with a 3.06 GHz CPU and 1 GB RAM on Windows XP. The parameters used in the HGA and GA approaches are: 10,000 generations in total, population size 20, crossover rate 0.5, mutation rate 0.1, and a search range of 2.0 for the IHCM. A total of 20 iterations are run to eliminate the randomness of the HGA and GA searches. With performance comparisons, network representations by opening/closing decisions, and convergence processes for the two RLNCC types, the experimental results show that the HGA performs significantly better than the GA in terms of the optimal solution, although the GA is slightly quicker in terms of CPU time. Finally, the proposed HGA approach proves more efficient than the conventional GA approach on both RLNCC types, since the former combines a GA search process with an additional local search process, while the latter has a GA search process alone. In a future study, much larger RLNCCs will be tested for the robustness of our approach.
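The HGA's ingredients, bit-string individuals, two-point crossover, random mutation, and a hill-climbing local search inside the GA loop, can be sketched compactly as below. This is a simplified illustration with a toy selection scheme and cost function; it does not reproduce the paper's elitist sampling strategy or the MIP cost model.

```python
import random

def two_point_crossover(p1, p2, rng):
    # Swap the segment between two random cut points.
    i, j = sorted(rng.sample(range(1, len(p1)), 2))
    return p1[:i] + p2[i:j] + p1[j:], p2[:i] + p1[i:j] + p2[j:]

def mutate(bits, rate, rng):
    # Flip each bit independently with the given probability.
    return [b ^ 1 if rng.random() < rate else b for b in bits]

def hill_climb(bits, cost):
    # Local search in the spirit of the IHCM: try flipping each bit
    # and keep the flip whenever it lowers the cost.
    best = bits[:]
    for k in range(len(best)):
        cand = best[:]
        cand[k] ^= 1
        if cost(cand) < cost(best):
            best = cand
    return best

def hga(cost, n_bits, pop=20, gens=50, mut=0.1, seed=0):
    rng = random.Random(seed)
    popn = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=cost)  # simplified elitist selection
        c1, c2 = two_point_crossover(popn[0], popn[1], rng)
        # Hybrid step: each offspring is refined by local search.
        popn[-2:] = [hill_climb(mutate(c1, mut, rng), cost),
                     hill_climb(mutate(c2, mut, rng), cost)]
    return min(popn, key=cost)
```

For instance, with the toy cost `sum` (minimizing the number of opened facilities, ignoring constraints), `hga(sum, 8)` converges to the all-zero string, because the hill climber alone already drives any offspring there.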

Branching Path Query Processing for XML Documents using the Prefix Match Join (프리픽스 매취 조인을 이용한 XML 문서에 대한 분기 경로 질의 처리)

  • Park Young-Ho;Han Wook-Shin;Whang Kyu-Young
    • Journal of KIISE:Databases
    • /
    • v.32 no.4
    • /
    • pp.452-472
    • /
    • 2005
  • We propose XIR-Branching, a novel method for processing partial match queries on heterogeneous XML documents using information retrieval (IR) techniques and novel instance join techniques. A partial match query is defined as one having the descendant-or-self axis '//' in its path expression; in its general form, a partial match query has branch predicates forming branching paths. The objective of XIR-Branching is to efficiently support this type of query over large-scale documents with heterogeneous schemas. XIR-Branching builds on the conventional schema-level methods using relational tables (e.g., XRel, XParent, XIR-Linear[21]) and significantly improves their efficiency and scalability using two techniques: an inverted index technique and a novel prefix match join. The former supports linear path expressions, as in XIR-Linear[21]. The latter supports branching path expressions and finds the result nodes more efficiently than the containment joins used in the conventional methods. XIR-Linear is efficient for linear path expressions but cannot handle branching path expressions, which must be supported for more detailed and general queries. This paper presents a novel method for handling branching path expressions: XIR-Branching reduces the candidate set for a query as a schema-level method and then efficiently finds the final result set using the novel prefix match join as an instance-level method. We compare the efficiency and scalability of XIR-Branching with those of XRel and XParent using XML documents crawled from the Internet. The results show that XIR-Branching is more efficient than both XRel and XParent by several orders of magnitude for linear path expressions, and by several factors for branching path expressions.
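The flavor of a prefix-based structural join can be sketched at the instance level with Dewey-style node labels. This is a simplified, hypothetical illustration of joining ancestor and descendant candidates by label prefix; it is not the paper's schema-level prefix match join implementation.

```python
# Dewey labels as tuples: (1,) is the root, (1, 2) its second child, etc.
def is_prefix(anc, desc):
    # A node is an ancestor iff its Dewey label is a proper prefix
    # of the descendant's label.
    return len(anc) < len(desc) and desc[:len(anc)] == anc

def prefix_match_join(ancestors, descendants):
    # For each descendant candidate, emit the ancestor candidates whose
    # label prefixes it (an ancestor-descendant structural join).
    out = []
    for d in sorted(descendants):
        for a in sorted(ancestors):
            if is_prefix(a, d):
                out.append((a, d))
    return out

books = [(1, 2), (1, 3)]         # hypothetical //book candidate nodes
titles = [(1, 2, 1), (1, 3, 4)]  # hypothetical //title candidate nodes
print(prefix_match_join(books, titles))
# [((1, 2), (1, 2, 1)), ((1, 3), (1, 3, 4))]
```

Because prefix comparison decides ancestry directly from the labels, no region-containment intervals need to be intersected, which is the intuition behind preferring a prefix match join over a containment join.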

Interactive analysis tools for the wide-angle seismic data for crustal structure study (Technical Report) (지각 구조 연구에서 광각 탄성파 자료를 위한 대화식 분석 방법들)

  • Fujie, Gou;Kasahara, Junzo;Murase, Kei;Mochizuki, Kimihiro;Kaneda, Yoshiyuki
    • Geophysics and Geophysical Exploration
    • /
    • v.11 no.1
    • /
    • pp.26-33
    • /
    • 2008
  • The analysis of wide-angle seismic reflection and refraction data plays an important role in lithospheric-scale crustal structure studies. However, it is extremely difficult to develop an appropriate velocity structure model directly from the observed data; because crustal structure analysis is an intrinsically non-linear problem, the structure model must be improved step by step. There are several subjective processes in wide-angle crustal structure modelling, such as phase identification and trial-and-error forward modelling. Because these subjective processes reduce the uniqueness and credibility of the resultant models, it is important to reduce subjectivity in the analysis procedure. From this point of view, we describe two software tools, PASTEUP and MODELING, for developing crustal structure models. PASTEUP is an interactive application that facilitates the plotting of record sections, analysis of wide-angle seismic data, and picking of phases. It is equipped with various filters and analysis functions to enhance the signal-to-noise ratio and to help phase identification. MODELING is an interactive application for editing velocity models and for ray-tracing. Synthetic traveltimes computed by MODELING can be compared directly with the observed waveforms in PASTEUP. This reduces subjectivity in crustal structure modelling because traveltime picking, one of the most subjective processes in crustal structure analysis, is not required. MODELING can convert an editable layered structure model into two-way traveltimes that can be compared with time sections of multi-channel seismic (MCS) reflection data. Direct comparison of the wide-angle structure model with the reflection data gives the model more credibility. In addition, both PASTEUP and MODELING handle large datasets efficiently. These software tools help us develop more plausible lithospheric-scale structure models from wide-angle seismic data.