• Title/Summary/Keyword: map models


Design of a Mapping Framework on Image Correction and Point Cloud Data for Spatial Reconstruction of Digital Twin with an Autonomous Surface Vehicle (무인수상선의 디지털 트윈 공간 재구성을 위한 이미지 보정 및 점군데이터 간의 매핑 프레임워크 설계)

  • Suhyeon Heo;Minju Kang;Jinwoo Choi;Jeonghong Park
    • Journal of the Society of Naval Architects of Korea / v.61 no.3 / pp.143-151 / 2024
  • In this study, we present a mapping framework for 3D spatial reconstruction of a digital twin model using navigation and perception sensors mounted on an Autonomous Surface Vehicle (ASV). To improve the realism of digital twin models, 3D spatial information should be reconstructed as a digitalized spatial model and integrated with the component and system models of the ASV. In particular, 3D spatial reconstruction requires that the color and 3D point cloud data acquired from a camera and a LiDAR sensor be mapped to the navigation information at the corresponding time while minimizing noise. To ensure a clear and accurate reconstruction of the acquired data, the proposed mapping framework includes an image preprocessing step that enhances the brightness of low-light images and a point cloud preprocessing step that filters out unnecessary data. Subsequently, point matching between consecutive 3D point clouds is carried out using the Generalized Iterative Closest Point (G-ICP) approach, and the color information is mapped onto the matched 3D point cloud data. The feasibility of the proposed mapping framework was validated on a data set acquired from field experiments in an inland water environment, and the results are described.
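
The steps named in the abstract (low-light correction, point cloud filtering, scan-to-scan matching, color mapping) can be sketched with off-the-shelf tools. The snippet below is a minimal, hypothetical example, not the authors' implementation: it assumes Open3D and NumPy, uses plain gamma correction and point-to-plane ICP as stand-ins for the paper's brightness enhancement and G-ICP, and the camera intrinsics K and LiDAR-to-camera extrinsics T_cam_lidar are placeholder inputs.

    # Hypothetical sketch of the image-correction / point-cloud mapping pipeline.
    # Gamma value, voxel size, K, and T_cam_lidar are placeholders, not the paper's values.
    import numpy as np
    import open3d as o3d

    def enhance_low_light(img, gamma=2.2):
        """Simple gamma correction as a stand-in for the paper's brightness enhancement."""
        img = img.astype(np.float32) / 255.0
        return (np.power(img, 1.0 / gamma) * 255.0).astype(np.uint8)

    def preprocess_cloud(points_xyz, voxel=0.2):
        """Downsample and drop statistical outliers (the 'unnecessary data' filter)."""
        pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points_xyz))
        pcd = pcd.voxel_down_sample(voxel)
        pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
        return pcd

    def register_scans(source, target, init=np.eye(4), max_dist=1.0):
        """Align consecutive scans. The paper uses G-ICP; point-to-plane ICP is used
        here as a stand-in (recent Open3D releases also offer a generalized ICP)."""
        target.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=5 * max_dist, max_nn=30))
        result = o3d.pipelines.registration.registration_icp(
            source, target, max_dist, init,
            o3d.pipelines.registration.TransformationEstimationPointToPlane())
        return result.transformation

    def colorize(pcd, image, K, T_cam_lidar):
        """Project LiDAR points into the enhanced image and copy pixel colors."""
        pts = np.asarray(pcd.points)
        pts_h = np.c_[pts, np.ones(len(pts))]          # homogeneous LiDAR points
        cam = (T_cam_lidar @ pts_h.T).T[:, :3]         # transform into the camera frame
        in_front = cam[:, 2] > 0
        uv = (K @ cam[in_front].T).T
        uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
        h, w = image.shape[:2]
        ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
        colors = np.zeros((len(pts), 3))
        colors[np.flatnonzero(in_front)[ok]] = image[uv[ok, 1], uv[ok, 0]] / 255.0
        pcd.colors = o3d.utility.Vector3dVector(colors)
        return pcd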

Trend of In Silico Prediction Research Using Adverse Outcome Pathway (독성발현경로(Adverse Outcome Pathway)를 활용한 In Silico 예측기술 연구동향 분석)

  • Sujin Lee;Jongseo Park;Sunmi Kim;Myungwon Seo
    • Journal of Environmental Health Sciences / v.50 no.2 / pp.113-124 / 2024
  • Background: The increasing need to minimize animal testing has sparked interest in alternative methods that are more humane, cost-effective, and time-saving. In particular, in silico-based computational toxicology is gaining prominence. An adverse outcome pathway (AOP) is a biological map depicting toxicological mechanisms, composed of molecular initiating events (MIEs), key events (KEs), and adverse outcomes (AOs). In computational toxicology, predictive models for AOP components, including models based on molecular structure, are essential to understanding toxicological mechanisms. Objectives: This study reviewed the literature and investigated previous research related to AOPs and in silico methodologies. We describe the results of the analysis, including predictive techniques and approaches that can be used for future in silico-based alternatives to animal testing built on AOPs. Methods: We analyzed the in silico methods and databases used in the literature to identify trends in research on in silico prediction models. Results: We reviewed 26 studies related to AOPs and in silico methodologies. The ToxCast/Tox21 database was commonly used for toxicity studies, and the MIE was the most frequently used predictive factor among the AOP components. Machine learning was the most widely used prediction technique, and various other in silico methods, such as deep learning, molecular docking, and molecular dynamics, were also utilized. Conclusions: We analyzed current research trends in in silico-based alternatives to animal testing using AOPs. Developing predictive techniques that reflect toxicological mechanisms will be essential for replacing animal testing with in silico methods. Since the applicability of various predictive techniques continues to grow, it will be necessary to keep monitoring trends in predictive techniques and in silico-based approaches.

Quantitative Assessment Technology of Small Animal Myocardial Infarction PET Image Using Gaussian Mixture Model (다중가우시안혼합모델을 이용한 소동물 심근경색 PET 영상의 정량적 평가 기술)

  • Woo, Sang-Keun;Lee, Yong-Jin;Lee, Won-Ho;Kim, Min-Hwan;Park, Ji-Ae;Kim, Jin-Su;Kim, Jong-Guk;Kang, Joo-Hyun;Ji, Young-Hoon;Choi, Chang-Woon;Lim, Sang-Moo;Kim, Kyeong-Min
    • Progress in Medical Physics / v.22 no.1 / pp.42-51 / 2011
  • Nuclear medicine images (SPECT, PET) are widely used tools for assessing myocardial viability and perfusion, but it is difficult to define the myocardial infarct region accurately. The purpose of this study was to investigate a methodological approach for automatic measurement of rat myocardial infarct size using a polar map with an adaptive threshold. A rat myocardial infarction model was induced by ligation of the left circumflex artery. PET images were obtained after intravenous injection of 37 MBq of 18F-FDG. After a 60 min uptake period, each animal was scanned for 20 min with ECG gating. PET data were reconstructed using 2D ordered subset expectation maximization (OSEM). To automatically delineate the myocardial contour and generate the polar map, we used QGS software (Cedars-Sinai Medical Center). The reference infarct size was defined as the infarcted area as a percentage of the total left myocardium, measured by TTC staining. We compared three threshold methods: a predefined threshold, Otsu's method, and a multiple Gaussian mixture model (MGMM). The predefined threshold method, commonly used in other studies, was applied with threshold values from 10% to 90% in steps of 10%. The Otsu algorithm selects the threshold that maximizes the between-class variance. The MGMM method estimates the distribution of image intensity using multiple Gaussian mixture models (MGMM2, …, MGMM5) and calculates an adaptive threshold. The infarct size in the polar map was calculated as the percentage of the below-threshold area relative to the total polar map area. The infarct sizes measured with the different threshold methods were evaluated by comparison with the reference infarct size. The mean differences between the polar map defect size at predefined thresholds of 20%, 30%, and 40% and the reference infarct size were 7.04±3.44%, 3.87±2.09%, and 2.15±2.07%, respectively; Otsu's method differed from the reference infarct size by 3.56±4.16%, and the MGMM methods by 2.29±1.94%. The predefined threshold of 30% showed the smallest mean difference from the reference infarct size; however, MGMM was more accurate than the predefined threshold for reference infarct sizes under 10% (MGMM: 0.006%, predefined threshold: 0.59%). In this study, we evaluated myocardial infarct size in the polar map using a multiple Gaussian mixture model. The MGMM method provides an adaptive threshold for each subject and will be useful for automatic measurement of infarct size.
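
As a rough illustration of the MGMM idea, the hypothetical sketch below fits 2 to 5 Gaussian components to the polar map intensities (selecting the number of components by BIC), derives a per-subject threshold from the low-uptake component, and measures the defect fraction. The 0.5-responsibility rule and the Otsu fallback are illustrative assumptions, not the paper's exact procedure.

    # Hypothetical sketch of GMM-based adaptive thresholding of a polar map,
    # with Otsu's method available for comparison.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from skimage.filters import threshold_otsu

    def gmm_threshold(polar_map, max_components=5):
        """Fit 2..max_components Gaussians (BIC-selected) and return an adaptive threshold."""
        x = polar_map[np.isfinite(polar_map)].reshape(-1, 1)
        models = [GaussianMixture(n_components=k, random_state=0).fit(x)
                  for k in range(2, max_components + 1)]
        gmm = min(models, key=lambda m: m.bic(x))
        low = int(np.argmin(gmm.means_.ravel()))       # low-uptake (defect) component
        grid = np.linspace(x.min(), x.max(), 1000).reshape(-1, 1)
        resp = gmm.predict_proba(grid)[:, low]
        cross = np.where(resp < 0.5)[0]                # first intensity at which the defect
        if len(cross) == 0:                            # component stops dominating
            return float(threshold_otsu(x.ravel()))    # fall back to Otsu if no crossing
        return float(grid[cross[0], 0])

    def infarct_size_percent(polar_map, threshold):
        """Defect area as a percentage of the whole polar map."""
        valid = np.isfinite(polar_map)
        return 100.0 * np.count_nonzero(polar_map[valid] < threshold) / np.count_nonzero(valid)

    # Example usage on a hypothetical polar map array:
    # polar_map = np.load("rat_polar_map.npy")
    # print(infarct_size_percent(polar_map, gmm_threshold(polar_map)))
    # print(infarct_size_percent(polar_map, threshold_otsu(polar_map[np.isfinite(polar_map)])))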

An MIB Access Control Modeling for the Secure Management of Large Networks (대규모 망의 안전한 관리를 위한 관리 정보베이스의 접근 제어 모형화)

  • Seo, Jae-Hyeon;Lee, Chang-Jin;No, Bong-Nam
    • The Transactions of the Korea Information Processing Society / v.2 no.4 / pp.581-591 / 1995
  • An MIB is the heart of a network management system, storing all the information necessary for network management. To operate networks safely, it is essential to control access to managed objects. This paper provides a three-level architecture of managers to perform network management more efficiently in large networks. Moreover, a mandatory access control (MAC) policy and a role-based access control policy are adopted to ensure secure access to the MIB. These policies are modeled using the active object-oriented data model, which makes it easy to map the access control models into an active object-oriented database.
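
To make the combined policy concrete, here is a small hypothetical sketch of an access check that grants an operation on a managed object only when both a mandatory (label-dominance) rule and a role-based rule allow it. The security levels, roles, and OID below are invented examples, not the paper's model.

    # Illustrative MAC + RBAC check for MIB objects; labels, roles, and the OID
    # are hypothetical examples only.
    from dataclasses import dataclass

    LEVELS = {"public": 0, "sensitive": 1, "secret": 2}   # ordered security levels

    @dataclass
    class ManagedObject:
        oid: str
        level: str             # MAC classification of the MIB object

    @dataclass
    class Manager:
        clearance: str          # MAC clearance
        roles: frozenset        # RBAC roles

    # role -> operations permitted on the MIB
    ROLE_PERMISSIONS = {
        "monitor": {"get"},
        "operator": {"get", "set"},
    }

    def can_access(manager: Manager, obj: ManagedObject, op: str) -> bool:
        """Grant access only if both the MAC rule and an RBAC rule allow it."""
        mac_ok = LEVELS[manager.clearance] >= LEVELS[obj.level]   # no read-up
        rbac_ok = any(op in ROLE_PERMISSIONS.get(r, set()) for r in manager.roles)
        return mac_ok and rbac_ok

    # Example: a manager with the "monitor" role reading an interface counter.
    mgr = Manager(clearance="sensitive", roles=frozenset({"monitor"}))
    ifc = ManagedObject(oid="1.3.6.1.2.1.2.2.1.10", level="public")
    print(can_access(mgr, ifc, "get"))   # True
    print(can_access(mgr, ifc, "set"))   # False: the role does not permit set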


Egypt's Science and Technology Parks Outlook : A Focus on SRTACity (City for Scientific Research and Technology Applications)

  • Abdel-Fattah, Yasser R.;Kashyout, Abdel-Hady B.;Sheta, Walaa M.
    • World Technopolis Review / v.2 no.2 / pp.96-108 / 2013
  • Egypt has been known as a lighthouse of science and innovation, not only in the Middle East but across the world and across the ages. Recently, there have been many ups and downs that have positioned Egypt at a lower rank than it actually deserves given its long history. This review describes the current condition of science, technology and innovation in Egypt and the consequent establishment of best-practice science and technology park (STP) experiences. Egypt's science, technology and innovation (STI) system is highly centralized and dominated by the public sector, with R&D happening mostly in state-run universities and research centers supervised by the Ministry of Higher Education and the Ministry of Scientific Research. R&D indicators show that Egypt ranks 40th worldwide in published articles (around 10,000 papers in 2011), while the number of issued patents (350 local and 50 international in 2011) is still far below expectations. STPs in Egypt are addressed in this review through three examples: the Smart Village in Cairo, the Investment Zone in Borg El-Arab City, and the Technology Valley in Ismailia. The three models are discussed in detail, and a road map for developing more STPs is suggested.

2D Flood Simulation for Estimating the Economic Loss in the Building Areas

  • Kang, Sang-Hyeok
    • Spatial Information Research / v.15 no.4 / pp.397-406 / 2007
  • 2D hydraulic models of urban areas are at the forefront of current research on flood inundation mechanisms, but they are constrained by inadequate topographic parameters and insufficient data. In this paper, a numerical model based on DEMs is presented to represent overflow due to bank breach in urban areas. The surface flow in the building areas is assumed to be properly modeled by solving the Saint-Venant equations. To reproduce the flooding that broke out in Samcheok City in 2002, a hydraulic model test using a tracer was carried out for validation. These efforts will serve to produce flood hazard maps and to estimate the economic loss due to inundation of personal property in urban areas.
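
For illustration only, the toy sketch below integrates the one-dimensional Saint-Venant (shallow water) equations with a first-order Lax-Friedrichs scheme on a flat, frictionless bed for a dam-break-style initial state. The paper's model is two-dimensional, DEM-based, and includes building areas, so this is a simplified stand-in with invented grid and initial values.

    # Minimal 1D Saint-Venant (shallow water) solver: Lax-Friedrichs scheme,
    # flat frictionless bed, transmissive boundaries. Illustrative toy only.
    import numpy as np

    g = 9.81
    nx, dx, cfl, t_end = 400, 1.0, 0.4, 20.0

    h = np.where(np.arange(nx) * dx < 100.0, 2.0, 0.5)   # dam-break style initial depth (m)
    hu = np.zeros(nx)                                     # initial discharge per unit width

    def flux(h, hu):
        u = np.where(h > 1e-8, hu / h, 0.0)
        return np.array([hu, hu * u + 0.5 * g * h**2])

    t = 0.0
    while t < t_end:
        u = np.where(h > 1e-8, hu / h, 0.0)
        dt = cfl * dx / np.max(np.abs(u) + np.sqrt(g * h) + 1e-8)   # CFL time step
        F = flux(h, hu)
        h_new, hu_new = h.copy(), hu.copy()
        # Lax-Friedrichs update on interior cells
        h_new[1:-1] = 0.5 * (h[:-2] + h[2:]) - dt / (2 * dx) * (F[0, 2:] - F[0, :-2])
        hu_new[1:-1] = 0.5 * (hu[:-2] + hu[2:]) - dt / (2 * dx) * (F[1, 2:] - F[1, :-2])
        h_new[[0, -1]], hu_new[[0, -1]] = h_new[[1, -2]], hu_new[[1, -2]]   # transmissive ends
        h, hu, t = h_new, hu_new, t + dt

    print("max depth after %.1f s: %.2f m" % (t, h.max()))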


Measuring the Managerial Efficiency of Insurance Companies in Saudi Arabia: A Data Envelopment Analysis Approach

  • NAUSHAD, Mohammad;FARIDI, Mohammad Rishad;FAISAL, Shaha
    • The Journal of Asian Finance, Economics and Business / v.7 no.6 / pp.297-304 / 2020
  • This paper applies Data Envelopment Analysis (DEA) to compute the managerial efficiency of 30 insurance companies listed on the Saudi stock exchange over the four years from 2015 to 2018. The companies taken as the study sample included both conventional and Takaful insurance companies. The insurance sector of KSA is one of the largest sectors in the country, contributing a substantial percentage to the non-oil economy. Efficiency measurement and evaluation provide a basis for introspection and benchmarking of frontiers within the sector. In the present study, we utilized the basic Banker-Charnes-Cooper (BCC) and Charnes-Cooper-Rhodes (CCR) models of DEA. Two inputs, namely general & administrative expenses and policy & acquisition costs, and two outputs (net premium earned and investment income & other income) were taken for the efficiency calculations. The final outcomes of the study reveal that a good number of insurance companies operating in KSA are efficient on the managerial efficiency scale. Three firms remain leaders on the managerial efficiency frontier, and no company was found with zero or negative efficiency. It is expected that the outcomes of the study will provide benchmarks to managers and a road map for further improvement.
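
The envelopment form of the input-oriented CCR model is a small linear program per company. The sketch below is a hypothetical illustration using SciPy, with two inputs and two outputs mirroring the paper's choice (general & administrative expenses, policy & acquisition costs; net premium earned, investment & other income); the numeric values are invented, not the study's data.

    # Input-oriented CCR (constant returns) efficiency via linear programming.
    # X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Toy numbers only.
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """Technical efficiency of DMU o under constant returns to scale (CCR)."""
        n, m = X.shape            # n DMUs, m inputs
        s = Y.shape[1]            # s outputs
        c = np.r_[1.0, np.zeros(n)]                  # minimize theta
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[[o]].T, X.T])
        b_in = np.zeros(m)
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        # (For the BCC variable-returns model, add the convexity constraint
        #  sum_j lambda_j = 1 via A_eq=[[0.0] + [1.0] * n], b_eq=[1.0].)
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.x[0]

    # Toy example with 4 insurers, 2 inputs and 2 outputs (illustrative values).
    X = np.array([[20.0, 15.0], [30.0, 20.0], [25.0, 30.0], [40.0, 35.0]])
    Y = np.array([[100.0, 10.0], [120.0, 12.0], [110.0, 9.0], [130.0, 11.0]])
    for o in range(len(X)):
        print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")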

Quasi real-time post-earthquake damage assessment of lifeline systems based on available intensity measure maps

  • Torbol, Marco
    • Smart Structures and Systems / v.16 no.5 / pp.873-889 / 2015
  • In civil engineering, probabilistic seismic risk assessment is used to predict the economic damage that possible future earthquakes would cause to a lifeline system. The results are used to plan mitigation measures and to strengthen the structures where necessary. After an earthquake, however, public authorities need mathematical models that compute the damage caused by the earthquake to the individual vulnerable components and links, and the global behavior of the lifeline system. In this study, a framework that was developed and used for prediction purposes is modified to assess the consequences of an earthquake in quasi real-time after it has happened. This is possible because nowadays entire seismic regions are instrumented with tight networks of strong motion stations, which provide and broadcast accurate intensity measure maps of the event to the public within minutes. The framework uses the broadcast map and calculates the damage to the lifeline system and its components in quasi real-time. The results give the authorities the most likely status of the system, which helps emergency personnel deal with the damage and prioritize visual inspections and repairs. A highway transportation network is used as a test bed, but any lifeline system can be analyzed.
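
A hedged sketch of the component-level step: given the broadcast intensity-measure value at each site, lognormal fragility curves yield damage-state probabilities that can be used to rank inspections. The fragility medians and dispersions, bridge IDs, and PGA values below are invented placeholders, not the paper's models or data.

    # Quasi real-time component assessment from a broadcast intensity-measure map
    # using lognormal fragility curves. All numbers are illustrative placeholders.
    import numpy as np
    from scipy.stats import norm

    # Damage states with lognormal fragility parameters (median PGA in g, dispersion).
    FRAGILITY = {"slight": (0.3, 0.6), "moderate": (0.5, 0.6),
                 "extensive": (0.8, 0.6), "complete": (1.1, 0.6)}

    def exceedance_probs(pga):
        """P(damage >= ds | PGA) for each damage state of one component."""
        return {ds: norm.cdf(np.log(pga / med) / beta)
                for ds, (med, beta) in FRAGILITY.items()}

    def state_probs(pga):
        """Convert exceedance probabilities into probabilities of each damage state."""
        exc = exceedance_probs(pga)
        states = list(FRAGILITY)
        p = {"none": 1.0 - exc[states[0]]}
        for a, b in zip(states, states[1:]):
            p[a] = exc[a] - exc[b]
        p[states[-1]] = exc[states[-1]]
        return p

    # Broadcast IM map sampled at three bridge sites (PGA in g, illustrative values).
    bridges = {"B01": 0.25, "B02": 0.62, "B03": 0.95}
    for bid, pga in sorted(bridges.items(), key=lambda kv: -kv[1]):
        probs = state_probs(pga)
        worst = max(probs, key=probs.get)
        print(f"{bid}: PGA={pga:.2f} g, most likely state = {worst}, "
              f"P(>= extensive) = {exceedance_probs(pga)['extensive']:.2f}")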

Incorporation of Scene Geometry in Least Squares Correlation Matching for DEM Generation from Linear Pushbroom Images

  • Kim, Tae-Jung;Yoon, Tae-Hun;Lee, Heung-Kyu
    • Proceedings of the KSRS Conference / 1999.11a / pp.182-187 / 1999
  • Stereo matching is one of the most crucial parts of DEM generation. Naive stereo matching algorithms often create many holes and blunders in a DEM, and therefore a carefully designed strategy must be employed to guide stereo matching algorithms to produce “good” 3D information. In this paper, we describe one such strategy designed around scene geometry, in particular the epipolarity, for generating a DEM from linear pushbroom images. The epipolarity of perspective images is a well-known property: in a stereo image pair, a point in the reference image maps to a line in the search image uniquely defined by the sensor models of the image pair. This concept has been utilized in stereo matching by applying epipolar resampling prior to matching. However, epipolar matching for linear pushbroom images is rather complicated. It was found that the epipolarity can only be described by a hyperbola-shaped curve and that epipolar resampling cannot be applied to linear pushbroom images. Instead, we have developed an algorithm that incorporates this epipolarity directly in least squares correlation matching. Experiments showed that this approach could improve the quality of a DEM.
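
The epipolar constraint can be illustrated with a much simpler matcher than the paper's: the hypothetical sketch below scores candidate positions along a precomputed epipolar locus with plain normalized cross-correlation instead of least squares correlation matching, and the epipolar_curve() that would supply the candidates is a placeholder for the pushbroom sensor model.

    # Simplified illustration of restricting image matching to an epipolar locus.
    # NCC replaces least squares correlation matching; candidate points are assumed
    # to come from the pushbroom sensor models (placeholder epipolar_curve()).
    import numpy as np

    def ncc(a, b):
        """Normalized cross-correlation of two equally sized patches."""
        a, b = a - a.mean(), b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else -1.0

    def match_along_curve(ref_img, srch_img, ref_xy, curve_pts, half=7):
        """Find the point on the epipolar curve that best matches the reference patch."""
        x0, y0 = ref_xy
        ref_patch = ref_img[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1]
        best, best_score = None, -1.0
        h, w = srch_img.shape
        for cx, cy in curve_pts:                   # candidates from the sensor model
            x, y = int(round(cx)), int(round(cy))
            if half <= x < w - half and half <= y < h - half:
                patch = srch_img[y - half:y + half + 1, x - half:x + half + 1]
                score = ncc(ref_patch, patch)
                if score > best_score:
                    best, best_score = (x, y), score
        return best, best_score

    # Usage (hypothetical): curve_pts would come from projecting the reference ray
    # through the pushbroom sensor models of the stereo pair.
    # pt, score = match_along_curve(ref, srch, (512, 300), epipolar_curve((512, 300)))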


Performance Evaluation of the Gas Turbine of Integrated Gasification Combined Cycle Considering Off-design Operation Effect (탈설계점 효과를 고려한 석탄가스화 복합발전용 가스터빈의 성능평가)

  • Lee, Chan;Kim, Yong Chul;Lee, Jin Wook;Kim, Hyung Taek
    • 유체기계공업학회: 학술대회논문집 / 1998.12a / pp.209-214 / 1998
  • A thermodynamic simulation method is developed for the process design and performance evaluation of the gas turbine in an IGCC power plant. The present study adopts four clean coal gases derived from four different coal gasification and gas clean-up processes as IGCC gas turbine fuel, and considers the integration design condition of the gas turbine with the ASU (Air Separation Unit). In addition, the simulation method includes a compressor performance map and expander choking models to account for the off-design effects of coal gas firing and ASU integration. The prediction results show that the efficiency and net power of the IGCC gas turbines are superior to those of the natural gas fired one, but they decrease with the amount of air extracted from the gas turbine to the ASU. The operating point of the IGCC gas turbine compressor shifts to a higher pressure-ratio condition, farther from the design point, as the air extraction ratio is reduced. The exhaust gas of the IGCC gas turbine has more abundant waste heat available for the heat recovery steam generator than that of the natural gas fired gas turbine.
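
To show what including a compressor performance map means computationally, the sketch below interpolates pressure ratio and isentropic efficiency from a small tabulated map over corrected speed and corrected mass flow. All map numbers are invented placeholders, not data from the IGCC study, and real maps also carry surge and choke limits that are omitted here.

    # Off-design lookup on a tabulated compressor map via 2D interpolation.
    # The map values below are illustrative placeholders only.
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Map axes: normalized corrected speed and corrected mass flow (kg/s).
    speed = np.array([0.8, 0.9, 1.0, 1.05])
    flow = np.array([300.0, 350.0, 400.0, 430.0])

    # Tabulated map values on the (speed, flow) grid.
    pressure_ratio = np.array([[10.5, 11.0, 11.4, 11.6],
                               [12.0, 12.8, 13.3, 13.5],
                               [13.5, 14.5, 15.2, 15.5],
                               [14.0, 15.2, 16.0, 16.4]])
    efficiency = np.array([[0.84, 0.85, 0.84, 0.82],
                           [0.85, 0.87, 0.86, 0.84],
                           [0.86, 0.88, 0.87, 0.85],
                           [0.85, 0.87, 0.86, 0.84]])

    pr_map = RegularGridInterpolator((speed, flow), pressure_ratio)
    eta_map = RegularGridInterpolator((speed, flow), efficiency)

    def off_design_point(n_corr, w_corr):
        """Interpolate the map at an off-design corrected speed / corrected flow."""
        pt = np.array([[n_corr, w_corr]])
        return float(pr_map(pt)[0]), float(eta_map(pt)[0])

    # Example: air extraction to the ASU changes the compressor flow, shifting the
    # operating point on the map.
    print(off_design_point(1.0, 380.0))   # design-speed, reduced-flow condition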
