• Title/Summary/Keyword: standard platform


A MapReduce-Based Workflow BIG-Log Clustering Technique (맵리듀스기반 워크플로우 빅-로그 클러스터링 기법)

  • Jin, Min-Hyuck;Kim, Kwanghoon Pio
    • Journal of Internet Computing and Services / v.20 no.1 / pp.87-96 / 2019
  • In this paper, we propose a MapReduce-supported clustering technique for collecting and classifying distributed workflow enactment event logs as a preprocessing tool. In particular, we call these distributed workflow enactment event logs Workflow BIG-Logs, because they satisfy, and are well fitted to, the 5V properties of BIG-Data: Volume, Velocity, Variety, Veracity and Value. The clustering technique developed in this paper is intentionally devised for the preprocessing phase of a specific workflow process mining and analysis algorithm operating on Workflow BIG-Logs. In other words, it uses the MapReduce framework as a Workflow BIG-Logs processing platform, it supports the IEEE XES standard data format, and it is ultimately dedicated to the preprocessing phase of the ρ-Algorithm, a typical workflow process mining algorithm based on structured information control nets. More precisely, Workflow BIG-Logs can be classified into two types of clustering patterns, activity-based and performer-based, and we implement an activity-based clustering pattern algorithm on the MapReduce framework. Finally, we verify the proposed clustering technique through an experimental study on the workflow enactment event log dataset released by the BPI Challenges.
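
The map/reduce split described above can be illustrated with a minimal pure-Python sketch. This is not the paper's implementation: the two phases are simulated in-process rather than on Hadoop, and the toy log uses only the XES `concept:name` activity attribute; field contents are assumed for illustration.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(events):
    # Emit (activity, event) pairs, mirroring a mapper over XES events.
    for event in events:
        yield event["concept:name"], event  # XES standard activity key

def reduce_phase(mapped):
    # Group events by activity key, yielding one cluster per activity,
    # mirroring the shuffle-and-reduce step of the MapReduce framework.
    ordered = sorted(mapped, key=itemgetter(0))
    for activity, pairs in groupby(ordered, key=itemgetter(0)):
        yield activity, [event for _, event in pairs]

# Toy workflow enactment log (XES-style attribute names, invented values).
log = [
    {"concept:name": "register", "case": "c1"},
    {"concept:name": "approve",  "case": "c1"},
    {"concept:name": "register", "case": "c2"},
]

clusters = dict(reduce_phase(map_phase(log)))
print(sorted(clusters))            # ['approve', 'register']
print(len(clusters["register"]))   # 2
```

On an actual cluster, the same mapper/reducer pair would be distributed over partitions of the BIG-Log, with the framework performing the sort-and-group step between the two phases.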

Counting and Localizing Occupants using IR-UWB Radar and Machine Learning

  • Ji, Geonwoo;Lee, Changwon;Yun, Jaeseok
    • Journal of the Korea Society of Computer and Information / v.27 no.5 / pp.1-9 / 2022
  • Localization systems can be used in various circumstances, such as measuring population movement, rescue technology, and even security technology (e.g., infiltration detection systems). Vision sensors such as cameras, often used for localization, are susceptible to light and temperature and can cause invasions of privacy. In this paper, we used ultra-wideband radar technology, which is not limited by the aforementioned problems, together with machine learning techniques to measure the number and location of occupants in indoor spaces behind a wall. We used four different algorithms, including extremely randomized trees, and compared their results in four different situations: detecting the number of occupants in a classroom; splitting the classroom into 28 locations and checking the position of an occupant; selecting one of the 28 locations, dividing it into 16 fine-grained locations, and checking the position of an occupant; and checking the positions of two occupants in different locations. Overall, the four algorithms showed good results, and we verified that detecting the number and location of occupants with high accuracy is possible using machine learning. We also considered the possibility of service expansion using the oneM2M standard platform and expect more services and products to be developed if this technology is applied in various fields.
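
One of the algorithms named above, extremely randomized trees, can be sketched on synthetic data with scikit-learn. The feature vectors below are a stand-in for IR-UWB range profiles and are not the paper's dataset; class means, noise levels, and dimensions are all assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for IR-UWB range profiles: each class (occupant
# count 0..3) gets a distinct mean reflection pattern plus noise.
n_per_class, n_features = 50, 20
X = np.vstack([
    rng.normal(loc=count, scale=0.5, size=(n_per_class, n_features))
    for count in range(4)
])
y = np.repeat(np.arange(4), n_per_class)

clf = ExtraTreesClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Predict the occupant count for a new noisy profile drawn from class 2.
sample = rng.normal(loc=2, scale=0.5, size=(1, n_features))
print(clf.predict(sample)[0])
```

The same fit/predict pattern applies whether the label is an occupant count or one of the 28 (or 16 fine-grained) location cells; only `y` changes.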

Experimental Study for Evaluation of Chloride Ion Diffusion Characteristics of Concrete Mix for Nuclear Power Plant Water Distribution Structures (원전 취배수 구조물 콘크리트 배합의 염소이온 확산특성 평가를 위한 실험적 연구)

  • Lee, Ho-Jae;Seo, Eun-A
    • Journal of the Korea institute for structural maintenance and inspection / v.26 no.5 / pp.112-118 / 2022
  • In this study, the diffusion characteristics were evaluated using the concrete mix design of nuclear safety-related structures. Among concrete structures related to nuclear power safety, we selected the mix of intake and drainage structures that are immersed in seawater or located on the tidal platform, evaluated chloride ion permeation resistance through compressive strength and electrical conductivity, and analyzed the diffusion characteristics through immersion in salt water. Compressive strength was measured on days 1, 7, 14, 28, 56, and 91, up to day 91, at which the design standard strength of nuclear power plant concrete structures is defined, and chloride ion permeation resistance was evaluated on days 28 and 91. After immersing the 28-day concrete specimens in salt water for 28 days, the diffusion coefficient was derived by collecting samples at different depths and analyzing the chloride content. As a result, it was found that after 28 days the long-term strength enhancement effect of the nuclear power plant concrete mix with 20% fly ash replacement was higher than that of concrete using 100% ordinary Portland cement. It was also found that the nuclear power plant concrete mix has higher chloride ion permeation resistance, a lower diffusion coefficient, and higher resistance to salt damage than the concrete mix using 100% ordinary Portland cement.
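
Deriving a diffusion coefficient from chloride contents at different depths is commonly done by fitting the error-function solution of Fick's second law; the sketch below illustrates that idea on synthetic data. The abstract does not state which fitting procedure was used, so the surface concentration, depths, and brute-force search here are all illustrative assumptions.

```python
import math

# Fick's second-law solution for a semi-infinite medium:
#   C(x, t) = Cs * erfc(x / (2 * sqrt(D * t)))
# with surface chloride content Cs and exposure time t (values assumed).

T = 28 * 24 * 3600.0   # 28 days of salt-water immersion, in seconds
CS = 0.5               # surface chloride content (% by mass, assumed)

def profile(x_m, d):
    return CS * math.erfc(x_m / (2.0 * math.sqrt(d * T)))

# "Measured" chloride contents at sampled depths (synthetic profile
# generated with an apparent diffusion coefficient D = 5e-12 m^2/s).
depths = [0.005, 0.010, 0.015, 0.020]
measured = [profile(x, 5e-12) for x in depths]

# Recover D by brute-force least squares over candidate coefficients.
candidates = [i * 1e-13 for i in range(1, 200)]
best_d = min(candidates,
             key=lambda d: sum((profile(x, d) - c) ** 2
                               for x, c in zip(depths, measured)))
print(best_d)  # best_d ≈ 5e-12, the value used to generate the profile
```

With real measurements, Cs is usually fitted alongside D, and a nonlinear least-squares routine replaces the grid search.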

IBN-based: AI-driven Multi-Domain e2e Network Orchestration Approach (IBN 기반: AI 기반 멀티 도메인 네트워크 슬라이싱 접근법)

  • Khan, Talha Ahmed;Muhammad, Afaq;Abbas, Khizar;Song, Wang-Cheol
    • KNOM Review / v.23 no.2 / pp.29-41 / 2020
  • Networks are growing faster than ever before, causing multi-domain complexity. The diversity, variety and dynamic nature of network traffic and services require enhanced orchestration and management approaches, while the multitude of standard orchestrators and network operators further increases the complexity of handling E2E slice orchestration. Moreover, multiple domains are involved in E2E slice orchestration, including the access, edge, transport and core networks, each with its own specific challenges. Hence, handling multi-domain, multi-platform and multi-operator networking environments manually requires dedicated experts, and with such an approach it is impossible to handle dynamic changes in the network at runtime. Manual approaches to handling such complexity are also error-prone and tedious. This work therefore proposes an automated and abstracted solution for handling E2E slice orchestration using an intent-based approach. It abstracts the domains from the operators and enables them to express their orchestration intention in the form of high-level intents. It also actively monitors the orchestrated resources and, based on current monitoring statistics, uses machine learning to predict future resource utilization and update the system state. The result is a closed-loop, automated E2E network orchestration and management system.
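
The monitor-predict-update loop described above can be sketched in a few lines. This is not the paper's model: the abstract does not specify which machine learning technique is used, so a simple linear trend fit stands in for the predictor, and the utilization samples and threshold are invented.

```python
import numpy as np

# Closed-loop sketch: fit a trend to recent utilization samples, forecast
# the next monitoring interval, and emit a scaling intent when the
# forecast crosses a bound (all values illustrative).
history = np.array([0.41, 0.45, 0.52, 0.58, 0.66, 0.73])  # CPU utilization
t = np.arange(len(history))

slope, intercept = np.polyfit(t, history, 1)   # simple linear trend model
forecast = slope * len(history) + intercept    # next interval's utilization

SCALE_UP_THRESHOLD = 0.75
action = "scale-up" if forecast > SCALE_UP_THRESHOLD else "hold"
print(round(float(forecast), 2), action)
```

In a real IBN system the emitted action would itself be a high-level intent, translated by the orchestrator into per-domain resource changes.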

A Study on the Mid- to Long-term Public Library Expansion Plan in Daegu City (대구시 중장기 공공도서관 확충방안 연구)

  • Hee-Yoon Yoon;Seon-Kyung Oh
    • Journal of the Korean Society for Library and Information Science / v.57 no.3 / pp.97-117 / 2023
  • The purpose of this study is to suggest a mid- to long-term expansion plan to resolve the blind spots and alleviate the imbalance of public library services in Daegu City. The research methods included a literature review, analysis of related laws and statistical data, case studies, and an opinion survey. As a result, first, the service areas were set as a total of 14 areas based on administrative districts (one each in Jung-gu, Seo-gu, Nam-gu, and Dalseong-gun; two each in Dong-gu and Buk-gu; and three each in Suseong-gu and Dalseo-gu). Second, the expansion scenario for public libraries in Daegu City proposes adding 26 libraries by the final target year (2032), based on the trend of national library growth over the past 13 years (2008-2020) and the forecast for the next 10 years (2023-2032). Third, the construction scenarios for each basic local government, excluding the Daegu representative library, are as follows: one library each in Jung-gu, Seo-gu, and Nam-gu; two libraries in Suseong-gu; three libraries in Dalseong-gun; four libraries in Dong-gu; and seven libraries each in Buk-gu and Dalseo-gu. In terms of floor area, it is proposed to add a total of 17 branch libraries of the minimum legal standard of 330-2,499㎡, four central libraries of 2,500-4,999㎡, and four central libraries of 5,000-9,999㎡. On this premise, Daegu City and its public libraries should focus on creating inclusive and open community spaces, building a digital platform, strengthening the library operation and cooperation system centered on the Daegu representative library, developing collections and specialized services for local hub libraries, enhancing knowledge information and program services, managing key library indicators, and improving their social contribution.

Development of Probabilistic Seismic Coefficients of Korea (국내 확률론적 지진계수 생성)

  • Kwak, Dong-Yeop;Jeong, Chang-Gyun;Park, Du-Hee;Lee, Hong-Sung
    • Journal of the Korean Geotechnical Society / v.25 no.10 / pp.87-97 / 2009
  • Seismic site coefficients are often used with seismic hazard maps to develop the design response spectrum at the surface. The site coefficients are most commonly developed deterministically, while the seismic hazard maps are derived probabilistically. There is, hence, an inherent incompatibility between the two approaches. Nevertheless, they are used together in seismic design codes without a clear rational basis. To resolve this fundamental incompatibility between site coefficients and hazard maps, this study uses a novel probabilistic seismic hazard analysis (PSHA) technique that simulates the results of a standard PSHA at a rock outcrop but integrates the site response analysis function to capture site amplification effects within the PSHA platform. Another important advantage of the method is its ability to model the uncertainty, variability, and randomness of the soil properties. The new PSHA was used to develop fully probabilistic site coefficients for the site classes of the seismic design code and for another set of site classes proposed in Korea. Comparisons highlight the pronounced discrepancy between the site coefficients of the seismic design code and the proposed coefficients, while the other set of site coefficients shows differences only at selected site classes.
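
The core idea of folding randomized site amplification into the hazard calculation can be shown with a toy Monte Carlo sketch. This is a drastic simplification of a real PSHA, not the paper's method: the source rate, rock-motion distribution, and amplification statistics below are all invented for illustration.

```python
import math
import random

random.seed(1)

# Toy PSHA sketch: annual rate of exceeding a surface acceleration level,
# with a lognormal site amplification folded into the hazard integral.
RATE = 0.05        # annual rate of earthquakes on the source (assumed)
N = 100_000

def rock_pga():
    # Crude stand-in attenuation: lognormal rock-outcrop PGA (g).
    return math.exp(random.gauss(math.log(0.1), 0.6))

def site_amp():
    # Randomized soil amplification, capturing soil-property variability.
    return math.exp(random.gauss(math.log(2.0), 0.3))

def exceedance_rate(threshold):
    hits = sum(1 for _ in range(N) if rock_pga() * site_amp() > threshold)
    return RATE * hits / N

# One point on the surface hazard curve; a fully probabilistic site
# coefficient would compare surface and rock curves at equal hazard.
print(round(exceedance_rate(0.3), 4))
```

In the actual method, the simulated rock motions drive a site response analysis rather than a single random amplification factor, but the probabilistic bookkeeping is analogous.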

Application of peak based-Bayesian statistical method for isotope identification and categorization of depleted, natural and low enriched uranium measured by LaBr3:Ce scintillation detector

  • Haluk Yucel;Selin Saatci Tuzuner;Charles Massey
    • Nuclear Engineering and Technology / v.55 no.10 / pp.3913-3923 / 2023
  • Today, medium-energy-resolution detectors are preferred in radioisotope identification devices (RID) for nuclear and radioactive material categorization. However, there is still a need to develop or enhance "automated identifiers" for useful RID algorithms. To decide whether a material is SNM or NORM, a key parameter is the energy resolution of the detector. Although masking, shielding, gain shift/stabilization and other on-site parameters are also important for successful operations, the suitability of the RID algorithm is likewise critical to identification reliability when extracting features from the spectral analysis. In this study, a RID algorithm based on a Bayesian statistical method has been modified for medium-energy-resolution detectors and applied to uranium gamma-ray spectra taken by a LaBr3:Ce detector. The present Bayesian RID algorithm covers the energy range up to 2000 keV. It uses the peak centroids and peak areas from the measured gamma-ray spectra. The extracted features are fed to peak-based Bayesian classifiers to estimate a posterior probability for each isotope in the ANSI library. The program was tested under a MATLAB platform. The present peak-based Bayesian RID algorithm was validated using single isotopes (241Am, 57Co, 137Cs, 54Mn, 60Co) and then applied to five standard nuclear materials (0.32-4.51 at.% 235U), as well as natural U- and Th-ores. The ID performance of the RID algorithm was quantified in terms of the F-score for each isotope. The posterior probability is calculated to be 54.5-74.4% for 238U and 4.7-10.5% for 235U in EC-NRM171 uranium materials. For the more complex gamma-ray spectra from CRMs, the total scoring (ST) method was preferred for ID performance evaluation. It was shown that the present peak-based Bayesian RID algorithm can be applied to identify 235U and 238U isotopes in LEU or natural U-Th samples if a medium-energy-resolution detector is used for the measurements.
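
A peak-based Bayesian classifier of the general kind described above can be sketched as follows. This is not the paper's MATLAB algorithm: the three-isotope line list is a tiny hypothetical stand-in for the ANSI library, peak areas are ignored, and a uniform prior with a Gaussian centroid likelihood is assumed.

```python
import math

# Hypothetical mini-library of characteristic gamma lines (keV); the
# actual algorithm uses the far more extensive ANSI library.
LIBRARY = {
    "Cs-137": [661.7],
    "Co-60":  [1173.2, 1332.5],
    "U-235":  [143.8, 185.7, 205.3],
}

def posterior(measured_peaks, sigma=2.0):
    """Naive-Bayes-style posterior over isotopes given peak centroids.

    Each measured peak contributes the Gaussian likelihood of its nearest
    library line; a uniform prior over the library is assumed.
    """
    scores = {}
    for isotope, lines in LIBRARY.items():
        log_like = 0.0
        for peak in measured_peaks:
            d = min(abs(peak - line) for line in lines)
            log_like += -0.5 * (d / sigma) ** 2 - math.log(sigma)
        scores[isotope] = math.exp(log_like)
    total = sum(scores.values())
    return {iso: s / total for iso, s in scores.items()}

post = posterior([661.5])   # one measured peak near the Cs-137 line
best = max(post, key=post.get)
print(best)                 # Cs-137
```

Weighting each peak's contribution by its area, as the paper's feature set suggests, would let strong lines dominate the posterior over weak or spurious ones.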

Design of Ship-type Floating LiDAR Buoy System for Wind Resource Measurement in the Korean West Sea and Numerical Analysis of Stability Assessment of Mooring System (서해안 해상풍력단지 풍황관측용 부유식 라이다 운영을 위한 선박형 부표식 설계 및 계류 시스템의 수치 해석적 안정성 평가)

  • Yong-Soo, Gang;Jong-Kyu, Kim;Baek-Bum, Lee;Su-In, Yang;Jong-Wook, Kim
    • Journal of Navigation and Port Research / v.46 no.6 / pp.483-490 / 2022
  • Floating LiDAR is a system that provides a new paradigm for wind condition observation, which is essential when creating an offshore wind farm. As it can save time and money, minimize environmental impact, and even reduce backlash from local communities, it is emerging as the industry standard. However, the design and verification of a stable platform is very important, as disturbances caused by fluctuations of the buoy affect the reliability of observation data. In Korea, owing to the nation's late entry into the technology, a number of foreign equipment manufacturers dominate the domestic market. The west coast of Korea is a shallow sea environment with a very large tidal range, so strong currents appear repeatedly depending on the region, and waves whose energy differs markedly by season are formed. This paper examines buoys suitable for LiDAR operation in Korean waters, which have such complex environmental characteristics. We introduce examples of the optimized design and verification of ship-type buoys, which were applied first, and derive important concepts that will serve as the basis for the development of various platforms in the future.

Prelaunch Study of Validation for the Geostationary Ocean Color Imager (GOCI) (정지궤도 해색탑재체(GOCI) 자료 검정을 위한 사전연구)

  • Ryu, Joo-Hyung;Moon, Jeong-Eon;Son, Young-Baek;Cho, Seong-Ick;Min, Jee-Eun;Yang, Chan-Su;Ahn, Yu-Hwan;Shim, Jae-Seol
    • Korean Journal of Remote Sensing / v.26 no.2 / pp.251-262 / 2010
  • In order to provide quantitative control of the standard products of the Geostationary Ocean Color Imager (GOCI), on-board radiometric correction, atmospheric correction, and bio-optical algorithms must be maintained continuously through comprehensive and consistent calibration and validation procedures. The calibration/validation of GOCI radiometric, atmospheric, and bio-optical data uses temperature, salinity, ocean optics, fluorescence, and turbidity data sets from buoy and platform systems, together with periodic oceanic environmental data. For the calibration and validation of GOCI, we compared in-situ radiometric measurements with data from the HyperSAS instrument installed at the Ieodo ocean research station, and HyperSAS radiance with SeaWiFS radiance. HyperSAS data differed slightly from the in-situ radiance and irradiance but showed no spectral shift in the absorption bands. Although the radiance bands compared between HyperSAS and SeaWiFS had an average error of 25%, the absolute error fell to 11% when the atmospheric correction bands were omitted. This error is related to the SeaWiFS standard atmospheric correction process and must be considered and reduced in the calibration and validation of GOCI. A reference target site around Dokdo Island was used for the calibration and validation study. In-situ ocean- and bio-optical data were collected during August and October 2009. Reflectance spectra around Dokdo Island showed the optical characteristics of Case-1 water. Absorption spectra of chlorophyll, suspended matter, and dissolved organic matter also showed their respective spectral characteristics. MODIS Aqua-derived chlorophyll-a concentration was well correlated with the in-situ fluorometer values from the Dokdo buoy. As the problems of radiometric, atmospheric, and bio-optical correction are resolved, the quality of GOCI calibration and validation is expected to progress and improve.

Using the METHONTOLOGY Approach to a Graduation Screen Ontology Development: An Experiential Investigation of the METHONTOLOGY Framework

  • Park, Jin-Soo;Sung, Ki-Moon;Moon, Se-Won
    • Asia pacific journal of information systems / v.20 no.2 / pp.125-155 / 2010
  • Ontologies have been adopted in various business and scientific communities as a key component of the Semantic Web. Despite the increasing importance of ontologies, ontology developers still perceive construction tasks as a challenge. A clearly defined and well-structured methodology can reduce the time required to develop an ontology and increase the probability of success of a project. However, no reliable knowledge-engineering methodology for ontology development currently exists; every methodology has been tailored toward the development of a particular ontology. In this study, we developed a Graduation Screen Ontology (GSO). The graduation screen domain was chosen for several reasons. First, the graduation screen process is a complicated task requiring a complex reasoning process. Second, GSO may be reused by other universities because the graduation screen process is similar at most universities. Finally, GSO can be built within a given period because the size of the selected domain is reasonable. No standard ontology development methodology exists; thus, one of the existing ontology development methodologies had to be chosen. The most important considerations for selecting the ontology development methodology for GSO included whether it can be applied to a new domain, whether it covers a broad set of development tasks, and whether it gives a sufficient explanation of each development task. We evaluated various ontology development methodologies based on the evaluation framework proposed by Gómez-Pérez et al. We concluded that METHONTOLOGY was the most applicable to the building of GSO for this study. METHONTOLOGY was derived from the experience of developing the Chemical Ontology at the Polytechnic University of Madrid by Fernández-López et al. and is regarded as the most mature ontology development methodology. 
METHONTOLOGY describes a very detailed approach for building an ontology under a centralized development environment at the conceptual level. This methodology consists of three broad processes, with each process containing specific sub-processes: management (scheduling, control, and quality assurance); development (specification, conceptualization, formalization, implementation, and maintenance); and support (knowledge acquisition, evaluation, documentation, configuration management, and integration). An ontology development language and an ontology development tool for GSO construction also had to be selected. We adopted OWL-DL as the ontology development language. OWL was selected for its computational qualities of consistency checking and classification, which are crucial in developing coherent and useful ontological models for very complex domains. In addition, Protégé-OWL was chosen as the ontology development tool because it is supported by METHONTOLOGY and is widely used owing to its platform-independent characteristics. Based on the researchers' GSO development experience, some issues relating to METHONTOLOGY, OWL-DL, and Protégé-OWL were identified. We focused on presenting the drawbacks of METHONTOLOGY and discussing how each weakness could be addressed. First, METHONTOLOGY insists that domain experts who do not have ontology construction experience can easily build ontologies. However, it is still difficult for these domain experts to develop a sophisticated ontology, especially if they have insufficient background knowledge related to the ontology. Second, METHONTOLOGY does not include a development stage called the "feasibility study." This pre-development stage helps developers ensure not only that a planned ontology is necessary and sufficiently valuable to begin an ontology building project, but also to determine whether the project will be successful. 
Third, METHONTOLOGY excludes an explanation on the use and integration of existing ontologies. If an additional stage for considering reuse is introduced, developers might share benefits of reuse. Fourth, METHONTOLOGY fails to address the importance of collaboration. This methodology needs to explain the allocation of specific tasks to different developer groups, and how to combine these tasks once specific given jobs are completed. Fifth, METHONTOLOGY fails to suggest the methods and techniques applied in the conceptualization stage sufficiently. Introducing methods of concept extraction from multiple informal sources or methods of identifying relations may enhance the quality of ontologies. Sixth, METHONTOLOGY does not provide an evaluation process to confirm whether WebODE perfectly transforms a conceptual ontology into a formal ontology. It also does not guarantee whether the outcomes of the conceptualization stage are completely reflected in the implementation stage. Seventh, METHONTOLOGY needs to add criteria for user evaluation of the actual use of the constructed ontology under user environments. Eighth, although METHONTOLOGY allows continual knowledge acquisition while working on the ontology development process, consistent updates can be difficult for developers. Ninth, METHONTOLOGY demands that developers complete various documents during the conceptualization stage; thus, it can be considered a heavy methodology. Adopting an agile methodology will result in reinforcing active communication among developers and reducing the burden of documentation completion. Finally, this study concludes with contributions and practical implications. No previous research has addressed issues related to METHONTOLOGY from empirical experiences; this study is an initial attempt. In addition, several lessons learned from the development experience are discussed. 
This study also affords some insights for ontology methodology researchers who want to design a more advanced ontology development methodology.