• Title/Summary/Keyword: integrated extraction

Search Results: 228

Intertidal DEM Generation Using Waterline Extracted from Remotely Sensed Data (원격탐사 자료로부터 해안선 추출에 의한 조간대 DEM 생성)

  • 류주형;조원진;원중선;이인태;전승수
    • Korean Journal of Remote Sensing / v.16 no.3 / pp.221-233 / 2000
  • An intertidal topography is continuously changed by morphodynamic processes. Detection and measurement of topographic change in a tidal flat are important for making an integrated coastal area management plan as well as for carrying out sedimentologic study. The objective of this study is to generate an intertidal DEM using leveling data and waterlines extracted from optical and microwave remotely sensed data acquired over a relatively short period. The waterline is defined as the border between the exposed tidal flat and the water body; a contour of terrain height in the tidal flat is therefore equivalent to a waterline. One can utilize satellite images to generate intertidal DEMs over large areas. Extraction of the waterline in a SAR image is a difficult task, partly because of the presence of speckle and partly because of the similarity between the signal returned from the sea surface and that from the exposed tidal flat surface or land. Waterlines in SAR intensity images and coherence maps can be extracted effectively with the MSP-RoA edge detector. From multiple images obtained over a range of tide elevations, it is possible to build up a set of heighted waterlines within the intertidal zone, from which a gridded DEM can be interpolated. We tested the proposed method over Gomso Bay and succeeded in generating an intertidal DEM with relatively high accuracy.
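The final step of the abstract above, interpolating a gridded DEM from a set of heighted waterlines, can be sketched as follows. This is a minimal illustration with synthetic waterlines: the function name and data are invented, and scipy's `griddata` stands in for whatever interpolation scheme the authors actually used.

```python
import numpy as np
from scipy.interpolate import griddata

def build_intertidal_dem(waterlines, grid_x, grid_y):
    """Interpolate a gridded DEM from heighted waterlines.

    waterlines: list of (points, height), where points is an (N, 2) array of
    (x, y) shoreline coordinates and height is the tide elevation (m) at the
    image acquisition time.
    """
    xs, ys, zs = [], [], []
    for points, height in waterlines:
        xs.extend(points[:, 0])
        ys.extend(points[:, 1])
        zs.extend([height] * len(points))  # every point on a waterline shares its tide height
    gx, gy = np.meshgrid(grid_x, grid_y)
    return griddata((np.array(xs), np.array(ys)), np.array(zs),
                    (gx, gy), method="linear")

# Two synthetic waterlines observed at tide heights 0.5 m and 1.5 m
low = (np.column_stack([np.linspace(0, 10, 20), np.full(20, 2.0)]), 0.5)
high = (np.column_stack([np.linspace(0, 10, 20), np.full(20, 8.0)]), 1.5)
dem = build_intertidal_dem([low, high],
                           np.linspace(0, 10, 11), np.linspace(2, 8, 7))
```

With only two waterlines the interior of the grid is linearly graded between the two tide heights; more tide stages would refine the surface.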

A Study on Analysis of national R&D research trends for Artificial Intelligence using LDA topic modeling (LDA 토픽모델링을 활용한 인공지능 관련 국가R&D 연구동향 분석)

  • Yang, MyungSeok;Lee, SungHee;Park, KeunHee;Choi, KwangNam;Kim, TaeHyun
    • Journal of Internet Computing and Services / v.22 no.5 / pp.47-55 / 2021
  • Analysis of research trends in a specific subject area is typically performed by examining related topics and topic changes, using topic modeling techniques with keywords extracted from literature information (papers, patents, etc.). Unlike existing approaches, this paper extracts topics related to artificial intelligence by applying the LDA topic modeling technique to project information for national R&D projects provided by the National Science and Technology Knowledge Information Service (NTIS). By analyzing these topics, this study aims to analyze the research topics and investment directions of national R&D projects. NTIS provides a vast amount of national R&D information, from information on tasks carried out through national R&D projects to the research results (theses, patents, etc.) generated through that research. In this paper, search results were obtained by querying artificial intelligence keywords and related classifications in the NTIS integrated search, and basic data was constructed by downloading project information for the latest three years. Using an LDA topic modeling library provided in Python, related topics and keywords were extracted from the basic data (research goals, research content, expected effects, keywords, etc.) and analyzed to derive insights on the direction of research investment.

Development of Risk Analysis Structure for Large-scale Underground Construction in Urban Areas (도심지 대규모 지하공사의 리스크 분석 체계 개발)

  • Seo, Jong-Won;Yoon, Ji-Hyeok;Kim, Jeong-Hwan;Jee, Sung-Hyun
    • Journal of the Korean Geotechnical Society / v.26 no.3 / pp.59-68 / 2010
  • Systematic risk management is necessary in large-scale urban construction because of the complicated and various risk factors involved. Problems of obstructions, adjacent structures, safety, environment, traffic, and geotechnical properties need to be solved because urban construction proceeds in limited space, unlike general earthwork. Therefore, a dedicated risk management system is needed that covers not only geotechnical properties but also social and cultural uncertainties. This research analyzes the current state of risk management techniques. Risk factors were identified, and the importance of each factor was estimated through a survey. A systematically categorized database was established, and a risk extraction module together with matrix and score modules were developed on top of it. Expected construction budget and time distributions can be computed by Monte Carlo analysis of probabilities and influences. Construction budget and time distributions before and after risk response can be compared and analyzed so that the risks remain manageable over the entire construction period. This system will serve as a foundation for standardization and integration. Procurement, efficiency improvement, and effective time and resource management become available through integrated management technique development and application. Conclusively, decreases in cost and time are expected from the systemization of project management.
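The Monte Carlo computation of a budget distribution from risk probabilities and impacts, as described above, can be sketched as follows. The risk list, triangular impact model, and all figures are invented for demonstration; the paper's actual risk database and distributions are not reproduced here.

```python
import random

def simulate_budget(base_cost, risks, n=10000, seed=0):
    """Monte Carlo sample of total cost given independent risk events.

    risks: list of (probability, low, mode, high) tuples, where the cost
    impact of an occurring risk is drawn from a triangular distribution.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        total = base_cost
        for prob, low, mode, high in risks:
            if rng.random() < prob:            # does this risk occur?
                total += rng.triangular(low, high, mode)
        totals.append(total)
    return totals

# Illustrative risks: (probability, low, mode, high) cost impacts
risks = [(0.3, 10, 25, 60),    # obstruction relocation
         (0.2, 5, 15, 40),     # adjacent-structure settlement
         (0.1, 20, 50, 120)]   # unexpected ground condition
totals = simulate_budget(1000, risks)
p90 = sorted(totals)[int(0.9 * len(totals))]   # 90th-percentile budget
```

Comparing such distributions before and after applying risk responses is what lets the before/after budgets and durations be analyzed as the abstract describes.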

Remediation of Arsenic Contaminated soils Using a Hybrid Technology Integrating Bioleaching and Electrokinetics (생용출과 전기동력학을 연계한 통합기술을 이용한 비소 오염 토양의 정화)

  • Lee, Keun-Young;Kim, Kyoung-Woong;Kim, Soon-Oh
    • Journal of Soil and Groundwater Environment / v.14 no.2 / pp.33-44 / 2009
  • The objective of this study was to develop a hybrid technology integrating biological and physicochemical technologies to efficiently remediate arsenic-contaminated lands such as abandoned mine areas. Tailing soil samples contaminated with As at a high level were obtained from the Songchon abandoned mine, and the content of arsenic and heavy metals as well as the physicochemical properties and mineral composition were investigated. In addition, two sets of sequential extraction methods were applied to analyze the chemical speciation of arsenic and heavy metals in order to predict their leachability and mobility in the geoenvironment. Based on these geochemical data, column-type experiments on the bioleaching of arsenic were undertaken. Subsequently, experiments on the hybrid process incorporating bioleaching and electrokinetics were carried out, and its arsenic removal efficiency was compared with that of the individual electrokinetic process. Finally, with these results, the feasibility of the hybrid technology was evaluated. The arsenic removal efficiencies of the individual electrokinetic process (44 days) and the hybrid process incorporating bioleaching (28 days) and electrokinetics (16 days) were measured to be 57.8% and 64.5%, respectively, when both processes were operated under identical conditions. In contrast, the arsenic removal efficiency of the bioleaching process alone (28 days) was relatively low (11.8%), indicating that bioleaching enhanced the efficacy of the electrokinetic process by mobilizing arsenic rather than removing it by itself. In particular, the arsenic removal rate of the electrokinetics integrated with bioleaching was observed to be more than twice as large as that obtained by electrokinetics alone. These results show that if bioleaching, which is considered a relatively economical process, is applied sufficiently prior to electrokinetics, the removal efficiency and rate of arsenic can be significantly improved. Consequently, the study proves the feasibility of the hybrid process integrating both technologies.

Lithospheric Mantle beneath the Korean Peninsula: Implications from Peridotite Xenoliths in Alkali Basalts (우리나라 상부암석권 맨틀: 페리도타이트 포획암으로부터의 고찰)

  • Choi, Sung-Hi
    • The Journal of the Petrological Society of Korea / v.21 no.2 / pp.235-247 / 2012
  • Peridotite xenoliths hosted by alkali basalts from South Korea occur in the Baengnyeong Island, Jeju Island, Boeun, Asan, Pyeongtaek and Ganseong areas. K-Ar whole-rock ages of the basaltic rocks range from 0.1 to 18.9 Ma. The peridotites are dominantly lherzolites and magnesian harzburgites, and the constituent minerals are Fo-rich olivine ($Fo_{88.4-92.0}$), En-rich orthopyroxene, Di-rich clinopyroxene, and Cr-rich spinel (Cr# = 7.8-53.6). Hydrous minerals, such as pargasite and phlogopite, or garnet have not been reported yet. The Korean peridotites are residues after variable degrees of partial melting (up to 26%) and melt extraction from fertile MORB mantle. However, some samples (usually refractory harzburgites) exhibit metasomatic enrichment in highly incompatible elements such as LREE. Equilibration temperatures estimated using two-pyroxene geothermometry range from ca. 850 to $1050^{\circ}C$. Sr and Nd isotopic compositions in clinopyroxene separates from the Korean peridotites show trends between depleted MORB-like mantle (DMM) and bulk silicate earth (BSE), which can be explained by secondary metasomatic overprinting of a precursor time-integrated depleted mantle. The Korean peridotite clinopyroxenes define mixing trends between DMM and EM2 end members on Sr-Pb and Nd-Pb isotopic correlation diagrams, without any corresponding changes in the basement. This is contrary to what we observe in late Cenozoic intraplate volcanism in East Asia, which shows two distinct mantle sources: a DMM-EM1 array for NE China including Baengnyeong Island and a DMM-EM2 array for Southeast Asia including Jeju Island. This observation suggests the existence of two distinct large-scale mantle domains in the shallow asthenosphere beneath East Asia. The Re-Os model ages of the Korean peridotites indicate that they have been isolated from the convecting mantle since ca. 1.8-1.9 Ga.

PCA-based Waveform Classification of Rabbit Retinal Ganglion Cell Activity (주성분분석을 이용한 토끼 망막 신경절세포의 활동전위 파형 분류)

  • 진계환;조현숙;이태수;구용숙
    • Progress in Medical Physics / v.14 no.4 / pp.211-217 / 2003
  • Principal component analysis (PCA) is a well-known data analysis method that is useful for linear feature extraction and data compression. PCA is a linear transformation that applies an orthogonal rotation to the original data so as to maximize the retained variance; it is a classical technique for obtaining an optimal overall mapping of linearly dependent patterns of correlation between variables (e.g. neurons). PCA provides, in the mean-squared-error sense, an optimal linear mapping of the signals that are spread across a group of variables. These signals are concentrated into the first few components, while the noise, i.e. the variance that is uncorrelated across variables, is sequestered in the remaining components. PCA has been used extensively to resolve temporal patterns in neurophysiological recordings. Because the retinal signal is a stochastic process, PCA can be used to identify retinal spikes. The retina was isolated from an excised rabbit eye, and a piece of retina was attached ganglion-cell side down to the surface of the microelectrode array (MEA). The MEA consisted of a glass plate with 60 substrate-integrated and insulated golden connection lanes terminating in an 8${\times}$8 array (spacing 200 $\mu$m, electrode diameter 30 $\mu$m) in the center of the plate. The MEA 60 system was used to record retinal ganglion cell activity. The action potentials of each channel were sorted with an off-line analysis tool: spikes were detected with a threshold criterion and sorted according to their principal component composition. The first (PC1) and second (PC2) principal component values were calculated using all the waveforms of each channel and all n time points in each waveform, so that several clusters could be separated clearly in two dimensions. We verified that PCA-based waveform detection is effective as an initial approach to spike sorting.
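The PC1/PC2-based sorting step can be sketched on synthetic waveforms. The two spike templates and the noise level below are invented, and k-means is used here only as one common way to separate the clusters in PC space; the paper does not specify its clustering method.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 32)

# Two invented spike templates: a narrow, tall spike and a broad, short one
template_a = np.exp(-((t - 0.3) ** 2) / 0.005)
template_b = 0.6 * np.exp(-((t - 0.5) ** 2) / 0.02)
waveforms = np.vstack(
    [template_a + 0.05 * rng.standard_normal(32) for _ in range(50)] +
    [template_b + 0.05 * rng.standard_normal(32) for _ in range(50)])

# Project every detected waveform onto PC1 and PC2, as in the paper
pca = PCA(n_components=2)
scores = pca.fit_transform(waveforms)

# Separate the clusters that appear in the two-dimensional PC space
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
```

With well-separated templates the two units fall into clearly distinct clusters in the PC1-PC2 plane, which is the situation the abstract reports for the MEA channels.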
Development of the Multi-Parametric Mapping Software Based on Functional Maps to Determine the Clinical Target Volumes (임상표적체적 결정을 위한 기능 영상 기반 생물학적 인자 맵핑 소프트웨어 개발)

  • Park, Ji-Yeon;Jung, Won-Gyun;Lee, Jeong-Woo;Lee, Kyoung-Nam;Ahn, Kook-Jin;Hong, Se-Mie;Juh, Ra-Hyeong;Choe, Bo-Young;Suh, Tae-Suk
    • Progress in Medical Physics / v.21 no.2 / pp.153-164 / 2010
  • To determine clinical target volumes considering the vascularity and cellularity of tumors, software was developed for mapping the analyzed biological clinical target volumes onto anatomical images using regional cerebral blood volume (rCBV) maps and apparent diffusion coefficient (ADC) maps. The program provides functions for integrated registration using mutual information, affine transforms, and non-rigid registration. Registration accuracy is evaluated by calculating the overlap ratio of segmented bone regions and the average distance difference of contours between reference and registered images. The performance of the developed software was tested using multimodal images of a patient with residual high-grade glioma. A registration accuracy of about 74% and an average distance difference of 2.3 mm were obtained by the bone segmentation and contour extraction evaluation; the registration accuracy could be improved by as much as 4% with the manual adjustment functions. Advanced MR images are analyzed using color maps for the rCBV maps and quantitative region-of-interest (ROI) calculations for the ADC maps. Multiple parameters on the same voxels are then plotted on a plane to constitute multi-functional parametric maps whose x and y axes represent rCBV and ADC values. According to the distributions of the functional parameters, tumor regions showing higher vascularity and cellularity are categorized according to criteria corresponding to malignant gliomas. The determined volumes, reflecting the pathological and physiological characteristics of tumors, are marked on the anatomical images. By applying multi-functional images, errors arising from using a single type of image can be reduced, and local regions with a higher probability of containing tumor cells can be determined for the radiation treatment plan. Biological tumor characteristics can thus be expressed using image registration and multi-functional parametric maps in the developed software, and the software can be used to delineate clinical target volumes from advanced MR images together with anatomical images.
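The registration-accuracy measure described above, the overlap ratio of segmented bone regions, can be sketched on synthetic binary masks. A Jaccard-style intersection-over-union is used here; the paper's exact definition of the overlap ratio may differ, and the masks are invented.

```python
import numpy as np

def overlap_ratio(ref_mask, reg_mask):
    """Overlapping voxels of two binary masks, as a fraction of their union."""
    ref = ref_mask.astype(bool)
    reg = reg_mask.astype(bool)
    union = np.logical_or(ref, reg).sum()
    if union == 0:
        return 1.0                      # two empty masks trivially agree
    return np.logical_and(ref, reg).sum() / union

# Synthetic bone masks: the registered one is shifted down by one voxel
ref = np.zeros((10, 10), dtype=bool); ref[2:8, 2:8] = True
reg = np.zeros((10, 10), dtype=bool); reg[3:9, 2:8] = True
ratio = overlap_ratio(ref, reg)         # intersection 30 voxels, union 42
```

A one-voxel misregistration of a 6x6 region already drops the ratio to about 0.71, which illustrates why percentage-level overlap figures are sensitive to small residual misalignments.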

Construction of Event Networks from Large News Data Using Text Mining Techniques (텍스트 마이닝 기법을 적용한 뉴스 데이터에서의 사건 네트워크 구축)

  • Lee, Minchul;Kim, Hea-Jin
    • Journal of Intelligence and Information Systems / v.24 no.1 / pp.183-203 / 2018
  • News articles are the most suitable medium for examining events occurring at home and abroad. In particular, as the development of information and communication technology has brought various kinds of online news media, news about events occurring in society has increased greatly, so automatically summarizing key events from massive amounts of news data will help users view many events at a glance. In addition, an event network built on the relevance between events can greatly help readers understand current events. In this study, we propose a method for extracting event networks from large news text data. To this end, we first collected Korean political and social articles from March 2016 to March 2017, and integrated synonyms by leaving only meaningful words through preprocessing with NPMI and Word2Vec. Latent Dirichlet allocation (LDA) topic modeling was used to calculate the topic distribution by date, find peaks in the distribution, and detect events. A total of 32 topics were extracted from the topic modeling, and the point of occurrence of each event was deduced from the point at which the corresponding topic distribution surged. As a result, a total of 85 events were detected, of which a final 16 events were filtered and presented using a Gaussian smoothing technique. We also calculated relevance scores between the detected events to construct the event network: using the cosine coefficient between co-occurring events, we calculated the relevance between events and connected them. Finally, we set up the event network with each event as a vertex and the relevance score between events on the edges connecting the vertices. 
The event network constructed by our method helped us sort the major events in the political and social fields in Korea over the last year in chronological order and, at the same time, identify which events are related to which. Our approach differs from existing event detection methods in that LDA topic modeling makes it easy to analyze large amounts of data and to identify relevance between events that was difficult to detect with existing methods. In text preprocessing, we applied various text mining techniques and Word2Vec to improve the extraction accuracy of proper nouns and compound nouns, which have been difficult to handle in analyzing Korean texts. The event detection and network construction techniques in this study have the following advantages in practical application. First, LDA topic modeling, which is unsupervised learning, can easily extract topics, topic words, and their distributions from a huge amount of data; by using the date information of the collected news articles, the distribution by topic can also be expressed as a time series. Second, the connections between events can be presented in a concise, summarized form by calculating relevance scores and constructing an event network from the co-occurrence of topics, which is difficult to grasp with existing event detection. This is supported by the fact that the inter-event relevance-based event network proposed in this study was actually constructed in order of occurrence time; the network also makes it possible to identify which event served as the starting point of a series of events. A limitation of this study is that LDA topic modeling produces different results depending on the initial parameters and the number of topics, and the topic and event names in the analysis results must be assigned by the subjective judgment of the researcher. 
Also, since each topic is assumed to be exclusive and independent, the relevance between topics is not taken into account. Subsequent studies need to calculate the relevance between events not covered in this study or between events belonging to the same topic.
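The edge-construction step described above, connecting events whose cosine coefficient exceeds a threshold, can be sketched as follows. The event names, co-occurrence count vectors, and threshold value are all invented toy data; only the cosine-plus-threshold construction mirrors the abstract.

```python
import math
from itertools import combinations

def cosine(u, v):
    """Cosine coefficient between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy events, each represented by a topic co-occurrence count vector
events = {
    "election": [5, 1, 0, 2],
    "cabinet":  [4, 2, 0, 1],
    "disaster": [0, 0, 6, 1],
}

# Keep an edge only when the cosine coefficient clears the threshold
threshold = 0.5
edges = [(a, b, cosine(events[a], events[b]))
         for a, b in combinations(events, 2)
         if cosine(events[a], events[b]) >= threshold]
```

Here only the politically related pair survives the threshold, giving a single weighted edge; with real topic distributions, the resulting vertices and weighted edges form the event network the study describes.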