• Title/Summary/Keyword: Target extraction


Simultaneous Analysis of Alternative Antifouling Agents (Diuron and Irgarol 1051) and Triazine Herbicide (Prometryn) in Seawater Using LC/MS-MS (해수 중 신방오도료(Diuron and Irgarol 1051) 및 트리아진계 제초제 (Prometryn)에 대한 LC-MS/MS 동시 분석법 정립)

  • Mikyoung Lee;Sunggyu Lee;Minkyu Choi
    • Korean Journal of Fisheries and Aquatic Sciences
    • /
    • v.57 no.4
    • /
    • pp.327-335
    • /
    • 2024
  • A simultaneous analytical method was developed to quantify antifouling agents and triazine herbicides in seawater using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The target compounds included diuron, Irgarol 1051, and prometryn, which are prevalent in marine environments owing to their extensive use in antifouling coatings and agriculture. The analytical procedure involves solid-phase extraction (SPE) using HLB cartridges followed by LC-MS/MS analysis for precise quantification. The method exhibits high recovery rates for diuron (101 ± 1.25%), Irgarol 1051 (94.7 ± 2.08%), and prometryn (93.7 ± 3.06%). Seawater samples from 30 coastal sites in Korea were analyzed. Irgarol 1051 was not detected, whereas diuron was consistently detected across all sites, with concentrations from 0.68 to 11.3 ng/L, and prometryn was present at levels between 0.12 and 7.06 ng/L. The highest diuron and prometryn concentrations were recorded along the southeastern and western coasts, respectively. These findings underscore the critical need for continuous monitoring and regulations to manage these contaminants in marine ecosystems, thereby safeguarding ecological integrity and public health. This study establishes a robust analytical framework for the comprehensive assessment of multiple marine contaminants.
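To make the validation steps above concrete, here is a minimal Python sketch (my own illustration with made-up numbers, not the authors' data or code) of how spike recovery and linear-calibration quantification are typically computed for an SPE + LC-MS/MS method:

```python
# Illustrative sketch (not from the paper): spike-recovery and linear-calibration
# quantification as commonly used to validate an SPE + LC-MS/MS method.
import numpy as np

def recovery_percent(measured_ng_per_l, spiked_ng_per_l):
    """Mean recovery (%) and standard deviation for replicate spiked samples."""
    r = 100.0 * np.asarray(measured_ng_per_l) / spiked_ng_per_l
    return r.mean(), r.std(ddof=1)

def quantify(peak_area, calib_areas, calib_concs_ng_per_l):
    """Interpolate a sample concentration from a linear calibration curve."""
    slope, intercept = np.polyfit(calib_concs_ng_per_l, calib_areas, 1)
    return (peak_area - intercept) / slope

# Hypothetical triplicate spike at 10 ng/L for diuron
mean_rec, sd_rec = recovery_percent([10.2, 10.0, 10.1], 10.0)
print(f"recovery: {mean_rec:.1f}% +/- {sd_rec:.2f}")
```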

Integrated Rotary Genetic Analysis Microsystem for Influenza A Virus Detection

  • Jung, Jae Hwan;Park, Byung Hyun;Choi, Seok Jin;Seo, Tae Seok
    • Proceedings of the Korean Vacuum Society Conference
    • /
    • 2013.08a
    • /
    • pp.88-89
    • /
    • 2013
  • A variety of influenza A viruses from animal hosts are continuously prevalent throughout the world, causing human epidemics that result in millions of infections and enormous industrial and economic damage. Early diagnosis of such pathogens is therefore of paramount importance for biomedical examination and public healthcare screening. To address this issue, we propose a fully integrated rotary genetic analysis system, called the Rotary Genetic Analyzer, for rapid on-site detection of influenza A viruses. The Rotary Genetic Analyzer consists of four parts: a disposable microchip, a servo motor for precise, high-rate spinning of the chip, thermal blocks for temperature control, and a miniaturized optical fluorescence detector, as shown in Fig. 1. Each thermal block, made from duralumin, is integrated with a film heater at the bottom and a resistance temperature detector (RTD) in the middle. For efficient RT-PCR, three thermal blocks are placed on the rotary stage, and the temperature of each block corresponds to one thermal-cycling step: 95°C (denaturation), 58°C (annealing), and 72°C (extension). Rotary RT-PCR was performed to amplify the target gene, which was monitored by an optical fluorescence detector above the extension block. The disposable microdevice (10 cm diameter) consists of a solid-phase extraction based sample pretreatment unit, a bead chamber, and a 4 μL PCR chamber, as shown in Fig. 2. The microchip is fabricated from a patterned polycarbonate (PC) sheet of 1 mm thickness and a PC film of 130 μm thickness, and the layers are thermally bonded at 138°C using acetone vapour. Silica-treated glass microbeads of 150~212 μm diameter are introduced into the sample pretreatment chambers and held in place by a weir structure to construct the solid-phase extraction system. Fig. 3 shows strobed images of the sequential loading of the three solutions. The three solutions were loaded into the reservoirs simultaneously (Fig. 3A), and the influenza A H3N2 viral RNA sample was then loaded at 5000 RPM for 10 s (Fig. 3B). Washing buffer followed at 5000 RPM for 5 min (Fig. 3C), and the angular frequency was decreased to 100 RPM for siphon priming of the PCR cocktail into the channel, as shown in Fig. 3D. Finally, the PCR cocktail was loaded into the bead chamber at 2000 RPM for 10 s, and the speed was then increased to 5000 RPM for 1 min to recover as much of the PCR cocktail containing the RNA template as possible (Fig. 3E). In this system, the waste from the RNA sample and washing buffer is transported to the waste chamber, which is precisely sized so that it is completely filled; the PCR cocktail can then be transported to the PCR chamber. Fig. 3F shows the final image of the sample pretreatment: the PCR cocktail containing the RNA template is successfully isolated from the waste. To detect the influenza A H3N2 virus, the RNA captured on the microdevice was amplified together with the PCR cocktail in the PCR chamber. Fluorescence images at cycles 0 and 40 are shown in Fig. 4A; the fluorescence signal at cycle 40 increased drastically, confirming detection of the influenza A H3N2 virus. Real-time amplification profiles were successfully obtained with the optical fluorescence detector, as shown in Fig. 4B. Rotary PCR and off-chip PCR were compared using the same amount of influenza A H3N2 virus, and the Ct value of the Rotary PCR was smaller than that of the off-chip PCR, without contamination. The whole process of sample pretreatment and RT-PCR could be accomplished in 30 min on the fully integrated Rotary Genetic Analyzer. We have thus demonstrated a fully integrated, portable Rotary Genetic Analyzer for gene-based detection of influenza A virus with 'sample-in-answer-out' capability, including sample pretreatment, rotary amplification, and optical detection. Target gene amplification was monitored in real time on the integrated system.
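As a rough illustration of the real-time readout described above, the following Python sketch (my own, not the authors' software) estimates a Ct value from a fluorescence profile by finding the threshold-crossing cycle; the sigmoidal profile and threshold are hypothetical:

```python
# Minimal sketch: Ct estimation from a real-time fluorescence profile by
# linear interpolation at a user-chosen threshold.
import numpy as np

def ct_value(cycles, fluorescence, threshold):
    """Return the fractional cycle at which fluorescence first crosses threshold."""
    f = np.asarray(fluorescence, dtype=float)
    c = np.asarray(cycles, dtype=float)
    above = np.nonzero(f >= threshold)[0]
    if above.size == 0:
        return None  # no amplification detected
    i = above[0]
    if i == 0:
        return c[0]
    # linear interpolation between the bracketing cycles
    frac = (threshold - f[i - 1]) / (f[i] - f[i - 1])
    return c[i - 1] + frac * (c[i] - c[i - 1])

# Hypothetical sigmoidal amplification curve over 40 cycles
cycles = np.arange(1, 41)
fluor = 1.0 / (1.0 + np.exp(-(cycles - 25) / 2.0))
print(f"Ct ~ {ct_value(cycles, fluor, 0.2):.1f}")
```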


Trends Analysis on Research Articles of the Sharing Economy through a Meta Study Based on Big Data Analytics (빅데이터 분석 기반의 메타스터디를 통해 본 공유경제에 대한 학술연구 동향 분석)

  • Kim, Ki-youn
    • Journal of Internet Computing and Services
    • /
    • v.21 no.4
    • /
    • pp.97-107
    • /
    • 2020
  • This study conducts a comprehensive meta-study, from the perspective of content analysis, to explore trends in Korean academic research on the sharing economy using big data analytics. A comprehensive meta-analysis can examine the entire body of research results, historically and as a whole, to illuminate the tendencies and properties of the overall research trend. Academic research related to the sharing economy first appeared in 2008, the year Professor Lawrence Lessig introduced the concept to the world, but research began in earnest in 2013 and has grown dramatically since then. To grasp the overall flow of domestic academic research, eight years of papers, from 2013 to the present, were selected for analysis, focusing on titles, keywords, and abstracts drawn from electronic journal databases. Big data analysis was performed in the order of cleaning, analysis, and visualization of the collected data to derive research trends and insights by year and by type of literature. Python 3.7 and the Textom analysis tools were used for data preprocessing, text mining, and frequency analysis for keyword extraction; N-gram charts, centrality and social network analysis, and CONCOR clustering visualization were performed with UCINET6/NetDraw and Textom, and the keywords clustered into eight groups were used to derive a typology of research trends. The outcomes of this study provide useful theoretical insights and guidelines for future studies.
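Since the paper's keyword analysis was done with Textom and UCINET, the following Python sketch is only a rough analogue (with hypothetical keyword lists) of the frequency and co-occurrence steps it describes:

```python
# Rough analogue of keyword frequency counting and a simple co-occurrence
# degree measure; the documents and keywords are invented for illustration.
from collections import Counter
from itertools import combinations

docs = [
    ["sharing economy", "platform", "regulation"],      # hypothetical keyword lists
    ["sharing economy", "accommodation", "platform"],
    ["platform", "regulation", "trust"],
]

# document frequency of each keyword
freq = Counter(term for doc in docs for term in set(doc))

# co-occurrence counts within each document
cooc = Counter()
for doc in docs:
    for a, b in combinations(sorted(set(doc)), 2):
        cooc[(a, b)] += 1

# weighted degree of each keyword in the co-occurrence network
degree = Counter()
for (a, b), w in cooc.items():
    degree[a] += w
    degree[b] += w

print("top terms:", freq.most_common(3))
print("most connected:", degree.most_common(3))
```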

Extraction of Forest Resources Using High Density LiDAR Data (고밀도 LiDAR 자료를 이용한 산림자원 추출에 관한 연구)

  • Young Rak, Choi;Jong Sin, Lee;Hee Cheon, Yun
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.33 no.2
    • /
    • pp.73-81
    • /
    • 2015
  • The objective of this study is to more accurately quantify information on forest resources using high-density LiDAR data. For the quantitative analysis of forest resources, we investigated methods of acquiring high-density LiDAR data and extracting forest resources from it. Tree height and girth could be extracted for individual forest resources using the high-density data. With low-density LiDAR data of about 2.5 points/m², as typically used to produce digital maps, it was difficult to extract accurate tree heights and girths. With high-density data of about 7 points/m², chosen in consideration of the topography, the properties of the forest resources, the data volume, and the processing speed, a far larger number of entities could be extracted. Mountainous and mixed topography generally yielded denser data than flat topography, and more forest resources could be extracted there. It was also found that entities at partition borders could not be extracted when each partition was processed individually, so the area should be subdivided with the processing time and the properties of the target area in mind rather than processing a wide area at once. We expect the information on forest resources obtained in this way to support further studies on greenhouse gas absorption in the future.
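A common way to extract tree heights from high-density LiDAR, consistent with (though not taken from) the workflow above, is to difference gridded surfaces into a canopy height model and pick local maxima as candidate tree tops; a minimal sketch on synthetic rasters:

```python
# Illustrative sketch (assumed workflow, not the authors' software): derive a
# canopy height model (CHM = DSM - DTM) and keep local maxima above a height cutoff.
import numpy as np
from scipy.ndimage import maximum_filter

def tree_tops(dsm, dtm, min_height=2.0, window=3):
    """Return (row, col, height) for CHM local maxima above min_height."""
    chm = dsm - dtm
    local_max = (chm == maximum_filter(chm, size=window)) & (chm > min_height)
    rows, cols = np.nonzero(local_max)
    return [(int(r), int(c), float(chm[r, c])) for r, c in zip(rows, cols)]

# Tiny synthetic example: flat terrain with two "trees"
dtm = np.zeros((20, 20))
dsm = dtm.copy()
dsm[5, 5], dsm[14, 12] = 12.0, 8.5
print(tree_tops(dsm, dtm))
```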

Extraction of Ocean Surface Current Velocity Using Envisat ASAR Raw Data (Envisat ASAR 원시자료를 이용한 표층 해류 속도 추출)

  • Kang, Ki-Mook;Kim, Duk-Jin
    • Korean Journal of Remote Sensing
    • /
    • v.29 no.1
    • /
    • pp.11-20
    • /
    • 2013
  • Space-borne Synthetic Aperture Radar (SAR) has been one of the most effective tools for monitoring quantitative oceanographic physical parameters. The Doppler information recorded in single-channel SAR raw data can be used to estimate the moving velocity of water masses in the ocean. The Doppler shift is caused by the relative motion between the SAR sensor and the water mass at the ocean surface, so the moving velocity can be extracted by measuring the Doppler anomaly, i.e. the difference between the estimated and the predicted Doppler centroid. The predicted Doppler centroid, defined as the Doppler centroid under the assumption that the target is not moving, is calculated from the geometric parameters of the satellite, such as its orbit, look angle, and attitude with respect to the rotating Earth. The estimated Doppler centroid, corresponding to the actual Doppler centroid during real SAR data acquisition, is extracted directly from the raw SAR signal data, usually by applying the Average Cross Correlation Coefficient (ACCC) method. The moving velocity was further refined to obtain the ocean surface current by subtracting the phase velocity of the Bragg-resonant capillary waves. These methods were applied to Envisat ASAR raw data acquired over the East Sea, and the extracted ocean surface currents were compared with currents measured by HF radar.
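The velocity retrieval described above can be summarized in a few lines; the sketch below is my simplification under assumed Envisat ASAR parameters (5.6 cm wavelength, 23° incidence angle), not the authors' processing chain:

```python
# Hedged sketch: convert a Doppler-centroid anomaly into a line-of-sight velocity,
# project it to the horizontal surface, and remove the phase speed of the
# Bragg-resonant gravity-capillary wave.
import math

def surface_current(doppler_anomaly_hz, radar_wavelength_m=0.056,
                    incidence_deg=23.0, g=9.81, sigma=0.074, rho=1025.0):
    theta = math.radians(incidence_deg)
    # two-way Doppler shift: f = 2*v_los / lambda  ->  v_los = lambda*f/2
    v_los = radar_wavelength_m * doppler_anomaly_hz / 2.0
    v_horizontal = v_los / math.sin(theta)
    # phase speed of the Bragg-resonant wave (gravity + surface-tension terms)
    k_bragg = 4.0 * math.pi * math.sin(theta) / radar_wavelength_m
    c_bragg = math.sqrt(g / k_bragg + sigma * k_bragg / rho)
    return v_horizontal - c_bragg

print(f"{surface_current(30.0):.2f} m/s")  # hypothetical 30 Hz Doppler anomaly
```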

DISCRIMINATION BETWEEN VIRGIN OLIVE OILS FROM CRETE AND THE PELOPONESE USING NEAR INFRARED TRANSFLECTANCE SPECTROSCOPY

  • Flynn, Stephen J.;Downey, Gerard
    • Proceedings of the Korean Society of Near Infrared Spectroscopy Conference
    • /
    • 2001.06a
    • /
    • pp.1520-1520
    • /
    • 2001
  • Food adulteration is a serious consumer fraud and a potentially dangerous practice. Regulatory authorities and food processors require a rapid, non-destructive test to accurately confirm authenticity in a range of food products and raw materials. Olive oil is a prime target for adulteration, either on the basis of the processing treatment used for its extraction (extra virgin vs virgin vs ordinary oil) or of its geographical origin (e.g. Greek vs Italian vs Spanish). As part of an investigation into this problem, some preliminary work focused on the ability of near infrared spectroscopy to discriminate between virgin olive oils from separate regions of the Mediterranean, i.e. Crete and the Peloponese. A total of 46 oils were collected: 18 originated in Crete and 28 in the Peloponese. Oils were stored in a temperature-controlled room at 20°C prior to spectral collection at room temperature (15-18°C). Samples (approximately 0.5 mL) were placed in the centre of the quartz window of a camlock reflectance cell; the gold-plated backing plate was then gently placed into the cell against the glass so as to minimize the formation of air bubbles. The rear of the camlock cell was then screwed into place, producing a sample thickness of 0.5 mm. Spectra were recorded between 400 and 2498 nm at 2 nm intervals on a NIR Systems 6500 scanning monochromator. Spectral collection took place over 2-3 days. Data were analysed using both WINISI and The Unscrambler software to investigate the possibility of discriminating between the oils from Crete and the Peloponese. A number of data pre-treatments were used, and discriminant models were developed using discriminant PLS (WINISI & Unscrambler) and SIMCA (Unscrambler). Despite the small number of samples involved, a satisfactory discrimination between the two oil types was achieved. Graphical examination of principal component scores for each oil type also suggests the possibility of separating oils from either Crete or the Peloponese on the basis of districts within each region. These preliminary data suggest the potential of near infrared spectroscopy to act as a screening technique for confirming the geographic origin of extra virgin olive oils. The sample presentation strategy adopted uses only small volumes of material and produces high-quality spectra.
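A discriminant-PLS classification of the kind performed in WINISI/Unscrambler can be sketched in Python with scikit-learn; the spectra below are synthetic stand-ins for the 46 oils, so this only illustrates the modelling step, not the paper's results:

```python
# Sketch of discriminant PLS on synthetic "spectra": 400-2498 nm at 2 nm
# intervals gives 1050 wavelengths; class labels are 1 = Crete, 0 = Peloponese.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_wavelengths = 1050
X_crete = rng.normal(0.0, 1.0, (18, n_wavelengths)) + 0.3   # small class offset
X_pelop = rng.normal(0.0, 1.0, (28, n_wavelengths))
X = np.vstack([X_crete, X_pelop])
y = np.array([1] * 18 + [0] * 28)

pls = PLSRegression(n_components=5)
pls.fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```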


Development of a Method for Calculating the Allowable Storage Capacity of Rivers by Using Drone Images (드론 영상을 이용한 하천의 구간별 허용 저수량 산정 방법 개발)

  • Kim, Han-Gyeol;Kim, Jae-In;Yoon, Sung-Joo;Kim, Taejung
    • Korean Journal of Remote Sensing
    • /
    • v.34 no.2_1
    • /
    • pp.203-211
    • /
    • 2018
  • Dam discharge is carried out to manage rivers and the areas around them during the rainy season or drought. Discharge decisions should be based on an accurate understanding of the flow the river can accommodate, so the allowable storage capacity of a river is an important factor in managing the river environment. However, the current methods based on water level gauges and ground images show limitations in accuracy and efficiency. To address these problems, this paper proposes a method to automatically calculate the allowable storage capacity of a river from images taken by a drone. In the first step, we create a 3D model of the river from the drone images; this generation process consists of tiepoint extraction, image orientation, and image matching. In the second step, the allowable storage capacity is calculated by cross-section analysis of the river using the generated 3D model and the road and river layers of the target area. In this step we determine the maximum water level of the river, extract cross-sectional profiles along the river, and use the 3D model to calculate the allowable storage capacity of each section. We verified the method with data from the Bukhan River, and the allowable storage volume was extracted automatically. The proposed method and the drone-based 3D models are expected to be useful for real-time management of rivers and their surrounding areas.
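The volume computation implied by the cross-section analysis above might look like the following sketch (an assumed average-end-area approach with hypothetical profiles, not the authors' implementation):

```python
# Sketch: cross-section flow areas below the maximum water level, integrated
# along the channel with the average-end-area method.
import numpy as np

def section_area(offsets_m, bed_elev_m, water_level_m):
    """Flow area of one cross-section below the given water level (trapezoidal rule)."""
    x = np.asarray(offsets_m, float)
    depth = np.clip(water_level_m - np.asarray(bed_elev_m, float), 0.0, None)
    return float(np.sum(0.5 * (depth[:-1] + depth[1:]) * np.diff(x)))

def allowable_storage(stations_m, sections, water_level_m):
    """Volume between consecutive stations along the river."""
    areas = [section_area(x, z, water_level_m) for x, z in sections]
    vol = 0.0
    for i in range(1, len(areas)):
        vol += 0.5 * (areas[i - 1] + areas[i]) * (stations_m[i] - stations_m[i - 1])
    return vol

# Two hypothetical trapezoidal sections 100 m apart, maximum water level 5 m
sec = ([0, 10, 20, 30], [5, 1, 1, 5])   # (offset across channel, bed elevation)
print(f"{allowable_storage([0, 100], [sec, sec], 5.0):.0f} m^3")
```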

Trace-level Determination of N-nitrosodimethylamine(NDMA) in Water Samples using a High-Performance Liquid Chromatography with Fluorescence Derivatization (HPLC와 Fluorescence Derivatization 기법을 이용한 극미량 NDMA의 수질분석)

  • Cha, Woo-Suk;Fox, Peter;Nalinakumari, Brijesh;Choi, Hee-Chul
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.28 no.2
    • /
    • pp.223-228
    • /
    • 2006
  • High-performance liquid chromatography (HPLC) with fluorescence derivatization was applied to trace-level analysis of N-nitrosodimethylamine (NDMA) in water samples. Fluorescence intensity was optimized at an excitation wavelength of 340 nm and an emission wavelength of 530 nm. pH adjustment after denitrosation was necessary to reach maximum intensity, at pH between 9 and 12, and maximum intensity was obtained with a dansyl chloride concentration of 330 to 500 mg/L. The percent error in water sample analyses through solid-phase extraction was 12-162% at the lower concentration level (10-200 ng/L NDMA) and 6-23% at the higher level (100-1000 ng/L NDMA), showing greater discrepancy at the lower level. However, the average ratios of estimated to standard NDMA were close to 1 for both concentration ranges, indicating that this HPLC method can detect from tens to hundreds of nanograms of NDMA per liter. Accurate determination of NDMA spiked into a wastewater effluent demonstrated the selectivity of fluorescence derivatization for the target compound (NDMA) in the presence of complex interfering compounds. HPLC with fluorescence derivatization may thus be applicable to determining NDMA in water and wastewater samples for various research purposes.
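For illustration only, the percent-error and estimated-to-standard ratio summaries mentioned above could be computed as in this short sketch with hypothetical spike levels:

```python
# Sketch with invented numbers: percent error per spike level and the mean
# ratio of estimated to standard NDMA concentration.
import numpy as np

def error_and_ratio(estimated_ng_per_l, standard_ng_per_l):
    est = np.asarray(estimated_ng_per_l, float)
    std = np.asarray(standard_ng_per_l, float)
    percent_error = 100.0 * np.abs(est - std) / std
    return percent_error, (est / std).mean()

err, mean_ratio = error_and_ratio([11, 48, 210, 990], [10, 50, 200, 1000])
print("percent error per level:", err)
print("mean estimated/standard ratio:", round(mean_ratio, 2))
```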

A Study on the Development of Dynamic Models under Inter Port Competition (항만의 경쟁상황을 고려한 동적모형 개발에 관한 연구)

  • 여기태;이철영
    • Journal of the Korean Institute of Navigation
    • /
    • v.23 no.1
    • /
    • pp.75-84
    • /
    • 1999
  • Although many studies have modelled port competition, both the theoretical frame and the methodology remain weak. In this study, therefore, a new algorithm called ESD (Extensional System Dynamics) for evaluating port competition is presented and applied to simulate port systems in Northeast Asia. The detailed objectives of this paper are to develop a Unit Port model using the SD (System Dynamics) method, to develop a Competitive Port model using the ESD method, to perform sensitivity analysis by altering parameters, and to propose port development strategies. The algorithm for evaluating port competition was developed in two steps: first, the SD method was adopted to develop the Unit Port models; second, the HFP (Hierarchical Fuzzy Process) method was introduced to extend the SD method. The proposed models were developed for and applied to five ports - Pusan, Kobe, Yokohama, Kaoshiung, and Keelung - with real data on each port, and several findings were derived. First, the factors for the Unit Port were extracted in consultation with experts, including researchers, professors, and research fellows in the harbor field, and five factor groups were obtained: location, facilities, service, cargo volume, and port charges. Second, the system structure consisting of feedback loops was readily identified by locating the representative and detailed factors on the keyword network of an STGB map. Third, for the target year of 2003, the simulation for Pusan port showed that the number of liners would increase from 829 to 1,450 ships and container cargo volume from 4.56 million TEU to 7.74 million TEU; accordingly, the berth length should be expanded from 2,162 m to 4,729 m, which would reduce the number of congested ships from 97 to 11. Port charges were also found to fluctuate. Simulation results for Kobe, Yokohama, Kaoshiung, and Keelung were obtained as well. Finally, the inter-port competition models developed by the ESD method were used to simulate container cargo volume for Pusan port; under the competitive situation the container cargo volume was smaller than under the non-competitive situation, indicating that Pusan port lacks competitive power relative to the other ports. The developed models were then applied to estimate changes in container cargo volume under competition by altering several parameters, and the results were found to be very helpful for port managers in charge of port development planning.
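System Dynamics models are built from stock-flow feedback loops; the toy loop below is merely in that spirit (invented rates and parameters, not the authors' equations) to show how cargo growth, berth demand, and capacity expansion can be coupled:

```python
# Toy stock-flow loop: cargo volume grows each year, berth demand follows cargo,
# and berth capacity is expanded when demand exceeds installed length.
def simulate(years=10, cargo_teu_m=4.56, berth_m=2162.0,
             growth=0.06, berth_m_per_m_teu=600.0, berth_build_per_year=250.0):
    history = []
    for year in range(years):
        cargo_teu_m *= 1.0 + growth                       # inflow: traffic growth
        required_berth = cargo_teu_m * berth_m_per_m_teu  # demand driven by cargo
        congestion = max(0.0, required_berth - berth_m)
        berth_m += min(berth_build_per_year, congestion)  # capacity response
        history.append((year, round(cargo_teu_m, 2), round(berth_m, 0)))
    return history

for row in simulate():
    print(row)
```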


An Emulation System for Efficient Verification of ASIC Design (ASIC 설계의 효과적인 검증을 위한 에뮬레이션 시스템)

  • 유광기;정정화
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.36C no.10
    • /
    • pp.17-28
    • /
    • 1999
  • In this paper, an ASIC emulation system called ACE (ASIC Emulator) is proposed. It can produce a prototype of the target ASIC in a short time and immediately verify the function of the ASIC circuit. ACE consists of emulation software - an EDIF reader, a library translator, a technology mapper, a circuit partitioner, and an LDF generator - and emulation hardware, comprising the emulation board and a logic analyzer. Technology mapping consists of three steps: circuit partitioning and extraction of logic functions, minimization of logic functions, and grouping of logic functions. During these steps, the number of basic logic blocks and the maximum level are minimized by assigning outputs to the same block so that product terms and input variables are shared as much as possible. The circuit partitioner obtains chip-level netlists satisfying the constraints of the emulation board's routing structure as well as the FPGA chip architecture. A new partitioning algorithm is proposed whose objective is to minimize the number of interconnections among FPGA chips and among groups of FPGA chips. The routing structure of the emulation board combines a complete graph with a partial crossbar structure in order to minimize the interconnection delay between FPGA chips regardless of circuit size. The logic analyzer displays the waveforms of user-designated probe signals on a PC monitor. To evaluate the performance of the proposed emulation system, a video quad-splitter, a commercial ASIC, was implemented on the emulation board. Experimental results show that it operated in real time at 14.3 MHz and functioned correctly.
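The partitioning objective above (minimizing cut nets between FPGA chips under a balance constraint) can be illustrated with a simple greedy pass; this is my own toy example, not the paper's algorithm:

```python
# Toy greedy min-cut partitioner: move cells between two FPGA groups whenever
# the move lowers the number of cut nets and does not worsen the size balance.
def cut_size(nets, part):
    return sum(1 for net in nets if len({part[n] for n in net}) > 1)

def greedy_partition(cells, nets, max_imbalance=1):
    part = {c: i % 2 for i, c in enumerate(cells)}      # initial alternating split
    improved = True
    while improved:
        improved = False
        for c in cells:
            sizes = [sum(1 for v in part.values() if v == s) for s in (0, 1)]
            if sizes[part[c]] - sizes[1 - part[c]] < -max_imbalance:
                continue                                # move would unbalance further
            before = cut_size(nets, part)
            part[c] ^= 1                                # tentatively move the cell
            if cut_size(nets, part) < before:
                improved = True
            else:
                part[c] ^= 1                            # revert the move
    return part, cut_size(nets, part)

cells = ["a", "b", "c", "d", "e", "f"]
nets = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "f"), ("a", "f")]
print(greedy_partition(cells, nets))
```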
