• Title/Summary/Keyword: Monitoring well


Generation of Daily High-resolution Sea Surface Temperature for the Seas around the Korean Peninsula Using Multi-satellite Data and Artificial Intelligence (다종 위성자료와 인공지능 기법을 이용한 한반도 주변 해역의 고해상도 해수면온도 자료 생산)

  • Jung, Sihun;Choo, Minki;Im, Jungho;Cho, Dongjin
    • Korean Journal of Remote Sensing / v.38 no.5_2 / pp.707-723 / 2022
  • Although satellite-based sea surface temperature (SST) is advantageous for monitoring large areas, spatiotemporal data gaps frequently occur due to various environmental or mechanical causes. Thus, it is crucial to fill in the gaps to maximize its usability. In this study, daily SST composite fields with a resolution of 4 km were produced through a two-step machine learning approach using polar-orbiting and geostationary satellite SST data. The first step was SST reconstruction based on the Data Interpolating Convolutional Auto-Encoder (DINCAE) using multi-satellite-derived SST data. The second step corrected the reconstructed SST against in situ measurements using a light gradient boosting machine (LGBM) to produce the final daily SST composite fields. The DINCAE model was validated using random masks for 50 days, whereas the LGBM model was evaluated using leave-one-year-out cross-validation (LOYOCV). The SST reconstruction accuracy was high, with an R2 of 0.98 and a root-mean-square error (RMSE) of 0.97℃. The second step also yielded a substantial improvement against in situ measurements, decreasing RMSE by 0.21-0.29℃ and MAE by 0.17-0.24℃. The SST composite fields generated using all in situ data in this study were comparable with existing data-assimilated SST composite fields. In addition, the LGBM model in the second step greatly reduced the overfitting that was reported as a limitation of a previous study using random forest. The spatial distribution of the corrected SST was similar to that of existing high-resolution SST composite fields, revealing that spatial details of oceanic phenomena such as fronts, eddies, and SST gradients were well simulated. This research demonstrates the potential to produce high-resolution, seamless SST composite fields using multi-satellite data and artificial intelligence.
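The leave-one-year-out cross-validation (LOYOCV) used to evaluate the second-step model can be sketched as follows; the records, years, and SST values below are illustrative assumptions, not the study's actual data:

```python
# Minimal sketch of leave-one-year-out cross-validation (LOYOCV): each year's
# records are held out in turn as the test set while the remaining years are
# used for training. Records and years here are hypothetical.

def loyocv_splits(records):
    """Yield (held-out year, train records, test records) partitions."""
    years = sorted({r["year"] for r in records})
    for held_out in years:
        train = [r for r in records if r["year"] != held_out]
        test = [r for r in records if r["year"] == held_out]
        yield held_out, train, test

records = [
    {"year": 2018, "sst": 14.2}, {"year": 2018, "sst": 15.1},
    {"year": 2019, "sst": 13.8}, {"year": 2020, "sst": 16.0},
]

for year, train, test in loyocv_splits(records):
    print(year, len(train), len(test))
```

Because every year serves once as an unseen test set, LOYOCV guards against the interannual overfitting that a random split can hide.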

A Study on the Selection of Base Port and Establishment of International Cooperation System for Seafarer Rotation In case of Emergency - Focusing on the Service Network of HMM - (비상 시 선원교대를 위한 거점항만 선정과 국제협력 방안 - HMM 정기선을 중심으로 -)

  • Kim, Bo-ram;Lee, Hye-jin
    • Journal of the Korean Society of Marine Environment & Safety / v.27 no.2 / pp.275-285 / 2021
  • COVID-19 has threatened the safety of ships and seafarers by delaying seafarer rotation. Shipping companies and governments have a blind spot regarding the onboard living environment of seafarers. An effective alternative plan should be devised to eliminate the possibility of human accidents in an emergency that threatens the safety of seafarers. According to a survey of former and current seafarers, the most important factor in life on board was safety, and the most pressing need during emergencies was securing smooth seafarer rotation rather than improving wages and welfare. By analyzing the major routes of national shipping companies by continent, ports with a large number of calls and a high Air Connectivity Index were selected as base ports. In addition, a route was designed for effective rotation of domestic seafarers during international shipping. Other countries must be consulted to establish a travel route linking ships, ports, and airports for the safe return of seafarers to their home countries during an emergency. It is also necessary to support seafarers facing rotation difficulties through cooperation with the International Maritime Organization (IMO). Starting with this, the government should establish a monitoring system covering return and non-return routes as well as the number of seafarers on board. Such a system would help determine the direction of national policy responses in an emergency. Along with shipping companies' efforts to improve the treatment of seafarers, national and social attention will be needed to review domestic laws and improve awareness of seafarers.

Development of simultaneous analytical method for investigation of ketamine and dexmedetomidine in feed (사료 내 케타민과 덱스메데토미딘의 잔류조사를 위한 동시분석법 개발)

  • Chae, Hyun-young;Park, Hyejin;Seo, Hyung-Ju;Jang, Su-nyeong;Lee, Seung Hwa;Jeong, Min-Hee;Cho, Hyunjeong;Hong, Seong-Hee;Na, Tae Woong
    • Analytical Science and Technology / v.35 no.3 / pp.136-142 / 2022
  • According to media reports, the carcasses of euthanized abandoned dogs were processed at high temperature and pressure into powder and then used as feed material (meat and bone meal), raising the possibility that the anesthetics ketamine and dexmedetomidine used for euthanasia remain as residues in feed. Therefore, a simultaneous analysis method using QuEChERS combined with high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry was developed for rapid residue analysis. The method developed in this study exhibited linearity of 0.999 or higher. Selectivity was evaluated by analyzing blank samples and samples spiked at the limit of quantitation. The MRM chromatograms of blank samples were compared with those of the spiked samples, and there were no interferences at the respective retention times of ketamine and dexmedetomidine. The instrumental detection and quantitation limits were 0.6 ㎍/L and 2 ㎍/L, respectively. The limit of quantitation for the method was 10 ㎍/kg. Recovery tests on meat and bone meal, meat meal, and pet food showed recoveries of 80.48-98.63 % for ketamine with less than 5.00 % RSD, and 72.75-93.00 % for dexmedetomidine with less than 4.83 % RSD. When six feeds, including meat and bone meal, prepared at the time the raw material was distributed were collected and analyzed, 10.8 ㎍/kg of ketamine was detected in one meat and bone meal sample, while dexmedetomidine was below the limit of quantitation. It was confirmed that the detected sample was distributed before the safety issue became known; thereafter, all meat and bone meal made with the carcasses of euthanized abandoned dogs was recalled and completely discarded.
To ensure the safety of the meat and bone meal, 32 additional samples of meat and bone meal and compound feed were collected and investigated for ketamine and dexmedetomidine residues. Neither compound was detected. However, this investigation confirmed that some animal drugs, such as anesthetics, can remain undecomposed even after high-temperature, high-pressure processing; therefore, other potentially hazardous substances not controlled in feed warrant further investigation.
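As a rough illustration of how recovery (%) and relative standard deviation (RSD, %) figures like those reported above are typically derived from replicate spiked-sample measurements (the replicate values and spike level below are hypothetical, not the study's data):

```python
# Hypothetical sketch: recovery is the mean measured concentration as a
# percentage of the spiked concentration; RSD is the standard deviation of
# the replicates relative to their mean. Values are illustrative only.
from statistics import mean, stdev

def recovery_and_rsd(measured, spiked):
    """Return (mean recovery %, RSD %) for replicate measurements."""
    recoveries = [100.0 * m / spiked for m in measured]
    return mean(recoveries), 100.0 * stdev(measured) / mean(measured)

measured = [9.8, 10.1, 9.5, 9.9, 10.0]  # ug/kg found in 5 replicates (assumed)
spiked = 10.0                            # ug/kg added (assumed)
rec, rsd = recovery_and_rsd(measured, spiked)
print(f"recovery {rec:.1f}%, RSD {rsd:.2f}%")  # recovery 98.6%, RSD 2.33%
```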

Estimation of ecological flow and fish habitats for Andong Dam downstream reach using 1-D and 2-D physical habitat models (1차원 및 2차원 물리서식처 모형을 활용한 안동댐 하류 하천의 환경생태유량 및 어류서식처 추정)

  • Kim, Yongwon;Lee, Jiwan;Woo, Soyoung;Kim, Soohong;Lee, Jongjin;Kim, Seongjoon
    • Journal of Korea Water Resources Association / v.55 no.12 / pp.1041-1052 / 2022
  • This study estimates the optimal ecological flow and analyzes the spatial distribution of fish habitat for the Andong Dam downstream reach (4,565.7 km2) using PHABSIM (Physical Habitat Simulation System) and River2D. To establish the habitat models, cross-section information and hydraulic input data were collected using the Nakdong River basic plan report. PHABSIM was set up for about 410.0 m from the Gudam streamflow gauging station (GD), and River2D for about 6.0 km including GD. To select a representative fish species and construct the HSI (Habitat Suitability Index), a fish survey was performed at Pungji bridge, which well represents the physical characteristics of the target stream located downstream of GD. Zacco platypus showed the highest relative abundance and was selected as the representative species, and the HSI was constructed using its physical habitat characteristics. The optimal HSI ranges were 0.3-0.5 m/s for the velocity suitability index and 0.4-0.6 m for the depth suitability index, with a substrate of sand to fine gravel. Applying the HSI to PHABSIM, the optimal ecological flow for the target stream was estimated at 20.0 m3/sec. The two-dimensional spatial analysis of fish habitat using River2D estimated the WUA (Weighted Usable Area) at 107,392.0 m2/1000 m under the ecological flow condition, showing that fish habitat was secured throughout the target stream compared with the Q355 condition.
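The Weighted Usable Area reported above is computed, in physical habitat models of this kind, by weighting each cell's area with a composite suitability factor; a minimal sketch, assuming the composite is the product of the velocity, depth, and substrate suitability indices, with hypothetical cell values:

```python
# Sketch of a WUA calculation: each computational cell's area (m^2) is
# weighted by the product of its velocity, depth, and substrate suitability
# indices (each in 0..1). Cell values are illustrative, not model output.

def wua(cells):
    """Weighted Usable Area: sum of area x composite suitability per cell."""
    return sum(c["area"] * c["v_si"] * c["d_si"] * c["s_si"] for c in cells)

cells = [
    {"area": 100.0, "v_si": 1.0, "d_si": 1.0,  "s_si": 1.0},  # fully suitable
    {"area": 100.0, "v_si": 0.5, "d_si": 0.5,  "s_si": 1.0},  # partly suitable
    {"area": 100.0, "v_si": 0.0, "d_si": 0.25, "s_si": 0.5},  # unsuitable velocity
]
print(wua(cells))  # 100 + 25 + 0 = 125.0 m^2
```

Summing this quantity over all cells per 1,000 m of stream gives the WUA figures that models such as River2D report.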

Application of Flux Average Discharge Equation to Assess the Submarine Fresh Groundwater Discharge in a Coastal Aquifer (연안 대수층의 해저 담지하수 유출량 산정을 위한 유량 평균 유출량 방정식의 적용)

  • Il Hwan Kim;Min-Gyu Kim;Il-Moon Chung;Gyo-Cheol Jeong;Sunwoo Chang
    • The Journal of Engineering Geology / v.33 no.1 / pp.105-119 / 2023
  • Water supply is decreasing due to climate change, and coastal and island regions are highly dependent on groundwater, reducing the amount of available water. For a sustainable water supply in coastal and island regions, it is necessary to accurately diagnose current conditions and to distribute and manage water efficiently. For a precise analysis of groundwater flow in a coastal island region, submarine fresh groundwater discharge was calculated for the Seongsan basin in the eastern part of Jeju Island. Two methods were used to estimate the thickness of the fresh groundwater: one employed vertical interpolation of electrical conductivity measured in a multi-depth monitoring well; the other used the theoretical Ghyben-Herzberg ratio. The Ghyben-Herzberg ratio cannot accurately capture the changing freshwater-saltwater interface, whereas the value derived from electrical conductivity represents the current state of the interface. The observed parameters were distributed on a virtual grid, and the average of the submarine fresh groundwater discharge fluxes over the grid was taken as the watershed's representative flux. The submarine fresh groundwater discharge and flux distribution by year were also calculated at the basin scale. The electrical conductivity method estimated the submarine fresh groundwater discharge from 2018 to 2020 at 6.27 × 10⁶ m3/year; the Ghyben-Herzberg method estimated 10.87 × 10⁶ m3/year. The results of this study can serve as baseline data for policies on sustainable water supply based on precise water budget analysis in coastal and island areas.
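The Ghyben-Herzberg estimate mentioned above follows from a simple density balance: the interface depth below sea level is roughly 40 times the water-table head above sea level, for typical freshwater and seawater densities. A sketch with an assumed head value:

```python
# Ghyben-Herzberg relation: z = h * rho_f / (rho_s - rho_f), where z is the
# depth of the freshwater-saltwater interface below sea level and h is the
# water-table head above sea level. With rho_f = 1000 and rho_s = 1025 kg/m^3
# the ratio is 40. The head value below is an assumption for illustration.

def ghyben_herzberg_depth(head_m, rho_f=1000.0, rho_s=1025.0):
    """Depth of the freshwater-saltwater interface below sea level (m)."""
    return head_m * rho_f / (rho_s - rho_f)

head = 0.5  # water-table elevation above mean sea level, m (assumed)
print(ghyben_herzberg_depth(head))  # 0.5 * 40 = 20.0 m
```

Because the ratio assumes a static, sharp interface, it cannot track a moving or diffuse interface, which is why the conductivity-based estimate in the study differs.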

A Performance Comparison of Land-Based Floating Debris Detection Based on Deep Learning and Its Field Applications (딥러닝 기반 육상기인 부유쓰레기 탐지 모델 성능 비교 및 현장 적용성 평가)

  • Suho Bak;Seon Woong Jang;Heung-Min Kim;Tak-Young Kim;Geon Hui Ye
    • Korean Journal of Remote Sensing / v.39 no.2 / pp.193-205 / 2023
  • A large amount of floating debris from land-based sources during heavy rainfall has negative social, economic, and environmental impacts, but monitoring systems for the accumulation areas and amounts of floating debris are lacking. With recent developments in artificial intelligence, there is a need to survey large water systems quickly and efficiently using drone imagery and deep learning-based object detection models. In this study, we acquired various images, including drone images, and trained You Only Look Once (YOLO)v5s and the more recently released YOLOv7 and YOLOv8s to compare their performance and propose an efficient detection technique for land-based floating debris. Qualitative evaluation showed that all three models detect floating debris well under normal circumstances, but the YOLOv8s model missed or duplicated objects when images were overexposed or the water surface strongly reflected sunlight. Quantitative evaluation showed that YOLOv7 performed best, with a mean Average Precision (intersection over union, IoU 0.5) of 0.940, higher than YOLOv5s (0.922) and YOLOv8s (0.922). When distortions were applied to the color and high-frequency components to compare model performance by data quality, YOLOv8s degraded most noticeably while YOLOv7 degraded least. This study confirms that the YOLOv7 model is more robust than YOLOv5s and YOLOv8s for detecting land-based floating debris. The proposed deep learning-based detection technique can identify the spatial distribution of floating debris by category, which can contribute to planning future cleanup work.
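The mAP(IoU 0.5) scores above rest on the intersection-over-union criterion: a detection counts as a true positive when its box overlaps a ground-truth box by at least 50%. A minimal sketch with hypothetical boxes:

```python
# Intersection over union (IoU) for two axis-aligned boxes (x1, y1, x2, y2):
# the overlap area divided by the area of the union. Boxes are illustrative.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union

gt = (0, 0, 100, 100)
pred = (50, 0, 150, 100)  # shifted right by half a box width
print(iou(gt, pred))  # 5000 / 15000 = 1/3, below the 0.5 threshold
```

At mAP(IoU 0.5), this prediction would be scored a miss despite partially covering the object.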

A Study on Problems and Improvement of Home-help Services of Long-term Care Insurance (노인장기요양보험 재가서비스의 문제점과 개선방안)

  • Lee, Jun Woo;Jin, Hee
    • 한국노년학 / v.29 no.1 / pp.149-175 / 2009
  • The purpose of this research is to analyze the overall problems, as of October 2008, of the home-help services of the Long-Term Care Insurance (LTCI), which has revealed many problems since its launch in July 2008, and to propose improvements. The research combines a literature review of existing secondary materials with a survey of social workers active in the LTCI field. Based on the policy analysis framework of Gilbert and Specht, the data are analyzed along the dimensions of client, benefit (service), finance, and delivery system. This research found problems in each dimension of the LTCI home-help services. Firstly, the client system shows mismatches between registered and actual service clients, and problems in estimating client numbers and judging service levels. Secondly, the service system reveals insufficient professionality of social workers, lowered service quality due to loose qualification criteria for workers, unreasonable limits on available service time, and a flat fare applied to visiting-help services regardless of care level. Thirdly, in the financing system, clients must pay extra for services such as meals, haircuts, and bathing because government financial support has stopped; some organizations have had to reduce services and replace full-time workers with part-time ones, worsening service quality. Lastly, in the delivery system, quality management is poorly prepared, and allowing too many home-help service organizations and care worker academies has produced excessive competition. The suggestions of this research are as follows. ① It is desirable, in the long term, to convert as many registered clients as possible into service clients. ② The LTCI organization requires more workers and higher professionality. ③ Elderly people who are not currently eligible require a referral system so that more can be served. ④ A management system and service manual for care workers should be developed. ⑤ Laws on service contents and processes should be revised, and the proportion of client co-payment adjusted. ⑥ Home-help service organizations licensed by the LTCI need public financial support. ⑦ Monitoring of home-help service organizations needs to be strengthened. ⑧ Evaluation tools for home-help service organizations and workers are required. ⑨ The criteria for opening a home-help service organization need to be stricter.

Predicting the Potential Habitat and Future Distribution of Brachydiplax chalybea flavovittata Ris, 1911 (Odonata: Libellulidae) (기후변화에 따른 남색이마잠자리 잠재적 서식지 및 미래 분포예측)

  • Soon Jik Kwon;Yung Chul Jun;Hyeok Yeong Kwon;In Chul Hwang;Chang Su Lee;Tae Geun Kim
    • Journal of Wetlands Research / v.25 no.4 / pp.335-344 / 2023
  • Brachydiplax chalybea flavovittata, a climate-sensitive biological indicator species, was first observed and recorded on Jeju Island, Korea, in 2010, and overwintering was recently confirmed in the Yeongsan River area. This study aimed to predict the potential distribution of the larvae of B. chalybea flavovittata and to understand its ecological characteristics and population changes under global climate change. Data were collected from the Global Biodiversity Information Facility (GBIF) and from field surveys conducted from May 2019 to May 2023. Variables selected from the 19 bioclimatic variables downloaded from the WorldClim database were used for the distribution model, and the MaxEnt model was adopted to predict the potential and future distribution of B. chalybea flavovittata. The larval distribution ranged in latitude from Jeju-si, Jeju Special Self-Governing Province (33.318096°N) to Yeoju-si, Gyeonggi-do (37.366734°N) and in longitude from Jindo-gun, Jeollanam-do (126.054925°E) to Yangsan-si, Gyeongsangnam-do (129.016472°E). Under the Ramsar wetland classification system, M-type wetlands (permanent rivers, streams, and creeks) were the most common habitat, followed by Tp-type (permanent freshwater marshes and pools; 45.8%) and F-type (estuarine waters; 4.2%). The MaxEnt model indicated that the potential distribution with high habitat probability included Ulsan and Daegu Metropolitan City in addition to the currently known habitats. Under the future scenarios of the Intergovernmental Panel on Climate Change (IPCC), the possible distribution area was predicted to expand in the 2050s and 2090s, covering the southern and western coastal regions, the southern Daegu metropolitan area, and the eastern coastal regions. This study suggests that B. chalybea flavovittata can serve as an effective indicator species for climate change when its distribution range is monitored. Our findings will also help provide basic information for the conservation and management of co-existing native species.

Bathymetric and Topographic Changes of the Gomso-Bay Tidal Flat, West Coast of the Korean Peninsula (한반도 서해안 곰소만 갯벌의 수심 및 지형 변화)

  • Jin Ho Chang;Yong-Gil Kim;Myong Sun Lee
    • Journal of the Korean Society of Marine Environment & Safety / v.29 no.6 / pp.552-561 / 2023
  • The seafloor topography of Gomso Bay on the west coast of Korea was investigated using subtidal bathymetry and tidal-flat altimetry. Gomso Bay consists of 80% tidal flats and 20% subtidal zone, and is divided into an outer bay and an inner bay by the Jujincheon estuary channel. The outer-bay tidal flat has few tidal channels and a concave topographic profile, and is characterized by the development of cheniers and intertidal sand bars, giving it the appearance of gently sloping, dissipative beaches. The inner-bay tidal flat has wide upper and middle flats with a well-developed tidal channel system and no cheniers. Moreover, the topographic cross-sections between these tidal channels are convex upward, showing the characteristics of a depositional environment strongly influenced by tidal channels and tidal action. An analysis of the horizontal movement of the tidal-flat environment, based on changes in the iso-depth lines between 1981 and 2018, revealed that the Gomso Bay tidal flat has gradually retreated landward. Analyzing the erosion and sedimentation characteristics of Gomso Bay, and assuming that most of the water-depth changes reflect changes in seafloor elevation and sea level, an average of 1 cm (0 mm/y) of sediment was eroded in the outer bay over the past 37 years (1981-2018), whereas an average of 50 cm (14 mm/y) was deposited in the inner bay. Notably, the high tidal flats of the outer bay were largely eroded. Monitoring photographs of the coast showed that most of this erosion occurred in a short period around 1999 (probably 1997-2002) and resulted from the erosion of sand dunes and high-tide beaches caused by temporarily raised high-tide levels and storms.

A Study on the Meaning and Strategy of Keyword Advertising Marketing

  • Park, Nam Goo
    • Journal of Distribution Science / v.8 no.3 / pp.49-56 / 2010
  • At the initial stage of Internet advertising, banner advertising came into fashion. As the Internet became a central part of daily life and competition in the online advertising market grew fierce, there was not enough space for banner advertising, which rushed to portal sites only. All these factors were responsible for an upsurge in advertising prices. Consequently, the high-cost, low-efficiency problems of banner advertising were raised, leading to the emergence of keyword advertising as a new type of Internet advertising to replace its predecessor. In the early 2000s, when Internet advertising took off, display advertising, including banner advertising, dominated the Net. However, display advertising showed signs of gradual decline and registered negative growth in 2009, whereas keyword advertising grew rapidly and began to outdo display advertising from 2005. Keyword advertising is the technique of exposing relevant advertisements at the top of search result pages when a user searches for a keyword. Instead of exposing advertisements to unspecified individuals like banner advertising, keyword advertising, a targeted advertising technique, shows advertisements only when customers search for a desired keyword, so that only highly prospective customers see them. In this context, it is also referred to as search advertising. It is regarded as more aggressive advertising with a higher hit rate than earlier forms in that, instead of the seller discovering customers and running an advertisement at them as with TV, radio, or banner advertising, it exposes advertisements to customers who come looking. Keyword advertising makes it possible for a company to seek publicity online with a single word and to achieve maximum efficiency at minimum cost.
The strong point of keyword advertising is that customers can directly reach the products in question through advertising that is more efficient than mass-media advertising such as TV and radio. Its weak point is that a company must register its advertisement on each and every portal site and finds it hard to exercise substantial supervision over its advertisements, with the possibility of advertising expenses exceeding profits. Keyword advertising serves as the most appropriate advertising method for the sales and publicity of small and medium enterprises, which need maximum advertising effect at low cost. At present, keyword advertising is divided into CPC advertising and CPM advertising. The former, known as the most efficient technique, is also referred to as advertising based on a metered-rate system: a company pays for the number of clicks on the keywords that users have searched. This model is adopted by Overture, Google's AdWords, Naver's Clickchoice, and Daum's Clicks, among others. CPM advertising depends on a flat-rate payment system, making a company pay on the basis of the number of exposures rather than the number of clicks. This method fixes a price per 1,000 exposures and is mainly adopted by Naver's Timechoice, Daum's Speciallink, and Nate's Speedup, among others. At present, the CPC method is most frequently adopted. Its weak point is that advertising costs can rise through repeated clicks from the same IP. If a company maximizes the strengths of keyword advertising and complements its weaknesses, it is highly likely to turn visitors into prospective customers.
Accordingly, an advertiser should analyze customers' behavior and approach them in a variety of ways, trying hard to find out what they want. With this in mind, he or she should use multiple keywords when running ads. When first running an ad, priority should be given to keyword selection: the advertiser should consider how many search engine users will click the keyword in question and how much the advertisement will cost. As the popular keywords that search engine users frequently enter are expensive in terms of cost per click, advertisers with limited budgets at the initial phase should pay attention to detailed keywords suited to their budget. Detailed keywords, also referred to as peripheral or extension keywords, are combinations of major keywords. Most keyword advertisements are text-based. The biggest strength of text-based advertising is that it looks like search results, causing little antipathy; but precisely because most keyword advertising is text, it fails to attract much attention. Image-embedded advertising is easier to notice because of its images, but it is exposed on the lower part of a web page and is clearly recognized as an advertisement, leading to a low click-through rate. Its strength, however, is that its prices are lower than those of text-based advertising. If a company owns a logo or a product that people readily recognize, it is well advised to use image-embedded advertising to attract Internet users' attention. Advertisers should analyze their logs and examine customers' responses based on site events and product composition, as a vehicle for monitoring customer behavior in detail.
Keyword advertising also allows advertisers to analyze the advertising effects of exposed keywords through log analysis. Log analysis refers to a close analysis of the current situation of a site based on information about visitors: the number of visitors, page views, and cookie values. A user's IP, the pages used, the time of use, and cookie values are stored in the log files generated by each Web server. The log files contain a huge amount of data, and as it is almost impossible to analyze them directly, one analyzes them using log analysis solutions. The generic information that can be extracted from log analysis tools includes total page views, average page views per day, basic page views, page views per visit, total hits, average hits per day, hits per visit, the number of visits, average visits per day, the net number of visitors, average visitors per day, one-time visitors, visitors who have come more than twice, and average usage hours. Such data are useful for analyzing the situation and status of rival companies as well as for benchmarking. As keyword advertising exposes advertisements exclusively on search-result pages, competition among advertisers attempting to preoccupy popular keywords is very fierce. Some portal sites keep giving priority to existing advertisers, whereas others allow all advertisers to purchase the keywords in question once the advertising contract expires.
If an advertiser relies on keywords sensitive to seasons and timeliness on sites that give priority to established advertisers, he or she should purchase a vacant advertising slot in advance so as not to miss the right timing for advertising. Naver, however, does not give priority to existing advertisers for keyword advertisements; in this case, one can preoccupy keywords by entering into a contract after confirming the advertising contract period. This study takes a look at marketing for keyword advertising and presents effective strategies for keyword advertising marketing. At present, the Korean CPC advertising market is virtually monopolized by Overture, whose strengths are its CPC charging model and the registration of advertisements at the top of the most representative portal sites in Korea. These advantages make it the most appropriate medium for small and medium enterprises. However, the CPC method of Overture has weaknesses too: it is not the only, or a perfect, advertising model among search advertisements in the online market. It is therefore essential that small and medium enterprises, including independent shopping malls, complement the weaknesses of the CPC method and use strategies that maximize its strengths so as to increase their sales and create points of contact with customers.
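The cost difference between the two billing models described above can be illustrated with a small calculation; all rates, traffic figures, and the 2% click-through rate below are hypothetical, not market data:

```python
# CPC (cost per click) charges per click; CPM (flat-rate) charges per 1,000
# exposures. All prices and traffic figures here are assumptions for
# illustration only.

def cpc_cost(clicks, price_per_click):
    """Total cost under the metered-rate (CPC) model."""
    return clicks * price_per_click

def cpm_cost(impressions, price_per_1000):
    """Total cost under the flat-rate (CPM) model."""
    return impressions / 1000 * price_per_1000

impressions = 100_000
ctr = 0.02                       # 2% click-through rate (assumed)
clicks = int(impressions * ctr)  # 2,000 clicks

print(cpc_cost(clicks, 500))        # 2,000 clicks x 500 = 1,000,000
print(cpm_cost(impressions, 3000))  # 100 x 3,000 = 300,000
```

Which model is cheaper depends entirely on the click-through rate and the relative prices, which is why the text stresses matching the billing model to the campaign.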
