• Title/Summary/Keyword: cost-effective monitoring


Fabrication of 3D Paper-based Analytical Device Using Double-Sided Imprinting Method for Metal Ion Detection (양면 인쇄법을 이용한 중금속 검출용 3D 종이 기반 분석장치 제작)

  • Jinsol, Choi;Heon-Ho, Jeong
    • Clean Technology / v.28 no.4 / pp.323-330 / 2022
  • Microfluidic paper-based analytical devices (μPADs) have recently been in the spotlight for their applicability to point-of-care diagnostics and environmental material detection. This study presents a double-sided printing method for fabricating 3D-μPADs that provides simple and cost-effective metal ion detection. The 3D-μPAD design was cut into an acrylic stamp by laser cutting, which was then coated with a thin layer of PDMS by spin coating. The fabricated stamp was used to form the 3D structure of the hydrophobic barrier through a double-sided contact printing method. Fabrication of the 3D hydrophobic barrier within a single sheet was optimized by controlling the spin-coating rate, reagent ratio, and contact time. The optimal conditions were found by analyzing the area change of the PDMS hydrophobic barrier and the hydrophilic channel using ink on chromatography paper. Using the 3D-μPAD fabricated under the optimized conditions, Ni²⁺, Cu²⁺, Hg²⁺, and pH were detected at different concentrations, and the results were expressed as grayscale color intensity for quantitative analysis using ImageJ. This study demonstrated that a 3D-μPAD biosensor can be applied to detect metal ions without special analytical equipment. The 3D-μPAD provides a highly portable and rapid on-site monitoring platform for detecting multiple heavy metal ions with extremely high repeatability, which is useful for resource-limited areas and developing countries.
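As a rough illustration of the grayscale-intensity readout described in this abstract (the study itself uses ImageJ), the Python sketch below averages the grayscale intensity inside a detection-zone region of interest and compares it with a blank zone. The file name, ROI coordinates, and zone positions are hypothetical placeholders, not values from the paper.

```python
# Hypothetical sketch of grayscale colorimetric quantification,
# analogous in spirit to the ImageJ workflow described in the abstract.
import numpy as np
from PIL import Image

def zone_intensity(image_path, box):
    """Mean grayscale intensity (0-255) of a detection zone.

    box = (left, upper, right, lower) pixel coordinates of the ROI.
    """
    gray = Image.open(image_path).convert("L")      # 8-bit grayscale
    roi = np.asarray(gray.crop(box), dtype=float)
    return roi.mean()

# Example: compare a sample zone against a blank zone on the same scan.
# Paths and coordinates are placeholders.
blank  = zone_intensity("upad_scan.png", (100, 100, 160, 160))
sample = zone_intensity("upad_scan.png", (200, 100, 260, 160))
signal = blank - sample   # darker color -> lower gray value -> larger signal
print(f"color signal: {signal:.1f}")
```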

Corporate Bankruptcy Prediction Model using Explainable AI-based Feature Selection (설명가능 AI 기반의 변수선정을 이용한 기업부실예측모형)

  • Gundoo Moon;Kyoung-jae Kim
    • Journal of Intelligence and Information Systems / v.29 no.2 / pp.241-265 / 2023
  • A corporate insolvency prediction model serves as a vital tool for objectively monitoring the financial condition of companies. It enables timely warnings, facilitates responsive actions, and supports the formulation of effective management strategies to mitigate bankruptcy risks and enhance performance. Investors and financial institutions utilize default prediction models to minimize financial losses. As interest in utilizing artificial intelligence (AI) technology for corporate insolvency prediction grows, extensive research has been conducted in this domain. However, there is an increasing demand for explainable AI models in corporate insolvency prediction, emphasizing interpretability and reliability. The SHAP (SHapley Additive exPlanations) technique has gained significant popularity and has demonstrated strong performance in various applications. Nonetheless, it has limitations such as computational cost, processing time, and scalability concerns as the number of variables grows. This study introduces a novel approach to variable selection that reduces the number of variables by averaging SHAP values computed on bootstrapped data subsets instead of on the entire dataset. This technique aims to improve computational efficiency while maintaining excellent predictive performance. To obtain classification results, we train random forest, XGBoost, and C5.0 models using the carefully selected, highly interpretable variables. The classification accuracy of an ensemble model generated through soft voting, designed for high performance, is compared with that of the individual models. The study leverages data from 1,698 Korean light-industrial companies and employs bootstrapping to create distinct data groups. Logistic regression is employed to calculate SHAP values for each data group, and their averages are computed to derive the final SHAP values. The proposed model enhances interpretability and aims to achieve superior predictive performance.
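A minimal sketch of the variable-selection idea described above: SHAP values are computed on bootstrap subsets with a logistic regression surrogate and averaged to rank features, rather than explaining the full dataset at once. The subset count, sample fraction, and cutoff are illustrative assumptions, not the paper's settings.

```python
# Illustrative sketch: average |SHAP| over bootstrap subsets to rank
# features, then keep the top-k for downstream tree/ensemble models.
import numpy as np
import pandas as pd
import shap
from sklearn.linear_model import LogisticRegression

def bootstrap_shap_ranking(X: pd.DataFrame, y: pd.Series,
                           n_subsets: int = 10, sample_frac: float = 0.6,
                           seed: int = 0) -> pd.Series:
    rng = np.random.default_rng(seed)
    importance = np.zeros(X.shape[1])
    for _ in range(n_subsets):
        idx = rng.choice(len(X), size=int(sample_frac * len(X)), replace=True)
        Xb, yb = X.iloc[idx], y.iloc[idx]
        model = LogisticRegression(max_iter=1000).fit(Xb, yb)
        explainer = shap.LinearExplainer(model, Xb)        # SHAP for the linear surrogate
        sv = np.asarray(explainer.shap_values(Xb))
        importance += np.abs(sv).mean(axis=0)              # mean |SHAP| per feature
    return pd.Series(importance / n_subsets, index=X.columns).sort_values(ascending=False)

# ranking = bootstrap_shap_ranking(X, y)
# selected = ranking.head(20).index   # e.g. keep the 20 highest-ranked ratios
```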

Developments of Space Radiation Dosimeter using Commercial Si Radiation Sensor (범용 실리콘 방사선 센서를 이용한 우주방사선 선량계 개발)

  • Jong-kyu Cheon;Sunghwan Kim
    • Journal of the Korean Society of Radiology / v.17 no.3 / pp.367-373 / 2023
  • Aircrews and passengers are exposed to radiation from cosmic rays and from secondary scattered rays generated by reactions with the air or the aircraft. For aircrews, radiation safety management is based on the exposure dose calculated using a space-weather environment simulation. However, because the exposure dose varies with solar activity, altitude, flight path, and other factors, direct measurement along each route is more informative than calculation alone. In this study, we developed an instrument to measure the cosmic radiation dose using a general-purpose Si sensor and a multichannel analyzer. The dose calculation applied the algorithm of CRaTER (Cosmic Ray Telescope for the Effects of Radiation), a space radiation measuring device of NASA. Energy and dose calibration was performed with Cs-137 662 keV gamma rays at a standard calibration facility, and a good dose-rate response was confirmed over the experimental range. Using the instrument, the dose was directly measured on the international route between Dubai and Incheon in May 2023, and it agreed within 12% with the result calculated by KREAM (Korean Radiation Exposure Assessment Model for Aviation Route Dose). It was confirmed that the dose increased with altitude and latitude, consistent with the KREAM calculation results. Some limitations require further verification experiments. However, we confirmed that the instrument has sufficient potential as a cost-effective device for monitoring exposure dose inside aircraft or on personal aircraft.
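For orientation only: the abstract does not detail the CRaTER algorithm, but the basic step common to silicon-spectrometer dosimetry is converting the per-event deposited-energy spectrum into an absorbed dose. The sketch below shows that step; the detector mass and the silicon-to-tissue conversion factor are assumed placeholders, not values from the paper.

```python
# Rough sketch: absorbed dose from a deposited-energy spectrum in a Si sensor.
# Detector mass and Si-to-tissue factor are illustrative assumptions.
KEV_TO_J = 1.602176634e-16   # joules per keV

def absorbed_dose_uGy(deposited_keV, detector_mass_kg=1.0e-4,
                      si_to_tissue=1.33):
    """Dose in microgray from a list of per-event deposited energies (keV)."""
    energy_J = sum(deposited_keV) * KEV_TO_J
    dose_Gy_si = energy_J / detector_mass_kg    # D = E / m, dose in silicon
    return dose_Gy_si * si_to_tissue * 1e6      # assumed tissue scaling, in uGy

# e.g. accumulate absorbed_dose_uGy(events_from_mca) per flight segment
```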

A Method for the Effective Implementation of a Consignment Contract in Road Constructions (도로 수탁공사의 효과적 수행을 위한 방법론)

  • Bak, Gwon-June;Kim, Sung-Keun
    • KSCE Journal of Civil and Environmental Engineering Research / v.30 no.2D / pp.153-161 / 2010
  • The city planning of a local government is a continuous process that does not end with the creation of a plan but proceeds through decision-making, monitoring, and evaluation phases. When a city plan is revised and confirmed, a large-scale road connected to a road already under construction may need to be built. In such cases, widening and lengthening of the road, the addition of bridges or tunnels, and changes in the size and location of interchanges lead to many changes in road design and construction. In the past, consignment contracts for road construction were concluded in limited numbers and for limited civil works; they are now growing in number and increasingly cover large-scale, multi-work projects. However, a standard process and guidelines for consignment contracts have not yet been suggested, so it is difficult to perform consigned road construction effectively. In this paper, the important factors for consignment contracts are determined through construction document reviews and expert interviews. Based on these results, a standard process for consignment contracts and a guideline for agreeing on construction cost are suggested. The costs that should be borne by the consignor are also defined.

Using Trophic State Index (TSI) Values to Draw Inferences Regarding Phytoplankton Limiting Factors and Seston Composition from Routine Water Quality Monitoring Data (영양상태지수 (trophic state index)를 이용한 수체 내 식물플랑크톤 제한요인 및 seston조성의 유추)

  • Havens, Karl E
    • Korean Journal of Ecology and Environment / v.33 no.3 s.91 / pp.187-196 / 2000
  • This paper describes a simple method that uses differences among Carlson's (1977) trophic state index (TSI) values based on total phosphorus (TP), chlorophyll a (CHL), and Secchi depth (SD) to draw inferences regarding the factors limiting phytoplankton growth and the composition of lake seston. Examples are provided regarding seasonal and spatial patterns in a large subtropical lake (Lake Okeechobee, Florida, USA) and inter- and intra-lake variations from a multi-lake data set developed from published studies. Once an investigator has collected routine water quality data and established TSI values based on TP, CHL, and SD, a number of inferences can be made. Additional information can be provided where it is also possible to calculate a TSI based on total nitrogen (TN). Where TSI(CHL) > TSI(SD), light-attenuating particles are large (large filaments or colonies of algae), and the phytoplankton may be limited by zooplankton grazing. Other limiting conditions are inferred from different relationships between the TSI values. Results of this study indicate that the analysis is quite robust and that it generally gives good agreement with conclusions based on more direct methods (e.g., nutrient-addition bioassays, zooplankton size data, zooplankton removal experiments). The TSI approach, when validated periodically with these more costly and time-intensive methods, provides an effective, low-cost method for tracking long-term changes in pelagic structure and function, with potential value in monitoring lake ecology and responses to management.
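For readers who want to apply the approach to their own monitoring data, the sketch below computes Carlson's (1977) TSI values from TP, CHL, and SD and the deviations used for the kinds of inferences described above. The coefficients are the standard published ones; the input numbers and the interpretive comments are illustrative, not taken from the paper.

```python
# Carlson (1977) trophic state indices and the deviations used for inference.
import math

def tsi_sd(sd_m):        # Secchi depth in meters
    return 60.0 - 14.41 * math.log(sd_m)

def tsi_chl(chl_ug_l):   # chlorophyll a in micrograms per liter
    return 9.81 * math.log(chl_ug_l) + 30.6

def tsi_tp(tp_ug_l):     # total phosphorus in micrograms per liter
    return 14.42 * math.log(tp_ug_l) + 4.15

# Illustrative values, not from the paper.
tp, chl, sd = 60.0, 20.0, 0.8
dev_chl_sd = tsi_chl(chl) - tsi_sd(sd)   # > 0 suggests large particles / grazing control
dev_chl_tp = tsi_chl(chl) - tsi_tp(tp)   # < 0 suggests a P surplus relative to chlorophyll
print(round(dev_chl_sd, 1), round(dev_chl_tp, 1))
```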


A Study on the establishment of IoT management process in terms of business according to Paradigm Shift (패러다임 전환에 의한 기업 측면의 IoT 경영 프로세스 구축방안 연구)

  • Jeong, Min-Eui;Yu, Song-Jin
    • Journal of Intelligence and Information Systems / v.21 no.2 / pp.151-171 / 2015
  • This study examines the concept of the Internet of Things (IoT), the major issues, and IoT trends in the domestic and international markets, and reviews how the advent of the IoT era has brought about a paradigm shift. It proposes an appropriate response strategy from the enterprise perspective. Global competition in the IoT market has begun, so for businesses to remain competitive and responsive, the efforts of companies themselves are needed as well as those of government. In particular, a faster and more efficient strategy is required to cope with this dynamic environment. In other words, we propose a management strategy that can respond to the competitive IoT era at its tipping point through the lens of a paradigm shift. Through a comparative analysis of past management paradigms and the IoT management paradigm, we forecast the emergence of four paradigms: (i) knowledge- and learning-oriented management, (ii) technology- and innovation-oriented management, (iii) demand-driven management, and (iv) global collaboration management. Knowledge- and learning-oriented management is expected to become a new management paradigm owing to the development of IT and information-processing technology; beyond the rapid development of IT infrastructure and of data processing and storage, knowledge sharing and learning have become more important. The current hardware-oriented management paradigm will shift to a software-oriented one; in particular, the software and platform market, a key component of the IoT ecosystem, is expected to be led by technology- and innovation-oriented management. In 2011, Gartner announced the concept of Demand-Driven Value Networks (DDVN), which emphasizes the value of the network as a whole; the demand-driven management paradigm therefore creates demand through advanced processes rather than merely responding to existing demand. The global collaboration management paradigm creates value through fusion between technologies, countries, and industries. In particular, cooperation between large enterprises, which have financial resources and brand power, and venture companies, which have creative ideas and technology, will generate positive synergies, building a win-win environment for large and small companies alike. To establish an enterprise management process that copes with this paradigm shift, this study utilizes the RTE (Real Time Enterprise) cyclone model proposed by Gartner. The RTE concept consists of three stages: Lead, Operate, and Manage. The Lead stage utilizes capital to strengthen business competitiveness; it links external stimuli to strategy development and executes the company's business strategy through capital and investment activities in response to environmental change. The Manage stage responds appropriately to threats and internalizes the goals of the enterprise. The Operate stage takes action to increase the efficiency of services across the enterprise, achieving integration and simplification of processes with real-time data capture. Because the RTE concept has practical value as a management strategy, we apply it here and propose an 'IoT-RTE Cyclone model' that emphasizes enterprise agility and is based on real-time monitoring, analysis, and action through IT and IoT technology.
The IoT-RTE Cyclone model integrates the business processes of each sector of the enterprise and supports its overall services, and can therefore be used as an effective response strategy. In particular, as the model responds to external events and the cycle is repeated, wasteful elements are removed, making process operation more efficient and agile. Because it supports the overall services of the enterprise, the IoT-RTE Cyclone model can serve as an effective enterprise response strategy in the rapidly changing IoT era. When the model additionally leverages a collaborative system among enterprises, breakthrough cost savings can be expected through improved competitiveness, shorter global lead times, and minimized duplication.

A Study on the Meaning and Strategy of Keyword Advertising Marketing

  • Park, Nam Goo
    • Journal of Distribution Science / v.8 no.3 / pp.49-56 / 2010
  • At the initial stage of Internet advertising, banner advertising came into fashion. As the Internet developed into a central part of daily life and competition in the online advertising market grew fierce, there was not enough space for banner advertising, which was concentrated on portal sites, and all of these factors were responsible for an upsurge in advertising prices. Consequently, the high-cost, low-efficiency problems of banner advertising were raised, which led to the emergence of keyword advertising as a new type of Internet advertising to replace its predecessor. In the early 2000s, when Internet advertising became active, display advertising, including banner advertising, dominated the Net. However, display advertising showed signs of gradual decline and registered negative growth in 2009, whereas keyword advertising grew rapidly and began to outdo display advertising from 2005. Keyword advertising refers to the advertising technique that exposes relevant advertisements at the top of search sites when a user searches for a keyword. Instead of exposing advertisements to unspecified individuals as banner advertising does, keyword advertising, a targeted advertising technique, shows advertisements only when customers search for a desired keyword, so that only highly prospective customers are given a chance to see them; in this context, it is also referred to as search advertising. It is regarded as more aggressive advertising with a higher hit rate than earlier forms in that, instead of the seller discovering customers and running advertisements for them as with TV, radio, or banner advertising, it exposes advertisements to customers who come looking. Keyword advertising makes it possible for a company to seek publicity online simply by making use of a single word and to achieve maximum efficiency at minimum cost. The strong point of keyword advertising is that customers are allowed to contact the products in question directly through advertising that is more efficient than that of mass media such as TV and radio. The weak point of keyword advertising is that a company must register its advertisement on each and every portal site, finds it hard to exercise substantial supervision over its advertisements, and runs the risk of its advertising expenses exceeding its profits. Keyword advertising serves as the most appropriate method of advertising for the sales and publicity of small and medium enterprises, which need maximum advertising effect at low advertising cost. At present, keyword advertising is divided into CPC advertising and CPM advertising. The former is known as the most efficient technique and is also referred to as advertising based on a metered-rate system: a company pays according to the number of clicks on the keyword that users have searched. This model is representatively adopted by Overture, Google's Adwords, Naver's Clickchoice, and Daum's Clicks. CPM advertising depends on a flat-rate payment system, making a company pay for its advertisement on the basis of the number of exposures rather than the number of clicks; this method fixes a price for advertisement on the basis of 1,000 exposures and is mainly adopted by Naver's Timechoice, Daum's Speciallink, and Nate's Speedup. At present, the CPC method is most frequently adopted (a simple cost comparison of the two charging models is sketched after this entry).
The weak point of the CPC method is that advertising costs can rise through repeated clicks from the same IP. If a company makes good use of strategies for maximizing the strong points of keyword advertising and complementing its weak points, it is highly likely to turn visitors into prospective customers. Accordingly, an advertiser should analyze customers' behavior and approach them in a variety of ways, trying hard to find out what they want. With this in mind, he or she should use multiple keywords when running ads. When first running an ad, the advertiser should give priority to which keywords to select, considering how many search engine users will click the keyword in question and how much the advertisement will cost. As the popular keywords that search engine users frequently use carry a high unit cost per click, advertisers without much money for advertising at the initial phase should pay attention to detailed keywords suited to their budget. Detailed keywords, also referred to as peripheral keywords or extension keywords, are combinations of major keywords. Most keywords are in the form of text. The biggest strong point of text-based advertising is that it looks like search results, causing little antipathy; but it fails to attract much attention precisely because most keyword advertising is text-based. Image-embedded advertising is easy to notice thanks to its images, but it is exposed on the lower part of a web page and is perceived as an advertisement, which leads to a low click-through rate; its strong point, however, is that its prices are lower than those of text-based advertising. If a company owns a logo or a product that is easy for people to recognize, the company is well advised to make good use of image-embedded advertising to attract Internet users' attention. Advertisers should analyze their logs and examine customers' responses based on the events of the sites in question and the composition of products, as a means of monitoring customer behavior in detail. Keyword advertising also allows them to analyze the advertising effects of exposed keywords through log analysis. Log analysis refers to a close analysis of the current situation of a site based on information about visitors, drawing on the number of visitors, page views, and cookie values. A user's IP, the pages used, the time of use, and cookie values are stored in the log files generated by each Web server. The log files contain a huge amount of data; as it is almost impossible to analyze them directly, one typically analyzes them using log analysis solutions. The generic information that can be extracted from log analysis tools includes the total number of page views, the average number of page views per day, the number of basic page views, the number of page views per visit, the total number of hits, the average number of hits per day, the number of hits per visit, the number of visits, the average number of visits per day, the net number of visitors, the average number of visitors per day, one-time visitors, visitors who have come more than twice, and average usage hours.
Such data are also useful for analyzing the situation and current status of rival companies and for benchmarking. As keyword advertising exposes advertisements exclusively on search-result pages, competition among advertisers attempting to secure popular keywords is very fierce. Some portal sites keep giving priority to existing advertisers, whereas others give all advertisers the chance to purchase the keywords in question once the advertising contract is over. If an advertiser relies on keywords sensitive to seasons and timeliness on sites that give priority to established advertisers, he or she should purchase a vacant advertising slot so as not to miss the appropriate timing. Naver, however, does not give priority to existing advertisers for any keyword advertisements; in this case, one can secure keywords by entering into a contract after confirming the contract period for advertising. This study is designed to take a look at marketing for keyword advertising and to present effective strategies for keyword advertising marketing. At present, the Korean CPC advertising market is virtually monopolized by Overture. Its strong points are that Overture is based on the CPC charging model and that its advertisements are registered at the top of the most representative portal sites in Korea; these advantages make it the most appropriate medium for small and medium enterprises. However, the CPC method of Overture has its weak points too: it is not a perfect advertising model among the search advertisements in the online market. It is therefore absolutely necessary that small and medium enterprises, including independent shopping malls, complement the weaknesses of the CPC method and make good use of strategies for maximizing its strengths so as to increase their sales and create a point of contact with customers.
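As a quick numerical illustration of the CPC versus CPM pricing described in this entry, the sketch below compares the expected cost of the two charging models for an assumed number of impressions and click-through rate; all figures are hypothetical, not taken from the article.

```python
# Hypothetical comparison of CPC (pay per click) vs CPM (pay per 1,000 exposures).
def cpc_cost(impressions, ctr, price_per_click):
    return impressions * ctr * price_per_click

def cpm_cost(impressions, price_per_1000):
    return impressions / 1000 * price_per_1000

impressions = 50_000          # assumed exposures of the keyword ad
ctr = 0.02                    # assumed 2% click-through rate
print(cpc_cost(impressions, ctr, price_per_click=300))   # e.g. 300 KRW per click -> 300,000 KRW... actually 30,000 KRW
print(cpm_cost(impressions, price_per_1000=2_000))       # e.g. 2,000 KRW per 1,000 exposures -> 100,000 KRW
```

Under these assumed numbers the CPC model is cheaper; with a higher click-through rate or click price the comparison can reverse, which is the trade-off the abstract describes.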


Minimal Stimulation using rhFSH and GnRH Antagonist for IVF Treated Patients of Advanced Age (고령 불임여성의 체외수정술시 최소자극법의 효용성)

  • Kim, So-Ra;Kim, Chung-Hoon;Lee, Jin-Kyoung;Jeon, Gyun-Ho;Kim, Sung-Hoon;Chae, Hee-Dong;Kang, Byung-Moon
    • Clinical and Experimental Reproductive Medicine / v.36 no.1 / pp.63-70 / 2009
  • Objective: This study was performed to investigate the effectiveness of minimal stimulation using rhFSH and a GnRH antagonist compared with the GnRH antagonist multidose protocol (MDP) in IVF-treated patients aged 40 and above. Methods: Seventy-five patients aged 40 and above were equally randomized to the minimal stimulation group (n=37) or the GnRH antagonist MDP group (n=38). For the minimal stimulation group, ultrasound monitoring was started on cycle day 7 or 8, and daily injections of 0.25 mg cetrorelix together with 150 IU rhFSH were started from the day the leading follicle reached 13~14 mm in diameter. For the GnRH antagonist MDP group, daily injections of 225 IU rhFSH were initiated from cycle day 2, and the GnRH antagonist was started at a dose of 0.25 mg/day on rhFSH stimulation day 6 or the day the leading follicle reached 13~14 mm in diameter. In both groups, transvaginal ultrasound-guided oocyte retrieval was performed. According to the cleavage and morphologic characteristics of the embryos, embryos were transferred 3 to 5 days after oocyte retrieval. Results: There were no differences in patient characteristics or cycle cancellation rate between the two groups. The total dose and duration of rhFSH used were significantly lower and shorter in the minimal stimulation group than in the GnRH antagonist MDP group. The numbers of oocytes retrieved, mature oocytes, and transferred embryos were also lower in the minimal stimulation group. However, there were no significant differences in the clinical pregnancy rate or miscarriage rate between the two groups. Conclusions: This study demonstrates that the minimal stimulation protocol provides pregnancy rates comparable to the GnRH antagonist MDP with a lower dose and fewer days of rhFSH use, and thus can be a cost-effective alternative for women aged 40 and above.

Field Tests for Assessing the Bioremediation Feasibility of a Trichloroethylene-Contaminated Aquifer (관측정 자연표류 실험을 통한 트리클로로에틸렌(Trichloroethylene) 오염 지하수의 생물학적 복원 타당성 연구)

  • Kim Young;Kim Jin-Wook;Ha Chul-Yoon;Kim Nam-Hee;Hong Kwang-Pyo;Kwon Soo-Yul;Ahn Young-Ho;Ha Joon-Su;Park Hoo-Won
    • Journal of Soil and Groundwater Environment / v.10 no.3 / pp.38-45 / 2005
  • The feasibility of stimulating in situ aerobic cometabolic activity of indigenous microorganisms was investigated in a trichloroethylene (TCE)-contaminated aquifer. A series of single-well natural drift tests (SWNDTs) was conducted by injecting site groundwater amended with a bromide tracer and combinations of toluene, oxygen, nitrate, ethylene, and TCE into an existing monitoring well and by sampling the same well over time. Three field tests, a Push-pull Transport Test, a Drift Biostimulation Test, and a Drift Surrogate Activity Test, were performed in sequence. The initial rate of toluene degradation was much faster than the rate of bromide dilution resulting from natural groundwater drift, indicating stimulation of indigenous toluene-oxidizing microorganisms. Transformation of ethylene, a surrogate probing the overall activity of TCE transformation, was also observed, and its transformation resulted in the production of ethylene oxide, suggesting that some of the stimulated toluene-oxidizing microorganisms may express an ortho-monooxygenase enzyme. In situ transformation of TCE was also confirmed by greater retardation of TCE than of bromide after the stimulation of toluene-oxidizing microorganisms. These results indicate that, in this environment, toluene and oxygen additions stimulated the growth and aerobic cometabolic activity of indigenous microorganisms expressing ortho-monooxygenase enzymes. The simple, low-cost field test method presented in this study provides an effective means of conducting rapid field assessments and pilot testing of aerobic cometabolism, the lack of which has previously hindered application of this technology to groundwater remediation.
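A minimal sketch of the tracer-normalized analysis implied by the comparison above: dividing the reactive solute concentration by the conserved bromide tracer removes the effect of dilution from natural drift, so the decline of the ratio reflects biodegradation. The data values and the first-order fit are illustrative assumptions, not the study's measurements.

```python
# Illustrative tracer-normalized first-order rate estimate for a
# single-well natural drift test (all values are made up).
import numpy as np

t_hr    = np.array([0, 6, 12, 24, 48], dtype=float)
toluene = np.array([5.0, 3.2, 1.9, 0.7, 0.1])   # mg/L, reactive solute
bromide = np.array([50., 45., 41., 34., 25.])   # mg/L, conservative tracer

ratio = (toluene / toluene[0]) / (bromide / bromide[0])  # dilution-corrected
k, _ = np.polyfit(t_hr, np.log(ratio), 1)                # slope of ln(ratio) vs time
print(f"apparent first-order degradation rate: {-k:.3f} 1/hr")
```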

How effective has the Wairau River erodible embankment been in removing sediment from the Lower Wairau River?

  • Kyle, Christensen
    • Proceedings of the Korea Water Resources Association Conference / 2015.05a / pp.237-237 / 2015
  • The district of Marlborough has had more than its share of river management projects over the past 150 years, each one uniquely affecting the geomorphology and flood hazard of the Wairau Plains. A major early project was to block the Opawa distributary channel at Conders Bend. The Opawa distributary channel took a third or more of the Wairau River floodwaters and was a growing threat to Blenheim. Blocking the Opawa required the Wairau and Lower Wairau rivers to carry greater flood flows more often; consequently the Lower Wairau River was breaking out of its stopbanks approximately every seven years. The idea of diverting flood waters at Tuamarina by providing a direct diversion to the sea through the beach ridges was conceptualised back around the 1920s; however, limits on resources and machinery meant that excavating this diversion did not become feasible until the 1960s. In 1964 a 10 m wide pilot channel was cut from the sea to Tuamarina with an initial capacity of 700 m³/s. It was expected that floods would eventually scour this 'Wairau Diversion' to its design channel width of 150 m. This took many more years than initially thought, but after approximately 50 years, with a little mechanical assistance, the Wairau Diversion reached an adequate capacity. Using the power of the river to erode the channel out to its design width and depth was a brilliant idea that saved many thousands of dollars in construction costs, and it is somewhat ironic that the very same concept is now being used to deal with the aggradation problem that the Wairau Diversion has caused. The introduction of the Wairau Diversion did provide some flood relief to the lower reaches of the river, but unfortunately, as the Diversion channel was eroding and enlarging, the Lower Wairau River was aggrading and losing capacity due to its inability to pass its sediment load with reduced flood flows. It is estimated that approximately 2,000,000 m³ of sediment was deposited on the bed of the Lower Wairau River between the Diversion's introduction in 1964 and 2010, raising the Lower Wairau's bed by upwards of 1.5 m in some locations. A numerical morphological model (MIKE-11 ST) was used to assess a number of options, which led to the decision and resource consent to construct an erodible (fuse plug) bank at the head of the Wairau Diversion to divert more frequent scouring flows (+400 m³/s) down the Lower Wairau River. Full control gates were ruled out on the grounds of expense. The initial construction of the erodible bank followed in late 2009, with the bank's level at the fuse location set to overtop and begin washing out at a combined Wairau flow of 1,400 m³/s, which avoids berm flooding in the Lower Wairau. In the three years since the erodible bank was first constructed, the Wairau River has sustained 14 events with recorded flows at Tuamarina above 1,000 m³/s and three events in excess of 2,500 m³/s. These freshes and floods have resulted in washout and rebuild of the erodible bank eight times, with a combined rebuild expenditure of $80,000. Marlborough District Council's Rivers & Drainage Department maintains a regular monitoring programme for the bed of the Lower Wairau River, which consists of recurrently surveying a series of standard cross sections and estimating the mean bed level (MBL) at each section as well as an overall MBL change over time.
A survey was carried out just prior to the installation of the erodible bank and another survey was carried out earlier this year. The results from this latest survey show that, for the first time since construction of the Wairau Diversion, the Lower Wairau River is enlarging. It is estimated that the entire bed of the Lower Wairau has eroded down by an overall average of 60 mm since the introduction of the erodible bank, which equates to a total volume of 260,000 m³. At a cost of $0.30/m³ this represents excellent value compared with mechanical dredging, which would likely cost in excess of $10/m³. This confirms that the idea of using the river to enlarge the channel is again working for the Wairau River system and that, in time, nature's "excavator" will provide a channel capacity that continues to meet design requirements.
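The cost comparison at the end of this entry follows directly from the survey figures; the short check below reproduces the arithmetic. The bed area is back-calculated from the reported volume and mean bed level change, so treat it as an implied value rather than a measured one.

```python
# Back-of-envelope check of the reported erosion volume and cost comparison.
mbl_drop_m  = 0.060                    # reported mean bed level change (60 mm)
volume_m3   = 260_000                  # reported eroded volume
bed_area_m2 = volume_m3 / mbl_drop_m   # implied bed area, roughly 4.3 million m^2

cost_erodible = volume_m3 * 0.30       # ~$0.30 per m^3 (erodible-bank rebuilds)
cost_dredging = volume_m3 * 10.0       # >$10 per m^3 mechanical dredging estimate
print(bed_area_m2, cost_erodible, cost_dredging)   # ~4.33e6 m^2, $78,000 vs $2.6M
```

The $78,000 figure from this check is consistent with the reported $80,000 spent on the eight rebuilds of the erodible bank.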
