• Title/Summary/Keyword: Demand Curve

Comparison of Approximate Nonlinear Methods for Incremental Dynamic Analysis of Seismic Performance (내진성능의 증분동적해석을 위한 비선형 약산법의 비교 검토)

  • Bae, Kyeong-Geun;Yu, Myeong-Hwa;Kang, Pyeong-Doo;Kim, Jae-Ung
    • Journal of the Earthquake Engineering Society of Korea
    • /
    • v.12 no.1
    • /
    • pp.79-87
    • /
    • 2008
  • Seismic performance evaluation of a structure requires estimating the structural performance in terms of the displacement demand imposed on the structure by earthquakes. Incremental Dynamic Analysis (IDA) is an analysis method that has recently emerged for estimating structural performance under earthquakes. By subjecting the structure to increasing levels of ground acceleration, the method captures the entire range of structural behavior, from the linear elastic stage through yielding to final collapse. Most structures are expected to deform beyond the limit of linearly elastic behavior when subjected to strong ground motion. Among the various nonlinear analysis methods, nonlinear response history analysis (NRHA) is the most accurate for computing the seismic performance of structures, but it is time-consuming and demands considerable effort. Approximate nonlinear methods, which are more practical and reliable tools for predicting the seismic behavior of structures, have therefore been studied extensively. Uncoupled modal response history analysis (UMRHA) finds the nonlinear response of a structure from an equivalent single-degree-of-freedom (ESDF) system based on the pushover curve, using NRHA or a response spectrum. Direct spectrum analysis (DSA) is an approximate nonlinear method that evaluates the nonlinear response of structures without iterative computation, given the structure's linear vibration period and the yield strength from a pushover analysis. In this study, the practicality and reliability of these approximate nonlinear methods are compared for incremental dynamic analysis of mixed building structures.
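
The core of IDA as described in this abstract — repeatedly scaling a ground motion record upward and recording the peak nonlinear response at each intensity level — can be sketched as follows. This is a minimal illustration only, not the UMRHA or DSA procedures compared in the paper; the elastic-perfectly-plastic oscillator, the toy harmonic record, and all parameter values are hypothetical.

```python
import math

def peak_drift(accel, dt, period=1.0, zeta=0.05, fy=1.0):
    """Peak displacement of an elastic-perfectly-plastic SDOF oscillator
    (unit mass) under ground acceleration `accel`, integrated by the
    explicit central-difference method."""
    w = 2.0 * math.pi / period
    k, c = w * w, 2.0 * zeta * w          # stiffness and damping per unit mass
    u_prev = u = fs = peak = 0.0
    for ag in accel:
        v = (u - u_prev) / dt
        a = -ag - c * v - fs              # equation of motion: u'' = -ag - c*u' - fs
        u_next = 2.0 * u - u_prev + a * dt * dt
        # elastic-perfectly-plastic restoring force, clipped at yield strength fy
        fs = max(-fy, min(fy, fs + k * (u_next - u)))
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# IDA loop: scale one record through increasing intensity levels and
# record the peak displacement demand at each level
dt = 0.02
record = [math.sin(2.0 * math.pi * i * dt) for i in range(400)]  # toy 1 Hz record
ida_curve = [(s, peak_drift([s * ag for ag in record], dt)) for s in (0.2, 0.5, 1.0, 2.0)]
```

Plotting intensity against the resulting peak demand gives one IDA curve per record; the paper's point is that approximate methods aim to reproduce such curves without a full NRHA at every level.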

Effect of Velocity-Pulse-Like Ground Motions on Seismic Fragility of Bridges (교량의 지진취약도에 대한 속도 펄스를 가진 지반운동의 영향)

  • Yeeun Kim;Sina Kong;Sinith Kung;Jiho Moon;Jong-Keol Song
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.37 no.2
    • /
    • pp.119-131
    • /
    • 2024
  • Pulse-like ground motion can cause greater damage to structures than non-pulse-like ground motion, and much research is being conducted to detect velocity pulses in seismic acceleration records and to quantify them. Existing ground motions are divided into far-field (FF) and near-fault ground motions based on the distance of the measurement point from the fault; near-fault ground motions are further classified as near-fault pulse-like (NFP) or near-fault non-pulse-like (NFNP) by quantifying the presence or absence of velocity pulses. For each group, 40 FF, 40 NFP, and 40 NFNP ground motions are selected, so 120 ground motions are used in the seismic analyses that assess the seismic fragility of the sample bridges. Probabilistic seismic demand models (PSDMs) are created by evaluating the seismic responses of two types of sample bridges, with lead-rubber and elastomeric rubber bearings, under the three groups of ground motions. Seismic fragility analysis is performed using the PSDMs, and from these results the effect of the presence or absence of velocity pulses on seismic fragility is evaluated. Comparison of the seismic fragility curves shows that the fragility under NFP ground motion is approximately three to five times greater than that under NFNP ground motion, which means that bridge damage is greater in the case of NFP ground motion than in the case of NFNP ground motion.
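
Fragility curves of the kind compared above are commonly expressed as a lognormal CDF of the intensity measure. A minimal sketch with entirely hypothetical median-capacity and dispersion values (the paper's PSDM parameters are not reproduced here):

```python
import math

def fragility(im, median, beta):
    """Probability of exceeding a damage state at intensity measure `im`,
    for a lognormal fragility curve with the given median and dispersion."""
    z = (math.log(im) - math.log(median)) / beta
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A record set with a lower median capacity yields higher fragility at the
# same intensity; the medians/dispersions below are purely illustrative.
p_nfp  = fragility(0.4, median=0.5, beta=0.5)   # hypothetical NFP parameters
p_nfnp = fragility(0.4, median=1.0, beta=0.5)   # hypothetical NFNP parameters
```

The "three to five times greater" finding corresponds to ratios such as `p_nfp / p_nfnp` evaluated over the intensity range of interest.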

Estimation of Stem Taper Equations and Stem Volume Table for Phyllostachys pubescens Mazel in South Korea (맹종죽의 수간곡선식 및 수간재적표 추정)

  • Eun-Ji, Bae;Yeong-Mo, Son;Jin-Taek, Kang
    • Journal of Korean Society of Forest Science
    • /
    • v.111 no.4
    • /
    • pp.622-629
    • /
    • 2022
  • The aim of this study was to derive a stem taper equation for Phyllostachys pubescens, a bamboo species in South Korea, and to develop a stem volume table. To derive the stem taper equation, three stem taper models (Max & Burkhart, Kozak, and Lee) were used. Because bamboo culms are hollow, the outer and inner diameters of each stem were calculated, and connecting them allowed the stem curves to be estimated. Among the three equations for estimating the outer and inner diameters, the Kozak model was selected as the optimal stem taper model because it had the highest fitness index and the lowest error and bias. The Kozak model was then used to estimate the diameter of Phyllostachys pubescens by stem height and to draw the stem curve. The residuals of the stem taper equation were all distributed around zero, confirming the suitability of the equation. To calculate the stem volume, a solid of revolution was created by rotating the outer-diameter stem curve through 360°, and the volume was calculated by applying Smalian's method. The net volume of Phyllostachys pubescens was obtained by subtracting the volume computed from the inner diameter from that computed from the outer diameter. The volume of Phyllostachys pubescens was only 20~30% of that of Larix kaempferi, a common timber species. However, considering the current stem density (trees/ha) of Phyllostachys pubescens and the bamboo shoots generated every year, the individual stem volume is small but the volume per hectare is comparable or perhaps greater. The significance of this study is that a stem taper equation and stem volume table for Phyllostachys pubescens were developed for the first time in South Korea. The results are expected to serve as basic data for bamboo trading, for which public and industrial demand is increasing, and for estimating carbon absorption.
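
Smalian's method, as applied above to a hollow culm, takes each section's volume as the mean of its two end cross-sectional areas times the section length, and the cavity volume (from the inner diameters) is subtracted from the outer volume. A minimal sketch; the diameters and section length below are hypothetical:

```python
import math

def smalian_volume(diameters_m, section_len_m):
    """Stem volume (m^3) by Smalian's formula: each section's volume is the
    mean of its end cross-sectional areas times the section length."""
    areas = [math.pi * (d / 2.0) ** 2 for d in diameters_m]
    return sum((a1 + a2) / 2.0 * section_len_m
               for a1, a2 in zip(areas, areas[1:]))

# Hollow culm: net volume = outer solid of revolution minus inner cavity
outer = [0.10, 0.09, 0.08, 0.06]     # hypothetical outer diameters (m) up the stem
inner = [0.07, 0.063, 0.056, 0.042]  # hypothetical inner diameters (m)
net_volume = smalian_volume(outer, 1.0) - smalian_volume(inner, 1.0)
```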

Evaluation of microplastic in the inflow of municipal wastewater treatment plant according to pretreatment methods (전처리 방법에 따른 하수처리장 유입수에서의 미세플라스틱 성상분석 평가)

  • Kim, Sungryul;Gil, Kyungik
    • Journal of Wetlands Research
    • /
    • v.24 no.2
    • /
    • pp.83-92
    • /
    • 2022
  • The amount of plastic waste has been increasing with global demand for plastic, and microplastics are among the most hazardous plastic pollutants owing to their toxicity and poorly characterized physicochemical properties. This study investigates the optimal methodology for detecting microplastics in sewage samples, as a prerequisite for discussing the reduction of microplastics in municipal wastewater treatment plants (MWTPs). The effect of different pretreatment methods on microplastic analysis of MWTP influent was investigated using samples collected from the J sewage treatment plant. Although many pretreatment methods can be applied, the two most widely used for sewage treatment plant samples are Fenton digestion and hydrogen peroxide (H2O2) oxidation. Each pretreatment method has factors that can introduce measurement error. To mitigate these, for Fenton digestion it is recommended to proceed with filtration instead of density separation, and for the H2O2 oxidation method a washing step with distilled water after the reaction is recommended. As a result of the analysis, the concentration of microplastics was measured as 2.75 ea/L for the sample using the H2O2 oxidation method and 3.2 ea/L for the sample using Fenton oxidation, and most particles were present in the form of fibers. In addition, because quantitative analysis by visual counting under a microscope is difficult to rely on by itself, calibration curves were created to verify reliability. A total of three calibration curves were drawn, and all R² values exceeded 0.9, ensuring high reliability for the quantitative analysis. 
The qualitative analysis could determine the types of microplastics flowing into the MWTP but could not confirm the chemical composition of each microplastic; this study can serve as a basis for confirming the chemical composition of microplastics entering MWTPs in future research.
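
A calibration curve like the ones described above relates a known spiked amount to the counted amount and is judged by its R². A minimal least-squares sketch; the spiked/counted values are hypothetical:

```python
def linear_fit_r2(x, y):
    """Least-squares line y = a*x + b and coefficient of determination R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical spiked-vs-counted data; R^2 >= 0.9, as in the study,
# would support the reliability of the visual counts
spiked  = [5, 10, 20, 40]         # particles added per litre
counted = [4.8, 9.5, 21.0, 38.6]  # particles recovered per litre
slope, intercept, r2 = linear_fit_r2(spiked, counted)
```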

Forecasting Substitution and Competition among Previous and New products using Choice-based Diffusion Model with Switching Cost: Focusing on Substitution and Competition among Previous and New Fixed Charged Broadcasting Services (전환 비용이 반영된 선택 기반 확산 모형을 통한 신.구 상품간 대체 및 경쟁 예측: 신.구 유료 방송서비스간 대체 및 경쟁 사례를 중심으로)

  • Koh, Dae-Young;Hwang, Jun-Seok;Oh, Hyun-Seok;Lee, Jong-Su
    • Journal of Global Scholars of Marketing Science
    • /
    • v.18 no.2
    • /
    • pp.223-252
    • /
    • 2008
  • In this study, we propose a choice-based diffusion model with switching cost that can forecast dynamic substitution and competition between previous and new products at both the individual and aggregate levels, especially when market data for the new products are insufficient. We apply the proposed model to the empirical case of substitution and competition among Analog Cable TV, which represents the previous fixed-charge broadcasting service, and Digital Cable TV and Internet Protocol TV (IPTV), the new ones; verify the validity of the proposed model; and derive related empirical implications. For the empirical application, we obtained data from a survey conducted as follows. The survey was administered by Dongseo Research to 1,000 adults aged 20 to 60 living in Seoul, Korea, in May 2007, under the title 'Demand analysis of next-generation fixed interactive broadcasting services'. A modified conjoint survey was used. First, following the traditional conjoint approach, we extracted 16 hypothetical alternative cards from an orthogonal design using important attributes and levels of next-generation interactive broadcasting services, determined from the literature and experts' comments. We then divided the 16 conjoint cards into 4 groups, composing 4 choice sets with 4 alternatives each, so that each respondent faced 4 different hypothetical choice situations. We modified this design in two ways. First, we asked respondents to include the status-quo broadcasting service they subscribe to as an additional alternative in each choice set; as a result, respondents chose the most preferred among 5 alternatives, consisting of 1 current subscription and 4 hypothetical alternatives, in each of the 4 choice sets. 
Modifying the traditional conjoint survey in this way enabled us to estimate factors related to switching cost or switching threshold in addition to the effects of attributes. Also, by using both revealed preference data (the current-subscription alternative) and stated preference data (the 4 hypothetical alternatives), additional advantages in estimation properties and more conservative, realistic forecasts can be achieved. Second, we asked respondents to choose the most preferred alternative while considering their expected adoption or switching timing, reported among 14 half-year points after the introduction of the next-generation broadcasting services. As a result, 14 observations with 5 alternatives per period are obtained for each respondent, yielding panel-type data. This panel of 4 × 14 × 1,000 = 56,000 observations is used to estimate the individual-level consumer adoption model. The empirical results show that forecasting the demand for new products without considering the existence of previous products and/or switching cost factors yields an overestimated diffusion speed at the introductory stage or otherwise distorted predictions; this verifies the validity of our proposed model, in which both the existence of previous products and switching cost factors are properly considered. The proposed model is also found to produce flexible patterns of market evolution, depending on how strongly consumer preferences for the attributes of the alternatives affect individual-level state transitions, rather than following an S-shaped curve assumed a priori. Empirically, across various scenarios with diverse price combinations, IPTV is more likely than Digital Cable TV to take an advantageous position in obtaining subscribers. 
Meanwhile, despite its inferiority in many technological attributes, Analog Cable TV, regarded as the previous product in our analysis, is likely to be substituted by the new services gradually rather than abruptly, thanks to its low service charge and the high switching cost in the fixed-charge broadcasting service market.
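
The individual-level choice among the status-quo service and the hypothetical alternatives in each choice set can be illustrated with a multinomial logit, the standard form for conjoint choice data. This is only a sketch, not the paper's estimated model; the utilities and the switching-cost penalty below are hypothetical:

```python
import math

def logit_probs(utilities):
    """Multinomial logit choice probabilities from alternative utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# One hypothetical choice set: the status quo plus 4 hypothetical alternatives.
# A switching-cost penalty is subtracted from every non-status-quo utility,
# which is how a switching threshold damps early substitution.
switch_cost = 1.2
utils = [0.5] + [u - switch_cost for u in (1.0, 0.8, 1.4, 0.6)]
probs = logit_probs(utils)
```

With the penalty applied, the status quo retains the highest choice probability even though some alternatives have higher raw utility — the mechanism behind the gradual, rather than abrupt, substitution reported above.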

Valuation of the Water Pollution Reduction: An Application of the Imaginary Emission Market Concept (수질오염물질 감소의 편익 추정 -수질총량제하 가상배출권시장 개념의 적용-)

  • Han, Tak-Whan;Lee, Hyo Chang
    • Environmental and Resource Economics Review
    • /
    • v.23 no.4
    • /
    • pp.719-746
    • /
    • 2014
  • This study attempts to estimate the value of water quality improvement by deriving the equilibrium price of water pollutant emission permits in an imaginary water pollutant emission trading market. It is reasonable to say that an implicit social agreement on the unit value of water pollutant already exists, since the government set the Total Water Pollutant Loading System for the major river basins, particularly the Nakdong River Basin, as part of the Comprehensive Measures for Water Management. We can therefore derive the unit value of water pollutant emission already implied by the pollution allowance that the Total Water Pollutant Loading System assigns to each city or county; once estimated, it is useful for the economic assessment of water quality related projects. An imaginary water pollutant emission trading system is constructed for the Nakdong River Basin, where the Total Water Pollutant Loading System is already in effect, to estimate the equilibrium price of the water pollutant permit. By estimating the marginal abatement cost curve for each city or county, we can compute the equilibrium price of the permit, which is then regarded as the economic value of the water pollutant. The marginal net benefit function follows from the relationship between emission and benefit, and the equilibrium permit price is obtained by constructing the excess demand function of the permit using the total allowable permits of the local government entities. The equilibrium permit price is estimated to be 1,409.3 won/kg·BOD, which lies within a reasonable range compared with foreign permit prices. This permit price can be applied to calculate the economic value of water quality pollutants and can also be used directly in B/C analyses of projects involving water quality change.
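
The equilibrium construction described above — each source's permit demand falls as the price rises along its marginal abatement cost (MAC) curve, and the market clears where aggregate demand equals the total allowable load — can be sketched with linear MACs. The loads, slopes, and cap below are hypothetical, not the paper's Nakdong Basin estimates:

```python
def equilibrium_price(baselines, slopes, cap):
    """Permit price where aggregate permit demand equals the total cap.
    Each source i has a linear MAC, MAC_i(q) = slope_i * q; at price p it
    abates p/slope_i, so its permit demand is baseline_i - p/slope_i.
    Solving sum_i (baseline_i - p/slope_i) = cap for p gives:"""
    return (sum(baselines) - cap) / sum(1.0 / s for s in slopes)

# Hypothetical BOD loads (kg/day) and MAC slopes for three local governments
baselines = [800.0, 600.0, 400.0]
slopes = [2.0, 3.0, 5.0]
cap = 1500.0
p = equilibrium_price(baselines, slopes, cap)
```

At the returned price, the excess demand function is exactly zero, which is the condition the study uses to read off the implicit unit value of the pollutant.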

A Study on the Market Design of Designing GHG Emissions Trading (국내 배출권 거래시장 활성화 방안에 관한 연구)

  • Park, Soon Chul;Choi, Ki-Ryun
    • Environmental and Resource Economics Review
    • /
    • v.14 no.2
    • /
    • pp.493-518
    • /
    • 2005
  • It has been 10 years since the Climate Change Convention was concluded, and the Kyoto Protocol comes into force as international law on 16 February 2005. On this basis, Annex I countries will implement their GHG mitigation projects and press developing countries to adopt reduction targets. Korea does not yet have a binding target, but a COP (Conference of the Parties) will negotiate reduction targets for the second commitment period. If Korea takes on a real obligation, the industrial sector will have to reduce GHG emissions, and a market mechanism will need to be introduced for this. This study started from the question, "Is it possible to introduce emissions trading in Korea?" To address it, this study analysed GHG emissions, marginal abatement costs, and market prices for 11 industrial companies (about 36% of Korea's emissions). A minus target relative to the 2002 base year is impossible to implement, and an emissions trading scheme alone cannot create a market without additional policies and measures. This study therefore suggests importing credits and providing a government subsidy to encourage trading. Imported credits can shift the demand curve within the marginal abatement cost curves, but their effectiveness does not grow continually; allowing 40% of credits into the emissions trading market is the best option for reducing costs. A subsidy works somewhat differently, shifting the marginal abatement cost curves down directly; a 30% subsidy is best. Considering imported credits and subsidies together yields the greatest cost reduction for companies, with 30% being the most effective in each case. This study shows that if the government wants to design emissions trading, encourage participants' competitiveness, and encourage early action, it should allow credit trading and give a subsidy to participants.

The Effects of e-Business on Business Performance - In the home-shopping industry - (e-비즈니스가 경영성과에 미치는 영향 -홈쇼핑을 중심으로-)

  • Kim, Sae-Jung;Ahn, Seon-Sook
    • Management & Information Systems Review
    • /
    • v.22
    • /
    • pp.137-165
    • /
    • 2007
  • It seems high time to increase productivity by adopting e-business to overcome challenges posed both by external factors, including the appreciation of the Korean won, oil price hikes, and fierce global competition, and by domestic issues represented by the disparities between large corporations and small and medium enterprises (SMEs), between the Seoul metropolitan area and local cities, and between export and domestic demand, all of which weaken future growth engines in the Korean economy. The globalization era demands innovative changes in business processes and industrial structure aimed at creating new value, and e-business is expected to play a core role in the sophistication of the Korean economy through new value and innovation. To examine business performance in e-business-adopting industries, this study analyzed the home shopping industry by closely examining financial ratios, including the ratio of net profit to sales, the ratio of operating income to sales, the ratio of gross cost to cost of sales, the ratio of gross cost to selling, general and administrative (SG&A) expense, and return on investment (ROI). The study used corporate financial statements as its main resource, calculating the financial ratios via the Data Analysis, Retrieval and Transfer System (DART) of the Financial Supervisory Service, one of Korea's financial supervisory authorities. First, the trend analysis of the ratio of net profit to sales shows that CJ Home Shopping has registered a remarkable increase since 2002, while its competitors find it hard to catch up with CJ's stunning performance, partly owing to efficient management relative to its capital. If the current trend continues, such significance will make the front-runner assume the largest market share. 
On the other hand, GS Home Shopping, despite its best-organized system and the largest capital among its peers, lacks efficiency in management. Second, the trend analysis of the ratio of operating income to sales shows that CJ Home Shopping and GS Home Shopping recorded similar growth until 2004; however, while CJ Home Shopping's operating income continued to increase in 2005, GS Home Shopping's operating income declined, widening the income gap with CJ Home Shopping. While CJ Home Shopping, with the largest market share in the home shopping industry, is engaged in aggressive marketing, GS Home Shopping, with its stability-driven management strategy, falls behind CJ again in the ratio of operating income to sales despite a favorable management environment, including its large capital. The companies in Group B were all established in 2001. NS Home Shopping was the first in Group B to shift from loss to profit. Woori Home Shopping posted operating losses for three consecutive years and was finally sold to Lotte Group in 2007, but has since registered a continuing increase in net income on sales. Third, the trend analysis of the ratio of gross cost to cost of sales shows that, since home shopping is a sales business, its cost of sales is much lower than in other types of business such as manufacturing. Within gross costs, which include cost of sales, SG&A expense, and non-operating expense, cost of sales has decreased remarkably since 2002, and Group B has also posted a notable decline in the same item since 2002. Fourth, regarding the ratio of gross cost to SG&A expense, the home shopping industry, due to its unique characteristics, usually posts a high ratio of SG&A expense. 
However, an SG&A share above 80% indicates lax management and, at the same time, a net income on sales far lower than in other industries. Last but not least, the trend analysis of ROI shows that CJ Home Shopping's ROI curve resembles that of its investment in fixed assets; its ratio of fixed assets to operating income skyrocketed in 2004 and 2005. GS Home Shopping's fixed assets, by contrast, are not as large as CJ Home Shopping's. Consequently, competition in the home shopping industry is currently among CJ, GS, Hyundai, NS, and Woori Home Shopping, and all of them need to manage their costs more thoroughly. For the latecomers of Group B and other home shopping companies to advance further, the current lax management should be reformed, particularly in the SG&A expense sector. Provided that total sales volume in the Internet shopping sector is projected to exceed 20 trillion won by 2010, it is concluded that all participants in the home shopping industry hoping to grow further after 2007 should prioritize strategies for efficient management of costs and expenses over increasing revenues.

Validation of Stem-loop RT-qPCR Method on the Pharmacokinetic Analysis of siRNA Therapeutics (Stem-loop RT-qPCR 분석법을 이용한 siRNA 치료제의 생체시료 분석법 검증 및 약물 동태학적 분석)

  • Kim, Hye Jeong;Kim, Taek Min;Kim, Hong Joong;Jung, Hun Soon;Lee, Seung Ho
    • Journal of Life Science
    • /
    • v.29 no.6
    • /
    • pp.653-661
    • /
    • 2019
  • The first small interfering RNA (siRNA) therapeutics have recently been approved by the U.S. Food and Drug Administration, and demand is rapidly increasing for new RNA therapeutics bioanalysis methods, which are essential for pharmacokinetics, including the absorption, distribution, metabolism, and excretion of siRNA therapeutics. The stem-loop real-time qPCR (RT-qPCR) assay is a useful molecular technique for the identification and quantification of small RNAs (e.g., microRNA and siRNA) and can be applied to the bioanalysis of siRNA therapeutics. The established stem-loop RT-qPCR assay was validated when the anti-HPV E6/E7 siRNA therapeutic was used in preclinical trials: the limit of detection was as low as 10 fM and the lower limit of quantification 100 fM. The reliability of the established method was further validated in three intra-assays, where the correlation coefficient of R² > 0.99, the slope of -3.10 to -3.40, and the recovery rate within ±20% for the siRNA standard curve confirm its excellent robustness. Finally, the circulation profiles of siRNAs were demonstrated in rat serum, and the pharmacokinetic properties of the anti-HPV E6/E7 siRNA therapeutic were characterized using the stem-loop RT-qPCR assay. The stem-loop RT-qPCR assay therefore enables accurate, precise, and sensitive quantification of siRNA duplexes and is suitable for quantifying small RNA therapeutics from small volumes of biological samples.
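
A qPCR standard curve of the kind validated above is a straight line of Ct against log10(concentration); a slope between -3.10 and -3.40 corresponds to near-100% amplification efficiency, and an unknown sample is quantified by inverting the line. A sketch with hypothetical slope and intercept values (the assay's fitted parameters are not given in the abstract):

```python
def quantify(ct, slope, intercept):
    """Invert a qPCR standard curve Ct = slope * log10(conc) + intercept
    to recover the concentration from a measured Ct value."""
    return 10 ** ((ct - intercept) / slope)

def amplification_efficiency(slope):
    """E = 10^(-1/slope) - 1; a slope of about -3.32 gives ~100% efficiency."""
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical curve within the reported slope range (-3.10 to -3.40)
slope, intercept = -3.32, 38.0
conc_fM = quantify(25.0, slope, intercept)   # sample with a Ct of 25 cycles
eff = amplification_efficiency(slope)
```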

DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference
    • /
    • 1995.02a
    • /
    • pp.101-113
    • /
    • 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network and thereby provide the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement; consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification, and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS is not able to project pavement performance trends in order to make assessments and recommendations for future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the Origin-Destination survey data available from WisDOT, covering two stateline areas, one county, and five cities, are analyzed and zone-to-zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (OD TLF) distributions by trip type are compared with the corresponding TLFs from the Gravity Model (GM). The gravity model is calibrated to obtain friction factor curves for the three trip types: Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" calibration and "micro-scale" calibration are performed. 
The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results because the available OD survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the OD survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. The three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration data base. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. For this research, however, the information available for the development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs. The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM. 
The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link". Selected link based analyses are conducted using both 16 selected links and 32 selected links. The SELINK analysis using 32 selected links provides the least %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. But more importantly, SELINK adjustment factors can be computed for all of the zones. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted by using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used to evaluate the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. LV/GC ratios of 0.958 using 32 selected links and 1.001 using 16 selected links are obtained. 
The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of %RMSE for the four screenlines resulting from the fourth and last GM run using 32 and 16 selected links is 22% and 31%, respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves, 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not show any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the smallest value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups. Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results, and no specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas: the %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35%, respectively, whereas the average ground counts are 481, 1383, 1532, and 3154, respectively. As for the screenline and volume range analyses, the %RMSE is inversely related to average link volume. 
The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production trip rate (total adjusted productions/total population) and a new trip attraction trip rate. Revised zonal production and attraction adjustment factors can then be developed that reflect only the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimate of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond population alone are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not currently available, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8, while the productions for relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable.
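The two quantities introduced above, the revised trip rate and the revised zonal adjustment factor, are simple ratios. The following sketch uses hypothetical totals purely to make the arithmetic concrete; none of the numbers are from the study.

```python
def revised_trip_rate(total_adjusted_productions, total_population):
    """New trip production rate after SELINK adjustment:
    total adjusted productions divided by total population."""
    return total_adjusted_productions / total_population

def revised_adjustment_factor(selink_adjusted_zone_value, revised_zone_estimate):
    """Residual zonal factor reflecting only the SELINK-driven increase or
    decrease relative to the revised zonal estimate."""
    return selink_adjusted_zone_value / revised_zone_estimate

# Hypothetical statewide totals: 150,000 adjusted productions, 3,000,000 people.
rate = revised_trip_rate(150_000.0, 3_000_000.0)     # trips per person
# A zone reduced from a revised estimate of 100 to 80 productions -> factor 0.8,
# consistent with the 0.5-0.8 range reported for many zones.
factor = revised_adjustment_factor(80.0, 100.0)
```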
The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH. This implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%. Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic, while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class, and travel speed, provide useful information for highway planners to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted by using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results by using the ground count-based segment adjustment factors are satisfactory for long range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes.
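The 18.3% figure above can be verified directly from the two VMT totals; this small check is illustrative only.

```python
def pct_shortfall(model_vmt, reference_vmt):
    """Percentage by which a model estimate falls below a reference value."""
    return 100.0 * (reference_vmt - model_vmt) / reference_vmt

# GM forecast: 2.975 billion VMT; WisDOT computation: 3.642 billion VMT.
shortfall = pct_shortfall(2.975, 3.642)  # approximately 18.3%
```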
The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
