• Title/Summary/Keyword: Target Identification


An Analysis on the Usability of Unmanned Aerial Vehicle(UAV) Image to Identify Water Quality Characteristics in Agricultural Streams (농업지역 소하천의 수질 특성 파악을 위한 UAV 영상 활용 가능성 분석)

  • Kim, Seoung-Hyeon;Moon, Byung-Hyun;Song, Bong-Geun;Park, Kyung-Hun
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.22 no.3
    • /
    • pp.10-20
    • /
    • 2019
  • Irregular rainfall caused by climate change, in combination with non-point source pollution, can cause water systems worldwide to suffer frequent eutrophication and algal blooms. This type of water pollution is more common in agricultural areas prone to inflows of non-point source pollution. Therefore, in this study, the correlation between Unmanned Aerial Vehicle (UAV) multi-spectral images and total phosphorus, total nitrogen, and chlorophyll-a, which are indirectly associated with algal blooms, was analyzed to assess the usability of UAV imagery for identifying water quality characteristics in agricultural streams. The analysis used the vegetation indices Normalized Difference Vegetation Index (NDVI), Normalized Difference Red Edge (NDRE), and Chlorophyll Index Red Edge (CIRE), computed from the multi-spectral images collected over the target regions of Yangcheon and Hamyang Wicheon, for the detection of algal blooms. In the correlation analysis between image values and water quality measurements at the sampling points, total phosphorus was correlated with CIRE (0.66) at a significance level of 0.05, and chlorophyll-a was correlated with Blue (-0.67), Green (-0.66), NDVI (0.75), NDRE (0.67), and CIRE (0.74). Total nitrogen was correlated with the Red (-0.64), Red Edge (-0.64), and Near-Infrared (NIR) (-0.72) wavelengths at the 0.05 significance level. These results confirm significant correlations between the multi-spectral images collected by UAV and the factors responsible for water pollution. For the vegetation indices used to detect algal blooms, the possibility of identifying not only chlorophyll-a but also total phosphorus was confirmed. These findings can serve as meaningful data for countermeasures such as selecting areas of concern for non-point source pollution in agricultural regions.
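The index-versus-water-quality analysis can be sketched in a few lines. This is an illustration only: the abstract does not give the exact index formulas, so the common definitions of NDVI, NDRE, and CIRE are assumed, along with a plain Pearson correlation between index values and lab-measured concentrations.

```python
# Sketch under assumptions: band values are reflectances, and the common
# index forms are used (not stated explicitly in the paper's abstract):
#   NDVI = (NIR - Red) / (NIR + Red)
#   NDRE = (NIR - RedEdge) / (NIR + RedEdge)
#   CIRE = NIR / RedEdge - 1
from math import sqrt

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    return (nir - red_edge) / (nir + red_edge)

def cire(nir, red_edge):
    return nir / red_edge - 1.0

def pearson_r(xs, ys):
    """Pearson correlation between index values at sampling points and
    the corresponding water quality measurements (e.g. chlorophyll-a)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A correlation near the reported 0.75 between NDVI and chlorophyll-a would be computed exactly this way, with a separate significance test at the 0.05 level.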

Verification of Kompsat-5 Sigma Naught Equation (다목적실용위성 5호 후방산란계수 방정식 검증)

  • Yang, Dochul;Jeong, Horyung
    • Korean Journal of Remote Sensing
    • /
    • v.34 no.6_3
    • /
    • pp.1457-1468
    • /
    • 2018
  • The sigma naught (σ⁰) equation is essential for calculating geophysical properties from Synthetic Aperture Radar (SAR) images in applications such as ground target identification, surface classification, sea wind speed calculation, and soil moisture estimation. In this paper, we propose new Kompsat-5 (K5) Radar Cross Section (RCS) and σ⁰ equations reflecting the final SAR processor update and absolute radiometric calibration, in order to broaden the application of K5 SAR images. First, we analyzed the accuracy of the K5 RCS equation using trihedral corner reflectors installed at the Kompsat calibration site in Mongolia. The average difference between the values calculated with the RCS equation and those measured with the K5 SAR processor was about 0.2 dBm² for the Spotlight and Stripmap imaging modes. In addition, the K5 σ⁰ equation was verified against TerraSAR-X (TSX) and Sentinel-1A (S-1A) SAR images over the Amazon rainforest, where the backscattering characteristics are not significantly affected by seasonal change. The calculated σ⁰ difference between K5 and TSX/S-1A was less than 0.6 dB. Considering the K5 absolute radiometric accuracy requirement of 2.0 dB (1σ), the average difference of 0.2 dBm² for the RCS equation and the maximum difference of 0.6 dB for the σ⁰ equation show that the accuracies of the proposed equations are relatively high. In the future, the validity of the proposed RCS and σ⁰ equations is expected to be further verified through applications such as sea wind speed calculation, where quantitative analysis is possible.
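The actual K5 equations are not reproduced in the abstract, so the following only illustrates the dB bookkeeping behind the verification: converting linear backscatter to sigma naught in dB, differencing two sensors over the same stable target, and checking the result against the 2.0 dB requirement.

```python
# Hedged sketch: illustrates the decibel arithmetic of a cross-sensor
# sigma-naught comparison, not the K5 calibration equations themselves.
import math

def to_db(linear):
    """sigma0 [dB] = 10 * log10(sigma0 [linear power units])."""
    return 10.0 * math.log10(linear)

def cross_sensor_diff_db(sigma0_a_linear, sigma0_b_linear):
    """dB difference between two sensors imaging the same stable target
    (e.g. K5 vs. TSX/S-1A over the Amazon rainforest)."""
    return to_db(sigma0_a_linear) - to_db(sigma0_b_linear)

def within_requirement(diff_db, requirement_db=2.0):
    """Check a difference against the 2.0 dB (1-sigma) K5 requirement."""
    return abs(diff_db) <= requirement_db
```

The paper's maximum observed difference of 0.6 dB passes this check with a wide margin.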

Positron Annihilation Spectroscopy of Active Galactic Nuclei

  • Doikov, Dmytry N.;Yushchenko, Alexander V.;Jeong, Yeuncheol
    • Journal of Astronomy and Space Sciences
    • /
    • v.36 no.1
    • /
    • pp.21-33
    • /
    • 2019
  • This paper focuses on the interpretation of radiation fluxes from active galactic nuclei (hereafter AGN). The advantage of positron annihilation spectroscopy over other methods of spectral diagnostics of AGN is demonstrated. A relationship between regular and random components in both the bolometric and spectral composition of the fluxes of quanta and particles generated in AGN is found. We consider their diffuse component separately and also detect radiative feedback after the passage of high-velocity cosmic rays and hard quanta through the gas-and-dust aggregates surrounding the massive black holes in AGN. The motion of relativistic positrons and electrons in such complex systems produces secondary radiation throughout the whole investigated region of an AGN, in the form of a cylinder with radius R = 400-1000 pc and height H = 200-400 pc, thus causing its visible luminescence across all spectral bands. We obtain radiation and electron energy distribution functions depending on the spatial distribution of the investigated bulk of matter in the AGN. Radiation luminescence of the non-central part of an AGN is a response to the effects of particles and quanta falling from its center, and is created by the atoms, molecules, and dust of its diffuse component. The cross-sections for the single-photon annihilation of positrons of different energies with atoms in these AGN are determined. For the first time, we use data on the change in chemical composition due to spallation reactions induced by high-energy particles. We establish, or define more accurately, how the energies of the incident positron, the emitted γ-quantum, and the recoiling nucleus correlate with the atomic number and weight of the target nucleus. For light elements, we provide detailed tables of all indicated parameters. A new criterion is proposed, based on the ratio of the fluxes of γ-quanta formed in one- and two-photon annihilation of positrons in a diffuse medium. It is concluded that, as in young supernova remnants, two-photon annihilation tends to occur in solid-state grains as a result of the active loss of kinetic energy of positrons through ionisation down to the thermal energy of free electrons. The single-photon annihilation of positrons manifests itself in the gas component of AGN. Such annihilation occurs as an interaction between positrons and K-shell electrons; hence, it is suitable for identifying the chemical state of the substances comprising the gas component of the investigated media. Specific physical media producing high fluxes of positrons are discussed; this allowed a significant reduction in the number of reaction channels generating positrons. We estimate the brightness distribution in the γ-ray spectra of the gas-and-dust media through which positron fluxes travel, with an energy range similar to that recorded by the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) research module. Based on the results of our calculations, we analyse the reasons for the high penetrating power of positrons through gas-and-dust aggregates. The energy loss of positrons by ionisation is compared with the production of secondary positrons by high-energy cosmic rays in order to determine the depth of their penetration into the gas-and-dust aggregations clustered in AGN. The relationship between the energy of the γ-quanta emitted upon single-photon annihilation and the energy of the incident electrons is established. The obtained cross-sections for positron interactions with the bound electrons of the diffuse component of the non-central, peripheral AGN regions allowed us to obtain new spectroscopic characteristics of the atoms involved in single-photon annihilation.

Trends in QA/QC of Phytoplankton Data for Marine Ecosystem Monitoring (해양생태계 모니터링을 위한 식물플랑크톤 자료의 정도 관리 동향)

  • Yih, Wonho;Park, Jong Woo;Seong, Kyeong Ah;Park, Jong-Gyu;Yoo, Yeong Du;Kim, Hyung Seop
    • The Sea: Journal of the Korean Society of Oceanography
    • /
    • v.26 no.3
    • /
    • pp.220-237
    • /
    • 2021
  • Since the functional importance of marine phytoplankton was first advocated in the early 1880s, massive data on species composition and abundance have been produced by classical microscopic observation and by advanced auto-imaging technologies. More recently, pigment composition obtained from direct chemical analysis of phytoplankton samples, or from indirect remote sensing, has been usable for group-specific quantification, leading to more diversified data production methods and improved spatiotemporal access to the target data-gathering points. In quite a few of the many long-term marine ecosystem monitoring programs, phytoplankton species composition and abundance have been included as basic monitoring items. These phytoplankton data can serve as crucial evidence of long-term change in phytoplankton community structure and ecological functioning at the monitoring stations. The usability of the data, however, is sometimes limited by changes of data producer over the monitoring period: methods for sample treatment, analysis, and species identification can be inconsistent among different data producers and monitoring years. In-depth study to determine precise quantitative values of phytoplankton species composition and abundance may be said to have begun with Victor Hensen in the late 1880s. International discussion on the quality assurance of marine phytoplankton data began in 1969 with SCOR Working Group 33 of ICSU. The final report of the Working Group in 1974 (UNESCO Technical Papers in Marine Science 18) was later revised and published as UNESCO Monographs on Oceanographic Methodology 6. The BEQUALM project, the forerunner of the IPI (International Phytoplankton Intercomparison) for marine phytoplankton data QA/QC under the ISO standard, was initiated in the late 1990s. The IPI promotes international collaboration so that all participating countries can apply the QA/QC standard established through its 20 years of experience and practice. In Korea, however, no such QA/QC standard for marine phytoplankton species composition and abundance data has been established by law, whereas a standard for marine chemical measurement and analysis data has already been set up and is being managed. The first priority should be to establish a QA/QC standard system for species composition and abundance data of marine phytoplankton, which could then be extended to other functional groups at higher consumer levels of marine food webs.

A Study on Intelligent Value Chain Network System based on Firms' Information (기업정보 기반 지능형 밸류체인 네트워크 시스템에 관한 연구)

  • Sung, Tae-Eung;Kim, Kang-Hoe;Moon, Young-Su;Lee, Ho-Shin
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.3
    • /
    • pp.67-88
    • /
    • 2018
  • Until recently, as the significance of the sustainable growth and competitiveness of small- and medium-sized enterprises (SMEs) has been recognized, governmental support has mainly been provided for tangible resources such as R&D, manpower, and funds. However, it is also true that the inefficiency of such support systems, with underestimated or redundant support, has been raised as an issue, because conflicting policies exist in terms of the appropriateness, effectiveness, and efficiency of business support. From the perspective of the government or an individual company, we believe that, owing to the limited resources of SMEs, technology development and capacity enhancement through collaboration with external sources are the basis for creating competitive advantage, and we also emphasize value creation activities toward that end. This is why value chain network analysis is necessary: to analyze inter-company deal relationships along a series of value chains and to visualize the results by establishing knowledge ecosystems at the corporate level. There exist the Technology Opportunity Discovery (TOD) system, which provides information on the relevant products or technology status of companies with patents through searches over patent, product, or company names, as well as CRETOP and KISLINE, which both allow users to view company (financial) information and credit information; however, no online system provides a list of similar (competitive) companies based on value chain network analysis, or information on potential clients or demanders with whom business deals could be made in the future. Therefore, we focus on the Value Chain Network System (VCNS), a support partner for corporate business strategy planning developed and managed by KISTI, and investigate the types of embedded network-based analysis modules, the databases (D/Bs) that support them, and how to utilize the system efficiently.
Further, we explore the network visualization function of the intelligent value chain analysis system, which becomes the core information for understanding industrial structure and for a company's new product development. For a company to gain competitive superiority over others, it is necessary to identify who the competitors are and which patents or products they currently produce; searching for similar companies or competitors by industry type is the key to securing competitiveness in the commercialization of the target company. In addition, transaction information, which records business activity between companies, plays an important role in identifying potential customers when both parties enter similar fields. Identifying a competitor at the enterprise or industry level using a network map based on such inter-company sales information can be implemented as a core module of value chain analysis. The Value Chain Network System (VCNS) combines the concepts of value chain and industrial structure analysis with corporate information collected to date, so that it can grasp not only the market competition situation of individual companies but also the value chain relationships of a specific industry. In particular, it can be useful as a corporate-level information analysis tool for tasks such as identifying industry structure, identifying competitor trends, analyzing competitors, locating suppliers (sellers) and demanders (buyers), tracking industry trends by item, finding promising items, finding new entrants, finding core companies and items along the value chain, and recognizing patents and the corresponding companies.
In addition, based on the objectivity and reliability of the analysis results from transaction data and financial data, the value chain network system is expected to be utilized for various purposes, such as information support for business evaluation, R&D decision support, and mid- or short-term demand forecasting, in particular for more than 15,000 member companies in Korea, employees in R&D service sectors, government-funded research institutes, and public organizations. To strengthen the business competitiveness of companies, technology, patent, and market information has so far been provided mainly by government agencies and private R&D service companies, framed as patent analysis (mainly ratings and quantitative analysis) or market analysis (market prediction and demand forecasting based on market reports). However, this has not resolved the lack of information, which is one of the difficulties that firms in Korea often face at the commercialization stage; in particular, information about competitors and potential business candidates is much more difficult to obtain. In this study, the real-time value chain analysis and visualization service module, based on the proposed network map and the data at hand, is presented together with expected market share, estimated sales volume, and contact information (which implies potential suppliers of raw materials and parts, and potential demanders of complete products and modules). In future research, we intend to investigate the indices of competitive factors in greater depth through the participation of research subjects, to newly develop competitive indices for competitors or substitute items, and to additionally apply data mining techniques and algorithms to improve the performance of VCNS.
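The competitor-identification idea behind the network map can be illustrated with a toy sketch. The actual VCNS modules and databases are not described in implementable detail in the abstract, so the data layout (a list of seller-buyer deal records) and the shared-buyer criterion for "competitor candidates" are assumptions made here purely for illustration.

```python
# Illustrative sketch only: treat inter-company sales records as a directed
# graph and call two companies "competitor candidates" when they sell to at
# least `min_shared` of the same buyers.
from collections import defaultdict

def buyer_sets(transactions):
    """transactions: iterable of (seller, buyer) deals -> {seller: set(buyers)}."""
    buyers = defaultdict(set)
    for seller, buyer in transactions:
        buyers[seller].add(buyer)
    return buyers

def competitor_candidates(transactions, company, min_shared=1):
    """Companies selling to the same buyers as `company`, sorted by name."""
    buyers = buyer_sets(transactions)
    mine = buyers.get(company, set())
    return sorted(
        other for other, bs in buyers.items()
        if other != company and len(bs & mine) >= min_shared
    )
```

On this toy structure, suppliers and demanders fall out of the same graph: the in-neighbors and out-neighbors of a company node.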

Electronic Word-of-Mouth in B2C Virtual Communities: An Empirical Study from CTrip.com (B2C허의사구중적전자구비(B2C虚拟社区中的电子口碑): 관우휴정려유망적실증연구(关于携程旅游网的实证研究))

  • Li, Guoxin;Elliot, Statia;Choi, Chris
    • Journal of Global Scholars of Marketing Science
    • /
    • v.20 no.3
    • /
    • pp.262-268
    • /
    • 2010
  • Virtual communities (VCs) have developed rapidly, with more and more people participating in them to exchange information and opinions. A virtual community is a group of people who may or may not meet one another face to face, and who exchange words and ideas through the mediation of computer bulletin boards and networks. A business-to-consumer virtual community (B2CVC) is a commercial group that creates a trustworthy environment intended to motivate consumers to be more willing to buy from an online store. B2CVCs create a social atmosphere through information contributions such as recommendations, reviews, and ratings of buyers and sellers. Although the importance of B2CVCs has been recognized, few studies have examined members' word-of-mouth behavior within these communities. This study proposes a model of involvement, satisfaction, trust, "stickiness," and word-of-mouth in a B2CVC and explores the relationships among these elements based on empirical data. The objectives are threefold: (i) to empirically test a B2CVC model that integrates measures of beliefs, attitudes, and behaviors; (ii) to better understand the nature of these relationships, specifically through word-of-mouth as a measure of revenue generation; and (iii) to better understand the role of the stickiness of a B2CVC in CRM marketing. The model incorporates three key elements concerning community members: (i) their beliefs, measured in terms of their involvement assessment; (ii) their attitudes, measured in terms of their satisfaction and trust; and (iii) their behavior, measured in terms of site stickiness and their word-of-mouth. Involvement is considered the motivation for consumers to participate in a virtual community. For B2CVC members, information searching and posting have been proposed as the main purposes of their involvement.
Satisfaction has been reviewed as an important indicator of a member's overall community evaluation, conceptualized through different levels of member interaction with the VC. The formation and expansion of a VC depend on the willingness of members to share information and services. Researchers have found that trust is a core component facilitating anonymous interaction in VCs and e-commerce, and trust-building in VCs has therefore been a common research topic. It is clear that the success of a B2CVC depends on the stickiness of its members to enhance purchasing potential. Opinions communicated and information exchanged between members may represent a type of written word-of-mouth; word-of-mouth is therefore one of the primary factors driving the diffusion of B2CVCs across the Internet. Figure 1 presents the research model and hypotheses. The model was tested through an online survey of CTrip Travel VC members. A total of 243 collected questionnaires was reduced to 204 usable questionnaires through a process of data cleaning. The study's hypotheses examined the extent to which involvement, satisfaction, and trust influence B2CVC stickiness and members' word-of-mouth. Structural equation modeling tested the hypotheses, and the structural model fit indices were within accepted thresholds: χ²/df was 2.76, NFI was .904, IFI was .931, CFI was .930, and RMSEA was .017. Results indicated that involvement has a significant influence on satisfaction (p < 0.001, β = 0.809). The proportion of variance in satisfaction explained by members' involvement was over half (adjusted R² = 0.654), reflecting a strong association. The effect of involvement on trust was also statistically significant (p < 0.001, β = 0.751), with 57 percent of the variance in trust explained by involvement (adjusted R² = 0.563).
When the construct "stickiness" was treated as a dependent variable, the proportion of variance explained by trust and satisfaction was relatively low (adjusted R² = 0.331). Satisfaction did have a significant influence on stickiness, with β = 0.514. Unexpectedly, however, the influence of trust was not significant (p = 0.231, t = 1.197), rejecting that hypothesis. The importance of stickiness in the model was underscored by its effect on e-WOM, with β = 0.920 (p < 0.001); stickiness explains over eighty percent of the variance in e-WOM (adjusted R² = 0.846). Overall, the results of the study supported the hypothesized relationships between members' involvement in a B2CVC and their satisfaction with and trust in it. However, trust, a traditional measure in behavioral models, had no significant influence on stickiness in the B2CVC environment. This study contributes to the growing body of literature on B2CVCs, specifically addressing gaps in the academic research by integrating measures of beliefs, attitudes, and behaviors in one model. The results provide additional insights into behavioral factors in a B2CVC environment, helping to sort out the relationships between traditional measures and relatively new ones. For practitioners, identifying factors, such as member involvement, that strongly influence B2CVC member satisfaction can help focus technological resources in key areas. Global e-marketers can develop marketing strategies directly targeting B2CVC members. In the global tourism business, they can target Chinese members of a B2CVC by providing special discounts for active community members or by developing early-adopter programs to encourage stickiness in the community. Future studies, with more sophisticated modeling, are called for to expand the measurement of B2CVC member behavior and to conduct experiments across industries, communities, and cultures.
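The reported coefficients are internally consistent in a way worth noting: in a single-predictor standardized path, the (unadjusted) R² equals the squared path coefficient, and with n = 204 the adjustment is negligible. This small check, an observation about the numbers rather than anything stated in the paper, reproduces the reported variance figures from the reported betas.

```python
# Consistency check (an observation, not the authors' method): for one
# standardized predictor, variance explained = beta squared.
def variance_explained(beta):
    """R^2 implied by a single standardized path coefficient."""
    return beta ** 2

# involvement -> satisfaction: beta = 0.809 implies R^2 ~ 0.654 (reported 0.654)
# stickiness  -> e-WOM:        beta = 0.920 implies R^2 ~ 0.846 (reported 0.846)
```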

An Analysis on the Conditions for Successful Economic Sanctions on North Korea : Focusing on the Maritime Aspects of Economic Sanctions (대북경제제재의 효과성과 미래 발전 방향에 대한 고찰: 해상대북제재를 중심으로)

  • Kim, Sang-Hoon
    • Strategy21
    • /
    • s.46
    • /
    • pp.239-276
    • /
    • 2020
  • The failure of early economic sanctions aimed at hurting the overall economies of targeted states called for a more sophisticated design of economic sanctions. This paved the way for the advent of 'smart sanctions,' which target the supporters of the regime instead of the general public. Despite controversies over the effectiveness of economic sanctions as a coercive tool for changing the behavior of a targeted state, the transformation from 'comprehensive sanctions' to 'smart sanctions' is gaining status as a legitimate method of punishing states that do not conform to international norms, in this paper's context the nonproliferation of weapons of mass destruction. The five permanent members of the United Nations Security Council have shown that they can reach accord on imposing economic sanctions rather than adopting resolutions to wage military war against targeted states. The North Korean nuclear issue has been the biggest security threat to countries in the region, even for China, out of fear that further development of nuclear weapons in North Korea might lead to a 'domino effect' of nuclear proliferation in Northeast Asia. Economic sanctions were adopted by the UNSC as early as 2006, after the first North Korean nuclear test, and the Council has continually strengthened sanctions measures at each stage of North Korean weapons development. While the effectiveness of the early sanctions on North Korea was dubious, recent sanctions limiting North Korea's exports of coal and imports of oil appear to have had an impact on the regime, inducing Kim Jong-un to commit to peace talks since 2018. The purpose of this paper is to add a variable to the factors determining the success of economic sanctions on North Korea: preventing North Korea's evasion efforts by means of illegal transshipments at sea.
I first analyze the causes of the recent success of the economic sanctions that led Kim Jong-un to engage in talks, and then add the maritime element to the argument. There are three conditions for the success of the sanctions regime: (1) smart sanctions, targeting commodities and support groups (elites) vital to regime survival; (2) China's faithful participation in the sanctions regime; and (3) preventing North Korea's maritime evasion efforts.

DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference
    • /
    • 1995.02a
    • /
    • pp.101-113
    • /
    • 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using origin-destination (O-D) surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network, providing the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement; consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification, and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS cannot project pavement performance trends in order to make assessments and recommendations for future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the O-D survey data available from WisDOT, covering two stateline areas, one county, and five cities, are analyzed and zone-to-zone truck trip tables are developed. The resulting O-D Trip Length Frequency (OD TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types: Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" and "micro-scale" calibrations are performed.
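The gravity-model distribution step at the heart of this calibration can be sketched as follows. The friction factors are assumed already tabulated from the calibrated curves, and the standard doubly-indexed form T_ij = P_i · A_j·F_ij / Σ_k A_k·F_ik is assumed, since the paper's exact formulation is not reproduced in the abstract.

```python
# Minimal gravity-model trip distribution sketch (friction factors by trip
# type are assumed to come from the calibrated friction factor curves):
#   T[i][j] = P[i] * A[j] * F[i][j] / sum_k(A[k] * F[i][k])
def gravity_trips(productions, attractions, friction):
    """productions[i], attractions[j], friction[i][j] -> trip table T[i][j].
    Each row of T sums to the zone's production P[i]."""
    table = []
    for i, p in enumerate(productions):
        weights = [attractions[j] * friction[i][j]
                   for j in range(len(attractions))]
        total = sum(weights)
        table.append([p * w / total for w in weights])
    return table
```

Calibration then amounts to adjusting `friction` until the trip length frequencies implied by the table match the observed OD TLFs.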
The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results, because the available O-D survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the O-D survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. Three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration database. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. For this research, however, the information available for the development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs. The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and, if necessary, to recalibrate the GM.
The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (the ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that selected link. Selected link based analyses are conducted using both 16 selected links and 32 selected links. The SELINK analysis using 32 selected links provides the smallest %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by the 32 selected links is 107% of total trip productions. More importantly, SELINK adjustment factors can be computed for all of the zones. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted using screenline volume analysis, functional-class and route-specific volume analysis, area-specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used to evaluate the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals, yielding LV/GC (link volume to ground count) ratios of 0.958 using 32 selected links and 1.001 using 16 selected links.
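The SELINK step just described reduces to two small operations. This is a sketch under an assumed zone bookkeeping (a dict of zonal values and a set of zones whose trips use the link); the paper's actual data structures are not given.

```python
# SELINK sketch: the adjustment factor is ground count / total assigned
# volume on the selected link, applied to the productions/attractions of
# every zone whose assigned trips use that link.
def link_adjustment_factor(ground_count, assigned_volume):
    """Ratio of observed link volume to total assigned volume."""
    return ground_count / assigned_volume

def adjust_zones(zonal_values, using_zones, factor):
    """Scale productions/attractions of the zones whose trips use the link;
    leave all other zones unchanged."""
    return {z: v * factor if z in using_zones else v
            for z, v in zonal_values.items()}
```

Iterating this over all selected links, then re-running the assignment, is one SELINK adjustment round; the study found three rounds with 32 links sufficient.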
The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The %RMSE for the four screenlines resulting from the fourth and final GM run using 32 and 16 selected links is 22% and 31%, respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves, 19% and 33% respectively, which implies that the SELINK analysis results are reasonable for all sections of the state. Functional-class and route-specific volume analysis is possible using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH), with 37 check points, the US highways (USH), with 50 check points, and the State highways (STH), with 67 check points, is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not show any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the smallest value, while that for the STH shows the largest. This pattern is consistent with the screenline analysis and with the overall relationship between %RMSE and ground count volume groups. Area-specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area, with 26 check points, the West area, with 36 check points, the East area, with 29 check points, and the South area, with 64 check points, is compared to the actual ground count totals. The four areas show similar results, and no specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As for the screenline and volume-range analyses, the %RMSE is inversely related to average link volume.
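The %RMSE statistic used throughout these checks can be written down directly. The exact normalization is not spelled out in the abstract, so the common convention, RMSE of assigned versus ground-count volumes expressed as a percentage of the mean ground count, is assumed here.

```python
# %RMSE sketch (normalization assumed: RMSE divided by mean ground count).
from math import sqrt

def pct_rmse(assigned, counts):
    """Percent root-mean-square error of assigned link volumes vs. ground
    counts at the same check points."""
    n = len(counts)
    rmse = sqrt(sum((a - c) ** 2 for a, c in zip(assigned, counts)) / n)
    return 100.0 * rmse / (sum(counts) / n)
```

Because the denominator is the mean ground count, screenlines with low average counts produce larger %RMSE for the same absolute error, which matches the inverse relationship reported above.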
The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production rate (total adjusted productions/total population) and a new trip attraction rate. Revised zonal production and attraction adjustment factors can then be developed that reflect only the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond population alone are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not currently available, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8, while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable.
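The two revised quantities described above reduce to simple ratios. The sketch below shows the arithmetic with invented numbers; the function names and values are assumptions, not taken from the study.

```python
def revised_trip_rate(total_adjusted_productions, total_population):
    """New zonal trip production rate after the SELINK adjustment:
    total adjusted productions divided by total population.
    An analogous rate applies to attractions. Illustrative values only."""
    return total_adjusted_productions / total_population

def revised_zonal_factor(selink_adjusted_value, revised_zonal_estimate):
    """Revised adjustment factor reflecting only the SELINK-driven
    increase or decrease relative to the revised zonal estimate."""
    return selink_adjusted_value / revised_zonal_estimate

# Hypothetical statewide totals and one zone:
rate = revised_trip_rate(500_000, 2_000_000)     # trips per person
factor = revised_zonal_factor(80.0, 100.0)       # zone reduced to 0.8x
```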
The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH. This implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%. Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic, while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class, and travel speed, provide useful information for highway planners seeking to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground-count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results using the ground-count-based segment adjustment factors are satisfactory for long-range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes.
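The 18.3% gap quoted above follows directly from the two VMT totals. A quick check of that arithmetic, with the function name assumed for illustration:

```python
def pct_below(model_vmt, reference_vmt):
    """Percent by which the model estimate falls below the reference value."""
    return 100.0 * (reference_vmt - model_vmt) / reference_vmt

# 1990 heavy truck VMT (billions): GM forecast 2.975 vs WisDOT 3.642
gap = pct_below(2.975, 3.642)
print(f"GM estimate is {gap:.1f}% below the WisDOT estimate")  # 18.3%
```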
The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
