• Title/Summary/Keyword: Distance measure


Exploring the Factors Influencing on the Accuracy of Self-Reported Responses in Affective Assessment of Science (과학과 자기보고식 정의적 영역 평가의 정확성에 영향을 주는 요소 탐색)

  • Chung, Sue-Im;Shin, Donghee
    • Journal of The Korean Association For Science Education / v.39 no.3 / pp.363-377 / 2019
  • This study reveals science-specific aspects of subjectivity in test results when science-related affective characteristics are assessed through self-report items. A science-specific response was defined as a response that appears because of a student's recognition of the nature or characteristics of science when an attempt is made to measure his or her concepts or perceptions of science. We searched for cases in which science-specific responses interfere with the measurement objective or with accurate self-reporting. Errors due to science-specific factors were derived from quantitative data on 649 first- and second-grade high school students and qualitative data from interviews with 44 students. The perspective on science and the characteristics of science that students internalize from everyday life and from science learning experiences interact with the items that make up the test instrument. As a result, obstacles to accurate self-reporting were found in three areas: characteristics of science, personal science experience, and science in the instrument. Regarding the characteristics of science, students respond to items regardless of the construct being measured because of their subjectively held views of science and its perceived characteristics. The personal-science-experience factor, representing the learner side, consists of the student's science motivation, interaction with science experiences, and perception of science and life. Finally, from the instrumental point of view, science in the instrument leads to terminological confusion because of the uncertainty of science concepts and ultimately distances responses from accurate self-reporting. Implications of the study are as follows: reviewing whether science-specific factors are included, taking precautions to clarify the measured construct, checking science-specificity factors at the development stage, and working to cross the boundary between everyday science and school science.

Evaluation of Park Service in Neighborhood Parks based on the Analysis of Walking Accessibility - Focused on Bundang-gu, Seongnam-si - (보행접근성 분석에 기반한 근린공원의 공원서비스 평가 - 성남시 분당구를 대상으로 -)

  • Hwang, Hae-Kwon;Son, Yong-Hoon
    • Journal of the Korean Institute of Landscape Architecture / v.52 no.1 / pp.59-70 / 2024
  • As urbanization progresses, demand for parks and green space is increasing. Urban park green spaces are important because they are recognized as places where people can freely engage in outdoor activities. The park service area is a measure of the extent to which services are provided based on distance. Accessibility plays an important role here, and walking in particular, as people's most basic means of transportation, strongly influences park use. However, current park service area analysis focuses on identifying underserved areas, so detailed evaluation of the areas that do receive service is lacking. This study evaluates park service areas based on pedestrian accessibility and the pedestrian network. Park services arise when users visit a park in person, so accessibility is expected to be reflected in usability. To quantify the pedestrian network, this study used space syntax to analyze pedestrian accessibility through integration values. The integration value is an indicator that quantifies the accessibility of the pedestrian network; in this study, the higher the integration value, the higher the expected likelihood of park use. The results are as follows. First, Bundang-gu's park service area covers 43% of the district and includes most sections with high pedestrian accessibility, but some sections with good pedestrian accessibility are excluded. This can be seen as a consequence of the urban planning process, in which residential, commercial, and business areas are allocated first and park and green areas afterward. Second, within Bundang-gu, the park service area and the pedestrian accessibility inside it were classified by neighborhood unit. Differences appear between individual neighborhood units, and park availability is expected to vary accordingly. In addition, even areas created in the same urban planning process differed in their park service evaluation according to pedestrian accessibility. These results make it possible to evaluate individual neighborhood units in a way that can be reflected in living-area plans, and they can serve as a useful indicator for future park and green space policies.
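The service-area idea in the abstract above, coverage of a pedestrian network within walking distance of a park, can be sketched as a multi-source shortest-path search. The toy graph, node names, and the 500 m walking threshold below are illustrative assumptions, not values from the paper.

```python
import heapq

def service_area(graph, park_entrances, max_walk_m=500.0):
    """Return the nodes of a pedestrian network reachable within
    max_walk_m metres of any park entrance (multi-source Dijkstra).

    graph: {node: [(neighbour, edge_length_m), ...]}
    """
    dist = {n: 0.0 for n in park_entrances}
    heap = [(0.0, n) for n in park_entrances]
    heapq.heapify(heap)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd <= max_walk_m and nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical pedestrian network (edge lengths in metres)
g = {
    "park": [("a", 200), ("b", 400)],
    "a": [("park", 200), ("b", 150), ("c", 450)],
    "b": [("park", 400), ("a", 150)],
    "c": [("a", 450)],
}
covered = service_area(g, ["park"], max_walk_m=500)
```

Note that node "b" is covered via the shorter path through "a" (350 m) rather than the direct 400 m edge, which is why network-based service areas differ from straight-line buffers.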

An Expert System for the Estimation of the Growth Curve Parameters of New Markets (신규시장 성장모형의 모수 추정을 위한 전문가 시스템)

  • Lee, Dongwon;Jung, Yeojin;Jung, Jaekwon;Park, Dohyung
    • Journal of Intelligence and Information Systems / v.21 no.4 / pp.17-35 / 2015
  • Demand forecasting is the activity of estimating the quantity of a product or service that consumers will purchase over a certain period of time. Developing precise forecasting models is considered important, since corporations can make strategic decisions about new markets based on the future demand the models estimate. Many studies have developed market growth curve models, such as the Bass, Logistic, and Gompertz models, which estimate future demand when a market is in its early stage. Among these, the Bass model, which explains demand through two types of adopters, innovators and imitators, has been widely used in forecasting. Such models require sufficient demand observations to ensure qualified results. In the beginning of a new market, however, observations are not sufficient for the models to estimate the market's future demand precisely. For this reason, demand inferred from the most adjacent markets is often used as a reference in such cases. Reference markets can be those whose products are built on the same categorical technologies. A market's demand may be expected to follow a pattern similar to that of a reference market when the adoption pattern of the product is determined mainly by its underlying technology. However, this process does not always yield satisfactory results, because the judged similarity between markets depends on intuition and/or experience. There are two major drawbacks that human experts cannot handle effectively in this approach. One is the abundance of candidate reference markets to consider, and the other is the difficulty of calculating the similarity between markets. First, there can be too many markets to consider when selecting reference markets. Mostly, markets in the same category of an industrial hierarchy can serve as reference markets because they are usually based on similar technologies. However, markets can be classified into different categories even if they are based on the same generic technologies, so markets in other categories also need to be considered as potential candidates. Next, even domain experts cannot consistently calculate the similarity between markets with their own qualitative standards. This inconsistency implies missing adjacent reference markets, which may lead to imprecise estimation of future demand. Even when no reference markets are missing, the new market's parameters can hardly be estimated from the reference markets without quantitative standards. For these reasons, this study proposes a case-based expert system that helps experts overcome these drawbacks in discovering reference markets. First, the study proposes using the Euclidean distance measure to calculate the similarity between markets. Based on their similarities, markets are grouped into clusters; markets missing from a cluster despite sharing its characteristics are then searched for, and potential candidate reference markets are extracted and recommended to users. After iterating these steps, definite reference markets are determined according to the user's selections among the candidates, and the new market's parameters are finally estimated from the reference markets. Two techniques are used in this procedure: the clustering technique of data mining, and the content-based filtering of recommender systems. The proposed system, implemented with these techniques, can determine the most adjacent markets based on whether a user accepts candidate markets. Experiments involving five ICT experts were conducted to validate the usefulness of the system. The experts were given a list of 16 ICT markets whose parameters were to be estimated. For each market, the experts estimated its growth curve parameters first by intuition and then with the system. Comparison of the results shows that the parameters estimated with the system are more accurate than those guessed without it.
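The similarity step the abstract describes, Euclidean distance between market feature vectors used to rank candidate reference markets, can be sketched as follows. The feature vectors, market names, and the two-feature encoding are hypothetical, not taken from the paper.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def candidate_references(target, markets, k=2):
    """Rank candidate reference markets by the Euclidean distance of
    their feature vectors to the target market's vector; return the
    k nearest names for the expert to accept or reject."""
    return sorted(markets, key=lambda name: euclidean(markets[name], target))[:k]

# Hypothetical normalised feature vectors (e.g. diffusion speed, penetration)
markets = {
    "smartphone": [0.9, 0.8],
    "tablet": [0.7, 0.6],
    "feature_phone": [0.2, 0.1],
}
new_market = [0.85, 0.75]
candidates = candidate_references(new_market, markets, k=2)
```

In the paper's workflow the accepted candidates would then supply the growth-curve parameters (e.g. Bass model coefficients) for the new market; here we only show the distance-ranking step.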

The Gradient Variation of Thermal Environments on the Park Woodland Edge in Summer - A Study of Hadongsongrim and Hamyangsangrim - (여름철 공원 수림지 가장자리의 온열환경 기울기 변화 - 하동송림과 함양상림을 대상으로 -)

  • Ryu, Nam-Hyong;Lee, Chun-Seok
    • Journal of the Korean Institute of Landscape Architecture / v.43 no.6 / pp.73-85 / 2015
  • This study investigated the extent and magnitude of woodland edge effects on users' thermal environments according to distance from the woodland border. A series of experiments measuring air temperature, relative humidity, wind velocity, MRT, and UTCI was conducted over six days between July 31 and August 5, 2015, a period of extremely hot weather, at the south-facing edge of Hadongsongrim (pure Pinus densiflora stands, tree age 100±33 yr, tree height 12.8±2.7 m, canopy closure 75%, N 35°03′34.7″, E 127°44′43.3″, elevation 7~10 m) and the east-facing edge of Hamyangsangrim (Quercus serrata-Carpinus tschonoskii community, tree age 102~125 yr / 58~123 yr, tree height 18.6±2.3 m (tree layer), 5.9±3.2 m (subtree layer), and 0.5±0.5 m (shrub layer), herbaceous layer coverage 60%, canopy closure 96%, N 35°31′28.1″, E 127°43′09.8″, elevation 170~180 m) in the rural villages of Hadong and Hamyang, Korea. A negative depth value indicates a point outside the woodland. The depth of edge influence (DEI) on maximum air temperature, minimum relative humidity, and wind speed at the time of maximum air temperature during the daytime (10:00~17:00) was 12.7±4.9, 15.8±9.8, and 23.8±26.2 m, respectively, in the mature evergreen conifer woodland of Hadongsongrim, and 3.7±2.2, 4.9±4.4, and 2.6±7.8 m, respectively, in the deciduous broadleaf woodland of Hamyangsangrim. The DEI on the maximum 10-minute average MRT and UTCI from the three-dimensional environment absorbed by the human-biometeorological reference person during the daytime (10:00~17:00) was 7.1±1.7 and 4.3±4.6 m, respectively, in the relatively sparse woodland of Hadongsongrim, and 5.8±4.9 and 3.5±4.1 m, respectively, in the dense and closed woodland of Hamyangsangrim. Edge effects on air temperature, relative humidity, wind speed, MRT, and UTCI in the sparse woodland of Hadongsongrim were less pronounced than those recorded in the dense and closed woodland of Hamyangsangrim. The gradient was least steep for the maximum 10-minute average UTCI, with at least 4.3±4.6 m (Hadongsongrim) and 3.5±4.1 m (Hamyangsangrim) required to stabilize the UTCI in mature woodlands. It is therefore suggested that woodland buffer widths based on UTCI values should be 4.3~8.9 m (Hadongsongrim) and 3.5~7.6 m (Hamyangsangrim) on each side of mature woodlands for users' thermal comfort. For the buffer effect on users' thermal comfort, the woodland edge structure should have multi-layered canopies and a closed edge.

The Effect of Common Features on Consumer Preference for a No-Choice Option: The Moderating Role of Regulatory Focus (재몰유선택적정황하공동특성대우고객희호적영향(在没有选择的情况下共同特性对于顾客喜好的影响): 조절초점적조절작용(调节焦点的调节作用))

  • Park, Jong-Chul;Kim, Kyung-Jin
    • Journal of Global Scholars of Marketing Science / v.20 no.1 / pp.89-97 / 2010
  • This study researches the effects of common features on the no-choice option with respect to regulatory focus theory. The primary interest is in three factors and their interrelationship: common features, the no-choice option, and regulatory focus. Prior studies have compiled a vast body of research in these areas. First, the "common features effect" has been observed by many noted marketing researchers. Tversky (1972) proposed the seminal theory, the EBA (elimination by aspects) model. According to this theory, consumers are prone to focus only on unique features during comparison processing, dismissing any common features as redundant information. Recently, however, more provocative ideas have challenged the EBA model by asserting that common features really do affect consumer judgment. Chernev (1997) first reported that adding common features mitigates the choice gap because of the increasing perception of similarity among alternatives. Later, however, Chernev (2001) argued against his earlier perspective, proposing that common features may impose a cognitive load on consumers, who are then prone to prefer heuristic over systematic processing. This brings one question to the forefront: do common features affect consumer choice, and if so, what are the concrete effects? This study tries to answer that question with respect to the no-choice option and regulatory focus. Second, some researchers hold that the no-choice option is another best alternative for consumers, who are likely to avoid choosing in the context of knotty trade-offs or mental conflict. Hope for the future may also increase the no-choice option in the context of optimism or the expectation that a more satisfactory alternative will appear later. Other issues reported in this domain are time pressure, consumer confidence, and the number of alternatives (Dhar and Nowlis 1999; Lin and Wu 2005; Zakay and Tsal 1993). This study casts the no-choice option in yet another perspective: the interactive effects between common features and regulatory focus. Third, regulatory focus theory is a very popular theme in recent marketing research. It suggests that consumers have two opposing focal goals: promotion versus prevention. A promotion focus deals with hope, inspiration, achievement, or gain, whereas a prevention focus involves duty, responsibility, safety, or loss aversion. Thus, while consumers with a promotion focus tend to take risks for gain, the same does not hold for a prevention focus. Regulatory focus theory predicts consumers' emotions, creativity, attitudes, memory, performance, and judgment, as documented in a vast body of marketing and psychology articles. Exploring consumer choice and common features from this perspective is a somewhat novel viewpoint in the regulatory focus literature. These reviews inspired this study of the possible interaction between regulatory focus and common features with a no-choice option. Specifically, adding common features rather than omitting them may increase the no-choice ratio only for prevention-focused consumers, and do the opposite for promotion-focused consumers. The reasoning is that when prevention-focused consumers encounter common features, they may perceive higher similarity among the alternatives; this conflict among similar options would increase the no-choice ratio. Promotion-focused consumers, however, may perceive common features as a cue for confirmation bias, and their confirmatory processing would make their prior preference more robust, so the no-choice ratio may shrink. This logic was verified in two experiments. The first was a 2×2 between-subjects design (presence of common features × regulatory focus) using digital cameras, a product very familiar to young subjects, as the stimulus. The regulatory focus variable was median-split on an eleven-item measure. Common features included zoom, weight, memory, and battery, whereas the other two attributes (pixels and price) were unique features. Results supported our hypothesis that adding common features enhanced the no-choice ratio only for prevention-focused consumers, not for those with a promotion focus, confirming the hypothesized interaction between regulatory focus and common features. Prior research had suggested that including common features has an effect on consumer choice, but this study shows that common features affect choice differently by consumer segment. The second experiment replicated the results of the first. It was identical to the first except for two points: a priming manipulation and a different stimulus. For the promotion-focus condition, subjects wrote an essay using words such as profit, inspiration, pleasure, achievement, development, hedonic, change, and pursuit; for prevention, they used words such as persistence, safety, protection, aversion, loss, responsibility, and stability. The room for rent had common features (sunshine, facilities, ventilation) and unique features (distance time and building condition). These attributes implied various levels and valences to replicate the first experiment. The hypothesis was supported again, and the interaction between regulatory focus and common features was significant. Thus, these studies showed the dual effects of common features on consumer choice with a no-choice option: adding common features may either enhance or mitigate no-choice, contradictory as that may sound. Under a prevention focus, adding common features is likely to enhance the no-choice ratio because of increased mental conflict; under a promotion focus, it is prone to shrink the ratio, perhaps because of confirmation bias. The research has practical and theoretical implications for marketers, who may need to consider common features carefully in display contexts according to consumer segment (i.e., promotion vs. prevention focus). Theoretically, the results suggest a meaningful moderator between common features and no-choice, in that the effect on the no-choice option partly depends on regulatory focus. This variable corresponds not only to a chronic but also to a situational perspective in our hypothesis domain. Finally, in light of some shortcomings of the research, such as overlooked attribute importance, the low no-choice ratio, and external validity issues, we hope it stimulates future studies exploring the little-known world of the no-choice option.

A Consideration of Apron's Shielding in Nuclear Medicine Working Environment (PET검사 작업환경에 있어서 APRON의 방어에 대한 고찰)

  • Lee, Seong-wook;Kim, Seung-hyun;Ji, Bong-geun;Lee, Dong-wook;Kim, Jeong-soo;Kim, Gyeong-mok;Jang, Young-do;Bang, Chan-seok;Baek, Jong-hoon;Lee, In-soo
    • The Korean Journal of Nuclear Medicine Technology / v.18 no.1 / pp.110-114 / 2014
  • Purpose: Advances in PET/CT devices have shortened test times and popularized the examination, and PET/CT tests have increased continuously. However, this also increases the exposure dose of radiation workers. This study aimed to measure the radiation shielding rate for the high-energy emitter 18F-FDG and the shielding effect when workers wore an apron during PET/CT tests, and to compare the shielding rate with that for 99mTc in order to minimize worker exposure. Materials and Methods: The study covered 10 patients who visited this hospital for PET/CT tests over 8 days, from May 2nd to 10th, 2013. The 18F-FDG dispensing room, the patient waiting room (used after 18F-FDG injection), and the PET/CT scan room were chosen as measurement spots, and changes in the dose rate were measured with and without an apron. For accurate measurement, the distance from patients or sources was fixed at 1 m. The same method was applied to a 99mTc source to compare the dose reduction from the apron. Results: 1) With only the L-block in the 18F-FDG dispensing room, the average dose rate was 0.32 μSv; with L-block plus apron, it was 0.23 μSv, a difference of 0.09 μSv, or 26%. 2) Without an apron in the waiting room, the average dose rate was 33.1 μSv; with an apron, it was 22.3 μSv, a difference of 10.8 μSv, or 33%. 3) Without an apron in the PET/CT room, the average dose rate was 6.9 μSv; with an apron, it was 5.5 μSv, a difference of 1.4 μSv, or 25%. 4) Without an apron, the average dose rate for 99mTc was 23.7 μSv; with an apron, it was 5.5 μSv, a difference of 18.2 μSv, or 77%. Conclusion: According to the experimental results, 99mTc injected into patients showed an average shielding rate of 77%, while 18F-FDG showed a relatively low rate of 27%. Comparing sources only, 18F-FDG showed a shielding rate of 17% versus 77% for 99mTc. Although the shielding effect was lower than for 99mTc, the apron did provide a shielding effect for 18F-FDG. Therefore, wearing an apron appropriate for high-energy emitters such as 18F-FDG should minimize the exposure dose of radiation workers.
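The shielding rates quoted above follow from a simple relative dose-rate comparison. A minimal sketch (the function name is mine; the measured values are taken from the abstract):

```python
def shielding_rate(unshielded, shielded):
    """Percentage reduction in dose rate provided by shielding:
    (unshielded - shielded) / unshielded * 100."""
    return (unshielded - shielded) / unshielded * 100.0

# Patient waiting room, 18F-FDG: 33.1 -> 22.3 uSv
waiting_room = shielding_rate(33.1, 22.3)   # ~33%
# 99mTc source: 23.7 -> 5.5 uSv
tc99m = shielding_rate(23.7, 5.5)           # ~77%
```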


Comparison of Activity Capacity Change and GFR Value Change According to Matrix Size during 99mTc-DTPA Renal Dynamic Scan (99mTc-DTPA 신장 동적 검사(Renal Dynamic Scan) 시 동위원소 용량 변화와 Matrix Size 변경에 따른 사구체 여과율(Glomerular Filtration Rate, GFR) 수치 변화 비교)

  • Kim, Hyeon;Do, Yong-Ho;Kim, Jae-Il;Choi, Hyeon-Jun;Woo, Jae-Ryong;Bak, Chan-Rok;Ha, Tae-Hwan
    • The Korean Journal of Nuclear Medicine Technology / v.24 no.1 / pp.27-32 / 2020
  • Purpose: The glomerular filtration rate (GFR) is an important indicator for evaluating renal function and monitoring the progress of renal disease. Measuring GFR clinically by using the serum creatinine value together with a 99mTc-DTPA (diethylenetriamine pentaacetic acid) renal dynamic scan remains useful. Since the Gates formula was published, GFR has been measured with a gamma camera when a 99mTc-DTPA renal dynamic scan is taken. The purpose of this paper is to measure GFR by applying the Gates formula and to examine how activity and matrix size affect the result. Materials and Methods: Data from 5 adult patients (age 62 ± 5; 3 males, 2 females) who had undergone a 99mTc-DTPA renal dynamic scan were analyzed. A dynamic image was obtained for 21 minutes after bolus injection of 15 mCi of 99mTc-DTPA into the patient's vein. To evaluate the GFR according to changes in activity and matrix size, total counts were measured in regions of interest over both kidneys and background tissue at 2-3 minutes. The distance from the detector to the table was kept at 30 cm; the pre-syringe (PR) activity was set to 15, 20, 25, and 30 mCi and the post-syringe (PO) activity to 1, 5, 10, and 15 mCi to evaluate the effect of activity change. The matrix size was then changed to 32 × 32, 64 × 64, 128 × 128, 256 × 256, 512 × 512, and 1024 × 1024 to compare the resulting values. Results: As the activity increased for a given matrix size, the difference in GFR gradually decreased from a maximum of 52.95% to a minimum of 16.67%. The GFR differences with changes in matrix size were similar, at 2.4%, 0.2%, and 0.2%, when changing from 128 to 256, 256 to 512, and 512 to 1024, respectively, but were 54.3% when changing from 32 to 64 and 39.43% when changing from 64 to 128. Finally, relative to the currently used protocol of 256 × 256 with PR 15 mCi and PO 1 mCi, the largest GFR difference was 82%, under the PR 15 mCi and PO 1 mCi conditions, and the smallest was 0.2%, under the PR 30 mCi and PO 15 mCi conditions. Conclusion: This paper confirms that when GFR is measured using the Gates method in a 99mTc-DTPA renal dynamic scan, the result is affected by changes in activity and matrix size. Therefore, each hospital should take care to apply appropriate parameters when calculating GFR from a 99mTc-DTPA renal dynamic scan.
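The Gates method the abstract relies on estimates GFR from the camera-measured renal uptake fraction. The sketch below uses the commonly published form of Gates' regression (constants 9.8127 and 6.82519, soft-tissue attenuation coefficient μ ≈ 0.153/cm for 99mTc); the count and depth values are invented for illustration and are not from the paper.

```python
import math

def gates_gfr(rk_counts, lk_counts, rk_bg, lk_bg,
              pre_counts, post_counts, depth_r_cm, depth_l_cm,
              mu_per_cm=0.153):
    """Camera-based GFR (ml/min) via Gates' regression.

    Background-corrected kidney counts are depth-corrected with an
    exponential attenuation factor, summed, and divided by the net
    injected counts (pre-injection minus post-injection syringe counts)
    to give the percent renal uptake; Gates' regression then maps
    uptake to GFR.
    """
    net_injected = pre_counts - post_counts
    uptake_pct = 100.0 * (
        (rk_counts - rk_bg) / math.exp(-mu_per_cm * depth_r_cm)
        + (lk_counts - lk_bg) / math.exp(-mu_per_cm * depth_l_cm)
    ) / net_injected
    return uptake_pct * 9.8127 - 6.82519

# Invented counts for illustration only
gfr = gates_gfr(rk_counts=2000, lk_counts=2000, rk_bg=500, lk_bg=500,
                pre_counts=100_000, post_counts=5_000,
                depth_r_cm=4.0, depth_l_cm=4.0)
```

This makes the paper's point concrete: the pre/post syringe counts (PR/PO activity) enter the denominator, and the ROI counts depend on the matrix size, so both directly shift the computed GFR.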

Thinking in Terms of East-West Contacts through Spreading Process of Sarmathia-Pattened Scabbard on Tillya-Tepe Site in Afghanistan (아프가니스탄 틸랴 테페의 사르마티아(Sarmathia)식 검집 패용 방식의 전개 과정으로 본 동서교섭)

  • Lee, Song Ran
    • Korean Journal of Heritage: History & Science / v.45 no.4 / pp.54-73 / 2012
  • In this article, we examined, if in a modest measure, the patterns of Sarmathian activity, focusing on the regions where the Sarmathian scabbards spread. One of the main weapons that mounted nomads such as the Scythians, Sarmathians, and Alans used in war was the spear. Though complementary, the sword was the most convenient and appropriate weapon for fighting at close range after falling from the horse to the ground. Sarmathian swords continued the tradition of the akinakes used by the Scythians and Persians, but showed advances in the ease with which the sword could be drawn from its scabbard and in the way the scabbard was worn on the body. The Sarmathian scabbards, designed so the sword could be drawn easily by strapping the scabbard to the thigh through four bumps, turn out to have spread extensively from Pazyryk in the Altai to South Siberia, Bactria, Parthia, and Rome. The most noteworthy of all the Sarmathian scabbards are those excavated from the fourth tomb at Tillya-tepe, Afghanistan, in the region of Bactria. The owner of the fourth tomb of Tillya-tepe, whose region was under the control of the Kushan Dynasty at the time, was buried wearing Sarmathian swords and is regarded as a powerful figure in Bactria, which was likewise under Kushan governance. The fact that the owner of the tomb wore two swords suggests that there had been active exchange between Bactria and Sarmathia. The reason the Sarmathians could play an important role in exchange between East and West may have had something to do with their role in supplying Chinese goods to the Silk Road. That is why we are interested in how Han Dynasty bronze mirrors, decorative beads such as melon-shaped beads, crystal beads, and gold-ring articulated beads, and other artifacts of silk-producing South China came to be excavated along the northern steppe route where the Sarmathians were active. Our study has established that the eye beads discovered in a Sarmathian tomb estimated to date to around the 1st century B.C. were reprocessed in China and then imported back to Sarmathia. We should note the Huns as intermediaries between the Sarmathians and South China, which were far apart; gold-ring articulated beads, spread mainly across South China, have been discovered in Hunnic remains. On the other hand, between the 2nd century B.C. and the 2nd century A.D., the main period of the Sarmathians, the traffic route connecting the steppe route and South China was probably the Southwest Silk Road, which started from Yunnan, passed through Myanmar, Pakistan, and Afghanistan, and entered eastern India. The Southwest Silk Road is presumed to have been used by nomadic tribes who wanted goods from South China before the Oasis route was activated by the Han Dynasty's policy of managing the countries bordering western China.

DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference / 1995.02a / pp.101-113 / 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network. This will provide the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement. Consequently, forecasts of truck traffic are critical to pavement management systems. The pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification and rehabilitation recommendation. Without a r.easonable truck traffic forecasting methodology, PMDSS is not able to project pavement performance trends in order to make assessment and recommendations in the future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the Origin-Destination survey data avaiiable from WisDOT, including two stateline areas, one county, and five cities, are analyzed and the zone-to'||'&'||'not;zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (00 TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types, Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). ~oth "macro-scale" calibration and "micro-scale" calibration are performed. 
The comparison of the statewide GM TLF with the 00 TLF for the macro-scale calibration does not provide suitable results because the available 00 survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the 00 survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the 00 TLF. Three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM. calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the 00 TLFs from the calibration data base. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple mction factor curves for each trip type was not warranted. In the traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. However, for this research, the information available for the development .of the GM model is limited to Ground Counts (GC) and a limited set ofOD TLFs. The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions .. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM. 
The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link". Selected link based analyses are conducted by using both 16 selected links and 32 selected links. The SELINK analysis using 32 selected links provides the least %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. More importantly, SELINK adjustment factors for all of the zones can be computed. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted by using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used for evaluation of the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. Link volume to ground count (LV/GC) ratios of 0.958 by using 32 selected links and 1.001 by using 16 selected links are obtained.
The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of %RMSE for the four screenlines resulting from the fourth and last GM run by using 32 and 16 selected links is 22% and 31% respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves of 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible by using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not show any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the least value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups. Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results. No specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As for the screenline and volume range analyses, the %RMSE is inversely related to average link volume.
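The evaluation statistic used throughout these analyses, %RMSE, is typically the root-mean-square error expressed as a percentage of the mean observed volume; a minimal sketch under that assumption follows, with made-up check-point volumes.

```python
# Illustrative %RMSE computation (assumed definition: RMSE of estimated vs.
# observed link volumes, normalized by the mean observed volume).
# The check-point volumes are hypothetical, not from the report.
import math

def pct_rmse(estimated, observed):
    n = len(observed)
    rmse = math.sqrt(sum((e - o) ** 2 for e, o in zip(estimated, observed)) / n)
    return 100.0 * rmse / (sum(observed) / n)

est = [950.0, 1100.0, 2050.0]   # assigned model volumes (hypothetical)
obs = [1000.0, 1000.0, 2000.0]  # ground counts (hypothetical)
```

Because the denominator is the average observed volume, groups of low-volume links naturally show larger %RMSE, which is consistent with the inverse relationship reported above.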
The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production rate (total adjusted productions/total population) and a new trip attraction rate. Revised zonal production and attraction adjustment factors can then be developed that reflect only the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond just population are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not available currently, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8, while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable.
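The revision step above can be sketched as simple arithmetic: a new statewide trip rate from the adjusted totals, then a per-zone factor relative to the production implied by that rate. The county names, populations, and production totals below are hypothetical stand-ins.

```python
# Sketch of revising the in-state trip generation model after SELINK
# adjustment: new rate = total adjusted productions / total population,
# then a revised per-zone adjustment factor. All numbers are hypothetical.

zones = {
    "Brown":    {"population": 200_000, "adjusted_productions": 5_200.0},
    "Marathon": {"population": 120_000, "adjusted_productions": 2_100.0},
    "Iowa":     {"population": 25_000,  "adjusted_productions": 300.0},
}

total_pop = sum(z["population"] for z in zones.values())
total_prod = sum(z["adjusted_productions"] for z in zones.values())
trip_rate = total_prod / total_pop  # trips per person, statewide

# Revised adjustment factor: the zone's adjusted productions relative to
# the productions implied by applying the new statewide rate to its population.
revised_factor = {
    name: z["adjusted_productions"] / (trip_rate * z["population"])
    for name, z in zones.items()
}
```

Zones with factors well above or below 1.0 are the ones whose truck activity population alone cannot explain, which motivates the additional employment and income variables suggested above.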
The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH, which implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%. Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic, while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class, and travel speed, provide useful information for highway planners to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted by using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results using the ground count-based segment adjustment factors are satisfactory for long range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes.
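The 18.3% figure quoted above follows directly from the two VMT totals; a quick arithmetic check:

```python
# Check of the VMT comparison: GM forecast of 2.975 billion heavy-truck VMT
# versus the WisDOT estimate of 3.642 billion (both figures from the text).

gm_vmt = 2.975       # billions of VMT, GM truck forecasting model
wisdot_vmt = 3.642   # billions of VMT, WisDOT link-volume sampling

shortfall_pct = 100.0 * (wisdot_vmt - gm_vmt) / wisdot_vmt
# About 18.3%, matching the reported difference.
```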
The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
