• Title/Summary/Keyword: large scale model test

Search Result 417

Experimental Study on the Generating Mechanism of Ground Subsidence Due to Damaged Water Supply Pipe (상수관로 파손으로 인한 지반함몰 발생메카니즘에 관한 실험적 연구)

  • Kim, Youngho;Kim, Joo-Bong;Kim, Dowon;Han, Jung-Geun
    • Journal of the Korean Geosynthetics Society
    • /
    • v.16 no.2
    • /
    • pp.139-148
    • /
    • 2017
  • Ground subsidence caused by damaged water supply and sewer pipes has recently been increasing in many cities due to the aging of cities and pipelines. Although many recent studies have verified the characteristics of ground subsidence due to sewer pipe breakdown, research on ground subsidence due to water supply pipes is scarce. This study aims to identify the ground failure mechanism caused by water and sewer pipe breakdown. Accordingly, we conducted an indoor model experiment to verify the characteristics of ground subsidence, considering the characteristics of the ground and of ground failure. The water pipe pressure and velocity head were considered to find out the ground subsidence mechanism. A comparative analysis was also conducted on relative density and fine-grain content, considering the embedded condition of the water pipe. When the relative density and seepage pressure are low, small-scale ground subsidence can occur; under the opposite conditions, ground subsidence occurs on a large scale and expands to ground level over time. Furthermore, a ground cavity formed after soil runs off due to seepage deep in the ground maintains steady strength and stays below ground level for a long period.

The way to make training data for deep learning model to recognize keywords in product catalog image at E-commerce (온라인 쇼핑몰에서 상품 설명 이미지 내의 키워드 인식을 위한 딥러닝 훈련 데이터 자동 생성 방안)

  • Kim, Kitae;Oh, Wonseok;Lim, Geunwon;Cha, Eunwoo;Shin, Minyoung;Kim, Jongwoo
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.1
    • /
    • pp.1-23
    • /
    • 2018
  • Since the 21st century, various high-quality services have emerged with the growth of the internet and information and communication technologies. In particular, the E-commerce industry, in which Amazon and eBay stand out, is growing explosively. As E-commerce grows, customers can easily find what they want to buy while comparing various products, because more products have been registered at online shopping malls. However, a problem has arisen with this growth: with so many products registered, it has become difficult for customers to find what they really need in the flood of products. When customers search with a generalized keyword, too many products come up; on the contrary, few products are found if customers type in product details, because concrete product attributes are rarely registered. In this situation, automatically recognizing text in images can be a solution. Because the bulk of product details are written in catalogs in image format, most product information cannot be searched with text inputs in the current text-based searching system. If the information in images can be converted to text, customers can search products by product details, which makes shopping more convenient. Various existing OCR (Optical Character Recognition) programs can recognize text in images, but they are hard to apply to catalogs because they fail in certain circumstances, for example when the text is not big enough or the fonts are not consistent. Therefore, this research suggests a way to recognize keywords in catalogs with deep learning, the state of the art in image recognition since the 2010s.
Single Shot Multibox Detector (SSD), a well-credited model for object detection, can be used with its structure re-designed to take into account the differences between text and objects. However, the SSD model needs a large amount of labeled training data, because, like other deep learning algorithms, it is trained by supervised learning. To collect data, one can try manually labeling the location and class of texts in catalogs, but manual collection raises many problems: some keywords would be missed because humans make mistakes while labeling, it is too time-consuming given the scale of data needed, and it is costly if many workers are hired to shorten the time. Furthermore, if specific keywords need to be trained, finding images that contain those words is also difficult. To solve the data issue, this research developed a program which creates training data automatically. The program makes catalog-like images containing various keywords and pictures and saves the location information of the keywords at the same time. With this program, not only can data be collected efficiently, but the performance of the SSD model also improves: the SSD model recorded an 81.99% recognition rate with 20,000 images created by the program. Moreover, this research tested the efficiency of the SSD model under different data conditions to analyze which features of the data influence text-recognition performance. As a result, the number of labeled keywords, the addition of overlapping keyword labels, the existence of unlabeled keywords, the spaces among keywords, and the differences in background images were found to be related to the performance of the SSD model. This test can lead to performance improvements of the SSD model, or of other deep-learning-based text recognizers, through high-quality data.
The SSD model re-designed to recognize text in images and the program developed for creating training data are expected to contribute to the improvement of search systems in E-commerce: suppliers can spend less time registering keywords for products, and customers can search products by the product details written in the catalog.
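The core of such automatic generation is placing keywords at known positions so the box labels come for free. A minimal sketch of the labeling step, not the authors' actual program: keywords are dropped at random non-overlapping positions on a virtual canvas and each placement is recorded as an SSD-style bounding-box label (the canvas size, fixed font metrics, and function names are illustrative assumptions).

```python
import random

def generate_labels(keywords, canvas_w=600, canvas_h=800,
                    font_w=12, font_h=20, seed=0):
    """Place keywords at random non-overlapping positions on a virtual
    canvas and return SSD-style labels: (keyword, x_min, y_min, x_max, y_max)."""
    rng = random.Random(seed)
    labels, boxes = [], []
    for kw in keywords:
        w, h = font_w * len(kw), font_h   # crude monospace text extent
        for _ in range(100):              # retry until a free spot is found
            x = rng.randrange(0, canvas_w - w)
            y = rng.randrange(0, canvas_h - h)
            box = (x, y, x + w, y + h)
            if all(not overlap(box, b) for b in boxes):
                boxes.append(box)
                labels.append((kw, *box))
                break
    return labels

def overlap(a, b):
    """Axis-aligned box intersection test."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])
```

In a real pipeline the same coordinates would also drive the text rendering onto a background catalog image, so image and label never disagree.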

Preliminary Study on the Enhancement of Reconstruction Speed for Emission Computed Tomography Using Parallel Processing (병렬 연산을 이용한 방출 단층 영상의 재구성 속도향상 기초연구)

  • Park, Min-Jae;Lee, Jae-Sung;Kim, Soo-Mee;Kang, Ji-Yeon;Lee, Dong-Soo;Park, Kwang-Suk
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.43 no.5
    • /
    • pp.443-450
    • /
    • 2009
  • Purpose: Conventional image reconstruction uses simplified physical models of projection. Realistic models, for example for 3D reconstruction, take too long to process all the data in the clinic and cannot run on a common reconstruction machine because complex physical models require large amounts of memory. We propose a realistic distributed-memory model for fast reconstruction using parallel processing on personal computers to enable large-scale reconstruction. Materials and Methods: Preliminary feasibility tests on virtual machines and various performance tests on the commercial supercomputer Tachyon were performed. The expectation-maximization algorithm with a common 2D projection and a realistic 3D line of response was tested. Since processing became slower (up to 6 times) after a certain iteration, compiler optimization was performed to maximize the efficiency of parallelization. Results: Parallel processing of a program on multiple computers was available on Linux with MPICH and NFS. We verified that differences between the parallel-processed image and the single-processed image at the same iterations were below the significant digits of floating-point numbers, about 6 bits. Two processors showed good parallel-computing efficiency (1.96 times). The delay phenomenon was solved by vectorization using SSE. Conclusion: Through this study, a realistic parallel computing system for the clinic was established, able to reconstruct with ample memory using realistic physical models that could not be simplified.
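The expectation-maximization reconstruction referred to above follows the standard MLEM update; a minimal single-machine sketch of that update is shown below (the matrix form and variable names are illustrative, and the distributed version would split the projection rows of the system matrix across MPI ranks and sum the backprojections):

```python
import numpy as np

def mlem(A, y, n_iter=100):
    """MLEM update: x <- x / (A^T 1) * A^T (y / (A x)).
    A: system matrix (projection bins x voxels), y: measured projection data."""
    x = np.ones(A.shape[1])                 # uniform initial image
    sens = A.T @ np.ones(A.shape[0])        # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                        # forward projection
        ratio = y / np.clip(proj, 1e-12, None)
        x = x / sens * (A.T @ ratio)        # backproject ratio and update
    return x
```

Because the backprojection `A.T @ ratio` is a sum over projection bins, each worker can hold only its share of the rows of `A` and `y`, with a single all-reduce per iteration combining the partial sums.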

Geotechnical Engineering Progress with the Incheon Bridge Project

  • Cho, Sung-Min
    • Proceedings of the Korean Geotechnical Society Conference
    • /
    • 2009.09a
    • /
    • pp.133-144
    • /
    • 2009
  • Incheon Bridge, an 18.4 km long sea-crossing bridge, will be opened to traffic in October 2009; it will be a new landmark of gearing-up northeast Asia as well as the largest and longest bridge in Korea. Incheon Bridge is an integrated set of several special-featured bridges, including a magnificent cable-stayed girder bridge with an 800 m main span crossing the navigation channel in and out of the Port of Incheon. Incheon Bridge marks an epoch in long-span bridge design thanks to the full application of the AASHTO LRFD (load & resistance factor design) to both the superstructures and the substructures. A state of the art of the geotechnologies applied to the Incheon Bridge construction project is introduced. Large-diameter drilled shafts were penetrated into the bedrock to support the colossal superstructures. The bearing capacity and deformational characteristics of the foundations were verified through the world's largest static pile load test: 8 full-scale pilot piles were tested at both the offshore site and the onshore area prior to the commencement of construction. A compressive load beyond 30,000 tonf was applied to a single 3 m diameter foundation pile by means of the bi-directional loading method, including the Osterberg cell technique. Detailed site investigation to characterize the subsurface properties had been carried out. Geotextile tubes, tied sheet pile walls, and trestles were utilized to overcome the very large tidal difference between ebb and flow at the foreshore site. 44 circular-cell type dolphins surround the piers near the navigation channel to protect the bridge against collision with aberrant vessels. Each dolphin structure consists of a flat-sheet piled wall and infilled aggregates to absorb the collision impact. Geo-centrifugal tests were performed to evaluate the behavior of the dolphins in the seabed and to verify the numerical model for the design.
Rip-rap embankments on the seabed are expected to prevent scouring of the foundation. Prefabricated vertical drains, sand compaction piles, deep cement mixing, horizontal natural-fiber drains, and other subsidiary methods were used to improve the soft ground at the sites of the abutments, toll plazas, and access roads. Light-weight backfill using EPS blocks helps to reduce the earth pressure behind the abutment on the soft ground. Some kinds of reinforced earth, such as MSE using geosynthetics, were utilized for the ring wall of the abutment. Soil-steel bridges made of corrugated steel plates and engineered backfills were constructed for the open-cut tunnel and the culvert. Diverse experiences of advanced design and construction from the Incheon Bridge project have been propagated by the engineers involved, and it is strongly expected that the significant achievements in geotechnical engineering through this project will contribute remarkably to the national development of long-span bridge technologies.


DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • Park, Man-Bae
    • Proceedings of the KOR-KST Conference
    • /
    • 1995.02a
    • /
    • pp.101-113
    • /
    • 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network. This will provide the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement. Consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS is not able to project pavement performance trends in order to make assessments and recommendations in the future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the Origin-Destination survey data available from WisDOT, including two stateline areas, one county, and five cities, are analyzed and the zone-to-zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (OD TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types, Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" calibration and "micro-scale" calibration are performed.
The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results because the available OD survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the OD survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. Three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration data base. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In the traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. However, for this research, the information available for the development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs. The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM.
The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link". Selected link based analyses are conducted by using both 16 selected links and 32 selected links. The result of SELINK analysis by using 32 selected links provides the least %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. But more importantly, SELINK adjustment factors for all of the zones can be computed. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted by using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used for evaluation of the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. LV/GC ratios of 0.958 by using 32 selected links and 1.001 by using 16 selected links are obtained.
The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of %RMSE for the four screenlines resulting from the fourth and last GM run by using 32 and 16 selected links is 22% and 31% respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves of 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible by using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not provide any specific pattern of over or underestimate. However, the %RMSE for the ISH shows the least value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups. Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results. No specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As for the screenline and volume range analyses, the %RMSE is inversely related to average link volume.
The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production trip rate (total adjusted productions/total population) and a new trip attraction trip rate. Revised zonal production and attraction adjustment factors can then be developed that only reflect the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond just population are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not available currently, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8 while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable.
The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH. This implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%. Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class, and travel speed, are useful information for highway planners to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted by using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results by using the ground count-based segment adjustment factors are satisfactory for long range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes.
The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
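The two core computations of this methodology, gravity-model trip distribution with friction factors and the SELINK link adjustment, can be sketched as follows. This is a hedged illustration only; the zone IDs and flat dictionary data structures are assumptions, not the study's actual implementation.

```python
def gravity_model(productions, attractions, friction):
    """Distribute trips: T_ij = P_i * A_j * F_ij / sum_k(A_k * F_ik).
    productions/attractions: {zone: value}; friction: {(i, j): F_ij}."""
    trips = {}
    for i, p in productions.items():
        denom = sum(attractions[k] * friction[(i, k)] for k in attractions)
        for j, a in attractions.items():
            trips[(i, j)] = p * a * friction[(i, j)] / denom
    return trips

def selink_adjust(trips, uses_link, ground_count, assigned_volume):
    """Scale the trips that traverse a selected link by the link adjustment
    factor: actual (counted) volume / total assigned volume."""
    f = ground_count / assigned_volume
    return {od: t * f if od in uses_link else t for od, t in trips.items()}
```

In the study, this adjustment is iterated (up to four SELINK repetitions) over 16 or 32 selected links, and the scaled trip ends feed back into revised zonal productions and attractions.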


Comparison of Forest Carbon Stocks Estimation Methods Using Forest Type Map and Landsat TM Satellite Imagery (임상도와 Landsat TM 위성영상을 이용한 산림탄소저장량 추정 방법 비교 연구)

  • Kim, Kyoung-Min;Lee, Jung-Bin;Jung, Jaehoon
    • Korean Journal of Remote Sensing
    • /
    • v.31 no.5
    • /
    • pp.449-459
    • /
    • 2015
  • The conventional National Forest Inventory (NFI)-based forest carbon stock estimation method is suitable for national-scale estimation, but not for regional-scale estimation, due to the lack of NFI plots. In this study, for the purpose of regional-scale carbon stock estimation, we created grid-based forest carbon stock maps using spatial ancillary data and two types of up-scaling methods. Chungnam province was chosen as the study area, for which the 5th NFI (2006~2009) data were collected. The first method (method 1) uses the forest type map as ancillary data and a regression model for forest carbon stock estimation, whereas the second method (method 2) uses satellite imagery and the k-Nearest Neighbor (k-NN) algorithm. Additionally, in order to consider uncertainty effects, the final AGB carbon stock maps were generated by performing 200 iterations of a Monte Carlo simulation. As a result, compared to the NFI-based estimate (21,136,911 tonC), the total carbon stock was over-estimated by method 1 (22,948,151 tonC) but under-estimated by method 2 (19,750,315 tonC). In a paired T-test with 186 independent data, the average carbon stock estimate of the NFI-based method was statistically different from method 2 (p<0.01), but not from method 1 (p>0.01). In particular, by means of the Monte Carlo simulation, it was found that the smoothing effect of the k-NN algorithm and the mis-registration error between NFI plots and the satellite image can lead to large uncertainty in carbon stock estimation. Although method 1 was found suitable for carbon stock estimation of the heterogeneous forest stands in Korea, a satellite-based method is still in demand to provide periodic estimates for un-investigated, large forest areas. In these respects, future work will focus on the spatial and temporal extent of the study area and robust carbon stock estimation with various satellite images and estimation methods.
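The k-NN up-scaling used in method 2 can be sketched as follows: each target pixel's carbon stock is estimated as the mean carbon of the k spectrally nearest NFI plots. This is a minimal illustration; the band values, Euclidean distance metric, and unweighted mean are assumptions rather than the study's exact configuration.

```python
import numpy as np

def knn_carbon(pixels, plot_bands, plot_carbon, k=5):
    """For each pixel spectrum, average the carbon of the k NFI plots
    nearest in spectral (band) space.
    pixels: (n, bands); plot_bands: (m, bands); plot_carbon: (m,)."""
    estimates = []
    for px in pixels:
        dist = np.linalg.norm(plot_bands - px, axis=1)  # distance to every plot
        nearest = np.argsort(dist)[:k]                  # indices of k nearest plots
        estimates.append(plot_carbon[nearest].mean())
    return np.array(estimates)
```

The averaging over the k neighbors is also the source of the smoothing effect noted above: extreme plot values are pulled toward the local mean, which the Monte Carlo analysis flags as a contributor to uncertainty.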

Development of the Automatic Fishing System for the Anchovy Scoop nets (I) - The hydraulic winder device for the boom control - (멸치초망 어업의 조업자동화 시스템 개발 (I) -챗대 조작용 유압 권양기 개발-)

  • Park, Seong-Uk;Bae, Bong-Seong;Seo, Du-Ok
    • Journal of the Korean Society of Fisheries and Ocean Technology
    • /
    • v.36 no.3
    • /
    • pp.166-174
    • /
    • 2000
  • Anchovy (Engraulis japonica) scoop nets are used in the coastal waters of southern Korea and Cheju. In Cheju in particular, the fishing gear consists of an upper boom, a lower boom, a pressing stick, and a bag net. The nets are operated by fishing boats of the 6 to 10 ton class with 8 persons on board. The booms are controlled by a side drum, and the net and pressing stick are hauled by human power alone during operation. This fishery therefore requires heavy labor and carries much risk. Three kinds of hydraulic winding devices that control the two booms were designed and manufactured to reduce the heavy labor of scoop-net fishing, and sea trials were carried out to test their performance using commercial fishing boats of the 6 ton class. The proper capacities of the hydraulic pump and motor were determined by a 1/5-scale model test of the boom. The results obtained are as follows. 1. The tension of the boom being drawn was strongest, 187.5 kgf, when the boom's end was at a depth of 4 m under the water. 2. The hydraulic motor of the fittest winder had the least leakage per unit time among the kinds tested. 3. In the best of the several winder devices, when the pressure difference was fixed at 130 kg/cm$^2$ for safe fishery, the winding velocity of the boom line was 2 m/sec, 0.48 m/sec faster than the traditional fishing method, and this winder can catch 1.6 tons of anchovy. 4. As a result, the crew was decreased from 8 to 6, and the problems of heavy human power and risk in the fishing operation were solved by using this winder.

  • PDF

High-Temperature Structural-Analysis Model of Process Heat Exchanger for Helium Gas Loop (I) (헬륨가스루프 시험용 공정열교환기에 대한 고온구조해석 모델링 (I))

  • Song, Kee-Nam;Lee, Heong-Yeon;Kim, Yong-Wan;Hong, Seong-Duk;Park, Hong-Yoon
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.34 no.9
    • /
    • pp.1241-1248
    • /
    • 2010
  • In large-scale production of hydrogen, a PHE (Process Heat Exchanger) is a key component because the heat required to carry out the Sulfur-Iodine chemical reaction that yields hydrogen is transferred from a VHTR (Very High Temperature Reactor) by the PHE. The Korea Atomic Energy Research Institute established a helium gas loop for conducting performance tests of components used in the VHTR. In this study, as a part of the high-temperature structural-integrity evaluation of a designed PHE prototype scheduled to be tested in the helium gas loop, we carried out high-temperature structural-analysis modeling, thermal analysis, and thermal-expansion analysis for the designed PHE prototype. An appropriate constraint condition is proposed at the ends of the in-flow and out-flow pipelines of the primary and secondary coolants, and the proposed constraint condition will be applied to the design of the performance-test loop setup for the designed PHE prototype.

Experimental study on the tunnel behavior induced by the excavation and the structure construction above existing tunnel (기존터널 상부지반 굴착 후 구조물 설치에 따른 터널거동에 관한 실험적 연구)

  • Cha, Seok-Kyu;Lee, Sangduk
    • Journal of Korean Tunnelling and Underground Space Association
    • /
    • v.20 no.3
    • /
    • pp.640-655
    • /
    • 2018
  • Recently, construction in urban areas has been increasing rapidly, and excavation work has frequently been performed above existing underground structures. Especially when a structure is constructed after excavation of the ground, the unloading and loading process in the ground beneath the excavation basement can affect the existing underground structures. Therefore, in order to maintain the stability of an existing underground structure during excavation, it is necessary to accurately grasp the influence of the excavation and the structure load in the adjoining area. In this study, the effect of ground excavation and a new structure load on an existing tunnel was experimentally simulated, and the influence of adjacent construction on the existing tunnel was investigated. For this purpose, a large test model at 1/5 scale of the actual size was manufactured. The influence of the ground excavation, the width of the load due to the new structure, and the distance between the centers of the tunnel and of the excavation was investigated. It was confirmed that the influence on the existing tunnel grows larger as the excavation depth gets deeper. At the same distance, the tunnel displacement increased up to three times as the building load width increased; that is, the load width influences the existing tunnel more than the excavation depth does. As for the impact of the distance between the centers of the tunnel and of the excavation, tunnel crown displacement decreased by 48%. The results showed that when a tunnel is located within 1D (D: tunnel diameter) of the center of excavation, the effect of the excavation is largest.

Antecedents and Consequences of Cooperation in Retail Voluntary Chain (소매점 볼런터리 체인 활성화의 선행요인과 결과)

  • Yi, Ho-Taek
    • Journal of Distribution Science
    • /
    • v.14 no.6
    • /
    • pp.65-73
    • /
    • 2016
  • Purpose - Recently, the management conditions of small independent retailers have been getting worse as large-scale marts and franchised convenience stores increase. The objective of this research is to find the antecedents and consequences of cooperation in a voluntary chain in order to enhance small independent retailers' competitiveness. Voluntary chains, also called affiliation or symbol groups or allied groups, hold a high market share in some European countries such as Italy, France, and Germany. Nevertheless, research on them from academic fields is still limited. Drawing on network theory, the author investigates the relationship between antecedent factors of voluntary chain cooperation, such as participation benefits, justice of compensation, and autonomy in the voluntary chain, and relationship-specific assets. The author also examines the relationships between relationship-specific assets and the cooperation of voluntary chain member shops, and between cooperation and its consequence factors, such as efficiency, group cohesiveness, and long-term relationship. Research design, data, and methodology - The author presents a conceptual framework integrating the major antecedents and consequences of voluntary chain cooperation. The data were collected from 174 independent small retailers who joined the K-voluntary chain, which consists of small independent retailers. In accordance with their status, each entrepreneur associated with the voluntary group can own one or more outlets and can be a part of the life and the decision-making process of the group. This participation is not based on company turnover or on the number of outlets, but on a one-member, one-vote system. To verify the research model and test the hypotheses, the author carefully investigated the reliability, content validity, convergent validity, and discriminant validity of the proposed model.
The data were analyzed using the SPSS 18.0 and AMOS structural equation modeling programs. Results - The results of this study are as follows. First, as antecedent variables, participation benefits and justice of compensation have a positive effect on the relationship-specific assets of voluntary chain members. Second, voluntary chain members' relationship-specific assets are also directly related to the level of their cooperation with the chain headquarters. Third, cooperation of voluntary chain member shops facilitates efficiency, group cohesiveness, and long-term relationship. Unexpectedly, autonomy in the voluntary chain has no effect on relationship-specific assets. Conclusions - This research offers several theoretical and practical implications to both marketing scholars and marketers. In terms of theoretical implications, this study applies network theory and its variables to explain the antecedent and consequence factors of cooperation in a voluntary chain. From the point of view of business management, above all, this study shows how to reinforce the competitiveness of a voluntary chain. Specifically, it is necessary for the voluntary chain headquarters to give a higher level of participation benefits and justice of compensation to its members. Second, the results also indicate the consequence factors of cooperation in a voluntary chain: to increase marketing efficiency, group cohesiveness, and long-term orientation in a retail voluntary chain, the chain headquarters needs to facilitate participants' cooperation.