• Title/Summary/Keyword: 모델 접근 방법 (model approach methods)


A Probabilistic Risk-based Cost Estimation Model for Initial-Stage Decision Making on Apartment Remodeling Projects (공동주택 리모델링 초기 단계 의사결정을 위한 확률론적 리스크 기반 비용 예측 모델 개발)

  • Lee, Dong-gun;Cha, Heesung
    • Korean Journal of Construction Engineering and Management / v.17 no.2 / pp.70-79 / 2016
  • The current remodeling cost estimation process not only depends on historical data from new building construction, but also has a poor linkage with risk-based estimation approaches. As a result, there is a high risk of falling short of the initial budget. To overcome this, a risk-based approach is necessary, one that provides a probabilistic estimate reflecting the potential risk factors of remodeling projects. In addition, the decision-making process should be linked with the risk-based estimation results instead of relying on intuitive and/or experience-based estimates. This study provides a probabilistic estimation process for residential remodeling projects by developing a detailed, step-by-step methodology. The proposed approach can support the go/no-go decision for remodeling projects by effectively reflecting the potential risk factors in the early stage of the project. In addition, the study enhances the reliability of the estimation results by developing a sustainable estimation process model in which a risk-based evaluation is accomplished by setting up a cost-risk relationship database structure.
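To make the probabilistic, risk-based estimation step concrete, the following is a minimal Monte Carlo sketch. The risk factors, their triangular distributions, and the base cost are illustrative assumptions, not values from the paper's cost-risk database.

```python
import numpy as np

# Hypothetical risk factors for a remodeling project, each given as a
# (low, mode, high) cost multiplier. The factors and ranges are illustrative;
# the paper derives them from a cost-risk relationship database.
RISK_FACTORS = {
    "structural_reinforcement": (0.98, 1.00, 1.12),
    "asbestos_removal":         (1.00, 1.02, 1.08),
    "design_change":            (0.97, 1.00, 1.15),
}

def simulate_total_cost(base_cost, n_trials=10_000, seed=0):
    """Monte Carlo simulation of total remodeling cost under risk factors."""
    rng = np.random.default_rng(seed)
    total = np.full(n_trials, base_cost, dtype=float)
    for low, mode, high in RISK_FACTORS.values():
        total *= rng.triangular(low, mode, high, size=n_trials)
    return total

if __name__ == "__main__":
    base = 5_000_000_000  # KRW, illustrative initial budget
    costs = simulate_total_cost(base_cost=base)
    p50, p80 = np.percentile(costs, [50, 80])
    print(f"P50 estimate: {p50:,.0f} KRW")
    print(f"P80 estimate: {p80:,.0f} KRW (budget with 80% confidence)")
    print(f"Probability of exceeding the initial budget: {(costs > base).mean():.1%}")
```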

User Innovation Empowerment in Open Market Systems: A Case Study on Participatory Game Communities (오픈마켓 시스템에서의 사용자 혁신 위임: 참여적 게임 커뮤니티에 대한 사례연구)

  • Kwon, Hee-Jung;Kim, Jin-Woo
    • Information Systems Review / v.12 no.3 / pp.75-88 / 2010
  • Business models in open market systems targeting smartphone users are determined by several important factors. First, by providing developers with an efficient technical platform, the market gives developers a setting in which they can easily learn, apply, and improve skills related to the product category while remaining outside a corporate boundary. Second, given the first condition, a large population of talented developers comes to join a specific open market, which in turn invites more customers to use their applications; this attracts still more developer participants and ultimately drives persistent market growth. Third, the evaluation systems between platform providers and application producers, and between application producers and application users, underlie the trust relationships among them. The research conducted a multiple embedded case study to test the success factors of open-market-based business models, focusing on smartphone game communities that have installed user evaluation and feedback systems. The user innovation empowerment model within these social game networks highlights theories on the roles and characteristics of lead users, and on lead-user network behaviors for future NPD (new product development) participation.

Retrieval of Land Surface Temperature Using Landsat 8 Images with Deep Neural Networks (Landsat 8 영상을 이용한 심층신경망 기반의 지표면온도 산출)

  • Kim, Seoyeon;Lee, Soo-Jin;Lee, Yang-Won
    • Korean Journal of Remote Sensing / v.36 no.3 / pp.487-501 / 2020
  • As a viable option for retrieval of LST (Land Surface Temperature), this paper presents a DNN (Deep Neural Network) based approach using 148 Landsat 8 images over South Korea. Because the brightness temperature and emissivity for band 10 (approx. 11-㎛ wavelength) of Landsat 8 are derived by combining physics-based equations and empirical coefficients, they include uncertainties that depend on regional conditions such as meteorology, climate, topography, and vegetation. To overcome this, we used several land surface variables such as NDVI (Normalized Difference Vegetation Index), land cover types, and topographic factors (elevation, slope, aspect, and ruggedness), as well as the T0 calculated from the brightness temperature and emissivity. We optimized four seasonal DNN models using these input variables and in-situ observations from the ASOS (Automated Synoptic Observing System) to retrieve the LST, an advance over the existing method of bias correction using a linear equation. The validation statistics from 1,728 matchups during 2013-2019 showed good performance, with CC = 0.910~0.917 and RMSE = 3.245~3.365℃, especially for spring and fall. Our DNN models also produced stable LST estimates for all land cover types. Future work using big data from Landsat 5/7/8 with additional land surface variables will be necessary for a more reliable retrieval of LST from high-resolution satellite images.
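As a rough illustration of fitting a regression DNN to land-surface variables, the sketch below uses scikit-learn's MLPRegressor on synthetic data. The feature list, network architecture, and hyperparameters are assumptions; the paper's actual seasonal models and Landsat/ASOS matchups are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

# Illustrative feature set mirroring the abstract: T0, NDVI, land-cover code,
# and topographic factors. Column order and network size are assumptions.
FEATURES = ["T0", "NDVI", "landcover", "elevation", "slope", "aspect", "ruggedness"]

def train_seasonal_lst_model(X_train, y_train):
    """Fit one seasonal DNN that maps land-surface variables to in-situ LST."""
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 32, 16), activation="relu",
                     max_iter=2000, random_state=0),
    )
    model.fit(X_train, y_train)
    return model

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, len(FEATURES)))          # synthetic "matchups"
    y = 3 * X[:, 0] + X[:, 1] - 0.5 * X[:, 3] + rng.normal(scale=1.0, size=500)
    model = train_seasonal_lst_model(X[:400], y[:400])
    pred = model.predict(X[400:])
    rmse = np.sqrt(mean_squared_error(y[400:], pred))
    print(f"validation RMSE on synthetic data: {rmse:.2f}")
```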

The Study on Frameworks of Valuation Models for the Contents of Science and Technology Information (과학기술정보 콘텐츠의 가치평가모형 프레임워크 연구)

  • Sung, Tae-Eung;Jun, Seung-Pyo;Byun, Jeongeun;Park, Hyun-Woo
    • The Journal of the Korea Contents Association / v.16 no.11 / pp.421-433 / 2016
  • Although interest in the transfer and transaction of intangible assets has been increasing, there is no valuation model that objectively assesses the market value of knowledge and information contents such as electronic databases, and the need for related research has been raised. The present study therefore proposes valuation models that can serve as objective reference information in the market for intangible assets, by assessing the market value of science and technology information contents including patents, academic papers, and reports. First, we examine how to calculate cash flows by content type from the key variables used in current technology valuation, and for patents we propose valuation methods based on concepts applied in current technology valuation. Next, for papers and reports, in order to reflect the characteristics of these contents, we newly propose qualitative valuation methods that can be adjusted by technology innovation and market demand indices while estimating the economic life of the technology, and we present an input-cost-based method for calculating cash flows. Through this study, we establish frameworks by technology field and by applicable business model, such as copyright licensing and transactions of individual science and technology information contents, and we expect that more objective and reasonable assessment of content value will become possible.
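The income-approach idea of discounting content cash flows over an estimated economic life, adjusted by technology innovation and market demand indices, could look roughly like the sketch below. The way the indices enter the formula and all numeric values are illustrative assumptions rather than the paper's calibrated model.

```python
def content_value(annual_cash_flows, discount_rate,
                  tech_innovation_index=1.0, market_demand_index=1.0):
    """Discounted-cash-flow value of a science & technology information content.

    annual_cash_flows : expected net cash flows over the content's estimated
                        economic life (one entry per year).
    The two adjustment indices simply scale the cash flows here; their scales
    and functional form are illustrative assumptions, not the paper's model.
    """
    adjustment = tech_innovation_index * market_demand_index
    return sum(cf * adjustment / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(annual_cash_flows))

# Example: a report with a 4-year economic life and input-cost-based cash flows (KRW).
value = content_value([12_000_000, 10_000_000, 7_000_000, 4_000_000],
                      discount_rate=0.08,
                      tech_innovation_index=1.05,
                      market_demand_index=0.95)
print(f"estimated content value: {value:,.0f} KRW")
```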

Implementation of the Agent using Universal On-line Q-learning by Balancing Exploration and Exploitation in Reinforcement Learning (강화 학습에서의 탐색과 이용의 균형을 통한 범용적 온라인 Q-학습이 적용된 에이전트의 구현)

  • 박찬건;양성봉
    • Journal of KIISE: Software and Applications / v.30 no.7_8 / pp.672-680 / 2003
  • A shopbot is a software agent whose goal is to maximize buyer satisfaction by automatically gathering price and quality information of goods, as well as services, from on-line sellers. In response to shopbots' activities, sellers on the Internet need agents called pricebots that can help them maximize their own profits. In this paper we adopt Q-learning, one of the model-free reinforcement learning methods, as the price-setting algorithm of pricebots. A Q-learning agent increases profitability and eliminates cyclic price wars when compared with agents using the myoptimal (myopically optimal) pricing strategy. Q-learning needs to select a sequence of state-action pairs in order to converge. When state-action pairs are selected uniformly at random, the number of accesses to the Q-table needed to obtain the optimal Q-values is quite large, so this approach is not appropriate for universal on-line learning in a real-world environment. This occurs because uniform random selection ignores what has already been learned about the optimal policy. In this paper, we propose a Mixed Nonstationary Policy (MNP), which consists of an auxiliary Markov process and the original Markov process and which tries to keep a balance between exploration and exploitation in reinforcement learning. Our experimental results show that the Q-learning agent using MNP converges to the optimal Q-values about 2.6 times faster on average than uniform random selection.
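A minimal tabular Q-learning pricebot in the spirit of the abstract is sketched below. The price grid, reward function, and the decaying ε-greedy exploration schedule (used here as a simple stand-in for the paper's Mixed Nonstationary Policy, whose details are not given in the abstract) are all illustrative assumptions.

```python
import random
from collections import defaultdict

# Tabular Q-learning for a pricebot: states are the rival's last price,
# actions are the seller's own price levels. All values below are toy choices.
PRICES = [0.6, 0.7, 0.8, 0.9, 1.0]

def profit(my_price, rival_price):
    """Toy reward: undercutting (or matching) the rival wins the sale; margin is the profit."""
    return (my_price - 0.5) if my_price <= rival_price else 0.0

def q_learning(episodes=20_000, alpha=0.1, gamma=0.9, seed=0):
    random.seed(seed)
    Q = defaultdict(float)
    rival = random.choice(PRICES)
    for t in range(episodes):
        epsilon = max(0.05, 1.0 - t / episodes)            # decaying exploration
        state = rival
        if random.random() < epsilon:
            action = random.choice(PRICES)                  # explore
        else:
            action = max(PRICES, key=lambda a: Q[(state, a)])  # exploit
        reward = profit(action, rival)
        rival = random.choice(PRICES)                       # rival reprices (toy model)
        best_next = max(Q[(rival, a)] for a in PRICES)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
    return Q

Q = q_learning()
print({s: max(PRICES, key=lambda a: Q[(s, a)]) for s in PRICES})  # learned pricing policy
```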

A Cost-Efficient Job Scheduling Algorithm in Cloud Resource Broker with Scalable VM Allocation Scheme (클라우드 자원 브로커에서 확장성 있는 가상 머신 할당 기법을 이용한 비용 적응형 작업 스케쥴링 알고리즘)

  • Ren, Ye;Kim, Seong-Hwan;Kang, Dong-Ki;Kim, Byung-Sang;Youn, Chan-Hyun
    • KIPS Transactions on Software and Data Engineering / v.1 no.3 / pp.137-148 / 2012
  • Cloud service users request dedicated virtual computing resources from the cloud service provider so that their jobs are processed in an environment independent of other users. To automate and optimize this process, this paper proposes a framework for workflow scheduling in the cloud environment whose core component is a middleware broker that mediates the interaction between users and cloud service providers. To process jobs on on-demand, virtualized resources from cloud service providers, many papers propose scheduling algorithms that allocate jobs to virtual machines on a one-job-per-machine basis. With this method, isolation of the jobs being processed is guaranteed, but each resource cannot be used to its full computing capacity, so resource utilization is low. This paper therefore proposes a cost-efficient job scheduling algorithm that maximizes the utilization of managed resources by increasing the degree of multiprogramming, thereby reducing the number of virtual machines needed; consequently, the cost of processing requests can be reduced. We also consider the performance degradation of the proposed scheme caused by thrashing and context switching. The experimental results show that the proposed scheme has better cost-performance characteristics than an existing scheme.
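The core idea of raising the degree of multiprogramming instead of allocating one VM per job can be sketched as a greedy first-fit packing, as below. The per-VM job limit, hourly cost, and job list are illustrative assumptions, not parameters from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Greedy first-fit packing of jobs onto VMs with a bounded degree of
# multiprogramming. A stricter limit would reduce thrashing/context-switching
# overhead at the price of more VMs; the value 3 is purely illustrative.
MAX_JOBS_PER_VM = 3

@dataclass
class VM:
    hourly_cost: float = 0.10          # assumed price per VM-hour
    jobs: List[str] = field(default_factory=list)

def schedule(jobs):
    """Assign each job to the first VM with spare capacity; otherwise start a new VM."""
    vms: List[VM] = []
    for job in jobs:
        target = next((vm for vm in vms if len(vm.jobs) < MAX_JOBS_PER_VM), None)
        if target is None:
            target = VM()
            vms.append(target)
        target.jobs.append(job)
    return vms

vms = schedule([f"job-{i}" for i in range(10)])
print(f"{len(vms)} VMs (hourly cost {sum(vm.hourly_cost for vm in vms):.2f}) "
      f"instead of 10 one-job-per-VM instances")
for i, vm in enumerate(vms):
    print(f"VM{i}: {vm.jobs}")
```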

A Study of Visualizing Relational Information - In Mitologia Project - (관계형 정보의 시각화에 관한 연구 - 미톨로지아 프로젝트를 중심으로 -)

  • Jang, Seok-Hyun;Hwang, Hyo-Won;Lee, Kyung-Won
    • Journal of the HCI Society of Korea / v.1 no.1 / pp.73-80 / 2006
  • Mitologia visualizes the relations within information in a user-oriented way. Most information encountered in everyday life has invisible relations to other information. By analyzing the common characteristics and relations of information, we can not only measure the importance of each item but also grasp the overall properties of the information. Human relations in particular are a major concern of social network analysis, which offers several visualization methodologies derived from analyzing the relations of individuals in a society. We applied social network theory to grasp the relationships between characters in Greek mythology, which represents a bounded society. Current social network analysis tools, however, are limited in that they present information in a one-sided way because they ignore user-oriented design. Mitologia attempts to suggest a visual structure model that is more effective and easier to understand when analyzing data. We extracted connections among mythological characters by evaluating their classes, frequencies of appearance, and emotional links, and we improved users' understanding by furnishing appropriate interaction with the information. The initial interface offers four kinds of indexes that help users access character nodes easily, while a zoom-in function can be used for detailed relations. This zoom-in differs from usual filtering methods: it makes irrelevant information invisible so that users can find characters' relations more easily and quickly. The project suggests a layout that shows overall information relationships together with appropriate interactions for presenting detailed information.
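A small sketch of the underlying social-network representation, using networkx, is shown below. The characters, edge weights, and attributes are illustrative; Mitologia derives them from character classes, appearance frequencies, and emotional links in the myths.

```python
import networkx as nx

# Toy relationship graph for a few Greek-mythology characters.
G = nx.Graph()
G.add_node("Zeus", cls="god")
G.add_node("Hera", cls="god")
G.add_node("Heracles", cls="hero")
G.add_node("Perseus", cls="hero")
G.add_weighted_edges_from([
    ("Zeus", "Hera", 5.0),        # weight ~ co-appearance frequency (illustrative)
    ("Zeus", "Heracles", 3.0),
    ("Hera", "Heracles", 2.0),
    ("Zeus", "Perseus", 1.5),
])

# Importance of each character within the network (one possible measure).
centrality = nx.degree_centrality(G)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))

# A force-directed layout gives node positions for an overview visualization;
# a zoom-in view would redraw only the ego network of the selected character.
pos = nx.spring_layout(G, weight="weight", seed=42)
print("Zeus position in layout:", pos["Zeus"])
print("Zeus ego network:", list(nx.ego_graph(G, "Zeus").edges()))
```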


Modeling of Gas Permeability Coefficient for Cementitious Materials with Relation to Water Permeability Coefficient (시멘트계 재료의 기체 투기계수 해석 및 투수계수와의 상관성 연구)

  • Yoon, In-Seok
    • KSCE Journal of Civil and Environmental Engineering Research / v.36 no.2 / pp.207-217 / 2016
  • Permeability cannot be expressed as a function of porosity alone; it depends on the porosity, pore size and distribution, and tortuosity of the pore channels in concrete. There has been considerable interest in the relationship between microstructure and transport in cementitious materials; however, theoretical studies of the gas permeability coefficient in connection with carbonation of concrete, and of the effect of the volumetric fraction of cement paste or aggregate on the permeability coefficient, are very rare. Most studies have not addressed this issue in combination with carbonation, although carbonation can significantly affect the permeability coefficient of concrete. In this study, a fundamental approach to computing the gas permeability of (non-)carbonated concrete is suggested. For several cement paste compositions, the gas permeability coefficient was calculated with the analytical formulation, followed by a microstructure-based model. For carbonated concrete, the reduced porosity was calculated and used to compute the gas permeability coefficient. The calculations show that carbonation leads to a significant reduction of the gas permeability coefficient, and this effect is most evident for concrete with a high w/c ratio. Meanwhile, the relationship between gas permeability and water permeability is linear for cement paste on the basis of the Klinkenberg effect, but this does not hold for concrete. To support the modeling, the author's (YOON's) tests were performed and the results were compared with the model.
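For reference, the Klinkenberg relation that links apparent gas permeability to intrinsic (water) permeability through gas slippage can be written and evaluated as in the sketch below. The slip coefficient and permeability values are illustrative assumptions, not the paper's test results.

```python
def gas_permeability(k_liquid, mean_pressure, b):
    """Klinkenberg relation: apparent gas permeability exceeds the intrinsic
    (liquid/water) permeability because of gas slippage at pore walls.

        k_gas = k_liquid * (1 + b / P_mean)

    k_liquid      : intrinsic permeability (m^2)
    mean_pressure : mean gas pressure during the test (Pa)
    b             : Klinkenberg slip coefficient (Pa), material-dependent.
    """
    return k_liquid * (1.0 + b / mean_pressure)

k_w = 1.0e-18                      # assumed water permeability of a cement paste (m^2)
for p in (1.0e5, 2.0e5, 4.0e5):    # three illustrative mean test pressures
    print(f"P = {p:.0e} Pa -> k_gas = {gas_permeability(k_w, p, b=1.5e5):.2e} m^2")
```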

The Design of Optimal Filters in Vector-Quantized Subband Codecs (벡터양자화된 부대역 코덱에서 최적필터의 구현)

  • 지인호
    • The Journal of the Acoustical Society of Korea / v.19 no.1 / pp.97-102 / 2000
  • Subband coding divides the signal frequency band into a set of uncorrelated frequency bands by filtering and then encodes each subband using a bit allocation matched to the signal energy in that subband. The actual coding of the subband signals can be done using waveform encoding techniques such as PCM, DPCM, and vector quantization (VQ) in order to obtain higher data compression. Most researchers have focused on the error in the quantizer, but not on the overall reconstruction error and its dependence on the filter bank. This paper provides a thorough analysis of subband codecs and further develops optimum filter bank design using a vector quantizer. We compute the mean squared reconstruction error (MSE), which depends on N, the number of entries in each codebook, on k, the length of each codeword, and on the filter bank coefficients. We express this MSE measure in terms of an equivalent quantization model and find the optimum FIR filter coefficients for each channel in the M-band structure for a given bit rate, filter length, and input signal correlation model. Specific design examples are worked out for 4-tap filters in a 2-band paraunitary filter bank structure; the optimum paraunitary filter coefficients are obtained using Monte Carlo simulation. We expect that the results of this work will contribute to the study of the optimum design of subband codecs using vector quantization.
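A simplified version of the design loop, evaluating reconstruction MSE for a 2-band, 4-tap paraunitary filter bank and searching over its lattice angles, is sketched below. The scalar additive-noise quantization model (with an optimal-bit-allocation distortion rule) stands in for the paper's vector quantizer, and the AR(1) test signal and grid search are assumptions.

```python
import numpy as np

def lattice_filters(theta0, theta1):
    """4-tap, 2-band paraunitary (orthogonal) analysis filter pair from two lattice angles."""
    c0, s0 = np.cos(theta0), np.sin(theta0)
    c1, s1 = np.cos(theta1), np.sin(theta1)
    h0 = np.array([c0 * c1, s0 * c1, -s0 * s1, c0 * s1])    # lowpass analysis
    h1 = np.array([-c0 * s1, -s0 * s1, -s0 * c1, c0 * c1])  # highpass analysis
    return h0, h1

def reconstruction_mse(h0, h1, x, bits=4, rng=None):
    """Simulated reconstruction MSE; quantization modeled as additive subband noise
    whose variance follows the geometric-mean (optimal bit allocation) rule."""
    if rng is None:
        rng = np.random.default_rng(0)
    y0 = np.convolve(x, h0)[::2]                 # analysis filtering + decimation
    y1 = np.convolve(x, h1)[::2]
    noise_var = 2.0 ** (-2 * bits) * np.sqrt(np.var(y0) * np.var(y1))
    y0 = y0 + rng.normal(scale=np.sqrt(noise_var), size=y0.shape)
    y1 = y1 + rng.normal(scale=np.sqrt(noise_var), size=y1.shape)
    u0 = np.zeros(2 * len(y0)); u0[::2] = y0     # upsample by 2
    u1 = np.zeros(2 * len(y1)); u1[::2] = y1
    xhat = np.convolve(u0, h0[::-1]) + np.convolve(u1, h1[::-1])  # synthesis (time-reversed filters)
    d = len(h0) - 1                              # overall system delay
    return float(np.mean((x[: len(x) - d] - xhat[d: len(x)]) ** 2))

# Grid search over the two lattice angles for a correlated AR(1) input.
rng = np.random.default_rng(1)
x = np.zeros(4096)
for i in range(1, len(x)):
    x[i] = 0.95 * x[i - 1] + rng.normal()

best = min(((t0, t1, reconstruction_mse(*lattice_filters(t0, t1), x, rng=rng))
            for t0 in np.linspace(0, np.pi, 25)
            for t1 in np.linspace(0, np.pi, 25)),
           key=lambda t: t[2])
print(f"best angles: ({best[0]:.2f}, {best[1]:.2f}), MSE = {best[2]:.4g}")
```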


RSSI-based Location Determination via Segmentation-based Linear Spline Interpolation Method (분할기반의 선형 호 보간법에 의한 RSSI기반의 위치 인식)

  • Lau, Erin-Ee-Lin;Chung, Wan-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2007.10a / pp.473-476 / 2007
  • Location determination of a mobile user via the RSSI approach has recently received ample attention from researchers. However, it remains a challenging issue due to the complexities of RSSI signal propagation characteristics, which are easily exacerbated by the mobility of the user. Hence, a segmentation-based linear spline interpolation method is proposed to cater for the dynamic fluctuation pattern of the radio signal in complex environments. This optimization algorithm is proposed on top of the algorithm of the current radiolocation chip (CC2431, Chipcon, Norway), which runs on the IEEE 802.15.4 standard. The enhancement algorithm involves four phases. The first phase consists of a calibration model in which RSSI values at different static locations are collected and processed to obtain the mean and standard deviation for each predefined distance. An RSSI smoothing algorithm is proposed to minimize the dynamic fluctuation of the radio signal received from each reference node while the user is moving. Distances are then computed using the segmentation formula obtained in the first phase. In situations where an RSSI value falls into more than one segment, the ambiguity is resolved by a probabilistic approach: the probability distribution function (pdf) of each candidate distance is computed, and the distance with the highest pdf at that particular RSSI is taken as the estimated distance. Finally, with the distances obtained from each reference node, an iterative trilateration algorithm is used for position estimation. Experimental results position the proposed algorithm as a viable alternative for location tracking.
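A compact sketch of the pipeline (RSSI smoothing, piecewise-linear RSSI-to-distance calibration, and linearized least-squares trilateration) is given below. The calibration table, anchor positions, and RSSI readings are illustrative values, not the paper's measurements, and the probabilistic segment-disambiguation step is omitted.

```python
import numpy as np

# Illustrative calibration table: mean RSSI observed at known distances.
CAL_RSSI = np.array([-45.0, -55.0, -63.0, -70.0, -76.0])   # dBm per segment boundary
CAL_DIST = np.array([1.0, 2.0, 4.0, 8.0, 16.0])            # calibrated distance (m)

def smooth(rssi_window):
    """Simple moving-average smoothing of consecutive RSSI samples."""
    return float(np.mean(rssi_window))

def rssi_to_distance(rssi):
    """Linearly interpolate distance within the calibration segment the RSSI falls into."""
    return float(np.interp(-rssi, -CAL_RSSI, CAL_DIST))     # negate so x is ascending

def trilaterate(anchors, distances):
    """Linearized least-squares position estimate from three or more anchor distances."""
    (x1, y1), d1 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]     # reference nodes (m)
readings = [[-57, -59, -58], [-70, -71, -69], [-69, -71, -70], [-73, -72, -72]]
dists = [rssi_to_distance(smooth(w)) for w in readings]
print("estimated position:", trilaterate(anchors, dists))
```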
