• Title/Summary/Keyword: Process Cost


A Survey on the Broadcasting Program Production by Video Journalists in Daejeon (비디오저널리스트(VJ)의 방송프로그램 제작 실태조사 - 대전지역을 중심으로 -)

  • Lee, Jong-Tak;Jeong, Jong-Geon
    • The Journal of the Korea Contents Association / v.9 no.9 / pp.171-179 / 2009
  • This paper explores the current state of broadcasting program production by video journalists (VJs), one of the recent trends in broadcast production, focusing on programs made in Daejeon. The survey shows that VJ programs have become one of the established broadcasting program production systems. Production costs should therefore be readjusted to a realistic level to secure good-quality VJ programs. Local broadcasting stations should move away from treating VJ programs merely as subcontracted work for producing programs at low cost; it is time to support good-quality VJ programs financially and systematically. The survey also reveals that most video journalists cannot participate in decision-making about program production, so local broadcasters should cooperate with video journalists as co-producers. In addition, VJ programs have disadvantages such as limited high-definition production, poor image quality, and unstable footage; video journalists should therefore work to improve the image quality of their programs. Local subcontractors should also make efforts to overcome their shortages of manpower and poor production environments. By economizing their scale of production and updating their production equipment, subcontractors need to continually develop content related to the local community that makes them more competitive.

A Study on IPA-based Competitiveness Enhancement Measures for Regular Freight Service (IPA분석을 이용한 정기화물운송업의 경쟁력 강화방안에 관한 연구)

  • Lee, Young-Jae;Park, Soo-Hong;Sun, Il-Suck
    • Journal of Distribution Science / v.13 no.1 / pp.83-91 / 2015
  • Purpose - Despite the structural irrationality of multi-level transportation and rising oil prices, the domestic freight transportation market continues to grow, mirroring the rise in e-commerce and the resultant increase in courier services and freight volumes. Several studies on courier services have been conducted, but few studies or statistics have been published on regular freight services, although they play a role in the freight service market. The present study identifies the characteristics of regular freight service users to seek competitiveness enhancement measures specific to regular freight services. Research design, data, and methodology - IPA compares the relative importance of and satisfaction with each attribute simultaneously. This study used IPA because it facilitates analyzing importance and performance, deriving implications, and understanding the results visually. To enhance the competitiveness of regular freight services, this study surveyed current users on the importance of regular freight service factors. A total of 200 copies of a questionnaire were circulated and 190 were returned. In addition to demographics, respondents rated the importance of and satisfaction with services on a 5-point Likert scale. Excluding 3 inappropriate copies, 187 of the 190 copies were analyzed, using PASW Statistics 18 for statistical analysis. A total of 20 question items were selected for the service factors presented in the questionnaire, based on a pilot survey and previous studies. Results - According to the IPA comparing the importance of and satisfaction with service factors, both importance and satisfaction are high in the 1st quadrant, which involves the economic advantage of using regular freight services, quick arrival at destinations, heavy freight handling, and fewer time constraints on freight receipt/dispatch. This area requires continuous management. Satisfaction is higher than importance in the 2nd quadrant, which involves the adequacy of freight, cost savings over ordinary courier services, notification of freight arrival, and freight tracking information. Satisfaction is lower than importance in the 3rd quadrant, involving the credit card payment system, courier delivery service, distance to freight handling sites, ease of access to freight handling sites, and prompt problem solving; this area requires intensive management. Both importance and satisfaction are low in the 4th quadrant, involving the availability of collection service, storage space at freight handling sites, kindness of collection/delivery staff, kindness of outlet staff, and easy delivery checks; these variables should be excluded from priority control targets. Conclusions - Based on the IPA, service factors that need priority control because of high importance and low satisfaction include the credit card payment system, delivery service, distance to freight handling sites, ease of access to freight handling sites, and prompt problem solving. These findings should be applied to future marketing strategies for regular freight services and to the development of competitiveness enhancement programs.
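
As a concrete illustration of the quadrant logic described in this abstract, the sketch below classifies each factor by comparing its mean importance and satisfaction against the grand means, following the abstract's quadrant convention (1st: both high; 2nd: satisfaction above, importance below; 3rd: importance above, satisfaction below; 4th: both low). The factor names and scores are invented, not the paper's survey data.

```python
# Hypothetical IPA quadrant assignment; factor scores are invented.
def ipa_quadrant(importance, satisfaction, mean_imp, mean_sat):
    """Return the IPA quadrant (1-4) for one service factor."""
    if importance >= mean_imp and satisfaction >= mean_sat:
        return 1  # keep up the good work
    if importance < mean_imp and satisfaction >= mean_sat:
        return 2  # satisfaction exceeds importance
    if importance >= mean_imp and satisfaction < mean_sat:
        return 3  # concentrate management here
    return 4      # low priority

factors = {
    "credit card payment": (4.5, 3.1),   # (importance, satisfaction)
    "quick arrival": (4.6, 4.4),
    "storage space": (3.2, 3.0),
}
mean_imp = sum(v[0] for v in factors.values()) / len(factors)
mean_sat = sum(v[1] for v in factors.values()) / len(factors)
quadrants = {name: ipa_quadrant(i, s, mean_imp, mean_sat)
             for name, (i, s) in factors.items()}
```

With these made-up scores, "credit card payment" lands in the 3rd quadrant (high importance, low satisfaction), matching the kind of priority-control factor the conclusions single out.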

Optimization Model for the Mixing Ratio of Coatings Based on the Design of Experiments Using Big Data Analysis (빅데이터 분석을 활용한 실험계획법 기반의 코팅제 배합비율 최적화 모형)

  • Noh, Seong Yeo;Kim, Young-Jin
    • KIPS Transactions on Computer and Communication Systems / v.3 no.10 / pp.383-392 / 2014
  • Research on coatings is among the most active areas in the polymer industry, and coatings are growing in importance in the electronics, medical, and optical fields. In particular, technical requirements for the performance and accuracy of coatings are increasing with the development of automotive and electronic parts. At the same time, demand for more intelligent and automated industrial systems is growing with the introduction of IoT and big data analysis based on environmental and context information. In this paper, we propose an optimization model for coating mixing ratios based on the design of experiments, using Internet of Things technology and big data analytics. The mixing ratio derived from the design of experiments serves as reference data; it is then corrected using the error data collected at the actual production site, together with manufacturing environment and context information, with respect to color and quality, the most important factors to maintain. Analyzing the experimental data improves the accuracy of the mixing-ratio data and shortens the working hours per LOT. The shorter processing time per treatment also reduces production and delivery time and can contribute to cost reduction and a lower defect rate. In addition, standard data for the manufacturing process can be obtained for various models.
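
The reference-correction idea (a DOE-derived mixing ratio refined with production feedback) can be sketched roughly as below; the damping gain, the deviation values, and the reference ratio are all invented for illustration and are not the paper's model.

```python
# Hypothetical sketch: adjust a DOE-derived reference mixing ratio by a
# damped mean of deviations measured on the production line.
def corrected_ratio(reference, measured_deviations, gain=0.5):
    """Shift the reference ratio against the mean observed deviation."""
    if not measured_deviations:
        return reference
    mean_dev = sum(measured_deviations) / len(measured_deviations)
    return reference - gain * mean_dev

ref = 0.40                   # invented DOE-optimal fraction of one component
devs = [0.02, 0.04, 0.03]    # invented per-batch color-error proxies
new_ref = corrected_ratio(ref, devs)
```

The damping gain is a design choice: a gain below 1 avoids over-reacting to noisy single-batch measurements, in the spirit of correcting the reference value with accumulated site data rather than replacing it outright.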

XML Fragmentation for Resource-Efficient Query Processing over XML Fragment Stream (자원 효율적인 XML 조각 스트림 질의 처리를 위한 XML 분할)

  • Kim, Jin;Kang, Hyun-Chul
    • The KIPS Transactions:PartD / v.16D no.1 / pp.27-42 / 2009
  • Realizing ubiquitous computing requires techniques for efficiently using the limited resources of clients such as mobile devices. On a mobile device with a limited amount of memory, XML stream query processing techniques should be employed to process queries over a large volume of XML data. Recently, several techniques were proposed that fragment XML documents into XML fragments and stream them for query processing at the client. During query processing, resource usage (query processing time and memory usage) can differ greatly depending on how the source XML documents are fragmented, so an efficient fragmentation technique is needed. In this paper, we propose an XML fragmentation technique that enhances resource efficiency in query processing at the client. We first present a cost model of query processing over an XML fragment stream, and then propose an algorithm for resource-efficient XML fragmentation. Through implementation and experiments, we show that our fragmentation technique outperforms previous techniques in both processing time and memory usage. The contribution of this paper is to make query processing over XML fragment streams more feasible for practical use.
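
The paper's cost model itself is not reproduced in the abstract, but the general shape of such a model, scoring a candidate fragment size by a weighted sum of estimated processing time and memory, can be sketched as follows. The cost coefficients, the per-fragment overhead, and the linear forms are invented for illustration only.

```python
# Toy fragmentation cost model: smaller fragments cost more per-fragment
# overhead; larger fragments cost more client buffer memory.
def fragment_cost(n_elements, frag_size, w_time=1.0, w_mem=1.0,
                  per_frag_overhead=5.0):
    n_frags = -(-n_elements // frag_size)      # ceiling division
    time_cost = n_frags * per_frag_overhead + n_elements
    mem_cost = frag_size                       # buffer one fragment at a time
    return w_time * time_cost + w_mem * mem_cost

# Pick the cheapest fragment size from a candidate range.
best = min(range(10, 201, 10), key=lambda s: fragment_cost(1000, s))
```

Even this toy model reproduces the trade-off the abstract describes: very fine fragmentation inflates processing time, very coarse fragmentation inflates memory, and an intermediate size minimizes the combined cost.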

Investigating the Impact of Corporate Social Responsibility on Firm's Short- and Long-Term Performance with Online Text Analytics (온라인 텍스트 분석을 통해 추정한 기업의 사회적책임 성과가 기업의 단기적 장기적 성과에 미치는 영향 분석)

  • Lee, Heesung;Jin, Yunseon;Kwon, Ohbyung
    • Journal of Intelligence and Information Systems / v.22 no.2 / pp.13-31 / 2016
  • Despite expectations of short- or long-term positive effects of corporate social responsibility (CSR) on firm performance, the results of existing research into this relationship are inconsistent, partly due to a lack of clarity about subordinate CSR concepts. In this study, keywords related to CSR concepts are extracted from atypical sources, such as newspapers, using text mining techniques to examine the relationship between CSR and firm performance. The analysis is based on data from the New York Times, a major news publication, and Google Scholar. We used text analytics to process unstructured data collected from open online documents to explore the effects of CSR on short- and long-term firm performance. The results suggest that the CSR index computed with the proposed online-media text analytics predicts long-term performance much better than short-term performance, even in the absence of internal firm reports or CSR institute reports. Our study demonstrates that text analytics is useful for evaluating CSR performance in terms of convenience and cost effectiveness.
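
A minimal sketch of the kind of keyword-based scoring such text analytics might use is shown below; the keyword list, the length normalization, and the sample sentence are invented, not the paper's actual CSR index.

```python
# Hypothetical CSR scoring: count CSR-related keywords in news text and
# normalize by document length. Keyword set is invented for illustration.
CSR_KEYWORDS = {"donation", "environment", "ethics", "volunteer"}

def csr_index(text):
    """Fraction of words in `text` that are CSR-related keywords."""
    words = text.lower().split()
    hits = sum(1 for w in words if w.strip(".,") in CSR_KEYWORDS)
    return hits / len(words) if words else 0.0

doc = "The firm expanded its volunteer program and cut environment risks."
score = csr_index(doc)
```

A real pipeline would add tokenization, stemming, and per-firm aggregation over a news corpus, but the normalization step matters even in this sketch: without it, firms with more coverage would score higher regardless of content.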

Design and Implementation of A Distributed Information Integration System based on Metadata Registry (메타데이터 레지스트리 기반의 분산 정보 통합 시스템 설계 및 구현)

  • Kim, Jong-Hwan;Park, Hea-Sook;Moon, Chang-Joo;Baik, Doo-Kwon
    • The KIPS Transactions:PartD / v.10D no.2 / pp.233-246 / 2003
  • A mediator-based system integrates heterogeneous information systems in a flexible manner, but it pays little attention to query optimization issues, especially query reuse, and it does not use standardized metadata for schema matching. To improve these two issues, we propose a mediator-based Distributed Information Integration System (DIIS), which uses query caching for performance and an ISO/IEC 11179 metadata registry for standardization. The DIIS is designed to provide decision-making support by logically integrating distributed, heterogeneous business information systems in a Web environment. We designed the system as a three-layer architecture using the layered pattern to improve reusability and facilitate maintenance. The functionality and flow of the core components of the three-layer architecture are expressed in process line diagrams and assembly line diagrams of the Eriksson-Penker Extension Model (EPEM), an extension of UML. For the implementation, the Supply Chain Management (SCM) domain is used, with a Web-based user interface. The DIIS supports query caching and query reuse through the Query Function Manager (QFM) and Query Function Repository (QFR), enhancing query processing speed and reusability by caching frequently used queries and optimizing query cost. The DIIS solves diverse heterogeneity problems by mapping between a MetaData Registry (MDR) based on ISO/IEC 11179 and a Schema Repository (SCR).
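
The query-caching idea behind the QFM/QFR can be sketched roughly as follows; the class, the whitespace/case normalization rule, and the stand-in executor are hypothetical illustrations, not the DIIS implementation.

```python
# Hypothetical query cache: repeated queries (after normalization) skip
# re-execution, which is the performance idea the DIIS attributes to
# caching frequently used queries.
class QueryCache:
    def __init__(self, executor):
        self.executor = executor   # callable that actually runs a query
        self.store = {}
        self.hits = 0

    def run(self, query):
        key = " ".join(query.lower().split())   # normalize case/whitespace
        if key in self.store:
            self.hits += 1
            return self.store[key]
        result = self.executor(query)
        self.store[key] = result
        return result

calls = []
def fake_executor(q):
    """Stand-in for the mediator's real query engine."""
    calls.append(q)
    return f"rows-for:{q.lower()}"

cache = QueryCache(fake_executor)
r1 = cache.run("SELECT * FROM orders")
r2 = cache.run("select *   from orders")   # normalizes to the same key
```

A production cache would also need invalidation when source systems change; this sketch shows only the reuse path.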

An intercomparison study between optimization algorithms for parameter estimation of microphysics in Unified model : Micro-genetic algorithm and Harmony search algorithm (통합모델의 강수물리과정 모수 최적화를 위한 알고리즘 비교 연구 : 마이크로 유전알고리즘과 하모니 탐색 알고리즘)

  • Jang, Jiyeon;Lee, Yong Hee;Joo, Sangwon
    • Journal of the Korean Institute of Intelligent Systems / v.27 no.1 / pp.79-87 / 2017
  • The microphysical processes of a numerical weather prediction (NWP) model cover fall speed, accretion, autoconversion, droplet size distribution, and so on. However, these processes and their parameters carry a significant degree of uncertainty. Parameter estimation is generally used to reduce the errors in NWP models associated with this uncertainty. In this study, the micro-genetic algorithm and the harmony search algorithm were used as optimization algorithms for parameter estimation, and we estimated microphysics parameters of the Unified Model for a precipitation case in Korea. The differences observed during optimization were due to the different characteristics of the two algorithms: the micro-genetic algorithm converged to about 1.033 after 440 iterations, while the harmony search algorithm converged to about 1.031 after 60 iterations. This shows that the harmony search algorithm found optimal parameters more quickly than the micro-genetic algorithm. Therefore, for NWP model optimization problems with a large calculation cost, where optimal parameters must be found quickly, the harmony search algorithm is more suitable.
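
To make the mechanism concrete, here is a very small harmony search sketch on a toy 1-D cost function, showing the three moves the algorithm is known for: harmony memory consideration, pitch adjustment, and random play. All settings are invented, the micro-genetic algorithm is omitted, and this is in no way the study's actual implementation, which optimizes Unified Model microphysics parameters.

```python
# Toy harmony search minimizing (x - 2)^2; all parameters are invented.
import random

def harmony_search(cost, lo, hi, hms=5, hmcr=0.9, par=0.3,
                   bw=0.1, iters=500, seed=0):
    rng = random.Random(seed)
    memory = [rng.uniform(lo, hi) for _ in range(hms)]  # harmony memory
    for _ in range(iters):
        if rng.random() < hmcr:            # memory consideration
            x = rng.choice(memory)
            if rng.random() < par:         # pitch adjustment
                x += rng.uniform(-bw, bw)
        else:                              # random play
            x = rng.uniform(lo, hi)
        x = min(max(x, lo), hi)
        worst = max(memory, key=cost)      # replace worst if improved
        if cost(x) < cost(worst):
            memory[memory.index(worst)] = x
    return min(memory, key=cost)

best = harmony_search(lambda x: (x - 2.0) ** 2, -10, 10)
```

Because each iteration evaluates the cost only a handful of times and keeps a small memory, the method suits problems where a single cost evaluation (here trivial, in the paper an NWP run) is expensive.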

Fingerprint Pore Extraction Method using 1D Gaussian Model (1차원 가우시안 모델을 이용한 지문 땀샘 추출 방법)

  • Cui, Junjian;Ra, Moonsoo;Kim, Whoi-Yul
    • Journal of the Institute of Electronics and Information Engineers / v.52 no.4 / pp.135-144 / 2015
  • Fingerprint pores have proven to be useful features for fingerprint recognition, and several pore-based fingerprint recognition systems have been reported recently. To recognize fingerprints using pore information, it is very important to extract pores reliably and accurately. Existing pore extraction methods use 2D model fitting to detect pore centers. This paper proposes a pore extraction method using a 1D Gaussian model, which is much simpler than a 2D model; during model fitting, the 1D model requires less computational cost. The proposed method first calculates the local ridge orientation and then generates a ridge mask. Since a pore center is brighter than its neighboring pixels, pore candidates are extracted using a 3×3 filter and a 5×5 filter successively, and pore centers are then extracted by fitting the 1D Gaussian model to the pore candidates. Extensive experiments show that the proposed method extracts pores more effectively and accurately than other existing methods, and pore matching results show that the proposed pore extraction method could be used in fingerprint recognition.
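
The 1D fitting step can be illustrated with a moment-based Gaussian estimate on a synthetic intensity profile across a pore candidate (a pore center appears as a bright peak). This closed-form shortcut is an assumption for illustration, not necessarily the paper's fitting procedure.

```python
# Moment-based 1-D Gaussian estimate on a synthetic brightness profile.
import math

def fit_gaussian_1d(profile):
    """Estimate (amplitude, mean, sigma) from a nonnegative 1-D profile."""
    total = sum(profile)
    mean = sum(i * v for i, v in enumerate(profile)) / total
    var = sum(v * (i - mean) ** 2 for i, v in enumerate(profile)) / total
    return max(profile), mean, math.sqrt(var)

# Invented symmetric peak centered at index 3, mimicking a pore cross-section.
profile = [1, 4, 9, 12, 9, 4, 1]
amp, mu, sigma = fit_gaussian_1d(profile)
```

The appeal of the 1D model the abstract claims is visible even here: the estimate is a couple of weighted sums over a short profile, versus an iterative 2D surface fit per candidate.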

A DC-DC Converter Design for OLED Display Module (OLED Display Module용 DC-DC 변환기 설계)

  • Lee, Tae-Yeong;Park, Jeong-Hun;Kim, Jeong-Hoon;Kim, Tae-Hoon;Vu, Cao Tuan;Kim, Jeong-Ho;Ban, Hyeong-Jin;Yang, Gweon;Kim, Hyoung-Gon;Ha, Pan-Bong;Kim, Young-Hee
    • Journal of the Korea Institute of Information and Communication Engineering / v.12 no.3 / pp.517-526 / 2008
  • A one-chip DC-DC converter circuit for the OLED (Organic Light-Emitting Diode) display module of automotive clusters is newly proposed. The OLED panel driving voltage circuit, a charge-pump type, has improved characteristics in miniaturization, low cost, and EMI (Electro-Magnetic Interference) compared with a PWM (Pulse Width Modulation) type DC-DC converter. A bulk-potential biasing circuit prevents the charge loss caused by the parasitic PNP BJT formed during charge pumping. In addition, the current dissipation in the start-up circuit of the band-gap reference voltage generator is reduced by 42%, and the layout area of the ring oscillator is reduced by using the logic voltage VLP instead of the VDD supply voltage in the ring oscillator circuit. The driving current of VDD, the OLED driving voltage, is over 40 mA, as required by OLED panels. The test chip is being manufactured in a 0.25 μm high-voltage process, and the layout area is 477 μm × 653 μm.

Efficient Methodology in Markov Random Field Modeling: Multiresolution Structure and Bayesian Approach in Parameter Estimation (피라미드 구조와 베이지안 접근법을 이용한 Markov Random Field의 효율적 모델링)

  • 정명희;홍의석
    • Korean Journal of Remote Sensing / v.15 no.2 / pp.147-158 / 1999
  • Remote sensing techniques have offered a better understanding of our environment for decades by providing useful information on land cover. In many applications using remotely sensed data, digital image processing methodology has been employed to characterize features in the data and to develop models. Random field models, especially Markov Random Field (MRF) models exploiting spatial relationships, are successfully utilized in many problems such as texture modeling and region labeling. Remotely sensed imagery is usually very large, and the data grow greatly in problems requiring temporal coverage over a time period; the time required to process increasingly large images does not scale linearly. In this study, a methodology to reduce the computational cost of utilizing Markov Random Fields is investigated. For this, a multiresolution framework is explored, which provides a convenient and efficient structure for the transition between local and global features. The computational requirements for parameter estimation of the MRF model also become excessive as image size increases, so a Bayesian approach is investigated as an alternative estimation method to reduce the computational burden of estimating the parameters of large images.
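
The multiresolution idea can be sketched as a simple image pyramid built by 2×2 block averaging, on which coarse-to-fine MRF labeling would run: labels are first estimated on the small top level and then refined downward. The data below are invented, and the MRF inference itself is omitted.

```python
# Toy image pyramid by 2x2 block averaging (pure Python, invented data).
def downsample(img):
    """Average non-overlapping 2x2 blocks of a 2-D list of numbers."""
    h, w = len(img), len(img[0])
    return [[(img[2*i][2*j] + img[2*i][2*j+1] +
              img[2*i+1][2*j] + img[2*i+1][2*j+1]) / 4.0
             for j in range(w // 2)]
            for i in range(h // 2)]

def pyramid(img, levels):
    """Return [full-res, half-res, ...] down to `levels` levels."""
    out = [img]
    for _ in range(levels - 1):
        out.append(downsample(out[-1]))
    return out

img = [[1, 1, 5, 5],
       [1, 1, 5, 5],
       [9, 9, 3, 3],
       [9, 9, 3, 3]]
levels = pyramid(img, 3)
```

Each level has a quarter of the pixels of the one below it, which is why running the expensive MRF computations mostly on coarse levels cuts the overall cost for large scenes.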