• Title/Summary/Keyword: Electronics Industry

Search Results: 1,307 (processing time: 0.036 seconds)

The Effects of the Heavy and Chemical Industry Policy of the 1970s on the Capital Efficiency and Export Competitiveness of Korean Manufacturing Industries (1970년대(年代) 중화학공업정책(重化學工業政策)이 자본효율성(資本效率性)과 수출경쟁력(輸出競爭力)에 미친 영향(影響))

  • Yoo, Jung-ho
    • KDI Journal of Economic Policy
    • /
    • v.13 no.1
    • /
    • pp.65-113
    • /
    • 1991
  • Korea's rapid economic growth of the past thirty years was led by extremely fast export growth under extensive government intervention. Until very recently, the political regimes were authoritarian and oppressed human rights and labor movements. Because of these characteristics, many inside and outside Korea are under the impression that the rapid economic growth was made possible by the government's relentless push for export growth through industrial targeting. Whether or not government intervention was pivotal in Korean economic growth is an important issue because of its normative implications for the role of government and the degree of economic policy intervention in a market economy. A good example of industrial targeting policy in Korea is the "Heavy and Chemical Industry (HCI)" policy, which began in the early 1970s and lasted for a decade. Under the HCI policy the government intervened in resource allocation through preferential tax, trade, credit, and interest rate policies for "key industries," which included iron and steel, non-ferrous metals, shipbuilding, general machinery, chemicals, and electronics. This paper investigates the effects of the HCI policy on the efficiency of capital and the export competitiveness of manufacturing industries. For individual three-digit KSIC (Korea Standard Industrial Classification) industries and for two industry groups, one favored by the HCI policy and the other not, this paper: (1) computes capital intensities and discusses the impact of the HCI policy on the changes in the intensities over time, (2) estimates capital efficiencies and examines them against the optimality condition for resource allocation (see the notation sketched after this entry), and (3) compares the Korean and Taiwanese shares of total imports by the OECD countries as a way of weighing the effects of the policy on the industries' export competitiveness. Taiwan is a good reference, as it did not adopt the kind of industrial targeting policy that Korea did, while the Taiwanese and Korean economies share similar characteristics. In the 1973-78 period, capital intensity rose rapidly for the "HC Group," the group of industries favored by the policy, while it first declined and later showed an anemic rise for the "Light Group," the remaining manufacturing industries. Capital efficiency was much lower in the HC Group than in the Light Group, at least until the late 1970s. This paper ascribes these results to excess investment in the favored industries and concludes that growth could have been faster in the absence of the HCI policy. The Korean Light Group's share in total imports by the OECD was larger than that of its Taiwanese counterpart but has become much smaller since 1978. For the HC Group, Korea's market share was smaller than Taiwan's and has declined even further since the mid-1970s. This weakening in the export competitiveness of Korea's industries relative to Taiwan's lasted until the mid-1980s. This paper concludes that the HCI policy had either no positive effect or a negative effect on the competitiveness of the Korean manufacturing industries.

  • PDF
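
The capital intensity and capital efficiency comparisons in the abstract above can be summarized in compact notation. The symbols below are illustrative shorthand introduced here, not the paper's own notation.

```latex
% Illustrative notation (introduced here, not taken from the paper):
k_i = \frac{K_i}{L_i}   % capital intensity of industry i (capital per worker)
e_i = \frac{V_i}{K_i}   % a simple capital-efficiency measure (value added per unit of capital)
% Optimality condition for resource allocation: marginal value products of capital
% are equalized across industries,
\frac{\partial V_i}{\partial K_i} = \frac{\partial V_j}{\partial K_j} \quad \text{for all } i,\ j .
```

Under such a condition, persistently lower measured capital efficiency in the HC Group than in the Light Group is read by the paper as a sign of over-investment in the favored industries.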

A Study on Industries's Leading at the Stock Market in Korea - Gradual Diffusion of Information and Cross-Asset Return Predictability- (산업의 주식시장 선행성에 관한 실증분석 - 자산간 수익률 예측 가능성 -)

  • Kim, Jong-Kwon
    • Proceedings of the Safety Management and Science Conference
    • /
    • 2004.11a
    • /
    • pp.355-380
    • /
    • 2004
  • I test the hypothesis that the gradual diffusion of information across asset markets leads to cross-asset return predictability in Korea. Using thirty-six industry portfolios and the broad market index as test assets, I establish several key results. First, a number of industries such as semiconductors, electronics, metals, and petroleum lead the stock market by up to one month. In contrast, the market, which is widely followed, leads only a few industries. Importantly, an industry's ability to lead the market is correlated with its propensity to forecast various indicators of economic activity such as industrial production growth. Consistent with the hypothesis, these findings indicate that the market reacts with a delay to information in industry returns about its fundamentals because information diffuses only gradually across asset markets. Traditional theories of asset pricing assume that investors have unlimited information-processing capacity. However, this assumption does not hold for many traders, even the most sophisticated ones. Many economists recognize that investors are better characterized as only boundedly rational (see Shiller(2000), Sims(2001)). Even from casual observation, few traders can pay attention to all sources of information, much less understand their impact on the prices of the assets that they trade. Indeed, a large literature in psychology documents the extent to which attention is a precious cognitive resource (see, e.g., Kahneman(1973), Nisbett and Ross(1980), Fiske and Taylor(1991)). A number of papers have explored the implications of limited information-processing capacity for asset prices; I review this literature in Section II. For instance, Merton(1987) develops a static model of multiple stocks in which investors only have information about a limited number of stocks and only trade those they have information about. Related models of limited market participation include Brennan(1975) and Allen and Gale(1994). As a result, stocks that are less recognized by investors have a smaller investor base (neglected stocks) and trade at a greater discount because of limited risk sharing. More recently, Hong and Stein(1999) develop a dynamic model of a single asset in which information gradually diffuses across the investing public and investors are unable to perform the rational-expectations trick of extracting information from prices. My hypothesis is that the gradual diffusion of information across asset markets leads to cross-asset return predictability. This hypothesis relies on two key assumptions. The first is that valuable information that originates in one asset reaches investors in other markets only with a lag, i.e., news travels slowly across markets. The second is that, because of limited information-processing capacity, many (though not necessarily all) investors may not pay attention to, or be able to extract information from, the asset prices of markets that they do not participate in. These two assumptions taken together lead to cross-asset return predictability. The hypothesis appears plausible for several reasons. To begin with, as pointed out by Merton(1987) and the subsequent literature on segmented markets and limited market participation, few investors trade all assets; put another way, limited participation is a pervasive feature of financial markets. Indeed, even among equity money managers there is specialization along industries, such as sector or market-timing funds. Some reasons for this limited market participation include tax, regulatory, or liquidity constraints. More plausibly, investors have to specialize because they have their hands full trying to understand the markets that they do participate in. (A minimal sketch of the implied one-month-ahead predictive regression follows this entry.)

  • PDF
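
The hypothesis test described in the abstract above boils down to a one-month-ahead predictive regression of market returns on lagged industry returns. The sketch below is a minimal illustration of that regression using ordinary least squares; the synthetic data, column names, and the particular controls are placeholders chosen for illustration, not the paper's data set or exact specification.

```python
import numpy as np
import pandas as pd

# Synthetic monthly returns standing in for the paper's data: one broad-market
# column and a few industry columns (the study itself uses thirty-six KSIC
# industry portfolios and the Korean market index).
rng = np.random.default_rng(0)
returns = pd.DataFrame(
    rng.normal(0.0, 0.05, size=(240, 4)),
    columns=["market", "semiconductor", "electronics", "petroleum"],
)

def predictive_regression(df: pd.DataFrame, industry: str):
    """Regress market(t) on industry(t-1) and market(t-1); return the lagged-industry slope and its t-statistic."""
    y = df["market"].iloc[1:].to_numpy()
    X = np.column_stack([
        np.ones(len(y)),                 # intercept
        df[industry].shift(1).iloc[1:],  # lagged industry return (predictor of interest)
        df["market"].shift(1).iloc[1:],  # control for the market's own lag
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1], beta[1] / np.sqrt(cov[1, 1])

for name in ["semiconductor", "electronics", "petroleum"]:
    slope, t_stat = predictive_regression(returns, name)
    print(f"{name:>13}: lagged slope = {slope:+.4f}, t = {t_stat:+.2f}")
```

A significantly positive slope on the lagged industry return would be consistent with that industry leading the market.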

A Software Reliability Cost Model Based on the Shape Parameter of Lomax Distribution (Lomax 분포의 형상모수에 근거한 소프트웨어 신뢰성 비용모형에 관한 연구)

  • Yang, Tae-Jin
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.9 no.2
    • /
    • pp.171-177
    • /
    • 2016
  • Software reliability in the software development process is an important issue, and software process improvement helps in finishing with a reliable software product. The infinite-failure NHPP software reliability models presented in the literature exhibit constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. In this study, a software reliability cost model considering the shape parameter of a lifetime distribution, based on data from the software product testing process, was studied. The cost comparison problem for the reliability growth model based on the Lomax distribution, which is widely used in the field of reliability, is presented. The software failure model used was the infinite-failure non-homogeneous Poisson process model, and the parameters were estimated by maximum likelihood estimation for the analysis of the cost model with the shape parameter. In the process of changing and fixing large software, the occurrence of defects can scarcely be avoided in practice. The optimal release time is the time that satisfies the reliability requirements while minimizing the total cost. Comparative studies using other distributions, such as the Kappa or exponential distributions, would also be worthwhile. This research is expected to help software developers identify software development costs to some extent.
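
For readers unfamiliar with release-time cost models, the sketch below shows a generic NHPP software cost function of the kind the abstract describes; the cost coefficients and the specific Lomax-based mean value function are illustrative assumptions, not necessarily the exact formulation used in the paper.

```latex
% Generic NHPP release-time cost model (illustrative):
%   C_1: cost of fixing a fault found during testing
%   C_2: cost of fixing a fault found in operation (C_2 > C_1)
%   C_3: testing cost per unit time, t_{LC}: software life cycle
C(t) = C_1\, m(t) + C_2\bigl[m(t_{LC}) - m(t)\bigr] + C_3\, t

% One common Lomax-based choice ties the mean value function m(t) to the
% Lomax cumulative hazard with scale \beta and shape \alpha:
m(t) = a\,\alpha \ln\!\left(1 + \frac{t}{\beta}\right)

% The optimal release time t^* minimizes C(t) subject to the reliability
% requirement being met at t^*.
```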

Safety Evaluation on Real Time Operating Systems for Safety-Critical Systems (안전필수(Safety-Critical) 시스템의 실시간 운영체제에 대한 안전성 평가)

  • Kang, Young-Doo;Chong, Kil-To
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.11 no.10
    • /
    • pp.3885-3892
    • /
    • 2010
  • Safety-critical systems, such as the plant protection systems in nuclear power plants, play a key role in ensuring that the facilities can be operated without undue risk to the health and safety of the public and the environment, and such systems shall be designed, fabricated, installed, and tested to quality standards commensurate with the importance of the functions to be performed. Computer-based instrumentation and control systems that perform safety-critical functions have real-time operating systems to control and monitor the sub-systems and execute the application software. A safety-critical real-time operating system shall be designed, analyzed, tested, and evaluated so that it maintains high integrity and quality. However, local nuclear power plants have applied real-time operating systems to safety-critical systems through the Commercial Grade Item Dedication method, which is why a detailed methodology for assessing the safety of real-time operating systems, especially newly developed ones, is lacking. This paper presents a methodology for, and experience of, safety evaluation of safety-critical real-time operating systems based upon design requirements. This paper may be useful for developing and evaluating safety-critical real-time operating systems in other industries to ensure the safety of the public and the environment.

Manufacturing of a Korean Hand Phantom with Human Electrical Properties at 835 MHz and 1,800 MHz Bands (835 MHz 및 1,800 MHz 대역에서 인체의 전기적 특성을 가지는 한국인 손 모양의 팬텀 제작)

  • Choi, Donggeun;Gimm, Yoonmyoung;Choi, Jaehoon
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.24 no.5
    • /
    • pp.534-540
    • /
    • 2013
  • Interest in the effect of the hand on electromagnetic waves is increasing internationally with the growing use of mobile phones. IEC TC106 (International Electrotechnical Commission, Technical Committee 106) promotes an international research exchange program in order to reflect the effect of human hands in the standard assessment method for human exposure dosimetry of the electromagnetic waves of mobile phones. Since the currently commercialized hand phantom is manufactured based on the average hand size of Westerners and provides only one grip posture, it imposes many restrictions on accurate SAR measurement. Therefore, the development of a proper hand phantom that accounts for the domestic situation and supports various grip postures is essential in order to analyze accurately the effect of the human hand on exposure estimation. In this paper, a jelly hand phantom suitable for Koreans, with various grip posture capability, was manufactured for the 835 MHz and 1,800 MHz bands. The tolerances of the permittivity and conductivity of the manufactured hand phantom are within $\pm10\%$ each at both bands, which is much tighter than the CTIA (Cellular Telecommunication Industry Association) tolerance of $\pm20\%$. The 3D CAD (3-Dimensional Computer Aided Design) file that was developed can be utilized for the simulation and analysis of the effect of the human hand on SAR measurement of mobile phones.

A Study on the Procedure of Using Big Data to Solve Smart City Problems Based on Citizens' Needs and Participation (시민 니즈와 참여 기반의 스마트시티 문제해결을 위한 빅 데이터 활용 절차에 관한 연구)

  • Chang, Hye-Jung
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.13 no.2
    • /
    • pp.102-112
    • /
    • 2020
  • Smart City's goal is to solve urban problems through smart city component technologies, thereby developing eco-friendly and sustainable economies and improving citizens' quality of life. Until now, smart cities have evolved around component technologies, but it is time to focus attention on the needs and participation of citizens in smart cities. In this paper, we present a big data procedure for solving smart city problems based on citizens' needs and participation. To this end, we examine the smart city project market by region and major industry, and we examine the development stages of the smart city market by sector. In addition, we clarify the definition and necessity of citizen participation in each sector and propose a method for solving problems through big data using a seven-step big data problem-solving process. The seven-step big data process is a method of deriving tasks after analyzing structured and unstructured data in each sector of a smart city and then deriving policy programs accordingly. To attract citizen participation in these procedures, the empathy stage of the design thinking methodology is used in the unstructured data collection process. Also, as a method of identifying citizens' needs for solving urban problems in smart cities, the problem definition stage of the design thinking methodology is incorporated into the unstructured data analysis process.

A Study on Removal of Abietic Acid Using Plasma (플라스마를 이용한 Abietic Acid의 제거에 관한 연구)

  • Kim, Ga-Young;Kim, Da-Seul;Kim, Dong-Hyun
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.21 no.11
    • /
    • pp.788-794
    • /
    • 2020
  • This study, conducted from January to November 2019, measured and analyzed whether abietic acid, an asthma-causing substance to which workers in the electronics industry can be exposed, is removed by plasma treatment. The experiment was carried out using solder wire and natural rosin. Air at temperatures of 250℃, 300℃, and 350℃ was collected on glass fiber filter paper using an air sampler for 10 minutes at a flow rate of 2 ℓ/min. The collected samples were pretreated with methyl alcohol and analyzed quantitatively by high-performance liquid chromatography (HPLC). This procedure confirmed that abietic acid was generated from both natural rosin and solder wire, and that the amount of abietic acid increased as the treatment temperature increased. The amount of abietic acid was higher for natural rosin than for solder wire. As a result of plasma treatment, a removal efficiency of about 92% or more was confirmed for natural rosin; for solder wire, no abietic acid peak was detected, so a removal efficiency of 100% was confirmed. This study confirmed that abietic acid, an asthma trigger, can be generated from solder wire and natural rosin and can be removed by plasma treatment.
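
The removal efficiencies quoted above (about 92% for natural rosin, 100% for solder wire) follow the usual definition of removal efficiency; whether the compared quantity is HPLC peak area or collected mass is an assumption here, since the abstract does not say.

```latex
% Removal efficiency (illustrative definition), with A the measured amount
% of abietic acid without and with plasma treatment:
\eta = \frac{A_{\text{without plasma}} - A_{\text{with plasma}}}{A_{\text{without plasma}}} \times 100\,\%
```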

An Evaluation of Effectiveness for Providing Safety Navigation Supporting Service : Focused on Route Plan Sharing Service (안전 항해 지원 서비스 제공에 대한 유용성 평가(I) : 항로 계획 공유 서비스를 대상으로)

  • Hwang, Hun-Gyu;Kim, Bae-Sung;Shin, Il-Sik;Lee, Jang-Se;Yu, Yung-Ho
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.21 no.3
    • /
    • pp.620-628
    • /
    • 2017
  • In this paper, we suggest a route plan sharing service as the navigation assistance service, which is the second of the 16 items in the maritime service portfolios (MSPs) for safe navigation, based on an interview process. We also developed scenarios for the effectiveness evaluation of the proposed service and conducted simulations using a full mission ship handling simulator (FMSS) based on the developed scenarios. Through the simulations, we analyzed proximity measures, controllability statistics, and subjective evaluations to assess the usefulness of the suggested service. If new services were tested directly on real ships and at a vessel traffic service (VTS) center, various risks could arise in terms of time and cost; therefore, a preliminary effectiveness evaluation process is necessary when adopting and implementing new services. Although we expected the service to be helpful for safe navigation, the simulation results did not fully bear this out.
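
The abstract refers to proximity measures from the ship-handling simulations without naming them; a common pair of such measures is the distance and time at the closest point of approach (DCPA/TCPA). The sketch below computes them under a constant-velocity assumption and is an illustration of that kind of measure, not the paper's own evaluation code.

```python
import numpy as np

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Return (tcpa, dcpa) for two vessels assuming straight-line, constant-speed motion.

    Positions are 2-D coordinates (e.g. metres east/north); velocities are 2-D
    vectors in the same units per second.
    """
    r = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)   # relative position
    v = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)   # relative velocity
    vv = float(v @ v)
    if vv < 1e-12:                        # identical velocities: range never changes
        return 0.0, float(np.linalg.norm(r))
    tcpa = max(0.0, -float(r @ v) / vv)   # time of closest approach, clamped to the future
    dcpa = float(np.linalg.norm(r + v * tcpa))
    return tcpa, dcpa

# Hypothetical example: own ship heading east at 6 m/s, target 2 km to the
# north-east closing toward the south-west.
tcpa, dcpa = cpa(own_pos=(0, 0), own_vel=(6, 0), tgt_pos=(1500, 1500), tgt_vel=(-4, -4))
print(f"TCPA = {tcpa:.0f} s, DCPA = {dcpa:.0f} m")
```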

A Study on the Development of the Single Station Fixed Temperature Detector of Low Power Consumption for Residential Fire Prevention (주택화재 예방을 위한 저소비 전력형 단독경보형 정온식감지기 개발에 관한 연구)

  • Park, Se-Hwa;Cho, Jae-Cheol
    • Fire Science and Engineering
    • /
    • v.24 no.6
    • /
    • pp.61-68
    • /
    • 2010
  • In this paper, research and development results for the implementation of a single-station fixed temperature detector for residential fire prevention are described. The detector was developed for certification in the Japanese market because of the very small domestic market. Since there is no specific regulation for residential detectors in Korea, the Japanese case was reviewed. An investigation of domestic legal circumstances and a comparative study of the test standards of KFI (Korea Institute of Fire Industry & Technology) and JFEII (Japan Fire Equipment Inspection Institute) are also presented. The detector alarms with a buzzer and an indicating LED. In the implementation, an ultra-low-power MCU (Micro Controller Unit) is applied to control the sleeping state and the monitoring state properly with low current consumption. To sense the temperature, a fast-response thermistor is adopted in the design of the fixed temperature residential detector. An automatic test function and an alarm stop function are also considered in the design. The major factors that influence current consumption are explained for the purpose of design reference, and the main electronic circuit parts related to the characteristics of the detector are described. The measured current and the experimental results of the battery discharge show that operation over 10 years can be met.
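
The abstract states that the measured current and battery discharge experiments support more than ten years of operation. A duty-cycle based battery-life estimate of the kind that underpins such a claim is sketched below; all current, capacity, and timing values are hypothetical placeholders, not the measured values from the paper.

```python
# Rough battery-life estimate for a duty-cycled single-station detector.
# All numbers below are hypothetical placeholders, not the paper's measurements.

SLEEP_CURRENT_UA = 2.0         # MCU and circuit in the sleeping state (microamperes)
ACTIVE_CURRENT_UA = 500.0      # MCU awake, sampling the thermistor (microamperes)
WAKE_INTERVAL_S = 4.0          # how often the detector wakes to monitor temperature
ACTIVE_TIME_S = 0.01           # time spent awake per monitoring cycle
BATTERY_CAPACITY_MAH = 1200.0  # lithium battery capacity

duty = ACTIVE_TIME_S / WAKE_INTERVAL_S
avg_current_ua = ACTIVE_CURRENT_UA * duty + SLEEP_CURRENT_UA * (1.0 - duty)
avg_current_ma = avg_current_ua / 1000.0

lifetime_hours = BATTERY_CAPACITY_MAH / avg_current_ma
lifetime_years = lifetime_hours / (24.0 * 365.0)
print(f"average current ≈ {avg_current_ua:.2f} µA, estimated life ≈ {lifetime_years:.1f} years")
```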

A Feature Point Extraction and Identification Technique for Immersive Contents Using Deep Learning (딥 러닝을 이용한 실감형 콘텐츠 특징점 추출 및 식별 방법)

  • Park, Byeongchan;Jang, Seyoung;Yoo, Injae;Lee, Jaechung;Kim, Seok-Yoon;Kim, Youngmo
    • Journal of IKEEE
    • /
    • v.24 no.2
    • /
    • pp.529-535
    • /
    • 2020
  • As a main technology of the 4th industrial revolution, immersive 360-degree video contents are drawing attention. The worldwide market size of immersive 360-degree video contents is projected to increase from $6.7 billion in 2018 to approximately $70 billion in 2020. However, most immersive 360-degree video contents are distributed through illegal distribution networks such as Webhard and Torrent, and the damage caused by illegal reproduction is increasing. The existing 2D video industry uses copyright-filtering technology to prevent such illegal distribution. The technical difficulties in dealing with immersive 360-degree videos arise because they require ultra-high-quality pictures and contain images captured by two or more cameras merged into one image, which results in the creation of distortion regions. There are also technical limitations such as an increase in the amount of feature point data due to the ultra-high definition and the resulting processing speed requirements. These considerations make it difficult to use the same 2D filtering technology for 360-degree videos. To solve this problem, this paper suggests a feature point extraction and identification technique that selects object identification areas excluding regions with severe distortion, recognizes objects in the identification areas using deep learning technology, and extracts feature points using the identified object information. Compared with the previously proposed method of extracting feature points from the stitching area of immersive contents, the proposed technique shows an excellent performance gain.
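
As a rough illustration of the pipeline described above (restricting feature extraction to identification areas while excluding heavily distorted regions), the sketch below masks out assumed distortion bands at the top and bottom of an equirectangular frame and extracts ORB feature points only inside detected-object boxes. The object boxes are placeholders standing in for a deep-learning detector, and the margin value is an assumption, not the paper's parameter.

```python
import cv2
import numpy as np

def extract_features(frame: np.ndarray, object_boxes, distortion_margin: float = 0.1):
    """Extract ORB keypoints/descriptors inside object boxes, excluding distorted border bands.

    frame:             equirectangular video frame (H x W x 3, BGR)
    object_boxes:      list of (x, y, w, h) boxes, e.g. produced by an object detector
    distortion_margin: fraction of the frame height at the top/bottom treated as distorted
    """
    h, w = frame.shape[:2]
    mask = np.zeros((h, w), dtype=np.uint8)

    # Mark the object identification areas as usable.
    for (x, y, bw, bh) in object_boxes:
        mask[y:y + bh, x:x + bw] = 255

    # Exclude the top/bottom bands, where equirectangular distortion is most severe.
    band = int(h * distortion_margin)
    mask[:band, :] = 0
    mask[h - band:, :] = 0

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(gray, mask)
    return keypoints, descriptors

# Hypothetical usage with a synthetic frame and one detector box.
frame = np.random.randint(0, 255, (960, 1920, 3), dtype=np.uint8)
kps, desc = extract_features(frame, object_boxes=[(600, 300, 400, 300)])
print(f"{len(kps)} keypoints extracted inside the identification area")
```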