• Title/Summary/Keyword: Technology Process (기술 프로세스)

Search Results: 2,707, Processing Time: 0.03 seconds

The Impact of Virtual Influencer Formativeness on Advertising Attention and Attitude Toward Advertising: The Dual Parallel Mediating Effects of Attractiveness and Suitability (버츄얼 인플루언서의 조형성이 광고 주목도와 광고 태도에 미치는 영향: 매력성과 적합성의 병렬 이중 매개효과)

  • Eun Hee Kim;No-Mi Lee
    • Journal of Advanced Technology Convergence
    • /
    • v.3 no.1
    • /
    • pp.21-31
    • /
    • 2024
  • This study examined the roles of attractiveness and suitability in the relationships among the formativeness of a virtual influencer serving as an advertising model, advertising attention, and attitude toward advertising. The subjects were members of the MZ and X generations, who show high rates of SNS use. SPSS Statistics 27.0 and the SPSS PROCESS macro were used for analysis. The results are as follows. First, attractiveness and suitability were found to fully mediate the relationship between formativeness and advertising attention. In the path from formativeness to advertising attention, the total effect was higher than the direct effect, confirming a dual parallel mediation effect of attractiveness and suitability. Second, formativeness affected the mediating variable attractiveness, but attractiveness did not affect attitude toward advertising; since formativeness affected suitability, and suitability in turn influenced attitude toward advertising, a full mediating effect was confirmed between these variables. Accordingly, a parallel mediating effect of attractiveness and suitability was not confirmed in the relationship between formativeness and attitude toward advertising. This study is significant in that it presents academic and practical implications by examining the dual parallel mediating effects of attractiveness and suitability among formativeness, advertising attention, and advertising attitude, the variables considered in the production of virtual influencers.
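As a rough illustration of the statistical machinery behind such a model (not the paper's actual data or its SPSS PROCESS output), the indirect effects in a parallel two-mediator design can be computed from ordinary least squares regressions; all variable names and data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins: X = formativeness, M1 = attractiveness,
# M2 = suitability, Y = advertising attention.
X = rng.normal(size=n)
M1 = 0.5 * X + rng.normal(size=n)           # path a1
M2 = 0.6 * X + rng.normal(size=n)           # path a2
Y = 0.4 * M1 + 0.7 * M2 + 0.1 * X + rng.normal(size=n)

def ols(y, *cols):
    """OLS with intercept; returns coefficients for cols (intercept last)."""
    A = np.column_stack(cols + (np.ones(len(y)),))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

a1 = ols(M1, X)[0]                  # X -> M1
a2 = ols(M2, X)[0]                  # X -> M2
b = ols(Y, X, M1, M2)               # Y on X, M1, M2
c_prime, b1, b2 = b[0], b[1], b[2]  # direct effect and b paths
c_total = ols(Y, X)[0]              # total effect of X on Y

ind1, ind2 = a1 * b1, a2 * b2       # parallel indirect effects
print(f"total={c_total:.3f} direct={c_prime:.3f} "
      f"indirect1={ind1:.3f} indirect2={ind2:.3f}")

# For OLS this decomposition holds exactly: total = direct + a1*b1 + a2*b2
assert abs(c_total - (c_prime + ind1 + ind2)) < 1e-9
```

The PROCESS macro additionally bootstraps confidence intervals for `ind1` and `ind2`; the sketch above only shows the point-estimate decomposition.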

Creating and Utilization of Virtual Human via Facial Capturing based on Photogrammetry (포토그래메트리 기반 페이셜 캡처를 통한 버추얼 휴먼 제작 및 활용)

  • Ji Yun;Haitao Jiang;Zhou Jiani;Sunghoon Cho;Tae Soo Yun
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.25 no.2
    • /
    • pp.113-118
    • /
    • 2024
  • Recent advances in artificial intelligence and computer graphics have led to the emergence of various virtual humans across media such as movies, advertisements, broadcasts, games, and social networking services (SNS). In advertising and marketing centered on virtual influencers in particular, virtual humans have already proven to be an important promotional tool for businesses in terms of time and cost efficiency. In Korea, the virtual influencer market is still in its nascent stage, and both large corporations and startups are preparing to launch new virtual influencer services without clear boundaries between them. However, because development processes are rarely disclosed publicly, companies face significant expenses. To address these requirements and challenges, this paper implements a photogrammetry-based facial capture system for creating realistic virtual humans and explores the resulting models and their application cases. The paper also examines a workflow that is optimal in terms of cost and quality through MetaHuman modeling based on Unreal Engine, which simplifies the complex CG pipeline from facial capture to the actual animation process. Additionally, the paper introduces cases where virtual humans have been utilized in SNS marketing, such as on Instagram, and demonstrates the performance of the proposed Unreal Engine-based workflow by comparing it with traditional CG work.

A Proposal of a Keyword Extraction System for Detecting Social Issues (사회문제 해결형 기술수요 발굴을 위한 키워드 추출 시스템 제안)

  • Jeong, Dami;Kim, Jaeseok;Kim, Gi-Nam;Heo, Jong-Uk;On, Byung-Won;Kang, Mijung
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.3
    • /
    • pp.1-23
    • /
    • 2013
  • To discover significant social issues, such as unemployment, economic crisis, and social welfare, that urgently need to be solved in modern society, the existing approach has researchers collect opinions from professional experts and scholars through online or offline surveys. However, this method is not always effective. Due to cost constraints, a large number of survey replies are seldom gathered, and in some cases it is hard to find professionals dealing with specific social issues; the sample set is therefore often small and may be biased. Furthermore, regarding a given social issue, several experts may reach totally different conclusions because each has a subjective point of view and a different background, making it considerably hard to figure out what the current social issues are and which are really important. To overcome these shortcomings, this paper develops a prototype system that semi-automatically detects social issue keywords representing social issues and problems from about 1.3 million news articles issued by about 10 major domestic presses in Korea from June 2009 to July 2012. The proposed system consists of (1) collecting the news articles and extracting their texts, (2) identifying only the news articles related to social issues, (3) analyzing the lexical items of Korean sentences, (4) finding a set of topics regarding social keywords over time based on probabilistic topic modeling, (5) matching relevant paragraphs to a given topic, and (6) visualizing social keywords for easy understanding. In particular, a novel matching algorithm relying on generative models is proposed, whose goal is to best match paragraphs to each topic.
Technically, using a topic model such as Latent Dirichlet Allocation (LDA), a set of topics is obtained, each with relevant terms and their probability values. Given a set of text documents (e.g., news articles), LDA produces a set of topic clusters, and each topic cluster is then labeled by human annotators, where each topic label stands for a social keyword. For example, suppose there is a topic (e.g., Topic1 = {(unemployment, 0.4), (layoff, 0.3), (business, 0.3)}) and a human annotator labels Topic1 "Unemployment Problem". It is non-trivial to understand from the label alone what actually happened with the unemployment problem in society: looking only at social keywords gives no idea of the detailed events. To tackle this, the matching algorithm computes the probability of a paragraph given a topic, relying on (i) the topic terms and (ii) their probability values. Each text document is segmented into paragraphs and, using LDA, a set of topics is extracted from the documents. Through the matching process, each paragraph is assigned to the topic it best matches, so each topic ends up with several best-matched paragraphs. For instance, given a topic (e.g., Unemployment Problem) and its best-matched paragraph (e.g., "Up to 300 workers lost their jobs in XXX company at Seoul"), detailed information about the social keyword, such as "300 workers", "unemployment", "XXX company", and "Seoul", can be grasped. In addition, the system visualizes social keywords over time. Therefore, through the matching process and keyword visualization, researchers can detect social issues easily and quickly.
Through this prototype system, various social issues appearing in society were detected, and the experimental results show the effectiveness of the proposed methods. A proof-of-concept system is also available at http://dslab.snu.ac.kr/demo.html.
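The paragraph-to-topic assignment described above can be sketched in a few lines: score each paragraph by the (log-)probability its words receive under a topic's term distribution and pick the best-scoring topic. The topics and paragraphs below are illustrative, not the paper's actual model output, and a small smoothing constant stands in for a proper treatment of out-of-topic words:

```python
import math

# Illustrative topic-term distributions of the kind LDA produces,
# already labeled by human annotators (values are made up).
topics = {
    "Unemployment Problem": {"unemployment": 0.4, "layoff": 0.3, "business": 0.3},
    "Housing Problem": {"rent": 0.5, "housing": 0.3, "price": 0.2},
}

def match_topic(paragraph, topics, eps=1e-6):
    """Assign a paragraph to the topic maximizing sum of log P(word|topic).
    Words absent from a topic get a small smoothing probability eps."""
    words = paragraph.lower().split()
    best, best_score = None, -math.inf
    for label, dist in topics.items():
        score = sum(math.log(dist.get(w, eps)) for w in words)
        if score > best_score:
            best, best_score = label, score
    return best

p = "Up to 300 workers lost their jobs after the layoff at one business"
print(match_topic(p, topics))  # -> Unemployment Problem
```

In the actual system each topic would carry many more terms, and paragraphs would come from segmented news articles; the scoring idea, however, is the same.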

A Legal Study on Safety Management System (항공안전관리에 관한 법적 고찰)

  • So, Jae-Seon;Lee, Chang-Kyu
    • The Korean Journal of Air & Space Law and Policy
    • /
    • v.29 no.1
    • /
    • pp.3-32
    • /
    • 2014
  • A Safety Management System (SMS) is an aviation industry policy for ensuring the safety of crew, aircraft, and passengers during aircraft operation. To establish international technical standards for safe aircraft operation, the International Civil Aviation Organization (ICAO) adopted Annex 19 to the Convention on International Civil Aviation. As a result, member countries are expected to incorporate the international standards into their domestic air law in accordance with ICAO policy. The South Korean government announced that it would promote an active safety management strategy in its first aviation policy master plan of 2012, and it enforces its policy on aviation safety standards by integrating the State Safety Programme (SSP) and the Safety Management System (SMS) for the safety management required by Annex 19. The SSP is a system of government activities aimed at strengthening safety and integrating the management of safety activities, and it rests on risk information data; collecting aviation hazard information is therefore necessary for its efficient operation. The Korean government must implement the strategies required to comply with ICAO methods and standards, and airlines must strive to build a safety culture and realize improvements in safety management. It is also necessary to base regulations on aviation practice, and aviation safety regulatory requirements should reflect the opinions of the aviation industry.

A Study on the Establishment Case of Technical Standard for Electronic Record Information Package (전자문서 정보패키지 구축 사례 연구 - '공인전자문서보관소 전자문서 정보패키지 기술규격 개발 연구'를 중심으로-)

  • Kim, Sung-Kyum
    • The Korean Journal of Archival Studies
    • /
    • no.16
    • /
    • pp.97-146
    • /
    • 2007
  • The days when people used paper to create and manage all kinds of documents in the course of their work are gone; electronic documents have replaced paper. Unlike paper documents, electronic ones maximize job efficiency through their convenience in production and storage. But they have disadvantages too: it is difficult to distinguish originals from copies as one can with paper documents; it is not easy to verify whether a document has been altered or damaged; they are prone to alteration and damage from external influences in the electronic environment; and they require an enormous workforce and costs for the immediate measures demanded by changes in the S/W and H/W environment. Despite these weaknesses, electronic documents account for an ever-larger share of today's work environment thanks to their convenience and production-cost efficiency. Both the government and the private sector have sought plans that maximize their advantages while minimizing their risks. One such method is the Authorized Retention Center described in this study. Its smooth operation has two prerequisites: the legal validity of electronic documents must be guaranteed administratively, and their reliability and authenticity must first be secured technologically. Responding to these needs, the Ministry of Commerce, Industry and Energy and the Korea Institute for Electronic Commerce, the two main bodies driving the Authorized Retention Center project, revised the Electronic Commerce Act in 2005 to supplement the provisions guaranteeing the legal validity of electronic documents, and in 2006 conducted research on the long-term preservation of electronic documents and on securing their reliability, as demanded by the center's users.
To fulfill these goals of the Authorized Retention Center, this study researched a technical standard for the center's electronic record information packages and applied the ISO 14721 information package model, the standard for long-term preservation of digital data. It also suggested a process for producing and managing information packages so that SIP, AIP, and DIP metadata features would exist for the production, preservation, and user-utilization stages of electronic documents and could be implemented according to the center's policies. Based on the preceding study, this paper introduced the flow among the production and processing steps, application methods, and packages of the technical standard for the center's electronic record information packages, and suggested issues that should be continually researched in the field of records management based on the results.
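The SIP/AIP/DIP lifecycle from ISO 14721 (OAIS) that the abstract refers to can be sketched as a pair of package transformations. The fields and functions below are an illustrative simplification, not the center's actual technical specification:

```python
from dataclasses import dataclass, field

# Illustrative OAIS (ISO 14721) package model; fields are a sketch,
# not the Authorized Retention Center's actual schema.
@dataclass
class InformationPackage:
    content: str                          # the electronic record payload
    metadata: dict = field(default_factory=dict)

def ingest(sip: InformationPackage) -> InformationPackage:
    """SIP -> AIP: add preservation metadata at ingest."""
    aip = InformationPackage(sip.content, dict(sip.metadata))
    aip.metadata["fixity"] = hash(sip.content)  # placeholder for a real checksum
    aip.metadata["package_type"] = "AIP"
    return aip

def disseminate(aip: InformationPackage) -> InformationPackage:
    """AIP -> DIP: strip internal preservation fields for delivery to users."""
    meta = {k: v for k, v in aip.metadata.items() if k != "fixity"}
    meta["package_type"] = "DIP"
    return InformationPackage(aip.content, meta)

sip = InformationPackage("contract.pdf bytes...", {"producer": "Company A"})
dip = disseminate(ingest(sip))
print(dip.metadata["package_type"])  # -> DIP
```

A real implementation would use standardized preservation metadata and genuine checksums rather than the placeholders shown here.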

A Study on e-Healthcare Business Model: Focusing on Business Ecosystem Approach (e헬스케어 비즈니스모델에 관한 연구: 비즈니스생태계 접근 중심으로)

  • Kim, Youngsoo;Jung, Jai-Jin
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship
    • /
    • v.14 no.1
    • /
    • pp.167-185
    • /
    • 2019
  • As most G-20 countries expect medical spending to grow rapidly over the next few decades, the burden of healthcare costs continues to grow globally due to an increase in the elderly population, chronic illnesses, and the ongoing quality improvement of healthcare services. Under the rapidly changing technological environment of healthcare and IT convergence, however, the problem may become even bigger if it is not properly recognized and prepared for. In the context of this paradigm shift and the growing problems of the medical field, complex responses in technical, institutional, and business aspects are urgently needed. The key is to derive a business model appropriate for businesses that integrate IT into the medical field. With the arrival of the 4th industrial revolution, new technologies such as the Internet of Things have been applied to e-healthcare, and the need for new business models has emerged. E-healthcare in the Internet era followed a traditional firm-based business model; however, owing to the dynamics and complexity characteristic of the Internet of Things, a business ecosystem-based approach is needed. Based on research into the e-healthcare business ecosystem grounded in emerging technologies such as the Internet of Things, this paper presents and analyzes the major success factors of the ecosystem in terms of a three-layer structure. The three layers of the business ecosystem are defined as (1) the Infrastructure Layer, (2) the Character Layer, and (3) the Stakeholder Layer. Four key success factors for the e-healthcare business ecosystem are suggested: (1) introduction of the iHealthcare concept, (2) expansion of the business ecosystem, (3) innovation of the business ecosystem change process, and (4) innovation of business ecosystem leadership.

Development of Coaching Model to Enhance Teaching Capability of Lifelong Educator (평생교육교수자의 교수역량 강화를 위한 코칭모델 개발)

  • Son, Sung Hwa;Kim, Jin Sook
    • The Journal of the Convergence on Culture Technology
    • /
    • v.7 no.4
    • /
    • pp.369-376
    • /
    • 2021
  • The purpose of this study is to develop a coaching model that can enhance the teaching capability of lifelong educators. To achieve this, the study reviews and analyzes documentary records related to the diverse teaching capabilities, operational realities, and coaching methods of lifelong educators. In addition, in-depth interviews about teaching capability were conducted with field experts who have worked at lifelong education institutions for more than 10 years. As a result, the study developed a coaching model for the teaching capability of lifelong educators through matrix analysis. First, according to the documentary studies, the paradigm of lifelong education has shifted toward centering on learners' demands with the advent of the 4th industrial revolution, which suggests that coaching capability to enhance educators' capability should come first. A lifelong educator should possess capabilities including identifying visions and goals, creating a mission statement, developing coaching skills and procedures, managing crises, and coaching as an expert in the lifelong education field. Second, a learner-centered model for lifelong teaching capability could be developed by adopting the teaching capabilities suggested by the field experts. According to the experts, it is essential to develop a program model for acquiring professional knowledge, communication capability, understanding of adult learners, and interpersonal capability; a model that develops such capabilities can strengthen lifelong teaching capability focused on the demands of adult learners, the field's major consumers. Third, the coaching model for enhancing an educator's teaching capability involves acquiring and implementing sufficient step-by-step teaching capability through a procedure comprising entrance, progress, critique, and return.
This study suggests that, after the critique, lifelong educators themselves can newly develop and extend their teaching capability by pursuing it through the return process.

A Study on the Optimal Process Parameters for Recycling of Electric Arc Furnace Dust (EAFD) by Rotary Kiln (Rotary Kiln에 의한 전기로 제강분진(EAFD)의 재활용을 위한 최적의 공정변수에 관한 연구)

  • Jae-hong Yoon;Chi-hyun Yoon;Myoung-won Lee
    • Resources Recycling
    • /
    • v.33 no.4
    • /
    • pp.47-61
    • /
    • 2024
  • Among recycling technologies for recovering the zinc contained in large amounts in electric arc furnace dust (EAFD), the most widely commercialized worldwide is the Waelz kiln process. In this process, components such as Zn and Pb in EAFD are reduced and volatilized (an endothermic reaction) in a high-temperature kiln, then re-oxidized in the gas phase (an exothermic reaction) and recovered as crude zinc oxide (60 wt% Zn) in the bag filter installed at the rear end of the kiln. In this study, an experimental Waelz kiln was built to investigate the optimal process variable values for practical application to a commercial-scale recycling process. Pellets containing EAFD, reducing agents, and limestone were continuously charged into the kiln, and the feed rate, heating temperature, and residence time were examined to obtain the optimal crude zinc oxide recovery rate. The optimal pellet manufacturing conditions (drum tilt angle, moisture addition, mixing time, etc.) were also investigated. In addition, referring to the SiO2-CaO-FeO ternary phase diagram, the formation behavior of low-melting-point compounds, reaction products inside the kiln, was investigated as a function of pellet basicity, along with their reactivity (adhesion) with the castable refractory on the kiln's inner wall. Finally, to quantitatively assess anthracite as a substitute for coke as the reducing agent, changes in the temperature distribution inside the kiln, where the oxidation/reduction reactions occur, the quality of the crude zinc oxide, and the behavior of tar in the anthracite were investigated as the amount of anthracite was increased.

Design of Client-Server Model For Effective Processing and Utilization of Bigdata (빅데이터의 효과적인 처리 및 활용을 위한 클라이언트-서버 모델 설계)

  • Park, Dae Seo;Kim, Hwa Jong
    • Journal of Intelligence and Information Systems
    • /
    • v.22 no.4
    • /
    • pp.109-122
    • /
    • 2016
  • Recently, big data analysis has developed into a field of interest to individuals and non-experts as well as companies and professionals. Accordingly, it is used for marketing and for solving social problems by analyzing openly available or directly collected data. In Korea, various companies and individuals are attempting big data analysis, but they struggle from the initial stage because of limited data disclosure and collection difficulties. System improvements for big data activation and big data disclosure services are being carried out in Korea and abroad, mainly as services for opening public data, such as the domestic Government 3.0 portal (data.go.kr). Beyond these government efforts, services that share data held by corporations or individuals are running, but useful data are hard to find because so little is shared. Moreover, big traffic problems can occur because grasping the attributes of and basic information about shared data requires downloading and examining the data in full. A new system for big data processing and utilization is therefore needed. First, big data pre-analysis technology is needed as a way to solve the sharing problem. Pre-analysis, a concept proposed in this paper, means providing users with results generated by analyzing the data in advance. Through pre-analysis, the usability of big data improves because users searching for data receive information that conveys its properties and characteristics. In addition, by sharing the summary data or sample data generated through pre-analysis, the security problems that may arise when original data are disclosed can be avoided, enabling big data sharing between data providers and data users.
Second, appropriate preprocessing results must be generated quickly according to the disclosure level or network status of the raw data and delivered to users through distributed big data processing using Spark. Third, to solve the big traffic problem, the system monitors network traffic in real time; when preprocessing data requested by a user, it reduces them to a size the current network can carry before transmission, so that no big traffic occurs. This paper presents various data sizes according to disclosure level through pre-analysis; this method is expected to generate low traffic compared with the conventional approach of sharing only raw data across many systems. The paper describes how to solve the problems that occur when big data are released and used, and how to facilitate sharing and analysis. The client-server model uses Spark for fast analysis and processing of user requests, with a Server Agent and a Client Agent deployed on the server and client sides, respectively. The Server Agent, required by the data provider, performs pre-analysis of big data to generate a Data Descriptor containing information on Sample Data, Summary Data, and Raw Data; it also performs fast, efficient preprocessing through distributed big data processing and continuously monitors network traffic. The Client Agent, placed on the data user side, can search big data quickly through the Data Descriptor produced by pre-analysis and can request desired data from the server, downloading the big data according to its disclosure level. The model thus separates the Server Agent from the Client Agent when a data provider publishes data for users.
In particular, the paper focuses on big data sharing, distributed big data processing, and the big traffic problem, constructing the detailed modules of the client-server model and presenting the design of each module. In a system designed on the proposed model, a user who acquires data analyzes it in the desired direction or preprocesses it into new data; by publishing the newly processed data through the Server Agent, the data user takes on the role of data provider. A data provider can likewise obtain useful statistical information from the Data Descriptor of the data it discloses and become a data user performing new analyses on the sample data. In this way, raw data are processed and the processed big data are utilized, forming a natural sharing environment in which the roles of data provider and data user are not distinguished: everyone can be both a provider and a user. The client-server model thus solves the big data sharing problem, provides a free sharing environment for secure big data disclosure, and offers an ideal shared service for easily finding big data.
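The Server Agent's pre-analysis step, producing a Data Descriptor with sample and summary data instead of shipping raw data, can be sketched as follows. The field names are illustrative assumptions; the paper's actual descriptor also covers Spark-based preprocessing and disclosure levels:

```python
import statistics

def build_data_descriptor(raw, sample_size=3):
    """Sketch of a pre-analysis step: summarize a numeric dataset so
    users can judge relevance without downloading the raw data.
    (Field names are hypothetical, not the paper's exact schema.)"""
    return {
        "sample_data": raw[:sample_size],
        "summary_data": {
            "count": len(raw),
            "mean": statistics.fmean(raw),
            "stdev": statistics.stdev(raw),
            "min": min(raw),
            "max": max(raw),
        },
        # The raw data itself is not shared; only its size is advertised.
        "raw_data_size": len(raw),
    }

readings = [12.1, 9.8, 11.4, 10.0, 13.3, 8.7]
desc = build_data_descriptor(readings)
print(desc["summary_data"]["count"])  # -> 6
```

A Client Agent would browse such descriptors to decide which datasets are worth requesting at a given disclosure level, which is what keeps full-dataset downloads (and hence big traffic) off the network.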

The Relationship Between DEA Model-based Eco-Efficiency and Economic Performance (DEA 모형 기반의 에코효율성과 경제적 성과의 연관성)

  • Kim, Myoung-Jong
    • Journal of Environmental Policy
    • /
    • v.13 no.4
    • /
    • pp.3-49
    • /
    • 2014
  • Growing stakeholder interest in corporate responsibility for the environment and tightening environmental regulations are highlighting the importance of environmental management more than ever. However, companies' awareness of the importance of the environment still lags behind, and related academic work has not reached consistent conclusions on the relationship between environmental and economic performance. One reason is the different ways of measuring the two performances: the evaluation scope of economic performance is relatively narrow and can be measured in a unified unit such as price, whereas environmental performance is diverse and measured in a wide range of units rather than a single unified one, so results can differ depending on the indicators selected. Resolving this problem requires generalized, standardized performance indicators; in particular, such indicators should cover both environmental and economic performance, since the recent idea of environmental management has expanded to encompass sustainability. Another reason is that most current research focuses on the motives for environmental investment and on environmental performance, without offering a guideline for effectively implementing an environmental management strategy. For example, a process improvement strategy or a market differentiation strategy could be deployed by comparing environmental competitiveness among companies in the same or similar industries, securing a virtuous cycle between environmental and economic performance. This report proposes a novel method for measuring eco-efficiency using Data Envelopment Analysis (DEA), which can combine multiple environmental and economic performances.
Based on the eco-efficiencies, environmental competitiveness is analyzed and the optimal combination of inputs and outputs is recommended for improving the eco-efficiency of inefficient firms. Furthermore, panel analysis is applied to the causal relationship between eco-efficiency and economic performance, and a pooled regression model is used to investigate that relationship. The four-year eco-efficiencies of 23 companies between 2010 and 2013 are obtained from the DEA analysis; efficiencies are compared across the 23 companies in terms of technical efficiency (TE), pure technical efficiency (PTE), and scale efficiency (SE), and recommendations for the optimal combination of inputs and outputs are suggested for the inefficient companies. The panel analysis demonstrates causality from eco-efficiency to economic performance, and the pooled regression shows that eco-efficiency positively affects the companies' financial performance (ROA and ROS) as well as firm value (Tobin's Q, stock price, and stock returns). This report proposes a novel approach for generating standardized performance indicators from multiple environmental and economic performances, enhancing the generality of related research and providing deep insight into the sustainability of environmental management. Furthermore, using the efficiency indicators obtained from the DEA model, the causes of changes in eco-efficiency can be investigated and an effective environmental management strategy suggested. Finally, by providing empirical evidence that environmental investment can improve economic performance, this report can serve as a motive for environmental management.
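To give a feel for DEA-style efficiency scores: in the special case of one input and one output per firm, CCR (constant returns to scale) efficiency reduces to each firm's output/input ratio normalized by the best ratio. The multi-input, multi-output model used in the paper requires solving a linear program per firm; the numbers below are purely illustrative:

```python
def ccr_efficiency(inputs, outputs):
    """CCR DEA efficiency for the special case of one input and one
    output per decision-making unit: each firm's output/input ratio
    divided by the best ratio. The general multi-input/multi-output
    case needs a linear program per firm; this is a minimal sketch."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical firms: input = an environmental burden measure
# (e.g., emissions), output = value added. Numbers are made up.
env_burden = [100.0, 80.0, 120.0]
value_added = [50.0, 48.0, 45.0]
eff = ccr_efficiency(env_burden, value_added)
print([round(e, 3) for e in eff])  # -> [0.833, 1.0, 0.625]
```

A score of 1.0 marks a firm on the efficient frontier; for inefficient firms, the gap to 1.0 is what the paper's recommended input/output combinations aim to close.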
