• Title/Summary/Keyword: Task Technology Fit


A Study on the Business Model of Fashion Mobile Commerce by Quality Evaluation (패션 모바일 커머스 품질 평가에 대한 비즈니스 모델 연구)

  • Na, Younkue
    • Journal of Fashion Business
    • /
    • v.18 no.1
    • /
    • pp.1-21
    • /
    • 2014
  • This study goes beyond fragmentary fact-finding surveys on the application of mobile commerce: it develops an evaluation model for fashion mobile commerce websites and examines the validity of comprehensive quality evaluation factors for fashion mobile commerce based on Task-Technology Fit (TTF). To fulfill the study objectives, a total of 433 questionnaires were collected from customers with first-hand experience of purchasing fashion merchandise through mobile commerce. Judgement sampling was used, targeting respondents in their 20s and 30s over a two-month period. The path analysis yielded the following results. First, among the quality factors, M-marketing and M-sales both affected perceived usability. Second, fashion mobile shopping fit (TTF) affected perceived usability, customer satisfaction, and perceived value, and perceived usability in turn affected customer satisfaction and immersion. Third, customer satisfaction, perceived value, and immersion all affected purchase intention.
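
The path model described in this abstract (quality factors → perceived usability → satisfaction/value/immersion → purchase intention) can be estimated as a chain of regressions. The sketch below is a minimal illustration only, assuming a pandas DataFrame of averaged survey scores with hypothetical column names; it is not the authors' actual analysis code.

```python
# Minimal path-analysis sketch for the TTF model described above.
# Assumes a DataFrame of averaged survey constructs with hypothetical
# column names; the authors' actual measurement model is not reproduced.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fashion_mcommerce_survey.csv")  # hypothetical file

# Path 1: mobile commerce quality factors -> perceived usability
m1 = smf.ols("perceived_usability ~ m_marketing + m_sales + ttf", data=df).fit()

# Path 2: TTF and perceived usability -> satisfaction, value, immersion
m2 = smf.ols("satisfaction ~ ttf + perceived_usability", data=df).fit()
m3 = smf.ols("perceived_value ~ ttf", data=df).fit()
m4 = smf.ols("immersion ~ perceived_usability", data=df).fit()

# Path 3: satisfaction, value, immersion -> purchase intention
m5 = smf.ols("purchase_intention ~ satisfaction + perceived_value + immersion",
             data=df).fit()

for i, m in enumerate([m1, m2, m3, m4, m5], start=1):
    print(f"--- path regression {i} ---")
    print(m.params.round(3))
```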

Application and Research of Monte Carlo Sampling Algorithm in Music Generation

  • MIN, Jun;WANG, Lei;PANG, Junwei;HAN, Huihui;LI, Dongyang;ZHANG, Maoqing;HUANG, Yantai
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.10
    • /
    • pp.3355-3372
    • /
    • 2022
  • Composing music is an inspired yet challenging task, involving many considerations such as assigning pitches, determining rhythm, and arranging accompaniment. Algorithmic composition aims to develop algorithms for music composition, and algorithmic composition using artificial intelligence technologies has recently received considerable attention. In particular, computational intelligence is widely used and achieves promising results in the creation of music. This paper studies music generation based on the Monte Carlo (MC) sampling algorithm. First, MIDI music files are converted into digital data, and a logistic fitting method is applied to the extracted time series to obtain its distribution pattern. Besides the time series, the converted data also include duration, pitch, and velocity, and MC simulation is used to summarize their respective distributions. The two main control parameters are the discrete sampling value and the standard deviation. After processing these parameters, the data are converted back into a MIDI file, and the output is compared with music generated by an LSTM neural network in a comprehensive evaluation.
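
As a rough illustration of the Monte Carlo sampling step described in this abstract, the sketch below draws note attributes from empirical distributions of already-extracted MIDI note data and perturbs them with Gaussian noise; `n_samples` and `sigma` stand in for the paper's two control parameters. The MIDI parsing and writing, and the logistic time-series fit, are omitted, and all data are made up.

```python
# Monte Carlo sketch: draw a new note sequence from the empirical
# pitch/duration/velocity distributions of already-extracted MIDI note data.
# n_samples and sigma stand in for the paper's two control parameters
# (discrete sampling value and standard deviation); all data are made up.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical note attributes extracted from MIDI files.
pitches    = np.array([60, 62, 64, 65, 67, 67, 65, 64, 62, 60])
durations  = np.array([0.5, 0.5, 1.0, 0.5, 0.5, 1.0, 0.5, 0.5, 1.0, 2.0])
velocities = np.array([80, 82, 78, 90, 85, 88, 76, 80, 84, 70])

def mc_sample(values, n_samples, sigma):
    """Resample the observed values, then add Gaussian jitter of std sigma."""
    return (rng.choice(values, size=n_samples, replace=True)
            + rng.normal(0.0, sigma, size=n_samples))

n_samples = 16                      # number of discrete samples to draw
new_pitch = np.clip(np.round(mc_sample(pitches, n_samples, sigma=1.0)), 0, 127)
new_dur   = np.clip(mc_sample(durations, n_samples, sigma=0.1), 0.125, 4.0)
new_vel   = np.clip(np.round(mc_sample(velocities, n_samples, sigma=5.0)), 1, 127)

for p, d, v in zip(new_pitch, new_dur, new_vel):
    print(f"note pitch={int(p):3d} duration={d:.3f} velocity={int(v):3d}")
```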

Smart monitoring analysis system for tunnels in heterogeneous rock mass

  • Kim, Chang-Yong;Hong, Sung-Wan;Bae, Gyu-Jin;Kim, Kwang-Yeom;Schubert, Wulf
    • 한국지구물리탐사학회:학술대회논문집
    • /
    • 2003.11a
    • /
    • pp.255-261
    • /
    • 2003
  • Tunnelling in poor and heterogeneous ground is a difficult task. Even with a good geological investigation, uncertainties with respect to the local rock mass structure will remain. Especially for such conditions, a reliable short-term prediction of the conditions ahead of and outside the tunnel profile is of paramount importance for the choice of appropriate excavation and support methods. The information contained in absolute displacement monitoring data allows a comprehensive evaluation of the displacements and the determination of the behaviour and influence of an anisotropic rock mass. Case histories and numerical simulations show that changes in the displacement vector orientation can indicate changing rock mass conditions ahead of the tunnel face (Schubert & Budil 1995, Steindorfer & Schubert 1997). Further research has been conducted to quantify the influence of weak zones on stresses and displacements (Grossauer 2001). Sellner (2000) developed software (GeoFit®) that allows displacements to be predicted. The function parameters describe the time- and advance-dependent deformation of a tunnel. Routinely applying this method at each measuring section allows trends of those parameters to be determined. It shows that the trends of the parameter sets indicate changes in the stiffness of the rock mass outside the tunnel in a similar way as the displacement vector orientation does. Three-dimensional finite element simulations of different weakness zone properties, thicknesses, and orientations relative to the tunnel axis were carried out, and the function parameters were evaluated from the results. The results are compared to monitoring results from alpine tunnels in heterogeneous rock. The good qualitative correlation between trends observed on site and numerical results gives hope that a routine determination of the function parameters during excavation can improve the prediction of rock mass conditions ahead of the tunnel face. Implementing the rules developed from experience and simulations into the monitoring data evaluation program allows information on the expected rock mass quality ahead of the tunnel to be issued automatically.

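The prediction approach described above, fitting time- and advance-dependent functions to the monitoring data at each measuring section and tracking the trend of the fitted parameters, can be sketched as a simple curve fit. The example below assumes a generic exponential convergence law as a stand-in for the function actually implemented in GeoFit; the data and parameter names are hypothetical.

```python
# Curve-fitting sketch for displacement monitoring data at one measuring
# section. A generic exponential convergence law is assumed as a stand-in
# for the time/advance-dependent function used in GeoFit; data are made up.
import numpy as np
from scipy.optimize import curve_fit

def convergence(t, u_final, T):
    """u(t) = u_final * (1 - exp(-t/T)): assumed displacement law."""
    return u_final * (1.0 - np.exp(-t / T))

# Hypothetical monitored crown displacements [mm] at days after excavation.
days  = np.array([1, 2, 4, 7, 10, 14, 21, 28], dtype=float)
displ = np.array([4.1, 7.0, 10.8, 14.2, 16.0, 17.3, 18.4, 18.9])

params, _ = curve_fit(convergence, days, displ, p0=(20.0, 5.0))
u_final, T = params
print(f"fitted final displacement ~ {u_final:.1f} mm, time constant ~ {T:.1f} d")

# Tracking u_final and T section by section along the tunnel would reveal the
# parameter trends that the paper relates to changing rock mass stiffness.
```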

A Study on the Intention to Use of the AI-related Educational Content Recommendation System in the University Library: Focusing on the Perceptions of University Students and Librarians (대학도서관 인공지능 관련 교육콘텐츠 추천 시스템 사용의도에 관한 연구 - 대학생과 사서의 인식을 중심으로 -)

  • Kim, Seonghun;Park, Sion;Park, Jiwon;Oh, Youjin
    • Journal of Korean Library and Information Science Society
    • /
    • v.53 no.1
    • /
    • pp.231-263
    • /
    • 2022
  • The understanding and ability to utilize technology incorporating artificial intelligence (AI) has become a required basic skill for people living in today's information age, and various members of the university community have increasingly become aware of the need for AI education. Amidst such shifting societal demands, both domestic and international university libraries have recognized users' need for educational content centered on AI, but a user-centered service that provides personalized recommendations of digital AI educational content is not yet available. As the demand for AI education among university students grows, it is critical that university libraries acquire a clear understanding of user intention toward an AI educational content recommender system and the factors contributing to its success. This study sought to ascertain the factors affecting acceptance of such a system, using the Extended Technology Acceptance Model with added variables (innovativeness, self-efficacy, social influence, system quality, and task-technology fit) in addition to perceived usefulness, perceived ease of use, and intention to use. Quantitative research was conducted via online surveys of university students, and qualitative research was conducted through written interviews with university librarians. Results show that all groups, regardless of gender, year, or major, intend to use the AI-related educational content recommendation system, with the task suitability factor being the most dominant variable affecting use intention. University librarians also agreed on the necessity of the recommendation system, while citing budget and content quality issues as realistic constraints on such a system.
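
To illustrate how the extended TAM variables listed in this abstract can be related to use intention, the sketch below standardizes the constructs and fits a single regression so that coefficient magnitudes can be compared (the abstract reports task suitability as the dominant factor). Column names and data are hypothetical; this is not the authors' analysis.

```python
# Sketch: compare standardized effects of extended-TAM constructs on
# intention to use the recommendation system. Hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ai_content_survey.csv")  # hypothetical survey data

constructs = ["perceived_usefulness", "perceived_ease_of_use", "innovativeness",
              "self_efficacy", "social_influence", "system_quality",
              "task_technology_fit", "intention_to_use"]
z = (df[constructs] - df[constructs].mean()) / df[constructs].std()

model = smf.ols(
    "intention_to_use ~ perceived_usefulness + perceived_ease_of_use + "
    "innovativeness + self_efficacy + social_influence + system_quality + "
    "task_technology_fit",
    data=z,
).fit()

# Standardized coefficients: the largest one indicates the dominant factor.
print(model.params.drop("Intercept").sort_values(ascending=False).round(3))
```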

Defending the Indo-Pacific Liberal International Order: Lessons from France in Cold War Europe For Promoting Détente in Asia

  • Benedict E. DeDominicis
    • International Journal of Advanced Culture Technology
    • /
    • v.11 no.2
    • /
    • pp.82-108
    • /
    • 2023
  • As tension escalates between the US and China, scenarios for maintaining peace in Northeast Asia imply that secondary powers will perceive increasing incentives to reappraise their respective international roles. This study proposes that an analysis of France's Cold War role in Europe and the world under President Charles de Gaulle provides insights into conflict management in an increasingly multipolar international political environment. These secondary powers' interests in preventing a so-called new Cold War between the US and China include avoiding its excessive economic costs, if only because China is a massive trade partner. This study engages in theoretical framework-informed process tracing of de Gaulle's role. It explicates the assumptions that functionally underpinned de Gaulle's policy of soft balancing between the US and China. The analysis explores de Gaulle's contribution to the decay of the Cold War and illuminates his contribution to a regional international environment that made West German Chancellor Willy Brandt's Ostpolitik strategy more politically feasible. This study applies these findings in the formulation of strategy recommendations focusing on Japan. Valid inferences regarding the predominant motivations driving American and Chinese international interaction are necessary for this task. To the extent that the US and China have entered into a conflict spiral, Japan's hedging towards Washington is further incentivized. Tokyo would need to convince the Chinese that Japan is no longer Washington's unsinkable aircraft carrier off its coast. Tokyo, like de Gaulle's France, would maintain close relations with Washington, but it would need to project to its interlocutors its commitment to its own strategic autonomy. Tokyo's emphasis on closer relations with liberal democratic Indo-Pacific actors would potentially fit well with a commitment to strategic autonomy in defense of the global liberal order.

The Effects of Information Systems Quality on the Performance of Emotional Labors : Focused on the Airline Call Centers (정보시스템 품질이 감정노동 성과에 미치는 영향: 항공사 콜센터를 중심으로)

  • Park, Wonhee;Kim, Shinkon;Kim, Changkyu
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.16 no.12
    • /
    • pp.8800-8811
    • /
    • 2015
  • When the crucial role of the agent in communicating with the customer is acknowledged well enough to relieve the agent's stress, it leads simultaneously to a decrease in the agent's emotional labor and an improvement in the organization's performance. However, research on the relationship between information systems and emotional labor has scarcely been conducted, even though the importance of emotional labor is actively researched and discussed these days. Therefore, this study examines how the quality of the airline call center information system affects expectation-confirmation, and how expectation-confirmation and self-efficacy affect emotional labor performance. An analysis of responses from 436 call center agents shows that providing them with a high-quality information system increased satisfaction and pride in their job; this mechanism subsequently reduces the intensity of emotional labor, which ultimately improves service performance. The implications of this study can be summarized as follows. First, this research presents practical guidelines to decision-makers involved in airline call center operations for introducing and expanding a successful call center information system. Second, this research suggests a method to inspect and diagnose such a system by applying the measurement model presented here to the airline information system and analyzing it. Third, the performance-measuring model developed to measure the performance of the airline call center information system can also be used in similar information systems as a basis for diagnosing the situation and setting directions for improvement.

Mixed-Integer Programming based Techniques for Resource Allocation in Underlay Cognitive Radio Networks: A Survey

  • Alfa, Attahiru S.;Maharaj, B.T.;Lall, Shruti;Pal, Sougata
    • Journal of Communications and Networks
    • /
    • v.18 no.5
    • /
    • pp.744-761
    • /
    • 2016
  • For about the past decade and a half, research efforts into cognitive radio networks (CRNs) have increased dramatically, because CRN is recognized as a technology with the potential to squeeze the most out of the existing spectrum and hence virtually increase the effective capacity of a wireless communication system. The resulting increased capacity is still a limited resource, and its optimal allocation is a critical requirement in order to realize its full benefits. Allocating these additional resources to the secondary users (SUs) in a CRN is an extremely challenging task, and integer programming based optimization tools have to be employed to achieve goals that include, among other aspects, increasing SU throughput without interfering with the activities of primary users. The theory of the optimization tools that can be used for resource allocation (RA) in CRNs has been well established in the literature; convex programming is one of them, in fact the major one. However, when it comes to application and implementation, practical problems often do not fit exactly into the format of these well-established tools, and researchers have to apply approximations of various forms to assist in the process. In this survey paper, the optimization tools that have been applied to RA in CRNs are reviewed. In some instances the limitations of the techniques used are pointed out, and creative tools developed by researchers to solve the problems are identified. Some ideas for tools that researchers could consider are suggested, and directions for future research to improve on the existing tools are presented.
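
As a small, concrete instance of the kind of integer-programming formulation surveyed in this paper, the sketch below assigns channels to secondary users to maximize total throughput while keeping interference at the primary user on each channel below a budget. The data, variable names, and the exact constraint structure are illustrative assumptions, not a formulation taken from the paper.

```python
# Toy underlay-CRN resource allocation as a binary integer program (PuLP).
# Throughput and interference numbers are made up for illustration.
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum, value

sus = ["SU1", "SU2", "SU3"]          # secondary users
channels = ["c1", "c2"]              # available channels
rate = {("SU1", "c1"): 3.0, ("SU1", "c2"): 2.0,   # achievable throughput
        ("SU2", "c1"): 2.5, ("SU2", "c2"): 3.5,
        ("SU3", "c1"): 1.5, ("SU3", "c2"): 2.0}
interf = {("SU1", "c1"): 0.4, ("SU1", "c2"): 0.2,  # interference at the PU
          ("SU2", "c1"): 0.3, ("SU2", "c2"): 0.5,
          ("SU3", "c1"): 0.2, ("SU3", "c2"): 0.3}
i_max = {"c1": 0.6, "c2": 0.6}       # per-channel interference budget

x = LpVariable.dicts("assign", [(s, c) for s in sus for c in channels], cat=LpBinary)

prob = LpProblem("underlay_crn_ra", LpMaximize)
prob += lpSum(rate[s, c] * x[s, c] for s in sus for c in channels)
for s in sus:                        # each SU gets at most one channel
    prob += lpSum(x[s, c] for c in channels) <= 1
for c in channels:                   # protect the primary user on each channel
    prob += lpSum(interf[s, c] * x[s, c] for s in sus) <= i_max[c]

prob.solve()
for s in sus:
    for c in channels:
        if value(x[s, c]) > 0.5:
            print(f"{s} -> {c}")
```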

A Study on the Design of CBRN Response Training Program in Korea Using Activity-Action Diagram Method (Activity-Action Diagram 기법을 활용한 한국형 화생방 교육훈련 프로그램 설계에 관한 연구)

  • Ham, Eun-Gu;Kim, Tae-Hwan
    • Journal of the Society of Disaster Information
    • /
    • v.10 no.1
    • /
    • pp.159-169
    • /
    • 2014
  • The development of science and technology has brought the conveniences of civilization, but nuclear, gas, explosion, and spill accidents around the world, together with the possibility of chemical or biological terrorism, make a collective response capability an urgent national task. In this study, the CBRN training programs of major countries such as the U.S. and Canada were analyzed, and CBRN training programs suited to the circumstances in Korea were developed. In developing the Korean CBRN training programs, the Activity-Action Diagram technique was used: for each event in a CBRN scenario, the required Activity is defined, and for each Activity the detailed, context-sensitive corrective Actions are specified. In particular, this structure was used to establish education and training programs for the initial response to a CBRN emergency.
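
The Activity-Action Diagram idea described above, where each scenario event maps to Activities and each Activity to its detailed Actions, can be represented as a simple nested structure. The events, activities, and actions below are illustrative placeholders, not the actual content of the Korean CBRN program.

```python
# Illustrative Activity-Action mapping for a CBRN initial-response scenario.
# Event names, activities, and actions are placeholders, not the real program.
scenario = {
    "chemical_release_detected": {
        "alert_and_report": ["sound on-site alarm", "notify control center"],
        "initial_protection": ["don protective equipment", "move upwind of release"],
    },
    "contaminated_casualties_found": {
        "casualty_handling": ["triage at hot-zone boundary", "begin decontamination"],
        "zone_control": ["establish hot/warm/cold zones", "restrict entry"],
    },
}

def actions_for(event: str) -> None:
    """Print the activities and detailed actions defined for one event."""
    for activity, actions in scenario.get(event, {}).items():
        print(f"[{event}] activity: {activity}")
        for act in actions:
            print(f"    action: {act}")

actions_for("chemical_release_detected")
```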

Study on the wetsuit manufacturing status in Korea and future research task (국내 습식 잠수복 생산 업체의 생산실태 조사 및 향후 연구과제)

  • Shin, Hyun-Suk;Choi, Inyoung
    • Journal of the Korea Fashion and Costume Design Association
    • /
    • v.23 no.3
    • /
    • pp.99-108
    • /
    • 2021
  • The present study examines the overall manufacturing status of local wetsuit makers, problems in the manufacturing process, and future research tasks. The study revealed that most manufacturers use neoprene fabric of varying thickness, depending on the body part: normally, 3 mm-thick fabric is used for high-activity body parts, while 5 mm-thick fabric is used for areas requiring greater thermal insulation. In terms of the manufacturing method, the tools and processes used by the companies were found to be similar. However, because wetsuits require a more complicated manufacturing method than general clothing, there were some differences in the processes from company to company, such as bonding and ease treatments. According to the wetsuit manufacturers, they make incisions in consideration of the body's curvature and the overall shape and design of the wetsuit when developing patterns; for example, most answered that they preform the wrist and ankle parts, where the body's curvature is pronounced. On the question regarding the "difficult manufacturing process", the most frequent response was the "bonding" process. Most manufacturers were found to focus on designs that can improve mobility and clothing fit, and commonly cited low order quantities as an operational difficulty. As for the question on the wetsuit-related technology needed in the future, the "development of various designs" was the most frequent answer, followed by the "development of lightweight and diverse materials".

Parallel Distributed Implementation of GHT on Ethernet Multicluster (이더넷 다중 클러스터에서 GHT의 병렬 분산 구현)

  • Kim, Yeong-Soo;Kim, Myung-Ho;Choi, Heung-Moon
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.46 no.3
    • /
    • pp.96-106
    • /
    • 2009
  • Extending the scale of distributed processing within a single Ethernet cluster is physically restricted by the maximum number of ports per switch. This paper presents an implementation of an MPI-based multicluster consisting of multiple Ethernet switches to extend the scale of distributed processing, together with an asymptotic analysis of the communication overhead based on an execution-time analysis model. To determine an optimal task partitioning, we analyzed the processing time of various partitioning schemes, and the AAP (accumulator array partitioning) scheme was chosen to minimize the overall communication overhead. The scope of the data partitioned in AAP was adjusted to fit the added nodes, and a suitable load-balancing algorithm was implemented. The communication overhead was further alleviated by exploiting pipelined broadcast, flat-tree-based result gathering, and overlapping of communication with computation. A linear pipelined broadcast was used to reduce the communication overhead between clusters, which are interconnected by a single link. Experimental results show nearly linear speedup for the proposed parallel distributed GHT implemented on an MPI-based Ethernet multicluster with four 100 Mbps Ethernet switches and up to 128 Pentium PC nodes.
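
The accumulator array partitioning (AAP) scheme described above for the generalized Hough transform (GHT) can be sketched with mpi4py: each rank owns a slice of the accumulator's rotation-angle dimension, votes only into its own slice, and only per-rank best candidates are gathered, which keeps inter-node traffic small. The R-table, edge points, and parameter ranges below are hypothetical, and the GHT voting is reduced to its bare essentials; this is not the paper's implementation.

```python
# AAP-style parallel GHT sketch with mpi4py: each rank votes into its own
# slice of the rotation-angle axis and only local best candidates are gathered.
# R-table, edge points, and parameter ranges are hypothetical.
# Run with e.g.: mpiexec -n 4 python ght_aap.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

H, W, N_THETA = 64, 64, 36                    # accumulator: (y, x, theta)
edges = [(10, 12), (20, 25), (31, 40)]        # hypothetical edge points
r_table = [(5.0, 0.0), (7.0, np.pi / 2)]      # hypothetical (r, alpha) entries

# Each rank owns a contiguous slice of the theta axis (the AAP partition).
thetas = np.array_split(np.arange(N_THETA), size)[rank]
acc = np.zeros((H, W, len(thetas)), dtype=np.int32)

for (ey, ex) in edges:                        # every rank sees all edge points
    for k, t in enumerate(thetas):
        angle = 2 * np.pi * t / N_THETA
        for r, alpha in r_table:              # vote for candidate reference points
            yc = int(round(ey + r * np.sin(alpha + angle)))
            xc = int(round(ex + r * np.cos(alpha + angle)))
            if 0 <= yc < H and 0 <= xc < W:
                acc[yc, xc, k] += 1

# Gather only each rank's best local candidate (votes, y, x, theta index).
iy, ix, ik = np.unravel_index(np.argmax(acc), acc.shape)
candidate = (int(acc[iy, ix, ik]), int(iy), int(ix), int(thetas[ik]))
best = comm.gather(candidate, root=0)
if rank == 0:
    print("global best (votes, y, x, theta):", max(best))
```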