• Title/Summary/Keyword: Internet service usage


Drivers for Trust and Continuous Usage Intention on OTP: Perceived Security, Security Awareness, and User Experience (OTP에 대한 신뢰 및 재사용의도의 결정요인: 인지된 보안성, 보안의식 및 사용자경험을 중심으로)

  • Yun, Hae-Jung;Jang, Jae-Bin;Lee, Choong-C.
    • Journal of the Korea Society of Computer and Information
    • /
    • v.15 no.12
    • /
    • pp.163-173
    • /
    • 2010
  • PKI (Public Key Infrastructure)-based certification technology has limitations when applied universally to mobile banking services on smartphones, since PKI depends on one specific web browser, Internet Explorer. OTP (One Time Password) is considered a substitute for, or complement to, PKI, but it still shows a low acceptance rate. In this research, we therefore analyze why OTP has not become popular and provide useful implications for making OTP more extensively and frequently used in the mobile environment. Perceived security of OTP was set as a higher-order construct of integrity, confidentiality, authentication, and non-repudiation. The findings show that security awareness and perceived security of OTP are positively associated, and that the relationship between perceived security and trust in OTP is statistically significant. Trust, in turn, is positively related to the intention to use OTP continuously.
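The abstract does not specify which OTP construction the studied services use; as a point of reference, the standard counter-based HOTP construction of RFC 4226, which underlies many banking OTP tokens, can be sketched as follows (the secret key and counter values below are the RFC's own test values, not data from the paper):

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based one-time password (RFC 4226 construction)."""
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(hotp(b"12345678901234567890", 0))  # RFC 4226 test vector -> 755224
```

A time-based variant (TOTP) simply derives the counter from the current time, which is how most modern OTP apps work.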

How can we narrow the digital divide among SMEs in APEC member economies? (중소기업 정보화 수준 격차 해소방안에 관한 국가 간 비교연구)

  • Kwon, Sun-Dong;Yang, Hee-Dong;Sohn, Yong-Yeop;Lee, Seong-Bong;Sirh, Jin-Young;Cho, Taek-Hee
    • Journal of Information Technology Applications and Management
    • /
    • v.12 no.2
    • /
    • pp.79-106
    • /
    • 2005
  • This study, adopting a case study methodology, examines the present state of the digital divide among SMEs, analyzes its causes, and suggests policies for bridging it. We visited and interviewed 18 manufacturing SMEs in 10 APEC member economies that show sharp differences in ICT usage. To analyze the digital gap among SMEs, we used five variables (computer hardware, computer software, Internet, readiness of ICT, and performance of ICT adoption) and categorized the cases into low and high tiers based on the national ICT index. From a computer hardware perspective, the high tier (0.66) has almost double the number of PCs per employee compared with the low tier (0.34). This gap can be explained by limited financial resources and high tariffs in the developing economies. From a computer software perspective, SMEs in the low tier made restricted use of applications such as financial and accounting management and document management, while those in the high tier used a more diverse set of applications, including inventory management, sales management, financial and accounting management, procurement management, CRM, and ERP. In terms of ICT readiness, the difference in ICT infrastructure and financial status between the low and high tiers was far wider than for any other variable. As a result of ICT adoption, SMEs benefited in terms of learning and growth, internal business processes, customer service, and financial affairs. To effectively bridge the digital divide between the tiers, actions should be taken such as setting up a secondary market for used computers between cooperating developed and developing countries, developing and diffusing good business applications, and building fast, low-cost telecommunication infrastructures.


A Study on Wellbeing Support System for the Elderly using AI (고령자를 위한 AI 기반의 Wellbeing 지원 시스템의 연구)

  • Cho, Myeon-Gyun
    • Journal of Convergence for Information Technology
    • /
    • v.11 no.2
    • /
    • pp.16-24
    • /
    • 2021
  • This paper introduces a smart aging service that actively utilizes IoT and AI technologies to help the elderly, whose numbers are increasing rapidly as society ages, lead a happy old age. In particular, we propose a future-oriented, age-friendly wellbeing support system that breaks away from the existing welfare-centered approach to the aging problem and leads a paradigm shift toward building a vibrant aging society by protecting the elderly in emergencies and satisfying their emotional needs. Using IoT and AI, the system infers the life situation and emotional state of the elderly from their daily-living information, responds to emergencies, and suggests meetings as a change of mood to provide emotional comfort. Since the proposed system uses artificial intelligence techniques to determine the degree of depression from inputs such as pulse rate, dangerous-word usage, and external communication, we believe it demonstrates the feasibility of a new concept of wellbeing support system, quite different from the conventional health-care-centered notion of wellbeing.
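The abstract names pulse rate, dangerous-word usage, and external communication as inputs to the depression assessment but does not disclose the model itself; a minimal rule-based sketch, with purely hypothetical thresholds and weights chosen only for illustration, might look like this:

```python
def depression_risk(pulse_bpm: float, danger_word_freq: float,
                    daily_contacts: int) -> str:
    """Toy risk score from the three input signals named in the abstract.
    All thresholds and weights are hypothetical illustrations, not the
    paper's actual AI model."""
    score = 0.0
    if pulse_bpm < 55 or pulse_bpm > 100:      # abnormal resting pulse
        score += 1.0
    score += min(danger_word_freq * 10, 2.0)   # frequent "dangerous" words
    if daily_contacts < 1:                     # social-isolation signal
        score += 1.0
    if score >= 3.0:
        return "high"
    return "elevated" if score >= 1.5 else "low"
```

A real system would replace these rules with a trained classifier, but the input/output shape would be similar.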

Analysis and Evaluation of Frequent Pattern Mining Technique based on Landmark Window (랜드마크 윈도우 기반의 빈발 패턴 마이닝 기법의 분석 및 성능평가)

  • Pyun, Gwangbum;Yun, Unil
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.101-107
    • /
    • 2014
  • With the development of online services, databases have shifted from static structures to dynamic stream structures. Previous data mining techniques have served as decision-making tools for tasks such as establishing marketing strategies and DNA analysis. However, the ability to analyze real-time data quickly is needed in emerging areas such as sensor networks, robotics, and artificial intelligence. Landmark window-based frequent pattern mining, one of the stream mining approaches, performs mining over parts of the database or over each transaction, instead of over all the data at once. In this paper, we analyze and evaluate two well-known landmark window-based frequent pattern mining algorithms, Lossy Counting and hMiner. When Lossy Counting mines frequent patterns from a set of new transactions, it performs union operations between the previous and current mining results. hMiner, a state-of-the-art algorithm based on the landmark window model, conducts mining operations whenever a new transaction occurs. Since hMiner extracts frequent patterns as soon as a new transaction arrives, it yields the latest mining results reflecting real-time information; for this reason, such algorithms are also called online mining approaches. We evaluate and compare the performance of the primitive algorithm, Lossy Counting, and the latest one, hMiner. As performance criteria, we first consider total runtime and average processing time per transaction. In addition, to compare the efficiency of their storage structures, maximum memory usage is also evaluated. Lastly, we show how stably the two algorithms mine databases featuring gradually increasing numbers of items.
In terms of mining time and transaction processing, hMiner is faster than Lossy Counting. Since hMiner stores candidate frequent patterns in a hash structure, it can access them directly, while Lossy Counting stores them in a lattice and must traverse multiple nodes to reach a candidate pattern. On the other hand, hMiner performs worse than Lossy Counting in terms of maximum memory usage: hMiner must keep complete information for each candidate pattern in its hash buckets, whereas Lossy Counting reduces this information through the lattice, whose storage can share items concurrently included in multiple patterns. However, hMiner shows better scalability for the following reasons: as the number of items increases, fewer items are shared, weakening Lossy Counting's memory efficiency, and as the number of transactions grows, its pruning effect worsens. From the experimental results, we conclude that landmark window-based frequent pattern mining algorithms are suitable for real-time systems, although they require a significant amount of memory. Hence, their data structures need to be made more efficient before they can also be used in resource-constrained environments such as WSNs (wireless sensor networks).
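For reference, the Lossy Counting bookkeeping discussed above, which prunes low-count entries at each bucket boundary while guaranteeing an error bound ε, can be sketched for single items; the paper's algorithms mine itemsets, so this is a deliberately simplified illustration:

```python
import math

class LossyCounting:
    """Single-item Lossy Counting over a stream (simplified sketch;
    the paper's algorithms mine frequent itemsets, not single items)."""

    def __init__(self, epsilon: float):
        self.epsilon = epsilon
        self.width = math.ceil(1 / epsilon)   # bucket width w = ceil(1/eps)
        self.n = 0                            # items seen so far
        self.counts = {}                      # item -> (count, delta)

    def add(self, item):
        self.n += 1
        bucket = math.ceil(self.n / self.width)
        count, delta = self.counts.get(item, (0, bucket - 1))
        self.counts[item] = (count + 1, delta)
        if self.n % self.width == 0:          # bucket boundary: prune
            self.counts = {i: (c, d) for i, (c, d) in self.counts.items()
                           if c + d > bucket}

    def frequent(self, support: float):
        """Items whose true frequency may exceed support * n."""
        thresh = (support - self.epsilon) * self.n
        return [i for i, (c, _) in self.counts.items() if c >= thresh]
```

The pruning step is what bounds memory: any item whose stored count plus its maximum undercount (delta) falls below the current bucket number cannot be frequent and is dropped.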

A Study on the Usage of Investigation of Google Cloud Data (Smartphone user-oriented) (구글 클라우드 데이터의 수사활용 방안에 관한 연구 (스마트폰 사용자 중심))

  • Kim, Dongho;Lee, Sangjin
    • Journal of Digital Forensics
    • /
    • v.12 no.3
    • /
    • pp.109-120
    • /
    • 2018
  • The smartphone is the communication device most personal to its user; it holds a great deal of user-related information and exchanges information with other devices. Given these characteristics, smartphone forensics is one of the most basic investigative methods in criminal investigations and has contributed to the settlement of many cases by providing clues. Recently, however, as protection of users' personal information has become a social issue, smartphones are designed to encrypt stored data, to securely erase deleted data, and to remove log data as well. What, then, are the alternatives? In this paper, I look for the answer in the cloud data stored under the smartphone user's account. Cloud forensics should be approached as complementary to smartphone forensics rather than as a replacement. A great deal of data stored in the cloud can be used meaningfully in an investigation: users' online activity information, such as Internet usage, YouTube views, and content purchase information, along with cloud services such as e-mail, cloud drives, and location information, are the most representative examples. These data may seem of little value on their own, but they can provide important clues in various types of criminal investigations. In this paper, I propose a method to extract data from the Google cloud so that it can be used for investigation, and show how to utilize the extracted data. The role of the extracted artifacts in actual investigative work is explained through hypothetical cases, demonstrating their value.

A Study of Digitalization Performance of Sinological Resource in Korea (고문헌의 디지털화 성과 연구)

  • Cho Hyung-Jin
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.40 no.3
    • /
    • pp.391-413
    • /
    • 2006
  • This study analyzed the procedures and contents of the digitalization of sinological resources owned by major sinological resource institutes in Korea. It investigated the united organizations that use such resources, assessed governmental policies and future plans for digitalization, and proposed the steps and conditions necessary for successful digitalization. (1) The digitalization of library management, searching, and usage systems in national, university, and research libraries, applied since the 1980s, is already highly advanced. The amount of sinological resources collected is significant, their substantive value is very high, and some of the digitalized resources are already distributed on the Internet. However, the digitalization of sinological resources still lacks some aspects and requires further effort. (2) The databases of digitalized sinological resources already available can be grouped into bibliographic information, contents and annotation, and full text, and they include both domestic and foreign resources; the quantities are as described in the body. (3) The types of digital sinological resources include ancient books, archives, microforms, and printing blocks. (4) The encoding methods used for the databases include text, image, and PDF. (5) The united organizations of sinological resources make it possible to avoid duplicated work and enhance service efficiency. Several factors should be considered in order to accomplish ideal digitalization of sinological resources. (1) First of all, it is necessary to organize a control center for the digitalization of old materials and to give it a certain degree of authority to develop and carry out a comprehensive plan. (2) Both short- and long-term plans need to be developed to address the various aspects of the digitalization process, and their steps need to be taken gradually. (3) It is necessary to train experts in old materials and have them construct and manage the databases.

Design and Implementation of the Spatio-Temporal DSMS for Moving Object Data Streams (이동체 데이타 스트림을 위한 시공간 DSMS의 설계 및 구현)

  • Lee, Ki-Young;Kim, Joung-Joon
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.8 no.5
    • /
    • pp.159-166
    • /
    • 2008
  • Recently, with the rapid development of location positioning and wireless communication technologies and the increasing use of moving object data, much research and development on real-time locating systems, which provide real-time services over moving object data streams, is under way. However, the MO (Moving Object) DBMS used in these systems manages moving object data streams inefficiently, and the existing DSMS (Data Stream Management System) does not handle spatio-temporal data efficiently. Therefore, in this paper, we designed and implemented a spatio-temporal DSMS for efficient real-time management of moving object data streams. The implementation is based on STREAM (STanford stREam dAta Manager) of Stanford University and supports real-time management of moving object data streams as well as spatio-temporal query processing and filtering to reduce the input load. Specifically, the spatio-temporal operators of the DSMS provide a standard SQL-style interface that extends the "Simple Feature Specification for SQL" standard presented by OGC for compatibility. Finally, we applied the implemented spatio-temporal DSMS to real-time monitoring applications that require real-time locating of moving objects, demonstrating the effectiveness of the system.
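The filtering step described above, which discards stream tuples outside the query region or time window before they reach the query processor, can be sketched as follows (a minimal illustration with invented names, not the paper's actual STREAM-based operators):

```python
from dataclasses import dataclass

@dataclass
class Position:
    """One tuple of a moving-object data stream (names are illustrative)."""
    obj_id: str
    x: float
    y: float
    t: float  # timestamp in seconds

def range_filter(stream, xmin, ymin, xmax, ymax, t_start, t_end):
    """Spatio-temporal range filter: pass only positions inside the
    query rectangle and time window, reducing the DSMS input load."""
    for p in stream:
        if (xmin <= p.x <= xmax and ymin <= p.y <= ymax
                and t_start <= p.t <= t_end):
            yield p
```

Because the filter runs before the (more expensive) continuous-query operators, tuples that can never satisfy the query are dropped as early as possible.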


The Role of Control Transparency and Outcome Feedback on Security Protection in Online Banking (계좌 이용 과정과 결과의 투명성이 온라인 뱅킹 이용자의 보안 인식에 미치는 영향)

  • Lee, Un-Kon;Choi, Ji Eun;Lee, Ho Geun
    • Information Systems Review
    • /
    • v.14 no.3
    • /
    • pp.75-97
    • /
    • 2012
  • Fostering trusting belief in financial transactions is a challenging task in Internet banking services. Authenticated certificates have been regarded as an effective method for guaranteeing trusting belief in online transactions. However, previous research claimed that this method has loopholes exploitable by abusers such as hackers, who intend to attack the financial accounts of innocent transactors on the Internet. Two types of methods have been suggested as alternatives for securing user identification and activity in online financial services. Control transparency uses information about the transaction process to verify and control transactions. Outcome feedback, which refers to specific information about exchange outcomes, provides information about final transaction results. Using these two methods, financial service providers can signal to the involved parties the robustness of their security mechanisms. Both control transparency and outcome feedback have been widely used in the IS field to enhance the quality of IS services. In this research, we intend to verify that they can also be used to reduce risks and increase security protection in online banking services. The purpose of this paper is to empirically test the effects of control transparency and outcome feedback on risk perceptions in Internet banking services. Our assumption is that these two methods can reduce the perceived risks of online financial transactions while increasing perceived trust in financial service providers. These changes in user attitudes can increase user satisfaction, which may lead to increased user loyalty as well as users' willingness to pay for the financial transactions.
Previous IS research suggested that increased transparency about the process and result of transactions can enhance the information quality and decision quality of IS users. Transparency helps IS users acquire the information needed to control the transaction counterpart and thus to complete the transaction successfully. It is also argued that transparency can reduce perceived transaction risks in IS usage. Many IS researchers have further argued that trust can be generated by institutional mechanisms. Trusting belief refers to the truster's belief that the trustee has attributes beneficial to the truster. Institution-based trust plays an important role in enhancing the probability of a successful outcome. When a transactor regards certain conditions as crucial for transaction success, he or she considers the providers of those conditions trustworthy and thus eventually trusts the others involved with them. In this process, transparency helps the transactor complete the transaction successfully. Based on these studies, we expect that control transparency and outcome feedback can reduce the perceived risk of a transaction and enhance trust in the service provider. Building on a theoretical framework of transparency and institution-based trust, we propose a research model and test its hypotheses. We conducted a laboratory experiment to validate the model: since the transparency artifacts (control transparency and outcome feedback) are not yet adopted in online banking services, a general survey could not be employed. We collected data from 138 experiment subjects who had experience with online banking services. PLS was used to analyze the experiment data. The measurement model confirms that our data set has appropriate convergent and discriminant validity.
The results of testing the structural model indicate that control transparency significantly enhances trust and significantly reduces the risk perception of online banking users. The results also suggest that outcome feedback significantly enhances user trust. We found that the reduced risk and the increased trust significantly improve service satisfaction, and the increased satisfaction finally leads to increased loyalty and willingness to pay for the financial services.


The Efficiency Analysis of CRM System in the Hotel Industry Using DEA (DEA를 이용한 호텔 관광 서비스 업계의 CRM 도입 효율성 분석)

  • Kim, Tai-Young;Seol, Kyung-Jin;Kwak, Young-Dai
    • Journal of Intelligence and Information Systems
    • /
    • v.17 no.1
    • /
    • pp.91-110
    • /
    • 2011
  • This paper analyzes cases in which hotels have improved their services and work processes through IT solutions to cope with computerization and globalization, and cases in which domestic hotels use CRM solutions internally to respond effectively to customer requests, strengthen customer analysis, and build marketing strategies. In particular, this study discusses the introduction of CRM solutions and evaluates the efficiency of CRM use in sales and marketing services by applying DEA (Data Envelopment Analysis). First, the relative efficiency of L Company's sites was compared using the CCR model; then the effectiveness of L Company's restaurants and facilities was compared using the BCC model. L Company concluded that it is important to precisely create and manage sales data, the raw data for CRM, and for that reason made it possible to save sales data generated by the POS system to a database for each sales performance. To do so, it newly established an Oracle POS system and a LORIS POS system covering rooms as well as food-and-beverage restaurants, enabling the stable generation and management of sales data. Moreover, it set up a composite database to comprehensively control the results of work processes during a specific period by collecting customer registration information, and made it possible to systematically control information on sales performance. By establishing a unified and comprehensively managed database, data integrity was greatly enhanced and the problem of asymmetric data was thoroughly solved. Using the data accumulated in the comprehensive database, sales data can be analyzed, categorized, and classified through the data mining engine embedded in Polaris CRM, and the results can be organized in a data mart and provided in the form of CRM application data.
By transforming the original sales data into easy-to-handle forms and saving them separately in the data mart, well-organized data could easily be obtained for various marketing operations, morning meetings, and decision-making. Using the summarized data in the data mart, marketing operations such as telemarketing, direct mailing, Internet marketing, and service product development for identified customers could be carried out; moreover, the resulting information on customer perceptions, one of CRM's end products, could be fed back into the comprehensive database. This research was undertaken to find out how effectively CRM has been employed by comparing and analyzing the management performance of each enterprise site and store after introducing CRM to hotel enterprises, using the DEA technique. According to the results, the efficiency of each site was calculated from input and output factors to determine the comparative CRM system usage efficiency of L Company's four sites; with regard to stores, the sizes of workforce and budget differ greatly, and so does each store's efficiency. Furthermore, by applying the CCR model to each site, the DEA technique could assess which sites show comparatively high efficiency in IT project outcomes such as CRM introduction. By applying the BCC model, the outcome of CRM usage at each store of site A, which is representative of L Company, could be evaluated comparatively to determine which stores use CRM with high efficiency and which do not. In sum, the cases of CRM introduction at L Company, a hotel enterprise, were analyzed and precisely evaluated through DEA.
By introducing CRM, L Company analyzed its customer base and was able to provide one-to-one tailored services to customers identified through customer analysis data. Moreover, it could devise a plan to differentiate services for returning customers by assessing the customer identification rate. As future work, research is required on process analysis that can lead to concrete outcomes, such as increased sales volume, through test marketing and target marketing using CRM. It is also necessary to study efficiency evaluation in connection with linkages between CRM and other IT solutions such as ERP.
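The CCR model used above measures efficiency under constant returns to scale; in the single-input, single-output case it reduces to comparing each DMU's output/input ratio against the best ratio in the set, which can be sketched without a linear-programming solver (the general multi-input, multi-output CCR and BCC models require LP; the data below are made up for illustration):

```python
def ccr_efficiency(inputs, outputs):
    """CCR (constant returns to scale) efficiency scores for the
    single-input, single-output case: each decision-making unit's
    output/input ratio relative to the best ratio in the set."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical hotel sites: input = staff cost, output = sales
print(ccr_efficiency([2, 4, 5], [4, 6, 5]))  # -> [1.0, 0.75, 0.5]
```

A score of 1.0 marks a site on the efficient frontier; the BCC model would additionally allow variable returns to scale, so small and large sites are compared more fairly.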

A Scalable Dynamic QoS Support Protocol (확장성 있는 동적 QoS 지원 프로토콜)

  • 문새롬;이미정
    • Journal of KIISE:Information Networking
    • /
    • v.29 no.6
    • /
    • pp.722-737
    • /
    • 2002
  • As the number of multimedia applications increases, various protocols and architectures have been proposed to provide QoS (Quality of Service) guarantees in the Internet. Most of these techniques, though, bear an inherent contradiction between scalability and the capability of providing QoS guarantees. In this paper, we propose a protocol named DQSP (Dynamic QoS Support Protocol), which provides dynamic resource allocation and admission control for QoS guarantees in a scalable way. In DQSP, the core routers maintain only per source-edge-router resource allocation state. Each source-edge router maintains usage information for the resources allocated to it on each network link. Based on this information, the source-edge routers perform admission control for incoming flows. For resource reservation and admission control, DQSP incurs no per-flow signaling in the core network, and the amount of state at the core routers depends on the scale of the topology rather than on the number of user flows. Simulation results show that DQSP achieves efficient resource utilization without incurring the user-flow-related scalability problems of IntServ, one of the representative architectures providing end-to-end QoS guarantees.
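The admission-control idea described above, in which each source-edge router admits flows locally against its own per-link allocation with no per-flow signaling to the core, can be sketched as follows (class and field names are illustrative, not taken from the paper):

```python
class EdgeRouter:
    """Sketch of DQSP-style edge admission control: the core allocates
    per-edge-router bandwidth on each link, and the edge router tracks
    its own usage and admits flows locally."""

    def __init__(self, allocation):
        self.allocation = dict(allocation)            # link -> allocated bw
        self.used = {link: 0.0 for link in allocation}

    def admit(self, path, rate):
        """Admit a flow only if its rate fits within this edge router's
        remaining allocation on every link of the path."""
        if all(self.used[l] + rate <= self.allocation[l] for l in path):
            for l in path:
                self.used[l] += rate
            return True
        return False
```

Because the core sees only per-edge-router allocations, core state grows with the topology, not with the number of flows, which is the scalability property the abstract emphasizes.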