• Title/Summary/Keyword: Information value approach


The Strategy Development of the Restaurant Industry through the AHP Analysis: Focusing on the Digital Conversion and Non-Contact Service after COVID-19 Pandemic (AHP 기법을 활용한 외식산업의 발전 전략에 관한 연구: COVID-19 팬더믹 이후의 디지털, 비대면 전환에 대한 인식)

  • Lee, Bong-Shik;Park, Min-Jae
    • Asia-Pacific Journal of Business / v.12 no.4 / pp.271-288 / 2021
  • Purpose - The purpose of this study is to identify the critical factors that restaurant management should consider strategically when making decisions after COVID-19, in an environment of digital transformation and expanding non-contact services. Design/methodology/approach - Thirty-six experts and managers with five or more years of experience in the Korean restaurant industry participated in this study. The Analytic Hierarchy Process (AHP) and SER-M were used to analyze the experts' opinions. Findings - As a result of the analysis, 'management environment' (0.313) showed the highest relative importance, followed by 'brand' (0.263), 'management characteristics' (0.254), and 'physical factors' (0.170). As for the sub-factors, 'lifestyle' (0.087), 'awareness' (0.084), 'consumer desire' (0.075), and 'loyalty' (0.068) ranked highest among the 19 critical influencing factors. Research implications or Originality - In the case of large restaurant enterprises, the subject (CEO, management) appears to be actively pursuing a strategy to acquire the resources required by the given environment of digital transformation and customer demand for non-contact services. In the case of small and medium-sized restaurant enterprises, by contrast, they seem fully aware of the demand for expanded non-contact services and the digital transformation required in the post-COVID-19 era, but capabilities such as information technology utilization, usage experience, technology acceptance, and the supporting education and training remain largely confined to large enterprises.
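The relative-importance values quoted above come from AHP priority weights. Below is a minimal sketch of how such weights are derived from a pairwise comparison matrix, assuming the standard principal-eigenvector method; the comparison values are hypothetical, not taken from the study.

```python
# AHP priority weights from a pairwise comparison matrix (illustrative values).
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Return AHP priority weights via the principal eigenvector."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    return principal / principal.sum()  # normalize so the weights sum to 1

def consistency_ratio(pairwise: np.ndarray) -> float:
    """Saaty's consistency ratio; CR < 0.1 is conventionally acceptable."""
    n = pairwise.shape[0]
    lam_max = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # random index for small n
    return ci / ri

# Hypothetical 4x4 comparisons among four top-level factors
A = np.array([[1,   2,   2,   3],
              [1/2, 1,   1,   2],
              [1/2, 1,   1,   2],
              [1/3, 1/2, 1/2, 1]])
print(ahp_weights(A), consistency_ratio(A))
```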

Factors Influencing the Success of Mobile Payment in Developing Countries: A Comparative Analysis of Nigeria and Kenya Mobile Payment Users

  • Bitrus, Stephen-Aruwan;Lee, Chol-Ho;Rho, Jae-Jeung;Erdenebold, Tumennast
    • Asia-Pacific Journal of Business / v.12 no.3 / pp.1-36 / 2021
  • Purpose - This empirical study aims to identify the determinants of the adoption and acceptance of mobile payment, in order to understand why it succeeds in some Sub-Saharan African countries but fails in others. A comparative study of a successful mobile payment service and a purportedly failed one was conducted to gain insight into the factors affecting acceptance of the technology. Design/methodology/approach - The strengths of three notable theories were drawn on: the diffusion of innovation theory (DOI), the extended unified theory of acceptance and use of technology (UTAUT2), and self-efficacy theory. The inclusion of government support as a moderating variable, in the form of infrastructure, transaction security, and price value, revealed the relevance of government to the success of mobile payment services. By means of a field survey of 705 subjects in two separate regions of Africa (East and West), data were collected and used to test the research model. Findings - The results show the importance of the moderating factor of government support to the success of mobile payment in any nation. The results also confirm the importance of the perceptions of relative advantage, compatibility, complexity, and social influence, as already revealed by other studies. Research implications or Originality - Mobile payment has succeeded in some parts of Sub-Saharan Africa but is reported to have failed in others, underscoring the need to understand the factors affecting mobile payment acceptance. This article empirically examined the factors influencing the success of mobile payment, and the implication is that if the implementation of mobile payment for mobile commerce is to succeed in any nation, adoption, acceptance, and use by its citizens are imperative.
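The moderating role of government support in a research model like the one above is commonly tested with an interaction term in a regression. The sketch below simulates survey-style data and fits such a model; the variable names and effect sizes are hypothetical, and the study itself used a fuller structural model.

```python
# Testing a moderating variable via an interaction term (illustrative data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 705  # matches the survey size reported above
df = pd.DataFrame({
    "relative_advantage": rng.normal(size=n),
    "gov_support": rng.normal(size=n),
})
# Simulated intention: the effect of relative advantage is strengthened by
# government support (a positive interaction, i.e. moderation).
df["use_intention"] = (0.4 * df["relative_advantage"]
                       + 0.3 * df["gov_support"]
                       + 0.2 * df["relative_advantage"] * df["gov_support"]
                       + rng.normal(scale=0.5, size=n))

# 'a * b' expands to a + b + a:b, so the a:b coefficient directly
# estimates the moderation effect.
model = smf.ols("use_intention ~ relative_advantage * gov_support", data=df).fit()
print(model.summary().tables[1])
```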

Radioactivity of biological samples of patients treated with 90Y-DOTATOC

  • Marija Z. Jeremic;Milovan D. Matovic;Nenad R. Mijatovic;Suzana B. Pantovic;Dragana Z. Krstic;Tatjana B. Miladinovic;Dragoslav R. Nikezic
    • Nuclear Engineering and Technology / v.55 no.10 / pp.3815-3821 / 2023
  • Dosimetric studies in nuclear medicine are very important, especially for new therapeutic methods, whose number has increased significantly with the theranostic approach (determining diagnostic-therapeutic pairs in which similar molecules are labelled with different isotopes in order to diagnose and treat malignant diseases). Peptide receptor radionuclide therapy (PRRT) has been used successfully for many years to treat neuroendocrine tumors (NET), and 90Y-DOTATOC is one of the radiopharmaceuticals used frequently in this type of therapy. In this work, blood and urine samples from 13 patients treated with 90Y-DOTATOC were measured with a liquid scintillation beta counter (LSC). The beta counter was calibrated for this type of measurement, and all results are presented in the paper, together with a methodology for determining the measurement uncertainty. Immediately after the administration of the radiopharmaceutical, the activity in the blood ranged from 6.31% to 88.9% of the applied radioactivity, while 3 h after the end of administration the average value in the blood was only 3.84%. The activity in the excreted urine depended on when the patients urinated after the therapy: as much as 58% of the applied radioactivity was excreted in the first urine of a patient who urinated 4.5 h after the completed administration. In most patients, urine activity was highest in the first 10 h after administration, while activities after that time were negligibly low. The described methodology of measuring and evaluating activity in blood and excreted urine can be applied to other radiopharmaceuticals used in nuclear medicine and could be useful for dosimetric assessments in the clinical application of PRRT.
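Reporting blood activity as a percentage of the administered activity involves scaling a calibrated LSC sample measurement to the blood volume and correcting for physical decay. The sketch below illustrates that arithmetic under assumed, illustrative numbers (calibration factor, blood volume, administered activity); it is not the paper's exact procedure.

```python
# Percent of administered 90Y activity from an LSC blood-sample measurement.
import math

Y90_HALF_LIFE_H = 64.1  # physical half-life of 90Y in hours

def percent_of_administered(count_rate_cps: float,
                            calib_factor_bq_per_cps: float,
                            sample_volume_ml: float,
                            blood_volume_ml: float,
                            hours_since_admin: float,
                            administered_bq: float) -> float:
    """Estimate whole-blood activity as a % of the administered activity."""
    sample_bq = count_rate_cps * calib_factor_bq_per_cps
    # Scale the sample activity to the total blood volume.
    blood_bq = sample_bq * blood_volume_ml / sample_volume_ml
    # Correct for physical decay back to the administration time.
    decay = math.exp(-math.log(2) * hours_since_admin / Y90_HALF_LIFE_H)
    return 100.0 * (blood_bq / decay) / administered_bq

# Illustrative numbers only: 1200 cps, 1.8 Bq/cps, 2 mL sample, 5 L blood,
# measured 3 h after administering 3.7 GBq.
print(percent_of_administered(1200, 1.8, 2.0, 5000.0, 3.0, 3.7e9))
```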

Prospects & Issues of NFT Art Contents in Blockchain Technology (블록체인 NFT 문화예술콘텐츠의 현황과 과제)

  • Jong-Guk Kim
    • Journal of Information Technology Applications and Management / v.30 no.1 / pp.115-126 / 2023
  • In various fields such as art, design, music, film, sports, games, and fashion, NFTs (Non-Fungible Tokens) are creating new economic value through trading platforms dedicated to NFT art and content. In this article, I analyze the current state of blockchain technology and NFT art content in the context of an expanding market for blockchain-based NFT art content in the metaverse, and propose several tasks based on the economic and industrial logic of technological innovation. The first task is to integrate the cultural arts on blockchain, metaverse, and NFT platforms through digital innovation, instead of separating or distinguishing between creative production and consumption. Before the COVID-19 pandemic, there was a clear separation between creators and consumers; with the rise of Web 3.0 platforms, however, any user can now create and own their own content. It is therefore important to promote a collaborative and integrated approach to cultural arts production and consumption in the blockchain and metaverse ecosystem. The second task is to align the legal framework with blockchain-based technological innovation. The enactment and revision of relevant laws should focus on promoting the development of the NFT trading platform ecosystem, rather than merely regulating it for user protection. As blockchain-based technology continues to evolve, legal systems must adapt to support and promote innovation in the space; this shift in focus can help create a more conducive environment for the growth of blockchain-based NFT platforms. The third task is to integrate education on the digital arts, including metaverse and NFT art content, into the current curriculum. This education should focus on convergence and consilience, rather than merely mixing together the humanities, technology, and the arts. By integrating digital arts education into the curriculum, students can gain a more comprehensive understanding of the potential of blockchain-based technologies and NFT art. This article examines digital technological innovations such as blockchain, the metaverse, and NFT from an economic and industrial point of view; as a limitation of this research, critical perspectives on technological innovation, such as philosophical reflection or social criticism, are left as future tasks.

CNN-ViT Hybrid Aesthetic Evaluation Model Based on Quantification of Cognitive Features in Images (이미지의 인지적 특징 정량화를 통한 CNN-ViT 하이브리드 미학 평가 모델)

  • Soo-Eun Kim;Joon-Shik Lim
    • Journal of IKEEE / v.28 no.3 / pp.352-359 / 2024
  • This paper proposes a CNN-ViT hybrid model that automatically evaluates the aesthetic quality of images by combining local and global features. In this approach, a CNN is used to extract local features such as color and object placement, while a ViT is employed to analyze the aesthetic value of the image by reflecting global features. Color composition is derived by extracting the primary colors from the input image, creating a color palette, and passing it through the CNN. The Rule of Thirds is quantified by calculating how closely objects in the image are positioned to the thirds intersection points. These values provide the model with critical information about the color balance and spatial harmony of the image. The model then analyzes the relationship between these factors to predict scores that align closely with human judgment. Experimental results on the AADB image database show that the proposed model achieved a Spearman rank correlation coefficient (SRCC) of 0.716, indicating more consistent rank predictions, and a Pearson linear correlation coefficient (LCC) of 0.72, which is 2-4% higher than existing models.
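One plausible way to quantify the Rule of Thirds as described above is to score each object center by its distance to the nearest thirds intersection. The sketch below assumes object centers have already been detected; the normalization and exact scoring used in the paper may differ.

```python
# Rule-of-Thirds score: proximity of object centers to thirds intersections.
import math

def rule_of_thirds_score(centers, width, height):
    """centers: list of (x, y) object centers in pixels; returns a [0, 1] score."""
    intersections = [(width * i / 3, height * j / 3)
                     for i in (1, 2) for j in (1, 2)]
    diagonal = math.hypot(width, height)
    scores = []
    for cx, cy in centers:
        d = min(math.hypot(cx - ix, cy - iy) for ix, iy in intersections)
        scores.append(1.0 - d / diagonal)  # closer to an intersection -> higher
    return sum(scores) / len(scores) if scores else 0.0

# Example: two objects in a 1200x800 image
print(rule_of_thirds_score([(400, 270), (800, 530)], 1200, 800))
```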

A Folksonomy Ranking Framework: A Semantic Graph-based Approach (폭소노미 사이트를 위한 랭킹 프레임워크 설계: 시맨틱 그래프기반 접근)

  • Park, Hyun-Jung;Rho, Sang-Kyu
    • Asia Pacific Journal of Information Systems / v.21 no.2 / pp.89-116 / 2011
  • In collaborative tagging systems such as Delicious.com and Flickr.com, users assign keywords or tags to their uploaded resources, such as bookmarks and pictures, for future use or sharing. The collection of resources and tags generated by a user is called a personomy, and the collection of all personomies constitutes the folksonomy. The most significant need of folksonomy users is to efficiently find useful resources or experts on specific topics. An excellent ranking algorithm would assign higher rank to more useful resources or experts. What resources are considered useful in a folksonomic system? Does a standard superior to frequency or freshness exist? A resource recommended by more users with more expertise should be worthy of attention. This ranking paradigm can be implemented through a graph-based ranking algorithm. Two well-known representatives of such a paradigm are PageRank by Google and HITS (Hypertext Induced Topic Selection) by Kleinberg. Both PageRank and HITS assign a higher evaluation score to pages linked to by more higher-scored pages. HITS differs from PageRank in that it utilizes two kinds of scores: authority and hub scores. The ranking objects of these algorithms are limited to Web pages, whereas the ranking objects of a folksonomic system are somewhat heterogeneous (i.e., users, resources, and tags). Therefore, uniformly applying the voting notion of PageRank and HITS to the links of a folksonomy would be unreasonable. In a folksonomic system, each link corresponding to a property can have an opposite direction, depending on whether the property is in the active or the passive voice. The current research stems from the idea that a graph-based ranking algorithm could be applied to the folksonomic system using the concept of mutual interactions between entities, rather than the voting notion of PageRank or HITS. The concept of mutual interactions, proposed for ranking Semantic Web resources, enables the calculation of importance scores of various resources unaffected by link directions. The weights of a property representing the mutual interaction between classes are assigned depending on the relative significance of the property to the resource importance of each class. This class-oriented approach is based on the fact that, in the Semantic Web, there are many heterogeneous classes; thus, applying a different appraisal standard for each class is more reasonable. This is similar to the evaluation method of humans, where different items are assigned specific weights, which are then summed up to determine the weighted average. We can check for missing properties more easily with this approach than with other predicate-oriented approaches. A user of a tagging system usually assigns more than one tag to the same resource, and there can be more than one tag with the same subjectivity and objectivity. When many users assign similar tags to the same resource, it becomes necessary to grade the users differently depending on the assignment order. This idea comes from studies in psychology wherein expertise involves the ability to select the most relevant information for achieving a goal. An expert should be someone who not only has a large collection of documents annotated with a particular tag, but also tends to add documents of high quality to his/her collections. Such documents are identified by the number, as well as the expertise, of users who have the same documents in their collections.
In other words, there is a relationship of mutual reinforcement between the expertise of a user and the quality of a document. In addition, there is a need to rank entities related more closely to a certain entity. Considering the property of social media that the popularity of a topic is temporary, recent data should have more weight than old data. We propose a comprehensive folksonomy ranking framework in which all these considerations are dealt with and which can be easily customized to each folksonomy site for ranking purposes. To examine the validity of our ranking algorithm and show the mechanism of adjusting property, time, and expertise weights, we first use a dataset designed for analyzing the effect of each ranking factor independently. We then show the ranking results of a real folksonomy site, with the ranking factors combined. Because the ground truth of a given dataset is not known when it comes to ranking, we inject simulated data whose ranking results can be predicted into the real dataset and compare the ranking results of our algorithm with those of a previous HITS-based algorithm. Our semantic ranking algorithm based on the concept of mutual interaction seems preferable to the HITS-based algorithm as a flexible folksonomy ranking framework. Some concrete points of difference are as follows. First, with the time concept applied to the property weights, our algorithm shows superior performance in lowering the scores of older data and raising the scores of newer data. Second, by applying the time concept to the expertise weights as well as to the property weights, our algorithm controls the conflicting influence of expertise weights and enhances the overall consistency of time-valued ranking. The expertise weights of the previous study can act as an obstacle to time-valued ranking because the number of followers increases as time goes on. Third, many new properties and classes can be included in our framework. The previous HITS-based algorithm, based on the voting notion, loses ground in situations where the domain consists of more than two classes, or where other important properties, such as "sent through twitter" or "registered as a friend," are added to the domain. Fourth, there is a big difference in calculation time and memory use between the two kinds of algorithms: while the multiplication of two matrices has to be executed twice for the previous HITS-based algorithm, this is unnecessary with ours. In our ranking framework, various folksonomy ranking policies can be expressed by combining the ranking factors, and our approach can work even if the folksonomy site is not implemented with Semantic Web languages. Above all, the time weight proposed in this paper will be applicable to various domains, including social media, where time value is considered important.
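The mutual-reinforcement idea between user expertise and document quality, with more weight given to recent data, can be illustrated with a small iterative computation. The sketch below is a generic illustration of that paradigm with hypothetical tagging timestamps, not the paper's exact class- and property-weighted algorithm.

```python
# Mutual reinforcement of user expertise and document quality with time decay.
import numpy as np

def mutual_reinforcement(user_doc_times, now, half_life=30.0, iters=50):
    """user_doc_times[u][d] = time user u tagged doc d (None if never)."""
    n_users = len(user_doc_times)
    n_docs = len(user_doc_times[0])
    # Time-decayed weight matrix: newer assignments count for more.
    W = np.zeros((n_users, n_docs))
    for u in range(n_users):
        for d in range(n_docs):
            t = user_doc_times[u][d]
            if t is not None:
                W[u, d] = 0.5 ** ((now - t) / half_life)
    expertise = np.ones(n_users)
    quality = np.ones(n_docs)
    for _ in range(iters):
        quality = W.T @ expertise   # good docs are held by experts
        expertise = W @ quality     # experts hold good docs
        quality /= np.linalg.norm(quality) or 1.0
        expertise /= np.linalg.norm(expertise) or 1.0
    return expertise, quality

# Hypothetical tagging times for 3 users x 3 documents
times = [[0.0, 25.0, None], [5.0, None, 28.0], [None, 27.0, 29.0]]
print(mutual_reinforcement(times, now=30.0))
```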

Science & Technology Business: The Role of International Science Business Belt in Korea (과학기술 비즈니스(S&T Business): 과학벨트(ISBB)의 역할)

  • Lee, Won Cheul;Choi, Jong-In
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship / v.11 no.4 / pp.139-148 / 2016
  • The importance of technology commercialization is being emphasized more and more, but high technological excellence does not always lead to a successful business. In addition, companies that invest heavily in basic research and technology development cannot always guarantee high profits. This is because technologies at the research and development stage often present technical problems that make them difficult to commercialize, and because the market does not always work rationally. Therefore, building alliances and cooperation with partners who can provide external funding or manpower is crucial to securing the necessary resources and successfully commercializing the results of basic science. Many previous studies have taken technology-oriented approaches (e.g., research and development, management of technology, technology innovation), but few have examined the point of view of technology commercialization. This study explores the businesses required or realized in the process of basic science research and tries to understand the imperfections of the market through the properties of technology (tacit knowledge, the objectified value of technology, information asymmetry between innovation actors, etc.). In addition, this paper focuses on a strategic approach to the role of the International Science Business Belt (ISBB) in successful science and technology business, as an appropriate countermeasure for crossing the 'Valley of Death'.


A Study on a Multiresolution Filtering Algorithm based on a Physical Model of SPECT Lesion Detectability (SPECT 이상조직 검출능 모델에 근거한 다해상도 필터링 기법 연구)

  • Kim, Jeong-Hui;Kim, Gwang-Ik
    • Journal of Biomedical Engineering Research / v.19 no.6 / pp.551-562 / 1998
  • A multiresolution filtering algorithm based on the physical SPECT lesion detectability provides an optimal solution to the SPECT reconstruction problem. Following our previous study, we estimated the SPECT lesion detection capability by m minimum detectable lesion sizes (MDLSs) and generated m reconstruction filters, each designed to maximize the smoothing effect at a fixed MDLS-dependent resolution level $\frac{\mathrm{MDLS}}{4\sqrt{2\ln 2}}$. The proposed multiresolution filtering algorithm uses a coarse-to-fine approach over the m-level resolution filter images obtained by applying these m filters to a given projection image. First, the local homogeneity is determined for every pixel of the filter images by comparing the local variance computed in a window centered at the pixel with the mode of the distribution of the local variances. Based on the local homogeneity, pixels declared homogeneous take their values from the filter image of the lowest resolution, and for the remaining pixels the same process is repeated on the higher-resolution filter images. For the pixels still non-homogeneous after this repetition ends, the pixel values of the highest-resolution filter image are substituted. In simulated experiments, the proposed multiresolution filtering algorithm showed a strong smoothing effect in the homogeneous regions and a significant resolution improvement near the edge regions of the projection images, and thus produced good adaptive effects in the reconstructed images.
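A minimal sketch of the coarse-to-fine selection step described above: each pixel takes its value from the lowest-resolution filtered image at which its neighborhood still appears homogeneous. A simple variance threshold stands in for the paper's mode-based homogeneity test, and the filter scales are illustrative, not the paper's MDLS-derived filters.

```python
# Coarse-to-fine multiresolution selection driven by local variance.
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def multiresolution_select(image, sigmas=(3.0, 2.0, 1.0), window=5, thresh=None):
    filtered = [gaussian_filter(image, s) for s in sigmas]  # coarse -> fine
    # Local variance in a window around every pixel of each filtered image.
    local_var = [uniform_filter(f**2, window) - uniform_filter(f, window)**2
                 for f in filtered]
    if thresh is None:
        thresh = np.median(local_var[0])  # crude stand-in for the mode test
    out = filtered[-1].copy()             # default: highest resolution
    assigned = np.zeros(image.shape, dtype=bool)
    for f, v in zip(filtered, local_var):
        homogeneous = (v <= thresh) & ~assigned
        out[homogeneous] = f[homogeneous]  # take the coarsest acceptable level
        assigned |= homogeneous
    return out

img = np.random.poisson(50, size=(64, 64)).astype(float)
print(multiresolution_select(img).shape)
```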


Regional Ecological Network Design for Wild Animals' Movement Using Landscape Permeability and Least-cost Path Methods in the Metropolitan Area of Korea (경관투과성 및 최소비용경로 분석을 통한 수도권 지역의 광역생태축 구축 연구)

  • Lee, Dong-Kun;Song, Won-Kyong;Jeon, Seong-Woo
    • Journal of the Korean Society of Environmental Restoration Technology / v.11 no.3 / pp.94-106 / 2008
  • As populations inhabiting natural ecosystems are fragmented by artificial barriers and habitats are destroyed by development, the possibility of species extinction increases. Under these circumstances, it is necessary to design and manage conservation areas and corridors that account for animals' movement and migration in order to sustain species diversity. 'Least-cost modeling' is one commonly employed approach, in which dispersal costs are assigned to distinct habitat types and the least-costly dispersal paths among habitat patches are calculated using a geographic information system (GIS). This study aims to design an ecological corridor using the least-cost path method and to apply it to a regional ecological network, considering the mobility of medium-to-large mammals. The study was carried out over the metropolitan area of Korea, which has been deforested by rapid urbanization; nevertheless, the region is connected with Gangwon province, the Baekdudaegan mountain range, and the DMZ, from which many forest species are considered able to migrate. We develop least-cost path models for medium-to-large mammals that inhabit this entire region. For those species, two forest areas are selected as sources of species supply, and forest areas larger than 1,000 ha are selected as focal forest areas. Movement and migration paths from the species supply sources to the focal forest areas are calculated by applying landscape permeability theory to land cover, road density, and land slope maps. The results show the least-cost paths from the species supply sources to the focal forest areas for two species: the wildcat and the roe deer differ in some least-cost paths because of their different landscape permeabilities, but the paths are generally similar. When the regional distribution of the expected movement and migration paths is considered for the regional ecological network, the low-altitude mountains of the western metropolitan area are evaluated as important areas for species connectivity. At the national or regional level, ecological connectivity is essential to promote species diversity and preserve an integrated ecosystem. This study concludes that developing least-cost models from similar empirical data could significantly improve the utility of these tools.
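Least-cost path analysis of this kind reduces to a shortest-path computation over a resistance raster. The sketch below runs Dijkstra's algorithm on a small hypothetical cost grid; a real analysis would derive the cost surface from the land cover, road density, and slope layers mentioned above.

```python
# Least-cost path over a resistance raster via Dijkstra's algorithm.
import heapq

def least_cost_path(cost, start, goal):
    """cost: 2D list of per-cell resistance; returns (total cost, path)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:  # reconstruct the path back to the source
            path, node = [], (r, c)
            while node in prev:
                path.append(node)
                node = prev[node]
            return d, [start] + path[::-1]
        if d > dist[(r, c)]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf"), []

# Hypothetical 4x4 resistance surface (higher = harder to cross)
grid = [[1, 1, 5, 5],
        [5, 1, 5, 1],
        [5, 1, 1, 1],
        [5, 5, 5, 1]]
print(least_cost_path(grid, (0, 0), (3, 3)))
```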

A Use-case based Component Mining Approach for the Modernization of Legacy Systems (레거시 시스템을 현대화하기 위한 유스케이스 기반의 컴포넌트 추출 방법)

  • Kim, Hyeon-Soo;Chae, Heung-Seok;Kim, Chul-Hong
    • Journal of KIISE: Software and Applications / v.32 no.7 / pp.601-611 / 2005
  • Owing not only to proven stability and reliability but also to significant investment and years of accumulated experience and knowledge, legacy systems have supported the core business applications of many organizations over the years. Meanwhile, the emergence of Web-based e-business environments requires externalizing core business processes to the Web, which is a competitive advantage in the new economy. Consequently, organizations now need to mine the business value buried in legacy systems for reuse in new e-business applications. In this paper we suggest a systematic approach to mining components that perform specific business services and consist of the legacy system's assets to be leveraged on the modern platform. The proposed activities are divided into several tasks. First, use cases that realize the business processes are captured. Second, a design model is constructed for each identified use case in order to integrate use cases with similar functionalities. Third, we identify component candidates from the design model and then adjust them by considering elements common among the candidates; business components are further divided into three finer-grained components for deployment onto J2EE/EJB environments. Finally, we define the interfaces of the components, which expose the components' functionalities as operations.
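The third step above, grouping use cases into component candidates, can be illustrated with a simple greedy pass that merges use cases touching overlapping sets of business entities. The use cases, entities, and merging rule below are hypothetical stand-ins for the paper's design-model-based identification.

```python
# Greedy grouping of use cases into component candidates by shared entities.
def mine_component_candidates(use_case_entities):
    """use_case_entities: dict mapping use case name -> set of business entities."""
    candidates = []  # each candidate: (set of use cases, set of entities)
    for uc, entities in use_case_entities.items():
        merged = False
        for cand in candidates:
            if cand[1] & entities:  # shares at least one business entity
                cand[0].add(uc)
                cand[1] |= entities
                merged = True
                break
        if not merged:
            candidates.append(({uc}, set(entities)))
    # Note: a single greedy pass; a fuller analysis would also merge
    # candidates that come to overlap transitively.
    return candidates

use_cases = {
    "PlaceOrder":    {"Order", "Customer"},
    "CancelOrder":   {"Order"},
    "ShipGoods":     {"Shipment", "Order"},
    "ManageProfile": {"Account", "Profile"},
}
print(mine_component_candidates(use_cases))
```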