• Title/Summary/Keyword: Park Management


Effects of firm strategies on customer acquisition of Software as a Service (SaaS) providers: A mediating and moderating role of SaaS technology maturity (SaaS 기업의 차별화 및 가격전략이 고객획득성과에 미치는 영향: SaaS 기술성숙도 수준의 매개효과 및 조절효과를 중심으로)

  • Chae, SeongWook;Park, Sungbum
    • Journal of Intelligence and Information Systems / v.20 no.3 / pp.151-171 / 2014
  • Firms today seek management effectiveness and efficiency by utilizing information technologies (IT). Numerous firms outsource specific information systems functions to cope with a shortage of information resources or IT experts, or to reduce capital costs. Recently, Software-as-a-Service (SaaS), a new type of information system, has become one of the most powerful outsourcing alternatives. SaaS is software deployed as a hosted service and accessed over the internet. It embodies the ideas of on-demand, pay-per-use, and utility computing, and is now being applied to support the core competencies of clients in areas ranging from individual productivity to vertical industries and e-commerce. In this study, therefore, we seek to quantify the value that SaaS has for business performance by examining the relationships among firm strategies, SaaS technology maturity, and the business performance of SaaS providers. We begin by drawing on prior literature on SaaS, technology maturity, and firm strategy. SaaS technology maturity is classified into three phases: application service providing (ASP), Web-native application, and Web-service application. Firm strategies are operationalized as low-cost strategy and differentiation strategy. Finally, we consider customer acquisition as the business performance measure. The specific objectives of this study are as follows. First, we examine the relationships between customer acquisition performance and both the low-cost strategy and the differentiation strategy of SaaS providers. Second, we investigate the mediating and moderating effects of SaaS technology maturity on those relationships. For this purpose, the study collects data from SaaS providers and their lines of applications registered in the database of CNK (Commerce net Korea), using a questionnaire administered by a professional research institution. The unit of analysis is the strategic business unit (SBU) within the software provider, and a total of 199 SBUs are used to test our hypotheses. To measure firm strategy, we use three items for differentiation strategy, namely application uniqueness (whether an application aims to differentiate itself within just one or a small number of target industries), supply channel diversification (whether the SaaS vendor has diversified its supply channels), and the number of specialized experts, and two items for low-cost strategy, namely subscription fee and initial set-up fee. We employ hierarchical regression analysis to test the moderating effects of SaaS technology maturity and follow Baron and Kenny's procedure to determine whether firm strategies affect customer acquisition through technology maturity. The empirical results revealed, first, that when a differentiation strategy is applied to attain business performance such as customer acquisition, the effects of the strategy are moderated by the technology maturity level of the SaaS provider. In other words, securing a higher level of SaaS technology maturity is essential for higher business performance. For instance, given that firms implement application uniqueness or distribution channel diversification as a differentiation strategy, they can acquire more customers when their level of SaaS technology maturity is higher rather than lower.
Second, the results indicate that pursuing a differentiation strategy or a low-cost strategy effectively helps SaaS providers obtain customers; that is, continuously differentiating their service from others or lowering their service fees (subscription fee or initial set-up fee) is helpful for business success in terms of customer acquisition. Lastly, the results show that the level of SaaS technology maturity mediates the relationship between low-cost strategy and customer acquisition. That is, based on our research design, customers usually perceive the real value of a low subscription fee or initial set-up fee only through the SaaS service provided by the vendor, and this, in turn, affects their decision on whether or not to subscribe.
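To make the analysis strategy concrete, the sketch below shows how the moderation test (hierarchical regression with interaction terms) and Baron and Kenny's three-step mediation procedure could be run in Python with statsmodels. The file name and the column names (diff_strategy, low_cost, maturity, acquisition) are illustrative assumptions, not the study's actual variables.

```python
# Hedged sketch of the analysis pipeline described above, not the study's code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("saas_sbu_survey.csv")  # hypothetical 199-SBU data set

# Moderation: hierarchical regression, adding the interaction terms last
m1 = smf.ols("acquisition ~ diff_strategy + low_cost", data=df).fit()
m2 = smf.ols("acquisition ~ diff_strategy + low_cost + maturity", data=df).fit()
m3 = smf.ols("acquisition ~ diff_strategy * maturity + low_cost * maturity",
             data=df).fit()
print(m2.rsquared - m1.rsquared, m3.rsquared - m2.rsquared)  # R^2 change per step

# Mediation: Baron and Kenny's three-step procedure for the low-cost strategy
step1 = smf.ols("acquisition ~ low_cost", data=df).fit()             # X -> Y
step2 = smf.ols("maturity ~ low_cost", data=df).fit()                # X -> M
step3 = smf.ols("acquisition ~ low_cost + maturity", data=df).fit()  # X, M -> Y
# Mediation is suggested when low_cost is significant in steps 1 and 2 and its
# coefficient shrinks (partial) or becomes non-significant (full) in step 3.
print(step3.params["low_cost"], step3.pvalues["low_cost"])
```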

A Study on the Application of Outlier Analysis for Fraud Detection: Focused on Transactions of Auction Exception Agricultural Products (부정 탐지를 위한 이상치 분석 활용방안 연구 : 농수산 상장예외품목 거래를 대상으로)

  • Kim, Dongsung;Kim, Kitae;Kim, Jongwoo;Park, Steve
    • Journal of Intelligence and Information Systems / v.20 no.3 / pp.93-108 / 2014
  • To support business decision making, interest in and efforts to analyze and use transaction data from different perspectives are increasing. Such efforts are not limited to customer management or marketing, but are also used for monitoring and detecting fraudulent transactions. Fraudulent transactions are evolving into various patterns by taking advantage of information technology. To keep up with this evolution, there have been many efforts to develop fraud detection methods and advanced application systems that improve the accuracy and ease of fraud detection. As a case of fraud detection, this study aims to provide effective fraud detection methods for auction exception agricultural products in the largest Korean agricultural wholesale market. The auction exception products policy exists to complement auction-based trades in the agricultural wholesale market. That is, most trades of agricultural products are performed by auction; however, specific products are designated as auction exception products when the total volume of the product is relatively small, the number of wholesalers is small, or wholesalers have difficulty purchasing the products. However, the auction exception products policy raises several problems concerning the fairness and transparency of transactions, which calls for fraud detection. In this study, to generate fraud detection rules, real large-scale agricultural trade transaction data from 2008 to 2010 in the market are analyzed, comprising more than 1 million transactions and over 1 billion US dollars in transaction volume. Agricultural transaction data have unique characteristics such as frequent changes in supply volumes and turbulent time-dependent changes in price. Since this was the first attempt to identify fraudulent transactions in this domain, there was no training data set for supervised learning, so fraud detection rules are generated using an outlier detection approach. We assume that outlier transactions are more likely to be fraudulent than normal transactions. Outlier transactions are identified by comparing the daily average unit price, weekly average unit price, and quarterly average unit price of product items. Quarterly average unit prices of product items for specific wholesalers are also used to identify outlier transactions. The reliability of the generated fraud detection rules is confirmed by domain experts. To determine whether a transaction is fraudulent or not, the normal distribution and the normalized Z-value concept are applied. That is, the unit price of a transaction is transformed into a Z-value to calculate its occurrence probability when the distribution of unit prices is approximated by a normal distribution. A modified Z-value of the unit price is used rather than the original Z-value, because in the case of auction exception agricultural products the number of wholesalers is small and Z-values are therefore influenced by the outlier fraud transactions themselves. The modified Z-values are called Self-Eliminated Z-scores because they are calculated excluding the unit price of the specific transaction that is being checked for fraud. To show the usefulness of the proposed approach, a prototype fraud transaction detection system was developed using Delphi. The system consists of five main menus and related submenus. The first functionality of the system is importing transaction databases. The next important functions are for setting up fraud detection parameters.
By changing the fraud detection parameters, system users can control the number of potential fraud transactions. Execution functions provide the fraud detection results found under those parameters, and the potential fraud transactions can be viewed on screen or exported as files. This study is an initial attempt to identify fraudulent transactions in auction exception agricultural products, and many research topics remain. First, the scope of the analyzed data was limited by data availability; it is necessary to include more data on transactions, wholesalers, and producers to detect fraudulent transactions more accurately. Next, the scope of fraud transaction detection needs to be extended to fishery products. There are also many possibilities for applying different data mining techniques to fraud detection; for example, a time series approach is a potential technique to apply to this problem. Although outlier transactions are detected based on the unit prices of transactions, it is also possible to derive fraud detection rules based on transaction volumes.
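The Self-Eliminated Z-score idea lends itself to a compact implementation: standardize each transaction's unit price against the mean and standard deviation of its comparison group computed without that transaction. The sketch below, in Python/pandas, is an assumption-laden illustration (the file name, column names, grouping keys, and the 3.0 cut-off are all hypothetical), not the Delphi system described in the abstract.

```python
# Minimal sketch of a leave-one-out ("self-eliminated") Z-score per group.
import numpy as np
import pandas as pd

def self_eliminated_z(prices: pd.Series) -> pd.Series:
    """Z-score of each unit price against its group's mean/std computed
    with that transaction's own price left out."""
    n = len(prices)
    if n < 3:                      # not enough peers to standardize against
        return pd.Series(np.nan, index=prices.index)
    s, ss = prices.sum(), (prices ** 2).sum()
    loo_mean = (s - prices) / (n - 1)                       # leave-one-out mean
    loo_var = (ss - prices ** 2) / (n - 1) - loo_mean ** 2  # leave-one-out variance
    loo_std = np.sqrt(loo_var.clip(lower=0)).replace(0, np.nan)
    return (prices - loo_mean) / loo_std

tx = pd.read_csv("auction_exception_transactions.csv")     # hypothetical file
tx["se_z"] = (tx.groupby(["item_code", "quarter"])["unit_price"]
                .transform(self_eliminated_z))
suspects = tx[tx["se_z"].abs() > 3.0]                       # tunable cut-off
```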

Pareto Ratio and Inequality Level of Knowledge Sharing in Virtual Knowledge Collaboration: Analysis of Behaviors on Wikipedia (지식 공유의 파레토 비율 및 불평등 정도와 가상 지식 협업: 위키피디아 행위 데이터 분석)

  • Park, Hyun-Jung;Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems / v.20 no.3 / pp.19-43 / 2014
  • The Pareto principle, also known as the 80-20 rule, states that roughly 80% of the effects come from 20% of the causes for many events, including natural phenomena. It has been recognized as a golden rule in business, with wide application of findings such as 20 percent of customers accounting for 80 percent of total sales. On the other hand, the Long Tail theory, which points out that "the trivial many" produce more value than "the vital few," has gained popularity in recent times thanks to the tremendous reduction of distribution and inventory costs enabled by the development of ICT (Information and Communication Technology). This study set out to illuminate how these two primary business paradigms, the Pareto principle and the Long Tail theory, relate to the success of virtual knowledge collaboration. The importance of virtual knowledge collaboration is soaring in this era of globalization and virtualization, which transcends geographical and temporal constraints. Many previous studies on knowledge sharing have focused on the factors that affect knowledge sharing, seeking to boost individual knowledge sharing and to resolve the social dilemma caused by the fact that rational individuals are likely to consume rather than contribute knowledge. Knowledge collaboration can be defined as the creation of knowledge not only by sharing knowledge, but also by transforming and integrating it. From this perspective, the relative distribution of knowledge sharing among participants can matter as much as the absolute amount of individual knowledge sharing. In particular, whether a greater contribution by the upper 20 percent of participants in knowledge sharing enhances the efficiency of overall knowledge collaboration is an issue of interest. This study deals with the effect of this distribution of knowledge sharing on the efficiency of knowledge collaboration and is extended to reflect work characteristics. All analyses were conducted on actual behavior data instead of self-reported questionnaire surveys. More specifically, we analyzed the collaborative behaviors of the editors of 2,978 English Wikipedia featured articles, which are the highest quality grade of articles in English Wikipedia. We adopted the Pareto ratio, the ratio of the number of knowledge contributions made by the upper 20 percent of participants to the total number of knowledge contributions made by all participants of an article group, to examine the effect of the Pareto principle. In addition, the Gini coefficient, which represents the inequality of income among a group of people, was applied to reveal the effect of inequality in knowledge contribution. Hypotheses were set up based on the assumption that a higher ratio of knowledge contribution by more highly motivated participants leads to higher collaboration efficiency, but that if the ratio gets too high, collaboration efficiency deteriorates because overall informational diversity is threatened and the knowledge contribution of less motivated participants is discouraged. Cox regression models were formulated for each of the focal variables, Pareto ratio and Gini coefficient, with seven control variables such as the number of editors involved in an article, the average time between successive edits of an article, and the number of sections a featured article has. The dependent variable of the Cox models is the time from article initiation to promotion to featured article status, indicating the efficiency of knowledge collaboration.
To examine whether the effects of the focal variables vary depending on the characteristics of the group task, we classified the 2,978 featured articles into two categories: academic and non-academic. Academic articles are those that refer to at least one paper published in an SCI, SSCI, A&HCI, or SCIE journal. We assumed that academic articles are more complex, entail more information processing and problem solving, and thus require more skill variety and expertise. The analysis results indicate the following. First, the Pareto ratio and the inequality of knowledge sharing relate in a curvilinear fashion to collaboration efficiency in an online community, promoting it up to an optimal point and undermining it thereafter. Second, the curvilinear effect of the Pareto ratio and the inequality of knowledge sharing on collaboration efficiency is more pronounced for more academic tasks in an online community.
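For concreteness, the two focal measures can be computed directly from per-editor contribution counts, as in the sketch below; the contribution counts shown are illustrative, not Wikipedia data. The Cox survival models themselves could then be fitted with a library such as lifelines, but that step is omitted here.

```python
# Pareto ratio and Gini coefficient of one article's contribution distribution.
import numpy as np

def pareto_ratio(contributions: np.ndarray) -> float:
    """Share of all contributions made by the top 20% of contributors."""
    sorted_desc = np.sort(contributions)[::-1]
    top_k = max(1, int(np.ceil(0.2 * len(sorted_desc))))
    return sorted_desc[:top_k].sum() / sorted_desc.sum()

def gini(contributions: np.ndarray) -> float:
    """Gini coefficient of the contribution distribution (0 = perfect equality)."""
    x = np.sort(contributions).astype(float)      # ascending
    n = len(x)
    cum_share = np.cumsum(x) / x.sum()
    return (n + 1 - 2 * cum_share.sum()) / n

edits_per_editor = np.array([120, 45, 30, 9, 7, 5, 3, 2, 1, 1])  # illustrative
print(pareto_ratio(edits_per_editor), gini(edits_per_editor))
```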

Intelligent Brand Positioning Visualization System Based on Web Search Traffic Information : Focusing on Tablet PC (웹검색 트래픽 정보를 활용한 지능형 브랜드 포지셔닝 시스템 : 태블릿 PC 사례를 중심으로)

  • Jun, Seung-Pyo;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.19 no.3 / pp.93-111 / 2013
  • As the Internet and information technology (IT) continue to develop and evolve, the issue of big data has come to the foreground of scholarly and industrial attention. Big data is generally defined as data that exceed the range that can be collected, stored, managed, and analyzed by conventional information systems, and it also refers to the new technologies designed to extract value from such data effectively. With the widespread dissemination of IT systems, continual efforts have been made in various fields of industry such as R&D, manufacturing, and finance to collect and analyze immense quantities of data in order to extract meaningful information and to use this information to solve various problems. Since IT has converged with various industries in many aspects, digital data are now being generated at a remarkably accelerating rate, while developments in state-of-the-art technology have led to continual enhancements in system performance. The types of big data that currently receive the most attention include information available within companies, such as information on consumer characteristics, purchase records, logistics, and logs indicating the usage of products and services by consumers, as well as information accumulated outside companies, such as the web search traffic of online users, social network information, and patent information. Among these various types of big data, web searches performed by online users constitute one of the most effective and important sources of information for marketing purposes, because consumers search for information on the internet in order to make efficient and rational choices. Recently, Google has provided public access to its information on the web search traffic of online users through a service named Google Trends. Research that uses this web search traffic information to analyze the information search behavior of online users is now receiving much attention in academia and industry. Studies using web search traffic information can be broadly classified into two fields. The first consists of empirical demonstrations that show how web search information can be used to forecast social phenomena, the purchasing power of consumers, the outcomes of political elections, and so on. The other field focuses on using web search traffic information to observe consumer behavior, for example by identifying the attributes of a product that consumers regard as important or by tracking changes in consumers' expectations, but relatively little research has been completed in this field. In particular, to the extent of our knowledge, hardly any brand-related studies have yet attempted to use web search traffic information to analyze the factors that influence consumers' purchasing activities. This study aims to demonstrate that consumers' web search traffic information can be used to derive the relations among brands and the relations between an individual brand and product attributes. When consumers input their search words on the web, they may use a single keyword, but they also often input multiple keywords to seek related information (referred to as simultaneous searching). A consumer performs a simultaneous search either to compare two product brands and obtain information on their similarities and differences, or to acquire more in-depth information about a specific attribute of a specific brand.
Web search traffic information shows that the quantity of simultaneous searches using certain keywords increases when the relation between them is closer in the consumer's mind, so it is possible to derive the relations between keywords by collecting this relational data and subjecting it to network analysis. Accordingly, this study proposes a method of analyzing how brands are positioned by consumers and what relationships exist between product attributes and an individual brand, using simultaneous search traffic information. It also presents case studies demonstrating the actual application of this method, with a focus on tablet PCs, which belong to an innovative product group.
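As an illustration of this idea, the sketch below builds a small keyword network in Python with networkx, using co-search volumes as edge weights; the keyword pairs and volumes are invented for illustration and are not actual Google Trends figures.

```python
# Sketch: a co-search network whose layout approximates brand positioning.
import networkx as nx

co_search_volume = {                      # illustrative, not real data
    ("iPad", "Galaxy Tab"): 54,           # brand-brand comparison searches
    ("iPad", "battery life"): 31,         # brand-attribute searches
    ("Galaxy Tab", "battery life"): 22,
    ("iPad", "display"): 40,
}

G = nx.Graph()
for (a, b), w in co_search_volume.items():
    G.add_edge(a, b, weight=w)

# Keywords searched together more often sit closer in the layout, so the map
# approximates how consumers position brands and attributes relative to each other.
positions = nx.spring_layout(G, weight="weight", seed=42)
centrality = nx.degree_centrality(G)
print(positions, centrality)
```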

Sentiment Analysis of Movie Review Using Integrated CNN-LSTM Model (CNN-LSTM 조합모델을 이용한 영화리뷰 감성분석)

  • Park, Ho-yeon;Kim, Kyoung-jae
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.141-154 / 2019
  • Internet technology and social media are growing rapidly, and data mining technology has evolved to enable unstructured document representations in a variety of applications. Sentiment analysis is an important technology for distinguishing poor from high-quality content through the text data of products, and it has proliferated within text mining. Sentiment analysis mainly analyzes people's opinions in text data by assigning them to predefined categories such as positive and negative. It has been studied in various directions in terms of accuracy, from simple rule-based approaches to dictionary-based approaches using predefined labels, and it is one of the most active research areas in natural language processing and text mining. Real online reviews are openly available, easy to collect, and directly affect businesses. In marketing, real-world information from customers is gathered from websites rather than surveys: whether posts on a website are positive or negative is reflected in sales, so firms try to identify this information. However, many reviews on a website are not well written and are difficult to classify. Earlier studies in this research area used review data from the Amazon.com shopping mall, while recent studies use data on stock market trends, blogs, news articles, weather forecasts, IMDB, Facebook, and so on. However, accuracy remains a recognized problem because sentiment calculations change according to the subject, paragraph, direction of the sentiment lexicon, and sentence strength. This study aims to classify sentiment polarity into positive and negative categories and to increase the prediction accuracy of polarity analysis using the IMDB review data set. First, for the text classification algorithms related to sentiment analysis, we adopt popular machine learning algorithms such as NB (naive Bayes), SVM (support vector machines), XGBoost, RF (random forests), and gradient boosting as comparative models. Second, deep learning has demonstrated the ability to extract complex, discriminative features from data; representative algorithms are CNN (convolutional neural networks), RNN (recurrent neural networks), and LSTM (long short-term memory). A CNN can be used similarly to a bag-of-words model when processing a sentence in vector format, but it does not consider the sequential attributes of the data. An RNN handles ordered data well because it takes the temporal information of the data into account, but it suffers from the long-term dependency problem; LSTM is used to solve this problem. For comparison, CNN and LSTM were chosen as simple deep learning models. In addition to the classical machine learning algorithms, CNN, LSTM, and the integrated model were analyzed. Although the algorithms have many parameters, we examined the relationship between parameter values and precision to find the optimal combination, and tried to figure out how well the models work for sentiment analysis and how they work. This study proposes an integrated CNN-LSTM algorithm to extract the positive and negative features in text analysis. The reasons for combining these two algorithms are as follows. CNN can extract features for classification automatically by applying convolution layers and massively parallel processing, whereas LSTM is not capable of highly parallel processing.
Like faucets, the LSTM's input, output, and forget gates can be opened and closed at the desired time; these gates have the advantage of placing memory blocks on hidden nodes. The memory block of the LSTM may not store all the data, but it can handle the long-range dependencies that the CNN alone cannot capture. Furthermore, when an LSTM is placed after the CNN's pooling layer, the model has an end-to-end structure, so that spatial and temporal features can be learned simultaneously. The integrated CNN-LSTM model achieved 90.33% accuracy; it trains more slowly than CNN alone but faster than LSTM alone, and it was more accurate than the other models. In addition, the word embedding layer can be improved by training the kernels step by step. CNN-LSTM compensates for the weaknesses of each individual model, and the end-to-end structure with LSTM has the advantage of improving learning layer by layer. For these reasons, this study attempts to enhance the classification accuracy of movie reviews using the integrated CNN-LSTM model.
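A minimal Keras sketch of this kind of integrated architecture (an embedding layer, a 1-D convolution and pooling stage, then an LSTM over the pooled feature sequence) is shown below. The hyperparameters are illustrative and are not those reported in the paper.

```python
# Hedged sketch of a CNN-LSTM sentiment classifier on the Keras IMDB data set.
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import imdb
from tensorflow.keras.preprocessing.sequence import pad_sequences

vocab_size, max_len = 10000, 300
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=vocab_size)
x_train = pad_sequences(x_train, maxlen=max_len)
x_test = pad_sequences(x_test, maxlen=max_len)

model = models.Sequential([
    layers.Embedding(vocab_size, 128),          # word embeddings
    layers.Conv1D(64, 5, activation="relu"),    # local n-gram feature extraction
    layers.MaxPooling1D(4),                     # shorter sequence of pooled features
    layers.LSTM(64),                            # temporal dependencies over features
    layers.Dense(1, activation="sigmoid"),      # positive / negative polarity
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=128,
          validation_data=(x_test, y_test))
```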

A Study on the Yousang-Dae Goksuro(Curve-Waterway) in Gangneung, Yungok-Myun, Yoodung Ri (강릉 연곡면 유등리 '유상대(流觴臺)' 곡수로(曲水路)의 조명(照明))

  • Rho, Jae-Hyun;Shin, Sang-Sup;Lee, Jung-Han;Huh, Jun;Park, Joo-Sung
    • Journal of the Korean Institute of Traditional Landscape Architecture / v.30 no.1 / pp.14-21 / 2012
  • The object of this study, Yousang-Dae(流觴臺) and the Go board engraved on a flat rock in Baemgol, Yoodung-ri, Yungok-myun, Gangneung-si, reveals that the place was used for appreciating arts such as Yusang Goksu and Taoist hermits' games. Three detailed field surveys produced the following results. The text Manwolsan(滿月山) Baegundongcheon(白雲洞天), engraved on a rock at Baegunsa(白雲寺), which was built by Doun in the first year of King Hungang of Unified Silla (875), fell into ruin in the middle of the Joseon period, and was rebuilt in 1954, is invaluable evidence that the tradition of Taoist hermit and Sunbee (classical scholar) culture took root in Baemgol Valley. According to the second volume of Donghoseungram(東湖勝覽), the chronicle of Gangneung published by Choi Baeksoon in 1934, there is a record saying that 'Baegunsa in Namjeonhyeon is the classroom where famous teachers like Yulgok Lee Yi and Seongje Choi Ok were teaching,' which verifies the historic character of the place. In addition, the management of Nujeong(樓亭) and Dongcheon can be traced through Baegunjeong(白雲亭), constructed by Kim Yoonkyung(金潤卿) in the Muo year, the 9th year of Cheoljong (1858), according to Donghoseungram and the completed version of Jeungboyimyoungji(增補臨瀛誌). Also, Baegundongdongcheon(白雲亭洞天), the text engraved on the standing stone across the stream from the Yousang-Dae stone, was created three years after the construction of Baegunjeong, in the 12th year of Cheoljong (1861), and serves as a symbolic sign closely related to Yousang-Dae. Based on this premise and these circumstances, through careful study of the remains of the Yusang-dae Goksuro, we discovered the Sebun-seok(細分石), which controlled the amount and speed of the flowing water, and the remains of the furrows of the Keumbae-seok(擒盃石) and Yubae-gong(留盃孔) that carried cups along the water stream through the mountain stream and rocks around Yusang-Dae. In addition, 21 people's names engraved under the heading 'Oh-Seong(午星)' were discovered on the bottom of the rock, which clearly confirms that the place was one of the main cultural footholds for appreciating the arts in the manner of Yu-Sang-Gok-Su-Yeon(流觴曲水宴) until the middle of the 20th century. It implies that the Sunbees' culture of appreciating the arts was handed down, centering on Yusang-dae, in this particular place until the middle of the 20th century. The place needs to be studied in depth because it is a historic and unique cultural site where Confucianism, Buddhism, and Zen were combined. Based on the results of this study, the identities of the 23 people as well as the writer of the Yusang-Dae text should be carefully studied in depth in relation to the character of the place, by gathering data about the appreciation of arts such as Yusanggoksu. Likewise, efforts should be made to discover the Go board engraved on the rock described in the documents, and plans should be considered to recover the original shape of the place, for example by breaking up the cement pavement of the road, additional excavation, changing the existing route, and so forth.

Isotope Ratio of Mineral N in Pinus Densiflora Forest Soils in Rural and Industrial Areas: Potential Indicator of Atmospheric N Deposition and Soil N Loss (질소공급, 고추의 생육 및 수량에 대한 녹비작물 환원 효과)

  • Kwak, Jin-Hyeob;Lim, Sang-Sun;Park, Hyun-Jung;Lee, Sun-Il;Lee, Dong-Suk;Lee, Kye-Han;Han, Gwang-Hyun;Ro, Hee-Myong;Lee, Sang-Mo;Choi, Woo-Jung
    • Korean Journal of Soil Science and Fertilizer / v.42 no.1 / pp.46-52 / 2009
  • Deposition of atmospheric N that is depleted in $^{15}N$ has been shown to decrease the N isotope ratio ($^{15}N/^{14}N$, expressed as $\delta^{15}N$) of forest samples such as tree rings, foliage, and total soil N. However, its effect on the $\delta^{15}N$ of mineral soil N, which is a biologically active N pool, has never been tested. In this study, the $\delta^{15}N$ of mineral N ($NH_4^+$ and $NO_3^-$) in forest soils from the organic layer and two depths of the mineral soil layer (0 to 20 cm and 20 to 40 cm) of Pinus densiflora stands located in two distinct areas (a rural and an industrial area) in southern Korea was analyzed to investigate whether there is any difference in the $\delta^{15}N$ of mineral N between these areas. We also evaluated the potential N loss of the study sites using the $\delta^{15}N$ of mineral N. Across the soil layers, the $\delta^{15}N$ of $NH_4^+$ ranged from +8.9 to +24.8‰ in the rural area and from +4.4 to +13.8‰ in the industrial area. The organic layer (+4.4‰) and the mineral layer between 0 and 20 cm (+13.8‰) of the industrial area showed significantly lower $\delta^{15}N$ of $NH_4^+$ than those of the rural area (+8.9 and +24.3‰, respectively), probably indicating a greater contribution of $^{15}N$-depleted $NH_4^+$ from atmospheric deposition to the forest in the industrial area than in the rural area. Meanwhile, the $\delta^{15}N$ of $NO_3^-$ did not differ between the rural and industrial areas, probably because the $\delta^{15}N$ of $NO_3^-$ is more likely to be altered by N loss, which enriches the remaining soil N pool in $^{15}N$. Compared with the $\delta^{15}N$ of soil mineral N reported by other studies (from -10.9 to +15.6‰ for $NH_4^+$ and from -14.8 to +5.6‰ for $NO_3^-$), the $\delta^{15}N$ observed in our study was substantially higher, suggesting that the study sites are more subject to N loss. It was concluded that $NH_4^+$, rather than $NO_3^-$, can conserve the $\delta^{15}N$ signature of atmospheric N deposition in forest ecosystems.
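(For reference, the conventional per-mil definition underlying the $\delta^{15}N$ values above is $\delta^{15}N\,(\text{‰}) = \left(\dfrac{(^{15}N/^{14}N)_{sample}}{(^{15}N/^{14}N)_{standard}} - 1\right) \times 1000$, with atmospheric $N_2$ as the standard.)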

Analysis of the Eyeglasses Supply System for Ametropes in ROK Military (한국군 비정시자용 안경의 보급체계 분석)

  • Jin, Yong-Gab;Koo, Bon-Yeop;Lee, Woo-Chul;Yoon, Moon-Soo;Park, Jin-Tae;Lee, Hang-Seok;Lee, Kyo-Eun;Leem, Hyun-Sung;Jang, Jae-Young;Mah, Ki-Choong
    • The Korean Journal of Vision Science / v.20 no.4 / pp.579-588 / 2018
  • Purpose: To analyze the eyeglasses supply system for ametropic soldiers in the ROK military. Methods: We investigated and analyzed the supply system of eyeglasses for ametropic soldiers provided by the Korean military. The refractive powers and corrected visual acuity were measured for 37 ametropic soldiers who wear insert glasses for ballistic protective eyewear and gas masks supplied by the military, based on their habitual prescriptions. Full correction of the refractive error was prescribed for subjects with distance visual acuity of less than 1.0, and the changes in corrected visual acuity were compared. Suggestions were provided for solving the issues in the current supply system, and the applicability of professional optometric manpower was investigated. Results: The new glasses supplied by the army for ametropic soldiers were duplicated from the glasses they wore when entering the army. The spherical equivalent refractive powers of the conventional, ballistic protective, and gas-mask insert glasses supplied to the 37 ametropic soldiers were $-3.47\pm1.69D$, $-3.52\pm1.66D$, and $-3.55\pm1.63D$, respectively, while the spherical equivalent refractive power of the fully corrected glasses was $-3.79\pm1.66D$, a significant difference (p<0.05). The distance corrected visual acuities measured at high and low contrast (logMAR) with the conventional, ballistic protective, and gas-mask insert glasses were $0.06\pm0.80$, $0.21\pm0.82$, $0.15\pm0.74$, $0.34\pm0.89$, $0.10\pm0.70$, and $0.22\pm0.27$, respectively, while the corrected visual acuities with the fully corrected glasses improved to $0.02\pm1.05$, $0.10\pm0.07$, $0.09\pm0.92$, $0.26\pm0.10$, $0.04\pm1.00$, and $0.19\pm1.00$, respectively. The differences were significant (p<0.05) except for the low-contrast corrected visual acuity of the conventional and gas-mask insert glasses. The procedure for ordering, dispensing, and supplying military glasses consists of 5 steps, and approximately two weeks or more are required from the initial examination to supply. Conclusion: The procedure for supplying military glasses showed three issues: 1) the lack of a refraction-based prescription system, 2) the relatively long time required to supply the glasses, and 3) inaccurate powers of the supplied glasses. To solve these issues, in the short term, education of soldiers in the measurement of refractive power is required, and in the near future, standard procedures for the prescription of glasses as well as the securing of professional optometric manpower are needed.
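The spherical equivalent values reported above can be reproduced from a sphero-cylindrical prescription with the conventional formula SE = sphere + cylinder/2; the abstract does not state how SE was computed, so the snippet below is only a reference sketch.

```python
# Reference sketch: spherical equivalent (SE) of a sphero-cylindrical
# prescription using the conventional formula SE = sphere + cyl / 2.
# (Assumed here; the abstract does not state how SE was computed.)
def spherical_equivalent(sphere_d: float, cylinder_d: float) -> float:
    return sphere_d + cylinder_d / 2.0

print(spherical_equivalent(-3.00, -1.00))  # -3.50 D
```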

Individual Thinking Style leads its Emotional Perception: Development of Web-style Design Evaluation Model and Recommendation Algorithm Depending on Consumer Regulatory Focus (사고가 시각을 바꾼다: 조절 초점에 따른 소비자 감성 기반 웹 스타일 평가 모형 및 추천 알고리즘 개발)

  • Kim, Keon-Woo;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.24 no.4 / pp.171-196 / 2018
  • With the development of the web, two-way communication and evaluation became possible and marketing paradigms shifted. In order to meet the needs of consumers, web design trends continuously respond to consumer feedback. As the web becomes more and more important, both academics and businesses study consumer emotions and satisfaction on the web. However, some consumer characteristics are not well considered. Demographic characteristics such as age and sex have been studied extensively, but few studies consider psychological characteristics such as regulatory focus. In this study, we analyze the effect of web style on consumer emotion. Many studies analyze the relationship between the web and regulatory focus, but most concentrate on the purpose of web use, particularly motivation and information search, rather than on web style and design. The web communicates with users through visual elements. Because the human brain is influenced by all five senses, both design factors and emotional responses are important in the web environment. Therefore, in this study, we examine the relationship between consumer emotion and satisfaction on the one hand and web style and design on the other. Previous studies have considered the effects of web layout, structure, and color on emotions. In this study, however, we excluded these web components and, in contrast to earlier studies, analyzed the relationship between consumer satisfaction and the emotional indexes of web style only. To perform this analysis, we collected consumer surveys presenting 40 web style themes to 204 consumers, each of whom evaluated four themes. The emotional adjectives evaluated by the consumers consisted of 18 contrast pairs, and the higher-level emotional indexes were extracted through factor analysis. The emotional indexes were 'softness,' 'modernity,' 'clearness,' and 'jam.' Hypotheses were established based on the assumption that the emotional indexes have different effects on consumer satisfaction. In the analysis, hypotheses 1, 2, and 3 were accepted and hypothesis 4 was rejected; although hypothesis 4 was rejected, its effect on consumer satisfaction was negative rather than positive. This means that emotional indexes such as 'softness,' 'modernity,' and 'clearness' have a positive effect on consumer satisfaction: consumers prefer web styles that feel soft, emotional, natural, rounded, dynamic, modern, elaborate, unique, bright, pure, and clear. 'Jam' has a negative effect on consumer satisfaction, meaning that consumers prefer web styles that feel empty, plain, and simple. Regulatory focus produces differences in motivation and propensity across various domains. It is important to consider organizational behavior and decision making according to regulatory focus tendency, and it affects not only political, cultural, and ethical judgments and behavior but also broad psychological problems. Regulatory focus also differs in emotional response: promotion focus responds more strongly to positive emotions, whereas prevention focus responds strongly to negative emotions. Web style is a type of service, and consumer satisfaction is affected not only by cognitive evaluation but also by emotion. This emotional response depends on whether the consumer expects benefit or harm. Therefore, it is necessary to examine how consumers' emotional responses to web style differ according to regulatory focus, which is one of the characteristics and viewpoints of consumers.
Based on the MMR analysis results, hypothesis 5.3 was accepted and hypothesis 5.4 was rejected, although hypothesis 5.4 was supported in the direction opposite to that hypothesized. Through this validation, we confirmed the mechanism of emotional response according to regulatory focus tendency. Using the results, we developed the structure of a web-style recommendation system and recommendation methods based on regulatory focus. We classified consumers into three regulatory focus groups, promotion, grey, and prevention, and then suggest a web-style recommendation method for each group. If this study is developed further, we expect that existing regulatory focus theory can be extended not only to the motivational aspect but also to emotional and behavioral responses according to regulatory focus tendency. Moreover, we believe that it is possible to recommend web styles according to regulatory focus and the emotional qualities that consumers most prefer.
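A hedged sketch of the kind of MMR (presumably moderated multiple regression) test implied above is shown below in Python with statsmodels: it checks whether the regulatory focus group moderates the effect of one emotional index on satisfaction. The file name, column names, and group coding are illustrative assumptions, not the study's actual data or variable names.

```python
# Sketch of a moderated regression test, under assumed variable names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("web_style_ratings.csv")   # hypothetical consumer evaluations

# regulatory_focus coded as a categorical variable: promotion / grey / prevention
base = smf.ols("satisfaction ~ softness + C(regulatory_focus)", data=df).fit()
mmr = smf.ols("satisfaction ~ softness * C(regulatory_focus)", data=df).fit()

# F-test on the interaction terms: does regulatory focus moderate the effect
# of the 'softness' index on satisfaction?
f_stat, p_value, df_diff = mmr.compare_f_test(base)
print(f_stat, p_value, df_diff)
```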

A Study on the Use of GIS-based Time Series Spatial Data for Streamflow Depletion Assessment (하천 건천화 평가를 위한 GIS 기반의 시계열 공간자료 활용에 관한 연구)

  • YOO, Jae-Hyun;KIM, Kye-Hyun;PARK, Yong-Gil;LEE, Gi-Hun;KIM, Seong-Joon;JUNG, Chung-Gil
    • Journal of the Korean Association of Geographic Information Studies / v.21 no.4 / pp.50-63 / 2018
  • Rapid urbanization has led to a distortion of the natural hydrological cycle. This change in the hydrological cycle structure is causing streamflow depletion and altering existing patterns of water resource use. To manage such phenomena, a streamflow depletion impact assessment technology that can forecast depletion is required. To implement such technology, it is indispensable to build GIS-based spatial data as fundamental input, but related research is scarce. Therefore, this study was conducted to examine the use of GIS-based time series spatial data for streamflow depletion assessment. For this study, GIS data covering decades of change on a national scale were constructed for six streamflow depletion impact factors (weather, soil depth, forest density, road network, groundwater usage, and land use), and the data were used as the basic input for the operation of a continuous hydrologic model. Focusing on these impact factors, the causes of streamflow depletion were analyzed over the time series. Then, using DrySAT, a distributed continuous hydrologic model, the annual runoff under each streamflow depletion impact factor was estimated and a depletion assessment was conducted. As a result, the baseline annual runoff was estimated at 977.9mm under the given weather conditions without considering the other factors. When considering the decrease in soil depth, the increase in forest density, road development, the increase in groundwater usage, and the change in land use and development, annual runoff was estimated at 1,003.5mm, 942.1mm, 961.9mm, 915.5mm, and 1,003.7mm, respectively. The results showed that the major causes of streamflow depletion were lowered soil depth, which decreases the infiltration volume and surface runoff, thereby decreasing streamflow; increased forest density, which decreases surface runoff; the expanded road network, which decreases sub-surface flow; increased groundwater use from indiscriminate development, which decreases baseflow; and increased impervious areas, which increase surface runoff. In addition, each standard watershed was assigned a depletion grade based on the definition of streamflow depletion and the grade ranges. Considering the weather, the decrease in soil depth, the increase in forest density, road development, the increase in groundwater usage, and the change in land use and development, the depletion grades were 2.1, 2.2, 2.5, 2.3, 2.8, and 2.2, respectively. Among the five streamflow depletion impact factors other than the rainfall condition, the change in groundwater usage showed the biggest influence on depletion, followed by the changes in forest density, road construction, land use, and soil depth. In conclusion, it is anticipated that a national streamflow depletion assessment system to be developed in the future would provide customized depletion management and prevention plans based on assessment results for future changes in the six streamflow depletion impact factors and the prospective progress of depletion.
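The scenario results reported above can be summarized as deviations from the baseline runoff, as in the short sketch below; the runoff values are taken from the abstract, while the depletion-grade thresholds are not reproduced because the abstract does not define them.

```python
# Annual runoff values (mm) reported in the abstract, compared with the
# baseline of 977.9 mm estimated under the weather condition alone.
baseline_mm = 977.9
scenario_runoff_mm = {
    "soil depth decrease": 1003.5,
    "forest density increase": 942.1,
    "road development": 961.9,
    "groundwater usage increase": 915.5,
    "land use / development change": 1003.7,
}

for factor, runoff in scenario_runoff_mm.items():
    delta = runoff - baseline_mm
    print(f"{factor:30s} {runoff:7.1f} mm ({delta:+6.1f} mm vs. baseline)")
```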