• Title/Summary/Keyword: Information Processing Theory


PVC Classification based on QRS Pattern using QS Interval and R Wave Amplitude (QRS 패턴에 의한 QS 간격과 R파의 진폭을 이용한 조기심실수축 분류)

  • Cho, Ik-Sung;Kwon, Hyeog-Soong
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.18 no.4
    • /
    • pp.825-832
    • /
    • 2014
  • Previous works for detecting arrhythmia have mostly used nonlinear methods such as artificial neural networks, fuzzy theory, and support vector machines to increase classification accuracy. Most of these methods require accurate detection of the P-QRS-T points and incur high computational cost and long processing time. Some methods have the advantage of low complexity but generally suffer from low sensitivity. It is also difficult to detect PVC accurately because QRS patterns vary between individuals. It is therefore necessary to design an efficient algorithm that classifies PVC in real time based on the QRS pattern and reduces computational cost by extracting a minimal set of features. In this paper, we propose PVC classification based on the QRS pattern using the QS interval and R-wave amplitude. For this purpose, we detect the R wave, RR interval, and QRS pattern from the noise-free ECG signal through a preprocessing step, and then classify PVC in real time using the QS interval and R-wave amplitude. R-wave detection and PVC classification performance was evaluated on 9 records of the MIT-BIH arrhythmia database, each containing more than 30 PVCs. The method achieved an average of 99.02% for R-wave detection and 93.72% for PVC classification.
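The two-feature test described above can be sketched in a few lines. The thresholds (`qs_limit`, `amp_ratio`) and the running-average scheme below are illustrative assumptions, not the paper's tuned values:

```python
def classify_beats(beats, qs_limit=0.12, amp_ratio=1.5):
    """Label each beat 'PVC' or 'Normal' from two features.

    beats: list of (qs_interval_s, r_amplitude_mv) tuples.
    A beat is flagged as PVC when its QS interval exceeds qs_limit
    (a wide ventricular complex) AND its R amplitude deviates from the
    running average of normal beats by more than amp_ratio.
    Both thresholds are illustrative, not the paper's values.
    """
    labels, normal_amps = [], []
    for qs, amp in beats:
        mean_amp = sum(normal_amps) / len(normal_amps) if normal_amps else amp
        wide = qs > qs_limit
        abnormal_amp = amp > amp_ratio * mean_amp or amp < mean_amp / amp_ratio
        if wide and abnormal_amp:
            labels.append("PVC")
        else:
            labels.append("Normal")
            normal_amps.append(amp)  # update the running normal average
    return labels
```

Keeping only a running mean of normal-beat amplitudes (rather than refitting a model) is what keeps the per-beat cost low enough for real-time use.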

Process Networks of Ecohydrological Systems in a Temperate Deciduous Forest: A Complex Systems Perspective (온대활엽수림 생태수문계의 과정망: 복잡계 관점)

  • Yun, Juyeol;Kim, Sehee;Kang, Minseok;Cho, Chun-Ho;Chun, Jung-Hwa;Kim, Joon
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.16 no.3
    • /
    • pp.157-168
    • /
    • 2014
  • From a complex systems perspective, ecohydrological systems in forests may be characterized by (1) large networks of components that give rise to complex collective behaviors, (2) sophisticated information processing, and (3) adaptation through self-organization and learning processes. In order to demonstrate these characteristics, we applied the recently proposed 'process networks' approach to a temperate deciduous forest in Gwangneung National Arboretum in Korea. The process network analysis clearly delineated the forest ecohydrological system as hierarchical networks of information flows and feedback loops, with various time scales, among different variables. Several subsystems were identified, such as the synoptic subsystem (SS), atmospheric boundary layer subsystem (ABLS), biophysical subsystem (BPS), and biophysicochemical subsystem (BPCS). These subsystems were assembled/disassembled through the couplings/decouplings of feedback loops to form/deform newly aggregated subsystems (e.g., a regional subsystem) - evidence of the self-organizing processes of a complex system. Our results imply that, despite natural and human disturbances, ecosystems grow and develop through self-organization while maintaining dynamic equilibrium, thereby continuously adapting to environmental changes. Ecosystem integrity is preserved when the system's self-organizing processes are preserved, something that happens naturally if we maintain the context for self-organization. From this perspective, the process networks approach makes sense.
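Process networks draw their directed links from pairwise information-flow measures between variables, commonly transfer entropy. A minimal discrete version at lag 1 over symbolized (here binary) series might look like the following; real analyses bin continuous flux data and test significance, which this sketch omits:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Discrete transfer entropy T(X -> Y) in bits, at lag 1.

    Measures how much knowing x_t reduces uncertainty about y_{t+1}
    beyond what y_t already provides -- the quantity used to draw
    directed information-flow links in a process network.
    """
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))        # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))         # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                      # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]           # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te
```

A positive value for T(X → Y) but not T(Y → X) indicates a directed link from X to Y at that time scale; scanning lags gives the feedback loops described above.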

Pareto Ratio and Inequality Level of Knowledge Sharing in Virtual Knowledge Collaboration: Analysis of Behaviors on Wikipedia (지식 공유의 파레토 비율 및 불평등 정도와 가상 지식 협업: 위키피디아 행위 데이터 분석)

  • Park, Hyun-Jung;Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.3
    • /
    • pp.19-43
    • /
    • 2014
  • The Pareto principle, also known as the 80-20 rule, states that roughly 80% of the effects come from 20% of the causes for many events, including natural phenomena. It has been recognized as a golden rule in business and widely applied, as in the observation that 20 percent of customers account for 80 percent of total sales. On the other hand, the Long Tail theory, which points out that "the trivial many" produce more value than "the vital few," has gained popularity in recent times thanks to the tremendous reduction of distribution and inventory costs enabled by the development of ICT (Information and Communication Technology). This study set out to illuminate how these two primary business paradigms, the Pareto principle and the Long Tail theory, relate to the success of virtual knowledge collaboration. The importance of virtual knowledge collaboration is soaring in this era of globalization and virtualization, which transcends geographical and temporal constraints. Many previous studies on knowledge sharing have focused on the factors that affect knowledge sharing, seeking to boost individual knowledge sharing and to resolve the social dilemma arising from the fact that rational individuals are likely to consume rather than contribute knowledge. Knowledge collaboration can be defined as the creation of knowledge not only by sharing knowledge, but also by transforming and integrating it. From this perspective, the relative distribution of knowledge sharing among participants can count as much as the absolute amount of individual knowledge sharing. In particular, whether a greater contribution by the upper 20 percent of participants enhances the efficiency of overall knowledge collaboration is an issue of interest. This study examines the effect of this distribution of knowledge sharing on the efficiency of knowledge collaboration, and extends the analysis to reflect work characteristics.
All analyses were conducted on actual behavioral data instead of self-reported questionnaire surveys. More specifically, we analyzed the collaborative behaviors of the editors of 2,978 English Wikipedia featured articles, the highest quality grade of articles in English Wikipedia. We adopted the Pareto ratio, the ratio of the number of contributions made by the top 20 percent of participants to the total number of contributions to an article, to examine the effect of the Pareto principle. In addition, the Gini coefficient, which represents the inequality of income within a group of people, was applied to reveal the effect of inequality of knowledge contribution. Hypotheses were set up on the assumption that a higher ratio of knowledge contribution by more highly motivated participants leads to higher collaboration efficiency, but that if the ratio gets too high, collaboration efficiency deteriorates because overall informational diversity is threatened and the contributions of less motivated participants are discouraged. Cox regression models were formulated for each of the focal variables, the Pareto ratio and the Gini coefficient, with seven control variables such as the number of editors involved in an article, the average time between successive edits, and the number of sections a featured article has. The dependent variable of the Cox models is the time from article initiation to promotion to featured-article status, indicating the efficiency of knowledge collaboration. To examine whether the effects of the focal variables vary depending on the characteristics of a group task, we classified the 2,978 featured articles into two categories, academic and non-academic, where academic articles refer to at least one paper published in an SCI, SSCI, A&HCI, or SCIE journal.
We assumed that academic articles are more complex, entail more information processing and problem solving, and thus require more skill variety and expertise. The analysis results indicate the following. First, the Pareto ratio and the inequality of knowledge sharing relate in a curvilinear fashion to collaboration efficiency in an online community, promoting it up to an optimal point and undermining it thereafter. Second, this curvilinear effect on collaboration efficiency is more pronounced for more academic tasks.
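Both focal variables can be computed directly from per-editor contribution counts. A minimal sketch (the rounding rule for the 20% cutoff is an assumption; the study's exact handling of ties and small groups is not specified here):

```python
def pareto_ratio(contribs):
    """Share of all contributions made by the top 20% of editors.

    contribs: list of per-editor contribution counts for one article.
    """
    s = sorted(contribs, reverse=True)
    k = max(1, round(0.2 * len(s)))  # size of the top-20% group
    return sum(s[:k]) / sum(s)

def gini(contribs):
    """Gini coefficient of the contribution distribution.

    0 = perfectly equal contribution; values near 1 = a few editors
    dominate. Uses the standard rank-weighted formula on sorted data.
    """
    s = sorted(contribs)
    n = len(s)
    cum = sum((i + 1) * v for i, v in enumerate(s))
    return (2 * cum) / (n * sum(s)) - (n + 1) / n
```

For example, ten editors with 10 edits each give a Pareto ratio of 0.2 and a Gini of 0, while one editor making 80 of 100 edits gives 0.8 and 0.6 respectively; these are the values the Cox models relate to promotion time.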

Study on the Applicability of High Frequency Seismic Reflection Method to the Inspection of Tunnel Lining Structures - Physical Modeling Approach - (터널 지보구조 진단을 위한 고주파수 탄성파 반사법의 응용성 연구 - 모형 실험을 중심으로 -)

  • Kim, Jung-Yul;Kim, Yoo-Sung;Shin, Yong-Suk;Hyun, Hye-Ja;Jung, Hyun-Key
    • Journal of Korean Tunnelling and Underground Space Association
    • /
    • v.2 no.3
    • /
    • pp.37-45
    • /
    • 2000
  • In recent years, two reflection methods, GPR and the seismic Impact-Echo method, have usually been performed to obtain information about tunnel lining structures composed of concrete lining, shotcrete, a water barrier, and voids behind the lining. However, they do not achieve a resolution sufficient for inspecting tunnel safety, primarily because of (1) the inner layers of the lining structure being thin compared with the wavelength of the source wavelets, (2) dominant unwanted surface-wave arrivals, and (3) inadequate measuring strategies. In this sense, seismic physical modeling, which exploits full knowledge of a known physical model, is a useful tool for handling such problems, especially for studying wave propagation in fine structures that are amenable neither to theory nor to field work. This paper therefore presents various results of seismic physical modeling that demonstrate the possibility of detecting the inner layer boundaries of tunnel lining structures. To this end, a physical model analogous to a lining structure was built, measured, and processed in the same way as in regular reflection surveys. The resulting seismic section gives a clear picture of the lining structure and opens up a more consistent direction of research into the development of efficient measuring and processing technology.
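Why thin lining layers demand high-frequency sources follows from the standard quarter-wavelength resolution criterion, and the strength of each internal boundary from the normal-incidence reflection coefficient. A small sketch, with illustrative (not the experiment's) velocity and impedance values:

```python
def min_frequency(thickness_m, velocity_ms):
    """Minimum dominant source frequency (Hz) needed to resolve a
    layer, using the quarter-wavelength criterion: a layer is
    resolvable when its thickness >= wavelength / 4."""
    return velocity_ms / (4 * thickness_m)

def reflection_coeff(z1, z2):
    """Normal-incidence reflection coefficient at the boundary from
    layer 1 to layer 2, where z = density * P-wave velocity is the
    acoustic impedance of each layer."""
    return (z2 - z1) / (z2 + z1)
```

With an assumed concrete P-wave velocity of 3000 m/s, a 0.2 m shotcrete layer already requires a dominant frequency of several kHz, far above conventional seismic sources; this is the motivation for the high-frequency method studied in the paper.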


An Algorithm for Spot Addressing in Microarray using Regular Grid Structure Searching (균일 격자 구조 탐색을 이용한 마이크로어레이 반점 주소 결정 알고리즘)

  • 진희정;조환규
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.31 no.9
    • /
    • pp.514-526
    • /
    • 2004
  • Microarray is a new technique for gene-expression experiments that has gained biologists' attention in recent years. This technology enables us to measure the expression of hundreds of thousands of genes or genotypes at once. Since analyzing gene-expression patterns requires manual work, we want to develop an effective, automated tool for analyzing microarray images. However, it is difficult to analyze DNA chip images automatically due to several problems, such as variation in spot position, irregularity of spot shape and size, and sample contamination. In particular, one of the most difficult problems in microarray analysis is block and spot addressing, which is performed manually or semi-automatically in all commercial tools. In this paper we propose a new algorithm that addresses the positions of spots and blocks using a new concept, regular grid structure searching. In our algorithm, we first construct maximal I-regular sequences from the set of input points. Second, we calculate the rotational angle and unit distance. Finally, we construct the I-regularity graph, allowing pseudo points, and compute the spot/block addresses from this graph. Experimental results show that our algorithm is highly robust and reliable. Supplementary information is available at http://jade.cs.pusan.ac.kr/~autogrid.
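The core addressing step can be illustrated with a much-simplified stand-in for the I-regular-sequence search: assume the grid is already rotation-corrected, estimate the unit spacing as the median nearest-neighbour distance, and address each spot by rounding its offset from the top-left corner to whole grid units. The real algorithm additionally estimates the rotation angle and tolerates missing spots via pseudo points, which this sketch does not:

```python
from math import dist
from statistics import median

def grid_addresses(points):
    """Assign (row, col) addresses to spot centres on a jittered grid.

    Simplified stand-in for the paper's method: grid assumed
    rotation-corrected; unit spacing estimated as the median
    nearest-neighbour distance; addresses found by rounding each
    spot's offset from the top-left corner to whole grid units.
    """
    unit = median(min(dist(p, q) for q in points if q != p)
                  for p in points)
    x0 = min(x for x, _ in points)
    y0 = min(y for _, y in points)
    return [(round((y - y0) / unit), round((x - x0) / unit))
            for x, y in points]
```

Using the median makes the spacing estimate robust to a few outlier spots, mirroring the robustness goal of the full algorithm.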

A Case Study on Big Data Analysis of Performing Arts Consumer for Audience Development (관객개발을 위한 공연예술 소비자 빅데이터 분석 사례 고찰)

  • Kim, Sun-Young;Yi, Eui-Shin
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.18 no.12
    • /
    • pp.286-299
    • /
    • 2017
  • The Korean performing arts sector has been facing stagnation due to oversupply, the lack of an effective distribution system, and insufficient business models. To overcome these difficulties, it is necessary to improve the efficiency and accuracy of marketing by using more objective market data, and to secure audience development and loyalty. This study takes the view that Big Data could provide more general and accurate statistics and could ultimately promote tailored services for performances. We examine the first case of Big Data analysis conducted by a credit card company, as well as Big Data's characteristics, analytical techniques, and the theoretical background of performing-arts consumer analysis. The purpose of this study is to identify the meaning and limitations of such Big Data analysis of the performing arts and how to overcome those limitations. The case study revealed the incompleteness of credit card data on performance buyers, limits in verifying existing theory, low utilization, and limits in analyzing consumer propensity and purchase drivers. As solutions to these problems, it is possible to identify genres and individual performances, to collect qualitative information that can reveal trends and purchase factors in combination with surveys, and to identify purchase motives through mashups with social data. This research is ultimately a starting point for how performing-arts consumers should be studied in the Big Data era and what changes should be sought. Based on our results, we expect more concrete qualitative analyses for audience development, and continued development of Big Data analysis and processing solutions that accurately represent the performing-arts market.

Intelligent I/O Subsystem for Future A/V Embedded Device (멀티미디어 기기를 위한 지능형 입출력 서브시스템)

  • Jang, Hyung-Kyu;Won, Yoo-Jip;Ryu, Jae-Min;Shim, Jun-Seok;Boldyrev, Serguei
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.33 no.1_2
    • /
    • pp.79-91
    • /
    • 2006
  • The intelligent disk can improve the overall performance of the I/O subsystem by processing I/O operations on the disk side. At present, however, realizing the intelligent disk seems impossible because of the limitations of the I/O subsystem and the lack of backward compatibility with the traditional I/O interface scheme. In this paper, we propose a new model for the intelligent disk that dynamically optimizes the I/O subsystem using only information related to physical sectors. In this way, the proposed model does not break compatibility with the traditional I/O interface scheme. For this, a boosting algorithm that strengthens a weak learner through repeated learning is used. If the learner classifies the recent I/O workload as a multimedia workload, the disk reads more sectors ahead. By embedding this functionality in the disk as firmware or an embedded OS, the overall I/O subsystem can operate more efficiently without imposing additional load.
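The decision the disk makes can be illustrated with a toy stand-in for the boosted classifier: a few weak rules over sector-level features vote with weights, and a positive combined score triggers larger readahead. The rules, weights, and readahead values below are illustrative assumptions; in the paper the weights would come from boosting on real workload traces:

```python
def choose_readahead(requests, small=8, large=64):
    """Pick a readahead size from a window of (sector, length) requests.

    Toy stand-in for the paper's boosted classifier: three weak rules
    vote, with hand-set weights standing in for learned boosting
    weights, on whether the recent workload looks like sequential
    multimedia streaming; if so, the disk prefetches more sectors.
    """
    # fraction of requests that start exactly where the previous ended
    seq = sum(1 for (s1, l1), (s2, _) in zip(requests, requests[1:])
              if s2 == s1 + l1) / max(1, len(requests) - 1)
    avg_len = sum(l for _, l in requests) / len(requests)
    rules = [
        (0.8, seq > 0.7),          # mostly sequential accesses
        (0.5, avg_len >= 32),      # large requests
        (0.3, len(requests) >= 4), # sustained stream
    ]
    score = sum(w if fired else -w for w, fired in rules)
    return large if score > 0 else small
```

Note that every feature here is derived from physical sector numbers and request lengths alone, which is what preserves compatibility with the traditional I/O interface.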

Generator of Dynamic User Profiles Based on Web Usage Mining (웹 사용 정보 마이닝 기반의 동적 사용자 프로파일 생성)

  • An, Kye-Sun;Go, Se-Jin;Jiong, Jun;Rhee, Phill-Kyu
    • The KIPS Transactions:PartB
    • /
    • v.9B no.4
    • /
    • pp.389-390
    • /
    • 2002
  • In internet-based electronic commerce applications, acquiring information about customer habits is important for providing recommendation services and dynamically supplying web contents. Collaborative filtering, which has been used as a standard approach to Web personalization, cannot track rapid changes in user preference because of its static user profiles, and it has shortcomings such as reliance on user ratings, lack of scalability, and poor performance on high-dimensional data. To overcome these drawbacks, Web usage mining has become prevalent. Web usage mining is a technique that discovers patterns from Web usage data logged at the server; in particular, techniques that discover Web usage patterns and cluster those patterns are used. However, pattern discovery with the Apriori algorithm creates many useless patterns. In this paper, an enhanced method for constructing dynamic user profiles from validated Web usage patterns is proposed. First, Apriori is used to discover patterns, and the ARHP algorithm is chosen to create clusters for user profiles. Before clustering the discovered patterns, a validation step removes useless patterns using Dempster-Shafer theory. User profiles are then created dynamically, based on current user sessions, for Web personalization.
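The validation step rests on Dempster's rule of combination, which merges two bodies of evidence by multiplying the masses of intersecting focal elements and renormalizing by the conflict. A minimal sketch (the hypothesis sets in the usage example are illustrative, not the paper's patterns):

```python
def combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments.

    m1, m2: dicts mapping frozenset hypotheses to mass (summing to 1).
    Masses of intersecting focal elements are multiplied; mass landing
    on empty intersections is the conflict K, and surviving masses are
    renormalised by 1 - K. Patterns whose combined belief stays low
    can then be discarded as useless.
    """
    raw, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                raw[inter] = raw.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {h: v / (1 - conflict) for h, v in raw.items()}
```

For example, combining evidence {p}: 0.6, {p, q}: 0.4 with {p}: 0.5, {q}: 0.5 yields belief 5/7 for p and 2/7 for q after renormalizing away the 0.3 of conflicting mass.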

Arrhythmia Classification based on Binary Coding using QRS Feature Variability (QRS 특징점 변화에 따른 바이너리 코딩 기반의 부정맥 분류)

  • Cho, Ik-Sung;Kwon, Hyeog-Soong
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.17 no.8
    • /
    • pp.1947-1954
    • /
    • 2013
  • Previous works for detecting arrhythmia have mostly used nonlinear methods such as artificial neural networks, fuzzy theory, and support vector machines to increase classification accuracy. Most of these methods require accurate detection of the P-QRS-T points and incur high computational cost and long processing time, yet the P and T waves are difficult to detect because of individual differences. It is therefore necessary to design an efficient algorithm that classifies different arrhythmias in real time and reduces computational cost by extracting minimal features. In this paper, we propose arrhythmia detection based on binary coding using QRS feature variability. For this purpose, we detect the R wave, RR interval, and QRS width from the noise-free ECG signal through a preprocessing step, and classify arrhythmia in real time by converting the threshold variability of each feature to a binary code. Classification of PVC, PAC, Normal, BBB, and Paced beats was evaluated using 39 records of the MIT-BIH arrhythmia database, achieving averages of 97.18%, 94.14%, 99.83%, 92.77%, and 97.48%, respectively.
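The binary-coding idea can be sketched with two of the features: threshold each one to a bit and let the resulting code index the beat type. The thresholds and the code-to-class table below are illustrative assumptions for two features only, not the paper's actual coding over all of its features:

```python
def classify_beat(rr_ratio, qrs_width, premature=0.9, wide=0.12):
    """Classify one beat from a 2-bit code of feature variability.

    rr_ratio:  this beat's RR interval divided by the running average;
    qrs_width: QRS duration in seconds.
    Each feature is thresholded to one bit and the 2-bit code looks up
    the beat type. Thresholds and the code table are illustrative.
    """
    code = (int(rr_ratio < premature) << 1) | int(qrs_width > wide)
    return {0b00: "Normal",  # on-time, narrow complex
            0b01: "BBB",     # on-time, wide -> bundle branch block
            0b10: "PAC",     # premature, narrow -> atrial origin
            0b11: "PVC",     # premature, wide -> ventricular origin
            }[code]
```

Because classification reduces to a few comparisons and a table lookup, the per-beat cost is constant, which is what makes real-time operation feasible.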

Effects of Foodservice Franchise's Online Advertising and E-WOM on Trust, Commitment and Loyalty

  • AHN, Sung-Man;YANG, Jae-Jang
    • The Korean Journal of Franchise Management
    • /
    • v.12 no.2
    • /
    • pp.7-21
    • /
    • 2021
  • Purpose: One characteristic of service companies such as foodservice franchises is that they are easy to imitate, so many brands can copy a menu that is popular with consumers. A foodservice franchise company should therefore develop a brand that customers can distinguish from other brands in order to differentiate itself from competitors, and such differentiation is achieved through communication with customers. This study therefore proposes a new research model that analyzes customer loyalty through trust in, and commitment to, online advertising and online word of mouth. Online information reaches customers through a mixture of advertising and word of mouth, but previous studies have considered only online advertising or only online word of mouth. In addition, we verify differences by gender, an important variable in research on customers' online information-processing behavior. Research design, data, and methodology: The questionnaire was administered to respondents aged 20 or older who had visited a store of a foodservice franchise operating SNS within the last 3 months. A total of 400 surveys were collected over 20 days, from April 1 to April 20, 2020. Result: The research results are as follows. First, this study examined the effect of online advertising and online word of mouth on trust and commitment. Second, it verified social influence theory in the context of online advertising and online word of mouth. Third, the effect of online advertising and online word of mouth on loyalty by gender was verified.
Fourth, compared to traditional advertising, online advertising is well suited to marketing by foodservice franchise companies because it allows interaction with consumers, immediate modification of advertisements, extensive advertising at low cost, market segmentation, and measurement of advertising effectiveness. The recent expansion of online channels to mobile allows foodservice franchisees to provide new communication services such as SMS (Short Message Service), multimedia messaging services, and location-based services. Fifth, a foodservice franchise company can increase brand awareness through online marketing or induce the use of offline stores. Sixth, a franchisor can grow into a sustainable company only by using resources efficiently. Conclusions: Trust is important in foodservice franchise information, and it has a significant impact on customer commitment and loyalty.