• Title/Summary/Keyword: Computing Contents


Emotion Quantification Algorithm Utilizing Color Attributes of Images (이미지 색채 속성을 활용한 감성 정량화 알고리즘)

  • Lee, Yean-Ran
    • The Journal of the Korea Contents Association
    • /
    • v.15 no.11
    • /
    • pp.1-9
    • /
    • 2015
  • Recognizing and regulating emotion in response to emotional change is a topic of growing interest in computing research, so objective, quantified assessment methods are essential for applying color sensibility computing in practice. This paper applies emotional computing to digital color images so that emotion can be expressed as numerical values. The approach focuses on the color attributes of the image, separating them into hue, brightness, and saturation, and computes an emotion score by weighting each attribute and by the increase or decrease of the tonal proportions that contribute to emotional expression. The calculation yields a pleasure ("free degree") value on the X-axis and a tension value on the Y-axis, and the intersection of the two coordinates is taken as the emotion point. The coordinates follow Russell's Core Affect model, so the location of the emotion point and its distance from the origin quantify the size of the emotion and the relationships between emotions computed from color sensibility.
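
As a rough illustration of the kind of mapping the abstract describes, the sketch below converts an image's average hue, saturation, and brightness into a (pleasure, tension) point on Russell's Core Affect plane. The weights and formulas are hypothetical placeholders, not the coefficients or tone tables used in the paper.

```python
# Minimal sketch: map image color attributes to a (pleasure, tension) point.
# Weights and formulas are illustrative assumptions, not the paper's values.
import colorsys

def emotion_point(pixels, w_bright=0.6, w_hue=0.4):
    """pixels: iterable of (r, g, b) tuples in 0-255. Returns (x, y) in [-1, 1]."""
    n = 0
    h_sum = s_sum = v_sum = 0.0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        h_sum, s_sum, v_sum, n = h_sum + h, s_sum + s, v_sum + v, n + 1
    h_avg, s_avg, v_avg = h_sum / n, s_sum / n, v_sum / n

    # X-axis (pleasure): brighter images and warmer hues (near orange ~ 0.1) score higher.
    pleasure = w_bright * (2 * v_avg - 1) + w_hue * (1 - 2 * abs(h_avg - 0.1))
    # Y-axis (tension/arousal): more saturated images score higher.
    tension = 2 * s_avg - 1
    return max(-1.0, min(1.0, pleasure)), max(-1.0, min(1.0, tension))

print(emotion_point([(255, 200, 80), (240, 180, 60), (230, 170, 90)]))
```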

Software Architecture of the Grid for implementing the Cloud Computing of the High Availability (고가용성 클라우드 컴퓨팅 구축을 위한 그리드 소프트웨어 아키텍처)

  • Lee, Byoung-Yup;Park, Jun-Ho;Yoo, Jae-Soo
    • The Journal of the Korea Contents Association
    • /
    • v.12 no.2
    • /
    • pp.19-29
    • /
    • 2012
  • Cloud computing technology is currently being delivered in various service forms, and it has become a ground-breaking service that provides storage, data, and software without requiring the user to deal with technical details such as the physical location of the service or the system environment. Its advantage is that as many IT resources as needed can be used easily and freely, regardless of the hardware required by individual systems or the service level required by the infrastructure. Because a variety of Internet-based technologies also allow the resource usage model to be chosen to suit the business model, provisioning technology and virtualization technology are attracting attention as the main enabling technologies; they are key elements that let web-based users access resources freely and deploy them according to their own environment. Therefore, this paper analyzes cloud computing technology trends and introduces the software technologies and architectures, viewed from the grid perspective, for building a high-availability cloud computing environment.

Distributed File Systems Architectures of the Large Data for Cloud Data Services (클라우드 데이터 서비스를 위한 대용량 데이터 처리 분산 파일 아키텍처 설계)

  • Lee, Byoung-Yup;Park, Jun-Ho;Yoo, Jae-Soo
    • The Journal of the Korea Contents Association
    • /
    • v.12 no.2
    • /
    • pp.30-39
    • /
    • 2012
  • Many IT vendors have already entered the cloud computing market and are expanding their territory by building on their hardware and software technology and by forming collaborations between hardware and software vendors. The distributed file system is a core technology for cloud computing: it must guarantee performance and safety for high-level service requests as well as for data storage. This paper introduces distributed file systems for cloud computing and how they are used in practice, covering in-memory databases, the Hadoop file system, and high-availability database systems. Based on the kinds of distributed file systems currently used in the cloud computing market, the paper then defines a reference architecture for very large scale distributed processing.
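
To make the role of a distributed file system in a cloud data service concrete, the sketch below stores and reads an object in HDFS via WebHDFS. It assumes the third-party Python `hdfs` package and a placeholder NameNode address; it is not taken from the paper.

```python
# Sketch of writing and reading data in HDFS over WebHDFS, assuming the
# third-party "hdfs" Python package; host, port, and paths are placeholders.
from hdfs import InsecureClient

client = InsecureClient('http://namenode.example.com:9870', user='cloud')

# Write a small object; HDFS replicates its blocks across DataNodes,
# which provides the availability the paper emphasizes.
client.write('/data/service/sample.json', data=b'{"key": "value"}',
             overwrite=True)

# Read it back as a stream.
with client.read('/data/service/sample.json') as reader:
    print(reader.read())
```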

A Method on the Realization of QoS Guarantee in the Grid Network (그리드 네트워크에서의 QoS 보장방법 구현)

  • Kim, Jung-Yun;Na, Won-Shin;Ryoo, In-Tae
    • Journal of Digital Contents Society
    • /
    • v.10 no.1
    • /
    • pp.169-175
    • /
    • 2009
  • Grid computing is an approach for obtaining the most efficient performance from computing resources in terms of cost and convenience. It is also considered a good way to solve problems that cannot be handled by conventional computing technologies such as clustering, or that require supercomputing capability because of complex, long-running tasks. To run grid computing effectively, geographically distributed high-performance computing resources must be connected in real time. Answering these needs, researchers at several universities, with Argonne National Laboratory (ANL) in the USA as the main axis, developed Globus. It has been noted, however, that quality of service (QoS) is not guaranteed when jobs are exchanged over the network in Globus. To tackle this problem, ANL developed the Globus Architecture for Reservation and Allocation (GARA). In this paper, we constructed a testbed for evaluating the resource reservation capability of the GARA system and implemented the GARA code required for it. We analyze the results and discuss future research plans.
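
The core idea behind GARA-style advance reservation, checking whether a requested bandwidth slot fits alongside existing reservations on a link, can be sketched generically as below. This is an illustrative model only, not GARA's actual API or data structures.

```python
# Generic sketch of an advance bandwidth-reservation admission check,
# illustrating the concept behind GARA-style QoS reservation (NOT the GARA API).
from dataclasses import dataclass

@dataclass
class Reservation:
    start: float      # seconds since epoch
    end: float
    bandwidth: float  # Mbps

def can_reserve(existing, request, link_capacity_mbps):
    """Conservatively admit `request` if bandwidth committed to reservations
    overlapping its window, plus the request, stays within link capacity."""
    overlapping = [r for r in existing
                   if r.start < request.end and request.start < r.end]
    committed = sum(r.bandwidth for r in overlapping)
    return committed + request.bandwidth <= link_capacity_mbps

existing = [Reservation(0, 3600, 300.0), Reservation(1800, 5400, 200.0)]
print(can_reserve(existing, Reservation(2000, 3000, 400.0), 1000.0))  # True
```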


Design of Algorithm Thinking-Based Software Basic Education for Nonmajors (비전공자를 위한 알고리즘씽킹 기반 소프트웨어 기초교육 설계)

  • PARK, So-Hyun
    • The Journal of Industrial Distribution & Business
    • /
    • v.10 no.11
    • /
    • pp.71-80
    • /
    • 2019
  • Purpose: The purpose of this study is to design a basic college software programming curriculum that develops creative and logical thinking. The course is guided by algorithmic and logical thinking for computational problem solving, and it helps non-majors develop software through basic programming education. Through the stages of problem analysis, abstraction, algorithm design, data structures, and algorithm implementation, the curriculum lets learners experience algorithmic problem solving in various areas so as to develop divergent thinking, and it aims at the balanced development of the divergent and convergent thinking needed for creative problem solving. Research design, data and methodology: This study designs a basic software education for improving algorithm thinking for non-majors. The curriculum targets non-major students who have completed the design-thinking-based 'Creative Thinking and Coding Course'. Content was extracted through an analysis of prior domestic and international research, and experts in computer education, computer engineering, SW education, and education were surveyed with a semi-open questionnaire. Results: Based on the algorithm-thinking stage of ADD Thinking, the colleges' majors were divided into five groups so that students of each major could achieve the goal of 'internalizing their own ideas into computing', and different content areas, content elements, and sub-components were extracted and designed for each group. Through three expert surveys, we established a differentiation strategy based on demand analysis and major/subject categories and verified that the subjects and contents of the curriculum are appropriate for each group for improving algorithm thinking. Conclusions: This study helps students in a variety of majors explore creative expression in various areas, such as 'how to think like a computer', so that they can implement and execute their ideas in computing. It also strengthens logical and algorithmic computational thinking based on creative solutions, improving problem-solving ability grounded in computational thinking, a fundamental understanding of computer coding, and the development of logical thinking through programming.

Implementation of big web logs analyzer in estimating preferences for web contents (웹 컨텐츠 선호도 측정을 위한 대용량 웹로그 분석기 구현)

  • Choi, Eun Jung;Kim, Myuhng Joo
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.8 no.4
    • /
    • pp.83-90
    • /
    • 2012
  • With the rapid growth of Internet infrastructure, the World Wide Web has gone beyond simple information sharing. It began by providing services such as e-business, remote control and management, and virtual services, and has recently evolved into services such as cloud computing and social network services. Communication over the Web has shifted toward user-centric, customized services rather than provider-centric information delivery. In this environment it is very important to inspect and analyze the requests users make to a website, and estimating user preferences is the most important part. Web log analysis is commonly used for this purpose, but it has the limitation that most of the analyzed data are page-level statistics, so user preferences cannot be evaluated from per-page statistics alone: the main content of recent web pages consists of media files such as images and of dynamic pages built with CSS, div, iframe, and similar techniques. In this paper, a large-scale log analyzer was designed and implemented to analyze web server logs and estimate users' preferences for web content. Using MapReduce on Hadoop, large logs were analyzed and preferences for media files such as images, sounds, and videos were estimated.
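
A minimal MapReduce-style sketch of the kind of analysis described here is shown below: it counts requests per media file from Apache-style access logs. The log format, file extensions, and the notion of "preference = request count" are assumptions for illustration, not the paper's exact implementation.

```python
# Sketch of a map/reduce pass (runnable locally, Hadoop Streaming style)
# that counts requests per media file from Apache-style access logs.
# Log format and "preference = request count" are illustrative assumptions.
import re
import sys

MEDIA = re.compile(r'"GET (\S+\.(?:jpg|png|gif|mp3|mp4)) HTTP', re.IGNORECASE)

def mapper(lines):
    # Emit (media_path, 1) for every request to a media file.
    for line in lines:
        m = MEDIA.search(line)
        if m:
            yield m.group(1), 1

def reducer(pairs):
    # Sum the counts per media path.
    counts = {}
    for path, n in pairs:
        counts[path] = counts.get(path, 0) + n
    return counts

if __name__ == "__main__":
    # Local simulation of the map and reduce phases over stdin.
    for path, count in sorted(reducer(mapper(sys.stdin)).items(),
                              key=lambda kv: -kv[1]):
        print(f"{path}\t{count}")
```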

Signal integrity analysis of system interconnection module of high-density server supporting serial RapidIO

  • Kwon, Hyukje;Kwon, Wonok;Oh, Myeong-Hoon;Kim, Hagyoung
    • ETRI Journal
    • /
    • v.41 no.5
    • /
    • pp.670-683
    • /
    • 2019
  • In this paper, we analyzed the signal integrity of a system interconnection module for a proposed high-density server. The proposed server integrates several components into a single chassis and can therefore access multiple computing resources. To support system interconnection among the highly integrated computing resources, an interconnection module based on Serial RapidIO was newly adopted; it supports a bandwidth of 800 Gbps while routing 160 differential signal traces. The module was designed with two different stack-up types on a printed circuit board: a 12-layer version (Version 1) and a 14-layer version (Version 2), with thicknesses of 1.5T and 1.8T, respectively. Version 1 places two consecutive high-speed signal layers between two power planes, whereas Version 2 places a single high-speed signal layer in the space between two power planes. To analyze the signal integrity of the module, we examined S-parameters, eye diagrams, and crosstalk voltages. The results show that Version 2 achieves better high-speed signal integrity than Version 1, even though its signal trace length is longer.

Design of Cloud-based Context-aware System Based on Falling Type

  • Kwon, TaeWoo;Lee, Jong-Yong;Jung, Kye-Dong
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.9 no.4
    • /
    • pp.44-50
    • /
    • 2017
  • Various behavior-recognition studies are under way to detect whether a fall, one of the main causes of injury, has occurred. Most of them, however, recognize only the fact that a fall happened and provide a service on that basis. Beyond the occurrence of a fall, the risk varies greatly depending on the type of fall and the situation before and after it. When a fall occurs, it is therefore necessary to infer the user's current situation and provide an appropriate service. In this paper, we design a context-aware system based on fog computing and cloud computing that analyzes behavior data and processes sensor data in real time. The system addresses the increased latency and server overload caused by large volumes of sensor data.
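
As a rough illustration of the kind of real-time sensor processing that might run at the fog layer in such a system, the sketch below flags a fall from accelerometer magnitudes and assigns a coarse fall type. The thresholds and the classification rule are hypothetical, not the paper's design.

```python
# Hypothetical fog-layer sketch: detect a fall from accelerometer samples
# and assign a coarse fall type; thresholds are illustrative only.
from math import sqrt

FREE_FALL_G = 0.4   # magnitude well below 1 g suggests free fall
IMPACT_G = 2.5      # a sharp spike suggests impact

def classify_fall(samples_g):
    """samples_g: list of (ax, ay, az) in units of g, in time order."""
    mags = [sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples_g]
    free_fall = any(m < FREE_FALL_G for m in mags)
    impact = any(m > IMPACT_G for m in mags)
    if free_fall and impact:
        return "hard fall"        # candidate event to forward to the cloud layer
    if impact:
        return "stumble/impact"
    return "no fall"

window = [(0.0, 0.0, 1.0), (0.1, 0.0, 0.2), (1.9, 1.2, 2.0), (0.0, 0.1, 1.0)]
print(classify_fall(window))  # -> "hard fall"
```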

Software Architecture for Implementing the Grid Computing of the High Availability Solution through Load Balancing (고가용성 솔루션 구축을 위한 그리드 측면에서의 소프트웨어 아키텍처를 통한 로드밸랜싱 구현)

  • Lee, Byoung-Yup;Park, Jun-Ho;Yoo, Jae-Soo
    • The Journal of the Korea Contents Association
    • /
    • v.11 no.3
    • /
    • pp.26-35
    • /
    • 2011
  • The Internet environment is developing very quickly, and online services have come to run mission-critical business around the world. As the amount of information processed by computers has increased, cluster computing systems built by connecting workstation servers over high-speed networks have been developed for high availability, but cluster computing technology is limited to a fixed pool of IT resources. Grid computing is an extension of distributed computing technology that provides low-cost, high-performance computing power in various fields. Although grid computing aims at large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation, it has so far been used like a conventional distributed computing environment such as a clustered computer, because grid middleware lacks a common, sharable information system. To use a grid computing environment consisting of various grid middleware efficiently, an application-independent information system is needed that can share information descriptions and services and be extended easily. This paper proposes a new database architecture and a load-balancing scheme for high availability based on grid technology.
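
A minimal sketch of one common load-balancing policy (least connections) that a high-availability front end might use to spread requests across replicated nodes is shown below. The node names and the policy are illustrative assumptions, not the architecture proposed in the paper.

```python
# Illustrative least-connections load balancer over replicated nodes;
# node names and policy are assumptions, not the paper's design.
class LeastConnectionsBalancer:
    def __init__(self, nodes):
        self.active = {node: 0 for node in nodes}

    def acquire(self):
        # Pick the node currently serving the fewest requests.
        node = min(self.active, key=self.active.get)
        self.active[node] += 1
        return node

    def release(self, node):
        self.active[node] -= 1

lb = LeastConnectionsBalancer(["db-node-1", "db-node-2", "db-node-3"])
first = lb.acquire()   # db-node-1
second = lb.acquire()  # db-node-2
lb.release(first)
print(lb.acquire())    # db-node-1 again, since it now has the fewest connections
```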

A Comparative Analysis of Domestic and Foreign Docker Container-Based Research Trends (국내·외 도커 컨테이너 기반 연구 동향 비교 분석)

  • Bae, Sun-Young
    • The Journal of the Korea Contents Association
    • /
    • v.22 no.10
    • /
    • pp.742-753
    • /
    • 2022
  • Cloud computing, which is growing rapidly as one of the core technologies of the 4th industrial revolution, has become the center of change in global IT trends, and Docker, a container-based open-source platform, is the mainstream virtualization technology for cloud computing. This paper therefore compares and analyzes Docker-container-based research trends, focusing on studies published from March 2013 to July 2022. The results are as follows. First, the number of papers published per year has been increasing steadily in both domestic and foreign research. Second, regarding keywords, domestic research used Docker, Docker Containers, and Containers most frequently, in that order, while foreign research used Cloud Computing, Containers, and Edge Computing. Third, in the frequency of publishing venues, used to estimate research trends, the Korean Next Generation Computer Society and the Korean Computer Accounting Society had the highest share with two papers each in domestic research, while IEEE Communications Surveys & Tutorials, IEEE Access, and Computer led in foreign research. Fourth, regarding research methods, domestic research included 78 experiments (26.3%) and 32 surveys (10.8%), while foreign research included 128 experiments (43.1%) and 59 surveys (19.9%). Among implementation-oriented experiments, domestic research covered systems with 25 papers (8.4%), algorithms with 24 (8.1%), and performance measurement and improvement with 16 (5.4%), while foreign research covered algorithms with 37 (12.5%), performance measurement and improvement with 17 (9.1%), and frameworks with 26 (8.8%). These findings are expected to serve as basic data for guiding the direction of Docker-container-based cloud computing research in terms of research methods, topics, fields, and technology development.