• Title/Summary/Keyword: -storage data providing


Comparative review of muscle fiber characteristics between porcine skeletal muscles

  • Junyoung Park;Sung Sil Moon;Sumin Song;Huilin Cheng;Choeun Im;Lixin Du;Gap-Don Kim
    • Journal of Animal Science and Technology
    • /
    • v.66 no.2
    • /
    • pp.251-265
    • /
    • 2024
  • Meat derived from the skeletal muscles of animals is a highly nutritious food, and different meat types differ in nutritional, sensory, and quality properties. This study was conducted to compare the results of previous studies on the muscle fiber characteristics of major porcine skeletal muscles, with the aim of providing basic data for understanding differences in physicochemical and nutritional properties between porcine muscle types (or meat cuts). Specifically, the muscle fiber characteristics of 19 major porcine skeletal muscles were compared. The muscle fibers that constitute porcine skeletal muscle can be classified into several types based on their contractile and metabolic characteristics. In addition, the muscle fiber characteristics, including size, composition, and density, of each muscle type were investigated, and technologies based on these characteristics for improving meat quality or preventing quality deterioration were briefly discussed. This comparative review revealed that differences in muscle fiber characteristics are primarily responsible for the differences in quality between pork cuts (muscle types), and it suggested that data on muscle fiber characteristics can be used to develop optimal meat storage and packaging technologies for each meat cut (or muscle type).

Development of Information Technology Infrastructures through Construction of Big Data Platform for Road Driving Environment Analysis (도로 주행환경 분석을 위한 빅데이터 플랫폼 구축 정보기술 인프라 개발)

  • Jung, In-taek;Chong, Kyu-soo
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.3
    • /
    • pp.669-678
    • /
    • 2018
  • This study developed information technology infrastructures for building a driving environment analysis platform using various big data, such as vehicle sensing data and public data. First, a small platform server with a parallel structure for distributed big data processing was developed on the hardware side. Next, programs for big data collection/storage, processing/analysis, and information visualization were developed on the software side. The collection software was developed as a collection interface using Kafka, Flume, and Sqoop. The storage software was divided into the Hadoop distributed file system and Cassandra DB according to how the data are used. The processing software performs spatial-unit matching and time-interval interpolation/aggregation of the collected data by applying a grid index method. The analysis software was developed as an analytical tool based on the Zeppelin notebook for applying and evaluating the developed algorithms. Finally, the information visualization software was developed as a Web GIS engine program for providing and visualizing various driving environment information. The performance evaluation derived the optimal number of executors, memory capacity, and number of cores for the development server, and its computation performance was superior to that of other cloud computing services.
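The spatial-unit matching and time-interval aggregation described in this abstract can be sketched roughly as follows. The cell size, interval length, and record layout here are illustrative assumptions for the grid index idea, not the paper's actual parameters.

```python
# Minimal sketch of grid-index matching plus time-interval aggregation:
# each sensed record is keyed by (grid cell, interval) and averaged.
from collections import defaultdict

CELL = 0.001      # grid cell size in degrees (assumed)
INTERVAL = 300    # aggregation interval in seconds (assumed)

def grid_key(lat, lon):
    """Map a coordinate onto its grid-cell index."""
    return (int(lat / CELL), int(lon / CELL))

def aggregate(records):
    """Average sensed speed per (grid cell, time interval)."""
    buckets = defaultdict(list)
    for ts, lat, lon, speed in records:
        buckets[(grid_key(lat, lon), ts // INTERVAL)].append(speed)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

records = [
    (10, 37.5001, 127.0002, 60.0),
    (20, 37.5002, 127.0003, 70.0),   # same cell, same interval
    (400, 37.5001, 127.0002, 50.0),  # same cell, next interval
]
print(aggregate(records))
```

Bucketing by integer cell index keeps the lookup O(1) per record, which is why grid indexing suits distributed processing of large point streams.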

A Property-Based Data Sealing using the Weakest Precondition Concept (최소 전제조건 개념을 이용한 성질 기반 데이터 실링)

  • Park, Tae-Jin;Park, Jun-Cheol
    • Journal of Internet Computing and Services
    • /
    • v.9 no.6
    • /
    • pp.1-13
    • /
    • 2008
  • Trusted Computing is a hardware-based technology that aims to guarantee security for machines beyond their users' control by providing security in both computing hardware and software. The TPM (Trusted Platform Module), the trusted platform specified by the Trusted Computing Group, acts as the root of trust for trusted data storage and trusted reporting of the platform configuration. Data sealing encrypts secret data with a key and the platform's configuration at the time of encryption. In contrast to traditional data sealing based on binary hash values of the platform configuration, a new approach called property-based data sealing was recently suggested. In this paper, we propose and analyze a new property-based data sealing protocol using Dijkstra's weakest precondition concept. The proposed protocol resolves the problem of system updates by allowing sealed data to be unsealed under any configuration that provides the required property. It assumes only practically implementable trusted third parties and protects the platform's privacy during communication. We demonstrate the proposed protocol's operability with any TPM chip by implementing and running the protocol on Strasser's software TPM emulator. The proposed scheme can be deployed on PDAs and smart phones over wireless mobile networks as well as on desktop PCs.
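The contrast this abstract draws between binary and property-based sealing can be illustrated conceptually. This is not a real TPM interface and not the paper's protocol; the predicate check simply stands in for "the configuration provides the required property".

```python
# Conceptual contrast: binary sealing binds data to one exact configuration
# hash, so any system update breaks unsealing; property-based sealing binds
# data to a property that many valid configurations may satisfy.
import hashlib

def binary_unseal(sealed_hash, config):
    """Binary sealing: only the exact configuration at seal time unseals."""
    return hashlib.sha256(config.encode()).hexdigest() == sealed_hash

def property_unseal(required_property, config_properties):
    """Property-based sealing: any configuration that still provides
    the required property can unseal, surviving system updates."""
    return required_property in config_properties

sealed = hashlib.sha256(b"os-v1.0").hexdigest()
print(binary_unseal(sealed, "os-v1.1"))                         # False: update breaks it
print(property_unseal("secure-boot", {"secure-boot", "v1.1"}))  # True: property preserved
```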


Utilization of Database in 3D Visualization of Remotely Sensed Data (원격탐사 영상의 3D 시각화와 데이터베이스의 활용)

  • Jung, Myung-Hee;Yun, Eui-Jung
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.45 no.3
    • /
    • pp.40-46
    • /
    • 2008
  • 3D visualization of geological environments using remotely sensed data and various other data sources provides a new methodology for interpreting geological observation data and analyzing geo-information in earth science applications. It enables understanding of spatio-temporal relationships and causal processes in three dimensions, which would be difficult to identify without 3D representation. To build more realistic geological environments, which are useful for recognizing the spatial characteristics and relationships of geological objects, 3D modeling, topological analysis, and databases should be coupled and taken into consideration in an integrated system configuration. In this study, a method for 3D visualization, extraction of geological data, and storage and data management using remotely sensed data is proposed, with the goal of providing a methodology for dynamic spatio-temporal modeling and simulation in three dimensions for geoscience and earth science applications.

Structuring of unstructured big data and visual interpretation (부산지역 교통관련 기사를 이용한 비정형 빅데이터의 정형화와 시각적 해석)

  • Lee, Kyeongjun;Noh, Yunhwan;Yoon, Sanggyeong;Cho, Youngseuk
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.6
    • /
    • pp.1431-1438
    • /
    • 2014
  • We analyzed articles from "Kukje Shinmun" and "Busan Ilbo", two local newspapers of Busan Metropolitan City, covering January 1, 2013 to December 31, 2013. Meaningful patterns inherent in the 2,889 articles whose titles include "Busan" and "traffic", together with related data, were analyzed. Text mining, a branch of data mining, was used for the social network analysis (SNA). HDFS and MapReduce (from the Hadoop ecosystem), open-source frameworks based on Java, were used in a Linux environment (Ubuntu 12.04 LTS) for structuring the unstructured data and for the storage, processing, and analysis of the big data. We implemented a new algorithm that produces better visualization than the default one from the R package by setting the color and thickness of each node and connecting line based on its weight.
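The weight-to-style mapping mentioned at the end of this abstract can be sketched as a simple linear scaling. The function name and the scaling rule are illustrative assumptions, not the authors' R implementation.

```python
# Sketch of mapping edge weights (e.g. keyword co-occurrence counts) onto
# drawing thickness, so heavier connections render as thicker lines.
def edge_styles(edges, min_width=0.5, max_width=5.0):
    """Linearly scale each edge weight onto a line width."""
    weights = [w for _, _, w in edges]
    lo, hi = min(weights), max(weights)
    span = (hi - lo) or 1  # avoid division by zero for uniform weights
    return {
        (a, b): min_width + (w - lo) / span * (max_width - min_width)
        for a, b, w in edges
    }

edges = [("Busan", "traffic", 40), ("traffic", "accident", 10), ("Busan", "subway", 25)]
print(edge_styles(edges))
```

The same scaling can feed a color map instead of a width, which is how weight-based color and thickness are typically combined in network drawings.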

Improving Efficiency of Encrypted Data Deduplication with SGX (SGX를 활용한 암호화된 데이터 중복제거의 효율성 개선)

  • Koo, Dongyoung
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.11 no.8
    • /
    • pp.259-268
    • /
    • 2022
  • With the widespread use of cloud services to improve management efficiency amid the explosive increase in data volume, various cryptographic techniques are being applied in order to preserve data privacy. In spite of the vast computing resources of cloud systems, the decrease in storage efficiency caused by redundancy among data outsourced from multiple users significantly reduces service efficiency. Among several approaches to privacy-preserving data deduplication over encrypted data, this paper analyzes the research results on improving the efficiency of encrypted data deduplication using a trusted execution environment (TEE), published at a recent USENIX ATC, in terms of the security and efficiency of the participating entities. We present a way to improve the stability of the key-managing server by integrating it with individual clients, resulting in secure deduplication without independent key servers. The experimental results show that the communication efficiency of the proposed approach can be improved by about 30%, with the effect of a distributed key server, while providing security guarantees as robust as those of the previous research.
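Deduplication over encrypted data, as discussed in this abstract, commonly builds on message-locked (convergent) encryption: the key is derived from the content itself, so identical plaintexts from different users yield identical ciphertexts that the server can deduplicate. The XOR keystream below is purely illustrative and not a secure cipher; this sketches the general principle, not the paper's SGX-based scheme.

```python
# Toy sketch of convergent encryption plus a server-side dedup index.
import hashlib

def convergent_encrypt(data: bytes) -> bytes:
    key = hashlib.sha256(data).digest()              # message-locked key
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

store = {}  # ciphertext-hash -> ciphertext (the dedup index)

def upload(data: bytes) -> bool:
    """Store a ciphertext; return True if it was deduplicated."""
    ct = convergent_encrypt(data)
    tag = hashlib.sha256(ct).hexdigest()
    if tag in store:
        return True                                  # duplicate: nothing stored
    store[tag] = ct
    return False

print(upload(b"report-2022"))  # False: first copy stored
print(upload(b"report-2022"))  # True: duplicate detected without seeing plaintext
```

The key-server and TEE machinery in the analyzed work exists to harden exactly this scheme against brute-force guessing of low-entropy plaintexts.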

Secure and Scalable Blockchain-Based Framework for IoT-Supply Chain Management Systems

  • Omimah, Alsaedi;Omar, Batarfi;Mohammed, Dahab
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.12
    • /
    • pp.37-50
    • /
    • 2022
  • Modern supply chains include multiple activities, from collecting raw materials to transferring final products. These activities involve many parties who share a huge amount of valuable data, which makes managing supply chain systems a challenging task. Current supply chain management (SCM) systems adopt digital technologies such as the Internet of Things (IoT) and blockchain for optimization purposes. Although these technologies can significantly enhance SCM systems, they have their own limitations that directly affect SCM systems. Security, performance, and scalability are essential components of SCM systems, yet confidentiality and scalability are among blockchain's main limitations. Moreover, IoT devices are lightweight and have limited power and storage. These limitations should be considered when developing blockchain-based IoT-SCM systems. In this paper, the requirements of efficient supply chain systems are analyzed, and the role of both IoT and blockchain technologies in meeting each requirement is discussed. The limitations of blockchain and the challenges of IoT integration are investigated, the limitations of the current literature in the field are identified, and a secure and scalable blockchain-based IoT-SCM system is proposed. The proposed solution employs the Hyperledger Fabric blockchain platform and tackles confidentiality by implementing private data collections, achieving confidentiality without decreasing performance. Moreover, the proposed framework integrates IoT data to stream live data without consuming the devices' limited resources and implements a dual-storage model to support supply chain scalability. The proposed framework is evaluated in terms of security, throughput, and latency. The results demonstrate that the proposed framework maintains the confidentiality, integrity, and availability of on-chain and off-chain supply chain data. It achieved better performance through 31.2% and 18% increases in read and write operation throughput, respectively, and decreased write operation latency by 83.3%.
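The dual-storage idea behind such frameworks can be sketched independently of any blockchain platform: bulky IoT payloads live off-chain while only their hash is anchored on-chain, so integrity remains verifiable. The structures and names below are illustrative assumptions, not the paper's Hyperledger Fabric implementation.

```python
# Sketch of an on-chain/off-chain split: the ledger keeps small integrity
# records, the off-chain store keeps the bulky sensor payloads.
import hashlib, json

ledger = []       # on-chain: (id, hash) integrity records
off_chain = {}    # off-chain: bulky payload blobs

def record(shipment_id, payload: dict):
    """Store the payload off-chain and anchor its hash on-chain."""
    blob = json.dumps(payload, sort_keys=True).encode()
    off_chain[shipment_id] = blob
    ledger.append({"id": shipment_id, "hash": hashlib.sha256(blob).hexdigest()})

def verify(shipment_id) -> bool:
    """Recompute the off-chain payload's hash and compare to the ledger."""
    blob = off_chain[shipment_id]
    expected = next(e["hash"] for e in ledger if e["id"] == shipment_id)
    return hashlib.sha256(blob).hexdigest() == expected

record("SHIP-1", {"temp": 4.2, "ts": 1660000000})
print(verify("SHIP-1"))                 # True: payload matches its anchor
off_chain["SHIP-1"] = b'{"temp": 99}'   # tamper with the off-chain copy
print(verify("SHIP-1"))                 # False: tampering is detected
```

Keeping only the hash on-chain is what lets such designs scale: ledger growth is bounded per transaction regardless of payload size.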

Improvement Strategies of Agro-Value Chain for Agricultural Development in Developing Countries: The Case of Cambodia (개도국 농업발전을 위한 농산물 가치사슬 개선 전략: 캄보디아 사례를 중심으로)

  • Kim, Dong-Hwan
    • Journal of Distribution Science
    • /
    • v.14 no.4
    • /
    • pp.127-134
    • /
    • 2016
  • Purpose - Value chain in agriculture refers to the direct and indirect activities in the value-adding process from raw materials to final products in agricultural industries. In recent years, value chain analysis has become more important in the area of agricultural development. This article reviews the concept and importance of value chain analysis in the context of agricultural development and attempts to suggest improvement strategies. Research design, data, methodology - A literature survey was conducted on value chain analysis for agricultural development. The case of the agro-value chain in Cambodia was analyzed in depth based upon interviews with government officers and related experts. Results - Agro-value chains in developing countries are not well developed and do not carry out appropriate functions compared to those in developed countries. Because value-adding facilities, such as storage, processing and packing plants, and milling plants, are not sufficiently constructed, the quality of agricultural products is low. In particular, developing countries may lose opportunities to increase the value of their products by exporting their agricultural products as raw materials to neighboring countries. The value-adding process is also mainly controlled by traders in local markets or wholesale markets in urban areas. Farmers therefore get a lower share of the final value of agricultural products compared to the shares paid to traders. Lastly, it is argued that governments of developing countries do not play an active role in developing value chains and do not carry out coordinating functions in an effective and efficient manner. Conclusions - The first step to improving agro-value chains in developing countries is to identify and analyze the value chain structure of agricultural products and to formulate development strategies and implementation programs. Improving the value chain of agricultural products in developing countries requires not only plans for constructing hardware, such as wholesale markets, storage facilities, and processing and packing plants, but also plans for improving software, such as measures for improving product quality and safety, setting up grades and standards, providing market information, and nurturing producer cooperatives.

State of Information Technology and Its Application in Agricultural Meteorology (농업기상활용 정보기술 현황)

  • Byong-Lyol Lee;Dong-Il Lee
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.6 no.2
    • /
    • pp.118-126
    • /
    • 2004
  • Grid is a new Information Technology (IT) concept of a "super Internet" for high-performance computing: worldwide collections of high-end resources such as supercomputers, storage, advanced instruments, and immersive environments. The Grid is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, real-time data sources and instruments, and human collaborators. The term "the Grid" was coined in the mid-1990s to denote a proposed distributed computing infrastructure for advanced science and engineering. The term "computational Grids" refers to infrastructures aimed at allowing users to access and/or aggregate potentially large numbers of powerful and sophisticated resources. More formally, Grids are defined as infrastructure allowing flexible, secure, and coordinated resource sharing among dynamic collections of individuals, institutions, and resources referred to as virtual organizations. Grid is an emerging IT, a kind of next-generation Internet technology, that will fit very well with agrometeorological services in the future. I believe that it would contribute to resource sharing in agrometeorology by providing supercomputing power, virtual storage, and efficient data exchange, especially for developing countries that suffer from a lack of resources for their agrometeorological services at the national level. Thus, the establishment of a CAgM-GRID based on the existing RADMINSII is proposed as a part of the FWIS of the WMO.

Non-Disruptive Server Management for Sustainable Resource Service Based on On-Premise (온-프레미스 기반 지속적인 자원 서비스를 위한 서버 무중단 기법)

  • Kim, Hyun-Woo
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.7 no.12
    • /
    • pp.295-300
    • /
    • 2018
  • With the rapid development of IT, many conventional manual jobs have been automated. This automation increases the leisure time of many people, and various services are being developed for them. In addition, with the advent of compact and portable smart devices, various Internet services can be used regardless of time and place. Various studies based on virtualization are under way to efficiently store and process the large volumes of data generated by many devices and services. Desktop Storage Virtualization (DSV), which integrates on-premise-based distributed desktop resources and provides them to users, uses virtualization to consolidate unused resources within distributed, legacy desktops. Such DSV is very important for providing high reliability to users. In addition, research on hierarchical structures and resource integration for efficient distributed data storage and processing in a distributed desktop-based resource integration environment is under way. However, there is a lack of research on efficient operation in case of server failure in an on-premise resource integration environment. In this paper, we propose Non-disruptive Server Management (NSM), which can actively cope with desktop server failures in an on-premise-based distributed desktop storage environment. NSM makes it easy to add and remove desktops in a desktop-based integrated environment, and failover to an alternative server is actively performed when a failure occurs.
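The active failover to an alternative server described in this abstract is typically driven by heartbeat monitoring. The class, timeout, and server names below are illustrative assumptions, not the NSM design itself.

```python
# Minimal sketch of heartbeat-driven failover: when the active server
# misses its heartbeat deadline, the next fresh standby takes over.
import time

class FailoverPool:
    def __init__(self, servers, timeout=3.0):
        self.servers = servers          # ordered: primary first, then standbys
        self.timeout = timeout
        self.last_beat = {s: time.monotonic() for s in servers}

    def heartbeat(self, server):
        """Called periodically by each live server."""
        self.last_beat[server] = time.monotonic()

    def active(self):
        """Return the first server whose heartbeat is still fresh."""
        now = time.monotonic()
        for s in self.servers:
            if now - self.last_beat[s] <= self.timeout:
                return s
        return None                     # no server available

pool = FailoverPool(["desktop-1", "desktop-2"])
print(pool.active())                    # "desktop-1"
pool.last_beat["desktop-1"] -= 10       # simulate a missed heartbeat
print(pool.active())                    # "desktop-2": standby takes over
```

Ordering standbys in a fixed list keeps the takeover decision deterministic, which matters when several nodes observe the failure concurrently.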