• Title/Summary/Keyword: Store Size

The Product Information in Online Jeans Shopping by Consumers' Evaluation Criteria

  • Choi, Eun-Ha; Chun, Jong-Suk
    • The International Journal of Costume Culture / v.13 no.1 / pp.42-50 / 2010
  • The purpose of this study was to identify differences in evaluation criteria and in the use of product information among consumers of jeans products. The participants were women aged 19 to 30 years. The study was conducted as a descriptive survey using questionnaires, and a total of 182 questionnaires were analyzed. The subjects were grouped by their evaluation criteria for purchasing jeans through online shopping. The findings showed that Group 1 was a high-involvement group: its members were conscious of both the style features and the practicality of jeans when buying them. Group 2 was a low-involvement group and was not conscious of those features. The important purchase factors differed between the groups. For Group 1, the most important factors were fashion trend and practicality; for Group 2, price was the most important factor, and its members bought jeans at extremely low or high prices, whereas Group 1 bought jeans across a diverse price range. The preferred shopping sites also differed between the two groups. The department store was the most important place for purchasing jeans for both groups; the second most important place was specialty stores for Group 1 and online shopping for Group 2. The usefulness of product information when evaluating jeans in online shopping was also examined. The most useful types of product information were leg-cut style and rise length. Fit information was very important for Group 1, which also considered the picture zoom function important and found material characteristics and brand name more useful than Group 2 did. Size and care instructions, however, were not highly useful.

A Transformation Technique of XML DTD to Relational Database Schema Based On Extracting Common Structure in XML Documents (공통 문서 구조 추출을 통한 XML DTD의 관계형 데이터 베이스 스키마 변환 기법)

  • Ahn, Sung-Eun; Choi, Hwang-Kyu
    • The KIPS Transactions: Part D / v.9D no.6 / pp.999-1008 / 2002
  • XML is emerging as a standard data format for exchanging and presenting data on the Web, and there is a growing need to store and query XML data efficiently. In this paper, we propose a new schema transformation algorithm based on a technique for extracting the common structure of XML documents. The common structure is shared by all XML documents that reference a DTD, while the uncommon structure appears non-uniformly across those documents. Based on the extracted common and uncommon structures, we transform the XML DTD into a relational database schema. We conduct a performance evaluation in terms of the number of generated tables, record size, query processing time, and the number of joins per query. Compared with existing algorithms, our algorithm performs better in most cases with respect to the number of generated tables and the occurrence of NULL values in the tables.
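The abstract describes the transformation only at a high level; the following minimal Python sketch illustrates the general idea under stated assumptions: elements shared by every sample document become columns of one root table, while elements that appear non-uniformly become separate child tables. The element names, the intersection-based split, and the table layout are illustrative assumptions, not the authors' algorithm.

```python
# Minimal sketch: split DTD-declared elements into "common" (present in every
# sample document) and "uncommon" (present only in some), then map the common
# part to one wide table and each uncommon part to its own child table.
# All names and mapping rules below are illustrative assumptions.

from typing import List, Set, Tuple

def split_structure(dtd_elements: Set[str], documents: List[Set[str]]) -> Tuple[Set[str], Set[str]]:
    """Return (common, uncommon) element sets for documents sharing one DTD."""
    common = set.intersection(*documents) & dtd_elements
    uncommon = dtd_elements - common
    return common, uncommon

def to_relational_schema(root: str, common: Set[str], uncommon: Set[str]) -> List[str]:
    """Emit CREATE TABLE statements: common elements as columns of the root
    table, each uncommon element as a child table keyed to the root."""
    ddl = []
    cols = "".join(f", {c} TEXT" for c in sorted(common))
    ddl.append(f"CREATE TABLE {root} (id INTEGER PRIMARY KEY{cols});")
    for elem in sorted(uncommon):
        ddl.append(
            f"CREATE TABLE {root}_{elem} "
            f"(id INTEGER PRIMARY KEY, {root}_id INTEGER REFERENCES {root}(id), value TEXT);"
        )
    return ddl

if __name__ == "__main__":
    dtd = {"title", "author", "price", "review"}
    docs = [{"title", "author", "price"}, {"title", "author", "review"}]
    common, uncommon = split_structure(dtd, docs)
    for stmt in to_relational_schema("book", common, uncommon):
        print(stmt)
```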

Memory Access Reduction Scheme for H.264/AVC Decoder Motion Compensation (H.264/AVC 디코더의 움직임 보상을 위한 메모리 접근 감소 기법)

  • Park, Kyoung-Oh; Hong, You-Pyo
    • The Journal of Korean Institute of Communications and Information Sciences / v.34 no.4C / pp.349-354 / 2009
  • In this paper, we propose a new motion compensation scheme that reduces external memory access frequency, one of the major bottlenecks for real-time decoding. Most H.264/AVC decoders store reference pictures in external memory because of their large size, and reference blocks are read into the decoder core as needed during decoding. If reference data are fetched separately for each reference block in decoding order, the required memory bandwidth can become unacceptable for real-time decoding. This paper presents a memory access scheme for motion compensation that reads as much reference data as possible per access by analyzing the reference data access pattern of each macroblock, thereby reducing memory access frequency. Experimental results show that the proposed motion compensation scheme improves the memory bandwidth requirement by approximately 30%.
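The paper's scheme is only summarized above; as a rough illustration of merging reference reads, the Python sketch below fetches the bounding rectangle covering all reference blocks requested by one macroblock in a single modeled access instead of one access per block. The coordinates and block sizes are illustrative assumptions, and this is not the authors' exact access-pattern analysis.

```python
# Minimal sketch: merge the reference regions requested by a macroblock's
# partitions into one bounding rectangle and fetch it with a single burst,
# instead of issuing one external-memory read per reference block.

from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in the reference frame

def merged_fetch_region(requests: List[Rect]) -> Rect:
    """Bounding rectangle covering all reference-block requests of one macroblock."""
    x0 = min(x for x, y, w, h in requests)
    y0 = min(y for x, y, w, h in requests)
    x1 = max(x + w for x, y, w, h in requests)
    y1 = max(y + h for x, y, w, h in requests)
    return (x0, y0, x1 - x0, y1 - y0)

def access_counts(requests: List[Rect]) -> Tuple[int, int]:
    """Compare naive per-block accesses with a single merged access."""
    return len(requests), 1

if __name__ == "__main__":
    # Four 8x8 partitions of one macroblock pointing at nearby reference areas.
    reqs = [(100, 40, 8, 8), (104, 40, 8, 8), (100, 48, 8, 8), (105, 47, 8, 8)]
    print("merged region:", merged_fetch_region(reqs))
    print("accesses (naive vs merged):", access_counts(reqs))
```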

Compression of Elemental Images Using Block Division in 3D Integral Imaging (3D 집적 영상에서 영역 분할을 이용한 요소 영상의 압축 기법)

  • Kang, Ho-Hyun; Shin, Dong-Hak; Kim, Eun-Soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.34 no.3C / pp.297-303 / 2009
  • Integral imaging is a well-known 3D image recording and display technique. The huge size of integral imaging data requires a compression scheme for storing and transmitting 3D scenes. In conventional compression schemes, the data volume of the elemental images depends on the recording conditions, such as the position of the 3D object, the illumination, and the specifications of the lenslet array, even when an identical pickup system is used. In this paper, to reduce the dependence of the elemental images' characteristics on the pickup conditions, we propose a compression scheme that applies block division to the elemental images of integral imaging. The proposed scheme improves the compression ratio by exploiting the local similarity of elemental images picked up from three-dimensional objects at different positions. To test the proposed scheme, various elemental images were picked up and then compressed using standard MPEG-4. Based on the resulting compression ratios, the proposed compression scheme improves on the conventional compression method by approximately 9%.
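As a rough sketch of block division and similarity-driven ordering, the Python below splits an elemental-image plane into tiles and greedily orders them so that neighboring tiles in the sequence are similar before they would be handed to a standard codec such as MPEG-4. The block size, similarity metric, and ordering heuristic are illustrative assumptions, not the paper's method.

```python
# Minimal sketch, assuming a NumPy array holds the elemental-image plane:
# divide it into equal blocks and order them by a simple similarity measure
# so that consecutive "frames" compress well in a standard video codec.

import numpy as np

def divide_into_blocks(plane: np.ndarray, block: int):
    """Split a 2D elemental-image plane into non-overlapping block x block tiles."""
    h, w = plane.shape
    tiles = []
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            tiles.append(((y, x), plane[y:y + block, x:x + block]))
    return tiles

def order_by_similarity(tiles):
    """Greedy ordering: always append the tile closest (in mean absolute
    difference) to the previously emitted one."""
    remaining = tiles[1:]
    ordered = [tiles[0]]
    while remaining:
        prev = ordered[-1][1].astype(np.int32)
        idx = min(range(len(remaining)),
                  key=lambda i: np.mean(np.abs(remaining[i][1].astype(np.int32) - prev)))
        ordered.append(remaining.pop(idx))
    return ordered

if __name__ == "__main__":
    plane = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
    tiles = divide_into_blocks(plane, 16)
    sequence = order_by_similarity(tiles)
    print("tiles:", len(tiles), "first tile position:", sequence[0][0])
```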

SOAR : Storage Reliability Analyzer (SOAR : 저장장치를 기반으로 하는 시스템의 신뢰성 분석도구 개발)

  • Kim, Young-Jin; Won, You-Jip; Kim, Ra-Kie
    • Journal of KIISE: Computer Systems and Theory / v.35 no.6 / pp.248-262 / 2008
  • As the number of large multimedia files increases and the importance of individuals' digital data grows, storage devices have advanced to store more data in smaller spaces. In such circumstances, physical damage to a storage device can destroy a large amount of important data, so the robustness of a storage device against various physical faults needs to be verified before a system is deployed. We developed SOAR (Storage Reliability Analyzer) to detect physical faults in various kinds of HDD hardware components and to recover systems from those faults; it is a useful tool for verifying the robustness and reliability of a disk. SOAR uses three unique methods of creating physical damage on a disk and two unique techniques to apply the same faults at the file system level. In this paper, we performed comprehensive tests to verify the robustness and reliability of storage devices with SOAR, and the verification results confirm that SOAR is a very efficient tool.
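SOAR itself damages real HDD hardware components, which cannot be reproduced here; the Python sketch below instead simulates the general idea of fault injection and detection on a file-backed disk image, overwriting chosen sectors and checking which sector checksums no longer match. The sector size, fault positions, and checksum scheme are illustrative assumptions, not SOAR's techniques.

```python
# Minimal sketch of simulated disk fault injection on a file-backed image:
# flip chosen "sectors" to random bytes, then report which sectors no longer
# match the checksums recorded before injection.

import hashlib
import os
import random

SECTOR = 512  # bytes, an illustrative assumption

def sector_checksums(path: str):
    """Record a checksum per sector so corruption can be detected later."""
    sums = []
    with open(path, "rb") as f:
        while chunk := f.read(SECTOR):
            sums.append(hashlib.sha256(chunk).hexdigest())
    return sums

def inject_faults(path: str, sector_indices):
    """Simulate physical damage by overwriting the chosen sectors with noise."""
    with open(path, "r+b") as f:
        for idx in sector_indices:
            f.seek(idx * SECTOR)
            f.write(os.urandom(SECTOR))

def detect_faults(path: str, baseline):
    """Return indices of sectors whose contents changed since the baseline."""
    return [i for i, s in enumerate(sector_checksums(path)) if s != baseline[i]]

if __name__ == "__main__":
    image = "disk.img"
    with open(image, "wb") as f:
        f.write(os.urandom(SECTOR * 64))   # a tiny 32 KiB disk image
    baseline = sector_checksums(image)
    faulty = random.sample(range(64), 3)
    inject_faults(image, faulty)
    print("injected:", sorted(faulty), "detected:", detect_faults(image, baseline))
```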

Life Cycle Assessment of Activated Carbon Production System by Using Poplar (포플러를 이용한 활성탄 제조 시스템에 대한 전과정 평가)

  • Kim, Mihyung; Kim, Geonha
    • Journal of Korean Society of Environmental Engineers / v.36 no.11 / pp.725-732 / 2014
  • Phytoremediation is a technology that uses plants to mitigate pollutant concentrations, such as metals, pesticides, solvents, and oils, in contaminated water and soil. The plants absorb contaminants through their roots and store them in the roots, stems, or leaves. Fast-growing trees such as poplar are used to remove low concentrations of contaminants over wide contaminated areas in an eco-friendly and economical way. This study evaluated an activated carbon production system that uses poplar wood discarded after phytoremediation. Life cycle assessment methodology was used to analyze the environmental impacts of the system, with one ton of harvested poplar as the functional unit. The results indicate that a small rotary kiln producing activated carbon from poplar wood has an environmental benefit under operating conditions optimized to minimize energy consumption. The avoided environmental impact analysis shows that the system reduces environmental impacts compared with activated carbon production from coconut shell.

A Study on the Developing of Big Data Services in Public Library (도서관 빅데이터 서비스 모형 개발에 관한 연구: 공공도서관을 중심으로)

  • Pyo, Soon Hee; Kim, Yun Hyung; Kim, Hye Sun; Kim, Wan Jong
    • Journal of the Korean Society for Information Management / v.32 no.2 / pp.63-86 / 2015
  • Big data refers to datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze, and it is now considered to create new opportunities in every industry. The purpose of this study is to develop big data services for public libraries in order to improve library services. To this end, we analyzed the types of library big data and the needs of stakeholders through various methods such as in-depth interviews, focus group interviews, and questionnaires. In the first step, we defined 16 big data service models from interviews with librarians and LIS professionals. In the second step, the models were assessed for necessity, timeliness, and feasibility of development. We developed two final services, 'Decision Support Services for Public Librarians' and 'Book Recommendation Services for Users.'

The Estimation of Temperature distribution around Gas Storage Cavern (저온가스 저장공동 주위암반의 온도분포 예측에 관한 연구)

  • Lee, Yang; Lee, Seung-Do; Moon, Hyun-Koo
    • Tunnel and Underground Space / v.14 no.1 / pp.16-25 / 2004
  • Underground caverns have advantages such as safety and ease of operation, so they can also be used for gas storage. When liquefied gas is stored underground, the cryogenic temperature of the gas affects the stability of the storage cavern. To store liquefied gas successfully, it is essential to estimate the exact temperature distribution of the rock mass around the cavern. The main purpose of this study is to develop a theoretical solution for estimating the temperature distribution around storage caverns and to assess that solution. In this study, a theoretical solution and a conceptual model for estimating the two- and three-dimensional temperature distribution around storage caverns are presented. Based on multi-dimensional transient heat transfer theory, the theoretical solution is derived by approximating the cavern shape with simplified geometry. To assess the theoretical solution, numerical experiments with the multi-dimensional model are performed, and the temperature distribution from the theoretical solution is compared with that from numerical analysis. Furthermore, the effects of cavern size are investigated.
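The abstract does not restate the governing equations; for orientation, the block below gives the standard transient heat conduction equation that multi-dimensional solutions of this kind are built on, together with the common product-solution form used for simplified geometries. These are textbook relations stated as context, not the authors' specific derivation.

```latex
% Transient heat conduction in the rock mass (constant properties assumed):
%   T temperature, t time, \alpha = k / (\rho c_p) thermal diffusivity.
\[
  \frac{\partial T}{\partial t} = \alpha \nabla^2 T
  = \alpha \left( \frac{\partial^2 T}{\partial x^2}
                + \frac{\partial^2 T}{\partial y^2}
                + \frac{\partial^2 T}{\partial z^2} \right)
\]
% A standard way to build multi-dimensional transient solutions from
% one-dimensional ones for simplified geometries is the product solution of
% dimensionless temperatures:
\[
  \theta_{3D}(x, y, z, t) = \theta_x(x, t)\,\theta_y(y, t)\,\theta_z(z, t),
  \qquad
  \theta = \frac{T - T_{\mathrm{storage}}}{T_{\mathrm{initial}} - T_{\mathrm{storage}}}
\]
```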

High-speed IP Address Lookup using Balanced Multi-way Trees (균형 다중 트리를 이용한 고속 IP 어드레스 검색 기법)

  • Kim, Won-Iung; Lee, Bo-Mi; Lim, Hye-Sook
    • Journal of KIISE: Information Networking / v.32 no.3 / pp.427-432 / 2005
  • Packet arrival rates in Internet routers have increased dramatically due to advances in link technologies, making wire-speed packet processing in Internet routers more challenging. As IP address lookup is one of the most essential functions in packet processing, algorithms and architectures for efficient IP address lookup have been widely studied. In this paper, we propose an efficient IP address lookup architecture that achieves very good search performance while requiring only a single small memory. The proposed architecture is based on a multi-way tree structure that compares multiple prefixes in one memory access. Performance evaluation results show that the proposed architecture requires a 280 kByte SRAM to store about 40,000 prefix samples, and an address lookup takes 5.9 memory accesses on average.
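As a rough sketch of the multi-way idea, the Python below stores several keys per node so that one node visit (one modeled memory access) compares multiple entries, and it reports the access count for a lookup. The fanout, the numeric stand-in for prefixes, and the largest-key-not-greater match are illustrative assumptions; the paper's node layout and longest-prefix-match handling are not reproduced.

```python
# Minimal sketch of multi-way search over sorted keys: each node holds FANOUT
# keys, so one node visit models one memory access that compares several
# prefixes at once.

from bisect import bisect_left
from typing import List, Optional

FANOUT = 4  # keys per node; one node visit models one memory access

def build_levels(sorted_keys: List[int]):
    """Group sorted keys into fixed-fanout leaf nodes and build a one-level
    separator node (the maximum key of each leaf) on top."""
    leaves = [sorted_keys[i:i + FANOUT] for i in range(0, len(sorted_keys), FANOUT)]
    separators = [leaf[-1] for leaf in leaves]
    return separators, leaves

def lookup(separators: List[int], leaves: List[List[int]], key: int) -> Optional[int]:
    """Return the largest stored key <= key (a toy stand-in for longest-prefix
    match), counting node visits as memory accesses."""
    accesses = 1                                    # read the separator node
    idx = min(bisect_left(separators, key), len(leaves) - 1)
    accesses += 1                                   # read one leaf node
    candidates = [k for k in leaves[idx] if k <= key]
    if not candidates and idx > 0:                  # key falls before this leaf
        accesses += 1
        candidates = [k for k in leaves[idx - 1] if k <= key]
    print(f"memory accesses: {accesses}")
    return max(candidates) if candidates else None

if __name__ == "__main__":
    prefixes = sorted([10, 20, 30, 40, 55, 60, 75, 90])  # toy numeric "prefixes"
    separators, leaves = build_levels(prefixes)
    print("match:", lookup(separators, leaves, 62))
```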

Service-centric Object Fragmentation Model for Efficient Retrieval and Management of Huge XML Documents (대용량 XML 문서의 효율적인 검색과 관리를 위한 SCOF 모델)

  • Jeong, Chang-Hoo; Choi, Yun-Soo; Jin, Du-Seok; Kim, Jin-Suk; Yoon, Hwa-Mook
    • Journal of Internet Computing and Services / v.9 no.1 / pp.103-113 / 2008
  • The vast amount of XML documents raises the questions of how they will be used and how far their usage can be expanded. This paper has two central goals: 1) easy and fast retrieval of XML documents or their relevant elements; and 2) efficient and stable management of large XML documents. The keys to building such a practical system are how to segment a large XML document into smaller fragments and how to store them. To achieve these goals, we designed the SCOF (Service-centric Object Fragmentation) model, a semi-decomposition method based on conversion rules provided by XML database managers. Keyword-based search using the SCOF model then retrieves specific elements or attributes of XML documents, just as a typical XML query language does. Although this approach requires the expertise of the managers of the XML document collection, the SCOF model makes both retrieval and management of massive XML documents efficient.
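The SCOF conversion rules are not given in the abstract; the Python sketch below illustrates rule-based fragmentation in general: a rule names the tags at which a document is cut, each cut-out subtree is stored as a fragment with an ID and path, and a reference stub is left in the skeleton. The tag names and storage layout are illustrative assumptions, not the SCOF rule format.

```python
# Minimal sketch of rule-based XML fragmentation: cut out subtrees rooted at
# the tags named by a rule, store each as a fragment with an ID and its parent
# path, and leave a reference stub in the remaining skeleton.

import xml.etree.ElementTree as ET
from typing import Dict, List, Tuple

def fragment(xml_text: str, split_tags: set) -> Tuple[str, List[Dict]]:
    """Return (skeleton XML, list of fragments) for the given split rule."""
    root = ET.fromstring(xml_text)
    fragments: List[Dict] = []

    def walk(elem, path):
        new_children = []
        for child in list(elem):
            child_path = f"{path}/{child.tag}"
            if child.tag in split_tags:
                frag_id = len(fragments)
                fragments.append({"id": frag_id, "path": child_path,
                                  "xml": ET.tostring(child, encoding="unicode")})
                new_children.append(ET.Element("fragment-ref", {"id": str(frag_id)}))
            else:
                walk(child, child_path)
                new_children.append(child)
        for old in list(elem):       # replace children in place, preserving order
            elem.remove(old)
        elem.extend(new_children)

    walk(root, root.tag)
    return ET.tostring(root, encoding="unicode"), fragments

if __name__ == "__main__":
    doc = "<library><book><title>A</title></book><book><title>B</title></book></library>"
    skeleton, frags = fragment(doc, {"book"})
    print(skeleton)                          # skeleton with <fragment-ref> stubs
    print([f["path"] for f in frags])        # ['library/book', 'library/book']
```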
