• Title/Summary/Keyword: Science & Technology Integrated Information System


Methods for Integration of Documents using Hierarchical Structure based on the Formal Concept Analysis (FCA 기반 계층적 구조를 이용한 문서 통합 기법)

  • Kim, Tae-Hwan;Jeon, Ho-Cheol;Choi, Joong-Min
    • Journal of Intelligence and Information Systems / v.17 no.3 / pp.63-77 / 2011
  • The World Wide Web is a very large distributed digital information space. From its origins in 1991, the web has grown to encompass diverse information resources such as personal home pages, online digital libraries, and virtual museums. Some estimates suggest that the web currently includes over 500 billion pages in the deep web. The ability to search and retrieve information from the web efficiently and effectively is an enabling technology for realizing its full potential. With powerful workstations and parallel processing technology, efficiency is not a bottleneck. In fact, some existing search tools sift through gigabyte-size precompiled web indexes in a fraction of a second. But retrieval effectiveness is a different matter. Current search tools retrieve too many documents, of which only a small fraction are relevant to the user query. Furthermore, the most relevant documents do not necessarily appear at the top of the query output order. Also, current search tools cannot retrieve the documents related to a retrieved document from the gigantic mass of documents. The most important problem for many current search systems is to increase the quality of search, that is, to provide related documents and to keep the number of unrelated documents in the results as low as possible. For this problem, CiteSeer proposed ACI (Autonomous Citation Indexing) of the articles on the World Wide Web. A "citation index" indexes the links between articles that researchers make when they cite other articles. Citation indexes are very useful for a number of purposes, including literature search and analysis of the academic literature. In this scheme, references contained in academic articles are used to give credit to previous work in the literature and provide a link between the "citing" and "cited" articles. A citation index indexes the citations that an article makes, linking the article with the cited works.
Citation indexes were originally designed mainly for information retrieval. The citation links allow navigating the literature in unique ways: papers can be located independent of language and of the words in the title, keywords, or document, and a citation index allows navigation backward in time (the list of cited articles) and forward in time (which subsequent articles cite the current article?). But CiteSeer cannot index links between articles that researchers did not make, because it only indexes the links that researchers create when they cite other articles. For the same reason, CiteSeer does not scale easily. All of these problems motivate the design of a more effective search system. This paper presents a method that extracts a subject and predicate from each sentence in a document. A document is converted into a tabular form in which each extracted predicate is checked against its possible subjects and objects. From this table we build a hierarchical graph of the document and then integrate the graphs of multiple documents. Using the graph of the entire document collection, the area of each document is computed relative to the integrated graph, and relations among documents are marked by comparing these areas. The paper also proposes a method for structural integration of documents that retrieves documents from the graph, making it easier for users to find information. We compared the performance of the proposed approaches with the Lucene search engine using standard ranking formulas. As a result, the F-measure is about 60%, roughly 15 percentage points better.
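
The tabular step described in the abstract (sentence triples checked into a predicate table, from which a graph and a per-document "area" are derived) can be sketched roughly as follows. The triples, the table layout, and the simple relation-count "area" measure are illustrative assumptions; the paper's actual extraction and area formulas are not given in the abstract:

```python
# Hypothetical (subject, predicate, object) triples extracted from sentences.
triples = [
    ("citation_index", "indexes", "citations"),
    ("citation_index", "links", "articles"),
    ("search_tool", "retrieves", "documents"),
]

# Tabular form: each predicate row records the (subject, object) pairs it relates.
table = {}
for subj, pred, obj in triples:
    table.setdefault(pred, set()).add((subj, obj))

def area(tbl):
    """A crude 'area': the number of distinct relations a document covers."""
    return sum(len(pairs) for pairs in tbl.values())
```

Comparing such areas between a document's own table and the integrated table would then give the relation marking the abstract mentions.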

Management of plant genetic resources at RDA in line with Nagoya Protocol

  • Yoon, Moon-Sup;Na, Young-Wang;Ko, Ho-Cheol;Lee, Sun-Young;Ma, Kyung-Ho;Baek, Hyung-Jin;Lee, Su-Kyeung;Lee, Sok-Young
    • Proceedings of the Korean Society of Crop Science Conference / 2017.06a / pp.51-52 / 2017
  • "Plant genetic resources for food and agriculture" means any genetic material of plant origin of actual or potential value for food and agriculture. "Genetic material" means any material of plant origin, including reproductive and vegetative propagating material, containing functional units of heredity (International Treaty on Plant Genetic Resources for Food and Agriculture, ITPGRFA). The "Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization (ABS) to the Convention on Biological Diversity" (shortly, the Nagoya Protocol) is a supplementary agreement to the Convention on Biological Diversity. It provides a transparent legal framework for the effective implementation of one of the three objectives of the CBD: the fair and equitable sharing of benefits arising out of the utilization of genetic resources. The Nagoya Protocol on ABS was adopted on 29 October 2010 in Nagoya, Japan and entered into force on 12 October 2014, 90 days after the deposit of the fiftieth instrument of ratification. Its objective is the fair and equitable sharing of benefits arising from the utilization of genetic resources, thereby contributing to the conservation and sustainable use of biodiversity. The Nagoya Protocol will create greater legal certainty and transparency for both providers and users of genetic resources by (a) establishing more predictable conditions for access to genetic resources and (b) helping to ensure benefit-sharing when genetic resources leave the providing country. By helping to ensure benefit-sharing, the Nagoya Protocol creates incentives to conserve and sustainably use genetic resources, and therefore enhances the contribution of biodiversity to development and human well-being. The Nagoya Protocol's success will require effective implementation at the domestic level.
A range of tools and mechanisms provided by the Nagoya Protocol will assist contracting Parties, including: (a) establishing national focal points (NFPs) and competent national authorities (CNAs) to serve as contact points for information, grant access, or cooperate on issues of compliance; (b) an Access and Benefit-sharing Clearing-House to share information, such as domestic regulatory ABS requirements or information on NFPs and CNAs; (c) capacity-building to support key aspects of implementation, which, based on a country's self-assessment of national needs and priorities, can include the capacity to develop domestic ABS legislation implementing the Nagoya Protocol, to negotiate mutually agreed terms (MAT), and to develop in-country research capability and institutions; (d) awareness-raising; (e) technology transfer; and (f) targeted financial support for capacity-building and development initiatives through the Nagoya Protocol's financial mechanism, the Global Environment Facility (GEF). The Rural Development Administration (RDA) leads the management of agricultural genetic resources under the 'ACT ON THE PRESERVATION, MANAGEMENT AND USE OF AGRO-FISHERY BIO-RESOURCES' enacted in 2007. According to the second clause of Article 14 (Designation, Operation, etc. of Agencies Responsible for Agro-Fishery Bioresources) of the act, the endowed duties are: (a) matters concerning securing, preservation, management, and use of agro-fishery bioresources; (b) establishment of an integrated information system for agro-fishery bioresources; (c) matters concerning medium- and long-term preservation of, and research on, agro-fishery bioresources; and (d) matters concerning international cooperation for agro-fishery bioresources and other relevant matters. As a result, the RDA managed about 246,000 accessions of plant genetic resources under the national management system at the end of 2016.


An Embedding/Extracting Method of Audio Watermark Information for High Quality Stereo Music (고품질 스테레오 음악을 위한 오디오 워터마크 정보 삽입/추출 기술)

  • Bae, Kyungyul
    • Journal of Intelligence and Information Systems / v.24 no.2 / pp.21-35 / 2018
  • Since the introduction of MP3 players, CD recordings have gradually been vanishing, and the music-consuming environment is shifting to mobile devices. The introduction of smart devices has increased the utilization of music through the playback, mass storage, and search functions integrated into smartphones and tablets. At the time of the initial MP3 player supply, the bitrate of compressed music content was generally 128 Kbps. However, as demand for high-quality music increased, 384 Kbps sound quality appeared. Recently, music content in the FLAC (Free Lossless Audio Codec) format, which uses lossless compression, is becoming popular. Music download services in Korea are classified into unlimited download with technical protection and limited download without technical protection. Digital Rights Management (DRM) technology is used as the technical protection measure for unlimited downloads, but such music can only be played on authenticated devices with DRM installed; even music purchased by the user cannot be used on other devices. On the contrary, for music that is limited in quantity but not technically protected, there is no way to take action against anyone who redistributes it, and for high-quality music such as FLAC the loss is greater. In this paper, the author proposes an audio watermarking technology for copyright protection of high-quality stereo music. Two kinds of information, "Copyright" and "Copy_free", are generated using a turbo code. Each watermark is composed of 9 bytes (72 bits). When the turbo code is applied for error correction, the amount of information to be inserted increases to 222 bits. The 222-bit watermark was then expanded to 1024 bits to be robust against additional errors and finally used as the watermark inserted into the stereo music.
The turbo code allows the raw data to be recovered as long as less than about 15% of the code is damaged by attacks on the watermarked content. Because the 222 coded bits are expanded to 1024 bits, the probability of recovering them from partially damaged content increases, making the watermark itself more resistant to attack. The proposed algorithm uses quantization in the DCT domain so that the watermark can be detected efficiently and the SNR is improved when the stereo music is converted to mono. As a result, the SNR exceeded 40 dB on average, a sound-quality improvement of more than 10 dB over traditional quantization methods. This is a very significant result because it corresponds to roughly a tenfold improvement in sound quality. In addition, the sample length required for extracting the watermark is short: the watermark could be completely extracted from music samples of less than one second, even under MP3 compression at a bitrate of 128 Kbps. The conventional quantization method, in contrast, largely fails to extract the watermark even from 10-second samples, ten times the length. In this study, since the watermark embedded into the music is 72 bits long, it provides sufficient capacity to embed the necessary information for music and enough bits to identify all the music distributed around the world: $2^{72}$ can identify about $4.7\times10^{21}$ items, so it can serve as an identifier for copyright protection of high-quality music services. The proposed algorithm can be used not only for high-quality audio but also for developing watermarking algorithms for multimedia such as UHD (Ultra High Definition) TV and high-resolution images. In addition, with the development of digital devices, users are demanding high-quality music, and artificial intelligence assistants are arriving along with high-quality music and streaming services.
The results of this study can be used to protect the rights of copyright holders in these industries.
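
The bit budget of the scheme described above can be checked with a few lines; the figures are taken directly from the abstract, and the variable names are illustrative:

```python
# Bit budget of the watermark as described in the abstract.
payload_bits = 9 * 8        # "Copyright"/"Copy_free" payload: 9 bytes = 72 bits
coded_bits = 222            # after turbo-code error correction is applied
embedded_bits = 1024        # expanded form actually embedded in the stereo music

# Identifier capacity of the 72-bit payload: 2**72 ~ 4.7e21 distinct IDs,
# which is the "enough bits to identify all music" claim in the abstract.
capacity = 2 ** payload_bits
```

The expansion ratio from 222 to 1024 bits (roughly 4.6x) is what buys the additional robustness against damaged content.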

A New Approach to Mobile Device Design - focused on the Communication Tool & it's GUI for Office Workers in the Near Future - (모바일 기기 디자인의 새로운 접근 - 근 미래 작업환경에서의 커뮤니케이션 도구 디자인과 GUI 연구를 중심으로 -)

  • Yang, Sung-Ho
    • Archives of design research / v.19 no.2 s.64 / pp.31-42 / 2006
  • This study originates from the following questions: what will the office of the future be like, and what technology will we rely on most to communicate with colleagues or to access business information? In today's office environment, new technology has compelled a new work paradigm and has greatly affected individuals' ability to work more productively and efficiently. However, even though new computer technology has changed the business world so rapidly, it is very difficult to see the changes that have taken place. The aim of the study was to create a mobile tool for office workers that successfully supports their work and communication, exploring the future work environment from a five-year technological and social perspective. As a result, the bON brings new visions to mobile professionals via various interfaces. The bON, a mobile device, is both a work system and a communication system for office workers. As an integrated tool for working and communicating, it forms the basis for a mobile information gateway that is equally capable of functioning as a mobile desk. The basic underlying idea is that all formal meeting places and hallways in the office are equipped with large wall-mounted screens, and the bON collaborates with these media in various ways to enhance productivity and efficiency. The main challenge for the bON in enhancing both mobility and quality of information is the use of new technology, including bendable and flexible displays, soft-material displays, and sensors. To answer the strong need for mobility, the device itself is fairly small, with the screen rolled up inside it. For the graphical user interface, moreover, a new technique called the Multi-layering Interface was adopted to stretch the user's visual limits; it suggests a new direction in designing mobile devices equipped with small displays.


Utilization of Smart Farms in Open-field Agriculture Based on Digital Twin (디지털 트윈 기반 노지스마트팜 활용방안)

  • Kim, Sukgu
    • Proceedings of the Korean Society of Crop Science Conference / 2023.04a / pp.7-7 / 2023
  • Currently, the main technologies of the various fourth-industrial fields are big data, the Internet of Things, artificial intelligence, blockchain, mixed reality (MR), and drones. In particular, the "digital twin," which has recently become a global technological trend, is a virtual model that mirrors a physical object in the computer. By creating and simulating a digital twin of software-virtualized assets instead of the real physical assets, accurate information about the characteristics of real farming (current state, agricultural productivity, agricultural work scenarios, etc.) can be obtained. This study aims to streamline agricultural work through automatic water management, remote growth forecasting, drone control, and pest forecasting via an integrated control system, by constructing digital twin data for the main open-field production areas and designing and building a smart farm complex. In addition, it aims to disseminate digital environmental-control agriculture in Korea that can reduce labor and improve crop productivity while minimizing environmental load through big-data-guided use of appropriate amounts of fertilizers and pesticides. These open-field agricultural technologies can reduce labor through digital farming and cultivation management, optimize water use and prevent soil pollution in preparation for climate change, and enable quantitative growth management of open-field crops by securing digital data on the national cultivation environment. It is also a way to directly implement carbon-neutral RED++ activities by improving agricultural productivity. The analysis and prediction of growth status from the acquired high-precision, high-definition image-based crop growth data are very effective for digital farm-work management.
The Southern Crop Department of the National Institute of Food Science has conducted research and development on various types of open-field smart farms, such as underground point irrigation and underground drainage. In particular, starting this year, commercialization is underway in earnest through the establishment of smart farm facilities and technology distribution to agricultural technology complexes across the country. In this study, we describe a case of establishing an agricultural field site that combines digital twin technology with open-field smart farm technology, and discuss future utilization plans.


Review of Soil Vulnerability Assessment Tools in Korea and other developed countries (국내외 토양 취약성 평가 연구 동향)

  • Ki, Seo Jin;Kim, Kyoung-Ho;Lee, Hyeon Gyu;Shin, Kyung Hee
    • Journal of Korean Society of Environmental Engineers / v.39 no.12 / pp.741-749 / 2017
  • This study provides technical considerations and implications for the development of a soil vulnerability assessment tool, based on a review of existing tools and case studies applied both domestically and internationally. We specifically investigated the basic theories and major features implemented in screening models abroad. In contrast, a single Korean case study on prioritizing vulnerable districts was presented to identify domestic research trends. Our literature review suggested that the characteristics of target areas and contaminants need to be properly incorporated into soil vulnerability assessment, because the current tools in Korea neglect these properties, which prevents them from being used as a correct measure for soil management and protection. We also concluded that, in technical terms, a soil vulnerability assessment tool should be built on physical theory and on environmental data that vary over space and time, so that end-users can readily and effectively screen soil vulnerability over large areas. In parallel with technical improvement, great effort needs to be devoted to developing an integrated environmental information system that increases data availability and shares various types of environmental data through enhanced multi-agency collaboration.

Evaluation of Crop Production Increase through Insect Pollination Service in Korean Agriculture (한국 농업에서 곤충 화분매개 서비스를 통한 식량 생산 증진 기능 평가)

  • Jung, Chuleui;Shin, Jong Hwa
    • Korean journal of applied entomology / v.61 no.1 / pp.229-238 / 2022
  • Animal pollination is an important ecosystem service provided mostly by diverse insect groups such as bees and hover flies. Maintaining agricultural productivity and securing nutritional balance are closely tied to human wellbeing. This study aimed to estimate pollination-dependent food production in the Korean agricultural system. Crop production data were obtained from the Korean Statistical Information Service (KOSIS) for 2015. By applying pollination dependency ratios to crop production and market prices, the contribution of insect pollination to crop production was estimated for 71 crops in total: 12 cereals, 19 fruits, 18 field vegetables, 13 greenhouse vegetables, and 9 specialty crops. Mean pollination dependency across all crops was 29.2%; it was higher for fruits, specialty crops, and greenhouse vegetables, but low (7.5%) for cereal crops. Pollination-dependent (PD) production was estimated at 17.8% of total agricultural crop production, with an economic value of 6,850 (6,508-7,193) billion won. In particular, PD production accounted for 49.2% of greenhouse vegetable production, followed by 42.9% of fruits; specialty crops also showed high PD production (35.9%). It is clear that pollination is a vital service for agricultural production and nutritional security in Korea. Measures to further protect and enhance pollination services are discussed, together with integrated pollinator-pest management (IPPM) strategies.
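
The estimation method described above (production value weighted by each crop's pollination dependency) can be sketched as follows; the crop figures are made-up examples for illustration, not the KOSIS 2015 data used in the study:

```python
# Pollination-dependent (PD) production: each crop's production value
# multiplied by its pollination dependency ratio, summed over crops.
# Values below are hypothetical, not the study's data.
crops = {
    # crop: (production value in billion won, pollination dependency 0..1)
    "apple":      (1000.0, 0.65),
    "rice":       (8000.0, 0.00),
    "strawberry": (1200.0, 0.40),
}

pd_value = sum(value * dep for value, dep in crops.values())
total = sum(value for value, _ in crops.values())
pd_share = pd_value / total  # fraction of production attributable to pollination
```

Summing PD values over all 71 crops and dividing by total production yields the study's 17.8% figure.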

Recent Progress in Air-Conditioning and Refrigeration Research: A Review of Papers Published in the Korean Journal of Air-Conditioning and Refrigeration Engineering in 2014 (설비공학 분야의 최근 연구 동향: 2014년 학회지 논문에 대한 종합적 고찰)

  • Lee, Dae-Young;Kim, Sa Ryang;Kim, Hyun-Jung;Kim, Dong-Seon;Park, Jun-Seok;Ihm, Pyeong Chan
    • Korean Journal of Air-Conditioning and Refrigeration Engineering / v.27 no.7 / pp.380-394 / 2015
  • This article reviews the papers published in the Korean Journal of Air-Conditioning and Refrigeration Engineering during 2014. It is intended to capture the status of current research in the areas of heating, cooling, ventilation, sanitation, and indoor environments of buildings and plant facilities. The conclusions are as follows. (1) Research on thermal and fluid engineering has been reviewed in the groups of heat and mass transfer, cooling and heating and air-conditioning, flow inside building rooms, and smoke control in fires. Research dealing with ducts and pipes decreased, while flows inside building rooms and smoke control were newly added to the thermal and fluid engineering research area. (2) Research on heat transfer has been reviewed in the categories of heat transfer characteristics, pool boiling and condensing heat transfer, and industrial heat exchangers. Work on heat transfer characteristics included thermal contact resistance measurement at metal interfaces, a fan coil with an oval-type heat exchanger, fouling characteristics of plate heat exchangers, the effect of rib pitch in a two-wall divergent channel, semi-empirical analysis in vertical mesoscale tubes, an integrated drying machine, microscale surface wrinkles, brazed plate heat exchangers, and numerical analysis of a printed-circuit heat exchanger. In the area of pool boiling and condensing, non-uniform air flow, a PCM-applied thermal storage wall system, a new wavy cylindrical capsule shape, and HFC32/HFC152a mixtures on enhanced tubes were actively studied. In the area of industrial heat exchangers, research was performed on a solar water storage tank, effective design of the inserting part of a refrigerator door gasket, the impact of different boundary conditions in generating the g-function, various constructions of SCW-type ground heat exchangers, and a heat pump for closed cooling-water heat recovery.
(3) In the field of refrigeration, various studies were carried out in the categories of refrigeration cycles, alternative refrigeration, and modelling and controls, including energy recovery from industrial boilers and vehicles, improvement of dehumidification systems, novel defrost systems, and fault diagnosis and optimum control for heat pump systems. It is particularly notable that a substantial number of studies this year were dedicated to the development of air-conditioning and power-recovery systems for electric vehicles. (4) In building mechanical system research, seventeen studies were reported on achieving effective design of mechanical systems and on maximizing the energy efficiency of buildings. The topics included energy performance, HVAC systems, ventilation, renewable energies, and piping in buildings. The proposed designs and the performance tests using numerical methods and experiments provide useful information and key data that can improve the energy efficiency of buildings. (5) The field of architectural environment mostly focused on indoor environment and building energy. The main indoor-environment research concerned the evaluation of work noise in tunnel construction and the simulation and development of a light-shelf system. The building-energy studies addressed energy saving in office buildings using window blinds and phase-change material (PCM), a method of simulating existing-building energy use from energy audit data, estimation of the thermal consumption unit of apartment buildings with case studies, dynamic window performance, a method of writing energy consumption reports, and energy estimation of apartment buildings using district heating.
The remaining studies concerned improving the architectural engineering education system for the plant engineering industry, estimating cooling and heating degree days for variable base temperatures, a method for predicting underground temperature, a comfort-control algorithm for car air conditioners, smoke-control performance evaluation of high-rise buildings, evaluation of the thermal energy systems of a biosafety laboratory, and development of a device for measuring the solar heat gain coefficient of fenestration systems.

Hardware Approach to Fuzzy Inference―ASIC and RISC―

  • Watanabe, Hiroyuki
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 1993.06a / pp.975-976 / 1993
  • This talk presents an overview of the author's research and development activities on fuzzy inference hardware, pursued through two distinct approaches. The first approach uses application-specific integrated circuit (ASIC) technology: the fuzzy inference method is implemented directly in silicon. The second approach, which is in its preliminary stage, uses a more conventional microprocessor architecture. Here, we use a quantitative technique employed by designers of reduced instruction set computers (RISC) to modify a microprocessor architecture. In the ASIC approach, we implemented the most widely used fuzzy inference mechanism directly on silicon. The mechanism is based on the max-min compositional rule of inference and Mamdani's method of fuzzy implication. Two VLSI fuzzy inference chips were designed, fabricated, and fully tested, both using full-custom CMOS technology. The second and more elaborate chip was designed at the University of North Carolina (UNC) in cooperation with MCNC. Both VLSI chips had multiple datapaths for rule evaluation and executed multiple fuzzy if-then rules in parallel. The AT&T chip is the first digital fuzzy inference chip in the world. It ran with a 20 MHz clock and achieved approximately 80,000 Fuzzy Logical Inferences Per Second (FLIPS), storing and executing 16 fuzzy if-then rules. Since it was designed as a proof-of-concept prototype, it had a minimal amount of peripheral logic for system integration. The UNC/MCNC chip consists of 688,131 transistors, of which 476,160 are used for RAM. It ran with a 10 MHz clock, has a 3-stage pipeline, and initiates a new inference every 64 cycles, achieving approximately 160,000 FLIPS. The new architecture has the following important improvements over the AT&T chip: programmable rule-set memory (RAM).
On-chip fuzzification by a table-lookup method. On-chip defuzzification by a centroid method. A reconfigurable architecture for processing two rule formats. RAM/datapath redundancy for higher yield. The chip can store and execute 51 if-then rules of the following format: IF A and B and C and D THEN DO E and DO F. With this format, the chip takes four inputs and produces two outputs. By software reconfiguration, it can store and execute 102 if-then rules of the following simpler format using the same datapath: IF A and B THEN DO E. With this format, the chip takes two inputs and produces one output. We have built two VME-bus board systems based on this chip for Oak Ridge National Laboratory (ORNL). The board is now installed in a robot at ORNL, where researchers use it for experiments in autonomous robot navigation. The Fuzzy Logic system board places the fuzzy chip in a VMEbus environment. High-level C language functions hide the operational details of the board from the application programmer, who treats rule memories and fuzzification function memories as local structures passed as parameters to the C functions. ASIC fuzzy inference hardware is extremely fast, but it is limited in generality: many aspects of the design are limited or fixed. We have therefore proposed designing a fuzzy information processor as an application-specific processor using a quantitative approach, the approach developed by RISC designers. In effect, we are interested in evaluating the effectiveness of a specialized RISC processor for fuzzy information processing. As a first step, we measured the possible speed-up of a fuzzy inference program based on if-then rules from the introduction of specialized instructions, i.e., min and max instructions. The minimum and maximum operations are heavily used in fuzzy logic applications as fuzzy intersection and union.
We performed measurements using a MIPS R3000 as the base microprocessor. The initial result is encouraging: we could achieve as much as a 2.5-fold increase in inference speed if the R3000 had min and max instructions. They are also useful for speeding up other fuzzy operations such as bounded product and bounded sum. An embedded processor's main task is to control some device or process, and it usually runs a single dedicated program, so tailoring a RISC processor in this way to create an embedded processor for fuzzy control is very effective. Table I shows the measured inference speed of a MIPS R3000 microprocessor, a fictitious MIPS R3000 with min and max instructions, and the UNC/MCNC ASIC fuzzy inference chip. The software used on the microprocessors is a simulator of the ASIC chip. The first row is the computation time in seconds for 6000 inferences using 51 rules, where each fuzzy set is represented by an array of 64 elements. The second row is the time required for a single inference. The last row is the fuzzy logical inferences per second (FLIPS) measured for each device. There is a large gap in run time between the ASIC and software approaches, even if we resort to a specialized fuzzy microprocessor. As for design time and cost, these two approaches represent two extremes: an ASIC approach is extremely expensive. It is therefore an important research topic to design a specialized computing architecture for fuzzy applications that falls between these two extremes in both run time and design time/cost.

TABLE I. INFERENCE TIME BY 51 RULES

                     MIPS R3000 (regular)   MIPS R3000 (with min/max)   ASIC
    6000 inferences  125 s                  49 s                        0.0038 s
    1 inference      20.8 ms                8.2 ms                      6.4 ㎲
    FLIPS            48                     122                         156,250
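
A minimal software sketch of the max-min (Mamdani) inference the chips implement, and that the proposed min/max instructions would accelerate: firing strength by min, aggregation by max, then centroid defuzzification. Fuzzy sets are arrays over a discretized universe (the chips used 64 elements); the 4-point rule values below are illustrative, not from the talk:

```python
def maxmin_inference(rules, a_idx, b_idx):
    """rules: list of (mu_A, mu_B, mu_C) membership arrays of equal length.
    The crisp inputs select points a_idx and b_idx on the input universes."""
    n = len(rules[0][2])
    out = [0.0] * n
    for mu_a, mu_b, mu_c in rules:
        w = min(mu_a[a_idx], mu_b[b_idx])                     # firing strength (min)
        out = [max(o, min(w, c)) for o, c in zip(out, mu_c)]  # max aggregation
    return out

def centroid(mu):
    """Centroid defuzzification over the discretized output universe."""
    s = sum(mu)
    return sum(i * m for i, m in enumerate(mu)) / s if s else 0.0

# Two toy rules over a 4-point universe: IF A and B THEN DO C.
rules = [
    ([0.2, 0.8, 1.0, 0.5], [0.0, 1.0, 0.6, 0.1], [1.0, 0.5, 0.0, 0.0]),
    ([0.0, 0.3, 0.9, 1.0], [0.4, 0.7, 1.0, 0.2], [0.0, 0.0, 0.5, 1.0]),
]
out = maxmin_inference(rules, a_idx=2, b_idx=1)
```

The inner min/max over the membership arrays is exactly the loop that dominates the run-time measurements in Table I, which is why dedicated min and max instructions pay off.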


Stratigraphic response to tectonic evolution of sedimentary basins in the Yellow Sea and adjacent areas (황해 및 인접 지역 퇴적분지들의 구조적 진화에 따른 층서)

  • Ryo In Chang;Kim Boo Yang;Kwak won Jun;Kim Gi Hyoun;Park Se Jin
    • The Korean Journal of Petroleum Geology / v.8 no.1_2 s.9 / pp.1-43 / 2000
  • A comparative study for understanding the stratigraphic response to the tectonic evolution of sedimentary basins in the Yellow Sea and adjacent areas was carried out using integrated stratigraphic technology. As an interim result, we propose a stratigraphic framework that allows temporal and spatial correlation of the sedimentary successions in the basins. This framework will serve as a new stratigraphic paradigm for hydrocarbon exploration in the Yellow Sea and adjacent areas. Integrated stratigraphic analysis in conjunction with sequence-keyed biostratigraphy allows us to define nine stratigraphic units in the basins: Cambro-Ordovician, Carboniferous-Triassic, early to middle Jurassic, late Jurassic-early Cretaceous, late Cretaceous, Paleocene-Eocene, Oligocene, early Miocene, and middle Miocene-Pliocene. These are tectono-stratigraphic units that provide time-sliced information on the basin-forming tectonics, sedimentation, and basin-modifying tectonics of sedimentary basins in the Yellow Sea and adjacent areas. In the Paleozoic, the South Yellow Sea basin was initiated as a marginal sag basin on the northern margin of the South China Block. Siliciclastic and carbonate sediments were deposited in the basin in cyclic fashion due to relative sea-level fluctuations. During the Devonian, however, the basin was uplifted and deformed by the Caledonian Orogeny, which resulted in an unconformity between the Cambro-Ordovician and Carboniferous-Triassic units. The second orogenic event, the Indosinian Orogeny, occurred in the late Permian-late Triassic, when the North China Block began to collide with the South China Block. Collision of the North and South China blocks produced the Qinling-Dabie-Sulu-Imjin foldbelts and led to the uplift and deformation of the Paleozoic strata.
Subsequent rapid subsidence of the foreland parallel to the foldbelts formed the Bohai and West Korean Bay basins, which were infilled with early to middle Jurassic molasse sediments. Piggyback basins also developed locally along the thrusts. The later, intensive (first) Yanshanian Orogeny modified these foreland and piggyback basins in the late Jurassic. The South Yellow Sea basin, however, was probably a continental interior sag basin during the early to middle Jurassic. The early to middle Jurassic unit in the South Yellow Sea basin is characterized by fluvial to lacustrine sandstone and shale with a thick basal quartz conglomerate containing well-sorted and well-rounded gravels. Meanwhile, the Tan-Lu fault system underwent sinistral strike-slip wrench movement in the late Triassic, which continued through the Jurassic and Cretaceous until the early Tertiary. In the late Jurassic, the development of second- or third-order wrench faults along the Tan-Lu fault system probably initiated a series of small-scale strike-slip extensional basins. Continued sinistral movement of the Tan-Lu fault until the late Eocene caused a megashear in the South Yellow Sea basin, forming a large-scale pull-apart basin. The Bohai basin, however, was uplifted and severely modified during this period. A pronounced (second and third) Yanshanian Orogeny is marked by the unconformity between the early Cretaceous and late Eocene in the Bohai basin. In the late Eocene, the Indian Plate began to collide with the Eurasian Plate, forming a megasuture zone. This orogenic event, the Himalayan Orogeny, was probably responsible for the change of motion of the Tan-Lu fault system from left-lateral to right-lateral. The right-lateral strike-slip movement of the Tan-Lu fault caused the tectonic inversion of the South Yellow Sea basin and the pull-apart opening of the Bohai basin.
Thus, the Oligocene was the main period of sedimentation in the Bohai basin as well as of severe tectonic modification of the South Yellow Sea basin. After the Oligocene, the Yellow Sea and Bohai basins have maintained thermal subsidence to the present, with short periods of marine transgression extending into the land parts of the present basins.
