• Title/Summary/Keyword: Domain trace


Cut out effect on nonlinear post-buckling behavior of FG-CNTRC micro plate subjected to magnetic field via FSDT

  • Jamali, M.; Shojaee, T.; Mohammadi, B.; Kolahchi, R.
    • Advances in nano research / v.7 no.6 / pp.405-417 / 2019
  • This research is devoted to the post-buckling analysis of a functionally graded carbon nanotube-reinforced composite (FG-CNTRC) micro plate with a cutout, subjected to a magnetic field and resting on an elastic medium. The plate formulation is based on first-order shear deformation theory (FSDT); the material properties of the FG-CNTRC are assumed to vary through the thickness according to the rule of mixtures, and Eringen's nonlocal theory is applied to capture the size-dependent effect. The system is embedded in an elastic medium and subjected to a longitudinal magnetic field. An energy approach, domain decomposition, and the Rayleigh-Ritz method, in conjunction with the Newton-Raphson iterative technique, are employed to trace the post-buckling paths of the FG-CNTRC micro plate with a cutout. The influence of important parameters such as the small-scale effect, cutout dimension, type of FG distribution of CNTs, CNT volume fraction, plate aspect ratio, magnetic field magnitude, elastic medium, and biaxial load on the post-buckling behavior of the system is examined. The results show that the aspect ratio and the side length of the square cutout have a negative effect on the post-buckling response of the micro composite plate. Furthermore, the presence of CNTs improves the post-buckling behavior of the plate, and different CNT distributions yield different responses. The nonlocal parameter and biaxial compressive load also have a negative effect on the post-buckling response, while imposing a magnetic field increases the post-buckling load of the microstructure.
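
As an illustration of the homogenization step this abstract describes, here is a minimal sketch of the extended rule of mixtures for the through-thickness longitudinal modulus of an FG-CNTRC plate. The UD/FG-X/FG-O distribution formulas are the common forms from the FG-CNTRC literature; the efficiency parameter and property values are illustrative assumptions, not the paper's data.

```python
import numpy as np

def fg_cnt_volume_fraction(z, h, v_star, distribution="FG-X"):
    """CNT volume fraction at thickness coordinate z in [-h/2, h/2]
    for common FG distributions (UD, FG-X, FG-O)."""
    if distribution == "UD":
        return v_star
    if distribution == "FG-X":
        return 4.0 * abs(z) / h * v_star                 # CNT-rich at both faces
    if distribution == "FG-O":
        return 2.0 * (1.0 - 2.0 * abs(z) / h) * v_star   # CNT-rich at mid-plane
    raise ValueError(f"unknown distribution: {distribution}")

def e11_rule_of_mixtures(z, h, v_star, e11_cnt, e_m, eta1, distribution="FG-X"):
    """Extended rule of mixtures for the longitudinal modulus:
    E11(z) = eta1 * Vcnt(z) * E11_cnt + Vm(z) * E_m."""
    v_cnt = fg_cnt_volume_fraction(z, h, v_star, distribution)
    return eta1 * v_cnt * e11_cnt + (1.0 - v_cnt) * e_m

# Illustrative values only: h in m, moduli in GPa, eta1 hypothetical.
h = 1.0e-5
for z in np.linspace(-h / 2, h / 2, 7):
    e11 = e11_rule_of_mixtures(z, h, 0.12, 600.0, 2.5, 0.15)
    print(f"z/h = {z / h:+.3f}  E11 = {e11:8.2f} GPa")
```

E11 evaluated at sampled z locations would feed the FSDT stiffness integrals that a Rayleigh-Ritz solution of this kind uses.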

A Study about the Direction and Responsibility of the National Intelligence Agency to the Cyber Security Issues (사이버 안보에 대한 국가정보기구의 책무와 방향성에 대한 고찰)

  • Han, Hee-Won
    • Korean Security Journal / no.39 / pp.319-353 / 2014
  • Cyber-based technologies are now ubiquitous around the globe and are emerging as an "instrument of power" in societies, becoming more available to a country's opponents, who may use them to attack, degrade, and disrupt communications and the flow of information. The globe-spanning reach of cyberspace and its lack of national borders challenge legal systems and complicate a nation's ability to deter threats and respond to contingencies. Through cyberspace, competitive powers will target industry, academia, and government, as well as the military, across the air, land, maritime, and space domains of our nations. Adversaries in cyberspace include both states and non-state actors, ranging from unsophisticated amateurs to highly trained professional hackers. In much the same way that airpower transformed the battlefield of World War II, cyberspace has fractured the physical barriers that shield a nation from attacks on its commerce and communication. Cyberthreats to infrastructure and other assets are a growing concern for policymakers. In 2013, cyberwarfare was, for the first time, considered a larger threat than Al Qaeda or terrorism by many U.S. intelligence officials. The new United States military strategy makes explicit that a cyberattack is casus belli just as a traditional act of war is. The Economist describes cyberspace as "the fifth domain of warfare," naming China, Russia, Israel, and North Korea among the cyber powers, while Iran boasts of having the world's second-largest cyber-army. Entities posing a significant threat to the cybersecurity of critical infrastructure assets include cyberterrorists, cyberspies, cyberthieves, cyberwarriors, and cyberhacktivists. These malefactors may access cyber-based technologies in order to deny service, steal or manipulate data, or use a device to launch an attack against itself or another piece of equipment. However, because the Internet offers near-total anonymity, it is difficult to discern the identity, motives, and location of an intruder. The scope and enormity of the threats are not confined to private industry but extend to the country's heavily networked critical infrastructure. Many ongoing efforts in government and industry focus on making computers, the Internet, and related technologies more secure. On the part of national intelligence institutions, cyber counterintelligence comprises measures to identify, penetrate, or neutralize foreign operations that use cyber means as their primary tradecraft, as well as foreign intelligence service collection efforts that use traditional methods to gauge cyber capabilities and intentions. One of the hardest issues in cyber counterintelligence, however, is the problem of attribution: unlike in conventional warfare, figuring out who is behind an attack can be very difficult, even though Defense Secretary Leon Panetta has claimed that the United States has the capability to trace attacks back to their sources and hold the attackers "accountable." Considering all of these problems, this paper examines cyber security issues closely through lessons from the U.S. experience. To that end, it reviews the cyber security issues arising from the changing global security environment of the 21st century and their implications for reshaping the government system, treating cyber security as one of the growing national security threats.
This article also reviews what our intelligence and security agencies should do amid the transformation of cyberspace. Despite the heated debates over the various legality and human rights issues arising from cyberspace and intelligence service activity, national security must nonetheless be secured. This paper therefore suggests that one of the most important and immediate steps is to understand the legal ideology of national security and national intelligence.


An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong; Chung Bu-Heung; Jang Seong-Hyung
    • Geophysics and Geophysical Exploration / v.2 no.1 / pp.26-32 / 1999
  • Among the various steps in a seismic data processing sequence, velocity analysis is the most time-consuming and labor-intensive. Production seismic data processing therefore requires a good velocity analysis tool as well as a high-performance computer; the tool must deliver fast and accurate velocity analysis. There are two different approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point; the plot generally consists of a semblance contour, a super gather, and a stack panel. The interpreter chooses the velocity function by analyzing the velocity plot. This technique is highly dependent on the interpreter's skill and requires considerable human effort. As high-speed graphic workstations have become more popular, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with a mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to the presence of noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed. The analysis must also be carried out by carefully choosing the location of the analysis point and by accurate computation of the spectrum. The analyzed velocity function must be verified by mute and stack, and the sequence usually must be repeated many times. An iterative, interactive, and unified velocity analysis tool is therefore highly desirable. Such an interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes yield the final stack within a few mouse clicks, enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique for the mute function between the T-X domain and the NMO-corrected (NMOC) domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and the refracted wave, but it has two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. The program references the Geobit utility libraries and can be installed in a Geobit-preinstalled environment; it runs under X-Window/Motif, with a menu designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing the AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for making high-quality seismic sections.
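
The NMO correction and semblance scan at the heart of such velocity analysis can be sketched as follows. This is a minimal illustration, not the xva implementation; the function names, window length, and stabilization constant are assumptions of this sketch.

```python
import numpy as np

def nmo_correct(gather, offsets, dt, v_nmo):
    """Apply hyperbolic normal-moveout correction to a CDP gather.

    gather : (n_samples, n_traces) array of amplitudes
    offsets: source-receiver offsets in metres, one per trace
    dt     : sample interval in seconds
    v_nmo  : trial NMO velocity in m/s
    """
    n_samples, _ = gather.shape
    t0 = np.arange(n_samples) * dt                 # zero-offset two-way times
    corrected = np.zeros_like(gather)
    for j, x in enumerate(offsets):
        t_x = np.sqrt(t0**2 + (x / v_nmo) ** 2)    # t(x)^2 = t0^2 + x^2 / v^2
        # resample the trace at the moveout times back onto the t0 axis
        corrected[:, j] = np.interp(t_x, t0, gather[:, j], left=0.0, right=0.0)
    return corrected

def semblance(corrected, window=11):
    """Semblance of an NMO-corrected gather: stacked energy over total energy
    in a sliding time window; values near 1 indicate a good velocity pick."""
    num = corrected.sum(axis=1) ** 2
    den = (corrected**2).sum(axis=1) * corrected.shape[1]
    kernel = np.ones(window)
    return np.convolve(num, kernel, "same") / (np.convolve(den, kernel, "same") + 1e-12)
```

Scanning v_nmo over a range of trial velocities and collecting the semblance traces column by column yields the velocity spectrum panel from which the interpreter picks the velocity function.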


Tracking and Interpretation of Moving Object in MPEG-2 Compressed Domain (MPEG-2 압축 영역에서 움직이는 객체의 추적 및 해석)

  • Mun, Su-Jeong; Ryu, Woon-Young; Kim, Joon-Cheol; Lee, Joon-Hoan
    • The KIPS Transactions: Part B / v.11B no.1 / pp.27-34 / 2004
  • This paper proposes a method to trace and interpret a moving object based on information that can be obtained directly from an MPEG-2 compressed video stream without decoding. In the proposed method, the motion flow is constructed from the motion vectors included in the compressed video. The amounts of pan, tilt, and zoom associated with camera operations are calculated using a generalized Hough transform, and the local object motion is extracted from the motion flow after compensating for the global camera motion with these parameters. Initially, the moving object to be traced is designated by the user via a bounding box; automatic tracking is then performed based on the accumulated motion flows according to the area contributions. Also, in order to reduce the cumulative tracking error, the object area is reshaped in the first I-frame of each GOP by matching DCT coefficients. The proposed method improves computation speed because the information is obtained directly from the MPEG-2 compressed video, but the object boundary is limited to macroblocks rather than pixels. Consequently, owing to the limited information available in compressed video data, the method is suited to approximate object tracking rather than precise tracing of an object.
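
The paper estimates pan, tilt, and zoom with a generalized Hough transform; as a simpler stand-in, the sketch below fits the same three-parameter global motion model to macroblock motion vectors by ordinary least squares. The function names and the centroid-relative model are assumptions of this sketch, not the paper's formulation.

```python
import numpy as np

def fit_pan_tilt_zoom(positions, motion_vectors):
    """Least-squares fit of a global motion model mv ~ s*(p - c) + (pan, tilt),
    where c is the centroid of the macroblock positions and s > 0 means zoom-in.

    positions      : (N, 2) macroblock centres in pixels
    motion_vectors : (N, 2) motion vectors in pixels
    """
    p = positions - positions.mean(axis=0)   # coordinates relative to centroid
    n = len(p)
    a = np.zeros((2 * n, 3))                 # unknowns: [s, pan, tilt]
    b = motion_vectors.reshape(-1)           # interleaved (mvx, mvy) pairs
    a[0::2, 0] = p[:, 0]                     # x rows: mvx = s*px + pan
    a[0::2, 1] = 1.0
    a[1::2, 0] = p[:, 1]                     # y rows: mvy = s*py + tilt
    a[1::2, 2] = 1.0
    s, pan, tilt = np.linalg.lstsq(a, b, rcond=None)[0]
    return s, pan, tilt

def local_motion(positions, motion_vectors, s, pan, tilt):
    """Subtract the fitted global field; what remains is object motion."""
    p = positions - positions.mean(axis=0)
    return motion_vectors - (s * p + np.array([pan, tilt]))
```

Subtracting the fitted global field from each motion vector leaves the compensated local motion flow on which the object tracking operates.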

Nonlinear Analysis of Nuclear Reinforced Concrete Containment Structures under Accidental Thermal Load and Pressure (온도 및 내압을 받는 원자로 철근콘크리트 격납구조물의 비선형해석)

  • Oh, Byung Hwan; Lee, Myung Gue
    • KSCE Journal of Civil and Environmental Engineering Research / v.14 no.3 / pp.403-414 / 1994
  • A nonlinear analysis of an RC containment structure under thermal load and pressure is presented to trace the behavior after an assumed LOCA. The temperature distribution varying with time through the wall thickness is determined by transient finite element analysis using a two-time-level scheme in the time domain. Layered shell finite elements are used to represent the containment structures in nuclear power plants, and both geometric and material nonlinearities are taken into account in the finite element formulation. The constitutive relation of concrete in compression is modeled by the Drucker-Prager yield criterion, while a tension stiffening model represents the tensile behavior of concrete, including the bond effect. The reinforcing bars are modeled by smeared layers at the reinforcement locations, accounting for elasto-plastic axial behavior, and the steel liner is modeled as elastic-perfectly plastic under the Von Mises yield criterion. Geometric nonlinearity is formulated to consider the large-displacement effect. Thermal stress components are determined by the initial strain concept during each time step, with the temperature differential between any two consecutive time steps treated as a load increment. The numerical results reveal that the nonlinear temperature gradient based on transient thermal analysis produces excessively large displacements, and that the nonlinear behavior of containment structures can be traced realistically up to the ultimate stage. The present study thus allows a more realistic analysis of concrete containment structures in nuclear power plants.
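
The through-thickness temperature analysis can be illustrated with a 1D transient conduction solve using a two-time-level (theta) scheme, with theta = 0.5 giving Crank-Nicolson. The node count, diffusivity, time step, and face temperatures below are illustrative assumptions, not the paper's LOCA data.

```python
import numpy as np

def transient_wall_temperature(n_nodes=21, thickness=1.2, alpha=1.0e-6,
                               t_inside=150.0, t_outside=20.0,
                               dt=600.0, n_steps=144, theta=0.5):
    """Solve dT/dt = alpha * d2T/dx2 through the wall with fixed face
    temperatures; theta = 0.5 is the Crank-Nicolson two-level scheme."""
    dx = thickness / (n_nodes - 1)
    r = alpha * dt / dx**2
    temp = np.full(n_nodes, t_outside, dtype=float)   # initial uniform field
    temp[0] = t_inside                                # accident-side face
    # Constant tridiagonal system A @ T_new = B @ T_old
    lap = (np.diag(np.ones(n_nodes - 1), -1)
           - 2.0 * np.eye(n_nodes)
           + np.diag(np.ones(n_nodes - 1), 1))
    a = np.eye(n_nodes) - theta * r * lap
    b = np.eye(n_nodes) + (1.0 - theta) * r * lap
    a[0, :] = 0.0; a[0, 0] = 1.0                      # Dirichlet boundary rows
    a[-1, :] = 0.0; a[-1, -1] = 1.0
    history = [temp.copy()]
    for _ in range(n_steps):
        rhs = b @ temp
        rhs[0], rhs[-1] = t_inside, t_outside
        temp = np.linalg.solve(a, rhs)
        history.append(temp.copy())
    return np.array(history)                          # one temperature profile per step
```

The temperature differential between consecutive profiles would then be converted to initial strains and applied as a load increment, as the abstract describes.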


Evaluation of DNA Damage by Mercury Chloride (II) and Ionizing Radiation in HeLa Cells (이온화 방사선 및 염화수은(II)에 의한 자궁경부암 세포의 DNA 손상 평가)

  • Woo Hyun-Jung; Kim Ji-Hyang; Antonina Cebulska-Wasilewska; Kim Jin-Kyu
    • Korean Journal of Environmental Biology / v.24 no.1 s.61 / pp.46-52 / 2006
  • Mercury is among the most highly bioconcentrated toxic trace metals, and many national and international agencies and organisations have targeted mercury for possible emission control. Mercury toxicity depends on its chemical form, among which alkylmercury compounds are the most toxic. The human cervical cancer cell line HeLa was employed to investigate the effects of the toxic heavy metal mercury (Hg) and ionizing radiation. In the in vitro comet assays for genotoxicity in HeLa cells, the group treated with Hg after irradiation showed more DNA breakage than the other groups. The tail extent moment and olive tail moment of the control group were 4.88 ± 1.00 and 3.50 ± 0.52, while those of the Hg-only treatment group were 26.90 ± 2.67 and 13.16 ± 1.82, respectively. For the Hg-only groups at 0.001, 0.005, and 0.01, the tail extent moment and olive tail moment were 12.24 ± 1.82 and 8.20 ± 2.15, 20.30 ± 1.30 and 12.26 ± 0.52, and 40.65 ± 2.94 and 20.38 ± 1.49, respectively. For Hg treatment after irradiation, the corresponding values for the 0.001, 0.005, and 0.01 Hg groups were 56.50 ± 3.93 and 32.69 ± 2.48, 62.03 ± 5.14 and 31.56 ± 1.97, and 72.73 ± 3.70 and 39.44 ± 3.23, respectively. The results showed that Hg induced DNA single-strand breaks or alkali-labile sites as assessed by the comet assay, in good agreement with reported results. Mercury inhibits the repair of DNA: the bacterial formamidopyrimidine-DNA glycosylase (Fpg protein) recognizes and removes some oxidative DNA base modifications, and enzyme inactivation by Hg(II) may therefore be due either to interactions with cysteine residues outside the metal binding domain or to very high-affinity binding of Hg(II), which readily removes Zn(II) from the zinc finger.
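
For reference, the two comet-assay statistics reported above can be computed from a one-dimensional DNA intensity profile roughly as follows. The profile representation and the fixed head/tail split index are simplifying assumptions of this sketch, not the study's image-analysis pipeline.

```python
import numpy as np

def comet_moments(profile, head_end):
    """Tail extent moment and Olive tail moment from a 1D comet intensity
    profile. profile[i] is DNA intensity at pixel i along the comet axis;
    head_end is the index where the head region ends and the tail begins."""
    positions = np.arange(len(profile), dtype=float)
    tail = profile[head_end:]
    tail_dna_fraction = tail.sum() / profile.sum()
    # tail extent moment = tail length x fraction of DNA in the tail
    tail_extent_moment = len(tail) * tail_dna_fraction
    # Olive tail moment = (tail centroid - head centroid) x tail DNA fraction
    head_centroid = np.average(positions[:head_end], weights=profile[:head_end])
    tail_centroid = np.average(positions[head_end:], weights=tail)
    olive_tail_moment = (tail_centroid - head_centroid) * tail_dna_fraction
    return tail_extent_moment, olive_tail_moment
```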

Reevaluation of the Songguk-ri site (송국리유적 재고)

  • Son, Jun-Ho
    • KOMUNHWA / no.70 / pp.35-62 / 2007
  • The Songguk-ri site gained academic recognition with the excavation of a stone coffin tomb containing a bronze dagger in 1974, and the subsequent excavations begun in 1975 confirmed that the site is epoch-making for the Korean Bronze Age. However, the excavation reports published to date do not offer even an overall view of the site, so it is difficult to grasp the whole picture. In this paper, therefore, the author re-examines all of the reports on the site and, as basic research, compiles maps of the overall layout and the distribution of archaeological features. The artifacts from the site are also analyzed and compared with recent papers by other researchers on its chronology and character. The Songguk-ri site comprises a livelihood domain, consisting of dwelling pits, attached features, storage pits, pot-firing features, a wooden fence, abatises, and above-ground buildings, as well as a cemetery consisting of stone coffins, jar coffins, and pit tombs; traces of the construction of a large terrace were also excavated. These features appear to belong to the same archaeological stage, dated to about 850-550 B.C. by C14 dating. The intensification of wet-rice cultivation made this group more productive, and on the basis of this economic strength an influential group emerged that constructed a defensive settlement to protect its products safely; conflicts appear to have occurred frequently. Nevertheless, the expansion of living space shows that the inhabitants maintained a stable life. Consequently, the Songguk-ri site played the role of a summit among the settlements of this area.


A Study of Documentary Archiving Focusing on the case of Archiving by Seoul Metropolitan Archives ('다큐멘터리 아카이빙' 연구 서울기록원의 수집 사례를 중심으로)

  • An, Duree; Song, Young Rang
    • The Korean Journal of Archival Studies / no.65 / pp.227-251 / 2020
  • The documentation of a city can never be complete with documentation of the administrative domain alone; it also requires documentation of the citizens living in the city in their different ways. This study attempts to present a way of documenting the memories of citizens, which either have never been recorded or have been damaged and are thus difficult to collect. From the archival activist's point of view, the study proposes the documentary as its research method, in order to leave traces of the various experiences of Seoul that are not recorded in documents but are rooted in people's memories and daily lives. Documentaries are characterized by their narrative. This can be somewhat arbitrary, but it is precisely because of this narrative that the study proposes documentaries, rather than oral statements, as a new method: whereas oral records, owing to their self-historicality, are prone to producing redundant or irrelevant memories, documentaries enable the documentation of material relevant to the topic of collection. The study first presents narrative-based archiving, the method of collection adopted by the Seoul Metropolitan Archives, and then explores the role and significance of documentary archiving, further identifying the conditions under which documentary archiving is required in the context of narrative-based collection. Finally, it presents the planning and implementation of documentary archiving and introduces one of the three documentaries produced in the 2019 Seoul Archiving Project.

Broadband Processing of Conventional Marine Seismic Data Through Source and Receiver Deghosting in Frequency-Ray Parameter Domain (주파수-파선변수 영역에서 음원 및 수신기 고스트 제거를 통한 전통적인 해양 탄성파 자료의 광대역 자료처리)

  • Kim, Su-min; Koo, Nam-Hyung; Lee, Ho-Young
    • Geophysics and Geophysical Exploration / v.19 no.4 / pp.220-227 / 2016
  • Marine seismic data contain not only primary signals from the subsurface but also ghost signals reflected from the sea surface. The ghost decreases the temporal resolution of seismic data because it attenuates specific frequency components. To eliminate the ghost signals effectively, the exact ghost delay times and reflection coefficients are required. Because of the undulation of the sea surface and the vertical movements of airguns and streamers, the ghost delay time varies spatially and randomly during acquisition, and the reflection coefficient is a function of frequency, the incidence angle of the plane wave, and the sea state. In order to estimate proper ghost delay times under these conditions, we compared delay times estimated with the L1 norm, the L2 norm, and the kurtosis of the deghosted trace and of its autocorrelation on synthetic data; the L1 norm of the autocorrelation showed the smallest error. The reflection coefficient was calculated using the Kirchhoff approximation, which can handle the effect of wave height. We applied the estimated ghost delay times and the calculated reflection coefficients to remove the source and receiver ghost effects. By removing the ghost signals, we reconstructed the frequency components attenuated near the notch frequency and produced a migrated stack section with enhanced temporal resolution.
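
The underlying ghost model is a delayed, scaled copy of the primary, so deghosting amounts to deconvolving the ghost operator. A per-trace frequency-domain version can be sketched as below; the paper itself works in the frequency-ray parameter domain, where the delay depends on the ray parameter, and the function name and stabilization constant here are assumptions of this sketch.

```python
import numpy as np

def deghost_trace(trace, dt, tau, r, eps=0.1):
    """Remove a surface ghost by stabilized frequency-domain deconvolution.

    trace: recorded trace containing primary + ghost
    dt   : sample interval (s)
    tau  : ghost delay time (s)
    r    : sea-surface reflection coefficient (negative, e.g. -0.9)
    eps  : white-noise stabilization against division near ghost notches
    """
    n = len(trace)
    freq = np.fft.rfftfreq(n, dt)
    ghost = 1.0 + r * np.exp(-2j * np.pi * freq * tau)   # ghost operator G(f)
    inv = np.conj(ghost) / (np.abs(ghost) ** 2 + eps)    # stabilized 1/G(f)
    return np.fft.irfft(np.fft.rfft(trace) * inv, n=n)
```

Estimating tau by minimizing the L1 norm of the autocorrelation of the deghosted trace, as the abstract reports, would then amount to scanning trial tau values and keeping the one whose output is most ghost-free.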

Analyzing the Main Paths and Intellectual Structure of the Data Literacy Research Domain (데이터 리터러시 연구 분야의 주경로와 지적구조 분석)

  • Jae Yun Lee
    • Journal of the Korean Society for Information Management / v.40 no.4 / pp.403-428 / 2023
  • This study investigates the development path and intellectual structure of data literacy research, aiming to identify emerging topics in the field. A comprehensive search for data literacy-related articles on the Web of Science reveals that the field is primarily concentrated in Education & Educational Research and Information Science & Library Science, accounting for nearly 60% of the total. Citation network analysis, employing the PageRank algorithm, identifies key papers with high citation impact across various topics. To accurately trace the development path of data literacy research, an enhanced PageRank main path algorithm is developed, which overcomes the limitations of existing methods confined to the Education & Educational Research field. Keyword bibliographic coupling analysis is employed to unravel the intellectual structure of data literacy research. Utilizing the PNNC algorithm, the detailed structure and clusters of the derived keyword bibliographic coupling network are revealed, including two large clusters, one with two smaller clusters and the other with five smaller clusters. The growth index and mean publishing year of each keyword and cluster are measured to pinpoint emerging topics. The analysis highlights the emergence of critical data literacy for social justice in higher education amidst the ongoing pandemic and the rise of AI chatbots. The enhanced PageRank main path algorithm, developed in this study, demonstrates its effectiveness in identifying parallel research streams developing across different fields.
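
The PageRank step of this citation analysis can be sketched with networkx on a toy citation network. The edge list below is fabricated for illustration, and the paper's enhanced PageRank main path algorithm is not reproduced here.

```python
import networkx as nx

# Toy citation network: an edge A -> B means "paper A cites paper B".
edges = [("P3", "P1"), ("P4", "P1"), ("P4", "P2"),
         ("P5", "P3"), ("P5", "P4"), ("P6", "P4"), ("P6", "P5")]
g = nx.DiGraph(edges)

# PageRank scores a paper by the importance of the papers citing it,
# not merely by its raw citation count.
scores = nx.pagerank(g, alpha=0.85)
for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")
```

A main path algorithm would then walk this citation network along its highest-weight links to extract the development path that the abstract describes.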