• Title/Summary/Keyword: information use


T-Cache: a Fast Cache Manager for Pipeline Time-Series Data (T-Cache: 시계열 배관 데이타를 위한 고성능 캐시 관리자)

  • Shin, Je-Yong;Lee, Jin-Soo;Kim, Won-Sik;Kim, Seon-Hyo;Yoon, Min-A;Han, Wook-Shin;Jung, Soon-Ki;Park, Se-Young
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.13 no.5
    • /
    • pp.293-299
    • /
    • 2007
  • Intelligent pipeline inspection gauges (PIGs) are inspection vehicles that move along within a (gas or oil) pipeline and acquire signals (also called sensor data) from their surrounding rings of sensors. By analyzing the signals captured by intelligent PIGs, we can detect pipeline defects, such as holes and curvature, and other potential causes of gas explosions. Two major data access patterns are apparent when an analyzer accesses the pipeline signal data. The first is a sequential pattern, where an analyst reads the sensor data only once, in sequential order. The second is a repetitive pattern, where an analyzer repeatedly reads the signal data within a fixed range; this is the dominant pattern in analyzing the signal data. The existing PIG software reads signal data directly from the server at every user's request, incurring network transfer and disk access cost. It works well only for the sequential pattern, not for the more dominant repetitive pattern. This problem becomes serious in a client/server environment where several analysts analyze the signal data concurrently. To tackle this problem, we devise a fast in-memory cache manager, called T-Cache, by treating pipeline sensor data as multiple time-series and efficiently caching those time-series in T-Cache. To the best of the authors' knowledge, this is the first research on caching pipeline signals on the client side. We propose a new concept, the signal cache line, as the caching unit: a set of time-series signal data covering a fixed distance. We also describe the data structures used in T-Cache, including smart cursors, and the associated algorithms. Experimental results show that T-Cache performs much better for the repetitive pattern in terms of disk I/Os and elapsed time. Even with the sequential pattern, T-Cache shows almost the same performance as a system that does not use any caching, indicating that the caching overhead of T-Cache is negligible.
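The signal cache line idea in this abstract lends itself to a small illustration. Below is a minimal Python sketch, assuming a fixed line length keyed by pipeline distance and a simple LRU policy; the class names, fields and the fetch callback are hypothetical and are not taken from the T-Cache implementation (which also includes smart cursors and other structures not modeled here).

```python
from collections import OrderedDict

class SignalCacheLine:
    """Hypothetical caching unit: time-series sensor signals for a fixed distance range."""
    def __init__(self, start_distance, line_length, signals):
        self.start_distance = start_distance   # beginning of the covered pipeline distance
        self.line_length = line_length         # fixed distance covered by one cache line
        self.signals = signals                 # list of (distance, sensor_readings) tuples

class TCacheSketch:
    """Minimal in-memory cache keyed by cache-line index, with LRU eviction."""
    def __init__(self, capacity, line_length, fetch_from_server):
        self.capacity = capacity                   # max number of cache lines kept in memory
        self.line_length = line_length
        self.fetch_from_server = fetch_from_server # callback that reads one line from the server
        self.lines = OrderedDict()                 # line index -> SignalCacheLine, in LRU order

    def read(self, distance):
        """Return the cache line covering `distance`, fetching it from the server on a miss."""
        index = int(distance // self.line_length)
        if index in self.lines:                    # hit: repetitive accesses are served from memory
            self.lines.move_to_end(index)
            return self.lines[index]
        line = self.fetch_from_server(index)       # miss: one network/disk round trip per line
        self.lines[index] = line
        if len(self.lines) > self.capacity:        # evict the least recently used line
            self.lines.popitem(last=False)
        return line
```

Repeated reads within a fixed range map to the same line indices, which is why a repetitive access pattern mostly hits memory in this sketch.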

Beak Trimming Methods - Review -

  • Glatz, P.C.
    • Asian-Australasian Journal of Animal Sciences
    • /
    • v.13 no.11
    • /
    • pp.1619-1637
    • /
    • 2000
  • A review was undertaken to obtain information on the range of beak-trimming methods available or under development. Beak-trimming of commercial layer replacement pullets is a common yet critical management tool that can affect performance for the life of the flock. The most obvious advantage of beak-trimming is a reduction in cannibalism, although the extent of the reduction depends on the strain, season, type of housing, flock health and other factors. Beak-trimming also improves feed conversion by reducing food wastage. A further advantage of beak-trimming is a reduction in the chronic stress associated with dominance interactions in the flock. Beak-trimming of birds at 7-10 days is favoured by industry, but research over the last 10 years has shown that beak-trimming at day-old causes the least stress on birds, and efforts are needed to encourage industry to adopt the practice of beak-trimming birds at day-old. Proper beak-trimming can result in greatly improved layer performance, but improper beak-trimming can ruin an otherwise good flock of hens. Re-trimming is practiced in most flocks, although there are some flocks that only need one trimming. Given the continuing welfare scrutiny of using a hot blade to cut the beak, attempts have been made to develop more welfare-friendly methods of beak-trimming. Despite developments in the design of hot blade beak-trimmers, the process has remained largely unchanged: a red-hot blade cuts and cauterises the beak. The variables in the process are blade temperature, cauterisation time, operator ability, severity of trimming, age at trimming, strain of bird and beak length. This method of beak-trimming is still overwhelmingly favoured in industry, and there appear to be no alternative procedures that are more effective. Sharp secateurs have been used to trim the upper beak of both layers and turkeys. Bleeding from the upper mandible ceases shortly after the operation, and despite the regrowth of the beak a reduction of cannibalism has been reported. Very few differences in behaviour and production have been noted between hot blade and cold blade cut chickens. This method has not been used on a large scale in industry. There are anecdotal reports of cannibalism outbreaks in birds with regrown beaks. A robotic beak-trimming machine was developed in France, which permitted simultaneous, automated beak-trimming and vaccination of day-old chicks at rates of up to 4,500 chickens per hour. Use of the machine was not successful because, if the chicks were not loaded correctly, they could drop off the line or receive excessive or very light trimming. Robotic beak-trimming was also not effective if there was variation in the weight or size of the chickens. Capsaicin can cause degeneration of sensory nerves in mammals and decreases the rate of beak regrowth by its action on the sensory nerves. Capsaicin is a cheap, non-toxic substance that can be readily applied at the time of less severe beak-trimming. It suffers the disadvantage of causing an extreme burning sensation in operators who come in contact with the substance during its application to the bird. Methods of applying the substance that minimise the risk of operators coming in contact with capsaicin need to be explored. A method has been reported in which the beaks of day-old chickens are cut with a laser beam.
No details were provided on the type of laser used or the severity of beak-trimming, but by 16 weeks the beaks of laser-trimmed birds resembled untrimmed beaks, except for the missing bill tip. Feather pecking and cannibalism during the laying period were highest among the laser-trimmed hens. Transportable laser machines are now available, and research should investigate the effectiveness of beak-trimming using the ablative and coagulative lasers used in human medicine. Liquid nitrogen was used to declaw emu toes but was not effective: the claws regrew, and the time and cost involved in the procedure limit the potential of using this process to beak-trim birds.

ATHEROSCLEROSIS, CHOLESTEROL AND EGG - REVIEW -

  • Paik, I.K.;Blair, R.
    • Asian-Australasian Journal of Animal Sciences
    • /
    • v.9 no.1
    • /
    • pp.1-25
    • /
    • 1996
  • The pathogenesis of atherosclerosis cannot be summarized as a single process. The lipid infiltration hypothesis and the endothelial injury hypothesis have been proposed and investigated. Recent developments show that there are many points of potential interaction between them and that they can actually be regarded as two phases of a single, unifying hypothesis. Among the many risk factors of atherosclerosis, plasma homocysteine and lipoprotein(a) draw considerable interest because they are independent indicators of atherogenicity. Triglyceride (TG)-rich lipoproteins (chylomicrons and VLDL) are not considered to be atherogenic, but they are related to the metabolism of HDL cholesterol and indirectly related to coronary heart disease (CHD). LDL can of itself be atherogenic, but the oxidative products of this lipoprotein are more detrimental. HDL cholesterol has been considered to be a favorable cholesterol. The so-called 'causalist view' claims that HDL traps excess cholesterol from cellular membranes and transfers it to TG-rich lipoproteins that are subsequently removed by hepatic receptors. In the so-called 'noncausalist view', HDL does not interfere directly with cholesterol deposition in the arterial wall but instead reflects the metabolism of TG-rich lipoproteins and their conversion to atherogenic remnants. Approximately 70-80% of the human population shows an effective feedback control mechanism in cholesterol homeostasis. The type of dietary fat has a significant effect on lipoprotein cholesterol metabolism and atherosclerosis. Generally, saturated fatty acids elevate and PUFA lower serum cholesterol, whereas MUFA have no specific effect. EPA and DHA inhibit the synthesis of TG, VLDL and LDL, and may have favourable effects on some of the risk factors. Phospholipids, particularly lecithin, have an antiatherosclerotic effect. Essential phospholipids (EPL) may enhance the formation of polyunsaturated cholesteryl ester (CE), which is less sclerotic and more easily dispersed via enhanced hydrolysis of CE in the arterial wall. Also, neutral fecal steroid elimination may be enhanced and cholesterol absorption reduced following EPL treatment. Antioxidants protect lipoproteins from oxidation, and cells from the injury of toxic, oxidized LDL. The rationale for lowering serum cholesterol is the strong association between elevation of plasma or serum cholesterol and CHD. Lowering cholesterol, especially LDL cholesterol, to the target level can be achieved using diet in combination with drug therapy. Information on the link between cholesterol and CHD has decreased egg consumption by 16-25%. Some clinical studies have indicated that dietary cholesterol and egg have a significant hypercholesterolemic effect, while others have indicated no effect. These studies differed in the use of purified cholesterol or cholesterol in eggs, in the range of baseline and challenge cholesterol levels, in the quality and quantity of concomitant dietary fat, in study population demographics and initial serum cholesterol levels, and in clinical settings. The cholesterol content of eggs varies to a certain extent depending on the age, breed and diet of hens. However, the egg yolk cholesterol level is very resistant to change because of the particular mechanism involved in yolk formation. Egg yolk contains a factor or factors responsible for accelerated cholesterol metabolism and excretion compared with crystalline cholesterol. One of these factors could be egg lecithin.
Egg lecithin may not be as effective as soybean lecithin in lowering serum cholesterol, probably due to differences in fatty acid composition. However, egg lecithin may have positive effects in hypercholesterolemia by increasing serum HDL and the excretion of fecal cholesterol. The association of serum cholesterol with egg consumption has been widely studied. When the basal or control diet contained little or no cholesterol, consumption of 1 or 2 eggs daily increased the concentration of plasma cholesterol, whereas the plasma cholesterol of normolipemic persons on a normal diet was not significantly influenced by consuming 2 to 3 eggs daily. At higher levels of egg consumption, the concentration of HDL tends to increase along with that of LDL. There exist hyper- and hypo-responders to dietary (egg) cholesterol. Identifying individuals in both categories would be useful from the point of view of nutrition guidelines. Dietary modification of fatty acid composition has been pursued as a viable method of modifying the fat composition of eggs and adding value to eggs. In many cases beneficial effects of PUFA-enriched eggs have been demonstrated. Generally, consumption of n-3 fatty acid-enriched eggs lowered the concentration of plasma TG and total cholesterol compared with the consumption of regular eggs. Due to the highly oxidizable nature of PUFA, the stability of this fat is essential. The implications of the hepatic lipid accumulation observed in hens fed fish oils should be explored. Nutritional manipulations, such as supplementation with iodine, inhibitors of cholesterol biosynthesis, garlic products, amino acids and high-fibre ingredients, have met with limited success in lowering egg cholesterol.

On the field of domestic studies on Western Art History and Western Art Theory (국내 서양미술사, 서양미술이론 연구 장에 관한 연구)

  • Shim, Sang-Yong
    • The Journal of Art Theory & Practice
    • /
    • no.2
    • /
    • pp.75-120
    • /
    • 2004
  • Studies on western art in Korea have been caught in a dilemma: because of the bounds of time and space, they can deal only with material that has already been arranged according to the 'historical generalizations' of its original contexts. It is not trivial that such conditions affect art studies in Korea. Access to the original texts and to the contexts of their production is so restricted that studies of them are prone to be superficial, and this is not independent of the politics of the Korean art scene. Such factors lie behind Korean art's excessive 'assimilation or accordance' with western art. Domestic studies on western art history and art theory have failed to notice the differences in context, and Korean art has simply mediated or reproduced the restricted information supplied by those studies. The studies on western art in Korea have also been used as a way of justifying one's own academic domain. In this situation, we should steer the study of western art history and western art theory in a more reflective direction and confirm that such studies hold no privileged claim on reality. We should also try to build a scholarship that participates in our life and existence. The field of domestic studies on western art history and western art theory should free itself from the invention of objectivity and the neutrality of mechanical reading, and turn its eyes to the realities of life where events happen. Constantly suggesting which way Korean art and world art should go has to be the field's new coordinates.


The Reliability and Validity Studies of the Korean Version of the Perceived Stress Scale (한글판 스트레스 자각척도의 신뢰도와 타당도 연구)

  • Lee, Jongha;Shin, Cheolmin;Ko, Young-Hoon;Lim, JaeHyung;Joe, Sook-Haeng;Kim, SeungHyun;Jung, In-Kwa;Han, Changsu
    • Korean Journal of Psychosomatic Medicine
    • /
    • v.20 no.2
    • /
    • pp.127-134
    • /
    • 2012
  • Objectives : The Perceived Stress Scale (PSS) is a self-report inventory that estimates the degree of stress an individual perceives in daily life. The aim of this study was to introduce this scale and to test the reliability and validity of the Korean version of the PSS. Methods : A total of 154 female hospital workers were included in this study. Survey questionnaires collected demographic information, and all participants were required to complete the PSS, the Hamilton Anxiety Scale (HAM-A) and the Beck Depression Inventory (BDI). Reliability and validity analyses were conducted and internal consistency was examined. Results : The mean PSS score in this sample was 20.69 ± 4.56. The overall Cronbach's alpha was 0.819, and the test-retest reliability coefficient was 0.66. The PSS had a significant positive correlation with the HAM-A (r=0.49, p<0.01) and the BDI (r=0.55, p<0.01). Factor analysis yielded 2 factors with eigenvalues of 3.924 and 2.608, accounting for 65 percent of the variance. Factor 1 represented "stress" and factor 2 represented "control of stress". Conclusions : This study indicates that the PSS is appropriate for estimating perceived stress levels. These results support the use of the PSS in large sections of the population in Korea.
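The internal consistency figure quoted above follows the standard Cronbach's alpha formula. The sketch below computes it for an item-by-subject score matrix; the item count and responses are synthetic and are not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Standard Cronbach's alpha; `items` is an (n_subjects, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item across subjects
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Synthetic example: 154 respondents, 10 items scored 0-4 (illustrative only).
rng = np.random.default_rng(0)
latent = rng.normal(size=(154, 1))                               # shared "stress" factor
scores = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(154, 10))), 0, 4)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```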


Assessment of Environmental Impacts and CO₂ Emissions from Soil Remediation Technologies using Life Cycle Assessment - Case Studies on SVE and Biopile Systems - (전과정평가(LCA)에 의한 토양오염 정화공정의 환경영향분석 및 CO₂ 배출량 산정 - SVE 및 Biopile 시스템 중심으로 -)

  • Jeong, Seung-Woo;Suh, Sang-Won
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.33 no.4
    • /
    • pp.267-274
    • /
    • 2011
  • The environmental impacts of 95% remediation of a total petroleum hydrocarbon-contaminated soil were evaluated using life cycle assessment (LCA). LCA of two remediation systems, soil vapor extraction (SVE) and biopile, was conducted using the input materials and energy listed in a remedial system standardization report. Life cycle impact assessment (LCIA) results showed that the environmental impacts of SVE were all higher than those of biopile. Four prominent environmental impacts, human toxicity via soil, aquatic ecotoxicity, human toxicity via surface water and human toxicity via air, were apparent in the LCIA results for both remedial systems. Human toxicity via soil was the dominant impact of SVE, while aquatic ecotoxicity was the dominant impact of biopile. This study also showed that the operation stage and the activated carbon replacement stage contributed 60% and 36% of the environmental impacts of the SVE system, respectively. The major input affecting the environmental impact of SVE was electricity. The operation stage of biopile made the highest contribution to its overall environmental impact, and the key input affecting the environmental impact of biopile was also electricity. This study suggests that electricity reduction strategies should be pursued at contaminated-soil remediation sites to achieve lower environmental impacts. Remediation of contaminated soil normally takes a long time and thus requires a great deal of material and energy. More extensive life cycle research on remedial systems is required to meet recent national challenges toward carbon dioxide reduction and green growth. Furthermore, systematic information on the electricity use of remedial systems should be collected for reliable assessment of environmental impacts and carbon dioxide emissions during soil remediation.
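The stage-by-stage contributions reported above come from the usual LCIA aggregation, in which each inventory flow is multiplied by a characterization factor and summed per impact category. The sketch below illustrates that bookkeeping with placeholder flows and factors; none of the numbers are taken from the study.

```python
# Minimal LCIA aggregation sketch: impact[category] = sum_i(amount_i * CF_i,category),
# grouped by life-cycle stage. All amounts and characterization factors are placeholders.
inventory = [
    # (stage, flow, amount) -- amounts in arbitrary units
    ("operation", "electricity_kWh", 1000.0),
    ("carbon_replacement", "activated_carbon_kg", 50.0),
    ("installation", "steel_kg", 20.0),
]
characterization = {
    # flow -> {impact category: characterization factor}
    "electricity_kWh": {"human_toxicity_soil": 0.8, "aquatic_ecotoxicity": 0.3},
    "activated_carbon_kg": {"human_toxicity_soil": 2.0, "aquatic_ecotoxicity": 0.5},
    "steel_kg": {"human_toxicity_soil": 0.4, "aquatic_ecotoxicity": 0.1},
}

stage_impacts = {}
for stage, flow, amount in inventory:
    for category, cf in characterization[flow].items():
        stage_impacts.setdefault(stage, {}).setdefault(category, 0.0)
        stage_impacts[stage][category] += amount * cf   # flow amount times characterization factor

for stage, impacts in stage_impacts.items():
    print(stage, impacts)   # compare stage contributions, as the study does for SVE and biopile
```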

Separation of Nanomaterials Using Flow Field-Flow Fractionation (흐름 장-흐름 분획기를 이용한 나노물질의 분리)

  • Kim, Sung-Hee;Lee, Woo-Chun;Kim, Soon-Oh;Na, So-Young;Kim, Hyun-A;Lee, Byung-Tae;Lee, Byoung-Cheun;Eom, Ig-Chun
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.35 no.11
    • /
    • pp.835-860
    • /
    • 2013
  • Recently, the consumption of nanomaterials has increased significantly in both industrial and commercial sectors as a result of steady advances in nano-technologies. This ubiquitous use of nanomaterials has raised concern that their release into the environment may have detrimental effects on human health as well as natural ecosystems, and it is therefore necessary to characterize their behavior in various environmental media and to evaluate their ecotoxicity. To accomplish those assessments, methods to effectively separate nanomaterials from diverse media and to quantify their properties must first be developed. Among the separation techniques developed so far, this study focuses on Field-Flow Fractionation (FFF) because of its strengths, such as relatively little disturbance of samples and simple pretreatment, and we review overseas and domestic literature on the separation of nanomaterials using the FFF technique. In particular, research with Flow Field-Flow Fractionation (FlFFF) is highlighted because it is the most frequently applied FFF technique. The basic principle of FlFFF is briefly introduced, and the studies conducted so far are classified and examined according to the type of target nanomaterial, in order to furnish practical data and information for researchers working in this field. The literature review suggests that operational conditions, such as pretreatment, selection of membrane and carrier solution, and the rate (velocity) of each flow, should be optimized in order to effectively separate nanomaterials from various matrices using the FFF technique. Moreover, coupling or hyphenating the FFF with several detectors and analyzers appears to be a prerequisite for quantifying particle properties after separation. However, applications to date have been restricted in the types of target nanomaterials and environmental media covered. Furthermore, domestic literature data on both the separation and the characterization of nanomaterials are extremely limited. Taking into account the overwhelming increase in the consumption of nanomaterials, efforts in this area are urgently needed.
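For readers unfamiliar with FlFFF sizing, the sketch below applies a commonly quoted textbook retention relation together with the Stokes-Einstein equation to convert a retention time into a hydrodynamic diameter. The relation, the channel geometry and the flow rates are textbook-style assumptions for illustration, not values or methods taken from the reviewed studies.

```python
import math

# Textbook FlFFF sizing sketch (not from the paper). A commonly quoted retention
# relation for well-retained analytes is
#     t_r ≈ (w^2 / 6D) * ln(1 + Vc / Vout),
# and the Stokes-Einstein relation D = kT / (3*pi*eta*d) links the diffusion
# coefficient D to the hydrodynamic diameter d. Channel parameters below are assumptions.
k_B = 1.380649e-23   # Boltzmann constant, J/K

def hydrodynamic_diameter(t_r, w, Vc, Vout, T=298.15, eta=8.9e-4):
    """Estimate hydrodynamic diameter (m) from FlFFF retention time t_r (s)."""
    D = (w ** 2) * math.log(1.0 + Vc / Vout) / (6.0 * t_r)   # diffusion coefficient, m^2/s
    return k_B * T / (3.0 * math.pi * eta * D)               # Stokes-Einstein diameter, m

# Assumed channel: thickness 350 um, cross-flow 0.5 mL/min, outlet flow 1.0 mL/min
# (only the flow ratio enters the formula, so the units cancel).
d = hydrodynamic_diameter(t_r=600.0, w=350e-6, Vc=0.5, Vout=1.0)
print(f"estimated hydrodynamic diameter ≈ {d * 1e9:.1f} nm")   # ~36 nm for these inputs
```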

A study on the flow of technological convergence in webtoons - Focused on the interaction toon Encountered (기술 융합형 웹툰의 몰입도 연구 -인터랙션 툰 <마주쳤다>를 중심으로)

  • Baek, Eun-Ji;Son, Ki-Hwan
    • Cartoon and Animation Studies
    • /
    • s.50
    • /
    • pp.101-130
    • /
    • 2018
  • Since the advent of smart devices, the smartphone has become a popular tool for viewing webtoons. This phenomenon has driven the convergence of cutting-edge technologies and webtoons in diverse forms, creating unique versions of webtoons including, but not limited to, the Smart-toon, Effect-toon, Cut-toon, Dubbing-toon, Moving-toon, AR-toon, VR-toon, and Interaction-toon. In comparison with this rich diversity of webtoons in the online industry, there is a lack of academic research on the topic. Some papers discuss the different types of multimedia technology convergence and present cases, or examine the effectiveness and problems of visual effects, but the effects of these convergence technologies on comic readers' concentration and reading effectiveness have not yet been investigated. Therefore, this paper discusses the unique methods of immersive storytelling often used in comics and analyzes each aspect of immersion in technology-converged webtoons, along with its problems. Furthermore, this paper analyzes the aspects of "immersion" and the interaction elements found in the popular interaction toon Encountered. Through this, the paper discusses the positive influence of the interaction elements on comic readers' immersion and their limitations. Classifying technology-converged webtoons by immersion level: the Effect-toon sometimes interferes with the viewer's flow through excessive use of multimedia effects, creating information overload. The Smart-toon, which applies motion to each frame in the smartphone's horizontal mode, was a good attempt, but it hindered readers' activeness and made it hard for them to be fully absorbed in the story. The VR-toon, which uses virtual reality gadgets to let viewers explore the world of the webtoon, was another attempt to overcome the limitations of vertical screens; however, it often dispersed users' attention and reduced their immersion. The Moving-toon, which emphasized only reading convenience, also restricted readers' activeness and disturbed their concentration. On the other hand, the cartoonist Il-Kwon Ha applied advanced technologies such as face recognition, augmented reality, 360-degree panorama and haptic technology to his cartoon Encountered. This allowed readers to form a sense of closeness to the cartoon characters, to identify themselves with the characters and to interact with them; in this way, readers can be fully immersed in the story. However, technology abuse, impractical production and the hackneyed storylines that appear later in the story remain its limitations.

THE CURRENT STATUS OF BIOMEDICAL ENGINEERING IN THE USA

  • Webster, John G.
    • Proceedings of the KOSOMBE Conference
    • /
    • v.1992 no.05
    • /
    • pp.27-47
    • /
    • 1992
  • Engineers have developed new instruments that aid in diagnosis and therapy. Ultrasonic imaging has provided a nondamaging method of imaging internal organs. A complex transducer emits ultrasonic waves at many angles and reconstructs a map of internal anatomy and also the velocities of blood in vessels. Fast computed tomography permits reconstruction of the 3-dimensional anatomy and perfusion of the heart at 20-Hz rates. Positron emission tomography uses certain isotopes that produce positrons that react with electrons to simultaneously emit two gamma rays in opposite directions. It locates the region of origin by using a ring of discrete scintillation detectors, each in electronic coincidence with an opposing detector. In magnetic resonance imaging, the patient is placed in a very strong magnetic field. The precession of the hydrogen atoms is perturbed by an interrogating field to yield two-dimensional images of soft tissue having exceptional clarity. As an alternative to radiology image processing, film archiving, and retrieval, picture archiving and communication systems (PACS) are being implemented. Images from computed radiography, magnetic resonance imaging (MRI), nuclear medicine, and ultrasound are digitized, transmitted, and stored in computers for retrieval at distributed workstations. In electrical impedance tomography, electrodes are placed around the thorax. A 50-kHz current is injected between two electrodes and voltages are measured on all other electrodes. A computer processes the data to yield an image of the resistivity of a 2-dimensional slice of the thorax. During fetal monitoring, a corkscrew electrode is screwed into the fetal scalp to measure the fetal electrocardiogram. Correlations with uterine contractions yield information on the status of the fetus during delivery. To measure cardiac output by thermodilution, cold saline is injected into the right atrium. A thermistor in the right pulmonary artery yields temperature measurements, from which we can calculate cardiac output. In impedance cardiography, we measure the changes in electrical impedance as the heart ejects blood into the arteries. Motion artifacts are large, so signal averaging is useful during monitoring. An intraarterial blood gas monitoring system permits monitoring in real time. Light is sent down optical fibers inserted into the radial artery, where it is absorbed by dyes, which reemit the light at a different wavelength. The emitted light travels up the optical fibers to an external instrument that determines O2, CO2, and pH. Therapeutic devices include the electrosurgical unit. A high-frequency electric arc is drawn between the knife and the tissue. The arc cuts and the heat coagulates, thus preventing blood loss. Hyperthermia has demonstrated antitumor effects in patients in whom all conventional modes of therapy have failed. Methods of raising tumor temperature include focused ultrasound, radio-frequency power through needles, or microwaves. When the heart stops pumping, we use the defibrillator to restore normal pumping. A brief, high-current pulse through the heart synchronizes all cardiac fibers to restore normal rhythm. When the cardiac rhythm is too slow, we implant the cardiac pacemaker. An electrode within the heart stimulates the cardiac muscle to contract at the normal rate. When the cardiac valves are narrowed or leak, we implant an artificial valve. Silicone rubber and Teflon are used for biocompatibility. Artificial hearts powered by pneumatic hoses have been implanted in humans.
However, the quality of life gradually degrades, and death ensues. When kidney stones develop, lithotripsy is used. A spark creates a pressure wave, which is focused on the stone and fragments it. The pieces pass out normally. When the kidneys fail, the blood is cleansed during hemodialysis. Urea passes through a porous membrane to a dialysate bath to lower its concentration in the blood. The blind are able to read by scanning the Optacon with their fingertips. A camera scans letters and converts them to an array of vibrating pins. The deaf are able to hear using a cochlear implant. A microphone detects sound and divides it into frequency bands. Twenty-two electrodes within the cochlea stimulate the acoustic nerve to provide sound patterns. For those who have lost muscle function in the limbs, researchers are implanting electrodes to stimulate the muscles. Sensors in the legs and arms feed back signals to a computer that coordinates the stimulators to provide limb motion. For those with high spinal cord injury, a puff-and-sip switch can control a computer and permit the disabled person to operate the computer and communicate with the outside world.
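The thermodilution passage above can be made concrete with the textbook Stewart-Hamilton relation, which divides the injectate's heat deficit by the area under the blood-temperature deflection curve. The sketch below uses a synthetic temperature curve and an assumed computation constant; it is illustrative only and not tied to any particular monitor.

```python
import numpy as np

# Stewart-Hamilton thermodilution sketch (textbook relation, not from the talk):
#     CO = V_i * (T_blood - T_injectate) * K / integral(dT_blood(t) dt)
# where K is a computation constant (injectate/blood density and specific heat,
# catheter dead space). All numbers below are synthetic for illustration.
V_i = 10.0                  # injectate volume, mL
T_blood, T_inj = 37.0, 4.0  # baseline blood and injectate temperature, degC
K = 1.0                     # assumed computation constant (device specific in practice)

t = np.linspace(0.0, 30.0, 600)                        # time, s
delta_T = 0.35 * (t / 4.0) * np.exp(1.0 - t / 4.0)     # synthetic dilution curve, degC

# Trapezoidal area under the temperature deflection, degC * s
area = float(np.sum((delta_T[1:] + delta_T[:-1]) * np.diff(t)) / 2.0)
co_mL_per_s = V_i * (T_blood - T_inj) * K / area
print(f"cardiac output ≈ {co_mL_per_s * 60 / 1000:.2f} L/min")   # ~5 L/min for these inputs
```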


The Efficient Merge Operation in Log Buffer-Based Flash Translation Layer for Enhanced Random Writing (임의쓰기 성능향상을 위한 로그블록 기반 FTL의 효율적인 합병연산)

  • Lee, Jun-Hyuk;Roh, Hong-Chan;Park, Sang-Hyun
    • The KIPS Transactions:PartD
    • /
    • v.19D no.2
    • /
    • pp.161-186
    • /
    • 2012
  • Recently, flash memory has steadily grown in capacity while becoming cheaper, which has made mass-storage SSDs (Solid State Drives) popular. Flash memory, however, has several limitations, and a special layer called the FTL (Flash Translation Layer) is needed to compensate for them. To handle the hardware's restrictions efficiently, the FTL maps the logical sector numbers used by the file system to the physical sector numbers of the flash memory. In particular, poor performance is attributed to the erase-before-write restriction, and although there have been many log-block-based studies, several problems remain for mass-storage flash memory. In FAST, a log-block-based FTL, random writes with wide locality frequently trigger merge operations because many sectors in the data block remain unused; in other words, ineffective block thrashing occurs and the flash memory's performance degrades. When overwrites are absorbed in the log block, the log block acts like a cache, and this technique improves flash memory performance. To improve random write performance, this study operates log blocks not only as a cache but across the entire flash memory, so that merge and erase operations are reduced by means of a dedicated mapping table, called the offset mapping table. The new FTL is named XAST (extensively-Associative Sector Translation). XAST manages the offset mapping table efficiently by exploiting spatial and temporal locality.
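The offset mapping table described in this abstract can be pictured with a small sketch. The Python fragment below models a hypothetical log-block FTL that records, per logical sector offset, which log-block page holds the latest copy, so reads are redirected without triggering a merge until the log block fills; the class layout and the simple merge trigger are illustrative assumptions, not the paper's actual XAST design.

```python
class LogBlockFTLSketch:
    """Hypothetical log-block FTL with a per-data-block offset mapping table."""

    def __init__(self, pages_per_block):
        self.pages_per_block = pages_per_block
        self.offset_map = {}      # data block id -> {sector offset -> log page index}
        self.next_free_page = {}  # data block id -> next free page in its log block

    def write(self, data_block, offset):
        """Absorb an overwrite in the log block and remember where it landed."""
        free = self.next_free_page.get(data_block, 0)
        if free >= self.pages_per_block:
            self.merge(data_block)              # log block full: fall back to a merge
            free = 0
        self.offset_map.setdefault(data_block, {})[offset] = free
        self.next_free_page[data_block] = free + 1

    def read(self, data_block, offset):
        """Return where the latest copy of the sector lives: log block or data block."""
        page = self.offset_map.get(data_block, {}).get(offset)
        if page is not None:
            return ("log_block", page)          # served from the log block, no merge needed
        return ("data_block", offset)           # unmodified sector stays in place

    def merge(self, data_block):
        """Copy valid sectors back to a fresh data block and erase the log block (elided)."""
        self.offset_map.pop(data_block, None)
        self.next_free_page.pop(data_block, None)
```

Repeated overwrites of the same offsets are simply redirected in the table, which is how a log block can behave like a cache and defer merge and erase operations.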