• Title/Summary/Keyword: SPEED


Effect of Plant Growth Regulator Treatments on the Growth and Lateral Root Formation in Soybean Sprouts - I. Effect of Plant Growth Regulator Treatments on the Growth in Soybean Sprouts (생장조절물질(生長調節物質) 처리(處理)가 콩나물의 생육(生育) 및 세근발생(細根發生)에 미치는 영향(影響) - I. 생장조절물질(生長調節物質)의 단용(單用) 및 혼용처리(混用處理)가 콩나물의 생육(生育)에 미치는 효과(效果))

  • Kang, C.K.;Lee, J.M.;Saka, H.
    • Korean Journal of Weed Science
    • /
    • v.9 no.1
    • /
    • pp.56-68
    • /
    • 1989
  • A series of experiments was conducted to investigate the effect of plant growth regulator treatments on the growth and lateral root formation in soybean sprouts, in order to establish an effective method of producing rootless or short-rooted soybean sprouts with a larger hypocotyl diameter. The major results can be summarized as follows. 1. Soybean sprouts showed a fairly uniform elongation rate from 3 to 9 days after imbibition, with a daily increase of 3.8 cm. The speed of hypocotyl elongation was reduced, whereas that of the root accelerated, 7 days after imbibition. Lateral roots began to emerge fairly evenly from 5 to 9 days after imbibition, with a daily increase of 4.4. 2. Auxins (IAA, IBA, NAA, 2,4-D) at higher concentrations inhibited hypocotyl elongation and the formation of lateral roots and increased hypocotyl diameter without influencing root length and hook diameter. The dry weight of the cotyledon was increased significantly compared to that of the hypocotyl and root. Among the tested auxins, 2,4-D was the most effective. 3. BA and 4PU-30, when applied at higher concentrations, significantly reduced elongation of the hypocotyl and root and resulted in the largest hypocotyl diameter. The lowest effective concentration of BA to prevent the formation of lateral roots was 12.5 ppm. The formation of lateral roots could be completely prevented by BA and 4PU-30 treatment, whereas kinetin, zeatin, and zeatin riboside resulted in many lateral roots and had little influence on the thickness of the soybean sprouts. Cotyledon deformation was found in soybean sprouts treated with 4PU-30. 4. 2,4-D was the most effective for increasing hypocotyl diameter, while 4PU-30 was the most effective for reducing the number of lateral roots. 5. It can be concluded that, among the plant growth regulators tested, BA was effective in reducing root length and increasing hypocotyl diameter. BA at 12.5 ppm or 15 ppm may thus be more practical for production of soybean sprouts. 6. ABA showed no significant effect on the growth parameters; however, ABA at 25 ppm inhibited only the number of lateral roots, with little influence on seedling growth. 7. Ethephon at higher concentrations inhibited the elongation of the hypocotyl and root and increased hypocotyl diameter. 8. The combined effect of cytokinins and ethephon was very similar to that of BA treatment alone. As the ethephon concentration increased, hypocotyl diameter and the dry weight of the cotyledon tended to increase.


The Role of the Soft Law for Space Debris Mitigation in International Law (국제법상 우주폐기물감축 연성법의 역할에 관한 연구)

  • Kim, Han-Taek
    • The Korean Journal of Air & Space Law and Policy
    • /
    • v.30 no.2
    • /
    • pp.469-497
    • /
    • 2015
  • In 2009, Iridium 33, a satellite owned by the American Iridium Communications Inc., and Kosmos-2251, a satellite owned by the Russian Space Forces, collided at a speed of 42,120 km/h at an altitude of 789 kilometers above the Taymyr Peninsula in Siberia. NASA estimated that the satellite collision had created approximately 1,000 pieces of debris larger than 10 centimeters, in addition to many smaller ones. By July 2011, the U.S. Space Surveillance Network (SSN) had catalogued over 2,000 large debris fragments. On January 11, 2007, China conducted a test of its anti-satellite missile. A Chinese weather satellite, the FY-1C polar orbit satellite, was destroyed by the missile, which was launched using a multistage solid-fuel rocket. The test was unprecedented for having created a record amount of debris: at least 2,317 pieces of trackable size (i.e. of golf ball size or larger) and an estimated 150,000 particles were generated as a result. As far as the Space Treaties such as the 1967 Outer Space Treaty, 1968 Rescue Agreement, 1972 Liability Convention, 1975 Registration Convention and 1979 Moon Agreement are concerned, few provisions addressing the space environment and debris in space can be found. In the early years of space exploration, dating back to the late 1950s, the focus of international law was on the establishment of a basic set of rules on the activities undertaken by various states in outer space. Consequently, environmental issues, including those of space debris, did not receive the priority they deserve when international space law was originally drafted. As shown in the 1978 "Cosmos 954 Incident" between Canada and the USSR, the two parties settled the matter by a memorandum between the two nations, not under the Space Treaties to which they are parties. In 1994 the 66th conference of the International Law Association (ILA) adopted the "International Instrument on the Protection of the Environment from Damage Caused by Space Debris". The Inter-Agency Space Debris Coordination Committee (IADC) issued guidelines on space debris which became the basis of the "UN Space Debris Mitigation Guidelines" approved by the Committee on the Peaceful Uses of Outer Space (COPUOS) at its 527th meeting. On December 21, 2007, these guidelines were approved by UNGA Resolution 62/217. The EU has proposed an "International Code of Conduct for Outer Space Activities" as a transparency and confidence-building measure. It was only in 2010 that the Scientific and Technical Subcommittee began considering the long-term sustainability of outer space as an agenda item. A Working Group on the Long-term Sustainability of Outer Space Activities was established, the objectives of which include identifying areas of concern for the long-term sustainability of outer space activities, proposing measures that could enhance sustainability, and producing voluntary guidelines to reduce risks to long-term sustainability. Through this effort, "Guidelines on the Long-term Sustainability of Outer Space Activities" are under consideration. In the case of the "Declaration of Legal Principles Governing the Activities of States in the Exploration and Use of Outer Space", adopted by UNGA Resolution 1962(XVIII) of December 13, 1963, the nine principles proclaimed in that Declaration, although all of them were later incorporated in the Space Treaties, could be regarded as customary international law binding all states, considering the time elapsed and the opinio juris evidenced by the responses of the world.
Although soft law instruments such as resolutions and guidelines are not binding law, some of their provisions have a fundamentally norm-creating character and reflect customary international law. On November 12, 1974, the UN General Assembly recalled, through Resolution 3232(XXIX), "Review of the role of the International Court of Justice", that the development of international law may be reflected, inter alia, by the declarations and resolutions of the General Assembly, which may to that extent be taken into consideration in the judgments of the International Court of Justice. We expect that COPUOS, which gave birth to the five Space Treaties, could in the near future give us binding space debris mitigation measures to be implemented, based on the existing space debris mitigation soft law.

THE EFFECTS OF THE PLATELET-DERIVED GROWTH FACTOR-BB ON THE PERIODONTAL TISSUE REGENERATION OF THE FURCATION INVOLVEMENT OF DOGS (혈소판유래성장인자-BB가 성견 치근이개부병변의 조직재생에 미치는 효과)

  • Cho, Moo-Hyun;Park, Kwang-Beom;Park, Joon-Bong
    • Journal of Periodontal and Implant Science
    • /
    • v.23 no.3
    • /
    • pp.535-563
    • /
    • 1993
  • New techniques for regenerating destroyed periodontal tissue have been studied for many years. Currently accepted methods of promoting periodontal regeneration are based on removal of diseased soft tissue, root treatment, guided tissue regeneration, graft materials, and biological mediators. Platelet-derived growth factor (PDGF) is a polypeptide growth factor. PDGF has been reported to be a biological mediator that regulates activities of the wound healing process, including cell proliferation, migration, and metabolism. The purpose of this study was to evaluate the possibility of using PDGF as a regeneration-promoting agent for furcation involvement defects. Eight adult mongrel dogs were used in this experiment. The dogs were anesthetized with pentobarbital sodium (25-30 mg/kg of body weight, Tokyo Chemical Co., Japan), and conventional periodontal prophylaxis was performed with an ultrasonic scaler. With intrasulcular and crestal incisions, a mucoperiosteal flap was elevated. Following decortication with a 1/2 high-speed round bur, a degree III furcation defect was made on the mandibular second (P2) and fourth (P4) premolars. For the basic treatment of the root surface, fully saturated citric acid was applied to the exposed root surface for 3 minutes. On the right P4, 20 µg of human recombinant PDGF-BB dissolved in acetic acid was applied with a polypropylene autopipette. On the left P2 and right P2, PDGF-BB was applied after insertion of β-tricalcium phosphate (TCP) and collagen (Collatape), respectively. The left mandibular P4 was used as a control. Systemic antibiotics (penicillin-G benzathine and penicillin-G procaine, 1 ml per 10-25 lbs body weight) were administered intramuscularly for 2 weeks after surgery. Irrigation with 0.1% chlorhexidine gluconate around the operated sites was performed during the whole experimental period except the day immediately after surgery. A soft diet was fed throughout the experimental period. After 2, 4, 8, and 12 weeks, the animals were sacrificed by the perfusion technique. Tissue blocks including the teeth were excised and prepared for light microscopy with H-E staining. At 2 weeks after surgery, rapid osteogenesis was observed in the defect area of the PDGF-only group, and an early trabeculation pattern was formed with new osteoid tissue produced by activated osteoblasts. Bone formation was almost complete up to the fornix of the furcation by 8 weeks after surgery. New cementum formation was observed from 2 weeks after surgery, and its thickness increased until 8 weeks, with typical Sharpey's fibers re-embedded into new bone and cementum. In both the PDGF-BB with TCP group and the PDGF-BB with collagen group, the regeneration process, including new bone and new cementum formation, was delayed compared with the PDGF-only group, especially in the early weeks. It might be thought that the migration of actively proliferating cells was hindered by the graft materials. In conclusion, platelet-derived growth factor can promote rapid osteogenesis during the early stage of periodontal tissue regeneration.


Evaluating Reverse Logistics Networks with Centralized Centers : Hybrid Genetic Algorithm Approach (집중형센터를 가진 역물류네트워크 평가 : 혼합형 유전알고리즘 접근법)

  • Yun, YoungSu
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.4
    • /
    • pp.55-79
    • /
    • 2013
  • In this paper, we propose a hybrid genetic algorithm (HGA) approach to effectively solve the reverse logistics network with centralized centers (RLNCC). In the proposed HGA approach, a genetic algorithm (GA) is used as the main algorithm. For implementing the GA, a new bit-string representation scheme using 0 and 1 values is suggested, which can easily produce the initial population of the GA. As genetic operators, the elitist strategy in enlarged sampling space developed by Gen and Chang (1997), a new two-point crossover operator, and a new random mutation operator are used for selection, crossover, and mutation, respectively. For the hybrid concept of the GA, an iterative hill climbing method (IHCM) developed by Michalewicz (1994) is inserted into the HGA search loop. The IHCM is a local search technique that precisely explores the space to which the GA search has converged. The RLNCC is composed of collection centers, remanufacturing centers, redistribution centers, and secondary markets in reverse logistics networks. Of these centers and secondary markets, only one collection center, remanufacturing center, redistribution center, and secondary market should be opened in the reverse logistics network. Some assumptions are considered for effectively implementing the RLNCC. The RLNCC is represented by a mixed integer programming (MIP) model using indexes, parameters, and decision variables. The objective function of the MIP model is to minimize the total cost, which consists of transportation cost, fixed cost, and handling cost. The transportation cost is incurred by transporting the returned products between the centers and secondary markets. The fixed cost is determined by the opening or closing decision at each center and secondary market. That is, if there are three collection centers (the opening costs of collection centers 1, 2, and 3 being 10.5, 12.1, and 8.9, respectively) and collection center 1 is opened while the others are closed, then the fixed cost is 10.5. The handling cost is the cost of treating the products returned from customers at each center and secondary market opened at each RLNCC stage. The RLNCC is solved by the proposed HGA approach. In the numerical experiment, the proposed HGA and a conventional competing approach are compared with each other using various measures of performance. For the conventional competing approach, the GA approach by Yun (2013) is used. The GA approach does not include any local search technique such as the IHCM used in the proposed HGA approach. As measures of performance, CPU time, optimal solution, and optimal setting are used. Two types of the RLNCC with different numbers of customers, collection centers, remanufacturing centers, redistribution centers, and secondary markets are presented for comparing the performance of the HGA and GA approaches. The MIP models using the two types of the RLNCC were programmed in Visual Basic Version 6.0, and the computing environment was an IBM-compatible PC with a 3.06 GHz CPU and 1 GB RAM on Windows XP. The parameters used in the HGA and GA approaches are a total number of generations of 10,000, a population size of 20, a crossover rate of 0.5, a mutation rate of 0.1, and a search range of 2.0 for the IHCM. A total of 20 iterations were made to eliminate the randomness of the searches of the HGA and GA approaches.
With performance comparisons, network representations by opening/closing decisions, and convergence processes using the two types of the RLNCC, the experimental results show that the HGA has significantly better performance in terms of the optimal solution than the GA, although the GA is slightly quicker than the HGA in terms of CPU time. Finally, it has been shown that the proposed HGA approach is more efficient than the conventional GA approach for the two types of the RLNCC, since the former has a GA search process as well as a local search process as an additional search scheme, while the latter has a GA search process alone. For future study, much larger RLNCCs will be tested to verify the robustness of our approach.
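As a rough illustration of the hybrid scheme summarized above (a GA whose best individual is refined each generation by an iterative hill-climbing local search), the following Python sketch is a minimal, hypothetical example: the 0/1 encoding, the toy cost function with one open facility per 3-site group, and all parameter values are placeholder assumptions, not the paper's actual MIP model or operator definitions.

```python
import random

# Minimal hybrid-GA sketch: a plain GA loop (elitist selection, two-point
# crossover, bit-flip mutation) whose best individual is refined each
# generation by a simple iterative hill-climbing step.
random.seed(0)
N_BITS, POP_SIZE, GENERATIONS = 12, 20, 100
CX_RATE, MUT_RATE = 0.5, 0.1
FIXED_COST = [random.uniform(5, 15) for _ in range(N_BITS)]  # toy opening costs

def cost(bits):
    # Toy objective: total opening cost, penalized whenever a 3-bit group
    # (one facility type) does not open exactly one site.
    total = sum(c for b, c in zip(bits, FIXED_COST) if b)
    for g in range(0, N_BITS, 3):
        if sum(bits[g:g + 3]) != 1:
            total += 100.0
    return total

def two_point_crossover(p1, p2):
    i, j = sorted(random.sample(range(N_BITS), 2))
    return p1[:i] + p2[i:j] + p1[j:], p2[:i] + p1[i:j] + p2[j:]

def mutate(bits):
    return [1 - b if random.random() < MUT_RATE else b for b in bits]

def hill_climb(bits, steps=20):
    # Iterative hill climbing: accept single-bit flips that lower the cost.
    best, best_cost = bits[:], cost(bits)
    for _ in range(steps):
        cand = best[:]
        k = random.randrange(N_BITS)
        cand[k] = 1 - cand[k]
        if cost(cand) < best_cost:
            best, best_cost = cand, cost(cand)
    return best

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop.sort(key=cost)
    pop[0] = hill_climb(pop[0])            # local search on the elite individual
    elite = pop[:POP_SIZE // 2]            # simple elitist selection
    children = []
    while len(children) < POP_SIZE - len(elite):
        p1, p2 = random.sample(elite, 2)
        if random.random() < CX_RATE:
            c1, c2 = two_point_crossover(p1, p2)
        else:
            c1, c2 = p1[:], p2[:]
        children += [mutate(c1), mutate(c2)]
    pop = elite + children[:POP_SIZE - len(elite)]

best = min(pop, key=cost)
print("best opening pattern:", best, "cost:", round(cost(best), 2))
```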

A comparative study on the correlation between Korean foods and the fractures of PFG and all-ceramic crowns for posterior applications (구치용 도재소부금관과 전부도재관에 파절을 일으키는 한국음식에 관한 연구)

  • Kim, Jeong-Ho;Lee, Jai-Bong
    • The Journal of Korean Academy of Prosthodontics
    • /
    • v.47 no.2
    • /
    • pp.156-163
    • /
    • 2009
  • Statement of problem: Recently, there have been increased esthetic demands for posterior dental restorations. Failure of posterior dental ceramic restorations can be caused not only by the characteristics of the component materials but also by the type of food. Purpose: The aim of this research was to compare the in vitro fracture resistance of simulated first molar crowns fabricated using 4 dental ceramic systems (full-porcelain-occlusal-surfaced PFG, half-porcelain-occlusal-surfaced PFG, Empress 2, and Ice Zirkon) against selected Korean foods. Material and methods: Eighty axisymmetric crowns of each system were fabricated to fit a preparation with 1.5- to 2.0-mm occlusal reduction. The center of the occlusal surface on each of 15 specimens per ceramic system was axially loaded to fracture in an Instron 4465, and the maximum load (N) was recorded. Afterwards, selected Korean food specimens (boiled crab, boiled chicken with bone, boiled beef rib, dried squid, dried anchovy, round candy, walnut shell) were prepared. Fifteen specimens of each food were placed under the Instron and their maximum fracture loads were recorded. The 95% confidence intervals of the characteristic failure load were compared between the dental ceramic systems and the Korean foods. Afterwards, on the basis of the previous results, a 14 Hz cyclic load was applied to the 4 systems of dental ceramic restorations in an MTS. The results were analyzed by analysis of variance and post hoc tests. Results: The 95% confidence intervals for the mean fracture load were: 1. full-porcelain-occlusal-surfaced PFG crown: 2599.3 to 2809.1 N; 2. half-porcelain-occlusal-surfaced PFG crown: 3689.4 to 3819.8 N; 3. Ice Zirkon crown: 1501.2 to 1867.9 N; 4. Empress 2 crown: 803.2 to 1188.5 N; 5. boiled crab: 294.1 to 367.9 N; 6. boiled chicken with bone: 357.1 to 408.6 N; 7. boiled beef rib: 4077.7 to 4356.0 N; 8. dried squid: 147.5 to 190.5 N; 9. dried anchovy: 35.6 to 46.5 N; 10. round candy: 1900.5 to 2615.8 N; 11. walnut shell: 85.7 to 373.1 N. Under cyclic load (14 Hz) in the MTS, the fracture loads and masticatory cycles were: 1. the full-porcelain-occlusal-surfaced PFG crown fractured at a 95% confidence interval of 4796.8-9321.2 cycles under a 2224.8 N (round candy) load, with no fracture under smaller loads; 2. the half-porcelain-occlusal-surfaced PFG crown fractured at a 95% confidence interval of 881705.1-1143565.7 cycles under 2224.8 N (round candy), with no fracture under smaller loads; 3. the Ice Zirkon crown fractured at a 95% confidence interval of 979993.0-1145773.4 cycles under 382.9 N (boiled chicken with bone), with no fracture under smaller loads; 4. the Empress 2 crown fractured at a 95% confidence interval of 564.1-954.7 cycles under 382.9 N (boiled chicken with bone), with no fracture under smaller loads. Conclusion: There was a significant difference in fracture resistance between the experimental groups. Under a single load, the Korean foods that can cause fracture of the dental ceramic restorations are boiled beef rib and round candy. Even if there is no fracture under a single load, cyclic dynamic loading can fracture posterior dental ceramic crowns. Experimental data with a 14 Hz dynamic cyclic load were obtained as follows: 1. the PFG crown (full porcelain occlusion) failed after a mean of 0.03 years under the fracture load for round candy (2224.8 N); 2. the PFG crown (half porcelain occlusion) failed after a mean of 4.1 years under the fracture load for round candy (2224.8 N); 3. the Ice Zirkon crown failed after a mean of 4.3 years under the fracture load for boiled chicken with bone (382.9 N); 4. the Empress 2 crown failed after a mean of 0.003 years under the fracture load for boiled chicken with bone (382.9 N).
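The conversion from fatigue cycles to years in the cyclic-load results above is not stated explicitly; back-calculating from the reported figures suggests an assumed masticatory rate of roughly 250,000 chewing cycles per year, a commonly cited clinical estimate. The short sketch below only reproduces that arithmetic; the cycles-per-year constant is an inferred assumption, not a value given in the abstract.

```python
# Back-of-envelope check of the cycles-to-years conversion implied above.
# CYCLES_PER_YEAR is an assumed masticatory rate, not stated in the abstract.
CYCLES_PER_YEAR = 250_000

reported = [
    ("half-porcelain PFG @ 2224.8 N", 881_705.1, 1_143_565.7),  # reported ~4.1 years
    ("Ice Zirkon @ 382.9 N",          979_993.0, 1_145_773.4),  # reported ~4.3 years
]
for label, lo, hi in reported:
    mean_cycles = (lo + hi) / 2
    print(f"{label}: ~{mean_cycles / CYCLES_PER_YEAR:.1f} years")
```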

DC Resistivity method to image the underground structure beneath river or lake bottom (하저 지반특성 규명을 위한 전기비저항 탐사)

  • Kim Jung-Ho;Yi Myeong-Jong;Song Yoonho;Cho Seong-Jun;Lee Seong-Kon;Son Jeongsul
    • Proceedings of the Korean Society of Exploration Geophysicists Conference (한국지구물리탐사학회 학술대회논문집)
    • /
    • 2002.09a
    • /
    • pp.139-162
    • /
    • 2002
  • Since weak zones or geological lineaments are likely to be eroded, weak zones may develop beneath rivers, and a careful evaluation of ground conditions is important when constructing structures that pass under a river. DC resistivity surveys, however, have seldom been applied to the investigation of water-covered areas, possibly because of difficulties in data acquisition and interpretation. Acquiring high-quality data may be the most important factor, and it is more difficult than in a land survey because of the water layer overlying the underground structure to be imaged. Through numerical modeling and the analysis of case histories, we studied the method of resistivity surveying in water-covered areas, starting from the characteristics of the measured data, through data acquisition methods, to interpretation methods. We organized our discussion according to the electrode locations, i.e., floating the electrodes on the water surface or installing them at the water bottom, since the methods of data acquisition and interpretation vary depending on the electrode location. Through this study, we could confirm that the dc resistivity method can provide fairly reasonable subsurface images. It was also shown that installing electrodes at the water bottom can give a subsurface image with much higher resolution than floating them on the water surface. Since the data acquired in a water-covered area have much lower sensitivity to the underground structure than those acquired on land, and can be contaminated by higher noise, such as streaming potential, it is very important to select an acquisition method and electrode array that provide data with a higher signal-to-noise ratio as well as high resolving power. The method of installing electrodes at the water bottom is suitable for detailed surveys because of its much higher resolving power, whereas the floating-electrode method, especially the streamer dc resistivity survey, is suited to reconnaissance surveys owing to the very high speed of the field work.


A Study on Oriental Medical Therapy for Stroke (Therapy by Symptoms and Signs, and Sasang Constitutional Medicine) and the Respective Effects of Oriental Medicine, Occidental Medicine, and Combined Treatment (뇌졸중(腦卒中)에 대(大)한 한방치료법(韓方治療法) 연구(硏究)(증치의학(證治醫學)과 사상의학(四象醫學)) 및 한방(韓方), 양방(洋方), 양(洋)·한방(韓方) 협진치료(協診治療) 효과(效果)에 관(關)한 연구(硏究))

  • Kim, Jong-won;Kim, Young-kyun;Kim, Beob-young;Lee, In-seon;Jang, Kyung-jeon;Gwon, Jeong-Nam;Lee, Won-oe;Song, Chang-won;Park, Dong-il
    • Journal of Sasang Constitutional Medicine
    • /
    • v.10 no.2
    • /
    • pp.351-429
    • /
    • 1998
  • The Purpose of the Study: 1. To examine the clinical application of TCD to CVA. 2. To objectively compare and analyze the treatment effects of Western medicine, Korean medicine, and cooperative consultation of Korean and Western medicine for CVA. The Subjects of the Study: We studied 86 CVA patients who had been treated in the Oriental Medical Hospital at Dong Eui Medical Center from August 1, 1997 to July 31, 1998. 1. CT/MRI view: patients with cerebral infarction. 2. Attack time: patients who came to the hospital within the first week of onset. The Method of the Study: 1. Patients were treated in four groups: Korean medicine, constitutional medicine, Western medicine, and cooperative consultation of Korean and Western medicine. 2. Application of TCD: the result was checked three times, immediately after the attack, two months later, and four months later. 3. Comparative analysis of each treatment effect by clinical symptoms and pathologic examination. 4. Judgement of the patients. The Results: For the period from August 1, 1997 to July 31, 1998, we obtained the following results from the clinical analysis of the 86 CVA patients treated in the Oriental Medical Hospital at Dong Eui Medical Center. 1. Analysis by age: the initial stage in patients in their thirties, forties, and seventies was more severe than in those in their forties and fifties with respect to improvement and the index of symptom improvement. 2. Analysis by sex: no particular relation was found in the average symptom score, improvement, or improvement index. 3. Analysis by family history: better results in the initial stage, improvement, and improvement index were obtained when there was no family history. 4. Analysis by past history: no particular relation was found with past history such as hypertension, DM, or heart problems. 5. Analysis of two groups divided above and below the average initial severity of all patients: better results were obtained when the initial stage was mild, that is, when the initial Barthel Index and CNS scores were high. 6. Analysis of the treatment effects of Korean medical treatment, Western medical treatment, and cooperative treatment: in this study, the group with the highest treatment rate among the four contrast groups (Korean medicine, constitutional medicine, Western medicine, and cooperative treatment, according to diagnosis and range of treatment) was the group diagnosed and treated on the basis of constitutional medicine theory, that is, given demostation, acupuncture (A-Tx), and oral (po) herbal medication according to the diagnosis and treatment methods of Lee Je-ma's constitutional medicine. On the left side, the number of cases diagnosed as diseased by the doctor's view but normal on TCD was 22 at the beginning of the attack, 20 two weeks later, and 11 four weeks later; on the right side, 15 at the beginning of the attack, 12 two weeks later, and 9 four weeks later. Thus the left vessel shows more interference than the right; in fact, more than half of the patients with MCA disease could not be diagnosed. Regarding the rate of impairment, the patients' condition improved, but there was no case in which the TCD result improved. 7. In TCD diagnosis, cases inconsistent with the doctor's view were numerous, especially for the MCA among the cerebral blood vessels, and among the 86 patients in total, diagnosis was impossible in 21 cases because of interference from the temporal bone. 8. Results of the study on the application of TCD in Korean medical treatment: 1) Patients in whom the MCA could not be observed because of temporal bone interference were numerous. 2) There were cases whose results differed from the CT, MRI, and MRA results.
3) Because individual differences are large, apart from the similarity of symptoms, a value that is actually normal may be recorded as an abnormal value. 4) In many cases the flow velocity of a normal part was measured to be similar to that of an abnormal part, which means that the disease cannot be judged by this measure. 5) This measure rarely represents the degree of improvement in the patient's condition; even when the patient's condition was observed to become better, there was no case in which the TCD test result became better. 6) The result could appear differently depending on the technique or the experience of the tester. 7) In the TCD test, abnormal findings were checked at week 0, normal findings at week 2, and abnormal findings again at week 4. According to the above results, CVA diagnosis is difficult with TCD alone, as shown in the diagnostic error check noted among the problems connected with the project. For aged persons with severe hardening of part of the cranium (as of May 26, 1998, 77 of 83 patients were in their 50s), measurement by TCD is often impossible and the accuracy of the measured values decreases; since the age of cerebral infarction patients is high, this study indicates that TCD is inappropriate as a diagnostic instrument.


Memory Organization for a Fuzzy Controller.

  • Jee, K.D.S.;Poluzzi, R.;Russo, B.
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 1993.06a
    • /
    • pp.1041-1043
    • /
    • 1993
  • Fuzzy logic based control theory has gained much interest in the industrial world, thanks to its ability to formalize and solve in a very natural way many problems that are very difficult to quantify at an analytical level. This paper shows a solution for treating membership functions inside hardware circuits. The proposed hardware structure optimizes the memory size by using a particular form of vectorial representation. The process of memorizing fuzzy sets, i.e. their membership functions, has always been one of the more problematic issues for hardware implementation, due to the quite large memory space that is needed. To simplify such an implementation, it is common practice [1,2,8,9,10,11] to limit the membership functions either to those having a triangular or trapezoidal shape, or to a pre-defined shape. These kinds of functions are able to cover a large spectrum of applications with a limited usage of memory, since they can be memorized by specifying very few parameters (height, base, critical points, etc.). This however results in a loss of computational power due to computation on the intermediate points. A solution to this problem is obtained by discretizing the universe of discourse U, i.e. by fixing a finite number of points and memorizing the value of the membership functions on such points [3,10,14,15]. Such a solution provides a satisfying computational speed, a very high precision of definition, and gives the users the opportunity to choose membership functions of any shape. However, a significant memory waste can as well be registered. It is indeed possible that, for each of the given fuzzy sets, many elements of the universe of discourse have a membership value equal to zero. It has also been noticed that in almost all cases the common points among fuzzy sets, i.e. points with non-null membership values, are very few. More specifically, in many applications, for each element u of U there exist at most three fuzzy sets for which the membership value is not null [3,5,6,7,12,13]. Our proposal is based on such hypotheses. Moreover, we use a technique that, even though it does not restrict the shapes of membership functions, strongly reduces the computational time for the membership values and optimizes the function memorization. Figure 1 represents a term set whose characteristics are common for fuzzy controllers and to which we will refer in the following. The term set has a universe of discourse with 128 elements (so as to have a good resolution), 8 fuzzy sets that describe the term set, and 32 levels of discretization for the membership values. Clearly, the numbers of bits necessary for the given specification are 5 for 32 truth levels, 3 for 8 membership functions, and 7 for 128 levels of resolution. The memory depth is given by the dimension of the universe of discourse (128 in our case) and is represented by the memory rows. The length of a word of memory is defined by Length = nfm * (dm(m) + dm(fm)), where nfm is the maximum number of non-null membership values for any element of the universe of discourse, dm(m) is the dimension of a value of the membership function m, and dm(fm) is the dimension of the word representing the index of the corresponding membership function. In our case, Length = 3 * (5 + 3) = 24. The memory dimension is therefore 128*24 bits. If we had chosen to memorize all values of the membership functions, we would have needed to memorize on each memory row the membership value of each element; the fuzzy-set word dimension would be 8*5 bits.
Therefore, the dimension of the memory would have been 128*40 bits. Coherently with our hypothesis, in Fig. 1 each element of the universe of discourse has a non-null membership value on at most three fuzzy sets. Focusing on elements 32, 64, and 96 of the universe of discourse, they will be memorized as follows: The computation of the rule weights is done by comparing the bits that represent the index of the membership function with the word of the program memory. The output bus of the Program Memory (μCOD) is given as input to a comparator (Combinatory Net). If the index is equal to the bus value, then one of the non-null weights derived from the rule is produced as output; otherwise the output is zero (fig. 2). It is clear that the memory dimension of the antecedent is in this way reduced, since only non-null values are memorized. Moreover, the time performance of the system is equivalent to the performance of a system using vectorial memorization of all weights. The dimensioning of the word is influenced by some parameters of the input variable. The most important parameter is the maximum number of membership functions (nfm) having a non-null value on each element of the universe of discourse. From our studies in the field of fuzzy systems, we see that typically nfm ≤ 3 and there are at most 16 membership functions. At any rate, such a value can be increased up to the physical dimensional limit of the antecedent memory. A less important role in the optimization process of the word dimension is played by the number of membership functions defined for each linguistic term. The table below shows the required word dimension as a function of such parameters and compares our proposed method with the method of vectorial memorization [10]. Summing up, the characteristics of our method are: users are not restricted to membership functions with specific shapes; the number of fuzzy sets and the resolution of the vertical axis have a very small influence on the memory space; weight computations are done by a combinatorial network, and therefore the time performance of the system is equivalent to that of the vectorial method; the number of non-null membership values on any element of the universe of discourse is limited, a constraint that is usually not very restrictive, since many controllers obtain a good precision with only three non-null weights. The method here briefly described has been adopted by our group in the design of an optimized version of the coprocessor described in [10].
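As a rough Python sketch of the storage idea described above (keeping, for each element of the discretized universe, only the indices and values of the at most nfm non-null membership functions, packed into a fixed-width word whose length follows Length = nfm * (dm(m) + dm(fm))), the example below is purely illustrative: the bit layout, helper names, and example values are assumptions, not the paper's hardware design.

```python
# Illustrative sketch of the sparse membership-memory layout described above.
# Assumed parameters: 128-element universe, 8 membership functions (3-bit
# index), 32 truth levels (5-bit value), at most NFM = 3 non-null memberships
# per universe element, giving a 24-bit memory word.
U_SIZE, N_MF, LEVELS, NFM = 128, 8, 32, 3
IDX_BITS, VAL_BITS = 3, 5                    # dm(fm) = 3, dm(m) = 5
SLOT_BITS = IDX_BITS + VAL_BITS
WORD_BITS = NFM * SLOT_BITS                  # Length = 3 * (5 + 3) = 24

def pack_row(pairs):
    """Pack up to NFM (mf_index, value) pairs into one 24-bit memory word."""
    assert len(pairs) <= NFM
    word = 0
    for slot, (idx, val) in enumerate(pairs):
        field = (idx << VAL_BITS) | val      # 3-bit index next to 5-bit value
        word |= field << (slot * SLOT_BITS)
    return word

def unpack_row(word):
    """Recover the (mf_index, value) pairs stored in a memory word."""
    pairs = []
    for slot in range(NFM):
        field = (word >> (slot * SLOT_BITS)) & ((1 << SLOT_BITS) - 1)
        idx, val = field >> VAL_BITS, field & ((1 << VAL_BITS) - 1)
        if val:                              # null (zero) memberships are not stored
            pairs.append((idx, val))
    return pairs

# Example: a universe element where membership functions 2, 3, and 4 have
# truth levels 7, 31, and 12, and all other functions are zero.
w = pack_row([(2, 7), (3, 31), (4, 12)])
print(f"word = {w:06x} ({WORD_BITS} bits)", unpack_row(w))
```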


Efficient Topic Modeling by Mapping Global and Local Topics (전역 토픽의 지역 매핑을 통한 효율적 토픽 모델링 방안)

  • Choi, Hochang;Kim, Namgyu
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.3
    • /
    • pp.69-94
    • /
    • 2017
  • Recently, increasing demand for big data analysis has been driving the vigorous development of related technologies and tools. In addition, the development of IT and the increased penetration rate of smart devices are producing a large amount of data. As a result, data analysis technology is rapidly becoming popular, and attempts to acquire insights through data analysis have been continuously increasing. This means that big data analysis will become more important in various industries for the foreseeable future. Big data analysis is generally performed by a small number of experts and delivered to each demander of analysis. However, the increased interest in big data analysis has stimulated computer programming education and the development of many programs for data analysis. Accordingly, the entry barriers to big data analysis are gradually being lowered and data analysis technology is spreading. As a result, big data analysis is expected to be performed by the demanders of analysis themselves. Along with this, interest in various kinds of unstructured data is continually increasing; in particular, a lot of attention is focused on using text data. The emergence of new platforms and techniques using the web brings about the mass production of text data and active attempts to analyze it, and the results of text analysis have been utilized in various fields. Text mining is a concept that embraces various theories and techniques for text analysis. Many text mining techniques are utilized in this field for various research purposes; topic modeling is one of the most widely used and studied. Topic modeling is a technique that extracts the major issues from a large set of documents, identifies the documents that correspond to each issue, and provides the identified documents as a cluster. It is regarded as a very useful technique in that it reflects the semantic elements of the documents. Traditional topic modeling is based on the distribution of key terms across the entire document set. Thus, it is essential to analyze the entire document set at once to identify the topic of each document. This condition leads to long analysis times when topic modeling is applied to a large number of documents. In addition, it has a scalability problem: the processing time increases exponentially with the number of analysis objects. This problem is particularly noticeable when the documents are distributed across multiple systems or regions. To overcome these problems, a divide-and-conquer approach can be applied to topic modeling. This means dividing a large number of documents into sub-units and deriving topics through repeated topic modeling on each unit. This method can be used for topic modeling on a large number of documents with limited system resources, and can improve the processing speed of topic modeling. It can also significantly reduce analysis time and cost through the ability to analyze documents at each location or place without combining the documents to be analyzed. However, despite many advantages, this method has two major problems. First, the relationship between the local topics derived from each unit and the global topics derived from the entire document set is unclear: local topics can be identified in each unit, but global topics cannot. Second, a method for measuring the accuracy of the proposed methodology needs to be established; that is, assuming that the global topics are the ideal answer, the difference between a local topic and the corresponding global topic needs to be measured.
Because of these difficulties, this approach has not been studied sufficiently compared with other studies dealing with topic modeling. In this paper, we propose a topic modeling approach to solve the above two problems. First of all, we divide the entire document cluster (global set) into sub-clusters (local sets), and generate a reduced entire document cluster (RGS, reduced global set) that consists of delegate documents extracted from each local set. We try to solve the first problem by mapping RGS topics to local topics. Along with this, we verify the accuracy of the proposed methodology by detecting whether documents are discerned as belonging to the same topic in the results for the global and local sets. Using 24,000 news articles, we conduct experiments to evaluate the practical applicability of the proposed methodology. In addition, through an additional experiment, we confirm that the proposed methodology can provide results similar to those of topic modeling on the entire document set. We also propose a reasonable method for comparing the results of both methods.
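As a rough, simplified illustration of the divide-and-conquer idea described above (fitting topic models on local sub-clusters and then mapping each local topic to a global topic), the Python sketch below is a hypothetical example: the toy corpus, the use of scikit-learn LDA, and the cosine-similarity mapping rule are assumptions for illustration, not the paper's actual RGS-based procedure or its evaluation on the news corpus.

```python
# Minimal sketch of divide-and-conquer topic modeling with a
# local-to-global topic mapping step.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

docs = ["stock market price rises", "team wins the football game",
        "market investors buy shares", "player scores in the match",
        "interest rates move the market", "coach praises the team defense"]
n_topics = 2

# Shared vocabulary so topic-word vectors are comparable across models.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

# "Global" model on the full corpus (stand-in for the ideal answer).
global_lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(X)

# Split the corpus into local sets and fit a local model on each.
local_sets = [X[:3], X[3:]]
local_models = [LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(Xl)
                for Xl in local_sets]

# Map each local topic to the most similar global topic by comparing
# normalized topic-word distributions with cosine similarity.
g = global_lda.components_ / global_lda.components_.sum(axis=1, keepdims=True)
for i, lda in enumerate(local_models):
    l = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
    mapping = cosine_similarity(l, g).argmax(axis=1)
    print(f"local set {i}: local topic -> global topic mapping {mapping.tolist()}")
```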

THE EFFECT OF INTERMITTENT COMPOSITE CURING ON MARGINAL ADAPTATION (복합레진의 간헐적 광중합 방법이 변연적합도에 미치는 영향)

  • Yun, Yong-Hwan;Park, Sung-Ho
    • Restorative Dentistry and Endodontics
    • /
    • v.32 no.3
    • /
    • pp.248-259
    • /
    • 2007
  • The aim of this research was to study the effect of intermittent polymerization on marginal adaptation by comparing the marginal adaptation of intermittently polymerized composite to that of continuously polymerized composite. The materials used for this study were Pyramid (Bisco Inc., Schaumburg, U.S.A.) and Heliomolar (Ivoclar Vivadent, Liechtenstein). The experiment was carried out in class II MOD cavities prepared in 48 extracted human maxillary premolars. The samples were divided into 4 groups by light curing method: group 1, continuous curing (60 s light on with no light off); group 2, intermittent curing (cycles of 3 s with 2 s light on and 1 s light off, for 90 s); group 3, intermittent curing (cycles of 2 s with 1 s light on and 1 s light off, for 120 s); group 4, intermittent curing (cycles of 3 s with 1 s light on and 2 s light off, for 180 s). Consequently, the total amount of light energy radiated was the same in all groups. Each specimen went through thermo-mechanical loading (TML), which consisted of mechanical loading (720,000 cycles, 5.0 kg) at a speed of 120 rpm for 100 hours and thermocycling (6,000 thermocycles of alternating water at 50°C and 55°C). The continuous margin (CM) (%) of the total margin and of the regional margins, occlusal enamel (OE), vertical enamel (VE), and cervical enamel (CE), was measured before and after TML under a ×200 digital light microscope. Three-way ANOVA and Duncan's Multiple Range Test were performed at the 95% level of confidence to test the effect of 3 variables on CM (%) of the total margin: light curing condition, composite material, and the effect of TML. In each group, one-way ANOVA and Duncan's Multiple Range Test were additionally performed to compare CM (%) of the regions (OE, VE, CE). The results indicated that all three variables were statistically significant (p < 0.05). Before TML, in the groups using Pyramid, groups 3 and 4 showed higher CM (%) than groups 1 and 2, and in the groups using Heliomolar, groups 3 and 4 showed higher CM (%) than group 1 (p < 0.05). After TML, in both the Pyramid and Heliomolar groups, group 3 showed higher CM (%) than group 1 (p < 0.05). CM (%) of the regions was significantly different in each group (p < 0.05). Before TML, no statistical difference was found between groups within the VE and CE regions. In the OE region, group 4 of Pyramid showed higher CM (%) than group 2, and groups 2 and 4 of Heliomolar showed higher CM (%) than group 1 (p < 0.05). After TML, no statistical difference was found among groups within the VE and CE regions. In the OE region, group 3 of Pyramid showed higher CM (%) than groups 1 and 2, and groups 2, 3, and 4 of Heliomolar showed higher CM (%) than group 1 (p < 0.05). It was concluded that intermittent polymerization may be effective in reducing marginal gap formation.