Evaluation of Lateral Flow in Soft Ground under Embankment (성토하부 연약지반의 측방유동 평가)
Journal of the Korean Geotechnical Society, v.22 no.10, pp.93-100, 2006
Lateral soil movement in soft ground being improved with vertical drains is analyzed on the basis of monitoring data from three field projects containing fifty-six monitoring sites. Based on these investigations, criteria for predicting lateral soil movement are suggested. When predicting lateral soil movement in improved soft ground with the dimensionless parameter R proposed by Marche & Chapuis (1974), R should be calculated from the maximum lateral displacement in the soft ground below the embankment toe rather than from the lateral displacement at the toe itself. Lateral soil movement may increase rapidly when the slope safety factor falls below 1.4 at a high H/B ratio (thickness of soft ground / embankment width) of about 1.15, or below 1.2 at a low H/B ratio of about 0.05. The graph suggested by Tschebotarioff (1973), which relates the maximum embankment height to the undrained shear strength of the soft ground, can also be applied to evaluate the possibility of lateral soil movement due to embankments on soft ground.
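The two reported threshold pairs (FS below 1.4 at H/B near 1.15; FS below 1.2 at H/B near 0.05) suggest a simple screening rule. The sketch below interpolates linearly between those two observation points; the interpolation (and clamping outside the observed range) is an illustrative assumption, not part of the study.

```python
def critical_safety_factor(h_over_b):
    """Critical slope safety factor below which rapid lateral
    movement was observed. Anchored at the two field-observation
    points reported in the abstract:
      FS ~ 1.4 at H/B = 1.15 and FS ~ 1.2 at H/B = 0.05.
    Linear interpolation between them is an assumption."""
    lo_hb, lo_fs = 0.05, 1.2
    hi_hb, hi_fs = 1.15, 1.4
    if h_over_b <= lo_hb:
        return lo_fs
    if h_over_b >= hi_hb:
        return hi_fs
    t = (h_over_b - lo_hb) / (hi_hb - lo_hb)
    return lo_fs + t * (hi_fs - lo_fs)

def lateral_flow_risk(safety_factor, h_over_b):
    """True if rapid lateral soil movement would be expected."""
    return safety_factor < critical_safety_factor(h_over_b)

print(lateral_flow_risk(1.3, 1.15))  # True: below the 1.4 threshold
print(lateral_flow_risk(1.3, 0.05))  # False: above the 1.2 threshold
```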
Purpose : Selective inhibition of multiple molecular targets may improve the antitumor activity of radiation. A selective cyclooxygenase-2 (COX-2) inhibitor and an epidermal growth factor receptor (EGFR) blocker were combined with radiation in the HeLa cell line. In vitro experiments were performed to investigate the cooperative mechanism of the selective COX-2 inhibitor and the EGFR blocker. Materials and Methods : The antitumor effect was assessed by growth inhibition and by apoptosis analysis with the annexin V-Fluos method. Radiation-modulating effects were determined by the clonogenic cell survival assay. Surviving fractions at 2 Gy (
Operations research has developed rapidly since its origins in World War II. Practitioners of O. R. have contributed to almost every aspect of government and business. More recently, a number of operations researchers have turned their attention to library and information systems, and the author believes that significant research has resulted. It is the purpose of this essay to introduce the library audience to some of these accomplishments, to present some of the author's hypotheses on the subject of library management to which he believes O. R. has great potential, and to suggest some future research directions. Some problem areas in librarianship where O. R. may play a part have been discussed and are summarized below. (1) Library location. It is usually necessary to strike a balance between accessibility and cost in location problems. Many mathematical methods are available for identifying the optimal locations once the balance between these two criteria has been decided. The major difficulties lie in relating cost to size and in taking future change into account when discriminating among possible solutions. (2) Planning new facilities. Standard approaches to using mathematical models for simple investment decisions are well established. If the problem is one of choosing the most economical way of achieving a certain objective, one may compare the alternatives by using one of the discounted cash flow techniques. In other situations it may be necessary to use a cost-benefit approach. (3) Allocating library resources. In order to allocate resources to best advantage the librarian needs to know how the effectiveness of the services he offers depends on the way he deploys his resources. The O. R. approach to such problems is to construct a model representing effectiveness as a mathematical function of the levels of different inputs (e.g., numbers of people in different jobs, acquisitions of different types, physical resources). (4) Long term planning. 
Resource allocation problems are generally concerned with up to one and a half years ahead. The longer term certainly offers both greater freedom of action and greater uncertainty, so it is difficult to generalize about long term planning problems. In other fields, however, O. R. has made a significant contribution to long range planning, and it is likely to make one in librarianship as well. (5) Public relations. It is generally accepted that actual and potential users are too ignorant both of the range of library services provided and of how to make use of them. How should services be brought to the attention of potential users? The answer seems to lie in obtaining empirical evidence through controlled experiments in which a group of libraries participate. (6) Acquisition policy. In comparing alternative policies for the acquisition of materials one needs to know two things: first, the implications of each policy for the services that depend on the stock; second, the relative importance to be ascribed to each service for each class of user. By reducing the uncertainty of the first, formal models will allow the librarian to concentrate his attention upon the value judgements that will be necessary for the second. (7) Loan policy. The approach to choosing between loan policies is much the same as the previous approach. (8) Manpower planning. For large library systems one should consider constructing models which will permit comparing the skills necessary in the future with predictions of the skills that will be available, so as to allow informed decisions. (9) Management information systems for libraries. A great deal of data can be made available in libraries as a by-product of all recording activities. It is particularly tempting when procedures are computerized to make summary statistics available as a management information system. The value of information to particular decisions that may have to be taken in the future is best assessed in terms of a model of the relevant problem. (10) Management gaming. 
One of the most common uses of a management game is as a means of developing staff's ability to take decisions. The value of such exercises depends upon the validity of the computerized model. If the model were sufficiently simple to take the form of a mathematical equation, decision-makers would probably be able to learn adequately from a graph; more complex situations require simulation models. (11) Diagnostic tools. Libraries are sufficiently complex systems that it would be useful to have available simple means of telling whether performance could be regarded as satisfactory and which, if it could not, would also provide pointers to what was wrong. (12) Data banks. It would appear to be worth considering establishing a bank for certain types of data. If certain items on questionnaires were to take a standard form, a greater pool of data would be available for various analyses. (13) Effectiveness measures. The meaning of a library performance measure is not readily interpreted. Each measure must itself be assessed in relation to the corresponding measures for earlier periods of time and a standard measure that may be a corresponding measure in another library, the 'norm', the 'best practice', or user expectations.
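Item (2) above mentions comparing alternatives with discounted cash flow techniques. A minimal net-present-value comparison can be sketched as follows; the cash flows and the 8% discount rate are hypothetical figures for illustration only.

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows,
    with cashflows[0] occurring now (year 0)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Two hypothetical ways of achieving the same objective:
# A has a large up-front cost but low running costs; B the reverse.
option_a = [-100_000, -5_000, -5_000, -5_000, -5_000]
option_b = [-40_000, -25_000, -25_000, -25_000, -25_000]

rate = 0.08
# The more economical option is the one with the higher (less
# negative) net present value of its costs.
cheaper = "A" if npv(rate, option_a) > npv(rate, option_b) else "B"
print(cheaper)  # prints "A"
```

Discounting matters here: at a 0% rate option B would look cheaper in total outlay terms, but at 8% the later running costs of option A weigh less, so A wins.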
To implement elliptic curve cryptosystem in GF(2
Selecting high-quality information that meets the interests and needs of users from the overflow of content is becoming ever more important. In this flood of information, efforts are being made to better reflect the user's intention in search results, rather than treating an information request as a simple string. Large IT companies such as Google and Microsoft likewise focus on developing knowledge-based technologies, including search engines, that provide users with satisfaction and convenience. Finance in particular is a field where text data analysis is expected to be useful, because it constantly generates new information and the newer the information is, the more valuable it is. Automatic knowledge extraction can therefore be effective in areas such as the financial sector, where the information flow is vast and new information continually emerges. However, automatic knowledge extraction faces several practical difficulties. First, it is hard to build corpora from different fields with the same algorithm, which makes it difficult to extract good-quality triples. Second, it becomes harder for people to produce labeled text data as the extent and scope of knowledge increase and patterns are constantly updated. Third, performance evaluation is difficult due to the characteristics of unsupervised learning. Finally, defining the problem of automatic knowledge extraction is not easy because of the ambiguous conceptual characteristics of knowledge. To overcome the limits described above and improve the semantic performance of stock-related information searching, this study attempts to extract knowledge entities by using a neural tensor network and to evaluate their performance. Unlike previous references, the purpose of this study is to extract knowledge entities related to individual stock items. 
Various but relatively simple data processing methods are applied in the presented model to solve the problems of previous research and to enhance the effectiveness of the model. These processes give the study three significances. First, it presents a practical and simple automatic knowledge extraction method that can be applied directly. Second, a simple problem definition makes performance evaluation possible. Finally, the expressiveness of the knowledge is increased by generating input data on a sentence basis without complex morphological analysis. The results of the empirical analysis and an objective performance evaluation method are also presented. For the empirical study confirming the usefulness of the presented model, experts' reports on 30 individual stocks, the top 30 items by publication frequency from May 30, 2017 to May 21, 2018, are used. The total number of reports is 5,600; 3,074 reports, about 55% of the total, are designated as the training set, and the remaining 45% as the testing set. Before constructing the model, all reports in the training set are classified by stock, and their entities are extracted using the KKMA named-entity recognition tool. For each stock, the top 100 entities by appearance frequency are selected and vectorized using one-hot encoding. Then, using a neural tensor network, one score function per stock is trained. Thus, when a new entity from the testing set appears, its score can be calculated with every score function, and the stock belonging to the function with the highest score is predicted as the item related to that entity. To evaluate the presented model, we confirm its predictive power and determine whether the score functions are well constructed by calculating the hit ratio over all reports in the testing set. 
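The prediction step described above (one trained score function per stock; pick the stock whose function scores the entity highest; measure the hit ratio) can be sketched as below. The bilinear form is a simplified stand-in for a full neural tensor network score, and all weights, the vocabulary size, and the toy test set are hypothetical; only the max-score prediction and hit-ratio logic mirror the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: 3 stocks, a vocabulary of 5 entities (one-hot encoded).
n_stocks, vocab = 3, 5

# One "score function" per stock. A real NTN scores a pair via
# u^T tanh(e1^T W e2 + V[e1; e2] + b); here each stock gets a random
# bilinear stand-in for trained weights (illustration only).
W = rng.normal(size=(n_stocks, vocab, vocab))
u = rng.normal(size=(n_stocks, vocab))

def score(stock, entity_onehot):
    """Bilinear NTN-style score of an entity under one stock's function."""
    return float(np.tanh(u[stock] @ W[stock] @ entity_onehot))

def predict(entity_onehot):
    """Stock whose score function rates the entity highest."""
    return max(range(n_stocks), key=lambda s: score(s, entity_onehot))

# Hit ratio over a toy labelled test set: the fraction of entities
# whose predicted stock matches their true stock.
test_set = [(np.eye(vocab)[i % vocab], i % n_stocks) for i in range(6)]
hits = sum(predict(e) == y for e, y in test_set)
hit_ratio = hits / len(test_set)
```

With real trained weights, `hit_ratio` over the testing set is exactly the evaluation figure the study reports.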
As a result of the empirical study, the presented model shows 69.3% hit accuracy on the testing set of 2,526 reports. This hit ratio is meaningfully high despite some constraints on the research. Looking at the model's prediction performance for each stock, only three stocks, LG ELECTRONICS, KiaMtr, and Mando, show much lower performance than average; this result may be due to interference effects with other similar items and the generation of new knowledge. In this paper, we propose a methodology to find the key entities, or combinations of them, needed to search for related information in accordance with the user's investment intention. Graph data is generated using only the named-entity recognition tool and applied to the neural tensor network, without a learning corpus or word vectors for the field. The empirical test confirms the effectiveness of the presented model as described above. However, some limits and things to complement remain; most notably, the especially poor performance on a few stocks shows the need for further research. Finally, the empirical study confirmed that the learning method presented here can be used to match new text information semantically with the related stocks.
The wall shear stress in the vicinity of end-to-end anastomoses under steady flow conditions was measured using a flush-mounted hot-film anemometer (FMHFA) probe. The experimental measurements were in good agreement with numerical results except in flow with low Reynolds numbers. The wall shear stress increased proximal to the anastomosis in flow from the Penrose tubing (simulating an artery) to the PTFE graft. In flow from the PTFE graft to the Penrose tubing, low wall shear stress was observed distal to the anastomosis. Abnormal distributions of wall shear stress in the vicinity of the anastomosis, resulting from the compliance mismatch between the graft and the host artery, might be an important factor in ANFH formation and graft failure. The present study suggests a correlation between regions of low wall shear stress and the development of anastomotic neointimal fibrous hyperplasia (ANFH) in end-to-end anastomoses.
Air pressure decay (APD) rate and ultrafiltration rate (UFR) tests were performed on new and saline-rinsed dialyzers as well as those reused in patients several times. C-DAK 4000 (Cordis Dow) and CF IS-11 (Baxter Travenol) reused dialyzers obtained from the dialysis clinic were used in the present study. The new dialyzers exhibited a relatively flat APD, whereas saline-rinsed and reused dialyzers showed a considerable amount of decay. C-DAK dialyzers had a larger APD (11.70