• Title/Summary/Keyword: solution accuracy


Quality Assessment of Tropospheric Delay Estimated by Precise Point Positioning in the Korean Peninsula

  • Park, Han-Earl;Roh, Kyoung Min;Yoo, Sung-Moon;Choi, Byung-Kyu;Chung, Jong-Kyun;Cho, Jungho
    • Journal of Positioning, Navigation, and Timing
    • /
    • v.3 no.4
    • /
    • pp.131-141
    • /
    • 2014
  • Over the last decade, the Global Navigation Satellite System (GNSS) has been increasingly utilized as a meteorological research tool. The Korea Astronomy and Space Science Institute (KASI) has also been developing a near real-time GNSS precipitable water vapor (PWV) information management system that can produce precise PWV for the Korean Peninsula region using GNSS data processing and meteorological measurements. The goal of this paper is to evaluate whether the precise point positioning (PPP) strategy can be used as the new data processing strategy of the GNSS-PWV information management system. For this purpose, a quality assessment has been performed by means of a comparative analysis of the troposphere zenith total delay (ZTD) estimates from KASI PPP solutions (KPS), KASI network solutions (KNS), and International GNSS Service (IGS) final troposphere products (IFTP) for ten permanent GNSS stations in the Korean Peninsula. The assessment consists largely of two steps. First, the troposphere ZTDs of the KNS are compared to those of the IFTP for DAEJ and SUWN only, with the IFTP used as the reference. Second, the KPS are compared to the KNS for all ten GNSS stations. In this step, the KNS are used as the new reference rather than the IFTP, because the previous step showed that the KNS can serve as a suitable reference. As a result, it was found that the ZTD values from both the KPS and the KNS followed the same overall pattern, with an RMS of 5.36 mm. When the average RMS was converted into a GNSS-PWV error by considering the typical ratio of zenith wet delay to PWV, the GNSS-PWV error met the requirement for PWV accuracy in this application. Therefore, the PPP strategy can be used as a new data processing strategy in the near real-time GNSS-PWV information management system.
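A minimal sketch (not from the paper) of the conversion step mentioned above: a zenith wet delay (ZWD) error is mapped to an equivalent PWV error using the commonly quoted relation PWV ≈ 0.15 × ZWD. The 0.15 factor is an assumed typical value; in practice it depends on the weighted mean temperature of the atmosphere.

```python
# Hedged sketch: propagate a ZWD RMS error (mm) into an approximate PWV error (mm).
# The conversion factor 0.15 is a typical assumed value, not the paper's exact ratio.

def zwd_error_to_pwv_error(zwd_rms_mm: float, pi_factor: float = 0.15) -> float:
    """Approximate PWV error implied by a ZWD error, PWV ~= pi_factor * ZWD."""
    return pi_factor * zwd_rms_mm

if __name__ == "__main__":
    zwd_rms = 5.36  # RMS difference between KPS and KNS (mm), from the abstract
    print(f"Approximate PWV error: {zwd_error_to_pwv_error(zwd_rms):.2f} mm")
```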

A Scheme of Distributed Network Security Management against DDoS Attacks (DDoS 공격에 대응하는 분산 네트워크 보안관리 기법)

  • Kim Sung-Ki;Yoo Seung-Hwan;Kim Moon-Chan;Min Byoung-Joon
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.43 no.7 s.349
    • /
    • pp.72-83
    • /
    • 2006
  • Protecting against and responding to DDoS attacks or worm propagation within a single domain is not a practical solution, because such attacks clog the shared communication lines that legitimate users depend on beyond the boundary of that domain. In particular, DDoS attacks that use spoofed source addresses, or bogus packets whose destination addresses are changed randomly while carrying valid source addresses, do not allow us to identify the access of legitimate users. We propose a scheme of distributed network security management to protect the access of legitimate users from DDoS attacks that exploit randomly spoofed source IP addresses and send such bogus packets. We assume that the Internet is divided into multiple domains and that each domain has one or more domain security managers responsible for identifying hosts within that domain. A domain security manager forwards information on identified suspicious attack flows to neighboring managers and then verifies the attack upon receiving return messages from those managers. Through experiments on a test-bed, the proposed scheme was verified to maintain high detection accuracy and to enhance the normal packet survival rate.
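A hypothetical sketch of the inter-domain exchange the abstract describes: a domain security manager records a suspicious flow, queries its neighboring managers, and confirms the attack only after corroborating return messages arrive. Class names, the membership test, and the confirmation threshold are all illustrative assumptions, not the paper's actual message formats.

```python
# Illustrative sketch only; message formats and verification rules are assumptions.
from dataclasses import dataclass, field

@dataclass
class SuspiciousFlow:
    src_ip: str
    dst_ip: str
    packets_per_sec: float

@dataclass
class DomainSecurityManager:
    domain: str
    neighbors: list = field(default_factory=list)   # other domain security managers
    pending: dict = field(default_factory=dict)     # (src, dst) -> corroborations

    def report(self, flow: SuspiciousFlow) -> None:
        """Forward a locally detected suspicious flow to neighboring managers."""
        self.pending[(flow.src_ip, flow.dst_ip)] = 0
        for neighbor in self.neighbors:
            neighbor.receive_query(self, flow)

    def receive_query(self, origin: "DomainSecurityManager", flow: SuspiciousFlow) -> None:
        """Check whether the claimed source host actually belongs to this domain."""
        spoofed = not self.hosts_in_domain(flow.src_ip)
        origin.receive_return(flow, spoofed)

    def receive_return(self, flow: SuspiciousFlow, spoofed: bool) -> None:
        """Count corroborating return messages before declaring an attack."""
        key = (flow.src_ip, flow.dst_ip)
        if spoofed:
            self.pending[key] = self.pending.get(key, 0) + 1
        if self.pending.get(key, 0) >= 1:            # threshold is an assumption
            print(f"[{self.domain}] flow {key} confirmed as spoofed DDoS traffic")

    def hosts_in_domain(self, ip: str) -> bool:
        return ip.startswith("10.")                   # placeholder membership test
```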

The Possibility of Environmental Paraquat Exposure (파라콰트의 환경성 노출 가능성)

  • Oh, Se-Hyun;Choi, Hong-Soon;You, Ho-Young;Park, Jun-Ho;Song, Jae-Seok
    • Journal of agricultural medicine and community health
    • /
    • v.36 no.4
    • /
    • pp.218-226
    • /
    • 2011
  • Objectives: Paraquat (PQ) is a widely used ionic pesticide that is fatal when ingested accidentally or for suicidal purposes. Chronic exposure to PQ is thought to be related to the development of Parkinson's disease, but epidemiological studies have not yet confirmed that theory. This study attempted to estimate the possibility of environmental PQ exposure through soil and water. Materials and Methods: We analyzed the amount of decomposed PQ solution in wet soil after exposure to ultraviolet light. An artificial rainfall condition was simulated over soil sprayed with PQ to measure the amount of eluted PQ. In addition, PQ was diluted in water from three differently rated rivers and the changes in PQ concentration were measured after ultraviolet exposure over one month. High-performance liquid chromatography with ultraviolet detection was used to analyze the concentrations of PQ. Results: With the method we used, the recovery of PQ showed a precision of less than 5%, an accuracy greater than 88%, and the calibration equation was y = 5538.8x - 440.01 (R² = 0.9985). There were no significant differences in the concentrations of PQ obtained from the three specimens over a 1-week period. From the PQ-sprayed soil, the artificial rainfall conditions showed no PQ elution over a 1-month period, and there were no significant differences in PQ concentrations according to ultraviolet exposure among the three samples. Conclusions: PQ remains well adsorbed in soil under natural conditions. However, it may persist unchanged for a long time in the hydrosphere, so the possibility of PQ exposure through drinking water cannot be ruled out.
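A small illustration of how the reported calibration line can be inverted to estimate a PQ concentration from a detector response. The abstract does not define x and y, so it is assumed here that x is the concentration and y the HPLC/UV peak response.

```python
# Hedged example using the reported calibration line y = 5538.8x - 440.01 (R^2 = 0.9985).
# Assumption: x is the paraquat concentration and y the HPLC/UV peak response;
# the units of x follow whatever the original calibration used and are not specified here.

SLOPE = 5538.8
INTERCEPT = -440.01

def response_to_concentration(peak_response: float) -> float:
    """Invert the linear calibration to estimate a concentration from a response."""
    return (peak_response - INTERCEPT) / SLOPE

if __name__ == "__main__":
    print(response_to_concentration(10000.0))  # made-up example response value
```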

Simultaneous Determination of Heavy Metals in Cosmetic Products by Ion Chromatography (이온 크로마토그래피를 이용한 화장품 중 중금속 동시분석)

  • Lee, So-Mi;Jeong, Hye-Jin;Kim, Han-Kon
    • Journal of the Society of Cosmetic Scientists of Korea
    • /
    • v.34 no.1
    • /
    • pp.57-62
    • /
    • 2008
  • Even trace amounts of heavy metals present in cosmetic products as impurities may cause skin allergies through percutaneous absorption. In order to develop a rapid, accurate, and highly sensitive method for the simultaneous determination of Pb²⁺, Fe²⁺, Cu²⁺, Ni²⁺, Zn²⁺, Co²⁺, Cd²⁺, and Mn²⁺ in coloring agents and cosmetic products, we carried out the determination by ion chromatography. All of these metals are well separated on a bifunctional ion-exchange column (IonPac CS5A) and detected by post-column reaction with spectrophotometric detection. The calibration graphs are linear (r² > 0.999) in the range 0.1~1000 µg/mL. Detection limits for 200 µL of sample solution are at the µg/L level, which is sufficient for judging whether a product is safe or not. The relative standard deviations (RSDs) of the retention time and the peak area are less than 0.21% and 1.24%, respectively. The recovery rates are 97~104%. The new method was applied to analyze the amount of heavy metals contained in 22 cosmetic products and 11 coloring agents.
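A brief sketch (example numbers invented) of how the two figures of merit quoted above, the relative standard deviation (RSD) of replicate measurements and the recovery rate of a spiked sample, are typically computed.

```python
# Illustrative only: RSD and recovery-rate calculations with made-up replicate values.
from statistics import mean, stdev

def rsd_percent(values: list[float]) -> float:
    """Relative standard deviation of replicate measurements, in percent."""
    return stdev(values) / mean(values) * 100.0

def recovery_percent(measured: float, spiked: float) -> float:
    """Recovery rate of a spiked amount, in percent."""
    return measured / spiked * 100.0

replicate_peak_areas = [1012.0, 1020.0, 1008.0, 1015.0, 1011.0]  # hypothetical replicates
print(f"RSD of peak area: {rsd_percent(replicate_peak_areas):.2f} %")
print(f"Recovery: {recovery_percent(measured=0.98, spiked=1.00):.1f} %")
```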

Analysis of the Accuracy of Quaternion-Based Spatial Resection Based on the Layout of Control Points (기준점 배치에 따른 쿼터니언기반 공간후방교회법의 정확도 분석)

  • Kim, Eui Myoung;Choi, Han Seung
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.36 no.4
    • /
    • pp.255-262
    • /
    • 2018
  • In order to determine three-dimensional positions in photogrammetry, spatial resection is a prerequisite step for determining exterior orientation parameters. The existing spatial resection method relies on non-linear equations that require initial values of the exterior orientation parameters and has the problem that a gimbal-lock phenomenon may occur. On the other hand, spatial resection using quaternions is a closed-form solution that does not require initial values of the EOP (Exterior Orientation Parameters) and is a method that can eliminate the gimbal-lock problem. In this study, to analyze the stability of the quaternion-based spatial resection, the exterior orientation parameters were determined for different layouts of control points and compared with the values determined using the existing non-linear equations. As a result, it can be seen that the quaternion-based spatial resection is affected by the layout of the control points. Therefore, if the initial values of the exterior orientation parameters cannot be obtained, it would be more effective to estimate initial exterior orientation values using the quaternion-based spatial resection and apply them to the collinearity-equation-based spatial resection method.
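For context, the step that lets a quaternion parameterization sidestep gimbal lock is building the rotation matrix directly from a unit quaternion instead of from Euler angles. The sketch below uses the standard conversion formula; it is not code from the paper.

```python
# Standard unit-quaternion -> rotation-matrix conversion (illustrative, not the paper's code).
import numpy as np

def quaternion_to_rotation_matrix(q: np.ndarray) -> np.ndarray:
    """q = (w, x, y, z); the quaternion is normalized before conversion."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# Example: a 90-degree rotation about the z-axis.
q = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(np.round(quaternion_to_rotation_matrix(q), 3))
```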

A Study of Theoretical Methods for Estimating Void Ratio Based on the Elastic Wave Velocities (탄성파 속도를 이용한 간극비 산출 식의 고찰)

  • Lee, Jong-Sub;Park, Chung-Hwa;Yoon, Sung-Min;Yoon, Hyung-Koo
    • Journal of the Korean Geotechnical Society
    • /
    • v.29 no.2
    • /
    • pp.35-45
    • /
    • 2013
  • The void ratio is an important parameter reflecting soil behavior, including physical properties, compressibility, and relative density. The void ratio can be obtained by laboratory tests on extracted soil samples. However, the specimen can easily be disturbed by stress relief during extraction, vibration during transportation, and errors in the experimental process. Thus, theoretical equations have been suggested for obtaining the void ratio from elastic wave velocities. The objective of this paper is to verify the accuracy of the proposed analytical solutions through an error norm. The paper covers the theoretical methods of Wood, Gassmann, and Foti. The elastic wave velocities were determined with the Field Velocity Probe in the southern part of the Korean Peninsula, and the remaining parameters were assumed based on reference values. The Gassmann method shows high reliability in determining the void ratio. The error norm is also analyzed by substituting each parameter. The results show that each equation has distinct characteristics. Thus, this paper may be widely applied for obtaining the void ratio according to the field condition.
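As one concrete example of the kind of theoretical equation compared in the paper, the sketch below solves Wood's suspension equation for porosity from a measured P-wave velocity and converts it to a void ratio. The fluid and mineral properties are typical assumed values for water and quartz, and Wood's model neglects skeleton stiffness, so this is only an illustration, not the paper's implementation.

```python
# Hedged sketch: void ratio from Vp via Wood's (suspension) equation.
# Material constants are typical assumed values, not the paper's parameters.
from scipy.optimize import brentq

K_FLUID = 2.2e9      # bulk modulus of water [Pa]
K_SOLID = 36.0e9     # bulk modulus of quartz grains [Pa]
RHO_FLUID = 1000.0   # density of water [kg/m^3]
RHO_SOLID = 2650.0   # density of quartz grains [kg/m^3]

def wood_vp(n: float) -> float:
    """P-wave velocity of a fluid-solid suspension with porosity n (Wood's equation)."""
    k_mix = 1.0 / (n / K_FLUID + (1.0 - n) / K_SOLID)   # Reuss (isostress) average
    rho_mix = n * RHO_FLUID + (1.0 - n) * RHO_SOLID
    return (k_mix / rho_mix) ** 0.5

def void_ratio_from_vp(vp_measured: float) -> float:
    """Find porosity n with Vp(n) = measured Vp, then return e = n / (1 - n)."""
    # The bracket assumes a single root in this porosity range for the given Vp.
    n = brentq(lambda n: wood_vp(n) - vp_measured, 0.30, 0.99)
    return n / (1.0 - n)

print(void_ratio_from_vp(1500.0))  # example measured Vp in m/s
```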

An Evaluation for Effectiveness of Information Services by Reference Librarians at College and University Libraries in Korea (대학도서관 정보사서의 정보서비스 효율성 평가)

  • Han Sang Wan
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.13
    • /
    • pp.95-119
    • /
    • 1986
  • The objective of this study is to search for a theoretical and practical solution to the question of what is the most effective and qualitative method of information service for college and university libraries in Korea. Assuming the maximum-service or total-service theory of information services, it appears natural that the subject specialist who is highly knowledgeable in his subject is indispensable for raising the quality of information librarians. The procedure of this research was as follows: there was no college or university library employing any full-time subject specialist in Korea. This research, however, proceeded on the assumption that subject specialists are already employed in all of the college and university libraries after the subject specialist system is established. The minimum qualification of a subject specialist is limited, based on criteria given in the foreign literature, to those who have a master's degree in Library Science and a bachelor's degree in any other subject area, those who have a bachelor's degree in Library Science and a master's degree in any other subject area, or those who have both bachelor's and master's degrees in Library Science with a minor in any subject field. To prove the research hypothesis that the subject specialist will perform his role more efficiently than the generalist in providing information service based on both accuracy and speed, this research, as an obtrusive testing method, analyzed effectiveness by presenting information questions to the generalists and subject specialists who are information librarians in college and university libraries. For this study 20 librarians working at 12 university libraries were tested for performance levels of information services. The result showed a 59.75% absolute performance rate and a 75.20% adjusted performance rate. Compared to Thomas Childers' 1970 study, in which he used the unobtrusive testing method, these results were 5% higher in the absolute performance rate and 11.36% higher in the adjusted performance rate. In comparing the generalist with the subject specialist in efficiency of information service, while the absolute performance rate was 57.08% and the adjusted performance rate was 73.08% in the case of the generalist, the absolute rate was 63.75% and the adjusted rate was 78.38% in the case of the specialist; therefore, the efficiency of the subject specialist was 6.67% higher in the absolute performance rate and 5.30% higher in the adjusted performance rate than that of the generalist. However, the factor of speediness was excluded from the analysis because of the difference between the time the interviewers recorded and the time the interviewees recorded. On the basis of the results of this research, it is desirable to educate subject specialists and employ them as information librarians, and for them to function as efficient subject specialists, in order to improve the effectiveness of information services, the nucleus of the raison d'être of college and university libraries.


A-team Based Approach for Reactive Power/Voltage Control Considering Steady State Security Assessment (정태 안전성 평가를 고려한 무효전력 전압제어를 위한 A-team기반 접근법)

  • Kim, Doo-Hyun
    • Journal of the Korean Society of Safety
    • /
    • v.11 no.2
    • /
    • pp.150-159
    • /
    • 1996
  • In this paper, an A-team (Asynchronous Team) based approach for reactive power and voltage control considering static security assessment in a power system with infrastructural deficiencies is proposed. The reactive power and voltage control problem is one of optimally establishing voltage levels given several constraints such as reactive generation, voltage magnitude, line flow, and other switchable reactive power sources. It can be formulated as a mixed-integer linear programming (MILP) problem without significantly deteriorating solution accuracy. The security assessment estimates the relative robustness of the system in its present state through the evaluation of data provided by security monitoring. A deterministic approach based on AC load-flow calculations is adopted to assess system security, especially voltage security. A security metric, as a standard of measurement for power system security, producing a set of discrete values rather than binary values, is employed. In order to analyze the above two problems, the reactive power/voltage control problem and the static security assessment problem, in an integrated fashion for real-time operations, a new organizational structure, called an A-team, is adopted. An A-team is an organization of agents which are all autonomous, work in parallel, and communicate asynchronously, and which is well suited to the development of computer-based, multi-agent systems for operations. This A-team based approach, although it is still in the beginning stage, also has potential for handling other difficult power system problems.
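A toy illustration (thresholds invented) of a discrete, non-binary security metric of the kind mentioned above: the worst per-unit bus-voltage deviation found by an AC load-flow solution is graded into one of several security levels rather than a simple secure/insecure flag.

```python
# Illustrative only: the grading thresholds are assumptions, not the paper's metric.
def voltage_security_level(bus_voltages_pu: list[float]) -> int:
    """Return a discrete level 0 (secure) .. 3 (insecure) from per-unit bus voltages."""
    worst = max(abs(v - 1.0) for v in bus_voltages_pu)
    if worst <= 0.05:
        return 0   # all voltages within +/-5 %: secure
    if worst <= 0.08:
        return 1   # marginal
    if worst <= 0.10:
        return 2   # alert
    return 3       # insecure

print(voltage_security_level([1.02, 0.97, 0.91, 1.04]))  # worst deviation 0.09 -> level 2
```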


A System of Audio Data Analysis and Masking Personal Information Using Audio Partitioning and Artificial Intelligence API (오디오 데이터 내 개인 신상 정보 검출과 마스킹을 위한 인공지능 API의 활용 및 음성 분할 방법의 연구)

  • Kim, TaeYoung;Hong, Ji Won;Kim, Do Hee;Kim, Hyung-Jong
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.30 no.5
    • /
    • pp.895-907
    • /
    • 2020
  • With the recent increase in the influence of multimedia content other than text-based content, services that help to process the information in such content bring great convenience. Representative features of these services are searching for and masking sensitive data. It is not difficult to find solutions that provide searching and masking functions for text and image data. However, even though the necessity of technology for searching and masking parts of audio data is recognized, solutions are hard to find because of the difficulty of the technology. In this study, we propose a web application that provides searching and masking functions for audio data using an audio partitioning method. In the course of this research, we evaluated several speech-to-text conversion APIs to choose a proper API for our purpose and developed regular expressions for searching for sensitive information. Lastly, we evaluated the accuracy of the developed searching and masking feature. The contribution of this work lies in the design and implementation of searching for and masking sensitive information in audio data, demonstrated through various functionality-proving experiments.
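A minimal sketch of the regex search-and-mask step applied to a speech-to-text transcript. The two patterns (a mobile-number format and a resident-registration-number format) are illustrative assumptions rather than the paper's actual expressions, and the speech-to-text API call itself is omitted.

```python
# Hedged sketch: mask spans of transcript text that match assumed sensitive-data patterns.
import re

SENSITIVE_PATTERNS = [
    re.compile(r"01[016789]-?\d{3,4}-?\d{4}"),   # mobile phone number (assumed format)
    re.compile(r"\d{6}-?[1-4]\d{6}"),            # resident registration number (assumed format)
]

def mask_sensitive(transcript: str, mask: str = "*") -> str:
    """Replace every character of a matched sensitive span with the mask symbol."""
    for pattern in SENSITIVE_PATTERNS:
        transcript = pattern.sub(lambda m: mask * len(m.group()), transcript)
    return transcript

print(mask_sensitive("My number is 010-1234-5678, call me tomorrow."))
```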

Ontology-based Automated Metadata Generation Considering Semantic Ambiguity (의미 중의성을 고려한 온톨로지 기반 메타데이타의 자동 생성)

  • Choi, Jung-Hwa;Park, Young-Tack
    • Journal of KIISE:Software and Applications
    • /
    • v.33 no.11
    • /
    • pp.986-998
    • /
    • 2006
  • There has been an increasing necessity for Semantic Web-based metadata that helps computers efficiently understand and manage the information that has grown with the Internet. However, it seems inevitable to face semantically ambiguous information when metadata is generated, so a solution to this problem is needed. This paper proposes a new method for automated metadata generation that uses the concept of a class, in which ambiguous words embedded in information such as documents are semantically more related to some classes than to others, by using a probability model of consequent words. We consider ambiguities among the concepts defined in an ontology and use a Hidden Markov Model (HMM) to recognize the parts of a named entity. First, we construct a Markov model for the named entities of each class defined in the ontology. Next, we generate the appropriate context from a text to understand the meaning of a semantically ambiguous word and resolve the ambiguities during metadata generation by searching for the optimal Markov model corresponding to the sequence of words included in the context. We experimented with seven semantically ambiguous words extracted from computer science theses. The experimental result demonstrates successful performance, with accuracy improved by about 18% compared with SemTag, which has been known as an effective application for assigning a specific meaning to an ambiguous word based on its context.
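A minimal Viterbi-decoding sketch (all probabilities invented) illustrating the underlying idea: the most probable sequence of ontology classes for the words around an ambiguous term is selected, which is how a Hidden Markov Model resolves its sense. It is not the paper's model or data.

```python
# Toy HMM disambiguation example with made-up states, words, and probabilities.
import math

states = ["Hardware", "Algorithm"]
start_p = {"Hardware": 0.5, "Algorithm": 0.5}
trans_p = {"Hardware": {"Hardware": 0.7, "Algorithm": 0.3},
           "Algorithm": {"Hardware": 0.3, "Algorithm": 0.7}}
emit_p = {"Hardware": {"memory": 0.4, "cache": 0.5, "search": 0.1},
          "Algorithm": {"memory": 0.2, "cache": 0.1, "search": 0.7}}

def viterbi(observations):
    """Return the most probable state (class) sequence for the observed words."""
    trellis = [{s: (math.log(start_p[s] * emit_p[s][observations[0]]), [s]) for s in states}]
    for word in observations[1:]:
        layer = {}
        for s in states:
            prob, path = max(
                (trellis[-1][prev][0] + math.log(trans_p[prev][s] * emit_p[s][word]),
                 trellis[-1][prev][1] + [s])
                for prev in states)
            layer[s] = (prob, path)
        trellis.append(layer)
    return max(trellis[-1].values())[1]

print(viterbi(["cache", "memory", "search"]))
```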