• Title/Abstract/Keyword: Deleted Data

219 search results

대칭확률변수(對稱確率變數)의 대수(對數)의 법칙(法則)에 대하여 (On the Weak Law of Large Numbers for the Sums of Sign-Invariant Random Variables)

  • 홍덕헌
    • Journal of the Korean Data and Information Science Society
    • /
    • 제4권
    • /
    • pp.53-63
    • /
    • 1993
  • We consider various types of weak convergence for sums of sign-invariant random variables. Some results show a similarity between independence and sign-invariance. As a special case, we obtain a result which strengthens a weak law proved by Rosalsky and Teicher [6] in that some assumptions are deleted. (A generic statement of the sign-invariance setting is given after this entry.)

  • PDF
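
As background for readers outside probability theory, the display below gives the standard definition of sign-invariance and the generic form of a weak law of the type studied; it is notation only, not a restatement of the paper's theorem.

```latex
% Background only (standard definitions, not the paper's result):
% a sequence (X_1, X_2, \dots) is sign-invariant if, for every n and every
% choice of signs \epsilon_i \in \{-1, +1\},
(\epsilon_1 X_1, \dots, \epsilon_n X_n) \overset{d}{=} (X_1, \dots, X_n),
% and a weak law of large numbers for S_n = X_1 + \cdots + X_n asserts that,
% under suitable centering a_n and norming b_n,
\frac{S_n - a_n}{b_n} \xrightarrow{\;P\;} 0 \qquad (n \to \infty).
```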

NAND 플래시 파일 시스템을 위한 안전 삭제 기법 (A Secure Deletion Method for NAND Flash File System)

  • 이재흥;오진하;김석현;이상호;허준영;조유근;홍지만
    • 한국정보과학회논문지:컴퓨팅의 실제 및 레터
    • /
    • 제14권3호
    • /
    • pp.251-255
    • /
    • 2008
  • In most file systems, deleting a file only removes or modifies its metadata; the file's data remains on the physical medium. Recovering a deleted file is therefore possible, and some users want to make such recovery impossible. This demand is growing especially in embedded systems that use flash memory as their storage device. In this paper we propose a secure deletion method for NAND flash file systems and apply it to YAFFS, the most representative NAND flash file system. The proposed method is based on encryption: all keys used to encrypt a given file are stored in the same block, so that a single block-erase operation makes the file unrecoverable (the key idea is sketched below). Simulation results show that, even when the block erasures caused by file creation and modification are taken into account, the number of block erasures incurred on file deletion is smaller than with a simple encryption scheme. Although the method is applied only to YAFFS in this paper, it can easily be applied to other NAND-flash-specific file systems.
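
The key-colocation idea described above can be pictured with a small toy model. Everything here (the class name, the 16-byte chunking, the XOR "cipher") is an assumption made for illustration; it is not the paper's YAFFS implementation.

```python
import os
from typing import Dict, List

class ToySecureStore:
    """Toy model of encryption-based secure deletion: every key that encrypts
    a file lives in that file's own key block, so erasing one block suffices."""

    def __init__(self) -> None:
        self.key_blocks: Dict[str, List[bytes]] = {}        # file -> its key block
        self.encrypted_chunks: Dict[str, List[bytes]] = {}  # file -> ciphertext

    def write(self, name: str, data: bytes) -> None:
        keys, chunks = [], []
        for i in range(0, len(data), 16):
            chunk, key = data[i:i + 16], os.urandom(16)
            keys.append(key)
            # Toy XOR "cipher"; a real system would use a proper block cipher.
            chunks.append(bytes(b ^ k for b, k in zip(chunk, key)))
        self.key_blocks[name] = keys          # all keys colocated in one block
        self.encrypted_chunks[name] = chunks

    def secure_delete(self, name: str) -> None:
        # One block-erase of the key block; the ciphertext itself stays behind
        # but can no longer be decrypted.
        del self.key_blocks[name]

store = ToySecureStore()
store.write("secret.txt", b"top secret payload")
store.secure_delete("secret.txt")
```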

시설 노인의 건강보존에 관한 도구 개발 (Scale Development on Health Conservation of the Institutionalized Elderly)

  • 성기월
    • 대한간호학회지
    • /
    • 제35권1호
    • /
    • pp.113-124
    • /
    • 2005
  • Purpose: The purpose of this study was to develop a health conservation scale with high validity and reliability for the institutionalized elderly. Method: The process of developing this scale was as follows. A conceptual framework composed of 4 phases of health conservation of the institutionalized elderly was identified based on a literature review, interviews with elderly people, and discussions with experts in health conservation. A total of 75 items on a 4-point scale were developed. Through reliability testing and factor analysis, 57 preliminary items were selected. By means of internal consistency of the 57 items, 18 items whose inter-item correlation coefficient was below .40 were deleted. Through factor analysis, 2 items whose factor loading was below .40 were deleted. Finally, 37 items remained. To verify the 37 items, factor analysis, reliability testing, and correlation analysis were done. Data were collected from 207 institutionalized elderly subjects in Daegu, Kyungpook, Busan, and KyungNam Province from August 2003 to February 2004. Result: Factor analysis of the 37 items extracted 4 factors, labeled 'personal integrity', 'conservation of energy', 'structural integrity', and 'social integrity'. These factors covered the 4 phases of health conservation. Cronbach's alpha of the 37 items was .9424 and the correlation coefficient with the HPLP was .723. Conclusion: The researchers recommend the following: an explorative study on the variables related to health conservation is needed for the criterion validity of this scale, and studies on the health conservation of different age groups and subjects are needed for verification. (The item-reduction steps are illustrated in the sketch below.)
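
As an illustration of the item-reduction step described in the abstract, here is a minimal sketch assuming simulated 4-point item scores and reading the .40 cut-off as a corrected item-total correlation; the item names and data are hypothetical, not the study's.

```python
# Illustrative item-reduction sketch: simulated latent-trait data discretized to
# a 4-point scale; items below the .40 cut-off are dropped, then Cronbach's
# alpha is reported for the remaining items.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

rng = np.random.default_rng(0)
trait = rng.normal(size=(207, 1))                          # latent score per respondent
raw = 2.5 + trait + rng.normal(scale=0.8, size=(207, 10))  # 10 hypothetical items
data = pd.DataFrame(np.clip(np.round(raw), 1, 4).astype(int),
                    columns=[f"item{i}" for i in range(1, 11)])

total = data.sum(axis=1)
keep = [c for c in data.columns if data[c].corr(total - data[c]) >= 0.40]
reduced = data[keep]                                       # items surviving the cut-off

print(f"{len(keep)} of {data.shape[1]} items kept, "
      f"Cronbach's alpha = {cronbach_alpha(reduced):.3f}")
```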

노인의 생의 의미 측정 도구 개발 (Development of Elderly Meaning in Life (EMIL) Scale)

  • 최순옥;김숙남;신경일;이정지;정유진
    • 대한간호학회지
    • /
    • 제33권3호
    • /
    • pp.414-424
    • /
    • 2003
  • Purpose: The purpose of this study was to develop an elderly meaning-in-life scale with high validity and reliability. Method: The process of developing this scale was as follows. A conceptual framework composed of 4 phases of meaning in life of the elderly was identified based on a literature review, interviews with elderly people, and discussions with experts on meaning in life. A total of 62 items on a 4-point scale were developed. Through reliability testing and factor analysis, 40 preliminary items were selected. By means of internal consistency of the 40 items, 2 items whose inter-item correlation coefficient was below .30 were deleted. Through factor analysis, 1 item whose factor loading was below .30 was deleted. Finally, 37 items remained. To verify the 37 items, factor analysis, reliability testing, and LISREL analysis were done. Data were collected from 320 elderly subjects in Busan-KyungNam and Jeonla Province from May to June 2002. The SPSS WIN 10.0 program was used. Result: Factor analysis of the 37 items extracted 8 factors, labeled 'self-awareness and self-acceptance', 'contentedness with life', 'purpose in life', 'love in family', 'role awareness', 'futuristic aspiration', 'commitment', and 'experience of love'. These factors covered the 4 phases of meaning in life. Cronbach's alpha of the 37 items was .908 and the correlation coefficient with the PIL was .75. Conclusion: The researchers recommend the following: an explorative study on the variables related to meaning in life is needed for the criterion validity of this scale, and studies on the meaning in life of different age groups and subjects are needed for re-verification.

인공신경망 이론을 이용한 위성영상의 카테고리분류 (Multi-temporal Remote-Sensing Image Classification Using Artificial Neural Networks)

  • 강문성;박승우;임재천
    • 한국농공학회:학술대회논문집
    • /
    • 한국농공학회 2001년도 학술발표회 발표논문집
    • /
    • pp.59-64
    • /
    • 2001
  • The objective of this thesis is to propose a pattern classification method for remote sensing data using artificial neural networks. First, we apply the error back-propagation algorithm to classify the remote sensing data. In this case, the classification performance depends on the training data set. Using the training data set and the error back-propagation algorithm, a layered neural network is trained so that the training patterns are classified with a specified accuracy. After training the neural network, pixels that are incorrectly classified are deleted from the original training data set and a new training data set is built up. Once training is complete, a testing data set is classified using the trained neural network (the retraining loop is sketched after this entry). The classification results for Landsat TM data show that this approach produces excellent results that are more realistic and less noisy than those of a conventional Bayesian method.

  • PDF
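
A minimal sketch of the training-set cleaning loop described above, assuming synthetic data and scikit-learn's MLPClassifier as stand-ins for the Landsat TM pixels and the back-propagation network of the paper; all parameter values are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for multi-band pixel samples with class labels.
X, y = make_classification(n_samples=2000, n_features=6, n_informative=4,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1) Train a layered network with back-propagation on the full training set.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)

# 2) Delete incorrectly classified training samples and rebuild the training set.
correct = net.predict(X_train) == y_train
X_clean, y_clean = X_train[correct], y_train[correct]

# 3) Retrain on the cleaned set and classify the test data.
net_clean = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
net_clean.fit(X_clean, y_clean)
print("accuracy after cleaning:", net_clean.score(X_test, y_test))
```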

사료의 P, Ca, Zn, Mg, Fe, K, Mn과 Se이 조피볼락의 성장 및 체성분에 미치는 영향 (Influence of P, Ca, Zn, Mg, Fe, K, Mn, or Se in the Dietary Mineral Premix on Growth and Body Composition of Korean Rockfish (Sebastes schlegeli))

  • 이상민;박승렬
    • 한국수산과학회지
    • /
    • 제31권2호
    • /
    • pp.245-251
    • /
    • 1998
  • To examine the essentiality of various minerals in diets for Korean rockfish, rockfish muscle and casein, both low in major minerals, were selected as the dietary protein sources, and ten experimental diets were prepared: a control containing a reference mineral premix designed in-house for nutritional studies, diets whose premix omitted P, Ca, Zn, Mg, Fe, K, Mn, or Se individually, and a diet with no mineral premix at all. Juvenile rockfish averaging 4.2 g were stocked at 25 fish per tank, with three replicate tanks per diet, and reared for 10 weeks. Growth, feed efficiency, nutrient utilization, and whole-body lipid content were lowest for the diet without the mineral premix (P<0.01). Weight gain of every diet from which P, Ca, Zn, Mg, Fe, K, Mn, or Se was omitted from the reference premix was significantly lower than that of the control containing the full reference premix (P<0.01). Feed efficiency of the groups lacking Mg, Fe, K, or Se did not differ statistically from the control, while the other groups showed lower values than the control (P<0.01). Daily feed intake did not differ significantly among the groups (P>0.01), but protein retention of the P-, Ca-, and Zn-free groups was poorer than that of the control, and lipid retention differed from the control only in the Zn-free group (P<0.01). Body composition of the groups lacking each individual mineral did not differ statistically from the control.

  • PDF

배열기반 데이터 구조를 이용한 간략한 divide-and-conquer 삼각화 알고리즘 (A Compact Divide-and-conquer Algorithm for Delaunay Triangulation with an Array-based Data Structure)

  • 양상욱;최영
    • 한국CDE학회논문집
    • /
    • 제14권4호
    • /
    • pp.217-224
    • /
    • 2009
  • Most divide-and-conquer implementations for Delaunay triangulation use a quad-edge or winged-edge data structure, since triangles are frequently deleted and created during the merge process. However, the proposed divide-and-conquer algorithm uses an array-based data structure that is much simpler than the quad-edge data structure and requires less memory allocation. The proposed algorithm has two important features. First, the information on space partitioning is represented as a permutation vector sequence in a vertex array, so no additional data are required for the space partitioning; the permutation vector represents adaptively divided regions in two dimensions, and two-dimensional partitioning of the space is more efficient than one-dimensional partitioning in the merge process. Second, no edges are deleted in the merge process, so no bookkeeping of complex intermediate states for topology changes is necessary. The algorithm is described compactly with the proposed data structures and operators so that it can be easily implemented with computational efficiency (the array-based layout is sketched below).
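
A rough sketch of the array-based layout and permutation-vector idea described above, under assumed names, sizes, and a simple alternating-axis splitting rule; the paper's actual merge procedure is not reproduced.

```python
# Sketch of an array-based layout in the spirit described above (hypothetical
# names; this shows the data layout idea, not the full divide-and-conquer).
import numpy as np

rng = np.random.default_rng(1)
vertices = rng.random((12, 2))            # (n, 2) coordinate array

def partition_order(idx: np.ndarray, pts: np.ndarray, axis: int = 0) -> np.ndarray:
    """Recursively split indices on alternating axes; the returned permutation
    vector encodes an adaptive 2D space partitioning implicitly."""
    if idx.size <= 3:
        return idx
    order = idx[np.argsort(pts[idx, axis])]
    mid = order.size // 2
    left = partition_order(order[:mid], pts, 1 - axis)
    right = partition_order(order[mid:], pts, 1 - axis)
    return np.concatenate([left, right])

perm = partition_order(np.arange(len(vertices)), vertices)   # permutation vector
# Triangles are plain index triples into `vertices`; the merge would append
# rows here rather than maintain quad-edge records.
triangles = np.empty((0, 3), dtype=np.int64)
print(perm)
```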

Optical Flow Estimation of a Fluid Based on a Physical Model

  • Kim, Jin-Woo
    • Journal of information and communication convergence engineering
    • /
    • 제7권4호
    • /
    • pp.539-544
    • /
    • 2009
  • Estimating a 3D velocity field, including occluded parts, without mixing a tracer into the fluid had not only never been proposed but was also impossible with conventional computer vision algorithms. In this paper, we propose a new method for estimating the three-dimensional optical flow of a fluid based on a physical model, where some boundary conditions are given from a priori knowledge of the flow configuration. Optical flow is obtained by minimizing the mean square errors of the basic constraints and the matching error terms against the visual data, using the Euler equations. Here, the Navier-Stokes equations of motion and the differences between occluded data and observable data are employed as the basic constraints (a generic form of such a functional is shown below). We verify the effectiveness of the proposed method by applying the algorithm to simulated data from which parts were artificially deleted and recovering the missing data. Next, applying the method to a fluid with observable surface data and known boundary conditions, we demonstrate that the 3D optical flow is obtained by the proposed algorithm.
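
A variational formulation of the kind described above can be written generically as a brightness-constancy term plus a Navier-Stokes penalty; the weight λ and the paper's exact constraint terms are left unspecified, so this is background only.

```latex
% Generic sketch, not the paper's exact functional:
\min_{\mathbf{u}}\; E(\mathbf{u}) =
  \int_{\Omega} \bigl( \nabla I \cdot \mathbf{u} + I_t \bigr)^{2}\, d\mathbf{x}
  \;+\; \lambda \int_{\Omega}
  \bigl\| \rho\bigl(\partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}\bigr)
          + \nabla p - \mu \nabla^{2}\mathbf{u} \bigr\|^{2}\, d\mathbf{x}
```

Setting the first variation of E to zero gives the Euler-Lagrange equations that the estimated flow field must satisfy.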

카오스 이론의 Lyapunov 지수를 응용한 안정상태 시뮬레이션의 출력분석 (Output Analysis for Steady-State Simulation Using Lyapunov Exponent in Chaos Theory)

  • 이영해;오형술
    • 대한산업공학회지
    • /
    • 제22권1호
    • /
    • pp.65-82
    • /
    • 1996
  • This paper proposes a sequential procedure that can be used to determine a truncation point and run length so as to reduce or remove the bias caused by artificial start-up conditions in simulations aimed at estimating steady-state behavior. It is based on the idea of the Lyapunov exponent in chaos theory (a minimal illustration of this quantity follows the entry). The performance measures considered are relative bias, coverage, the estimated relative half-width of the confidence interval, and the mean amount of deleted data.

  • PDF
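
The paper's sequential truncation-point procedure is not reproduced here; the sketch below only illustrates the Lyapunov exponent it builds on, computed for the logistic map as an assumed example system.

```python
import math

def logistic_lyapunov(r: float, x0: float = 0.2, n: int = 100_000) -> float:
    """Largest Lyapunov exponent of x -> r*x*(1-x), estimated as the average
    log of |f'(x)| along the orbit."""
    x, acc = x0, 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))   # |f'(x)| = |r(1 - 2x)|
        x = r * x * (1.0 - x)
    return acc / n

print(logistic_lyapunov(4.0))   # ~ ln 2 = 0.693: chaotic regime
print(logistic_lyapunov(3.2))   # negative: stable periodic orbit
```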

Application of NORM to the Multiple Imputation for Multivariate Missing Data

  • 김현정;문승호;신재경
    • Journal of the Korean Data and Information Science Society
    • /
    • 제13권2호
    • /
    • pp.105-113
    • /
    • 2002
  • The statistical analysis of incomplete data requires some way of handling incomplete observations. One simple approach is to delete each case with missing values, so that only complete cases are used. The EM algorithm (Dempster et al., 1977), which involves prediction and estimation steps, is a general method among others. In this article, we use the free software NORM, developed for multiple imputation, which uses the DA (Data Augmentation) algorithm for its imputation, and evaluate its efficiency through a numerical example (an analogous multiple-imputation sketch follows the entry).

  • PDF
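
NORM itself is standalone software, so the sketch below is only an analogous illustration of multiple imputation in Python using scikit-learn's IterativeImputer; the simulated data, the choice of m = 5 imputations, and pooling column means are assumptions for demonstration.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
cov = [[1.0, 0.5, 0.3], [0.5, 1.0, 0.4], [0.3, 0.4, 1.0]]
X = rng.multivariate_normal([0, 0, 0], cov, size=500)
X_miss = X.copy()
X_miss[rng.random(X.shape) < 0.2] = np.nan            # ~20% values missing at random

m = 5                                                  # number of imputed data sets
estimates = []
for i in range(m):
    # sample_posterior=True draws stochastic imputations, analogous in spirit
    # to the data-augmentation draws used by NORM.
    imputer = IterativeImputer(sample_posterior=True, random_state=i)
    X_imp = imputer.fit_transform(X_miss)
    estimates.append(X_imp.mean(axis=0))               # per-imputation column means

pooled = np.mean(estimates, axis=0)                    # pooled point estimate
print("pooled means:", np.round(pooled, 3))
```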