The soil fertility of alpine soils in Gangwon-Do has been deteriorating because of the heavy input of chemical fertilizers for intensive crop production. To reduce the application of chemical fertilizers, the use of livestock manure compost in alpine soils has increased steadily. Soil loss and runoff caused by heavy rainfall in alpine areas lead to nutrient loss from soil and subsequently pollute stream water. Therefore, the objective of this study was to assess nutrient efficiency and loss in Chinese cabbage-cultivated soil treated with different livestock manure composts on several slopes. As a control, chemical fertilizer was applied at a rate of $250-78-168kg\;ha^{-1}$ for $N-P_2O_5-K_2O$. Pig and chicken manure composts were each applied at a rate of $10MT\;ha^{-1}$, and a chemical fertilizer + chicken manure compost treatment was applied at the same rates. The four treatments were applied on 5, 20, and 35% field slopes, respectively. We monitored the amounts of soil loss and runoff water after rainfall events, and analyzed the nutrient contents of soil and runoff water using lysimeters installed at the alpine agricultural institute in Gangwon-Do. T-N loss due to soil loss increased far more with field slope than with the different fertilizer treatments. T-N loss was positively related to field slope, following the regression soil loss (MT/ha) = 1.66 × slope (%) − 3.5 ($r^2$ = 0.99). Available phosphate and exchangeable cations showed a similar tendency with increasing slope. T-N and T-P losses through runoff water were highest in the chemical fertilizer (NPK) + chicken manure compost plot and lowest in the chemical fertilizer treatment. The T-N contents of Chinese cabbage treated with pig or chicken manure compost (2.13 and 1.95%, respectively) were significantly lower than that of the chemical fertilizer treatment (2.65%). This may result from the much greater T-N loss from soil treated with pig and chicken manure composts.
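The reported slope regression can be evaluated directly. The sketch below simply plugs the three studied field slopes into the fitted line as stated in the abstract (it is a reported fit, not a re-derivation from the raw data):

```python
def predicted_soil_loss(slope_pct):
    """Soil loss (MT/ha) predicted by the reported regression:
    soil loss = 1.66 * slope(%) - 3.5, with r^2 = 0.99."""
    return 1.66 * slope_pct - 3.5

# Predicted soil loss at the three field slopes used in the study
for slope in (5, 20, 35):
    print(f"{slope}% slope -> {predicted_soil_loss(slope):.1f} MT/ha")
```

At 5, 20, and 35% slopes this predicts roughly 4.8, 29.7, and 54.6 MT/ha, consistent with the finding that soil (and hence T-N) loss rises steeply with slope.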
The days when people drew up and managed all kinds of documents on paper in the course of their work are now gone; electronic documents have replaced paper. Unlike paper documents, electronic documents maximize job efficiency through their convenience of production and storage. However, they also have disadvantages: it is difficult to distinguish originals from copies as can be done with paper documents; it is not easy to verify whether a document has been altered or damaged; they are prone to alteration and damage from external influences in the electronic environment; and they require an enormous workforce and cost for the immediate measures demanded by changes in the software and hardware environment. Despite these weaknesses, electronic documents account for an ever-greater share of the current work environment thanks to their convenience and production-cost efficiency. Both the government and the private sector have made efforts to devise plans that maximize their advantages while minimizing their risks. One such method is the Authorized Retention Center described in this study. There are two prerequisites for its smooth operation: the legal validity of electronic documents must be guaranteed administratively, and their reliability and authenticity must first be secured technologically. Responding to those needs, the Ministry of Commerce, Industry and Energy and the Korea Institute for Electronic Commerce, the two main bodies driving the Authorized Retention Center project, revised the Electronic Commerce Act and supplemented its provisions to guarantee the legal validity of electronic documents in 2005, and in 2006 conducted research on the long-term preservation of electronic documents and on securing their reliability, as demanded by the center's users.
To help fulfill the goals of the Authorized Retention Center, this study developed a technical standard for the center's electronic record information packages by applying the ISO 14721 information package model, the standard for the long-term preservation of digital data. It also suggested a process for producing and managing information packages so that SIP, AIP, and DIP metadata features would be available at the points of production, preservation, and use of electronic documents and could be implemented according to the center's policies. Building on this, the study presented flow charts covering the production and processing workflow, application methods, and packages of the technical standard for electronic record information packages at the center, and suggested issues in the field of records management that should be researched continuously based on the results.
The quality of school education is a key element of national education development. An important factor determining the quality of school education is the quality of the teachers responsible for it in the field. It is therefore necessary to hire competent teachers through the secondary school teacher appointment exam. This necessity is especially evident for vocational high schools and Meister high schools with the introduction of the NCS-based 2015 revised curriculum, which separates the three subjects "Electrical", "Electronics", and "Communication", changing the question mechanism and requiring a new design of the assessment and content areas. This study therefore analyzed the college-of-education curricula for "Electrical", "Electronics", and "Communication"; the NCS-based 2015 revised curriculum; the development of standards for teacher qualifications; and the assessment areas and evaluation of teaching ability in the "Electrical, Electronics, and Communication Engineering" subjects of the 2009 teacher appointment exam. The assessment areas and content elements of "Electrical", "Electronics", and "Communication" were extracted from the analyzed results, verified through expert consultation, and presented as follows. First, the assessment area and content elements of the "Electrical" subject were designed to evaluate the NCS-based 2015 revised curriculum by presenting the NCS learning module as the evaluation area and content element of the basic subject "Electrical and Electronics Practice". Second, the "Electronics" section presented assessment areas and content elements applying "Electronic Circuit", a basic NCS subject, and added "Electromagnetics", the foundation of electronics, to the "Application of Electromagnetic Waves" area so that it could be assessed.
Third, the assessment area and content elements of "Communication" consist of communication-related practice built on "Electrical" and "Electronics", considering the characteristics of communication engineering. In particular, "Electrical and Electronics Practice", which adds network construction practice and communication-related practice, makes it possible to evaluate communication-related practical education.
Lee, Minji; Shin, Juyong; Kim, Jin Ho; Lim, Young Kyun; Cho, Hoon; Baek, Seung Ho / Korean Journal of Environmental Biology / v.36 no.3 / pp.359-369 / 2018
Harmful algal blooms (HABs) are a serious problem for public health and the fisheries industry; thus, there is a need to investigate possible ways of controlling HABs effectively. In the present study, we investigated the algicidal effects of the newly developed GreenTD against HAB species (Chattonella marina, Heterosigma akashiwo, Cochlodinium polykrikoides, and Heterocapsa circularisquama) and non-HAB species (Chaetoceros simplex, Skeletonema sp., and Tetraselmis sp.), focusing on different population densities and concentration gradients of the algicidal substance. The time-series viability of each target alga was assessed based on Chl. a photosynthetic efficiency in terms of $F_v/F_m$ and in vivo fluorescence (FSU). Effective control of the raphidophytes C. marina and H. akashiwo was achieved at GreenTD concentrations of $0.5{\mu}gL^{-1}$ and $0.2{\mu}gL^{-1}$, respectively, and regrowth of either species was not observed even after 14 days. The inhibition ratio of the dinoflagellate C. polykrikoides was more than 80% at $0.2{\mu}gL^{-1}$ GreenTD. H. circularisquama was consistently affected in the presence of $0.2{\mu}gL^{-1}$ GreenTD in both the high- and low-population-density experimental groups. On the other hand, the diatoms C. simplex and Skeletonema sp. were not significantly affected even in the presence of $0.2{\mu}gL^{-1}$ GreenTD and exhibited regrowth over the course of incubation. In particular, the green alga Tetraselmis sp. remained unaffected even at the highest GreenTD concentration ($1.0{\mu}gL^{-1}$), implying that non-HAB species were not greatly influenced by the algicidal substance. Overall, the algicidal activity of GreenTD on harmful and non-harmful algae ranked as follows: raphidophytes > dinoflagellates > diatoms > green algae.
Consequently, our results indicate that inoculating GreenTD into natural blooms at a threshold concentration ($0.2{\mu}gL^{-1}$) can maximize algicidal activity against HAB species. Considering the dilution and diffusion rates expected in field application, GreenTD is expected to be economically efficient, leading to effective control of the target HABs in closed bays.
Korean Journal of Agricultural and Forest Meteorology / v.21 no.3 / pp.135-145 / 2019
Analysis of a long cycle or trend in time series data from long-term observation requires comparability between data observed in the past and the present. In the present study, we propose an approach to ensure compatibility among the instruments used for long-term observation, which would help secure the continuity of the data. An open-path gas analyzer (Model LI-7500, LI-COR, Inc., USA) had been used for eddy covariance flux measurement in the Gwangneung deciduous forest for more than 10 years before it was replaced by an enclosed-path gas analyzer (Model EC155, Campbell Scientific, Inc., USA) in July 2015. Before the replacement was completed, carbon dioxide ($CO_2$) and latent heat fluxes were collected with both gas analyzers simultaneously during a five-month period from August to December 2015. The $CO_2$ fluxes were not significantly different between the gas analyzers when the daily mean temperature was higher than $0^{\circ}C$. However, when the daily mean temperature was lower than $0^{\circ}C$, the $CO_2$ flux measured by the open-path gas analyzer was negatively biased (from a positive sign, i.e., carbon source, to zero or a negative sign, i.e., carbon neutral or sink) due to heating of the instrument surface. Despite applying the frequency response correction associated with tube attenuation of water vapor, the latent heat flux measured by the enclosed-path gas analyzer was on average 9% smaller than that measured by the open-path gas analyzer, which resulted in a >20% difference in the sums over the study period. These results indicate that an additional air density correction is needed to account for instrument heating, and that analysis of long-term observational flux data would be facilitated by understanding the tendency of an enclosed-path gas analyzer to underestimate latent heat flux.
Fake news has emerged as a significant issue over the last few years, igniting discussion and research on how to solve the problem. In particular, studies on automated fact-checking and fake news detection using artificial intelligence and text analysis techniques have drawn attention. Fake news detection research entails a form of document classification, so document classification techniques have been widely used in this field, while document summarization techniques have remained inconspicuous. At the same time, automatic news summarization services have become popular, and a recent study found that using news summarized through abstractive summarization strengthened the predictive performance of fake news detection models. Therefore, the integration of document summarization technology needs to be studied in the domestic news data environment. To examine the effect of extractive summarization on fake news detection models, we first summarized news articles through extractive summarization, then created a detection model based on the summarized news, and finally compared it with a full-text-based detection model. The study found that BPN (Back Propagation Neural Network) and SVM (Support Vector Machine) did not exhibit a large performance difference, while for DT (Decision Tree) the full-text-based model performed somewhat better. In the case of LR (Logistic Regression), our model exhibited superior performance, although the difference from the full-text-based model was not statistically significant. Therefore, when summarization is applied, at least the core information of the fake news is preserved, and the LR-based model shows the possibility of performance improvement.
This study features an experimental application of extractive summarization in fake news detection research by employing various machine-learning algorithms. The study's limitations are, essentially, the relatively small amount of data and the lack of comparison between various summarization technologies. Therefore, an in-depth analysis that applies various analytical techniques to a larger data volume would be helpful in the future.
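The pipeline described above — summarize first, then classify — hinges on the extractive summarization step. The sketch below uses a naive frequency-based sentence scorer as a stand-in for the summarization technique (the study does not specify its exact method here), keeping the highest-scoring sentences in their original order:

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Naive extractive summary: score each sentence by the corpus-wide
    frequency of its words and keep the n highest-scoring sentences,
    preserving their original order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r'\w+', s.lower())),
        reverse=True,
    )
    keep = set(scored[:n_sentences])
    return ' '.join(s for s in sentences if s in keep)

article = "Fake news spreads fast. Fake news harms trust. I had lunch."
print(extractive_summary(article, n_sentences=2))
```

In the study itself, the summarized texts were then fed to the BPN, SVM, DT, and LR classifiers in place of the full articles.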
In recent years, interest in private archives large and small has been growing alongside the management of public records. Dedicated archives take various forms, and because of shortages of personnel and budget and the frequent absence of records management professionals, maintaining these records systematically is not easy. Demand for archival systems has continued to rise, but the budget and professionals needed to meet it are lacking. As a way to ease the burden of operating a dedicated archive system, this study introduces the trends and significance of open-source records systems and examines the functions of AtoM in detail. AtoM is open-source software that runs on the web services and database servers it requires. It can be used free of charge, without restrictions imposed by a particular application or operating system, and is convenient to install and operate. It is also highly compatible and scalable, making AtoM convenient for private archives experiencing shortages of personnel and budget. Because it offers excellent interoperability, search, sharing, and use in terms of data management, it could in the future also support the use of records through a network of institutional and private archives. Further discussion is needed on enhancing exhibition services through cooperation with Omeka and on long-term preservation through Archivematica. As records management expands from the public sector into the private sphere, open-source software can play an important role in balancing records systems. Close collaboration between academia and the field on open-source records systems, including user studies, should continue, and we look forward to realizing cooperation and sharing among private archives.
For disaster management and mitigation of earthquakes on the Korean Peninsula, active fault investigation has been conducted for the past 5 years. In particular, the investigation of sediment-covered active faults integrates geomorphological analysis of airborne LiDAR data, surface geological surveys, and geophysical exploration, and unearths subsurface active faults through trench surveys. However, the fault traces revealed by trench surveys are available for investigation only for a limited time before the trenches are restored to their previous condition. Thus, the geological data describing fault trench sites remain only as qualitative data in research articles and reports. To overcome the limitations imposed by the temporary nature of such geological records, we used a terrestrial LiDAR to produce 3D point clouds of fault trench sites and restored them in a digital space. Terrestrial LiDAR scanning was conducted at two trench sites near the Yangsan Fault, acquiring amplitude and reflectance from the surveyed area as well as color information by combining photogrammetry with the LiDAR system. The scanned data were merged into 3D point clouds with an average geometric error of 0.003 m, accurate enough to restore the details of the surveyed trench sites. However, further post-processing of the scanned data would be necessary, because the amplitudes and reflectances of the point clouds varied with scan position and the colors of the trench surfaces were captured differently depending on the light exposure at the time. Such point clouds are also quite large and can be visualized only through a limited set of software tools, which limits data sharing among researchers. As an alternative, we suggest Potree, an open-source web-based platform, for visualizing the point clouds of the trench sites. As a result, this study showed that terrestrial LiDAR data can practically increase the reproducibility of geological field studies and make them easily accessible to researchers and students in the Earth sciences.
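The 0.003 m average geometric error reported above is a registration residual between the merged scans. Assuming it is computed as the mean distance between corresponding points of overlapping scans (the text does not give the exact formula, and the coordinates below are purely illustrative), it can be sketched as:

```python
import math

def mean_registration_error(correspondences):
    """Average Euclidean distance between corresponding points of two
    merged terrestrial LiDAR scans -- a simple registration residual."""
    return sum(math.dist(p, q) for p, q in correspondences) / len(correspondences)

# Hypothetical corresponding points from two overlapping scans (metres)
pairs = [
    ((0.0, 0.0, 0.0), (0.0, 0.0, 0.003)),
    ((1.0, 2.0, 0.5), (1.0, 2.0, 0.503)),
]
print(mean_registration_error(pairs))  # a 3 mm mean offset
```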
The purpose of this study is to determine the order of priority for the use of amendments, matching the optimal amendment to specific sites in Korea. This decision-making process must prioritize both the stabilization efficiency and the economic efficiency of amendments for heavy metals and metalloids based on domestic site contamination scenarios. For this study, a total of 5 domestic heavy metal-contaminated sites were selected based on different pollution scenarios, along with 13 amendments previously studied as soil stabilizers. Batch extraction experiments were performed to quantify the stabilization efficiency for 8 heavy metals (including As and Hg) in the 5 soil samples representing the 5 pollution scenarios. For each amendment, XRD and XRF analyses to identify its properties, the toxicity characteristic leaching procedure (TCLP) test, and the synthetic precipitation leaching procedure (SPLP) test were also conducted to evaluate its leaching safety at the applied site. From the batch experiment results, the amendments showing > 20% reduction in extraction for each heavy metal (or metalloid) were selected, and the top 5 ranked amendments were determined for each amendment dosage and extraction time condition. For each amendment, the total number of times it ranked in the top 5 was counted, prioritizing the feasible amendments for specific contaminated sites in Korea. Mine drainage treatment sludge, iron oxide, calcium oxide, calcium hydroxide, calcite, iron sulfide, and biochar showed high extraction-reduction efficiency for heavy metals, in descending order. When the economic efficiency of these amendments was analyzed, mine drainage treatment sludge, limestone, steelmaking slag, calcium oxide, and calcium hydroxide were determined as the priority amendments for Korean field application, in descending order.
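The prioritization step — counting how often each amendment appears in a top-5 ranking across dosage and extraction-time conditions — can be sketched as follows. The amendment names and rankings below are illustrative, not the study's actual data:

```python
from collections import Counter

def prioritize(top5_rankings):
    """Count each amendment's appearances across all top-5 rankings
    (one ranking per tested condition) and sort by count, descending."""
    counts = Counter()
    for ranking in top5_rankings:
        counts.update(ranking[:5])
    return [name for name, _ in counts.most_common()]

# Illustrative top-5 rankings from three hypothetical test conditions
rankings = [
    ["mine drainage sludge", "calcium oxide", "calcite", "iron sulfide", "biochar"],
    ["mine drainage sludge", "iron oxide", "calcium oxide", "calcite", "calcium hydroxide"],
    ["mine drainage sludge", "iron oxide", "biochar", "iron sulfide", "steelmaking slag"],
]
print(prioritize(rankings)[0])  # the most frequently top-ranked amendment
```

The same counting scheme can be re-run on cost-adjusted rankings to obtain the separate economic-efficiency priority list.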
Lim, Hye Jin; Jeong, Da Woon; Yoo, Seong Joon; Gu, Yeong Hyeon; Park, Jong Han / The Journal of Korean Institute of Next Generation Computing / v.14 no.6 / pp.30-43 / 2018
Many studies have been carried out on retrieving images using characteristic features such as color, shape, and texture, and research related to crop disease images is also progressing. In this paper, to help identify diseases occurring in crops grown in agricultural fields, we propose a similarity-based crop disease search system using disease images of horticultural crops. The proposed system improves similarity retrieval performance over existing systems by using combination descriptors rather than a single descriptor, and applies a weight-based calculation method to provide users with highly readable similarity search results. A total of 13 descriptors were used in combination. We retrieved diseases of six crops using combination descriptors, and the combination descriptor with the highest average accuracy for each crop was selected as that crop's combination descriptor. The retrieval results were expressed as percentages using a calculation method based on the ratio of disease names and a calculation method based on weights. The ratio-based method has the problem that its output depends on the number of images used in the query and similarity search; to address this, we used the weight-based calculation method. We applied test images of each disease name to both calculation methods to measure the classification performance of the retrieval results, and compared the average retrieval performance of the two methods for each crop. In the cases of red pepper and apple, the performance of the ratio-based method was about 11.89% higher on average than that of the weight-based method.
In the cases of chrysanthemum, strawberry, pear, and grape, the performance of the weight-based calculation method was about 20.34% higher on average than that of the ratio-based method. In addition, the UI/UX of the proposed system was configured for convenience based on feedback from actual users. Each system screen has a title and description at the top and is laid out so that users can conveniently view information on the diseases. For diseases retrieved with the calculation method proposed above, images and names of similar diseases are displayed. The system is implemented for use with web browsers in both PC and mobile device environments.
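The two score calculations compared above can be sketched as follows. The exact formulas are not given in the text, so this assumes the ratio method is a simple count share over the top-k retrieved images and the weight method sums each hit's similarity score before normalizing; the disease names and similarity values are illustrative:

```python
from collections import Counter

def ratio_scores(retrieved):
    """Ratio method: percentage share of each disease name among the top-k hits."""
    counts = Counter(name for name, _ in retrieved)
    k = len(retrieved)
    return {name: 100.0 * c / k for name, c in counts.items()}

def weighted_scores(retrieved):
    """Weight method: each hit contributes its similarity score,
    normalized so the per-disease totals sum to 100%."""
    totals = {}
    for name, sim in retrieved:
        totals[name] = totals.get(name, 0.0) + sim
    total = sum(totals.values())
    return {name: 100.0 * t / total for name, t in totals.items()}

# Illustrative top-3 retrieval for one query image: (disease name, similarity)
hits = [("anthracnose", 0.9), ("anthracnose", 0.8), ("powdery mildew", 0.3)]
print(ratio_scores(hits))     # anthracnose ~66.7% by count share
print(weighted_scores(hits))  # anthracnose 85.0% -- similarity-weighted
```

The weighted variant lets two hits with high similarity outrank three weak ones, which is the behavior the weight-based method is introduced to provide.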