• Title/Summary/Keyword: 시간정보추출 (temporal information extraction)

Search Results: 2,170, Processing Time: 0.04 seconds

Elucidation of Dishes High in N-Nitrosamines Using Total Diet Study Data (총식이조사 자료를 이용한 음식별 니트로사민 함량 분포 규명)

  • Choi, Seul Ki;Lee, Youngwon;Seo, Jung-eun;Park, Jong-eun;Lee, Jee-yeon;Kwon, Hoonjeong
    • Journal of Food Hygiene and Safety
    • /
    • v.33 no.5
    • /
    • pp.361-368
    • /
    • 2018
  • N-nitrosamines are probable or possible human carcinogens, produced by the reaction between secondary amines and nitrogen oxides in acidic environments or by heating. The common risk assessment procedure compares exposure, expressed in mg/kg body weight/day, against a health-based reference dose in the same unit. This procedure is suitable for policy decision-making but is of little help to consumers making dietary decisions. Therefore, the distributions of NDMA (N-nitrosodimethylamine), NDBA (N-nitrosodibutylamine), and the sum of six N-nitrosamines (NDMA, NDBA, NDEA (N-nitrosodiethylamine), NPYR (N-nitrosopyrrolidine), NPIP (N-nitrosopiperidine), and NMOR (N-nitrosomorpholine)) in menus grouped by main ingredient and cooking method were analyzed to generate consumer-friendly information on food contaminants. Recipes and intakes were taken from the 2014-2016 KNHANES (Korean National Health and Nutrition Examination Survey), and only data from respondents aged 7 years or older were used. The contamination data were collected from the 2014-2016 Total Diet Study, and all analyses were performed using R software. Rockfish, eel, anchovy broth, and pollock were the main sources of N-nitrosamine exposure. Among cooking methods, soups and stews contained the highest amounts of N-nitrosamines. Cereals, fruits, and dairy products among the ingredient categories, and rice dishes and rice combined with other foods among the recipe categories, had the lowest N-nitrosamine exposure levels. Unlike other cooking-related food contaminants, N-nitrosamine exposure was highest for boiled dishes such as soups and stews and for dishes consisting mainly of fish and shellfish, a large departure from the previous view that processed meat is the main source of N-nitrosamines.
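
The exposure computation described above (contamination level × intake, aggregated by cooking method) can be sketched in a few lines. The dish names and figures below are invented for illustration and are not the study's data, and the paper's actual analysis was performed in R:

```python
from collections import defaultdict

# Hypothetical records: (dish, cooking method, NDMA in µg/kg, intake in g/day)
records = [
    ("anchovy broth soup", "soup/stew", 1.2, 150.0),
    ("pollock stew",       "soup/stew", 0.9, 200.0),
    ("grilled eel",        "grilled",   0.6, 80.0),
    ("steamed rice",       "rice dish", 0.05, 300.0),
]

def exposure_by_method(records):
    """Exposure (ng/day) = contamination (µg/kg) x intake (g/day),
    since 1 µg/kg = 1 ng/g; aggregate the mean by cooking method."""
    groups = defaultdict(list)
    for dish, method, conc_ug_per_kg, intake_g in records:
        groups[method].append(conc_ug_per_kg * intake_g)  # ng/day per dish
    return {m: sum(v) / len(v) for m, v in groups.items()}

print(exposure_by_method(records))
```

With these made-up numbers the soup/stew group comes out highest, matching the abstract's finding that boiled dishes dominate exposure.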

A study on application of fractal structure on graphic design (그래픽 디자인에 있어서 프랙탈 구조의 활용 가능성 연구)

  • Moon, Chul
    • Archives of design research
    • /
    • v.17 no.1
    • /
    • pp.211-220
    • /
    • 2004
  • Chaos theory and fractal theory, which have risen to prominence as a new paradigm in natural science, suggest that nature should be understood neither as an undifferentiated whole nor as a collection of separate elements. Fractal dimensions are used to measure the complexity of objects: we now have ways of measuring things that were traditionally considered meaningless or impossible to measure, and they can describe many irregularly shaped objects, including humans and nature. Expressing the complexity of nature in non-integer dimensions is a suitable approach when our point of view leans toward the non-linear, the diverse, endless time, and complexity. Applying fractal geometry and chaos theory broadly to art is a territory of imagination where art and science meet, yet there has not been much research in this area. In this study, formative vocabulary was extracted by analyzing objective data in order to grasp the formative principles and geometric characteristics of distinctive fractal figures. The focus is not so much on fractals in mathematics as on the concepts of self-similarity, recursiveness, and randomness, on devices expressed from indescribable space, and on their formative similarity to graphic design. Fractal figures have the characteristic that the structure does not change its nature even when the generating process is repeated infinitely many times; the limit of that process is the fractal. Almost all fractals are at least partially self-similar: a part of the fractal resembles the entire fractal even when magnified to an arbitrarily small scale, which means that any part carries all the information needed to reconstruct the whole.
Based on this, the research examines the possibility of analyzing the geometric characteristics of fractals as formative elements in graphic design. As a result, beautiful proportions appear in graphic design through mathematical calculation. Since the fractal dimension allows us to measure the complexity of an object, fractal geometry should be an appropriate language for expressing nature and should yield high added value at the intersection of art and science. At a stage when the necessity of accepting this demand and adapting to the change is gathering strength, this research is significant.
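
The non-integer "fractal dimension" mentioned above can be illustrated for exactly self-similar figures, where the similarity dimension is log N / log(1/r) for a figure made of N copies of itself at scale r (a standard formula, not taken from the paper):

```python
from math import log

def similarity_dimension(copies, scale):
    """For an exactly self-similar set made of `copies` pieces, each a
    copy of the whole scaled by `scale`, the similarity dimension is
    log(N) / log(1/r) -- generally a non-integer."""
    return log(copies) / log(1 / scale)

# Koch curve: 4 copies at 1/3 scale -> dimension between 1 and 2
koch = similarity_dimension(4, 1 / 3)        # ≈ 1.2619
# Sierpinski triangle: 3 copies at 1/2 scale
sierpinski = similarity_dimension(3, 1 / 2)  # ≈ 1.5850
```

This is the sense in which a fractal's complexity sits "between" the classical integer dimensions of line and plane.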


Creation of Actual CCTV Surveillance Map Using Point Cloud Acquired by Mobile Mapping System (MMS 점군 데이터를 이용한 CCTV의 실질적 감시영역 추출)

  • Choi, Wonjun;Park, Soyeon;Choi, Yoonjo;Hong, Seunghwan;Kim, Namhoon;Sohn, Hong-Gyoo
    • Korean Journal of Remote Sensing
    • /
    • v.37 no.5_3
    • /
    • pp.1361-1371
    • /
    • 2021
  • Among smart city services, the crime and disaster prevention sector accounted for the largest share, 24%, in 2018. The most important platform for providing real-time situational information is CCTV (closed-circuit television), so constructing the actual CCTV surveillance coverage is essential to maximizing the usability of CCTV. However, the number of CCTV cameras installed in Korea exceeds one million, including those operated by local governments, and manually identifying each camera's coverage is a time-consuming and inefficient process. This study proposed a method to efficiently construct CCTV's actual surveillance coverage and reduce the time required for decision-makers to manage a situation. First, the exterior orientation parameters and focal lengths of the pre-installed CCTV cameras, which are difficult to access, were calculated using MMS (Mobile Mapping System) point cloud data, and the FOV (field of view) was calculated accordingly. Second, using this FOV, the actual surveillance coverage was constructed at grid intervals of 1 m, 2 m, 3 m, 5 m, and 10 m, taking into account the regions occluded by buildings. Applying this approach to five CCTV cameras located in Uljin-gun, Gyeongsangbuk-do, the average re-projection error was about 9.31 pixels, and the difference between the calculated CCTV positions and the locations obtained from the MMS was about 1.688 m on average. With a 3 m grid, the surveillance coverage calculated by this method matched the actual coverage obtained by visual inspection with agreement ranging from 70.21% to 93.82%.
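
The grid-based coverage construction described above can be sketched as follows. This simplified version classifies grid-cell centers by range and viewing angle only and ignores building occlusion, which the paper handles with the MMS point cloud; all camera parameters here are hypothetical:

```python
from math import atan2, hypot, radians, pi

def visible_cells(cam_xy, heading_deg, fov_deg, max_range, grid, cell):
    """Return the set of grid cells whose centers fall inside the
    camera's angular sector and range. Occlusion is NOT modeled here."""
    cx, cy = cam_xy
    half = radians(fov_deg) / 2
    heading = radians(heading_deg)
    out = set()
    for i in range(grid[0]):
        for j in range(grid[1]):
            px, py = (i + 0.5) * cell, (j + 0.5) * cell  # cell center
            if hypot(px - cx, py - cy) > max_range:
                continue
            ang = atan2(py - cy, px - cx)
            # wrap the angular difference into [-pi, pi)
            diff = (ang - heading + pi) % (2 * pi) - pi
            if abs(diff) <= half:
                out.add((i, j))
    return out

# Hypothetical camera at the origin, looking at 45°, 90° FOV, 30 m range,
# evaluated on a 10x10 grid with 3 m cells (one of the paper's intervals)
cells = visible_cells((0.0, 0.0), 45.0, 90.0, 30.0, (10, 10), 3.0)
```

In the paper the same idea is applied per camera after the FOV is recovered from the point cloud, with occluded cells removed using building geometry.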

A Study on the Possibility of Short-term Monitoring of Coastal Topography Changes Using GOCI-II (GOCI-II를 활용한 단기 연안지형변화 모니터링 가능성 평가 연구)

  • Lee, Jingyo;Kim, Keunyong;Ryu, Joo-Hyung
    • Korean Journal of Remote Sensing
    • /
    • v.37 no.5_2
    • /
    • pp.1329-1340
    • /
    • 2021
  • The intertidal zone, a transitional zone between ocean and land, requires continuous monitoring because various changes occur there rapidly due to human activity and natural disturbance. Remote sensing is considered effective for overcoming the limited accessibility of the intertidal zone and for observing long-term topographic change. Most existing coastal topographic monitoring studies based on remote sensing have used high-spatial-resolution imagery such as Landsat and Sentinel. This study extracted the waterline from GOCI-II (Geostationary Ocean Color Imager-II) data using the NDWI, identified changes in the intertidal area of Gyeonggi Bay at various tidal heights, and examined the feasibility of DEM generation and of observing short-term changes in topographic elevation. GOCI-II (249 scenes), Sentinel-2A/B (39 scenes), and Landsat 8 OLI (7 scenes) images of Gyeonggi Bay were obtained from October 8, 2020 to August 16, 2021. To generate an intertidal DEM, Sentinel and Landsat imagery required at least 3 months to 1 year of data collection, whereas the GOCI-II satellite could generate an intertidal DEM of Gyeonggi Bay using only one day of data across tidal heights, with topographic elevation also observed through exposure frequency. For monitoring coastal topography change with GOCI-II, a promising strategy is to detect changes early through its short revisit cycle and to compensate for its coarse spatial resolution by interpolating with high-resolution multi-source remote sensing data. Based on these results, expanding the study area and developing automated analysis and detection techniques are expected to enable the rapid provision of information needed for up-to-date topographic mapping and coastal management of the Korean Peninsula.
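
The waterline extraction above relies on the standard NDWI definition, NDWI = (Green − NIR) / (Green + NIR): water reflects green light but absorbs near-infrared, so water pixels have positive NDWI. A minimal sketch with synthetic reflectance values (not GOCI-II data):

```python
import numpy as np

def ndwi(green, nir, eps=1e-9):
    """NDWI = (Green - NIR) / (Green + NIR); eps avoids division by zero."""
    g = np.asarray(green, dtype=float)
    n = np.asarray(nir, dtype=float)
    return (g - n) / (g + n + eps)

def water_mask(green, nir, threshold=0.0):
    """Threshold NDWI to classify water pixels; the waterline is the
    boundary of this mask. A zero threshold is a common default."""
    return ndwi(green, nir) > threshold

# Synthetic 2x2 reflectances: top row behaves like water, bottom like land
green = np.array([[0.30, 0.30], [0.10, 0.10]])
nir   = np.array([[0.05, 0.05], [0.30, 0.30]])
mask = water_mask(green, nir)
```

Repeating this per scene at different tidal heights, and counting how often each pixel is exposed, is what lets the paper build a one-day intertidal DEM from GOCI-II's frequent observations.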

Analyzing Self-Introduction Letter of Freshmen at Korea National College of Agricultural and Fisheries by Using Semantic Network Analysis : Based on TF-IDF Analysis (언어네트워크분석을 활용한 한국농수산대학 신입생 자기소개서 분석 - TF-IDF 분석을 기초로 -)

  • Joo, J.S.;Lee, S.Y.;Kim, J.S.;Kim, S.H.;Park, N.B.
    • Journal of Practical Agriculture & Fisheries Research
    • /
    • v.23 no.1
    • /
    • pp.89-104
    • /
    • 2021
  • Based on TF-IDF weights, which evaluate the importance of words playing a key role, a semantic network analysis (SNA) was conducted on the self-introduction letters of 2020 freshmen at Korea National College of Agriculture and Fisheries (KNCAF). The top three words by TF-IDF weight were agriculture, mathematics, and study (Q. 1); clubs, plants, and friends (Q. 2); friends, clubs, and opinions (Q. 3); and mushrooms, insects, and fathers (Q. 4). Among the relationships between words, the words with high betweenness centrality were reason, high school, and attending (Q. 1); garbage, high school, and school (Q. 2); importance, misunderstanding, and completion (Q. 3); and processing, feed, and farmhouse (Q. 4). The words with high degree centrality were high school, inquiry, and grades (Q. 1); garbage, cleanup, and class time (Q. 2); opinion, meetings, and volunteer activities (Q. 3); and processing, space, and practice (Q. 4). Word pairs with a high frequency of co-occurrence, that is, high correlation, included 'certification - acquisition', 'problem - solution', 'science - life', and 'misunderstanding - concession'. In the cluster analysis, the number of clusters determined from the height of the cluster dendrogram was 2 (Q. 1), 4 (Q. 2 and Q. 4), and 5 (Q. 3); cohesion within clusters was high, and heterogeneity between clusters was clearly shown.
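
The TF-IDF weighting underlying the analysis can be sketched with the common tf × log(N/df) formulation (the paper does not state which exact variant it uses, so this is an illustrative assumption, and the toy corpus below is invented):

```python
from math import log
from collections import Counter

def tf_idf(docs):
    """TF-IDF with raw term frequency and idf = log(N / df).
    A word that appears in every document gets idf = 0, so ubiquitous
    filler words are down-weighted automatically."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))          # document frequency per word
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append({w: tf[w] * log(n / df[w]) for w in tf})
    return scores

# Toy corpus echoing the abstract's themes
docs = [
    "i like agriculture and mathematics".split(),
    "i like clubs and plants".split(),
    "i value friends and clubs".split(),
]
s = tf_idf(docs)
```

Words such as "agriculture" (unique to one letter) score high, while "i" (present everywhere) scores zero, which is exactly why TF-IDF surfaces the distinctive vocabulary the SNA is built on.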

Factors Influencing Satisfaction on Home Visiting Health Care Service of the Elderly based on the degree of chronic diseases (만성질환 유병상태에 따른 노인 방문건강관리 서비스 만족도 영향요인 연구)

  • Seo, Daram;Shon, Changwoo
    • 한국노년학
    • /
    • v.41 no.2
    • /
    • pp.271-284
    • /
    • 2021
  • This study was conducted to identify factors that affect satisfaction with home visiting health care services and to inform the development of effective community care models, using the results of Seoul's outreach service, a foundation of Korean community care. The study population comprised adults aged 65 and 70 who participated in the 3rd stage (July 2017 - June 2018) and 4th stage (July 2018 - June 2019) of Seoul's outreach community services. A sample of 2,200 people was drawn by the proportional allocation method, and home visit interviews were conducted with them. Subjects were divided into sub-groups by chronic disease prevalence, and logistic regression was used to derive factors affecting satisfaction with home visiting health care services. The results demonstrated that the elderly without chronic diseases were more satisfied when they received health education and counseling services, while those with one chronic disease were more satisfied when they received community resource-linked services. For elderly people with two or more chronic diseases, satisfaction increased when health condition assessment and community resource-linked services were provided. Regardless of chronic disease status, service delivery time was a factor that increased satisfaction with home visiting health care, and the degree of understanding of the explanations given was a factor that increased satisfaction for both single and multiple chronic disease patients. Community-based home visiting health care is a key component of ongoing community care. To increase the sustainability and effectiveness of community care, community-oriented health care services should be tailored to the degree of chronic disease among the elderly.
To provide more effective services, however, it is necessary (1) to establish a linkage system that shares the health information held by the National Health Insurance Service with local governments, and (2) to provide capacity-building education for visiting nurses to improve the quality of home visiting health care services. It is hoped that this study will be used as basic data for the successful establishment of community care.

Extension Method of Association Rules Using Social Network Analysis (사회연결망 분석을 활용한 연관규칙 확장기법)

  • Lee, Dongwon
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.4
    • /
    • pp.111-126
    • /
    • 2017
  • Recommender systems based on association rule mining contribute significantly to sellers' sales by reducing the time consumers spend searching for products they want. Recommendations based on transaction frequency, such as order counts, can effectively identify which of many products are statistically marketable. A product with high sales potential, however, can be omitted from the recommendations if it records an insufficient number of transactions at the beginning of its sale. Products missing from association-based recommendations may lose the chance of exposure to consumers, which leads to fewer transactions; diminished transactions in turn create a vicious circle of lost opportunities to be recommended, so initial sales are likely to remain stagnant for a certain period. Products susceptible to fashion or seasonality, such as clothing, may be greatly affected. This study aimed to expand association rules so that the recommendation list includes products whose initial transaction frequency is low despite their potential for high sales. The specific goal is to predict the strength of the future direct connection between two unconnected items from the properties of the paths located between them. An association between two items revealed in transactions can be interpreted as an interaction between them, which can be expressed as a link in a social network whose nodes are items. The first step calculates the centralities of the nodes lying on the paths that indirectly connect two nodes lacking a direct connection. The next step identifies the number of such paths and the shortest among them. These extracted features are used as independent variables in a regression analysis that predicts the future connection strength between the nodes.
The connection strength between two nodes, defined by the number of links formed between them, is measured after a certain period of time. The regression results confirm that the number of paths between two products, the length of the shortest path, and the number of neighboring items connected to the products are significantly related to their potential connection strength. This study used actual order transaction data collected over three months, from February to April 2016, from an online commerce company. To limit the complexity of the analysis as the network grows, the analysis was performed only on miscellaneous goods. Two consecutively purchased items were taken from each customer's transactions to obtain an antecedent-consequent pair, which constitutes a link in the social network; the direction of each link was determined by the order in which the goods were purchased. Excluding the last ten days of the data collection period, the social network of associated items was built for extracting the independent variables, and the model predicts the number of links to be connected in the following ten days. Of the 5,711 previously unconnected links, 611 were newly connected during those last ten days. In experiments, the proposed model demonstrated excellent predictive performance: of the 571 links the model predicted, 269 were confirmed to have been connected. This is 4.4 times more than the 61 that would be expected on average without any prediction model. This study is expected to be useful in industries that launch new products quickly with short life cycles, where exposure time is critical. It could also help detect diseases that are rarely found in the early stages of medical treatment because of their low incidence.
Since the complexity of social network analysis is sensitive to the number of nodes and links in the network, this study was restricted to a single category of miscellaneous goods. Future research should consider that this condition may limit the opportunity to detect unexpected associations between products belonging to different categories.
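
The path features described above (shortest-path length and number of shortest paths between an unconnected item pair) can be sketched with a breadth-first search that counts shortest paths, as in betweenness-centrality algorithms. The toy co-purchase graph and the exact feature set here are illustrative, not the paper's:

```python
from collections import deque

def path_features(adj, u, v):
    """Features for an unconnected pair (u, v): shortest-path length,
    number of distinct shortest paths, and count of common neighbors.
    `adj` maps each item to the list of items linked to it."""
    dist, npaths = {u: 0}, {u: 1}
    q = deque([u])
    while q:
        x = q.popleft()
        for y in adj.get(x, ()):
            if y not in dist:                 # first time reached
                dist[y] = dist[x] + 1
                npaths[y] = npaths[x]
                q.append(y)
            elif dist[y] == dist[x] + 1:      # another shortest path
                npaths[y] += npaths[x]
    common = len(set(adj.get(u, ())) & set(adj.get(v, ())))
    return dist.get(v), npaths.get(v, 0), common

# Toy co-purchase graph: A and D are not directly connected
adj = {
    "A": ["B", "C"], "B": ["A", "D"],
    "C": ["A", "D"], "D": ["B", "C"],
}
features = path_features(adj, "A", "D")  # (length 2, 2 paths, 2 common)
```

In the paper, features like these become the regressors that predict how strongly two currently unlinked products will connect over the next ten days.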

Efficient Topic Modeling by Mapping Global and Local Topics (전역 토픽의 지역 매핑을 통한 효율적 토픽 모델링 방안)

  • Choi, Hochang;Kim, Namgyu
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.3
    • /
    • pp.69-94
    • /
    • 2017
  • Recently, growing demand for big data analysis has been driving vigorous development of related technologies and tools. In addition, the development of IT and the increased penetration of smart devices are producing large amounts of data, data analysis technology is rapidly becoming popular, and attempts to gain insights through data analysis continue to increase. Big data analysis will therefore become more important across industries for the foreseeable future. Big data analysis is generally performed by a small number of experts and delivered to each analysis requester, but growing interest is stimulating computer programming education and the development of many analysis programs. Accordingly, the entry barriers to big data analysis are gradually lowering and the technology is spreading, so analysis is increasingly expected to be performed by the requesters themselves. Along with this, interest in various kinds of unstructured data, especially text data, is continually increasing. The emergence of new web-based platforms and techniques is producing text data en masse and prompting active attempts to analyze it, and the results of text analysis are utilized in many fields. Text mining embraces various theories and techniques for text analysis; among them, topic modeling is one of the most widely used and studied. Topic modeling extracts the major issues from a large set of documents, identifies the documents corresponding to each issue, and provides the identified documents as clusters. It is evaluated as very useful in that it reflects the semantic elements of documents.
Traditional topic modeling is based on the distribution of key terms across the entire document collection, so it is essential to analyze the whole collection at once to identify the topic of each document. This leads to long processing times when topic modeling is applied to many documents, and to a scalability problem: an exponential increase in processing time with the number of analysis objects. The problem is particularly noticeable when documents are distributed across multiple systems or regions. To overcome it, a divide-and-conquer approach can be applied to topic modeling: a large document collection is divided into sub-units, and topics are derived by repeating topic modeling on each unit. This method enables topic modeling over a large number of documents with limited system resources and improves processing speed. It can also significantly reduce analysis time and cost, since documents can be analyzed in each location without first being combined. Despite these advantages, the method has two major problems. First, the relationship between the local topics derived from each unit and the global topics derived from the entire collection is unclear: local topics can be identified in each unit, but global topics cannot. Second, a method for measuring the accuracy of such an approach needs to be established; that is, taking the global topics as the ideal answer, the deviation of the local topics from the global topics must be measured. Because of these difficulties, this approach has not been studied as thoroughly as other topic modeling methods. In this paper, we propose a topic modeling approach that addresses both problems.
First, we divide the entire document cluster (global set) into sub-clusters (local sets) and generate a reduced global set (RGS) consisting of delegate documents extracted from each local set. We address the first problem by mapping RGS topics to local topics. We then verify the accuracy of the proposed methodology by checking whether each document is assigned to the same topic in the global and local results. Using 24,000 news articles, we conducted experiments to evaluate the practical applicability of the proposed methodology. Through an additional experiment, we confirmed that it can provide results similar to topic modeling over the entire collection, and we also proposed a reasonable method for comparing the results of the two approaches.
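
The local-to-global mapping can be sketched by matching topic-word vectors with cosine similarity; this is a simplified stand-in for the paper's RGS-based mapping, using tiny made-up topic-word distributions over a 4-word vocabulary:

```python
import numpy as np

def map_local_to_global(local_topics, global_topics):
    """Assign each local topic to the most similar global topic by
    cosine similarity of their topic-word vectors."""
    def unit(m):
        m = np.asarray(m, dtype=float)
        return m / np.linalg.norm(m, axis=1, keepdims=True)
    sims = unit(local_topics) @ unit(global_topics).T  # pairwise cosines
    return sims.argmax(axis=1)                         # best global topic

# Toy topic-word distributions over a 4-word vocabulary
global_topics = [[0.70, 0.20, 0.05, 0.05],   # global topic 0
                 [0.05, 0.05, 0.20, 0.70]]   # global topic 1
local_topics  = [[0.60, 0.30, 0.05, 0.05],   # resembles global 0
                 [0.10, 0.00, 0.30, 0.60]]   # resembles global 1
mapping = map_local_to_global(local_topics, global_topics)
```

Once each local topic has a global counterpart, one can check, document by document, whether the local assignment agrees with the global one, which is the accuracy check the paper performs.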

A Study of the Reactive Movement Synchronization for Analysis of Group Flow (그룹 몰입도 판단을 위한 움직임 동기화 연구)

  • Ryu, Joon Mo;Park, Seung-Bo;Kim, Jae Kyeong
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.1
    • /
    • pp.79-94
    • /
    • 2013
  • Recently, high value-added business has been growing steadily in the culture and art sector. To generate high value from a performance, audience satisfaction is necessary. Flow is a critical factor in satisfaction, and it must be induced in the audience and measured. To evaluate the audience's interest in and emotional response to content, producers and investors need some index for measuring flow. But it is neither easy to define flow quantitatively nor to collect the audience's reactions immediately. Previous studies evaluated group flow as the sum of the average value of each person's reactions: the flow, or 'good feeling', of each audience member was extracted from the face, especially changes of expression, and from body movement. But it was not easy to handle the large amount of real-time data from the individual sensor signals, and the experimental setup was difficult in economic and environmental terms, because every participant needed a personal sensor to record physical signals and a camera placed in front of the head to capture facial expressions. A simpler system for analyzing group flow is therefore needed. This study provides a method for measuring audience flow through group synchronization at the same time and place. To measure synchronization, we built a real-time processing system using differential images and developed a Group Emotion Analysis (GEA) program for the flow judgment model. A differential image is obtained from the camera by subtracting the previous frame from the present frame, yielding the movement variation of the audience's reaction. After measuring the audience's reactions, synchronization is divided into dynamic-state synchronization and static-state synchronization.
Dynamic-state synchronization accompanies the audience's active reactions, while static-state synchronization corresponds to stillness. Dynamic-state synchronization can be caused by surprised reactions to scary, creepy, or plot-reversal scenes, while static-state synchronization is triggered by touching or sad scenes. We therefore showed participants several short movies containing such scenes, designed to make them sad, clap, feel creeped out, and so on. To classify audience movement, we defined two critical points, ${\alpha}$ and ${\beta}$: dynamic-state synchronization was meaningful when the movement value exceeded ${\beta}$, while static-state synchronization was effective below ${\alpha}$. ${\beta}$ was derived from the clapping movements of 10 teams instead of from the average amount of movement. After measuring the reactive movement, the percentage ratio of 'people having a reaction' to 'total people' was calculated. A total of 37 teams took part in the experiments at the 2012 Seoul DMC Culture Open. First, they were induced to clap by staff; second, a baseline scene was shown to neutralize the audience's emotion; third, a flow scene was displayed; fourth, a reversal scene was introduced. Then 24 of the teams were shown amusing and creepy scenes, and the other 10 teams were shown a sad scene. On the amusing scenes the audience clapped and laughed, shook their heads, or hid by closing their eyes, while the sad or touching scenes made them silent. If the result exceeded about 80%, the group could be judged to have achieved synchronization and flow. As a result, the audience showed similar reactions to similar stimulation at the same time and place.
With additional normalization and experiments, the flow factor can be found through synchronization in much bigger groups, which should be useful for planning content.
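
The differential-image measurement and the ${\alpha}$/${\beta}$ classification described above can be sketched as follows; the threshold values and frames here are synthetic, whereas the real system processes live camera frames of the audience:

```python
import numpy as np

def movement(prev_frame, frame):
    """Mean absolute per-pixel difference between consecutive frames,
    i.e. the differential image reduced to a single movement value."""
    return float(np.abs(frame.astype(int) - prev_frame.astype(int)).mean())

def classify(value, alpha, beta):
    """Static-state synchronization below alpha, dynamic-state above
    beta (alpha < beta are calibration thresholds, as in the paper)."""
    if value <= alpha:
        return "static"
    if value >= beta:
        return "dynamic"
    return "none"

prev = np.zeros((4, 4), dtype=np.uint8)          # previous frame
calm = prev.copy()                               # no movement at all
active = np.full((4, 4), 200, dtype=np.uint8)    # large change everywhere
state = classify(movement(prev, active), alpha=1.0, beta=50.0)  # "dynamic"
```

Computing this per audience member (or per image region) and taking the share of people above/below the thresholds gives the reaction percentage the paper compares against its roughly 80% criterion.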

Documentation of Intangible Cultural Heritage Using Motion Capture Technology Focusing on the documentation of Seungmu, Salpuri and Taepyeongmu (부록 3. 모션캡쳐를 이용한 무형문화재의 기록작성 - 국가지정 중요무형문화재 승무·살풀이·태평무를 중심으로 -)

  • Park, Weonmo;Go, Jungil;Kim, Yongsuk
    • Korean Journal of Heritage: History & Science
    • /
    • v.39
    • /
    • pp.351-378
    • /
    • 2006
  • With the development of media, the methods for documenting intangible cultural heritage have also developed and diversified. In addition to the previous analogue methods of documentation, new multimedia technologies centered on digital photographs, sound recordings, and video have recently been applied. Among these new technologies, documentation using motion capture has proved especially valuable in fields that require three-dimensional documentation, such as dance and performance. Motion capture refers to a documentation technology that records signals of the time-varying positions derived from sensors attached to the surface of an object. It converts the sensor signals into digital data that can be plotted as points in the virtual coordinates of a computer and records the movement of those points over time as the object moves. It produces scientific data for the preservation of intangible cultural heritage by representing, as digital data, the virtual motion of a holder of an intangible cultural heritage. The National Research Institute of Cultural Properties (NRICP) has been working on the development of a new documentation method for the Important Intangible Cultural Heritage designated by the Korean government, using motion capture equipment of the kind widely used for computer graphics in the movie and game industries. The project, supported by lottery funds, is designed to apply motion capture technology over three years, from 2005 to 2007, to 11 performances from 7 traditional dances whose body gestures have considerable value among the Important Intangible Cultural Heritage performances.
In 2005, the first year of the project, data were accumulated for solo dances that are relatively easy in terms of performing skills: Seungmu (monk's dance), Salpuri (a solo dance for spiritual cleansing), and Taepyeongmu (dance of peace). In 2006, group dances such as Jinju Geommu (Jinju sword dance), Seungjeonmu (dance for victory), and Cheoyongmu (dance of Lord Cheoyong) will be documented. In the final year, 2007, in addition to the documentation of Hakyeonhwadae Habseolmu (crane dance combined with the lotus blossom dance), education programmes for comparative study, analysis, and transmission of intangible cultural heritage, as well as three-dimensional contents for public service, will be devised based on the accumulated data. By describing the processes and results of the motion capture documentation of Salpuri (Lee Mae-bang), Taepyeongmu (Kang Seon-young), and Seungmu (Lee Mae-bang, Lee Ae-ju, and Jung Jae-man) conducted in 2005, this report introduces a new approach to the documentation of intangible cultural heritage. During the first year of the project, two questions were raised. First, how can the motions of a holder (dancer) be captured without interruption during a long performance? After many tests, the motion capture system proved stable and produced continuous results. Second, how can the motion be reproduced accurately without a re-targeting process? For the first time in Korea, the project applied body scanning to obtain digital data of each dancer's body shape before the motion capture process, re-creating the dancers' gestures with high accuracy. The accurate three-dimensional body models of the four holders obtained by body scanning enhanced the accuracy of the motion capture of the dances.