• Title/Summary/Keyword: 시각화실험 (visualization experiment)


Analysis of Enhancement Effect and Attachment Ability of Beneficial Intestinal Microflora in Puffed Grain Foods Using Confocal Laser Scanning Microscopy (곡물 소재 팽화식품에서 장내 유익균의 증진 효과 분석 및 공초점 현미경을 이용한 부착능 평가)

  • Jeong, Myeong-Kyo; Oh, Do-Geon; Kwon, Oh-Sung; Jeong, Jun-Young; Lee, Ym-Shik; Kim, Kwang-Yup
    • Journal of the Korean Society of Food Science and Nutrition / v.46 no.9 / pp.1071-1080 / 2017
  • This study examined the adhesiveness of beneficial intestinal bacteria to whole grains using confocal laser scanning microscopy (CLSM), to demonstrate the prebiotic effects of whole grains and to develop prebiotic puffed snacks from them. CLSM with optical sectioning was used to observe the adhesion of Lactobacillus acidophilus, a beneficial intestinal bacterium, to whole-grain powders. The growth-enhancing effects of hot-water grain extracts on beneficial intestinal bacteria were verified using an indirect count method. Finally, a puffed snack with the prebiotic effect was produced, and its quality was evaluated by measuring chromaticity and hardness. As a result, L. acidophilus adhered to the whole-grain powders, and the growth of the selected beneficial intestinal bacteria was significantly improved. The Hunter L value of the developed puffed snack increased when seasoning was added, and the hardness of the seasoned snack was higher than that of the control. A sensory evaluation showed that the seasoned puffed snack was rated higher in overall preference than the control.

Dual Codec Based Joint Bit Rate Control Scheme for Terrestrial Stereoscopic 3DTV Broadcast (지상파 스테레오스코픽 3DTV 방송을 위한 이종 부호화기 기반 합동 비트율 제어 연구)

  • Chang, Yong-Jun; Kim, Mun-Churl
    • Journal of Broadcast Engineering / v.16 no.2 / pp.216-225 / 2011
  • Following the proliferation of three-dimensional video content and displays, many terrestrial broadcasters have been preparing stereoscopic 3DTV services. In terrestrial stereoscopic broadcasting, it is difficult to encode and transmit two video sequences while sustaining quality as high as that of 2DTV broadcasts, because of the limited bandwidth defined by existing digital TV standards such as ATSC. A terrestrial 3DTV system with a heterogeneous video codec pair, in which the left and right images are coded with MPEG-2 and H.264/AVC, respectively, is therefore considered in order to achieve both high-quality service and compatibility for existing 2DTV viewers. Without significant changes to current terrestrial broadcasting systems, we propose a joint rate control scheme for stereoscopic 3DTV service based on this heterogeneous dual-codec system. The proposed scheme applies to the MPEG-2 encoder the quadratic rate-quantization model adopted in H.264/AVC. The controller is then designed so that the sum of the left and right bitstreams meets the bandwidth requirement of the broadcasting standard while the sum of the image distortions is minimized by adjusting the quantization parameters obtained from the proposed optimization. In addition, the optimization includes a constraint that keeps the quality difference between the left and right images near a desired level, in order to mitigate negative effects on the human visual system. Experimental results demonstrate that the proposed scheme outperforms rate control in which each codec runs its own algorithm independently, increasing PSNR by 2.02%, decreasing the average absolute quality difference by 77.6%, and reducing the variance of the quality difference by 74.38%.
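As a rough illustration of the optimization just described, the sketch below pairs a quadratic rate-quantization (R-Q) model with a brute-force search over QP pairs that respects a shared bit budget and a bounded left/right quality difference. The parameter values, the distortion proxy, and the tolerance are assumptions for illustration, not the authors' calibrated models.

```python
# Minimal sketch: joint QP selection under a quadratic R-Q model.
# All numeric parameters below are illustrative assumptions.

def quad_rate(qp, a, b):
    """Quadratic R-Q model: R(QP) = a/QP + b/QP^2 (bits per frame)."""
    return a / qp + b / (qp * qp)

def distortion(qp):
    """Toy distortion proxy: MSE grows roughly with QP^2."""
    return float(qp * qp)

def joint_qp_search(budget, left, right, tol=0.1, qp_range=range(2, 52)):
    """Pick (QP_left, QP_right) minimizing total distortion subject to a
    combined bit budget and a bounded left/right quality difference."""
    best = None
    for ql in qp_range:
        for qr in qp_range:
            bits = quad_rate(ql, *left) + quad_rate(qr, *right)
            dl, dr = distortion(ql), distortion(qr)
            if bits <= budget and abs(dl - dr) <= tol * max(dl, dr):
                if best is None or dl + dr < best[0]:
                    best = (dl + dr, ql, qr)
    return best

# Example: per-frame budget for a 19.4 Mbps ATSC-like channel at 30 fps.
print(joint_qp_search(budget=19.4e6 / 30,
                      left=(2.0e6, 5.0e6), right=(1.5e6, 4.0e6)))
```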

Assessment of the Synovial Inflammation in Rheumatoid Arthritis with 99mTc-labelled Polyclonal Human IgG(HIG): Prospective Comparison with Gadolinium Enhanced MRI (99mTc-labelled HIG 스캔을 이용한 류마티스 관절염 환자에서 활막염증의 평가 : 조영증강 자기공명영상과의 전향적인 비교)

  • Ryu, Young-Hoon; Lee, Jong-Doo; Suh, Jin-Suck; Park, Chang-Yun; Jeon, Pyoung; Na, Jae-Beom; Lee, Soo-Kon
    • The Korean Journal of Nuclear Medicine / v.29 no.1 / pp.84-91 / 1995
  • Many clinical and laboratory tests have been employed to evaluate disease activity in rheumatoid arthritis. 99mTc-labelled polyclonal human IgG (HIG) has been shown to accumulate at focal sites of infection or inflammation in both animals and humans. The purpose of this study was to distinguish arthritis with active inflammation from arthritis without it, and to correlate the relative intensity of 99mTc-labelled HIG uptake in rheumatoid arthritis with clinical and MR indices of joint inflammation. The study included twelve patients with active rheumatoid arthritis, two with ankylosing spondylitis, and one with degenerative osteoarthritis without active inflammation. Whole-body and spot images were obtained 4 hours after intravenous injection of 20 mCi of 99mTc-labelled HIG. Scintigrams were assessed visually by three experienced radiologists and graded as normal, mildly increased, or markedly increased uptake within the joints, and the degree of uptake was compared with the clinical and radiologic severity of synovial inflammation. MRI was performed on the involved joints: wrist (n=11), knee (n=2), and hip (n=2). Active synovitis was defined as marked elevation of ESR together with gadolinium enhancement of the synovium on MRI. Markedly increased radiotracer uptake was seen in 10 of 11 rheumatoid arthritis patients with active synovitis, whereas normal or mildly increased uptake was noted in the others, including one rheumatoid arthritis patient and three non-rheumatoid patients without active synovitis. This study showed that the involved joints in rheumatoid arthritis can be localized with 99mTc-labelled HIG and that the degree of uptake correlates well with the degree and activity of inflammation. In conclusion, the 99mTc-labelled HIG scan is a useful method for evaluating active inflammation in rheumatoid arthritis.


Design Strategies for Regionality in Contemporary Landscape Architecture (현대 조경 설계에서 지역성 구현 전략)

  • Choi, Jung-Mean
    • Journal of the Korean Institute of Landscape Architecture / v.44 no.6 / pp.98-106 / 2016
  • This paper reexamines the meaning of regionality in its current international context and seeks practical design strategies by observing trends in contemporary landscape architecture from the perspective of regionality. Contemporary landscape architecture has begun to find possibility in local values and to create identity from them. This tendency can be classified as follows. First, regionality is re-examined as a medium that can integrate nature, culture, and the city; as a concept containing temporal and spatial continuity, landscape is a matter of the identity of land and region. Second, regionality is reinterpreted and recreated by designers. Rather than erasing existing physical features and imposing a new concept, landscape designers attempt to restore past memories and traces; this design attitude spatializes temporal continuity. Third, in contemporary landscape architecture the site is seen as a palimpsest, not a tabula rasa, and designers attempt to make natural and ecological processes and spatial features visually concrete. Fourth, the site is approached tectonically rather than analytically, organizing and restoring geological and archeological memories and ecological processes. Differentiation has emerged as a critical design strategy in contemporary landscape architecture; however, regionality is also formed through an interaction between continuity and differentiation. In this sense, the following possibilities can be reviewed as practical design strategies for realizing regionality. First, a terra-tectonic approach discovers and selects possibilities in the site and expresses them, creating practical means of strengthening regionality: if the memory and conditions of the site differ, the resulting identity will differ as well. Second, the continuity of a region is itself a gene pool with comparative advantage; as a rough sketch for design, it acts as a loose constraint on designers' experience and practice. Of course, this approach is not absolute and has limitations, and further practical strategies need to be explored.

A Method for Evaluating News Value based on Supply and Demand of Information Using Text Analysis (텍스트 분석을 활용한 정보의 수요 공급 기반 뉴스 가치 평가 방안)

  • Lee, Donghoon; Choi, Hochang; Kim, Namgyu
    • Journal of Intelligence and Information Systems / v.22 no.4 / pp.45-67 / 2016
  • Given the recent development of smart devices, users are producing, sharing, and acquiring a variety of information via the Internet and social network services (SNSs). Users tend to use multiple media simultaneously according to their goals and preferences; domestic SNS users engage with about 2.09 media concurrently on average. Since the information provided by such media is usually textually represented, recent studies have been actively conducting textual analysis in order to understand users more deeply. Earlier studies using textual analysis focused on analyzing a document's contents without substantive consideration of the diverse characteristics of the source medium. However, current studies argue that analytical and interpretive approaches should be applied differently according to the characteristics of a document's source. Documents can be classified into the following types: informative documents for delivering information, expressive documents for expressing emotions and aesthetics, operational documents for inducing the recipient's behavior, and audiovisual media documents for supplementing the above three functions through images and music. Further, documents can be classified according to their contents, which comprise facts, concepts, procedures, principles, rules, stories, opinions, and descriptions. Documents have unique characteristics according to the source media by which they are distributed. In the case of newspapers, only highly trained people tend to write articles for public dissemination. In contrast, with SNSs, various types of users can freely write any message, and such messages are distributed in unpredictable ways. Again, in the case of newspapers, each article exists independently and tends to have no relation to other articles, whereas messages (original tweets) on Twitter, for example, are highly interrelated and are regularly duplicated and repeated through replies and retweets. There have been many studies on the differing characteristics of newspapers and SNSs, but it is difficult to find a study that examines the difference between the two media from the perspective of supply and demand. We can regard newspaper articles as a kind of information supply, whereas messages on various SNSs represent a demand for information. By investigating traditional newspapers and SNSs from the perspective of information supply and demand, we can explore and explain the information dilemma more clearly. For example, some superfluous issues may be heavily reported in newspaper articles even though users have little interest in them. Such overproduced information is not only a waste of media resources but also makes it difficult to find valuable, in-demand information. Conversely, some issues covered by only a few newspapers may be of high interest to SNS users. To alleviate the deleterious effects of such information asymmetries, it is necessary to analyze the supply and demand of each information source and, accordingly, provide information flexibly. Such an approach would allow the value of information to be explored and approximated on the basis of the supply-demand balance; conceptually, this is very similar to the price of goods or services being determined by the supply-demand relationship. Adopting this concept, media companies could focus on producing highly in-demand issues that are in short supply.
In this study, we selected Internet news sites and Twitter as representative media of information supply and demand, respectively. We present the News Value Index (NVI), which evaluates the value of a news item in terms of the magnitude of the Twitter messages associated with it, and we visualize the change of information value over time using the NVI. We conducted an analysis using 387,014 news articles and 31,674,795 Twitter messages. The results revealed an interesting pattern: most issues show a lower NVI than the average across all issues, whereas a few issues show a steadily higher NVI than the average.
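The abstract does not give the exact NVI formula, so the sketch below illustrates one plausible reading of the supply-demand idea: an issue's value as the ratio of its demand share (Twitter messages) to its supply share (news articles). The function name and the ratio itself are assumptions, not the authors' definition.

```python
# Minimal sketch of a supply-demand news value score in the spirit of the
# NVI described above. The ratio-of-shares formula is an assumption.
from collections import Counter

def news_value_index(article_issues, tweet_issues):
    """Return {issue: demand_share / supply_share}; values > 1 suggest
    under-supplied (high-value) issues, values < 1 over-supplied ones."""
    supply = Counter(article_issues)
    demand = Counter(tweet_issues)
    n_articles, n_tweets = len(article_issues), len(tweet_issues)
    nvi = {}
    for issue, s in supply.items():
        d = demand.get(issue, 0)
        nvi[issue] = (d / n_tweets) / (s / n_articles) if n_tweets else 0.0
    return nvi

articles = ["election", "election", "election", "festival"]
tweets = ["election", "festival", "festival", "festival", "festival"]
print(news_value_index(articles, tweets))
# election: (1/5)/(3/4) ~ 0.27 (over-supplied); festival: (4/5)/(1/4) = 3.2
```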

Could a Product with Diverged Reviews Ratings Be Better?: The Change of Consumer Attitude Depending on the Converged vs. Diverged Review Ratings and Consumer's Regulatory Focus (평점이 수렴되지 않는 리뷰의 제품들이 더 좋을 수도 있을까?: 제품 리뷰평점의 분산과 소비자의 조절초점 성향에 따른 소비자 태도 변화)

  • Yi, Eunju; Park, Do-Hyung
    • Knowledge Management Research / v.22 no.3 / pp.273-293 / 2021
  • Due to the COVID-19 pandemic, the e-commerce market has grown rapidly. The pandemic, which made contact-free communication part of everyday culture, opened the e-commerce market even to consumers who had hesitated to purchase and pay by electronic device without any personal contact or without seeing and touching the real products. Consumers who have experienced the easy access and convenience of online purchasing will continue to take advantage of them even after the pandemic. During this transformation, however, the information available to consumers has shrunk to a flat screen and become limited to the visual. To provide differentiated and competitive product information, companies are adopting AR/VR and streaming technologies, but reviews from honest users should be recognized as equally important: they are regarded as being as strong as the well-refined product information provided by the company's marketing professionals, and companies may obtain useful insights from them for product development, marketing, and sales strategies. From the consumer's point of view, then, how do consumers process review information before purchase when review ratings diverge widely? Are non-converged ratings always unreliable and worthless? In this study, we analyzed how a consumer's regulatory focus moderates the attitude formed from such diverged information. The experiment was designed as a 2x2 factorial study examining how the variance of product review ratings (high vs. low) for cosmetics affects product attitudes according to the consumer's regulatory focus (prevention focus vs. promotion focus). The study found that prevention-focused consumers showed a higher product attitude when the review variance was low, whereas promotion-focused consumers showed a higher product attitude when the review variance was high. This thesis thus explains that, even for products with exactly the same average rating, converged and diverged reviews can be interpreted differently depending on the customer's regulatory focus. Theoretically, the paper contributes by elucidating the mechanism by which consumers process non-converged information. In practice, as reviews and sales records accumulate, companies may, as one application of knowledge management with big data, develop a reinforced customer experience by providing personalized and optimized product and review information.
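To make the 2x2 design concrete, the sketch below shows how the four cell means could be tabulated to reveal the reported crossover interaction. The data values are fabricated placeholders purely for illustration, not the study's measurements.

```python
# Minimal sketch: inspecting 2x2 (review variance x regulatory focus)
# cell means. The attitude scores below are illustrative placeholders.
import pandas as pd

df = pd.DataFrame({
    "variance": ["low", "low", "high", "high"] * 2,
    "focus": ["prevention"] * 4 + ["promotion"] * 4,
    "attitude": [5.8, 5.6, 3.9, 4.1,   # prevention: prefers converged ratings
                 4.0, 4.2, 5.7, 5.9],  # promotion: prefers diverged ratings
})

# A crossover pattern (each focus group peaks in a different variance
# condition) is the signature of the reported interaction effect.
print(df.groupby(["focus", "variance"])["attitude"].mean().unstack())
```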

Simulation and Post-representation: a study of Algorithmic Art (시뮬라시옹과 포스트-재현 - 알고리즘 아트를 중심으로)

  • Lee, Soojin
    • 기호학연구 (Semiotic Inquiry) / no.56 / pp.45-70 / 2018
  • The postmodern critique of the system of representation, which had continued since the Renaissance, is based on a critique of the dichotomy that separates subject from object and the environment from the human being. Interactivity, highlighted in a series of works emerging as postmodern trends in the 1960s, carried over into the interactive aspect of digital art in the late 1990s. The key feature of digital art is the possibility of infinite variation reflecting unpredictable changes based on public participation on the spot. In this process, the importance of computer programs is highlighted: instead of using existing programs as they are, more and more artists are writing their own algorithms or creating unique algorithms in collaboration with programmers. We live in an era of paradigm shift in which programming itself must be considered a creative act. Simulation and VR technologies draw attention as techniques for representing the meaning of reality, and simulation technology helps artists create experimental works. In fact, Baudrillard's concept of simulation designates another reality that has nothing to do with our reality, rather than a reality that faithfully represents ours. His book Simulacra and Simulation posits the existence of a reality entirely different from the traditional concept of reality; his argument does not concern problems of right and wrong, and it carries no metaphysical meaning. Applying the concept of simulation to algorithmic art, the artist models the complex attributes of reality in a digital system and aims to build and integrate the internal laws that structure and activate a world (specific or individual), that is to say, to simulate the world. If the images of the traditional order correspond to the reproduction of the real world, the synthesized images and simulated space-time of algorithmic art are forms of art that facilitate experience. The moment of seeing and listening to the work of Ian Cheng presented in this article is a moment of personal experience, and perception occurs at that time; it is not a complete and closed process but a continuous and changing one. It is this active and situational awareness that is required of the audience to comprehend post-representational forms.

Effects of Vibration Stimulation Therapy on Neglect of Stroke Patients (진동감각 자극치료가 뇌졸중 환자의 편측무시에 미치는 영향)

  • Jeon, So-Hyun
    • Journal of Society of Occupational Therapy for the Aged and Dementia / v.12 no.2 / pp.87-95 / 2018
  • Objective : Vibration stimulators are easier to obtain in clinical settings than other treatment tools, and they have the advantage that arm-activation training can be performed passively. Despite these advantages, recent research on vibration sense has been scarce. The purpose of this study was to investigate the effect of vibration stimulation applied to the hand of the affected upper limb on the unilateral neglect of stroke patients. Method : Patients with unilateral neglect due to stroke were enrolled for about 3 weeks, from October 19, 2018 to November 7, 2018. The research used an ABA single-subject experimental design, and sessions were performed once a day on weekdays: 4 baseline, 6 intervention, and 3 return-to-baseline. The MMSE-K was used to select the subjects. The line bisection test (LBT), Albert's test, and the Star Cancellation Test (SCT) were used as unilateral neglect tests. For the analysis, baseline and intervention measurements were compared visually using graphs and phase means. Result : All three evaluations showed that the number of omission errors during the intervention period was lower than in the baseline period, and this decrease was maintained after training. In the line bisection test, the mean number of omissions decreased from 4.5±1 to 2±1.2 and then to 0.6±0.5. In Albert's test, the mean error decreased from 22.5±1.9 to 8±7.3 and then to 0.3±0.5 omissions. In the Star Cancellation Test, the mean error decreased from 26±4.6 to 21.8±1.7 and then to 20±0. Conclusion : In this study, vibrotactile stimulation therapy showed a sustained effect on improving unilateral neglect. Based on these findings, further research should be conducted to improve objectivity, along with research on various intervention methods that maximize the effect of the intervention.
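As an illustration of the phase-mean comparison used in ABA single-subject analysis, the sketch below groups per-session omission errors by phase and reports the phase means. The session counts follow the abstract; the error values themselves are placeholders, not the study's data.

```python
# Minimal sketch of ABA phase-mean analysis. Error counts are placeholders.
import statistics

sessions = (
    [("A1", e) for e in [5, 4, 4, 5]] +          # 4 baseline sessions
    [("B",  e) for e in [4, 3, 2, 2, 1, 1]] +    # 6 intervention sessions
    [("A2", e) for e in [1, 1, 0]]               # 3 return-to-baseline sessions
)

for phase in ("A1", "B", "A2"):
    errors = [e for p, e in sessions if p == phase]
    print(phase, "mean omissions:", round(statistics.mean(errors), 2))
```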

The Narrative Structure of Terayama Shūji's Sekkyōbushi Misemono Opera Shintokumaru (데라야마 슈지(寺山修司)의 '셋교부시(說敎節)에 의한 미세모노(見せ物)오페라' <신토쿠마루(身毒丸)>의 서사 구조)

  • Kang, Choon-ae
    • (The) Research of the performance art and culture / no.32 / pp.489-524 / 2016
  • This study examines the birth of a genre, the Sekkyōbushi Misemono Opera, focusing on how it adopted and modernized katarimono Sekkyōbushi. Unlike earlier studies, it argues that Terayama was clearly different from other first-generation Angura artists in that he revived the medieval story Sekkyōbushi as a modern misemono opera. Shintokumaru (1978) was directed by Terayama Shūji, a member of the first generation of Japan's 1960s Angura Theatre Movement. It takes as its subject the katarimono Sekkyōbushi Shintokumaru, a story set to music that can be considered part of the modern heritage of East Asian storytelling. Sekkyō Shintokumaru is set in Tennoji, Japan. The title character Shintoku develops leprosy as a result of his stepmother's curse and is saved through his fiancee Otohime's devoted love and the spiritual power of the Bodhisattva Avalokitesvara. In this work, Terayama combined the narrative style of Sekkyōbushi with J.A. Caesar's shamanistic rock music and gave it the subtitle 'Misemono Opera by Sekkyōbushi'. He transforms its underlying theme, the medieval religious principle of goddesses and their offspring and the modori (return) instinct, into a world of mother-son incest. The pedestrian revenge scene of Sekkyōbushi is also altered: Shintokumaru appears as a drag queen wearing his stepmother's clothes and mask, unites sexually with his stepbrother Sensaku, and ends up killing him. The play follows the cause-and-effect structure of Sekkyōbushi. The appearance of the katarite, a storyteller who propels the narrative throughout, and of Dr. Yanagida Kunio is significant as a modern use of self-introduction as a narrative device and chorus. Terayama Shūji's memories of a desperate childhood, especially the absence of his father and the Aomori air raids, are depicted and deepened in the structure. However, seventeen years after Terayama's death, the version of the play directed by Ninagawa Yukio, based on a revised edition by Kishida Rio (Terayama's writing partner since the play's premiere), is today the better-known version. All the theatrical elements implied by Terayama's subtitle were removed, and as a result the Rio production misses the essence of the diverse experimental theatre of Terayama's company, Tenjō Sajiki. Shintokumaru has a narrative structure characteristic of aphorism: each part of the story can stand alone, yet all the parts can be combined organically.

Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan; An, Sangjun; Kim, Mintae; Kim, Wooju
    • Journal of Intelligence and Information Systems / v.26 no.4 / pp.127-148 / 2020
  • A data center is a physical facility for accommodating computer systems and related components, and it is an essential foundation for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, proportional expansion of data center infrastructure is inevitable. Monitoring the health of data center facilities is a way to maintain and manage the system and prevent failure. If a failure occurs in some element of the facility, it may affect not only the relevant equipment but also other connected equipment, and it may cause enormous damage. Failures of IT facilities in particular are irregular because of interdependence, and their causes are difficult to identify. Previous studies on failure prediction in data centers predicted failures by treating each server as a single, independent state, without assuming that devices interact. In this study, therefore, data center failures were classified into failures occurring inside a server (Outage A) and failures occurring outside a server (Outage B), and the analysis focused on complex failures occurring within servers. Server-external failures include power, cooling, and user errors; since such failures can be prevented in the early stages of data center construction, various solutions are already being developed. The causes of failures occurring inside a server, on the other hand, are difficult to determine, and adequate prevention has not yet been achieved. In particular, server failures rarely occur singly: a failure may trigger failures on other servers or be triggered by failures originating elsewhere. In other words, whereas existing studies analyzed failures under the assumption that a single server does not affect other servers, this study assumes that failures have effects between servers. To define the complex failure situation in the data center, failure history data for each piece of equipment in the data center were used. Four major failures are considered in this study: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures occurring on each device are sorted in chronological order, and when a failure occurs on one device, any failure occurring on another device within 5 minutes is defined as occurring simultaneously. After constructing sequences of devices that failed at the same time, the 5 devices that most frequently failed simultaneously within those sequences were selected, and the cases in which the selected devices failed at the same time were confirmed through visualization. Since the server resource information collected for failure analysis is time-series data with temporal flow, we used Long Short-Term Memory (LSTM), a deep learning algorithm that can predict the next state from previous states. In addition, unlike the single-server case, the Hierarchical Attention Network model structure was used, taking into account that each server's contribution to a complex failure differs; this algorithm increases prediction accuracy by assigning greater weight to servers with greater impact on the failure. The study began by defining the failure types and selecting the analysis targets.
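The 5-minute simultaneity rule described above can be sketched as a simple grouping pass over a chronologically sorted failure log. The field names and event format below are assumptions for illustration, not the paper's data schema.

```python
# Minimal sketch of the simultaneity rule: failures on different devices
# within 5 minutes of an event's first failure form one complex-failure event.
from datetime import datetime, timedelta

def group_simultaneous(failures, window=timedelta(minutes=5)):
    """failures: list of (timestamp, device_id), unsorted.
    Returns lists of devices whose failures fall within `window`
    of the event's first failure."""
    failures = sorted(failures)
    events, current, start = [], [], None
    for ts, dev in failures:
        if start is None or ts - start > window:
            if current:
                events.append(current)
            current, start = [], ts
        current.append(dev)
    if current:
        events.append(current)
    return events

log = [(datetime(2020, 1, 1, 9, 0), "server-3"),
       (datetime(2020, 1, 1, 9, 2), "db-1"),
       (datetime(2020, 1, 1, 9, 4), "net-7"),
       (datetime(2020, 1, 1, 13, 0), "server-5")]
print(group_simultaneous(log))  # [['server-3', 'db-1', 'net-7'], ['server-5']]
```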
In the first experiment, the same collected data were modeled both as single-server states and as multiple-server states, and the results were compared. The second experiment improved the prediction accuracy for complex failures by optimizing a separate threshold for each server. In the first experiment, under the single-server assumption, three of the five servers were predicted not to fail even though failures actually occurred; under the multiple-server assumption, all five servers were correctly predicted to fail. These results support the hypothesis that servers affect one another. Overall, this study confirmed that prediction performance is superior when multiple servers are assumed rather than a single server. In particular, applying the Hierarchical Attention Network algorithm, on the assumption that each server's effect differs, improved the analysis, and applying a different threshold for each server further improved prediction accuracy. This study shows that failures whose causes are difficult to determine can be predicted from historical data, and it presents a model that can predict failures occurring on servers in data centers. The results are expected to help prevent failures in advance.
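As a rough sketch of the hierarchical idea, the code below encodes each server's resource time series with a shared LSTM and learns an attention weight per server, so that servers with greater impact contribute more to the failure prediction. All dimensions and layer choices are illustrative assumptions, not the authors' architecture.

```python
# Minimal PyTorch sketch: per-server LSTM encodings pooled by learned
# attention over servers, ending in a failure-probability head.
import torch
import torch.nn as nn

class ServerAttentionNet(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # scores each server's encoding
        self.head = nn.Linear(hidden, 1)   # failure-probability logit

    def forward(self, x):
        # x: (batch, n_servers, seq_len, n_features)
        b, s, t, f = x.shape
        _, (h, _) = self.encoder(x.reshape(b * s, t, f))
        h = h[-1].reshape(b, s, -1)                 # (batch, servers, hidden)
        w = torch.softmax(self.attn(h), dim=1)      # per-server weights
        context = (w * h).sum(dim=1)                # weighted summary
        return torch.sigmoid(self.head(context)).squeeze(-1)

model = ServerAttentionNet(n_features=8)
batch = torch.randn(4, 5, 60, 8)  # 4 samples, 5 servers, 60 time steps
print(model(batch).shape)         # torch.Size([4])
```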