• Title/Summary/Keyword: 데이터과학과 (Department of Data Science)

Hybrid Scheme of Data Cache Design for Reducing Energy Consumption in High Performance Embedded Processor (고성능 내장형 프로세서의 에너지 소비 감소를 위한 데이타 캐쉬 통합 설계 방법)

  • Shim, Sung-Hoon;Kim, Cheol-Hong;Jhang, Seong-Tae;Jhon, Chu-Shik
    • Journal of KIISE: Computer Systems and Theory, v.33 no.3, pp.166-177, 2006
  • The cache size tends to grow in embedded processors as technology scales to smaller transistors and lower supply voltages. However, a larger cache demands more energy, so the ratio of cache energy consumption to total processor energy is growing. Many schemes have been proposed to reduce cache energy consumption, but each previous scheme addresses only one side of the problem: dynamic cache energy only, or static cache energy only. In this paper, we propose a hybrid scheme that reduces dynamic and static cache energy simultaneously. For this hybrid scheme, we adopt two existing techniques: the drowsy cache technique to reduce static cache energy consumption, and the way-prediction technique to reduce dynamic cache energy consumption. Additionally, we propose an early wake-up technique based on the program counter to reduce the penalty caused by applying the drowsy cache technique. We focus on the level-1 data cache. The hybrid scheme can reduce static and dynamic cache energy consumption simultaneously, and our early wake-up scheme can reduce the extra program execution cycles the hybrid scheme introduces.
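
A toy model makes the interplay of the two adopted techniques concrete. The sketch below is ours, not the paper's simulator: the probe energy unit, the one-cycle wake-up penalty, and the MRU-style way prediction are illustrative assumptions.

```python
# Toy model of one set of a way-predicted cache with drowsy lines.
# All constants are illustrative assumptions, not the paper's parameters.

DYNAMIC_PROBE = 1.0   # relative dynamic energy per way probed
WAKEUP_CYCLES = 1     # assumed latency to wake a drowsy line

class ToySet:
    def __init__(self, ways=4):
        self.tags = [None] * ways
        self.drowsy = [True] * ways    # lines start in low-leakage mode
        self.predicted = 0             # MRU way prediction

    def access(self, tag):
        energy, cycles = 0.0, 0
        order = [self.predicted] + [w for w in range(len(self.tags))
                                    if w != self.predicted]
        for w in order:                # probe the predicted way first
            energy += DYNAMIC_PROBE
            if self.tags[w] == tag:
                if self.drowsy[w]:     # a drowsy hit pays a wake-up penalty
                    cycles += WAKEUP_CYCLES
                    self.drowsy[w] = False
                self.predicted = w
                return energy, cycles
        return energy, cycles          # miss: every way was probed

s = ToySet()
s.tags[2] = 0xBEEF
print(s.access(0xBEEF))                # (3.0, 1): 3 probes, 1 wake-up cycle
```

A PC-based early wake-up in the paper's spirit would clear `drowsy[w]` before the access arrives, hiding `WAKEUP_CYCLES` from the observed latency.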

Direct Reconstruction of Displaced Subdivision Mesh from Unorganized 3D Points (연결정보가 없는 3차원 점으로부터 차이분할메쉬 직접 복원)

  • Jung, Won-Ki;Kim, Chang-Heon
    • Journal of KIISE: Computer Systems and Theory, v.29 no.6, pp.307-317, 2002
  • In this paper, we propose a new mesh reconstruction scheme that produces a displaced subdivision surface directly from unorganized points. The displaced subdivision surface is a mesh representation that defines a detailed mesh with a displacement map over a smooth domain surface, but the original displaced subdivision surface algorithm requires an explicit polygonal mesh, since it is a mesh conversion (remeshing) algorithm rather than a mesh reconstruction algorithm. The main idea of our approach is to sample surface detail from unorganized points without any topological information. To do this, we predict a virtual triangular face from the unorganized points for each sampling ray cast from a parametric domain surface. Reconstructing a displaced subdivision surface directly from unorganized points matters because the output has several important properties: the mesh representation is compact, since most vertices can be represented by a single scalar value; the underlying structure is piecewise regular, so it can easily be transformed into a multiresolution mesh; and smoothness is automatically preserved after mesh deformation. We avoid time-consuming global energy optimization by employing input-data-dependent mesh smoothing, so we can obtain a good-quality displaced subdivision surface quickly.
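
The core sampling step can be pictured as follows. This is a minimal Python sketch assuming numpy, and it substitutes a least-squares local plane fit for the paper's virtual-triangle prediction (our simplification, not the authors' method) to obtain the scalar displacement along a sampling ray:

```python
import numpy as np

def displacement(origin, normal, points, k=8):
    """Distance along `normal` from a domain-surface sample `origin`
    to a local plane fitted to the k nearest unorganized points."""
    d2 = np.sum((points - origin) ** 2, axis=1)
    near = points[np.argsort(d2)[:k]]        # k nearest neighbours
    centroid = near.mean(axis=0)
    # plane normal = right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(near - centroid)
    n_plane = vt[-1]
    denom = np.dot(n_plane, normal)
    if abs(denom) < 1e-9:                    # ray parallel to the plane
        return 0.0
    return float(np.dot(n_plane, centroid - origin) / denom)

pts = np.random.rand(200, 3)
pts[:, 2] = 0.5                              # points on the plane z = 0.5
print(displacement(np.zeros(3), np.array([0.0, 0.0, 1.0]), pts))  # ~0.5
```

Each such scalar is what allows "most vertices to be represented by a single scalar value" in the reconstructed mesh.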

Feasibility of Two Dimensional Ion Chamber Array for a Linac Periodic Quality Assurance (선형가속기의 품질관리를 위한 2차원이온전리함배열의 유용성)

  • Lee, Jeong-Woo;Hong, Se-Mie;Park, Byung-Moon;Kang, Min-Young;Kim, You-Hyun;Suh, Tae-Suk
    • Journal of Radiological Science and Technology, v.31 no.2, pp.183-188, 2008
  • The aim of this study is to investigate the feasibility of a 2D ion chamber array as a substitute for the water phantom system in periodic Linac QA. For the feasibility study, a commercial ion chamber matrix was used in place of the water phantom to measure routine QA beam properties. The device used in this study was the I'mRT MatriXX (Wellhofer Dosimetrie, Germany). The MatriXX consists of an array of 1,020 vented ion chambers arranged in a $24{\times}24\;cm^2$ matrix. Each ion chamber has a volume of $0.08\;cm^3$ and a spacing of 0.762 cm. We investigated dosimetric parameters such as dose symmetry, energy ($TPR_{20,10}$), and absolute dose, and compared them with water phantom data measured with a Farmer-type ionization chamber (FC65G, Wellhofer Dosimetrie, Germany). For the MatriXX measurements, we used a white polystyrene phantom (${\rho}:\;1.18\;g/cm^3$) and treated the intrinsic layer of the MatriXX (${\rho}:\;1.06\;g/cm^3$, t: 0.36 cm) as water-equivalent depth. In a preliminary study of geometrical QA using the MatriXX, the collimator rotation-axis test and the half-beam junction test were included and compared with film measurements. In the dosimetric QA, the MatriXX agreed with the water phantom measurements to within ${\pm}1\%$. In the geometrical tests, the data from the MatriXX were comparable with those from the films. In conclusion, the MatriXX is a good substitute for the water phantom system and film measurements, and the results indicate that it is a cost-effective QA tool that reduces time and staffing requirements.
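
Two of the checked quantities reduce to short formulas. The sketch below uses common textbook definitions, which we assume here (the paper may normalize differently): point-by-point mirrored symmetry of a beam profile, and $TPR_{20,10}$ as the dose ratio at 20 cm and 10 cm depth for a fixed source-detector distance.

```python
def symmetry(profile):
    """Maximum mirrored-pair dose difference, in percent."""
    n = len(profile)
    pairs = zip(profile[: n // 2], reversed(profile[n - n // 2:]))
    return max(abs(l - r) / (l + r) * 200 for l, r in pairs)

def tpr_20_10(d20, d10):
    """Beam-quality index as a simple dose ratio."""
    return d20 / d10

print(symmetry([96.0, 99.2, 100.0, 99.0, 95.5]))  # ~0.52 %
print(tpr_20_10(38.2, 57.6))                      # ~0.663 (6 MV-like beam)
```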

Effects of sending text messages and searching navigation while driving on driving performance: a study of taxi drivers in their 50s (운전 중 문자 메시지 전송과 네비게이션 검색이 운전 수행 능력에 미치는 영향 : 50대 택시 운전자를 대상으로)

  • Kim, Han-Soo;Choi, Jin-Seung;Kang, Dong-Won;Oh, Ho-Sang;Seo, Jung-Woo;Yeon, Hong-Won;Choi, Mi-Hyun;Min, Byung-Chan;Chung, Soon-Cheol;Tack, Gye-Rae
    • Science of Emotion and Sensibility, v.14 no.4, pp.571-580, 2011
  • The purpose of this study was to evaluate the effects of secondary tasks such as sending text messages (STM) and searching navigation (SN), using variables that indicate control of the vehicle (medial-lateral coefficient of variation, MLCV; anterior-posterior coefficient of variation, APCV) and a motion signal (jerk-cost function, JC). Participants were taxi drivers in their 50s: 14 males and 14 females. Participants were instructed to keep a constant distance (30 m) from the car ahead at a constant speed (80 km/h or 100 km/h). The experiment consisted of driving alone for 1 minute and driving with a secondary task for 1 minute. Both MLCV and APCV were significantly higher during driving + STM and driving + SN than during driving only. JC was also higher during driving + STM and driving + SN than during driving only. We found that even in this expert group of taxi drivers with 25 years of driving experience, the smoothness of motion decreases and control of the vehicle is disturbed when performing secondary tasks such as sending text messages or searching navigation.
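
Both kinds of measures are easy to state. The sketch below assumes common definitions (a coefficient of variation of the position signal, and a jerk cost as the time integral of squared jerk); the paper's exact normalization may differ, and the trace is synthetic.

```python
import numpy as np

def coefficient_of_variation(x):
    return np.std(x) / np.mean(x)        # MLCV/APCV-style variability

def jerk_cost(position, dt):
    jerk = np.gradient(np.gradient(np.gradient(position, dt), dt), dt)
    return np.trapz(jerk ** 2, dx=dt)    # larger JC = less smooth motion

t = np.arange(0.0, 60.0, 0.01)           # a 1-minute trial at 100 Hz
lane_pos = 1.8 + 0.05 * np.sin(0.5 * t)  # synthetic lateral position (m)
print(coefficient_of_variation(lane_pos))
print(jerk_cost(lane_pos, 0.01))
```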

Risk Ranking Determination of Combination of Foodborne Pathogens and Livestock or Livestock Products (식중독 세균과 주요 축산식품 및 가공품 조합에 대한 위해순위 결정)

  • Hong, Soo-Hyeon;Park, Na-Yoon;Jo, Hye-Jin;Ro, Eun-Young;Ko, Young-Mi;Na, Yu-Jin;Park, Keun-Cheol;Choi, Bum-Geun;Min, Kyung-Jin;Lee, Jong-Kyung;Moon, Jin-San;Yoon, Ki-Sun
    • Journal of Food Hygiene and Safety, v.30 no.1, pp.1-12, 2015
  • This study was performed to determine the risk ranking of combinations of pathogens and livestock or livestock products, in order to identify the most significant public health risks and to prioritize risk management strategies. First, we reviewed foodborne outbreak data related to livestock products and determined the main vehicles and pathogens according to the numbers of outbreaks and cases. Second, expert opinion on the management priority of pathogen-livestock product pairings was surveyed among 19 livestock experts from universities, research centers, and government agencies. Lastly, we used the output of Risk Ranger (a semi-quantitative risk ranking tool) for 14 combinations of pathogens and livestock or livestock products. We classified the pathogen-livestock product combinations into group I (high risk), group II (medium risk), and group III (low risk) according to their risk levels and management priority. Group I, which carries the highest risk of foodborne outbreak, includes Salmonella spp./eggs and egg products, Campylobacter spp./poultry, and pathogenic E. coli/meat and processed ground meat. In conclusion, the results of this study provide specific guidance for mid- and long-term planning of risk assessment and for prioritizing risk management of pathogen and livestock or livestock product combinations.
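
As a toy illustration of what a semi-quantitative ranking combines: Risk Ranger itself scores about eleven questionnaire factors onto a 0-100 scale, and the factors, weights, and numbers below are invented for illustration, not the study's data.

```python
import math

def risk_score(severity, exposures_per_year, p_illness_per_exposure):
    """Map probability x severity x exposure onto a 0-100 log scale."""
    risk = severity * exposures_per_year * p_illness_per_exposure
    return max(0.0, min(100.0, 50 + 10 * math.log10(risk)))

pairs = {   # hypothetical inputs per pathogen-product combination
    "Salmonella spp. / eggs":           risk_score(0.01, 100, 1e-4),
    "Campylobacter spp. / poultry":     risk_score(0.01, 150, 2e-4),
    "Pathogenic E. coli / ground meat": risk_score(0.05, 50, 5e-5),
}
for combo, score in sorted(pairs.items(), key=lambda kv: -kv[1]):
    print(f"{score:5.1f}  {combo}")      # highest score = manage first
```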

The analysis of ethylene glycol and metabolites in biological specimens (생체시료에서 에틸렌 글리콜과 그 대사체 분석에 관한 연구)

  • Park, Seh-Youn;Kim, Yu-Na;Kim, Nam-Yee
    • Analytical Science and Technology, v.24 no.2, pp.69-77, 2011
  • Ethylene glycol (EG) is produced commercially in large amounts and is widely used as an antifreeze or deicing solution for cars, boats, and aircraft. EG poisoning occurs in suicide attempts and, infrequently, through intentional misuse or by accident, as EG has a sweet taste. EG itself has low toxicity, but in vivo it is broken down into more toxic organic acids, principally the metabolites glycolic acid and oxalic acid, which are responsible for extensive cellular damage in various tissues. The most conclusive analytical method for diagnosing EG poisoning is determination of the EG concentration. However, victims are sometimes admitted to hospital at a late stage, die during emergency treatment such as gastric lavage, or are found after decomposition, by which time blood EG concentrations are low or undetectable. Therefore, in this study, EG was identified by gas chromatography-mass spectrometry (GC-MS) following derivatization, and further toxicological analyses of the metabolites glycolic acid (GA) and oxalic acid (OA) were performed by ion chromatography in various biological specimens. The range of blood concentrations (3 cases) was $10\sim2,400\;{\mu}g/mL$ for EG, $224\sim1,164\;{\mu}g/mL$ for GA, and ND$\sim40\;{\mu}g/mL$ for OA. In other biological specimens (liver, kidney, bile, and pleural fluid), the range of concentrations (3 cases) was ND$\sim55,000\;{\mu}g/mL$ for EG, ND$\sim1,124\;{\mu}g/mL$ for GA, and ND$\sim60\;{\mu}g/mL$ for OA. Liver and kidney tissues are recommended specimens, in addition to blood, because OA, the final metabolite of EG, was identified in large amounts in these tissues even when no EG was detectable owing to treatment.

Meta-Analytic Approach to the Effects of Food Processing Treatment on Pesticide Residues in Agricultural Products (식품가공처리가 농산물 잔류농약에 미치는 영향에 대한 메타분석)

  • Kim, Nam Hoon;Park, Kyung Ai;Jung, So Young;Jo, Sung Ae;Kim, Yun Hee;Park, Hae Won;Lee, Jeong Mi;Lee, Sang Mi;Yu, In Sil;Jung, Kweon
    • The Korean Journal of Pesticide Science, v.20 no.1, pp.14-22, 2016
  • A trial of combining and quantifying the effects of food processing on various pesticides was carried out using meta-analysis. In this study, weighted mean response ratios and confidence intervals for the reduction of pesticide residue levels in fruits and vegetables treated with various food-processing techniques were calculated using the statistical tools of meta-analysis. The weighted mean response ratios for tap water washing, peeling, blanching (boiling), and oven drying were 0.52, 0.14, 0.34, and 0.46, respectively. Among the food processing methods, peeling showed the greatest effect on the reduction of pesticide residues. Pearson's correlation coefficient (r=0.624) between the weighted mean response ratios and the octanol-water partition coefficients ($logP_{ow}$) of twelve pesticides processed with tap water washing indicated a positive correlation at the 0.05 significance level (p=0.03). That is, pesticides with higher $logP_{ow}$ values showed higher weighted mean response ratios. These results can be used as reference data for processing factors in risk assessment and as information for consumers on how to reduce pesticide residues in agricultural products.
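
The central statistic is an inverse-variance-weighted mean of log response ratios, back-transformed to a ratio. A minimal sketch with invented study numbers (not the paper's data):

```python
import math

def weighted_mean_response_ratio(studies):
    """studies: (residue_after, residue_before, variance_of_lnR) tuples."""
    num = den = 0.0
    for after, before, var in studies:
        ln_r = math.log(after / before)  # log response ratio of one study
        weight = 1.0 / var               # inverse-variance weight
        num += weight * ln_r
        den += weight
    return math.exp(num / den)           # back-transform to a ratio

# Three hypothetical washing studies; ~0.51 means ~49% average removal.
print(weighted_mean_response_ratio([(0.8, 1.6, 0.04),
                                    (0.5, 1.0, 0.02),
                                    (1.1, 2.0, 0.05)]))
```

On this reading, the reported ratio of 0.52 for tap water washing means that, on average, about half of the residue remains after washing.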

Design and Implementation of the SSL Component based on CBD (CBD에 기반한 SSL 컴포넌트의 설계 및 구현)

  • Cho Eun-Ae;Moon Chang-Joo;Baik Doo-Kwon
    • Journal of KIISE: Computing Practices and Letters, v.12 no.3, pp.192-207, 2006
  • Today, the SSL protocol is used as a core part of various computing environments and security systems. However, the SSL protocol has several problems that stem from its rigidity in operation. First, SSL places a considerable burden on CPU utilization, lowering the performance of the security service in encrypted transactions, because it encrypts all data transferred between a server and a client. Second, SSL can be vulnerable to cryptanalysis because it uses a fixed algorithm for its keys. Third, it is difficult to add and use new cryptographic algorithms. Finally, it is difficult for developers to learn the cryptography APIs (Application Program Interfaces) needed for the SSL protocol. Hence, we need to address these problems and, at the same time, provide a secure and convenient way to operate the SSL protocol and handle data efficiently. In this paper, we propose an SSL component designed and implemented using the CBD (Component Based Development) concept to satisfy these requirements. The SSL component provides not only data encryption services like the SSL protocol but also convenient APIs for developers unfamiliar with security. Furthermore, because the SSL component can be reused, it can improve productivity and reduce development cost, and when new algorithms are added or existing ones are changed, it remains compatible and easy to integrate. The SSL component performs the SSL protocol service at the application layer. We first elicit the requirements, and then design and implement the SSL component together with the confidentiality and integrity components that support it. All of these components are implemented with EJB, which allows efficient data handling by encrypting/decrypting only the selected data, and improves usability by letting users choose the data and mechanism as they intend. In conclusion, our tests and evaluations show that the SSL component is more usable and efficient than the existing SSL protocol, because the increase in processing time for the SSL component is lower than that of the SSL protocol.
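
The paper's component is an EJB implementation; as a language-neutral illustration of the same idea (encrypt only the selected fields behind a small API that hides the cryptography), here is a Python sketch assuming the third-party `cryptography` package. The class and method names are ours, not the paper's API.

```python
from cryptography.fernet import Fernet

class SelectiveEncryptor:
    """Encrypts only chosen fields, leaving the rest in plaintext."""
    def __init__(self, key=None):
        self.key = key or Fernet.generate_key()
        self._fernet = Fernet(self.key)

    def protect(self, record: dict, sensitive: set) -> dict:
        return {k: self._fernet.encrypt(v.encode()) if k in sensitive else v
                for k, v in record.items()}

    def reveal(self, record: dict, sensitive: set) -> dict:
        return {k: self._fernet.decrypt(v).decode() if k in sensitive else v
                for k, v in record.items()}

enc = SelectiveEncryptor()
msg = {"user": "alice", "card_no": "1234-5678-9012-3456"}
sealed = enc.protect(msg, {"card_no"})   # "user" stays plaintext, saving CPU
print(enc.reveal(sealed, {"card_no"}))
```

Encrypting only the sensitive fields is what addresses the first problem above: the CPU cost scales with the selected data rather than with the whole stream.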

A Control Method for designing Object Interactions in 3D Game (3차원 게임에서 객체들의 상호 작용을 디자인하기 위한 제어 기법)

  • 김기현;김상욱
    • Journal of KIISE: Computing Practices and Letters, v.9 no.3, pp.322-331, 2003
  • As the complexity of a 3D game is increased by the various factors of the game scenario, controlling the interrelations of the game objects becomes a problem. A game system therefore needs to coordinate the responses of the game objects, and it is also necessary to control the animation behaviors of the game objects in terms of the game scenario. To produce realistic game simulations, a system has to include a structure for designing the interactions among the game objects. This paper presents a method for designing a dynamic control mechanism for the interaction of the game objects in the game scenario. For this method, we suggest a game agent system as a framework based on intelligent agents that make decisions using specific rules. The game agent system is used to manage environment data, to simulate the game objects, to control interactions among game objects, and to support a visual authoring interface that can define various interrelations of the game objects. These techniques can handle the autonomy level of the game objects, the associated collision-avoidance method, and so on. They also enable coherent decision-making by the game objects about changes of scene. In this paper, rule-based behavior control was designed to guide the simulation of the game objects; the rules are predefined by the user through a visual interface for designing their interactions, as sketched below. The Agent State Decision Network, which is composed of the visual elements, passes information along and infers the current state of the game objects. All of these methods can monitor and check variations in motion state between game objects in real time. Finally, we present a validation of the control method together with a simple case-study example.
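
A minimal sketch of rule-based behavior control in the spirit of the Agent State Decision Network described above; the states, rule conditions, and field names are invented for illustration.

```python
RULES = [   # (condition on the agent's view of the scene, next state)
    (lambda s: s["distance_to_enemy"] < 2.0,  "attack"),
    (lambda s: s["health"] < 0.2,             "flee"),
    (lambda s: s["distance_to_enemy"] < 10.0, "chase"),
]

def decide_state(scene, default="patrol"):
    """Return the state of the first matching rule, else the default."""
    for condition, state in RULES:
        if condition(scene):
            return state
    return default

print(decide_state({"distance_to_enemy": 5.0, "health": 0.9}))  # "chase"
```

In a visual authoring interface, each rule would correspond to an edge the designer draws between agent states rather than a hand-written condition.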

Implementation of Reporting Tool Supporting OLAP and Data Mining Analysis Using XMLA (XMLA를 사용한 OLAP과 데이타 마이닝 분석이 가능한 리포팅 툴의 구현)

  • Choe, Jee-Woong;Kim, Myung-Ho
    • Journal of KIISE: Computing Practices and Letters, v.15 no.3, pp.154-166, 2009
  • Database query and reporting tools, OLAP tools, and data mining tools are typical front-end tools in a Business Intelligence (BI) environment, which supports gathering, consolidating, and analyzing data produced by business operations and gives an enterprise's users access to the results. Traditional reporting tools have the advantage of creating sophisticated dynamic reports, including SQL query result sets, that look like documents produced by word processors, and of publishing those reports to the Web, but their data sources are limited to RDBMSs. On the other hand, OLAP tools and data mining tools each provide powerful information analysis functions in their own way, but their built-in visualization components for analysis results are limited to tables and some charts. This paper therefore presents a system that integrates the three typical front-end tools so that they complement one another in a BI environment. Traditional reporting tools have only a query editor for generating SQL statements to fetch data from an RDBMS; the reporting tool presented in this paper can also extract data from OLAP and data mining servers, because editors for OLAP and data mining query requests have been added to the tool. Traditional systems produce all documents on the server side, a structure that lets reporting tools avoid regenerating a document each time many clients access the same dynamic document. Because our system instead targets a small number of users who generate documents for data analysis, the tool generates documents on the client side, and it therefore includes a processing mechanism to handle large amounts of data despite the limited memory capacity of the report viewer on the client side. The reporting tool also has a data structure for integrating data from the three kinds of data sources into one document. Finally, most traditional front-end tools for BI are tied to the data source architecture of a specific vendor. To overcome this problem, our system uses XMLA, a protocol based on web services, to access OLAP and data mining data sources from various vendors, as illustrated below.
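
Vendor independence comes from XMLA's SOAP envelope: any compliant OLAP server answers the same `Execute` call. Below is a minimal sketch of such a request in Python with `requests`; the endpoint URL, catalog name, and MDX statement are placeholders.

```python
import requests

ENDPOINT = "http://olap.example.com/xmla"   # placeholder XMLA endpoint

SOAP_BODY = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Command>
        <Statement>
          SELECT [Measures].[Sales] ON COLUMNS,
                 [Product].[Category].Members ON ROWS
          FROM [SalesCube]
        </Statement>
      </Command>
      <Properties>
        <PropertyList>
          <Catalog>SalesDB</Catalog>
          <Format>Multidimensional</Format>
        </PropertyList>
      </Properties>
    </Execute>
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=SOAP_BODY.encode("utf-8"),
    headers={"Content-Type": "text/xml",
             "SOAPAction": "urn:schemas-microsoft-com:xml-analysis:Execute"},
)
print(response.status_code)   # the MDDataSet comes back as XML in the body
```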