• Title/Summary/Keyword: Software Types


The Comparative Kinematic Analysis of a Volleyball Spike Serve (배구 스파이크 서브 동작의 운동학적 비교 분석)

  • Park, Jong-Chul;Back, Jin-Ho;Lee, Jin-Taek
    • Korean Journal of Applied Biomechanics
    • /
    • v.19 no.4
    • /
    • pp.671-680
    • /
    • 2009
  • We performed a study to obtain kinematic data on the characteristics of spike serving techniques used by volleyball players, as well as basic data useful for in-field applications. We used three-dimensional videography to compare successful serves and serve errors. The subjects were 3 left attackers whose spike serves were videographed (60 fields/s). The three-dimensional coordinates were calculated using the direct linear transformation (DLT) method and then analyzed with the Kwon3D software program, version 3.1. There was no difference in elapsed time between the two serve types. However, the vertical displacement of the center of body mass (CM) differed: in successful serves, the CM tended to be lower, as did the maximum ball height at the time of hitting. Further, the higher the hitting hand was at the moment of impact, the higher the likelihood of scoring points. In successful serves, the players tended to accelerate their CM just before jumping to hit the ball and to descend rapidly at the moment of hitting. Hand speed and ball velocity at impact were higher in successful serves. Moreover, in successful serves, the shoulder angles increased to a greater extent while the elbow angles remained constant, which possibly resulted in faster and more precise serves. An important observation was that the angle of trunk inclination during the jump did not increase with the swing of the shoulders and the muscle-tendon complex.
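The abstract notes that the 3D coordinates were obtained with the direct linear transformation (DLT) method. A minimal Python sketch of the DLT reconstruction step is given below; the 11 calibration parameters per camera and the digitized image coordinates are placeholders, not data from the study.

```python
import numpy as np

def dlt_reconstruct(dlt_params, image_points):
    """Reconstruct one 3D point from two or more calibrated cameras via DLT.

    dlt_params   : list of 11-element arrays (L1..L11), one per camera
    image_points : list of digitized (u, v) image coordinates, one per camera
    """
    A, b = [], []
    for L, (u, v) in zip(dlt_params, image_points):
        # Linearized DLT equations: each camera view contributes two rows.
        A.append([L[0] - u * L[8], L[1] - u * L[9], L[2] - u * L[10]])
        A.append([L[4] - v * L[8], L[5] - v * L[9], L[6] - v * L[10]])
        b.append(u - L[3])
        b.append(v - L[7])
    xyz, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return xyz  # (x, y, z) in the calibration reference frame
```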

Life Cycle Assessment (LCA) for Calculation of the Carbon Emission Amount of Organic Farming Material -With Emphasis on Hardwood Charcoal, Grass Liquid and Microbial Agents- (유기농자재의 탄소배출량 산정을 위한 전과정평가(LCA) -참숯, 목초액, 미생물제재를 중심으로-)

  • Yoon, Sung-Yee;Son, Bo-Hong
    • Korean Journal of Organic Agriculture
    • /
    • v.20 no.3
    • /
    • pp.297-311
    • /
    • 2012
  • Since 1997, the Korean Ministry of Knowledge Economy and the Ministry of Environment have been building an LCI database covering some 400 basic raw and subsidiary materials and processes, including energy, petrochemicals, steel, cement, glass, paper, construction materials, transportation, recycling, and disposal. In agriculture, the Rural Development Administration has been establishing LCI databases for major farm products such as rice, barley, beans, cabbage, and radish since 2009, and announced that it would extend the database to 50 items by 2020. The domestic LCI databases for seeds, seedlings, agrochemicals, and inorganic and organic fertilizers are still at an early stage of establishment, so overseas LCI databases are imported and used instead; however, because the domestic and overseas natural environments differ, their reliability is limited. Therefore, the purpose of this study is to select organic farming materials, survey the production processes of the various types of organic farming materials, and establish an LCI database for the greenhouse gases emitted during those processes, so that carbon emission basic units suited to the domestic agricultural production system can be derived instead of relying on overseas data, and so that life cycle assessment can be applied to the greenhouse gases emitted in producing each crop. The organic farming materials were selected by direct observation and a bottom-up survey method, focusing on the materials applied in rice production. The carbon emission basic unit per 1 kg of organic farming material produced was calculated with the PASS 4.1.1 software developed by the Korea Accreditation Board under the Ministry of Knowledge Economy. The ultimate goal of the study is to provide basic units for calculating carbon emissions when schemes such as the target management system and the carbon performance labeling system are implemented in the agricultural sector. As a result, the emission basic units per 1 kg of production were calculated to be 0.0088 kg-$CO_2$ for hardwood charcoal, 0.1319 kg-$CO_2$ for grass liquid, and 0.2804 kg-$CO_2$ for the microbial agent.
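The reported basic units support a simple worked example: multiplying the amount of each material applied by its emission basic unit gives the embodied emissions. In the sketch below only the three basic units come from the abstract; the application amounts are hypothetical.

```python
# Emission basic units from the abstract (kg CO2 per kg of material produced)
BASIC_UNITS = {
    "hardwood charcoal": 0.0088,
    "grass liquid":      0.1319,
    "microbial agent":   0.2804,
}

# Hypothetical application amounts for one season (kg) -- illustrative only
applied = {"hardwood charcoal": 120.0, "grass liquid": 15.0, "microbial agent": 8.0}

total = 0.0
for material, kg in applied.items():
    emissions = BASIC_UNITS[material] * kg
    total += emissions
    print(f"{material:18s}: {emissions:7.4f} kg CO2")
print(f"total embodied emissions: {total:.4f} kg CO2")
```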

Flying Cake: An Augmented Game on Mobile Device (Flying Cake: 모바일 단말기를 이용한 실감형 게임)

  • Park, An-Jin;Jung, Kee-Chul
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.1
    • /
    • pp.79-94
    • /
    • 2007
  • In the ubiquitous computing age, which relies on high-quality networks, mobile devices such as wearable and hand-held ones equipped with a small camera and a wireless communication module will be widely used in the near future. Thus, many studies on augmented games for mobile devices have recently been attempted. Existing augmented games use a traditional 'backpack' system and a pattern marker. The 'backpack' system is expensive, cumbersome, and inconvenient to use, and because of the pattern marker the game can only be played in a place where the marker has been installed in advance. In this paper, we propose an augmented game called Flying Cake that uses a face region to place the virtual object (character) on a small, mobile PDA, without the pattern marker that manually indicates where the virtual object overlaps the real world and without the cumbersome hardware. Flying Cake is an augmented shooting game. The game offers two modes: 1) a single player attacks a virtual character on images captured by the camera in an outdoor physical area, and 2) two players attack the virtual character on images received through a wireless LAN. We overlap the virtual character on the face region using a face detection technique, and users play Flying Cake by attacking the virtual character. Flying Cake provides players with new enjoyment through a new game paradigm, based on the interaction between the user in the physical world, captured by the PDA camera, and the virtual character in a virtual world, enabled by face detection.
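Flying Cake anchors the virtual character to a detected face region instead of a pattern marker. The abstract does not specify the detection algorithm, so the sketch below uses a generic OpenCV Haar-cascade detector on a desktop webcam to illustrate the idea; the character sprite path is a placeholder.

```python
import cv2

# Generic face detector; the paper's own detection method is not specified in the abstract.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
character = cv2.imread("character.png")          # placeholder sprite for the virtual character

cap = cv2.VideoCapture(0)                        # webcam here; the paper used a PDA camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        # Overlay the character on the detected face region, resized to fit it.
        frame[y:y + h, x:x + w] = cv2.resize(character, (w, h))
    cv2.imshow("Flying Cake (sketch)", frame)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```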

A STUDY ON CLASS II COMPOSITE RESIN CAVITY USING FINITE ELEMENT STRESS ANALYSIS (유한요소법을 이용한 2급 복합레진 와동의 비교 연구)

  • Rim, Young-Il;Yo, In-Ho;Um, Chung-Moon
    • Restorative Dentistry and Endodontics
    • /
    • v.22 no.1
    • /
    • pp.428-446
    • /
    • 1997
  • Restorative procedures can weaken a tooth through the reduction and alteration of tooth structure, so preventing fractures is essential to conserving the tooth. The fracture resistance of a restored tooth may be influenced by many factors, among them the cavity dimensions and the physical properties of the restorative material. The placement of direct composite resin restorations has generally been found to have a strengthening effect on prepared teeth. The purpose of this investigation was to study the relationship between the cavity isthmus and the fracture resistance of a tooth restored with composite resin. In this study, an MO cavity was prepared on a maxillary first premolar. Three-dimensional finite element models were made by a serial photographic method, and the isthmus width (1/4, 1/3, 1/2 of the intercuspal distance) was varied. Two types of model (B and R) were developed: the B model assumed perfect bonding between the restoration and the cavity wall, and the R model was left unfilled. A load of 500 N was applied vertically at the first node from the lingual slope of the buccal cusp tip. The displacement and the 1- and 2-direction normal stresses and strains were analyzed with the FEM software ABAQUS version 5.2 on an IRIS 4D/310 VGX workstation. The results were as follows: 1. Displacement of the buccal cusp occurred in the R model and increased as the cavity widened, whereas displacement in the B model was small and not influenced by cavity width. 2. There was a significant decrease in stress, and hence an increase in fracture resistance, in the B model compared with the R model. 3. With increasing isthmus width, the B model showed no change in stress or strain; in the R model, stress and strain increased both at the buccal-pulpal line angle and on the buccal side of the marginal ridge, so the possibility of cracking increased. 4. Stress and strain were distributed evenly over the tooth in the B model, but in the R model they were concentrated on the buccal side of the distal marginal ridge and at the buccal-pulpal line angle, so the possibility of fracture increased.
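The analysis reports 1- and 2-direction normal stresses and strains from a linear-elastic FE model. As a minimal illustration of the constitutive step behind such results, the sketch below applies plane-stress Hooke's law to an assumed strain state; the material constants and strains are illustrative values, not the paper's data.

```python
import numpy as np

def plane_stress_matrix(E, nu):
    """Plane-stress constitutive matrix D with [s11, s22, s12] = D @ [e11, e22, g12]."""
    return (E / (1.0 - nu ** 2)) * np.array([[1.0, nu, 0.0],
                                             [nu, 1.0, 0.0],
                                             [0.0, 0.0, (1.0 - nu) / 2.0]])

# Illustrative enamel-like properties (MPa) and strain state -- not values from the study.
E, nu = 84_000.0, 0.30
strain = np.array([4.0e-4, -1.0e-4, 2.0e-4])     # [e11, e22, gamma12]
stress = plane_stress_matrix(E, nu) @ strain
print("normal stresses s11, s22 and shear s12 (MPa):", stress.round(2))
```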


Analysis of Disaster Safety Situation Classification Algorithm Based on Natural Language Processing Using 119 Calls Data (119 신고 데이터를 이용한 자연어처리 기반 재난안전 상황 분류 알고리즘 분석)

  • Kwon, Su-Jeong;Kang, Yun-Hee;Lee, Yong-Hak;Lee, Min-Ho;Park, Seung-Ho;Kang, Myung-Ju
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.9 no.10
    • /
    • pp.317-322
    • /
    • 2020
  • With the development of artificial intelligence, it is increasingly used in disaster response support systems. Disasters can occur anywhere, at any time. There are four types of 119 reports: fire, rescue, emergency, and other calls, and the disaster response to a 119 call differs depending on the type and situation. In this paper, a data set of 1,280 119 calls was classified into three situation classes with the SVM, NB, k-NN, DT, SGD, and RF classification algorithms, using a training data set. Classification performance ranged from a minimum of 77% to a maximum of 92%. In the future, it will be necessary to secure effective data sets for disasters in various fields to further study disaster response.
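The comparison of SVM, NB, k-NN, DT, SGD, and RF classifiers can be reproduced in outline with scikit-learn. The sketch below is only the shape of such an experiment: the six toy call texts and the TF-IDF features are assumptions, since the abstract does not describe the feature representation used for the 1,280 Korean 119 calls.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.ensemble import RandomForestClassifier

# Toy stand-ins for transcribed 119 calls; the study used 1,280 labeled calls.
texts = ["building on fire with heavy smoke", "person trapped in a stalled elevator",
         "chest pain and difficulty breathing", "kitchen fire spreading to the roof",
         "hiker fell and cannot move", "unconscious person, not breathing"]
labels = ["fire", "rescue", "emergency", "fire", "rescue", "emergency"]

models = {
    "SVM": LinearSVC(), "NB": MultinomialNB(), "k-NN": KNeighborsClassifier(n_neighbors=3),
    "DT": DecisionTreeClassifier(), "SGD": SGDClassifier(), "RF": RandomForestClassifier(),
}
for name, clf in models.items():
    pipe = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipe, texts, labels, cv=2)   # tiny toy split
    print(f"{name:5s} mean accuracy: {scores.mean():.2f}")
```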

A Systematic Approach Of Construction Management Based On Last Planner System And Its Implementation In The Construction Industry

  • Hussain, SM Abdul Mannan;Sekhar, Dr.T.Seshadri;Fatima, Asra
    • Journal of Construction Engineering and Project Management
    • /
    • v.5 no.2
    • /
    • pp.11-15
    • /
    • 2015
  • The Last Planner System (LPS) has been implemented on construction projects to increase work flow reliability, a precondition for project performance against productivity and progress targets. The LPS encompasses four tiers of planning processes: master scheduling, phase scheduling, lookahead planning, and commitment/weekly work planning. This research highlights deficiencies in the current implementation of LPS, including poor lookahead planning, which results in poor linkage between weekly work plans and the master schedule. This poor linkage undermines the ability of the weekly work planning process to select for execution tasks that are critical to project success. As a result, percent plan complete (PPC) becomes a weak indicator of project progress. The purpose of this research is to improve lookahead planning (the bridge between weekly work planning and master scheduling), improve PPC, and improve the selection of tasks that are critical to project success by strengthening the link between Should, Can, Will, and Did (components of the LPS), thereby rendering PPC a better indicator of project progress. The research employs the case study method to describe deficiencies in the current implementation of the LPS and to suggest guidelines for a better application of LPS in general and lookahead planning in particular. It then introduces an analytical simulation model to analyze the lookahead planning process. This is done by examining the impact on PPC of increasing two lookahead planning performance metrics: tasks anticipated (TA) and tasks made ready (TMR). Finally, the research investigates the importance of the lookahead planning functions: identification and removal of constraints, task breakdown, and operations design. The findings confirm the positive impact of improving lookahead planning (i.e., TA and TMR) on PPC. They also recognize the need to perform lookahead planning differently for three types of work involving different levels of uncertainty: stable work, medium-uncertainty work, and highly emergent work. The research confirms the LPS rules for practice and specifically the need to plan in greater detail as the time to perform the work gets closer. It highlights the role of LPS as a production system that incorporates deliberate planning (predetermined and optimized) and situated planning (flexible and adaptive). Finally, the research presents recommendations for production planning improvements in three areas: process-related (suggesting guidelines for practice), technical (highlighting issues with current software programs and advocating the inclusion of collaborative planning capability), and organizational (suggesting transitional steps when applying the LPS).
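PPC, TA, and TMR are ratios computed over the planning data. The sketch below shows one plausible way to compute them from task identifiers; the exact operational definitions of TA and TMR vary across the LPS literature, so treat these formulas (and the example tasks) as assumptions.

```python
def ppc(weekly_plan, completed):
    """Percent Plan Complete: share of weekly-work-plan tasks actually completed."""
    return 100.0 * len(set(weekly_plan) & set(completed)) / len(weekly_plan)

def tasks_anticipated(weekly_plan, lookahead):
    """TA (assumed definition): share of weekly-plan tasks that appeared in the lookahead plan."""
    return 100.0 * len(set(weekly_plan) & set(lookahead)) / len(weekly_plan)

def tasks_made_ready(lookahead, ready):
    """TMR (assumed definition): share of lookahead tasks whose constraints were removed in time."""
    return 100.0 * len(set(lookahead) & set(ready)) / len(lookahead)

# Hypothetical task identifiers for one week
lookahead   = {"A", "B", "C", "D", "E"}
ready       = {"A", "B", "C"}
weekly_plan = ["A", "B", "C", "F"]
completed   = {"A", "B", "F"}

print(f"PPC = {ppc(weekly_plan, completed):.0f}%, "
      f"TA = {tasks_anticipated(weekly_plan, lookahead):.0f}%, "
      f"TMR = {tasks_made_ready(lookahead, ready):.0f}%")
```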

Design and Implementation of a Query Processing Algorithm for Distributed Semistructured Documents Retrieval with Metadata Interface (메타데이타 인터페이스를 이용한 분산된 반구조적 문서 검색을 위한 질의처리 알고리즘 설계 및 구현)

  • Choe Cuija;Nam Young-Kwang
    • Journal of KIISE:Software and Applications
    • /
    • v.32 no.6
    • /
    • pp.554-569
    • /
    • 2005
  • For semistructured distributed documents, it is very difficult to formalize and implement a query processing system because the data lack structure and rules. In order to precisely retrieve and process heterogeneous semistructured documents, it is necessary to handle multiple mappings such as 1:1, 1:N, and N:1 on an element simultaneously and to generate the schema from the distributed documents. In this paper, we propose a query processing algorithm for querying and answering over heterogeneous semistructured data or documents in distributed systems, and implement it with a metadata interface. The algorithm for generating local queries from a global query consists of mapping between global and local nodes, data transformation according to the mapping types, path substitution, and resolution of the heterogeneity among nodes in the global input query using metadata information. The mapping, transformation, and path substitution algorithms between the global schema and the local schemas have been implemented in the metadata interface called DBXMI (Distributed Documents XML Metadata Interface). Nodes with the same node name but different mappings or meanings are resolved by automatically extracting node identification information from the local schema. The system uses Quilt as its XML query language. An experimental test over 3 different OEM-model semistructured restaurant documents is reported. The prototype system was developed on Windows with Java and the JavaCC compiler.
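The heart of generating local queries from a global query is substituting each mapped global path with its local counterparts (1:1, 1:N, or N:1). The toy sketch below illustrates only that substitution step on plain strings; the paths and the restaurant example are invented, and the actual system rewrites Quilt/XML queries through the DBXMI metadata.

```python
# Toy global-to-local mapping table; a value with several entries models a 1:N mapping.
MAPPING = {
    "/restaurant/name":    ["/shop/title"],                           # 1:1
    "/restaurant/address": ["/shop/addr/street", "/shop/addr/city"],  # 1:N
}

def to_local_queries(global_query):
    """Rewrite every mapped global path in the query into each local alternative."""
    queries = [global_query]
    for global_path, local_paths in MAPPING.items():
        if global_path not in global_query:
            continue
        queries = [q.replace(global_path, lp) for q in queries for lp in local_paths]
    return queries

print(to_local_queries("FOR $r IN document('db')/restaurant/address RETURN $r"))
```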

A Homonym Disambiguation System based on Semantic Information Extracted from Dictionary Definitions (사전의 뜻풀이말에서 추출한 의미정보에 기반한 동형이의어 중의성 해결 시스템)

  • Hur, Jeong;Ock, Cheol-Young
    • Journal of KIISE:Software and Applications
    • /
    • v.28 no.9
    • /
    • pp.688-698
    • /
    • 2001
  • A homonym can be disambiguated by other words in its context, such as the nouns and predicates used with it. This paper proposes a homonym disambiguation system based on statistical semantic information extracted from dictionary definitions. The semantic information consists of the nouns and predicates that are used with the homonym in definitions. In order to extract accurate semantic information, definitions are classified into two types. In the first type, there is a hyponym-hypernym relation between the title word and the head word (homonym) of the definition; this relation is a one-level semantic hierarchy and can be extended to deeper levels to overcome the problem of data sparseness. In the second type, the homonym is used in the middle of the definition. The system considers nouns and predicates simultaneously to disambiguate the homonym. Nine homonyms were examined in order to determine the weights of nouns and predicates, which affect the accuracy of homonym disambiguation. In experiments using the training corpus (dictionary definitions), the average accuracy of homonym disambiguation is 96.11% when the weights are 0.9 and 0.1 for nouns and predicates, respectively. A further experiment measuring the generality of the system yields an average accuracy of 80.73% on 1,796 sentences not used in training, drawn from Korean Information Base I and the ETRI corpus.
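The disambiguation step reduces to a weighted vote: co-occurring nouns contribute with weight 0.9 and predicates with weight 0.1, and the sense with the highest score is chosen. The sketch below shows that scoring scheme with invented co-occurrence statistics; the real statistics come from dictionary definitions, not from these toy numbers.

```python
# Invented co-occurrence weights for one ambiguous word with two senses.
NOUN_STATS = {"sense_1": {"face": 0.6, "tear": 0.3}, "sense_2": {"winter": 0.7, "mountain": 0.2}}
PRED_STATS = {"sense_1": {"see": 0.8},               "sense_2": {"fall": 0.9}}
W_NOUN, W_PRED = 0.9, 0.1   # weights reported as best in the abstract

def disambiguate(context_nouns, context_predicates):
    """Score every sense by weighted noun/predicate evidence and return the best one."""
    scores = {}
    for sense in NOUN_STATS:
        noun_score = sum(NOUN_STATS[sense].get(n, 0.0) for n in context_nouns)
        pred_score = sum(PRED_STATS[sense].get(p, 0.0) for p in context_predicates)
        scores[sense] = W_NOUN * noun_score + W_PRED * pred_score
    return max(scores, key=scores.get), scores

print(disambiguate(["winter"], ["fall"]))   # -> ('sense_2', {...})
```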


Characterization of Rabbit Retinal Ganglion Cells with Multichannel Recording (다채널기록법을 이용한 토끼 망막 신경절세포의 특성 분석)

  • Cho Hyun Sook;Jin Gye-Hwan;Goo Yong Sook
    • Progress in Medical Physics
    • /
    • v.15 no.4
    • /
    • pp.228-236
    • /
    • 2004
  • Retinal ganglion cells transmit the visual scene as action potentials to the visual cortex through the optic nerve. Conventional recording methods that use a single intracellular or extracellular electrode reveal the response of a specific neuron at a specific time; they cannot determine how the nerve impulses of a population of retinal ganglion cells collectively encode the visual stimulus, which requires recording the electrical signals of many neurons simultaneously. Recent advances in multi-electrode recording have brought us closer to understanding how visual information is encoded by populations of retinal ganglion cells. We examined how ganglion cells act together to encode a visual scene with a multi-electrode array (MEA). With light stimulation (on duration: 2 s, off duration: 5 s) generated on a color monitor driven by custom-made software, we isolated three functional types of ganglion cell activity: ON (35.0$\pm$4.4%), OFF (31.4$\pm$1.9%), and ON/OFF cells (34.6$\pm$5.3%) (total number of retinal pieces = 8). We observed that nearby neurons often fire action potentials in near synchrony (< 1 ms), and this narrow correlation is seen among cells within a cluster of 6~8 cells. As there are many more synchronized firing patterns than ganglion cells, such a distributed code might allow the retina to compress a large number of distinct visual messages into a small number of ganglion cells.
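Near-synchronous firing (< 1 ms) between a pair of cells can be counted directly from sorted spike time stamps. The sketch below shows one straightforward way to do that for two trains; the spike times are synthetic, and a full analysis would repeat this over every cell pair recorded by the MEA.

```python
import numpy as np

def synchronous_spikes(spikes_a, spikes_b, window=0.001):
    """Count spikes of cell A that have a spike of cell B within +/- window seconds."""
    spikes_b = np.sort(spikes_b)
    idx = np.clip(np.searchsorted(spikes_b, spikes_a), 1, len(spikes_b) - 1)
    nearest = np.minimum(np.abs(spikes_a - spikes_b[idx - 1]),
                         np.abs(spikes_a - spikes_b[idx]))
    return int(np.sum(nearest <= window))

# Synthetic spike time stamps (seconds) from two neighbouring ganglion cells
a = np.array([0.1000, 0.2503, 0.4000, 0.8120])
b = np.array([0.1004, 0.2600, 0.3999, 0.9000])
print(synchronous_spikes(a, b))   # 2 spikes of A fall within 1 ms of a spike of B
```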


Development of Sensor Network Simulator for Estimating Power Consumption and Execution Time (전력소모량 및 실행시간 추정이 가능한 센서 네트워크 시뮬레이터의 개발)

  • Kim, Bang-Hyun;Kim, Tae-Kyu;Jung, Yong-Doc;Kim, Jong-Hyun
    • Journal of the Korea Society for Simulation
    • /
    • v.15 no.1
    • /
    • pp.35-42
    • /
    • 2006
  • A sensor network, an infrastructure of ubiquitous computing, consists of a number of sensor nodes whose hardware is very small. The network topology and routing scheme should be determined according to the network's purpose, and its hardware and software may have to be changed as needed from time to time. Thus, a sensor network simulator capable of verifying network behavior and estimating performance is required for better design. Existing sensor network simulators have been developed for specific hardware or operating systems, so they can only be used for such systems, and they do not provide any means of estimating power consumption or program execution time, which are major issues in system design. In this study, we develop a sensor network simulator that can be used to design and verify various sensor networks regardless of the type of application or operating system, and that can also predict power consumption and program execution time. For this purpose, the simulator is built on a machine instruction-level discrete-event simulation scheme. As a result, it can be used to analyze program execution timings and the related system behaviors of actual sensor nodes in detail. The instruction traces used as simulation workloads are executable images produced by the cross-compiler for the ATmega128L microcontroller.
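A machine instruction-level discrete-event simulator of this kind essentially replays an instruction trace while accumulating cycle counts and energy per instruction class. The sketch below shows that bookkeeping in miniature; the instruction classes, cycle counts, current draws, and clock settings are invented placeholders rather than ATmega128L figures.

```python
# Invented per-instruction-class costs: (cycles, current draw in mA) -- placeholders only.
COST = {"alu": (1, 7.5), "load": (2, 8.0), "radio_tx": (4, 25.0)}
F_CPU = 8_000_000      # assumed clock frequency, Hz
VDD = 3.0              # assumed supply voltage, V

def simulate(trace):
    """Replay an instruction trace, accumulating execution time (s) and energy (J)."""
    cycles, energy_j = 0, 0.0
    for instr in trace:
        c, i_ma = COST[instr]
        cycles += c
        energy_j += (i_ma / 1000.0) * VDD * (c / F_CPU)   # E = I * V * t
    return cycles / F_CPU, energy_j

t, e = simulate(["load", "alu", "alu", "radio_tx"] * 1000)
print(f"estimated execution time: {t * 1e3:.3f} ms, energy: {e * 1e6:.2f} uJ")
```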
