• Title/Summary/Keyword: Compilation

A Study of the Books Printed with a Newly Found Font, Tentatively Named "Muin-ja" (세조조(世祖朝) 신주(新鑄)의 '무인자(戊寅字)'와 그 간본(刊本) -주(主)로 그 주자(鑄字)의 고증(考證)을 중심(中心)으로-)

  • Chon, Hye-Bong
    • Journal of the Korean Biblia Society for Library and Information Science / v.2 no.1 / pp.102-131 / 1974
  • The author's thesis is that the types used for the large-sized characters in the two metal-type-printed books "Kyosik chubopop karyong" (交食推步法假令) and "Yok-hak kemong yohae" (易學啓蒙要解), both printed in 1458, belong to a new metal font hitherto unnamed. The former book was compiled by Yi Sun-ji (李純之) and Kim Sok-je (金石梯) in January of 1458 in accordance with King Sejo's order, and a new font was created for its large-sized characters. Several months after completion of the compilation, the book was printed with this new font mixed with the Kabin-ja (甲寅字) for the medium- and small-sized characters. The latter book had been written by King Sejo before his accession to the throne; after ascending the throne, the king had his scholar-subjects examine the writing and correct it where necessary. The examination was completed in July of 1458 and printing was done immediately with the two fonts: the above-mentioned new font for the large-sized letters and the Kabin-ja for the medium- and small-sized ones. The books were granted to the scholar-subjects and the students of the Sung Kyun Kwan Academy as a royal gift. The matrix seems to have been modeled after the calligraphy of King Sejo. Because the new font was created in 1458 to print the large-sized letters of these two books, it may be proper to name it "Muin-ja," after the ganji (干支) of that year. The author is happy to identify and add another font to the list of Korean movable types as a result of the present study.

A Study on algorithm of SCAMIN attribute for ENCs (전자해도 SCAMIN 속성 적용 알고리즘 개발 연구)

  • Oh, Se-Woong;Park, Jong-Min;Lee, Moon-Jin
    • Journal of the Korea Institute of Information and Communication Engineering / v.14 no.11 / pp.2403-2412 / 2010
  • An ENC (Electronic Navigational Chart) is an electronic chart containing all kinds of nautical-chart information. Mariners have difficulty reading this information because of display clutter when the display scale of the ECDIS is lower than the compilation scale of the ENCs. To settle this clutter problem, the IHO included a SCAMIN improvement model in the S-65 standard. The SCAMIN model includes both simple restrictions, for objects that satisfy certain requirements and share geographic information, and hard restrictions, for objects that overlap with area objects such as depth areas, land areas, and dredged areas. It also applies SCAMIN steps to sounding values according to their importance for navigational safety. In this paper, we analyzed the restrictions of the SCAMIN model of the S-65 standard and developed an algorithm and an application that apply these restrictions reasonably and mechanically. We then applied them to Korean ENCs and evaluated their performance.
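
For illustration only, the sketch below shows one way a SCAMIN assignment of the kind described above could be expressed in code. The depth thresholds, display-scale steps, and object fields are assumptions made for this sketch; they are not taken from the paper or from the S-65 standard.

    # Hypothetical sketch of SCAMIN assignment for ENC soundings;
    # thresholds and fields are illustrative, not the S-65 values.
    from dataclasses import dataclass

    SCAMIN_STEPS = [45000, 90000, 180000, 350000, 700000]  # example display scales

    @dataclass
    class Sounding:
        depth_m: float
        covered_by_area: bool   # e.g. lies inside a dredged area or depth area
        scamin: int = 0

    def scamin_for(sounding: Sounding, compilation_scale: int) -> int:
        """Shallow soundings matter more for safe navigation, so they get a
        larger SCAMIN (stay visible longer); covered ones are dropped early."""
        if sounding.covered_by_area:          # hard restriction: overlapped by an area object
            return compilation_scale
        if sounding.depth_m < 10:
            return SCAMIN_STEPS[-1]
        if sounding.depth_m < 30:
            return SCAMIN_STEPS[-2]
        return max(compilation_scale, SCAMIN_STEPS[0])

    soundings = [Sounding(4.2, False), Sounding(55.0, False), Sounding(8.0, True)]
    for s in soundings:
        s.scamin = scamin_for(s, compilation_scale=90000)
        print(s)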

Students' Perception of Smart Learning in Distance Higher Education (스마트러닝에 대한 원격대학 학습자의 인식)

  • Choi, Hyoseon;Woo, Younghee;Jung, Hyojung
    • The Journal of the Korea Contents Association / v.13 no.10 / pp.584-593 / 2013
  • The purpose of this research is to analyze students' perceptions of smart learning, focusing on its definitions, roles, and values in distance higher education. In an online survey, 1,950 students of 'A' open university participated. The results show that the students viewed smart learning as more 'absorbing', 'interactive', and 'collaborative' than existing e-learning, as it compiles their experiences into learning. However, the respondents' perceptions of smart learning varied across age groups: more students in their 40s and 50s than in their 20s and 30s described smart learning as 'customized', 'humanlike', 'interactive', 'comfortable', 'stable', 'familiar', 'unstressful', and 'practical', and they tended to view the compilation of learner experiences as its main feature.

Application of Analytical Hierarchy Process in Analyzing the Priorities of Strategy for Improving the Army Military Foodservice (계층분석과정(AHP)을 이용한 육군 군대급식 개선과제의 실행 우선순위 분석)

  • Baek, Seung-Hee
    • Korean Journal of Community Nutrition / v.19 no.1 / pp.51-59 / 2014
  • This exploratory study presents the Analytical Hierarchy Process (AHP) as a decision-making method for obtaining the relative weights of alternatives through pairwise comparison within a hierarchical structure. The aim of the study was to elicit priority strategies for improving army military foodservice. Content analysis and seven rounds of in-depth interviews with 13 officers of the Ministry of National Defense were conducted to develop the hierarchical structure for the AHP analysis. Questionnaires were distributed to 61 foodservice managers and to 39 dietitians and military foodservice officers. The highest-ranked strategy for improving military foodservice was 'renewal of the kitchen facilities' (0.2578), followed by 'enlargement of foodservice operating staffs' (0.2345), 'specialization of sanitation & foodservice management' (0.2222), 'practical foodservice budget control' (0.1394), and 'menu variety & standardized recipe' (0.1281). 'Enlargement of foodservice facilities' (0.3995), 'increase in the number of kitchen police' (0.3463), 'sanitary & cooking training reinforcement of kitchen police' (0.4445), 'management of foodservice budget by total amount' (0.5043), and 'standardization of mass cooking' (0.3571) were the highest-weighted items within their respective strategies. The study also compared the relative weights assigned by foodservice managers with those assigned by dietitians and military foodservice officers; the two groups showed some differences in their priorities for improving army military foodservice. The results of this study would provide data for policy making and budget compilation regarding army military foodservice.
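
For readers unfamiliar with AHP, the sketch below shows how relative weights can be derived from a pairwise comparison matrix via its principal eigenvector, together with a consistency-ratio check. The 3x3 matrix is invented for illustration and is not the study's data.

    # Minimal AHP sketch: derive priority weights from a pairwise comparison matrix.
    import numpy as np

    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # random consistency indices

    def ahp_weights(matrix):
        a = np.asarray(matrix, dtype=float)
        eigvals, eigvecs = np.linalg.eig(a)
        k = np.argmax(eigvals.real)                 # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w = w / w.sum()                             # normalized priority weights
        n = a.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)        # consistency index
        cr = ci / RI[n] if RI.get(n) else 0.0       # consistency ratio (< 0.1 is acceptable)
        return w, cr

    # Example: 3 criteria compared pairwise on Saaty's 1-9 scale (invented values).
    pairwise = [[1,   3,   5],
                [1/3, 1,   2],
                [1/5, 1/2, 1]]
    weights, cr = ahp_weights(pairwise)
    print(weights, cr)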

Teachers' Opinions on Differences of Detail Learning Content According to High School Textbooks - Focused on Utilizing of Definite Integral - (고등학교 교과서 내용 영역별 세부 학습내용 차이에 대한 교사 의견 조사: 정적분의 활용을 중심으로)

  • Yang, Seong Hyun
    • School Mathematics / v.17 no.4 / pp.555-570 / 2015
  • The general and subject-specific guidelines for the new curriculum have been confirmed and are planned to be announced in September 2015, and the development of instructional materials based on them will follow. Because textbooks determine and control the direction and scope of teaching and learning, they play a very important role where teachers and students meet, and their structure therefore has a significant effect. In this study, we analyzed differences in detailed learning content across high school textbooks, focusing on applications of the definite integral. A questionnaire based on this analysis was administered to 369 high school mathematics teachers from 14 city and provincial offices of education, selected by accidental sampling. Analyzing the survey results, we examined the various teaching and learning situations that arise from differences in detailed learning content across textbooks and explored their impact on students' mathematics learning. Through this, we intend to offer implications for structuring detailed learning content in textbooks and to derive improvements to the textbook compilation system.

Design of Metadata and Development of System for Managing Connection Information of Digital Contents (디지털 콘텐츠 연관 정보 관리를 위한 메타데이터 설계 및 시스템 개발)

  • Kim, Jae-In;Kim, Dae-In;Song, Myung-Jin;Han, Dae-Young;Hwang, Bu-Hyun
    • The Journal of the Korea Contents Association / v.9 no.7 / pp.27-36 / 2009
  • Advances in communication technology and the popularity of networks have increased the demand for digital contents, and a large number of creative digital contents are being produced. Depending on the time of compilation, the production of the raw data, and the manner of production, the format of digital contents can vary, and a great deal of connection information about digital contents can exist. Although many metadata standards exist for digital contents, they do not consider how to express the connection relations among digital contents. In this paper, we propose new metadata for expressing the connection information of digital contents. The metadata is compatible with the Dublin Core metadata, an international standard for digital contents, and can express various direct and indirect connection relations among digital contents by extending the relation element of Dublin Core. We also developed a system for managing the connection information of digital contents based on this metadata, and the system can therefore provide more useful information.
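
As a rough illustration of extending the Dublin Core relation element, the sketch below represents a record whose relation entries carry a qualifier distinguishing direct from indirect connections. The refinement names follow standard DCMI terms, but the record layout and identifiers are hypothetical and are not the paper's schema.

    # Hypothetical sketch of a Dublin Core-compatible record whose "relation"
    # element is extended to express direct/indirect connection information.
    record = {
        "dc:title": "Lecture video: Introduction to Metadata",
        "dc:creator": "Example Author",
        "dc:format": "video/mp4",
        "dc:date": "2009-07-01",
        # Standard Dublin Core keeps a flat "relation"; this extension adds a
        # typed list so direct and indirect connections can both be expressed.
        "dc:relation": [
            {"type": "isVersionOf", "target": "urn:example:content:001", "link": "direct"},
            {"type": "isPartOf",    "target": "urn:example:series:07",   "link": "direct"},
            {"type": "references",  "target": "urn:example:content:042", "link": "indirect"},
        ],
    }

    def related(rec, link_kind):
        """Return identifiers of contents connected with the given link kind."""
        return [r["target"] for r in rec["dc:relation"] if r["link"] == link_kind]

    print(related(record, "direct"))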

A Study on Loose Laboratory Reports in A Hospital (일개(K) 병원의 누락 조직검사결과지에 관한 조사연구)

  • Yoo, Yeon-Soon;Ha, Eun-Hee
    • Quality Improvement in Health Care / v.2 no.2 / pp.46-54 / 1996
  • Background: The medical record is a compilation of pertinent facts about a patient's life and health history, including past and present illnesses and treatment, written by the health professionals contributing to that patient's care. It is also a permanent legal document that must contain sufficient information to identify the patient, justify the diagnosis and treatment, and record the results; as such, it must be accurate and complete. We therefore analyzed one kind of incomplete record, the loose (missing) laboratory report. Methods: During a one-year period (January to December 1988), a medical record practitioner examined and analyzed the laboratory reports at K Hospital in Seoul. A total of 320 loose laboratory reports were found among 3,818 laboratory reports of admitted patients. A medical record practitioner and a physician then reviewed and analyzed the influencing factors from clinical and laboratory perspectives. Result: By department, the proportion of loose reports was highest in obstetrics (40.4%), but the highest loose rate was in pediatrics (25.0%). Most omissions occurred in the operating room (80.3%) rather than the OPD (19.7%). Changes of diagnosis depended on the turnaround time of the laboratory test and were more frequent in cancer patients. Conclusion: Regular analysis of the documentation in the medical record ensures that it fulfills its purpose of communicating patient-care information, serves as evidence of the patient's course of illness and treatment for legal, reimbursement, and peer-review purposes, and is a very important aspect of quality assurance in medical activities.

Improvement of Iterative Algorithm for Live Variable Analysis based on Computation Reordering (사용할 변수의 예측에 사용되는 반복적 알고리즘의 계산순서 재정렬을 통한 수행 속도 개선)

  • Yun Jeong-Han;Han Taisook
    • Journal of KIISE: Software and Applications / v.32 no.8 / pp.795-807 / 2005
  • The classical approaches to computing Live Variable Analysis (LVA) use iterative algorithms over the entire program based on the Data Flow Analysis (DFA) framework. In the case of the Zephyr compiler, the average execution time of LVA takes 7% of the compilation time for the benchmark programs. The classical LVA algorithm leaves room for improvement: the iterative algorithm scans useless basic blocks and repeatedly recomputes large sets of variables. We propose an improvement of the iterative algorithm for LVA based on the upward movement of used variables. Our algorithm produces the same result as the previous iterative algorithm and is based on use-def chains. Reordering the application of the flow equations in DFA reduces the number of visited basic blocks and of redundant flow-equation evaluations, which improves overall processing time. Experimental results show that our algorithm can reduce 36.4% of LVA execution time and 2.6% of overall computation time in the Zephyr compiler on the benchmark programs.
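
For reference, the sketch below implements the classical iterative LVA that the paper takes as its baseline; the proposed reordering itself is not reproduced here, only the fixed-point iteration whose block-visit order it rearranges. The example CFG and use/def sets are invented.

    # Classical iterative live-variable analysis (the baseline the paper improves).
    # Liveness equations, iterated to a fixed point:
    #   in[B]  = use[B] ∪ (out[B] - def[B])
    #   out[B] = ∪ in[S] over successors S of B

    def live_variables(blocks, succ, use, defs):
        live_in = {b: set() for b in blocks}
        live_out = {b: set() for b in blocks}
        changed = True
        while changed:                       # iterate until no set changes
            changed = False
            for b in blocks:                 # (the paper reorders this traversal)
                new_out = set().union(*(live_in[s] for s in succ[b]))
                new_in = use[b] | (new_out - defs[b])
                if new_out != live_out[b] or new_in != live_in[b]:
                    live_out[b], live_in[b] = new_out, new_in
                    changed = True
        return live_in, live_out

    # Tiny example CFG: B1 -> B2 -> B3, with B2 also looping back to itself.
    blocks = ["B1", "B2", "B3"]
    succ = {"B1": ["B2"], "B2": ["B2", "B3"], "B3": []}
    use  = {"B1": {"a"}, "B2": {"a", "b"}, "B3": {"c"}}
    defs = {"B1": {"b"}, "B2": {"c"}, "B3": set()}
    print(live_variables(blocks, succ, use, defs))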

Code Optimization Using Pattern Table (패턴 테이블을 이용한 코드 최적화)

  • Yun Sung-Lim;Oh Se-Man
    • Journal of Korea Multimedia Society / v.8 no.11 / pp.1556-1564 / 2005
  • Various optimization techniques are applied during the compilation of a source program to improve its execution speed and reduce code size. Among pattern-matching optimization techniques, string pattern matching finds an optimal pattern corresponding to the intermediate code, but it is inefficient because of the excessive time required to search for the optimal pattern. Tree pattern matching can involve many redundant comparisons when determining a pattern and also has the disadvantage of the high cost of constructing a code tree. The objective of this paper is to propose a table-driven code optimizer using a DFA (Deterministic Finite Automata) optimization table to overcome the shortcomings of existing optimization techniques. Unlike other techniques, this is an efficient way to implement an optimizer: the deterministic automaton determines the final pattern, reducing the pattern-selection cost and expediting the pattern search.
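
The sketch below illustrates the table-driven idea in a simplified form: intermediate-code patterns are looked up in a table keyed by opcode sequences, so the replacement is chosen without backtracking search. The opcodes, patterns, and replacements are hypothetical and are not the authors' DFA or table.

    # Hypothetical sketch of table-driven peephole optimization: a pattern table
    # keyed by opcode sequences maps directly to replacement code; the paper's
    # DFA is approximated here by a simple longest-prefix lookup.

    PATTERN_TABLE = {
        ("PUSH", "POP"):   lambda i1, i2: [("MOVE", i2[1], i1[1])],            # push/pop -> move
        ("LOAD", "STORE"): lambda i1, i2: [i1] if i1[1] == i2[1] else [i1, i2],  # drop redundant store
        ("ADD",):          lambda i1: [] if i1[2] == 0 else [i1],               # add of zero
    }
    MAX_PATTERN = max(len(p) for p in PATTERN_TABLE)

    def optimize(code):
        out, i = [], 0
        while i < len(code):
            for length in range(min(MAX_PATTERN, len(code) - i), 0, -1):
                window = code[i:i + length]
                key = tuple(instr[0] for instr in window)
                if key in PATTERN_TABLE:
                    out.extend(PATTERN_TABLE[key](*window))
                    i += length
                    break
            else:                       # no pattern matched: keep the instruction
                out.append(code[i])
                i += 1
        return out

    code = [("PUSH", "r1"), ("POP", "r2"), ("ADD", "r2", 0), ("LOAD", "x"), ("STORE", "x")]
    print(optimize(code))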

Design and Implementation of Web Compiler for Learning of Artificial Intelligence (인공지능 학습을 위한 웹 컴파일러 설계 및 구현)

  • Park, Jin-tae;Kim, Hyun-gook;Moon, Il-young
    • Journal of Advanced Navigation Technology / v.21 no.6 / pp.674-679 / 2017
  • With the growing importance of the Fourth Industrial Revolution and ICT technology, society has become software-centered. Existing software education was limited by the need to set up a learning environment, which incurred considerable cost early on. To solve these problems, learning methods using web compilers were developed: a web compiler supports various programming languages and shows compilation results to the user via the web. However, web compilers that support artificial intelligence technology have been lacking. In this paper, we designed and implemented a web compiler based on TensorFlow, Google's artificial intelligence library. We implemented a system for learning artificial intelligence by building a MeteorJS-based web server and running TensorFlow, TensorFlow Serving, and Python Jupyter on a Node.js-based server. We expect that it can be used as a tool for learning artificial intelligence in a software-centered society.
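
As a hint of how one component of such a system might be wired up, the sketch below sends a prediction request to TensorFlow Serving's REST API, which a web-compiler backend could call after a model has been trained and exported. The host, port, and model name are assumptions for this sketch, not details from the paper.

    # Minimal sketch of querying a TensorFlow Serving REST endpoint.
    # Host, port, and model name ("mnist") are assumptions, not from the paper.
    import json
    import urllib.request

    SERVING_URL = "http://localhost:8501/v1/models/mnist:predict"  # assumed deployment

    def predict(instances):
        """Send input instances to TensorFlow Serving and return its predictions."""
        body = json.dumps({"instances": instances}).encode("utf-8")
        req = urllib.request.Request(
            SERVING_URL, data=body, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["predictions"]

    if __name__ == "__main__":
        # A single flattened 28x28 image of zeros, purely as a placeholder input.
        print(predict([[0.0] * 784]))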