• Title/Summary/Keyword: Composite number


3-Level Response Surface Design by Using Expanded Spherical Experimental Region (확장된 구형설계를 이용한 반응표면설계)

  • Kim, Ha-Yan;Lee, Woo-Sun
    • The Korean Journal of Applied Statistics / v.25 no.1 / pp.215-223 / 2012
  • Response surface methodology (RSM) is a very useful statistical technique for improving and optimizing product processes. For this reason, RSM has been used extensively in industry, particularly where several process variables potentially influence some quality characteristic of the product. To estimate the optimal setting of these variables, an experiment is conducted over an appropriately defined experimental region. This region, however, can vary with the experimental circumstances and the researcher's choice. Response surface designs can be classified, according to the shape of the experimental region, into spherical and cuboidal designs. In the spherical case, the design is either rotatable or very nearly rotatable. The central composite design (CCD) widely used in RSM is an example of a 5-level spherical design. The cuboidal CCD (a CCD with ${\alpha}=1$) is appropriate when the experimental region is cuboidal, but it does not satisfy rotatability because it is not spherical. In practice, 3-level spherical designs are often required in industry, where experiments at many levels are not feasible. Box-Behnken designs (BBDs) are the most popular 3-level spherical designs for fitting second-order response surfaces and satisfy rotatability, but their experimental region does not vary with the number of variables. A new design with an expanded experimental region can be considered when predicting the response at the extremes is of interest. This paper proposes a new 3-level spherical response surface design constructed so that the experimental region expands with the number of product variables.
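As a concrete point of reference for the designs compared in this abstract, the following is a minimal Python sketch (not from the paper) that builds a standard central composite design matrix; ${\alpha}=1$ gives the cuboidal CCD and ${\alpha}=\sqrt{k}$ the spherical one. The paper's proposed expanded 3-level design is not reproduced here.

```python
import itertools
import numpy as np

def central_composite(k, alpha=None, n_center=1):
    """Design matrix of a central composite design for k factors:
    2^k factorial points at +/-1, 2k axial points at +/-alpha, and
    n_center center runs. alpha = sqrt(k) places every non-center point
    on one sphere (spherical CCD); alpha = 1 gives the cuboidal,
    face-centered CCD mentioned in the abstract."""
    if alpha is None:
        alpha = np.sqrt(k)                      # spherical choice
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

# 3 factors: 8 factorial + 6 axial + 1 center = 15 runs,
# with 5 distinct levels per factor (-alpha, -1, 0, +1, +alpha)
print(central_composite(3))
```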

The κ-Fermat's Integer Factorization Algorithm (κ-페르마 소인수분해 알고리즘)

  • Choi, Myeong-Bok;Lee, Sang-Un
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.11 no.4 / pp.157-164 / 2011
  • Factoring a composite number is a very hard problem. Most integer factorization algorithms search, using a factor base $B$, for a pair ($a,b$) that forms a congruence of squares ($a^2{\equiv}b^2$ (mod $n$)) and then obtain $p=GCD(a-b,n)$ and $q=GCD(a+b,n)$ with the Euclidean greatest common divisor, based on the identity $a^2-b^2=(a-b)(a+b)$. The efficiency of these algorithms hinges on finding ($a,b$). Fermat's algorithm, the basis of the congruence-of-squares approach, searches for $a^2-b^2=n$. This paper proposes a method that instead finds $a^2-b^2=kn$, ($k=1,2,{\cdots}$). It assumes that the last digit $b_1$ of $b$ is 0 or 5, so that $b$ is a multiple of 5. First, the proposed method decides $k$ by finding a $kn$ whose last two digits $n_2n_1$ are compatible with $b_1=0$ or $b_1=5$. Second, it decides the last two digits $a_2a_1$ of $a$ that satisfy $a^2-b^2=kn$. Third, it determines ($a,b$) from $a^2-b^2=kn$ over the candidates $a_2a_1$, restricting $a$ to $\sqrt{kn}$ < $a$ < $\sqrt{(k+1)n}$, i.e., $kn$ < $a^2$ < $(k+1)n$. The proposed algorithm is much more effective than the conventional Fermat algorithm.
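A minimal Python sketch of the congruence-of-squares idea outlined above, searching $a^2-b^2=kn$ for successive $k$; the paper's last-digit filtering on $a$ and $b$ (the actual source of its speed-up) is omitted, so this only illustrates the underlying identity.

```python
from math import isqrt, gcd

def fermat_kn(n, k_max=20):
    """Search for a^2 - b^2 = k*n (k = 1, 2, ...), then split n via gcd.
    Plain sketch of the congruence-of-squares step; no digit filtering."""
    for k in range(1, k_max + 1):
        kn = k * n
        a = isqrt(kn)
        if a * a < kn:
            a += 1
        # Keep a within k*n <= a^2 < (k+1)*n, as described in the abstract.
        while a * a < (k + 1) * n:
            b2 = a * a - kn
            b = isqrt(b2)
            if b * b == b2:                      # a^2 - b^2 = k*n found
                p, q = gcd(a - b, n), gcd(a + b, n)
                if 1 < p < n:
                    return p, q
            a += 1
    return None

# Example: 10403 = 101 * 103
print(fermat_kn(10403))
```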

Monitoring on Alcohol Fermentation Properties of Red Ginseng Extracts. (홍삼액의 알코올 발효 특성 모니터링)

  • Kim, Seong-Ho;Kang, Bok-Hee;Noh, Sang-Gyun;Kim, Jong-Guk;Lee, Sang-Han;Lee, Jin-Man
    • Journal of Life Science / v.18 no.4 / pp.550-555 / 2008
  • This study examined the alcohol fermentation properties of red ginseng extracts using Saccharomyces cerevisiae JF-Y3. A central composite design was employed to investigate the influence of red ginseng extract content ($10{\sim}50%$, v/v) and yeast extract content ($0.5{\sim}2.5%$, w/v) on alcohol fermentation in media supplemented with red ginseng extract. Yeast cell growth was affected by both red ginseng extract content and yeast extract content, and red ginseng extract content had a greater effect on yeast cell number than yeast extract content. Yeast cell number increased as red ginseng extract content decreased and as yeast extract content increased. Alcohol content was maximal at 30% red ginseng extract and 0.50% yeast extract, with a predicted maximum alcohol content of 12.45%. Brix degree and total sugar content were significant at the 1% level (p<0.01); Brix degree was affected by both red ginseng extract and yeast extract content, while total sugar content was significantly affected by red ginseng content.

Hangul Bitmap Data Compression Embedded in TrueType Font (트루타입 폰트에 내장된 한글 비트맵 데이타의 압축)

  • Han Joo-Hyun;Jeong Geun-Ho;Choi Jae-Young
    • Journal of KIISE:Software and Applications / v.33 no.6 / pp.580-587 / 2006
  • As PDAs, IMT-2000 handsets, and e-Book readers have become widespread, the number of users of these products has been increasing. However, the available memory of these devices is still much smaller than that of desktop PCs. Demand for TrueType fonts on such products has grown because more users want good-quality fonts, and TrueType fonts are widely used in Windows CE products. However, TrueType fonts occupy a large portion of the available memory, considering the small memory sizes of mobile devices, so it is necessary to reduce their size. In this paper, a two-phase compression technique is presented to reduce the size of the Hangul bitmap data embedded in TrueType fonts. In the first phase, each character bitmap is divided into an initial consonant, a medial vowel, and a final consonant, and the character is then recomposed as a composite bitmap. In the second phase, if any two consonants or vowels are found to be identical, one of them is removed. The TrueType embedded bitmaps in Hangul Wanseong (pre-composed) and Hangul Johab (pre-combined) fonts are used for compression. With our technique, the embedded bitmap data are reduced by around 35% for the Wanseong font and 7% for the Johab font; overall, the total TrueType Wanseong font is reduced by about 9.26%.
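The first phase relies on the standard decomposition of a precomposed Hangul syllable into its initial consonant, medial vowel, and final consonant. A minimal Python sketch of that code-point arithmetic (the paper applies the same decomposition to glyph bitmaps, which is not shown here):

```python
def decompose_hangul(ch):
    """Split a precomposed Hangul syllable (U+AC00..U+D7A3) into
    (initial, medial, final) jamo indices: 19 initial consonants,
    21 medial vowels, and 28 final slots (index 0 = no final)."""
    code = ord(ch) - 0xAC00
    if not 0 <= code <= 11171:
        raise ValueError("not a precomposed Hangul syllable")
    initial, rest = divmod(code, 21 * 28)
    medial, final = divmod(rest, 28)
    return initial, medial, final

# Example: '한' decomposes to initial 18 (ㅎ), medial 0 (ㅏ), final 4 (ㄴ)
print(decompose_hangul('한'))
```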

A Composite Study on the Writing Characteristics of Korean Learners - Focused on Syntax Production, Syntax Complexity and Syntax Errors (한국어 학습자의 쓰기 특성에 관한 융복합적 연구 - 구문산출성, 구문복잡성 및 구문오류를 중심으로)

  • Lee, MI Kyung;Noh, Byungho
    • Journal of the Korea Convergence Society / v.9 no.11 / pp.315-324 / 2018
  • For Korean learners, writing is harder than any other area of the Korean language. Yet the ability to organize and write systematically is essential for Korean language learners to take classes, complete assignments and presentations at school, and later adapt to job situations, so a direction for instruction needs to be devised. In general, writing characteristics are examined in several ways, including writing productivity, writing complexity, and writing errors. Accordingly, this study provided drawings and A4 paper to Vietnamese learners of Korean, Chinese learners of Korean, and Korean university students and asked them to write freely. Based on their writing, we examined syntactic productivity (total C-units and total number of words), syntactic complexity (words per C-unit and clause density), and writing errors (postposition errors, spelling errors, connective-suffix errors, and spacing errors). Vietnamese and Chinese learners of Korean showed significantly lower syntactic productivity and complexity than Korean university students, and showed more writing errors than the Korean students in postposition use and clause density. Based on these results, we discuss writing instruction guidelines for Korean language learners. However, this study did not examine differences in writing characteristics according to the participants' Korean proficiency level or length of residence, which should be considered in future research.
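For reference, the syntactic measures named in this abstract are simple ratios once a text has been segmented. A minimal, hypothetical Python sketch (the segmentation into C-units and clauses, which is the hard part, is assumed to be done already and is not part of the paper's materials):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CUnit:
    words: int     # number of words in this C-unit
    clauses: int   # number of clauses in this C-unit

def syntactic_measures(c_units: List[CUnit]):
    """Productivity/complexity measures of the kind used in the study:
    total C-units, total words, words per C-unit, and clause density
    (clauses per C-unit). Assumes C-units and clauses are pre-segmented."""
    total_c_units = len(c_units)
    total_words = sum(u.words for u in c_units)
    words_per_cunit = total_words / total_c_units
    clause_density = sum(u.clauses for u in c_units) / total_c_units
    return total_c_units, total_words, words_per_cunit, clause_density

# Hypothetical sample of three C-units
print(syntactic_measures([CUnit(5, 1), CUnit(9, 2), CUnit(7, 2)]))
```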

Application of Laser Surface Treatment Technique for Adhesive Bonding of Carbon Fiber Reinforced Composites (탄소복합재 접착공정을 위한 CFRP의 레이저 표면처리 기법의 적용)

  • Hwang, Mun-Young;Kang, Lae-Hyong;Huh, Mongyoung
    • Composites Research / v.33 no.6 / pp.371-376 / 2020
  • Adhesive strength can be improved through surface treatment, most commonly by modifying the surface condition to improve physical bonding. This study presents the effect of laser surface treatment on the adhesive strength of CFRP. The surface roughness was patterned using a 1064 nm laser, and the effects of the number of laser shots and of the direction and length of the pattern on the adhesion of a CFRP/CFRP single-lap joint were investigated through tensile tests. Tests according to ASTM D5868 were performed, and the bonding mechanism was determined by analyzing the damaged surface after fracture. An optimized number of laser shots and an optimized roughness depth are required to increase the bonding strength of the CFRP surface. Considering the shear stress in the tensile direction, a roughness pattern at 45°, which lengthens the fracture path in the adhesive layer, increased the adhesive strength. Laser surface treatment of the bonding surface is therefore a suitable method for obtaining a mechanical bonding mechanism and improving the bonding strength of CFRP bonded joints. Further study of the optimal laser process parameters is required to fully exploit the benefits of laser surface processing.

Multivessel Coronary Revascularization with Composite LITA-RA Y Graft (좌내흉동맥-요골동맥 복합이식편을 이용한 다중혈관 관상동맥우회술)

  • Lee Sub;Ko Mgo-Sung;Park Ki-Sung;Ryu Jae-Kean;Jang Jae-Suk;Kwon Oh-Choon
    • Journal of Chest Surgery / v.39 no.5 s.262 / pp.359-365 / 2006
  • Background: Arterial grafts have been used to achieve better long-term results in coronary revascularization. Bilateral internal thoracic artery (ITA) grafting gives better results, but it may not be usable in some situations such as diabetes and chronic obstructive pulmonary disease (COPD). We evaluated the clinical and angiographic results of the composite left internal thoracic artery-radial artery (LITA-RA) Y graft. Material and Method: Between April 2002 and September 2004, 119 patients who received a composite Y graft for coronary bypass surgery were enrolled. The mean age was $62.6{\pm}8.8$ years and 34.5% were female. Preoperative cardiac risk factors were as follows: hypertension 43.7%, diabetes 33.6%, smoking 41.2%, and hyperlipidemia 22.7%. There were 14 emergency operations, 6 cases of cardiogenic shock, 17 patients with a left ventricular ejection fraction (LVEF) of less than 40%, and 17 cases of left main disease. Coronary angiography was performed in 35 patients before hospital discharge. Result: The number of distal anastomoses was $3.1{\pm}0.91$, and three patients (2.52%) died during the hospital stay. Off-pump coronary artery bypass (OPCAB) was applied in 79 patients (66.4%). The LITA was anastomosed to the left anterior descending system except in three cases, in which it was anastomosed to the lateral wall. The radial Y grafts were anastomosed to diagonal branches (4), the ramus intermedius (21), obtuse marginal branches (109), posterolateral branches (12), and the posterior descending coronary artery (8). Postoperative coronary angiography in 35 patients showed excellent patency rates (LITA 100% and RA 88.5%; 3 RA grafts anastomosed to coronary arteries with <70% stenosis showed a string sign with competitive flow). Conclusion: The LITA-RA Y composite graft provided good early clinical and angiographic results in multivessel coronary revascularization, but it should be used cautiously in selected patients.

A Study on the Development Trend of Artificial Intelligence Using Text Mining Technique: Focused on Open Source Software Projects on Github (텍스트 마이닝 기법을 활용한 인공지능 기술개발 동향 분석 연구: 깃허브 상의 오픈 소스 소프트웨어 프로젝트를 대상으로)

  • Chong, JiSeon;Kim, Dongsung;Lee, Hong Joo;Kim, Jong Woo
    • Journal of Intelligence and Information Systems / v.25 no.1 / pp.1-19 / 2019
  • Artificial intelligence (AI) is one of the main driving forces of the Fourth Industrial Revolution. Technologies associated with AI have already shown abilities equal to or better than those of people in many fields, including image and speech recognition. In particular, many efforts have been made to identify current technology trends and analyze their development directions, because AI technologies can be utilized in a wide range of fields including medicine, finance, manufacturing, services, and education. Major platforms for developing complex AI algorithms for learning, reasoning, and recognition have been opened to the public as open source projects, and technologies and services that utilize them have increased rapidly; this has been confirmed as one of the major reasons for the fast development of AI technologies. Additionally, the spread of the technology owes much to open source software, developed by major global companies, supporting natural language recognition, speech recognition, and image recognition. Therefore, this study aimed to identify the practical trend of AI technology development by analyzing open source software (OSS) projects associated with AI, which have been developed through the online collaboration of many parties. This study searched and collected a list of major projects related to AI generated from 2000 to July 2018 on Github, and confirmed the development trends of major technologies in detail by applying a text mining technique to the topic information that characterizes the collected projects and their technical fields. The analysis showed that the number of software development projects per year was less than 100 until 2013, but it increased to 229 projects in 2014 and 597 projects in 2015. The number of open source projects related to AI then increased rapidly in 2016 (2,559 OSS projects). The number of projects initiated in 2017 was 14,213, nearly four times the total number of projects generated from 2009 to 2016 (3,555 projects), and 8,737 projects were initiated from January to July 2018. The development trend of AI-related technologies was evaluated by dividing the study period into three phases, with the appearance frequency of topics indicating the technology trends of AI-related OSS projects. Natural language processing remained at the top in all years, implying that such OSS has been developed continuously. Until 2015 the programming languages Python, C++, and Java were among the top ten most frequent topics, but after 2016 programming languages other than Python disappeared from the top ten; instead, platforms supporting the development of AI algorithms, such as TensorFlow and Keras, show high appearance frequency. Reinforcement learning algorithms and convolutional neural networks, which are used in various fields, were also frequent topics. Topic network analysis showed that the most important topics by degree centrality were similar to those by appearance frequency; the main difference was that visualization and medical imaging topics appeared at the top of the list, although they had not from 2009 to 2012, indicating that OSS was being developed in the medical field to utilize AI technology. Moreover, although computer vision was in the top 10 of the appearance frequency list from 2013 to 2015, it was not in the top 10 by degree centrality. The topics at the top of the degree centrality list were otherwise similar to those at the top of the appearance frequency list, with the ranks of the composite neural network and reinforcement learning changing slightly. The trend of technology development was examined using the appearance frequency of topics and degree centrality. Machine learning showed the highest frequency and the highest degree centrality in all years. It is also noteworthy that, although the deep learning topic had a low frequency and a low degree centrality between 2009 and 2012, its rank rose abruptly between 2013 and 2015, and in recent years both technologies have had high appearance frequency and degree centrality. TensorFlow first appeared during the 2013-2015 phase, and its appearance frequency and degree centrality soared between 2016 and 2018, placing it at the top of the lists after deep learning and Python. Computer vision and reinforcement learning did not show an abrupt increase or decrease, and they had relatively low appearance frequency and degree centrality compared with the above-mentioned topics. Based on these results, it is possible to identify the fields in which AI technologies are actively being developed, and the results of this study can be used as a baseline dataset for more empirical analysis of future technology trends and their convergence.
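The topic-network measure used above, degree centrality, is straightforward to compute once topics and their co-occurrence edges are extracted. A minimal sketch with networkx, using a few illustrative topics rather than the paper's actual data:

```python
import networkx as nx

# Nodes are topics; edges connect topics that co-occur in the same project.
# The topics and edges below are illustrative only.
G = nx.Graph()
G.add_edges_from([
    ("machine-learning", "deep-learning"),
    ("deep-learning", "tensorflow"),
    ("deep-learning", "computer-vision"),
    ("machine-learning", "python"),
    ("tensorflow", "python"),
])

# Degree centrality: fraction of other nodes each topic is connected to.
centrality = nx.degree_centrality(G)
for topic, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{topic:20s} {score:.2f}")
```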

Availability of the Skeletonized Gastroepiploic Artery as a Free Graft for Coronary Artery Bypass Grafting (관상동맥 우회로 조성술에 있어 유리 이식편으로 사용된 골격화 우위대망 동맥의 효용성)

  • Ryu Sang-Wan;Ahn Byong-Hee;Hong Seong-Beom;Song Sang-Yun;Jung In-Suk;Beom Min-Sun;Park Jung-Min;Lee Kyo-Sun;Ryu Sang-Woo;Yoon Ju-Sik;Kim Sang-Hyung
    • Journal of Chest Surgery / v.38 no.9 s.254 / pp.601-608 / 2005
  • Background: To maximize the histological advantages and minimize the physiological disadvantages, we have been using the skeletonized gastroepiploic artery (GEA) as a free graft for total arterial revascularization. The aim of the current study was to assess the efficacy of the skeletonized GEA as a composite or extended graft for total arterial revascularization. Material and Method: Between January 2000 and February 2005, 133 patients (43 female, mean age 61.8 years) undergoing coronary artery bypass grafting (CABG) with a skeletonized GEA as a free graft (22 extended, 107 composite, and 4 others) were enrolled in this study. Coronary angiograms were performed in the immediate (median 44 days, n=86), early (median 366 days, n=56), and midterm (median 984 days, n=29) postoperative periods. Result: There were 3 ($2.2\%$) early and 4 ($3.3\%$) late cardiac-related deaths. The mean number of distal anastomoses per patient was 3.34 for all grafts and 1.92 for the GEA graft. The immediate, early, and midterm GEA patency rates were 157/159 ($98.7\%$), 106/142 ($94.6\%$), and 53/56 ($94.6\%$), respectively. During follow-up, four patients required percutaneous coronary intervention because of GEA and target coronary artery stenosis or competitive flow. Conclusion: These data demonstrate satisfactory clinical and angiographic results for the skeletonized GEA as a free graft for total arterial revascularization. Although careful longer follow-up is needed, the skeletonized GEA as a free graft will be a valuable option for CABG.

Evaluation of the Measurement Uncertainty from the Standard Operating Procedures(SOP) of the National Environmental Specimen Bank (국가환경시료은행 생태계 대표시료의 채취 및 분석 표준운영절차에 대한 단계별 측정불확도 평가 연구)

  • Lee, Jongchun;Lee, Jangho;Park, Jong-Hyouk;Lee, Eugene;Shim, Kyuyoung;Kim, Taekyu;Han, Areum;Kim, Myungjin
    • Journal of Environmental Impact Assessment / v.24 no.6 / pp.607-618 / 2015
  • Five years have passed since the first set of environmental samples was taken in 2011 to represent various ecosystems and allow future generations to look back at the past environment. Those samples have been preserved cryogenically in the National Environmental Specimen Bank (NESB) at the National Institute of Environmental Research. Although a strict standard operating procedure (SOP) governs the whole sampling process to ensure that each sample represents its sampling area, the procedure has not been put to the test for validation, and this question needs to be answered to clear any doubts about the representativeness and quality of the samples. To address the question and examine the sampling practice set out in the SOP, the steps leading to a measurement, from sampling in the field to chemical analysis in the laboratory, were broken down to evaluate the uncertainty at each level. Of the 8 species currently collected for cryogenic preservation in the NESB, pine tree samples from two different sites were selected for this study. Duplicate samples were taken from each site according to the sampling protocol, followed by duplicate analyses of each discrete sample. The uncertainties were evaluated by robust ANOVA; the two components of uncertainty, one from the sampling practice and the other from the analytical process, were then combined to give the measurement uncertainty of a measured concentration of the measurand. As a result, it was confirmed that the sampling practice, not the analytical process, accounts for most of the measurement uncertainty. Based on this top-down approach to measurement uncertainty, the efficient way to ensure the representativeness of the sample is to increase the quantity of each discrete sample making up a composite sample rather than to increase the number of discrete samples across the site. Furthermore, the cost-effective way to raise confidence in the measurement is to lower the sampling uncertainty, not the analytical uncertainty. To test the representativeness of a composite sample of a sampling area, the variance across the site should be less than the variance from duplicate sampling; for that, the criterion $s^2_{geochem}$ (across-site variance) < $s^2_{samp}$ (variance at the sampling location) was proposed. In light of this criterion, the two representative samples for the two study areas passed the requirement. In contrast, whenever the variance among the sampling locations (i.e., across the site) is larger than the sampling variance, more sampling increments need to be added within the sampling area until the requirement for representativeness is met.
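The variance decomposition behind this top-down approach can be illustrated with the balanced duplicate design described above (duplicate samples per location, duplicate analyses per sample). A minimal Python sketch using classical ANOVA; the study itself uses robust ANOVA, and the data shape and numbers below are purely illustrative:

```python
import numpy as np

def duplicate_anova(x):
    """Classical (non-robust) variance decomposition for the balanced
    duplicate design: x has shape (locations, 2 duplicate samples,
    2 duplicate analyses). Returns estimates of the analytical, sampling,
    and between-location ("geochemical") variances."""
    x = np.asarray(x, dtype=float)
    # Analytical variance from the analysis duplicates of each sample
    s2_anal = np.mean((x[:, :, 0] - x[:, :, 1]) ** 2) / 2.0
    # Sampling variance from the two samples taken at each location
    sample_means = x.mean(axis=2)                       # (locations, 2)
    d = sample_means[:, 0] - sample_means[:, 1]
    s2_samp = np.mean(d ** 2) / 2.0 - s2_anal / 2.0
    # Between-location variance from the location means
    loc_means = sample_means.mean(axis=1)
    s2_geochem = np.var(loc_means, ddof=1) - s2_samp / 2.0 - s2_anal / 4.0
    return s2_anal, max(s2_samp, 0.0), max(s2_geochem, 0.0)

# Hypothetical duplicate design: 8 locations x 2 samples x 2 analyses
rng = np.random.default_rng(0)
data = (50
        + rng.normal(0, 3.0, (8, 1, 1))     # between-location variation
        + rng.normal(0, 2.0, (8, 2, 1))     # sampling variation
        + rng.normal(0, 0.5, (8, 2, 2)))    # analytical variation
print(duplicate_anova(data))
```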