• Title/Summary/Keyword: technical standards


A Study of Six Sigma and Total Error Allowable in Chematology Laboratory (6 시그마와 총 오차 허용범위의 개발에 대한 연구)

  • Chang, Sang-Wu;Kim, Nam-Yong;Choi, Ho-Sung;Kim, Yong-Whan;Chu, Kyung-Bok;Jung, Hae-Jin;Park, Byong-Ok
    • Korean Journal of Clinical Laboratory Science
    • /
    • v.37 no.2
    • /
    • pp.65-70
    • /
    • 2005
  • The specifications of the CLIA analytical tolerance limits are consistent with the performance goals of Six Sigma quality management. Six sigma analysis determines performance quality from bias and precision statistics, and shows whether a method meets the criteria for six sigma performance. Performance standards calculate allowable total error from several different criteria. Six sigma means six standard deviations from the target or mean value, corresponding to about 3.4 failures per million opportunities. The Sigma Quality Level is an indicator of process centering and process variation relative to the allowable total error. The tolerance specification is replaced by a total error specification, which is a common form of quality specification for a laboratory test. The CLIA criteria for acceptable performance in proficiency testing events are given in the form of an allowable total error, TEa; thus there is a published list of TEa specifications for regulated analytes. In terms of TEa, Six Sigma quality management sets a precision goal of TEa/6 and an accuracy goal of 1.5(TEa/6). This concept is based on the proficiency testing specification of target value +/-3s, with TEa derived from reference intervals, biological variation, and peer group median surveys. We found rules to calculate TEa as a fraction of a reference interval and from peer group median surveys. We developed allowable total error from peer group survey results and the US CLIA '88 rules for 19 clinical chemistry tests (TP, ALB, T.B, ALP, AST, ALT, CL, LD, K, Na, CRE, BUN, T.C, GLU, GGT, CA, phosphorus, UA, and TG), with the following results.
Sigma levels versus TEa, based on the peer group median CV of each item, were assessed against six sigma tolerance limits by process performance: TP (6.1σ/9.3%), ALB (6.9σ/11.3%), T.B (3.4σ/25.6%), ALP (6.8σ/31.5%), AST (4.5σ/16.8%), ALT (1.6σ/19.3%), CL (4.6σ/8.4%), LD (11.5σ/20.07%), K (2.5σ/0.39 mmol/L), Na (3.6σ/6.87 mmol/L), CRE (9.9σ/21.8%), BUN (4.3σ/13.3%), UA (5.9σ/11.5%), T.C (2.2σ/10.7%), GLU (4.8σ/10.2%), GGT (7.5σ/27.3%), CA (5.5σ/0.87 mmol/L), IP (8.5σ/13.17%), TG (9.6σ/17.7%). Peer group survey median CVs in the Korean External Assessment greater than the CLIA criteria were CL (8.45%/5%), BUN (13.3%/9%), CRE (21.8%/15%), T.B (25.6%/20%), and Na (6.87 mmol/L/4 mmol/L). Those less than the CLIA criteria were TP (9.3%/10%), AST (16.8%/20%), ALT (19.3%/20%), K (0.39 mmol/L/0.5 mmol/L), UA (11.5%/17%), Ca (0.87 mg/dL/1 mg/dL), and TG (17.7%/25%). Of 17 items, TEa was the same for 14 (82.35%). We found that the sigma level increases as the allowable total error increases, and we are confident that the goal set for allowable total error affects the sigma-metric evaluation of a process even when the process itself is unchanged.
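The sigma levels above follow the standard sigma-metric relation, sigma = (TEa − |bias|)/CV, combined with the precision goal of TEa/6 stated in the abstract. A minimal sketch; the TP allowable total error of 10% appears in the abstract, but the bias and CV values below are hypothetical illustrations, not the study's data:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Standard sigma metric: how many SDs of imprecision (CV) fit
    between the observed bias and the allowable total error (TEa).
    All arguments are percentages of the target concentration."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def precision_goal(tea_pct):
    """Six sigma precision goal: CV should not exceed TEa/6."""
    return tea_pct / 6.0

# Hypothetical example: TEa for total protein (TP) is 10%;
# suppose a method shows 1.0% bias and 1.5% CV.
level = sigma_metric(10.0, 1.0, 1.5)
print(round(level, 2))          # 6.0  -> meets six sigma performance
print(round(precision_goal(10.0), 2))  # 1.67
```

A method whose sigma metric falls below 6 fails the six sigma criterion even if it passes proficiency testing, which is why the abstract stresses that the chosen TEa directly drives the sigma evaluation.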


A Study on Radiation Safety Evaluation for Spent Fuel Transportation Cask (사용후핵연료 운반용기 방사선적 안전성평가에 관한 연구)

  • Choi, Young-Hwan;Ko, Jae-Hun;Lee, Dong-Gyu;Jung, In-Su
    • Journal of Nuclear Fuel Cycle and Waste Technology(JNFCWT)
    • /
    • v.17 no.4
    • /
    • pp.375-387
    • /
    • 2019
  • In this study, the radiation dose rates for a CANDU spent nuclear fuel transportation cask holding 360 assemblies were evaluated, using radiation source terms derived for the design basis fuel of a pressurized heavy water reactor. A radiological safety evaluation was then carried out, and the validity of the results was judged against radiological technical standards. To select the design basis fuel, i.e. the radiation source term for the spent fuel transportation cask, the design basis fuels of the two spent fuel storage facilities and the spent fuel transportation cask operating at the Wolsung NPP were considered. The design basis fuel for each transportation and storage system was based on the burnup of the spent fuel, the minimum cooling period, and the time of transportation to the intermediate storage facility. A burnup of 7,800 MWD/MTU and a minimum cooling period of 6 years were set for the design basis fuel. The radiation source terms of the design basis fuel were evaluated using the ORIGEN-ARP module of the SCALE computer code, and the radiation shielding of the cask was evaluated using the MCNP6 computer code. In addition, the evaluation of the radiation dose rate outside the transport cask required by the technical standard was classified into normal and accident conditions. The maximum radiation dose rates calculated at the surface of the cask and at a point 2 m from the surface under normal transportation conditions were 0.330 mSv·h⁻¹ and 0.065 mSv·h⁻¹, respectively. The maximum radiation dose rate 1 m from the surface of the cask under accident conditions was calculated as 0.321 mSv·h⁻¹. It was thus confirmed that this large-capacity heavy water reactor spent fuel cask satisfies the radiation safety requirements.

Modern Paper Quality Control

  • Olavi Komppa
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference
    • /
    • 2000.06a
    • /
    • pp.16-23
    • /
    • 2000
  • The increasing functional demands on top-quality printing papers and packaging paperboards, and especially the rapid development of electronic printing processes and computer printers over the past few years, set new targets and requirements for modern paper quality. Most of these paper grades today have relatively high filler content, are moderately or heavily calendered, and have many coating layers for the best appearance and performance. In practice, this means that many of the traditional quality assurance methods, mostly designed to measure papers made of pure, native pulp only, cannot reliably (or at all) be used to analyze or rank the quality of modern papers. Hence, the introduction of new measurement techniques is necessary to assure and further develop paper quality today and in the future. Paper formation, i.e. small-scale (millimeter-scale) variation of basis weight, is the most important quality parameter in papermaking because of its influence on practically all other quality properties of paper. The ideal paper would be completely uniform, so that the basis weight of each small point (area) measured would be the same. In practice, of course, this is not possible, because relatively large local variations always exist in paper. These small-scale basis weight variations, however, are the major cause of many other quality problems, including calender blackening, uneven coating, uneven printing, etc. Traditional visual inspection or optical measurement of the paper does not give a reliable picture of the material variations in the paper, because in the modern papermaking process the optical behavior of paper is strongly affected by the use of fillers, dyes, or coating colors. Furthermore, the opacity (optical density) of the paper changes at different process stages such as wet pressing and calendering.
The greatest advantage of using the beta transmission method to measure paper formation is that it can be very reliably calibrated to measure the true basis weight variation of all kinds of paper and board, independently of sample basis weight or paper grade. This makes it possible to measure, compare, and judge papers made of different raw materials or colors, and even to measure heavily calendered, coated, or printed papers. Scientific research in paper physics has shown that the orientation of the top-layer (paper surface) fibers of the sheet plays the key role in paper curling and cockling, causing the typical practical problems (paper jams) with modern fax and copy machines, electronic printing, etc. On the other hand, the fiber orientation at the surface and middle layers of the sheet controls the bending stiffness of paperboard. Therefore, a reliable measurement of paper surface fiber orientation provides a powerful tool to investigate and predict paper curling and cockling tendency, and gives the information needed to fine-tune the manufacturing process for optimum quality. Many papers, especially heavily calendered and coated grades, strongly resist liquid and gas penetration, being beyond the measurement range of traditional instruments or resulting in inconveniently long measuring times per sample. The increased surface hardness and the use of filler minerals and mechanical pulp make a reliable, non-leaking sample contact with the measurement head a challenge of its own. Paper surface coating, as expected, produces a layer whose permeability characteristics differ completely from those of the other layers of the sheet.
The latest developments in sensor technology have made it possible to reliably measure gas flow under well-controlled conditions, allowing investigation of the gas penetration of open structures, such as cigarette paper, tissue, or sack paper, and, in the low-permeability range, analysis of even fully greaseproof papers, silicone papers, and heavily coated papers and boards, and even the detection of defects in barrier coatings. Even nitrogen or helium may be used as the gas, giving completely new possibilities to rank products or to find correlations with critical process or converting parameters. All modern paper machines include many on-line measuring instruments that provide the information needed by automatic process control systems. Hence, the reliability of the information obtained from the different sensors is vital for good optimization and process stability. If any of these on-line sensors does not operate exactly as planned (having even a small measurement error or malfunction), the process control will set the machine to operate away from the optimum, resulting in loss of profit or eventual problems in quality or runnability. To assure optimum operation of the paper machines, a novel quality assurance policy for the on-line measurements has been developed, including control procedures that utilize traceable, accredited standards for the best reliability and performance.

Development and Complementation of Evaluation Area and Content Elements in Electrical, Electronics and Communications Subject (중등교사 임용후보자선정경쟁시험 표시과목인 전기·전자·통신의 평가영역 및 내용요소 개발·보완 연구)

  • Song, Youngjik;Kang, Yoonkook;Cho, Hanwook;Gim, Seongdeuk;Lim, Seunggak;Lee, Hyuksoo
    • 대한공업교육학회지
    • /
    • v.44 no.1
    • /
    • pp.52-71
    • /
    • 2019
  • The quality of school education is a key element of national education development. An important factor that determines the quality of school education is the quality of the teachers who are responsible for it in the field. It is therefore necessary to hire competent teachers through the teacher appointment exam for secondary schools. This necessity is especially evident for vocational high schools and Meister high schools with the introduction of the NCS-based 2015 revised curriculum, which separates the three subjects "Electrical", "Electronics", and "Communication", changing the question mechanism and requiring a new design of the assessment areas and content. This study therefore analyzes the college-of-education curricula for "Electrical", "Electronics", and "Communication", the NCS-based 2015 revised curriculum, and the 2009 development of standards for teacher qualifications, assessment areas, and evaluation of teaching ability for the "Electrical, Electronics and Communication" subjects of the teacher appointment exam. The assessment areas and content elements of "Electrical", "Electronics", and "Communication" were extracted from the analyzed results, verified through expert consultation, and are presented as follows. First, the assessment areas and content elements of the "Electrical" subject were designed to evaluate the NCS-based 2015 revised curriculum by reflecting the NCS learning modules in the assessment areas and content elements of the basic subject "Electrical and Electronics Practice". Second, the "Electronics" section presented assessment areas and content elements applying "Electronic Circuits", a basic NCS subject, and also added "Electromagnetics", the foundation of electronics, so that the application of electromagnetic waves could be assessed.
Third, the assessment areas and content elements of "Communication" consist of communication-related practice built on "Electrical" and "Electronics", reflecting the characteristics of communication engineering. In particular, adding network construction practice and communication-related practice to "Electrical and Electronics Practice" makes it possible to evaluate communication-related practical education.

A Study of Establishing the Plan of Lodging for the Workers of Gaesung Industrial Complex (개성공단 근로자 기숙사 건립 계획 연구)

  • Choi, Sang-Hee;Kim, Doo-Hwan;Kim, Sang-Yeon;Choi, Eun-Hee
    • Land and Housing Review
    • /
    • v.6 no.2
    • /
    • pp.67-77
    • /
    • 2015
  • Given that a smooth supply of workers is necessary for the second-phase construction and stable development of the Gaesung Industrial Complex, this study offers planning criteria and a model for establishing workers' lodging in the complex, based on the agreement reached between South and North Korea in 2007. The plan, its standards, and its alternatives were reviewed with respect to worker welfare, economic efficiency, technical validity, feasibility of agreement, and long-term development. The exclusive area per capita was calculated from the Labor Standards Act of Korea and a status survey of workers' lodging in the border area between China and North Korea, and an economical alternative based on six-person rooms with shared restrooms was compared with a development-type alternative based on four-person rooms with private restrooms. For the proposed site, the optimal location was selected by considering gradient, accessibility, and ease of development within the Dongchang-ri area already agreed upon, with priority given to sites that could retain the existing building site. The period required for the whole construction was set at approximately 36 months. For the construction method, the RC rigid-frame (Rahmen) method was selected as the best alternative, considering the skill level of the North Korean workforce and the conditions of material supply, and a cluster-type layout of 4-6 building units was proposed, considering the efficiency of providing service and convenience facilities for the simultaneous accommodation of 15,000 people.
It was estimated that total project expenses of approximately 80-100 billion Korean won would be required, with differences among the alternatives, under a rental scheme in which the developer borrows from the Inter-Korean Cooperation Fund and recovers the rent on the beneficiary-pays principle. The feasibility of recovering the rent was analyzed assuming an investment recovery period of 30 years. For operation and management after occupancy, the establishment of a committee for operating the workers' lodging of the Gaesung Industrial Complex (tentative name) was proposed, with a dual governance structure in which the constructor is responsible for operational management, fee collection, and infrastructure management, while human resource management is delegated to North Korea.

Field Application of Soil and Groundwater Investigation and Remediation (토양 및 지하수 Investigation과 Remediation에 대한 현장적용)

  • Wallner, Heinz
    • Proceedings of the Korean Society of Soil and Groundwater Environment Conference
    • /
    • 2000.11a
    • /
    • pp.44-63
    • /
    • 2000
  • Situated close to Heathrow Airport, and adjacent to the M4 and M25 motorways, the site at Axis Park is considered a prime business location in the UK. In consequence, two of the UK's major property development companies, MEPC and Redrow Homes, sought the expertise of Intergeo to remediate the contaminated former industrial site prior to its development. Industrial use of the twenty-six hectare site started in 1936, when Hawker Aircraft commenced aircraft manufacture. In 1963 the Firestone Tyre and Rubber Company purchased part of the site. Ford commenced vehicle production at the site in the mid-1970s, and production was continued by Iveco Ford from 1986 until the plant's decommissioning in 1997. Geologically, the site is underlain by sand and gravel, deposited in prehistory by the River Thames, with London Clay at around 6 m depth. The groundwater level fluctuates seasonally at around 2.5 m depth, moving slowly southwest towards local streams and watercourses. A phased investigation of the site was undertaken, culminating in the extensive site investigation carried out by Intergeo in 1998. In total, 50 boreholes, 90 probeholes, and 60 trial pits were used to investigate the site, and around 4,000 solid and 1,300 liquid samples were tested in the laboratory for chemical substances. The investigations identified total petroleum hydrocarbons in the soil at up to 25,000 mg/kg, with diesel oil and some lubricating oil as the main components. Volatile organic compounds were identified in the groundwater in excess of 10 mg/l; the specific substances included trichloromethane, trichloroethene, and tetrachloroethene. Both the oil and the volatile compounds were widely spread across the site, and the specific substances identified could be traced back to industrial processes used at various dates in the site's history. Slightly elevated levels of toxic metals and polycyclic aromatic hydrocarbons were also identified locally.
Prior to remediation of the site and throughout its progress, extensive liaison with the regulatory authorities and the client's professional representatives was required. In addition to meetings, numerous technical documents detailing methods and health and safety issues were required in order to comply with UK environmental and safety legislation. After initially considering a range of remediation options, the following three main techniques were selected: ex-situ bioremediation of hydrocarbon-contaminated soils, skimming of free-floating hydrocarbon product from the water surface at wells and excavations, and air stripping of volatile organic compounds from groundwater recovered from wells. The achievements were as follows: 1) 350,000 m³ of soil was excavated and 112,000 m³ of sand and gravel was processed to remove gravel- and cobble-sized particles; 2) 53,000 m³ of hydrocarbon-contaminated soil was bioremediated in windrows; 3) 7,000 m³ of groundwater was processed by skimming to remove free-floating product; 4) 196,000 m³ of groundwater was processed by air stripping to remove volatile organic compounds. Only 1,000 m³ of soil left the site for disposal in licensed waste facilities. Given the costs of disposal in the UK, the selected methods represented a considerable cost saving to the clients. All other soil was engineered back into the ground to a precise geotechnical specification. The following target levels were achieved across the site: 1) by a Risk-Based Corrective Action (RBCA) methodology it was demonstrated that soil with less than 1,000 mg/kg total petroleum hydrocarbons did not pose a hazard to health or water resources and could therefore remain in situ; 2) soils destined for the residential areas of the site were remediated to 250 mg/kg total petroleum hydrocarbons, while in the industrial areas 500 mg/kg was proven acceptable.
3) Hydrocarbons in groundwater were remediated to below the Dutch Intervention Level of 0.6 mg/l; 4) volatile organic compounds/BTEX group substances were reduced to below the Dutch Intervention Levels; 5) polycyclic aromatic hydrocarbons and metals were below the Inter-departmental Committee for the Redevelopment of Contaminated Land guideline levels for the intended end use. To verify the quality of the work, 1,500 chemical test results were submitted for validation. Quality assurance checks were undertaken by independent consultants and at an independent laboratory selected by Intergeo. Long-term monitoring of water quality was undertaken for one year after the remediation work had been completed. Both the regulatory authorities and the clients' representatives endorsed the quality of the remediation completed at the site. Subsequent to completion of the remediation work, Redrow Homes constructed a prestige housing development; the properties at "Belvedere Place" retailed at premium prices. On the MEPC site, the Post Office, among others, has located a major sorting office for the London area. Exceptionally high standards of remediation, control, and documentation were a requirement for the work undertaken here.


Impact of impulsiveness on mobile banking usage: Moderating effect of credit card use and mediating effect of SNS addiction (충동성이 모바일뱅킹 사용률에 미치는 영향: 신용카드 사용 여부의 조절효과와 SNS 중독의 매개효과)

  • Lee, Youmi;Nam, Kihwan
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.3
    • /
    • pp.113-137
    • /
    • 2021
  • Given the clear growth potential of mobile banking, many related studies are being conducted, but in Korea they concentrate on technical factors or on consumers' intentions, behaviors, and satisfaction. In addition, although mobile banking has a strong customer base among people in their 20s, few studies have targeted this customer group specifically. For mobile banking to take a leap forward, strategies from various perspectives are needed, based not only on research into mobile banking itself but also on research into the external factors that affect it. This study therefore analyzes impulsiveness, credit card use, and SNS addiction, external factors that can significantly affect mobile banking use among customers in their 20s. It examines whether the relationship between impulsiveness and mobile banking usage depends on credit card use, and whether a customer's impulsiveness can be inferred from credit card use. On this basis, new standards can be established for classifying the marketing target groups of mobile banking: once the relationship between credit card use and impulsiveness is established as positive or negative, a customer's impulsiveness can be indirectly predicted from whether or not they use a credit card. The study also verifies the mediating effect of SNS addiction in the relationship between impulsiveness and mobile banking usage. The collected data were analyzed according to the research questions using the SPSS Statistics 25 program. The findings are as follows. First, positive urgency was shown to have a significant positive effect on mobile banking usage. Second, credit card use showed a moderating effect on the relationship between negative urgency and mobile banking usage.
Third, all subfactors of impulsiveness were shown to have significant positive relationships with the subfactors of SNS addiction. Fourth, the relationships among positive urgency, SNS addiction, and mobile banking usage were confirmed to have both total and direct effects. The first result means that mobile banking usage may be high if positive urgency is measured as relatively high, even if the multidimensional impulsiveness scale is low overall. The second result indicates that mobile banking usage was not affected by negative urgency as an independent variable, but had a significant positive relationship with negative urgency among credit card users. The third result means that SNS, as a mobile-based service providing instant enjoyment and satisfaction, is likely to become addictive for those high in lack of premeditation or lack of perseverance. It also means that SNS can serve as an avoidance space for those with high negative urgency and as an emotional expression space for those with high positive urgency.

Development Process and Methods of Audit and Certification Toolkit for Trustworthy Digital Records Management Agency (신뢰성 있는 전자기록관리기관 감사인증도구 개발에 관한 연구)

  • Rieh, Hae-young;Kim, Ik-han;Yim, Jin-Hee;Shim, Sungbo;Jo, YoonSun;Kim, Hyojin;Woo, Hyunmin
    • The Korean Journal of Archival Studies
    • /
    • no.25
    • /
    • pp.3-46
    • /
    • 2010
  • Digital records management is a whole system in which many social and technical elements interact. To maintain trustworthiness, a repository needs periodic audit and certification. Thus, each electronic records management agency needs a toolkit with which it can continuously self-evaluate its trustworthiness and assess its environment and systems to recognize deficiencies. The purpose of this study is the development of a self-certification toolkit for repositories, which synthesizes and analyzes four international standards and best practices: the OAIS Reference Model (ISO 14721), TRAC, DRAMBORA, and the assessment report conducted and published by TNA/UKDA, as well as MoReq2 and current national laws and standards. As this paper describes and demonstrates the development process and framework of this self-certification toolkit, other electronic records management agencies can follow the process, develop their own toolkits reflecting their situations, and utilize the self-assessment results in-house. As a result of this research, 12 assessment areas were defined: (organizational) operation management, classification system and master data management, acquisition, registration and description, storage and preservation, disposal, services, provision of finding aids, system management, access control and security, monitoring/audit trail/statistics, and risk management. For each of the 12 areas, process maps or functional charts were drawn and business functions were analyzed, yielding 54 "evaluation criteria" consisting of the main business functional units in each area. Under each evaluation criterion, 208 "specific evaluation criteria", intended to be implementable, measurable, and provable in self-evaluation, were drawn up.
The audit and certification toolkit developed in this research can be used by digital repositories for periodic self-assessment, the results of which can be used to remedy any deficiencies found and to inform the organization's development strategy.
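The three-level hierarchy described above (12 areas → 54 evaluation criteria → 208 specific evaluation criteria) lends itself to a simple nested structure for self-assessment scoring. A minimal sketch; the area names come from the abstract, but the criterion wordings and scoring rule below are illustrative placeholders, not the toolkit's own text:

```python
# Nested hierarchy: area -> evaluation criteria -> specific criteria.
# Criterion wordings here are hypothetical examples for illustration.
toolkit = {
    "storage and preservation": {
        "format migration": [
            "migration plans are documented",
            "migrated records are verified against originals",
        ],
    },
    "access control and security": {
        "user authentication": [
            "access rights are reviewed periodically",
        ],
    },
}

def self_assess(toolkit, answers):
    """Score each area as the fraction of its specific criteria met.

    answers maps a specific-criterion string to True (met) or False."""
    scores = {}
    for area, criteria in toolkit.items():
        items = [q for qs in criteria.values() for q in qs]
        met = sum(answers.get(q, False) for q in items)
        scores[area] = met / len(items)
    return scores

answers = {"migration plans are documented": True}
print(self_assess(toolkit, answers))
```

An agency adapting the toolkit would replace the placeholder strings with its own 208 specific criteria and track the per-area scores over successive self-assessments.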

Current status and prospects of approval of the new technology-based food additives (신기술이용 식품첨가물 국내·외 심사 현황 및 전망)

  • Rhee, Jin-Kyu
    • Food Science and Industry
    • /
    • v.52 no.2
    • /
    • pp.188-201
    • /
    • 2019
  • In the past, food additives were classified and managed as chemically synthesized or natural additives according to the manufacturing process, but this made it difficult to confirm the purpose or function of a food additive. CODEX, the international standard, classifies food additives according to their practical use, based on scientific evidence of their technical effects, instead of classifying them as synthetic or natural. Very recently, therefore, the food additive standards in Korea were completely revised in accordance with these global trends. Currently, the classification system divides food additives into 31 uses, specifying their functions and purposes instead of their manufacturing methods. A further revision of the legislative framework is required to define and expand the scope of the Act to this enlarged area. Competition to preempt new food products based on biotechnology is fierce, as countries seek to enhance the safety of their citizens and maximize their economic gains. In this age of unlimited competition, it is urgent to revise or supplement the current regulations in order to revitalize the domestic food industry and enhance national competitiveness through the development of food additives using new biotechnology. This report reviews the current laws on domestic food ingredients, food additives, and manufacturing methods, and compares domestic regulations with those of advanced countries, together with countermeasure strategies, to improve the national competitiveness of the domestic advanced-biotechnology-based food additives industry.

Predicting stock movements based on financial news with systematic group identification (시스템적인 군집 확인과 뉴스를 이용한 주가 예측)

  • Seong, NohYoon;Nam, Kihwan
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.3
    • /
    • pp.1-17
    • /
    • 2019
  • Because stock price forecasting is an important issue both academically and practically, research on stock price prediction has been actively conducted. Stock price forecasting research is classified into work using structured data and work using unstructured data. With structured data such as historical stock prices and financial statements, past studies usually used technical analysis and fundamental analysis. In the big data era, the amount of information has rapidly increased, and artificial intelligence methodologies that can extract meaning by quantifying text, an unstructured data type that accounts for a large share of information, have developed rapidly. With these developments, many attempts are being made to predict stock prices from online news by applying text mining. The methodology adopted in many papers is to forecast stock prices with the news of the target companies. However, according to previous research, not only the news of a target company affects its stock price; news of related companies can also affect it. Finding highly relevant companies is not easy, though, because of market-wide effects and random signals. Existing studies have therefore identified relevant companies primarily through predetermined international industry classification standards. However, recent research shows that the Global Industry Classification Standard has varying homogeneity within its sectors, so forecasting stock prices by taking all sector members together, without restricting attention to genuinely relevant companies, can harm predictive performance. To overcome this limitation, we first apply random matrix theory with text mining for stock prediction. When the dimension of the data is large, the classical limit theorems are no longer suitable, because statistical efficiency is reduced.
Therefore, a simple correlation analysis in the financial market does not reflect the true correlation. To solve this issue, we adopt random matrix theory, which is mainly used in econophysics, to remove market-wide effects and random signals and find the true correlation between companies. With the true correlation, we perform cluster analysis to find relevant companies. Based on the clustering, we use a multiple kernel learning algorithm, an ensemble of support vector machines, to incorporate the effects of the target firm and its relevant firms simultaneously; each kernel is assigned to predict stock prices with features from the financial news of the target firm or its relevant firms. The results of this study are as follows. (1) Following the existing research flow, we confirmed that using news from relevant companies is an effective way to forecast stock prices. (2) When looking for relevant companies, looking in the wrong way can lower AI prediction performance. (3) The proposed approach with random matrix theory shows better performance than previous studies when cluster analysis is performed on the true correlation obtained by removing market-wide effects and random signals. The contributions of this study are as follows. First, it shows that random matrix theory, used mainly in econophysics, can be combined with artificial intelligence to produce good methodologies; this suggests that it is important not only to develop AI algorithms but also to adopt physical theory, extending existing research that integrated artificial intelligence with complex systems theory through transfer entropy. Second, this study stresses that finding the right companies in the stock market is an important issue, suggesting that it is important not only to study artificial intelligence algorithms but also to theoretically adjust the input values.
Third, we confirmed that firms classified together under the Global Industry Classification Standard (GICS) may have low relevance, and suggested that it is necessary to define relevance theoretically rather than simply taking it from the GICS.
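The random-matrix filtering described in this abstract is conventionally done by comparing the eigenvalues of the empirical correlation matrix against the Marchenko-Pastur bounds: eigenvalues above the upper bound carry genuine co-movement (market or group modes), while those inside the bounds are consistent with pure noise. A minimal sketch of the bounds themselves, with a hypothetical panel size (the paper does not state its data dimensions):

```python
import math

def mp_bounds(n_assets, n_obs):
    """Marchenko-Pastur eigenvalue bounds for a correlation matrix
    estimated from n_obs observations of n_assets series (n_obs > n_assets).
    Eigenvalues inside [lam_min, lam_max] are consistent with pure noise;
    those above lam_max indicate genuine co-movement between series."""
    q = n_assets / n_obs
    lam_max = (1 + math.sqrt(q)) ** 2
    lam_min = (1 - math.sqrt(q)) ** 2
    return lam_min, lam_max

# Hypothetical panel: 400 stocks observed over 1600 daily returns -> q = 0.25.
lam_min, lam_max = mp_bounds(400, 1600)
print(round(lam_min, 2), round(lam_max, 2))  # 0.25 2.25
```

In a full pipeline, one would keep only the eigenmodes with eigenvalues above `lam_max` when reconstructing the "true" correlation matrix, then run the cluster analysis on that filtered matrix.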