• Title/Summary/Keyword: 최적효율 (optimal efficiency)


Study on the Mechanical Stability of Red Mud Catalysts for HFC-134a Hydrolysis Reaction (HFC-134a 가수분해를 위한 Red mud 촉매 기계적 안정성 향상에 관한 연구)

  • In-Heon Kwak;Eun-Han Lee;Sung-Chan Nam;Jung-Bae Kim;Shin-Kun Ryi
    • Clean Technology
    • /
    • v.30 no.2
    • /
    • pp.134-144
    • /
    • 2024
  • In this study, the mechanical stability of red mud was improved for its commercial use as a catalyst to effectively decompose HFC-134a, one of the seven major greenhouse gases. Red mud is an industrial waste discharged from aluminum production, but it can be used for the decomposition of HFC-134a. Red mud can be manufactured into a catalyst via a crushing-preparation-compression molding-firing process, and calcination can both improve catalyst performance and secure mechanical stability. In order to determine the optimal heat treatment conditions, pellet-shaped compressed red mud samples were calcined at 300, 600, and 800 ℃ in a muffle furnace for 5 hours. Mechanical stability was assessed from the weight loss rate before and after ultrasonication of the catalyst immersed in distilled water. The catalyst calcined at 800 ℃ (RM 800) was found to have the best mechanical stability as well as the highest catalytic activity. Catalyst performance and durability tests performed for 100 hours using the RM 800 catalyst showed that more than 99% of 1 mol% HFC-134a was degraded at 650 ℃, and no degradation in catalytic activity was observed. XRD analysis showed tri-calcium aluminate and gehlenite crystalline phases, which enhance mechanical strength and catalytic activity through the interaction of Ca, Si, and Al after heat treatment at 800 ℃. SEM/EDS analysis of the durability-tested catalysts showed no loss of active substances or shape changes due to HFC-134a abatement. Through this research, it is expected that red mud can be commercialized as a catalyst for waste refrigerant treatment thanks to its high economic feasibility, high decomposition efficiency, and mechanical stability.
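
As a minimal illustration of the stability metric above: the abstract does not give the exact formula, so the sketch below assumes the common definition, weight loss rate = (w_before - w_after) / w_before × 100%, with hypothetical pellet weights.

```python
# Minimal sketch of the weight-loss-rate stability metric described above.
# The exact definition used by the authors is not stated; here we assume
# loss_rate = (w_before - w_after) / w_before * 100 (%).

def weight_loss_rate(w_before_g: float, w_after_g: float) -> float:
    """Percent weight lost during ultrasonication in distilled water."""
    return (w_before_g - w_after_g) / w_before_g * 100.0

# Hypothetical pellet weights for catalysts calcined at different temperatures.
samples = {"RM 300": (1.000, 0.82), "RM 600": (1.000, 0.91), "RM 800": (1.000, 0.98)}
for name, (before, after) in samples.items():
    print(f"{name}: {weight_loss_rate(before, after):.1f}% loss")
```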

Preparation of Pure CO2 Standard Gas from Calcium Carbonate for Stable Isotope Analysis (탄산칼슘을 이용한 이산화탄소 안정동위원소 표준시료 제작에 대한 연구)

  • Park, Mi-Kyung;Park, Sunyoung;Kang, Dong-Jin;Li, Shanlan;Kim, Jae-Yeon;Jo, Chun Ok;Kim, Jooil;Kim, Kyung-Ryul
    • The Sea:JOURNAL OF THE KOREAN SOCIETY OF OCEANOGRAPHY
    • /
    • v.18 no.1
    • /
    • pp.40-46
    • /
    • 2013
  • The isotope ratios $^{13}C/^{12}C$ and $^{18}O/^{16}O$ of a sample in a mass spectrometer are measured relative to those of a pure $CO_2$ reference gas (i.e., a laboratory working standard). Thus, calibration of a laboratory working standard gas to the international isotope scales (Pee Dee Belemnite (PDB) for ${\delta}^{13}C$ and Vienna Standard Mean Ocean Water (V-SMOW) for ${\delta}^{18}O$) is essential for comparisons between data sets obtained by other groups on other mass spectrometers. However, one often finds it difficult to obtain well-calibrated standard gases because of their long production time and high price. An additional difficulty is that fractionation processes can occur inside the gas cylinder, most likely due to pressure drop during long-term use. Therefore, studies on laboratory production of pure $CO_2$ isotope standard gas from stable solid calcium carbonate standard materials have been performed. For this study, we propose a method to extract pure $CO_2$ gas without isotope fractionation from a solid calcium carbonate material. The method is similar to that suggested by Coplen et al. (1983), but is better optimized, particularly for producing a large amount of pure $CO_2$ gas from calcium carbonate material. The $CaCO_3$ releases $CO_2$ in reaction with 100% phosphoric acid at $25^{\circ}C$ in a custom-designed, evacuated reaction vessel. Here we introduce the optimal procedure, reaction conditions, and sample/reactant sizes for the calcium carbonate-phosphoric acid reaction, and also provide the details for extracting, purifying, and collecting $CO_2$ gas out of the reaction vessel. The measurements of ${\delta}^{18}O$ and ${\delta}^{13}C$ of $CO_2$ were performed at Seoul National University using a stable isotope ratio mass spectrometer (VG Isotech, SIRA Series II) operated in dual-inlet mode. The overall analysis precisions for ${\delta}^{18}O$ and ${\delta}^{13}C$ were evaluated based on the standard deviations of multiple measurements on 15 separate samples of purified $CO_2$. The pure $CO_2$ samples were taken from 100-mg aliquots of a solid calcium carbonate (Solenhofen-ori $CaCO_3$) during an 8-day experimental period. The multiple measurements yielded $1{\sigma}$ precisions of ${\pm}0.01$‰ for ${\delta}^{13}C$ and ${\pm}0.05$‰ for ${\delta}^{18}O$, comparable to the internal instrumental precisions of SIRA. Therefore, we conclude that the method proposed in this study can serve as a way to produce an accurate secondary and/or laboratory $CO_2$ standard gas. We hope this study helps resolve the difficulties in placing a laboratory working standard onto the international isotope scales and in making accurate comparisons with data sets from other groups.
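
The precision figures above rest on standard delta notation, which the abstract does not restate: ${\delta} = (R_{sample}/R_{reference} - 1) \times 1000$ ‰, with $R = {^{13}C}/{^{12}C}$ against PDB. A small sketch with made-up ratios, assuming this definition and the commonly cited PDB ratio:

```python
# Sketch of the delta-notation arithmetic behind the abstract's precision figures.
# Assumed definition: delta = (R_sample / R_ref - 1) * 1000 (permil), with
# R = 13C/12C against PDB. All measured ratios below are invented placeholders.
from statistics import mean, stdev

def delta_permil(r_sample: float, r_reference: float) -> float:
    """Isotope ratio in permil relative to the reference standard."""
    return (r_sample / r_reference - 1.0) * 1000.0

R_PDB_13C = 0.0112372  # commonly cited 13C/12C ratio of PDB

# Hypothetical repeated measurements on aliquots of one purified CO2 sample.
ratios = [0.0111265, 0.0111267, 0.0111264, 0.0111266]
deltas = [delta_permil(r, R_PDB_13C) for r in ratios]
print(f"mean d13C = {mean(deltas):.2f} permil, 1-sigma = {stdev(deltas):.3f} permil")
```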

Decomposition Characteristics of Fungicides(Benomyl) using a Design of Experiment(DOE) in an E-beam Process and Acute Toxicity Assessment (전자빔 공정에서 실험계획법을 이용한 살균제 Benomyl의 제거특성 및 독성평가)

  • Yu, Seung-Ho;Cho, Il-Hyoung;Chang, Soon-Woong;Lee, Si-Jin;Chun, Suk-Young;Kim, Han-Lae
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.30 no.9
    • /
    • pp.955-960
    • /
    • 2008
  • We investigated the decomposition and mineralization characteristics of benomyl using a design of experiment (DOE) based on a general factorial design in an E-beam process. The main factors (variables), benomyl concentration (X$_1$) and E-beam irradiation dose (X$_2$), each with 5 levels, were set up to estimate the prediction model and the optimization conditions. First, the benomyl in all treatment combinations except trials 17 and 18 was almost completely degraded, and the difference in benomyl decomposition between the 3 blocks was not significant (p > 0.05, one-way ANOVA). However, benomyl mineralization was 46% (block 1), 36.7% (block 2), and 22% (block 3), showing a significant difference between blocks (p < 0.05). The linear regression equations of benomyl mineralization in each block were estimated as follows: block 1 (Y$_1$ = 0.024X$_1$ + 34.1, R$^2$ = 0.929), block 2 (Y$_2$ = 0.026X$_2$ + 23.1, R$^2$ = 0.976), and block 3 (Y$_3$ = 0.034X$_3$ + 6.2, R$^2$ = 0.98). The normality of benomyl mineralization, checked with the Anderson-Darling test, was satisfied in all treatment conditions (p > 0.05). The prediction model and optimization point obtained using canonical analysis to find the optimal operating conditions were Y = 39.96 - 9.36X$_1$ + 0.03X$_2$ - 10.67X$_1^2$ - 0.001X$_2^2$ + 0.011X$_1$X$_2$ (R$^2$ = 96.3%, adjusted R$^2$ = 94.8%) and 57.3% at 0.55 mg/L and 950 Gy, respectively. A Microtox test using V. fischeri showed that the toxicity, expressed as inhibition (%), was reduced almost completely after E-beam irradiation, whereas the inhibition (%) for 0.5 mg/L, 1 mg/L, and 1.5 mg/L was 10.25%, 20.14%, and 26.2%, respectively, in the initial reactions in the absence of E-beam irradiation.
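
The fitted response surface above can be evaluated directly. The sketch below copies the printed coefficients and locates the maximum by grid search; because the published coefficients are rounded (and the original fit may use coded variables), the grid optimum need not reproduce the reported 57.3% at 0.55 mg/L and 950 Gy exactly.

```python
# Sketch: evaluating the paper's fitted response surface for benomyl mineralization
# and locating its maximum by grid search. Coefficients are copied from the abstract;
# they are printed rounded, so results may differ from the reported optimum.
import numpy as np

def mineralization(x1, x2):
    """Y (%) as a function of benomyl concentration x1 (mg/L) and dose x2 (Gy)."""
    return (39.96 - 9.36 * x1 + 0.03 * x2
            - 10.67 * x1**2 - 0.001 * x2**2 + 0.011 * x1 * x2)

x1 = np.linspace(0.0, 1.5, 151)        # concentration range used in the study
x2 = np.linspace(0.0, 1000.0, 1001)    # assumed dose range up to 1 kGy
X1, X2 = np.meshgrid(x1, x2)
Y = mineralization(X1, X2)
i = np.unravel_index(np.argmax(Y), Y.shape)
print(f"grid optimum: Y = {Y[i]:.1f}% at {X1[i]:.2f} mg/L, {X2[i]:.0f} Gy")
```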

An Exploratory Study on the Demand for Training Programs to Improve Real Estate Agents' Job Performance -Focused on Cheonan, Chungnam- (부동산중개인의 직무능력 향상을 위한 교육프로그램 욕구에 관한 탐색적 연구 -충청남도 천안지역을 중심으로-)

  • Lee, Jae-Beom
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.12 no.9
    • /
    • pp.3856-3868
    • /
    • 2011
  • Until recently, research in real estate has focused on the real estate market and market analysis, while studies on developing training programs to improve real estate agents' job performance are relatively few. This study therefore presents an empirical analysis of the demand for job-performance training programs among real estate agents in Cheonan. The results are as follows. First, when asked what educational contents they need to improve their job performance, most respondents indicated needs for housing value analysis, legal knowledge, real estate management, accounting, real estate marketing, and understanding of real estate policy, being well aware that training programs are the best way to respond to changing client needs. Second, when asked about real estate marketing strategies, most respondents showed awareness of new strategies to meet client needs, since new forms of marketing, including internet advertising, are needed in the field as the paradigm shifts with information technology. Third, when asked about the need for real estate-related training programs, 92% of the respondents answered that they need real estate education programs run by university continuing-education centers, and the survey showed demand for retraining programs that utilize the resources of local universities. In addition, for effective and efficient training, they requested a training system that draws on university human resources, under the name of a 'Real Estate Contract' department, for improving real estate agents' job performance. Fourth, real estate management (44.2%) and real estate marketing (42.3%) were the most frequently chosen contents for a regular course to improve job performance, reflecting a desire to understand client needs through a real estate management and marketing mindset; respondents preferred training offered as an irregular course to a regular one. Despite these results, this study surveyed subjects only in Cheonan, so more diverse areas need to be studied. The demand for programs to improve real estate agents' job performance should be analyzed empirically, targeting agents not only in Cheonan but also in cities where the real estate business is booming, such as Pyeongchon, Ilsan, and Bundang, as well as undergraduate and graduate students majoring in real estate studies. Such studies can provide information for developing customized training programs by evaluating the elements real estate agents need in order to satisfy clients and improve their job performance, and the program-development variables learned through them can be incorporated into real estate curricula and used practically for the development of real estate studies in this fast-changing era.

An Empirical Study on Statistical Optimization Model for the Portfolio Construction of Sponsored Search Advertising(SSA) (키워드검색광고 포트폴리오 구성을 위한 통계적 최적화 모델에 대한 실증분석)

  • Yang, Hognkyu;Hong, Juneseok;Kim, Wooju
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.167-194
    • /
    • 2019
  • This research starts from four basic concepts confronted when making decisions in keyword bidding: incentive incompatibility, limited information, myopia, and the decision variable. To make these concepts concrete, four framework approaches are designed: a strategic approach for incentive incompatibility, a statistical approach for limited information, alternative optimization for myopia, and a new model approach for the decision variable. The purpose of this research is to propose a statistical optimization model for constructing a Sponsored Search Advertising (SSA) portfolio from the sponsor's perspective, validated through empirical tests, that can be used in portfolio decision making. Previous research to date formulates the CTR estimation model using CPC, Rank, Impression, CVR, etc., individually or collectively as independent variables. However, many of these variables are not controllable in keyword bidding; only CPC and Rank can be used as decision variables in the bidding system. The classical SSA model is designed on the basic assumption that CPC is the decision variable and CTR is the response variable. However, this classical model faces many hurdles in estimating CTR. The main problem is the uncertainty between CPC and Rank: in keyword bidding, CPC fluctuates continuously even at the same Rank. This uncertainty raises questions about the credibility of CTR, along with practical management problems. Sponsors make keyword-bidding decisions under limited information, and a strategic portfolio approach based on statistical models is necessary. To solve the problem of the classical SSA model, a new SSA model frame is designed on the basic assumption that Rank is the decision variable. Rank is proposed as the best decision variable for predicting CTR in many papers, and most search engine platforms provide options and algorithms that make it possible to bid with Rank, so sponsors can participate in keyword bidding with Rank. Therefore, this paper tests the validity of this new SSA model and its applicability to constructing an optimal portfolio in keyword bidding. The research process is as follows: to perform the optimization analysis for constructing a keyword portfolio under the new SSA model, this study proposes criteria for categorizing keywords, selects representative keywords for each category, shows the non-linear relationships, screens scenarios for CTR and CPC estimation, selects the best-fit model through a Goodness-of-Fit (GOF) test, formulates the optimization models, confirms the spillover effects, and suggests a modified optimization model reflecting spillover, together with some strategic recommendations. Tests of the optimization models using these CTR/CPC estimation models are performed empirically with the objective functions of (1) maximizing CTR (the CTR optimization model) and (2) maximizing expected profit reflecting CVR (the CVR optimization model). Both the CTR and CVR optimization test results show that the suggested SSA model yields significant improvements and is valid for constructing a keyword portfolio using the CTR/CPC estimation models suggested in this study. However, one critical problem is found in the CVR optimization model: important keywords are excluded from the keyword portfolio because of the myopia of their immediately low profit at present.
To solve this problem, a Markov chain analysis is carried out, and the concepts of Core Transit Keyword (CTK) and Expected Opportunity Profit (EOP) are introduced. The revised CVR optimization model is proposed, tested, and shown to be valid for constructing the portfolio. Strategic guidelines and insights are as follows: brand keywords are usually dominant in almost every aspect (CTR, CVR, expected profit, etc.), but generic keywords turn out to be the CTKs with spillover potential that can increase consumers' awareness and lead them to brand keywords; this is why generic keywords should be emphasized in keyword bidding. The contributions of this work are to propose a novel SSA model based on Rank as the decision variable, to propose managing the keyword portfolio by categories according to keyword characteristics, to propose statistical modeling and management based on Rank in constructing the keyword portfolio, and, based on the empirical tests, to propose new strategic guidelines focusing on the CTK and a modified CVR optimization objective function that reflects the spillover effect instead of the previous expected profit models.
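
To make the Rank-as-decision-variable idea concrete, here is an illustrative sketch: choose one rank per keyword to maximize total expected clicks under a budget. The CTR/CPC tables stand in for the per-keyword estimation models fitted in the paper and are entirely made up; exhaustive search is used only for clarity, not scale.

```python
# Illustrative sketch: keyword portfolio with Rank as the decision variable,
# maximizing expected clicks subject to a daily budget. All numbers are invented.
from itertools import product

# keyword -> {rank: (estimated CTR, estimated CPC in KRW)}
estimates = {
    "brand_kw":   {1: (0.12, 900), 2: (0.08, 600), 3: (0.05, 400)},
    "generic_kw": {1: (0.06, 700), 2: (0.04, 450), 3: (0.025, 300)},
}
impressions = {"brand_kw": 10_000, "generic_kw": 30_000}  # expected daily impressions
budget = 2_500_000  # daily budget, KRW

best = None
for ranks in product(*[list(r) for r in estimates.values()]):
    choice = dict(zip(estimates, ranks))  # one rank chosen per keyword
    clicks = sum(impressions[k] * estimates[k][r][0] for k, r in choice.items())
    cost = sum(impressions[k] * estimates[k][r][0] * estimates[k][r][1]
               for k, r in choice.items())
    if cost <= budget and (best is None or clicks > best[0]):
        best = (clicks, cost, choice)

print(f"best portfolio: {best[2]}, clicks={best[0]:.0f}, cost={best[1]:,.0f}")
```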

Self-optimizing feature selection algorithm for enhancing campaign effectiveness (캠페인 효과 제고를 위한 자기 최적화 변수 선택 알고리즘)

  • Seo, Jeoung-soo;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.173-198
    • /
    • 2020
  • For a long time, many studies in academia have been conducted on predicting the success of customer campaigns, and prediction models applying various techniques are still being studied. Recently, as campaign channels have expanded in various ways due to the rapid revitalization of online business, companies carry out various types of campaigns at a level that cannot be compared to the past. However, customers tend to perceive campaigns as spam as fatigue from duplicate exposure increases. From a corporate standpoint, the effectiveness of campaigns themselves is decreasing while the cost of investing in them grows, which leads to low actual campaign success rates. Accordingly, various studies are ongoing to improve campaign effectiveness in practice. A campaign system has the ultimate purpose of increasing the success rate of various campaigns by collecting and analyzing customer-related data and using it for campaigns. In particular, recent attempts have been made to predict campaign responses using machine learning. Selecting appropriate features is very important because campaign data have many features. If all of the input data are used when classifying a large amount of data, learning takes a long time as the classification task grows, so a minimal input data set must be extracted from the entire data. In addition, when a model is trained using too many features, prediction accuracy may be degraded due to overfitting or correlation between features. Therefore, to improve accuracy, a feature selection technique that removes features close to noise should be applied; feature selection is a necessary process for analyzing high-dimensional data sets. Among greedy algorithms, SFS (Sequential Forward Selection), SBS (Sequential Backward Selection), SFFS (Sequential Floating Forward Selection), etc. are widely used as traditional feature selection techniques, but when the feature space is large, their classification performance is limited and learning takes a long time. Therefore, in this study, we propose an improved feature selection algorithm to enhance the effectiveness of existing campaigns. The purpose of this study is to improve the existing sequential SFFS method in the process of searching for the feature subsets that underpin machine learning model performance, using statistical characteristics of the data processed in the campaign system. Features that strongly influence performance are derived first and features that have a negative effect are removed, and then the sequential method is applied to increase search efficiency and to enable generalized prediction with the improved algorithm. The proposed model showed better search and prediction performance than the traditional greedy algorithm: compared with the original data set, the greedy algorithm, a genetic algorithm (GA), and recursive feature elimination (RFE), campaign success prediction was higher. In addition, the improved feature selection algorithm was found to be helpful in analyzing and interpreting the prediction results by providing the importance of the derived features.
These include features such as age, customer rating, and sales, whose importance was already known statistically. Unexpectedly, features that campaign planners rarely used to select campaign targets, such as the combined product name, the average 3-month data consumption rate, and the last 3 months of wireless data usage, were also selected as important features for the campaign response. It was confirmed that base attributes can be very important features depending on the campaign type, making it possible to analyze and understand the important characteristics of each campaign type.
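
For reference, a minimal sketch of the SFFS baseline the study improves on: a forward step adds the most helpful feature, then a floating step drops features while doing so improves the score. The scoring callback and synthetic data are stand-ins, not the campaign data set, and the per-subset-size bookkeeping of full SFFS is simplified away.

```python
# Simplified Sequential Floating Forward Selection (SFFS) sketch.
# score_fn is any callback returning a CV score for a feature-index subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def sffs(n_features, score_fn, k_target):
    selected, best_score = [], -np.inf
    while len(selected) < k_target:
        # forward step: add the single feature that helps the most
        candidates = [f for f in range(n_features) if f not in selected]
        f_best = max(candidates, key=lambda f: score_fn(selected + [f]))
        selected.append(f_best)
        best_score = score_fn(selected)
        # floating step: drop a feature while doing so strictly improves the score
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for f in list(selected):
                trial = [g for g in selected if g != f]
                if score_fn(trial) > best_score:
                    selected, best_score, improved = trial, score_fn(trial), True
                    break
    return selected, best_score

# Synthetic stand-in for the campaign data set.
X, y = make_classification(n_samples=500, n_features=12, n_informative=4, random_state=0)
score = lambda idx: cross_val_score(LogisticRegression(max_iter=1000), X[:, idx], y, cv=3).mean()
print(sffs(X.shape[1], score, k_target=5))
```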

Edge to Edge Model and Delay Performance Evaluation for Autonomous Driving (자율 주행을 위한 Edge to Edge 모델 및 지연 성능 평가)

  • Cho, Moon Ki;Bae, Kyoung Yul
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.191-207
    • /
    • 2021
  • Mobile communications have evolved rapidly over the decades, mainly focusing on speed-ups to meet the growing data demands from 2G to 5G. With the start of the 5G era, efforts are being made to provide customers with services such as IoT, V2X, robots, artificial intelligence, augmented/virtual reality, and smart cities, which are expected to change the environment of our lives and industries as a whole. To provide those services, reduced latency and high reliability, on top of high-speed data, are critical for real-time services. Thus, 5G has paved the way for service delivery with a maximum speed of 20 Gbps, a delay of 1 ms, and $10^6$ connected devices per ㎢. In particular, in intelligent traffic control systems and services using various vehicle-based Vehicle-to-X (V2X) applications such as traffic control, reduced delay and high reliability for real-time services are very important in addition to high data speed. 5G communication uses high frequencies of 3.5 GHz and 28 GHz. These high-frequency waves support high speeds thanks to their straight-line propagation, but their short wavelength and small diffraction angle limit their reach and prevent them from penetrating walls, restricting their use indoors. It is therefore difficult to overcome these constraints under existing networks. The underlying centralized SDN also has limited capability in offering delay-sensitive services, because communication with many nodes overloads its processing. Basically, SDN, a structure that separates control-plane signals from data-plane packets, requires control of the delay-related tree structure available in the event of an emergency during autonomous driving. In these scenarios, the network architecture that handles in-vehicle information is a major variable of delay. Since SDNs with conventional centralized structures have difficulty meeting the desired delay level, studies on the optimal size of SDNs for information processing should be conducted. Thus, SDNs need to be separated at a certain scale to construct a new type of network that can efficiently respond to dynamically changing traffic and provide high-quality, flexible services. Moreover, the structure of these networks is closely related to ultra-low latency, high reliability, and hyper-connectivity, and should be based on a new form of split SDN rather than the existing centralized SDN structure, even in the worst case. In such SDN-structured networks, where automobiles pass through small 5G cells very quickly, the information change cycle, the round trip delay (RTD), and the data processing time of the SDN are highly correlated with the delay. Of these, RTD is not a significant factor because the link is fast enough and contributes less than 1 ms of delay, but the information change cycle and the SDN data processing time greatly affect the delay. Especially in an emergency in a self-driving environment linked to an ITS (Intelligent Traffic System), which requires low latency and high reliability, information must be transmitted and processed very quickly; that is a case in point where delay plays a very sensitive role. In this paper, we study the SDN architecture for emergencies during autonomous driving and, through simulation, analyze the correlation with the cell layer from which the vehicle should request relevant information according to the information flow.
For the simulation, since the 5G data rate is high enough, we assume that the information supporting neighboring vehicles reaches the car without errors. Furthermore, we assumed 5G small cells with radii of 50 ~ 250 m, and vehicle speeds of 30 ~ 200 km/h were considered in order to examine the network architecture that minimizes the delay.
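
A back-of-the-envelope sketch of the simulation geometry stated above: cell dwell time for the assumed cell radii (50 ~ 250 m) and speeds (30 ~ 200 km/h), plus a purely illustrative delay budget in which RTD stays near 1 ms while the information change cycle and SDN processing dominate. The delay component magnitudes are assumptions, not the paper's measured values.

```python
# Sketch of the simulation's geometry: how long a vehicle dwells in a 5G small
# cell, plus an assumed end-to-end delay budget for illustration only.

def dwell_time_s(cell_radius_m: float, speed_kmh: float) -> float:
    """Worst-case time to cross a cell along its diameter."""
    return (2 * cell_radius_m) / (speed_kmh / 3.6)

for radius in (50, 150, 250):
    for speed in (30, 100, 200):
        print(f"radius {radius:3d} m, {speed:3d} km/h -> dwell {dwell_time_s(radius, speed):5.1f} s")

# Hypothetical delay budget (ms): RTD is small per the abstract, while the
# information change cycle and SDN processing dominate. Magnitudes are assumed.
rtd_ms, update_cycle_ms, sdn_processing_ms = 1.0, 10.0, 5.0
print(f"total delay ~ {rtd_ms + update_cycle_ms + sdn_processing_ms:.1f} ms")
```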

Development of Yóukè Mining System with Yóukè's Travel Demand and Insight Based on Web Search Traffic Information (웹검색 트래픽 정보를 활용한 유커 인바운드 여행 수요 예측 모형 및 유커마이닝 시스템 개발)

  • Choi, Youji;Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.3
    • /
    • pp.155-175
    • /
    • 2017
  • As social data come into the spotlight, mainstream web search engines provide data indicating how many people searched for a specific keyword: web search traffic data. Web search traffic information is a collection of crowd behavior around specific keywords. In various areas, web search traffic can be used as a useful variable representing the attention of ordinary users to specific interests. Many studies use web search traffic data to nowcast or forecast social phenomena such as epidemics, consumer patterns, product life cycles, financial investment models, and so on. Web search traffic data have also begun to be applied to predicting inbound tourism. Proper demand prediction is needed because tourism is a high value-added industry that increases employment and foreign exchange earnings. Among inbound tourists, Chinese tourists (Youke) are growing continuously; Youke have been the largest inbound group in Korean tourism for many years, with the highest tourism profit per tourist as well. Research into proper demand prediction approaches for Youke is therefore important in both the public and private sectors, since accurate tourism demand prediction supports efficient decision making with limited resources. This study suggests an improved model that reflects the latest issues of society through the attention of groups of individuals. Traveling abroad is generally a high-involvement activity, so potential tourists are likely to search deeply for information about their own trips; web search traffic presents tourists' attention during trip preparation in an instantaneous and dynamic way. This study therefore attempted to select keywords that potential Chinese tourists were likely to search for on the internet. Baidu, China's biggest web search engine with over 80% market share, provides users with access to web search traffic data. Qualitative interviews with potential tourists helped us understand pre-trip information search behavior and identify the keywords for this study. The selected web search traffic keywords are categorized into three levels by how directly they relate to "Korean Tourism"; this classification helps identify which keywords explain Youke inbound demand, from the closest category to the farthest. Web search traffic data for each keyword were gathered by a web crawler developed to collect web search data from Baidu Index. Using the automatically gathered variable data, a linear model was designed by multiple regression analysis, which suits operational application in decision and policy making because the relationships between variables are easy to explain. After the linear regression models were composed, a model with traditional variables and a model that adds the web search traffic variables to the traditional model were compared by significance and R-squared, and the final model was composed after comparing their performance. The final regression model has better explanatory power than the traditional model, with the added advantages of real-time immediacy and convenience. Furthermore, this study demonstrates an intuitively visualized system for general use: the Youke Mining solution, which embeds the final regression model and offers several tourist decision-making functions, with a data-science-based algorithm and a well-designed, simple interface. In the end, this research carries three significant implications: theoretical, practical, and political.
Theoretically, the Youke Mining system and the model in this research are a first step toward Youke inbound prediction using an interactive and instantaneous variable: web search traffic, which represents tourists' attention while they prepare their trips. Practically, since Baidu holds more than 80% of the web search engine market, Baidu data can represent the attention of potential tourists preparing their own tours in real time. Finally, politically, the designed Chinese tourist demand prediction model based on web search traffic can be used in tourism decision making for efficient resource management and for optimizing opportunities for successful policy.
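
The model comparison described above (traditional variables vs. traditional plus web search traffic, judged by significance and R-squared) can be sketched as follows; all series are synthetic placeholders, where real inputs would be monthly Youke arrivals, traditional covariates, and Baidu Index values.

```python
# Sketch of the comparison: baseline regression on traditional variables vs. one
# augmented with a web search traffic index, compared by R-squared and p-values.
# All data below are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60  # e.g., 60 monthly observations
exchange_rate = rng.normal(170, 10, n)            # traditional variable (placeholder)
seasonality = np.sin(2 * np.pi * np.arange(n) / 12)
baidu_index = rng.normal(100, 20, n)              # web search traffic variable
arrivals = (50 + 0.3 * exchange_rate + 8 * seasonality
            + 0.5 * baidu_index + rng.normal(0, 5, n))

X_base = sm.add_constant(np.column_stack([exchange_rate, seasonality]))
X_aug = sm.add_constant(np.column_stack([exchange_rate, seasonality, baidu_index]))

m_base = sm.OLS(arrivals, X_base).fit()
m_aug = sm.OLS(arrivals, X_aug).fit()
print(f"baseline R^2 = {m_base.rsquared:.3f}, augmented R^2 = {m_aug.rsquared:.3f}")
print(m_aug.pvalues)  # significance of each coefficient in the augmented model
```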