

The Characteristics of Bronchioloalveolar Carcinoma Presenting with Solitary Pulmonary Nodule (고립성 폐결절로 나타난 기관지폐포암의 임상적 고찰)

  • Kim, Ho-Cheol;Cheon, Eun-Mee;Suh, Gee-Young;Chung, Man-Pyo;Kim, Ho-Joong;Kwon, O-Jung;Rhee, Chong-H.;Han, Yong-Chol;Lee, Kyoung-Soo;Han, Jung-Ho
    • Tuberculosis and Respiratory Diseases
    • /
    • v.44 no.2
    • /
    • pp.280-289
    • /
    • 1997
  • Background: Bronchioloalveolar carcinoma (BAC) has been reported to show a diverse spectrum of clinical presentations and radiologic patterns. The three representative radiologic patterns are as follows: 1) a solitary nodule or mass, 2) a localized consolidation, and 3) multicentric or diffuse disease. While the localized consolidation and solitary nodular patterns have a favorable prognosis, the multicentric or diffuse pattern has a worse prognosis regardless of treatment. BAC presenting as a solitary pulmonary nodule is often misdiagnosed as a benign disease such as tuberculoma. It is therefore very important to make a proper diagnosis of BAC with the solitary nodular pattern, since this pattern of BAC is usually curable with surgical resection. Methods: We reviewed the clinical and radiologic features of patients with pathologically proven BAC with the solitary nodular pattern from January 1995 to September 1996 at Samsung Medical Center. Results: A total of 11 patients were identified, 6 men and 5 women. Ages ranged from 37 to 69, with a median of 60. Most patients with BAC with the solitary nodular pattern were asymptomatic and were detected by an incidental radiologic abnormality. The chest radiograph showed a poorly defined opacity or nodule, and computed tomography showed consolidation, ground glass appearance, internal bubble-like lucencies, air bronchogram, open bronchus sign, spiculated margin or pleural tag in most patients. The initial diagnosis on chest X-ray was pulmonary tuberculosis in 4 patients, benign nodule in 2 patients, and malignant nodule in 5 patients. FDG positron emission tomography (FDG-PET) was performed in eight patients and revealed findings suggestive of malignancy in only 3. The pathologic diagnosis was obtained by transbronchial lung biopsy in 1 patient, by CT-guided percutaneous needle aspiration in 2 patients, and by lung biopsy via video-assisted thoracoscopy in 8 patients. Lobectomy was performed in all patients, and the postoperative pathologic stage was $T_1N_0M_0$ in 8 patients and $T_2N_0M_0$ in 3 patients. Conclusion: Patients with BAC presenting with the solitary nodular pattern were most often asymptomatic and incidentally detected by a radiologic abnormality. The chest X-ray showed a poorly defined nodule or opacity, and these findings were often regarded as a benign lesion. If a poorly defined nodule or opacity does not disappear on follow-up chest X-ray, computed tomography should be performed. If consolidation, ground glass appearance, open bronchus sign, air bronchogram, internal bubble-like lucency, pleural tag or spiculated margin is found on computed tomography, further diagnostic procedures, including open thoracotomy, should be performed to exclude the possibility of BAC with the solitary nodular pattern.


Service Quality, Customer Satisfaction and Customer Loyalty of Mobile Communication Industry in China (중국이동통신산업중적복무질량(中国移动通信产业中的服务质量), 고객만의도화고객충성도(顾客满意度和顾客忠诚度))

  • Zhang, Ruijin;Li, Xiangyang;Zhang, Yunchang
    • Journal of Global Scholars of Marketing Science
    • /
    • v.20 no.3
    • /
    • pp.269-277
    • /
    • 2010
  • Previous studies have shown that the most important factor affecting customer loyalty in the service industry is service quality. However, on the subject of whether service quality has a direct or indirect effect on customer loyalty, scholars' views apparently vary. Some studies suggest that service quality has a direct and fundamental influence on customer loyalty (Bai and Liu, 2002). However, others have shown that service quality not only directly affects customer loyalty but also has an indirect impact on it by influencing customer satisfaction and perceived value (Cronin, Brady, and Hult, 2000). Currently, there are few domestic articles that specifically address the relationship between service quality and customer loyalty in the mobile communication industry. Moreover, research has studied customer loyalty as a whole variable, rather than breaking it down into multiple dimensions. Based on this analysis, this paper summarizes previous study results, establishes an effect mechanism model among service quality, customer satisfaction, and customer loyalty in the mobile communication industry, and presents a statistical test of the model assumptions using customer survey data from Heilongjiang Mobile Company. It provides theoretical guidance for mobile service management based on the discussion of the hypothesis test results. For data collection, the sample comprised mobile users in Harbin city, and the survey was conducted by random sampling. Out of a total of 300 questionnaires, 276 (92.9%) were recovered. After excluding invalid questionnaires, 249 remained, for an effective rate of 82.6 percent. Cronbach's ${\alpha}$ coefficient was adopted to assess scale reliability, and validity testing was conducted on the questionnaire from three aspects: content validity, construct validity, and convergent validity. The study tested goodness of fit mainly with absolute and relative fit indexes. From the hypothesis testing results, four assumptions overall were not supported. The final effect relationships among service quality, customer satisfaction, and customer loyalty are demonstrated in Figure 2. On the whole, the service quality of the communication industry not only has a direct positive significant effect on customer loyalty, it also has an indirect positive significant effect on customer loyalty through customer satisfaction; the effect mechanism and its extent on customer loyalty differ and are influenced by each dimension of service quality. This study used questionnaires from the existing literature at home and abroad and tested them in empirical research, with all questions adapted to seven-point Likert scales. With the SERVQUAL scale of Parasuraman, Zeithaml, and Berry (1988), or PZB, as a reference point, service quality was divided into five dimensions (tangibility, reliability, responsiveness, assurance, and empathy), and the questions were simplified down to nineteen. The measurement of customer satisfaction was based mainly on Fornell (1992) and Wang and Han (2003), ending up with four questions. Three indicators, price tolerance, first choice, and complaint reaction, were used to measure attitudinal loyalty, while repurchase intention, recommendation, and reputation measured behavioral loyalty.
The collection and collation of literature data produced a model of the relationship among service quality, customer satisfaction, and customer loyalty in mobile communications, and China Mobile in the city of Harbin in Heilongjiang province was used for an empirical test of the model, yielding some useful conclusions. First, service quality in mobile communication is formed by the five factors mentioned earlier: tangibility, reliability, responsiveness, assurance, and empathy. On the basis of PZB's SERVQUAL, the study designed a measurement scale of service quality for the mobile communications industry and obtained these five factors through exploratory factor analysis. The factors fit basically with the five elements, supporting the five-element concept of service quality for the mobile communications industry. Second, service quality in mobile communications has both direct and indirect positive effects on attitudinal loyalty, with the indirect effect being produced through the intermediary variable, customer satisfaction. There are also both direct and indirect positive effects on behavioral loyalty, with the indirect effect produced through two intermediary variables: customer satisfaction and attitudinal loyalty. This shows that better service quality and higher customer satisfaction make customers' attitudes toward service providers more positive and make it easier for them to show loyalty. In addition, the effect mechanism of each dimension of service quality on each dimension of customer loyalty is different. Third, customer satisfaction plays a significant intermediary role between service quality and attitudinal and behavioral loyalty, indicating that improving service quality can boost customer satisfaction and make it easier for satisfied customers to become loyal customers. Moreover, attitudinal loyalty plays a significant intermediary role between service quality and behavioral loyalty, indicating that only attitudinally and behaviorally loyal customers are truly loyal customers. The research conclusions have implications for Chinese telecom operators and others seeking to upgrade their service quality. Two limitations of the study are also mentioned. First, all data were collected in the Heilongjiang area, so there might be a common method bias that skews the results. Second, the discussion addresses the relationship between service quality and customer loyalty, setting customer satisfaction as mediator, but does not consider other factors, such as customer value and consumer features. This research will be continued in the future.
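The mediation structure described above, in which service quality affects loyalty both directly and through customer satisfaction, can be illustrated with a simple regression-based mediation check. The sketch below is illustrative only: the data file and column names are hypothetical, and the original study used structural equation modeling on survey data rather than this simplified approach.

```python
# Minimal sketch of a regression-based mediation check (Baron & Kenny style),
# illustrating the path "service quality -> customer satisfaction -> loyalty".
# Hypothetical column names and data file; the original study used SEM.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # hypothetical file of Likert-scale scores

# Step 1: total effect of service quality on attitudinal loyalty (c)
total = smf.ols("attitudinal_loyalty ~ service_quality", data=df).fit()

# Step 2: effect of service quality on the mediator (a path)
a_path = smf.ols("satisfaction ~ service_quality", data=df).fit()

# Step 3: direct and mediated effects estimated jointly (c' and b)
direct = smf.ols("attitudinal_loyalty ~ service_quality + satisfaction", data=df).fit()

print(total.params["service_quality"],   # total effect (c)
      a_path.params["service_quality"],  # a path
      direct.params["satisfaction"],     # b path
      direct.params["service_quality"])  # direct effect (c')
# If c' is noticeably smaller than c while a and b are significant,
# satisfaction partially mediates the quality -> loyalty relationship.
```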

Twitter Issue Tracking System by Topic Modeling Techniques (토픽 모델링을 이용한 트위터 이슈 트래킹 시스템)

  • Bae, Jung-Hwan;Han, Nam-Gi;Song, Min
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.2
    • /
    • pp.109-122
    • /
    • 2014
  • People nowadays create a tremendous amount of data on Social Network Services (SNS). In particular, the incorporation of SNS into mobile devices has resulted in massive amounts of data generation, thereby greatly influencing society. This is an unmatched phenomenon in history, and we now live in the Age of Big Data. SNS data satisfies the conditions of Big Data: the amount of data (volume), data input and output speed (velocity), and the variety of data types (variety). If someone intends to discover the trend of an issue in SNS Big Data, this information can be used as an important new source for the creation of new value, because it covers the whole of society. In this study, a Twitter Issue Tracking System (TITS) is designed and established to meet the need to analyze SNS Big Data. TITS extracts issues from Twitter texts and visualizes them on the web. The proposed system provides the following four functions: (1) provide the topic keyword set that corresponds to the daily ranking; (2) visualize the daily time-series graph of a topic for the duration of a month; (3) provide the importance of a topic through a treemap based on the score system and frequency; (4) visualize the daily time-series graph of keywords by searching for a keyword. The present study analyzes the Big Data generated by SNS in real time. SNS Big Data analysis requires various natural language processing techniques, including the removal of stop words and noun extraction, for processing various unrefined forms of unstructured data. In addition, such analysis requires the latest big data technology to rapidly process a large amount of real-time data, such as the Hadoop distributed system or NoSQL, which is an alternative to relational databases. We built TITS on Hadoop to optimize the processing of big data, because Hadoop is designed to scale from single-node computing to thousands of machines. Furthermore, we use MongoDB, which is classified as a NoSQL database. MongoDB is an open source, document-oriented database that provides high performance, high availability, and automatic scaling. Unlike existing relational databases, MongoDB has no schemas or tables, and its most important goals are data accessibility and data processing performance. In the Age of Big Data, visualization is attractive to the Big Data community because it helps analysts to examine such data easily and clearly. Therefore, TITS uses the d3.js library as a visualization tool. This library is designed for creating Data-Driven Documents that bind the document object model (DOM) to data; interaction with the data is easy, and it is useful for managing real-time data streams with smooth animation. In addition, TITS uses Bootstrap, a set of pre-configured style sheets and JavaScript plug-ins, to build the web system. The TITS Graphical User Interface (GUI) is designed using these libraries, and it is capable of detecting issues on Twitter in an easy and intuitive manner. The proposed work demonstrates the superiority of our issue detection techniques by matching detected issues with corresponding online news articles. The contributions of the present study are threefold. First, we suggest an alternative approach to real-time big data analysis, which has become an extremely important issue. Second, we apply a topic modeling technique that is used in various research areas, including Library and Information Science (LIS). Based on this, we confirm the utility of storytelling and time series analysis. Third, we develop a web-based system and make it available for the real-time discovery of topics. The present study conducted experiments with nearly 150 million tweets in Korea during March 2013.
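As a rough illustration of the topic modeling step described above, the sketch below extracts topic keyword sets from tweet-like text with LDA. It is not the authors' implementation: the sample tweets, tokenizer, stop-word list, and parameter values are placeholders, and the original system preprocesses Korean text and runs on Hadoop and MongoDB.

```python
# Minimal LDA topic-extraction sketch for tweet text (illustrative only).
from gensim import corpora, models

tweets = [
    "big data analysis on social network services",
    "topic modeling finds issues in twitter streams",
    "hadoop and mongodb store real time tweet data",
]

stop_words = {"on", "in", "and", "the", "a"}
docs = [[w for w in t.lower().split() if w not in stop_words] for t in tweets]

dictionary = corpora.Dictionary(docs)            # word <-> id mapping
corpus = [dictionary.doc2bow(d) for d in docs]   # bag-of-words vectors

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)

# Topic keyword sets, analogous to the daily ranking view in TITS
for topic_id, keywords in lda.show_topics(num_topics=2, num_words=5, formatted=False):
    print(topic_id, [w for w, _ in keywords])
```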

A Study on the Liturgical Vestments of Catholic-With reference to the Liturgical Vestments Firm of Paderborn and kevelaer in Germany (카톨릭교 전례복에 관한 연구-독일 Paderborn 과 kevelaer의 전례복 회사를 중심으로)

  • Yang, Ri-Na
    • The Journal of Natural Sciences
    • /
    • v.7
    • /
    • pp.133-162
    • /
    • 1995
  • Paderborn's companies, Wameling and Cassau, produce liturgical vestments of great traditional artistic merit. Kevelaerer Fahnen + Paramenten GmbH, located in Kevelaer, a place of pilgrimage of the Virgin Mary, has been known in Europe, Africa, America and the Scandinavian Peninsula as the "hidden company" among makers of liturgical vestments. Paderborn and Kevelaer were centers of the religious world and of Catholic ceremony for a good few centuries. The Catholic liturgical vestments of these three companies use versatile designs, colors, shapes and techniques. They carry not only religious symbolism but also meet expectations for the use of modern textile art, art clothing and a wide range of design. To believers, they convey an understanding of the symbolic meanings and harmony of the liturgical vestments; to non-believers, they influence thinking and foster religious belief through recognition of and interest in religious art. Liturgical vestments are the clothes that churchmen wear at all ceremonial functions, such as a mass, a sacrament, a performance or a procession, according to the rules of the church. They silently represent the "Holy God" and distinguish churchmen from common people. They also express the status and dignity of churchmen and induce majesty and respect toward them. The common clothes of early Greece and Rome developed into Christian clothes with a religious character. There were no special uniforms distinguishing churchmen from common people until Christianity was officially recognized by the Roman Emperor Constantinus in A.D. 313. The color of liturgical vestments was originally white and was changed to specific colors according to the liturgical day and season by Pope Innocentius in the 12th century. The colors and symbolic meanings of present-day liturgical vestments originate with Pope St. Pius (1566-1572). Wool and linen were used as materials and decorations in the beginning, special materials such as silk were used after the 4th century, and beautiful materials made of gold thread were used from the 12th century. No critical changes to liturgical vestments are expected in the future. Their development will continue slowly under the guidance of the conservative church and will move toward simpler and more convenient forms according to culture, the trend of the times and the fashion of clothes. The companies develop versatile designs, embroidery techniques and creative design to distinguish their vestments from one another and to make artistic progress. Cooperation among the companies, artists and the church will secure a bright future for these three companies. We expect that our country will become a famous producing center of liturgical vestments through the research and development of companies, the participation of artists in religious arts and the concern of the church.


Measuring Consumer-Brand Relationship Quality (소비자-브랜드 관계 품질 측정에 관한 연구)

  • Kang, Myung-Soo;Kim, Byoung-Jai;Shin, Jong-Chil
    • Journal of Global Scholars of Marketing Science
    • /
    • v.17 no.2
    • /
    • pp.111-131
    • /
    • 2007
  • As a brand becomes a core asset in creating a corporation's value, brand marketing has become one of the core strategies that corporations pursue. Recently, customer relationship management has come to center on the brand rather than on the mere possession and consumption of goods, and brand-centered management has developed accordingly. The main reason for the increased interest in the relationship between the brand and the consumer is the acquisition of individual consumers and the development of relationships with them. Along with the development of such relationships, a corporation is able to establish long-term relationships, which have become a competitive advantage for the corporation, and all of these processes have become strategic assets of corporations. The growing importance of and interest in brands has also become a major academic issue. Brand equity, brand extension, brand identity, brand relationship, and brand community are the results derived from this interest. More specifically, in marketing, the study of brands has led to the study of factors related to building powerful brands and of the brand-building process. Recently, studies have concentrated primarily on the consumer-brand relationship. The reason is that brand loyalty cannot explain the dynamic quality aspects of loyalty, the consumer-brand relationship building process, and especially the interactions between brands and consumers. In studies of the consumer-brand relationship, a brand is not limited to an object of possession or consumption but is instead conceptualized as a partner. Most past studies concentrated on the results of qualitative analysis of the consumer-brand relationship to show the depth and width of its performance; studies in Korea have been the same. Recently, studies of the consumer-brand relationship have started to concentrate on quantitative rather than qualitative analysis, or even go further with quantitative analysis to identify factors affecting the consumer-brand relationship. Studies with new quantitative approaches show the possibility of using the results as a new way of viewing the consumer-brand relationship and of applying these new concepts in marketing. Quantitative studies of the consumer-brand relationship already exist, but none of them provide theoretical proof that the sub-dimensions of the consumer-brand relationship can be measured as a single construct. In other words, most studies simply add up or average out the sub-dimensions of the consumer-brand relationship. However, such an approach presupposes that the sub-dimensions form an identical construct. Therefore, most past studies do not meet the condition that the sub-dimensions form a one-dimensional construct, which calls their validity into question and reveals their limits. The main purpose of this paper is to overcome the limits of past studies by treating the sub-dimensions examined in previous research as a one-dimensional construct (Narver & Slater, 1990; Cronin & Taylor, 1992; Chang & Chen, 1998). In this study, two arbitrary groups were formed to evaluate the reliability of the measurements, and reliability analyses were performed on each group. For convergent validity, correlations, Cronbach's ${\alpha}$, and a one-factor solution exploratory factor analysis were used. For discriminant validity, the correlation of the consumer-brand relationship was compared with that of involvement, a concept similar to the consumer-brand relationship.
Dependent correlations were also examined following Cohen and Cohen (1975, p. 35), and the results showed that involvement is a construct distinct from the six sub-dimensions of the consumer-brand relationship. Through these results, we were able to conclude that the sub-dimensions of the consumer-brand relationship can be viewed as a one-dimensional construct. This means that the one-dimensional construct of the consumer-brand relationship can be used with reliability and validity. The result of this research is theoretically meaningful in that it treats the consumer-brand relationship as a one-dimensional construct and provides a basis for the methodologies used previously. This research also opens the possibility of new research on the consumer-brand relationship in that it establishes that a one-dimensional construct of the consumer-brand relationship can be operationalized. Previous research classified the consumer-brand relationship into several types on the basis of its components, and a number of studies were performed with priority given to those types. However, as a one-dimensional construct can now be operationalized, it is expected that various studies will make practical use of the level or strength of the consumer-brand relationship, rather than focusing on separate relationship types. Additionally, this provides a theoretical basis for manipulating the consumer-brand relationship as a one-dimensional construct, and studies using it as a dependent variable, parameter, or mediator are anticipated.
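As a small illustration of the reliability step mentioned above, Cronbach's alpha for a set of scale items can be computed as follows. The item scores and the number of items are hypothetical placeholders; the original study additionally ran exploratory factor analysis and correlation-based validity tests.

```python
# Minimal sketch: Cronbach's alpha for a multi-item scale (illustrative only;
# the responses below are hypothetical Likert-scale scores, not study data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses to a 4-item sub-dimension (rows = respondents)
scores = np.array([
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [2, 1, 2, 2],
    [5, 5, 4, 5],
])
print(round(cronbach_alpha(scores), 3))  # values above ~0.7 are usually deemed acceptable
```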


The Study on the Debris Slope Landform in the Southern Taebaek Mountains (태백산맥 남부산지의 암설사면지형)

  • Jeon, Young-Gweon
    • Journal of the Korean Geographical Society
    • /
    • v.28 no.2
    • /
    • pp.77-98
    • /
    • 1993
  • The intent of this study is to analyze the characteristics of the distribution, pattern, and deposits of the exposed debris slope landforms in the southern Taebaek Mountains by aerial photograph interpretation, measurement on topographical maps, and field surveys. It also aims to examine the arrangement types of mountain slopes and the landform development of debris slopes in this area. The main observations can be summed up as follows. 1. Distribution characteristics: 1) From the viewpoint of bedrock, the distribution density of talus is high where the bedrock has a high density of joints, sheeting structures and hard rock, whereas that of block streams is high on intrusive rocks with a talus line. 2) From the viewpoint of distribution altitude, talus is mainly distributed between 301 and 500 meters above sea level, while block streams are distributed between 101 and 300 meters. 3) From the viewpoint of slope orientation, the distribution density of talus on slopes facing south (S, SE, SW) is a little higher than on slopes facing north (N, NE, NW). 2. Pattern characteristics: 1) The tongue-shaped type is the most common of the four types. 2) The average length of talus slopes is 99 meters; talus composed of hornfels or granodiorite is longer, because the former easily forms a free face and the latter easily produces rounded stones. The average length of block stream slopes is 145 meters, and the longest is one km (granodiorite). 3) The gradient of talus slopes is 20~45${^\circ}$, mostly 26~30${^\circ}$; talus composed of intrusive rocks is gentler. 4) The slope profile of talus is concave, which indicates readjustment of the constituent debris. Some block stream slope profiles are concave at the upper and lower slope but convex at the middle slope; others are uneven. 3. Deposit characteristics: 1) The average diameter of the constituent debris is 48~172 centimeters, and the sorting of the debris is fair, without matrix. The debris of block streams is larger than that of talus; this difference in average diameter is fundamentally caused by the joint spacing of the bedrock. 2) The shape of the constituent debris in talus is mainly angular, but debris composed of intrusive rocks is sub-angular. The shape of the constituent debris in block streams is mainly sub-rounded. 3) In talus, debris diameter generally increases downslope, but some cases are disordered, and the debris diameter at the sides is larger than at the middle of the landform surface. In block streams, debris diameter variation is vertically disordered, and the debris diameter of the middle part is generally larger than that of the sides. 4) The long-axis orientation of the debris is fair at the lower part of the slope in talus (only 2 of 6 taluses). In block streams (2 of 3), one shows good sorting and the other fair sorting; the researcher thinks the latter was caused by collapse of the constituent debris. 5) Most debris is weathered, and some has been weathered a second time in situ, but talus composed of fresh debris is also developing. 4. The landform development of debris slopes and the arrangement types of mountain slopes: 1) The formation and development of talus is divided into two periods: a formation period (the last glacial period) and an adjustment period (postglacial age). That of block streams is divided into three periods: a production period of blocks (Tertiary, interglacial period), a formation period of the block stream (the last glacial period), and an adjustment period of the block stream (postglacial age). 2) The arrangement types of mountain slopes in this research area are divided into six types: Type I: high-level convex slope-free face-talus-block stream-alluvial surface; Type II: high-level convex slope-free face-talus-alluvial surface; Type III: free face-talus-block stream-alluvial surface; Type IV: free face-talus-alluvial surface; Type V: talus-alluvial surface; Type VI: block stream-alluvial surface. In particular, Type IV is the basic type; the others are modified ones.


Machine learning-based corporate default risk prediction model verification and policy recommendation: Focusing on improvement through stacking ensemble model (머신러닝 기반 기업부도위험 예측모델 검증 및 정책적 제언: 스태킹 앙상블 모델을 통한 개선을 중심으로)

  • Eom, Haneul;Kim, Jaeseong;Choi, Sangok
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.2
    • /
    • pp.105-129
    • /
    • 2020
  • This study uses corporate data from 2012 to 2018, when K-IFRS was applied in earnest, to predict default risk. The data used in the analysis totaled 10,545 rows and 160 columns, including 38 from the statement of financial position, 26 from the statement of comprehensive income, 11 from the statement of cash flows, and 76 financial ratio indexes. Unlike most prior studies, which used the default event as the basis for learning about default risk, this study calculated default risk using the market capitalization and stock price volatility of each company based on the Merton model. Through this, it was able to solve the problem of data imbalance due to the scarcity of default events, which had been pointed out as a limitation of the existing methodology, and to reflect the differences in default risk that exist among ordinary companies. Because learning was conducted using only corporate information available for unlisted companies, the default risk of unlisted companies without stock price information can be appropriately derived. This makes it possible to provide stable default risk assessment services to unlisted companies, such as small and medium-sized companies and startups, for which it is difficult to determine proper default risk with traditional credit rating models. Although machine learning-based prediction of corporate default risk has been studied actively in recent years, model bias issues exist because most studies make predictions based on a single model. A stable and reliable valuation methodology is required for the calculation of default risk, given that an entity's default risk information is very widely used in the market and sensitivity to differences in default risk is high. Strict standards are also required for the calculation methods. The credit rating method stipulated by the Financial Services Commission in the Financial Investment Regulations calls for the preparation of evaluation methods, including verification of their adequacy, in consideration of past statistical data and experience with credit ratings and changes in future market conditions. This study reduced the bias of individual models by utilizing stacking ensemble techniques that synthesize various machine learning models. This allows us to capture complex nonlinear relationships between default risk and various corporate information and to maximize the advantages of machine learning-based default risk prediction models, which take less time to calculate. To generate the sub-model forecasts used as input data for the stacking ensemble model, the training data were divided into seven pieces, and the sub-models were trained on the divided sets to produce forecasts. To compare the predictive power of the stacking ensemble model, Random Forest, MLP, and CNN models were trained with the full training data, and the predictive power of each model was then verified on the test set. The analysis showed that the stacking ensemble model exceeded the predictive power of the Random Forest model, which had the best performance among the single models. Next, to check for statistically significant differences between the stacking ensemble model and each individual model's forecasts, pairs between the stacking ensemble model and each individual model were constructed.
Because the Shapiro-Wilk normality test showed that none of the pairs followed a normal distribution, the nonparametric Wilcoxon rank sum test was used to check whether the two model forecasts making up each pair showed statistically significant differences. The analysis showed that the forecasts of the stacking ensemble model differed statistically significantly from those of the MLP model and the CNN model. In addition, this study provides a methodology that allows existing credit rating agencies to apply machine learning-based bankruptcy risk prediction methodologies, given that traditional credit rating models can also be included as sub-models when calculating the final default probability. The stacking ensemble techniques proposed in this study can also help design models that meet the requirements of the Financial Investment Business Regulations through the combination of various sub-models. We hope that this research will be used as a resource to increase practical use by overcoming and improving the limitations of existing machine learning-based models.
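As a rough illustration of the stacking idea described above, the sketch below combines several base learners through a meta-model trained on cross-validated (out-of-fold) predictions. It is not the authors' pipeline: the features, target, and model choices are placeholders, and the original study used seven training splits and a Merton-model-based default risk target.

```python
# Minimal stacking-ensemble sketch (illustrative only; the paper's pipeline uses
# financial-statement features and a Merton-based default risk target).
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                                # placeholder features
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=500)    # placeholder risk target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
        ("mlp", MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)),
    ],
    final_estimator=Ridge(),  # meta-model combining the sub-model forecasts
    cv=5,                     # out-of-fold sub-model predictions reduce single-model bias
)
stack.fit(X_tr, y_tr)
print("test R^2:", round(stack.score(X_te, y_te), 3))
```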

Research on Perfusion CT in Rabbit Brain Tumor Model (토끼 뇌종양 모델에서의 관류 CT 영상에 관한 연구)

  • Ha, Bon-Chul;Kwak, Byung-Kook;Jung, Ji-Sung;Lim, Cheong-Hwan;Jung, Hong-Ryang
    • Journal of radiological science and technology
    • /
    • v.35 no.2
    • /
    • pp.165-172
    • /
    • 2012
  • We investigated the vascular characteristics of tumors and normal tissue using perfusion CT in a rabbit brain tumor model. A VX2 carcinoma suspension at a concentration of $1{\times}10^7$ cells/ml (0.1 ml) was implanted in the brains of nine New Zealand white rabbits (weight: 2.4 kg-3.0 kg, mean: 2.6 kg). Perfusion CT was performed when the tumors had grown to 5 mm. The tumor volume and perfusion values were quantitatively analyzed using a commercial workstation (Advantage Windows workstation, AW, version 4.2, GE, USA). The mean volume of the implanted tumors was $316{\pm}181mm^3$, and the largest and smallest tumor volumes were 497 $mm^3$ and 195 $mm^3$, respectively. All the implanted tumors were single-nodular, and no intracranial metastasis was observed. In the perfusion CT, cerebral blood volume (CBV) was $74.40{\pm}9.63$, $16.08{\pm}0.64$, and $15.24{\pm}3.23$ ml/100g in the tumor core, ipsilateral normal brain, and contralateral normal brain, respectively ($p{\leqq}0.05$). In cerebral blood flow (CBF), there were significant differences between the tumor core and both normal brains ($p{\leqq}0.05$), but no significant difference between the ipsilateral and contralateral normal brains ($962.91{\pm}75.96$ vs. $357.82{\pm}12.82$ vs. $323.19{\pm}83.24$ ml/100g/min). In mean transit time (MTT), there were significant differences between the tumor core and both normal brains ($p{\leqq}0.05$), but no significant difference between the ipsilateral and contralateral normal brains ($4.37{\pm}0.19$ vs. $3.02{\pm}0.41$ vs. $2.86{\pm}0.22$ sec). In permeability surface (PS), there were significant differences among the tumor core, ipsilateral, and contralateral normal brains ($47.23{\pm}25.45$ vs. $14.54{\pm}1.60$ vs. $6.81{\pm}4.20$ ml/100g/min) ($p{\leqq}0.05$). In time to peak (TTP), there were no significant differences among the tumor core, ipsilateral, and contralateral normal brains. In positive enhancement integral (PEI), there were significant differences among the tumor core, ipsilateral, and contralateral brains ($61.56{\pm}16.07$ vs. $12.58{\pm}2.61$ vs. $8.26{\pm}5.55$ ml/100g) ($p{\leqq}0.05$). In maximum slope of increase (MSI), there were significant differences between the tumor core and both normal brains ($p{\leqq}0.05$), but no significant difference between the ipsilateral and contralateral normal brains ($13.18{\pm}2.81$ vs. $6.99{\pm}1.73$ vs. $6.41{\pm}1.39$ HU/sec). Additionally, in maximum slope of decrease (MSD), there was a significant difference between the tumor core and the contralateral normal brain ($p{\leqq}0.05$), but no significant difference between the tumor core and the ipsilateral normal brain ($4.02{\pm}1.37$ vs. $4.66{\pm}0.83$ vs. $6.47{\pm}1.53$ HU/sec). In conclusion, VX2 tumors were successfully implanted in the rabbit brain, and the stereotactic inoculation method produced single-nodular tumors without intracranial metastasis, suitable for comparative studies between tumors and normal tissue. Therefore, perfusion CT would be a useful diagnostic tool capable of reflecting the vascularity of tumors.
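For reference, the CBV, CBF, and MTT values reported above are linked by the central volume principle (CBF = CBV / MTT), a standard relation in perfusion imaging; the abstract does not state which deconvolution method was used, so the check below is only a rough consistency sketch on the reported mean values, assuming the units given above.

```python
# Quick consistency check with the central volume principle, CBF = CBV / MTT
# (standard perfusion relation; inputs are the reported mean values).
cbv_tumor, mtt_tumor = 74.40, 4.37 / 60.0      # ml/100g, min (4.37 s)
cbv_normal, mtt_normal = 15.24, 2.86 / 60.0    # contralateral normal brain

print(round(cbv_tumor / mtt_tumor, 1))    # ~1021 ml/100g/min vs. reported 962.9
print(round(cbv_normal / mtt_normal, 1))  # ~319.7 ml/100g/min vs. reported 323.2
```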

A Study on the 'Zhe Zhong Pai'(折衷派) of the Traditional Medicine of Japan (일본(日本) 의학醫學의 '절충파(折衷派)'에 관(關)한 연구(硏究))

  • Park, Hyun-Kuk;Kim, Ki-Wook
    • Journal of Korean Medical classics
    • /
    • v.20 no.3
    • /
    • pp.121-141
    • /
    • 2007
  • The outline and characteristics of the important doctors of the 'Zhe Zhong Pai'(折衷派) are as follows. Part 1. In the late Edo(江戶) period, the 'Zhe Zhong Pai' appeared; it tried to take the theory and clinical treatment of the 'Hou Shi Pai'(後世派) and the 'Gu Fang Pai'(古方派) and combine their strong points to make treatment complete. Their main point was 'the old methods are the main part; the later prescriptions are to be used'(以古法爲主, 後世方爲用), and the "Shang Han Lun(傷寒論)" was revered for its treatments, but in actual practice they did not stop there. As mentioned above, the 'Zhe Zhong Pai' viewed treatment as the base, which was the view of most doctors in the Edo period. However, the reason the 'Zhe Zhong Pai' is not valued as much as the 'Gu Fang Pai' by medical history books in Japan is that the 'Zhe Zhong Pai' does not have the substantiation or uniqueness of the 'Gu Fang Pai', and also that its view of 'gather as well as store up' was the same as that of the 'Kao Zheng Pai'. Moreover, the 'compromise'(折衷) point of view came from taking in both Chinese and Western medical knowledge systems(漢蘭折衷). Generally, the pioneer of the 'Zhe Zhong Pai' is seen as Mochizuki Rokumon(望月鹿門), followed by Fukui Futei(福井楓亭), Wada Tokaku(和田東郭), Yamada Seichin(山田正珍) and Taki Motohiro(多紀元簡). Part 2. The lives of Wada Tokaku(和田東郭), Nakagame Kinkei(中神琴溪) and Nei Teng Xi Zhe(內藤希哲), the important doctors of the 'Zhe Zhong Pai', are as follows. First, Wada Tokaku(和田東郭, 1743-1803) was born when the 'Hou Shi Pai' was already declining and the 'Gu Fang Pai' was flourishing, and learned medicine from a 'Hou Shi Pai' doctor, Hu Tian Xu Shan(戶田旭山), and a 'Gu Fang Pai' doctor, Yoshimasu Todo(吉益東洞). He was not hindered by 'the old ways'(古方) and did not lean towards 'the new ways'(後世方); he formed a way of compromise that 'looked at hardness and softness as the same'(剛柔相摩) by setting 'the cure of the disease' as the base, saying that to cure diseases 'the old way' must be used, but 'the new way' is necessary to supplement its shortcomings. His works include "Dao Shui Suo Yan", "Jiao Chiang Fang Yi Je" and "Yi Xue Sho(醫學說)". Second, Nakagame Kinkei(中神琴溪, 1744-1833) was famous for leaving Yoshimasu Todo(吉益東洞) and changing to the 'Zhe Zhong Pai'; in his early years he used qing fen(輕粉) to cure geisha(妓女) of syphilis. His arguments were that "the "Shang Han Lun" must be revered but needs to be adapted", that "Zhongjing can be made into a follower, but I cannot become his follower", and that "later medical texts such as the "Ru Men Shi Qin(儒門事親)" should be used only for their prescriptions and not their theories". His works include the "Shang Han Lun Yue Yan(傷寒論約言)". Third, Nei Teng Xi Zhe(內藤希哲, 1701-1735) learned medicine from Qing Shui Xian Sheng(淸水先生) and went out to Edo. In his book "Yi Jing Jie Huo Lun(醫經解惑論)" he tells, in 'the six skepticisms'(六惑), of how he went from 'learning'(學) to 'skepticism'(惑) and how skepticism made him learn. In his later years Xi Zhe(希哲) combined the "Shen Nong Ben Cao Jing(神農本草經)", the main text of herbal medicine, the "Ming Tang Jing(明堂經)" of acupuncture, the basic theory texts "Huang Di Nei Jing(黃帝內經)" and "Nan Jing(難經)", and the "Shang Han Za Bing Lun", a book the 'Gu Fang Pai' saw as opposed to the rest, and became 'an expert of the five scriptures'(五經一貫). Part 3.
Asada Showhaku(淺田宗伯, 1815-1894) started medicine under Zhong Cun Zhong(中村中倧), learned 'the old way'(古方) of Yoshimasu Todo, gained experience through Chuan Yue(川越) and Fu Jing(福井), and received teachings in the classics, history and Wang Yangming's principles(陽明學) from famous teachers. Showhaku(宗伯) met a medical official of the bakufu(幕府), Ben Kang Zong Yuan(本康宗圓), and received help from the three great doctors of the Edo period, Taki Motokato(多紀元堅), Xiao Dao Xue Gu(小島學古) and Xi Duo Cun Kao Chuang, further developing his arts. At 47 he diagnosed the shogun Jia Mao(家茂) with 'heart failure from beriberi'(脚氣衝心) and became a Zheng Shi(徵I); at 51 he cured a minister from France and received a present from Napoleon; at 65 he became court physician, saved Ming Gong(明宮) Jia Ren Qin Wang(嘉仁親王, later the 大正天皇) from bodily convulsions, and became 'the vassal of merit who saved the national polity(國體)'. In the 7th year of Meiji(明治) he became the second owner of Wen Zhi She(溫知社) and took part in the 'kampo continuation movement'. In his later years he saw 14,000 patients a year, from which we can estimate the quality and quantity of his clinical skills. Showhaku(宗伯) wrote over 80 books, including the "Ju Chuang Shu Ying(橘窓書影)", "Wu Wu Yao Shi Fang Han(勿誤藥室方函)", "Shang Han Biang Shu(傷寒辨術)", "Jing Qi Shen Lun(精氣神論)", "Huang Guo Ming Yi Chuan(皇國名醫傳)" and the "Xian Jhe Yi Hua(先哲醫話)". In particular, in the "Ju Chuang Shu Ying(橘窓書影)" he says "the old methods are the main part, and the later prescriptions are to be used"(以古法爲主, 後世方爲用), stating the 'Zhe Zhong Pai' way of thinking. In the first volumes of the "Shang Han Biang Shu(傷寒辨術)" and "Za Bing Lun Shi(雜病論識)", in the 'Zong Ping'(總評), he discerns the parts that are not Zhang Zhong Jing's writings and emphasizes his own theories and their practical uses.


Robo-Advisor Algorithm with Intelligent View Model (지능형 전망모형을 결합한 로보어드바이저 알고리즘)

  • Kim, Sunwoong
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.39-55
    • /
    • 2019
  • Recently, banks and large financial institutions have introduced many Robo-Advisor products. A Robo-Advisor is a robot that produces an optimal asset allocation portfolio for investors by using financial engineering algorithms without any human intervention. Since the first introduction on Wall Street in 2008, the market has grown to 60 billion dollars and is expected to expand to 2,000 billion dollars by 2020. Since Robo-Advisor algorithms suggest asset allocation output to investors, mathematical or statistical asset allocation strategies are applied. The mean-variance optimization model developed by Markowitz is the typical asset allocation model. The model is a simple but quite intuitive portfolio strategy: assets are allocated so as to minimize portfolio risk while maximizing the expected portfolio return using optimization techniques. Despite its theoretical background, both academics and practitioners find that the standard mean-variance optimization portfolio is very sensitive to expected returns calculated from past price data, and corner solutions, in which allocations concentrate in only a few assets, are often found. The Black-Litterman optimization model overcomes these problems by choosing a neutral Capital Asset Pricing Model equilibrium point. Implied equilibrium returns of each asset are derived from the equilibrium market portfolio through reverse optimization. The Black-Litterman model uses a Bayesian approach to combine subjective views on the price forecasts of one or more assets with the implied equilibrium returns, resulting in new estimates of risk and expected returns. These new estimates can produce an optimal portfolio through the well-known Markowitz mean-variance optimization algorithm. If the investor does not have any views on the asset classes, the Black-Litterman optimization model produces the same portfolio as the market portfolio. What if the subjective views are incorrect? Surveys of the performance of stocks recommended by securities analysts show very poor results. Therefore, incorrect views combined with implied equilibrium returns may produce very poor portfolio output for Black-Litterman model users. This paper suggests an objective investor view model based on Support Vector Machines (SVM), which have shown good performance in stock price forecasting. An SVM is a discriminative classifier defined by a separating hyperplane. Linear, radial basis, and polynomial kernel functions are used to learn the hyperplanes. Input variables for the SVM are returns, standard deviations, Stochastics %K, and price parity degree for each asset class. The SVM output gives expected stock price movements and their probabilities, which are used as input variables in the intelligent view model. The stock price movements are categorized into three phases: down, neutral, and up. The expected stock returns form the P matrix, and their probability results are used in the Q matrix. The implied equilibrium returns vector is combined with the intelligent view matrix, resulting in the Black-Litterman optimal portfolio. For comparison, the Markowitz mean-variance optimization model and the risk parity model are used. The value-weighted market portfolio and the equal-weighted market portfolio are used as benchmark indexes. We collect 8 KOSPI 200 sector indexes from January 2008 to December 2018, comprising 132 monthly index values. The training period is from 2008 to 2015 and the testing period is from 2016 to 2018.
Our suggested intelligent view model, combined with implied equilibrium returns, produced the optimal Black-Litterman portfolio. The out-of-sample portfolio showed better performance than the well-known Markowitz mean-variance optimization portfolio, the risk parity portfolio, and the market portfolio. The total return of the 3-year Black-Litterman portfolio is 6.4%, the highest value, and its maximum drawdown is -20.8%, which is also the lowest value. The Sharpe ratio, which measures the return-to-risk ratio, is also the highest at 0.17. Overall, our suggested view model shows the possibility of replacing subjective analysts' views with an objective view model for practitioners applying Robo-Advisor asset allocation algorithms in real trading.
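As a rough sketch of the Black-Litterman combination step described above, the snippet below blends implied equilibrium returns with a single view and its uncertainty. The covariance matrix, market weights, view, and parameter values are hypothetical placeholders, not the paper's data or settings, and the paper derives its views from an SVM forecast model rather than specifying them by hand.

```python
# Minimal Black-Litterman sketch (illustrative only; covariance, weights, view,
# delta, and tau are hypothetical values, not the paper's KOSPI 200 data).
import numpy as np

np.set_printoptions(precision=4, suppress=True)

# Hypothetical covariance of 3 asset classes and their market-cap weights
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
w_mkt = np.array([0.5, 0.3, 0.2])
delta, tau = 2.5, 0.05                  # risk aversion, scaling factor

pi = delta * sigma @ w_mkt              # implied equilibrium returns (reverse optimization)

# One view: asset 2 outperforms asset 1 by 2% (P picks the assets, Q holds the view return)
P = np.array([[-1.0, 1.0, 0.0]])
Q = np.array([0.02])
omega = P @ (tau * sigma) @ P.T         # view uncertainty (simple proportional choice)

# Black-Litterman posterior expected returns
middle = np.linalg.inv(np.linalg.inv(tau * sigma) + P.T @ np.linalg.inv(omega) @ P)
mu_bl = middle @ (np.linalg.inv(tau * sigma) @ pi + P.T @ np.linalg.inv(omega) @ Q)

w_bl = np.linalg.inv(delta * sigma) @ mu_bl   # unconstrained mean-variance weights
print("posterior returns:", mu_bl, "weights:", w_bl / w_bl.sum())
```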