• Title/Summary/Keyword: Procedure Management


Establishment of A WebGIS-based Information System for Continuous Observation during Ocean Research Vessel Operation (WebGIS 기반 해양 연구선 상시관측 정보 체계 구축)

  • HAN, Hyeon-Gyeong;LEE, Cholyoung;KIM, Tae-Hoon;HAN, Jae-Rim;CHOI, Hyun-Woo
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.24 no.1
    • /
    • pp.40-53
    • /
    • 2021
  • Research vessels (R/Vs) used for ocean research move to a planned research area and perform ocean observations suited to the research purpose. The five research vessels of the Korea Institute of Ocean Science & Technology (KIOST) are equipped with instruments that can operate continuously during a cruise: a global positioning system (GPS) and equipment measuring water depth, weather, and sea-surface temperature and salinity. An information platform is required to systematically manage and utilize the data produced by such continuous observation equipment. Therefore, the data flow was defined through a series of business analyses ranging from the research vessel operation plan to observation during operation, data collection, data processing, data storage, display, and service. After creating a functional design for each stage of the business process, the KIOST Underway Meteorological & Oceanographic Information System (KUMOS), a Web-GIS (Web-based Geographic Information System) information platform, was built. Since the data produced during R/V cruises exhibit temporal and spatial variability, a quality management system was developed that takes these variabilities into account. For systematic management and service of the data, the KUMOS integrated database (DB) was established, and functions such as R/V tracking, data display, search, and provision were implemented. The dataset provided by KUMOS consists of a cruise report, raw data, quality-control (QC) flagged data, filtered data, cruise track line data, and a data report for each R/V cruise. The business processing procedures and the KUMOS system developed through this study are expected to serve as a benchmark for domestic ocean-related institutions and universities that operate research vessels capable of continuous observation during cruises.
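The range-check style of QC flagging applied to underway sensor data can be sketched as follows. This is a minimal illustration, not KUMOS's actual implementation; the flag values (1 = good, 4 = bad) follow common oceanographic QC conventions, and the thresholds and field names are assumptions.

```python
# Hypothetical gross-range quality-control check for underway sensor data,
# in the spirit of the QC flagging KUMOS applies to continuous observations.

def qc_flag(value, lower, upper):
    """Return 1 (good) if value lies within [lower, upper], else 4 (bad)."""
    return 1 if lower <= value <= upper else 4

def flag_records(records, bounds):
    """Attach a QC flag per parameter to each observation record."""
    flagged = []
    for rec in records:
        flags = {param: qc_flag(rec[param], lo, hi)
                 for param, (lo, hi) in bounds.items()}
        flagged.append({**rec, "flags": flags})
    return flagged

# Plausible bounds for sea-surface temperature (deg C) and salinity (PSU).
bounds = {"sst": (-2.0, 35.0), "salinity": (2.0, 41.0)}
records = [
    {"sst": 18.4, "salinity": 33.9},   # both within range
    {"sst": 57.1, "salinity": 34.1},   # spurious temperature spike
]
flagged = flag_records(records, bounds)
print(flagged[0]["flags"])  # {'sst': 1, 'salinity': 1}
print(flagged[1]["flags"])  # {'sst': 4, 'salinity': 1}
```

Flagged data of this kind can then be filtered downstream (the "filtered data" product in the KUMOS dataset) without discarding the raw observations.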

The analysis for attributes of OUV of the capital of Shilla Kingdom (세계유산 신라왕경의 탁월한 보편적 가치 속성 분석)

  • KIM, Euiyeon
    • Korean Journal of Heritage: History & Science
    • /
    • v.55 no.1
    • /
    • pp.151-174
    • /
    • 2022
  • According to the "Special Act on the Restoration and Maintenance of the Core Relics of the Shilla Kingdom" enacted in 2019, the Silla Wanggyeong (royal capital) refers to the capital of the Silla and Unified Silla periods: Gyeongju, where the king resided, and the nearby area. The Silla Wanggyeong was inscribed on the UNESCO World Heritage List in 2000 under the name Gyeongju Historic Areas, and belongs to the Wolseong, Hwangnyongsa, and Daeneungwon districts among the five inscribed districts. Unlike the Namsan and Sanseong districts, it is a heritage consisting mostly of archaeological sites without above-ground physical substance. Gyeongju City sought to promote local tourism and provide more direct experiences to visitors by restoring the heritage that constitutes the Silla Wanggyeong. Starting with the restoration of Woljeonggyo Bridge in 2005, the restoration project began in earnest. Gyeongju City then tried to restore the building site on the west side of Donggung Palace and Wolji, but the plan was canceled due to opposition from the UNESCO World Heritage Committee, which opposed the restoration and recommended a heritage impact assessment for similar projects in the future. The heritage impact assessment procedure includes an analysis of the Outstanding Universal Value (OUV) attributes of the heritage to be evaluated. This study preemptively derives OUV attributes for the Silla Wanggyeong through literature review and analysis of overseas cases. The literature review examined domestic and foreign research related to the UNESCO World Heritage Convention and world heritage management; the overseas cases analyzed were the Historic Centre of Kraków, Stonehenge and the Avebury megalithic sites in England, and the architectural works of Le Corbusier. Through this, the outstanding universal value attributes of the Silla Wanggyeong were derived.
This study is expected to serve as a reference for future restoration projects of other heritage constituting the Silla Wanggyeong and for construction plans in nearby areas, and as an indicator for managing the Silla Wanggyeong more efficiently from a world heritage perspective.

Early Result of Surgical Management of the Anomalous Origin of the Left Coronary Artery from the Pulmonary Artery (관상동맥-폐동맥 이상 기시증에 대한 수술의 조기 결과)

  • Yoon Yoo Sang;Park Jeong Jun;Yun Tae Jin;Kim Young Hwue;Ko Jae Kon;Park In Sook;Seo Dong Man
    • Journal of Chest Surgery
    • /
    • v.39 no.1 s.258
    • /
    • pp.18-27
    • /
    • 2006
  • Background: Anomalous origin of the left coronary artery from the pulmonary artery (ALCAPA) is a rare congenital anomaly, but it is one of the most common causes of myocardial ischemia and carries high mortality within the first year of life. We report our early results of surgical management in these patients. Material and Method: From June 1989 to July 2003, 6 patients with ALCAPA and one patient with ARCAPA (anomalous origin of the right coronary artery from the pulmonary artery) underwent surgical repair. We retrospectively reviewed all medical records, electrocardiograms, chest X-rays, and echocardiograms. Result: Three of the patients were boys and four were girls. The median age at operation was 5.4 months (range, 3~33 months), and the average body weight at operation was 6.7 kg (range, 3.7~11.3 kg). The mean follow-up period was 18 months. Only 3 patients were initially diagnosed with ALCAPA, and 3 patients had moderate mitral regurgitation. Immediate coronary artery reimplantation, aiming to restore a two-coronary circulation, was performed upon diagnosis. The average bypass time was 114±37 minutes, and the average aortic cross-clamping time was 55±22 minutes. The average intensive care unit stay was 5±3 days, the mean mechanical ventilation time was 38±45 hours, and the postoperative hospital stay was 12±5 days. There were significant improvements in the electrocardiograms and chest X-rays of all patients except the one late death.
Ventricular function recovered to nearly normal after the operation: the ejection fraction (EF) increased from 41.2±10.3% to 60.5±15.8% within 1 month and was 59.8±13.9% within 1 year after operation; the shortening fraction (SF) increased from 23.6±4.7% to 38.6±8.4% within 1 month and was 37.4±7.9% within 1 year; and the left ventricular end-diastolic dimension index (LVEDDI) decreased from 100.8±25.6 mm/m² to 90.3±19.2 mm/m² within 1 month and to 79.3±15.8 mm/m² within 1 year. Concomitant mitral repair was done in two patients with anterior mitral leaflet prolapse. In every patient, the mitral valve showed no more than mild regurgitation during follow-up. One late death occurred in a patient who underwent a Dor procedure 10 months after the initial operation for dilated cardiomyopathy. Conclusion: In the management of this rare and potentially fatal anomaly (ALCAPA), early suspicion and correct diagnosis are of utmost importance. Once the diagnosis is made, immediate restoration of a two-coronary system can yield a good outcome.

A Store Recommendation Procedure in Ubiquitous Market for User Privacy (U-마켓에서의 사용자 정보보호를 위한 매장 추천방법)

  • Kim, Jae-Kyeong;Chae, Kyung-Hee;Gu, Ja-Chul
    • Asia pacific journal of information systems
    • /
    • v.18 no.3
    • /
    • pp.123-145
    • /
    • 2008
  • Recently, as information and communication technology has developed, discussion of the ubiquitous environment has arisen from diverse perspectives. A ubiquitous environment is one in which data can be transferred through networks regardless of physical space, virtual space, time, or location. Realizing such an environment requires Pervasive Sensing technology, which recognizes users' data without a border between physical and virtual space. In addition, further technologies are necessary: Context-Awareness technology, to construct the context around the user by sharing the data accessed through Pervasive Sensing, and linkage technology, to prevent information loss across wired and wireless networks and databases. Pervasive Sensing in particular is regarded as an essential technology that enables user-oriented services by recognizing users' needs even before users ask. The technologies above give the ubiquitous environment many characteristics, such as ubiquity, abundance of data, mutuality, high information density, individualization, and customization. Among them, information density refers to the accessible amount and quality of information, which Pervasive Sensing stores in bulk with assured quality. Using this, companies can provide personalized content (or information) to a target customer. Above all, there is an increasing number of studies on recommender systems, which provide what customers need even when the customers do not explicitly express their needs. Recommender systems are well known for the positive effect of enlarging selling opportunities and reducing customers' search costs, since they find and provide information in advance according to customers' traits and preferences in a commerce environment.
Recommender systems have proved their usefulness through many methodologies and experiments across different fields since the mid-1990s. Most research on recommender systems to date has taken products or information in Internet or mobile contexts as its object, but there has been little research on recommending an adequate store to customers in a ubiquitous environment. In a ubiquitous environment it is possible to track customers' behaviors, the same way it is done in an online market space, even while customers are purchasing in an offline marketplace. Unlike the existing Internet space, in a ubiquitous environment there is growing interest in stores, with information provided along the traffic line of the customers. In other words, the same product can be purchased in several different stores, and the preferred store can differ among customers according to personal preferences such as the traffic line between stores, location, atmosphere, quality, and price. Krulwich (1997) developed Lifestyle Finder, which recommends a product and a store using demographic and purchasing information generated in Internet commerce. Fano (1998) created Shopper's Eye, an information-providing system that shows information about the store closest to the customer's present location once the customer has sent a to-buy list. Sadeh (2003) developed MyCampus, which recommends appropriate information and a store in accordance with the schedule saved on a customer's mobile device. Keegan and O'Hare (2004) came up with EasiShop, which provides suitable store information, including price, after-sales service, and accessibility, after analyzing the to-buy list and the current location of the customer.
However, Krulwich (1997) does not address the characteristics of physical space, being based on an online commerce context; Keegan and O'Hare (2004) only provide information about stores related to a product; and Fano (1998) does not fully consider the relationship between the preference toward stores and the stores themselves. The most recent of these, Sadeh (2003), experimented on a campus with a recommender system that reflects situation and preference information in addition to the characteristics of physical space. Yet there is a potential problem, since these approaches rely on customers' location and preference information, which raises the issue of invasion of privacy. The invasion of privacy and personal information in a ubiquitous environment is the primary point of controversy, according to research by Al-Muhtadi (2002), Beresford and Stajano (2003), and Ren (2006); additionally, individuals want to remain anonymous to protect their personal information, as mentioned in Srivastava (2000). Therefore, in this paper, we suggest a methodology for recommending stores in a U-market, on the basis of a ubiquitous environment, without using personal information, in order to protect personal information and privacy. The main idea behind our methodology is based on the Feature Matrices model (FM model; Shahabi and Banaei-Kashani, 2003), which uses clusters of customers' similar transaction data, similarly to collaborative filtering. Unlike collaborative filtering, however, this methodology overcomes the problems of personal information and privacy, since it does not know exactly who the customers are. The methodology is compared with a single-trait model (vector model) such as visitor logs, to measure the actual improvement in recommendation when context information is used. Real U-market data are hard to find, so we experimented with actual data from a real department store, augmented with context information.
The recommendation procedure for the U-market proposed in this paper is divided into four major phases. The first phase collects and preprocesses data for analyzing customers' shopping patterns; the traits of the shopping patterns are expressed as feature matrices of N dimensions. In the second phase, similar shopping patterns are grouped into clusters, and the representative pattern of each cluster is derived; the distance between shopping patterns is calculated by the Projected Pure Euclidean Distance (Shahabi and Banaei-Kashani, 2003). The third phase finds the representative pattern most similar to the target customer while the customer's shopping information is traced and saved dynamically. In the fourth phase, the next store is recommended based on the physical distance between the stores of the representative pattern and the present location of the target customer. We evaluated the accuracy of the recommendation method on actual data from a department store. Owing to the technological difficulty of real-time tracking, we extracted purchase-related information and added context information to each transaction. As a result, recommendation based on the FM model, which applies purchasing and context information, was more stable and accurate than that of the vector model, and recommendations became more precise as more shopping information accumulated. Realistically, because a fully ubiquitous environment has not yet been realized, we could not reflect every kind of context, but more explicit analysis should become attainable once a practical system is implemented.
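The four-phase procedure can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: a plain Euclidean nearest-representative match stands in for the FM model's Projected Pure Euclidean Distance, and all patterns, store names, and coordinates are invented.

```python
# Sketch of the U-market store recommendation idea: match a target
# customer's shopping-pattern vector to the nearest cluster-representative
# pattern (phase 3), then recommend the physically closest not-yet-visited
# store typical of that pattern (phase 4).
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_representative(target, representatives):
    """Phase 3: the representative pattern closest to the target pattern."""
    return min(representatives, key=lambda r: euclidean(target, r["pattern"]))

def recommend_next_store(target_pattern, representatives, location, visited,
                         store_coords):
    """Phase 4: among stores of the matched pattern, pick the store
    physically closest to the customer's current location."""
    rep = nearest_representative(target_pattern, representatives)
    candidates = [s for s in rep["stores"] if s not in visited]
    return min(candidates, key=lambda s: euclidean(location, store_coords[s]))

# Invented cluster representatives (phase 2 output) and store layout.
representatives = [
    {"pattern": (0.9, 0.1, 0.0), "stores": ["cosmetics", "shoes"]},
    {"pattern": (0.0, 0.2, 0.8), "stores": ["electronics", "books"]},
]
store_coords = {"cosmetics": (0, 1), "shoes": (5, 5),
                "electronics": (1, 0), "books": (4, 2)}
pick = recommend_next_store((0.1, 0.1, 0.9), representatives,
                            location=(0, 0), visited={"electronics"},
                            store_coords=store_coords)
print(pick)  # books
```

Because only anonymous pattern vectors and the current location are used, no personally identifying information enters the recommendation step, which is the privacy property the paper is after.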

The Early Experience with a Laparoscopy-assisted Pylorus-preserving Gastrectomy: A Comparison with a Laparoscopy-assisted Distal Gastrectomy with Billroth-I Reconstruction (복강경 보조 유문부보존 위절제술의 초기 경험: 복강경 보조 원위부 위절제술 후 Billroth-I 재건술과의 비교)

  • Park, Jong-Ik;Jin, Sung-Ho;Bang, Ho-Yoon;Chae, Gi-Bong;Paik, Nam-Sun;Moon, Nan-Mo;Lee, Jong-Inn
    • Journal of Gastric Cancer
    • /
    • v.8 no.1
    • /
    • pp.20-26
    • /
    • 2008
  • Purpose: Pylorus-preserving gastrectomy (PPG), which retains the pyloric ring and gastric function, has been accepted as a function-preserving procedure for early gastric cancer to prevent postgastrectomy syndrome. This study compared laparoscopy-assisted pylorus-preserving gastrectomy (LAPPG) with laparoscopy-assisted distal gastrectomy with Billroth-I reconstruction (LADGBI). Materials and Methods: Between November 2006 and September 2007, 39 patients with early gastric cancer underwent laparoscopy-assisted gastrectomy in the Department of Surgery at Korea Cancer Center Hospital; 9 of these patients underwent LAPPG and 18 underwent LADGBI. In LAPPG, we preserved the pyloric, hepatic, and celiac branches of the vagus nerve, the infrapyloric artery, and the right gastric artery, and performed D1+β lymphadenectomy excluding suprapyloric lymph node dissection. The distal stomach was resected while retaining a 2.5~3.0 cm pyloric cuff and maintaining a 3.0~4.0 cm distal resection margin. Results: The mean ages of the patients who underwent LAPPG and LADGBI were 59.9±9.4 and 64.1±10.0 years, respectively. The sex ratio was 1.3:1.0 (5 male, 4 female) in the LAPPG group and 2.6:1.0 (13 male, 5 female) in the LADGBI group. The mean total number of dissected lymph nodes (28.3±11.9 versus 28.1±8.9), operation time (269.0±34.4 versus 236.3±39.6 minutes), estimated blood loss (191.1±85.7 versus 218.3±150.6 ml), time to first flatus (3.6±0.9 versus 3.5±0.8 days), time to start of diet (5.1±0.9 versus 5.1±1.7 days), and postoperative hospital stay (10.1±4.0 versus 9.2±3.0 days) showed no significant differences between the groups (P>0.05).
Postoperative complications comprised one case of gastric stasis and one wound seroma in the LAPPG group, and one left lateral segment infarction of the liver in the LADGBI group. Conclusion: Patients treated by LAPPG showed a quality of operation comparable to those treated by LADGBI. LAPPG has an important role in the surgical management of early gastric cancer in terms of quality of postoperative life. Randomized controlled studies should be undertaken to analyze survival and long-term outcomes of this procedure.


Development of Standard Process for Private Information Protection of Medical Imaging Issuance (개인정보 보호를 위한 의료영상 발급 표준 업무절차 개발연구)

  • Park, Bum-Jin;Yoo, Beong-Gyu;Lee, Jong-Seok;Jeong, Jae-Ho;Son, Gi-Gyeong;Kang, Hee-Doo
    • Journal of radiological science and technology
    • /
    • v.32 no.3
    • /
    • pp.335-341
    • /
    • 2009
  • Purpose : With the development of IT, medical image issuance has shifted from the conventional film method to digital CD and DVD media. However, while other medical record departments perform thorough identity verification, medical imaging departments often cannot afford to do so. We therefore surveyed applicants' awareness of personal information protection and the state of CD- and DVD-based medical image issuance at various medical facilities, performed a comparative analysis against domestic and foreign laws and recommendations, and propose a standard procedure for medical image issuance suited to the domestic environment. Materials and methods : First, we surveyed the issuance process and required documents at metropolitan medical facilities by telephone between June 1 and July 1, 2008. In accordance with Article 21, Clause 2 of the Medical Law, we proposed a standard for the documents required of each type of applicant: (1) the patient in person → verification of identification; (2) a family member → verification of the applicant's identification plus a document proving the family relationship (health insurance card, certified copy of the family register, etc.); (3) a third party or representative → verification of the applicant's identification, a letter of attorney, and a certificate of seal impression. Second, we checked applicants' documents against this standard for medical image issuance at Kyung Hee University Medical Center over three months (May 1 to July 31, 2008). Third, we developed a work process for verifying the required documents and handling incomplete documentation.
Result : Of the hospitals surveyed, 4 (12%) satisfied all document requirements, 4 (12%) issued images to anyone who requested them, and 9 (27%) handled issuance in clinical departments without a dedicated medical imaging issuance office, so their document requirements could not be determined. In the three-month survey of applicants, 629 cases (49%) satisfied all document requirements, 416 cases (33%) presented only some of the documents, and 226 cases (18%) presented none. Based on these results, we established a service model for objective reception of image export requests using the proposed issuance procedure, reducing friction with patients and promoting patient convenience. Conclusion : PACS is classified as a medical device, which indicates the high importance of the medical information it holds. Therefore, only medical information administrators who have received professional education should perform the issuance process, and identification must be checked exhaustively.
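The applicant-type document standard described above amounts to a simple lookup-and-compare check. The sketch below is a hypothetical illustration of that rule; the category keys and document names paraphrase the abstract and are not an actual hospital system's identifiers.

```python
# Hypothetical check of required documents per applicant type, following
# the standard proposed in the abstract (self / family / third party).

REQUIRED = {
    "self": {"id"},
    "family": {"id", "family_relation_document"},
    "third_party": {"id", "letter_of_attorney", "seal_certificate"},
}

def missing_documents(applicant_type, presented):
    """Return the set of documents still required for this applicant."""
    return REQUIRED[applicant_type] - set(presented)

print(missing_documents("self", ["id"]))                 # set()
print(sorted(missing_documents("third_party", ["id"])))  # ['letter_of_attorney', 'seal_certificate']
```

A reception desk following this rule would issue the images only when the returned set is empty, and otherwise hand the applicant the list of missing documents, which is exactly the "handling incomplete documentation" branch of the proposed work process.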


Study for Diagnostic Efficacy of Minibronchoalveolar Lavage in the Detection of Etiologic Agents of Ventilator-associated Pneumonia in Patients Receiving Antibiotics (항생제를 사용하고 있었던 인공호흡기 연관 폐렴환자에서의 원인균 발견을 위한 소량 기관지폐포세척술의 진단적 효용성에 관한 연구)

  • Moon, Doo-Seop;Lim, Chae-Man;Pai, Chik-Hyun;Kim, Mi-Na;Chin, Jae-Yong;Shim, Tae-Sun;Lee, Sang-Do;Kim, Woo-Sung;Kim, Dong-Soon;Kim, Won-Dong;Koh, Youn-Suck
    • Tuberculosis and Respiratory Diseases
    • /
    • v.47 no.3
    • /
    • pp.321-330
    • /
    • 1999
  • Background : Early diagnosis and proper antibiotic treatment are very important in the management of ventilator-associated pneumonia (VAP) because of its high mortality. Bronchoscopy with a protected specimen brush (PSB) has been considered the standard method for isolating the causative organisms of VAP. However, this method imposes the additional cost of the PSB on the patient. Another useful method for the diagnosis of VAP is quantitative culture of specimens aspirated through bronchoscopic bronchoalveolar lavage (BAL), for which the infusion of more than 120 ml of saline has been recommended for adequate sampling of a pulmonary segment; occasionally, however, this leads to deterioration of the patient's condition. We studied the diagnostic efficacy of mini-bronchoalveolar lavage (miniBAL), which retrieves only 25 ml of BAL fluid, in isolating the causative organisms of VAP. Methods: We included 38 consecutive patients (41 cases) suspected of having VAP on the basis of clinical evidence, all of whom had received antibiotics before bronchoscopy. The two diagnostic techniques, PSB and miniBAL, performed one after another at the same pulmonary segment, were compared prospectively. The cut-off values for quantitative cultures to define causative bacteria of VAP were more than 10³ colony-forming units (cfu)/ml for PSB and more than 10⁴ cfu/ml for BAL. Results: The amount of instilled normal saline required to retrieve 25 ml of BAL fluid was 93±32 ml (mean±SD). The detection rate of causative agents was 46.3% (19/41) with PSB and 43.9% (18/41) with miniBAL. The concordance rate between PSB and miniBAL bacterial cultures was 85.4% (35/41). Although arterial blood oxygen saturation dropped significantly (p<0.05) during (92±10%) and 10 min after (95±3%) miniBAL compared with baseline (97±3%), all but 3 cases remained within normal ranges.
The heart rate, significantly elevated during miniBAL (125±24/min, p<0.05) compared with baseline (111±22/min), returned to baseline (111±26/min) 10 min after the procedure. Transient hypotension developed during the procedure in two cases, and the procedure was stopped in one case due to atrial flutter. Conclusion: MiniBAL is a safe and effective technique for detecting the causative organisms of VAP.
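The detection and concordance rates quoted above are simple proportions; a quick computation reproduces the reported percentages from the raw counts.

```python
# Reproduce the study's reported rates from the raw counts (n = 41 cases).
def pct(numerator, denominator):
    """Proportion as a percentage, rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

print(pct(19, 41))  # 46.3  (PSB detection rate)
print(pct(18, 41))  # 43.9  (miniBAL detection rate)
print(pct(35, 41))  # 85.4  (PSB/miniBAL concordance rate)
```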


Analysis of the Eyeglasses Supply System for Ametropes in ROK Military (한국군 비정시자용 안경의 보급체계 분석)

  • Jin, Yong-Gab;Koo, Bon-Yeop;Lee, Woo-Chul;Yoon, Moon-Soo;Park, Jin-Tae;Lee, Hang-Seok;Lee, Kyo-Eun;Leem, Hyun-Sung;Jang, Jae-Young;Mah, Ki-Choong
    • The Korean Journal of Vision Science
    • /
    • v.20 no.4
    • /
    • pp.579-588
    • /
    • 2018
  • Purpose : To analyze the eyeglasses supply system for ametropic soldiers in the ROK military. Methods : We investigated and analyzed the supply system for eyeglasses provided to ametropic soldiers by the Korean military. Refractive powers and corrected visual acuity were measured for 37 ametropic soldiers wearing military-supplied conventional glasses, ballistic protective inserts, and gas-mask inserts based on their habitual prescriptions. Full correction of refractive error was prescribed for subjects with distance visual acuity below 1.0, and the resulting changes in corrected visual acuity were compared. We then suggest solutions to the issues in the current supply system and examine the applicability of professional optometric manpower. Results : The new glasses supplied by the army for ametropic soldiers were duplicates of the glasses the soldiers wore when entering the army. The spherical equivalent refractive powers of the conventional, ballistic protective, and gas-mask insert glasses supplied to the 37 ametropic soldiers were -3.47±1.69 D, -3.52±1.66 D, and -3.55±1.63 D, respectively, while the spherical equivalent refractive power of the fully corrected glasses was -3.79±1.66 D, a significant difference (p<0.05). The distance corrected visual acuities measured at high and low contrast (logMAR) with the conventional, ballistic protective, and gas-mask insert glasses were 0.06±0.80, 0.21±0.82, 0.15±0.74, 0.34±0.89, 0.10±0.70, and 0.22±0.27, respectively, while the corrected visual acuities with the fully corrected glasses improved to 0.02±1.05, 0.10±0.07, 0.09±0.92, 0.26±0.10, 0.04±1.00, and 0.19±1.00, respectively. The differences were significant (p<0.05) except for the low-contrast corrected visual acuity with the conventional and gas-mask insert glasses.
The procedure for ordering, dispensing, and supplying military glasses consists of five steps and was found to require approximately two weeks or more from the initial examination to supply. Conclusion : The military glasses supply procedure showed three issues: 1) the absence of a refraction-based prescription system, 2) the relatively long time required to supply the glasses, and 3) inaccurate powers of the supplied glasses. To solve these issues, in the short term, training in the measurement of refractive power is needed, and in the near future, standard procedures for prescribing glasses as well as the securing of professional optometric manpower are expected.
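The spherical equivalent powers reported above follow the standard optometric formula SE = sphere + cylinder/2, which collapses a sphero-cylindrical prescription to a single diopter value. The sketch below applies the formula to a made-up prescription; the numbers are purely illustrative.

```python
# Spherical equivalent of a sphero-cylindrical prescription:
# SE = sphere + cylinder / 2 (all values in diopters, D).

def spherical_equivalent(sphere_d, cylinder_d):
    """Spherical equivalent power in diopters."""
    return sphere_d + cylinder_d / 2

# Hypothetical prescription: sphere -3.00 D, cylinder -1.00 D.
print(spherical_equivalent(-3.00, -1.00))  # -3.5
```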

Deriving adoption strategies of deep learning open source framework through case studies (딥러닝 오픈소스 프레임워크의 사례연구를 통한 도입 전략 도출)

  • Choi, Eunjoo;Lee, Junyeong;Han, Ingoo
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.27-65
    • /
    • 2020
  • Many information and communication technology companies have released their internally developed AI technologies to the public, for example Google's TensorFlow, Facebook's PyTorch, and Microsoft's CNTK. By releasing deep learning open source software, a company can strengthen its relationship with the developer community and the artificial intelligence (AI) ecosystem, while users can experiment with, implement, and improve the software. Accordingly, the field of machine learning is growing rapidly, and developers are using and reproducing various learning algorithms in each field. Although open source software has been analyzed in various ways, there is a lack of studies that help companies develop or use deep learning open source software in industry. This study therefore attempts to derive a strategy for adopting such a framework through case studies of deep learning open source frameworks. Based on the technology-organization-environment (TOE) framework and a literature review on the adoption of open source software, we employed a case study framework that includes technological factors (perceived relative advantage, perceived compatibility, perceived complexity, and perceived trialability), organizational factors (management support and knowledge & expertise), and environmental factors (availability of technology skills and services, and platform long-term viability). We conducted a case study analysis of three companies' adoption cases (two successes and one failure) and found that seven of the eight TOE factors, along with several factors regarding company, team, and resources, are significant for the adoption of a deep learning open source framework.
By organizing the case study results, we identified five success factors for adopting a deep learning framework: the knowledge and expertise of the developers in the team, the hardware (GPU) environment, a data enterprise cooperation system, a deep learning framework platform, and a deep learning framework tool service. For an organization to successfully adopt a deep learning open source framework, at the usage stage, first, a hardware (GPU) environment must be provided to the AI R&D group to support the knowledge and expertise of the developers in the team. Second, research developers' use of deep learning frameworks should be supported by collecting and managing data inside and outside the company through a data enterprise cooperation system. Third, deep learning research expertise should be supplemented through cooperation with researchers from academic institutions such as universities and research institutes. By satisfying these three conditions at the usage stage, companies increase the number of deep learning research developers, their ability to use the framework, and the available GPU resources. At the proliferation stage, fourth, the company builds a deep learning framework platform that improves the research efficiency and effectiveness of the developers, for example by automatically optimizing the hardware (GPU) environment. Fifth, a deep learning framework tool service team complements the developers' expertise by sharing information from the external deep learning open source framework community with the in-house community and by organizing developer retraining and seminars.
To implement the five identified success factors, a step-by-step enterprise procedure for adopting a deep learning framework is proposed: defining the project problem, confirming that a deep learning methodology is the right method, confirming that a deep learning framework is the right tool, using the framework in the enterprise, and spreading the framework across the enterprise. The first three steps are pre-considerations for adopting a deep learning open source framework; once they are clear, the last two steps can proceed. In the fourth step, the knowledge and expertise of the developers in the team are important, in addition to the hardware (GPU) environment and the data enterprise cooperation system. In the final step, all five success factors come into play for a successful adoption. This study provides strategic implications for companies adopting or using a deep learning framework according to the needs of each industry and business.
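The five-step procedure above can be sketched as a sequence of gates that must be passed in order. The step names paraphrase the abstract; the helper below is a purely illustrative sketch, not anything the study implements.

```python
# Illustrative gate-check over the paper's five-step adoption procedure:
# each step must be completed before the next one becomes relevant.
ADOPTION_STEPS = [
    "define the project problem",
    "confirm deep learning is the right methodology",
    "confirm the framework is the right tool",
    "use the framework in the enterprise",
    "spread the framework across the enterprise",
]

def next_step(completed):
    """Return the first step not yet completed, or None when all are done."""
    for step in ADOPTION_STEPS:
        if step not in completed:
            return step
    return None

print(next_step({"define the project problem"}))
# confirm deep learning is the right methodology
```

The first three entries correspond to the pre-consideration steps; only after they return completed does the procedure reach the usage and proliferation stages.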

Behavioural Analysis of Password Authentication and Countermeasure to Phishing Attacks - from User Experience and HCI Perspectives (사용자의 패스워드 인증 행위 분석 및 피싱 공격시 대응방안 - 사용자 경험 및 HCI의 관점에서)

  • Ryu, Hong Ryeol;Hong, Moses;Kwon, Taekyoung
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.79-90
    • /
    • 2014
  • User authentication based on an ID and password (ID/PW) is widely used. As the Internet has become a growing part of people's lives, the number of times users enter an ID/PW has increased across a variety of services. People have performed the authentication procedure so often that they now enter their ID/PW unconsciously. This is an instance of the adaptive unconscious: a set of mental processes that takes in information and produces judgements and behaviours within a second, without conscious awareness. Most people register for numerous websites with a small number of IDs/PWs because they rely on memory to manage them. Human memory decays over time, and items in memory tend to interfere with one another, so people may enter an invalid ID/PW. These characteristics of ID/PW authentication therefore lead to human vulnerabilities: people reuse a few PWs across many websites, manage IDs/PWs from memory, and enter them unconsciously. Exploiting such human factors, information leakage attacks such as phishing and pharming have been increasing rapidly. In the past, information leakage attacks exploited vulnerabilities in hardware, operating systems, and software; most current attacks, however, exploit vulnerabilities in human factors. Attacks based on human-factor vulnerabilities are called social-engineering attacks, and malicious social-engineering techniques such as phishing and pharming are now among the biggest security problems. Phishing attempts to obtain valuable information such as IDs/PWs, while pharming steals personal data by redirecting a website's traffic to a fraudulent copy of a legitimate website.
The screens of the fraudulent copies used in both phishing and pharming attacks are almost identical to those of the legitimate websites, and a pharming site can even present a deceptive URL address. Without the support of prevention and detection techniques such as vaccines and reputation systems, it is therefore difficult for users to determine intuitively whether a site is a phishing or pharming site or the legitimate one. Previous research on phishing and pharming attacks has mainly focused on technical solutions. In this paper, we focus on human behaviour when users are confronted by phishing and pharming attacks without knowing it. We conducted an attack experiment to find out how many IDs/PWs are leaked by pharming and phishing attacks. We first configured the experimental settings to match the conditions of phishing and pharming attacks and built a phishing site for the experiment. We then recruited 64 voluntary participants, asked them to log in to our experimental site, and conducted a questionnaire survey with each participant regarding the experiment. Through the attack experiment and the survey, we observed whether participants' passwords were leaked when logging in to the experimental phishing site, and how many different passwords were leaked relative to the total number of passwords each participant used. We found that most participants logged in to the site unconsciously, and that ID/PW management dependent on human memory caused the leakage of multiple passwords. Users should actively utilize reputation systems, and service providers operating online sites should support prevention techniques that allow users to intuitively determine whether a site is a phishing site.
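The abstract notes that users cannot intuitively tell a deceptive URL from the legitimate one. As a minimal illustrative sketch (not from the paper; the domain names below are hypothetical), a client-side check can at least parse the hostname and compare it against an expected domain, which exposes the common subdomain trick where the real brand name is embedded inside an attacker-controlled domain:

```python
from urllib.parse import urlparse

def is_expected_host(url: str, expected_host: str) -> bool:
    """Return True only if the URL's hostname is the expected host
    or a direct subdomain of it (hypothetical example check)."""
    host = urlparse(url).hostname or ""
    return host == expected_host or host.endswith("." + expected_host)

# The legitimate login page passes the check.
print(is_expected_host("https://www.example-bank.com/login",
                       "example-bank.com"))            # True
# A deceptive phishing URL embeds the brand as a subdomain of an
# attacker-controlled domain, so the parsed hostname fails the check.
print(is_expected_host("https://example-bank.com.evil.test/login",
                       "example-bank.com"))            # False
```

Note that such a check does not help against pharming, where DNS redirection makes the displayed URL appear legitimate; detecting that case requires external supports such as the reputation systems the paper recommends.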