• Title/Summary/Keyword: network


Storm sewer network simplification technique for improving efficiency of urban flood forecasting (도시침수예측 효율 향상을 위한 관망간소화 기법 제시)

  • Sang Bo Sim;Hyung-Jun Kim
    • Proceedings of the Korea Water Resources Association Conference / 2023.05a / pp.269-269 / 2023
  • Changes in rainfall patterns driven by climate change are producing floods that exceed urban disaster-prevention design targets, aggravating inundation damage. Urban flood forecasting models are being actively developed to reduce such damage, but because they must simulate complex urban storm sewer networks made up of very large numbers of pipes, their slow run times limit real-time forecasting. SWMM (Storm Water Management Model), the model most widely used for urban flood analysis, can solve complex networks relatively quickly and accurately, yet it still takes a long time to simulate the storm sewer networks of large cities, and no standard exists for the required level of network detail. To address this, this study applies network simplification criteria (subcatchment area density, conduit diameter, conduit length, etc.), compares the resulting water-level changes in major branch and trunk lines together with flood trace maps, identifies the simplification level that preserves the accuracy of the analysis, and proposes appropriate simplification criteria and an automatic simplification method for urban flood analysis. A Python code was written to automatically simplify the storm sewer network: it reads the SWMM .inp file and stores it as a DataFrame, and then proceeds through six steps in total: data preprocessing for analysis, classification by simplification criteria, computation of the hydraulic and hydrological parameters of the elements to be simplified, connection to adjacent trunk lines, and saving the simplified .inp file. The study area was the Dorimcheon basin; the initial data consisted of 30,469 manholes, 32,443 conduits, and 30,586 subcatchments, and a simulation took about 2 hours 30 minutes. When simplification was applied to subcatchments smaller than 100 x 100, the network was reduced to 9,965 manholes, 10,464 conduits, and 9,240 subcatchments, cutting the conduit complexity to about one third, and the simulation time fell to about 43 minutes, a reduction of about 72%. Comparison at major locations where flooding actually occurred gave R2 values of 0.85 to 0.92, showing little effect on the accuracy of the forecasting model. Optimal simplification of an urban flood model reduces model complexity and computation, shortening run times, and by removing or merging unnecessary storm sewer elements it is expected to improve the model's predictive power and to serve analysis and interpretation efficiently.
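The first steps of the pipeline described above (read the .inp file, then classify conduits by a simplification criterion) can be sketched in plain Python. This is a minimal illustration, not the authors' code; the .inp content and the 0.5 m diameter threshold are hypothetical.

```python
# Minimal sketch: parse the [CONDUITS] and [XSECTIONS] sections of a SWMM
# .inp file and flag small-diameter conduits as candidates for merging into
# an adjacent trunk line. Illustrative only; not the study's actual code.
INP_TEXT = """\
[CONDUITS]
;;Name  FromNode  ToNode  Length  Roughness
C1      J1        J2      120.0   0.013
C2      J2        J3      80.0    0.013

[XSECTIONS]
;;Link  Shape     Geom1
C1      CIRCULAR  0.45
C2      CIRCULAR  1.20
"""

def parse_section(text, name):
    """Return the data rows of one .inp section as lists of tokens."""
    rows, active = [], False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("["):
            active = (line.upper() == f"[{name}]")
            continue
        if active and line and not line.startswith(";"):
            rows.append(line.split())
    return rows

conduits = {r[0]: {"from": r[1], "to": r[2], "length": float(r[3])}
            for r in parse_section(INP_TEXT, "CONDUITS")}
for name, shape, diam in parse_section(INP_TEXT, "XSECTIONS"):
    conduits[name]["diameter"] = float(diam)

# Classification step: conduits below a (hypothetical) 0.5 m diameter
# threshold become candidates for merging.
candidates = [n for n, c in conduits.items() if c["diameter"] < 0.5]
print(candidates)
```

In the full pipeline this classification would be followed by recomputing the hydraulic parameters of the merged elements and writing a simplified .inp file back out.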


Radiation parameterizations and optical characterizations for glass shielding composed of SLS waste glass and lead-free materials

  • Thair Hussein Khazaalah;Iskandar Shahrim Mustafa;M.I. Sayyed
    • Nuclear Engineering and Technology / v.54 no.12 / pp.4708-4714 / 2022
  • The novelty of the present research is the use of Soda-Lime-Silica (SLS) waste glass to prepare lead-free glass shielding, which limits the accumulation of glass waste that would otherwise take a very long time to decompose. It also reduces the consumption of pure SiO2, a finite resource. Furthermore, incorporating BaO together with Bi2O3 into the glass network improves the optical properties and the attenuation. A UV-Visible spectrophotometer was used to investigate the optical properties, and the radiation shielding properties of the glass samples were obtained with the Phy-X/PSD online software. The optical results indicate that as the BaO content of the glass structure increases, the Urbach energy ΔE and the refractive index n increase while the optical band gap energy Eopt decreases. The metallization criterion (M) revealed that the present glass samples are non-metallic (insulators). Furthermore, the shielding results show that as BaO increases in the glass structure, the linear attenuation coefficient and the effective atomic number (Zeff) rise, while the half-value layer (HVL) declines with growing BaO concentration. According to the results, the glass samples are non-toxic, transparent to visible light, and efficient radiation shielding materials. The Ba5 sample is the best among all the samples because of its higher attenuation and lower HVL and MFP values, which make it a suitable candidate for transparent glass shielding.
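The shielding quantities compared above (HVL and MFP) follow directly from the linear attenuation coefficient via Beer-Lambert attenuation; a short sketch, with a hypothetical coefficient value rather than one from the paper:

```python
import math

def shielding_parameters(mu_linear):
    """Half-value layer and mean free path from the linear attenuation
    coefficient mu (cm^-1), per Beer-Lambert attenuation I = I0*exp(-mu*x)."""
    hvl = math.log(2) / mu_linear   # thickness that halves the intensity
    mfp = 1.0 / mu_linear           # mean photon path between interactions
    return hvl, mfp

# Hypothetical example: a glass with mu = 0.693 cm^-1 has HVL of about 1 cm.
hvl, mfp = shielding_parameters(0.693)
print(round(hvl, 3), round(mfp, 3))
```

This makes the paper's trend concrete: a larger attenuation coefficient means a thinner layer is needed to halve the beam, so HVL and MFP fall together as attenuation rises.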

"Where can I buy this?" - Fashion Item Searcher using Instance Segmentation with Mask R-CNN ("이거 어디서 사?" - Mask R-CNN 기반 객체 분할을 활용한 패션 아이템 검색 시스템)

  • Jung, Kyunghee;Choi, Ha nl;Sammy, Y.X.B.;Kim, Hyunsung;Toan, N.D.;Choo, Hyunseung
    • Proceedings of the Korea Information Processing Society Conference / 2022.11a / pp.465-467 / 2022
  • Mobile phones have become essential items, since they provide fast and easy access to online platforms and services. Shopping on such platforms, including Social Network Services (SNS), has become a go-to option for many people. However, searching for a specific fashion item seen in a picture is challenging: users need to try multiple searches, combining appropriate keywords. To tackle this problem, we propose a system that provides immediate access to websites related to the fashion items in an image. Within this framework, we also propose a deep learning model that automatically analyzes image contexts using instance segmentation, applying transfer learning on DeepFashion2 to maximize model accuracy. After all the fashion item objects in the image are segmented, the related search information is retrieved when an object is clicked. Furthermore, we successfully deployed our system so that it is accessible from any web browser. We show that deep learning can be a promising tool not only for scientific purposes but also for commercial shopping applications.
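The click-to-search step described above amounts to mapping a segmented item's predicted class label to a shopping query URL. A minimal sketch follows; the shop URL and the detection labels are hypothetical, not taken from the paper:

```python
from urllib.parse import quote_plus

# Hypothetical detections in the style of DeepFashion2 category labels,
# as instance segmentation might return them for one photo.
DETECTED_ITEMS = [
    {"label": "short sleeve top", "score": 0.94},
    {"label": "trousers", "score": 0.88},
]

def search_url(label, base="https://example-shop.test/search?q="):
    """Build a search URL for a clicked fashion item (hypothetical shop)."""
    return base + quote_plus(label)

# Simulate a user clicking the highest-confidence detection.
clicked = max(DETECTED_ITEMS, key=lambda d: d["score"])
print(search_url(clicked["label"]))
```

In the deployed system the label would come from the Mask R-CNN prediction for the clicked mask, and the URL would point at real shopping sites.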

Development of Deep Learning-based Automatic Classification of Architectural Objects in Point Clouds for BIM Application in Renovating Aging Buildings (딥러닝 기반 노후 건축물 리모델링 시 BIM 적용을 위한 포인트 클라우드의 건축 객체 자동 분류 기술 개발)

  • Kim, Tae-Hoon;Gu, Hyeong-Mo;Hong, Soon-Min;Choo, Seoung-Yeon
    • Journal of KIBIM / v.13 no.4 / pp.96-105 / 2023
  • This study develops a building-object recognition technology for efficient use in the remodeling of buildings constructed without drawings. As smart technologies emerge in the era of the Fourth Industrial Revolution, this research contributes to the architectural field by introducing a deep learning-based method for automatic object classification and recognition from point cloud data. We use a voxel-based TD3D network, optimizing its performance by adjusting the voxel size and the number of blocks. The technology classifies building objects such as walls, floors, and roofs from 3D scanning data, labeling them in polygonal form to minimize boundary ambiguity, although some challenges in classifying object boundaries were observed. The model automatically separates non-building objects, reducing the manual effort of data matching, and distinguishes elements to be demolished from those to be retained during remodeling. Data-set loss space was minimized by labeling with the extremities of the x, y, and z coordinates. The research aims to improve the efficiency of building-object classification and the quality of architectural plans by reducing the manpower and time required during remodeling, and the results align with that goal. Future work can extend to creating classified objects with parametric tools from the polygon-labeled datasets, offering meaningful numerical analysis for remodeling processes; continued research in this direction is expected to significantly advance the efficiency of building remodeling techniques.
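Two preprocessing ideas mentioned above, bucketing scanned points into voxels of a chosen size and labeling an object by the extremities of its x, y, and z coordinates, can be sketched in a few lines. This is illustrative only, not the authors' pipeline; the point values are hypothetical:

```python
# Hypothetical scan fragment: (x, y, z) points in metres.
POINTS = [(0.1, 0.2, 0.0), (0.4, 0.9, 0.1), (2.3, 0.1, 1.8)]

def voxelize(points, voxel_size):
    """Map each point to an integer voxel index (ix, iy, iz).
    The voxel size is the tunable resolution parameter."""
    return {tuple(int(c // voxel_size) for c in p) for p in points}

def bounding_extremities(points):
    """Axis-aligned bounds (min, max per axis), usable as a compact label."""
    xs, ys, zs = zip(*points)
    return (min(xs), max(xs)), (min(ys), max(ys)), (min(zs), max(zs))

print(sorted(voxelize(POINTS, voxel_size=1.0)))
print(bounding_extremities(POINTS))
```

Adjusting `voxel_size` trades spatial detail against memory and compute, which is the same trade-off the study tunes for the TD3D network.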

Study on the Improvement of Lung CT Image Quality using 2D Deep Learning Network according to Various Noise Types (폐 CT 영상에서 다양한 노이즈 타입에 따른 딥러닝 네트워크를 이용한 영상의 질 향상에 관한 연구)

  • Min-Gwan Lee;Chanrok Park
    • Journal of the Korean Society of Radiology / v.18 no.2 / pp.93-99 / 2024
  • In digital medical imaging, especially computed tomography (CT), the noise introduced when X-ray photons are converted into a digital imaging signal must be considered. Recently, denoising techniques based on deep learning architectures have been increasingly used in the medical imaging field. Here, we evaluated the noise reduction effect of a U-net deep learning model on lung CT images for various noise types. The input data for deep learning were generated by applying Gaussian noise, Poisson noise, salt-and-pepper noise, and speckle noise to the ground truth (GT) images. In particular, Gaussian noise inputs were prepared with two standard deviation values, 30 and 50. The hyper-parameters were the Adam optimizer, 100 epochs, and a learning rate of 0.0001. For quantitative analysis, the mean square error (MSE), peak signal-to-noise ratio (PSNR), and coefficient of variation (COV) were calculated. The results confirmed that the U-net model was effective for noise reduction under all conditions set in this study, with the best performance for Gaussian noise.
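The quantitative metrics used above can be illustrated on a toy 1-D "image"; a minimal sketch with hypothetical values (the paper works on lung CT slices, and COV is omitted here):

```python
import math, random

random.seed(0)
gt = [100.0] * 64                                  # ground-truth signal
noisy = [v + random.gauss(0, 30) for v in gt]      # Gaussian noise, sigma = 30

def mse(a, b):
    """Mean square error between two equal-length signals."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    return 10 * math.log10(peak ** 2 / mse(a, b))

print(round(mse(gt, noisy), 1), round(psnr(gt, noisy), 1))
```

A successful denoiser moves its output's MSE down and PSNR up relative to the noisy input, which is how the U-net's effect is scored in the study.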

Calculation of Local Coordinate of Common Points for Coordinate Transformation by Trilateral Adjustment (좌표변환 공통점의 지역측지계 조정좌표 산출 - 삼변망조정계산의 활용 -)

  • Yang, Chul Soo;Kang, Sang-gu;Song, Wonho;Lee, Won Hui
    • Journal of Cadastre & Land InformatiX / v.54 no.1 / pp.103-115 / 2024
  • Trilateral adjustment can remedy the problems that arise when transforming cadastral maps into the World Geodetic Coordinate System. First, adjusted coordinates of common points that are mutually consistent over a wide area can be determined. Second, calculations that focus on specific points can be performed. Third, a solution that maintains the shape of the regional network can be obtained through constraints. Thus, point coordinates can be determined appropriately for the survey system. In addition, heterogeneous survey results spanning regions with different coordinate origins can be computed on a single origin. This improves the efficiency of the workflow for transforming cadastral maps into the World Geodetic Coordinate System.
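The common-point idea above can be illustrated with the standard least-squares estimate of a 2-D similarity (Helmert) transformation between two coordinate systems. This is a textbook formulation, not the paper's trilateral adjustment itself, and the coordinate pairs are hypothetical:

```python
import math

# Common points known in both systems (hypothetical values).
SRC = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]          # local coordinates
DST = [(10.0, 20.0), (109.995, 21.0), (9.0, 119.995)]   # world coordinates

def fit_helmert_2d(src, dst):
    """Least-squares estimate of x' = a*x - b*y + tx, y' = b*x + a*y + ty."""
    n = len(src)
    sx = sum(p[0] for p in src) / n; sy = sum(p[1] for p in src) / n
    dx = sum(p[0] for p in dst) / n; dy = sum(p[1] for p in dst) / n
    num_a = num_b = den = 0.0
    for (x, y), (u, v) in zip(src, dst):
        xc, yc, uc, vc = x - sx, y - sy, u - dx, v - dy
        num_a += xc * uc + yc * vc        # correlation with same rotation
        num_b += xc * vc - yc * uc        # correlation with 90-deg rotation
        den += xc * xc + yc * yc
    a, b = num_a / den, num_b / den
    tx, ty = dx - a * sx + b * sy, dy - b * sx - a * sy
    return a, b, tx, ty

a, b, tx, ty = fit_helmert_2d(SRC, DST)
scale = math.hypot(a, b)
print(round(scale, 4), round(tx, 2), round(ty, 2))
```

Trilateral adjustment generalizes this picture by adjusting measured distances over the whole network with constraints, rather than fitting a single global similarity.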

GPR Development for Landmine Detection (지뢰탐지를 위한 GPR 시스템의 개발)

  • Sato, Motoyuki;Fujiwara, Jun;Feng, Xuan;Zhou, Zheng-Shu;Kobayashi, Takao
    • Geophysics and Geophysical Exploration / v.8 no.4 / pp.270-279 / 2005
  • Under a research project supported by the Japanese Ministry of Education, Culture, Sports, Science and Technology (MEXT), we have been developing GPR systems for landmine detection. By 2005, we had finished developing two prototype GPR systems, namely ALIS (Advanced Landmine Imaging System) and SAR-GPR (Synthetic Aperture Radar-Ground Penetrating Radar). ALIS is a novel landmine detection sensor system combining a metal detector with GPR. It is hand-held equipment with a sensor position tracking system that can visualize the sensor output in real time. To achieve sensor tracking, ALIS needs only one CCD camera attached to the sensor handle. The CCD image is superimposed with the GPR and metal detector signals, making detection and identification of buried targets easy and reliable. A field evaluation test of ALIS was conducted in December 2004 in Afghanistan, where we demonstrated that it can detect buried antipersonnel landmines and discriminate metal fragments from landmines. SAR-GPR is a machine-mounted sensor system composed of a GPR and a metal detector. The GPR employs an array antenna for advanced signal processing and better subsurface imaging. Combined with a synthetic aperture radar algorithm, SAR-GPR can suppress clutter and image buried objects in strongly inhomogeneous material. SAR-GPR is a stepped-frequency radar system whose RF component is a newly developed compact vector network analyzer. The system measures 30 cm x 30 cm x 30 cm and is composed of six Vivaldi antennas and three vector network analyzers. It weighs 17 kg and can be mounted on a robotic arm on a small unmanned vehicle. A field test of this system was carried out in March 2005 in Japan.
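A basic relation behind the stepped-frequency radar mentioned above: sweeping a total bandwidth B yields a range resolution of c / (2B). The sketch below uses an illustrative bandwidth, not the SAR-GPR specification, and assumes propagation in air (in soil the velocity is lower, so resolution is finer):

```python
# Range resolution of a stepped-frequency radar sweep (illustrative values).
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_resolution(bandwidth_hz):
    """Two-way range resolution in metres for a sweep of the given bandwidth."""
    return C / (2.0 * bandwidth_hz)

# A hypothetical 2 GHz sweep resolves targets roughly 7.5 cm apart in air.
print(round(range_resolution(2e9), 4))
```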

Transfer and Validation of NIRS Calibration Models for Evaluating Forage Quality in Italian Ryegrass Silages (이탈리안 라이그라스 사일리지의 품질평가를 위한 근적외선분광 (NIRS) 검량식의 이설 및 검증)

  • Cho, Kyu Chae;Park, Hyung Soo;Lee, Sang Hoon;Choi, Jin Hyeok;Seo, Sung;Choi, Gi Jun
    • Journal of Animal Environmental Science / v.18 no.sup / pp.81-90 / 2012
  • This study evaluated the transfer of near-infrared spectroscopy (NIRS) calibrations from a high-end research-grade instrument to low-cost field-grade instruments for rapid, on-site forage quality analysis, using 241 Italian ryegrass silage samples collected nationwide over three years to assess the accuracy and precision between instruments. A database was first built on the research-grade instrument (Unity Scientific Model 2500X, 650 nm~2,500 nm), then trimmed and fitted to the field-grade instrument (Unity Scientific Model 1400, 1,400 nm~2,400 nm), and the calibration was transferred with a dedicated transfer algorithm. Differences between instruments ranged from 0.000% to 0.343%, and chemical constituents (NDF, ADF, crude protein, and crude ash), fermentation parameters (moisture, pH, and lactic acid), and forage quality parameters (TDN, DMI, and RFV) could be analyzed within 5 minutes on site, with results equivalent to laboratory data. Nevertheless, because the samples collected over the three years were organic materials that vary by region and year, population evaluation techniques are needed, along with constant calibration updating and maintenance: a knowledgeable control laboratory, acting as a control center, should accumulate the database, control the laboratory analyses, and reflect calibration updates so that on-site forage analysis by NIRS can remain usable in the long term. Agricultural products such as forage change continuously, so those changes must be detected easily and the calibrations updated routinely; otherwise the NIRS calibrations will soon become worthless. Much NIRS research consists of short-term rather than long-term studies, which has limited the practical use of NIRS, so the system should include simple, instantly usable screening methods with local-language support, such as the Global Distance (GD) and Neighbour Distance (ND) algorithms.
Finally, the multiple field-grade instruments should give the same results, not only relative to the research-grade instrument but also among themselves, which requires easy calibration transfer and maintenance between instruments via internet networking techniques.
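One common, simple way to align a field instrument's predictions with a master instrument is a slope/bias correction fitted on shared samples. The sketch below illustrates that idea with hypothetical numbers; the paper itself uses the vendor's dedicated transfer algorithm, not necessarily this one:

```python
# Slope/bias calibration transfer sketch (hypothetical prediction values).
master = [10.2, 14.8, 18.1, 22.5, 27.0]   # research-grade predictions
slave  = [10.9, 15.1, 18.0, 22.0, 26.1]   # field-grade predictions, same samples

def fit_slope_bias(x, y):
    """Ordinary least squares fit y ~ slope*x + bias."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

slope, bias = fit_slope_bias(slave, master)      # map slave onto master scale
corrected = [slope * v + bias for v in slave]
print(round(slope, 3), round(bias, 3))
```

After correction, the field instrument's values sit on the master scale, which is the property the study verifies with its 0.000%~0.343% between-instrument differences.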

Optimization of Multiclass Support Vector Machine using Genetic Algorithm: Application to the Prediction of Corporate Credit Rating (유전자 알고리즘을 이용한 다분류 SVM의 최적화: 기업신용등급 예측에의 응용)

  • Ahn, Hyunchul
    • Information Systems Review / v.16 no.3 / pp.161-177 / 2014
  • Corporate credit rating assessment consists of complicated processes in which various factors describing a company are taken into consideration. Such assessment is known to be very expensive, since domain experts have to be employed to assess the ratings. As a result, data-driven corporate credit rating prediction using statistical and artificial intelligence (AI) techniques has received considerable attention from researchers and practitioners. In particular, statistical methods such as multiple discriminant analysis (MDA) and multinomial logistic regression analysis (MLOGIT), and AI methods including case-based reasoning (CBR), artificial neural networks (ANN), and multiclass support vector machines (MSVM), have been applied to corporate credit rating. Among them, MSVM has recently become popular because of its robustness and high prediction accuracy. In this study, we propose a novel optimized MSVM model and apply it to corporate credit rating prediction in order to enhance accuracy. Our model, named 'GAMSVM (Genetic Algorithm-optimized Multiclass Support Vector Machine),' is designed to simultaneously optimize the kernel parameters and the feature subset selection. Prior studies such as Lorena and de Carvalho (2008) and Chatterjee (2013) show that proper kernel parameters may improve the performance of MSVMs, and the results of Shieh and Yang (2008) and Chatterjee (2013) imply that appropriate feature selection may lead to higher prediction accuracy. Based on these prior studies, we propose to apply GAMSVM to corporate credit rating prediction. As the tool for optimizing the kernel parameters and the feature subset, we use the genetic algorithm (GA), an efficient and effective search method that simulates biological evolution. By applying genetic operations such as selection, crossover, and mutation, it gradually improves the search results.
In particular, the mutation operator prevents the GA from falling into local optima, so a globally optimal or near-optimal solution can be found. GA has been widely applied to search for optimal parameters or feature subsets of AI techniques, including MSVM; for these reasons, we adopt GA as the optimization tool. To empirically validate the usefulness of GAMSVM, we applied it to a real-world case of credit rating in Korea. Our application is bond rating, the most frequently studied area of credit rating for specific debt issues or other financial obligations. The experimental dataset was collected from a large credit rating company in South Korea and contained 39 financial ratios of 1,295 companies in the manufacturing industry, along with their credit ratings. Using statistical methods including one-way ANOVA and stepwise MDA, we selected 14 financial ratios as the candidate independent variables. The dependent variable, the credit rating, was labeled as four classes: 1 (A1); 2 (A2); 3 (A3); 4 (B and C). For each class, 80 percent of the data was used for training and the remaining 20 percent for validation, and to compensate for the small sample size we applied five-fold cross validation. To examine the competitiveness of the proposed model, we also experimented with several comparative models, including MDA, MLOGIT, CBR, ANN, and MSVM. For MSVM, we adopted the One-Against-One (OAO) and DAGSVM (Directed Acyclic Graph SVM) approaches, which are known to be the most accurate among the various MSVM approaches. GAMSVM was implemented using LIBSVM, an open-source library, and Evolver 5.5, a commercial package providing GA. The comparative models were built with various statistical and AI packages, such as SPSS for Windows, NeuroShell, and Microsoft Excel VBA (Visual Basic for Applications). The experimental results showed that the proposed model, GAMSVM, outperformed all the comparative models.
In addition, the model was found to use fewer independent variables while showing higher accuracy. In our experiments, five variables, X7 (total debt), X9 (sales per employee), X13 (years since founding), X15 (accumulated earnings to total assets), and X39 (an index related to cash flows from operating activities), were found to be the most important factors in predicting corporate credit ratings, and the finally selected kernel parameter values were almost the same across the data subsets. To examine whether the predictive performance of GAMSVM was significantly greater than that of the other models, we used the McNemar test. As a result, we found that GAMSVM was better than MDA, MLOGIT, CBR, and ANN at the 1% significance level, and better than OAO and DAGSVM at the 5% significance level.
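The GAMSVM chromosome idea, a genome holding a feature mask plus kernel parameters, evolved by selection, crossover, and mutation, can be sketched as a toy GA. The fitness function below is a stand-in, not an actual SVM cross-validation score, and all numeric choices are illustrative:

```python
import random

random.seed(42)
N_FEATURES = 14  # matching the 14 candidate ratios in the paper

def random_genome():
    """Feature mask (0/1 per feature) followed by kernel parameters C, gamma."""
    mask = [random.randint(0, 1) for _ in range(N_FEATURES)]
    return mask + [random.uniform(0.1, 100.0), random.uniform(1e-4, 1.0)]

def fitness(g):
    # Stand-in objective: rewards small feature subsets and C, gamma
    # near (10, 0.01). A real GAMSVM would score cross-validated accuracy.
    mask, c, gamma = g[:N_FEATURES], g[-2], g[-1]
    return -sum(mask) - abs(c - 10) - 100 * abs(gamma - 0.01)

def evolve(pop, generations=50):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]                  # selection (elitist)
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))           # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(len(child))            # mutation: resample one gene
            child[i] = random_genome()[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

pop0 = [random_genome() for _ in range(20)]
best = evolve(list(pop0))
print(round(fitness(best), 2))
```

Because the top half of each generation is carried over unchanged, the best genome can only improve, which mirrors why the paper's GA steadily approaches good kernel parameters and feature subsets.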

Research Trend Analysis Using Bibliographic Information and Citations of Cloud Computing Articles: Application of Social Network Analysis (클라우드 컴퓨팅 관련 논문의 서지정보 및 인용정보를 활용한 연구 동향 분석: 사회 네트워크 분석의 활용)

  • Kim, Dongsung;Kim, Jongwoo
    • Journal of Intelligence and Information Systems / v.20 no.1 / pp.195-211 / 2014
  • Cloud computing services provide IT resources as on-demand services. Cloud computing is considered a key concept that will lead a shift from an ownership-based paradigm to a new pay-per-use paradigm, which can reduce the fixed cost of IT resources and improve flexibility and scalability. As IT services, cloud services evolved from earlier, similar computing concepts such as network computing, utility computing, server-based computing, and grid computing, so research into cloud computing is closely related to and combined with various relevant computing research areas. To identify promising research issues and topics in cloud computing, it is necessary to understand its research trends more comprehensively. In this study, we collect bibliographic and citation information for cloud computing research papers published in major international journals from 1994 to 2012, and analyze macroscopic trends and network changes in the citation relationships among papers and the co-occurrence relationships of keywords by utilizing social network analysis measures. Through this analysis, we can identify the relationships and connections among research topics in cloud computing related areas and highlight potential new research topics. In addition, we visualize dynamic changes of cloud computing research topics using a proposed "research trend map," which positions research topics in a two-dimensional space: keyword frequency on the X-axis and the rate of increase in the degree centrality of keywords on the Y-axis. Based on the values of these two dimensions, the space is divided into four areas: maturation, growth, promising, and decline.
An area with high keyword frequency but a low rate of increase in degree centrality is defined as a mature technology area; the area where both keyword frequency and the rate of increase in degree centrality are high is a growth technology area; the area where keyword frequency is low but the rate of increase in degree centrality is high is a promising technology area; and the area where both keyword frequency and the rate of increase in degree centrality are low is a declining technology area. Based on this method, cloud computing research trend maps make it possible to easily grasp the main research trends in cloud computing and to explain the evolution of research topics. According to the analysis of citation relationships, research papers on security, distributed processing, and optical networking for cloud computing rank highest by the PageRank measure. From the keyword analysis, cloud computing and grid computing showed high centrality in 2009; keywords for key elemental technologies such as data outsourcing, error detection methods, and infrastructure construction showed high centrality in 2010~2011; and in 2012, security, virtualization, and resource management showed high centrality. Moreover, interest in the technical issues of cloud computing was found to increase gradually. The annual research trend maps verified that security is located in the promising area, virtualization has moved from the promising area to the growth area, and grid computing and distributed systems have moved to the declining area. The results indicate that distributed systems and grid computing received much attention as similar computing paradigms in the early stage of cloud computing research.
That early stage was a period focused on understanding and investigating cloud computing as an emergent technology, linking it to relevant established computing concepts. Afterwards, security and virtualization technologies became the main issues in cloud computing, which is reflected in their movement from the promising area to the growth area in the research trend maps. Moreover, this study revealed that current research in cloud computing has rapidly shifted from a focus on technical issues to a focus on application issues, such as SLAs (Service Level Agreements).
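The four-area classification that drives the research trend map can be expressed as a simple rule over the two axes. The sketch below is illustrative: the cutoff thresholds and the keyword measurements are hypothetical, chosen only so the examples land in the areas the study reports:

```python
# Research trend map quadrants: keyword frequency (x-axis) vs. rate of
# increase in degree centrality (y-axis). Thresholds are hypothetical
# stand-ins for the medians a real analysis would use.
def trend_area(frequency, centrality_growth, f_cut=50, g_cut=0.1):
    if frequency >= f_cut and centrality_growth >= g_cut:
        return "growth"
    if frequency >= f_cut:
        return "maturation"
    if centrality_growth >= g_cut:
        return "promising"
    return "decline"

# Hypothetical (frequency, centrality-growth) measurements per keyword.
keywords = {
    "security": (40, 0.35),
    "virtualization": (80, 0.30),
    "grid computing": (30, 0.02),
    "cloud computing": (95, 0.05),
}
for kw, (f, g) in keywords.items():
    print(kw, "->", trend_area(f, g))
```

Recomputing this classification year by year is what lets the map show topics migrating between areas, such as virtualization moving from promising to growth.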