• Title/Summary/Keyword: Engineering Technology


Conjunction Assessments of the Satellites Transported by KSLV-II and Preparation of the Countermeasure for Possible Events in Timeline (누리호 탑재 위성들의 충돌위험의 예측 및 향후 상황의 대응을 위한 분석)

  • Shawn Seunghwan Choi;Peter Joonghyung Ryu;John Kim;Lowell Kim;Chris Sheen;Yongil Kim;Jaejin Lee;Sunghwan Choi;Jae Wook Song;Hae-Dong Kim;Misoon Mah;Douglas Deok-Soo Kim
    • Journal of Space Technology and Applications / v.3 no.2 / pp.118-143 / 2023
  • Space is becoming more commercialized. Despite its delayed start, space activity in Korea is attracting more nationwide support from both investors and the government. On May 25, 2023, KSLV-II, also called Nuri, successfully transported and inserted seven satellites into a sun-synchronous orbit at 550 km altitude. However, Starlink operates over 4,000 satellites around this altitude for its commercial activities. Hence, it is necessary to constantly monitor the collision risks of these satellites against resident space objects, including Starlink. Here we report quantitative research on these conjunctions, particularly between the Nuri satellites and Starlink. Our calculation shows that, on average, three times every day, a Nuri satellite encounters a Starlink satellite within 1 km with a probability of collision higher than 1.0E-5. A comparative study with KOMPSAT-5, also called Arirang-5, shows that its distribution of closest-approach distances differs significantly from those of the Nuri satellites. We also report a quantitative analysis of the collision-avoidance maneuver cost of Starlink satellites, and a strategy for Korea, as a late starter, to position itself among the space-leading countries. We used the AstroOne program for the analyses and compared its output with that of Socrates Plus of Celestrak. Two-line element (TLE) data were used for the computation.
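
The conjunction screening described above can be illustrated with a toy model. The sketch below uses a simple circular-orbit propagator and hypothetical orbital elements as stand-ins for the SGP4/TLE propagation and the AstroOne computations the study actually used; it brute-force scans for the distance of closest approach between two satellites near 550 km.

```python
import math

MU = 398600.4418   # Earth's gravitational parameter, km^3/s^2
RE = 6378.137      # Earth's equatorial radius, km

def circular_position(alt_km, raan_deg, inc_deg, phase_deg, t):
    """Position (km) on a circular orbit at time t (s). A toy two-body model;
    a real screen would propagate TLEs with SGP4, as AstroOne does."""
    r = RE + alt_km
    n = math.sqrt(MU / r ** 3)              # mean motion, rad/s
    u = math.radians(phase_deg) + n * t     # argument of latitude
    i, O = math.radians(inc_deg), math.radians(raan_deg)
    x, y = r * math.cos(u), r * math.sin(u)             # in-plane position
    xi, yi, zi = x, y * math.cos(i), y * math.sin(i)    # tilt by inclination
    return (xi * math.cos(O) - yi * math.sin(O),        # rotate by RAAN
            xi * math.sin(O) + yi * math.cos(O),
            zi)

def min_approach(orbit_a, orbit_b, duration_s=6000.0, step_s=1.0):
    """Brute-force scan for the distance of closest approach over ~one orbit."""
    best_d, best_t = float("inf"), 0.0
    t = 0.0
    while t <= duration_s:
        d = math.dist(circular_position(*orbit_a, t), circular_position(*orbit_b, t))
        if d < best_d:
            best_d, best_t = d, t
        t += step_s
    return best_d, best_t

# hypothetical elements: two satellites near 550 km in slightly offset planes
d_min, t_min = min_approach((550, 10.0, 97.6, 0.0), (550, 10.3, 97.6, 0.2))
print(f"closest approach: {d_min:.2f} km at t = {t_min:.0f} s")
```

An operational screen would additionally convert each close approach into a probability of collision using covariance data, which TLEs alone do not provide.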

Verification of Multi-point Displacement Response Measurement Algorithm Using Image Processing Technique (영상처리기법을 이용한 다중 변위응답 측정 알고리즘의 검증)

  • Kim, Sung-Wan;Kim, Nam-Sik
    • KSCE Journal of Civil and Environmental Engineering Research / v.30 no.3A / pp.297-307 / 2010
  • Recently, maintenance engineering and technology for civil and building structures have begun to draw significant attention, and the number of structures whose structural safety must be evaluated owing to deterioration and performance degradation is rapidly increasing. When stiffness decreases because of structural deterioration and member cracks, the dynamic characteristics of the structure change, so it is important to correctly evaluate the damaged areas and the extent of damage by analyzing the dynamic characteristics of the structure's actual behavior. In general, the typical measurement instruments used for structure monitoring are dynamic instruments. With existing dynamic instruments, it is not easy to obtain reliable data when the cable connecting the sensors to the device is long, and the one-to-one connection between each sensor and the instrument is uneconomical. Therefore, a method of measuring vibration at long range without attaching sensors is required. Representative non-contact methods for measuring structural vibration are the laser Doppler effect, GPS, and image processing techniques. The laser Doppler method shows relatively high accuracy but is uneconomical, while the GPS method requires expensive equipment, carries its own signal error, and has a limited sampling rate. In contrast, the image-based method is simple and economical, and is well suited to obtaining the vibration and dynamic characteristics of inaccessible structures. Camera image signals, instead of sensors, have recently been used by many researchers. However, the existing approach, which records a single target point attached to a structure and measures vibration using image processing, is limited in what it can measure. Therefore, this study conducted a shaking table test and a field load test to verify the validity of a method that can measure the multi-point displacement responses of structures using an image processing technique.
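
The core of an image-based displacement measurement, locating a target in each frame and differencing its position, can be sketched as follows. The frames and target here are synthetic, and a real system would apply template matching or marker detection to camera images rather than this toy centroid step.

```python
def centroid(frame, thresh=0.5):
    """Intensity-weighted centroid of pixels above a threshold -- a minimal
    stand-in for the target-localization step of image-based displacement
    measurement."""
    num_r = num_c = total = 0.0
    for r, row in enumerate(frame):
        for c, v in enumerate(row):
            if v >= thresh:
                num_r += r * v
                num_c += c * v
                total += v
    return (num_r / total, num_c / total)

def synth_frame(size, center):
    """Synthetic frame with a single bright target pixel (hypothetical data)."""
    f = [[0.0] * size for _ in range(size)]
    r, c = center
    f[r][c] = 1.0
    return f

# displacement response of one target between two "frames"
f0 = synth_frame(32, (10, 10))
f1 = synth_frame(32, (13, 10))   # target moved 3 px vertically
r0, c0 = centroid(f0)
r1, c1 = centroid(f1)
print("displacement (px):", (r1 - r0, c1 - c0))  # → (3.0, 0.0)
```

A multi-point measurement, as verified in the study, would simply run this localization for each tracked target region in every frame and convert pixel displacements to physical units via a calibration scale.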

A Study on the Development of Ultra-precision Small Angle Spindle for Curved Processing of Special Shape Pocket in the Fourth Industrial Revolution of Machine Tools (공작기계의 4차 산업혁명에서 특수한 형상 포켓 곡면가공을 위한 초정밀 소형 앵글 스핀들 개발에 관한 연구)

  • Lee Ji Woong
    • Journal of Practical Engineering Education / v.15 no.1 / pp.119-126 / 2023
  • Today, to improve the fuel efficiency and dynamic behavior of automobiles, automobile parts are becoming lighter and simpler. To simplify the design and manufacture of product shapes, various components are being integrated; for example, consolidating three products into one requires machining in very narrow areas. Existing parts rely on precision die casting or casting for ease of machining, and the multi-piece approach requires many processes and reduces the precision and strength of the parts. Integral manufacturing is very advantageous for simplifying the machining process and securing part strength, but a deep, narrow pocket cannot be machined with the equipment's own spindle. To solve this problem, research on cutting processes is being actively conducted, and multi-axis composite machining technology not only solves it but also offers many advantages, such as flexibly cutting composite shapes on a single machine tool that until now required several separate processes. However, the reality is that the expensive equipment increases manufacturing costs, and engineers who can operate such machines are scarce. When a five-axis machine produces products with deep, narrow sections, tool indirectness increases the cycle time and many machining problems occur. Therefore, dedicated machine tools and multi-axis composite machines should be used; alternatively, an angle spindle can serve as a special tool enabling multi-axis composite machining of five or more axes on a three-axis machining center. Various, continued studies are needed on areas such as machining vibration absorption, low heat generation, operational stability, excellent dimensional stability, and strength when using the angle spindle.

A Study on the Performance Verification Method of Small-Sized LTE-Maritime Transceiver (소형 초고속해상무선통신망 송수신기 성능 검증 방안에 관한 연구)

  • Seok Woo;Bu-young Kim;Woo-Seong Shim
    • Journal of the Korean Society of Marine Environment & Safety / v.29 no.7 / pp.902-909 / 2023
  • This study evaluated the performance of a small-sized LTE-Maritime (LTE-M) transceiver developed and promoted to expand the use of intelligent maritime traffic information services led by the Ministry of Oceans and Fisheries, with the aim of helping prevent maritime accidents. According to statistics, approximately 30% of all marine accidents in Korean waters involve ships weighing less than 3 tons, so the blind spots of maritime safety must be covered through the development of small-sized transceivers. A small transceiver may be used in fishing boats active in coastal waters and in water-leisure equipment near the coastline; it is therefore necessary to verify whether it provides sufficient performance and stable communication quality under real usage conditions. In this study, we reviewed the communication quality goals of the LTE-M network and the performance requirements for small-sized transceivers suggested by the Ministry of Oceans and Fisheries, and proposed a test plan to appropriately evaluate transceiver performance. The validity of the proposed test method was verified in six real sea areas with a high frequency of marine accidents. The downlink and uplink transmission speeds of the small-sized LTE-M transceiver reached 9 Mbps or more and 3 Mbps or more, respectively. In addition, using a coverage analysis system, coverage of more than 95% and 100% was confirmed in the intensive management zone (0-30 km) and the interest zone (30-50 km), respectively. The performance evaluation method and test results proposed in this paper are expected to serve as reference material for verifying the performance of transceivers, contributing to the spread of government-promoted e-navigation services and small-sized transceivers.
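
A pass/fail evaluation of the kind described can be sketched as below. The 9 Mbps / 3 Mbps thresholds follow the figures reported in the abstract; the drive-test sample points are hypothetical.

```python
def evaluate_zone(samples_mbps, dl_min=9.0, ul_min=3.0):
    """Percentage of measurement points meeting the downlink/uplink targets.
    Thresholds follow the 9 Mbps downlink / 3 Mbps uplink requirement; the
    sample data are illustrative, not the study's measurements."""
    ok = sum(1 for dl, ul in samples_mbps if dl >= dl_min and ul >= ul_min)
    return 100.0 * ok / len(samples_mbps)

# hypothetical measurement points: (downlink, uplink) in Mbps
zone_0_30km = [(11.2, 4.1), (9.8, 3.5), (10.4, 3.9), (12.0, 4.4)]
zone_30_50km = [(9.1, 3.2), (8.7, 3.1), (9.5, 3.4), (10.2, 3.6)]
print("0-30 km coverage:", evaluate_zone(zone_0_30km), "%")    # → 100.0 %
print("30-50 km coverage:", evaluate_zone(zone_30_50km), "%")  # → 75.0 %
```

A real coverage analysis system would aggregate far more points per zone and map them spatially, but the acceptance logic per point is this simple threshold test.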

Building Change Detection Methodology in Urban Area from Single Satellite Image (단일위성영상 기반 도심지 건물변화탐지 방안)

  • Seunghee Kim;Taejung Kim
    • Korean Journal of Remote Sensing / v.39 no.5_4 / pp.1097-1109 / 2023
  • Urban areas experience frequent small-scale changes to individual buildings, so an existing urban building database requires periodic updating to remain usable. However, collecting data on building changes across a wide urban area is difficult. In this study, we examine the possibility of detecting building changes and updating a building database using satellite images, which can capture a wide urban region in a single image. For this purpose, building areas in a satellite image are first extracted by projecting the 3D coordinates of building corners, available in the building database, onto the image. The building areas are then divided into roof and facade areas. By comparing the textures of the projected roof areas, building changes such as height changes or building removal can be detected. New height values are estimated by adjusting building heights until the projected roofs align with the actual roofs observed in the image. If a roof is projected onto the image but no building is observed there, this corresponds to a demolished building. New buildings are identified by checking buildings in the original image onto which no roof or facade areas are projected. Based on these results, the building database is updated in three categories: height update, building deletion, and new building creation. The method was tested with a KOMPSAT-3A image over Incheon Metropolitan City and the publicly available Incheon building database. Building change detection and database updating were carried out, and the updated building corners were then projected onto another KOMPSAT-3 image. The building areas projected from the updated building information agreed very well with the actual buildings in the image. This study confirms the feasibility of semi-automatic building change detection and building database updating based on a single satellite image. In the future, follow-up research is needed on enhancing the computational automation of the proposed method.
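
The height-adjustment step, varying the database height until the projected roof aligns with the roof observed in the image, can be sketched with a simplified relief-displacement model. The linear offset relation and the numbers below are illustrative assumptions; a KOMPSAT image would be projected with its full sensor model (e.g. RPCs), not this toy formula.

```python
import math

def roof_offset_px(height_m, incidence_deg, gsd_m=0.55):
    """Image-space lean of a roof corner relative to its footprint, from the
    simplified relief-displacement relation du = h * tan(incidence) / GSD.
    A toy stand-in for projecting database corners with a rigorous sensor model."""
    return height_m * math.tan(math.radians(incidence_deg)) / gsd_m

def estimate_height(observed_offset_px, incidence_deg, h_range=(0.0, 200.0), step=0.1):
    """Adjust the building height until the projected roof best aligns with the
    roof observed in the image (grid search over candidate heights)."""
    best_h, best_err = h_range[0], float("inf")
    h = h_range[0]
    while h <= h_range[1]:
        err = abs(roof_offset_px(h, incidence_deg) - observed_offset_px)
        if err < best_err:
            best_h, best_err = h, err
        h += step
    return best_h

true_h = 42.0
obs = roof_offset_px(true_h, 30.0)           # synthetic "observed" roof position
print(round(estimate_height(obs, 30.0), 1))  # → 42.0
```

In the actual pipeline the alignment error would be measured by comparing roof textures rather than a single corner offset, but the search-over-heights structure is the same.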

Analysis of sustainability changes in the Korean rice cropping system using an emergy approach (에머지 접근법을 이용한 국내 벼농사 시스템의 지속가능성 변화 분석)

  • Yongeun Kim;Minyoung Lee;Jinsol Hong;Yun-Sik Lee;June Wee;Jaejun Song;Kijong Cho
    • Korean Journal of Environmental Biology / v.41 no.4 / pp.482-496 / 2023
  • Many changes in the scale and structure of the Korean rice cropping system have occurred over the past few decades, yet insufficient research has been conducted on the sustainability of this system. This study analyzed changes in the sustainability of the Korean rice cropping system from a systems ecology perspective using an emergy approach. For this purpose, an emergy table was created for the Korean rice cropping system in 2011, 2016, and 2021, and an emergy-based indicator analysis was performed. The emergy analysis showed that the total emergy input to the rice cropping system decreased from 10,744E+18 sej year-1 to 8,342E+18 sej year-1 due to decreases in paddy field area from 2011 to 2021, and the proportion of renewable resources decreased by 1.4%. The emergy input per hectare decreased from 13.13E+15 sej ha-1 year-1 in 2011 to 11.89E+15 sej ha-1 year-1 in 2021, mainly because of decreases in nitrogen fertilizer usage and working hours. The amount of emergy used to grow 1 g of rice stayed the same between 2016 and 2021 (specific emergy: 13.3E+09 sej g-1), but the sustainability of the rice cropping system (emergy sustainability index, ESI) continued to decrease (2011: 0.107; 2016: 0.088; 2021: 0.086). This study provides quantitative information on the emergy input structure and characteristics of the Korean rice cropping system, and its results can serve as a valuable reference in establishing measures to improve the system's ecological sustainability.
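
The ESI reported above is computed from the standard emergy indicators. The sketch below shows that calculation; the split of the total input into renewable, non-renewable, and purchased fractions is a hypothetical illustration, not the paper's actual inventory.

```python
def emergy_indices(R, N, F):
    """Standard emergy-based indicators (all inputs in sej):
    R = renewable inputs, N = non-renewable local inputs, F = purchased inputs.
    EYR = Y/F, ELR = (N+F)/R, ESI = EYR/ELR (Brown & Ulgiati formulation)."""
    Y = R + N + F                 # total emergy yield
    EYR = Y / F                   # emergy yield ratio
    ELR = (N + F) / R             # environmental loading ratio
    return {"EYR": round(EYR, 3), "ELR": round(ELR, 3), "ESI": round(EYR / ELR, 3)}

# hypothetical split of a 10,744E+18 sej total input (illustration only)
result = emergy_indices(R=900e18, N=844e18, F=9000e18)
print(result)  # → {'EYR': 1.194, 'ELR': 10.938, 'ESI': 0.109}
```

With this illustrative split the ESI lands near the 2011 value of 0.107 reported above; a falling ESI reflects a system increasingly driven by purchased, non-renewable inputs.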

Characteristics That Affect Japanese Consumer Preferences for Chrysanthemum (국화 수출 확대를 위한 일본 소비자의 상품 선호도 분석)

  • Lim, Jin Hee;Seo, Ji Yeon;Shim, Myung Syun
    • Horticultural Science & Technology / v.31 no.5 / pp.640-647 / 2013
  • This study was conducted to develop an export strategy by surveying Japanese consumers' preferences for exported cut chrysanthemums. The survey was conducted twice by a local survey company in Japan and covered chrysanthemums for casual use and for the altar. After segmenting the Japanese consumers into groups, the results were analyzed through conjoint and cluster methods. For casual flowers, flower color and shape were the most heavily weighted selection criteria in every group. Group 1, comprising 60-year-old housewives residing in small cities with high school diplomas and annual incomes under 300 million yen, and group 2, comprising 40-year-old housewives of similar residence, education, and income, weighted flower shape more heavily than color, while group 3, comprising 50-year-old housewives residing in small cities with high school diplomas and annual incomes of 600 million yen, weighted color more heavily than shape. Consumption characteristics showed a pronounced tendency by age: the 40-50-year-old housewives preferred single flowers packed with other flowers, and the 60-year-old housewives preferred double flowers packed with chrysanthemums only. In flower color, the 50-60-year-old housewives preferred white and yellow, and the 40-year-old housewives pink and yellow. Therefore, new products should be developed considering these age-dependent preferences in flower shape and color. For altar chrysanthemums, segmentation of the Japanese consumers showed that every group weighted flower shape most heavily as a selection criterion. According to the analysis of consumption characteristics, group 1, comprising 30-40-year-old housewives residing in small cities with high school diplomas and incomes under 300 million yen, and group 2, comprising 20-year-old housewives residing in small cities with college diplomas and annual incomes under 300 million yen, are very sensitive to product price, while group 3, comprising 50-year-old housewives residing in small cities with high school diplomas and annual incomes under 300 million yen, is insensitive to price. The 30-50-year-old housewives preferred white and pink flowers, and the 20-year-old housewives yellow and pink. In flower shape, the 50-year-old housewives preferred the anemone shape, the 30-40-year-old housewives the double shape, and the 20-year-old housewives pompon shapes. Therefore, white double flowers for the 30-40-year-old housewives and yellow pompon flowers for the 20-year-old housewives should be produced at the lowest cost, while white anemone flowers should be produced at higher cost with high quality. In light of these results, exporters should understand purchasing patterns through the consumption characteristics of Japanese consumers, and should plan, create, and export market-oriented and consumer-oriented products to expand exports.
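
The conjoint analysis behind statements like "every group weighted flower shape most heavily" reduces to comparing attribute importances derived from part-worth utilities. The sketch below shows that standard calculation; the utility values are hypothetical, not the survey's estimates.

```python
def attribute_importance(part_worths):
    """Relative importance (%) of each conjoint attribute: the range of an
    attribute's part-worth utilities divided by the sum of all ranges.
    The utilities passed in are hypothetical illustration values."""
    ranges = {a: max(u.values()) - min(u.values()) for a, u in part_worths.items()}
    total = sum(ranges.values())
    return {a: round(100 * r / total, 1) for a, r in ranges.items()}

# hypothetical part-worth utilities for one consumer segment
utilities = {
    "color": {"white": 0.8, "yellow": 0.5, "pink": -1.3},
    "shape": {"single": 0.6, "double": 0.9, "pompon": -1.5},
    "price": {"low": 0.4, "high": -0.4},
}
importance = attribute_importance(utilities)
print(importance)  # → {'color': 39.6, 'shape': 45.3, 'price': 15.1}
```

A segment for which "shape" has the largest utility range, as here, is one that uses flower shape as its dominant selection criterion.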

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services / v.14 no.6 / pp.71-84 / 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, the realization of flexible storage expansion functions for processing a massive amount of unstructured log data and executing a considerable number of functions to categorize and analyze the stored unstructured log data is difficult in existing computer environments. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to process using the existing computing infrastructure's analysis tools and management system. The proposed system uses the IaaS (Infrastructure as a Service) cloud environment to provide a flexible expansion of computing resources and includes the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when a real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. 
Furthermore, because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions so that it can continue operating after recovering from a malfunction. Finally, by establishing a distributed database using NoSQL-based MongoDB, the proposed system provides methods of effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data, and their strict schemas make it difficult to distribute stored data across additional nodes when the amount of data increases rapidly. NoSQL does not provide the complex computations that relational databases offer, but it can easily expand through node dispersion when the amount of data grows rapidly; it is a non-relational database with a structure suited to processing unstructured data. NoSQL data models are usually classified as key-value, column-oriented, and document-oriented types. Of these, MongoDB, a representative document-oriented database with a free schema structure, is used in the proposed system. MongoDB was chosen because its flexible schema makes it easy to process unstructured log data, it facilitates flexible node expansion when the amount of data is rapidly increasing, and it provides an Auto-Sharding function that automatically expands storage. The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. 
When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the log analysis results of the MongoDB module, Hadoop-based analysis module, and MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and provided in real time by the log graph generator module. The log data aggregated per unit time are stored in the MongoDB module and plotted in a graph according to the user's analysis conditions. The aggregated log data in the MongoDB module are processed in parallel-distributed fashion by the Hadoop-based analysis module. A comparative evaluation against a log data processing system that uses only MySQL, measuring log insert and query performance, demonstrates the proposed system's superiority. Moreover, an optimal chunk size is confirmed through a MongoDB log-insert performance evaluation over various chunk sizes.
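
The collector's routing step, schema-free documents to the document store and latency-critical records to the relational store, can be sketched as below. The stores are in-memory stand-ins (a real deployment would use a MongoDB and a MySQL driver), and the record fields are hypothetical.

```python
import json
from collections import defaultdict

class LogCollector:
    """Minimal sketch of the log-collector routing step: real-time logs go to
    the relational store (MySQL in the paper), bulk unstructured logs to the
    document store (MongoDB). Both stores are in-memory stand-ins here."""
    def __init__(self):
        self.realtime_store = []                  # stands in for MySQL
        self.document_store = defaultdict(list)   # stands in for MongoDB collections

    def ingest(self, raw_line):
        record = json.loads(raw_line)             # schema-free: any fields allowed
        if record.get("priority") == "realtime":
            self.realtime_store.append(record)
        else:
            self.document_store[record.get("type", "misc")].append(record)

collector = LogCollector()
collector.ingest('{"type": "transaction", "priority": "realtime", "msg": "wire transfer"}')
collector.ingest('{"type": "auth", "user": "alice", "extra_field": 1}')
collector.ingest('{"type": "auth", "user": "bob"}')
print(len(collector.realtime_store), len(collector.document_store["auth"]))  # → 1 2
```

Note that the two `auth` records carry different fields; a document store accepts both without schema changes, which is exactly the flexibility the abstract attributes to MongoDB.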

Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan;An, Sangjun;Kim, Mintae;Kim, Wooju
    • Journal of Intelligence and Information Systems / v.26 no.4 / pp.127-148 / 2020
  • A data center is a physical facility for accommodating computer systems and related components, and is an essential foundation for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, proportional expansion of data center infrastructure is inevitable. Monitoring the health of data center facilities is a way to maintain and manage the system and prevent failures. If a failure occurs in some element of the facility, it may affect not only the relevant equipment but also other connected equipment, and may cause enormous damage. In particular, IT facility failures are irregular because of interdependence, and their causes are difficult to determine. Previous studies on predicting failures in data centers treated each server as a single, independent state, without considering that devices interact. Therefore, in this study, data center failures were classified into failures occurring inside the server (Outage A) and failures occurring outside the server (Outage B), and the analysis focused on complex failures occurring within servers. Server-external failures include power, cooling, and user errors; since such failures can be prevented in the early stages of data center construction, various solutions are already being developed. On the other hand, the causes of failures occurring within a server are difficult to determine, and adequate prevention has not yet been achieved, in particular because server failures do not occur singly: a failure may cause other server failures, or be triggered by another server. In other words, while existing studies analyzed failures under the assumption of independent servers, this study assumes that failures propagate between servers. 
To define complex failure situations in the data center, failure history data for each piece of equipment in the data center were used. Four major failure types are considered in this study: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures occurring on each device are sorted in chronological order, and when a failure occurs on one piece of equipment, any failure occurring on another piece of equipment within 5 minutes is defined as occurring simultaneously. After constructing sequences of the devices that failed at the same time, five devices that frequently fail together were selected, and the cases in which the selected devices failed simultaneously were confirmed through visualization. Since the server resource information collected for failure analysis is a time series with temporal flow, we used Long Short-Term Memory (LSTM), a deep learning algorithm that can predict the next state from previous states. In addition, unlike the single-server case, a Hierarchical Attention Network deep learning model was used, reflecting the fact that each server contributes differently to a complex failure. This method increases prediction accuracy by giving more weight to servers with greater impact on the failure. The study began by defining the failure types and selecting the analysis targets. In the first experiment, the same collected data were analyzed under both a single-server assumption and a multi-server assumption, and the results were compared. The second experiment improved the prediction accuracy for complex failures by optimizing a threshold for each server. 
In the first experiment, under the single-server assumption, three of the five servers were predicted not to have failed even though failures actually occurred; under the multi-server assumption, all five servers were correctly predicted to have failed. This result supports the hypothesis that servers affect one another, and confirms that prediction performance is superior when multiple servers are assumed rather than a single server. In particular, applying the Hierarchical Attention Network, which assumes that each server's effect differs, improved the analysis, and applying a different threshold for each server further improved prediction accuracy. This study shows that failures whose causes are difficult to determine can be predicted from historical data, and presents a model that can predict failures occurring on servers in data centers. Its results are expected to help prevent failures in advance.
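
The idea of weighting servers by their impact on a complex failure can be sketched with a small attention-pooling step. The per-server failure scores and attention weights below are hypothetical hand-set numbers standing in for the LSTM outputs and learned attention parameters of the actual Hierarchical Attention Network.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(server_scores, server_weights):
    """Hierarchical-attention-style pooling: each server's failure score is
    scaled by a (here hand-set, normally learned) weight, softmax-normalized,
    and used to form a weighted overall failure risk -- so servers with more
    impact on the outage contribute more to the prediction."""
    attn = softmax([s * w for s, w in zip(server_scores, server_weights)])
    pooled = sum(a * s for a, s in zip(attn, server_scores))
    return attn, pooled

scores = [0.2, 0.9, 0.4, 0.1, 0.7]     # per-server LSTM failure scores (hypothetical)
weights = [0.5, 2.0, 1.0, 0.5, 1.5]    # per-server attention weights (hypothetical)
attn, risk = attention_pool(scores, weights)
print("highest-attention server:", attn.index(max(attn)))  # → 1
```

In the full model these weights are learned end to end; the sketch only illustrates why a high-impact server dominates the pooled prediction.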

Health Assessment of the Nakdong River Basin Aquatic Ecosystems Utilizing GIS and Spatial Statistics (GIS 및 공간통계를 활용한 낙동강 유역 수생태계의 건강성 평가)

  • JO, Myung-Hee;SIM, Jun-Seok;LEE, Jae-An;JANG, Sung-Hyun
    • Journal of the Korean Association of Geographic Information Studies / v.18 no.2 / pp.174-189 / 2015
  • The objective of this study was to reconstruct spatial information from the results of investigating and evaluating the health of living organisms, habitats, and water quality at the aquatic ecosystem health monitoring points of the Nakdong River basin, to support rational decision making in the basin's aquatic ecosystem preservation and restoration policies using spatial analysis techniques, and to present efficient management methods. To analyze the aquatic ecosystem health of the Nakdong River basin, point data were constructed from the position of each monitoring point together with the health investigation and evaluation results of 250 survey sections. Applying spatial analysis techniques requires areal data, so spatial influence and trends were analyzed using kriging interpolation (ArcGIS 10.1, Geostatistical Analyst) and the point data were reconstructed into areal data. To analyze the spatial distribution characteristics of basin health from these results, hotspot (Getis-Ord $G^*_i$), LISA (Local Indicator of Spatial Association), and standard deviational ellipse analyses were used. The hotspot analysis showed that the hotspot basins for the biotic indices (TDI, BMI, FAI) were the Andong Dam upstream, Wangpicheon, and Imha Dam basins, where the health grades of the biotic indices were good, while the coldspot basins were the Nakdong River Namhae, Nakdong River mouth, and Suyeong River basins. The LISA analysis identified Gahwacheon, the Hapcheon Dam, and the upper Yeong River basin as exceptional areas: they had high bio-health indices, but their surrounding basins scored low and require management for aquatic ecosystem health. 
The hotspot basins for the physicochemical factor (BOD) were the lower Nakdong River, Suyeong River, Hoeya River, and Nakdong River Namhae basins, whereas the coldspot basins were the upstream basins of the Nakdong River tributaries, including the Andong Dam, Imha Dam, and Yeong River. The hotspots of the habitat and riverside environment factor (HRI) differed from the hotspots and coldspots of the other factors in the LISA results. In general, the habitat and riverside environments of the Nakdong River mainstream and tributaries, including the upper Nakdong River and the Andong Dam, Imha Dam, and Hapcheon Dam basins, were in good health. The coldspot basins of the habitat and riverside environment also showed low biotic and physicochemical health indices, and thus require habitat and riverside environment management. The time-series analysis with the standard deviational ellipse showed that the areas with good aquatic ecosystem health in terms of organisms, habitat, and riverside environment tended to move northward, while the BOD results showed different directions and concentrations by survey year. These analysis results can provide not only health management information for each monitoring point but also catchment-level aquatic ecosystem management information for working-level staff and water environment researchers, based on spatial information.
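
The hotspot/coldspot classification above rests on the Getis-Ord $G^*_i$ statistic, which ArcGIS computes per site. A minimal version of that statistic can be sketched as below; the health-index values and the simple "self plus adjacent sites" neighbourhood weights are hypothetical, standing in for the basin's 250 survey sections and their true spatial weights.

```python
import math

def getis_ord_gi_star(values, weights, i):
    """Getis-Ord Gi* hotspot statistic for site i, with binary neighbourhood
    weights that include site i itself. Positive z-scores indicate hotspots
    (clusters of high values), negative z-scores coldspots."""
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    w = weights[i]                                   # row of the weights matrix
    sw = sum(w)
    num = sum(wj * xj for wj, xj in zip(w, values)) - xbar * sw
    den = s * math.sqrt((n * sum(wj * wj for wj in w) - sw ** 2) / (n - 1))
    return num / den

# hypothetical health indices along a river reach; each site neighbours itself
# and its immediate up/downstream sites
vals = [82, 85, 88, 90, 55, 52, 50, 78, 80, 79]
W = [[1 if abs(i - j) <= 1 else 0 for j in range(10)] for i in range(10)]
g_hot = getis_ord_gi_star(vals, W, 2)    # inside the high-value cluster
g_cold = getis_ord_gi_star(vals, W, 5)   # inside the low-value cluster
print(round(g_hot, 2), round(g_cold, 2))
```

The site surrounded by high health indices gets a clearly positive z-score (a hotspot, like the Andong Dam upstream basins for the biotic indices) and the site in the low-value run a clearly negative one (a coldspot, like the Nakdong River mouth).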