• Title/Summary/Keyword: Processing Map

A study on the distribution of latent fingerprints on paper knife sheaths (간이 칼집에서의 잠재지문 분포에 관한 연구)

  • Kim, Hyo-Mi;Park, Gi-Hyun;Lee, Su-Bhin;Yu, Je-Seol
    • Analytical Science and Technology / v.34 no.6 / pp.251-258 / 2021
  • Knives are the weapons most frequently used in violent crimes, and criminals often leave behind knife sheaths made of paper and tape at crime scenes. Developing fingerprints from tape attached to a porous surface is difficult, so when evidence such as a paper knife sheath is found, effective development techniques, as well as the distribution of fingerprints on each surface, need to be explored. In this study, 50 knife sheaths were prepared. The cyanoacrylate fuming (CA fuming) method was applied to develop fingerprints on the non-adhesive side of the tape, and a dual-purpose 1,2-indanedione/Zn (1,2-IND/Zn) reagent was used to separate the tape from the paper while simultaneously developing fingerprints on the paper. Fingerprints on the adhesive side of the tape were developed using Wet Powder Black®. Using the R statistical environment (The R Foundation for Statistical Computing), heat maps were generated to indicate the locations of the fingerprints developed on each surface. More fingerprints were detected at the ends than in the center of the adhesive side of the tape; although the non-adhesive side of the tape and the paper showed no clear distribution pattern, many of the developed fingerprints had sufficient clarity for personal identification. These results may be applicable to processing evidence when paper sheaths are found at crime scenes.
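The heat-map step in the abstract was done in R; the same idea can be sketched in a few lines of Python. The coordinates, strip dimensions, and grid size below are hypothetical, not data from the paper:

```python
from collections import Counter

def location_heatmap(points, width, height, nx=3, ny=3):
    """Bin (x, y) fingerprint locations into an nx-by-ny grid and
    count the prints per cell -- the data behind a heat map."""
    grid = Counter()
    for x, y in points:
        cx = min(int(x / width * nx), nx - 1)
        cy = min(int(y / height * ny), ny - 1)
        grid[(cx, cy)] += 1
    return grid

# Hypothetical print coordinates (cm) on a 10 cm x 5 cm tape strip,
# binned into three cells along the strip:
prints = [(0.5, 1.0), (1.0, 2.0), (9.5, 4.0), (9.0, 1.0), (5.0, 2.5)]
heat = location_heatmap(prints, width=10.0, height=5.0, nx=3, ny=1)
# Both end cells hold two prints; the centre cell holds one.
```

In R, the same per-cell counts would feed directly into a heat-map plot.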

Apartment Price Prediction Using Deep Learning and Machine Learning (딥러닝과 머신러닝을 이용한 아파트 실거래가 예측)

  • Hakhyun Kim;Hwankyu Yoo;Hayoung Oh
    • KIPS Transactions on Software and Data Engineering / v.12 no.2 / pp.59-76 / 2023
  • Since the start of the COVID-19 era, the rise in apartment prices has been unprecedented. In this uncertain real estate market, price prediction research is very important. In this paper, a model is created to predict the future actual transaction prices of apartments after building a data set of 870,000 transactions from 2015 to 2020 by collecting and crawling as many variables as possible from various real estate sites. This study first addressed the multicollinearity problem by removing and combining variables. A total of five variable selection algorithms were then used to extract meaningful independent variables: Forward Selection, Backward Elimination, Stepwise Selection, L1 Regularization, and Principal Component Analysis (PCA). In addition, four machine learning and deep learning algorithms, a deep neural network (DNN), XGBoost, CatBoost, and linear regression, were trained after hyperparameter optimization, and their predictive power was compared. An additional experiment varied the number of nodes and layers of the DNN to find the most appropriate architecture. Finally, the best-performing model was used to predict actual apartment transaction prices for 2021, and the predictions were compared with the actual 2021 data. Through this, we are confident that machine learning and deep learning can help investors make the right decisions when purchasing homes in various economic situations.
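The multicollinearity step the paper describes can be illustrated with a minimal sketch: greedily drop one variable from every highly correlated pair. The feature names, sample values, and the 0.9 threshold are assumptions for illustration; the paper's own pipeline additionally combines variables and applies five selection algorithms:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def drop_collinear(columns, threshold=0.9):
    """Keep a variable only if it is not highly correlated with any
    variable already kept -- a greedy stand-in for the paper's
    multicollinearity step."""
    kept = []
    for name, values in columns.items():
        if all(abs(pearson(values, columns[k])) < threshold for k in kept):
            kept.append(name)
    return kept

# Hypothetical features: floor area in m^2 and in pyeong are near-duplicates.
data = {
    "area_m2":     [60.0, 85.0, 110.0, 132.0],
    "area_pyeong": [18.1, 25.7, 33.3, 39.9],
    "floor":       [3.0, 15.0, 7.0, 21.0],
}
kept = drop_collinear(data)  # area_pyeong is dropped as redundant
```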

Estimation of Illuminant Chromaticity by Equivalent Distance Reference Illumination Map and Color Correlation (균등거리 기준 조명 맵과 색 상관성을 이용한 조명 색도 추정)

  • Kim Jeong Yeop
    • KIPS Transactions on Software and Data Engineering / v.12 no.6 / pp.267-274 / 2023
  • In this paper, a method for estimating the illuminant chromaticity of the scene in an input image is proposed. The illuminant chromaticity is estimated using an illuminant reference region. The conventional method uses a fixed number of reference illuminants: by comparing the chromaticity distribution of the input image's pixels with chromaticity sets prepared in advance for each reference illuminant, the reference illuminant with the largest overlapping area is regarded as the scene illuminant of that image. In calculating the overlapping area, the weight for each reference illuminant is applied in the form of a Gaussian distribution, but no clear standard for the variance value has been presented. The proposed method extracts an independent reference chromaticity region for each reference illuminant, computes feature values in the r-g chromaticity plane of the RGB color coordinate system for all pixels of the input image, and then evaluates the similarity between the independent chromaticity regions and the features of the input image; the illuminant with the highest similarity is taken as the illuminant chromaticity of the image. The performance of the proposed method, evaluated on database images, showed an improvement of about 60% on average over the conventional basic method and about 53% over the conventional method with a Gaussian weight of 0.1.
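The r-g chromaticity projection the method relies on is simple to state in code; a minimal sketch (the convention for a pure-black pixel is an assumption, not from the paper):

```python
def rg_chromaticity(pixel):
    """Project an (R, G, B) pixel onto the r-g chromaticity plane
    used by the paper: r = R/(R+G+B), g = G/(R+G+B)."""
    r, g, b = pixel
    s = r + g + b
    if s == 0:
        return (1 / 3, 1 / 3)  # convention for pure black (an assumption)
    return (r / s, g / s)

# A neutral grey pixel lands on the achromatic point (1/3, 1/3);
# a reddish pixel shifts toward larger r:
grey = rg_chromaticity((128, 128, 128))
red = rg_chromaticity((200, 100, 100))
```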

An Approach Using LSTM Model to Forecasting Customer Congestion Based on Indoor Human Tracking (실내 사람 위치 추적 기반 LSTM 모델을 이용한 고객 혼잡 예측 연구)

  • Hee-ju Chae;Kyeong-heon Kwak;Da-yeon Lee;Eunkyung Kim
    • Journal of the Korea Society for Simulation / v.32 no.3 / pp.43-53 / 2023
  • In this study, our primary focus was accurately gauging the number of visitors and their real-time locations in commercial spaces. In a real cafe, using security cameras, we developed a system that offers live updates on available seating and predicts future congestion levels. By employing YOLO, a real-time object detection and tracking algorithm, the number of visitors and their locations are monitored in real time. This information is used to update the cafe's indoor map, enabling users to easily identify available seating. Moreover, we developed a model that predicts the congestion of the cafe in real time. The model, designed to learn visitor counts and movement patterns over diverse time intervals, is based on Long Short-Term Memory (LSTM) to address the vanishing gradient problem and on Sequence-to-Sequence (Seq2Seq) processing for data with temporal relationships. This system has the potential to significantly improve cafe management efficiency and customer satisfaction by delivering reliable predictions of cafe congestion to all users. Our research not only demonstrates the effectiveness and utility of indoor location tracking implemented through security cameras but also suggests potential applications in other commercial spaces.
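An LSTM/Seq2Seq forecaster of the kind described above is trained on windowed sequences. A minimal sketch of that windowing step, with made-up visitor counts (the sampling interval and window lengths are assumptions):

```python
def make_windows(series, n_in, n_out):
    """Slice a visitor-count series into (input, target) pairs -- the
    supervised shape a Seq2Seq LSTM forecaster trains on."""
    return [(series[i:i + n_in], series[i + n_in:i + n_in + n_out])
            for i in range(len(series) - n_in - n_out + 1)]

# Hypothetical visitor counts at a fixed sampling interval:
counts = [4, 6, 9, 12, 10, 7, 5]
windows = make_windows(counts, n_in=3, n_out=2)
# First pair: observe 3 past steps, predict the next 2.
```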

Measuring the Public Service Quality Using Process Mining: Focusing on N City's Building Licensing Complaint Service (프로세스 마이닝을 이용한 공공서비스의 품질 측정: N시의 건축 인허가 민원 서비스를 중심으로)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.35-52 / 2019
  • As public services are provided in various forms, including e-government, public demand for service quality is increasing. Although continuous measurement and improvement of the quality of public services is needed, traditional surveys are costly and time-consuming and have limitations. There is therefore a need for an analytical technique that can measure the quality of public services quickly and accurately at any time, based on the data the services themselves generate. In this study, we analyzed the quality of a public service using process mining techniques on the building licensing complaint service of N city. This service was chosen because it can secure the data necessary for analysis and because the approach can spread to other institutions through public service quality management. We conducted process mining on a total of 3,678 building licensing complaints filed in N city over two years from January 2014, and identified the process maps and the departments with high frequency and long processing times. The analysis showed that some departments were overloaded at certain points in time while others handled relatively few cases, and it raised the reasonable suspicion that an increase in the number of complaints increases the time required to complete them. The time required to complete a complaint varied from same-day to one year and 146 days. The cumulative frequency of the top four departments (the Sewage Treatment Division, the Waterworks Division, the Urban Design Division, and the Green Growth Division) exceeded 50%, and the cumulative frequency of the top nine departments exceeded 70%; the load was heavily concentrated in a small number of departments. Most complaints followed a variety of different process patterns.
The analysis also shows that the number of 'supplement' decisions has the greatest impact on the length of a complaint, because each 'supplement' decision requires a physical period in which the complainant revises and resubmits the documents, lengthening the time until the entire complaint is completed. The overall processing time could therefore be reduced drastically if documents were prepared thoroughly before filing. Clarifying and disclosing, in the system, the causes of 'supplement' decisions and how to resolve them would help complainants prepare in advance, give them confidence that documents prepared from the disclosed information will pass review, and make the complaint process predictably transparent. Documents prepared from pre-disclosed information are likely to be processed without problems, which not only shortens the processing period but also improves work efficiency from the processor's point of view by eliminating renegotiation and repeated handling. The results of this study can be used to find departments with a high complaint burden at certain points in time and to manage workforce allocation between departments flexibly. By analyzing the patterns of the departments participating in consultations by complaint characteristics, the results can also support automation or recommendation when selecting a consultation department. Furthermore, by applying machine learning techniques to the various data generated during the complaint process, the patterns of the process can be discovered and applied in the system for the automation and intelligence of complaint processing.
This study is expected to suggest future public service quality improvements through process mining analysis of civil services.
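The cumulative-frequency figures above (top four departments exceeding 50%) come from ranking departments by how often they appear in the event log. A minimal sketch with a hypothetical log; the department names and case IDs are illustrative:

```python
from collections import Counter

def cumulative_share(events):
    """Rank departments by frequency in the complaint event log and
    return each department's cumulative share of all events."""
    counts = Counter(dept for dept, _case in events)
    total = sum(counts.values())
    running, table = 0, []
    for dept, n in counts.most_common():
        running += n
        table.append((dept, running / total))
    return table

# Hypothetical (department, case-id) events:
log = [("Sewage Treatment", 1), ("Sewage Treatment", 2), ("Waterworks", 3),
       ("Sewage Treatment", 4), ("Waterworks", 5), ("Green Growth", 6)]
table = cumulative_share(log)
# The top department alone already covers half of all events.
```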

GIS based Development of Module and Algorithm for Automatic Catchment Delineation Using Korean Reach File (GIS 기반의 하천망분석도 집수구역 자동 분할을 위한 알고리듬 및 모듈 개발)

  • PARK, Yong-Gil;KIM, Kye-Hyun;YOO, Jae-Hyun
    • Journal of the Korean Association of Geographic Information Studies / v.20 no.4 / pp.126-138 / 2017
  • Recently, national interest in the environment has been increasing, and to deal with water environment-related issues swiftly and accurately, demand for GIS-based analysis of water environment data is growing. To meet this demand, a spatial-network-based stream network analysis map (Korean Reach File; KRF) supporting spatial analysis of water environment data was developed and is being provided. However, delineating catchment areas remains difficult, even though they are the basis for supplying the spatial data frequently required by users, for example when establishing remediation measures against water pollution accidents. Therefore, in this study a computer program was developed. The development process included designing a delineation method and developing an algorithm and modules. A DEM (Digital Elevation Model) and an FDR (Flow Direction) grid were used as the major inputs for automatically delineating catchment areas. The delineation algorithm was developed in three stages: catchment area grid extraction, boundary point extraction, and boundary line division. An add-in catchment delineation module based on ESRI's ArcGIS was also developed, in consideration of the productivity and utility of the program. Catchment areas delineated with the program were compared to those currently used by the government. The results showed that catchment areas were delineated efficiently from the digital elevation data; in regions with clear topographic slopes they were delineated accurately and swiftly. Although the catchment areas were not segmented accurately in some flat regions, such as paddy fields and urban districts with well-organized drainage facilities, the program clearly reduces the time required to delineate catchment areas.
In the future, the algorithm should be enhanced to exploit higher-precision digital elevation data and to reduce the calculation time when processing large data volumes.
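The FDR-based step can be sketched as a D8 upstream trace: every cell whose flow path reaches the outlet belongs to the catchment. The toy grid and the dict-based FDR encoding are assumptions for illustration (a real FDR grid stores direction codes per raster cell, and is assumed acyclic here); the paper's full algorithm adds boundary extraction on top of this:

```python
def delineate(fdr, outlet):
    """Collect every cell whose D8 flow path reaches the outlet.
    `fdr` maps each cell to its single downstream neighbour (None
    means the water flows off the grid); assumed acyclic."""
    catchment = {outlet}
    for cell in fdr:
        path, cur = [], cell
        while cur is not None and cur not in catchment:
            path.append(cur)
            cur = fdr.get(cur)
        if cur is not None:  # the path drains into the catchment
            catchment.update(path)
    return catchment

# Toy 2x3 grid; each cell lists its downstream neighbour:
fdr = {
    (0, 0): (0, 1), (0, 1): (1, 1), (0, 2): (1, 2),
    (1, 0): (1, 1), (1, 1): None,   (1, 2): None,
}
basin = delineate(fdr, outlet=(1, 1))
# (0, 2) and (1, 2) drain off-grid and are excluded.
```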

Control Policy for the Land Remote Sensing Industry (미국(美國)의 지상원격탐사(地上遠隔探査) 통제제도(統制制度))

  • Suh, Young-Duk
    • The Korean Journal of Air & Space Law and Policy / v.20 no.1 / pp.87-107 / 2005
  • 'Land remote sensing' is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. Narrowly speaking, this is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information. Remote sensing technology was initially developed for specific purposes, i.e., military and environmental observation. After the 1970s, however, as these high technologies were transferred to private industry, remote sensing became increasingly commercialized. Today, 0.61-meter high-resolution satellite images are available on the open market. While the privatization of land remote sensing has enabled the use of this information for disaster prevention, map creation, resource exploration, and more, it can also pose a serious threat to a sensed nation's national security if such high-resolution images fall into the hands of hostile groups, e.g., terrorists. The United States, the leading nation in land remote sensing technology, has been developing legislative control measures for the remote sensing industry and has created various policies to that end. Through the National Oceanic and Atmospheric Administration's authority under the Land Remote Sensing Policy Act, the US can restrict the sensing and recording of imagery at a resolution of 0.5 meter or better, and can prohibit the distribution of any images for the first 24 hours. In 1994, Presidential Decision Directive 23 ordered a 'shutter control' policy detailing heightened restrictions from the sensing to the commercialization of such sensitive data. Directive 23 was strengthened further in 2003 when the US Commercial Remote Sensing Policy was adopted.
These policies allow the Secretary of Defense and the Secretary of State to set up guidelines for authorizing land remote sensing and to limit the sensing and distribution of satellite images in the name of national security; the US government can also use civilian remote sensing systems when needed for national security purposes. The fact that the world's leading aerospace technology country acknowledged the magnitude of land remote sensing in the context of national security, and has made and continues to make great efforts to create the legislative measures needed to control this powerful technology, offers many lessons for our divided Korean peninsula. We, too, must continue working on the Korea National Space Development Act and related laws to develop the policies needed to ensure not only the development of the space industry but also national security.

THE CURRENT STATUS OF BIOMEDICAL ENGINEERING IN THE USA

  • Webster, John G.
    • Proceedings of the KOSOMBE Conference / v.1992 no.05 / pp.27-47 / 1992
  • Engineers have developed new instruments that aid in diagnosis and therapy. Ultrasonic imaging has provided a nondamaging method of imaging internal organs: a complex transducer emits ultrasonic waves at many angles and reconstructs a map of internal anatomy, as well as the velocities of blood in vessels. Fast computed tomography permits reconstruction of the 3-dimensional anatomy and perfusion of the heart at 20-Hz rates. Positron emission tomography uses certain isotopes that produce positrons, which react with electrons to simultaneously emit two gamma rays in opposite directions; it locates the region of origin using a ring of discrete scintillation detectors, each in electronic coincidence with an opposing detector. In magnetic resonance imaging, the patient is placed in a very strong magnetic field, and the precession of the hydrogen atoms is perturbed by an interrogating field to yield two-dimensional images of soft tissue with exceptional clarity. As an alternative to radiology image processing, film archiving, and retrieval, picture archiving and communication systems (PACS) are being implemented: images from computed radiography, magnetic resonance imaging (MRI), nuclear medicine, and ultrasound are digitized, transmitted, and stored in computers for retrieval at distributed workstations. In electrical impedance tomography, electrodes are placed around the thorax; a 50-kHz current is injected between two electrodes, voltages are measured on all the others, and a computer processes the data to yield an image of the resistivity of a 2-dimensional slice of the thorax. During fetal monitoring, a corkscrew electrode is screwed into the fetal scalp to measure the fetal electrocardiogram; correlation with uterine contractions yields information on the status of the fetus during delivery. To measure cardiac output by thermodilution, cold saline is injected into the right atrium.
A thermistor in the right pulmonary artery yields temperature measurements, from which cardiac output can be calculated. In impedance cardiography, the changes in electrical impedance are measured as the heart ejects blood into the arteries; motion artifacts are large, so signal averaging is useful during monitoring. An intraarterial blood gas monitoring system permits monitoring in real time: light is sent down optical fibers inserted into the radial artery, where it is absorbed by dyes that reemit the light at a different wavelength, and the emitted light travels back up the fibers to an external instrument that determines O2, CO2, and pH. Therapeutic devices include the electrosurgical unit, in which a high-frequency electric arc is drawn between the knife and the tissue; the arc cuts and the heat coagulates, preventing blood loss. Hyperthermia has demonstrated antitumor effects in patients in whom all conventional modes of therapy have failed; methods of raising tumor temperature include focused ultrasound, radio-frequency power through needles, and microwaves. When the heart stops pumping, the defibrillator is used to restore normal pumping: a brief, high-current pulse through the heart synchronizes all cardiac fibers to restore normal rhythm. When the cardiac rhythm is too slow, a cardiac pacemaker is implanted; an electrode within the heart stimulates the cardiac muscle to contract at the normal rate. When the cardiac valves are narrowed or leak, an artificial valve is implanted, with silicone rubber and Teflon used for biocompatibility. Artificial hearts powered by pneumatic hoses have been implanted in humans; however, the quality of life gradually degrades, and death ensues. When kidney stones develop, lithotripsy is used: a spark creates a pressure wave that is focused on the stone and fragments it, and the pieces pass out normally. When the kidneys fail, the blood is cleansed by hemodialysis.
Urea passes through a porous membrane into a dialysate bath to lower its concentration in the blood. The blind are able to read by scanning the Optacon with their fingertips: a camera scans letters and converts them to an array of vibrating pins. The deaf are able to hear using a cochlear implant: a microphone detects sound and divides it into frequency bands, and 22 electrodes within the cochlea stimulate the acoustic nerve to provide sound patterns. For those who have lost muscle function in the limbs, researchers are implanting electrodes to stimulate the muscles; sensors in the legs and arms feed signals back to a computer that coordinates the stimulators to provide limb motion. For those with a high spinal cord injury, a puff-and-sip switch can control a computer, permitting the disabled person to operate it and communicate with the outside world.
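The thermodilution calculation mentioned above is conventionally the Stewart-Hamilton relation. A hedged sketch: the correction constant `k` and all sample numbers are assumptions, not values from the talk:

```python
def cardiac_output(v_inj_ml, t_blood, t_inj, curve_area, k=1.08):
    """Stewart-Hamilton estimate of cardiac output in mL/min:
    injectate volume times the blood-injectate temperature gap,
    scaled by a saline-in-blood correction k (assumed value) and
    divided by the area under the thermodilution curve (degC*s)."""
    return v_inj_ml * (t_blood - t_inj) * k * 60.0 / curve_area

# Made-up numbers: 10 mL of 0 degC saline into 37 degC blood, with a
# temperature-drop curve integrating to 4.8 degC*s:
co = cardiac_output(10.0, 37.0, 0.0, 4.8)  # about 5000 mL/min
```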

Development of Minimal Processing Technology for Korean Fruit and Vegetables (과실 및 채소의 신선편의 식품화 개발기술에 관한 연구)

  • 김건희
    • Korean journal of food and cookery science / v.16 no.6 / pp.577-583 / 2000
  • The purpose of this study was to investigate the effectiveness of various quality-preservative treatments for extending the shelf life and maintaining the quality of minimally processed fruit and vegetables produced in Korea. To determine suitable treatments for delaying quality deterioration, fresh Asian pears and Chinese cabbages were sliced and treated with various quality preservatives (1% CaCl2, 1% NaCl, 3% sucrose, 1% Ca-lactate, 1% vitamin C, 0.05% chitosan + 1% vitamin C, 0.1% Sporix + 1% vitamin C, hot water (60℃), 0.2% L-cysteine), packed in polyethylene film (60 ㎛ thick), and stored at 4℃/0℃ or 20℃. Various biological and sensory tests were performed to evaluate the quality changes in the minimally processed products. The results indicated that treating Chinese cabbage with 1% CaCl2 at 4℃, and with 1% CaCl2 and 1% NaCl at 20℃, was most effective in maintaining quality and minimizing biochemical changes during storage. For sliced Asian pears, the 0.2% L-cysteine and 1% NaCl treatments were effective in reducing browning, and the 1% CaCl2 treatment was the most effective in preventing softening during storage at 20℃ and 0℃. Modified-atmosphere packaging of Pleurotus ostreatus and Lentinus edodes yielded significantly different shelf lives depending on the packaging material, packaging thickness, and storage temperature. Sealed packaging in polyethylene film (60 ㎛ thick) kept both kinds of mushrooms in good quality, extending shelf life by 30-50% at 20℃ and by 30-130% at 0℃. To minimize the quality deterioration observed with polyethylene film packaging, quality preservatives such as KMnO4 and KHSO2 + K2S2O5 for SO2 generation were added inside the mushroom packaging.
The best condition for maintaining good quality longest was packaging with polyethylene film + SO2, which extended shelf life by 50-80% for both Pleurotus ostreatus and Lentinus edodes at 20℃ and 0℃.

Development of Quality Assurance Program for the On-board Imager Isocenter Accuracy with Gantry Rotation (갠트리 회전에 의한 온-보드 영상장치 회전중심점의 정도관리 프로그램 개발)

  • Cheong, Kwang-Ho;Cho, Byung-Chul;Kang, Sei-Kwon;Kim, Kyoung-Joo;Bae, Hoon-Sik;Suh, Tae-Suk
    • Progress in Medical Physics / v.17 no.4 / pp.212-223 / 2006
  • The positional accuracy of the on-board imager (OBI) isocenter during gantry rotation is presented in this paper. Three different types of automatic evaluation of the discrepancy between the therapeutic and OBI isocenters, using digital image processing techniques as well as the procedure stated in the customer acceptance procedure (CAP), were applied to check OBI isocenter migration trends. Two kinds of kV x-ray image sets, obtained at OBI source angles of 0°, 90°, 180°, and 270° and at every 10°, as well as raw projection data for cone-beam CT reconstruction, were used for the evaluation methods, and the efficiency of each method was estimated. If a user needs an isocenter variation map over a full gantry rotation, taking an OBI image every 10° and fitting a 5th-order polynomial is appropriate. For a routine quality assurance (QA) check of OBI isocenter accuracy, however, the four OBI images taken at source angles of 0°, 90°, 180°, and 270° are adequate. The maximal discrepancy was 0.44 mm, observed between the OBI source angles of 90° and 180°, and OBI isocenter accuracy was maintained below 0.5 mm for a year. The proposed QA program may help implement a reasonable routine QA of OBI isocenter accuracy without great effort.
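The four-angle QA check recommended above can be sketched as a worst-pair comparison. The 1-D offsets below are hypothetical (real isocenter measurements are 2-D), and the 0.5 mm tolerance follows the figure reported in the abstract:

```python
def isocenter_qa(offsets_mm, tolerance=0.5):
    """Worst-pair QA check: the largest isocenter shift between any
    two measured gantry angles must stay within tolerance (mm)."""
    values = list(offsets_mm.values())
    worst = max(abs(a - b) for a in values for b in values)
    return worst, worst <= tolerance

# Hypothetical 1-D offsets (mm) at the four cardinal OBI source angles:
shifts = {0: 0.10, 90: -0.12, 180: 0.32, 270: 0.05}
worst, ok = isocenter_qa(shifts)  # worst pair is 90 vs 180 degrees
```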
