• Title/Summary/Keyword: K-file

Search results: 1,990

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services / v.14 no.6 / pp.71-84 / 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, the realization of flexible storage expansion functions for processing a massive amount of unstructured log data and executing a considerable number of functions to categorize and analyze the stored unstructured log data is difficult in existing computer environments. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to process using the existing computing infrastructure's analysis tools and management system. The proposed system uses the IaaS (Infrastructure as a Service) cloud environment to provide a flexible expansion of computing resources and includes the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when a real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. Furthermore, because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions for the system to continually operate after it recovers from a malfunction. Finally, by establishing a distributed database using the NoSQL-based Mongo DB, the proposed system provides methods of effectively processing unstructured log data. Relational databases such as the MySQL databases have complex schemas that are inappropriate for processing unstructured log data. Further, strict schemas like those of relational databases cannot expand nodes in the case wherein the stored data are distributed to various nodes when the amount of data rapidly increases. NoSQL does not provide the complex computations that relational databases may provide but can easily expand the database through node dispersion when the amount of data increases rapidly; it is a non-relational database with an appropriate structure for processing unstructured data. The data models of the NoSQL are usually classified as Key-Value, column-oriented, and document-oriented types. Of these, the representative document-oriented data model, MongoDB, which has a free schema structure, is used in the proposed system. MongoDB is introduced to the proposed system because it makes it easy to process unstructured log data through a flexible schema structure, facilitates flexible node expansion when the amount of data is rapidly increasing, and provides an Auto-Sharding function that automatically expands storage. 
The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to the type of log data and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the results of the log analysis of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and served in real time by the log graph generator module. The log data aggregated per unit time are stored in the MongoDB module and plotted in a graph according to the user's various analysis conditions. The aggregated log data in the MongoDB module are processed in a parallel, distributed manner by the Hadoop-based analysis module. A comparative evaluation of log insert and query performance is carried out against a log data processing system that uses only MySQL, and this evaluation demonstrates the proposed system's superiority. Moreover, an optimal chunk size is determined by evaluating MongoDB's log insert performance for various chunk sizes.
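
As a rough illustration of the schema-free storage and auto-sharding that this abstract relies on, the Python sketch below (using pymongo; the database, collection, and field names and the hourly aggregation are illustrative assumptions, not details from the paper) stores heterogeneous log documents and groups them per unit time, roughly the role the MongoDB module plays for the log graph generator.

    # Minimal sketch: schema-free log storage with auto-sharding in MongoDB.
    # Database, collection and field names are illustrative, not from the paper.
    from datetime import datetime, timezone
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # a mongos router in a sharded cluster
    db = client["banklogs"]

    # Documents may carry different fields; no schema migration is required.
    db.raw_logs.insert_many([
        {"ts": datetime.now(timezone.utc), "type": "transaction", "branch": "A01", "amount": 15000},
        {"ts": datetime.now(timezone.utc), "type": "login_failure", "client_ip": "10.0.0.7"},
    ])

    # Let storage grow by adding nodes (only works against a sharded cluster).
    client.admin.command("enableSharding", "banklogs")
    client.admin.command("shardCollection", "banklogs.raw_logs", key={"ts": "hashed"})

    # Aggregate logs per unit time (here: per hour) for the graph generator.
    hourly = db.raw_logs.aggregate([
        {"$group": {"_id": {"$dateToString": {"format": "%Y-%m-%d %H", "date": "$ts"}},
                    "count": {"$sum": 1}}},
        {"$sort": {"_id": 1}},
    ])
    for bucket in hourly:
        print(bucket)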

Multi-day Trip Planning System with Collaborative Recommendation (협업적 추천 기반의 여행 계획 시스템)

  • Aprilia, Priska;Oh, Kyeong-Jin;Hong, Myung-Duk;Ga, Myeong-Hyeon;Jo, Geun-Sik
    • Journal of Intelligence and Information Systems / v.22 no.1 / pp.159-185 / 2016
  • Planning a multi-day trip is a complex and time-consuming task. It usually starts with selecting a list of points of interest (POIs) worth visiting and then arranging them into an itinerary, taking into consideration various constraints and preferences. When choosing POIs to visit, one might ask friends to suggest them, search for information on the Web, or seek advice from travel agents; however, those options have their limitations. First, the knowledge of friends is limited to the places they have visited. Second, the tourism information on the internet may be vast, but it can also force one to invest a lot of time reading and filtering it. Lastly, travel agents might be biased towards providers of certain travel products when suggesting itineraries. In recent years, many researchers have tried to deal with the huge amount of tourism information available on the internet, exploring the wisdom of the crowd through the overwhelming number of images shared by people on social media sites. Furthermore, trip planning problems are usually formulated as 'Tourist Trip Design Problems' and solved using various search algorithms with heuristics. Recommendation systems using various techniques have been built to cope with the overwhelming tourism information available on the internet. Prediction models of recommendation systems are typically built from a large dataset; however, such a dataset is not always available. For other models, especially those that require input from people, human computation has emerged as a powerful and inexpensive approach. This study proposes CYTRIP (Crowdsource Your TRIP), a multi-day trip itinerary planning system that draws on the collective intelligence of contributors in recommending POIs. In order to enable the crowd to collaboratively recommend POIs to users, CYTRIP provides a shared workspace. In the shared workspace, the crowd can recommend as many POIs to as many requesters as they can, and they can also vote on the POIs recommended by other people when they find them interesting. In CYTRIP, anyone can make a contribution by recommending POIs to requesters based on the requesters' specified preferences. CYTRIP takes the recommended POIs as input to build a multi-day trip itinerary, taking into account the user's preferences, the various time constraints, and the locations. The input then becomes a multi-day trip planning problem formulated in Planning Domain Definition Language 3 (PDDL3). A sequence of actions formulated in a domain file is used to achieve the goals of the planning problem, which are the recommended POIs to be visited. The multi-day trip planning problem is highly constrained: sometimes it is not feasible to visit all the recommended POIs with the limited resources available, such as the time the user can spend. In order to cope with an unachievable goal that could leave the other goals without a solution, CYTRIP selects a set of feasible POIs prior to the planning process. The planning problem is created for the selected POIs and fed into the planner, and the solution returned by the planner is then parsed into a multi-day trip itinerary and displayed to the user on a map. The proposed system is implemented as a web-based application built with PHP on the CodeIgniter web framework. In order to evaluate the proposed system, an online experiment was conducted.
From the online experiment, results show that with the help of the contributors, CYTRIP can plan and generate a multi-day trip itinerary that is tailored to the users' preferences and bound by their constraints, such as location or time constraints. The contributors also find that CYTRIP is a useful tool for collecting POIs from the crowd and planning a multi-day trip.
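
The feasibility step described above, selecting a subset of recommended POIs before the PDDL3 problem is generated, lends itself to a short sketch. The greedy keep-the-most-voted rule, the data fields, and the goal syntax below are assumptions for illustration; the paper does not spell out its exact selection procedure.

    # Illustrative pre-planning step: pick a feasible subset of crowd-recommended POIs
    # before formulating the PDDL3 problem. The greedy-by-votes rule and the fields
    # below are assumptions for the sketch, not the procedure from the paper.
    from dataclasses import dataclass

    @dataclass
    class POI:
        name: str
        votes: int          # endorsements from the shared workspace
        visit_minutes: int  # estimated time spent at the POI

    def select_feasible(pois: list[POI], budget_minutes: int) -> list[POI]:
        """Greedily keep the most-voted POIs that still fit the trip's time budget."""
        chosen, used = [], 0
        for poi in sorted(pois, key=lambda p: p.votes, reverse=True):
            if used + poi.visit_minutes <= budget_minutes:
                chosen.append(poi)
                used += poi.visit_minutes
        return chosen

    candidates = [POI("Gyeongbokgung", 12, 120), POI("N Seoul Tower", 9, 90),
                  POI("Bukchon Village", 7, 60), POI("Lotte World", 5, 300)]
    feasible = select_feasible(candidates, budget_minutes=8 * 60)  # one 8-hour day

    # Each feasible POI then becomes a goal in the PDDL3 problem file.
    goals = "\n".join(f"    (visited {p.name.replace(' ', '_')})" for p in feasible)
    print(f"(:goal (and\n{goals}\n))")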

MICROLEAKAGE OF RESILON: EFFECTS OF SEVERAL SELF-ETCHING PRIMER (Resilon을 이용한 근관충전 시 수종의 치면처리제에 따른 미세누출 평가)

  • O, Jong-Hyeon;Park, Se-Hee;Shin, Hye-Jin;Cho, Kyung-Mo;Kim, Jin-Woo
    • Restorative Dentistry and Endodontics / v.33 no.2 / pp.133-140 / 2008
  • The purpose of this study was to compare apical microleakage in root canals filled with Resilon using several self-etching primers and a methacrylate-based root canal sealer. Seventy single-rooted human teeth were used in this study. The canals were instrumented in a crown-down manner with Gates Glidden drills and .04 taper ProFiles to ISO #40. The teeth were randomly divided into four experimental groups of 15 teeth each, according to the root canal filling material and self-etching primer, and two control groups (positive and negative) of 5 teeth each, as follows: group 1 - gutta-percha and AH26® sealer; group 2 - Resilon, RealSeal™ primer and RealSeal™ sealer; group 3 - Resilon, Clearfil SE Bond® primer and RealSeal™ sealer; group 4 - Resilon, AdheSE® primer and RealSeal™ sealer. Apical leakage was measured as the maximum length of linear dye penetration in roots sectioned longitudinally with a diamond disk. Statistical analysis was performed using one-way ANOVA followed by Scheffe's test. There were no statistical differences in mean apical dye penetration among groups 2, 3 and 4 (the self-etching primer groups), and no statistical difference among groups 1, 2 and 3; however, there was a statistical difference between groups 1 and 4 (p < 0.05). Group 1 showed the least dye penetration. According to the results of this study, Resilon with a self-etching primer did not seal root canals better than gutta-percha with AH26®, and there was no significant difference in apical leakage among the three self-etching primers.

Evaluation of Real-time Measurement Liver Tumor's Movement and Synchrony™ System's Accuracy of Radiosurgery using a Robot CyberKnife (로봇사이버나이프를 이용한 간 종양의 실시간 움직임 측정과 방사선수술 시 호흡추적장치의 정확성 평가)

  • Kim, Gha-Jung;Shim, Su-Jung;Kim, Jeong-Ho;Min, Chul-Kee;Chung, Weon-Kuu
    • Radiation Oncology Journal / v.26 no.4 / pp.263-270 / 2008
  • Purpose: This study aimed to quantitatively measure tumor movement in real time and to evaluate treatment accuracy in liver tumor patients who underwent radiosurgery with the Synchrony™ respiratory motion tracking system of the robotic CyberKnife. Materials and Methods: The study subjects included 24 liver tumor patients who underwent CyberKnife treatment, comprising 64 treatment sessions with the Synchrony™ respiratory motion tracking system. For all patients, 4 to 6 acupuncture needles were inserted into the vicinity of the liver tumor under ultrasonographic guidance. A treatment plan was set up using CT images acquired for treatment planning. The position of the acupuncture needles was identified at every treatment session from the digitally reconstructed radiographs (DRR) prepared at the time of treatment planning and from X-ray images taken in real time. The results were stored by the Motion Tracking System (MTS) in the Mtsmain.log treatment file, and the movement of the tumor was measured from these records. In addition, the accuracy of CyberKnife radiosurgery was evaluated from the correlation errors between the real-time positions of the acupuncture needles and the predicted coordinates. Results: The maximum and average translational movements of the liver tumor were 23.5 mm and 13.9 ± 5.5 mm in the superior-inferior direction, 3.9 mm and 1.9 ± 0.9 mm in the left-right direction, and 8.3 mm and 4.9 ± 1.9 mm in the anterior-posterior direction, respectively. The maximum and average rotational movements were 3.3° and 2.6 ± 1.3° about the X (left-right) axis, 4.8° and 2.3 ± 1.0° about the Y (cranio-caudal) axis, and 3.9° and 2.8 ± 1.1° about the Z (anterior-posterior) axis, respectively. The average correlation error, which represents the treatment accuracy, was 1.1 ± 0.7 mm. Conclusion: In this study, the real-time movement of liver tumors during radiosurgery was verified quantitatively, and the accuracy of radiosurgery with the robot's Synchrony™ respiratory motion tracking system was evaluated. These results are expected to inform the choice of treatment volume in radiosurgery or conventional radiotherapy and to provide useful information on liver tumor movement.
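
The correlation error reported above is essentially the distance between where the tracking model predicted each fiducial to be and where it was actually imaged. The sketch below computes that summary from a hypothetical table of predicted and measured needle coordinates; the real Mtsmain.log layout is not documented in the abstract, and the numbers are made up for illustration.

    # Sketch of the post-processing idea: compare predicted and imaged fiducial
    # (needle) positions and summarize the correlation error.
    # The column layout and values are hypothetical, not the real Mtsmain.log format.
    import numpy as np

    # Each row: predicted x, y, z and measured x, y, z in mm (illustrative values).
    samples = np.array([
        [10.2, -3.1, 45.0,  10.9, -3.4, 44.2],
        [11.0, -2.8, 43.5,  11.6, -3.0, 42.9],
        [ 9.8, -3.3, 46.1,  10.1, -3.5, 45.3],
    ])

    predicted, measured = samples[:, :3], samples[:, 3:]
    errors = np.linalg.norm(predicted - measured, axis=1)  # 3D distance per time point

    print(f"correlation error: mean {errors.mean():.1f} mm, "
          f"SD {errors.std(ddof=1):.1f} mm, max {errors.max():.1f} mm")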

An Analysis of Big Video Data with Cloud Computing in Ubiquitous City (클라우드 컴퓨팅을 이용한 유시티 비디오 빅데이터 분석)

  • Lee, Hak Geon;Yun, Chang Ho;Park, Jong Won;Lee, Yong Woo
    • Journal of Internet Computing and Services / v.15 no.3 / pp.45-52 / 2014
  • The Ubiquitous City (U-City) is a smart, intelligent city that satisfies people's desire to enjoy IT services with any device, anytime, anywhere. It is a future city model based on the Internet of Everything or Things (IoE or IoT), and it includes a large number of networked video cameras. These networked video cameras, together with sensors, support many U-City services as one of the main input sources, and they continuously generate a huge amount of video information, truly big data for the U-City. The U-City is usually required to manipulate this big data in real time, which is not easy at all. It is also often required that the accumulated video data be analyzed to detect an event or find a figure among them, which demands a lot of computational power and usually takes a long time. Research that tries to reduce the processing time of such big video data can already be found, and cloud computing is a good way to address this matter. Among the many cloud computing methodologies that could be applied, MapReduce is an interesting and attractive one: it has many advantages and is gaining popularity in many areas. Video cameras evolve day by day and their resolution improves sharply, leading to exponential growth of the data produced by networked video cameras; we are dealing with real big data when we handle the video image data produced by high-quality cameras. Video surveillance systems of this scale were not practical before cloud computing, but they are now spreading widely in U-Cities thanks to such methodologies. Because video data are unstructured, it is not easy to find good research results on analyzing them with MapReduce. This paper presents an analysis system for video surveillance, a cloud-computing-based video data management system that is easy to deploy, flexible, and reliable. It consists of the video manager, the video monitors, the storage for the video images, the storage client, and the streaming-IN component. The "video monitor" for the video images consists of a "video translator" and a "protocol manager". The "storage" contains the MapReduce analyzer. All components were designed according to the functional requirements of a video surveillance system. The "streaming-IN" component receives the video data from the networked video cameras and delivers them to the "storage client"; it also manages the network bottleneck to smooth the data stream. The "storage client" receives the video data from the "streaming-IN" component and stores them in the storage; it also helps other components access the storage. The "video monitor" component transfers the video data by smooth streaming and manages the protocols. The "video translator" sub-component enables users to manage the resolution, the codec, and the frame rate of the video image. The "protocol" sub-component manages the Real Time Streaming Protocol (RTSP) and the Real Time Messaging Protocol (RTMP). We use the Hadoop Distributed File System (HDFS) as the cloud computing storage: Hadoop stores the data in HDFS and provides a platform that can process the data with the simple MapReduce programming model. We suggest our own methodology for analyzing the video images using MapReduce; the workflow of the video analysis is presented and explained in detail in this paper. The performance evaluation was carried out experimentally, and we found that the proposed system worked well. The performance evaluation results are presented in this paper with analysis. With our cluster system, we used compressed 1920×1080 (FHD) resolution video data, the H.264 codec, and HDFS as the video storage. We measured the processing time according to the number of frames per mapper. By tracing the optimal splitting size of the input data and the processing time according to the number of nodes, we confirmed the linearity of the system's performance.
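
A Hadoop-streaming-style sketch of the mapper/reducer split described above is given below. It assumes frames have already been decoded and listed one per input line, and the per-frame detector is a stub; the paper's actual workflow, including the tuned number of frames per mapper, is only summarized in the abstract.

    # Hadoop-streaming-style sketch of the MapReduce video analysis idea above.
    # Assumes each input line names one decoded frame, e.g. "cam03 frame_000123.jpg";
    # the detector below is a stub standing in for whatever per-frame analysis is run.
    import sys

    def detect_event(frame_path: str) -> bool:
        """Placeholder per-frame analysis (e.g. figure/event detection)."""
        return frame_path.endswith(("3.jpg", "7.jpg"))  # stub rule for illustration

    def mapper(lines):
        for line in lines:
            camera_id, frame_path = line.split()
            if detect_event(frame_path):
                print(f"{camera_id}\t1")           # key \t value for the shuffle phase

    def reducer(lines):
        counts = {}
        for line in lines:
            camera_id, value = line.rstrip("\n").split("\t")
            counts[camera_id] = counts.get(camera_id, 0) + int(value)
        for camera_id, total in sorted(counts.items()):
            print(f"{camera_id}\t{total}")          # events detected per camera

    if __name__ == "__main__":
        # run as:  python job.py map < frames.txt   or   python job.py reduce < mapped.txt
        mode = sys.argv[1] if len(sys.argv) > 1 else "map"
        (mapper if mode == "map" else reducer)(sys.stdin)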

Finite Element Method Modeling for Individual Malocclusions: Development and Application of the Basic Algorithm (유한요소법을 이용한 환자별 교정시스템 구축의 기초 알고리즘 개발과 적용)

  • Shin, Jung-Woog;Nahm, Dong-Seok;Kim, Tae-Woo;Lee, Sung Jae
    • The Korean Journal of Orthodontics / v.27 no.5 s.64 / pp.815-824 / 1997
  • The purpose of this study is to develop the basic algorithm for finite element modeling of individual malocclusions. Usually, a great deal of time is spent on preprocessing. To reduce the time required, we developed a standardized procedure for measuring the position of each tooth and a program that performs the preprocessing automatically. The following procedures were carried out to complete this study. 1. The morphologies of the 28 teeth were constructed three-dimensionally for finite element analysis and saved as separate files. 2. Standard brackets were attached so that the FA points coincide with the centers of the brackets. 3. A study model of the patient was made. 4. Using the study model, the crown inclination, the angulation, and the vertical distance from the tip of each tooth were measured with specially designed tools. 5. The arch form was determined from a picture of the model using an image processing technique. 6. The measured data were input as a rotational matrix. 7. The program produces an output file containing the necessary information about the three-dimensional positions of the teeth, which is applicable to several commonly used finite element programs. The program for the basic algorithm was written in Turbo C, and the resulting output file was applied to ANSYS. This standardized model-measuring procedure and the program reduce the time required, especially for preprocessing, and can easily be applied to other malocclusions.
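
Step 6 above (entering the measured data as a rotational matrix) can be illustrated with a short sketch that rotates a tooth model by its measured inclination, angulation, and rotation and writes the transformed node coordinates as ANSYS-style node lines. The axis convention, the sample coordinates, and the arch-position offset are assumptions, not values from the paper.

    # Sketch: build a rotation matrix from measured crown inclination (torque),
    # angulation (tip) and rotation, and apply it to a tooth's node coordinates.
    # The axis order/convention and the sample numbers are assumptions.
    import numpy as np

    def rotation_matrix(inclination_deg, angulation_deg, rotation_deg):
        a, b, c = np.radians([inclination_deg, angulation_deg, rotation_deg])
        rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
        ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
        rz = np.array([[np.cos(c), -np.sin(c), 0], [np.sin(c), np.cos(c), 0], [0, 0, 1]])
        return rz @ ry @ rx

    # Nodes of one tooth model (columns x, y, z in mm; values are illustrative).
    nodes = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 8.5], [1.2, 0.4, 4.0]])
    R = rotation_matrix(inclination_deg=7.0, angulation_deg=3.0, rotation_deg=0.0)
    placed = nodes @ R.T + np.array([24.3, 12.1, 0.0])  # rotate, then translate onto the arch

    for i, (x, y, z) in enumerate(placed, start=1):     # ANSYS "N" node-definition lines
        print(f"N,{i},{x:.3f},{y:.3f},{z:.3f}")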


Comparative Studies on Absorbed Dose by Geant4-based Simulation Using DICOM File and Gafchromic EBT2 Film (DICOM 파일을 사용한 Geant4 시뮬레이션과 Gafchromic EBT2 필름에 의한 인체 내 흡수선량 비교 연구)

  • Mo, Eun-Hui;Lee, Sang-Ho;Ahn, Sung-Hwan;Kim, Chong-Yeal
    • Progress in Medical Physics / v.24 no.1 / pp.48-53 / 2013
  • The Monte Carlo method is known as the most accurate method for calculating absorbed dose in the human body, and anthropomorphic phantoms have mainly been used to model the internal organs for such calculations. Recently, however, various efforts have been made in Geant4-based Monte Carlo calculations to extract data on the internal organs directly from CT DICOM files and to convert them into the geometry required for the simulation. Such a capability makes it possible to calculate the internal absorbed dose accurately while reproducing the actual human anatomical structure. Thus, this study calculated the absorbed dose in the human body using Geant4 together with DICOM files, and aimed to confirm the usefulness of this approach by comparing the result with the dose measured with a Gafchromic EBT2 film. The dose calculated by the simulation and the dose measured with the EBT2 film were compared along the beam central axis. The results showed that the difference was 3.75% on average, excluding the build-up region, in which the dose changes rapidly from the skin surface to the depth of maximum dose. In addition, outputting the calculated dose per CT slice and the dose value of each voxel in each slice made it easy to confirm the absorbed dose of a target organ. Thus, the method that outputs dose values by slice and by voxel using CT DICOM data, which are actual image data of the human body, instead of an anthropomorphic phantom enables accurate dose calculations for various regions. Therefore, it is expected to be useful for the dose calculation of radiotherapy planning systems in the future. Moreover, since it is applicable to the several energy ranges in current use, it is expected to be used effectively to check the radiation absorbed dose in the human body.
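
The DICOM-to-geometry step described above can be sketched as follows: a CT slice is read with pydicom, rescaled to Hounsfield units, and mapped to voxel densities. The two-segment HU-to-density mapping is a deliberate simplification standing in for the full material lookup used in Geant4's DICOM machinery, and the file name is illustrative.

    # Sketch: turn a CT DICOM slice into the voxel densities a Geant4 DICOM-based
    # geometry needs. The HU-to-density mapping is a simplification; Geant4's DICOM
    # example uses a full material lookup table.
    import numpy as np
    import pydicom

    ds = pydicom.dcmread("ct_slice_0001.dcm")            # path is illustrative
    hu = ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)

    def hu_to_density(hu_values):
        """Approximate mass density (g/cm^3) from Hounsfield units."""
        rho = np.where(hu_values < 0,
                       1.0 + hu_values / 1000.0,         # air/lung/soft-tissue region
                       1.0 + hu_values * 0.0006)         # soft tissue to bone region
        return np.clip(rho, 0.001, 3.0)

    density = hu_to_density(hu)
    print("slice shape:", density.shape,
          "mean density: %.3f g/cm^3" % float(density.mean()))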

MICROTENSILE BONDING OF RESIN FIBER REINFORCED POST TO RADICULAR DENTIN USING RESIN CEMENT (레진 시멘트를 이용한 레진 파이버 강화 레진포스트의 치근 상아질에 대한 미세인장결합강도)

  • Kim, Jin-Woo;Yu, Mi-Kyung;Lee, Se-Joon;Lee, Kwang-Won
    • Restorative Dentistry and Endodontics / v.28 no.1 / pp.80-88 / 2003
  • Objective: The purposes of this study were to evaluate the microtensile bond strength of a resin fiber-reinforced post to radicular dentin using resin cement according to various dentin surface treatments, and to observe the interface between the post and root dentin under SEM. Materials and Methods: A total of 16 extracted human single-rooted teeth were used. Lingual access was made using a #245 carbide bur in a high-speed handpiece with copious air-water spray. The post space was mechanically enlarged using H-files (up to #60) and Gates Glidden burs (#3), followed by refining of the canal space using the calibrating drill set provided with the ER Dentinpost system (GEBR. BRASSELER GmbH & Co. KG). The 16 teeth were randomly distributed into 4 groups of 4 teeth each. In group 1, the post space was treated with 10% phosphoric acid for 20 s as the root canal surface treatment agent; the canal was then rinsed with saline and dried with paper points. In group 2, the post space was treated with 3% NaOCl for 30 min, then rinsed with saline and dried with paper points. In group 3, the post space was treated with 17% EDTA for 1 min, then rinsed with saline and dried with paper points. In group 4, the post space was treated with 17% EDTA for 1 min, rinsed with saline, and then rinsed with 10 ml of 3% NaOCl for 30 min. After drying with paper points, the post (ER Dentinpost, GEBR. BRASSELER GmbH & Co. KG) was placed in the treated canals using resin cement. Once the canal was filled with resin cement (Super-Bond C&B, Sun Medical Co., Ltd.), a lentulo was inserted to the depth of the canal to ensure proper coating of the root canal wall. After 24 hours, acrylic resin blocks (10 × 10 × 50 mm) were made and serially sectioned vertically into sticks of 1 × 1 mm; twenty sticks were prepared from each group. The tensile bond strength of each stick was then measured with a microtensile tester, and the failure patterns of the specimens at the interface between post and dentin were observed under SEM. Results: 1. The tensile bond strengths (mean ± SD) in descending order were: group 4, 12.52 ± 6.60; group 1, 7.63 ± 5.83; group 2, 4.13 ± 2.31; group 3, 3.31 ± 1.44. 2. The tensile bond strength of group 4 (17% EDTA + 3% NaOCl) was significantly higher than those of groups 1, 2 and 3 (p < 0.05). 3. The tensile bond strength of group 1 (10% phosphoric acid) was significantly higher than that of group 2 (p < 0.05).

Relationship between Low Back Pain and Lumbar Paraspinal Muscles Fat Change in MRI (편측 요통을 호소하는 환자에 있어서 척추 주위 근육의 지방량과 통증과의 관계)

  • Kim, Ha-Neul;Kim, Kyoung-Hun;Kim, Joo-Won;Jin, Eun-Seok;Ha, In-Hyuk;Koh, Dong-Hyun;Hong, Soon-Sung;Kwon, Hyeok-Joon
    • Journal of Korean Medicine Rehabilitation / v.19 no.1 / pp.135-143 / 2009
  • Objectives: Low back pain (LBP) is a common disabling condition in clinical practice, and the loss of working hours due to it is huge. The aim of this study was to determine whether there is an association between fat deposits in the paraspinal muscles, as observed on MRI scans, and unilateral LBP. Methods: 24 patients who visited our hospital with a clinical presentation of unilateral LBP were recruited to the study. Patients were between 20 and 30 years of age and had a history of unilateral LBP within 12 months. After MRI scanning, the images were saved in DICOM file format for the Picture Archiving and Communication System (PACS). The percentage of fat-infiltrated area was measured using a pseudocoloring technique, and the data were analyzed by comparing the fat deposits of the muscles on the symptomatic and asymptomatic sides. A paired t-test was used to assess the difference between the fat-tissue measurements in individual patients. Results: The amount of fat was 7.6 ± 4.51% on the symptomatic side and 6.7 ± 4.29% on the asymptomatic side. The increase in fat content of the paraspinal muscles at the L4-5 disc level was statistically significant (p < 0.05). Also, men were more likely than women to have larger fat deposits on the symptomatic side (men 8.5 ± 5.1%, women 6.5 ± 3.6%). Conclusions: The amount of fat on the symptomatic side was significantly greater than on the asymptomatic side in the paraspinal muscles at the L4-5 disc level, suggesting that fat infiltration in these muscles is associated with LBP. Further studies with a larger sample size will be needed to confirm the relationship between muscle fatty changes and LBP. In addition, the correlation between pain severity and fat infiltration needs to be addressed.
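
The measurement and the paired comparison described above can be sketched as follows; a simple intensity threshold stands in for the pseudocoloring technique, the ROIs are synthetic stand-ins rather than PACS images, and the threshold value is an assumption.

    # Sketch: estimate the fat-infiltrated percentage of a muscle ROI by thresholding
    # (standing in for the pseudocoloring step) and compare the symptomatic and
    # asymptomatic sides with a paired t-test. ROIs and threshold are illustrative.
    import numpy as np
    from scipy import stats

    def fat_percentage(roi: np.ndarray, fat_threshold: float) -> float:
        """Percent of ROI pixels whose signal exceeds the (assumed) fat threshold."""
        return 100.0 * np.count_nonzero(roi > fat_threshold) / roi.size

    rng = np.random.default_rng(0)                      # synthetic ROIs; real data come from PACS
    symptomatic  = [fat_percentage(rng.normal(100, 25, (40, 40)), 140) for _ in range(24)]
    asymptomatic = [fat_percentage(rng.normal( 95, 25, (40, 40)), 140) for _ in range(24)]

    t, p = stats.ttest_rel(symptomatic, asymptomatic)   # paired t-test, one pair per patient
    print(f"t = {t:.2f}, p = {p:.3f}")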

A Complexity Reduction Method of MPEG-4 Audio Lossless Coding Encoder by Using the Joint Coding Based on Cross Correlation of Residual (여기신호의 상관관계 기반 joint coding을 이용한 MPEG-4 audio lossless coding 인코더 복잡도 감소 방법)

  • Cho, Choong-Sang;Kim, Je-Woo;Choi, Byeong-Ho
    • Journal of the Institute of Electronics Engineers of Korea SP / v.47 no.3 / pp.87-95 / 2010
  • Portable multimedia products that can deliver the highest audio quality by using a lossless audio codec have been released, and the international lossless codecs MPEG-4 Audio Lossless Coding (ALS) and MPEG-4 Scalable Lossless Coding (SLS) were standardized by MPEG in 2006. The simple profile of MPEG-4 ALS, which supports up to stereo, was defined by MPEG in 2009. A lossless audio codec should have low complexity for stereo signals to be widely used in portable multimedia products, but previous research on MPEG-4 ALS has focused on improving the compression ratio, reducing complexity in multi-channel coding, and selecting the order of the linear prediction coefficients (LPCs). In this paper, the complexity and compression ratio of the MPEG-4 ALS encoder are analyzed for the simple profile, and a method to reduce the complexity of the MPEG-4 ALS encoder is proposed. Based on an analysis of the encoder's complexity, the complexity of its short-term prediction filter is reduced by using the low-complexity filter that was proposed in previous research to reduce the complexity of the MPEG-4 ALS decoder. We also propose a joint-coding decision method that reduces the complexity while maintaining the compression ratio of the MPEG-4 ALS encoder: whether joint coding is performed is decided based on the relation between the cross-correlation of the residuals and the compression ratio of joint coding. The performance of the MPEG-4 ALS encoder with the proposed method and the low-complexity filter is evaluated using the MPEG-4 ALS conformance test files and ordinary music files. The complexity of the MPEG-4 ALS encoder is reduced by about 24% compared with the MPEG-4 ALS reference encoder, while the compression ratio achieved by the proposed method is comparable to that of the reference encoder.
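
The joint-coding decision described above can be sketched as a correlation test on the two channels' residuals: difference coding is attempted only when the residuals are strongly correlated and a gain is therefore likely. The threshold, the difference-signal form, and the sample residuals below are illustrative; the paper bases its decision on the measured relation between cross-correlation and the joint-coding compression ratio.

    # Sketch of a cross-correlation-based joint-coding decision for stereo residuals.
    # The threshold and the difference-coding form are illustrative assumptions.
    import numpy as np

    def joint_coding_decision(res_left, res_right, threshold=0.6):
        """Return the signals to entropy-code: (left, right) or (left, right - left)."""
        l = res_left - res_left.mean()
        r = res_right - res_right.mean()
        corr = float(np.dot(l, r) / (np.linalg.norm(l) * np.linalg.norm(r) + 1e-12))
        if abs(corr) >= threshold:
            return res_left, res_right - res_left, True   # joint (difference) coding
        return res_left, res_right, False                  # independent coding

    # Illustrative residuals for one frame of a stereo pair.
    rng = np.random.default_rng(1)
    left = rng.integers(-64, 64, 2048)
    right = left + rng.integers(-8, 8, 2048)               # strongly correlated channel
    _, second, joint = joint_coding_decision(left, right)
    print("joint coding used:", joint, "| mean |residual| to code:", np.abs(second).mean())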