• Title/Summary/Keyword: Computer Algorithms


Developing JSequitur to Study the Hierarchical Structure of Biological Sequences in a Grammatical Inference Framework of String Compression Algorithms

  • Galbadrakh, Bulgan;Lee, Kyung-Eun;Park, Hyun-Seok
    • Genomics & Informatics / v.10 no.4 / pp.266-270 / 2012
  • Grammatical inference methods are expected to find grammatical structures hidden in biological sequences, and one hopes that studies of grammar serve as an appropriate tool for theory formation. We have therefore developed JSequitur for automatically generating the grammatical structure of biological sequences in an inference framework of string compression algorithms. Our original motivation was to find grammatical traits of several cancer genes that can be detected by string compression algorithms. Through this research, we have not yet found any meaningful traits unique to the cancer genes, but we observed some interesting traits regarding the relationship among gene length, sequence similarity, the patterns of the generated grammar, and compression rate.
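The digram-replacement idea behind Sequitur-style grammar inference can be sketched as follows. This is an illustrative Re-Pair-style sketch (a close relative of Sequitur), not the JSequitur implementation itself: the most frequent adjacent symbol pair is repeatedly replaced by a fresh non-terminal until no pair repeats.

```python
# Re-Pair-style grammar builder: repeatedly replace the most frequent
# adjacent symbol pair (digram) with a new non-terminal. Illustrative
# sketch only; JSequitur's actual implementation is not reproduced here.
from collections import Counter

def infer_grammar(sequence):
    """Return (final_sequence, rules), where rules maps each
    non-terminal to the symbol pair it replaced."""
    seq = list(sequence)
    rules = {}
    next_id = 0
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:                      # stop when no digram repeats
            break
        nt = f"R{next_id}"
        next_id += 1
        rules[nt] = pair
        out, i = [], 0
        while i < len(seq):                # greedy left-to-right replacement
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(nt)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules
```

For a short DNA-like string such as `"ATATAT"`, the pair `(A, T)` is replaced first, and the resulting run of non-terminals is then compressed again, yielding a small hierarchical grammar.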

Super-resolution image enhancement by Papoulis-Gerchberg method improvement (Papoulis-Gerchberg 방법의 개선에 의한 초해상도 영상 화질 향상)

  • Jang, Hyo-Sik;Kim, Duk-Gyoo;Jung, Yoon-Soo;Lee, Tae-Gyoun;Won, Chul-Ho
    • Journal of Sensor Science and Technology / v.19 no.2 / pp.118-123 / 2010
  • This paper proposes a super-resolution reconstruction algorithm for image enhancement. Super-resolution reconstruction algorithms reconstruct a high-resolution image from multiple low-resolution frames of a scene. Conventional approaches include iterative back-projection (IBP), the robust super-resolution (RS) method, and the standard Papoulis-Gerchberg (PG) method. However, these traditional methods suffer from problems such as rotation artifacts and ringing, so this paper proposes a modified algorithm to address them. Experimental results show that the proposed algorithm mitigates these problems and achieves a higher PSNR than the traditional super-resolution reconstruction algorithms.
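The standard Papoulis-Gerchberg iteration that the paper modifies alternates between two projections: re-imposing the known samples in the signal domain and band-limiting in the Fourier domain. A minimal 1-D sketch (illustrative only; the paper works on 2-D multi-frame images and its modification is not reproduced here):

```python
# Minimal 1-D Papoulis-Gerchberg iteration: alternate between a
# data-consistency projection (restore known samples) and a
# band-limiting projection (zero out high frequencies).
import numpy as np

def papoulis_gerchberg(samples, known, bandwidth, iterations=300):
    """samples: full-length array (arbitrary values at unknown points).
    known: boolean mask of trusted samples.
    bandwidth: number of low-frequency bins kept on each side of DC."""
    x = samples.copy()
    for _ in range(iterations):
        x[known] = samples[known]            # data-consistency projection
        X = np.fft.fft(x)
        X[bandwidth + 1:-bandwidth] = 0.0    # band-limiting projection
        x = np.fft.ifft(X).real
    return x
```

Starting from a band-limited signal with a gap of missing samples, the iteration progressively fills the gap with values consistent with both the known data and the band limit.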

PROJECTION ALGORITHMS WITH CORRECTION

  • Nicola, Aurelian;Popa, Constantin;Rude, Ulrich
    • Journal of applied mathematics & informatics / v.29 no.3_4 / pp.697-712 / 2011
  • We present in this paper two versions of a general correction procedure applied to a classical linear iterative method. This gives us the possibility, under certain assumptions, to obtain an extension of it to inconsistent linear least-squares problems. We prove that some well known extended projection type algorithms from image reconstruction in computerized tomography fit into one or the other of these general versions and are derived as particular cases of them. We also present some numerical experiments on two phantoms widely used in image reconstruction literature. The experiments show the importance of these extension procedures, reflected in the quality of reconstructed images.
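The classical row-projection iteration that these extended tomography algorithms build on is the Kaczmarz method, which cyclically projects the iterate onto the hyperplane defined by each equation. A minimal sketch (not the paper's corrected versions):

```python
# Cyclic Kaczmarz iteration: project the current iterate onto the
# hyperplane a_i . x = b_i for each row i in turn. This is the classical
# building block of ART-type reconstruction algorithms.
import numpy as np

def kaczmarz(A, b, sweeps=100):
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += (b[i] - a @ x) / (a @ a) * a  # orthogonal projection step
    return x
```

For a consistent square system the iterate converges to the exact solution; the paper's correction procedures extend such schemes to inconsistent least-squares problems.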

Combination of Evolution Algorithms and Fuzzy Controller for Nonlinear Control System (비선형 제어 시스템을 위한 진화 알고리즘과 퍼지 제어기와의 결합)

  • 이말례;장재열
    • Journal of the Korea Society of Computer and Information / v.1 no.1 / pp.159-170 / 1996
  • In this paper, we propose a method for generating optimal rules for nonlinear control systems using evolution algorithms and a fuzzy controller. With the aid of evolution algorithms, the optimal rules of a fuzzy logic system can be designed automatically, without a human expert's prior experience and knowledge, enabling intelligent control. The approach presented here generates rules by self-tuning the parameters of the membership functions and searches for the optimal control rules based on a fitness value, which is the defined performance criterion. Computer simulations demonstrate the usefulness of the proposed method for nonlinear systems.
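The membership-function parameters that such a method self-tunes can be made concrete with a minimal fuzzy controller. This is an illustrative sketch, not the paper's controller: the triangle corners and rule outputs in `params` are exactly the kind of values an evolution algorithm would adjust against a fitness score.

```python
# Minimal one-input fuzzy controller with triangular membership
# functions and weighted-average defuzzification. The entries of
# `params` are the tunable quantities an evolutionary algorithm
# could optimize. (Illustrative sketch, not the paper's design.)
def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error, params):
    """Three rules (negative/zero/positive error); output is the
    membership-weighted average of the rule outputs."""
    (neg, zero, pos), outputs = params
    weights = [tri(error, *neg), tri(error, *zero), tri(error, *pos)]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * u for w, u in zip(weights, outputs)) / total
```

With symmetric triangles and outputs (1, 0, -1), the controller pushes against the sign of the error, the usual proportional-like fuzzy behaviour.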


URL Filtering by Using Machine Learning

  • Saqib, Malik Najmus
    • International Journal of Computer Science & Network Security / v.22 no.8 / pp.275-279 / 2022
  • The growth of technology has made many things easy for humans, from small everyday tasks to more complex ones. This growth also brings illegal activities performed using technology, ranging from displaying annoying messages to large frauds. The easiest way for an attacker to carry out such activities is to convince a user to click on a malicious link, so classifying URLs as malicious or benign has been a great concern for the past decade. Blacklists were used initially for this purpose and are still in use today; they are efficient but cannot be updated automatically, so this approach is being replaced by URL classification based on machine learning. In this paper we use four machine learning classification algorithms to classify URLs as malicious or benign: support vector machine, random forest, k-nearest neighbor, and decision tree. The dataset used in this research has 36,694 instances. Precision, accuracy, and recall values are compared for the dataset with and without preprocessing.
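The k-nearest-neighbor approach among the four can be sketched with simple lexical URL features. Everything here is hypothetical for illustration: the feature set, the toy training URLs, and the labels are not taken from the paper's 36,694-instance dataset.

```python
# Toy k-nearest-neighbour URL classifier. URLs are mapped to simple
# lexical features (length, digit count, dots, suspicious characters)
# and labelled by majority vote of the k closest training points.
# Features and data are illustrative assumptions, not the paper's.
from collections import Counter
import math

def features(url):
    return [len(url),
            sum(ch.isdigit() for ch in url),
            url.count("."),
            url.count("-") + url.count("@")]

def knn_classify(train, url, k=3):
    """train: list of (url, label) pairs. Returns the majority label
    among the k training URLs closest in feature space."""
    x = features(url)
    dists = sorted((math.dist(x, features(u)), label) for u, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

In practice one would use a vetted labelled corpus and richer features (host, path, and token statistics); the sketch only shows the classification mechanics.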

Comparison of similarity measures and community detection algorithms using collaboration filtering (협업 필터링을 사용한 유사도 기법 및 커뮤니티 검출 알고리즘 비교)

  • Ugli, Sadriddinov Ilkhomjon Rovshan;Hong, Minpyo;Park, Doo-Soon
    • Proceedings of the Korea Information Processing Society Conference / 2022.05a / pp.366-369 / 2022
  • The glut of information has aggravated the process of data analysis and other procedures, including data mining, and many algorithms have been devised in big data and data mining to solve this intricate problem. In this paper, we compare several similarity measures and community detection algorithms in collaborative filtering for movie recommendation systems. The MovieLens data set was used for an empirical experiment. We applied three different similarity measures: cosine, Euclidean, and Pearson. Moreover, betweenness and eigenvector centrality were used to detect communities in the network. As a result, we elucidate which algorithm is more suitable than its counterparts in terms of recommendation accuracy.
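The three similarity measures compared in the paper can be stated in minimal form. One assumption here: Euclidean distance is turned into a similarity via 1/(1 + d) so that, like the other two, larger values mean more similar (the paper's exact normalization is not given).

```python
# Cosine, Euclidean-based, and Pearson similarity between two
# equal-length rating vectors (pure-Python sketch).
import math

def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def euclidean_sim(a, b):
    # assumed normalization: distance 0 -> similarity 1
    return 1.0 / (1.0 + math.dist(a, b))

def pearson_sim(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den
```

Note the characteristic differences: cosine ignores vector magnitude, Pearson additionally removes each user's mean rating (correcting for harsh vs. generous raters), and the Euclidean form is sensitive to both.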

Discrete Optimum Design of Space Truss Structures Using Genetic Algorithms

  • Park, Choon Wook;Kang, Moon Myung
    • Architectural research / v.4 no.1 / pp.33-38 / 2002
  • The objective of this study is the development of a discrete optimum design algorithm based on genetic algorithms; the developed algorithm was implemented in a computer program. For the optimum design, the objective function is the weight of the space truss structure and the constraints are stresses and displacements. The genetic algorithm consists of a genetic process and an evolutionary process. The genetic process selects the next design points based on the survivability of the current design points, and the evolutionary process evaluates the survivability of the design points selected by the genetic process. The efficiency and validity of the developed discrete optimum design algorithm were verified by applying it to optimum design examples.
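The selection/crossover/mutation loop over a discrete design space can be sketched as below. This is not the paper's truss code: the section catalogue, member forces, allowable stress, and the penalty used to enforce the constraints are all hypothetical, standing in for the real stress and displacement checks.

```python
# Minimal genetic algorithm over a discrete design space. Each gene
# picks one section area from a catalogue; fitness is total weight
# plus a large penalty for members whose capacity (area * assumed
# allowable stress) is below an assumed member force.
import random

AREAS = [1.0, 2.0, 3.0, 4.0]          # assumed discrete section catalogue
FORCES = [2.5, 1.2, 3.6, 0.8, 2.0]    # assumed member forces
ALLOWABLE = 1.0                        # assumed allowable stress

def weight(design):
    cost = sum(AREAS[g] for g in design)
    penalty = sum(100.0 for g, f in zip(design, FORCES)
                  if AREAS[g] * ALLOWABLE < f)   # constraint violations
    return cost + penalty

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(AREAS)) for _ in FORCES]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=weight)
        survivors = pop[:pop_size // 2]            # selection by fitness
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(FORCES))    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                 # mutation
                child[rng.randrange(len(FORCES))] = rng.randrange(len(AREAS))
            children.append(child)
        pop = survivors + children
    return min(pop, key=weight)
```

Because the penalty dominates the weight, selection first drives the population toward feasible designs and then minimizes weight within the feasible region, mirroring the genetic/evolutionary split described in the abstract.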

Performance Comparison of Logistic Regression Algorithms on RHadoop

  • Jung, Byung Ho;Lim, Dong Hoon
    • Journal of the Korea Society of Computer and Information / v.22 no.4 / pp.9-16 / 2017
  • Machine learning has found widespread implementation and application in many domains of our lives. Logistic regression is a type of classification in machine learning and is used widely in many fields, including medicine, economics, marketing, and the social sciences. In this paper, we present MapReduce implementations of three existing algorithms for logistic regression, namely the gradient descent, cost minimization, and Newton-Raphson algorithms, on RHadoop, which integrates the R and Hadoop environments and is applicable to large-scale data. We compare the performance of these algorithms in estimating logistic regression coefficients on real and simulated data sets, and we also compare our RHadoop and RHIPE platforms. The experiments showed that the Newton-Raphson algorithm outperformed the gradient descent and cost minimization algorithms on all data tested, and that RHadoop was better than RHIPE on real data, while the opposite held for simulated data.
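Two of the three estimation schemes compared can be shown in minimal single-machine form (no MapReduce), which also suggests why Newton-Raphson tends to win: it converges in far fewer iterations. A sketch on toy data, not the paper's RHadoop code:

```python
# Batch gradient descent vs. Newton-Raphson for logistic-regression
# coefficients (single-machine sketch; the paper distributes these
# computations with MapReduce on RHadoop).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_gd(X, y, lr=0.1, steps=5000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)   # mean gradient step
    return w

def fit_newton(X, y, steps=25):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y)
        H = X.T @ (X * (p * (1 - p))[:, None])          # Hessian
        w -= np.linalg.solve(H, grad)                   # Newton update
    return w
```

On non-separable data both reach the same maximum-likelihood coefficients; Newton-Raphson typically does so in a handful of iterations versus thousands of gradient steps, at the cost of forming and solving with the Hessian.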

FUZZY RULE MODIFICATION BY GENETIC ALGORITHMS

  • Park, Seihwan;Lee, Hyung-Kwang
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 1998.06a / pp.646-651 / 1998
  • Fuzzy control has been used successfully in many practical applications. In traditional methods, the experience and control knowledge of human experts are needed to design fuzzy controllers, which takes considerable time and cost. In this paper, an automatic design method for fuzzy controllers using genetic algorithms is proposed, featuring an effective encoding scheme and new genetic operators. The maximum number of linguistic terms is restricted to reduce the number of combinatorial fuzzy rules in the search space, and the proposed genetic operators maintain the correspondence between membership functions and control rules. The proposed method is applied to a cart-centering problem, and the experimental results compare favorably with other design methods using genetic algorithms.


Analytical Algorithms for Ergonomic Seated Posture When Working with Notebook Computers

  • Jalil, Sakib;Nanthavanij, Suebsak
    • Industrial Engineering and Management Systems / v.6 no.2 / pp.146-157 / 2007
  • This paper discusses two algorithms for recommending notebook computer (NBC) and workstation adjustments so that the user can assume an ergonomic seated posture during NBC operation. Required input data are the user's anthropometric data and physical dimensions of the NBC and the workstation. The first algorithm is based on an assumption that there are no workstation constraints while the second algorithm considers the actual seat height and work surface height. The results from the algorithms include recommendations for adjusting the NBC (tilt angle of the NBC base unit, angle between the base and screen units, and base support height) and the workstation (heights of seat support and footrest, and distance between the body and the NBC).