WSEAS Transactions on Computers
Contents:
Print ISSN: 1109-2750
E-ISSN: 2224-2872
Volume 12, 2013
Issue 1, Volume 12, January 2013
Title of the Paper: The Modified Mobius Function
Authors: Peter G. Brown
Keywords: Mobius function, arithmetic functions, Riemann-Zeta function
Title of the Paper: The Improved Hierarchical Clustering Algorithm by a P System with Active Membranes
Authors: Yuzhen Zhao, Xiyu Liu, Xiufeng Li
Abstract: In this paper an improved hierarchical clustering algorithm based on a P system with active membranes is proposed, which provides new ideas and methods for cluster analysis. The membrane system has great parallelism, which can reduce the computational time complexity and makes it suitable for the clustering problem. Firstly, an improved hierarchical algorithm is presented which introduces the K-medoids algorithm: the distance between clusters is defined as the distance between the medoids of these clusters instead of the mean distance between them. Secondly, a P system with all the rules needed to realize the above hierarchical algorithm is constructed. The specific P system is designed for the dissimilarity matrix associated with n objects. The computation of the system obtains one possible classification in a non-deterministic way. An example test shows that the proposed algorithm is appropriate for cluster analysis. This is a new attempt in applications of membrane systems.
Keywords: Clustering algorithm; hierarchical clustering; K-medoids algorithm; Membrane computing; P System; Membrane system
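The abstract above defines the inter-cluster distance as the distance between cluster medoids rather than the mean pairwise distance. A minimal sketch of that linkage, assuming only a plain dissimilarity matrix D (the P-system realization itself is not reproduced; the function names are illustrative):

```python
import numpy as np

def medoid(indices, D):
    """Index (within `indices`) whose total dissimilarity to the
    other members of the cluster is smallest."""
    sub = D[np.ix_(indices, indices)]
    return indices[int(np.argmin(sub.sum(axis=1)))]

def medoid_linkage(c1, c2, D):
    """Inter-cluster distance defined as the dissimilarity between the
    two cluster medoids (instead of the mean pairwise distance)."""
    return D[medoid(c1, D), medoid(c2, D)]

# toy dissimilarity matrix for n = 4 objects
D = np.array([[0., 1., 4., 5.],
              [1., 0., 3., 6.],
              [4., 3., 0., 2.],
              [5., 6., 2., 0.]])
print(medoid_linkage([0, 1], [2, 3], D))   # dissimilarity between the two medoids: 4.0
```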
Title of the Paper: Regularized Least Squares Piecewise Multi-classification Machine
Authors: Olutayo O. Oladunni
Abstract: This paper presents a Tikhonov regularization based piecewise classification model for multi-category discrimination of sets or objects. The proposed model includes both a linear classification formulation and a nonlinear kernel classification formulation. Advantages of the regularized multi-classification formulations include the ability to express a multi-class problem as a single unconstrained optimization problem, the ability to derive explicit expressions for the classification weights of the classifiers, and computational tractability in providing the optimal classification weights for multi-categorical separation. Computational results are also provided to validate the functionality of the classification models using three data sets (GPA, IRIS, and WINE data).
Keywords: Piecewise, multi-class, multi-category discrimination, least squares, linear classifiers, nonlinear classifiers, linear system of equations
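As a point of reference for the regularized least-squares formulation described above, the following sketch derives closed-form classification weights from the normal equations W = (X^T X + lambda*I)^{-1} X^T Y with one-hot targets. This is a generic one-vs-all ridge classifier standing in for the paper's piecewise formulation; all names and the toy data are illustrative:

```python
import numpy as np

def ridge_multiclass_weights(X, y, lam=1e-2):
    """Closed-form Tikhonov-regularized least-squares weights for
    multi-class separation with one-hot targets:
        W = (X^T X + lam*I)^{-1} X^T Y
    Rows of X are samples; a bias column is appended."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])        # append bias
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)   # one-hot targets
    d = Xb.shape[1]
    W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(d), Xb.T @ Y)
    return W, classes

def predict(X, W, classes):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return classes[np.argmax(Xb @ W, axis=1)]

# toy usage with three linearly separable classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, (20, 2)) for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 20)
W, classes = ridge_multiclass_weights(X, y)
print((predict(X, W, classes) == y).mean())   # training accuracy on separable toy data
```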
Issue 2, Volume 12, February 2013
Title of the Paper: Maximal Constraint Satisfaction Problems Solved by Continuous Hopfield Networks
Authors: Mohamed Ettaouil, Chakir Loqman, Khalid Haddouch, Youssef Hami
Abstract: In this paper, we propose a new approach to solve maximal constraint satisfaction problems (Max-CSP) using the continuous Hopfield network. This approach is divided into two steps: the first step involves modeling the maximal constraint satisfaction problem as a 0-1 quadratic program subject to linear constraints (QP); the second step applies the continuous Hopfield network (CHN) to solve the QP problem. The generalized energy function associated with the CHN and an appropriate parameter-setting procedure for Max-CSP problems are given in detail. Finally, the proposed algorithm and some computational experiments on the Max-CSP are presented.
Keywords: Maximal constraint satisfaction problems, quadratic 0-1 programming, continuous Hopfield network, energy function
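A minimal sketch of the second step described above: continuous Hopfield network dynamics minimizing a generalized quadratic energy E(x) = -1/2 x^T T x - i_b^T x. The Max-CSP-specific construction of T and i_b and the paper's parameter-setting procedure are not reproduced; the small matrices below are purely illustrative:

```python
import numpy as np

def chn_minimize(T, i_b, steps=2000, dt=0.01, slope=5.0):
    """Continuous Hopfield network dynamics for the generalized energy
        E(x) = -1/2 x^T T x - i_b^T x,   x in (0,1)^n,
    integrated with explicit Euler; x = sigmoid(slope * u)."""
    n = len(i_b)
    u = 0.01 * np.random.default_rng(1).standard_normal(n)  # small start to break symmetry
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-slope * u))
        u += dt * (T @ x + i_b - u)       # du/dt = -u + T x + i_b
    return 1.0 / (1.0 + np.exp(-slope * u))

# illustrative 0-1 quadratic program (not the paper's Max-CSP mapping):
# maximize x0 + x1 - 2*x0*x1, i.e. the energy penalizes choosing both variables
T = np.array([[0.0, -2.0],
              [-2.0, 0.0]])
i_b = np.array([1.0, 1.0])
print(np.round(chn_minimize(T, i_b)))   # converges near a 0/1 vertex: (1,0) or (0,1)
```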
Title of the Paper: Algorithmic and Logical Thinking Development: Base of Programming Skills
Authors: Eva Milková, Anna Hulková
Abstract: This paper is based on rich experience gained in the area of computer science education and can serve as inspirational material for all educators developing students' algorithmic thinking and programming skills. The foundation a developer gains at the beginning of his/her career plays a crucial role. An essential part of studies at faculties preparing students in the area of computer science is the development of the student's ability to think algorithmically. Students must be able to create various algorithms solving given problems, starting with easy ones and consecutively increasing their algorithmic knowledge during their studies until they reach the level where they deeply understand much more complex algorithms. The aim of this paper is to introduce our approach, which has proven successful in optimizing the teaching and learning of a subject developing the algorithmic thinking of beginners. This is followed by a discussion of the benefits of puzzles and logical games, solved within subjects dealing with graph algorithms, which enable further development of students' algorithmic thinking as well as logical thinking and imagination, i.e. the skills needed for a deeper understanding of more complex algorithms.
Keywords: Computer science education, logical games, multimedia applications, puzzles
Title of the Paper: The Improved Hierarchical Clustering Algorithm by a P System with Active Membranes
Authors: Yuzhen Zhao, Xiyu Liu, Jianhua Qu
Abstract: In this paper an improved hierarchical clustering algorithm based on a P system with active membranes is proposed, which provides new ideas and methods for cluster analysis. The membrane system has great parallelism, which can reduce the computational time complexity and makes it suitable for the clustering problem. Firstly, an improved hierarchical algorithm is presented which introduces the K-medoids algorithm: the distance between clusters is defined as the distance between the medoids of these clusters instead of the mean distance between them. Secondly, a P system with all the rules needed to realize the above hierarchical algorithm is constructed. The specific P system is designed for the dissimilarity matrix associated with n objects. The computation of the system obtains one possible classification in a non-deterministic way. An example test shows that the proposed algorithm is appropriate for cluster analysis. This is a new attempt in applications of membrane systems.
Keywords: Clustering algorithm, hierarchical clustering, K-medoids algorithm, Membrane computing, P System, Membrane system
Title of the Paper: Curve Representation for Outlines of Planar Images Using Multilevel Coordinate Search
Authors: Muhammad Sarfraz, Naelah Al-Dabbous
Abstract: This paper proposes an optimization technique for the outline capture of planar images. This is inspired by a global optimization algorithm based on multilevel coordinate search (MCS). By starting a search from certain good points (initially detected corner points), an improved convergence result is obtained. The overall technique has various phases including extracting outlines of images, detecting corner points from the detected outline, curve fitting, and addition of extra knot points if needed. The idea of multilevel coordinate search has been used to optimize the shape parameters in the description of the generalized cubic spline introduced. The spline method ultimately produces optimal results for the approximate vectorization of the digital contour obtained from the generic shapes. It provides an optimal fit as far as curve fitting is concerned. The proposed algorithm is fully automatic and requires no human intervention. Implementation details are sufficiently discussed. Some numerical and pictorial results are also demonstrated to support the proposed technique.
Keywords: Optimization, multilevel coordinate search, Generic shapes, curve fitting, cubic spline
Title of the Paper: Blocking and Non-Blocking Concurrent Hash Tables in Multi-Core Systems
Authors: Akos Dudas, Sandor Juhasz
Abstract: Widespread use of multi-core systems demands highly parallel applications and algorithms in everyday computing. Parallel data structures, which are basic building blocks of concurrent algorithms, are hard to design in a way that keeps them both fast and simple. By using mutual exclusion they can be implemented with little effort, but blocking synchronization has many unfavorable properties, such as delays, performance bottlenecks and being prone to programming errors. Non-blocking synchronization, on the other hand, promises solutions to the aforementioned drawbacks, but often requires a complete redesign of the algorithm and the underlying data structure to accommodate the needs of atomic instructions. Implementations of parallel data structures satisfy different progress conditions: lock-based algorithms can be deadlock-free or starvation-free, while non-blocking solutions can be lock-free or wait-free. These properties guarantee different levels of progress, either system-wide or for all threads. We present several parallel hash table implementations, satisfying different types of progress conditions and having various levels of implementation complexity, and discuss their behavior. The performance of the different blocking and non-blocking implementations is evaluated on a multi-core machine.
Keywords: blocking synchronization, non-blocking synchronization, hash table, mutual exclusion, lock-free, wait-free
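To make the blocking side of the discussion concrete, here is a lock-striped hash table sketch using plain mutual exclusion. Python is used only to illustrate the locking pattern (its global interpreter lock makes it unsuitable for real performance comparisons), and lock-free/wait-free variants, which need hardware atomic primitives, are not shown:

```python
import threading

class StripedHashTable:
    """A blocking hash table: the bucket array is protected by a fixed number
    of locks ("lock striping"), so threads contend only when they hash to the
    same stripe. Illustrates mutual exclusion, not lock-freedom."""
    def __init__(self, n_buckets=64, n_locks=8):
        # n_locks divides n_buckets, so each bucket always maps to the same lock
        self.buckets = [[] for _ in range(n_buckets)]
        self.locks = [threading.Lock() for _ in range(n_locks)]

    def _lock_for(self, key):
        return self.locks[hash(key) % len(self.locks)]

    def put(self, key, value):
        bucket = self.buckets[hash(key) % len(self.buckets)]
        with self._lock_for(key):                 # blocking synchronization
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

    def get(self, key, default=None):
        bucket = self.buckets[hash(key) % len(self.buckets)]
        with self._lock_for(key):
            for k, v in bucket:
                if k == key:
                    return v
            return default

table = StripedHashTable()
threads = [threading.Thread(target=lambda i=i: table.put(i, i * i)) for i in range(100)]
for t in threads: t.start()
for t in threads: t.join()
print(table.get(7))   # 49
```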
Issue 3, Volume 12, March 2013
Title of the Paper: Data Mining by Symbolic Fuzzy Classifiers and Genetic Programming–State of the Art and Prospective Approaches
Authors: Suhail Owais, Pavel Krömer, Jan Platoš, Václav Snášel, Ivan Zelinka
Abstract: There are various techniques for data mining and data analysis. Data mining is very important in information retrieval, especially when the data volumes are very large. Among these techniques, hybrid approaches combining two or more algorithms gain importance as the complexity and dimensionality of real-world data sets grow. In this paper, we present an application of an evolutionary-fuzzy classification technique for data mining, outline the state of the art of related methods and draw future directions of the research. In the presented application, genetic programming is deployed to evolve a fuzzy classifier, and an example of a real-world application is presented.
Keywords: Data mining, fuzzy classifiers, genetic programming, application
Title of the Paper: Semantic Similarity Using First and Second Order Co-occurrence Matrices and Information Content Vectors
Authors: Ahmad Pesaranghader, Saravanan Muthaiyah
Abstract: The massiveness of data on the Web demands automated knowledge engineering techniques that enable machines to integrate all available data into a unified understanding of discrete data sources. This research deals with measures of semantic similarity that address the foregoing issue. These measures are widely used in ontology alignment, information retrieval and natural language processing. The study also introduces new normalized functions based on first- and second-order context vectors and information content vectors of concepts in a corpus. By applying these measures to the Unified Medical Language System (UMLS), using WordNet as a general taxonomy and MEDLINE abstracts as the corpus from which to extract information content and information content vectors, the functions are evaluated against a test bed of 301 biomedical concept pairs scored by medical residents. The paper shows that the newly proposed semantic similarity measures outperform previous functions.
Keywords: Semantic Similarity, Computational Linguistic, UMLS, WordNet
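A small illustration of the kind of context vectors mentioned above: first-order co-occurrence vectors, second-order vectors built from them, and their cosine similarity. This is a generic construction on a toy corpus, not the paper's normalized functions or its MEDLINE/UMLS setup:

```python
import numpy as np

def cooccurrence_vectors(corpus, window=2):
    """First-order co-occurrence vectors: for each word, counts of the
    words appearing within +/- `window` positions in the corpus."""
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    vecs = {w: np.zeros(len(vocab)) for w in vocab}
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    vecs[w][idx[sent[j]]] += 1
    return vecs, vocab

def second_order_vector(word, vecs, vocab):
    """Second-order context vector: sum of the first-order vectors of the
    words co-occurring with `word`, weighted by co-occurrence counts."""
    first = vecs[word]
    return sum(first[i] * vecs[w] for i, w in enumerate(vocab))

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

# toy corpus standing in for MEDLINE abstracts
corpus = [["renal", "failure", "treated", "with", "dialysis"],
          ["kidney", "failure", "treated", "with", "dialysis"],
          ["fracture", "treated", "with", "a", "cast"]]
vecs, vocab = cooccurrence_vectors(corpus)
u = second_order_vector("renal", vecs, vocab)
v = second_order_vector("kidney", vecs, vocab)
print(round(cosine(u, v), 3))   # high similarity: shared second-order contexts
```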
Title of the Paper: A Novel Watermarking of Images Based on Wavelet Based Contourlet Transform Energized by Biometrics
Authors: P. Tamije Selvy, V. Palanisamy, S. Elakkiya
Abstract: The progress and substantial proliferation of web technologies have created an environment in which the misuse of digital media has become very easy. This phenomenon has led to an increasing need for standard solutions to prevent these issues. One of the technical solutions is to provide law enforcement and copyright protection for digital media, which can be achieved in practice by digital watermarking. The proposed method contains the following phases: (i) pre-processing of the biometric image, (ii) obtaining keys from the biometrics of the owner/user, with Speeded-Up Robust Features (SURF) used as the scale- and rotation-invariant detector for biometric images, (iii) applying the Wavelet-Based Contourlet Transform (WBCT) to the host image, (iv) applying Singular Value Decomposition (SVD) to the watermark image, (v) embedding the watermark image into the host image, and (vi) watermark extraction and attack analysis. In the implemented system, SURF provides a 98% improvement in execution time over SIFT. The comparative analysis confirms the efficiency and robustness of the proposed system.
Keywords: Digital Watermarking, Singular Value Decomposition (SVD), Speeded-Up Robust Features (SURF), Pre-processing, robustness
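The SVD steps (iv)-(v) above can be sketched as modifying the host's singular values with those of the watermark. Only this step is shown; the SURF key generation and WBCT stages are omitted, and alpha is an illustrative strength factor:

```python
import numpy as np

def svd_embed(host, watermark, alpha=0.05):
    """Embed a watermark into the singular values of a host block:
        S_w = S_host + alpha * S_watermark,  host_w = U * diag(S_w) * V^T.
    Only the SVD step of the pipeline is sketched here."""
    U, S, Vt = np.linalg.svd(host, full_matrices=False)
    _, Sw, _ = np.linalg.svd(watermark, full_matrices=False)
    S_marked = S + alpha * Sw
    return U @ np.diag(S_marked) @ Vt, S          # keep S for extraction

def svd_extract(marked, S_original, alpha=0.05):
    """Recover the watermark's singular values from a (possibly attacked) image."""
    _, S_marked, _ = np.linalg.svd(marked, full_matrices=False)
    return (S_marked - S_original) / alpha

host = np.random.default_rng(0).random((64, 64))
wm = np.random.default_rng(1).random((64, 64))
marked, S_host = svd_embed(host, wm)
print(np.allclose(svd_extract(marked, S_host), np.linalg.svd(wm)[1], atol=1e-8))
```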
Title of the Paper: Indoor Mobile Target Localization Based on Path-Planning and Prediction in Wireless Sensor Networks
Authors: Peng Gao, Wei-Ren Shi, Hong-Bing Li, Wei Zhou
Abstract: Node position information is one of the important issues in many wireless sensor network applications. In this paper, an indoor mobile target localization algorithm based on path-planning and prediction (PPIMT) is proposed. We first establish the path-planning model to constrain the movement trajectory of the mobile target in the indoor environment according to the indoor architectural pattern. Then, a localization result is obtained using the maximum likelihood estimation (MLE) algorithm. After that, based on the path-planning model and several previous localization results, the most likely position of the target in the next time interval is predicted with the proposed prediction approach. Finally, the MLE result and the prediction result are weighted to obtain the final position. The simulation results demonstrate the effectiveness of the proposed algorithm.
Keywords: Wireless sensor networks, Localization, Path-planning, Prediction
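A toy sketch of the final fusion step described above, assuming a constant-velocity extrapolation stands in for the paper's path-planning-based prediction and w is an illustrative weight:

```python
import numpy as np

def predict_next(history):
    """Toy prediction: constant-velocity extrapolation from the last two
    localization results (stands in for the paper's path-planning model)."""
    p_prev, p_last = np.asarray(history[-2]), np.asarray(history[-1])
    return p_last + (p_last - p_prev)

def fuse(p_mle, p_pred, w=0.6):
    """Final position = weighted combination of the MLE estimate and the
    predicted position, as described in the abstract."""
    return w * np.asarray(p_mle) + (1.0 - w) * np.asarray(p_pred)

history = [(0.0, 0.0), (1.0, 0.1)]          # previous localization results
p_mle = (2.3, 0.4)                          # noisy MLE result at this step
p_pred = predict_next(history)              # (2.0, 0.2)
print(fuse(p_mle, p_pred))                  # e.g. [2.18 0.32]
```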
Title of the Paper: Hybridizing Genetic Algorithms and Particle Swarm Optimization Transplanted into a Hyper-Heuristic System for Solving University Course Timetabling Problem
Authors: Morteza Alinia Ahandani, Mohammad Taghi Vakil Baghmisheh
Abstract: In this paper, we use genetic algorithms (GAs), particle swarm optimization (PSO) and hybrid versions of them to solve the university course timetabling problem (UCTP). A new crossover method called 2-staged n-point crossover, combining the classic n-point crossover method with graph colouring heuristics, is introduced, which aims to generate conflict-free offspring. The hybrid algorithms are generated by adding a local search (LS), based on the hill climbing (HC) method, to three global search algorithms, i.e. the GA, the PSO and a combination of them called GAPSO. The proposed algorithms, acting as hyper-heuristic systems, manage a set of graph colouring heuristics as low-level heuristics in a hyper-heuristic strategy. The proposed algorithms are examined on 11 well-known benchmark problems. Experimental results demonstrate that the GA outperforms the PSO and GAPSO algorithms, but the hybrid GAPSO algorithm performs better than the hybrid GA and hybrid PSO. Also, all hybrid algorithms perform better than their non-hybrid counterparts. Although the GA has been widely applied to the UCTP, to the best of our knowledge the GA results obtained in this paper are the first reported on these data sets that are competitive with the results of other approaches. In a later part of the comparative experiments, a comparison of our proposed algorithms with 14 other approaches reported in the literature confirms that the hybrid GAPSO, considered as a hybrid hyper-heuristic, is one of the best strategies for hyper-heuristic systems on the UCTP proposed so far. The results of the hybrid GAPSO are also fully comparable with those of other hybrid algorithms proposed in the literature.
Keywords: Crossover, Genetic algorithm, Hybrid algorithm, Particle swarm optimization, University course timetabling, Hyper-Heuristic
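The classic n-point crossover that the 2-staged operator above builds on can be sketched as follows; the second stage (repairing conflicts with graph-colouring heuristics) is not reproduced:

```python
import random

def n_point_crossover(parent1, parent2, n=2, rng=random):
    """Classic n-point crossover: cut both parents at the same n random
    positions and alternate segments between them. (The paper's second
    stage - repairing conflicts with graph-colouring heuristics - is not
    shown here.)"""
    assert len(parent1) == len(parent2)
    points = sorted(rng.sample(range(1, len(parent1)), n))
    child1, child2 = [], []
    swap = False
    last = 0
    for cut in points + [len(parent1)]:
        seg1, seg2 = parent1[last:cut], parent2[last:cut]
        child1 += seg2 if swap else seg1
        child2 += seg1 if swap else seg2
        swap = not swap
        last = cut
    return child1, child2

random.seed(42)
p1 = list("AAAAAAAA")
p2 = list("BBBBBBBB")
c1, c2 = n_point_crossover(p1, p2, n=3)
print("".join(c1), "".join(c2))   # complementary children, e.g. AABBBAAA / BBAAABBB
```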
Issue 4, Volume 12, April 2013
Title of the Paper: An Improved MRI Brain Image Segmentation to Detect Cerebrospinal Fluid Level Using Anisotropic Diffused Fuzzy C Means
Authors: P. Tamije Selvy, V. Palanisamy, M. Sri Radhai
Abstract: Cerebrospinal Fluid (CSF) is a clear colorless fluid produced in the brain. Changes in CSF protein levels form abnormal brain deposits strongly linked to a variety of neurological diseases. Magnetic Resonance Imaging (MRI) of the brain is segmented using Fuzzy C Means (FCM) to detect the CSF level in the brain. However, FCM is not suitable for segmenting noisy images. This paper employs Total Variation (TV) regularization to solve this problem of FCM: TV is combined with FCM to eliminate noise, but the method suffers from a staircasing effect and longer reconstruction times. The proposed hybrid algorithm combines Anisotropic Diffusion (AD) with the TVFCM method, overcoming the problems of traditional TVFCM. The AD method first diffuses the image, which is then convolved with a convolution filter and subjected to TVFCM segmentation. The proposed method finds the CSF level present in MRI brain images with 98% accuracy, 92% sensitivity and 97% specificity. Compared to traditional FCM and TVFCM, ADTVFCM yields better segmentation accuracy.
Keywords: Cerebrospinal Fluid, Segmentation, Magnetic Resonance Image, Fuzzy C Means, Total Variation Regularizer, Anisotropic Diffusion
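For reference, the standard fuzzy c-means iteration underlying FCM, TVFCM and ADTVFCM alternates the membership and centroid updates shown below. The anisotropic-diffusion pre-filtering and the total-variation term are not included; the 1-D toy data merely stands in for voxel intensities:

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100):
    """Standard fuzzy c-means: alternate the membership update
        u_ik = 1 / sum_j (||x_k - v_i|| / ||x_k - v_j||)^(2/(m-1))
    and the centroid update v_i = sum_k u_ik^m x_k / sum_k u_ik^m.
    (The AD pre-filtering and TV term of ADTVFCM are not included.)"""
    V = np.quantile(X, np.linspace(0.1, 0.9, c), axis=0)   # spread initial centroids
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]
    return U, V

# toy 1-D "intensity" data standing in for MRI voxel intensities
X = np.concatenate([np.random.default_rng(1).normal(mu, 0.05, 100)
                    for mu in (0.1, 0.5, 0.9)])[:, None]
U, V = fuzzy_c_means(X, c=3)
print(np.sort(V.ravel()))   # centroids near 0.1, 0.5, 0.9
```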
Title of the Paper: A New Algorithm for Optimization of the Kohonen Network Architectures Using the Continuous Hopfield Networks
Authors: M. Ettaouil, M. Lazaar, K. Elmoutaouakil, K. Haddouch
Abstract: The choice of the Kohonen neural network architecture has a great impact on the convergence of the trained learning methods. In this paper, we generalize the learning method of the Kohonen network. This method optimizes the Kohonen network architecture and conserves the neighborhood notion defined on the observation set. To this end, we model the problem of Kohonen network architecture optimization in terms of a mixed-integer nonlinear problem with quadratic constraints. In order to solve the proposed model, we use the nuées dynamiques (dynamic clusters) method. In this context, the continuous Hopfield network is used in the assignment phase. To show the advantages of our method, some experimental results are presented.
Keywords: Kohonen networks, Continuous Hopfield Networks, mixed-integer nonlinear programming, Clustering
Title of the Paper: A Comparative Study of Crossover Operators for Genetic Algorithms to Solve the Job Shop Scheduling Problem
Authors: Jorge Magalhães-Mendes
Abstract: Genetic algorithms (GA) are a wide class of global optimization methods. Many genetic algorithms have been applied to solve combinatorial optimization problems. One of the difficulties in using genetic algorithms is the choice of crossover operator. The aim of this paper is to show the influence of genetic crossover operators on the performance of a genetic algorithm. The GA is applied to the job shop scheduling problem (JSSP). To achieve this aim, an experimental study of a set of crossover operators is presented. The experimental study is based on a decision support system (DSS). To compare the abilities of the different crossover operators, the DSS was designed to give all the operators the same opportunities. The genetic crossover operators are tested on a set of standard instances taken from the literature. The makespan is the measure used to evaluate the genetic crossover operators. The main conclusion is that there is a crossover operator having the best average performance on a specific set of solved instances.
Keywords: Scheduling, Genetic Algorithms, Crossover Operators, Optimization, Operations Research, JSSP
Title of the Paper: Declarative Implementations of Search Strategies for Solving CSPs in Control Network Programming
Authors: Emilia Golemanova
Abstract: The paper describes how search, one of the most researched techniques for solving Constraint Satisfaction Problems (CSPs), is well suited to declarative (non-procedural) implementation in a new programming paradigm named Control Network Programming, and how this can be achieved using its tools for dynamic computation control. Some heuristics for variable and value ordering in the backtracking algorithm, lookahead strategies, stochastic strategies and local search strategies are the subjects of interest. The 8-queens problem is used to help illustrate how these algorithms work and how they can be implemented in Control Network Programming.
Keywords: Control Network Programming; graphical programming; declarative programming; Constraint Satisfaction Problem; MRV, degree, LCV and minimum-conflicts heuristics
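As a procedural reference point for the strategies named above, here is backtracking with the MRV heuristic on the 8-queens problem. The paper's contribution is the declarative implementation of such strategies in Control Network Programming, which this plain Python sketch does not attempt to reproduce:

```python
def solve_n_queens(n=8):
    """Backtracking with the MRV (minimum remaining values) heuristic:
    always branch on the unassigned column that currently has the fewest
    consistent rows."""
    assignment = {}                      # column -> row

    def consistent(col, row):
        return all(r != row and abs(r - row) != abs(c - col)
                   for c, r in assignment.items())

    def legal_rows(col):
        return [r for r in range(n) if consistent(col, r)]

    def backtrack():
        if len(assignment) == n:
            return True
        # MRV: choose the most constrained unassigned column
        col = min((c for c in range(n) if c not in assignment),
                  key=lambda c: len(legal_rows(c)))
        for row in legal_rows(col):
            assignment[col] = row
            if backtrack():
                return True
            del assignment[col]
        return False

    return assignment if backtrack() else None

print(solve_n_queens(8))   # prints a column -> row assignment for one 8-queens solution
```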
Issue 5, Volume 12, May 2013
Title of the Paper: Quantification and Segmentation of Breast Cancer Diagnosis: Efficient Hardware Accelerator Approach
Authors: Khairulnizam Othman, Afandi Ahmad
Abstract: This research presents a VLSI architecture for image segmentation. The architecture is based on the fuzzy c-means algorithm with a spatial constraint for reducing the misclassification rate. In the architecture, the usual iterative operations for updating the membership matrix and cluster centroids are merged into one single updating process to avoid the large storage requirement. In addition, an efficient pipelined circuit is used for the updating process to accelerate the computational speed. Experimental results show that the proposed circuit is an effective alternative for real-time image segmentation with low area cost, short computation time and a low misclassification rate.
Keywords: fuzzy c-means, image segmentation, clustering, FPGA, chip
Title of the Paper: Identification of Reliable Information for Classification Problems
Authors: Kuang Yu Huang, Hung-Yi Lin
Abstract: A novel information identification model is proposed to support accurate classification tasks with mixtures of categorical and real-valued attributes. This model combines the advantages of rough set theory and a cluster validity method to promote classification quality to higher levels. Real-valued attribute values are pre-processed by the fuzzy c-means clustering method and then analyzed by variable precision rough set theory. Our cluster validity index finalizes the information system with a feasible cluster number for each attribute. In the case that a considerable number of ambiguous instances is included, the experimental results show that our model can explicitly improve traditional classifiers in terms of classification accuracy and discrimination power. This paper provides a better solution for the generation of reliable decision rules for classification problems with attribute mixtures.
Keywords: Reliable information, classification problems, fuzzy c-means, variable precision rough set, cluster validity index, discrimination power
Title of the Paper: Saliency Detection Based on Path Price and Fuzzy Reasoning
Authors: Sai Luo, Shuhan Chen
Abstract: Saliency detection is essential for many vision tasks and has become a very active topic in computer vision. Although various computational models have been developed, especially contrast-based ones, some limitations still exist: they cannot uniformly highlight whole salient regions and often falsely mark background as salient. To address these issues, a novel saliency detection method based on path price and fuzzy reasoning rules is proposed in this paper. In detail, we tackle saliency detection from a different viewpoint: we measure four path prices instead of contrast. We then introduce two fuzzy reasoning rules to capture the properties of these four path prices. The final saliency map is computed by averaging the two fuzzy truth values. Evaluation on two databases validates that the proposed method achieves superior results both on the precision-recall curve and in visual quality.
Keywords: Saliency detection, Path price, Fuzzy reasoning, Saliency map, Fuzzy rule, Fuzzy truth value
Title of the Paper: The Solution Area and Fitness-Based Algorithms of the Content-Driven Template-Based Layout System
Authors: István Albert, Hassan Charaf, László Lengyel
Abstract: People use their mobile devices every day to access a wide variety of digital content. The diversity of mobile platforms and mobile device capabilities requires automatic layout solutions for online content. For the purposes of this paper, our solutions focus on online magazines and newspapers. The Content-Driven Template-Based Layout System (CTLS) is a template-based online magazine layout approach. This approach facilitates defining hierarchical layouts from basic layout elements and splitter components. The goal is to effectively calculate the resulting adaptation method for a particular layout element at any level of the layout hierarchy. This paper introduces both the solution area-based algorithm and the fitness-based algorithm of the CTLS approach. These algorithms apply inequalities rather than equations. Inequalities, represented by solution areas (polygons), provide the flexibility of the approach.
Keywords: Adaptive Layout, Content-Driven Layout, Template-Based Layout, Online Magazine Layout, Splitter Algorithm
Issue 6, Volume 12, June 2013
Title of the Paper: Research on Mining the Online Community: A Case of Open Source Software Community
Authors: Luo Yan
Abstract: The development of Open Source Software (OSS) projects is a process of collective innovation in the environment of online community. The paper addresses the challenge of efficiently mining data from OSS web repositories and building models to study OSS community features. Data collection for OSS community study is nontrivial since most OSS projects are developed by distributed developers using web tools. We design a mining process which combines web mining and database mining together to identify, extract, filter and analyze data. We address and analyze the difficulty of mining OSS community data. Our work provides a general solution for researchers to implement advanced techniques, such as web mining, data mining, statistics, and algorithms to collect and analyze online community data.
Keywords: Online community, Data mining, Open source software
Title of the Paper: A Novel Approach for the Prediction of Epilepsy from 2D Medical Images Using Case Based Reasoning Classification Model
Authors: P. Tamije Selvy, V. Palanisamy, S. Elakkiya
Abstract: The Corpus Callosum is a highly visible structure in brain imaging whose function is to connect the left and right hemispheres of the brain. Epilepsy is a sudden alteration in behaviour or motor function caused by an electrical discharge from the brain. Such electrical activity, starting from one side of the brain, spreads to the other side through the Corpus Callosum. Epilepsy occurs in 2% of the general population and is the oldest known brain disorder. Approaches for the classification of the Corpus Callosum are described for the specific application of epilepsy detection. The proposed technique provides improved classification for the diagnosis of epilepsy and includes the following phases: (i) pre-processing the 2D MR brain image using a threshold interval method and Min-Max normalization; (ii) segmentation of the brain image using a multiscale segmentation method to obtain the segments of the corpus callosum; multiscale segmentation proves to be better for curvature segmentation, with less execution time and 91% accuracy based on entropy; (iii) shape features such as the corpus callosum bending angle, genu thickness and Intelligence Quotient (IQ) are extracted from the segmented corpus callosum; (iv) diagnosis of epilepsy using Case Based Reasoning (CBR). The proposed CBR classification reduces the false positive rate and achieves 96.7% prediction accuracy when compared to conventional classification models.
Keywords: Corpus callosum, Epilepsy, Multiscale Segmentation, Pre-processing, shape features, Case Based Reasoning
Title of the Paper: Research on Security of Routing Protocols against Wormhole Attack in the Ad Hoc Networks
Authors: Huang Wen Hua
Abstract: Ad hoc networks are made up of a group of portable terminals with wireless transmitters acting as a multi-hop provisional autonomous system. Ad hoc networks are very easy to attack with various kinds of network attacks owing to their limited resources, dynamic topology, open communication channels and so on. The wormhole attack is an internal attack against routing protocols in Ad hoc networks. Research on the security of routing protocols against wormhole attacks in Ad hoc networks is performed. A security routing mechanism against wormhole attacks in Ad hoc networks is put forward, on the basis of in-depth research on wormhole attack principles and models. The modeling and simulations are presented in detail.
Keywords: Ad hoc networks, routing protocols, wormhole attack, security routing mechanism, modeling and simulation
Title of the Paper: Arithmetic Operation in Spiking Neural P System with Chain Structure
Authors: Jing Luan, Xiyu Liu
Abstract: The spiking neural P system with chain structure (SNPC, for short) is a new membrane system which combines the spiking neural P system (SNP, for short) with discrete Morse theory; that is to say, the neural membrane cells of a spiking neural P system are set on a chain by a discrete gradient vector path, building an SNP system with chain structure. Membrane computing is a computational model which simulates nature at the cellular level, and its maximal parallelism and distributed manner give it strong computational completeness and efficiency. In this paper, the possibility of performing arithmetic operations on natural numbers in the SNPC system is proved, and the algorithms for the arithmetic operations in the SNPC system are given from the aspects of generating data sets and generating string languages. The efficiency and complexity of the algorithms are significantly improved compared with the usual computer architecture.
Keywords: Membrane Computing, Spiking Neural P System, Discrete Morse Theory, Arithmetic Operation
Issue 7, Volume 12, July 2013
Title of the Paper: A Novel Approach to Simulate the Interaction between Grass and Dynamic Objects
Authors: Hang Qiu, Lei-Ting Chen, Guo-Ping Qiu
Abstract: Grass animation plays an important role in various applications, such as virtual reality, computer games and special effects in movies. In recent years, a variety of approaches have been proposed to simulate the interaction between grass and the wind field. However, simulation of grass-object interaction is seldom addressed due to the complexity of the physical interaction and the high computational cost. In this paper, we present a novel method to animate realistic interaction between grass and dynamic objects. The representation of large-scale grassland relies on three different levels of detail that reduce the rendering cost and still allow high-fidelity rendering of grasses close to the viewer. According to the different levels of detail, two animation approaches are applied: a physically-based method, which deals with the deformation of geometry-based grass blades close to the viewer, and a procedural method, which handles billboard-based grass. Besides, seamless dynamic transition between the different levels is taken into account. Experiments demonstrate that our method not only realistically simulates the interaction between massive amounts of grass and dynamic objects, but also overcomes the deficiencies of existing approaches.
Keywords: Grass Animation; Levels of Detail (LOD); Billboard; Collision Detection; Dynamic Simulation
Title of the Paper: Line-Torus Intersection for Ray Tracing: Alternative Formulations
Authors: Vaclav Skala
Abstract: Intersection algorithms are very important in the computation of geometrical problems. Algorithms for the intersection of a line with linear or quadratic surfaces are quite efficient. However, algorithms for the intersection of a line with other surfaces are more complex and time consuming. In this case the object is usually enclosed in a simple bounding volume to speed up the cases when the given line cannot intersect the given object. In this paper new formulations of the line-torus intersection problem are given, and a new specification of the bounding volume for a torus is given as well. The presented approach is based on the idea of a line intersecting the envelope of a rotating sphere that forms the torus. With this approach a new bounding volume can be formulated which is more effective, as it also detects cases when the line passes through the “hole” of the torus.
Keywords: Line clipping, torus line intersection, CAD systems
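For comparison, the conventional formulation reduces line-torus intersection to a quartic in the line parameter, as sketched below for a torus centred at the origin with axis z. This is the standard implicit-form derivation, not the paper's rotating-sphere formulation or its bounding volume:

```python
import numpy as np

def line_torus_intersections(p, d, R, r):
    """Parameters t where the line x = p + t*d meets the torus
        (x^2 + y^2 + z^2 + R^2 - r^2)^2 = 4 R^2 (x^2 + y^2)
    (axis = z, major radius R, tube radius r). Expanding gives a quartic
    in t whose real roots are returned."""
    p, d = np.asarray(p, float), np.asarray(d, float)
    A = d @ d
    B = 2.0 * (p @ d)
    C = p @ p + R * R - r * r
    # (A t^2 + B t + C)^2 - 4 R^2 ((p_x + t d_x)^2 + (p_y + t d_y)^2) = 0
    coeffs = [A * A,
              2 * A * B,
              B * B + 2 * A * C - 4 * R * R * (d[0] ** 2 + d[1] ** 2),
              2 * B * C - 8 * R * R * (p[0] * d[0] + p[1] * d[1]),
              C * C - 4 * R * R * (p[0] ** 2 + p[1] ** 2)]
    roots = np.roots(coeffs)
    return sorted(t.real for t in roots if abs(t.imag) < 1e-9)

# ray along the x-axis through a torus with R = 2, r = 0.5:
# hits at t = 2.5, 3.5, 6.5, 7.5 (i.e. x = -2.5, -1.5, 1.5, 2.5)
print(line_torus_intersections(p=(-5, 0, 0), d=(1, 0, 0), R=2.0, r=0.5))
```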
Title of the Paper: A Proficient Clustering Technique to Detect CSF Level in MRI Brain Images Using PSO Algorithm
Authors: P. Tamije Selvy, V. Palanisamy, M. Sri Radhai
Abstract: Image segmentation is an indispensable part of the visualization of human tissues during the analysis of Magnetic Resonance Imaging (MRI). MRI is an advanced medical imaging technique which provides rich information for detecting the Cerebrospinal Fluid (CSF) level in brain images. Changes in the CSF protein level form abnormal brain deposits strongly linked to a variety of neurological diseases. The proposed system encompasses the following steps: pre-processing (Contrast Limited Adaptive Histogram Equalization), CSF extraction from the enhanced image, clustering methods (Fuzzy C Means, Total Variation FCM, and Anisotropic Diffused TVFCM), and Particle Swarm Optimization (PSO) combined with the clustering techniques (FCM-PSO, TVFCM-PSO, and ADTVFCM-PSO). The clustering methods provide only locally optimal solutions; in order to achieve globally optimal solutions, they are further optimized using PSO. The performance of clustering with optimization is analyzed using a defined set of the Simulated MS Lesion Brain database. The optimized clustering methods find the level of CSF present in MRI brain images with 98% accuracy, 92% sensitivity and 97% specificity.
Keywords: Cerebrospinal Fluid, Segmentation, Magnetic Resonance Image, Fuzzy C Means, Total Variation Regularizer, Anisotropic Diffusion, Particle Swarm Optimization
Issue 8, Volume 12, August 2013
Title of the Paper: The DBSCAN Clustering Algorithm by a P System with Active Membranes
Authors: Jie Sun, Xiyu Liu
Abstract: A major characteristic of the P system with active membranes is that not only the objects but also the membrane structure evolves. Using this possibility to change the membrane structure, it can be applied in parallel computation for solving clustering problems. In this paper a P system with active membranes for solving DBSCAN clustering problems is proposed. This new model of P system can reduce the time complexity of computing without increasing the complexity of the DBSCAN clustering algorithm. Firstly, the procedure of the DBSCAN clustering algorithm is specified. Then a P system with a sequence of new rules is designed to realize the DBSCAN clustering algorithm. For a given dataset, it can be clustered in a non-deterministic way. Through example verification, this new model of P system is shown to be feasible and effective for solving DBSCAN clustering problems. This is a great improvement in applications of membrane computing.
Keywords: DBSCAN; Clustering Algorithm; Membrane Computing; P System; Active Membranes
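To make the clustering step concrete, a minimal sequential DBSCAN is sketched below; the paper's contribution is realizing this procedure with the rules of a P system with active membranes, which is not reproduced here:

```python
import numpy as np

def dbscan(X, eps=0.5, min_pts=4):
    """Minimal sequential DBSCAN: labels[i] = cluster id, or -1 for noise."""
    n = len(X)
    labels = np.full(n, -1)
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    cluster = 0
    visited = np.zeros(n, bool)
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_pts:
            continue
        # expand a new cluster from core point i
        visited[i] = True
        labels[i] = cluster
        queue = list(neighbors[i])
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster              # border or core point joins the cluster
            if not visited[j]:
                visited[j] = True
                if len(neighbors[j]) >= min_pts:
                    queue.extend(neighbors[j])   # core point: keep expanding
        cluster += 1
    return labels

rng = np.random.default_rng(0)
X = np.vstack([rng.normal((0, 0), 0.1, (30, 2)),
               rng.normal((2, 2), 0.1, (30, 2)),
               [(10, 10)]])                      # one obvious outlier
print(dbscan(X, eps=0.4, min_pts=4))             # two clusters plus one -1 (noise) label
```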
Title of the Paper: 3D Surface Simplification Based on Extended Shape Operator
Authors: Juin-Ling Tseng, Yu-Hsuan Lin
Abstract: In order to present realistic scenes and models, complex triangle meshes comprising large numbers of triangles are often used to describe 3D models. However, with an increase in the number of triangles, storage and computation costs rise. To preserve more geometric features of 3D models, Zhanhong and Shutian employed the curvature factor of the collapsing edge, the Gaussian curvature, to improve Quadric Error Metrics (QEM) simplification. Their method allows QEM not only to measure distance error but also to reflect the geometric variations of the local surface. However, because of the Gaussian curvature, the method can only estimate the curvature factors of vertices in manifold surfaces. To overcome this problem, we propose a new simplification method, called the Extended Shape Operator. The Extended Shape Operator estimates the local surface variation using a three-ring shape operator and can be applied in the simplification of manifold and non-manifold surfaces. In our experiment, we employed the error detection tool Metro to compare the errors resulting from simplification. The results of the experiment demonstrate that, after the model has been simplified, the proposed method is superior to the simplification method proposed by Zhanhong and Shutian.
Keywords: Surface simplification, extended shape operator, curvature factor, quadric error metric
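The baseline quadric error metric that the Extended Shape Operator extends can be sketched as follows: each incident plane contributes a quadric K_p = p p^T and the vertex error is the quadratic form of the homogeneous vertex. The paper's three-ring shape-operator weighting is not included:

```python
import numpy as np

def plane_quadric(a, b, c, d):
    """Fundamental error quadric K_p = p p^T for the plane ax+by+cz+d = 0
    (with a^2 + b^2 + c^2 = 1)."""
    p = np.array([a, b, c, d], float)
    return np.outer(p, p)

def vertex_error(v, Q):
    """Baseline QEM error: squared-distance sum  vh^T Q vh  with vh = (x, y, z, 1)."""
    vh = np.append(np.asarray(v, float), 1.0)
    return float(vh @ Q @ vh)

# vertex shared by two incident triangle planes z = 0 and x = 0
Q = plane_quadric(0, 0, 1, 0) + plane_quadric(1, 0, 0, 0)
print(vertex_error((0.0, 0.0, 0.0), Q))   # 0.0  (on both planes)
print(vertex_error((1.0, 0.0, 2.0), Q))   # 5.0  (1^2 + 2^2)
```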
Title of the Paper: Realistic Simulation of 3D Cloud
Authors: Hang Qiu, Lei-Ting Chen, Guo-Ping Qiu, Hao Yang
Abstract: The digital creation of clouds is important for many applications in computer graphics, including outdoor simulations and the rendering of atmospheric effects. Unfortunately, it is still difficult to simulate realistic clouds at interactive frame rates due to their peculiar microstructures and the complex physical process of their formation. Realistic simulation of clouds has turned out to be one of the most challenging topics in computer graphics. In this paper, we present a method for simulating 3D clouds. The Coupled Map Lattice (CML) is adopted for the modeling of clouds, and the simulation of light scattering in clouds is achieved by using a series of spherical harmonics and spherical harmonic coefficients that represent the incident-light distribution. A frequency-domain volume-rendering algorithm combined with spherical harmonics is applied to implement fast rendering of cloud scenes. Experiments demonstrate that our method improves computing efficiency while yielding realistic visual quality.
Keywords: Natural Scene, Coupled Map Lattice (CML), Spherical Harmonics, Frequency Domain, Cloud Rendering, Volume Rendering
Issue 9, Volume 12, September 2013
Title of the Paper: Schema Matching Using Directed Graph Matching
Authors: K. Amshakala, R. Nedunchezhian
Abstract: Integration of data from multiple sources has gained importance as data and data providers grow at a fast rate. Schema matching is considered an important step in integrating data from multiple sources. Most of the available techniques for automated schema matching require interpretation of attribute names and data values. Such techniques fail when the data sources have incomprehensible attribute names and data values. An alternative schema matching technique, which uses statistics from the schema instances and does not require value interpretation, is proposed in this paper. In this work, functional dependency (FD) relationships between attributes of two schemas are represented in the form of a directed dependency graph. A primitive directed graph matching algorithm is used to find the matching between the two dependency graphs and thereby to find the corresponding attributes of the two schemas. The experimental results show that the proposed approach increases the accuracy of matching, as it uses fine-grained functional dependency relationships between attributes to compare two schemas.
Keywords: Data Integration, Schema Matching, Information Theory, Graph matching, Functional Dependency
Title of the Paper: Solving SVM Model Selection Problem Using ACOR and IACOR
Authors: Hiba Basim Alwan, Ku Ruhana Ku-Mahamud
Abstract: Ant Colony Optimization (ACO) has been used to solve the Support Vector Machine (SVM) model selection problem. ACO originally deals with discrete optimization problems. In applying ACO to optimize SVM parameters, which are continuous variables, there is a need to discretize the continuous values into discrete ones. This discretization process results in the loss of some information and hence affects the classification accuracy. In order to enhance SVM performance and solve the discretization problem, this study proposes two algorithms to optimize SVM parameters using Continuous ACO (ACOR) and Incremental Continuous Ant Colony Optimization (IACOR) without the need to discretize the continuous values of the SVM parameters. Eight datasets from UCI were used to evaluate the credibility of the proposed integrated algorithms in terms of classification accuracy and size of the feature subset. Promising results were obtained when compared to the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. Results have also shown that IACOR-SVM is better than ACOR-SVM in terms of classification accuracy.
Keywords: Support Vector Machine, Continuous Ant Colony Optimization, Incremental Continuous Ant Colony Optimization, Model Selection
Title of the Paper: An Efficient Cluster-Based Reliable Power Aware Scheme (RPAS) for Network Longevity in WSN
Authors: G. Kannan, T. Sree Renga Raja
Abstract: This experimental work is mainly focused on designing a new wireless sensor network (WSN) node called WSNMSP430 with increased lifetime. The developed node is based on an analysis of the various low-power components available in the market, and an energy model was created for the processor, transceiver and sensor to predict the lifetime of the WSN node. Utilizing the low-power features supported by the processor and transceiver, a reliable power-aware scheme was developed for the node to transmit and receive packets. An appropriate experimental setup was employed for transmitting and receiving packets among various WSNMSP430 nodes in different places. The node was also compared with Crossbow motes such as TelosB, Mica2 and MicaZ to estimate the lifetime through various real-time scenarios. The experimental test results reveal that the newly developed node has a good impact on improving lifetime.
Keywords: Wireless Sensor Networks (WSN), Clusters, Energy model, life time estimation
Issue 10, Volume 12, October 2013
Title of the Paper: Fast Moving Object Tracking Algorithm Based on Hybrid Quantum PSO
Authors: Jinyin Chen, Yi Zhen, Dongyong Yang
Abstract: Standard particle swarm optimization (PSO) balances local search exploitation and global search exploration. The population diversity is easily lost during the latter period of evolution, which means most particles converge to nearby positions around a local optimum. In this paper, a Euclidean-distance-based hybrid quantum particle swarm optimization (HQPSO) is proposed. Based on the calculation of population diversity, when the diversity is less than a threshold, the population is divided into two sub-populations based on Euclidean distance. The sub-population near the Euclidean center evolves according to traditional QPSO, while the other sub-population, far away from the center, flies toward the boundary away from the center. In this way, population diversity is maintained, helping the particles converge to the global optimum. Benchmark functions are adopted to test the efficiency of HQPSO. Based on HQPSO, a Mean Shift algorithm is designed for fast-moving object tracking to improve tracking efficiency and decrease detection time cost, which overcomes the “tracking lost” problem of the Mean Shift algorithm.
Keywords: Quantum particle Swarm optimization, Euclid distance, Fast moving, Population diversity, Object tracking
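A sketch of the diversity check and Euclidean-distance-based population division described above; the QPSO position-update rules themselves are not shown, and the threshold value is illustrative:

```python
import numpy as np

def divide_population(positions, threshold=0.1):
    """Diversity check and Euclidean-distance-based split from the abstract:
    if the swarm's average distance to its centre falls below `threshold`,
    split it into a near-centre half (which keeps evolving as usual QPSO)
    and a far-from-centre half (pushed toward the boundary to restore
    diversity). Only the split is sketched; the QPSO updates are not."""
    positions = np.asarray(positions, float)
    centre = positions.mean(axis=0)
    dists = np.linalg.norm(positions - centre, axis=1)
    diversity = dists.mean()
    if diversity >= threshold:
        return None                       # diversity still fine, no split
    order = np.argsort(dists)
    half = len(positions) // 2
    near = order[:half]                   # evolve with the standard QPSO rules
    far = order[half:]                    # fly toward the boundary, away from centre
    return near, far, diversity

swarm = np.random.default_rng(0).normal(0.5, 0.02, size=(20, 3))   # nearly converged swarm
near, far, div = divide_population(swarm)
print(len(near), len(far), round(div, 4))
```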
Title of the Paper: Combined SIQT and SSF Matching Score for Feature Extraction Evaluation in Finger Knuckle Print Recognition
Authors: Kirthika Alagar, Arumugam Subbanna
Abstract: Feature selection has been a prevailing area of research in biometrics, image retrieval, data mining, and text classification. The main idea of feature selection is to choose a set of features by removing the irrelevant as well as the redundant features that are strongly correlated. Many classification techniques have been presented for the feature extraction process, but there has been no effort on feature selection. Previous work used texture and color intensive biometrics (TCIB) for multimodal security, which achieves significant performance even for large pose variations at various angles. The matching is done using texture values, but the outcome of the image is not very clear. To improve the finger knuckle print recognition system, in this paper we present a novel grouping of local information for an efficient finger-knuckle-print (FKP) based recognition system which is robust to scale and rotation. The non-uniform illumination of the FKP due to the relatively curved surface is corrected and the texture is improved. The features of the improved FKP are extracted using the Scale Invariant Quality Transform (SIQT) and the Strong Speeded-up Features (SSF). Corresponding features of the enrolled and query FKPs are matched using the nearest-neighbor-ratio method, and subsequently the corresponding SIQT and SSF matching scores are combined using a weighted sum rule. Experiments carried out on datasets with seven sample images under substantial pose variations provide enhanced results compared to the existing texture and color intensive biometric multimodal security approach using hand geometry and palm print. The proposed system is evaluated with the set of images in both classification and authentication modes. It is observed that the system achieves a CRR of 100% and an EER of 0.215%.
Keywords: Palm print, Hand Geometry, Biometrics, Feature selection, Knuckle finger feature selection, SIQT, SSF, TCIB
Title of the Paper: Automatic Detection and Classification of Brain Hemorrhages
Authors: Mahmoud Al-Ayyoub, Duaa Alawad, Khaldun Al-Darabsah, Inad Aljarrah
Abstract: Computer-aided diagnosis systems have been the focus of many research endeavors. They are based on the idea of processing and analyzing images of different parts of the human body for a quick and accurate diagnosis. In this paper, the aforementioned approach is followed to detect whether a brain hemorrhage exists or not in Computed Tomography (CT) scans of the brain. Moreover, the type of the hemorrhage is identified. The implemented system consists of several stages that include image preprocessing, image segmentation, feature extraction, and classification. The results of the conducted experiments are very promising. A recognition rate of 100% is attained for detecting whether a brain hemorrhage exists or not. For the hemorrhage type classification, more than 92% accuracy is achieved.
Keywords: brain hemorrhage, brain CT scans, machine learning, image processing, image segmentation
Issue 11, Volume 12, November 2013
Title of the Paper: A Novel Image Encryption Approach Using Matrix Reordering
Authors: T. Sivakumar, R. Venkatesan
Abstract: Transmission and storage of multimedia data like audio, video, and images over the Internet have increased in today's digital communication. Among the different multimedia data, images are transmitted and used very often. It is essential to protect multimedia data from unauthorized disclosure during transmission. A novel approach for encrypting digital images using Matrix Reordering (MR), a kind of scanning, and a simple XOR operation is proposed in this paper. The MR is applied to permute the pixel positions and the XOR operation is performed to diffuse the pixel values. The bitwise XOR operation uses pseudorandom numbers generated by the linear congruential method. Image encryption evaluation tests such as histogram, correlation, cut test, dispersion test, visual testing, and speed test have been conducted on the suggested method, and the results are analyzed. The analysis shows that the proposed system is resistant to statistical and differential attacks and could be used in real-time applications to provide a confidentiality service for images with low computational overhead.
Keywords: Information Security, Image Encryption, Matrix Reordering, Scan Pattern, Image Histogram, Image Correlation, Differential Attack
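A minimal sketch of the two stages described above: a position permutation followed by XOR diffusion with a keystream from a linear congruential generator. A seeded random permutation stands in for the paper's Matrix Reordering scan pattern, and the LCG constants are the common Numerical Recipes values, not necessarily those used in the paper:

```python
import numpy as np

def lcg_keystream(n, seed, a=1664525, c=1013904223, m=2 ** 32):
    """Pseudorandom byte stream from a linear congruential generator,
    used for the XOR diffusion stage."""
    out, x = np.empty(n, np.uint8), seed
    for i in range(n):
        x = (a * x + c) % m
        out[i] = x & 0xFF
    return out

def encrypt(image, seed=12345):
    """Permute pixel positions, then XOR with the LCG keystream.
    (A seeded random permutation stands in for the Matrix Reordering scan.)"""
    flat = image.ravel()
    perm = np.random.default_rng(seed).permutation(flat.size)
    return (flat[perm] ^ lcg_keystream(flat.size, seed)).reshape(image.shape), perm

def decrypt(cipher, perm, seed=12345):
    flat = cipher.ravel() ^ lcg_keystream(cipher.size, seed)
    out = np.empty_like(flat)
    out[perm] = flat                      # undo the position permutation
    return out.reshape(cipher.shape)

img = np.random.default_rng(0).integers(0, 256, (8, 8), dtype=np.uint8)
cipher, perm = encrypt(img)
print(np.array_equal(decrypt(cipher, perm), img))   # True
```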
Title of the Paper: Elliptic Curve Point Multiplication Algorithm Using Precomputation
Authors: Hani Mimi, Azman Samsudin, Shahram Jahani
Abstract: Window-based elliptic curve multiplication algorithms are more attractive than non-window techniques if precomputation is allowed. Reducing the complexity of elliptic curve point multiplication of the form kP, which is the dominant operation in elliptic curve cryptography schemes, reduces the overall complexity of the cryptographic protocol. The wBD is a new window-based elliptic curve multiplication method based on a new recoding method called window big-digit (wBD). The wBD is a bidirectional method that can be calculated in both directions based on the amount of available memory. The available memory is used efficiently, since wBD has a small number of precomputed points compared to other window methods, which makes it more suitable for limited-storage devices. The BD recoding method requires only one pass to transform the exponent k from its binary representation to its wBD representation. Moreover, the wBD keys have the lowest zero-run length among window methods. Finally, the number of elliptic curve operations and the execution time of the wBD method are measured. Consequently, wBD is as efficient as other window-based methods.
Keywords: Window methods, Single Scalar EC multiplication, Big-digit Recoding, Public key cryptography
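For context, the classic fixed-window scalar multiplication with a precomputed table, which wBD is compared against, looks as follows. This is not the wBD recoding itself, and the toy curve parameters are illustrative:

```python
# Generic fixed-window scalar multiplication with a precomputed table.
# NOT the wBD recoding; the toy curve below is purely illustrative.

def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over GF(p); None = infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return x3, (lam * (x1 - x3) - y1) % p

def window_mul(k, P, a, p, w=4):
    """Compute k*P with a fixed window of w bits: precompute 0P..(2^w-1)P,
    then scan k w bits at a time (left to right), doubling w times per window."""
    table = [None]
    for _ in range(2 ** w - 1):
        table.append(ec_add(table[-1], P, a, p))      # table[i] = i*P
    R = None
    for i in range((k.bit_length() + w - 1) // w - 1, -1, -1):
        for _ in range(w):
            R = ec_add(R, R, a, p)                    # R = 2^w * R
        R = ec_add(R, table[(k >> (i * w)) & (2 ** w - 1)], a, p)
    return R

# toy curve y^2 = x^3 + 2x + 2 over GF(17), generator (5, 1)
a, p, G = 2, 17, (5, 1)
print(window_mul(10, G, a, p))                        # 10*G = (7, 11)
```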
Title of the Paper: Multi-Cluster Based Temporal Mobile Sequential Pattern Mining Using Heuristic Search
Authors: M. I. Thariq Hussan, B. Kalaavathi
Abstract: An enhanced mobile sequential pattern mining approach using a heuristic search technique is explored to predict mobile users' behavior effectively. By analyzing the movement of mobile users with respect to time, location and service requests, one can contend that users in different user groups may have different mobile transaction behavior. Similar transaction behavior is grouped by applying the heuristic search technique, which efficiently searches the mobile transaction database to prune undesired transactions, forming clusters that are refined, meaningful, and relevant to the query. Research on multi-cluster based sequential pattern mining has been emerging in recent years due to a wide range of potential applications. One of the active topics is bringing wireless and web technologies to mobile users through the use of mobile devices anytime and anywhere. This approach has been evaluated with a transactional dataset, and simulation is carried out with data obtained from the real world to generate the required network environment. Compared with the Cluster-based Temporal Mobile Sequential Pattern (CTMSP) approach, the evaluation results show that the Multi-Cluster based Temporal Mobile Sequential Pattern using Heuristic Search (MCTMSPHS) achieves 30 to 40% higher accuracy, 40 to 50% lower energy usage and 20 to 25% lower execution time.
Keywords: Mobile Sequential Pattern Mining, Heuristic Search, Clustering, Mobile Environment
Issue 12, Volume 12, December 2013
Title of the Paper: A Clique-Based and Degree-Based Clustering Algorithm for Expressway Network Simplification Problem
Authors: Zipeng Zhang, Hongguo Wang
Abstract: To cope with the complex topological structure and the large amount of distance and site information of the expressway network, this paper proposes a new degree-based maximal clique mining algorithm, based on a top-down principle, to reasonably simplify the expressway network information. At the same time, a search tree model containing a series of pruning strategies and dictionary ordering strategies is derived to describe the mining process directly and improve the efficiency of searching and clustering the expressway network. Finally, a simplification algorithm for the expressway network model is designed, and the new algorithm shows better results on the example expressway network databases of Shandong province.
Keywords: maximal clique mining, degree-based principle, distributed search tree model, expressway network simplification
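As a classical reference point for degree-based maximal clique mining, the Bron-Kerbosch algorithm with pivoting and degree ordering is sketched below; the paper's search-tree pruning and dictionary-ordering strategies are not reproduced, and the adjacency list is a toy example:

```python
def maximal_cliques(adj):
    """Bron-Kerbosch enumeration of maximal cliques with pivoting; candidate
    vertices are processed in descending degree order, a classical counterpart
    to the degree-based mining described in the abstract."""
    cliques = []

    def expand(R, P, X):
        if not P and not X:
            cliques.append(sorted(R))
            return
        pivot = max(P | X, key=lambda v: len(adj[v] & P))   # prune via pivot
        for v in sorted(P - adj[pivot], key=lambda v: -len(adj[v])):
            expand(R | {v}, P & adj[v], X & adj[v])
            P = P - {v}
            X = X | {v}

    expand(set(), set(adj), set())
    return cliques

# toy "expressway" adjacency: two overlapping site groups
adj = {
    0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2, 4},
    4: {3, 5, 6}, 5: {4, 6}, 6: {4, 5},
}
print(maximal_cliques(adj))   # the three maximal cliques: {0,1,2,3}, {3,4}, {4,5,6}
```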
Title of the Paper: An Authoring Tool for Designing Learning Scenarios Adapted to Teachers
Authors: Ryane Imane, Moncef Bentaleb, Mohammed Khalidi Idrissi, Samir Bennani
Abstract: The learning scenario is an important product of the instructional engineering process; it enables the planning of learning situations. However, instructional scripting was not invented by instructional engineers. It has long been a common practice for teachers, although in a more intuitive and artisanal way than that of instructional engineers. Thus, several research projects have focused on this aspect, i.e. how to equip teachers so that they can, like instructional engineers, produce learning scenarios adapted to current learning problems: the use of ICT in the classroom, e-learning, etc. A number of tools, fruits of these works, have been created: Exploragraph, ScenEdit or Web Collage. They are called authoring systems. Our research project is part of these works and aims to create an authoring system providing support for designing learning scenarios adapted to teachers. In this paper, we present a synthesis of our work.
Keywords: e-learning, authoring system, learning scenario, educational modeling language, SOA, web service, instructional scripting, instructional engineering
Title of the Paper: Quality Aware Service Oriented Ontology Based Data Integration
Authors: Hema M. S., Chandramathi S.
Abstract: Integration of multiple, distributed, and heterogeneous sources is essential for scientific and commercial domains. Ensuring data quality in data integration is a challenge because of the varying quality levels of the sources. Existing data integration methodologies do not assure data quality, as it is difficult to assess. The metadata of data sources do not provide quality details, so it is difficult to choose the best query plan, and it is also difficult to predict the resultant data quality before integration. To mitigate the above issues, this paper proposes the Service Oriented Data Integration with Quality of Service (SODI-QoS) architecture. The SODI-QoS architecture has a wrapper-mediator layer, which consists of a semantic conflict resolution layer and a Quality of Service (QoS) layer. The semantic conflict resolution layer uses ontology to create local and global schemas to resolve semantic conflicts. The QoS layer detects and resolves the incompleteness and inaccuracy of the resultant data of the data sources. The proposed architecture provides high-quality data results to the end user, and notification about the incompleteness and inaccuracy of a source is communicated to the respective data sources. An e-shopping application has been used to analyze the performance of the SODI-QoS architecture. Experimental results illustrate that the accuracy and precision of the SODI-QoS architecture have been improved by 12% and 14%, respectively, compared with ontology-based data integration.
Keywords: Data integration, Quality of service, Semantic conflict, data completeness, source completeness, accuracy