WSEAS Transactions on Computers



Print ISSN: 1109-2750
E-ISSN: 2224-2872

Volume 13, 2014

Notice: As of 2014 and for the forthcoming years, the publication frequency of WSEAS Journals follows the 'continuously updated' model: instead of being separated into issues, new papers are added on a continuous basis, allowing a more regular flow and shorter publication times. Papers appear in reverse chronological order, so the most recent one is at the top.




Title of the Paper: Recursive Estimation Algorithms in Matlab & Simulink Development Environment

Authors: Petr Navrátil, Ján Ivanka

Abstract: The article deals with recursive estimation algorithms realized in the Matlab & Simulink development environment. These algorithms are realized as blocks in a simple Simulink library. The proposed library can be used for recursive parameter estimation of the linear dynamic models ARX, ARMAX and OE. The library implements several recursive estimation methods: Least Squares Method, Recursive Leaky Incremental Estimation, Damped Least Squares, Adaptive Control with Selective Memory, Instrumental Variable Method, Extended Least Squares Method, Prediction Error Method and Extended Instrumental Variable Method. Several forgetting factors and modifications of the basic algorithm are taken into consideration in order to cope with tracking time-variant parameters.
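As an illustration of the kind of algorithm such a library implements, the following is a minimal NumPy sketch of recursive least squares with an exponential forgetting factor for a first-order ARX model; the model structure, forgetting factor value and regressor construction are illustrative assumptions, not the authors' Simulink blocks.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive least-squares step with exponential forgetting factor lam.

    theta : current parameter estimate (n,)
    P     : covariance matrix (n, n)
    phi   : regressor vector (n,)
    y     : new output sample
    """
    Pphi = P @ phi
    k = Pphi / (lam + phi @ Pphi)          # gain vector
    e = y - phi @ theta                    # prediction error
    theta = theta + k * e                  # parameter update
    P = (P - np.outer(k, Pphi)) / lam      # covariance update with forgetting
    return theta, P

# Example: identify y(t) = -a1*y(t-1) + b1*u(t-1), a first-order ARX model
rng = np.random.default_rng(0)
a1, b1 = -0.8, 0.5
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = -a1 * y[t - 1] + b1 * u[t - 1] + 0.01 * rng.standard_normal()

theta = np.zeros(2)
P = 1000.0 * np.eye(2)
for t in range(1, 500):
    phi = np.array([-y[t - 1], u[t - 1]])
    theta, P = rls_update(theta, P, phi, y[t])
print("estimated [a1, b1]:", theta)   # should approach [-0.8, 0.5]
```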

Keywords: Recursive estimation, ARX model, ARMAX model, OE model, forgetting factors, Matlab, Simulink

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #61, pp. 691-702


Title of the Paper: Reliability Improvement and Reliability Assessment for Distributed Hardware-Software Multi-Agent Systems

Authors: Alexei V. Igumnov, Sergey E. Saradgishvili

Abstract: The reliability of a new kind of object, the distributed hardware-software (DHS) multi-agent system (MAS), is considered. A DHS MAS is defined as a system that is based on agent technologies and consists of both agents and the hardware components required for the execution of agents and for the interaction of agents with an environment. Reliability improvement, fault recovery and several reliability assessment approaches for DHS MAS are presented. The reliability improvement methodology is built upon replication of unique functional components and redundancy of universal components. The fault-recovery methodology defines a set of fault-recovery procedures required for restoring a consistent system configuration after failures of its components. A methodology for operability function formation was developed to enable utilization of logical-and-probabilistic methods for reliability assessment. Another approach for reliability assessment, based on a Markovian model and a system state graph, was developed to overcome the limitations of logical-and-probabilistic methods, which are suitable only for systems with hot standby. The state-graph-based approach allows reliability assessment for DHS MAS with cold standby and different operating modes of system components, and it is also applicable when the probability of failure of one component depends on the states of other components. A new failure model for determining failure rates of system components in accordance with the system state is introduced. Computing experiments described in the article validate the developed methodologies.

Keywords: Multi-agent system, reliability, fault-recovery, operability function, redundancy, state graph, logical-and-probabilistic method

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #60, pp. 670-690


Title of the Paper: Developing Scalable Applications with Actors

Authors: Agostino Poggi

Abstract: This paper presents a software framework aimed at both simplifying the development of large and distributed complex systems and guaranteeing efficient execution of applications. The software framework takes advantage of a concise actor model that eases the development of actor code by delegating the management of events (i.e., the reception of messages) to the execution environment. Moreover, it allows the development of scalable and efficient applications by making it possible to use different implementations of the components that drive the execution of actors. In particular, the paper introduces the software framework and presents the first results of its experimentation.

Keywords: Actor model, software framework, concurrent systems, distributed systems, Java

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #59, pp. 660-669


Title of the Paper: Control Network Programming Development Environments

Authors: Kostadin Kratchanov, Tzanko Golemanov, Buket Yüksel, Emilia Golemanova

Abstract: In this paper we discuss the unusual distinctive features of Control Network Programming as a hybrid programming paradigm. We postulate the maxim “Primitives + Control Network = Control Network Program”, and use this observation in the design of programming environments for developing Control Network Programming projects. The various possible approaches to building such environments are the main focus of the paper, together with a relatively detailed presentation of the currently most powerful locally run SpiderCNP environment with graphical editing and tracing, as well as two light-weight and ready-to-use cloud-based environments. An extended survey of cloud compilers and IDEs is also included.

Keywords: Control Network Programming, CNP, programming environments, cloud IDEs, cloud compilers, online compilers, programming paradigms, learning systems

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #58, pp. 645-659


Title of the Paper: Multi-Level Image Annotation Using Bayes Classifier and Fuzzy Knowledge Representation Scheme

Authors: Marina Ivasic-Kos, Ivo Ipsic, Slobodan Ribaric

Abstract: Automatic image annotation (AIA) is the process by which metadata, in the form of keywords or text descriptions, is automatically assigned to an unlabeled image. Generally, two problems can be distinguished: the problem of semantic extraction, due to the gap between the image features and object labels, and the problem of semantic interpretation, due to the gap between the object labels and the human interpretation of images. In this paper, a model for multi-level image annotation performed in two phases is proposed. In the first phase, a Naïve Bayes classifier is used to classify low-level image features into elementary classes. In the second phase, a knowledge representation scheme based on a Fuzzy Petri Net is used to expand the vocabulary and to include multi-level semantic concepts related to images in the image annotations. A knowledge representation scheme for outdoor image annotation is given. Procedures for determining concepts related to an image using fuzzy recognition and inheritance algorithms on the knowledge representation scheme are presented, as well as experimental results of image annotation.
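The following sketch illustrates only the first phase, classifying low-level feature vectors into elementary classes with a Gaussian Naïve Bayes classifier; the toy features, class labels and the use of scikit-learn are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Toy low-level features per image segment: mean RGB plus a texture measure.
# Labels are elementary classes such as 'sky', 'grass', 'water' (illustrative).
X_train = np.array([
    [0.35, 0.55, 0.95, 0.10],   # sky-like: bluish, smooth
    [0.30, 0.50, 0.90, 0.12],
    [0.20, 0.70, 0.25, 0.45],   # grass-like: green, textured
    [0.25, 0.65, 0.20, 0.50],
    [0.15, 0.35, 0.60, 0.20],   # water-like
    [0.10, 0.30, 0.55, 0.25],
])
y_train = ["sky", "sky", "grass", "grass", "water", "water"]

clf = GaussianNB().fit(X_train, y_train)

# Classify a new segment and inspect the posterior probabilities, which a
# second, knowledge-based phase could then reason over.
x_new = np.array([[0.32, 0.52, 0.92, 0.11]])
print(clf.predict(x_new))          # ['sky']
print(clf.predict_proba(x_new))    # posterior over elementary classes
```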

Keywords: Multi-level Image Annotation, Fuzzy Petri Net, Knowledge Representation, Naïve Bayes

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #57, pp. 635-644


Title of the Paper: Towards Relating Physiological Signals to Usability Metrics: A Case Study with a Web Avatar

Authors: Pierfrancesco Foglia, Michele Zanda

Abstract: Inferring the user’s approval of a graphical interface with non-invasive devices can be effective in improving its design and in implementing adaptive, pleasant interactions. This paper investigates how three common physiological signals, i.e. skin conductance, heart rate and respiration, can be exploited to infer users’ approval of an online avatar embedded in a health care website. A between-group experiment is performed with participants who have the avatar support and participants who do not. During the experiment, skin conductance, heart rate and respiration were monitored, together with traditional usability metrics (visited pages, completion times, errors, etc.). At the end of each experiment, a feedback questionnaire is administered to gather information related to the user experience, ease of use and approval. Results indicate that the respiration overshoot rate is closely related to the users’ appreciation of the avatar-based interaction. Further steps of our research will consider improving the results by investigating and exploiting mutual effects induced by the multiple collected signals.

Keywords: Smart Interfaces, Human-Computer Interaction, Usability Assessment, Physiology, GSR, HRV, respiration

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #56, pp. 624-634


Title of the Paper: Energy Efficient Task Allocation and Scheduling in Distributed Homogeneous Multiprocessor Systems

Authors: Anju S. Pillai, T. B. Isha

Abstract: With the advent of semiconductor technology, the development of more complex embedded real-time applications is possible today. This accelerates the development of and support for multiprocessor-based systems. The paper presents the development of a power-aware real-time embedded system for temperature monitoring and control in safety-critical applications. The main objective of the work is to perform a hardware implementation of task allocation for multiprocessor systems based on task dependencies and precedence relations and to schedule the real-time tasks. The proposed work also provides an energy-aware solution by integrating the Dynamic Voltage Frequency Scaling (DVFS) technique and shutting off the unused peripherals of the processor. A laboratory model of a multiprocessor-based experimental setup was developed to validate the functioning of the algorithms using ARM7 LPC2148 microcontrollers.

Keywords: Multi processor systems, Energy minimization, Embedded Systems, Microcontrollers, Scheduling, Task-processor allocation, Dynamic voltage frequency scaling

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #55, pp. 613-623


Title of the Paper: Optimization of the Dispatch Probability of Service Requests for Two Servers in Parallel Connection

Authors: Chung-Ping Chen, Ying-Wen Bai, Hsiang-Hsiu Peng

Abstract: This paper describes the procedure used to create a queueing model for servers in parallel connection with different dispatch probabilities. The system response time of parallel Web servers is improved by calculating the dispatch probabilities. The system response time is calculated first. The equation for the lowest system response time is obtained by adjusting the equation to meet the dispatch probability requirements. Then the system response time of the servers is measured. Last, the errors in the system response time of the servers are compared by simulation, modeling and measurement.
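As a hedged illustration of the underlying optimization, the sketch below assumes each server behaves as an M/M/1 queue and numerically finds the dispatch probability p that minimizes the mean response time p/(mu1 - p*lambda) + (1 - p)/(mu2 - (1 - p)*lambda); the M/M/1 assumption and the rates are illustrative, not the paper's model.

```python
import numpy as np

lam = 8.0              # total arrival rate (requests/s), illustrative
mu1, mu2 = 10.0, 6.0   # service rates of the two parallel servers

def mean_response_time(p):
    """Weighted mean response time when a fraction p of requests goes to
    server 1 and (1 - p) to server 2 (M/M/1 assumption for each server)."""
    r1, r2 = p * lam, (1 - p) * lam
    if r1 >= mu1 or r2 >= mu2:           # unstable split
        return np.inf
    t1 = 1.0 / (mu1 - r1)                # M/M/1 mean response time
    t2 = 1.0 / (mu2 - r2)
    return p * t1 + (1 - p) * t2

ps = np.linspace(0.0, 1.0, 10001)
times = np.array([mean_response_time(p) for p in ps])
best = ps[np.argmin(times)]
print(f"best dispatch probability p = {best:.3f}, "
      f"mean response time = {times.min():.4f} s")
```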

Keywords: Service rate, Optimization of the Dispatch probability, Parallel connection, System response time, Queueing Model, Different Dispatch Probabilities, Equivalent model

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #54, pp. 600-612


Title of the Paper: Change-Point Detection in Multivariate Time-Series Data by Recurrence Plot

Authors: Min Hu, Shengchen Zhou, Jiajie Wei, Yuyan Deng, Wenting Qu

Abstract: Change-point detection in time series is an important data mining task with applications to abnormality diagnosis, event monitoring, climate change analysis, and other domains. This paper presents a novel method based on the recurrence plot for detecting multiple change-points in multivariate time series. The Bhattacharyya distance function is applied to improve recurrence plot generation so as to capture dependency changes among variables. A window-based detection algorithm is proposed to capture the change-points quickly and automatically. With experiments on artificial and real datasets, we show that the algorithm improves on the traditional recurrence plot, is able to handle noisy data with optimized parameters, and can cope with complex situations such as human activity and micro-blog event monitoring.
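As a sketch of the core ingredient, the code below computes the Bhattacharyya distance between two sliding windows of a multivariate series, each modelled as a Gaussian, and locates the window pair with the largest distance; the window length, the Gaussian assumption and the toy data are illustrative, and the paper builds a full recurrence plot from such distances rather than a single scan.

```python
import numpy as np

def bhattacharyya_gaussian(X, Y, eps=1e-6):
    """Bhattacharyya distance between two windows of a multivariate time
    series, each modelled as a Gaussian (rows = time, columns = variables)."""
    mu1, mu2 = X.mean(axis=0), Y.mean(axis=0)
    S1 = np.cov(X, rowvar=False) + eps * np.eye(X.shape[1])
    S2 = np.cov(Y, rowvar=False) + eps * np.eye(Y.shape[1])
    S = 0.5 * (S1 + S2)
    d = mu1 - mu2
    term1 = 0.125 * d @ np.linalg.solve(S, d)
    term2 = 0.5 * np.log(np.linalg.det(S) /
                         np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2

# Toy 2-variable series whose correlation structure changes at t = 200
rng = np.random.default_rng(1)
a = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=200)
b = rng.multivariate_normal([0, 0], [[1.0, -0.8], [-0.8, 1.0]], size=200)
series, w = np.vstack([a, b]), 40

scores = [bhattacharyya_gaussian(series[t - w:t], series[t:t + w])
          for t in range(w, len(series) - w)]
print("largest dissimilarity near index:", w + int(np.argmax(scores)))  # ~200
```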

Keywords: Change-point Detection, Multivariate Time Series, Recurrence Plot, Bhattacharyya Distance

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #53, pp. 592-599


Title of the Paper: Parameter Optimization in Decision Tree Learning by Using Simple Genetic Algorithms

Authors: Michel Camilleri, Filippo Neri

Abstract: The process of identifying the optimal parameters for an optimization algorithm or a machine learning one is a costly combinatorial problem, because it involves the search of a large, possibly infinite, space of candidate parameter sets. Our work compares grid search with a simple genetic algorithm when used to find the optimal parameter setting for an ID3-like learner operating on given datasets.
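A minimal sketch of the grid-search baseline, tuning two parameters of an entropy-based (ID3-like) decision tree; the dataset, parameter grid and scikit-learn implementation are illustrative assumptions. The genetic-algorithm alternative compared in the paper would search the same space with a population of candidate settings instead of exhaustive enumeration.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Exhaustive grid search over two decision-tree parameters,
# scored by 5-fold cross-validated accuracy.
param_grid = {
    "max_depth": [2, 3, 4, 5, 6, None],
    "min_samples_leaf": [1, 2, 4, 8, 16],
}
search = GridSearchCV(
    DecisionTreeClassifier(criterion="entropy", random_state=0),
    param_grid, cv=5, scoring="accuracy",
)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("best CV accuracy:", round(search.best_score_, 3))
```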

Keywords: Machine Learning, Evolutionary Algorithms, Parameter Optimization

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #52, pp. 582-591


Title of the Paper: Privacy Preserving Association Rule Mining by Concept of Impact Factor Using Item Lattice

Authors: B. Janakiramaiah, A. RamaMohan Reddy

Abstract: Association rules revealed by association rule mining may contain some sensitive rules, which may pose potential threats to privacy and protection. Association rule hiding is a competent solution that helps enterprises keep away from the hazards caused by sensitive knowledge leakage when sharing data in their collaborations. This study shows how to protect actionable knowledge for strategic decisions while at the same time not losing the great benefit of association rule mining. A new algorithm is proposed to eradicate sensitive knowledge from the released database, based on the intersection lattice and the impact factor of items in sensitive association rules. The proposed algorithm specifies the victim item such that the alteration of this item causes the slightest impact on the non-sensitive frequent association rules. Experimental results demonstrate that our proposed algorithm is appropriate in a real context and can achieve significant improvement over other approaches in the literature.

Keywords: Frequent itemset lattice, Sensitive itemset grouping, Privacy preserving, Hiding association rules

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #51, pp. 567-581


Title of the Paper: Evaluation of Clusters for Climate Data

Authors: Dzenana Donko, Nejra Hadzimejlic, Nijaz Hadzimejlic

Abstract: Climate data analysis is a progressive research area that focuses on the analysis of changing climate conditions, the investigation of climate phenomena and the evaluation of interconnections among climate conditions. Data mining techniques introduce an effective and efficient way to analyze large amounts of data in climatology. This paper presents an algorithm for climate data analysis using clustering data mining techniques. The developed solution evaluates climate data from different points of view in order to provide a complete view of the data. Climate research experts can use these results to draw their own conclusions and perform detailed climate change analysis. Climate data are represented graphically as a map of measured climate parameters, a map of climate clusters identified at a specified moment in time and a map of evolution steps identified between consecutive time slices.

Keywords: Data mining, clustering, climate, hierarchical clustering, meteorology, evolving clusters

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #50, pp. 556-566


Title of the Paper: VBS2 Scenarios Development for PSI Purposes

Authors: Petr Svoboda, Jiri Sevcik

Abstract: This article presents research focused on the development of Virtual Battlespace 2 scenarios for private security industry (PSI) purposes. In the first part, the VBS2 scenario creation options are described. In the second part, a description of the scenario representing the most common workload of PSI employees can be found. In the last part, the process of developing this scenario in the 2D and 3D editors is described, especially the addition of entities which are essential for the simulation, and object preparation and modification.

Keywords: Burglary, Private security forces, Private security industry, Scenario development, Training simulation, Virtual Battlespace 2

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #49, pp. 549-555


Title of the Paper: Detecting DRDoS Attack by Log File Based IP Pairing Mechanism

Authors: P. Mohana Priya, V. Akilandeswari, S. Mercy Shalinie

Abstract: As the number of security threats and attacks increases, the need for flexible and automated network security mechanisms also increases. The main objective of this paper is to propose a Reflection Attack Log File (RALF) based IP pairing detection method to detect the TCP-SYN reflection attack. The proposed RALF-based IP pairing detection method is suitable for all types of protocols, such as TCP, UDP and ICMP packets, and belongs to the category of protocol-independent detection methods. The RALF-based IP pairing detection method involves log files which comprise the details of source and destination addresses that serve as the comparative parameters for detecting the TCP-SYN reflection attack. In the experimental analysis, the performance of the proposed method is analyzed with Distributed Denial of Service (DDoS) and Distributed Reflection Denial of Service (DRDoS) attack traffic. This method achieves a 99% True Positive Rate (TPR) and a 1% False Positive Rate (FPR) when compared to the existing reflected attack detection method. The proposed RALF-based IP pairing detection method effectively detects TCP-SYN reflection attacks before the attack reaches the target server. The results show that the proposed RALF-based IP pairing detection method detects attack traffic with the highest probability.

Keywords: DDoS attack, DRDoS attack, Reflection attack, TCP-SYN Reflection attack, High-rate flooding attacks, Log file

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #48, pp. 538-548


Title of the Paper: Features Analysis and Fuzzy-SVM Classification for Tracking Players in Water Polo

Authors: Vladimir Pleština, Vladan Papić

Abstract: This paper presents a novel approach for detecting and tracking humans in water. The uniqueness of the tracked objects has been defined after analysis of standard color models. Based on the analysis results, YCbCr is proposed as the best color model for the targeted application. Furthermore, the relation of the Cb and Cr components for different categories of targeted objects (object parts) was analyzed and used as features for the classifier. A Fuzzy-SVM classifier is proposed as the best solution for this particular domain of problems. Unlike other Fuzzy-SVM methods, the presented method is focused on fuzzy logic and applies a binary SVM only in special situations when the classification of the input data is uncertain. In order to test and evaluate the hypothesis, the proposed method was compared to standard classification methods. Experimental results demonstrate the validity and efficiency of the proposed approach.

Keywords: Fuzzy SVM, YCbCr, classification, water polo, tracking player, feature extraction

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #47, pp. 528-537


Title of the Paper: Video Saliency Detection by Spatio-Temporal Sampling and Sparse Matrix Decomposition

Authors: Yunfeng Pan, Qiuping Jiang, Zhutuan Li, Feng Shao

Abstract: In this paper, we present a video saliency detection method based on spatio-temporal sampling and sparse matrix decomposition. In the method, we sample the input video sequence into three planes: the X-T slice plane, the Y-T slice plane, and the X-Y slice plane. Then, the motion saliency map is extracted from the X-T and Y-T slices, and the static saliency map is extracted from the X-Y slices by low-rank matrix decomposition. Finally, these maps are retransformed into the X-Y image domain and combined with a central bias prior to obtain the video saliency maps. Extensive results on the ASCMN dataset demonstrate that the proposed video saliency model achieves higher subjective and objective performance.

Keywords: saliency detection, spatio-temporal sampling, sparse matrix decomposition, motion saliency, static saliency

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #46, pp. 520-527


Title of the Paper: ANN Models Optimized Using Swarm Intelligence Algorithms

Authors: N. Kayarvizhy, S. Kanmani, R. V. Uthariaraj

Abstract: The Artificial Neural Network (ANN) has found widespread application in the field of classification. Many domains have benefited from the use of ANN-based models over traditional statistical models for their classification and prediction needs. Many techniques have been proposed to arrive at optimal values for the parameters of the ANN model to improve its prediction accuracy. This paper compares the improvement in prediction accuracy of an ANN when it is trained using swarm intelligence algorithms. Swarm intelligence algorithms are inspired by the natural social behaviour of groups of biological organisms. Models have been formulated for evaluating the various ANN-swarm intelligence combinations. Fault prediction in object-oriented systems through the use of OO metrics has been considered as the objective function for the models. The swarm intelligence algorithms considered in this paper are Particle Swarm Optimization, Ant Colony Optimization, Artificial Bee Colony Optimization and Firefly. The object-oriented metrics and fault information for the analysis have been taken from the NASA public dataset. The models are compared for their convergence speed and improvement in prediction accuracy over traditional ANN models. The results indicate that swarm intelligence algorithms bring improvement over ANN models trained with gradient descent.
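As a hedged sketch of the general idea (not the authors' models or dataset), the code below uses particle swarm optimization instead of gradient descent to fit the weights of a tiny one-hidden-layer network on toy data; the network size, PSO coefficients and data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (stand-in for OO-metric fault data)
X = rng.standard_normal((200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

H = 5                                # hidden units
dim = 4 * H + H + H + 1              # total number of weights and biases

def forward(w, X):
    W1 = w[:4 * H].reshape(4, H); b1 = w[4 * H:4 * H + H]
    W2 = w[4 * H + H:4 * H + 2 * H]; b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(w):
    return np.mean((forward(w, X) - y) ** 2)

# Standard PSO: velocity update with inertia, cognitive and social terms
n, iters, w_in, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(-1, 1, (n, dim)); vel = np.zeros((n, dim))
pbest = pos.copy(); pbest_f = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    f = np.array([loss(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

acc = np.mean((forward(gbest, X) > 0.5) == y)
print(f"PSO-trained ANN training accuracy: {acc:.2f}")
```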

Keywords: Artificial Neural Network, Swarm Intelligence, Particle Swarm Optimization, Ant Colony Optimization, Artificial Bee Colony Optimization, Firefly

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #45, pp. 501-519


Title of the Paper: The Indicator of Cognitive Diversity in Project-Based Learning

Authors: Yassine Benjelloun Touimi, Nourrdine El Faddouli, Samir Bennani, Mohammed Khalidi Idrissi

Abstract: The development of ICTs has given rise to several electronic platforms, which have contributed to a new mode of digital learning: e-learning. Moreover, new pedagogical approaches based on group learning have been adopted, including project-based teaching. Project-based learning is an active learning method, based on group work, used to develop skills and build knowledge. However, the group of learners faces many challenges during the project, such as decision-making. Decisions made by the group generate convergences and divergences among the members, due to cognitive conflict. Our approach in this paper is to treat the cognitive conflict in the group of students by measuring a cognitive diversity indicator concerning the concepts of the project in decision-making situations.

Keywords: Project-based learning, analytical hierarchy process, decision-making, cognitive map, Shannon entropy, cognitive diversity indicator

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #44, pp. 492-500


Title of the Paper: MQVC: Measuring Quranic Verses Similarity and Sura Classification Using N-gram

Authors: Mohammed Akour, Izzat Alsmadi, Iyad Alazzam

Abstract: Extensive research efforts in the area of information retrieval have concentrated on developing retrieval systems for the Arabic language covering the different natural language and information retrieval methodologies. However, little effort has been devoted in those areas to knowledge extraction from the Holy Book of Islam, the Quran. In this paper, we present an approach (MQVC) for retrieving the verses most similar to a user input verse used as a query. To demonstrate the accuracy of our approach, we performed a set of experiments and compared the results with an evaluation from a Quran specialist who manually identified all chapters and verses relevant to the targeted verse in our study. The MQVC approach was applied to 70 out of 114 Quran chapters. We picked 40 verses randomly and calculated the precision to evaluate the accuracy of our approach. We extended the work by using N-grams with a machine learning algorithm (the LibSVM classifier in Weka) to classify Quran chapters based on the most common scholarly classification: Makki and Madani chapters.
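As an illustration of N-gram based similarity between short texts, the sketch below computes character n-gram cosine similarity and ranks candidate verses against a query; the placeholder strings, n-gram length and ranking scheme are illustrative assumptions, not the MQVC implementation.

```python
from collections import Counter
from math import sqrt

def ngrams(text, n=3):
    """Character n-gram profile of a text (works for Arabic or transliteration)."""
    text = " ".join(text.split())
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Illustrative corpus of "verses" (placeholders, not actual Quranic text)
verses = {
    "v1": "in the name of god the most gracious the most merciful",
    "v2": "praise be to god lord of the worlds",
    "v3": "the most gracious the most merciful",
}
query = "most gracious most merciful"

q = ngrams(query)
ranked = sorted(verses.items(), key=lambda kv: cosine(q, ngrams(kv[1])), reverse=True)
for vid, text in ranked:
    print(vid, round(cosine(q, ngrams(text)), 3), text)
```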

Keywords: Text Classification, Quranic Verses Similarity, Madani and Makki Chapters classification

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #43, pp. 485-491


Title of the Paper: Semantic Similarity Based Web Document Classification Using Artificial Bee Colony (ABC) Algorithm

Authors: C. Kavitha, G. Sudha Sadasivam, S. Kiruthika

Abstract: Due to the exponential growth of information on the Internet and the emergent need to organize it, automated categorization of documents into predefined labels has received ever-increasing attention in recent years for efficient information retrieval. The relevancy of retrieved information can also be improved by considering semantic relatedness between words, which is a basic research area in fields like natural language processing, intelligent retrieval, document clustering and classification, and word sense disambiguation. Web search engine based semantic relatedness obtained from a huge web corpus can improve the classification of documents. This paper proposes an approach for web document classification that exploits information including both page counts and snippets, and also proposes the use of the Artificial Bee Colony (ABC) algorithm as a new tool in the classification task. To identify the semantic relations between the query words, a lexical pattern extraction algorithm is applied to the snippets. A sequential pattern clustering algorithm is used to form clusters of different documents. The page count based measures are combined with the clustered documents to define the features extracted from the documents. These features are used to train the ABC algorithm in order to classify the web documents.

Keywords: Artificial Bee Colony (ABC) algorithm, Document Classification, Term Document Frequency, Latent Semantic Indexing (LSI), Web Search Engine

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #42, pp. 476-484


Title of the Paper: Cognitive Radio Resource Management Using Multi-Agent Systems, Auctions and Game Theory

Authors: Asma Amraoui, Badr Benmammar, Francine Krief

Abstract: In the last few years, we have witnessed impressive growth in wireless communication due to the popularity of smartphones and other mobile devices. With the emergence of application domains such as sensor networks, smart grid control, and medical wearable and embedded wireless devices, we are seeing increasing demand for unlicensed bandwidth. There has also been increasing interest from the wireless community in the use of game theory and multi-agent systems. Our aims in this article are to summarize the different uses of game theory in wireless networks, with a focus on its use in cognitive radio networks; to discuss how multi-agent systems can be applied to solve the problem of radio resource management; and finally to give the results of our simulations combining auction theory with multi-agent systems for negotiation between agents.

Keywords: Game theory, Wireless Networks, Cognitive radio network, Multi-agent systems, Auctions theory

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #41, pp. 463-475


Title of the Paper: Tightly Cooperative Caching Approach in Mobile Ad Hoc Network

Authors: Nwe Nwe Htay Win, Bao Jianmin, Cui Gang, Dalaijargal Purevsuren

Abstract: A mobile ad hoc network (MANET) relies on the cooperation of all participating nodes. Mobile nodes are vulnerable to selfish/malicious nodes, which can slow down data communication and degrade network performance by refusing to relay packets in order to save their own resources such as power and time. These non-cooperative behaviors become challenging in a cooperative caching system, which depends mainly on cooperative nodes. To solve this problem, we propose a tightly cooperative caching approach (TCCA) to encourage all nodes to cooperate aggressively, using an enhanced credit-based distribution algorithm based on the resource status of the mobile nodes and the demand volume of the requests, in order to achieve load balancing among acceptor nodes and requester nodes. According to the experimental results, our approach achieves superior results compared with other approaches under different parameter settings of nodes and network structures.

Keywords: mobile ad hoc network, cooperative caching, credit-based distribution, load balancing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #40, pp. 452-462


Title of the Paper: Implementation of Efficient Bit Permutation Box for Embedded Security

Authors: Nishchal Raval, Gaurav Bansod, Narayan Pisharoty

Abstract: Security in real-time applications is of utmost importance. The secure architectures implemented in automobiles, such as EVITA (E-safety Vehicle Intrusion protected Application) and SEVECOM (Secure Vehicle Communication), have rich cryptographic properties but a larger footprint area and high power consumption. These existing architectures use standard engines like AES (Advanced Encryption Standard), elliptic curves and hash engines, which are heavy in memory requirements and consume more power, so their reach is limited to high-end systems consisting of large-bit processors and coprocessors. The role of a bit permutation instruction in a cryptographic environment is well proven. GRP (Group Operations) and OMFLIP (Omega-Flip) networks are bit permutation instructions, and their implementation in hardware not only accelerates software cryptography but also results in less footprint area and low power consumption. This paper proposes a novel implementation and analysis of GRP and OMFLIP architectures for security in small-scale embedded networks. A hybrid implementation is analysed and its results are compared with the ‘P’ box of standard algorithms like AES and DES (Data Encryption Standard). This paper shows that GRP needs much less memory space than other bit permutation instructions and will be useful for designing lightweight ciphers.

Keywords: OMFLIP, GRP, Embedded Security, Automotive, Encryption, Bit permutation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #39, pp. 442-451


Title of the Paper: Fuzzy Causal Inferences Based on Fuzzy Semantics of Fuzzy Concepts in Cognitive Computing

Authors: Yingxu Wang

Abstract: Fuzzy semantics comprehension and fuzzy inference are two of the central abilities of human brains that play a crucial role in thinking, perception, and problem solving. A formal methodology for rigorously describing and manipulating fuzzy semantics and fuzzy concepts is sought for bridging the gap between humans and cognitive fuzzy systems. A mathematical model of fuzzy concepts is created based on concept algebra as the basic unit of fuzzy semantics for denoting language entities in semantic analyses. The semantic operations of fuzzy modifiers and qualifiers on fuzzy concepts are introduced to deal with complex fuzzy concepts. On the basis of the fuzzy semantic models, fuzzy causations and fuzzy causal inferences are formally elaborated by algebraic operations. The denotational mathematical structure of fuzzy semantics and fuzzy inferences not only explains the fuzzy nature of linguistic semantics and its comprehension, but also enables cognitive machines and fuzzy systems to mimic human fuzzy inference mechanisms in cognitive informatics, cognitive linguistics, cognitive computing, and computational intelligence.

Keywords: Fuzzy systems, fuzzy semantics, fuzzy concepts, fuzzy inference, cognitive linguistics, cognitive informatics, cognitive computing, soft computing, computational intelligence

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #38, pp. 430-441


Title of the Paper: Change Theory: Towards a Better Understanding of Software Maintenance

Authors: Amal Abdel-Raouf, Maeda Hanafi

Abstract: A successful software maintenance process depends on three factors: the maintenance goals, the technical properties of the system and the people performing the software maintenance. Most of the current work investigating software maintenance considers only the first two factors, ignoring the third, which limits the scope and accuracy of these approaches. In this paper, we use change theory to introduce a deeper understanding of the software maintenance process. We utilize three change theories (Lewin’s, Prochaska and DiClemente’s, and Lippit’s) to introduce three different software maintenance models. These models consider the three success factors and incorporate contextual information to help maintainers better understand the software maintenance task and bring about an effective change.

Keywords: Software Engineering, Software Maintenance, Software Quality, Software Properties, Change Theory, Change Stages

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #37, pp. 421-429


Title of the Paper: A Flexible and Efficient System Development Approach for Enterprise System and Smartphone Application

Authors: Michiko Oba, Taku Yamaguchi

Abstract: The business environment has been changing more and more rapidly year by year. However, information systems do not keep up with this rapid change, so a gap arises between business strategy and IT. On the other hand, the use of smartphones in enterprise systems has been increasing, and the possibility of using BYOD (Bring Your Own Device) is also increasing. However, smartphone applications must cope with different platforms such as iOS and Android, so development effort is increasing. To approach this problem, this paper combines the two concepts of SOA and BPM. In the proposed approach, we apply an agile development process in which changes in requirements are readily accepted. The gathered requirements are listed and prioritized at the beginning of development. The functions are then developed one by one, starting with the highest priority. When requirements change or are added, the priorities are reviewed. In this way, our approach accepts changes and additions to the requirements and develops them according to priority. As a result, a system implementation that is worthwhile for the customer within the delivery date becomes possible. In this paper, we propose a design approach that combines BPM, SOA, and agile development for enterprise system and smartphone application development in which the specification is not fixed. Additionally, we demonstrate the effectiveness of our proposal through application experience.

Keywords: Agile Development Process, BPM, Component, Feature List, Information System Development, SOA, Enterprise system, Smartphone Application

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #36, pp. 414-420


Title of the Paper: ImpNet: Programming Software-Defined Networks Using Imperative Techniques

Authors: Mohamed A. El-Zawawy, Adel I. AlSalem

Abstract: Software and hardware components are basic parts of modern networks. However, the software component is typically sealed and function-oriented, so it is very difficult to modify these components. This has hindered networking innovation. Moreover, it has resulted in network policies having complex interfaces that are not user-friendly, and hence in huge and complicated flow tables on the physical switches of networks, which greatly degrades network performance in many cases. Software-Defined Networks (SDNs) constitute a modern network architecture intended to overcome the issues mentioned above. The idea of SDN is to add to the network a controller device that manages all the other devices on the network, including the physical switches. One of the main tasks of the management process is switch learning, achieved by programming the physical switches of the network, i.e. adding or removing packet-processing rules to/from the switches, more specifically to/from their flow tables. A high-level imperative network programming language, called ImpNet, is presented in this paper. ImpNet enables writing efficient, yet simple and powerful, programs to run on the controller and control all other network devices, including switches. ImpNet is compositional, simply structured, expressive, and, more importantly, imperative. The syntax of ImpNet, together with two types of operational semantics for ImpNet constructs, is presented in the paper. The proposed semantics are of the static and dynamic types. Two modern applications programmed using ImpNet are also shown in the paper, together with their semantics.

Keywords: Network programming languages, Controller-switch architecture, Static operational semantics, Dynamic operational semantics, Syntax, ImpNet, Software-defined networks

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #35, pp. 402-413


Title of the Paper: 3D Shape Retrieval Using the Characteristics Level Cuts

Authors: Mustafa Elkhal, Abdelaghni Lakehal, Khalid Satori

Abstract: The technological development of 3D modeling has led to the proliferation of a large number of 3D object databases, which requires techniques for indexing and retrieving these 3D models. In this paper, we propose a method based on binary images extracted from the 3D object called “level cuts” (LC). These level cuts are obtained by intersecting the 3D object with a set of equidistant parallel planes. From the LC set we extract a descriptor vector, using the Hu moment descriptor to index the set. To measure the similarity between 3D objects we compute the Hausdorff distance between descriptor vectors. The performance of this method is evaluated against two well-known 3D object descriptors, using the NTU (National Taiwan University) database.
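A sketch of the two building blocks named in the abstract, Hu moments of a binary level-cut image (via OpenCV) and the Hausdorff distance between descriptor sets (via SciPy); the synthetic cuts and the symmetric-distance wrapper are illustrative assumptions.

```python
import cv2
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hu_descriptor(binary_cut):
    """7 log-scaled Hu moments of one binary level-cut image."""
    m = cv2.moments(binary_cut.astype(np.uint8), binaryImage=True)
    hu = cv2.HuMoments(m).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def object_descriptor(cuts):
    """Stack the Hu descriptors of all level cuts of one 3D object."""
    return np.array([hu_descriptor(c) for c in cuts])

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two descriptor sets."""
    return max(directed_hausdorff(A, B)[0], directed_hausdorff(B, A)[0])

def disk(r, size=64):
    """Synthetic circular level cut of radius r (stand-in for a real slice)."""
    yy, xx = np.mgrid[:size, :size]
    return (xx - size // 2) ** 2 + (yy - size // 2) ** 2 < r ** 2

# Two toy "objects", each represented by a few synthetic level cuts
obj_a = object_descriptor([disk(r) for r in (10, 15, 20)])
obj_b = object_descriptor([disk(r) for r in (11, 16, 21)])
print("dissimilarity:", hausdorff(obj_a, obj_b))
```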

Keywords: 3D Shape retrieval, X-means Algorithm, Characteristics Level Cut, Vector descriptor, similarity measuring

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #34, pp. 394-401


Title of the Paper: Design and Implementation of a Web-Based E-Procurement System Platform for Shipping Line

Authors: Yeh-Cheng Chen, C. N. Chu, H. M. Sun, Ruey-Shun Chen, Y. H. Yang

Abstract: Establishing an e-procurement system platform for a global shipping line can not only solve the problems of purchase and delivery difficulties, but also help enterprises improve the efficiency of procurement. The goal of this research is to develop a web-based e-procurement platform that can actually be put into operation in multinational shipping companies. The implementation of the e-procurement system at W. Shipping Corporation over about one year showed that the average procurement time decreased by 80%, the number of full-time procurement personnel went down by 50%, annual telephone and stationery expenditures were reduced by 66%, the document processing time of procurement personnel went down by 89%, and value analysis time went up by 25%. Moreover, the system eliminates unnecessary negotiation contacts and documentation between purchasers and suppliers. For the required supplies of ships, suppliers can be informed in advance to deliver the goods before the ships berth at any of the worldwide operation points or ports. This service design can improve the competitiveness of enterprises.

Keywords: E-procurement platform, Enterprise Resource Planning (ERP), Shipping line

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #33, pp. 381-393


Title of the Paper: Frequent Segment Clustering of Test Cases for Test Suite Reduction

Authors: Narendra Kumar Rao B., A. Ramamohan Reddy

Abstract: Execution profiles are indicators of a program's code coverage; this has been demonstrated by researchers on a large scale through their contributions on the subject. Test suite reduction achieves code coverage with a minimum number of test cases while ensuring that all code items have been tested. It is a Non-deterministic Polynomial-time Complete (NP-Complete) problem. A few good approaches, such as the greedy approach and the Harrold, Soffa and Gupta (HGS) approach, have been used in the literature. The current work achieves similar milestones with further reduced test cases. This paper presents maximal frequency itemset clustering and sequencing of similar test cases, residue code requirements based test case reduction, and modification based test case selection. In the current work, some interesting results were found, wherein test cases with similar program traces were greatly reduced while ensuring a high code coverage percentage during testing. Fault detection can be tuned by selecting test cases from similar groups.

Keywords: Clustering, Test Suite Reduction, Test case coverage, Program Profiles, Test case Selection, Regression Testing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #32, pp. 368-380


Title of the Paper: The Effects of Multimedia Elements on Learning Achievements in Digital Content

Authors: Kyung-Bo Noh, Ki-Sang Song, Sang Chun Nam, Se Young Park

Abstract: Because the characteristics of e-learning environments include the learner's isolation from the instructor and from peer learners, providing social cues or emotional feedback by inserting pedagogical agents into digital content has been seriously considered. However, introducing pedagogical agents into digital content remains controversial: the pros are fostering learner motivation and learning outcomes, and the cons include the potential to distract learners from the learning content. To measure the differences in cognitive load needed for e-learning, we used learners' eye-movement data taken from an eye tracker and identified the impact of applying a pedagogical agent. Forty-five high school students were divided into three groups, each group using one of three types of e-learning content: image- and text-based multimedia with a pedagogical agent (G1), multimedia as figures and text with narration (G2), and multimedia only (G3). While learners used the content, their eye-movement data were recorded and analyzed with the EyeWorks software. Self-reported cognitive loads and learning achievements were also used to analyze learners' performance. From these experiments, we found that the groups using pedagogical agents and narration (G1 and G2) outperformed the multimedia-only group (G3); these results show the positive impact of narration and pedagogical agents in e-learning environments.

Keywords: Pedagogical agent, Narration, Multimedia learning, Contents design, Eye-tracking, Cognitive load

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #31, pp. 361-367


Title of the Paper: Feature Extraction Method for Epileptic Seizure Detection Based on Cluster Coefficient Distribution of Complex Network

Authors: Fenglin Wang, Qingfang Meng, Yuehui Chen, Yuzhen Zhao

Abstract: Automatic epileptic seizure detection has important research significance in clinical medicine. The feature extraction method for epileptic EEG occupies a core position in the detection algorithm, since it strongly affects the algorithm's performance. In this paper, we propose a novel epileptic EEG feature extraction method based on a statistical property from complex network theory. The EEG signal is first converted to a complex network, and the cluster coefficients of every node in the network are computed. Through analysis of the cluster coefficient distribution, the partial sum of the cluster coefficient distribution is extracted as the classification feature. A public epileptic EEG dataset was utilized to evaluate the classification performance of the extracted feature. Experimental results show that the extracted feature achieves a classification accuracy of up to 94.50%, which indicates that it can clearly distinguish between ictal EEG and interictal EEG. The high classification accuracy demonstrates the extracted feature's great potential for real-time epileptic seizure detection.
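The paper's specific mapping from EEG to a complex network is not reproduced here; as a hedged sketch, the code below converts a signal segment into a horizontal visibility graph (one common time-series-to-network mapping, assumed only for illustration), computes the clustering coefficient of every node with NetworkX, and takes a partial sum of the sorted coefficient distribution as a feature.

```python
import networkx as nx
import numpy as np

def horizontal_visibility_graph(x):
    """Map a 1-D signal to a network: samples i and j are linked if every
    sample strictly between them is lower than both (horizontal visibility)."""
    g = nx.Graph()
    g.add_nodes_from(range(len(x)))
    for i in range(len(x) - 1):
        max_between = -np.inf
        for j in range(i + 1, len(x)):
            if max_between < min(x[i], x[j]):
                g.add_edge(i, j)
            if x[j] >= x[i]:          # later samples can no longer see i
                break
            max_between = max(max_between, x[j])
    return g

def cluster_feature(x, fraction=0.5):
    """Partial sum of the sorted clustering-coefficient distribution."""
    g = horizontal_visibility_graph(x)
    cc = np.sort(np.array(list(nx.clustering(g).values())))[::-1]
    k = max(1, int(fraction * len(cc)))
    return cc[:k].sum()

rng = np.random.default_rng(0)
irregular_segment = rng.standard_normal(256)                  # noise-like EEG segment
rhythmic_segment = np.sin(np.linspace(0, 24 * np.pi, 256))    # rhythmic EEG segment
print("feature (irregular):", round(cluster_feature(irregular_segment), 3))
print("feature (rhythmic): ", round(cluster_feature(rhythmic_segment), 3))
```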

Keywords: Epileptic Seizure Detection, Feature Extraction Method, Complex Network, Cluster Coefficient Distribution, Nonlinear Time Series Analysis, Electroencephalograph (EEG)

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #30, pp. 351-360


Title of the Paper: Improvising Web Search Using Concept Based Clustering

Authors: Indumathi D., Chitra A., Bineeshia J.

Abstract: The user profile is an elementary component of any application based on personalization. Existing user-profile strategies consider only objects which interest the user (the user's positive preferences), and not objects which do not interest the user (the user's negative preferences). This paper focuses on personalization in search engines, and the proposed approach consists of three steps. First, an algorithm for concept extraction is employed, in which concepts are extracted and the relations between these concepts are obtained from the web snippets returned by the search engine. Second, a user-profile strategy is employed to build a concept-based user profile which predicts the conceptual preferences of the user. Building the user profile comprises identifying concept preference pairs by the Spy Naive Bayes Classifier (Spy NB-C) method and learning the user's preferences, represented by feature weight vectors, with a Ranking Support Vector Machine (RSVM). Third, the concept relations, together with the predicted conceptual preferences of the user, are given as input to a personalized concept-based clustering algorithm to find the conceptually related queries. To cluster ambiguous queries into different clusters of queries, a personalized clustered query-concept bipartite graph is created using the extracted concepts and click-through data. This suggests personalized query recommendations to individual users based on their interests. From the experimental results, it is observed that the user profile which captured both kinds of user preferences increased the separation between dissimilar queries and similar queries. Improvements in the F-measure and DCG score show that the resulting query clusters provide personalized results to the users.

Keywords: concept extraction, negative preferences, personalization, concept clusters, search engine, user profile

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #29, pp. 338-350


Title of the Paper: Software Design of System SMPSL

Authors: Radek Nemec, Stepan Hubalovsky

Abstract: One of the most important methods in current scientific, technological and educational research is the modeling and simulation of real experiments, as well as the modeling and simulation of real experimentally measured data. Modeling and simulation is a discipline with its own theory and its own research and educational methodology. The paper briefly focuses on the theory of the modeling and simulation process as an important educational method. The process of modeling and simulation is demonstrated step by step through the creation of a user application for the SMPSL system. The SMPSL system is a computer-based measurement system for the school laboratory. The system was designed as a cheap and flexible interface for recording values and presenting them, and it must be controlled by appropriate software. Therefore, a research investigation was conducted to find the best user application: from a selection of three options and the accompanying comments, the most suitable solution was chosen and modified.

Keywords: Software design, user interface, DAQ, eProDas, data acquisition, experiment, education

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #28, pp. 329-337


Title of the Paper: Routing Virtualization Intended for a Real Time Simulation over a Testbed designed for IPv4/IPv6 Transition Techniques

Authors: Sheryl Radley, Shalini Punithavathani D., Mari Kumar B.

Abstract: The swift growth of the Internet has caused IPv6 to loom on the horizon. The IPv4-IPv6 transition poses several challenges to the world of the Internet as it migrates from IPv4 to IPv6. The IETF proposes transition techniques which include dual stack, translation and tunneling. A transition allows IPv4/IPv6 coexistence and interoperability, in order to maintain the end-to-end model that the Internet is built on. The three individual mechanisms do not provide a thorough solution. To address this need we have developed a testbed using the real-time simulator Packet Tracer 6.0.1 for Routing Virtualization (RV) using a single physical router, and have compared the different transition techniques, demonstrating high scalability and reachability. Throughput is observed in the test analysis. Different parameters are also compared and studied for the different transition mechanisms in the access, distribution and core networks.

Keywords: IPv4, IPv6, Dual stack, Tunneling and Translation, Routing Virtualization

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #27, pp. 320-328


Title of the Paper: Data-Driven Saliency Region Detection Based on Undirected Graph Ranking

Authors: Wenjie Zhang, Qingyu Xiong, Shunhan Chen

Abstract: Highlighting salient regions is still a challenging problem in computer vision. In this paper, we present a data-driven salient region detection method based on undirected graph ranking. It consists of two steps: we first compute a prior saliency map on the super-pixel image by combining region contrast and center prior information, and then extract the saliency map with an optimized ranking function based on a new affinity matrix. The method is simple and efficient. Furthermore, salient objects can be successfully highlighted with precise details and high consistency. We evaluate the proposed method on three image datasets. The experimental results show that the proposed approach performs well in terms of the PR curve and the ROC curve.

Keywords: Image saliency, salient region, affinity matrix, undirected graph ranking

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #26, pp. 310-319


Title of the Paper: Levenberg-Marquardt Learning Neural Network For Part-of-Speech Tagging of Arabic Sentences

Authors: Hasan Muaidi

Abstract: Part-of-speech tagging is usually the first step in linguistic analysis. It is also a very important intermediate step in building many natural language processing applications. This paper examines the application of neural networks to the task of tagging Arabic sentences. The network is trained with the help of the Levenberg-Marquardt learning algorithm. A corpus of 24,810 words was collected and manually tagged to train the neural networks and to test the performance of the developed POS tagger. The developed tagger achieved an accuracy of 98.83% when evaluated on the training set and 90.21% on the test set. The performance of the Levenberg-Marquardt learning algorithm was compared with the performance of the traditional backpropagation learning algorithm. It was found that the Levenberg-Marquardt learning neural network is an efficient approach and more effective than the traditional backpropagation learning algorithm for tagging Arabic words.
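For reference, a standard statement of the Levenberg-Marquardt weight update (notation assumed here, since the paper's exact formulation is not reproduced) is

\Delta \mathbf{w} = -\left(\mathbf{J}^{\top}\mathbf{J} + \mu \mathbf{I}\right)^{-1} \mathbf{J}^{\top} \mathbf{e},

where \mathbf{J} is the Jacobian of the network errors \mathbf{e} with respect to the weights \mathbf{w} and \mu is the damping factor; a small \mu gives Gauss-Newton-like steps, while a large \mu approaches gradient descent with a small step size.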

Keywords: Part of Speech Tagging, Arabic Language, Neural Networks, Levenberg-Marquardt Learning Algorithm, Backpropagation Learning Algorithm

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #25, pp. 300-309


Title of the Paper: ROCK Clustering Algorithm Based on the P System with Active Membranes

Authors: Yuzhen Zhao, Xiyu Liu, Wenping Wang

Abstract: The ROCK algorithm plays an important role in data mining and data analysis, helping people discover knowledge from large amounts of data. In this paper, an improved ROCK algorithm based on the P system with active membranes is constructed. Since the P system has great parallelism, it can reduce the computational time complexity and is suitable for the clustering problem. All the rules of the proposed algorithm are designed in this paper. Experimental results show that the proposed algorithm is appropriate for clustering large datasets. The proposed improved ROCK algorithm is a new attempt at applying membrane systems and provides a novel perspective on cluster analysis.

Keywords: Clustering Algorithm, Hierarchical Clustering, ROCK Algorithm, Membrane Computing, P System, Membrane System

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #24, pp. 289-299


Title of the Paper: Decision Tree Based Learning Approach for Identification of Operating System Processes

Authors: Amit Kumar, Shishir Kumar

Abstract: At present, various tools such as firewalls, anti-virus tools, network security tools, malware removal tools and monitoring tools are used to provide security for computer systems. The computer security tools available today need to be updated and monitored regularly; if computer users do not regularly update their security tools, their systems become vulnerable to viruses and other attacks. In this paper a learning system is proposed to identify operating system processes as Self and Non-Self, using the concepts of decision tree learning. The ID3 algorithm has been used to construct a decision tree after calculating the entropy and information gain. Initially, decision trees are generated using training examples, and the constructed decision trees are then tested with test data. Further, it is inferred from the experimental results that the decision tree learning approach provides better security through effective identification of Self and Non-Self processes.
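A minimal sketch of the entropy and information-gain computation that ID3 uses to choose the splitting attribute; the toy process observations and attribute names are illustrative, not the paper's process parameters.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, attr, target="label"):
    """Entropy reduction obtained by splitting `rows` on attribute `attr`."""
    base = entropy([r[target] for r in rows])
    remainder = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

# Toy process observations: illustrative parameters and Self / Non-Self labels
rows = [
    {"cpu": "high", "parent": "known",   "label": "Self"},
    {"cpu": "low",  "parent": "known",   "label": "Self"},
    {"cpu": "high", "parent": "unknown", "label": "Non-Self"},
    {"cpu": "low",  "parent": "unknown", "label": "Non-Self"},
    {"cpu": "high", "parent": "known",   "label": "Self"},
]
for attr in ("cpu", "parent"):
    print(attr, "gain =", round(information_gain(rows, attr), 3))
# ID3 would split on the attribute with the highest gain ("parent" here).
```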

Keywords: Self and Non Self Process, Process-Parameters, Decision Tree, ID3 (Iterative Dichotomiser 3), Entropy, Information Gain

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #23, pp. 277-288


Title of the Paper: Feature Coding for Image Classification Based on Saliency Detection and Fuzzy Reasoning and Its Application in Elevator Videos

Authors: Xiao Lv, Dingdong Zou, Lei Zhang, Shangyuan Jia

Abstract: Feature coding is a fundamental step in the bag-of-words model for image classification and has drawn increasing attention in recent works. However, the ambiguity problem still exists, and coding is also sensitive to unusual features. To improve stability and robustness, we introduce saliency detection and fuzzy reasoning rules to propose a novel coding scheme. In detail, saliency maps generated by saliency detection are first used to divide each image into salient and non-salient regions, and a structured dictionary is then obtained by combining two separate codebooks built from them. Secondly, fuzzy reasoning rules are introduced to choose the most salient and stable codewords to encode. Finally, the saliency maps are incorporated into a pooling operation, named saliency-based spatial pooling, to introduce spatial information. Experiments on several datasets demonstrate that our approach outperforms other coding methods in image classification. Furthermore, we also apply it to elevator video event classification, which shows its potential application in intelligent elevator video surveillance, such as overload detection, violence detection and video summarization.

Keywords: Image classification, feature coding, saliency detection, fuzzy reasoning, elevator video event

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #22, pp. 266-276


Title of the Paper: Balanced Constraint Measure Algorithm to Preserve Privacy from Sequential Rule Discovery

Authors: S. S. Arumugam, V. Palanisamy

Abstract: Preserving privacy from mining algorithms (data mining methods that extract information from the private data of people and organizations) is an emerging research area. Researchers are developing procedures to maintain a proper balance between information privacy and knowledge discovery through data mining. In this paper, we first use the PrefixSpan algorithm to generate sequential patterns from a medical database, and these patterns are converted into sequential rules. We then apply our proposed algorithm, which modifies the generated sequential rules using random values. The proposed algorithm evaluates each processed rule in terms of knowledge discovery and information loss. If the evaluated result satisfies the user-defined thresholds, the algorithm releases the modified sequential rules; otherwise, further iterations are carried out until the sequential rules satisfy the user-defined threshold values. Finally, an experiment is conducted to evaluate the proposed algorithm on the basis of knowledge discovery and information loss.

Keywords: Prefix span algorithm, Sequential rule, Significant disease, Balanced constraint, Knowledge discovery, Information loss

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #21, pp. 251-265


Title of the Paper: Objective Quality Assessment for Stereoscopic Images Based on Structure-Texture Decomposition

Authors: Kemeng Li, Shanshan Wang, Qiuping Jiang, Feng Shao

Abstract: We present a novel quality assessment index for stereoscopic images based on structure-texture decomposition. To be more specific, we decompose a stereoscopic image pair into its structure and texture components. Then, gradient magnitude similarity (GMS) and luminance-contrast similarity (LCS) indexes are used to measure the quality of the structure components and the texture components, respectively. Finally, the overall quality score is obtained by combining these quality scores in a non-fixed manner. Experimental results on two publicly available 3D image quality assessment databases demonstrate that, in comparison with related existing methods, the proposed technique achieves high consistency with subjective assessment.
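
For readers unfamiliar with the GMS term, the following minimal sketch computes a per-pixel gradient magnitude similarity map between a reference and a distorted image; the finite-difference gradients, the constant c and the mean pooling are illustrative choices, not the exact operators or combination scheme used by the authors.

import numpy as np

def gradient_magnitude(img):
    # Gradient magnitude via simple finite differences (a stand-in for the
    # Prewitt/Scharr filters typically used in GMS-based metrics).
    gy, gx = np.gradient(img.astype(float))
    return np.sqrt(gx ** 2 + gy ** 2)

def gms_map(ref, dist, c=170.0):
    # Per-pixel gradient magnitude similarity between two images.
    m_r, m_d = gradient_magnitude(ref), gradient_magnitude(dist)
    return (2 * m_r * m_d + c) / (m_r ** 2 + m_d ** 2 + c)

# Illustrative use on random 8-bit "images"; the pooling into a single score
# (mean here) is a placeholder for the paper's combination strategy.
ref = np.random.randint(0, 256, (64, 64))
dist = np.clip(ref + np.random.normal(0, 10, ref.shape), 0, 255)
print(gms_map(ref, dist).mean())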

Keywords: Quality assessment, structure-texture decomposition, gradient magnitude similarity, luminance-contrast similarity

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #20, pp. 242-250


Title of the Paper: TRPC: An Ensembled Online Prediction Mechanism for Trusted Recommender System

Authors: M. Parvathy, R. Ramya, K. Sundarakantham, S. Mercy Shalinie

Abstract: Recent intrusions in social networks make it inevitable to develop networks that users can depend on with confidence. Even though today's recommender systems use advanced parallelism in web development, achieving trustworthiness in such systems has remained a challenging task for several years. To overcome the existing sparsity, scalability and dynamism issues with new items/users, we propose a framework, TRust Propagation and Clustering (TRPC), to build a trust network using the social distance between every pair of users and a similarity measure over clustered users based on the users' tastes and preferences. Our proposed technique for predicting the ratings of items by users involves three major steps. The first step comprises both implicit and explicit social relationships and a propagation mechanism, the second step clusters the trusted users, and the third step predicts the products/subjects between them based on similarity criteria. The proposed rating prediction promises better recommendations for all buyers and online users who access the community. To validate the effectiveness of our work, we experimented with two real-world datasets, Epinions and MovieLens.

Keywords: Trust propagation, Trusted path, Clustering, Social networks, Recommendation systems, Similarity metrics

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #19, pp. 231-241


Title of the Paper: Optimization of Tree Pipe Networks Layout and Size, Using Particle Swarm Optimization

Authors: Ibrahim Mtolera, Li Haibin, Liu Ye, Su Bao-Feng, Du Xue, Maxiao-Yi

Abstract: Commonly used design methods for irrigation pipe network (IPN) layout and sizing often rely on a trial-and-error approach, which makes it difficult to minimize capital investment and energy cost. The objective of this study was to optimize the layout and size of irrigation pipe networks simultaneously using the particle swarm optimization (PSO) technique. The technique was coupled with MATLAB software to reduce pipeline investment cost in irrigation projects. A pipe layout and size optimization model for a tree irrigation pipe network is therefore presented in this paper. The performance of the PSO technique was tested and the results were compared with the non-optimized (step-by-step) and genetic algorithm optimization methods. As the search space grew, the proposed PSO technique responded quickly in terms of the swarm size and the initial swarm compared to the non-optimized (step-by-step) design method and the genetic algorithm.
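
A bare-bones global-best PSO loop of the kind referred to here is sketched below; the cost function, bounds and coefficients are placeholders rather than the pipe-network investment model of the paper.

import numpy as np

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1.0)):
    # Plain global-best particle swarm optimization over a box-constrained space.
    lo, hi = bounds
    pos = np.random.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.apply_along_axis(cost, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(*pos.shape), np.random.rand(*pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        val = np.apply_along_axis(cost, 1, pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy quadratic cost standing in for a pipe diameter/layout cost function.
best, best_cost = pso(lambda x: np.sum((x - 0.3) ** 2), dim=5)
print(best, best_cost)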

Keywords: Irrigation, Tree pipe network, Particle swarm optimization (PSO), Investment cost, non-optimized (Step-by-step)

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #18, pp. 219-230


Title of the Paper: QOS Evaluation of Stable Energy Efficient Node Disjoint Adhoc Routing Protocol

Authors: V. Seethalakshmi, G. Mohankumar

Abstract: A mobile ad hoc network (MANET) is a self-configuring, infrastructure-less network of mobile devices connected wirelessly. The emergence of real-time applications such as multimedia services and disaster recovery, together with the widespread use of wireless and mobile devices, has generated the need to provide quality-of-service (QoS) support in MANETs. However, QoS provisioning in MANETs is far more challenging than in wired IP networks because of multi-hop wireless communication, limited battery power, the limited range of mobile devices, the absence of a central coordination authority, and the fact that each device is free to move independently in any direction and therefore changes its links to other devices frequently. The design of an efficient and reliable routing scheme providing QoS support for such applications is therefore very important. Accordingly, an effort has been made to create a new QoS-based Stable Energy Aware Ad hoc Routing protocol (QSEAAR) by adding quality of service to the Stable Energy Aware Ad hoc Routing (SEAAR) protocol. The proposed protocol is simulated using the network simulator ns-2.35 under the Linux platform. The protocol considers not only the QoS requirements but also the cost optimality of the routing path to improve overall network performance. The evaluation results show that QSEAAR performs comparably to, and overall outperforms, the existing AOMDV and SEAAR protocols.

Keywords: AOMDV, RREQ, RREP, MANET, ADHOC, QSEAAR, SEAAR

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #17, pp. 207-218


Title of the Paper: Coherence in the CMP ERA: Lesson Learned in Designing a LLC Architecture

Authors: Sandro Bartolini, Pierfrancesco Foglia, Cosimo Antonio Prete, Marco Solinas

Abstract: Designing an efficient memory system is a major challenge for future multicore systems. In particular, multicore systems increase the number of requests towards the memory system, so the design of efficient on-chip caches is crucial to achieving an adequate level of performance. Solutions based on conventional, large caches suffer from wire delay effects, so NUCA and D-NUCA caches may represent an alternative solution thanks to their ability to limit such effects. Another important design issue of such systems is coherence management: the theory of caches kept coherent via directory-based coherence protocols was successful in the design of high-performance DSM machines, and must now consider the requirements of the new scenario: many cores on a chip and NUCA organizations. In this paper, we address some of these aspects by presenting a NUCA-based last-level cache (LLC) architecture. The architecture is based on a D-NUCA scheme, i.e. a banked LLC architecture with a migration mechanism that brings frequently accessed data closer to the requesting processor. To improve the access time to shared copies, which is limited by ping-pong effects, we adopted copy replication, which allows the replication of shared copies requested by processors located on the opposite side of the cache. Finally, we adapted a directory-based, distributed coherence protocol to a D-NUCA cache with migration and replication. The resulting cache memory subsystem outperforms a statically sub-banked LLC. The adoption of all these mechanisms forced us to deal with race conditions that may compromise data coherence between the chip and memory, and hence to modify the baseline coherence protocol. This experience demonstrated that, in the multicore era, coherence protocols must still be treated as being of the utmost importance by researchers and designers of such systems.

Keywords: CMP systems, wire delay, NUCA, coherence, migration, replication

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #16, pp.195-206


Title of the Paper: The Design of Multidimensional Data Model Using Principles of the Anchor Data Modeling: An Assessment of Experimental Approach Based on Query Execution Performance

Authors: Radek Němec, František Zapletal

Abstract: Decision-making processes need to reflect changes in the business world in a multidimensional way, including a corresponding way of viewing the data used to carry out the key decisions that ensure the competitiveness of the business. In this paper we focus on the Business Intelligence system as a main toolset that supports complex decisions and requires a multidimensional view of data for this purpose. We propose a novel experimental approach to the design of a multidimensional data model that uses principles of the anchor modeling technique. The proposed approach is expected to bring several benefits, such as better query execution performance and better support for temporal querying. In this paper we assess the approach mainly from the query execution performance perspective. The emphasis is placed on assessing the technique as a potentially innovative approach for the data warehousing field, with some implicit principles that could make the design, implementation and maintenance of a data warehouse more effective. The query performance testing was performed in a row-oriented database environment using a sample of 10 star queries executed against 10 sample multidimensional data models. The results compare, using statistical methods, query execution in the experimental “Anchor” schema and in the traditional “Star” schema. They indicate possible support for the expected benefits of the proposed approach, which embraces a high level of normalization of the resulting database schema, in contrast with the traditional approach, which mostly results in a non-normalized database schema.

Keywords: Multidimensional view of data, multidimensional data model, experimental approach, Anchor modeling, query execution performance

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #15, pp.177-194


Title of the Paper: E-Procurement Institutionalization in Construction Industry in Developing Countries: A Model and Instrument

Authors: Quang-Dung Tran, De-Chun Huang

Abstract: The adoption of e-commerce technologies is a progressively interactive, multi-phase process. The existing adoption literature has not paid appropriate attention to this nature. This study distinguishes between the initial adoption and the institutionalization of e-procurement, and focuses in particular on investigating the determinants of e-procurement adoption in the construction sector in the context of developing countries. It uses formative measurement to model the exogenous latent variables. As a result, a theoretical model and an instrument are constructed and empirically tested using data collected in Hanoi. The sophistication of e-procurement is essential to gaining the full benefits of the technology; therefore, specific studies on the institutionalization of e-procurement, such as the present work, are very necessary. Further research with larger samples needs to be conducted to further validate the proposed model and instrument.

Keywords: E-procurement; initial adoption; institutionalization; construction industry; developing countries

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #14, pp.152-176


Title of the Paper: EAST: An Exponential Adaptive Skipping Training Algorithm for Multilayer Feedforward Neural Networks

Authors: R. Manjula Devi, S. Kuppuswami

Abstract: The Multilayer Feedforward Neural Network (MFNN) has been applied widely to a broad range of supervised pattern recognition tasks. The major problem in the MFNN training phase is its long training time, especially when it is trained on very large training datasets. Accordingly, an enhanced training algorithm called the Exponential Adaptive Skipping Training (EAST) algorithm is proposed in this paper, which focuses on reducing the training time of the MFNN through stochastic presentation of the training data. The stochastic presentation is accomplished by partitioning the training dataset into two completely separate classes, a classified and a misclassified class, based on comparing the calculated error measure with a threshold value. Only the input samples in the misclassified class are presented to the MFNN for training in the next epoch, whereas the correctly classified class is skipped exponentially, which dynamically reduces the number of training samples presented at each epoch. Decreasing the size of the training dataset exponentially in this way reduces the total training time, thereby speeding up the training process. The EAST algorithm can be integrated with any supervised training algorithm and is very simple to implement. The proposed EAST algorithm is evaluated on the benchmark datasets Iris, Waveform, Heart Disease and Breast Cancer for different learning rates. The simulation study shows that the EAST training algorithm results in faster training than the LAST and standard BPN algorithms.
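
A schematic training-loop skeleton in the spirit of adaptive skipping is given below; the exact error measure, threshold handling and skipping schedule of EAST are not reproduced, and the train_step and is_correct hooks are hypothetical stand-ins for the MFNN update and its classification check.

import random

def adaptive_skipping_epochs(samples, train_step, is_correct, n_epochs=10):
    # A sample that is classified correctly is skipped for an exponentially
    # growing number of epochs; misclassified samples are presented every epoch.
    skip_until = {i: 0 for i in range(len(samples))}   # next epoch the sample is shown
    skip_len = {i: 1 for i in range(len(samples))}     # current skip interval
    for epoch in range(n_epochs):
        shown = 0
        for i, x in enumerate(samples):
            if epoch < skip_until[i]:
                continue                                # sample is being skipped
            shown += 1
            train_step(x)
            if is_correct(x):
                skip_until[i] = epoch + skip_len[i]
                skip_len[i] *= 2                        # exponential growth of the skip
            else:
                skip_until[i], skip_len[i] = 0, 1       # misclassified: reset
        print(f"epoch {epoch}: presented {shown}/{len(samples)} samples")

# Illustrative run with dummy hooks in place of the actual network update.
adaptive_skipping_epochs(list(range(100)), train_step=lambda x: None,
                         is_correct=lambda x: random.random() < 0.8)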

Keywords: Adaptive Skipping, Neural Network, Training Algorithm, Training Speed, MFNN, Learning Rate

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #13, pp.138-151


Title of the Paper: Camera Calibration with Varying Parameters Based on Improved Genetic Algorithm

Authors: Mostafa Merras, Nabil El Akkad, Abderrahim Saaidi, Abderrazak Gadhi Nazih, Khalid Satori

Abstract: In this paper, we propose an approach based on improved genetic algorithms for the calibration of cameras with varying parameters. The method is based on the formulation of a nonlinear cost function derived from the relationship between points of the image planes and all parameters of the cameras. Minimizing this function with a genetic approach enables us to simultaneously estimate the intrinsic and extrinsic parameters of the different cameras. Compared to traditional optimization methods, camera calibration based on improved genetic algorithms can avoid being trapped in a local minimum and does not need initial values. The proposed technique finds a near-optimal solution without the need for initial estimates of the camera parameters. Tested on several cases, the proposed method proved to be an effective tool for determining the camera parameters required by various applications, such as image rectification, analytical photogrammetry and 3D reconstruction from 2D images.
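
The sketch below shows a generic real-coded genetic minimizer of the kind alluded to; the blend crossover, Gaussian mutation and toy cost function are illustrative assumptions, not the authors' operators or their calibration cost function.

import numpy as np

def genetic_minimize(cost, dim, pop_size=40, gens=100, mut_sigma=0.1, elite=2):
    # Bare-bones real-coded genetic algorithm minimizing a scalar cost function.
    pop = np.random.uniform(-1, 1, (pop_size, dim))
    for _ in range(gens):
        fitness = np.apply_along_axis(cost, 1, pop)
        pop = pop[np.argsort(fitness)]                    # best individuals first
        next_pop = [pop[:elite]]                          # elitism
        while sum(len(p) for p in next_pop) < pop_size:
            a, b = pop[np.random.randint(0, pop_size // 2, 2)]
            alpha = np.random.rand(dim)
            child = alpha * a + (1 - alpha) * b           # blend crossover
            child += np.random.normal(0, mut_sigma, dim)  # Gaussian mutation
            next_pop.append(child[None, :])
        pop = np.vstack(next_pop)[:pop_size]
    fitness = np.apply_along_axis(cost, 1, pop)
    return pop[fitness.argmin()], fitness.min()

# Toy cost whose minimum mimics "true" parameters at 0.5 (not a reprojection error).
params, err = genetic_minimize(lambda p: np.sum((p - 0.5) ** 2), dim=6)
print(params, err)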

Keywords: Camera calibration, computer vision, improved genetic algorithms, varying parameters, non-linear optimization

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #12, pp.129-137


Title of the Paper: Enabling Predictive Maintenance Strategy in Rail Sector: A Clustering Approach

Authors: John Victor Antony, G. M. Nasira

Abstract: One of the imperatives of predictive asset maintenance is to analyze, understand and act upon the failure patterns hidden in the failure data, which are represented, in general, by failure code, failure description and failure instance. A maintenance plan that is in consonance with the failure trends inferred through data mining is bound to enhance asset reliability. This paper shows how to integrate a clustering approach into asset maintenance and, in particular, provides a road map for implementing a predictive maintenance strategy in the rail sector. The proposed approach has been tested on actual failure data pertaining to the passenger-carrying vehicles of trains. Finally, the performance of two fundamental approaches, hard and soft clustering, has been investigated on the dataset and a recommendation is made.
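
As a reference point for the hard-clustering baseline, a plain k-means implementation on synthetic two-dimensional failure features is sketched below; the data and feature space are illustrative, not the railway failure records used in the study.

import numpy as np

def kmeans(points, k, iters=100, seed=0):
    # Plain (hard) k-means: alternate nearest-centre assignment and centre update.
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dist = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        new_centers = np.array([points[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Two synthetic failure "modes" in a 2-D feature space.
pts = np.vstack([np.random.normal(0, 0.5, (50, 2)), np.random.normal(3, 0.5, (50, 2))])
labels, centers = kmeans(pts, k=2)
print(centers)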

Keywords: Clustering, Data Analysis, Data Description, K Means algorithm, Railways, Pattern Analysis

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #11, pp.118-128


Title of the Paper: Achieving Load Balance in a Hierarchical Peer-to-Peer Architecture

Authors: S. C. Wang, Y. H. Su, S. S. Wang, K. Q. Yan, S. F. Chen

Abstract: With the fast development of networking, the demands made of computers are greater than ever before. Determining how to utilize networked resources to achieve cooperative computing has remained an important topic in recent years. In addition, with the great advances in Internet technology, Peer-to-Peer (P2P) computing has gradually become the mainstream of distributed applications; it not only provides enormous resources for complicated computations that a single computer cannot handle, but also integrates resources more effectively. A P2P architecture relies primarily on the computing power and bandwidth of the participants in the network rather than concentrating the work in a relatively limited number of servers. P2P architectures are typically used to connect nodes via large-scale connections, and the topology is useful for many purposes. Furthermore, every node in a P2P computing system has its own resources, so determining how to take the different characteristics of each node into consideration when assigning load is an important topic. In this study, a three-phase scheduling algorithm for a P2P architecture is proposed. The proposed scheduling algorithm is composed of the BTO (Best Task Order), TOLB (Threshold-based Opportunistic Load Balancing) and TLBMM (Threshold-based Load Balance Min-Min) scheduling algorithms, which together improve execution efficiency and maintain the load balance of the system.
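
The third phase builds on the classical Min-Min heuristic; a minimal Min-Min scheduler is sketched below, without the threshold-based refinements that distinguish TOLB and TLBMM, and with an illustrative task-cost matrix.

def min_min_schedule(task_costs, n_nodes):
    # Classic Min-Min list scheduling: repeatedly assign the task whose best
    # completion time across all nodes is smallest.
    # task_costs[t][n] = execution time of task t on node n (illustrative values).
    ready = [0.0] * n_nodes
    unassigned = set(range(len(task_costs)))
    assignment = {}
    while unassigned:
        # best completion time and node for every unassigned task
        best = {t: min((ready[n] + task_costs[t][n], n) for n in range(n_nodes))
                for t in unassigned}
        task = min(best, key=lambda t: best[t][0])
        finish, node = best[task]
        assignment[task] = node
        ready[node] = finish
        unassigned.remove(task)
    return assignment, ready

costs = [[4, 6], [2, 5], [8, 3], [5, 4]]   # 4 tasks, 2 peers (hypothetical)
print(min_min_schedule(costs, n_nodes=2))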

Keywords: Distribution system, Peer-to-Peer computing, Scheduling, Load balance

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #10, pp.107-117


Title of the Paper: F-PAC: A Novel Soft Index Based Cluster Head Validation & Gateway Election Mechanism for Ad Hoc Network

Authors: S. Thirmurugan, E. George Dharma Prakash Raj

Abstract: In today's dynamic scenario, communication no longer happens in a predetermined manner. A network that serves as a communication platform but requires heavy infrastructure is likely to waste resources; thus, ad hoc networks came into existence. The functionality of such networks has been enhanced through clustering mechanisms, and these clusters need to be well formed to sustain the efficient functioning of the network. This paper therefore proposes F-PAC, a fuzzy-logic-based cluster validation technique, to validate the cluster heads identified by existing cluster formation mechanisms. The procedure also helps to elect a gateway node for each cluster. The study has been carried out using OMNET++ as the simulator.

Keywords: F-PAC, W-PAC, Fuzzy

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #9, pp.99-106


Title of the Paper: Equity between Secondary Users in a Cognitive Radio Network

Authors: Abdellah Idrissi, Said Lakhal

Abstract: Ensuring fairness in the allocation of free channels to Secondary Users (SUs) in a Cognitive Radio Network (CRN) is a significant problem for researchers. In this work, we develop a new algorithm that determines the channel assignment parameters by identifying, for each SU, the number of packets to send and the slots and channels to use. To quantify the effectiveness of the algorithm, we use Jain's Equity Index (EIJ) as a performance indicator. The results show EIJ values consistently close to 1, which demonstrates the effectiveness of the algorithm.
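
Jain's index itself is standard; a minimal computation is sketched below, with illustrative per-user packet counts rather than the paper's allocations.

def jain_index(allocations):
    # Jain's fairness index: (sum x_i)^2 / (n * sum x_i^2);
    # equals 1 when all secondary users receive identical allocations.
    n = len(allocations)
    total = sum(allocations)
    return total * total / (n * sum(x * x for x in allocations))

# Illustrative: packets sent by four SUs over the scheduled slots.
print(jain_index([10, 10, 10, 10]))   # 1.0 (perfectly fair)
print(jain_index([20, 5, 5, 10]))     # about 0.73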

Keywords: Cognitive Radio Network, Scheduling, Equity, Equity Index Jain, Transfer rate, Cantor’s Bijection

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #8, pp.90-98


Title of the Paper: Hiding Sensitive Fuzzy Association Rules Using Weighted Item Grouping and Rank Based Correlated Rule Hiding Algorithm

Authors: K. Sathiyapriya, G. Sudhasadasivam, C. J. P. Suganya

Abstract:

Keywords: Data perturbation, Fuzzy, Correlation analysis, Sensitive Association rules, Item Grouping, Rule Hiding, Quantitative data, weighted, privacy preservation, data security

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #7, pp.78-89


Title of the Paper: Towards an Optimal Multicore Processor Design for Cryptographic Algorithms – A Case Study on RSA

Authors: Mutaz Al-Tarawneh, Ashraf Alkhresheh

Abstract: This paper aims at identifying the optimal multicore processor configuration for cryptographic applications. The RSA encryption algorithm is taken as a case study, and a comprehensive design space exploration (DSE) is performed to obtain the optimal processor configuration that can serve either as a standalone processor or as a coprocessor for security applications. The DSE was based on four figures of merit: performance, power consumption, energy dissipation and lifetime reliability of the processor. A parallel version of the RSA algorithm was implemented and used as the experimental workload. Direct program execution and full-system simulation were used to evaluate each candidate processor configuration against the aforementioned figures of merit. Our analysis was based on commodity processors in order to arrive at a realistic optimal processor configuration in terms of clock rate, number of cores, number of hardware threads, process technology and cache hierarchy. Our results indicate that the optimal multicore processor for parallel cryptographic algorithms must have a large number of cores, a large number of hardware threads and a small feature size, and should support dynamic frequency scaling. The execution of our parallel RSA algorithm on the identified optimal configuration revealed a set of observations. First, the parallel algorithm achieved a 79% performance improvement compared to the serial implementation of the same algorithm. Second, running the optimal configuration at the highest possible clock rate achieved a 40.13% energy saving compared to the same configuration at the lowest clock rate. Third, running the optimal configuration at the lowest clock rate achieved a 19.7% power saving compared to the same configuration at the highest clock rate. Fourth, the optimal configuration with a low clock rate achieved, on average, a 109.85% higher mean time to failure (MTTF) compared to the high-frequency configuration. Consequently, the optimal configuration always has the same number of cores, hardware threads and process technology, but the clock rate should be adjusted appropriately based on the design constraints and the system requirements.
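
One simple flavour of the data-level parallelism such a study can exploit is encrypting independent blocks on several cores; the sketch below does this with textbook-sized RSA parameters and Python's multiprocessing pool, and is not the authors' parallel RSA implementation.

from multiprocessing import Pool

# Tiny textbook RSA parameters for illustration only (real keys are >= 2048 bits).
N, E = 3233, 17          # n = 61 * 53, public exponent e

def encrypt_block(m):
    # RSA encryption of one block: c = m^e mod n (modular exponentiation).
    return pow(m, E, N)

if __name__ == "__main__":
    blocks = list(range(2, 1002))
    # Data-parallel encryption of independent blocks across four worker processes.
    with Pool(processes=4) as pool:
        ciphertext = pool.map(encrypt_block, blocks)
    print(ciphertext[:5])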

Keywords: RSA, performance, power, energy, lifetime reliability, optimal configuration

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #6, pp.54-77


Title of the Paper: Extending MPLS Technology in Providing Quality of Service among Different Autonomous Systems using Relay Race Transmission Approach

Authors: Anurag Goyal, Ashu Gupta, Reetu Gupta

Abstract: The paper addresses serious networking issues, including retaining signal strength and amplifying it over larger distances, which arise from the negative impacts of amplifiers: the constraints of the 5-4-3 rule, the contribution to internal delay, amplification of background noise, the limited maximum signal value of an amplifier, the use of poor signal generators, the lack of repeater support, and virtually moving users to greater distances. The research enhances the functionality of MPLS in providing Quality of Service among different autonomous systems with the desired signal strength. The work was carried out in a Red Hat Linux environment using an experimental testbed for the evaluation of the conceptual, analytical, experimental and methodological details. The paper presents a new Relay Race Transmission approach in MPLS technology that extends its QoS performance among autonomous systems by using label switched routers instead of signal amplifiers across remote distances. The results are presented with comparative graphs of the default and extended MPLS technology and show that the extended MPLS technology has an edge over the default MPLS technology in maintaining the desired signal strength among autonomous systems, and hence in providing QoS among ASs.

Keywords: QoS, Relay race transmission, 5-4-3 rule, delay, noise, repeater, amplification, autonomous system

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #5, pp.43-53


Title of the Paper: Compound Regular Plans and Delay Differentiation Services for Mobile Clients

Authors: John Tsiligaridis

Abstract: The broadcast problem, including the design of broadcast plans, is considered. Data can be reached at any time and place, and an increasing number of users are asking for services. A Regular Broadcast Plan (RBP) can be created with single or multiple channels after certain data transformations are examined. A server with data parallelism can administer more than one plan, and before broadcasting any of them it can consider the union of ready or candidate RBPs. The case of the union of several candidate or ready RBPs is examined and useful results are provided. A new RBP algorithm based on the group length (RBPG) is introduced. A set of algorithms for the creation of a compound regular broadcast plan (CO_RBP), and their possibilities, are developed. CO_RBP is based on the “hot” and “near hot” sets, and the conditions under which CO_RBP is preferable to separate RBPs are also presented. Two types of CO_RBP are examined: additive RBPs (ARBPs) and union RBPs (URBPs). The proposed framework gives service priority to the “hot” or “near hot” sets. In addition, a Dimensioning Algorithm (DA) based on delay differentiation services, together with conditions that can guarantee the desired ratio for the CO_RBP, is developed. In this way the server gains increased broadcasting capability by deciding on the union of RBPs using single or multiple channels. This ability can enrich the server infrastructure for self-monitoring, self-organization and channel availability. Simulation results are provided.

Keywords: Broadcasting, broadcast plan, mobile computing

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #4, pp.34-42


Title of the Paper: Density-Based Clustering by P System with Active Membranes on Commodity Recommendation in E-commerce Websites

Authors: Jie Sun, Xiyu Liu

Abstract: With the popularity of online shopping in people's daily economic life, commodity recommendation mechanisms in e-commerce platforms help customers find suitable products quickly and accurately. Using the possibility of changing the membrane structure, a variant of the P system with active membranes is proposed to solve commodity recommendation problems. In this paper, the commodity recommendation problem is first transformed into a density-based clustering problem. The procedure for solving this problem is then specified, and a P system with a sequence of new rules is designed. The computational complexity of the DBSCAN clustering algorithm in this system is O(n log n), whereas the original DBSCAN clustering algorithm is O(n^2) without spatial queries. This new model of the P system can reduce the computational complexity of the clustering process and improve the efficiency of solving commodity recommendation problems. Through example verification, the new model is shown to be feasible and effective for this practical problem.
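
To make the complexity remark concrete, the sketch below runs sequential DBSCAN with a KD-tree for the eps-neighbourhood queries, the kind of spatial index that brings the average cost towards O(n log n); the membrane-level parallelism of the proposed P system is not modelled here.

import numpy as np
from sklearn.neighbors import KDTree

def dbscan(points, eps=0.5, min_pts=5):
    # Sequential DBSCAN; neighbourhood queries go through a KD-tree spatial index.
    tree = KDTree(points)
    neighbours = tree.query_radius(points, r=eps)
    labels = np.full(len(points), -1)      # -1 = noise / unvisited
    cluster = 0
    for i in range(len(points)):
        if labels[i] != -1 or len(neighbours[i]) < min_pts:
            continue                        # skip assigned points and non-core points
        labels[i] = cluster
        seeds = list(neighbours[i])
        while seeds:
            j = seeds.pop()
            if labels[j] != -1:
                continue                    # already assigned to a cluster
            labels[j] = cluster
            if len(neighbours[j]) >= min_pts:
                seeds.extend(neighbours[j]) # j is a core point: expand further
        cluster += 1
    return labels

# Two synthetic groups of "customer preference" points, purely illustrative.
pts = np.vstack([np.random.normal(0, 0.2, (30, 2)), np.random.normal(2, 0.2, (30, 2))])
print(dbscan(pts, eps=0.4, min_pts=4))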

Keywords: Membrane Computing, P System, Active Membranes, DBSCAN, Commodity Recommendation

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #3, pp.20-33


Title of the Paper: An Integrated Multi-Agent Testing Tool for Security Checking of Agent-Based Web Applications

Authors: Fathy E. Eassa, M. Zaki, Ahmed M. Eassa, Tahani Aljehani

Abstract: In this paper, an integrated multi-agent testing tool (IMATT) is presented. The tool comprises a static analyzer, a dynamic tester and an integrator of the two components for detecting security vulnerabilities and errors in agent-based web applications written in Java. The static analysis component analyzes the source code of the web application to identify the locations of security vulnerabilities and displays them to the programmer. Subsequently, dynamic testing of the web application is carried out. Here, a temporal-based assertion language is introduced to help in detecting security violations (errors) in the underlying application. The proposed language has operators for detecting SQL injection and cross-site scripting (XSS) security errors. The dynamic tester consists of two components: an instrumentor (preprocessor) and a run-time agent. The instrumentor has several modules that have been implemented as software agents in Java under the control of a multi-agent framework; these agents are the static analyzer agent, the parser agent and the code converter agent. Moreover, an integrator for combining the static and dynamic analyses is employed. Finally, the implementation details of IMATT are reported.

Keywords: web applications security testing, static testing, dynamic testing, temporal logic, assertion languages

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #2, pp.9-19


Title of the Paper: Early Detection and Prevention of Oral Cancer: Association Rule Mining on Investigations

Authors: Neha Sharma, Hari Om

Abstract: Early detection and prevention of oral cancer is critical, as it can increase survival chances considerably, allow simpler treatment and result in a better quality of life for survivors. In this paper, the popular association rule mining algorithm Apriori is used to determine the spread of cancer with the help of various investigations and then assess the patient's chance of survival. This is achieved by extracting a set of significant rules among various laboratory tests and investigations, such as FNAC of the neck node, LFT, biopsy, USG and CT scan/MRI, and the survivability of oral cancer patients. The rules clearly show that if FNAC of the neck node, USG and CT scan/MRI are positive, the chance of survival is reduced; however, if LFT is normal, the probability of survival is high. If the diagnostic biopsy results in squamous cell carcinoma, this clearly indicates oral cancer, which may lead to high mortality if appropriate treatment is not initiated. The experimental results demonstrate that all the generated rules hold the highest confidence level, making investigations essential for understanding the spread of cancer after clinical examination for early detection and prevention of oral cancer.
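
A minimal Apriori pass with support and confidence computation is sketched below; the investigation items and the transactions are illustrative stand-ins, not the patient records analysed in the paper.

from itertools import combinations

def apriori(transactions, min_support=0.5):
    # Minimal Apriori frequent-itemset miner: grow itemsets level by level,
    # keeping only those whose support meets the threshold.
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent, k_sets = {}, [frozenset([i]) for i in items]
    while k_sets:
        counts = {s: sum(s <= t for t in transactions) for s in k_sets}
        current = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(current)
        keys = list(current)
        # candidate generation: join frequent k-itemsets into (k+1)-itemsets
        k_sets = list({a | b for a, b in combinations(keys, 2) if len(a | b) == len(a) + 1})
    return frequent

def confidence(frequent, antecedent, consequent):
    # conf(A => B) = support(A union B) / support(A)
    return frequent[antecedent | consequent] / frequent[antecedent]

tx = [frozenset(t) for t in (
    {"FNAC+", "USG+", "survival=low"},
    {"FNAC+", "USG+", "survival=low"},
    {"FNAC-", "LFT=normal", "survival=high"},
    {"FNAC+", "USG+", "survival=low"},
)]
freq = apriori(tx, min_support=0.5)
print(confidence(freq, frozenset({"FNAC+", "USG+"}), frozenset({"survival=low"})))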

Keywords: Data Mining, Association Rule Mining, Apriori, Oral Cancer, Weka, Investigations

WSEAS Transactions on Computers, ISSN / E-ISSN: 1109-2750 / 2224-2872, Volume 13, 2014, Art. #1, pp.1-8