IJMECS Vol. 7, No. 2, Feb. 2015
REGULAR PAPERS
This paper proposes a new neuro-fuzzy system architecture and a learning method that adjusts its weights and automatically determines the number of neurons, the locations of the membership function centers, and the receptive field parameters in an online mode with high processing speed. The basic idea of this approach is to tune both the synaptic weights and the membership functions using the supervised learning and self-learning paradigms. The approach addresses evolving online neuro-fuzzy systems that can process data under uncertainty. The results prove the effectiveness of the developed architecture and learning procedure.
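Below is a minimal, hypothetical sketch of one online update step for a simplified neuro-fuzzy node with Gaussian membership functions, combining a supervised weight update with a self-learning adjustment of the winning center; the variable names, learning rates, and update rules are illustrative and do not reproduce the paper's exact learning law.

```python
# Hypothetical sketch of one online update step for a simplified neuro-fuzzy
# node with Gaussian membership functions; the combined supervised/self-learning
# rule shown here is illustrative, not the paper's exact method.
import numpy as np

def online_step(x, y_target, centers, widths, weights, lr_w=0.05, lr_c=0.01):
    """One pass over a single sample: tune weights (supervised) and centers (self-learning)."""
    # Firing strength of each fuzzy neuron (Gaussian receptive fields)
    phi = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2.0 * widths ** 2))
    phi_norm = phi / (phi.sum() + 1e-12)           # normalized membership grades
    y_pred = weights @ phi_norm                    # network output
    err = y_target - y_pred
    weights += lr_w * err * phi_norm               # supervised (LMS-style) weight update
    winner = np.argmax(phi)                        # self-learning: move the winning center
    centers[winner] += lr_c * (x - centers[winner])
    return y_pred, weights, centers
```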
Recommender systems have shown great potential to help users find interesting and relevant Web services (WSs) within large registries. However, with the proliferation of WSs, recommendation becomes a very difficult task. Social computing seems to offer innovative solutions to overcome these shortcomings. Social computing lies at the crossroads of computer science and the social sciences, looking into ways of improving application design and development using elements that people encounter daily, such as social networks, trust, reputation, and recommendation. In this paper, we propose a social trust-aware system for recommending WSs based on the social qualities that WSs exhibit towards their peers at run time and on the trustworthiness of the users who provide feedback on their overall experience with WSs. A set of experiments to assess the fairness and accuracy of the proposed system is reported in the paper, showing promising results and demonstrating that our service recommendation method significantly outperforms conventional similarity-based and trust-based service recommendation methods.
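As an illustration of the general idea (not the authors' exact formulation), the sketch below scores a service by combining a trust-weighted average of user feedback with a social-quality score derived from the service's run-time behaviour towards peers; the weighting parameter and score ranges are assumptions.

```python
# Illustrative sketch (not the authors' exact formulation): a service's
# recommendation score combines the trust-weighted average of user feedback
# with a social-quality score the service exhibits towards peer services.
def recommend_score(feedback, user_trust, social_quality, alpha=0.6):
    """feedback: {user: rating in [0, 1]}, user_trust: {user: trust in [0, 1]},
    social_quality: score in [0, 1] derived from run-time interactions with peers."""
    weighted = sum(user_trust.get(u, 0.0) * r for u, r in feedback.items())
    total_trust = sum(user_trust.get(u, 0.0) for u in feedback) or 1.0
    trusted_feedback = weighted / total_trust       # feedback of trustworthy users counts more
    return alpha * trusted_feedback + (1 - alpha) * social_quality
```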
To increase learning accuracy, it is important to remove misleading, redundant, and irrelevant features. Fuzzy rough set theory offers formal mathematical tools to reduce the number of attributes and determine the minimal subset. Unfortunately, using the formal approach is time-consuming, particularly for large datasets. In this paper, an efficient algorithm for finding a reduct is introduced. Several techniques are proposed and combined with harmony search, such as using a balanced fitness function, fusing classical ranking methods with the fuzzy-rough method, and applying binary operations to speed up the implementation. Comprehensive experiments on 18 datasets demonstrate the efficiency of the suggested algorithm and show that it outperforms several well-known algorithms.
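A minimal sketch of the general scheme follows, assuming a binary harmony search with a balanced fitness that trades the subset's dependency (or accuracy) against its size; the parameter names and the dependency callable are placeholders rather than the paper's exact settings.

```python
# Minimal sketch of binary harmony search for attribute reduction with a
# balanced fitness function; parameters and the dependency measure are
# placeholders, not the paper's exact configuration.
import random

def harmony_search_reduct(n_attrs, dependency, iters=200, hms=10, hmcr=0.9, par=0.3, beta=0.7):
    """dependency: callable taking a 0/1 attribute mask and returning a quality score in [0, 1]."""
    memory = [[random.randint(0, 1) for _ in range(n_attrs)] for _ in range(hms)]
    fit = lambda s: beta * dependency(s) - (1 - beta) * (sum(s) / n_attrs)  # balanced fitness
    scores = [fit(s) for s in memory]
    for _ in range(iters):
        # build a new harmony: take each bit from memory with prob. hmcr, else at random
        new = [(random.choice(memory)[j] if random.random() < hmcr
                else random.randint(0, 1)) for j in range(n_attrs)]
        if random.random() < par:                   # pitch adjustment: flip one random bit
            j = random.randrange(n_attrs)
            new[j] ^= 1
        worst = scores.index(min(scores))
        if fit(new) > scores[worst]:                # replace the worst harmony if improved
            memory[worst], scores[worst] = new, fit(new)
    best = scores.index(max(scores))
    return [j for j, bit in enumerate(memory[best]) if bit]   # indices of the selected attributes
```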
The main objective of this study is to confirm how the developed security model helps improve the security of object-oriented designs. Software refactoring is an essential activity during development and maintenance; it promotes reengineering measures for improving the quality and security of software. In this regard, the researchers developed security improvement guidelines that use refactoring activities for object-oriented design. The developed guidelines help control design complexity for improved security. A case study adopted from Fowler's refactoring example is used to implement the Security Improvement Guidelines (SIG). The developed Security Quantification Model (SQMOODC) is used to calculate the quantified value of security at each step. The proposed SQMOODC model calculates the effective security index, ensuring that the revised version of the object-oriented design is influenced by the security improvement guidelines. The original code segment may contain security flaws, anomalies, exploitable entities, or vulnerable information that affect security at the design stage; SIG helps remove such flaws, anomalies, and exploitable entities from the refactored code segment. Each refactored step of the case study matches the predicted impact of the refactoring rules on security, and the impact study through the SQMOODC model validates the effectiveness of the developed model and the security improvement guidelines. The validated statistical results on different case studies of object-oriented designs reflect the usefulness and acceptability of the developed models and guidelines.
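Purely as an illustration of the quantification idea (the SQMOODC formulation itself is not reproduced here), a design-level security index could be computed as a weighted combination of object-oriented design metrics and compared before and after refactoring; the metric names and weights below are hypothetical.

```python
# Purely illustrative sketch of a design-level security index as a weighted
# combination of object-oriented design metrics; metric names and weights are
# hypothetical and do not reproduce the SQMOODC formulation.
def security_index(metrics, weights):
    """metrics / weights: dicts keyed by metric name (e.g. coupling, cohesion, data access)."""
    return sum(weights[m] * metrics[m] for m in weights)

WEIGHTS = {"coupling": -0.5, "cohesion": 0.3, "data_access": -0.2}   # hypothetical weights
before = security_index({"coupling": 0.62, "cohesion": 0.40, "data_access": 0.55}, WEIGHTS)
after = security_index({"coupling": 0.41, "cohesion": 0.58, "data_access": 0.35}, WEIGHTS)
# A rise in the index after a refactoring step would indicate improved design-level security.
```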
Texture classification is an important application in all fields of image processing and computer vision. This paper proposes a simple and powerful feature set for texture classification, namely the micro primitive descriptor (MPD). The MPD is derived from the 2×2 grids of a motif-transformed image: the original image is divided into 2×2 pixel grids, and each grid is replaced by the motif shape that minimizes the local ascent while traversing the grid, forming a motif-transformed image. The proposed feature set extracts the textural information of an image with a more detailed account of its texture characteristics. The results demonstrate that it is much more efficient and effective for texture classification than representative feature descriptors such as Random Threshold Vector Technique (RTV) features and the Wavelet Transforms Based on Gaussian Markov Random Field (WTBGMF) approach.
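The following sketch illustrates the motif-transform idea under the assumption that motifs are traversal orders of a 2×2 grid chosen to minimize the total intensity ascent, with a histogram of motif indices used as a simple descriptor; the exact motif set and the full MPD construction follow the paper, not this code.

```python
# Sketch of the 2x2 motif-transform idea: each 2x2 block is replaced by the
# index of the ascent-minimizing traversal order, and a histogram of those
# indices serves as a simple texture descriptor. Assumptions, not the paper's
# exact motif set or feature construction.
from itertools import permutations
import numpy as np

MOTIFS = list(permutations(range(4)))      # candidate scan orders of the 4 pixels

def motif_index(block):
    """block: flattened 2x2 patch; return the index of the ascent-minimizing motif."""
    ascents = [sum(max(block[m[i + 1]] - block[m[i]], 0) for i in range(3)) for m in MOTIFS]
    return int(np.argmin(ascents))

def motif_transform(img):
    img = np.asarray(img, dtype=float)
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2      # crop to even size
    blocks = img[:h, :w].reshape(h // 2, 2, w // 2, 2).transpose(0, 2, 1, 3).reshape(-1, 4)
    return np.array([motif_index(b) for b in blocks]).reshape(h // 2, w // 2)

def mpd_feature(img, bins=len(MOTIFS)):
    """Histogram of motif indices as a simple texture feature vector."""
    hist, _ = np.histogram(motif_transform(img), bins=bins, range=(0, bins))
    return hist / hist.sum()
```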
In real information systems, few documents are static. Instead, many documents have content that changes over time, and these changes could be treated as signals to improve the quality of information retrieval. Unfortunately, considering all of these changes can be time-consuming. In this paper, a method is proposed that significantly reduces the time needed to analyze these changes. The main idea is to select only a subset of the changes, chosen so that retrieval quality is not noticeably affected, while the analysis time is reduced. To evaluate the proposed method, three different datasets were selected from Wikipedia. Different term-weighting factors were assessed, and the effect of the proposed method on these factors was investigated. The results of the empirical experiments show that the proposed method keeps the quality of the retrieved information at an acceptable level while reducing the document analysis time.
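An illustrative sketch of this selective re-analysis, assuming a simple change-ratio threshold and plain term-frequency weights (the paper's actual selection criterion and weighting factors may differ), is given below.

```python
# Illustrative sketch, not the paper's exact criterion: re-analyze a revised
# document only when its change ratio exceeds a threshold; otherwise keep the
# previously computed term weights to save indexing time.
from collections import Counter

def maybe_reweight(old_text, new_text, old_weights, threshold=0.1):
    old_tf, new_tf = Counter(old_text.split()), Counter(new_text.split())
    changed = sum((new_tf - old_tf).values()) + sum((old_tf - new_tf).values())
    change_ratio = changed / max(sum(new_tf.values()), 1)
    if change_ratio < threshold:              # minor edit: skip the costly re-analysis
        return old_weights, False
    total = sum(new_tf.values())
    return {t: c / total for t, c in new_tf.items()}, True   # recompute (plain tf here)
```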
In this paper, we propose model-driven software development and a Security Performance Framework (SPF) model to maintain the balance between security and performance for web applications.
We argue that not all security checks in a trusted operating system are necessary. Some non-essential security checks can be skipped to increase system performance, and these non-essential checks can be identified in any web application (a minimal illustration follows this abstract).
To implement a trusted operating system based on this Security Performance Framework, we propose object-oriented code generation through forward engineering. This involves generating the source code of a web application from one or more object-oriented Rational Rose models. The novel integration of security engineering with a model-driven software development approach has several advantages.
To maintain security in applications such as e-commerce, banking, marketplace services, advertising, auctions, comparison shopping, mobile commerce, payment, ticketing, and online insurance policy management, highly secure operating systems are required. In this regard, a number of trusted operating systems such as Argus, Trusted Solaris, and Virtual Vault have been developed by various companies to handle the increasing need for security. Owing to their high security, these operating systems are used in the defense sector, but they still have limited scope in the commercial sector because of their lower performance; this security comes at a cost. This paper analyzes UML-based software development solutions for SPF to manage security, performance, and modeling for web applications.
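The sketch below illustrates the notion of skipping non-essential security checks for a given application profile; the check names, the profile format, and the split between essential and optional checks are invented for illustration and are not part of the SPF specification.

```python
# Hedged sketch of skipping non-essential security checks per application
# profile to trade a little assurance for performance; names and the profile
# format are invented for illustration.
ESSENTIAL_CHECKS = {"authentication", "authorization", "input_validation"}
OPTIONAL_CHECKS = {"audit_logging", "redundant_integrity_scan"}

def enforce(check, request):
    print(f"running {check} on {request!r}")        # placeholder for the real check

def run_checks(request, profile_skips=frozenset()):
    for check in ESSENTIAL_CHECKS:
        enforce(check, request)                     # always executed
    for check in OPTIONAL_CHECKS - set(profile_skips):
        enforce(check, request)                     # executed only if the profile keeps it

# Example: a performance-sensitive web application skips the redundant scan.
run_checks("GET /cart", profile_skips={"redundant_integrity_scan"})
```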
Cloud computing is an emerging Internet-based paradigm for rendering services on a pay-per-use basis. The growing number of cloud service providers and services creates the need for a tool that retrieves a high-quality, optimal composition of cloud services relevant to the user's priorities. Quality of Service (QoS) rankings provide valuable information for making an optimal cloud service selection from a set of functionally equivalent service candidates. To obtain a weighted, user-centric QoS composition, real-world invocations of the service candidates are usually required. To avoid time-consuming and expensive real-world service invocations, this paper proposes a framework for predicting the optimal composition of the services requested by the user. By taking advantage of consumers' past service usage experiences, more cost-effective results are achieved. The proposed framework enables the end user to determine the optimal service composition based on the input weights for the individual QoS attributes. A genetic algorithm and basic Tabu search are applied for user-centric QoS ranking prediction and optimal service composition. The experimental results prove that our approaches outperform other competing approaches.
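As a rough illustration, the fitness that a genetic algorithm or Tabu search could optimize might score a candidate composition by user-weighted, normalized QoS attributes, as sketched below; the attribute names, normalization, and aggregation rule are assumptions rather than the paper's exact model.

```python
# Minimal sketch of scoring a candidate service composition by user-weighted,
# normalized QoS attributes; attribute names and the aggregation rule are
# assumptions, not the paper's exact model.
def composition_fitness(composition, user_weights):
    """composition: list of services, each a dict of QoS values normalized to [0, 1]
    (higher is better); user_weights: {qos_attribute: weight}, weights summing to 1."""
    score = 0.0
    for attr, w in user_weights.items():
        # average the attribute across the composed services (a simple aggregation choice)
        score += w * sum(s[attr] for s in composition) / len(composition)
    return score

candidate = [{"response_time": 0.8, "availability": 0.95, "cost": 0.6},
             {"response_time": 0.7, "availability": 0.90, "cost": 0.8}]
print(composition_fitness(candidate, {"response_time": 0.5, "availability": 0.3, "cost": 0.2}))
```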