IJIEEB Vol. 6, No. 1, Feb. 2014
Cover page and Table of Contents: PDF (size: 133KB)
REGULAR PAPERS
Most of the algorithms implemented in FPGAs used to be fixed-point. Floating-point operations are useful for computations involving a large dynamic range, but they require significantly more resources than integer operations. With current trends in system requirements and available FPGAs, floating-point implementations are becoming more common, and designers are increasingly taking advantage of FPGAs as a platform for them. The rapid advance in Field-Programmable Gate Array (FPGA) technology makes such devices increasingly attractive for implementing floating-point arithmetic. Compared to Application-Specific Integrated Circuits, FPGAs offer reduced development time and cost; moreover, their flexibility enables field upgrades and adaptation of the hardware to run-time conditions. A 32-bit floating-point arithmetic unit compliant with the IEEE 754 standard has been designed in VHDL, and all four operations (addition, subtraction, multiplication, and division) were tested with Xilinx tools. A Simulink model was then created in MATLAB to verify the VHDL code of the floating-point arithmetic unit in ModelSim.
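As a minimal sketch (not from the paper), the snippet below shows how an IEEE 754 single-precision word splits into the sign, exponent, and fraction fields that such a 32-bit hardware unit operates on; the example value is arbitrary.

```python
# Decompose a 32-bit float into its IEEE 754 fields (1 sign / 8 exponent /
# 23 fraction bits) -- the same layout a single-precision FPU works with.
import struct

def decompose_ieee754(x: float):
    """Return (sign, biased exponent, fraction) of a 32-bit float."""
    bits = struct.unpack('>I', struct.pack('>f', x))[0]
    sign = bits >> 31
    exponent = (bits >> 23) & 0xFF   # 8-bit biased exponent (bias 127)
    fraction = bits & 0x7FFFFF       # 23-bit fraction (implicit leading 1)
    return sign, exponent, fraction

# -6.5 = -1.101b * 2**2 -> sign=1, exponent=127+2=129, fraction=0.625*2**23
print(decompose_ieee754(-6.5))   # (1, 129, 5242880)
```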
In a financial enterprise, electronic banking is an enterprise-wide IT system used for financial data transactions, human resources, and other important financial data management. It must be managed well; otherwise, problems in data processing will harm the enterprise, and data loss, transaction failures, and many other problems will have a long-term negative impact. The local regulation issued by Bank Indonesia, which requires financial institutions to audit electronic banking both externally and internally, is one of the reasons this study was conducted. The methodology used to measure the performance of IT management in a financial enterprise (here, Bank X) is based on the COBIT 4.1 framework. A mapping is performed to align the enterprise's goals with COBIT objectives, so that the relevant domains can be identified for further assessment. From the questionnaire and interviews conducted at Bank X, the average maturity level was found to be 3 (defined): IT management has developed to a stage where standard procedures and documented processes are in place, supported by formal training for users. However, the training was not yet institutionalized, so many shortcomings could not be fully detected by management even though a policy had been established; the policy has not reached the best-practice level (level 5, optimized).
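The process names and ratings below are hypothetical, not data from the Bank X study; the sketch only illustrates how per-process questionnaire scores might roll up into the average maturity level reported above (0 = non-existent through 5 = optimized).

```python
# Roll hypothetical per-process questionnaire ratings up into an overall
# COBIT maturity level; an overall of ~3 maps to "3 - defined".
scores = {
    "PO1": [3, 3, 4],   # hypothetical respondent ratings per COBIT process
    "AI2": [2, 3, 3],
    "DS5": [3, 4, 3],
}
per_process = {p: sum(v) / len(v) for p, v in scores.items()}
overall = sum(per_process.values()) / len(per_process)
print(per_process, round(overall))
```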
In this paper, three new operations are introduced on intuitionistic fuzzy soft sets. They are based on the second Zadeh implication, conjunction, and disjunction operations on intuitionistic fuzzy sets. Examples of these operations are given, and a few important properties are studied.
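As a pointwise sketch, the code below applies one of these operators to intuitionistic fuzzy values; the (mu, nu) formula used is our assumption of the second Zadeh implication commonly cited in the literature, not a definition quoted from the paper, and each element must satisfy mu + nu <= 1.

```python
# Second Zadeh implication on intuitionistic fuzzy values (assumed form):
# applied element by element to (membership, non-membership) pairs.
def second_zadeh_implication(a, b):
    """a, b: (mu, nu) membership/non-membership pairs for one element."""
    mu_a, nu_a = a
    mu_b, nu_b = b
    mu = max(nu_a, min(mu_a, mu_b))   # membership of A -> B
    nu = min(mu_a, nu_b)              # non-membership of A -> B
    return mu, nu

print(second_zadeh_implication((0.6, 0.3), (0.4, 0.5)))  # (0.4, 0.5)
```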
Defects are among the most detrimental entities in a software system: they deter smooth operation and deployment and can arise in any part of the life cycle. Although they are widely feared, defect prevention remains a largely discounted field of software quality. Unattended defects cause considerable rework and wasted effort. Hence, finding defects is not enough; finding the root cause of each defect is also important, and it is difficult because of the levels of abstraction involved in terms of people, process, complexity, environment, and other factors. This study analyses various techniques of defect classification, prevention, and root cause analysis. The intent of this paper is to demonstrate a structured process for the defect prevention flow and to infer three T's (Tracking, Technique, and Training) from the analysis.
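The categories and counts below are hypothetical, but they illustrate the kind of classification step on which root cause analysis typically rests: tallying defects by root-cause category so the dominant causes surface first (a Pareto-style ordering).

```python
# Tally hypothetical defects by root-cause category, most frequent first.
from collections import Counter

defects = ["requirements", "coding", "coding", "design",
           "coding", "requirements", "environment"]
for cause, count in Counter(defects).most_common():
    print(f"{cause:12s} {count}")
# coding       3
# requirements 2
# design       1
# environment  1
```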
In this paper, we propose a new and simple approach to obstacle and free-space detection in indoor and outdoor environments in real time, using stereo vision as the sensor. The real-time obstacle detection algorithm uses a two-dimensional disparity map to detect obstacles in the scene without constructing the ground plane. The proposed approach combines accumulating and thresholding techniques to detect and cluster obstacle pixels into objects using a dense disparity map. The results from both analysis modules are combined to provide information about the free space. Experimental results are presented to show the effectiveness of the proposed method in real time.
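A minimal NumPy sketch of the accumulate-and-threshold idea follows; it is our illustration under stated assumptions, not the authors' exact algorithm. Pixels whose disparity exceeds a threshold are treated as near obstacles, per-column accumulation localizes them, and sparsely hit columns approximate free space.

```python
# Threshold a dense disparity map, then accumulate obstacle evidence per
# image column to separate obstacle columns from free-space columns.
import numpy as np

def detect_obstacles(disparity, d_min=32, count_min=20):
    """disparity: HxW dense disparity map (larger disparity = closer)."""
    obstacle_mask = disparity > d_min        # threshold: near pixels
    column_hits = obstacle_mask.sum(axis=0)  # accumulate along image rows
    obstacle_cols = column_hits >= count_min # enough evidence -> obstacle
    return obstacle_mask, obstacle_cols, ~obstacle_cols

disparity = np.random.randint(0, 64, size=(240, 320))  # placeholder map
mask, obst, free = detect_obstacles(disparity)
print(obst.sum(), "obstacle columns,", free.sum(), "free columns")
```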
This research proposes Proportional-Integral (PI)-like fuzzy adaptive backstepping fuzzy algorithms based on a Proportional-Derivative (PD) fuzzy rule base, with the adaptation laws derived in the Lyapunov sense. The adaptive SISO PI-like fuzzy adaptive backstepping fuzzy method has two main objectives: the first is to design a SISO fuzzy system to compensate for the model uncertainties of the system, and the second is to design a PI-like fuzzy controller based on the PD method as the adaptive methodology. Classical backstepping control is robust to model uncertainties and external disturbances and is the main controller in this research; the fuzzy controller is used for system compensation. To increase robustness, an adaptive PI-like fuzzy controller is introduced and applied to the backstepping fuzzy controller. The classical backstepping control method has difficulty handling unstructured model uncertainties; one can overcome this problem by combining a backstepping controller with artificial intelligence (e.g., fuzzy logic). To approximate a time-varying nonlinear dynamic system, a fuzzy system requires a large rule base, and this large number of fuzzy rules causes a high computational load. Adding an adaptive law to the backstepping fuzzy controller to tune the coefficients online keeps the computational load moderate. The adaptive laws in this algorithm are designed based on the Lyapunov stability theorem. The method is applied to a continuum robot manipulator to obtain the best performance.
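The snippet below is a schematic sketch of a Lyapunov-style adaptive law of the gradient type, not the paper's derivation; the gain, rule count, and firing strengths are assumptions. It illustrates why online tuning is cheap: only the fuzzy consequent weights are updated each step, not the whole rule base.

```python
# Gradient adaptive law theta_dot = -gamma * e * phi, discretized with step
# dt: fuzzy consequent weights theta are tuned online from tracking error e.
import numpy as np

def adapt_step(theta, phi, e, gamma=0.5, dt=0.01):
    """theta: fuzzy consequent weights; phi: normalized rule firing
    strengths; e: tracking error. Under the usual matching assumptions this
    law keeps a quadratic Lyapunov function non-increasing."""
    return theta - gamma * e * phi * dt

theta = np.zeros(5)                            # 5 hypothetical fuzzy rules
phi = np.array([0.1, 0.3, 0.4, 0.15, 0.05])    # hypothetical firing strengths
theta = adapt_step(theta, phi, e=0.8)
print(theta)
```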
This paper presents a new approach to off-line handwritten numeral recognition based on structural and statistical features. Five types of skeleton features (horizontal and vertical crossings, and end, branch, and cross points), the number of contours in the image, the width-to-height ratio, and distribution features are used for the recognition of numerals. Two vectors are created: the Sample Feature Vector (SFV), which contains the structural and statistical features of the MNIST sample database of handwritten numerals, and the Test Feature Vector (TFV), which contains the structural and statistical features of the MNIST test database. The performance of a digit recognition system depends mainly on the kind of features used. The objective of this paper is to provide efficient and reliable techniques for the recognition of handwritten numerals. A Euclidean minimum-distance criterion is used to find the minimum distances, and a k-nearest-neighbor classifier is used to classify the numerals. The MNIST database is used for both training and testing the system. A total of 5000 numeral images were tested, and the overall accuracy is found to be 98.42%.
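A compact sketch of the classification stage follows: Euclidean distances from a test feature vector (a TFV row) to the sample feature vectors (SFV), followed by a k-nearest-neighbor majority vote. Feature extraction itself is omitted; the random vectors, dimension, and k value are placeholders, not the paper's settings.

```python
# k-NN classification with a Euclidean distance criterion.
import numpy as np

def knn_classify(sfv, labels, tfv_row, k=3):
    """sfv: NxD training features; labels: N digit labels; tfv_row: D features."""
    distances = np.linalg.norm(sfv - tfv_row, axis=1)   # Euclidean distances
    nearest = labels[np.argsort(distances)[:k]]         # k closest samples
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]                    # majority vote

sfv = np.random.rand(100, 8)             # hypothetical 8-dim feature vectors
labels = np.random.randint(0, 10, 100)   # digit classes 0..9
print(knn_classify(sfv, labels, np.random.rand(8)))
```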
This paper presents a new class complexity metric for Object-Oriented (OO) programs, used to predict the understandability of classes. The proposed complexity metric is evaluated theoretically against Weyuker's properties to analyze the nature of the metric, and empirically against three small projects developed by Post Graduate (PG) and Under Graduate (UG) teams. Least-squares regression analysis is performed to arrive at the results and to find the correlation coefficient of the proposed metric with the degree of understandability. The results indicate that the proposed metric is a good predictor of class understandability. The JHawk tool (a Java code metrics tool) was used to evaluate the parameter values involved in the proposed metric, and MATLAB 6.1 and IBM SPSS were used to analyze the project results.
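As a brief illustration of the evaluation step, the sketch below fits a least-squares line relating metric values to understandability scores and computes the Pearson correlation coefficient. The data points are hypothetical, not results from the three PG/UG projects.

```python
# Least-squares fit and correlation between a complexity metric and
# understandability scores (hypothetical data).
import numpy as np

metric = np.array([4.0, 7.5, 9.0, 12.0, 15.5])     # proposed metric values
understand = np.array([8.2, 6.9, 6.1, 4.8, 3.5])   # understandability scores

slope, intercept = np.polyfit(metric, understand, 1)  # least-squares line
r = np.corrcoef(metric, understand)[0, 1]             # correlation coefficient
print(f"fit: y = {slope:.2f}x + {intercept:.2f}, r = {r:.3f}")
```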