Workplace: JNU/School of Computer & Systems Sciences, New Delhi, 110003, India
E-mail: schand20@gmail.com
Research Interests: Image Manipulation, Image Compression, Computer Systems and Computational Processes, Graph and Image Processing
Biography
Satish Chand did his M.Sc. in Mathematics from the Indian Institute of Technology (IIT) Kanpur, India, his M.Tech. in Computer Science from the Indian Institute of Technology (IIT) Kharagpur, India, and his Ph.D. from Jawaharlal Nehru University (JNU), New Delhi, India.
Presently, he is a Professor in the School of Computer & Systems Sciences at Jawaharlal Nehru University (JNU), New Delhi, India. His areas of research interest are Multimedia Broadcasting, Networking, Video-on-Demand, Cryptography, Requirements Prioritization, and Image Processing.
By Deepak Sharma, Bijendra Kumar, Satish Chand
DOI: https://doi.org/10.5815/ijisa.2019.02.08, Pub. Date: 8 Feb. 2019
This paper systematically examines the machine learning literature for the period 1968 to 2017 to identify and analyze research trends. A list of journals from well-established publishers (ScienceDirect, Springer, JMLR, and IEEE), amounting to approximately 23,365 journal articles related to machine learning, is used to prepare the content collection. To the best of our knowledge, this is the first effort to analyze trends in machine learning research with topic models: Latent Semantic Analysis (LSA), Latent Dirichlet Allocation (LDA), and LDA with a Coherence Model (LDA_CM). The LDA_CM topic model gives the highest topic coherence among all topic models under consideration. This study provides a scientific ground that helps overcome the subjectivity of collective opinion. The Mann-Kendall test is used to assess the trend of each topic. Our findings indicate paradigmatic shifts in research methodology, significant patterns of topical prominence, and evolving research areas, and they highlight how earlier research topics in machine learning have developed into more recent ones. Understanding this intellectual structure and the associated future trends will help researchers follow the divergent developments of this field in one place. The paper analyzes the overall trends of machine learning research since 1968, based on the latent topics identified in the period 2007 to 2017, which may help researchers explore the recommended areas and publish their research articles. [...]
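As an illustrative sketch of the topic-modeling workflow described in this abstract, the snippet below fits a standard LDA model with gensim, scores it with c_v topic coherence, and computes a Mann-Kendall S statistic on a toy yearly topic-share series. The corpus, topic count, and yearly shares are placeholder values, and the paper's LDA_CM variant is not reproduced here; only generic LDA-plus-coherence and trend-test steps are shown.

```python
import numpy as np
from gensim.corpora import Dictionary
from gensim.models import CoherenceModel, LdaModel

# Toy tokenized corpus standing in for the ~23k article abstracts (hypothetical data)
docs = [["neural", "network", "training"],
        ["support", "vector", "machine", "kernel"],
        ["neural", "deep", "learning", "network"],
        ["kernel", "machine", "classification"]]

dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]

# Plain LDA; the number of topics would normally be chosen by comparing coherence scores
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
               random_state=42, passes=10)

# c_v topic coherence, the score usually used to compare candidate models
cm = CoherenceModel(model=lda, texts=docs, dictionary=dictionary, coherence="c_v")
print("topic coherence:", cm.get_coherence())

# Mann-Kendall S statistic on a toy yearly topic-share series;
# a positive S suggests an upward (hot) trend, a negative S a cooling one
share = np.array([0.10, 0.12, 0.11, 0.15, 0.18, 0.21])
S = sum(np.sign(share[j] - share[i])
        for i in range(len(share)) for j in range(i + 1, len(share)))
print("Mann-Kendall S:", S)
```

In practice the coherence score would be compared across candidate topic counts, and the trend test applied separately to each topic's share per year.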
By Yash Veer Singh, Bijendra Kumar, Satish Chand
DOI: https://doi.org/10.5815/ijisa.2019.01.02, Pub. Date: 8 Jan. 2019
Requirements prioritization, the activity of ranking requirements in order of priority, is a crucial phase of requirements engineering in the software development process. This research introduces an MCDM (multi-criteria decision-making) model for requirements prioritization. Three important criteria are used to select the best washing machine supplier firm. To investigate the proposed model, a case study adapted from Ozcan et al. is carried out using a Log FAHP (logarithmic fuzzy analytic hierarchy process) and ANN (artificial neural network) based model to choose the supplier firm giving the highest client satisfaction across all technical aspects. The experiments were conducted in MATLAB, and the results, evaluated on a fuzzy comparison matrix with three supplier-selection criteria based on FAHP and LOGANFIS, show that the decision-making outcome for requirements prioritization is better than that of existing approaches. [...]
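The sketch below illustrates, under stated assumptions, how triangular fuzzy pairwise comparisons over three supplier-selection criteria can be turned into crisp priority weights. It uses Buckley's fuzzy geometric-mean method as a stand-in for the paper's Log FAHP formulation, and the comparison values are illustrative rather than taken from the case study.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for 3 supplier-selection
# criteria; the numbers are illustrative, not taken from the paper.
A = np.array([
    [[1, 1, 1],       [2, 3, 4],       [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1],   [1, 1, 1]],
])

n = A.shape[0]
# Fuzzy geometric mean of each row (Buckley's method), component-wise on (l, m, u)
r = np.prod(A, axis=1) ** (1.0 / n)      # shape (n, 3)
total = r.sum(axis=0)                    # fuzzy sum over all criteria
w_fuzzy = r / total[::-1]                # multiply by the fuzzy inverse (1/U, 1/M, 1/L)
w_crisp = w_fuzzy.mean(axis=1)           # centroid defuzzification
weights = w_crisp / w_crisp.sum()        # normalized crisp criterion priorities
print("criterion weights:", np.round(weights, 3))
```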
By Yash Veer Singh, Bijendra Kumar, Satish Chand, Jitendra Kumar
DOI: https://doi.org/10.5815/ijitcs.2018.04.06, Pub. Date: 8 Apr. 2018
Requirements prioritization is an essential component of software release planning and requirements engineering. In requirements engineering, requirements are ordered by priority using prioritization techniques so that high-quality software can be developed. Prioritization also helps decision makers decide which set of requirements should be implemented first. In a software development organization, a project may have a large number of requirements, and prioritizing them is very difficult when stakeholders' priorities are expressed as linguistic variables. This paper presents a comparative analysis of seven existing techniques against aspects such as scale of prioritization, scalability, time complexity, ease of use, accuracy, and decision making. The literature survey shows that none of these techniques can be considered the best; they suffer from drawbacks such as high time complexity, lack of scalability, negative degrees of membership, high inconsistency ratios, rank updates during requirement development, and conflicts among stakeholders. This paper proposes a model, called the ANN Fuzzy AHP model, for requirements prioritization that overcomes these limitations. To investigate the proposed model, a case study is implemented following Ozcan et al. [31], who use a FAHP (fuzzy AHP) with ANN based technique to choose the best supplier under multiple criteria. The ANN with FAHP experiments are performed in MATLAB, and the outcome, evaluated with a fuzzy pairwise comparison matrix over three supplier-selection criteria, shows that the requirements prioritization result is better than that of existing techniques. [...]
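To illustrate the ANN side of the proposed ANN Fuzzy AHP model, the following minimal sketch trains a small scikit-learn neural network to map criterion scores to FAHP-derived priorities and then ranks new items by the predicted priority. All data values, the network size, and the use of scikit-learn (rather than the MATLAB toolchain used in the paper) are assumptions made for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: each row scores one requirement against the
# three selection criteria (already normalized); targets are FAHP priorities.
X = np.array([[0.8, 0.6, 0.4],
              [0.3, 0.9, 0.5],
              [0.5, 0.4, 0.9],
              [0.7, 0.7, 0.6],
              [0.2, 0.3, 0.4]])
y = np.array([0.71, 0.55, 0.60, 0.68, 0.30])   # FAHP-derived priority scores (toy values)

# Small feed-forward network learning the criterion-score -> priority mapping
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

# Rank two new requirements by their predicted priority (highest first)
new = np.array([[0.9, 0.5, 0.5], [0.4, 0.4, 0.8]])
pred = model.predict(new)
order = np.argsort(-pred)
print("predicted priorities:", np.round(pred, 3), "ranking:", order)
```

Once trained, such a network can rank newly arriving requirements without repeating the full pairwise-comparison exercise, which is the motivation for combining the ANN with FAHP.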
By Saurabh Agarwal, Satish Chand
DOI: https://doi.org/10.5815/ijigsp.2015.12.02, Pub. Date: 8 Nov. 2015
Quantization artifacts and blocking artifacts are the two significant properties used for identifying forgery in a JPEG compressed image. Some techniques can remove these artifacts from JPEG compressed images, leaving no traces of forgery; such methods are referred to as anti-forensic methods. A forger may also perform post-operations that disturb the underlying statistics of JPEG images in order to fool current forensic techniques, but these operations introduce noise and reduce image quality. In this paper, we apply three interpolation techniques, namely nearest neighbor, bilinear, and bicubic, to remove JPEG artifacts. The experimental results show that the bicubic interpolated images are of better quality than the nearest neighbor and bilinear interpolated images while containing no JPEG artifacts. Three popular quality metrics are used to analyze the quality of the interpolated images. The proposed method is very simple to perform, and this interpolation based method is applicable to both single and double JPEG compression. [...]
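A minimal sketch of the interpolation step is shown below: the image is resampled with nearest neighbor, bilinear, and bicubic kernels using OpenCV and compared to the original with PSNR as one simple quality metric. The upscale-then-downscale round trip, the scale factor, and the file path are illustrative assumptions rather than the authors' exact procedure.

```python
import cv2

# Load a JPEG-compressed image (the path is a placeholder)
img = cv2.imread("compressed.jpg")
h, w = img.shape[:2]

# Resample up and back down with each interpolation kernel; resampling
# smooths the 8x8 blocking and quantization artifacts left by JPEG.
for name, flag in [("nearest", cv2.INTER_NEAREST),
                   ("bilinear", cv2.INTER_LINEAR),
                   ("bicubic", cv2.INTER_CUBIC)]:
    up = cv2.resize(img, (2 * w, 2 * h), interpolation=flag)
    down = cv2.resize(up, (w, h), interpolation=flag)
    # PSNR against the original as one simple quality check
    print(name, "PSNR:", round(cv2.PSNR(img, down), 2))
```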
By Saurabh Agarwal, Satish Chand
DOI: https://doi.org/10.5815/ijigsp.2015.10.08, Pub. Date: 8 Sep. 2015
Performing digital image forgery is very easy due to highly precise image editing tools, so there is a concomitant need for a mechanism to differentiate between a forged image and the original. In this paper, we propose a passive image forgery detection method that uses an entropy filter and the local phase quantization (LPQ) texture operator. The entropy filter generally highlights the boundaries of forged regions because it measures the randomness of a pixel within its local neighborhood, while the LPQ operator captures internal statistics of the image based on phase information. We apply the entropy filter over neighborhoods of different sizes, followed by the LPQ operator, on the CASIA v1.0, CASIA v2.0, and Columbia image forgery evaluation databases. We consider these databases because they are standard and have been used by most existing methods. Our method provides promising results on both CASIA databases, and its results on the Columbia database are comparable with those of the existing state-of-the-art methods. [...]
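The snippet below sketches only the entropy-filter stage of such a pipeline: local entropy is computed over neighborhoods of several sizes with scikit-image, which is where spliced regions tend to reveal boundary discontinuities. The file path and neighborhood radii are placeholders, and the LPQ descriptor and classifier stages are only indicated in a comment, since LPQ is not available in scikit-image.

```python
import numpy as np
from skimage import color, io
from skimage.filters.rank import entropy
from skimage.morphology import disk

# Load the questioned image (placeholder path, assumed RGB) and convert to uint8 grayscale
img = color.rgb2gray(io.imread("questioned.png"))
gray = (img * 255).astype(np.uint8)

# Local entropy over neighborhoods of different sizes; forged (spliced)
# regions tend to show discontinuities in local randomness at their edges.
for radius in (3, 5, 7):
    ent = entropy(gray, disk(radius))
    print(f"radius {radius}: mean entropy {ent.mean():.3f}")

# An LPQ texture descriptor would then be computed on the entropy maps and
# fed to a classifier (e.g., an SVM); that stage is omitted here.
```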