IJISA Vol. 6, No. 1, 8 Dec. 2013
Keywords: Pervasive Computing, Middleware, Evaluation, Smart Environment
Pervasive computing aims to develop smart environments in which users interact seamlessly with the devices around them. Such environments rely on middleware to support interoperability, heterogeneity, and self-management across different platforms, and to provide efficient communication and context awareness among devices. Pervasive middleware focuses chiefly on coordinating the devices in a smart environment, and evaluating it is a challenging endeavor: the scope of evaluation grows with the variety of devices involved. In this paper, evaluation metrics are proposed based on the contexts available in the environment, how the devices are used, and the security and autonomy of smart applications. These metrics are then used to evaluate different kinds of smart applications.
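As a purely illustrative sketch (not taken from the paper), the four metric categories named above (context availability, device usage, security, and autonomy) could be combined into a single weighted score per application; the function name, the 0-1 scale, and the equal default weights are all assumptions:

```python
def middleware_score(metrics, weights=None):
    """Combine per-category scores (each assumed to lie in [0, 1])
    into a weighted mean; categories follow the paper's abstract."""
    categories = ("context", "device_usage", "security", "autonomy")
    if weights is None:
        weights = {c: 1.0 for c in categories}  # equal weighting by default
    total_weight = sum(weights[c] for c in categories)
    return sum(metrics[c] * weights[c] for c in categories) / total_weight

# Example: a hypothetical smart-home application scored on each category;
# with equal weights the result is the plain average (roughly 0.7 here).
scores = {"context": 0.8, "device_usage": 0.6, "security": 0.9, "autonomy": 0.5}
print(middleware_score(scores))
```

A weighted mean is only one possible aggregation; the paper's actual metrics may be qualitative or per-category rather than a single composite value.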
J. Madhusudanan, V. Prasanna Venkatesan, "Metrics for Evaluating Pervasive Middleware", International Journal of Intelligent Systems and Applications (IJISA), vol. 6, no. 1, pp. 58-63, 2014. DOI: 10.5815/ijisa.2014.01.07