IJIEEB Vol. 17, No. 1, Feb. 2025
REGULAR PAPERS
In today's dynamic business environment, the complexity of operations makes modeling Business Processes (BP) critical. BP modeling is an important task for improving processes and achieving business needs and goals. BP modeling techniques make processes more understandable and easily maintainable, which leads to successful Business Process Management (BPM). There are many techniques for expressing and modeling business processes; however, each has its own characteristics, and not all techniques suit every part of a process. It is therefore critical to determine the most suitable modeling technique. The problem of evaluating BP modeling techniques has been addressed by many researchers; however, there remains a need to handle uncertainty and to take the costs and benefits of BP modeling techniques into account during evaluation. This work introduces different types of BP modeling techniques and presents different views of the characteristics, features, and quality criteria of BP modeling techniques that can help the modeler during evaluation. It also adapts and introduces a neutrosophic framework to handle uncertainty and remove confusion when evaluating and selecting a suitable BP modeling technique. Moreover, the proposed framework applies the neutrosophic benefits-and-costs method in a simple way to ease its use and to balance benefits against costs during evaluation. The framework is applied to a real-world case study, and the results show that it can be adopted by business organizations and institutions that need to determine a suitable BP modeling technique to improve their business processes, and that it is helpful for handling uncertainty during the evaluation process.
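The benefits-and-costs ranking idea in this abstract can be sketched briefly. The sketch below is illustrative only: the technique names, the (truth, indeterminacy, falsity) ratings, and the use of the common single-valued neutrosophic score function s = (2 + T - I - F) / 3 are assumptions, not values or formulas taken from the paper.

```python
def svn_score(t, i, f):
    # A common score function for a single-valued neutrosophic number (T, I, F):
    # higher truth and lower indeterminacy/falsity give a higher crisp score.
    return (2 + t - i - f) / 3

# Hypothetical (T, I, F) ratings for benefit and cost of each modeling technique.
techniques = {
    "BPMN":      {"benefit": (0.8, 0.2, 0.1), "cost": (0.3, 0.3, 0.6)},
    "EPC":       {"benefit": (0.6, 0.3, 0.3), "cost": (0.4, 0.2, 0.5)},
    "Flowchart": {"benefit": (0.5, 0.4, 0.4), "cost": (0.2, 0.3, 0.7)},
}

def rank_by_benefit_cost(techs):
    # Rank techniques by benefit-to-cost ratio of their crisp scores.
    ratio = {
        name: svn_score(*r["benefit"]) / svn_score(*r["cost"])
        for name, r in techs.items()
    }
    return sorted(ratio, key=ratio.get, reverse=True)
```

With these made-up ratings, BPMN's high benefit score outweighs its moderate cost, so it ranks first.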
The sudden surge of digitalization escalates the challenges traditional tax systems face in detecting and combating tax evasion, a pivotal concern for the smooth functioning and sustainable development of any nation. The paradigm shift offered by the emergence of new-age technologies presents unprecedented opportunities to tap their potential for administering effective tax systems. In our paper, we provide a systematic scientometric analysis of the existing literature on four focal new-age technologies in combating tax evasion. We also propose a novel holistic framework to understand the intricacies of the multifaceted landscape of tax evasion from technological, ethical, legal, social, and economic (TELSE) perspectives. The research methodology applies quantitative science mapping to analyze research publications from the Web of Science and Scopus databases using Biblioshiny. A total of 117 documents were examined, spanning the last decade (2014-2024). The findings highlight considerable traction in the number of publications from the two most populated countries in the world. The analysis of the most frequent keywords reveals an increasing trend towards the adoption of other new-age technologies as well, and depicts the different factors that affect tax evasion, in line with the varied laws and regulations across countries. Interdisciplinary research efforts need to be aligned to tap the full potential of these technologies and to develop effective intelligent taxation systems that are fair, accountable, and explainable.
This research endeavors to provide a thorough and insightful analysis of Internet of Things (IoT) tools within the context of smart homes. As the IoT continues to revolutionize the domestic landscape, understanding the integration, functionality, and user experience of these tools becomes paramount. The study surveys and categorizes prevalent IoT tools, encompassing sensors, processors, actuators, and databases. Integration capabilities are scrutinized, emphasizing interoperability and compatibility to ascertain the seamless incorporation of diverse IoT tools. The functional roles and contributions of each tool are dissected to illuminate their impact on automation, inter-connectivity, and overall control mechanisms in smart homes. The research extends its gaze to the user experience, exploring factors such as ease of use, reliability, and customization options, shaping a holistic perspective of the tools' impact on residents. Real-world implementations and case studies provide tangible insights into practical applications, while surveys and interviews capture user perspectives, forming a comprehensive view of the challenges and limitations associated with these tools. This study contributes valuable insights for informed decision-making, empowering both users and developers to navigate the evolving landscape of IoT tools within the realm of smart homes.
This research presents a framework that integrates no-code and low-code approaches with AI-driven Python modules for data analysis and visualization, embedded within Jakarta Faces web applications through TCP socket communication. The framework addresses the challenge of enabling non-technical users to perform complex data analysis tasks without requiring extensive programming knowledge. By leveraging Python’s powerful data libraries, the system automates code generation based on user input, offering a seamless environment for data-driven decision-making. The proposed framework demonstrates significant benefits in democratizing access to AI tools, improving development efficiency, and fostering a user-friendly interface for real-time data analysis and visualization. Rigorous testing of the prototype indicates enhanced usability, scalability for moderate-sized datasets, and practical applications across multiple industries, including healthcare and education. This research contributes to the growing body of work on no-code and low-code platforms by offering a novel integration of Python-based data analysis into Java-based web environments, laying the groundwork for more accessible and scalable AI-driven solutions in web development.
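The TCP socket link between the Java web tier and the Python analysis modules can be illustrated with a minimal Python-side sketch. The JSON message shape ("values" in, summary statistics out), the single-request server, and the ephemeral-port handling are all assumptions for illustration, not the paper's actual protocol:

```python
import json
import socket
import threading

def handle(conn):
    # Read one JSON request, compute a tiny "analysis", and reply with JSON.
    with conn:
        req = json.loads(conn.recv(65536).decode("utf-8"))
        values = req["values"]
        result = {"mean": sum(values) / len(values), "max": max(values)}
        conn.sendall(json.dumps(result).encode("utf-8"))

def serve_once(host="127.0.0.1"):
    # Bind to an ephemeral port, serve exactly one connection in a
    # background thread, and return the port for the client to use.
    srv = socket.socket()
    srv.bind((host, 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def run():
        conn, _ = srv.accept()
        handle(conn)
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return port
```

On the Java side, a Jakarta Faces backing bean would open a plain `java.net.Socket` to this port and exchange the same JSON payloads.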
Healthcare IoT seeks to use technology to improve patient care, optimize operational efficiency, and provide remote monitoring and management of health issues. Resource management is crucial in the context of the Health Internet of Things (HIoT), since it enhances the performance of healthcare services. This research paper proposes a resource management model for healthcare IoT using deep learning and bio-inspired algorithms. A deep learning (LSTM) model is used for resource failure prediction, and bio-inspired algorithms are used for resource allocation and load balancing. Accurate prediction of resource utilization and an effective resource management algorithm improve the overall performance of IoT services for healthcare applications. The proposed approach incorporates deep learning methods to identify and anticipate anomalies, enabling the proactive identification of future problems, resource failures, and resource utilization. In addition, bio-inspired algorithms are used to dynamically distribute resources and optimize system performance in real time. The efficacy of the proposed fault-tolerant method is demonstrated by extensive simulations and performance tests. The experimental results show improvements in performance parameters compared to state-of-the-art resource management models.
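The predict-then-rebalance loop this abstract describes can be sketched with lightweight stand-ins: an exponential moving average substitutes for the paper's LSTM predictor, and a greedy migration rule substitutes for its bio-inspired allocator. The node names, utilization histories, and the 0.8 failure threshold are illustrative assumptions:

```python
def ema_forecast(history, alpha=0.5):
    # Exponential moving average: a lightweight stand-in for an LSTM
    # forecaster of a node's next utilization level.
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def rebalance(nodes, threshold=0.8):
    # Greedy stand-in for a bio-inspired allocator: for each node whose
    # forecast utilization exceeds the threshold (a predicted failure risk),
    # plan a migration to the least-loaded healthy node.
    at_risk = {n for n, h in nodes.items() if ema_forecast(h) > threshold}
    plan = []
    for n in sorted(at_risk):
        target = min(
            (m for m in nodes if m not in at_risk),
            key=lambda m: ema_forecast(nodes[m]),
            default=None,
        )
        if target is not None:
            plan.append((n, target))
    return plan
```

The returned plan is a list of (overloaded node, target node) migrations that a scheduler would then apply.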
During work on a system for tonality recognition and text categorization in the news, a study of the subject area was conducted, which enriched the understanding of text-analysis processes in the mass media. The necessary data for further processing was collected. The work resulted in a program consisting of an information parser, a data analyser and cleaner, a Large Language Model, a neural network, and a database with vectorized data. These components were integrated into the user interface and implemented as a program window. The program can analyse news texts, determining their tone and categories, while providing the user with a convenient interface for entering text and receiving analysis results. The created system is therefore a powerful tool for automated analysis of textual data in the mass media, which can be used for purposes including monitoring the news space and analysing public opinion. The developed information technology successfully meets the set tasks of tonality analysis and categorization of news: it effectively collects, analyses, and classifies news materials, allowing users to receive timely and objective information. Its architecture and functionality allow for easy changes and additions in the future, making it a flexible and adaptable tool for news analytics and decision-making in various business sectors.
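The tone-and-category analysis step of such a pipeline can be illustrated with a tiny lexicon-based sketch. This deliberately replaces the abstract's LLM and neural network with keyword matching; the lexicons and categories below are invented for the example:

```python
# Hypothetical tone lexicons and topic keyword sets (not from the paper).
POSITIVE = {"growth", "success", "improve", "win"}
NEGATIVE = {"crisis", "loss", "fraud", "decline"}
CATEGORIES = {
    "economy": {"market", "inflation", "trade", "growth"},
    "politics": {"election", "parliament", "policy"},
    "technology": {"ai", "software", "startup"},
}

def analyse(text):
    # Tokenize crudely, score tone by lexicon overlap, and pick the
    # category with the most keyword hits.
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    tone = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    category = max(CATEGORIES, key=lambda c: len(words & CATEGORIES[c]))
    return tone, category
```

In the full system, the same (tone, category) output contract would come from the LLM and neural-network components rather than lexicons.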
Accurate stock price prediction is crucial for financial markets, where investors and analysts forecast future prices to support informed decision-making. In this study, various methods for integrating two advanced time series prediction models, Gated Recurrent Unit (GRU) and Neural Basis Expansion Analysis Time Series Forecasting (N-BEATS), are explored to enhance stock price prediction accuracy. GRU is recognized for its ability to capture temporal dependencies in sequential data, while N-BEATS is known for handling complex trends and seasonality components. Several integration techniques, including feature fusion, residual learning, ensemble learning, and hybrid modeling, are proposed to leverage the strengths of both models and improve forecasting performance. These methods are evaluated on datasets of ten stocks from the S&P 500, some exhibiting strong seasonal or cyclic patterns and others lacking such characteristics. Feature selection, including the integration of technical indicators, is employed during data processing to further improve prediction accuracy. Results demonstrate that the integrated models consistently outperform the individual models.
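Of the integration techniques named in this abstract, ensemble learning is the simplest to sketch. The inverse-validation-error weighting scheme below is one common choice, shown here as an assumption rather than the paper's specific method; the prediction and error values are made up:

```python
def ensemble(pred_gru, pred_nbeats, val_err_gru, val_err_nbeats):
    # Combine two models' forecasts with weights inversely proportional to
    # each model's validation error, so the more accurate model dominates.
    w_gru = 1.0 / val_err_gru
    w_nb = 1.0 / val_err_nbeats
    total = w_gru + w_nb
    return [(w_gru * g + w_nb * n) / total
            for g, n in zip(pred_gru, pred_nbeats)]
```

With equal validation errors this reduces to a plain average; as one model's error shrinks, the combined forecast converges to that model's output.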