Authors - Mazdak Zamani Abstract - As generative AI (GenAI) becomes increasingly accessible and integrated into software development and education, there is a growing tendency to apply AI-based solutions—even when traditional, deterministic algorithms would be more appropriate. This paper discusses the critical differences between AI-based heuristic methods and deterministic algorithmic approaches, and argues for responsible, context-aware deployment of AI in problem-solving. Misusing GenAI in domains that demand precision, efficiency, or guaranteed optimality can lead to inefficient or even incorrect solutions, undermining both technical integrity and academic rigor.
Authors - Mmatshipi Chaba, Khutso Lebea Abstract - 5G technology is transforming communication, delivering exceptional speed, low latency, and enhanced connectivity that make it essential to the future of various industries. This research highlights the evolution and key components of 5G, and a case study demonstrates its effectiveness, supported by concrete evidence of its transformative impact. The paper further explores the policies governing 5G deployment, emphasising their critical role in ensuring security, privacy, and equitable access. Additionally, the advantages of 5G in improving operational efficiency within physical systems are highlighted.
Authors - Hiep. L. Thi Abstract - The Elliptic Curve Digital Signature Algorithm (ECDSA) has become a cornerstone in modern cryptographic applications, offering efficient and secure digital signatures. This article reviews recent advancements and notable events related to ECDSA, highlighting its growing adoption and addressing security considerations.
Authors - Lorentz Jantschi Abstract - A multiple-choice assessment system was created and used from 2006 to 2024 to assess over 3,000 students in topics of chemistry. The system supported multiple assessments: each student had the opportunity to assess themselves several times. When determining the grade, for each student's set of assessments the weakest assessment was eliminated (when the student had more than one) and the grade was calculated as the average of the remaining assessments. Each test contained 30 multiple-choice questions, and each correct answer was awarded 3 points. A previous study reported the observed distribution of students by number of assessments. In the present study, the sample of the number of assessments per student was analyzed under three theoretical distribution hypotheses: the classical Poisson and two of its generalizations, the Pólya–Aeppli–Poisson and the Snyder–Ord–Beaumont distributions. The study was motivated by the fact that, once the theoretical distribution is identified, it is possible to estimate the number of students who dropped out before the assessment. The study produced a negative result: all of these distributions were rejected at the 5% significance level.
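To make the grading rule above concrete, here is a minimal sketch (our illustration, not the author's code; the scores are hypothetical) of the drop-the-weakest-then-average procedure for 30-question tests scored at 3 points per correct answer:

```python
# Hypothetical sketch of the grading rule described in the abstract:
# each test has 30 multiple-choice questions worth 3 points each;
# with multiple assessments, the weakest one is dropped before averaging.

def test_score(num_correct: int) -> int:
    """Score a single 30-question test at 3 points per correct answer."""
    assert 0 <= num_correct <= 30
    return 3 * num_correct

def final_grade(scores: list[float]) -> float:
    """Drop the weakest assessment (only if there is more than one),
    then average the remaining scores."""
    if len(scores) > 1:
        scores = sorted(scores)[1:]  # eliminate the minimum
    return sum(scores) / len(scores)

print(final_grade([test_score(20), test_score(25), test_score(28)]))  # 79.5
```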
Authors - Maen Hammad, Hamza Hanandeh, Ahmed Fawzi Otoom Abstract - In software development processes, software project management plays a significant role in accomplishing successful projects. One major activity in project management is forecasting the project expenses, time duration, and resources that need to be determined and allocated in advance. This paper proposes an automated prediction model to predict the effort, in terms of months, required to complete a software project. The model applies different machine learning algorithms to predict the required effort. A set of experiments was applied to the ISBSG dataset. The results show that the X-Gradient Boosting and Gradient Boosting algorithms produce the best classification results, while Logistic Regression and SVM produce the lowest accuracy. The results also show the positive impact of the feature selection process on classifier accuracy; the goal is to reduce the project's features to the most influential ones in the prediction process.
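As a rough illustration of the kind of pipeline this abstract describes (not the authors' actual experiment; the data, feature count, and selector below are placeholders), gradient boosting with a feature-selection step might look like:

```python
# Minimal sketch: effort-class prediction with Gradient Boosting plus a
# simple feature-selection step, in the spirit of the experiments described.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))          # stand-in for ISBSG project features
y = rng.integers(0, 3, size=500)        # stand-in effort classes (in months)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Pipeline([
    ("select", SelectKBest(f_classif, k=8)),   # keep the most influential features
    ("gb", GradientBoostingClassifier(random_state=0)),
])
model.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```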
Authors - Mustafa Hammad, Maleeha Ismail, Fadheela Hussain Abstract - Choosing risk-free loan applicants presents a significant challenge for the banking industry, as the process is lengthy, resource-intensive, and prone to human error. Machine learning (ML) offers a promising approach to predict creditworthiness by learning patterns from historical data. In this study, we implemented four machine learning classifiers—Logistic Regression, Random Forest, Multilayer Perceptron, and Naïve Bayes—to predict the creditworthiness of loan applicants. Our results demonstrated that the Naïve Bayes classifier achieved the highest performance, with a precision of 83.3% and an F-measure of 79.2%, making it the most effective among the tested models. Feature selection and stacking techniques were explored but showed minimal improvements, with accuracy gains of less than 1%. These findings suggest that simpler ML models, like Naïve Bayes, may effectively address creditworthiness prediction without the need for complex ensembles. This research supports the application of ML for faster, more accurate loan processing in the banking industry, with potential to reduce credit risk and streamline loan approval decisions.
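A minimal sketch of the best-performing setup, assuming scikit-learn and synthetic stand-in data (the real loan features and labels are not reproduced here):

```python
# Illustrative only: a Gaussian Naive Bayes credit classifier with the
# precision/F-measure metrics reported in the study. The synthetic data
# merely stands in for historical loan records.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, f1_score

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 10))                         # applicant features (placeholder)
y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)   # 1 = creditworthy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
clf = GaussianNB().fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("precision:", precision_score(y_te, pred))
print("F-measure:", f1_score(y_te, pred))
```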
Authors - Phillip G. Bradford, Henry Orphys, Dmitry Udler, Nadia Udler Abstract - This paper discusses a bi-modal AI system applied to legal reasoning for tax law. The results given here are very general and apply to similar systems beyond tax law. These results use the downward and upward Löwenheim–Skolem theorems to contrast the two modalities of this AI approach to tax law. One modality focuses on the syntax of proofs and the other on logical semantics. In particular, one modality uses a rule-based theorem-proving system to perform legal reasoning. The objective of this theorem-proving system is to provide proofs as evidence of valid legal reasoning; these proofs are syntactic structures that can be presented in court. The second modality uses large language models (LLMs). One objective of our application of LLMs is to enhance and simplify user input and output for the theorem-proving system. In addition, the LLMs may help translate natural-language tax law into logic for the theorem-proving system. The combination of these two modalities empowers our vision: the LLMs leverage notions of semantics for tax law inputs and may help translate tax law statutes, while the theorem-proving system gives syntactic proof trees for legal arguments as potential justifications of tax statements.
Authors - Shahad Al-Tamimi, Qasem Abu Al-Haija, Abdullah AlShuaibi Abstract - Serverless computing (SC), specifically Function-as-a-Service (FaaS), provides scalability and operational flexibility while posing substantial security concerns. This study investigates enhanced detection and response strategies for SC contexts, focusing on real-time threat monitoring, anomaly identification, and automated incident response. The report identifies typical vulnerabilities, such as misconfigurations and unsecured Application Programming Interfaces (APIs), while emphasizing the need for complete, integrated security policies to combat emerging threats. The study examines these difficulties and proposes robust solutions to improve the resilience and security of serverless systems.
Authors - Kelebogile Kgwadi, Timothy T Adeliyi Abstract - As technology advances rapidly, businesses increasingly rely on big data to drive digital transformation and gain a competitive edge. Big data enables organizations to understand customer needs better and enhance decision-making processes, delivering significant benefits. However, effective decision-making requires the meaningful processing of information, making the identification and application of critical success factors (CSFs) essential for the successful adoption of big data applications. These CSFs help organizations overcome challenges and leverage opportunities presented by big data adoption. This study aims to identify and analyze the critical success factors influencing the effective adoption of big data applications in business organizations. A systematic literature review was conducted, guided by the PRISMA framework, covering articles published between 2010 and 2023. The process involved thorough database searches, article screening, and selection based on predefined criteria. A total of 98 articles were meticulously analyzed using Principal Component Analysis, with R Studio and WEKA employed for statistical analysis. The findings identified privacy management, security, access control, data governance, and availability as the top five factors impacting big data adoption. These insights informed the development of a framework emphasizing the pivotal role of CSFs in the seamless deployment and utilization of big data technologies. This study provides valuable guidance for information technology experts in business organizations, highlighting the key factors necessary for successful big data implementation. Ultimately, it contributes to advancing knowledge in this critical field, offering actionable strategies for effective big data integration.
Authors - Prawit Chumchu, Kailas Patil, Alfa Nyandoro Abstract - Automated cannabis cultivation faces considerable challenges, primarily due to complex environmental control requirements and timely disease detection. To address these issues, this study develops and evaluates a novel cannabis growing system integrated with Artificial Intelligence of Things (AIoT), aimed at automating environmental parameters and disease management. The proposed AIoT-based system autonomously regulates humidity, temperature, lighting, nutrients, water, and medicinal treatments to optimize cannabis plant health. Central to this automation is an intelligent disease detection module leveraging deep learning techniques capable of classifying cannabis leaf conditions into five categories: Healthy, Malnutrition, Red Spider Mites, Bacterial Spot, and High Temperature Stress. Initially, we assessed five pretrained convolutional neural networks (CNNs): InceptionV3, Xception, ResNet50, ResNet50V2, and ResNet152V2, which achieved accuracies up to 100% after 100 epochs. Subsequently, we developed simplified CNN models specifically optimized for deployment on low-cost edge devices, such as the Raspberry Pi, achieving the same high accuracy (100%) while significantly reducing computational complexity. These optimized models were integrated into our AIoT system, successfully automating real-time adjustments of critical growing conditions. Our findings underscore the potential of AIoT technologies in transforming cannabis agriculture by providing accurate, efficient, and scalable solutions for disease detection and cultivation management, thus enabling broader adoption of intelligent, automated agricultural practices.
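A minimal transfer-learning sketch in Keras, assuming one of the pretrained backbones named above (ResNet50V2) and a five-class softmax head; the authors' exact data pipeline, optimizer, and 100-epoch schedule are not reproduced here:

```python
# Sketch under stated assumptions: freeze an ImageNet-pretrained backbone
# and train a small head for the five leaf-condition classes.
from tensorflow import keras
from tensorflow.keras import layers

base = keras.applications.ResNet50V2(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                       # freeze pretrained features

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(5, activation="softmax"),   # Healthy, Malnutrition, Red Spider
                                             # Mites, Bacterial Spot, Heat Stress
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```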
Authors - Carlos Roberto Franca Abstract - This paper presents mathematical challenges posed to the main generative AIs currently available, both free and subscription-based. During the first two months of 2025 (January and February) and until March 9, challenges involving original and unpublished mathematical formulas, titled Infinite Series with Multiple Ratios (SRMs), were presented. The proposer has been working on this research since 1996. The main objective of the paper is to present how the generative AI agents from OpenAI (free and paid ChatGPT versions), DeepSeek R1 (released in January 2025), Gemini Advanced 2.0 Flash (a subscription-based generative AI from Google), and the free generative AIs Grok 3 (from xAI) and Claude 3.7 Sonnet (from Anthropic) performed. Several challenges were posed, and this paper focuses on the challenge related to computational biology, whose resolution relies on original and unpublished formulas. It is believed that several fields can be impacted by SRMs. The article presents the performance of generative AIs when faced with specific questions that require advanced knowledge of mathematics and computational biology, across three distinct stages. In the first stage, the problem is presented with didactic support and an introduction to the concepts involved. In the second stage, the didactic material is removed. In the third stage, the full article published by Indus Foundations is presented, the questions are posed again, and it is verified how well the AIs discern what is being asked and how autonomously and efficiently they work with the available concepts.
Authors - Omar N. Elayan, Qussai M. Yaseen, Ahmed S. Shatnawi Abstract - This paper proposes a machine-learning model using static and dynamic features to identify Windows malware. The paper uses a new dataset of 12,158 Portable Executable (PE) files for the Windows operating system: 5,936 malicious files belonging to nine malware families and 6,222 benign files. The main features of the files were extracted from Application Programming Interface (API) calls by three well-known methods: static analysis using Python, dynamic analysis using Cuckoo Sandbox, and a hybrid combining the two, to determine which approach is more effective and accurate in detecting malicious files. The proposed model performs binary and multi-class classification to classify malicious files into nine types. The experiments show that Extra-Trees outperformed the other classifiers, achieving an accuracy of 100% in binary classification and 97% in multiclass classification.
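For illustration, an Extra-Trees classifier run over API-based feature vectors for both the binary and nine-family tasks might be sketched as follows (placeholder data, not the paper's dataset or exact configuration):

```python
# Sketch under stated assumptions: ExtraTrees on API-call indicator vectors,
# once for binary (malware vs. benign) and once for nine malware families.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(2000, 300))    # placeholder API-call indicators
y_binary = rng.integers(0, 2, size=2000)    # 0 = benign, 1 = malicious
y_family = rng.integers(0, 9, size=2000)    # nine malware families

for name, y in [("binary", y_binary), ("multiclass", y_family)]:
    clf = ExtraTreesClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(name, "accuracy:", scores.mean())
```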
Authors - Ioannis Patias Abstract - The wide expansion of artificial intelligence (AI) agents and tools necessitates computational paradigms that can address the inherent limitations of centralized cloud-based architectures. Edge computing emerges as a critical enabler, providing distributed processing capabilities that are essential for real-time decision-making, reduced latency, and enhanced data privacy. This paper examines the fundamental reasons why edge architecture plays such a central role in powering AI agents and tools, and delves into the mechanisms through which this integration occurs. The advantages of edge-based AI are analyzed, including localized inference, reduced bandwidth consumption, and improved resilience. Furthermore, we explore the architectural considerations and technological advancements that facilitate the deployment of AI models at the edge, such as optimized model compression, hardware acceleration, and federated learning. Through a synthesis of existing literature and an analysis of practical applications, this paper demonstrates the transformative potential of edge architecture in shaping the future of AI agents and tools, and proposes a simplified latency model. The rise of IoT devices and the need for immediate, localized decisions make edge AI a necessity.
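The abstract mentions a simplified latency model without stating it; the following is one common form such a model takes (an assumption on our part, not the paper's actual equation), contrasting round-trip cloud latency with purely local edge inference:

```latex
% Illustrative only: not the paper's model.
% T_cloud: end-to-end latency of cloud offloading; T_edge: local inference.
\[
T_{\mathrm{cloud}} \;=\; \frac{D_{\mathrm{req}}}{B_{\mathrm{up}}}
  \;+\; t_{\mathrm{proc}}^{\mathrm{cloud}}
  \;+\; \frac{D_{\mathrm{resp}}}{B_{\mathrm{down}}}
  \;+\; 2\,t_{\mathrm{prop}},
\qquad
T_{\mathrm{edge}} \;=\; t_{\mathrm{proc}}^{\mathrm{edge}},
\]
% where D_req and D_resp are payload sizes, B_up and B_down link bandwidths,
% and t_prop the one-way propagation delay; the edge pays only local compute.
```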
Authors - Hicham ESSAMRI, Abderrahim BAJIT, Khalid BOUALI, Hamza BENZZINE, Yasmine ACHOUR, Mohamed Nabil SRIFI, Rachid EL BOUAYADI Abstract - In IoT platforms [1-4], most IoT nodes [4] include sensors that separately collect data and send them to the Cloud for storage and specific processing. If, for example, there are five sensors, the platform is expected to send their values one after another, which incurs significant costs in execution time, energy consumption, and communication conflicts. These costs grow considerably when the data must also be secured and encrypted to protect its integrity. To reduce this consumption, to avoid communication conflicts and congestion at the IoT platform level, and especially to compensate for the size of the IoT PAYLOAD [4] that results from its encryption, we have designed a real-time IoT node that is compact, optimal, and synchronous with the Cloud. Our idea is to store all the data from the different sensors, compact them into a single QR code image file, visually compress this file, and ensure integrity protection using light encryption before sending it on demand and synchronously to the Cloud.
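A minimal sketch of this compaction idea, assuming the Python qrcode and cryptography packages (our illustration; the authors' actual compression and light-encryption schemes are not specified in the abstract):

```python
# Sketch: pack several sensor readings into one JSON payload, lightly
# encrypt it, and render it as a single QR code image for transmission.
import json
from cryptography.fernet import Fernet
import qrcode

readings = {"temp": 24.1, "hum": 61.0, "co2": 415,
            "light": 320, "soil": 6.8}         # five sensors, one payload

key = Fernet.generate_key()                     # lightweight symmetric key
token = Fernet(key).encrypt(json.dumps(readings).encode())

img = qrcode.make(token.decode())               # one compact image to send
img.save("payload_qr.png")
```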
Authors - Md. Omer Faruq, Mania Sultana, Taspia Akter Epou, Md. Tareq Hossain, Md. Motaharul Islam Abstract - The rapid growth of software systems has increased energy usage, creating environmental and financial problems. Poorly designed software, heavy algorithms, and unnecessary processes waste energy, raising costs and adding to pollution. To address this, our research focuses on making software more energy-efficient without losing performance. We use tools like Intel Power Gadget to measure how much energy software uses and suggest ways to improve it, such as optimizing algorithms, managing memory efficiently, and cleaning up code. These techniques provide simple, practical steps for developers to create energy-saving software that supports sustainability and reduces costs.
Authors - Hiep. L. Thi Abstract - The Edwards-curve Digital Signature Algorithm (EdDSA) is a modern cryptographic signature scheme that provides high security and efficiency. This paper reviews recent advancements in EdDSA, highlighting its mathematical foundations, implementation considerations, and its growing adoption in various applications such as blockchain, IoT, and secure communication. Security challenges and potential improvements are also discussed.
Authors - Hiba GAIZI, Abderrahim BAJIT, Hamza BENZZINE, Youness ZAHID, Hicham ESSAMRI, Mohamed Nabil SRIFI, Rachid EL BOUAYADI Abstract - The growing demand for sustainable agricultural practices is driven by urgent global environmental issues. Greenhouses provide controlled settings that improve plant productivity through technological innovations. A promising approach to addressing these challenges is the integration of smart greenhouses with mobile IoT nodes equipped with autonomous navigation capabilities. This article introduces an advanced mobile node that autonomously follows a predefined path in the greenhouse using sophisticated computer vision techniques and deep learning models. The mobile node utilizes convolutional neural networks (CNN) to precisely track the path and strategically pause at each plant, collecting comprehensive subjective and objective data, thus enhancing the conventional functionality of IoT nodes. Agricultural IoT devices play a pivotal role in data presentation and connectivity via wired and wireless synchronized communication systems, though data security continues to be a persistent issue. By leveraging computational intelligence, the system compensates for lost and inaccurate sensor data, producing critical forecasts through advanced data analysis tools, allowing farmers to make informed decisions and enhance overall performance. The mobile node not only gathers data but also operates as the master I2C controller [1], overseeing communication with various I2C slaves and ensuring efficient data exchange between the Cloud and IoT nodes. The methodologies described here enable the optimization of both objective and subjective PAYLOADs, significantly improving data analysis, predictive accuracy, energy efficiency, and overall system performance. This article outlines these techniques, highlighting their impact on reducing data transmission time and improving system effectiveness in smart greenhouse environments.
Authors - Hiep. L. Thi Abstract - Cryptocurrency wallets play a crucial role in the security, usability, and accessibility of digital assets. This paper explores different types of cryptocurrency wallets, their underlying cryptographic mechanisms, and their applications in financial transactions, decentralized finance (DeFi), and secure identity management. We also discuss security challenges and future developments in wallet technologies.
Authors - Danilo Menegatti, Alessandro Giuseppi, Antonio Pietrabissa Abstract - To overcome one of the main limitations of federated learning, namely the non-negligible communication overhead between the clients and the server, the present work proposes a novel federated scheme based on principles envisaged by semantic communications. The proposed semantic-based dimensionality reduction algorithm reduces the data exchanges by more than one order of magnitude with negligible performance loss. The effectiveness of the proposed approach is validated in a classification scenario leveraging transfer learning.
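The abstract does not detail the semantic reduction algorithm itself; as a stand-in illustration of the general idea (PCA here, not the authors' method), a client can project its flattened model update onto a shared low-dimensional basis and transmit only the coefficients:

```python
# Stand-in illustration (PCA, not the authors' semantic algorithm): a client
# projects its flattened model update into a shared low-dimensional basis,
# transmits the coefficients, and the server reconstructs an approximation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
updates = rng.normal(size=(200, 10_000))  # past flattened updates (placeholder)

pca = PCA(n_components=32).fit(updates)   # shared low-dimensional basis
new_update = rng.normal(size=(1, 10_000))
coeffs = pca.transform(new_update)        # what the client actually sends
approx = pca.inverse_transform(coeffs)    # server-side reconstruction
print("compression ratio:", new_update.size / coeffs.size)
print("relative error:",
      np.linalg.norm(new_update - approx) / np.linalg.norm(new_update))
```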
Authors - Phillip Ma, Yijun Shao, Yan Cheng, Youxuan Ling, Qing Zeng-Treitler, Stuart J. Nelson Abstract - We explore the mental health of the elderly in relation to social determinants of health (SDOH) during the COVID pandemic. Factors such as physical isolation and social disconnectedness, resulting from interventions such as home isolation and restricted visits to nursing homes, are considered potential contributors to the decline in mental health. By leveraging data from the All of Us Research Program, we explore the relationship between mental health outcomes and SDOH among a large cohort of elderly participants. Male gender and reporting being African American appear to be associated with relatively better mental health outcomes, whereas being divorced, separated, or widowed is linked to poorer mental health. Income and education demonstrate inconsistent effects on mental health outcomes. The deep neural network models outperform regression models on the same outcomes by modeling more complex relationships between variables. Predictive performance remains modest for COVID-related anxiety and general well-being.
Authors - Lethabo Mahlase, Daniel Ogwok Abstract - In this paper, we present a maze-solving system that integrates image preprocessing techniques, graph-based modelling, and pathfinding algorithms to identify the shortest paths through a maze. Preprocessing steps, including adaptive thresholding and median filtering, ensure accurate graph construction from images. The graph is built using a grid-based approach, balancing computational efficiency with structural accuracy. Three algorithms, Breadth-First Search, A*, and Ant Colony Optimization (ACO), are implemented and compared. Results show that Breadth-First Search offers the fastest solution for smaller, simple mazes, while A* minimises node exploration, providing the shortest paths and excelling in complex environments. ACO, though slower, demonstrates adaptability in scenarios where dynamic pathfinding is required. The system's performance illustrates its potential for search and rescue applications, where fast and efficient pathfinding is essential. Future work will focus on optimising preprocessing times, refining algorithm performance, and exploring real-time data integration to enhance the system's applicability in real-life scenarios.
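As a sketch of the grid-based approach described above (our illustration, with a toy maze), Breadth-First Search on an unweighted grid is guaranteed to return a shortest path:

```python
# Minimal sketch of grid-based BFS: the maze is a 2D grid of 0 (free) and
# 1 (wall); BFS guarantees a shortest path in unweighted grids. A* and ACO
# would plug into the same graph representation.
from collections import deque

def bfs_shortest_path(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:                        # reconstruct path
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None                                    # no path exists

maze = [[0, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0]]
print(bfs_shortest_path(maze, (0, 0), (3, 3)))
```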
Authors - Zarak Khan, Jiatong Yang, Rimshah Jawad, Ivania Martinez, Md Mozammel Hoque, Xinyi Zhao, Jim Samuel Abstract - The birth of a child brings immense joy to a mother's life. However, the reality can be different for mothers experiencing Postpartum Depression (PPD). According to the World Health Organization (WHO), around 13% of women experience postpartum mental health disorders, with rates rising to nearly 20% in developing countries. PPD affects many women worldwide, but because of social stigma and the lack of accessible mental health support, it often goes undiagnosed or untreated. This paper presents MOMCare, a chatbot designed to support mothers navigating the challenges of PPD. MOMCare has a retrieval-augmented architecture with an end-to-end pipeline from data preprocessing to response generation. It employs hybrid classification, a dual embedding system, a dual verification guardrail, and a medical domain-specific reranking mechanism to generate empathetic and relevant PPD responses. This refined design of Retrieval Augmented Generation (RAG) ensures fast and factual responses by reducing noise in retrieval and providing abundant context to gpt-3.5-turbo. MOMCare was evaluated using both automated and human metrics. Results show strong performance in both evaluations, underlining the potential of chatbot interventions in the postpartum mental health domain. The system is robust enough to ingest new data into its conversation-generation pipeline, and expanding the knowledge base using users' conversation history is also in development. The MOMCare chatbot and its features were built on sound ethical principles of healthcare and Artificial Intelligence (AI), with a strong design emphasis on safety and fairness.
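A much-simplified retrieve-then-generate sketch around gpt-3.5-turbo, assuming the OpenAI Python SDK; MOMCare's hybrid classification, dual embeddings, guardrails, and medical reranker are deliberately omitted, and the two knowledge-base snippets are invented placeholders:

```python
# Minimal RAG flow: embed a knowledge base, retrieve the closest snippet by
# cosine similarity, and pass it as context to the chat model.
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

docs = ["PPD often begins within weeks of delivery...",        # placeholder
        "Support options include therapy and peer groups..."]  # placeholder
doc_vecs = embed(docs)

def answer(question, k=1):
    q = embed([question])[0]
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    context = "\n".join(docs[i] for i in np.argsort(sims)[-k:])
    chat = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "system",
                   "content": f"Answer empathetically using this context:\n{context}"},
                  {"role": "user", "content": question}])
    return chat.choices[0].message.content
```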
Authors - Harnoor Kaur Khehra, Farnaz Sheikhi, Alan Bach, Elijah Mickelson, Farhad Maleki Abstract - Rice panicle segmentation plays a critical role in precision agriculture. Given its importance, in this paper we propose a multi-stage training pipeline that leverages a customized U-Net with EfficientNet-B3 and pseudo-labeling to improve segmentation accuracy. The datasets are prepared by acquiring video clips of rice crops captured in different fields under diverse environmental conditions. The model is evaluated on internal and external test sets, demonstrating the effectiveness of the staged training and synthetic data augmentation. Comparing the proposed model with the strong-performing nnU-Net model shows the superiority of our model in terms of Dice score and Intersection over Union, highlighting the impact of pseudo-labeling in improving segmentation accuracy.
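For reference, the two evaluation metrics named in the abstract can be computed on binary masks as follows (a generic sketch, not the authors' evaluation code):

```python
# Dice score and Intersection over Union (IoU) for binary segmentation masks.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    inter = np.logical_and(pred, truth).sum()
    return (2 * inter + eps) / (pred.sum() + truth.sum() + eps)

def iou(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return (inter + eps) / (union + eps)

pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 0, 0], [0, 1, 1]])
print(dice(pred, truth), iou(pred, truth))   # ~0.667, 0.5
```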
Authors - Arunima Saxena, Arindam Bhattacharyya, Lilian Molina, Savita Patil, Hussain Al-Asaad Abstract - Reliable interrupt handling is crucial in embedded systems, especially for real-time applications running on RISC-V architectures. This paper presents a modular and scalable verification methodology for evaluating external interrupt handling in the NEORV32 RISC-V processor. NEORV32 offers a rich trade-off between performance and resource usage, along with strong execution safety features and a flexible software framework - making it an ideal candidate for interrupt verification studies. To thoroughly test the processor’s interrupt logic, we built a custom simulation environment in ModelSim, encapsulating both the core and top-level architecture into reusable libraries. Our primary focus was the External Interrupt Controller (XIRQ), which supports 32 external interrupt channels with various triggering modes. Three targeted test scenarios were developed: a baseline interrupt case, an edge case stressing maximum bit values, and a rising edge-triggered case. Results from these simulations confirm consistent and reliable interrupt handling behavior across all test cases, with uniform response time and correct signal acknowledgment. This work provides practical insights for embedded system designers implementing interrupt-driven applications on RISC-V platforms and establishes a foundation for verifying other modules of NEORV32 in future work.
Authors - Khalid BOUALI, Abderrahim BAJIT, Hamza BENZZINE, Hicham ESSAMRI, Yasmine ACHOUR, Hassan EL FADIL, Rachid EL BOUAYADI Abstract - Modern agriculture faces significant challenges from climate change, necessitating innovative strategies for sustainability. Precision agriculture has emerged as a transformative solution, leveraging advances in Artificial Intelligence (AI), IoT, and Cloud computing to transition traditional farming into intelligent systems. However, reliance on centralized architectures introduces new challenges related to response time and operational costs. This paper presents an optimized and intelligent IoT system architecture leveraging Edge computing to enhance agricultural sustainability, with a focus on greenhouse IoT platforms. The proposed architecture integrates AIoT and incorporates an intelligent edge computing system that locally manages sensor nodes, processes data, and predicts critical agricultural parameters, including Temperature T, Humidity H, Air Quality CO₂, Light Intensity UV, and Soil Moisture pH. The study evaluates the performance of three supervised machine learning regression models, Linear Regression, Random Forest, and Extreme Gradient Boosting (XGBoost), in predicting missing sensor data using a dataset containing the key agricultural parameters. Additionally, a Long Short-Term Memory (LSTM) neural network is trained on the same dataset and evaluated at the edge to forecast future variations in microclimatic parameters. To optimize system efficiency, a novel functionality is implemented, operating in ON and OFF modes based on node states. This functionality enables the system to predict data, minimizing unnecessary data collection, conserving energy, and improving the longevity of the Static Edge Nodes distributed across the greenhouse. This work highlights the potential of AIoT-driven precision agriculture to provide robust, intelligent, and sustainable farming practices, effectively managing IoT system components and delivering reliable monitoring of critical agricultural parameters amidst climate-related challenges.
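A minimal sketch of the LSTM forecasting component, assuming Keras; the window length, layer sizes, and synthetic data below are illustrative choices, not the paper's configuration:

```python
# Sketch: an LSTM trained on sliding windows of the five monitored
# parameters (T, H, CO2, UV, soil) to forecast the next reading.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(4)
series = rng.normal(size=(5000, 5))        # placeholder microclimate log

window = 24
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]                         # next reading after each window

model = keras.Sequential([
    keras.Input(shape=(window, 5)),
    layers.LSTM(32),
    layers.Dense(5),                        # one output per parameter
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print("next-step forecast:",
      model.predict(series[-window:][None, ...], verbose=0))
```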
Authors - Mark Alfred M. Nuguit, Elleicarjay C. Ramilo, Noel B. Linsangan Abstract - Urban noise events in busy metropolitan areas present significant challenges to urban planning and environmental management. This study introduces a cost-effective system integrating wireless sensor networks with deep learning to capture and classify urban noise events in Metro Manila. The prototype employs sensor nodes equipped with ESP32 microcontrollers and MAX9814 microphone amplifiers to record audio data, which is stored on SD cards and subsequently converted into 512×512 pixel spectrogram images using Python-based signal processing. These images serve as inputs to a MobileNet-based convolutional neural network, fine-tuned via transfer learning on a dataset of over 4,300 samples spanning two categories: civilian vehicles and human activities. The system, implemented on a Raspberry Pi with an interactive touchscreen interface, achieved an overall classification accuracy of 96.11%, as verified through confusion matrix analysis. This work demonstrates a scalable, low-cost framework for urban noise monitoring and provides valuable insights for environmental management and future urban planning strategies. The method’s efficiency and adaptability make it especially suitable for addressing the unique acoustic challenges of rapidly urbanizing regions.
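A sketch of the audio-to-spectrogram step, assuming librosa and matplotlib (the file name and mel settings are placeholders; the paper's exact signal-processing parameters are not given in the abstract):

```python
# Render a recorded clip as a 512x512 pixel mel-spectrogram image of the
# kind fed to the MobileNet classifier.
import librosa
import librosa.display
import matplotlib.pyplot as plt

y, sr = librosa.load("noise_clip.wav", sr=None)      # raw audio from a node
mel = librosa.feature.melspectrogram(y=y, sr=sr)
mel_db = librosa.power_to_db(mel, ref=mel.max())

fig = plt.figure(figsize=(5.12, 5.12), dpi=100)      # 512 x 512 pixels
ax = fig.add_axes([0, 0, 1, 1])                      # no margins or axes
ax.axis("off")
librosa.display.specshow(mel_db, sr=sr, ax=ax)
fig.savefig("noise_clip.png", dpi=100)
plt.close(fig)
```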