Authors - Abdullah I Alshoshan Abstract - Verification and/or identification (VI) of a writer's identity is one of the most common secure personal biometric authentication tasks, used in banks for verification in particular and in sensitive data storage for both. A writer-identity VI system (WIVIS) is proposed that uses writer-invariant features of the script under two approaches: offline approaches, which rely on script information in a static format, such as an image or shape, and online approaches, which collect information in a dynamic format, such as speed and acceleration, captured with a tablet and stylus pen. Both offline script VI methods, such as the normalized Fourier transform descriptor (NFTD) and the normalized central moment (NCM), and online script VI methods, such as normalized script speed and acceleration, are discussed. These features are compared individually and then in combination; in the combination mode, a neural network (NN) is used for classification. The WIVIS is implemented, tested, and analyzed, and the effectiveness of each invariant algorithm, regardless of the language or the form of the script shape, is discussed. A dataset of online (dynamic) and offline (static) scripts is also discussed.
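For reference, the normalized central moment (NCM) feature named above has a standard definition in the image-moment literature; the usual formulation (standard notation, not quoted from the paper) is:

```latex
\mu_{pq} = \sum_{x}\sum_{y} (x - \bar{x})^p \, (y - \bar{y})^q \, f(x, y),
\qquad
\eta_{pq} = \frac{\mu_{pq}}{\mu_{00}^{\,1 + (p+q)/2}}, \quad p + q \ge 2,
```

where f(x, y) is the script image intensity and (x̄, ȳ) its centroid; the normalization by μ₀₀ makes the feature scale-invariant, which is what makes it usable as a writer-invariant descriptor.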
Authors - Dzung Lai Ngoc, Maria Luojus, Jukka Heikkonen, Rajeev Kanth Abstract - The development of cancer therapeutics has made significant advancements in recent years. Numerous innovative solutions have emerged and achieved notable success, including immunotherapy, targeted drugs, and, among them, oncolytic viruses. Oncolytic virus therapy represents the first instance in which humans have employed a biological logic program, rather than a conventional drug, to treat a disease. Despite its promising potential, clinical trials involving oncolytic viruses have not yielded the anticipated outcomes, owing to our incomplete understanding of the underlying biological logic and mechanisms. This paper describes a treatment approach from a biological-algorithmic standpoint, encompassing biological logic programs, molecules that carry biological logic (Logical Biomolecular Complexes - LBCs), and the existing tools that can be used to design such treatment programs. Our proposal is based on a review of oncolytic virus studies, but the logic framework behind LBCs is tailored specifically for cancer treatment rather than for replication and spreading. This sets LBCs apart from their viral counterparts, and they can be considered a new concept.
Authors - Julian Andres Duarte Suarez, Leonardo Juan Ramirez Lopez Abstract - Given the continuous need for beds available for hospitalization in health institutions, and even more so during pandemic periods, alternatives are needed that allow patients to be transferred to their homes and monitored there continuously. For this, DATALOG was developed: an Internet of Medical Things platform that acquires, processes, transmits, stores, and manages patients' medical signals from their homes to a central hospital. The stored data is then processed by applying the Standard Intersectoral Process methodology for data mining, which provides the medical staff with the behavior of six physiological variables and visualizes it in a single control dashboard developed in PHP. Initial tests show acceptance by the medical staff of DATALOG as a service and support tool for medical decisions, and, in addition, it has been proven that hospitalization at home contributes significantly to the rapid improvement of the patient, thus decongesting the hospital system. Among its most recent innovations, DATALOG includes an early warning system that reports the patient's condition as normal, attention, alert, or critical, with the possibility of sending the alert to mobile systems. It is concluded that DATALOG is a useful service and tool for e-health, and machine learning techniques allow for the prediction of patient states and alerts.
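As a toy illustration of the four-level early-warning idea, a threshold classifier of this shape could map vital signs to the four states; the variables and thresholds below are placeholders, not DATALOG's actual rules:

```python
# Hypothetical four-level early-warning classifier; variable names and
# cut-offs are illustrative only, not taken from the DATALOG platform.
def warning_level(heart_rate, spo2):
    if spo2 < 88 or heart_rate > 140:
        return "critical"
    if spo2 < 92 or heart_rate > 120:
        return "alert"
    if spo2 < 95 or heart_rate > 100:
        return "attention"
    return "normal"

print(warning_level(heart_rate=110, spo2=96))  # -> "attention"
```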
Authors - K P N S Dayarathne, U Thayasivam Abstract - Deep learning has achieved remarkable success in multiple areas, such as image classification, speech recognition, object detection and segmentation, natural language processing, audio-visual recognition, and adaptive testing, and is gaining major interest from the research community. Its applications are growing day by day. Predicting interest rates as a univariate analysis is important given that the full spectrum of interest rates needed for yield curve analysis is not available. This paper investigates the applicability of deep learning models such as RNN, LSTM, CNN, and TCN to interest rates in Asian frontier countries, namely Sri Lanka, Pakistan, and Bangladesh. The deep learning approach to interest rate prediction is still under the radar, and this is the first attempt on the Asian frontier market. Interest rates associated with government securities were considered to ensure uniformity across all three countries; the data range from 2010-2022 for Sri Lanka and Pakistan, whereas the Bangladesh analysis was based on 2015-2022. The results revealed that CNN was the best model for Sri Lanka and Bangladesh, while LSTM was the best model for Pakistan, based on the lowest RMSE. The study further investigates the applicability of different activation functions for the output and hidden layers, finding ReLU to be the most viable activation function, along with max pooling. Further, it was found that CNN works better for countries with stable term structures of interest rates, and that the immediate dynamics of the interest rate influence the near-future interest rate.
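As a concrete illustration of the kind of univariate CNN forecaster the study compares, here is a minimal Keras sketch; the window length, layer sizes, and the synthetic stand-in series are assumptions, not the authors' configuration:

```python
# Minimal univariate 1D-CNN rate forecaster in the spirit described above.
# Architecture and hyperparameters are illustrative, not the paper's.
import numpy as np
from tensorflow import keras

def make_windows(series, lookback=30):
    """Slice a 1-D series into (lookback window -> next value) pairs."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X[..., None], y  # add a channel axis for Conv1D

rates = np.cumsum(np.random.randn(1000)) * 0.01 + 8.0  # stand-in for real rate data
X, y = make_windows(rates)

model = keras.Sequential([
    keras.layers.Conv1D(32, kernel_size=3, activation="relu", input_shape=X.shape[1:]),
    keras.layers.MaxPooling1D(2),          # ReLU + max pooling, as the study favours
    keras.layers.Conv1D(16, 3, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(1),                 # linear output for the next-step rate
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```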
Authors - Taeko Onodera, Koutaro Hachiya, Yuhei Hatakenaka Abstract - Numerous studies have applied machine learning to diagnosis and screening in the medical and welfare fields. However, it is rare for the resulting machine learning models to be widely adopted in clinical practice. This study proposes a method for deriving diagnostic rules from machine learning models that can be applied manually without the use of computers. The proposed method involves inputting all possible patterns into a trained model, generating a truth table with the corresponding prediction results, and then using the Quine–McCluskey method to derive logical expressions that serve as manual diagnostic rules. In the experiments, the proposed method was compared with conventional methods for deriving manual diagnostic rules from datasets: the point score system, a method based on likelihood ratios, and a logic derivation method based on rough set theory. Only the proposed method achieved a positive clinical utility index of 0.81 or higher—classified as “excellent”—even when the number of rules was limited to just two or three.
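The derivation pipeline described above can be illustrated with a short sketch: a stand-in classifier is trained, every possible binary input pattern is fed through it to build a truth table, and SymPy's SOPform (which performs Quine–McCluskey-style minimization) reduces the positive rows to a compact rule. The model, feature count, and labels below are placeholders:

```python
# Sketch of the truth-table -> Quine-McCluskey rule derivation described above,
# using a toy stand-in classifier rather than a real clinical model.
from itertools import product
from sympy import symbols
from sympy.logic import SOPform
from sklearn.tree import DecisionTreeClassifier

n_features = 4
X_train = [list(p) for p in product([0, 1], repeat=n_features)]
y_train = [int(sum(row) >= 3) for row in X_train]        # toy labels
model = DecisionTreeClassifier().fit(X_train, y_train)   # stand-in trained model

# Build the truth table: every possible binary pattern -> model prediction,
# keeping the patterns the model classifies as positive (the minterms).
minterms = [list(p) for p in product([0, 1], repeat=n_features)
            if model.predict([list(p)])[0] == 1]

# Minimize to a compact logical rule usable by hand, without a computer.
vars_ = symbols(f"f0:{n_features}")
print(SOPform(vars_, minterms))
```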
Authors - El Hadji Bassirou TOURE, Ibrahima FALL, Mandicou BA, Alassane BAH Abstract - Microservice architectures offer scalability and deployment benefits but introduce significant data consistency challenges due to distributed data ownership and the necessity of data duplication across services. Current approaches either compromise service autonomy with strong consistency mechanisms or shift complexity to developers. This paper presents Micro-Flex, a novel megamodel-based consistency management framework that treats data entities as component models with formally tracked consistency states. We extend the Modified-Shared-Invalid protocol with differentiated Shared states (Shared+ and Shared−) to accommodate varying consistency requirements while maintaining formal guarantees. Our approach formalizes Global Operation Models (GOMs) as application specifications where consistency states are actively managed through well-defined transition rules. We validate Micro-Flex through an e-commerce case study, demonstrating its effectiveness in balancing consistency guarantees with service autonomy while addressing data duplication challenges. By applying Model-Driven Engineering principles to distributed consistency challenges, our framework contributes to more disciplined data management in microservice architectures.
Authors - Nikolaos Lazaropoulos, Ioannis Vaxevanakis, Ioannis Sigalas, Ioannis Lamprou, Vassilis Zissimopoulos Abstract - Cloud Native Functions (CNFs) support automated and dynamic orchestration of containerized network services, replacing traditional hardware-based architectures. These deployments consist of modular microservices that enable elastic scalability and collaborative service delivery. This paper presents an approximation framework for capacity-constrained CNF resource allocation, modeled as variants of the Group Generalized Assignment Problem (Group GAP). The main contributions are: (1) a 1/2-approximation algorithm for CNF placement when each function's footprint is at most half the cluster capacity and (2) a (1/2)(1 − e^(−1/d))-approximation for shared microservices among multiple CNFs, where d is the degree of sharing, supported by experimental evaluation of the algorithm's relative error.
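For intuition only: the classic density-greedy heuristic attains the 1/2 guarantee for a single knapsack when every item occupies at most half the capacity, which mirrors the footprint condition stated above. The sketch below is this generic heuristic, not the paper's Group GAP algorithm, and the CNF names, footprints, and values are made up:

```python
# Classic density-greedy placement: sort by value per unit of footprint and
# pack greedily. With all footprints <= capacity/2 this achieves a 1/2
# guarantee for a single knapsack. Illustrative only; not the paper's algorithm.
def greedy_place(cnfs, capacity):
    """cnfs: list of (name, footprint, value); returns names of placed CNFs."""
    placed, used = [], 0
    for name, size, value in sorted(cnfs, key=lambda c: c[2] / c[1], reverse=True):
        if used + size <= capacity:
            placed.append(name)
            used += size
    return placed

print(greedy_place([("fw", 4, 10), ("lb", 5, 9), ("dpi", 3, 8)], capacity=10))
```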
Authors - Abdulaziz Alkhajeh, Sara Alhashmi, Alya Al Ali, Rakan Alhosani, Suhail Alshehhi, Deepa Pavithran, Joseph Anajemba Abstract - Healthcare has always been a crucial part of human life, with people investing resources to get the best services available. Ensuring patient confidentiality has always been crucial, but the digital era introduces new security risks. Hospitals now store patient information in computerized databases, which are vulnerable to cyberattacks. One major threat is ransomware attacks, in which hackers capture sensitive and confidential patient data and demand large sums of money to prevent it from being leaked or sold. This puts patient privacy at risk and can disrupt healthcare services. Unauthorized access that compromises the confidentiality of patients' information is also a growing concern, because healthcare data has always been sensitive, personal information that should not be exploited for commercial purposes. Blockchain technology offers a solution by providing a secure way to store patient files. Using the InterPlanetary File System (IPFS) with the blockchain, healthcare providers can save patient records in a decentralized and protected system, reducing the risks linked to traditional databases. This method helps protect patient information from cyber threats, ensuring privacy and security. In this paper, we use a blockchain-based architecture coupled with the Pinata IPFS cloud to secure patients' valuable information from any kind of cyberattack, including ransomware.
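As a rough illustration of the storage path described, the sketch below pins an (already encrypted) patient record to IPFS through Pinata's public pinFileToIPFS endpoint and returns the content identifier that could then be anchored on-chain; the JWT and file path are placeholders, and this is not the paper's exact integration:

```python
# Hedged sketch: uploading an encrypted record file to IPFS via Pinata's
# public pinning endpoint. Credential and path are placeholders.
import requests

PINATA_JWT = "<your-pinata-jwt>"   # placeholder credential

def pin_record(path):
    """Upload a file to IPFS via Pinata; returns its content identifier (CID)."""
    with open(path, "rb") as f:
        resp = requests.post(
            "https://api.pinata.cloud/pinning/pinFileToIPFS",
            headers={"Authorization": f"Bearer {PINATA_JWT}"},
            files={"file": f},
        )
    resp.raise_for_status()
    return resp.json()["IpfsHash"]  # the CID; only this goes on-chain

# e.g. cid = pin_record("patient_001.enc.json"): the chain stores the CID,
# keeping the ledger light while the encrypted file lives on IPFS.
```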
Authors - Shreya Joshi, Vijay Ukani, Priyank Thakkar, Mrudangi Thakker, Dhruvang Thakker Abstract - This study introduces a hybrid malware detection approach that combines machine learning (ML) and deep learning (DL) techniques to enhance detection accuracy. By applying models like K-Nearest Neighbors (KNN), Support Vector Machine (SVM), and Gradient Boosting, we achieved 99.47% accuracy during training and 99.21% accuracy during testing, focusing on analyzing Portable Executable (PE) header data. Additionally, incorporating Convolutional Neural Networks (CNN) with Long Short-Term Memory (LSTM) networks improved performance, achieving 99% accuracy, 97% precision, and 98% recall after 30 epochs. The proposed hybrid method reduces false positives and negatives while demonstrating scalability across various datasets, offering a reliable and efficient solution for contemporary malware detection.
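The classical-ML half of the hybrid pipeline can be sketched with scikit-learn as follows; the random feature matrix stands in for numeric PE-header fields, and the models' default hyperparameters are assumptions rather than the authors' tuned settings:

```python
# Sketch of the ML side of the hybrid approach: the three named classical
# models over numeric PE-header features. Data is a stand-in, not the
# authors' dataset or exact pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 12))            # stand-in for 12 numeric PE-header fields
y = rng.integers(0, 2, 500)          # stand-in benign/malware labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for clf in (KNeighborsClassifier(), SVC(), GradientBoostingClassifier()):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, clf.score(X_te, y_te))
```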
Authors - Balvinder Pal Singh, Thangaraju B Abstract - Modern computing systems face the dual challenge of meeting escalating performance demands, driven especially by AI and ML workloads, while operating under stringent thermal constraints. As energy consumption continues to rise, conventional thermal mitigation techniques like frequency throttling often compromise performance and increase cooling costs, threatening long-term sustainability. This paper proposes a multi-level, software-centric approach to address this challenge through intelligent, temperature-aware scheduling. Leveraging evolutionary techniques, specifically a Genetic Algorithm (GA), the proposed model reorders jobs at the OS scheduler level based on thermal impact and energy profiles. A secondary optimization phase further fine-tunes job execution using dynamic slice adjustment for thermally intensive tasks. Simulation results, obtained using a custom-integrated simulation framework leveraging the GEM5, McPAT, and HotSpot tools, demonstrate a 28% overall performance improvement, a 53% reduction in thermal violations, and a 15% decrease in energy consumption, with 80% of the tasks executed without performance degradation. This approach was validated using representative benchmark workloads, optimizing both energy and temperature profiles. This AI-augmented, multi-level scheduling strategy significantly enhances thermal efficiency and performance, offering a scalable solution for next-generation high-performance computing environments.
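To make the GA reordering idea concrete, here is a toy sketch in which permutations of jobs are scored by a stand-in thermal-cost function that penalizes running hot jobs back to back; the job list, cost model, and GA parameters are illustrative, not the paper's scheduler or simulator:

```python
# Toy genetic-algorithm job reordering in the spirit described: candidate
# schedules are permutations, fitness is a stand-in thermal cost.
import random

jobs = [("ai_train", 9), ("io_batch", 2), ("encode", 7), ("idle_scan", 1),
        ("ml_infer", 6), ("log_rotate", 2)]   # (name, thermal weight), made up

def thermal_cost(order):
    """Penalize consecutive hot jobs: heat accumulates when not interleaved."""
    return sum(order[i][1] * order[i + 1][1] for i in range(len(order) - 1))

def mutate(order):
    """Swap two positions to produce a child schedule."""
    a, b = random.sample(range(len(order)), 2)
    child = order[:]
    child[a], child[b] = child[b], child[a]
    return child

pop = [random.sample(jobs, len(jobs)) for _ in range(30)]
for _ in range(200):                         # evolve: keep the coolest schedules
    pop.sort(key=thermal_cost)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
print([name for name, _ in min(pop, key=thermal_cost)])
```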
Authors - Kostiantyn Hrishchenko, Oleksii Pysarchuk, Danylo Baran Abstract - This paper introduces a batch processing method for constraint programming to improve solution performance. The method is demonstrated on a production scheduling problem. The transformation from mass production of standardized products toward small-scale, customized orders in the manufacturing sector introduces a new challenge of handling extensive input data. Modern production scheduling systems struggle to handle big-data loads because of the limitations of state-of-the-art scheduling algorithms. Therefore, the development of highly adaptive algorithms and models capable of efficiently managing numerous unique orders, while maintaining the ability to adapt to dynamic constraints and objectives, becomes critically important. The baseline discrete constraint programming approach is chosen for its flexibility and extensibility, allowing it to model various realistic manufacturing scenarios. The proposed method splits large input into smaller subsets, each scheduled independently by repeatedly invoking the constraint solver on different portions of the input data, significantly improving performance compared to allocating the whole input in a single step. Computational experiments with Google's OR-Tools CP-SAT solver evaluated the method's effectiveness, showing reductions in both time and memory usage. The proposed method demonstrates the possibility of solving problems of much larger size using the same constraint model and solver. It combines the advantages of the greedy algorithm and the exact integer programming approach.
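A minimal sketch of the batching idea with CP-SAT is shown below: jobs are split into fixed-size batches, each batch is scheduled independently on a single machine, and the next batch starts after the previous batch's makespan. The batch size, durations, and single-machine model are assumptions, not the paper's production model:

```python
# Batch-wise scheduling with OR-Tools CP-SAT: each batch is a separate,
# smaller solve, chained through the previous batch's makespan.
from ortools.sat.python import cp_model

durations = [3, 2, 4, 1, 5, 2, 3, 4]    # stand-in job durations
BATCH, horizon_offset = 4, 0

for start in range(0, len(durations), BATCH):
    batch = durations[start:start + BATCH]
    model = cp_model.CpModel()
    horizon = horizon_offset + sum(batch)
    intervals, ends = [], []
    for i, d in enumerate(batch):
        s = model.NewIntVar(horizon_offset, horizon, f"s{i}")
        e = model.NewIntVar(horizon_offset, horizon, f"e{i}")
        intervals.append(model.NewIntervalVar(s, d, e, f"iv{i}"))
        ends.append(e)
    model.AddNoOverlap(intervals)        # one machine processes one job at a time
    makespan = model.NewIntVar(0, horizon, "mk")
    model.AddMaxEquality(makespan, ends)
    model.Minimize(makespan)
    solver = cp_model.CpSolver()
    solver.Solve(model)
    horizon_offset = solver.Value(makespan)   # next batch starts after this one

print("total makespan:", horizon_offset)
```

Because each solve sees only a batch-sized model, time and memory stay bounded as the input grows, at the price of the greedy, batch-local view of the schedule.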
Authors - V. Machaca, J. Grados, K. Lazarte, R. Escobedo, C. Lopez Abstract - The interaction between peptides and the Major Histocompatibility Complex (MHC) is a critical factor in the immune response against various threats. In this work, we fine-tuned protein language models such as TAPE, ProtBert-BFD, ESM2(t6), ESM2(t12), ESM2(t30), and ESM2(t33) by adding a BiLSTM block in cascade for the task of peptide-MHC class-I binding prediction. Additionally, we addressed the vanishing gradient problem by employing LoRA, distillation, hyperparameter guidelines, and a layer-freezing methodology. After experimentation, we found that TAPE and a distilled version of ESM2(t33) achieved the best results, outperforming state-of-the-art tools such as NetMHCpan4.1, MHCflurry2.0, Anthem, ACME, and MixMHCpred2.2 in terms of AUC, accuracy, recall, F1 score, and MCC.
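The architecture family described, a protein language model with a BiLSTM block in cascade, can be sketched with Hugging Face transformers and peft; the small ESM2(t6) checkpoint, LoRA target modules, hidden size, and pooling choice here are assumptions for illustration, not the authors' exact configuration:

```python
# Sketch: LoRA-adapted ESM2 backbone -> BiLSTM in cascade -> binding logit.
# Checkpoint, dimensions, and head design are illustrative assumptions.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model

CKPT = "facebook/esm2_t6_8M_UR50D"            # small public ESM2(t6) checkpoint
base = AutoModel.from_pretrained(CKPT)
emb_dim = base.config.hidden_size
base = get_peft_model(base, LoraConfig(r=8, target_modules=["query", "value"]))

class BindingPredictor(nn.Module):
    """Per-residue embeddings from the backbone, BiLSTM, then a binary head."""
    def __init__(self, backbone, emb_dim, hidden=128):
        super().__init__()
        self.backbone = backbone
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)  # binder / non-binder logit

    def forward(self, **tokens):
        h = self.backbone(**tokens).last_hidden_state
        out, _ = self.bilstm(h)
        return self.head(out[:, 0])           # pool at the first token

tok = AutoTokenizer.from_pretrained(CKPT)
batch = tok(["SIINFEKL", "GILGFVFTL"], return_tensors="pt", padding=True)
print(BindingPredictor(base, emb_dim)(**batch).shape)   # torch.Size([2, 1])
```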
Authors - Nafiz Eashrak, Mohammad Ikbal Hossain, Md Omum Siddique Auyon, Md Abdullah Al Adnan Abstract - The proliferation of cryptocurrencies and blockchain technology has significantly reshaped the financial sector, introducing decentralized and transparent digital transactions. Despite substantial advantages such as reduced transaction costs, enhanced transparency, and financial inclusion, the anonymous and decentralized nature of cryptocurrencies has also facilitated illicit financial activities, including fraud, money laundering, and tax evasion. This literature review systematically examines forensic methodologies, regulatory challenges, and theoretical frameworks relevant to cryptocurrency investigations. The study highlights advancements in blockchain analytics, AI-driven monitoring tools, and regulatory frameworks such as the FATF Travel Rule and the EU's MiCA. However, it also underscores persistent challenges posed by privacy-focused technologies, decentralized finance (DeFi), cross-border jurisdictional inconsistencies, and technical limitations in forensic methodologies. This paper proposes an integrated forensic framework incorporating AI analytics, international regulatory collaboration, specialized forensic training, and privacy-preserving investigative techniques. This comprehensive review provides critical insights for enhancing forensic investigation capabilities, regulatory compliance, and policymaking, while outlining future research opportunities addressing emerging threats in cryptocurrency-based financial systems.
Authors - Victor Sineglazov, Alexander Ruchkin Abstract - The article proposes a new combined approach to the problem of classification on real-world noisy datasets using a multi-stage semi-supervised learning method. The main idea of this approach is to combine two methods: self-supervised learning on unlabeled data using Contrastive Loss Nets, and semi-supervised label propagation using an enhanced Poisson-Seidel learning technique. The proposed approach offers significant advantages, as it allows for preliminary classification without labels, strengthening the distinctions between classes, and then uses a minimal amount of labeled data for final classification. This is demonstrated through the analysis of synthetic data from the intersecting "Two Moons" dataset and a real medical dataset on heart disease, "Cardio Vascular". Accuracy in the first case exceeds 82%, and in the second example 73%, which is among the best results on the Kaggle database when compared with 20 other known methods.
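As a generic stand-in for the label-propagation stage (not the enhanced Poisson-Seidel technique itself), scikit-learn's graph-based LabelSpreading on the two-moons data illustrates the semi-supervised setting: only a handful of labels are kept and the rest are inferred:

```python
# Generic graph-based label propagation on two moons with 10 labeled points.
# Illustrative baseline only; not the paper's Poisson-Seidel method.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelSpreading

X, y = make_moons(n_samples=400, noise=0.15, random_state=0)
y_partial = np.full_like(y, -1)             # -1 marks unlabeled points
labeled = np.random.default_rng(0).choice(len(y), size=10, replace=False)
y_partial[labeled] = y[labeled]             # keep only 10 labels

model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_partial)
print("accuracy:", (model.transduction_ == y).mean())
```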
Authors - Amir Aieb, Alexander Jacob, Antonio Liotta, Muhammad Azfar Yaqub Abstract - Predicting soil moisture under dynamic climate conditions is challenging due to intricate dependencies within the data. This study presents a surrogate deep learning (SDL) model with a multitask learning (MTL) approach to improve daily soil moisture predictions across spatiotemporal scales. The model employs a two-level encoding process, first compressing climate parameters into a single feature and then applying sequential encoding to capture long-term temporal patterns within a one-year timeframe for better generalization. Seasonality detection using autocorrelation facilitates data resampling into homogeneous samples, enhancing the SDL model by optimizing hyperparameters through efficient weight sharing between layers. To evaluate the effectiveness of MTL, three SDL architectures, namely LSTM, ConvLSTM, and BiLSTM, were implemented for a comprehensive analysis. All models struggle with soil moisture prediction, particularly during dry periods, where LSTM experiences the most significant accuracy drop. While BiLSTM demonstrates better performance, its effectiveness remains constrained. However, integrating MTL enhances model stability and spatio-temporal accuracy, reducing errors across various conditions and achieving a 10% improvement due to better data representation, enabling SDL models to capture regional heterogeneity more effectively.
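The multitask weight-sharing idea can be sketched as a shared BiLSTM encoder feeding several task-specific heads; the sequence length, layer sizes, and number of tasks below are illustrative assumptions, not the paper's architecture:

```python
# Minimal multitask BiLSTM sketch: one shared encoder, several output heads,
# so the recurrent weights are shared across tasks. Shapes are illustrative.
from tensorflow import keras

inp = keras.Input(shape=(365, 1))   # one year of a compressed climate feature
shared = keras.layers.Bidirectional(keras.layers.LSTM(32))(inp)
heads = [keras.layers.Dense(1, name=f"region_{i}")(shared) for i in range(3)]
model = keras.Model(inp, heads)
model.compile(optimizer="adam", loss="mse")  # one loss applied to every head
model.summary()
```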
Authors - V. Machaca, D. Lopez, J. Mamani, S. Ramos, Y. Tupac Abstract - Cancer immunotherapy offers a promising alternative to conventional cancer therapies. In this field, neoantigen detection is rapidly evolving; however, it requires multiple bioinformatics stages, such as quality control, alignment, variant calling, annotation, and neoantigen prioritization. Each stage depends on specific software tools, which can create technical and compatibility challenges, often requiring significant expertise to integrate and manage. To address these challenges, we introduce NeoArgosTools, a novel flowchart-based platform with an intuitive graphical interface, designed to simplify and streamline neoantigen detection pipelines in cancer immunotherapy research.