As market saturation and competitive pressure intensify within the banking sector, the mitigation of customer churn has emerged as a critical concern. Given that the cost of acquiring new clients substantially exceeds that of retaining existing ones, the development of highly accurate churn prediction models has become imperative. In this study, a hybrid customer churn prediction model was developed by integrating Sentence Transformers with a stacking ensemble learning architecture. Customer behavioral data containing textual content was transformed into dense vector representations through the use of Sentence Transformers, thereby capturing contextual and semantic nuances. These embeddings were combined with normalized structured features. To enhance predictive performance, a stacking ensemble method was employed to integrate the outputs of multiple base models, including random forest, Gradient Boosting Tree (GBT), and Support Vector Machine (SVM). Experimental evaluation was conducted on real-world banking data, and the proposed model demonstrated superior performance relative to conventional baseline approaches, achieving notable improvements in both accuracy and the area under the curve (AUC). Furthermore, the analysis of model outputs revealed several salient predictors of customer attrition, such as anomalous transaction behavior, prolonged inactivity, and indicators of dissatisfaction with customer service. These insights are expected to inform the development of targeted intervention strategies aimed at strengthening customer retention, improving satisfaction, and fostering long-term institutional growth and stability.
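As an illustrative sketch only (not the authors' implementation), the snippet below shows how sentence-transformer embeddings of textual behavioral notes can be concatenated with normalized structured features and passed to a stacking ensemble of Random Forest, GBT, and SVM base learners; the embedding model, feature columns, and toy data are assumptions.

```python
# Hedged sketch: sentence embeddings + structured features -> stacking ensemble.
# Model name, features, and labels are placeholders, not the study's data.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

texts = ["complained about card fees twice this month",
         "no transactions recorded for 90 days",
         "regular salary deposits and bill payments",
         "asked about account closure procedure",
         "active mortgage and savings activity",
         "long call with support about a failed transfer"]
structured = np.array([[0.2, 3.0], [0.9, 0.0], [0.1, 12.0],
                       [0.8, 1.0], [0.2, 9.0], [0.7, 2.0]])  # e.g., [inactivity score, monthly txns]
y = np.array([1, 1, 0, 1, 0, 0])                             # churn labels (toy)

encoder = SentenceTransformer("all-MiniLM-L6-v2")            # assumed embedding model
text_emb = encoder.encode(texts)                             # dense semantic vectors
X = np.hstack([text_emb, StandardScaler().fit_transform(structured)])

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier()),
                ("gbt", GradientBoostingClassifier()),
                ("svm", SVC())],
    final_estimator=LogisticRegression(),                    # meta-learner (assumption)
    cv=3,
)
stack.fit(X, y)
print(stack.predict_proba(X)[:, 1])                          # churn probabilities
```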
Significant advancements in artificial intelligence (AI) have transformed clinical decision-making, particularly in disease detection and management. Endometriosis, a chronic and often debilitating gynecological disorder, affects a substantial proportion of reproductive-age women and is associated with pelvic pain, infertility, and a reduced quality of life. Despite its high prevalence, non-invasive and accurate diagnostic methods remain limited, frequently resulting in delayed or missed diagnoses. In this study, a novel diagnostic framework was developed by integrating deep learning (DL) with explainable artificial intelligence (XAI) to address existing limitations in the early and non-invasive detection of endometriosis. Abdominopelvic magnetic resonance imaging (MRI) data were obtained from the Crestview Radiology Center in Victoria Island, Lagos State. Preprocessing procedures, including Digital Imaging and Communications in Medicine (DICOM)-to-PNG conversion, image resizing, and intensity normalization, were applied to standardize the imaging data. A U-Net architecture enhanced with a dual attention mechanism was employed for lesion segmentation, while Gradient-weighted Class Activation Mapping (Grad-CAM) was incorporated to visualize and interpret the model’s decision-making process. Ethical considerations, including informed patient consent, fairness in algorithmic decision-making, and mitigation of data bias, were rigorously addressed throughout the model development pipeline. The proposed system demonstrated the potential to improve diagnostic accuracy, reduce diagnostic latency, and enhance clinician trust by offering transparent and interpretable predictions. Furthermore, the integration of XAI is anticipated to promote greater clinical adoption and reliability of AI-assisted diagnostic systems in gynecology. This work contributes to the advancement of non-invasive diagnostic tools and reinforces the role of interpretable DL in the broader context of precision medicine and women's health.
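For readers unfamiliar with Grad-CAM, the following minimal PyTorch sketch illustrates the general mechanism only (gradient-weighted pooling of a convolutional feature map); the backbone, target layer, and input are placeholders and do not reproduce the study's dual-attention U-Net or its MRI data.

```python
# Minimal Grad-CAM sketch (illustrative, not the authors' pipeline).
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(weights=None).eval()           # stand-in backbone (assumption)
feats, grads = {}, {}

def fwd_hook(_, __, output):
    feats["value"] = output.detach()            # cache feature maps on forward pass

def bwd_hook(_, grad_in, grad_out):
    grads["value"] = grad_out[0].detach()       # cache gradients on backward pass

layer = model.layer4[-1].conv2                  # last conv layer as target (assumption)
layer.register_forward_hook(fwd_hook)
layer.register_full_backward_hook(bwd_hook)

x = torch.randn(1, 3, 224, 224)                 # placeholder image tensor
score = model(x)[0].max()                       # score of the top predicted class
score.backward()

weights = grads["value"].mean(dim=(2, 3), keepdim=True)            # GAP of gradients
cam = F.relu((weights * feats["value"]).sum(dim=1, keepdim=True))  # weighted combination
cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)           # normalized heatmap
print(cam.shape)                                                    # (1, 1, 224, 224)
```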
The accurate estimation of the longitudinal dispersion coefficient is crucial for predicting solute transport in natural water bodies. In this study, an analytical (integral) method based on first principles is compared with Fischer’s widely used empirical approach, which is implemented in hydraulic modeling software such as the Hydrologic Engineering Center-River Analysis System (HEC-RAS). The primary objective is to evaluate the accuracy, applicability, and limitations of both methods under varying hydraulic conditions. A key advantage of the analytical approach is its ability to estimate the dispersion coefficient using velocity data alone, eliminating the need for high-cost tracer experiments that rely on solute concentration measurements. The coefficient of determination indicates an acceptable level of agreement between the two methods; however, the empirical approach systematically overestimates dispersion coefficients. Furthermore, a clear inverse relationship is observed between the slope of the channel and the magnitude of the dispersion coefficient, which is attributed to the increasing influence of shear velocity on the diffusion process. As slope values increase, solute separation time decreases, and concentration gradients become steeper. Conversely, at lower slopes, solute dispersion occurs over a broader time frame, resulting in lower concentration peaks. These findings indicate that while Fischer’s method provides a robust empirical framework, it should be supplemented with field measurements to improve reliability. In contrast, the analytical method offers a more theoretically grounded alternative that may enhance predictive accuracy in solute transport modeling. The implications of these results extend to water quality management, contaminant transport studies, and hydraulic engineering applications, where the selection of an appropriate dispersion estimation method significantly influences predictive outcomes.
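As a worked illustration of the empirical approach, Fischer's formula estimates the longitudinal dispersion coefficient as $K = 0.011\,U^2 W^2 / (H u_*)$ with shear velocity $u_* = \sqrt{gHS}$; the hydraulic values below are hypothetical.

```python
# Fischer's empirical longitudinal dispersion coefficient (example values only).
import math

def fischer_dispersion(U, W, H, S, g=9.81):
    """Return K [m^2/s]: U mean velocity [m/s], W width [m], H depth [m], S slope [-]."""
    u_star = math.sqrt(g * H * S)                 # shear velocity [m/s]
    return 0.011 * U**2 * W**2 / (H * u_star)

print(fischer_dispersion(U=0.8, W=25.0, H=1.5, S=0.0005))  # hypothetical channel
```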
Effective business system management necessitates strategic planning, efficient resource monitoring, and consistent team coordination. In practice, decision-making (DM) processes are frequently challenged by uncertainty, imprecision, and the need to aggregate diverse information sources. To address these complexities, a confidence-based algebraic aggregation framework incorporating the $p, q, r$-Fraction Fuzzy model has been proposed to enhance decision accuracy under uncertain environments. Within this framework, four novel aggregation operators are introduced: the Confidence $p, q, r$-Fraction Fuzzy Weighted Averaging Aggregation ($Cpqr$-FFWAA) operator, the Confidence $p, q, r$-Fraction Fuzzy Ordered Weighted Averaging Aggregation ($Cpqr$-FFOWAA) operator, the Confidence $p, q, r$-Fraction Fuzzy Weighted Geometric Aggregation ($Cpqr$-FFWGA) operator, and the Confidence $p, q, r$-Fraction Fuzzy Ordered Weighted Geometric Aggregation ($Cpqr$-FFOWGA) operator. These operators are designed to capture the inherent vagueness and subjectivity in business-related decision inputs, thereby facilitating robust assessments. The theoretical properties of the proposed operators—such as idempotency, boundedness, and monotonicity—are rigorously analyzed to ensure mathematical soundness and operational reliability. To illustrate the practical applicability of the model, a detailed case study is provided, demonstrating its effectiveness in maintaining resource sufficiency, preventing financial disruptions, and ensuring organizational coherence. The use of these aggregation mechanisms allows for systematic integration of expert confidence levels with varying degrees of fuzzy information, resulting in optimized decisions that are both data-informed and uncertainty-resilient. The methodological contributions are positioned to support real-world business contexts where dynamic inputs, incomplete data, and human judgment intersect. Consequently, the proposed approach offers a substantial advancement in intelligent decision-support systems, providing a scalable and interpretable tool for business performance enhancement.
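For context, the ordered variants build on the standard ordered weighted averaging (OWA) principle, shown here in its generic form as an illustration only (this is not the exact $Cpqr$-FF operator definition):

$$\mathrm{OWA}_{w}(a_1,\dots,a_n)=\sum_{j=1}^{n} w_j\, a_{\sigma(j)},\qquad \sum_{j=1}^{n} w_j = 1,\ w_j \ge 0,$$

where $\sigma$ is a permutation ordering the arguments so that $a_{\sigma(1)} \ge a_{\sigma(2)} \ge \dots \ge a_{\sigma(n)}$; the proposed operators additionally incorporate expert confidence levels and the $p, q, r$-Fraction Fuzzy membership structure into this aggregation.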
This study investigates the relationship between municipal management and sustainable tourism in an urban protected area, specifically the Los Pantanos de Villa Wildlife Refuge in Lima, Peru. The research adopts a quantitative, correlational, non-experimental, cross-sectional design, focusing on a sample of 67 employees from the Municipal Authority. A probabilistic sampling technique was employed to select the sample from a population of 80 workers. Data were collected through two separate questionnaires, each tailored to measure one of the key variables, with responses recorded on a Likert scale ranging from 1 to 5. The study area, Los Pantanos de Villa, is an urban protected area situated in a densely populated region where challenges such as pollution, waste management, and urban sprawl exert significant pressure on environmental sustainability. Findings revealed that 88.06% of respondents assessed municipal management in the protected area as "good," while 76.12% rated sustainable tourism positively. Statistical analysis revealed a Pearson correlation coefficient of 0.590, with a p-value of 0.000, indicating a significant positive correlation between effective municipal management and the promotion of sustainable tourism. These results emphasize the crucial role of municipal governance in enhancing both environmental stewardship and sustainable tourism development within urban protected areas. Effective management practices can contribute to balancing the dual objectives of ecological conservation and urban development, thereby fostering a sustainable tourism model in highly urbanised contexts. This study underscores the importance of governance frameworks in mitigating urban pressures and advancing sustainability in Natural Protected Areas (NPAs).
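As a minimal illustration of the correlational analysis (the Likert scores below are hypothetical, not the study's data), the Pearson coefficient and its p-value can be computed as follows.

```python
# Toy Pearson correlation between two questionnaire score series (placeholder data).
from scipy.stats import pearsonr

management = [4, 5, 3, 4, 5, 4, 3, 5]   # Likert (1-5) totals per respondent
tourism    = [4, 4, 3, 5, 5, 4, 2, 5]

r, p_value = pearsonr(management, tourism)
print(f"r = {r:.3f}, p = {p_value:.4f}")
```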
The selection of optimal text embedding models remains a critical challenge in semantic textual similarity (STS) tasks, particularly when performance varies substantially across datasets. In this study, the comparative effectiveness of multiple state-of-the-art embedding models was systematically evaluated using a benchmarking framework based on established machine learning techniques. A range of embedding architectures was examined across diverse STS datasets, with similarity computations performed using Euclidean distance, cosine similarity, and Manhattan distance metrics. Performance evaluation was conducted through Pearson and Spearman correlation coefficients to ensure robust and interpretable assessments. The results revealed that GIST-Embedding-v0 consistently achieved the highest average correlation scores across all datasets, indicating strong generalizability. Nevertheless, MUG-B-1.6 demonstrated superior performance on datasets 2, 6, and 7, while UAE-Large-V1 outperformed other models on datasets 3 and 5, thereby underscoring the influence of dataset-specific characteristics on embedding model efficacy. These findings highlight the importance of adopting a dataset-aware approach in embedding model selection for STS tasks, rather than relying on a single universal model. Moreover, the observed performance divergence suggests that embedding architectures may encode semantic relationships differently depending on domain-specific linguistic features. By providing a detailed evaluation of model behavior across varied datasets, this study offers a methodological foundation for embedding selection in downstream NLP applications. The implications of this research extend to the development of more reliable, scalable, and context-sensitive STS systems, where model performance can be optimized based on empirical evidence rather than heuristics. These insights are expected to inform future investigations on embedding adaptation, hybrid model integration, and meta-learning strategies for semantic similarity tasks.
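A minimal sketch of the benchmarking loop is shown below; the sentence pairs, gold scores, and model identifier are assumptions. Each similarity measure is scored against gold labels with Pearson and Spearman correlations.

```python
# Hedged STS benchmarking sketch (toy pairs and gold labels, assumed model id).
from scipy.stats import pearsonr, spearmanr
from scipy.spatial.distance import cosine, euclidean, cityblock
from sentence_transformers import SentenceTransformer

pairs = [
    ("A man is playing a guitar.", "A person plays the guitar."),
    ("A woman is slicing onions.", "Someone is cutting an onion."),
    ("A plane is taking off.", "The kids are playing outside."),
    ("A dog runs in the park.", "The stock market fell sharply."),
]
gold = [4.8, 4.2, 0.6, 0.1]                               # hypothetical 0-5 labels

model = SentenceTransformer("avsolatorio/GIST-Embedding-v0")   # assumed identifier
emb = [model.encode(list(p)) for p in pairs]

scores = {
    "cosine":    [1 - cosine(a, b) for a, b in emb],
    "euclidean": [-euclidean(a, b) for a, b in emb],       # negated: higher = more similar
    "manhattan": [-cityblock(a, b) for a, b in emb],
}
for name, s in scores.items():
    print(name, pearsonr(gold, s)[0], spearmanr(gold, s)[0])
```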
The effects of polycarboxylate superplasticizer (PCE) on the rheological properties and workability of cement-based composites were investigated by testing parameters such as static yield stress, dynamic yield stress, plastic viscosity, slump flow, bleeding rate, and penetration depth. The correlation between the dosage of PCE and the rheological parameters of fresh cement-based composites was analyzed. The results indicated that with an increase in the PCE dosage, the static yield stress, dynamic yield stress, and plastic viscosity of fresh cement-based composites decreased, demonstrating that PCE can improve the rheological properties of these composites. As the PCE dosage increased, the slump flow and bleeding rate of fresh cement-based composites also increased, but the rate of change decreased at higher dosages. Additionally, with an increase in PCE dosage, the penetration depth gradually increased, while the penetration depth difference ($\Delta {H}$) decreased. Furthermore, the compressive strength of cement-based composite cubes slightly decreased with an increase in PCE dosage.
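As a hedged illustration, fresh cement-based composites are commonly described with the Bingham model, $\tau = \tau_0 + \mu_p \dot{\gamma}$, so the dynamic yield stress and plastic viscosity can be read off a linear fit of flow-curve data; the shear-rate and stress values below are hypothetical.

```python
# Bingham-model fit for dynamic yield stress and plastic viscosity (toy data).
import numpy as np

gamma_dot = np.array([10, 20, 40, 60, 80, 100], dtype=float)   # shear rate [1/s]
tau = np.array([35, 48, 72, 95, 120, 144], dtype=float)        # shear stress [Pa]

mu_p, tau_0 = np.polyfit(gamma_dot, tau, 1)   # slope = plastic viscosity, intercept = yield stress
print(f"dynamic yield stress ~ {tau_0:.1f} Pa, plastic viscosity ~ {mu_p:.2f} Pa*s")
```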
To mitigate safety risks in subway shield construction within water-rich silty fine sand layers, a risk immunization strategy based on complex network theory was proposed. Safety risk factors were systematically identified through literature review and expert consultation, and their relationships were modeled as a complex network. Unlike traditional single-index analyses, this study integrated degree centrality, betweenness centrality, eigenvector centrality, and clustering coefficient centrality to comprehensively evaluate the importance of risk factors. Results indicated that targeted immunization strategies significantly outperformed random immunization, with degree centrality (DC) and betweenness centrality (BC) immunization demonstrating the best performance. Key risk sources included stratum stability, allowable surface deformation, surface settlement monitoring, and shield tunneling control. Furthermore, the optimal two-factor coupling immunization strategy was found to be the combination of DC and BC strategies, which provided the most effective risk prevention. This study is the first to apply complex network immunization simulation to safety risk management in subway shield construction, enhancing the risk index system and validating the impact of different immunization strategies on overall safety. The findings offer scientific guidance for risk management in complex geological conditions and provide theoretical support and practical insights for improving construction safety.
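A simplified sketch of centrality-based immunization is shown below; the random graph stands in for the study's risk-factor network. Top-ranked nodes are removed and the size of the residual largest connected component serves as a crude effectiveness proxy.

```python
# Hedged sketch: degree vs. betweenness immunization on a placeholder network.
import networkx as nx

G = nx.erdos_renyi_graph(n=60, p=0.08, seed=42)      # placeholder risk network

def immunize(graph, centrality_fn, k=5):
    """Remove the k most central nodes; return the remaining largest component size."""
    ranked = sorted(centrality_fn(graph).items(), key=lambda kv: kv[1], reverse=True)
    g = graph.copy()
    g.remove_nodes_from([node for node, _ in ranked[:k]])
    return len(max(nx.connected_components(g), key=len))

print("degree immunization:     ", immunize(G, nx.degree_centrality))
print("betweenness immunization:", immunize(G, nx.betweenness_centrality))
```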
Seed quality is an important aspect of agriculture that directly influences crop yield and germination percentage. Traditional seed testing techniques rely on visual examination, which is cumbersome, inflexible, and inefficient for effective assessment. This study proposed an automated approach to seed quality assessment based on physical measurement using machine learning and image processing techniques. Images of the seeds were captured and underwent image enhancement, segmentation, and feature extraction to derive notable morphological attributes, such as size and colour. Support Vector Machines (SVMs) were used as a reference model to tag seeds as "good" or "bad" based on physical characteristics, while Convolutional Neural Networks (CNNs) were utilised for deep feature extraction and classification. Experimental findings indicate that CNNs outperform conventional machine learning models, providing a scalable and highly accurate method of seed quality assessment. Future work will explore quantum machine learning to improve prediction and facilitate sustainable, precision agriculture. The proposed framework, carefully optimised for onion seeds, represents a significant advancement toward increasing the agricultural productivity of onion cultivation.
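A minimal sketch of the SVM reference model on hand-crafted morphological features is shown below; the feature values and labels are hypothetical, and the CNN pipeline is not reproduced.

```python
# Toy SVM baseline on simple morphological descriptors (area, mean colour channels).
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# [area_px, mean_R, mean_G, mean_B] per seed image (placeholder measurements)
X = [[520, 110, 90, 60], [480, 105, 88, 58], [300, 140, 120, 95],
     [515, 108, 92, 61], [290, 150, 125, 100], [310, 145, 118, 90]]
y = ["good", "good", "bad", "good", "bad", "bad"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0, stratify=y)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```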
Transformer-based language models have demonstrated remarkable success in few-shot text classification; however, their effectiveness is often constrained by challenges such as high intraclass diversity and interclass similarity, which hinder the extraction of discriminative features. To address these limitations, a novel framework, Adaptive Masking Bidirectional Encoder Representations from Transformers with Dynamic Weighted Prototype Module (AMBERT-DWPM), is introduced, incorporating adaptive masking and dynamic weighted prototypical learning to enhance feature representation and classification performance. The standard BERT architecture is refined by integrating an adaptive masking mechanism based on Layered Integrated Gradients (LIG), enabling the model to dynamically emphasize salient text segments and improve feature discrimination. Additionally, a DWPM is designed to assign adaptive weights to support samples, mitigating inaccuracies in prototype construction caused by intraclass variability. Extensive evaluations conducted on six publicly available benchmark datasets demonstrate the superiority of AMBERT-DWPM over existing few-shot classification approaches. Notably, under the 5-shot setting on the DBpedia14 dataset, an accuracy of 0.978±0.004 is achieved, highlighting significant advancements in feature discrimination and generalization capabilities. These findings suggest that AMBERT-DWPM provides an efficient and robust solution for few-shot text classification, particularly in scenarios characterized by limited and complex textual data.
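The following sketch illustrates the weighted-prototype idea in the spirit of the DWPM, using synthetic embeddings and a placeholder weighting rule rather than the paper's exact formulation: each class prototype is a weighted mean of support embeddings, and queries are assigned to the nearest prototype.

```python
# Hedged weighted-prototype sketch (synthetic embeddings, placeholder weights).
import torch
import torch.nn.functional as F

n_classes, n_support, dim = 3, 5, 768
support = torch.randn(n_classes, n_support, dim)   # support embeddings (e.g., from BERT)
query = torch.randn(4, dim)                        # query embeddings

# adaptive sample weights per class (placeholder: softmax over an arbitrary score)
scores = torch.randn(n_classes, n_support)
weights = F.softmax(scores, dim=1).unsqueeze(-1)   # (C, S, 1), sums to 1 per class

prototypes = (weights * support).sum(dim=1)        # (C, dim) weighted prototypes
dists = torch.cdist(query, prototypes)             # Euclidean distances (Q, C)
pred = dists.argmin(dim=1)                         # nearest-prototype labels
print(pred)
```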
Supply chain digitalization (SCD) has been recognized as a critical enabler of high-quality development in the manufacturing sector. To explore its influence mechanisms, an SCD indicator was constructed through textual analysis of corporate disclosures by Chinese manufacturing firms listed on the Shanghai and Shenzhen A-share markets from 2008 to 2022. Based on the theoretical lens of supply chain integration, the impact of SCD on high-quality development was empirically examined. The findings indicate that SCD significantly promotes high-quality development across manufacturing firms. Further analysis revealed that this relationship is positively mediated by two core mechanisms: supply chain collaborative innovation and the advancement of supply chain finance (SCF). These mediating effects were found to be strengthened under conditions of heightened environmental dynamism, underscoring the adaptive value of digital supply chain capabilities in volatile contexts. Heterogeneity analysis demonstrated that the positive effects of SCD are more pronounced in non-state-owned enterprises, firms in growth or decline stages, and those characterized by low levels of resource slack. Additionally, the long-term economic consequences of SCD were evaluated, and it was observed that enhanced digitalization contributes to the stable growth of firms’ long-term value by reinforcing their high-quality development trajectories. By clarifying the pathways through which SCD influences development outcomes, this study offers empirical evidence that enriches the existing body of literature on digital transformation within supply chains. Moreover, practical implications are provided for policy formulation and strategic decision-making aimed at fostering digitally integrated, innovation-driven, and financially resilient manufacturing ecosystems.
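As a hedged illustration of how a disclosure-based indicator of this kind is often constructed (the keyword dictionary and scaling below are assumptions, not the study's text-analysis scheme), keyword hits in an annual report can be counted and scaled by document length.

```python
# Placeholder disclosure-frequency indicator for supply chain digitalization.
def scd_indicator(report_text: str, keywords=None) -> float:
    """Keyword hits per 1,000 words as a crude digitalization proxy (illustrative)."""
    keywords = keywords or ["supply chain digitalization", "smart supply chain",
                            "blockchain", "internet of things", "supply chain platform"]
    text = report_text.lower()
    words = max(len(text.split()), 1)
    hits = sum(text.count(k) for k in keywords)
    return 1_000 * hits / words

sample = ("The company advanced supply chain digitalization by building a smart "
          "supply chain platform and applying internet of things technologies.")
print(round(scd_indicator(sample), 2))
```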
Ensuring the integrity of goods during cold chain transportation remains a critical challenge in logistics, as it is essential to preserve product quality, freshness, and compliance with stringent safety standards. Strategic decision-making in this context requires the prioritization of customer requirements and the optimal allocation of limited operational resources. In response to these demands, an integrated Multi-Criteria Decision-Making (MCDM) model was developed by combining the Best-Worst Method (BWM), Quality Function Deployment (QFD), and Measurement of Alternatives and Ranking according to Compromise Solution (MARCOS) approach. Within this framework, BWM was utilized to determine the relative importance of user requirements, which were then mapped onto specific operational resources through QFD to identify critical resource elements and derive their corresponding weights. These weights, subsequently treated as evaluation criteria in the MARCOS method, were applied to assess the performance of Third-Party Logistics (3PL) providers. The proposed methodology was validated through a case study involving eight user requirements and seven key resources. The findings indicated that precise temperature control and delivery speed were the most critical user requirements, whereas advanced temperature sensors and vehicles with cooling systems were identified as the most significant resources. Based on the MARCOS evaluation, Provider 1 emerged as the optimal 3PL alternative. This integrated decision-making model offers a systematic and data-driven approach for aligning customer priorities with resource capabilities, thereby enabling logistics providers to enhance service quality, operational efficiency, and strategic competitiveness in temperature-sensitive supply chains. The model also demonstrates practical scalability and adaptability across diverse cold chain scenarios.
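A small numerical sketch of the QFD propagation step is given below (all weights and relationship values are hypothetical): BWM-derived requirement weights are mapped through a relationship matrix to obtain normalized resource weights, which would then serve as MARCOS criteria weights.

```python
# Hedged QFD weight-propagation sketch (placeholder numbers, not the case-study data).
import numpy as np

ur_weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # requirement weights from BWM

# relationship matrix: rows = user requirements, columns = resources (0/1/3/9 scale)
R = np.array([
    [9, 3, 0, 1],
    [3, 9, 1, 0],
    [1, 3, 9, 0],
    [0, 1, 3, 9],
    [3, 0, 1, 3],
])

resource_scores = ur_weights @ R                            # importance of each resource
resource_weights = resource_scores / resource_scores.sum()  # normalized criteria weights
print(resource_weights.round(3))
```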