Acadlore Transactions on Applied Mathematics and Statistics (ATAMS)
ISSN (print): 2959-4057
ISSN (online): 2959-4065

Acadlore Transactions on Applied Mathematics and Statistics (ATAMS) is dedicated to advancing research in the fields of applied mathematics and statistics. Highlighting the pivotal role of mathematical methodologies and statistical techniques in diverse real-world applications, ATAMS strives to decode the complexities underpinning these domains. Published quarterly by Acadlore, this peer-reviewed, open access journal typically issues its editions in March, June, September, and December each year.

  • Professional Service - Every article submitted undergoes an intensive yet swift peer review and editing process, adhering to the highest publication standards.

  • Prompt Publication - Thanks to our proficiency in orchestrating the peer-review, editing, and production processes, all accepted articles see rapid publication.

  • Open Access - Every published article is instantly accessible to a global readership, allowing for uninhibited sharing across various platforms at any time.

Editors-in-Chief (2)
Bisera Andrić Gušavac
University of Belgrade, Serbia
bisera.andric.gusavac@fon.bg.ac.rs
Research interests: Mathematical Modelling; Optimization; Industrial Engineering; Performance Analytics
Milena Popović
University of Belgrade, Serbia
milena.popovic@fon.bg.ac.rs
Research interests: Data Envelopment Analysis; Quantitative Models and Methods; Mathematical Modelling; Optimization; Business Analytics and Performance Analytics

Aims & Scope

Aims

Acadlore Transactions on Applied Mathematics and Statistics (ATAMS) stands as an academic beacon in the realms of applied mathematics and statistics, illuminating the academic horizon with profound insights. Designed to serve as a nexus for the global community of researchers, scholars, and professionals, ATAMS is committed to showcasing groundbreaking research articles, in-depth reviews, and technical notes that span the myriad intersections of mathematical applications and statistical methodologies.

As modern challenges beckon innovative solutions, the journal's core revolves around the transformative potential of mathematical and statistical theories. These theories, often intricately woven into sectors ranging from engineering to economics, physical to social sciences, form the fabric of contemporary advancements. ATAMS champions not just the formulation of avant-garde mathematical models but ardently promotes their practical applications, solving real-world conundrums.

Holding the torch of academic excellence, ATAMS seeks manuscripts that redefine boundaries, stir intellectual curiosity, and instigate meaningful discussions. By fostering a milieu of interdisciplinary dialogues and collaborative ventures, the journal becomes an academic crucible where theories meld and ideas crystallize.

Advocating for exhaustive explorations, ATAMS believes in unbridled knowledge dissemination. Consequently, there are no confines on the length of contributions. Authors are encouraged to elucidate with thoroughness, ensuring the replicability of their findings. Distinctive features of the journal encompass:

  • A commitment to equitable academic services, ensuring authors, irrespective of their geographical origins, receive unparalleled support.

  • An agile review mechanism that underpins academic rigor, paired with expedited post-approval publication timelines.

  • An expansive reach, powered by the journal's open access directive, ensuring research resonates globally.

Scope

In its pursuit of academic breadth and depth, ATAMS's scope is vast, intricately designed to cover the spectrum of applied mathematics and statistics. It includes:

  • Mathematical Modeling: A comprehensive exploration into how mathematical methods are tailored to describe, forecast, and resolve intricate real-world challenges, ranging from ecological systems to intricate urban planning.

  • Statistical Theory and Innovations: This section not only introduces novel statistical methods but also critically evaluates their properties, potential pitfalls, and adaptability in diverse scenarios, shining a light on emerging trends and their applicability in new domains.

  • Data Synthesis and Mining: Beyond just extraction, the focus here is on the holistic lifecycle of data. It delves into methods for preprocessing, transformation, deep analysis, interpretation, and the eventual representation of data to ensure informed decision-making.

  • Advanced Numerical Computations: Celebrating the confluence of pure mathematics, algorithm design, and computational sciences, this segment highlights the latest strides in numerical methods, iterative techniques, and high-performance computing applications.

  • Interdisciplinary Matrix: A deep dive rather than a cursory glance, this section spans the precision required in financial mathematics, the sensitivity of medical statistics, the predictive power of biostatistics, and the large-scale implications of environmental statistics.

  • Probabilistic Systems and Stochastic Analysis: Investigate the realms of randomness and uncertainty, dissecting how probabilistic models and stochastic methodologies can offer insights in fields as varied as finance, quantum mechanics, and epidemiology.

  • Optimization Techniques: Be it linear programming, dynamic optimization, or the newer realms of quantum optimization, this domain touches upon the algorithms and strategies that strive for perfection, ensuring resources are utilized to their utmost potential.

  • Time Series Analysis and Forecasting: Engage with the rhythmic dance of data over time, understanding patterns, anomalies, and making informed predictions about future behaviors, critical for sectors like finance, meteorology, and even social sciences.

  • Machine Learning and Artificial Intelligence: In this age of automation and intelligence, understand the mathematical underpinnings of ML algorithms, neural network design, and the statistical validations that ensure AI operates within expected paradigms.

  • Graph Theory and Network Analysis: From social networks, biological pathways to the vast world wide web, delve into the intricate patterns, connectivity issues, and the cascading effects within networks.

Recent Articles

Abstract

The forecasting of wheat commodity prices plays a crucial role in mitigating financial risks for stakeholders across the agricultural supply chain. In this study, the predictive performance of three models—Simple Moving Average (SMA), Extreme Gradient Boosting (XGBoost), and a hybrid SMA-XGBoost model—was evaluated to determine their efficacy in capturing both linear trends and complex nonlinear patterns inherent in wheat price data. A 10-lag structure was employed to integrate historical dependencies and seasonal fluctuations, thereby enhancing the accuracy of trend identification. The dataset was partitioned into training (75%) and testing (25%) subsets to facilitate an objective performance assessment. The XGBoost model, known for its capability in modelling nonlinear dependencies, demonstrated the highest forecasting precision, achieving a Mean Absolute Percentage Error (MAPE) of 1.64%. The hybrid SMA-XGBoost model, which leveraged the complementary strengths of both SMA and XGBoost, yielded a MAPE of 1.75%, outperforming the standalone SMA model, which exhibited a MAPE of 2.60%. While the hybrid model displayed slightly lower accuracy than XGBoost, it offered greater stability and robustness by effectively balancing trend extraction and nonlinear adaptability. These findings highlight the hybrid approach as a viable alternative to purely machine learning-based forecasting methods, particularly in scenarios requiring resilience to diverse market fluctuations. The proposed methodology provides a valuable tool for policymakers, agricultural producers, and market analysts seeking to enhance decision-making strategies and optimize risk management within the agricultural sector.
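A minimal, hedged sketch of the pipeline described above is given below; it is not the authors' code. The abstract fixes only the 10-lag structure, the 75%/25% split, and MAPE as the accuracy metric, so the series name, the XGBoost hyperparameters, and the residual-correction way of combining the SMA trend with XGBoost are illustrative assumptions.

```python
# Hedged sketch, not the authors' implementation: a 10-lag SMA + XGBoost
# hybrid for price forecasting. `prices`, the hyperparameters, and the
# residual-correction hybrid design are assumptions for illustration.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor

def make_lagged(series: pd.Series, n_lags: int = 10) -> pd.DataFrame:
    """Build a supervised frame with columns lag_1..lag_{n_lags} and target y."""
    frame = pd.DataFrame({f"lag_{i}": series.shift(i) for i in range(1, n_lags + 1)})
    frame["y"] = series
    return frame.dropna()

def mape(actual, predicted) -> float:
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

def hybrid_mape(prices: pd.Series, n_lags: int = 10, train_frac: float = 0.75) -> float:
    data = make_lagged(prices, n_lags)
    split = int(len(data) * train_frac)              # 75% train / 25% test
    train, test = data.iloc[:split], data.iloc[split:]
    lag_cols = [f"lag_{i}" for i in range(1, n_lags + 1)]

    # Linear trend component: simple moving average of the lagged prices.
    sma_train = train[lag_cols].mean(axis=1)
    sma_test = test[lag_cols].mean(axis=1)

    # Nonlinear component: XGBoost fitted on the SMA residuals.
    model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
    model.fit(train[lag_cols], train["y"] - sma_train)

    hybrid_pred = sma_test + model.predict(test[lag_cols])
    return mape(test["y"], hybrid_pred)
```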

Open Access
Research article
Challenges in the Adaptation of Biomass Energy in India: A Multi-Criteria Decision-Making Approach Using DEMATEL
Tripti Basuri, Srabani Guria Das, Aditi Biswas, Kamal Hossain Gazi, Sankar Prasad Mondal, Arijit Ghosh
Available online: 12-30-2024

Abstract

As a rapidly developing nation, India faces an urgent need to diversify its energy portfolio to ensure long-term sustainability and energy security. Biomass energy, as a renewable and sustainable resource, has the potential to play a crucial role in achieving these objectives. Its integration into the national energy framework, however, is hindered by multiple challenges, including technological limitations, socio-economic constraints, and environmental concerns. Despite its advantages—such as reducing greenhouse gas emissions, promoting economic growth, managing waste, and preserving biodiversity—several barriers must be systematically analyzed to facilitate its widespread adoption. In this study, a structured approach is employed to identify and evaluate the key challenges associated with biomass energy adaptation in India. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) methodology is applied to determine the relative importance of these challenges, offering insights into the most critical criteria that require focused intervention. The findings of this study are expected to provide a strategic foundation for policymakers and stakeholders in formulating effective policies and technological solutions to enhance the viability of biomass energy in India's energy transition.
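For readers unfamiliar with the method named above, a minimal sketch of the core DEMATEL computation follows; the 4x4 direct-influence matrix is a placeholder rather than the study's expert data, and the row-sum normalization shown is one common convention.

```python
# Hedged sketch of the standard DEMATEL steps (illustrative data, not the
# study's expert judgements): normalize the direct-influence matrix, form
# the total relation matrix, and read off prominence and cause/effect roles.
import numpy as np

def dematel(direct: np.ndarray):
    """Return (prominence R+C, relation R-C) for each criterion."""
    n = direct.shape[0]
    norm = direct / direct.sum(axis=1).max()          # common normalization
    total = norm @ np.linalg.inv(np.eye(n) - norm)    # T = N (I - N)^{-1}
    r = total.sum(axis=1)                             # influence exerted
    c = total.sum(axis=0)                             # influence received
    return r + c, r - c

# Placeholder 4x4 direct-influence matrix among hypothetical barriers.
D = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [3, 2, 1, 0]], dtype=float)
prominence, relation = dematel(D)   # relation > 0: cause; < 0: effect
```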

Abstract

Graph structures (GSs) have emerged as a robust mathematical framework for modelling and resolving complex combinatorial problems across diverse realms. At the same time, the linear Diophantine fuzzy set (LDFS) is a noteworthy expansion of the conventional concepts of the fuzzy set (FS), intuitionistic fuzzy set (IFS), Pythagorean fuzzy set (PFS), and q-Rung orthopair fuzzy set (q-ROFS). The LDFS framework introduces a flexible parameterization strategy that independently relaxes membership and non-membership constraints through reference parameters, thereby attaining enhanced expressiveness in capturing ambiguous real-world phenomena. In this paper, a novel concept of linear Diophantine fuzzy graph structure (LDFGS) is introduced as a generalization of intuitionistic fuzzy graph structure (IFGS) and linear Diophantine fuzzy graph (LDFG) to GSs. Several fundamental notions in LDFGSs, including $\breve{\rho}_i$-edge, $\breve{\rho}_i$-path, strength of $\breve{\rho}_i$-path, $\breve{\rho}_i$-strength of connectedness, $\breve{\rho}_i$-degree of a vertex, degree of a vertex, total $\breve{\rho}_i$-degree of a vertex, and the total degree of a vertex in an LDFGS are discussed. Additionally, the $\breve{\rho}_i$-size of an LDFGS, the size of an LDFGS, and the order of an LDFGS are studied. Furthermore, the ideas of the maximal product of two LDFGSs, strong LDFGSs, and the degree and $\breve{\rho}_i$-degree of the maximal product are introduced with several concrete illustrations. To empirically validate the efficacy and practical utility of the proposed LDFGS framework, this study presents a case study analyzing road crime patterns across heterogeneous urban regions in Sindh province, Pakistan.
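For orientation, the linear Diophantine fuzzy set model referred to above is usually stated as follows (standard LDFS definition, not a formulation specific to this paper): each element $x$ of a universe $X$ carries a membership grade $\mu(x)$, a non-membership grade $\nu(x)$, and reference parameters $\alpha, \beta$ satisfying

\[ \mathcal{L} = \bigl\{ \bigl( x, \langle \mu(x), \nu(x) \rangle, \langle \alpha, \beta \rangle \bigr) : x \in X \bigr\}, \qquad 0 \le \alpha\,\mu(x) + \beta\,\nu(x) \le 1, \quad 0 \le \alpha + \beta \le 1, \]

so the grades are constrained only through the reference parameters, which is the relaxation of the IFS, PFS, and q-ROFS restrictions that the abstract refers to.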

Abstract

Nanofluids, which are suspensions of nanoparticles in base fluids, have demonstrated considerable potential in enhancing thermal conductivity, energy storage, and lubrication properties, as well as improving the cooling efficiency of electronic devices. Despite their promising applications, the industrial utilization of nanofluids remains in the early stages, with further research needed to fully explore their capabilities. This study investigates a generalized nanofluid model, incorporating a fractal-fractional derivative (FFD), to better understand the thermophysical behaviors in vertical channel flow. The nanofluid consists of polystyrene nanoparticles uniformly dispersed in kerosene oil. An exact solution to the model is obtained by employing the Laplace transform technique (LTT) in combination with Zakian's numerical algorithm. The FFD operator with an exponential kernel is applied to extend the classical nanofluid model. Discretization of the generalized model is achieved using the Crank-Nicolson method, and numerical simulations are performed to solve the resulting equations. The study reveals that, at a nanoparticle volume fraction of 4% (0.04), the heat transfer rate of the nanofluid is significantly higher than that of the base fluid. Furthermore, the enhanced heat transfer leads to improvements in various thermophysical properties, such as viscosity, thermal expansion, and heat capacity, which are crucial for industrial applications. The numerical results are presented graphically to highlight the dependence of the flow and thermal dispersion characteristics on key physical factors. These findings suggest that the use of fractal-fractional models can provide a more accurate representation of nanofluid behavior, particularly for high-precision applications in heat transfer and energy systems.
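The Crank-Nicolson discretization mentioned above can be illustrated, in a deliberately generic form, on a plain one-dimensional diffusion equation $u_t = k\,u_{xx}$; the grid sizes, diffusivity, and boundary conditions below are placeholders, and this sketch is not the paper's fractal-fractional nanofluid model.

```python
# Generic, hedged Crank-Nicolson sketch for u_t = k u_xx (not the paper's
# model): average the spatial operator between time levels n and n+1.
import numpy as np

def crank_nicolson(u0: np.ndarray, k: float = 1.0, dx: float = 0.02,
                   dt: float = 1e-3, steps: int = 100) -> np.ndarray:
    n = len(u0)
    r = k * dt / (2.0 * dx ** 2)
    # Second-difference matrix A (so the discrete Laplacian is -A / dx^2).
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    lhs = np.eye(n) + r * A          # (I + rA) u^{n+1} = (I - rA) u^n
    rhs = np.eye(n) - r * A
    u = u0.astype(float).copy()
    for _ in range(steps):
        u = np.linalg.solve(lhs, rhs @ u)
        u[0] = u[-1] = 0.0           # simple Dirichlet boundaries
    return u

profile = crank_nicolson(np.sin(np.pi * np.linspace(0.0, 1.0, 51)))
```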

Abstract

Efficient classification of interval data presents considerable challenges, particularly when group overlaps and data uncertainty are prevalent. This study introduces an innovative two-stage Mixed Integer Programming (MIP) framework for discriminant analysis (DA), which is designed to minimize misclassification of vertices while effectively addressing the problem of overlapping groups. By incorporating interval data structures, the proposed model captures both the shared characteristics within groups and the distinct separations between them. The first stage of the model focuses on the identification of group-specific boundaries, while the second stage refines classification by incorporating probabilistic estimates of group memberships. A Monte Carlo simulation is employed to evaluate the robustness of the model under conditions of imprecision and noise, and the results demonstrate its superior capability in handling overlapping data and classifying uncertain observations. Validation through numerical experiments illustrates the model’s effectiveness in accurately resolving group overlaps, thereby improving classification performance. The approach offers significant advantages over traditional methods by probabilistically estimating group memberships, thus enhancing decision-making processes in uncertain environments. These findings suggest that the proposed MIP framework holds substantial promise for applications across a range of complex decision-making scenarios, such as those encountered in finance, healthcare, and engineering, where data imprecision is a critical concern.
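For orientation only, the sketch below shows the classic misclassification-minimizing MIP for two-group discriminant analysis in PuLP; the big-M value, epsilon, variable bounds, and point-valued data are illustrative assumptions, and it does not reproduce the paper's two-stage interval formulation or its probabilistic membership stage.

```python
# Hedged sketch of a classic misclassification-minimizing MIP for two-group
# discriminant analysis (not the authors' two-stage interval model).
import pulp

def mip_discriminant(group1, group2, big_m=100.0, eps=1e-3):
    p = len(group1[0])
    prob = pulp.LpProblem("mip_da", pulp.LpMinimize)
    w = [pulp.LpVariable(f"w{j}", -10, 10) for j in range(p)]      # weights
    c = pulp.LpVariable("cutoff", -100, 100)                       # cutoff score
    z = [pulp.LpVariable(f"z{i}", cat="Binary")                    # 1 = misclassified
         for i in range(len(group1) + len(group2))]
    prob += pulp.lpSum(z)                      # minimize misclassifications
    for i, x in enumerate(group1):             # group 1 should score <= c
        prob += pulp.lpSum(w[j] * x[j] for j in range(p)) <= c + big_m * z[i]
    for i, x in enumerate(group2, start=len(group1)):   # group 2 should score > c
        prob += pulp.lpSum(w[j] * x[j] for j in range(p)) >= c + eps - big_m * z[i]
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return [v.value() for v in w], c.value()

weights, cutoff = mip_discriminant([(1.0, 2.0), (1.5, 1.8)],
                                   [(3.0, 3.2), (2.8, 3.5)])
```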

Abstract

This study proposes an advanced framework for performance evaluation by extending the Malmquist Productivity Index (MPI) to accommodate interval data, addressing the inherent uncertainty and imprecision frequently encountered in institutional assessments. In many contexts, input-output data are often reported as intervals rather than precise values, which poses significant challenges for evaluating productivity changes. The extended MPI model allows for a more comprehensive analysis of performance by incorporating such interval data, thus providing a robust mechanism for assessing both progress and regression in the productivity of Decision-Making Units (DMUs). A case study on university departments is employed to demonstrate the practical application of this interval-based model. The results highlight notable variations in efficiency and technological advancement, offering valuable insights for institutional decision-makers. The proposed methodology enhances the accuracy of performance evaluation in dynamic and uncertain environments, making it a powerful tool for strategic planning and policy formulation. Furthermore, it is suggested that this interval-based approach offers a significant improvement over traditional models by accounting for the uncertainty present in real-world data. The study contributes to the broader field of strategic performance analytics by advancing the methodological understanding of productivity analysis, offering a more nuanced and reliable framework for institutional assessment.
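For reference, the crisp Malmquist index that the interval extension builds on is the geometric mean of two distance-function ratios (standard textbook form, not reproduced from the paper), with $D^{t}$ denoting the distance function relative to the period-$t$ technology:

\[ \mathrm{MPI}^{t,t+1} = \left[ \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)} \cdot \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)} \right]^{1/2}. \]

Values above 1 indicate productivity progress and values below 1 indicate regression; with interval inputs and outputs, the distance functions, and hence the index, are typically evaluated as lower and upper bounds.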

Abstract

This study investigates the regional logistics efficiency of Sichuan Province, China, from 2011 to 2019, using a combination of the Data Envelopment Analysis-Banker, Charnes, and Cooper (DEA-BCC) model and the Tobit model. The primary objective is to assess the efficiency of the logistics industry and identify the key determinants influencing this efficiency within the context of high-quality development. A comprehensive input-output index system and a set of influencing factor variables were constructed to evaluate logistics performance across various regions of the province. The findings indicate that factors such as the level of economic development, urbanization, and geographical location significantly enhance regional logistics efficiency. In contrast, the level of informatization and the industrial structure exhibit clear inhibitory effects. Specifically, a higher degree of informatization does not necessarily correspond with improved logistics efficiency, potentially due to inefficiencies in technology adoption or uneven infrastructure development. Furthermore, the current industrial structure, with its reliance on traditional industries, may hinder the optimization of logistics systems. Based on these results, several policy recommendations are put forward, including the optimization of the industrial structure, better integration of information technologies in logistics processes, and the strategic utilization of Sichuan’s geographical advantages. This research provides valuable insights for policymakers aiming to enhance logistics efficiency as part of the region’s broader economic development strategy.
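As background for the DEA-BCC stage, the standard input-oriented BCC envelopment model for a unit under evaluation with inputs $x_{i0}$ and outputs $y_{r0}$ takes the familiar textbook form below (not necessarily the paper's exact specification):

\[ \begin{aligned} \min_{\theta,\,\lambda} \quad & \theta \\ \text{s.t.} \quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{i0}, \quad i = 1, \dots, m, \\ & \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{r0}, \quad r = 1, \dots, s, \\ & \sum_{j=1}^{n} \lambda_j = 1, \qquad \lambda_j \ge 0. \end{aligned} \]

The convexity constraint $\sum_j \lambda_j = 1$ is what distinguishes BCC (variable returns to scale) from CCR; the resulting efficiency scores are then regressed on the candidate determinants in the Tobit stage.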

Abstract

The incorporation of fractional calculus into nanofluid models has proven effective in capturing the complex dynamics of nanofluid flow and heat transfer, thereby enhancing the precision of predictions in this intricate field. In this study, the dynamics of a viscoelastic second-grade nanofluid over a vertical plate are examined through the application of the Laplace transform technique. Initially, the model is formulated as coupled partial differential equations to describe the second-grade nanofluid system. The governing equations are then rendered dimensionless using appropriate dimensionless parameters. The non-dimensional model is subsequently generalized by introducing a modified Caputo fractional derivative operator. To model a homogeneous nanofluid, nanometer-sized $\mathrm{Al}_2 \mathrm{O}_3$ nanoparticles are suspended in mineral transformer oil. The Laplace transform is employed to solve the momentum, energy, and mass diffusion equations, providing analytical solutions. Graphical and tabular analyses are conducted to assess the influence of various physical parameters, including the fractional order, nanoparticle volume fraction, and time parameter, on the velocity, thermal, and concentration profiles. The results indicate that increasing the nanoparticle volume fraction, fractional order, and time parameter significantly enhances the rate of heat transfer. Additionally, it is observed that the velocity, temperature, and concentration profiles are notably affected by increasing the volume fraction of nanoparticles. The accuracy and reliability of the obtained solutions are validated through comparisons with existing literature. This work advances the understanding of nanofluid dynamics and presents valuable insights for industrial applications, particularly in enhancing heat transfer performance.
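For context, the classical Caputo derivative of order $0 < \alpha < 1$, which the modified operator used above generalizes, is defined by (standard definition, not the paper's modified form):

\[ {}^{C}D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\alpha}}\, d\tau, \]

which reduces to the ordinary first derivative as $\alpha \to 1$.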

Abstract

This work aims to apply the spherical fuzzy set (SFS), a flexible framework for handling ambiguous human opinions, to improve decision-making processes in recycled water selection. It specifically examines the application of Sugeno-Weber (SW) triangular norms in the spherical fuzzy (SF) information domain, providing reliable approximations that are necessary for decision-making. A new class of aggregation operators is presented in this paper. These operators are designed specifically for spherical fuzzy information systems and include the interval-valued spherical fuzzy Sugeno–Weber power average (IVSFSWPA), interval-valued spherical fuzzy Sugeno–Weber power weighted geometric (IVSFSWPWG), and interval-valued spherical fuzzy Sugeno–Weber power weighted average (IVSFSWPWA). The realistic features and special cases of these operators are demonstrated, highlighting how well they fit practical scenarios. A new method for multi-attribute decision-making (MADM) is applied to a range of real-world applications with different requirements and characteristics. The efficacy of the recommended methodologies is demonstrated with an example of a recycled water selection process. Additionally, a thorough comparison is provided to show how the suggested aggregation strategies work and where they are relevant, by contrasting their results with those of existing methods. The conclusion summarizes the findings, discusses future prospects, and highlights the potential contribution of the recommended research to the advancement of decision-making techniques in dynamic and complex environments.
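For reference, the Sugeno-Weber t-norm family that parameterizes these aggregation operators is (standard form; the dual t-conorm follows from $S(a,b) = 1 - T(1-a, 1-b)$):

\[ T^{SW}_{\lambda}(a, b) = \max\!\left( 0,\; \frac{a + b - 1 + \lambda\, a b}{1 + \lambda} \right), \qquad \lambda \in (-1, \infty), \]

which recovers the Łukasiewicz t-norm at $\lambda = 0$ and approaches the product t-norm as $\lambda \to \infty$.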

Abstract

In the present work, we investigate the collapsing and expanding solutions of Einstein's field equations for a charged anisotropic fluid in spherically symmetric space-time within the framework of ${f(R, T)}$ theory, where $R$ denotes the Ricci scalar and $T$ denotes the trace of the energy-momentum tensor. We also evaluate the expansion scalar, whose negative values result in collapse and positive values yield expansion. We analyze the impact of charge in ${f(R, T)}$ theory on the density and pressure distributions of the collapsing and expanding fluid and observe the role of the anisotropic fluid in the charged collapsing and expanding processes. Furthermore, the definition of the mass function is used to analyze the condition for the trapped surface, and it is found that in this case there is only one horizon. In all scenarios, the effects of the coupling parameters $\lambda$ and $q$ are thoroughly examined. Additionally, we present graphs of the pressures, anisotropy, and energy density in ${f(R, T)}$ theory and examine the effect of charge on these quantities.
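The collapse/expansion criterion used above is governed by the expansion scalar of the fluid four-velocity $u^{a}$ (standard definition):

\[ \Theta = \nabla_a u^{a}, \qquad \Theta < 0 \ \text{(collapse)}, \qquad \Theta > 0 \ \text{(expansion)}. \]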
