Blockchain has attracted widespread attention due to its unique features such as decentralization, traceability, and tamper resistance. With the rapid development of blockchain technology, an increasing number of industries are applying it to fields such as the Internet of Things, healthcare, finance, agriculture, and government affairs. However, blockchains in different sectors differ in their underlying architectures, data structures, consensus algorithms, and other aspects, which restricts transactions to a single blockchain. Achieving interoperability between different blockchains is challenging, hindering data exchange and collaborative business and inevitably leading to the problem of “data silos”. Against this backdrop, this study explores a cross-chain solution based on relay technology to address the current challenges of interoperability between blockchain systems. Using relay-based cross-chain technology, a blockchain cross-chain collaboration platform is established to simulate the construction of a real cross-chain network. By deploying business contracts, heterogeneous blockchains can exchange data and resources seamlessly, resolving the challenge of cross-chain interoperability. The research findings demonstrate that the relay-based cross-chain solution effectively enhances interoperability between different blockchain systems, enabling cross-chain asset circulation and information transmission, and highlighting the practical applicability and scalability of this study.
This study introduces logarithmic operations tailored to intuitionistic fuzzy sets (IFSs) aimed at mitigating uncertainty in decision-making processes. Through logarithmic transformations, the membership and non-membership degrees are effectively scaled, thereby enhancing interpretability and facilitating the assessment of uncertainty. Advanced logarithmic aggregation operators have been developed, specifically the Induced Confidence Logarithmic Intuitionistic Fuzzy Einstein Ordered Weighted Geometric Aggregation (ICLIFEOWGA) operator and the Induced Confidence Logarithmic Intuitionistic Fuzzy Einstein Hybrid Geometric Aggregation (ICLIFEHGA) operator. These operators serve as versatile tools, providing robust frameworks for integrating diverse information sources in decision-making and assessment processes. The versatility of the operators is demonstrated through their application across various industries and domains, where they support the integration of multiple criteria in complex decision-making scenarios. An algorithm for the decision-making process is presented, and the effectiveness and efficiency of the proposed techniques are illustrated through a case study on laptop selection.
To facilitate early intervention and control efforts, this study proposes a soybean leaf disease detection method based on an improved Yolov5 model. Initially, image preprocessing is applied to two datasets of diseased soybean leaf images. Subsequently, the original Yolov5s network is modified by replacing the Spatial Pyramid Pooling (SPP) module with a simplified SimSPPF for more efficient and precise feature extraction. The backbone Convolutional Neural Network (CNN) is enhanced with the Bottleneck Transformer (BotNet) self-attention mechanism to accelerate detection. The Complete Intersection over Union (CIoU) loss function is replaced by the Enhanced Intersection over Union (EIoU) loss to increase the model's inference speed, and EIoU-based Non-Maximum Suppression (NMS) is used instead of traditional NMS to optimize the handling of prediction boxes. Experimental results demonstrate that the modified Yolov5s model increases the mean Average Precision (mAP) value by 4.5% compared to the original Yolov5 network for the detection and identification of soybean leaf diseases. Therefore, the proposed method effectively detects and identifies soybean leaf diseases, and its practicality can be validated in actual production environments.
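The EIoU loss mentioned above extends 1 − IoU with penalties on the center distance and on the width and height gaps between boxes, each normalized by the smallest enclosing box. A minimal sketch of that computation for axis-aligned (x1, y1, x2, y2) boxes (the box format and the small epsilon are assumptions, not the paper's implementation):

```python
def eiou_loss(box_a, box_b):
    """EIoU loss between two (x1, y1, x2, y2) boxes: 1 - IoU plus
    center-distance and width/height penalty terms."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection and union areas
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter + 1e-9)
    # Smallest enclosing box and its squared diagonal
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw * cw + ch * ch + 1e-9
    # Squared distance between box centers
    rho2 = ((ax1 + ax2 - bx1 - bx2) ** 2 + (ay1 + ay2 - by1 - by2) ** 2) / 4.0
    # Width/height difference penalties, normalized by the enclosing box
    wa, ha = ax2 - ax1, ay2 - ay1
    wb, hb = bx2 - bx1, by2 - by1
    return (1.0 - iou + rho2 / c2
            + (wa - wb) ** 2 / (cw * cw + 1e-9)
            + (ha - hb) ** 2 / (ch * ch + 1e-9))
```

Identical boxes give a loss near zero, while disjoint boxes exceed 1 because the IoU term saturates and the distance penalties add on top.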
The T-spherical fuzzy set (T-SFS), an advancement over the spherical fuzzy set (SFS), offers a refined approach for addressing contradictions and ambiguities in data. In this context, similarity measures (SMs) serve as critical tools for quantifying the resemblance between fuzzy values, traditionally relying on the calculation of distances between these values. Nevertheless, existing methodologies often encounter irrational outcomes due to certain characteristics and complex operations involved. To surmount these challenges, a novel parametric similarity measure is proposed, grounded in three adjustable parameters. This enables decision-makers to tailor the SM to suit diverse decision-making styles, thereby circumventing the aforementioned irrationalities. An analytical comparison with existing SMs reveals the superiority of the proposed measure through mathematical validation. Furthermore, the utility of this measure is demonstrated in the resolution of multi-attribute decision-making (MADM) problems, highlighting its efficacy over several existing approaches within the domain of T-SFS. The implementation of the proposed SM not only enhances the precision of similarity assessment in fuzzy sets but also significantly contributes to the optimization of decision-making processes.
The regulatory system for hazardous materials is complex, with poor inter-departmental communication and low levels of data sharing, making effective regulation challenging. Blockchain technology, known for its decentralization, traceability, and secure and trustworthy information, is widely applied in data sharing. Concurrently, attribute-based encryption (ABE), a novel encryption technique, offers high security and fine-grained access control, providing technical support for secure data access and privacy protection. However, existing attribute-based encryption algorithms do not consider the hierarchical relationship of access structures among data files during data sharing. Moreover, the immutable nature of blockchain means that access policies stored on it cannot be altered, leading to a lack of flexibility in data sharing. To address these issues, this paper proposes a blockchain and attribute-based dynamic layered access scheme for hazardous materials circulation data sharing. By constructing a Linear Secret-Sharing Scheme (LSSS) matrix, layered access control is achieved, allowing users to decrypt the data corresponding to the parts of the access structure that their attributes match. Additionally, through the design of a policy update algorithm, the blockchain structure is organized into transaction blocks and policy blocks, storing the encrypted symmetric keys separately to enable dynamic updates of access policies. Security analysis and experimental comparisons demonstrate the scheme's effectiveness and security in hazardous materials circulation data sharing.
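The layered access idea, decrypting only the parts of the data whose requirements a user's attributes satisfy, can be illustrated without the LSSS machinery. The toy sketch below (layer names and attributes are invented for illustration; the real scheme evaluates an LSSS matrix over encrypted symmetric keys) captures the matching semantics:

```python
# Each file layer carries an attribute requirement; a user recovers only
# the layers whose requirement is a subset of their attributes, mimicking
# "decrypt the parts of the access structure that the attributes match".
# Layer names and attribute labels are illustrative, not from the paper.
LAYERS = {
    "transport_record": {"carrier", "licensed"},
    "storage_record":   {"warehouse", "licensed"},
    "incident_report":  {"regulator"},
}

def accessible_layers(user_attrs):
    """Return the (sorted) layer names this attribute set can decrypt."""
    return sorted(name for name, req in LAYERS.items()
                  if req <= set(user_attrs))
```

A licensed carrier recovers only the transport layer, while a regulator recovers only the incident layer; partial matches yield partial decryption rather than all-or-nothing access.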
This study introduces novel aggregation operators aimed at enhancing data analysis and decision-making processes through the induction of confidence levels into complex polytopic fuzzy systems. Specifically, the induced confidence complex polytopic fuzzy ordered weighted averaging aggregation (ICCPoFOWAA) operator and the induced confidence complex polytopic fuzzy hybrid averaging aggregation (ICCPoFHAA) operator are proposed. By integrating confidence levels into the aggregation process, these operators facilitate a more nuanced interpretation of fuzzy data, allowing for the incorporation of expert judgment and uncertainty in decision-making frameworks. A practical demonstration is provided to validate the efficacy and proficiency of these innovative techniques. Through a comprehensive example, the ability of the ICCPoFOWAA and ICCPoFHAA operators to enhance decision-making accuracy and reliability is substantiated, showcasing their potential as powerful tools in the realms of data analysis and complex decision-making scenarios. The incorporation of confidence levels into fuzzy aggregation processes represents a significant advancement in the field, offering a sophisticated approach to handling uncertainty and expert opinions in multi-criteria decision-making problems. This work not only introduces groundbreaking aggregation operators but also sets a new standard for research in fuzzy decision-making, underscoring the importance of confidence levels in the analytical process.
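The confidence-induced aggregation idea, in which each expert's input is tempered by a confidence level before the ordered weighted step, can be sketched on plain scalar membership grades (a deliberate simplification; the paper's operators act on complex polytopic fuzzy numbers):

```python
def induced_confidence_owa(values, confidences, weights):
    """Confidence-weighted ordered weighted average, simplified to scalar
    membership grades in [0, 1]: values are sorted descending (the OWA
    step), and each is tempered by its expert's confidence level before
    the positional weights are applied."""
    assert len(values) == len(confidences) == len(weights)
    # Pair each value with its expert's confidence, then order by value.
    ordered = sorted(zip(values, confidences), key=lambda p: p[0], reverse=True)
    return sum(w * c * v for (v, c), w in zip(ordered, weights))
```

An expert with confidence 0 contributes nothing, while full confidence reduces the operator to an ordinary OWA, which is the behavior the confidence induction is meant to provide.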
In the era of branding, the design of plush toy brands often conflicts with the needs of target user groups. Addressing the brand transformation challenges faced by small and micro enterprises in the plush toy industry, this paper proposes a method for generating creative design schemes for plush toy brands based on extension theory. The method introduces the theory of primitives, utilizes extension primitives to construct problem models, employs extension diamond thinking for ideation and divergence, and uses extension analysis for a comprehensive description of brand design elements. These elements are then transformed through extension transformation to generate innovative brand design schemes.
Recent advancements in non-destructive testing methodologies have significantly improved the efficiency of bearing defect detection, which is vital for maintaining final quality standards. This study introduces a novel approach, integrating an Optimized Continuous Wavelet Transform (OCWT) and a Non-Local Convolutional Block Attention Module (NCBAM), to elevate fault diagnosis in motor bearings. The OCWT, central to this methodology, undergoes fine-tuning through a newly formulated metaheuristic algorithm, the Skill Optimization Algorithm (SOA). This algorithm bifurcates into two critical components: the acquisition of expertise (exploration) and the enhancement of individual capabilities (exploitation). The NCBAM, proposed for classification, adeptly captures long-range dependencies across spatial and channel dimensions. Furthermore, the model employs a learning matrix, adept at synthesizing spatial, channel, and temporal data, thus effectively balancing diverse data contributions by extracting intricate interrelations. The model's efficacy is rigorously validated using a gearbox dataset and a motor bearing dataset. The outcomes reveal superior performance, with the model achieving an average accuracy of 94.17% on the bearing dataset and 95.77% on the gearbox dataset. These results demonstrably surpass those of existing alternatives, underscoring the model's potential in enhancing fault diagnosis accuracy in motor bearings.
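The continuous wavelet transform underlying the OCWT can be sketched directly: correlate the signal with scaled Morlet wavelets and collect one row of coefficients per scale. The `w0` parameter and the normalization below are conventional choices, not the optimized values the SOA would select:

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform of a 1-D signal with a Morlet
    wavelet; returns one row of complex coefficients per scale, whose
    magnitudes form the scalogram."""
    n = len(signal)
    t = np.arange(n) - n // 2
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Scaled, L2-normalized Morlet wavelet
        psi = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2.0)
        psi /= np.sqrt(s) * np.pi ** 0.25
        # Correlate the signal with the conjugate wavelet
        out[i] = np.convolve(signal, np.conj(psi)[::-1], mode="same")
    return out
```

On a pure sinusoid, the scale whose center frequency matches the tone dominates the scalogram; a time-frequency map of this kind is the representation the downstream NCBAM classifier consumes.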
In the field of computer vision and digital image processing, the division of images into meaningful segments is a pivotal task. This paper introduces an innovative global image segmentation model, distinguished for its ability to segment pixels with intensity inhomogeneity and robustly handle noise. The proposed model leverages a combination of randomness measurement and spatial techniques to accurately segment regions within and outside contours in challenging conditions. Its efficacy is demonstrated through rigorous testing with images from the Berkeley image database. The results significantly surpass existing methods, particularly in the context of noisy and intensity inhomogeneous images. The model's proficiency lies in its unique ability to differentiate between minute, yet crucial, details and outliers, thus enhancing the precision of global segmentation in complex scenarios. This advancement is particularly relevant for images plagued by unknown noise distributions, overcoming limitations such as the inadequate handling of convex images at local minima and the segmentation of images corrupted by additive and multiplicative noise. The model's design integrates a region-based active contour method, refined through the incorporation of a local similarity factor, level set method, partial differential equations, and entropy considerations. This approach not only addresses the technical challenges posed by image segmentation but also sets a new benchmark for accuracy and reliability in the field.
In this study, a novel methodology is proposed for ranking the knowledge economies of European Union (EU) countries, leveraging their positioning within the global knowledge index (GKI). The GKI, encompassing seven pivotal indicators, serves as a benchmark for assessing a nation's knowledge economy. The EU, a prominent political and economic conglomerate, forms the focal point of this analysis. A multi-criteria analysis approach is adopted, wherein the Entropy method is utilized to determine the significance of individual GKI indicators. Additionally, the CRADIS (Compromise Ranking of Alternatives from Distance to Ideal Solution) method is employed for the ranking of these nations. The Entropy method, renowned for its efficacy in objective weight determination, and the CRADIS method, a novel multi-criteria analysis tool yielding results based on deviations from the ideal and anti-ideal solutions, are integrated. This integration is pivotal, as it offers results comparable with other multi-criteria methodologies. The analysis reveals that Research, Development and Innovation emerges as the most critical indicator. According to the CRADIS method, Sweden is identified as the leading country in terms of GKI indicators, followed by Finland and Denmark. This trend underscores the superior performance of the northern EU countries. Conversely, Eastern EU countries are observed to lag in their GKI standings. These findings are corroborated through comparative and sensitivity analyses, highlighting the influence of normalization on country rankings and pinpointing specific indicators necessitating enhancement for bolstering the knowledge economy. This research not only aids EU countries in identifying their strengths and weaknesses in the realm of knowledge economy but also serves as a strategic guide for policymakers. It provides actionable insights for fostering knowledge economy development, emphasizing the need to strengthen existing advantages and address shortcomings. Such strategic initiatives are crucial for enhancing global market competitiveness. The study's outcomes, therefore, offer valuable resources for decision-making in policy and economic development contexts.
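The Entropy weighting step described above is fully mechanical: criteria whose values vary more across alternatives carry more information and receive larger weights. A minimal sketch for a benefit-type decision matrix (the matrix values in the test below are illustrative; the study applies this to the seven GKI indicators):

```python
import math

def entropy_weights(matrix):
    """Objective criterion weights via the Entropy method. `matrix` is
    alternatives x criteria with non-negative, benefit-type values."""
    m = len(matrix)          # number of alternatives
    n = len(matrix[0])       # number of criteria
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    k = 1.0 / math.log(m)    # normalizes entropy into [0, 1]
    divergences = []
    for j in range(n):
        e = 0.0
        for row in matrix:
            p = row[j] / col_sums[j]
            if p > 0:
                e -= k * p * math.log(p)
        divergences.append(1.0 - e)    # degree of divergence d_j
    total = sum(divergences)
    return [d / total for d in divergences]
```

A criterion that is constant across all alternatives has entropy 1 and thus weight 0, which is exactly the data-driven objectivity the method is chosen for.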
Twitter, a predominant platform for instantaneous communication and idea dissemination, is often exploited by cybercriminals for victim harassment through sexism, racism, hate speech, and trolling using pseudonymous accounts. The propagation of racially charged online discourse poses significant threats to the social, political, and cultural fabric of many societies. Monitoring and prompt eradication of such content from social media, a breeding ground for racist ideologies, are imperative. This study introduces an advanced hybrid forecasting model, utilizing convolutional neural networks (CNNs) and long short-term memory (LSTM) neural networks, for the efficient and accurate detection of racist and hate speech in English on Twitter. Unlabelled tweets, collated via the Twitter API, formed the basis of the initial investigation. Feature vectors were extracted from these tweets using the TF-IDF (Term Frequency-Inverse Document Frequency) feature extraction technique. This research contrasts the proposed model with existing intelligent classification algorithms in supervised learning. The HateMotiv corpus, a publicly available dataset annotated with types of hate crimes and ideological motivations, was employed, emphasizing Twitter as the primary social media context. A novel aspect of this study is the introduction of a revised artificial hummingbird algorithm (AHA), supplemented by quantum-based optimization (QBO). This quantum-based artificial hummingbird algorithm (QAHA) aims to augment exploration capabilities and reveal potential solution spaces. Employing QAHA resulted in a detection accuracy of approximately 98%, compared to 95.97% without its application. The study's principal contribution lies in the significant advancements achieved in the field of racism and hate speech detection in English through the application of hybrid deep learning methodologies.
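The TF-IDF feature extraction step can be sketched in its classic form: term frequency scaled by the log of inverse document frequency. The paper may use a smoothed or library-specific variant; the tokenization here is plain whitespace splitting, and the example documents are invented:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build TF-IDF vectors for a list of documents.
    TF = term count / document length; IDF = log(N / docs containing term)."""
    tokenized = [doc.lower().split() for doc in docs]
    n_docs = len(tokenized)
    df = Counter()                       # document frequency per term
    for toks in tokenized:
        df.update(set(toks))
    vocab = sorted(df)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        length = len(toks)
        vectors.append([
            (tf[t] / length) * math.log(n_docs / df[t]) for t in vocab
        ])
    return vocab, vectors
```

A term appearing in every document gets IDF 0 and drops out, so the resulting vectors emphasize terms that discriminate between tweets, which is what the CNN-LSTM classifier then consumes.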
This study, rooted in extension theory and the principles of knowledge engineering, explores and formulates a novel method for generating sports protective gear designs. Given the critical role of sports protective gear in safeguarding athletes from injuries, coupled with escalating demands for product quality, the aim is to uncover a more effective approach to innovative design. The method involves the formalized modeling of the various elements in the design process, representing this information in the elemental form of knowledge engineering. Through related analysis, divergent analysis, and permutation and conduction transformations of these elements, innovative design schemes for sports protective gear are generated. This process not only optimizes design schemes in depth but also ventures into new design methods and processes. The objective is to offer a novel perspective on integrating extension theory and knowledge engineering in the design of sports protective gear, aspiring to provide more effective strategies to enhance existing design workflows. The goal of this new design method is to produce sports protective gear that is both practical and innovative, thereby enhancing the safety and enjoyment of athletes.
This paper aims to introduce the concepts of complex Polytopic fuzzy sets (CPoFSs) and complex Polytopic fuzzy numbers (CPoFNs), advancing the field of fuzzy logic. Three innovative aggregation operators based on CPoFNs are presented: The complex Polytopic fuzzy weighted averaging aggregation (CPoFWAA) operator, the complex Polytopic fuzzy ordered weighted averaging aggregation (CPoFOWAA) operator, and the complex Polytopic fuzzy hybrid averaging aggregation (CPoFHAA) operator. A significant application of these complex Polytopic fuzzy sets is their integration into decision-making processes, particularly in identifying the most suitable COVID-19 vaccines for patients. This application highlights the practical relevance and the innovative nature of the proposed methods. The paper further demonstrates the efficacy and efficiency of these methods through a comprehensive example provided towards the end, underscoring their potential in real-world scenarios.
The Fermatean fuzzy set (FRFS) is very helpful in representing vague information that occurs in real-world circumstances. An eminent characteristic of the FRFS is that the degree of membership $\Im^{\ell}$ and degree of nonmembership $\beth^{\gamma}$ satisfy the condition $0 \leq \Im^{\ell^3}(x)+\beth^{\gamma^3}(x) \leq 1$, so the space of vague information it can describe is broader. This study introduces the concept of generalized parameters into the FRFS framework and proposes a set of generalized Fermatean fuzzy average aggregation operators for the purpose of information aggregation. Subsequently, the operators are expanded to encompass a generalized parameter based on group consensus, which is derived from the perspectives of numerous experienced senior experts and observers. The present study offers a multi-criteria decision-making (MCDM) methodology, which is demonstrated using a numerical example to successfully showcase the suggested technique. In conclusion, a comparative study is undertaken to validate the efficacy of the suggested technique in relation to existing methodologies.
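The Fermatean constraint and the basic weighted averaging it supports can be made concrete. The sketch below implements the standard Fermatean fuzzy weighted averaging form due to Senapati and Yager; the paper's generalized-parameter operators build on this, and the pairs and weights below are illustrative:

```python
def ffwa(pairs, weights):
    """Fermatean fuzzy weighted average of (membership, non-membership)
    pairs, each satisfying mu**3 + nu**3 <= 1 (Senapati-Yager form):
      mu = (1 - prod((1 - mu_i**3) ** w_i)) ** (1/3)
      nu = prod(nu_i ** w_i)
    """
    assert abs(sum(weights) - 1.0) < 1e-9
    for mu, nu in pairs:
        # The defining Fermatean condition on each input pair
        assert 0.0 <= mu ** 3 + nu ** 3 <= 1.0 + 1e-9
    prod_mu = 1.0
    prod_nu = 1.0
    for (mu, nu), w in zip(pairs, weights):
        prod_mu *= (1.0 - mu ** 3) ** w
        prod_nu *= nu ** w
    return (1.0 - prod_mu) ** (1.0 / 3.0), prod_nu
```

Aggregating identical pairs returns the same pair, and the result again satisfies the cube-sum condition, so the operator is closed over Fermatean fuzzy numbers.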