Open Access Issue
An Intelligent Big Data Security Framework Based on AEFS-KENN Algorithms for the Detection of Cyber-Attacks from Smart Grid Systems
Big Data Mining and Analytics 2024, 7(2): 399-418
Published: 22 April 2024
PDF (3 MB) | Downloads: 54

Big data can open up innovative and ground-breaking prospects for the electrical grid and helps deliver a variety of technological, social, and financial benefits. The growth of power grid technologies, together with advanced data processing tools, has produced an unprecedented amount of heterogeneous big data. The main obstacles in turning these heterogeneous large datasets into useful results are the computational burden and information security. The original contribution of this paper is a new big data framework that uses AI mechanisms to detect various intrusions in smart grid systems. Here, an AdaBelief Exponential Feature Selection (AEFS) technique is used to efficiently handle the huge input datasets from the smart grid and boost security. Then, a Kernel based Extreme Neural Network (KENN) technique is used to anticipate security vulnerabilities more effectively. The Polar Bear Optimization (PBO) algorithm is used to efficiently determine the parameters of the radial basis function. Moreover, several types of smart grid network datasets are used in the analysis to examine the outcomes and efficiency of the proposed AdaBelief Exponential Feature Selection-Kernel based Extreme Neural Network (AEFS-KENN) big data security framework. The results reveal that the proposed AEFS-KENN reaches an accuracy of up to 99.5%, with precision and AUC of 99%, for all smart grid big datasets used in this study.
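
The abstract does not spell out the KENN classifier, so the following is only a minimal sketch of a generic kernel extreme learning machine with an RBF kernel, the family of model the KENN description suggests. The regularization constant `C` and kernel width `gamma` are assumptions standing in for the PBO-tuned radial basis function parameters, and the random data stands in for preprocessed smart grid records.

```python
# Minimal sketch of a kernel extreme learning machine (KELM) classifier with an
# RBF kernel. AdaBelief feature selection and PBO tuning are not reproduced;
# `C` and `gamma` are assumed values.
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Pairwise squared Euclidean distances, then the Gaussian kernel.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

class KELM:
    def __init__(self, C=1.0, gamma=0.1):
        self.C, self.gamma = C, gamma

    def fit(self, X, y):
        self.X = X
        T = np.eye(y.max() + 1)[y]                 # one-hot targets
        K = rbf_kernel(X, X, self.gamma)
        # Closed-form output weights: beta = (I/C + K)^-1 T
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + K, T)
        return self

    def predict(self, X):
        return (rbf_kernel(X, self.X, self.gamma) @ self.beta).argmax(1)

# Toy usage on random data standing in for preprocessed smart grid records.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 16)), rng.integers(0, 2, 200)
print((KELM(C=10, gamma=0.05).fit(X, y).predict(X) == y).mean())
```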

Open Access Issue
Adaptive Marine Predator Optimization Algorithm (AOMA)–Deep Supervised Learning Classification (DSLC)-based IDS framework for MANET security
Intelligent and Converged Networks 2024, 5(1): 1-18
Published: 28 March 2024
PDF (7.5 MB) | Downloads: 47

Due to their dynamic nature and node mobility, assuring the security of Mobile Ad-hoc Networks (MANETs) is one of the most difficult and challenging tasks today. In a MANET, the Intrusion Detection System (IDS) is crucial because it aids in identifying and detecting malicious attacks that impair the network’s regular operation. Different machine learning and deep learning methodologies have been used for this purpose in conventional works to increase MANET security. However, they still have significant flaws, including increased algorithmic complexity, lower system performance, and a higher rate of misclassification. Therefore, the goal of this paper is to create an intelligent IDS framework for significantly enhancing MANET security through the use of deep learning models. Here, min-max normalization is applied to preprocess the given cyber-attack datasets by normalizing the attributes or fields, which increases the overall intrusion detection performance of the classifier. Then, a novel Adaptive Marine Predator Optimization Algorithm (AOMA) is implemented to choose the optimal features for improving the speed and intrusion detection performance of the classifier. Moreover, the Deep Supervised Learning Classification (DSLC) mechanism is utilized to predict and categorize the type of intrusion based on proper learning and training operations. During evaluation, the performance and results of the proposed AOMA-DSLC-based IDS methodology are validated and compared using various performance measures and benchmark datasets.
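
The preprocessing step named in the abstract is standard min-max scaling; the sketch below shows it feeding a placeholder classifier. The AOMA selector and the DSLC model themselves are not reproduced, and the synthetic traffic features are assumptions used only so the pipeline runs end to end.

```python
# Min-max normalization (x' = (x - min) / (max - min)) fitted on training data
# only, followed by a stand-in MLP classifier. AOMA and DSLC are not reproduced.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.uniform(0, 1000, size=(500, 20))        # raw traffic features (toy data)
y = rng.integers(0, 2, size=500)                # 0 = normal, 1 = intrusion

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
scaler = MinMaxScaler().fit(X_tr)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=1)
clf.fit(scaler.transform(X_tr), y_tr)
print("accuracy:", clf.score(scaler.transform(X_te), y_te))
```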

Open Access Issue
Design and analysis of a recommendation system based on collaborative filtering techniques for big data
Intelligent and Converged Networks 2023, 4(4): 296-304
Published: 30 December 2023
PDF (649.9 KB) | Downloads: 93

Online search has become very popular, and users can easily look up any movie title; however, they still have to select a title that suits their taste, otherwise they will have difficulty choosing the film they want to watch. The process of choosing or searching for a film in a large film database is currently time-consuming and tedious. Users spend extensive time on the internet or on several movie-viewing sites before they find a film that matches their taste. This happens especially because people struggle to choose between options and quickly change their minds. Hence, a recommendation system becomes critical. This study aims to reduce user effort and facilitate the movie search task. Further, we used the root mean square error metric to evaluate and compare the different models adopted in this paper, which were employed with the aim of developing a classification model for predicting movies. Thus, we tested and evaluated several collaborative filtering techniques. We used four approaches to implement sparse matrix completion algorithms: k-nearest neighbors, matrix factorization, co-clustering, and slope-one.
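
As a sketch of how the four collaborative filtering approaches can be compared by RMSE, the snippet below uses the scikit-surprise library and its built-in MovieLens 100k dataset; both are assumptions, since the paper does not state which implementation or rating dataset was used.

```python
# Compare k-NN, matrix factorization (SVD), co-clustering, and slope-one by
# cross-validated RMSE using scikit-surprise (assumed library).
from surprise import Dataset, KNNBasic, SVD, CoClustering, SlopeOne
from surprise.model_selection import cross_validate

data = Dataset.load_builtin("ml-100k")          # downloads MovieLens 100k on first use

models = {
    "k-NN": KNNBasic(),
    "matrix factorization (SVD)": SVD(),
    "co-clustering": CoClustering(),
    "slope-one": SlopeOne(),
}
for name, algo in models.items():
    scores = cross_validate(algo, data, measures=["RMSE"], cv=5, verbose=False)
    print(f"{name}: RMSE = {scores['test_rmse'].mean():.4f}")
```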

Open Access Issue
Human Action Recognition Using Difference of Gaussian and Difference of Wavelet
Big Data Mining and Analytics 2023, 6(3): 336-346
Published: 07 April 2023
PDF (13.5 MB) | Downloads: 76

Human Action Recognition (HAR) attempts to recognize human actions from images and videos. The major challenge in HAR is the design of an action descriptor that makes the HAR system robust across different environments. A novel action descriptor is proposed in this study, based on two independent spatial and spectral filters. The proposed descriptor uses a Difference of Gaussian (DoG) filter to extract scale-invariant features and a Difference of Wavelet (DoW) filter to extract spectral information. To create a composite feature vector for a particular test action image, the DoG and DoW features are combined. Linear Discriminant Analysis (LDA), a widely used dimensionality reduction technique, is also used to eliminate redundant data. Finally, a nearest neighbor method is used to classify the dataset. The Weizmann and UCF 11 datasets were used to run extensive simulations of the proposed strategy. On the Weizmann dataset with five-fold cross validation, the average accuracy of DoG + DoW is 83.6635%, while the average accuracies of DoG and DoW alone are 80.2312% and 77.4215%, respectively. On the UCF 11 action dataset with five-fold cross validation, the average accuracy of DoG + DoW is 62.5231%, while the average accuracies of DoG and DoW alone are 60.3214% and 58.1247%, respectively. From these observations, the accuracy on Weizmann is higher than on UCF 11, and the combined descriptor consistently improves recognition accuracy.
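
The two filters behind the descriptor can be sketched as follows; the Gaussian scales, the Haar wavelet, and the simple upsampling used to difference the wavelet approximations are assumptions, and the LDA reduction and nearest-neighbor classifier are omitted.

```python
# DoG: difference of two Gaussian-blurred copies of a frame.
# DoW (assumed form): difference between level-1 and level-2 wavelet approximations.
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter

def dog(frame, sigma1=1.0, sigma2=2.0):
    return gaussian_filter(frame, sigma1) - gaussian_filter(frame, sigma2)

def dow(frame, wavelet="haar"):
    a1, _ = pywt.dwt2(frame, wavelet)           # level-1 approximation
    a2, _ = pywt.dwt2(a1, wavelet)              # level-2 approximation
    a2_up = np.kron(a2, np.ones((2, 2)))[: a1.shape[0], : a1.shape[1]]
    return a1 - a2_up

frame = np.random.rand(64, 64)                  # stand-in for one action frame
features = np.concatenate([dog(frame).ravel(), dow(frame).ravel()])
print(features.shape)
```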

Open Access Issue
An Intelligent Heuristic Manta-Ray Foraging Optimization and Adaptive Extreme Learning Machine for Hand Gesture Image Recognition
Big Data Mining and Analytics 2023, 6(3): 321-335
Published: 07 April 2023
PDF (16.4 MB) | Downloads: 111

The development of hand gesture recognition systems has gained increasing attention in recent years, due to their role in modern human-computer interfaces. Moreover, sign language recognition is mainly developed to enable communication for deaf and mute people. In conventional works, various image processing techniques such as segmentation, optimization, and classification are deployed for hand gesture recognition. Still, they suffer from major problems: inefficient handling of high-dimensional datasets, high time consumption, increased false positives, and high error and misclassification rates. Hence, this research work intends to develop an efficient hand gesture image recognition system by using advanced image processing techniques. During image segmentation, skin color detection and morphological operations are performed to accurately segment the hand gesture region. Then, the Heuristic Manta-ray Foraging Optimization (HMFO) technique is employed for optimally selecting the features by computing the best fitness value. Moreover, the reduced dimensionality of features helps to increase the accuracy of classification with a reduced error rate. Finally, an Adaptive Extreme Learning Machine (AELM) based classifier is employed to predict the recognition output. During validation, various evaluation measures are used to compare the proposed model’s performance with that of other classification approaches.
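
The segmentation stage can be sketched with OpenCV: a skin-color mask in HSV space followed by morphological opening and closing. The HSV thresholds below are common heuristics, not values reported by the paper, and the HMFO selector and AELM classifier are not reproduced.

```python
# Skin-color detection plus morphological operations to isolate the hand region.
import cv2
import numpy as np

def segment_hand(image_bgr):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone range in HSV (assumed thresholds).
    mask = cv2.inRange(hsv, (0, 30, 60), (25, 150, 255))
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove speckle noise
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)   # fill small holes
    return cv2.bitwise_and(image_bgr, image_bgr, mask=mask)

# Toy usage with a random image standing in for a hand gesture photograph.
img = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
print(segment_hand(img).shape)
```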

Open Access Issue
Extraction of Fetal Electrocardiogram by Combining Deep Learning and SVD-ICA-NMF Methods
Big Data Mining and Analytics 2023, 6(3): 301-310
Published: 07 April 2023
PDF (6.1 MB) | Downloads: 473

This paper deals with detecting fetal electrocardiogram (FECG) signals from a single-channel abdominal lead. It is based on a Convolutional Neural Network (CNN) combined with advanced mathematical methods, such as Independent Component Analysis (ICA), Singular Value Decomposition (SVD), and a dimension-reduction technique, Nonnegative Matrix Factorization (NMF). Because the fetal heart rate differs markedly from the mother’s, the time-scale representation clearly distinguishes the fetal electrical activity in terms of energy. Furthermore, we can disentangle the various components of the fetal ECG, which serve as inputs to the CNN model to optimize the actual FECG signal, denoted FECGr, recovered using the SVD-ICA process. The findings demonstrate the efficiency of this innovative approach, which may be deployed in real time.
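
A minimal sketch of the signal-decomposition side of such a pipeline is shown below: SVD and ICA applied to overlapping windows of the abdominal recording, with NMF on a nonnegative representation. The synthetic two-tone signal, window sizes, and component counts are assumptions, and the CNN that selects and refines the fetal component is not reproduced.

```python
# SVD / ICA / NMF decompositions of windowed segments of a toy abdominal signal.
import numpy as np
from sklearn.decomposition import FastICA, NMF

fs = 500
t = np.arange(0, 10, 1 / fs)                               # 10 s at 500 Hz
abdominal = (np.sin(2 * np.pi * 1.2 * t)                   # maternal component (~72 bpm)
             + 0.3 * np.sin(2 * np.pi * 2.3 * t)           # fetal component (~138 bpm)
             + 0.05 * np.random.randn(t.size))

# Build a matrix of overlapping windows so SVD/ICA have multiple observations.
win, hop = 1000, 250
X = np.array([abdominal[i:i + win] for i in range(0, t.size - win, hop)])

U, s, Vt = np.linalg.svd(X, full_matrices=False)            # SVD denoising basis
sources = FastICA(n_components=3, random_state=0).fit_transform(X)
W = NMF(n_components=2, max_iter=500, random_state=0).fit_transform(np.abs(X))
print(U.shape, sources.shape, W.shape)
```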

Open Access Issue
An Ensemble Learning Based Intrusion Detection Model for Industrial IoT Security
Big Data Mining and Analytics 2023, 6(3): 273-287
Published: 07 April 2023
PDF (3.5 MB) | Downloads: 233

The Industrial Internet of Things (IIoT) represents the expansion of the Internet of Things (IoT) into industrial sectors. It is designed to integrate embedded technologies into manufacturing fields to enhance their operations. However, IIoT involves some security vulnerabilities that are more damaging than those of IoT. Accordingly, Intrusion Detection Systems (IDSs) have been developed to forestall harmful intrusions. IDSs monitor the environment to identify intrusions in real time. This study designs an intrusion detection model exploiting feature engineering and machine learning for IIoT security. We combine Isolation Forest (IF) with Pearson’s Correlation Coefficient (PCC) to reduce computational cost and prediction time. IF is exploited to detect and remove outliers from datasets. We apply PCC to choose the most appropriate features. PCC and IF are applied in both orders (PCCIF and IFPCC). The Random Forest (RF) classifier is implemented to enhance IDS performance. For evaluation, we use the Bot-IoT and NF-UNSW-NB15-v2 datasets. RF-PCCIF and RF-IFPCC show noteworthy results with 99.98% and 99.99% Accuracy (ACC) and 6.18 s and 6.25 s prediction time on Bot-IoT, respectively. The two models also score 99.30% and 99.18% ACC and 6.71 s and 6.87 s prediction time on NF-UNSW-NB15-v2, respectively. Results show that our designed model has several advantages and higher performance than related models.
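
The RF-IFPCC ordering can be sketched as follows: Isolation Forest removes outliers, Pearson's correlation keeps the features most correlated with the label, and a Random Forest classifies the result. The synthetic data, the 5% contamination rate, and the 0.1 correlation threshold are assumptions; the paper evaluates on Bot-IoT and NF-UNSW-NB15-v2.

```python
# Isolation Forest outlier removal -> PCC feature selection -> Random Forest.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
df = pd.DataFrame(X, columns=[f"f{i}" for i in range(20)])
df["label"] = y

# 1) Isolation Forest: drop rows flagged as outliers (-1).
inliers = IsolationForest(contamination=0.05, random_state=0).fit_predict(df.drop(columns="label"))
df = df[inliers == 1]

# 2) PCC: keep features whose absolute correlation with the label exceeds a threshold.
corr = df.corr()["label"].abs().drop("label")
selected = corr[corr > 0.1].index.tolist()

# 3) Random Forest on the reduced feature set.
X_tr, X_te, y_tr, y_te = train_test_split(df[selected], df["label"], test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("selected features:", len(selected), "| accuracy:", rf.score(X_te, y_te))
```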

Open Access Issue
A Machine Learning Based Framework for a Stage-Wise Classification of Date Palm White Scale Disease
Big Data Mining and Analytics 2023, 6(3): 263-272
Published: 07 April 2023
PDF (1.8 MB) | Downloads: 355

Date palm production is critical to oasis agriculture, owing to its economic importance and nutritional advantages. Numerous diseases endanger this precious tree, putting a strain on the economy and environment. The white scale insect Parlatoria blanchardi is a damaging pest that degrades the quality of dates. When an infestation reaches a specific degree, it might result in the tree’s death. To counter this threat, precise detection of infected leaves and their infestation degree is important to decide whether chemical treatment is necessary. This decision is crucial for farmers who wish to minimize yield losses while preserving production quality. For this purpose, we propose a feature extraction and machine learning (ML) based framework for classifying the stages of infestation by white scale disease (WSD) in date palm trees by analyzing their leaflet images. Eighty gray level co-occurrence matrix (GLCM) texture features and nine hue, saturation, and value (HSV) color moment features are extracted from the grayscale and color images of the dataset. To classify the WSD into its four classes (healthy, low infestation degree, medium infestation degree, and high infestation degree), two types of ML algorithms were tested: classical machine learning methods, namely, support vector machine (SVM) and k-nearest neighbors (KNN), and ensemble learning methods such as random forest (RF) and light gradient boosting machine (LightGBM). The ML models were trained and evaluated using two datasets: the first is composed of the extracted GLCM features only, and the second combines GLCM and HSV descriptors. The results indicate that the SVM classifier performed best on the combined GLCM and HSV features, with an accuracy of 98.29%. The proposed framework could benefit the oasis agricultural community through early detection of date palm white scale disease (DPWSD) and by assisting in the adoption of preventive measures to protect both date palm trees and crop yield.
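
The feature-extraction stage can be sketched with scikit-image and OpenCV: GLCM texture descriptors from the grayscale leaflet image and simple HSV color moments from the color image, concatenated into one vector for the classifier. The GLCM distances and angles, the chosen properties, and the libraries themselves are assumptions; the sketch does not reproduce the exact 80-feature set of the paper.

```python
# GLCM texture features + HSV color moments for one leaflet image.
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray, distances=(1, 3), angles=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    glcm = graycomatrix(gray, distances, angles, levels=256, symmetric=True, normed=True)
    props = ("contrast", "dissimilarity", "homogeneity", "energy", "correlation")
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

def hsv_moments(image_bgr):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).reshape(-1, 3).astype(float)
    # Mean, standard deviation, and skewness for each of the H, S, V channels.
    mean, std = hsv.mean(0), hsv.std(0)
    skew = np.cbrt(((hsv - mean) ** 3).mean(0))
    return np.concatenate([mean, std, skew])

img = np.random.randint(0, 256, (128, 128, 3), dtype=np.uint8)   # stand-in leaflet image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
features = np.concatenate([glcm_features(gray), hsv_moments(img)])
print(features.shape)                                            # texture + 9 color moments
```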

Open Access Issue
Application of Internet of Things in the Health Sector: Toward Minimizing Energy Consumption
Big Data Mining and Analytics 2022, 5(4): 302-308
Published: 18 July 2022
PDF (9.6 MB) | Downloads: 78

The Internet of Things (IoT) is reflected in the growing number of connected objects, that is, devices with their own identity and their own computing and communication capacities. IoT is recognized as one of the most critical areas for future technologies, gaining worldwide attention. It has been applied successfully in many areas, such as healthcare, where patients are monitored using nodes and lightweight sensors. However, the powerful functions of IoT in the medical field rely on the autonomous communication, analysis, processing, and management of data without any manual intervention, which raises difficulties such as energy consumption. These issues significantly slow down the development and rapid deployment of this technology. The main causes of wasted energy in connected objects are collisions, which occur when two or more nodes send data simultaneously, and data retransmissions, which are triggered by collisions or by data not being received correctly due to channel fading. The distance between nodes is one of the factors influencing energy consumption. In this article, we propose direct communication between nodes to avoid collision domains, which helps reduce data retransmission. The results show that the distributed approach maintains system performance under general conditions compared with the centralized approach and with existing works.
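
To make the link between distance, retransmissions, and energy concrete, the snippet below uses the first-order radio energy model common in sensor-network studies; the model and its constants are textbook assumptions, not values from this article.

```python
# Rough illustration: energy cost of a direct link versus a relayed link that
# suffers one retransmission. Constants are standard assumed values.
E_ELEC = 50e-9        # J/bit spent by the transmit/receive electronics (assumed)
E_AMP = 100e-12       # J/bit/m^2 spent by the amplifier (assumed)

def tx_energy(bits, distance, retransmissions=0):
    # Energy to transmit `bits` over `distance`, repeated for each retransmission.
    return (1 + retransmissions) * bits * (E_ELEC + E_AMP * distance ** 2)

def rx_energy(bits):
    return bits * E_ELEC

packet = 4000  # bits
# Direct 40 m link with no collisions versus two 20 m hops with one retransmission.
direct = tx_energy(packet, 40)
relayed = tx_energy(packet, 20, retransmissions=1) + rx_energy(packet) + tx_energy(packet, 20)
print(f"direct: {direct*1e6:.1f} uJ, relayed with one retransmission: {relayed*1e6:.1f} uJ")
```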

Open Access Issue
Predicting Students’ Final Performance Using Artificial Neural Networks
Big Data Mining and Analytics 2022, 5(4): 294-301
Published: 18 July 2022
PDF (6 MB) | Downloads: 181

Artificial Intelligence (AI) is based on algorithms that allow machines to make decisions for humans. This technology enhances the user experience in various ways. Several studies have been conducted in the field of education to address the problems of student orientation and performance using various Machine Learning (ML) algorithms. The main goal of this article is to predict the performance of Moroccan students in the region of Guelmim Oued Noun using an intelligent system based on neural networks, the data mining technique that yielded the best results in our experiments.
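
A minimal sketch of this kind of predictor is a small multilayer perceptron trained on student records. The feature set (prior grades, attendance, study hours, absences) and the scikit-learn MLP are assumptions standing in for the article's network and the Guelmim Oued Noun dataset, which is not described in the abstract.

```python
# Toy MLP for pass/fail prediction on hypothetical student features.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Hypothetical features: prior grades, attendance rate, study hours, absences.
X = rng.uniform(0, 1, size=(400, 4))
y = (X @ np.array([0.5, 0.2, 0.4, -0.3]) + 0.1 * rng.normal(size=400) > 0.45).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=2)
scaler = StandardScaler().fit(X_tr)
mlp = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=2)
mlp.fit(scaler.transform(X_tr), y_tr)
print("test accuracy:", mlp.score(scaler.transform(X_te), y_te))
```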
