Search Results (2,694)

Search Parameters:
Keywords = IoT security

27 pages, 2471 KiB  
Article
Secure Dynamic Scheduling for Federated Learning in Underwater Wireless IoT Networks
by Lei Yan, Lei Wang, Guanjun Li, Jingwei Shao and Zhixin Xia
J. Mar. Sci. Eng. 2024, 12(9), 1656; https://doi.org/10.3390/jmse12091656 - 16 Sep 2024
Abstract
Federated learning (FL) is a distributed machine learning approach that can enable Internet of Things (IoT) edge devices to collaboratively learn a machine learning model without explicitly sharing local data in order to achieve data clustering, prediction, and classification in networks. In previous works, some online multi-armed bandit (MAB)-based FL frameworks were proposed to enable dynamic client scheduling for improving the efficiency of FL in underwater wireless IoT networks. However, the security of online dynamic scheduling, which is especially essential for underwater wireless IoT, is increasingly being questioned. In this work, we study secure dynamic scheduling for FL frameworks that can protect against malicious clients in underwater FL-assisted wireless IoT networks. Specifically, in order to jointly optimize the communication efficiency and security of FL, we employ MAB-based methods and propose upper-confidence-bound-based smart contracts (UCB-SCs) and upper-confidence-bound-based smart contracts with a security prediction model (UCB-SCPs) to address the optimal scheduling scheme over time-varying underwater channels. Then, we give the upper bounds of the expected performance regret of the UCB-SC policy and the UCB-SCP policy; these upper bounds imply that the regret of the two proposed policies grows logarithmically over communication rounds under certain conditions. Our experiment shows that the proposed UCB-SC and UCB-SCP approaches significantly improve the efficiency and security of FL frameworks in underwater wireless IoT networks. Full article
(This article belongs to the Special Issue Underwater Wireless Communications: Recent Advances and Challenges)
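To illustrate the bandit-based scheduling idea summarized in this abstract, the sketch below runs plain UCB1 client selection over communication rounds. It is not the authors' UCB-SC/UCB-SCP policies: the reward function, the smart-contract security check, and the channel model are placeholders.

```python
import math
import random

def ucb_select(counts, rewards, t, c=2.0):
    """Pick the client with the highest upper confidence bound."""
    scores = []
    for n, r in zip(counts, rewards):
        if n == 0:
            return counts.index(0)          # try every client at least once
        mean = r / n
        bonus = math.sqrt(c * math.log(t) / n)
        scores.append(mean + bonus)
    return scores.index(max(scores))

def observed_reward(client):
    # Placeholder: in UCB-SC this would combine the time-varying channel
    # quality with a smart-contract security check for the scheduled client.
    return random.random()

num_clients, rounds = 10, 200
counts = [0] * num_clients
rewards = [0.0] * num_clients

for t in range(1, rounds + 1):
    k = ucb_select(counts, rewards, t)
    r = observed_reward(k)
    counts[k] += 1
    rewards[k] += r

print("times each client was scheduled:", counts)
```

The logarithmic exploration bonus in the selection rule is what underlies the logarithmic regret growth mentioned in the abstract.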
18 pages, 3424 KiB  
Article
Architecture for Enhancing Communication Security with RBAC IoT Protocol-Based Microgrids
by SooHyun Shin, MyungJoo Park, TaeWan Kim and HyoSik Yang
Sensors 2024, 24(18), 6000; https://doi.org/10.3390/s24186000 (registering DOI) - 16 Sep 2024
Abstract
In traditional power grids, the unidirectional flow of energy and information has led to a decrease in efficiency. To address this issue, the concept of microgrids with bidirectional flow and independent power sources has been introduced. The components of a microgrid utilize various IoT protocols such as OPC-UA, MQTT, and DDS to implement bidirectional communication, enabling seamless network communication among different elements within the microgrid. Technological innovation, however, has simultaneously given rise to security issues in the communication system of microgrids. The use of IoT protocols creates vulnerabilities that malicious hackers may exploit to eavesdrop on data or attempt unauthorized control of microgrid devices. Therefore, monitoring and controlling security vulnerabilities is essential to prevent intrusion threats and enhance cyber resilience in the stable and efficient operation of microgrid systems. In this study, we propose an RBAC-based security approach on top of DDS protocols in microgrid systems. The proposed approach allocates roles to users or devices and grants various permissions for access control. DDS subscribers request access to topics and publishers request access to evaluations from the role repository using XACML. The overall implementation model is designed for the publisher to receive XACML transmitted from the repository and perform policy decision making and enforcement. By applying these methods, security vulnerabilities in communication between IoT devices can be reduced, and cyber resilience can be enhanced. Full article
(This article belongs to the Special Issue IoT Cybersecurity)
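A minimal sketch of the role-based access decision the architecture describes; a real deployment would encode the policy in XACML and evaluate it at a policy decision point on the DDS publisher side, whereas the roles, topics, and permissions below are hypothetical.

```python
# Hypothetical role-to-permission policy; a real system would load this
# from an XACML policy repository rather than hard-code it.
POLICY = {
    "grid_operator": {"topic/measurements": {"publish", "subscribe"},
                      "topic/control":      {"publish"}},
    "maintenance":   {"topic/measurements": {"subscribe"}},
}

def is_permitted(role: str, topic: str, action: str) -> bool:
    """Policy decision: does the role hold the requested permission on the topic?"""
    return action in POLICY.get(role, {}).get(topic, set())

# Policy enforcement at the publisher: deny by default.
print(is_permitted("maintenance", "topic/control", "publish"))         # False
print(is_permitted("grid_operator", "topic/measurements", "publish"))  # True
```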
Figures: (1) Data flow of XACML; (2) DCPS structure in microgrid; (3) "Push" and "Pull" model in IEC 62351-8 [10]; (4) OpenFMB architecture [32]; (5) DDS and XACML in the draft concept; (6) Overall architecture; (7) Communication of publish and subscribe on DDS; (8) DDS and XACML data flow using domain ID.
26 pages, 3492 KiB  
Article
Image Processing for Smart Agriculture Applications Using Cloud-Fog Computing
by Dušan Marković, Zoran Stamenković, Borislav Đorđević and Siniša Ranđić
Sensors 2024, 24(18), 5965; https://doi.org/10.3390/s24185965 (registering DOI) - 14 Sep 2024
Viewed by 296
Abstract
The widespread use of IoT devices has led to the generation of a huge amount of data and driven the need for analytical solutions in many areas of human activities, such as the field of smart agriculture. Continuous monitoring of crop growth stages enables timely interventions, such as control of weeds and plant diseases, as well as pest control, ensuring optimal development. Decision-making systems in smart agriculture involve image analysis with the potential to increase productivity, efficiency and sustainability. By applying Convolutional Neural Networks (CNNs), state recognition and classification can be performed based on images from specific locations. Thus, we have developed a solution for early problem detection and resource management optimization. The main concept of the proposed solution relies on a direct connection between Cloud and Edge devices, which is achieved through Fog computing. The goal of our work is the creation of a deep learning model for image classification that can be optimized and adapted for implementation on devices with limited hardware resources at the level of Fog computing. This could increase the importance of image processing in the reduction of agricultural operating costs and manual labor. As a result of offloading data processing to Edge and Fog devices, the system responsiveness can be improved, the costs associated with data transmission and storage can be reduced, and the overall system reliability and security can be increased. The proposed solution can choose classification algorithms to find a trade-off between size and accuracy of the model optimized for devices with limited hardware resources. After testing our model for tomato disease classification compiled for execution on FPGA, it was found that the decrease in test accuracy is as small as 0.83% (from 96.29% to 95.46%). Full article
(This article belongs to the Special Issue Smart Decision Systems for Digital Farming: 2nd Edition)
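A compact image classifier of the kind that could be quantized and compiled for a Fog device is sketched below with Keras; the input size, layer widths, and ten-class output are assumptions rather than the authors' exact model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_small_cnn(input_shape=(64, 64, 3), num_classes=10):
    """Small CNN intended to stay within tight memory budgets (Fog/FPGA targets)."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(num_classes, activation="softmax"),
    ])

model = build_small_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=10) on a leaf-image dataset,
# then quantize the weights before deploying to the Fog device.
```

Keeping the parameter count small is what makes later quantization and deployment on constrained hardware practical.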
Figures: (1) Cloud-Fog computing structure; (2) PYNQ Z2 board; (3) Training CNN models and preparing for image classification on the server and PYNQ Z2; (4) Preparation of CNN models to run on PYNQ Z2; (5) Preparing an acceleration model for image classification on FPGA; (6) Test accuracy for CNN models run on server and PYNQ Z2; (7) Latency in image classification on the server for different application settings; (8) Latency in image classification on the server running all three applications; (9) Time elapsed in receiving result of image classification; (10) Network data transfer to the server; (11) Energy consumption on the server during application testing.
22 pages, 1170 KiB  
Systematic Review
Systematic Review of IoT-Based Solutions for User Tracking: Towards Smarter Lifestyle, Wellness and Health Management
by Reza Amini Gougeh and Zeljko Zilic
Sensors 2024, 24(18), 5939; https://doi.org/10.3390/s24185939 - 13 Sep 2024
Viewed by 345
Abstract
The Internet of Things (IoT) base has grown to over 20 billion devices currently operational worldwide. As they greatly extend the applicability and use of biosensors, IoT developments are transformative. Recent studies show that IoT, coupled with advanced communication frameworks, such as machine-to-machine (M2M) interactions, can lead to (1) improved efficiency in data exchange, (2) accurate and timely health monitoring, and (3) enhanced user engagement and compliance through advancements in human–computer interaction. This systematic review of the 19 most relevant studies examines the potential of IoT in health and lifestyle management by conducting detailed analyses and quality assessments of each study. Findings indicate that IoT-based systems effectively monitor various health parameters using biosensors, facilitate real-time feedback, and support personalized health recommendations. Key limitations include small sample sizes, insufficient security measures, practical issues with wearable sensors, and reliance on internet connectivity in areas with poor network infrastructure. The reviewed studies demonstrated innovative applications of IoT, focusing on M2M interactions, edge devices, multimodality health monitoring, intelligent decision-making, and automated health management systems. These insights offer valuable recommendations for optimizing IoT technologies in health and wellness management. Full article
(This article belongs to the Special Issue IoT-Based Smart Environments, Applications and Tools)
Figures: (1) Flowchart of paper selection steps; (2) Alluvial diagram showcasing technologies and methods utilized in the included articles for IoT-based user-tracking systems.
22 pages, 3519 KiB  
Article
Deep Complex Gated Recurrent Networks-Based IoT Network Intrusion Detection Systems
by Engy El-Shafeiy, Walaa M. Elsayed, Haitham Elwahsh, Maazen Alsabaan, Mohamed I. Ibrahem and Gamal Farouk Elhady
Sensors 2024, 24(18), 5933; https://doi.org/10.3390/s24185933 - 13 Sep 2024
Viewed by 417
Abstract
The explosive growth of the Internet of Things (IoT) has highlighted the urgent need for strong network security measures. The distinctive difficulties presented by IoT environments, such as the wide variety of devices, the intricacy of network traffic, and the requirement for real-time detection capabilities, are difficult for conventional intrusion detection systems (IDSs) to adjust to. To address these issues, we propose DCGR_IoT, an innovative intrusion detection system based on deep neural learning that is intended to protect bidirectional communication networks in the IoT environment. DCGR_IoT employs advanced techniques to enhance anomaly detection capabilities. Convolutional neural networks (CNNs) are used for spatial feature extraction, and superfluous data are filtered out to improve computing efficiency. Complex gated recurrent networks (CGRNs) are used for the temporal feature extraction module and to construct multidimensional feature subsets, enabling a more detailed spatial representation of network traffic and facilitating the extraction of critical features that are essential for intrusion detection. The effectiveness of DCGR_IoT was proven through extensive evaluations on the UNSW-NB15, KDDCup99, and IoT-23 datasets, which resulted in a high detection accuracy of 99.2%. These results demonstrate the potential of DCGR_IoT as an effective solution for defending IoT networks against sophisticated cyber-attacks. Full article
(This article belongs to the Section Internet of Things)
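A rough sketch of the CNN-plus-gated-recurrent pipeline described above, using a standard real-valued GRU in place of the complex gated recurrent network; the window length, feature count, and class count are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Sliding windows of flow features: (timesteps, features_per_step) are assumed sizes.
TIMESTEPS, FEATURES, NUM_CLASSES = 20, 40, 2

model = models.Sequential([
    layers.Input(shape=(TIMESTEPS, FEATURES)),
    # Spatial feature extraction and filtering of redundant inputs.
    layers.Conv1D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling1D(2),
    # Temporal feature extraction; a real-valued GRU stands in here for the
    # complex gated recurrent network (CGRN) used in DCGR_IoT.
    layers.GRU(64),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```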
Figures: (1) Architecture of the convolutional neural network in DCGR_IoT; (2) Architecture of the complex gated recurrent network; (3) Architecture of the proposed DCGR_IoT; (4) Flowchart of the proposed model; (5) DCGR_IoT confusion matrix for the UNSW-NB15 dataset; (6) DCGR_IoT training and testing performance on the UNSW-NB15 dataset; (7) DCGR_IoT confusion matrix for the KDDCup99 dataset; (8) DCGR_IoT training and testing performance on the KDDCup99 dataset; (9) DCGR_IoT confusion matrix for the IoT-23 dataset; (10) DCGR_IoT training and testing performance on the IoT-23 dataset.
16 pages, 2744 KiB  
Article
VGGIncepNet: Enhancing Network Intrusion Detection and Network Security through Non-Image-to-Image Conversion and Deep Learning
by Jialong Chen, Jingjing Xiao and Jiaxin Xu
Electronics 2024, 13(18), 3639; https://doi.org/10.3390/electronics13183639 - 12 Sep 2024
Viewed by 340
Abstract
This paper presents an innovative model, VGGIncepNet, which integrates non-image-to-image conversion techniques with deep learning modules, specifically VGG16 and Inception, aiming to enhance performance in network intrusion detection and IoT security analysis. By converting non-image data into image data, the model leverages the powerful feature extraction capabilities of convolutional neural networks, thereby improving the multi-class classification of network attacks. We conducted extensive experiments on the NSL-KDD and CICIoT2023 datasets, and the results demonstrate that VGGIncepNet outperforms existing models, including BERT, DistilBERT, XLNet, and T5, across evaluation metrics such as accuracy, precision, recall, and F1-Score. VGGIncepNet exhibits outstanding classification performance, particularly excelling in precision and F1-Score. The experimental results validate VGGIncepNet’s adaptability and robustness in complex network environments, providing an effective solution for the real-time detection of malicious activities in network systems. This study offers new methods and tools for network security and IoT security analysis, with broad application prospects. Full article
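The non-image-to-image step amounts to scaling a tabular feature vector and laying it out as a small "pixel" grid that a convolutional backbone can consume. A minimal sketch, with the grid size and padding chosen arbitrarily:

```python
import numpy as np

def vector_to_image(features, side=8, pad_value=0.0):
    """Reshape a 1-D feature vector into a (side x side) single-channel image.

    Features are min-max scaled to [0, 1] and zero-padded to fill the grid;
    the grid size is an assumption, not the paper's exact layout.
    """
    x = np.asarray(features, dtype=np.float32)
    x = (x - x.min()) / (x.max() - x.min() + 1e-9)        # scale to [0, 1]
    padded = np.full(side * side, pad_value, dtype=np.float32)
    padded[: min(x.size, side * side)] = x[: side * side]
    return padded.reshape(side, side, 1)                   # HWC input for a CNN

img = vector_to_image(np.random.rand(41))                  # e.g., 41 NSL-KDD features
print(img.shape)                                            # (8, 8, 1)
```

The resulting grids can then be fed to VGG16/Inception-style convolutional blocks exactly as ordinary images would be.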
Figures: (1) Transformation from feature vector to feature matrix; (2) The deep network architecture of the VGGIncepNet model; (3) The system architecture of the VGGIncepNet model; (4) Data preprocessing steps; (5) Confusion matrix of the VGGIncepNet model in NSL-KDD dataset; (6) Confusion matrix of the VGGIncepNet model in CICIoT2023 dataset; (7) Classification results comparison in NSL-KDD dataset; (8) Classification results comparison in CICIoT2023 dataset.
17 pages, 733 KiB  
Article
A Comparative Analysis of the TDCGAN Model for Data Balancing and Intrusion Detection
by Mohammad Jamoos, Antonio M. Mora, Mohammad AlKhanafseh and Ola Surakhi
Signals 2024, 5(3), 580-596; https://doi.org/10.3390/signals5030032 - 12 Sep 2024
Viewed by 227
Abstract
Due to the escalating network throughput and security risks, the exploration of intrusion detection systems (IDSs) has garnered significant attention within the computer science field. The majority of modern IDSs are constructed using deep learning techniques. Nevertheless, these IDSs still have shortcomings: most datasets used for IDS are highly imbalanced, with the volume of samples representing normal traffic significantly outweighing those representing attack traffic. This imbalance restricts the performance of deep learning classifiers for minority classes, as it can bias the classifier in favor of the majority class. To address this challenge, many solutions have been proposed in the literature. TDCGAN is an innovative Generative Adversarial Network (GAN) based on a model-driven approach used to address imbalanced data in IDS datasets. This paper investigates the performance of TDCGAN by employing it to balance data across four benchmark IDS datasets: CIC-IDS2017, CSE-CIC-IDS2018, KDD-cup 99, and BOT-IOT. Next, four machine learning methods are employed to classify the data, both on the imbalanced datasets and on the balanced ones. A comparison is then conducted between the results obtained from each to identify the impact of an imbalanced dataset on classification accuracy. The results demonstrated a notable enhancement in classification accuracy for each classifier after the implementation of the TDCGAN model for data balancing. Full article
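The balancing step itself reduces to generating synthetic minority-class records until class counts match; the sketch below shows that outer loop only, with an untrained placeholder standing in for the TDCGAN generator.

```python
import numpy as np
from collections import Counter

def balance_with_generator(X, y, generate, minority_label):
    """Append synthetic minority samples until class counts match.

    `generate(n)` is assumed to return n synthetic feature vectors drawn from
    a trained GAN generator; here it is only a placeholder.
    """
    counts = Counter(y)
    deficit = max(counts.values()) - counts[minority_label]
    if deficit <= 0:
        return X, y
    X_new = generate(deficit)
    y_new = np.full(deficit, minority_label)
    return np.vstack([X, X_new]), np.concatenate([y, y_new])

# Placeholder generator: random noise with the attack-class feature dimensionality.
fake_generator = lambda n: np.random.normal(size=(n, 20))

X = np.random.normal(size=(1000, 20))
y = np.array([0] * 950 + [1] * 50)             # heavily imbalanced toy data
X_bal, y_bal = balance_with_generator(X, y, fake_generator, minority_label=1)
print(Counter(y_bal))                           # Counter({0: 950, 1: 950})
```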
Figures: (1) Flowchart of the study; (2) Pseudo-code of the proposed approach; (3) CIC-IDS2017; (4) CSE-CIC-IDS2018; (5) KDD-cup 99; (6) BOT-IOT.
28 pages, 2936 KiB  
Systematic Review
Medical IoT Record Security and Blockchain: Systematic Review of Milieu, Milestones, and Momentum
by Simeon Okechukwu Ajakwe, Igboanusi Ikechi Saviour, Vivian Ukamaka Ihekoronye, Odinachi U. Nwankwo, Mohamed Abubakar Dini, Izuazu Urslla Uchechi, Dong-Seong Kim and Jae Min Lee
Big Data Cogn. Comput. 2024, 8(9), 121; https://doi.org/10.3390/bdcc8090121 - 12 Sep 2024
Viewed by 659
Abstract
The sensitivity and exclusivity attached to personal health records make such records a prime target for cyber intruders, as unauthorized access causes unfathomable repudiation and public defamation. In reality, most medical records are micro-managed by different healthcare providers, exposing them to various security issues, especially unauthorized third-party access. Over time, substantial progress has been made in preventing unauthorized access to this critical and highly classified information. This review investigated the mainstream security challenges associated with the transmissibility of medical records, the evolutionary security strategies for maintaining confidentiality, and the existential enablers of trustworthy and transparent authorization and authentication before data transmission can be carried out. The review adopted the PRISMA-SPIDER methodology for a systematic review of 122 articles, comprising 9 surveys (7.37%) for qualitative analysis, 109 technical papers (89.34%), and 4 online reports (3.27%) for quantitative studies. The review outcome indicates that the sensitivity and confidentiality of a highly classified document, such as a medical record, demand unabridged authorization by the owner, unquestionable preservation by the host, untainted transparency in transmission, unbiased traceability, and ubiquitous security, which blockchain technology guarantees, although at the infancy stage. Therefore, developing blockchain-assisted frameworks for digital medical record preservation and addressing inherent technological hitches in blockchain will further accelerate transparent and trustworthy preservation, user authorization, and authentication of medical records before they are transmitted by the host for third-party access. Full article
(This article belongs to the Special Issue Research on Privacy and Data Security)
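The tamper evidence and traceability the review attributes to blockchain come from chaining record hashes; a toy illustration of that property (no consensus, no access control, purely for intuition):

```python
import hashlib, json, time

def add_block(chain, record):
    """Append a record whose hash covers the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev_hash, "timestamp": time.time()}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)
    return chain

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != block["hash"]:
            return False
        if i and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"patient": "anon-17", "event": "blood pressure 120/80"})
add_block(chain, {"patient": "anon-17", "event": "prescription issued"})
print(verify(chain))                       # True
chain[0]["record"]["event"] = "altered"
print(verify(chain))                       # False: tampering is detectable
```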
Figures: (1) Applications and use cases of blockchain technology in medical record preservation; (2) The survey reading map highlighting the important sections; (3) Distribution of articles used for quantitative and qualitative analysis; (4) The PRISMA/SPIDER strategy for article gathering, screening, eligibility, and inclusion; (5) Current status of clinical trials, highlighting the various processes and phases involved; (6) Potential and prospects of blockchain technology in medical IoT record preservation and transmission fidelity; (7) Achieving MIR transmission fidelity, preservation, and authentication via blockchain network for secured digitization of healthcare services; (8) The milestones of blockchain for MIR security and preservation towards the actualization of secured digitization of health care, highlighting the various stages and underlying security improvements from automation to realization.
13 pages, 265 KiB  
Article
Efficient Elliptic Curve Diffie–Hellman Key Exchange for Resource-Constrained IoT Devices
by Vinayak Tanksale
Electronics 2024, 13(18), 3631; https://doi.org/10.3390/electronics13183631 - 12 Sep 2024
Viewed by 245
Abstract
In the era of ubiquitous connectivity facilitated by the Internet of Things (IoT), ensuring robust security mechanisms for communication channels among resource-constrained devices has become imperative. Elliptic curve Diffie–Hellman (ECDH) key exchange offers strong security assurances and computational efficiency. This paper investigates the challenges and opportunities of deploying ECDH key exchange protocols on resource-constrained IoT devices. We review the fundamentals of ECDH and explore optimization techniques tailored to the limitations of embedded systems, including memory constraints, processing power, and energy efficiency. We optimize the implementation of five elliptic curves and compare them using experimental results. Our experiments focus on electronic control units and sensors in vehicular networks. The findings provide valuable insights for IoT developers, researchers, and industry stakeholders striving to enhance the security posture of embedded IoT systems while maintaining efficiency. Full article
(This article belongs to the Special Issue Security and Privacy in IoT Devices and Computing)
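For reference, a minimal ECDH exchange with Python's cryptography package over NIST P-256, followed by HKDF key derivation; the curve choice and party names are illustrative, and a constrained ECU would use an optimized embedded implementation rather than this desktop library.

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each party generates an ephemeral key pair on the agreed curve.
ecu_private = ec.generate_private_key(ec.SECP256R1())
sensor_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the peer's public key ...
ecu_shared = ecu_private.exchange(ec.ECDH(), sensor_private.public_key())
sensor_shared = sensor_private.exchange(ec.ECDH(), ecu_private.public_key())
assert ecu_shared == sensor_shared          # ... and arrives at the same secret

# Derive a symmetric session key from the raw shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"ecu-sensor session").derive(ecu_shared)
print(session_key.hex())
```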
Figures: (1) Elliptic curve y² = x³ − 3x + 5; (2) System diagram for ECDH key exchange between resource-constrained devices.
22 pages, 3401 KiB  
Article
Using Blockchain Technology for Sustainability and Secure Data Management in the Energy Industry: Implications and Future Research Directions
by Marianna Lezzi, Vito Del Vecchio and Mariangela Lazoi
Sustainability 2024, 16(18), 7949; https://doi.org/10.3390/su16187949 - 11 Sep 2024
Viewed by 706
Abstract
In the current era of digital transformation, among the plethora of technologies, blockchain (BC) technology has attracted attention, carrying the weight of enormous expectations in terms of its applicability and benefits. BC technology promises immutability, reliability, transparency, and security of transactions, using decentralized models to scale up existing Internet of Things (IoT) solutions while guaranteeing privacy. In the energy industry, BC technology is mainly used to secure distributed power grids, which have proven to be easily hackable by malicious users. Recognizing the need for a preliminary analysis of the literature investigating the role of BC technology for sustainability and secure data management in the energy industry, this study conducts a bibliometric analysis, identifying the implications and research directions in the field. Specifically, a performance analysis and scientific mapping are performed on 943 documents using the Scopus database and the VOSviewer software version 1.6.20. The result is the identification of seven thematic clusters and the most relevant implications as well as future research actions at the strategic, technical, regulatory, and social levels. This study extends the literature by suggesting potential sustainability opportunities regarding BC technology adoption in the energy industry; it also supports managers in identifying strategies to strengthen business sustainability by leveraging the development of new knowledge for secure asset management. Full article
(This article belongs to the Section Economic and Business Aspects of Sustainability)
Figures: (1) The steps of the research methodology; (2) The publishing trend of papers over the years; (3) Document type distribution; (4) The distribution of papers around the globe; (5) The distribution of papers by subject area; (6) Network visualization; (7) Overlay visualization; (8) Item density visualization.
23 pages, 530 KiB  
Review
Machine Learning-Based Intrusion Detection Methods in IoT Systems: A Comprehensive Review
by Brunel Rolack Kikissagbe and Meddi Adda
Electronics 2024, 13(18), 3601; https://doi.org/10.3390/electronics13183601 - 11 Sep 2024
Viewed by 703
Abstract
The rise of the Internet of Things (IoT) has transformed our daily lives by connecting objects to the Internet, thereby creating interactive, automated environments. However, this rapid expansion raises major security concerns, particularly regarding intrusion detection. Traditional intrusion detection systems (IDSs) are often ill-suited to the dynamic and varied networks characteristic of the IoT. Machine learning is emerging as a promising solution to these challenges, offering the intelligence and flexibility needed to counter complex and evolving threats. This comprehensive review explores different machine learning approaches for intrusion detection in IoT systems, covering supervised, unsupervised, and deep learning methods, as well as hybrid models. It assesses their effectiveness, limitations, and practical applications, highlighting the potential of machine learning to enhance the security of IoT systems. In addition, the study examines current industry issues and trends, highlighting the importance of ongoing research to keep pace with the rapidly evolving IoT security ecosystem. Full article
(This article belongs to the Special Issue Artificial Intelligence Empowered Internet of Things)
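A representative supervised baseline of the kind the review surveys, a random forest trained on labeled flow features, sketched with scikit-learn on synthetic data standing in for an IoT IDS dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for labeled flow features (benign = 0, attack = 1).
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=5000) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```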
Figures: (1) Distribution of reviews over the past 5 years (PubMed database); (2) Distribution of documents by year (PubMed database); (3) The PRISMA flow diagram; (4) Common IoT architectures; (5) Classification of vulnerabilities by layer; (6) Classification of attacks by layers; (7) Taxonomy of attack classification according to vulnerabilities; (8) Taxonomy of machine learning methods in IoT.
25 pages, 3537 KiB  
Article
A Complete EDA and DL Pipeline for Softwarized 5G Network Intrusion Detection
by Abdallah Moubayed
Future Internet 2024, 16(9), 331; https://doi.org/10.3390/fi16090331 - 10 Sep 2024
Viewed by 181
Abstract
The rise of 5G networks is driven by increasing deployments of IoT devices and expanding mobile and fixed broadband subscriptions. Concurrently, the deployment of 5G networks has led to a surge in network-related attacks, due to expanded attack surfaces. Machine learning (ML), particularly deep learning (DL), has emerged as a promising tool for addressing these security challenges in 5G networks. To that end, this work proposed an exploratory data analysis (EDA) and DL-based framework designed for 5G network intrusion detection. The approach aimed to better understand dataset characteristics, implement a DL-based detection pipeline, and evaluate its performance against existing methodologies. Experimental results using the 5G-NIDD dataset showed that the proposed DL-based models had extremely high intrusion detection and attack identification capabilities (above 99.5% and outperforming other models from the literature), while having a reasonable prediction time. This highlights their effectiveness and efficiency for such tasks in softwarized 5G environments. Full article
(This article belongs to the Special Issue Advanced 5G and beyond Networks)
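One EDA step from such a pipeline, ranking features by mutual information with the intrusion label, can be reproduced with scikit-learn; the column names and data below are placeholders rather than actual 5G-NIDD fields.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_classif

# Placeholder flow features; in the paper this would be the 5G-NIDD dataset.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "pkt_rate":  rng.exponential(1.0, 10_000),
    "mean_size": rng.normal(500, 120, 10_000),
    "duration":  rng.exponential(2.0, 10_000),
})
label = (df["pkt_rate"] > 2.0).astype(int)      # synthetic "attack" label

mi = mutual_info_classif(df, label, random_state=0)
ranking = pd.Series(mi, index=df.columns).sort_values(ascending=False)
print(ranking)   # the top-ranked features would feed the DL detection models
```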
Figures: Graphical abstract; (1) Proposed EDA and DL framework for softwarized 5G network intrusion detection; (2) Number of benign and malicious attack instances—overview; (3) Missing value analysis—the shade of blue represents the degree of correlation of missing values between columns; (4) Number of benign and malicious attack instances—categorized per attack type; (5) Attack type distribution; (6) Mutual information—binary case: (a) top 25 features, (b) remaining features; (7) Mutual information—multi-class case: (a) top 25 features, (b) remaining features; (8) Principal component analysis—binary case; (9) Principal component analysis—multi-class case; (10) Principal component analysis—importance; (11) Dense autoencoder neural network architecture; (12) Convolutional neural network architecture; (13) Recurrent neural network architecture; (14) Dense autoencoder neural network confusion matrix—multi-class classification scenario; (15) Convolutional neural network confusion matrix—multi-class classification scenario; (16) Recurrent neural network confusion matrix—multi-class classification scenario.
28 pages, 3973 KiB  
Systematic Review
Edge Computing in Healthcare: Innovations, Opportunities, and Challenges
by Alexandru Rancea, Ionut Anghel and Tudor Cioara
Future Internet 2024, 16(9), 329; https://doi.org/10.3390/fi16090329 - 10 Sep 2024
Viewed by 830
Abstract
Edge computing, which promises to process data close to its generation point, reducing latency and bandwidth usage compared with traditional cloud computing architectures, has attracted significant attention lately. The integration of edge computing in modern systems takes advantage of Internet of Things (IoT) devices and can potentially improve the systems’ performance, scalability, privacy, and security, with applications in different domains. In the healthcare domain, modern IoT devices can nowadays be used to gather vital parameters and information that can be fed to edge Artificial Intelligence (AI) techniques able to offer precious insights and support to healthcare professionals. However, issues regarding data privacy and security, AI optimization, and computational offloading at the edge pose challenges to the adoption of edge AI. This paper aims to explore the current state of the art of edge AI in healthcare by using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology and analyzing more than 70 Web of Science articles. We have defined the relevant research questions, clear inclusion and exclusion criteria, and classified the research works in three main directions: privacy and security, AI-based optimization methods, and edge offloading techniques. The findings highlight the many advantages of integrating edge computing in a wide range of healthcare use cases requiring data privacy and security, near real-time decision-making, and efficient communication links, with the potential to transform future healthcare services and eHealth applications. However, further research is needed to enforce new security-preserving methods and for better orchestrating and coordinating the load in distributed and decentralized scenarios. Full article
(This article belongs to the Special Issue Privacy and Security Issues with Edge Learning in IoT Systems)
Figures: (1) Edge computing characteristics; (2) Edge computing use cases; (3) Overview of edge computing in healthcare; (4) PRISMA flow diagram; (5) Paper distribution per publishing year; (6) Paper distribution per publisher; (7) Paper distribution per indexing quartile; (8) Paper distribution on three research areas; (9) Security and privacy requirements in edge computing-enabled healthcare; (10) An overview of edge offloading flow in the computing continuum.
27 pages, 3641 KiB  
Article
Application of Attribute-Based Encryption in Military Internet of Things Environment
by Łukasz Pióro, Jakub Sychowiec, Krzysztof Kanciak and Zbigniew Zieliński
Sensors 2024, 24(18), 5863; https://doi.org/10.3390/s24185863 - 10 Sep 2024
Viewed by 303
Abstract
The Military Internet of Things (MIoT) has emerged as a new research area in military intelligence. The MIoT frequently has to constitute a federation-capable IoT environment when the military needs to interact with other institutions and organizations or carry out joint missions as part of a coalition such as NATO. One of the main challenges of deploying the MIoT in such an environment is to acquire, analyze, and merge vast amounts of data from many different IoT devices and disseminate them in a secure, reliable, and context-dependent manner. Meeting this challenge in a federated environment forms the basis for establishing trusting relationships and secure communication between IoT devices belonging to different partners. In this work, we focus on the problem of fulfilling the data-centric security paradigm, i.e., ensuring the secure management of data along the path from its origin to the recipients and implementing fine-grained access control mechanisms. This problem can be solved using innovative solutions such as applying attribute-based encryption (ABE). We present a comprehensive solution for secure data dissemination in a federated MIoT environment, using distributed ledger technology (Hyperledger Fabric), a message broker (Apache Kafka), and data processing microservices implemented with the Kafka Streams API library. We designed and implemented ABE-based data access control methods using a combination of pairings-based elliptic curve cryptography and lightweight cryptography and confirmed their suitability for federations of military networks. Experimental studies indicate that the proposed cryptographic scheme is viable for the number of attributes typically assumed in battlefield networks, offering a good trade-off between security and performance for modern cryptographic applications. Full article
(This article belongs to the Section Internet of Things)
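At the access-control level, ciphertext-policy ABE ties decryption to an attribute policy. The sketch below only evaluates a simple AND/OR policy tree against a key's attribute set, with hypothetical attribute names; it deliberately omits the pairing-based cryptography that enforces the policy in the actual scheme.

```python
# A ciphertext policy as a small expression tree: ("AND"/"OR", children) or a leaf attribute.
POLICY = ("AND",
          [("OR", ["nation:PL", "nation:US"]),
           "role:analyst",
           "clearance:secret"])

def satisfies(policy, attributes: set) -> bool:
    """Return True if the attribute set satisfies the policy tree."""
    if isinstance(policy, str):                  # leaf attribute
        return policy in attributes
    op, children = policy
    results = (satisfies(child, attributes) for child in children)
    return all(results) if op == "AND" else any(results)

key_attrs = {"nation:PL", "role:analyst", "clearance:secret"}
print(satisfies(POLICY, key_attrs))                        # True: decryption would proceed
print(satisfies(POLICY, {"nation:PL", "role:operator"}))   # False: access denied
```

In CP-ABE the same check is enforced mathematically: a key whose attributes do not satisfy the ciphertext policy simply cannot reconstruct the decryption value.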
Figures: (1) High-level scheme of MIoT main components; (2) The MIoT layered architecture; (3) General overview of the experimental environment; (4) Detailed overview of the experimental environment; (5) Diagram of data flow in proposed system; (6) Sequence diagram illustrating the ABE system setup steps; (7) Sequence diagram illustrating the ABE attribute revocation steps; (8) Sequence diagram of requesting new permissions; (9) Deployment of data exchange system within 5G; (10) Scheme of experimental setup.
16 pages, 2741 KiB  
Article
A Gnn-Enhanced Ant Colony Optimization for Security Strategy Orchestration
by Weiwei Miao, Xinjian Zhao, Ce Wang, Shi Chen, Peng Gao and Qianmu Li
Symmetry 2024, 16(9), 1183; https://doi.org/10.3390/sym16091183 - 10 Sep 2024
Viewed by 399
Abstract
The expansion of Internet of Things (IoT) technology and the rapid increase in data in smart grid business scenarios have led to a need for more dynamic and adaptive security strategies. Traditional static security measures struggle to meet the evolving low-voltage security requirements of state grid systems under this new IoT-driven environment. By incorporating symmetry in metaheuristic algorithms, we can further improve performance and robustness. Symmetrical properties have the potential to lead to more efficient and balanced solutions, improving the overall stability of the grid. We propose a GNN-enhanced ant colony optimization method for orchestrating grid security strategies, which is trained across combinatorial optimization problems (COPs) representative of state grid business scenarios, to learn specific mappings from instances to their heuristic measures. The learned heuristic measures are embedded into the ant colony optimization (ACO) to generate the optimal security policy adapted to the current security situation. Compared to the ACO and the adaptive elite ACO, our method reduces the average time needed to find a path within a limited time in the capacitated vehicle routing problem by 67.09% and 66.98%, respectively. Additionally, ablation experiments verify the effectiveness and necessity of the individual functional modules. Full article
(This article belongs to the Section Computer)
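The coupling of a learned heuristic with ant colony optimization can be pictured as follows: ants sample the next node with probability proportional to pheromone^alpha times heuristic^beta, and in GACO the heuristic matrix would come from a trained GNN rather than the classical 1/distance rule used as a stand-in in this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
coords = rng.random((n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1) + np.eye(n)

# In GACO the heuristic matrix would be predicted by a trained GNN from the
# problem instance; 1/distance is used here only as a classical stand-in.
heuristic = 1.0 / dist
pheromone = np.ones((n, n))
alpha, beta, rho, n_ants = 1.0, 2.0, 0.1, 30

best_len, best_tour = float("inf"), None
for _ in range(100):                                   # colony iterations
    tours = []
    for _ in range(n_ants):
        tour, unvisited = [0], set(range(1, n))
        while unvisited:
            i, cand = tour[-1], np.array(sorted(unvisited))
            w = (pheromone[i, cand] ** alpha) * (heuristic[i, cand] ** beta)
            nxt = int(rng.choice(cand, p=w / w.sum()))
            tour.append(nxt)
            unvisited.remove(nxt)
        length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
        tours.append((length, tour))
        if length < best_len:
            best_len, best_tour = length, tour
    pheromone *= (1 - rho)                             # evaporation
    for length, tour in tours:                         # pheromone deposit
        for k in range(n):
            pheromone[tour[k], tour[(k + 1) % n]] += 1.0 / length

print("best tour length:", round(best_len, 3))
```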
Figures: (1) Alarm node relationship-directed graph construction; (2) Overall architecture of GACO; (3) Security policy orchestration architecture; (4) Performance comparison on CVRP instances, against two traditional ACO algorithms and two NCO algorithms; (5) Different learning rate settings on the TSP100 problem; (6) Different learning rate settings on the CVRP 100 problem; (7) Ablation experiment of GACO, evaluating the role of its different components.