Results 1 - 18 of 18
1.
IEEE Internet Things J ; 10(5): 3995-4005, 2023 Mar.
Article in English | MEDLINE | ID: mdl-38046398

ABSTRACT

Edge computing has gained prominence with the rise of the Internet of Things (IoT). Edge-enabled solutions offer efficient computing and control at the network edge to resolve scalability and latency-related concerns. However, it is challenging for edge computing to handle the diverse applications of IoT because they produce massive amounts of heterogeneous data. IoT-enabled frameworks for Big Data analytics face numerous challenges in their existing structural design, such as high-volume data storage and processing, data heterogeneity, and processing time. Moreover, existing proposals lack effective parallel data loading and robust mechanisms for handling communication overhead. To address these challenges, we propose an optimized IoT-enabled big data analytics architecture for edge-cloud computing using machine learning. In the proposed scheme, an edge intelligence module is introduced to process and store big data efficiently at the edge of the network, integrated with cloud technology. The proposed scheme is composed of two layers: IoT-edge and Cloud-processing. Data ingestion and storage are carried out with an optimized MapReduce parallel algorithm, and an optimized Yet Another Resource Negotiator (YARN) is used to manage the cluster efficiently. The proposed design is experimentally evaluated with a real-world dataset using Apache Spark and compared against existing proposals and traditional mechanisms. The results demonstrate the efficiency of the proposed work.
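
As a rough illustration of the kind of parallel MapReduce-style aggregation the abstract describes, the following hypothetical PySpark sketch computes per-sensor averages over ingested IoT records; the input path, record format, and field names are assumptions made for the example, not details taken from the paper.

    # Hypothetical MapReduce-style aggregation over IoT sensor records with PySpark.
    # The HDFS path and the "<sensor_id>,<timestamp>,<reading>" record layout are
    # illustrative assumptions, not the paper's actual pipeline.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("edge-iot-analytics").getOrCreate()

    lines = spark.sparkContext.textFile("hdfs:///iot/edge_ingest/*.csv")

    def parse(line):
        sensor_id, ts, reading = line.split(",")
        return sensor_id, float(reading)

    # Map: (sensor_id, (reading, 1)); Reduce: sum readings and counts per sensor.
    pairs = lines.map(parse).mapValues(lambda r: (r, 1))
    sums = pairs.reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))
    averages = sums.mapValues(lambda s: s[0] / s[1])

    for sensor_id, avg in averages.collect():
        print(sensor_id, avg)

    spark.stop()
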

2.
Sci Data ; 10(1): 867, 2023 12 05.
Article in English | MEDLINE | ID: mdl-38052819

ABSTRACT

An ongoing thrust of research on human gait pertains to identifying individuals based on their gait patterns. However, no existing gait database supports modeling efforts to assess gait patterns unique to individuals. Hence, we introduce the Nonlinear Analysis Core (NONAN) GaitPrint database, containing whole-body kinematics and foot placement during self-paced overground walking on a 200-meter looping indoor track. Noraxon Ultium Motion™ inertial measurement unit (IMU) sensors sampled the motion of 35 healthy young adults (19-35 years old; 18 men and 17 women; mean ± 1 s.d. age: 24.6 ± 2.7 years; height: 1.73 ± 0.78 m; body mass: 72.44 ± 15.04 kg) over eighteen 4-minute trials across two days. Continuous variables include the acceleration, velocity, and position of each sensor, as well as the acceleration, velocity, position, orientation, and rotational velocity of each corresponding body segment, and the angle of each respective joint. The discrete variables include an exhaustive set of gait parameters derived from the spatiotemporal dynamics of foot placement. We technically validate our data using continuous relative phase, Lyapunov exponent, and Hurst exponent, nonlinear metrics that quantify different aspects of healthy human gait.


Subject(s)
Gait, Walking, Adult, Female, Humans, Male, Young Adult, Biomechanical Phenomena, Foot, Lower Extremity
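
For context on the nonlinear metrics mentioned above, the following sketch estimates the Hurst exponent of a stride-interval series with the classic rescaled-range method; the series here is synthetic, and the window choices are illustrative assumptions rather than the authors' exact procedure.

    # Rescaled-range (R/S) estimate of the Hurst exponent for a stride-interval
    # series, one of the nonlinear metrics named in the abstract. The input data
    # below is synthetic; the published database provides the real series.
    import numpy as np

    def hurst_rs(x, min_window=8):
        x = np.asarray(x, dtype=float)
        n = len(x)
        window_sizes = np.unique(
            np.logspace(np.log10(min_window), np.log10(n // 2), 20).astype(int))
        rs_values = []
        for w in window_sizes:
            rs_per_window = []
            for start in range(0, n - w + 1, w):
                segment = x[start:start + w]
                cumdev = np.cumsum(segment - segment.mean())
                r = cumdev.max() - cumdev.min()       # range of cumulative deviations
                s = segment.std(ddof=1)               # standard deviation of the window
                if s > 0:
                    rs_per_window.append(r / s)
            rs_values.append(np.mean(rs_per_window))
        # Hurst exponent = slope of log(R/S) versus log(window size).
        slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_values), 1)
        return slope

    rng = np.random.default_rng(0)
    strides = 1.05 + 0.02 * rng.standard_normal(1024)  # synthetic stride times (s)
    print(f"Estimated Hurst exponent: {hurst_rs(strides):.2f}")  # ~0.5 for white noise
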
3.
IEEE Trans Industr Inform ; 19(1): 1030-1038, 2023 Jan.
Article in English | MEDLINE | ID: mdl-37469712

ABSTRACT

A fundamental expectation of stakeholders in the Industrial Internet of Things (IIoT) is trustworthiness and sustainability, so that critical tasks can be performed without loss of human life. A trustworthy IIoT-enabled network encompasses fundamental security characteristics such as trust, privacy, security, reliability, resilience, and safety. Traditional security mechanisms and procedures are insufficient to protect these networks owing to protocol differences, limited update options, and older adaptations of security mechanisms. As a result, these networks require novel approaches to increase trust levels and enhance security and privacy mechanisms. Therefore, in this paper, we propose a novel approach to improve the trustworthiness of IIoT-enabled networks: an accurate and reliable cyberattack detection scheme for supervisory control and data acquisition (SCADA)-based IIoT networks. The proposed scheme combines deep learning-based Pyramidal Recurrent Units (PRU) and Decision Trees (DT) with SCADA-based IIoT networks, using an ensemble-learning method to detect cyberattacks. The non-linear learning ability of the PRU and the ensemble of DTs reduce sensitivity to irrelevant features, allowing high detection rates. The proposed scheme is evaluated on fifteen datasets generated from SCADA-based networks. The experimental results show that it outperforms traditional methods and machine learning-based detection approaches, improving security and the associated measure of trustworthiness in IIoT-enabled networks.
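
The sketch below illustrates only the general pattern of combining a neural model with tree-based learners through soft voting, as a loose stand-in for the PRU + DT ensemble described above; the Pyramidal Recurrent Units themselves, the SCADA datasets, and the feature set are not reproduced, and the data here is synthetic.

    # Generic soft-voting ensemble of a neural network and decision trees, as a
    # stand-in for the PRU + DT ensemble (the actual Pyramidal Recurrent Units are
    # not reproduced here). Features and labels are synthetic placeholders.
    import numpy as np
    from sklearn.ensemble import VotingClassifier, RandomForestClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(42)
    X = rng.normal(size=(2000, 20))                  # stand-in SCADA traffic features
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)    # stand-in attack/benign labels

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    ensemble = VotingClassifier(
        estimators=[
            ("nn", MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)),
            ("dt", RandomForestClassifier(n_estimators=100, random_state=0)),
        ],
        voting="soft",  # average the predicted class probabilities
    )
    ensemble.fit(X_train, y_train)
    print(classification_report(y_test, ensemble.predict(X_test)))
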

4.
IEEE Internet Things J ; 9(22): 22173-22183, 2022 Nov 15.
Article in English | MEDLINE | ID: mdl-37448955

ABSTRACT

Cyber-Physical Systems (CPS) connected in the form of the Internet of Things (IoT) are vulnerable to various security threats due to the infrastructure-less deployment of IoT devices. Device-to-Device (D2D) authentication in these networks ensures the integrity, authenticity, and confidentiality of information in the deployed area. The literature suggests different approaches to address security issues in CPS technologies; however, they are mostly based on centralized techniques or specific system deployments with high computation and communication costs. It is therefore necessary to develop an effective scheme that can resolve the security problems in CPS technologies of IoT devices. In this paper, a lightweight Hash-MAC-DSDV (Hash Media Access Control Destination-Sequenced Distance Vector) routing scheme is proposed to resolve authentication issues in CPS technologies connected in the form of IoT networks. For this purpose, a CPS of IoT devices (multi-WSNs) is developed from a local chain and a public chain. The proposed scheme ensures D2D authentication through the mutual Hash-MAC-DSDV scheme, in which the MAC addresses of individual devices are registered in the first phase and advertised in the network in the second phase. Legitimate devices are then allowed to modify their routing tables and use a one-way hash authentication mechanism to unicast their captured data from source to destination. Our evaluation results demonstrate that Hash-MAC-DSDV outperforms existing schemes in terms of attack detection, energy consumption, and communication metrics.
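
The following minimal sketch shows the generic idea behind MAC-address registration plus one-way hash verification of routing updates; it is an illustration under assumed message formats and keys, not the paper's exact Hash-MAC-DSDV protocol.

    # Generic idea: a device's MAC address is registered with a shared secret in a
    # first phase, and later routing updates carry a one-way hash (HMAC) that
    # neighbours verify before accepting them. Message formats are assumptions.
    import hashlib
    import hmac

    registry = {}  # MAC address -> shared secret, populated during registration

    def register_device(mac, shared_secret):
        registry[mac] = shared_secret

    def make_auth_tag(mac, payload):
        return hmac.new(registry[mac], mac.encode() + payload, hashlib.sha256).hexdigest()

    def verify(mac, payload, tag):
        if mac not in registry:
            return False  # unregistered device: reject the routing update
        expected = hmac.new(registry[mac], mac.encode() + payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    register_device("AA:BB:CC:DD:EE:01", b"device-secret")
    payload = b"DSDV route update: dest=node7 hops=3 seq=42"
    tag = make_auth_tag("AA:BB:CC:DD:EE:01", payload)
    print(verify("AA:BB:CC:DD:EE:01", payload, tag))        # True
    print(verify("AA:BB:CC:DD:EE:01", payload + b"x", tag))  # False (tampered data)
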

5.
J Netw Comput Appl ; 175, 2021 Feb 01.
Article in English | MEDLINE | ID: mdl-34690484

ABSTRACT

The Internet of Multimedia Things (IoMT) orchestration enables the integration of systems, software, cloud, and smart sensors into a single platform. The IoMT deals with scalar as well as multimedia data. In these networks, sensor-embedded devices and their data face numerous security challenges. In this paper, a comprehensive review of the existing literature on IoMT is presented in the context of security and blockchain. The latest literature on all three aspects of security, i.e., authentication, privacy, and trust, is surveyed to explore the challenges experienced by multimedia data. The convergence of blockchain and IoMT, along with multimedia-enabled blockchain platforms, is discussed for emerging applications. To highlight the significance of this survey, large-scale commercial projects focused on security and blockchain for multimedia applications are reviewed, their shortcomings are explored, and suggestions for further improvement are provided. Based on this discussion, we present our own case study for the healthcare industry: a theoretical framework with security and blockchain as key enablers, reflecting the importance of both in multimedia applications of the healthcare sector. Finally, we discuss the convergence of emerging technologies with security, blockchain, and IoMT to envision the future of tomorrow's applications.

6.
IEEE Int Conf Commun ; 2021, 2021 Jun.
Article in English | MEDLINE | ID: mdl-34690611

ABSTRACT

Due to the proliferation of the Internet of Things (IoT) and application/user demands that challenge communication and computation, edge computing has emerged as the paradigm that brings computing resources closer to users. In this paper, we present Whispering, an analytical model for the migration of services (service offloading) from the cloud to the edge that minimizes the completion time of computational tasks offloaded by user devices and improves the utilization of resources. We also empirically investigate the impact of reusing the results of previously executed tasks for the execution of newly received tasks (computation reuse) and propose an adaptive task offloading scheme between the edge and the cloud. Our evaluation results show that Whispering achieves up to 35% lower task completion times (and up to 97% lower when coupled with computation reuse) than cases where tasks are executed exclusively at the edge or in the cloud.
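
To make the two ideas concrete, the toy sketch below caches results keyed by task input (computation reuse) and picks edge or cloud by comparing estimated completion times; all latency figures and the decision rule are illustrative assumptions, not the Whispering model itself.

    # Toy illustration of (1) computation reuse: return a cached result when an
    # identical task input arrives, and (2) offloading: pick edge or cloud based
    # on an estimated completion time. All numbers are illustrative assumptions.
    import hashlib

    reuse_cache = {}  # hash of task input -> previously computed result

    def task_key(task_input):
        return hashlib.sha256(task_input).hexdigest()

    def estimate_completion(exec_time_s, uplink_s, queue_s):
        return uplink_s + queue_s + exec_time_s

    def execute(task_input, edge_queue_s, cloud_queue_s):
        key = task_key(task_input)
        if key in reuse_cache:                      # computation reuse: skip execution
            return reuse_cache[key], "reused"
        edge_cost = estimate_completion(exec_time_s=0.8, uplink_s=0.01, queue_s=edge_queue_s)
        cloud_cost = estimate_completion(exec_time_s=0.2, uplink_s=0.15, queue_s=cloud_queue_s)
        where = "edge" if edge_cost <= cloud_cost else "cloud"
        result = len(task_input)                    # placeholder for the real computation
        reuse_cache[key] = result
        return result, where

    print(execute(b"frame-001", edge_queue_s=0.05, cloud_queue_s=0.5))  # executed
    print(execute(b"frame-001", edge_queue_s=2.00, cloud_queue_s=0.5))  # reused
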

7.
Article in English | MEDLINE | ID: mdl-34692919

ABSTRACT

The onboarding of IoT devices by authorized users constitutes both a challenge and a necessity in a world where the number of IoT devices and the tampering attacks against them continuously increase. Commonly used onboarding techniques today include QR codes, pin codes, and serial numbers. These techniques typically do not protect against unauthorized device access: a QR code is physically printed on the device, while a pin code may be included in the device packaging. As a result, any entity that has physical access to a device can onboard it onto their network and potentially tamper with it (e.g., install malware on the device). To address this problem, in this paper we present Deep Learning-based Watermarking for authorized IoT onboarding (DLWIoT), a framework featuring a robust and fully automated image watermarking scheme based on deep neural networks. DLWIoT embeds user credentials into carrier images (e.g., QR codes printed on IoT devices), thus enabling IoT onboarding only by authorized users. Our experimental results demonstrate the feasibility of DLWIoT, indicating that authorized users can onboard IoT devices with DLWIoT within 2.5-3 seconds.
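
DLWIoT embeds credentials with a deep neural network, which is not reproduced here; purely to illustrate the notion of hiding credential bits in a carrier image, the sketch below uses a trivial least-significant-bit watermark on a random stand-in image.

    # Simplified stand-in for image watermarking: embed credential bits into the
    # least significant bits of a carrier image and recover them. This is NOT the
    # paper's deep-learning scheme; it only illustrates the embed/extract idea.
    import numpy as np

    def embed(carrier, message):
        bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
        flat = carrier.flatten()
        if len(bits) > flat.size:
            raise ValueError("carrier too small for message")
        flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits  # overwrite LSBs
        return flat.reshape(carrier.shape)

    def extract(watermarked, n_bytes):
        bits = watermarked.flatten()[:n_bytes * 8] & 1
        return np.packbits(bits).tobytes()

    qr_like = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in carrier
    credentials = b"user:alice;token:123456"                            # hypothetical credentials
    stego = embed(qr_like, credentials)
    print(extract(stego, len(credentials)))  # b'user:alice;token:123456'
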

8.
IEEE Trans Green Commun Netw ; 5(2): 765-777, 2021 Jun.
Article in English | MEDLINE | ID: mdl-34458659

ABSTRACT

In recent years, edge computing has emerged as an effective solution to extend cloud computing and satisfy the demand of applications for low latency. However, with today's explosion of innovative applications (e.g., augmented reality, natural language processing, virtual reality), processing services for mobile and smart devices have become computation-intensive, consisting of multiple interconnected computations. This, coupled with the need for delay sensitivity and high quality of service, puts massive pressure on edge servers. Meanwhile, tasks invoking these services may involve similar inputs that lead to the same output. In this paper, we present CoxNet, an efficient computation reuse architecture for edge computing. CoxNet enables edge servers to reuse previous computations while scheduling dependent incoming computations. We provide an analytical model for computation reuse combined with dependent task offloading and design a novel computation offloading and scheduling scheme. We also evaluate the efficiency and effectiveness of CoxNet using synthetic and real-world datasets. Our results show that CoxNet reduces task execution time by up to 66% on a synthetic dataset and by up to 50% on a real-world dataset.
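
As a toy view of what "reuse while scheduling dependent computations" means, the sketch below executes a small dependency graph in topological order and skips any task whose result is already cached; the task graph, costs, and inputs are invented for the example.

    # Toy dependent-task scheduling with computation reuse: tasks run in
    # topological order, and a task is skipped when its result for the same
    # input is already cached. Task names, dependencies, and costs are invented.
    from graphlib import TopologicalSorter

    dependencies = {            # task -> set of tasks it depends on
        "decode": set(),
        "detect": {"decode"},
        "track": {"decode", "detect"},
    }
    exec_cost = {"decode": 3.0, "detect": 5.0, "track": 2.0}
    reuse_cache = {}            # (task, input_id) -> cached result

    def run(task, input_id):
        """Return the time spent on this task (0 if its result was reused)."""
        key = (task, input_id)
        if key in reuse_cache:
            return 0.0
        reuse_cache[key] = f"result-of-{task}-{input_id}"
        return exec_cost[task]

    for input_id in ("frame-A", "frame-A"):      # the second request hits the cache
        total = 0.0
        for task in TopologicalSorter(dependencies).static_order():
            total += run(task, input_id)
        print(input_id, "completed in", total, "time units")
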

9.
Article in English | MEDLINE | ID: mdl-34368399

ABSTRACT

Older adults and people suffering from neurodegenerative diseases often experience difficulty controlling gait during locomotion, ultimately increasing their risk of falling. To combat these effects, researchers and clinicians have used metronomes as assistive devices to improve movement timing in hopes of reducing the risk of falling. Historically, researchers in this area have relied on metronomes with isochronous interbeat intervals, which may be problematic because normal healthy gait varies considerably from one step to the next. More recently, researchers have advocated irregular metronomes embedded with statistical properties found in healthy populations. In this paper, we explore the effect of both regular and irregular metronomes on several statistical properties of interstride intervals. Furthermore, we investigate how these properties react to mechanical perturbation in the form of a halted treadmill belt while walking. Our results demonstrate that metronomes that are either isochronous or random break down the inherent structure of healthy gait, whereas metronomes with statistical properties similar to healthy gait seem to preserve those properties despite a strong mechanical perturbation. We discuss the future development of this work in the context of networked augmented reality metronome devices.
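
To give a sense of the metronome types being compared, the sketch below generates isochronous, random, and 1/f-structured interbeat intervals (the last via simple spectral synthesis); the mean and spread values are assumptions for the illustration, not the study's stimulus parameters.

    # Three metronome interval series: isochronous, random, and one whose
    # interbeat intervals carry persistent (1/f-like) correlations, generated by
    # scaling a white-noise spectrum. Mean/SD values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n, mean_s, sd_s = 512, 0.55, 0.02   # number of beats, mean and SD of intervals

    def pink_series(n, rng):
        """Spectral synthesis: scale a white-noise spectrum by 1/sqrt(f)."""
        freqs = np.fft.rfftfreq(n)
        spectrum = rng.standard_normal(len(freqs)) + 1j * rng.standard_normal(len(freqs))
        spectrum[1:] /= np.sqrt(freqs[1:])
        spectrum[0] = 0.0
        x = np.fft.irfft(spectrum, n)
        return (x - x.mean()) / x.std()

    isochronous = np.full(n, mean_s)
    random_ivals = mean_s + sd_s * rng.standard_normal(n)
    fractal_ivals = mean_s + sd_s * pink_series(n, rng)

    for name, series in [("isochronous", isochronous), ("random", random_ivals),
                         ("1/f-structured", fractal_ivals)]:
        print(f"{name:>15}: mean={series.mean():.3f}s  sd={series.std():.3f}s")
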

10.
Article in English | MEDLINE | ID: mdl-34366555

ABSTRACT

The number of devices that the edge of the Internet accommodates and the volume of data these devices generate are expected to grow dramatically in the years to come. As a result, managing and processing such massive amounts of data at the edge becomes a vital issue. This paper proposes Store Edge Networked Data (SEND), a novel framework for in-network storage management realized through data repositories deployed at the network edge. SEND considers different criteria (e.g., data popularity, data proximity to processing functions at the edge) to intelligently place different categories of raw and processed data at the edge based on system-wide identifiers of the data context, called labels. We implement a data repository prototype on top of the Google File System, which we evaluate on real-world datasets of images and Internet of Things device measurements. To scale up our experiments, we perform a network simulation study based on synthetic and real-world datasets, evaluating the performance and trade-offs of the SEND design as a whole. Our results demonstrate that SEND achieves data insertion times of 0.06-0.9 ms, data lookup times of 0.5-5.3 ms, and on-time completion of up to 92% of user requests for the retrieval of raw and processed data.
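
The following toy sketch mimics label-driven placement in the spirit of SEND: an item is stored at the edge repository that scores best on label popularity, proximity to consumers, and spare capacity; the repositories, labels, and scoring weights are invented for the example.

    # Toy label-driven placement: pick the repository whose score (popularity of
    # the label weighted by proximity, plus spare capacity) is highest. All names,
    # weights, and numbers below are illustrative assumptions.
    repositories = {
        "edge-A": {"hops_to_consumers": 1, "free_capacity_gb": 20},
        "edge-B": {"hops_to_consumers": 3, "free_capacity_gb": 120},
    }
    label_popularity = {"camera/raw": 0.9, "thermostat/processed": 0.2}  # request share

    def place(label, size_gb):
        best, best_score = None, float("-inf")
        for name, repo in repositories.items():
            if repo["free_capacity_gb"] < size_gb:
                continue
            # Popular labels favour proximity; a small bonus rewards spare capacity.
            score = (label_popularity.get(label, 0.0) / repo["hops_to_consumers"]
                     + 0.001 * repo["free_capacity_gb"])
            if score > best_score:
                best, best_score = name, score
        if best is None:
            raise RuntimeError("no repository with enough capacity")
        repositories[best]["free_capacity_gb"] -= size_gb
        return best

    print(place("camera/raw", 5))             # edge-A: popular label, close to consumers
    print(place("thermostat/processed", 50))  # edge-B: edge-A lacks remaining capacity
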

11.
IEEE Wirel Commun ; 28(2): 121-127, 2021.
Article in English | MEDLINE | ID: mdl-34366719

ABSTRACT

Information-Centric Networking (ICN) has emerged as a paradigm to cope with today's Internet's lack of built-in security primitives and efficient mechanisms for content distribution. However, deploying ICN in a wireless environment poses a different set of challenges compared to a wired environment, especially when it comes to security. In this paper, we present the security issues that may arise and the attacks that may occur, from different points of view, when ICN is deployed in wireless environments. The discussed attacks may target both applications and the ICN network itself by exploiting elements of the ICN architecture, such as content names and in-network content caches. Furthermore, we discuss potential solutions to the presented issues and countermeasures to the presented attacks. Finally, we identify future research opportunities and directions.

12.
Future Gener Comput Syst ; 122: 40-51, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34393306

ABSTRACT

In densely populated Internet of Things (IoT) applications, the sensing ranges of nodes frequently overlap, so the nodes gather highly correlated and redundant data in their vicinity. Processing these data depletes the nodes' energy, and their upstream transmission toward remote datacentres in the fog infrastructure may result in an unbalanced load at the network gateways and edge servers. Due to the heterogeneity of edge servers, some of them might be overwhelmed while others remain underutilized. As a result, time-critical and delay-sensitive applications may experience excessive delays, packet loss, and degradation of their Quality of Service (QoS). To ensure the QoS of IoT applications, in this paper we eliminate correlation in the gathered data via a lightweight data fusion approach. The buffer of each node is partitioned into strata that broadcast only non-correlated data to edge servers via the network gateways. Furthermore, we propose a dynamic service migration technique to reconfigure the load across various edge servers. We formulate this as an optimization problem and use two meta-heuristic algorithms, along with a migration approach, to maintain an optimal Gateway-Edge configuration in the network. These algorithms monitor the load at each server, and once it surpasses a threshold value (dynamically computed with a simple machine learning method), an exhaustive search is performed for an optimal and balanced periodic reconfiguration. The experimental results justify the efficiency of our approach for large-scale and densely populated IoT applications.
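
As a simplified view of the correlation-elimination step, the sketch below forwards a buffered window of readings only when it is not strongly correlated with the last transmitted window; the window size and correlation threshold are assumptions for the illustration, not the paper's fusion algorithm.

    # Simplified correlation filter: a node transmits a buffered window of
    # readings only if it is not strongly correlated with the last window it
    # sent. Window size and threshold are illustrative assumptions.
    import numpy as np

    CORR_THRESHOLD = 0.95

    def should_transmit(window, last_sent):
        if last_sent is None:
            return True
        corr = np.corrcoef(window, last_sent)[0, 1]
        return abs(corr) < CORR_THRESHOLD   # drop near-duplicate (redundant) windows

    rng = np.random.default_rng(3)
    base = rng.normal(25.0, 0.5, size=50)           # e.g., temperature readings
    nearly_same = base + rng.normal(0, 0.01, 50)    # highly correlated follow-up window
    different = rng.normal(25.0, 0.5, size=50)

    print(should_transmit(base, None))          # True: nothing sent yet
    print(should_transmit(nearly_same, base))   # False: redundant, fused locally
    print(should_transmit(different, base))     # True: carries new information
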

13.
IEEE Trans Industr Inform ; 17(7): 5128-5137, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33994885

ABSTRACT

The Industrial Internet of Things (IIoT) ensures reliable and efficient data exchange among industrial processes using Artificial Intelligence (AI) within cyber-physical systems. In the IIoT ecosystem, devices of industrial applications communicate with each other with little human intervention, and they need to act intelligently to safeguard data confidentiality and device authenticity. The ability to gather, process, and store real-time data depends on the quality of data, network connectivity, and processing capabilities of these devices. Pervasive Edge Computing (PEC) is gaining popularity due to the resource limitations imposed on sensor-embedded IIoT devices: PEC processes the gathered data at the network edge to reduce the response time for these devices. However, PEC faces numerous research challenges in terms of secure communication, network connectivity, and resource utilization of the edge servers. To address these challenges, we propose a secure and intelligent communication scheme for PEC in an IIoT-enabled infrastructure. In the proposed scheme, forged identities of adversaries, i.e., Sybil devices, are detected by IIoT devices and shared with edge servers to prevent upstream transmission of their malicious data. Upon Sybil attack detection, each edge server executes a parallel Artificial Bee Colony (pABC) algorithm to compute an optimal network configuration of IIoT devices, and migrates jobs to neighboring servers, based on their processing and storage capabilities, for load balancing and better network performance. The experimental results demonstrate the efficiency of our proposed scheme in terms of Sybil attack detection, the convergence of the pABC algorithm, and the delay, throughput, and control overhead of data communication using PEC for IIoT.
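
For readers unfamiliar with the optimizer, the sketch below is a compact (sequential) artificial bee colony loop minimizing a toy objective; the parallelization and the actual IIoT network-configuration objective used in the paper are not reproduced.

    # Compact artificial bee colony (ABC) loop minimizing a toy objective, as a
    # stand-in for the paper's parallel ABC; the real configuration objective and
    # its parallel execution are not reproduced here.
    import numpy as np

    rng = np.random.default_rng(7)
    DIM, N_SOURCES, LIMIT, ITERATIONS = 5, 20, 30, 200
    LOW, HIGH = -5.0, 5.0

    def objective(x):                      # toy stand-in for configuration cost
        return float(np.sum(x ** 2))

    def fitness(cost):
        return 1.0 / (1.0 + cost)

    sources = rng.uniform(LOW, HIGH, size=(N_SOURCES, DIM))
    costs = np.array([objective(s) for s in sources])
    trials = np.zeros(N_SOURCES, dtype=int)

    def neighbour_search(i):
        k = rng.choice([m for m in range(N_SOURCES) if m != i])
        j = rng.integers(DIM)
        candidate = sources[i].copy()
        candidate[j] += rng.uniform(-1, 1) * (sources[i, j] - sources[k, j])
        candidate = np.clip(candidate, LOW, HIGH)
        cand_cost = objective(candidate)
        if cand_cost < costs[i]:           # greedy selection
            sources[i], costs[i], trials[i] = candidate, cand_cost, 0
        else:
            trials[i] += 1

    for _ in range(ITERATIONS):
        for i in range(N_SOURCES):         # employed bee phase
            neighbour_search(i)
        probs = fitness(costs) / fitness(costs).sum()
        for _ in range(N_SOURCES):         # onlooker bee phase
            neighbour_search(rng.choice(N_SOURCES, p=probs))
        for i in range(N_SOURCES):         # scout bee phase: abandon stale sources
            if trials[i] > LIMIT:
                sources[i] = rng.uniform(LOW, HIGH, DIM)
                costs[i], trials[i] = objective(sources[i]), 0

    print("best configuration cost:", costs.min())
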

14.
IEEE Trans Industr Inform ; 17(8): 5829-5839, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33981186

ABSTRACT

Industry 5.0 refers to the digitalization, automation, and data exchange of industrial processes that involve Artificial Intelligence, the Industrial Internet of Things (IIoT), and Industrial Cyber-Physical Systems (I-CPS). In healthcare, I-CPS enables intelligent wearable devices to gather data from the real world and transmit it to the virtual world for decision-making, making our lives more comfortable through innovative healthcare applications. Similar to any other IIoT paradigm, I-CPS-capable healthcare applications face numerous challenges. The resource-constrained nature of wearable devices and their inability to support complex security mechanisms provide malevolent entities with an ideal platform for launching attacks. To preserve the privacy of wearable devices and their data in an I-CPS environment, we propose a lightweight mutual authentication scheme based on a client-server interaction model that uses symmetric encryption to establish secure sessions among the communicating entities. After mutual authentication, the privacy risk associated with a patient's data is predicted using an AI-enabled Hidden Markov Model (HMM). We analyzed the robustness and security of our scheme using Burrows-Abadi-Needham (BAN) logic. This analysis shows that the use of lightweight security primitives for the exchange of session keys makes the proposed scheme highly resilient in terms of security, efficiency, and robustness. Finally, the proposed scheme incurs nominal processing, communication, and storage overhead and is capable of combating a wide range of adversarial threats.
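
The sketch below shows a generic symmetric-key challenge-response pattern for mutual authentication and session-key derivation, in the spirit of the scheme described above; it is not the paper's protocol, and the HMM-based privacy-risk step is omitted.

    # Generic symmetric-key mutual authentication: both sides prove knowledge of
    # a pre-shared key by answering each other's fresh challenge, then derive a
    # shared session key. This is an illustration, not the paper's protocol.
    import hashlib
    import hmac
    import secrets

    PSK = secrets.token_bytes(32)   # pre-shared key provisioned at enrolment

    def respond(psk, challenge, role):
        return hmac.new(psk, role + challenge, hashlib.sha256).digest()

    # 1. Each side issues a fresh nonce as a challenge.
    nonce_device, nonce_server = secrets.token_bytes(16), secrets.token_bytes(16)

    # 2. Each side answers the other's challenge using the pre-shared key.
    device_proof = respond(PSK, nonce_server, b"device")
    server_proof = respond(PSK, nonce_device, b"server")

    # 3. Each side verifies the proof it received before proceeding.
    assert hmac.compare_digest(device_proof, respond(PSK, nonce_server, b"device"))
    assert hmac.compare_digest(server_proof, respond(PSK, nonce_device, b"server"))

    # 4. Both sides derive the same session key from the key and both nonces.
    session_key = hashlib.sha256(PSK + nonce_device + nonce_server).digest()
    print("session key established:", session_key.hex()[:16], "...")
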

15.
IEEE Trans Green Commun Netw ; 5(3): 1202-1211, 2021 Sep.
Article in English | MEDLINE | ID: mdl-35449692

ABSTRACT

The Internet of Things (IoT) is considered a key enabler of health informatics. IoT-enabled devices are used for in-hospital and in-home patient monitoring to collect and transfer biomedical data pertaining to blood pressure, electrocardiography (ECG), blood sugar levels, body temperature, etc. Among these devices, wearables have found their way into a wide range of healthcare applications. They generate data in real time and transmit them to nearby gateways and remote servers for processing and visualization. The data transmitted by these devices are vulnerable to a range of adversarial threats, and as such, their privacy and integrity need to be preserved. In this paper, we present LightIoT, a lightweight and secure communication approach for data exchanged among the devices of a healthcare infrastructure. LightIoT operates in three phases: initialization, pairing, and authentication. These phases ensure the reliable transmission of data by establishing secure sessions among the communicating entities (wearables, gateways, and a remote server). Statistical results show that our scheme is lightweight, robust, and resilient against a wide range of adversarial attacks, and that it incurs much lower computational and communication overhead for the transmitted data compared to existing approaches.

16.
IEEE Netw ; 34(6): 235-241, 2020.
Article in English | MEDLINE | ID: mdl-34366564

ABSTRACT

Named Data Networking (NDN) architectural features, including multicast data delivery, stateful forwarding, and in-network data caching, have shown promise for applications such as video streaming and file sharing. However, collaborative applications requiring multi-producer participation introduce new NDN design challenges. In this paper, we highlight these challenges in the context of the Network Time Protocol (NTP) and one of its most widely used deployments for NTP server discovery, the NTP pool project. We discuss the design requirements for supporting NTP and the NTP pool, and present general directions for the design of a time synchronization protocol over NDN, coined the Named Data Networking Time Protocol (NDNTP).

17.
Article in English | MEDLINE | ID: mdl-34368423

ABSTRACT

In this work, we investigate Named Data Networking's (NDN's) architectural properties and features, such as content caching and intelligent packet forwarding, in the context of Content Delivery Network (CDN) workflows. More specifically, we evaluate NDN's properties for PoP (Point of Presence) to PoP and PoP to device connectivity. We use the Apache Traffic Server (ATS) platform to create a CDN-like caching hierarchy in order to compare NDN with HTTP-based content delivery. Overall, our work demonstrates that several properties inherent to NDN can benefit content providers and users alike through in-network caching of content, fast retransmission, and stateful hop-by-hop packet forwarding. Our experimental results show that HTTP delivers content faster under stable conditions due to a mature software stack, whereas NDN performs better in the presence of packet loss, even for a loss rate as low as 0.1%, due to packet-level caching in the network and fast retransmissions from nearby upstreams. We further show that the Time To First Byte (TTFB) in NDN is consistently lower than in HTTP (~100 ms for HTTP vs. ~50 ms for NDN), a vital requirement for CDNs. Unlike HTTP, NDN also supports transparent failover to another upstream when a failure occurs in the network. Finally, we present network properties that can benefit CDN workflows regardless of implementation choice (be it Software-Defined Networking, Information-Centric Networking, or something else).

18.
IEEE Netw ; 34(6): 259-265, 2020.
Article in English | MEDLINE | ID: mdl-34393357

ABSTRACT

Delay-sensitive applications have been driving the move away from cloud computing, which cannot meet their low-latency requirements. Edge computing and programmable switches have been among the first steps toward pushing computation closer to end-users in order to reduce cost, latency, and overall resource utilization. This article presents the "compute-less" paradigm, which builds on top of the well-known edge computing paradigm through a set of communication and computation optimization mechanisms (e.g., in-network computing, task clustering and aggregation, computation reuse). The main objective of the compute-less paradigm is to reduce the migration of computation and the usage of network and computing resources, while maintaining a high Quality of Experience for end-users. We discuss the new perspectives, challenges, limitations, and opportunities of this compute-less paradigm.
