Article

Web Real-Time Communications-Based Unmanned-Aerial-Vehicle-Borne Internet of Things and Stringent Time Sensitivity: A Case Study

by
Agnieszka Chodorek
1 and
Robert Ryszard Chodorek
2,*
1
Department of Applied Computer Science, Faculty of Electrical Engineering, Automatic Control and Computer Science, Kielce University of Technology, Al. 1000-lecia P.P. 7, 25-314 Kielce, Poland
2
Institute of Telecommunications, Faculty of Computer Science, Electronics and Telecommunications, AGH University of Krakow, Al. Mickiewicza 30, 30-059 Krakow, Poland
*
Author to whom correspondence should be addressed.
Submission received: 1 October 2024 / Revised: 30 December 2024 / Accepted: 14 January 2025 / Published: 17 January 2025
(This article belongs to the Special Issue New Methods and Applications for UAVs)

Abstract

The currently observed development of time-sensitive applications also affects wireless communication with the IoT carried by UAVs. Although research on wireless low-latency networks has matured, there are still issues to solve at the transport layer. Since there is general agreement that classic transport solutions are not able to achieve end-to-end delays in the single-digit millisecond range, this paper proposes the use of WebRTC as a potential solution to this problem. This article examines UAV-borne WebRTC-based IoT in an outdoor environment. The results of field experiments conducted under various network conditions show that, in highly reliable networks, the UAV- and WebRTC-based IoT achieved stable end-to-end delays well below 10 ms during error-free air-to-ground transmissions, and delays below 10 ms in the immediate vicinity of the retransmitted packet. The significant advantage of the WebRTC data channel over the classic WebSocket is also demonstrated.

1. Introduction

Unmanned aerial vehicles (UAVs) are currently one of the fastest developing multi-role carrier technologies. These ubiquitous devices now have a multitude of economic, commercial, leisure, military, and academic uses [1], and their uses range from individuals flying them for recreation to large commercial package and medical supply companies [2]. They can be used to transport parcels and people between locations [3,4,5]. Equipped with on-board cameras and Internet of Things (IoT) systems, UAVs are used to monitor pollution [6,7], weather [8,9], road traffic [10], and crop production [11,12]. An important part of these UAV applications is the communication and computing support, including a flying range extender [13], a flying router [13,14], and a flying computer for aerial mobile edge computing (AMEC) purposes [15].
These and other UAV applications can be time-sensitive in a broad sense, i.e., they may have arbitrary time constraints imposed. These can be relatively large if they are related to the delivery of parcels or people. Such deliveries may have to be completed within a specific time window [3,4] or as soon as possible. It is estimated that, in a large city, the time needed for transporting parcels or people by air may be a few dozen percent shorter than the time for land transport [5]. Time constraints may also be relatively small when they concern the provision of real-time or near-real-time information: either to detect and locate the source of pollution [6,7], or for disaster response purposes [13,14]. The need for real-time information may also arise in the case of observations of weather [8,9], crop production [11,12], and road traffic [10], etc., if data are sent from the UAV to a ground station, where they are used for analysis, real-time visualization, and decision-making.
Time-sensitive UAV-IoT applications may also involve collecting and processing data from sensors located in a given area. For example, the integration of UAVs with the internet of medical things (IoMT) was reported in [15]. This UAV-enabled system implements AMEC functionality. In the proposed solution, communication delays were reduced from a range of 17–30 ms to a range of 12.5–24 ms. The freshness of data is expressed as the so-called age of information (AoI), i.e., the time that has passed since the generation of the most recently received data [16]. AoI values reported in the literature, which comprise UAV flight time, hover time, and maintenance time, range from less than 600 s to less than 1800 s [16] or from almost 1400 s to less than 2200 s [17], with low-AoI systems starting at an AoI of 70–80 s [18,19]. AoI improvement methods are based on optimizing the UAV trajectory [16,17,18,19]; at such large time scales, the transmission delay is negligible.
Currently, the main challenge in the field of wireless communication with UAVs is time-sensitive applications that require low latency, defined as end-to-end delays measured in single-digit milliseconds at the application level. Examples of such applications are presented in Table 1. The traffic generated by these applications is deterministic, meaning hard real-time with no jitter, or non-deterministic, where low jitter can be observed. A high reliability of transmission is required, and in the case of deterministic traffic, ultra-high reliability is needed. This approach breaks with the classic division of telecommunications traffic into elastic and inelastic, where only inelastic traffic had to meet stringent time requirements and only elastic traffic had to be characterized by a high transmission reliability [20].
It is important to note here that using a low-latency network does not guarantee low end-to-end delays at the application level. This is attributed to the mechanisms of classic transport protocols, which are unable to effectively meet the requirements of low delays [23]. Another problem is the socket application programming interface (API) of these protocols, which is too low-level, simple, and inflexible [23]. The authors believe that a solution to the above problems, at least in the case of time-sensitive UAV-borne IoT, could be the use of web real-time communications (WebRTC), which, as the name suggests, provides native real-time communication on the Web. In [26], the World Wide Web Consortium (W3C) announced the general need for building a WebRTC-based IoT. Requirement N15 included in [26] states that a WebRTC-based IoT should be able to provide low and consistent latency under varying network conditions.

Main Contributions and Organization of This Paper

In our previous paper, we proposed a WebRTC-based application capable of operating like the classic IoT [27], intended for use in UAV-borne monitoring systems. This application was a part of our UAV- and WebRTC-based open universal framework [28]. In this paper, we present the results of field experiments aimed at verifying whether, and to what extent, a UAV-borne IoT based on the current WebRTC standard is able to provide low and consistent latency under varying network conditions. The main contributions of this paper are as follows:
  • Supplementing the application in [27], working as an element of the framework [28], with high-resolution time measurement and timer synchronization procedures.
  • Carrying out delay measurements at the level of the transport protocol and at the level of the web logical channel during air-to-ground IoT transmissions under varying network conditions, and then performing a statistical analysis of these delays.
  • For completeness, a comparison of the obtained results with the results obtained for IoT transmission via a classic web logical channel, i.e., WebSocket, under the same circumstances.
The rest of this paper is organized as follows: Section 2 analyzes related work. Section 3 discusses the materials and methods used during the experiments. Section 4 describes the field experiments, including post-selection of the measurement series for further analysis. Section 5 presents and discusses the measures of the location of the selected series of end-to-end delays, while Section 6 compares the transmissions carried out with the use of WebRTC and WebSocket in terms of the measures of location, as well as the measures of variation derived from these measures of location. Section 7 summarizes our experiences.

2. Related Work

While Section 1 provides a broad background, Section 2 discusses both real-time alternatives and the authors’ prior application solutions that formed the basis of this paper. The review of existing solutions covers time-sensitive applications that generate non-deterministic traffic. Although real-time transmission is usually associated with multimedia streaming (as are IoT real-time transmissions [29]), this paper discusses only the transmission of non-media data, usually data coming from sensors. The discussion focuses on aspects of the transport layer, i.e., transport protocols and interfaces. The criterion for selecting the literature was coverage of the various non-media real-time transmission techniques found in the literature, with preference given to papers that explicitly provided transmission times in a local area network. Most references concern IoT communication between UAVs and ground stations.
Non-deterministic traffic generated by time-sensitive applications is most often transmitted air-to-ground using wireless local networks (WLANs), usually built using the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard [27,28,30,31,32,33,34,35,36,37,38,39,40], but also using standards for broadband cellular networks: the long-term evolution (LTE) standard [40,41], also known as the fourth-generation (4G) technology standard, and the fifth-generation (5G) technology standard [15,30,40]. Among IEEE 802.11 [42] networks, popular versions of the physical layer are used, such as 802.11g [31,32] and 802.11n [33,34,35,40], as well as the 802.11p version intended for the vehicular environment [30]. The works [27,28] employed the 802.11ac version, which is able to provide latencies below 10 ms, including providing handover latencies below 10 ms thanks to the fast roaming service [43].
The 802.11 standard was also used in [44], where a time-sensitive application, intended to work on board a UAV, was tested using a laptop and an unmanned ground vehicle (UGV). Another time-sensitive application in which the sensor system was carried on board a UGV was presented in [45]. The network used in [45] was based on software-defined radio (SDR) working with software-defined networking (SDN). In [46], a stationary robot communicated via both 802.11 WLAN and evolved high-speed packet access (HSPA+), also known as the 3.75G technology standard. As a side note, refs. [44,46] also tested transmissions between stationary end systems over longer distances using the public infrastructure of an Internet service provider (ISP). These are not the subject of this article, because current networks, including those built using 5G technology [47], are only able to provide low-latency services locally, using dedicated low-latency solutions with limited range.
Time-sensitive applications typically send sensor data using the classic transmission control protocol (TCP), which provides reliable congestion- and flow-controlled transmission. Applications use the TCP transport protocol directly [41] or via the WebSocket web logical channel [30,34,35,36,38,39,44,45,46]. The works in [31,32,33] used a multipath version of the TCP protocol, i.e., the multipath transmission control protocol (multipath TCP or MPTCP), which allows multi-homed senders to increase transmission efficiency. The use of MPTCP has been shown to reduce latency under certain conditions [48], although the protocol is sensitive to path asymmetry, especially if the paths are built with different technologies (e.g., 4G and 802.11) [49]. In the works in [31,32], MPTCP was modified to meet the requirements of time-sensitive networking (TSN). In [33], the MPTCP scheduling algorithm was modified to deal with network stability in the face of high UAV mobility.
While the TCP protocol has typically been used for the transmission of non-media data, another classic transport protocol, the user datagram protocol (UDP), is used for audio/video transmission due to its simple structure and mechanisms reduced to an absolute minimum, including the lack of window-based control. Currently, UDP is most often used as an underlay protocol for other transport protocols, where it occupies the lower sublayer of the transport layer. The most popular solution is to use the real-time transport protocol (RTP) in the upper sublayer. An RTP/UDP protocol stack is typically used for multimedia communications. This solution is also used in the WebRTC video channel. In [27,28,34,35,46], a WebRTC video channel was used to transmit video from a camera. In [44], it was used to transmit data from a lidar. WebRTC uses the RTP protocol implemented in a WebRTC-capable browser and the UDP protocol implemented in the operating system.
UDP can also be used as a transport protocol for transmitting non-media data. In [41], a low-latency reliable transmission (LRT) application layer protocol operating directly over UDP was proposed to reduce the delay during ground-to-UAV transmission through a cellular network. In the abovementioned work [31], control data were sent via the UDP protocol, while the remaining data were sent via the MPTCP protocol. The WebRTC data channel uses the stream control transmission protocol (SCTP) over UDP. To ensure reliable transmission, the SCTP uses error control and congestion control mechanisms similar to those of TCP. In the works in [27,28,46], the SCTP protocol was used to transmit data from sensors. Similarly to the RTP, the SCTP is always implemented in a WebRTC-capable browser, regardless of the operating system’s implementation of the SCTP. This is due to the need to use the new version of the SCTP standard intended for WebRTC [50]. However, the implementation of the SCTP in the operating system can also be used for IoT data transmission [37]. The UDP transport protocol implemented in the operating system is also the basis for the quick UDP internet connections (QUIC) protocol [51], initially intended for web applications and now proposed for low-latency communication in the next-generation IoT [52]. The papers in [38,39] presented the results of evaluations of real IoT traffic transmitted using QUIC in an emulated [38] or simulated [39] wireless environment.
Applications, including web browsers, use the TCP and UDP transport protocols implemented in the operating system, communicating with them via the socket interface. In [31,32,33], applications communicated with the MPTCP protocol in the operating system via a classic stream socket. The WebSocket web logical channel uses the TCP protocol in the operating system, also communicating with it via a stream socket. Applications that send data via WebSocket, such as [30,34,44,45], use the high-level WebSocket API. WebRTC offers separate web logical channels for media and non-media transmission, each of which is associated with a separate high-level API used by WebRTC applications such as [27,28,34,35,44,46]. The RTP and SCTP protocols are implemented in WebRTC-capable browsers and communicate with the UDP protocol in the operating system via classic datagram sockets. WebRTC does not use the SCTP in the operating system and therefore does not use an SCTP socket.
A comparison of related work is presented in Table 2. Our WebRTC-based UAV-borne IoT application is presented in [27,28]. In this work, data were not transmitted in the web of things (WoT) architecture [53], using an intermediate server, but in the classic IoT manner, using a peer-to-peer WebRTC architecture. In [44], transmissions of lidar data via the WebRTC video channel and via WebSocket were compared. In [34,35], IoT data were transmitted via WebSocket, and only video was transmitted via WebRTC. In [46], WoT data were transferred from a robot to a WoT server over WebSocket, and then from the server to the recipient over WebRTC. In [40], the WebRTC data channel was used to control a UAV and transmit telemetry. The remaining papers did not use WebRTC. In [30,36,45], only a WebSocket logical channel was used, while, in [15,31,32,33,37,38,39,41], a web logical channel was not used at all.
WebRTC applications are web-based equivalents of classic, standalone multimedia applications based on the session initiation protocol (SIP). The management plane protocol stack [54] and the production plane protocol stack for media streaming [55] are similar to their legacy SIP architecture counterparts. WebRTC applications are loaded from web servers as part of web pages and use web browsers as run-time environments. This makes them highly portable and secure, as detected browser vulnerabilities are eliminated on an ongoing basis. What distinguishes WebRTC from other web techniques, such as WebSocket, is its dual protocol stack, an idea taken from the SIP architecture. As a result, WebRTC can be used to transmit both media streams and non-media flows. Streams and flows are cryptographically protected and congestion-controlled. Since both the media stream and the non-media flow use TCP-friendly congestion control, in the event of poor network conditions, data are protected at the expense of video [56]. If necessary, WebRTC applications can use more sophisticated streaming media congestion control methods, such as RTP translators or simulcast, and both RTP streams and SCTP flows can use differentiated services (DiffServ) to ensure quality of service (QoS).

3. Materials and Methods

This section introduces the flying monitoring system used in the field experiments, highlighting the time-sensitive aspects; defines the end-to-end delays measured at both the logical channel level and the transport protocol level; shows the method for creating a series of end-to-end delays; and finally describes the extreme values and measures of location calculated from these series.

3.1. System

In all experiments, a flying monitoring system built on the basis of the framework in [28] was used. Structurally, the system consists of an air station and a ground station, and functionally of an IoT system and an IoT carrier. The air station was an unmanned quadcopter, operating as the IoT carrier, with an IoT system on board, i.e., environmental sensors connected to a single-board computer (SBC) Raspberry Pi 4 Model B running the authors’ monitoring application. The environmental sensors included four weather sensors previously used to build a mobile weather station [9] and a gas sensor used in a pollution monitoring system [7], which allowed for the reuse of existing sensor-dependent code. The monitoring application was written in the JavaScript language as part of a web page, and its runtime environment was the Chromium browser, in headless mode, run on the Raspberry Pi OS operating system. The authors’ analysis of the Chromium browser implementation showed that the browser has a limited send buffer, which reduces buffering times, and that the stream socket is created with Nagle’s algorithm disabled, so there was no need to set these parameters additionally.
The monitoring application included the WebRTC video service, positioning service, and sensor service. The WebRTC video service was built as a classic WebRTC video application. The positioning service and the sensor service were built in a browser-driven manner [27], typical for IoT systems. In the experiments presented in this paper, the video service was turned off, the positioning service sent its data to the sensor service, and only the sensor service sent its data to the ground. Data from the sensor service were transmitted in message queuing telemetry transport (MQTT) messages bearing an MQTT topic, which identified each datum. Example topics used in the experiments, and the method of creating them, were described in the authors’ previous paper [7]. MQTT messages were transmitted over a web logical channel, using both the WebRTC data channel and the WebSocket. In the latter case, to improve the time properties of the TCP, the PUSH option was set in each TCP packet carrying the MQTT message, which means immediate pushing of the received data to the application. Because the correct operation of the monitoring application required that the central processing unit (CPU) always had a sufficient reserve of resources, this had to be monitored during the performance of all tests.
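As a rough illustration of the scheme above, the sketch below packages one IoT datum into an MQTT-style message suitable for a web logical channel. The topic string, payload fields, and helper name are illustrative assumptions, not the authors' actual implementation; the real topic construction is described in [7].

```javascript
// Hypothetical helper: serialize one sensor datum as a topic-bearing message.
// The topic identifies the datum; the payload carries its value and the
// entry time into the logical channel (t_i^{ilc}, in ms).
function buildMessage(topic, value, timestampMs) {
  return JSON.stringify({ topic, value, t: timestampMs });
}

// The same high-level send call works for both web logical channels compared
// in this paper (browser-side, shown here only as comments):
//
//   dataChannel.send(buildMessage("uav1/weather/temperature", 21.4, tIlc)); // WebRTC data channel
//   webSocket.send(buildMessage("uav1/weather/temperature", 21.4, tIlc));   // WebSocket
```

The symmetry of the two send calls is what makes a like-for-like comparison of the two channels possible at the application level.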
Unlike the air station, which was one device, the ground station was divided into two separate devices (Figure 1): the command and control console (CCC) and the WebRTC multimedia and monitoring station (WMMS). The CCC was used to pilot the IoT carrier. It was connected to the UAV via the control network (yellow lightning in Figure 1). The WMMS was designed for IoT purposes. It was connected to the monitoring software via the IEEE 802.11ac production network (red lightning in Figure 1), which was built as a heterogeneous extended service set (ESS). As a result, the transmission between the air station and the WMMS was carried out both in a wireless environment and in a mixed wired/wireless one, in which the access points were connected to each other via a gigabit Ethernet (IEEE 802.3ab) network. The intermediate devices used in the ESS were NETGEAR Nighthawk X4 R7500 AC2350 access points (AP1, AP2, and AP3) and an HP 3500-24G-PoE+ yl Switch (SW1). The AP1 and the SW1 were placed close to the corners of a rectangular 70 m × 70 m parking lot, which was the test area. The AP2 and the AP3 were located 50 m from the AP1 and the SW1, respectively.

3.2. Series of End-to-End Delays

During the flights, time parameters were collected, both at the air station and at the ground station. These parameters were used to determine a pair of delays: one at the transport level, the other at the level of the web logical channel.
Definition 1.
The end-to-end delay $d_i^t$ of the $i$-th IoT datum transmitted between the air station and the ground station, measured at the transport level, is defined as
$$d_i^t = t_i^t - t_i^{ilc} \tag{1}$$
where $t_i^t$ is the reception time from the transport protocol and $t_i^{ilc}$ is the entry time into the logical channel.
Definition 2.
The end-to-end delay $d_i^{lc}$ of the $i$-th IoT datum transmitted between the air station and the ground station, measured at the logical channel level, is defined as
$$d_i^{lc} = t_i^{olc} - t_i^{ilc} \tag{2}$$
where $t_i^{olc}$ is the reception time from the logical channel.
In the case of transmissions carried out over the WebRTC data channel, the times $t_i^{ilc}$, $t_i^t$, and $t_i^{olc}$ were measured for each transmitted IoT datum, where $i$ was the sequence number of this datum. From these times, the delays $d_i^t$ and $d_i^{lc}$ were then calculated according to Formulas (1) and (2), respectively. The difference between the corresponding delays resulted from the processing of the payload of the SCTP packet (the MQTT message) placed in the receive buffer of the logical channel. In the case of transmissions conducted over the WebSocket logical channel, performed for comparison purposes, the times $t_i^{ilc}$ and $t_i^{olc}$ were measured for each transmitted IoT datum, from which the delays $d_i^{lc}$ were then determined according to Formula (2).
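Given the three timestamp series collected per datum, the two delay series of Formulas (1) and (2) can be computed as in the following minimal sketch (array and function names are hypothetical; all times are in milliseconds):

```javascript
// Compute per-datum delays from the collected timestamps:
//   tIlc[i] — entry time into the logical channel (t_i^{ilc})
//   tT[i]   — reception time from the transport protocol (t_i^t)
//   tOlc[i] — reception time from the logical channel (t_i^{olc})
function computeDelays(tIlc, tT, tOlc) {
  const dT = [];  // transport-level delays, Formula (1)
  const dLc = []; // logical-channel-level delays, Formula (2)
  for (let i = 0; i < tIlc.length; i++) {
    dT.push(tT[i] - tIlc[i]);    // d_i^t  = t_i^t   - t_i^{ilc}
    dLc.push(tOlc[i] - tIlc[i]); // d_i^lc = t_i^{olc} - t_i^{ilc}
  }
  return { dT, dLc };
}
```

For WebSocket transmissions, where no transport-level reception time is available to the application, only the second series would be computed.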
Let $t_1^{ilc}$ be the starting time, defined as the instant of time when the air station was directly above the ground station; let $D^{lc}$ be the series of $N = 40{,}000$ end-to-end delays measured at the logical channel level, starting at time $t_1^{ilc}$; and let $D^t$ be the corresponding series of the same number of end-to-end delays measured at the transport level, starting at time $t_1^{ilc}$. The $N$ value of 40,000 provided a high delivery rate of 0.999975 for a single error. After each pair of flights, one series of delays measured at the transport level, $D_{WRTC}^t$, and two series of delays measured at the logical channel level, $D_{WRTC}^{lc}$ and $D_{WS}^{lc}$, were generated. The times $t_1^{ilc}$ at which the series $D_{WRTC}^{lc}$ and $D_{WS}^{lc}$ started were shifted relative to each other by the time of the first flight of the pair and the service time of the second flight. The $WRTC$ and $WS$ indexes indicate which web logical channel was used in a given measurement series (the WebRTC data channel and WebSocket, respectively). The series $D_{WRTC}^t$, $D_{WRTC}^{lc}$, and $D_{WS}^{lc}$ were subjected to statistical processing.
For the analysis of the impact of single outliers, in particular the impact of the delay of the IoT datum conveyed in the retransmitted packet, truncated measures were used. For this purpose, the series $D_{WRTC}^t$, $D_{WRTC}^{lc}$, and $D_{WS}^{lc}$ of end-to-end delays $d_i$, $i = 1, 2, \ldots, 40{,}000$, were sorted in non-decreasing order. Then, the two extreme delays (the minimum and maximum delay) were discarded from each series. This resulted in new, shorter series $DT_{WRTC}^t$, $DT_{WRTC}^{lc}$, and $DT_{WS}^{lc}$ of end-to-end delays $d_j$, $j = 1, 2, \ldots, 39{,}998$. These series were subjected to the same statistical processing as the series from which they were derived.
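The truncation step described above can be sketched as follows (the helper name is hypothetical):

```javascript
// Build a truncated series: sort the delays in non-decreasing order and
// discard the single minimum and single maximum value, turning a series
// of N delays into one of N - 2 delays.
function truncateSeries(delays) {
  const sorted = [...delays].sort((a, b) => a - b); // copy, then sort numerically
  return sorted.slice(1, sorted.length - 1);        // drop min and max
}
```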

3.3. Statistics

For time-sensitive applications, the main key performance indicator (KPI) is latency, defined as the end-to-end transmission delay. As a result, of all the related works, only Ref. [41] took both latency and jitter into account, while the rest focused only on latency. Following this lead, extremes and measures of location were calculated during statistical processing. In particular, the following extreme values were determined after each flight:
  • the minimum value in each series: $\min(D_{WRTC}^t)$, $\min(D_{WRTC}^{lc})$, $\min(D_{WS}^{lc})$;
  • the maximum value in each series: $\max(D_{WRTC}^t)$, $\max(D_{WRTC}^{lc})$, $\max(D_{WS}^{lc})$.
From the measures of location, both the classic measure of location, namely the arithmetic mean, and the measures of position were calculated:
  • arithmetic mean: $\mu(D_{WRTC}^t)$, $\mu(D_{WRTC}^{lc})$, $\mu(D_{WS}^{lc})$:
$$\mu(D) = \frac{1}{\mathrm{card}(D)} \sum_{i=1}^{\mathrm{card}(D)} d_i, \quad d_i \in D \tag{3}$$
  • median: $\mathrm{med}(D_{WRTC}^t)$, $\mathrm{med}(D_{WRTC}^{lc})$, $\mathrm{med}(D_{WS}^{lc})$,
  • mode: $\mathrm{mod}(D_{WRTC}^t)$, $\mathrm{mod}(D_{WRTC}^{lc})$, $\mathrm{mod}(D_{WS}^{lc})$,
  • lower quartile: $Q_1(D_{WRTC}^t)$, $Q_1(D_{WRTC}^{lc})$, $Q_1(D_{WS}^{lc})$,
  • upper quartile: $Q_3(D_{WRTC}^t)$, $Q_3(D_{WRTC}^{lc})$, $Q_3(D_{WS}^{lc})$.
The same statistics, calculated from the truncated series of end-to-end delays, namely $DT_{WRTC}^t$, $DT_{WRTC}^{lc}$, and $DT_{WS}^{lc}$, produced truncated statistics, such as the truncated minimum $\min(DT_{WRTC}^t)$, truncated maximum $\max(DT_{WRTC}^t)$, truncated mean $\mu(DT_{WRTC}^t)$, etc. These statistics were used to assess whether, and to what extent, a single retransmission affected the statistical properties of the analyzed end-to-end delays.
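The measures of location listed above can be sketched in a few lines; the function names are hypothetical, and the interpolating quantile shown here is only one common convention, which may differ from the exact definition used in the paper's statistical processing.

```javascript
// Arithmetic mean over a series of delays, per Formula (3).
function mean(d) {
  return d.reduce((s, x) => s + x, 0) / d.length;
}

// Quantile on a sorted copy, with linear interpolation between ranks.
// p = 0.25 gives Q1, p = 0.5 the median, p = 0.75 gives Q3.
function quantile(d, p) {
  const s = [...d].sort((a, b) => a - b);
  const idx = (s.length - 1) * p;
  const lo = Math.floor(idx), hi = Math.ceil(idx);
  return s[lo] + (s[hi] - s[lo]) * (idx - lo);
}

const median = (d) => quantile(d, 0.5);
const q1 = (d) => quantile(d, 0.25);
const q3 = (d) => quantile(d, 0.75);

// Mode: the most frequent value. For continuous delay measurements this
// assumes the values were first rounded/binned to the timer resolution.
function mode(d) {
  const counts = new Map();
  let best = d[0], bestCount = 0;
  for (const x of d) {
    const c = (counts.get(x) || 0) + 1;
    counts.set(x, c);
    if (c > bestCount) { best = x; bestCount = c; }
  }
  return best;
}
```

Applying the same functions to a truncated series yields the truncated statistics directly.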

4. Experiments

Document [26] introduces a number of requirements that a WebRTC-based IoT must meet. The aim of the experiments described in this section was to check whether and to what extent the UAV-borne IoT, based on the current WebRTC standard, was able to meet the N15 requirement of [26], i.e., was able to provide low and consistent latencies under varying network conditions. As mentioned in Section 1, the challenge is in time-sensitive applications that require end-to-end delays measured in single-digit milliseconds at the application level. To meet this challenge, a WebRTC-based UAV-borne IoT should communicate with the ground station through a highly reliable, low-latency network. Since a delivery rate of 99.99% to 99.999% is considered high reliability, at most 1 packet error detected in the transport layer per 40,000 IoT data sent was assumed, i.e., a minimum packet delivery rate of 99.9975%.
The second assumption was that variable network conditions should result from both deterministic and random factors. Classic deterministic factors include the network heterogeneity (wired and wireless links), handovers, signal strength decrease with distance from access points, and UAV behavior (moving, hovering). Random factors include the different weather conditions and the different times of conducting experiments, which results in different user activities in co-existing networks in the same area, which in turn results in different loads on co-existing networks. The source of any transmission errors should be random factors.
The rest of this section presents the location of the field experiments and the course of the experiments; and discusses the flight days and sessions, network operating conditions, and the number of errors detected in the medium access control (MAC) sublayer. Finally, the measurement series selected for statistical analysis and the reasons why these series were selected and not others are described.

4.1. Location of the Experiments

The field experiments were carried out in a square parking lot 70 m long and 70 m wide, located on the campus of the AGH University of Krakow, Poland. The location of the experiment site between the university’s teaching buildings and the dormitory made it possible to conduct experiments at times of the day when the students’ Internet activity was low and high, generating low and high loads on the wireless networks coexisting with the air-to-ground production network in the test area. The high load on co-existing networks was a factor contributing to the occurrence of single transmission errors in the transport layer.

4.2. Course of Experiments

During the experiments, the air station performed automatic flights, sweeping the same 70 m × 70 m test area, zigzagging over the parking lot along the same flight path, at the same speed (1.67 m/s), and at the same altitude of 15 m. Air-to-ground transmissions were conducted both on the fly and hovering, and flight phases were intertwined with hovering phases. The summary flight time was about 460 s, and the summary hover time was about 280 s. This gave a total of just over 740 s (about 12.5 min) mission duration. The hover point locations and hover times were always the same. As the air station swept the entire test area, it switched between access points transmitting data through the 802.11ac production network described in the previous section. To ensure a seamless handover, the production network used the fast handover technique, which is part of the IEEE 802.11ac standard.
The source of the IoT data was the five environmental sensors that the air station was equipped with. During each flight, the sensors cyclically performed 9 measurements of the environmental parameters in a given time interval (0.5 s). Since each measurement datum was accompanied by two metadata (time and position), the air station sent a burst of 27 packets to the ground every half a second. This was over 1480 bursts, i.e., over 40,000 data packets, per flight. During each flight, the entry times into the logical channel, the reception times from the transport protocol (only IoT transmissions over WebRTC), and the reception times from the logical channel were collected.
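The traffic figures above can be verified with simple arithmetic, reproduced in the following sketch (variable names are illustrative):

```javascript
// Traffic generated per flight by the monitoring application:
// 9 measurements per 0.5 s interval, each accompanied by two metadata
// (time and position), over a mission of just over 740 s.
const measurementsPerInterval = 9;
const packetsPerDatum = 3;  // measurement value + time + position
const packetsPerBurst = measurementsPerInterval * packetsPerDatum; // 27
const intervalS = 0.5;
const missionS = 740;       // the actual mission slightly exceeded this
const bursts = missionS / intervalS;      // 1480
const packets = bursts * packetsPerBurst; // 39,960; the slightly longer
// actual mission yields the "over 40,000 data packets per flight" figure.
```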
During the experiments, data were sent over a web logical channel. In order to compare IoT transmission over the WebRTC data channel with the classic solution, each evaluation flight in which IoT data were transmitted over the WebRTC data channel was followed, without undue delay, by a comparison flight in which IoT data were transmitted via WebSocket. This required developing a procedure for quickly swapping the web pages that included the monitoring applications, which were downloaded from a web server running at the WMMS and run in the Chromium browser environment. The use of a buffer power supply for both the SBC and the flight controller made it possible to change the software and replace the battery in parallel. As a result, the total elapsed time for maintenance between an evaluation flight and the following comparison flight was about 1 min.

4.3. Flight Days, Flight Sessions, and Pairs of Flights

The experiments were conducted from the end of January to the end of May, on separate days falling on the same day of the week, on average every two weeks and with an interval of at least one week. The separation of experiments into individual days allowed the authors to run tests under different environmental conditions, such as temperature, relative humidity, and time of day. Because the experiments started in midwinter and ended at the turn of spring and summer, transmissions were carried out from mild winter days, when the temperature rose just above 1 degree Celsius, to warm late-spring days, when the temperature rose to 25 degrees Celsius, and from dry weather, with a relative humidity just above 40%, to rainy weather, with a relative humidity approaching 90%.
Experiments were organized into flight sessions. Before each session, the clocks at the air station and the ground station were synchronized. After each flight session, checks were performed for time drift, detected as a mismatch between the air and ground station clocks at the end of the session. Because a time drift was detected, the results collected during one flight session were rejected. Each flight session lasted up to two hours. Morning flight sessions began after 6:00 a.m. and ended before the start of classes at the University, no later than approximately 7:50 a.m. Midday flight sessions started around noon, and the evening ones started around 5 p.m. Since the experiments were conducted during the semester on campus, the time of day was related to the degree of load on the IEEE 802.11 networks coexisting with the production network on the AGH University campus and using the 5 GHz band.
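The post-session drift check can be sketched as follows. This is our illustration, not the authors' code, and the tolerance value is an assumption (the paper does not state one): a session is rejected when the two station clocks disagree by more than the tolerance after the session ends.

```python
# Hypothetical tolerance for clock mismatch, in microseconds (assumed value).
DRIFT_TOLERANCE_US = 100

def session_valid(air_clock_us: int, ground_clock_us: int) -> bool:
    """Return True when no significant time drift is detected between
    the air-station and ground-station clocks after a flight session."""
    return abs(air_clock_us - ground_clock_us) <= DRIFT_TOLERANCE_US

print(session_valid(1_000_050, 1_000_000))  # True: 50 µs is within tolerance
print(session_valid(1_000_500, 1_000_000))  # False: 500 µs drift -> reject session
```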
The flight sessions were organized into pairs of test flights, with the evaluation flight (IoT transmission over WebRTC) immediately followed by a comparison flight (IoT transmission over WebSocket). Breaks between test flights belonging to the same pair could not be longer than would result from normal operation of the monitoring system. On a flight day, one flight session was conducted, and at least three pairs of test flights were performed during each flight session.

4.4. Network Conditions

Network conditions can be roughly expressed by the number of errors in the MAC sublayer: the fewer the errors, the better the network conditions. The environmental conditions, especially the time of day, affected the network conditions, which manifested itself in different numbers of lost IEEE 802.11 frames per 40,000 transmitted IoT data packets. The number of errors in the MAC sublayer was reported by the network interface during the experiments.
Based on the number of frames lost, the network conditions were divided into good, medium, and poor. Less than two-fifths of the transmissions took place under good conditions, with just over 30 MAC frames lost per 40,000 IoT data packets sent. More than two-fifths were carried out under medium conditions, with more than 40 and no more than about 95 frames lost. More than one-fifth of the transmissions took place under poor network conditions, when approximately 100 frames or more were lost per 40,000 IoT data packets transmitted.
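The three-way classification above can be expressed as a small function. Note that the exact cut-offs between the classes are not given in the text (the reported counts leave gaps between roughly 30 and 40, and between 95 and 100 lost frames), so the thresholds below are our assumptions:

```python
def network_condition(frames_lost: int) -> str:
    """Classify network conditions from MAC-sublayer frame losses per
    ~40,000 IoT data packets. Thresholds are assumed from the text:
    good = "just over 30", medium = "more than 40 and no more than
    about 95", poor = "approximately 100 frames or more"."""
    if frames_lost <= 35:    # assumed upper bound for "good"
        return "good"
    if frames_lost <= 95:    # "no more than about 95" lost frames
        return "medium"
    return "poor"

print(network_condition(31))   # good
print(network_condition(60))   # medium
print(network_condition(102))  # poor
```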
The IEEE 802.11ac error control mechanism successfully retransmitted almost all lost frames detected by the MAC sublayer. Under both good and medium network conditions, the MAC sublayer was always able to correct the transmission errors; as a result, no errors were detected at the transport layer. Under poor network conditions, the underlying network was always unable to successfully retransmit exactly one lost frame. As a result, a single transmission error (one lost packet per 40,000 IoT data packets sent) was detected at the transport layer. Errors in the transport layer usually appeared during both flights of a given pair. There was only one registered exception to this rule, when the transmission of IoT data over the WebRTC data channel was error-free at the transport layer, while during the transmission of IoT data over the WebSocket, the TCP detected a single transmission error.

4.5. Selection of Measurement Series

Out of 35 pairs of flights, we selected five, conducted on five different flight days, when transmissions were carried out under the three different network conditions (good, medium, poor):
  • On day 1, transmissions were carried out under good network conditions. No errors were detected in the transport layer for IoT data transmission over either the WebRTC data channel or the WebSocket. Thus, the packet error rate (PER) in the transport layer was $PER_{WRTC} = PER_{WS} = 0$.
  • On day 2, network conditions were on the border between medium and poor. During the first flight, the exception described in the previous section occurred: no errors were detected in the transport layer when transmitting IoT data over the WebRTC data channel, and one error was detected during transmission over the WebSocket. $PER_{WRTC}$ was 0 and $PER_{WS}$ was 0.0025%.
  • On day 3, transmissions were again carried out under good network conditions. No errors were detected in the transport layer during either transmission ($PER_{WRTC} = PER_{WS} = 0$).
  • On day 4, transmissions took place under poor network conditions. Each transport protocol detected one transmission error ($PER_{WRTC} = PER_{WS} = 0.0025\%$).
  • On day 5, transmissions were conducted under medium network conditions. No errors were detected in the transport layer during either transmission ($PER_{WRTC} = PER_{WS} = 0$).
The five selected pairs of flights were conducted during different flight sessions (a morning session, a midday session, and an evening session) and under different weather conditions: from cold days to warm days (1.5 to 25 degrees Celsius), and in dry weather, in wet weather, and just after rain (relative humidities from 46% up to 85%).
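The PER values quoted in the list above follow directly from the loss counts: one lost packet per 40,000 IoT data packets gives 0.0025%. A one-line helper (ours) makes the arithmetic explicit:

```python
# Transport-layer packet error rate, expressed as a percentage.
def per_percent(lost: int, sent: int = 40_000) -> float:
    return 100.0 * lost / sent

print(per_percent(0))  # 0.0    (days 1, 3, 5 and the day-2 WebRTC flight)
print(per_percent(1))  # 0.0025 (day 4, and the day-2 WebSocket flight)
```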

5. Results

This section presents and discusses the results of the field experiments, which verify whether a UAV-borne, WebRTC-based IoT is suitable for time-sensitive applications operating in a highly mobile outdoor environment when the underlying network is capable of providing a reliable, low-latency communication service.

5.1. WebRTC Data Channel: Minimum and Maximum of the End-to-End Delays Measured at the Transport Level

Table 3 includes the minimum $min(D_{WRTC}^{t})$ and maximum $max(D_{WRTC}^{t})$ of the end-to-end delays measured at the transport level when the IoT data were transmitted over the WebRTC data channel, and the truncated maximum $max(DT_{WRTC}^{t})$, calculated after discarding the extreme values from the series of end-to-end delays $D_{WRTC}^{t}$. Day 4 was the only day on which the PER was not equal to zero, and the large maximum delay recorded on that day was the delay of the retransmitted packet.
During the IoT transmissions over the WebRTC data channel carried out on days 1 to 3 and day 5, when no transmission errors were detected at the transport layer, the maximum end-to-end delay measured at the transport level always reached a single-digit millisecond value (Table 3). Each of the four transmissions achieved a maximum end-to-end delay of 3.5 ms (3469 µs). The maxima of the truncated series were also 3469 µs. This extremely high repeatability of the maxima and truncated maxima obtained on different days, when the values were repeated with an accuracy of one microsecond, may indicate an exceptionally high stability of the transmissions conducted under good and medium network conditions.
The end-to-end delay minima did not show such outstanding stability, in the sense of the repeatability of results, over the different experiments carried out in the lossless environment. But even here, when the PER was zero, the differences between the results obtained on the different days did not exceed 30 µs (values from 3377 µs to 3404 µs), which is less than 1% of the minimum values. As a result, compared to the extremes calculated for the transmission-error experiment conducted on day 4, both the maximum and the minimum can be considered stable across the error-free experiments carried out on the same network, but under different network conditions.
The single transmission error that occurred in the experiment conducted on day 4 affected both the maximum and the minimum value of the end-to-end delay measured at the transport level. Because the transmission error led to the retransmission of the lost packet, the maximum end-to-end delay was 10,120 µs, almost three times higher than the maxima obtained during the error-free transmissions (Figure 2a). When the delay of the retransmitted packet was discarded from the series of end-to-end delays, the maximum value dropped to 3497 µs. This is less than 1% above the maximum obtained during the error-free transmissions (Figure 2b). This shows that, at the transport level, this large increase in delay was local and its impact was limited to a single error correction via selective retransmission. The SCTP packets, except the retransmitted packet, were transmitted with delays suitable for time-sensitive applications.
The occurrence of an error not only increased the maximum, but also lowered the minimum (Figure 2). For day 4, the minimum was 3181 µs, which is about 200 µs (roughly 6%) less than on the other days. Analysis of the instantaneous values (in some publications, e.g., Ref. [37], also called real-time values) of the end-to-end delay shows that the delay of the burst following the packet loss decreased, then started to increase, and after a few seconds returned to the level observed before the transmission error. The truncated minimum, calculated after discarding the minimum and the maximum delay from the series of delays, was the same as the minimum (i.e., 3181 µs).
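The truncated statistics used throughout this section discard the single smallest and single largest value from the delay series before recomputing the extreme. A minimal sketch (the delay series below is a toy illustration echoing the day-4 numbers, not the measured data):

```python
# Truncated extremes: drop the single minimum and maximum from the series,
# then recompute. All delays are in microseconds.
def truncated(series):
    s = sorted(series)
    return s[1:-1]  # discard one minimum and one maximum

# Toy series echoing day 4: one retransmission outlier among stable delays.
delays = [3404, 3469, 3469, 10_120, 3181, 3469]
t = truncated(delays)
print(max(t), min(t))  # 3469 3404 -- the 10,120 µs outlier is gone
```

This is why, on day 4, the maximum drops from 10,120 µs to a value close to the error-free maxima once the retransmitted packet is excluded.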

5.2. WebRTC Data Channel: Minimum and Maximum of the End-to-End Delays Measured at the Logical Channel Level

While delays measured at the transport level refer to the moment at which the MQTT message decapsulated from the SCTP packet is placed in the receive buffer of the logical channel, delays measured at the logical channel level refer to the moment at which the MQTT client is informed that the MQTT message is ready for reception. Table 4 includes the minimum $min(D_{WRTC}^{lc})$, maximum $max(D_{WRTC}^{lc})$, and truncated maximum $max(DT_{WRTC}^{lc})$ of the end-to-end delays measured at the logical channel level when the IoT data embedded in MQTT messages were transmitted over the WebRTC data channel.
The difference of two microseconds between almost all of the statistics listed in Table 3 (except the truncated maximum) and the corresponding statistics listed in Table 4 is the processing time of the WebRTC data channel, which processed the payload of the SCTP packet buffered in the receive buffer.
A large difference is visible on day 4 between the maxima of the truncated end-to-end delay series measured at the transport level and at the logical channel level. While, at the transport level, the truncated maximum was about 3.5 ms (precisely, 3497 µs), at the logical channel level it was about 9 ms (9187 µs). The truncated maximum calculated at the logical channel level (Figure 3b) was more than two and a half times larger than the truncated maximum obtained at the transport level (Figure 2b), and only about 10% less than the end-to-end delay of the retransmitted packet (10,122 µs), as presented in Figure 3a. Such a large difference resulted from the fact that packets that had already been received, but were sent after the lost packet, were waiting for the retransmission of the lost packet. Only when the retransmitted packet was transferred to the receive buffer of the logical channel was the MQTT protocol informed that these packets were ready.
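This head-of-line waiting effect can be sketched in a few lines. With ordered delivery, every packet received after the lost one becomes "ready for MQTT" only when the retransmission arrives; the function and times below are our illustration (in microseconds), not the authors' measurement code:

```python
def logical_channel_ready(arrivals, lost_idx, retx_arrival):
    """Given transport-level arrival times, return the times at which each
    packet becomes available at the logical channel level, assuming ordered
    delivery: packets after the lost one wait for the retransmission."""
    ready = list(arrivals)
    ready[lost_idx] = retx_arrival
    for i in range(lost_idx + 1, len(ready)):
        ready[i] = max(ready[i], retx_arrival)
    return ready

arrivals = [100, 200, 300, 400]  # illustrative arrival times
print(logical_channel_ready(arrivals, lost_idx=1, retx_arrival=350))
# [100, 350, 350, 400]: packet 2 was held back until the retransmission
```

This is why, on day 4, even the truncated maximum at the logical channel level stayed close to the retransmitted packet's delay, while at the transport level it did not.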

5.3. WebSocket: Minimum and Maximum of End-to-End Delays Measured at the Logical Channel Level

Table 5 lists the minimum $min(D_{WS}^{lc})$, maximum $max(D_{WS}^{lc})$, and truncated maximum $max(DT_{WS}^{lc})$ values of the end-to-end delays measured at the logical channel level when the MQTT messages were transmitted over the WebSocket. The end-to-end delays of the transmission of IoT data over the WebSocket, measured during the experiments conducted on days 1 to 5, had relatively small minima (2515 µs to 2550 µs), smaller than the minima of the delays of transmissions over WebRTC (Table 4). However, the maxima were very large, at 65,723 µs to 87,145 µs.
The occurrence of packet loss on days 2 and 4 slightly (by 20–30 µs) lowered the minimum and significantly (by 10–20 ms) increased the maximum compared to the experiments in which the PER was equal to 0. Unlike the end-to-end delays measured during transmissions over WebRTC (Figure 3), discarding the extreme delays from the series of delays did not lower the maximum value to any practical extent (Figure 4). When the PER was non-zero, the truncated maximum was 2 µs (day 4) to 13 µs (day 2) less than the maximum, and when the PER was zero, it was 11 µs (day 1) to 234 µs (day 3) less than the maximum.

5.4. WebRTC: Measures of Location

Table 6 and Table 7 present the mean $\mu(D_{WRTC}^{t})$ and $\mu(D_{WRTC}^{lc})$, median $med(D_{WRTC}^{t})$ and $med(D_{WRTC}^{lc})$, mode $mod(D_{WRTC}^{t})$ and $mod(D_{WRTC}^{lc})$, upper quartile $Q_3(D_{WRTC}^{t})$ and $Q_3(D_{WRTC}^{lc})$, and lower quartile $Q_1(D_{WRTC}^{t})$ and $Q_1(D_{WRTC}^{lc})$ of the end-to-end delays measured at the transport level and at the logical channel level, respectively, when the MQTT messages were transmitted over the WebRTC data channel. The results in Table 6 and Table 7 differ by 2 microseconds. This was a delay resulting from the processing in the receive buffer of the logical channel.
Because for days 1–3 and day 5, when $PER = 0$, the maxima of the end-to-end delays were less than 10 ms, the measures of location for these days were also less than 10 ms (Table 6 and Table 7). Because the non-zero PER that occurred on day 4 was small (one packet lost per 40,000 packets sent), a single outlier (transport level) or a small group of outliers (logical channel level) was unable to influence either the arithmetic mean or the measures of position. As a result, all measures of location presented in Table 6 and Table 7 are single-digit milliseconds.
In the case of error-free transmission in the transport layer (PER equal to 0), which was observed on days 1 to 3 and day 5, the measures of location calculated both at the transport level and at the logical channel level showed an extremely high repeatability, similar to that of the maxima of the error-free transmissions shown in the previous section. At the logical channel level, the arithmetic mean of the end-to-end delays was 3461 to 3462 µs (i.e., 3461.5 ± 0.5 µs), the median was 3468 to 3469 µs (i.e., 3468.5 ± 0.5 µs), the mode was 3470 µs and equaled the upper quartile, and the lower quartile was 3456 to 3460 µs (i.e., 3458 ± 2 µs). At the transport level, the measures of location were reduced by 2 µs, and the abovementioned numerical relationships between the statistical measures were the same.
The stability of the measures of location over the different experiments, observed for $PER = 0$, was accompanied by very small differences between the measures (Figure 5b). The upper quartile was 1–2 microseconds greater than the median, and the lower quartile was about 10 µs lower than the median. The differences between the maxima and the medians were also of a few microseconds. At the logical channel level, the maximum end-to-end delay was 3471 µs, while the median of these delays was 2 to 3 µs smaller. Because the above numerical relationships were preserved at the transport level, at least 50% of the end-to-end delays were within 3 microseconds of the maximum at both considered levels. This indicates a strong stability of the transmissions performed on days 1–3 and day 5, with a very small jitter.
The poor network conditions on day 4 caused a single transmission error that was detected by the transport protocol, after which the lost packet was retransmitted. The end-to-end delays measured on this day had values higher by about 30 µs for all measures of location, both the mean and the quartiles, in relation to the measures calculated for the error-free transmissions. Due to this uniform shift, the numerical relationships between the measures of position were the same as those observed in the case of error-free transmissions: the mode was equal to the upper quartile, the upper quartile was 1–2 microseconds (here, 1 µs) greater than the median, and the lower quartile was about 10 microseconds (here, 9 µs) below the median (Table 6 and Table 7). Since the maximum end-to-end delay was the delay of the IoT datum sent in the retransmitted packet, the difference between the median and the maximum value was more than 6.5 ms (Figure 5a). The values of the maximum and the median end-to-end delay cannot therefore be considered close. However, in the case of delays measured at the transport level, the truncated median (3495 µs) and the truncated maximum (3497 µs) were close to each other, differing by only 2 µs. In the case of delays measured at the logical channel level, due to the long waiting time for packets to be sorted out in the receive buffer after the retransmission of a lost packet, the truncated median (3497 µs) and the truncated maximum (9187 µs) differed by 5690 µs.
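The measures of location discussed above (mean, median, mode, and quartiles) can all be computed with the Python standard library. The delay series below is synthetic, echoing the very low jitter reported for the error-free days (values in microseconds); it is not the measured data:

```python
import statistics as st

# Synthetic delay series: most delays sit at the mode (3470 µs),
# mimicking the tight clustering reported for PER = 0 days.
delays = [3456, 3460, 3468, 3469, 3470, 3470, 3470]

print(st.mean(delays))            # arithmetic mean
print(st.median(delays))          # median: 3469
print(st.mode(delays))            # mode: 3470 (most frequent value)
print(st.quantiles(delays, n=4))  # [Q1, median, Q3]
```

Note how, as in Tables 6 and 7, the mode coincides with the upper quartile when at least half of the delays cluster at the top of the distribution.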

5.5. WebSockets: Measures of Location

Table 8 summarizes the measures of location: the mean $\mu(D_{WS}^{lc})$, median $med(D_{WS}^{lc})$, mode $mod(D_{WS}^{lc})$, upper quartile $Q_3(D_{WS}^{lc})$, and lower quartile $Q_1(D_{WS}^{lc})$ of the end-to-end delays measured at the logical channel level when MQTT transmissions were carried out over the WebSocket. While all measures of location of the end-to-end delays of IoT data transmitted over WebRTC satisfied the single-digit millisecond requirements of time-sensitive applications (Table 7, Figure 5a), the lower quartile and the mode were the only measures of location that always met this requirement when transmissions were carried out over the classic web logical channel (Table 8, Figure 6a). The median only met this requirement on days 1, 3, and 5, when the underlying IEEE 802.11 network was able to ensure reliable transmission in the transport layer. The arithmetic mean and upper quartile only met it when the error rate in the MAC sublayer was not greater than 0.1 percent (day 1 and day 2).
The comparison of the results presented in Table 7 and Table 8 shows that the mode was the only measure of location in terms of which the IoT transmissions over WebSocket were superior to the IoT transmissions over the WebRTC data channel. Together with the minimum (see the previous sections), two such measures were therefore found in total. In IoT transmissions using WebSocket, the mode was not equal to the upper quartile, as in the IoT transmissions using WebRTC (Figure 5b), but to the minimum (Figure 6b). However, the relatively small number of delays whose value was the modal value (WebRTC: 12–15 thousand on days 1–3 and 5, and over 10 thousand on day 4; WebSocket: about 1 thousand on days 1, 3, and 5, about 500 on day 2, and no modal value on day 4) shows that this modal superiority of transmissions over WebSocket does not matter much in practice.

6. Discussion

The previous section presented and analyzed the minimum, maximum, arithmetic mean, mode, and quartiles (upper quartile, median, and lower quartile) of the end-to-end delays of air-to-ground IoT transmissions carried out using the WebRTC data channel or the WebSocket. Table 9 contains a comparison of the statistics collected at the logical channel level, in the form of the ratio of the value of a given statistical measure calculated for IoT transmission via WebSocket (Table 5 and Table 8) to the value of the same measure calculated for IoT transmission via WebRTC (Table 4 and Table 7). If the values of the same statistical measure calculated for transmissions using the WebRTC data channel and the WebSocket were equal, the ratio would be 1. In such a situation, it would not matter, from the point of view of a given statistical measure, through which of the web logical channels the IoT transmission between the UAV and the ground was carried out. This is a purely hypothetical case and does not appear in Table 9.
A ratio of statistical measures less than 1 indicates the superiority of the WebSocket web logical channel over the WebRTC data channel. This ratio appears in Table 9 twice: for the minimum values and for the modal values. The ratio of the minima, $min(D_{WS}^{lc})$ to $min(D_{WRTC}^{lc})$, ranged from 0.74 for day 2 to 0.79 for day 4. This means that, when using WebRTC, the minimum end-to-end delays were approximately one-third greater under good and medium network conditions, and approximately a quarter greater under poor network conditions, than the minimum delays achieved when transmitting using WebSockets. The ratio of the modal values, $mod(D_{WS}^{lc})$ to $mod(D_{WRTC}^{lc})$, amounted to 0.73–0.74 for IoT transmissions under good and medium network conditions. Under poor network conditions, no modal value was observed during transmission using WebSockets. However, the relatively small number of minimally delayed packets made this advantage of WebSockets over WebRTC relatively minor.
Figure 7 and Figure 8 show scatter plots drawn for the end-to-end delay statistics calculated at the logical channel level and presented in the previous section. The values obtained for transmissions using WebSockets (Table 5 and Table 8) are plotted as a function of the corresponding values obtained for transmissions using WebRTC (Table 4 and Table 7). The markers denote statistics calculated for transmissions under good (x), medium (+), and poor (o) network conditions. The diagonal of each plot (dashed line) illustrates the hypothetical case of a ratio of a given statistical measure equal to 1. Below the diagonal, there are only minimum and mode markers (Figure 7a). The values of these statistics for transmissions using WebRTC were higher than for transmissions using WebSockets, so the ratio given in Table 9 is less than 1. The highest minimum latencies and the highest latency modes were observed under good network conditions. As the network conditions deteriorated, the minimum and the mode began to decrease, although the observed differences were small, and when using WebRTC, there was no difference between the modes calculated under good and medium network conditions. The lowest minimum delay occurred under poor network conditions, but only in the case of WebRTC was the reduction in the minimum significant.
A ratio of statistical measures greater than 1 indicates the superiority of the WebRTC data channel over the WebSocket. This ratio appears in Table 9 for the arithmetic mean, the quartiles, and the maximum value. Except for the latter, and unlike the case of ratios smaller than 1, the smallest ratios greater than 1 were obtained under good network conditions, and as the network conditions deteriorated, the ratio increased. As an example, the ratio of the arithmetic means, $\mu(D_{WS}^{lc})$ to $\mu(D_{WRTC}^{lc})$, ranged from 2.1 for day 3 to 6.47 for day 4. Under good network conditions, the mean delay of air-to-ground IoT transmissions using WebSockets was more than twice the mean delay of transmissions using WebRTC. Under medium network conditions, it was almost four times greater. When the network conditions were on the verge of medium to poor, the mean delay of transmissions using WebSocket was just over five times greater, and when the network conditions were poor, it was well over six times greater than the mean delay of transmissions using WebRTC (Table 9, Figure 7b). In addition, in the case of the quartile ratios, the greater the distance (in the number of samples, i.e., in position) of a given measure from the minimum, the greater the ratio. The ratio of the lower quartiles, $Q_1(D_{WS}^{lc})$ to $Q_1(D_{WRTC}^{lc})$, ranged from 1.27 for day 3 to 2.53 for day 4 (Table 9, Figure 8a). The ratio of the medians, $med(D_{WS}^{lc})$ to $med(D_{WRTC}^{lc})$, ranged from 1.48 for day 3 to 4.67 for day 4 (Table 9, Figure 7b). The ratio of the upper quartiles, $Q_3(D_{WS}^{lc})$ to $Q_3(D_{WRTC}^{lc})$, ranged from 1.27 for day 3 to 9.28 for day 4 (Table 9, Figure 8a).
In the case of the ratios of the end-to-end delay maxima, $max(D_{WS}^{lc})$ to $max(D_{WRTC}^{lc})$, the above observations held for days 1 to 3 and day 5, when transmission at the transport layer was error-free. Under good network conditions, the maximum delay of transmission using the WebSocket was approximately 20 times greater than the maximum delay of transmission using WebRTC. When the network conditions were medium, it was about 22 times greater, and when the network conditions were between medium and poor, it was more than 24 times greater. In the case of a single transmission error (day 4), the ratio of the maxima dropped to just over 8 (Table 9), due to the large increase in the maximum delay of the transmission using WebRTC caused by the packet retransmission. As with all other measures for which the ratio was greater than 1 (Figure 7b and Figure 8a), the maximum delay of transmission using the WebSocket was the highest under poor network conditions (Figure 8b).
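The Table 9 comparison reduces to dividing a WebSocket statistic by the corresponding WebRTC statistic; a ratio greater than 1 favours the WebRTC data channel. The sketch below (ours) uses illustrative values taken from the ranges quoted in the text, not the per-day table entries:

```python
# Ratio of a WebSocket statistic to the corresponding WebRTC statistic,
# as compiled in Table 9. Values are in microseconds.
def ratio(ws_value_us: float, wrtc_value_us: float) -> float:
    return ws_value_us / wrtc_value_us

# Maxima ratio, error-free case: ~65,723 µs (WebSocket) vs ~3471 µs (WebRTC).
print(round(ratio(65_723, 3_471), 1))  # ≈ 18.9, i.e., roughly 20x

# Minima ratio: ~2515 µs (WebSocket) vs ~3404 µs (WebRTC); < 1 favours WebSocket.
print(round(ratio(2_515, 3_404), 2))   # 0.74
```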
When comparing the obtained results with those reported in related works, the delay introduced by the implementation of the MQTT protocol should be taken into account. Additional laboratory experiments showed that the classic Eclipse Paho JavaScript Client implementation of the MQTT protocol introduced delays averaging approximately 1.5 ms. For WebSocket-based IoT, this resulted in average application-level end-to-end delays of approximately 9 ms to 24 ms. Taking into account that similar delays during transmission from the UAV to the ground station in systems using WebSocket have been reported in the literature (e.g., 23 ms [30], and 20 ms to 25 ms in the IEEE 802.11 network [34]), it can be concluded that the end-to-end delay values for UAV-borne IoT using WebSocket were comparable to those reported in related works.
In the case of solutions other than WebSocket, but still based on the TCP protocol, the situation is similar. In [32], replacing single-path transmission using TCP with multi-path transmission using the improved MPTCP reduced the latency from 920.4 ms to 568.1 ms (i.e., by a factor of 1.62). This was achieved at the expense of the parallel transmission of cloned packets. However, in this paper, the use of WebRTC provided a greater relative improvement than that in [32], and without such a large computational and energy cost. Significantly, modifying the MPTCP so that UDP was used as the underlying protocol instead of TCP [31] allowed for a relative improvement similar to that shown in this paper, and at lower computational costs than in [32] due to the simplicity of the UDP mechanisms. However, the energy cost of the solution proposed in [31] remained significant. Moreover, since the SCTP implements multihoming, multipath transmissions of cloned sensor data can also be used in WebRTC-based IoT.
It can be expected that the advantages of the WebRTC data channel used in UAV communication may be comparable to those of using any other UDP-based solution. Comparing the results of the experiments on MQTT over the WebRTC data channel presented in this work with the results of the experiments on MQTT over QUIC presented in [38], it can be seen that, in the case of error-free transmissions, the results were broadly similar. If the latencies introduced by the underlying IEEE 802.11 networks (2 ms in this article and 25 ms in [38]), as well as the estimated delays introduced by the Paho implementation of the MQTT protocol (1.5 ms), are subtracted from the average results, the approximate average delays obtained in this paper and in [38] are the same and equal to 1.5 ms. However, the spread of end-to-end delay values was much larger in [38] than in this work. Due to the significant differences in the test environment (in this paper, a mobile and highly variable real-world environment using a low-latency network was employed; in [38], a static environment using an emulated high-latency network was employed), it is impossible to say with certainty how beneficial it would be to use a WebRTC data channel in UAV-IoT communication instead of QUIC.
The second example of a UDP-based solution is presented in [37], which compared IoT transmissions over SCTP with IoT transmissions over TCP. The results of simulation experiments showed the better performance of SCTP and better stability of TCP in heterogeneous networks (wired and wireless). During error-free transmissions, the performance difference between SCTP and TCP shown in [37] was not as large as the performance difference estimated from the results presented in this paper. The issue of stability was also different: in this paper, the SCTP was extremely stable, and much more stable than the TCP. It is worth emphasizing here that, in [37], an older version of the SCTP was discussed, and the research conducted in this paper used a new, WebRTC-oriented version of the SCTP that is currently implemented in web browsers.

7. Conclusions

In recent years, wireless communication for time-sensitive IoT applications has become a hot research topic, including applications that require end-to-end delays measured in single-digit milliseconds. One of the problems encountered in these applications is the processing in higher network layers: even if the underlying network is capable of providing highly reliable, low-latency communications, the delays introduced at the transport layer and above may prove too great to meet stringent time requirements. The aim of this paper was to show that, in the case of IoT carried by UAV, the use of WebRTC can help solve this problem.
The paper used high-resolution time measurement procedures and timer synchronization to perform delay measurements at the level of the transport protocol and at the level of the network logical channel of the WebRTC IoT application, run on board the UAV. During the field experiments, air-to-ground IoT transmissions were carried out under various network conditions, followed by statistical analysis of these delays, focusing on extreme values and location measures. The obtained results were compared with those obtained for IoT transmission via the WebSocket logical channel, under the same circumstances.
The statistical characteristics of the end-to-end delays showed that, during air-to-ground transmission, the WebRTC-based IoT was able to achieve single-digit-millisecond end-to-end delays on both the transport protocol level and the logical channel level. When the WebRTC transmission was error-free, stable end-to-end delays well below 10 ms were achieved. When a single transmission error occurred, higher end-to-end delays were observed in the immediate vicinity of the retransmitted packet, although they were still below 10 ms. Only the delay of the retransmitted packet slightly exceeded 10 ms.
The results of the same IoT transmissions performed via WebSocket under the same circumstances showed that the WebRTC-based UAV-borne IoT had 8.5 to 24 times lower maximum delays and 2 to 6.5 times lower mean delays than the same IoT using WebSocket. The smallest differences between the maximum values and the largest differences between the arithmetic means were associated with the occurrence of a transmission error. The results therefore indicated the superiority of the WebRTC logical channel over the classic web logical channel.
Future research will focus on analyzing WebRTC-based UAV-borne IoT transmissions in Wi-Fi 6e and Wi-Fi 7 networks, as well as UAV swarm tests in a 5G test network.

Author Contributions

Conceptualization, A.C. and R.R.C.; formal analysis, A.C.; investigation, R.R.C.; software, A.C. and R.R.C.; visualization, A.C.; writing—original draft, A.C. and R.R.C.; writing—review and editing, A.C.; funding acquisition, A.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
4G: Fourth-generation (technology for broadband cellular networks)
5G: Fifth-generation (technology for broadband cellular networks)
AMEC: Aerial mobile edge computing
AMQP: Advanced Message Queuing Protocol
API: Application programming interface
CCC: Command and control console
CPU: Central processing unit
DiffServ: Differentiated Services
ESS: Extended Service Set
HSPA+: Evolved High Speed Packet Access (Evolved HSPA, HSPA Evolution)
HTTP: Hypertext Transfer Protocol
IEEE: Institute of Electrical and Electronics Engineers
IoT: Internet of Things
ISP: Internet service provider
KPI: Key performance indicator
LRT: Low-latency Reliable Transmission
LTE: Long Term Evolution
MAC: Medium access control
MPTCP: Multipath Transmission Control Protocol (Multipath TCP)
MQTT: Message Queuing Telemetry Transport (MQ Telemetry Transport)
PAV: Personal air vehicle
PER: Packet error rate
QoS: Quality of Service
QUIC: Quick UDP Internet Connections
RTP: Real-time Transport Protocol
SBC: Single-board computer
SCTP: Stream Control Transmission Protocol
SDN: Software-defined networking
SDR: Software-defined radio
SIP: Session Initiation Protocol
TCP: Transmission Control Protocol
TSN: Time-Sensitive Networking
UAV: Unmanned aerial vehicle
UDP: User Datagram Protocol
W3C: World Wide Web Consortium
WebRTC: Web real-time communication
WLAN: Wireless local area network
WMMS: WebRTC multimedia and monitoring station
WoT: Web of Things
XMPP: Extensible Messaging and Presence Protocol
Indexes
i: the i-th IoT datum
j: the j-th IoT datum
ilc: input of the logical channel
lc: logical channel level
olc: output of the logical channel
t: transport level (output of the transport protocol)
WRTC: WebRTC's web logical channel (data channel)
WS: WebSocket web logical channel
Symbols
d: end-to-end delay
d^lc: end-to-end delay measured at the logical channel level
d^t: end-to-end delay measured at the transport level
D: series of end-to-end delays
D^lc: series of end-to-end delays at the logical channel level
D^t: series of end-to-end delays at the transport level
DT: truncated series of end-to-end delays
DT^lc: truncated series of end-to-end delays at the logical channel level
DT^t: truncated series of end-to-end delays at the transport level
t: time
t_1: starting time
t^ilc: entry time into the logical channel
t^olc: reception time from the logical channel
t^t: reception time from the transport protocol
μ(D): arithmetic mean of the time series D
μ(DT): truncated arithmetic mean of the time series D
max(D): maximum value in the series D
max(DT): truncated maximum value in the series D
med(D): median of the time series D
med(DT): truncated median of the time series D
min(D): minimum value in the series D
min(DT): truncated minimum value in the series D
mod(D): mode of the time series D
mod(DT): truncated mode of the time series D
Q1(D): lower quartile of the time series D
Q1(DT): truncated lower quartile of the time series D
Q3(D): upper quartile of the time series D
Q3(DT): truncated upper quartile of the time series D

References

  1. Mohsan, S.A.H.; Othman, N.Q.H.; Li, Y.; Alsharif, M.H.; Khan, M.A. Unmanned aerial vehicles (UAVs): Practical aspects, applications, open challenges, security issues, and future trends. Intell. Serv. Robot. 2023, 16, 109–137. [Google Scholar] [CrossRef] [PubMed]
  2. Unmanned Aircraft System. FAA Aerospace Forecast Fiscal Years 2022–2042. FAA. 2022. Available online: https://www.faa.gov/sites/faa.gov/files/2022-06/Unmanned_Aircraft_Systems.pdf (accessed on 23 December 2024).
  3. Wikarek, J.; Sitek, P.; Zawarczyński, Ł. An Integer Programming Model for the Capacitated Vehicle Routing Problem with Drones. In Computational Collective Intelligence. ICCCI 2019; Nguyen, N., Chbeir, R., Exposito, E., Aniorté, P., Trawiński, B., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2019; Volume 11683, pp. 511–520. [Google Scholar] [CrossRef]
  4. Sitek, P.; Wikarek, J.; Rutczyńska-Wdowiak, K. Capacitated Vehicle Routing Problem with Pick-Up, Alternative Delivery and Time Windows (CVRPPADTW): A Hybrid Approach. In Proceedings of the Distributed Computing and Artificial Intelligence, 16th International Conference, Special Sessions, Avila, Spain, 26–28 June 2019; Springer International Publishing: Cham, Switzerland, 2019; pp. 33–40. [Google Scholar] [CrossRef]
  5. Moradi, N.; Wang, C.; Mafakheri, F. Urban Air Mobility for Last-Mile Transportation: A Review. Vehicles 2024, 6, 1383–1414. [Google Scholar] [CrossRef]
  6. Motlagh, N.H.; Kortoçi, P.; Su, X.; Lovén, L.; Hoel, H.K.; Haugsvær, S.B.; Srivastava, V.; Gulbrandsen, C.F.; Nurmi, P.; Tarkoma, S. Unmanned aerial vehicles for air pollution monitoring: A survey. IEEE Internet Things J. 2023, 10, 21687–21704. [Google Scholar] [CrossRef]
  7. Chodorek, A.; Chodorek, R.R.; Yastrebov, A. The Prototype Monitoring System for Pollution Sensing and Online Visualization with the Use of a UAV and a WebRTC-Based Platform. Sensors 2022, 22, 1578. [Google Scholar] [CrossRef] [PubMed]
  8. Sziroczak, D.; Rohacs, D.; Rohacs, J. Review of using small UAV based meteorological measurements for road weather management. Prog. Aerosp. Sci. 2022, 134, 100859. [Google Scholar] [CrossRef]
  9. Chodorek, A.; Chodorek, R.R.; Sitek, P. Response Time and Intrinsic Information Quality as Criteria for the Selection of Low-Cost Sensors for Use in Mobile Weather Stations. Electronics 2022, 11, 2448. [Google Scholar] [CrossRef]
  10. Elloumi, M.; Dhaou, R.; Escrig, B.; Idoudi, H.; Saidane, L.A. Monitoring road traffic with a UAV-based system. In Proceedings of the 2018 IEEE Wireless Communications and Networking Conference (WCNC), Barcelona, Spain, 15–18 April 2018; pp. 1–6. [Google Scholar] [CrossRef]
  11. Phade, G.; Kishore, A.T.; Omkar, S.; Suresh Kumar, M. IoT-Enabled Unmanned Aerial Vehicle: An Emerging Trend in Precision Farming. In Drone Technology: Future Trends and Practical Applications; Mohanty, S.N., Ravindra, J.V.R., Surya Narayana, G., Pattnaik., C.R., Mohamed Sirajudeen, Y., Eds.; Scrivener Publishing LLC.: Beverly, MA, USA, 2023; pp. 301–324. [Google Scholar] [CrossRef]
  12. Ju, C.; Son, H.I. Multiple UAV Systems for Agricultural Applications: Control, Implementation, and Evaluation. Electronics 2018, 7, 162. [Google Scholar] [CrossRef]
  13. Ganesh, S.; Gopalasamy, V.; Shibu, N.S. Architecture for drone assisted emergency ad-hoc network for disaster rescue operations. In Proceedings of the 2021 International Conference on COMmunication Systems & NETworkS (COMSNETS), Bangalore, India, 5–9 January 2021; pp. 44–49. [Google Scholar] [CrossRef]
  14. Chand, G.S.L.K.; Lee, M.; Shin, S.Y. Drone based wireless mesh network for disaster/military environment. J. Comput. Commun. 2018, 6, 44–52. [Google Scholar] [CrossRef]
  15. Mohammed, A.; Erbad, A.; Nahom, H.; Albaseer, A.; Abdallah, M.; Guizani, M. FDRL Approach for Association and Resource Allocation in Multi-UAV Air-To-Ground IoMT Network. In Proceedings of the GLOBECOM 2022–2022 IEEE Global Communications Conference, Rio de Janeiro, Brazil, 4–8 December 2022; pp. 1417–1422. [Google Scholar] [CrossRef]
  16. Liu, X.; Liu, H.; Zheng, K.; Liu, J.; Taleb, T.; Shiratori, N. AoI-minimal clustering, transmission and trajectory co-design for UAV-assisted WPCNs. IEEE Trans. Veh. Technol. 2024, 1–16. [Google Scholar] [CrossRef]
  17. Fu, X.; Huang, X.; Pan, Q. Collaborative relay for achieving long-term and low-AoI data collection in UAV-aided IoT systems. Veh. Commun. 2024, 45, 100719. [Google Scholar] [CrossRef]
  18. Rahimi, O.; Shafieinejad, A. Minimizing age of information in multi-UAV-assisted IoT networks: A graph theoretical approach. Wirel. Netw. 2024, 30, 533–555. [Google Scholar] [CrossRef]
  19. Deng, C.; Fu, X. Low-AoI Data Collection in UAV-Aided Wireless-Powered IoT Based on Aerial Collaborative Relay. IEEE Sens. J. 2024, 24, 33506–33521. [Google Scholar] [CrossRef]
  20. Chodorek, A.; Chodorek, R.R.; Krempa, A. An analysis of elastic and inelastic traffic in shared link. In Proceedings of the 2008 Conference on Human System Interactions, Krakow, Poland, 25–27 May 2008; pp. 873–878. [Google Scholar] [CrossRef]
  21. Giordani, M.; Polese, M.; Mezzavilla, M.; Rangan, S.; Zorzi, M. Toward 6G Networks: Use Cases and Technologies. IEEE Commun. Mag. 2020, 58, 55–61. [Google Scholar] [CrossRef]
  22. Han, M.; Lee, J.; Rim, M.; Kang, C.G. Dynamic Bandwidth Part Allocation in 5G Ultra Reliable Low Latency Communication for Unmanned Aerial Vehicles with High Data Rate Traffic. Sensors 2021, 21, 1308. [Google Scholar] [CrossRef] [PubMed]
  23. Rico, D.; Merino, P. A Survey of End-to-End Solutions for Reliable Low-Latency Communications in 5G Networks. IEEE Access 2020, 8, 192808–192834. [Google Scholar] [CrossRef]
  24. Purucker, P.; Schmid, J.; Höß, A.; Schuller, B.W. System Requirements Specification for Unmanned Aerial Vehicle (UAV) to Server Communication. In Proceedings of the 2021 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 15–18 June 2021; pp. 1499–1508. [Google Scholar] [CrossRef]
  25. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Smart Manufacturing and Tactile Internet Based on 5G in Industry 4.0: Challenges, Applications and New Trends. Electronics 2021, 10, 3175. [Google Scholar] [CrossRef]
  26. WebRTC Extended Use Cases W3C Group Draft Note 14 December 2023. Available online: https://www.w3.org/TR/webrtc-nv-use-cases/ (accessed on 23 December 2024).
  27. Chodorek, A.; Chodorek, R.R. Work-in-Progress: A Browser-Driven Sensor Service for Embedded IoT. In Proceedings of the 2022 International Conference on Embedded Software (EMSOFT), Shanghai, China, 7–14 October 2022; pp. 15–16. [Google Scholar] [CrossRef]
  28. Chodorek, A.; Chodorek, R.R.; Sitek, P. UAV-Based and WebRTC-Based Open Universal Framework to Monitor Urban and Industrial Areas. Sensors 2021, 21, 4061. [Google Scholar] [CrossRef]
  29. Herrero, R. MQTT-SN, CoAP, and RTP in wireless IoT real-time communications. Multimed. Syst. 2020, 26, 643–654. [Google Scholar] [CrossRef]
  30. Yigit, Y.; Nguyen, L.D.; Ozdem, M.; Kinaci, O.K.; Hoang, T.; Canberk, B.; Duong, T.Q. TwinPort: 5G drone-assisted data collection with digital twin for smart seaports. Sci. Rep. 2023, 13, 12310. [Google Scholar] [CrossRef]
  31. Lee, W. Enabling Reliable UAV Control by Utilizing Multiple Protocols and Paths for Transmitting Duplicated Control Packets. Sensors 2021, 21, 3295. [Google Scholar] [CrossRef]
  32. Lee, W.; Lee, J.Y.; Joo, H.; Kim, H. An MPTCP-Based Transmission Scheme for Improving the Control Stability of Unmanned Aerial Vehicles. Sensors 2021, 21, 2791. [Google Scholar] [CrossRef] [PubMed]
  33. Jung, W.S.; Yim, J.; Ko, Y.B.; Singh, S. ACODS: Adaptive computation offloading for drone surveillance system. In Proceedings of the 16th Annual Mediterranean Ad Hoc Networking Workshop (Med-Hoc-Net), Budva, Montenegro, 28–30 June 2017; pp. 1–6. [Google Scholar] [CrossRef]
  34. Ramos, J.; Ribeiro, R.; Safadinho, D.; Barroso, J.; Pereira, A. Communication Protocol for Unmanned Vehicles: An Architectural Approach. In Proceedings of the 2020 Global Internet of Things Summit (GIoTS), Dublin, Ireland, 3 June 2020; pp. 1–7. [Google Scholar] [CrossRef]
  35. Ramos, J.; Ribeiro, R.; Safadinho, D.; Barroso, J.; Rabadão, C.; Pereira, A. Distributed Architecture for Unmanned Vehicle Services. Sensors 2021, 21, 1477. [Google Scholar] [CrossRef] [PubMed]
  36. Lee, H.; Yoon, J.; Jang, M.-S.; Park, K.-J. A Robot Operating System Framework for Secure UAV Communications. Sensors 2021, 21, 1369. [Google Scholar] [CrossRef]
  37. Sun, W.; Yu, S.; Xing, Y.; Qin, Z. Parallel Transmission of Distributed Sensor Based on SCTP and TCP for Heterogeneous Wireless Networks in IoT. Sensors 2019, 19, 2005. [Google Scholar] [CrossRef]
  38. Jeddou, S.; Fernández, F.; Diez, L.; Baina, A.; Abdallah, N.; Agüero, R. Delay and Energy Consumption of MQTT over QUIC: An Empirical Characterization Using Commercial-Off-The-Shelf Devices. Sensors 2022, 22, 3694. [Google Scholar] [CrossRef]
  39. Fernández, F.; Zverev, M.; Garrido, P.; Juárez, J.R.; Bilbao, J.; Agüero, R. Even Lower Latency in IIoT: Evaluation of QUIC in Industrial IoT Scenarios. Sensors 2021, 21, 5737. [Google Scholar] [CrossRef]
  40. Kilic, F.; Hassan, M.; Hardt, W. Prototype for Multi-UAV Monitoring–Control System Using WebRTC. Drones 2024, 8, 551. [Google Scholar] [CrossRef]
  41. Jiang, S.; Zhang, Q.; Wu, A.; Liu, Q.; Wu, J.; Xia, P. A low-latency reliable transport solution for network-connected UAV. In Proceedings of the 2018 10th International Conference on Communication Software and Networks, Ponta Delgada, Portugal, 6–9 July 2018; pp. 511–515. [Google Scholar] [CrossRef]
  42. IEEE Std 802.11-2020; IEEE Standard for Information Technology—Telecommunications and Information Exchange between Systems—Local and Metropolitan Area Networks—Specific Requirements—Part 11: Wireless LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications. IEEE: New York, NY, USA, 2021; 4379p. [CrossRef]
  43. Wireless Technologies and Use Cases in Industrial IOT. 2020. Available online: https://web.archive.org/web/20240502073912/https://www.ciscolive.com/c/dam/r/ciscolive/emea/docs/2020/pdf/BRKIOT-1775.pdf (accessed on 23 December 2024).
  44. Anand, B.; Kambhampaty, H.R.; Rajalakshmi, P. A Novel Real-Time LiDAR Data Streaming Framework. IEEE Sens. J. 2022, 22, 23476–23485. [Google Scholar] [CrossRef]
  45. Moorthy, S.K.; Lu, C.; Guan, Z.; Mastronarde, N.; Sklivanitis, G.; Pados, D.; Bentley, E.S.; Medley, M. CloudRAFT: A Cloud-based Framework for Remote Experimentation for Mobile Networks. In Proceedings of the 2022 IEEE 19th Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 8–11 January 2022; pp. 1–6. [Google Scholar] [CrossRef]
  46. Tiberkak, A.; Hentout, A.; Belkhir, A. WebRTC-based MOSR remote control of mobile manipulators. Int. J. Intell. Robot. Appl. 2023, 7, 304–320. [Google Scholar] [CrossRef]
  47. Samdanis, K.; Taleb, T. The road beyond 5G: A vision and insight of the key technologies. IEEE Netw. 2020, 34, 135–141. [Google Scholar] [CrossRef]
  48. Vu, V.A.; Walker, B. Redundant multipath-tcp scheduling with desired packet latency. In Proceedings of the 14th Workshop on Challenged Networks, Los Cabos, Mexico, 25 October 2019; pp. 7–12. [Google Scholar] [CrossRef]
  49. Chodorek, R.R.; Chodorek, A. MPTCP protocol misbehaviour in high-speed, uncongested network. J. Mar. Eng. Technol. 2017, 16, 248–256. [Google Scholar] [CrossRef]
  50. Stewart, R.; Tüxen, M.; Nielsen, K. Stream Control Transmission Protocol. RFC 9260. 2022. Available online: https://www.rfc-editor.org/info/rfc9260 (accessed on 23 December 2024).
  51. Kumar, P.; Dezfouli, B. Implementation and analysis of QUIC for MQTT. Comput. Netw. 2019, 150, 28–45. [Google Scholar] [CrossRef]
  52. Martalò, M.; Pettorru, G.; Atzori, L. A Cross-Layer Survey on Secure and Low-Latency Communications in Next-Generation IoT. IEEE Trans. Netw. Serv. Manag. 2024, 21, 4669–4685. [Google Scholar] [CrossRef]
  53. Web of Things (WoT) Architecture 1.1 W3C Recommendation 05 December 2023. Available online: https://www.w3.org/TR/wot-architecture11/ (accessed on 23 December 2024).
  54. Chodorek, A.; Chodorek, R.R. Model warstwowy ustanawiania sesji WebRTC [A layered model of WebRTC session establishment]. Stud. Inform. 2016, 37, 117–126. Available online: https://scholar.archive.org/work/lucrhn2ykrcppbcknqshbg2mum/access/wayback/http://studiainformatica.polsl.pl:80/index.php/SI/article/download/767/729 (accessed on 23 December 2024).
  55. Grigorik, I. High Performance Browser Networking: What Every Web Developer Should Know About Networking and Web Performance; O’Reilly Media, Inc.: Newton, MA, USA, 2013. [Google Scholar]
  56. Chodorek, A.; Chodorek, R.R.; Wajda, K. Benefits of Using WebRTC Technology for Building of Flying IoT Systems. In Proceedings of the 35th International Conference on Advanced Information Networking and Applications (AINA-2021), Toronto, ON, Canada, 12–14 May 2021; pp. 310–322. [Google Scholar] [CrossRef]
Figure 1. The testbed.
Figure 2. The range of end-to-end delays (in ms) measured at the transport level when the IoT data were transmitted over the WebRTC data channel: (a) full series D^t_WRTC; (b) truncated series DT^t_WRTC.
Figure 3. The range of end-to-end delays (in ms) measured at the logical channel level when the IoT data were transmitted over the WebRTC data channel: (a) full series D^lc_WRTC; (b) truncated series DT^lc_WRTC.
Figure 4. The range of end-to-end delays (in ms) measured at the logical channel level when the IoT data were transmitted over the WebSocket: (a) full series D^lc_WS; (b) truncated series DT^lc_WS.
Figure 5. The five-number summary of the end-to-end delays (in ms) measured at the logical channel level when the IoT data were transmitted over WebRTC: (a) all experiments (PER ≥ 0); (b) the transport layer considered the transmission to be error-free (PER = 0). For PER = 0, the arithmetic mean and the mode are also shown.
Figure 6. The five-number summary of the end-to-end delays (in ms) measured at the logical channel level when the IoT data were transmitted over WebSocket: (a) all experiments (PER ≥ 0); (b) the transport layer considered the transmission to be error-free (PER = 0). For PER = 0, the arithmetic mean and the mode are also shown.
Figure 7. Scatter plots for the statistics of the end-to-end delays (in ms) measured at the logical channel level during air-to-ground transmissions using the WebRTC data channel and using the WebSocket logical channel: (a) minimum and mode; (b) mean and median.
Figure 8. Scatter plots for the statistics of the end-to-end delays (in ms) measured at the logical channel level during air-to-ground transmissions using the WebRTC data channel and using the WebSocket logical channel: (a) lower quartile and upper quartile; (b) maximum.
Table 1. Selected time-sensitive applications with stringent time constraints related to both UAVs and IoT that can be carried by UAVs.

Application | Time Constraint | Paper
Connecting autonomous vehicles | below 1 ms | [21]
Transport industry | 3 or 7 ms | [22]
Intelligent transportation system | 5 to 10 ms | [23]
Internet of drones (remote control) | 5 to 50 ms | [23]
Approaching autonomous navigation infrastructure | 10 ms | [24]
Mobile robots: video-operated remote control | 10 to 100 ms | [25]
Command and control of UAV networks | 10, 40, or 140 ms | [22]
Table 2. Related work.

Paper | Carrier | Network Technology | Transport Protocol | Web Logical Channel | API
[15] | UAV | 5G | n/a 1,5 | n/a 1,5 | n/a 1,5
[27,28] | UAV | 802.11ac | RTP, SCTP | WebRTC | WebRTC
[30] | UAV | 5G, 802.11p | TCP 4 | WebSocket | n/a 1,5
[31,32] | UAV | 802.11g | MPTCP | n/a 1 | socket
[33] | UAV | 802.11n | MPTCP | n/a 1 | socket
[34,35] | UAV | 802.11n | RTP 4, TCP 4 | WebRTC, WebSocket | WebRTC, WebSocket
[36] | UAV | 802.11 | TCP 4 | WebSocket | WebSocket
[41] | UAV | 4G | UDP, TCP | n/a 1 | n/a 1,5
[44] | n/a 1,3, UGV 3 | 802.11 | TCP 4, RTP 4 | WebSocket, WebRTC | WebSocket, WebRTC
[45] | UGV | SDR | TCP 4 | WebSocket | WebSocket
[46] | n/a 1,2 | 802.11, 3.75G | TCP 4, RTP 4, SCTP 4 | WebSocket, WebRTC | WebSocket, WebRTC
[38,39] | n/a 1 | 802.11, Cellular, Satellite | QUIC, TCP | n/a 1,5 | socket
[37] | n/a 1 | 802.11 | SCTP, TCP | n/a 1,5 | n/a 1,5
[40] | UAV | 802.11n, 4G, 5G | SCTP | WebRTC | WebRTC
this paper | UAV | 802.11ac | SCTP, TCP | WebRTC, WebSocket | WebRTC, WebSocket

1 not applicable, 2 stationary robot, 3 the target application is UAV, 4 stated implicitly, 5 simulation.
Table 3. Minimum and maximum of the end-to-end delays measured at the transport level when the IoT data were transmitted over the WebRTC data channel.

Days | min(D^t_WRTC) | max(D^t_WRTC) | max(DT^t_WRTC) 1
day 1 | 3394 µs | 3469 µs | 3469 µs
day 2 | 3384 µs | 3469 µs | 3469 µs
day 3 | 3404 µs | 3469 µs | 3469 µs
day 4 | 3181 µs | 10,120 µs | 3497 µs
day 5 | 3377 µs | 3469 µs | 3469 µs

1 Maximum of the series of end-to-end delays truncated by the extremes.
Table 4. Minimum, maximum, and truncated maximum of end-to-end delays measured at the logical channel level when the IoT data were transmitted through the WebRTC data channel.

Days | min(D^lc_WRTC) | max(D^lc_WRTC) | max(DT^lc_WRTC) 1
day 1 | 3396 µs | 3471 µs | 3471 µs
day 2 | 3386 µs | 3471 µs | 3471 µs
day 3 | 3406 µs | 3471 µs | 3471 µs
day 4 | 3183 µs | 10,122 µs | 9187 µs
day 5 | 3379 µs | 3471 µs | 3471 µs

1 Maximum of the series of end-to-end delays truncated by the extremes.
Table 5. Minimum and maximum of the end-to-end delays measured at the logical channel level when the IoT data were transmitted through the WebSocket logical channel.

Days | min(D^lc_WS) | max(D^lc_WS) | max(DT^lc_WS) 1
day 1 | 2550 µs | 69,865 µs | 69,854 µs
day 2 | 2520 µs | 83,652 µs | 83,649 µs
day 3 | 2545 µs | 65,723 µs | 65,489 µs
day 4 | 2515 µs | 87,145 µs | 87,143 µs
day 5 | 2538 µs | 74,722 µs | 74,549 µs

1 Maximum of the series of end-to-end delays truncated by the extremes.
Table 6. Measures of location (mean, median, mode, upper quartile, and lower quartile) of end-to-end delays measured at the transport level when the IoT data were transmitted through the WebRTC data channel.

Days | μ(D^t_WRTC) | med(D^t_WRTC) | mod(D^t_WRTC) | Q3(D^t_WRTC) | Q1(D^t_WRTC)
day 1 | 3460 µs | 3467 µs | 3468 µs | 3468 µs | 3458 µs
day 2 | 3459 µs | 3466 µs | 3468 µs | 3468 µs | 3454 µs
day 3 | 3460 µs | 3466 µs | 3468 µs | 3468 µs | 3455 µs
day 4 | 3492 µs | 3495 µs | 3496 µs | 3496 µs | 3486 µs
day 5 | 3460 µs | 3467 µs | 3468 µs | 3468 µs | 3457 µs
Table 7. Measures of location of end-to-end delays measured at the logical channel level when the IoT data were transmitted over the WebRTC data channel.

Days | μ(D^lc_WRTC) | med(D^lc_WRTC) | mod(D^lc_WRTC) | Q3(D^lc_WRTC) | Q1(D^lc_WRTC)
day 1 | 3462 µs | 3469 µs | 3470 µs | 3470 µs | 3460 µs
day 2 | 3461 µs | 3468 µs | 3470 µs | 3470 µs | 3456 µs
day 3 | 3462 µs | 3468 µs | 3470 µs | 3470 µs | 3457 µs
day 4 | 3494 µs | 3497 µs | 3498 µs | 3498 µs | 3488 µs
day 5 | 3462 µs | 3469 µs | 3470 µs | 3470 µs | 3459 µs
Table 8. Measures of location (mean, median, mode, upper quartile, and lower quartile) of end-to-end delays measured at the logical channel level when the IoT data were transmitted over the WebSocket.

Days | μ(D^lc_WS) | med(D^lc_WS) | mod(D^lc_WS) | Q3(D^lc_WS) | Q1(D^lc_WS)
day 1 | 7947 µs | 5235 µs | 2550 µs | 8247 µs | 4431 µs
day 2 | 17,605 µs | 12,233 µs | 2520 µs | 26,445 µs | 4721 µs
day 3 | 7260 µs | 5123 µs | 2545 µs | 6919 µs | 4403 µs
day 4 | 22,584 µs | 16,335 µs | - | 32,448 µs | 8833 µs
day 5 | 13,030 µs | 9563 µs | 2538 µs | 19,389 µs | 4537 µs
Table 9. Comparison of extremes (minimum and maximum) and measures of location (mean, median, mode, upper quartile, and lower quartile) of the end-to-end delays measured at the logical channel level when the IoT data were transmitted using the WebRTC data channel and WebSocket. Each column is the ratio of the WebSocket statistic to the corresponding WebRTC statistic.

Days | min(D^lc_WS)/min(D^lc_WRTC) | max(D^lc_WS)/max(D^lc_WRTC) | μ(D^lc_WS)/μ(D^lc_WRTC) | med(D^lc_WS)/med(D^lc_WRTC) | mod(D^lc_WS)/mod(D^lc_WRTC) | Q3(D^lc_WS)/Q3(D^lc_WRTC) | Q1(D^lc_WS)/Q1(D^lc_WRTC)
day 1 | 0.75 | 20.14 | 2.3 | 1.51 | 0.74 | 2.38 | 1.28
day 2 | 0.74 | 24.11 | 5.09 | 3.53 | 0.73 | 7.62 | 1.37
day 3 | 0.75 | 18.95 | 2.1 | 1.48 | 0.73 | 1.99 | 1.27
day 4 | 0.79 | 8.61 | 6.47 | 4.67 | - | 9.28 | 2.53
day 5 | 0.75 | 21.54 | 3.77 | 2.76 | 0.73 | 5.59 | 1.31
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Chodorek, A.; Chodorek, R.R. Web Real-Time Communications-Based Unmanned-Aerial-Vehicle-Borne Internet of Things and Stringent Time Sensitivity: A Case Study. Sensors 2025, 25, 524. https://doi.org/10.3390/s25020524