Published in: EURASIP Journal on Wireless Communications and Networking 1/2019

Open Access 01.12.2019 | Research

Optimisation of NB-IoT deployment for smart energy distribution networks

Authors: Varun Nair, Remco Litjens, Haibin Zhang


Abstract

A suitability assessment and performance optimisation of narrowband Internet of Things (NB-IoT) cellular technology for use in smart energy distribution networks is presented. The focus is on the reliable and timely delivery of outage restoration and management (ORM) messages in the event of a local or regional power outage. Both the cellular NB-IoT and the energy distribution networks are modelled in a system-level simulator, which is used to carry out an extensive sensitivity analysis of the ORM service performance w.r.t. various radio network configurations in different environments. In particular, different packet schedulers are proposed and analysed, addressing device prioritisation and subcarrier allocation as essential mechanisms in optimising the service performance. Furthermore, we consider all three possible NB-IoT spectral deployment modes: in-band, guard-band and stand-alone deployment. Results show that, with a proposed near-optimal radio network configuration, the reliability of the ORM message delivery is close to 100% for the majority of power outage scenarios, while the observed 95th transfer delay percentile for the ORM messages is within the acceptable limit of 20 s. The study concludes that NB-IoT is indeed a suitable technology for supporting ORM services in smart energy distribution networks.
Abbreviations
3GPP
3rd Generation Partnership Project
CL
Coverage level
DL
Downlink
DU
Dense urban
EDDF
Earliest due date first
GSM
Global system for mobile communications
HH
Household
ISD
Inter-site distance
KPI
Key performance indicator
LGA
Least granularity allocation
LTE
Long-term evolution
MCS
Modulation and coding scheme
MGA
Maximum granularity allocation
MMA
Min-max allocation
NB-IoT
Narrowband Internet of Things
NLOS
Non-line of sight
NPDCCH
Narrowband physical downlink control channel
NPDSCH
Narrowband physical downlink shared channel
NPRACH
Narrowband physical random access channel
NPUSCH
Narrowband physical uplink shared channel
ORM
Outage restoration and management
PRB
Physical resource block
RA
Random access
RAO
Random access opportunities
RAR
Random access response
RSRP
Reference signal received power
RTP
Real-time pricing
RU
Rural
SPTF
Shortest processing time first
SU
Suburban
TU
Typical urban
U
Urban
UE
User equipment
UL
Uplink

1 Introduction

The 3rd Generation Partnership Project (3GPP) [1] standardised a radio access technology known as Narrowband Internet of Things (NB-IoT) as part of its Release 13 specifications, in support of the huge growth in the number of connected Internet of Things (IoT) devices [2]. NB-IoT is specifically targeted at low-cost and low-data rate applications involving a large number of devices. Additionally, there is support for a long battery life and improved coverage when compared with general packet radio service (GPRS). In its subsequent Release 14 and Release 15 specifications, 3GPP has introduced some enhancements to NB-IoT, including the support of mobility, increased data rate (Release 14), reduced latency and higher reliability (Release 15). 3GPP has further agreed that NB-IoT will continue to evolve as part of the 5G specifications [3].
NB-IoT is potentially suitable for smart energy distribution networks in providing low-cost connectivity to smart meters in every household, enabling use cases such as automated meter readings, service switch operations and outage restoration and management (ORM) [4–6]. ORM enables utilities to efficiently and quickly detect, localise and restore power outages, using notifications received from smart meter devices upon the detection of a loss or restoration of power. For example, the smart meter sends a message to the utility’s outage management system every time it loses its main power supply. With the coordinates of the affected smart meters contained in the messages, it is possible to localise, isolate and rectify the faulty distribution segment [7]. However, ORM may involve ‘near-simultaneous’ network access from a large number of devices, e.g. in the event of a power outage. This may lead to congestion of the network resources, particularly of the random access channel, consequently resulting in collisions and unwanted transmission delays. Ultimately, this impacts the reliability performance, defined as the percentage of notifications successfully delivered within a certain transfer delay budget, and consequently the accuracy and timeliness of the power outage localisation [5].
Most of the existing work on NB-IoT, e.g. [8–15], focuses on the analysis and development of enhancements to technology elements, but does not address the performance impact for specific use cases. Exceptions are [16] and [17], which both address the suitability of NB-IoT for smart grid applications. Shi et al. [8] and Oh and Shin [9] address paging mechanisms and data transmission protocols suitable for NB-IoT, respectively. Chen et al. [10] give an overview of NB-IoT technology, including its background, standardisation, key technical components, a comparison with other similar technologies and typical application areas. Zayas and Merino [11] focus on the architectural impact of NB-IoT, in particular the required modifications to existing long-term evolution (LTE) deployments in order to support NB-IoT users. In [12], a coverage analysis of NB-IoT is given, which shows that NB-IoT achieves a coverage enhancement of up to 20 dB in comparison to LTE. In [13], a new concept of control channel load balancing for NB-IoT is introduced, aimed at dynamic allocation of control channel resources during sudden traffic spikes; the proposed methodology is, however, based on pre-standard specifications and needs to be adapted accordingly. The authors of [14] model the data flow arrival process and service process of NB-IoT with different distributions, and accordingly propose a method to analyse access delay bounds in NB-IoT utilising stochastic network calculus. Chung et al. [15] address the optimisation of modulation and coding for paging in multicell interference-limited NB-IoT scenarios in terms of paging success rate, as well as the optimisation of coverage levels in a commercial NB-IoT network environment in terms of random access success rate. In [16], a capacity analysis is presented for NB-IoT in (sub)urban environments for smart metering applications; the results are based on analytical calculations with rather optimistic data rate assumptions. Li et al. [17] analyse the usage of NB-IoT in the smart grid domain, comparing NB-IoT with other alternative smart grid communication means in terms of, among others, data rate, latency and range, and presenting link-level NB-IoT performance for some typical smart grid applications in both rural and urban environments.
The objective of the presented study is to do a system-level suitability assessment and optimisation of NB-IoT network deployment and configuration concentrating on the reliability performance of the ORM use case in smart energy distribution networks. A key contribution to these goals is the design and analysis of a suitable packet scheduler that achieves (near-)optimal performance, comprising the tasks of both device prioritisation and resource (subcarrier) allocation. The conducted simulation-based assessment considers all relevant scenario aspects of smart energy distribution networks in rural, suburban, urban and dense urban environments, and a detailed modelling of the NB-IoT cellular network technology.
The remainder of the article is organised as follows. In Section 2, a brief description of the methods used in this study is provided. Section 3 describes the key use cases in a smart energy distribution network in terms of their traffic characterisation and performance requirements. This overview forms the basis of selecting the ORM use case for more detailed analysis. An overview of the workings of the NB-IoT cellular network technology is presented in Section 4, including a description of the three possible NB-IoT deployment modes and a specification of the proposed schedulers. In Section 5, the models of both the energy distribution network and the cellular NB-IoT network are given. Section 6 then presents and discusses the obtained simulation results. Key conclusions and recommendations for further research are given in Section 7.

2 Methods

This study aims to deliver an extensive performance assessment of the NB-IoT technology in the context of the ORM use case in smart grids and recommend radio network configurations that may provide near-optimal performance. The performance evaluation is done using a MATLAB-based system-level simulator. As further elaborated below, the simulator incorporates realistic models for both the smart grid (see Section 3) and the cellular NB-IoT networks (see Section 5). For an extensive discussion of the simulator architecture and operation, the reader may refer to [18]. Through appropriate simulator settings, a performance comparison is done for several realistic network scenarios, which is based on varying the extent of power outages (in terms of the number of impacted smart meters) in an energy distribution network. Note that, in ORM, the network load is proportional to the number of power outage-affected smart meters. For each network scenario, several simulation iterations are executed, to take into account statistical variations and appropriately derive key performance indicators (KPIs). The KPIs have been chosen on the basis of available references [5, 6] on performance requirements for smart grid applications (see Section 3).

3 Smart energy distribution network

Various relevant use cases exist in the distribution segment of smart energy grids, including (on demand or periodic) remote meter reading, real-time pricing (RTP), service switch operation, ORM and firmware updates [5, 6]. They require communications with large numbers of metering devices at potentially challenging locations, e.g. requiring deep indoor coverage. Further, since the latency requirements of such services are in the order of seconds, minutes or even hours, a cellular technology like NB-IoT seems particularly suitable to handle these services. Table 1 provides a non-exhaustive list of key use cases in an energy distribution network, with their respective traffic aspects and performance requirements regarding latency and reliability [5].
Table 1
Use cases: traffic aspects and requirements

| Use case | Traffic aspects (UL = uplink, DL = downlink) | Requirements (latency, reliability) |
|---|---|---|
| Meter reading (scheduled) | 4–6 messages/residential meter/day; 1600–2400 bytes (UL) | ≤ 4 h, ≥ 98% |
| | 12–24 messages/industrial meter/day; 200–1600 bytes (DL) | ≤ 2 h, ≥ 98% |
| RTP | 60/1000 meters: 1 message/day; 100 bytes (DL) + 25 bytes (UL) | ≤ 5 s, ≥ 99% |
| Service switch operation | 1–50/1000 meters: 1 message/day; 25 bytes (DL) + 25 bytes (UL) | ≤ 1–2 min, ≥ 98% |
| ORM | 1 message/meter/event; 25 bytes (UL) | ≤ 20 s, ≥ 30%ᵃ |
| Firmware updates | 2×/meter/year; 400–2000 kB (DL) | ≤ 4 h, ≥ 98% |

ᵃFor large outages. No reliability requirement is specified for small outages, nor is the distinction between what is a ‘large’ and what is a ‘small’ outage clearly defined [5]
Considering the relatively stringent latency requirement in combination with a large number of involved meters, we select ORM as the most demanding use case for the presented performance optimisation of NB-IoT deployment for smart energy distribution networks, assuming scheduled meter reading as background traffic. In the ORM use case, the smart meters identify any occurring power outage and must near-immediately report this to the utility operator. The utility operator then gathers all these reports and performs detection, localisation and subsequently restoration of a power outage. For the purpose of a simulation-based performance analysis, we model such ORM-based reporting as follows: upon an outage event, each affected meter initiates its reporting after a beta-distributed [19] amount of time with parameters α = 3 and β = 4, over a range of [0,10] seconds [20].
Besides the use case-specific traffic characteristics shown in Table 1, the layout and nodal density of the smart energy distribution network are further aspects that affect the use case performance. The energy distribution network comprises three distinct types of components, as visualised in Fig. 1 [21]. The network generally consists of a ring of substations (converting medium to low voltage), from which distribution feeders originate in a radial topology towards multiple households, each with a smart meter installed. Average settings of energy distribution network parameters in The Netherlands are given in Table 2 [21] for the rural (RU), suburban (SU), urban (U) and dense urban (DU) environments; ‘HH’ refers to ‘household’. Note that the household density per km² is clearly highest in a dense urban environment, as intuitively expected, and so is the household density per substation, even though dense urban environments typically also have more substations per unit area than rural, suburban and urban environments.
Table 2
Average settings of energy distribution network parameters

| Environment | HH density | #HHs per substation | Feeder length |
|---|---|---|---|
| RU | 50 HHs/km² | 24 | 0.80 km |
| SU | 350 HHs/km² | 165 | 0.60 km |
| U | 1500 HHs/km² | 480 | 0.40 km |
| DU | 2272 HHs/km² | 693 | 0.35 km |

4 Narrowband Internet of Things (NB-IoT)

In this section, we first describe the key aspects of NB-IoT as a 3GPP-standardised technology and subsequently specifically address the task of scheduling along with a set of proposed implementations. Note that scheduling schemes are generally not standardised yet, but typically implemented by equipment vendors and configured by network operators. For a more detailed treatment of the NB-IoT technology, its modelling and the defined scheduling schemes, the reader is referred to [18, 24].

4.1 NB-IoT technology

As briefly mentioned in the previous paragraph, 3GPP has standardised the NB-IoT technology in support of a particular class of IoT applications characterised by modest bit rate yet deep coverage requirements [1]. In light of these requirements, the technology operates with a narrow carrier bandwidth of only 180 kHz in both up- and downlink, comprising subcarriers with a spacing of either 3.75 (uplink only) or 15 kHz (up/downlink). When configured to a 3.75 kHz subcarrier spacing, ‘single-tone’ (1 subcarrier) transmissions are mandated, while the 15 kHz subcarrier spacing allows both single- and multi-tone (3, 6 or 12 subcarriers) transmissions. In the NB-IoT uplink, the option for a reduced subcarrier spacing (i.e. 3.75 kHz) is standardised to support high concurrent numbers of low-complexity devices and specifically also to enhance coverage through an increase of power spectral density and consequently achievable signal-to-interference and noise ratios (SINRs) in uplink transmissions. It is noted that this comes at a cost of reduced bit rates and hence higher transfer delays, which is considered a reasonable trade-off for many delay-tolerant IoT applications. Similar arguments of delay tolerance and low complexity support the limitation of modulation to B/QPSK only.
Operating in half frequency-division duplexing fashion, the NB-IoT technology may be deployed in a given cell with multiple carrier pairs in order to provide sufficient traffic handling capacity, depending on the number of devices in the cell’s service area and their degree of transfer activity. It is noted however that an individual user equipment (UE) can at any one time use a single carrier pair only. The single or multiple NB-IoT carriers are deployed in one of three distinct modes [22], as visualised in Fig. 2.
  • In-band mode—In this mode, an NB-IoT carrier is deployed occupying a physical resource block (PRB) within an LTE carrier. This is likely the simplest and most cost-efficient deployment mode as it allows the reuse of existing LTE base station hardware and antennas. The key drawback is that some downlink time-frequency resources are occupied by LTE reference signals and possibly some control channels, and are hence unavailable for NB-IoT transmissions.
  • Guard-band mode—In this mode, an NB-IoT carrier is deployed within the guard band of an LTE carrier. As this overcomes the key aforementioned drawback of the in-band mode, a better downlink throughput performance can be achieved. One specific potential drawback of the guard-band mode from the perspective of NB-IoT transmissions is a possible regulator-imposed downlink power limitation, considering that the guard-band is relatively close to bandwidth licensed to other operators or other systems. This power limitation may be imposed to limit the interference impact on adjacent-spectrum ‘regular’ LTE transmissions, which is of course directly related to the very purpose of having such a guard band in the first place.
  • Stand-alone mode—In this mode, an NB-IoT carrier is deployed independently of any LTE carrier, e.g. in re-farmed global system for mobile communications (GSM) spectrum. As for the guard-band mode, the key advantage is the full availability of time-frequency resources. The key drawback is that this mode likely requires specific hardware deployments, while it is further noted that to our knowledge regulators around the world have not assigned any dedicated NB-IoT spectrum.
Mobile network operators generally tend to operate the NB-IoT technology in the in-band mode for reasons of low cost and deployment complexity [23]. In this article, we will therefore also concentrate most of our analysis on the in-band mode, while presenting limited results on the other two deployment modes.
In order to speed up technology development and deployment efforts, several LTE radio interface features are reused in NB-IoT, including the general radio resource grid structure and the multiple access schemes. A number of optimisations have been standardised on top of this, such as the aforementioned option of a reduced uplink subcarrier spacing and the use of repetitions to enhance coverage for the transmission of both signalling and data messages. In general, these improvements are made in support of application/device-oriented requirements related to, e.g. coverage, energy efficiency and implementational complexity [24].
For the unfamiliar reader, the following few paragraphs present a brief description of the assessed NB-IoT technology, whose specifications are effectively spread over a number of 3GPP documents [1] and therefore rather inaccessible. Please refer to [18] and [24] for more elaborate dedicated descriptions of the NB-IoT technology.
Figure 3 illustrates the operations and modelling of the random access and uplink data transmission procedures in the NB-IoT technology, as relevant for the considered ORM use case. Consider first the ‘random access phase’. Upon the generation of an ORM message, the UE in the smart meter (assumed to be in idle mode) attempts the transmission of a preamble, randomly selected from a limited set, via the so-called narrowband physical random access channel (NPRACH). The number of available preambles is determined by the spectral width of the NPRACH resources, which is either 12, 24, 36 or 48 subcarriers, providing a matching number of available preambles. The set of preambles is split over the different so-called ‘coverage levels’ (CLs). Each UE determines its CL from its experienced coupling loss, which is in turn estimated based on reference signal received power (RSRP) measurements, in accordance with a pre-determined and broadcast coupling loss-to-CL mapping. The time domain capacity of the NPRACH is set by the NPRACH period T_NPRACH, which defines the CL-specific periodicity of random access opportunities (RAOs); a single such opportunity exists per configured period of, e.g. 40, 80 or 160 ms, up to a maximum of 2560 ms. Upon transmission, the preamble is sent multiple times, with a CL-specific number of repetitions. In case the same preamble is simultaneously used by multiple devices, the preamble transmissions may collide, depending on, e.g. relative signal strengths. Furthermore, the more frequent the RAOs in a CL (i.e. the lower T_NPRACH), the lower the collision probability for a given traffic load.
If a preamble is received successfully at the base station, the base station sends a random access response (RAR) message over the narrowband physical downlink shared channel (NPDSCH), along with a signalling indication of this message transmission on the narrowband physical downlink control channel (NPDCCH). Similar to the NPRACH, the NPDCCH resource is configured by the NPDCCH period T_NPDCCH and the maximum number of repetitions R_max. It is noted that a higher setting of R_max enhances coverage, at the cost of consuming more NPDCCH resources per UE, with the risk of congestion and, consequently, blocking of other UEs. Additionally, there is a configurable offset parameter α_offset which dictates the relative timing of the start of the NPDCCH for the different CLs [25]. In order to limit the number of scenarios, in our study we conservatively fix α_offset to 0.
The RAR message includes an uplink grant for the UE to send MSG3 on the narrowband physical uplink shared channel (NPUSCH) which shares its UL resources with the NPRACH. MSG3 includes an indication of the data volume the UE wants to transmit. Since the RAR message and hence the uplink resource grant for transmitting MSG3 are conveyed in response to a preamble transmission, the MSG3 messages may in fact also collide in case multiple UEs simultaneously used the same preamble and hence believe the transmitted RAR is meant for them. In case the MSG3 transmission is successful, it is followed by further handshaking in order to establish a radio resource control (RRC) connection, comprising a downlink MSG4 transmission and a corresponding acknowledgement (ACK/NACK) from the UE side, signalling the uplink resource grant to the UE, which is subsequently used for the actual uplink data transfer (‘data transfer phase’). This is in fact a control plane transmission as part of the RRC connection setup [24].
It is noted that any failure during this whole procedure, including the failure of preamble detection and the possible time-out of the different signaling messages (configured by the RAR/MSG4 window sizes), will trigger the device to re-attempt from the start, i.e. with a fresh preamble transmission, after a randomly selected back-off time, which is uniformly sampled within a configured back-off interval. In case of persistent failures even after a configured maximum allowed number of random access (RA) attempts, the random access process fails.
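The sketch below illustrates, under simplifying assumptions, the modelled retry behaviour: the estimated coupling loss is mapped to a coverage level, and failed attempts are retried after a uniformly drawn back-off until the per-CL maximum number of RA attempts is exhausted. The threshold, attempt-limit and back-off values follow the baseline configuration of Table 4 (Section 6), while preamble_succeeds() is a placeholder for the simulator's detailed collision and detection model; the listing is an illustration only, not the simulator itself.

```python
# Hedged sketch of the modelled random access retry logic. preamble_succeeds()
# abstracts preamble collision/detection and is not part of NB-IoT itself.
import random

CL_THRESHOLDS_DB = (130, 140)          # CL1: < 130 dB, CL2: 130-140 dB, CL3: >= 140 dB
MAX_RA_ATTEMPTS = {1: 19, 2: 5, 3: 7}  # baseline per-CL limits (Table 4)
BACKOFF_MS = (0, 1024)                 # configured back-off interval

def coverage_level(coupling_loss_db):
    """Map the estimated coupling loss to coverage level 1, 2 or 3."""
    if coupling_loss_db < CL_THRESHOLDS_DB[0]:
        return 1
    return 2 if coupling_loss_db < CL_THRESHOLDS_DB[1] else 3

def random_access(coupling_loss_db, preamble_succeeds):
    """Re-attempt preamble transmissions until success or the per-CL attempt limit."""
    cl = coverage_level(coupling_loss_db)
    backoff_total_ms = 0.0
    for attempt in range(1, MAX_RA_ATTEMPTS[cl] + 1):
        if preamble_succeeds(cl):
            return True, attempt, backoff_total_ms
        backoff_total_ms += random.uniform(*BACKOFF_MS)  # back off, then retry
    return False, MAX_RA_ATTEMPTS[cl], backoff_total_ms

# Example: a CL1 device whose preamble succeeds with probability 0.7 per attempt.
print(random_access(125.0, lambda cl: random.random() < 0.7))
```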

4.2 Scheduling

Scheduling plays a significant role in the overall data transmission process and is required for the transmission of all aforementioned messages, i.e. the RAR and MSG4 messages in the downlink, and the MSG3, ACK/NACK messages and actual data transmission in the uplink. The applied scheduling algorithm is not standardised but left open for vendor implementation and operator configuration. In this article, we present and assess schedulers which aim to maximise the reliability performance of the transmission of ORM messages, carrying out two key tasks. Firstly, the prioritisation scheme determines the priority order in which the queued UEs are served. We consider the following three prioritisation schemes, characterised by distinctly defined prioritisation metrics:
  • Earliest due date first (EDDF)—Characterised by prioritisation metric τ_wait/τ_due date, where τ_wait and τ_due date denote the incurred waiting time and the remaining time until the due date, respectively. The due date depends on the nature of the scheduled message; for the actual data transmission, for example, it is immediately derived from the use case-specific delay requirement. This scheme prioritises UEs whose message has already waited long and/or whose due date is relatively near.
  • Shortest processing time first (SPTF)—Characterised by prioritisation metric 1/τ_transmission, where τ_transmission denotes the expected transmission time [18]. This scheme ignores due dates but follows the general time management principle of trying to satisfy as many UEs as quickly as possible.
  • Earliest due date first-shortest processing time first (EDDF-SPTF)—Characterised by prioritisation metric τ_wait/(τ_due date × τ_transmission), with τ_wait, τ_due date and τ_transmission as defined above. This scheme tries to strike an optimal compromise between the basic EDDF and SPTF schemes by integrating their respective merits: the EDDF component prioritises UEs whose message has a low remaining delay budget, while the SPTF component prioritises UEs whose message has a relatively short expected transmission time.
Although the units of the time-oriented variables τ_wait, τ_due date and τ_transmission do not really matter, since the above-proposed prioritisation metrics are used for ordering purposes only, we note that in the simulations these variables are uniformly expressed in units of seconds; a small numerical illustration of the three metrics follows below.
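The sketch below is only an illustration of the formulas above, not the vendor scheduler implementation; all inputs are in seconds and the example values are hypothetical.

```python
# Illustrative evaluation of the three prioritisation metrics; a higher value
# means a higher scheduling priority. All times are in seconds.
def eddf(tau_wait, tau_due_date):
    return tau_wait / tau_due_date

def sptf(tau_transmission):
    return 1.0 / tau_transmission

def eddf_sptf(tau_wait, tau_due_date, tau_transmission):
    return tau_wait / (tau_due_date * tau_transmission)

# Example: a message that has waited 4 s, is due in 6 s and needs ~0.5 s to transmit.
print(eddf(4.0, 6.0), sptf(0.5), eddf_sptf(4.0, 6.0, 0.5))
```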
Secondly, the subcarrier allocation scheme decides how many uplink subcarriers are allocated to each of the UEs. We consider three options, illustrated in the sketch following this list:
  • Least granularity allocation (LGA)—This scheme aims to maximise the bit rate (and hence minimise the transmission time) of a scheduled UE by assigning the configured maximum possible number (1, 3, 6 or 12) of 15 kHz subcarriers. The drawback of this option is that it limits the number of UEs that can concurrently utilise the resources, which may lead to long waiting times and hence possible violations of delay budgets.
  • Min-max allocation (MMA)—This scheme targets the most resource-efficient bit rate maximisation by assigning the minimum number of 15 kHz subcarriers that is needed to provide the scheduled UE with the maximum attainable bit rate.
  • Maximum granularity allocation (MGA)—This scheme maximises the number of concurrently scheduled UEs by assigning a single 3.75 kHz subcarrier per UE. The downside of this option is the suboptimal bit rate assigned to (particularly) UEs with high SINRs, as well as resource underutilisation in case too few UEs are present to fill all resources.
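The sketch below contrasts the three allocation schemes in simplified form. The input min_for_max_bitrate stands for the smallest allocation that already achieves the UE's maximum attainable bit rate (an output of link adaptation, here assumed given), so the listing is an illustration rather than the simulator's actual allocation logic.

```python
# Simplified view of the three uplink subcarrier allocation schemes. With
# 15 kHz spacing, allocations of 1, 3, 6 or 12 subcarriers are possible;
# MGA instead assigns a single 3.75 kHz subcarrier per UE.
ALLOWED_15KHZ = (1, 3, 6, 12)

def allocate(scheme, configured_max=12, min_for_max_bitrate=1):
    """Return (number of subcarriers, subcarrier spacing in kHz) for one UE."""
    if scheme == "LGA":   # least granularity: the configured maximum allocation
        return configured_max, 15.0
    if scheme == "MMA":   # min-max: smallest allocation reaching the maximum bit rate
        return min(s for s in ALLOWED_15KHZ if s >= min_for_max_bitrate), 15.0
    if scheme == "MGA":   # maximum granularity: one narrow subcarrier per UE
        return 1, 3.75
    raise ValueError(f"unknown scheme: {scheme}")

print(allocate("LGA"), allocate("MMA", min_for_max_bitrate=4), allocate("MGA"))
```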
Given the different schemes described above, in principle 3 × 3 = 9 candidate schedulers, i.e. combinations of a prioritisation and a subcarrier allocation scheme, can be devised. In order to limit our numerical study to the most sensible candidates, we have preselected five such combinations, viz. EDDF/MGA, SPTF/MGA, EDDF-SPTF/MGA, EDDF-SPTF/MMA and EDDF-SPTF/LGA. The motivation behind selecting these five candidate schedulers is as follows. The EDDF and SPTF schemes both ignore the possible impact on delay budget violations due to the different transmission time of UEs and the time remaining until the due date, respectively. The impact of this potential intrinsic shortcoming is expected to be least significant if the prioritisation scheme is combined with a more granular resource assignment, i.e. the MGA scheme. Alternatively, the LGA and MMA subcarrier allocation schemes are most suitably combined with the EDDF-SPTF prioritisation scheme, as it aims to address the given drawbacks of both the EDDF and SPTF schemes. As a final selection, the combination of EDDF-SPTF and MGA is also considered since the transmission time can potentially still impact performance in the MGA scheme due to the low bit rate of UEs.

5 Network models

This section describes the networking scenarios as considered in the simulation-based assessment study, considering both the smart energy distribution network and the cellular NB-IoT network.
We model the smart energy distribution network in a hexagonal layout, where each hexagon models the service area of a given substation with uniformly spread households. Refer to Table 2 for the assumed environment-specific settings for the household density and the number of households per substation. The hexagon radius is modelled by an effective feeder length, which slightly differs from the real feeder lengths given in the rightmost column of the table. These effective feeder lengths are chosen such that the number of households per hexagon (model) is the same as that for a substation area (reality). This results in effective feeder lengths of 0.43, 0.43, 0.35 and 0.35 km for the RU, SU, U and DU environments, respectively.
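A short check (assuming the effective feeder length corresponds to the hexagon radius) reproduces these values from the Table 2 figures:

```python
# Check of the effective feeder lengths: choose the hexagon radius r such that
# hexagon area x household density = households per substation, i.e.
# (3*sqrt(3)/2) * r^2 * density = HHs_per_substation.
from math import sqrt

table2 = {  # environment: (HHs per km^2, HHs per substation)
    "RU": (50, 24), "SU": (350, 165), "U": (1500, 480), "DU": (2272, 693),
}
for env, (density, hh_per_sub) in table2.items():
    r_km = sqrt(2 * hh_per_sub / (3 * sqrt(3) * density))
    print(f"{env}: effective feeder length ~ {r_km:.2f} km")
# Yields roughly 0.43, 0.43, 0.35 and 0.34 km, in line with the rounded
# values used in the model.
```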
The cellular NB-IoT network comprises nineteen sectorised sites deployed in a hexagonal layout. It is stressed that, even though the energy distribution and the cellular NB-IoT networks are both modelled in a hexagonal grid, they are of course independently dimensioned. The environment-specific inter-site distance (ISD) of the cellular network is generally dictated by coverage-oriented (rural environments) or capacity-oriented (urban/suburban environments) requirements. The assumed ISDs are based on the consideration that an NB-IoT network is typically deployed on an existing 2/3/4G site grid. Each sector is served with a directional antenna with a main lobe gain of 18 dBi and a 3D antenna pattern taken from [1]. The UE (the meter-associated modem) is equipped with an omnidirectional antenna with a gain of 0 dBi, installed at an assumed height of 1.5 m. The considered frequency carrier is assumed to be in the 800 MHz band and is planned with contiguous reuse.
Table 3 shows the ISDs and propagation models considered for the different environments [26–28]. In the propagation model, shadowing is modelled with assumed inter- and intra-site correlations of 0.5 and 1, respectively.
Table 3
Environment-specific network and propagation aspects

| Environment | ISD (km) | Path loss (dB) | Shadowing std. dev. | Penetration loss | Channel model |
|---|---|---|---|---|---|
| RU | 7.5 | 94.6 + 34.1 × log10(d_km) | 6 dB | Based on COST231 NLOS model [28] | Rural area model, 6 taps, 0 Hz Doppler |
| SU | 3.2 | 103.8 + 33.6 × log10(d_km) | 8 dB | Based on COST231 NLOS model [28] | Typical urban (TU) model, 20 taps, 0 Hz Doppler |
| U | 1.732 | 119.8 + 37.6 × log10(d_km) | 10 dB | Based on COST231 NLOS model [28] | Typical urban (TU) model, 20 taps, 0 Hz Doppler |
| DU | 0.5 | As for U | As for U | Based on COST231 NLOS model [28] | Typical urban (TU) model, 20 taps, 0 Hz Doppler |

Here, d_km denotes the base station-UE distance in kilometres.
As a general note, we recognise that different modelling choices may influence results to some degree, but insist that as long as realistic modelling assumptions are made, as we believe we have done, key qualitative outcomes are expected to be largely the same, irrespective of the detailed modelling specifics.
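To illustrate how the propagation parameters of Table 3 and the antenna assumptions translate into a coupling loss (and hence a coverage level, cf. Table 4), the sketch below evaluates the urban link budget for a given distance. The shadowing draw and the penetration-loss input value are illustrative assumptions of this example; in the study, the penetration loss follows a COST231 NLOS-based model.

```python
# Hedged example of a coupling-loss evaluation for the urban environment.
import numpy as np

def urban_coupling_loss_db(d_km, penetration_loss_db, rng=None):
    rng = rng or np.random.default_rng()
    path_loss = 119.8 + 37.6 * np.log10(d_km)   # urban path-loss model (Table 3)
    shadowing = rng.normal(0.0, 10.0)           # lognormal shadowing, 10 dB std. dev.
    antenna_gains = 18.0 + 0.0                  # 18 dBi site antenna, 0 dBi UE antenna
    return path_loss + shadowing + penetration_loss_db - antenna_gains

CL_RANGES = [(0, 130, "CL1"), (130, 140, "CL2"), (140, float("inf"), "CL3")]
loss = urban_coupling_loss_db(d_km=0.6, penetration_loss_db=15.0)  # 15 dB is illustrative
cl = next(name for lo, hi, name in CL_RANGES if lo <= loss < hi)
print(f"coupling loss ~ {loss:.1f} dB -> {cl}")
```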
As explained, the overall model thus consists of two distinct and independently planned networks, viz. a smart energy distribution network including households with smart meters, and a cellular NB-IoT network used to convey the ORM messages transmitted by modem-equipped smart meters at the event of a power outage. Figure 4 shows both networks in an overlaid fashion for all four considered environments. As the figures illustrate, the number of substations covered within a radio cell is environment-specific and decreases in the order rural, suburban, urban and dense urban. This is a net effect of the ISD of the cellular network and the household density, both of which depend on the environment, but to a different degree. The traffic load experienced by the radio cell depends on the product of the number of substations covered in the radio cell and the number of households per substation, where the latter decreases when moving from dense urban to rural environment, as shown in Table 2. The figure also shows that the degree of overlap between a substation and radio cell is not uniform across all radio cells, particularly in the dense urban and urban environments. This may impact the network load in a radio cell during a power outage scenario, which affects one or more randomly selected substations. In order to model this variability in the network load in a single radio cell simulation, the relative position of both grids with respect to one another is chosen randomly in different simulation snapshots. Consequently, this results in a varying overlap between the radio cells and the substation cells.
Figure 5 zooms into the overlapping smart energy distribution and NB-IoT cellular networks for the urban environment. The depicted red, green and yellow coloured markers represent UEs (smart meters in households) served by the three cells of the central site, while all grey-coloured UEs are served by other cells. In the analysis, a single cell is explicitly simulated, while all other cells are statically configured to establish realistic interference levels.

6 Simulation results and analysis

In this section, the key results from an extensive performance analysis are presented. Although the focus of the discussion is on the in-band deployment mode, as that is the most likely mode used in actual deployments, some results are also presented for the guard-band and stand-alone deployment modes. Throughout the numerical analysis, the results primarily show the impact of the so-called ‘outage percentage’ on certain key performance indicators (KPIs). The outage percentage is defined as the fraction of substations in the simulated radio cell that are modelled to be in a power outage and will consequently trigger the meters (~ UEs) in the underlying households to initiate the transmission of an ORM message. To illustrate this more clearly, Fig. 6 shows for each environment the number of covered substations per radio cell, the average number of smart meters per substation and the number of affected smart meters in a radio cell for example outage percentages of 10%, 50% and 100%. The bar values and error bars indicate the mean and standard deviation, respectively, of the obtained values across 100 simulation snapshots. It is noted that these plots are the same irrespective of the selected deployment mode since the serving radio cell for each dropped UE is selected based purely on propagation conditions (coupling loss) which in turn depend only on the UE location and the given environment. The figure shows that the average number of affected smart meters per radio cell is highest in the urban scenario. It is noted that the average number of affected smart meters is for each environment approximately equal to the number of affected substations (based on the outage percentage) times the average number of smart meters per substation. It is then readily verified from the left two charts that this product is indeed highest for the urban scenario, which explains the observation from the rightmost chart.
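As a back-of-the-envelope illustration of this relation (the simulator instead counts the meters of the actually affected, randomly drawn substations), the expected ORM load per cell can be approximated as follows; the per-cell figures in the example are hypothetical.

```python
# Rough expectation of the number of affected smart meters in a radio cell.
def expected_affected_meters(substations_per_cell, meters_per_substation, outage_pct):
    return outage_pct / 100.0 * substations_per_cell * meters_per_substation

# Hypothetical per-cell figures; the actual values vary per environment and
# per snapshot and are summarised in Fig. 6.
print(expected_affected_meters(substations_per_cell=8,
                               meters_per_substation=480,
                               outage_pct=50))  # -> 1920.0
```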
The objectives of the extensive performance assessment and sensitivity analysis, as presented in the following paragraphs, are (i) to assess the suitability of NB-IoT as a communication technology for reliably conveying ORM messages in a smart energy distribution network, and (ii) to determine an (near-)optimal NB-IoT radio network configuration for the considered ORM use case. The radio network configuration concerns different aspects, including the configuration of the NPRACH and the NPDCCH, as well as the selection of a scheduler, which may need to be optimised in order to achieve a globally optimal configuration. Given this multi-dimensional optimisation problem, we assume a baseline configuration and perform a sensitivity analysis considering unilateral variations for each configuration aspect. In the end, a near-optimal configuration is derived by combining the unilaterally identified optimal settings for each configuration aspect. The assumed baseline configuration itself has been derived in a pre-study (excluded here; see also [18]), applying a similar approach of combining unilaterally optimised settings starting from an arbitrary ‘pre-baseline’ configuration. Throughout the analysis, the impact of an increasing outage percentage on the following three KPIs is studied:
  • The reliability, defined as the fraction of ORM messages that is successfully transferred within the assumed deadline of 20 s.
  • The success rate, defined as the fraction of ORM messages that is transferred successfully.
  • The 95th transfer delay percentile, determined over the successfully transferred ORM messages.
Among the considered KPIs, the reliability metric is probably the most relevant one, as it explicitly incorporates the delay requirement of the ORM messages. The reliability metric should be high enough to enable the utility network operator to dependably identify power outages.
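The three KPIs can be derived from the per-message outcomes of a simulation snapshot as sketched below; the chosen data layout (a list of delays of the successfully delivered messages plus the total number of generated messages) is an assumption of this illustration.

```python
# Computing the three KPIs from simulated per-message outcomes.
import numpy as np

def orm_kpis(success_delays_s, n_generated, budget_s=20.0):
    """Reliability, success rate and 95th transfer delay percentile."""
    delays = np.asarray(success_delays_s, dtype=float)
    reliability = np.count_nonzero(delays <= budget_s) / n_generated
    success_rate = delays.size / n_generated
    delay_p95 = np.percentile(delays, 95) if delays.size else float("nan")
    return reliability, success_rate, delay_p95

# Example: 4 of 5 generated ORM messages delivered, one of them after the deadline.
print(orm_kpis([2.1, 5.3, 18.7, 25.0], n_generated=5))
```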
Table 4 shows the baseline network configuration. The coupling loss ranges of the three coverage levels are set to reflect the typical distribution of UEs across these coverage levels [29].
Table 4
Baseline network configuration and scenario settings

| Parameter | Setting(s) |
|---|---|
| # Carriers | 1 |
| Scheduler | EDDF-SPTF with MGA |
| Coupling loss ranges per CL (dB) | CL1: [0, 130], CL2: [130, 140], CL3: [140, ∞] |
| NPDCCH: maximum # repetitions (R_max) | 8 |
| NPDCCH: period (T_NPDCCH) | 12 ms |
| NPRACH: maximum # RA attempts (CL1 / CL2 / CL3) | 19 / 5 / 7 |
| NPRACH: period T_NPRACH (CL1 / CL2 / CL3) | 80 / 160 / 320 ms |
| NPRACH: # preamble repetitions (CL1 / CL2 / CL3) | 2 / 4 / 32 |
| NPRACH: # preambles (CL1 / CL2 / CL3) | 24 / 12 / 12 |
| Back-off interval | [0, 1024] ms |
| RAR window size | 10 × T ms |
| MSG4 window size | 64 × T ms |
In order to study the impact of the selected NB-IoT deployment mode, a comparison of the reliability performance was done for the in-band, guard-band and stand-alone modes, considering the baseline configuration, with results depicted in Fig. 7. Although a better downlink throughput performance may be expected for the guard-band and stand-alone modes, as qualitatively explained in Section 4, the actual performance difference is rather insignificant, since the overall data transmission process involves only two downlink message transfers (RAR and MSG4) with relatively small message sizes (< 30 bytes). The reliability performance for the rural and dense urban environments, nearly 100%, is robust to all outage percentages. Only for the suburban and urban environments does the reliability significantly degrade at a 100% outage percentage, with the lowest reliability observed for the urban environment. This is because, at high outage percentages, the network load in terms of the number of power outage-affected smart meters per radio cell is high only in the urban and suburban environments, as indicated in Fig. 6. Such a high cell load leads to a large number of preamble collisions and random access failures, thus impacting the reliability performance.
For the following sensitivity analyses, the in-band deployment is assumed, as stated earlier. Figure 8 shows the simulation results obtained for the four considered environments, again assuming the baseline configuration and now showing all three above-defined KPIs. In this as well as later figures, all performance curves are accompanied by an indicated 95% confidence interval around the KPI estimates. Note that a subset of the reliability values in Fig. 8 also appears in Fig. 7 for the in-band case; the observation that the urban environment has the worst reliability for all outage percentages is thus consistent. This figure clearly shows that a higher cell load, caused by a higher outage percentage, leads to more preamble collisions and timeouts, and causes a drop in both the success rate and the reliability.
The sensitivity analyses that follow focus on the urban environment, given its relatively poor performance compared to other environments. The aim is to study the performance impact of unilateral variations of the following four configuration aspects:
  • Scheduler
  • NPRACH resource configuration
  • Maximum number of RA attempts
  • NPDCCH resource configuration
It is noted that for simplicity reasons, in all cases, where applicable, the change of configuration is performed only for CL1, since most UEs are located in the range of CL1.
Figure 9 shows a comparison of candidate scheduler configurations. Note that each configuration is a combination of a prioritisation and a subcarrier allocation scheme, as discussed in Section 4. We can see that the combination of EDDF-SPTF prioritisation with MGA subcarrier allocation achieves the highest reliability for nearly all outage percentages. It can be noted that, for the same prioritisation scheme (EDDF-SPTF), the reliability performance degrades as the granularity of the UL subcarrier allocation is decreased (from MGA to LGA). This is fundamentally due to the small packet sizes involved in the ORM use case, which limits the gain in actual data throughput with a decrease in subcarrier allocation granularity. A decrease in the granularity significantly impacts, on the other hand, the waiting time of UEs, which affects the success rate and the 95th transfer delay percentile.
Figure 10 compares different NPRACH configurations, which differ in terms of the number of RAOs per second, i.e. the number of preambles per period (see Table 4). In this analysis, only the length of the NPRACH period T_NPRACH, i.e. the time interval between consecutive RAOs for a UE, is varied. The key to optimising the NPRACH configuration is to balance the occurrence of preamble collisions and timeouts. This is because, as stated in Section 4, the NPRACH and the NPUSCH share the same UL resources; the split of these resources hence influences the preamble collision probability and the UE waiting time, respectively. The results show that the baseline configuration and the configuration with 600 RAOs/s (blue plot) have similar and nearly optimal performance. Furthermore, their respective optimality remains robust to load (outage percentage) variations. The performance achieved by the remaining two configurations is significantly lower, as their NPRACH resources are so limited that too many preamble collisions occur.
In Fig. 11, the performance sensitivity with respect to the maximum number of RA attempts is analysed, which shows an improvement in reliability if the maximum number of RA attempts is increased from 15 (blue plot) to 19 (baseline), but a degradation for a higher setting, e.g. 23 or 27. It can be intuitively argued that an increase in the allowed number of RA attempts improves the success rate at the cost of increased transfer delays. For the configurations with the setting equal to 23 and 27, a significant proportion of the successful attempts lead to transfer delays exceeding the 20-s target, thereby reducing the reliability performance. It is the baseline configuration which indeed achieves the optimal trade-off between success rate and transfer delay.
Optimisation of the NPDCCH resources involves adjusting the NPDCCH resource allocation in terms of the ratio of the parameters R_max and T_NPDCCH (see Table 4). The higher this ratio for a given CL, the more resources are available for scheduling the UEs of that CL. However, this reduces the resource availability, and hence the performance, for the UEs of the other CLs. At high loads, the waiting time until scheduling for UEs in CL1 (where the majority of UEs are present) is expected to be more dominant relative to that of the UEs in CL2 and CL3, compared to low-load scenarios. Figure 12 illustrates the performance sensitivity with regard to the NPDCCH configuration (R_max/T_NPDCCH). A load-dependent optimal configuration is observed. The baseline configuration with the maximum ratio (R_max/T_NPDCCH = 8/12) indeed performs better at high loads (where the CL1 load is dominant), whereas the configuration with a slightly lower ratio (R_max/T_NPDCCH = 4/8) performs better at low loads. The configuration with R_max/T_NPDCCH = 1/8 performs the worst at most loads, indicating that the available scheduling resources are too low.
The sensitivity analyses shown in Figs. 9, 10, 11 and 12 indicate that a few alternative configurations perform on par with or better than the baseline configuration, such as the blue curve in Fig. 10 (NPRACH) and the violet curve in Fig. 12 (NPDCCH). We can observe that the blue curve is optimal for outage percentages above 70%, whereas the violet curve is optimal for outage percentages below 70%. To investigate whether a ‘robust’ configuration exists which is near-optimal for all loads, a candidate configuration is created for the final analysis presented in Fig. 13, combining the above alternative configurations. Note that the blue and violet curves correspond to the equivalently coloured curves and their configuration settings in Figs. 10 and 12, respectively. The third (red) curve corresponds to a configuration which combines the specific load-dependent optimal NPRACH and NPDCCH settings of the blue and violet curves, respectively. As shown in Fig. 13, this configuration indeed performs near-optimally for all loads, with an achieved reliability performance close to 100% for the majority of the considered outage percentages.

7 Conclusions

In this article, a suitability assessment and performance optimisation was presented for the use of NB-IoT cellular technology to support smart energy distribution networks. The focus was on the reliable and timely delivery of ORM messages in the event of a local or regional power outage. To this end, realistically tuned models of both the energy distribution network and the cellular NB-IoT network were developed. In order to arrive at a (near-)optimal radio network configuration, we have conducted an extensive sensitivity analysis of the performance (in terms of reliability, success rate and latency) w.r.t. a wide range of NB-IoT network configurations. In particular, we looked at scheduling as a key means to optimise the service performance. In this regard, several candidate schedulers, addressing device prioritisation and subcarrier allocation, were proposed and analysed.
The obtained results show that the NB-IoT technology, when appropriately configured, is indeed suitable to adequately support ORM and other smart energy distribution network services with similar or milder performance requirements. This conclusion is in line with the results of [16] and [17], which both indicated that the NB-IoT technology is suitable for smart grid applications which do not have stringent latency requirements. The overall reliability performance was seen to be similar across the three NB-IoT deployment modes, i.e. in-band, guard-band and stand-alone. Amongst the considered environment scenarios, the urban environment was found to be the most challenging, given its relatively poor performance due to a high network load. The results also show that, among the assessed packet schedulers, the scheduler combining EDDF-SPTF prioritisation with MGA subcarrier allocation outperforms the other schedulers, achieving reliability levels in the range of, for example, 98–100% for power outage percentages up to about 50%.
As future work, we recommend a further optimisation of the NB-IoT deployment, in the context of a diverse mix of smart grid use cases, including those outside the energy distribution segment as covered in this study. Considering that practical deployments will have spatio-temporal variations in environmental aspects, traffic loads, service mix and the associated performance requirements, another challenge for further research is to devise a self-optimisation scheme for the adaptation of radio network configurations in response to these variations. Although self-optimising schemes such as those proposed in [30] within the context of LTE/LTE-A may be used as a starting point, applicable limitations in NB-IoT need to be considered in the design. For example, the minimum periodicity with which network configuration changes can be broadcast will influence how quickly the system can respond to load changes.

Acknowledgements

The authors would like to thank Michiel Nijhuis, Department of Electrical Energy Systems, TU Eindhoven, for providing relevant inputs on the smart grid energy distribution network modelling.

Funding

This work is partially funded by the EU under Grant agreement no. 619437 ‘SUNSEED’.

Availability of data and materials

Not applicable.

Competing interests

Not applicable.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
References
6. M. Kuzlu, M. Pipattanasomporn, S. Rahman, Communication network requirements for major smart grid applications in HAN, NAN and WAN. Computer Networks 67 (2014)
7. Y. Jiang et al., Outage management of distribution systems incorporating information from smart meters. IEEE Trans. Power Syst. 31(5), 4144–4154 (2016)
8. Y. Shi, W. Xi, H. Zhou, Efficient paging message design based on a binary tree in NB-IoT system, Proceedings of CSCN ‘16 (IEEE, Berlin, 2016)
9. S.-M. Oh, J. Shin, An efficient small data transmission scheme in the 3GPP NB-IoT system. Commun. Lett. 7798, 660–663 (2016)
10. M. Chen, Y. Miao, Y. Hao, K. Hwang, Narrow Band Internet of Things. IEEE Access 5, 20557–20577 (2017)
11. A. Zayas, P. Merino, The 3GPP NB-IoT system architecture for the Internet of Things, Proceedings of ICC ‘17 (IEEE, Paris, 2017)
12. A. Adhikary, X. Lin, Y.-P.E. Wang, Performance evaluation of NB-IoT coverage, Proceedings of VTC ‘16 (Fall) (IEEE, Montreal, 2016)
13. J.W. Shin, J.S. Kim, M.Y. Chung, S.J. Lee, Control channel load balancing in narrow band cellular IoT systems supporting coverage class, Proceedings of ISMS ‘16 (IEEE, Bangkok, 2016)
14. X. Wang, X. Chen, Z. Li, Y. Chen, Access delay analysis and optimization of NB-IoT based on stochastic network calculus, Proceedings of SmartIoT ‘18 (IEEE, Xi’an, 2018)
15. H. Chung, S. Lee, J. Jeong, NB-IoT optimization on paging MCS and coverage level, Proceedings of ISWCS ‘18 (IEEE, Lisbon, 2018)
16. S. Persia, L. Rea, Next generation M2M cellular networks: LTE-MTC and NB-IoT capacity analysis for smart grids applications, Proceedings of AEIT ‘16 (IEEE, Capri, 2016)
17. Y. Li, X. Cheng, Y. Cao, D. Wang, L. Yang, Smart choice for the smart grid: narrowband Internet of Things (NB-IoT). IEEE Internet Things J. 5(3), 1505–1515 (2018)
21. M. Nijhuis, M. Babar, M. Gibescu, S. Cobben, Demand response: social welfare maximisation in an unbundled energy market case study for the low-voltage networks of a distribution network operator in The Netherlands. IEEE Trans. Ind. Appl. 53(1), 32–38 (2017)
25. 3GPP, Radio Resource Control (RRC); protocol specification, TS 36.331, www.3gpp.org (2016)
28. R. Litjens, Y. Toh, H. Zhang, O. Blume, Assessment of the energy efficiency enhancement of future mobile networks, Proceedings of WCNC ‘14 (IEEE, Istanbul, 2014)
30. A. Lo, Y.W. Law, M. Jacobsson, M. Kucharzak, Enhanced LTE-Advanced random-access mechanism for massive machine-to-machine communications, Proceedings of WWRF 27 (2011)
Metadata
Publication date: 01.12.2019
Publisher: Springer International Publishing
DOI: https://doi.org/10.1186/s13638-019-1485-2
