Emerging wireless lighting control standards
Emerging wireless technology is disrupting the lighting controls industry
- Review basic building lighting control objectives.
- Understand fundamental concepts behind mesh and flood networks.
- Examine IEEE 802.15.4, Bluetooth and ZigBee standards.
- Review advantages and disadvantages of wireless lighting controls.
The evolution of lighting controls has traditionally been driven by the prevailing energy codes, such as the International Energy Conservation Code (IECC) and ASHRAE 90.1: Energy Standard for Buildings Except Low-Rise Residential Buildings. Usually that evolution has amounted to nothing more than a series of incremental improvements to the basic underlying controls technology.
However, after several years of taking a “wait and see” approach, most major lighting control manufacturers are now fully embracing wireless lighting control technology. Wireless lighting controls represent a significant departure from traditional hardwired lighting controls solutions. This disruptive technological innovation will force the engineering design community to rethink how to specify and design lighting control systems. New design considerations include changes in sensor functionality, layout, interoperability and security.
What do you want to accomplish with lighting controls?
Many manufacturers tout the endless possibilities associated with the “internet of things” and interconnected wireless lighting control systems. While interconnecting wildly dissimilar devices such as refrigerators, door locks and lighting controls, and mining the data they generate, may have some potential benefits, as with any emerging technology the exact consequences of such convergence are still nebulous.
With this uncertainty, the engineering community has to first focus on the core lighting control functionality requirements. Those core requirements can be summed up in a few very basic principles:
- Empower occupants with control over their environment.
- Reduce energy usage.
- Accomplish the previous two goals without impacting the health and safety of the occupants.
How do lighting controls accomplish this? The generic functions of any lighting control system can be categorized as a series of control inputs and resulting actions. On the most rudimentary level, you push a switch (input) to turn a light on/off (action). However, IECC considerations and sustainability/wellness standards such as U.S. Green Building Council’s LEED and the WELL Building Standard have expanded lighting control requirements dramatically.
Some inputs beyond manual user intervention that have to be considered are:
- Occupancy: Is there someone within the area controlled?
- Time of day: Is it nighttime or daytime? Is it during normal business hours? What is the expected correlated color temperature of the natural illumination at that time of the day?
- Ambient light levels: Is there adequate natural illumination within the area controlled?
- External signals such as utility load shed commands or fire alarm/mass notification system inputs.
The resulting actions that the lighting control system initiates have also expanded beyond just turning lights on/off. These actions include:
- Dim lights with a controlled fade rate.
- Adjust CCT for tunable white sources.
- Change color by varying hue, saturation and lightness for red-green-blue and red-green-blue-white light sources.
- Initiate “scenes” where a group of memorized settings are initiated simultaneously.
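The input/action model described above can be sketched in a few lines of Python. Every name here is a hypothetical illustration, not any manufacturer’s API: a zone holds a dimming level and a correlated color temperature, and a scene simply memorizes and recalls that state as a group.

```python
from dataclasses import dataclass, field

@dataclass
class Zone:
    """Hypothetical control zone holding the state a controller acts on."""
    level: float = 0.0        # dimming level, 0.0 (off) to 1.0 (full)
    cct_kelvin: int = 3500    # correlated color temperature for tunable white
    scenes: dict = field(default_factory=dict)

    def dim_to(self, target: float, fade_seconds: float) -> None:
        # A real driver would ramp over fade_seconds; this sketch just sets it.
        self.level = max(0.0, min(1.0, target))

    def save_scene(self, name: str) -> None:
        # Memorize the current settings as a named group.
        self.scenes[name] = (self.level, self.cct_kelvin)

    def recall_scene(self, name: str) -> None:
        # Recall every memorized setting simultaneously.
        self.level, self.cct_kelvin = self.scenes[name]

zone = Zone()
zone.dim_to(0.8, fade_seconds=2.0)
zone.cct_kelvin = 3000
zone.save_scene("evening")
zone.dim_to(0.0, fade_seconds=0.5)      # lights off
zone.recall_scene("evening")            # restores level 0.8 and 3000 K together
```

The point of the sketch is that a “scene” is nothing more than a stored tuple of settings recalled as a unit, which is why scene recall is cheap for a controller to implement.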
These basic input/action concepts are reasonably straightforward. However, adding functionality increases complexity and cost. Lighting controls must remain reliable, cost–effective and as simple as possible. They need to “just work.” As we will see, making that happen is harder than it sounds.
Data changes everything
The convergence of modern lighting controls with the internet of things offers a potential wealth of new data and expanded functionality. Every sensor and lighting control device can generate data that can be recorded, analyzed for trends and acted upon. Ideally, using this data would make the system more autonomous in how it adaptively reacts to changes in the environment it is controlling. Seemingly simple device-level information, such as changes in battery levels for a particular battery-operated sensor or the ability to easily identify which control node/device is not functional, can also increase system resiliency.
With the vast amounts of data available to harvest, the primary question is: What types of data are the most useful when trying to implement the three core guiding principles of empowerment, conservation and “do no harm” for a system that’s primarily intended to turn lights on and off?
The temptation is to gather as much information as possible and figure out what to do with it later. However, as the quantity of control devices and the amount of data they generate increase, that same data can also overwhelm a control system, especially one that is designed for simplicity, low energy consumption and low network use. Even if hardware eventually develops to the point where it is no longer the limitation, data harvesting can also raise significant ethical and privacy concerns. Ethical and legal considerations in the internet of things are outside the scope of this article, but they should be weighed in parallel with the technical considerations presented here.
Why the clock is ticking
Traditionally, lighting control systems have been hardwired, with cables interconnecting individual devices back to a single centralized master control device or server. While generally offering consistent and predictable performance, that server also acts as a bottleneck because every sensor and switch needs to communicate with it. That server needs to respond to each and every input with an associated action command (on/off/dim).
While the performance may be sufficient with a smaller quantity of control devices, this type of network topology does not scale well to large installations. Often, the recommended solution would be to segment the large lighting control system into multiple smaller systems. In a pre-energy code world where the quantities of sensors and associated lighting control zones were limited, such design constraints may have been acceptable.
IECC has a new optional code compliance path for luminaire-level lighting controls where control devices are integrated into light fixtures. As the energy codes continue to evolve and the quantity of sensors and the granularity of control inevitably expand to where individual lights instead of groups of lights may be controlled, the performance and physical installation constraints of traditional hardwired network topologies become a severe problem.
- What happens when instead of a couple dozen control devices, they’re integrated into every light fixture and there are hundreds if not thousands of them within a building?
- What if those devices are sensors that constantly generate additional data beyond simply binary yes/no occupancy information, such as ambient illumination levels?
- Will a single server be able to keep up with the amount of network traffic? What happens if that single server fails?
- Is installing all of that wire required to interconnect all of those devices feasible or cost–effective?
- Will you spend more money installing the system than you will save through reduced energy consumption?
With these types of considerations, wireless lighting controls start to make more sense. However, for the aforementioned reasons, simply substituting wireless connections for the physical point-to-point hardwired connections doesn’t effectively address the key concern of being able to scale lighting control networks to effectively accommodate large systems. Another solution is required.
How to get the data where it needs to go
Most commercial wireless lighting control systems use a network topology referred to as a “mesh.” As opposed to the previous point-to-point hardwired example where a control device sends a message directly to a specific centralized server, a device (node) in a mesh network can communicate with any physically adjacent neighbor nodes within range of its radio transmitter. That message, if not intended for any of the neighbors that received it, will be retransmitted by those neighbors. That message will continue to be retransmitted until it reaches its final destination(s).
Only the node or nodes that the message was addressed to will act upon the data contained within the message. Individual control devices in this type of network can talk directly to each other via a “publish and subscribe” relationship without necessarily having to be routed through a traditional centralized server. This decentralization addresses one of the primary limitations of a traditional hardwired lighting control system using a star topology with a single master control server that’s responsible for relaying every command.
With this type of signal propagation methodology, the communicating nodes do not need to be within direct radio range of each other as long as there are intermediary relaying nodes between the two. As such, the radio transmitter within a device doesn’t need to be strong enough to transmit directly to a specific final destination; it only needs to be strong enough to reach its nearest neighbors.
By extension of this concept, any node can communicate with any other node in that same mesh network. This can dramatically conserve power for battery–operated devices. While highly flexible, a major concern with this relaying scheme is how many retransmitting “hops” it will take for that message to reach its final destination. Signal latency increases with both the quantity of hops and size of the message packet that is transmitted.
Latency can significantly impact the performance of a lighting control system. It can cause annoying delays between when a switch is pushed and when the lights turn on. It can also cause “popcorning” — when a group of light fixtures intended to operate as a single unified zone instead turn on/off and dim at different rates.
Ideally, establishing the shortest route through the mesh between any two nodes gives the least latency and best performance. However, a message also has the option of taking multiple potential paths as it hops its way to its destination, reducing the chance of single points of failure. If every device is listening for and retransmitting every message (a concept known as flooding) and that single message takes multiple paths, how can you control radio use and network traffic? Why isn’t there a resulting feedback loop that increases traffic to the point where it overwhelms the network?
These and similar concerns are addressed by a few simple concepts:
- Messages have a time to live. Basically, time to live is a countdown timer for messages. Relayed messages contain information regarding the number of hops typically required for that packet to reach the receiving destination node from the transmitting node. Every time that a message is retransmitted, the counter goes down by one. Eventually, the counter reaches a value where the message is no longer allowed to be relayed. This prevents a message from endlessly ping-ponging through a network. This time to live information can also be used to determine the average number of hops to get from one node in the network to another. Based on this information, the initial time to live value can be optimized and further reduce the network traffic by preventing needless extra relaying of messages.
- Nodes shouldn’t retransmit messages that they’ve already seen. Each device has a message cache where all recently received and retransmitted messages are stored. Every newly received message is compared against the cached messages. If there is a duplicate in the cache, it would indicate that the node had seen that message before and it’s immediately discarded.
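Those two rules, a decrementing time to live and a duplicate cache, are enough to make flooding terminate. The Python sketch below is a hypothetical illustration of the mechanism (not any real protocol implementation): it floods a message through a four-node chain where the sender and destination are out of direct radio range.

```python
class Node:
    """Hypothetical mesh node illustrating TTL decrement and a message cache."""
    def __init__(self, name):
        self.name = name
        self.neighbors = []     # nodes within direct radio range
        self.seen = set()       # cache of message IDs already handled
        self.received = []      # messages addressed to this node

    def handle(self, msg_id, dest, ttl):
        if msg_id in self.seen:         # duplicate in the cache: drop it
            return
        self.seen.add(msg_id)
        if dest == self.name:           # addressed to us: act on it
            self.received.append(msg_id)
            return
        if ttl <= 0:                    # time to live expired: stop relaying
            return
        for n in self.neighbors:        # flood to every radio neighbor
            n.handle(msg_id, dest, ttl - 1)

# Linear chain A - B - C - D: A can only reach D through two relays.
a, b, c, d = (Node(x) for x in "ABCD")
a.neighbors, b.neighbors = [b], [a, c]
c.neighbors, d.neighbors = [b, d], [c]

a.handle(msg_id=1, dest="D", ttl=3)   # 3 hops is just enough to reach D
print(d.received)   # [1]
```

Note that B relays the message back toward A, but A’s cache discards it, which is exactly the feedback loop the cache exists to break; and if the initial TTL were 2 the message would die at C without ever reaching D.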
What about devices that, for practical reasons, need to conserve power as much as possible and avoid wasting power retransmitting messages that aren’t intended for them? Minimizing power use is an extremely important consideration in battery-operated devices because every time the radio transmitter is activated, a significant amount of power is used.
Low-power devices that don’t participate in relaying messages are called “leaf” or “end” devices. These devices will form a “friendship” with another device, typically one without the same power limitations that can send and receive without fear of depleting its power supply (i.e., a device integrated into a hardwired light fixture with a constant power source). The low-power device spends extended periods of time in sleep or standby mode and, when it needs to transmit data, communicates directly only with its “friend.” The friend participates in the mesh on behalf of the low-power node and is responsible for storing messages while the low-power device is sleeping, forwarding them when the low-power device wakes up periodically to check for messages.
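The store-and-forward half of a friendship can be reduced to a queue. The following sketch is a hypothetical simplification (real friendship establishment, polling timers and security are far more involved): the friend buffers messages while the leaf sleeps, and the leaf drains the buffer when it wakes.

```python
class Friend:
    """A mains-powered node that queues messages for a sleeping leaf node."""
    def __init__(self):
        self.queue = []

    def store(self, msg):
        self.queue.append(msg)          # held while the low-power node sleeps

    def poll(self):
        # Leaf has woken up: hand over everything queued and empty the buffer.
        pending, self.queue = self.queue, []
        return pending

class LeafNode:
    """A battery-powered device that sleeps and talks only to its friend."""
    def __init__(self, friend):
        self.friend = friend
        self.inbox = []

    def wake_and_poll(self):
        # Brief radio-on window: one exchange with the friend, then back to sleep.
        self.inbox.extend(self.friend.poll())

friend = Friend()
leaf = LeafNode(friend)
friend.store("set-level 80%")   # arrives from the mesh while the leaf sleeps
friend.store("query battery")
leaf.wake_and_poll()
print(leaf.inbox)   # ['set-level 80%', 'query battery']
```

The design trade-off is latency for battery life: a message destined for the leaf waits in the friend’s queue until the next wake-up, but the leaf’s radio only powers on for short, scheduled exchanges.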
Coexistence is hard
Most common wireless lighting control communication protocols primarily use the 2.4 gigahertz frequency band to send and receive messages. This is because that particular portion of the frequency spectrum is unlicensed by the Federal Communications Commission. Anyone can use this band without special permission as long as their radio equipment complies with certain technical requirements such as maximum transmitter power. This same general frequency band is also shared with Wi-Fi traffic.
Keep in mind that the protocols and mesh network topology that enable wireless lighting controls are intended to conserve power and, as such, are based on radio transmitters that are inherently low-powered and not intended to transmit very far. It is inevitable that conflicts will increase as the quantity of Wi-Fi and internet of things devices increases. The 2.4 gigahertz frequency band will become more congested and performance will subsequently suffer.
With this ever-increasing potential for interference, the first thought that comes to mind would be simply to use a different frequency band. The problem is that the radio spectrum is already heavily regulated. To help illustrate this fact, the U.S. Department of Commerce has created an infographic that shows how frequency bands are currently allocated in the United States.
While it may be possible in the future to appropriate a different frequency band for use by internet of things devices, that type of frequency reassignment takes years to accomplish and is expensive. A recent example of this occurred in 2019 when the FCC completed the transition of 175 television stations across the country to different frequencies. This transition freed an 84-megahertz-wide band of radio spectrum for use by 5G mobile phone carriers. This spectrum wasn’t given away for free — the federal government sold the usage rights for these frequencies via a reverse auction. That auction yielded $19.8 billion in revenue. From this, it should be clear that radio spectrum is extremely valuable.
Without the ability to migrate to a different frequency band, coexistence strategies are primarily limited to improving packet collision avoidance schemes and optimized message retry capability within the current wireless lighting control protocol standards. While this need has been recognized by the industry, standards regarding enhanced coexistence implementations are still forthcoming.
Standards are still evolving
The ultimate goal for the wireless lighting controls industry is for devices from different manufacturers to communicate directly with each other without any type of special considerations. In an ideal world, everything would be plug–and–play. The first step toward this goal is to develop open communication protocol standards for interoperability.
All communication systems define multiple different functional requirements: the physical interface (radio hardware), how data are routed and formatted, the application that receives and interacts with the data, etc. These various functional requirements define an overall system that can be described as a multilayered protocol “stack” fashioned after the Open Systems Interconnection model.
Strict conformance with the system’s protocol specifications is critical for ensuring interoperability. To this end, third–party product certification is important. Many interoperability problems can often be attributed to manufacturers using customized protocol implementations (e.g., using additional radio transmission frequencies that aren’t defined by the standard to increase range or reduce interference). Although it is still rare, look for a certification mark from the applicable standards organization to help ensure a minimal level of interoperability between devices from different manufacturers.
One note of caution: The standards discussed here are still under active development. Certifications marks that indicate compliance with any particular standard can be misleading in that they often don’t state what version of that standard is being used. Older versions of each standard may not implement certain core functionality such as mesh networking and IPv6 addressing.
It is shaping up into a two-horse race, Bluetooth versus ZigBee.
There are numerous competing wireless lighting control communication standards: EnOcean, Z-Wave, Thread, ZigBee and Bluetooth mesh, to name a few. Each has its own benefits but two are emerging as the frontrunners for commercial lighting controls — Bluetooth mesh and ZigBee. Both have a broad coalition of supporting companies. In fact, some semiconductor manufacturers offer products that support both protocols on a single chip.
The primary reason for this emergence is twofold: both standards scale well to accommodate large systems (supporting a theoretical maximum of 32,767 and 65,000 nodes, respectively) and both are reasonably mature open solutions that aren’t based on proprietary chipsets from a single manufacturer. (Note that whenever the word “theoretical” is used, actual performance will usually be significantly lower for a number of reasons.)
ZigBee has the primary advantage of being one of the first open, automation specific, wireless protocols. ZigBee has been a formal standard since 2004. The standard has been updated several times since then to add functionality such as that required for lighting control systems. However, it isn’t a full–stack protocol. Returning to our discussion of the OSI model, it only implements the upper application and network/transport functions.
ZigBee is built atop another standard, IEEE 802.15.4: Standard for Low-Rate Wireless Networks. IEEE 802.15.4 defines a low-data-rate solution for low-power (typically battery-operated) devices with low complexity and reduced infrastructure requirements. This IEEE standard defines the lower two layers, the physical and medium-access control layers. The physical layer controls the radio transceiver with associated frequency, channel, radio power and signal management functions. The medium-access control layer controls traffic between the upper layers (as defined by the ZigBee standard) and the radio transceiver represented by the physical layer.
It does not control security, which is a function of the ZigBee upper layers. It is important to note that several other protocols are also built on top of IEEE 802.15.4 (e.g. Thread/6LoWPAN, HART, in addition to multiple protocols that are proprietary to specific manufacturers), but are not directly interoperable with ZigBee. Any reference to 802.15.4 in a manufacturer’s literature should not be inferred as meaning that it is ZigBee compatible, nor will it be compatible with any other manufacturer’s 802.15.4 lighting control device.
The IEEE standard operates at a theoretical maximum transmission speed of 250 kilobits per second in the 2.4 gigahertz band and at 20/40 kilobits per second in the 915 megahertz (North America) and 868 megahertz (Europe) bands. Recent amendments to the standard have added additional frequency bands for Japan and China.
All of these frequency bands are unlicensed, meaning anyone’s device can use them. As alluded to earlier, this also means that network congestion is a potential problem. To help alleviate congestion, the standard defines 27 different transmission channels with 16 2–megahertz–wide channels in the 2.4 gigahertz range and 11 at lower frequencies.
Unfortunately, all but four of the IEEE 802.15.4 2.4 gigahertz channels are in conflict with the three nonoverlapping 22–megahertz–wide 802.11b Wi-Fi channels that are typically used in North America: channels 1, 6 and 11. The open gaps between these Wi-Fi channels where ZigBee, Bluetooth and other protocols can operate relatively unobstructed are only 3 megahertz wide. The conflict is worse with more current Wi-Fi standards (802.11n and 802.11ac), which have the option to use wider 40–megahertz–wide channels. Most Wi-Fi network equipment manufacturers do not recommend the use of the 40–megahertz–wide channel option in the 2.4 gigahertz band because of this coexistence issue.
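The overlap can be verified with simple arithmetic from the published channel plans: 802.15.4 channel k (k = 11 to 26) is centered at 2405 + 5(k − 11) megahertz and is 2 megahertz wide, while Wi-Fi channel n is centered at 2412 + 5(n − 1) megahertz and is 22 megahertz wide. The sketch below finds the four clear 802.15.4 channels.

```python
# Center frequencies in MHz; widths: 802.15.4 channels are 2 MHz, Wi-Fi 22 MHz.
zigbee_ch = {ch: 2405 + 5 * (ch - 11) for ch in range(11, 27)}
wifi_ch   = {ch: 2412 + 5 * (ch - 1) for ch in (1, 6, 11)}  # North America

def overlaps(center1, width1, center2, width2):
    # Two channels overlap when their centers are closer than the
    # sum of their half-widths.
    return abs(center1 - center2) < (width1 + width2) / 2

clear = [ch for ch, freq in zigbee_ch.items()
         if not any(overlaps(freq, 2, wf, 22) for wf in wifi_ch.values())]
print(clear)   # [15, 20, 25, 26]
```

Channels 15 and 20 sit in the 3-megahertz gaps between Wi-Fi channels 1/6 and 6/11, while 25 and 26 fall above Wi-Fi channel 11, which is why site surveys for 802.15.4-based systems usually steer networks toward those four channels.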
Bluetooth, Low Energy and mesh
Bluetooth has the primary advantage of broad name recognition beyond the automation and lighting control industry. Bluetooth was developed by Ericsson more than 20 years ago and became a formal standard (IEEE 802.15.1) in 2002. Bluetooth is a full “top to bottom” stack protocol that was originally intended for point-to-point communication, not a mesh topology.
However, the standard has been updated numerous times, with recent revisions dramatically expanding its functionality. Bluetooth version 4.0 introduced Bluetooth Low Energy and the Bluetooth mesh profile, published in 2017, formalized support for mesh network topologies. Both of these abilities are critical for Bluetooth’s use in wireless lighting controls. Earlier versions of Bluetooth do not have the critical lighting control profiles that are integrated into the Bluetooth mesh standard.
Most lighting control manufacturers’ Bluetooth products that are currently on the market only leverage point-to-point Bluetooth Low Energy technology to facilitate pairing of end/leaf devices and use separate proprietary 802.15.4-based mesh protocols and security features. Because Bluetooth mesh is still relatively new, devices using it are still rare. Bluetooth mesh lighting devices only started to show up in the official Bluetooth product listing database in late 2019.
The Bluetooth Low Energy standard has a theoretical maximum transmission speed of 2 megabits per second. While the channel width used by Bluetooth is roughly the same as that in 802.15.4, the channels are not spaced as far apart. The standard defines 40 2-megahertz-wide channels in the 2.4 gigahertz band, with three of these reserved for unidirectional “advertising.” Advertising is used for establishing an initial connection between two devices. Any speed claims have some significant caveats — Bluetooth only works in the 2.4 gigahertz band (with all of its congestion concerns) and achieving this speed is highly dependent on the manufacturer’s implementation of the new features in the standard.
Bluetooth’s speed advantage over the current version of ZigBee is an important consideration. The benefit of any additional speed compared to other standards isn’t necessarily having the ability to transmit a greater amount of data. Remember that all wireless lighting control protocols are intended to be low-data-rate solutions for low-power devices. Although it isn’t readily apparent, higher transmission speeds can improve coexistence in challenging radio frequency environments.
To conserve power, the primary goal is to use the radio transceiver as little as possible. Being able to quickly turn on the radio, send a message and turn it off again minimizes the amount of time spent transmitting and, therefore, decreases the likelihood of messages from different devices colliding.
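The effect is easy to quantify. Assuming a hypothetical 24-byte control message and ignoring preambles, headers and protocol overhead, the raw time on air at each standard’s nominal rate works out as follows:

```python
def airtime_ms(payload_bytes: int, rate_bits_per_s: float) -> float:
    """Raw time on air for the payload alone (protocol overhead ignored)."""
    return payload_bytes * 8 / rate_bits_per_s * 1000

packet = 24  # hypothetical small control message, in bytes

print(round(airtime_ms(packet, 250_000), 3))    # 0.768 ms at 802.15.4's 250 kbit/s
print(round(airtime_ms(packet, 2_000_000), 3))  # 0.096 ms at Bluetooth LE's 2 Mbit/s
```

An eightfold reduction in time on air means the channel is occupied for an eightfold shorter window per message, which directly shrinks the probability that two devices transmit at the same instant in a congested band.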
Bluetooth versus ZigBee
Proprietary wireless lighting control solutions currently dominate the industry. While Bluetooth mesh and ZigBee appear to be the most promising of the current batch of open wireless lighting control protocols, industrywide adoption of either is in no way guaranteed. Each has its own merits but both are still evolving.
The current state of the lighting control industry is similar to that of the building automation industry in the early 2000s during the emergence of the competing BACnet and LonWorks open standards. With the rapid growth of wireless lighting controls, this competition between open standards is only expected to intensify in the upcoming years.