Editor’s Note: Part 1 of this three-part series covered the fundamentals of popular wireless technologies, including key attributes such as frequency bands, network topology support, throughput, range, and coexistence. Part 2 will look at the design fundamentals of each technology such as chip availability, protocol stacks, application software, design tools, antenna requirements, and power consumption. Part 3 will look at new technologies that have emerged to address the needs of the Internet of Things.
Engineers have many choices when it comes to low-power wireless technologies, including RF-based technologies such as Bluetooth low energy, ANT, ZigBee, RF4CE, NFC and Nike+, plus infrared options championed by the Infrared Data Association (IrDA). This article is the second in a three-part series that considers popular wireless technologies’ technical foundations and relative strengths.
In particular, the article will describe hardware, firmware and software development for the various low-power wireless technologies. It will then detail the power efficiency and peak current draw of each protocol and their effect on battery life. A bill-of-material (BOM) cost and target markets analysis will also be provided.
Designing with Bluetooth low energy
Today’s Bluetooth low energy SoCs are almost exclusively based on 2.4 GHz radios, supervised by Arm® Cortex®-M0, M3, M4, and M4F embedded processors with flash and RAM to facilitate stack firmware and application software storage and operation. In conjunction with introducing single-chip hardware, silicon vendors are working to make it much easier for non-expert RF engineers to design wireless products by supplying reference designs, application notes and design tools.
Nonetheless, while a SoC like Texas Instruments’ (TI) CC2640R2F Bluetooth 5 SoC incorporates all the hardware (and firmware) for a complete Bluetooth low energy solution, simply soldering the chip to a printed circuit board and applying power is unlikely to result in a working solution.
Like all RF designs, a functioning system requires additional circuitry in the form of passive components that form matching circuits, a power supply and an antenna (Figure 1).
Figure 1: Application circuit for TI’s CC2640R2F Bluetooth 5 SoC includes matching circuits, crystals and antenna. (Image source: Texas Instruments)
Alternatively, a wide range of third-party companies offer tested and compliant modules built around the latest SoCs. These eliminate the need to design the external circuitry, though at the cost of a higher price and a larger footprint.
The factory-supplied tested and verified Bluetooth low energy stacks take care of the wireless protocol requirements, and there are many official and open-source libraries with proven application code for the most common Bluetooth low energy applications. In addition, a proliferation of vendor and third-party development kits (DKs) and software DKs (SDKs), associated with widespread and user-friendly integrated development environments (IDEs), make designing new applications more straightforward for a competent engineer.
For more information on designing with Bluetooth low energy, see Digi-Key library article “Bluetooth 4.1, 4.2 and 5 Compatible Bluetooth Low Energy SoCs and Tools Meet IoT Challenges (Part 2)”.
What about ZigBee?
In many ways, ZigBee is a similar design proposition to Bluetooth low energy, in part because many of the firms that supply Bluetooth low energy chips, reference designs, and DKs/SDKs do the same for their ZigBee products. SoCs integrating a transceiver, microprocessor, and large-capacity flash memory and RAM are in plentiful supply, eliminating the need to work with separate processor and transceiver development environments. However, designers familiar with a specific processor family can choose to partner a transceiver-only ZigBee chip with their favorite MCU. An example is Microchip's AT86RF232 transceiver, which can be paired with an MCU from the Microchip/Atmel AVR family.
NXP's JN516x series is a good example of the latest SoCs available for ZigBee-based projects. The devices come preprogrammed with the ZigBee PRO stack and feature an embedded 32-bit RISC processor, flash and EEPROM memory. Each SoC also incorporates a 2.4 GHz transceiver and a comprehensive mix of analog and digital peripherals.
Because RF4CE is a customized application of the ZigBee stack, the hardware design is identical to other ZigBee applications. The NXP JN516x, for example, is perfectly capable of supporting an RF4CE application. All the developer needs to do is build their application on the RF4CE firmware instead of the standard ZigBee stack.
Like Bluetooth low energy, ZigBee products require both standards compliance and regulatory certification to carry the standard’s logo.
Is ANT a good design option?
ANT presents a development challenge similar to a Bluetooth low energy or ZigBee project. Two-chip (TI CC2570 plus MSP430F2 MCU), SoC (Nordic Semiconductor nRF52832), and modular (Dynastream Innovations' D52 ANT module) hardware solutions are available. The first two options demand additional peripheral components to form a working device, while the module is a fully tested hardware solution.
The ANT protocol is supplied as a self-contained firmware stack, leaving the developer with only the task of application code development on the software side. Application code development for the CC2570 is done using the chosen MCU’s development tools, while the nRF52832 SoC and D52 ANT module (which is based on the nRF52832) use a software architecture that cleanly separates the protocol stack from the application code, easing the development process. The nRF52832 employs an embedded Arm Cortex-M4F processor. Both TI and Nordic Semiconductor provide ANT DKs and SDKs to support application code development.
One downside of the ANT protocol is that if an existing ANT+ device profile doesn’t suit the application, developing a new one demands a collaborative effort between the protocol developer Dynastream Innovations and the OEM to ensure interoperability. This can extend development time.
How is IrDA implemented?
IrDA provides several versions of its communication protocols depending on the application. The simple protocols demand the services of a modest IR transceiver such as Vishay's TFBS4711. The device is capable of data rates up to 115.2 kbits/s over a range of 1 m. Complexity increases when higher data rates are required, because a heavyweight IrDA protocol stack and the services of a 16-bit microprocessor are needed.
Because IR is a line-of-sight technology, optical considerations are important. For example, plastic IR transmission windows — that ensure IR intensity is within specification — are a key requirement for end products using IR. Standards compliance certification is compulsory and needs to be carried out at an IrDA authorized test lab.
What are the key NFC design challenges?
Perhaps the toughest part of designing an NFC system is getting the antenna right. The antennae of the transmitter and receiver not only need to be coupled to pass data, but also power. That makes the design trickier than transmitting and receiving data alone. The good news is that there are many application notes and reference designs from major suppliers offering guidance on antenna design.
NFC’s firmware is based on a stack of functional layers, from the physical layer to the application software implementation. Above the physical layer (typically comprising the microcontroller and associated infrastructure, communication interfaces, and radio) sit the middle layers, which handle data packet assembly, generation of NFC commands, the logical link control protocol (LLCP), and the Simple NDEF Exchange Protocol (SNEP).
The higher layers comprise NDEF messages and NDEF records, topped off with a user interface (Figure 2). Design follows a process similar to that adopted for technologies such as Bluetooth low energy and ANT+, with developers basing the product on a factory firmware stack and using appropriate development tools to craft their own software tailored for the specific application.
Figure 2: NFC physical layer (in this case a TI TRF7970 NFC chip) and firmware stack interfacing with host microcontroller and NFC or RFID device. (Image source: Texas Instruments)
Because of the low power output, regulatory certification is not compulsory for NFC, but standard radio emission testing must be performed to ensure transmissions are confined to the 13.56 MHz band. However, a standards compliance certification program is required if interoperability is demanded in the end product.
Is Wi-Fi difficult to implement?
Of all the low-power wireless alternatives, Wi-Fi is the most complicated technology for the developer. In particular, the hardware must be designed to tight tolerances to ensure that the specification for radio performance is achieved.
Designing a Wi-Fi solution from the ground up demands a high degree of gigahertz-frequency RF expertise, making the pre-assembled module route popular among developers looking to accelerate time-to-market. Modules are tested, verified, and compliance-certified wireless products that can be quickly incorporated into a Wi-Fi solution.
IEEE 802.11n modules and associated development tools are readily available from several silicon vendors. They typically integrate a WLAN baseband processor and RF transceiver supporting IEEE 802.11 a/b/g/n, power amplifier (PA), clocks, RF switches, filters, passives and power management. Modules are available to run either with a separate microprocessor or from an embedded device. As with most other RF hardware solutions, the developer will likely have to specify either a 2.4 or 5 GHz antenna connected via controlled impedance traces for a fully functioning RF circuit (although some modular solutions even extend to these elements).
The microprocessor will need to run a high-level operating system (OS) such as Linux or Android. Drivers are available from Wi-Fi chip providers, while additional drivers such as those needed for WinCE and a range of real-time OSs are supported through third parties.
Certification for Wi-Fi device operation is not compulsory, but if the device is not certified, the Wi-Fi logo can’t be used. Certification costs are high relative to the other technologies discussed due to the extended testing regime. (For more information on Wi-Fi design, see Digi-Key library article “Compare 2.4 GHz and 5 GHz Wireless LAN in Industrial Applications”.)
The trade-off between protocol reliability and efficiency
Part 1 of this article described how each wireless communication protocol comprises an overhead (for example, packet ID and length, channel and checksum) and the information that’s being communicated (known as the “payload”). The ratio of payload/total packet size determines the protocol efficiency.
A Bluetooth low energy packet is shown (Figure 3). The protocol allows for various payload sizes; this example illustrates the case of the maximum payload. Note that this is the packet structure adhering to Bluetooth version 4.0. There are some subtle packet changes in versions 4.1, 4.2, and 5, but these haven’t significantly altered the protocol efficiency.
Figure 3: Bluetooth low energy packet. With a maximum payload of 248 bits, the protocol efficiency is 76 percent. (Image source: Bluetooth SIG)
The Bluetooth low energy packet in Figure 3 comprises:
- Preamble = 1 octet (8 bits)
- Access Address = 4 octets (32 bits)
- Header = 1 octet (8 bits)
- Payload length = 1 octet (8 bits)
- Payload = 31 octets (248 bits)
- Cyclic redundancy check (CRC) = 3 octets (24 bits)
- Bluetooth low energy protocol efficiency = Payload/Total packet size
- 31/41 = 76%
It’s important that a designer ascertains the protocol efficiencies of the shortlisted technologies because they directly affect the user experience. Low-power wireless solution providers are keen to tout “raw data” transfer speeds, but they can be a little less forthcoming about how much of that data is useful payload.
From a user perspective, a technology with an impressive raw data speed can appear sluggish if the efficiency is low. Worse yet, an inefficient protocol expends lots of energy sending non-useful data – shortening battery life (a critical design parameter for low-power wireless technologies).
However, it’s important to note that there’s a trade-off between reliability and efficiency. For example, a protocol could gain some efficiency by eliminating a checksum or error correction, but that efficiency gain is quickly wiped out if packets must continually be retransmitted because they are often corrupted upon arrival at the receiver.
When comparing shortlisted technologies, the designer is advised to speak to the supplier to ascertain theoretical protocol efficiencies (remembering that even a single technology may come in different versions, each with a unique packet structure), but then test the actual efficiency in several use cases by analyzing transmissions to determine the rate of successful packet transfers. This rate is typically lower than that indicated by the theoretical efficiency, even in optimal radio environments.
What does it cost to manufacture low-power wireless devices?
The main costs associated with a low power wireless sensor are the transceiver and microprocessor (or, if these components are combined, the SoC), antenna, voltage regulator and pc board real estate (Table 1). It is assumed that the cost of the battery, the battery connectors, and the sensors are independent of the wireless technology, so they have been omitted from the comparison table. Also, note that the figures are estimates and can vary considerably from vendor to vendor.
Table 1: The main costs associated with a low power wireless sensor are the transceiver and microprocessor, antenna, voltage regulator and pc board real estate. (Image source: Digi-Key Electronics)
External crystals can contribute significantly to the cost of a low power wireless sensor because high quality devices are often required to meet strict regulatory requirements. The higher the precision of the device, defined as lower parts-per-million (ppm) deviation from nominal frequency, the higher the price.
Typical crystal tolerances (in ppm) are:
- NFC = 500 (the crystal is only required to keep the radio operating in the allocated band, not for data clocking)
- Bluetooth low energy = 250
- Nike+ = 60
- ANT = 50
- ZigBee/RF4CE = 40
Low-power wireless technologies are primarily designed to extend battery life, but there is a considerable difference in the performance of each.
A key use case for each technology is in sensors. In the simplest case, such sensors could measure and transmit readings such as temperature, humidity and pressure. Sensors are typically lightweight and compact, allowing space only for small batteries, for example the 3 volt CR2032 coin cell with a nominal capacity of around 225 mAh.
Calculating battery lifetime in an application is a complex task dependent on the sensor’s hardware. For example, if the sensor includes a status LED, then power consumption will be higher compared with a device carrying no status LED. Similarly, peak transmit and receive currents, time on air, sleep-mode currents and several other factors will affect battery life.
Some suppliers, such as ANT developer Dynastream Innovations, include handy power consumption calculators on their websites, enabling the developer to estimate battery life for a given use scenario.
A good benchmark for the power efficiency of each protocol is to consider the energy required to transmit a single bit. Such a calculation accounts for the fact that a protocol with greater throughput needs to be on air for less time than one with a more modest throughput. (Note that these numbers should be considered as a rough guide; actual energy per bit for a given technology varies markedly between chip suppliers and in different applications.)
Bluetooth low energy
Texas Instruments has published an application note (see reference 1) that calculates average current for Bluetooth low energy in a typical application to be 24 microamps (µA) (using a 3 volt CR2032 coin cell battery).
- Power consumption = 24 μA x 3 volts = 72 microwatts (µW)
- Bits per second (in typical sensor operation) = 960
- Energy per bit = 72 µW/960 bit/s = 75 nanojoules (nJ)/bit
A similar calculation for ANT reveals:
- Power consumption = 61 μA x 3 V = 183 µW
- Bits per second (at typical raw data throughput) = 256
- Energy per bit = 183 µW/256 bit/s = 715 nJ/bit
In transmission mode, a ZigBee device consumes 30 milliamps (mA).
- Power consumption = 30 mA x 3 V = 90 milliwatts (mW)
- Bits per second (at maximum raw data throughput) = 250,000
- Energy per bit = 90 mW/250,000 bit/s = 360 nJ/bit
Measurements of an IR remote control reveal that the device draws 1.948 mA while transmitting at 121 bits/s.
- Power consumption = 1.95 mA x 3 V = 5.85 mW
- Bits per second = 121
- Energy per bit = 5.85 mW/121 bits/s = 48 μJ/bit
Information on Nike+ current draw is sparse, but Nike indicates that in a typical application a CR2032 battery will last around 1000 hours, hence it can be assumed that the average current is 225 mAh/1000 h = 225 µA.
- Power consumption = 225 µA x 3 V= 675 µW
- Bits per second = 272
- Energy per bit = 675 µW/272 bits/s = 2.5 μJ/bit
Wi-Fi was primarily designed with throughput rather than power consumption in mind. The result is a high current consumption even when not transmitting at maximum throughput. However, when the technology is running at full speed, its power efficiency is comparable with the other technologies discussed here.
- Power consumption = 116 mA x 1.8 V = 0.210 W
- Bits per second = 40 million
- Energy per bit = 0.210 W/40 Mbits/s = 5.25 nJ/bit
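The per-technology figures above all follow the same formula, so they can be tabulated in one place. This sketch simply re-uses the current, voltage, and throughput numbers quoted in the text; actual values vary between chip suppliers and applications, as noted earlier:

```python
def energy_per_bit_nj(current_a: float, voltage_v: float, bits_per_s: float) -> float:
    """Energy per bit in nanojoules: (I x V) / throughput."""
    return current_a * voltage_v / bits_per_s * 1e9

# (average current in amps, supply volts, bits per second), as quoted above
scenarios = {
    "Bluetooth low energy": (24e-6,   3.0, 960),       # -> 75 nJ/bit
    "ANT":                  (61e-6,   3.0, 256),       # -> ~715 nJ/bit
    "ZigBee":               (30e-3,   3.0, 250_000),   # -> 360 nJ/bit
    "IrDA remote":          (1.95e-3, 3.0, 121),       # -> ~48 uJ/bit
    "Nike+":                (225e-6,  3.0, 272),       # -> ~2.5 uJ/bit
    "Wi-Fi":                (116e-3,  1.8, 40e6),      # -> ~5.25 nJ/bit
}

for name, (i, v, bps) in scenarios.items():
    print(f"{name:22s}{energy_per_bit_nj(i, v, bps):>12.1f} nJ/bit")
```

Laid out this way, the comparison makes Wi-Fi's full-throughput efficiency, and the heavy per-bit cost of the slow IR and Nike+ links, immediately visible.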
How long will my battery last?
Actual battery life primarily depends on the application, and then to a lesser extent, on the choice of technology. If a device is continuously scanning and transmitting it will consume a high current and quickly drain the battery no matter what wireless technology it employs.
However, most low-power wireless technologies are primarily designed to save energy by running at a very low duty cycle such that they can spend long periods in sleep mode before waking up, sending data, and returning quickly to sleep mode – thus minimizing the time spent running at high current. High-bandwidth technologies can be advantageous because they can spend less time in a high current consumption mode (for example, transmit or receive) to send a given amount of data compared to lower-bandwidth technologies.
Lifetimes for coin cell batteries are best calculated by estimating the average current consumed by the low power wireless sensor for a given application. Manufacturers’ data sheets often detail current consumption and duration of “connection events” such as wake up, connection, transmission/receiving, and return to sleep (Figure 4).
Figure 4: A Bluetooth low energy connection event comprises several current levels and timings. (Image source: Texas Instruments)
By consulting the data sheet for current consumption during sleep periods, and knowing the duty cycle and connection event current draw, designers can determine the average current of the target application (Table 2).
Table 2: Knowledge of connection events and parameters (plus radio sleep current and duty cycle) allows a calculation of the average current in typical applications. (Table source: Digi-Key Electronics)
Sleep-mode currents are usually measured in nanoamps (nA) and sleep periods are long, while connection events usually draw milliamps for short durations. Average current consumption typically comes out in the microamp (µA) range.
For example, consider a sensor equipped with ANT wireless technology. In one sensor application, the average current is measured at 175.5 μA. Battery life from a 225 mAh coin cell battery would therefore be 225 mAh/175.5 µA = 1,282 h, or around 53 days.
In a similar sensor application, Bluetooth low energy technology consumes an average current of 49 μA for a battery life of 225 mAh/49 μA = 4592 hours = 191 days.
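A minimal sketch of the battery-life arithmetic used in the two examples above. The 225 mAh capacity and the average-current figures come straight from the text; note that this simple division ignores the peak-current derating of coin cells:

```python
def battery_life_hours(capacity_mah: float, avg_current_ua: float) -> float:
    """Battery life in hours = capacity / average current.

    Ignores peak-current derating, which can significantly reduce the
    usable capacity of a coin cell (see Table 3 discussion).
    """
    return capacity_mah / (avg_current_ua / 1000.0)

# Average currents measured in similar sensor applications, per the text
for name, i_avg_ua in [("ANT", 175.5), ("Bluetooth low energy", 49.0)]:
    hours = battery_life_hours(225, i_avg_ua)
    print(f"{name}: {hours:.0f} h ({hours / 24:.0f} days)")
    # ANT -> 1282 h (53 days); Bluetooth low energy -> 4592 h (191 days)
```

The same function can be fed the duty-cycle-weighted average current derived from a data sheet's connection event parameters (Table 2) to compare candidate radios before any hardware is built.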
The effect of peak power consumption on battery life
Calculating battery life from the sensor’s average current consumption is a useful guide, but it’s only part of the story. The actual life of a coin cell is not only affected by the magnitude of the average current, it is also detrimentally affected by peak current. Repeated current peaks exceeding 15 mA considerably degrade battery life. For example, peak currents exceeding 30 mA can limit the capacity of a battery to less than 80 percent of the manufacturer’s nominal figure. Attempting to draw even higher currents could permanently damage the coin cell battery. Battery life will typically be better using a technology with a higher average current but lower peak current, compared to one with a lower average current but higher peak currents. It is therefore important to know the peak current draw for the various wireless technologies (Table 3).
Table 3: Low-power wireless technologies’ peak current draw. (Table source: Digi-Key Electronics)
Another reason to select a low-power wireless technology with low peak current is its improved compatibility with energy harvesting technologies, particularly photovoltaic (PV) cells. PV cells exhibit relatively low efficiency when converting ambient light into useful electrical energy. An amorphous solar cell of similar dimensions to a CR2032 (3 cm2) would yield 1.5 volts x 8 μA = 12 μW. With such a small amount of power available, it is critical that the radio features low peak (and average) current demands.
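The 12 µW figure above sets a hard power budget. A quick sanity check converts it into the maximum average current the harvester can sustain at the radio's supply rail; the 3 volt rail is an assumption carried over from the coin-cell examples, and ideal power conversion is assumed:

```python
PV_POWER_UW = 1.5 * 8   # 1.5 V x 8 uA from a ~3 cm^2 amorphous cell = 12 uW
RADIO_SUPPLY_V = 3.0    # assumed coin-cell-class supply rail

# Maximum sustainable average current at the radio supply voltage
# (ideal conversion assumed; a real regulator would lower this further).
max_avg_ua = PV_POWER_UW / RADIO_SUPPLY_V
print(f"Sustainable average current: {max_avg_ua:.1f} uA")  # 4.0 uA
```

At roughly 4 µA of sustainable average current, even the most frugal radios discussed here must run at a very low duty cycle to live within a PV energy budget.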
The low-power wireless technologies described in this article are targeted towards specific market segments, some of which overlap (Table 4). The list is not exhaustive and is likely to expand as the IoT matures.
Key: B (Bluetooth low energy), A (ANT), A+ (ANT+), Zi (ZigBee), RF (RF4CE), Wi (Wi-Fi), Ni (Nike+), Ir (IrDA), NF (NFC).
Table 4: Key low-power applications and the wireless technologies that cater to them. (Table source: Digi-Key Electronics)
A comparison of the various low-power wireless technologies shows that vendors are providing the solution support designers need in terms of firmware and development kits to get to market quickly and effectively. However, designers need to choose their low-power wireless interface with due consideration to its impact on overall power consumption and battery life, as well as the final BOM and the needs of the target markets.
References:
1. “Bluetooth Low Energy Power Consumption”, Sandeep Kamath & Joakim Lindh, Texas Instruments, 2012.