Welcome everyone to the First Hungarian Smarthome Expo. Thanks to the organizers for creating this event. My name is Gergely Kálmán, Procurement Officer at AccessPoint Ltd. Besides procurement I also deal with product development, which forces me to dig quite deep into technical details, too. In my talk, I'll be looking at the data transfer solutions that are fighting each other for the 2.4 GHz range: namely WiFi, Zigbee, and Bluetooth.
And for good measure we're going to compare these against the Z-Wave standard that is getting so popular these days. Let's jump right in. Why 2.4 GHz? Why are so many data transfer technologies cramming into this band? First of all, it's good to know what the ISM band is. The ISM band is the result of an international agreement whose purpose was to keep a frequency band freely usable for wireless systems serving industrial, scientific, and medical needs.
This has changed a lot since then: actual use has in many cases left these fields and carved a way into the private sphere, achieving tremendous success there. So 2.4 GHz is an ISM band, and a special one within that, because it can be used freely literally worldwide.
Maybe with the exception of a single country, North Korea, radios on this frequency can be turned on, operated, and used without any kind of licensing process or license fees. If we can use these products anywhere, their mass production can be realized on a much larger scale, so they can be manufactured much more economically: development and manufacturing costs are divided among more products. Their transportation also gets simpler, because these products don't need to be customs cleared or authorized through special processes just to be shipped into every country. And this generates an obvious advantage when comparing retail prices with those of technologies using a different frequency band.
Let's start with WiFi, one of the most widespread wireless technologies currently working on the 2.4 GHz band. It is presently living its 6th generation, which enables us to transfer data from 1 up to 143 Mbit/s on 2.4 GHz, assuming a 20 MHz channel raster, of course. Everybody knows it works in a star topology, so we're talking about APs and clients.
Clients roam among different APs as they get out of range. And most recently a thing called mesh is around, but don't let yourself be fooled: WiFi mesh is currently limited to the backbone connection between APs, so its use doesn't give the same advantages as the mesh solutions implemented in IoT standards. A disadvantage is the rather wide raster. We know that in the free 2.4 GHz range there are 3 channels,
namely 1, 6, and 11, which don't overlap and don't interfere with each other. We can say that the 2.4 GHz band is used rather wastefully if we look at it from an IoT point of view, and precisely because of this it's much more exposed to interference: a single data slice is realized on such a wide raster that if something jams into it even a little at one or two points, the whole 20 MHz transfer can fail and result in resending. Another downside is that it's expensive. If we observe how the new standards are implemented as they appear, even in products from the same manufacturer, we'll notice that "ac" systems were more expensive than "n" systems, "ac" Wave 2 systems were more expensive than "ac" systems, and "ax" solutions will again arrive more expensive than these.
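The claim that channels 1, 6, and 11 don't overlap falls out of the 2.4 GHz channel plan arithmetic; here's a minimal sketch, assuming 20 MHz-wide channels and the standard channel numbering (channel n centered at 2407 + 5n MHz):

```python
# 2.4 GHz WiFi channel centers: channel n sits at 2407 + 5*n MHz.
def center_mhz(ch: int) -> float:
    return 2407 + 5 * ch

# Two 20 MHz channels overlap when their centers are closer than 20 MHz.
def overlap(ch_a: int, ch_b: int, width_mhz: float = 20.0) -> bool:
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

print(overlap(1, 6))   # centers 25 MHz apart -> False (no overlap)
print(overlap(1, 4))   # centers only 15 MHz apart -> True
```

With 5 MHz channel spacing and 20 MHz channel width, only triples spaced five channel numbers apart (1, 6, 11) fit into the band without touching.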
So the principle that the more advanced the technology, the cheaper it becomes, typical in other areas of computing, doesn't prevail here. The reason is simply that advanced technology in this case really does mean additional computing capacity, more antennae, and the implementation of more complicated systems, and these are obviously cost-increasing factors. Now, the solutions of some IoT manufacturers might seem to contradict this, as they are available quite cheaply, but that can only come from the fact that they use obsolete WiFi technology based on standards 1-2 generations older, which will have a negative effect on the whole WiFi network. Let's take Bluetooth next. Again, a technology that's been with us for a very long time, earlier listed by IEEE under standard no. 802.15.1,
but then transferred from IEEE to the Bluetooth SIG. This technology applies a totally different frequency use called frequency hopping, which means it realizes data transfer by dynamically changing between channels. This initially served security purposes; today it's more about ruling out interference. We're in its 5th generation. What's most important about it, and a feature worth mentioning from an IoT perspective, is that it uses very little energy, we call this BLE, as in "Bluetooth Low Energy", and in spite of its low consumption it can handle relatively large bandwidths. In the latest 5.2 version it goes from 125 kbps all the way up to 2 Mbps,
realizing data transfer in four steps. But if we need more bandwidth than this, it can deliver 24 Mbps in PTP mode. This is a little tricky, actually, as it switches over to 802.11, that is, WiFi PTP mode, for the time we want to transfer at this high bandwidth. With the 5.1 capabilities came the mesh feature, too, which became the other
very important capability from an IoT point of view. At the moment very few manufacturers make use of the possibilities in this, but we can already see examples, like the smart lighting solutions by Yeelight, currently still unavailable in Europe, but I already came across them at the CES conference in the USA. There was a full smart lighting system on demonstration there, based on Bluetooth mesh. Bluetooth has a lot of special capabilities. It is optimized for sound transfer and even for indoor navigation; it can estimate angles of departure and arrival, which is a feat not many current technologies are capable of.
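The four rate steps mentioned a moment ago (125 kbps up to 2 Mbps) correspond to the BLE PHY modes; a small sketch of the mapping, with the rate values as quoted in the talk and a hypothetical helper for picking a mode:

```python
# BLE PHY modes and their over-the-air bit rates, in bits per second.
BLE_PHY_RATES = {
    "LE Coded S=8": 125_000,   # heaviest coding, longest range
    "LE Coded S=2": 500_000,
    "LE 1M":        1_000_000, # the mandatory baseline PHY
    "LE 2M":        2_000_000, # highest rate, shortest range
}

# Illustrative only: pick the fastest PHY not exceeding a rate budget.
def best_phy(max_rate_bps: int) -> str:
    usable = {k: v for k, v in BLE_PHY_RATES.items() if v <= max_rate_bps}
    return max(usable, key=usable.get)

print(best_phy(1_500_000))  # -> "LE 1M"
```

The trade-off is range versus rate: the coded modes spend extra symbols on error correction, which is what buys the low-rate steps their reach.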
Its consumption, as I mentioned, is very low, and sleep modes are also available. The only disadvantage is perhaps that the products are somewhat expensive. If we take the solutions of the aforementioned Yeelight, which count as cheap in every other respect on the market, their Bluetooth solutions are much more expensive than identical lighting solutions using competing technologies. The third technology we're looking at today is Zigbee.
Also listed by IEEE, under standard 802.15.4. This is an open IoT standard, created with the objective of low consumption, low bandwidth, and low cost. These are the main differences from the previously discussed technologies, because this one expressly optimizes for cost besides consumption. And besides all that, it's capable of transmitting 250 kbps.
Mesh structure, essentially without limits. The standard specifies 65,000 nodes, though. We have yet to see a system of this size, but that's not what matters; what matters is that it's practically indefinitely expandable. And what's more important is that there is no hop limit, that is, the mesh that's built is not limited in how many routers it takes for the data to get from point A to point B. With Zigbee 3.0, so-called green power has been added to the picture,
which is more or less equivalent to Bluetooth Ultra Low Energy. The only downside of this technology is that while in specifications prior to 3.0 sub-GHz frequency bands were present, 3.0 doesn't have them, focusing only on the 2.4 GHz band, and that's where the downside of this data transfer technology comes in: having to work in an interference-ridden spectrum. And now some words about Z-Wave, the competitor standard. Here we're also currently in generation 3: there were the 300 and 500 chips, and now the 700 chips.
It works sub-GHz, is capable of transferring 100 kbps, and is capable of ultra-long range, given the low frequency. We can create mesh networks from these devices, but here we're limited to 4 hops max. A big advantage of Z-Wave is association groups, which enable automations we've preprogrammed into actuators or sensors to run even while skipping the hub. A disadvantage, or in some respects even an advantage, is that this is not an open standard. It's a proprietary data transfer solution owned by Silicon Labs, which results in products with much higher prices on the market than Zigbee. A disadvantage we might not yet feel is that the 868 MHz band usable in the EU is merely 2 MHz wide.
That's why Zigbee had only one channel realized here, and there are other technologies fighting for this freely usable sub-GHz band, such as WiFi HaLow, which is about to make its way into this frequency band, and there are also many LoRa and Sigfox networks built in Hungary that use this frequency. All I want to point out with this is that it's not true that there is no such thing as interference on this band. For the sake of simplicity I've created a table to compare the four standards we've just discussed. WiFi bandwidth is quite high, which is on the pro side for it, but in turn its range is short or medium, depending on the number of access points applied to cover the given area.
Its consumption is rather high compared to competing technologies, exemplified well by the fact that really cheap sensor solutions can't be marketed for it, and even those typically only work with an operating time of a year or shorter, while competing solutions can offer 2-3 or even 5 years. Its scalability is comparatively low, mostly because we don't use WiFi for IoT purposes only, but also want it to serve our mobile devices, our television, and our air purifier, too. I didn't put any color on its cost, because, depending on which generation of the WiFi standard is implemented in a product, it can be cheap and expensive alike.

I've obviously described the mesh use of Bluetooth in this table. Its bandwidth is flexible enough, going from 125 kbps to 2 Mbps; its range is relatively high, thanks to a properly implemented mesh; its consumption is the lowest among the ones discussed here, with the BLE ultra-low-power solutions; its scalability is large, also thanks to the mesh; but the cost of Bluetooth mesh solutions is higher than that of Zigbee.

In the case of Zigbee I didn't put low bandwidth down as a negative, because the features of this standard are more than enough for IoT use. Its range is large, thanks to the mesh; its consumption is very low, comparable to Bluetooth's; its scalability is large, with the mesh hop count also unlimited; and its cost is rather low.
If we compare the Z-Wave standard to the former by the same factors, we can state that its bandwidth is also low, and that's also not a negative, because it's enough for what it's meant to be used for. Its range is large, the largest among the technologies discussed here: it can bridge distances of several kilometers even without building a mesh. Its consumption is low, and this is primarily true with the latest 700 chips. Its scalability is large, although not comparable to Zigbee's.
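The practical effect of Z-Wave's 4-hop cap versus Zigbee's unlimited hop count can be illustrated with a simple reachability check over a toy mesh; this is a sketch over a hypothetical chain topology, not a protocol implementation:

```python
from collections import deque

# Toy mesh: a chain of 7 routers, node 0 is the hub.
edges = {i: [i - 1, i + 1] for i in range(1, 6)}
edges[0] = [1]
edges[6] = [5]

def reachable(graph, start, max_hops=None):
    """Set of nodes reachable from start, optionally within a hop limit."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if max_hops is not None and hops == max_hops:
            continue  # out of hop budget, stop expanding here
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return seen

print(len(reachable(edges, 0)))              # 7: the whole chain (Zigbee-style)
print(len(reachable(edges, 0, max_hops=4)))  # 5: a Z-Wave-style 4-hop cap
```

In a real deployment the topology is rarely a straight chain, but the principle is the same: with a hop cap, far corners of a large home can silently fall off the network.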
Its cost, again, due to not being an open standard, is in the higher category. Now to the key part of my talk: how the different standards use the 2.4 GHz spectrum, and how they differ, is what you can see on this slide. The 3 mentioned WiFi channels cover basically the whole free spectrum, while Bluetooth divides this same spectrum into 40 channels. Then here comes Zigbee, which shows 16 channels in the same spectrum. If we want to apply them simultaneously, we might get a chaotic picture at first, but let's see the details.
OK, let's look at how the technologies discussed so far use the 2.4 GHz spectrum. On one hand there's WiFi, which covers the whole spectrum with its 3 channels. It's very important to know that we can only cause (or suffer) interference if and when there is a transmission going on at that moment in time. This is how we should look at the scenario where WiFi and Bluetooth are communicating simultaneously, which looks something like this. That was even a bit too fast, I'll repeat it for you. This is the so-called frequency hopping, which is essentially about breaking the transmission down into slices and sending them forward over different channels between the sending and receiving points. Bluetooth can therefore suffer less interference: while WiFi keeps sending and receiving on the same channel all the time, as long as necessary, Bluetooth dynamically changes channels. So even if it does hit a point where a transmission is going on, and the interference is severe enough to force a resend, the next packet will probably already go through another channel to the receiving party, probably with less chance of encountering interference.
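The hopping behaviour just described can be sketched as a toy adaptive frequency hopping loop. This is a simplification, real Bluetooth uses a pseudo-random channel selection algorithm defined in the Core Specification, and the busy-channel set here is an invented example:

```python
import random

ALL_CHANNELS = list(range(40))   # BLE numbers 40 channels of 2 MHz each
busy = {1, 2, 3, 4, 5}           # assumed: channels overlapping an active WiFi AP

def hop_sequence(n_packets, channel_map, seed=0):
    """Send each packet on a different allowed channel (toy AFH sketch)."""
    rng = random.Random(seed)
    allowed = [ch for ch in ALL_CHANNELS if ch not in channel_map]
    return [rng.choice(allowed) for _ in range(n_packets)]

seq = hop_sequence(8, busy)
assert all(ch not in busy for ch in seq)  # no packet lands on a jammed channel
print(seq)
```

The point of the channel map is exactly what the talk describes: once a channel is known to be jammed, subsequent packets simply avoid it, so a single noisy spot cannot stall the whole link.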
No need to be bothered with resending. In this respect WiFi is at a disadvantage: it uses a relatively wide spectrum, making data transfers of several hundred Mbps possible, but from an IoT use perspective there's no need for that anyway; it just takes up the free frequency band with no benefit. Zigbee, on the other hand, much like WiFi, tries to communicate on the channel we previously set for the network.
If I want to compare WiFi and Zigbee, we can't walk past Zigbee's advantage of using narrower channels, blocks of 2 MHz, compared to the 20 MHz blocks of WiFi. This is the reason why Zigbee's spikes reach higher in the drawing than the trapezoidal blocks of WiFi. The reason for this is the lower in-band thermal noise that comes with the narrower raster.
I won't go into details now, but simply put, the output power is concentrated in a 2 MHz block, compared to the same power being distributed over 20 MHz in the case of WiFi. The increased thermal noise effect can also be observed if we operate our WiFi at 40 MHz instead of 20 MHz: the distance that can be covered decreases further.
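The thermal-noise argument can be made concrete with the standard room-temperature noise floor formula, N = -174 dBm/Hz + 10·log10(BW): the noise power a receiver sees scales with the bandwidth it has to listen to, so a 2 MHz Zigbee channel starts with about 10 dB less in-band noise than a 20 MHz WiFi channel, and a 40 MHz channel loses another 3 dB:

```python
import math

def noise_floor_dbm(bandwidth_hz: float) -> float:
    """Thermal noise power at room temperature: -174 dBm/Hz + 10*log10(BW)."""
    return -174 + 10 * math.log10(bandwidth_hz)

zigbee = noise_floor_dbm(2e6)    # ~ -111 dBm for a 2 MHz channel
wifi20 = noise_floor_dbm(20e6)   # ~ -101 dBm for a 20 MHz channel
wifi40 = noise_floor_dbm(40e6)   # ~  -98 dBm for a 40 MHz channel
print(round(wifi20 - zigbee, 1))  # 10.0 dB head start for the narrow channel
```

That 10 dB is the head start the narrower channel gets before any signal is even transmitted.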
The result of this is a better signal-to-noise ratio between any two points at the same output power when using Zigbee. If we want to operate a Zigbee network on the 2.4 GHz spectrum, it's good to know that there are four Zigbee channels that suffer less from WiFi-generated interference if only channels 1, 6, and 11 are in operation in the given area.
If other WiFi channels are also in use, this advantage disappears. But what to do if we must operate our Zigbee network in an interference-ridden area? Two independent studies have concluded that if we operate a Zigbee and a WiFi network simultaneously in an area, then, as a result of the overlaps, we can expect more interference around the center frequencies while data transfer still remains successful, whereas at the channel edges we'll witness packet loss in a much greater ratio, which is the primary adversary of data transfer. So, if we have to operate amid interference, it's advised to operate our network around the center frequencies.
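Which four Zigbee channels sit in the gaps can be derived from the two channel plans (IEEE 802.15.4 channel k is centered at 2405 + 5·(k-11) MHz, WiFi channel n at 2407 + 5·n MHz); a sketch, treating a Zigbee channel as clear when its 2 MHz band stays outside the 20 MHz WiFi channels:

```python
def zigbee_center(ch):   # 802.15.4 channels 11..26, each 2 MHz wide
    return 2405 + 5 * (ch - 11)

def wifi_center(ch):     # 802.11 channels, ~20 MHz wide
    return 2407 + 5 * ch

def clear_channels(wifi_channels=(1, 6, 11)):
    """Zigbee channels whose 2 MHz band avoids every listed WiFi channel."""
    clear = []
    for z in range(11, 27):
        zc = zigbee_center(z)
        # 11 MHz = WiFi half-width (10) + Zigbee half-width (1)
        if all(abs(zc - wifi_center(w)) > 11 for w in wifi_channels):
            clear.append(z)
    return clear

print(clear_channels())  # [15, 20, 25, 26]
```

Channels 15, 20, and 25 fall into the gaps between WiFi channels 1, 6, and 11, and channel 26 sits above channel 11 entirely, which is exactly the "four quieter channels" mentioned above.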
OK. So, what's wrong with WiFi? Why not use this widespread technology for IoT purposes? On one hand, because we use WiFi for other things, too, and its primary task is to serve our mobile devices. If we just take a 3-5 member family living in a household, we can safely say that there are at least 1 or 2 devices per person present (maybe 3), and as time advances this ratio will increasingly gravitate toward 3 per person.
We surely have a mobile device; besides that we have a device with a larger display that we use to perform some kind of work or study on, a laptop, a tablet, anything; and beyond these there are the wearable devices that, in order to achieve greater data throughput, sometimes also connect to WiFi. And these are just the mobile devices. Beyond this, we have household appliances that will also use WiFi, be it a smart vacuum cleaner, an air purifier, a humidifier, or even smart ovens, smart fridges, smart dishwashers, and washing machines. Most probably there's also going to be a printer in the family, and let's not forget about smart speakers, which also want WiFi because they have bigger bandwidth demands than an IoT solution. If we put these together, we easily start off with 25-35 devices, and we haven't even taken into account the basic infrastructure of the flat or house.
If we take it as a given that we need at least 2-4 sensors per room, and there can also be at least 2-4 actuators as well, this obviously points in the direction that we should offload everything we can from WiFi, use a data transfer technology optimized specifically for this purpose, and possibly keep this basic infrastructure independent from the WiFi, which then rather serves a secondary role. OK, but what's the problem with WiFi and IoT use regarding its operation? By design, WiFi demands continuous communication. The fact that we aren't necessarily moving useful data on our WiFi network doesn't mean our devices are not in continuous communication with the network, even all the time. The reason for this is that WiFi, in order to maintain its operation, uses what we call management frames, which results in APs almost continuously broadcasting the management information necessary to maintain network operability.
Such are, for example, the beacons responsible for broadcasting SSIDs. But this is also how they communicate available signal strengths, the channel on which a certain data transfer is realized, and a lot more. This puts a default burden on the network without any useful data ever being transmitted. Besides that, the devices on the network, be they laptops, tablets, speakers, or anything we want to control via an application, broadcast certain information so that when we want to find them, they are available. And these broadcast messages are sent to every member of the network.
And since their goal is to get everywhere, the WiFi network communicates them at the lowest possible modulation. With many APs this results in an AP spending most of its time communicating to the clients at 1 or 6 Mbps, when it could be transmitting useful information at hundreds of Mbps. The next illustration, from Ubiquiti, shows that 400 kbit/s worth of broadcast traffic on a 5-AP system makes up 80% of the access points' airtime. The useful data transfer above that (shown as Unicast here) is only spread between 6 and 20 Mbps,
yet it keeps the access points completely busy. This goes to show that airtime is a very important factor to consider when talking about WiFi networks, because we typically tend to define WiFi only by its bandwidth, which is always a maximum bandwidth defined for ideal conditions, and real life is usually different from that. This is how a WiFi meant for even gigabit transfers, with many clients, might not be able to deliver results above 20 Mbit/s.
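Why a mere 400 kbit/s of broadcast can dominate airtime becomes clearer with a back-of-the-envelope calculation: broadcast frames go out at the lowest basic rate, so each broadcast bit occupies far more air than a unicast bit. A rough sketch with illustrative numbers only (per-frame preamble and contention overhead, which push the real figure higher, are ignored here):

```python
def airtime_fraction(traffic_bps: float, phy_rate_bps: float) -> float:
    """Fraction of one second the medium is busy carrying this traffic."""
    return traffic_bps / phy_rate_bps

# 400 kbit/s of broadcast, forced down to the 1 Mbps basic rate:
bcast = airtime_fraction(400_000, 1_000_000)     # 0.40 -> 40% of the airtime
# The same 400 kbit/s carried as unicast at a 144 Mbps data rate:
ucast = airtime_fraction(400_000, 144_000_000)   # ~0.003 -> well under 1%
print(f"{bcast:.0%} vs {ucast:.2%}")
```

Add per-frame overhead and the fact that every AP in the system repeats these broadcasts, and the 80% figure from the slide stops looking surprising.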
Pros will say: well, I'll organize my IoT devices into a separate SSID group and block broadcast traffic there. But think about it: when we install a new SSID, our management traffic grows again, since the management and control frames stipulated by the WiFi standard must be communicated on it as well. And this brings about a serious decrease in capacity again; we have to allocate airtime again. Just think about this: if we have an internal network, a guest network, and also an IoT network, the access point has to communicate the management and control frames to the clients three times over. And if we approach it from the IoT side, we can clearly see that we've added some 50-100 clients of load to a network we meant to use for other things, too. I've added this video just to let you see how many services are broadcast by the clients in my own network, whether they're phones, sensors, hubs, speakers, or just a smart TV.
OK, you'll say, wait for WiFi 6 and 6E! That'll solve our problems; after all, it's optimized for IoT, too, isn't it? I have to say, first, that we don't know what price tag IoT-optimized WiFi 6 solutions will hit the market with, but we can say that new WiFi standards have always arrived on the market more expensive, as they always used more complicated technology, and we can expect this to be no different with WiFi 6. The advantage WiFi 6 brings over earlier technologies is primarily OFDMA, which means a client can make do with 2 MHz (sounds familiar?) to communicate, as opposed to the earlier 20 MHz, to serve the small amount of data of one single client. And also target wake time, which enables WiFi sensors to last not just a year or less with users, but longer. But regardless of this, the disadvantages posed by wanting to use the WiFi network for everything will not go away.
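The "2 MHz per client" figure matches the smallest OFDMA resource unit in WiFi 6: 26 subcarriers at 78.125 kHz spacing. A quick check of the standard RU sizes:

```python
# WiFi 6 (802.11ax) OFDMA uses a subcarrier spacing of 78.125 kHz.
SUBCARRIER_HZ = 78_125

def ru_width_mhz(tones: int) -> float:
    """Occupied width of a resource unit with the given tone count."""
    return tones * SUBCARRIER_HZ / 1e6

for tones in (26, 52, 106, 242):   # standard RU sizes within a 20 MHz channel
    print(tones, "tones ->", round(ru_width_mhz(tones), 2), "MHz")
# 26 tones -> 2.03 MHz: the roughly 2 MHz slice one small client needs
```

So instead of one sensor monopolizing a whole 20 MHz channel for its few bytes, the AP can schedule several such clients side by side within the same channel.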
So if somebody's about to build a smart home, they'd better take a turn toward Zigbee or Z-Wave. I don't mention Bluetooth only because there are currently not many products widely available on the market relying on Bluetooth mesh. Now, interference. What should we do if we want to use Zigbee,
since, as we saw, it's the cheapest technology, a lot cheaper than any of the ones we've mentioned, except of course for obsolete WiFi solutions, which aren't viable from an IoT perspective? Just like a wired network, we have to design this, too. The network topology, just as with wired systems, must be properly designed. We must know where our routers are, where our end devices are, and which routers those end devices will connect to, and we should build our network so that it's realized accordingly. It's very important to place WiFi APs and Zigbee coordinators at proper distances from each other.
Separation distances of 1-1.5 m can mean 10-15 dB of difference in noise. Another important thing: just as when we install a WiFi network these days, first let's measure the interference present at the location and choose a channel accordingly. It's very important to do more than a simple WiFi survey with just a phone; use a real spectrum analyzer, which doesn't only differentiate WiFi channels but is also capable of showing other devices using the 2.4 GHz frequency.
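The 10-15 dB figure for 1-1.5 m of separation is consistent with free-space path loss at 2.4 GHz; a sketch using the standard FSPL formula, with the 30 cm starting distance being an assumed baseline for illustration:

```python
import math

def fspl_db(distance_m: float, freq_mhz: float = 2400.0) -> float:
    """Free-space path loss: 20*log10(d_m) + 20*log10(f_MHz) - 27.55."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Moving a Zigbee coordinator from 30 cm to 1.5 m away from a WiFi AP:
delta = fspl_db(1.5) - fspl_db(0.3)
print(round(delta, 1), "dB of extra attenuation")  # ~14 dB
```

Because path loss grows with the logarithm of distance, the first meter or so of separation is where the cheapest decibels are won; beyond that, each doubling of distance only buys another 6 dB.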
You don't have to think of gadgets costing several hundred dollars; I might give you some tips later. It's also very important that at the site where we want to operate our Zigbee network, we don't overdrive the transmit power of our WiFi access points, since that will greatly affect the operation of our Zigbee network. Let's observe the maximum output power of 100 mW, which, in the case of several APs, can even be lowered. And, most importantly, this is where we can leverage the advantage of Zigbee, the mesh topology: the more nodes we use to build a network, the safer data transfer becomes thanks to the multiple connections, and the more reliable our smart home will be. Let's look at two examples of what channels we can use in different systems. Philips Hue is quite widespread, so it was awarded a slide.
There are currently four channels we can use here: channels 11, 15, 20, and 25. We can see that Philips engineers have also chosen to place the preferred channels into the gaps, which will expectedly suffer less interference. The Homey hub from Athom raises the stakes even higher and supports a total of 11 channels (an important note: from firmware version 5 up), so we get an even better chance at operating a Zigbee network in a noisy environment. When we were talking about standards, we mentioned the effects they have on each other, but we haven't covered what is best applied to what.
WiFi has its focus on mobile devices: let's keep using it for that. Other IT devices such as printers, security cams, baby cams, smart speakers, and TVs, as well as household appliances like air purifiers, air conditioners, or vacuum cleaners, can also be served with this standard. Since mesh solutions are not yet widely available, Bluetooth is best used for audio devices: earbuds, headphones, and speakers. Indoor geolocation is also a forte of Bluetooth; we can use it with smart locks, where it's very important which direction we're coming from, because the lock can open or close based on this information. And I wouldn't want to miss mentioning tags, which can be used to sense the presence of objects we put them onto. Zigbee's primary design principle was the basic infrastructure of smart homes: use it for heating, cooling, valve control, thermostats, luminaires, lighting control, shading control, circuit control, and last but not least sensors, where it has a great edge over the other standards. And Z-Wave is for the very same things as Zigbee.
Their strengths are the same; it's just more expensive. And now some words about the strategy represented by AccessPoint Ltd. Our concept is about being able to make optimal decisions when choosing components, based on either their data transfer abilities or their cost implications. This is why we've cast our vote for Athom's Homey smart home suite: it includes a lot of interfaces, Z-Wave, Zigbee, WiFi, Bluetooth, infrared, not to mention an 868 MHz radio, too, which makes this hub the Swiss Army knife of smart homes. This is demonstrated best by the fact that compatibility with the brands seen on this slide was developed and provided to users by Athom itself. But many manufacturers have taken compatibility development into their own hands, so if a smart home developer wants to answer client demands with this hub, they can rest assured and rely on the manufacturers on these slides. Beyond these official apps, we are aware that other manufacturers, too, are tackling compatibility development in-house, but at the moment those only contribute to the ecosystem through external developers.
The compatibility list is a lot longer than the manufacturers listed here, but these are the ones a smart home developer can rely on in the long term. Thank you for your attention; my name is Gergely Kálmán, and if you have questions, feel free to contact me via my contact details.
2021-03-01