Line of sight is a must

Line of sight between mobile beacons (tags) and the stationary beacons (anchors) serving them is a must. It is the most critical requirement for any precise indoor positioning system. If a direct line of sight is provided, it doesn’t matter what materials are around (metal, wood, glass, or nothing), and it doesn’t matter whether there are reflections from those surfaces or not. The surroundings simply do not affect tracking, and the precise indoor positioning system performs at its best.
 
However, the opposite also holds true: if there is no guaranteed line of sight, all kinds of issues appear: no tracking, jumps, lower accuracy, higher latency, etc. The system tries to provide the best possible performance, but if there is not enough data to determine the location confidently, then good tracking is simply physically impossible, and all the issues mentioned above arise.
 

We have addressed the paramount importance of direct line of sight several times already.

 
But it looks like we will have to stress it again and again because it is still a source of confusion, misunderstanding, and some rosy expectations.
 
Some systems claim they can track precisely in non-line-of-sight configurations. When such claims are combined with an insufficient understanding of the underlying technologies, they may lead to bitter disappointment with the tracking results of indoor positioning systems. Our recommendation is very simple:

Always build indoor positioning systems with line of sight between mobile and serving stationary beacons

Why line of sight is the most important requirement

Think about the direct line of sight requirement as you would about GPS: if there is no direct visibility from a GPS tracker to the GPS satellites, you will have no tracking, full stop.

Very often, the situation is even more complex. Suppose you don’t have a direct line of sight between the mobile and stationary beacons, but there is still non-direct visibility – a reflection or echo – for example, due to a strong reflection from a neighboring wall, a corner, or a reflecting object, and the length difference between the two paths is small. In that case, it is very difficult, and often impossible, for the positioning system to detect that something is wrong: that the path is not a direct line of sight, that the measured distances between the stationary and mobile beacons are not correct, and that the resulting coordinates are not right.

The only good solution to the problem is to design the system properly right from the beginning and provide line of sight between the mobile beacons and the stationary beacons serving them. Then, there is no need to fight the problem with complex and unreliable workarounds, because you won’t have the problem at all.

One more time about the underlying physics and technologies

Precise positioning systems use time of flight and a multilateration approach (a small illustrative sketch follows the list below):

  • GPS – time of flight of radio waves (1.1-1.6GHz) from 4 or more satellites. Time-synchronized atomic clocks
  • UWB – time of flight of radio waves (3-10GHz). Time synchronization over radio or Ethernet or no need for synchronization
  • Marvelmind Indoor “GPS” – time of flight of ultrasound. Time synchronization over radio in license-free ISM bands (868/915MHz mostly)
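To make the trilateration idea concrete, here is a minimal illustrative sketch (not Marvelmind code): given known anchor coordinates and distances obtained as time of flight multiplied by the propagation speed, the position is found as the point that best fits all measured ranges.

```python
import numpy as np

def trilaterate_2d(anchors, distances, iterations=20):
    """Estimate a 2D position from ranges to known anchors
    using simple Gauss-Newton least squares (illustrative sketch)."""
    pos = np.mean(anchors, axis=0)            # start from the anchors' centroid
    for _ in range(iterations):
        vectors = pos - anchors               # vectors from anchors to the estimate
        ranges = np.linalg.norm(vectors, axis=1)
        residuals = ranges - distances        # mismatch of each measured range
        jacobian = vectors / ranges[:, None]  # unit vectors = range gradients
        step, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        pos = pos - step
    return pos

# Three anchors at known coordinates (meters); distances = time of flight * speed
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])
distances = np.array([5.0, 8.06, 5.0])        # consistent with a tag at (3, 4)
print(trilaterate_2d(anchors, distances))     # -> approximately [3.0, 4.0]
```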

RSSI-based systems

Other technologies that rely on received radio signal strength (RSSI) – BLE, WiFi, ZigBee, LoRa – are imprecise by definition, because indoor multipath changes the RSSI so drastically that only very sophisticated algorithms make it possible to estimate the location at all. Still, the underlying physical principle – measuring RSSI and deriving the distance from it – results in much poorer performance than time-of-flight-based systems, which do not depend on the strength of the signal, because signal strength can fluctuate far too much.
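A minimal sketch of why that is (the path-loss exponent and reference power below are assumed values for illustration, not measured data): inverting a log-distance path-loss model shows that a few dB of fading moves the distance estimate by tens of percent.

```python
def rssi_to_distance(rssi_dbm, p0_dbm=-45.0, n=2.5):
    """Invert the log-distance path-loss model:
    RSSI = P0 - 10*n*log10(d)  =>  d = 10**((P0 - RSSI) / (10*n))."""
    return 10 ** ((p0_dbm - rssi_dbm) / (10.0 * n))

nominal_rssi = -70.0                        # nominal reading, ~10 m in this model
for fading_db in (0.0, +5.0, -5.0):         # typical indoor multipath swings
    d = rssi_to_distance(nominal_rssi + fading_db)
    print(f"RSSI {nominal_rssi + fading_db:6.1f} dBm -> distance ~{d:5.1f} m")
# Output: ~10.0 m, ~15.8 m, ~6.3 m: a +/-5 dB swing bends the estimate badly.
```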

Triangulation-based systems

Triangulation-based systems are not the same as trilateration-based systems.

Triangulation – a localization method based on the intersection of bearing lines (lines drawn at measured angles from known points). It requires precise knowledge of angles and doesn’t require knowledge of distances. Of course, the coordinates of the points from which the bearing lines are taken must be known.
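For illustration only (not any particular vendor’s algorithm), here is a minimal 2D sketch: two observation points with known coordinates each measure a bearing angle towards the target, and the position is the intersection of the two bearing lines.

```python
import math

def triangulate_2d(p1, angle1_rad, p2, angle2_rad):
    """Intersect two bearing lines (angles counter-clockwise from the x-axis)
    taken from the known points p1 and p2."""
    d1 = (math.cos(angle1_rad), math.sin(angle1_rad))
    d2 = (math.cos(angle2_rad), math.sin(angle2_rad))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("Bearings are parallel: no unique intersection")
    # Solve p1 + t*d1 = p2 + s*d2 for t using the 2D cross product
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two observers 10 m apart, both measuring bearings towards a target at (5, 5)
print(triangulate_2d((0, 0), math.radians(45), (10, 0), math.radians(135)))
```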

In today’s practice, these systems are typically scanning laser-based systems.

To some extent, optical systems that calculate mobile objects’ positions using cameras separated by a known base are also triangulation systems. By the way, our binocular vision – our eyes – is a typical biangulation system.

Triangulation systems can be relatively precise, particularly laser-based ones with a proper base between the reflecting markers. However, their positioning accuracy depends on the distance, typically more so than for trilateration-based systems. To simplify: triangulation systems measure position very well in the vicinity and with a favorable base between the cameras or reflecting markers, but they become notably less precise at a distance.

The explanation for triangulation-based systems’ relatively lower accuracy compared with trilateration-based systems is very simple: we can measure angles with less accuracy than we can measure time.
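A purely illustrative comparison (the 0.1-degree angular accuracy is an assumption; the 30 µs timing figure comes from the ultrasound calculation later in this article): an angular error turns into a position error that grows with distance, whereas a timing error corresponds to a fixed distance error at any range.

```python
import math

ANGLE_ERROR_DEG = 0.1      # assumed achievable angular accuracy
TIMING_ERROR_S = 30e-6     # ~30 us, enough for ~1 cm with ultrasound
SPEED_OF_SOUND = 340.0     # m/s

for distance_m in (5.0, 20.0, 50.0):
    angle_err_cm = distance_m * math.tan(math.radians(ANGLE_ERROR_DEG)) * 100
    time_err_cm = TIMING_ERROR_S * SPEED_OF_SOUND * 100
    print(f"at {distance_m:4.0f} m: angle-based error ~{angle_err_cm:4.1f} cm, "
          f"time-based error ~{time_err_cm:3.1f} cm")
# The angle-based error grows from ~0.9 cm to ~8.7 cm; the time-based one stays ~1 cm.
```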

For optical systems using cameras, the resolution of the cameras is a limitation. Thus, one has to trade off between:

  • Cost
  • Pixel resolution directly affects the angular accuracy and location accuracy. Higher pixel resolution – higher angular resolution, but lower light sensitivity (smaller pixels) and lower update rate (more latency)
  • Angular beam width (field of view) impacts how many cameras are required to cover the same area. A narrower beam, i.e., higher zoom, gives better angular resolution, but more cameras are needed, and, even worse, the complexity of matching the data between the cameras grows significantly. As a result – higher cost (see the small sketch after this list)
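A small numeric sketch of this trade-off (the field-of-view, resolution, and distance figures below are assumptions chosen purely for illustration):

```python
import math

def camera_tradeoff(fov_deg, horizontal_pixels, distance_m, area_angle_deg=180.0):
    """Per-pixel angular resolution, the position error it implies at a given
    distance, and roughly how many cameras are needed to cover a given angle."""
    deg_per_pixel = fov_deg / horizontal_pixels
    error_m = distance_m * math.tan(math.radians(deg_per_pixel))
    cameras_needed = math.ceil(area_angle_deg / fov_deg)
    return deg_per_pixel, error_m, cameras_needed

for fov in (90.0, 30.0):                   # wide lens vs. "zoomed" lens
    dpp, err, n = camera_tradeoff(fov, 1920, distance_m=10.0)
    print(f"FOV {fov:4.0f} deg: {dpp:.4f} deg/px, ~{err * 1000:.1f} mm at 10 m, "
          f"{n} cameras to cover 180 deg")
```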

But why is direct line of sight a must?

Assumption 1:

Trilateration-based systems use the time of flight of something (radio, light, ultrasound) to measure the distance. They do not measure the distance directly; they measure time, simply because we are capable of measuring time very precisely: picoseconds, if needed. Nanoseconds are pretty comfortable. Microseconds are very easy.

Thus, we must have precise clocks, such as atomic clocks or synchronized clocks, so that the clock error doesn’t affect the expected accuracy.

For radio- or light-based systems, suppose you want to achieve 10 cm accuracy:

  • 1e-1 / 3e8 ≈ 0.33 ns – a pretty strict requirement, but doable

For an ultrasound-based system, suppose you want to achieve even better – 1 cm accuracy:

  • 1e-2 / 340 ≈ 30 µs – a very relaxed requirement; for the same target accuracy, it is roughly a million times easier (the ratio of the speed of light to the speed of sound) to make precise enough clocks for ultrasound-based systems than for systems based on electromagnetic waves (radio or light). Both calculations are reproduced in the short sketch below
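The same arithmetic in a couple of lines, reproducing the figures above:

```python
SPEED_OF_LIGHT = 3e8      # m/s, radio and light
SPEED_OF_SOUND = 340.0    # m/s, ultrasound in air (approximate)

def required_timing_accuracy(distance_accuracy_m, propagation_speed_mps):
    """Timing accuracy needed to resolve a given distance accuracy."""
    return distance_accuracy_m / propagation_speed_mps

print(required_timing_accuracy(0.10, SPEED_OF_LIGHT))   # ~3.3e-10 s = ~0.33 ns
print(required_timing_accuracy(0.01, SPEED_OF_SOUND))   # ~2.9e-05 s = ~30 us
```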

Assumption 2:

Trilateration-based systems assume that the propagation speed is a) known and b) the same over the whole propagation path. With these two assumptions, it is possible to measure the time very precisely and then multiply it by the precisely known propagation speed (the speed of light or the speed of ultrasound).

And that is the biggest problem with non-line of sight. When you don’t have a line of sight, your signal, for example, the UWB radio signal, propagates through materials with unknown properties (concrete, wet wood, glass, etc.). But the propagation speed of radio waves through these materials differs from their speed in vacuum or air! It may differ substantially – by several times, i.e., a radio wave can travel, for example, 2–3 times slower than in a vacuum. A 30 cm path that takes 1 ns in vacuum may take about 3 ns through some ceramics, which the system interprets as roughly 1 m. A huge difference!
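A small illustrative sketch of the effect (the wall thickness and relative permittivity are rough assumed values, not measured data): the extra delay a wall adds becomes extra apparent distance, because the system assumes free-space speed.

```python
SPEED_OF_LIGHT = 3e8  # m/s in vacuum / air (approximately)

def ranging_error_through_wall(wall_thickness_m, relative_permittivity):
    """Extra apparent distance added by a wall that slows the radio wave down.
    Inside the material the wave travels at roughly c / sqrt(eps_r)."""
    slowing_factor = relative_permittivity ** 0.5
    extra_delay_s = wall_thickness_m * (slowing_factor - 1) / SPEED_OF_LIGHT
    return extra_delay_s * SPEED_OF_LIGHT   # the system converts delay at speed c

# A 20 cm concrete wall (eps_r ~ 6 is a rough, assumed value)
error_m = ranging_error_through_wall(0.2, 6.0)
print(f"~{error_m * 100:.0f} cm of extra 'distance' from a single wall")
```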

The problem is even worse. It is technically very challenging to account for those delays and compensate for them somehow. To our knowledge, nobody does it, though in principle it would be possible. Worse still, the properties of real materials, i.e., the speed of radio waves through them, are poorly known even in lab conditions. In real-life conditions, they are not known at all (thickness, types of materials, propagation direction and the resulting path length through the material, etc.).

Thus, the only practical solution is to build the system with line of sight between the stationary and mobile beacons and avoid the problem altogether.

Non-line of sight indoor positioning systems do not exist

It is simply physically impossible to determine a mobile beacon’s location in a true non-line-of-sight situation. “Non-line of sight” shall be read as “non-line of sight for visible light,” i.e., a human can’t see the mobile beacon, but the system still does. With that caveat, NOLS systems can exist, for example, UWB ones. But they still rely on line of sight for the UWB radio signal.

Casually referring to UWB as a NOLS system leads to bitter disappointment for end users when they realize that only thin lab walls are reasonably transparent to the UWB signal. Real-life concrete and brick walls either block the signal or produce such errors in distance measurements that the tracking system becomes highly unreliable in practice.

The only recommended and working solution is to always build any precise positioning and tracking system with the line-of-sight requirement in mind.

P.S. Even the human body creates non-line of sight for UWB. The recommendation is to place UWB mobile beacons (tags) on the shoulder, on top of the helmet, or, in the worst case, on the wrist. Don’t put the tag in a chest pocket: the body will block the UWB signal coming from behind, with a very bad impact on performance.

Problems with non-line of sight (NOLS)

It should be evident by now that a direct line of sight is a must for a precise indoor positioning system. However, let us add what else goes wrong when there is no direct line of sight (NOLS situation).

Nearby reflective walls combined with NOLS

As mentioned above, when there is a non-line of sight situation (NOLS), the system may be designed and tuned to detect such events and filter such measurements out, or, at least, it can warn users or other elements of the system that the results cannot be trusted. It can even employ some algorithms to calculate the distance or location differently or adjust calculations. Something could be done, at least. It is complex and not guaranteed, but still – it is something.

Such warning or filtering works reasonably well when there is a large enough difference between the expected LOS distance and the measured distance. The system will have sufficient trustable data to conclude that something doesn’t add up and then warn the user or filter out the measurement completely.

When the difference between the measured distance and the expected distance is small, the system has to choose between a) a simply badly measured distance, b) a NOLS measurement that must be filtered out, and c) the mobile object really having made a very unexpected twist, so the distance is fine. Choosing is difficult, because many assumptions must be made, and there is always a trade-off between false positives and false negatives.
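A purely illustrative sketch of such gating (the thresholds are assumptions, not product parameters): compare the new measurement against the distance predicted from the last trusted position and either accept it, flag it, or reject it.

```python
def gate_distance(measured_m, predicted_m, warn_threshold_m=0.3, reject_threshold_m=1.0):
    """Classify a range measurement by its residual against the predicted distance.
    Small residual: accept. Medium: accept with a warning. Large: likely NOLS, reject.
    The thresholds trade false positives against false negatives."""
    residual = abs(measured_m - predicted_m)
    if residual < warn_threshold_m:
        return "accept"
    if residual < reject_threshold_m:
        return "warn"    # a bad measurement or a genuinely unexpected maneuver
    return "reject"      # too far off: most likely NOLS or a strong reflection

print(gate_distance(5.1, 5.0))   # accept
print(gate_distance(5.6, 5.0))   # warn
print(gate_distance(8.0, 5.0))   # reject
```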

Again, the only robust solution to the problem is not to have it at all—avoid NOLS as such and build systems with line of sight only.

Static or slow-moving mobile objects are even more problematic with NOLS

When there is NOLS, and the object is moving relatively fast, it is possible to employ several techniques to predict an expected location. Based on prediction, it is possible to extrapolate or filter out wrong measurements.

Fast-moving mobile objects with NOLS produce jumping distances. The fact that the distances are jumping, combined with prior knowledge of the mobile objects (forklifts don’t jump), makes it relatively easy to filter out wrong distances.
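A minimal sketch of that idea (the maximum speed is an assumed example value, not a product setting): if the distance change between consecutive updates implies a speed the object physically cannot reach, the measurement is discarded.

```python
def is_plausible(prev_distance_m, new_distance_m, dt_s, max_speed_mps=5.0):
    """Reject range jumps that would require impossibly fast movement
    (forklifts don't jump)."""
    implied_speed_mps = abs(new_distance_m - prev_distance_m) / dt_s
    return implied_speed_mps <= max_speed_mps

print(is_plausible(10.0, 10.3, dt_s=0.125))   # True: ~2.4 m/s, plausible
print(is_plausible(10.0, 14.0, dt_s=0.125))   # False: 32 m/s, filter it out
```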

However, when objects are static or slow-moving, the system repeatedly returns a measured distance with little variation between the measurements, i.e., no jumps. The distances will have a low sigma, i.e., they will look as if the results can be trusted. However, the repeated measurements can still be incorrect and cannot be trusted, and that is not easy to detect. Thus, without other sources of information (sensor fusion), it is difficult to warn the users or filter out wrong static measurements. The system will consistently and confidently locate the mobile object in the wrong place.

Slow update rate

When the update rate is high, latency is low, and one has many measurements per second. Thus, it is possible to accumulate some measurements, make decisions and filter out the bad ones, or conclude that everything is bad and filter out everything and still keep the latency relatively low.

However, a high update rate also means high power consumption, so the update rate is often reduced to save power. With a low update rate, the system can no longer accumulate and filter data in the same manner without making the latency unacceptably high.

The worst: slow update rate and fast-moving mobile objects

The scenario of a slow update rate combined with a fast-moving mobile object is difficult even in good LOS conditions, because the object may make turns, loops, etc., and the slow update rate doesn’t allow the system to notice a loop or a turn, since they happen between the location updates.

With NOLS and the attempts to separate wrong measurements from good ones, the task becomes nearly impossible, because even LOS tracking looks pretty chaotic when the object is moving with turns and loops. With NOLS, the chaos of the movements is multiplied by the chaos due to NOLS. There is no pattern, there is no trail, and tracking simply becomes impossible even with pretty advanced additional methods.

The only real solution for all these troubles is to provide the line of sight between the mobile beacon and two or more stationary beacons within 30m for 2D and three or more stationary beacons for 3D.

How to solve non-line of sight problems

Place the stationary beacons properly

Follow the Placement Manual and place the stationary beacons properly, thus providing the most confident coverage with the smallest number of beacons.

Clearly understand what sees what

What is the line of sight? Between what and what? Between the stationary beacons and the mobile beacons. OK, but which parts of the stationary beacons and which parts of the mobile beacons?

The answer is not very simple, actually:

  • In NIA and MF NIA, the mobile beacon is emitting. Therefore, there must be a direct line of sight between the emitting transducers of the mobile beacons and the receiving sensor of the stationary beacon
  • In IA, the stationary beacons are emitting, and the mobile beacons are receiving
  • NIA/MF NIA and IA are not always symmetrical from the line of sight requirements point of view. Check carefully
  • The beacon’s own antenna can create non-line of sight for the microphone, because it casts a small shadow right in front of it. Usually it doesn’t, because the antenna is relatively small and not that close after all, but keep it in mind. It is not an obstacle for the emitting transducers

When stationary Super-Beacons automatically build the map of beacons, they act as stationary and as mobile at different times

One not-so-obvious but rather typical problem happens when one stationary beacon ends up behind another, for example, when both are placed along the same wall but at different distances from it.

When the map of stationary beacons is being built by Super-Beacons, during one time slot one of the beacons acts as mobile and another as stationary, and during the next time slot they switch roles. As a result, they must have a direct line of sight to each other during this process.

If the beacons are side by side on the wall, it is already challenging because of the microphone sensitivity diagram. See the detailed video about that.

But when one of the beacons sits behind the others, the angle from that beacon to the neighboring beacon becomes negative, and a NOLS situation occurs! Yes, the beacons’ bodies “see” each other, but the microphone doesn’t see the transducer, i.e., it is clearly a NOLS situation.

The solutions:

  • Avoid such situations
  • Turn the beacons slightly towards each other so that the ultrasound transmitters won’t be behind the microphone
  • Enter the distances/coordinates manually

If there is clearly a NOLS situation in your setup – for example, a column in the center of a room whose shadow creates NOLS for a robot behind it – just create fully overlapping submaps:

  • Place two stationary beacons on the left wall to build a service zone for the whole room
  • Place the other two beacons on the opposite wall and build another service zone – to create fully overlapping service zones

Since the coverage will be provided from both directions, the mobile beacon will see at any given moment two pairs of beacons or, at least, one pair of beacons and, with high confidence, will track in either of the submaps or in both.

If you choose IA with different ultrasound frequencies, for example, 19+25 kHz and 31+37 kHz, then the system will get the location data from the left and the right submaps in the same time slot, i.e., with each location update.

If you choose the same ultrasound frequencies, for example, 25+31kHz and 25+31kHz, then you must choose TDMA=2, i.e. two time slots. In the first time slot the system will determine location using the left submap. In the second time slot – from the right submap.

When there is LOS, you will have good tracking, but provided by different submaps.

When there is NOLS for one of the submaps, there will be no tracking during that submap’s time slot and good, normal tracking in the other time slot. Effectively, it looks as if you had manually reduced the location update rate by a factor of 2.

That is the good case. But there can also be bad cases, when the tracking looks plausible both to the left submap (which is making a mistake due to NOLS) and to the right submap (which is tracking correctly with LOS). In such a case, you may get a split track – different locations from the left and right submaps in different time slots. It is a pretty rare situation in practice, but it is possible. Usually, some other source of information has to be employed to decide which submap is right and filter the bad locations out.

The same logic can be extended further to more than two fully overlapping submaps if there are more potential obstacles. Remember, though, that it may come at the expense of the update rate, because you may soon run out of available ultrasound frequencies, or even at the expense of stability, because there may be too many opinions about the location of the mobile beacons, and those opinions may conflict, making the process of selection particularly challenging for the system.

See an example of fully overlapping submaps.

The NOLS situation can be solved on the stationary beacons’ side, on the mobile beacons’ side or on both

It shall be clearly understood that additional visibility between the stationary beacons and the mobile beacons in challenging, potentially NOLS conditions can be provided by additional solutions on the stationary beacons’ side (fully overlapping submaps, for example), on the mobile beacons’ side, or on both.

What can be done on the mobile beacon’s side?

Well, there are several solutions, such as the Marvelmind Jacket. The Jacket has a Mini-RX on one shoulder and an additional microphone on the other. What for? To combat NOLS caused by a person’s head. If stationary beacons placed to the left of the head are obstructed by the head from the point of view of the Mini-RX on the right shoulder, then the additional microphone on the left simply receives the signal, and receives it earlier than the potentially obstructed signal reaching the Mini-RX on the right shoulder. In this way, the NOLS problem is resolved: either both, or at least one, of the microphones (the external microphone or the one on the Mini-RX) will receive the signal.
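As a purely illustrative sketch of that principle (not Marvelmind firmware), taking the earliest arrival among several microphones automatically prefers the unobstructed, direct path:

```python
def earliest_arrival(arrival_times_s):
    """Pick the first detected arrival among several microphones.
    An obstructed or reflected path is always longer, so it arrives later
    (or not at all, encoded here as None)."""
    valid = [t for t in arrival_times_s if t is not None]
    return min(valid) if valid else None

# The left (unobstructed) microphone hears the pulse first; the right one is shadowed
print(earliest_arrival([0.0123, 0.0131]))   # -> 0.0123 s, the direct path
print(earliest_arrival([0.0123, None]))     # -> 0.0123 s, the right mic got nothing
```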

The same approach is employed in the Marvelmind Badge. It has two pairs of microphones – on the left and right sides of the neck. The proximity of the neck to the microphones is a big problem, because the neck blocks too much of the “sky” – the visibility towards the stationary beacons. But a special design that places the microphones slightly further from the neck and uses two pairs of microphones solves the problem.

See more about the solutions against NOLS on the mobile beacon’s side.

FAQ

Does the mobile beacon need a direct line of sight to all stationary beacons?

No, this is a requirement only for the serving stationary beacons. Each submap has 1-4 stationary beacons for 1D, 2D, and 3D submaps. So, the mobile beacons must have a direct line of sight to the serving stationary beacons in each submap. It is OK not to have LOS to stationary beacons in other submaps.

Is a direct line of sight between the stationary beacons themselves a must?

No, it is not a must. You can provide the distances between the stationary beacons or their coordinates manually, for example. Of course, Super-Beacons in such a setup can’t measure the distances automatically, and the map of stationary beacons won’t be built automatically. Moreover, it must be checked that the stationary beacons haven’t measured a distance automatically through the obstruction and that such a distance didn’t accidentally appear valid to the system while being wrong in practice.

However, a direct line of sight between the stationary beacons is not a requirement as long as the line of sight is provided from the stationary beacons in the serving submap to the mobile beacons in the service area of the submap.

Does the human body block the ultrasound signal?

Yes, it does. The human body clearly creates non-line of sight for ultrasound. If a beacon is in front of a person, the serving stationary beacons are behind the person, and the person casts a shadow for the ultrasound, it is a clear non-line-of-sight situation.

Additionally, remember that the proximity of, for example, a Mini-RX or the antenna of another beacon to the human body results in about 10 dB degradation of the radio signal strength in the communication channel between the beacon and the modem. Usually, radio is the less limiting factor, because line of sight for ultrasound is far more critical, and NOLS for ultrasound will lead to no tracking or jumps. However, for beacons placed farther away, the human body may block the radio connectivity between the modem and the beacon. This will result in no tracking, not due to NOLS for ultrasound, but due to not getting synchronization, telemetry, and the measured data between the modem and the beacon.
