FAQs for the Tau Camera
A collection of frequently asked questions about the Tau LiDAR Camera:
- What's the best way to mount the Tau Camera?
- Can I use multiple Tau Cameras at the same time?
- Do I need internet to use the Tau Camera?
- What is the Tau Camera's power consumption?
- What devices are compatible with the Tau Camera? Can I use it with the Omega2? How about the Raspberry Pi?
- What is the operating temperature range for the Tau Camera?
- What is Tau's light source and what are the specs?
- Does the Tau Camera still work in direct sunlight?
- Is the Python API available? What about the Tau Studio web app?
- What is the depth resolution?
- What does purple represent in the 2D depth map?
- Why do different coloured objects show up better/worse in the depth map?
What is the Tau Camera's power consumption?
The Tau Camera draws about 250mA at 5V from the connected USB power source.
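As a quick sanity check, that figure works out to about 1.25W of power draw:

```python
# Power draw from the USB supply: P = I * V
current_a = 0.250   # ~250 mA reported above
voltage_v = 5.0     # standard USB bus voltage
power_w = current_a * voltage_v
print(f"{power_w:.2f} W")  # 1.25 W
```

Well within the 500mA budget of a standard USB 2.0 port.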
What devices are compatible with the Tau Camera? Can I use it with the Omega2? How about the Raspberry Pi?
The Tau Camera is compatible with any computer that has a USB port. "Computer" here means a device running a full operating system.
The Python API and the Tau Studio web app both require a full Python installation.
It might be possible to connect the Tau Camera to a microcontroller with a serial/USB port, but doing so will likely require some experimentation.
Omega2 Compatibility
Yes, the Tau Camera is compatible with the Omega2.
Programs built with the TauLidarCamera Python library that do not require OpenCV will work just fine.
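As a rough sketch of such a program (the calls below are modeled on the examples in the TauLidarCamera repository; exact method names and signatures may differ between library versions):

```python
def read_one_distance_frame():
    """Open the first detected Tau Camera and read a single distance frame.

    The imports live inside the function so this sketch can be defined
    even where the TauLidarCamera package is not installed.
    """
    from TauLidarCamera.camera import Camera
    from TauLidarCommon.frame import FrameType

    camera = Camera.open()                 # first available Tau Camera
    camera.setModulationChannel(0)         # default modulation channel
    camera.setIntegrationTime3d(0, 1000)   # 3D integration time, microseconds
    camera.setMinimalAmplitude(0, 10)      # discard very weak returns
    try:
        return camera.readFrame(FrameType.DISTANCE)
    finally:
        camera.close()
```

Note that nothing here touches OpenCV - the returned frame holds raw distance data, which is exactly the kind of program that runs fine on the Omega2.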
Raspberry Pi Compatibility
Yes, the Tau Camera is 100% compatible with the Raspberry Pi.
Does the Tau Camera still work in direct sunlight?
The Tau Camera does work well in direct sunlight; see an example here: https://www.crowdsupply.com/onion/tau-lidar-camera#works-independently-of-ambient-light
Users can adjust the integration time and minimal amplitude using the API to further refine outdoor performance.
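A minimal sketch of that tuning, written against any object exposing TauLidarCamera-style setters (the method names setIntegrationTime3d and setMinimalAmplitude follow that API; the default values here are illustrative, not recommended settings):

```python
def tune_for_sunlight(camera, integration_time_us=250, minimal_amplitude=60):
    """Tune a Tau-style camera for bright ambient light.

    Shortens the 3D integration time (less ambient IR collected per frame)
    and raises the minimal amplitude (rejects weak, noisy returns).
    """
    camera.setIntegrationTime3d(0, integration_time_us)
    camera.setMinimalAmplitude(0, minimal_amplitude)
```

In practice you would experiment with both values for your particular scene, watching the depth map for saturation and dropout.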
Is the Python API available? What about the Tau Studio web app?
All of the software built for the Tau Camera is open source and available on GitHub:
- Tau Studio web app - https://github.com/OnionIoT/tau-lidar-server
- TauLidarCamera Python API - https://github.com/OnionIoT/tau-lidar-camera
- TauLidarCommon Python Library (a helper library) - https://github.com/OnionIoT/tau-lidar-common
And there's documentation available as well:
- TauLidarCamera Python API Documentation - https://taulidarcamera.readthedocs.io
- TauLidarCommon Python Library Documentation - https://taulidarcommon.readthedocs.io
We welcome all feedback and suggestions! Please make an issue or PR on GitHub!
What is the depth resolution? As in, the minimum differential distance between two depth levels?
The Tau Camera's theoretical depth resolution is on the millimeter scale. However, the effective resolution is on the centimeter scale, depending on the amount of noise.
To get down to millimeter-level granularity, an effective noise-suppression algorithm must be used.
To be more specific, the depth accuracy is within +/-2% of the measured distance, so the closer the camera is to the object, the lower the absolute error. For the range of 100mm to 2000mm, the depth accuracy would be under +/-20mm.
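The +/-2% figure translates into a simple worst-case error bound. As a sketch (depth_error_mm is a hypothetical helper, not part of the Tau API):

```python
def depth_error_mm(distance_mm, rel_error=0.02):
    """Worst-case depth error for a single measurement: +/-2% of distance."""
    return rel_error * distance_mm

# About +/-10 mm at half a metre, +/-2 mm at 100 mm.
near = depth_error_mm(100)
mid = depth_error_mm(500)
```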
What's the best way to mount the Tau Camera?
Use the 4 mounting holes to secure your Tau Camera to a mount or enclosure.
If you're in need of a quick solution, we've published a design for a 3D printed mount.
What is the operating temperature range for the Tau Camera?
All of the components used in the Tau LiDAR Camera are rated for -40°C to 85°C.
What does purple represent in the 2D depth map?
This means the data point is "saturated." That is, the illumination from the Tau's IR light source that's reflected off the object is too strong to calculate a depth for that point.
If you're using Tau Studio, you'll notice the point cloud struggles to correctly handle these points since they have indeterminate depth.
To reduce the saturation, try reducing the 3D integration time. This is the same principle as reducing exposure time on a regular camera - it will reduce how much reflected light the Tau captures for depth measurements, hopefully bringing it into a usable range.
When the integration time is reduced, you'll immediately see more detail in the Grayscale and Amplitude frames coming from the camera.
There is a trade-off though: with a reduced integration time, there will be worse detection of far away objects and objects that reflect light poorly.
If reducing the integration time doesn't help significantly, the object may just be too close - or too close for how reflective it is, since you'll get different results with more and less reflective objects. In either case, try moving the object farther from the camera.
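One way to script that adjustment (a sketch: reduce_until_unsaturated is hypothetical; only setIntegrationTime3d is modeled on the TauLidarCamera API, and the caller supplies the saturation check, e.g. by counting purple points in the depth map):

```python
def reduce_until_unsaturated(camera, frame_is_saturated,
                             times_us=(1000, 500, 250, 125)):
    """Step the 3D integration time down, like shortening a camera's exposure,
    until the scene stops saturating or the shortest setting is reached.
    Returns the integration time (microseconds) that was finally applied."""
    applied = None
    for t in times_us:
        camera.setIntegrationTime3d(0, t)
        applied = t
        if not frame_is_saturated():
            break
    return applied
```

Remember the trade-off above: whatever value this settles on will also limit how well distant and poorly reflective objects are detected.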
Why do different coloured objects show up better/worse in the depth map?
Different colours (even on the same material) will affect how well an object is detected because of reflectivity.
(If you're unclear on how LiDAR and Time-of-Flight devices work, we suggest reading our short explainer on the subject)
Different coloured objects will absorb and reflect the IR light from the Tau differently, resulting in better or worse depth detection. Dark coloured objects and low reflectivity objects (grass, leather) will be harder to detect.
(There's no escaping from physics!)
For better detection of these objects, try increasing the 3D integration time as much as possible.
The trade-off is that reflective or very close objects will be saturated.
What is Tau's light source and what are the specs?
The light source is made up of 3 near-infrared LEDs. The wavelength of the light centres around 850nm.
This is the emission spectrum of the LEDs:
Can I use multiple Tau Cameras at the same time?
Yes! Even if the fields of view overlap, there should be minimal interference.
Do I need internet to use the Tau Camera?
The camera works without internet!
You only need an internet connection to install Tau Studio and the Python API.