Sensing

Spatial Sensing

Real-world perception for digital twins, robotics, and autonomous systems. Connect LiDAR, cameras, and industrial sensors to your spatial models.


The Problem

Sensors produce data. Making sense of that data requires:

  • Context — Where is the sensor? What’s it looking at?
  • Fusion — Combining multiple sensor streams into a coherent picture
  • Integration — Connecting to design tools, not just dashboards

Most sensing solutions stop at data collection. We connect sensors to spatial models.


Deployment Modes

Same edge agent. Same sensors. Different config.

Mode     Platform                Use Case
Aerial   DJI enterprise drones   Surveying, inspection, mapping
Ground   Tripod, backpack        Interior scanning, construction
Robot    Viam RDK, ROS2          Navigation, pick-and-place
Fixed    Permanent mount         Traffic, security, warehouse
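
To make "different config" concrete, here is a minimal sketch in Go of what a mode-selecting agent configuration could look like for the deployment modes above. The field names and driver strings are illustrative, not the shipped schema.

```go
package main

import "fmt"

// AgentConfig selects the deployment mode and the sensor plugins to load.
type AgentConfig struct {
	Mode    string         // "aerial", "ground", "robot", or "fixed"
	Sensors []SensorConfig // loaded at startup, no rebuild required
}

// SensorConfig names a driver plugin and how to reach the device.
type SensorConfig struct {
	Type   string // e.g. "lidar", "rgbd", "gnss"
	Driver string // e.g. "livox-mid360", "realsense-d455"
	Port   string // device path or network address
}

func main() {
	// A ground-scanning rig; switching to "aerial" is a config edit,
	// not a code change.
	cfg := AgentConfig{
		Mode: "ground",
		Sensors: []SensorConfig{
			{Type: "lidar", Driver: "livox-mid360", Port: "192.168.1.10"},
			{Type: "gnss", Driver: "ublox-rtk", Port: "/dev/ttyUSB0"},
		},
	}
	fmt.Printf("mode=%s sensors=%d\n", cfg.Mode, len(cfg.Sensors))
}
```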

GPS-Free Navigation

Visual-inertial odometry enables autonomous flight without GPS, which is critical for indoor environments, urban canyons, and other GPS-denied zones.

Real-time position tracking using only cameras and IMU. No external infrastructure required.

Autonomous closed-loop flight using visual-inertial odometry.
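
As a rough illustration of the estimation loop behind visual-inertial odometry, the toy filter below integrates high-rate IMU acceleration and then pulls the drifted estimate toward a lower-rate camera-derived position fix. Real VIO runs a full estimator (e.g. an EKF or sliding-window optimizer); every name, rate, and gain here is illustrative.

```go
package main

import "fmt"

// State holds the estimated position (m) and velocity (m/s).
type State struct{ Pos, Vel [3]float64 }

// PredictIMU integrates body acceleration (m/s^2) over dt seconds.
func PredictIMU(s State, accel [3]float64, dt float64) State {
	for i := range s.Pos {
		s.Pos[i] += s.Vel[i]*dt + 0.5*accel[i]*dt*dt
		s.Vel[i] += accel[i] * dt
	}
	return s
}

// CorrectVisual pulls the estimate toward a camera-derived position fix
// with blend gain k in (0, 1]; a real filter would weight by covariance.
func CorrectVisual(s State, fix [3]float64, k float64) State {
	for i := range s.Pos {
		s.Pos[i] += k * (fix[i] - s.Pos[i])
	}
	return s
}

func main() {
	s := State{}
	// One second of 200 Hz IMU prediction under constant 0.1 m/s^2 thrust.
	for i := 0; i < 200; i++ {
		s = PredictIMU(s, [3]float64{0.1, 0, 0}, 0.005)
	}
	// A visual fix arrives and corrects the drifted estimate.
	s = CorrectVisual(s, [3]float64{0.05, 0, 0}, 0.3)
	fmt.Printf("position after 1 s: %v\n", s.Pos)
}
```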

For detailed hardware specifications, see our Drone Fleet Hardware →


Sensor Abstraction

Hardware-agnostic by design. Your code talks to our unified API, not individual sensor drivers.

Supported Sensor Types

Type            Examples
LiDAR           Livox Mid-360, Avia
RGB-D Cameras   Intel RealSense, Luxonis OAK-D
Position        GPS/GNSS (u-blox RTK), IMU
Industrial      Modbus sensors, CAN bus

Swap sensors without changing code. Configuration-driven, not code-driven.
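
One way to picture the unified API: application code depends on a small Sensor interface, and drivers register themselves by name, so a config string rather than a code change selects the hardware. The sketch below is a hypothetical shape for such an abstraction, not the actual API surface.

```go
// Package sensors sketches a hypothetical unified sensor API.
package sensors

import (
	"fmt"
	"time"
)

// Reading is one timestamped measurement from any sensor type.
type Reading struct {
	Sensor  string
	Stamp   time.Time
	Payload []byte // driver-specific encoding, e.g. a point-cloud chunk
}

// Sensor is the contract every driver plugin satisfies; application code
// depends only on this interface, never on a concrete driver.
type Sensor interface {
	Name() string
	Read() (Reading, error)
}

// drivers maps a config string to a constructor, so configuration,
// not application code, decides which hardware is loaded.
var drivers = map[string]func(port string) Sensor{}

// Register is called by each driver plugin at init time.
func Register(name string, ctor func(port string) Sensor) {
	drivers[name] = ctor
}

// Open builds a sensor from config values; unknown drivers are an error.
func Open(name, port string) (Sensor, error) {
	ctor, ok := drivers[name]
	if !ok {
		return nil, fmt.Errorf("unknown driver %q", name)
	}
	return ctor(port), nil
}
```

The registry pattern is what makes "swap sensors without changing code" possible: replacing a Livox unit with another LiDAR changes one config string, and Open resolves the new driver at startup.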


Edge Agent Architecture

A Go binary that runs on your hardware: Raspberry Pi, Jetson, industrial Linux, or custom ARM.

Capability            Description
Plugin system         Add sensors via config, not code changes
Local buffering       Store-and-forward when offline
Real-time streaming   NATS JetStream to cloud
Lightweight           Single binary, no runtime dependencies
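
For the streaming path, here is a minimal sketch using the nats.go client: the agent publishes a reading to a JetStream subject, and JetStream acknowledges once the message is persisted, which is what makes reliable forwarding from the edge possible. The URL and subject name are illustrative.

```go
package main

import (
	"log"

	"github.com/nats-io/nats.go"
)

func main() {
	// Connect to the upstream NATS server (address is illustrative).
	nc, err := nats.Connect("nats://edge-gateway:4222")
	if err != nil {
		log.Fatal(err)
	}
	defer nc.Drain()

	// Obtain a JetStream context for persisted, acknowledged publishes.
	js, err := nc.JetStream()
	if err != nil {
		log.Fatal(err)
	}

	// Publish one reading; the ack returns only after the server has
	// stored the message, so the agent knows it is safe to drop locally.
	if _, err := js.Publish("sensors.lidar.front", []byte("...point-cloud chunk...")); err != nil {
		log.Fatal(err)
	}
}
```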

Connectivity

5G/LTE with eSIM OTA

No SIM swapping. No QR code scanning. Server-push provisioning.

  • The modem ships with a bootstrap profile
  • Your platform triggers the carrier profile download
  • Switch carriers mid-deployment via API (sketched below)

Works for drones in the air, robots on the move, fixed installations in remote locations.
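
Purely as an illustration of "switch carriers via API", the sketch below calls a hypothetical fleet-platform endpoint that pushes a new carrier profile to one device. Real eSIM provisioning is brokered by an SM-DP+ server, which such an endpoint would front; the URL and payload here are invented for the example.

```go
package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Hypothetical endpoint and payload; your platform's API will differ.
	body := bytes.NewBufferString(`{"device":"drone-042","carrier":"carrier-b"}`)
	resp, err := http.Post("https://fleet.example.com/api/esim/switch",
		"application/json", body)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// The platform pushes the new profile over the air; no SIM swap.
	fmt.Println("provisioning status:", resp.Status)
}
```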


Integration with Spatial

Sensors feed directly into your 3D models:

Data Flow                        Purpose
Point clouds → Spatial model     Reality capture
GPS/IMU → Model positioning      Georeferencing
Environmental sensors → Twin     Live state updates
Industrial I/O → Automation      Closed-loop control

Not just dashboards. Sensors in context.
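
"Sensors in context" can be pictured as binding every raw chunk to the pose it was captured from, so the spatial model can place it rather than just chart it. The types below are an illustrative sketch of such a georeferenced frame, not the platform's wire format.

```go
package main

import (
	"fmt"
	"time"
)

// Pose is the GPS/IMU state at capture time.
type Pose struct {
	Lat, Lon, Alt    float64 // WGS84 position from GNSS
	Roll, Pitch, Yaw float64 // orientation from the IMU, in radians
}

// Frame binds raw sensor data to where and when it was captured.
type Frame struct {
	Stamp  time.Time
	Pose   Pose
	Points []byte // encoded point-cloud chunk
}

func main() {
	f := Frame{
		Stamp:  time.Now(),
		Pose:   Pose{Lat: 52.5200, Lon: 13.4050, Alt: 34.0, Yaw: 1.57},
		Points: []byte("...lidar chunk..."),
	}
	// Downstream, the spatial model uses f.Pose to place f.Points.
	fmt.Printf("frame at (%.4f, %.4f), %d bytes\n",
		f.Pose.Lat, f.Pose.Lon, len(f.Points))
}
```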


Built on Foundation

Sensing inherits all Foundation capabilities automatically:

Capability             What It Means
Offline-first          Capture without internet, sync when connected
Universal deployment   Edge, mobile, desktop—same agent
Self-sovereign         Your sensors, your data, your servers
Real-time sync         Stream to multiple destinations simultaneously

Learn more about Foundation →
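
A minimal in-memory sketch of the offline-first behavior: readings queue locally and drain once the uplink returns. A production agent would persist this buffer to disk, but the store-and-forward flow is the same.

```go
package main

import "fmt"

// Buffer queues readings while the uplink is unavailable.
type Buffer struct{ pending [][]byte }

// Enqueue always succeeds, whether or not the device is online.
func (b *Buffer) Enqueue(msg []byte) { b.pending = append(b.pending, msg) }

// Drain forwards the backlog in order, stopping at the first failure
// so nothing is lost if connectivity drops again mid-sync.
func (b *Buffer) Drain(send func([]byte) error) {
	for len(b.pending) > 0 {
		if err := send(b.pending[0]); err != nil {
			return // still offline; keep the backlog and retry later
		}
		b.pending = b.pending[1:]
	}
}

func main() {
	var b Buffer
	b.Enqueue([]byte("reading-1"))
	b.Enqueue([]byte("reading-2"))
	// Simulate connectivity returning: all sends now succeed.
	b.Drain(func(m []byte) error {
		fmt.Printf("synced %s\n", m)
		return nil
	})
}
```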


Part of Something Bigger

Sensing is the perception layer of the Ubuntu Software platform.

For organizations that need 3D design and AI, our Spatial platform provides the design environment—with direct integration to your sensor data.

Explore Spatial →


Production Reference: 1,000-Drone Fleet

See how these sensing capabilities scale to production. Our reference architecture demonstrates a complete fleet deployment using the hardware and software stack described above: 1,000 Holybro X500 drones running PX4, dual companion computers, and NATS JetStream for fleet-scale digital twinning.

Explore Drone Fleet Architecture →


Build With Us

Deploying sensors? Building perception systems? Let’s talk.

Contact →