The data backbone for Edge AI


Mycelial enables on-device computation by eliminating data bottlenecks, unleashing the power of local AI

Computer vision, predictive maintenance, autonomous navigation - all of these use cases present huge opportunities to automate difficult work and improve safety, reliability, and customer experiences across the board. But what happens when we want to run AI on a device that’s disconnected from the Internet for some or all of its lifespan?

In these scenarios, where devices working in the real world (i.e., the Edge) can’t rely on AI infrastructure hosted in the Cloud, we need systems that can:

  1. Gather local sensory data (e.g., camera footage, vibration measurements, hydrophone audio) and present it to the AI model so the model can interpret what the sensory data means.
  2. Incorporate data from other Edge assets and Cloud data sources (e.g., coordinating navigation amongst undersea drones while adjusting mission parameters based on new instructions from command systems in the Cloud).
  3. Send data to other Edge assets and Cloud data sources when connected, while caching data to be exfiltrated later when there is no connection.
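The three capabilities above can be sketched as a single loop on the device: gather a sample, run local inference, then either upload or cache depending on connectivity. This is an illustrative sketch only, not Mycelial's actual API; `read_sensor`, `run_model`, and `link_up` are hypothetical stand-ins for real hardware, model, and network calls.

```python
import queue

# Illustrative sketch (not Mycelial's actual API): a minimal
# gather -> infer -> sync-or-cache loop for an Edge device.

def read_sensor(t):
    # Stand-in for a camera frame, vibration sample, hydrophone buffer, etc.
    return {"t": t, "reading": t * 0.1}

def run_model(sample):
    # Stand-in for on-device inference; flags "interesting" readings.
    return {**sample, "alert": sample["reading"] > 0.3}

def edge_loop(ticks, link_up):
    """Process `ticks` samples; upload when connected, cache otherwise."""
    cache = queue.Queue()   # data held for later exfiltration
    uploaded = []
    for t in range(ticks):
        result = run_model(read_sensor(t))
        if link_up(t):
            # Drain any cached backlog first, then send the fresh result.
            while not cache.empty():
                uploaded.append(cache.get())
            uploaded.append(result)
        else:
            cache.put(result)
    return uploaded, list(cache.queue)

# Simulate a connection that drops for ticks 2-4, then recovers.
uploaded, pending = edge_loop(6, link_up=lambda t: t not in (2, 3, 4))
```

When the link returns at tick 5, the three cached results leave the device before the fresh one, so nothing is lost during the outage.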

Edge Sensing & Inferencing

Reduce time to action by processing data and making decisions locally on-device

For some Edge use cases, there is no alternative to processing data and running AI locally on the device. For example, a drone swarm tasked with surveillance and threat detection cannot wait for every video frame to be sent to the Cloud, run through an AI model for inferencing, and have the results returned.

Moving data processing and inferencing to the Edge also brings significant benefits to use cases that today leverage Cloud AI infrastructure. One good example is Apple’s voice assistant, Siri. Today, whenever a user asks Siri a question, the audio of the user’s voice must be converted to text, then that text must be presented to the AI model(s) that power Siri.

Ever notice how long it takes for Siri to respond? Even under pristine conditions, the back-and-forth flow with Siri will never approximate a human-to-human conversation, which is dynamic, co-emergent, responsive - and immediate.

For safety-critical systems, any delay in returning critical alerts is unacceptable.

Edge World Model Creation

Assemble individual device observations into a comprehensive theater-level view

For many organizations building an Edge AI strategy, one of the first missions is to empower remote assets and operators to have a comprehensive understanding of the theater in which they are operating.

This can be a daunting challenge with last-gen technologies - each device needs a complete understanding of the status and observations/sensor readings from every other device. Today’s tools require significant investment in customized software code, network communications, message queues, and state reconciliation.

Mycelial takes a different approach by abstracting over these low-level details. With Mycelial, one Workflow, created in under two minutes, is all it takes: one Workflow to declaratively, deterministically, and definitively propagate data from one device to all others, and from other devices back to one.
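To make the underlying reconciliation problem concrete, here is a minimal sketch of merging per-device observations into one shared theater-level view, using a last-writer-wins rule keyed by a version counter. This illustrates the state-reconciliation work described above; it is not Mycelial's actual Workflow format, and the device names and fields are invented for the example.

```python
# Illustrative only: last-writer-wins merge of per-device observations
# into a shared "theater view". Not Mycelial's actual Workflow API.

def merge(view, updates):
    """Merge device updates into a world view, keeping the newest
    observation per device (keyed by a monotonically increasing version)."""
    for device_id, (version, observation) in updates.items():
        current = view.get(device_id)
        if current is None or version > current[0]:
            view[device_id] = (version, observation)
    return view

# Two peers exchange state; merging in either order converges
# to the same world view.
a = {"drone-1": (3, "contact bearing 040")}
b = {"drone-2": (1, "station keeping"), "drone-1": (2, "patrolling")}
world = merge(merge({}, a), b)
```

Because the merge keeps the highest version per device, every peer that eventually sees both update sets arrives at the same view, which is the property a declarative propagation Workflow needs to guarantee.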

DDIL/network tolerant data synchronization

Sync data between peers and intelligently prioritize Cloud uploads - even when disconnected

In adverse conditions or contested environments, communications suffer degradation from distance, weather, or intentional attempts to jam or block signals. In these cases, traditional topologies that rely on devices communicating with a single base to send and receive data do not suffice.

Mycelial addresses these concerns in two ways:

  1. Data Muling

Rather than requiring each device to pass its data back to a centralized point, Mycelial can be configured such that one or more devices serve as a “Data Mule.” The Data Mule device(s) automatically carry other devices’ data to their desired destination in the Cloud or elsewhere.

For example, a Humvee far from base may not be able to communicate data directly to base. Instead, Mycelial can pass the data from that Humvee to a UAV, then to another Humvee, and finally back to base. Each of those hops can rely on a completely different communications protocol - tactical radio, Wi-Fi, Bluetooth - based on what is available.
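The store-and-forward behavior behind data muling can be sketched in a few lines: a mule opportunistically picks up stranded records from any node it passes and hands everything over when it reaches a node with Cloud connectivity. This is an illustrative model, not Mycelial's implementation; the `Node`/`Mule` classes and record names are invented for the example.

```python
# Illustrative store-and-forward sketch of data muling
# (not Mycelial's actual implementation).

class Node:
    def __init__(self, name, connected=False):
        self.name = name
        self.connected = connected   # True if this node can reach the Cloud
        self.outbox = []             # records waiting to leave this node

class Mule:
    def __init__(self):
        self.cargo = []              # records carried between nodes

    def visit(self, node):
        if node.connected:
            # Hand everything to the connected node.
            delivered, self.cargo = self.cargo, []
            return delivered
        # Otherwise pick up the node's stranded data and carry it onward.
        self.cargo.extend(node.outbox)
        node.outbox = []
        return []

humvee = Node("Humvee-1")
humvee.outbox = ["frame-17", "frame-18"]   # data stuck on the far Humvee
base = Node("Base", connected=True)

mule = Mule()          # e.g. a UAV flying a Humvee -> Base route
mule.visit(humvee)     # pick up
delivered = mule.visit(base)   # drop off
```

Which transport each visit uses (tactical radio, Wi-Fi, Bluetooth) is orthogonal to the pattern: the mule only needs some link at each hop.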

  2. AI-powered Data Prioritization

Consider the case of an aerial drone (UAV) used to help emergency services locate an abducted child in an Amber Alert.

The UAV is tasked with using its on-board camera to surveil an area looking for a 2012 red Ford Focus. The UAV is using Mycelial to run a computer vision AI that can identify different types of cars.

The bandwidth available to the drone in-flight is extremely limited - so how can we ensure that the drone uses that precious bandwidth only for the most important data that needs to be relayed to headquarters? How will the network know what to prioritize?

With Mycelial, the AI model itself is the answer. The AI model tells us which video frames contain the car we’re looking for and which do not. In any scenario, it is the AI model that determines whether a piece of data is of value.

Mycelial provides the ability to customize the order in which data is exfiltrated from a device. We don’t want the images of the UAV taking off from the runway; we only want the images containing the red Ford Focus. Mycelial lets users define a sorted SQL query that determines which data moves first. In this case, we need only define a query that ranks high-confidence red Ford Focus inferences at the top, and emergency services will receive that alert first.
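A prioritization query of this shape might look like the following, shown here against an in-memory SQLite database. The table and column names (`inferences`, `frame_id`, `label`, `confidence`) are assumptions for illustration, not Mycelial's actual schema.

```python
import sqlite3

# Illustrative sketch of prioritized exfiltration: rank inference results
# so high-confidence "red ford focus" detections leave the device first.
# Schema is an assumption, not Mycelial's actual data model.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE inferences (
    frame_id INTEGER, label TEXT, confidence REAL)""")
conn.executemany(
    "INSERT INTO inferences VALUES (?, ?, ?)",
    [
        (1, "runway", 0.99),           # takeoff footage: low priority
        (2, "red ford focus", 0.91),   # the target vehicle
        (3, "blue sedan", 0.85),
        (4, "red ford focus", 0.97),
    ],
)

# Matching frames first, highest confidence first; everything else after.
rows = conn.execute("""
    SELECT frame_id, label, confidence
    FROM inferences
    ORDER BY (label = 'red ford focus') DESC, confidence DESC
""").fetchall()
```

In SQLite the comparison `label = 'red ford focus'` evaluates to 1 or 0, so sorting it descending floats the target detections to the top regardless of how confident the model was about irrelevant frames like the runway.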