
Edge AI in a 5G world

Alex Cattle

on 6 February 2020



For many businesses, deploying AI/ML solutions in latency-sensitive use cases requires a new approach to solution architecture.

Fast computational units (e.g. GPUs) and low-latency connections (e.g. 5G) allow AI/ML models to be executed outside the sensors and actuators themselves (e.g. cameras and robotic arms). This reduces costs through lower hardware complexity and by sharing compute resources across the IoT fleet.

Strict AI responsiveness requirements that previously demanded embedding the model in the IoT device can now be met with GPUs co-located with the sensors and actuators (for example, in the same factory building). One example is the robot ‘dummification’ trend currently being observed in factory robotics, which aims to reduce robot unit costs and simplify fleet management.
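To make the idea concrete, below is a minimal sketch of how a sensor-side client might offload inference to a co-located GPU server over a low-latency link instead of running the model on-device. The endpoint URL, payload format, response schema and latency budget are illustrative assumptions, not an API from the webinar or from Canonical.

```python
# Sketch: an IoT camera client offloading inference to a co-located edge GPU
# server. All names below (URL, schema, latency budget) are hypothetical.

import time
import requests

EDGE_INFERENCE_URL = "http://edge-gpu.local:8080/v1/infer"  # hypothetical co-located GPU server
LATENCY_BUDGET_MS = 50  # example end-to-end responsiveness requirement


def classify_frame(jpeg_bytes: bytes) -> dict:
    """Send one camera frame to the edge server and return its prediction."""
    start = time.monotonic()
    response = requests.post(
        EDGE_INFERENCE_URL,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
        timeout=LATENCY_BUDGET_MS / 1000,  # fail fast if the link is too slow
    )
    response.raise_for_status()
    result = response.json()
    result["round_trip_ms"] = (time.monotonic() - start) * 1000
    return result


if __name__ == "__main__":
    with open("frame.jpg", "rb") as f:
        print(classify_frame(f.read()))
```

Because the model runs on shared edge hardware rather than in each device, the camera itself only needs enough compute to capture and transmit frames, which is where the unit-cost savings come from.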

In this webinar we will explore real-life scenarios in which GPUs and low-latency connectivity unlock solutions that were previously prohibitively expensive, and which businesses can now put in place to lead the fourth industrial revolution.

Watch the webinar


IoT as a service

Bring an IoT device to market fast. Focus on your apps; we handle the rest. Canonical offers hardware bring-up, app integration, knowledge transfer and engineering support to get your first device to market. App store and security updates guaranteed.

Get your IoT device to market fast ›


IoT app store

Build a platform ecosystem for connected devices to unlock new avenues for revenue generation. Get a secure, hosted and managed multi-tenant app store for your IoT devices.

Build your IoT app ecosystem ›


Related posts

Deploying Open Language Models on Ubuntu

Discover the benefits of using Ubuntu for open-source AI and how to seamlessly deploy models on Azure, including leveraging GPU and Confidential Compute capabilities.

Canonical presence at Qualcomm DX Summit @Hannover Messe

At the world’s leading industrial trade fair, companies from the mechanical engineering, electrical engineering and digital industries as well as the energy...

Generative AI with Ubuntu on AWS. Part II: Text generation

In our previous post, we discussed how to generate Images using Stable Diffusion on AWS. In this post, we will guide you through running LLMs for text...