Amazon Web Services (AWS) is the world's most comprehensive and broadly adopted cloud platform, offering over 165 fully-featured services from data centers globally. Millions of customers, including the fastest-growing startups, largest enterprises, and leading government agencies, trust AWS to power their infrastructure, become more agile, and lower costs.
This computer vision demonstration integrates several technologies to showcase an edge computing scenario: pasta is detected using deep learning, and both live and historical data are visualized on a cloud dashboard built with Amazon Web Services, as well as on a local graphical user interface (GUI) built with a framework based on modern web technologies.
Here is a brief technical overview of the technologies used:
See below an illustration of how the AI at the Edge Pasta Detection demonstration integrates the AWS services:
AWS IoT Greengrass extends AWS to edge devices so they can act locally on the data they generate, while still using the cloud for management, analytics, and durable storage. With AWS IoT Greengrass, connected devices can run AWS Lambda functions, execute predictions based on machine learning models, keep device data in sync, and communicate with other devices securely – even when not connected to the Internet.
In the context of this demo, AWS IoT Greengrass provides a seamless, secure and resilient connection to the cloud. System data such as the deep learning inference results and the overall system status are collected, pre-processed and sent to an MQTT broker. At the same time, commands to control the conveyor belt and the LED brightness are received from the web dashboard, providing two-way communication between device and cloud.
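As a rough illustration of the device-to-cloud path, the sketch below publishes a JSON payload of inference results to an MQTT topic using the AWS IoT Greengrass Core SDK for Python. The topic name and payload fields are illustrative assumptions, not the demo's actual identifiers:

```python
import json

import greengrasssdk

# Client for the local Greengrass Core's MQTT message routing.
client = greengrasssdk.client("iot-data")

def publish_inference(label, confidence):
    # Topic name and payload schema are hypothetical examples.
    payload = {"label": label, "confidence": confidence}
    client.publish(
        topic="pasta-demo/inference",
        queueFullPolicy="AllOrException",
        payload=json.dumps(payload),
    )

publish_inference("penne", 0.97)
```

A subscription on a second topic handles the opposite direction, which is how commands such as conveyor belt control can reach the device from the cloud.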
Amazon SageMaker Neo enables developers to train machine learning models once and run them anywhere in the cloud and at the edge. Amazon SageMaker Neo optimizes models to run up to twice as fast, with less than a tenth of the memory footprint, with no accuracy loss.
Amazon SageMaker is used to train a deep learning inference model from a pasta dataset, focusing on object detection and using the MobileNet SSDv1 algorithm. Amazon SageMaker Neo then optimizes the trained model for the NXP i.MX 8QuadMax processor, which is the core of the Toradex Apalis iMX8.
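For reference, a Neo compilation job targeting this processor can be created with the AWS SDK for Python (boto3). This is a minimal sketch, not the demo's exact job: the job name, bucket paths, role ARN, and input shape are placeholder assumptions, and `imx8qm` is the target device identifier Neo lists for i.MX 8QuadMax-class devices. The framework is set to MXNet because SageMaker's built-in object detection algorithm is MXNet-based.

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Hypothetical names and paths; replace with your own resources.
sagemaker.create_compilation_job(
    CompilationJobName="pasta-detector-neo",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    InputConfig={
        "S3Uri": "s3://my-bucket/pasta-detector/model.tar.gz",
        # Input tensor name and shape for the SSD model (512x512 assumed).
        "DataInputConfig": '{"data": [1, 3, 512, 512]}',
        "Framework": "MXNET",
    },
    OutputConfig={
        "S3OutputLocation": "s3://my-bucket/pasta-detector/compiled/",
        # Target device identifier for the NXP i.MX 8QuadMax.
        "TargetDevice": "imx8qm",
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```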
The following Computer on Module is supported:
The Apalis iMX8 Embedded Vision Kit with Allied Vision does not contain the mechanical parts and accessories needed to assemble the full demo as showcased by Toradex, NXP, and AWS at tradeshows and events. These items are not required to run a functional demo. However, you can source or manufacture the components yourself and then assemble the full demo:
This section provides instructions for you to get started with the AI at the Edge Pasta Detection demo.
To assemble the kit only, follow the instructions enumerated and illustrated by the gif below:
We do not yet provide instructions for assembling the conveyor belt, which consists of the optional items described in the Prerequisites section.
Power on the system. The Toradex Easy Installer comes pre-installed and will be displayed on the HDMI interface:
Note: Your module should have come with the Toradex Easy Installer pre-installed. If this isn't the case, you can easily follow the Loading Toradex Easy Installer article before proceeding.
Do the following:
Installing AWS and NXP AI at the Edge Pasta Detection Demo using the Toradex Easy Installer (click to enlarge)
After the installation and reboot, you will see the welcome screen on the HDMI display.
It may take 5 minutes or more for the demo to start, because the Docker containers are fetched over the network after the first boot. Once the containers are downloaded and started, you can see inferences in the local user interface.
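If you want to verify that the demo containers have come up, one option is to list the running containers from a terminal on the module. The sketch below does this with Python's standard library; it assumes the `docker` CLI is available on the image and that you have shell access to the board:

```python
import subprocess

# List running containers; the demo is up once its containers appear here.
result = subprocess.run(
    ["docker", "ps", "--format", "{{.Names}}\t{{.Status}}"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```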
Now you can test the local user interface with some pasta.
This section provides instructions to connect your board to your AWS account.
Follow the instructions on the AWS page to create a new AWS account.
Once you have access to the account, create a user access key. Follow the instructions in the article Managing Access Keys for Your AWS Account Root User to understand how to create it. The access key ID and secret access key are created as a set.
Make sure to download your AWS credentials to your local machine for future reference by clicking the "Download Key File" button.
Attention: if you have just registered and this is a fresh AWS account, some services are not available until AWS validates your account, which can take up to 24 hours.
Attention: During access key creation, AWS gives you only one opportunity to view and download the secret access key part of the access key. If you miss or lose it, you will need to create a new access key.
When you installed the demo image using the Toradex Easy Installer, you were instructed to write down the serial number and the Ethernet IP. You will use one of them to access a web-based user interface and easily create the cloud infrastructure directly from the Computer on Module.
On a desktop PC connected to the same network as the Computer on Module, open a web browser (for example, Chrome or Firefox) and use either of the following URLs:
http://<Board's Ethernet IP>:8080
See the example below for an Apalis iMX8 with Ethernet IP 192.168.10.5 and serial number 0333444555. The leading zero in the serial number cannot be disregarded:
You will see the following screen:
Note: the browser may mark the website as not secure.
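If the page does not load, a quick reachability check from the desktop PC can help rule out network issues. This is a minimal sketch using Python's requests library; the IP address is the example value from above, so substitute your board's Ethernet IP:

```python
import requests

# Probe the on-device setup UI (substitute your board's Ethernet IP).
response = requests.get("http://192.168.10.5:8080", timeout=5)
print(response.status_code)  # 200 means the setup UI is reachable
```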
Fill in the access key ID and secret access key created previously and click the Run CloudFormation button. This deploys the entire AWS infrastructure required to run the demonstration.
Note: this step can take around 15 minutes to 1 hour, since Amazon CloudFront needs time to register the internet domains. Now is the perfect time to go grab a cup of coffee.
Once the progress bar hits 100%, the full deployment is finished. Within seconds your board will start sending data to your dashboard. Copy the URL for your own dashboard as illustrated in the gif above.
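If you prefer to watch the deployment from the AWS side as well, the stack status can be polled with boto3. A minimal sketch, assuming you know the stack name the demo created (the name below is a placeholder):

```python
import boto3

cloudformation = boto3.client("cloudformation")

# "pasta-demo" is a placeholder; use the actual stack name from your account.
stacks = cloudformation.describe_stacks(StackName="pasta-demo")
print(stacks["Stacks"][0]["StackStatus"])  # e.g. CREATE_IN_PROGRESS, CREATE_COMPLETE
```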
Use the URL retrieved above to access, sign-up and sign-in to your web dashboard:
Put some pasta under the camera and watch the detections appear on the local user interface and the web dashboard.
The AI at the Edge Pasta Detection demo is open-source. Note that it is provided as-is.
You can tweak the demo in several ways, from the web UI to the inference model and much more. Here are the public GitHub repositories and additional resources:
| GitHub Repository | Description | Additional Resources |
| --- | --- | --- |
| aws-nxp-ai-at-the-edge | AWS Lambdas, containers, inference application, etc. | - |
| aws-nxp-ai-at-the-edge-gui | Local graphical user interface (GUI) that outputs on HDMI | - |
| aws-nxp-ai-at-the-edge-credentials-setup | Tool to add AWS IoT Greengrass Core device credentials and set up the cloud infrastructure on freshly set-up hardware | - |
| meta-pasta-demo | Yocto layer to re-build and/or customize the Linux image | Instructions on how to re-build the Linux image: Build the AWS AI at the Edge Demo Image |
| aws-nxp-ai-at-the-edge-cloud-dashboard | Cloud dashboard interface GUI and infrastructure | - |
Image tarball: AWS Pasta Demo