NXP Solves Edge-Computing Challenges with Containers and Platform Trust

I recently had a water-cooler debate with a colleague about whether edge-computing frameworks are as general as their cloud counterparts or limited to event-driven data processing. After looking at Microsoft’s Azure IoT Edge, I concluded that edge frameworks can be fully general. Azure IoT Edge hosts not only serverless functions but also containers, so edge-computing customers can run rich, container-encapsulated applications at the edge just as they do in the cloud, with a cloud-based system managing the applications regardless of location.

For this to work, the edge-computing platform must meet a few prerequisites:

  • Support for server-like Linux distributions, such as Ubuntu
  • Support for container environments, such as Docker
  • Ability to remotely and securely provision edge hardware
  • Ability to securely provision and manage functions and containers running on edge hardware

This is where NXP comes in. Our Layerscape processors are well suited to edge nodes because of their performance, integration, 64-bit Arm® compatibility, and scalability from single-core to 16-core devices. They also include virtualization support and our trust architecture, features we have refined over multiple product generations.

But a cool chip is just a glittering sliver of silicon until software unlocks its potential. We therefore complement our Layerscape processors with software that addresses each prerequisite above. Our Layerscape Software Development Kit (SDK) includes drivers, tools, and libraries that enable Layerscape processor features. For the operating system, we combine a Linux kernel with an Ubuntu user-land environment. The kernel is a long-term-support (LTS) release and includes the processor-specific drivers and features we have contributed upstream; if a processor or feature is too new for its enablement to have been upstreamed, we offer kernel patches. The kernel also supports containers, the lightweight approach favored in cloud computing for bundling the code, libraries, tools, and settings a rich application requires. Like the kernel, the Ubuntu environment is an LTS release, and it includes standard user-space libraries and applications.

The benefit of this NXP-supplied software environment is that developers see a standard, server-like Linux environment, even when the hardware is a palm-sized system such as our FRWY-LS1012A board or the upcoming SMARC and Q7 computer-on-modules that Kontron plans to develop based on the Layerscape LS1028A and i.MX 8 platforms.
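To make the container point concrete, here is a minimal sketch of running a container-encapsulated application on an edge node, assuming the Docker daemon is running and the docker Python SDK is installed; the image name and command are purely illustrative:

    # Minimal sketch: run a containerized application on a Layerscape edge node
    # exactly as you would on a cloud server. Assumes the Docker daemon is running
    # and the "docker" Python SDK is installed; the image and command below are
    # illustrative only.
    import docker

    client = docker.from_env()  # connect to the local Docker daemon

    # Pull and run a 64-bit Arm container image; the same image and the same call
    # work whether the host is a cloud server or a palm-sized edge board.
    container = client.containers.run(
        "arm64v8/python:3.11-slim",                                # multi-arch base image
        command=["python", "-c", "print('hello from the edge')"],
        detach=True,
    )

    container.wait()                  # let the short-lived application finish
    print(container.logs().decode())  # read its output
    container.remove()                # clean up

Because the node presents a standard Linux-plus-Docker environment, nothing in this snippet is edge-specific; that is precisely the point.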

 

[Image: FRWY-LS1012A board]

Provisioning hardware and software can be tricky. Here we turn from the server model to the smartphone model, in which only authorized devices connect to the network and firmware updates are secure and automatic. This is challenging because edge-computing nodes lack a handset’s user interface, and an IT manager might have a thousand nodes to commission. We also want application installation to be as easy as on a smartphone, except that instead of being pulled down to the device from an app store, software is pushed from the cloud down to the edge nodes, as the sketch below illustrates.
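Everything in this sketch, from the node registry to the deploy_container helper, is hypothetical and stands in for a real device-management service; it only illustrates the direction of flow, with the cloud deciding and the nodes receiving:

    # Hypothetical sketch of the push model: a cloud-side service iterates over
    # enrolled edge nodes and pushes an approved container deployment to each one.
    # None of these names are real NXP or Azure APIs; they only illustrate the
    # cloud-to-edge direction of software delivery.
    ENROLLED_NODES = ["node-0001", "node-0002"]       # nodes that passed secure enrollment
    APP_IMAGE = "registry.example.com/metering:1.4"   # image approved by the IT manager

    def deploy_container(node_id: str, image: str) -> None:
        # A real implementation would call the device-management service's API
        # (for example, EdgeScale or the Azure IoT Edge deployment mechanism).
        print(f"pushing {image} to {node_id}")

    for node in ENROLLED_NODES:
        deploy_container(node, APP_IMAGE)  # software flows from cloud to edge, not the reverse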

Our trust architecture is a key underlying technology enabling device authorization and secure updates. To recap previous blog posts: at the architecture’s heart is a hardware root of trust, including secure key generation and storage, and additional security capabilities branch out from this root. For example, the root of trust helps enable secure enrollment because each Layerscape-based node can attest its unique identity in a cryptographically secure fashion. Conversely, the node can cryptographically verify that an IT manager approved the downloaded firmware, and it can similarly verify functions and containers pushed from the cloud.
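As a rough model of that verification step, the sketch below checks an ECDSA signature over a firmware image using the general-purpose cryptography Python package. A Layerscape node performs the equivalent check in its secure-boot flow, anchored to keys held by the hardware root of trust, so treat this purely as an illustration of the idea:

    # Illustrative only: accept a downloaded firmware image only if it was signed
    # with a key the IT manager controls. A Layerscape node does the equivalent
    # check in hardware-anchored secure boot, not in Python.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import ec

    def firmware_is_approved(image: bytes, signature: bytes, pubkey_pem: bytes) -> bool:
        """Return True only if 'signature' over 'image' verifies against the
        IT manager's PEM-encoded public key."""
        public_key = serialization.load_pem_public_key(pubkey_pem)
        try:
            public_key.verify(signature, image, ec.ECDSA(hashes.SHA256()))
            return True
        except InvalidSignature:
            return False

    # Usage sketch: refuse to install anything that does not verify.
    # if not firmware_is_approved(image_bytes, sig_bytes, manager_pubkey_pem):
    #     raise RuntimeError("firmware rejected: signature does not verify")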

As noted above, making this work requires software both on the edge node and in the cloud, managing edge nodes and their firmware throughout their lifecycle. At the same time, this software can’t get in the way of providing a server-like OS environment or a standard edge-computing framework like Azure IoT Edge. Another blog post discusses NXP’s device-management software, EdgeScale.

In summary, NXP and our collaborators, such as Microsoft, offer OEMs a tremendous starting point for their own edge-computing creations. Our hardware and software are scalable and secure, providing a server-like environment, secure enrollment and device monitoring, and secure container and application deployment. The resulting edge-computing solutions can be as general as their cloud-based counterparts, streamlining the development and deployment of applications regardless of where they are hosted and delivering the lower latency and reduced off-premises data traffic that motivate edge deployments.

 

Joseph Byrne
Joe Byrne is a senior strategic marketing manager for NXP's Digital Networking Group. Prior to joining NXP, Byrne was a senior analyst at The Linley Group, where he focused on communications and semiconductors, providing strategic guidance on product decisions to senior semiconductor executives. Prior to working at The Linley Group, he was a principal analyst at Gartner, leading the firm's coverage of wired communications semiconductors. There, he advised semiconductor suppliers on strategy, marketing, and investing. Byrne started his career at SMOS Systems after graduating with a bachelor of science in engineering from Duke University. He spent three years at SMOS as part of the R&D engineering team working on 32-bit RISC microcontrollers. He then returned to school for an MBA, which he received with high distinction from the University of Michigan. He worked with Deloitte & Touche Consulting Group for a year before joining Gartner, where he spent the next nine years until moving to The Linley Group in 2005.
