About Stacks

We've put together Clear Linux* OS based reference stacks to help developers quickly get up and running on Clear Linux OS, with images tailored, optimized, and tested together for specific use cases.

Deep Learning Reference Stack

The Deep Learning Reference Stack is an integrated, highly performant open source stack optimized for Intel® Xeon® Scalable platforms. This open source community release is part of our effort to ensure AI developers have easy access to all of the features and functionality of Intel platforms. The Deep Learning Reference Stack is highly tuned and built for cloud native environments. With this release, we are enabling developers to quickly prototype by reducing the complexity of integrating multiple software components, while still giving users the flexibility to customize their solutions. The stack includes highly tuned software components across the operating system (Clear Linux OS), the deep learning framework (TensorFlow*), deep learning libraries (Intel® Math Kernel Library for Deep Neural Networks (MKL-DNN)), and other software components.

Deep Learning Reference Stack repositories on Docker Hub:

Deep Learning Reference Stack with Optimized Eigen
Deep Learning Reference Stack with Intel® MKL-DNN
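As a minimal sketch of how one of these images might be used, the commands below pull a Deep Learning Reference Stack image from Docker Hub and start an interactive container. The image name shown is an assumption for illustration; check the Docker Hub repositories above for the current image names and tags.

```shell
# Pull the MKL-DNN variant of the Deep Learning Reference Stack.
# NOTE: the image name is an assumption; verify it on Docker Hub first.
docker pull clearlinux/stacks-dlrs-mkl

# Start an interactive shell inside the container to experiment
# with the bundled TensorFlow* and MKL-DNN components.
docker run -it clearlinux/stacks-dlrs-mkl /bin/bash
```

From the container shell you can then run your own training or inference scripts against the pre-integrated framework, rather than assembling the components yourself.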

Deep Learning Reference Stack repository on Clear Linux* Project:

Documentation

Clear Linux Edge Stack

The Clear Linux Edge Stack, a single-node FaaS edition, brings together Clear Linux OS, a number of dependencies for cloud connectivity (Amazon* Greengrass in the initial release, with plans to support additional cloud service providers), and the powerful OpenVINO™ toolkit. We also bundle some open source models, along with details on how to support hardware acceleration of Function-as-a-Service containers (such as AWS Lambda) using OpenVINO.

Documentation