

How to run Machine-Learning Models on ctrlX CORE

kuldeepM
Introduction

The ctrlX CORE, developed by Bosch Rexroth, is a versatile industrial control platform that can run ONNX models for enhanced automation and intelligence. The workflow is: ensure the ONNX model is compatible with the target hardware, convert it using appropriate tooling, deploy it on the ctrlX CORE, and integrate it into industrial automation workflows with the SDK of ctrlX AUTOMATION. This lets users optimize performance and make data-driven decisions directly within their industrial environments.

Overview

Topology of running a model on ctrlX CORE

Prerequisites

Build ONNX Model

First, a sample random-forest machine-learning model was built using the sklearn library and converted with the skl2onnx library. I saved the model in .onnx format and used it to build the example snap below.

The model takes two input parameters and outputs a validation result: whether the product is of good or bad quality.

-> So train a model for your own use case, then follow the steps below to build the snap for the ctrlX CORE.
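As a sketch of this training step: the snippet below trains a small random-forest classifier on two features and exports it with skl2onnx. The training data, the good/bad decision rule, and the file name model/model.onnx are all illustrative assumptions, not part of the original article.

```python
import os
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: two process features per sample,
# labelled good (1) when the feature sum exceeds 1.0 (illustrative rule only).
rng = np.random.RandomState(0)
X = rng.rand(200, 2).astype(np.float32)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

clf = RandomForestClassifier(n_estimators=10, random_state=0)
clf.fit(X, y)

# Convert to ONNX with skl2onnx (pip install skl2onnx).
try:
    from skl2onnx import convert_sklearn
    from skl2onnx.common.data_types import FloatTensorType

    onnx_model = convert_sklearn(
        clf, initial_types=[("input", FloatTensorType([None, 2]))]
    )
    os.makedirs("model", exist_ok=True)
    with open(os.path.join("model", "model.onnx"), "wb") as f:
        f.write(onnx_model.SerializeToString())
except Exception as exc:
    print(f"ONNX export skipped: {exc}")
```

The input name ("input") and shape declared in initial_types must match what main.py later feeds to the runtime session.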

Build snap to run ONNX

There are many sample development examples in the SDK of ctrlX AUTOMATION; in this case, I have used "ctrlx-automation-sdk/samples-python/datalayer.provider". Open this folder inside the app build environment.

Changes need to be made in:

  1. main.py
  2. setup.py
  3. requirements.txt
  4. snap/snapcraft.yaml

Before making any changes, create a new folder inside this directory and name it model. Move your ONNX model into this folder.
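The folder setup can be done from a terminal in the sample directory, for example (the source path of your .onnx file is of course your own):

```shell
# run inside ctrlx-automation-sdk/samples-python/datalayer.provider
mkdir -p model
# then copy in the exported model, e.g.:
# cp ~/my_model.onnx model/
```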

Modify main.py

After the snap is deployed, this main.py file will generate Data Layer nodes, read values from them, run them through the machine-learning model, and write the output back to the Data Layer.

At the top, import the libraries needed to run your model and manipulate data. If a library cannot be found, install it in the terminal with the pip command.

main.py imports
Before doing anything with the model, we first have to define the data types that we need for input and output. These will later be visible as Data Layer nodes on the ctrlX CORE.

The provider code that defines this data type looks like the image below.

Define float provider node
At runtime the function creates a variant (line 150), sets its type to float with an initial value (lines 152-158), then tries to register the node and handles any error that occurs; at the end it returns the node.
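The pattern can be sketched in plain Python. Note this is a stand-in, not the real ctrlx_datalayer SDK API (which provides Variant, provider registration, and Result codes inside the build environment); the node address used here is also hypothetical.

```python
class FloatNode:
    """Mimics a Data Layer float node: an address plus a current value."""

    def __init__(self, address: str, initial: float):
        self.address = address
        self.value = initial


class Provider:
    """Mimics the provider: keeps registered nodes addressable by name."""

    def __init__(self):
        self.nodes = {}

    def register_node(self, address: str, initial: float) -> FloatNode:
        node = FloatNode(address, initial)  # create the "variant" with a float value
        self.nodes[address] = node          # register it; the real SDK returns a Result to check
        return node                         # hand the node back to the caller


provider = Provider()
input_node = provider.register_node("qualitycheck/input/feature1", 0.0)
```

The real code additionally handles registration errors before returning the node, as described above.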

After defining the main function, load the ONNX model and start an instance of the model, as shown in the picture below.
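Loading the model typically goes through onnxruntime's InferenceSession. A guarded sketch (the file name model/model.onnx is an assumption carried over from the folder created earlier):

```python
import os

MODEL_PATH = os.path.join("model", "model.onnx")  # folder created earlier; file name assumed

session = None
input_name = None
try:
    import onnxruntime as ort  # shipped with the snap via python-packages

    session = ort.InferenceSession(MODEL_PATH)
    input_name = session.get_inputs()[0].name  # e.g. "input", as declared at export time
except Exception as exc:  # missing package or model file; fatal on the real device
    print(f"Could not start ONNX session: {exc}")
```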

Load onnx and start instance

Inside the "with provider:" block, you have to define the necessary variables with the help of the provider functions we defined earlier. An example of the execution can be seen in the picture below.

Create nodes
To execute the ONNX model continuously, a while loop needs to be defined as shown below.

While loop execution

(Lines 115-118) Print statements are defined so changes can be observed in the logbook. (Line 119) With data.get_float32(), we read the Data Layer nodes we defined earlier and place these values inside a NumPy array. (Line 121) With the .run function, we pass the NumPy array to the model and save the prediction in the "output" variable. Since this output is boolean, (lines 124-128) we convert it to a string and set that value on the node.
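One iteration of that loop can be sketched as follows. Here predict_quality is a stand-in for session.run(None, {input_name: features}) on the real ONNX model, and the Data Layer node reads/writes are replaced by plain variables; the decision rule and the "Good"/"Bad" labels are illustrative assumptions.

```python
import numpy as np


def predict_quality(features: np.ndarray) -> bool:
    """Stand-in for session.run(...) on the ONNX model (illustrative rule only)."""
    return bool(features.sum() > 1.0)


# One iteration of the provider loop; the real loop sleeps about 1 s per cycle.
# On the ctrlX CORE the two inputs come from data.get_float32() on the input
# nodes, and the result string is written back to the output node.
value_a, value_b = 0.7, 0.6                                  # read from input nodes
features = np.array([[value_a, value_b]], dtype=np.float32)  # shape matches the exported model
output = predict_quality(features)                           # boolean model prediction
result = "Good" if output else "Bad"                         # string written to the output node
```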

Depending on the complexity of the model you might have to add or remove Python lines.

Modify snapcraft.yaml 

The snapcraft.yaml file is necessary as it serves as a configuration file for defining the build process, required dependencies, and other essential metadata for building the snap package. It provides a declarative way to specify how the application should be packaged and distributed as a snap.

yaml configs
Inside the "parts:" block we have to add a "configs:" block in order to pack the model folder into the snap. If you have multiple folders for different models, each of them has to be defined inside the configs block.
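A minimal sketch of such a configs part, assuming the model folder created earlier (the part name and target path are illustrative):

```yaml
parts:
  configs:
    plugin: dump          # copies the files verbatim into the snap
    source: ./model       # the folder holding the ONNX model
    organize:
      '*': model/         # keep the files under model/ inside the snap
```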

In the apps section, a modification of the environment is necessary, as shown in the picture below, but only if you are building a snap for the real core; for the virtual core it is not necessary.

kuldeepM_0-1697449359651.png

At parts->provider->python-packages, add the required libraries that are imported at runtime.
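A sketch of that part, assuming the sample's python plugin setup and the libraries used in this example (your model may need others):

```yaml
parts:
  provider:
    plugin: python
    source: .
    python-packages:
      - numpy         # array handling for the model inputs
      - onnxruntime   # runs the .onnx model at runtime
```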

kuldeepM_1-1697449672120.png

Build and deploy snap

For the virtual core: "Terminal->Run build task..->build-snap-amd64"

For the real core: "Terminal->Run build task..->build-snap-arm64" (depending on the dependencies, you might have to build the snap on AWS ARM instances, a Raspberry Pi, or similar ARM64 hardware).

After installing the snap on the CORE, it will start running the model every second, and it will look something like the picture below.

Classification model in datalayer

Thank you for taking the time to read this article. I hope you found it informative and enjoyable. If you have any questions, comments or encounter any unusual problems with the project, feel free to leave them in the comments section below. I would love to hear from you and continue the conversation. Your feedback is always appreciated! 
