
Fast time series data from ctrlX PLC to local InfluxDB

jacaré
Long-established Member

Getting data out of the ctrlX PLC quickly, for example at a 1 ms sampling rate, is a common requirement. This article demonstrates a practical way to acquire data from the PLC app in real time and store it in InfluxDB for later use, without losing any samples.

Prerequisites

PLC code

The PLC writes data into an internal buffer array. When this array reaches the maximum number of samples, it is copied into the external buffer array and the buffer counter is incremented. From outside the PLC, e.g. in Node-RED, we can observe the buffer counter: when it changes, a new buffer array is ready and can be read and post-processed.

The external buffer is in fact two arrays: one for the values and one for the timestamps.

With the following variables, we can control the behavior of the data sampling:

  • bStart – set to TRUE to start the data acquisition
  • num_samples – the number of data points to write into each buffer array
  • input_value – the value that will be recorded
  • trigger – observed by the function block; on change, the current value is stored in the internal buffer
  • sample_interval – a multiplier for the task cycle time, e.g. with a 1 ms task, set it to 10 to record data every 10 ms
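To see how these variables interact, here is a small sketch (a hypothetical helper, not part of the article's code) relating them to the effective sampling rate, assuming the fast task runs every 1 ms:

```javascript
// How the control variables determine the effective sampling behavior.
// taskCycleMs:    cycle time of the fast task (1 ms in this article)
// sampleInterval: the sample_interval multiplier from the GVL
// numSamples:     the num_samples buffer size from the GVL
function bufferStats(taskCycleMs, sampleInterval, numSamples) {
    const samplePeriodMs = taskCycleMs * sampleInterval;   // time between stored points
    const bufferDurationMs = samplePeriodMs * numSamples;  // time to fill one buffer
    return { samplePeriodMs, bufferDurationMs };
}

// With the GVL defaults below (sample_interval = 10, num_samples = 5000):
const stats = bufferStats(1, 10, 5000);
// stats.samplePeriodMs === 10, stats.bufferDurationMs === 50000 (50 s per buffer)
```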

1. Create the Function Block "daq"

Declaration:

FUNCTION_BLOCK daq

VAR_INPUT
	num_samples : UINT := 1000;
	input_value: UINT; 
	trigger: UDINT;
END_VAR

VAR_OUTPUT
	buffer_id : INT := 0;
	buffer_value : ARRAY [0..10000] OF UINT;
	buffer_timestamp : ARRAY [0..10000] OF ULINT;
END_VAR

{attribute 'hide'}
VAR
	run : BOOL := TRUE;
	num_values : INT := 16;
	i : INT;
	index : INT := 0;
	sample : INT := 0;
	buffer_internal_value : ARRAY [0..10000] OF UINT;
	buffer_internal_timestamp : ARRAY [0..10000] OF ULINT;
	rtc : ULINT;
	trigger_old: UDINT := 0; 
END_VAR

 
Implementation:

// Get the current high-resolution system time
Util.SysTimeRtcHighResGet(pTimestamp := rtc);

// On every trigger change, store the current value and its timestamp
IF trigger <> trigger_old THEN
	buffer_internal_value[sample] := input_value;
	buffer_internal_timestamp[sample] := rtc;
	sample := sample + 1;
END_IF

// Internal buffer full: copy it to the external buffers and signal a new buffer
IF sample >= num_samples THEN
	sample := 0;
	Util.SysMemCpy(pDest := ADR(buffer_value), pSrc := ADR(buffer_internal_value), udiCount := num_samples * SIZEOF(input_value));
	Util.SysMemCpy(pDest := ADR(buffer_timestamp), pSrc := ADR(buffer_internal_timestamp), udiCount := num_samples * SIZEOF(rtc));
	buffer_id := buffer_id + 1;
END_IF

trigger_old := trigger;
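The double-buffer handoff above can be modeled in a few lines of JavaScript. This is an illustrative sketch only (the names are made up, and it ignores the trigger logic); its point is that a reader polling the buffer counter only ever sees complete buffers:

```javascript
// Simplified model of the daq function block's double buffering.
// The internal buffer fills sample by sample; only when it is full is it
// copied out in one step and the buffer id incremented.
function makeDaq(numSamples) {
    const internal = new Array(numSamples); // like buffer_internal_value
    let sample = 0;
    return {
        bufferId: 0,  // like buffer_id
        buffer: [],   // like buffer_value: only ever a complete copy
        record(value) {
            internal[sample] = value;
            sample += 1;
            if (sample >= numSamples) {
                sample = 0;
                this.buffer = internal.slice(0, numSamples); // the SysMemCpy step
                this.bufferId += 1;                          // signal: new buffer ready
            }
        },
    };
}

const daq = makeDaq(4);
[1, 2, 3].forEach(v => daq.record(v));
// daq.bufferId is still 0: the external buffer is untouched while filling
daq.record(4);
// daq.bufferId === 1 and daq.buffer is the complete [1, 2, 3, 4]
```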
 
2. Create the GVL
VAR_GLOBAL
	{attribute 'symbol' := 'read'} i: UINT;
	{attribute 'symbol' := 'read'} trigger: UINT := 0;
	{attribute 'symbol' := 'readwrite'} bStart: BOOL;
	{attribute 'symbol' := 'readwrite'} sample_interval: UINT := 10;
	{attribute 'symbol' := 'read'} buffer_id: INT;
	{attribute 'symbol' := 'read'} num_samples: UINT := 5000;
	{attribute 'symbol' := 'read'} buffer_value: ARRAY [0..10000] OF UINT;
	{attribute 'symbol' := 'read'} buffer_timestamp: ARRAY [0..10000] OF ULINT;
END_VAR

 

3. Create the fast task

Declaration:

PROGRAM fast_task
VAR
	daq: daq;
END_VAR


Implementation:

// Generate a trigger edge every sample_interval task cycles
IF bStart THEN
	IF i MOD sample_interval = 0 THEN
		trigger := trigger + 1;
	END_IF
	i := i + 1;
END_IF

// Call the data acquisition function block on every cycle
daq(
	num_samples := num_samples,
	input_value := i,
	trigger := trigger,
	buffer_id => buffer_id,
	buffer_value => buffer_value,
	buffer_timestamp => buffer_timestamp
);

 
Configure the task so that it will run every millisecond.


 

Node-RED Flow

You should now create a flow that observes the buffer_id variable. On change, it should read the buffer_value and buffer_timestamp arrays and store them in msg.buffer_value and msg.buffer_timestamp, as shown in the image: Node-RED Flow
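The change-detection step of that flow can be sketched as a plain function (a hypothetical helper; in Node-RED the `state` object would be the function node's context store and the buffer id would arrive in msg.payload from the Data Layer node):

```javascript
// Forward a buffer id only when it differs from the last one seen,
// so the downstream nodes read the arrays exactly once per buffer.
function onBufferId(state, bufferId) {
    if (state.lastBufferId === bufferId) {
        return null; // no new buffer yet: drop the message
    }
    state.lastBufferId = bufferId; // remember the id we already handled
    return bufferId;               // forward: a new buffer is ready
}

const state = {};
// The first value always passes, repeats are dropped, a change passes again:
// onBufferId(state, 0) → 0, onBufferId(state, 0) → null, onBufferId(state, 1) → 1
```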

The function node to write the data into the database:

// One InfluxDB point: the field object carries the timestamp and the value
class FieldObject {
    constructor(time, value) {
        this.time = time;
        this.value = value;
    }
}

msg.payload = [];
const tagObject = {}; // no tags in this example

// Build one [fields, tags] pair per sample for the InfluxDB out node
for (let i = 0; i < msg.buffer_value.length; i++) {
    msg.payload.push([new FieldObject(msg.buffer_timestamp[i], msg.buffer_value[i]), tagObject]);
}

return msg;


If everything went according to plan, the raw data should now be visible in InfluxDB with a 1 ms spacing between data points.
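Before trusting the database, it is worth sanity-checking the spacing directly on the timestamp array, e.g. in a debug or function node. This is a hypothetical check, assuming the ULINT timestamps arrive as millisecond values:

```javascript
// Return the largest gap between consecutive timestamps.
// If sampling is loss-free at 1 ms, this should be 1 for the whole buffer;
// a larger value would indicate a missed sample.
function maxGapMs(timestamps) {
    let max = 0;
    for (let i = 1; i < timestamps.length; i++) {
        max = Math.max(max, timestamps[i] - timestamps[i - 1]);
    }
    return max;
}

// maxGapMs([1000, 1001, 1002, 1003]) === 1
```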

Node-RED Debug
 
Importing the Project

First select "SPS-Logic" in your project, then choose "Project" -> "Import".


 
