
Remote Logging for ctrlX OS: Simple rsyslog & Advanced Elastic Stack

jacaré
rsyslog

This guide will walk you through setting up your Linux remote logging server to receive logs from a ctrlX OS device and save them to a file in your home folder using rsyslog.

Setup ctrlX Core

ctrlX CORE settings

Remote logging server settings

Enter the host address at which the syslog server will be reachable.

Setup remote logging server with rsyslog
Host System Prerequisites

This how-to uses the package manager "apt", which is available on Debian derivatives such as Ubuntu, Linux Mint, and elementary OS. If you prefer another Linux distribution, use its package manager instead.

Step 1: Install rsyslog

First, ensure that rsyslog is installed on your notebook. Run the following commands to update your package list and install rsyslog:

sudo apt-get update
sudo apt-get install rsyslog
Step 2: Setup rsyslog

Open the rsyslog configuration file for editing:

sudo nano /etc/rsyslog.conf
 

To receive logs via TCP, add the following lines to the rsyslog configuration file. This will enable TCP reception on port 514 (the default syslog port):

# provides TCP syslog reception
module(load="imtcp")
input(type="imtcp" port="514")
 

Add a rule to write the incoming logs from a specific ctrlX OS device (with IP 192.168.88.250) to a file in your home folder. Append this line to the end of the configuration file:

:fromhost-ip, isequal, "192.168.88.250" /home/yourusername/device_logs.log

Replace yourusername with your actual Linux username.
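The rule above uses rsyslog's legacy property-based filter syntax, which current rsyslog versions still accept. If you prefer the newer RainerScript style, an equivalent rule would look like this (a sketch with the same placeholders):

```
if $fromhost-ip == '192.168.88.250' then /home/yourusername/device_logs.log
```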

After adding the configuration, save the file and exit the editor.

Restart the rsyslog service to apply the changes:

sudo systemctl restart rsyslog
Step 3: Allow Incoming Traffic on Port 514

Ensure your firewall allows TCP traffic on port 514. With ufw, run:

sudo ufw allow 514/tcp
Step 4: Monitor Logs in Real-Time

You can monitor the incoming logs in real-time by running the following command:

tail -f /home/yourusername/device_logs.log
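Once messages arrive, standard text tools work directly on the log file. As a self-contained illustration (the sample lines below are hypothetical, not actual ctrlX OS output), you can filter for error entries with grep:

```shell
# Create a small sample file that mimics syslog-style lines (hypothetical data).
cat > /tmp/device_logs_sample.log <<'EOF'
Jan 10 12:00:01 ctrlx-core systemd[1]: Started example service.
Jan 10 12:00:02 ctrlx-core datalayer[812]: error: connection refused
Jan 10 12:00:03 ctrlx-core scheduler[900]: cycle time ok
EOF

# Case-insensitive filter for error entries; on your system, point this
# at /home/yourusername/device_logs.log instead.
grep -i 'error' /tmp/device_logs_sample.log
# → Jan 10 12:00:02 ctrlx-core datalayer[812]: error: connection refused
```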

Notes:

  • This setup supports both RFC3164 and RFC5424 log formats. If you need to handle these differently or separate them, you might need to add more complex rules in the rsyslog configuration.
  • Replace any placeholder values (yourusername, 192.168.88.250, 192.168.88.252) with your actual device and user information.
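If you do want to separate logs per device, rsyslog's dynamic file names (dynaFile) can help. A minimal sketch in RainerScript syntax, reusing the placeholder path and subnet from above:

```
# Write each sending host's logs to its own file (sketch):
template(name="PerHostFile" type="string"
         string="/home/yourusername/device_logs/%FROMHOST-IP%.log")
if $fromhost-ip startswith '192.168.88.' then action(type="omfile" dynaFile="PerHostFile")
```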
Elastic Stack

This guide will walk you through setting up your Linux remote logging server with the Elastic Stack to receive logs from a ctrlX OS device and make them searchable in a user-friendly way.

Elastic dashboard

Setup ctrlX CORE

Same as in the rsyslog configuration, but this time we use port 5000.

Overview remote logging server

Step 1: Setup Elastic Stack via Docker compose

Install Docker and Docker Compose if you haven't already:

   sudo apt-get update
   sudo apt-get install docker.io docker-compose

Create a new directory for your ELK stack:

   mkdir elk-stack
   cd elk-stack

Copy the docker-compose.yml and logstash.conf files into the newly created folder.
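If you need to create these files from scratch, the sketches below are consistent with the service descriptions later in this guide (single-node Elasticsearch with an esdata volume, Logstash listening on TCP 5000, Kibana on 5601). Image versions and file paths are assumptions; adjust them to your environment.

```yaml
# docker-compose.yml -- minimal single-node ELK sketch
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - discovery.type=single-node        # single-node mode for simplicity
    volumes:
      - esdata:/usr/share/elasticsearch/data   # data persistence
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.0
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5000:5000"                       # log input from the ctrlX OS device
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.0
    ports:
      - "5601:5601"                       # web UI
    depends_on:
      - elasticsearch
volumes:
  esdata:
```

```
# logstash.conf -- receive syslog lines on TCP 5000 and index them
input {
  tcp {
    port => 5000
    type => "syslog"
  }
}
filter {
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```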
Start the Elastic Stack:

   sudo docker-compose up -d

Wait a few minutes for all services to start up.
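To check that Logstash accepts connections, you can send a hand-built syslog-style test line to port 5000. The block below only composes and prints the line; pipe its output to `nc localhost 5000` (netcat, if installed) to actually send it:

```shell
# Compose an RFC3164-style test message: <14> encodes facility "user",
# severity "info"; the hostname and tag here are made up for the test.
printf '<14>%s ctrlx-core test: hello elk\n' "$(date '+%b %d %H:%M:%S')"
```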

Step 2: View the logs

Access Kibana by opening a web browser and navigating to http://localhost:5601.
In Kibana, go to "Management" > "Stack Management" > "Index Patterns" > "Create index pattern".

Create index

Navigation

Navigation global

Enter logstash-* as the index pattern and click "Next step".

Index pattern

Select @timestamp as the Time field and click "Create index pattern".

Index pattern timestamp

Go to "Discover" in the main menu to start exploring your logs.

Start Discover to explore your logs

Example: Search for scheduler

Search scheduler

To troubleshoot:

  • Check if Logstash is receiving logs:

    sudo docker-compose logs logstash
    
  • Ensure Elasticsearch is running:

    curl http://localhost:9200
    
  • If you don't see data in Kibana, check Elasticsearch indices:

    curl http://localhost:9200/_cat/indices
    

Remember to secure your ELK stack before exposing it to a network, as this setup doesn't include authentication.
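The `_cat/indices` check above returns one line per index; column 3 is the index name and column 7 the document count. A self-contained example of pulling those columns out with awk (the sample output below is made up):

```shell
# Sample _cat/indices output (hypothetical values).
cat > /tmp/indices_sample.txt <<'EOF'
yellow open logstash-2024.07.01 abc123 1 1 1534 0 1.2mb 1.2mb
yellow open logstash-2024.07.02 def456 1 1 2201 0 1.8mb 1.8mb
EOF

# Print index name and document count; on a live system, pipe
# `curl -s http://localhost:9200/_cat/indices` into this awk instead.
awk '{print $3, $7}' /tmp/indices_sample.txt
# → logstash-2024.07.01 1534
# → logstash-2024.07.02 2201
```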

Explanation of the services

Let's break down the role of each service in our ELK (Elasticsearch, Logstash, Kibana) stack for this log visualization use case:

  1. Elasticsearch:

    • Purpose: Acts as the search and analytics engine.
    • In this use case:
      • Stores all the log data in an indexed format for quick searching.
      • Provides a RESTful API for other components to store and retrieve data.
    • Configuration notes:
      • Running in single-node mode for simplicity.
      • Has a volume attached (esdata) for data persistence.
  2. Logstash:

    • Purpose: Data collection pipeline tool that ingests data from multiple sources, transforms it, and sends it to a "stash" like Elasticsearch.
    • In this use case:
      • Reads log data from one source: TCP port 5000 for receiving logs directly from remote devices.
      • Processes and structures the log data (as defined in logstash.conf).
      • Forwards the processed data to Elasticsearch.
    • Configuration notes:
      • Uses a custom configuration file (logstash.conf) to define its behavior.
      • Has access to the host's log file through a volume mount.
  3. Kibana:

    • Purpose: Provides a web interface for visualizing and exploring the data stored in Elasticsearch.
    • In this use case:
      • Offers a user-friendly interface to search through logs.
      • Allows creation of custom dashboards and visualizations of log data.
      • Provides tools for real-time log monitoring and analysis.
    • Configuration notes:
      • Exposes port 5601 for web access.

The flow of data in this setup is:

  1. Logs are sent directly to Logstash via TCP.
  2. Logstash picks up these logs, processes them according to rules in logstash.conf.
  3. Processed logs are sent to Elasticsearch for indexing and storage.
  4. Kibana connects to Elasticsearch to retrieve and display this data.

This architecture allows for:

  • Centralized log collection from multiple sources.
  • Efficient searching and analysis of large volumes of log data.
  • Real-time visualization and monitoring of log events.
  • Creation of custom dashboards for specific monitoring needs.