Backend app for visualizing CANedge log files in Grafana (directly from local disk or S3)

Overview

CANedge Grafana Backend - Visualize CAN/LIN Data in Dashboards

This project enables easy dashboard visualization of log files from the CANedge CAN/LIN data logger.

Specifically, a lightweight backend app loads, DBC decodes and parses MDF log files from local disk or an S3 server. This is done 'on demand' in response to query requests sent by end users from a Grafana dashboard frontend.

This project is currently in BETA - major changes will be made.

CAN Bus Grafana Dashboard

Backend vs. Writer

We provide two options for integrating your CANedge data with Grafana dashboards:

The CANedge Grafana Backend app only processes data 'when needed' by an end user - and requires no database. It is ideal when you have large amounts of data - as you only process the data you need to visualize.

The CANedge InfluxDB Writer processes data in advance (e.g. periodically or on-file-upload) and writes it to a database. It is ideal if dashboard loading speed is critical - but with the downside that data is processed & stored even if it is not used.

For details incl. 'pros & cons', see our intro to telematics dashboards.


Features

- allow users to visualize data from all of your devices & log files in Grafana 
- data is only processed "on request" - avoiding the need for costly databases
- data can be fetched from local disk or S3
- data can be visualized as soon as log files are uploaded to S3 for 'near real-time updates'
- the backend app can be easily deployed on e.g. your PC or AWS EC2 instance 
- plug & play dashboard templates & sample data let you get started quickly 
- view log file sessions & splits via Annotations, enabling easy identification of underlying data 
- allow end users control over what devices/signals are displayed via flexible Variables

Installation

In this section we detail how to deploy the app on a PC or an AWS EC2 instance.

Note: We recommend testing the local deployment with our sample data as a first step.


1: Deploy the integration locally on your PC

A local PC deployment is recommended if you wish to load data from an SD card, local disk or MinIO S3 server.

Deploy the backend app locally

  • Install Python 3.7 for Windows (32 bit/64 bit) or Linux (enable 'Add to PATH')
  • Download this project as a zip via the green button and unzip it
  • Open the folder with the requirements.txt file and enter below in your command prompt:
Windows
python -m venv env && env\Scripts\activate && pip install -r requirements.txt
python canedge_datasource_cli.py "file:///%cd%/LOG" --port 8080
Linux
python3 -m venv env && source env/bin/activate && pip install -r requirements.txt
python3 canedge_datasource_cli.py file:///$PWD/LOG --port 8080

Set up Grafana locally

  • Install Grafana locally and enter http://localhost:3000 in your browser to open Grafana
  • In Configuration/Plugins install SimpleJson and TrackMap
  • In Configuration/DataSources select Add datasource and SimpleJson and set it as the 'default'
  • Enter the URL http://localhost:8080/, hit Save & test and verify that it works
  • In Dashboards/Browse click Import and load the dashboard-template-sample-data.json from this repo

You should now see the sample data visualized in Grafana.

Next: If you aim to work with CANedge2 data from AWS S3, go to step 2 - otherwise go to step 3.


2: Deploy the integration on AWS EC2 & Grafana Cloud

An AWS EC2 instance is recommended if you wish to load data from your AWS S3 bucket.

Deploy the backend app on AWS EC2

  • Login to AWS, search for EC2/Instances and click Launch instances
  • Select Ubuntu Server 20.04 LTS (HVM), SSD Volume Type, t3.small and proceed
  • In Step 6, click Add Rule/Custom TCP Rule and set Port Range to 8080
  • Launch the instance, then create & store your credentials (we will not use them for now)
  • Wait ~5 min, click on your instance and note your IP (the Public IPv4 address)
  • Click Connect/Connect to enter the GUI console, then enter the following:
sudo apt update && sudo apt install python3 python3-pip python3-venv tmux 
git clone https://github.com/CSS-Electronics/canedge-grafana-backend.git && cd canedge-grafana-backend
python3 -m venv env && source env/bin/activate && pip install -r requirements.txt
tmux
python3 canedge_datasource_cli.py file:///$PWD/LOG --port 8080

Set up Grafana Cloud

  • Set up a free Grafana Cloud account and log in
  • In Configuration/Plugins install SimpleJson and TrackMap (log out and in again)
  • In Configuration/DataSources select Add datasource and SimpleJson and set it as the 'default'
  • Replace your datasource URL with the http://[IP]:[port] endpoint and click Save & test
  • In Dashboards/Browse click Import and load the dashboard-template-sample-data.json from this repo

You should now see the sample data visualized in your imported dashboard. In the AWS EC2 console you can press ctrl + B then D to detach from the session, allowing it to run even when you close the GUI console.

Next: See step 3 on loading your AWS S3 data and step 5 on deploying the app as a service for production.


3: Load your own data & DBC files

Below we outline how to load your own data & DBC files.

Note: To activate your virtual environment use env\Scripts\activate (Linux: source env/bin/activate)

Load from local disk

  • Replace the sample LOG/ folder with your own LOG/ folder (or add an absolute path)
  • Verify that your data is structured as on the CANedge SD card i.e. [device_id]/[session]/[split].MF4
  • Add your DBC file(s) to the root of the folder
  • Verify that your venv is active and start the app
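As an illustrative aid (not part of the project), the expected SD-card layout can be sanity-checked with a small script. The patterns are assumptions based on the default CANedge naming: 8-hex-character device IDs and 8-digit session/split numbers:

```python
import re
from pathlib import Path

# Hypothetical helper (not part of this project) that checks whether a local
# LOG/ folder follows the CANedge SD-card layout: [device_id]/[session]/[split].MF4
DEVICE_RE = re.compile(r"^[0-9A-F]{8}$")            # assumed: 8 hex chars
SESSION_RE = re.compile(r"^\d{8}$")                 # assumed: 8-digit session folder
SPLIT_RE = re.compile(r"^\d{8}\.(MF4|MFC)$", re.IGNORECASE)  # .MF4 or compressed .MFC

def list_log_files(root):
    """Yield (device_id, session, split) tuples for files matching the layout."""
    for path in sorted(Path(root).glob("*/*/*")):
        device, session, split = path.parts[-3:]
        if DEVICE_RE.match(device) and SESSION_RE.match(session) and SPLIT_RE.match(split):
            yield device, session, split
```

Files that do not match the layout (e.g. DBC files in the root) are simply skipped, which mirrors how the data is expected to be organized above.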

Load from S3

  • Add your DBC file(s) to the root of your S3 bucket
  • Verify that your venv is active and start the app with the syntax below (use python3 on Linux/EC2)
python canedge_datasource_cli.py [endpoint] --port 8080 --s3_ak [access_key] --s3_sk [secret_key] --s3_bucket [bucket]
  • AWS S3 endpoint example: https://s3.eu-central-1.amazonaws.com
  • Google S3 endpoint example: https://storage.googleapis.com
  • MinIO S3 endpoint example: http://192.168.192.1:9000
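A quick way to catch a malformed endpoint before starting the app is to parse it first - a minimal sketch (illustration only, not project code):

```python
from urllib.parse import urlparse

# Sanity-check the [endpoint] argument: it must carry an explicit http/https
# scheme and a host; a MinIO endpoint will typically also include a port.
def check_endpoint(endpoint):
    parts = urlparse(endpoint)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        raise ValueError(f"invalid S3 endpoint: {endpoint}")
    return parts
```

For example, `check_endpoint("s3.eu-central-1.amazonaws.com")` raises because the `https://` scheme is missing - a common cause of connection errors.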

Import simplified dashboard template

  • To get started, import the dashboard-template-simple.json to visualize your own data
  • After this, you can start customizing your panels as explained in step 4

Regarding DBC files

You can load as many DBC files as you want without reducing performance, as long as your queries only use one at a time (as is e.g. the case when using the simple dashboard template). However, if your queries need to use multiple DBC files, you may consider 'combining' your DBC files for optimal performance.

Regarding compression

We recommend enabling CANedge compression, as the compressed MFC files are 50%+ smaller and thus faster to load.


4: Customize your Grafana dashboard

The dashboard-template-sample-data.json can be used to identify how to make queries, including the examples below:

# create a fully customized query that depends on what the user selects in the dropdown 
{"device":"${DEVICE}","itf":"${ITF}","chn":"${CHN}","db":"${DB}","signal":"${SIGNAL}"}

# create a query for a panel that locks a signal, but keeps the device selectable
{"device":"${DEVICE}","itf":"CAN","chn":"CH2","db":"canmod-gps","signal":"Speed"}

# create a query for parsing multiple signals, e.g. for a TrackMap plot
{"device":"${DEVICE}","itf":"CAN","chn":"CH2","db":"canmod-gps","signal":"(Latitude|Longitude)"}
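The query payloads above can also be assembled programmatically, e.g. when generating dashboards via scripts. A minimal sketch (the device ID below is a made-up example; the field names follow the sample-data template):

```python
import json
import re

# Build a query payload matching the sample-data template. A regex in the
# "signal" field (e.g. for the TrackMap example) selects several signals at once.
def build_query(device, signal, itf="CAN", chn="CH2", db="canmod-gps"):
    return json.dumps({"device": device, "itf": itf, "chn": chn, "db": db, "signal": signal})

# "2F6913DB" is a hypothetical device ID for illustration
query = build_query("2F6913DB", "(Latitude|Longitude)")
```

The regex form matches both Latitude and Longitude, which is what the TrackMap example above relies on.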

Bundle queries for multiple panels

When displaying multiple panels in your dashboard, it is critical to set up all queries in a single panel (as in our sample data template). All other panels can then be set up to refer to the original panel by setting the datasource as -- Dashboard --. For both the 'query panel' and 'referring panels' you can then use the Transform tab to Filter data by query. This allows you to specify which query should be displayed in which panel. The end result is that only one query is sent to the backend - which means that your CANedge log files are only processed once per update.
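As a rough sketch, a 'referring panel' in the exported dashboard JSON could look like the fragment below (assuming the query panel's bundled query has refId A; the exact transformation id can vary across Grafana versions):

```json
{
  "datasource": "-- Dashboard --",
  "transformations": [
    { "id": "filterByRefId", "options": { "include": "A" } }
  ]
}
```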

Set up Grafana Variables & Annotations

Grafana Variables allow users to dynamically control what is displayed in certain panels via dropdowns. For details on how the Variables are defined, see the template dashboard under Settings/Variables.

Similarly, Annotations can be used to display when a new log file 'session' or 'split' occurs, as well as display the log file name. This makes it easy to identify the log files underlying a specific view - and then find these via CANcloud or TntDrive for further processing.

Regarding performance

Using the 'zoom out' button repeatedly will currently generate a queue of requests, each of which will be processed by the backend. Until this is optimized, we recommend making a single request at a time - e.g. by using the time period selector instead of the 'zoom out' button.

Also, loading time increases when displaying long time periods (as the data for the period is processed in real-time).


5: Move to a production setup

Managing your EC2 tmux session

The commands below are useful for managing your tmux session while you're still testing your deployment.

  • tmux: Start a session
  • tmux ls: List sessions
  • tmux attach: Re-attach to session
  • tmux kill-session: Stop session

Deploy your app as an EC2 service for production

The above setup is suitable for development & testing. Once you're ready to deploy for production, you may prefer to set up a service. This ensures that your app automatically restarts after an instance reboot or a crash. To set it up as a service, follow the below steps:

  • Ensure you've followed the previous EC2 steps incl. the virtual environment
  • Update the ExecStart line in the canedge_grafana_backend.service 'unit file' with your S3 details
  • Upload the modified file to get a public URL
  • In your EC2 instance, use below commands to deploy the file
sudo wget -N [your_file_url]
sudo cp canedge_grafana_backend.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl start canedge_grafana_backend
sudo systemctl enable canedge_grafana_backend
sudo journalctl -f -u canedge_grafana_backend
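For reference, a unit file along these lines could be used (a sketch only - the canedge_grafana_backend.service shipped in the repo is authoritative; adjust the user, paths and bracketed S3 placeholders to your setup):

```ini
[Unit]
Description=CANedge Grafana Backend
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/canedge-grafana-backend
ExecStart=/home/ubuntu/canedge-grafana-backend/env/bin/python3 canedge_datasource_cli.py [endpoint] --port 8080 --s3_ak [access_key] --s3_sk [secret_key] --s3_bucket [bucket]
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Using the virtual environment's python3 in ExecStart avoids having to activate the venv in the service context.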

The service should now be deployed, which you can verify via the console output. If you need to make updates to your unit file, simply repeat the above. You can stop the service via sudo systemctl stop [service].

Regarding EC2 costs

You can find details on AWS EC2 pricing here. A t3.small instance typically costs ~$0.02/hour (~$15-20/month). We recommend that you monitor usage during your tests early on to ensure that no unexpected cost developments occur. Note also that you do not pay for the data transfer from S3 into EC2 if deployed within the same region.

Regarding public EC2 IP

Note that stopping and starting your EC2 instance changes its public IP - and thus you'll need to update your datasource. There are methods to set a fixed IP (e.g. an AWS Elastic IP), though these are out of scope for this README.

Port forwarding a local deployment

If you want to access the data remotely, you can set up port forwarding. Below we outline how to port forward the backend app for use as a datasource in Grafana Cloud - but you could of course also port forward your local Grafana dashboard directly via port 3000.

  • Set up port forwarding on your WiFi router for port 8080
  • Run the app again (you may need to allow access via your firewall)
  • Find your public IP to get your endpoint as: http://[IP]:[port] (e.g. http://5.105.117.49:8080/)
  • In Grafana, add your new endpoint URL and click Save & test

Pending tasks

Below is a list of pending items:

  • Optimize Flask/Waitress session management for stability
  • Improve performance for multiple DBC files
  • Update code/guide for TLS-enabled deployment
  • Provide guidance on how to best scale the app for multiple front-end users
  • Determine if using Browser in SimpleJson datasource improves performance (requires TLS)