SoftwareDefinedVehicle/fleet-management-fork

The Eclipse SDV Blueprints project is a collaborative initiative led by Eclipse SDV members to bring software-defined vehicle concepts to life.

The project hosts a collection of blueprints that demonstrate the application of technologies developed in the context of the Eclipse SDV Working Group.

This repository contains the Fleet Management Blueprint, a close-to-real-life showcase for truck fleet management in which trucks run an SDV software stack so that logistics fleet operators can manage apps, data and services for a diverse set of vehicles.

The use case illustrates how the standard Vehicle Signal Specification (VSS) model can be customized and used to report data from a vehicle to a back end. The following diagram provides an overview of the current architecture:

The overall idea is to enable back end applications to consume data coming from a vehicle using the rFMS API.

Data originates from the vehicle's sensors, which are represented by a CSV file played back by the kuksa.val CSV feeder. The CSV feeder publishes the data to the kuksa.val Databroker. From there, the FMS Forwarder consumes the data and writes it to an InfluxDB in the back end. The measurements in the InfluxDB can then be visualized in a web browser by means of a Grafana dashboard. Alternatively, the measurements can be retrieved by a Fleet Management application via the FMS Server's (HTTP based) rFMS API.
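
For illustration, the measurements written by the FMS Forwarder can also be inspected directly via InfluxDB's HTTP query API. The sketch below is hedged: the organization, bucket, token and host port are placeholders, not the blueprint's actual values, which are defined in the Docker Compose setup.

# Query the last 15 minutes of data from InfluxDB 2.x (placeholder org/bucket/token/port)
curl -s -X POST "http://127.0.0.1:8086/api/v2/query?org=MY_ORG" \
  -H "Authorization: Token MY_TOKEN" \
  -H "Content-Type: application/vnd.flux" \
  -H "Accept: application/csv" \
  -d 'from(bucket: "MY_BUCKET") |> range(start: -15m)'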

Quick Start

The easiest way to set up and start the services is to use the Docker Compose file in the top-level folder:

docker compose -f ./fms-blueprint-compose.yaml up --detach

This will pull or build (if necessary) the container images and create and start all components.
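
To verify that all services are up and running, list the status of the compose project:

docker compose -f ./fms-blueprint-compose.yaml ps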

Once all services have been started, the current vehicle status can be viewed on a Grafana dashboard; log in with admin/admin as username and password.
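
The dashboard is served by the Grafana container. Assuming the compose file maps Grafana's default port 3000 to the local host (an assumption, please check the compose file for the actual mapping), a quick reachability check looks like this:

# Assumption: Grafana's default port 3000 is exposed on the host
curl -s -u admin:admin http://127.0.0.1:3000/api/health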

The rFMS API can be used to retrieve the data, e.g.

curl -v -s http://127.0.0.1:8081/rfms/vehicleposition?latestOnly=true | jq

Using Eclipse Hono to send Vehicle Data to Back End

By default, the Docker Compose file starts the FMS Forwarder configured to write vehicle data directly to the InfluxDB running in the back end.

However, in a real-world scenario, this tight coupling between the vehicle and the InfluxDB is not desirable. As an alternative, the blueprint supports configuring the FMS Forwarder to send vehicle data to the MQTT adapter of an Eclipse Hono instance as shown in the diagram below.

  1. Register the vehicle as a device in Hono using the create-config-hono.sh shell script:

    ./create-config-hono.sh --tenant MY_TENANT_ID --device-id MY_DEVICE_ID --device-pwd MY_PWD --provision

    Make sure to replace MY_TENANT_ID, MY_DEVICE_ID and MY_PWD with your own values.

    The script registers the tenant and device in Hono's Sandbox installation at hono.eclipseprojects.io unless the --host and/or --kafka-brokers command line arguments are used. Use the --help switch to print usage information.

    The script also creates configuration files in the OUT_DIR/config/hono folder. The OUT_DIR can be specified using the --out-dir option; it defaults to the current working directory. These files are used to configure the services started via Docker Compose in the next step.

  2. Start up the vehicle and back end services using Docker Compose:

    docker compose --env-file ./config/hono/hono.env -f ./fms-blueprint-compose.yaml -f ./fms-blueprint-compose-hono.yaml up --detach

    The path set via the --env-file option needs to be adapted to the output folder specified in the previous step.

    The second compose file specified on the command line will also start the FMS Consumer back end component, which receives vehicle data via Hono's north-bound, Kafka-based Telemetry API and writes it to the InfluxDB. A combined sketch of both steps is shown after this list.
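
Putting both steps together, a run against a non-default Hono installation with a dedicated output directory might look like the sketch below; the tenant, device ID, password, host and output directory are placeholder values:

# Step 1: register tenant and device, write config files to ./hono-out/config/hono
./create-config-hono.sh --tenant MY_TENANT_ID --device-id MY_DEVICE_ID --device-pwd MY_PWD \
  --host my-hono.example.com --out-dir ./hono-out --provision

# Step 2: start the vehicle and back end services using the generated environment file
docker compose --env-file ./hono-out/config/hono/hono.env \
  -f ./fms-blueprint-compose.yaml -f ./fms-blueprint-compose-hono.yaml up --detach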

Manual configuration

All information required for setting up the networks, volumes, configs and containers is contained in the Docker Compose file. Please refer to the Docker and/or Podman documentation for details on how to perform the setup manually.
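
For example, Docker Compose itself can render the fully resolved configuration (networks, volumes, configs and services) as a starting point for a manual setup:

# Print the resolved compose configuration without starting anything
docker compose -f ./fms-blueprint-compose.yaml config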

Additional information can be found in the components' corresponding subfolders.

Contributing

We are looking forward to your ideas and PRs. Each PR triggers a GitHub action which checks the formatting, performs linting and runs the tests. You can perform similar checks in your development environment. For more details, check the respective action; the checks are listed at the bottom of the file.
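
Assuming the workflow relies on the standard Rust tooling (an assumption based on the repository being mostly Rust; the action file is authoritative), the local equivalent is roughly:

# Formatting check (assumption: the workflow uses rustfmt)
cargo fmt --all -- --check
# Linting (assumption: the workflow uses clippy)
cargo clippy --all-targets -- -D warnings
# Tests
cargo test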
