Processor and API for backend tooling to help with DBC and PID vehicle decoding.

vehicle-signal-decoding

API for managing vehicle signal decoding on the DIMO platform.

There must be at least one 'default' template per powertrain and protocol combo. Default templates are those whose names start with the word default-. Templates can also have special mappings by:

  • specific device Ethereum address (device_to_template table)
  • specific list of device definition IDs (template_device_definitions table)
  • range of years, makes, and models (template_vehicles table)
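The mappings above are checked from most to least specific before falling back to a default. A minimal sketch of that precedence in Go — the struct, map, and function names here are illustrative, not the service's actual schema:

```go
package main

import (
	"fmt"
	"strings"
)

// isDefaultTemplate reports whether a template name follows the
// default-template naming convention (name starts with "default-").
func isDefaultTemplate(name string) bool {
	return strings.HasPrefix(name, "default-")
}

// mappings stands in for the special-mapping tables described above.
type mappings struct {
	byEthAddr      map[string]string // device_to_template
	byDefinitionID map[string]string // template_device_definitions
}

// resolveTemplate picks a template for a device, checking the special
// mappings in order of specificity before falling back to the default
// for the powertrain/protocol combo.
func resolveTemplate(m mappings, ethAddr, definitionID, defaultName string) string {
	if t, ok := m.byEthAddr[ethAddr]; ok {
		return t
	}
	if t, ok := m.byDefinitionID[definitionID]; ok {
		return t
	}
	return defaultName
}

func main() {
	m := mappings{
		byEthAddr: map[string]string{"0xabc": "custom-template"},
	}
	// Device with a specific mapping gets its custom template.
	fmt.Println(resolveTemplate(m, "0xabc", "", "default-ice-can11_500"))
	// Unmapped device falls back to the default for its powertrain/protocol.
	fmt.Println(resolveTemplate(m, "0xdef", "", "default-ice-can11_500"))
}
```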

Flow for a device to get its configuration:

  1. Get the URLs to your device's configuration: /device-config/eth-addr/:ethAddr/urls
  2. If the above fails, typically because your vehicle/device pairing hasn't been minted yet, try by VIN: /device-config/vin/:vin/urls
  3. You should now have the URLs to the different device configurations:
  • PIDs
  • DBC file (may be blank)
  • settings, usually power-related
  If nothing is returned, your device should at least have some default PIDs and power settings it can try in the meantime. Check back periodically on the endpoint in step 1.
  4. Download the respective data and use it on your device.
  5. Persist the URL responses so that you can compare them in the future to check for configuration updates for your device/vehicle. Note that the URLs have a version at the end, e.g. @v1.0.0
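The update check in the last step can key off the @vX.Y.Z suffix on each URL. A minimal sketch, assuming the version always appears after the last @ in the URL (the URLs and function names here are illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// versionOf extracts the trailing @vX.Y.Z version from a configuration
// URL, e.g. ".../pids/some-template@v1.0.0" -> "v1.0.0". It returns ""
// when no version suffix is present.
func versionOf(configURL string) string {
	i := strings.LastIndex(configURL, "@")
	if i == -1 {
		return ""
	}
	return configURL[i+1:]
}

// needsUpdate compares the previously persisted URL against the one the
// /urls endpoint currently returns; a changed version means new config.
func needsUpdate(storedURL, latestURL string) bool {
	return versionOf(storedURL) != versionOf(latestURL)
}

func main() {
	stored := "https://example.com/device-config/pids/some-template@v1.0.0"
	latest := "https://example.com/device-config/pids/some-template@v1.1.0"
	fmt.Println(needsUpdate(stored, latest)) // true: a newer version was published
	fmt.Println(needsUpdate(stored, stored)) // false: nothing changed
}
```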

Developing locally

To install postgres:

brew install postgresql
pg_ctl -D /usr/local/var/postgres start && brew services start postgresql

If postgres is already installed, create the database:

psql postgres
create user dimo PASSWORD 'dimo';
grant dimo to postgres;
create database vehicle_signal_decoding_api
    with owner dimo;

Open postgres database in DataGrip to view schema, tables, etc.

TL;DR

cp settings.sample.yaml settings.yaml
docker compose up -d
go run ./cmd/vehicle-signal-decoding migrate
go run ./cmd/vehicle-signal-decoding

Regenerate swagger docs

swag init -g cmd/vehicle-signal-decoding/main.go --parseDependency --parseInternal --generatedTime true

https://github.com/swaggo/swag#declarative-comments-format

Generating gRPC client and server code

  1. Install the protocol compiler plugins for Go using the following commands
brew install protobuf
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
  1. Run protoc in the root directory
protoc --go_out=. --go_opt=paths=source_relative \
    --go-grpc_out=. --go-grpc_opt=paths=source_relative \
    pkg/grpc/*.proto

Linting

brew install golangci-lint

golangci-lint run

This should use the settings from .golangci.yml, which you can override.

If the brew version does not work, download it from https://github.com/golangci/golangci-lint/releases (darwin arm64 for M1), then copy it to /usr/local/bin and run sudo xattr -c golangci-lint

Database ORM

This project uses sqlboiler. The ORM models are code-generated; if the database schema changes, you must regenerate the models.

Make sure you have sqlboiler installed:

go install github.com/volatiletech/sqlboiler/v4@latest
go install github.com/volatiletech/sqlboiler/v4/drivers/sqlboiler-psql@latest

To generate the models:

sqlboiler psql --no-tests --wipe
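sqlboiler reads its connection settings from a sqlboiler.toml in the project root. A sketch of what that file might look like for the local database created above — the schema name and sslmode value are assumptions, not confirmed from this repo:

```toml
[psql]
  dbname  = "vehicle_signal_decoding_api"
  host    = "localhost"
  port    = 5432
  user    = "dimo"
  pass    = "dimo"
  sslmode = "disable"
  schema  = "vehicle_signal_decoding_api"
```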

Make sure you're running the docker image (i.e. docker compose up).

If you get a "command not found" error with sqlboiler, make sure your Go install is correct. Instructions here

Adding migrations

To install the goose CLI:

$ go install github.com/pressly/goose/v3/cmd/goose@latest
export GOOSE_DRIVER=postgres

With goose installed, create a migration:

goose -dir internal/infrastructure/db/migrations create slugs-not-null sql
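The command above creates a timestamped .sql file in the migrations directory, which you then fill in with goose's Up/Down annotations. A sketch of what the generated slugs-not-null migration might contain — the table and column names are hypothetical, not taken from this repo's schema:

```sql
-- +goose Up
-- +goose StatementBegin
ALTER TABLE templates
    ALTER COLUMN template_name SET NOT NULL;
-- +goose StatementEnd

-- +goose Down
-- +goose StatementBegin
ALTER TABLE templates
    ALTER COLUMN template_name DROP NOT NULL;
-- +goose StatementEnd
```

Running go run ./cmd/vehicle-signal-decoding migrate (as in the TL;DR above) applies pending migrations.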

Local development

Importing data: device definition exports are here. You can import with sqlboiler or with this command:

psql "host=localhost port=5432 dbname=vehicle_signal_decoding_api user=dimo password=dimo" -c "\COPY vehicle_signal_decoding_api.integrations (id, type, style, vendor, created_at, updated_at, refresh_limit_secs, metadata) FROM '/Users/aenglish/Downloads/drive-download-20221020T172636Z-001/integrations.csv' DELIMITER ',' CSV HEADER"

Starting Kafka locally

$ brew services start kafka
$ brew services start zookeeper

This uses brew services to start Kafka locally on port 9092. One advantage of this over docker-compose is that we can use the same instance for all of our locally running services that require Kafka.

Produce some test messages

$ go run ./cmd/test-producer

In its current state this only produces a single message, but it should be a good enough starting point for local testing.

Create decoding topic

kafka-topics --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic topic.JOB.decoding

Sample read messages in the topic

kafka-console-consumer --bootstrap-server localhost:9092 --topic topic.JOB.decoding --from-beginning
