Protox is an Elixir library for working with Google's Protocol Buffers, versions 2 and 3, supporting binary encoding and decoding.
The primary objective of Protox is reliability: it uses property testing and mutation testing, and has near-100% code coverage. Protox passes all the tests of the conformance checker provided by Google.
Note
If you're using version 1, please see how to migrate to version 2 here.
Given the following protobuf definition:
message Msg {
  int32 a = 1;
  map<int32, string> b = 2;
}
Protox will create a regular Elixir Msg struct:
iex> msg = %Msg{a: 42, b: %{1 => "a map entry"}}
iex> {:ok, iodata, iodata_size} = Msg.encode(msg)
iex> binary = # read binary from a socket, a file, etc.
iex> {:ok, msg} = Msg.decode(binary)
You can use Protox in two ways:
- pass the protobuf schema (as an inlined schema or as a list of files) to the Protox macro;
- generate Elixir source code files with the protox.generate mix task.
- Prerequisites
- Installation
- Usage with an inlined schema
- Usage with files
- Encode
- Decode
- Packages and namespaces
- Specify include path
- Files generation
- Unknown fields
- Unsupported features
- Implementation choices
- Generated code reference and types mapping
- Conformance
- Benchmark
- Contributing
- Elixir >= 1.15 and OTP >= 26
- protoc >= 3.0. This dependency is only required at compile time; it must be available in $PATH (see the check below).
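As a quick sanity check, you can verify that protoc is reachable from your shell:

protoc --version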
Add :protox to your list of dependencies in mix.exs:
def deps do
  [{:protox, "~> 2.0"}]
end
The following example generates two modules, Baz and Foo:
defmodule MyModule do
  use Protox, schema: """
  syntax = "proto3";

  message Baz {
  }

  message Foo {
    int32 a = 1;
    map<int32, Baz> b = 2;
  }
  """
end
Note
The module in which the Protox macro is called is ignored and does not appear in the names of the generated modules. To include the enclosing module's name, use the namespace option (see here).
Use the :files option to pass a list of files:
defmodule MyModule do
  use Protox, files: [
    "./defs/foo.proto",
    "./defs/bar.proto",
    "./defs/baz/fiz.proto"
  ]
end
Here's how to encode a message to binary protobuf:
msg = %Foo{a: 3, b: %{1 => %Baz{}}}
{:ok, iodata, iodata_size} = Protox.encode(msg)
# or using the bang version
{iodata, iodata_size} = Protox.encode!(msg)
It's also possible to call encode/1 and encode!/1 directly on the generated structures:
{:ok, iodata, iodata_size} = Foo.encode(msg)
{iodata, iodata_size} = Foo.encode!(msg)
Note
encode/1 and encode!/1 return IO data for efficiency reasons. Such IO data can be used directly with file or socket write operations:
iex> {iodata, _iodata_size} = Protox.encode!(%Foo{a: 3, b: %{1 => %Baz{}}})
{["\b", <<3>>, <<18, 4, 8>>, <<1>>, <<18>>, [<<0>>, []]], 8}
iex> {:ok, file} = File.open("msg.bin", [:write])
{:ok, #PID<0.1023.0>}
iex> IO.binwrite(file, iodata)
:ok
Use :binary.list_to_bin/1 or IO.iodata_to_binary/1 if you need to get a binary from IO data.
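For instance, here is a minimal sketch (reusing the Foo message from the examples above) of turning the encoded IO data into a single binary:

# Encode, then flatten the IO data into a binary.
{iodata, _iodata_size} = Foo.encode!(%Foo{a: 3, b: %{1 => %Baz{}}})
binary = IO.iodata_to_binary(iodata)
# binary == <<8, 3, 18, 4, 8, 1, 18, 0>>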
Here's how to decode a message from binary protobuf:
{:ok, msg} = Protox.decode(<<8, 3, 18, 4, 8, 1, 18, 0>>, Foo)
# or using the bang version
msg = Protox.decode!(<<8, 3, 18, 4, 8, 1, 18, 0>>, Foo)
It's also possible to call decode/1 and decode!/1 directly on the generated structures:
{:ok, msg} = Foo.decode(<<8, 3, 18, 4, 8, 1, 18, 0>>)
msg = Foo.decode!(<<8, 3, 18, 4, 8, 1, 18, 0>>)
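Putting encoding and decoding together, here is a minimal round-trip sketch with the Foo message used above:

# Encode a message, flatten the IO data, and decode it back.
msg = %Foo{a: 3, b: %{1 => %Baz{}}}
{iodata, _size} = Foo.encode!(msg)
decoded = iodata |> IO.iodata_to_binary() |> Foo.decode!()
# decoded == msg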
Protox honors the package directive:
package abc.def;
message Baz {}
The example above will be translated to Abc.Def.Baz (note the camelization of package abc.def to Abc.Def).
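For instance, with the definition above, the generated struct is used as follows:

msg = %Abc.Def.Baz{}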
In addition, Protox provides the possibility to prepend a namespace with the :namespace option:
defmodule Bar do
  use Protox,
    schema: """
    syntax = "proto3";
    package abc;

    message Msg {
      int32 a = 1;
    }
    """,
    namespace: __MODULE__
end
In this example, the module Bar.Abc.Msg is generated:
msg = %Bar.Abc.Msg{a: 42}
One or more include paths (directories in which to search for imports) can be specified using the :paths option:
defmodule Baz do
  use Protox,
    files: [
      "./defs1/prefix/foo.proto",
      "./defs1/prefix/bar.proto",
      "./defs2/prefix/baz/baz.proto"
    ],
    paths: [
      "./defs1",
      "./defs2"
    ]
end
Note
It corresponds to the -I option of protoc.
It's possible to generate Elixir source code files with the protox.generate mix task:
mix protox.generate --output-path=/path/to/messages.ex protos/foo.proto protos/bar.proto
The generated files can be used in any project, as long as Protox is declared in that project's dependencies, since they call functions from the Protox runtime.
Note
protoc is not needed to compile the generated files.
The following options are available:

- --output-path: the path of the file to be generated, or of the destination folder when generating multiple files.
- --include-path: specifies an include path. If multiple include paths are needed, add more --include-path options.
- --multiple-files: generates one file per Elixir module. It's useful for definitions with many messages, as the compilation will be parallelized. When generating multiple files, the --output-path option must point to a directory.
- --namespace: prepends a namespace to all generated modules.
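For instance, a hypothetical invocation combining these options could look like the following (the paths and the Proto namespace are placeholders):

mix protox.generate \
  --multiple-files \
  --output-path=./lib/protobuf \
  --include-path=./protos \
  --namespace=Proto \
  protos/foo.proto protos/bar.proto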
Unknown fields are fields that are present on the wire but do not correspond to any entry in the protobuf definition. Typically, this occurs when the sender has a newer version of the protobuf definition than the receiver. Keeping these fields enables backward compatibility: a receiver with an older version of the definition can still decode the fields it knows about.
When unknown fields are encountered at decoding time, they are kept in the decoded message. You can access them with the unknown_fields/1 function defined with the message.
iex> msg = Msg.decode!(<<8, 42, 42, 4, 121, 97, 121, 101, 136, 241, 4, 83>>)
%Msg{a: 42, b: "", z: -42, __uf__: [{5, 2, <<121, 97, 121, 101>>}]}
iex> Msg.unknown_fields(msg)
[{5, 2, <<121, 97, 121, 101>>}]
You must use unknown_fields/1, as the name of the field (e.g. __uf__ in the above example) is generated at compile time to avoid collisions with the actual fields of the protobuf message. This function returns a list of tuples {tag, wire_type, bytes}. For more information, please see the protobuf encoding guide.
Note
Unknown fields are retained when re-encoding the message.
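For instance, here is a sketch of such a round trip, reusing the Msg message decoded above:

# Decode a payload containing an unknown field (tag 5), then re-encode it.
msg = Msg.decode!(<<8, 42, 42, 4, 121, 97, 121, 101, 136, 241, 4, 83>>)
{iodata, _size} = Msg.encode!(msg)
reencoded = IO.iodata_to_binary(iodata)
# The bytes of the unknown field are still present in `reencoded`,
# so a peer with a newer schema can still read that field.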
The following features are not supported:

- The Any well-known type is only partially supported: you can manually unpack the embedded message after decoding and, conversely, pack it before encoding (see the sketch after this list);
- Groups (deprecated in protobuf);
- All options other than packed and default are ignored, as they concern implementation details of other languages.
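To illustrate the manual packing and unpacking of Any mentioned above, here is a sketch. It assumes a schema, compiled elsewhere with Protox, that imports google/protobuf/any.proto (so that a Google.Protobuf.Any module is generated) and declares two hypothetical messages: Foo (with an int32 field a) and Wrapper (with a google.protobuf.Any field payload):

# Pack: encode the embedded message and wrap it in an Any before encoding.
{iodata, _size} = Foo.encode!(%Foo{a: 42})
any = %Google.Protobuf.Any{
  type_url: "type.googleapis.com/Foo",
  value: IO.iodata_to_binary(iodata)
}
{wrapper_iodata, _size} = Wrapper.encode!(%Wrapper{payload: any})

# Unpack: read the Any field back, then decode its raw bytes with the right module.
wrapper = Wrapper.decode!(IO.iodata_to_binary(wrapper_iodata))
%Foo{a: 42} = Foo.decode!(wrapper.payload.value)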
Protox makes the following implementation choices:

- (Protobuf 2) Required fields: Protox enforces the presence of required fields; an error is raised when encoding a message with a missing required field:

  defmodule Bar do
    use Protox, schema: """
    syntax = "proto2";
    message Required { required int32 a = 1; }
    """
  end

  iex> Protox.encode!(%Required{})
  ** (Protox.RequiredFieldsError) Some required fields are not set: [:a]
- Enum aliases: when decoding, the last encountered constant is used. For instance, in the following example, :BAR is always used if the value 1 is read on the wire:

  enum E {
    option allow_alias = true;
    FOO = 0;
    BAZ = 1;
    BAR = 1;
  }
- (Protobuf 2) Unset optional fields are assigned nil. You can use the generated default/1 function to get the default value of a field:

  defmodule Bar do
    use Protox, schema: """
    syntax = "proto2";
    message Foo { optional int32 a = 1 [default = 42]; }
    """
  end

  iex> %Foo{}.a
  nil
  iex> Foo.default(:a)
  {:ok, 42}
- (Protobuf 3) Unset fields are assigned their default values. However, if you use the optional keyword (available in protoc >= 3.15), unset fields are assigned nil:

  defmodule Bar do
    use Protox, schema: """
    syntax = "proto3";
    message Foo {
      int32 a = 1;
      optional int32 b = 2;
    }
    """
  end

  iex> %Foo{}.a
  0
  iex> Foo.default(:a)
  {:ok, 0}
  iex> %Foo{}.b
  nil
  iex> Foo.default(:b)
  {:error, :no_default_value}
- Message and enum names are converted using the Macro.camelize/1 function. Thus, in the following example, non_camel_message becomes NonCamelMessage, but the field non_camel_field is left unchanged:

  defmodule Bar do
    use Protox, schema: """
    syntax = "proto3";
    message non_camel_message {
    }
    message CamelMessage {
      int32 non_camel_field = 1;
    }
    """
  end

  iex> msg = %NonCamelMessage{}
  %NonCamelMessage{__uf__: []}
  iex> msg = %CamelMessage{}
  %CamelMessage{__uf__: [], non_camel_field: 0}
- The detailed reference of the generated code is available in documentation/reference.md.
- Please see documentation/types_mapping.md for how protobuf types are mapped to Elixir types.
The Protox library has been thoroughly tested using the conformance checker provided by Google.
To launch these conformance tests, use the protox.conformance mix task:
$ mix protox.conformance
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1738246114.224098 3490144 conformance_test_runner.cc:394] ./protox_conformance
CONFORMANCE TEST BEGIN ====================================
CONFORMANCE SUITE PASSED: 1368 successes, 1307 skipped, 0 expected failures, 0 unexpected failures.
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1738246115.065491 3495574 conformance_test_runner.cc:394] ./protox_conformance
CONFORMANCE TEST BEGIN ====================================
CONFORMANCE SUITE PASSED: 0 successes, 414 skipped, 0 expected failures, 0 unexpected failures.
Note
A report will be generated in the conformance_report directory.
Please see benchmark/launch_benchmark.md for more information on how to launch the benchmarks.
Please see CONTRIBUTING.md for more information on how to contribute.