A Calustra - Eloy Coto Pereiro

Dynamic Serialization with Protobuf on Embedded Rust


The other day, I needed to transmit data via LoRa and realized I needed dynamic serialization for my data structures. Fortunately, there are several serialization engines I could leverage.

Typically, I'd opt for JSON due to its simplicity and widespread support. However, Protocol Buffers (ProtoBuf), which has been around for years, usually offers faster encoding and decoding and a more compact payload. While ProtoBuf may seem a bit more complex initially, I found it quite manageable, especially when working with proto files. Another option, Avro, exists, but I lack sufficient experience with it.

When it comes to ProtoBuf in Rust, the go-to library is tokio-rs/prost. It offers an excellent API and no_std support, although it does require heap allocation (alloc).

Heap allocation on embedded devices

Heap allocation isn't typically present in embedded environments, but thanks to the Rust Embedded team's efforts, there is a heap allocator library available: embedded-alloc. However, allocating memory on embedded devices requires careful consideration: allocations can fail at runtime, and memory is scarce.

To enable allocation on a device, follow these steps:

First, we need to incorporate the alloc crate:

use embedded_alloc::Heap;
extern crate alloc;

A global allocator is how you define the Rust allocator for your program. The GlobalAlloc trait outlines what an allocator must implement. You can find a straightforward example in the documentation here.
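To see the trait in action, here is a minimal sketch of a custom global allocator: a pass-through that delegates to the system allocator. It uses std for demonstration; on the device, embedded_alloc::Heap plays this role instead.

```rust
use std::alloc::{GlobalAlloc, Layout, System};

// A pass-through allocator: every allocation is forwarded to the
// system allocator. Real embedded allocators manage a fixed region.
struct PassThrough;

unsafe impl GlobalAlloc for PassThrough {
    unsafe fn alloc(&self, layout: Layout) -> *mut u8 {
        System.alloc(layout)
    }

    unsafe fn dealloc(&self, ptr: *mut u8, layout: Layout) {
        System.dealloc(ptr, layout)
    }
}

#[global_allocator]
static GLOBAL: PassThrough = PassThrough;

fn main() {
    // Every heap allocation in the program now goes through GLOBAL.
    let v: Vec<u8> = vec![1, 2, 3];
    assert_eq!(v, [1, 2, 3]);
    println!("allocated {} bytes through the custom allocator", v.len());
}
```

The `#[global_allocator]` attribute is the key piece: it tells the compiler which static to route all allocations through.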

Our application should define a new global allocator using the embedded_alloc::Heap type:

#[global_allocator]
static HEAP: Heap = Heap::empty();

I highly recommend exploring the Heap implementation, as it's relatively straightforward and provides helpful insight into the internals. You can find it here.

One of the prerequisites is to initialize the heap. It's crucial to understand the constraints of your board to avoid filling up memory unnecessarily. Here's an example:

fn main() -> ! {
    // Initialize the heap before any allocation happens.
    use core::mem::MaybeUninit;
    const HEAP_SIZE: usize = 1024;
    static mut HEAP_MEM: [MaybeUninit<u8>; HEAP_SIZE] = [MaybeUninit::uninit(); HEAP_SIZE];
    unsafe { HEAP.init(HEAP_MEM.as_ptr() as usize, HEAP_SIZE) }

    // ... rest of the application ...
    loop {}
}

With this setup, the alloc crate's collections are now available, like alloc::vec::Vec for growable arrays.
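As a quick sketch of what that unlocks (run here with std for convenience; the same code works in a no_std binary once the heap is initialized), alloc's collections behave like their std counterparts:

```rust
// alloc's collections become usable once a global allocator exists.
extern crate alloc;

use alloc::string::String;
use alloc::vec::Vec;

fn main() {
    let mut readings: Vec<u32> = Vec::new();
    readings.push(42); // grows on the heap as needed
    readings.push(7);

    let mut label = String::from("lora");
    label.push_str("-payload");

    assert_eq!(readings, [42, 7]);
    assert_eq!(label, "lora-payload");
    println!("{} readings in '{}'", readings.len(), label);
}
```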

Encoding to protobuf

Let's delve into why prost needs the alloc library. When parsing protobuf data, the number of items can be undetermined, making fixed-size arrays awkward to work with. Prost leverages alloc::vec::Vec, enabling arrays to grow dynamically as needed.

There's a helpful guide on using prost, which you can follow. For our purposes, we'll focus on serializing an internal struct. In this case, it's crucial to derive prost's Message trait and annotate each field with a prost attribute:

#[derive(Clone, PartialEq, ::prost::Message)]
pub struct Chrono {
    #[prost(uint32, tag = "1")]
    pub id: u32,
    #[prost(uint32, tag = "2")]
    pub time: u32,
}
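For reference, if this message were defined in a proto file instead of annotated by hand, the equivalent schema would look something like this (a hand-written sketch, not generated output):

```protobuf
syntax = "proto3";

message Chrono {
  uint32 id = 1;
  uint32 time = 2;
}
```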

Once the struct is initialized, the encode method is available, which writes the encoded bytes into a buffer:

// Assuming a populated value, e.g.:
let chrono = Chrono { id: 7, time: 300 };

let mut dst = Vec::with_capacity(chrono.encoded_len());
chrono.encode(&mut dst).unwrap();
writeln!(hstdout, "DST: {:?}", dst).unwrap();
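To see what those bytes actually contain, here's a hand-rolled sketch of protobuf's varint wire encoding, the same scheme prost applies to uint32 fields (written for illustration; this is not prost's API):

```rust
// Encode a value as a protobuf varint: 7 bits per byte, with the high
// bit set on every byte except the last.
fn encode_varint(mut value: u64, buf: &mut Vec<u8>) {
    loop {
        let byte = (value & 0x7f) as u8;
        value >>= 7;
        if value == 0 {
            buf.push(byte);
            break;
        }
        buf.push(byte | 0x80);
    }
}

// A field key is (field_number << 3) | wire_type; wire type 0 = varint.
fn encode_field(tag: u32, value: u64, buf: &mut Vec<u8>) {
    encode_varint(u64::from(tag << 3), buf);
    encode_varint(value, buf);
}

fn main() {
    let mut buf = Vec::new();
    encode_field(1, 7, &mut buf);   // id = 7
    encode_field(2, 300, &mut buf); // time = 300
    assert_eq!(buf, [8, 7, 16, 172, 2]);
    println!("{:?}", buf);
}
```

So a Chrono with id = 7 and time = 300 should serialize to those five bytes: a one-byte key and value for each small field, plus an extra byte once a value no longer fits in 7 bits.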

To wrap up, I've opted for prost despite its slight complexity because I require a dynamic array of unpredictable length, and I'd rather avoid implementing chunked reading or custom serialization. This approach aligns well with my use case, although I'll proceed with caution in other scenarios.

