Record size limit

I’m getting record size limit exceptions

fdb.impl.FDBError: b'Value length exceeds limit' (2103)

although the record is only 58k. I’m calling from C via the C–Python bindings, and on the C side the PyBytes object reports PyBytes_GET_SIZE(value) == 58378.

What’s going on? I thought the fdb value limit was 100k?

Any pointers much appreciated! :slight_smile:


Nevermind, got it… the value contains 41630 0x00 bytes and 1781 0xFF bytes, which after escaping makes 101789 bytes total. Ouch!

Any way to store binary values as length-encoded blobs instead of the current byte-string encoding that needs to escape 0x00 (and 0xFF)?
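For anyone hitting the same thing, here’s a rough back-of-the-envelope sketch of why the packed value blows past the limit. Per the tuple layer spec, a byte-string element gets a 0x01 type code, every embedded 0x00 is escaped as 0x00 0xFF, and a 0x00 terminator is appended (the exact encoding details may differ by bindings version; the record contents below are hypothetical, matching the counts in the post):

```python
# Approximate size of a bytes element after tuple-layer packing:
# 1 type-code byte + payload (each 0x00 doubled by escaping) + 1 terminator.
def packed_bytes_size(value: bytes) -> int:
    return 1 + len(value) + value.count(0) + 1

# Hypothetical 58,378-byte record with 41,630 zero bytes, as in the post.
record = b"\x00" * 41630 + b"\xff" * 1781 + b"x" * (58378 - 41630 - 1781)

print(len(record))                # 58378 -- well under the limit
print(packed_bytes_size(record))  # 100010 -- already past the 100,000-byte value limit
```

So even counting only the 0x00 escapes, a 58k value with that many zero bytes packs to over 100,000 bytes.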

It sounds like you are putting the byte string into a tuple and then packing that tuple to get the key-value pair that you store. Is that right? Is that because it is part of some larger data structure? If not, you could just put the bytes themselves into the value, with no further encoding at all. If so, then perhaps you need a different serialization format in place of tuples.

Tuples have nice ordering properties that make them a good choice for the key, but that aren’t needed for the value.
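Something like this sketch, using the official fdb Python bindings — tuple-pack only the key, store the payload verbatim. The key layout `(b"records", record_id)` and the function names are just illustrative, not from the thread, and the import is guarded since this isn’t run against a cluster here:

```python
# Hedged sketch: keys get tuple-packed (ordering matters there);
# values are stored as raw bytes with no escaping, so a 58k payload
# stays 58k on disk.
try:
    import fdb
    fdb.api_version(630)

    @fdb.transactional
    def store_record(tr, record_id, payload):
        # Pack the key only; the value goes in verbatim.
        tr[fdb.tuple.pack((b"records", record_id))] = payload

    @fdb.transactional
    def load_record(tr, record_id):
        v = tr[fdb.tuple.pack((b"records", record_id))]
        # v is an fdb.impl.Value; bytes(v) yields a plain byte string
        # that a C extension can consume with the PyBytes_* API.
        return bytes(v) if v.present() else None
except ImportError:
    fdb = None  # bindings not installed; sketch only
```

With a cluster available you’d call these as `store_record(fdb.open(), rid, payload)` and `load_record(fdb.open(), rid)`.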

Thanks! That’s it. I realized that sometime after posting this. I guess I originally thought I had to pack the record, because the object returned on a fetch is of type fdb.impl.Value (iirc), which I didn’t handle correctly in my C program that uses fdb via the C Python bindings → fdb Python API.