Java Redis OOM during serialization

I have a list of 15 million POJOs; whenever I put these onto Redis, it throws an OOM.

At first I thought it might be the heap size. However, even after increasing it to 100GB (the 15 million rows have a memory footprint of roughly 30GB or less), it still threw the same exception.

Looking further into it, the problem seems to come from the Redis Java client implementation, which takes a varargs parameter.

The ArrayList is treated as an Object[] of size 1 (this is how the JDK implements varargs). So when the Redis client pushes that value onto the Redis server, it is really serializing a single object: the ArrayList of 15 million POJOs.

https://github.com/spring-projects/spring-data-redis/blob/main/src/main/java/org/springframework/data/redis/core/AbstractOperations.java#L136
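A minimal sketch of how this can happen, assuming the list is handed to Spring Data Redis's `leftPushAll` (the POJO, key, and template setup below are hypothetical):

```java
import java.util.List;

import org.springframework.data.redis.core.RedisTemplate;

public class PushDemo {

    // Hypothetical POJO standing in for the real one.
    public static class MyPojo {
        public long id;
        public String name;
    }

    public static void push(RedisTemplate<String, Object> template, List<MyPojo> pojos) {
        // Intended: push each of the 15 million POJOs as its own list entry.
        // Actual: List<MyPojo> is not a Collection<Object> (generics are
        // invariant), so the Collection overload leftPushAll(K, Collection<V>)
        // is not applicable. The compiler falls back to the varargs overload
        // leftPushAll(K, V...) and wraps the list as new Object[] { pojos }:
        // one element, serialized as one giant blob.
        template.opsForList().leftPushAll("some-key", pojos);
    }
}
```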

For both the Jackson serializer and the default JDK serializer, the output is accumulated in an in-memory byte buffer whose size is tracked as an `int`. With enough elements written, the number of bytes exceeds `Integer.MAX_VALUE`, which is the maximum length of a Java array, so the allocation fails with an OOM no matter how large the heap is.

https://github.com/FasterXML/jackson-core/blob/2.14/src/main/java/com/fasterxml/jackson/core/util/ByteArrayBuilder.java#L279
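The hard limit is easy to demonstrate in isolation. Java arrays are indexed by `int`, so no single `byte[]` can hold the serialized output once it passes ~2GB; a minimal sketch:

```java
public class ArrayLimitDemo {
    public static void main(String[] args) {
        // Array lengths in Java are ints, so Integer.MAX_VALUE is the hard
        // ceiling for a single byte[]; HotSpot in practice refuses even a few
        // bytes below that ("Requested array size exceeds VM limit").
        // This fails regardless of how large -Xmx is.
        byte[] buffer = new byte[Integer.MAX_VALUE];
        System.out.println(buffer.length);
    }
}
```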

It looks like the Redis client should handle the input differently when it is a collection, serializing each element individually instead of treating the whole thing as a single-element Object[].
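In the meantime, a workaround is to keep the list out of the varargs path entirely: copy (or chunk) it into a `Collection<Object>` so the collection overload of `leftPushAll` is selected and each POJO is serialized on its own. A rough sketch (`pushInChunks` and its parameters are my own naming; chunking also keeps each round trip to Redis reasonably sized):

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

import org.springframework.data.redis.core.RedisTemplate;

public class ChunkedPush {

    // Pushes items in fixed-size chunks. Each chunk is copied into a
    // Collection<Object> so the Collection overload of leftPushAll applies,
    // meaning every element is serialized individually and no single
    // payload comes anywhere near Integer.MAX_VALUE bytes.
    public static <T> void pushInChunks(RedisTemplate<String, Object> template,
                                        String key, List<T> items, int chunkSize) {
        for (int from = 0; from < items.size(); from += chunkSize) {
            int to = Math.min(from + chunkSize, items.size());
            Collection<Object> chunk = new ArrayList<>(items.subList(from, to));
            template.opsForList().leftPushAll(key, chunk);
        }
    }
}
```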
