I have a list of 15 million POJOs; whenever I try to put them onto Redis, it always throws an OOM.
In the beginning I thought it might be the heap size. However, even after increasing it to 100GB (the 15 million rows have a memory footprint of ~30GB or less), it still throws the same exception.
Looking further into it, this seems to be an issue with the Redis Java client implementation, which takes the values to push as a varargs parameter.
When an ArrayList is passed to a varargs method, the JDK wraps it as a single element, so the varargs array has size 1. As a result, when the Redis client pushes the value to the server, it is really serializing one object: the ArrayList of 15 million POJOs.
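A minimal sketch of the varargs behavior described above. `elementCount` is a hypothetical stand-in for a client method such as `rightPushAll(key, V... values)`; the point is only how the compiler packages the argument:

```java
import java.util.ArrayList;
import java.util.List;

public class VarargsDemo {
    // Hypothetical stand-in for a Redis client varargs push method.
    static int elementCount(Object... values) {
        return values.length;
    }

    public static void main(String[] args) {
        List<String> pojos = new ArrayList<>(List.of("a", "b", "c"));

        // The whole list is wrapped as ONE element of the varargs array...
        System.out.println(elementCount(pojos));           // prints 1

        // ...whereas spreading it yields one element per POJO.
        System.out.println(elementCount(pojos.toArray())); // prints 3
    }
}
```

So the client sees a single value to serialize, no matter how many elements the list holds.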
For both Jackson and the default JDK serializer, the output is buffered in a byte array, and Java arrays are indexed by `int`. With enough elements written, the number of bytes would exceed `Integer.MAX_VALUE`, at which point the buffer can no longer grow and the OOM is thrown.
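Back-of-the-envelope arithmetic shows how easily 15 million POJOs cross that line. The 150-byte average serialized size is a hypothetical figure; the real number depends on the class and the serializer:

```java
public class ByteBudget {
    public static void main(String[] args) {
        // Hypothetical average serialized size per POJO (assumption,
        // not measured from the actual data).
        long bytesPerPojo = 150;
        long pojoCount = 15_000_000L;

        long totalBytes = pojoCount * bytesPerPojo; // 2_250_000_000
        // Integer.MAX_VALUE is 2_147_483_647, so a single buffer
        // holding the whole list cannot be addressed by an int index.
        System.out.println(totalBytes > Integer.MAX_VALUE); // prints true
    }
}
```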
It looks like the Redis client should handle the input differently when it is a collection, instead of treating it as a single object to serialize.
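Until the client handles collections natively, a common workaround is to split the list into bounded batches and push each batch separately, so no single serialization call approaches the `int` limit. A sketch, with the push call shown only as a comment since the exact client API varies:

```java
import java.util.ArrayList;
import java.util.List;

public class Chunks {
    // Split a large list into fixed-size batches so each push
    // serializes a bounded number of elements.
    static <T> List<List<T>> partition(List<T> source, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < source.size(); i += batchSize) {
            batches.add(source.subList(i, Math.min(i + batchSize, source.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 10; i++) data.add(i);

        List<List<Integer>> batches = partition(data, 3);
        System.out.println(batches.size());  // prints 4
        System.out.println(batches.get(3));  // prints [9]

        // Each batch would then be pushed individually, e.g. (assuming
        // a Spring-Data-style API): for (List<Integer> b : batches)
        //     template.opsForList().rightPushAll(key, b.toArray());
    }
}
```

Spreading each batch with `toArray()` also sidesteps the varargs problem above, since every element becomes its own entry in the varargs array.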