Recently, I implemented a function that puts variably sized data onto Redis. When the data is small, the function works fine. However, when it tries to put a large value onto Redis, it ends up in an infinite loop of "reconnecting".
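For context, the put looked roughly like this. A minimal sketch, assuming a Spring-managed `RedisTemplate<String, byte[]>`; the class and method names are illustrative, not my original code:

```java
import org.springframework.data.redis.core.RedisTemplate;

public class LargeValueWriter {

    private final RedisTemplate<String, byte[]> redisTemplate;

    public LargeValueWriter(RedisTemplate<String, byte[]> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    public void put(String key, byte[] value) {
        // Works fine for small payloads; with a value above Redis's
        // bulk-length limit, the Lettuce-backed connection ends up
        // reconnecting in a loop instead of failing fast.
        redisTemplate.opsForValue().set(key, value);
    }
}
```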
I spent a lot of time on this, and it turned out to be a bug in the Lettuce implementation:
https://github.com/lettuce-io/lettuce-core/issues/2093
After switching to Jedis, the real exception was uncovered: "error: invalid bulk length", which Lettuce seems to mask behind those infinite reconnects.
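If you want to reproduce this, one way to force Spring Data Redis onto Jedis is to declare a JedisConnectionFactory bean yourself. A sketch, assuming Jedis is already on the classpath and Redis runs on localhost:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.RedisStandaloneConfiguration;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;

@Configuration
public class JedisConfig {

    // Overrides the default Lettuce-backed factory so the same put
    // goes through Jedis, which surfaces the real Redis error.
    @Bean
    public RedisConnectionFactory redisConnectionFactory() {
        return new JedisConnectionFactory(
                new RedisStandaloneConfiguration("localhost", 6379));
    }
}
```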
The real error thrown by Redis is due to its hard limit of 512 MB on value size, which sounds reasonable:
https://github.com/redis/redis/blob/586a16ad7907d9742a63cfcec464be7ac54aa495/src/config.c#L3065
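If I read the config code right, this corresponds to the proto-max-bulk-len setting, which defaults to 512 MB and can be raised in redis.conf, though I preferred not to rely on that:

```
# redis.conf (sketch): the bulk-length cap behind "invalid bulk length".
# 512mb is the default; larger payloads are rejected by the protocol parser.
proto-max-bulk-len 512mb
```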
As for the solution: I was using Spring Cache + Spring Data Redis (which uses Lettuce by default) to put the cache data. Since the value exceeds Redis's size limit and Spring Cache doesn't support this case, I created a customized @XCachePut to split the value and put the cache blocks in parallel through redisTemplate.
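The write path behind @XCachePut boils down to something like the sketch below. This is illustrative only: the chunk-key scheme, chunk size, and class name are my own placeholders, assuming byte[] values and a `RedisTemplate<String, byte[]>` bean.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import org.springframework.data.redis.core.RedisTemplate;

public class ChunkedCacheWriter {

    // Stay well under Redis's 512 MB bulk limit per chunk.
    private static final int CHUNK_SIZE = 64 * 1024 * 1024; // 64 MB

    private final RedisTemplate<String, byte[]> redisTemplate;

    public ChunkedCacheWriter(RedisTemplate<String, byte[]> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    public void put(String key, byte[] value) {
        List<CompletableFuture<Void>> writes = new ArrayList<>();
        int chunkCount = (value.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
        for (int i = 0; i < chunkCount; i++) {
            int from = i * CHUNK_SIZE;
            int to = Math.min(from + CHUNK_SIZE, value.length);
            byte[] chunk = Arrays.copyOfRange(value, from, to);
            String chunkKey = key + ":chunk:" + i;
            // Put each chunk in parallel; the read side reassembles
            // them by fetching key:chunk:0..N-1 in order.
            writes.add(CompletableFuture.runAsync(
                    () -> redisTemplate.opsForValue().set(chunkKey, chunk)));
        }
        // Record the chunk count so readers know how many pieces to fetch.
        redisTemplate.opsForValue().set(key + ":chunks",
                Integer.toString(chunkCount).getBytes());
        CompletableFuture.allOf(writes.toArray(new CompletableFuture[0])).join();
    }
}
```

Splitting at 64 MB keeps each SET comfortably under the 512 MB cap, while the parallel puts keep the overall write time close to that of a single large SET.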