Notes on a RedisTemplate.executePipelined pitfall

Requirement: write 20 million keys into Redis.

@Slf4j
@Component("job2")
public class ToRedis2 implements IJob {

    private AtomicLong count = new AtomicLong(0);
    private Long oldCount=0L;
    private List<String> userIdList = new ArrayList<>();

    private ExecutorService es = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() * 4);

    @Autowired
    private RedisTemplate<String, String> redisTemplate;

    @Setter
    @Getter
    @Value("${user.limit:100000}")
    private volatile int userLimit;
    @Setter
    @Getter
    @Value("${user.skip-count:0}")
    private volatile int skipCount;
    @Setter
    @Getter
    @Value("${user.batch-count:10000}")
    private volatile int batchCount;

    private AtomicBoolean stop=new AtomicBoolean(false);

    private void toRedis() throws IOException {
        String root = System.getProperty("user.dir") + "/2021";
        if (userIdList.isEmpty()) {
            // load the user IDs from a file; readFile is a helper defined elsewhere in the project
            String filePath = root + "/user.txt";
            readFile(filePath, 0, userIdList);
//            save();
        }

        // 'type' is a collection of key suffixes assumed to be defined elsewhere in the class
        for (String t : type) {
            // assemble the keys and write them to Redis, one worker thread per type
            es.execute(()->{
                List<String> info = new LinkedList<>();
                exit:for (int i = 0; i < userIdList.size(); i++) {
                    if (i < skipCount) {
                        continue;
                    }
                    if (stop.get()) {
                        log.info("job2 stop");
                        break exit;
                    }
                    String key = "app:xxx:202105:userid_" + userIdList.get(i) + ":" + t;
                    info.add(key);
                    if (info.size() == getBatchCount() || i == userIdList.size() - 1) {
                        if (!stop.get()) {
                            executePipelined(info);
                            info.clear();
                        }
                    }
                }
            });
        }
    }

    private void executePipelined(List<String> info) {
        RedisSerializer<String> serializer = redisTemplate.getStringSerializer();
        redisTemplate.executePipelined((RedisCallback<String>) connection -> {
            info.forEach((key) -> {
                if(!stop.get()){
                    long c=count.incrementAndGet();
                    connection.set(serializer.serialize(key), serializer.serialize(String.valueOf(c)));
                }
            });
            return null;
        }, serializer);
    }

}

The data is written in batches of 10,000 keys per pipeline. Keeping each pipeline small avoids an out-of-memory error: executePipelined buffers every reply from Redis until the pipeline is closed, so an unbounded batch would accumulate all of the responses in memory. Note also that the RedisCallback passed to executePipelined must return null, because its return value is overwritten by the pipelined results.
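The batching logic in toRedis() (accumulate keys, flush whenever the buffer reaches batchCount or the list is exhausted) can be sketched independently of Redis. A minimal stdlib-only illustration, with hypothetical class and method names:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {

    // Split a list into consecutive batches of at most batchSize elements.
    // subList creates views, so no element copying happens here.
    static <T> List<List<T>> partition(List<T> source, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < source.size(); i += batchSize) {
            batches.add(source.subList(i, Math.min(i + batchSize, source.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> ids = new ArrayList<>();
        for (int i = 0; i < 25; i++) {
            ids.add(i);
        }
        // 25 ids with batchSize 10 -> 3 batches, the last one holding 5 ids
        List<List<Integer>> batches = partition(ids, 10);
        System.out.println(batches.size());
        System.out.println(batches.get(2).size());
    }
}
```

Each batch would then be handed to one executePipelined call, exactly as the `info.size() == getBatchCount()` check does in the listing above.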

posted by 何苦 · 942 reads · 0 comments
