Notes on a RedisTemplate.executePipelined usage issue

Requirement: write 20 million keys into Redis.
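A quick sanity check on the scale before the implementation (a minimal sketch; the figures 20,000,000 and 10,000 come from the requirement above and the `user.batch-count` default below, and the class name `BatchMath` is hypothetical):

```java
public class BatchMath {
    // Number of pipeline flushes needed to cover `total` keys at `batchSize` keys per flush
    // (ceiling division, since a partial final batch still costs one flush).
    public static long flushCount(long total, int batchSize) {
        return (total + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        // 20 million keys at 10,000 per batch -> 2,000 pipeline round trips per key type
        System.out.println(flushCount(20_000_000L, 10_000));
    }
}
```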

import java.io.IOException;
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicLong;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.data.redis.core.RedisCallback;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.serializer.RedisSerializer;
import org.springframework.stereotype.Component;

import lombok.Getter;
import lombok.Setter;
import lombok.extern.slf4j.Slf4j;

@Slf4j
@Component("job2")
public class ToRedis2 implements IJob {

    private AtomicLong count = new AtomicLong(0);
    private Long oldCount = 0L;
    private List<String> userIdList = new ArrayList<>();

    private ExecutorService es = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors() * 4);

    @Autowired
    private RedisTemplate<String, String> redisTemplate;

    @Setter
    @Getter
    @Value("${user.limit:100000}")
    private volatile int userLimit;
    @Setter
    @Getter
    @Value("${user.skip-count:0}")
    private volatile int skipCount;
    @Setter
    @Getter
    @Value("${user.batch-count:10000}")
    private volatile int batchCount;

    private AtomicBoolean stop = new AtomicBoolean(false);

    private void toRedis() throws IOException {
        String root = System.getProperty("user.dir") + "/2021";
        if (userIdList.isEmpty()) {
            // load the user-id list from file once
            String filePath = root + "/user.txt";
            readFile(filePath, 0, userIdList);
//            save();
        }

        // `type` holds the key-suffix categories iterated here; its definition is not shown in the original post
        for (String t : type) {
            // assemble the keys for this category and write them to Redis on a worker thread
            es.execute(()->{
                List<String> info = new LinkedList<>();
                exit:for (int i = 0; i < userIdList.size(); i++) {
                    if (i < skipCount) {
                        continue;
                    }
                    if (stop.get()) {
                        log.info("job2 stop");
                        break exit;
                    }
                    String key = "app:xxx:202105:userid_" + userIdList.get(i) + ":" + t;
                    info.add(key);
                    if (info.size() == getBatchCount() || i == userIdList.size() - 1) {
                        if (!stop.get()) {
                            executePipelined(info);
                            info.clear();
                        }
                    }
                }
            });
        }
    }

    private void executePipelined(List<String> info) {
        RedisSerializer<String> serializer = redisTemplate.getStringSerializer();
        // executePipelined buffers one reply per command and returns them all as a List,
        // so each batch must stay small to bound that result buffer
        redisTemplate.executePipelined((RedisCallback<String>) connection -> {
            info.forEach((key) -> {
                if(!stop.get()){
                    long c=count.incrementAndGet();
                    connection.set(serializer.serialize(key), serializer.serialize(String.valueOf(c)));
                }
            });
            return null;
        }, serializer);
    }

}

Process the data in batches, here 10,000 keys per batch. `executePipelined` collects every command's reply into a result list before returning, so pushing all 20 million writes through a single pipeline would hold the entire reply list in memory; flushing every 10,000 keys keeps that buffer bounded and avoids an out-of-memory error.
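The flush-every-N pattern used in `toRedis` above can be isolated from Spring and Redis for illustration (a minimal sketch; the names `PipelineBatcher` and `writeInBatches` are hypothetical, and the `Consumer` stands in for the `executePipelined` call):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class PipelineBatcher {
    // Flush the buffer every `batchCount` items, plus once more for the tail,
    // mirroring the `info.size() == getBatchCount() || i == userIdList.size() - 1`
    // check in the original loop. Returns the number of flushes performed.
    public static int writeInBatches(List<String> keys, int batchCount,
                                     Consumer<List<String>> pipelineFlush) {
        List<String> buffer = new ArrayList<>(batchCount);
        int flushes = 0;
        for (int i = 0; i < keys.size(); i++) {
            buffer.add(keys.get(i));
            if (buffer.size() == batchCount || i == keys.size() - 1) {
                pipelineFlush.accept(new ArrayList<>(buffer)); // hand off a defensive copy
                buffer.clear();
                flushes++;
            }
        }
        return flushes;
    }
}
```

A defensive copy of the buffer is handed to the consumer before clearing; the original code can reuse a single `info` list because `executePipelined` is synchronous and has finished with the keys by the time `info.clear()` runs.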

posted on 2023-04-17 17:34 by 何苦->