Batching with Lists.partition(list, num)
Scenario: we have a large set of IDs, need to batch-query the database by them, and write the results to a file. Oracle limits an IN (xx, xx, xx, ...) list to at most 1000 elements; exceeding 1000 raises an error (ORA-01795). So we split the IDs into batches and run the query once per batch.
import com.google.common.collect.Lists;
import java.util.List;

public static void main(String[] args) {
    List<String> list = Lists.newArrayList();
    for (int i = 0; i < 21; i++) {
        list.add("data" + i);
    }
    // 21 elements split into groups of 5 -> five sublists, the last holding the single leftover element
    List<List<String>> result = Lists.partition(list, 5);
    System.out.println(result.get(0));
    System.out.println(result.get(1));
    System.out.println(result.get(2));
    System.out.println(result.get(3));
    System.out.println(result.get(4));
}
Result:
[data0, data1, data2, data3, data4]
[data5, data6, data7, data8, data9]
[data10, data11, data12, data13, data14]
[data15, data16, data17, data18, data19]
[data20]
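Worth noting: Lists.partition does not copy the data. Per the Guava Javadoc, the returned sublists are views backed by the source list, so later changes to the source show through. A minimal demo:

import com.google.common.collect.Lists;
import java.util.List;

public class PartitionViewDemo {
    public static void main(String[] args) {
        List<String> source = Lists.newArrayList("a", "b", "c", "d");
        List<List<String>> parts = Lists.partition(source, 2);

        // The partitions are views: mutating the source list
        // is reflected in the already-created sublists.
        source.set(0, "changed");
        System.out.println(parts.get(0)); // prints [changed, b]
    }
}

If a partition must survive later mutation of the source, copy it first, e.g. new ArrayList<>(parts.get(0)).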
Code from a real project:
@SuppressWarnings("unchecked") private void write2TempFile(String sql, String tFile, DiyDataExportTask task) { DiyDataExportTask.Query batchQuery = task.getBatchQuery(); // 打开一个写文件器,构造函数中的第二个参数true表示以追加形式写文件 File tempFile = new File(tFile); try (OutputStreamWriter writerOut = new OutputStreamWriter(new FileOutputStream(tempFile, true), "GBK"); CSVWriter writer = new CSVWriter(writerOut, CSVWriter.DEFAULT_SEPARATOR, CSVWriter.DEFAULT_QUOTE_CHARACTER, CSVWriter.DEFAULT_ESCAPE_CHARACTER, CSVWriter.DEFAULT_LINE_END)) { final Map<String, Long> counts = Maps.newHashMap(); counts.put("index", 0L); if (batchQuery != null && task.getQueryInfo().size() == 0) {// 有查询条件先 List<String> batchValue = (List<String>) batchQuery.getValue(); List<List<String>> list = Lists.partition(batchValue, BATCH_NUM);// 500个数据分组执行 for (int i = 0; i < list.size(); i++) { StringBuilder batchSql = new StringBuilder(sql); appendFilterIn(batchSql, "T." + batchQuery.getName(), list.get(i)); queryData(batchSql.toString(), writer, counts); } } else { queryData(sql, writer, counts); } } catch (DataAccessException | IOException e) { logger.error(e.getMessage(), e); throw new ServiceException("please check template!",e); } }