JDBC -- fetching large result sets

Recently I needed to query data over JDBC and save it as a CSV file. You could of course just run the query in PL/SQL and copy the results into Excel, but I wanted to do it in code.
@Test
public void test() throws IOException {
    // Read all the application numbers from a text file into a List.
    BufferedReader reader = new BufferedReader(new InputStreamReader(
            new FileInputStream("C:\\Users\\yhzh\\Desktop\\zh_20160913")));
    String tmp = null;
    List<String> nos = new ArrayList<String>();
    while ((tmp = reader.readLine()) != null)
        nos.add(tmp);
    reader.close();

    Connection con = null;           // database connection
    PreparedStatement pre = null;    // prepared statement; generally preferred over Statement
    PreparedStatement pre2 = null;
    ResultSet result = null;         // result set
    BufferedWriter csvWriter = null;
    try {
        String tag = (new SimpleDateFormat("hhmmss")).format(new Date());
        csvWriter = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream(new File("C:\\Users\\yhzh\\Desktop\\贷后还款计划_" + tag + ".csv")), "utf-8"));

        Class.forName("oracle.jdbc.driver.OracleDriver");
        //:6006/hotfix
        String url = "jdbc:oracle:thin:@//*.*.*.*:16030/zcgl", user = "*", password = "*";
        con = DriverManager.getConnection(url, user, password);   // open the connection
        String sql = "select max(lr_id) from t_loan_request where LR_REQUESTSTATUS ='2' and lr_applyid=?";
        /*String sql =
            "select req.LR_APPLYID,CURR_PERIODS,REPAY_DAY,\n" +
            "(NEEDREPAY_PRINCIPAL+NEEDREPAY_INTEREST+NEEDREPAY_ADMIFEE+nvl(RISK_AMOUNT,0)+nvl(SERVICEFEE,0)+nvl(PARKINGFEE,0))NEEDREPAY_TOTAL,\n" +
            "PERIOD_REPAY_AMOUNT \n" +
            "from(\n" +
            "  select lr_id,LR_APPLYID from t_loan_request t \n" +
            "  where t.LR_REQUESTSTATUS ='2' and t.creater='PostLoanOuterAction'\n" +
            "  order by lr_id desc)req\n" +
            "left join t_repay_plan rp\n" +
            "on req.lr_id=rp.lr_id\n" +
            "order by req.lr_id,CURR_PERIODS ";   // precompiled statement, "?" marks the bind parameter */
        pre = con.prepareStatement(sql);
        pre2 = con.prepareStatement("select CURR_PERIODS,REPAY_DAY,\n" +
                "(NEEDREPAY_PRINCIPAL+NEEDREPAY_INTEREST+NEEDREPAY_ADMIFEE+nvl(RISK_AMOUNT,0)+nvl(SERVICEFEE,0)+nvl(PARKINGFEE,0))NEEDREPAY_TOTAL,\n" +
                "PERIOD_REPAY_AMOUNT \n" +
                "from t_repay_plan\n" +
                "where lr_id=? " +
                "order by CURR_PERIODS");
        for (String no : nos) {
            // First look up the latest lr_id for this application number...
            pre.setString(1, no);
            result = pre.executeQuery();
            if (result.next()) {
                long lrId = result.getLong(1);
                // ...then fetch its repayment-plan rows and write them as CSV.
                pre2.setLong(1, lrId);
                result = pre2.executeQuery();
                while (result.next()) {
                    csvWriter.write(no);
                    csvWriter.write(",");
                    csvWriter.write(result.getString(1));
                    csvWriter.write(",");
                    csvWriter.write(result.getString(2));
                    csvWriter.write(",");
                    csvWriter.write(result.getString(3));
                    csvWriter.write(",");
                    csvWriter.write(result.getString(4));
                    csvWriter.newLine();
                }
            }
        }

        csvWriter.flush();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (csvWriter != null)
                csvWriter.close();
            if (result != null)
                result.close();
            if (pre != null)
                pre.close();
            if (pre2 != null)
                pre2.close();
            if (con != null)
                con.close();
            System.out.println("Database connection closed.");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The program first reads all the application numbers into a List, then iterates over it: for each number it looks up the lr_id and then queries the detail rows. This produces roughly 80,000+ rows in the CSV file.
Querying one record at a time is obviously slow, and skipping the number list entirely and pulling everything in a single query returns 90,000+ rows, which is still very slow!
Next I wanted to speed things up, at least noticeably. Writing the file was essentially ruled out as the bottleneck, so the remaining question was: does the ResultSet actually hold all the results?
From material found online and some actual debugging, the answer: the ResultSet fetches only 10 rows per round trip by default (the Oracle JDBC driver's default fetch size). No wonder it was slow; reading everything into memory in one go and then writing the file should be much faster.
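This is easy to confirm directly. Below is a minimal sketch, assuming the same Oracle connection as above (the URL and credentials are placeholders); it just prints the fetch size the driver reports. With the Oracle thin driver it shows 10 for both the statement and the result set unless the fetch size has been overridden.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FetchSizeCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("oracle.jdbc.driver.OracleDriver");
        // Placeholder connection details -- same form as the export code above.
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//*.*.*.*:16030/zcgl", "*", "*");
        PreparedStatement ps = con.prepareStatement("select 1 from dual");
        // The Oracle thin driver reports 10 here unless the fetch size is overridden.
        System.out.println("statement fetch size: " + ps.getFetchSize());
        ResultSet rs = ps.executeQuery();
        // The ResultSet picks up the statement's fetch size at execution time.
        System.out.println("result set fetch size: " + rs.getFetchSize());
        rs.close();
        ps.close();
        con.close();
    }
}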
So how do we fetch all the data at once?
Change the SQL to read everything in one query:
String sql = "select req.LR_APPLYID,CURR_PERIODS,REPAY_DAY,\n" +
    "(NEEDREPAY_PRINCIPAL+NEEDREPAY_INTEREST+NEEDREPAY_ADMIFEE+nvl(RISK_AMOUNT,0)+nvl(SERVICEFEE,0)+nvl(PARKINGFEE,0))NEEDREPAY_TOTAL,\n" +
    "PERIOD_REPAY_AMOUNT \n" +
    "from(\n" +
    "  select lr_id,LR_APPLYID from t_loan_request t \n" +
    "  where t.LR_REQUESTSTATUS ='2' and t.creater='PostLoanOuterAction'\n" +
    "  order by lr_id desc)req\n" +
    "left join t_repay_plan rp\n" +
    "on req.lr_id=rp.lr_id\n" +
    "order by req.lr_id,CURR_PERIODS";
Configure the PreparedStatement:
pre = con.prepareStatement(sql);
pre.setFetchSize(100000);
result = pre.executeQuery();
//result.setFetchSize(100000);
The key is PreparedStatement's setFetchSize method.
Later I found that ResultSet also has a setFetchSize method, and that works too. The difference is that by the time you can call it, the ResultSet has already buffered the first 10 rows; only after those 10 iterations, when result.next() has to go back to the server for more rows, does the fetch size of 100000 take effect.
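A minimal sketch of that variant, continuing from the sql, con and result variables above (an illustration of the behavior just described, not code from the original program):

pre = con.prepareStatement(sql);
result = pre.executeQuery();       // the first batch of 10 rows is already buffered at this point
result.setFetchSize(100000);       // only affects fetches that happen after this call
while (result.next()) {
    // rows 1-10 come from the initial buffer; asking for row 11 triggers the first
    // 100000-row round trip, which is when the larger fetch size actually kicks in
}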
With this setting the export runs blazingly fast!
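Putting the pieces together, here is a minimal self-contained sketch of the single-query export with the larger fetch size. The connection details and the output file name repay_plan.csv are placeholders, and the SQL and column order follow the fragments above.

import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.OutputStreamWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class RepayPlanExport {
    public static void main(String[] args) throws Exception {
        Class.forName("oracle.jdbc.driver.OracleDriver");

        String sql = "select req.LR_APPLYID,CURR_PERIODS,REPAY_DAY,"
                + "(NEEDREPAY_PRINCIPAL+NEEDREPAY_INTEREST+NEEDREPAY_ADMIFEE+nvl(RISK_AMOUNT,0)+nvl(SERVICEFEE,0)+nvl(PARKINGFEE,0))NEEDREPAY_TOTAL,"
                + "PERIOD_REPAY_AMOUNT "
                + "from (select lr_id,LR_APPLYID from t_loan_request t "
                + "      where t.LR_REQUESTSTATUS ='2' and t.creater='PostLoanOuterAction' "
                + "      order by lr_id desc) req "
                + "left join t_repay_plan rp on req.lr_id = rp.lr_id "
                + "order by req.lr_id, CURR_PERIODS";

        // Placeholder connection details, same form as the original code.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//*.*.*.*:16030/zcgl", "*", "*");
             PreparedStatement pre = con.prepareStatement(sql);
             BufferedWriter csvWriter = new BufferedWriter(new OutputStreamWriter(
                     new FileOutputStream("repay_plan.csv"), "utf-8"))) {

            // Fetch rows in one large batch instead of the default 10-row round trips.
            pre.setFetchSize(100000);

            try (ResultSet result = pre.executeQuery()) {
                while (result.next()) {
                    // Five columns: LR_APPLYID, CURR_PERIODS, REPAY_DAY, NEEDREPAY_TOTAL, PERIOD_REPAY_AMOUNT
                    for (int col = 1; col <= 5; col++) {
                        String value = result.getString(col);
                        csvWriter.write(value == null ? "" : value);  // the left join can produce nulls
                        if (col < 5) {
                            csvWriter.write(",");
                        }
                    }
                    csvWriter.newLine();
                }
            }
        }
    }
}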