Hadoop Reading Notes (3): Operating HDFS with the Java API

Hadoop Reading Notes (1), an introduction to Hadoop: http://blog.csdn.net/caicongyang/article/details/39898629
Hadoop Reading Notes (2), HDFS shell operations: http://blog.csdn.net/caicongyang/article/details/41253927

Operating HDFS via java.net.URL
OperateByURL.java
package hdfs;

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class OperateByURL {
    private static final String PATH = "hdfs://192.168.80.100:9000/test.txt";

    public static void main(String[] args) throws Exception {
        // Read the file and print it to stdout.
        // Note: setURLStreamHandlerFactory may only be called once per JVM.
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
        URL url = new URL(PATH);
        InputStream in = url.openStream();
        IOUtils.copyBytes(in, System.out, 1024, true);
    }
}
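The IOUtils.copyBytes(in, out, buffSize, close) call above just pumps bytes from the input stream to the output stream in fixed-size chunks and, if the last argument is true, closes both streams when done. A minimal plain-JDK sketch of that behavior (the class name CopyBytesSketch is mine, not part of Hadoop) looks like this:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyBytesSketch {
    // Minimal equivalent of org.apache.hadoop.io.IOUtils.copyBytes:
    // copy in fixed-size chunks, then optionally close both streams.
    static void copyBytes(InputStream in, OutputStream out, int buffSize, boolean close)
            throws IOException {
        byte[] buf = new byte[buffSize];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        if (close) {
            in.close();
            out.close();
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayInputStream in = new ByteArrayInputStream("hello hdfs".getBytes());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copyBytes(in, out, 4, true);
        System.out.println(out.toString());
    }
}
```

The buffer size (1024 in the listings here) only affects how many bytes move per read; the copied content is identical either way.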
Operating HDFS with the Hadoop Java API
OperateByHadoopAPI.java
package hdfs;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class OperateByHadoopAPI {
    // HDFS root URI
    private static final String PATH = "hdfs://192.168.80.100:9000/";
    private static final String DIR = "/d1";
    private static final String FILE = "/d1/default.cfg";

    public static void main(String[] args) throws Exception {
        FileSystem fileSystem = FileSystem.get(new URI(PATH), new Configuration());

        // Create a directory
        fileSystem.mkdirs(new Path(DIR));

        // Upload a file
        // Option 1:
        // fileSystem.copyFromLocalFile(new Path("F:/hadoopbaiduyundownload/liclog.txt"), new Path(DIR));
        // Option 2:
        FSDataOutputStream out = fileSystem.create(new Path(FILE));
        FileInputStream in = new FileInputStream(new File("F:/hadoopbaiduyundownload/default.cfg"));
        IOUtils.copyBytes(in, out, 1024, true);

        // Download a file
        // Option 1 -- emits a WARN that is still unresolved:
        //   14/11/19 21:39:49 WARN util.NativeCodeLoader: Unable to load native-hadoop
        //   library for your platform... using builtin-java classes where applicable
        File file = new File("F:/hadoopbaiduyundownload/test.txt");
        File file2 = new File("F:/hadoopbaiduyundownload/test2.txt");
        fileSystem.copyToLocalFile(new Path("hdfs://192.168.80.100:9000/test.txt"),
                new Path(file.getAbsolutePath()));
        // Option 2:
        FSDataInputStream inputStream = fileSystem.open(new Path("hdfs://192.168.80.100:9000/test.txt"));
        FileOutputStream outputStream = new FileOutputStream(file2.getAbsolutePath());
        IOUtils.copyBytes(inputStream, outputStream, 1024, true);

        // List the root directory.
        // Parentheses around the ternary are required: without them, string
        // concatenation binds into the else branch and directories print only the label.
        FileStatus[] listStatus = fileSystem.listStatus(new Path("/"));
        for (FileStatus fileStatus : listStatus) {
            System.out.println((fileStatus.isDir() ? "directory" : "file") + " "
                    + fileStatus.getOwner() + " "
                    + fileStatus.getReplication() + " "
                    + fileStatus.getPath());
        }

        // Delete
        /**
         * @param path
         * @param recursive if true, delete recursively when path is a directory
         */
        fileSystem.delete(new Path(DIR), true);
    }
}
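The listStatus call in the listing returns one status entry per direct child of the given directory, and the loop labels each entry as a directory or a file. The same traversal pattern against a local directory, using only the JDK (no Hadoop dependency; class name ListDirSketch is mine), can be sketched as:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ListDirSketch {
    // Mirrors the FileSystem.listStatus loop: one entry per direct child,
    // each labeled directory/file. Local filesystem only, for illustration.
    static void listDir(Path dir) throws IOException {
        try (DirectoryStream<Path> children = Files.newDirectoryStream(dir)) {
            for (Path child : children) {
                System.out.println((Files.isDirectory(child) ? "directory" : "file")
                        + " " + child.getFileName());
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small throwaway directory tree, then list it
        Path tmp = Files.createTempDirectory("listdemo");
        Files.createFile(tmp.resolve("a.txt"));
        Files.createDirectory(tmp.resolve("sub"));
        listDir(tmp);
    }
}
```

As in the HDFS version, only the immediate children are returned; descending into subdirectories would require recursing on each child that is a directory.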
Everyone is welcome to discuss and learn together!
If you found this useful, save it!
Recording and sharing help us grow together. Feel free to browse my other posts; my blog is at http://blog.csdn.net/caicongyang