SparkSQL External Datasource Quick Start: CSV
Download the source and build:
git clone https://github.com/databricks/spark-csv.git
sbt/sbt assembly   # produces the spark-csv-assembly jar referenced below
Maven GAV:
groupId: com.databricks.spark
artifactId: spark-csv_2.10
version: 0.1
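For sbt-based projects, the same GAV coordinates above can be expressed as a dependency line; a minimal sketch (assuming the artifact is resolvable from your configured repositories):

```scala
// build.sbt fragment (sketch): the spark-csv GAV from above as an sbt dependency.
libraryDependencies += "com.databricks.spark" % "spark-csv_2.10" % "0.1"
```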
$SPARK_HOME/conf/spark-env.sh
export SPARK_CLASSPATH=/home/spark/software/source/spark_package/spark-csv/target/scala-2.10/spark-csv-assembly-0.1.jar:$SPARK_CLASSPATH
Download the test data:
wget https://github.com/databricks/spark-csv/raw/master/src/test/resources/cars.csv
Scala API:
import org.apache.spark.sql.SQLContext
val sqlContext = new SQLContext(sc)
import com.databricks.spark.csv._
val cars = sqlContext.csvFile("file:///home/spark/software/data/cars.csv")
cars.collect
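`csvFile` treats the first line of the file as a header and turns each subsequent line into a row keyed by those column names. As a plain-Scala illustration of that idea (no Spark required; the sample rows below are made up for the sketch, not taken from the real cars.csv):

```scala
// Minimal sketch of header-aware CSV parsing, the behavior spark-csv's
// csvFile provides: first line -> column names, remaining lines -> rows.
object CsvSketch {
  def parse(lines: Seq[String]): Seq[Map[String, String]] = {
    val header = lines.head.split(",").map(_.trim)
    lines.tail.map { line =>
      header.zip(line.split(",").map(_.trim)).toMap
    }
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical sample data in the spirit of cars.csv.
    val sample = Seq(
      "year,make,model",
      "2012,Tesla,S",
      "1997,Ford,E350"
    )
    val rows = parse(sample)
    println(rows.head("make")) // prints: Tesla
  }
}
```

Note this sketch does naive comma splitting; a real CSV parser (and spark-csv itself) also has to handle quoting and embedded delimiters.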
SQL:
CREATE TEMPORARY TABLE cars
USING com.databricks.spark.csv
OPTIONS (path "file:///home/spark/software/data/cars.csv", header "true");

SELECT * FROM cars;