scala - Spark SQL build for Hive?


I have downloaded the Spark 1.3.1 release, package type "pre-built for Hadoop 2.6 and later".

Now I want to run the Scala code below using the Spark shell. I followed these steps:

1. bin/spark-shell
2. val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
3. sqlContext.sql("create table if not exists src (key int, value string)")
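(For reference, the result can also be sanity-checked from inside the same shell session; this is just a quick check, not part of the original steps. show() prints the first rows of a DataFrame:)

sqlContext.sql("show tables").show()
sqlContext.sql("select * from src").show()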

Now the problem: if I verify in the Hue browser with

select * from src;

then I get a "table not found" exception.

That means the table was not created. How do I configure Hive with the Spark shell so that this works? I want to use Spark SQL because I need to read and write data in Hive.

I have heard that I need to copy the hive-site.xml file somewhere into the Spark directory.

Can you please explain the steps for configuring Spark SQL with Hive?

Thanks, Tushar

Indeed, the hive-site.xml direction is correct. Take a look at https://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables .
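In short, per that guide, Spark picks up Hive configuration from a hive-site.xml placed in its conf/ directory. Without it, the HiveContext creates a local Derby metastore in the current working directory, which is why Hue, pointed at the real metastore, cannot see your table. A minimal sketch, assuming a typical Hive installation (the /etc/hive/conf path is an example; use wherever your hive-site.xml actually lives):

cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/

Then restart the shell and repeat the steps:

bin/spark-shell
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("create table if not exists src (key int, value string)")
sqlContext.sql("select * from src").show()  // should now be visible from Hue as well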

It also sounds like you want to create a Hive table from Spark; see "Saving to Persistent Tables" in the same document above.
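For example, a sketch of that approach with the Spark 1.3 API (people.json is a sample file that ships with the Spark distribution; calling saveAsTable on a DataFrame created through a HiveContext persists it as a table in the Hive metastore):

val df = sqlContext.jsonFile("examples/src/main/resources/people.json")
df.saveAsTable("people")                       // managed table in the Hive metastore
sqlContext.sql("select * from people").show()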
