May 22, 2018

Note to self: PySpark failing with "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'"

If you run pyspark and see this error (it happens in spark-shell as well):

Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'

The solution is easy, yet ridiculous:

1. Create the folder /tmp/hive
2. Give it permissions: sudo chmod -R 777 /tmp/hive
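The two steps above as shell commands (sudo is only needed if /tmp/hive already exists and is owned by another user, e.g. root):

```shell
# Spark's Hive support needs a world-writable scratch directory
# at /tmp/hive; without it, session instantiation fails.
mkdir -p /tmp/hive
chmod -R 777 /tmp/hive
```

After this, pyspark (or spark-shell) should start without the HiveSessionState error.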

Found here
