I am facing issues running Spark SQL (query size around 85KB) on a dataset. It seems we cannot run a SQL statement larger than 64KB in Spark 1.6.1.
I see this issue has been fixed in Spark 2.0: https://issues.apache.org/jira/browse/SPARK-15285
My cluster is still on 1.6.1. Is there a temporary workaround to get an 85KB SQL statement to run on 1.6.1? Thanks for your help.
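One idea I am considering, assuming the statement can be split at logical boundaries, is to break the 85KB query into several smaller statements and chain them through temporary tables, so no single SQL string carries the full size. A rough sketch of what I mean is below; the table and column names are placeholders, not my actual query. Would something along these lines be a viable workaround on 1.6.1?

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Sketch only: split one very large SQL statement into smaller stages
// by registering intermediate results as temporary tables (Spark 1.6 API).
// All table/column names below are illustrative placeholders.
object SplitLargeQuery {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("split-large-query"))
    val sqlContext = new SQLContext(sc)

    // Stage 1: run the first chunk of the original query and expose its
    // result as a temp table so later chunks can reference it by name.
    val stage1 = sqlContext.sql(
      """SELECT id, col_a, col_b
        |FROM source_table
        |WHERE col_a IS NOT NULL""".stripMargin)
    stage1.registerTempTable("stage1_tmp")

    // Stage 2: the next chunk selects from the temp table instead of
    // repeating the earlier expressions, keeping each statement small.
    val stage2 = sqlContext.sql(
      """SELECT id, col_a + col_b AS total
        |FROM stage1_tmp""".stripMargin)

    stage2.show()
    sc.stop()
  }
}
```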