We updated one of our jars to a new version using the Databricks UI.
I detached and deleted the old jar, created a new cluster, and uploaded and attached the new jar.
However, I am getting a version conflict because the old version of the jar is still on the classpath. Any idea how to solve this?
Still experiencing the same issue. @Kyle, is this supposed to have been fixed in previous releases?
Answer by Kyle · Jul 05, 2015 at 11:51 PM
Try deleting all old jars in "dbfs:/FileStore/jars/".
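For example, a quick way to inspect what is cached before deleting anything (a sketch, run from a Scala notebook):
// List the contents of the jar cache to spot stale copies
display(dbutils.fs.ls("dbfs:/FileStore/jars"))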
Answer by sujitpal · Jul 15, 2015 at 04:09 AM
I had the same issue. Based on what I learned from my colleague, this is an open issue with Databricks. Here is what I did (partly based on his advice) to work around it. It's not pretty, but it seems to work:
1) delete the old JAR file.
2) remove the cached JARs per Kyle's advice above; here is a snippet from a Scala notebook to do this:
// Collect the paths of cached jars whose names match the old artifact
// ("your pattern" is a placeholder for the jar name to match)
val jarfiles = dbutils.fs.ls("dbfs:/FileStore/jars")
  .map(_.path)
  .filter(_.contains("your pattern"))
// Delete each matching cached copy
jarfiles.foreach(dbutils.fs.rm(_))
3) detach notebook from cluster.
4) restart cluster.
5) add the jar back and attach the jar to the cluster.
6) re-attach notebook to cluster.
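After step 6, a quick sanity check (a sketch; the artifact name below is a placeholder) is to confirm that no stale copy is still on the driver's base classpath:
// Print any driver classpath entries matching the old artifact name (placeholder)
System.getProperty("java.class.path")
  .split(java.io.File.pathSeparator)
  .filter(_.contains("your-artifact-name"))
  .foreach(println)
Note that libraries attached through the UI may be loaded via a separate classloader, so this only covers the base classpath.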
It's a painful process, and it leads me to think that depending on a JAR file under active development is discouraged in this workflow. I should probably have set up my tests on a local Spark cluster instead of debugging here; that would have gone much faster.