Note: this question was originally posted on Stack Overflow.
I am experimenting with the Databricks cloud-hosted Spark service (I am using Community Edition, by the way). I created some data and would like to download it to my local machine rather than lose it. This answer: https://stackoverflow.com/questions/49019706/databricks-download-a-dbfs-filestore-file-to-my-local-machine/49021261#49021261 suggested going to https://community.cloud.databricks.com/files while logged in to Databricks.
So I tried that URL, and all I got back was a plain-text page containing "1", which was not very useful. I then looked up the instructions for the Databricks CLI, and according to this page: https://docs.databricks.com/api/latest/authentication.html#token-management I need a personal access token, which is generated by clicking the 'Access Tokens' tab of the Account Settings screen. However, my Account Settings screen has no 'Access Tokens' tab.
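For reference, my understanding of that suggestion is that anything written under dbfs:/FileStore/ should be served from the /files/ path of the workspace URL, with the numeric workspace id (the "?o=" value that appears in the Community Edition URL after login) appended as a query parameter. A rough sketch of the mapping I expected, with a placeholder workspace id:

```python
# Sketch of the /FileStore -> /files URL mapping I expected to work.
# The workspace id below is a placeholder, not my real one.
dbfs_path = "dbfs:/FileStore/chris.txt"
workspace_id = "1234567890123456"  # the "?o=" value from the workspace URL

relative_path = dbfs_path.replace("dbfs:/FileStore/", "")
download_url = (
    "https://community.cloud.databricks.com/files/"
    f"{relative_path}?o={workspace_id}"
)
print(download_url)  # paste into a logged-in browser tab to download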
So I'm wondering: is there another way to download files from DBFS to my local machine on Community Edition, or is the 'Access Tokens' tab simply not available on Community Edition accounts?
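For what it's worth, if I could generate a token, my reading of the REST API documentation is that the download could be scripted against the DBFS read endpoint. This is only a sketch of that documented route, not something I can currently run; the host, token, and path are placeholders:

```python
# Sketch of a token-based download via the DBFS REST API read endpoint.
# Assumes a personal access token exists (which I cannot generate yet).
import base64
import requests

HOST = "https://community.cloud.databricks.com"  # workspace URL (placeholder)
TOKEN = "dapiXXXXXXXXXXXXXXXX"                   # personal access token (placeholder)
SRC = "/FileStore/chris.txt"                     # DBFS path to download
CHUNK = 1024 * 1024                              # the read endpoint returns at most 1 MB per call

headers = {"Authorization": f"Bearer {TOKEN}"}
offset = 0
with open("chris.txt", "wb") as out:
    while True:
        resp = requests.get(
            f"{HOST}/api/2.0/dbfs/read",
            headers=headers,
            params={"path": SRC, "offset": offset, "length": CHUNK},
        )
        resp.raise_for_status()
        body = resp.json()
        if body["bytes_read"] == 0:
            break
        out.write(base64.b64decode(body["data"]))  # payload is base64-encoded
        offset += body["bytes_read"]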
Just for completeness' sake, I will note that when I ran an 'ls' on /FileStore, the folder was non-empty:
%fs ls /FileStore/
dbfs:/FileStore/chris.txt    chris.txt    20044
dbfs:/FileStore/tables/      etc etc..
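The same listing from inside the notebook, using dbutils directly (dbutils is provided by the Databricks notebook environment):

```python
# Programmatic equivalent of the %fs ls magic, run in a notebook cell.
for f in dbutils.fs.ls("/FileStore/"):
    print(f.path, f.size)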