I am trying to get the workspace name inside a Python notebook. Is there any way to do this?
Ex:
My workspace name is databricks-test.
I want to capture this in a variable in the Python notebook.
To get the workspace name (not the Org ID, which the other answer gives you), you can do it one of two main ways:
spark.conf.get("spark.databricks.workspaceUrl")
which will give you the full URL; you can then split on the first dot, i.e.
spark.conf.get("spark.databricks.workspaceUrl").split('.')[0]
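Outside a Databricks cluster `spark.conf` isn't available, so as a minimal sketch, assuming the workspace URL follows the usual `<workspace-name>.cloud.databricks.com` pattern (the sample value below is made up), the split works like this:

```python
# Simulated value of spark.conf.get("spark.databricks.workspaceUrl");
# on a real cluster you would read it from spark.conf instead.
workspace_url = "databricks-test.cloud.databricks.com"

# The workspace name is the first dot-separated component of the URL.
workspace_name = workspace_url.split('.')[0]
print(workspace_name)  # databricks-test
```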
You could also get it in these two ways:
dbutils.notebook.entry_point.getDbutils().notebook().getContext() \
.browserHostName().toString()
or
import json
json.loads(dbutils.notebook.entry_point.getDbutils().notebook() \
.getContext().toJson())['tags']['browserHostName']
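The notebook context object only exists inside a Databricks notebook, so here is a sketch that parses a hand-made context JSON of the same shape as the code above (the top-level `tags` dict matches what `getContext().toJson()` exposes; the field value is invented):

```python
import json

# Invented stand-in for the string returned by
# dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
context_json = '{"tags": {"browserHostName": "databricks-test.cloud.databricks.com"}}'

# Parse the JSON and pull the host name out of the tags dict.
host = json.loads(context_json)['tags']['browserHostName']
print(host)  # databricks-test.cloud.databricks.com
```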
Top tip: if you're ever wondering what Spark confs exist, you can get most of them in a list like this:
sc.getConf().getAll()
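`getAll()` returns a list of `(key, value)` tuples, so it's easy to filter for the Databricks-specific keys. A sketch with fabricated conf entries (the real list on a cluster is much longer):

```python
# Fabricated stand-in for sc.getConf().getAll(), which yields (key, value) tuples.
all_confs = [
    ("spark.databricks.workspaceUrl", "databricks-test.cloud.databricks.com"),
    ("spark.app.name", "my-notebook"),
    ("spark.databricks.clusterUsageTags.clusterName", "my-cluster"),
]

# Keep only the Databricks-related keys.
databricks_confs = [kv for kv in all_confs if kv[0].startswith("spark.databricks.")]
print(databricks_confs)
```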
Using the command below, we can get the workspace's organization ID, but I think the workspace name is difficult to get this way.
spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")
We add the workspace name and the env name, among other attributes, to each cluster as custom tags. You will find them under Compute > CLUSTER_NAME > Configuration > Tags.
In your notebook, use the command below to get an array of dicts containing all custom and automatically added tags.
spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
If you named the workspace tag WorkspaceName, you will have a dict of the form {"key":"WorkspaceName","value":"databricks-test"}.
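To pull that value out, you can parse the tag list and index it by key. A sketch assuming the conf value is a JSON-encoded array of `{"key": ..., "value": ...}` dicts (the sample tags below are invented; on a cluster you would pass the `clusterAllTags` value in instead):

```python
import json

# Invented stand-in for
# spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
all_tags_json = '[{"key":"WorkspaceName","value":"databricks-test"},{"key":"env","value":"dev"}]'

# Turn the list of {"key": ..., "value": ...} dicts into a plain key -> value dict.
tags = {t["key"]: t["value"] for t in json.loads(all_tags_json)}
workspace_name = tags.get("WorkspaceName")
print(workspace_name)  # databricks-test
```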
spark.conf.get("spark.databricks.clusterUsageTags.clusterName")
This command will return the cluster name (not the workspace name) :)