How to get workspace name inside a python notebook in databricks

I am trying to get the workspace name inside a Python notebook. Is there any way to do this?

Ex: My workspace name is databricks-test. I want to capture this in a variable in a Python notebook.

Grudging answered 31/1, 2022 at 21:16 Comment(1)
Did you find the solution? Most answers output the URL, which does not contain the workspace name. – Suber

To get the workspace name (not the org ID, which another answer gives you), you can do it in a couple of ways:

spark.conf.get("spark.databricks.workspaceUrl")

which will give you the full URL; you can then split on the first dot, i.e.

spark.conf.get("spark.databricks.workspaceUrl").split('.')[0]
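Outside a live cluster, the splitting logic can be sketched on a sample value (the hostname below is made up, standing in for what the conf would return):

```python
# Hypothetical value of spark.conf.get("spark.databricks.workspaceUrl")
# on a workspace named databricks-test (hostname is illustrative).
workspace_url = "databricks-test.cloud.databricks.com"

# Everything before the first dot is the workspace name.
workspace_name = workspace_url.split(".")[0]
print(workspace_name)  # databricks-test
```

Note the caveat raised in the comments: on some clouds the hostname is an opaque ID (e.g. `adb-<digits>.<n>.azuredatabricks.net`), so the first segment is not necessarily the human-readable workspace name.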

You could also get it these two ways:

dbutils.notebook.entry_point.getDbutils().notebook().getContext() \
.browserHostName().toString()

or

import json

json.loads(dbutils.notebook.entry_point.getDbutils().notebook() \
  .getContext().toJson())['tags']['browserHostName']
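To see what that second form is doing, here is the same extraction run against a sample context string (the JSON below is a made-up fragment, not an exact dump of what `getContext().toJson()` returns):

```python
import json

# Illustrative fragment of the notebook-context JSON; the real payload
# contains many more tags than shown here.
context_json = '{"tags": {"browserHostName": "databricks-test.cloud.databricks.com"}}'

host = json.loads(context_json)["tags"]["browserHostName"]
print(host)  # databricks-test.cloud.databricks.com
```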

Top tip: if you're ever wondering which Spark confs exist, you can get most of them in a list like this:

sc.getConf().getAll()
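`getAll()` returns key/value pairs, so you can filter them with plain Python. A minimal sketch, using made-up sample entries in place of a real cluster's conf list:

```python
# sc.getConf().getAll() yields (key, value) tuples; these two entries
# are illustrative stand-ins.
all_confs = [
    ("spark.databricks.workspaceUrl", "databricks-test.cloud.databricks.com"),
    ("spark.app.name", "Databricks Shell"),
]

# Keep only the Databricks-specific confs.
databricks_confs = {k: v for k, v in all_confs if k.startswith("spark.databricks.")}
print(databricks_confs)
```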
Shoeshine answered 4/5, 2022 at 23:55 Comment(2)
If one doesn't have any tags indicating the environment, an alternative could be a config object, such as a dictionary in source code, that maps workspace IDs to names. This of course needs to be maintained manually, or automatically in another process. – Anelace
It gets the workspace URL, which may not contain the workspace name. – Platinum

Using the command below, we can get the current workspace (org) ID. Getting the workspace name itself is, I think, difficult.

spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")
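As one of the comments above suggests, the org ID can be turned into a name with a hand-maintained lookup table. A minimal sketch (the IDs and names below are made up):

```python
# Hand-maintained mapping from org ID to a human-readable workspace
# name; must be kept in sync with your real workspaces.
ORG_ID_TO_NAME = {
    "1234567890123456": "databricks-test",
    "6543210987654321": "databricks-prod",
}

# In a notebook this would come from the conf above.
org_id = "1234567890123456"
workspace_name = ORG_ID_TO_NAME.get(org_id, "unknown")
print(workspace_name)  # databricks-test
```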
Zumstein answered 31/1, 2022 at 21:50 Comment(1)
This should work. – Grudging

We add the workspace name and the env name, among other attributes, to each cluster as custom tags. You will find them under Compute > CLUSTER_NAME > Configuration > Tags.

In your notebook, use the command below to get an array of dicts of all custom and automatically added tags.

spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")

If you named the workspace tag WorkspaceName, you will get a dict of the form {"key":"WorkspaceName","value":"databricks-test"}.
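Those key/value pairs are easy to reshape into a plain dict for lookup. A minimal sketch, using a made-up sample of the tag payload (the real conf value contains many more automatically added tags):

```python
import json

# Illustrative stand-in for the clusterAllTags payload, which arrives
# as a JSON string of {"key": ..., "value": ...} objects.
all_tags_json = '[{"key":"WorkspaceName","value":"databricks-test"},{"key":"Vendor","value":"Databricks"}]'

# Flatten the key/value objects into an ordinary dict.
tags = {t["key"]: t["value"] for t in json.loads(all_tags_json)}
print(tags["WorkspaceName"])  # databricks-test
```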

Lon answered 4/12, 2023 at 12:24 Comment(0)
spark.conf.get("spark.databricks.clusterUsageTags.clusterName")

This command will return the cluster name :)

Large answered 8/7, 2022 at 14:19 Comment(0)
