I am new to Databricks and Delta Live Tables, and I am having trouble creating a Delta Live Table in Python.
How do I create a Delta Live Table from JSON files in the FileStore?
It is a decorator, so I think you also need a function after it, meaning:

import dlt

@dlt.table(comment="your comment")
def get_bronze():
    df = spark.sql("""select * from myDb.MyRegisterdTable""")
    # If you want to check the logs:
    # print("bronze", df.take(5), "end")
    return df
In the silver function you can then read it as:

@dlt.table
def get_silver():
    df = dlt.read("get_bronze")
    # [...do stuff...]
    return df
Also, from your screenshots I am not sure whether you are running all of this as a pipeline or trying to run it directly as a notebook; the latter does not work.
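Since the original question is about JSON files in the FileStore rather than an already registered table, here is a minimal sketch of a bronze function that reads the JSON directly; the path and function name below are placeholder assumptions, not something from the question.

import dlt

# Minimal sketch: bronze table built straight from JSON files in the FileStore.
# The path below is an assumed placeholder; point it at your own files.
@dlt.table(comment="Bronze table loaded from JSON files in the FileStore")
def get_bronze_from_json():
    return spark.read.format("json").load("/FileStore/my_json_data/")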
Sandro's answer should solve your problem. For ingesting JSON files with Delta Live Tables, you can check this article for some use cases: https://medium.com/@chaobioz/create-delta-live-tables-dlt-dynamically-with-pyspark-e06a718199c8
Also, if this is for production, it is better to use Auto Loader as well.
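For reference, a minimal sketch of what Auto Loader ingestion of JSON could look like inside a DLT pipeline; the cloudFiles options are standard, but the source path and table name are assumptions.

import dlt

# Minimal sketch: incremental JSON ingestion with Auto Loader in a DLT pipeline.
# The source path is an assumed placeholder; replace it with your own location.
@dlt.table(comment="Raw JSON ingested incrementally with Auto Loader")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")    # Auto Loader source
        .option("cloudFiles.format", "json")     # input files are JSON
        .load("/FileStore/raw/events/")          # assumed source directory
    )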
You can check the process described in the documentation itself: https://docs.databricks.com/workflows/delta-live-tables/delta-live-tables-quickstart.html
Read the steps just above the requirements; I think that should help.
Could you try to install dlt before importing it?
%pip install dlt
The dlt package available via pip install has nothing to do with Databricks Delta Live Tables.