Spark SQL Error while creating a Delta Table with NULL as column in Databricks
I am trying to create a Delta table from a SELECT statement that includes a NULL value as one of the columns. The table is created without any error, but an error is thrown when I try to run a SELECT against it.

%sql
create or replace table test1 as (
  select
    col1,
    null as a
  from
    table1
);

No error is raised when creating the table.

%sql
select * from test1;

Error: IllegalStateException: Couldn't find a#31234 in [col1#31233]

Flyover answered 24/11, 2022 at 12:32 Comment(0)

This happens because selecting a bare null literal creates a column of type VOID (Spark's NULL type); check the table's schema and you will see it.
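You can verify this by describing the table. A minimal check, assuming the table was created exactly as in the question:

describe table test1;
-- the row for column `a` shows data_type `void`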

You need to cast it to a usable type, e.g. STRING, when creating the table:

create or replace table test1 as
  select
    col1,
    cast(null as string) as a
  from
    table1;

Then the query will work.
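As a quick sanity check (assuming the table was recreated with the cast above), the schema and the original query should now behave as expected:

describe table test1;
-- column `a` is now reported as `string`

select * from test1;
-- returns rows instead of throwing IllegalStateException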

Sheena answered 24/11, 2022 at 13:12 Comment(2)
Why isn't a column of type VOID permissible in a database? After all, there is a VOID data type. Loredo
Here's a PR that forbids this: github.com/apache/spark/pull/28833. There's also some discussion around this topic if you're interested. Fumed