Using a Hive table over Parquet in Pig
I am trying to create a Hive table with schema string, string, double over a folder containing two Parquet files. The schema of the first Parquet file is string, string, double, while the schema of the second file is string, double, string.

CREATE EXTERNAL TABLE dynschema (
 trans_date string,
 currency string,
 rate double) 
STORED AS PARQUET
LOCATION '/user/impadmin/test/parquet/evolution/';

I am trying to use the Hive table in a Pig (0.14) script.

A = LOAD 'dynschema' USING org.apache.hive.hcatalog.pig.HCatLoader();

DUMP A;

But I get the error:

java.lang.UnsupportedOperationException: Cannot inspect org.apache.hadoop.hive.serde2.io.DoubleWritable

I suspect this is because the schema of the second file differs from the table schema: the first file's split is read successfully, but this exception occurs while reading the second file's split.

I also looked into HCatRecordReader's code and found this piece of code:

// Copy each output field from the data record, falling back to
// values outside the data columns (e.g. partition values) for
// fields not present in the data schema.
DefaultHCatRecord dr = new DefaultHCatRecord(outputSchema.size());
int i = 0;
for (String fieldName : outputSchema.getFieldNames()) {
  if (dataSchema.getPosition(fieldName) != null) {
    dr.set(i, r.get(fieldName, dataSchema));
  } else {
    dr.set(i, valuesNotInDataCols.get(fieldName));
  }
  i++;
}

Here I can see there is logic to convert from the data schema to the output schema, but while debugging I found no difference between the two schemas.

Please help me find out:

  1. Does Pig support reading data from a Hive table created over multiple Parquet files with different schemas?

  2. If yes, how can this be done?

Writeup answered 20/1, 2016 at 1:58 Comment(3)
If you know the schema, you could use the Pig Parquet loader to read the files and specify the schema manually, which should trigger a schema evolution. I'm not sure if it would help in this specific case though, as this schema evolution seems hard to do. – Porthole
Will it be the case with Avro as well? – Writeup
Don't know how the Pig Avro storage works. I think you can manually specify a schema there as well. – Porthole
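The comment's suggestion can be sketched in Pig. This is only a sketch: it assumes the parquet-pig bundle jar is available, and the loader class and schema string are taken from the parquet-pig project, not verified against this data.

```
-- Sketch only: assumes parquet-pig-bundle.jar is on the classpath.
REGISTER parquet-pig-bundle.jar;

-- parquet.pig.ParquetLoader can be given a requested schema; it resolves
-- columns by name, which may absorb the per-file column reordering.
A = LOAD '/user/impadmin/test/parquet/evolution/'
    USING parquet.pig.ParquetLoader('trans_date:chararray, currency:chararray, rate:double');

DUMP A;
```

Whether this actually reconciles the two files depends on the loader resolving columns by name rather than by position, which is worth testing on a small sample first.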
If you have files with two different schemas, the following approach seems sensible:

  1. Split up the files based on which schema they have
  2. Make a table out of each group
  3. If desirable, load the individual tables and store them into a supertable
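The steps above can be sketched in HiveQL. This is only a sketch under assumptions: the `v1/` and `v2/` subdirectories and the table names are hypothetical, and it presumes the files have already been physically split by schema (step 1).

```
-- Sketch only: directory layout and table names are hypothetical.
-- Step 2: one external table per schema variant.
CREATE EXTERNAL TABLE dynschema_v1 (
  trans_date string, currency string, rate double)
STORED AS PARQUET
LOCATION '/user/impadmin/test/parquet/evolution/v1/';

CREATE EXTERNAL TABLE dynschema_v2 (
  trans_date string, rate double, currency string)
STORED AS PARQUET
LOCATION '/user/impadmin/test/parquet/evolution/v2/';

-- Step 3: unify into one supertable, selecting columns by name
-- so the differing physical order no longer matters.
CREATE TABLE dynschema_all (
  trans_date string, currency string, rate double)
STORED AS PARQUET;

INSERT INTO TABLE dynschema_all
SELECT trans_date, currency, rate FROM dynschema_v1
UNION ALL
SELECT trans_date, currency, rate FROM dynschema_v2;
```

The key point is that the supertable is populated by a query that names the columns, so each source table's own column order is irrelevant.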
Arette answered 29/5, 2016 at 14:53 Comment(0)
