I am trying to implement CDC (change data capture) in Apache Beam, deployed on Google Cloud Dataflow.
I have loaded the master data and the new data, which is expected to arrive daily. The join is not working as expected; something seems to be missing.
master_data = (
    p
    | 'Read base from BigQuery' >> beam.io.Read(
        beam.io.BigQuerySource(query=master_data, use_standard_sql=True))
    | 'Map id in master' >> beam.Map(lambda master: (master['id'], master)))

new_data = (
    p
    | 'Read Delta from BigQuery' >> beam.io.Read(
        beam.io.BigQuerySource(query=new_data, use_standard_sql=True))
    | 'Map id in new' >> beam.Map(lambda new: (new['id'], new)))

joined_dicts = (
    {'master_data': master_data, 'new_data': new_data}
    | beam.CoGroupByKey()
    | beam.FlatMap(join_lists)
    | 'mergeddicts' >> beam.Map(lambda masterdict, newdict: newdict.update(masterdict))
)

def join_lists(k, v):
    itertools.product(v['master_data'], v['new_data'])
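For context, this is the shape I understand each element coming out of CoGroupByKey to have, and what join_lists computes on it. This is a minimal sketch outside Beam; the row values are taken from the sample data below, and the column names are my own shorthand:

import itertools

# One grouped element, as I understand CoGroupByKey emits it for id 1:
# (key, {'master_data': [master rows with that id], 'new_data': [new rows with that id]})
element = (1, {'master_data': [{'id': 1, 'name': 'A', 'value': 3232}],
               'new_data': [{'id': 1, 'name': 'A', 'value': 44}]})

k, v = element
pairs = list(itertools.product(v['master_data'], v['new_data']))
print(pairs)
# [({'id': 1, 'name': 'A', 'value': 3232}, {'id': 1, 'name': 'A', 'value': 44})]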
Observations (on sample data):
Data from the master table:
1, 'A', 3232
2, 'B', 234
New data:
1, 'A', 44
4, 'D', 45
Expected result in the master table after the pipeline runs:
1, 'A', 44
2, 'B', 234
4, 'D', 45
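In other words, the merge I am after should behave like a plain dict update keyed on id: new rows overwrite matching master rows, and unmatched rows from both sides are kept. Again a sketch outside Beam, with my shorthand column names:

# Sample rows keyed by id:
master = {1: {'id': 1, 'name': 'A', 'value': 3232},
          2: {'id': 2, 'name': 'B', 'value': 234}}
new = {1: {'id': 1, 'name': 'A', 'value': 44},
       4: {'id': 4, 'name': 'D', 'value': 45}}

merged = dict(master)
merged.update(new)  # new data wins on matching ids; unmatched master rows are kept
for row in merged.values():
    print(row)
# {'id': 1, 'name': 'A', 'value': 44}
# {'id': 2, 'name': 'B', 'value': 234}
# {'id': 4, 'name': 'D', 'value': 45}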
However, what I am actually getting in the master table is:
1, 'A', 44
4, 'D', 45
Am I missing a step? Can anyone please help me rectify my mistake?