The goal is to store audit logs from different apps/jobs and to be able to aggregate them by certain IDs. We chose BigQuery for that purpose, so we need to get structured information from the logs into BigQuery.
We successfully run apps deployed in Kubernetes Engine that log JSON strings to stdout; Stackdriver parses them, and the structure shows up as jsonPayload. We took this setup from this tutorial.
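To make the pattern concrete, here is a minimal sketch of the kind of logging that works on GKE: each log line is one JSON object on stdout, which the Stackdriver logging agent parses into jsonPayload. The field names (auditId, app) and the helper function are illustrative, not part of any prescribed schema; "severity" and "message" are fields the agent treats specially.

```python
import json

def structured_log_line(message, severity="INFO", **fields):
    # Build one JSON object per line. The Stackdriver logging agent
    # turns stdout lines like this into a structured jsonPayload.
    # Extra keyword fields (e.g. auditId) become jsonPayload fields.
    entry = {"message": message, "severity": severity, **fields}
    return json.dumps(entry)

# Emit to stdout, as a containerized app on GKE would:
print(structured_log_line("user record updated", auditId="1234", app="billing"))
```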
However, when I use the same log appender from within a Dataflow job, the output is not treated as a structured message: the whole thing appears as a plain string in the jsonPayload.message field.
I need the structure for two things:
- to use it in the filter of a custom exporter to BigQuery
- to have the structure in BigQuery as described here
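For the first point, the kind of sink filter I have in mind would select entries by a structured field. This is only a sketch under the assumption that the payload gets parsed; the auditId field name is hypothetical:

```
resource.type="dataflow_step"
jsonPayload.auditId="1234"
```

With the current behavior this filter matches nothing, because the JSON sits unparsed inside jsonPayload.message.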
What is the easiest way to achieve this?