Tested multiple options to read a large number of rows from BigQuery.
The most common approach is to query the table directly with the BigQuery client:
// read from bq
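A minimal sketch of the direct read, assuming the `google-cloud-bigquery` Python client; the project/table id and row limit are placeholders, and the client import is deferred so the helper can be used without credentials:

```python
def build_query(table_id: str, limit: int) -> str:
    """Build the SELECT used for the read test (table_id is a placeholder)."""
    return f"SELECT * FROM `{table_id}` LIMIT {limit}"

def read_rows(table_id: str, limit: int):
    """Stream rows directly with the BigQuery client (requires google-cloud-bigquery)."""
    from google.cloud import bigquery  # deferred: needs credentials + the installed library

    client = bigquery.Client()
    job = client.query(build_query(table_id, limit))
    # result() pages through the rows; iterating keeps memory flat for large results
    return [dict(row) for row in job.result()]
```

Iterating the result pages row batches on demand, so memory use stays bounded even for the 10,000,000-row case.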
Another approach is to export the table to a temporary Cloud Storage bucket and read the exported files back:
/* export to CSV + reader, export to Avro + reader */
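A sketch of the export step for both formats, again assuming the `google-cloud-bigquery` Python client; the bucket and table names are placeholders, and the export runs as a sharded extract job (the `*` wildcard lets BigQuery split large tables into multiple files):

```python
def export_uri(bucket: str, table: str, fmt: str) -> str:
    """GCS wildcard URI for a sharded export (bucket/table names are placeholders)."""
    ext = {"CSV": "csv", "AVRO": "avro"}[fmt]
    return f"gs://{bucket}/{table}-*.{ext}"

def export_table(table_id: str, bucket: str, fmt: str = "AVRO") -> str:
    """Export a table to a temporary bucket (requires google-cloud-bigquery)."""
    from google.cloud import bigquery  # deferred: needs credentials + the installed library

    client = bigquery.Client()
    job_config = bigquery.ExtractJobConfig(destination_format=fmt)  # "CSV" or "AVRO"
    uri = export_uri(bucket, table_id.split(".")[-1], fmt)
    client.extract_table(table_id, uri, job_config=job_config).result()  # block until done
    return uri
```

The reader side then downloads the shards and parses them (e.g. with `csv` for the CSV export or `fastavro` for the Avro export); Avro keeps the schema with the data, which avoids the type-coercion issues CSV readers run into.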
Each option was tested with 10,000, 100,000, 1,000,000, and 10,000,000 rows.