How to read a large number of rows from BigQuery
I tested multiple options for reading a large number of rows from BigQuery.
The most common approach is to run a query and stream the results with the client library's row iterator.
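A minimal sketch of this approach with the official Go client (`cloud.google.com/go/bigquery`); the project, dataset, and table names are placeholders:

```go
// Stream query results with the BigQuery client's RowIterator.
package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project") // placeholder project ID
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	q := client.Query("SELECT id, name FROM `my-project.my_dataset.my_table`")
	it, err := q.Read(ctx) // runs the query and returns a RowIterator
	if err != nil {
		log.Fatal(err)
	}

	var n int
	for {
		var row []bigquery.Value
		err := it.Next(&row) // fetches pages lazily behind the scenes
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		n++
	}
	fmt.Println("rows read:", n)
}
```

The iterator pages through results over the API, which is simple but becomes the bottleneck at tens of millions of rows.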
Another approach is to export the table to a temporary Cloud Storage bucket and read the exported files from there.
Two export formats were tested: export to CSV plus a CSV reader, and export to Avro plus an Avro reader (https://pkg.go.dev/github.com/hamba/avro/v2).
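A sketch of the export step, assuming a temporary bucket `gs://my-temp-bucket` already exists; the extract job writes sharded Avro files that can then be downloaded and decoded:

```go
// Export a BigQuery table to Cloud Storage as Avro via an extract job.
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project") // placeholder project ID
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The wildcard lets BigQuery shard large tables into multiple files.
	gcsRef := bigquery.NewGCSReference("gs://my-temp-bucket/export-*.avro")
	gcsRef.DestinationFormat = bigquery.Avro

	extractor := client.Dataset("my_dataset").Table("my_table").ExtractorTo(gcsRef)
	job, err := extractor.Run(ctx)
	if err != nil {
		log.Fatal(err)
	}
	status, err := job.Wait(ctx)
	if err != nil {
		log.Fatal(err)
	}
	if status.Err() != nil {
		log.Fatal(status.Err())
	}
	// The exported files can now be streamed from the bucket and decoded,
	// e.g. with the ocf reader from github.com/hamba/avro/v2.
}
```

Avro preserves the schema and types in the file itself, so the reader side needs less per-column handling than a CSV export.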
Tests were run for 10,000, 100,000, 1,000,000, and 10,000,000 rows.
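The timing loop behind those tests can be sketched as below; `readRows` is a hypothetical stand-in for any of the readers, faked here with an in-memory loop so the harness itself runs without BigQuery access:

```go
// Time a row-reading function across the row counts tested above.
package main

import (
	"fmt"
	"time"
)

// readRows simulates reading n rows; swap in a real BigQuery reader here.
func readRows(n int) int {
	count := 0
	for i := 0; i < n; i++ {
		count++
	}
	return count
}

func main() {
	for _, n := range []int{10_000, 100_000, 1_000_000, 10_000_000} {
		start := time.Now()
		got := readRows(n)
		fmt.Printf("%d rows read in %v\n", got, time.Since(start))
	}
}
```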
Conclusion