BigQuery Row Limits


Google says BigQuery can handle billions of rows.

For my application I estimate a usage of 200,000,000 * 1000 rows, i.e. well over a few billion (around 200 billion in total).

I can partition the data into 200,000,000 rows per partition, but the only partitioning support I have seen in BigQuery is separate tables. (Please correct me if I am wrong.)

The total data size will be around 2 TB.

The examples I saw had large data sizes, but the row counts were all under a billion.

Can BigQuery support the number of rows I am dealing with in a single table?

If not, can I partition it in some way other than multiple tables?

The below should answer your question.

I ran it against one of our datasets. As you can see, the tables are close to 10 TB in size, with around 1.3-1.6 billion rows each:

select
  round(size_bytes/1024/1024/1024/1024) as tb,
  row_count as rows
from [mydataset.__TABLES__]
order by row_count desc
limit 10
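The query above uses the legacy SQL bracket syntax. A rough standard SQL equivalent would look something like the following sketch, where mydataset is a placeholder for your own dataset name:

SELECT
  ROUND(size_bytes / POW(1024, 4), 1) AS tb,  -- size_bytes converted to TB
  row_count
FROM `mydataset.__TABLES__`                   -- per-dataset table metadata
ORDER BY row_count DESC
LIMIT 10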

I think the largest table we have dealt with so far was at least 5-6 billion rows, and it all worked as expected.

row  tb    rows
1    10.0  1582903965
2    11.0  1552433513
3    10.0  1526783717
4     9.0  1415777124
5    10.0  1412000551
6    10.0  1410253780
7    11.0  1398147645
8    11.0  1382021285
9    11.0  1378284566
10   11.0  1369109770
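On the partitioning question: besides sharding data across multiple tables, BigQuery also offers natively partitioned tables (a feature that may postdate this answer). A minimal sketch, using hypothetical table and column names (mydataset.events, event_time):

-- Create a date-partitioned table from existing data (hypothetical names)
CREATE TABLE mydataset.events
PARTITION BY DATE(event_time)
AS SELECT * FROM mydataset.events_staging;

-- Queries that filter on the partitioning column only scan matching partitions
SELECT COUNT(*)
FROM mydataset.events
WHERE DATE(event_time) = '2024-01-01';

With this approach the data stays in a single logical table, and partition pruning keeps query cost proportional to the partitions actually scanned rather than the full 2 TB.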
