Update: I was told on Twitter that this is due to a minimum amount of data billed per query (which would explain the roughly 10 MB). I still don't understand why the bigrquery queries show that minimum in the Web interface query history while the ones I run in the BigQuery Web SQL editor don't, but I guess that must be it.
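For what it's worth, you can see this directly from R by running the job through bigrquery's lower-level API and looking at the job statistics. A rough sketch (my billing project name is a placeholder; the totalBytesProcessed/totalBytesBilled fields come from the BigQuery Jobs API):

library(bigrquery)

sql <- "SELECT homeTeamName FROM `bigquery-public-data.baseball.schedules`"

# Start the query job, billed to my own project, and wait for it to finish
job <- bq_perform_query(sql, billing = "my_project_id")
bq_job_wait(job)

# The job metadata reports bytes processed and bytes billed separately
meta <- bq_job_meta(job)
meta$statistics$query$totalBytesProcessed  # actual bytes scanned
meta$statistics$query$totalBytesBilled     # rounded up to the minimum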
If I run this SQL in the BigQuery Web-based SQL editor
SELECT homeTeamName FROM `bigquery-public-data.baseball.schedules`
it says it will process 21.1 KiB. But if I run the same query from R with bigrquery
library(bigrquery)
library(dplyr)
library(DBI)

# Connect to the public baseball dataset, billing queries to my own project
data_project_id <- "bigquery-public-data"
billing_project_id <- "my_project_id"
database_id <- "baseball"

con <- dbConnect(
  bigrquery::bigquery(),
  project = data_project_id,
  billing = billing_project_id,
  dataset = database_id
)

# Same query as in the Web editor
sql <- "SELECT homeTeamName FROM `bigquery-public-data.baseball.schedules`"
my_results <- dbGetQuery(con, sql)
Results say 10.49 MB were billed.
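(Side note: if you want the same kind of estimate the Web editor shows but from R, I believe recent bigrquery versions have a dry-run helper; check ?bq_perform_query_dry_run in your installed version. A sketch, again with my billing project as a placeholder:)

# Dry run: estimate the bytes a query would process, without running it
bq_perform_query_dry_run(
  "SELECT homeTeamName FROM `bigquery-public-data.baseball.schedules`",
  billing = "my_project_id"
)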
Does anyone know why the same query run through bigrquery uses so much more analysis data? Thanks.