Hello,
I'm having trouble uploading a data frame to a new table in BigQuery. The upload takes an inordinate amount of time for the size of the table (209 MB): it often hangs for 10 minutes or longer, usually ends in a timeout error (408), and only occasionally succeeds. What am I doing wrong? I've tried both the DBI interface and bq_perform_upload(), and both have the same outcome.
A minimal reproducible example:
library(readr)
library(DBI)
library(bigrquery)

# Read the CSV and make sure it's a plain data frame
src <- "path/to/csv"
new_tbl <- read_csv(src)
new_tbl <- as.data.frame(new_tbl)

billing <- "###############"
dataset <- "###############"
project <- "###############"

con <- dbConnect(
  bigrquery::bigquery(),
  project = project,
  dataset = dataset,
  billing = billing
)

DBI::dbWriteTable(con, "temp", new_tbl)
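For what it's worth, I've been timing that call with nothing more sophisticated than system.time(), roughly like this:

# Rough timing of the DBI upload path
system.time(
  DBI::dbWriteTable(con, "temp", new_tbl)
)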
I also tried the following:
bq_Temp <- bq_table(project, dataset, "temp")

bq_perform_upload(
  bq_Temp,
  new_tbl,
  fields = NULL,
  create_disposition = "CREATE_IF_NEEDED",
  write_disposition = "WRITE_EMPTY",
  priority = "INTERACTIVE",
  billing = billing
)
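My understanding is that bq_perform_upload() returns a job object, so after the call comes back I can at least check on the server-side load job separately, something along these lines (I haven't verified this tells me anything useful about where the time goes):

job <- bq_perform_upload(bq_Temp, new_tbl, billing = billing)
bq_job_status(job)  # inspect the load job's current state
bq_job_wait(job)    # block until the load job completes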
Even when it does work, it hangs for a long time before the actual upload appears to progress. Is there some configuration I need to do on the BigQuery side, or are there elements of the data frame that need to change before executing the upload?
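In case the answer is about the data itself, this is the only kind of check I've been running on the frame before uploading (column classes and in-memory size); let me know if there is something else I should be looking for:

sapply(new_tbl, class)                      # column types going into the upload
dim(new_tbl)                                # rows x columns
format(object.size(new_tbl), units = "MB")  # in-memory size of the data frame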