I've been working on getting up to speed with dbplyr to munge two large tables in Snowflake. It's like learning R all over again!
After a while, the SQL queries generated by dbplyr run to many pages, which makes it hard to see what's going on in the console. Is there any way to prevent dbplyr from echoing the queries to the console?
A bit of a workaround, but what happens if you use sink() to redirect the output to a file?
If you haven't used it before, this post is a nice intro to sink():
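A minimal sketch of the sink() idea, in case it helps: the log file name is a placeholder, and the dbplyr pipeline is stood in for by a comment. Note that regular output and messages go to different streams, so each needs its own sink() call.

```r
# Sketch only: "dbplyr_trace.log" is an invented file name.
con <- file("dbplyr_trace.log", open = "wt")
sink(con, type = "output")   # redirect ordinary console output to the file
sink(con, type = "message")  # messages/warnings use stderr, so redirect those too

# ... run the dbplyr pipeline here; anything echoed lands in the file ...

sink(type = "message")       # restore the sinks in reverse order
sink(type = "output")
close(con)
```

The downside, as noted below, is that everything goes to the file, so you lose sight of the console.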
Thanks. I'm using logr to capture the outputs of my script, but I also want to see what's happening in the console. I think sink() puts everything into a file, so it's not that useful here. I guess it would be a long file too, since those queries run to multiple pages.
I found the answer: I had to change the Trace level in the dbConnect() call. For some reason Trace = 1 didn't work, but Trace = 2 does.
snowflake <- DBI::dbConnect(
  odbc::odbc(),
  Driver    = "SnowflakeDSIIDriver",
  Server    = "url.azure.snowflakecomputing.com",
  UID       = creds[1],
  PWD       = creds[2],
  Database  = "BRONZE",    # the name of the particular database
  Warehouse = warehouse,   # the engine that does the work and costs money
  Role      = "ECONOMICS_READER",
  # Role = "SYSADMIN",
  # parameter reference:
  # https://docs.snowflake.com/en/developer-guide/odbc/odbc-parameters
  Trace     = 2            # 2 (not 1) stops the SQL being echoed to the console
)
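For anyone landing here later, a hedged sketch of how a connection like this is typically used with dbplyr. The schema, table, and column names below are invented for illustration; show_query() lets you inspect the generated SQL on demand instead of having it echoed automatically.

```r
library(dplyr)
library(dbplyr)

# Hypothetical table: "SALES"."ORDERS" is a placeholder, not from the thread.
orders <- tbl(snowflake, in_schema("SALES", "ORDERS"))

# Build the query lazily; nothing runs on Snowflake yet.
summary_q <- orders |>
  group_by(REGION) |>
  summarise(total = sum(AMOUNT, na.rm = TRUE))

summary_q |> show_query()      # print the generated SQL when you want to see it

result <- collect(summary_q)   # execute on Snowflake and pull the result into R
```

With the automatic echo silenced via Trace = 2, show_query() becomes the deliberate way to check what dbplyr is sending to the warehouse.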