Hello!
I am getting a weird result from running a piece of dbplyr code.
What I have is a column in a table where the values are strings of 11 characters each, with a bunch of leading zeros, something like "00000123456" (illustrative value).
Furthermore, adding filter(ACCT_NO != "0") seems to have no effect at all if placed before collect(), but it does work after collect().
It doesn't happen on every table, only on some.
On the tables where it doesn't happen, the top values seem to have fewer leading zeros (not sure if that's a coincidence or not).
Does Microsoft SQL Server apply implicit casting to queries the way PostgreSQL does? Maybe the driver is getting the class wrong and trying to convert the column to a numeric class (e.g. int8).
I guess I could do something like cast(ACCT_NO as char) in SQL, but how do I do that in dbplyr?
I understand I can use something like build_sql() or as.sql() and paste a string, but how do I execute that against the database? What function do I use?
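For reference, the kind of thing I'm after would look roughly like this sketch (con is a placeholder for an existing DBI connection to SQL Server, and my_table / ACCT_NO stand in for my actual schema; I believe dbplyr translates as.character() into a CAST on most backends):

```r
library(dplyr)
library(dbplyr)

# con: an existing DBI connection to SQL Server (placeholder)
tbl(con, "my_table") %>%
  # as.character() should be translated into a CAST to a character type
  mutate(ACCT_NO = as.character(ACCT_NO)) %>%
  # alternatively, inject a literal SQL fragment with sql():
  # mutate(ACCT_NO = sql("CAST(ACCT_NO AS VARCHAR(11))")) %>%
  filter(ACCT_NO != "0") %>%
  collect()
```

Not sure which of the two forms is considered idiomatic, if either works at all here.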
@edgararuiz, can this be a known issue with the package by chance?
I am cut off from GitHub at work and can't check open issues, but this does look like unexpected behaviour that shouldn't happen.
The same query has always worked in SQL Studio without fail.
I am also not sure it's the driver, as it is the same driver Tableau uses, and I've never had this problem in Tableau.
Just as a comment, I have had issues lately with odbc() and a non-English locale: after a connection is made, my decimal mark gets changed to ",", which causes all my numeric values to be converted to NA if I apply any dplyr verb afterwards.
Basically, an explicit cast(field as varchar(8000)) was needed.
Not sure how to incorporate this into the dbplyr workflow, but at least I can write a vanilla SQL query and run it with dbGetQuery().
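Something along these lines is what I fall back to (again, con is a placeholder for my actual connection, and my_table / ACCT_NO stand in for my real schema):

```r
library(DBI)

# con: an existing DBI connection to SQL Server (placeholder)
res <- dbGetQuery(con, "
  SELECT CAST(ACCT_NO AS VARCHAR(8000)) AS ACCT_NO
  FROM my_table
  WHERE ACCT_NO <> '0'
")
```

This gives me the strings back intact, it just loses the lazy-evaluation convenience of the dbplyr pipeline.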