Using a bunch of old SQL queries on dataframes in R

Here's the situation:

For whatever reason the source data is in data frames, in memory, and not in a database. The goal is to run a large set of existing SQL statements against this data. I'm just looking for general advice, because I don't think I have a complete picture of the possible solutions to this problem.

The solutions I thought of:

  1. Use the sqldf package against these dataframes.
  2. Convert the data frames to SQLite and query that via the SQL statements and DBI functions.
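To make the comparison concrete, here's a minimal sketch of both options; `df` is a stand-in for one of the real data frames, and the query is illustrative:

```r
# Example data frame standing in for the real source data.
df <- data.frame(id = 1:3, value = c(10, 20, 30))

# Option 1: sqldf runs SQL directly against data frames in scope.
library(sqldf)
res1 <- sqldf("SELECT id, value FROM df WHERE value > 15")

# Option 2: copy the data frame into SQLite, then query via DBI.
library(DBI)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "df", df)
res2 <- dbGetQuery(con, "SELECT id, value FROM df WHERE value > 15")
dbDisconnect(con)
```

Both calls return an ordinary data frame, so existing downstream R code shouldn't care which route produced it.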

Is there a strongly preferred choice here? Any real pluses or minuses? Is there an even better solution than either of those two?

Right now I'm leaning towards #2, because after a one-time conversion I could also use SQL chunks in R Markdown. I'm asking for general advice because this is a problem I've never encountered personally, though I imagine it has come up many times before.
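For reference, a sketch of what those R Markdown chunks could look like once the data is in SQLite (the connection name `con` and table name `df` are illustrative, not from any real document):

````
```{r}
library(DBI)
con <- dbConnect(RSQLite::SQLite(), "mydata.sqlite")
```

```{sql, connection=con}
SELECT * FROM df LIMIT 10
```
````

The `sql` engine in knitr runs the chunk against the named DBI connection and prints the result as a table, which is what makes option #2 attractive for reusing existing queries verbatim.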

Thanks.

Under the hood, both options are the same: sqldf just automates the process (it loads the data frames into a temporary SQLite database behind the scenes). If you're going to use this data just once, I would go with the first option. If you're going to be working with it regularly, the second choice is better, because it gives you more ways to access the data and finer control over the conversion process.
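As an example of that finer control, with an explicit DBI connection you can choose column types at conversion time and add indexes for queries you'll run repeatedly (the table and index names here are hypothetical):

```r
library(DBI)

df <- data.frame(id = 1:1000, value = runif(1000))

con <- dbConnect(RSQLite::SQLite(), ":memory:")

# Control the table definition instead of accepting the defaults.
dbWriteTable(con, "measurements", df,
             field.types = c(id = "INTEGER", value = "REAL"))

# A persistent database also lets you index columns used in WHERE clauses.
dbExecute(con, "CREATE INDEX idx_measurements_id ON measurements (id)")

res <- dbGetQuery(con, "SELECT value FROM measurements WHERE id = 42")
dbDisconnect(con)
```

None of this is available through sqldf's one-shot interface, since it creates and tears down its temporary database on every call.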
