Load a table from a DB as a dynamic file (like format = "file")


I'm building a targets pipeline that starts with reading tables from an Oracle DB.

Is there a way to define that table as a dynamic "file", similar to format = "file"?
My aim is to reload the table every time it changes, and only then, so that the entire downstream pipeline automatically becomes out of date.


Hello @yoav.raskin ,

Welcome to this forum!

I think that to know whether the table has changed (and what does "changed" mean, by the way?) you still have to read it, unless you have metadata, like a timestamp, that you can rely on. This is not so much an R question as a database question.

Thanks @HanOostdijk!

In a targets pipeline, if e.g., a CSV is defined as format = "file", then any change to the CSV is tracked and the entire pipeline downstream becomes out of date. A change can be as little as a single value in a cell.
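For reference, the file-tracking behaviour described above looks roughly like this in a `_targets.R` script (a minimal sketch; the file name `raw_data.csv` is hypothetical):

```r
# _targets.R
library(targets)

list(
  # format = "file" tells targets to hash the file itself, so any
  # change to its contents invalidates everything downstream.
  tar_target(raw_file, "raw_data.csv", format = "file"),

  # This target depends on raw_file and re-runs whenever its hash changes.
  tar_target(raw_data, read.csv(raw_file))
)
```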

I'm looking for a best practice for working with a DB upstream in a targets pipeline that makes sure changes to the DB table (such as additions of new rows) are tracked, much like dynamic file targets stored locally.
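One common pattern (a sketch, not an official targets feature for databases) is a cheap "fingerprint" target that queries a small summary of the table, such as a row count plus a maximum modification timestamp. Because targets hashes the return value of every target, any downstream target that takes the fingerprint as an input becomes outdated exactly when that summary changes. The table name `my_table`, column `updated_at`, and DSN `my_oracle_dsn` below are hypothetical; the sketch assumes DBI with an ODBC driver for Oracle:

```r
# _targets.R -- sketch, assuming a DBI/odbc connection to the Oracle DB
library(targets)

list(
  # Cheap query: returns a tiny data frame summarising the table.
  # targets hashes this return value, so downstream targets are
  # invalidated only when the summary changes.
  tar_target(
    table_fingerprint,
    {
      con <- DBI::dbConnect(odbc::odbc(), dsn = "my_oracle_dsn")  # hypothetical DSN
      on.exit(DBI::dbDisconnect(con))
      DBI::dbGetQuery(
        con,
        "SELECT COUNT(*) AS n, MAX(updated_at) AS last_update FROM my_table"
      )
    },
    cue = tar_cue(mode = "always")  # re-check the fingerprint on every run
  ),

  # Full read: mentions table_fingerprint, so it (and everything
  # downstream of it) reruns only when the fingerprint changed.
  tar_target(
    full_table,
    {
      table_fingerprint  # declare the dependency
      con <- DBI::dbConnect(odbc::odbc(), dsn = "my_oracle_dsn")
      on.exit(DBI::dbDisconnect(con))
      DBI::dbGetQuery(con, "SELECT * FROM my_table")
    }
  )
)
```

The `tar_cue(mode = "always")` cue makes targets re-run the cheap fingerprint query on every `tar_make()`, while the expensive full read stays skipped as long as the fingerprint's value (and hence its hash) is unchanged.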
