I'm building a targets pipeline that starts with reading tables from an Oracle DB.
Is there a way to define such a table as a dynamic "file", similar to format = "file"?
My aim is to reload the table every time it changes, and only then, so that the entire downstream pipeline automatically becomes out of date.
I think that to know whether the table has changed (what does "changed" mean here, by the way?) you still have to read it, unless you have metadata, like a timestamp, that you can rely on. This is not so much an R question as a database question, is it?
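For example, a cheap query along these lines (just a sketch; it assumes a DBI-compatible Oracle connection, and the DSN, table name, and LAST_UPDATED audit column are placeholders to adapt to your schema) could act as that metadata check without pulling the whole table:

```r
# Sketch of a cheap "has anything changed?" query. "ORACLE_DSN", MY_TABLE, and
# LAST_UPDATED are placeholders; adapt them to your connection and schema.
library(DBI)

con <- dbConnect(odbc::odbc(), dsn = "ORACLE_DSN")

fingerprint <- dbGetQuery(
  con,
  "SELECT COUNT(*) AS n_rows, MAX(LAST_UPDATED) AS latest FROM MY_TABLE"
)

dbDisconnect(con)

fingerprint  # a tiny data frame; if its value changes, the table has changed
```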
In a targets pipeline, if, e.g., a CSV is defined with format = "file", then any change to the CSV is tracked and the entire downstream pipeline becomes out of date. A change can be as little as a single value in a cell.
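For reference, this is the kind of local-file setup I mean (a minimal sketch; the file name is just a placeholder):

```r
# _targets.R
library(targets)

# Any edit to data.csv, even a single cell, invalidates raw_file and everything
# downstream of it, because targets hashes the file's contents.
list(
  tar_target(raw_file, "data.csv", format = "file"),
  tar_target(raw_data, readr::read_csv(raw_file))
)
```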
I'm looking for a "best-practice" to work with a DB upstream in a targets pipeline, that makes sure changes to the DB table (like additions of new rows), are tracked. Much like dynamic datasets in targets, stored locally.