Best practice for installing R packages with dependencies on a network drive (Windows)

What I am hoping for is people sharing what has been done before as well as general discussion/advice about approaches to this.

The context

At the company I work for we have a collection of internal R packages. We currently store our packages as binary (.zip) files on a network drive and instruct people to install them using the following code:

install.packages("K:/path/to/folder/", repos = NULL)

CRAN-like structure

While writing this post, I came across a related post, which in turn links to a guide, and together they could form a potential solution.

I am posting this anyway for additional insights, but this method looks promising.
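For context, here is a minimal sketch of what a CRAN-like repository on a network drive could look like. The `K:/r-repo` path and the `4.3` R version folder are illustrative assumptions, not details from the linked post:

```r
# Sketch: a CRAN-like repository on a network drive (paths are illustrative).
# Windows binaries live under bin/windows/contrib/<R major.minor version>/.
repo_dir    <- "K:/r-repo"
contrib_dir <- file.path(repo_dir, "bin", "windows", "contrib", "4.3")
dir.create(contrib_dir, recursive = TRUE, showWarnings = FALSE)

# After copying the built .zip files into contrib_dir, generate the
# PACKAGES index that install.packages() reads:
tools::write_PACKAGES(contrib_dir, type = "win.binary")

# Users then install by name; CRAN dependencies are resolved because
# both repositories are listed:
install.packages(
  "packageName",
  repos = c(internal = paste0("file:", repo_dir),
            CRAN     = "https://cran.r-project.org"),
  type  = "win.binary"
)
```

Because the internal repo is just another entry in `repos`, dependency resolution across it and CRAN happens in a single `install.packages()` call, with no extra packages required.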

The problem

The issue with the above approach is that while it will install the package itself, it will not install any of its dependencies (either on CRAN or the network drive). Ideally the CRAN dependencies would be installed.

Question: What is the best practice for sharing and installing organisational R packages?
Preferably assume we have to continue sharing them on a network drive. Our technology team have not been helpful when we've asked them to set up some sort of cloud repository (although solutions to this effect would also be interesting).

We have investigated a few potential solutions I will outline below. While what we currently have works okay, it could be more streamlined.

Also note that we exclusively use the Windows operating system and a lot of the information about this topic I have found (in particular: Install a local R package with dependencies from CRAN mirror - Stack Overflow) appears to be more geared towards Mac/Linux.

Potential Solutions

In addition to our existing solution we have investigated the approaches outlined below.

remotes::install_local() + tar.gz (source version)

remotes::install_local("K:/path/to/folder/packageName_x.x.x.tar.gz", upgrade = "never", type = "source")

This is what I have started using for one of our packages (a bit of a behemoth with lots of CRAN dependencies), instead of the approach above, and it works quite well for that package.

Issues with this

This works quite well, but installing from the source/.tar.gz version (as opposed to the binary/.zip version) means that Rtools can be required to install some packages. Additionally, we would prefer a solution that doesn't involve any extra dependencies (i.e. one that just uses utils::install.packages()).

Failures (thus far)

Stackoverflow post

We have gone through the answers to this Stack Overflow post, in particular the one linked below:
Install a local R package with dependencies from CRAN mirror - Stack Overflow

Unfortunately we can't get it to work with our setup. There is a possibility we are doing something wrong.

Other combinations of remotes::install_local() versus install.packages() and source versus binary

Based on my experimentation this is what happens:

# gives a warning and the package doesn't get installed
devtools::install_local("K:/path/to/folder/", upgrade = "never", type = "binary")

# errors if package dependencies are missing:
install.packages("K:/path/to/folder/packageName_x.x.x.tar.gz", repos = NULL, type = "source")
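One workaround we could imagine for that last error, sketched under the assumption that the tarball's top-level directory matches the package name, is to read the dependency fields from the tarball's DESCRIPTION, install the missing ones from CRAN, and only then install the tarball with repos = NULL:

```r
# Sketch: install missing CRAN dependencies before a local source tarball.
# Assumes the tarball's top-level directory is "packageName".
pkg_file <- "K:/path/to/folder/packageName_x.x.x.tar.gz"

# Extract just the DESCRIPTION file to read the dependency fields.
td <- tempdir()
untar(pkg_file, files = "packageName/DESCRIPTION", exdir = td)
desc <- read.dcf(file.path(td, "packageName", "DESCRIPTION"))

# Collect Depends/Imports, drop version requirements and base R itself,
# and skip anything already installed.
fields <- intersect(c("Depends", "Imports"), colnames(desc))
deps   <- unlist(strsplit(desc[1, fields], ",[[:space:]]*"))
deps   <- trimws(gsub("\\([^)]*\\)", "", deps))
deps   <- setdiff(deps, c("R", rownames(installed.packages())))

# Install whatever is missing from CRAN, then the local package.
if (length(deps) > 0) install.packages(deps)
install.packages(pkg_file, repos = NULL, type = "source")
```

This keeps everything inside base R (utils and tools only), at the cost of handling only direct dependencies; recursive dependencies are then pulled in by the `install.packages(deps)` call itself.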

I think you have a couple of other options that you could explore. drat is a pretty good and common option for creating a CRAN-like repository, and it can be set up on a shared network drive. Posit also offers the (paid) Posit Package Manager.
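A hedged sketch of the drat route (the `K:/r-repo` path is illustrative, and the repository folder structure needs to exist before inserting):

```r
# Publishing side: copy a built package into the repo and update the
# PACKAGES index (repodir is an illustrative path):
drat::insertPackage("packageName_x.x.x.zip", repodir = "K:/r-repo")

# Consumer side: add the internal repo alongside CRAN, then install by
# name so CRAN dependencies are pulled in automatically:
options(repos = c(internal = "file:K:/r-repo", getOption("repos")))
install.packages("packageName")
```

The appeal of this design is that publishing is a one-liner per release, while consumers never need anything beyond base `install.packages()`.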

This older thread might also give you some more ideas.

pak would also be a good thing to explore. It generally does a very good job at solving dependencies, so pak::pak("K:/path/to/folder/packageName_x.x.x.tar.gz") should do a better job of installing and resolving dependencies. Are the dependencies of the local packages all on CRAN or are they also local?
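For example, a sketch of how that might look if some dependencies live in an internal repository as well (the `K:/r-repo` path is an assumption):

```r
# Make an internal CRAN-like repo visible alongside CRAN, so pak can
# resolve dependencies from both ("K:/r-repo" is illustrative):
options(repos = c(internal = "file:K:/r-repo",
                  CRAN     = "https://cran.r-project.org"))

# The local:: prefix tells pak the target is a local package file;
# its dependencies are resolved from the repositories set above.
pak::pak("local::K:/path/to/folder/packageName_x.x.x.tar.gz")
```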

