Reading multiple files in with purrr and retaining their names

Hi all,
I've been working on two different projects recently that each have multiple files I'd like to import at once, so I'm trying to get a workflow down that makes importing and managing multiple data frames easier. I can currently import a batch of files with the following set of commands, but it's a little unwieldy, particularly the last part.

# create a character vector of the desired file names and locations
files <- paste0(here::here("Data", "Cognition data"), "/",
                list.files(path = here::here("Data", "Cognition data"), pattern = "\\.sav$"))

# Read in all data files simultaneously
cog_data <- files |> purrr::map(haven::read_sav)

# find the file names; apply said names to the list levels
names(cog_data) <- here::here("Data", "Cognition data") |> # specify the directory as a string
  list.files(pattern = "\\.sav$") |> # list files with this extension in that location
  gsub(pattern = "\\.sav$", replacement = "") # strip the extension, keeping only the name

# extract data frames from list to env
list2env(cog_data, globalenv())

My question is whether there's a better way to read the files in so that their names are retained. My method reads them into a list with blank, unnamed elements, so I have to scrape the names from the file system and stitch them back in. Would love to hear suggestions on how to do this better!
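One pattern worth trying (a sketch, not something from the thread itself) is to name the file vector up front with `purrr::set_names()`, so the names propagate through `map()` automatically; `tools::file_path_sans_ext()` strips the extension. The directory path below just mirrors the one in the question.

```r
library(purrr)

# Full paths to the .sav files (full.names = TRUE avoids the paste0 step)
files <- list.files(here::here("Data", "Cognition data"),
                    pattern = "\\.sav$", full.names = TRUE)

# Name the vector first; map() then returns a list carrying those names
cog_data <- files |>
  set_names(\(f) tools::file_path_sans_ext(basename(f))) |>
  map(haven::read_sav)
```

If you ultimately want one combined data frame rather than separate objects, `purrr::list_rbind(cog_data, names_to = "source")` (purrr >= 1.0) keeps each file name as a column instead of spraying objects into the global environment.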

This seems clunky but it does read the data into objects that have the same names as the files, including the extension.

FILE_Names <- list.files(path = here("FILES"), pattern = "\\.csv$")
ReadFunc <- function(Nm) {
  tmp <- read.csv(paste0(here("FILES"), "/", Nm))
  assign(Nm, tmp, envir = .GlobalEnv)
}
walk(.x = FILE_Names, .f = ReadFunc)
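If you'd rather not call `assign()` into the global environment from inside a function, the same loop can build a named list instead; a base-R sketch, assuming the same `FILES` directory as the snippet above:

```r
# List the .csv files (names only), read each, then name the list elements
FILE_Names <- list.files(path = here::here("FILES"), pattern = "\\.csv$")
dfs <- lapply(FILE_Names, function(nm) read.csv(file.path(here::here("FILES"), nm)))
names(dfs) <- tools::file_path_sans_ext(FILE_Names)

# Only if you really want them as separate objects afterwards:
list2env(dfs, globalenv())
```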
