Unable to import Excel workbook

I have data in my Excel workbook that has been converted to a table.

But when I try to import the Excel data into R, it gives an error and the import fails.

Can anyone please help me import Excel data that has been converted to a table into R?

Thanks

What have you done so far? Maybe readxl or openxlsx might help.
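For example, a minimal sketch with readxl (the path and object name are just placeholders):

library(readxl)
# read_excel() reads the first sheet of the workbook into a data frame
my_data <- read_excel("C:/path/to/your_file.xlsx")
head(my_data)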

Hi,
please try the rio package - https://www.rdocumentation.org/packages/rio/versions/0.5.16 - it helps a lot with all types of data. If you need to import several worksheets, you can use:

library(rio)
# import_list() returns a named list of data frames, one per worksheet
import_list("your_excel_file.xlsx")

rio uses data.table::fread() under the hood for delimited files, so if needed see the fread documentation for the separator character, reading strings as factors, and other options.
Without seeing an example of what you are trying to import, it's impossible to solve your problem; I can only advise on the best way to import xlsx data.
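For example, a minimal sketch (the file and sheet names here are placeholders):

library(rio)
# import a single worksheet by name; `which` selects the sheet
one_sheet <- import("your_excel_file.xlsx", which = "Sheet1")
str(one_sheet)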

If we could see your script and the error message, we could help you find a solution.

Could you ask this with a minimal REPRoducible EXample (reprex)? A reprex makes it much easier for others to understand your issue and figure out how to help.
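A rough sketch of how that could look (the path is a placeholder, and this assumes you have the reprex package installed):

install.packages("reprex")  # one-time setup
library(reprex)
# reprex() runs the code in a clean session and copies a nicely
# formatted version (code plus output or error) to your clipboard
reprex({
  library(readxl)
  SnT_FY19 <- read_excel("C:/path/to/SnT_FY19.xlsx")  # placeholder path
  head(SnT_FY19)
})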

Thanks for all your kind replies.

Below is a screenshot of the error I get when I try to import the Excel file.

I am using a 32-bit operating system; could this be the issue?

My Excel file is only 300 MB, but I don't know why the error says 1.4 GB.

I have posted a screenshot of the error; can you please help?

If I try to import another file, there is a different error.

I have created SnT_FY19, but the error says SnT_FY19 not found. Why is this happening?

Please help

Judging by the screenshot, you have not actually created the object SnT_FY19; there is an error at that step.

Now, could you at least show us how your data is structured? Or, even better, provide a reproducible example?

This warning message, "package 'readxl' was built under R version 3.3.3", intrigues me. Have you updated to R >= 3.5.0? If so, you may have to reinstall the package under the new R version.
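For example, a quick sketch of reinstalling it under the R version you are currently running:

# reinstall readxl under the current R version, then restart R and reload it
install.packages("readxl")
library(readxl)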

I have observed that a csv file is often significantly larger than the equivalent xlsx, since xlsx is compressed. Try converting your file to csv and then check its size. If it comes close to 1.4 GB, then the RAM on your machine is probably limiting the import, as R holds data in memory.
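As a rough check (the file names are placeholders):

# compare on-disk sizes, in MB, after saving a csv copy from Excel
file.size("SnT_FY19.xlsx") / 1024^2
file.size("SnT_FY19.csv") / 1024^2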

Thanks!
Heramb


Are you sure that you have typed the name of the Excel file correctly? As a sanity check, here are some basic steps that I use personally. See if this helps (screenshot below).
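Along those lines, a small sketch (the path is just a placeholder):

# does R actually see a file at that path?
file.exists("C:/path/to/SnT_FY19.xlsx")
# list the xlsx files visible in that folder
list.files("C:/path/to", pattern = "\\.xlsx$")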

Thanks for the reply.

But first, why is it giving an error when I am trying to import the Excel file into R?

On-disk file size is always much less than the uncompressed in-memory size, so it's very likely that you are actually running out of RAM. Since you are on a 32-bit Windows system, I guess you might have just 3 GB of RAM installed in your computer.

Thanks for the help @andresrcs

Yes, I work on a 32-bit OS with 4 GB of RAM. But could this really be the issue, that I am unable to open just a 300 MB Excel file?

Thanks @Piranha....

As you can see in the pic, I have typed it correctly, but I don't know why it says object SnT_FY19 not found.

Again, 300 MB on disk can take up a lot more RAM once uncompressed, like the 1.4 GB it's reporting.
Also, have you checked how much of those 4 GB is actually available to your R process?
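For example, on 32-bit Windows (and R versions before 4.2) something like this reports what R can and does use:

memory.limit()  # maximum amount of memory R is allowed to use, in MB
memory.size()   # memory currently used by this R session, in MB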


Thanks @rexevan

I am new to the community and have never used reprex yet. I am learning it; maybe next time I will post my questions in that format.

Anyway, as you have mentioned, I have not created the object SnT_FY19, but in the pic there is clearly
SnT_FY19 <- read_excel("C:/user.....xlsx"), so why does this error show?

How do I check how much memory is available for R?

It seems operating R isn't as easy as I presumed.

Because the read_excel() call throws an error, the SnT_FY19 object never gets created.
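A minimal illustration of that behaviour (the object name is made up):

# if the right-hand side throws an error, the assignment never happens
x <- stop("simulated import failure")
exists("x")  # FALSE in a fresh session, because x was never created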