I have an Excel file with 4 lakh rows. I am using this file in my application. I know the file is huge, so the app is slow. Is there any alternate way to deal with this? For example, can we put the Excel data into an online database and then read it from there?
Are you sure the app is slow because of the Excel file? Or is it slow because of the amount of data you are manipulating in the R session? How did you get to this conclusion?
Yes, one of them: upload your data to a SQL server and use SQL code or dbplyr to manipulate the data on the server side, fetching back to your R session just the results.
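A minimal sketch of that server-side approach, assuming the DBI, dplyr, dbplyr, and RSQLite packages are installed. The in-memory SQLite database and the `sales` table are stand-ins for illustration; in practice the connection would point at your real SQL server:

```r
library(DBI)
library(dplyr)
library(dbplyr)

# For illustration only: an in-memory SQLite database stands in for
# a real SQL server holding the uploaded Excel data.
con <- dbConnect(RSQLite::SQLite(), ":memory:")

# Hypothetical table standing in for the large dataset
dbWriteTable(con, "sales", data.frame(
  region = c("north", "south", "north", "south"),
  amount = c(10, 20, 30, 40)
))

# tbl() creates a lazy reference: the dplyr verbs below are translated
# to SQL and executed on the server, not in the R session.
sales <- tbl(con, "sales")
summary_by_region <- sales %>%
  group_by(region) %>%
  summarise(total = sum(amount, na.rm = TRUE))

# collect() fetches only the small summarised result back into R
result <- collect(summary_by_region)
print(result)

dbDisconnect(con)
```

The key point is that `collect()` is called only after the aggregation, so only a few summary rows travel back to R instead of the full dataset.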
Yes you can. There are many options, like setting up your own server (local or in the cloud) or using a cloud database service such as Amazon RDS, ElephantSQL, etc.
Thanks for the information. Answering your question "Are you sure the app is slow because of the Excel file? Or is it slow because of the amount of data you are manipulating in the R session? How did you get to this conclusion?":
There are two reasons: one is the 4 lakh rows, and the other is that I have melted this data in the R session, so it is taking time.
So the format of the file is not the problem: if you load your data into a database and then fetch the whole dataset into your R session for manipulation, you will get exactly the same result.
I think you are a little bit blinded here; it seems like you are trying to find coding solutions to design problems. I recommend you check the logic behind your code first.
Is it? Then how can I solve this :)? Can you please guide me?
Sorry, but your request is very abstract, so I can only give you the general guidelines I already gave in my first reply.
I think this is the kind of request you should be making to a paid consultant.
That is fine, no worries.
I have some follow up questions that may help you clarify your problem and needs. When you say "the app is slow", can you clarify which part of the app is slow? Is it slow to load or is it slow when you update an input?
Does your app do a lot of manipulation to the data? If you were to run the same code in your R session outside of the Shiny app, would it also take a long time?
A general strategy for speeding up an app with large-ish data sets is to do as much data processing as possible before launching the app or at app launch. For example, if your app summarizes aggregated counts from your data and your inputs control filters on the summarized data, don't load the full data set and don't run the aggregation step in Shiny. Instead, perform the aggregation down to the smallest level possible in an offline R script and then have your Shiny app use the smaller data set.
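As an illustration of that strategy, an offline script might pre-aggregate the raw data once and save only the summary, which the app then loads. This is a sketch assuming the dplyr package; the data frame, column names, and `summary.rds` file name are hypothetical:

```r
library(dplyr)

# Offline preparation script -- run once, outside the app.
# Hypothetical raw data standing in for the 400,000-row Excel file.
set.seed(1)
raw <- data.frame(
  category = sample(c("A", "B", "C"), 1000, replace = TRUE),
  value    = runif(1000)
)

# Aggregate down to the smallest level the app actually needs
summary_df <- raw %>%
  group_by(category) %>%
  summarise(n = n(), total = sum(value))

# Save the small summary; the app reads this instead of the raw file
saveRDS(summary_df, "summary.rds")

# Inside the Shiny/flexdashboard app, load only the summary:
app_data <- readRDS("summary.rds")
print(app_data)
```

With this split, the app only ever touches a few summary rows, so reactives and renders stay fast regardless of how large the raw file grows.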
Thanks for the reply. First and foremost, my app is in Flexdashboard.
Answering your questions,
- Yes, you're right: when I run the same manipulation code outside the app, it executes but takes a little time. But when I run the same code inside the app, the app hangs.
- I agree with you. Since my app is in Flexdashboard, I believe the processing should happen inside the app (I tried running it outside the app, but it did not work).
Hope this makes sense.
Sorry, it's very hard to help without more detailed information about your application and data. It would be helpful if you were able to post a small, reproducible example (reprex). Please refer to this post for more information about creating a reprex for Shiny apps.
This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.