RStudio freezing with large operations

Hi everyone, posting here for the first time and looking for some help understanding why RStudio freezes/gets stuck when I run a large operation.

I am working with spatial data. I have a SpatialLinesDataFrame of all roads in a city. I also created a lattice of 10,000 points within the city boundaries, around which I generated buffers of 50 m, 100 m, etc. I am trying to clip the road segments by the point buffers and calculate the total length of road within each buffer. When I run this operation with a lattice of 10,000 points, RStudio stops processing the code after a while, and this has happened on multiple computers. The code runs fine (though slowly) with 5,000 points, but the goal is to go as large as possible.
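For reference, the operation is roughly the following. This is a sketch using the sf package (a common, faster alternative to the sp classes mentioned above); object and column names here are placeholders, not the actual code:

```r
library(sf)

# roads: an sf LINESTRING layer; pts: an sf POINT layer (placeholder names)
buffers <- st_buffer(pts, dist = 100)       # e.g. 100 m buffers
buffers$buffer_id <- seq_len(nrow(buffers)) # id to group by later

clipped <- st_intersection(roads, buffers)  # clip roads to each buffer
clipped$len <- st_length(clipped)           # length of each clipped segment

# total road length within each buffer
totals <- aggregate(len ~ buffer_id, data = st_drop_geometry(clipped), FUN = sum)
```

Note that `st_intersection()` on 10,000 buffers against a full city road network is exactly the kind of step that can balloon in time and memory, which is why the profiling suggestions below are worth following.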

If anyone has ideas as to whether this is something I can deal with inside RStudio (as opposed to a larger computer issue), please let me know. (BTW, it is not my computer's RAM, which is quite large.)

Thanks in advance!
Sierra

Have you tracked the memory usage of your code to confirm this? Most operating systems let you monitor the CPU and memory usage of running processes.
For more precise detail, look into using profvis to profile your R code and understand where the processing bottlenecks are.
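A minimal profvis sketch — the profiled expression here is just a stand-in; you would wrap your actual clipping code instead:

```r
library(profvis)

profvis({
  # stand-in for the real spatial operation: any slow expression works
  x <- matrix(rnorm(1e6), ncol = 100)
  d <- dist(x)   # a CPU- and memory-heavy step that will show up in the profile
})
```

In RStudio this opens an interactive flame graph showing time and memory allocation per line, which should make it obvious whether one step (e.g. the intersection) dominates.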

Use libraries like microbenchmark to time your code at different magnitudes and get an estimate of its order of complexity, i.e. how long does it take to run for 1k points, 2k, 3k, 4k, 5k? Plot these. Is it a straight line, is it quadratic, or worse?
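The timing-at-increasing-sizes idea could be sketched like this — `sort(x)` is a placeholder for the real clipping step:

```r
library(microbenchmark)

ns <- c(1000, 2000, 3000, 4000, 5000)
med_ms <- sapply(ns, function(n) {
  x <- runif(n)
  # replace sort(x) with your buffer/clip operation on n points
  mb <- microbenchmark(sort(x), times = 5)
  median(mb$time) / 1e6   # median time in milliseconds
})

plot(ns, med_ms, type = "b",
     xlab = "number of points", ylab = "median time (ms)")
```

If the curve bends sharply upward between 5k and 10k, that suggests superlinear scaling, and chunking the points (processing buffers in batches and combining the results) may get you to larger lattices without the freeze.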

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.