If you can provide a bit more information about the structure of your JSON file, that would be helpful as well. One thing I have had success with in the past (since it looks like you are running out of memory) is streaming the file.
This is easiest if every line is a standalone JSON object (a convention known as NDJSON, or newline-delimited JSON, sometimes called "JSON Lines"). However, if that is not the case and you know a JSON object usually spans no more than, say, 500 lines, it is possible to detect the start and end of each JSON object and split the file up that way. It's a bit more work, but it gets around your memory limitation.
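For the easy case (one JSON object per line), jsonlite can stream the file directly. Here is a minimal sketch, assuming a file called "big_file.ndjson" (the path and the handler body are made up for illustration):

```r
library(jsonlite)

# stream_in() reads and parses the file in pages, so the whole
# file is never held in memory at once. With a handler, each page
# is passed to your function and then discarded.
stream_in(
  file("big_file.ndjson"),
  handler = function(page) {
    # `page` is a data frame of up to `pagesize` parsed records;
    # do your filtering/aggregation here so only the reduced
    # result is ever kept in memory
    message("processed ", nrow(page), " records")
  },
  pagesize = 1000
)
```

Without a handler, `stream_in()` instead accumulates everything into one data frame, which defeats the purpose for a file bigger than memory.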
readr has some really great streaming file support. If I get a chance, I would love to scrounge up the example that I did (working with 4 GB of memory and a 10 GB file or something like that... way bigger than I could fit in memory).
I want to say I used jsonlite with readr::read_lines_chunked, or something along those lines. tidyjson is also a favorite of mine for parsing JSON data (when the data is very complex), but it is no longer on CRAN (a working version can be installed with devtools::install_github("colearendt/tidyjson")).
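The chunked approach might look something like the sketch below: read the file in chunks of lines, parse each chunk with jsonlite, and keep only the rows you need. The file name, the `value` field, and the filter threshold are all hypothetical:

```r
library(readr)
library(jsonlite)

# DataFrameCallback collects whatever each invocation returns and
# row-binds it at the end, so only the filtered rows accumulate.
keep_big <- DataFrameCallback$new(function(lines, pos) {
  # `lines` is a character vector of raw lines from the file;
  # each line is assumed to be one JSON object (NDJSON)
  parsed <- stream_in(textConnection(lines), verbose = FALSE)
  parsed[parsed$value > 10, , drop = FALSE]
})

result <- read_lines_chunked(
  "big_file.ndjson",
  callback   = keep_big,
  chunk_size = 50000  # lines per chunk; tune to your memory budget
)
```

The key point is that the callback's return value is the only thing retained across chunks, so peak memory is roughly one chunk plus the filtered result.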
EDIT: try jsonlite directly first, as @cderv suggested. Streaming is much more complex and potentially painful, but it gets around a memory limitation if one does exist.