I looked into this a few years ago, so I don't remember all of the specifics; however, I do not believe that lapply is a true parallel function.
When I was looking into parallel computing, the snow and parallel packages had a lot of good information.
The link to the parallel package has information on the number of cores you are using, and this link has good information on how to replace lapply with the snow package to make it truly parallel.
Can you clarify what you mean by that question?
Also, take a look at the future/furrr packages, which make parallelization much easier. future is an "interface of interfaces", so you can use it with snow, parallel, mclapply, slurm, etc.
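A minimal sketch of the future/furrr approach (the worker count and the slow toy function are arbitrary choices for illustration):

```r
library(future)
library(furrr)

plan(multisession, workers = 2)  # backend: separate background R sessions

slow_square <- function(x) {
  Sys.sleep(0.1)  # simulate an expensive computation
  x^2
}

# future_map() is furrr's drop-in, parallel replacement for purrr::map()
result <- future_map(1:4, slow_square)

plan(sequential)  # reset the backend when done
```

Swapping the `plan()` call (e.g. to `multicore` or a cluster backend) changes how the work is distributed without touching the mapping code, which is what makes future an "interface of interfaces".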
Most of the time you'll see something like availableCores or detectCores in the code, but there is no way to know for sure how many cores are used by any given function. Can you clarify why that is important for you to know?
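For reference, a quick sketch of the two core-detection helpers mentioned above (detectCores comes with base R's parallel package; availableCores ships with the future/parallelly packages):

```r
library(parallel)

detectCores()             # all logical cores on the machine
future::availableCores()  # respects cgroups/HPC scheduler limits, so often safer
```

Note that these only report how many cores are *available*; as said above, they don't tell you how many a given function will actually use.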
In my question about lapply, I meant to ask whether it can be considered one of the functions used for parallel computing in R. Here I am referring to parallel computing in the sense of using more than one core (and not just applying a function to multiple elements of a vector). Thanks for suggesting the future package! Seems very useful!
To understand parallel computing better, I am actually trying to understand how a typical piece of code behaves in terms of memory usage. The other question was asked in that context.
lapply will run sequentially, but its presence in your code often hints at the possibility of running the code in parallel. In other words, if your computations take a long time and can be expressed with lapply or purrr's map functions, there is a high chance you'll be able to parallelize your code.
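To illustrate that hint concretely (toy function, two workers chosen arbitrarily), an lapply call can be rewritten with parLapply from the parallel package, which uses the snow-style socket-cluster API:

```r
library(parallel)

xs <- 1:8
sequential <- lapply(xs, function(x) x^2)

cl <- makeCluster(2)                          # start 2 worker R processes
parallel_res <- parLapply(cl, xs, function(x) x^2)
stopCluster(cl)                               # always shut the workers down

identical(sequential, parallel_res)           # same results, different execution
```

This is why lapply-shaped code parallelizes so easily: each element is processed independently, so the elements can be handed to different workers without changing the result.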
Still not sure what you mean. Perhaps if you have a concrete example, it'll be easier for me.