From a programming perspective:
As someone who learned SAS after already knowing how to program, I found it something of an abomination. While R allows you to use poor programming techniques, SAS practically requires them (this is ignoring its "C++"-like interface, which I have never used, so that may be completely different). Using the SAS Macro language means getting things done despite the language, not because of it.
In R, I used to have to dig into how to pull a p-value from a linear model using the matrix embedded in the list returned by summary.lm. Getting anything in SAS felt like that every single time.
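For reference, the dig I'm describing looks something like this (using the built-in mtcars data as a stand-in example):

```r
# Fit a simple linear model on a built-in dataset.
fit <- lm(mpg ~ wt, data = mtcars)

# summary() returns a list; the $coefficients element is a matrix
# with one row per term and columns Estimate, Std. Error, t value,
# and Pr(>|t|) -- the last one is the p-value.
p_value <- summary(fit)$coefficients["wt", "Pr(>|t|)"]
p_value
```

It's only one line once you know it, but nothing about `summary(fit)` tells you the matrix is there until you've poked at the structure with `str()`.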
In the end, I think becoming very good at SAS would make you an excellent SAS programmer. Becoming very good at R should make you a better programmer, period.
From a statistics perspective:
I have admittedly narrow expertise in SAS vs R in academia, but most of that experience was being taught how to run a statistical procedure in SAS and then turning around to figure out how to get the same numbers in R, so that I could still do it once my academic SAS license expired. I had several professors who treated the SAS way of doing stats as the One True Way. I didn't always know enough to assess that judgement at the time, but the pieces I dug into then and since have shown that they had a pretty narrow view.
The most obvious example is Type I / II / III / IV sum of squares in ANOVA/regression. You can find volumes of discussion about this that I won't recap here. However, R can make it difficult to get exactly the numbers SAS gets without using just the right package and configuring it just so. I think that is the source of some academic distaste for R -- why do they use the "wrong" methods? But when you get down to how people actually use regression in "the wild", all the methods are wrong and you have to either look at more complex methods or just accept that being "kind of right" is often good enough. So, getting the textbook answers just isn't that important.
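To make "just the right package and configuring it just so" concrete, here is a base-R sketch of reproducing SAS-style Type III tests (the car package's `Anova(fit, type = "III")` is the more common route; the model and variables here are just stand-ins):

```r
# SAS computes Type III tests under sum-to-zero contrasts; R defaults
# to treatment contrasts, so you must switch first or the sums of
# squares won't match SAS's output.
op <- options(contrasts = c("contr.sum", "contr.poly"))

fit <- lm(mpg ~ factor(am) * factor(cyl), data = mtcars)

# drop1() with scope ". ~ ." tests every term against the full model,
# giving Type III-style marginal F tests in base R.
type3 <- drop1(fit, . ~ ., test = "F")
print(type3)

options(op)  # restore the default contrasts
```

Forget the `options()` call and you'll get numbers that are internally defensible but don't match the SAS printout, which is exactly the kind of mismatch that fuels the "R gives the wrong answer" complaint.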
From a capability standpoint:
The most persuasive thing SAS has going for it is that it natively handles bigger data sets than R, and can analyze them as-is. In R you have to step outside base, though not very far, judging from what I've seen of Microsoft's XDF tools.
I believe SAS also addresses some issues with data control/auditing that are probably not well-handled by R directly. I would argue that there are other tools focused on making sure no one is fudging your data, so R doesn't need to do that.
In summary:
I've answered neither of your original questions. Well, I guess it's fair to assume that I've answered #1 indirectly, by making it clear that I'd be a terrible choice to lead this discussion. I think the main reason to stick with SAS is that it's what you've always used, or it's what your adviser used, or your organization has a lot of investment in using SAS "right".
You can generally duplicate everything SAS does in R, but focusing on that will give you a skewed view of R because the people that are making R better often don't care about duplicating results that they may not agree with. It's like saying that I should use ABC Painting because their lead paint is top-of-the-line.