I am a newbie to R, working on a project to automate report generation and scrape data using R for tracking diseases.
Since I have no prior experience with R apart from regression and logistic analysis, I seriously need guidance on how to approach this project in terms of coding and the methodology to follow. Any help in this regard would be awesome.
Thanks, I will surely keep the forum guidelines in mind. This was actually a summary of what needs to be done rather than the exact question itself. I'm just not sure how to go about it.
The ULTIMATE GOAL is to scrape data from websites (HTML/CSS), automate report generation in PDF/HTML format (output), and then create interactive dashboards in Tableau. I already know how to build dashboards in Tableau, but I really don't know how to scrape data using R (which packages to install, reference code guides/tutorials for understanding) or how to automate report generation.
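For the report-generation half, the standard tool is R Markdown: you write a template once, then call `rmarkdown::render()` from a script (or a scheduled job) to produce a fresh PDF or HTML report each time. Here is a minimal sketch; the template filename, output name, and `region` parameter are hypothetical placeholders you would replace with your own.

```r
# Sketch of automated report generation with R Markdown.
# "report_template.Rmd" is a hypothetical template you write once;
# render() then produces the PDF/HTML output on demand.
library(rmarkdown)

render(
  input         = "report_template.Rmd",            # hypothetical template file
  output_format = "pdf_document",                   # or "html_document"
  output_file   = paste0("disease_report_", Sys.Date(), ".pdf"),
  params        = list(region = "national")         # hypothetical parameter,
                                                    # declared in the template's
                                                    # YAML `params:` block
)
```

Because `render()` is just an R function call, the same script that scrapes the data can end by rendering the report, so the whole pipeline runs in one go.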
I have tried RCurl and readLines() for reading and parsing data directly from websites, but I'm still a long way from perfecting it.
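RCurl and readLines() work, but they are quite low-level. A common higher-level alternative is the rvest package, which parses the HTML for you and lets you select elements with CSS selectors. A minimal sketch, where the URL and selectors are placeholders for whatever site you are actually scraping:

```r
# Minimal scraping sketch using rvest.
# The URL and CSS selectors below are placeholders -- swap in the real ones.
library(rvest)

url  <- "https://example.com/disease-stats"   # placeholder URL
page <- read_html(url)

# Pull all HTML tables on the page into a list of data frames
tables <- page |> html_elements("table") |> html_table()
stats  <- tables[[1]]                          # assumes the first table is the one you want

# Or extract specific elements via a CSS selector
titles <- page |> html_elements("h2") |> html_text2()

# Save the result, e.g. as a Tableau data source
write.csv(stats, "disease_stats.csv", row.names = FALSE)
```

Use your browser's developer tools (or the SelectorGadget extension) to find the right CSS selectors for the page you are targeting, and check the site's terms of service / robots.txt before scraping.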
Since it sounds like you're still in the 'how do I even approach these kinds of problems with R?' phase of your project, I'd also like to encourage you to check out Garrett and Hadley's book, https://r4ds.had.co.nz/. It contains a nice collection of chapters on tools useful for a project like this, and is a nice on-ramp to R and the tidyverse generally.
When you do get to the point where you're looking for help with a specific coding question, I'd encourage you to check out FAQ: Tips for writing R-related questions. It's a nice guide for improving the probability that folks who want to help can understand your problem and quickly reply with useful suggestions.