How can I make a Shiny app appear on search engines?

I am looking for advice on how to make a Shiny app recognizable by search engines.

I understand that the app remains inactive until someone clicks on it, which instantiates a session of the app that lasts for whatever timeout you have configured. I also understand that Google is unable to crawl the app's content, but the app cannot even be found by the search engine at all (not even the title, let alone the content).

An alternative I've looked at is embedding the app in an iframe on my static website, but my app has multiple tabs, so I don't think this is really workable.

I am hosting on shinyapps.io with a Basic plan.

I would also be interested in an answer!

We recently had another discussion about the extent to which search engines are able to visit such sites, in the thread "App deployed to shinyapps.io keeps waking up with no user connection".

So far it seems nobody really knows.
In any case, when I search for my username in combination with the app name (the parts of the URL before and after the domain), nothing is found, although the app has been online for about 9 months.

I think it would be awesome if it were possible to upload a file together with the app that is read before the app even starts, telling crawlers what they can find there, or, conversely, that there is nothing of interest and they should move on.

I mean something like robots.txt or a sitemap, containing just the name of the app, the name of the author, and maybe a short description or a few keywords.
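For comparison, on an ordinary static site this kind of crawler metadata is expressed with a robots.txt plus a sitemap reference. A sketch of what that looks like (the domain and paths are placeholders, and shinyapps.io does not currently let authors supply this file):

```
# robots.txt — allow all crawlers and point them at a sitemap
User-agent: *
Allow: /

Sitemap: https://example.shinyapps.io/my-app/sitemap.xml
```

The sitemap in turn would just list the app's URL, which is roughly the "name of the app plus a pointer" idea suggested above.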

RStudio developers, is this possible?

I found a workaround but it is not really a solution.

I created an empty index.html that only contains an iframe with the app URL and hosted it on GitHub Pages (gh-pages). I assume the crawler reaches the static web page, crawls the URL, and makes the URL of the actual app (not the static web page) visible in the search engine. However, it seems impossible to add any description, keywords, or anything else to the Google results. I tried meta tags in both the static web page and the app, but nothing happened.
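For reference, the workaround described above amounts to a page like the following (the app URL is a placeholder, and as noted, the meta tags did not end up in the Google results):

```html
<!DOCTYPE html>
<html>
<head>
  <title>My Shiny App</title>
  <meta name="description" content="Short description of the app.">
</head>
<body>
  <!-- full-page iframe pointing at the deployed Shiny app -->
  <iframe src="https://username.shinyapps.io/my-app/"
          style="border:0; width:100%; height:100vh;"></iframe>
</body>
</html>
```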

Again, this is only my assumption of how things worked. I am really looking forward to a more sophisticated solution.

Is there a more recent update on this?


I was facing the same problem with my Shiny app, and I just discovered via my Google Search Console account that the app was serving a file that blocks Google indexing, namely a robots.txt that prevents robots from exploring the app.

I assume this default setting is meant to save the active hours available for individual apps; otherwise robots would use them up, and that would be sad.

Nevertheless, I don't know how to remove this robots.txt file, or whether we even can. I am going to send an email to support. I think Google indexing could bring new users to an app and encourage owners to upgrade their plan (Starter -> Basic -> Standard ...), so everyone would be glad :slight_smile:


Hi all,

Unfortunately, at this time, there isn't an ability to modify the robots.txt for applications. We currently block crawling deliberately because we found that the robot crawlers would cause applications to start, but would not follow the sockjs connections to get actual content. Because of this, we were paying the cost of starting the application, waiting for it to become idle, and then stopping, without getting any benefit from the crawl.

That said, we are looking into options to allow this in the future in some fashion and we'll see what we can do going forward.

Has there been any progress on this? Is this problem specific to shinyapps.io? What about other deployment options such as RStudio Connect? Can an app deployed on RStudio Connect appear on search engines?

I am sorry, I clearly have no idea how this works, but does the crawler actually "tell" the web page that it's a crawler? Or can it be recognised some other way, e.g. because it doesn't send any information about the browser used?
In that case, could something like this be implemented: Are you a crawler? Yes = show robots.txt and do nothing / No = start the application.
Just thinking...
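Crawlers do normally identify themselves via the User-Agent request header, so the check suggested above is implementable in principle. A minimal sketch of the idea in Python (purely illustrative; this is not a hook that shinyapps.io exposes to app authors, and real bot detection would also verify the client's IP):

```python
# Substrings that well-known search-engine bots put in their User-Agent header.
CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot", "spider", "crawler")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a search-engine bot."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

# A hosting layer could then serve robots.txt or static metadata to bots,
# and only start a real app session for human visitors.
print(is_crawler("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True
print(is_crawler("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/119.0"))  # False
```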

Edit: Or check for 1/5 of a second (or so) whether the connection is still active before starting the app: short enough for a human to wait (the start takes a few seconds anyway) and long enough for the crawler to have left the page again.

PS: Sorry for digging out this old thread; it was suggested to me under another topic and I didn't check the age.


I was wondering if anyone had found a resolution to the initial post. I too am currently trying to make my R Shiny app searchable through Google and have found no way of doing so. Any help or guidance would be much appreciated. Thank you in advance.

@kvasilopoulos , I am wondering if you have found a solution for this. I too have an app with multiple tabs and would like to make it recognizable by Google. Please let me know. Thank you in advance.

A simple workaround is to publish a simple (but of course attractive) web page that links to your Shiny app and describes it appropriately. On that web page, make sure the robots.txt file allows search engine indexing; at a minimum, you'll be able to connect search users to your app.
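A minimal sketch of such a landing page (all names and URLs are placeholders). Unlike the iframe workaround discussed earlier in the thread, this page contains crawlable text and a plain link, so the description can appear in search results:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>My Shiny App - interactive data explorer</title>
  <meta name="description" content="One-sentence summary shown in search results.">
  <meta name="robots" content="index, follow">
</head>
<body>
  <h1>My Shiny App</h1>
  <p>A paragraph of crawlable text describing what the app does.</p>
  <p><a href="https://username.shinyapps.io/my-app/">Open the app</a></p>
</body>
</html>
```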