It might also be helpful to block crawlers on your staging environment, if you are using one. There is rarely a good reason to have a staging site indexed, so if you agree, the steps below can help you block it.
If you are using Tomcat, set an environment variable on the staging instance, such as NOFOLLOW=true (see, for example: TOMCAT_OPTS, environment variable and System.getEnv()).
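One way to set it (a sketch only; `setenv.sh` is the conventional Tomcat hook for this, but adjust the path to your install):

```shell
# $CATALINA_HOME/bin/setenv.sh on the *staging* Tomcat only
# (catalina.sh sources this file on startup, so the variable is
# visible to the JVM via System.getenv)
export NOFOLLOW=true
```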
Next, as mentioned by @doelleri, set up the URL mapping:
UrlMappings

    // robots.txt
    "/robots.txt"(controller: 'robots', action: 'robots')
Then have your RobotsController check for the environment variable you set on the staging Tomcat.
RobotsController

    class RobotsController {

        def robots() {
            if (System.getenv('NOFOLLOW') == 'true') {
                // Staging: serve a restrictive robots.txt
                // (to block the entire site instead, use a single "Disallow: /")
                String text = "User-agent: *\n" +
                        "Disallow: /cgi-bin/\n" +
                        "Disallow: /tmp/\n" +
                        "Disallow: /junk/\n" +
                        "Disallow: /admin/\n" +
                        "Crawl-delay: 5\n" +
                        "Sitemap: https://www.example.com/sitemap.xml"
                render(text: text, contentType: "text/plain", encoding: "UTF-8")
            } else {
                // Production: no robots.txt is served
                render(status: 404, text: 'Failed to load robots.txt')
            }
        }
    }
robots.gsp

    <%-- Content is rendered by the controller, so leave this view blank --%>
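If you want a quick regression check, a Spock unit test along these lines should work (a sketch only; it assumes the Grails 3+ `grails-testing-support` trait and that NOFOLLOW is not set in the test JVM, so the 404 branch fires):

```groovy
import grails.testing.web.controllers.ControllerUnitTest
import spock.lang.Specification

class RobotsControllerSpec extends Specification implements ControllerUnitTest<RobotsController> {

    void "robots.txt returns 404 when NOFOLLOW is not set"() {
        when:
        controller.robots()

        then:
        response.status == 404
    }
}
```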