Are you creating a new site on Stanford Sites but don't want search engines to see it? The NoBots module is here to replace Stanford Metatag NoBots.
NoBots recently joined the growing list of modules available to Stanford Sites during the Summer 2016 Updates. It replaces the soon-to-be-deprecated Stanford Metatag NoBots module, which was featured as a Module of the Day here on the SWS Blog in 2014.
Why is Stanford Metatag NoBots being deprecated?
Two reasons are driving this change: 1) NoBots performs slightly better than Stanford Metatag NoBots, and 2) NoBots is supported by the Drupal community. The latter means, among other things, that the module is covered by the Drupal project's security advisory policy.
How does NoBots work?
This module blocks (well-behaved) search engine robots from crawling, indexing, or archiving your site by setting an "X-Robots-Tag: noindex,nofollow,noarchive" HTTP header on every response. NoBots was created and is maintained by our own Stanford Web Services developers John Bickar and Shea McKinney.
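To see what a crawler sees, here is a minimal sketch in Python that simulates the behavior: a throwaway local server that answers the way a page with NoBots enabled would, and a client request that reads back the header. (The server and handler here are stand-ins for illustration only; they are not part of the NoBots module.)

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoBotsHandler(BaseHTTPRequestHandler):
    """Responds the way a site with NoBots enabled would (simulation)."""

    def do_GET(self):
        self.send_response(200)
        # The header NoBots adds, telling robots not to index,
        # follow links on, or archive the page:
        self.send_header("X-Robots-Tag", "noindex,nofollow,noarchive")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Work in progress</body></html>")

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

# Start the stand-in server on an ephemeral local port.
server = HTTPServer(("127.0.0.1", 0), NoBotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the page the way a well-behaved crawler would and inspect headers.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
robots_tag = resp.getheader("X-Robots-Tag")
print(robots_tag)  # noindex,nofollow,noarchive
server.shutdown()
```

A robot that honors the header reads those three directives and skips the page entirely; nothing about the page content itself changes.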
How to enable NoBots on your site
- Log into your Stanford Sites website
- Go to Modules in the Admin toolbar at the top of your page
- Filter by the phrase "nobots"
- Select the NoBots module
- Click Save configuration
Then continue creating content and developing your website. When you're ready to unleash your site to the robots, go back to Modules and disable NoBots.
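Once you've disabled the module, it's worth confirming that the header is really gone before you expect search engines to start indexing. A small helper like the sketch below can check any set of response headers; the `robots_blocked` function name and the plain-dict input are illustrative assumptions, not part of NoBots.

```python
def robots_blocked(headers):
    """Return True if an X-Robots-Tag header opts the page out of indexing.

    `headers` is a mapping of response header names to values (e.g. built
    from http.client's resp.getheaders()). This helper is illustrative,
    not part of the NoBots module.
    """
    tag = headers.get("X-Robots-Tag", "")
    directives = {d.strip().lower() for d in tag.split(",") if d.strip()}
    # "none" is shorthand for noindex,nofollow in the robots directives.
    return "noindex" in directives or "none" in directives

# With NoBots enabled you would expect:
print(robots_blocked({"X-Robots-Tag": "noindex,nofollow,noarchive"}))  # True
# After disabling the module the header should be gone:
print(robots_blocked({"Content-Type": "text/html"}))  # False
```

Note that a plain dict lookup is case-sensitive; real HTTP header names are not, so a production check would normalize the keys first.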
Blocking only one page from robots
NoBots is an all-or-nothing tool: it cannot block just one page from search engine robots. Once NoBots is disabled, the best way to keep selected content out of search engines is to leave it unpublished.
Not using Stanford Sites?
You can still use this module! It's available on Drupal.org.