How to Create a Private, Online Drupal Environment


An OSTraining member asked us how to develop their Drupal 8 site online and also stop bots from crawling that site.

In this tutorial, I’m going to show you how to create a private, online Drupal 8 development environment. We’re going to use three modules to secure our environment: RobotsTxt, Require Login, and Shield.

#1. The RobotsTxt module

The RobotsTxt module lets us edit the robots.txt file directly from the Drupal admin interface. This file tells search engine crawlers which parts of our site they may access.

  • Download and enable the RobotsTxt module.
  • Click “Configure” next to the module on the “Extend” page, or go to Configuration > Search and metadata > RobotsTxt.


The configuration area gives us direct access to the contents of robots.txt, and we can edit it here. If you want to block search engines from crawling the site entirely, remove or comment out the default rules and replace them with a rule that disallows everything. I have removed the default rules for this example.
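For example, to tell every well-behaved crawler to stay out of the whole site, robots.txt can be reduced to two lines (this is standard robots.txt syntax, not anything specific to the module):

```
User-agent: *
Disallow: /
```

Keep in mind that robots.txt is advisory: compliant crawlers honor it, but it does not actually prevent access, which is why the next two modules matter.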


Click the link at the top of the configuration page and it will take you to your robots.txt file. Here you can verify that the changes have been saved.

If the file does not match what you saved, Drupal is most likely still serving a physical robots.txt file from the web root, which takes precedence over the module’s version. Delete or rename that original file so the module’s version can be served in its place.

You can verify your robots.txt with Google by using the robots.txt testing tool in Search Console.
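You can also check a robots.txt policy locally: Python’s standard library ships a parser that interprets the file the same way compliant crawlers do. Here is a quick sketch using a block-everything policy (`Disallow: /`); the example URL is made up.

```python
from urllib.robotparser import RobotFileParser

# A policy that tells every crawler to stay out of the whole site.
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Compliant crawlers would refuse to fetch any page under this policy.
print(parser.can_fetch("Googlebot", "https://example.com/node/1"))  # False
```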

#2. The Require Login module

Next, we’ll set up the Require Login module to make sure no one can access our site’s content, even if they find the site.

  • Download and enable the Require Login module.
  • Go to Configuration > Require Login.
  • On this page, you can set the message that visitors will see. By default, the message is “You must login to use this site.”
  • Click the “Save configuration” button.
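The behavior the module enforces can be sketched in a few lines: every request from an anonymous visitor is redirected to the login page, while authenticated visitors pass through. This is an illustrative Python sketch, not the module’s actual code; the function and variable names are invented.

```python
LOGIN_PATH = "/user/login"  # Drupal's standard login route

def resolve(path: str, logged_in: bool) -> str:
    """Return the path a visitor ends up on (illustrative only)."""
    if not logged_in and path != LOGIN_PATH:
        return LOGIN_PATH  # anonymous visitors are forced to the login page
    return path            # authenticated visitors see the page they asked for

print(resolve("/node/1", logged_in=False))  # /user/login
print(resolve("/node/1", logged_in=True))   # /node/1
```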


Open another window and visit your site. You should be automatically redirected to the sign-in page.


Please note that any links leading to resources outside your site will still be clickable. If you really want to prevent anything from being visible, try the next module on our list.

#3. The Shield Module

Now that we have configured RobotsTxt and Require Login, we will set up our third module: Shield. This module blocks anyone from even seeing the site, let alone trying to log in.

  • Download and enable Shield. Despite a common misconception, Shield does not require Apache; it works with Nginx and other web servers as well.
  • Click “Configure” next to the module, as in the image below:


  • Select a username and password for access.
  • Save the configuration.


  • Now when you visit your site, you’ll see an HTTP authentication prompt before you even reach Drupal’s login page.
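Under the hood, Shield uses HTTP Basic Authentication: the prompt you see is the server asking the browser for an `Authorization: Basic` header carrying the username and password base64-encoded. Here is a quick illustration of what that header looks like; the credentials are made up for the example.

```python
import base64

username, password = "admin", "secret"  # example credentials only

# HTTP Basic Auth sends "username:password" base64-encoded in a header.
token = base64.b64encode(f"{username}:{password}".encode()).decode()
header = f"Authorization: Basic {token}"

print(header)  # Authorization: Basic YWRtaW46c2VjcmV0
```

Because Base64 is an encoding, not encryption, Basic Auth credentials are only private when the site is served over HTTPS.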


Comments

Paul Rijke · 7 years ago
It looks to me that if you have Shield enabled, the use of the robots.txt module is overkill? The crawlers can’t see the robots.txt file then?

Mike Shiyan · 6 years ago

The Shield module does NOT require Apache to be installed on the server. It’s a common misconception. It can be successfully used with, for example, Nginx as well. See https://www.drupal.org/node/2843068 for more details.
