www.mysite.com/pmwiki/myfirstpage and, of course, www.mysite.com/pmwiki/mysecondpage. This way, when someone searches Google for a word that appears on those pages, your PmWiki pages show up in the results.
However, what you don't want is for Google to archive every www.mysite.com/pmwiki/myfirstpage?action=edit URL, or any other PmWiki command, and make it directly reachable from the search results. These commands are triggered the moment a visitor clicks the link to get to your site, so the visitor lands in an edit screen or a change log instead of the page itself.
There are two ways to prevent this scenario. The first is easy: create a robots.txt file in the root of your website (i.e. www.mysite.com/robots.txt). The other way is to emit meta-tags from your local.php script.
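The meta-tag approach can be sketched along these lines. This is a minimal, hypothetical example for local.php, assuming a standard PmWiki setup where the global $action holds the current action ('browse' for normal page viewing) and entries added to $HTMLHeaderFmt are emitted into the HTML head:

```php
<?php if (!defined('PmWiki')) exit();
## Minimal sketch: for any action other than normal browsing
## (edit, diff, search, ...), tell robots to neither index the
## page nor follow its links.  Assumes PmWiki's standard
## $action and $HTMLHeaderFmt globals are available.
if ($action != 'browse')
  $HTMLHeaderFmt['robots'] =
    '<meta name="robots" content="noindex,nofollow" />';
```

Unlike robots.txt, this works for any search engine that honors the robots meta-tag, not just Googlebot.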
Example robots.txt file:

    User-agent: Googlebot
    Disallow: */main/allrecentchanges$
    Disallow: */pmwiki*
    Disallow: */search*
    Disallow: *recentchanges*
    Disallow: *action=*

However, PmWiki now includes, by default, special meta-information in the pages it returns for edit and diff actions, instructing search engines to neither index the page nor follow any links in it. This removes the need for some of the lines in the robots.txt file above.
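As a rough illustration of how a Googlebot-style matcher interprets the wildcard rules above ('*' matches any run of characters, a trailing '$' anchors the pattern at the end of the path), here is a small self-contained Python sketch. It is not Google's actual implementation, and the /wiki/... example paths are hypothetical:

```python
import re

def blocked(path, patterns):
    """Return True if `path` matches any Googlebot-style Disallow
    pattern: '*' matches any run of characters and a trailing '$'
    anchors the pattern at the end of the path."""
    for pat in patterns:
        anchored = pat.endswith('$')
        if anchored:
            pat = pat[:-1]
        # Turn the wildcard pattern into a regular expression.
        regex = '.*'.join(re.escape(piece) for piece in pat.split('*'))
        if anchored:
            regex += '$'
        if re.match(regex, path):
            return True
    return False

# The Disallow lines from the robots.txt example above.
disallow = [
    '*/main/allrecentchanges$',
    '*/pmwiki*',
    '*/search*',
    '*recentchanges*',
    '*action=*',
]

print(blocked('/wiki/Main/HomePage?action=edit', disallow))  # True
print(blocked('/wiki/Main/HomePage', disallow))              # False
```

Note that a pattern like */pmwiki* also matches ordinary page URLs such as /pmwiki/myfirstpage on a site laid out like the example above, so the Disallow lines may need tailoring to your own URL scheme.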
<< QAMarkup | PmWiki.DocumentationIndex | TroubleShooting >>