This plugin hasn’t been tested with the latest 3 major releases of WordPress. It may no longer be maintained or supported and may have compatibility issues when used with more recent versions of WordPress.

DB Robots.txt

Description

DB Robots.txt is an easy (i.e. automated) solution for creating and managing a robots.txt file for your site. No FTP access is needed to create the file.

If the plugin detects an existing XML sitemap file, it will be included in the generated robots.txt file.

It automatically includes the Host rule for Yandex.
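
Under the hood, WordPress serves a virtual robots.txt file and exposes its contents through the robots_txt filter. The following PHP snippet is a minimal sketch of that mechanism, not the plugin’s actual code; the sitemap URL shown is only an illustrative assumption, since DB Robots.txt detects an existing sitemap automatically.

    <?php
    // Minimal sketch: append a Sitemap line to WordPress's virtual robots.txt.
    // The sitemap path below is an assumption; DB Robots.txt detects existing sitemaps itself.
    add_filter( 'robots_txt', function ( $output, $is_public ) {
        if ( $is_public ) {
            $output .= "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
        }
        return $output;
    }, 10, 2 );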

Installation

  1. Upload bisteinoff-robots-txt folder to the /wp-content/plugins/ directory
  2. Activate the plugin through the ‘Plugins’ menu in WordPress
  3. Enjoy
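
Alternatively, if WP-CLI is available on the server, the plugin can be installed and activated from the command line. This assumes the plugin’s WordPress.org slug matches the folder name bisteinoff-robots-txt:

    wp plugin install bisteinoff-robots-txt --activate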

FAQ

Will it conflict with any existing robots.txt file?

If a physical robots.txt file exists on your site, WordPress won’t process any request for one, so there will be no conflict.

Will this work for sub-folder installations of WordPress?

Out of the box, no. Because WordPress is in a sub-folder, it won’t “know” when someone is requesting the robots.txt file, which must be at the root of the site.


Contributors & Developers

“DB Robots.txt” is open source software. The following people have contributed to this plugin:

Contributors

Translate “DB Robots.txt” into your language.

Interested in development?

Browse the code, check out the SVN repository, or subscribe to the development log by RSS.

Changelog

2.2

  • Fixed Sitemap option

2.1

  • Tested with WordPress 5.5.
  • Added wp-sitemap.xml

2.0

  • Tested with WordPress 5.0.
  • The old Host directive has been removed, as it is no longer supported by Yandex.
  • The robots directives have been improved and updated.
  • Added robots directives preventing the indexing of duplicate URLs with UTM, Openstat, From, GCLID, YCLID and YMCLID parameters (see the example after this list).
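
For illustration only, directives of this kind usually take the following form in robots.txt; the exact rules shipped by the plugin may differ:

    # Example only; the plugin's actual output may differ
    User-agent: *
    Disallow: /*?utm_
    Disallow: /*?gclid=
    Disallow: /*?yclid=

    User-agent: Yandex
    Clean-param: utm_source&utm_medium&utm_campaign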

1.0

  • Initial release.