
@deskeen/web-builder-create-robots-txt

deskeen · MIT · 0.2.0

build, website, node.js, robots.txt

Web Builder Module - Create robots.txt file

This module allows the @deskeen/web-builder engine to create a robots.txt file.

Install

npm install @deskeen/web-builder
npm install @deskeen/web-builder-create-robots-txt

Usage

Require the @deskeen/web-builder engine and add this module to its list of modules:

const builder = require('@deskeen/web-builder')
await builder.build({
  source: [
    // List of files or directories
  ],
  modules: [
    [
      '@deskeen/web-builder-create-robots-txt',
      {
        path: '/deploy/path', // Deploy path
        sitemapUrl: 'https://example.com/sitemap.xml', // Sitemap URL
        disallowedCrawlers: [
          // List of crawlers to exclude
          // Example:
          // 'Facebot'
        ],
        disallowedUrls: [
          // List of URLs to exclude
          // Example:
          // '/login'
        ],
      }
    ]
  ]
})
User-agent names of well-known crawlers that can be added to disallowedCrawlers:

  • Apple: Applebot
  • Amazon: ia_archiver
  • Ask.com: Teoma
  • Bing: bingbot
  • DuckDuckGo: DuckDuckBot
  • Facebook: Facebot
  • Google: Googlebot
  • Google Ads: AdsBot-Google, Mediapartners-Google
  • Google Images: Googlebot-Image
  • IBM: ScoutJet
  • MSN: msnbot
  • Twitter: Twitterbot
  • Yahoo: Slurp
  • Chinese search engine: baiduspider
  • China’s largest eCommerce site: EtaoSpider
  • Russian search engine: Yandex
  • Czech Republic search engine: seznambot
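To illustrate what the options map to, here is a hypothetical sketch of how a robots.txt file could be assembled from them. The function name buildRobotsTxt and its structure are assumptions for illustration only; the option names mirror the module's config keys, but the module's actual output may differ.

```javascript
// Hypothetical sketch: assemble robots.txt text from the module's options.
// (Illustration only; not the module's actual implementation.)
function buildRobotsTxt({ sitemapUrl, disallowedCrawlers = [], disallowedUrls = [] }) {
  const lines = []

  // One User-agent block per excluded crawler, disallowing everything.
  for (const crawler of disallowedCrawlers) {
    lines.push(`User-agent: ${crawler}`, 'Disallow: /', '')
  }

  // A catch-all block listing the individual URLs to exclude.
  lines.push('User-agent: *')
  for (const url of disallowedUrls) {
    lines.push(`Disallow: ${url}`)
  }

  // Point crawlers at the sitemap, if one is configured.
  if (sitemapUrl != null) {
    lines.push('', `Sitemap: ${sitemapUrl}`)
  }

  return lines.join('\n') + '\n'
}

const txt = buildRobotsTxt({
  sitemapUrl: 'https://example.com/sitemap.xml',
  disallowedCrawlers: ['Facebot'],
  disallowedUrls: ['/login'],
})
console.log(txt)
```

With the example values from the Usage section, this yields a Facebot block that disallows everything, a `User-agent: *` block disallowing /login, and a trailing Sitemap line.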

Contact

You can reach me at {myfirstname}@{myname}.fr

Licence

MIT Licence - Copyright (c) Morgan Schmiedt

changelog

v0.2.0 - 2021-05-21

New:

  • Add disallowedUrls option.

Breaking change:

  • The disallowedBotNames option was renamed disallowedCrawlers.

v0.1.0 - 2020-09-02

  • Initial version