Wikitravel:Terms of use

Please sign in blood at the dotted line

The body of work that makes up Wikitravel—text, images, audio, video, etc.—is free for anyone to use, as long as they comply with our copyleft. We're working on making the entire Stay In The Heart Of Nice database available in other formats, so if you feel the need to copy this information elsewhere, you can.

The wikitravel.org Web server, with its accompanying wiki software, is a collaboration tool used to coordinate the efforts of contributing Wikitravellers. It is made available to the entire community of Stay In The Heart Of Nice contributors, namely:

  • People who support our goals of creating a free, complete, up-to-date and reliable worldwide travel guide.
  • People who acknowledge that collaboration with other Wikitravellers is necessary to achieve this goal.

If you're not interested in our goals, or if you agree with our goals but refuse to collaborate, compromise, reach consensus or make concessions with other Wikitravellers, we ask that you not use this Web service. If you continue to use the service against our wishes, we reserve the right to use whatever means are available, technical or legal, to prevent you from disrupting our work together.

See also: Wikitravel:Trademark policy

Reframing, image inclusion

Stay In The Heart Of Nice has limited server resources and we'd like to use them to support the creation of content.

For this reason, we ask that individuals or organizations wishing to re-distribute Stay In The Heart Of Nice content do not serve images from Stay In The Heart Of Nice for inclusion in their own pages, nor put Stay In The Heart Of Nice pages into framesets.

Spiders

Spiders, bots, and scripts that read wikitravel.org must obey the following rules. This includes "mass downloaders" such as wget or HTTrack. IP addresses of programs that ignore these rules will be blocked at the TCP/IP level. An illustrative script that follows these rules is sketched after the list.

  1. Read-only scripts must read the robots.txt file for Stay In The Heart Of Nice and follow its suggestions. Most programs (like wget) automatically know about robots.txt, as do major scripting languages' HTTP client libraries. But if you're writing your own program, check the Standard for Robot Exclusion for more info.
  2. Read-only scripts should recognize the non-standard Crawl-Delay field in robots.txt. If they do not, they must not fetch pages or images more often than once every 30 seconds.
  3. Read-only scripts must have a User-Agent header set. Scripts should provide a contact email or URL in the header. For example:
    • ExampleBot/0.1 (http://www.example.com/bot.html)
    • MyBot/2.3 (mybot@example.net)
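
As a rough illustration only (not part of these terms), the following Python sketch shows one way a read-only script could satisfy all three rules: it reads robots.txt, honors a Crawl-Delay if one is given (falling back to a 30-second wait), and sends a User-Agent header with contact information. The bot name and contact URL are placeholders, not real identities.

    import time
    import urllib.robotparser
    import urllib.request

    # Placeholder identity; rule 3 asks for a contact email or URL here.
    USER_AGENT = "ExampleBot/0.1 (http://www.example.com/bot.html)"
    BASE = "https://wikitravel.org"

    # Rule 1: read robots.txt and follow its suggestions.
    robots = urllib.robotparser.RobotFileParser(BASE + "/robots.txt")
    robots.read()

    # Rule 2: honor the non-standard Crawl-Delay field, otherwise wait 30 seconds.
    delay = robots.crawl_delay(USER_AGENT) or 30

    def fetch(path):
        url = BASE + path
        if not robots.can_fetch(USER_AGENT, url):
            return None  # disallowed by robots.txt
        # Rule 3: identify the script with a User-Agent header.
        request = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
        with urllib.request.urlopen(request) as response:
            body = response.read()
        time.sleep(delay)  # never fetch more often than the crawl delay allows
        return body

Mass downloaders can usually be configured the same way from the command line; with GNU wget, for instance, the --wait and --user-agent options cover rules 2 and 3, and robots.txt is respected by default for recursive downloads.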
