Commit
mention that real web crawlers should be a lot smarter (closes #411)
kraih committed Nov 8, 2012
1 parent d7c998f commit 8b30830
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions lib/Mojolicious/Guides/Cookbook.pod
@@ -971,8 +971,8 @@ can keep many parallel connections active at the same time.
   Mojo::IOLoop->start unless Mojo::IOLoop->is_running;
 
 You can take full control of the L<Mojo::IOLoop> event loop. Note that real
-web crawlers should respect C<robots.txt> files, and not overwhelm web servers
-with too frequent requests.
+web crawlers should be a lot smarter and respect C<robots.txt> files for
+example, so they don't overwhelm web servers with too frequent requests.
 
 =head2 Parallel blocking requests
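The paragraph changed by this commit advises that real crawlers respect C<robots.txt> and throttle their requests. A minimal sketch of one way to do that, assuming Mojo::UserAgent together with the WWW::RobotRules module from the libwww-perl distribution; the crawler name, URLs, and one-second delay are illustrative, not part of the commit:

```perl
use Mojo::Base -strict;    # strict, warnings, and say()
use Mojo::UserAgent;
use WWW::RobotRules;       # robots.txt parser from libwww-perl

my $ua    = Mojo::UserAgent->new;
my $rules = WWW::RobotRules->new('MyCrawler/1.0');

# Fetch and parse the site's robots.txt once up front
my $robots_url = 'http://example.com/robots.txt';
$rules->parse($robots_url, $ua->get($robots_url)->res->body);

my $url = 'http://example.com/some/page';
if ($rules->allowed($url)) {
  my $res = $ua->get($url)->res;
  say $res->dom->at('title')->text;
  sleep 1;    # throttle so the server is not overwhelmed
}
```

In a non-blocking crawl driven by Mojo::IOLoop, the same C<allowed> check would go before each request, with a timer instead of C<sleep> to space them out.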
