Because it takes work to obey the rules, and you get less data for it. A theoretical competitor could ignore them, collect more, and gain some vague advantage from it.
I’d not be surprised if the crawlers they used were bare-basic utilities set up to just grab everything without worrying about rules and the like.
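To be concrete about the "it takes work" part, here's a rough Python sketch (the URL and user agent are made up, and this has nothing to do with whatever crawler they actually run): the naive grab is essentially one call, while the polite version has to fetch and parse robots.txt, check permission, and honor any crawl-delay, and it still ends up with less data because disallowed pages simply get skipped.

```python
# Illustrative sketch only: a "bare-basic" fetch vs. a rule-following one.
import time
import urllib.parse
import urllib.request
import urllib.robotparser

URL = "https://example.com/some/page"   # placeholder target
UA = "example-bot"                      # hypothetical user agent

def naive_fetch(url):
    # "Bare-basic" approach: just grab the page, no rules consulted.
    return urllib.request.urlopen(url).read()

def polite_fetch(url, user_agent=UA):
    # Rule-following approach: parse robots.txt, check permission,
    # and honor any crawl-delay before fetching.
    parts = urllib.parse.urlsplit(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    if not rp.can_fetch(user_agent, url):
        return None  # less data: this page is simply skipped
    delay = rp.crawl_delay(user_agent)
    if delay:
        time.sleep(delay)  # slower: fewer pages per hour
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read()
```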