That article is overblown. People need to configure their websites to be more robust against traffic spikes, news at 11.
Disrespecting robots.txt is bad netiquette, but honestly this sort of gentleman’s agreement is always prone to cheating. At the end of the day, when you put something on the net for people to access, you have to assume anyone (or anything) can try to access it.
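For illustration of how voluntary the whole thing is: the check happens entirely on the crawler's side. A short Python sketch using the standard library's urllib.robotparser (example.org and the crawler name are just placeholders):

    from urllib import robotparser

    # Fetch and parse the site's robots.txt (placeholder domain).
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.org/robots.txt")
    rp.read()

    # A polite crawler asks before fetching; nothing forces it to.
    if rp.can_fetch("MyCrawler/1.0", "https://example.org/private/page.html"):
        print("allowed by robots.txt, fetching")
    else:
        print("disallowed by robots.txt, skipping (purely voluntary)")

A scraper that never runs this check loses nothing except its manners.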
You think Red Hat & friends are just all bad sysadmins? SourceHut, maybe…
I think there’s a bit of both: poorly optimized/antiquated sites and a gigantic spike in unexpected and persistent bot traffic. The typical mitigations do not work anymore.
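To make that concrete: the typical mitigation is per-IP rate limiting, roughly a token bucket like the sketch below (the numbers and structure are illustrative, not any particular site's setup). It falls over against these crawlers because the traffic is spread across huge, rotating IP pools, so no single address ever trips the limit.

    import time
    from collections import defaultdict

    RATE = 1.0    # allowed requests per second, per IP
    BURST = 10.0  # bucket capacity

    buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

    def allow(ip: str) -> bool:
        """Classic per-IP token bucket: refill over time, spend one token per request."""
        b = buckets[ip]
        now = time.monotonic()
        b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
        b["last"] = now
        if b["tokens"] >= 1.0:
            b["tokens"] -= 1.0
            return True
        return False

    # The problem: a crawl spread across tens of thousands of addresses never
    # trips this, even though the aggregate load on the site is crushing.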
Not every site is optimized for hundreds of thousands of requests a day or more, and not every site should have to be. Just because it can be doesn’t mean it’s worth the time, effort, or cost.