Thank you for posting that question; it is a very valid one.
The issue is mainly scraping bots / AI crawlers (primarily Google's and Meta's models). These models are expensive to train and run, so the companies behind them scrape as much data as possible in order to monetize it in any way they can.
There are generally a few main ways a bot can slow down a website, as listed below.
- High Request Volume: the server does not have enough bandwidth or processing power to handle all of the artificial visits on top of real traffic.
- Insufficient Server Resources: for example, relying on a single RAID array of hard drives, which is far slower than the request rate an AI / bot can generate, so read/write performance is easily maxed out.
- Lack Of Caching: without caching, the server has to regenerate every page from scratch each time a bot requests it (see the sketch after this list).
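
To make the caching point concrete, here is a minimal sketch of an in-memory page cache with a time-to-live. It is not from any particular framework; the names (`render_page`, `get_page`, `CACHE_TTL_SECONDS`) are made up for illustration, and a real site would more likely use a reverse proxy or a cache like Redis / Varnish in front of the application.

```python
import time

# Hypothetical page cache: maps a URL path to (rendered_html, timestamp).
# render_page() stands in for whatever expensive work (database queries,
# templating) the site does to build a page.
_cache: dict[str, tuple[str, float]] = {}
CACHE_TTL_SECONDS = 300  # serve a cached copy for up to 5 minutes


def render_page(path: str) -> str:
    """Placeholder for the expensive page-generation step."""
    time.sleep(0.5)  # simulate database queries / template rendering
    return f"<html><body>Content for {path}</body></html>"


def get_page(path: str) -> str:
    """Return a fresh cached copy if available, otherwise regenerate and cache it."""
    now = time.time()
    cached = _cache.get(path)
    if cached is not None and now - cached[1] < CACHE_TTL_SECONDS:
        return cached[0]  # cheap: no regeneration needed
    html = render_page(path)  # expensive: this is what bots trigger on every hit
    _cache[path] = (html, now)
    return html


if __name__ == "__main__":
    start = time.time()
    get_page("/articles/1")  # first request: slow (page is regenerated)
    get_page("/articles/1")  # repeat request: fast (served from cache)
    print(f"Two requests took {time.time() - start:.2f}s")
```

With a cache in place, repeated bot hits on the same page cost almost nothing; without one, every single hit pays the full regeneration cost, which is how a crawler can bring an otherwise healthy site to a crawl.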
I hope this helps answer your question!
Kind regards, Damian