How to: Development: Crawl and Work with API

If you want to use crawlers or bots, keep the following rules in mind:

  1. Respect the robots.txt file.
  2. Always identify your bot via the User-Agent header [1], and name it correctly [2].
  3. When working with an API, add timeouts between requests. Don't DoS [3] your own server :)
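The rules above can be sketched with Python's standard library. The bot name, the sample robots.txt rules, and the one-second delay below are illustrative assumptions, not values required by this hosting platform:

```python
import time
import urllib.robotparser

# Hypothetical bot name for illustration -- use your own, with a contact URL (rule 2).
USER_AGENT = "ExampleBot/1.0 (+https://example.com/bot-info)"


def allowed_by_robots(robots_lines, user_agent, path):
    """Check a path against robots.txt rules (rule 1)."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch(user_agent, path)


# Sample robots.txt content; in practice, fetch it from the target site.
robots_txt = [
    "User-agent: *",
    "Disallow: /admin/",
]

for path in ["/products/", "/admin/"]:
    if allowed_by_robots(robots_txt, USER_AGENT, path):
        # Rule 2: send the descriptive User-Agent header with every request,
        # e.g. requests.get(url, headers={"User-Agent": USER_AGENT}).
        print(f"{path}: fetch allowed")
        # Rule 3: pause between requests so you don't flood the server.
        time.sleep(1)
    else:
        print(f"{path}: skipped (disallowed by robots.txt)")
```

A real crawler would download robots.txt with `RobotFileParser.set_url()` and `read()`, and tune the delay to the target server's capacity.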

We run an integrated security monitoring system that automatically blocks malicious and abnormal traffic that could destabilize a website. We do not provide whitelists, because even "legitimate" traffic from whitelisted IPs can cause website problems, and this would affect your customers.


[1] Definition of User-Agent
[2] Naming rules for your User-Agent: Syntax, RFC 7231, section 5.5.3: User-Agent
[3] Denial-of-service attack

---

Hint

If you have a problem, need assistance with tweaks, or want a free consultation; if you just want to discuss your project with experts and estimate the outcome; or if you're looking for a solution that reinforces your online business, we will help. Let us know through MyCloud or email.