DDoS chow on Packet Pushers
Yesterday I finally found time to listen to the DDoS-chewing podcast on Packet Pushers. While I know quite a bit about the technical solutions, their focus on the big picture and Service Provider offerings was truly refreshing (after all, if you’re under attack, it’s best if your upstream SP filters the junk).
They also mentioned a few interesting application-related ideas that will definitely help me streamline my web sites (for example: once the load rises above a certain threshold, start serving cached data instead of retrieving it live from the database). They also discussed an interesting case study in which a networking engineer (Greg, if I’m not mistaken) managed to persuade the programmers to optimize the application, saving the company a lot of money in the long run.
Even if DDoS protection isn’t relevant to your current job, and although a lot of the discussion revolved around SP offerings and application-level solutions, I would strongly recommend that you listen to the podcast. After all, it never hurts to glance beyond your sandbox and consider other perspectives (and I definitely enjoyed the view).
http://www.informit.com/articles/article.aspx?p=663080
http://www.informit.com/articles/article.aspx?p=665932
http://www.informit.com/articles/article.aspx?p=669598
The idea mentioned in the podcast goes beyond web caching. Imagine you build the page content dynamically: it might be retrieved from a database, or you have to combine static descriptions with comments, reviews and the like. Sometimes it takes a while to retrieve the data and build the HTML (or, in my case, XML), so if your site is overloaded or under attack, it's better to serve a cached snapshot to NEW (not RETURNING) visitors than to spend time building the page from scratch.
BTW, MediaWiki (the Wikipedia engine) has implemented something very similar for high-performance retrievals.