So I’m new here on the product team at Keynote, and one of the things I’ve been asked to think about is Cloud Computing. My background is originally in telecommunications, especially in secure communications over public infrastructure. As I’ve been learning about Cloud Computing, one of the first things that struck me is that the security approach in this business seems to be built on a faulty model: security through obscurity. Here is a great, but long, article on this topic by a favorite expert of mine, Bruce Schneier. The Cliffs Notes version: the more secrets a system depends on, the weaker it is. Invariably someone determined, lucky, or smart will find something you missed or don’t want them to know, and then problems ensue. That’s why encryption and authentication techniques are based on open algorithms, and a rite of passage for a new system is to survive a challenge to the world: break me if you can.
You are probably wondering by now what this has to do with performance. Well, I’ve been thinking that this philosophy could apply to performance issues as well. To date the cloud message seems to be that customers should simply trust and assume the cloud will perform. Most of the standard SLAs from cloud providers today cover only availability. And while I’ve heard that large customers are negotiating performance-related SLAs, there can still be problems when determining what went wrong and who is responsible. Further complicating this is the reality that cloud vendors themselves are dependent on third parties like their ISPs.
Of course, realistically, the degree to which a cloud system is transparent or obscure probably doesn’t change what companies must do: monitor the performance of critical applications. Still, I think improvement in the transparency of a cloud computing platform is a good signal of how mature the solution is and how ready it is for adoption.
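To make the monitoring point concrete, here is a minimal sketch of what "don’t just trust, measure" looks like in practice: time a request against your own application and check it against a response-time objective. Everything here is illustrative, not a description of Keynote’s product; the local test server, the 2-second threshold, and the names are all assumptions for the sake of a self-contained example.

```python
# Minimal active-monitoring sketch: probe an endpoint, time the response,
# and flag it against a hypothetical response-time objective.
import http.server
import threading
import time
import urllib.request

# Stand-in "application" served locally so this example is self-contained;
# in practice you would probe your real cloud-hosted endpoint.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

SLA_THRESHOLD_S = 2.0  # hypothetical performance objective, not a real SLA

start = time.perf_counter()
with urllib.request.urlopen(url, timeout=5) as resp:
    status = resp.status
elapsed = time.perf_counter() - start

print(f"status={status} elapsed={elapsed:.3f}s "
      f"within_sla={elapsed < SLA_THRESHOLD_S}")
server.shutdown()
```

A real monitoring setup would run probes like this continuously, from multiple geographic locations, and keep the history so that when something goes wrong you have your own data rather than the vendor’s word.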