I just received an alert that I had exceeded 80% of my data usage. I took a look and was surprised at what I saw…


The closer I looked at the usage report, the more confused I became. I almost picked up the phone, but instead decided to check my iPhone to see what it was telling me. Then things got really interesting. What’s 80%? What’s my usage? How much data do I really have?


Taking a step back – there’s a bigger picture.

How do you guarantee ‘user experience’? How do you keep your channels in sync? How do you know the data you are presenting from your API is correct? Today is the 14th of the month, so the reporting is early. These features are not new; I’ve been checking this information for some time, so how could it change?

The quality controls that could be applied across the implementation here are wide and deep. What is adequate? Are all these issues equal? Which test techniques do you use? Two obvious ones are regression testing your API and regular-expression assertions on your presentation layer, but whose responsibility is it to check these things? Development? Operations? Test?
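As a minimal sketch of those two techniques in Python, consider the pair of checks below. The payload fields, quota numbers, and label wording are invented for illustration; a real test would call the actual usage endpoint and assert against the agreed contract and display format.

```python
import re

# Hypothetical API payload -- illustrative values only, not any real carrier's API.
def fetch_usage():
    """Stand-in for a GET to a usage endpoint; returns parsed JSON."""
    return {"used_mb": 4096, "quota_mb": 5120}  # 80% used

def render_usage_label(usage):
    """Stand-in for the presentation layer's formatting logic."""
    pct = round(100 * usage["used_mb"] / usage["quota_mb"])
    return f"You have used {pct}% of your {usage['quota_mb'] // 1024} GB plan"

def test_api_contract():
    """API regression check: the fields the UI depends on are still present and sane."""
    usage = fetch_usage()
    assert {"used_mb", "quota_mb"} <= usage.keys()
    assert 0 <= usage["used_mb"] <= usage["quota_mb"]

def test_presentation_format():
    """Regex assertion on the presentation layer: the rendered label matches the
    agreed pattern, and the percentage it shows agrees with the API numbers."""
    usage = fetch_usage()
    label = render_usage_label(usage)
    match = re.fullmatch(r"You have used (\d{1,3})% of your \d+ GB plan", label)
    assert match, f"unexpected label format: {label!r}"
    assert int(match.group(1)) == round(100 * usage["used_mb"] / usage["quota_mb"])

test_api_contract()
test_presentation_format()
```

The point of pairing the two is exactly the channel-sync problem above: the first test catches the API drifting from its contract, while the second catches the presentation layer showing a number that no longer matches what the API actually returned.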

If you need a good reason to find the time to work on these things, maybe this is a good example for you. It’s in production and users are experiencing it. In today’s environment, where speed to market is key, the cost and time required to ensure quality through traditional practices are making those practices obsolete; stepping back and looking at how more effective quality controls can be applied across the implementation life-cycle is paramount.