The recent publicity and ranting about Twitter’s onMouseOver flaw* got me thinking about our perception of software quality and our expectations of risk. Why is there no room for error when Twitter makes a mistake, yet we put up with so many bigger – and more personal – issues in our everyday lives?

Imagine if every quality issue we experience in our daily lives set off the same alarm as the Twitter discovery. So many bigger things affect us, yet we just seem to roll over and not question them. Maybe we should hold our audacious and careless politicians and government agencies to such standards of perfection? Perhaps we need to make certain automobile manufacturers more accountable for their low-quality vehicles. Or maybe there’s some way to hold retailers responsible for the junk products they build and sell. These kinds of issues affect the livelihoods of each and every one of us in some capacity… yet we go on without any sudden – and massive – backlash.

Ironically, I think the very purpose of Twitter – the forum for quick thoughts, breaking news, and the other immediate-gratification bits we crave – actually exacerbated the problem. The law of unintended consequences at its finest.

Don’t get me wrong: I do expect companies such as Twitter that have global visibility to hold themselves to the highest standards of quality and security. Secure development processes, proactive security testing, and smart risk analysis are a must. But one error or oversight on their part and suddenly the media and “experts” are lambasting them to no end? Give me a break.
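For a bit of context on what a flaw like this looks like under the hood, here’s a minimal sketch in Python – purely illustrative, not Twitter’s actual code or fix – of the general class of bug: user-supplied text dropped into HTML without output encoding, which lets an attacker break out of an attribute and attach an onmouseover handler. The tweet text and function names here are hypothetical.

    # A minimal, hypothetical sketch of the class of bug behind the onMouseOver
    # incident (not Twitter's actual code): user-supplied text placed into HTML
    # without output encoding, letting an injected onmouseover handler take effect.
    from html import escape

    # Hypothetical tweet text carrying an attribute-breakout payload.
    tweet = 'http://example.com/x" onmouseover="alert(document.cookie)'

    def render_unsafe(text: str) -> str:
        # Vulnerable: raw text is interpolated into an href attribute, so the stray
        # quote in the payload closes it and adds a new onmouseover attribute.
        return f'<a href="{text}">link</a>'

    def render_safe(text: str) -> str:
        # Safer: html.escape() encodes quotes and angle brackets, so the payload
        # stays inert text instead of becoming markup.
        return f'<a href="{escape(text, quote=True)}">link</a>'

    print("unsafe:", render_unsafe(tweet))
    print("safe:  ", render_safe(tweet))

The point isn’t this particular snippet; it’s that a single missed encoding step in an otherwise solid process is all it takes, which is exactly why one oversight shouldn’t be treated as a scandal.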

Sometimes we forget that people make mistakes. Intentional or not, we always have and we always will. Even with the most stringent processes and controls in place, things like the Twitter flaw are going to get through. The solution is simple: find the problems, fix them, and do your best not to let them happen again. Just know that new things are going to crop up.

Most importantly, I believe the hoopla over the Twitter flaw underscores our tendency to get caught up in short-term, often self-serving and inconsequential issues rather than seeing the big picture. I recently experienced this with a family member whose oncologist and neurosurgeon were so focused on one illness in one part of the body that they ended up missing a much bigger problem that was screaming to be heard. Be it application security, our personal lives, or anything else, the only way we’re going to get ahead is to think long-term and see the big picture: don’t get caught up in the hype, make reasonable efforts to keep things in check, learn from our mistakes, and adjust our focus where appropriate.

* http://blog.twitter.com/2010/09/all-about-onmouseover-incident.html

THE AUTHOR
Kevin Beaver

Kevin Beaver, CISSP, is an independent information security consultant, writer, and professional speaker with Atlanta, GA-based Principle Logic, LLC. With over 32 years in IT and 26 years in security, Kevin specializes in vulnerability and penetration testing, security program reviews, and virtual CISO consulting work to help businesses uncheck the boxes that keep creating a false sense of security.
