Does Older Mean More Secure?

It's an interesting thought. Most operational security plans emphasise the need for operating system and application software to be running the most recent stable release. Patching and roll-out platforms are big business and take up a significant portion of a system administrator's time. Keeping mobiles flashed, operating systems patched, router firmware updated and the IPS/NGFW/AV/blacklist (delete as applicable) at its most recent signature release is a constant cycle of automation and checks. But is it worthwhile?
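To make that cycle concrete, here is a minimal sketch of one small check in such an automation loop. It assumes a Debian-based host with the apt CLI available; a real roll-out pipeline would wrap something like this in scheduling, reporting and deployment tooling.

```python
# A minimal sketch of one check in the patching cycle: ask a Debian-based
# host which packages have upgrades pending. Assumes the `apt` CLI exists;
# note apt warns that its CLI output is not a stable interface, so treat
# this as illustrative rather than production parsing.
import subprocess

def pending_updates():
    """Return the names of packages with an upgrade available via apt."""
    out = subprocess.run(
        ["apt", "list", "--upgradable"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Lines look roughly like: "name/suite version arch [upgradable from: ...]"
    return [line.split("/")[0] for line in out.splitlines() if "upgradable" in line]

if __name__ == "__main__":
    updates = pending_updates()
    print(f"{len(updates)} package(s) awaiting patching: {updates}")
```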

A new release of any piece of software should have gone through rigorous QA and UAT before being made available for roll-out. The maker of said software will nearly always recommend that the customer deploy the most recent release, as it makes their support processes easier to manage; getting everyone onto the same (or a similar) version as soon as possible helps with bug management and security issues. This is a fair position. Unless you're an application vendor delivering a service via an online site, where everyone uses the same version regardless, managing multiple versions of client software installed and configured on different platforms can be a complex process.

A newly released version or patch fixes known issues, bugs and security loopholes on one hand, but is also exposed to newer attacks which have yet to be identified and patched by the vendor: the dreaded zero-day scenario. The other issue with newly released software is that it will generally have fewer users, meaning that bugs and stability issues may still exist. So whilst the chances of encountering a bug, security vulnerability or stability issue may increase in the short term, the speed and quality of the vendor's response will likely improve too.

But what if we look at this process from a few steps back? What are the most attacked platforms and applications today? Android? The Chrome browser? Flash- or JavaScript-based vulnerabilities? Windows 7 bugs and issues? All are relatively recent developments, but are they attacked due to inherent flaws, or is it more likely because they're popular and well used? An exploited vulnerability has a bigger impact when the target user base is large. Should we therefore all use unpopular applications, browsers and operating systems?

The strong argument for Linux-based operating systems is that they are inherently 'more secure'. Yes, the number of viruses is lower than for, say, Windows, and the basic approach to things like non-root running is a strong concept, but are they simply attacked less because they're used less? How many COBOL and Fortran vulnerabilities are being identified and attacked? Perhaps that is an unfair comparison, as vulnerabilities for older platforms do exist and, in the appropriate environment, are identified, patched and make the headlines.
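As an aside, the non-root idea is easy to illustrate: a service that starts privileged can shed those rights early, so a later compromise yields only limited access. A minimal sketch (Unix-only; the "nobody" account here is an assumption, and a dedicated service user would be better in practice):

```python
# A minimal sketch of non-root running: a process started as root drops
# to an unprivileged account before doing real work. The "nobody" user
# is an assumption; substitute a dedicated service account.
import os
import pwd

def drop_privileges(username="nobody"):
    """Switch from root to an unprivileged user (group first, then user)."""
    if os.getuid() != 0:
        return  # already unprivileged, nothing to drop
    entry = pwd.getpwnam(username)
    os.setgroups([])          # clear supplementary groups inherited from root
    os.setgid(entry.pw_gid)   # drop the group before the user, or setgid fails
    os.setuid(entry.pw_uid)   # point of no return: root is gone

if __name__ == "__main__":
    drop_privileges()
    print(f"now running as uid={os.getuid()}, gid={os.getgid()}")
```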

Sometimes security isn't about patching and controls, though; it can be about doing things differently and counteracting potential attacks in a subtle and diversified manner.

(Simon Moffatt)