SCADA Security and the Broken Business Model for Software Testing

Recently Rob Hulsebos wrote an article for this blog in which he raised the perennial problem of programming errors contributing to security vulnerabilities. I have a newsflash for you: this isn’t new. It may be a new concept to some in the world of Industrial Control Systems, but it has been a problem for software engineers since about five seconds after the first-ever program successfully compiled.

Too Much Code, Too Little Testing

I studied Systems Architecture and Software Engineering in the late ’80s and early ’90s. Even then it was known that a program only 100 lines long could have so many execution pathways that it was impossible to test them all. The IEEE now reports that the average mobile phone has over 2 million lines of code, and that smartphones have at least five times that. That’s a lot of code to test!
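To make that path explosion concrete, here is a minimal, hypothetical Java sketch (my illustration, not from the original article): every independent branch in a routine doubles the number of distinct execution paths, so even a tiny function quickly exceeds what exhaustive testing can cover.

    public class PathExplosion {
        // A hypothetical validation routine with n independent checks:
        // each check can go two ways, so n checks give 2^n distinct paths.
        static int validate(boolean[] checks) {
            int score = 0;
            for (boolean c : checks) {
                if (c) {
                    score += 1; // branch taken
                } else {
                    score -= 1; // branch not taken
                }
            }
            return score;
        }

        public static void main(String[] args) {
            // 10 independent branches -> 1,024 paths; 20 -> over a million.
            // A 100-line program with loops and nested conditions is far worse.
            System.out.println("Paths for 10 branches: " + (1L << 10));
            System.out.println("Paths for 20 branches: " + (1L << 20));
        }
    }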

And it’s not just the size of the code that is a concern, but also the coding standards behind it. I can remember having to read and inwardly digest standards for writing software: coding structures, calls, and buffer/array usage.

In part this was to make code easy to maintain, in part to ensure compatibility between sections written by different people, and by no means least, to make the code secure and reliable. The standards should include what Hulsebos described previously – ‘Programming mistakes that thou shall not make’.

Good Coding Standards Have Not Been the Priority

Unfortunately, a lot of people didn’t have such an education, were self-taught, or worked for companies who didn’t care about anything more than how many lines of compiled, usable code their staff wrote each week. The worst offenders were obvious to anyone who followed BugTraq, as I have for many years.

Look how long it took Microsoft to ‘get’ the issue of security, start writing code that was more secure, and then adopt a pro-active patch policy. Once they got their act together, the number of bugs in their code fell by a factor of ten in many products.

One of the design objectives of Java was to overcome the buffer overflow problem. The language makes it almost impossible to introduce a vulnerability of this sort, and its ‘sandbox’ capability can restrict untried and untrusted code from making many of the calls that can cause damage.
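By way of illustration, here is a minimal Java sketch (my example, not the author’s) of the kind of out-of-bounds write that silently corrupts memory in C but is stopped by the runtime’s bounds check in Java:

    public class BoundsCheckDemo {
        public static void main(String[] args) {
            byte[] buffer = new byte[8];
            try {
                // In C, writing one element past the end of an 8-byte buffer
                // can silently overwrite adjacent memory (the classic buffer
                // overflow); the JVM checks every index and stops the bad
                // write before it happens.
                for (int i = 0; i <= buffer.length; i++) {
                    buffer[i] = 0x41;
                }
            } catch (ArrayIndexOutOfBoundsException e) {
                System.out.println("Out-of-bounds write blocked: " + e.getMessage());
            }
        }
    }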

Unfortunately, that isn’t true for most programming languages. They still have these problems and rely on the programmer and the testing process to overcome them. Anyone who has studied quality knows that to be cost-effective you have to design it in, not test it in afterwards. Go look up ‘Six Sigma’ and ‘Kaizen’.

The Broken Business Model of Software Testing

What concerns me most is that the ‘business model’ of software testing and analysis is often more attractive for the hackers than it is for the software vendors. There’s a potential return on investment for the hackers in terms of denial of service, possible extraction of data and reputational damage. They can also sell the vulnerabilities on the open market. There are websites hosted in some countries where these things, among others, are openly traded for money. You can even buy a ready-made exploit package or hire someone to write it for you.

In comparison, the software vendors have to take time away from developing the new, saleable features that give them a competitive edge over their rivals, and instead re-educate their software teams. Plus they have to go back, review and fix existing code, and then issue free patches. I can’t see a lot of managers doing cartwheels down the hall at that prospect, especially with the pressures on revenue and profit margins in the current economic climate.

Vendors Only React to Bad PR

Blogs like the one written by Hulsebos are praiseworthy for raising awareness and generating demand among the end-user community for vendors to fix the problems, or at least adopt technology that can manage the risk in other ways. History shows us that without academic papers describing just how bad the products are and ‘naming and shaming’ the manufacturers, nothing will change.

As a long-term follower of BugTraq I can remember how bad it was before the ‘You’ve got 30 days before we go public with this vulnerability’ policy was adopted. Ask any old-timer about those days...and have a chair and a fresh cup of tea in hand when you do, because they’ll be talking for some time.

ICS Vendors Need to Prioritize Software Security

Far too many organisations have put profits before customer service, and only the threat of adverse publicity has made them take remedial action. Some software vendors have even used legal action to try to stop disclosures. Yes, they would rather spend money on lawyers than on training software engineers and fixing the problem. Guess who I won’t buy from?

End users out there, please join me in demanding higher security from DCS and ICS vendors, and in fixing the business model for software testing so it benefits customers, not hackers.

This article is a special guest contribution by:

David Alexander
Head of Vulnerability Research, Regency IT Consulting
david.alexander@regencyitc.co.uk     

Practical SCADA Security thanks David for this article.

Related Content to Download


Article - "Revealing network threats, fears"

This article by Eric Byres explains how the ANSI/ISA-99 security standards provide a framework for dealing with network security threats.

While not related to the problem of software testing, it is a good article about the fundamentals of Industrial Control System security.

Related Links

Information about Secure Development Lifecycle (SDL) Processes and Certification

Other Practical SCADA Security Articles on Security Vulnerabilities in SCADA and ICS
