Coding standards needed to counter software defects at heart of failed IT programmes
Mark James, vice president, CAST UK & Ireland asks if standards adoption is the answer to IT project performance in the UK public sector
The last five years have seen a substantial realignment of IT sourcing in the UK. The change has affected all sectors, but nowhere has it been more prevalent than in government. Right across government, organisations are rethinking how they outsource technology development and whether it’s more beneficial to invest in in-house talent.
There’s been a significant shift from long-duration, monolithic outsourcing contracts to shorter multi-sourced agreements and the adoption of commercial cloud services for infrastructure and office productivity applications. But perhaps the most striking change of all has been the reversal of the decades-old trend for outsourcing, with the re-emergence of in-sourced application development in major departments such as Defence, the Home Office, DWP, HMRC and Transport.
There is no doubt that the former approach, as shown in numerous NAO reports, led to a lack of control, limited visibility, inflexibility, failed programmes and seemingly routine cost over-runs. But is the new approach delivering better results?
According to the latest research from CAST Research Labs, the UK puts out the poorest quality software globally, and reports from the Public Accounts Committee suggest this transition to in-house talent is still “a work in progress” in the public sector.
The challenge of IT Complexity
Many have highlighted the need to re-skill internal teams so that they have the knowledge and experience to deliver in-sourced programmes and hold suppliers accountable for their delivery.
Beyond the skills gap, developers are also under pressure to deliver code quickly to meet business demands. This might be, for example, turning on a new citizen-facing website that allows citizens to submit tax forms, or a back-office app that processes citizen claim forms. They are often faced with a trade-off between productivity and taking more time to ensure the new systems are risk-free.
As stated by CAST Research Labs co-author, Bill Curtis, “the killer defects for large, critical business systems, especially those written in multiple languages, are at the system, architectural level. They must be eliminated early [to avoid disruption to the business].”
This kind of complexity leads to defects, which can lead to outages – the ones that get you all the wrong kind of press coverage. Complex IT systems take longer to fix, require more effort and are more expensive to modify as your business changes. They are also the ones that invariably cause all the problems when you move suppliers or in-source their management. To cap it all, defects are the things that cyber criminals seek out when they want to steal from you, shut down your operation or publicly embarrass you.
Re-skilling IT staff is undoubtedly an important step toward fending off this complexity, although the problem reaches beyond human capacity.
Fighting IT Complexity with standards
The NAO recently reviewed the government’s progress with digital transformation. Making a judgement on the impact on citizen services is one thing, but in this world of complexity how do you review the effectiveness or efficiency of the IT that has been delivered? Charles Symons was on to something when he suggested in one of his blogs that government needs the equivalent of the National Institute for Health and Care Excellence to measure the performance of government IT. But what he misses is that you still need an objective way to measure the complexity. It’s reassuring to see that last week’s NAO report on digital transformation recommended that the Government Digital Service improve the relevance and consistency of the technical standards it uses, and find better ways to ensure that there is robust assurance against technical performance measures.
Thankfully, groups like the Consortium for IT Software Quality (CISQ), an IT standards organisation, have come on the scene to help tackle the IT complexity problem. CISQ has published a set of engineering best practices, along with data that correlates poor software structure with major security, stability and performance-related glitches that cause business disruptions – in our case, disruption to citizen services.
In addition to helping government departments measure the overall “health” and complexity of their software, organisations like CISQ are empowering the public sector to objectively and automatically assess the quality of software applications, whether they are in-sourced or outsourced. This is a particularly valuable measure for developers that are being re-skilled to meet modern delivery needs.
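To make the idea of automated assessment concrete, here is a minimal, hypothetical sketch of the kind of structural check such tools run at scale. It is not CISQ’s actual measure – real analysers evaluate hundreds of rules across languages – but it illustrates one well-known proxy for complexity, cyclomatic complexity, by counting branch points per function in a piece of Python source. The function names and threshold are illustrative assumptions.

```python
import ast

# Illustrative only: real CISQ-conformant analysers apply many rules across
# whole multi-language systems. This sketch approximates a single measure,
# cyclomatic complexity, by counting decision points in each function.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.Try,
                  ast.BoolOp, ast.ExceptHandler)

def complexity_report(source: str, threshold: int = 10) -> dict:
    """Return {function_name: score} for functions exceeding the threshold."""
    tree = ast.parse(source)
    report = {}
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Base complexity of 1, plus one per decision point found.
            score = 1 + sum(isinstance(n, DECISION_NODES)
                            for n in ast.walk(node))
            if score > threshold:
                report[node.name] = score
    return report

# Hypothetical back-office routine used as input for the check.
sample = """
def claims_handler(claim):
    if claim.valid:
        for item in claim.items:
            if item.amount > 0 and item.approved:
                process(item)
"""
print(complexity_report(sample, threshold=2))  # → {'claims_handler': 5}
```

A quality gate built on checks like this can run on every build, flagging risky code objectively whether the team writing it is in-house or a supplier – which is the visibility the article argues the public sector currently lacks.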
It also gives Whitehall a way to shine a light into the workings of legacy applications, many of which are using technologies over 30 years old where the real experts have long since retired. This buys time to understand how these applications work and whether they should be modified or retired.
To succeed in the future, UK government departments would do well to adopt CISQ standards in their organisations, helping developers improve not only the quality but the speed of their output. Objective measures also provide visibility into the quality of outsourced applications. Or to put it another way, if we don’t identify and eliminate poor software quality, we will continue to see failed IT programmes, cost over-runs, unexpected outages and unwanted cyber incidents from the public sector.
Mark James is Vice President of CAST UK and Ireland, a software analysis and measurement company. Mark has two decades of experience working within government as a board member of a Defence agency, as well as on the supply side in technology, service and consulting companies.