Business Reporter

Secure by design: voluntary doesn’t cut it

John Smith of Veracode argues that secure by design must become a mandatory standard in the UK


As cyber-crime increases, the UK’s need to strengthen digital resilience is becoming critical. Yet the government’s Software Security Code of Practice remains voluntary, leaving a crucial gap at the heart of our national cyber-security posture. Optional guidelines, however well-intentioned, aren’t enough to match the urgency or scale of today’s cyber-security threats.

 

We’ve reached a pivotal moment. Software now powers everything from NHS booking systems to local council services and national infrastructure. Yet too often, security is still an afterthought — bolted on late, inconsistently, or not at all.

 

The results speak for themselves: a steady drumbeat of breaches, escalating supply chain attacks, and spiralling remediation costs. According to our latest State of Software Security report, the average time to fix security flaws has increased to 252 days – up 47% over the past five years and 327% over the past 15. Half of organisations now carry critical security debt, with 70% of that debt stemming from third-party code and supply chain components.  

 

To break this cycle, we need to move from ‘best-effort’ guidance to enforceable standards. Secure by design should not be optional. It should be mandated. It is the only approach that can scale resilience across both the private and public sectors and shift the burden of software security away from overstretched customers and onto the organisations building the software in the first place. 

 

 

Voluntary codes can’t match today’s risks 

The Code of Practice rightly encourages vendors to consider security from the start, but experience tells us that market incentives alone won’t drive consistent action. When security is voluntary, it’s the first thing to slip when delivery deadlines loom. 

 

Despite multiple pledges and industry frameworks, too many software developers have yet to meaningfully embed secure development principles into their processes. Our research shows that even among organisations with some level of secure development practice, flaws are often left open for extended periods. In fact, in lagging organisations (the bottom 25%), more than 67% of applications carry unresolved security debt; in leading organisations (the top 25%), that figure drops below 17%. This gap underscores the stark difference between organisations that treat security as a foundational principle and those that view it as a checkbox exercise.

 

And while large enterprises may have the resources to compensate for insecure software with layered defences and bespoke testing, organisations with limited budgets, such as schools, SMEs and NHS trusts, don’t. Without secure software by default, they remain exposed. That’s why a mandatory, standardised approach to software security is not just a regulatory issue, but one of digital equality.

 

 

Secure by design: from principle to practice 

Secure by design means that security is built in from the start and across every stage of the software development lifecycle. But it’s not just a development methodology — it’s a mindset shift, and it starts at the top. 

 

Embedding secure by design begins with leadership accountability. Organisations must treat software security as a core product requirement, not a nice-to-have. That means setting secure coding standards and integrating security into the ‘definition of done’ in delivery processes. 

 

It also means facing up to security debt, those unresolved vulnerabilities that quietly accumulate over time. Our data shows that leading organisations resolve over 10% of flaws monthly, while lagging organisations fix less than 1%. Prioritisation is key: rather than boiling the ocean, security teams must focus on vulnerabilities that are exploitable and reachable, using tools that contextualise risk and drive smarter, faster remediation.   
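The prioritisation idea above can be illustrated with a short sketch. This is not Veracode’s actual scoring model; the flaw records and weights are hypothetical, chosen only to show how exploitability and reachability can outrank raw severity when deciding what to fix first.

```python
# Minimal sketch of risk-based prioritisation: rank flaws so that
# exploitable, reachable issues are fixed first. The flaw records and
# scoring weights are hypothetical, not any vendor's real model.
from dataclasses import dataclass


@dataclass
class Flaw:
    id: str
    severity: int        # 1 (low) .. 5 (critical)
    exploitable: bool    # a public exploit or known attack path exists
    reachable: bool      # the vulnerable code is actually invoked at runtime


def priority(f: Flaw) -> int:
    # Exploitable and reachable flaws dominate; severity breaks ties.
    return (2 if f.exploitable else 0) + (2 if f.reachable else 0) + f.severity


def triage(flaws: list[Flaw]) -> list[Flaw]:
    return sorted(flaws, key=priority, reverse=True)


flaws = [
    Flaw("FLAW-A", severity=5, exploitable=False, reachable=False),
    Flaw("FLAW-B", severity=3, exploitable=True, reachable=True),
    Flaw("FLAW-C", severity=4, exploitable=True, reachable=False),
]
for f in triage(flaws):
    print(f.id, priority(f))
```

Note that the medium-severity but exploitable, reachable flaw ranks above the critical-severity one that nothing can reach: exactly the “smarter, faster remediation” the text describes.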

 

Then there’s the growing risk in third-party components. Today’s applications are increasingly assembled, not written — often comprising hundreds of open-source libraries, many with their own hidden dependencies. Alarmingly, our research found that 70% of critical security debt now originates from third-party code. Without full visibility into these components and the ability to track and update them, organisations leave themselves exposed to supply chain attacks that can cascade across industries and borders. Maintaining a Software Bill of Materials (SBOM) and applying continuous software composition analysis must become non-negotiable practices. 
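To make the SBOM point concrete, here is a minimal sketch that cross-references a CycloneDX-style SBOM against an advisory list. The component names, versions and advisories are invented for illustration; real software composition analysis tools query live vulnerability databases rather than a hard-coded dictionary.

```python
# Minimal sketch: check a CycloneDX-style SBOM against a (hypothetical)
# advisory list to flag third-party components that need updating.
import json

sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "left-pad", "version": "1.3.0"},
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "requests", "version": "2.31.0"}
  ]
}
"""

# Hypothetical advisories: component name -> set of vulnerable versions.
advisories = {"log4j-core": {"2.14.1", "2.15.0"}}


def flag_vulnerable(sbom: dict) -> list[str]:
    """Return 'name@version' for each listed component with a known advisory."""
    return [
        f"{c['name']}@{c['version']}"
        for c in sbom.get("components", [])
        if c["version"] in advisories.get(c["name"], set())
    ]


sbom = json.loads(sbom_json)
print(flag_vulnerable(sbom))  # each match is a component due for an upgrade
```

The value of the SBOM is precisely that this check is mechanical: once components and versions are recorded in a standard format, continuous scanning against fresh advisories becomes routine rather than heroic.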

 

 

AI is not a silver bullet — but it can help 

There’s no question that generative AI tools are changing the pace and shape of software development. But productivity gains come with new risks. AI-generated code can contain just as many vulnerabilities as human-written code and at far greater scale. 

 

Unless we radically improve how we detect and fix flaws, AI will only widen the gap between development speed and security assurance. But if we train AI models on secure patterns, and use them to accelerate remediation as well as creation, we have a shot at closing that gap. 

 

In the long run, we may find it easier to teach a handful of AI systems to code securely than to train millions of developers from scratch. But that future depends on what we do now to embed secure by design principles across both human and machine-generated code. 

 

 

A call for mandatory standards 

The private sector has made progress, but not fast enough. The voluntary model has clear limits. That’s why I believe the UK should consider mandating secure by design for software vendors in critical sectors — with clear accountability, transparent reporting and alignment with international standards like those promoted by the U.S. Cybersecurity and Infrastructure Security Agency (CISA). 

 

We have precedent: from food hygiene to vehicle safety, baseline standards exist to protect the public from hidden defects. It’s time we treated software the same way. Customers should be able to trust that the software they rely on meets basic security requirements and vendors should be expected to prove it. 

 

The UK has an opportunity to lead here. By turning principles into policy, and guidance into guardrails, we can tip the balance back towards defenders, protect our digital economy and ensure that security is not a luxury but a foundation.  

 


 

John Smith is EMEA CTO at Veracode

 

Main image courtesy of iStockPhoto.com and Angelina Melik-Akopian
