Cyber Security in the Aerospace and Defence Industry

Wednesday, 11 March 2020

Software has become increasingly critical to aerospace and defence systems, and this is where security plays a fundamental role. In fact, security should be thought of as an enabler for such systems. Specific concerns in secure software for aerospace include unauthorized access to (and use of) on-board and off-board systems, not just by users but also by other software applications and services, given that aerospace systems are essentially “systems of systems”.

Then there are cyberattacks, which are bread and butter for defence. In the case of software systems, especially large ones (hundreds of millions of lines of code), it is very difficult to develop “correct” code, let alone “secure” code. Rigorous secure software development, and the enforcement of good security practices during development, has always been an issue for large software companies; when the systems are as large as those in defence and aerospace, it becomes a major issue. Related to this is the testing of large-scale software, which can be time consuming and can delay the delivery of systems, an ever-present factor with large-scale software systems (and security adds further complexity to this process).

I believe developments in cloud and machine learning technologies also have a role to play in secure systems for aerospace and defence. Cloud aims to provide flexibility and functionality, thereby helping to improve efficiency (and hence contributing to cost reductions), provided these are achieved securely, whether that means securing the data (both while it resides in the cloud and while it is being used by cloud services) or securing the cloud services themselves. As aerospace and defence systems are complex, the use of secure cloud-based technologies will be beneficial.

Then there is the area of machine learning. While leveraging machine learning has significant potential to transform aerospace and defence, once again security and trust issues need to be considered. A key issue is trust: how do we ensure that machine learning systems are trustworthy? There are multiple aspects to this. A system should be trustworthy in the sense that the decisions it makes can be “trusted” (they correspond to what is expected), and in the sense that the system itself cannot be subverted (or, at least, if it is subverted by adversaries, we should be able to detect this and discard the decisions made by the subverted system). This area of adversarial machine learning is one of the major areas of concern at present, and I believe it will be of direct relevance to the greater adoption of machine learning (or, more loosely, AI) based systems in the aerospace and defence industry. These notions of trust and security are also relevant to autonomous systems such as unmanned vehicles, whether in the air, in the water or on the roads.
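To make the adversarial machine learning concern concrete, the sketch below applies a fast-gradient-sign-style perturbation to a toy logistic-regression classifier. All of the weights and inputs are hypothetical values chosen for illustration (they do not come from any real aerospace system), but the mechanism shown, that a small, targeted change to the input can flip a model's decision, is exactly the kind of subversion discussed above.

```python
import numpy as np

# Toy illustration of an adversarial perturbation (FGSM-style) against a
# simple logistic-regression "classifier". Weights and inputs below are
# entirely hypothetical, chosen only to demonstrate the mechanism.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, x):
    """Probability that input x belongs to class 1."""
    return sigmoid(np.dot(w, x) + b)

def fgsm_perturb(w, b, x, y, eps):
    """Fast Gradient Sign Method: move x in the direction that increases
    the loss for true label y, with each component bounded by eps."""
    p = predict(w, b, x)
    # Gradient of the binary cross-entropy loss w.r.t. the input x
    # is (p - y) * w for a logistic-regression model.
    grad_x = (p - y) * w
    return x + eps * np.sign(grad_x)

# Hypothetical model and a correctly classified input of true class 1.
w = np.array([2.0, -1.0])
b = 0.0
x = np.array([1.0, 0.5])
y = 1.0

clean_p = predict(w, b, x)
x_adv = fgsm_perturb(w, b, x, y, eps=0.8)
adv_p = predict(w, b, x_adv)

print(f"clean confidence in class 1:       {clean_p:.3f}")
print(f"adversarial confidence in class 1: {adv_p:.3f}")
```

With these numbers the clean input is classified as class 1 with high confidence, while the perturbed input drops below the 0.5 decision threshold, so the classification flips even though the input has only been nudged slightly in each dimension. Detecting such manipulated inputs, or training models to resist them, is the subject of adversarial machine learning research.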

Vijay Varadharajan 24 Feb 2020
Global Innovation Chair Professor in Cyber Security
The University of Newcastle

