Tuesday, July 16, 2024

Why Immutable Architecture is Key to Protecting Backup Data


Ransomware and other malware are a constant threat, and data backup is critical to safeguard a business’ most important asset. However, cybercriminals are increasingly targeting and encrypting backup data copies as well, a strategy that leaves organisations unable to recover unless they pay the ransom.

Keeping an immutable copy of backup data is a data-protection best practice, and it is what enables recovery in the event of a successful two-pronged ransomware attack.

What is an immutable architecture?

Immutable data cannot be altered for a defined period of time. With an immutable architecture, when retention policies are set for backup and recovery, the data is secured, locked and unchangeable for that retention period. This in turn means it cannot be encrypted by malware, keeping it safe from cybercriminals and ensuring that recovery is possible without the need to pay a ransom.
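The retention-lock idea can be illustrated with a minimal sketch. The class below is purely hypothetical (real immutable storage enforces locks below the operating system, not in application code); it only shows the contract: once written, a backup cannot be overwritten or deleted until its retention period has expired.

```python
import time


class ImmutableBackup:
    """Hypothetical sketch of a retention lock: a backup record rejects
    any modification until its retention period expires."""

    def __init__(self, data: bytes, retention_seconds: float):
        self._data = data
        # The lock expiry is fixed at write time.
        self._locked_until = time.time() + retention_seconds

    @property
    def data(self) -> bytes:
        # Reads (i.e. restores) are always allowed.
        return self._data

    def overwrite(self, new_data: bytes) -> None:
        # Writes are rejected while the retention lock is active --
        # this is what stops ransomware from encrypting the copy.
        if time.time() < self._locked_until:
            raise PermissionError("backup is immutable until retention expires")
        self._data = new_data


backup = ImmutableBackup(b"critical-backup", retention_seconds=3600)
try:
    backup.overwrite(b"encrypted-by-ransomware")
except PermissionError as err:
    print(err)  # the write is rejected while the lock is active
```

The key design point mirrored here is that the expiry is set when the data is written, so a compromised application cannot shorten it afterwards.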

An immutable architecture is critical in today’s landscape and should address three key elements: data locks; air gaps and isolation; and data validation.

Automatically secure against unauthorised changes

Data locks automatically secure storage, applications and the backup infrastructure against unauthorised changes, such as those made by malicious applications like ransomware. It should be possible to apply locking mechanisms to any storage, including hyperscale storage, and at a deep layer beneath the operating system. This ensures the locks do not depend on software: even if a malicious actor moves or alters the software, the storage remains protected.

The first layer is hardening the storage infrastructure itself. Storage appliances, including the underlying operating system and the databases that hold metadata, need to be hardened according to industry standards and best practices. This layer improves the overall security posture.

Secondly, it is essential to harden the application, in other words, the backup and recovery management interface. The typical environment has many users at varying levels of permission who have access to the backup and recovery environment. They may be able to change, delete or maintain data, and this introduces the potential for both accidental and malicious removal of critical data. To protect against this, it is essential to lock the storage and the application layer using multi-factor authentication controls. This will help to protect and validate access to backup and management software. It is also advisable to implement command authorisation as an additional layer of locking, to add yet another safeguard.

By adding layers of protection at both the appliance and application levels, organisations are better protected against both accidental data loss and malicious intent.

Segment and block direct access to backup data

Air gaps and isolation enable organisations to segment data and block direct access to backup data copies. This is essential for protection against ransomware events. Threats typically infiltrate an environment through various exploits that give the malicious actor access to the network. From there, the attack moves laterally through the environment, locating data and ‘lying in wait’ until it is triggered in a full-blown ransomware attack.

Air gapping mitigates this, providing a layer of protection against these laterally moving threats. Since storage is segmented, isolated and unreachable, it becomes difficult for a threat to gain access to storage targets.

Continuously ensure backup copy integrity

Organisations rely on their backup data being protected and secure, which assumes the data is not corrupted, since corruption would make it unrecoverable. However, without data validation it is impossible to verify the integrity of backup data. With an on-premises solution, validation needs to be performed at the hardware level, whereas with cloud-based and hyperscale storage it can be performed more efficiently at the software level. Data can be validated block by block as it is transmitted, and only blocks that pass their checks are written to disk. Should data be invalid, an alert is sent so that the backup can be rerun.
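Block-level validation of this kind can be sketched with standard checksums. The example below is an illustration, not any vendor's implementation: the sender pairs each block with a SHA-256 digest, and the receiver verifies the digest before writing the block, rejecting the transfer if a block is corrupt.

```python
import hashlib

BLOCK_SIZE = 4  # tiny blocks for illustration; real systems use KB/MB blocks


def send_blocks(data: bytes):
    """Split data into blocks and pair each with its checksum, as a sender would."""
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        yield block, hashlib.sha256(block).hexdigest()


def receive_and_validate(blocks) -> bytes:
    """Validate each block as it arrives; write it only if the checksum matches."""
    disk = bytearray()
    for block, checksum in blocks:
        if hashlib.sha256(block).hexdigest() != checksum:
            # In a real system this would raise an alert so the backup is rerun.
            raise ValueError("block failed validation; rerun the backup")
        disk.extend(block)
    return bytes(disk)


payload = b"backup-data-stream"
restored = receive_and_validate(send_blocks(payload))
assert restored == payload
```

Because each block is checked before it is written, corruption is caught in flight rather than discovered later during a restore, which is the proactive posture the article describes.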

This delivers a more proactive view of data, rather than a reactive strategy once corrupt data is already written to disk, ensuring that the backup copy is always valid and can be restored. Intelligent solutions can even use distributed file systems that have the inherent capability to mitigate data corruption and heal at the file system level. This offers extra resiliency to ensure data is valid from the start and always stays valid.

Unchanging, protected, safe from harm

Today, successful ransomware attacks rely not only on encrypting primary data but on attacking the backup copy as well. Once this happens, organisations are defenceless and are left with little option other than to pay the ransom or lose their data. Implementing data management best practices, including keeping an immutable copy of data, is essential to an effective disaster recovery strategy. An immutable architecture that incorporates data locks, air gaps and data validation will ensure organisations can recover from a data disaster.

By Kate Mollett, Regional Director at Commvault Africa

Edited by Jenna Delport