In the modern world, data is regarded as the lifeblood of an organisation. Cyberthreats have taught us that a reactive response to a data breach, regardless of the magnitude of the event, is no longer adequate. All data is important; therefore, a more modern and innovative approach to keeping it safe yet available needs to be adopted. Organisations need a comprehensive data storage solution that protects data and ensures its availability, providing Business Continuity (BC) if a cyber-attack or other form of data loss occurs.
All data should be treated equally
As data volumes continue to grow exponentially, businesses face the challenge of backing up petabytes rather than terabytes of data. Due to the implementation of the General Data Protection Regulation (GDPR) in the European Union and the imminent implementation of the Protection of Personal Information (PoPI) Act in South Africa, all data needs the same high level of protection and availability.
Aside from regulatory requirements, data also needs to be protected, as it assists organisations with historic and current analysis that delivers valuable insights. It keeps companies on track to meet strategic goals, calculate growth and ensure optimal decision making.
Such analysis is only possible if all data is readily available, irrespective of its location or profile and whether it is part of a working set or already covered by a data protection strategy. Organisations should be able to access and analyse data whenever an informed decision must be made or whenever legislation requires it.
Segregation not an option
While industry standards in the past dictated a tiered approach to data availability, such legacy methods did not guarantee that data would be readily available when needed. These days, organisations cannot afford to classify data into tiers. This is not necessarily a question of cost; rather, segregating data is simply no longer an option, because all data is vital.
It is important that organisations have access to their data 100% of the time. As time is money, the impact of downtime on a company’s bottom line can be severe. The same applies to the time it takes to recover from an outage or to retrieve data that has been corrupted or deleted. A low Recovery Time Objective (RTO) reduces the revenue lost to downtime. More importantly, it can increase profits, provided data is made available in the shortest possible timeframe.
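The link between RTO and revenue exposure can be sketched with a simple linear model. All figures and names below are hypothetical illustrations, not drawn from any specific organisation:

```python
def revenue_at_risk(rto_hours: float, revenue_per_hour: float) -> float:
    """Estimate revenue lost during an outage, assuming losses accrue
    linearly until recovery completes (a deliberate simplification)."""
    return rto_hours * revenue_per_hour

# Hypothetical example: a business earning R50 000 per hour.
baseline = revenue_at_risk(4.0, 50_000.0)  # 4-hour RTO
improved = revenue_at_risk(2.0, 50_000.0)  # 2-hour RTO

# Halving the RTO halves the revenue exposure per incident.
print(baseline - improved)
```

The model ignores reputational damage and recovery labour costs, which in practice make the true cost of a slow recovery even higher.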
A popular misconception is that a high-performing data protection architecture guarantees equally high throughput in both directions. While backup throughput is real and important, the opposite is generally seen when it comes to retrieving backed-up data. The optimal scenario is one where organisations can retrieve data at the same rate as their high-performing backup, which in turn translates into the best possible RTO.
Choosing the correct data storage solution ensures there is no compromise on data protection and availability, because the vendor treats all data as important, just as the organisation itself should. Unfortunately, some organisations approach their data protection strategy as an insurance tick box. This disregards the importance of the data, protected or not, and gives no consideration to whether the business can retrieve specific data within acceptable timeframes.
Different data and applications need different types of protection, yet an all-in-one solution is challenging to find. As data changes and is generated with varying characteristics, an organisation can easily lose control over what the business is doing, diverting attention from what should take priority for business growth. Data protection options must therefore be carefully weighed so that dynamic data is protected, cost savings are preserved, and complexity and siloed views are eliminated.
By Lourens Sanders, Solution Architect at Infinidat South Africa