Backup: If accountants and lawyers aren’t getting it right, how well are you doing?
Recent research conducted by Alto Africa offers a fascinating glimpse into the data backup habits of some of the most regulated industries in the country: the legal and accounting professions. What we found is that while most practitioners have some measures in place, information isn’t universally well protected – and a small percentage do nothing at all to safeguard their data.
That should be seen in context. With today’s technology, there is little reason to lose any information when things go wrong.
Experience has proven to all of us, at one time or another, that computers, laptops, phones and desktop drives can and do break. When they do, we’ve also all experienced the frustration and anger at losing data – yet, even with easy-to-use solutions which virtually guarantee never having to lose information again, we don’t use them.
But what of accountants and lawyers, who have a legal obligation to store information for specified periods?
Overall, the levels of ‘having something in place’ are very good. The real questions are ‘are they good enough?’ and ‘are you using the best available options?’ Typically, those in these professions are backing up manually, onsite. That leaves the business at risk if, for example, a fire breaks out, or if the designated person forgets to do it.
We polled a sample of 63 accountants in the Western Cape and found that:
- While 85% back up both data and emails, these backups are typically done onsite.
- Of these, only half perform backups daily; 20% weekly; 11% monthly.
- Almost 1% of respondents do not perform backups at all.
- 40% personally back up their data, while 25% leave it to IT professionals. A further 20% could not specify (they don’t know how their data is backed up).
- Most backups are conducted manually; 19% are done in the cloud, 37% are done onsite and 25% are done on both.
Alto Africa also polled 30 legal professionals in the Western Cape. We found that:
- While 90% back up both data and emails, these backups are typically done onsite.
- Of these, half perform backups daily; 27% weekly and 10% monthly.
- 20% personally back up their data, while 40% leave it to IT professionals. A further 10% could not specify.
- Most backups are conducted manually; 13.3% are done in the cloud, 33.3% onsite and 56.7% on both.
According to the South African Institute of Chartered Accountants, the regulations regarding retention of records include that auditors’ data is kept for 5 years after completion; Close Corporation documentation for up to 15 years; company information for 7 years; and consumer information/credit records for 3 years.
The findings demonstrate that even in the most regulated industries where there are clear rules surrounding information protection, the rigour applied to backups is patchy.
And intuitively, we know that the situation is likely to be far worse in those industries where there aren’t specific regulations for information protection – but one commonality is that, regardless of vertical industry, data loss hurts.
What happens when your computer crashes and five years’ worth of records go with it? According to research conducted by BCN, 93% of businesses that lose their data go out of business within a year, and 50% that lose their data stores for a period of 10 or more days file for bankruptcy immediately. This is a significant risk, particularly as it applies to records that contain sensitive information.
The question that remains is simple: Do you really want to risk your business by not implementing inexpensive, effective backup protection?
*Alto provides an online backup solution that enables secure cloud server backups which offer protection from disaster, automated daily backups and immediate data restoration from as little as R3/GB/month.
Oscar Pistorius trial shows the importance of data management
Oscar Pistorius’ trial has been dubbed the ‘trial of the century’ and is receiving a great deal of media attention. A young woman has died under tragic circumstances, and the fate of her killer lies in the evidence, the prosecution’s ability to create a picture from that evidence, and the ability of the defence to create reasonable doubt. The same concept can be applied to data management within an organisation.
Typically, we think of data as information captured in a database. The evidence used in this trial serves as a reminder that data is simply raw information and can take many forms.
A court case presents evidence - data - and draws conclusions based on this data. This can be likened to the decision-making process of any business. Ironically, in a court case, the prosecution and the defence are each pushing for diametrically opposed conclusions – innocence or guilt, using the same data.
While the merits of the case are for the court to decide, what has been interesting is the legal process itself, and how the opposing counsel is applying data management principles to support their desired outcomes.
Indeed, the outcome of this case could depend on the data management principles of data governance and data quality.
The changing role of resellers
Educating customers on backup is critical for data protection
Data has become the lifeblood of any organisation, and with an increasing shift away from hardware towards a more service-oriented market, the role of the reseller when it comes to backup has changed dramatically. A ‘box drop’ approach is no longer sufficient, given the critical nature of data. Resellers of data protection solutions now have a responsibility to their customers to educate them on backup solutions and practices and ensure business continuity by making certain that their customers can recover effectively in the event of a data issue.
While most large organisations have realised the critical nature of effective backup and recovery solutions, the Small to Medium Business (SMB) market still relies heavily on memory sticks, external hard drives and other ad-hoc backup processes, if they have any such processes in place at all. However, these backups are often not regularly checked, and only when a data issue occurs and a restore is necessary do the problems with this method become evident. In addition, when backups are recovered from such devices, it is usually difficult or impossible to recover just the missing data, and only the data from the last backup can be restored. This usually results in work since the last backup being lost.
Irregular or infrequent backups often go hand-in-hand with users not checking their backed-up data for integrity and recoverability, which can cripple the business in the event of data not being recoverable. Without this critical data, many SMB organisations simply cannot recover, leading to lost income and even the closure of the business itself. With approximately 97% of all data restores necessitated by hardware failure, hard drive malfunctions or data corruption, the need for end users in businesses of all sizes to move to automated backup environments is clear.
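To make the point concrete, the automated approach described above can be sketched in a few lines. This is an illustrative sketch only – not any vendor’s product – showing a scheduled job (run, say, from cron or Task Scheduler) that archives a folder and, crucially, verifies the archive immediately, rather than discovering corruption only when a restore is needed:

```python
import hashlib
import zipfile
from datetime import datetime
from pathlib import Path

def run_backup(source_dir: str, backup_dir: str) -> Path:
    """Create a timestamped zip archive of source_dir and verify it.

    Raises IOError if the archive is corrupt or any file's checksum
    does not match the original, so a silent bad backup never passes.
    """
    source = Path(source_dir)
    dest = Path(backup_dir)
    dest.mkdir(parents=True, exist_ok=True)

    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = dest / f"backup-{stamp}.zip"

    # Write every file under source_dir into the archive.
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in source.rglob("*"):
            if path.is_file():
                zf.write(path, path.relative_to(source))

    # Integrity check: re-read the archive and compare checksums
    # against the originals before trusting this backup.
    with zipfile.ZipFile(archive) as zf:
        if zf.testzip() is not None:
            raise IOError(f"Corrupt archive: {archive}")
        for name in zf.namelist():
            original = hashlib.md5((source / name).read_bytes()).hexdigest()
            restored = hashlib.md5(zf.read(name)).hexdigest()
            if original != restored:
                raise IOError(f"Checksum mismatch for {name}")
    return archive
```

Because each run produces a separate timestamped archive, restoring “just the missing data” from a particular day becomes possible – exactly what the ad-hoc memory-stick approach cannot offer.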
Resellers of these solutions are in a favourable position to educate their end user customers on the benefits of automated backup and the repercussions of not having a plan or process in place. Many businesses, particularly in the SMB space, do not have the expertise or capacity to adequately manage backup and recovery on their own. Added to this, the research required to find a solution that is ‘fit for purpose’ has proved onerous in the past, leading to poor backup practices that can cause problems further down the line.
As providers of backup solutions, these resellers understand the market, the challenges, and the needs of their customers, and are also able to offer a managed service that delivers more comprehensive backup and recovery. With the evolution of technology, there is also a wider range of solutions on offer to deliver fast, efficient and above all automated backup to protect vital data.
There are now a host of best-of-breed solutions available for businesses of all sizes, addressing backup from the level of individual PCs right up to servers and entire data centres. The growth of the cloud, and increased trust in cloud solutions, has also provided another avenue for resellers to offer remote backup solutions, which store data securely offsite in the cloud, meeting best practice guidelines and ensuring always-available data recovery.
In order to take advantage of new opportunities and provide better customer services, resellers need to make the leap from selling products to providing solutions and services that deliver value to their customers. The onus is now on resellers to take this proactive step, do their research and find the right products, including cloud or hosted platforms, to adopt and sell on to their customers. This not only opens up new revenue streams, but delivers immense satisfaction in knowing that customers’ data is secure and properly backed up. Resellers have the opportunity to become trusted partners and reinforce relationships, strengthening their own business while helping their customers at the same time.
By taking on this new role, not only are resellers able to take on a more strategic position in an IT world dominated by the cloud, they are also able to benefit from improved credibility and annuity revenue that results from selling solutions and advice rather than simply products.
Recordings: the contact centre security hole
The management and security of contact centre recordings tends to be sorely neglected, says Karl Reed, Chief Marketing & Solutions Officer at Elingo.
Most contact centres record voice calls and interactions, but few manage and utilise the resulting data effectively.
In line with legislation, and to protect themselves in the event of customer complaints and queries, most companies have recording systems installed in their contact centres. However, in our experience, recording is not among the top priorities for contact centres, and tends to be a ‘grudge purchase’. This is a mistake.
Installing ‘cheap and nasty’ off the shelf recording tools simply to record and store conversations defeats the object of recording, and leaves gaping holes in enterprise security and risk management.
Business tends to overlook the importance of the confidential information captured in the recordings, as well as its potential strategic business value.
Recorded data from customer voice calls, emails, faxes and SMSes contains a great deal of personal information about clients and their transactions with the enterprise. The potential losses and reputational damage if this data should fall into the wrong hands are huge. While most systems do include some form of tamper record, recorded data is simply not treated with the same levels of security as other enterprise data. Too many contact centres are vague on how the data should be stored and even the length of time the recordings must be stored for – even though, by law, some must be kept for up to ten years.
As enterprises increasingly offer contractual transactions via various channels – including voice and email – the voice or electronic communication has become the binding contract, making security, storage and management of these recordings increasingly important. It is becoming critically important that all recordings are monitored for action points or potential problems, and that these are attended to quickly.
Besides the data security issues, there is business risk and customer retention to be considered. Recorded contact centre data contains vast amounts of valuable information about customer sentiment, potential customer losses and even about the abilities of the contact centre team.
If the recording is not quality managed and integrated into other enterprise systems, there is every chance it will simply be stored, burying valuable insights in a virtual vault. Without effectively using this recorded data, the enterprise is left to rely on the contact centre agent alone as the face of the company, responsible for interpreting sentiment and flagging any concerns. When an agent is managing hundreds of calls a day, they may not be effective in flagging problems and retaining customers.
However, with a quality management team in place to analyse calls, in addition to word spotting functionality and alerts built into the system, the contact centre is able to thoroughly analyse the interactions and respond in a proactive way.
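The word-spotting functionality mentioned above can be illustrated with a minimal sketch. The watch-list phrases and categories below are hypothetical examples of what a QA team might configure; a production system would work against speech-to-text output and a far richer rule set:

```python
from dataclasses import dataclass

# Hypothetical watch-list: phrases and categories are illustrative only.
WATCH_PHRASES = {
    "cancel my account": "churn-risk",
    "speak to a manager": "escalation",
    "not what i was promised": "complaint",
}

@dataclass
class Alert:
    call_id: str
    phrase: str
    category: str

def spot_keywords(call_id: str, transcript: str) -> list[Alert]:
    """Scan a call transcript for watch-list phrases and raise alerts.

    Matching is case-insensitive; each hit becomes an Alert that a
    quality-management team can follow up on proactively.
    """
    text = transcript.lower()
    return [
        Alert(call_id, phrase, category)
        for phrase, category in WATCH_PHRASES.items()
        if phrase in text
    ]
```

Even a simple filter like this changes the economics: instead of an agent juggling hundreds of calls and hoping to remember the unhappy ones, flagged interactions surface automatically for review.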
As enterprises look to expand their insights into customer sentiment, they may be ignoring the reserves of information they already have access to. By managing these insights correctly, enterprises can reduce customer churn, grow their customer bases and even respond more timeously to market demand.
Some contact centres are becoming aware of the need to better manage interaction recordings, but may delay upgrading to high-end integrated systems due to the investment required. It’s necessary to consider the costs of not doing so, as well as the potential return on investment in a system that allows the contact centre to deliver better service, increased customer satisfaction and improved business practices. Business management needs to understand how critical it is to maximise the recorded data, to enable a complete, 360-degree view of the customer, for ultimate business success.
Recording needs to be a lot more than a tick in the box. Considering the risk mitigation and customer insight value it can deliver, the management and security of recordings of all transactions, correspondence and conversations needs to be taken seriously.
Adopting a Lean approach to data governance
Data governance is fast becoming a priority on the business agenda, in light of regulations and guidelines such as King III, which outline recommendations for corporate governance of which data governance forms a critical part. However, data governance can be a challenging and complex task. Adopting a Lean approach to data governance can help businesses to eliminate wasteful data governance activity and promote efficiencies.
Mergers and acquisitions – don’t forget about the data
In recent years across the globe there has been a substantial amount of consolidation spanning various industries, from multiple acquisitions by Google and consolidation of manufacturing in many areas to the recent local buyout of Avusa by the Times Media Group. Mergers and acquisitions are typically driven by opportunities to increase revenue, for example cross selling into each customer base, or to increase operational efficiency by leveraging economies of scale in the new, larger business. However, when it comes to the actual integration of two disparate companies, organisations often focus exclusively on the commercial perspective, attempting to leverage synergies between the businesses while other areas, such as data, are left by the wayside. The reality, though, is that data is a critical component in the success of any merger or acquisition, and a comprehensive data strategy is vital in ensuring a smooth transition and expedient realisation of business goals.
There are many high-level considerations to take into account, and any acquisition is fraught with complexities. Consolidation in any market is typically as a result of a need to remain agile and competitive. However, without data integration, a sound data strategy and data quality initiatives, this agility is difficult to achieve. This is particularly true in a situation where a company either buys out another or attempts to merge two separate entities. Whenever a merger or acquisition takes place, business objectives need to shift and data needs to support these business objectives.
The challenge lies in integrating data from the two entities, as if this cannot be successfully achieved data within the new merged entity will be incomplete or inconsistent, leading to compromised business decision-making, potential non-compliance with data legislation and even degraded customer service levels. In order to ensure the success of mergers and acquisitions, it is imperative to define frameworks and methodologies around data, linked to the business goals of the organisation.
From a data perspective, the challenges around integration are the same for any organisation, regardless of industry or vertical sector. Companies often have vastly different business processes, as well as different financial and legal constraints and different data management practices. If data quality is poor it can have negative implications for the newly merged organisation, and this needs to be addressed. The integration of the business data from the acquired company needs to take into account the legal and financial constraints of the acquirer, and must also incorporate the acquiring business’ best practices, data standards and business rules. This helps to ensure that data quality is maintained before, during and after the integration, to guarantee business continuity and to reduce business risk.
Data integration is often forgotten in the process of a merger or an acquisition, as a result of the ever-present disconnect between business and IT. Often, business will assume that this is IT’s problem, and IT will think that this should be a part of the business side of the acquisition. This means that frequently data conversion and integration is an afterthought and is not handled effectively, which can have significant consequences for a business. For example, if payment data is inaccurate, payments can be delayed, the data must be reworked which takes time and costs money, and organisations can end up in bad debt situations. If data is incomplete, financial reporting will be inaccurate which has several consequences of its own, including legal implications.
In order to address these issues, key business rules need to be identified and applied to avoid business impact during data conversion. Data integration is critical to the success of a merger or acquisition. If the data is not consistent with or does not support the business rules, it becomes increasingly difficult for an organisation to meet and achieve the business case of an acquisition in a timely fashion. A data excellence framework, which encompasses common processes, best practices and common methodology around data quality, data integration and data governance, provides a step-by-step approach to the integration of data to enable acquiring companies to fully leverage on the expected value of an acquisition.
The business-driven rules of a data excellence framework ensure the successful and effective integration of data by providing data quality metrics, exception management and thresholds for data quality after integration to enable smooth future operations. Business rules are defined as rules which data must comply with in order to execute business processes, addressing areas including accuracy, completeness, consistency, duplication and obsolescence. The data excellence framework also ensures that a practical approach to data is taken that does not unduly delay matters, by focusing on critical business rules. For example, rather than seeking 100% quality, which may not be achievable given the acquisition timelines, the optimal data quality level can be determined and targeted to balance integration, quality, governance and given time frames.
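The idea of business rules with pragmatic thresholds, rather than a blanket demand for 100% quality, can be sketched as follows. The field names, rules and threshold values here are hypothetical illustrations, not part of any specific data excellence framework:

```python
from collections import Counter

def completeness(records, field):
    """Share of records with a non-empty value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def duplication(records, key):
    """Share of records whose `key` value appears more than once."""
    if not records:
        return 0.0
    counts = Counter(r.get(key) for r in records)
    dupes = sum(n for n in counts.values() if n > 1)
    return dupes / len(records)

# Each rule: (name, metric function, minimum acceptable score).
# Thresholds deliberately sit below 100%, reflecting the pragmatic
# targets negotiated against acquisition timelines.
RULES = [
    ("customer email completeness", lambda rs: completeness(rs, "email"), 0.95),
    ("customer ID uniqueness", lambda rs: 1 - duplication(rs, "id"), 0.99),
]

def evaluate(records):
    """Score each rule and flag any that fall below its threshold."""
    results = []
    for name, metric, floor in RULES:
        score = metric(records)
        results.append((name, score, score >= floor))
    return results
```

Running `evaluate` over the combined data set after a merger gives exactly the kind of exception report the framework describes: each rule is scored, and only the failures need remediation before integration proceeds.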
Key components of such a framework include data management processes which are interlinked with business, standardised data structures, defined best practices, a standardised data integration platform, and increased levels of ownerships and best practices in business functions. It offers a toolkit for managing change, particularly related to data, within an organisation.
A core function of the data excellence framework is to provide a simple, straightforward process governing data quality business rules as well as to link data quality objectives to their impact on business, to ensure that data integration initiatives can be prioritised accordingly. The framework also defines roles needed for effective data governance and accountability, and enables the execution of multiple data integration projects in the same timescale.
Ultimately the success of the business aspect of mergers and acquisitions relies on the successful integration of data. A detailed data excellence framework and best practices along with a data governance model that includes both business and IT are critical in ensuring this success. In today’s data driven environment, business strategy can only succeed if it is supported by data, which means that high quality data is vital to a merged or acquiring organisation’s sustainability and growth. Data is a business asset, and particularly in the case of mergers and acquisitions should be treated as such.
Does your data strategy support your business goals?
Big data and the data explosion are terms that have become industry buzzwords, and have led many organisations to hop on the bandwagon without considering all of the facts. The reality is that big data does not simply refer to data volumes, but also to the unstructured nature of the data and its source, typically social media. Data is increasing at an exponential rate, which does present challenges, and big data can provide business insight. The question that businesses need to ask themselves however is "which big data do I need" or even "do I need big data at all". Answering these, and similar, questions requires a sound data strategy, and more importantly a data strategy that is aligned with and supports an organisation's business goals.
Organisations today are experiencing something of an information overload, and in an attempt to keep up, many enterprises join in with the big data hype without fully understanding the data, their business, and the big data's place in their business. This unstructured data generated by social media tools can help businesses to understand what their customers are saying about them. However it is scattered through various media across the Internet, and traditional data tools are simply not equipped for this.
The cost of gathering and analysing this data needs to be weighed against the benefit it will provide. In truth, for many organisations, this equation means that big data is not yet relevant and will not add any value, so spending large sums of money on big data initiatives will prove to be a waste of time and money.
Added to this, the big data phenomenon and the data explosion offer additional challenges. Attempting to manage, measure and monitor absolutely every scrap of data often leads organisations to become overwhelmed, leading to complete inaction with regard to data, or ineffective management of multiple data sources. This can end in the proverbial "flooded with information but starved of knowledge" phenomenon affecting so many businesses. Overcoming this phenomenon requires data to support the business and its goals. In order to leverage any real benefit from data, it is vital to have a data management plan or strategy, but more than this, it is imperative that data management plans are linked into and support business strategy. This will help to ensure that any data that is measured and monitored will add business value and go towards improving the bottom line. This approach also ensures that resources, time and money can be focused on delivering information that will have the biggest positive impact on the business.
In an increasingly data-driven world where complexity is only increased by different types of data, increasing volumes of data and multiple regulations and guidelines governing the use and management of data, a data strategy is critical. A business-driven data management strategy helps organisations to achieve three principal objectives. It ensures that data initiatives are aligned to business needs and focused on business priorities by achieving data excellence and information management. It ensures that operational efficiency is improved by helping businesses to identify and address key business process failures that cause poor data quality. Finally, it helps to reduce risk, by ensuring that a pragmatic information risk management strategy is developed and maintained.
Ultimately a business is driven by the principle of creating shareholder value, and data needs to support this. If data is not delivering value, then it is not worth spending money to manage it. In order to ensure that data does deliver value, a sound understanding of the business goals and strategy, linked into data management plans, is critical. The data excellence framework describes a proven methodology, processes and roles necessary to achieve a business-driven data management strategy that will help businesses to ensure that data is manageable and that, more importantly, value can be obtained from this data.
Assessing your data quality – going back to basics
We all understand that quality information is an enabler for cost cutting, risk reduction and revenue enhancement. Yet companies have different approaches to managing their corporate information assets, ranging from ad hoc, tactical projects to meet specific goals, to strategic imperatives that seek to embed data quality principles across the corporate architecture.
This makes it difficult to know where to start, what is effective, or whether you are on track or not for success in meeting the data management challenge.
For many organisations data management remains a reactive process, with individual project teams or business areas identifying specific data issues that are impeding the ability of the project to achieve a specific goal. Short-term measures may include tactical data cleansing initiatives, with or without a tool, and typically do not take corporate standards and goals into account.
Reaching a mature state requires a metamorphosis from this chaotic approach to more proactive approaches that incorporate data quality metrics, standards, shared master data and data stewardship. An enterprise driven focus on data management requires data management principles to be embedded in the corporate culture - being driven both top down and bottom up.
Just as a caterpillar does not become a butterfly in one step, it is important to build upon early successes in a sustainable way to reach data management maturity. The starting point is being able to understand the state of your data; once this is established, planning how to improve it is the next logical step. From here, identifying opportunities for incremental improvement will allow you to maximise the value of your information, with benefits such as lowered costs, improved efficiencies and customer retention, to mention a few.
A crucial success factor is to appoint custodians of the data that take responsibility and most importantly, are accountable for the state of the data. Many organisations have failed to achieve data integrity due to the fact that IT blames business, business blames IT and the end user blames their line manager. Data has no boundaries and is an integral part of each and every component within the business, whether it is sales, finance or operations. Setting the company up for successful data quality means that responsibility must be clear across the entire organisation, and that the consequences of non-delivery are also clear.
Once this step is addressed, it is pointless embarking on a data quality exercise or project if there are no measurements in place. It makes sense to create a ‘baseline’ of the quality of your data (which is usually established with a survey and a data audit) and then key value indicators (KVIs) should be established in order to measure the improvement and success of the data quality initiative. These baselines should be linked to the business impact of failing data quality – there is no point in trying to address data quality issues that have no commercial impact on your business.
In order to fully realise the benefits of a data quality exercise, having the right tools is another fundamental. Many companies are lulled into a false sense of security that their ‘stack’ solution incorporating data cleansing will suffice, but more often than not the functionality is limited. Specialist tools are purpose built and therefore provide richer features and functionality. However, it is also important to note that technology alone is not the panacea to a business’ data quality woes. It is recommended that a data quality specialist assists with the project and, in some instances, it is better to outsource the project or initiative to a specialist. They deal with data quality issues on a daily basis, which ultimately means they have more experience and insight into some of the trickier issues that need to be dealt with.
In order to tackle the first step of establishing the state of your data, take this free survey to assess the effectiveness of your approach against those taken by over a thousand IT and business professionals across the globe.
One account, multiple devices?
One online profile, one place for storage, seamless access from any device... and the security and a reliable connection to make it all possible? If that sounds good to you, you are probably one of a growing number of people who own an embarrassing number of digital devices on which they work, play, interact and store important business and personal information. Managing and controlling all that data, not to mention security, has become a challenge. There are a number of solutions to this problem, but because business and personal data now co-exist on devices, it has multiple dimensions.
The reality is that many people today have many personal productivity devices -- two or more phones; perhaps a business laptop; a desktop PC at work and/or at home; a private laptop for recreation; a tablet PC or Ultrabook for taking into meetings or for travelling -- and as they proliferate, productivity becomes an issue.
It's pretty inconvenient. You struggle to find the data, contacts and conversations you have scattered across these devices; access to a business network is declined because your device doesn't have the right version of antivirus; no Wi-Fi at the office (or at a client) means it's difficult to hook in with your tablet PC; you can't remember the passwords for various sites or applications and don't have the device on which you stored them... you know the frustration.
The answers to this challenge are literally streaming in. But the answer may not be just cloud-based; perhaps it's time to consolidate the devices we carry. There are some promising solutions, like the Ultrabook, that have recently poked their heads over the horizon.
Is the answer in the cloud?
Topping the 'cloud' list are the opportunistic and very useful sync and storage applications, like DropBox, where you can share info instantly with someone you identify, or Apple's iCloud, which allows you to aggregate and synchronise data across all your Apple devices. SkyDrive is Microsoft's SSL-encrypted online storage system that also enables group editing of MS files. And then there's Google Drive, Google's storage service that lets you attach photos to email, edit video, send faxes directly, and make, share and edit documents. For this service, Google wants you to sign over usage rights, however.
There are a lot of advantages to be had, but privacy and security are an issue. IBM, for example, recently banned its staff from using DropBox or iCloud for fear of sensitive data being 'lost', since it has little control over the applications staff load onto their devices, which can create vulnerabilities. There's little chance of devices being banned, however.
BYOD and workplace Wi-Fi benefits
There's no denying the huge benefits of Wi-Fi in the workplace. It enables instant synchronisation of tablet PCs, wireless notepads, Ultrabooks and other devices with the office network, diaries, calendars and line-of-business or workgroup collaboration systems. Wi-Fi also has a knock-on effect, automating other business functions such as time and attendance logging, which occurs when, say, an access control system recognises a user entering a building or a meeting room.
"Bring your own device", or BYOD, is a practice many corporates allow. It lets mobile workers (in fact, any workers today) bring their own devices to work and use them. The challenge, as IBM highlights, is maintaining a security layer around work-related data. Authentication and authorisation can be managed, but controlling what users put on their devices (data and/or apps) is another kettle of fish entirely... and of course, if users have numerous devices synced together, vulnerabilities may be multiplied across all of them.
A single device?
Perhaps the risk can be averted by the still elusive single 'uber device' that marries convenience, power and all the other features our lifestyles demand of the electronic devices we use for work and play. Hardware and system vendors are working hard to create it.
Some are building security into the devices themselves. There's physical security, like the biometrics and iris scans needed for system access, but also technology like Intel's Anti-Theft solution, which allows Ultrabooks with Intel processors to be disabled from anywhere in the world if they are compromised. Intel also offers online identity protection in second-generation Ultrabooks, with its Intel Identity Protection Technology creating a trusted link to the user's system, accounts and favourite sites.
A happy medium
Finding a happy medium means finding the sweet spot between functionality and seamless operation. At least part of the solution seems to lie in the cloud, though awareness of the vulnerabilities of the various applications, especially for business data, is important. For others, a large part of the answer will be finding a single device that marries the best features of a business and a personal device. Unfortunately, there is no one-size-fits-all; the 'ideal' device will depend very much on the lifestyle of the individual. The Ultrabook shows promise; its evolution will certainly be worth watching.
Big data – do you really need it in your production database?
Big data is all the buzz as organisations scramble to leverage their mountains of data for business insight. The question, however, is how much data a business actually needs. Organisations sit with terabytes of data, much of it several years old, all kept in the production database for instant access. The reality is that this is often not required for everyday operations. While governance and regulation may require that certain data be kept for legal purposes, it does not need to be stored in expensive, 'instant access' databases. Historical data can be archived, saving money and time and helping organisations use the right data to make better business decisions.
As content generation continues to explode, data storage strategies have become increasingly important for businesses. Managing this data effectively should be a top priority, for greater cost effectiveness and efficiency. Storing all data in the production database is not only expensive but impractical, and it can degrade everyday server performance. Organisations need a strategy to reduce storage costs in the face of exponential data growth, optimise performance according to the needs of the business and mitigate the risk of lost data and information.
The production database should contain only the current data that is needed for the day-to-day business and operations of the organisation. This database should feature high performance capabilities to deliver this data to users quickly and efficiently. However, if it is being used to store data that is not needed for everyday use, and becomes 'heavy' or bogged down with data, performance will inevitably be compromised.
Data cleansing and consolidation can help reduce data volumes in the production database, but this is often not enough to deliver the required performance gains. A strategy needs to be put in place to ensure that data is archived: removed from production and stored on more cost-effective platforms. This strategy, however, must be linked to the business and its needs, including its daily operations. If data storage, retention and archiving strategies are not aligned with the needs of the business, users will not be able to access the data they need, when they need it, as fast as they need it. It is vital to first understand the needs of the business and then put rules in place around archiving. Archiving is therefore not simply an IT decision but a business decision as well, and database administrators need to understand the business in order to advise on a better strategy.
With data maintenance plans and archiving strategies in place, data can be moved out of the production environment onto archive servers, which will still enable the data to be easily accessed by users but will not affect the performance of the production server. This will make searching faster and increase performance when accessing or creating data. Historical data will take longer to access, but since this is not needed as often the performance gains on daily data outweigh the minor inconvenience.
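As a rough illustration of the archiving step described above, the sketch below moves rows older than a cutoff from a 'production' database into an attached 'archive' database in a single transaction. This is a minimal, hypothetical example using SQLite in-memory databases; the `orders` table and `order_date` column are assumptions for illustration, not details from the article, and a real deployment would target separate servers and far larger schemas.

```python
import sqlite3
from datetime import date, timedelta

# Hypothetical sketch: in-memory databases stand in for the production
# database and the archive server; table/column names are illustrative.
cutoff = (date.today() - timedelta(days=2 * 365)).isoformat()

con = sqlite3.connect(":memory:")                      # "production"
con.execute("ATTACH DATABASE ':memory:' AS archive")   # "archive server"
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT)")
con.execute("CREATE TABLE archive.orders (id INTEGER PRIMARY KEY, order_date TEXT)")

# Seed one current row and one historical row for demonstration.
con.executemany("INSERT INTO orders (order_date) VALUES (?)",
                [(date.today().isoformat(),), ("2001-01-01",)])

# One transaction: copy historical rows to the archive, then delete them
# from production so the production table holds only current data.
with con:
    con.execute("INSERT INTO archive.orders SELECT * FROM orders WHERE order_date < ?",
                (cutoff,))
    con.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))

live = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
archived = con.execute("SELECT COUNT(*) FROM archive.orders").fetchone()[0]
print(live, archived)  # 1 1
```

Running the copy and delete inside one transaction matters: if the job fails midway, neither database is left with a partial move. In production this would typically run as a scheduled job agreed with the business, per the retention rules discussed above.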
Partitioning data in this way also brings down hardware, software and licensing costs. Production databases must deliver high performance, which means higher cost: database size, server memory, CPUs and licensing are interlinked, and companies that keep historical data on the production server will need to spend more on performance and on licensing that is based on CPUs. Archive servers do not need to provide the same levels of performance, so lower-cost, lower-specified servers can be used for this purpose. This approach also makes maintenance on the production environment easier: rebuilding indexes is quicker, backups run faster and, in general, performance and uptime are maximised. The archive server can also be used as a quality control environment to validate data integrity more safely, since doing this on the production server can have a negative impact on business performance.
Ultimately the rules of data storage strategy are simple. The production database should contain only the data needed for day-to-day operations, and all historical data should be moved onto an archive server. This will allow for the production server to be streamlined and deliver the best possible performance, and will optimise the cost of maintenance and running of storage. This in turn will allow organisations to deal with big data in a more intelligent fashion, comply with regulations around data retention, and make more agile decisions based on current data thanks to optimised system performance.