Tuesday, 22 October 2013 12:05

The changing role of resellers


Educating customers on backup is critical for data protection

Data has become the lifeblood of any organisation, and with an increasing shift away from hardware towards a more service-oriented market, the role of the reseller when it comes to backup has changed dramatically. A ‘box drop’ approach is no longer sufficient, given the critical nature of data. Resellers of data protection solutions now have a responsibility to educate their customers on backup solutions and practices, and to ensure business continuity by making certain that those customers can recover effectively in the event of a data issue.


While most large organisations have realised the critical nature of effective backup and recovery solutions, the Small to Medium Business (SMB) market still relies heavily on memory sticks, external hard drives and other ad-hoc backup processes, if they have any such processes in place at all. However, these backups are often not regularly checked, and only when a data issue occurs and a restore is necessary do the problems with this method become evident. In addition, when backups are recovered from such devices, it is usually difficult or impossible to recover just the missing data, and only the data from the last backup can be restored. This usually results in work since the last backup being lost.


Irregular or infrequent backups often go hand-in-hand with users failing to check their backed-up data for integrity and recoverability, which can cripple the business in the event of data not being recoverable. Without this critical data, many SMB organisations simply cannot recover, leading to lost income and even the closure of the business itself. With approximately 97% of all data restores necessitated by hardware failure, hard drive malfunctions or data corruption, the need for end users in businesses of all sizes to move to automated backup environments is clear.
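Automating the integrity check itself need not be complicated. The Python sketch below compares SHA-256 checksums of source files against their backed-up copies; the directory layout and function names are illustrative assumptions, not part of any particular backup product.

```python
import hashlib
from pathlib import Path


def checksum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_backup(source_dir: str, backup_dir: str) -> list[str]:
    """Return relative paths that are missing from, or differ in, the backup."""
    problems = []
    src, dst = Path(source_dir), Path(backup_dir)
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(src)
        copy = dst / rel
        if not copy.is_file() or checksum(copy) != checksum(f):
            problems.append(str(rel))
    return sorted(problems)
```

Run on a schedule, a check like this turns "we think the backup works" into a report of exactly which files would not restore.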


Resellers of these solutions are in a favourable position to educate their end user customers on the benefits of automated backup and the repercussions of not having a plan or process in place. Many businesses, particularly in the SMB space, do not have the expertise or capacity to adequately manage backup and recovery on their own. Added to this, the research required to find a solution that is ‘fit for purpose’ has proved onerous in the past, leading to poor backup practices that can cause problems further down the line.


As providers of backup solutions, these resellers understand the market, the challenges, and the needs of their customers, and are also able to offer a managed service that delivers more comprehensive backup and recovery. With the evolution of technology, there is also a wider range of solutions on offer to deliver fast, efficient and above all automated backup to protect vital data.


There are now a host of best-of-breed solutions available for businesses of all sizes, addressing backup from the level of individual PCs right up to servers and entire data centres. The growth of the cloud, and increased trust in cloud solutions, has also provided another avenue for resellers to offer remote backup solutions, which store data securely offsite in the cloud, meeting best practice guidelines and ensuring always-available data recovery.


In order to take advantage of new opportunities and provide better customer services, resellers need to make the leap from selling products to providing solutions and services that deliver value to their customers. The onus is now on resellers to take this proactive step, do their research and find the right products, including cloud or hosted platforms, to adopt and sell on to their customers. This not only opens up new revenue streams, but delivers immense satisfaction in knowing that customers’ data is secure and properly backed up. Resellers have the opportunity to become trusted partners and reinforce relationships, strengthening their own business while helping their customers at the same time.


By taking on this new role, not only are resellers able to take on a more strategic position in an IT world dominated by the cloud, they are also able to benefit from improved credibility and annuity revenue that results from selling solutions and advice rather than simply products.

Published in Storage & Data Centres
Thursday, 18 April 2013 12:02

Recordings: the contact centre security hole


The management and security of contact centre recordings tends to be sorely neglected, says Karl Reed, Chief Marketing & Solutions Officer at Elingo.

Most contact centres record voice calls and interactions, but few manage and utilise the resulting data effectively.


In line with legislation, and to protect themselves in the event of customer complaints and queries, most companies have recording systems installed in their contact centres. However, in our experience, recording is not among the top priorities for contact centres, and tends to be a ‘grudge purchase’. This is a mistake.


Installing ‘cheap and nasty’ off-the-shelf recording tools simply to record and store conversations defeats the object of recording, and leaves gaping holes in enterprise security and risk management.


Business tends to overlook the importance of the confidential information captured in the recordings, as well as its potential strategic business value.


Recorded data from customer voice calls, emails, faxes and SMSes contains a great deal of personal information about clients and their transactions with the enterprise. The potential losses and reputational damage if this data should fall into the wrong hands are huge. While most systems do include some form of tamper record, recorded data is simply not treated with the same levels of security as other enterprise data. Too many contact centres are vague on how the data should be stored and even the length of time the recordings must be stored for – even though, by law, some must be kept for up to ten years.


As enterprises increasingly offer contractual transactions via various channels – including voice and email – the voice or electronic communication has become the binding contract, making the security, storage and management of these recordings increasingly important. It is becoming critical that all recordings are monitored for action points or potential problems, and that these are attended to quickly.


Besides the data security issues, there is business risk and customer retention to be considered. Recorded contact centre data contains vast amounts of valuable information about customer sentiment, potential customer losses and even about the abilities of the contact centre team.


If the recording is not quality managed and integrated into other enterprise systems, there is every chance it will simply be stored, burying valuable insights in a virtual vault. Without effectively using this recorded data, the enterprise is left to rely on the contact centre agent alone as the face of the company, responsible for interpreting sentiment and flagging any concerns. When an agent is managing hundreds of calls a day, they may not be effective in flagging problems and retaining customers.


However, with a quality management team in place to analyse calls, in addition to word spotting functionality and alerts built into the system, the contact centre is able to thoroughly analyse the interactions and respond in a proactive way.
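Word spotting of this kind can be as simple as matching call transcripts against a watch-list of terms. A minimal Python sketch, assuming transcripts are already available as text; the watch-list and threshold are invented for the example:

```python
import re
from collections import Counter

# Hypothetical watch-list; a real deployment would load these from configuration.
ALERT_TERMS = {"cancel", "complaint", "refund", "competitor"}


def spot_keywords(transcript: str, terms=ALERT_TERMS) -> Counter:
    """Count occurrences of watch-list terms in a call transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return Counter(w for w in words if w in terms)


def should_escalate(transcript: str, threshold: int = 2) -> bool:
    """Flag a call for quality-management review when enough alert terms appear."""
    return sum(spot_keywords(transcript).values()) >= threshold
```

Production systems layer speech-to-text, phrase matching and alerting on top of this basic idea, but the principle – scan every interaction, escalate the risky ones – is the same.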


As enterprises look to expand their insights into customer sentiment, they may be ignoring the reserves of information they already have access to. By managing these insights correctly, enterprises can reduce customer churn, grow their customer bases and even respond more timeously to market demand.


Some contact centres are becoming aware of the need to better manage interaction recordings, but may delay upgrading to high-end integrated systems due to the investment required. It is necessary to consider the costs of not doing so, as well as the potential return on investment in a system that allows the contact centre to deliver better service, increased customer satisfaction and improved business practices. Business management needs to understand how critical it is to maximise the recorded data, to enable a complete, 360-degree view of the customer, for ultimate business success.


Recording needs to be a lot more than a tick in the box. Considering the risk mitigation and customer insight value it can deliver, the management and security of recordings of all transactions, correspondence and conversations needs to be taken seriously.

Published in Security
Tuesday, 19 February 2013 16:02

Adopting a Lean approach to data governance


Data governance is fast becoming a priority on the business agenda, in light of regulations and guidelines such as King III, which outline recommendations for corporate governance of which data governance forms a critical part. However, data governance can be a challenging and complex task. Adopting a Lean approach to data governance can help businesses to eliminate wasteful data governance activity and promote efficiencies.

Published in Analytics & BI
Thursday, 20 September 2012 10:06

Mergers and acquisitions – don’t forget about the data


In recent years across the globe there has been a substantial amount of consolidation spanning various industries, from multiple acquisitions by Google and consolidation of manufacturing in many areas to the recent local buyout of Avusa by the Times Media Group. Mergers and acquisitions are typically driven by opportunities to increase revenue, for example cross-selling into each customer base, or to increase operational efficiency by leveraging economies of scale in the new, larger business. However, when it comes to the actual integration of two disparate companies, organisations often focus exclusively on the commercial perspective, attempting to leverage synergies between the businesses while other areas, such as data, are left by the wayside. The reality, though, is that data is a critical component in the success of any merger or acquisition, and a comprehensive data strategy is vital in ensuring a smooth transition and expedient realisation of business goals.

There are many high-level considerations to take into account, and any acquisition is fraught with complexities. Consolidation in any market is typically as a result of a need to remain agile and competitive. However, without data integration, a sound data strategy and data quality initiatives, this agility is difficult to achieve. This is particularly true in a situation where a company either buys out another or attempts to merge two separate entities. Whenever a merger or acquisition takes place, business objectives need to shift and data needs to support these business objectives.

The challenge lies in integrating data from the two entities; if this cannot be successfully achieved, data within the new merged entity will be incomplete or inconsistent, leading to compromised business decision-making, potential non-compliance with data legislation and even degraded customer service levels. In order to ensure the success of mergers and acquisitions, it is imperative to define frameworks and methodologies around data, linked to the business goals of the organisation.

From a data perspective, the challenges around integration are the same for any organisation, regardless of industry or vertical sector. Companies often have vastly different business processes, as well as different financial and legal constraints and different data management practices. If data quality is poor it can have negative implications for the newly merged organisation, and this needs to be addressed. The integration of the business data from the acquired company needs to take into account the legal and financial constraints of the acquirer, and must also incorporate the acquiring business’ best practices, data standards and business rules. This helps to ensure that data quality is maintained before, during and after the integration, to guarantee business continuity and to reduce business risk.

Data integration is often forgotten in the process of a merger or an acquisition, as a result of the ever-present disconnect between business and IT. Often, business will assume that this is IT’s problem, and IT will think that this should be part of the business side of the acquisition. This means that data conversion and integration are frequently an afterthought and are not handled effectively, which can have significant consequences for a business. For example, if payment data is inaccurate, payments can be delayed, the data must be reworked, which takes time and costs money, and organisations can end up in bad debt situations. If data is incomplete, financial reporting will be inaccurate, which has several consequences of its own, including legal implications.

In order to address these issues, key business rules need to be identified and applied to avoid business impact during data conversion. Data integration is critical to the success of a merger or acquisition. If the data is not consistent with or does not support the business rules, it becomes increasingly difficult for an organisation to meet and achieve the business case of an acquisition in a timely fashion. A data excellence framework, which encompasses common processes, best practices and a common methodology around data quality, data integration and data governance, provides a step-by-step approach to the integration of data, enabling acquiring companies to fully leverage the expected value of an acquisition.

The business-driven rules of a data excellence framework ensure the successful and effective integration of data by providing data quality metrics, exception management and thresholds for data quality after integration to enable smooth future operations. Business rules are defined as rules which data must comply with in order to execute business processes, addressing areas including accuracy, completeness, consistency, duplication and obsolescence. The data excellence framework also ensures that a practical approach to data is taken that does not unduly delay matters, by focusing on critical business rules. For example, rather than seeking 100% quality, which may not be achievable given the acquisition timelines, the optimal data quality level can be determined and targeted to balance integration, quality, governance and given time frames.
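As an illustration of what such business rules look like in practice, the Python sketch below scores a set of records against a few rules and compares each pass rate to a target threshold. The rules, field names and the 95% target are assumptions for the example, not prescriptions of the framework.

```python
# Each rule is a named predicate that a record must satisfy.
# These three rules and their field names are hypothetical examples.
RULES = {
    "has_customer_id": lambda r: bool(r.get("customer_id")),
    "has_valid_country": lambda r: r.get("country") in {"ZA", "GB", "US"},
    "amount_positive": lambda r: isinstance(r.get("amount"), (int, float))
    and r["amount"] > 0,
}


def quality_report(records, rules=RULES, threshold=0.95):
    """Score each rule as a pass rate over the records and flag shortfalls."""
    report = {}
    for name, check in rules.items():
        passed = sum(1 for r in records if check(r))
        rate = passed / len(records) if records else 1.0
        report[name] = {"pass_rate": rate, "meets_threshold": rate >= threshold}
    return report
```

Setting the threshold below 100% is exactly the pragmatism described above: the report tells the integration team which rules fall short of the agreed target, rather than demanding perfection on every field.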

Key components of such a framework include data management processes that are interlinked with the business, standardised data structures, defined best practices, a standardised data integration platform, and increased levels of ownership and best practice within business functions. It offers a toolkit for managing change, particularly change related to data, within an organisation.

A core function of the data excellence framework is to provide a simple, straightforward process governing data quality business rules as well as to link data quality objectives to their impact on business, to ensure that data integration initiatives can be prioritised accordingly. The framework also defines roles needed for effective data governance and accountability, and enables the execution of multiple data integration projects in the same timescale.

Ultimately the success of the business aspect of mergers and acquisitions relies on the successful integration of data. A detailed data excellence framework and best practices along with a data governance model that includes both business and IT are critical in ensuring this success. In today’s data driven environment, business strategy can only succeed if it is supported by data, which means that high quality data is vital to a merged or acquiring organisation’s sustainability and growth. Data is a business asset, and particularly in the case of mergers and acquisitions should be treated as such.

Published in Trade & Investment
Friday, 31 August 2012 08:51

Does your data strategy support your business goals?


Big data and the data explosion are terms that have become industry buzzwords, and have led many organisations to hop on the bandwagon without considering all of the facts. The reality is that big data does not simply refer to data volumes, but also to the unstructured nature of the data and its source, typically social media. Data is increasing at an exponential rate, which does present challenges, and big data can provide business insight. The question that businesses need to ask themselves, however, is "which big data do I need?" or even "do I need big data at all?" Answering these, and similar, questions requires a sound data strategy, and more importantly a data strategy that is aligned with and supports an organisation's business goals.


Organisations today are experiencing something of an information overload, and in an attempt to keep up, many enterprises join in with the big data hype without fully understanding the data, their business, and big data's place in their business. The unstructured data generated by social media tools can help businesses to understand what their customers are saying about them. However, it is scattered through various media across the Internet, and traditional data tools are simply not equipped for this.


The cost of gathering and analysing this data needs to be weighed against the benefit it will provide. In truth, for many organisations, this equation means that big data is not yet relevant and will not add any value, so spending large sums of money on big data initiatives will prove to be a waste of time and money.


Added to this, the big data phenomenon and the data explosion offer additional challenges. Attempting to manage, measure and monitor absolutely every scrap of data often leads organisations to become overwhelmed, leading to complete inaction with regard to data, or ineffective management of multiple data sources. This can end in the proverbial "flooded with information but starved of knowledge" phenomenon affecting so many businesses. Overcoming this phenomenon requires data to support the business and its goals. In order to leverage any real benefit from data, it is vital to have a data management plan or strategy, but more than this, it is imperative that data management plans are linked into and support business strategy. This will help to ensure that any data that is measured and monitored will add business value and go towards improving the bottom line. This approach also ensures that resources, time and money can be focused on delivering information that will have the biggest positive impact on the business.


In an increasingly data-driven world where complexity is only increased by different types of data, increasing volumes of data and multiple regulations and guidelines governing the use and management of data, a data strategy is critical. A business-driven data management strategy helps organisations to achieve three principal objectives. It ensures that data initiatives are aligned to business needs and focused on business priorities, through data excellence and information management. It improves operational efficiency by helping businesses to identify and remedy the key business process failures that cause poor data quality. Finally, it helps to reduce risk, by ensuring that a pragmatic information risk management strategy is developed and maintained.


Ultimately a business is driven by the principle of creating shareholder value, and data needs to support this. If data is not delivering value, then it is not worth spending money to manage it. In order to ensure that data does deliver value, a sound understanding of the business goals and strategy, linked into data management plans, is critical. The data excellence framework describes a proven methodology, processes and roles necessary to achieve a business-driven data management strategy that will help businesses to ensure that data is manageable and that, more importantly, value can be obtained from this data.

Published in Analytics & BI
Assessing your data quality – going back to basics

We all understand that quality information is an enabler for cost cutting, risk reduction and revenue enhancement. Yet companies take very different approaches to managing their corporate information assets, ranging from ad hoc, tactical projects to meet specific goals, to strategic imperatives that seek to embed data quality principles across the corporate architecture.

This makes it difficult to know where to start, what is effective, or whether you are on track or not for success in meeting the data management challenge.

For many organisations data management remains a reactive process, with individual project teams or business areas identifying specific data issues that are impeding the ability of the project to achieve a specific goal. Short-term measures may include tactical data cleansing initiatives, with or without a tool, and typically do not take corporate standards and goals into account.

Reaching a mature state requires a metamorphosis from this chaotic approach to more proactive approaches that incorporate data quality metrics, standards, shared master data and data stewardship. An enterprise driven focus on data management requires data management principles to be embedded in the corporate culture - being driven both top down and bottom up.

Just as a caterpillar does not become a butterfly in one step, it is important to build upon early successes in a sustainable way to reach data management maturity. The starting point is being able to understand the state of your data; once this is established, planning how to improve it is the next logical step. From here, identifying opportunities for incremental improvement will allow you to maximise the value of your information, with benefits such as lowered costs, improved efficiencies and customer retention, to mention a few.

A crucial success factor is to appoint custodians of the data who take responsibility and, most importantly, are accountable for the state of the data. Many organisations have failed to achieve data integrity because IT blames business, business blames IT and the end user blames their line manager. Data has no boundaries and is an integral part of each and every component within the business, whether it is sales, finance or operations. Setting the company up for successful data quality means that responsibility must be clear across the entire organisation, and that the consequences of non-delivery are also clear.

Once this step is addressed, measurement comes next: it is pointless embarking on a data quality exercise or project if there are no measurements in place. It makes sense to create a ‘baseline’ of the quality of your data (usually established with a survey and a data audit), and then key value indicators (KVIs) should be established in order to measure the improvement and success of the data quality initiative. These baselines should be linked to the business impact of failing data quality – there is no point in trying to address data quality issues that have no commercial impact on your business.

In order to fully realise the benefits of a data quality exercise, having the right tools is another fundamental. Many companies are lulled into a false sense of security that the data cleansing functionality in their ‘stack’ solution will suffice, but more often than not that functionality is limited. Specialist tools are purpose-built and therefore provide richer features and functionality. However, it is also important to note that technology alone is not the panacea for a business’ data quality woes. It is recommended that a data quality specialist assists with the project and, in some instances, it is better to outsource the project or initiative to a specialist. Specialists deal with data quality issues on a daily basis, which ultimately means they have more experience and insight into some of the trickier issues that need to be dealt with.

In order to tackle the first step of establishing the state of your data, take the free survey to assess the effectiveness of your approach against those taken by over a thousand IT and business professionals across the globe.

Published in Analytics & BI
Friday, 17 August 2012 08:46

One account, multiple devices?


One online profile, one place for storage, seamless access from any device... and the security and a reliable connection to make it all possible? If that sounds good to you, you are probably one of a growing number of people who have an embarrassing number of digital devices on which they work, play, interact and store important business and personal information. Managing and controlling all that data, not to mention security, has become a challenge. There are a number of solutions to this problem, but because business and personal data now co-exist on devices, it has multiple dimensions.

The reality is that many people today have multiple personal productivity devices -- two or more phones; perhaps a business laptop; a desktop PC at work and/or at home; a private laptop for recreation; a tablet PC or Ultrabook for taking into meetings or for travelling -- and as these devices proliferate, productivity becomes an issue.

It's pretty inconvenient. You struggle to find the data, contacts and conversations you have scattered across these devices; access to a business network is declined because your device doesn't have the right version of antivirus; no Wi-Fi at the office (or at a client) means it's difficult to hook in with your tablet PC; you can't remember the passwords for various sites or applications and don't have the device on which you stored them... you know the frustration.

The answers to this challenge are literally streaming in. But the answer may not be just cloud-based; perhaps it's time to consolidate the devices we carry. There are some promising solutions, like the Ultrabook, that have recently poked their heads over the horizon.

Is the answer in the cloud?

Topping the 'cloud' list are the opportunistic and very useful sync and storage applications, like DropBox where you can share info instantly with someone you identify, or Apple's iCloud that allows you to aggregate and synchronise data across all your Apple devices. SkyDrive is Microsoft's SSL encrypted online storage system that also enables group editing of MS files. And then there's Google Drive, Google's storage service that lets you add photos to email, edit video and send faxes directly and make, share and edit documents. For this service, Google wants you to sign over usage rights, however.

There are a lot of advantages to be had, but privacy and security are an issue. IBM, for example, recently banned its staff from using DropBox or iCloud for fear of sensitive data being 'lost', since it has little control over the applications staff load onto their devices, which can create vulnerabilities. There's little chance of devices being banned, however.

BYOD and workplace Wi-Fi benefits

There's no denying the huge benefits of Wi-Fi in the workplace. It facilitates instant synchronisation of devices like tablet PCs and wireless notepads, Ultrabooks and other devices with the office network, diaries, calendars and other line-of-business or workgroup collaboration networks. Wi-Fi also has a knock-on effect, automating other business functions – like time and attendance logging, which occurs when, say, an access control system recognises a user entering a building or a meeting room.

"Bring your own device" or BYOD is a practice many corporates allow. It lets mobile workers – actually any workers today – bring their own devices to work and use them. The challenge, as IBM highlights, is then maintaining a security layer around work related data. Authentication and authorisation can be managed, but controlling what anyone puts on their devices (data and/or apps) is another kettle of fish entirely... and of course if users have numerous devices synced together, vulnerabilities may be multiplied across devices.

A single device?

Perhaps risk can be averted through use of the still elusive single 'uber device' that marries convenience, power and all the other features our lifestyles demand of the electronic devices we use for work and play. Hardware and system vendors are working hard to achieve this.

Some are building security into devices. There is physical security, such as the biometrics and iris scans needed for system access, but also technology like Intel's Anti-Theft solution, which allows Ultrabooks with Intel processors to be disabled from anywhere in the world if they are compromised. Intel also offers online identity protection in second-generation Ultrabooks, with its Intel Identity Protection Technology creating a trusted link to the user's system, accounts and favourite places.

A happy medium

Finding a happy medium may mean finding the sweet spot with regard to functionality and seamless function. At least a part of the solution seems to lie in the cloud, but awareness of the vulnerabilities of various applications, especially for business data, is important. For others, a large part of the answer will be in finding a single device that marries the best features of a business and personal device. Unfortunately there is no one-size-fits-all; the 'ideal' device will depend very much on the lifestyle of the individual. The Ultrabook shows promise; its evolution will certainly be worth watching.

Published in Mobile
Big data – do you really need it in your production database?

Big data is all the buzz, as organisations scramble to leverage their mountains of data to drive business insight. The question, however, is how much data a business actually needs. Organisations sit with terabytes of data, much of it several years old, and all of it kept in the production database for instant access. The reality is that this is often not required for everyday business operations. While governance and regulations may require that certain data is kept for legal purposes, it is not necessary to store this data in expensive, 'instant access' databases. Historical data can be archived, saving money and time and helping organisations to use the right big data to make better business decisions.

As content generation continues to explode, data storage strategies have become increasingly important for businesses. Effectively managing this data should be a top priority, for greater cost effectiveness and efficiency. When it comes to big data, not only is it not cost effective, it is also impractical to store all data up front in the production database, and can in fact decrease everyday server performance. Organisations need to have a strategy in place to reduce storage costs in the face of exponential data growth, optimise performance according to the needs of the business and mitigate the risk of lost data and information.

The production database should contain only the current data that is needed for the day-to-day business and operations of the organisation. This database should feature high performance capabilities to deliver this data to users quickly and efficiently. However, if it is being used to store data that is not needed for everyday use, and becomes 'heavy' or bogged down with data, performance will inevitably be compromised.

Data cleansing and consolidation can assist with reducing data volumes in the production database, but this is often not enough to deliver the required performance gains. A strategy needs to be put in place to ensure that data is archived, removed from production and stored in more cost-effective options. This strategy, however, must be linked into the business and its needs, including its daily operations. If data storage, retention and archiving strategies are not in line with the needs of the business, users will not be able to access the data they need, when they need it and as fast as they need to. It is vital to first understand the needs of the business and then put rules in place around archiving. This means that archiving is not simply an IT decision, but a business decision as well, and database administrators need to understand the business in order to provide advice for a better strategy.

With data maintenance plans and archiving strategies in place, data can be moved out of the production environment onto archive servers, which will still enable the data to be easily accessed by users but will not affect the performance of the production server. This will make searching faster and increase performance when accessing or creating data. Historical data will take longer to access, but since this is not needed as often the performance gains on daily data outweigh the minor inconvenience.
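As an illustrative sketch of the archiving step described above (table names, column names and the retention cutoff are all hypothetical, and real deployments would run this against the production and archive servers rather than in-memory databases), the move of historical rows out of production might look like this:

```python
import sqlite3

# Hypothetical retention boundary, agreed with the business.
CUTOFF = "2011-01-01"

prod = sqlite3.connect(":memory:")  # stands in for the production database
prod.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, order_date TEXT)"
)
prod.executemany(
    "INSERT INTO orders (customer, order_date) VALUES (?, ?)",
    [("Acme", "2009-06-15"), ("Bolt", "2012-03-01")],
)

# Attach the (cheaper, lower-spec) archive database and copy
# the historical rows across before deleting them from production.
prod.execute("ATTACH DATABASE ':memory:' AS archive")
prod.execute("CREATE TABLE archive.orders AS SELECT * FROM orders WHERE 0")
prod.execute(
    "INSERT INTO archive.orders SELECT * FROM orders WHERE order_date < ?",
    (CUTOFF,),
)
prod.execute("DELETE FROM orders WHERE order_date < ?", (CUTOFF,))
prod.commit()
```

After this runs, only current orders remain in the production table, while the pre-cutoff rows sit on the archive side, still queryable but no longer weighing down day-to-day operations.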

Partitioning data in this way also brings down the costs of hardware, software and licensing, saving organisations money. Production databases must deliver high performance, which means higher cost: database size, server memory or CPUs, and licensing are interlinked. Companies that keep historical data on the production server will need to spend more on high-performance hardware and CPU-based licensing. Archive servers do not need to provide the same levels of performance, so lower-cost, lower-specification servers can be used for this purpose. This approach also means that maintenance on the production environment is easier, rebuilding indexes is quicker and backups run faster. In general, performance and uptime will be maximised. The archive server can also be used as a quality control environment to validate data integrity safely, since doing this on the production server can have a negative impact on business performance.

Ultimately the rules of data storage strategy are simple. The production database should contain only the data needed for day-to-day operations, and all historical data should be moved onto an archive server. This will allow for the production server to be streamlined and deliver the best possible performance, and will optimise the cost of maintenance and running of storage. This in turn will allow organisations to deal with big data in a more intelligent fashion, comply with regulations around data retention, and make more agile decisions based on current data thanks to optimised system performance.

Published in Storage & Data Centres
Tuesday, 24 July 2012 17:26

Data management and data cleansing – it’s not about the data

We meet with many IT teams that are trying to establish a data quality culture within their organisations. In most cases they share a common concern with us:

“The business people aren’t interested in solving data problems!”

“We don’t get buy-in from senior management for data governance!”

The adage ‘You can’t see the wood for the trees’ is apt when considering their approach to data management and data cleansing. They see so many issues caused by poor data that they assume these will be obvious to everyone. So the problem they try to address becomes “poor data quality” – rather than the business issue (or issues) that the data is affecting.

This common mistake has resulted in business people questioning the business value of data management projects: often a lot of money is spent, yet the projects yield no results and are perceived to be pointless. More often than not, this is because such projects are driven by technologists who do not speak the ‘business language’.

After all, data management is not about the data. It’s about addressing the business issues caused by data and improving the business outcome.

It is therefore vital to partner with a data management professional who understands the business language and how the technology can deliver results that link back to business issues.

For example, a data person may ask for budget to “improve address data quality”, assuming that this is the business driver. After all, address inaccuracies are the cause of many business issues - ranging from returned mail, to the inability to trace and collect bad debts, to the increased risk associated with non-compliance.

In each of these cases, the business driver is not better address data quality. A project that asks for budget to ‘reduce overall debtor days by x%’, or to ‘cut the volume of returned mail by y%’, is far more likely to get attention, and budget, from business than a project intended to ‘improve the accuracy of addresses’.

This approach also focuses the attention of any implementation on the specific end goal, or goals, ensuring that unnecessary effort and money are not expended cleansing data for the sake of it.

Of course, this approach can lead to duplication of effort. Do we need multiple projects, each with a different end goal, all working on the same data?

Pragmatic data governance helps ensure that overlapping business goals are addressed by the same project, rather than by many tactical projects that may impact negatively on each other, or waste resources repeating a task already delivered by another project.

Experience can help to link the business and IT goals, creating that all-important business buy-in and support.

 

Published in Analytics & BI
Monday, 09 July 2012 09:45

Big Data – what’s the big deal and why is data quality so important?

In today’s world volumes of data are rapidly expanding, and the nature of this data is more varied and complex than ever before. While this concept of Big Data brings with it numerous possibilities and insights for organisations, it also brings many challenges, particularly around the management and quality of data. All data has potential value if it can be collected, analysed and used to generate insight, but in the era of Big Data new concepts, practices and technologies are needed to manage and exploit this data, and data quality has become more important than ever.

Big Data has three characteristics, namely volume, variety and velocity. Data today is growing at an exponential rate, with around 90% of all digital data created in the last two years alone and somewhere in the region of 2.5 quintillion bytes of data generated every day. Data is more varied and complex than ever, consisting of a mixture of text, audio, images, machine-generated data and more, and much of this data is semi-structured or unstructured. This data is often generated in real-time, and as a result analysis and response needs to be rapid, often also in real-time. This means that traditional Business Intelligence (BI) and data warehousing environments are becoming obsolete and that traditional techniques are ill-equipped to process and analyse this data.

When it comes to generating Big Data there are multiple sources, some widely known and some that organisations rarely think about. Photographs, emails, music downloads, smartphones, video sharing and social networks account for a large proportion of this data, and are well known sources. But in today’s digital world there are many other places that data may come from, including point of sale devices, RFID chips, recording and diagnostic tools in aeroplanes, cars and other transport, manufacturing, and even scientific research.

All of this machine generated data, along with Internet and socially generated data, can potentially be a source of insight and intelligence. Big Data has the potential to help organisations gain a better understanding of customer and market behaviour, improve knowledge of product and service performance, aid innovation, enhance revenue, reduce cost, and enable more rapid, fact-based decisions which can enhance efficiency, reduce risk and more. The sheer volume of Big Data presents a massive potential pitfall however, and the challenge lies in being able to identify the right data to solve a business problem or address an opportunity, and in the ability to integrate and match data from multiple sources.

In order to leverage the potential of Big Data, it is vital for organisations to be able to define which data really matters out of the explosion of information, and to assure the quality of internal data sources, ensuring accuracy, completeness and consistency so that this data can be matched with that from external sources. It is also important to define and apply business rules and metadata management around how the data will be used, and a data governance framework is crucial for consistency and control. Processes and tools need to be in place to enable source data profiling, data integration and standardisation, business rule creation and management, de-duplication, normalisation, enrichment and auditing of data, among others. Many of these functions need to be capable of being carried out in real-time, with zero lag.
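Two of the data quality functions mentioned above, standardisation and de-duplication, can be sketched in a few lines. This is a minimal illustration only; the field names and matching rules are hypothetical, and production tools apply far richer rules (fuzzy matching, reference data, survivorship):

```python
def standardise(record):
    """Normalise case and collapse whitespace so records can be matched."""
    return {k: " ".join(str(v).split()).lower() for k, v in record.items()}

def deduplicate(records, key_fields=("name", "email")):
    """Keep the first record seen for each standardised match key."""
    seen, unique = set(), []
    for rec in records:
        std = standardise(rec)
        key = tuple(std[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical customer records from two source systems.
customers = [
    {"name": "Jane  Smith", "email": "JANE@EXAMPLE.COM"},
    {"name": "jane smith", "email": "jane@example.com"},
    {"name": "Joe Bloggs", "email": "joe@example.com"},
]
clean = deduplicate(customers)  # the two Jane Smith records collapse into one
```

The point of the sketch is the ordering: records are standardised before they are compared, which is why the two differently-cased Jane Smith entries are recognised as duplicates.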

Data quality is the key enabler when it comes to leveraging actionable insight and knowledge from Big Data. Poor data quality around the analysis of Big Data can lead to big problems. Consolidating various sources of data without data quality will create a mess of conflicting information, and conducting analytics on data without first assuring its quality will likely lead to the wrong results, poor decision making, and other negative consequences, including lack of compliance, which has massive legal implications. The three Vs of Big Data, namely volume, velocity and variety, need to be met by a fourth V – validity – in order to harness this data for the benefit of an organisation and its customers.

While the concept of Big Data may have been over-hyped, the reality is that it is here, and it will continue to grow in volume, velocity and variety. While it is currently an immature concept, as it begins to mature data will increasingly be viewed as a strategic business asset and data skills will be increasingly valued. Big Data reflects the expectations and needs of the emerging generation, and businesses would do well to pay attention to this, and ensure that their data quality initiatives are up to speed to ensure they can leverage the potential value of the Big Data phenomenon.

Published in Analytics & BI

