Oscar Pistorius trial shows the importance of data management
Oscar Pistorius’ trial has been dubbed the ‘trial of the century’ and is receiving a lot of media attention. A young woman has died under tragic circumstances, and the fate of her killer lies in the evidence, the prosecution’s ability to create a picture from that evidence, and the ability of the defence to create reasonable doubt. The same concept can be applied to data management within an organisation.
Typically, we think of data as information captured in a database. The evidence used in this trial serves as a reminder that data is simply raw information and can take many forms.
A court case presents evidence - data - and draws conclusions based on this data. This can be likened to the decision-making process of any business. Ironically, in a court case, the prosecution and the defence each push for diametrically opposed conclusions – guilt or innocence – using the same data.
While the merits of the case are for the court to decide, what has been interesting is the legal process itself, and how the opposing counsel is applying data management principles to support their desired outcomes.
Indeed, the outcome of this case could depend on the data management principles of data governance and data quality.
Compliance bug bites SA companies
South African companies outside of the financial services industry are earnestly looking for data or information governance frameworks to meet statutory and regulatory requirements on the one hand and handle their data management lifecycle (DMLC) on the other. That’s the word from Johann van der Walt, MDM practice manager at Knowledge Integration Dynamics (KID).
Adopting a Lean approach to data governance
Data governance is fast becoming a priority on the business agenda, in light of regulations and guidelines such as King III, which outline recommendations for corporate governance of which data governance forms a critical part. However, data governance can be a challenging and complex task. Adopting a Lean approach to data governance can help businesses to eliminate wasteful data governance activity and promote efficiencies.
Data management and data cleansing – it’s not about the data
We meet with many IT teams that are trying to establish a data quality culture within their organisations. In most cases they share a common concern with us:
“The business people aren’t interested in solving data problems!”
“We don’t get buy-in from senior management for data governance!”
The saying ‘You can’t see the wood for the trees’ is apt when considering their approach to data management and data cleansing. They see so many issues caused by poor data that they assume these will be obvious to everyone. So the problem they try to address becomes “poor data quality” – rather than the business issue (or issues) that the data is affecting.
This common mistake has led business people to question the business value of data management projects: a lot of money is sometimes spent, yet the projects ultimately yield no results and are perceived to be pointless. More often than not, this is because data management projects are driven by technologists who do not speak the ‘business language’.
After all, data management is not about the data. It’s about addressing the business issues caused by data and improving the business outcome.
It is therefore vital to partner with a data management professional who understands the business language and can show how the technology delivers results that link back to business issues.
For example, a data person may ask for budget to “improve address data quality”, assuming that this is the business driver. After all, address inaccuracies are the cause of many business issues - ranging from returned mail, to the inability to trace and collect bad debts, to the increased risk associated with non-compliance.
In each of these cases, the business driver is not better address data quality. A project that asks for budget to ‘reduce overall debtor days by x%’, or to ‘cut the volume of returned mail by y%’, is far more likely to win attention, and budget, from business than a project intended to ‘improve the accuracy of addresses’.
This approach also focuses the attention of any implementation on the specific end goal, or goals, ensuring that unnecessary effort and money are not expended cleansing data for its own sake.
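As a purely illustrative sketch of this framing, the snippet below reports the business metric (likely returned mail) rather than raw ‘address accuracy’. All field names and the completeness rule are hypothetical, not taken from any real system:

```python
# Illustrative sketch: tie an address-quality check to a business metric
# (returned mail) rather than reporting raw "accuracy". Field names and
# the completeness rule are hypothetical.

def is_deliverable(record):
    """Assume a record is deliverable only if every core address field is present."""
    required = ("street", "city", "postal_code")
    return all(record.get(field) for field in required)

def returned_mail_rate(records):
    """Estimate the share of mailings likely to be returned due to bad addresses."""
    if not records:
        return 0.0
    undeliverable = sum(1 for r in records if not is_deliverable(r))
    return undeliverable / len(records)

customers = [
    {"street": "1 Main Rd", "city": "Cape Town", "postal_code": "8001"},
    {"street": "22 Oak Ave", "city": "Johannesburg", "postal_code": None},
    {"street": "", "city": "Durban", "postal_code": "4001"},
]

print(f"Estimated returned-mail rate: {returned_mail_rate(customers):.0%}")
```

Reporting the number this way gives business a figure it can cost directly (postage, reprints, lost collections), which is what motivates the budget conversation.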
Of course, this approach can lead to duplication of effort. Do we need multiple projects, each with a different end goal all working on the same data?
Pragmatic data governance helps ensure that overlapping business goals are addressed by a single project, rather than by many tactical projects that may impact each other negatively, or that waste resources repeating a task already delivered by another project.
Experience can help link business and IT goals, creating that all-important business buy-in and support.
Data quality will help insurance companies gear up for impending SAM regime
Similar to Basel II and III for the banking industry, at the heart of the Solvency Assessment and Management (SAM) regime is the requirement that risk calculations be based on provably correct data. Data is the key here, and insurance companies should learn from the banking industry’s early attempts to comply with Basel II, where very large, expensive projects were often undertaken that did not address the underlying issue: data management.
According to the Financial Services Board (FSB), ‘the primary purpose of SAM is to improve the protection afforded to policyholders and beneficiaries and encourage insurers to adopt a more sophisticated approach to risk monitoring and risk management.’ Poor data quality will therefore have a significant impact on risk calculations. If insurance companies are not certain of the quality or integrity of their data, they must assume the worst-case scenario when calculating risk, which in turn may raise the required capital holdings to prohibitive levels and make doing business impossible.
Assume, for example, that we have two related client records. One record shows the client to be low risk. The other, due to missing information, causes a high risk assessment. Under Solvency II the second record takes precedence and the client must be assumed to be high risk. This then ties up unnecessary capital and can eat into profitability. Because of an inability to accurately assess risk, and a resulting higher capital holding level, the insurance organisation itself may also be seen as a bad credit risk. This in turn means that money borrowed will be at a far higher interest rate. Aside from the increased cost of doing business, non-compliance with SAM is sure to have other penalties, which may include fines and trading bans, and other consequences similar to those around Solvency II.
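The worst-case rule in the example above can be sketched in a few lines. The risk scale, record fields and completeness check here are hypothetical, intended only to illustrate the principle that an incomplete record drives the overall assessment:

```python
# Illustrative sketch of the worst-case rule: when records linked to one
# client disagree, or one is incomplete, the client is assessed at the
# highest implied risk. Risk scale and record fields are hypothetical.

RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def record_risk(record):
    """Missing information forces the most conservative (high) assessment."""
    if record.get("income") is None or record.get("claims_history") is None:
        return "high"
    return record["assessed_risk"]

def client_risk(records):
    """Across all records linked to one client, the worst case prevails."""
    return max((record_risk(r) for r in records), key=lambda r: RISK_ORDER[r])

records = [
    {"assessed_risk": "low", "income": 50_000, "claims_history": []},
    {"assessed_risk": "low", "income": None, "claims_history": []},  # incomplete
]

print(client_risk(records))  # the incomplete record drives the rating to "high"
```

Under this rule, a single unresolved data gap is enough to tie up extra capital, which is precisely why resolving the gap is cheaper than holding against it.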
Compliance is not optional: it is something all insurance companies must do, and achieving it will require certain systems and processes to be put in place. However, insurers should not make the mistake of addressing SAM compliance as a once-off, standalone project, as any such solution will not be cost effective.
SAM compliance should instead be viewed as a strategic imperative. Insurers should look towards ensuring data governance – in other words, managing data as an asset – within a more holistic and reusable framework. The same systems required for SAM compliance can also be leveraged for other purposes, including addressing other regulatory requirements, such as the Foreign Account Tax Compliance Act (FATCA) or privacy legislation such as the Protection of Personal Information Bill.
This investment will then not only enable compliance with SAM and other legislation, but also deliver benefits such as improved operational efficiency. Data quality improvements and a more accurate view of customers can also be used to provide more effective marketing, more efficient client service and an enhanced customer experience.
The bottom line is that a short-sighted, tactical approach to SAM compliance may well have the effect of culling weaker companies from the herd. However, SAM by no means dooms these companies to failure, since a strategic approach centred around data governance can drive significant value for the organisation. Using this approach, investments in data quality and data management can be leveraged to address multiple needs.
SAM compliance need not be a death sentence for smaller insurers, nor should it be viewed as a money pit that requires huge resources and infrastructure. Rather, it should be viewed as an opportunity to improve the quality of data within insurance organisations and deliver positive returns across the organisation.