James Wartho

Fifth Logic unlocks value with a strong data foundation

Working alongside our client on an initial six-week engagement, Fifth Logic provided a complete and detailed assessment of Data and Technology maturity and strategic readiness.

Assessing data capability against the CMMI Data Maturity Model produced low maturity scores, while detailed staff interviews revealed an organisation in which significant effort went into locating, verifying and gaining confidence in data rather than using data for value-add activities. In other words, highly skilled staff with deep industry knowledge spent their time cleansing and doing things “to data” rather than identifying insights and creating value “with data”. The Client recognised their gaps in capability and were keen to upskill quickly and become self-sufficient, while also adding value early to secure the ongoing support of revenue-generating areas.

Approach

Our team devised a data governance pilot with four key phases, aimed at giving a large number of Client staff exposure to data tools and techniques (driving data literacy) while also demonstrating tangible business results:

1. Data Operating Model

The key at this stage was to ensure data governance supports business strategy. With weekly working group and sponsor meetings to maintain engagement and drive cadence, the team ran a series of fact-finding meetings before presenting early options for an appropriate data operating model. The working group was also engaged to explore a range of possible data principles that resonated with the organisation and aligned with APRA’s prudential practice guide on data risk (CPG 235), as well as with their own strategic and data objectives. As an example, the Client was concerned that they were falling behind the industry in technical capability and suffering a lack of agility and ability to deliver automated insights, so when the data principles were drafted, a statement was added expressing a strong preference for automated solutions.

With principles in place and a draft operating model agreed, a draft data policy was created, all within four weeks of starting the engagement. The involvement of Client teams helped create a sense that the model was “theirs” and not “the consultants’”, while also providing the opportunity to educate teams on concepts such as data domains, critical usages, critical data elements, data ownership, data stewardship and the data lifecycle. It helped that the framework would be tested throughout all remaining phases of the pilot: effectively a money-back guarantee!

An interesting observation at this stage was the increase in agency this approach produced. Spending time with team members breaking the company’s data “problem” down into business categories (domains) and identifying the critical data (elements) within them immediately made the problem feel more manageable. It could be broken into pragmatic, risk-based “chunks”, reducing regulatory risk while also minimising delivery risk and creating a task that those responsible felt able to achieve.
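To illustrate why this decomposition makes the problem feel countable, the sketch below models domains and critical data elements as simple records. The domain names, owners and elements are invented for illustration; they are not the Client’s actual structures.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalDataElement:
    name: str
    definition: str

@dataclass
class DataDomain:
    name: str
    owner: str  # accountable business owner for the domain
    elements: list = field(default_factory=list)

# Breaking "all company data" into domains, each with a handful of
# critical data elements, turns an open-ended problem into a finite,
# risk-based backlog that owners can commit to.
customer = DataDomain("Customer", owner="Head of Operations")
customer.elements += [
    CriticalDataElement("customer_id", "Unique identifier for a customer"),
    CriticalDataElement("date_of_birth", "Used in eligibility checks"),
]

finance = DataDomain("Finance", owner="Chief Financial Officer")
finance.elements.append(
    CriticalDataElement("premium_amount", "Premium charged per policy period"),
)

domains = [customer, finance]
total_cdes = sum(len(d.elements) for d in domains)
print(f"{len(domains)} domains, {total_cdes} critical data elements")
```

The point of the sketch is the final count: a bounded list of domains and elements is something a team can plan against, where “all our data” is not.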

2. Data Issues

The second phase of the pilot focussed on surfacing and quantifying data issues. In many organisations, data programs stutter when those close to the data know the magnitude of the problem but are unable to demonstrate the challenge convincingly to the wider organisation. Through a series of approximately 30 facilitated workshops, teams were trained on, and helped to identify, data issues across all aspects of the organisation’s data lifecycle. Rich data was gathered across more than 20 variables, including the impacted data quality dimension (e.g., Completeness, Accuracy), required fix timeframes and detailed impact, allowing rapid triage and the creation of items for immediate focus, quick wins and strategic, high-value focus areas. Through this phase, areas of the organisation that had shown initial scepticism transitioned to full buy-in as the potential for error, the opportunity for duplication and the cost of the problem (and conversely, the size of the possible prize) became clear. With issues presented in their solution categories, all levels of the organisation were able to see where data governance uplift was required, which challenges could be solved by improved data management and data quality management, where a data architecture solution was required, or where an alternative solution was needed, and when it would be appropriate to deliver each.

Significant learning also occurred for management at this stage as the variety of issues became clear. Quick wins that pointed to a lack of training were addressed during the project, while a set of issues with the potential to impact financial statements and customers was given immediate focus.
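A register like the one described above can be sketched as a few typed fields plus a triage rule. The field names, scoring scales and thresholds below are assumptions for illustration, not the actual register or criteria used on the engagement.

```python
from dataclasses import dataclass

@dataclass
class DataIssue:
    title: str
    dq_dimension: str  # e.g. "Completeness", "Accuracy", "Timeliness"
    impact: int        # 1 (minor) .. 5 (e.g. financial statement / customer risk)
    effort: int        # 1 (trivial fix) .. 5 (major programme of work)

def triage(issue: DataIssue) -> str:
    """Bucket an issue into the pilot's three prioritisation categories."""
    if issue.impact >= 4:
        return "immediate focus"
    if issue.effort <= 2:
        return "quick win"
    return "strategic"

# Invented example entries for a workshop-built issues register.
register = [
    DataIssue("Missing dates of birth on legacy records", "Completeness", 4, 3),
    DataIssue("Inconsistent product codes between systems", "Consistency", 3, 2),
    DataIssue("Manual re-keying of monthly returns", "Timeliness", 3, 5),
]

for issue in register:
    print(f"{issue.title}: {triage(issue)}")
```

Capturing issues in a consistent structure is what allows the rapid triage the text describes: once every workshop output carries the same variables, sorting hundreds of issues into immediate, quick-win and strategic buckets becomes mechanical.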

3. Data Lineage

Around week 10 the initiative moved to piloting data lineage across two critical usages (in this case, both regulatory reports). The purpose of this phase was multi-faceted:

  • Introduce the Client to the benefits of mapping detailed data flows (data lineage)

  • Deliver tangible data process improvements to reduce regulatory burden

  • Build a business data glossary and assign roles as per the data framework

This third phase of the project further embedded best practice into the Client organisation. Within a matter of weeks, process challenges and data issues (the majority of which had been found in the data issues phase) had been “brought to life” for sponsors and project participants. By delivering full lineage rapidly with an engaged Client, the importance of detailed data flow mapping, data definitions, ownership and quality rules could be showcased. As examples:

  • A Client form did not specifically request a data attribute and instead many hours were being spent trying to “reverse engineer” that attribute from unstructured text.

  • Administration staff were being asked to quantify issues to three levels of specificity (giving over 70,000 choices) when the end consumer of the data only required the highest level (approximately 25 choices), leading to significant additional complexity, wasted time and potential for unnecessary error.

  • The addition of a data domain owner allowed the process of agreeing additional quality controls to start without further delay.

While still in the pilot phase, the organisation was at this stage effectively up and running with the process of “bringing data under governance”: assigning data ownership, agreeing definitions and quality rules, and populating an initial data glossary. The process created data advocates who will ensure the continued success of the data program long after the pilot completes.
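In miniature, data lineage is a directed graph from report fields back to source systems, with glossary entries carrying the agreed definition, owner and quality rules. The sketch below uses invented system and field names purely to show the shape of the idea.

```python
# Hypothetical lineage: each node lists the upstream nodes that feed it.
lineage = {
    "regulatory_report.total_exposure": ["warehouse.exposure_summary"],
    "warehouse.exposure_summary": [
        "policy_admin.policy_value",
        "claims.open_reserve",
    ],
}

# A glossary entry pairs a business definition with ownership and
# quality rules, as assigned under the data framework. Contents invented.
glossary = {
    "total_exposure": {
        "definition": "Sum of policy values and open claim reserves",
        "owner": "Finance data domain owner",
        "quality_rules": ["non-negative", "reconciles to general ledger"],
    },
}

def upstream(node: str, graph: dict) -> set:
    """Walk the lineage graph to find every source feeding a field."""
    sources = set()
    for parent in graph.get(node, []):
        sources.add(parent)
        sources |= upstream(parent, graph)
    return sources

print(sorted(upstream("regulatory_report.total_exposure", lineage)))
```

Even at this toy scale, the graph makes the phase’s findings concrete: once a report field’s full set of sources is explicit, a missing form attribute or an over-specified input is visible on the map rather than buried in tribal knowledge.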


4. Revisit Data Framework and Deliver Roadmap

The final phase of the pilot gave the Client the opportunity to adjust the data framework and actively participate in the creation of long-term data objectives, guided by Fifth Logic experts but based on lived experience.

The Outcome


The hands-on “pilot” approach meant a high level of engagement, with around 20% of the organisation volunteering for some form of data role. Staff members actively participated in a review of the draft data framework, making changes based on experience and helping to ensure the framework stands the test of time. Alongside the Client, the Fifth Logic team were able to draw on their industry experience to deliver a long-term data roadmap that builds on the success of the pilot. The Client has a strong focus on data and a deep belief in the long-term plan. They now have the skills to prioritise the data sets to focus on from a lineage perspective to improve their risk profile, surface the quality initiatives that will provide the greatest increase in capacity to serve customers, and target the technology initiatives that will resolve the largest number of underlying data challenges. All of this activity will drive them towards their strategic, data-driven goals.
