Our client’s opportunities focus on ensuring data quality and consistency of use as the company migrates and integrates new data sources resulting from acquisitions and new business models. The company is expanding through acquisition and has identified the integration of digital assets as essential to achieving operational and strategic synergies. Additionally, accurate and timely data delivery to clients is at the core of its business model, and advanced data management offers an opportunity for competitive differentiation. The result will be clear lineage and a consistent language for data across its lifecycle, from creation to consumption.
Select Performance Gaps
• Problems not addressed by off-the-shelf solutions (Organizational Singularity): Consolidating systems had to reflect the unique needs and circumstances of the client’s data-heavy business model and produce a single data model with consistent data definitions.
• No single source of truth (Analytical Limitations): Many disparate sources of operational, financial, and customer data all had different structures and content, with no clear mapping to each other.
• Mitigating disruption of change (Organizational Singularity): Data quality and transformation activities needed to happen rapidly in order to prove acquisition synergies to senior stakeholders and ensure no loss of service quality for clients.
The goal: develop a platform that accepts data from disparate systems, analyzes the content against established business rules, cleanses the data, and migrates it into the single enterprise structure while reporting on any transformation that has occurred. Our client approached us, and we began with an assessment and analysis of the existing internal data and the establishment of a single target data model. Together we designed the business rules applicable to this new model. We then created a platform for the codification and execution of those rules to clean, change, and/or transform both internal data and new data arriving from acquisitions. Source-to-target mapping was established, and the data cleansing and migration process was executed, along with reporting functionality to show how much data was being touched and changed throughout the process.
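The codify-execute-report flow described above can be sketched as follows. This is a minimal illustration, not the actual platform: the rule contents, field names, and record values are all hypothetical, and only the simplest rule type (a direct text-to-text remediation) is shown.

```python
# Sketch of a rule-driven cleansing pass: each rule maps a source value to
# a target value, and every change is counted so the migration can report
# on how much data was touched. All rule and field names are hypothetical.

REMEDIATION_RULES = {            # direct text-to-text remediation rules
    "N/A": None,                 # normalize a placeholder to a true null
    "Untied Kingdom": "United Kingdom",
}

def cleanse_record(record, rules, report):
    """Apply text-to-text rules to one record, logging each change."""
    cleaned = {}
    for field, value in record.items():
        if value in rules:
            report[field] = report.get(field, 0) + 1
            cleaned[field] = rules[value]
        else:
            cleaned[field] = value
    return cleaned

report = {}
source = [{"country": "Untied Kingdom", "notes": "N/A"}]
migrated = [cleanse_record(r, REMEDIATION_RULES, report) for r in source]
# `report` now holds a per-field count of values changed during migration
```

In the real engagement, thousands of such rules were captured from stakeholders and executed in the database layer rather than in application code; the per-field change counts correspond to the transformation reporting described above.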
Given the time-sensitive nature of the project, we helped accelerate the timetable, completing the entire project in six weeks, through:
• 27 daily status meetings
• 82 data element assessments completed
• 51 SME assessments completed
• 19 repeatable cleansing functions created
• 29 targeted scrub procedures created
• 2,319 direct text-to-text remediation rules
After understanding the business requirements, we built a customized data product: a SQL Server database with generalized procedures that accept schema, table, and column names (so that data scrubbing operations can be performed on arbitrary tables), applying scrubbing rules defined by the project stakeholders over the course of the engagement. Data was fed through and emerged in a clean, consistent format ready for consumption by the end data platform.
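The shape of such a generalized scrub routine can be sketched as below. This is an assumption-laden illustration in Python with SQLite, not the project's T-SQL: the actual product used SQL Server stored procedures, and the table, column, and rule values here are invented. The key idea it demonstrates is that identifiers (table and column names) cannot be bound as query parameters, so they are validated against the catalog before being interpolated, while the rule values themselves are bound safely.

```python
import sqlite3

def scrub_column(conn, table, column, old, new):
    """Apply one text-to-text remediation rule to any table/column pair.

    Identifiers are checked against the catalog before interpolation,
    since SQL parameters can bind values but not identifiers.
    Returns the number of rows touched, for transformation reporting.
    """
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in cols:
        raise ValueError(f"unknown column {column!r} on table {table!r}")
    cur = conn.execute(
        f"UPDATE {table} SET {column} = ? WHERE {column} = ?", (new, old)
    )
    return cur.rowcount

# Hypothetical usage: normalize an inconsistent country value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clients (name TEXT, country TEXT)")
conn.executemany("INSERT INTO clients VALUES (?, ?)",
                 [("Acme", "UK"), ("Globex", "U.K.")])
touched = scrub_column(conn, "clients", "country", "U.K.", "UK")
# `touched` feeds the report of how much data was changed
```

Parameterizing the procedure by schema, table, and column name is what makes the scrubbing functions repeatable: the same 19 cleansing functions could be pointed at each newly acquired data source without rewriting them.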
• 31% of data records transformed and improved, out of over 10 million records analyzed
• Tribal knowledge from over 51 SMEs codified for ongoing use
• 19 repeatable cleansing functions created for periodic scrubbing
• 29 targeted scrub procedures created
• 2,319 direct text-to-text remediation rules
Other Wins
• A data community was created to improve SMEs’ ownership of data assets
• Data quality issues were isolated as opportunities for future improvement
• An ongoing process for managing data quality was implemented, including rules to guide future data entry, sourcing, and migration activities