Digital transformation has gradually become a priority in the financial services industry over the past two decades, and the business interruption caused by the Covid-19 pandemic has only served to accelerate it. With digitization, data has become the backbone of the financial system. Over the past few years, financial institutions have focused on adapting their data architecture and business strategies to the changing economic environment and to innovative technologies. One of the most significant reforms for financial institutions today is moving to the cloud to store and compute big data in a centralized environment. Although this gives institutions the capacity to resolve legacy data problems, it also poses new challenges for data management. These challenges include addressing data privacy concerns, improving data agility, ensuring data security and accessibility, and helping consumers gain access to and apply the right data quickly. Additionally, evolving global regulations around data protection and data consumption require organizations to respond in an agile fashion. Failure to keep up with data rules and regulations can result in lost business, reputational risk, and hefty fines. Underpinning all of this is the availability of quality data: well-organized data that is visible and accessible, and that users can extract and analyze with the appropriate protections. Needless to say, implementing a strategic data management process and maintaining a high-quality database is critical to data governance success and regulatory compliance.
There are many benefits to improving data quality:
Improve institutional data integrity and traceability
Provide better insight into business practices and market changes
Incorporate comprehensive data controls and extensive data models
Increase efficiency in data classification and distribution
Reduce data maintenance costs
Improve adaptability to growing regulatory requirements
Support executive decision making
As technology executives and their business stakeholders work through the annual end-of-summer/fall budgeting cycle, highlighting the business benefits and ROI is imperative to ensure appropriate investments in cloud infrastructure are made as firms embark on building a quality database that supports sustainable and resilient enterprises.
How to Build a Quality Database?
Define
The first step in the process is to define the organization’s data management goals and specify the scope of the data. In this phase, the team should align with the institution’s data management policy as well as its short-term and long-term goals. It is important to engage the Chief Data Officer to ensure proper data governance. Success criteria and milestones should also be set concurrently. Lastly, the scope of the data must be specified; this includes the types of data to be captured, the asset classes to be covered, and the business lines to be considered. To ensure accountability, data owners should be specified and responsibilities and delivery timelines assigned to the respective teams.
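For illustration only, the agreed scope, ownership, and success criteria might be captured in a simple machine-readable form that later phases can reference. The data types, asset classes, owners, targets, and milestones in the Python sketch below are hypothetical placeholders, not a prescribed template.

    # Hypothetical sketch of a data scope agreed in the Define phase.
    # All values (data types, asset classes, owners, targets, dates) are illustrative.
    DATA_SCOPE = {
        "data_types": ["trades", "positions", "reference_data"],
        "asset_classes": ["equities", "fixed_income", "fx"],
        "business_lines": ["institutional_sales", "prime_brokerage"],
        "owners": {
            "trades": "front_office_data_team",
            "positions": "operations_data_team",
            "reference_data": "enterprise_data_office",
        },
        "success_criteria": {
            "coverage_target": 0.98,   # share of in-scope records captured
            "accuracy_target": 0.99,   # share of captured records passing validation
        },
        "milestones": {
            "organize_complete": "Q2",
            "go_live": "Q4",
        },
    }

    def owner_for(data_type: str) -> str:
        """Return the accountable owner for a data type, or raise if it is out of scope."""
        try:
            return DATA_SCOPE["owners"][data_type]
        except KeyError:
            raise ValueError(f"{data_type!r} is outside the agreed data scope")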
Organize
The next step is to locate the target database and organize data strategically across the financial institution. IT specialists will establish the connections between the target database and the upstream databases and platforms that feed it. Teams need to agree on the critical data elements (CDEs), the key data attributes required for each data record. To ensure the accuracy and completeness of the CDEs, a preliminary analysis needs to be carried out in which the data hierarchy is assessed and understood, underlying source systems and tables are identified, table-linking strategies are developed, and data gaps and issues are isolated. Subsequently, remediation of the CDEs can begin, including these actions: (1) setting plans and priorities by asset class, country, or business line; (2) tracking remediation progress; and (3) resolving and enriching data from source systems. It is important to incorporate technology tools like Jira, a popular defect-tracking and Agile project management tool, to expedite the issue management process.
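As a sketch of what a preliminary CDE completeness check could look like, the example below flags missing CDE columns and counts null values in a toy extract, producing a simple gap report to feed the remediation backlog. The CDE names, column names, and the use of pandas are assumptions for illustration only.

    import pandas as pd

    # Hypothetical critical data elements (CDEs) agreed for each trade record.
    CDES = ["trade_id", "counterparty_id", "asset_class", "notional", "trade_date"]

    def cde_gap_report(records: pd.DataFrame) -> pd.DataFrame:
        """Flag missing CDE columns and count nulls per CDE in a source extract."""
        rows = []
        for cde in CDES:
            if cde not in records.columns:
                rows.append({"cde": cde, "status": "column missing",
                             "null_count": len(records)})
            else:
                rows.append({"cde": cde, "status": "present",
                             "null_count": int(records[cde].isna().sum())})
        return pd.DataFrame(rows)

    # Toy extract from an upstream source system; trade_date is absent and
    # surfaces as a data gap, while nulls surface as enrichment candidates.
    sample = pd.DataFrame({
        "trade_id": ["T1", "T2", "T3"],
        "counterparty_id": ["C9", None, "C4"],
        "asset_class": ["fx", "equities", "fx"],
        "notional": [1_000_000, 250_000, None],
    })
    print(cde_gap_report(sample))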
Build
Once the data structure has been organized and the CDEs have been validated, the official build phase for the new database can kick off. This phase involves defining the format of the data output, determining the granularity of the data records, building data models, constructing data query logic, and running a proof of concept. It’s essential that the various teams communicate clearly and effectively, and that project stakeholders are closely engaged as the database is being built out, so that any feedback and concerns can be promptly addressed.
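A minimal proof of concept of the query logic might resemble the sketch below, which joins hypothetical source tables into a trade-level output at an assumed granularity. The table names, columns, and the use of an in-memory SQLite database are illustrative assumptions, not the actual target architecture.

    import sqlite3

    # Stand up hypothetical source tables in an in-memory database for the proof of concept.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE trades (trade_id TEXT, counterparty_id TEXT, notional REAL);
        CREATE TABLE counterparties (counterparty_id TEXT, name TEXT, country TEXT);
        INSERT INTO trades VALUES ('T1', 'C9', 1000000), ('T2', 'C4', 250000);
        INSERT INTO counterparties VALUES ('C9', 'Acme Fund', 'US'), ('C4', 'Beta Bank', 'GB');
    """)

    # Query logic defining the output format: one row per trade, enriched with
    # counterparty reference data drawn from the linked source table.
    output = conn.execute("""
        SELECT t.trade_id, c.name AS counterparty, c.country, t.notional
        FROM trades t
        LEFT JOIN counterparties c ON c.counterparty_id = t.counterparty_id
        ORDER BY t.trade_id
    """).fetchall()

    for row in output:
        print(row)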
Test and Iterate
Testing is a critical stage, in which the team is dedicated to finding data issues and validating the data outputs in the database. A customized, sustainable testing strategy that follows a best-practice methodology and leverages cutting-edge testing tools is key to driving the team to success. A sensible approach is to break the testing cycle into multiple iterations. In the first iteration, the team should start with a small subset of data that is a good representation of the full data population; based on the results, the testing team can perform data analysis, identify and document data issues, and thoroughly troubleshoot their root causes. With this analysis in hand, a joint effort is then made to remediate the issues, close data gaps, and find workarounds. By the end of the first iteration, the data queries and data outputs in the target database should be enhanced. A second iteration follows, in which a wider range of data is tested, analyzed, and enriched. Iterations continue until the data coverage and accuracy rates both reach the goals set in the Define phase.
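To illustrate the iteration mechanics, the sketch below samples a progressively wider subset of records and measures coverage and accuracy against targets of the kind set in the Define phase. The toy population, validation rule, and thresholds are all hypothetical; in practice, remediation between iterations is what moves the metrics toward the targets.

    import random

    COVERAGE_TARGET = 0.98   # share of sampled source records found in the target database
    ACCURACY_TARGET = 0.99   # share of found records passing validation

    def run_iteration(population, sample_fraction, validate):
        """Sample a subset of records and measure coverage and accuracy."""
        sample = random.sample(population, max(1, int(len(population) * sample_fraction)))
        covered = [r for r in sample if r["present_in_target"]]
        accurate = [r for r in covered if validate(r)]
        coverage = len(covered) / len(sample)
        accuracy = len(accurate) / len(covered) if covered else 0.0
        return coverage, accuracy

    # Toy population of source records, flagged for presence in the target and validity.
    population = [{"present_in_target": random.random() > 0.05,
                   "value_ok": random.random() > 0.02} for _ in range(10_000)]

    # Start with a small representative subset, then widen the range each iteration.
    for fraction in (0.05, 0.20, 0.50, 1.00):
        coverage, accuracy = run_iteration(population, fraction, lambda r: r["value_ok"])
        print(f"sample={fraction:.0%} coverage={coverage:.2%} accuracy={accuracy:.2%}")
        if coverage >= COVERAGE_TARGET and accuracy >= ACCURACY_TARGET:
            print("targets met; testing complete")
            break
        # Otherwise: document issues, remediate with source-system owners, and re-test.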
Train and Go Live
Now that the database is ready to use, it’s time to set up data access rules and carry out training sessions for the users. A stability test lasting one to three months is also recommended before the database goes live. In addition, business-as-usual (BAU) responsibilities after go-live should be defined at this time; these include the daily activities of managing and reporting data, measuring and reporting on SLAs, logging and resolving data issues, and managing schedules for data purging and database updates.
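As one possible illustration of the data access rules set up before go-live, the sketch below models simple role-based entitlements. The roles, datasets, and permission levels are hypothetical, and real implementations would typically rely on the institution’s entitlement platform rather than application code.

    # Hypothetical role-based access rules applied before go-live.
    ACCESS_RULES = {
        "data_analyst": {"trades": "read", "reference_data": "read"},
        "data_steward": {"trades": "write", "reference_data": "write"},
        "auditor":      {"trades": "read"},
    }

    def can_access(role: str, dataset: str, action: str) -> bool:
        """Return True if the role's entitlement permits the requested action."""
        entitlement = ACCESS_RULES.get(role, {}).get(dataset)
        if entitlement is None:
            return False
        return action == "read" if entitlement == "read" else action in ("read", "write")

    assert can_access("data_analyst", "trades", "read")
    assert not can_access("data_analyst", "trades", "write")
    assert not can_access("auditor", "reference_data", "read")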
A Gateway to New Opportunities
Having a clean, high-quality database up and running is the gateway to new opportunities. Leveraging that database, financial institutions can easily incorporate new data requirements and build systematic controls; by applying the same streamlined data management process, future enhancements can be implemented efficiently to increase the coverage and visibility of source systems and business activities. Large financial institutions with robust databases and data management frameworks gain better oversight of and business intelligence on their global trading activities, build broader surveillance data repositories, adapt faster to growing internal audit and regulatory requirements, and reduce capital charges more effectively.
While innovative technology and data automation tools grant the financial industry great power, that power also comes with great responsibility. With the explosion of regulations and growing consumer concern around how data can and cannot be used, it’s more important than ever for organizations to address compliance requirements. Leveraging data for business advantage, continuously improving data quality, and setting up rules so that data is used consistently and reliably are essential elements on the path to data success. Furthermore, applying a sustainable data management strategy and maintaining a well-structured database creates a strong foundation for financial institutions to meet changing regulations and stay competitive in today’s growing market.
About Monticello
Monticello Consulting Group is a management consulting firm supporting the financial services industry through deep knowledge and expertise in digital transformation, change management, and financial services advisory. Our understanding of the competitive forces reshaping business models in capital markets and digital banking is a proven enabler that helps our clients drive innovative change programs, become more competitive, and gain market share in new and existing businesses.