Finance in the Fast Lane: Low Latency Data Solutions for a Top Financial Institution

If data is the “new oil” for many companies, speed is the engine. While many companies are still grappling with the proliferation of data, one of Infinitive’s clients, a Top 10 Financial Institution, has tackled the speed problem. The speed at which data is written, processed, and read can mean the difference of millions of dollars and millions of customers in use cases such as fraud detection and real-time marketing decisions. One solution is a low-latency data store that improves query performance and makes data more accessible. This large bank found its existing data ecosystem too slow for some teams within the organization, and it engaged Infinitive to engineer a solution that would deliver faster read and write speeds while maintaining data resiliency.

Challenge

Infinitive was challenged to create a low-latency data store for the financial institution. The client already operated an enterprise data lake and multiple enterprise-grade data warehouses, but none could satisfy the teams’ low-latency requirements: query performance was slow, and the platforms integrated poorly with business intelligence and analytics tools.

Solution

Infinitive and the client evaluated several low-latency options and determined that Amazon DynamoDB was the best fit for their needs. When architected correctly, DynamoDB delivers consistent, single-digit-millisecond read and write performance at scale.

To create a data sink within the client’s larger data ecosystem, Infinitive processed all records through the client’s enterprise Kafka implementation and wrote them to DynamoDB, achieving low write latency through high provisioned throughput and concurrent BatchWriteItem calls. Infinitive also developed a sink management REST API to create and delete tables and global secondary indexes (GSIs), update read and write capacity units (RCUs/WCUs), and manage authorization.
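As an illustration of this write path, the sketch below shows concurrent BatchWriteItem calls in Python with boto3. The table name, record shape, and worker count are hypothetical assumptions for the example, not the client’s actual configuration.

```python
import boto3
from concurrent.futures import ThreadPoolExecutor

TABLE_NAME = "customer-events"  # hypothetical table name
BATCH_SIZE = 25                 # BatchWriteItem accepts at most 25 items per call

dynamodb = boto3.client("dynamodb")  # boto3 clients are safe to share across threads

def write_batch(items):
    """Write up to 25 records with a single BatchWriteItem call.

    Each item must already be in DynamoDB attribute-value format,
    e.g. {"event_id": {"S": "abc-123"}, "amount": {"N": "42.50"}}.
    """
    response = dynamodb.batch_write_item(
        RequestItems={TABLE_NAME: [{"PutRequest": {"Item": i}} for i in items]}
    )
    # Items DynamoDB could not accept (e.g. throttled requests) come back here;
    # backoff-and-retry handling is sketched in the resiliency section below.
    return response.get("UnprocessedItems", {})

def write_concurrently(records, workers=8):
    """Split records into 25-item chunks and write the chunks in parallel."""
    chunks = [records[i:i + BATCH_SIZE] for i in range(0, len(records), BATCH_SIZE)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(write_batch, chunks))
```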

In addition, Infinitive created a consumption API to read from DynamoDB, layering in DynamoDB Accelerator (DAX) as a cache to boost performance; cached reads achieved microsecond latency. The API also let consumers query primary keys as well as global secondary indexes and use DynamoDB features such as filter and projection expressions.
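The minimal boto3 sketch below shows the kind of read such a consumption API can support: a GSI lookup with filter and projection expressions. The table, index, and attribute names are illustrative, and the DAX SDK, which provides a compatible client, could be swapped in so these reads are served from cache.

```python
import boto3
from boto3.dynamodb.conditions import Key, Attr

# Plain boto3 shown here; all names are hypothetical for the example.
table = boto3.resource("dynamodb").Table("customer-events")

def events_for_customer(customer_id, min_amount):
    """Query a hypothetical GSI, filtering and projecting server-side."""
    response = table.query(
        IndexName="customer_id-index",                    # global secondary index
        KeyConditionExpression=Key("customer_id").eq(customer_id),
        FilterExpression=Attr("amount").gte(min_amount),  # applied after the key match
        ProjectionExpression="event_id, amount, event_time",
    )
    return response["Items"]
```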

To ensure data resiliency, Infinitive implemented its best practices for batch writes, including retrying retryable operations with exponential backoff. Infinitive also created a strategy for handling records that still fail after multiple attempts, preventing data loss.
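One way to realize that retry pattern is sketched here under the same hypothetical setup: BatchWriteItem returns any items it could not accept, and those are retried with exponentially increasing, jittered delays before being handed back for failure handling.

```python
import random
import time

MAX_ATTEMPTS = 5  # illustrative retry budget

def write_with_backoff(dynamodb, request_items):
    """Retry the unprocessed remainder of a BatchWriteItem call.

    Returns an empty dict on success; anything still unwritten after
    MAX_ATTEMPTS is returned so the caller can divert it to a
    failure-handling store instead of silently dropping it.
    """
    for attempt in range(MAX_ATTEMPTS):
        response = dynamodb.batch_write_item(RequestItems=request_items)
        request_items = response.get("UnprocessedItems", {})
        if not request_items:
            return {}
        # Exponential backoff with jitter: ~0.1s, 0.2s, 0.4s, ...
        time.sleep((2 ** attempt) * 0.1 + random.uniform(0, 0.1))
    return request_items
```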

Infinitive established an onboarding process in which data producers defined their table name, primary key, sort key, and record TTL (time to live), which the Infinitive team then provisioned in the form of a write rule.
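The shape of such a write rule and its provisioning might look like the following sketch. The rule fields, key names, capacity numbers, and table details are purely illustrative assumptions, not the client’s actual schema.

```python
import boto3

# Hypothetical write rule a data producer might submit during onboarding.
write_rule = {
    "table_name": "fraud-signals",
    "partition_key": "account_id",  # primary (partition) key
    "sort_key": "event_time",       # sort key
    "ttl_attribute": "expires_at",  # per-record expiry epoch timestamp
    "rcu": 100,                     # read capacity units
    "wcu": 500,                     # write capacity units
}

dynamodb = boto3.client("dynamodb")

def provision(rule):
    """Create the table a write rule describes and enable record TTL."""
    dynamodb.create_table(
        TableName=rule["table_name"],
        AttributeDefinitions=[
            {"AttributeName": rule["partition_key"], "AttributeType": "S"},
            {"AttributeName": rule["sort_key"], "AttributeType": "S"},
        ],
        KeySchema=[
            {"AttributeName": rule["partition_key"], "KeyType": "HASH"},
            {"AttributeName": rule["sort_key"], "KeyType": "RANGE"},
        ],
        ProvisionedThroughput={
            "ReadCapacityUnits": rule["rcu"],
            "WriteCapacityUnits": rule["wcu"],
        },
    )
    dynamodb.get_waiter("table_exists").wait(TableName=rule["table_name"])
    dynamodb.update_time_to_live(
        TableName=rule["table_name"],
        TimeToLiveSpecification={
            "AttributeName": rule["ttl_attribute"],
            "Enabled": True,
        },
    )
```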

Finally, Infinitive integrated both applications with the client’s data lineage system so the new solution could take advantage of existing practices and tools.

Outcome

Infinitive delivered a low-latency data store that reduced query latency and made the client’s data more accessible. With DynamoDB and DAX caching, reads reached microsecond latency, and write speeds improved substantially. The sink management REST API and onboarding process made it easy for client teams to begin writing to and consuming from DynamoDB, and integration with the client’s existing data ecosystem made the solution a comprehensive fit for their needs.

Are you ready to get more value out of your data?