One of the Field Data Science (FDS) team's primary functions is working with our sales team and potential customers to assess the best way to prove the value of using Ekata in their production workflow. One of the ways to show this value is through an offline historical data evaluation. While every evaluation is different, this post walks through the standardized components shared by most of them. Keep reading to learn more about our standard data evaluations and the value of our data.
Step 1: Planning
We start by working with the customer to dig into their specific fraud prevention and identity verification needs. Then we set goals for the test, which determine what the test data set needs to look like to achieve them.
A major part of this planning phase is creating our Joint Test Plan (JTP). This document outlines the goals and specifics of the data test so that everyone can reference it at any point during the evaluation. The JTP is a collaborative effort: clear, shared expectations maximize the value of the time all parties invest in the data test.
Step 2: Batch run and analysis
The next phase is the meat of the data evaluation. This analysis aims to demonstrate how Ekata data can be incorporated into a customer’s workflow.
We take the test data set and run it through the API endpoint we are testing, producing an API response for each row of the data evaluation test file. For all of our products, we perform point-in-time testing, returning the signals as they would have appeared at the time the customer collected the data.
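The batch run described above can be sketched as a short script. This is a minimal illustration, not Ekata's actual tooling: `query_api` is a hypothetical stand-in for a real HTTP call to the endpoint under test, and the field names are invented. The key idea it shows is passing each row's original timestamp along so the response reflects point-in-time signals.

```python
import csv
import io

def query_api(row):
    """Hypothetical stand-in for a call to the API endpoint under test.

    A real batch run would issue an HTTP request per row; here we echo
    back a fake response so the batching logic is runnable end to end.
    """
    return {
        "transaction_id": row["transaction_id"],
        # Sending the original transaction timestamp lets the API return
        # signals as they would have been when the customer collected the
        # data (point-in-time testing).
        "as_of": row["timestamp"],
        "network_score": 0.42,  # placeholder signal value
    }

def batch_run(test_file):
    """Run every row of the evaluation test file through the API."""
    reader = csv.DictReader(test_file)
    return [query_api(row) for row in reader]

# A tiny in-memory stand-in for the customer's test file.
sample = io.StringIO(
    "transaction_id,timestamp,email\n"
    "t1,2023-01-05T12:00:00Z,a@example.com\n"
    "t2,2023-01-06T09:30:00Z,b@example.com\n"
)
responses = batch_run(sample)
```

In practice this loop would also handle rate limiting, retries, and writing the responses back out alongside the original rows for analysis.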
The FDS team then analyzes the results against the JTP goals. We supply a basic summary of the resulting signals, then dig deeper into how the customer could build a rule set or incorporate our signals into their machine learning model. We provide either a few rule set options or an example model built on Ekata features to demonstrate feature engineering for our signals.
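To make the rule-set idea concrete, here is a toy scoring function in the spirit of what such a rule set might look like. The signal names (`network_score`, `email_first_seen_days`, `phone_to_name_match`) and thresholds are hypothetical, chosen purely for illustration, and are not actual Ekata response fields or recommended values.

```python
def score_transaction(signals):
    """Toy rule set over hypothetical identity signals.

    Each rule adds risk points; transactions past a threshold are routed
    to manual review instead of being auto-approved.
    """
    risk = 0
    # Rule 1: a high network risk score carries the most weight.
    if signals.get("network_score", 0) > 0.8:
        risk += 2
    # Rule 2: an email first seen recently (or never) is riskier than
    # one with a long observed history.
    if signals.get("email_first_seen_days", 0) <= 90:
        risk += 1
    # Rule 3: the phone number resolving to a different name adds risk.
    if signals.get("phone_to_name_match") == "no_match":
        risk += 1
    return "review" if risk >= 2 else "approve"

decision = score_transaction(
    {"network_score": 0.9, "phone_to_name_match": "no_match"}
)
```

A real evaluation would tune weights and thresholds against the customer's labeled outcomes, or feed the same signals into their model as engineered features instead of hard rules.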
Step 3: Customer analysis
As the final part of the data evaluation, the customer performs their own analysis with the data and recommendations. Generally, they evaluate Ekata data in conjunction with other internal signals to unlock even more value. Once the customer's team has evaluated Ekata's data, they present their findings. Of course, we love hearing that there is value in the data, but we also appreciate feedback for improvement. This feedback is what drives future product development within Ekata.
We strongly believe in our product and value proposition and are pleased to offer these tests to our customers. Plus, we're excited by the challenge of new use cases and business problems that can be solved with Ekata data.