At Ekata, we continuously run proof of concept analyses on large data sets to show our customers how to leverage Ekata APIs. Running data tests is labor-intensive, but when done right, they can yield excellent results. As a senior field data scientist at Ekata, I have a lot of experience, from live integration tests to historical batch tests, with what worked and what did not.
To help teams better understand how to engage with our award-winning APIs, I've put together a list of five proof of concept questions any team should ask before engaging us.
1) Do You Have a Business Problem?
Ekata APIs are meant to solve problems, such as evaluating the likelihood of fraud, or streamlining the customer experience with well-established digital identities.
2) Can You Measure the Business Problem?
If fraud is the business problem, are the transactions or account sign-ups labeled as fraud? If not, can something else be used as a proxy for fraud? For instance, is there too much customer friction or transaction rejection? Is there a way to identify customers who may have been rejected in error? Linking call center interactions to transactions is often a useful proxy for identifying the problem.
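When explicit fraud labels are missing, a proxy metric like the one described above can be computed directly from transaction data. Here is a minimal sketch, assuming hypothetical field names (`rejected`, `call_center_contact`) that are illustrative only, not any customer's actual schema:

```python
# Hypothetical sketch: estimating a proxy "problem rate" when explicit
# fraud labels are missing. Rejected transactions that also triggered a
# call-center interaction stand in for likely fraud or friction events.

def proxy_problem_rate(transactions):
    """Fraction of transactions that were rejected AND escalated to
    the call center, used as a proxy when no fraud label exists."""
    if not transactions:
        return 0.0
    flagged = [
        t for t in transactions
        if t["rejected"] and t["call_center_contact"]
    ]
    return len(flagged) / len(transactions)

sample = [
    {"rejected": True,  "call_center_contact": True},
    {"rejected": True,  "call_center_contact": False},
    {"rejected": False, "call_center_contact": False},
    {"rejected": False, "call_center_contact": True},
]
print(proxy_problem_rate(sample))  # 0.25
```

Whatever proxy you choose, agree on its definition with both sides before the test starts, so everyone is measuring the same thing.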
3) Do You Have a Detailed Test Plan?
No matter what kind of test we do, we write a joint test plan. This plan helps us at Ekata understand the goals of the proof of concept and helps the customer understand what data needs to be pulled. Since a test can take months from an initial conversation to an analysis presentation, the test plan is a useful guide for refreshing everyone’s memory. The best test plans may even include block diagrams for a customer’s workflow.
4) Can You Simulate Reality?
The reason why we conduct tests in a proof of concept is to answer the question: “if I integrated Ekata into my BLANK workflow at BLANK time, what decisions could I have made using that information and what would have been the outcome?”
Ekata uses five identity elements (name, email, IP, phone, and address) to provide responses on a customer's level of risk. These data elements should be tested the same way they would be sent through in production. Simulating how fraudsters work sounds simple, but emulating their attacks often isn't. For example, one customer attempted to simulate fraud by swapping emails and phone numbers between historical records. It wasn't a great way to test our API, as it's simply not how fraudsters behave.
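One simple way to keep a test production-like is to check that each test record carries all five identity elements from a single real interaction, rather than fields stitched together across records. A minimal sketch, with illustrative field names that are not the actual Ekata API schema:

```python
# Hypothetical sketch: verifying that a test record carries the five
# identity elements named above. Field names are illustrative only.

REQUIRED_ELEMENTS = {"name", "email", "ip_address", "phone", "address"}

def is_production_like(record):
    """True if the record includes every identity element that
    production traffic would send."""
    return REQUIRED_ELEMENTS.issubset(record)

record = {
    "name": "Jane Doe",
    "email": "jane.doe@example.com",
    "ip_address": "203.0.113.7",
    "phone": "+12065550100",
    "address": "101 Example Ave, Seattle, WA",
}
print(is_production_like(record))  # True
```

A check like this catches gaps before any API calls are made, which keeps the test results representative of what production would see.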
5) Will You Communicate Throughout the Test?
No matter how well you plan, unexpected things may come up during a test. It could be something as simple as a mismatched API mapping, or a new metric to analyze. Both sides should plan to ask questions and check in with one another throughout the process.
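A mismatched API mapping, as mentioned above, is one of the easiest issues to catch early. A quick sanity check can compare the fields a customer plans to send against the expected identity elements; this sketch uses hypothetical field names for illustration:

```python
# Hypothetical sketch: surfacing field-mapping mismatches before a
# test run, so they are discussed up front rather than mid-test.

EXPECTED_FIELDS = {"name", "email", "ip_address", "phone", "address"}

def mapping_gaps(customer_fields):
    """Return (missing, extra): expected fields the customer extract
    omits, and fields it sends that have no expected counterpart."""
    missing = sorted(EXPECTED_FIELDS - set(customer_fields))
    extra = sorted(set(customer_fields) - EXPECTED_FIELDS)
    return missing, extra

missing, extra = mapping_gaps(["name", "email", "phone_number", "address"])
print(missing)  # ['ip_address', 'phone']
print(extra)    # ['phone_number']
```

Here `phone_number` would prompt a quick conversation: is it the same thing as `phone`, and if so, who renames it? Resolving that in the test plan beats discovering it in the results.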
The field data scientists at Ekata are here to provide guidance. The better we can understand a customer’s problem and goals, the better we can tailor our advice.
Get in touch with us today and let’s start a conversation.