The rules of employee benefit administration are constantly being reshuffled, governed by rolling updates to federal regulations, evolving guidance from oversight agencies, variations in employee expectations, and relentless attacks from bad actors seeking out private data.
The leadership team at iTedium managed to thrive in this information maelstrom for 20 years, but recently the costs of protecting their data were headed toward unmanageable territory. Almost half of the company’s employees are developers, QA engineers, and testers spread across multiple geographic locations, and all needed access to highly sensitive data.
ROI needed to be improved without impacting security and compliance, so iTedium brought on Youssuf Elkalay, founder of the engineering consultancy 2038 Labs, as an executive engineer.
“The company was spending an inordinate amount of money on AWS Cloud services. In the back of my mind, I always knew we needed some sort of process or mechanism to better deal with this highly regulated data.”
Profitability and efficiency were top concerns for the scalability of their processes as they grew. The company’s core business involves processing massive streams of COBRA-related employee records. That means handling highly sensitive data governed by both strict HIPAA regulations and SOC 2 compliance audits. With the Department of Health and Human Services fining HIPAA violators $1 million or more, there was no room for error.
Like most modern distributed organizations, iTedium struggled to keep its data secure as it moved through its many development and testing environments.
In his cost analysis, Elkalay found too many MySQL instances, with duplicate copies of the same data scattered across pre-production environments. The team was using Amazon Aurora (RDS) to rapidly instantiate MySQL databases.
Developers had to spin up their own databases and grab random subsets of data with few controls in place. The result was database sprawl, wasted time, and excessive AWS costs.
Teams resorted to workarounds, including custom scripts, flat files, open-source fake data generators, and whatever pre-existing dummy data they had on hand.
Beyond the cost of excess AWS usage, these manual processes couldn’t scale with the company’s growing demand for services, and the piecemeal approach left iTedium exposed to a high risk of a data breach.
iTedium’s auditors had become concerned about the massive uptick in data breaches at healthcare companies. Given the sophistication of ransomware, the size of recent breaches, and the resulting damage to brand reputation, insurers now charge much higher rates when they aren’t confident that proper security practices are in place.
Change control across multiple databases and locations was entirely manual at iTedium. When a schema changed and no one remembered to apply fake data generators to the new columns, developers or testers could end up working with real production data.
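That schema-drift risk can be caught mechanically: compare the current schema against the set of columns a de-identification config covers and flag anything unmapped. The sketch below is purely illustrative; the table, column, and config names are hypothetical and do not reflect iTedium’s or Tonic’s actual implementation.

```python
# Hypothetical sketch: flag columns present in a production schema that are
# not yet covered by a de-identification config. All names are illustrative.

def find_unprotected_columns(schema, deid_config):
    """Return (table, column) pairs in `schema` that `deid_config` misses."""
    unprotected = []
    for table, columns in schema.items():
        covered = set(deid_config.get(table, []))
        for column in columns:
            if column not in covered:
                unprotected.append((table, column))
    return unprotected

# Example: an `ssn` column was added, but only `name` is mapped to a
# fake-data generator, so `id` and `ssn` get flagged for review.
schema = {"employees": ["id", "name", "ssn"]}
deid_config = {"employees": ["name"]}
print(find_unprotected_columns(schema, deid_config))
```

Run as part of change control, a check like this turns “no one remembered” into an automated alert rather than a silent data leak.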
What iTedium needed was a tool to handle governance, reliably de-identify data, and alert them to schema changes, all delivered in a simple UI that developers could adopt with minimal training.
Tonic.ai offered all of this out of the box.
Tonic.ai gave iTedium the missing piece they needed, one that fit seamlessly into their existing workflow and brought efficiency, safety, and guardrails for future growth.
To test how Tonic would perform in their environment, the leadership team ran an experiment: they had Tonic generate test data that accurately reproduced the statistical structure of their production data, then handed it to developers and testers without telling them it was de-identified. No one spotted the difference.
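As a toy illustration of what “reproducing the statistical structure” means: fit simple per-column statistics on real values, then sample fresh values that preserve them. Real de-identification tools like Tonic do far more than this (cross-column correlations, formats, referential integrity); this sketch only conveys the core idea, and the salary figures are invented.

```python
import random
import statistics

def mimic_numeric_column(values, n, seed=0):
    """Sample n synthetic values from a normal distribution fitted to `values`."""
    rng = random.Random(seed)  # seeded for reproducibility
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [rng.gauss(mu, sigma) for _ in range(n)]

# Invented example values, not real records.
production_salaries = [52_000, 61_000, 58_500, 49_000, 66_000]
fake_salaries = mimic_numeric_column(production_salaries, 1000)
# The synthetic column has a similar mean and spread, but none of the
# original records appear in it.
```

The point of the experiment above is that data generated this way is statistically indistinguishable in day-to-day development work, yet leaking it exposes nothing real.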
As Elkalay noted, “The development team really appreciates being able to automatically pull in the latest changes. We’re always testing with the latest version of our production schema and [de-identified] data. I don’t have to do anything special to make it work. Just the simplicity of not having to do anything asynchronous to handle de-identification and propagation of the data.”
“We’re always testing with the latest version of our production schema and [de-identified] data. I don’t have to do anything special to make it work.”
Costs have dropped substantially now that there are no hoops to jump through to create working data for development and testing.
For management, the near-zero downtime during the transition was a dream come true: a change that reduced costs and heightened security, with no trade-offs.
Elkalay said, “This rollout didn’t impact developer or QA time at all. There was zero training. That was a pretty big win.”
“This rollout didn’t impact developer or QA time at all. There was zero training. That was a pretty big win.”
From a compliance and risk management standpoint, iTedium is now protected from its most significant risks. Even if data leaks from development or testing, it is fake data that poses no risk to the company.
With Tonic, iTedium gained production-grade test data quality with zero risk. With Tonic, production data ONLY lives in production.
Enable your developers, unblock your data scientists, and respect data privacy as a human right.