
- Can be self-hosted in a VPC that Devis can secure and monitor within their Government cloud at any time
- Adds extra layers of encryption to protect their sensitive PII data automatically
- Connects to 200+ data sources and data targets without requiring any coding or complex configuration
- Continuously synchronizes data between sources and targets
- Replicates and transforms data (as needed) on the fly
Transferring sensitive data between clouds presented several notable issues. The team quickly determined that it was neither feasible, cost-effective, nor sustainable to write hundreds of API scripts to query the proprietary SaaS data source and transfer the data to their Government cloud-based data lake. The team at Devis therefore looked to the commercial and government marketplaces for options to automate, transform and maintain the transfer of this data as part of their operational production environment. Any candidate solution had to meet three hard requirements:
- The solution must either be FedRAMP-approved or be able to be self-hosted in their secure and managed VPC within a Government-approved cloud;
- The tool must be able to connect to a wide variety of data sources and data targets without requiring any coding; and
- The platform must enable them to perform near-real-time replication and transformation (ETL/ELT) of data on the fly to meet the Government’s unique end-point data requirements.
Many of the alternative solutions they examined were either SaaS-based or hosted their VPC within a cloud that was not Government-compliant. These solutions violated the first hard requirement, which reflected the Government mandate to enact and enforce security policies protecting their sensitive PII data. The architecture of those alternatives would have forced Devis's data to transit outside the authorized security boundaries into unprotected and unauthorized cloud and/or on-premises environments before landing back in their Government cloud's secured data lake.
Outside-The-Box Requirements
The conclusions of their analysis pointed to Lyftrondata as the best-fit solution for implementing an automated and secure data pipeline between these two distinctly different cloud platforms. Out of the box, it met their three hard requirements, allowing them to work through the constraints they faced.
Although Lyftrondata is not a FedRAMP-approved product, it can be self-hosted on an EC2 instance within a VPC environment that they can secure and monitor within their Government cloud. Additionally, Lyftrondata enabled them to add more layers of encryption and protect their sensitive PII data automatically.
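To make the extra-encryption idea concrete, here is a minimal sketch of envelope-encrypting selected PII fields with an AWS KMS data key before records land in the data lake. The GovCloud region, key alias, and field names are assumptions for illustration only and do not describe Devis's or Lyftrondata's actual configuration.

```python
# Hedged sketch: application-side envelope encryption for PII fields.
# The KMS key alias, region, and field names below are illustrative assumptions.
import base64

import boto3
from cryptography.fernet import Fernet

kms = boto3.client("kms", region_name="us-gov-west-1")  # assumed GovCloud region


def encrypt_pii(record: dict, pii_fields=("ssn", "email"), key_id="alias/pii-data-key") -> dict:
    """Encrypt selected PII fields and attach the KMS-wrapped data key to the record."""
    data_key = kms.generate_data_key(KeyId=key_id, KeySpec="AES_256")
    fernet = Fernet(base64.urlsafe_b64encode(data_key["Plaintext"]))
    out = dict(record)
    for field in pii_fields:
        if out.get(field) is not None:
            out[field] = fernet.encrypt(str(out[field]).encode()).decode()
    # Store only the wrapped key; the plaintext key never needs to be persisted.
    out["_wrapped_key"] = base64.b64encode(data_key["CiphertextBlob"]).decode()
    return out
```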
It also possesses an impressive suite of 150+ on-premises and cloud data source API connectors, allowing them to connect to a variety of data sources and data targets without requiring any coding or complex configuration. Lyftrondata's architecture allowed them to set up continuous synchronization of selected data between their data sources and data targets.
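The general pattern behind that kind of continuous synchronization is a watermark-based incremental copy: read only the rows changed since the last run and upsert them into the target. The minimal sketch below uses sqlite3 as a stand-in for both ends; the table and column names are hypothetical, and this is not Lyftrondata's internal implementation.

```python
# Hedged sketch of watermark-based incremental sync; sqlite3 stands in for the
# real source and target, and the 'cases' table is a hypothetical example.
import sqlite3


def sync_increment(source_db: str, target_db: str, last_watermark: str) -> str:
    src = sqlite3.connect(source_db)
    tgt = sqlite3.connect(target_db)
    # Pull only rows modified since the last successful run.
    rows = src.execute(
        "SELECT id, payload, updated_at FROM cases "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    # Upsert the changed rows into the target copy of the table.
    tgt.executemany(
        "INSERT OR REPLACE INTO cases (id, payload, updated_at) VALUES (?, ?, ?)",
        rows,
    )
    tgt.commit()
    new_watermark = rows[-1][2] if rows else last_watermark
    src.close()
    tgt.close()
    return new_watermark
```

In production this step would run on a schedule, persist the watermark durably, and also handle deletes; a managed pipeline automates all of that without hand-written code.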
Lastly, Lyftrondata gives them the ability to replicate and transform their data (as needed) on the fly, using either SQL-based internal Lyftrondata stored procedures or their own scripts to perform ETL/ELT operations on the data before it lands in its intended data-target environment. Those targets can include cloud-native services such as Amazon Redshift, AWS S3, and SQL Server/RDS, as well as COTS products such as Tableau, Alteryx, or Databricks, many of which Lyftrondata integrates with natively.
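As a hedged sketch of such an in-flight ETL/ELT step, the snippet below applies a simple transform to replicated rows (normalizing a status column and dropping a raw PII column) and lands the result as Parquet in an S3 bucket. The bucket, prefix, and column names are illustrative assumptions, not part of the actual pipeline.

```python
# Hedged sketch of an on-the-fly transform before landing data in S3.
# Bucket, prefix, and column names are illustrative; needs pandas, pyarrow, boto3.
import io

import boto3
import pandas as pd


def transform_and_land(df: pd.DataFrame,
                       bucket: str = "govcloud-data-lake",
                       prefix: str = "curated/cases/") -> None:
    # Example transform: normalize status values and drop raw PII before landing.
    curated = (
        df.assign(status=df["status"].str.upper())
          .drop(columns=["ssn"], errors="ignore")
    )
    buf = io.BytesIO()
    curated.to_parquet(buf, index=False)
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=f"{prefix}cases.parquet",
        Body=buf.getvalue(),
    )
```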
The Grand Result
Since deploying Lyftrondata into their 24/7/365 global cloud production environment in September 2020, they have been able to migrate data from one cloud-based data source to multiple cloud-based data targets securely, quickly, and without the need to write or manage any code. To date, they are pleased with Lyftrondata's performance, innovation and agility as a critical part of their current production environment's pipeline and tool stack. They plan to continue using Lyftrondata to quickly and effortlessly adapt to their customer's evolving data strategy and production operations to ensure their mission success.
