by Dan DeMers, CEO, Cinchy
The COVID-19 pandemic has stress-tested resources and institutions around the globe. While doctors focus on the healthcare side of things, and accountants on the economic impacts, I find myself examining the data management behind the pandemic response. As someone who has worked in enterprise IT for the past decade, I see the strains of COVID-19 revealing the many places where today’s data management technology falls flat.
It’s unfortunate, but not surprising. Today’s sophisticated healthcare applications are still running on a data integration architecture that’s over 40 years old. That’s like building a race car on a Model T chassis: sure, it can be done, but your final product will always be held back by the limitations of its outdated platform. Fortunately, when it comes to technology, there’s a new platform available. It’s called Data Collaboration.
Data Collaboration is a new way to work with data, and the first real evolution since the database was invented in the seventies. It offers a network-based data fabric and permission-based access to data, instead of mass copying (the bane of today’s data management technology). This lets users protect data privacy and governance, reduce data integration efforts, and create new solutions quickly.
It might sound like a far-off technology, but it’s already being used by some of the world’s top financial institutions, and it is exactly what today’s healthcare enterprises need to manage their data.
Eliminate data copying, embrace data privacy
As mentioned above, data copying is one of the biggest headaches in modern data management. Every new business solution means another data integration project, and every data integration project means more data copies. Data is only ever as secure as its most vulnerable copy, and today’s enterprises often have thousands of copies of data to control.
Data Collaboration helps protect and preserve data privacy by eliminating copies and using universal data access permissions instead. It’s the difference between emailing 10 copies of a Word doc and sharing 10 links to a single Google Doc.
With the Word doc, each of those 10 recipients now has their own copy. Some of them will make changes, which leads to conflicting versions of the same document. Some will forward it to other people, maybe even someone who wasn’t supposed to see it. At least one person will accidentally delete it. In short, as soon as you start copying data, you lose control of it.
Now think of sharing links to a Google Doc. Right off the bat, you can control whether a recipient can edit the document, make suggestions, or just read it. You can control whether or not they can share it with other people. And even if someone deletes it, it takes just a few clicks to restore your preferred version. These are all examples of access controls, the same mechanism Data Collaboration uses to protect data.
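The Google Doc analogy can be sketched in a few lines of code. The sketch below is a hypothetical illustration (the class and method names are invented, not any vendor’s API): one shared object, many users, and per-user rights instead of per-user copies.

```python
from dataclasses import dataclass, field
from enum import Enum

class Right(Enum):
    READ = "read"
    EDIT = "edit"
    SHARE = "share"

@dataclass
class SharedDoc:
    """One source-of-truth document with per-user access rights,
    instead of a copy per recipient."""
    content: str
    acl: dict = field(default_factory=dict)  # user -> set of Rights

    def grant(self, user, rights):
        self.acl[user] = set(rights)

    def read(self, user):
        if Right.READ not in self.acl.get(user, set()):
            raise PermissionError(f"{user} may not read this document")
        return self.content

    def edit(self, user, new_content):
        if Right.EDIT not in self.acl.get(user, set()):
            raise PermissionError(f"{user} may not edit this document")
        self.content = new_content  # every reader sees this; no stale copies

doc = SharedDoc("Q3 budget draft")
doc.grant("alice", [Right.READ, Right.EDIT])
doc.grant("bob", [Right.READ])

doc.edit("alice", "Q3 budget final")
print(doc.read("bob"))  # bob immediately sees alice's edit: one live version

try:
    doc.edit("bob", "oops")  # bob has read-only access
except PermissionError:
    print("bob cannot edit")
```

Because there is only one copy, revoking or tightening a right takes effect everywhere at once, which is exactly what emailing out attachments can never do.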
Building solutions safely
Because Data Collaboration relies on secure access to a single copy of data, it makes it easier to control and protect personal information. For example, there’s been a lot of talk about the exposure-notification SDK from Google and Apple that allows COVID-19 researchers to work with proximity data from people’s phones. This is obviously a major concern for data privacy, as there’s a lot of personally identifiable information at risk. Just like our Word doc example above, there are going to be a lot of data copies flying around without any real control.
With Data Collaboration, access to this personal information is controlled by permissions at the data level. Whenever that data is used, it will retain the same read/edit/share rights. This solves the issue of data privacy automatically, allowing research teams to focus on building solutions instead of worrying about data privacy.
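To make “permissions at the data level” concrete, here is a toy Python sketch (assumed names, not Cinchy’s actual product API): a derived view shares its parent’s access-control list, so rights travel with the data and a single revocation applies everywhere the data is used.

```python
class Dataset:
    """A toy dataset whose access rules live with the data itself.
    Every derived view shares the same ACL object, so permissions
    follow the data rather than each copy of it."""

    def __init__(self, rows, acl):
        self.rows = rows   # list of dict records
        self.acl = acl     # user -> set of rights, e.g. {"read", "edit"}

    def _require(self, user, right):
        if right not in self.acl.get(user, set()):
            raise PermissionError(f"{user!r} lacks {right!r} access")

    def select(self, user, predicate):
        self._require(user, "read")
        # The view references the SAME acl object: later grant/revoke
        # decisions automatically cover every derived view.
        return Dataset([r for r in self.rows if predicate(r)], self.acl)

    def read(self, user):
        self._require(user, "read")
        return self.rows

acl = {"researcher": {"read"}}
cases = Dataset([{"region": "ON", "cases": 120},
                 {"region": "BC", "cases": 45}], acl)

ontario = cases.select("researcher", lambda r: r["region"] == "ON")
print(ontario.read("researcher"))  # the filtered view is still readable

acl["researcher"].clear()          # revoke once, at the data level
try:
    ontario.read("researcher")     # the revocation reaches the derived view
except PermissionError:
    print("access revoked everywhere the data is used")
```

In an integration-and-copy world, that revocation would have to chase down every exported extract; here it happens in one place.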
It’s this sort of effort-reducing benefit that makes Data Collaboration the ideal choice for a quickly changing world. Project timelines shrink from months to weeks, as teams can build new outcomes on established data architectures.
Creating a Data Collaboration Command Center
In the long run, Data Collaboration will allow healthcare agencies to create a comprehensive, rapid-response Command Center connecting datasets from hospitals, pharmaceutical researchers, medical equipment manufacturers, government agencies, and other key stakeholders. Because the Data Collaboration framework eliminates the need for complex data integration projects, it’s suddenly possible for all these different entities to share resources—all while maintaining full control of how their data is accessed and used.
The promise of Data Collaboration is that the more data you have connected to your system, the easier it becomes to build new solutions. Essentially, this is the famous “network effect” being applied to healthcare IT delivery—the more you use it, the faster it gets.
Preparing for the future
This Data Collaboration Command Center would support the rapid deployment of real-time healthcare solutions, saving lives and resources on a massive scale. That’s sorely needed during a pandemic like the one we’re facing now, and it will also provide the technological underpinnings to prevent such calamities in the future.
For example, the agencies operating a Data Collaboration Command Center could continue to connect and protect new data sources in order to develop “early warning systems” in the form of predictive analytics while also enhancing the richness of their COVID-19 emergency solutions for when they’re next needed.
And even once we’ve solved for COVID-19, we’ll still be fighting more familiar outbreaks: seasonal flu, Lyme disease, HIV/AIDS, even future COVID outbreaks. Command Center solutions will help contain and flatten these as well, since the flexibility of Data Collaboration allows researchers to easily adapt solutions from one dataset to the next.
Right now, Data Collaboration is the only way to meet today’s data demands. Things have simply advanced too far to continue relying on the 40-year-old paradigm of buying or building applications and carrying out endless data integration projects. We need to simplify and streamline in order to become more efficient, and Data Collaboration is how we get there.
Dan DeMers is the CEO and co-founder of Cinchy, the global leader in enterprise Data Fabric and Data Collaboration technology. Previously, he spent over a decade as an IT executive with leading global financial institutions where he was responsible for multi-million dollar technology investments. Dan talks regularly at major technology conferences and has recently appeared at TechCrunch Disrupt, Strata Data NYC, and Finovate NYC.