Integrating data across corporate departments can be challenging. There invariably are technical and cultural hurdles that must be cleared.
"We typically think of IT as a back-office function," said Jon Gottsegen, chief data officer for the state of Colorado. Not so in Colorado, he said. A deeply troubled benefits eligibility system -- more than a decade in development and blamed for making improper payments -- had put IT in the limelight, and not in a good way. The newspaper term "above the fold" became part of his vocabulary, Gottsegen grimly joked.
Work on the state's new API integration platform began in 2017 with a major transformation of the infamous benefits system. Partnering with Deloitte, IT rewrote the system's code, migrated services to Salesforce and AWS, and used APIs to drive integration into various databases, Gottsegen said. This helped reduce the amount of time to determine eligibility for benefits from days to minutes -- a major boon for state employees.
Today, the API integration platform -- while still a work in progress -- has dramatically sped up a number of state processes and is paving the way for better data sharing down the road, Gottsegen said.
Speaking at the recent MuleSoft Connect conference in San Francisco, Gottsegen shared the objectives of Colorado's API integration strategy, the major challenges his team has encountered and the lessons learned.
People, of course, should come first in projects of this scope: Delivering better services to the people of Colorado was aim No. 1 of his team's API integration platform, Gottsegen said. Security, data governance and corporate culture also demand attention.
Becoming the 'Amazon of state services'
The task before Gottsegen and his group was to create a process for rolling out APIs that work seamlessly across dozens of different agencies. "Ideally, we want to be the Amazon of state services," he said of IT's grand mission.
Developers had to learn how to connect systems to databases that were regulated in different ways. Gottsegen's team spent a lot of time putting together a comprehensive platform, which was important for integration, he said. It was also important to deliver the APIs in a way that they could be easily consumed by the various state agencies. One goal was to ensure that new APIs were reusable.
Part of the work also involved looking at how services relate to each other. For example, if someone is getting Medicaid, there is a good chance they are also eligible for housing services. The API platform had to support the data integration that helps automate these kinds of cross-agency processes, he said.
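The cross-agency automation Gottsegen describes can be pictured as a thin rules layer sitting on top of each agency's API. The sketch below is purely illustrative -- the program names and referral rules are hypothetical stand-ins, not Colorado's actual mapping -- but it shows the idea: enrollment data pulled from one agency suggests other programs worth screening for.

```python
# Hypothetical referral rules: enrollment in one program suggests
# likely eligibility for others (illustrative pairs only).
REFERRAL_RULES = {
    "medicaid": {"housing_assistance", "snap"},
    "snap": {"medicaid"},
}

def cross_agency_referrals(enrolled_programs):
    """Given the programs a resident is enrolled in (e.g., as returned
    by each agency's API), return other programs worth screening for."""
    suggested = set()
    for program in enrolled_programs:
        suggested |= REFERRAL_RULES.get(program, set())
    # Don't suggest programs the resident already receives.
    return sorted(suggested - set(enrolled_programs))
```

For example, `cross_agency_referrals(["medicaid"])` returns `["housing_assistance", "snap"]`. In a real deployment, the rules table would itself be governed data, maintained by the agencies that own the programs.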
Getting the API program off the ground was not just about solving the technical problems. When communicating with technology personnel across agencies, Gottsegen said it was important to convey that the API integration platform is about better serving the residents of Colorado.
Learning from contractors
IT did not go it alone. Gottsegen said the state worked with a variety of contractors to speed up its API development process. This included working with MuleSoft to roll out a more expansive API management tier. IT also hired some contractors with integration expertise to kick-start the project. But he added that it was important to ensure the knowledge involved in building the APIs was retained after the contracts end.
"We want our teams to sit next to those contractors to ensure the knowledge of those contractors gets internalized. There have been many cases where state workers did not know how to maintain something after the contractor has left," he said.
Good metrics, communication critical to API integration success
Before Gottsegen's team launched a formal API integration program, no one was tracking how long it took agencies to set up a working data integration process. Anecdotal examples of problems would emerge, including stories of agencies that spent over a year negotiating how to set up a data exchange.
The team now has formal metrics to track time to implementation, but the absence of historical metrics makes it impossible to measure precisely how much the new API integration platform has sped up data exchange compared with before.
In any case, expediting the data exchange process is not just about having a more all-encompassing integration tier, Gottsegen stressed. Better communication between departments is also needed.
As it rolls out the API integration platform, IT is working with the agencies to identify any compliance issues and find a framework to address them.
Each agency oversees its own data collection and determines where it can be used, Gottsegen said. There are also various privacy regulations to conform to, including HIPAA and IRS 1075.
"One of the reasons we pursued MuleSoft was so we could demonstrate auditable and consistent security and governance of the data," he said.
Navigating the distinctions between privacy and security is a big challenge, he said. Each agency is responsible for enforcing restrictions on how its data is used; that task is not assigned to a centralized government group because the agency is the expert on its own privacy regulations. At the same time, Gottsegen's group can build better security into the API integration mechanisms used to exchange data between agencies.
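That division of responsibility -- agencies own the data-use policy, the central platform enforces and audits it -- can be sketched as follows. All of the agency names, data categories and policy entries here are hypothetical examples, not the state's actual policies.

```python
# Hypothetical per-agency data-use policies: each agency, as the expert
# on its own regulations, declares which consumers may read which data
# categories (illustrative values only).
AGENCY_POLICIES = {
    "health": {"eligibility_status": {"human_services", "housing"}},
    "revenue": {"income_band": {"human_services"}},
}

AUDIT_LOG = []

def authorize(owner, category, consumer):
    """Central gateway check: the owning agency's policy decides whether
    access is allowed, and every decision is logged for audit."""
    allowed = consumer in AGENCY_POLICIES.get(owner, {}).get(category, set())
    AUDIT_LOG.append((owner, category, consumer, allowed))
    return allowed
```

Here `authorize("health", "eligibility_status", "housing")` is allowed, while `authorize("revenue", "income_band", "housing")` is denied -- and both decisions land in the audit trail, which is the "auditable and consistent" property Gottsegen cites.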
To provide API integration security, Gottsegen created a DevOps toolchain run by a statewide center for enablement. This included a set of vetted tools and practices that agencies could adopt to speed the development of new integrations (dev) and the process of pushing them into production (ops) safely.
Gottsegen said the group is developing practices to build capabilities that can be adopted across the state, but progress is uneven. He said the group has seen mixed results in getting buy-in from agencies.
Improving data quality across agencies
Gottsegen's team has also launched a joint agency interoperability project for the integration of over 45 different data systems across the state. The aim is to build a sturdy data governance process across groups. The first question being addressed is data quality, in particular to ensure a consistent digital ID of citizens. "To be honest, I'm not sure we have a quality measure across the state," Gottsegen said.
Gottsegen believes that data quality is not about being good or bad, but about fitness for use. It is not easy to articulate which particular data set is appropriate for a given use across agencies.
"Data quality should be a partnership between agencies and IT," he said. His team often gets requests to integrate data across agencies. The challenge is how to provide the tools to do that. The agencies need to be able to describe the idiosyncrasies of how they collect data in order to come up with a standard. Down the road, Gottsegen hopes machine learning will help improve this process.
Building trust with state IT leaders
A lot of state initiatives are driven from the top down. But, if workers don't like a directive, they can often wait things out until a new government is elected. Gottsegen found that building trust among IT leaders across state agencies was key in growing the API program. "Trust is important -- not just in technology changes, but in data sharing as well," he said.
And face-to-face connections matter. In launching its API integration platform, he said, it was important for IT leaders across organizations to learn each other's names and to meet in person, even when phone calls or video conferences might be more convenient.
As for the future, Gottsegen has a vision that all data sharing will eventually happen through API integrations. But getting there is a long process. "That might be 10 years out -- if it happens. We keep that goal in mind while working with our collaborators to build things out."