Zaki Saleh, vice president and general manager of Global Health Business at Peraton, recently spoke with ExecutiveBiz regarding how the federal health sector continues to be influenced by IT modernization and how improvements in network, data and platforms capabilities.
In addition, Saleh discussed the challenges of zero trust implementation and the renewed focus on data security for our government agencies. He also spoke about the strides our industry has made in emerging technologies as well as the unique challenges on the business side of innovation during the latest Executive Spotlight interview.
“The core of zero trust architecture is around the policy administrator and policy engine. That is the determination point for all access and without their approval, no connection will be allowed. This means there should be heavy emphasis on the configuration and maintenance of these two aspects of zero trust architecture.”
You can read the full interview with Zaki Saleh below:
ExecutiveBiz: With zero trust technology becoming a major focal point moving forward, what can you tell us about the difficulties of implementing zero trust architectures and focusing on data security?
Zaki Saleh: “Agencies have morphed their security architectures over time to meet the threats they face. Most of those architectures followed a ‘castle-and-moat’ construct: trust everyone inside the castle and no one on the other side of the moat.
This was reasonably effective; however, as we have moved to cloud-based computing and have adopted a more hybrid and distributed workforce strategy, perimeters have become blurred, and the theft of credentials has increased.
Over the last few years, Peraton has been working with different zero trust initiatives alongside federal, state, and local governments as well as the Department of Defense to address this digital shift. As a result, we have identified key challenges and solutions to be successful.
First and foremost, zero trust architecture isn’t a ‘one-size-fits-all’ situation. Every agency has nuances – for example, varying numbers of legacy applications, different architectures, various investments in software to perform some of the zero trust functions.
There is a need to properly assess the organization’s current cybersecurity state against the zero trust concept tenets to create a customized zero trust architecture with the right software applications effectively integrated into a cost-effective cyber defense solution.
To implement zero trust architectures, one must assess the current landscape. Most agencies have legacy systems that can be very expensive to re-architect to accommodate zero trust architecture. There are also various guidelines such as NIST, GSA, OMB, DoD, and CISA as well as agency-specific guidelines. Understanding which framework/strategy to adopt can be confusing at times.
Implementation of zero trust architecture requires a policy decision point and a policy enforcement point. Essentially, this is where rules are applied to grant, revoke, or deny a user access to an enterprise resource, and where connections can be monitored and terminated.
These new policy decision points may have organizational change impacts, requiring individuals in the organization to operate with less access than they have had historically, or to request permission to regain heightened access they may have had in the past.
In addition, the core of zero trust architecture is around the policy administrator and policy engine. That is the determination point for all access and without their approval, no connection will be allowed. This means there should be heavy emphasis on the configuration and maintenance of these two aspects of zero trust architecture.”
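The separation Saleh describes between the policy engine (which decides) and the policy administrator (which acts on the decision) can be sketched in a few lines. This is a minimal illustrative sketch only; the names, signals, and rules below are hypothetical assumptions, not Peraton's or any agency's actual implementation.

```python
# Minimal sketch of a zero trust policy decision point.
# All names, signals, and rules are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_compliant: bool   # posture signal, e.g. from device management
    mfa_verified: bool       # identity signal, e.g. from the identity provider
    resource: str

def policy_engine_decide(req, allowed):
    """Policy engine: evaluate every request; trust nothing by default."""
    if not (req.device_compliant and req.mfa_verified):
        return False                              # fail closed on missing signals
    return req.resource in allowed.get(req.user, set())

def policy_administrator(req, allowed):
    """Policy administrator: establish or deny the session per the engine."""
    return "ESTABLISH" if policy_engine_decide(req, allowed) else "DENY"

allowed = {"alice": {"payroll-db"}}
print(policy_administrator(AccessRequest("alice", True, True, "payroll-db"), allowed))
print(policy_administrator(AccessRequest("alice", True, False, "payroll-db"), allowed))
```

Note how no connection is allowed without the engine's approval, which is why Saleh emphasizes the configuration and maintenance of these two components.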
ExecutiveBiz: In recent years, what are some of the biggest improvements you’ve seen in the way we talk and think about innovation across the federal sector since the rise of cybersecurity, AI/ML, 5G and other emerging technologies?
Zaki Saleh: “There is quite a bit of buzz around AI/ML in the industry. Let’s take a specific example around COVID and how AI/ML can help with innovations in detecting future disease outbreaks. The US has disease surveillance systems in place.
The current disease surveillance systems, which use medical indicators such as new case rates, lab test results, hospitalization rates, and death rates, did what they were designed to do. But these are lagging indicators of a new disease outbreak.
They cannot be collected and analyzed fast enough by public health officials to immediately detect and act on a new outbreak. It takes time to do this with our current systems. That is not ideal and needs improvement.
Technology can accelerate the detection of a new outbreak by looking for leading indicators of a novel disease. AI and machine learning can examine non-medical indicators from social media, or other signals that something unusual is going on, such as spikes in retail sales of items like cold medicines and facial tissues during the summer, outside the cold season. Predictive analytics is what I am referring to here.
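The spike detection Saleh describes can be illustrated with a simple statistical baseline. This is a toy sketch, not any surveillance system's actual method; the sales figures and z-score threshold are made up for illustration.

```python
# Hypothetical sketch: flag a spike in weekly cold-medicine sales as a
# possible leading indicator. Data and threshold are illustrative only.
import statistics

def spike_weeks(sales, z_threshold=2.5):
    """Return indices of weeks whose sales sit far above the mean (z-score)."""
    mean = statistics.mean(sales)
    stdev = statistics.stdev(sales)
    return [i for i, s in enumerate(sales) if (s - mean) / stdev > z_threshold]

# Ten summer weeks of unit sales; week 8 shows an unseasonal jump.
weekly_sales = [100, 98, 103, 99, 101, 97, 102, 100, 180, 104]
print(spike_weeks(weekly_sales))  # [8]
```

A production system would use a more robust baseline (seasonal models, many signals fused together), but the idea is the same: an unseasonal deviation arrives well before lab results do.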
More rapid collection, aggregation, dissemination, and analysis of medical indicators such as lab tests using AI/ML could also accelerate the detection of, communication about, and planning for a new outbreak, as well as ramp up the medical supply chain in response and speed re-supply.
For example, America’s warfighters can be deployed across the globe where internet connectivity might be hard to come by in a theater setting. Peraton can deploy “on the edge” cloud solutions to address such challenges. We can use 5G to upload real-time data to the cloud and provide our warfighters with the best information to move forward.”
ExecutiveBiz: As the federal health sector continues to be heavily influenced by IT modernization and a wide range of other initiatives to improve its networks, platforms, and data, what are the greatest improvements that are being developed in federal health and what else needs to be addressed?
Zaki Saleh: “Another great question. I will highlight data modernization in healthcare as one of the areas where we are seeing great improvements. Healthcare in the US is full of disconnected data sets that cannot be viewed or acted upon by patients, providers, and payers.
While the broader industry shift to certified electronic health record (EHR) technology delivered legible digital records and more portable data, it came up short in quantifiably improving the cost, effectiveness, and satisfaction of health care services.
On average, it takes 17 years to integrate health care best practices into the flow of medicine. This lag creates inefficiencies (duplication of tests, treatments, etc.), which raises the cost of care and ultimately becomes a barrier to quality care. Assessing care quality in time to make a difference with high-risk patients is simply not feasible due to fragmented and poorly standardized population health data sets.
At Peraton, we looked at harnessing technology to break down the barriers to care – an approach we call ‘Care without Boundaries’ – and built a digital healthcare data integration hub that we brand as HealthConcourse. To realize care without boundaries, we strive to enable complete interoperability of all assets that are shared or of mutual interest to multiple stakeholders.
Namely, we aim to overcome the boundaries prohibiting data interoperability (the sharing of medical records), knowledge interoperability (the sharing of clinical decision support algorithms and models) and process interoperability (the sharing of workflow and situational context).
The end result is that our providers, payers, and patients can focus on the business of providing the best care possible by using HealthConcourse as an interoperable data platform.”
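The data interoperability boundary Saleh mentions – the same patient described in incompatible provider and payer formats – can be sketched as a normalization step into one shared schema. The field names and the common schema below are illustrative assumptions, not HealthConcourse's actual data model.

```python
# Hedged sketch of data interoperability: normalizing two disconnected
# record formats into one common, shared patient record.
# Field names and the target schema are hypothetical.

def to_common_record(source):
    """Map a source-specific record to a shared schema."""
    if "pt_name" in source:          # hypothetical provider EHR format
        return {"name": source["pt_name"], "dob": source["pt_dob"]}
    if "member_name" in source:      # hypothetical payer claims format
        return {"name": source["member_name"], "dob": source["birth_date"]}
    raise ValueError("unknown source format")

ehr = {"pt_name": "Jane Doe", "pt_dob": "1980-04-01"}
claim = {"member_name": "Jane Doe", "birth_date": "1980-04-01"}
print(to_common_record(ehr) == to_common_record(claim))  # True
```

In practice this role is typically played by a standard such as HL7 FHIR, so that providers, payers, and patients all read and write the same resource shapes.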
ExecutiveBiz: We often discuss innovation from the technical or capability side. What are some of the unique challenges that you’ve seen on the business side of innovation that haven’t been addressed or discussed enough?
Zaki Saleh: “Great question about the business side of innovation, not just innovation in technology. Our customers are starting to partner with us on As-a-Service offerings as they are thinking about moving from CapEx to OpEx.
Simply put, CapEx and OpEx are how companies invest. While CapEx refers to capital expenditures for the purchase of goods, OpEx refers to operating costs.
In the world of Everything as a Service, the shift from CapEx to OpEx is a very important discussion. It is similar to the decisions we make as consumers: buy a car upfront (an outlay of CapEx), lease a car for a monthly payment (more like OpEx), or simply rent a car when you need one, forgoing the expense of buying or leasing and the related costs.
Think of OpEx more as renting capability as you need it, scaling up or down as you consume it.
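The buy-versus-rent trade-off above reduces to simple arithmetic over a planning horizon. All dollar figures below are hypothetical, chosen only to show the shape of the comparison.

```python
# Illustrative arithmetic only: upfront purchase (CapEx) versus a
# monthly subscription (OpEx) over a planning horizon. Figures are made up.

def capex_total(purchase_price, annual_maintenance, years):
    """One large upfront outlay, plus ongoing maintenance."""
    return purchase_price + annual_maintenance * years

def opex_total(monthly_fee, years):
    """Pay-as-you-go: no upfront outlay, recurring fees instead."""
    return monthly_fee * 12 * years

print(capex_total(100_000, 10_000, 3))  # 130000
print(opex_total(4_000, 3))             # 144000
```

Raw dollars can favor either side; the point Saleh makes is that the OpEx model also buys flexibility – the subscription can be scaled down or stopped when the capability is no longer consumed, while the purchased asset cannot.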
Our customers need flexibility to scale up or down as they consume resources and capabilities, and they are looking for utility-based pricing. We have offerings for IT as a Service and Business as a Service. A couple of examples of IT as a Service would be hybrid cloud as a service or storage as a service. For Business as a Service, we offer our customers, say, Digital Transformation as a Service or Intelligent Contact Center as a Service.
Peraton recently acquired the as-a-service (aaS) business of ViON Corporation, enhancing our offerings in the design, delivery and governance of critical IT infrastructure for our government customers. Additional aaS offerings give our government customers more flexibility and resiliency with storage, computing and network capacity.”