Cloud Computing - Federal News Network
Helping feds meet their mission.

Ready, set, run: Meeting users’ tech expectations and needs now
Federal News Network | July 18, 2024 | https://federalnewsnetwork.com/federal-insights/2024/07/ready-set-run-meeting-users-tech-expectations-and-needs-now/

Everyone wants everything in real time (or almost). Tech leaders from CACI, Future Tech, MITRE and SAIC share the impact that has on lifecycle management.


This is the fourth article in our IT lifecycle management series, Delivering the tech that delivers for government. It’s Part 2 of a two-part roundtable with enterprise tech leaders from CACI, Future Tech Enterprise, MITRE and SAIC. Find Part 1 here.

ASAP. It’s an acronym thrown around with impunity by almost anyone who has a technology need. For the leaders of technology enterprise operations at government contractors though, delivering capabilities right now is increasingly the desired norm.

Part of that is driven by the needs of the agencies that federal systems integrators work with. Federal agencies often look to their industry partners to provide technology they might not have themselves, said Cedric Sims, senior vice president of enterprise innovation and integration at MITRE.

“We have an obligation to our sponsors to be able to provide some compute and capacity that they don’t have access to now — at scale,” Sims said during the second half of a two-part roundtable discussion for our Federal News Network series, Delivering the tech that delivers for government.

Sims went on to point out that such needs have led MITRE to make investments in technologies ahead of government agencies making their own in some instances. For example, he said, “we’re building out a fairly significant artificial intelligence sandbox.”

For the roundtable, we talked with Sims and also technology leaders from CACI, Future Tech Enterprise and SAIC about how they increasingly deliver technology capabilities at speed to support users across their organizations and also the government organizations and missions that their companies support.

They shared the tactics and technology approaches they’re deploying now to meet these ASAP demands. The discussion homed in on five critical areas that impact delivering and supporting services for users in real or near-real time: cloud, data, security, AI and preparing for the future of lifecycle management. (You can find the first part of the discussion, “How federal contractors put users first — whether employee or fed — to deliver on government missions,” here.)

When cloud and on-prem cross paths

Managing the needs of users as well as preparing for Day 1 readiness on programs increasingly involves cloud — even cloud-like management in the data center.

“We’ve tried to go to a model that’s a little bit more hyperscaler, to where at least from an original equipment manufacturer standpoint, they can provision hardware, new hardware into that environment — creating that hyperscaler environment within the data center,” said Bernie Tomasky, vice president of enterprise IT at SAIC.

That said, he quickly added that ultimately “it’s all about trying to drive everybody into a hyperscaler cloud.”

It’s now more common for agencies to ask about shutting down data centers rather than standing them up for new programs — and how to maximize their existing capital investment while leaning into the cloud for scale, said Erik Nelson, senior vice president of the enterprise IT operating group at CACI.

On-premise and cloud environments often present competing needs, he said, especially for agencies with missions that take them to remote locations where they must deliver technology capabilities for temporary operations. Think the military services or the Federal Emergency Management Agency, for instance.

“It’s being able to figure out how to configure what is available to be out there in that austere location. And you don’t have a lot of time to deploy it,” Nelson said. “So you have to figure out how to kind of MacGyver some things to make things work. What is important about that is to have all the smart people in a room and be able to say, ‘Hey, here’s how this is going to work here.’ Then, test it out, pilot and deploy pretty quickly.”

The ability to easily navigate between cloud and on-premise environments is critical, said Rick Schindel, leader of federal systems integrator programs at Future Tech Enterprise.

“The OEMs have done a good job in kind of reinventing their as-a-service model,” he said, adding that it led Future Tech to develop Day 1 Readiness capability that blends the OEM elements with any mission-unique technology that may be required as FSIs work with agencies.

Security and data at the forefront of enterprise lifecycle services

Although everyone agreed that agencies no longer resist migrating to the cloud, security still leads government organizations to keep some systems on premise. But the ability to support users interactively, provide services as needed, and proactively manage lifecycle and cybersecurity is pushing agencies to actively embrace cloud’s consumption-based and operational expenditure model, Tomasky said.

“Data security is paramount to what the customer is thinking when they typically are on-prem. But by and large, the obstacles they’re already facing by having on-prem solutions, they want to get past,” he said.

The data security focus ties directly to the government’s numerous cyber requirements, such as establishing zero trust architectures (with a fall 2024 deadline looming for agencies) and ensuring supply chain risk management through vendors providing third-party verification.

What is known about each industry provider’s infrastructure has become critical with the move to software as a service and multiuser platforms, Sims said. Plus, it’s common to integrate multiple vendors’ products and tools for federal projects or to host data and environments on the large cloud services providers’ platforms, he pointed out. Going forward, this makes visibility and transparency essential, Sims said.

It also requires reimagining how agencies and vendors manage data so that users can access exactly what they need when they need it to do their jobs, Nelson said. Yes, agencies and their industry partners are implementing least-privilege and role-based access models, but rethinking privacy and classification practices and extracting data selectively must occur in tandem, he advised. This is particularly challenging as the federal government houses vast stores of personally identifiable, sensitive and classified information.

“Being able to, to plan out, ‘Hey, this is the portion of that data that’s really important. Everything else around it, we can do something different,’ ” Nelson said. “It’s really a cultural change for all these government agencies because it’s easier to classify something than to say, ‘Well, only this is classified.’”

AI and the future of lifecycle management

Artificial intelligence should help provide answers to those data needs — not just culling data but also managing end user devices and meeting users’ needs, Sims added.

At MITRE, AI and machine learning are “being used for things that we could have never imagined,” he said. Its network engineers apply AI models to log data to identify functions on the network that may not be performing as anticipated.

“We have some very bespoke capabilities around some of our security capability logs that come in,” Sims explained. “It allows our staff, our talent, to really explore: What does it actually mean to approach these problems in a different way? And we’ve seen some really impactful outcomes because of it.”
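MITRE has not published the internals of these models, but the general pattern Sims describes, flagging network functions whose behavior drifts from a learned baseline, can be illustrated with a minimal rolling z-score detector over per-function log metrics. The metric name, window size and threshold below are hypothetical, not MITRE's.

```python
# Minimal sketch of baseline-drift detection over per-function network metrics.
# The metric name, window size and threshold are illustrative, not MITRE's.
from collections import deque
from statistics import mean, stdev

class RollingBaseline:
    """Keep a rolling window per metric and flag values far from the baseline."""

    def __init__(self, window: int = 288, z_threshold: float = 4.0):
        self.window = window
        self.z_threshold = z_threshold
        self.history: dict[str, deque] = {}

    def observe(self, metric: str, value: float) -> bool:
        """Record a sample; return True if it looks anomalous against the baseline."""
        hist = self.history.setdefault(metric, deque(maxlen=self.window))
        anomalous = False
        if len(hist) >= 30:  # wait for enough samples to form a stable baseline
            mu, sigma = mean(hist), stdev(hist)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                anomalous = True
        hist.append(value)
        return anomalous

detector = RollingBaseline()
# e.g. error-rate samples from 5-minute log rollups for one network function
samples = [0.010, 0.012, 0.009, 0.011] * 15 + [0.30]
for i, error_rate in enumerate(samples):
    if detector.observe("dns_resolver.error_rate", error_rate):
        print(f"sample {i}: error rate {error_rate:.2f} deviates from learned baseline")
```

In practice the same skeleton is fed by whatever log pipeline the organization already runs; the value is in choosing metrics that actually reflect each function's expected behavior.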

Schindel agreed that AI has potential to improve sustainment and maintenance operations of existing devices and platforms. “Our government has deployed all of these platforms across their operations. AI will help them sustain them for longer periods of time because you’re going to be able to do a ton more in terms of predictive and preventive maintenance,” he said.

Sims added that MITRE expects the development of new small language models trained specifically to do just that, to run on the edge so that IT and security needs can be “responsive, adaptive and predictive — without [devices] having to kind of dial back to a mothership.”

But for that evolution to take place, Schindel circled back to what Nelson shared about the need to focus on data management. Agencies will need to consider the security aspect of how their organizations use AI models, he said. Agencies will need to protect the resulting data and manage who has access to it and who can take part in discussions around it, Schindel noted. In other words, what are the appropriate guardrails for managing transparency and security simultaneously?

“It is an irony of AI” and definitely generative AI, Tomasky noted with a laugh. “We want all the data brought in, but we’re not going to put any of our data back out.”

He expects, though, that AI will let technology teams get away from looking at dashboards and screens before making changes. Throughout the IT lifecycle, “you’re going to see AI and automation play a bigger and bigger role,” Tomasky said. “It already has in the service desk environment. And you’ll see it across the network, and cyber, and everything else as well.”

Part of getting there depends on avoiding data overload, by focusing on the value that any given AI use and dataset delivers to the end user or the organization, and also on demystifying AI as well, Nelson said. That’s how to take advantage of “lots of AI ops capabilities and service desk to really draw down the mundane tasks and make them much easier to do.”

Discover more stories about how federal systems integrators and government contractors manage their enterprise infrastructure environments in our series Delivering the tech that delivers for government, sponsored by Future Tech Enterprise.


Check out all podcast episodes of the Delivering the tech that delivers for government series.

FedRAMP’s 2 new efforts target long-time vendor frustrations
Federal News Network | July 15, 2024 | https://federalnewsnetwork.com/cybersecurity/2024/07/fedramps-2-new-efforts-target-long-time-vendor-frustrations/

The cloud security program launched two new efforts, an agile delivery pilot and a technical documentation hub, to accelerate cloud authorizations.

The final policy guidance for the cloud security program known as FedRAMP is still a few weeks away from coming out, but the General Services Administration continues its aggressive refresh of the 13-year-old effort.

GSA launched two new initiatives to relieve some of the burdens of getting cloud services authorized under the Federal Risk and Authorization Management Program (FedRAMP), burdens that contractors and agencies have long complained about.

Eric Mill, the executive director of cloud strategy at GSA, said the agile delivery pilot will choose about 20 contractors to test out how to use secure software delivery approaches to accelerate the “significant change request” process, which essentially is an approval gate for cloud providers to add new features or capabilities to a FedRAMP authorized service.


“For a lot of cloud providers, this can go on for a long time and really get in the way of what we know to be secure software deployment and delivery practices, which are agile software delivery practices and the federal government absolutely needs to get the benefits of these companies who we are relying on for them to be able to share as many security improvements and updates as possible, new security tools, new patches, and new technology and new capabilities,” Mill said at the GovForward conference sponsored by Carahsoft. “This is an area where we think we can take a look at the way that FedRAMP has operated to date and refactor the process to be one that is based on continuous assessment. I think that’s a phrase you’re going to hear us use a lot because we think we should be getting both more security and more speed at the same time. When we focus our attention on overseeing the process by which changes are made, rather than repeatedly exercising like a stop and go process on every point in time change that a cloud provider makes.”

The PMO says that, as part of its plan to limit the scope and potential impact of changes to agencies, any new features CSPs launch under the pilot must be opt-in.

The PMO says any changes to the fundamental underlying architecture, or new security control implementations that apply to the entire offering, will be excluded from the pilot.

For the purposes of this pilot, the PMO says agencies must choose to use the new feature, and the new feature cannot change the following (a rough programmatic version of this checklist is sketched after the list):

  • System’s fundamental architecture,
  • Types of components used such as databases, operating systems, or containers,
  • Tooling used to configure, secure, and scan those components, and
  • Customer responsibilities for existing features or services.
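As a purely illustrative sketch of that checklist (the field names are invented for this example and are not part of any FedRAMP schema), a provider could screen a proposed change before submitting it:

```python
# Illustrative pre-submission check against the agile delivery pilot's exclusions.
# Field names are invented for this sketch; FedRAMP has not published such a schema.
PILOT_EXCLUSIONS = {
    "changes_fundamental_architecture": "alters the system's fundamental architecture",
    "changes_component_types": "changes component types (databases, operating systems, containers)",
    "changes_security_tooling": "alters tooling used to configure, secure or scan components",
    "changes_customer_responsibilities": "shifts customer responsibilities for existing features",
}

def eligible_for_pilot(change_request: dict) -> tuple[bool, list[str]]:
    """Return (eligible, reasons) for a proposed significant change request."""
    reasons = []
    if not change_request.get("opt_in", False):
        reasons.append("new feature must be opt-in for agencies")
    for flag, description in PILOT_EXCLUSIONS.items():
        if change_request.get(flag, False):
            reasons.append(f"excluded: {description}")
    return (not reasons, reasons)

ok, why = eligible_for_pilot({"opt_in": True, "changes_component_types": True})
print(ok, why)  # False, with the reason the change falls outside the pilot
```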

The FedRAMP program management office will accept applications from vendors to take part in the pilot through July 26 and then make selections by Aug. 16.

The second new initiative is focused on bringing more automation to the program.

The new technical documentation hub will help CSPs develop, validate and submit digital authorization packages, and will support developers of governance, risk and compliance (GRC) applications and other tools that produce and consume digital authorization package data.

Mill said one of the goals of FedRAMP more broadly is to reduce the time and costs to industry to get their services authorized.

“We’re still in a universe where we traffic 600-page Word documents and PDFs, which is really not how to run a data oriented organization,” Mill said. “We’ve made, what I think are, very concrete investments in changing that dynamic over time. Some of that is who we have hired and brought on to the program where we have a dedicated Open Security Controls Assessment Language (OSCAL) and data standards lead. We already have more technical expertise and practitioner background in the program now than it has had historically, and we’re going to be increasing that very significantly in the near future. We think that by bolstering our technical capacity, we’re going to be able to move dramatically more effectively, and be a more empathetic and effective partner with the cloud providers and agencies who ultimately have the tools that need to integrate with our program so that we don’t have to have people emailing things around much less emailing things around with passwords and stuff like that.”

The website initially is focused on promoting the use of OSCAL and application programming interfaces (APIs) to share digital authorization packages with the PMO and among agencies.

The PMO says this technical hub site will help make the FedRAMP authorization process more efficient and accessible by:

  • Providing faster and more frequent documentation updates
  • Expanding the breadth and depth of available technical documentation
  • Improving the user experience for stakeholders who are implementing OSCAL-based FedRAMP packages and tools
  • Establishing a collaborative workflow that supports community contributions for improvements to the documentation

Mill added this approach isn’t entirely new, because FedRAMP is already doing all of this work in the open on GitHub through open source development.

VA proved out automation

FedRAMP has long held out the promise of OSCAL. In May 2022, it received the first security authorization package using the framework. The National Institute of Standards and Technology released version 1.0 of OSCAL in June 2021, and in August 2021 FedRAMP released the first set of validation rules via GitHub.

But both the program and vendors have been slow to catch on.

Amber Pearson, the deputy chief information officer at the Department of Veterans Affairs, said at the event that VA was the first agency to deploy and submit a system security plan using OSCAL.

“We were able to actually transform our standard 426 page system security plan from a text file to machine readable language. We’re really excited where automation is going to take us to help us speed up how we deploy our authority to operates (ATOs) in our environment,” Pearson said. “OSCAL will be the first step to explore automation during our assessment and authorization process because it allows us to programmatically look at how do we build in key metrics to do automatic control testing. We’re actually exploring that with our partnerships with NIST and others. How do we actually speed up from a 360-day ATO timeline to receive an ATO to maybe an assessment and authorization (A&A) in a day? That’s some of the efforts that we’re looking at and how do we quickly assess the security controls and most importantly, about automation, it comes into play when you think about continuous monitoring and being able to measure your risk in near real time.”
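To make “machine readable” concrete: OSCAL expresses a system security plan as structured JSON (or XML/YAML) rather than prose. The heavily trimmed sketch below only gestures at that shape; the NIST OSCAL SSP model defines the authoritative schema, and a real package carries far more required detail than shown here.

```python
# A heavily trimmed illustration of the machine-readable shape OSCAL gives a
# system security plan, in contrast to a several-hundred-page Word document.
# See the NIST OSCAL SSP model for the authoritative schema; this sketch omits
# most required content and is not a valid package on its own.
import json
import uuid

ssp = {
    "system-security-plan": {
        "uuid": str(uuid.uuid4()),
        "metadata": {
            "title": "Example Cloud Service SSP",
            "version": "0.1",
            "oscal-version": "1.1.2",
        },
        "import-profile": {"href": "#fedramp-moderate-baseline"},  # placeholder reference
        "control-implementation": {
            "description": "How the service implements its baseline controls.",
            "implemented-requirements": [
                {
                    "uuid": str(uuid.uuid4()),
                    "control-id": "ac-2",
                    "remarks": "Account management handled via a centralized identity provider.",
                }
            ],
        },
    }
}

print(json.dumps(ssp, indent=2))  # this structured data, not prose, is what tools exchange
```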

Drew Mykelgard, the federal deputy chief information officer, said he hopes OSCAL becomes commonplace for any organization building or approving software within the next year.

“At every stage, I hope people are like, OSCAL is saving me from Word flat files, PDFs and it is changing the game from one of the biggest points of friction that we feel. We also know that when like the federal government gets behind a standard, we can really push it forward,” he said. “When we have people like Amber and her team pushing this through their governance, risk and compliance (GRC) platforms to intake OSCAL more effectively, running the tests on it and increasing, we can write all the policy we want, but without people like Amber, it doesn’t happen.”

The agile delivery pilot and the automation hub are two of the latest efforts the program management office has released since January.

FedRAMP’s continued modernization march

In June, FedRAMP finalized its emerging technology framework, focusing initially on generative artificial intelligence.

In May, the Office of Management and Budget (OMB) and GSA detailed the new structure of FedRAMP, replacing the joint authorization board with the new FedRAMP Board and creating the technical advisory group.

And two months before that, the FedRAMP PMO outlined 28 near-term initiatives in a new roadmap for the future.

All of this started in October when OMB issued the draft policy update to FedRAMP.

The PMO is still without a permanent director after more than three years.

Mykelgard said GSA is close to hiring a new permanent director of the program management office after receiving more than 400 applications.

GSA’s Mill said these and other upcoming changes are all about making concrete investments to change the dynamic over time. He said speed and security don’t have to be polar opposites.

“If you look at the elements on our roadmap, a very healthy chunk of them are designed to chip away in different ways and different slices of the things that generate that time and cost,” Mill said. “What we really need when commodity services out there exist, which can do core functions by companies and other agencies sometimes, it’s the shared services strategy in another form. We benefit from a security perspective, as federal agencies and the federal government when we’re able to stop doing things ourselves. Now when we’re talking about software, we have different and new and exciting opportunities to start running fewer things that are held together by shoestring apps and use things that are given dedicated maintenance, love and security investment. That, in and of itself, is a huge security boon for the government, which should be able to focus its limited IT and security people on the things that cannot be commoditized, that are just unique and core to their mission. That’s the theory of FedRAMP.”

Modernize to location-agnostic applications
Federal News Network | July 14, 2024 | https://federalnewsnetwork.com/federal-insights/2024/07/modernize-to-location-agnostic-applications/

A goal of modernized, containerized applications: You can easily deploy them in the cloud, in your data center or at the edge.


Application and infrastructure modernization has started to take an all-of-the-above approach to where workloads operate — commercial clouds, data centers and edge devices. The cloud-first idea has given way to optimizing hybrid infrastructures for mission requirements.

“We’re seeing a big shift away from the cloud all of a sudden,” said Brandon Gulla, the chief technology officer of Rancher Government Solutions. It’s not that cloud computing no longer provides tangible benefits, but rather that in modernizing applications, agencies are balancing their approaches.

One reason, Gulla said, stems from artificial intelligence and a desire to move data to where AI applications execute.

5G wireless circuits have “provided a great opportunity for the data to bounce back and forth from the edge to the traditional cloud faster. And that’s providing a lot of opportunity for our government customers,” Gulla said.

For example, military operators seeking speedy decision-making often want data and applications housed locally.

“That’s where edge computing really comes into play,” Gulla said. “It allows them to have sensors and the processing localized to their environment, and be able to have those decisions faster without that reliance on the [cloud] platform.” He said such applications should work even in air-gapped or intermittent connectivity situations.

A diversification of computing environments also lowers risk by providing redundancy and fault tolerance.

“This is a way for them to mitigate risk while modernizing their platform,” he said.

This compute-anywhere approach presupposes that the elastic, service-rich commercial cloud environment can be replicated at the edge or in the agency data center, noted Tricia Fitzmaurice, vice president of public sector sales at Rancher Government Solutions.

In recent months, she said, “we saw our customers, with the advancements in small form factor hardware, wanting to process their data where mission was actually happening.”

She added, “We looked at that and said, ok, we need our solutions to be able to deploy out at the edge in the same way that they would perform out in the data center.” Fitzmaurice said that agency tech staffs also want “single pane of glass” management of workloads, regardless of where the workloads are executing.  

Technical diversity

Fitzmaurice and Gulla said that, in addition to locational flexibility, agencies need flexibility in the way they design or refactor applications. Gulla said that sticking to standards and open source software can simplify the choice of technology for a given use case. He said some organizations that bet heavily on a proprietary virtual machine hypervisor and one commercial cloud computing provider later find they must deal with technical debt as technology evolves.

“We want to promote organizations to be willing to adopt open standards such as Kubernetes or Linux containers,” Gulla said, “and actually be able to lift and shift these applications and their IT modernization across infrastructures, no matter if it’s public cloud, private cloud or on premises.” Otherwise, he said, “the technical debt involved with reverse engineering and changing from one hyper scaler to another is massive.”
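Gulla’s lift-and-shift point is easiest to see in a workload spec. A containerized application declared once in Kubernetes terms can be applied to a managed cloud cluster, a data center cluster or an edge distribution without being rewritten. The sketch below generates a minimal Deployment manifest; the service name and container image are placeholders.

```python
# A minimal Kubernetes Deployment spec built once and reusable on any conformant
# cluster: commercial cloud, data center or edge distribution. The name and
# container image are placeholders; requires PyYAML (pip install pyyaml).
import yaml

def deployment(name: str, image: str, replicas: int = 2) -> dict:
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [
                        {"name": name, "image": image, "ports": [{"containerPort": 8080}]}
                    ]
                },
            },
        },
    }

# The generated YAML is what gets applied to whichever cluster the mission needs.
print(yaml.safe_dump(deployment("mission-api", "registry.example.mil/mission-api:1.4.2")))
```

Applying the same manifest with kubectl against different cluster contexts is then the whole “shift”; the spec itself does not change.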

Gulla said that flexibility can extend even to multiple CPU architectures, noting the emergence of low-power mobile device chips in servers. The Navy, constrained by available power on ships, is turning to servers built on 64-bit Arm processors, he said.

Open source containerization also can potentially cut the time to deploy new code.

“A line of code today will take six months to get into the warfighters’ hands,” Gulla said. “That’s too long.” He said the government, using containerization technology “has started to focus on inheritable security patterns and security controls, and concepts such as continuous authority to operate.” He called that capability “an express lane” to get modernized applications into production.

Consistent security must underlie any modernization, Gulla and Fitzmaurice said, hence the importance of inheritable or reusable security mechanisms.

“As signers of the [Cybersecurity and Infrastructure Security Agency] secure-by-design pledge,” Fitzmaurice said, “we wanted to ensure that our entire product set adheres to Executive Order 14028,” referring to the Biden administration’s 2021 cybersecurity executive order.

Gulla added, “We’re trying to ship the software in a secure state. That’s alleviating responsibility from that operator, from that IT staff.” In military settings, “we’re seeing that our warfighters are being asked to do too much these days. By making [security] the responsibility of our team, not only are we delivering secure software, we’re also alleviating those responsibilities and getting code into the warfighter and operators’ hands faster.”

Resolving federal hybrid cloud challenges with AI and automation
Federal News Network | July 9, 2024 | https://federalnewsnetwork.com/commentary/2024/07/resolving-federal-hybrid-cloud-challenges-with-ai-and-automation/

As government networks load up on new data and applications, gaining visibility over modern IT estates has become more difficult than ever.

Federal agencies are modernizing aggressively, driving the addition of new systems and capabilities and creating increasingly diverse hybrid cloud ecosystems. While such modernization is necessary to keep up with growing service mandates and citizen expectations, the complexity that arises from these hybrid cloud architectures poses significant challenges in orchestrating and monitoring government IT systems.

To solve this conundrum, federal IT leaders must lean into artificial intelligence and automation to better manage their complex IT environments. When supported by a strong data management foundation, this combination can deliver enhanced service-level visibility and control for government IT teams in charge of ever-changing hybrid cloud architectures.

Hybrid cloud brings challenges of complexity and scale

As government networks load up on new data and applications, gaining visibility over modern IT estates has become more difficult than ever. Rather than adopt a single cloud service from a single cloud provider, agencies are embracing a wide range of cloud vendors and approaches. This can leave teams, who may already be understaffed and swimming in technical debt, siloed and struggling further to manage a workload-intensive mix of legacy and modern applications and infrastructure.

This dramatic proliferation of operational complexity is fueled by massive increases in the volume, variety and velocity of data to be managed. Additionally, IT platforms are often not accessible, understandable or usable for many user-level government workers who need to collaborate on them. The picture is further complicated by the fact that not all workloads are moving to the cloud and by the persistence of legacy monitoring tools that aren’t able to keep up with the variety and velocity of data across hybrid cloud architectures.

All these factors contribute to an unsustainable scenario of outdated tools and disjointed processes that stifles IT’s ability to respond to spiraling complexity and keep up with evolving agency and end user expectations. Fortunately, government IT teams can overcome these obstacles by making strategic use of both AI and automation to progress towards a state of autonomic IT and bring more visibility and control to their hybrid cloud architectures.

Overcoming hybrid cloud complexity with AI plus automation

To make sense of the current state of hybrid cloud complexity and better meet key mission objectives, federal IT teams must opt for a modern approach to ITOps that combines AI and automation to create a more unified service view across the entire hybrid cloud universe. This includes all data center, public cloud (software-as-a-service, infrastructure-as-a-service and platform-as-a-service) and private cloud environments.

The combination of AI and automation is crucial to driving observability across each of these environments, applying machine learning and scalable process optimization throughout all hybrid infrastructure data and systems. This empowers staff to perfect and then automate routine operational tasks, such as collecting diagnostic data, exchanging real-time operational data between systems and platforms, executing ticketing, remediation workflows and more.

The most successful deployments combine a wide range of data across environments to establish a real-time operational data lake. This makes it possible for IT teams to analyze and act on the data at “cloud scale” while applying a rich set of analytical techniques to add business service context and meaning to the data – with multi-directional workflows for both proactive and responsive actions.
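The article does not describe any particular vendor’s internal tooling, but the routine-task automation described above typically boils down to a small event-driven loop: when a health signal crosses a threshold, collect diagnostics, open a ticket and attempt a pre-approved remediation. A generic sketch, with every system name and threshold invented:

```python
# Generic sketch of routine-task automation: when a health signal breaches a
# threshold, gather diagnostics, open a ticket and attempt a scripted
# remediation. Every name, value and threshold here is invented.
from dataclasses import dataclass, field

@dataclass
class Event:
    service: str
    metric: str
    value: float
    threshold: float
    diagnostics: dict = field(default_factory=dict)

def collect_diagnostics(event: Event) -> None:
    # In practice this might pull recent logs, config versions and dependency status.
    event.diagnostics = {"recent_errors": 42, "last_deploy": "2024-07-01T13:05Z"}

def open_ticket(event: Event) -> str:
    ticket_id = f"INC-{abs(hash((event.service, event.metric))) % 10000:04d}"
    print(f"[ticket {ticket_id}] {event.service}: {event.metric}={event.value} "
          f"(threshold {event.threshold}) diagnostics={event.diagnostics}")
    return ticket_id

def remediate(event: Event) -> bool:
    # Placeholder for a pre-approved runbook step, e.g. restarting a stuck worker.
    return event.metric == "queue_depth"

def handle(event: Event) -> None:
    if event.value <= event.threshold:
        return
    collect_diagnostics(event)
    ticket = open_ticket(event)
    resolved = remediate(event)
    print(f"{ticket}: auto-remediation {'succeeded' if resolved else 'escalated to on-call'}")

handle(Event("claims-portal", "queue_depth", value=12500, threshold=5000))
```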

Facilitating AI and automation with stronger data management techniques

While there is no single blueprint to follow for applying AI and automation for more alignment and orchestration of agencies’ hybrid cloud environments, the most successful efforts make sure to prioritize the underlying integrity of data. The right data management foundation will allow AI to properly manage, model and analyze operations, and this foundation is also essential to optimize and scale processes with automation.

In particular, federal IT teams should pursue three essential data-related priorities to support the journey to complete visibility and autonomous IT operations. To begin with, data must be of high fidelity, meaning it’s critical to collect the right types of data from the right sources in order to accurately reflect the state of what’s happening with an agency’s IT and business services at any given time. In addition, cleaning, analyzing and acting on data must happen in real time, ideally via automated processes and closed-loop decision making, to enable action quickly without the need for a human analyst to be involved.

Throughout, data must be thoroughly contextualized, with all metadata and asset dependencies clearly defined through a service-oriented view that enhances the ability to understand operational patterns and identify anomalies or performance issues. The right platform for AI and automation will include capabilities for managing data in these ways, enabling teams to cut through the noise and quickly establish the impact and root causes of issues. This, in turn, sets the broader stage for fundamental IT and agency transformation toward stronger agility, speed and growth.
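Concretely, the service-oriented view with asset dependencies is a dependency graph: once alerts carry the service they belong to, root cause analysis reduces to walking that graph toward the deepest alerting dependency. A toy sketch with invented services, edges and alerts:

```python
# Toy root-cause localization over a service dependency graph. Once events are
# contextualized with the service they belong to, the likely root cause is the
# deepest alerting dependency. Services, edges and alerts are invented.
DEPENDS_ON = {
    "citizen-portal": ["api-gateway"],
    "api-gateway": ["auth-service", "records-db"],
    "auth-service": ["records-db"],
    "records-db": [],
}
ALERTING = {"citizen-portal", "api-gateway", "records-db"}

def root_causes(service: str, seen: set | None = None) -> set:
    """Return alerting services beneath `service` that have no alerting dependency."""
    seen = seen or set()
    if service in seen:
        return set()
    seen.add(service)
    downstream = set()
    for dep in DEPENDS_ON.get(service, []):
        downstream |= root_causes(dep, seen)
    if not downstream and service in ALERTING:
        return {service}
    return downstream

print(root_causes("citizen-portal"))  # {'records-db'}: the shared dependency at fault
```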

As governments become increasingly digitized, many agencies struggle to manage their integrated hybrid-cloud environments. Fortunately, the right combination of AI and automation founded on the right data management techniques can bring more visibility and control to these environments. As a result, federal IT teams can conduct faster root cause analysis, reduce downtime, optimize IT investments, and provide a more stable foundation to support broader agency modernization efforts as technology continues to advance.

Lee Koepping is senior director for global sales engineering at ScienceLogic.

DISA’s Skinner not a fan of ‘wooden shoe’ IT services
Federal News Network | July 8, 2024 | https://federalnewsnetwork.com/defense-news/2024/07/disas-skinner-not-a-fan-of-wooden-shoe-it-services/

Lt. Gen. Robert Skinner, the director of DISA, said DoDNet users must change their processes rather than ask for unique IT services.

BALTIMORE — Air Force Lt. Gen. Robert Skinner, the director of the Defense Information Systems Agency, doesn’t have time for any wooden shoes.

No, he’s not taking a swipe at the Dutch and their almost 800-year-old tradition of making clogs.

Rather, Skinner is talking about wooden shoes as a metaphor for IT services offered through DoDNet, the network consolidation initiative carried out through the $11 billion contract awarded under the Defense Enclave Services and Fourth Estate optimization initiatives. His tenure as DISA director will end as soon as the Senate confirms his replacement, Maj. Gen. Paul Stanton, who currently leads the Army’s Cyber Center of Excellence.


“We take a best of breed offering, so let’s take identity, for example, or let’s take networking or let’s take applications, and we’re developing an application that is best for the enterprise. A lot of times services, and even some agencies, have, they say, unique requirements that they want to make sure that as part of this. Well, whenever you add a unique requirement to an enterprise solution, it’s suboptimizes that enterprise solution,” Skinner said during a press conference after speaking at the AFCEA TechNet Cyber conference on June 26. “So really, what we’re saying is, you have to change your processes to leverage this environment. Our focus, as a combat support agency, has to be on the combatant commands, who are the warfighters. So our focus is on them and the agencies because that’s what we’ve been tasked and that’s what our mission is. I would love to have the services consume more DISA services and capabilities, that would be great because then as the services and the service components are supporting those combatant commands, they are all leveraging the same capabilities.”

Skinner said the unique requirements that the military services say they need are those wooden shoes, where no two pairs are the same.

DISA taking on software licensing

A prime example of this wooden shoe problem is how the Defense Department set up the unclassified version of Office 365. Across all of DoD, there are 14 different and distinct tenants.

Skinner said DoD is not making that same mistake with the classified or secret version of O365 where everyone will use one instance.

“If you’re a Marine, for example, that is a working at the Marine Corps headquarters and then you go to Cyber Command, and you go to a joint tenant, you have to move your data and move your stuff to this new tenant versus it all being in one tenant already,” he said. “As we lay this framework of cloud identity, networking applications, it’s really for the greater good, and I would offer sometimes the organization’s need to change your process, even it’s just a little bit, to consume it versus saying, we have this unique requirement that and we’re going to use that as a as a reason for not supporting an enterprise capability.”

The other challenge is the cost of licensing the software in the distinct tenants. That issue came about under the DISA Defense Enterprise Office Solutions (DEOS) contract, where software licenses could cost as much as 20% more.

DISA is leading an effort to address this higher cost of software licenses as well as part of this overall DoDNet effort.

“I’m a big proponent of DoD purchasing power that if you’re using, we’ll just say Microsoft, for example, everybody should be paying the same price for the same capabilities, whether it’s Microsoft or ServiceNow [or any other provider]. But from a department standpoint, we have more purchasing power when we’re all aligned” he said. “Same thing to me from a collaboration interoperability standpoint, the more that we’re leveraging the same capabilities, the easier it is to be more proficient and quicker. If I’m a soldier, sailor, airman, Marine or guardian, who is a network operator, for example, and I’m going from a joint position to a service position and back to a joint position, if you’re using common capabilities, then you can be proficient faster as you go into that new position.”

DoDNet ready for expansion

Skinner said the DoD CIO’s board is working on three or four enterprise license deals to show the services the potential savings as a way to gain some traction for this effort.

“The other thing that we’re working with the department at the DoD level, is to say, ‘hey, for licenses, why don’t we just take it off the top?’ So instead of a service having to take it [out of their budget,] why don’t we just take it off the top and therefore services don’t have to worry about it?” Skinner said. “Well, as a service, I’m actually paying more than what I did before because they didn’t get as good a deal. But because I’m part of this enterprise license, let’s just take it off the top. So that’s another area that we’re working within the department.”

In the meantime, DISA continues to expand DoDNet and its capabilities.

Miguel Cerritos-Aracen, the DoDNet operations chief for DISA, said during an AFCEA TechNet Cyber panel that they are ready to expand beyond the three current customers, DISA, the Defense Technical Information Center and the Defense POW/MIA Accounting Agency (DPAA).

Cerritos-Aracen said he expects the roughly 30,000 current users of common IT services on the unclassified and secret tenants to increase to more than 300,000 in the next year or so.

“Right now, we’re working with, for example, the Defense Media Activity, the Defense Contract Audit Agency, the Defense Contract Management Agency, Defense Microelectronics Activity and others,” he said. “That’s also why it’s so important that we take advantage of things of new technologies like automation and things of that nature. Our latest partners have actually brought in more solutions that have removed some of the legacy technologies of our environment, and actually allowed us to be on the precipice to be more advanced. In addition to that, some of the other achievements that we did just last year, as an example, is we did retire some legacy environments. For example, in DISA we had a corporate network called DISANet. I actually put to bed because we were able to move on into DoDNet.”

DISA’s Thunderdome included in DoDNet

Additionally, Cerritos-Aracen said that by moving to DoDNet, its customers are now using Windows 11 across both unclassified and classified platforms, and are using new applications through O365 that they previously didn’t have access to, which allows for more collaboration on the secret side.

Chad Buechel, a vice president for Leidos, which is the lead contractor for DoDNet, said at the end of May or early June, DoDNet reached initial operating capability for some of those new technologies and capabilities, such as virtual desktop-as-a-service and unified endpoint management, which lets DISA more effectively manage customer endpoints and protect them from cyber threats.

He said it’s all about making it easier for users to migrate and then use DoDNet.

DISA expects to continue to expand DoDNet to the combatant commands starting in fiscal 2026.

Cerritos-Aracen said DISA also is integrating its zero trust architecture under Thunderdome as part of DoDNet to include a secure access service edge (SASE) and software-defined wide area network (SD-WAN) capabilities.

Evolving hybrid cloud strategies in modern agencies
Federal News Network | July 1, 2024 | https://federalnewsnetwork.com/cme-event/federal-insights/evolving-hybrid-cloud-strategies-in-modern-agencies/

How are the CDC and TSA managing cloud adoption to meet their missions?

May was the three-year anniversary of President Joe Biden’s cybersecurity executive order.

At the same time, June was the five-year anniversary of the Office of Management and Budget’s cloud smart policy.

These two anniversaries mark important mileposts in agency digital transformation journeys.

The latest data from Deltek, a market research firm, found agencies could spend more than $8 billion on cloud services in fiscal 2025. That is up from over $5 billion in 2020.

As agencies spend more on cloud services and continue to have some applications and data on-premise, security in this hybrid cloud setup becomes even more important.

Agencies need tools and capabilities to monitor applications and data on-premise and in the cloud. They also need to understand the data to make faster and more accurate decisions.

At the same time as agencies move applications and data to the cloud and work to secure them, they have to balance those efforts with improving the employee and customer experience.

Joe Lewis, the chief information security officer for the Centers for Disease Control and Prevention in the Department of Health and Human Services, said his agency is prioritizing the modernization of systems and workloads that serve emergency response and public health crises.

“CDC is full steam ahead on cloud migration and modernization. I think we have embraced the notion that we are going to have legacy workflows that have to reside on-premise, which means that we will perpetually live to some degree in some level of hybrid cloud,” Lewis said on the discussion Evolving Hybrid Cloud Strategies in Modern Agencies. “In that space, I feel like we are working to solve long-standing legacy technical debt problems as we modernize workloads and applications and things that historically were built in stovepipes into more enterprise level platforms that enable data sharing and visualization, and more importantly, the ability to make faster decisions around public health. It’s an exciting time. It’s probably some of the most agile work I’ve seen in my nearly 20 years in the federal space.”

At the same time the CDC is trying to modernize legacy technology, Lewis said changing organization culture is an equally important goal.

Moving to a hybrid cloud culture

He said getting employees to embrace new ways of doing business, specifically how technology can help solve more complex problems, is a key piece to the entire modernization effort.

CDC is not alone in facing this challenge. At the Transportation Security Administration, the pace of change isn’t always comfortable.

“At TSA, a real struggle of bringing people up to a certain level of saying, ‘here’s the next thing, here’s the next change,’ and that constant effort of continuous improvement has really been a real struggle of keeping everybody up to date,” said Dan Bane, the branch manager for secure infrastructure and vulnerability management in the Information Assurance and Cybersecurity Division for Information Technology at TSA in the Department of Homeland Security. “When you have large organizations bringing those people along with the IT changes that are happening so rapidly, it’s a real challenge for the organization.”

TSA has been on a modernization journey for several years, initially starting with infrastructure-as-a-service (IaaS) and transitioning to software-as-a-service (SaaS) most recently for business and mission critical functions.

“We’ve found that some of the expenses that we ran into with some of the SaaS and then also some of the complexities of the technical debt, we didn’t really have people that were really capable at deploying some of those technologies on a quick scale. Frequently the development teams were getting ahead of our security teams,” Bane said. “Our CIO Yemi Oshinnaiye has really helped us integrate a development secure operations DevSecOps approach. It’s not perfect, but we’re a lot better than we were.”

Bane’s team is working more closely now with the development teams, integrating security tools to help automate checks of code to ensure there is speed to production.

“It’s really an area where we are sitting down with an engineer and going through every setting and every activity, and then getting the monitoring capabilities for those different applications running back into our security operations center. It is a huge lift,” he said. “It really becomes an area where we are trying to standardize on a couple of different infrastructure and platforms that we try to build on top of those, instead of this service, that service, this service. Those things have taken a great deal of time, and have really impacted the IT operations’ ability to really deliver the mission capabilities of what we’re trying to do for the organization.”
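TSA’s specific toolchain is not named, but the automated code checks Bane describes are commonly implemented as a pipeline gate that runs static analysis and a dependency audit before anything deploys. The sketch below shells out to two widely used open source scanners, bandit and pip-audit, purely as stand-ins; an agency would substitute whatever scanners it has actually approved and verify each tool’s exit-code conventions for the versions in use.

```python
# Sketch of a pre-deployment security gate: run static analysis and a dependency
# audit, and fail the pipeline if either reports findings. bandit and pip-audit
# are real open source tools used here only as stand-ins; check each tool's
# exit-code conventions for your versions before relying on this pattern.
import subprocess
import sys

CHECKS = [
    ("static analysis", ["bandit", "-r", "src/"]),
    ("dependency audit", ["pip-audit"]),
]

def run_gate() -> int:
    failures = 0
    for label, cmd in CHECKS:
        print(f"--- {label}: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"{label} reported findings (exit {result.returncode})")
            failures += 1
    return failures

if __name__ == "__main__":
    sys.exit(1 if run_gate() else 0)
```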

Reducing tools, complexity

The need to address the culture change as part of the overall modernization journey is common among public and private sector organizations.

But one way is by reducing the number of tools any organization relies on, and then bringing them all together through a single pane of glass, said Brian Mikkelsen, the vice president and general manager for U.S. public sector at Datadog.

“Historically, you’ll have a network group, a [security operations center] group, an operations team, a development team and, then probably, all kinds of different interactions between those teams. But each of those teams have historically had their own tools. They’ll use one tool for the network; one tool for infrastructure observability; another for application performance monitoring (APM); and then something that connects perhaps legacy on-premise security and maybe another tool for cloud security,” Mikkelsen said. “A new way of thinking is built from having an end-to-end observability and security platform. One of the primary things we help customers with is tool reduction and bringing teams into a very common understanding of the health and security posture of their infrastructure and cloud architecture.”

He added that by breaking down silos across disparate teams and creating a single source of truth, each team has the same data and can address challenges as they arise.

Having the single source of truth also makes it easier for agencies to decide which applications can go to the cloud today, which ones will need some work, and which ones need to stay on-premise for the foreseeable future.

“What we’re doing is we’re helping federal agencies visualize and instrument their existing legacy platforms, which inherently allows them to baseline and create a roadmap for what they want to prioritize,” Mikkelsen said. “The first question I would ask is just simply, ‘whatever solutions we’re bringing to market, does this connect the dots?’ What I really mean by that is does it provide for tagging, for correlation and for automation? Or am I creating yet another silo? Or am I breaking down silos and bringing teams together? All of this connects to what we’re really trying to do, is these systems are capabilities that deliver experiences to our citizens, our employees, and so all this revolves around also citizen experience initiatives.”

Learning objectives:

  • Overarching cloud strategies and where agencies stand today 
  • Approaching security and compliance across cloud and on-prem environments
  • What are the meaningful priorities in the next 12-18 months 

Cloud Exchange 2024: CMS’ Andrea Fletcher, Remy DeCausemaker on extending digital services through open source
Federal News Network | June 28, 2024 | https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-cms-andrea-fletcher-remy-decausemaker-on-extending-digital-services-through-open-source/

The Centers for Medicare & Medicaid Services sees value and ease in open source software for developing new capabilities faster.


The Centers for Medicare & Medicaid Services is one of the biggest software development organizations in the country.

Every day, CMS’ six major centers release code, build new products and tools, and rely on thousands of contractors for help to bring new capabilities to everything from COBOL mainframes to cloud-based services.

It’s why CMS being the first federal agency to launch an open source program office isn’t just significant; it has the potential to transform how agencies across the government develop and manage software.

“We’re going through and reducing burden, reducing costs and reducing duplicate work and effort. These aren’t foreign concepts. We are providing accountability for contractor performance and helping to show metrics and growth around our projects and communities,” said Remy DeCausemaker, open source lead at CMS, on Federal News Network’s Cloud Exchange 2024.

“One of the goals of the Open Source Program Office, and just open source in general, is including folks who are not just consumers of software but also contributors,” he continued, “so giving people an opportunity on the outside to contribute to CMS projects, not just by providing feedback but for finding patches as well as engaging with early career talent pipeline programs, like Coding It Forward, the U.S. Digital Corps, the U.S. Digital Response and other internship and early career opportunities that help get young technologists and early career professionals into public service.”

A recent example of the power of open source came when Congress required CMS to establish transparency around the cost of hospital services.

CMS used open source technology to develop a hospital price transparency tool, which hospitals use to provide machine-readable lists of all their services and the associated costs.

CMS reduces duplication through use of open source

DeCausemaker said rather than building a bespoke custom application, CMS built a static open source website that’s hosted on the internet, where hospitals can upload and validate their data files.

“It will give them warnings, and it will help them to name the files correctly. It really helps to fill in some of the gaps where the [transparency] policy may not have said at that machine-readable level what the standards needed to be. But we can provide a reference implementation to make it easier for hospitals to comply and make their lives easier for uploading that file,” he said. “Hospitals don’t even need to install it locally. They can just go to one website, which everybody can get to it and where everyone can use it. We’ve already seen contributions coming in from the outside world.”
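CMS publishes its actual validator as open source, and it implements the full price transparency specification. Purely as a gesture at the kinds of checks such a tool performs, the sketch below tests a file name against a paraphrased naming rule and looks for a small, illustrative subset of required fields:

```python
# A gesture at the kinds of checks a price transparency validator performs:
# file naming and minimal required fields. CMS's actual open source validator
# implements the full specification; the rules here are simplified paraphrases.
import re

FILENAME_RULE = re.compile(r"^\d{2}-?\d{7}_.+_standardcharges\.(json|csv|xml)$", re.IGNORECASE)
REQUIRED_FIELDS = {"description", "code", "standard_charge"}  # illustrative subset only

def check_filename(name: str) -> list[str]:
    return [] if FILENAME_RULE.match(name) else [
        f"'{name}' does not match <ein>_<hospital-name>_standardcharges.<ext>"
    ]

def check_items(items: list[dict]) -> list[str]:
    warnings = []
    for i, item in enumerate(items):
        missing = REQUIRED_FIELDS - item.keys()
        if missing:
            warnings.append(f"item {i}: missing {sorted(missing)}")
    return warnings

problems = check_filename("123456789_general-hospital_standardcharges.json")
problems += check_items([{"description": "MRI brain", "code": "70551"}])
print(problems or "file looks structurally plausible")
```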

The hospital price transparency tool example demonstrates the potential of open source software, the cloud and common data standards that collectively can improve federal services.

Andrea Fletcher, chief digital strategy officer at CMS, said the use of open source also addresses a long-time problem of organizations building the same software tools over and over again.

“I didn’t realize how much software we actually develop, and one thing that’s really frustrating is when we build the same thing twice — because we don’t have the resources to do that,” she said. “It’s frustrating for the teams that are building software when they’re watching somebody else build the exact same thing. Why aren’t we working together? It’s because of funding streams or contracts or something that doesn’t really make sense, to be honest. We just want to build it once and be able to reuse it. That takes a total mentality shift of how we build and manage the processes around building software at CMS.”

That’s why, among other reasons, CMS officially launched its Open Source Program Office in January 2024. But open source isn’t new to CMS nor to the government.

The Defense Department has issued memos encouraging the use of open source since the 1980s, including updates in 2003, 2009 and again in 2022.

In 2016, the Office of Management and Budget issued a governmentwide memo encouraging the use of open source and directing agencies to release their code in the code.gov repository.

CMS applies 5-tiered open source maturity model

Fletcher said the time was right for CMS to jump into open source in a much bigger way. She said several ongoing programs, like application programming interfaces for the Blue Button 2.0 initiative or the ability to send data in bulk through the Fast Healthcare Interoperability Resources (FHIR) standard, proved the value of relying on open source.
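The bulk capability Fletcher mentions follows the HL7 FHIR Bulk Data Access pattern: an asynchronous $export kickoff request that returns a status URL to poll. A minimal sketch using the requests library is below; the base URL, token and group ID are placeholders, not actual CMS endpoints.

```python
# Minimal sketch of a FHIR Bulk Data Access kickoff (the async $export operation).
# The base URL, token and group ID are placeholders, not real CMS endpoints;
# requires the requests library.
import requests

BASE_URL = "https://fhir.example.gov/api/v2/fhir"   # placeholder server
TOKEN = "REPLACE_WITH_OAUTH_TOKEN"                   # obtained via the server's OAuth flow

def kick_off_export(group_id: str) -> str:
    """Start an async bulk export for a group; return the status URL to poll."""
    resp = requests.get(
        f"{BASE_URL}/Group/{group_id}/$export",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/fhir+json",
            "Prefer": "respond-async",
        },
    )
    resp.raise_for_status()                  # spec: 202 Accepted on success
    return resp.headers["Content-Location"]  # poll this URL until the export completes

# status_url = kick_off_export("example-group-id")  # placeholder group id
```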

“We looked at these as like, how do we do more of that? What do we need in order to push ourselves into the next version of open source at CMS?” she said.

Fletcher even pushed back on the idea that CMS having a program office for open source was innovative.

“Building code in the open is how the internet was built. Something like 80% to 90% of all code that is used today is actually open source. It’s the foundation of how we build software,” she pointed out. “What’s different now is, especially in the federal government, recognizing that and saying, ‘We’re going to use this to our advantage.’ So rather than ignoring the open source ecosystem, how do we leverage that? How do we make sure it’s more secure? And then also, how do we build in the open to basically encourage a better health care IT ecosystem? So really looking at this as a mechanism for reaching beyond the government into the immense ecosystem that is health care software development.”

As part of setting up the office, DeCausemaker said CMS adopted a five-tiered maturity model for software development, repository templates, metrics to gauge adoption and participation, and a GitHub repository where the code lives.

“We’ve seen folks inside of CMS, our Centers for Clinical Quality and Standards across our Centers for Medicare and Medicaid Innovation — and even outside the department — actually release their first open source repository on GitHub,” he said. “We got to see them using those templates and starting from somewhere. So we know this is valuable inside the agency, and we know that it’s valuable at the federal level. We hope that’s valuable wherever folks can get them and use them. We want to reduce work and effort across the ecosystem.”
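The adoption and participation metrics DeCausemaker mentions can come straight from the GitHub REST API. A small sketch is below; the organization and repository names are placeholders, and real use would authenticate with a token and handle pagination and rate limits.

```python
# Sketch of pulling simple adoption/participation metrics from the GitHub REST
# API. Repo names are placeholders; real use would authenticate with a token
# and handle pagination and rate limits. Requires the requests library.
import requests

API = "https://api.github.com"

def repo_metrics(owner: str, repo: str) -> dict:
    base = requests.get(f"{API}/repos/{owner}/{repo}", timeout=10).json()
    contributors = requests.get(
        f"{API}/repos/{owner}/{repo}/contributors", params={"per_page": 100}, timeout=10
    ).json()
    return {
        "stars": base.get("stargazers_count", 0),
        "forks": base.get("forks_count", 0),
        "open_issues": base.get("open_issues_count", 0),
        "contributors": len(contributors) if isinstance(contributors, list) else 0,
    }

# Placeholder repo; an open source program office would run this across every repo it tracks.
print(repo_metrics("example-org", "example-repo"))
```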

DeCausemaker said the DoD Open Source Software FAQ helped dispel many of the myths about open source and helped “sell” the use of open source up the leadership chain at CMS.

Dispelling myths about open source for federal use

Fletcher added they had to emphasize that open source didn’t mean giving away data or opening up personally identifiable information to the public. She explained that it’s the code the data runs on, not the data itself, that is put on the internet.

“That was big, just by dispelling all of the myths around what open source software is and is not. That took a lot of work,” she said. “Once we got that buy-in, we were off to the races. There are some challenges just around how we manage software in general, how we manage GitHub repositories, and we have a lot of contractors at CMS who tend to be siloed within our centers. We have multiple centers so we’re really trying to break down those silos and work across different components. That’s something that’s very difficult to do in government.”

But even in the short amount of time Fletcher and DeCausemaker have been promoting and pursuing open source efforts at CMS, the impact is starting to become clear.

“We’re looking to raise the floor for all of the open source projects across the CMS ecosystem, and then we’re also looking to raise the ceiling for our existing open source projects or looking for specific guidance and specific goals that they have on increasing adoption or increasing their contributor and developer community,” DeCausemaker said.

“By taking both of those work streams into account and thinking about inbound code (that is code that comes into the agency that we depend on from the outside) and outbound code (that is code that we share with the outside world) — everything in a hospital sort of falls into those two work streams — we get to figure out how to best allocate our limited resources to help our big existing open source projects and build from the ground up to support everyone and raise their digital standard of living.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange event page.

Cloud Exchange 2024: GSA’s Ryan Palmer, CISA’s Chad Poland on new tools to help agencies protect data in the cloud
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-gsas-ryan-palmer-cisas-chad-poland-on-new-tools-to-help-agencies-protect-data-in-the-cloud/
Fri, 28 Jun 2024
What’s next for FedRAMP? What about CISA’s SCuBA? How about secure AI adoption? Where do these efforts interconnect?

The General Services Administration’s FedRAMP cloud security program is launching a pilot to allow cloud service providers to quickly introduce security features into their products.

Ryan Palmer, senior technical and strategic advisor for FedRAMP at GSA, said the pilot will launch this summer. The effort is centered on a “nonblocking significant change request process,” Palmer said.

“We’re working on running a pilot where some capabilities and some changes are going to be able to be made with an informed but not an approved step for those agencies,” Palmer said.

Palmer joined Chad Poland, cybersecurity product manager at the Cybersecurity and Infrastructure Security Agency, for a panel during Federal News Network’s Cloud Exchange 2024 to discuss continuing initiatives at GSA and CISA to help agencies secure their cloud presences.

Making FedRAMP more adaptable

Currently, FedRAMP requires cloud service providers to have an approved “significant change request” before making major updates, such as adding new technologies, to their already authorized cloud services.

However, the process has been seen as an impediment to CSPs introducing new features that could help better secure agency data and services.

“I think that pilot can enable some of those capabilities to be offered,” Palmer said. “There’s a crawl, walk, run approach I think we have to do when we look at how we do change management within the program.”

The pilot comes amid broader reforms to the long-running FedRAMP program. The reforms are aimed at addressing the time and cost for CSPs to get approved. The changes are specifically intended to expand the pool of available software as a service offerings for agencies, Palmer said.

GSA released a FedRAMP roadmap earlier this year to help guide the program’s evolution. The Office of Management and Budget is also finalizing a White House draft memo on FedRAMP.

Palmer said GSA is also looking to introduce more automation into the process, including by moving to a common data format for FedRAMP authorization documents.

“Once we get there — and I think we’re going to get there fairly quickly — we can start using that data and combine data to start speeding up the process,” he said.

Automation could help cloud service providers catch errors in their paperwork, for instance, potentially saving time during the FedRAMP review process. Palmer compared it to tax preparation software that alerts users to obvious errors and typos.

“Through automation, I see us providing increased insights to cloud service providers early in the process,” he said.
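
Palmer’s tax-software analogy maps naturally to a simple linting pass over a machine-readable authorization package. The sketch below is a generic illustration of that idea only; the field names and rules are hypothetical and are not FedRAMP’s actual schema or tooling.

```python
# Hypothetical pre-submission check on a machine-readable authorization package.
# Field names and rules are illustrative only, not FedRAMP's actual schema.

REQUIRED_FIELDS = {"system_name", "impact_level", "authorization_boundary", "controls"}
VALID_IMPACT_LEVELS = {"Low", "Moderate", "High"}


def lint_package(package: dict) -> list[str]:
    """Return a list of human-readable findings, tax-software style."""
    findings = []

    missing = REQUIRED_FIELDS - package.keys()
    if missing:
        findings.append(f"Missing required fields: {', '.join(sorted(missing))}")

    if package.get("impact_level") not in VALID_IMPACT_LEVELS:
        findings.append(f"Unrecognized impact level: {package.get('impact_level')!r}")

    for control in package.get("controls", []):
        if not control.get("implementation_statement", "").strip():
            findings.append(f"Control {control.get('id', '?')} has an empty implementation statement")

    return findings


if __name__ == "__main__":
    draft = {
        "system_name": "Example SaaS Offering",
        "impact_level": "moderate",  # wrong case -- should be flagged
        "controls": [{"id": "AC-2", "implementation_statement": ""}],
    }
    for finding in lint_package(draft):
        print("FLAG:", finding)
```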

CISA dives deeper on SCuBA

Meanwhile, CISA continues to work on additional tools and services as well.

For example, its Secure Cloud Business Applications (SCuBA) program, which offers agencies baseline security configurations for their cloud environments, finalized its Microsoft 365 secure configuration baseline in December. SCuBA is also piloting a draft configuration baseline for Google Workspace products with several agencies.

Poland said the strength of the SCuBA baselines is in their specificity.

“They’re very prescriptive,” he said. “So it tells an end user exactly what setting they need to change, why they should change it via a rationale statement. And then we’ve actually gone a step further and provided mappings to MITRE ATT&CK so that they know, if they turn the setting on, what actual TTP it’s going to prevent.”

The program also offers open source assessment tools that agencies can use to evaluate their security posture for Microsoft 365 and Google Workspace respectively.
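
To make the idea of a prescriptive baseline concrete, here is a minimal sketch of one policy entry (setting, rationale, ATT&CK mapping) and a check against a tenant’s configuration. The policy ID, setting name and technique ID are invented for illustration and are not drawn from the published SCuBA baselines or assessment tools.

```python
# Illustrative representation of a prescriptive baseline entry and a simple check.
# Policy IDs, setting names and ATT&CK technique IDs are hypothetical examples,
# not drawn from the published SCuBA baselines.

BASELINE = [
    {
        "policy_id": "EXAMPLE.TENANT.1",
        "setting": "legacy_authentication_enabled",
        "required_value": False,
        "rationale": "Legacy auth cannot enforce MFA and is a common intrusion path.",
        "attack_mapping": ["T1110"],  # illustrative: brute force
    },
]


def assess(tenant_config: dict) -> None:
    """Print pass/fail for each baseline entry against a tenant's current settings."""
    for entry in BASELINE:
        actual = tenant_config.get(entry["setting"])
        status = "PASS" if actual == entry["required_value"] else "FAIL"
        print(f"{status} {entry['policy_id']}: {entry['setting']} = {actual!r}")
        if status == "FAIL":
            print(f"     why: {entry['rationale']}")
            print(f"     mitigates: {', '.join(entry['attack_mapping'])}")


if __name__ == "__main__":
    assess({"legacy_authentication_enabled": True})
```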

Meanwhile, FedRAMP is examining how to incorporate the SCuBA program’s guidance into secure configuration profiles in the first half of fiscal 2025, according to FedRAMP’s roadmap.

Poland said CISA is also considering how to expand its work on secure configurations to other cloud products beyond Microsoft 365 and Google Workspace.

“There are hundreds of other SaaS products out there on the marketplace, and then thousands of other SaaS products out there not on the FedRAMP marketplace,” he said. “How can we scale that and try to see if we can mimic that same type of prescriptive guidance for organizations?”

Ensuring secure government AI adoption

Both CISA and GSA are also playing pivotal roles in the government’s secure use of AI. In March, GSA released a draft emerging technology framework to help prioritize FedRAMP’s approval process for technologies like AI.

FedRAMP finalized the prioritization framework on June 27. Vendors can start submitting generative AI capabilities for priority approvals starting Aug. 31.

Palmer said GSA received more than 200 comments on the draft.

“We’ve tried to incorporate those changes into the final framework,” he said. “Some of the things that we heard were the concerns around the limits that we had in the framework. We tried to adjust those and clarify that those are going to be flexible and really driven by agencies’ needs.”

Many commenters also focused on the framework’s benchmarking process.

“Collectively, people liked the benchmarks,” Palmer said. “But some of the concerns around the benchmarks were, ‘How are they relating to different agency use cases?’ Let’s say there’s a need for an AI large language model to do a translation capability. Is the highest performing large language model also the best one for that translation? There could be large language models that are related to specific industries or particular government areas that may be the highest performing but may not show up on the initial benchmarks. So we are looking at standardized communication around what benchmarks are relevant to the use cases and what those use cases are for particular models that are being offered as part of a cloud service offering.”

CISA’s SCuBA program is also examining the incorporation of AI into the Microsoft 365 and Google Workspace products, Poland said.

“Both of them are add-ons to those base platforms,” he said. “And so once we go to the internal approval process, we’re going to get those into our test environments and see how they affect and change some of those configurations. Do we need to provide additional policies? Does our assessment tool need to adapt to make sure that we’re capturing everything? Can we leverage some information from that in order to make our products better? It’s something we’re already working on.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

Cloud Exchange 2024: NOAA’s Adrienne Simonson, Patrick Keown on sharing massive datasets with the public
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-noaas-adrienne-simonson-patrick-keown-on-sharing-massive-datasets-with-the-public/
Fri, 28 Jun 2024
NOAA’s Open Data Dissemination Program provides public access to the agency’s open data through a partnership with major CSPs, NODD leaders explain.

The National Oceanic and Atmospheric Administration collects tens of terabytes of data every day from satellites, radars, ships, weather models and more.

All that data adds up. NOAA currently holds about 56 petabytes of data across hundreds of datasets.

The agency’s mission involves gathering and sharing information about everything from the surface of the sun to the bottom of the ocean, and a wide array of industries rely on its datasets to do business.

That’s why it created the NOAA Open Data Dissemination (NODD) Program, which provides public access to the agency’s open data through a partnership with major commercial cloud platforms.

“NODD makes NOAA assets available to the public at no charge. We publish those data assets that are machine-readable. We maintain the data inventory. We engage with the public on the use of the data in order to expand the use of the data. And we provide the public with an opportunity to ask for the datasets they might want,” NODD Director Adrienne Simonson said during Federal News Network’s Cloud Exchange 2024. “Users can write in and talk to us about their challenges, and we’re going to answer their questions.”
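
For a sense of what free, machine-readable access looks like in practice, the sketch below anonymously lists a few objects from one of NOAA’s publicly documented open-data buckets on AWS. The bucket name and prefix are assumptions based on public NODD listings rather than anything stated in this article, and the example requires the boto3 package and internet access.

```python
# Anonymous listing of a NOAA open-data bucket on AWS.
# Bucket name and prefix are assumptions based on public NODD documentation;
# requires `pip install boto3` and outbound internet access.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

resp = s3.list_objects_v2(
    Bucket="noaa-goes16",          # assumed public NODD bucket (GOES-16 satellite imagery)
    Prefix="ABI-L2-CMIPC/2024/",   # assumed product/year prefix
    MaxKeys=5,
)

for obj in resp.get("Contents", []):
    print(f'{obj["Size"]:>12,} bytes  {obj["Key"]}')
```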

Consumers of NOAA data include researchers and scientists, who rely on it to analyze climate change and disaster recovery, for instance.

“We’ve gotten overwhelmingly positive support for data dissemination that we’re doing, the data that is out in the cloud and the access that we’re providing. That comes because we’re deeply engaged with those users,” NODD Program Manager Patrick Keown said.

“It certainly is a challenge to store just the sheer volume of the data,” he added. “One of our biggest issues is the fact that that’s rapidly changing, and we don’t know the volumes of the data that are coming. That has been one of our bigger challenges.”

Simonson said agriculture, insurance, renewable energy, retail and transportation industries all rely on NOAA data.

“Those folks are using the data for things like strategic operational planning. They’re also using it for risk management and supply chain management, and to create innovative products for their own customers,” she said. “So the scalability and the access really allow NOAA to essentially support the economy and spur innovation.”

The NODD Program supports the data needs of a wide range of businesses.

“We have heard from a lot of folks who are startups that say that they wouldn’t otherwise be able to have started the companies, except for that the data was free. We also hear from Fortune 500 companies, so we’ve got quite a spread there going on,” Simonson said.

The wide range of uses of the data is often staggering, Keown added, recounting being on a panel when someone from a large appliance maker discussed the importance of public access to NOAA data. The panel member pointed out that the NODD data helps it ensure that its refrigerators run at the right temperatures.

The panelist told Keown: “The only way for us to do that efficiently and to reduce our greenhouse gas emissions, and all of these other things, is to understand what the weather’s going to be like. And we can only do that if data is unfettered.”

How NODD makes ‘data democratization’ possible

NODD started out in 2015 as the agency’s Big Data Project, with agreements with Amazon Web Services, Google, IBM, Microsoft and a nonprofit called Open Commons Consortium.

In 2021, the project was rebranded as NODD as it began to focus on providing the public with free and easy data access via the cloud.

Keown said NOAA continuously improves NODD based on feedback from its users and addresses concerns around access and search.

“Some users really enjoyed that kind of user interface, feeling like they could find what they needed. But others, especially large private industry, large research industry, they needed completely open access. And they were the ones that were providing early on overwhelmingly positive feedback to us,” he said.

NODD is in its fifth year of essentially a 10-year contract. The contract is a two-year base contract, with four option periods of two years each.

“We’re looking to take on more datasets. We’re also looking to continue to engage with our users, to make sure they understand how to use the data. While we speak, though, there are teams within NOAA that are working with each other and with the cloud service providers to develop the future data dissemination,” Simonson said.

Among the program’s goals, Simonson said NODD supports “data democratization.”

NOAA wants to keep expanding access to the data to more people, she said.

“One of the reasons why this got started initially was because NOAA has a lot of data already that people were struggling to get to, and we were seeing the exponential rise of data volumes,” she said.

NOAA’s evolving cloud infrastructure

Keown said NOAA has been expanding its cloud presence over the past five years.

“All of the line offices within NOAA and the organization have worked at a different pace — at the pace that they felt they could be both innovative but also support operations. The National Weather Service is a very operational component. There’s life and property that are at stake, so making major infrastructure changes take a little bit longer. Others like the National Ocean Science Service, who really are often at the forefront of open science and trying to find new innovative ways to do things, they were able to maybe adopt cloud a little bit differently,” Keown said.

As NODD has evolved, the program office’s goal has been “to help those offices get to the cloud and give them this infrastructure to at least say, ‘Here, we’re going to utilize this for dissemination of data until we can figure out how to maybe do this internally.’ ”

All the agency’s offices now “are coming up to speed of how to build this cloud infrastructure, this cloud environment to support their needs and ultimately support the needs of the users — many of them new and different types of users than they originally had,” he added.

Keown said NOAA ensures all of the datasets in the NODD Program have full and accurate descriptions and that data users can follow up with the NOAA component agencies that generate the data.

“It also gives them the ability to get back to the owner of the data. For instance, if it’s a dataset that belongs to the National Marine Fisheries Service, it can directly link them back there. It could also directly link them to the cloud offering where they can access that data,” he said.
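
A minimal sketch of what such a catalog entry might carry (a description, a pointer back to the owning NOAA office and a link to the cloud copy) appears below; the field names and URLs are illustrative, not NODD’s actual metadata schema.

```python
# Illustrative NODD-style catalog entry; field names and URLs are hypothetical.
from dataclasses import dataclass


@dataclass
class DatasetRecord:
    name: str
    description: str
    owning_office: str      # NOAA component that produces the data
    owner_contact_url: str  # route questions back to the data owner
    cloud_access_url: str   # where the public copy lives


record = DatasetRecord(
    name="Example Fisheries Survey",
    description="Hypothetical survey dataset used only to illustrate the record shape.",
    owning_office="National Marine Fisheries Service",
    owner_contact_url="https://example.gov/contact-data-owner",
    cloud_access_url="https://example-cloud-bucket/fisheries-survey/",
)

print(record)
```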

Keown said NOAA continues to refine NODD based on user feedback, and the agency is looking at ways to make its data more accessible and valuable to the private sector and the public.

“When we start to talk about data in five years, or data in 10 years, we’re reaching out to every portion of the organization, every program. We work with the data owners to better understand what new datasets they’re going to bring online,” he said.

“Then, we have to work to create strategies, whether they’re cloud strategies, or data strategies, or strategies for AI — things that would give us the ability to make those decisions long term for the organization. And I think we have to be a little more flexible than probably the government has been in the past. We used to develop 10- and 20-year plans, and now we see ourselves developing two- to five-year plans that often change in a year or two.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

Cloud Exchange 2024: GSA’s Adam Grandt-Nesher, USDA’s Arianne Gallagher-Welcher on future of federal digital services
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-gsas-adam-grandt-nesher-usdas-arianne-gallagher-welcher-on-future-of-federal-digital-services/
Fri, 28 Jun 2024
Modernization and technical transformation weave together cloud, data management and customer experience, say GSA and USDA digital services leaders.

No one explicitly planned it, but the next generation of federal digital services depends heavily on the adoption of commercial cloud computing. Digital services require modernization of applications, supporting infrastructure and, typically, multiple data sources. That makes them a natural fit for cloud technology.

Two panelists at Federal News Network’s Cloud Exchange 2024 outlined this idea in detail.

Adam Grandt-Nesher, managing director of the Cloud Adoption and Infrastructure Optimization Center of Excellence at the General Services Administration, put it this way: “We’ve gotten to the point where most agencies that we work with have already-established, well-governed, reasonably well-managed digital services frameworks.” Therefore, “you can start focusing on specific workloads and transition into the cloud.”

Agencies aren’t necessarily retiring legacy systems that continue to support loans, grants, payments and other basic transactional services. Whether refactored for cloud hosting or remaining on agency data center mainframes, such applications remain necessary to the latest digital services.

For example, the Agriculture Department focuses on interconnecting modern, user-facing front-end systems in the cloud with its legacy systems in its data centers.

“There’s a really big push for that interconnectedness between the front end — what members of the public are seeing and how they are navigating — with the back end,” said Arianne Gallagher-Welcher, executive director of USDA Digital Service.

She aims to “make sure, from A to Z, the whole way the system works from end to end provides that design outcome that you’re looking for, for a particular user.”

Gallagher-Welcher said that at USDA, some people focus on the user experience while others focus on back-end systems.

“Data centers, they still have a value. They still have use, especially when folks are being intentional about what a modernization journey looks like. That’s not just flipping a switch, and all the data centers are gone,” she said. “With a lot of transitions to the cloud, I’ve seen that agencies are looking at how you can bring that together into one seamless experience.”

Plus, when adding cloud hosting, USDA pays attention to how data moves from one service to another so that it can maintain security, privacy and archiving requirements, Gallagher-Welcher said.

For seamless digital services, liberate the data

Digital services require data sharing and mobility in ways that traditional applications typically do not.

“To facilitate any of that communication or digitization of services that currently exists online, the first and foremost thing that an agency has to do is what we refer to as ‘liberate the data,’ “ Grandt-Nesher said.

Because data tends to be so widely distributed, “we have to either transition that into a shared space or build a mechanism to allow access to that data from any edge points in one of those multiple front doors,” he said.
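
One common pattern for that second option, an access mechanism in front of data that stays where it is, is a thin read-only service layer. The sketch below is a generic illustration using only Python’s standard library; the record format and lookup are stand-ins, not a description of any GSA or agency system.

```python
# Minimal read-only facade over a legacy data source, using only the standard library.
# The "legacy" lookup here is a stub; in practice it would call a mainframe, database
# or file store that stays in the agency data center.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LEGACY_RECORDS = {"12345": {"applicant": "Example Farm LLC", "status": "approved"}}


def lookup_in_legacy_system(record_id: str):
    """Stand-in for a call into the system of record; returns a dict or None."""
    return LEGACY_RECORDS.get(record_id)


class ReadOnlyFacade(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect paths like /records/12345
        parts = self.path.strip("/").split("/")
        record = lookup_in_legacy_system(parts[1]) if len(parts) == 2 and parts[0] == "records" else None
        status = 200 if record else 404
        body = json.dumps(record or {"error": "not found"}).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ReadOnlyFacade).serve_forever()
```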

For a department with as many distinct programs as USDA, the digital services challenge becomes about making its programs easily navigable from a customer standpoint.

“It’s really hard for the public to navigate and know exactly which place they need to go to get the answer they need,” Gallagher-Welcher said.

This means the public-facing component of an organization’s digital services must present a “no wrong door” experience, both she and Grandt-Nesher advised. That’s where human-centered design and journey mapping techniques come in. At the back end, the agency must rework processes to integrate data silos and share services.

It’s also where the cloud comes in. In fact, GSA helped USDA with an initial front door effort.

“One of our earlier projects was the ‘Ask USDA’ framework,” Grandt-Nesher said.

Ensuring customer-centricity defined the effort, he said.

“Especially since the CX executive order, when we arrive at an agency, our customer experience center defines what we’re there to do,” he said. “Everything that we do starts and ends with the customer’s experience. Technology, both on the data side and the infrastructure platform development side, serves to facilitate that transition.”

Beyond providing answers, USDA wants to improve access to its numerous, specific loan and other farm support programs. Gallagher-Welcher said USDA’s Chief Information Officer Gary Washington has charged the department’s tech staff with making access to services as easy as using an ATM.

That effort hinges on several strategies.

Fundamentally, she said, the mission areas are looking at how they can modernize business processes. The chief data officer is also USDA’s accountable official for artificial intelligence. That office, Gallagher-Welcher said, “is working really hard across the department to identify a number of use cases to test out using AI to see if that technology helps streamline some of those business processes.”

Another strategy component seeks to find ways to share products and technologies among programs to speed up modernization and avoid duplicative spending or work.

For instance, USDA’s Digital Infrastructure Services Center (DISC) “helps all of our mission areas and agencies access enterprise-level agreements for cloud, software as a service and other products so that we’re leveraging the buying power of USDA and also sharing and leveraging successes,” Gallagher-Welcher said.

Programs trying to modernize their services use DISC’s “Rolodex of solutions that could be deployed to help meet that need,” she said.

A third strategy component aims to ensure various missions and program officers learn from one another. It involves discussions at the department level, and even with congressional appropriators, to discover “those different types of mechanisms that we can leverage that really help incentivize and drive that collaboration upfront,” she said.

The idea is to use funding and shared enterprise services to make collaboration a regular part of the DNA of modernization throughout the department.

Making digital services viable for federal use

Managers and their staffs may understand their programs but perhaps not all of the nuances of the cloud. GSA has a cloud maturity and skills assessment capability to help with that, Grandt-Nesher said. He described it as a “library of solutions that are proven to work and proven to drive success in the federal government.”

The Technology Transformation Service, in which Grandt-Nesher’s center lives, also can take on the tricky acquisition of cloud services for client agencies if needed, he said.

He named cloud-hosted data services among those that can help with back-end integration for multiple applications. The agency still must determine which tier a given set of data must lie in, then use cloud services to get away from the need to buy and manage data hardware.

“Utilizing cloud services, you have high-availability, high-frequency access data sitting in the equivalent of in-memory storage versus lower-tier or deep storage data that is accessed rarely,” Grandt-Nesher said. That tactic lets agencies “then push it out into the world in a nice, governed fashion” based on access rights and privacy policies.

“This is, from an engineering perspective, one of the most technically complex challenges and has been well resolved in some cases,” he added.
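
Stripped down to its essence, the tiering decision is a mapping from observed access frequency to a storage class. The toy rule below illustrates that; the thresholds and tier names are arbitrary assumptions, not guidance from GSA or any cloud provider.

```python
# Toy tier-selection rule based on access frequency; thresholds are arbitrary assumptions.

def choose_storage_tier(reads_per_day: float) -> str:
    """Map observed access frequency to a storage class."""
    if reads_per_day >= 100:
        return "hot (in-memory or SSD-backed cache)"
    if reads_per_day >= 1:
        return "standard object storage"
    return "archive / deep storage"


for dataset, reads in [("benefit-lookups", 5000), ("quarterly-reports", 3), ("2009-archives", 0.01)]:
    print(f"{dataset:<18} -> {choose_storage_tier(reads)}")
```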

Delivering services digitally to constituents has elements in common with, but is not totally analogous to, the private sector. Federal programs have requirements and rules unique to dealing with citizens and using taxpayer dollars.

Still, USDA “hears from our demand signals from the public that they want to be able to access services online,” Gallagher-Welcher said. “They don’t want to necessarily go to one of our 4,500 county offices to get some of the guidance and expertise they need.”

USDA officials believe that by streamlining and automating 75% of the basic transactions and services, they can free up staff to focus on exceptions and complicated cases.

Plus, modernizing applications for digital delivery affects the other channels by which constituents deal with agencies by ensuring people responding to phone calls or chat queries have the same data that’s powering the digital service.

Because in the end, Grandt-Nesher said, picking up the phone or going online “are actually the same process.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

FedRAMP finalizes ‘fast pass’ approval process for AI tools
https://federalnewsnetwork.com/cloud-computing/2024/06/fedramp-finalizes-fast-pass-approval-process-for-ai-tools/
Thu, 27 Jun 2024
The new emerging technology prioritization framework will help determine which generative AI tools need to be pushed to the front of the line for approval.

The FedRAMP cloud security program is opening up its doors to specific types of generative artificial intelligence capabilities for priority approvals starting Aug. 31.

Vendors can submit generative AI (GenAI) tools to receive expedited review as part of the Federal Risk and Authorization Management Program’s (FedRAMP) new emerging technology prioritization framework. The prioritization specifically covers chat interfaces, code generation and debugging tools that use large language models (LLMs), and prompt-based image generation, as well as the associated application programming interfaces (APIs) that provide these functions. The program office released the final version today.

“FedRAMP will open submissions for prioritization requests twice a year. Requests for prioritization by cloud service providers (CSPs) are voluntary. FedRAMP holds prioritized cloud services to the same security standards as all other cloud services, and reviews them in the same way,” the program office stated in a blog post. “FedRAMP ensures prioritized cloud services are reviewed first in the authorization process. Requests will be evaluated against the qualifying and demand criteria to ensure prioritized technologies meet the goal of ensuring agencies have access to necessary emerging technologies. Initially, FedRAMP expects to prioritize up to 12 AI-based cloud services using this framework.”

FedRAMP PMO says it will announce initial prioritization determinations by Sept. 30.

The program management office said that while it started first with AI tools and capabilities, the framework is technology agnostic. It features a governance process and a CSP evaluation process.

“The governance process defines how up to three capabilities will be prioritized for ‘skip the line’ access to FedRAMP at any given time, and the amount of cloud service offerings (CSOs) with a given capability that will be prioritized,” the framework stated. “The CSP evaluation process outlines how new cloud service providers will have their CSOs qualified to access an accelerated review. Existing cloud service providers must work with their authorizing official and will follow the significant change request (SCR) process to include new enterprise technology (ET) CSOs in their authorization.”

New forms for FedRAMP priority process

Along with the new framework, the PMO released two forms for agencies and vendors to fill out. Cloud service providers whose offerings meet the ET criteria, and can demonstrate agency demand, can apply for the initial round of prioritization by completing the Emerging Technology Cloud Service Offering Request Form for cloud service offerings and the Emerging Technology Demand Form by Aug. 31.

The General Services Administration, which manages the FedRAMP program, issued the draft emerging technology framework in March seeking industry and agency feedback.

FedRAMP PMO developed the framework as required under the November 2023 safe, secure and trustworthy AI executive order issued by President Joe Biden.

Ryan Palmer, a senior technical and strategic advisor for FedRAMP at GSA, told Federal News Network during the 2024 Cloud Exchange, that the program office received more than 200 comments.

“Some of the things that we heard were concerns around the limits that we had in the framework. We tried to adjust those and clarify that those are going to be flexible and really driven by agencies’ needs, which could mean more generative AI solutions getting prioritized after the initial batch,” Palmer said. “Prioritization is not a blocker. So it’s not that other services are not going to get prioritized. It’s just that we do want to prioritize certain capabilities within our review process. Another area we did get feedback on is the benchmarks. Collectively, people liked the benchmarks. But some of the concerns around the benchmarks were how are they relating to different agency use cases?”

Palmer said the program office is looking at ways where they can standardize the communication around what benchmarks are relevant to the use cases.

From those initial comments, the program office made four major changes to the framework and two to the prioritization list.

(Details of the changes appear in the source: FedRAMP blog post, June 27, 2024.)

The PMO says one significant change was how it will analyze whether a service qualifies as generative AI.

“We’ve transitioned away from measuring cloud services against quantitative benchmarks and leaderboards. Instead, cloud service providers now submit public links to industry-standard ‘model cards.’ Those model cards describe key features of how their underlying AI models operate,” the PMO said. “Given the rapid pace of AI development, relying on benchmarks likely would have required an impractical amount of ongoing changes to have them continue to stay relevant across diverse use cases. Instead, FedRAMP will use the information on model cards to validate whether the AI being used is the type of capability being advertised. The purpose of collecting this information is not to assess the performance of the AI capability, but about whether the capability being offered is the one intended for prioritization.”
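
As a rough illustration of using a model card to confirm an advertised capability rather than to score performance, the sketch below checks a card for a few expected fields. The field names and capability list are hypothetical and do not reflect FedRAMP’s actual submission format.

```python
# Hypothetical check that a submitted model card describes the advertised capability.
# Field names and the capability list are illustrative; not FedRAMP's actual format.

EXPECTED_FIELDS = {"model_name", "provider", "modalities", "intended_use"}
PRIORITIZED_CAPABILITIES = {"chat", "code_generation", "image_generation"}


def matches_advertised_capability(model_card: dict, advertised: str) -> tuple[bool, list[str]]:
    """Return (qualifies, issues) for a model card and an advertised capability."""
    issues = []
    missing = EXPECTED_FIELDS - model_card.keys()
    if missing:
        issues.append(f"model card missing fields: {sorted(missing)}")
    if advertised not in PRIORITIZED_CAPABILITIES:
        issues.append(f"capability {advertised!r} is not on the prioritized list")
    if advertised not in model_card.get("intended_use", []):
        issues.append("advertised capability does not appear in the card's intended uses")
    return (not issues, issues)


ok, problems = matches_advertised_capability(
    {"model_name": "example-llm", "provider": "ExampleCo",
     "modalities": ["text"], "intended_use": ["chat"]},
    advertised="chat",
)
print("qualifies" if ok else problems)
```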

The PMO says it will continually review its processes and update its list as new requirements emerge, both AI and otherwise.

“FedRAMP will update and maintain an evolving list of prioritized ETs at least annually with input from agencies and industry followed by approval from the FedRAMP Board,” the framework stated. “Technologies will be removed from prioritization either by decision of the board, or when the target number of CSOs with the desired capabilities are available within the marketplace.”

 

Cloud Exchange 2024: CIO Council’s Innovation Committee on making gains through community collaboration
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-cio-councils-innovation-committee-on-making-gains-through-community-collaboration/
Wed, 26 Jun 2024
DOE, FDIC and DHS tech leaders of the CIO Council’s Innovation Committee collaborate with agencies on how to think differently about operations and technology.

Earlier this year, Federal Chief Information Officer Clare Martorana challenged the government community to think about this moment in technology differently. She encouraged the federal community to think across the ecosystem and about the “big bets” the public and private sectors are making in technology for the next generation.

Martorana asked: How can agencies harness that energy to think differently, to maximize their opportunities and to be safe, responsible and creative?

She may have thrown the gauntlet down to agencies and industry alike, but the Federal CIO Council’s Innovation Committee is leading the response. We talked with the committee’s three leaders — Ann Dunkin of the Energy Department, Chez Sivagnanam of the Federal Deposit Insurance Corp. and Dave Larrimore of the Homeland Security Department — to learn about the committee’s efforts during Federal News Network’s Cloud Exchange 2024.

“Sometimes when innovation comes up, people easily relate to emerging technologies. But it’s not. It doesn’t have to be emerging technology and all about new ideas,” said Sivagnanam, chief architect at FDIC. “To me, innovation is taking great ideas from people, experimenting and then finding a way to scale and make sure it’s adapted at an enterprise scale. That is really the [innovation] ecosystem.”

The Innovation Committee creates that ecosystem by bringing together not just CIOs, but chief technology officers, chief data officers, chief artificial intelligence officers and other technology managers across government.

Larrimore, CTO at DHS, said because each of these leaders has different roles and responsibilities, the committee has become a forum for members to discuss big picture challenges and opportunities.

Given the many laws, regulations and policies that collectively and individually impact agencies’ technology efforts and missions, Larrimore described the Innovation Committee as “the connective tissue” between numerous teams, councils, committees, organizations and groups.

“We have representatives from far-reaching backgrounds in various departments and agencies that include cyber equities, that include data, that include IT operations, et cetera,” he said. “Really, our group is designed around the intersection of those various committees and helping to drive efforts that make sense to have a combined approach.”

Federal innovation ideas with potential for tangible outcomes

One forthcoming example of that is a potential project to be submitted to the Technology Modernization Fund for funding. Dunkin, CIO at Energy, said she can’t offer too many details except that the project will be around AI.

But it’s just one example of how the committee is taking on Martorana’s challenge to think differently.

“We are really trying to find opportunities to create tangible outcomes and not just talk. Even when we do these symposiums, our goal has been to have follow-up happen after the symposium,” Dunkin said. “For example, we did that fraud symposium, and then the CFO Council took that, ran with it and did the next activity around that. Our goal is really to make sure whatever we’re doing it gets sticky, that it gets some outcomes and results and not just, ‘Oh, we had a nice chat today.’ ”

Sivagnanam added that how the committee creates community comes in many different forms. Sometimes it’s as symposiums, like the one it sponsored in March around AI and quantum computing. Other times, the committee develops artifacts like roadmaps or journey maps focused on emerging technologies or data-driven innovation, cybersecurity or customer experience.

“We want to be the bridge across all these different people. If there are great things happening on one side and the other side needs help, we create some artifacts and then drive that journey,” Sivagnanam said. “This year our focus is primarily on data-driven innovations and customer experience. We’re also helping our groups on some of the cybersecurity innovation stuff.”

When it comes to innovation, the committee looks at ideas from two perspectives. The first is operational innovation, which Sivagnanam described as taking a hard look at processes that are done repeatedly — but doing so with an “agile mindset where you try to improve it with some new contributions. That is something that happens all the time.”

The second innovation category involves looking at a process or technology from a “radically different” perspective based on new technology, new people, new data or something that changes the lives of the users.

“The mark of innovation for us is really about the ability to continuously modernize. Our ability to look at big problems and solve them in ways to where we continuously are able to innovate is ultimately the measure of success,” Larrimore said. “It’s something that cloud, agile, DevSecOps all give us access to and so when you start applying commercial AI to those three foundational tenants of modernization, you now have an opportunity to start achieving that goal.”

Examples of innovation targeting AI at DHS

Dunkin added the committee seeks opportunities that meet a couple of criteria, starting first with what’s most valuable to the government as a whole.

“What can the Innovation Committee do that is going to bring the most value to our colleagues, to the government and to the citizens who pay for all this?” she said. “The second thing is, what is the committee going to get excited about doing? We can come up with this great agenda of things that are going to be great taxpayer value, but ultimately this is a volunteer activity and people do this because they want to do it. We have to create things that people are going to get excited about doing.”

Larrimore highlighted two ongoing innovation efforts at DHS that are all about technology but simultaneously have nothing to do with the technology itself.

“DHS identified a real gap around knowledge, skills and abilities around AI — and in the federal workspace. We’ve instituted a one-of-a-kind DHS AI Corps to bring in 50 AI experts. We already have, I think, seven on board as of today. We’re going to have close to 15 by the end of this fiscal year,” he said.

“We’re already starting to gear them up to deploy projects that are supporting the executive order and other administration priorities in the department. It’s just been incredibly exciting as we have received over 5,500 resumes, and we’ve had several hundred interviews. It is incredible to see the caliber of people. Never in my career have I seen the caliber of individuals coming from the private sector, raising their hand and really saying that ‘I want to support a mission.’ ”

The second example also focuses on AI. Larrimore said DHS spent the last several months putting together a first-of-its-kind policy around the use of commercial generative AI tools. It also developed training agreements with several companies for “government-friendly terms of service” as well as supervisory guidebooks and other tools.

“We have made this publicly accessible. We have had several conversations at the state and federal levels, and any federal department can use this,” Larrimore said. “A lot of the companies that we built these agreements with have told us, ‘Please share this as much as possible. We want to be able to accommodate other potential customers.’ We get protection on privacy that we don’t get just using what’s out in the marketplace. We get protections around indemnification that we don’t with a lot of these companies, and it solves a real big gap that we’ve had in the cloud space that the department is really starting to realize is a huge benefit for more than just DHS.”

Dunkin, Larrimore and Sivagnanam also warned that while innovation and government might seem like strange bedfellows, agencies have good reasons to be more cautious than the private sector.

Dunkin said there is widespread acknowledgement that the procurement process can lead to frustrations and can slow down innovative efforts.

But, at the same time, Larrimore said he has seen more desire both from government and industry to explore new and different ways to modernize technology and processes. To that end, he encourages people to participate in the committee’s activities.

“The best way to prevent errors is to ask questions and learn — and have a culture of learning. That has been a huge center point around the reason why the Innovation Committee exists. We want to bring around that culture of learning and culture of sharing,” he said. “I’m going to tell folks to come by and lean in. You don’t have to have a CIO sitting on the Innovation Committee. You can send folks who are really interested in innovation and learning about technologies that their department or agency hasn’t had access to yet, to ask questions about how something works and be part of the community. I guarantee you and your department will be better off.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

Cloud Exchange 2024: OMB’s Drew Myklegard on fostering mature cloud capabilities
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-ombs-drew-myklegard-on-agency-it-modernization-maturation/
Tue, 25 Jun 2024
OMB’s deputy federal CIO says the government aims to ensure parity with the commercial sector by modernizing policies like the one for cloud security.

The federal Cloud Smart Strategy turns five years old this June, and it’s more relevant today than it was back in 2019 when the Office of Management and Budget released it.

Why is that? “There’s a comfort level, I think, that we have already. We have removed a lot of the barriers, and people are good at buying it now,” Drew Myklegard, deputy federal chief information officer at OMB, said during Federal News Network’s Cloud Exchange 2024.

The strategy focuses on three key pillars to make cloud adoption successful: security, procurement and workforce. OMB said at the time of its release that these elements “embody the interdisciplinary approach to IT modernization that the federal enterprise needs in order to provide improved return on its investments, enhanced security and higher quality services.”

Not only do those words ring true today, but agencies’ technology teams are also more educated about the what and the how to make their mission and back-office systems run in the cloud, Myklegard said.

He said OMB has no plans, at this point, to update the strategy — in part because of its relevance now more than ever.

“The things that we are always very attuned to are all the developers and whether they are using the most modern tools in very similar ways to the environments that you see in the commercial sector,” Myklegard said. “Our goal is to have a developer come from a commercial company and have a very similar experience within the government. That means how long it takes them to commit their first line of code or how quickly they can spin up an environment, how quickly they can scale. We’re trying to seek parity.”

Parity also means not having to even ask if an agency is in the cloud. Myklegard said the assumption is all agencies are and can use whatever tools are available to them.

OMB providing guardrails, top cover on cloud

Another way OMB works to ensure parity with the commercial sector is by modernizing specific policies like those around the cloud security program, FedRAMP, or giving agencies more comfort to apply artificial intelligence capabilities within cloud instances.

“OMB does a really good job in the areas of: ‘Here are the guardrails. Here’s the top cover.’ But when [agencies] are building a custom app in the cloud today, we’ve really put out the guardrails and we’ve put out the top cover so they’re able to use the best tools to fit their mission and use risk-based decision-making. We are making sure we haven’t tied their hands,” Myklegard said.

“What we see is the next innovation of our policies. A lot of times we’ll put out a strategy, or a lot of times we’ll use memos, or a lot of times we’ll use circulars. We like to use all of them to achieve the goal. But we’re also getting feedback from agencies about the things that are most helpful and the things that are barriers to them to succeeding. Those are the areas that we’re absolutely focused on.”

One recent example? Myklegard pointed to the update to the FedRAMP memo. He said it was clear agencies were moving quickly to use more software as a service (SaaS) applications. Therefore, he said, the security and other supporting policies needed to change.

The draft FedRAMP memo, released in October, reflects the demands agencies have on the program to approve more SaaS products. The FedRAMP Program Management Office followed the draft memo in March with a new roadmap. The roadmap recognizes the need for an accelerated path to bring in more SaaS products, Myklegard said.

“We have found that the agencies that are leaning in on SaaS products are doing a couple of things. One, they’re able to fulfill a lot of security requirements since those SaaS products don’t have a lot of the legacy challenges that our on-premise solutions have. You can adopt them very rapidly through acquisitions, get them secured by using most of the ones that are already FedRAMP-ed and then you’re just thinking about the controls that are pertinent and relevant to your office,” he said.

“As developers, internally, we have to manage roadmaps. We have technical debt. We’re fighting for dollars. We’re doing a lot of different things to make a product successful that most people don’t see. So if you’re able to offload a lot of that to an outside company but still influence the roadmap, it’s really the best of all worlds.”

Budget matters more than ever to cloud use, SaaS

Myklegard, who worked for the Department of Veterans Affairs before coming to OMB, pointed to that department’s updated website, VA.gov, as an example of how an agency could reap the benefits of moving to SaaS. He said VA now is able to put their best developers and engineers on the project, and they don’t have to worry about the back-end infrastructure and can focus on providing veterans with the best services.

“Our main goal is great IT across the federal government, and so the way to get there is we just cannot build it. There’s no reason to try and build something that’s already in the commercial sector. So when a business line is trying to deliver on a policy and mission or some other need, you want to speed the time in which we can roll out programs and public benefits to those that need it the most,” he said. “We’ve seen that across the federal government. So not only are we smarter in tech, but we’re also smarter in all the things that are enabling that to be successful.”

Myklegard said two of the biggest factors driving this change are the focus from the top, whether OMB or Capitol Hill, and the understanding on the budget side of why funding these technology programs matters more today than ever.

He said one way OMB measures the progress of IT modernization is through constant communications with agencies, from the Office of the Federal CIO as well as from OMB’s desk officers.

“Of course, we want to run by metrics. There’s a couple of areas that we really focus on there. One is the Federal Information Security Modernization Act. We bring in a lot of data through that, which is digested by our teams, and we make decisions at the leadership level where we should be investing, where we should help, be budgeting and making strategic investments at agencies,” Myklegard said.

“Then, we’re six months in on our Digital Experience memo. Once we get the memo out the door, as Federal CIO Clare Martorana likes to say, ‘The cement starts to harden,’ and the program starts to take form. We are looking at some very specific things around digital analytics programs, a couple key areas that we see over at the General Services Administration’s programs around customer experience.”

He added OMB also pays attention to feedback agencies receive from customers and citizens, particularly the high-impact service providers.

“We spend a lot of time focused on that data collection and that outreach. We really feel like it pays off when we go and work with our budget colleagues, or when we’re speaking with the Hill,” Myklegard said.

“On the budget side, we’re making decisions around where we invest while being in a constrained budget environment. This is a time when we need to be getting the most value out of our dollars in the programs. There are very intense conversations around, ‘How much are we going to put into this program? What should this program cost if they’re launching a new one? How soon can we expect time to value? Is there a way that we can limit our costs by not building a huge infrastructure and waiting for it to grow?’ Well, that’s cloud and you are building an app that you inherit a lot of the security that’s already there, so I’m reducing those costs versus having to build all that infrastructure myself.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

Cloud Exchange 2024: Reimagining customer experience through ‘actionable intelligence’
https://federalnewsnetwork.com/federal-insights/2024/06/reimagining-customer-experience-through-actionable-intelligence/
Tue, 25 Jun 2024
Genesys’ U.S. public sector vice president offers actionable insights on improving CX through agent assist technology and cloud.

Stress is too often present in customer experience encounters, on both the side of the customer and the employee.

Part of this is that customer expectations have grown dramatically higher in recent years. Citizens have become used to the kinds of experiences they have on their cell phones, from online shopping to apps. But government agencies struggle to provide that same kind of experience, for a host of reasons.

“There are many things that make customer experience a tricky category. It isn’t easy to meet those needs and those expectations,” said Jason Schick, U.S. public sector vice president for Genesys. “Different customers want to communicate differently, but they expect the same satisfying outcomes, whether they interact by phone call, email, SMS text, social media or chatbots. Getting consistency and uniformity of information and outcome across channels takes a lot of coordination.”

Agencies have to figure out answers to difficult questions, Schick said during Federal News Network’s Cloud Exchange 2024.  He offered a few examples:

  • How do you get an agent involved at the right time, before the customer gets stuck?
  • How do you determine the agent best suited to resolve the issue?
  • How can that agent be aware of the context without requiring the customer to repeat themselves?

Then, there’s the issue of employee morale, which is a struggle most call centers face and leads to the dual challenges of low retention and large investments in hiring and training that take time to pay off, Schick said.

How agencies can improve CX

Improving CX is a governmentwide goal, and every public facing agency wants to help its employees who interact directly with citizens do so better and accomplish more in those interactions. One way Schick suggested that agencies can do this is with agent assist technology.

For example, working with the National Domestic Violence Hotline, Genesys was able to help implement an artificial intelligence-driven system that makes inferences about the caller — whether they’re using voice, text or chat channels — to help route them to the right agents and provide additional context for that agent. That helps the hotline bring the best possible capabilities to bear in support of callers in their moments of crisis.

“It’s about deriving intelligence that becomes actionable in the moment,” Schick said. “CX is becoming increasingly data-centric. There are large, important repositories of information across large enterprises that include lots of biographic data, lots of historical data about every individual customer. We’re going to see the history and all the characteristics of the person who’s calling start to show up in the services provided by the agents and by all the bots. That shows an even greater understanding of who they are and what they’ve done before and therefore predicts what they might need.”
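
To make the idea of context-driven routing concrete, here is a minimal, hypothetical sketch in Python. It is not Genesys code; the channel names, scoring rules and agent attributes are illustrative assumptions. It simply shows the general pattern: infer context about an inbound interaction, then match it to the best-suited available agent so that context travels with the customer instead of being repeated.

    # Hypothetical sketch of context-aware contact routing.
    # Field names and scoring weights are illustrative assumptions,
    # not any vendor's actual implementation.
    from dataclasses import dataclass, field

    @dataclass
    class Interaction:
        channel: str            # "voice", "sms" or "chat"
        inferred_topic: str     # output of an upstream classifier
        prior_contacts: int     # how many times this person has reached out
        urgency: float          # 0.0 (routine) to 1.0 (crisis)

    @dataclass
    class Agent:
        name: str
        skills: set = field(default_factory=set)
        channels: set = field(default_factory=set)
        open_cases: int = 0

    def score(agent: Agent, contact: Interaction) -> float:
        """Higher is better: skill match, channel match, light current load."""
        s = 2.0 if contact.inferred_topic in agent.skills else 0.0
        s += 1.0 if contact.channel in agent.channels else 0.0
        s += 0.5 if contact.prior_contacts > 1 else 0.0   # repeat contacts get priority
        s += contact.urgency                              # urgent contacts outrank routine ones
        s -= 0.25 * agent.open_cases                      # prefer less-loaded agents
        return s

    def route(contact: Interaction, agents: list) -> Agent:
        # Pick the best-scoring agent; the inferred context rides along with
        # the interaction so the customer does not have to start over.
        return max(agents, key=lambda a: score(a, contact))

    agents = [
        Agent("A. Rivera", skills={"benefits"}, channels={"voice", "chat"}, open_cases=2),
        Agent("B. Chen", skills={"crisis", "benefits"}, channels={"voice", "sms"}, open_cases=1),
    ]
    contact = Interaction(channel="sms", inferred_topic="crisis", prior_contacts=3, urgency=0.9)
    print(route(contact, agents).name)   # B. Chen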

Importance of cloud to future of CX

Many of these advances are made possible by the maturation of the cloud. Dynamic scaling and automatic fail-over are helping agencies deal with fluctuations in volume, which is especially important to agencies like the IRS, which have predictable, seasonal influxes, or the Federal Emergency Management Agency, which has spikes that follow major natural disasters. It also allows agencies to scale their workforce more quickly by enabling employees to work remotely.
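
As one concrete illustration of that elasticity, here is a minimal sketch that uses the AWS Application Auto Scaling API through boto3 to attach a target-tracking policy to a containerized workload. The cluster name, service name, capacity bounds and CPU target are placeholder assumptions rather than any agency's actual configuration; other cloud providers expose equivalent controls.

    # Illustrative sketch: target-tracking autoscaling for a containerized
    # contact-center workload. Resource names and thresholds are assumptions.
    import boto3

    autoscaling = boto3.client("application-autoscaling")

    # Register the ECS service as a scalable target with a floor and a ceiling.
    autoscaling.register_scalable_target(
        ServiceNamespace="ecs",
        ResourceId="service/cx-cluster/contact-center-app",   # placeholder names
        ScalableDimension="ecs:service:DesiredCount",
        MinCapacity=2,       # steady-state footprint
        MaxCapacity=50,      # headroom for tax season or disaster-driven surges
    )

    # Scale out when average CPU crosses 60%; scale back in as demand falls.
    autoscaling.put_scaling_policy(
        PolicyName="cx-cpu-target-tracking",
        ServiceNamespace="ecs",
        ResourceId="service/cx-cluster/contact-center-app",
        ScalableDimension="ecs:service:DesiredCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 60.0,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
            },
            "ScaleOutCooldown": 60,    # add capacity quickly during a surge
            "ScaleInCooldown": 300,    # release it more slowly afterward
        },
    )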

Cloud is also a prerequisite for AI. Cloud scale provides the capacity for the data collection and analysis needed to train these algorithms, as well as the processing power necessary to operate them.

“We’re at the early stages of a major reimagination of what experience is for customers and for government,” Schick said.

Where agencies can begin on CX improvement

For agencies deciding where to begin improving CX, Schick has some advice: Start with experience orchestration. Agencies can map out customer interaction journeys and identify the points of friction. There are usually easily identifiable interactions where customers get stuck, lost or frustrated, and often these will align with areas of stress for employees as well. Applying AI or automation at these friction points will smooth the interaction, improve the resolution path and build confidence with customers, Schick said.
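
As a rough illustration of how a team might surface those friction points, the following Python sketch computes per-step abandonment rates from interaction logs. The event schema and the threshold are assumptions made for the example, not a feature of any particular product.

    # Hypothetical sketch: rank journey steps by how often customers abandon them.
    # Event fields ("session", "step", "outcome") are assumed for the example.
    from collections import Counter

    def friction_report(events, drop_threshold=0.25):
        """Return steps whose abandonment rate meets or exceeds drop_threshold,
        highest-friction first; these are candidates for AI or automation."""
        totals, drops = Counter(), Counter()
        for e in events:
            totals[e["step"]] += 1
            if e["outcome"] == "abandoned":
                drops[e["step"]] += 1
        rates = {
            step: drops[step] / totals[step]
            for step in totals
            if drops[step] / totals[step] >= drop_threshold
        }
        return dict(sorted(rates.items(), key=lambda kv: kv[1], reverse=True))

    sample = [
        {"session": "1", "step": "identity-check", "outcome": "abandoned"},
        {"session": "2", "step": "identity-check", "outcome": "completed"},
        {"session": "2", "step": "status-lookup", "outcome": "completed"},
    ]
    print(friction_report(sample))   # {'identity-check': 0.5}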

From the opposite perspective, agencies can also integrate existing systems and user interfaces to improve the employee experience. Reducing or eliminating the “swivel chair” dynamic — where employees have to pivot between systems, including reentering data manually — can allow them to focus on serving customers more effectively and efficiently, improving the customer experience.

“The experiences of the customer and of the call center agent are so deeply intertwined. At Genesys, we exist to make big improvements for both,” Schick said. “Genesys is highly focused on using the latest technology to bring empathy to each touch for the customer and the employee. Because we believe deeply that by doing so, we power the kind of experiences that breed satisfaction, loyalty and confidence.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

Cloud Exchange 2024: Wiz’s Mitch Herckis on reimagining cloud security
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-wizs-mitch-herckis-on-gaining-visibility-for-cloud-security/
Thu, 20 Jun 2024
Former White House official says agencies are in the midst of a “culture change” when it comes to cybersecurity.

The federal government’s ongoing shift to cloud computing has brought cybersecurity benefits, but it’s also added complexity for cyber defenders.

That’s according to Mitch Herckis, who served as branch director for federal cybersecurity in the White House Office of the Federal Chief Information Officer. Herckis left that post this spring to join Wiz as global head of government affairs.

“The cloud provides a huge amount of predictability and ability to configure in such a way that there’s lots of documentation,” Herckis said during Federal News Network’s Cloud Exchange 2024. “There are APIs, there are configurations that make it very easy to understand and make it much more uniform around what you do in the cloud. However, there is complexity and speed that comes with that.”

During the height of the COVID-19 pandemic, agencies were able to spin up cloud workloads quickly to continue carrying out their missions and delivering services to the public.

“So there’s that flexibility and adaptability,” Herckis said. “[But] maintaining visibility and being able to understand the risk that comes with those changes is very unique. Especially as you add more complexity, you mesh these services together and create new relationships within the cloud that the security teams, the development teams, the underlying service provider and the users need to keep up with.”

Cloud begets cybersecurity culture change

Herckis said President Joe Biden’s May 2021 cybersecurity executive order and the subsequent 2022 federal zero trust strategy set a baseline for agencies, starting them “on the right path in the cloud.” Agencies are moving to adopt key cybersecurity practices, such as multifactor authentication and data encryption, across their enterprises.

“We have a real culture change going on within the federal government to really push us into that zero trust environment,” Herckis said. “And I think it’s really a testament to the folks who are there with how far we’ve gotten in such a short period of time. There’s more to be done, obviously.”

Herckis advocates for maintaining visibility into cyber risks within the changing network environment — which spans from on-premises data centers to various cloud services.

“How do we democratize security to ensure that the development team understands how those changes might affect the risk to the cloud, and the security operations centers and the security teams can also understand the business cases and use cases without having to continuously reach back to the development team?” he said. “It has to be a team effort. And there has to be a unified approach to creating that visibility and understanding potential attack patterns that come from this continuous change.”
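
As a small illustration of what that kind of programmatic visibility can look like, the Python sketch below uses boto3 to flag S3 buckets that lack a public access block. It is a generic example covering one narrow configuration risk, not a description of Wiz's platform or of any agency's tooling.

    # Illustrative sketch: flag S3 buckets missing a public access block.
    # One narrow check, shown only to make "programmatic visibility" concrete.
    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def buckets_without_public_access_block():
        findings = []
        for bucket in s3.list_buckets()["Buckets"]:
            name = bucket["Name"]
            try:
                cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
                if not all(cfg.values()):      # any of the four safeguards left off
                    findings.append(name)
            except ClientError as err:
                if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                    findings.append(name)      # no block configured at all
                else:
                    raise
        return findings

    if __name__ == "__main__":
        for name in buckets_without_public_access_block():
            print(f"Review public access settings for bucket: {name}")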

Addressing AI cybersecurity risks in government

The imperative to understand cybersecurity risks has only grown as agencies explore artificial intelligence and machine learning.

“This is an entirely new feature that is essentially often getting spun up in the cloud alone,” Herckis said. “That creates new risks that we’re not used to managing. And we don’t have a playbook that’s been around for a decade. So agencies need to be really thinking about the security posture of these and working with people both in the public sector and private sector to understand the best practices around that.”

Under the Biden administration’s AI approach, the Cybersecurity and Infrastructure Security Agency wants to ensure AI systems are protected from cyber-based threats.

“We’ve had some threat researchers find some extraordinarily unique ways of breaking out of isolated environments by injecting into one large language model some nefarious information, which seemed to be through legitimate purposes, and then being able to access others that are supposedly isolated,” Herckis said. “There’s a lot of unique ways to essentially poison those environments or change it and create new attack vectors.”
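
One widely used defensive pattern that follows from that observation is to treat model-generated output as untrusted input before it crosses an isolation boundary. The Python sketch below is a deliberately simple, hypothetical illustration of that idea, an allowlist check on model-proposed actions; it is not a complete defense against prompt injection or model poisoning, and the action names are invented for the example.

    # Hypothetical sketch: vet LLM output before it acts on another system.
    # The action names and policy here are illustrative assumptions.
    ALLOWED_ACTIONS = {"summarize_document", "lookup_case_status"}   # explicit allowlist

    def vet_model_action(proposed: dict) -> dict:
        """proposed: e.g. {"action": "lookup_case_status", "args": {"case_id": "123"}}.
        Rejects anything outside the allowlist instead of passing it through
        to a second, supposedly isolated environment."""
        action = proposed.get("action")
        if action not in ALLOWED_ACTIONS:
            raise PermissionError(f"Model-proposed action not permitted: {action!r}")
        # A real deployment would also validate argument schemas, log the call
        # for audit and require human review for sensitive actions.
        return proposed

    vet_model_action({"action": "lookup_case_status", "args": {"case_id": "123"}})   # passes
    # vet_model_action({"action": "export_all_records", "args": {}})   # raises PermissionError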

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.
