ONR looks to automation to speed federal hiring process
Federal News Network | Wed, 17 Jul 2024

The Office of Naval Research wants an automated hiring portal to track employees' information during onboarding and offboarding processes.

The post ONR looks to automation to speed federal hiring process first appeared on Federal News Network.


By Derace Lauderdale

The Office of Naval Research is developing an automated hiring portal to track its onboarding and offboarding processes, hoping to improve customer experience for both.

ONR, the Navy Department’s main science and technology organization, is constantly looking for new talent all over the world, and it employs more PhD holders than any other federal agency, from trained scientists to research professionals.

For now, the customer experience for new hires is somewhere between good and very good, but ONR is still spending a lot of time looking into automation and areas where it can improve, said Curtis Pelzer, the organization’s chief information officer. He said ONR is looking at reducing the time it takes to onboard personnel and providing more information to leadership on why employees are leaving.

“In terms of our onboarding process, a lot of that process is manual. And when we identify a potential hiring candidate, a lot of that initial interaction also happens manually.  In terms of all the forms that they need to fill out to become a federal employee or transition from another federal agency, what we’re looking to do is allow them to provide all that information online, through what we’re calling a program or hiring portal. They log in, and they are able to see all the information that they need in terms of completing forms and being able to transmit those forms back to the hiring manager,” Pelzer said on Federal Monthly Insight — Customer Experience. “And, then, after the employee has been onboarded to the command, we’d be able to automate the entire lifecycle of that employee. So all those documents that were generated during the hiring process are made available to that employee.”

Information would also be automatically passed on to others in the organization who need to know about the onboarding process, including hiring managers, supervisors, and HR personnel, depending on their roles.
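In code terms, that kind of role-based visibility could look like the following minimal Python sketch. The role names and document types here are purely illustrative assumptions, not ONR's actual schema:

```python
# Hypothetical sketch of role-based routing for onboarding information.
# Role names and document types are illustrative, not ONR's actual schema.
ROLE_VISIBILITY = {
    "hiring_manager": {"offer_letter", "position_description", "start_date"},
    "supervisor": {"start_date", "position_description"},
    "hr_specialist": {"offer_letter", "benefits_forms", "start_date", "position_description"},
}

def documents_for_role(role: str, package: dict) -> dict:
    """Return only the onboarding documents a given role is cleared to see."""
    allowed = ROLE_VISIBILITY.get(role, set())
    return {name: doc for name, doc in package.items() if name in allowed}
```

In a real portal, the visibility map would come from an identity and access management system rather than a hard-coded dictionary.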

“The system would provide for the individual roles that are needed to make sure that the employee lifecycle is being met, and the things that the employee would need during their tenure. I speak of this in terms of lifecycle, because I believe that there’s a beginning, which is the onboarding process,” Pelzer said. “And then you look at the sustainment of that employee during their tenure, and you look at the offboarding process. That would entail, how do you recover those assets that have been provisioned for the employee? And then how do you successfully offboard that employee, making sure that they have everything that they need when they’re departing the organization?”

Other information in the portal could provide insight into why an employee decides to leave the agency, which might enhance the automated process in the future.

Before developing the new automated portal, ONR had trouble automating processes because of its existing manual workflows. Originally, it would overlay new technology on top of the manual process, but would not get the level of efficiency it was expecting.

Artificial intelligence has also been part of the conversation at ONR, as the organization works on AI-enabled capabilities. Pelzer said the challenge is identifying what data will be allowed, and making sure the data remains secure, without putting anything at risk.

“We have been looking at using bots using robotic process automation to help streamline routine processes that we believe can be done better by a bot, or using an AI. And most certainly bring a level of efficiency to these processes like onboarding and offboarding of personnel and tracking personnel better,” Pelzer said. “When you look at our data and analytics program, we’re building AI-enabled capabilities every single day. These are taking systems that we’ve already built, and then layering that generative AI on top of that, to be able to better serve our workforce, to give them an additional capability that we previously didn’t have, prior to the advancement of AI.”

ONR is also looking at new ways to track metrics on how many personnel are onboarding and offboarding; currently, it captures these metrics manually. For Pelzer, metrics provide insight into call resolution and how well the organization is meeting customer needs.
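As a rough illustration of what automating that metric capture might look like, here is a small Python sketch that summarizes onboarding counts and average time-to-onboard from hypothetical personnel records. The field names are assumptions for illustration, not ONR's data model:

```python
from datetime import date

# Illustrative sketch: replacing manual metric capture with a computed summary.
# Record fields ("offer_accepted", "access_granted") are hypothetical.
def onboarding_metrics(records: list[dict]) -> dict:
    """Summarize how many people onboarded and the average days to onboard."""
    durations = [
        (r["access_granted"] - r["offer_accepted"]).days
        for r in records
        if r.get("access_granted")
    ]
    return {
        "onboarded": len(durations),
        "in_progress": len(records) - len(durations),
        "avg_days_to_onboard": sum(durations) / len(durations) if durations else None,
    }
```

A dashboard fed by queries like this is what lets leadership see onboarding health without anyone tallying spreadsheets by hand.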

“One thing that’s the most gratifying in terms of customer satisfaction is the messages that I receive from the customer that say, ‘Hey, well done, your team has done something that I didn’t think was possible,’ or the response time, or the level of satisfaction the customer received. So, the metrics certainly give you insight into how well your team is performing. We look at the data, but having those notes that come in is really something that I look forward to,” Pelzer said.

The power of AI, data in preparing for the next national emergency
Federal News Network | Tue, 16 Jul 2024

The potency of AI and implementation of data-enabled missions hinges on skilled talent.

The post The power of AI, data in preparing for the next national emergency first appeared on Federal News Network.

Our nation is currently embroiled in multiple geopolitical theaters, and our government is working hard with allies and partners around the world to ensure resilience and mission success. Simultaneously, we can’t let global events and international needs halt or impede innovation at home on the civilian front and for the good of U.S. citizens.

Core civilian agencies are tasked each day with ensuring U.S. prosperity, continuity and trust on a national level, and these agencies now find themselves in a unique position at the intersection of massive mission needs and increasing volumes of data. The key to pulling it all together: artificial intelligence, the most important technology advancement in a generation. Deadlines stemming from last year’s executive order on AI are quickly approaching for agencies to comply with the EO and Office of Management and Budget requirements, serving as a critical impetus to ensure the resiliency of the country, powered by data and AI, no matter what is happening on the global scale.

It is critical that civilian agencies forge ahead with robust, coordinated, scalable and repeatable strategies to take advantage of AI and the power of data, mounting all-of-government responses that not only maintain equilibrium but also meet challenges at home, from extreme weather events and public health crises (drawing on lessons from the COVID-19 pandemic) to financial and critical infrastructure threats and beyond. In an era marked by geopolitical tensions, climate crises and cybercrime, being prepared is not merely an option but a necessity.

AI and data: The key to empowering critical civil agencies

So what does preparation look like in action? Technology and data are not just tools, but lifelines that can significantly impact emergency responses and day-to-day operations. Here are three critical areas where AI-powered and data-enabled mission approaches can revolutionize civilian and public sector efficiency and efficacy:

  1. Climate resilience: As natural disasters become more frequent and severe, the need for comprehensive data sharing across agencies has never been more urgent. AI can process vast datasets rapidly, pinpointing at-risk communities and extending the lead time for extreme weather forecasts, turning hours into minutes and saving lives in the process.
  2. Public health: Early detection of public health threats can prevent them from spiraling into endemics and full-blown pandemics. Through enhanced data sharing between local and federal entities, and AI-driven pattern recognition, agencies can quickly identify potential outbreaks, ensuring that preparedness is a step ahead of the problem.
  3. Fraud prevention: The importance of bolstering the resilience and security of systems cannot be overstated. A recent advisory from the Cybersecurity and Infrastructure Security Agency highlights the threat of nation-state hackers targeting civil society organizations to destabilize democratic values. The repercussions of such cyberattacks have already disrupted our healthcare systems. By employing AI to continuously monitor and analyze systems and data, and to respond to security breaches and fraudulent activities swiftly, we can enhance the integrity of our civil agencies and protect the interests of our citizens. This proactive approach is vital in safeguarding our nation, its people, and our democratic way of life against nefarious threat actors.
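The AI-driven pattern recognition described in the public health item can be illustrated, in vastly simplified form, by flagging case counts that spike above a trailing baseline. The window size and threshold below are arbitrary assumptions, not a production surveillance model:

```python
# Toy sketch of baseline-deviation detection for early outbreak warning.
# Window and threshold are illustrative assumptions.
def flag_anomalies(counts: list[int], window: int = 7, threshold: float = 2.0) -> list[int]:
    """Return indices where a count exceeds `threshold` times the trailing average."""
    flagged = []
    for i in range(window, len(counts)):
        baseline = sum(counts[i - window:i]) / window
        if baseline > 0 and counts[i] > threshold * baseline:
            flagged.append(i)
    return flagged
```

Real systems layer far more sophistication on top (seasonality, reporting lag, geographic clustering), but the core idea of comparing new data against a learned baseline is the same.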

Investing in the future: The human factor and reimagined automation

The potency of AI and the implementation of data-enabled missions hinge on skilled talent. To meet the challenges of tomorrow, agencies need to double down on investing in their people, modernizing their workforce the same way they are modernizing their technology. Reflecting on how the widespread availability of Microsoft Office tools transformed workforce skillsets three decades ago, it’s clear that tools alone do not suffice; adoption and proficiency in their use do.

Today, we find ourselves at a similar juncture with AI and data. It’s not just data scientists and people in technical roles who need to become proficient — it’s everyone. You don’t need to be a data scientist, but you do need to be data fluent. As technology perpetually evolves, the constant that remains is the people behind the machines. Success hinges not just on having the latest technology but on working collaboratively to leverage these tools for better mission outcomes.

Tied to reimagined talent development in the quest for public sector modernization, it will be paramount to transition from manual to automated processes, particularly in data management and emergency response. Reimagining workflows with AI and real-time data can free up agency staff to focus on strategic priorities and empower urgent, data-informed action in crisis situations. Civil agencies must make their data readily available to stakeholders and be equipped with AI tools and proficient personnel to deploy solutions at a moment’s notice when American lives and livelihoods hang in the balance.

A call to action

Agencies need to balance today’s demands with tomorrow’s potential. As we continue to navigate the digital age, the mandate for civil government agencies is clear: Embrace technological advancement, invest in talent, and create and maintain a proactive roadmap for modernization. There’s greater awareness and excitement about civil agencies being able to solve challenges through better use of their data. While agencies are at different points in their digital transformation journeys, the potential to overcome challenges with data is becoming more apparent.

The challenge is that agencies need to deliver on the missions in front of them today with the tools they have, while taking modernization steps to build the road for tomorrow. Only then can we truly safeguard and serve the American public.

Richard Crowe is president of the civil sector at Booz Allen Hamilton, the leading provider of AI services to the U.S. federal government.

FedRAMP’s 2 new efforts target long-time vendor frustrations
Federal News Network | Mon, 15 Jul 2024

The cloud security program launched two programs, an agile delivery pilot and a new technical documentation hub, to accelerate cloud authorizations.

The post FedRAMP’s 2 new efforts target long-time vendor frustrations first appeared on Federal News Network.

The final policy guidance for the cloud security program known as FedRAMP is still a few weeks away from coming out, but the General Services Administration continues its aggressive refresh of the 13-year-old effort.

GSA launched two new initiatives to continue to relieve some of the burdens of getting cloud services authorized under the Federal Risk and Authorization Management Program (FedRAMP) that contractors and agencies have long complained about.

Eric Mill, the executive director of cloud strategy at GSA, said the agile delivery pilot will choose about 20 contractors to test out how to use secure software delivery approaches to accelerate the “significant change request” process, which essentially is an approval gate for cloud providers to add new features or capabilities to a FedRAMP authorized service.


“For a lot of cloud providers, this can go on for a long time and really get in the way of what we know to be secure software deployment and delivery practices, which are agile software delivery practices and the federal government absolutely needs to get the benefits of these companies who we are relying on for them to be able to share as many security improvements and updates as possible, new security tools, new patches, and new technology and new capabilities,” Mill said at the GovForward conference sponsored by Carahsoft. “This is an area where we think we can take a look at the way that FedRAMP has operated to date and refactor the process to be one that is based on continuous assessment. I think that’s a phrase you’re going to hear us use a lot because we think we should be getting both more security and more speed at the same time. When we focus our attention on overseeing the process by which changes are made, rather than repeatedly exercising like a stop and go process on every point in time change that a cloud provider makes.”

The PMO says that, as part of its plan to limit the scope and potential impact of changes to agencies, the new features cloud service providers (CSPs) launch as part of this pilot must be opt-in.

The PMO says any changes to the fundamental underlying architecture, or new security control implementations that apply to the entire offering, will be excluded from the pilot.

For the purposes of this pilot, the PMO says agencies must choose to use the new feature and the new feature cannot change the:

  • System’s fundamental architecture,
  • Types of components used such as databases, operating systems, or containers,
  • Tooling used to configure, secure, and scan those components, and
  • Customer responsibilities for existing features or services.
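Those eligibility rules could be encoded as a simple screening check. The sketch below is illustrative only; the field names are hypothetical and do not reflect FedRAMP's actual change-request schema:

```python
# Hedged sketch encoding the pilot's exclusion criteria as a screening check.
# Field names are hypothetical, not FedRAMP's actual change-request schema.
EXCLUDED_CHANGES = {
    "fundamental_architecture",
    "component_types",            # e.g., databases, operating systems, containers
    "security_tooling",           # tooling to configure, secure, and scan components
    "customer_responsibilities",  # shifts in what customers must do for existing features
}

def eligible_for_pilot(change_request: dict) -> bool:
    """A change qualifies only if it is opt-in and touches none of the excluded areas."""
    if not change_request.get("opt_in", False):
        return False
    return not (set(change_request.get("changes", [])) & EXCLUDED_CHANGES)
```

Expressing gate criteria as code like this is itself an instance of the "continuous assessment" idea Mill describes: the rules become testable and repeatable rather than re-litigated per change.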

The FedRAMP program management office will accept applications from vendors to take part in the pilot through July 26 and then make selections by Aug. 16.

The second new initiative is focused on bringing more automation to the program.

The new technical documentation hub will help CSPs in the development, validation and submission of digital authorization packages, and the developers of governance, risk and compliance (GRC) applications and other tools that produce and consume digital authorization package data.

Mill said one of the goals of FedRAMP more broadly is to reduce the time and costs to industry to get their services authorized.

“We’re still in a universe where we traffic 600-page Word documents and PDFs, which is really not how to run a data oriented organization,” Mill said. “We’ve made, what I think are, very concrete investments in changing that dynamic over time. Some of that is who we have hired and brought on to the program where we have a dedicated Open Security Controls Assessment Language (OSCAL) and data standards lead. We already have more technical expertise and practitioner background in the program now than it has had historically, and we’re going to be increasing that very significantly in the near future. We think that by bolstering our technical capacity, we’re going to be able to move dramatically more effectively, and be a more empathetic and effective partner with the cloud providers and agencies who ultimately have the tools that need to integrate with our program so that we don’t have to have people emailing things around much less emailing things around with passwords and stuff like that.”

The website initially is focused on promoting the use of OSCAL and application programming interfaces (APIs) to share digital authorization packages with the PMO and among agencies.
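To make the shift from Word documents to machine-readable packages concrete, here is a minimal sketch of consuming a (heavily truncated) OSCAL system security plan in JSON. The field names follow NIST's published OSCAL layout, but treat this as an illustration; real packages should be validated against the official OSCAL schemas and FedRAMP's validation rules:

```python
import json

# Minimal, truncated example of an OSCAL SSP as machine-readable data.
# Field names follow NIST's OSCAL JSON layout; validate real packages
# against the official schemas rather than trusting this sketch.
ssp_json = """
{
  "system-security-plan": {
    "uuid": "11111111-2222-3333-4444-555555555555",
    "metadata": {"title": "Example Cloud Service SSP", "version": "1.0"},
    "control-implementation": {
      "implemented-requirements": [
        {"control-id": "ac-2"},
        {"control-id": "au-6"}
      ]
    }
  }
}
"""

ssp = json.loads(ssp_json)["system-security-plan"]
controls = [
    req["control-id"]
    for req in ssp["control-implementation"]["implemented-requirements"]
]
print(ssp["metadata"]["title"], controls)
```

The point is that a GRC tool or an agency reviewer can query implemented controls programmatically, which is impossible with a 600-page PDF.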

The PMO says this technical hub site will help make the FedRAMP authorization process more efficient and accessible by:

  • Providing faster and more frequent documentation updates
  • Expanding the breadth and depth of available technical documentation
  • Improving the user experience for stakeholders who are implementing OSCAL-based FedRAMP packages and tools
  • Establishing a collaborative workflow that supports community contributions for improvements to the documentation

Mill added that this approach isn’t entirely new, because FedRAMP is already doing much of this work in the open, on GitHub and through open source development.

VA proved out automation

FedRAMP has long held out the promise of OSCAL. In May 2022, it received the first security authorization package using the framework. The National Institute of Standards and Technology released version 1.0 of OSCAL in June 2021, and in August 2021 FedRAMP released the first set of validation rules via GitHub.

But both the program and vendors have been slow to catch on.

Amber Pearson, the deputy chief information officer at the Department of Veterans Affairs, said at the event that VA was the first agency to deploy and submit a systems security plan using OSCAL.

“We were able to actually transform our standard 426-page system security plan from a text file to machine readable language. We’re really excited where automation is going to take us to help us speed up how we deploy our authorities to operate (ATOs) in our environment,” Pearson said. “OSCAL will be the first step to explore automation during our assessment and authorization process because it allows us to programmatically look at how do we build in key metrics to do automatic control testing. We’re actually exploring that with our partnerships with NIST and others. How do we actually speed up from a 360-day ATO timeline to receive an ATO to maybe an assessment and authorization (A&A) in a day? That’s some of the efforts that we’re looking at and how do we quickly assess the security controls and most importantly, about automation, it comes into play when you think about continuous monitoring and being able to measure your risk in near real time.”

Drew Mykelgard, the federal deputy chief information officer, said he hopes OSCAL becomes commonplace for any organization building or approving software within the next year.

“At every stage, I hope people are like, OSCAL is saving me from Word flat files, PDFs and it is changing the game from one of the biggest points of friction that we feel. We also know that when the federal government gets behind a standard, we can really push it forward,” he said. “When we have people like Amber and her team pushing this through their governance, risk and compliance (GRC) platforms to intake OSCAL more effectively, running the tests on it and increasing, we can write all the policy we want, but without people like Amber, it doesn’t happen.”

The agile delivery pilot and the automation hub are two of the latest efforts the program management office has released since January.

FedRAMP’s continued modernization march

In June, FedRAMP finalized its emerging technology framework, focusing initially on generative artificial intelligence.

In May, OMB and GSA detailed the new structure of FedRAMP, replacing the joint authorization board with the new FedRAMP Board and creating the technical advisory group.

And two months before that, the FedRAMP PMO outlined 28 near-term initiatives in a new roadmap for the future.

All of this started in October when OMB issued the draft policy update to FedRAMP.

The PMO is still without a permanent director after more than three years.

Mykelgard said GSA is close to hiring a new permanent director of the program management office after receiving more than 400 applications.

GSA’s Mill said these and other upcoming changes are all about making concrete investments to change the dynamic over time. He said speed and security don’t have to be polar opposites.

“If you look at the elements on our roadmap, a very healthy chunk of them are designed to chip away in different ways and different slices of the things that generate that time and cost,” Mill said. “What we really need when commodity services out there exist, which can do core functions by companies and other agencies sometimes, it’s the shared services strategy in another form. We benefit from a security perspective, as federal agencies and the federal government when we’re able to stop doing things ourselves. Now when we’re talking about software, we have different and new and exciting opportunities to start running fewer things that are held together by shoestring apps and use things that are given dedicated maintenance, love and security investment. That, in and of itself, is a huge security boon for the government, which should be able to focus its limited IT and security people on the things that cannot be commoditized, that are just unique and core to their mission. That’s the theory of FedRAMP.”

Social Security Administration will soon transition to Login.gov platform
Federal News Network | Mon, 15 Jul 2024

The Social Security Administration is transitioning all users who made their accounts before 2021 to the Login.gov platform.

The post Social Security Administration will soon transition to Login.gov platform first appeared on Federal News Network.

  • Social Security online users will soon have to create Login.gov accounts if they don't already have them. The website is a one-stop shop for Americans to access government benefits and services online. The Social Security Administration is transitioning all users who made their accounts before 2021 to the Login.gov platform. Any beneficiary who already has a Login.gov account doesn't need to take any action. More than five million customers have already made the switch. SSA says the change aims to simplify the sign-in process, while providing more secure access to online services.
    (Upcoming changes to accessing online services - Social Security Administration)
  • The cloud security program, known as FedRAMP, is now taking on another long-held frustration of its industry and agency customers: the need to automate system security plans. A new technical documentation hub, released last Friday, aims to give users technical documentation, best practices and guidance for creating and managing digital authorization packages using the OSCAL framework. By using this open source language, FedRAMP hopes vendor plans move from 600-page Word or PDF files to files that are machine-readable and promote automation.
  • The Department of Homeland Security wants to reduce duplicative cyber incident reporting requirements. DHS is working on interagency agreements so organizations don’t have to report cyber incidents to multiple agencies. Those agreements fall under DHS’ implementation of the Cyber Incident Reporting for Critical Infrastructure Act. DHS assistant secretary for cyber Iranga Kahangama said, “We are going to be viewing and administering CIRCIA with an eye towards harmonization.” The Cybersecurity and Infrastructure Security Agency published the draft CIRCIA rule in April. CISA expects to finalize the rule next spring.
  • A new Senate bill would target counterfeit electronics in the federal government’s supply chain. Senators John Cornyn and Gary Peters introduced the Securing America’s Federal Equipment in Supply Chains Act or the SAFE Act last week. The legislation would require agencies to only buy electronics from original manufacturers or authorized re-sellers. The lawmakers say gray-market sellers can circumvent trusted supply chains and introduce risks into federal networks. Their bill does include an option to waive the requirements if it’s in the interest of national security.
  • At the Department of Health and Human Services, using shared certificates has cut the agency’s time-to-hire by as much as 50 percent. Along with reducing time-to-hire, HHS human capital leaders say shared certificates create a better experience for candidates. They also help HR staff work more strategically. Sharing certificates is a relatively new recruitment practice in government. It lets federal recruiters expedite some of the early steps of the hiring process by sharing applications across different offices that are hiring for the same position. Over time, HHS has increasingly relied on shared certificates. In the last four years, HHS hired nearly 12,000 employees using that strategy.
  • A new study finds significant increases in the diagnosis of chronic pain among female service members exposed to combat. Military wives also show higher odds of developing chronic pain when their spouses are deployed. Researchers looked into military health records of female patients suffering from chronic pain from 2006 to 2020. Researchers intended for the military wives to serve as a control group. But the research showed that military wives are at a much higher risk of developing chronic pain as well.
  • The Department of the Navy reported the greatest increase in suicide deaths in the first quarter of 2024. The news comes after the Navy’s internal survey revealed that the percentage of Sailors reporting “severe or extreme” levels of stress has increased significantly since 2019. The Air Force reported 17 deaths by suicide — up from 13 in the first quarter of 2023. Meanwhile, the Army saw a significant decrease in suicide deaths this year.

Evolving hybrid cloud strategies in modern agencies
Federal News Network | Mon, 01 Jul 2024

How are the CDC and TSA managing cloud adoption to meet their missions?

The post Evolving hybrid cloud strategies in modern agencies first appeared on Federal News Network.

May was the three-year anniversary of President Joe Biden’s cybersecurity executive order.

June, meanwhile, was the five-year anniversary of the Office of Management and Budget’s cloud smart policy.

These two anniversaries mark important mileposts in agency digital transformation journeys.

The latest data from Deltek, a market research firm, found agencies could spend more than $8 billion on cloud services in fiscal 2025. That is up from over $5 billion in 2020.

As agencies spend more on cloud services while continuing to keep some applications and data on-premise, security in this hybrid cloud setup becomes even more important.

Agencies need tools and capabilities to monitor applications and data on-premise and in the cloud. They also need to understand the data to make faster and more accurate decisions.

Even as agencies move applications and data to the cloud and ensure their security, they have to balance those efforts with improving the employee and customer experience.

Joe Lewis, the chief information security officer for the Centers for Disease Control and Prevention in the Department of Health and Human Services, said his agency is prioritizing the modernization of systems and workloads that serve emergency response and public health crises.

“CDC is full steam ahead on cloud migration and modernization. I think we have embraced the notion that we are going to have legacy workflows that have to reside on-premise, which means that we will perpetually live to some degree in some level of hybrid cloud,” Lewis said on the discussion Evolving Hybrid Cloud Strategies in Modern Agencies. “In that space, I feel like we are working to solve long-standing legacy technical debt problems as we modernize workloads and applications and things that historically were built in stovepipes into more enterprise level platforms that enable data sharing and visualization, and more importantly, the ability to make faster decisions around public health. It’s an exciting time. It’s probably some of the most agile work I’ve seen in my nearly 20 years in the federal space.”

At the same time the CDC is trying to modernize legacy technology, Lewis said changing organization culture is an equally important goal.

Moving to a hybrid cloud culture

He said getting employees to embrace new ways of doing business, specifically how technology can help solve more complex problems, is a key piece to the entire modernization effort.

CDC is not alone in facing this challenge. At the Transportation Security Administration, the pace of change isn’t always comfortable.

“At TSA, a real struggle of bringing people up to a certain level of saying, ‘here’s the next thing, here’s the next change,’ and that constant effort of continuous improvement has really been a real struggle of keeping everybody up to date,” said Dan Bane, the branch manager for secure infrastructure and vulnerability management in the Information Assurance and Cybersecurity Division for Information Technology at TSA in the Department of Homeland Security. “When you have large organizations bringing those people along with the IT changes that are happening so rapidly, it’s a real challenge for the organization.”

TSA has been on a modernization journey for several years, starting with infrastructure-as-a-service (IaaS) and most recently transitioning to software-as-a-service (SaaS) for business and mission-critical functions.

“We’ve found that some of the expenses that we ran into with some of the SaaS and then also some of the complexities of the technical debt, we didn’t really have people that were really capable at deploying some of those technologies on a quick scale. Frequently the development teams were getting ahead of our security teams,” Bane said. “Our CIO Yemi Oshinnaiye has really helped us integrate a development secure operations DevSecOps approach. It’s not perfect, but we’re a lot better than we were.”

Bane’s team now works more closely with the development teams, integrating security tools that automate code checks while maintaining speed to production.

“It’s really an area where we are sitting down with an engineer and going through every setting and every activity, and then getting the monitoring capabilities for those different applications running back into our security operations center. It is a huge lift,” he said. “It really becomes an area where we are trying to standardize on a couple of different infrastructure and platforms that we try to build on top of those, instead of this service, that service, this service. Those things have taken a great deal of time, and have really impacted the IT operations’ ability to really deliver the mission capabilities of what we’re trying to do for the organization.”

Reducing tools, complexity

The need to address the culture change as part of the overall modernization journey is common among public and private sector organizations.

But one way to address it is by reducing the number of tools an organization relies on, then bringing them all together through a single pane of glass, said Brian Mikkelsen, the vice president and general manager for U.S. public sector at Datadog.

“Historically, you’ll have a network group, a [security operations center] group, an operations team, a development team and, then probably, all kinds of different interactions between those teams. But each of those teams have historically had their own tools. They’ll use one tool for the network; one tool for infrastructure observability; another for application performance monitoring (APM); and then something that connects perhaps legacy on-premise security and maybe another tool for cloud security,” Mikkelsen said. “A new way of thinking is built from having an end-to-end observability and security platform. One of the primary things we help customers with is tool reduction and bringing teams into a very common understanding of the health and security posture of their infrastructure and cloud architecture.”

He added that breaking down silos across disparate teams and creating a single source of truth gives each team the same data, so they can address challenges as they arise.

Having the single source of truth also makes it easier for agencies to decide which applications can go to the cloud today, which ones will need some work, and which ones need to stay on-premise for the foreseeable future.

“What we’re doing is we’re helping federal agencies visualize and instrument their existing legacy platforms, which inherently allows them to baseline and create a roadmap for what they want to prioritize,” Mikkelsen said. “The first question I would ask is just simply, ‘whatever solutions we’re bringing to market, does this connect the dots?’ What I really mean by that is does it provide for tagging, for correlation and for automation? Or am I creating yet another silo? Or am I breaking down silos and bringing teams together? All of this connects to what we’re really trying to do, is these systems are capabilities that deliver experiences to our citizens, our employees, and so all this revolves around also citizen experience initiatives.”
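A toy sketch of the tagging and correlation Mikkelsen describes (not Datadog's platform; the tool names and event fields below are invented for illustration) shows how grouping events from separate tools by a shared tag produces one view per host:

```python
from collections import defaultdict

# Events as three siloed tools might emit them (hypothetical fields).
network_events = [{"source": "netmon", "host": "web-01", "msg": "packet loss"}]
apm_events = [{"source": "apm", "host": "web-01", "msg": "p99 latency spike"}]
security_events = [{"source": "siem", "host": "db-02", "msg": "failed logins"}]

def correlate_by_tag(*streams, tag="host"):
    """Merge events from every tool into one view, grouped by a common tag."""
    unified = defaultdict(list)
    for stream in streams:
        for event in stream:
            unified[event[tag]].append(event)
    return dict(unified)

# "web-01" now shows the network and application symptoms side by side,
# so network, ops and security teams reason from the same data.
unified_view = correlate_by_tag(network_events, apm_events, security_events)
```

The point of the sketch is the shared tag: once every tool stamps its events with the same keys, correlation across teams is a lookup rather than a meeting.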

Learning objectives:

  • Overarching cloud strategies and where agencies stand today 
  • Approaching security and compliance on-premises
  • What are the meaningful priorities in the next 12-18 months 

The post Evolving hybrid cloud strategies in modern agencies first appeared on Federal News Network.

]]>
With new AI tools available, State Department encourages experimentation https://federalnewsnetwork.com/artificial-intelligence/2024/06/with-new-ai-tools-available-state-department-encourages-experimentation/ https://federalnewsnetwork.com/artificial-intelligence/2024/06/with-new-ai-tools-available-state-department-encourages-experimentation/#respond Fri, 28 Jun 2024 22:20:15 +0000 https://federalnewsnetwork.com/?p=5058420 State wants employees to try out new AI tools like State Chat and North Star, but also share their own use cases to help drive the agency's AI approach.

The post With new AI tools available, State Department encourages experimentation first appeared on Federal News Network.

]]>
The State Department is launching a new artificial intelligence hub and encouraging employees across the globe to experiment with AI technology in ways that help streamline their diplomatic work.

Secretary of State Antony Blinken announced “AI.State” as a “central hub for all things AI” for the department’s 80,000 employees.

“It offers formal and informal training, including videos that are up there to help folks get started,” Blinken said during an event at the State Department today. “It’s a home for all of our internal State Department AI tools, libraries of prompts and use cases. And I would just say, try it out. I’d encourage everyone to test it out, to try it out, to explore it, to try to learn from it. And also lend your own ideas and input because this is something that will continue to be iterative and a work in progress.”

The State Department last fall released an enterprise AI strategy. The strategy prioritizes an “AI-ready workforce.” The agency has also been exploring using generative AI to help employees plan their next career steps.

Blinken said a big motivation for the State Department’s use of AI is improving analysis, while also freeing up its employees to work on high-priority tasks.

“We can automate simple, routine tasks,” Blinken said. “We can summarize and translate research. Something that would take normally days, even weeks, can be done in a matter of seconds.”

Blinken and other State officials at the event today encouraged the workforce to not just experiment with AI, but share use cases to better inform the agency’s approach to the technology.

“If that particular solution isn’t shared, if it just stays with that one person, that one group and that one country or that one place, then you have this reinvention of the wheel that has to go on time and time again,” Blinken said. “Our ability to draw from the experience that all of our teams are going to have using, deploying, experimenting with AI all around the world, but then bringing it back and having these use cases – especially the ones that are producing really interesting new things – come to the top, but then be taken and shared across the enterprise.”

State’s AI ‘North Star’

Earlier this spring, the State Department rolled out a new AI tool called “North Star” that can analyze and summarize news stories in more than 200 countries and in over 100 languages. Matthew Graviss, the State Department’s chief data and AI officer, said the agency’s public diplomacy officers are already making use of the tool.

“The ability to summarize in the media space, and then use that time that you saved to call the reporter, find out a little more context around why they wrote that article, maybe shape the next article,” Graviss said today. “It’s repurposing that time to the higher value asks that we want our experts in diplomacy doing.”

Elizabeth Allen, under secretary for public diplomacy and public affairs, estimated the media monitoring tool could save PD officers 180,000 hours over the next year. “We have a lot of opportunity in the communication space to use AI,” Allen said today.

But she added that State’s public affairs offices also need to ensure that people are ultimately reviewing any outputs from generative AI, particularly if it helps feed prepared remarks made by ambassadors.

“We always have to be making sure that we have human checks, particularly when it comes to public communications,” Allen said.

The State Department also recently released a chatbot, “State Chat.” Graviss said his team can analyze the prompts and tweak the tool accordingly.

Kelly Fletcher, State’s chief information officer, said the department’s cybersecurity specialists are also “red teaming” any new enterprise tools like State Chat.

“We do that with almost all of our platforms and systems,” Fletcher said. “In the case of the newest AI technology, we were testing it . . . we found some stuff. Honestly, these folks managed to do some really cool sneaky things. And they were able to see what some folks’ prompts were, they were able to see information they shouldn’t have been able to see, and we fixed it.”

She said training is mandatory to use any new AI tools. And State’s IT teams are also monitoring tools like State Chat for nefarious activity.

“We can see what prompts people are using, not just to inform how is this technology being used and how is innovation happening in the field, but also we can see if somebody’s up to no good,” Fletcher said. “Whether they’re a person who works at the State Department, or somebody who’s managed to get in and is pretending to be a person who works at the State Department.”

Meanwhile, Uzra Zeya, under secretary for civilian security, democracy and human rights, said her team launched an AI research assistant called “data collection management tool,” DCT, in February 2023. Zeya said the tool will reduce by one-third – 52,000 hours per year – the time her officers spend researching and fact-checking reports.

The DCT capability is now available through AI.State.

“I’m really proud of what we’ve been able to accomplish, and I think this is an example of technology supporting not supplanting our work,” Zeya said.

Blinken said he believes AI will be fully integrated into the State Department’s work within the next 10 years.

“Some of this entails experimentation, some of it entails risk,” Blinken said. “But if we’re not leaning in, we’re going to be left out and left behind.”

The post With new AI tools available, State Department encourages experimentation first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/artificial-intelligence/2024/06/with-new-ai-tools-available-state-department-encourages-experimentation/feed/ 0
DoD study sees ‘big breakthrough’ with using AI for declassification https://federalnewsnetwork.com/artificial-intelligence/2024/06/dod-study-sees-big-breakthrough-with-using-ai-for-declassification/ https://federalnewsnetwork.com/artificial-intelligence/2024/06/dod-study-sees-big-breakthrough-with-using-ai-for-declassification/#respond Mon, 24 Jun 2024 22:49:47 +0000 https://federalnewsnetwork.com/?p=5051780 The DoD study comes as Congress presses the Biden administration for progress on efforts to streamline classification and declassification.

The post DoD study sees ‘big breakthrough’ with using AI for declassification first appeared on Federal News Network.

]]>
A Defense Department research project has seen success in using artificial intelligence and machine learning to manage and declassify records. The project leads say the approach could be used to help agencies manage an explosion in digital records.

The research study, “Modernizing Declassification with Digital Transformation,” is sponsored by the Office of the Under Secretary of Defense for Intelligence and Security. It’s being carried out by the University of Maryland’s Applied Research Laboratory for Intelligence and Security (ARLIS), one of DoD’s University Affiliated Research Centers.

J.D. Smith, chief of the records and declassification division at DoD’s Washington Headquarters Services, said the research project validated a proof of concept that shows AI and machine learning models can use “contextual understanding” to perform records management and declassification functions.

“The big breakthrough here is the mapping of business rules to contextual understanding models,” Smith said during a June 24 Public Interest Declassification Board meeting.

Previously, machine learning models “weren’t quite there” in understanding the context of different types of content, Smith said. He said it’s key for models to understand the distinction between, for example, a Department of Agriculture document that describes a “kiloton of grain,” and a DoD document that uses “kiloton” to describe the yield of nuclear weapons.

“How do you break through that contextual decision making to a computer and train a computer or an algorithm on doing that,” Smith said. “And one of the big breakthroughs that we discovered is you can actually do that now, with the algorithms that exist with natural language processing, named entity recognition, and other models, you can configure them to train on how to make contextual decisions.”
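Smith's grain-versus-weapons example can be mimicked with a deliberately simple sketch: a business rule mapped to surrounding context decides whether a term triggers review. The rule table and decision strings below are invented for illustration; the actual DoD work relies on trained NLP and named entity recognition models, not keyword sets:

```python
# Invented rule table: a sensitive term is flagged only when its
# surrounding context matches the rule, not on the term alone.
CONTEXT_RULES = {
    "kiloton": {"flag_if_near": {"nuclear", "warhead", "yield"}},
}

def review_sentence(sentence):
    """Apply business rules in context, deciding release vs. further review."""
    words = set(sentence.lower().replace(",", " ").split())
    for term, rule in CONTEXT_RULES.items():
        if term in words and words & rule["flag_if_near"]:
            return "refer for classification review"
    return "release"

review_sentence("The silo held a kiloton of grain")          # release
review_sentence("A 20 kiloton nuclear yield was estimated")  # refer for classification review
```

A real system replaces the keyword sets with model-derived context, but the mapping of a business rule to a contextual decision is the same shape.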

Lawmakers want updates on declassification

The DoD project comes as Congress presses the Biden administration for progress on implementing the Sensible Classification Act of 2023. The legislation was signed into law as part of last year’s defense authorization bill.

In a June 18 letter to federal Chief Information Officer Clare Martorana, a bipartisan group of senators requested an update on efforts to develop a technology solution to support both classification and declassification.

“This opportunity to adapt our classification and declassification processes will greatly enhance the government’s ability to maintain accountability of our classified documents and records, streamline critical processes important to our national security, and work to reestablish trust and transparency between the United States government and the American people,” the lawmakers wrote.

Lawmakers are seeking answers to long-standing concerns about what one former official called a “tsunami of digitally created classified records.” The Biden administration has also kicked off a National Security Council-led process to reform the classification system.

‘Playbook’ for information review

Meanwhile, DoD’s declassification study will eventually result in a “playbook,” Smith said, for using technologies to support declassification and record management decisions in government. ARLIS is working on a “system architecture,” Smith said, as well as costs and other considerations.

The playbook will also turn into a request for proposals, he added, to help guide industry’s work with agencies on the supporting technologies.

DoD is looking to partner with agencies, including the Energy Department and the National Geospatial-Intelligence Agency, to further advance the project. Smith also said the DoD is looking to augment a State Department project that has used AI to declassify diplomatic cables.

DoD also plans to convene an interagency meeting this summer to discuss cross-government efforts and standardization.

“The principles that we’re going to explore here and show how we unlock technology to kind of navigate this, it’s applicable to any type of information review and release you’re doing,” Smith said. “Foreign disclosure, FOIA, security review . . . any type of information security review that you’re doing to clear anything, it follows these steps. And how do we map technology to each step to really make things efficient from a reviewer standpoint?”

The post DoD study sees ‘big breakthrough’ with using AI for declassification first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/artificial-intelligence/2024/06/dod-study-sees-big-breakthrough-with-using-ai-for-declassification/feed/ 0
This vendor tested its AI solutions on itself https://federalnewsnetwork.com/federal-insights/2024/06/this-vendor-tested-its-ai-solutions-on-itself/ https://federalnewsnetwork.com/federal-insights/2024/06/this-vendor-tested-its-ai-solutions-on-itself/#respond Tue, 18 Jun 2024 14:08:54 +0000 https://federalnewsnetwork.com/?p=5033628 IBM provided its own grounds for testing and developing a set of AI tools. It can help client organizations avoid some of the initial mistakes.

The post This vendor tested its AI solutions on itself first appeared on Federal News Network.

]]>

As its own ‘client zero,’ IBM identified its human resources function for transformation with artificial intelligence back in 2017. Today, the function is fully automated, and IBM has a wealth of insights to share that it hopes can help federal agencies avoid some of the same pitfalls.

IBM took an AI-driven approach to transforming its HR function. For its test bed, the company used itself and came away with valuable lessons learned.

Now IBM can help federal agencies apply those lessons and — hopefully — avoid some of the same mistakes. That’s according to Mike Chon, IBM’s vice president and senior partner of talent transformation for the U.S. federal market.

“IBM has gained the efficiencies, it’s delivered on the employee experience, it has achieved a lot of the automations [and] productivity gains,” Chon said.

He cited statistics that tell the story. IBM employees have had nearly two million HR conversations with a virtual agent. Those have achieved resolution in 94% of the cases, meaning the employee didn’t need to proceed to a conversation with a live person.

Manager productivity

When seeking HR efficiencies, organizations tend to think initially in terms of self-service for employees. But Chon urged IT and HR staffs to think more broadly to include managers too.

“I also want to emphasize manager self-service,” he said. “I think that’s where the additional value can come in.”

It also requires a bit of rewiring of manager habits. Chon said that initially, he, like many experienced managers, was less inclined to invoke a chatbot than to simply call his HR representative with questions.

“I myself did not really adopt that [AI] paradigm right away,” he said. “My muscle memory was to call an HR person. Clock forward to today … I actually tend to go to our AI chatbot more than an HR manager.”

He added that IBM’s managerial uptake of the HR chatbot has reached 96% worldwide, accounting for 93% of transactions.

HR presents a natural entry point for AI because it touches everyone.

“By introducing AI through HR, you’re really having this ability to embed the use of these tools throughout your enterprise,” Chon said. “I think that really starts to get people more comfortable.”

Use case approach

Having chosen the HR function, Chon said, IBM initially tried an overly comprehensive approach.

“When we first started this journey, we tried to boil the ocean. It was this big bang approach,” Chon said.

The company realized almost immediately that the tool wasn’t quite right, and people weren’t embracing it.

Lesson learned?

“Never seek the silver bullet,” Chon said. “It really forced everyone to put the brakes on this process” and rethink their approach.

The rethinking resulted in what Chon called a building block, use-case-by-use-case approach. The team started by identifying specific high-frequency or highly repetitive tasks whose automation would let the team spend less time on routine work and more on strategic, value-added work. Data connected to each task helped with this identification, and ultimately pointed to two use cases: employee time off and proof-of-employment letters. Before AI, employees would ask their HR representative how many vacation days they had left, and it could take days for HR to prepare and send proof-of-employment letters, Chon said. These were among the most repetitive and time-consuming tasks for the function.

“AI gave employees the ability to find out their vacation days in seconds and generate their own employee verification letter from anywhere, anytime. And they get instant satisfaction because it happens right in front of them,” Chon said.

In the employment verification letter use case, AI took the form of robotic process automation, he added.

Moreover, if a particular step to a task doesn’t work, HR and IT could simply turn it off and improve it, without affecting everything else that’s working well.

It’s also important to understand that in a small percentage of cases, employees will need to interact with humans; no AI agent can do everything. Therefore, Chon said, “we always give people the ability to connect to a live agent.” Careful data analysis of what leads to “off-ramps” helps with continuous improvement of the AI tool, he said.
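A minimal sketch of that pattern (all names and intents below are hypothetical, not IBM's actual system) shows automated handling for the two routine use cases, with everything else off-ramped to a live agent and logged for later analysis:

```python
OFFRAMP_LOG = []  # intents the bot couldn't handle, mined for the next use case

# The two routine use cases the article describes, as canned handlers.
AUTOMATED_INTENTS = {
    "vacation_balance": lambda emp: f"You have {emp['vacation_days']} vacation days left.",
    "employment_letter": lambda emp: f"Verification letter generated for {emp['name']}.",
}

def handle_request(intent, employee):
    handler = AUTOMATED_INTENTS.get(intent)
    if handler:
        return handler(employee)
    OFFRAMP_LOG.append(intent)  # data for continuous improvement
    return "Connecting you to a live HR agent..."

employee = {"name": "A. Smith", "vacation_days": 12}
handle_request("vacation_balance", employee)   # answered instantly by the bot
handle_request("relocation_policy", employee)  # off-ramped and logged
```

The off-ramp log is the interesting part: each escalation is a candidate for the next building-block use case.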

Ultimately, Chon said, the HR AI-driven self-service option for employees and managers lets HR professionals become more productive, taking the drudgery out of HR processes, leaving people more time for “tackling things like recruiting and other high value activities like talent development.”

In the end, the key lessons from IBM’s experience center on a use-case-driven approach. AI is adopted successfully through small wins, building blocks and incremental steps. Larger, more strategic and transformational use cases don’t have one clear answer or outcome. The key is finding a use case — a workflow, process or task — that could be accelerated or improved through automation. This also allows for easier scaling to other parts of the agency.

“Now, I would say, seven years later, each time the team launches a new use case, it’s actually getting better and better,” Chon said.

Listen to the full show:

The post This vendor tested its AI solutions on itself first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/this-vendor-tested-its-ai-solutions-on-itself/feed/ 0
NARA to remove analog records as part of new digitization standards https://federalnewsnetwork.com/federal-insights/2024/06/nara-to-remove-analog-records-as-part-of-new-digitization-standards/ https://federalnewsnetwork.com/federal-insights/2024/06/nara-to-remove-analog-records-as-part-of-new-digitization-standards/#respond Tue, 11 Jun 2024 20:56:55 +0000 https://federalnewsnetwork.com/?p=5036457 The National Archives and Records Administration (NARA) is moving away from analog records and will require agencies to transfer them in digital format. The deadline is June 30.

The post NARA to remove analog records as part of new digitization standards first appeared on Federal News Network.

]]>
Federal Insights - Records Management - 06/11/2024

The National Archives and Records Administration is moving away from analog records and now requires agencies to transfer permanent records to it in digital format. NARA’s digitization standards take effect at the end of June 2024.

“The deadline is June 30. In little more than six weeks, there’s going to be a major shift in how NARA accessions records from agencies. Arguably valuable permanent records that are part of our nation’s treasures. We got the biggest set of records covered first. Those standards are very detailed. They are almost like a checklist. And they explained to agencies and the vendors supporting agencies what needs to happen to create that digital image, that version of that permanent record that is coming to NARA. We are not accessioning the paper and the digital image, we are only going to be bringing in the digital image,” said Lisa Haralampus, the director of Federal Records Management Policy and Outreach at NARA, on Federal Insights — Records Management.

NARA recently opened a new digitization center in College Park, Maryland to evolve and provide better access to federal government records and expand its capacity.

“For the last year and a half or so, there was a renovation effort in our archives building at College Park. And we’ve renovated 18,000 square feet and established a modern state-of-the-art digitization center. A mixed use space that colocates our work processes. So archival prep, preparation of records before digitization, metadata capture and then ultimately scanning. We brought the different functions together in one location. We have a fleet of top of the line imaging equipment that ranges from overhead camera setups, flatbed scanners, microfilm and microfiche theatres. And we purchased three new imbl Fusion HD 8300 high-speed [scanners] that will exponentially make more records available online,” said Denise Henderson, director of digitization for the Office of Research Services at NARA.

NARA’s digitization is a multi-part process, with different records requiring different techniques to scan and digitize. Under the digitization standards, agencies must create digital copies of all permanent paper records and printed photographs bound for NARA’s archives.

“We have format standards that we use at the National Archives; their records have been created in so many different formats over time by so many agencies depending on what they’re doing. We will take PST files, we will take EML files, we will take XML files, but we won’t see Lotus Notes on that email list. We need the email to be sent to us in a format that we can maintain. Unless your federal mission is really unique, and you are the standards authority, we try to base our standards on what’s common practice. So when we were developing the digitization standards for permanent records, we went and looked, well, what would we base them on? We at the National Archives, our job is to preserve our nation’s history,” Haralampus said.
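As a hypothetical illustration of that kind of format check, built only from the formats Haralampus names (PST, EML and XML accepted for email; Lotus Notes databases not), an agency could screen files before transfer:

```python
from pathlib import Path

ACCEPTED_EMAIL_FORMATS = {".pst", ".eml", ".xml"}  # formats named in the article
REJECTED_FORMATS = {".nsf"}  # Lotus Notes databases, which NARA won't maintain

def check_transfer(filename):
    """Screen a file's extension before a hypothetical NARA transfer."""
    ext = Path(filename).suffix.lower()
    if ext in ACCEPTED_EMAIL_FORMATS:
        return "accepted"
    if ext in REJECTED_FORMATS:
        return "rejected: convert to a maintainable format first"
    return "review against NARA format standards"

check_transfer("embassy_mail_2023.pst")  # accepted
check_transfer("legacy_archive.nsf")     # rejected: convert first
```

Real transfers follow NARA's published format guidance, which covers far more than extensions; the sketch only shows where an allowlist fits in the workflow.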

NARA also requires digitized permanent records to meet Federal Agencies Digital Guidelines Initiative (FADGI) standards in order to be added in the archives. FADGI guidelines set standards for federal agencies to follow as best practices when processing digital historical, archival and cultural content. This includes maps, documents and prints.

“We are a cultural heritage institution. So we are using the Federal Agency Digitization Guidelines, because those were standards that were created to handle cultural heritage materials. The FADGI standards gives us our basis for the technical component of scanning, including things like what is the allowable error for noise. How do you test and make sure that you’ve got a calibrated workstation, so you know your image is what you produce,” Haralampus told the Federal Drive with Tom Temin. “When we wrote these digitization standards, we had the idea of modern textual records in mind; that’s where we started. Eventually, the FADGI standard that we produced would cover any type of record whether it was onionskin from the 1940s or maps. So our standards cover all types of records.”

Modern textual records (MTRs) are documents created on modern office paper. If records predate 1950, or if a specific MTR has intrinsic value, NARA will accept the original along with the digital record.

“We created a disposition authority structure that has a check in it. An opportunity for NARA and for the agency and actually members of the public as well to weigh in and say yes, those records, we want to take the source record as well as the digitized record. So for us, modern textual records, we’re not anticipating those as having intrinsic value and coming to the National Archives,” Haralampus said.

When it comes to optical character recognition (OCR), NARA is not requiring that as a standard for agencies to perform. Haralampus said that will be standard to look at in the near future, but as of now they can’t find an OCR standard equivalent to the digitization standard.

“Most agencies are not digitizing records just to send them to the National Archives. They’re digitizing records because they need them to perform their mission. And as they’re performing their mission, the output of that is you should digitize to our permanent record standards. Don’t waste the digitization effort happening across the government,” she said.

The post NARA to remove analog records as part of new digitization standards first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/nara-to-remove-analog-records-as-part-of-new-digitization-standards/feed/ 0
Cloud Exchange 2024: Pluralsight’s Drew Firment on keeping pace with cloud professional development needs https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-pluralsights-drew-firment-on-keeping-pace-with-cloud-professional-development-needs/ https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-pluralsights-drew-firment-on-keeping-pace-with-cloud-professional-development-needs/#respond Sun, 09 Jun 2024 16:20:06 +0000 https://federalnewsnetwork.com/?p=5033697 Skill gaps in cloud technologies are holding back agencies, but artificial intelligence promises to help, says Pluralsight's chief cloud strategist.

The post Cloud Exchange 2024: Pluralsight’s Drew Firment on keeping pace with cloud professional development needs first appeared on Federal News Network.

]]>

Modern information technology tools bring amazing possibilities, but only if there are skilled people to realize them. Evidence shows that U.S. companies and government agencies both have too few people versed in cloud and related technologies.

Pluralsight’s recent survey, for example, found that nearly eight in 10 projects end up abandoned because of skill gaps. That’s according to Drew Firment, chief cloud strategist at Pluralsight.

“The top two challenges remain cybersecurity, number one, and the need for a skilled workforce is number two,” Firment said during Federal News Network’s Cloud Exchange 2024.

“We’re really focused on helping these agencies mitigate the impact of the skills gap and really, ultimately, teaching the federal government how to use cloud computing and artificial intelligence so agencies can more effectively achieve their missions.”

In effect, both the cyberthreat situation and skill gaps amount to a single problem, Firment said. He cited a recent Government Accountability Office report pointing out the need for urgent action on critical cybersecurity challenges.

AI to the government’s rescue?

Firment said the widespread and widely planned adoption of AI will help agencies deal with the convergence of growing cyberthreats to data and rising cloud adoption, along with the skills gap that accompanies them.

“We are starting to see emerging technologies like AI and machine learning being deployed to help improve IT capabilities like cybersecurity,” Firment said. “And as cloud environments and data footprints expand to support AI, ultimately this focus on security is going to be even more critical to protect that data.”

AI and the scalable, elastic nature of commercial cloud computing go hand in hand.

“Agencies now have the ability to scale AI using cloud,” Firment said.

Among the specific skills needed in the cloud and AI computing era are two in particular, he said.

“The government is still focused heavily on the general IT skills to maintain legacy systems,” but agencies need to pivot to data engineering and managing cloud services and characteristics, Firment advised.

He also recommended hiring based more on skills than on academic degrees, which would help agencies target and recruit people with skills in cloud architecture and cloud-native AI and encryption services.

That shift has begun but will take time to bring in new hires. Firment also lauded training efforts underway in agencies to expand cloud and cyber skills.

“It is encouraging to see that skills development is now a pretty significant investment for the federal government, and they’re making those investments in their professionals to help their workforce better manage and secure their cloud computing systems.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange 2024 event page.

The post Cloud Exchange 2024: Pluralsight’s Drew Firment on keeping pace with cloud professional development needs first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-pluralsights-drew-firment-on-keeping-pace-with-cloud-professional-development-needs/feed/ 0
ATF begins looking to new cyber strategies as it nears 100% cloud migration https://federalnewsnetwork.com/federal-insights/2024/05/atf-begins-looking-to-new-cyber-strategies-as-it-nears-100-cloud-migration/ https://federalnewsnetwork.com/federal-insights/2024/05/atf-begins-looking-to-new-cyber-strategies-as-it-nears-100-cloud-migration/#respond Tue, 28 May 2024 18:08:48 +0000 https://federalnewsnetwork.com/?p=5017971 Containerization and automation are two of the tools ATF is looking to use to implement zero trust principles as it re-architects its systems.

The post ATF begins looking to new cyber strategies as it nears 100% cloud migration first appeared on Federal News Network.

]]>
Federal Insights — Best Practices in Secure Software Development — 5/28/24

The Bureau of Alcohol, Tobacco, Firearms and Explosives is only a few months away from having 100% of its systems in the cloud. That’s the culmination of almost eight years of effort, said Mason McDaniel, ATF’s chief technology officer. He said that’s been such a large lift because there are no commercial, off-the-shelf products for missions like criminal investigations, firearms dealer regulations or firearm tracing. And because those systems weren’t compatible with the cloud, ATF needed an environment that allowed them to be rebuilt from the ground up.

“We really refocused on building an enterprise, continuous integration, continuous delivery (CI/CD) environment, rebuilding all of our processes around automation, and really focused on building this pipeline that let us rebuild our applications quickly, efficiently, deploy things quickly, and then we use that as the enabler to go through application by application and try to get those rebuilt. And we are just about at the end of that journey,” McDaniel said on Federal Insights — Best Practices in Secure Software Development.

One key part that McDaniel said ATF prioritized was not changing the business processes, in order to minimize retraining. Instead, ATF focused on wrapping modern frameworks and automation technologies around those processes, setting the stage to modernize them as rapidly as possible in the future.

Automating cybersecurity

That also gave ATF the opportunity to embed automated cybersecurity processes throughout the development lifecycle, said ATF Chief Information Security Officer Hillary Carney. That includes penetration testing, endpoint detection and response tools, security information and event management logging tools, and more. That gives developers the feedback they need to address vulnerabilities from test cases through production, as well as lifetime visibility.
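The general pattern of embedding automated security checks into a development lifecycle can be sketched as a gate that every build must pass. This is a minimal illustration only: the individual checks are placeholders, and the article does not name the specific tools ATF uses, so every function and the advisory entry here are invented.

```python
# Minimal sketch of an automated security gate in a CI/CD pipeline.
# Each check is a placeholder: in a real pipeline these would wrap actual
# tools (static analyzers, dependency auditors, SIEM queries), none of
# which the article names.

def run_static_analysis(source_dir):
    """Placeholder scan; returns a list of finding descriptions."""
    return []  # pretend the scan came back clean

def run_dependency_audit(pins):
    """Flag pinned dependencies matching a (made-up) advisory list."""
    vulnerable = {"examplelib==1.0.2"}  # hypothetical advisory entry
    return sorted(set(pins) & vulnerable)

def security_gate(source_dir, pins):
    """Fail the build on any finding, so developers get feedback
    before an artifact moves toward production."""
    findings = run_static_analysis(source_dir) + run_dependency_audit(pins)
    for finding in findings:
        print("finding:", finding)
    return not findings
```

The point of the gate is placement, not sophistication: because it runs on every build, vulnerabilities surface while the developer still has context, rather than after deployment.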

“One of the things that I think cloud really helped us with is that near-real time visibility; it allows us to be so much more agile, not only for meeting the business mission need, but for the security testing portion as well,” Carney said. “And being able to interact with the operations teams and say ‘we monitor on a daily basis through our tools. And we’re seeing this change; the posture has changed, and we need you to get in there, and diagnose why that’s happening.’ So cloud has been essential in order to move our program forward, to be a lot more responsive to both mission and then to cybersecurity.”

“But just like the tools have gotten better, so have the adversaries. That’s really what’s driving this. It’s an arms race. So if we are not on top of it, someone else will find it. They will exploit it,” she added. “I am over the moon with the progress we’ve made and being able to do more near real-time analysis, do more agile testing. However, as we get better, they get better. So there is no rest for the weary.”

That’s why the next thing on ATF’s cybersecurity to-do list is to begin using the Cybersecurity and Infrastructure Security Agency’s software attestation form. Eventually, Carney said, the goal is to adopt software bills of materials, but that’s too much of a culture change all at once. She said, much like ATF has done with its CI/CD program, the intent is to start slow and build the case as they build the program.

Containerization

But in the meantime, ATF is leveraging its new CI/CD capabilities along with a push toward containerization and virtualization to enhance its systems’ resiliency. McDaniel said using automated deployment and containerization limits the configuration creep of patching, because every new instance is automatically deployed from a known-good state. When paired with ATF’s more frequent deployments, that shrinks the window that adversaries have to create a persistent presence in the systems.
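The mechanism behind “every new instance is deployed from a known-good state” can be shown with a fingerprint comparison: hash a canonical serialization of each instance’s configuration and compare it to the approved baseline. The configuration contents below are invented for illustration; they are not ATF’s.

```python
import hashlib
import json

# Sketch of why redeploying from a known-good state limits configuration
# creep: an instance patched in place accumulates drift, while a fresh
# deployment matches the baseline exactly. All config values are invented.

def fingerprint(config):
    """Hash a canonical (sorted-key) serialization of a config dict."""
    canonical = json.dumps(config, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def drifted(baseline, instance):
    """True if an instance no longer matches the known-good baseline."""
    return fingerprint(instance) != fingerprint(baseline)

baseline = {"image": "app:1.4.2", "tls": "1.3", "debug_port": None}

# A long-lived instance patched in place picks up ad hoc changes...
patched_in_place = {"image": "app:1.4.2", "tls": "1.3", "debug_port": 9229}

# ...while a freshly deployed instance starts from the baseline exactly.
fresh_deploy = dict(baseline)
```

Frequent redeployment from the fingerprinted baseline is what shrinks the attacker’s window: any persistence an adversary establishes in a running instance is wiped at the next deployment.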

And as ATF uses this method to re-architect its systems, it’s also implementing zero trust principles like least privilege, and continuous verification of identity and authorization. That’s an ongoing process McDaniel said will help ATF protect its application programming interfaces.
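Least privilege and continuous verification for APIs can be pictured as an authorization check re-run on every call rather than trusted from a session. The roles, permissions and endpoint below are all invented for illustration; a real zero trust deployment would consult an identity provider and a policy engine per request.

```python
from functools import wraps

# Toy least-privilege check, re-evaluated on every API call in the
# spirit of the zero trust principles described above. Roles and the
# permission table are hypothetical.

PERMISSIONS = {"analyst": {"read"}, "admin": {"read", "write"}}

class Forbidden(Exception):
    pass

def require(action):
    """Verify authorization on each call rather than trusting a session."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(caller_role, *args, **kwargs):
            if action not in PERMISSIONS.get(caller_role, set()):
                raise Forbidden(f"{caller_role!r} may not {action}")
            return fn(caller_role, *args, **kwargs)
        return wrapper
    return decorator

@require("write")
def update_record(caller_role, record_id, value):
    """A hypothetical API endpoint gated on the 'write' permission."""
    return {"id": record_id, "value": value}
```

An analyst calling `update_record` is refused even if an earlier read succeeded, which is the “no ambient trust” property the zero trust mandate asks for.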

“Identity is so foundational to our cloud journey as well as the zero trust mandate. We’ve started some work on device. We’ve made inroads in multiple pillars,” Carney said. “What we need to do now, and we’re trying to drive towards, which is difficult in these constrained budget environments, is really getting that integrated plan to move together, to ensure that we’re taking everything into account as we’re planning our future architectural state. So it’s a work in progress.”

Information sharing

All of this has been bolstered by increased information sharing among Justice Department components, both Carney and McDaniel said. Many of ATF’s systems are law-enforcement specific; there’s no need for agencies outside DoJ to have them. That limits the applicability of information sharing in wider venues, like the Chief Information Officers Council. But within DoJ, they’re sharing strategies that they find to be more effective than “the traditional, ‘let’s throw 500 FISMA controls at it’” strategies, Carney said.

“So we’ve been figuring a lot of it out as we go and refining our processes and sharing a number of our lessons learned with some of the other components,” McDaniel said. “And then for those that have been on the same path, we’re certainly taking what we can from them. But there’s definitely active lessons learned sharing going on, between all the components.”

The post ATF begins looking to new cyber strategies as it nears 100% cloud migration first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/05/atf-begins-looking-to-new-cyber-strategies-as-it-nears-100-cloud-migration/feed/ 0
The Marine Corps’ plan to further breakdown data siloes https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/ https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/#respond Fri, 24 May 2024 16:44:13 +0000 https://federalnewsnetwork.com/?p=5014286 Dr. Colin Crosby, the service data officer for the Marine Corps, said the first test of the API connection tool will use “dummy” logistics data.

The post The Marine Corps’ plan to further breakdown data siloes first appeared on Federal News Network.

]]>

The Marine Corps is close to testing out a key piece to its upcoming Fighting Smart concept.

As part of its goal to create an integrated mission and data fabric, the Marines will pilot an application programming interface (API) standard to better connect and share data no matter where it resides.

“Really over the next 12 months, we hope to have the autonomous piece of this API connection implemented in our environment in what we call the common management plane that allows us to execute enterprise data governance where we can then use the capabilities rather than the native capabilities within our environment to develop those data catalogs, to tag data, to track the data from its lineage from creation all the way to sharing and destruction within our environment and outside of our environment,” said Dr. Colin Crosby, the service data officer for the Marine Corps, on Ask the CIO. “We’re working with what we call the functional area managers and their leads on the data that they own because this is all new in how we’re operating. I need them to help me execute this agenda so that we can then create that API connection.”

As at many organizations, mission areas own and manage their own data, but culture, technology and policy can make sharing difficult.

Crosby said the API connection can help overcome many of these challenges.

“Our first marker is to have a working API connection on test data. Once that happens, then we’re going to start accelerating the work that we’re doing,” he said. “We’re using logistics data so what we’re doing is using a dummy data, and we’re going to pull that data into our common management plane, and then from that CMP, we want to push that data to what we call the  online database gateway. Then, by pulling that into the OTG, we can then push it into the Azure Office 365 environment, where we can then use that data using our PowerBI capabilities within our environment.”
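The staged flow Crosby describes, dummy records pulled into a common management plane, pushed through a gateway, then landed where analytics tools can read them, can be sketched as a relay that also carries the lineage tracking he mentions. Every function name, field and lineage tag below is illustrative; the actual Marine Corps systems are not public.

```python
# Toy relay of the staged flow: dummy logistics records ingested into a
# common management plane (CMP), pushed through a gateway, and landed for
# analytics. Lineage is appended at each hop, mirroring the goal of
# tracking data "from creation all the way to sharing." All names and
# fields are invented.

def pull_into_cmp(source_records):
    """Ingest records and start tracking lineage from creation."""
    return [{**r, "lineage": ["source", "cmp"]} for r in source_records]

def push_to_gateway(cmp_records):
    """Relay CMP records to the (hypothetical) database gateway."""
    return [{**r, "lineage": r["lineage"] + ["gateway"]} for r in cmp_records]

def land_for_analytics(gateway_records):
    """Land records where analytics tools can read them."""
    return [{**r, "lineage": r["lineage"] + ["analytics"]} for r in gateway_records]

dummy_data = [{"part": "fuel pump", "qty": 3}]
landed = land_for_analytics(push_to_gateway(pull_into_cmp(dummy_data)))
```

Because each hop only appends to the lineage list, the landed record carries a full audit trail of where it has been, which is what enterprise data governance needs to verify.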

Testing the API before production

Once the API connection proves out, Crosby said the goal is to push data into the Marine Corps’ Bolt platform, which runs on the Advana Jupiter platform.

He said there is a lot of excitement from logistics and other mission areas around the Marine Corps to prove this API connection technology.

“As we get more comfortable moving forward, then we will bring on the next, what we call, coalition of the willing. As of now, we have a line because we have other organizations now that are like, ‘we want to be a part of this,’” Crosby said. “The training and education command is ready to go. So we’re excited about it because now I don’t have to work that hard to get people on board and now I have people knocking on my doors saying they are ready to go.”

Crosby added that before the API connection goes live with each new organization, his team will run similar tests using dummy data. He said building that repeatable process and bringing in some automation capabilities will help decrease the time it takes to turn on the API tools for live data.

Without these new capabilities, Crosby said it takes weeks to pull CSV files, thus delaying the ability of leaders to make decisions.

“With the API, we’re going to near-real time type of pull and push, which is speeding up the decision cycle,” he said. “Then there are opportunities to expand on that by building applications that will aggregate data and then being able to look at data to check the maintenance on equipment, and then it’d be a little bit easier to understand what we need and when. The goal is to shrink that decision cycle a little bit.”

The API connection tool is one piece to the bigger Marine Corps effort to create an integrated mission and data fabric. Crosby said that initiative also relies on the unification of the Marine Corps enterprise network to bring the business side and the tactical side together into one environment.

“The fabric is a framework and approach of our environment today and how we want to connect our environment in an autonomous fashion using APIs, so that we can pull data and we can share data, regardless of the cloud environment that it’s in, regardless of whatever database structure the data resides in,” Crosby said. “It allows us to be flexible. It allows us to scale and to really push data and pull data at a speed that we’ve never done before. What I love about the fabric is it really gets to that decision making. It allows our commanders to make sense and act within real or near real time.”

The post The Marine Corps’ plan to further breakdown data siloes first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/feed/ 0
HHS research arm to spend $50M on ‘revolutionary’ cyber tools https://federalnewsnetwork.com/cybersecurity/2024/05/hhs-research-arm-looks-to-boost-hospital-cyber-defenses-with-50m-project/ https://federalnewsnetwork.com/cybersecurity/2024/05/hhs-research-arm-looks-to-boost-hospital-cyber-defenses-with-50m-project/#respond Tue, 21 May 2024 22:18:44 +0000 https://federalnewsnetwork.com/?p=5010167 The new project comes amid sustained Congressional attention on HHS's role in overseeing healthcare cybersecurity in the wake of the Change Healthcare incident.

The post HHS research arm to spend $50M on ‘revolutionary’ cyber tools first appeared on Federal News Network.

]]>
Amid relentless targeting of the health sector by ransomware attacks, the Department of Health and Human Services research arm says it will invest more than $50 million in advanced healthcare cybersecurity tools.

HHS’s Advanced Research Projects Agency for Health (ARPA-H) on Monday announced a “Universal PatchinG and Remediation for Autonomous DEfense” (UPGRADE) program. The goal is to build tools that help hospitals and healthcare systems more easily find and fix cyber vulnerabilities in their systems.

In a statement, HHS Deputy Secretary Andrea Palm said the new program would help build on the HHS cybersecurity strategy for the healthcare sector.

“We continue to see how interconnected our nation’s health care ecosystem is and how critical it is for our patients and clinical operations to be protected from cyberattacks,” Palm said. “Today’s launch is yet another example of HHS’ continued commitment to improving cyber resiliency across our health care system.”

UPGRADE program manager Andrew Carney said a major challenge is modeling the complexity of the myriad software packages running in any given healthcare facility, a gap that leaves many open to ransomware attacks.

“With UPGRADE, we want to reduce the effort it takes to secure hospital equipment and guarantee that devices are safe and functional so that health care providers can focus on patient care,” Carney said in a statement.

A special notice announcing the new project details how ARPA-H envisions the new program developing a “revolutionary new cybersecurity platform for hospitals and health systems.” The idea is to help hospital IT teams manage the “massive complexity” of many health IT environments.

“UPGRADE envisions a semiautonomous cyber-threat mitigation platform that promotes proactive, scalable, and synchronized security updates, adaptable to any hospital environment, and across a wide array of the most vulnerable equipment classes,” the special notice states.

“This software platform will contain a suite of tools that enable real-time evaluation of potential vulnerabilities, and how corresponding security updates might impact hospital operations,” the notice continues. “This will empower hospital decision makers to deploy security remediations without risking the real-world operational downtime that threatens the continuity of patient care.”

ARPA-H detailed how the program will focus on four distinct technical areas, including creating the vulnerability mitigation software platform; developing “high-fidelity” digital twins of hospital systems; automatically detecting cyber vulnerabilities; and “auto-developing” custom cyber defenses.
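The “automatically detecting cyber vulnerabilities” technical area can be illustrated at its simplest as comparing a device inventory against an advisory feed keyed by software name and fixed-in version. Both datasets below are invented for the sketch; ARPA-H has not published such a feed.

```python
# Toy vulnerability detection over a hospital device inventory: flag any
# device running software older than the advisory's fixed-in version.
# The advisory table and inventory are entirely hypothetical.

ADVISORIES = {"infusion-fw": "2.4.1", "pacs-server": "9.0.3"}  # fixed-in versions

def parse_version(v):
    """Turn a dotted version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def find_vulnerable(inventory):
    """Return devices running software older than the fixed-in version."""
    return [d for d in inventory
            if d["software"] in ADVISORIES
            and parse_version(d["version"]) < parse_version(ADVISORIES[d["software"]])]

inventory = [
    {"device": "pump-12", "software": "infusion-fw", "version": "2.3.0"},
    {"device": "pacs-01", "software": "pacs-server", "version": "9.0.3"},
]
```

The hard parts UPGRADE targets sit on either side of this comparison: building a trustworthy inventory of highly varied equipment, and predicting (via digital twins) whether applying the fix disrupts patient care.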

The research agency said it plans to make multiple awards under the UPGRADE program. It will hold a proposers day on June 20.

ARPA-H’s new project comes amid sustained attention on health sector cybersecurity in the wake of the Change Healthcare ransomware attack. The February cyber incident took down the systems of the major health transactions provider, crippling the operations of hospitals and health systems across the country for weeks.

In addition to investigating the response by UnitedHealth Group, Change Healthcare’s parent company, lawmakers have been probing the response of HHS, which is responsible for overseeing the cybersecurity of the healthcare sector.

“We must also assess the response of the federal government, which plays a critical role in these efforts,” Sen. Mike Crapo (R-Idaho) said during a May 17 Senate Finance Committee hearing on the Change Healthcare breach. “HHS has a responsibility to serve as a central hub for coordination, convening insights from other branches of government and the private sector to deploy timely information about active threats, as well as best practices to deter intrusions and resources should an attack occur.”

HHS officials say they are elevating the role of the Administration for Strategic Preparedness and Response (ASPR) to serve as a hub for the agency’s sector cybersecurity efforts, which span multiple components and offices.

The post HHS research arm to spend $50M on ‘revolutionary’ cyber tools first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/cybersecurity/2024/05/hhs-research-arm-looks-to-boost-hospital-cyber-defenses-with-50m-project/feed/ 0
Navy hired this company to develop a new type of aircraft https://federalnewsnetwork.com/defense-news/2024/05/navy-hired-this-company-to-develop-a-new-type-of-aircraft/ https://federalnewsnetwork.com/defense-news/2024/05/navy-hired-this-company-to-develop-a-new-type-of-aircraft/#respond Mon, 13 May 2024 17:52:47 +0000 https://federalnewsnetwork.com/?p=4999286 The Naval Air Systems Command recently hired a company called Electra to study the development of such an electrically-powered plane.

The post Navy hired this company to develop a new type of aircraft first appeared on Federal News Network.

]]>
New military aircraft designs don’t necessarily require super jet engines or hundreds of billions in development costs. A case in point: the Navy’s bid for a light plane that can take off and land in less than a football field. The Naval Air Systems Command recently hired a company called Electra to study the development of such an electrically powered plane. For more, the Federal Drive with Tom Temin spoke with the founder and CEO of Electra.aero, John Langford.

Interview transcript:

Tom Temin: And just a brief technological description of what your company does. It’s more than just planes that can land in a short space; it’s the propulsion that’s radically different.

John Langford: Exactly. Electra.aero is a U.S. company started about four years ago, whose focus is sustainable aviation. We believe that the whole next generation of aviation, at least commercial aviation, is really all about decarbonization. And we’re working in a part of the market that we think is relatively unaddressed, but with enormous market potential, which is sort of the short haul and regional air mobility market. What Electra is doing is developing a hybrid electric, extreme short takeoff and landing airplane. Think of something with the operational flexibility of a helicopter, but with the cost structure at or below existing fixed wing airplanes. Electra uses a technique called blown lift, which is an idea that has been around for many years, was pioneered by NASA back in the 60s and demonstrated by NASA and the Air Force in the 70s, but which has never yet reached commercial utilization, primarily because the engines that existed at the time were not well suited economically to this idea of blown lift. But electric propulsion, distributed electric propulsion, is really the breakthrough, which, combined with the idea of blown lift, makes this new category of airplanes possible. That’s what Electra is all about.

Tom Temin: Blown lift then makes the wing feel like it’s going faster than it actually is. So the plane goes up even though it’s not going forward as fast as the usual rotation speed.

John Langford: Exactly. The idea of blown lift is you bathe the wing in accelerated flow from many different propellers, and it accelerates the flow over the wing and effectively makes the wing look bigger than it physically is, which is how we eventually get the high lift coefficients. Then the slow flight speeds are what allow you to do the really short takeoffs and landings.

Tom Temin: And have you tested it with a barn door yet?

John Langford: We’ve tested it with a whole range of things, from pencil and paper calculations to computational fluid dynamics to subscale models. And now today, we have a full scale manned demonstrator flying right out at Manassas Regional Airport. And it’s really neat to see how all of the theory actually translates into practice very well.

Tom Temin: And what has the Navy asked you to do? To prototype a plane for its use, or to study the concept? What is the actual contract deliverable here?

John Langford: So Electra’s primary focus is a commercial product aimed at commercial operators. But at the same time, there are a lot of government uses for an airplane that can take off and land very quietly in very small spaces. Our biggest financial backer to date from the government has been the Air Force through their Agility Prime program, which is helping sponsor the development not only of the test program that we’re flying today, but also of a prototype of the nine-seat airplane product. As they’ve seen the Air Force’s interest, both the Army and the Navy have now become interested in how this technology might actually benefit them. And the Navy contract that we announced last week is really the first study of how that might be applied in the marine environment.

Tom Temin: So they need not just the technology, but it sounds like they’re looking for a use case for this type of craft.

John Langford: Absolutely. An airplane that can operate in sort of helicopter-like spaces, but at the very low cost, comparatively, of a fixed wing airplane has a lot of potential uses. Commercially, what we’re trying to do is get in and out of the Wall Street Heliport, which would allow fixed wing airplanes to fly right into Manhattan, which is a little bit of a mind-boggling idea when you think about it. That would enable direct air service from Manhattan to Washington, D.C., on a fixed wing airplane, not on a helicopter. If you’re familiar with what that heliport looks like, it’s a barge in the East River, and that’s where our operating requirement of a space 300 feet by 100 feet comes from. Once you can operate in a space that size, there are all kinds of other places you can go: the tops of parking garages, literally any soccer field. And as you start to look at the marine environment, you start to go, wow, when you have a little wind over the deck, now you’re talking about distances that are even shorter than the 300-foot, or the 150-foot ground roll, that we’re talking about. These airplanes take off and land between 25 and 30 knots, which is down in the range that ships themselves can achieve. And if there’s wind over the deck, either generated by ship motion or by the wind itself, you can get into situations where these airplanes literally can almost take off vertically. There are historical examples of previous STOL airplanes, not blown lift airplanes, that can do essentially a vertical takeoff in the right wind conditions. And that’s really the heart of the study we’re going to be doing for the Navy: What does this really mean? Some of the ideas we’re thinking about: This allows you to add fixed wing aircraft operations off a container ship, off an oil tanker, off anything with a space of 50 to 100 feet. That’s part of the study. Now you have a reliable wind-over-the-deck condition; what does that really mean for the operations of an airplane like this, which only needs 150 feet of ground roll to begin with? How does that really work in practice in the marine environment? That’s the focus of this initial study.

Tom Temin: We’re speaking with John Langford. He’s the founder and CEO of Electra.aero. And what is the status of this propulsion technology? Because pure electric planes have been flown, but they’re kind of like electric motorcycles: lots of fun if you don’t want to go anywhere.

John Langford: Exactly. When you look at conventional jet fuel, and you look at the very best batteries, there’s still a factor of between 50 and 100 in the amount of energy you can contain for a given amount of weight. And in cars, if your battery car weighs twice what your gas car weighs, nobody really notices. The people who have to maintain the roads or the people who sell you the tires, they notice. But the average consumer doesn’t really realize how much heavier their electric cars are. In aviation, weight is everything. Absolutely, the name of the game is how you get this high performance at low weight. And so batteries are not really well suited to aviation today. They may well be as the battery technology progresses over the next 10, 25, 50, 100 years. But today it’s only in very limited cases that batteries buy their way onto an airplane. So what we are focused on is a hybrid solution. Think very much like a Prius, where there are both batteries and, in our case, a small gas turbine engine. Think of it like an auxiliary propulsion unit or something like that. They work together in normal operations for takeoff and landing. Either one can power the airplane in an emergency. So one of the cool things the hybrid does is it gives you lots of redundancy that you don’t normally have on an airplane in this weight category. And then it allows lots of really neat advantages. Essentially what we do is operate the gas turbine at a single fixed operating point, and we run it that way for a really long time.

John Langford: The two big drivers of maintenance cost on jet engines are how many throttle cycles you do and how many times you turn it off and on. Both of those are dramatically reduced in the hybrid scheme, and all of the throttle excursions are taken up by the batteries, which are actually pretty good at changing their loads very quickly. So we think it’s really a nice combination that is going to work well, and not just in our nine-seat airplane. We actually think this technology is very scalable. We’re already talking with NASA about ideas for how this might scale up into airplanes as large as several hundred passenger seats. I think the whole idea of hybrid electric airplanes is actually something we’re going to hear a lot about over the next couple of decades. And we think Electra is really just a pioneer in the technology and in the market space.

Tom Temin: But just to be clear, you do have craft built and flying around with this technology.

John Langford: Absolutely right. The company is four years old. We spent the first two years developing and testing the hybrid electric propulsion system before we even built any kind of airplane. Then we built an airplane; we wrapped the airplane around it. That airplane is called the EL-2 Goldfinch, and it’s flying today out in Manassas. It’s a two-place airplane about the size of a Cessna 172. And it’s being used to validate all of the systems before we build the actual product, which is a nine-passenger version.

Tom Temin: It strikes me you could have the future locomotive at your fingertips also.

John Langford: The electrification of things is going to be a big part of the next industrial revolution. Over the last 20 years, it’s all been how do you put everything on the internet. The next 25 years is going to be how you make everything some version of electric, whether it’s pure battery or hybrid. I’m a big believer in hybrid. These are steps toward a future that may be hydrogen based or something like that, but they are steps that can be taken today with existing technology, and they don’t require a rework of the entire distribution system. So they’re very practical even if they’re only interim steps. And by interim, I mean this lasts several generations, 25 to 50 years, which is still a pretty good product lifecycle.

New military aircraft designs don’t necessarily require super jet engines or hundreds of billions in development costs. A case in point: The Navy’s bid for a light plane that can take off and land in less than the length of a football field. The Naval Air Systems Command recently hired a company called Electra to study the development of such an electrically powered plane. For more, the Federal Drive with Tom Temin spoke with the founder and CEO of Electra.aero, John Langford.

Interview Transcript: 

Tom Temin And just a brief technological description of what your company does. It’s more than just planes that can land in a short space, but it’s the propulsion that’s radically different.

John Langford Exactly. Electra.aero is a US company started about four years ago, whose focus is sustainable aviation. We believe that the whole next generation of aviation, at least commercial aviation, is really all about decarbonization. And we’re working in a part of the market that we think is relatively unaddressed but with enormous market potential, which is sort of the short haul and regional air mobility market. What Electra is doing is developing a hybrid electric, extreme short takeoff and landing airplane. Think of something with the operational flexibility of a helicopter, but with the cost structure at or below existing fixed wing airplanes. Electra uses a technique called blown lift, which is an idea that has been around for many years, was pioneered by NASA back in the 60s and demonstrated by NASA and the Air Force in the 70s, but which has never yet reached commercial utilization, primarily because the engines that existed at the time were not well suited economically to this idea of blown lift. But electric propulsion, distributed electric propulsion, is really the breakthrough, which, combined with the idea of blown lift, makes this new category of airplanes possible. That’s what Electra is all about.

Tom Temin Blown lift then makes the wing feel like it’s going faster than it actually is. So the plane goes up even though it’s not going forward as fast as usual, rotation speed.

John Langford Exactly. The idea of blown lift is you bathe the wing in accelerated flow from many different propellers on there, and it accelerates the flow over the wing and it effectively makes the wing look bigger than it physically is, which is how we eventually get the high lift coefficients and the slow flight speeds. And the slow speeds are what allow you to do the really short takeoffs and landings.
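The effect Langford describes can be sketched with the standard lift equation: stall speed scales with the inverse square root of the maximum lift coefficient, so a blown wing with a much higher effective CL flies much slower. The numbers below are purely illustrative assumptions for a light airplane, not Electra’s actual figures.

```python
import math

def stall_speed_knots(weight_n: float, wing_area_m2: float, cl_max: float,
                      rho: float = 1.225) -> float:
    """Stall speed from the lift equation W = 0.5 * rho * V^2 * S * CL_max."""
    v_ms = math.sqrt(2 * weight_n / (rho * wing_area_m2 * cl_max))
    return v_ms / 0.5144  # convert m/s to knots

# Illustrative numbers only (not Electra's figures):
# a ~3,000 kg airplane with a 30 m^2 wing.
weight = 3000 * 9.81
conventional = stall_speed_knots(weight, 30.0, cl_max=2.0)  # typical unblown wing
blown = stall_speed_knots(weight, 30.0, cl_max=8.0)         # heavily blown wing

print(f"conventional CLmax=2.0: {conventional:.0f} kt")
print(f"blown lift   CLmax=8.0: {blown:.0f} kt")
```

Quadrupling the lift coefficient halves the stall speed, which is how blown lift gets a wing of ordinary size down into the 25-to-30-knot takeoff range Langford mentions later.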

Tom Temin And have you tested it with a barn door yet?

John Langford We’ve tested it with a whole range of things, from pencil and pen calculations to computer fluid dynamics to subscale models. And now today, we have a full scale manned demonstrator flying right out at Manassas Regional Airport. And it’s really neat to see how all of the theory actually translates into practice very well.

Tom Temin And what has the Navy asked you to do to prototype a plane for its use, or to study the concept? What is the actual contract deliverable here?

John Langford So Electra’s primary focus is a commercial product aimed at commercial operators. But at the same time, there’s a lot of government uses for an airplane that can take off and land very quietly in very small spaces. Our biggest financial backer to date from the government has been the Air Force through their Agility Prime program, which is helping sponsor the development, not only of the test program that we’re flying today, but also of a prototype of the nine-seat airplane product. As they’ve seen the Air Force’s interest, both the Army and the Navy have now become interested in how this technology might actually benefit them. And the Navy contract that we announced last week is really the first study of how that might be applied in the marine environment.

Tom Temin So they need not just the technology, but it sounds like they’re looking for a use case for this type of craft.

John Langford Absolutely. An airplane that can operate in sort of helicopter-like spaces, but at the very low cost, comparatively, of a fixed wing airplane, has a lot of potential uses. And commercially, what we’re trying to do is get in and out of the Wall Street Heliport, which would allow fixed wing airplanes to fly right into Manhattan, which is a little bit of a mind boggling idea when you think about it. That would enable direct air service from Manhattan to Washington, D.C., on a fixed wing airplane, not on a helicopter. If you’re familiar with what that heliport looks like, it’s a barge in the East River, and that’s where our operating requirement of a space 300 feet by 100 feet comes from. Once you can operate in a space that size, there’s all kinds of other places you can go: the top of parking garages, literally any soccer field. And as you start to look at the marine environment, you start to go, wow, when you have a little wind over the deck, now you’re talking about distances that are even shorter than the 300-foot or the 150-foot ground roll that we’re talking about. These airplanes take off and land between 25 and 30 knots, which is down in the range that ships can achieve. And if there’s wind over the deck, either generated by ship motion or by the wind itself, you can get into situations where these airplanes literally can almost take off vertically. There are historical examples of previous STOL airplanes, not blown lift airplanes, that could do essentially a vertical takeoff in the right wind conditions. And that’s really the heart of the study we’re going to be doing for the Navy: What does this really mean? Some of the ideas we’re thinking about: this allows you to add fixed wing aircraft operations off a container ship, off an oil tanker, off anything with a space of 50 to 100 feet. That’s part of the study.
Now you have a reliable wind-over-the-deck condition. What does that really mean for the operations of an airplane like this, which only needs 150 feet of ground roll to begin with? How does that really work in practice in the marine environment? That’s the focus of this initial study.
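The wind-over-deck effect Langford describes follows from a first-order approximation: ground roll scales roughly with the square of groundspeed at liftoff, which is liftoff airspeed minus headwind. This is a textbook rule of thumb, not Electra’s performance model, and the 150-foot and 28-knot figures below are taken from the interview as illustrative inputs.

```python
def ground_roll_with_headwind(roll_no_wind_ft: float, v_takeoff_kt: float,
                              v_wind_kt: float) -> float:
    """First-order estimate: ground roll scales with the square of
    groundspeed at liftoff (liftoff airspeed minus headwind)."""
    v_ground = max(v_takeoff_kt - v_wind_kt, 0.0)
    return roll_no_wind_ft * (v_ground / v_takeoff_kt) ** 2

# Illustrative: a 150 ft no-wind ground roll, 28 kt liftoff airspeed.
for wod in (0, 10, 20, 25):
    roll = ground_roll_with_headwind(150, 28, wod)
    print(f"{wod:>2} kt wind over deck -> {roll:5.1f} ft")
```

With 25 knots of wind over the deck, the estimated roll shrinks to a couple of feet, which is why a ship making its own wind can turn a 150-foot airplane into a near-vertical one.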

Tom Temin We’re speaking with John Langford, he’s the founder and CEO of Electra.aero. And what is the status of this propulsion technology? Because pure electric planes have been flown, but they’re kind of like electric motorcycles. Lots of fun if you don’t want to go anywhere.

John Langford Exactly. When you look at conventional jet fuel, and you look at the very best batteries, there’s still a factor of between 50 and 100 in the amount of energy you can contain for a given amount of weight. And in cars, if your battery car weighs twice what your gas car weighs, nobody really notices. The people who have to maintain the roads or the people who sell you the tires, they notice. But the average consumer doesn’t really realize how much heavier their electric cars are. In aviation, weight is everything. Absolutely, the name of the game is how you get this high performance at low weight. And so batteries are not really well suited to aviation today. They may well be as the battery technology progresses over the next 10, 25, 50, 100 years. But today it’s only in very limited cases that batteries buy their way onto an airplane. So what we are focused on is a hybrid solution. Think very much like a Prius, where there are both batteries and, in our case, a small gas turbine engine. Think of it like an auxiliary propulsion unit or something like that. They will work together in normal operations for takeoff and landing. Either one can power the airplane in an emergency. So one of the cool things the hybrid does is it gives you lots of redundancy that you don’t normally have on an airplane in this weight category. And then it allows lots of really neat advantages. Essentially what we do is we operate the gas turbine at a single fixed operating point, and we run it that way for a really long time.
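Langford’s factor of 50 to 100 can be checked with widely cited specific-energy figures: Jet-A carries about 43 MJ/kg, while good lithium-ion batteries today sit around 250 Wh/kg at the cell level and roughly 150 Wh/kg once packaging, cooling and battery management are included. The figures below are ballpark public numbers, not Electra’s data.

```python
# Approximate gravimetric specific energies (widely cited public figures):
jet_fuel_wh_per_kg = 43e6 / 3600      # ~43 MJ/kg of Jet-A -> ~11,900 Wh/kg
battery_cell_wh_per_kg = 250          # good lithium-ion cells today
battery_pack_wh_per_kg = 150          # pack level, with casing/cooling/BMS

print(f"jet fuel vs. cells: {jet_fuel_wh_per_kg / battery_cell_wh_per_kg:.0f}x")
print(f"jet fuel vs. packs: {jet_fuel_wh_per_kg / battery_pack_wh_per_kg:.0f}x")
```

The ratio lands at roughly 50x against bare cells and 80x against installed packs, squarely in the range Langford quotes, which is the core argument for a hybrid rather than a pure battery airplane at this weight class.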

John Langford So the two big drivers of maintenance cost on jet engines are how many throttle cycles you do, and how many times you turn it off and on. So both of those are dramatically reduced in the hybrid. And all of the throttle excursions are taken up by the batteries, which are actually pretty good at changing their loads very quickly. So we think it’s really a nice combination that is going to work well, and not just in our nine seat airplane. We actually think this technology is very scalable. We’re already talking with NASA about ideas about how this might scale up into airplanes as large as several hundred passenger seats. I think the whole idea of hybrid electric airplanes is actually something we’re going to hear a lot about over the next couple of decades. And we think Electra is really just a pioneer in that technology and in the market space.

Tom Temin But just to be clear, you do have craft built and flying around with this technology.

John Langford Absolutely right. We started out — the company is four years old. We spent the first two years developing and testing the hybrid electric propulsion system before we even built any kind of airplane, and then we built an airplane. We wrapped the airplane around it. That airplane is called the EL-2 Goldfinch. And it’s flying today out in Manassas. It’s a two-place airplane about the size of a Cessna 172. And it’s being used to validate all of the systems before we build the actual product, which is a nine passenger version.

Tom Temin It strikes me you could have the future locomotive at your fingertips also.

John Langford The electrification of things is going to be a big part of the next industrial revolution. Over the last 20 years, it’s all been how do you put everything on the internet. The next 25 years is going to be how you make everything some version of electric. Whether it’s pure battery, whether it’s hybrid. I’m a big believer in hybrid. These are steps toward a future that may be hydrogen based or something like that, but they are steps that can be taken today with the existing technology, and they don’t require a rework of the entire distribution system. And so they’re very practical even if they’re only interim steps. And by interim, I mean this means several generations, 25 to 50 years, which is still a pretty good product lifecycle.

The post Navy hired this company to develop a new type of aircraft first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/defense-news/2024/05/navy-hired-this-company-to-develop-a-new-type-of-aircraft/feed/ 0
CX Exchange 2024: Maximus’ MaryAnn Monroe on tech advances in experience capabilities https://federalnewsnetwork.com/federal-insights/2024/04/cx-exchange-2024-maximus-maryann-monroe-on-tech-advances-in-experience-capabilities/ https://federalnewsnetwork.com/federal-insights/2024/04/cx-exchange-2024-maximus-maryann-monroe-on-tech-advances-in-experience-capabilities/#respond Fri, 26 Apr 2024 12:11:33 +0000 https://federalnewsnetwork.com/?p=4958334 Agencies need to apply artificial intelligence and other technologies to improve services in real time based on citizen needs, Maximus’ experience leader says.

The post CX Exchange 2024: Maximus’ MaryAnn Monroe on tech advances in experience capabilities first appeared on Federal News Network.

]]>

When it comes to understanding their customer experience data, agencies are dipping into the cybersecurity lexicon.

More and more organizations want to view customer, employee, structured and unstructured data through a “single pane of glass,” a term typically reserved for pulling together data from a variety of cybersecurity tools.

And like in the cyber world, agencies are starting to see the need to view this total experience through an integrated platform that brings together a variety of data sources.

“For our customers, we want them to have a simple, seamless and secure experience when interacting with a digital service as well as a human-based service,” said MaryAnn Monroe, vice president of total experience solutions and services at Maximus, during the Federal News Network’s 2024 CX Exchange.

“That’s really critical because that really taps into our employee experience — and the tools that we provide at their fingertips as well as our managers and supervisors — to be able to have that view of the operation and also the customer and employee experience in one single place.”

Making sense of data in real time

The spillover of cybersecurity concepts into CX isn’t surprising. Agencies are facing similar challenges and opportunities in both realms given the amount of data they are collecting, the need to understand what’s happening in real time and the impact on the mission.

Monroe said the agencies Maximus works with are looking for the “pulse of the public” and for very specific insights from the public around different topics.

“Having that data at our fingertips and being able to use that and inform the agencies that we work with is very critical because it also helps inform decisions. It also tells us where pivots may need to be made or information needs to be developed — trying to bridge the gap in many cases where information doesn’t exist or maybe doesn’t exist as clearly as it needs to be based on what the public is telling us,” she said.

“We see that across agencies that are really trying to get that information much more clearly and not have to look in 10, 12, 15 different systems to cobble that together to understand the story. We’re getting better and better with our technology platforms and being able to pull that together to be able to tell that story from a single pane of glass.”

Taking advantage of AI, other advancing technologies for CX

Of course to deal with all the data, both structured and unstructured, agencies also are starting to turn to artificial intelligence and advanced data analytics. Agencies can apply these tools to improve operations as well as improve the employee experience.

“In several of our large programs supporting agencies, we’re able to pull in more unstructured customer and employee data about our actions or words spoken or customer intent. These are things that they care about and things that they’re asking about,” Monroe said. “We’re able to leverage AI to pull that information in so that we can actually understand the pulse of the public a little bit better, so that we can tell a better story or see what those impacts are in the program that we need to elevate to our agency stakeholders.”

She added that several Maximus programs use AI, such as agent assist, to understand the topics citizens are asking about and then bring back particular information that employees can use to answer questions more quickly.

“Those days where employees had to flip through a gazillion different pages of information, whether it’s digital or in a notebook, are over. We’ve certainly come a long way. AI is enabling very specific topics to be served right up to the desktop for them to click on and use as a resource to answer questions,” Monroe said.

Not only does this approach help make the employee experience more efficient, but it also allows integration of other forms of omnichannel communication into the interactions, she said.

“For example, if somebody has called for a particular piece of information, health information, for example, we’re able to integrate and say, ‘Would you like us to text that information to you or email it?’ Many times we can text it right to the customer while they’re still on the line and check to see that they’ve received it,” Monroe said.

Not all technology needs to be cutting edge. Many agencies also are turning to more straightforward automation tools like robotic process automation to improve the efficiency of services.

Monroe said no matter the tools agencies use, transparency and security are paramount to create long-standing trust. She warned that AI and similar tools, if not used correctly, could erode trust in the service or organization.

That’s why Monroe recommended agencies start small, with pilots, before expanding use of AI and other CX tools.

“We are all in with improving digital experiences where it makes sense and where people need those services,” she said. “The fact is that we need to understand who we’re serving and be able to provide those services. In many agencies, in-person services are still vitally important — at Social Security, the IRS, the Agriculture Department, the Department of Veterans Affairs. Those are very important services that the American public relies on and definitely have to factor into this equation of how we strike that balance with technology.”

Discover more customer experience tactics and takeaways from Federal News Network’s CX Exchange 2024 now.

The post CX Exchange 2024: Maximus’ MaryAnn Monroe on tech advances in experience capabilities first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/04/cx-exchange-2024-maximus-maryann-monroe-on-tech-advances-in-experience-capabilities/feed/ 0