Data security's integral role in the digital age

Many regulations require that companies working with national security information implement aggressive levels of cybersecurity.

For machine shops, compliance with government standards is an evolving challenge. Defense and aerospace manufacturers must adhere to strict standards and documentation procedures, including the Defense Department's Cybersecurity Maturity Model Certification (CMMC). Failing to meet evolving requirements in a timely manner carries severe consequences, chief among them lost revenue and barriers to growth, because DoD suppliers cannot work with shops that fall out of compliance. Manufacturers in the defense and aerospace industries must meet this challenge head-on and with urgency, anticipating and planning for compliance requirements to position themselves for success now and in the future.

Many regulations require that companies working with national security information implement aggressive cybersecurity standards based on the type and sensitivity of the information. Within the last year, President Biden signed the Fiscal Year 2023 National Defense Authorization Act, which produced a $773 billion funding package. While this represents a lucrative potential future for job shops, it comes with a caveat: They must be able to meet the technological standards that allow them to comply with ever-changing regulations.

For job shops with smaller teams or limited resources, compliance is no small feat. Nevertheless, compliance is not optional. To meet these standards with limited workforce capacity, job shops must look to technology that can automate processes, monitor and protect against cyberattacks, and update processes in real time. With a pen-and-paper or manual approach, manufacturers commit significant time and resources that ultimately hurt the bottom line, at a time when budgets are tight and shops are being asked to do more with less. As certification standards continue to evolve, working with outdated tools will only hinder job shops' progress.

Evolving landscape requires agile solutions

Meeting new CMMC standards is not a question of 'if' but of 'how' and 'how quickly.' The DoD is currently proposing a thorough, adaptable assessment system to verify that defense contractors and subcontractors under the CMMC program have implemented the necessary security measures. This would extend the coverage of current security standards and introduce new security requirements for specific priority programs. To remain compliant and continue supplying the DoD, job shops must enhance their data security ahead of these rollouts.

Additionally, for maximum efficiency, manufacturers should focus on solutions that integrate with their contractual requirements and CMMC implementation strategy. Cloud-based enterprise resource planning (ERP) solutions can help in a variety of ways, including centralized data management and built-in compliance features, as well as scalability and risk management.

Visibility is essential in any security measure, and a centralized data repository provides crucial clarity. Disparate systems cause confusion and a lack of control. Through an ERP solution, sensitive data can be stored and managed in a secure platform where CMMC requirements for data security and access control can be easily met. For Midway Swiss Turn, the progression to an ERP came after a PC and QuickBooks, and before that, typewriters. The shop's lack of data organization called for a solution that could collect all of its data and organize it effectively. The move to automated collection of accounting data, machine availability and material stock helped revenue and employee numbers increase exponentially. The ability to spur growth with safe and secure data will be integral as CMMC standards finalize and evolve.

For job shops operating with smaller teams and tighter margins, achieving compliant data management is too significant a lift to take on alone. Manually creating encryption protocols, configuring access controls, maintaining audit logs and monitoring data protection in-house takes immense time and resources. In short, that's a cost many job shops can't bear. Cloud-based ERP solutions instead offer features like encryption, access controls and audit trails pre-designed to meet CMMC standards, ensuring compliance while protecting the bottom line.
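To make those controls concrete, here is a minimal sketch, in Python, of the kind of access-control check with an audit trail that compliance tooling automates. The roles, resource names and log format are hypothetical, not drawn from CMMC or any particular ERP product.

```python
# Minimal sketch, assuming hypothetical roles and resources: an access check
# that records every attempt, allowed or denied, in an audit log.
import json
import time

ACCESS_POLICY = {"cui_drawings": {"engineer", "quality_manager"}}
AUDIT_LOG = []  # a real system would persist this to tamper-evident storage

def read_resource(user: str, role: str, resource: str) -> bool:
    """Allow or deny access, logging the attempt either way."""
    allowed = role in ACCESS_POLICY.get(resource, set())
    AUDIT_LOG.append(json.dumps({
        "ts": time.time(), "user": user, "role": role,
        "resource": resource, "allowed": allowed,
    }))
    return allowed

read_resource("alice", "engineer", "cui_drawings")  # allowed, logged
read_resource("bob", "sales_rep", "cui_drawings")   # denied, still logged
print(len(AUDIT_LOG))  # 2: denials are audit events too
```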

As job shops look to leverage opportunities in defense and aerospace investment, CMMC readiness also enables growth and scale. Cloud-based ERP systems bring with them the ability to adapt to changing compliance needs and include risk management features to identify and address cybersecurity vulnerabilities, allowing manufacturers to maintain compliance as standards evolve. CMMC-ready ERP solutions provide customers with a framework to meet timely compliance standards, maintain cybersecurity best practices, and build a competitive advantage in the market with the expanded opportunity to work with government contractors. This ultimately saves manufacturers time and money, enabling them to grow their businesses and avoid costly fines and opportunity exclusions. This forward-looking perspective is invaluable to data security as the CMMC deadlines approach.

Investing in the present and future

Manufacturers must adopt adaptable, industry-evolving solutions to remain compliant and position themselves for future success, especially as the new CMMC standards are set to be in place in the first quarter of 2025. Failing to adopt the required technology will ultimately cost more than the technological investment itself. Secure, agile solutions provide the visibility and compliance that the government requires and serve as a strategic step to set manufacturers up for future success.

Matt Heerey, President of Manufacturing, ECI Software Solutions

Contracting officers benefit from a bot in the seat to their right

The Office of Management and Budget and General Services Administration have been fielding a data integration tool to help contracting officers.


The Office of Management and Budget and General Services Administration have been fielding a data integration tool to help contracting officers. Dubbed Co-Pilot, it gathers data from various governmentwide procurement systems and presents buyers with pricing histories, vendor information and other data to inform their decisions. How's the first month been going? The Federal Drive with Tom Temin gets an update from Christine Harada, senior advisor at the Office of Federal Procurement Policy.

Interview Transcript:  

Tom Temin  And it’s been running almost a month now. How has the take-up been so far for Co-Pilot?

Christine Harada  The take-up is pretty good thus far. We rolled it out a couple of weeks ago, as you indicated, and we have received a lot of very positive feedback from contracting officers and acquisition professionals across the federal government. We’ve heard feedback and comments to that effect, and I think it has been an extraordinarily useful tool. And I’d like to think that we also released it at a very good time from a seasonal perspective, if you will; as you know, a lot of federal government contracting does indeed happen in the last quarter of the fiscal year.

Tom Temin  And did you beta test it with a few poor souls ahead of time? Or did it simply launch? And if so, how many people have used it?

Christine Harada  We did beta test it with a few brave souls. And of course, they provided us with some great user feedback. Since launch, we’ve had over 2,500 unique users, and approximately 1,800 of them are using it on a regular basis.

Tom Temin  Wow. And briefly walk us through how it actually works in a daily situation. Say I’m a CO, and I want to know what’s the best pricing, or what the pricing trends are, for product XYZ. What do I do? Do I have a window that I type XYZ into? What do they see?

Christine Harada  Yes, absolutely. It is just as you say. It’s a great tool, and it’s tailored specifically for the contracting officer and acquisition professional user base, if you will. It is available to any government user. And it brings together several key facets of both the contracting officer and program manager user journey: everything from cost estimates to vendor research to quickly identifying governmentwide or agencywide contract vehicles that can be used to meet requirements. It is currently available only for federal use, basically solely for those particular purposes. It’s the product of years of user research. The team saw a need to help COs rapidly procure the goods and services needed for COVID response while still aligning to overall good federal taxpayer stewardship as well as category management principles. And over time, the tool has evolved from a dashboard to a more integrated web application that supports much more robust pricing and contract research, as well as bolstering the workforce’s ability to get these contracts and services executed.

Tom Temin  And are people using it able to, to put a cliche on here, think outside the box? As the government seeks the so-called innovative companies and people that are not doing business with the government but might have great products and services the government could use, does it see anything outside of what’s already in the procurement systems in the area of market research? I think a lot of COs, you know, run into trouble sometimes.

Christine Harada  It does indeed include a find-a-vendor feature. And it’s specifically targeted at those vendors that are registered in SAM. So, you do need to be registered in SAM in order to be searchable for this particular user base.

Tom Temin  Right. And the vehicles are all in there also.

Christine Harada  That is correct. That is certainly a unique benefit, especially for those contracting officers that may not have much experience or exposure to contracting vehicles outside of their agency. The tool enables the contracting officer to match requirements to contracting vehicles, whether with other agencies or other governmentwide vehicles.

Tom Temin  We’re speaking with Christine Harada. She’s senior adviser to the Office of Federal Procurement Policy. And can you narrow it down? Suppose I want to know how just civilian agencies, or just DoD agencies, did with a certain product. Can you put in parameters? Or is all of the governmentwide data always in every search?

Christine Harada  It’s set up so that we’re maximizing the data exposure. And so, it’s based on the common spend categories across the entirety of the federal government. At the moment, the displays do show everything. Once you download the data, you can filter it yourself, but we wanted to make sure that we were broadening the aperture of the data as much as we possibly could, so that contracting officers can see what else is out there.

Tom Temin  But people don’t have to download datasets then normally to use it?

Christine Harada  That’s correct.

Tom Temin  Yeah, because that’s a pain in the neck. You know, you get all these crazy spreadsheets in different formats, and then what do you do? So this does that for you, you might say.

Christine Harada  It might be, it might be a pain for some people. But if you’re a data nerd like me or many other contracting officers, you know, I find that personally fun. But that’s just me.

Tom Temin  Right? Well, we’ll grant you as much fun as you want downloading data. And how did this get developed? Was it an in-house, you know, US Digital Service type of effort? Or did you adapt a commercial product?

Christine Harada  So, this capability was developed very much based on our government user base, fundamentally built from scratch because of the vast nature and number of data systems that we’ve got across the entirety of the federal government.

Tom Temin  Was it built by a contractor? Or was it coded, you know, in house? Just out of curiosity.

Christine Harada  It was built by a contractor, and also beta tested as a Tableau dashboard, because we knew that we wanted to be able to convert it to a web app for performance needs.

Tom Temin  Sure. And the name Co-Pilot. I don’t know about you, but every time I get on a computer to do anything, Microsoft is telling me to use Copilot, something it developed in the artificial intelligence area. But you had the name first.

Christine Harada  Yes. You know, we liked the name Co-Pilot because it ties to the Acquisition Gateway, and it implies that it’s a way to support navigating the overall complex, you know, acquisition process itself.

Tom Temin  All right, and what is your plan for it? I mean, this is not touted as an artificial intelligence product. But it sure seems ripe for AI.

Christine Harada  That’s something that we’re still candidly exploring. You know, we are looking to expand both the features and the scope of this tool, including the data within it. At the moment, it enables the user to search for products, and we do envision enlarging the scope, if you will, to include services. On the radar, we’re also looking to apply emerging technologies, for example, for natural language pairings. So, making it a little bit more user friendly in that regard.

Tom Temin  Got it. And it sounds like it’s automatically updated. Because say a NASA SEWP or a new GSA GWAC is coming into the market. As launched, those agencies would put the data online as they see fit. And therefore, it would in a sense automatically be available to Co-Pilot. Fair way to put it?

Christine Harada  Yes, that’s correct. The data is refreshed on a weekly cadence as we receive it from agencies and the contracts.

Tom Temin  And what have people told you they wish could be improved with it so far?

Christine Harada  We’re trying to get the acquisition professionals used to it, period, and enhance adoption across the entirety of the workforce. Two things: I think agency professionals are saying that they would like to see an increase in the scope of data, specifically around services, which of course we’re working on. And they’re also interested in exploring AI capability itself with the Co-Pilot tool.

Tom Temin  Right? And that’s where you get into possible vendors that aren’t yet in federal contracts or in SAM because artificial intelligence companies are kind of like dandelions on my lawn.

Christine Harada  For our purposes, we’re sticking with SAM registrants. You know, we absolutely do need the vendors to be registered in SAM. With respect to other unregistered potential vendors, if you will, there are other methods that we’re using. So, for example, you may be familiar with the Department of Defense’s Tradewinds marketplace, which, you know, exposes newer potential vendors to the Department of Defense market. So, we’ve got other mechanisms like that. We are first and foremost, though, focused on ensuring that we’re best supporting our acquisition professionals with the existing vendors that are indeed registered in SAM. There are plenty of vendors in there, number one, but also plenty of vendors who would like to expand their offerings and services to the federal government. And so, we view this as a great way for them to gain visibility with those other agencies that they have not historically served up until this point.

Tom Temin  And one final question: What’s your plan to get more than just 2,000 people? There are probably, you know, 40,000 COs, 1102s, just in DoD alone.

Christine Harada  Sure. Of course, it’s actually 40,000 COs and contracting professionals across the entirety of the government, as well as 100,000 program managers. So, we are doing a number of things. Number one, we’re doing monthly demonstrations. Number two, we’re also holding office hours to try to walk folks through the tool itself. We have also convened the frontline forum, and its members have been helping us with evangelizing, if you will, the capabilities of the tool. We’ve also been doing agency roadshows, as well as social media outreach via the Acquisition Gateway, and we’ve been engaging a lot of other groups and communities of practice, in addition to the frontline forum that I mentioned, with external entities like ACT-IAC, for example. And so we definitely view this as a marathon and not a sprint as we ensure that we’re steadily building out the user base. It is my view, I think our view, that the benefits and the power of the tool are amazing just with products alone, and so I do think that natural enthusiasm will enable us to see a steady uptick, not just in visitors who are curious about the tool itself, but also in true adoption.

New strategy, A-123 update to help reduce improper payments

David Lebryk, the fiscal assistant secretary at Treasury, said a new strategy provides tools, best practices and guidance to improve federal payments.


New tools and better data are putting the CFO community in a stronger position to do more to reduce improper payments and fraud in federal programs.

The Joint Financial Management Improvement Program (JFMIP) recognized this opportunity in its new three-year plan that it hopes can spur even more progress to ensure agencies are paying the right amount to the right people in a timely manner.

David Lebryk, the fiscal assistant secretary at the Treasury Department, said the JFMIP three-year strategy outlines three pillars of effort that will give agencies tools, best practices and guidance to do more to prevent fraud and improper payments.

David Lebryk is the fiscal assistant secretary at the Treasury Department.

“It’s focusing on prevention. It’s promoting best practices, and it’s strengthening the partnerships. The Treasury piece that I think is very much important here is focusing on that prevention. What tools can Treasury bring to the payment process that can actually really help reduce and prevent process fraud from happening?” Lebryk said on Ask the CIO. “The second pillar, which is promoting best practices, was about working with agencies. The Office of Management and Budget has done some good work with this, as well as when a new program has stood up. Do you put controls in place up front that help reduce the potential for improper payments? There are really a number of things you can do like doing risk assessment in your program and you can talk about different data that you need from recipients that you can get from them.”

As part of the focus on prevention, Lebryk said his office has launched several programs where Treasury followed many of the steps outlined in the JFMIP strategy to prevent and reduce fraud in large programs.

In helping local communities recover from the 2010 Deepwater Horizon oil spill, Lebryk said, Treasury designed the program under the Resources and Ecosystems Sustainability, Tourist Opportunities, and Revived Economies of the Gulf Coast States Act (RESTORE Act) ahead of implementation to make sure recipients understood the requirements to apply for funding, and built in controls to make sure the money went to the right people.

“We’ve had no fraud in that program that we’re aware of because we’ve really focused on those controls in the design of the program up front. We haven’t had an issue about slowing payment down. It does prove there’s opportunity, both to be quick, but also careful in the issuance of money,” he said. “In the third pillar, which was strengthening partnerships, it comes down to doing more with the states, the inspector general community and other government agencies to really strengthen those partnerships. I think we’re very confident that it’s going to have a real major impact and there’s a real commitment across the different entities to make sure it works.”

Increased focus on improper payments

The JFMIP strategy outlines strategies and objectives for each pillar based on the work by Treasury, OMB, the Government Accountability Office and others.

Both Congress and the Biden administration have increased focus on preventing fraud and improper payments as well as recovering lost money due to bad actors. The Government Accountability Office estimated that agencies spent $236 billion improperly in fiscal 2023, which was down about $11 billion, as compared to 2022.

On Capitol Hill, lawmakers have introduced at least eight bills since February 2023 focused on improper payments and fraud. A recent one, the Fraud Prevention and Recovery Act from Sen. Gary Peters (D-Mich.), chairman of the Homeland Security and Governmental Affairs Committee, would, among other things, give agency IGs resources to investigate people who committed pandemic fraud and recover the taxpayer dollars, and create a new fund to help agencies prevent fraud and identity theft through a new early warning system for detecting fraud.

The Justice Department’s COVID-19 Fraud Enforcement Task Force (CFETF), for example, reported in April that it “charged more than 3,500 defendants, seized or forfeited over $1.4 billion in stolen COVID-19 relief funds, and filed more than 400 civil lawsuits resulting in court judgments and settlements” since it launched in 2021.

Another administration priority is the rewrite of Circular A-123 internal controls for overseeing and administering programs. OMB’s last major rewrite was in 2016 when it added risk management to its updated internal control processes.

Making A-123 less compliance-based

Lebryk said one of the goals of the A-123 rewrite is to reduce the compliance requirements and make the circular more usable.

“We want to make it less of a compliance exercise and more of a real actual set of practices that will help agencies. Some agencies have been further along in terms of setting up internal programs to actually adhere to the spirit of A-123 and adhere to the spirit of really trying to reduce the improper payments,” he said. “But again, it’s less so about paperwork and reporting, and more so about how do you make sure you actually make an impact in this area. I think the CFO community can be very helpful in this regard. The CFO community plays a very unique role in that we’re supposed to speak the truth. We have an obligation to raise our hand and say, ‘hey, something isn’t necessarily looking right on the financials.’ We want to make sure that we have integrity and stewardship of government resources, so I think that we can do a better job in a financial community of saying to program agencies, ‘hey, the one way to create problems for you not to be able to meet your program’s mission, is if you do have things like fraud because it means that the right people aren’t getting the money.’”

Treasury already has several tools on new and existing platforms and databases to help agencies move from being reactive to proactive in stopping fraudulent payments. One tool uses machine learning to look for anomalies on paper checks. So far, Treasury has run about 40 million checks through the ML application.

Lebryk said Treasury co-chairs a CFO Council working group that is determining the impediments keeping agencies from using these and other fraud prevention tools.

“They’re also doing some important work about creating a fraud catalog that collects trends and fraud, which I think is also very important. But one of these really important workstreams is for us to say, ‘hey, is there something that Treasury can do to make it easier for you to access these tools?’” he said. “Having looked at the government environment over a number of years, one of the real challenges that you have is asking someone to make a systems change. It is a very lengthy, long process because, quite frankly, oftentimes system changes aren’t funded. They can be difficult. So what we’re really looking at is whether there is opportunity for technology to help in this, in terms of things like interfaces with existing systems, which can make it easier to interact. Are there just organizational issues within the agencies that would be helpful if the agency was organized slightly differently or had the information going in one place versus another place, that would mean that you could take action?”

Lebryk added that the committee will make a series of recommendations that would lead to improvements, with a goal of identifying, by the end of the year, a set of tools that agencies can take more advantage of to prevent fraud and improper payments.

 

Robust data management is key to harnessing the power of emerging technologies

Comprehensive data management is key to unlocking seamless, personalized and secure CX for government agencies.

The recent AI Executive Order aptly states that AI reflects the data upon which it is built. Federal agencies are looking to responsibly implement cutting-edge IT innovations such as artificial intelligence, machine learning and robotic process automation to improve customer experiences, bolster cybersecurity and advance mission outcomes. Accessing real-time, actionable data is vital to achieving these essential objectives.

Comprehensive data management is key to unlocking seamless, personalized and secure CX for government agencies. Real-time data empowers informed, rapid decision-making, which can improve critical, high-impact federal services where time is of the essence, such as in response to a natural disaster. Alarmingly, only 13% of federal agency leaders report having access to real-time data, and 73% feel they must do more to leverage the full value of data across their agency.

While some agencies are making progress in their IT modernization journeys, they continue to struggle when it comes to quickly accessing the right data due to numerous factors, from ineffective IT infrastructure to internal cultural barriers.

Actionable intelligence is paramount. The ultimate goal is to access the right data at the right moment to generate insights at “the speed of relevance,” as leaders at the Defense Department would say. To achieve the speed of relevance required to make real-time, data-driven decisions, agencies can take steps to enable quicker access to data, improve their data hygiene, and secure their data.

How to effectively intake and store troves of data

From a data infrastructure perspective, the best path to modernized, real-time deployment is using hyperautomation and DevSecOps on cloud infrastructure. Many federal agencies have begun this transition from on-premises to cloud environments, but there’s still a long way to go until this transition is complete governmentwide.

Implementing a hybrid, multi-cloud environment offers agencies a secure and cost-effective operating model to propel their data initiatives forward. By embracing standardization and employing cloud-agnostic tools for automation, agencies can enhance visibility across systems and environments while adhering to service-level agreements and ensuring the reliability of data platforms. Once a robust infrastructure is in place to store and analyze data, agencies can turn their attention to data ingestion tools.

Despite many agency IT leaders utilizing data ingestion tools such as data lakes and warehouses, silos persist. Agencies can address this interoperability challenge by prioritizing flexible, scalable and holistic data ingestion tools such as data mesh. Data mesh tools, which foster a decentralized data management architecture to improve accessibility, can enable agency decision-makers to capitalize on the full spectrum of available data, while still accommodating unique agency requirements.

To ensure data is accessible to decision-makers, it’s important that the data ingestion mechanism has as many connectors as possible to all sources of data that an agency identifies. Data streaming and data pipelines can also enable real-time insights and facilitate faster decision-making by mitigating manual processes. Data streaming allows data to be ingested from multiple systems, which can build a single source of trust for analytical systems. Additionally, these practices limit data branching and silos, which can cause issues with data availability, quality and hygiene.
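As a rough illustration of that single source of trust, here is a minimal Python sketch that folds records from multiple source systems into one canonical, last-write-wins store. The system names and record fields are invented for the example.

```python
# Minimal sketch, assuming hypothetical source systems and record fields:
# fold every source stream into one canonical store so analytics reads a
# single source of trust instead of per-system copies.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Record:
    record_id: str     # stable key shared across systems
    source: str        # originating system, kept for lineage
    updated_at: float  # epoch seconds; the newest write wins
    payload: dict

def ingest(streams: Iterable[Iterable[Record]]) -> dict[str, Record]:
    """Merge all source streams into one keyed store (last write wins)."""
    store: dict[str, Record] = {}
    for stream in streams:
        for rec in stream:
            current = store.get(rec.record_id)
            if current is None or rec.updated_at > current.updated_at:
                store[rec.record_id] = rec
    return store

# Two systems reporting the same contract record at different times.
finance = [Record("C-001", "finance", 100.0, {"amount": 5000})]
procurement = [Record("C-001", "procurement", 250.0, {"amount": 5250})]
print(ingest([finance, procurement])["C-001"].payload)  # {'amount': 5250}
```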

Data hygiene and security enable transformative benefits

Data hygiene is imperative, particularly when striving to ethically and accurately utilize data for an autonomous system like AI or ML. A robust data validation framework is necessary to improve data quality. To create this framework, agencies can map their data’s source systems and determine the types of data they expect to yield, but mapping becomes increasingly arduous as databases continue to scale.

One critical success factor is to understand the nature of the data and the necessary validations prior to ingesting the data into source systems. Hygiene can be improved by ingesting the raw data into a data lake and then, during conversion, validating the data’s quality before applying any analytics or crafting insights.
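A minimal sketch of that kind of validation pass, in Python, might look like the following. The source systems and expected schemas are hypothetical placeholders, not any agency’s actual data model.

```python
# Minimal sketch, assuming hypothetical source systems and schemas: validate
# raw data-lake records during conversion, before any analytics runs.
EXPECTED_SCHEMAS = {
    "payments_system":  {"payment_id": str, "amount": float, "paid_on": str},
    "personnel_system": {"employee_id": str, "grade": int},
}

def validate(source: str, record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is clean."""
    schema = EXPECTED_SCHEMAS.get(source)
    if schema is None:
        return [f"unknown source system: {source}"]
    problems = []
    for field, expected_type in schema.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"{field}: expected {expected_type.__name__}")
    return problems

raw = {"payment_id": "P-42", "amount": "not-a-number", "paid_on": "2024-07-01"}
print(validate("payments_system", raw))  # ['amount: expected float']
```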

In addition to data hygiene, data security must remain a top priority across the federal government as agencies move toward real-time data insights. Adopting a hybrid, multi-cloud environment can lead to a stronger security posture because there are data encryption capabilities inherent in enterprise cloud environments.

Agencies may consider using a maturity model to help their teams assess data readiness and how they are progressing in their cybersecurity frameworks. A maturity model lets agencies identify and understand specific security gaps at each level of the model and provides a roadmap to address these gaps. Ultimately, the cybersecurity framework is as essential as data hygiene to ensure agencies can harness data reliably and efficiently.
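To illustrate how such a model surfaces gaps, here is a tiny Python sketch. The levels and capability names are invented for the example and are not drawn from any specific cybersecurity framework.

```python
# Minimal sketch, assuming invented maturity levels and capabilities:
# compare what an agency has against what each level requires.
MATURITY_MODEL = {
    1: {"asset inventory", "basic access control"},
    2: {"encryption at rest", "audit logging"},
    3: {"real-time monitoring", "automated data validation"},
}

def assess(current: set[str]) -> dict[int, set[str]]:
    """Return the missing capabilities (the gaps) at each maturity level."""
    return {level: required - current
            for level, required in sorted(MATURITY_MODEL.items())}

for level, missing in assess({"asset inventory", "audit logging"}).items():
    print(level, sorted(missing) or "complete")  # gaps become the roadmap
```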

When agencies have data management solutions that reduce the friction of navigating siloed government systems and enable faster, more secure collaboration, it enables them to drive innovation. This is especially true for agencies that handle extensive amounts of data. For example, many High Impact Service Providers (HISPs) must manage vast amounts of citizen data to provide critical, public-facing services with speed and scale.

Data is the foundation for modern digital government services. Once data is ingested, stored and secured effectively, the transformational potential of emerging technologies such as AI or RPA can be unlocked. Moreover, with real-time data insights, government decision-makers can use actionable intelligence to improve federal services. It’s essential that agency IT leaders invest in a robust data management strategy and modern data tools to ensure they can make informed decisions and benefit from the power of AI to achieve mission-critical outcomes for the American public.

Joe Jeter is senior vice president of federal technology at Maximus.

Countdown to Compliance: Understanding NARA’s rules for text messaging

Federal agencies have just weeks to prepare for changes in digital record standards; here’s how agencies can help ensure compliance.

The post Countdown to Compliance: Understanding NARA’s rules for text messaging first appeared on Federal News Network.

]]>
This content was written by George Fischer, Senior Vice President, Sales, T-Mobile Business Group.

A pivotal deadline looms large for federal agencies: digital message compliance. Starting June 30, 2024, all federal agencies will be required to archive agency records digitally. That’s right, the end of paper culture is finally here. But for far too long, there’s been a lot of confusion over what exactly digital messages entail and what the expectations are for reporting them, along with a lack of tools to submit data to the National Archives and Records Administration (NARA) in a way that’s easy, secure and proactive. And to further complicate things, NARA broadened the meaning of digital messages in January 2023 to include text messages, so there’s yet another factor to consider. Let’s focus on text messages since that’s the latest addition.

So many important business transactions, official policies and decisions are done via texting, so failing to archive them can make it hard to stay transparent, accountable and in compliance with the law. Once NARA’s deadline hits, non-compliance can lead to information gaps during federal investigations, create PR headaches, and potentially result in substantial fines and penalties.

Understanding NARA’s text message regulations

So, what exactly is NARA tracking? Text messages from federal workers are deemed public records and must be archived. NARA is expecting access to digital messages sent or received by federal employees, including SMS and MMS messages – meaning photos, videos, voice notes and even emojis. Considering these different types of messages plus the fact that they need to be monitored across agency networks, personal devices and different phone operating systems like Android and iOS means there are several layers of complexity to navigate.

NARA also expects agencies to retain metadata associated with these texts, including timestamps, device information, attachments and even emoji reactions. Yes, you read that right – even a simple thumbs-up emoji might serve as evidence in a federal case.
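To picture what retaining that metadata means in practice, here is a minimal Python sketch of the kind of record an archive might keep per message. The fields are inferred from the article, not from NARA’s actual schema.

  # Minimal sketch of a per-message archive record; the fields are
  # inferred from the article and are not NARA's official schema.
  from dataclasses import dataclass, field
  from datetime import datetime, timezone

  @dataclass
  class ArchivedMessage:
      sender: str
      recipient: str
      body: str
      sent_at: datetime
      device_info: str
      attachments: list = field(default_factory=list)
      emoji_reactions: list = field(default_factory=list)

  msg = ArchivedMessage(
      sender="+12025550101",
      recipient="+12025550102",
      body="Approved - proceed with the contract modification.",
      sent_at=datetime(2024, 6, 30, 14, 5, tzinfo=timezone.utc),
      device_info="iOS 17, agency-issued",
      emoji_reactions=["thumbs-up"],  # even a reaction may be a record
  )
  print(msg)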

The NARA guidelines recommend evaluating whether messages need to be archived based on whether they contain (a minimal decision sketch follows the list):

  • Evidence of agency policies, business or mission
  • Information that is exclusively available in electronic messages
  • Official agency information
  • A business need for the information
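Under the assumption that each criterion can be expressed as a boolean flag on a message, the four bullets above reduce to a simple any-of check, as in this illustrative Python sketch; none of this is official policy logic.

  # Minimal sketch turning the four criteria above into a retention
  # check; the flag names and any-of rule are illustrative, not policy.
  CRITERIA = (
      "evidence_of_agency_business",
      "exclusively_electronic",
      "official_agency_information",
      "business_need",
  )

  def must_archive(message_flags: dict) -> bool:
      """Treat a message meeting any one criterion as a federal record."""
      return any(message_flags.get(c, False) for c in CRITERIA)

  print(must_archive({"business_need": True}))            # True
  print(must_archive({"exclusively_electronic": False}))  # False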

It’s clear that federal agencies are facing an increasingly complex and dynamic digital landscape filled with constantly changing expectations. Outdated processes are no match for this complexity, and they’re holding agencies back from staying compliant. The answer? Solutions that do the heavy lifting. To make life easier, agencies should have a platform that automatically captures text messages, images and videos, and is tightly integrated with their wireless provider. It should also leverage the latest security protocols and make it easy to generate reports that are audit-ready.

Streamlining text message archiving for federal agencies

Companies like 3rd Eye Technologies have spent years perfecting a solution to keep data safe for federal organizations, including agencies in charge of the highest levels of national security and intelligence. That’s why T-Mobile teamed up with the mobile solutions provider to make it as easy as possible for federal customers to know that the data they’re archiving is not only easy to manage but safe.

Mystic Messaging Archival is a turn-key solution from 3rd Eye Technologies that specializes in securely capturing and archiving texts, including SMS and MMS message logs, for federal and enterprise customers. Mystic is fully integrated into the T-Mobile network, meaning there is no need for any additional applications or software on the phone, making implementation across the agency simple and swift once the agency purchases the solution from 3rd Eye Technologies or T-Mobile. And because the solution is configured at the network level, it archives every SMS/MMS message in real time and stores the messages securely for reporting, so they do not need to be self-reported unless specified by agency protocols. The messages travel over 5G to a hosted cloud archive, and the data remains owned by the agency.

Mystic’s cloud-based solution is also “FedRAMP Ready” in the FedRAMP marketplace, which means it is ready for Agency Authority to Operate (ATO). Not all archiving solutions have that distinction due to the highly rigorous standards involved, so it’s a major advantage. And when pairing Mystic technology with T-Mobile’s nationwide 5G network and 5G standalone technology, messages are transmitted over a secure channel, enhancing protection against vulnerabilities such as cyber attacks (commonly found in Wi-Fi networks).

Mystic also ensures that SMS/MMS data from any lost, stolen, or damaged mobile device is automatically archived, safeguarding information despite the physical status of the device.

Preparing for NARA compliance

Mystic’s eDiscovery console, the mechanism that actually generates the reports, is designed to streamline the entire process of collecting, storing, managing, securing and reviewing text messages from mobile devices. The centralized reporting console consolidates all data from subscribed agency enterprise mobile devices and is accessible by agency headquarters, allowing for efficient management and oversight of all archived communications. This way agencies can quickly and easily respond to all types of legal requests, investigations or regulatory requirements. And because Mystic and T-Mobile are already tightly integrated through the 5G network, getting set up takes only 10 days or less.
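As a generic illustration of what such a console does under the hood (not Mystic’s actual implementation), the Python sketch below filters an archive by custodian and date range to assemble a responsive set; all data and field names are invented.

  # Generic sketch of an eDiscovery-style pull: filter the archive by
  # custodian and review window. Not any vendor's real implementation.
  from datetime import date

  archive = [
      {"custodian": "jdoe", "sent": date(2024, 5, 2), "body": "Schedule moved."},
      {"custodian": "asmith", "sent": date(2024, 6, 9), "body": "Invoice attached."},
  ]

  def pull_records(custodian: str, start: date, end: date) -> list:
      """Select one custodian's messages inside a review window."""
      return [
          m for m in archive
          if m["custodian"] == custodian and start <= m["sent"] <= end
      ]

  responsive = pull_records("asmith", date(2024, 6, 1), date(2024, 6, 30))
  print(len(responsive), "message(s) responsive to the request")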

Here’s the bottom line: agencies need to move fast. The NARA deadline is close and the right tools and partners will make all the difference in preparing for it. The clock is ticking, but it’s not too late to get ahead of the game with a solution that makes text archiving easy, integrates into your existing processes seamlessly and stays up to date with the latest guidelines so you don’t have to.

The post Countdown to Compliance: Understanding NARA’s rules for text messaging first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/countdown-to-compliance-understanding-naras-rules-for-text-messaging/feed/ 0
Cloud Exchange 2024: Splunk’s Jon ‘JG’ Gines on how to map out your modernization journey https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-splunks-jon-jg-gines-on-how-to-map-out-your-modernization-journey/ https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-splunks-jon-jg-gines-on-how-to-map-out-your-modernization-journey/#respond Wed, 12 Jun 2024 11:17:16 +0000 https://federalnewsnetwork.com/?p=5037416 Agencies need to understand their data and current legacy system architecture before moving applications to the cloud, advises Splunk cloud solutions architect.

The post Cloud Exchange 2024: Splunk’s Jon ‘JG’ Gines on how to map out your modernization journey first appeared on Federal News Network.

]]>

As the journey to the cloud continues, agencies are learning they must have two key ingredients nailed down: One, a roadmap explaining their current and future architectures. Two, an understanding of their data and where it lives.

Over the course of the last decade, it has become clear that successful cloud migrations grew out of these two concepts, said Jon “JG” Gines, a cloud solutions architect at Splunk.

“You have to understand which systems could go to the cloud and the ones that don’t really need to — and have a strong justification for that. Are these snowflake systems or not? Is there a roadmap for that? And then speaking of roadmaps, if you’re going to the cloud, saying it is one thing, getting there is another challenge. You have to make sure you have a very clear roadmap to go to the cloud,” Gines said during Federal News Network’s Cloud Exchange 2024.

Agencies often miss the alignment outlined in the roadmap, which causes them either to slow their IT transformation efforts or to spend more money than necessary on cloud services, he said.

Identify ‘snowflake’ systems while defining your cloud roadmap

Gines said agency planning must account for both legacy systems and current data architectures to determine what can and should go to the cloud, and then what should stay on premise.

Those so-called snowflake systems that an agency has customized over the years and would require more time and effort than it’s worth to move to the cloud still need to be part of that roadmap, Gines said.
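One way to operationalize that triage is a simple scoring pass over the system inventory, as in the Python sketch below; the attributes, weights and threshold are invented for illustration and are not a real migration methodology.

  # Minimal sketch of scoring systems for a cloud roadmap; the
  # attributes, weights and threshold are invented, not a methodology.
  systems = [
      {"name": "payroll", "custom_code": 0.2, "data_sensitivity": 0.3, "interfaces": 4},
      {"name": "mission_planner", "custom_code": 0.9, "data_sensitivity": 0.8, "interfaces": 15},
  ]

  def migration_score(s: dict) -> float:
      """Lower scores suggest easier cloud candidates; higher scores flag
      possible 'snowflakes' that may justify staying on premise."""
      return 0.5 * s["custom_code"] + 0.3 * s["data_sensitivity"] + 0.02 * s["interfaces"]

  for s in sorted(systems, key=migration_score):
      verdict = "cloud candidate" if migration_score(s) < 0.5 else "review as snowflake"
      print(f"{s['name']}: {migration_score(s):.2f} -> {verdict}")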

“Many customers know their architectures pretty well. Even the managers that are not very technical, they at least conceptually understand their architectures. It’s when they start going to the cloud that everything becomes somewhat more abstracted,” he said.

“Concepts such as serverless, load balancing, autoscaling and similar are all terms that are more cloud speak as opposed to terms that you find in an on-premise environment. So oftentimes, managers or decision-makers who traditionally understand on-premise architectures have trouble translating them into the cloud because it’s a lot different.”

Add to that the different cloud flavors — like infrastructure, platform and software as a service — and an organization’s architecture becomes even more of an abstraction, Gines said.

This is why breaking down the complexity of an agency’s systems starts with understanding the data, both its sources and its importance to the mission.

Remember, he said, not all data is of the same operational value.

“Some data is very specific to certain domains, like security observability, and then some data is very specific to actually running missions,” Gines said. “We have one customer that I actually had a call with, and they’ve got a lot of data. I asked him specifically about the data sources that are very specific to containers, Kubernetes, and then he also had other systems that they were just monitoring on premise. We learned that they were monitoring different environments in silos.”

Enriching your data for better use in cloud operations, AI

Gines said creating a “single pane of glass” to monitor and understand an organization’s data, whether in the cloud or on premise, will improve the understanding of the data’s operational value.

“I’m talking about not just taking in-the-raw data but enriching the data,” he said. “We have a lot of third-party integrations at Splunk. We have third-party integrations from AWS, Microsoft and all these big major vendors. It’s not just getting the data, but it’s also enriching the data so that it actually makes sense to the customer and to make sure it’s accurate.”
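A bare-bones version of that enrichment step might look like the Python sketch below, where raw events are joined with context from an asset lookup before analysis; the lookup table and event shape are hypothetical, not Splunk’s actual integrations.

  # Minimal sketch of enriching raw events with context before analysis;
  # the lookup table and event shape are hypothetical.
  ASSET_CONTEXT = {
      "10.1.4.7": {"owner": "finance", "environment": "on-premise"},
      "10.2.9.3": {"owner": "hr", "environment": "cloud"},
  }

  def enrich(event: dict) -> dict:
      """Attach ownership and environment so the event reads as more
      than a bare log line."""
      context = ASSET_CONTEXT.get(event.get("src_ip"), {})
      return {**event, **context}

  raw = {"src_ip": "10.2.9.3", "action": "login_failed"}
  print(enrich(raw))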

When agencies have confidence in their data, then the use cases for AI and machine learning become clearer and easier to implement.

“If we’re talking about artificial intelligence and machine learning, I would recommend that government agencies have a very specific set of use cases so they can also have responsible AI that is specifically aligned with the agency’s mission,” he said.

Gines said applying AI/ML tools in an automated fashion can speed up an organization’s time to investigate or do some rote review or examination more quickly.

“Essentially, what’s happening is through one single application, you’re actually seeing what’s inside of your cloud environment, your on-premise environment, and if you have SaaS environments, you are also seeing the data that comes into there. It all can also be monitored,” he said. “I do find only those agencies that are actually using a lot of data and using security information and event management tools to improve their observability, that’s where I see a lot of the sophistication happening as opposed to agencies where they’re just starting to get to the cloud.”

Discover more articles and videos now on Federal News Network’s Cloud Exchange event page.

The post Cloud Exchange 2024: Splunk’s Jon ‘JG’ Gines on how to map out your modernization journey first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/cloud-computing/2024/06/cloud-exchange-2024-splunks-jon-jg-gines-on-how-to-map-out-your-modernization-journey/feed/ 0
When it comes to AI at Energy, it takes a village https://federalnewsnetwork.com/federal-insights/2024/06/when-it-comes-to-ai-at-energy-it-takes-a-village/ https://federalnewsnetwork.com/federal-insights/2024/06/when-it-comes-to-ai-at-energy-it-takes-a-village/#respond Mon, 10 Jun 2024 14:54:18 +0000 https://federalnewsnetwork.com/?p=5027885 Rob King, the chief data officer at the Energy Department, said a new data strategy and implementation plan will set the tone for using AI in the future.

The post When it comes to AI at Energy, it takes a village first appeared on Federal News Network.

]]>

Federal chief data officers are playing a larger role in how their organizations are adopting and using basic or advanced artificial intelligence (AI).

A recent survey of federal chief data officers by the Data Foundation found over half of the CDOs who responded say their role around AI has significantly changed over the past year, as compared to 2022 when 45% said they had no AI responsibility.

Taking this a step further, with nearly every agency naming a chief AI officer over the past year, the coordination and collaboration between the CDO and these new leaders has emerged as a key factor in the success of any agency AI program.

“We are taking a collaborative and integrated approach to aligning data into artificial intelligence and building synergies between the role of data and data governance, and really being able to meet the spirit of the requirements of the AI executive order, with the ability to interrogate our data ethically and without bias as they are being imported into artificial intelligence models,” said Rob King, the chief data officer at the Energy Department, on the discussion Government Modernization Unleashed: AI Essentials. “We’re really now trying to ensure that we can bake in the appropriate governance management, make sure we have oversight of our AI inventories and start to align the right controls in place from a metadata management and from a training data standpoint, so that we can meet both the letter and the spirit of the AI executive order. We don’t just want to be compliance driven, but ensure that we are doing the right thing to leverage those AI models to their full extent, and make sure that we can accelerate the adoption of them more broadly.”

For that adoption that King talks about to happen more broadly and more quickly, data must be prepared, managed and curated to ensure the AI, or really any technology tool, works well.

CDOs in a unique position

He said AI is just the latest accelerator that has come along that reemphasizes the importance of understanding and protecting an organization’s data.

“How do we use AI to help us look for themes, patterns of usages in our data to advance the classification and tagging of our data from a stewardship standpoint, so that we can understand that whole full cycle? We’re calling things like data-centric AI to ensure that we’re looking at ways to use non-invasive data governance approaches to help meet the mission needs of AI. It’s a great feedback loop,” King said. “We’re using AI to drive the maturity of our processes so that we can advance the mission adoption of AI as well. The CDOs are in a unique position because we live by the tenets of ‘it takes a village.’ It takes us working with policy and process leaders, and now the chief AI officers (CAIOs) and mission stakeholders, bringing us all together to really drive the outcomes of strong data management practices, now aligned to positioning for AI adoption.”

King, who has been the CDO at Energy for almost a year, said policies like the Federal Data Strategy or the Evidence-Based Policymaking Act have created a solid foundation, but the hard work that still must happen will be by CDOs and CAIOs as they put those concepts into action.

One way King started down this data management journey was by developing an enterprise data strategy and “recharging” DoE’s data governance board by ensuring all the right stakeholders with the right subject matter expertise and relevancy are participating.

“We’re on the precipice of completing that strategy. It’s been published in a draft format to our entire data governance board members for final review and edit. We hope to bring that to the finish line in the next few weeks,” he said. “From there, we’re already moving right into a five-year implementation plan, breaking it down by annual increments to promote that strategy, recognizing that our science complex, our weapons complex and our environmental complexes have very different needs.”

Testing AI has begun

The new data strategy will lay out what King called the “North Star” goals for DoE around data management and governance.

He said the strategy details five strategy goals, each with several objectives and related actions.

“We wanted to make sure that everyone could see themselves in the strategy. The implementation plan is going to be much more nuanced. We’re now taking key stakeholders from our data governance group and building a team with appropriate subject matter experts and mission representatives to build out that implementation plan and to account for those major data types,” he said. “The other thing we’re starting to look at in our strategy is [asking] what is the right ontology for data sharing? We should have a conceptual mission architecture that can show where we can accelerate our missions, be it on the weapons side or on the science and research side. Where can we build ontologies that say we can accelerate the mission? Because we’re seeing like functions and like activities that, because of our federated nature at the Department of Energy, we can break down those silos, show where there’s that shared equity. That could be some natural data sharing agreements that we could facilitate and accelerate mission functions or science.”

Even as Energy finalizes its data strategy, its bureaus and labs aren’t waiting to begin testing and piloting AI tools. Energy has several potential and real use cases for AI already under consideration or in the works. King said one example is applying AI to mission-critical priorities like the move to a zero trust architecture and the cyber domain. Another is applying AI to hazards analysis through DoE’s national labs.

King said the CDO and CAIO are identifying leaders and then sharing how they are applying AI to other mission areas.

“I’m trying to partner with them to understand how I can scale and emulate their goodness, both from pure data management standpoint as well as artificial intelligence,” he said. “We have one that the National Nuclear Security Administration is leading, called Project Alexandra, around nuclear nonproliferation. They’re doing a lot of great things. So how do we take that and scale it for its goodness? We are seeing some strategic use cases that are of high importance. The AI executive order says our foundational models need to be published to other government agencies, academia and industry for interrogation. So how do we then start to, with the chief AI officer, say what is our risk assessment? And what is our data quality assessment for being able to publish our foundational models to those stakeholders for that interrogation? How do we start to align our data governance strategy and use cases to some of our AI drivers?”

The post When it comes to AI at Energy, it takes a village first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/06/when-it-comes-to-ai-at-energy-it-takes-a-village/feed/ 0
Gen. Rey reflects on leading Network Cross Functional team https://federalnewsnetwork.com/army/2024/06/gen-rey-reflects-on-leading-network-cross-functional-team/ https://federalnewsnetwork.com/army/2024/06/gen-rey-reflects-on-leading-network-cross-functional-team/#respond Thu, 06 Jun 2024 18:31:37 +0000 https://federalnewsnetwork.com/?p=5030506 Maj. Gen. Jeth Rey focused on four pillars, including agnostic transport and moving the Army toward a data-centric environment, over the last three years.

The post Gen. Rey reflects on leading Network Cross Functional team first appeared on Federal News Network.

]]>

Maj. Gen. Jeth Rey ended his three-year tenure as the director of the Army’s Network Cross Functional team last week. When he started in 2021, Rey laid out a four-pronged vision to move the Army toward a data-centric environment.

Rey, who moved to a new job at the Pentagon as the director of architecture, operations, networks and space at the Office of the Deputy Chief of Staff, G-6, said the Army has made tremendous progress to become a data-centric organization over the last three years.


“The problem we had in the Army, and across DoD, is we didn’t have a data problem; we had a data management problem,” Rey said in an interview at the Army TEMS conference. “Therefore, we tried to find a way to get to data centric using agnostic transport to move the data as freely as possible to where it needs to go, a cloud-enabled asset to catch and move the data, and then, obviously, you needed a layered security architecture. We wanted a multi-level security architecture where we can move the data from one classification to another seamlessly.”

Brig. Gen. Patrick Ellis, the former deputy chief of staff, G-3 for the Army Europe-Africa took over for Rey in early June.

Under the Network Cross Functional team, Rey’s four pillars were:

  • Agnostic transport
  • Moving to a data-centric environment from a network-centric environment
  • Implementing a multi-level security architecture to include a zero trust architecture
  • Ensuring cybersecurity is considered early as part of system development

Rey said he worked closely with Army Program Executive Office Command, Control and Communications Tactical (PEO-C3T) and the Command, Control, Communications, Computers, Cyber, Intelligence, Surveillance and Reconnaissance (C5ISR) Center in the Army Combat Capabilities Development Command to take the vision and make it into a reality.

“My role is setting the vision and then keeping the momentum going forward. I would set a timeframe that I would want to see a part of the project achieved, and then I just continue to drive the momentum going forward,” Rey said. “We are the influencers as the Network Cross Functional team to get to the end state and keep people focused and on track.”

Army’s transport is now multi-threaded

The Army demonstrated its progress in advancing these capabilities over the past few years at Project Convergence and NetModX, one of its major exercises, which is run by the C5ISR Center.

Rey said one way the Army is better off than it was three years ago is how it processes data across multiple infrastructure approaches.

At one time, soldiers could use only a single-threaded approach, such as relying solely on Geostationary Operational Environmental Satellites (GOES).

He said the C5ISR office created automated planning for the primary, alternate, contingency and emergency (PACE) communications plan to create the multi-threaded approach to transport.

“I wanted to see if there was a way to automate PACE so that we could go from 5G to low Earth orbit (LEO) satellite to GOES to medium Earth orbit (MEO) satellites. I think, three years later, we are almost there as an accomplishment when it comes to that part of our pillar,” Rey said.
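Conceptually, automated PACE selection is a failover walk down an ordered list of transports, something like the Python sketch below; the health probe is simulated and the logic is an assumption for illustration, not the Army’s actual implementation.

  # Minimal sketch of automated PACE failover; the transport names come
  # from the article, the health probe is simulated for illustration.
  import random

  random.seed(0)  # make the demo deterministic

  PACE_ORDER = ["5G", "LEO satellite", "GOES", "MEO satellite"]

  def link_is_up(transport: str) -> bool:
      # Stand-in for a real link-health probe.
      return random.random() > 0.4

  def select_transport() -> str:
      """Walk primary -> alternate -> contingency -> emergency and take
      the first healthy path, so failover needs no manual replanning."""
      for transport in PACE_ORDER:
          if link_is_up(transport):
              return transport
      raise RuntimeError("no transport available")

  print("sending over:", select_transport())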

A second pillar where Rey believes the Army has made significant progress is moving to a data-centric environment. He said the advancements in the network architecture are a big part of this change.

“I believe that the way data is being approached today is a little different. I think what we need to think about is the way we create data because today data is stored on your laptop or it’s stored on your phone or it is stored in a data center or it’s stored in the cloud. It’s still really siloed, and from my perspective, we need more of a large data fabric where we can catch and make sense of data by using artificial intelligence and machine learning,” he said. “We need open application programming interfaces (APIs) in order for us to be able to share data. I’d like to get to a point where we’re down to the attribute-based level of data sharing. Until we actually get there, we will continue to have data siloed the way we are today.”
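Attribute-based sharing of the kind Rey describes is often modeled as a policy check over tags on the record and attributes of the requester. Here is a minimal Python sketch under that assumption; the attribute names, levels and release rule are invented for illustration.

  # Minimal sketch of attribute-based data sharing; attribute names,
  # levels and the release rule are invented for illustration.
  def may_share(record_attrs: dict, requester_attrs: dict) -> bool:
      """Release a record only when the requester satisfies every
      attribute the record is tagged with."""
      if record_attrs["classification_level"] > requester_attrs["clearance_level"]:
          return False
      releasable = record_attrs["releasable_to"]
      if releasable and requester_attrs["nation"] not in releasable:
          return False
      return True

  record = {"classification_level": 2, "releasable_to": {"US", "GBR"}}
  print(may_share(record, {"clearance_level": 3, "nation": "GBR"}))  # True
  print(may_share(record, {"clearance_level": 1, "nation": "US"}))   # False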

The Army took a big step in this direction in January, starting to implement its unified data reference architecture (UDRA). The service recently completed version 1.0 of the UDRA while also building out an implementation plan of the framework in partnership with the Army Combat Capabilities Development Command (DEVCOM).

Keep the momentum going

The Army expects UDRA to bring together principles and efforts for data mesh and data fabric. While data mesh involves a decentralized approach where data product ownership is distributed across teams and domains, the data platform will facilitate seamless access and integration of data products from different formats and locations.
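To make the mesh-plus-platform distinction concrete, the Python sketch below has domain teams registering data products they own while a shared catalog makes them discoverable; every name here is hypothetical, not part of UDRA.

  # Minimal sketch of a data-mesh style catalog: domain teams own and
  # register products, a shared platform makes them discoverable.
  # Every name here is hypothetical, not part of UDRA.
  CATALOG = {}

  def publish(domain: str, product: str, schema: list, endpoint: str) -> None:
      """A domain team registers a data product it owns."""
      CATALOG[f"{domain}/{product}"] = {"schema": schema, "endpoint": endpoint}

  def discover(keyword: str) -> list:
      """Any consumer can find products across domains via the platform."""
      return [key for key in CATALOG if keyword in key]

  publish("logistics", "convoy-status", ["unit", "location", "eta"], "s3://logistics/convoy")
  publish("intel", "sensor-tracks", ["track_id", "lat", "lon"], "s3://intel/tracks")
  print(discover("status"))  # ['logistics/convoy-status']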

Rey said the concepts that make the data mesh and data fabric work go back to creating a unified network, especially in the tactical environment.

“There are two separate areas that we’re trying to unify together. In the tactical space is where we believe the data fabric is more important for us today because of all the sensors that are on the battlefield and in order to make sense of the information that’s out there,” he said. “That is the catcher’s mitt that needs to ingest the data, use analytics and then egress data for the commander to make an informed decision across the board. I think we have a lot of momentum right now. We’ve talked about the next generation of command and control systems that’s coming, and that’s going to be an ecosystem that allows us to really have a more robust type of data environment that will move data at echelon.”

Army Chief of Staff Gen. Randy George on May 28 signed off on the Next Generation Command and Control (NGC2) Capability Characteristics (C2 Next).

Rey said creating data in a way that also foresees wanting to share it remains one of the biggest challenges for the Army.

“The only way you can share it is if we decide what those attributes are going to look like, whether I’m with a partner or whether I’m just dealing with a US entity,” he said. “So, attributes are going to be key to how we tag and label the data, and are then able to share it from the onset.”

As for the new director of the Network Cross Functional team Rey said his advice to Ellis was simple: “Don’t allow the momentum to slow down.”

The post Gen. Rey reflects on leading Network Cross Functional team first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/army/2024/06/gen-rey-reflects-on-leading-network-cross-functional-team/feed/ 0
Agile, adaptable, modular: The future of Army C2 https://federalnewsnetwork.com/army/2024/06/agile-adaptable-modular-the-future-of-army-c2/ https://federalnewsnetwork.com/army/2024/06/agile-adaptable-modular-the-future-of-army-c2/#respond Tue, 04 Jun 2024 16:34:37 +0000 https://federalnewsnetwork.com/?p=5026837 The Army’s Next Generation Command and Control (NGC2) Capability Characteristics or C2Next is the roadmap for developing a different kind of command post.

The post Agile, adaptable, modular: The future of Army C2 first appeared on Federal News Network.

]]>

For the Army, the command post of the future will need to be agile, resilient and intuitive.

It will be a big lift not only for the Army, but for the contractors who are building the technology to support it.

This is one of many reasons why the Army Chief of Staff Gen. Randy George on May 28 signed off on the Next Generation Command and Control (NGC2) Capability Characteristics (C2 Next).

The Army released a notice on SAM.gov to say the characteristics of need are available, but vendors have to “apply” to see them as they are not public.

George and other Army senior leaders, speaking at the Army TEMS conference in Philadelphia last week, offered a preview of the characteristics, outlining key concepts and insight into what command and control of the future needs to encompass.

George said with the network being the Army’s top priority, these new characteristics are a key building block.

“I was out at the National Training Center I think it was March for Project Convergence. One of the things that I challenged everybody a year ago, and especially Army Future Command, was I want to be able to be on the network and I want us to be able to operate with tablets, phones, software-defined radios in a very simple architecture. What I saw when I was out there in March is that the technology exists now to do those kinds of things,” George said. “We had a platoon leader talking to a company commander or talking to a battalion commander talking to a brigade commander, and they were talking on tablets. All those big systems that we used to have, the Advanced Field Artillery Tactical Data System (AFATDS) is one of them, can be an app. It can be on that tablet. So rather than having a truck or two trucks and 10 people, you have an application. That’s where we have to go.”

George said the commanders were excited about these capabilities because it speeds the decision process and makes them more lethal.

Army details C2 Next

The Army developed this initial set of C2Next characteristics to support the concepts George talked about: Speed to decision, the lethality of the units, the ability to adapt and be agile based on real-time threats, challenges and needs.

Joe Welch, the director of the Command, Control, Communications, Computers, Cyber, Intelligence, Surveillance, Reconnaissance Center (C5ISR) for the Army Combat Capabilities Development Command, said the characteristics describe not just capabilities to build or have; they give commanders and their staffs the ability to tailor and adapt C2 based on their needs and information requirements. He said the characteristics aren’t even necessarily the nuts and bolts of system capabilities.


Welch outlined several focus areas for C2 Next, starting with a key ingredient: agnostic transport, meaning the data gets to the users no matter the infrastructure, whether cloud, satellite or an on-premise data center.

“[It has to be] robust and resilient. We’ve been making lots of progress in terms of that, not just in the variety of transport paths that we have for our networks to be able to support data transmission, but to do it in an automated way and a highly secure way,” Welch said. “I see this as a continued evolution. In the characteristics of need, we talk specifically about being threat informed in this area. We started from a perspective of, we just need to be able to communicate; we need to be able to get the data where it needs to go in order to accomplish the mission.”

A second area that will be critical, Welch said, is a robust services architecture that is cloud native and based on open systems standards that let commanders easily iterate new capabilities.

“A consistent theme here recently is as-a-service. We’re seeing that in more and more areas. What’s really meant by that is that we don’t want to be fixed on any particular thing. We want to be able to experiment, prototype, move very quickly into deployment, and use something as long as it’s working, and be able to challenge it when there’s something that’s better out there when the need changes or the technology changes,” he said. “That gets into a lot more mechanics than the concepts or the capabilities that we’re describing. But it’s a very fundamental underpinning of where we’re looking to go.”
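Welch’s transport-agnostic, as-a-service themes boil down to a familiar software pattern: hide each transport path behind a common interface so components can be swapped or challenged without touching the callers. The Python sketch below is purely illustrative; every class, function and destination name is a hypothetical stand-in, and nothing here reflects the actual NGC2 design.

```python
# Illustrative sketch of "agnostic transport": callers never care which path
# moves the data. All names below are hypothetical, not from NGC2.
from abc import ABC, abstractmethod


class Transport(ABC):
    """Any path that can move mission data: cloud, satellite, on-prem, etc."""

    @abstractmethod
    def send(self, destination: str, payload: bytes) -> bool:
        """Return True if the payload was delivered."""


class CloudTransport(Transport):
    def send(self, destination: str, payload: bytes) -> bool:
        print(f"cloud -> {destination}: {len(payload)} bytes")
        return True


class SatelliteTransport(Transport):
    def send(self, destination: str, payload: bytes) -> bool:
        print(f"satcom -> {destination}: {len(payload)} bytes")
        return True


def deliver(payload: bytes, destination: str, paths: list[Transport]) -> bool:
    """Try each available transport in order; the caller never knows which one worked."""
    return any(path.send(destination, payload) for path in paths)


# Swapping in a new transport is a one-line change -- the "composability" idea.
deliver(b"fires mission", "battalion-toc", [SatelliteTransport(), CloudTransport()])
```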

Testing C2 characteristics

Welch added that C2 Next is part of a necessary and complete revamp of the way the Army will generate, produce, consume and discover data, a revamp that must happen before machine learning can be applied to that data at all.

He said if the Army wants to be able to do informed and enabled decision making much faster than the adversary, then the characteristics of need will play a huge role.

The Army has been testing many of these concepts over the last 12 to 18 months and improving them along the way. Most recently at Project Convergence, an annual technology and capability demonstration, Dr. Jean Vettel, Next Generation C2 chief scientist for experimentation at the Army Combat Capabilities Development Command’s C5ISR Center, said the Army measured some of the benefits of the C2 Next characteristics.

“In the characteristics of need, you’ll see that we’ll talk a lot about modularity or … this real focus we have on composability. What does that actually mean?” Vettel said.

As an example, Vettel said commanders developed a plan to set up a command post in minutes rather than hours using inexpensive commercial single-board computers called Raspberry Pis.

“Within that they had 16 Raspberry Pis that they put out, where they emulated the electromagnetic signature of the command post as decoys. Whenever we think about adaptability, what is the metric of adaptability that would be successful for Next C2?” she said.

The idea was to protect the command and control technologies from jamming or other cyberattacks. Vettel said this is an example of how the C2 Next characteristics emphasize adaptability.

“If it’s adaptable, that means that in the fight, whenever a peer adversary now has identified that we’re creating our decoys with electromagnetic signatures, then our warfighters need to have access to data that they couldn’t tell us they would need beforehand,” she said. “They have to have the ability to know what data they have available and how do they try to spoof or create a different decoy because they have access to the data because it’s adaptable to what they need to fight the peer adversary.”

She added that this example also shows another key piece of the C2 Next characteristics: building capabilities as modules that can be plugged in, removed and changed as necessary.

A living document to be updated

The C2 Next characteristics are out for review and comment by industry and stakeholders across the Army.

Welch said the intent is to make C2 Next characteristics of need a living document that will be updated every six months or so.

Additionally, Army Futures Command is in the early stages of planning a new contract vehicle to help turn the C2 Next characteristics into technology capabilities. While it’s still early, the Army may use an Other Transaction Authority approach to bring multiple companies into the mix and experiment with different parts of the characteristics.

“I think what you’ll see is the characteristics of need, which may sound very principled and very large overarching statements, I’m expecting that they’re going to get iterated into some greater and greater levels of detail as we continue through Next Generation C2 experimentation,” Welch said. “We’re certainly moving fast and in alignment with the chief’s objective to be moving with speed and urgency. We’re going to be moving in conjunction with our partners at Acquisition, Logistics and Technology (ASA(ALT)) as we look beyond experimentation and prototyping and into delivery of Next Generation C2 capability.”

The post Agile, adaptable, modular: The future of Army C2 first appeared on Federal News Network.

Improving citizen experience with proper data management https://federalnewsnetwork.com/commentary/2024/05/improving-citizen-experience-with-proper-data-management/ https://federalnewsnetwork.com/commentary/2024/05/improving-citizen-experience-with-proper-data-management/#respond Fri, 31 May 2024 19:28:36 +0000 https://federalnewsnetwork.com/?p=5022866 By harnessing data-driven decision-making, agencies can significantly enhance the quality and efficiency of services provided to citizens.

The White House’s recent FY25 budget proposal emphasizes improved citizen services and quality of life and includes initiatives such as lowering the cost of childcare, increasing affordable housing, decreasing the cost of healthcare and more.  

 To accomplish these goals, the budget proposal focuses on utilizing evidence-driven policies and programs, highlighting the need for additional personnel to collect and analyze evidence, such as data, to properly inform agency initiatives.  

 Most agencies currently collect different types of data, but there is variation in the extent to which it is used to inform decision-making processes. The Office of Management and Budget published an evidence-based policymaking guide to encourage and support agencies in making more data-driven decisions. 

 While this is one of several pieces of support that the federal government has offered agencies, a critical piece is missing from the primary discussion – the role of proper data management and how it can impact citizen services and their experiences. 

Data management for CX success

As federal agencies look to leverage data to inform policies, decisions and programs, many are undervaluing data hygiene, failing to recognize the benefits of processes such as data lineage tracking and testing protocols. If data management is done poorly, the government will fail to meet crucial citizen needs.

For example, during an analysis of the Internal Revenue Service’s legacy IT, the Government Accountability Office found the agency lacked regular evaluations of customer experiences and needs during the implementation of OMB’s cloud computing strategy. As a result, the agency has spent over a decade trying to replace the legacy Individual Master File (IMF) system, the authoritative data source for individual tax account data. This lack of responsiveness to CX needs, compounded by data and other challenges, significantly affects citizen services.

To ensure employees understand the true value of data and the benefits it can provide when used correctly, it’s important for agencies to foster a culture of data literacy, or the ability to read, write and communicate data in context. This is a foundational aspect of enhancing government’s data capabilities. 

Data plays a pivotal role in the quality of services provided to citizens. Before it can be used to inform such programs, agencies must ensure their data is organized and accessible according to proper data management protocols. 

Data management is defined as a set of practices, techniques and tools for achieving consistent access to and delivery of data across the spectrum of data subject areas and types in an agency. In the federal government’s case, having access to organized data, regardless of location, gives decision makers insights that enable them to act on relevant statistics and information.

This level of insight helps the government greatly when working to meet the requirements needed to correctly inform citizen programs and bolster citizen services, as the process may include migrating large data sets from legacy systems.  

When agencies successfully adhere to proper data hygiene and management, valuable resources for citizen use are made available, ranging from updated payment systems to public safety information such as crime rate data. Once the data has been properly stored and organized, business intelligence and analytics software tools such as ServiceNow or Tableau can help agencies make informed decisions. 

Impact on citizen services

The government provides a variety of services that citizens rely on daily, including health benefits, food assistance, social security and more. But as the economic landscape changes, the government’s citizen services must also change. 

To help individuals and businesses during the COVID-19 pandemic, Congress allotted $2.6 trillion to support vulnerable populations, public health needs and unemployment assistance. When agencies can access readily available data that has been adequately managed, it is easier to provide the services citizens need in a timely manner. Additionally, by ensuring internal data is ready for use, agencies can provide for all citizens regardless of factors such as race, location or age.

Suppose the government decides to increase the amount of food assistance provided across the country and disperses an equal amount to every state without knowing population density, unemployment rates and other essential factors. In that case, they risk significantly decreasing the level of impact of such an initiative. While a simple example, this showcases the importance of data when making decisions that impact the lives of millions of individuals. 
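To put rough numbers on that example, the sketch below contrasts an equal split with a need-weighted split. The state names and need scores are made up for illustration; the point is that the weighted calculation is only possible when population and unemployment data are well managed.

```python
# Toy illustration: allocating a benefit equally vs. weighting by need.
# The need scores and budget below are hypothetical.
need = {"State A": 1.2, "State B": 0.6, "State C": 3.0}  # made-up need scores
budget = 90_000_000

equal = {s: budget / len(need) for s in need}
weighted = {s: budget * w / sum(need.values()) for s, w in need.items()}

print(equal)     # every state gets $30M regardless of need
print(weighted)  # State C, with 5x State B's need, gets 5x the funds
```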

Given the focus of the White House’s FY25 budget proposal, the federal government will see an increased need for proper data management to improve citizen services. Agencies must return to the foundational aspects of data hygiene to be successful. 

By harnessing the power of data-driven decision-making, adopting innovative technologies and fostering a culture of data literacy, agencies can significantly enhance the quality and efficiency of services provided to citizens. 

This transformation not only meets the evolving needs and expectations of the public but also represents a fundamental commitment to transparency, efficiency and accountability in governance. In this digital age, effective data management is not just a strategic asset but a cornerstone of democratic engagement and public trust. 

Laura Stash is executive vice president of solutions architecture at iTech AG. 

Using data, revolutionary Sammies finalist sees forest for the trees https://federalnewsnetwork.com/technology-main/2024/05/using-data-revolutionary-sammies-finalist-sees-forest-for-the-trees/ https://federalnewsnetwork.com/technology-main/2024/05/using-data-revolutionary-sammies-finalist-sees-forest-for-the-trees/#respond Fri, 31 May 2024 16:44:30 +0000 https://federalnewsnetwork.com/?p=5022679 Robert McGaughey revolutionized the U.S. Forest Service’s ability to visualize aerial surveillance data as useful information.


With too much data, you can lose sight of the forest for the trees. The Federal Drive with Tom Temin spoke to a guest who revolutionized the ability of the U.S. Forest Service to visualize, as useful information, the mass of data from aerial surveillance. For his work, he’s a finalist in this year’s Service to America Medals program: Research Forester Robert McGaughey.

Interview Transcript:

Tom Temin With too much data, you can lose sight of the forest for the trees. Well, my next guest revolutionized the ability of the U.S. Forest Service to visualize, as useful information, the mass of data from aerial surveillance. For his work, he’s a finalist in this year’s Service to America Medals program. Research forester Robert McGaughey joins me now. Mr. McGaughey, good to have you with us.

Robert McGaughey Thank you, Tom. Nice to be here.

Tom Temin And you are a programmer of software code that did something to other software to make it usable. Tell us what you’ve done here.

Robert McGaughey So, technically, I’m trained as a forester, so it’s an interesting combination being a programmer and a forester. But the technology that I work with is called airborne lidar. That’s light detection and ranging. And the basic idea is you have this laser rangefinder in an aircraft that fires out millions of pulses per second, or a million pulses per second. It gets a measurement of every object that that pulse hits and naturally produces a lot of data. So the software that I developed reduces that data down to more usable products. The software has been around for about 20 years and been used across the country within federal agencies, universities especially, and then around the world as well.

Tom Temin Interesting. So the lidar is then surveillance that is not photography. It’s surveillance using this lidar. And what is the output of lidar in its raw form?

Robert McGaughey The true raw form are just range measurements from an aircraft. So the distance to an object that’s hit by that laser pulse, that’s combined with the position and attitude of the aircraft to get an actual XYZ point for every object that’s hit. And as I said, it’s millions of points over a small area, densities in the ten points per square meter and higher.

Tom Temin So, in other words, it’s a way of representing as data a three-dimensional, potentially three-dimensional image of what you’re looking down at with the lidar equipment.

Robert McGaughey Exactly, exactly. That’s it.
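[Editor’s note: For readers who want the geometry behind that answer, here is a minimal sketch of turning one range measurement plus aircraft position and attitude into an XYZ point. It is a simplified toy model, assuming a straight-down beam and a basic roll/pitch/yaw rotation, with no boresight calibration, scan-angle encoding or GPS/IMU fusion; it is not the Forest Service’s actual processing chain.]

```python
# Toy model of lidar georeferencing: range + aircraft position/attitude -> XYZ.
# Real systems add boresight calibration, beam encoder angles and GPS/IMU fusion.
import numpy as np
from scipy.spatial.transform import Rotation


def georeference(aircraft_xyz, roll, pitch, yaw, rng, beam_dir_sensor):
    """Rotate the sensor-frame beam direction by the aircraft attitude,
    scale by the measured range, and offset by the aircraft position."""
    r = Rotation.from_euler("xyz", [roll, pitch, yaw], degrees=True)
    beam_world = r.apply(beam_dir_sensor)          # unit vector in world frame
    return np.asarray(aircraft_xyz) + rng * beam_world


# One pulse: aircraft at 1,000 m, beam pointing straight down in the sensor frame.
point = georeference([500.0, 200.0, 1000.0], 1.5, -0.8, 92.0, 998.0, [0, 0, -1])
print(point)  # approximate XYZ of whatever the pulse hit
```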

Tom Temin And so the output of putting the lidar data through your program, does that result in pictures or visualizations?

Robert McGaughey Visualization is a big part of it. Just understanding what these systems measure. It’s kind of hard to wrap our brains around the complexity that happens, you know, with this laser and the distance and the precise attitude and everything, but just knowing that it hits trees, buildings, the ground especially is important. But we reduce that down to something that’s more usable, just a whole bunch of points. It’s interesting to look at, but from an information standpoint, you know, our brain processes things really well, and we can see patterns and see objects that we recognize. But getting our computer to see that is a little different.

Tom Temin So how does this work in a practical sense for the Forest Service? First of all, we’ll talk about that context that you developed it for in the first place.

Robert McGaughey So, for us, it was really a research problem to start with, to understand that these systems were even useful for forestry. We knew that they could measure the ground really well, very accurately, but we didn’t even know if we would get data from trees. So once we realized that we could get good measurements from trees, we could measure the tree height over very large areas. We can measure variability in that height. We can measure patterns of vegetation. So there are trees in some areas, not trees in others. You know, big trees, little trees. And all that information is useful for making management decisions. It starts to give us information about the size of the trees, potential value, something about the age just because of their size, where there are denser areas of trees that might need to be thinned to encourage growth, or to remove or reduce fuel for fire risk. Things where we might want to go in and plant because there’s not enough trees. So all those kind of things are useful information to have over the large areas, and it’s wall to wall over the areas covered by this data.

Tom Temin I suppose it could also tell you reasons to look into further as to why the forest density is higher here, as opposed to there also.

Robert McGaughey It could give you some indication. At least it would tell you the areas where you have conditions that maybe you want to know more about. So kind of a reconnaissance before you were to go to the field to do more work.

Tom Temin We’re speaking with Robert McGaughey. He’s a research forester with the U.S. Forest Service and a finalist in this year’s Service to America Medals program. Okay, so we understand what’s happening here from the forestry standpoint for our programming listeners. Then what did you do to convert the point data from the lidar into these visualizations or information that’s usable?

Robert McGaughey So, this software, which is called Fusion, because it fuses different sources of data to help you understand what they all measure. As I said, there is a visualization component that just lets you grab small samples of these data and spin them around and look at these point clouds. Really useful for just understanding what’s being measured. But probably most useful, there’s a whole suite of programming tools or smaller components that allow you to string together commands to take this point cloud and reduce it down to information in a raster form. So that’s like little cells of information that have been summarized from the point data. Much smaller, much easier to work with. Typically, in a geographic information system, that data becomes very useful and very digestible to other types of software. So the software that I’ve developed really takes that raw XYZ point cloud or three-dimensional point cloud, and boils that down into something that’s much more usable for a broader audience.
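[Editor’s note: The core reduction McGaughey describes, summarizing a huge point cloud into per-cell raster values, can be conveyed in a few lines of NumPy. This is a toy stand-in using randomly generated points in place of real lidar returns; Fusion itself is a C++ toolchain with far richer metrics.]

```python
# Sketch of point-cloud-to-raster reduction: summarize XYZ returns into grid
# cells (here, max return height per 10 m cell -- a canopy-height style layer).
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform([0, 0, 0], [1000, 1000, 60], size=(1_000_000, 3))  # fake XYZ returns

cell = 10.0  # raster cell size in meters
cols = (points[:, 0] // cell).astype(int)
rows = (points[:, 1] // cell).astype(int)

canopy = np.zeros((100, 100))                # one value per cell: max return height
np.maximum.at(canopy, (rows, cols), points[:, 2])

print(canopy.shape, canopy.max())            # a GIS-ready summary of a million points
```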

Tom Temin Well, how did people use lidar data before this?

Robert McGaughey I started developing software pretty much about the same time lidar came to forestry. So in the late 90s, there were some initial research efforts to understand what the technology could do. I like to joke our first project included 40 million data points and we had no software. There was nothing commercially available that really could handle 40 million points. Today, that’s collected in a blink of an eye. I mean, literally, we collect 40 million points in a few seconds.

Tom Temin Right. When the first hard drives came out for PCs in that era, you had to decide, should I get the five megabyte drive or the ten megabyte drive?

Robert McGaughey Yes. Things have come a long way. Yeah, we just keep adding zeros to the end of those numbers, and we seem to add them pretty quickly.

Tom Temin And as a forester, I mean, you were trained as a forester. That was your career choice. How did you bridge from looking at trees as a forester to looking at trees as a programmer?

Robert McGaughey So, my undergraduate work was in forestry and I did a masters in forestry, but I developed software for my master’s work as well. My original work was in timber harvesting. It was in doing the engineering design behind some of the systems used out West. It’s one of those things, you know, computer programming, take a couple classes. I had an aptitude for it and I actually really enjoyed it. I always see it as kind of the ultimate engineering thing, where you’re building something that accomplishes a task. So a lot of self-taught. The software that I developed is developed in C++.

Tom Temin Yeah, that was my question. Because when you started, I mean, people were still programming point and coordinate data using Fortran and languages in that era.

Robert McGaughey Yes, that’s true. That’s true. I actually did some Fortran work at one point, but pretty much self-taught on C and C++. Having probably advanced as far as I, you know, could have in that realm. But my software works. It’s very robust. I’ve had the good fortune of having a partnership with a group in the Forest Service that develops training materials, and so we get a lot of testing and training done through the support from that group as well.

Tom Temin Sure. And now Fusion is in the open source world. How else have people applied it that you’re aware of?

Robert McGaughey So, open source is a sticky word today because it implies certain things. It’s freely available software. It’s not really developed as an open source project where we have multiple contributors, but it’s been picked up. The software is distributed as executables, the code is available, but it’s been picked up in academia to a large extent and used to help teach students what you can do with lidar data. And I see that as one of the most valuable things. It’s just we’ve had a whole set of students go through academia that now know about lidar and how to apply it to natural resource things. It’s used internationally. If you do a survey of the literature you’d find, I would say, maybe one in four papers have used the Fusion software to do some of the point cloud processing, partly because it’s free, so people can download it and use it, and it really is designed from a forestry perspective rather than a remote sensing or engineering perspective. So, some of the products, in the way that it does things and kind of the terminology and everything are friendly to foresters. So that’s really helped it be picked up and used around the world.

Tom Temin And are you still working on it? Do you still work to perfect it?

Robert McGaughey I am still working on it, yes. There’s actually a release that’ll come out here in the next week or two that incorporates a new point cloud format for the data, probably do two to three updates a year or major updates. I haven’t really added a lot of new capability for a few years, but we continue to kind of work on how to chew through lots of data faster. These data sets, as I said, are huge. Billions of points, literally, consuming pretty large space on disk drives and that kind of thing. So there’s a real focus now on moving things into the cloud and processing that way.

Tom Temin You’ve moved from the Fortran world to the Drupal and Kubernetes world of large data sets in the cloud.

Robert McGaughey At least as far as large data sets. The software doesn’t really deal with things in the cloud. It’s really set up for desktop use, and it’s all Windows-based, partly because that’s what the Forest Service uses for its computing system. So it’s always been developed to run on the systems that we have available to us.

Tom Temin Sure. And you are a forester, ultimately. Do you still get out and hug a tree once in a while and not just a keyboard?

Robert McGaughey I do. Not as much as I would like to sometimes, but I do still get to go out. We do a lot of field work. I’m also an affiliate instructor at the University of Washington back in Seattle, and I get to work with grad students who have just super interesting ideas and projects that they’re working on. So a lot of that is where the field stuff comes in, as you know, working with small areas and working with very specific projects. A lot of the large area work is just kind of recipe-driven. You plug the data in, you chew it through the software. You get a set of layers, and then there’s some standard uses that those layers are used for.

Tom Temin Sure. So a lot of people speak for the trees, but in some sense you have enabled the trees to speak to you.

Robert McGaughey You could say that, yeah. We’ve kind of enabled an ability to capture a lot of information that would have taken people weeks, months on the ground.

Tom Temin Robert McGaughey is a research forester with the U.S. Forest Service and a finalist in this year’s Service to America Medals program. Thanks so much for joining me.

Robert McGaughey You’re welcome, Tom. This has been great.

Tom Temin And we’ll post this interview along with a link to more information at federalnewsnetwork.com/federaldrive, where you can find all of our Sammies interviews. Subscribe to the Federal Drive wherever you get your podcasts.

 

How to manage the digital records deadline https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/ https://federalnewsnetwork.com/federal-insights/2024/05/how-to-manage-the-digital-records-deadline/#respond Thu, 30 May 2024 21:06:25 +0000 https://federalnewsnetwork.com/?p=5021262 June 30th deadline approaches, when NARA will only accept digitized documents. Agencies must deal with the largest volume known as modern textual records.


Ever since people applied ink to parchment, preserving records has posed challenges. Now federal agencies face a June 30 deadline to digitize certain federal records. The National Archives and Records Administration will require agencies to submit the digitized versions, including metadata for future accessibility. Agencies are moreover obligated to conform to NARA standards in carrying out digitization.

Long in the making and several times delayed, the digital requirement stems ultimately from the never-ending growth in the annual production of paper records and the resulting storage volume.

“There’s hundreds of millions of dollars being spent every year by federal agencies to create, manage and store these hardcopy records,” said Anthony Massey, strategic business developer at Canon, on Federal Insights Records Management. The digitization directive, Massey said, is designed to make archiving easier and less costly while making records themselves more accessible.

The various types of documents, such as maps, photographs, items deemed culturally significant and standard 8.5 x 11-inch bureaucratic output, each have their own associated standards and require different technologies to achieve digitization, Massey noted. NARA’s standards-making has been informed by guidelines from the Federal Agencies Digital Guidelines Initiative, or FADGI.

The initiative got underway about 10 years ago “as a concept of how to begin to guide agencies into what kind of a digitization format they could then roadmap their policy and procedure to,” Massey said.

Many digitizing procedures incorporate scanning. Scanning itself has continually advanced, said Tae Chong, Canon’s manager of new business development. One development especially relates to a type of document known as a modern textual record (MTR).

An MTR typically was created electronically, perhaps with a modern word processing program or – as is often the case with older records about to leave agency possession and move to NARA – in a program whose technical format is no longer extant.

That means digitizing a paper printout using scanning. Now, Chong said, scanning technology includes “software engineering techniques to tell the text from the background and … special software image processing to essentially enhance the visibility of the text element, while erasing unwanted graphics on the background.”

A second element in state-of-the-art scanning, Chong said, encompasses optical character recognition that “can kick in to pick up the text information and pass it to a software application which will then index the document for later search and retrieval.”
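As a rough sketch of that scan-then-OCR-then-index flow, the snippet below uses the open-source pytesseract and Pillow libraries as stand-ins; the proprietary image cleanup Chong describes is not shown, and the document ID and file name here are made up for illustration.

```python
# Hedged sketch of the OCR -> inverted-index pipeline for scanned MTR pages.
# pytesseract requires the Tesseract OCR engine to be installed locally.
from collections import defaultdict

from PIL import Image
import pytesseract

index: dict[str, set[str]] = defaultdict(set)  # word -> document IDs


def ingest(doc_id: str, scan_path: str) -> None:
    """OCR a scanned page image and add its words to a simple inverted index."""
    text = pytesseract.image_to_string(Image.open(scan_path))
    for word in text.lower().split():
        index[word].add(doc_id)


ingest("RG-56-BOX-0042", "page_001.png")       # hypothetical record and file
print(index.get("appropriation"))               # which documents mention this term
```

With the text extracted and indexed this way, retrieval never requires handling the paper original, which is the point of the regulatory requirement.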

He noted that agencies must also by law preserve a paper copy. But by extracting the information and indexing it, public retrieval and viewing will no longer require handling the paper itself.

“The new regulatory requirement is focusing on creating a digital replica of the paper originals,” Chong said.

Special breed

MTRs differ from cultural heritage documents. In the latter type, the entire area of the document encompasses information to preserve; for example, pieces of artwork or hand-lettered manuscripts. OCR technology won’t yield much information, and the background requires preservation along with whatever else the document exhibits.

“When NARA and the working group of FADGI began to establish classifications of imaging for digitizing these various types of records,” Massey said, “they discovered in that particular context of the printed record, there was a need to get a special type of digitization process called MTR that was simpler, less involved with much less expensive equipment that could do a very high quality image and make it transportable into an archive.”

Because MTRs exist nearly universally as printouts on standard office paper, agencies can apply high-speed scanning techniques to them. Massey said agencies have produced billions of MTRs, printing them as either temporary or permanent records.

For such documents, Massey said, NARA wants an online catalog. A researcher with a particular topic “can go to a Library of Congress online catalog and look up that document, instead of having to go in person to a particular storage site or physically go and handle that document.”

While MTR is a process or image standard and not a hardware standard, Massey said Canon has developed scanners specifically for MTR.

“The hardware must then be aligned to those scanning requirements,” he said.

For practical purposes, speed is an important requirement for MTR scanners. Massey said the faster the process occurs, the faster agencies can clear backfile projects for older records. For new records, he said agencies should consider establishing in-house capability to scan and index records as they create them.

“When records management officers look at day-forward scanning,” Massey said, “knowing that from that day forward they also have to digitize these records, they want access to equipment that can do that at a setting that is confidently MTR capable.”


NARA’s looming digitization deadline for agencies means the end of paper https://federalnewsnetwork.com/federal-newscast/2024/05/naras-looming-digitization-deadline-for-agencies-means-the-end-of-paper/ https://federalnewsnetwork.com/federal-newscast/2024/05/naras-looming-digitization-deadline-for-agencies-means-the-end-of-paper/#respond Wed, 29 May 2024 15:30:29 +0000 https://federalnewsnetwork.com/?p=5019103 The National Archives and Records Administration is preparing agencies for the paper cut.

  • The National Archives and Records Administration is preparing agencies for a looming digitization deadline. Starting on July 1, NARA will stop accepting paper records from agencies. Now, the Archives has a new website detailing the metadata requirements for a wide variety of electronic records. The goal is to ensure agencies are formatting their electronic records correctly ahead of the July 1 deadline. NARA said metadata is a key piece of preserving and providing access to the federal government's records.
  • A new request for proposal from the Centers for Medicare and Medicaid Services (CMS) has raised alarm bells among services contractors. The RFP seeks to recompete a 10-year contract to operate call centers, now in its third year. CMS wants to force bidders to establish union contracts with labor harmony agreements that bar strikes. The contract is now held by Maximus, which has 10,000 employees operating CMS call centers. Industry sources said the contract garners a 95% positive customer satisfaction rating and has had no labor issues. Professional Services Council CEO David Berteau said the labor requirement is unprecedented in such contracts and may violate the National Labor Relations Act. Proposals are due by the end of June.
    (HHS request for proposal has industry up in arms - The Federal Drive with Tom Temin)
  • One of the DoD’s top IT acquisition executives is departing federal service after a long career. Ruth Youngs Lew, the Navy’s program executive officer for digital and enterprise services, said she is retiring at the end of this week. She has led PEO Digital for the past seven years. Before that, as part of a 30-year career, she was the CIO for the Navy’s Pacific Fleet.
    (Retirement after 31 years - Ruth Youngs Lew via LinkedIn)
  • The Office of Personnel Management is working to address a recent spike in fraudulent activity. Several hundred federal employees in OPM’s flexible spending account program, FSAFEDS, are seeing fraudulent charges on their accounts. Scammers have used employees’ personal information to create fake accounts, or make false reimbursement claims. OPM, along with the program’s vendor HealthEquity, are working to secure impacted accounts and implement additional anti-fraud controls. OPM will also reimburse in full any affected employees.
  • Veterans are giving the Department of Veterans Affairs record-high trust scores, reaching an all-time high of more than 80%. That is based on feedback from more than 38,000 veterans who received VA services between January and March this year. VA’s current trust scores are 25% higher than when it first conducted its veteran trust survey in 2016. VA Secretary Denis McDonough said the department’s workforce is committed to delivering a high level of care to veterans across all service areas. “We strive to be an agency that fits our programs into veterans’ lives,” McDonough said.
  • The State Department’s Bureau of Intelligence and Research plans to invest more in open-source data. INR’s new open-source intelligence strategy calls for the bureau to meet the rising demand for OSINT from State Department employees across the world. “When it comes to the future of OSINT, the stakes could not be higher," Assistant Secretary of State for Intelligence and Research Brett Holmgren said in an interview. Holmgren also said INR needs to harness a growing body of open information about world events. And he thinks generative AI could help the bureau sift through all that data to deliver more unclassified intelligence assessments.
  • A massive bipartisan bill is looking to make a lot of changes at the Department of Veterans Affairs. But veteran service organizations said they are concerned the bill does not have enough support to make it through Congress. Top lawmakers from the House and Senate VA Committees are backing the Senator Elizabeth Dole 21st Century Veterans Healthcare and Benefits Improvement Act. But veteran groups said the bill failed to reach a House floor vote before Memorial Day, falling short of expectations. Among its changes, the sweeping bill would give the VA additional pay flexibilities for its workforce. It would also set new requirements for VA to resume the rollout of its new Electronic Health Record.
  • The Pentagon’s Chief Digital and Artificial Intelligence Office (CDAO) is getting four new leaders to help accelerate innovation across the department. Garrett Berntsen will join the organization as the deputy CDAO for mission analytics. Berntsen previously served as the State Department’s first deputy chief data and AI officer, where he stood up the State Department’s Center for Analytics. Eugene Kuznetsov will join the CDAO as the deputy for enterprise platforms and services. Jock Padgett will step into his role as the deputy CDAO for advanced C2 acceleration. Christopher Skaluba will be the CDAO’s executive director.
  • As Election Day approaches, agencies should make sure that any job appointments or awards they give out are free from political influence. In a recent reminder, the Office of Personnel Management details how and where agencies should pay attention to keep within the guidelines. For one, agencies need to get OPM approval before moving a political appointee to a non-political position. It is a practice commonly known as “burrowing.” Agencies also cannot hand out pay bonuses or extra time off to politically appointed senior officials until after January 20, 2025.
  • The Defense Innovation Unit is seeking a cross-domain cloud-based information technology capability to make sense of big data for biodefense purposes. The system will be focused on automating anticipatory analysis for biological and health-related issues while also providing situational awareness for all levels of command. The DIU wants this system to be enabled by artificial intelligence and machine learning. The new technical solutions will work seamlessly with various DoD systems and capabilities, including the Combined Joint All Domain Command and Control initiative. Responses are due by June 7.

The Marine Corps’ plan to further breakdown data siloes https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/ https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/#respond Fri, 24 May 2024 16:44:13 +0000 https://federalnewsnetwork.com/?p=5014286 Dr. Colin Crosby, the service data officer for the Marine Corps, said the first test of the API connection tool will use “dummy” logistics data.


The Marine Corps is close to testing out a key piece of its upcoming Fighting Smart concept.

As part of its goal to create an integrated mission and data fabric, the Marines will pilot an application programming interface (API) standard to better connect and share data no matter where it resides.

“Really over the next 12 months, we hope to have the autonomous piece of this API connection implemented in our environment in what we call the common management plane that allows us to execute enterprise data governance where we can then use the capabilities rather than the native capabilities within our environment to develop those data catalogs, to tag data, to track the data from its lineage from creation all the way to sharing and destruction within our environment and outside of our environment,” said Dr. Colin Crosby, the service data officer for the Marine Corps, on Ask the CIO. “We’re working with what we call the functional area managers and their leads on the data that they own because this is all new in how we’re operating. I need them to help me execute this agenda so that we can then create that API connection.”
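
To make the governance piece more concrete, the sketch below shows one way a catalog entry could carry tags and a lineage trail from the data's creation through sharing and destruction. The class and field names are purely illustrative; the Marine Corps has not published its actual catalog schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEvent:
    """One step in a dataset's life: creation, tagging, sharing or destruction."""
    action: str       # e.g. "created", "tagged", "shared", "destroyed"
    actor: str        # the functional area manager or system responsible
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class CatalogEntry:
    """A hypothetical catalog record that travels with the dataset across environments."""
    dataset_id: str
    owner: str        # the functional area that owns the data
    tags: list[str] = field(default_factory=list)
    lineage: list[LineageEvent] = field(default_factory=list)

    def record(self, action: str, actor: str) -> None:
        self.lineage.append(LineageEvent(action, actor))

# Usage: register a logistics dataset, tag it, then log a share event.
entry = CatalogEntry(dataset_id="log-maint-2024-05", owner="logistics")
entry.record("created", "logistics")
entry.tags.append("CUI")
entry.record("shared", "training-and-education")
```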

As in many organizations, mission areas in the Marine Corps own and manage their data, but sharing it can be difficult because of culture, technology and policy.

Crosby said the API connection can help overcome many of these challenges.

“Our first marker is to have a working API connection on test data. Once that happens, then we’re going to start accelerating the work that we’re doing,” he said. “We’re using logistics data, so what we’re doing is using dummy data, and we’re going to pull that data into our common management plane, and then from that CMP, we want to push that data to what we call the online database gateway. Then, by pulling that into the OTG, we can then push it into the Azure Office 365 environment, where we can then use that data using our Power BI capabilities within our environment.”
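
The flow Crosby describes, dummy data pulled into the CMP and then pushed through a gateway toward the Office 365 environment, could be exercised with a script along these lines. Every URL below is a placeholder invented for illustration; the actual CMP and gateway interfaces are not public.

```python
import json
import urllib.request

# Placeholder endpoints invented for illustration; the real CMP and
# gateway interfaces are not public.
CMP_INGEST = "https://cmp.example.mil/api/v1/ingest"
GATEWAY_PUSH = "https://gateway.example.mil/api/v1/datasets"

def post_json(url: str, payload: dict) -> int:
    """POST a JSON payload and return the HTTP status code."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Dummy logistics record standing in for live data during the pilot.
dummy_record = {"dataset": "logistics-test", "rows": [{"part": "A123", "qty": 4}]}

post_json(CMP_INGEST, dummy_record)    # step 1: land the data in the common management plane
post_json(GATEWAY_PUSH, dummy_record)  # step 2: push it through the gateway toward Power BI
```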

Testing the API before production

Once the API connection proves out, Crosby said the goal is to push data into the Marine Corps’ Bolt platform, which runs on the Advana Jupiter platform.

He said there is a lot of excitement from logistics and other mission areas around the Marine Corps to prove this API connection technology.

“As we get more comfortable moving forward, then we will bring on the next, what we call, coalition of the willing. As of now, we have a line because we have other organizations now that are like, ‘we want to be a part of this,’” Crosby said. “The training and education command is ready to go. So we’re excited about it because now I don’t have to work that hard to get people on board and now I have people knocking on my doors saying they are ready to go.”

Crosby added that before the API connection goes live with each new organization, his team will run similar tests using dummy data. He said building that repeatable process and bringing in some automation capabilities will help decrease the time it takes to turn on the API tools for live data.
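
A repeatable pre-onboarding check of the kind Crosby describes can be quite small once the connection exposes generic pull and push operations. The sketch below assumes exactly that and nothing more; the callables are stand-ins, not a published interface.

```python
def smoke_test_connection(pull, push, dummy_payload: dict) -> bool:
    """Round-trip dummy data through the API connection before any live
    data is allowed to flow; `pull` and `push` are whatever callables
    the connection exposes (assumed here, not published)."""
    try:
        push(dummy_payload)
        returned = pull(dummy_payload["dataset"])
        return returned == dummy_payload
    except Exception:
        return False

# In an automated onboarding pipeline, each new organization's connection
# would be gated on this check, e.g.:
#   assert smoke_test_connection(cmp.pull, cmp.push,
#                                {"dataset": "ted-test", "rows": []})
```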

Without these new capabilities, Crosby said, it takes weeks to pull CSV files, delaying leaders’ decision-making.

“With the API, we’re going to a near-real-time type of pull and push, which is speeding up the decision cycle,” he said. “Then there are opportunities to expand on that by building applications that will aggregate data and then being able to look at data to check the maintenance on equipment, and then it’d be a little bit easier to understand what we need and when. The goal is to shrink that decision cycle a little bit.”
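
To illustrate the shift from batch CSV exports to near-real-time pulls, here is a hypothetical watcher that polls equipment maintenance status over such an API. The `fetch_status` callable stands in for whatever client the connection would actually expose.

```python
import time

def watch_maintenance(fetch_status, equipment_ids, interval_seconds: int = 60):
    """Poll equipment status in near real time instead of waiting weeks
    for CSV exports; `fetch_status` is a stand-in for the real API client."""
    while True:  # runs as a long-lived watcher
        for eq in equipment_ids:
            status = fetch_status(eq)  # assumed to return a dict of attributes
            if status.get("maintenance_due"):
                print(f"{eq}: maintenance due, flag for planners")
        time.sleep(interval_seconds)
```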

The API connection tool is one piece of the bigger Marine Corps effort to create an integrated mission and data fabric. Crosby said that initiative also relies on unifying the Marine Corps enterprise network to bring the business side and the tactical side together into one environment.

“The fabric is a framework and approach of our environment today and how we want to connect our environment in an autonomous fashion using APIs, so that we can pull data and we can share data, regardless of the cloud environment that it’s in, regardless of whatever database structure the data resides in,” Crosby said. “It allows us to be flexible. It allows us to scale and to really push data and pull data at a speed that we’ve never done before. What I love about the fabric is it really gets to that decision making. It allows our commanders to make sense and act within real or near real time.”
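
One common way to get that kind of backend independence is a uniform pull and push contract with an adapter per environment. The sketch below uses an in-memory stand-in where a real adapter would wrap Azure, Advana or a SQL store; the class names are invented and do not come from the Marine Corps’ design.

```python
from abc import ABC, abstractmethod

class FabricSource(ABC):
    """Uniform contract the fabric expects, whatever cloud or database sits behind it."""
    @abstractmethod
    def pull(self, dataset_id: str) -> list[dict]: ...

    @abstractmethod
    def push(self, dataset_id: str, rows: list[dict]) -> None: ...

class InMemorySource(FabricSource):
    """Stand-in backend; a real adapter would wrap Azure, Advana or a SQL store."""
    def __init__(self):
        self._store: dict[str, list[dict]] = {}

    def pull(self, dataset_id: str) -> list[dict]:
        return self._store.get(dataset_id, [])

    def push(self, dataset_id: str, rows: list[dict]) -> None:
        self._store.setdefault(dataset_id, []).extend(rows)

def share(dataset_id: str, source: FabricSource, destination: FabricSource) -> None:
    """Move a dataset between environments without callers knowing where it lives."""
    destination.push(dataset_id, source.pull(dataset_id))

# Usage: push a record on the business side, then share it to the tactical side.
business, tactical = InMemorySource(), InMemorySource()
business.push("log-maint-2024-05", [{"part": "A123", "qty": 4}])
share("log-maint-2024-05", business, tactical)
```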

The post The Marine Corps’ plan to further break down data silos first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/defense-news/2024/05/the-marine-corps-plan-to-further-breakdown-data-siloes/feed/ 0
Platform One looks to enhance, build on software factory services https://federalnewsnetwork.com/technology-main/2024/05/platform-one-looks-to-enhance-build-on-software-factory-services/ https://federalnewsnetwork.com/technology-main/2024/05/platform-one-looks-to-enhance-build-on-software-factory-services/#respond Wed, 22 May 2024 02:04:14 +0000 https://federalnewsnetwork.com/?p=5010544 The Air Force’s Platform One is accelerating modern software development for the Defense Department.

The post Platform One looks to enhance, build on software factory services first appeared on Federal News Network.

]]>

The Air Force’s Platform One program has a well-established role in bringing “DevSecOps” software development to the Defense Department.

Now the program is focusing on enhancing its existing services, while expanding its secure software development work into more sensitive data environments.

Platform One’s core offerings include “Iron Bank,” a secure repository of hardened container images. Maj. Matthew Jordan, chief of product for Platform One, describes Iron Bank as the “Lego bricks” needed to build modern software. That includes hardened applications, continuous monitoring, vulnerability scanning and regular updates.

“We ensure that we’re patched within our repository, and all of our downstream consumers are able to easily receive the cybersecurity benefits,” Jordan said on Federal News Network. “You’re getting a lot of economies of scale there.”
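
One way to picture how downstream consumers inherit those patches: projects pin the digests of the hardened images they build on, and rebuild whenever the upstream digest moves. The check below is a conceptual sketch with invented digest values, not Iron Bank’s actual interface.

```python
def outdated_images(pinned: dict[str, str], registry_latest: dict[str, str]) -> list[str]:
    """Compare the digests a project has pinned against the latest hardened
    digests published upstream; stale entries should trigger a rebuild so
    the project inherits the security patches."""
    return [
        name
        for name, digest in pinned.items()
        if registry_latest.get(name, digest) != digest
    ]

# Illustrative digests: the Python base image was re-hardened upstream,
# so this project should rebuild on the new digest.
pinned = {"python311": "sha256:aaa", "nginx": "sha256:bbb"}
latest = {"python311": "sha256:ccc", "nginx": "sha256:bbb"}
print(outdated_images(pinned, latest))  # ['python311']
```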

Iron Bank is primarily accredited for less sensitive unclassified information. But Jordan said Platform One is working to get Iron Bank accredited for controlled unclassified information (CUI) as well as for classified information.

That work is detailed in Platform One’s new product roadmap, which lays out the program’s plans for various offerings and services, including “Big Bang,” its continuous integration and continuous delivery/deployment (CI/CD) platform, and the “Party Bus” platform-as-a-service.

Platform One’s zero trust approach

Platform One also provides a “cloud native access point” (CNAP) for accessing the software factory’s various services in a secure manner. Jordan said CNAP was “borne out of necessity” in the early days of Platform One, as it sought to work with software vendors, including nontraditional defense vendors, to establish its agile software development platform.

“How do you ensure that you’re still being secure and accessing things that may be coming from the internet, or via a contractor’s workplace or from their home, as opposed to in a secure facility on a base?” Jordan explained. “So CNAP allows you to do the device compliance checks, so that you get a lot of attributes about the device itself, as well as understand who the user is, and get a lot of attributes on that user, and then make risk decisions as to ‘Okay, based on what we’re seeing today, you only get access to a certain subset of resources.’”

The capability allows each application owner behind the access point to “set their policies dynamically and make informed risk decisions, or accept the risk if they’re willing to, or mitigate the risk that they want to,” Jordan added.
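
Reduced to pseudocode, a CNAP-style decision maps device and user attributes to a permitted subset of resources rather than a binary yes or no. The attribute and resource names below are illustrative; the actual CNAP schema is not public.

```python
from dataclasses import dataclass

@dataclass
class AccessContext:
    """Illustrative attributes a CNAP-style check might gather; not the real schema."""
    device_compliant: bool    # patched OS, disk encryption, etc.
    on_managed_network: bool
    user_clearance: str       # e.g. "public" or "cui"

def resources_for(ctx: AccessContext) -> set[str]:
    """Grant a risk-based subset of resources instead of all-or-nothing access."""
    granted = {"public-docs"}
    if ctx.device_compliant:
        granted.add("build-pipeline")
    if ctx.device_compliant and ctx.on_managed_network and ctx.user_clearance == "cui":
        granted.add("cui-repos")
    return granted

# A contractor on a compliant laptop at home gets the pipeline but not CUI repos.
print(resources_for(AccessContext(True, False, "cui")))
# -> {'public-docs', 'build-pipeline'}
```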

CNAP is a key piece of Platform One’s zero trust security architecture, which also includes macro segmentation using a software-defined perimeter, Jordan said. Internally, Platform One also uses service meshes to ensure segmentation between individual applications, as well as continuous logging and monitoring.

“And that runtime security so that you can understand when something is going wrong or something’s attempting to do something that it shouldn’t, and then dive deeper for that root cause analysis,” Jordan said.
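
Conceptually, that per-application segmentation amounts to a default-deny allow list between services, with every denial feeding the logging and monitoring pipeline. The table below sketches the idea in plain Python with made-up service names; a real deployment would express it as service mesh policy instead.

```python
# Hypothetical segmentation table: each service may only call the peers
# explicitly listed for it; everything else is denied by default.
ALLOWED_CALLS: dict[str, set[str]] = {
    "web-frontend": {"orders-api"},
    "orders-api": {"inventory-db"},
}

def call_permitted(src: str, dst: str) -> bool:
    """Default-deny check; denials are logged for continuous monitoring
    and later root cause analysis."""
    permitted = dst in ALLOWED_CALLS.get(src, set())
    if not permitted:
        print(f"DENY {src} -> {dst}")  # would feed the monitoring pipeline
    return permitted

print(call_permitted("web-frontend", "orders-api"))    # True
print(call_permitted("web-frontend", "inventory-db"))  # False, and logged
```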

Platform One is also collaborating with other Air Force organizations on an application programming interface (API) reference architecture document. Jordan said that document is currently in draft.

“Data is king, and it’s crucial that we don’t allow data to just be put into a silo,” he said. “We need to be able to share that data. And API is definitely one way to enable that data flow. So we need to focus on providing those standards for application programming interfaces, software, development kits, data fabrics, all that kind of stuff to the developers. So they can quickly focus on developing features for their applications, as opposed to focusing on interfaces.”
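
The payoff of a shared API standard is that application teams code against one thin client instead of reimplementing interfaces. A hypothetical example of such a client follows; the base URL, path and token handling are invented for illustration, since the reference architecture is still in draft.

```python
import json
import urllib.request

class DataFabricClient:
    """A thin SDK of the kind a reference architecture might standardize;
    the base URL, paths and auth scheme here are invented for illustration."""

    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def get_dataset(self, dataset_id: str) -> list[dict]:
        """Fetch a dataset by ID over the standardized interface."""
        req = urllib.request.Request(
            f"{self.base_url}/datasets/{dataset_id}",
            headers={"Authorization": f"Bearer {self.token}"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

# client = DataFabricClient("https://api.example.mil", token="...")
# rows = client.get_dataset("maintenance-status")
```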

The post Platform One looks to enhance, build on software factory services first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/technology-main/2024/05/platform-one-looks-to-enhance-build-on-software-factory-services/feed/ 0