GIS Conference Highlights Information Sharing

(Editor's Note: Following are edited excerpts from three presentations given by senior leaders at Esri's Federal GIS Conference held in Washington, D.C., in February.)

 
Sue Gordon
Deputy Director
National Geospatial-Intelligence Agency
 
In partnership with Esri, we're in the middle of releasing a variety of non-classified data, and we're working with a great partner, the University of Minnesota's Polar Geospatial Center, to develop digital elevation models of Alaska and, eventually, all of the Arctic.
 
This is the first time that we're making detailed digital models available to the public. Models that show you shoreline elevation, and how much water levels can rise before there's flooding. Models that can also tell you how ice and snowmelt drain, how water feeds through the valleys, and how people in the Arctic get their drinking water. These models can support anything dependent on elevation. You can think of the possibilities as well as I can—scientific research, commercial endeavors, economic impact, military operations, and on and on.
 
I will spend some time talking about partnerships, not only with Esri but with others. I'll go into how we're leveraging those partnerships today, and how we plan on leveraging them in the immediate future. A central component of this vision requires a geospatial platform in a shared environment. And the intelligence community's new enterprisewide delivery platform—what we call GEOINT Services—enables that vision and will be online this year across multiple security domains.
 
Let's talk about the steps we've taken to pull our partners more fully and completely into Team GEOINT.
First of all, we're incrementally shifting from building and using agency-specific IT systems, tools, and capabilities to sharing cloud-based services. The intelligence community's Information Technology Environment (IC ITE) is the IC-wide strategy to accomplish this, enabling agencies to share information and capabilities on demand to meet mission needs. And NGA will be off our own infrastructure and into the Community Cloud in two years—something that sounds yawn-inducing, but is actually a move that will enable everything we want to do. The result will be a more efficient model for resource sharing that can scale rapidly to meet unexpected and emergent requirements, which is perfect for the dynamic world I just described.
 
As I mentioned, GEOINT Services is the name of the IC's solution for how we share geospatial knowledge. The concept is this simple: We will expose our content—and your content, if you allow. We will provide cloud-based, scalable, responsive, Open Geospatial Consortium (OGC) compliant services for common use on Top Secret, Secret, and Unclassified domains. And we will demand that every piece of intelligence is tied to a place and a time—even if it is not traditional geospatial data. This will be true whether it is data from commercial sources, open sources, or governmental sources.
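
Because the services are OGC compliant, any standards-aware client can reach them with ordinary HTTP calls. The sketch below illustrates that pattern in Python against a hypothetical Web Map Service (WMS) endpoint; actual GEOINT Services URLs are not public and will differ by security domain.

    # Minimal sketch: consuming an OGC-compliant Web Map Service (WMS).
    # The endpoint URL is hypothetical, for illustration only.
    import requests
    import xml.etree.ElementTree as ET

    WMS_URL = "https://services.example.gov/geoint/wms"  # hypothetical endpoint

    # GetCapabilities: ask the service which layers it publishes.
    caps = requests.get(WMS_URL, params={
        "service": "WMS", "request": "GetCapabilities", "version": "1.3.0"})
    root = ET.fromstring(caps.content)
    ns = {"wms": "http://www.opengis.net/wms"}
    layers = [e.text for e in root.findall(".//wms:Layer/wms:Name", ns)]

    # GetMap: render the first layer over Alaska as a PNG.
    # WMS 1.3.0 with EPSG:4326 expects the bounding box in lat/lon order.
    png = requests.get(WMS_URL, params={
        "service": "WMS", "request": "GetMap", "version": "1.3.0",
        "layers": layers[0], "crs": "EPSG:4326",
        "bbox": "51.0,-180.0,72.0,-129.0",
        "width": "1024", "height": "512", "format": "image/png"})
    with open("alaska.png", "wb") as f:
        f.write(png.content)

The same request pattern works against any compliant server, which is what makes content exposed this way usable across domains and toolchains.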
 
The biggest impact GEOINT Services will have for our users is that it will speed up the discovery, access, processing, and visualization of geospatial content to support time-critical tactical requirements. You will be able to spend less time hunting and gathering, and more time correlating and analyzing. And you will be able to communicate more easily using simple GIS tools such as Story Maps. In sum, we are making changes to how we develop, acquire, and deploy tools that will allow NGA to get solutions to our workforce and our mission partners much faster and more effectively.
 
Our fiscal year 2016 plan included building geospatial content management tools into our unclassified and IC ITE service offerings. This assists our mission partners as they migrate and manage their own geospatial data in the cloud. And it provides reliability and consistency throughout the IC.
 
In addition to the platform, we'll provide central registry and cataloging services that will help users find and get data. As we grow the quality and quantity of services, users throughout the Department of Defense and the IC can better support the kind of analysis that will dominate our future.
 
I've talked a lot about how GEOINT Services will enable the mission, and the common services we're providing. We're also doing a lot behind the scenes. Just to give you one example, we're changing the way we develop and acquire tools and capabilities needed by warfighters, first responders, and our other mission users. NGA is switching to a DevOps methodology that allows developers and solution providers to get needed capabilities deployed to mission users much more rapidly. With this methodology, we're embracing collaboration during the solution development process, and we're establishing a developer environment that supports automated delivery of software and code into the NGA cloud.
 
We have capabilities already available for geospatial analysts to leverage, including our base visualization and data services, where analysts can discover trusted GEOINT. For example, NGA's Map of the World initiative has delivered our foundation GEOINT layers as services to support easy access and visualization on five secure domains. This enables NGA users as diverse as Army infantrymen and international partners to access our content on the networks where they work.
 
We also use Esri's Portal for ArcGIS as the IC GIS Portal, providing a platform for analytic collaboration and data sharing for IC geospatial users. Our adoption of portal and web GIS promotes the sharing of geospatial data sets, tools, and services in one location—about time, you say! As of yesterday, we have had more than 19,000 unique users log in to the IC GIS Portal, which is more than double the number of users we had just last October. These users include the traditional IC agencies, but also our partners in the Army, Navy, Marines, Air Force, and combatant commands. Nearly 50 percent of these users have been active in the portal in the last 30 days. We are also providing a modern, user-friendly, streamlined web presence on all domains, including the World Wide Web. You won't have to piece together the story. You'll have it all in one place.
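
The portal's sharing model is what makes that discovery possible: every item registered in it can be found through the standard portal search interface. Below is a minimal sketch of that workflow using a hypothetical portal URL; the IC GIS Portal itself is not publicly reachable, but Portal for ArcGIS and ArcGIS Online expose the same REST search endpoint.

    # Minimal sketch: discovering shared items through a portal's REST search API.
    # The portal URL and search terms are hypothetical, for illustration only.
    import requests

    PORTAL = "https://portal.example.gov/arcgis"  # hypothetical portal URL

    resp = requests.get(PORTAL + "/sharing/rest/search", params={
        "q": 'title:"Arctic" AND type:"Feature Service"',  # search by title and item type
        "num": 10,     # page size
        "f": "json",   # JSON response
    })
    for item in resp.json().get("results", []):
        print(item["title"], "|", item["type"], "|", item["owner"])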
 
Let me close by telling you what is still in the offing. We need solutions that will:
• Enable our analysts to have simple, easy access to big data—and to not just look through massive amounts of data to find single answers, but to use the bigness of data to uncover patterns that are not obvious without looking at the whole.
• Equip our analysts to make sense of all that data—and do so when the data are not geographically contiguous or temporally synchronous.
• Provide more and streamlined capabilities—and to do so when users are disadvantaged, limited by low bandwidth or network access challenges.
• Move further toward anticipatory, instead of responsive, analysis—and to do so when there is no known starting point for understanding the pattern you're seeing.
• Establish on-the-fly analytic services and tools—and make them so flexible that they adapt to the trial and error that is required to answer complex questions.
• Develop better modeling capabilities—or, to put it another way, provide better tools to capture knowledge in a useful, repeatable manner.
 
Tracy Toutant
Director, Intelligence Community GIS Portal
NGA
 
It's not so much that what we are doing is revolutionary, but that we're trying to bring all the functions to an environment where traditionally we have lagged.
 
If we get this IC GIS Portal right, it's going to fall right in a technology "sweet spot." The reason I make that claim is that when we started this project, we didn't get it right. What we want this portal to do is to enable analysts. We want analysts across the community to discover geospatial data, create and share maps, and perform basic GIS functions. We want to run this as an enterprise service for the community. We want to increase the community's access to, and the usability of, data services, web mapping capabilities, and GIS-based web applications, communitywide.
 
We're trying to use the portal to complement efforts not only by NGA but across the community. We want to use more collaborative technology. We have technology to help us collaborate, but the portal will be unique in that it focuses on geospatial analytical tradecraft. It provides a way to share data, models and scripts. It's focused at the geospatial analysis level. We want to include services, and not just trusted content, but web applications—analytic tools that we can use in common.
 
Most importantly, we want analyst-created data sets that are not usually considered trusted content, but still have analytic value. Right now, a lot of that data is not easily discoverable. We want to get more of that data and tradecraft shared across the community.
 
When we started, the piece we got right was in deploying the portal in the cloud on the high side. Like anything else on the high side, the first person to do it has the most trouble. You have the most paperwork and security negotiations. We were able to make sure we had some of the newer functions, and to combine content from previous portals.
 
At that point, however, we walked away and said the portal had been stood up. The catch was that it didn't actually meet the need. What happened was that the person responsible for standing up the portal did so, and then waited for NGA to say this is what we want to do with it. But NGA didn't do that. So we weren't surprised when complaints poured in from all the agencies that no one was getting what they wanted. So we took ownership of this.
 
The other thing we found was that there wasn't a single person dedicated to supporting the portal on a full-time basis. So I talked to others in the community and asked how they ran their portals. I went back to NGA with the best practices from the community, saying this is how we should run our GIS portal.
We are still working on things like workflows and adopting best practices. But by taking ownership we can say we like how certain agencies control different functions, and look for the best of breed. We aren't going to reinvent the wheel or make up a new way of doing business. Now we've made our portal better, and people want to put things on our portal, which is the highest form of praise.
 
I'm an analyst, and I'm very focused on what I can do to make the lives of analysts better. How can I get them what they need to do their jobs? How do we support models, scripts, tools and widgets? How do we move from product-focused things to working on the web? How do we enable teams? How do we keep the platform flexible and keep the barrier to entry low? We found that by using the portal, and reconfiguring what we need to, we could divert all of our programming resources to the pieces we needed to support our unique workloads.
 
We also needed classification tools, which are required. The first tool we rolled out was exactly what we had asked for. But we had to pull it down and start over, building a simpler classification tool. We are trying to support multiple types of missions, and to get better at service enabling our data, so you can get what you need when you need it.
 
We don't consider this project done. We've been able to add functionality, and we're looking at how to get trusted data into workflow pipelines. We've added a separate server attached to the portal.
What we need to do is to leverage what we are doing right. That sounds obvious, but we've all seen projects that end up languishing. We've tried hard to set up good conduits for feedback, and decided to call the link 'enhancement requests' rather than feedback. We want to hear what works well and doesn't, and to use this as a place for innovation and collaboration.
 
We know that we have issues specific to our community, classification and release being the biggest ones. We're constantly taking feedback from our users. We have dependencies—we're relying on people who are creating data services to tell us they are there. We are relying on people to want to create applications on the portal. Participation is the key.
 
We're starting to see the impact. Last summer, analysts everywhere were complaining about this, with the chief complaints being about content. We had a lot of broken links, and we were making things hard for analysts. Don't think we didn't hear that—so much so that it's going to be a huge part of what we do going forward. It's not a technology problem, but a workflow and a policy problem, and we can fix those. We're looking for help in making improvements in data, applications, and services.
 
David Alexander
Geospatial Information Director
Department of Homeland Security
 
Our critical infrastructure information is now being delivered as open and secure data. The Homeland Infrastructure Foundation-Level Data (HIFLD) site includes 275 data sets available as dynamic web services and downloadable files with built-in digitalization.
 
To understand the significance of this, we need to understand how homeland security has changed in the last 15 years. After 9/11, the immediate need was to secure and protect our infrastructure from terrorism. While this remains a critical part of the core DHS mission, homeland security is much more diverse, and considers areas like economic security and resiliency.
 
Addressing this broad mission requires a whole-of-nation approach. No single department can solve everything. It will take all of us. Navigating between open and secure information is a balancing act. Which of our critical infrastructure data can we make open to increase our economic security, and which data must remain secure to strengthen our homeland security?
 
Reaching this decision has been a challenging journey for all of us; it has required changing our thinking and adapting our culture both inside and outside DHS. Recent events like the Boston Marathon bombing have triggered a new type of community engagement. This requires adopting new technologies that facilitate open community collaboration. We continue to deliver secure services to government and first responders. But we must also enable greater community access to our content. We used to ship DVDs! Now we use dynamic web services that are updated from the source.
 
Providing authoritative, trusted, and open data is fundamental to the DHS mission. ArcGIS Open Data supports our business processes and mobile and web access. HIFLD Open is integrated with the GeoPlatform through Data.gov and other data providers. There is no wrong door. The data is discoverable and accessible from anywhere.
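
Because HIFLD Open layers are published as live ArcGIS feature services, they can be queried directly rather than downloaded and redistributed. Here is a minimal sketch using an illustrative layer URL and field names; the real service URLs are listed on the HIFLD Open site and on Data.gov.

    # Minimal sketch: querying a HIFLD Open layer as a dynamic web service.
    # The layer URL and field names are illustrative, not the actual service.
    import requests

    LAYER = ("https://services.example.com/arcgis/rest/services/"
             "Hospitals/FeatureServer/0")  # illustrative HIFLD Open layer

    resp = requests.get(LAYER + "/query", params={
        "where": "1=1",                  # no attribute filter: all records
        "outFields": "NAME,CITY,STATE",  # illustrative field names
        "resultRecordCount": 25,         # page size (services cap the maximum)
        "f": "geojson",                  # return GeoJSON straight from the service
    })
    for feature in resp.json()["features"]:
        props = feature["properties"]
        print(props.get("NAME"), "|", props.get("CITY"), props.get("STATE"))

Pulling the data this way means every request reflects the current source, which is the practical difference between a dynamic service and shipping DVDs.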
 
HIFLD Open marks an evolution in DHS information sharing. We have an opportunity to be open and secure, and to empower citizens and communities to support local law enforcement, first responders, businesses, and the private sector. Homeland security requires a whole-of-nation approach. I invite you to join us in building a transparent and collaborative ecosystem for sharing information across the nation.

 

