
Industry Terrain: Kevin Hope Part 3 - What Does the Future Hold?

In the final installment of our Industry Terrain series, Continental Mapping's Kevin Hope expands on what the future of the geospatial industry may hold. Recognizing and leveraging emerging trends around technology, specifications, and requirements has always driven Continental Mapping's innovation and value. And as Hope explores, ever-expanding data inputs mean that quality analysis will be what creates coherence from the chaos.

The Future of Geospatial Technology, Specifications, and Requirements

Kevin Hope: So many avenues to explore for the future! Some primary trends that I think are emerging for the future of geospatial include the following:

  • Technology. Moving at the speed of user needs is the future. In software development, that means a commitment to DevOps and the ability to rapidly spin up solutions to user requirements. Artificial intelligence and machine learning will be part of the future; you can see it all over the government landscape. The ability of the private sector and government to respond with algorithm development, advanced modeling, and effective solutions for automating hard problems will be key drivers for technology going forward. Data and sources of information are proliferating exponentially, and technology needs to find solutions that can rapidly turn that data into coherent answers.

  • Specifications. Open standards are a big part of the future. Organizations like the Open Geospatial Consortium work across the community to develop and promulgate open geospatial standards, so that capabilities built on them are interoperable across the domain. Developing standards and supporting community-based reference implementations of those standards are keys to enabling data and service interoperability. Open source capabilities will continue to grow and advance. Investing in leading the development of these standards and specifications is something all private sector geospatial entities need to consider.

  • Requirements. I'll use the current foundation geospatial requirements process at NGA as an exemplar. The documented, formal process whereby NGA works with the community to define requirements, accept requirements, and fulfill needs has served the community well for decades. However, as end users demand shorter timelines from requirement to output, that process must evolve. There will likely be a need for standard products and data development as far as the eye can see. In some ways the existing paradigm can support this, but with users demanding updated content within hours or days instead of weeks or months, requirements satisfaction will need to include a rapid, near real-time ability to respond. Employing automation, change detection, and capabilities like crowdsourcing will become part of how we think about requirements differently going forward.

"More is not always better. Capabilities, automation, and other advancements must find a way to leverage the best of the information available to derive actionable information and answers in near real time."

The Concept of Persistence: More Does Not Always Mean Better

This drives to the very heart of where the community is going. The days when the Government had a monopoly on a small set of exquisite capabilities, and requirements vastly outstripped the system's ability to respond, are gone. The paradigm has been flipped on its head. We now exist in an environment where the proliferation of advanced sources, national and commercial, makes virtual persistence a reality. It is this shift that the DoD/IC community must face. How to turn that abundance into coherence is the analytic challenge of the future.

More is not always better. Capabilities, automation, and other advancements must find a way to leverage the best of the information available to derive actionable information and answers in near real time. This shift will be extremely challenging for both government and private industry. It challenges the very core of how data has been gathered and analytic intelligence has been derived. It will shake the foundations of the field and challenge the community as a whole to rethink the way we do business! Those agencies and companies that can look forward and develop the capabilities needed to meet this challenge will lead this transformation.
