Infoterra has been in the geospatial business for over 25 years, since its founding in 1980 as the UK National Remote Sensing Centre. In that time, NRSC grew, formed international partnerships with space agencies, expanded into aerial data collection, and was eventually privatized and renamed “Infoterra” in the late 1990s.
As a major reseller of remote sensing and aerial photography in the UK and Europe, Infoterra has had to manage ever-larger collections of data for customers. With the rise of the internet, Infoterra has moved more of its data catalogues online, and customers have come to expect direct access to information about data holdings.
In 2001, Infoterra implemented a new aerial imagery catalogue for the UK Ordnance Survey using PostgreSQL, in support of a large data acquisition program. By the end of the contract, the database was managing one million metadata records without any operational issues, and Infoterra was confident enough to move to larger architectures based on PostgreSQL.
As individual data collection programs were run at Infoterra, each tended to have its own data management and order fulfillment process. After the success of the Ordnance Survey project, Infoterra decided to consolidate data management and order fulfillment into a single corporate system, the “GeoStore”.
The GeoStore uses PostgreSQL/PostGIS as the database backend, UMN Mapserver for map rendering duties, and a variety of bespoke applications for data loading, order fulfillment, and access. The data managed in GeoStore now include:
Ross Elliott is a Senior Software Engineer for Infoterra, and has helped design the GeoStore system and its predecessors over the last five years. “To justify any software we use, the main requirement is that it be cost effective”, says Elliott, “and for the most part that means we choose open source.”
“Without PostGIS, we would have to go back to Oracle”, says Elliott, “and this would incur huge costs for us. Most of our database servers have at least two CPUs, if not more, and most are attached to the web in some way. This could easily add another £1,000,000 to our costs in licensing, plus annual maintenance.”
The UK Ordnance Survey database is one of the largest unified spatial data sets in the world, and Infoterra has built special tools for handling the huge volume of features. For data loading, Infoterra developed an application that bulk-loads the Ordnance Survey GML data into PostGIS in just 12 hours, an average load rate of almost 14,000 features per second.
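Infoterra's loader is not public, but the usual way to reach load rates like this is to transform each GML feature into a flat text row and stream the rows through PostgreSQL's `COPY` command rather than issuing individual `INSERT`s. The sketch below, using only the Python standard library, shows the idea on a deliberately simplified GML fragment; the element names, feature id attribute, and SRID are illustrative assumptions, not Ordnance Survey's actual schema.

```python
# Minimal sketch: convert simplified GML features into tab-separated
# (fid, EWKT geometry) rows suitable for "COPY features FROM STDIN",
# which is far faster than row-by-row INSERTs.
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"

# Hypothetical, simplified feature collection for illustration only.
SAMPLE = f"""
<FeatureCollection xmlns:gml="{GML_NS}">
  <member fid="1001">
    <gml:LineString srsName="EPSG:27700">
      <gml:posList>0 0 10 0 10 5</gml:posList>
    </gml:LineString>
  </member>
</FeatureCollection>
"""

def gml_to_copy_rows(xml_text, srid=27700):
    """Yield tab-separated (fid, EWKT) rows for COPY ... FROM STDIN."""
    root = ET.fromstring(xml_text)
    for member in root.findall("member"):
        fid = member.get("fid")
        poslist = member.find(f"{{{GML_NS}}}LineString/{{{GML_NS}}}posList")
        coords = poslist.text.split()
        # Pair up the flat posList into "x y" vertices.
        pts = ", ".join(f"{coords[i]} {coords[i + 1]}"
                        for i in range(0, len(coords), 2))
        yield f"{fid}\tSRID={srid};LINESTRING({pts})"

for row in gml_to_copy_rows(SAMPLE):
    print(row)
```

Each emitted row, e.g. `1001<TAB>SRID=27700;LINESTRING(0 0, 10 0, 10 5)`, can be streamed straight into a `geometry`-typed column with `COPY`, letting PostGIS parse the EWKT server-side in bulk.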
In addition to the GeoStore, Infoterra builds and runs a number of small custom systems for other companies, most of which run on PostGIS. Oracle is used when the customer demands it, but Infoterra builds with PostGIS when given the option. “PostGIS has made our systems possible to design in a way that suits the way we want to work without worrying about license costs,” says Elliott. “It is an easy choice to go with PostGIS.”
Nautilytics is a small data visualization and GIS startup based out of Boston, MA. We use PostGIS and PostgreSQL, among other open-source tools, to build powerful web applications for US government organizations and public- and private-sector companies.
I used PostGIS (and its ecosystem) extensively for my PhD thesis, in several ways. The first is that PostGIS is a good steady horse (elephant?): a database is the perfect place to store a lot of very different information together and relate it. For geospatial data, PostGIS means you always have a way to put data in relation (are they at the same place?).
The Urban Center for Computation and Data (UrbanCCD) is a research initiative of Argonne National Laboratory and the Computation Institute of the University of Chicago. We create computational tools to better understand cities. One of these is Plenario, our hub for open geospatial data. PostGIS makes the spatial operations at the heart of Plenario possible.