
OpenGIS News

Creative Map Solutions brings you the latest news in the GIS and mapping industry.

The Open Geospatial Consortium (OGC) is an international industry consortium of 510 companies, government agencies and universities participating in a consensus process to develop publicly available interface standards. OGC® Standards support interoperable solutions that "geo-enable" the Web, wireless and location-based services and mainstream IT. The standards empower technology developers to make complex spatial information and services accessible and useful with all kinds of applications.

The advent of I3S

The future of mapping is coming — or is it here already? With the advent of Indexed 3D Scene Layers (I3S) as a community standard, everyone stands to benefit.

Discussions around creating data-display standards have taken place for decades, most famously in the VHS vs. Betamax format war. The first attempts at 3D standards came in the late 1980s in computer graphics, and reached mapping about 10 years later. At that time, few datasets were collected in 3D, and 3D was (and is) memory intensive because of the six variables needed to place an object at a point in space at the correct angles. Everyone had a vested interest in standards that reflected the work they were already doing.
That last part hasn’t changed, but approving I3S as a standard means that this form of 3D will be generally exploitable by many more users in the near term. I3S is one workable and robust protocol which has been both fleshed out and vetted by a knowledgeable user community.

A map based solely on looking down cannot show its users the difference between a two-story building and a 100-story building — and that difference is critical. There is much more content to the world than what is represented on 2D maps! It’s an epiphany of sorts, looking up at a building from the ground instead of just down from the sky. Eventually, this will be the norm: the 25-and-under age group has grown up with immersive gaming experiences, so stepping into I3S feels natural, and, in a few years, when that age group is teaching, it will be native.

Meanwhile, who benefits the most from I3S? Certainly, those who need to understand the implications of 3D are at the top of that list. This includes first responders, soldiers, and various program operators, among others. There are members of the American intelligence community who recount their days in the military and cite logistical challenges without viewsheds. But 3D applications are not limited to militaries and emergencies: The telecommunications industry cannot rely on paper maps to determine optimal placement for power lines or antennae.

With standards comes a renaissance! OGC approval of I3S makes 3D more widely available and increasingly used across communities. It is important to agree on a set of high-quality standards that are open to others, instead of spending additional money and manpower on this all-important but seemingly exclusive facet of product development. So where do we go from here? Get data into users’ hands to build communities of both professionals and lay users. The Internet of Things is growing, and disasters strike with perilous force. Hunters can scope out boundaries and the best places for blinds, and major professional golf courses have been mapped down to an eyelash. The potential for aid from 3D imagery is limitless. Let’s embrace the variety of applications, and welcome the input.

Vricon, like others in the community, supports OGC’s efforts to develop 3D standards and bring acceptance and commonality to the field. How can we get this next-level imagery and data to analysts, warfighters, first responders, urban planners, and all who endeavor to make our lives  — and our planet! — better?

Original author: Simon Chester

Our newest OGC staff member: Gobe Hobona

It is an exciting time to be joining OGC as Director of Knowledge Management (DKM). One of the reasons why I am very excited about this is that geospatial interoperability standards are increasingly seen as the key ingredient for allowing much in society to be better understood. Whether it is understanding how shoppers decide where to buy their groceries, where best to deploy resources in response to a hurricane, or even to predict which routes home from work are likely to be the least congested at a particular time, providing and implementing knowledge management strategies that facilitate geospatial standardisation will enable these and other more complex questions in society to be answered with greater confidence.

Knowledge Management is about creating an environment and strategies that enable an organization to identify, create, represent, distribute, and adopt insights and experiences. It is about getting the right knowledge to the right place at the right time. The insights and experience of the OGC membership are typically reflected in the standards and other documents we produce in the consortium. So as the DKM I will be responsible for planning and managing the workflow of candidate standards through their standardisation lifecycle. This means creating a knowledge sharing environment that allows insights and experiences from the OGC membership to feed into each candidate standard. As we roll out the new OGC Knowledge Management strategy, members will see greater use of automation and autonomous analytics to enable the right knowledge to reach the right place at the right time.

I have previously been employed as the Consultancy Team Leader and Head of Applied Research at Envitia Ltd, an OGC member. During my time at Envitia I worked on a number of consultancy and applied research projects for government and commercial customers. Prior to joining Envitia, I worked as a postdoctoral researcher at Newcastle University and the University of Nottingham. During these previous roles, I actively took part in multiple OGC working groups. I have also worked as a consultant to OGC on the GEOSS, INSPIRE, GMES Action in Support (GIGAS) project initiated by the European Commission during my time as a postdoctoral researcher.

I hold a PhD in Geomatics from Newcastle University, achieved with a doctoral thesis on Web-based Discovery and Dissemination of Multidimensional Geographic Information. I also hold a 1st Class Bachelor of Science degree with honours in Geographic Information Science from Newcastle University. After a number of years working in both academic research and the geospatial software industry, I was accepted as a professional member of both the Royal Institution of Chartered Surveyors (RICS) and the Association for Computing Machinery (ACM).

Most of my previous involvement within the OGC has been as a participant of the OGC testbed series. I have had the privilege of working collaboratively with many OGC members involved in the testbeds, trying out new technologies, developing new ones and feeding lessons learnt into the working groups of the consortium. These testbeds provide a valuable knowledge resource for each Standard Working Group (SWG) and Domain Working Group (DWG) within the consortium. So, I am looking forward to working with colleagues and members to enable the knowledge generated from these testbeds and other OGC initiatives to reach the wider geospatial community in the form of consistent and high quality standards.

Original author: Gobe Hobona

OGC GeoPackage: Expanding the Realm of Geospatial Capabilities

This guest post was contributed by: David Wilson, Geospatial Engineer, Strategic Alliance Consulting, Inc.; and Micah Brachman, PhD, Lecturer, Center for Geospatial Information Science, University of Maryland, College Park.

Viewshed using the UK Ordnance Survey Elevation GeoPackage. Image from: OGC 16-094r3 GeoPackage Elevation Extension Interoperability Experiment Engineering Report.

An OGC GeoPackage is a portable database that may contain raster maps and imagery, vector features, and elevation data. GeoPackages are optimized for sharing and displaying these types of geospatial data on mobile mapping systems, and GeoPackage extensions may be developed to support additional types of geospatial data such as routing networks. This blog post will discuss what GeoPackage extensions are, how they are developed, and how they can provide new geospatial capabilities to meet the requirements of a diverse user base.
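Because a GeoPackage is a single SQLite file, its contents can be inspected with nothing more than a SQLite client. The sketch below uses Python's stdlib `sqlite3` and builds a minimal in-memory stand-in for the required `gpkg_contents` table; the table and column names follow the GeoPackage Encoding Standard, while the layer names and rows are invented for illustration (in practice you would connect to a real `.gpkg` file):

```python
import sqlite3

# A GeoPackage is one SQLite file; its mandatory gpkg_contents table lists
# every data layer the file holds. We fake a tiny one in memory here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gpkg_contents (table_name TEXT, data_type TEXT)")
conn.executemany(
    "INSERT INTO gpkg_contents VALUES (?, ?)",
    [("roads", "features"),            # vector features
     ("basemap", "tiles"),             # raster tile pyramid
     ("dem", "2d-gridded-coverage")],  # elevation data via extension
)

# Enumerate the layers exactly as a GeoPackage client would.
rows = list(conn.execute("SELECT table_name, data_type FROM gpkg_contents"))
for table_name, data_type in rows:
    print(f"{table_name}: {data_type}")
conn.close()
```

The same query against a real GeoPackage shows at a glance whether a file carries vector, tile, or coverage layers, which is what makes the format convenient to hand to mobile mapping systems.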

A GeoPackage extension is a method by which new requirements are added to the existing set of requirements in the OGC GeoPackage Encoding Standard. These new requirements expand upon the existing capabilities of GeoPackage by enabling the use of additional data types, styling, and other geospatial functions through the addition of tables, rows, and columns to the existing standard. Extensions can enhance an existing capability of GeoPackage (e.g. incorporating non-linear geometry types) or add an entirely new data type (elevation, other media, etc.).

Anyone can extend GeoPackage to fit their needs, but custom extensions can come with their own set of interoperability risks. In OGC, extensions must be approved as “Registered Extensions” under the GeoPackage Encoding Standard. A “Registered Extension” is one that has been vetted through the GeoPackage Standard Working Group (SWG) and has undergone community exchanges and interoperability experiments. This ensures that the broadest scope of industry, academia, and government consumers of GeoPackage participate in and influence the outcome of the extension.
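A GeoPackage declares which extensions it uses in its `gpkg_extensions` table, so a client can check up front whether it can fully read the file. A minimal sketch, again with an in-memory stand-in: the column layout follows the GeoPackage Encoding Standard, and the two example rows use real registered extension names (`gpkg_rtree_index`, `gpkg_2d_gridded_coverage`), but the table names and definition URLs here are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# gpkg_extensions schema as defined in the GeoPackage Encoding Standard.
conn.execute(
    """CREATE TABLE gpkg_extensions (
           table_name TEXT, column_name TEXT,
           extension_name TEXT, definition TEXT, scope TEXT)"""
)
# Two extensions a file might declare (rows are illustrative).
conn.executemany(
    "INSERT INTO gpkg_extensions VALUES (?, ?, ?, ?, ?)",
    [("roads", "geom", "gpkg_rtree_index",
      "http://www.geopackage.org/spec/#extension_rtree", "write-only"),
     ("dem", None, "gpkg_2d_gridded_coverage",
      "http://docs.opengeospatial.org/is/17-066r1/17-066r1.html", "read-write")],
)

# A client would scan this set and refuse (or degrade gracefully) on
# any extension it does not recognize.
declared = {name for (name,) in
            conn.execute("SELECT DISTINCT extension_name FROM gpkg_extensions")}
print(declared)
conn.close()
```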

GeoPackage extensions are usually developed to fulfill a specific need. The Extension for Tiled Gridded Coverage Data was developed specifically to support terrain visualization and analytics such as line-of-sight on cell phones and other lightweight, low-powered computing devices. The first step in developing this extension was a whitepaper entitled “Envisioning a Tiled Elevation Extension for the OGC GeoPackage Encoding Standard,” which identified use cases, defined terms, and proposed a technical approach for adding tiled gridded elevation data to a GeoPackage. This whitepaper was discussed and approved within the GeoPackage SWG, and was then socialized with the broader OGC community to build consensus on the technical approach. An OGC Interoperability Experiment (IE), known as the GeoPackage Elevation Extension Interoperability Experiment, was then conducted, which included participants from industry, government, and academia to build and test prototype GeoPackages containing tiled gridded elevation data. The technical approach continued to be refined so as not to limit the scope of the extension. A report of the IE is available here.

There are many other GeoPackage extensions that have been developed in addition to the Tiled Gridded Elevation Data extension. The latest version of the GeoPackage Encoding Standard has eleven Registered Extensions, including an RTree Spatial Indexes extension to improve the rendering performance of large vector feature GeoPackages and a Metadata extension to allow additional information about vector or raster data to be stored within a GeoPackage. While most of these registered extensions provide a method for improving upon the existing capabilities of OGC GeoPackages, there are also several custom extensions that truly expand the geospatial capabilities of GeoPackages.

All in all, GeoPackage is a dynamic product format that will continue to be extended and improved. However, with so many great concepts for possible extensions, OGC’s top priority will always be to ensure that extensions do not break interoperability and remain open for everyone to implement.

David Wilson Bio:

David Wilson is a Geospatial Engineer for Strategic Alliance Consulting, Inc., which specializes in geospatial interoperability with a focus on GeoPackage standards compliance, testing, and use. David has over 10 years’ experience working in the Army and at the National Geospatial-Intelligence Agency (NGA).

Micah Brachman Bio:

Micah Brachman is a Lecturer in the Center for Geospatial Information Science at the University of Maryland, College Park. He holds a PhD (2012) and MA (2009) in Geography from the University of California, Santa Barbara and a BS (2000) in Geography from the University of Minnesota. Micah has extensive professional experience in GIS and Remote Sensing in the commercial, government, and non-profit sectors, and recently transitioned from a Geospatial Scientist position supporting the Army Geospatial Center to teach in the new Geospatial Intelligence (GEOINT) program at UMD. In addition to GEOINT, Micah is also actively engaged in teaching and scholarship in Hazards and Emergency Management, Network Science, and Active Transportation.

Original author: Simon Chester

Geomatics Web Services Used in Recent Quebec Floods

Editor’s note: This article has been jointly written by Nicolas Gignac and Serge Legaré from the Ministry of Public Safety in Quebec. It was originally published on GoGeomatics.ca.

During spring 2017, a major flood occurred in Eastern Canada, centered on southwestern Quebec. The flood was caused by abundant rainfall lasting two months during the melting season, and was among the most severe floods since 1974. It impacted heavily populated areas of southern Quebec near Montreal and Gatineau. Hundreds of square kilometers of flooded areas exceeded the 100-year record.

Pic 1. 2017-04-01 Abundant rainfall warnings and precipitation in real time (GeoMet Web Service Weather Radar) issued by Environment Canada.

Rainfall tends to last for weeks in Eastern Canada, as snow melts in the northern areas as early as the beginning of April. The Public Safety Quebec Emergency Operational Center (PSQ-EOC) monitors the situation 24/7, coordinating assistance with its partners in all regions of Quebec. As flows started rising at the end of April, the Quebec government decided to move the PSQ-EOC from Quebec City to Montreal to be closer to the affected area. The flow peaked on May 8. The situation required good coordination and planning, as experts, emergency teams, and the army were mobilized on the ground to assist citizens and municipal managers.

Pic 2. © [2017] Airbus Defense and Space, License by Planet Labs Geomatics Corp., www.blackbridge.com/geomatics

Here is a summary of statistics for this unusual flood in southern Quebec:

- Active operations of PSQ-EOC lasted from 5 April to 5 June
- 23 conference calls were held between May 5 and 19
- 78 reports were published by PSQ-EOC to follow up and inform partners during operations
- 1,500+ employees were mobilized in various Government of Quebec departments and agencies
- 261 municipalities were affected; a State of Emergency was declared in 22; 170+ used the provincial financial assistance program
- 5,371 residences were flooded
- 4,066 people were evacuated
- ~550 roads were damaged
- 3 deaths took place
- 2,600 Canadian military personnel were engaged in operations
- As at June 2, $13,581,663 was paid through Quebec's Specific Financial Assistance Program for Floods

It was important to closely monitor the situation, especially: meteorological events; hydrological impacts; past, present and future flood areas; affected municipalities; status of the road network; and the level of flows. Related information was requested to assist emergency managers working in this wide and densely populated region. Consequently, PSQ-EOC and its partners had to request, access, purchase, integrate, and publish various geomatics products to support decision making. During this event, one of the mandates of PSQ-EOC operations was to coordinate all those geomatics operations. In such a major disaster, organisations, journalists, and citizens typically look for near real-time information. They usually turn to web maps for a quick overview, since 'a picture is worth a thousand words'!

To reuse web mapping products and minimize administrative barriers in this time-critical situation, the decision was made to openly offer all products through public URLs as standard Web OGC services. They included:

- Web Map Service (WMS): for overlays with a clear symbology
- Web Map Service Time (WMS-Time): for historical analysis of near real-time data
- Web Feature Service (WFS): for raw open data processing and mapping
- Web Map Tile Service (WMTS): for base maps
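Each of these services is reached through plain HTTP requests with standardised key-value parameters, which is what lets any browser or mobile app consume them. As a sketch, a WMS 1.3.0 GetMap request can be assembled as below; the endpoint URL and layer name are placeholders, not the actual Quebec service, and the TIME parameter shows how WMS-Time narrows a request to one day:

```python
from urllib.parse import urlencode

# Hypothetical endpoint; real services publish their own base URLs.
base_url = "https://example.gouv.qc.ca/ows"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "flood_extent",        # placeholder layer name
    "CRS": "EPSG:4326",
    # WMS 1.3.0 uses latitude-first axis order for EPSG:4326:
    # minLat,minLon,maxLat,maxLon (roughly the Montreal region here).
    "BBOX": "45.0,-74.5,46.5,-73.0",
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TIME": "2017-05-08",            # WMS-Time: request the flood peak day
}

getmap_url = f"{base_url}?{urlencode(params)}"
print(getmap_url)
```

Fetching `getmap_url` over HTTP would return a rendered PNG overlay ready to drape on a base map.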

All these services were hosted on a solid open GIS infrastructure called IGO (Open Geomatics Infrastructure: www.igouverte.org/english/). IGO was developed by the Quebec government and hosted by Public Safety Quebec. In addition, an easy-to-use interactive web mapping application, adapted to mobile devices, depicted the evolution of the event for everyone using these web services.

Here are the main facts related to the high demand for these geomatics services during this period:

- 45 million web requests occurred just in May on geomatics Public Safety Quebec Web GIS servers
- A maximum of 100 queries per second and 6-7 million queries per day occurred on May 7-8, 2017 on these same servers
- 30+ satellite image products were shared and published (Radarsat-2, SPOT, Pleiades)
- 50+ map layers were served as OGC Web services (WMS, WFS) and in a web map application through regular browsers and mobile devices
- The International Charter "Space and Major Disasters" was activated in Eastern Canada on May 6th, 2017
- Several requests were made for thematic mapping products and spatial analysis of flooded areas for different operational and administrative needs

Usually, Public Safety Quebec requests radar imagery to monitor river ice conditions and to map ice jams in known high-risk flood areas during the spring season, under its Radarsat-2 agreement with Public Safety Canada. In April and May of this year, however, the emergency was an open-water flood, not the usual ice-jam situation. Public Safety Quebec therefore had to adapt its processes and ask Natural Resources Canada's (NRCan) Emergency Geomatics Services (EGS) group to test a new NRCan image processing algorithm for the first time. This emergency service maps the extent of floods in open-water situations in near real time. The new algorithm was recently developed using high-resolution radar satellite images (e.g. 9 m resolution), similar to those already ordered by Public Safety Quebec.

Pic 3. Map of polygon extent

Maximum extent polygons were then generated by a system operated by NRCan's Strategic Policy and Results Sector (SPSR) using radar satellite imagery products (e.g. Radarsat-2, TerraSAR-X, Sentinel) from PSQ-EOC. These mapping products were processed by SPSR and then shared with partners for quick validation by PSQ-EOC. During the event, these products were distributed in near real time, depending on imaging availability (approximately every three days for Radarsat-2). The polygon products represented the extent of water in urban and vegetated environments. It should be noted that the product was a near real-time interpretation of satellite data and was not fully validated during the emergency phase.

Pic 4. Second map of polygon extent

In order to monitor the evolution of the water level around infrastructure more effectively, and to map the historical maximum of the flood, other satellite imagery in the visible and infrared spectrum (SPOT at 1.5 m resolution and Pleiades at 50 cm resolution) was also purchased and processed. These images were acquired through PSQ-EOC with the Ministry of Energy and Resources of Quebec in order to ensure coherence between public safety needs and the various sensors available on the market. To facilitate the exchange of information between government departments and agencies, discussions were held in real time through an online collaborative platform. The acquired images were made available by the Government of Quebec to NRCan for validation a few hours after data capture. These visible and infrared images were made available to the general public a few weeks later, via Web service and on the open data portal Données Québec (www.donneesquebec.ca), under an agreement with the private operators of the satellites (Distribution Astrium Services, Airbus Defense and Space).

Pic 5. Includes equipment © CNES (2017), Distribution Astrium Services / Spot Image Corporation, USA, all rights reserved

PSQ-EOC also received support from Transport Canada through its National Aerial Surveillance Program (NASP) to acquire oblique images of the impacted areas. Two aircraft from Transport Canada, the Moncton-based Dash-8 and the Ottawa-based Dash-7, flew over flooded areas from May 7th to 16th and acquired more than 14,000 oblique images. NRCan developed a standardised web-based service (WMS), shared with Public Safety Quebec, to provide an overview of these 14,000 oblique images, integrated into their web mapping applications. The oblique images were also used by NRCan to validate their open-water extent polygons.

Pic 6. PNSA 2017-05-15 Lake Saint-Pierre at Highway 40

Here is a summary of the geomatics data used and made available to the general public:

Geomatics work will continue during the current recovery phase to more accurately determine the historical maximum flood by reusing NRCan's maximum extent polygons and validating them with field operations in the upcoming months. Discussions with the OpenStreetMap community of mappers about their involvement will also take place.

Recently, Public Safety Quebec was also able to test a new web mapping application adapted for mobile devices and developed by the government (https://geoegl.msp.gouv.qc.ca/igo2/apercu-qc/?context=inondation). This solution, called Open Geomatics Infrastructure version 2.0 (IGO2: https://github.com/infra-geo-ouverte/igo2), could be used from a mobile device to easily track the floods. IGO 2.0 is an open source GIS development project carried out in partnership with many organisations in Quebec. The solution follows the latest trends in web development (mobility, API search tools, simplicity, offline tools) and geomatics (OGC web services, time analysis), and will be enhanced in the coming months with new features. It is based on other open source projects such as OpenLayers 4, Google Angular 4, and Google Material. For now, it has been made available to the public in the application gallery of Quebec's Open Data Portal: Données Québec (www.donneesquebec.ca/fr/applications/). At the same time, another internal, secure version of IGO2 for public safety organisations was successfully tested on mobile devices at the PSQ-EOC.

Pic 7. Web mapping application – IGO2.

An indication that these Web GIS services and products were needed by citizens and the media is how intensely the data and web GIS applications were used to inform the public. The geospatial data and maps available on the Web were widely publicized by various media during this major event. These include (some only in French):

Web GIS services have proven very useful to the media, and popular for following other disaster events this year, such as the forest fires in British Columbia and Hurricane Harvey in Texas.

Original author: Anonymous

Advancing standards for marine data

Bathymetric Map courtesy of NRCan http://www.bedfordbasin.ca/halifaxharbour/DVD/fig7-eng.php

 

The OGC has long maintained standards that are used in the marine domain. These standards have been applied to studies of bathymetry, ocean science, navigation, logistics, and more. Long-standing Domain Working Groups (DWGs) in the OGC provide expertise in ocean science (Meteorology and Oceanography DWG) and safety of navigation (Defense and Intelligence DWG). However, only a small subset of the diverse disciplines that work in the marine environment have representatives in OGC’s membership. Thus, in 2016 the Marine DWG was established to provide a forum for discussion across marine topics and to link the OGC with other organizations in the field.

 

As the Marine DWG was being established, the OGC also formed an alliance with the International Hydrographic Organization (IHO - www.iho.int), a body dedicated to ensuring that the world’s navigable waterways are surveyed and charted, and that the data are accessible to all stakeholders through the use of international standards. The OGC and IHO now regularly support each other through attendance at each other’s events, and they jointly chair the OGC Marine DWG. The Marine DWG is already promoting a future Concept Development Study for a Marine Spatial Data Infrastructure (MSDI) and is considering various whitepaper topics to illustrate the advantages of standards and data accessibility to the marine community.

 

The OGC and the IHO are also key participants in the United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM - ggim.un.org). The UN-GGIM has a broad mission to develop global geospatial data and associated infrastructure, primarily in support of the UN Sustainable Development goals. Now, with the support of IHO and OGC, the UN-GGIM has established a Working Group on Marine Geospatial Information (MGIWG) in recognition that marine data are a critical part of a global geospatial data infrastructure. For an excellent summary of the UN-GGIM work in this area, please see IHO Circular Letter 47/2017.

 

So, how can you get involved? The Marine DWG is open to all. Subscribe to the mailing list and visit the public wiki. Consider joining the OGC or working with your national representatives to the IHO to have your opinion heard. And finally, take advantage of the output of these efforts: discoverable and interoperable data are of benefit to all of us.

 


Original author: Scott Simmons

OGC, INSPIRE, and Metadata

Metadata is critical for geospatial resource discovery, determination of fitness for purpose, and more. The value of metadata lies in the structure and meaning that it provides. Metadata serves asset discovery by identifying assets and allowing them to be found by relevant criteria. Metadata also brings similar assets together and distinguishes dissimilar assets, adding value to data and service holdings. Standards for the creation and use of metadata have a long history in the OGC, as well as in many ongoing SDI activities such as INSPIRE and the US Geoplatform Activity.

This is why a key OGC CDB Standards Working Group (SWG) activity is enhancing the OGC CDB standard to define requirements and practices for metadata. CDB version 1.0 does not provide any guidance on the use of metadata as used in the geospatial community. These metadata enhancements are planned for CDB version 1.1. As part of this activity, the SWG investigated a number of metadata standards used in the geospatial community. Major metadata activities such as those in INSPIRE were also evaluated. The review of the INSPIRE Metadata Technical Guidance and related documents led me to write this blog.

The investigation of INSPIRE metadata requirements and related technical guidance was very educational but not easy. A fairly large number of documents need to be read and understood in order to implement metadata that’s compliant with the INSPIRE technical guidance.

Article 5 sub-clause 1 of the INSPIRE Directive states, “Member States shall ensure that metadata are created for the spatial data sets and services corresponding to the themes listed in Annexes I, II and III, and that those metadata are kept up to date.” The directive does not define what metadata standard or related technologies to use. The Directive states basic requirements. The implementing guidance is provided in a separate, very detailed document INSPIRE Metadata Implementing Rules: Technical Guidelines based on EN ISO 19115 and EN ISO 19119. This document specifies the requirements for implementing INSPIRE compliant metadata based on various ISO standards, specifically 19115:2005 and 19119:2005. This guidance defines which elements are mandatory and which are conditional or optional. Other documents provide additional technical guidance. There are also metadata validation and compliance policies and related tools.

In addition to the INSPIRE metadata mandate and guidance, governments in the EU also need to comply with the requirement to achieve cross portal metadata interoperability. This requirement states that metadata provided and/or used in an INSPIRE compliant geoportal must interoperate with metadata from other eGovernment portals. The eGovernment portals use a European Union de-facto standard called DCAT-AP. DCAT-AP was developed as part of the EU Programme Interoperability solutions for public administrations, businesses and citizens (ISA²) “with the purpose of defining a cross-domain metadata interchange format that can be used to share dataset metadata across data catalogues operated across the EU.”

Within this context, under the EU ISA Program, and taking into account the INSPIRE metadata profile and relevant ISO standards on which it is based, the community developed a geospatial profile of DCAT-AP: GeoDCAT-AP. The work was coordinated by a Working Group with representatives from the EU Member States, led by the JRC. The profile has been endorsed by the Member States through the ISA SIS Group / Coordination Group / TIE Cluster, where Member States ‘adopt’ e-Government solutions. Unlike INSPIRE, however, there is no legislation behind it.

GeoDCAT-AP is specifically designed to enable the sharing of geospatial metadata, in particular those available via the INSPIRE infrastructure. GeoDCAT-AP defines mappings from ISO 19115 (the ISO standard for geospatial metadata) to DCAT-AP and other general-purpose RDF vocabularies. The GeoDCAT-AP work included the development of an API that can transform ISO 19139 records into DCAT-AP or GeoDCAT-AP on the fly. ISO 19139 defines an XML-based implementation of ISO 19115. Please note that the GeoDCAT-AP specification replaces neither the INSPIRE Metadata Regulation nor the INSPIRE Metadata technical guidelines based on ISO 19115 and ISO 19119. The purpose of GeoDCAT-AP is to give owners of geospatial metadata the possibility to achieve more visibility by providing an additional RDF syntax binding. Of course, at least to me, this raises the question of whether the current INSPIRE Metadata guidance and related policy directives need to evolve to meet market and user needs.
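The flavour of such a mapping can be sketched in a few lines: take the core discovery fields of an ISO-19115-style record and emit them as a `dcat:Dataset` in Turtle. This is a toy illustration of the idea only; GeoDCAT-AP defines the full, normative mappings, and the record values and dataset URI below are invented:

```python
# Minimal, illustrative ISO-19115-style record (invented values).
iso_record = {
    "title": "Flood extent polygons, May 2017",
    "abstract": "Maximum flood extent derived from radar imagery.",
    "keywords": ["flood", "hydrography"],
}

# Emit the record as a dcat:Dataset in Turtle, using the standard DCAT
# and Dublin Core namespaces. Only three fields are mapped here; the
# real profile covers many more (contact points, licences, services...).
keywords = ", ".join(f'"{k}"' for k in iso_record["keywords"])
turtle = (
    "@prefix dcat: <http://www.w3.org/ns/dcat#> .\n"
    "@prefix dct: <http://purl.org/dc/terms/> .\n\n"
    "<urn:example:dataset:1> a dcat:Dataset ;\n"
    f'    dct:title "{iso_record["title"]}" ;\n'
    f'    dct:description "{iso_record["abstract"]}" ;\n'
    f"    dcat:keyword {keywords} .\n"
)
print(turtle)
```

The point of the RDF binding is exactly this: once a record is expressed as a `dcat:Dataset`, any general-purpose data portal or search engine that understands DCAT can index it, without knowing anything about ISO 19115.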

What does all this mean?

The INSPIRE metadata requirements were originally specified in the mid-2000s, with the technical guidance first published in 2007 and the implementing rules in 2008. Ten years is an eternity in IT. Since then, there has been a tremendous move to using web technologies for discovery, linking, and access to information resources. Issues, such as the metadata technical guidance being XML based, need to be addressed. The Web world is moving away from XML to other technologies such as JSON and RDF. Today’s users want to use search engines to express queries and discover ‘fit for purpose’ spatial data. More and more users want applications that easily link data resources and enable rapid discovery traversal of these resources. At the end of the day, users are recognizing the need for a web of spatial resources. However, as stated in the W3C Data on the Web Best Practices Recommendation: “The openness and flexibility of the Web creates new challenges for data publishers and data consumers, such as how to represent, describe and make data available in a way that it will be easy to find and to understand.”

There is a tension within the INSPIRE community. Much of the tension is a response to market and user forcing functions. Another factor is a desire for simpler approaches that reduce the complexity and costs of the currently in-force technical guidance and related compliance requirements. In some ways, INSPIRE is facing the same technology life-cycle and evolution challenges that any software company has to deal with: how does one move forward to capitalize on better technologies and user experience without destroying investments in the existing infrastructure and knowledge base?

This is not just an INSPIRE issue; it is a geospatial industry-wide issue (although INSPIRE has the additional challenge of EU legislation and compliance requirements). As an industry-wide issue, we need to work collaboratively to make sure that current and future requirements for metadata and spatial resource discovery are addressed. In recognition of this need, over the last few years the synergies between the INSPIRE community, the OGC, and the W3C have increased. Evidence of this collaboration includes the joint W3C/OGC Spatial Data on the Web Working Group, increasing DCAT and DCAT-AP discussions in the OGC, and the INSPIRE ‘What If’ sessions at the 2017 Delft OGC meeting and this September’s INSPIRE Conference. These collaborative efforts need to not just continue but expand. The long-term success of INSPIRE and other SDI activities rests on our ability to identify and document standards and best practices that enable the agility and flexibility to meet the ever-changing landscape of policy, user needs, and technology.

Original author: Carl Reed
Continue reading
132 Hits
0 Comments

A new manager for the OGC validation tools: lat/lon

The OGC Validation Tools, which power the free OGC web testing facility, have been used thousands of times by developers around the world to improve their software and confirm compliance with OGC standards. The reports from the web validator are used to ‘get organizations certified’ and serve as proof that the software will work seamlessly with external data sources served using OGC standards.

The tools are composed of a test engine (known as TEAM Engine) and test scripts developed for each standard, with about 50 repositories existing on GitHub right now (see: test suites and teamengine versions). OGC, coordinating with various groups around the world, provides monthly beta releases with a production release every 6 months.

I recently took on the role of leading the Innovation Program, which provides a collaborative, agile process for solving geospatial problems and advancing new technologies (more about that in this post). To fill part of my previous responsibilities as Compliance Executive Director, we decided to advertise the position of Product Manager of the OGC Validation Tools. Although we received good responses, none met all three criteria: 1) expertise in OGC standards, 2) expertise in building software, and 3) expertise in working in the ‘open’ world.

In a conversation with Dirk Stenger from lat/lon, an OGC member in Germany, the topic came up and we decided to move forward on the idea of lat/lon taking on the role of managing the OGC validation tools. lat/lon has been working with OGC for more than 10 years, leading the development and maintenance of tests like WMTS 1.0, WMS 1.3, and WFS 1.1. They are also main contributors to deegree, open-source spatial software that is a reference implementation for most OGC standards, including WFS 2.0.

Due to lat/lon’s longtime involvement in collaborative work in the ‘open’ world, lat/lon has gained extensive knowledge of the procedures, methods, and tools used by the OGC. This depth of organizational and technical experience will enable lat/lon to manage the OGC validation tools with minimal impact on the developers that rely on OGC’s testing tools. Further, it is expected that the tools will be improved and advanced in terms of both their technical and organizational aspects.

What’s next?

If you have any ideas that you think could help you and others implement OGC standards, let OGC know via our community forum or by submitting an issue in the project trackers.

Happy Testing!

-Luis Bermudez

Original author: Luis Bermudez

GeoPackage guidance

info [at] opengeospatial.org

Original author: Anonymous

Leveraging the OGC Innovation Program to Advance Big Data Spokes

Big Data aids in Health

The National Science Foundation (NSF) currently has an open program solicitation that seeks to establish more ‘Big Data Spokes’ to advance Big Data applications. Like the BD Hubs, the BD Spokes will provide a regional coordinating role, but they will focus on narrower topic areas, such as applications concerning the acquisition and use of health data, or data science in agriculture, among others. In addition to its topic area, each Spoke will be driven by three themes: 1) advance solutions towards a grand challenge, 2) automate the Big Data lifecycle, and 3) improve and incentivise access to critical data.

Using the Open Geospatial Consortium’s (OGC) Innovation Process could help Big Data Spokes advance a solution to better integrate and run analytics on data sets using technologies that are not only freely available and ‘open’, but that are also maintained by an established Standards Development Organization (SDO). OGC also has various domain working groups currently advancing solutions that would complement the work done in Big Data Hubs.

The OGC is an international voluntary SDO that provides a broad interface with over 500 industry, government, academic, and research organizations engaged in advancing standards to improve geospatial interoperability. OGC’s standards are implemented in hundreds of products to improve the discovery, sharing, access, fusion, and application of location-based information. In addition to its proven consensus process for advancing open standards, OGC - via its Innovation Program - provides a venue in which to prototype in an agile, collaborative environment, and has run more than 90 initiatives in the last 17 years.

OGC’s Innovation Program Initiatives have helped advance technology solutions that deal with important challenges, such as those rising from continued population increase. Most recently, OGC’s Future City Pilot Initiative created technologies that aid in the provision of adult health services using multi-source data analytics (you can learn more in this 5 minute video on OGC’s Future City Pilot).

An OGC initiative could help prototype and design a solution for Big Data Spokes, based on open standards, that could be further implemented in a Data Hub. An OGC initiative has five phases:

Phase 1 - Concept Development: OGC gathers requirements and proposes an initial systems architecture.
Phase 2 - Call for Participation (CFP): OGC publicly invites industry and non-industry organizations worldwide to participate in the Initiative to develop the components of the architecture.
Phase 3 - Team Formation and Kick-off: The OGC evaluation team selects participants. Selected participants meet face to face at the initiative kick-off meeting to coordinate on the development, testing, and demonstration process.
Phase 4 - Execution: Participants engage virtually through frequent teleconferences, net meetings, and email exchanges to discuss progress and to identify and resolve issues.
Phase 5 - Reporting, Demonstration and Outreach: Technology demonstrations occur at the end of the Initiative to showcase the major accomplishments. Engineering reports and other artifacts are written that identify and summarize the resulting technologies.

The completion of such an initiative would result in a proven solution that can be implemented in a Big Data Hub to help automate Big Data lifecycles, and support, for example, Smart Cities or Health related challenges.

If you want to learn more about how to partner with OGC for the NSF Big Data Spokes, or other solicitations, please contact Luis Bermudez, Executive Director of the OGC Innovation Program (lbermudez at opengeospatial.org).

Original author: Luis Bermudez

Our newest OGC staff member: Marie-Françoise Voidrot

Marie-Françoise Voidrot

Today I am honored to join the staff of OGC Europe as Europe Director of the Innovation Program, and to contribute more intensively to the development of OGC activities.

Prior to joining OGCE staff I worked with Meteo-France, the French national weather service. While working there, I was a project manager of weather information systems for meteorological forecasters, with major customers like the CNES, the French Armed Forces, Air France, etc., and, more recently, for mass market consumption via the Internet and mobile apps.

As an OGC member, I contributed to the definition of the MetOcean Domain Working Group, which I have co-chaired with Chris Little since 2009. Together, we have helped define common terms of reference for a relationship with the World Meteorological Organization that supports both hydrological and meteorological standards development.

I have been involved in the organisation of several annual workshops within the Met community to gather the issues identified by developers while providing Met Ocean data via OGC standards to several spatial data infrastructures, including INSPIRE and SESAR. Met Ocean data is complex, inherently spatial, temporal, and constantly changing. It is big, heterogeneous, and multi-dimensional, including multiple time attributes. Another source of complexity is the very demanding level of service, as these data are used for critical safety purposes and are essential for major business activities.

The MetOcean DWG provides an open forum to work on meteorological data interoperability and a route to publication through OGC's standards ladder (Discussion Paper -> Best Practice -> Standard -> [where appropriate] ISO status), as well as a route for submission to the WMO CBS for adoption. Since 2009, the DWG has produced several Best Practice documents (available on the MetOcean DWG wiki) and multiple presentations to further knowledge and understanding of the complexity of these environmental data.

As further background, I have a Master’s Degree in Computer Sciences from Ecole Centrale Paris, and a Master’s Degree in meteorology from the Ecole Nationale de la Meteorologie (French National School of Meteorology). I am trilingual (French, Spanish, and English), and am located in Toulouse, France.

As a new OGCE staff member, my first focus will be on the NextGEOSS project, which fits perfectly with my experience. NextGEOSS aims to develop GEOSS into a next-generation data hub and increase the use of Earth Observation data to better support decision making.

If you would like to get in contact with Marie-Françoise to offer congratulations or discuss the MetOcean DWG, NextGEOSS, or other OGC activities, she can be reached at mvoidrot [at] opengeospatial.org, or on Twitter @twitt_mfv.

Original author: Marie-Francoise Voidrot

Innovation Principles

The OGC Innovation Program provides a collaborative agile process for advancing new technologies. Since 1999, 95 initiatives have taken place, from multi-million dollar testbeds (such as Testbed 12) to in-kind interoperability experiments. During these initiatives, sponsors and technology implementers come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards.

The first Innovation Program initiative was in 1999, when the Web Mapping Testbed took place and helped to develop the most popular OGC standard: the Web Map Service (WMS). Today, hundreds of thousands of data layers are available via WMS, and more than ten thousand articles are available related to this subject.
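To make the WMS idea concrete, here is a sketch of how a standard GetMap request is composed. The endpoint URL and layer name are hypothetical; the query parameters (SERVICE, VERSION, REQUEST, LAYERS, CRS, BBOX, etc.) are the ones the WMS 1.3.0 standard defines.

```python
# Sketch: composing a WMS 1.3.0 GetMap request. The endpoint and layer are
# hypothetical; the query parameters follow the WMS 1.3.0 standard.
from urllib.parse import urlencode

def getmap_url(base: str, layer: str, bbox: tuple, width: int, height: int) -> str:
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",        # empty string requests the server's default style
        "CRS": "EPSG:4326",  # WMS 1.3.0 uses CRS (1.1.1 used SRS)
        # For EPSG:4326 in WMS 1.3.0, axis order is lat before lon:
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

url = getmap_url("https://example.org/wms", "topography",
                 (40.0, -75.0, 41.0, -74.0), 800, 600)
print(url)
```

Any WMS-speaking client or server agrees on this parameter vocabulary, which is why "hundreds of thousands of data layers" can interoperate through one interface.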

OGC Testbed 12 Video

Testbed 12 (2016) was a US$3.6 million initiative that brought together 30 organizations and 210 individuals from around the world. 82 components (e.g. servers, clients) were developed, and 51 documents were produced.

Recent OGC initiatives helped advance the GeoPackage encoding format, which allows users to store terabytes of data, including features and tiles, and synchronize that data with mobile devices for use in offline environments. This new OGC standard is built on SQLite, a modern embedded database, and is currently supported by more than 20 tools:

GDAL, SpatiaLite, OpenJUMP PLUS, QGIS, GeoServer, TerraExplorer for Mobile, Luciad, Safe Software FME, Compusult Go Mobile, Esri, GeoTools, INTERLIS ili2gpkg, NGA GeoPackage Mobile, TerraGo, Carmenta Engine, Envitia, ERDC RGI Library, and PB MapInfo and MapXtreme.
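Because a GeoPackage is, at its core, an SQLite database with well-known metadata tables, ordinary SQLite tooling can open and inspect one. The sketch below creates a simplified subset of the spec's gpkg_contents table in an in-memory database; the column set is abridged from the standard for illustration, and the data is invented.

```python
# Sketch: a GeoPackage is an SQLite database with well-known metadata tables.
# Below is an abridged version of the spec's gpkg_contents table, created in
# an in-memory database to show why plain SQLite tooling can read one.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE gpkg_contents (
        table_name  TEXT NOT NULL PRIMARY KEY,
        data_type   TEXT NOT NULL,   -- e.g. 'features' or 'tiles'
        identifier  TEXT UNIQUE,
        min_x DOUBLE, min_y DOUBLE, max_x DOUBLE, max_y DOUBLE,
        srs_id INTEGER               -- spatial reference system id
    )
""")
conn.execute(
    "INSERT INTO gpkg_contents VALUES "
    "('roads', 'features', 'Road network', -180.0, -90.0, 180.0, 90.0, 4326)"
)

# Any SQLite client can now discover what the package contains.
for name, dtype in conn.execute("SELECT table_name, data_type FROM gpkg_contents"):
    print(f"{name}: {dtype}")
```

This is the design choice behind the long tool list above: by reusing SQLite rather than inventing a new container, GeoPackage inherits mature, cross-platform database support for free.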

Driving Innovation at OGC

After I became the Executive Director of OGC’s recently renamed ‘Innovation Program’ in March 2017, I delved into its history and processes to better understand the program's success: how do Innovation Program initiatives help advance innovation, and what makes OGC succeed in developing, say, a standard to share maps over the Web, or an encoding that can be used in any mobile application?

Peter Diamandis, one of the biggest innovators of our time, founder of the XPRIZE Foundation, and best-selling author, summarized 8 innovation principles inspired by Susan Wojcicki, CEO of YouTube at Google. These are a practical set of principles that, I think, apply to OGC and can help answer the question of why the Innovation Program works so well:

Focus on the user: The users for the innovation program are the sponsors who ultimately care for their constituents and the customers using their data. NGA, for example, cares about providing the best geospatial intelligence information. NASA cares about the use of Earth Observation data. FAA and Eurocontrol want to improve the interoperability of data used in air transportation. OGC brings sponsors’ requirements and distills them into open architectures and open standards. We make sure we develop solutions where the user is getting data in the proper way in the proper format.

Open will win: OGC has always taken an ‘open’ approach to everything it does: all of its Standards are open and free to use; its initiatives are open; the software used by the validation tools is open source. The calls for participation of sponsors or funded participants are advertised to the world. The results of OGC initiatives (e.g. videos and reports) are open: anybody can use this material without paying any cost or being concerned about intellectual property rights. OGC is using GitHub for writing standards and reports, as well as making available the tests and validation tools.

Think big, but start small: We believe in a world in which everyone benefits from the use of geospatial information and supporting technologies across different domains. We break initiatives down into concrete themes that represent their applications and/or domains. For example, the Future City Pilot used 3D open standards to demonstrate how they can aid in supporting responses to urban flooding, and in land development planning, as well as provide better adult social care based on conditions provided by environmental data.

Spark with imagination, fuel with data: Sponsors and participants in OGC initiatives come together to provide innovative ideas. Getting a sense of which standards are popular is important: OGC provides a self-registration implementation database that can serve as a proxy for the maturity of a standard. This data helps sponsors think about what should come next: address the gaps, or make improvements?

Never fail to fail: Rapid iteration is key. OGC initiatives provide the environment to test, fail, and improve. During an initiative, solutions are tested and discussed in weekly telecons. Integration experiments are frequently run to make sure that clients and servers can communicate.

Be a platform: OGC provides the process for using standards as a baseline for innovation (see George Percivall’s recent blog post about Innovations and Standards), but OGC is more than a standards organization. The Innovation Program provides the process to run initiatives in a manner unseen in other organizations. We are continuously improving the process so it can be replicated all over the world (for example, OGC’s Indian Plugfest). We are the platform that brings together experts from around the world to solve challenging problems in an agile prototyping environment and to advance open architectures and standards.

Have a mission that matters: The OGC mission is the reason why staff, members, and those involved in initiatives like to be part of OGC. Advancing geospatial interoperability makes our world more sustainable and enjoyable, and helps first responders save more lives. Sponsors of, and participants in, initiatives make a genuine contribution to the well-being of our planet, and the people that live on it.

I’m more than excited to lead this program and be part of such an important mission. If you want to advance innovation in the geospatial domain, become a sponsor, or sign up for future funding opportunities, please send me an email: lbermudez [at] opengeospatial.org.

Original author: Luis Bermudez

Innovations and Standards

OGC, as a standards developing organization, provides a stable baseline for innovation. It could be perceived that innovation and standards are opposing ideas; in reality, the two ideas work together to prevent the extremes of stagnation and chaos: standards bring order to chaotic implementations of new ideas, additionally providing a baseline for new innovation to flourish. That innovation, in turn, feeds the creation of new and updated standards. One of the first computer scientists, Herbert Simon, in The Architecture of Complexity coined the heuristic that “complex systems will evolve much more rapidly if there are stable intermediate forms than if there are not.” Standards are those stable intermediate forms necessary for innovation and evolution.

This blog post highlights several recent innovations in OGC processes: changes in the OGC Innovation Program; Community Standards in the OGC Standards Program; and geospatial trends tracking by the OGC Architecture Board. Further discussion of innovation in an agile environment will be the topic of a forthcoming blog post by Dr. Luis Bermudez, the new Executive Director of OGC’s Innovation Program.

In 2014, the OGC Planning Committee adopted an ‘Innovation Statement’ that laid out how OGC must maintain its current standards while simultaneously addressing the evolution of technology and markets. While ensuring harmonization in OGC standards, OGC must simultaneously respond to Christensen’s ‘Innovator’s Dilemma.’ The OGC identified several actions to implement its Innovation Statement:

Extend or adapt the present baseline of OGC standards;
Recognize that new standards may overlap with or diverge from existing standards, along with guidance to evaluate among options;
Develop harmonization techniques (brokers, facades) for interoperability.

To frame a discussion about innovation and standards, consider this view of open standards development as developed by Mark Reichardt, OGC President:

Open Standards Development

Based on competition in the marketplace, a specification emerges over time as a de facto standard in the market. The specification may be publicly available, but it is owned and controlled by an entity as a ‘proprietary standard’ (e.g. Microsoft Word’s .doc format). As the market develops, the owner of a proprietary standard may see value in ‘opening up’ their specification by assigning the intellectual property to a Standards Developing Organization (SDO). KML, as licensed to OGC by Google, is an example of what OGC now calls the Community Standards process. As an alternative to development by a single organization, a group of organizations may collectively identify the need for a standard and develop a specification in anticipation of its widespread use. The OGC WMS, as developed in the OGC Innovation Program, is an example of an ‘anticipatory standard’.

Based on this framework, examples of the OGC process for innovation are described next.

The OGC Innovation Program, previously known as the Interoperability Program, provides a collaborative agile process for advancing new anticipatory standards. The first Innovation Program initiative was in 1999, when the Web Mapping Testbed took place and helped to develop the most popular OGC standard: the Web Map Service (WMS). Since 1999, another 95 initiatives have brought together sponsors and technology implementers to solve problems, produce prototypes, develop demonstrations, and write engineering reports that anticipate the needs of the sponsors and the marketplace.

The OGC Standards Program is now processing the first OGC Community Standards. This new process welcomes innovative specifications developed outside of the OGC. The process allows a commonly-used specification, along with its intellectual property, to come into OGC as a snapshot of that specification. The snapshot is then voted on to become an OGC Community Standard. The process not only recognizes that innovation occurs in many communities, but also provides visibility for the future evolution of standards and a stable baseline of externally-developed standards for use in the OGC process.

To anticipate innovation, the OGC Architecture Board (OAB) has taken on a part of the OGC Innovation Statement by defining a process to track geospatial technology trends. The OAB monitors trends to identify technology gaps or issues related to the OGC baseline. The OAB Technology Trends process is used to establish innovation topics for several OGC activities: Future Directions sessions in the Technical Committee; Location Powers emerging technology summits; and prospective topics for initiatives in the OGC Innovation Program.

OGC has been developing innovative standards for over 20 years, but we have more to learn. Several excerpts from Walter Isaacson’s book The Innovators provide a historical perspective on technology innovation on which OGC can build:

The digital age may seem revolutionary, but it was based on expanding ideas handed down from previous generations. Innovation comes from teams more often than from lightbulb moments of lone geniuses. Most of the successful innovators and entrepreneurs had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design.

We invite you to participate in the advancement of geospatial technologies based on open standards. Your contribution to OGC Innovation Initiatives by sponsoring new ideas or helping develop solutions and prototypes will save lives and make this world a better place.

If you have a question or comment on OGC’s approach to innovation, contact George Percivall, OGC CTO and Chief Engineer. gpercivall at opengeospatial dot org.

Co-authors of this blog post include: Luis Bermudez, Scott Simmons and Terry Idol.

Original author: George Percivall

Looking Forward / Looking Back

As we transition into the 2017 New Year, I thank OGC members for their continuing contributions to improve location information sharing and interoperability.

By all accounts, 2016 was a successful year. OGC completed one of its largest interoperability testbeds to date – OGC Testbed 12 – which pushed the limits on a range of topics from geospatial quality frameworks to aviation and compliance. With over 100 in attendance, the TB12 demonstration in late November provided sponsors and attendees with insight into the developments made throughout the testbed. TB12 produced more Engineering Reports and Guides (a new implementer-oriented document) than any previous testbed. Engineering Reports described advancements based on implementations in many areas, including: new capabilities for GeoPackage; refinements in the Web Map Tiling Service (WMTS) and Web Coverage Service (WCS) for Earth Observations; investigations into vector and raster tiling schemes; and various REST and JSON topics. Fundamental advancements were demonstrated in the General Feature Model.

Project activity in Europe continued to build with OGC Europe leading the H2020 ESPRESSO (systEmic Standardisation apPRoach to Empower Smart citieS and cOmmunities) project to empower Smart Cities through advancement of a standards enabled information framework, as well as supporting a NextGEOSS initiative which will advance a next generation hub for discovering, accessing and exploiting Earth Observation data. We also had some amazing progress in our consensus process with member dialog expanding in the areas of point clouds, the Internet of Things, integration of geospatial / civil engineering and the built environment, 3D visualization, and modeling and simulation. Further, we implemented a new community standards process that welcomes broadly implemented market technologies into the OGC for potential adoption. All these areas of interoperability emphasis are relevant and useful across many markets and domains of use, but they all converge as integrative capabilities to enable Smart and Resilient Cities.

The OGC Compliance Program continued to develop new tests and support an increasing number of certifications. New compliance tests made available in December 2016 included Catalogue (CAT) 3.0, GeoPackage 1.0 and the Web Map Tiling Service (WMTS) 1.0.

Looking forward, 2017 is also shaping up to be a great year. OGC Testbed 13 is underway and we expect to focus on a range of new topics – with a planned renovation of OGC Web Services, which will include a close look at evolving a set of OGC Essentials – more elemental components of OGC standards – to ease implementation, especially in the app developer world. We will be implementing streamlined ‘Implementer Guide’ versions of our OGC standards to make it easier for developers to support integration and implementation of our standards. A standards incubator is being stood up within our Standards Program to encourage new ideas and concepts to be advanced – with very little process overhead. As stated on the OGC website, the Incubator promotes community “prototypes that push the boundaries of our work and perhaps provide a pathway to future standards”. Several projects have already been initiated by contributors from the community.

We will continue to leverage our OGC Location Powers Summits to interact with the community on emerging topics, trends, and challenges. Location Powers events bring together private- and public-sector experts from across the globe to stimulate highly interactive, cross-community discussion and to shape industry guidance of value to the OGC process. Summits held in prior years covered topics such as Smart Cities, Sensor Web, and Geospatial Big Data. In 2017, OGC will be convening summits on Big Linked Geodata (March 2017), Underground Infrastructure (June 2017), and Agriculture (December 2017). Input from Location Powers summits helps drive OGC’s direction in these and other areas.

Clearly, location has become an underpinning enabler across many markets and domains of use. OGC standards and best practices are enabling rapid mobilization of geospatial data and technologies into systems, enterprises, and the decision cycle. Much more opportunity lies ahead for OGC to help communities to benefit from the power of location.

Original author: Mark Reichardt

The OGC Community Standard Process

For many years, the OGC membership discussed and struggled with defining a process and a set of related policies for accommodating the submission of widely used, mature specifications developed outside the OGC standards development and approval process. Examples of these specifications are GeoTIFF, GeoJSON, and GeoRSS. Such specifications are often termed “de facto” standards: a de facto standard is one used so widely that it is considered a standard for a given application, although it has no official status. The focus of the OGC discussions was to define a more lightweight process by which outside (non-OGC) groups, as well as OGC member organizations, could feel comfortable submitting specifications developed outside the OGC into the formal OGC standards process. The four driving use cases for the Community Standards process are:

1. Submitters could rest assured that the OGC would not alter the content of the specification (unless errors were discovered), would not usurp the development and maintenance of the specification, and that the intellectual property could remain with the organization or group that developed the de facto standard.
2. A desire by government organizations to have de facto standards vetted and branded by a formal Standards Development Organization, such as the OGC, so that these de facto standards could be specified in procurement language.
3. The need for OGC standards to be able to reference externally developed de facto standards as normative.
4. The ability to have a “version” of a de facto standard that is stable and does not change. This allows such a document to be referenced in procurement, other OGC standards, and so forth.

After many months of OGC member discussion, a new set of OGC policies and procedures for what became termed “Community Standards” was adopted by the OGC membership in 2015. A number of de facto standards have already been approved as candidate Community Standards and have entered the new OGC process workflow. These are Cesium 3D Tiles, ASPRS LAS, GeoRSS, and Esri I3S.

A Community Standard is an official position of the OGC endorsing a specification or standard developed external to the OGC. A Community Standard is considered a normative standard by the OGC membership and becomes part of the OGC Standards Baseline. A key consideration for a Community Standard is that there must be strong evidence of implementation. The OGC does not take over the maintenance of the specification. Rather, a Community Standard is a “snapshot” of a mature specification, and its continued evolution and maintenance remain with the external group. Further, by submitting the specification into the OGC Community Standards process, there is no requirement to make any normative changes to the document; i.e. the external version and the OGC version of the document can remain identical. Finally, the originator has either shared the Intellectual Property Rights with the OGC or granted unlimited free use of the Intellectual Property to all implementers.

The following table is from the OGC Technical Committee Policies and procedures. The table compares the requirements for the two primary OGC standards types: Community and Full.

Community Standard: SWG: not required; Evidence of Implementation: strong; Modular Spec: not required; Compliance Test: not required; OGC Template: yes; Public Comment: yes; OAB Review: yes; IPR to OGC: yes or shared; Member Vote: yes.

Full Standard Track, standard: SWG: yes; Evidence of Implementation: no; Modular Spec: yes; Compliance Test: not required; OGC Template: yes; Public Comment: yes; OAB Review: yes; IPR to OGC: yes; Member Vote: yes.

Full Standard Track, standard with Compliance Suite: SWG: yes; Evidence of Implementation: yes; Modular Spec: yes; Compliance Test: yes; OGC Template: yes; Public Comment: yes; OAB Review: yes; IPR to OGC: yes; Member Vote: yes.
When defining the policies and procedures for processing candidate Community Standards, the membership agreed that having as lightweight a process as possible was extremely important. Therefore, many “normal” OGC procedures are not required for Community Standards. These include formal Standards Working Group formation, complying with the OGC Modular Specification policy, and having compliance tests. Add in the fact that the OGC cannot change the normative content of a Community Standard, and the net result is that the effort to move a Community Standard through the OGC process is considerably less than for a full standard. As a result, instead of 18 to 24 months to get a standards document through the OGC process, a Community Standard could – from submission to approval – be processed in as little as 6 months.

Recently, on behalf of the GeoRSS community and at the request of the US Federal Government, I submitted the GeoRSS specification into the Community Standards process. On February 2, 2017, the members approved the submission as a new OGC work item. OGC Architecture Board review should occur later this month, followed by a public/member comment period and an adoption vote. Hopefully, all steps for GeoRSS to become an official OGC Community Standard will be completed by May of this year – 6 months from the original submission.
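For readers unfamiliar with the specification: GeoRSS-Simple adds lightweight geometry elements, such as georss:point (latitude then longitude), to syndication feeds. Below is a minimal, illustrative sketch of reading such a feed in Python; the feed content and entry title are invented for the example.

```python
# Illustrative only: parse a GeoRSS-Simple point from an Atom entry.
# The feed snippet is made up; the namespaces are the real Atom and
# GeoRSS namespace URIs.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"

feed_xml = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:georss="http://www.georss.org/georss">
  <entry>
    <title>M 3.2 earthquake</title>
    <georss:point>45.256 -71.92</georss:point>
  </entry>
</feed>"""

def entry_points(xml_text):
    """Yield (title, lat, lon) for each entry carrying a georss:point."""
    root = ET.fromstring(xml_text)
    for entry in root.findall(f"{{{ATOM}}}entry"):
        point = entry.find(f"{{{GEORSS}}}point")
        if point is not None:
            # GeoRSS-Simple encodes points as "latitude longitude".
            lat, lon = map(float, point.text.split())
            yield entry.findtext(f"{{{ATOM}}}title"), lat, lon

for title, lat, lon in entry_points(feed_xml):
    print(title, lat, lon)
```

The simplicity of this encoding, and the breadth of existing feed tooling that can carry it, is much of why GeoRSS already has the strong evidence of implementation that the Community Standards track requires.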

Obviously, given that this is a new OGC process, we can expect minor refinements to the process as the OGC learns from the recent submissions. If you have any questions or concerns, please let the OGC know!

Carl Reed is a geospatial technology professional. He holds a PhD in Geography from SUNY Buffalo, specializing in GIS technology development and systems engineering/design, and has 40 years of geospatial expertise in the commercial, government, and non-profit communities. Carl recently retired from the OGC, where he was CTO and Executive Director, Standards. He developed the original idea and the initial policies and procedures for the OGC Community Standards process.

Original author: Simon Chester

OGC and GEO - in partnership from the start


Recent guest blog for the Group on Earth Observations (GEO) by our CEO Mark Reichardt

Partnerships are key to our success. The context of location permeates all disciplines. OGC has established numerous alliance partnerships with other standards organizations and associations as a mechanism to improve our standards and best practices through a better understanding of community needs. These partnerships also enable us to coordinate on activities of mutual interest, driving outcomes that could not have been achieved in isolation. To date, OGC has over 40 such partnerships spanning sensors and the IoT, built infrastructure, smart cities, modeling and simulation, law enforcement, the smart grid, and, of course, Earth Observation, to name a few. Our partnership with GEO is one of our more important relationships, given the role that standards and interoperability play as an integration platform for the large-scale, heterogeneous global array of EO sensors and sensor systems.

The OGC / GEO partnership began in the early 2000s, as Open Geospatial Consortium (OGC) open web service standards gained global implementation, improving the ability of geospatial technologies and data sources to work together seamlessly. This was around the same time that OGC members began interacting with GEO and GEOSS activities to assist in advancing Earth observation interoperability arrangements in the GEOSS Common Infrastructure (GCI), drawing on standards from IEEE, ISPRS, and the OGC.

OGC members recognized early on the importance of a close and continuous relationship with GEO.  In February 2005, OGC became a GEO Participating Organization at the final ad hoc GEO Plenary. Shortly thereafter, OGC participated in the Earth Observation Summit that formally established GEO.

Role of Open Standards In GEO

The initial GEOSS 10 Year Plan identified a bold vision to create a system of systems based on voluntary contributions from nations and participating organizations from around the world.  This vision is now being realized by GEO through an architecture based on open international consensus standards.

Just as the internet continues to grow and prosper based on an open standards architecture, creating a system of systems for Earth observations required a similar standards framework, one that also addresses the varied characteristics of EO systems and information.

The role of open standards as expressed in the GEOSS Architecture / GCI is significant. They provide a level of interoperability that enables organizations and nations to contribute and share their EO assets to more effectively address a range of social, economic and environmental issues.

OGC Role in GEO

OGC’s involvement in GEO has been significant and varied:

From 2005 to 2008, OGC in conjunction with IEEE and ISPRS planned and conducted a series of GEOSS Architecture workshops in locations around the world.  OGC’s role was to conduct live demonstrations of GEOSS architecture capabilities based on OGC standards.  These workshops and demonstrations were well received, helping to validate the power of an interoperable architecture for GEOSS, and inform the community of how to implement and scale this architecture.

On behalf of GEO, OGC conducted a series of GEOSS Architecture Implementation Pilots (AIP), bringing together the user community, industry, and the university and research community to develop an architecture for GEOSS, including GEOSS Common Infrastructure components. A series of eight Architecture Implementation Pilots developed and implemented operational prototype capabilities using OGC’s Interoperability Program rapid prototyping and engineering process. AIP Phase 1 was conducted in 2007 and focused on evaluation of the GEOSS Initial Operating Capability produced by the GEOSS Architecture and Data Committee.

Subsequent AIP phases addressed topics in the context of GEO Societal Benefit Areas such as: renewable energy planning and placement, air quality assessment, habitat management and forecasting, disaster management, water quality and drought, and disease surveillance.  For each of these AIP phases, operational prototypes, live demonstrations and detailed engineering reports and best practice videos were delivered and made public to encourage adoption of the GEOSS Architecture. These can be found on the OGC website at www.opengeospatial.org and at our YouTube channel “ogcvideo”.

OGC continues to support a range of GEO activities, and OGC members are involved in Flagships, GEO Initiatives, and Community Activities as defined in the current GEO Work Programme. Further, OGC serves as an active member of the GEO Programme Board.

Opportunities Ahead

The OGC plans and conducts a range of interoperability testbeds and pilot initiatives to rapidly develop, test, validate and demonstrate the power and effectiveness of new and enhanced candidate standards and architectural best practices.  OGC is presently seeking participation in its next major interoperability testbed – OGC Testbed 13. This initiative has a significant emphasis on the access, processing and application of EO data. I encourage GEO Members and Participating Organizations to review the Call for Participation which is open until 17 February 2017.   The Mass Population Migration and other themes of this testbed will benefit from access to a broad range of EO sources, and I can see great opportunity to leverage assets available through the GEO portal and GEO Members.

As the GEOSS approach calls for open standards, it is vital that the implementations conform to those open standards.  OGC’s Compliance Program provides a testing framework to certify that implementations conform to OGC standards.  The OGC Compliance Program testing infrastructure can be applied to confirm use of OGC standards in GEOSS as well as extending this open source testing framework to other standards used in GEOSS.

OGC is a participating member of the H2020 NextGEOSS project, which will implement a federated data hub for access and exploitation of Earth Observation data, including user-friendly tools for data mining, discovery, access, and exploitation. This data hub will be supported by a strong commitment to the engagement of Earth Observation and related communities, with a view to supporting the creation of innovative and business-oriented applications.

NextGEOSS includes a set of demonstrative pilot activities based on research topics, and various business scenarios. These Pilots will showcase GEOSS capabilities with emphasis on data accessibility and use, and will directly engage GEO and other EO-related communities, including the commercial sector.  The GEO Secretariat will announce the Call for Participation to AIPs.

OGC also has a standing Domain Working Group on Earth System Science, as well as a range of working groups that rely heavily on EO information. These working groups would benefit from deeper involvement by GEO Members and Participating Organizations to identify new geospatial interoperability challenges requiring action by the standards community.  Most of these DWGs are open to participation by OGC members and the broader community.

Finally, OGC conducts quarterly Technical Committee meetings where OGC Members converge to discuss interoperability issues and work to advance solutions.  Our 102nd Technical Committee meeting will take place in Delft, The Netherlands during the week of 20 March, 2017.  I encourage GEO representatives to consider participating in these meetings to further strengthen the ties and alignment of activities of importance to our organizations.

The original version of this blog can be found at http://www.earthobservations.org/geo_blog_obs.php?id=204

Original author: Mark Reichardt

OGC invests in improving Quality of Service and Quality of Experience

We live in a world stealthily powered by Web Services and APIs: nearly everything we do on our laptops and mobile devices uses background services to talk over the Internet. These services are especially important for applications providing access to small subsets of information, based on a user’s location, fed from large, remotely stored datasets. Any quality issues in the communication between the applications and their backend services quickly become critical, causing bad user experience for tens of thousands of people.

Systematic improvement of the Quality of Service (QoS) for Web Services, covering factors like availability, capacity, and performance, requires well-defined metrics that provide comparable QoS measurements. Defining these QoS indicators and metrics, as well as declaring the expected service levels for Spatial Data Services, have been identified as priority topics of the newly founded OGC Quality of Service and Experience Domain Working Group (QoSE DWG). OGC member activity leading to the founding of the new DWG in late 2016 clearly shows that QoS and the more user-oriented Quality of Experience (QoE) topics are currently of great interest within the OGC.
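To make the idea of comparable QoS metrics concrete, here is a minimal sketch of two such indicators, availability and a response-time percentile, computed from monitoring probes. The sample data, field names, and the nearest-rank percentile choice are illustrative assumptions, not anything defined by the DWG.

```python
# A hedged sketch of basic QoS indicators over monitoring samples.
# Sample values below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Sample:
    ok: bool            # did the probe get a valid service response?
    latency_ms: float   # round-trip time of the probe request

def availability(samples):
    """Fraction of probes that succeeded, e.g. 0.99 is 'two nines'."""
    return sum(s.ok for s in samples) / len(samples)

def latency_percentile(samples, pct):
    """Nearest-rank percentile of latency over successful probes."""
    ok = sorted(s.latency_ms for s in samples if s.ok)
    rank = max(0, round(pct / 100 * len(ok)) - 1)
    return ok[rank]

samples = [Sample(True, 120), Sample(True, 95), Sample(False, 0),
           Sample(True, 310), Sample(True, 150)]
print(availability(samples))            # 4 of 5 probes succeeded
print(latency_percentile(samples, 95))  # slowest successful responses
```

The hard part the DWG tackles is not this arithmetic but agreeing on what to probe (which operations, which payload sizes) and how often, so that numbers from different monitoring tools actually mean the same thing.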

In addition to the QoS metrics, the initial list of tasks for the OGC QoSE Domain Working Group includes gathering and defining a list of the essential QoS and QoE terms, and collecting good community practices in evaluating and improving the user experience of OGC Web Services. As an open DWG, the group acts as a forum for discussion and sharing information in QoS and QoE related topics for OGC members. Regular online meetings will be held monthly, and the group intends to meet face-to-face in as many OGC Technical Committee meetings as possible.

Charter members of QoSE DWG include several active OGC members with critical business interests in QoSE. Tom Kralidis, Senior Systems Scientist from the Meteorological Service of Canada, Government of Canada, highlights the importance of QoSE for both the data providers and data users: “Health check monitoring of geospatial services provides value for more than just uptime, focusing on the specific functionality of a given service or API. The work of the QoSE DWG will be of value to both organizations wishing to communicate their quality of service levels as well as monitoring applications wishing to evaluate and measure service quality in an interoperable manner.”

In Europe, the EU INSPIRE Directive and e-Government development are key drivers for QoSE. Danny Vandenbroucke, Research Manager, KU Leuven (SADL): “With the development of a European wide Spatial Data Infrastructure (SDI) steered by the INSPIRE Directive, QoSE has been recognized as a critical factor in the successful integration and usage of INSPIRE web services in e-Government processes. KU Leuven has been involved in the assessment of SDIs throughout Europe since 2002 and the testing and validation of its components, including QoSE, are a very important part of these assessments.”

Natural Resources Canada, Government of Canada (NRCan) is eager to contribute to the QoSE DWG best practices based on their experience. Cindy Mitchell, Lead, Operational Policies and Standards, Federal Geospatial Platform Initiative: “Quality of Service and Experience is fundamental to operational Spatial Data Infrastructures by ensuring services originating from a wide variety of publishers are available, usable, and relevant to applications and their users. We lead several initiatives of interest to the QoSE DWG in OGC, including Spatial Data Infrastructure assessment methodologies and key performance indicators, automated web services harvesting approaches, Federal Geospatial Platform data and service quality assessments, standards validation, and international collaborative projects (Pan-Arctic DEM, WaterML) that ensure data interoperability via standards. NRCan is pleased to collaborate within the QoSE DWG to bring best practices for highly reliable and usable web services to the web.”

Sampo Savolainen, the Managing Director of Spatineo, is thrilled to see the growing OGC interest for QoS: “In Spatineo, our entire business model is based on leveraging standard interfaces for letting our customers measure the quality of Spatial Data Services they are providing and using, and helping them leverage spatial data on the web. OGC activities in this field will make it easier for our customers to provide and find high quality Spatial Data Services.”

Scott Simmons, the Executive Director of the OGC Standards Program, notes that “geospatial web services include some unique characteristics, especially considering that the visual nature of a map rendered to a browser does not necessarily reflect the method of service nor the user interaction with the data. We need metrics tailored to the use case of the service and fair comparisons that target the services, not the IT environment and internet bandwidth in which the services reside.”

Raising customer awareness of QoSE issues, and harmonizing QoSE measurement where it makes sense, were primary reasons for us at Spatineo to join the OGC. I’m honoured to co-chair the group with Tom Kralidis, and I look forward to active discussion and contributions from the group members.

The next QoSE DWG face-to-face meeting will be held at the upcoming OGC TC in Delft, The Netherlands on Wednesday the 22nd of March 2017. For more up-to-date information, including the mailing lists, work programme and meeting minutes, see the group wiki at http://external.opengeospatial.org/twiki_public/QualityOfService/WebHome

Original author: Simon Chester

OGC GeoPackage: Enabling the Next Generation of Geospatial Technologies

GeoPackage meets the geospatial data requirements of users in multiple domains, including defense and intelligence, emergency management, and outdoor recreation.

An OGC GeoPackage is a portable database for sharing and displaying geospatial data that is optimized for use on mobile mapping systems. GeoPackages may contain raster maps and imagery, vector features, and elevation data, and extensions may be developed to support additional types of geospatial data such as routing networks. This post will discuss how GeoPackage can be used to meet the geospatial data requirements of users in multiple domains, including defense and intelligence, emergency management, and outdoor recreation.

In the defense and intelligence domain, users often must rely on mobile mapping systems that are not connected to a data network. Even in places where a network connection is available it may be slow or unreliable, yet users still expect the “Google Maps” experience when viewing maps and imagery or running analytics. Most internet-based map services such as Google Maps use PNG or JPEG tiles rather than native raster formats such as GeoTIFF or MrSID, which allows maps and imagery to be rendered quickly on the screen and provides a better user experience when panning around the map or zooming to different scales.  GeoPackage is the first OGC standard that specifies how to store and access tiles within a lightweight SQLite database, thus providing a performant, cross-platform solution for viewing imagery and maps when a data network connection is not available. The adoption of OGC GeoPackage enables defense and intelligence users to view maps and imagery in a consistent manner across multiple mapping systems, thus providing the common operating picture needed to successfully complete their missions.
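Because a GeoPackage is a plain SQLite database, reading a tile is an ordinary SQL lookup. The sketch below mimics the tile-table layout (zoom_level, tile_column, tile_row keys and the image bytes as a BLOB) with an in-memory database; the table name, tile addresses, and tile bytes are invented placeholders, not real GeoPackage content, and a real file would also carry the gpkg_contents and related metadata tables.

```python
# Simplified sketch of GeoPackage-style tile storage in SQLite.
# Table/column names follow the tile-table pattern; data is fake.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE osm_tiles (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        zoom_level  INTEGER NOT NULL,
        tile_column INTEGER NOT NULL,
        tile_row    INTEGER NOT NULL,
        tile_data   BLOB NOT NULL,
        UNIQUE (zoom_level, tile_column, tile_row)
    )""")
db.execute(
    "INSERT INTO osm_tiles (zoom_level, tile_column, tile_row, tile_data) "
    "VALUES (?, ?, ?, ?)", (12, 654, 1583, b"\x89PNG...fake..."))

def get_tile(db, zoom, col, row):
    """Fetch one tile's image bytes, or None when it is missing."""
    hit = db.execute(
        "SELECT tile_data FROM osm_tiles "
        "WHERE zoom_level=? AND tile_column=? AND tile_row=?",
        (zoom, col, row)).fetchone()
    return hit[0] if hit else None

print(get_tile(db, 12, 654, 1583) is not None)  # the tile we inserted
print(get_tile(db, 12, 0, 0) is None)           # an uncovered area
```

An indexed lookup like this is why tiled GeoPackages feel responsive on disconnected mobile devices: each pan or zoom is a handful of primary-key-style queries against local storage.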

Geospatial data and information play a critical role in the emergency management decision making process. To protect people from hazardous events such as wildfires, earthquakes, tornadoes, hurricanes, and industrial accidents, emergency managers and first responders must have access to maps showing the terrain, transportation networks, and location of vulnerable populations in hazard-affected areas. In addition to raster tiles, vector features such as roads and building footprints can also be stored in an OGC GeoPackage, and GeoPackage extensions may be developed to support storage of elevation data or 3D building models. The flexibility of the GeoPackage standard allows purpose-built geospatial datasets to be stored within a single file, and these datasets can be easily updated or queried using SQL statements. For example, a specific GeoPackage schema containing a road network, building footprints, and a fire perimeter boundary could enable emergency managers to determine safe wildfire evacuation routes on-the-fly using a mobile device, tablet, or laptop computer. Dynamic data such as the fire perimeter boundary could be collected in the field and shared across a local mesh network as a GeoPackage update, allowing emergency managers to re-assess the safety of evacuation routes based on this new geospatial information.
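A toy sketch of that evacuation-route idea: vector features kept in SQLite tables and filtered and updated with plain SQL. The table, road names, and the in_fire_perimeter flag are invented for illustration; a real GeoPackage feature table stores geometry as a GeoPackage binary BLOB, and the perimeter-intersection test would be done with a spatial library rather than a precomputed flag.

```python
# Toy illustration: updating and querying road features with SQL,
# as one might against a GeoPackage. All data here is invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE roads (
        fid INTEGER PRIMARY KEY,
        name TEXT,
        in_fire_perimeter INTEGER  -- 1 if the road crosses the fire boundary
    )""")
db.executemany(
    "INSERT INTO roads (name, in_fire_perimeter) VALUES (?, ?)",
    [("Canyon Rd", 1), ("Valley Hwy", 0),
     ("Ridge Trail", 1), ("River Rd", 0)])

# New field data arrives over the mesh network: the perimeter
# has grown to cover River Rd, so flag it unsafe.
db.execute("UPDATE roads SET in_fire_perimeter = 1 WHERE name = 'River Rd'")

safe = [row[0] for row in db.execute(
    "SELECT name FROM roads WHERE in_fire_perimeter = 0 ORDER BY name")]
print(safe)  # roads still usable as evacuation routes
```

The point is that an update shared in the field is just rows in a database, so re-assessing routes is a re-run query rather than a re-downloaded map.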

OGC GeoPackage can also meet the geospatial data requirements of outdoor recreation enthusiasts. Like defense and intelligence users, these users often need access to maps and imagery in areas where a data network connection is unreliable or non-existent.  In addition to storing the maps and imagery needed to perform basic geospatial functions such as determining a location or planning a route, GeoPackage can also give hikers, hunters, mountain bikers, or climbers a mobile platform for documenting and sharing their outdoor adventures with others. Photos, notes, and GPS tracks can be georeferenced to maps and imagery and dynamically added to a GeoPackage that is stored on a mobile device. When a data network is available, this GeoPackage can be uploaded to a web server and shared with other outdoor enthusiasts as an online adventure blog or travel diary. Purpose built GeoPackage schemas could be designed to support this adventure blog use case, or could be designed to support other outdoor activities such as trail maintenance or ecology research.

Outdoor recreation, emergency management, and defense and intelligence are just a small subset of the many user groups that could benefit from adopting the OGC GeoPackage standard. Mobile computing and location-based context awareness are two trends that are currently reshaping the geospatial industry, and OGC GeoPackage is well positioned as a performant, open, cross-platform solution that enables the next generation of geospatial technologies.

Micah Brachman is a Geospatial Scientist at Strategic Alliance Consulting, Inc based in the Washington, DC metro area. Micah holds a PhD in Geography from the University of California, Santa Barbara and has 20 years of geospatial expertise in the commercial, government, non-profit, and academic sectors. When he’s not making maps, Micah enjoys hiking, biking, rock climbing, and spending time with family.

Original author: Simon Chester
Continue reading
561 Hits
0 Comments

Geospatial and OGC to feature at Apache: Big Data Europe


Original author: Simon Chester

Open geospatial standards and social analytics

The Urban Big Data Centre (UBDC) at the University of Glasgow in Scotland and the Business and Local Government Data Research Centre (BLGRC) have established the Social Analytics Strategic Network (SASNet), a research-based network focused on capacity building for social analytics of emerging forms of data, including big data. Geospatial data represents one of the key data types across many of the UBDC and SASNet activities. 

A series of capacity building events funded through the Economic and Social Research Council (ESRC) SASNet fellowship programme have been carried out recently; on Tuesday 20 September, SASNet Fellow Steven Ramage was invited to run a free training workshop entitled ‘An introduction to open geospatial standards’.


Andrew McHugh, the Urban Big Data Centre’s Senior IT and Data Services Manager, introducing Steven Ramage.

The workshop focused on work done by the OGC to develop international geospatial standards with some insight into other standards development bodies including IETF (Internet Engineering Task Force), ISO (the International Organization for Standardization) and W3C (the World Wide Web Consortium). 

Some of the key messages: communities worldwide are collaborating across multiple domains to develop open standards that support innovation and deliver benefits for government, non-governmental and non-profit organisations, industry, research, and academia. Many resources relating to open standards are available for free, including tutorials, engineering reports, and white papers. Many OGC standards are also compatible with other de facto or de jure standards; the key is choosing whatever serves your purpose. Overall, the key to the consensus standards development process is participation: the results and benefits you or your organisation obtain reflect the effort you put in.

The ultimate objectives of SASNet are building capacity amongst social science researchers engaged in innovative research to harness the power of real-time data and other forms of big data for use in urban and business contexts, while providing business and public sector analysts, as well as decision-makers, with the understanding and analytic skills to make effective use of these various data sources in their professional activities. Hopefully this workshop provided an insight into how open geospatial standards can assist with big data, and into the importance of policy makers and other business decision-makers understanding the value of open standards.

Original author: Simon Chester

RM-ODP: A Critical Enabler for Sustainable Development

Achieving Sustainable Development Goals will require developing, collecting, publishing, assessing, accessing, sharing, aggregating and analyzing a huge variety -- and huge amount -- of data. This can’t be accomplished without consensus on open distributed processing architectures.

The overview of “The United Nations (UN) Sustainable Development Goals Report 2016” includes this: “The data requirements for the global indicators are almost as unprecedented as the Sustainable Development Goals (SDGs) themselves and constitute a tremendous challenge to all countries.”

That report is based on a proposed global indicator framework developed by the UN Inter-Agency and Expert Group on SDG Indicators. The UN Department of Economic and Social Affairs Statistics Division’s June 2016 final version of the “Framework for the Development of Environment Statistics” (FDES) represents many years of work. It is an extraordinary document that will be a key tool for collection and analysis of the environmental data required for the SDG indicators.

A massive amount of data must be gathered into the indicator databases to quantify progress. In some respects this will become easier, because the number and variety of information technologies, their SDG-related applications and the applications’ user populations are all expanding rapidly, causing an exponential increase in the amount of data produced. The real difficulty will come from parallel growth in the number and variety of different databases, spreadsheets and data streams that nations, NGOs and industry will be obliged to develop, assess, share, aggregate (and disaggregate), integrate and analyze.

An important part of the solution lies in a set of design principles for connecting the countless data systems in the SDG Indicators universe. (These principles may already be in review in the UN SDG community, but I haven’t found reference to such review in my web searches.) What’s needed, I believe, is a “Global SDG Information Systems Reference Model” based on the Reference Model for Open Distributed Processing (RM-ODP) (ISO 10746-1 to 10746-4).


Figure: Reference Model for Open Distributed Processing (RM-ODP) Viewpoints. Table from “Server Architecture Models for the National Spatial Data Infrastructures (NSDI),” Open Geospatial Consortium (2005).

People knowledgeable about enterprise information systems know about the RM-ODP. It is a widely accepted international standard for the specification of large information systems that consist of distributed and diverse components. It has played an important role in systems architecture thinking in the Open Geospatial Consortium (OGC), in most National Spatial Data Infrastructures, and in the UN Spatial Data Infrastructure.

As shown in the figure, the RM-ODP directs information system architects to look at an overall distributed information system from 5 different viewpoints. Each viewpoint needs to be developed in roughly the order shown.

The international community has already agreed on a set of Sustainable Development Goals that could provide the core of the Enterprise Viewpoint. To build out the Enterprise Viewpoint, the UN still faces the challenge of getting countries to reach greater agreement on the indicators, and the countries face the challenge of getting internal stakeholders to agree on the indicators.

The indicators and the FDES provide much of the content for the Information Viewpoint. The proposed reference model’s Information Viewpoint needs more modeling of the workflows that will provide the data that populates the SDG indicator databases. Quality assessment will be a critical workflow item, particularly because much of the data will be Volunteered Geographic Information (VGI).* So, more work remains on the first two viewpoints, but from a systems architecture perspective, these are the right viewpoints to begin with.

It’s now time to begin developing the Computational Viewpoint. The countless systems that will make up the SDG System of Systems will need to communicate, interoperate, and behave as one big distributed processing system. This can be accomplished by means of open, international, consensus-derived data model standards, open encoding standards based on the open data models, and open interface standards based on the open encodings. The OGC WaterML 2.0 standard, developed in a working group organized jointly by the World Meteorological Organization (WMO) and the Open Geospatial Consortium, provides an excellent example. Developers of the proposed Global SDG Information Systems Reference Model’s Computational Viewpoint would also need to discuss how to exploit progress in sensor webs, linked data and graph databases (as well as relational databases), Big Data analytics, and citizen science.

I was fortunate to have Bart De Lathouwer, Director, Interoperability Programs, OGC and Prof. Mike Jackson, Professor of Geospatial Science, Faculty of Engineering at the University of Nottingham, co-author with me an OGC white paper, OGC Information Technology Standards for Sustainable Development, that provides more detail and context.

I welcome readers’ comments, and I welcome pointers to information that might fill gaps in my knowledge and understanding. Surely someone is already working on this.

* See “A conceptual model of the automated credibility assessment of the volunteered geographic information”, by Idris, Jackson and Ishak. Published on IOP Science. http://iopscience.iop.org/article/10.1088/1755-1315/18/1/012070/meta

Lance McKee was on the OGC startup team in 1994. He recently retired as OGC’s senior staff writer and now participates in OGC as an OGC member.

Original author: Simon Chester