Monday 5 December 2016

Hydrological Society Conference - Web-based real-time processing of environmental measurements

This case study was presented as a poster abstract.

Kmoch, A., White, P. A., & Klug, H. (2015). Sensor Observation Service and web-based real-time processing of environmental measurements in the Upper Rangitaiki Catchment (Poster). In The NZ Hydrological Society Conference 2015, 26 November, Hamilton, New Zealand.

Abstract:

Environmental assessments naturally depend on field observations. Technological advancements, such as telemetry, allow the automated collection, transmission and processing of these measurements. However, modelling of natural processes is typically a complex challenge and involves applying the expertise of scientists as well as a host of data preparation steps (White, 2006; White et al., 2003).
In addition, automation of model execution with the most recent observation data depends on the integration of the data collection, storage and processing elements (Klug and Kmoch, 2014). This paper demonstrates a system that integrates a Sensor Observation Service (SOS), including field observations and internet-based environmental data, with a rainfall recharge model that allows near-real-time calculation of rainfall recharge in the Upper Rangitaiki catchment, Bay of Plenty region.
The SOS specification is an Open Geospatial Consortium (OGC) standard for the open and standardised integration of environmental sensors into an internet-based environmental data infrastructure (Klug and Kmoch, 2015; Klug, Kmoch and Reichel, 2015).

Figure 1. Process of data flow from field site sensors, to the SOS data service, to a simulation model process


Results:


  • We showed that it is possible to link the collected data directly to a simple rainfall recharge model (Figure 1).
  • The low-cost sensor and circuit board instrumentation collects data and forwards it to the field computer in 10-minute intervals via the robust, low-power ZigBee wireless protocol.
  • The field computer, running a standard Linux operating system, transfers observation data in 10-minute intervals via a 3G mobile data connection to an online SOS server.
  • From the service, the observations are available in a standardised open format.
  • A website can access the raw data from the SOS server and plot data points within 5-10 minutes of field measurement (see the query sketch after this list).
  • A rainfall recharge model runs with the latest data points from the online SOS server.
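As an illustration of that standardised access, here is a minimal Python sketch of the kind of SOS 2.0 KVP GetObservation request a website or model wrapper can issue; the endpoint URL and observed-property URI are placeholders, not the project's actual service.

    import requests

    # Hypothetical SOS endpoint and observed property -- placeholders only.
    SOS_URL = "https://example.org/sos/service"

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "observedProperty": "http://example.org/phenomena/rainfall",
        "temporalFilter": "om:phenomenonTime,2015-11-01T00:00:00Z/2015-11-02T00:00:00Z",
        "responseFormat": "http://www.opengis.net/om/2.0",
    }

    # The SOS answers with an O&M (Observations & Measurements) document,
    # which a website or model wrapper can parse for the latest data points.
    response = requests.get(SOS_URL, params=params, timeout=30)
    response.raise_for_status()
    print(response.text[:500])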


References:

Klug, H., & Kmoch, A. (2014). A SMART groundwater portal: An OGC web services orchestration framework for hydrology to improve data access and visualisation in New Zealand. Computers & Geosciences, 69(0), 78–86. http://dx.doi.org/10.1016/j.cageo.2014.04.016

Klug, H., & Kmoch, A. (2015). Operationalizing environmental indicators for real time multi-purpose decision making and action support. Ecological Modelling, 295, 66-74. http://dx.doi.org/10.1016/j.ecolmodel.2014.04.009

White, P. A. (2006). Some Future Directions in Hydrology. Journal of Hydrology (NZ), 45(2), 63–68.

White, P. A., Hong, Y.-S., Murray, D. L., Scott, D. M., & Thorpe, H. R. (2003). Evaluation of regional models of rainfall recharge to groundwater by comparison with lysimeter measurements, Canterbury, New Zealand. Journal of Hydrology (NZ), 42(1), 39–64.

Klug, H., Kmoch, A., & Reichel, S. (2015). Adjusting the Frequency of Automated Phosphorus Measurements to Environmental Conditions. GI_Forum 2015 - Journal for Geographic Information Science - Geospatial Minds for Society, 1, 590–599. http://doi.org/10.1553/giscience2015s590

Wednesday 30 November 2016

AGILE 2016 - SensorWeb Semantics on MQTT for responsive Rainfall Recharge Modelling

Integrating Wireless Sensor Networks (WSNs) and spatial data web services is becoming common in ecological applications. However, WSNs were developed in application domains with different sensor and user types, often with their own low-level metadata semantics, data formats and communication protocols. The Sensor Web Enablement (SWE) initiative within the Open Geospatial Consortium (OGC) has released a set of open standards for interoperable interface specifications and (meta)data encodings for the real-time integration of sensors and sensor networks into a web services architecture.
Such XML-based web services exhibit disadvantages in terms of payload and connectivity in low-bandwidth, low-energy, unreliable networks, such as remote 3G uplinks. Monitoring stations deliver frequent measurements in real time, but dynamic adjustment of measurement frequencies, adapted to certain environmental conditions, is rarely implemented. We describe a responsive, integrated hydrological monitoring prototype to calculate rainfall recharge for water management purposes.
When rainfall is observed, a threshold event triggers a reconfiguration task for the soil moisture sensors, using asynchronous, push-based communication implemented with an MQTT queue. A Sensor Planning Service commits that request via MQTT into the wireless sensor network and updates the measurement frequency of the target sensors to gain a higher resolution of the vertical soil water infiltration.
The system integrates a Sensor Observation Service (SOS), including field observations and internet-based environmental data, with a rainfall recharge model that allows near-real-time calculation of rainfall recharge in the Upper Rangitaiki catchment, Bay of Plenty region, New Zealand.
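To make the push-based re-tasking concrete, here is a minimal sketch of how the Sensor Planning Service side might publish such a reconfiguration task over MQTT with the paho-mqtt client. The broker address, topic layout and payload format are illustrative assumptions, not the prototype's actual implementation.

    import json
    import paho.mqtt.publish as publish

    BROKER_HOST = "broker.example.org"     # assumed broker address
    TASK_TOPIC = "sps/tasks/waspmote-01"   # hypothetical per-station topic

    def publish_retasking(rainfall_mm: float, threshold_mm: float = 2.0) -> None:
        """If observed rainfall exceeds a threshold, push a higher
        soil-moisture sampling frequency to the target station."""
        if rainfall_mm < threshold_mm:
            return
        task = {
            "procedure": "soil-moisture-array",  # sensors to reconfigure
            "parameter": "samplingInterval",
            "value": "PT1M",                     # ISO 8601 duration: one minute
        }
        # QoS 1, so the task survives the unreliable 3G uplink.
        publish.single(TASK_TOPIC, json.dumps(task), qos=1, hostname=BROKER_HOST)

    publish_retasking(rainfall_mm=3.5)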


Figure 1: Setup and location of the sensor field site, central North Island, New Zealand

The prototype site comprises a main station conducting comprehensive measurements of meteorological, hydrological and pedological parameters. For the wireless data transmission within the local site installation, XBee-PRO modules from Digi International, which implement the ZigBee IEEE 802.15.4 protocol, are used. The main station receives continuous sensor measurements from the attached sensor units and acts as the gateway to the online SOS and SPS services by providing the communication channel from the local sensor network to the web-enabled data management infrastructure.
The field site has been established in the Upper Rangitaiki catchment (Figure 1) and comprises a field computer (Raspberry Pi) with a direct internet link (GPRS/3G) and a sensor board (Waspmote) that has 12 typical meteorological, hydrological and pedological sensors attached (i.e., wind speed, wind direction, rainfall, 1x groundwater probe, 5x temperature and 3x soil moisture). The Raspberry Pi and Waspmote can be monitored and reprogrammed from an online server.
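The gateway role of the Raspberry Pi can be sketched in a few lines: read sensor readings arriving from the Waspmote over the serial/XBee link and forward them over the 3G uplink to the online SOS. The device path, line format and endpoint below are assumptions for illustration; a real deployment would send a full SOS 2.0 InsertObservation request with an O&M payload.

    import serial    # pyserial
    import requests

    SERIAL_PORT = "/dev/ttyUSB0"                      # assumed XBee-to-USB device path
    SOS_ENDPOINT = "https://example.org/sos/service"  # placeholder endpoint

    def forward_observations() -> None:
        """Read 'sensor_id,value' lines from the Waspmote and push them
        over the 3G uplink to the transactional SOS interface."""
        with serial.Serial(SERIAL_PORT, baudrate=38400, timeout=60) as link:
            while True:
                line = link.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue   # read timed out, keep listening
                sensor_id, value = line.split(",", 1)
                requests.post(SOS_ENDPOINT, json={
                    "request": "InsertObservation",   # simplified stand-in
                    "procedure": sensor_id,
                    "result": float(value),
                }, timeout=30)

    forward_observations()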

Figure 2: Raw sensor series visualized in a website from a SOS query.

The site setup allows scaling up to a multitude of low-cost, low-energy sensor stations throughout the catchment, with only one field computer serving as a data logger for backup. The observations were available in a standardised open format. The website accessed the raw data from the SOS server and plotted data points within 5-10 minutes of field measurement. This website was easily accessible via browsers and smartphones (Figure 2).

The paper was presented at the 19th AGILE International Conference on Geographic Information Science, 15th of June, in Helsinki, Finland.

Kmoch, A., Klug, H., White, P., & Reichel, S. (2016). SensorWeb Semantics on MQTT for responsive Rainfall Recharge Modelling. In 19th AGILE International Conference on Geographic Information Science. Helsinki.


Sunday 25 September 2016

Using Liquid Democracy for Water Resources Management: A Review


... and how to link governmental policy processes and geosciences via an interactive decision making platform for water resources co-management.

Initial Context - Case Study New Zealand

Responsibility for the management of natural resources, in particular water, is typically delegated to the Regional Councils by the Resource Management Act. Decisions regarding water management are regulated through the National Policy Statement for Freshwater Management. However, those decisions need to be backed by thorough science and made in consultation with all stakeholders, be it Iwi, domestic, agricultural or industrial water users, or the general public with regard to the recreational services that water resources provide.

Current policy and management decision processes follow a rigid procedure: 1) the science to understand the resources, 2) a consultation process (if any) about the plan for how to manage the resource, and 3) the development of a long-term strategy and policy decision.

The main concern that I'd like to address is the agility of that process. These steps follow the traditional waterfall project management model, which is probably due to the limitations of current tools. Scientific models that aim to characterise the availability of natural resources and the dynamics of environmental processes are developed, possible impacts are assessed, and the results are described in reports for further use. The data and assumptions utilised in the research process are limited snapshots in time, dependent on the quality of data collection and curation processes, and their scope and depth vary with the available budget. Based on the information from these reports, resource managers discuss strategies for how to manage the water resources in reconciliation with demands. The consultation with users is again a cost-intensive process, because it is time-consuming. Furthermore, assumptions for the modelling process, e.g. limits or local/regional differences in water allocation, cannot be changed flexibly.

The consultation process therefore seems limited to very few pre-decided scenarios, which might not have considered all important stakeholders (and who are all the important stakeholders, anyway?). Thus, a final management and policy decision is often not satisfactory.

I would like to propose more research into an agile, aka "liquid", resource management system. Recent advances in computer and web technologies, for example, allow dynamic exposure of datasets and scientific software. So it might be time to link data with models and a "delegated voting" mechanism into an online resource management feedback system. Such an online platform would provide the capabilities to run different scenarios transparently within a democratic discussion forum. Through delegates, stakeholders can have their interests represented in a co-management approach, which allows a close link between the resource managers and the affected communities while having the current science at hand.

Background Liquid Democracy

Probably one of the first explicit treatments of Liquid Democracy originates from Bryan Ford's 2002 draft named Delegative Democracy. Back then it was uncertain which scientific venue(s) it would be suitable for, and Bryan unfortunately didn't manage to get back to exploring and developing it in any rigorous scientific fashion.


However, he published a well-cited informal blog post revisiting the idea and pointing to some of the interesting developments since 2002:


The closest academic work Bryan referred to as a reasonably serious, rigorous exploration of the topic (and the only peer-reviewed, published work directly on the topic by then) was James Green-Armytage's political-economic analysis:



This paper took a first step at theoretically defining and analyzing the idea from a cost/benefit perspective, but it is only a first step: it leaves a lot of unanswered questions and issues, and it does not contribute to the empirical space.

Helene Landemore, a colleague of Bryan's in political science at Yale, has been interested in this and related "collective intelligence/decision-making" topics for a while. She is working with Rob Reich and Lucy Bernholz at Stanford on some events in the near future exploring this and other related "digital democracy" topics. One might try reaching out to any or all of them as additional points of contact with more experience in the political and social sciences space.

Further academic or applied works around Liquid Democracy can be found occasionally, for example:

  • Blum, C. & Zuber, C.I., 2015. Liquid Democracy: Potentials, Problems, and Perspectives. The Journal of Political Philosophy, 24(August 2014), pp.6–9. Available at: http://doi.wiley.com/10.1111/jopp.12065




  • Zwattendorfer, B., Hillebold, C. & Teufl, P., 2013. Secure and Privacy-Preserving Proxy Voting System. 2013 IEEE 10th International Conference on e-Business Engineering, 0, pp.472–477



  • Behrens, J. et al., 2014. The Principles of LiquidFeedback. Interaktive Demokratie e. V., Berlin. ISBN: 978-3-00-044795-2, p.240

Liquid Democracy as an Integrating Technology Platform

Possibly the most prominent application of Liquid Democracy is in the German Pirate Party [1], for which the software LiquidFeedback was developed [2]. A Medium article describes the distinctive features [3]:

Liquid Democracy is a new form for collective decision making that gives voters full decisional control. Voters can either vote directly on issues, or they can delegate their voting power to delegates (i.e. representatives) who vote on their behalf. Delegation can be domain specific, which means that voters can delegate their voting power to different experts in different domains. This is in contrast with direct democracy, where participants are required to personally vote on all issues; and in contrast with representative democracy, where participants vote for representatives once in a certain election cycle and then never worry about voting anymore.
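To sketch the mechanics behind such domain-specific delegation, here is a small illustrative Python routine that resolves transitive delegation chains before tallying votes. The names and data structures are mine for illustration, not LiquidFeedback's.

    from typing import Dict, Optional

    def resolve_vote(voter: str,
                     delegations: Dict[str, str],
                     direct_votes: Dict[str, str]) -> Optional[str]:
        """Follow a voter's delegation chain within one domain
        (e.g. 'water allocation') until a direct vote is found.
        Cycles or dead ends count as abstention."""
        seen = set()
        current = voter
        while current not in direct_votes:
            if current in seen or current not in delegations:
                return None
            seen.add(current)
            current = delegations[current]
        return direct_votes[current]

    def tally(voters, delegations, direct_votes) -> Dict[str, int]:
        """Each voter contributes one vote, cast directly or via delegates."""
        counts: Dict[str, int] = {}
        for voter in voters:
            choice = resolve_vote(voter, delegations, direct_votes)
            if choice is not None:
                counts[choice] = counts.get(choice, 0) + 1
        return counts

    # Ana delegates her water-domain vote to Ben; Ben and Cas vote directly.
    voters = ["ana", "ben", "cas"]
    delegations = {"ana": "ben"}
    direct_votes = {"ben": "option-A", "cas": "option-B"}
    print(tally(voters, delegations, direct_votes))
    # {'option-A': 2, 'option-B': 1}

Domain-specific delegation then simply means keeping one such delegation table per topic domain.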

Coming back to the water sciences and the data issues around the water resources management process: in the last decade, so-called geoportals evolved into integrated systems of systems, not only providing data, but also processing routines and visualisations of the processed geospatial data, to support science and education as well as policy and decision making for particular environmental domains. From a scientific and data-centric point of view, it becomes an obvious choice to link the democratic processes with the data in an online platform, e.g. as suggested by Craglia & Shanley (2015), or in the elaborate study "When Water Becomes the New Oil" (Kwiatkowski & Höchli, 2016) from the Swiss Gottlieb Duttweiler Institute (GDI), an independent think tank based in Rüschlikon near Zurich, which frames this as a participatory process.




Links

  1. http://liquidfeedback.org/
  2. http://techpresident.com/news/wegov/22154/how-german-pirate-partys-liquid-democracy-works
  3. https://medium.com/organizer-sandbox/liquid-democracy-true-democracy-for-the-21st-century-7c66f5e53b6f
  4. https://en.wikipedia.org/wiki/LiquidFeedback

Wednesday 10 August 2016

SMART Groundwater Portal Dev going full "Cloud"

What a concise quote from Erik Dietrich, founder of DaedTech LLC:
Software developers demand the ability to work effectively from anywhere.  They have attained a coolness factor, and demand for them is so high that there is no need for them to guard their source code like squirrels preparing for winter.  GitHub is a good idea because it effectively captured what software developers really want and offered it to them pretty flawlessly.  GitHub is a zeitgeist that is taking over the world precisely because software developers are taking over the world and software developers really like GitHub. (source)
Although I work at a research institute which is one half a commercial consultancy with IP to protect, ongoing international research collaborations and governmental research funding require us to be flexible, open and accessible. I work as a research scientist/analyst programmer in a science department, not in the IT or applications department, and thus IT infrastructure interaction in commercial entities is "challenging" - for the scientists as well as for the IT folks.

In our current project we have embraced the Zeitgeist now, too. For our geodata portal development and deployment processes, we adopted the following paradigm:

  • Google Cloud Platform, Compute Engine and Container Engine with Kubernetes, as our computational platform.
  • Google Drive, Google Docs and Sheets for assets, functional and implementation specification development, user stories and use cases.
  • GitHub as our distributed version control system, which allows us to collaborate, yet keep contributions transparent and easily and publicly traceable.
  • Trello eventually serves as our workflow board, for sprints, and for keeping links to specs, repos and other soft information together.

I believe that we could have done everything in GitHub, but Google Docs and Trello provide gentle mechanisms to also invite non-technical folks to contribute. And actually, that is what we really want, right?



Thursday 30 June 2016

Amazon IoT and OGC SensorThings vs SOS/SPS on MQTT

Today I stumbled over a post on the OGC blog about "Amazon IoT and the candidate OGC SensorThings API Standard".

It was great to learn that Amazon IoT uses MQTT as its network/transport layer. Even more interesting is the move towards REST (which is HTTP) to address semantic issues with IoT systems:
AWS IoT, MQTT and many other network interoperability standards (LWM2M, CoAP, etc.) enable message interchange. However, IoT network interoperability doesn't enable the systems that are exchanging messages to interpret those messages. To realize the many-to-many system-of-systems vision, IoT applications need to implement standard ways of communicating sensor locations, sensor and data parameters, and sensor instruction sets. This is what the OGC SensorThings API provides.
It is understandable from the point of view that OGC OWS is, in general, intrinsically interwoven with HTTP semantics, and that the application of SWE in the IoT context is desirable, well, even crucial.

However, MQTT stems from the constraints of low bandwidth, low energy consumption and unreliable connectivity, and implements a robust, small-footprint asynchronous messaging platform. Earlier this year we developed a prototype system that aims to add SWE semantics to MQTT messaging. We keep the specialisation of MQTT, where for example connections can be resumed, which under SSL/TLS security can be a significant power saver. MQTT also better addresses the "addressability" of distributed "things", which might be isolated behind IP NAT or run with dynamically assigned addresses: in the MQTT context, the "things" subscribe to SPS management topics, so higher-order SDI services don't need to direct data queries or task requests to an address, but place them on the queue under the respective topics.
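As a sketch of that device-side pattern: the "thing" only needs an outbound connection to the broker and a subscription to its management topic, so NAT and dynamic IP addresses stop mattering. The broker, topic and payload layout below are illustrative assumptions, not the prototype's exact design.

    import json
    import paho.mqtt.client as mqtt

    BROKER_HOST = "broker.example.org"    # assumed broker address
    MGMT_TOPIC = "sps/tasks/waspmote-01"  # hypothetical management topic

    def on_connect(client, userdata, flags, rc):
        # The station dials out to the broker; no inbound port or fixed
        # IP address is required, so NAT is not an obstacle.
        client.subscribe(MGMT_TOPIC, qos=1)

    def on_message(client, userdata, msg):
        task = json.loads(msg.payload)
        # Apply the re-tasking request locally, e.g. a new sampling interval.
        print("re-tasking %s: %s -> %s"
              % (task["procedure"], task["parameter"], task["value"]))

    client = mqtt.Client(client_id="waspmote-01", clean_session=False)
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect(BROKER_HOST, 1883)
    client.loop_forever()   # reconnects and resumes the session after dropouts

With clean_session=False the broker queues QoS 1 tasks while the station is offline, which fits the resumable-connection power-saving argument above.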




For an agricultural research project we distributed monitoring stations as a wireless sensor network throughout the catchment to deliver frequent measurements in near-real time, which mimics OGC SOS.

A dynamic implementation of measurement frequencies, adapted to certain environmental conditions like heavy rainfall events, requires re-tasking of the nodes, which mimics OGC SPS. Within this paper we provide a framework where a threshold event triggers a reconfiguration task for a phosphorus measurement device, using asynchronous, push-based communication on an MQTT queue, which links the ground stations with a cloud-based control system. OGC WPS algorithms continuously analyse incoming SOS measurements, commit such a request into the wireless sensor network, and update the measurement frequency of the target nodes to enable nutrient peak-flow estimation during storm events.


This ongoing work is based on original thoughts on the interlinking of OGC SWE semantics for sensor descriptions and observations (OGC Sensor Observation Service, SOS) and sensor configuration and planning (OGC Sensor Planning Service, SPS) with the open Message Queue Telemetry Transport (MQTT) protocol.
MQTT is a simple, yet very performant and robust publish/subscribe message passing system, which can also serve for event brokering as described in the SWE Service Model. Actual addressing and data transmission are done by a topic string and an arbitrary payload. The topic string and the payload need to reflect the necessary standard semantics, to be mapped back and forth, to allow for seamless, and preferably lossless, bi-directional, multi-lateral communication between the WSN as data provider (SOS), web clients as data consumers (SOS), and the management system, which interacts with the WSN in a standardised way, too (SOS and SPS).
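One way such a mapping might look (purely illustrative, not the prototype's actual topic grammar) is a topic string that encodes the SWE service, operation, offering and procedure, with the payload carrying the observation or task itself:

    # Hypothetical topic grammar: <service>/<operation>/<offering>/<procedure>
    def make_topic(service, operation, offering, procedure):
        return "/".join([service, operation, offering, procedure])

    def parse_topic(topic):
        service, operation, offering, procedure = topic.split("/")
        return {"service": service, "operation": operation,
                "offering": offering, "procedure": procedure}

    # Observations flow from the WSN towards the SOS ...
    obs_topic = make_topic("sos", "insertObservation",
                           "upper-rangitaiki", "soil-moisture-01")
    # ... while tasking requests flow from the SPS into the WSN.
    task_topic = make_topic("sps", "submit",
                            "upper-rangitaiki", "soil-moisture-01")

    print(parse_topic(obs_topic))
    # {'service': 'sos', 'operation': 'insertObservation',
    #  'offering': 'upper-rangitaiki', 'procedure': 'soil-moisture-01'}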



Friday 29 April 2016

Geoscience Data Mining and Visualisation Brainstorming Weekend

In early 2016, the New Zealand Ministry of Business, Innovation and Employment (MBIE) put out calls for interested parties for a business-to-government (B2G) data innovation challenge. One "opportunity" in particular is about "Geoscience Data Management" - I thought that I have some strengths and competency to contribute and was keen to have a say.

The Challenge:

This so-called R9 Accelerator brings the public and private sectors together to make it easier for business to interact with the New Zealand government.

http://www.r9accelerator.co.nz/opportunities/opportunity14/

A prototype model could be applied to a range of other large databases managed by government, businesses and science institutions across the country. Data management issues are common internationally, so the model could have applications overseas.

From the 29th to the 31st of January 2016 there was a full weekend information workshop in Miramar, Wellington. Subsequently, if interested, one would have to apply for the 3-month accelerator programme (either as a team member or as a mentor/domain expert).

http://www.r9accelerator.co.nz/apply/

http://www.r9accelerator.co.nz/timeline/

If teams were selected for the full programme, they would take part in the three-month Accelerator starting on the 1st of March, and then pitch to investors. This could be an opportunity to learn how the private sector could better interact with government data.

Alternatively, one could consider being involved at a higher level in the process, which would be a sort of part-time mentorship, government navigator or domain expert type of participation.

http://www.r9accelerator.co.nz/take-part/support-a-team/

http://www.r9accelerator.co.nz/take-part/invest-in-a-team/

The plan was to show up, try to form a team, and develop an idea and basic plan over the weekend, which would then be pitched on Sunday in a two-minute presentation. This apparently would have the most impact on a team's chance of being accepted into the 3-month intense programme, where a real prototype is supposed to be developed by the team. The official application via an online form is then only a formal act, to be completed subsequently by an already consistent and focussed team from the weekend.

The Geoscience Data Management opportunity was only one out of 14 or 15, and it was not obvious how many teams would tackle each opportunity and how many applications were expected to go forward.

The Team Brainstorming Weekend:

There was a wild crowd of young and old, but only the team around the geodata challenge seemed to be high profile.

Katalyst / KDM Spectrum Data from Australia, Schlumberger, and the NZ agencies MBIE, LINZ, NIWA and GNS (Guy Maslen / Globe Claritas) had representatives there. So we were locked away over the weekend to brainstorm ideas to address MBIE's and NZP&M's immediate problem of a nicer representation/delivery/visualisation of prospectivity data for possible investors in oil & gas and minerals.

From Dave Darby pitching the challenge...


WELLINGTON, NEW ZEALAND - January 29: R9 Accelerator Day 1, January 29, 2016 in Wellington, New Zealand. (Photo by Mark Tantrum / http://marktantrum.com, copyright 2016 Mark Tantrum)



over group discussions...

WELLINGTON, NEW ZEALAND - January 30: R9 Accelerator Day 2, January 30, 2016 in Wellington, New Zealand. (Photo by Elias Rodriguez / eliasrodriguez.co.nz, copyright 2015 Elias Rodriguez)

... toward the final pitch of what a team could possibly achieve if funded (respectively, if accepted to participate in this accelerator programme):

WELLINGTON, NEW ZEALAND - January 31: R9 Accelerator Day 3, January 31, 2016 in Wellington, New Zealand. (Photo by Elias Rodriguez / eliasrodriguez.co.nz, copyright 2015 Elias Rodriguez)

The (preliminary) Summary:

It could have been a great set-up for creating specific start-up type business solutions for MBIE across their departments.

We came up with designated/suggested team members, e.g. Guy Holmes and Tony Duffy (KDM Spectrum Data), Marielle Lange (a developer), Gavin Chapman (geodata management team at MBIE) and me. We also suggested an advisory group, as far as I can piece it together: Dave Darby (MBIE), James Johnson (MBIE), Richard Garlick (MBIE), Jochen Schmidt (NIWA), Guy Maslen (GNS / Globe Claritas), Greg Byrom (LINZ).

If the proposal had been accepted, the team would have had to develop a prototype with a small stipend-type funding, and present that prototype to MBIE and other possible investors by June. Based on that, further commercialisation/contracting might arise. However, the professional team members were mainly supposed to support themselves (presumably KDM as a big business, and MBIE seconding their participant), and a few of us would have had to go all in and see if we'd be eligible for a part of the team stipend, to basically live the start-up work life for the coming three months.

However, while the team, the idea and the pitch were great, this would of course have complicated my personal situation with my PhD and the SMART programme. Eventually, I had to make a decision and withdrew to wrap up my PhD first. After all, it was a great opportunity to meet fascinating people and talk about possibly disruptive ways of re-shaping geoscience data management and visualisation at governmental and even global scale.



Thursday 24 March 2016

Dreamteam FOSSGIS NZ eResearch 2013 Ignite Talk

In July 2013, NeSI, the New Zealand eScience Infrastructure, organised lightning talks in Ignite format (20 slides, 15 seconds each) at the eResearch NZ conference in Christchurch.

I had the chance to present a little piece on the dream-team combination of FOSSGIS (free and open source software for Geographic Information Systems) and OGC (Open Geospatial Consortium) standards, and how they are great enablers for science and education. I stumbled over this presentation in my archives and thought I could share it:


A rough transcript :-)

1. Hi, I'm Alex and I'm a Geoinformatics PhD student at AUT University, and I'm working in a New Zealand groundwater research project with GNS Science in Taupo.

2. Today I'd like to tell you the story of my journey. I think it is a great story and I want to share it with you – and being in my first year, I needed to get my head around stuff.

3. So my part in the so-called SMART aquifer characterisation project is about a groundwater web portal for New Zealand. I need to collect, analyse, mash up, visualise and share again a lot of different groundwater related data sets – and everything shall happen in the web.

4. Little did I know before … The diversity of available content, formats and technologies is quite overwhelming – so I had to figure out where to start.

5. So my story is basically about discovery – a step-by-step exploration – to build something for the better of humanity (well, definitely for New Zealand).

6. And what helped me in this rather iterative and incremental process? Open source software, especially Free and Open Source Software for Geoinformation Systems (FOSSGIS).

7. It worked for me for two reasons: if it didn't perform, I could have a look inside and fiddle around with it to make it fit my demands – well, or I'll just try something else, you know, freedom of choice.

8. And if you even want to give back to the community, you can often propose and contribute enhancements to make such software better – however, all things optional.

9. Alright, second thing, data sharing – let's have a look at what's happening out there in the vast expanses of the internet.

10. So there is the Open Geospatial Consortium, the OGC, an international consortium of 482 companies, government agencies and universities participating in an open consensus-based process to develop publicly available standards that "geo-enable" the web, thus fostering interoperability. Interoperability is key.

11. They create specifications for data formats and web services and so forth, to interact, integrate and communicate with each other. You can download the full specifications for free, and again, you could also participate.

12. So now, does that OGC stuff work with open source? Bamm, there's a huge open source software ecosystem supporting OGC standards – go to osgeo.org or 52north.org and you will likely find everything you need to start.

13. And that's actually really high quality software. They are often even the reference implementations for particular OGC standards – and you can still do with it whatever you want.

14. Ok, I don't want to dwell on what's happening internationally, you know, Aussie, North America or Europe … Well, I learned New Zealand is just really awesome, too.

15. So NZ has a Geospatial Office that published the NZ Geospatial Strategy in 2007, which actually says "USE OGC STANDARDS" and make environmental and other spatial datasets available.

16. And boom, agencies, research institutes and even Kiwi-based commercial companies open up massive NZ datasets to the public – accessible through OGC interfaces and open source – OPEN STANDARDS work.

17. So data and technology were available. I could start with small steps, be open, be agile and flexible. I really felt enabled to discover, grow and share in return.

18. So how can you facilitate that wealth of knowledge? I heard in New Zealand there is a GIS Masters programme available – is it covering OGC and Open Source software? I don't know actually, but I've been asked to co-author an "Open GIS" module in another international GIS Masters (I could make it open access perhaps).

19. Well, having OGC and FOSSGIS at hand, you can conduct as well as support top-notch science and research, and also foster the education of kids at school, students at uni, even citizens at home, with the same set of tools … communicate science and knowledge about this country and its beautiful nature, which deserves protection and sustainable development.

20. So when do you start? Thanks everyone, acknowledgements to GNS, AUT, eResearch NZ, MBIE/MSI.