Sunday, February 22, 2015

The Future of GeoNet Revisited - Part 4: Early Warning

Early Warning....
Let me state up front that I have mixed feelings about early warning for geological hazards events. It is very hard to do well, and the consequences of false alarms can be severe. Also, technology is a very small part of the end-to-end process. Detecting the hazard is in many cases the easy part - getting the message to affected communities in forms that can easily be acted upon is the really hard part of the complete package. Finally, any system has to be super robust. It may be in place for decades before a devastating event occurs. And when disaster strikes, will the early warning system (end-to-end) still be fully functional?

Another issue I will get off my chest at this stage is the danger that early warning is used as a funding opportunity by researchers (often with the best of intentions), particularly those not involved in operational systems. It is easy to say that someday this wonderful research I am doing will lead to early warning. This is the "I can cure cancer" syndrome we all know about. How many times have the media announced a cure for cancer? What is obvious is that we are slowly moving towards being able to treat (or at least delay) many cancers - but there is no single silver bullet. I believe it is the same with early warning for geological hazards.

GeoNet and Early Warning....
GeoNet has much of the infrastructure and technology necessary to contribute to a forecasting and early warning capability for New Zealand for some perils. However, GeoNet is currently set up to collect research data and report on geological hazards, and would require considerable reconfiguration for short-term early warning. A fully staffed 24/7/365 warning centre is a major component not currently available; GeoNet operates a duty system rather than being staffed 24 hours a day every day of the year. If GeoNet's role were expanded to include warning centre capabilities, the sensor networks would need to be expanded and made more robust, and research and development would be needed to transform the outcomes of scientific research into operational tools.

For the record, forecasting is a form of time-dependent hazard assessment, whereas early warning requires the identification of an imminent peril and the likely time of impact.  Some geological hazards are easier to forecast than others, and the benefits of the forecasts can vary considerably. Let's briefly look at the perils we face.

Volcano Early Warning....
The GeoNet volcano monitoring programme already provides a level of volcano forecasting which would be enhanced by a 24/7/365 warning centre, improved remote data collection systems and additional research and development. Let's look at the example of the eruption of the Te Maari craters on 6 August 2012, after nearly 120 years of inactivity at Tongariro volcano. The eruption followed three weeks of unrest, including an increase in earthquake activity and changes in fluid chemistry, which led GeoNet to raise the alert level for the volcano. Although the alert level is not designed to be predictive, the increased activity triggered increased community and land owner (Department of Conservation) consultation and resulted in a better prepared community when the eruption took place. This is an example of effective volcano forecasting in practice.

Tsunami Early Warning ...
The compelling case for early warning capability in New Zealand is the potential for local or near-regional source tsunami. The 2013 update of the 2005 tsunami hazard assessment for New Zealand demonstrated that a tsunami generated by a Kermadec earthquake could impact highly populated parts of the North Island from the Bay of Plenty through the Auckland and Northland regions, with travel times of between one and two hours. Further, it is likely the causal earthquake would not be strongly felt, because the volcanic region attenuates the earthquake shaking, negating the effectiveness of natural warning signs. Local-source tsunami caused by an earthquake on the Hikurangi subduction zone offshore of the east coast of the North Island also pose a threat, making tsunami the most crucial of the perils requiring early warning capability.
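The one-to-two-hour travel times quoted above follow from simple long-wave physics: in the open ocean a tsunami travels at roughly the square root of gravity times water depth. A minimal sketch of that back-of-the-envelope calculation (the distance and depth figures are illustrative assumptions, not the numerical forecast models actually used):

```python
import math

def tsunami_speed_ms(depth_m, g=9.81):
    """Shallow-water (long) wave speed in m/s: sqrt(g * depth)."""
    return math.sqrt(g * depth_m)

def travel_time_hours(distance_km, depth_m):
    """Travel time assuming a constant ocean depth along the path."""
    speed_kmh = tsunami_speed_ms(depth_m) * 3.6  # m/s -> km/h
    return distance_km / speed_kmh

# A Kermadec-source example: roughly 1000 km of ocean averaging ~4000 m deep
print(round(travel_time_hours(1000, 4000), 1))  # ~1.4 hours
```

Note how strongly depth matters: the same wave slows dramatically (and steepens) as it reaches shallow coastal water, which is why detailed coastal forecasts need real bathymetry rather than a constant-depth estimate like this one.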

Figure 1. A scene from within the Pacific Tsunami Warning Centre (PTWC) in Hawaii. PTWC currently provides tsunami advice on ocean-wide tsunami to all countries in the region (including New Zealand) using internationally available seismic and sea level data. The alerts provided by PTWC come around 10 minutes after a tsunami-generating earthquake so can only be used for distant and regional source tsunami warning.
Landslide Potential ...
Landslide potential is site specific, but forecasts can be based on rainfall rates, earthquake shaking and volcanic activity (lahars and other forms of debris flow), with the severity of likely landslides then reported. This is an area of active research which is likely to bear fruit in the next decade.


Earthquake Early Warning....
Earthquake early warning, on the other hand, although already operational in places like Japan, is probably a lower priority for New Zealand because of the very short warning times, marginal outcome improvements, and the much higher requirements for robustness and sensor density. Earthquake early warning is fundamentally different to the other early warning capabilities discussed above. We cannot predict the location or size of future earthquakes; we can only detect the start of an earthquake near where it ruptures and warn at a distance, because seismic waves travel more slowly than electronic signals. Earthquake early warning times are therefore measured in seconds, unlike the tens of minutes to hours or days possible for the other perils discussed above. New Zealanders live on top of our earthquakes! And we often have earthquakes in unexpected places, making earthquake early warning very difficult. The Alpine Fault is the only structure in New Zealand where someday we may be able to deploy a cost-effective earthquake early warning system.
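The seconds-scale warning times can be sketched from wave speeds alone. Detection requires the fast P-wave to reach a nearby sensor plus some processing time; the warning is the gap before the slower, damaging S-waves arrive at the city. All numbers below are illustrative assumptions (typical crustal wave speeds and a guessed detection delay), not the parameters of any operational system:

```python
P_WAVE_KM_S = 6.0   # fast P-waves used for detection (assumed typical crustal speed)
S_WAVE_KM_S = 3.5   # slower S-waves carry most of the damaging shaking

def warning_seconds(city_distance_km, nearest_station_km=20.0, processing_s=3.0):
    """Seconds of warning a city gets before strong S-wave shaking arrives.
    Detection needs the P-wave to reach a nearby station, plus processing time.
    All numbers are illustrative assumptions, not GeoNet parameters."""
    detect = nearest_station_km / P_WAVE_KM_S + processing_s
    s_arrival = city_distance_km / S_WAVE_KM_S
    return max(0.0, s_arrival - detect)

print(round(warning_seconds(100), 1))  # a city 100 km from the rupture: ~22 s
print(warning_seconds(20))             # 20 km away: 0.0 - too close for any warning
```

The second case is the New Zealand problem in a nutshell: when the fault runs under or right beside the community at risk, the S-waves arrive before an alert can even be issued. Only a distant, known source like the Alpine Fault offers a usable gap.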

Resources and Priorities....
For GeoNet to take on a leading role in event forecasting and early warning would require considerably more resources. Such an undertaking would be a step up in capability (people, expertise and resources) at least as large as when GeoNet was established in 2001. There is a compelling case for the establishment of a New Zealand tsunami early warning system for near-regional and local source events. But GeoNet can only provide part of the solution. Education, evacuation zone and route planning and a very effective public alerting system are also requirements for an effective end-to-end tsunami warning system.

Wednesday, February 18, 2015

The Future of GeoNet Revisited - Part 3: Impact Reporting


In my last blog I talked about the GeoNet Community - the large and growing group of people who rely on, use and are interested in GeoNet, our operations, data and other outputs. In this blog I will introduce the first of two fundamental changes I see happening to GeoNet and our community over the next decade.

Potential Impact reporting ....
Our vision is that GeoNet will be able to provide near-real-time potential impact reporting not just for earthquakes, but also for volcanic eruptions, tsunami impacts and landslide potential. The potential impact reports can then feed directly into systems designed to estimate the likely levels of damage given the people and infrastructure at risk. This is a major move from event reporting (where, when, how big) to impact reporting (what will be the likely effects where people or infrastructure reside). This reporting will use instrumental data, community reporting (citizen science) and effective modelling.

If we consider earthquakes, then felt intensity is a form of impact reporting. The magnitude of an earthquake estimates the physical size of the event where it ruptured – whereas the intensity relates to its impact on people, landforms, buildings and infrastructure. So reporting an earthquake location, depth and size is event reporting, but providing intensity estimates at multiple locations where people live and work is impact reporting. For a volcano, stating it has erupted is event reporting, but giving ballistic and ash fall damage estimates is impact reporting. You get the picture.

So why are we not doing impact reporting now? In short, because it is hard to get it right! Let's consider earthquakes (again); being a seismologist, I find them easier. If we had enough sensors, we could just use the sensor network to give an accurate estimate of the shaking level where you are for any earthquake. But the nearest sensor to you may be tens of kilometres away, so we have to make an estimate based on the earthquake location, size and depth, the ground type below where you are, and various other factors (such as whether you are in a multi-storey building). If you are lucky there may be a sensor near where you are, but it may be on different ground (soft rather than hard rock, for example). We need either a huge increase in the number of sensor sites, or we can use known science to estimate the likely felt intensity anywhere. We can also supplement the physical measurements and modelling with reports from people, as explained in my last blog.
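The "known science" here is a ground-motion model: predicted shaking grows with magnitude, falls off with distance, and is scaled by local site conditions. The sketch below uses a toy relation with made-up coefficients purely to show the shape of the calculation; it is not one of the published ground-motion prediction equations that real systems rely on:

```python
import math

def estimated_pga_g(magnitude, distance_km, site_amp=1.0):
    """Toy ground-motion relation: peak ground acceleration (as a fraction
    of g) increases with magnitude and decays with distance; site_amp
    scales for local ground (soft soil amplifies, hard rock does not).
    Coefficients are purely illustrative, not a published relation."""
    log_pga = -1.8 + 0.5 * magnitude - 1.3 * math.log10(distance_km + 10.0)
    return (10 ** log_pga) * site_amp

# An M 6.0 event: strong shaking near the epicentre, weak far away
print(round(estimated_pga_g(6.0, 10), 2))    # ~0.32 g near the rupture
print(round(estimated_pga_g(6.0, 100), 3))   # ~0.035 g at 100 km
```

The two printed values capture the Wilberforce situation described in the next section: distant instruments record only a few percent of g while the model infers much stronger shaking at the (uninstrumented) epicentre.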

ShakeMap NZ ....
The approach taken by the USGS ShakeMap, which we are in the process of implementing in New Zealand, is to use modelling together with all available data. For example, Figure 1 shows the ShakeMap for the most recent large earthquake in New Zealand, the M 6.0 Wilberforce earthquake of 6 January 2015. In this case the nearest strong motion stations were a long distance from where the earthquake was centred, so the maximum recorded shaking was less than 5% of the force of gravity. However, ShakeMap estimated shaking levels of more than 20% of the force of gravity near the epicentre.



Figure 1. ShakeMap of the Wilberforce earthquake of 6 January 2015 showing shaking intensity at the surface. The maximum accelerations indicated in the yellow and orange zones (around 0.2 g) could potentially have caused minor damage if the location was not so remote.
This shows both the strength and the weakness of ShakeMap: it gives us an estimate of the maximum shaking levels, but we cannot confirm this value because we have no nearby instruments (see Figure 2). This event was also originally mis-located because of the influence of a small foreshock in a similar location, which confused the automatic location system (an issue not immediately identified by the Duty Officer). Because ShakeMap requires the location, depth and magnitude, as well as any actual shaking data, to estimate the overall pattern of shaking, the mis-location also caused the shaking pattern to be wrongly estimated. This was not a big problem in this case because of the earthquake's remoteness from population centres, but it would not have happened if we had more sensors in the region. The remoteness of the location also meant we had few felt reports from close to the earthquake.


Figure 2. The strong motion recordings for the Wilberforce earthquake of 6 January 2015. Note that the indicator bars represent 10% of the force of gravity and that there are no recordings close to the earthquake epicentre.
The best choice we have is to improve our models of shaking AND to increase the number of sensors over time, as I advocated in my original GeoNet technological blog series. Improving our knowledge of shaking requires more data on the earthquake source, on the effects of the earth the earthquake waves travel through, and on the near-surface damping and amplification effects near where the shaking information is required. In many ways, putting in more sensors is easier!

By providing improved potential impact reporting outputs like ShakeMap directly into systems that provide damage and harm estimates, GeoNet can make a major positive difference. In the modern world this is becoming more and more important, making this development essential for the future of GeoNet and our community.

Future Event Scenarios
Before I move on to forecasting (or early warning) let's consider another step on that road. For recent volcano and earthquake events GeoNet has published a short list of what the most likely future scenarios may be along with the probabilities (chances) of what may happen. For example, for the Wilberforce earthquake discussed above we estimated that a normal aftershock sequence was by far the most likely future scenario, but other possibilities could not be totally ignored. In future GeoNet will provide this information following all geohazards events.


Tuesday, January 13, 2015

The Future of GeoNet Revisited - Part 2: The GeoNet Community

In my last blog in this series I summarised where GeoNet is at, and indicated I would explain what GeoNet can offer in future in coming blogs. In my GeoNet 2023 blog series (Part 1, Part 2, Part 3) I outlined what GeoNet may become, but from a largely technological point of view. The technological advances are important, but what about the GeoNet impact on how society plans, makes decisions, and responds to geological hazards events?

Fundamental changes ....
The fundamental changes I see for GeoNet over the next 10 years are the move from event to impact reporting, a greater emphasis on early warning and forecasting, and much more two-way communications with our community. This process has started, but has a long way to go and much of the progress will come from the research currently being carried out using GeoNet data, and as an extension to our current citizen science and social media initiatives. In this blog I will concentrate on the developing GeoNet Community (note the capital C), before moving on to impact reporting, and early warning in forthcoming blogs.

The GeoNet Community ....
What do I mean by the GeoNet Community? Before the start of the Canterbury earthquakes GeoNet had a small following via our website and social media. We were essentially a data collection system providing the raw material for researchers and information on events for the emergency management sector and the small number of interested people in the wider community. That changed on 4 September 2010 when suddenly GeoNet became a critical source of information about what Cantabrians were experiencing. Other websites sprang up, taking feeds from the freely available GeoNet data but presenting them in different ways. The GeoNet open data policy, which our funders, the Earthquake Commission (EQC), insisted on, was crucial to this process: data and information were available when people needed them. The people of Canterbury took to filling in GeoNet felt reports in great numbers during the extended earthquake sequence. The GeoNet Facebook site became very popular - people wanted to report and discuss what they were experiencing. And a large number of people became avid followers of our work and the information we provide. In short, the GeoNet Community was born!

By using social media we can debunk myths and rumours quickly, and over time people teach each other about our hazards and how to use GeoNet's facilities. The two-way conversation also lets us "feel the pulse" and react to what people say they need from us. But social media is not a 9-to-5 activity; studies show that peak interaction occurs in the evening, and of course straight after anything significant. Only with the help of the GeoNet Community does keeping the conversation going around the clock become possible.

Modern technology means the two way conversation with the GeoNet Community is available anywhere. Smartphone apps inform people no matter their location. Whether it is the shaking of an earthquake or a change in one of our volcanoes' behaviour, people expect to know within minutes. GeoNet can be literally everywhere in a connected world.

The updated GeoNet Quake app (available for iOS and Android).

The GeoNet Community in Future: Civil Defence, Lifelines and Decision Makers ....
Through engagement with EQC, Civil Defence organisations, local and central government, lifelines, and other sectors, GeoNet will continue to introduce new methods of providing information to end users in more useful and understandable ways. This will include the continual improvement of the depth and usability of the public information on the GeoNet website and via mobile devices, which teachers use in the classroom and members of the community use to keep informed. The major aim is for GeoNet data and information to play a significant role in education, policy, planning and decision making. But the important change in future will be improvements in two-way communications, enabled through technology and personal interaction.

The GeoNet Community in Future: Citizen Scientists ....
In future we will develop our apps to let people talk back to us. We propose extending the very successful felt reporting system already operated by GeoNet (on the website) to other platforms and other perils. We have already successfully encouraged people to “dob in a landslide” after the Eketahuna earthquake of January 2014. People now expect to be able to help us wherever they are by reporting geological phenomena from their mobile devices, together with pictures. Given the resources, we can extend our citizen science initiatives to include opportunities for local people to volunteer to work with scientists in their studies, collecting scientifically valuable information, or for schools and communities to “adopt an instrument”, allowing them to participate in our work. Effective two-way communication between GeoNet and the community will be critical to raising awareness of our geological hazards and how we can prepare for and respond to them.

The GeoNet Community in Future: Scientists and Engineers ....
The widening of the GeoNet Community will see more scientists and engineers invited into technical conversations more regularly. We propose to develop mechanisms for consultation with the GeoNet Community between the major four-yearly review cycles. While the current GeoNet technical committees are largely operational in nature, we envisage conversations on the longer-term direction of GeoNet, while acknowledging that the governance of GeoNet is the joint responsibility of EQC and GNS Science.

We will continue and extend our efforts to help researchers make effective use of the large quantities of GeoNet data - improving New Zealanders' understanding of geological hazards and helping to target future GeoNet initiatives. Given the resources, this will include providing cloud-based computing, data archives and training to facilitate the easy use of GeoNet data for research.

The GeoNet "New Voices" workshop, November 2014

The GeoNet Community in Future: International ....
New Zealand benefits from the existing deep contacts between GeoNet and our colleagues in other countries. This helps us maintain best practice and keep abreast of emerging monitoring technologies and research. Further, New Zealand contributes as an international citizen in areas where cooperation is vital to the outcome, such as tsunami response. An example is my current role chairing the Pacific Tsunami Warning and Mitigation System. Our vision is that GeoNet will continue to enhance our high profile internationally.

The GeoNet Community Hub ....
GeoNet has the potential to be the “one-stop-shop” for both the collection and distribution of data and information on New Zealand’s hazards environment. This would be an extension of the current GeoNet Community and a part of the citizen science initiative and planned science experiments, allowing the community to contribute and share data, information and observations on events and the planning for events. It would be planned and undertaken in consultation with EQC, Civil Defence, science organisations and other key players. Emergency managers, planners, insurers, researchers and decision makers will then have quick access to all the data and information needed to improve the preparation, response and recovery from natural events.



Tuesday, December 16, 2014

Reflections on the Boxing Day Tsunami

To say that the Boxing Day Tsunami had a huge impact is an understatement. It affected the world, the science community and me. My science training had not prepared me for the sheer devastation the earthquake and tsunami caused across 19 countries. New Zealand was not immune either; we lost seven Kiwis that day. The total loss of life and damage is beyond comprehension. Today I reflect on the lessons we have learned as a science community to help ensure loss of life and damage on that massive scale does not occur again.

What we couldn’t appreciate at the time was that the Boxing Day tsunami was the start of a decade of deadly and destructive tsunami. These include the 2007 Solomon Islands (Gizo), 2009 Samoan Islands, 2010 Chilean and 2013 Solomon Islands (Temotu) tsunami and, the largest of all in the Pacific, the 2011 Japan tsunami. All these events demonstrated the massive power of the mega-earthquakes which hugely displace the sea floor and the sea water above, causing tsunami.

So what have we learned in this decade of tsunami? Globally, we learned that we needed a much better tsunami warning capability. Before the 2004 Boxing Day tsunami, the Pacific Tsunami Warning and Mitigation System (PTWS) was the only tsunami warning system on the planet, but it is now one of four globally covering the world’s oceans. So at least we have learned that lesson.

In New Zealand, we have changed our tsunami warning system considerably. We now use tsunami forecast models to establish the potential threat in pre-defined coastal zones and issue this information in map and text formats. The threat levels can be used to inform evacuation decisions based on planned evacuation zones and routes. GNS Science acts as the science advisor to the Ministry of Civil Defence and Emergency Management (MCDEM), employing forecast models and the expert knowledge of the “Tsunami Experts Panel”, a group of New Zealand based tsunami scientists.

We also updated the science and technology in the wider Pacific; on 1 October this year PTWS improved its tsunami warning capability using similar techniques to those we currently employ in New Zealand. Now the Pacific Tsunami Warning Centre in Hawaii (the operational centre of PTWS) sends pictorial and text messages to member countries based on tsunami forecast models and the expected impacts on coastlines. This replaces messaging based solely on the size and location of possible tsunami-generating earthquakes.

We have done a lot of work in the last decade. But here is what keeps me awake at night: we still rely totally on natural warnings (feeling high levels of, or long lasting, shaking, and unusual sea behaviour) for local-source tsunami warning. These are the tsunami caused by earthquakes or triggered undersea landslides near our coast. And there are some situations in New Zealand where a tsunami-causing earthquake may not be felt strongly, leaving a potential gap in our tsunami warning strategy. On the east coast of the North Island we have a huge fault (the subduction zone) where the Pacific tectonic plate meets and is pushed down below the Australian plate. This is similar to the tectonic situation off the coast of Japan. Many earthquake types can happen in this process, including “slow” earthquakes which will not be strongly felt. And further north of New Zealand, a very large earthquake could send a tsunami towards the cities and townships of the upper North Island without high levels of shaking being felt on land (see the 2013 GNS Science tsunami hazard update). Two “slow” earthquakes in 1947 caused tsunami which deposited seaweed in power lines and damaged buildings in small communities north of Gisborne, but thankfully caused no loss of life.

Work continues on the science and technology necessary to provide official warnings for these local events in New Zealand, which may provide minutes to tens of minutes of warning. Watch this space!

I realise this sounds like a very “doomsday” scenario. And we haven’t been affected by a tsunami like this for a long time. But I’d like to see a local-source tsunami warning capability piloted here in New Zealand. I’m a realist and know the amount of resources required to make this happen. But the Boxing Day tsunami taught me that the seemingly impossible can happen. We are more ready than we were in 2004. But we need to be even more ready than we currently are.

The most fundamental lesson we’ve learned though isn’t about science. It’s that people’s direct actions matter. The day may come when we have all the scientific systems set up, but we will always need to rely on ourselves and each other. If you feel a long or strong earthquake on the coast, evacuate immediately. Here is the best advice about evacuating during a tsunami (from the MCDEM):

· Take your getaway kit with you if possible. Do not travel into the areas at risk to get your kit or belongings.
· Take your pets with you if you can do so safely.
· Move immediately to the nearest higher ground, or as far inland as you can. If evacuation maps are present, follow the routes shown.
· Walk or bike if possible and drive only if essential. If driving, keep going once you are well outside the evacuation zone to allow room for others behind you.
· If you cannot escape the tsunami, go to an upper storey of a sturdy building or climb onto a roof or up a tree, or grab a floating object and hang on until help arrives.
· Boats are usually safer in water deeper than 20 metres than if they are on the shore. Move boats out to sea only if there is time and it is safe to do so.
· Never go to the shore to watch for a tsunami. Stay away from at-risk areas until the official all-clear is given.
· Listen to your local radio stations as emergency management officials will be broadcasting the most appropriate advice for your community and situation.

Sunday, November 16, 2014

The Future of GeoNet Revisited - Part 1

Recently a reader of this blog asked me what more GeoNet would be able to do in 10 years’ time. At first I thought - what is he talking about? I answered that question in the GeoNet 2023 blog series (Part 1, Part 2, Part 3), didn’t I? But he wasn’t asking about the technical details I had outlined; he meant what more GeoNet would be contributing to the wellbeing of New Zealanders and the wider world community. Or in current terms - how would GeoNet be helping to make communities more resilient (now that resilience is the new black, or is that orange)?

In 2001 GeoNet was brand new, and to me it still has much development ahead. But with a history approaching 15 years, we have to ask - what has been GeoNet’s major contribution, and where can we contribute more?

Nature was kind to GeoNet giving us all those years up to 2009 to develop the system before the largest and most prolonged series of geological hazards events in more than 80 years started. The period of “peace time” (for GeoNet and New Zealand) ended in 2009:


During the period from 2009 until recently GeoNet transitioned from being a fast growing sensor network using many of the techniques of data handling and delivery developed earlier in the 2000s, to a powerful resource for emergency responders, scientists, engineers, the media and public. We embraced social media, mobile technology and our mission to inform.  We upgraded our earthquake analysis system while “under fire” from continuing Canterbury aftershocks, and continuously redeveloped our website and information delivery systems to cope with ever increasing load.

We became an example of a New Zealand high technology project which not only did not fail (almost an oxymoron), but also became an important part of the lives of many New Zealanders. And we did this within a fixed but flexible budget (and with the blessing of our sponsors, the Earthquake Commission - EQC) and without increasing staff numbers (in fact with a small reduction in total staff).

Our success has been highlighted by IT awards, and has been acknowledged by review panels and studies. For example the 2012 GeoNet Strategic Review panel concluded:

“GNS Science and EQC have worked together to develop a long-term, high-trust, mutually beneficial partnership. Together in GeoNet they have created a gem – a brilliant example of government agencies collaborating effectively together to create public value”

And similarly, to quote the recent EQC commissioned New Zealand Institute of Economic Research (NZIER) report "The value of information on natural hazards":

“GeoNet is now at the hub of a wider community of practice of researchers and users that extends well beyond GNS and EQC. This wider network, which GeoNet has enabled, has yielded direct but unforeseen benefits to New Zealand. For example, because of the quality of the GeoNet data infrastructure, New Zealand is able to leverage others research spending. Other geological agencies are doing detailed work in New Zealand. As one respondent observed ‘New Zealand is now the global geo-hazard laboratory for the world’”.

I believe we have achieved success because of our belief that what we do is important; this underpinned our dedication to providing data, information and insight to help New Zealanders respond to the unprecedented series of events we were facing.

But we achieved the required performance by delaying some equipment installations and replacements and redirecting resources, and sometimes by stopping doing some tasks and delivering some services. And often we did not introduce new products and services even when we knew they were or would soon be needed. This has left us in catch up mode, meaning sustaining GeoNet’s current level of performance and making sure data and information are made easily available must be one of our primary goals.

In the GeoNet 2023 blogs I was concentrating on the technology (one of my BIG interests), but in the next blog I will turn my attention to a more holistic view of how GeoNet can contribute even more in the future. 

Wednesday, December 11, 2013

GeoNet 2023 Part 3: The way ahead

Before I start, I would like to point out that forecasting the future is difficult, particularly when it concerns technology and it is likely to lead to a BIG fail. This was expressed very well by Niels Bohr who said (and yes, I know there is dispute about who said this first):

“Prediction is very difficult – especially if it is about the future.”

In previous blogs (Part 1 and Part 2 of this series) I have shown that we have sometimes got it right in the past, so if I restrict myself to the future of GeoNet, perhaps I will increase my chances. So here goes - what will GeoNet (or what GeoNet becomes) look like in 2023?

Sensor networks 2023 ….
I expect sensor site numbers to explode in the coming decade as a whole series of technological advances come together. The number of sensors will increase by at least an order of magnitude, meaning GeoNet in 2023 will have around 6000 sensor sites available. This sounds far-fetched, but remember how few real-time sensor feeds we had 10 years ago.

What will bring about this change? I expect the same technological advances which have revolutionised computing and data communications will finally start making their mark on sensor technology. This has been slow to happen, but it will. The trick is to increase the density (number of sensors) while at least maintaining the measurement accuracy. Previous proposals for increased sensor coverage have advocated more but lower quality sensors. What I envisage is a world where just about everything (position, strain, temperature, pressure, chemistry, shaking level, etc.) can be measured to a high level of accuracy.

Where will all these new sensors come from? The answer is from an extension of existing and yet to be utilised techniques. For example, sensors for measuring temperature and pressure can use the changes in the properties of fibre optic cable lengths and rings. Micro-electro-mechanical systems (MEMS) technology has come a long way in the last decade. We all have MEMS in our smartphones and tablets to tell the device which way is up (its orientation). These are low accuracy devices but very good ones exist and are improving all the time. These are already used in some of the strong shaking instruments we use (see the CUSP instruments). Price is the current barrier to widespread use of high accuracy MEMS sensors in very large numbers.

Consider the recent improvements in GPS technology. Again we all have GPS receivers in our smartphones and tablets. Expect the accuracy of GPS devices to increase with time and become part of multi-sensor platforms. In many respects our current smartphones have much of the technology required to act as sensor platforms, although they do not yet have the necessary sensor accuracy.

And I have not even mentioned nanotechnology yet! Nanotechnology is the manipulation of matter on an atomic and molecular scale. This technology is already starting to produce very small sensors, and this trend is likely to continue. In some ways it is an extension of MEMS technology, but much smaller. The impact of nanotechnology on sensor technology is very hard to predict!

One of the real barriers to very good sensor coverage of New Zealand is the sea that surrounds us. It would be so much easier to locate earthquakes and monitor tsunami if we had sensors on the seabed surrounding New Zealand. The problem is that such sensors are currently very (very) expensive to install and maintain. But imagine if they were installed as part of the data communications infrastructure which connects different parts of New Zealand and other countries. An international collaboration I am involved in, a joint undertaking between United Nations organisations, scientific institutions and commercial companies, is investigating the use of submarine cables as instrument platforms for environmental and hazards monitoring. Cables capable of carrying sensors (usually assumed to be at repeater sites; see Figure 1) are called green undersea cables. It is early days, but the requirements for low data latency (not available with most satellite technology) and route diversity will drive terrestrial solutions, so it is likely that many more submarine cables will be installed in coming years. If these cables are utilised for sensor deployment we will end up with a huge number of sensors covering the world’s oceans.

Figure 1: A map of submarine cable routes. Submarine cable repeaters (blue dots) are along the cables although the total number is about four times those shown (40 to 150 km apart). A typical transpacific cable has about 200 repeaters. Current tsunami buoys and other ocean observatories are also plotted. The figure is from an ITU report.
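The repeater numbers in the caption can be sanity-checked with simple arithmetic. The repeater count and spacing range are taken from the caption; the transpacific distance of roughly 9,000 to 12,000 km is my own assumption for comparison:

```python
# Sanity check: repeater count x spacing should roughly match a Pacific crossing.
repeaters = 200                  # typical transpacific cable (from the caption)
spacing_km = (40, 150)           # repeater spacing range (from the caption)

min_len = repeaters * spacing_km[0]   # shortest implied cable length
max_len = repeaters * spacing_km[1]   # longest implied cable length
print(f"Implied cable length: {min_len:,}-{max_len:,} km")
```

A transpacific route of roughly 9,000 to 12,000 km sits towards the lower end of this range, suggesting the long ocean crossings use the closer repeater spacings.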

Data communications 2023 ….
This is both the hardest and easiest capability to predict. If the past predicts the future, then data bandwidth will not be a problem for GeoNet in 2023. Predicting exactly how bandwidth will be made available to move the huge amount (by today's standards) of data collected by GeoNet 2023 is difficult. But our data volumes will be tiny compared to super high density 3D video (and virtual reality I assume, having read far too much science fiction). The "last mile" problem will be solved by current rural broadband initiatives and satellite technologies. So I will leave it at that, assuming there will be ample bandwidth available “somehow” for GeoNet in 2023!

GeoNet data 2023 ….
Everything will be in the cloud. The GeoNet data centre will be distributed and very resistant to geological hazards and equipment failures. It will reconfigure automatically and move data and processing capability and capacity around as required. The volumes of data collected each day will be orders of magnitude greater than today, but all data will still be online and easily accessed. The data archive and delivery will come from somewhere in the cloud electronically close to you. And the way it is delivered will be very configurable.

GeoNet outputs 2023 ….
By 2023 GeoNet will be providing very fast impact reports following geological events to a large number of stakeholders as well as the public and media. Much more background will be provided for events, and many new ways to visualise GeoNet data and information will be available in 2023. We have started to move in this direction by reporting likely felt intensity rather than just magnitude for earthquakes.

It will be a very mobile world – almost all data and information delivery will be to mobile devices but these will be closely connected to the cloud. With data, information and compute capability existing in the cloud, the distinction between mobile and fixed devices (like this computer I am typing these words into) will have little meaning. By 2023 GeoNet will be providing not only the data to researchers, but tailored compute capability to allow very detailed data analysis and modelling electronically close to users.

Summary ….
Overall the development of GeoNet will continue to parallel that of computer and data communications technology. But additionally, expect to see a huge increase in the number of sensors and in the usability of sensor technology.

That's it from me in 2013. Now all I have to do is live long enough to see what happens!

Tuesday, December 3, 2013

GeoNet 2023 Part 2: The here and now

Before launching into what GeoNet may look like in 2023, I will briefly review where we are at now and try to answer the question – is the past a good predictor of the future?

What is GeoNet?
GeoNet is New Zealand’s geological hazards monitoring system – we monitor earthquakes, volcanic activity, tsunami and land stability. As well as monitoring these hazards, GeoNet collects high quality data for research which will lead to better knowledge and therefore mitigation of our geological hazards.

GeoNet can also be viewed as a large, distributed data collection, processing, archiving and delivery system. It comprises sensor networks, processing and archiving capability, and data and information delivery functions.

And yet another way to look at GeoNet – it is a New Zealand high technology project that made good!

GeoNet networks ….
GeoNet operates a network of over 600 sensor sites throughout New Zealand, connected by a variety of data communication systems (satellite, radio, landline and mobile) which form a huge computer network. The approximate breakdown of sensor types is:

  • 180 “weak motion” earthquake recorders (both National and Regional networks of sensors) to locate earthquakes
  • 180 continuous GPS sites which record how the land deforms slowly and during earthquakes
  • 250+ “strong motion” sensors which record the shaking levels in felt earthquakes, including sensors in buildings and on bridges
  • 17 tsunami gauge sites to record sea level change caused by tsunami
  • Plus a variety of other sensors to record position, chemistry, water levels and temperatures for volcano and landslide monitoring
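The breakdown above can be tallied as a quick sanity check. The counts are taken from the list; the "250+" is treated as 250, and the miscellaneous volcano and landslide sensors are left out, so the total is a lower bound:

```python
# Approximate GeoNet sensor site counts by type, from the list above.
sensor_sites = {
    "weak_motion_seismographs": 180,
    "continuous_gps": 180,
    "strong_motion_recorders": 250,   # "250+" treated as a lower bound
    "tsunami_gauges": 17,
}

total = sum(sensor_sites.values())
print(f"Total counted sites: {total}")  # 627, consistent with "over 600"
```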
The big changes in the GeoNet sensor networks have been in the way we move data around the country. The fundamentals of the sensor and data recording technology have not changed much, but with the spread of the Internet our ability to move data has grown. In 2001 many places required expensive satellite data communications, but this situation is improving fast. The spread of the Internet was predictable and has paralleled the growth of GeoNet.

GeoNet data ….
The data from the sensor networks feeds into GeoNet’s distributed data centre system. When GeoNet began in July 2001 our plan was to have a main data centre in Wellington with a backup site at GNS Science’s Wairakei campus near Taupo. Over the last few years we have moved away from that concept to a distributed data centre model using compute capacity and storage in external and internal “clouds”. GeoNet now operates around 100 “virtual computers” which are centrally configured and managed, allowing fast rollout and quick failure replacement. GeoNet Rapid, which automatically locates New Zealand’s earthquakes, is run primarily in a cloud service in Auckland with the backup here in Wellington. The rapid availability and growth of the cloud is something I had not expected, but is now central to GeoNet operations.
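The primary/backup arrangement described above can be sketched as a simple failover pattern. The function names, event ID and return values here are purely illustrative, not GeoNet's actual implementation:

```python
# Illustrative primary/backup failover pattern - not GeoNet's actual code.
def locate_earthquake(event, sites):
    """Try each processing site in priority order; use the first that responds."""
    for site in sites:
        try:
            return site(event)          # first healthy site wins
        except ConnectionError:
            continue                    # site down - fall through to the backup
    raise RuntimeError("all processing sites unavailable")

# Hypothetical stand-ins for the Auckland (primary) and Wellington (backup) services.
def auckland(event):
    raise ConnectionError("primary cloud unreachable")

def wellington(event):
    return {"event": event, "processed_at": "wellington"}

result = locate_earthquake("2013p000000", [auckland, wellington])
print(result["processed_at"])  # wellington
```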

In the early days of GeoNet we calculated that if computer hard disk space continued to increase at the (then) current rate, we could keep all data on-line indefinitely. Currently GeoNet collects around 8 GB a day and the total archive is around 30 TB. When GeoNet started, 30 TB of online storage required robotic tape changing systems costing millions of dollars. Now I have around 10 TB of storage at home - this is one technology prediction we got right!
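A back-of-the-envelope check of those numbers, assuming the 8 GB per day rate held roughly constant (in reality the early years collected less, so the archive spans a few more calendar years than this suggests):

```python
# Rough archive arithmetic from the figures quoted above.
daily_gb = 8          # current collection rate, GB per day
archive_tb = 30       # total archive size, TB

annual_tb = daily_gb * 365 / 1024       # yearly volume at today's rate
years_at_current_rate = archive_tb / annual_tb

print(f"Annual volume at current rate: {annual_tb:.2f} TB")   # ~2.85 TB
print(f"Archive equals ~{years_at_current_rate:.1f} years at that rate")  # ~10.5
```

Roughly ten and a half years of data at today's rate, which lines up reasonably with a network running since July 2001 and growing over time.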

Figure 1: GeoNet sensor network 2013 - Seismographs (big and small red dots); Strong motion (big and small green squares); GPS (black and light blue triangles); Tsunami gauges (upside-down dark blue triangles).

GeoNet outputs ….
The data and information produced by GeoNet are delivered through the GeoNet website, which is itself a distributed system of New Zealand and international computer servers. We also have information available via our smartphone Apps (currently on Android and iOS). Via the website, it is possible to find such things as earthquake information, volcano status and the position changes happening to our GPS stations as New Zealand slowly changes shape, buckled by the slow collision between the Pacific and Australian tectonic plates. Researchers can download data on earthquake shaking, the raw data used to measure that slow deformation, and all the time-series data (waveforms) recorded by seismographs and strong motion instruments. All of the information on the sensor networks (sensor locations, types and calibrations, etc.) is available via the website so that the data can be interpreted and used correctly.
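As an illustration of the kind of machine-readable output described above, here is a sketch that parses a GeoJSON-style earthquake record. The payload shape, field names and event ID are assumed for illustration, not GeoNet's exact schema:

```python
import json

# A hypothetical GeoJSON-style quake record of the kind a hazards website
# might serve - the field names and values here are illustrative only.
record = '''{
  "type": "Feature",
  "geometry": {"type": "Point", "coordinates": [174.77, -41.29]},
  "properties": {"publicID": "2013p000000", "magnitude": 6.5, "depth": 8.0}
}'''

quake = json.loads(record)
lon, lat = quake["geometry"]["coordinates"]   # GeoJSON order is [longitude, latitude]
props = quake["properties"]
print(f"{props['publicID']}: M{props['magnitude']} at ({lat}, {lon}), {props['depth']} km deep")
```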

To demonstrate how the use of the GeoNet website has grown, let's look at the case of Dino the pink dinosaur. In the early days of GeoNet, Dino appeared in front of the White Island volcano-cam and caused the one and only complete outage of the website when "huge" numbers of admirers arrived to view him (or her?). Traffic to the site reached 10 hits per second! Today a widely felt earthquake drives traffic to 10s of thousands of hits per second.


So, is the past a good predictor of the future? Sometimes! The growth and development of GeoNet has paralleled that of the Internet and computer technology and will probably continue to do so. 

Next blog - GeoNet 2023 Part 3: The way ahead