Volume 57(1) — January 2008

Observing the climate—challenges for the 21st century

by William Wright*

Introduction

One of the great challenges facing humankind in the 21st century is how to deal with the global climate, today and in the future. Seasonal swings in climate, with their droughts, floods and storms, are responsible for major natural disasters that, at their worst, wreak death, famine, loss of livelihood, epidemics and displacement of populations, as well as vast losses of personal and State-owned belongings. On top of that, the vast consensus of reputable scientific opinion, as represented in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, states that the climate is changing and will change significantly further. In general, these changes will be for the worse, with the most severe effects likely in developing and Least Developed Countries—precisely those countries with the least ability to adapt.

The good news is that scientific progress has equipped mankind with tools that can possibly reduce adverse impacts by enabling some capacity to predict in advance what will happen so that, potentially at least, some kind of preventive action can be taken. Thus, an increased likelihood of drought can, in theory, lead to a range of timely mitigation activities: more careful management of water (e.g. increasing storage through tanks); agricultural responses, such as planting more drought-resistant strains of crops; government-level recognition that financial or other support may be required for affected communities. In terms of climate change, the development of increasingly powerful models and downscaling techniques can not only predict future climate patterns from known levels of atmospheric forcing, but should soon be able to estimate the likely impact on local rainfall and vegetation—provided there is sufficient data to train and verify the models. In short, mankind’s ability to manage the climate of the future should be well served by the lessons contained in the climate record of the past.

The operative phrase here is “the climate record of the past”. In order to develop effective adaptation strategies for the future, a reliable record of the climate of the present and past is absolutely indispensable. This is usually more challenging than simply collecting and recycling the observations taken to support operational weather forecasting, because climate has special needs that forecasting does not. In particular, climate scientists and service providers require observational records that are long and free from significant gaps, major errors and inhomogeneities. These conditions are not necessarily met by networks designed primarily for weather forecasting. It is one of the duties of today’s climate scientists, through advocacy and advice, to ensure that observational network designers recognize the special needs of the climate programme. Other challenges connected with the effective use of data are the development of capacity to analyse and interpret what the climate record is saying and the merging of historical observations with future climate projections.

A wish-list for the climate record

While it is difficult to generalize about what constitutes a sufficiently long record, one might suggest that at least 30 years of record is required at sufficient stations to represent all the major climate zones and vulnerable regions within a country. Because of the need to ensure that climate extremes are properly captured, these data would be required on at least daily time-scales. Moreover, the data must be homogeneous and accompanied by good supporting information (metadata). A rainfall time-series with a discontinuity of 15 per cent will make it harder to identify and attribute climate-change-related trends in rainfall of similar magnitude. The task would be almost impossible if metadata recording the time and cause of the discontinuity were unavailable.
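
A step change of this kind can be screened for by comparing means on either side of each candidate break point. The following is a minimal sketch with synthetic annual rainfall data and an illustrative 15 per cent threshold; operational homogeneity tests (e.g. the standard normal homogeneity test) are considerably more rigorous:

```python
# Sketch: flag a candidate discontinuity in an annual rainfall series by
# comparing the mean before and after each possible break point.
# Data and threshold are hypothetical; real homogeneity testing also
# uses reference series and significance testing.

def find_step_change(series, min_segment=5, threshold=0.15):
    """Return (index, relative_shift) of the largest mean shift, or None
    if no shift exceeds `threshold` (as a fraction of the overall mean)."""
    overall = sum(series) / len(series)
    best = None
    for k in range(min_segment, len(series) - min_segment):
        before = sum(series[:k]) / k
        after = sum(series[k:]) / (len(series) - k)
        shift = abs(after - before) / overall
        if shift >= threshold and (best is None or shift > best[1]):
            best = (k, shift)
    return best

# Synthetic series: ~1000 mm/yr, dropping roughly 15 per cent after year 10
rain = [1000, 980, 1020, 1010, 990, 1005, 995, 1015, 985, 1000,
        850, 860, 845, 855, 840, 850, 865, 845, 855, 850]
print(find_step_change(rain))
```

Without metadata, a detected break like this cannot be attributed to a station move, an instrument change or a genuine climatic shift; with metadata, the adjustment decision is straightforward.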

The time-series also needs to be free of significant gaps in the record, as these can play havoc with statistical relationships, in particular. Finally, the wider the range of variables recorded, the better the ability to monitor the climate. For many purposes, it is useful to know not only the average and extreme rainfall and temperatures for an area, but also the frequency of thunderstorms, hailstorms and frosts.
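
Before a series is used in statistical work, its gaps need to be located and their lengths assessed. A minimal sketch (hypothetical data, with `None` standing for a missing daily value):

```python
# Sketch: locate gaps (runs of missing daily values) in a record so that
# their effect on derived statistics can be assessed. Data hypothetical.

def find_gaps(series, missing=None):
    """Return a list of (start_index, length) for each run of missing values."""
    gaps, start = [], None
    for i, v in enumerate(series):
        if v is missing:
            if start is None:
                start = i
        elif start is not None:
            gaps.append((start, i - start))
            start = None
    if start is not None:
        gaps.append((start, len(series) - start))
    return gaps

daily = [1.2, 0.0, None, None, None, 4.5, None, 0.3]
print(find_gaps(daily))  # → [(2, 3), (6, 1)]
```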

Since climate variability and change are global phenomena, this record must, as far as possible, be global in extent. While good records in some countries are valuable for those countries, the ability to understand and predict the global climate as a whole is weakened if there are few or no observations in neighbouring countries. It is highly likely that our ability to predict the global climate under conditions of future global warming would be weakened by a lack of surface and upper-air observations over the Pacific Ocean region, very much the “flywheel” of the current climate system, and the home of the El Niño-Southern Oscillation phenomenon.

Unfortunately, at the very time that high-quality networks for climate monitoring and prediction purposes are most needed—and there is increasing recognition of this fact—economic factors are working in the opposite direction.

Impacts of automatic weather stations

It is a fact of life in nearly all countries that budgets for National Meteorological and Hydrological Services (NMHSs) are becoming increasingly constrained. In this environment, there is an increasing tendency to replace relatively resource-intensive manual observational networks with automated instrumentation and remote-sensing approaches. While such networks are cost-effective and have considerable benefits for the weather-forecasting community, they potentially pose a number of problems for the climate community and the overall integrity of the climate record.

This trend towards automating some (in some countries, all) of the observational network has been apparent over the last 10-15 years. It has been estimated that, in late 2006, some 23 per cent of all Regional Basic Synoptic Network stations were automatic weather stations (AWSs), with the number increasing rapidly. To be sure, AWSs have some attractive features for climate science: apart from cost-effectiveness, they provide higher-frequency data (down to one-minute observations in some cases) and hence a better ability to detect extremes; they can be deployed in remote or climatically hostile locations; they provide generally faster access to data; and they ensure consistency and objectivity in measurement. They can also serve a useful function in some kinds of quality control: for instance, when a manual observer goes on holiday, it may be possible to use daily recordings from a nearby AWS to break down a cumulative rainfall total into its constituent daily amounts.
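
The rainfall example above amounts to apportioning the cumulative total across days in proportion to the nearby AWS record. A minimal sketch, with hypothetical station data:

```python
# Sketch: apportion a multi-day cumulative rainfall total from a manual
# gauge across individual days, in proportion to daily totals recorded by
# a nearby AWS over the same period. Data and fallback rule hypothetical;
# in practice the disaggregated values would be flagged as estimated.

def disaggregate(cumulative_total, aws_daily):
    aws_sum = sum(aws_daily)
    if aws_sum == 0:
        # AWS recorded no rain: spread evenly (or refer for manual review)
        n = len(aws_daily)
        return [cumulative_total / n] * n
    return [cumulative_total * d / aws_sum for d in aws_daily]

# Observer away four days; manual gauge accumulated 40 mm.
# The nearby AWS recorded 0, 12, 18 and 6 mm on those days.
daily = disaggregate(40.0, [0.0, 12.0, 18.0, 6.0])
print(daily)
```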

On the other hand, experience in several countries has shown that AWSs can have an adverse effect on the climate record. Observed impacts have included:

  • Data losses, communication failures and inadequate back-up of data, leading to significant gaps in data continuity;
  • Inhomogeneities introduced into time-series, due partly to inadequate change management (e.g. an insufficient period of overlap between conventional observations and those from AWSs) and sometimes to poor maintenance. A recent study in Romania indicated a tendency for AWSs to overestimate minimum temperatures and underestimate maximum temperatures, while, in Australia, a change of wind sensor gave rise to discontinuities in peak wind-speeds, attracting the ire of the national standards authority;
  • Maintenance procedures sometimes generate spurious data spikes;
  • Doubts in some cases arise concerning accuracy and precision, especially of rainfall. Again, this can lead to homogeneity issues, as well as adversely influencing decisions on which significant amounts of money depend;
  • Because of the complex electronics in AWSs, maintenance requires more specialized skills than may be readily available in some countries;
  • Unless specifically equipped with special sensors, AWS deployment usually results in a loss of visual observations (such as phenomena), making it difficult to construct some kinds of climatologies or monitor trends in, say, hail-days or cloudiness.

Many of the impacts outlined above can be substantially reduced if the introduction of AWSs is accompanied by sound implementation and change-management processes and regular maintenance. Moreover, further mitigation of the problems identified can be expected as technology continues to improve: for instance, visual sensors can record some kinds of phenomena; enhanced data loggers can minimize data losses. The catch is that sound management and technological enhancements generally mean increased costs. The challenge for climate programmes everywhere will be to ensure, firstly, that the value of the climate record is recognized; and, secondly, that networks are designed in such a way that the strengths and weaknesses of conventional versus AWS measurements are complementary. The WMO Commission for Climatology (CCl) has a leadership role to play in this regard. A current activity of the CCl Expert Team on Observing Requirements and Standards for Climate is to identify a list of requirements for AWS data for climate purposes. These include not only sensor precision standards but also requirements for data back-up, extra sensors, station distribution, maintenance, etc.
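
The overlap period mentioned above can be used to quantify any systematic offset between the old and new instruments before the conventional site is closed, so that the older series can be adjusted onto the new basis. A minimal sketch with hypothetical daily maximum temperatures:

```python
# Sketch: estimate the mean offset between conventional and AWS
# measurements from a period of parallel (overlapping) observation.
# Values are hypothetical; real changeover analyses also examine
# seasonal variation in the offset, not just the overall mean.

def mean_offset(conventional, aws):
    """Mean (AWS - conventional) difference, skipping days either is missing."""
    pairs = [(c, a) for c, a in zip(conventional, aws)
             if c is not None and a is not None]
    diffs = [a - c for c, a in pairs]
    return sum(diffs) / len(diffs)

conv = [30.1, 31.4, None, 29.8, 32.0]   # manual daily maximum temperature (deg C)
aws  = [29.9, 31.1, 30.5, 29.5, 31.8]   # co-located AWS readings
print(round(mean_offset(conv, aws), 2))
```

WMO guidance generally recommends an overlap of at least a year, and preferably longer, precisely so that estimates like this are not biased by one season's conditions.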

Remote-sensing approaches are attractive for their ability to provide much greater densities of observations than is possible with conventional networks. They have particular value over oceans and sparsely populated areas. The drawbacks are that some variables are difficult to monitor, or to interpret accurately, from remotely sensed data, and that a minimum number of surface stations is needed to “ground truth” the information inferred from satellites, radar, etc.

The challenge for the future will be to integrate remote-sensing approaches with conventional and AWS ground-based networks in such a way that the overall climate record is not compromised. This will require not only a careful change-management strategy as observational systems evolve, but almost certainly the establishment of an optimal blend of observation systems. The latter stands as a significant research exercise.

Observational data management and stewardship

Having in place observational networks that can take and record regular climate observations is a necessary, but not sufficient, condition for ensuring the climate record can adequately support climate monitoring and service provision. The data must also be properly quality-controlled, archived and easily accessible. If the climate record exists largely or only in hardcopy manuscript form, it is difficult to use it to construct climatologies, develop statistical prediction schemes or utilize it in climate models. Unfortunately, the question of effective management of observational data remains a major problem in many, particularly developing, countries.

The following outlines the data-management requirements to ensure that the information contained in the observations is available for optimal use. To these must be added the imperative that countries remain willing to share their data, so that the rest of the world has access to it.

We, the delegates from 170 Member States and Territories of the World Meteorological Organization (WMO), meeting in Geneva from 4 to 26 May 1999 at the Thirteenth World Meteorological Congress, declare as follows: …

We are cognizant that, weather and climate systems do not recognize political borders and are continuously interacting. Hence, no one country can be fully self-reliant in meeting all of its requirements for meteorological services and countries need to work together in a spirit of mutual assistance and cooperation.

(extract from the Geneva Declaration of Thirteenth World Meteorological Congress)


Data rescue and digitization

Many countries have large amounts of data locked up in largely inaccessible paper formats, such as logbooks or record-sheets. Worse, in such formats, they are at heightened risk of permanent loss or damage from fire, flood, rot, theft or insect or vermin attack. For this reason, WMO in recent years has been concerned with data rescue and digitization efforts, especially in developing countries. It has been aided in such efforts by opportune funding from government agencies in certain countries. Activities typically involve securing vulnerable records against immediate loss or damage, digitizing/imaging them and/or relocating them to safer locations (including overseas) and—importantly—providing NMHS staff with training in effective record-management and archiving techniques. Various countries have supported attempts by the marine science community to rescue ships’ logs as an aid in interpreting climatic conditions over the world’s oceans: the RECLAIM project (RECovery of Logbooks And International Marine Data) is an example of such an initiative.

Database technology and archiving

In the modern era, effective data access means having the data stored in electronic formats, preferably in forms where they can be readily ingested into spreadsheets, analysis software and climate models. CCl supports these activities by recommending and supporting various data-management initiatives (e.g. WCDMP, 2005). A recent example has been the recommendation and implementation in developing and Least Developed Countries of non-proprietary data-management software, designed with the specific needs and limitations of those countries’ NMHSs in mind. The ClimSoft software package, developed under the auspices of WMO, has so far been installed in countries of the Caribbean, Africa and the Pacific, backed by training courses, tailored report formats and an online discussion group. The Australian Bureau of Meteorology has supported the Pacific “arm” of this implementation: experience has shown that training in the use of such software is much more effective if delivered in-country, rather than via workshops which may be attended by only one representative per country.

Where it has not been possible to digitize data, there is still a need to ensure paper records are stored securely, according to acceptable archival standards (e.g. in acid-free boxes, in air-conditioned rooms and with the data properly inventoried).

Quality control/assurance (QC/QA)

For data to be truly reliable, there has to be some means of diagnosing, then correcting, eliminating or at least flagging, errors. In other words, the data must be subject to some kind of quality control (QC). Once, QC was very much a hands-on activity, with NMHS staff physically checking individual data recorded in logbooks or on record sheets against a series of tests, coupled with often subjective operator experience, a dash of local topographic knowledge and a great deal of observational knowledge. With the advent of automatic high-speed computers and larger volumes of data, QC in many locations has become more automated, with much of the testing done automatically using predetermined checks (e.g. checks against climate extremes; internal consistency checks, e.g. does dewpoint exceed temperature?; unlikely temporal fluctuations; checks against neighbouring stations). With this approach, the manual operator role—if there is one—becomes confined to following up and deciding on cases flagged by the automated testing procedures. It is good QC practice to assign a quality flag indicating the reliance to be placed on the data and to keep an audit trail so that original data may be regenerated if required.
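
The automated tests described above can be sketched as follows. The limits and flag names here are illustrative only; operational systems derive range limits from local climate extremes and apply many more tests:

```python
# Sketch of automated QC tests on a single report: a range check against
# climate extremes, an internal consistency check (dewpoint must not
# exceed air temperature) and a check for an implausible jump from the
# previous report. Limits and flag names are hypothetical.

def qc_flags(obs, prev_temp=None,
             temp_limits=(-25.0, 50.0), max_jump=15.0):
    flags = []
    t, td = obs["temperature"], obs["dewpoint"]
    if not temp_limits[0] <= t <= temp_limits[1]:
        flags.append("RANGE")          # outside climate extremes
    if td > t:
        flags.append("CONSISTENCY")    # dewpoint exceeds temperature
    if prev_temp is not None and abs(t - prev_temp) > max_jump:
        flags.append("TEMPORAL")       # unlikely change since last report
    return flags or ["OK"]

print(qc_flags({"temperature": 22.5, "dewpoint": 18.0}, prev_temp=21.0))
print(qc_flags({"temperature": 12.0, "dewpoint": 14.5}, prev_temp=30.0))
```

Cases that fail would be passed to an operator for a decision, with the resulting quality flag and any correction stored alongside, not in place of, the original value.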

Clearly, the degree and type of QC will depend on various factors such as the number of stations; the variable type (in general, essential climate variables such as rainfall, temperature and humidity should receive greater attention than less critical ones); frequency of data; and, naturally, staff and computing resources within the NMHS. Some centres (e.g. the US National Climatic Data Center (NCDC)) run additional tests for homogeneity. The Expert Team on Observing Requirements and Standards for Climate, in collaboration with NCDC, is completing a revision of the Guidelines on the Quality Control of Surface Climate Data (Abbott, 1986).

An important part of the QC/QA process is to ensure that systematic or repeated errors are identified, and referred back to observational managers for investigation and rectification. Recurrent errors may reflect faulty observing sensors, poor observational practices, inadequate site or instrument maintenance or—in the case of AWSs—problems with the messaging systems (in Australia, recently, there was a software glitch that in some circumstances caused old messages to overwrite recent data). Such end-to-end quality assurance should be a primary aim for NMHSs: it is a truism that the best form of quality assurance is to ensure that the original data are as close to perfect as possible.
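
Surfacing such systematic problems can be as simple as counting QC failures per station and test over a period: combinations that recur point to a fault rather than random error. A minimal sketch with hypothetical flag records:

```python
# Sketch: group QC failures by (station, test) and report combinations
# that recur more often than a threshold, suggesting a systematic fault
# (faulty sensor, poor siting, messaging problem) to be referred back to
# observational managers. Records and threshold are hypothetical.
from collections import Counter

def recurrent_errors(flag_records, min_count=3):
    counts = Counter((r["station"], r["test"]) for r in flag_records)
    return {key: n for key, n in counts.items() if n >= min_count}

records = [
    {"station": "94868", "test": "RANGE"},
    {"station": "94868", "test": "RANGE"},
    {"station": "94868", "test": "RANGE"},
    {"station": "95936", "test": "TEMPORAL"},
]
print(recurrent_errors(records))  # → {('94868', 'RANGE'): 3}
```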

International efforts on rescue and digitization of climate records in the Mediterranean basin
Long-term, high-quality and reliable instrumental climate time-series are key information required to undertake robust and consistent assessments in order to better understand, detect, predict and respond to global climate variability and change. The benefit areas include regional climate studies and predictions, calibration of satellite data and generation of climate-quality re-analysis data, besides being a formidable and essential tool in translating climate proxy evidence into instrumental terms.

The Mediterranean region has a very long and rich history in monitoring the atmosphere, going back to the 19th century. However, despite the efforts undertaken by some National Meteorological and Hydrological Services (NMHSs) in data rescue (DARE) activities aimed at transferring historical long-term climate records from fragile media (paper forms) to new electronic media, accessible digital climate data are still mostly restricted to the second half of the 20th century, hence preventing the region from developing more accurate assessments of climate variability and change.

To address these issues, WMO, in collaboration with the National Institute of Meteorology of Spain, organized an international Workshop on Data Rescue and Digitization of Climate Records in the Mediterranean Basin in Tarragona, Spain, 28-30 November 2007. Representatives of 22 NMHSs in the region and several regional and international institutions attended the workshop and defined the way forward. The subsequent workplan includes: the establishment of a country-by-country inventory of the long-term climate records currently available in digital form (temperature, precipitation, air pressure) and of the longest and key climate records to be recovered; and the setting-up of a common Website for inventorying the currently available climate data, the potential data to be recovered on a national basis and the actions to be undertaken to develop national and regional climate data rescue activities.

Data rescue and digitization of climate records


Particular difficulties in developing countries

In the foregoing, reference was made to the problems faced by developing and Least Developed Countries in both collecting and managing climate data. Many such problems have been identified in various publications, such as the First and Second GCOS Adequacy Reports (e.g. GCOS, 2003). Apart from severe resource constraints, staff turnover may be high; there is often little opportunity to train new staff; equipment and storage facilities may be poor or limited; stations are frequently remote and hard to access due to infrastructure limitations; communications may be poor; and meteorology frequently ranks low in government priorities.

Yet, without observations from these countries, not only does it become difficult to provide the level of climate services required in-country to manage climate-related risk, it becomes difficult, if not impossible, to put together a truly global picture of the climate, its variations and its changes. It is essential that the global meteorological community addresses and helps solve the observational and data management problems in developing and Least Developed Countries.

On behalf of WMO, Mr Bui Van Duc (right), Director General of the National Meteorological Service of Viet Nam, presents a participant in a capacity-building workshop with a laptop to operate and maintain a climate database at his home institution (Workshop on CLIMSOFT Data Management System and Climate Extreme Indices, Hanoi, Viet Nam, 12 November-7 December 2007).


The Expert Team on Observing Requirements and Standards for Climate is currently putting together a series of recommendations on how to support climate observational programmes in developing and Least Developed Countries, starting with the premise that, to meet current and future needs, there must be a certain minimum number of climate stations representing key centres, distinct climate zones and, particularly, vulnerable regions and sectors. The Team will investigate what can be done to improve observational standards through, for example, improved and better-targeted training and the use of AWSs. It will attempt to provide suggestions on how to address some of the endemic problems outlined in the previous paragraphs. Some suggestions for resource mobilization will also be made, including such things as drawing on aid and climate-change funding bodies, utilizing, where possible, the assistance of private funding, and raising the profile of the NMHS in its own country.

Concluding remarks

There is widespread recognition that climate change looms as perhaps the biggest single future threat to humanity and the environment. Major global efforts will be needed to ameliorate its impacts, and the meteorological community, especially the climate community, will need to be instrumental in supporting them. This support will, in turn, need to be underpinned by an observational system designed with the needs of the climate programme in mind. The role of WMO, its Commission for Climatology and NMHSs in facilitating such a system will be crucial.

References

Abbott, P.F., 1986: Guidelines on the quality control of surface climatological data, WMO/TD No. 111, WMO, Geneva, 56 pp plus appendices (18 pp).

Global Climate Observing System (GCOS), 2003: Second report on the adequacy of the Global Observing System for Climate. GCOS Report No. 82, WMO/TD No. 1143, WMO, Geneva.

World Climate Data and Monitoring Programme (WCDMP), 2005: Report of the RA V Data Management Workshop (Melbourne, Australia, 28 November-3 December 2004). WCDMP-No. 57, WMO/TD No. 1263, WMO, Geneva, 7 pp.


* National Climate Centre, Bureau of Meteorology, Melbourne, Australia; Team Leader, WMO Commission for Climatology Expert Team on Observing Requirements and Standards for Climate
