Climate Data Management and Exchange
Information about the weather has been observed and recorded for many centuries. Early records were written as manuscripts, then collected and retained in journals dedicated to climatological information. The records included notes on extreme and sometimes catastrophic events, as well as on phenomena such as the freezing and thawing dates of rivers, lakes and seas.
With the development of instrumentation to quantify meteorological phenomena, and the dedication of observers in maintaining methodical, reliable and well-documented records, far more data were collected and needed to be retained. This need paved the way for the development of organized climate data management.
From the 1940s onwards, standardized forms and procedures gradually became more prevalent among recorders. Once National Meteorological and Hydrological Services (NMHSs) began using computer systems, these forms greatly assisted computerized data entry and, consequently, the development of computer data archives and of software specifically designed for modern climate data management.
In the 1960s and 1970s several NMHSs introduced electronic computers, and the information from many millions of punch cards (on which records had been kept since the 1940s) was gradually transferred to magnetic tape.
A major step forward occurred with the CLICOM project of the WMO World Climate Data and Monitoring Programme (WCDMP). Initiated in 1985 and since concluded, this project developed a standard for climate database management. It involved the installation of PC-based climate database software and hardware, together with a comprehensive training programme, in more than 100 NMHSs around the world. The project laid the foundations for demonstrable improvements in climate services, applications and research in these countries.
With better and easier ways to obtain and store information, more data were collected, creating greater demand for ways to manage them. In the late 1990s, the WCDMP initiated a Climate Database Management System (CDMS) project to replace CLICOM and take advantage of the latest computer technologies to meet the varied and growing data management needs. The new CDMSs offered improved data access and security, and much greater utility for users.
Today, with the Internet already delivering greatly improved data access capabilities, data management is evolving as an integral part of the WMO Information System (WIS) architecture at the national level. Better data management and storage allow easy discovery, access and retrieval of historical climate data, to the benefit of the various users of climate information and services.
Several international centres and institutions have the expertise to deal with the huge volume of climate data collected from all over the world. They develop and maintain several global data sets containing various climate elements and atmospheric parameters for easy access by users. Regional data sets are also collected, managed and maintained, primarily by WMO Regional Climate Centres (RCCs).
Reliable climate monitoring and assessment are highly precise areas of science, so a high level of accuracy and reliability is required in the underlying data sets. Constructing such high-quality data sets presents several challenges, such as missing records, changes in instruments and observing practices, and station relocations.
Fortunately, scientists have developed methods to keep the inherent data uncertainty to a minimum, ensuring the data sets are accurate enough for use in climate analyses and applications.
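One widely used idea behind such methods is relative homogeneity testing: a candidate station is compared with a stable neighbouring reference station, and a shift in the mean of their difference series suggests a non-climatic break (for example, a station move or instrument change). The sketch below illustrates only this basic idea, with entirely made-up numbers; operational methods use statistical significance tests and many reference stations.

```python
# Minimal sketch of relative homogeneity testing (illustrative only):
# a shift in the candidate-minus-reference difference series around
# index k points to a possible non-climatic break in the candidate.

def shift_at(diff, k):
    """Mean of the difference series after index k minus the mean before it."""
    before = sum(diff[:k]) / k
    after = sum(diff[k:]) / (len(diff) - k)
    return after - before

# Hypothetical annual mean temperatures (degC); the candidate jumps after year 3.
candidate = [10.1, 10.0, 10.2, 11.6, 11.5, 11.7]
reference = [10.0, 10.1, 10.1, 10.1, 10.0, 10.2]  # stable neighbour
diff = [c - r for c, r in zip(candidate, reference)]

print(round(shift_at(diff, 3), 2))  # estimated size of the suspected break
```

A large, persistent shift in the difference series (here about 1.5 degC) would prompt further investigation of the station history metadata before any adjustment is applied.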
A few international data centres develop and maintain high-quality, homogenized global data sets that combine land and sea surface data and extend from the present back more than 160 years.
The Hadley Centre of the United Kingdom (UK) is one of these international data centres. It has developed several data sets providing various climate records, including surface, upper-air and marine data sets.
An example of the use of the Hadley Centre's data sets is the analysis of global surface temperature since the start of the instrumental record in 1850. This data set is routinely updated and improved as more data are continuously added to the database.
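As a minimal illustration of how a global mean is formed from a gridded data set (a sketch of the general technique, not the Hadley Centre's actual procedure): cells on a regular latitude-longitude grid shrink towards the poles, so each value is weighted by the cosine of its latitude, and missing cells are excluded from both the sum and the total weight.

```python
import math

def global_mean_anomaly(grid):
    """Area-weighted global mean of a lat/lon anomaly grid.

    grid maps a latitude (degrees, grid-cell centre) to a list of
    anomaly values (degC), one per longitude cell at that latitude.
    Missing cells are given as None and excluded entirely.
    """
    weighted_sum = 0.0
    total_weight = 0.0
    for lat, row in grid.items():
        w = math.cos(math.radians(lat))  # cell area shrinks towards the poles
        for value in row:
            if value is None:  # missing observation: skip the cell
                continue
            weighted_sum += w * value
            total_weight += w
    return weighted_sum / total_weight

# Illustrative three-latitude toy grid (made-up values, not real data)
toy = {60.0: [0.8, None], 0.0: [0.2, 0.4], -60.0: [0.1, 0.3]}
print(round(global_mean_anomaly(toy), 3))  # prints 0.343
```

Note that the polar value 0.8 contributes only half the weight of an equatorial value, which is why the result sits well below a simple arithmetic mean of the five available cells.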
Every climate database is arranged in a certain way to facilitate the storage and retrieval of information. The way the data are arranged is called the data model. The model strongly influences the quality of the resulting system, and in particular its maintainability: an inappropriate model will tend to make the system harder to maintain.
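As an illustration, a minimal relational data model for station observations might look like the following sketch. The schema is hypothetical, not that of any particular CDMS: stations are described once, and every observation references its station, element and time, with room for a QC flag.

```python
import sqlite3

# Hypothetical minimal climate data model: one table describing stations,
# one table of observations keyed by station, element and time.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE station (
    station_id  TEXT PRIMARY KEY,  -- e.g. a WMO station identifier
    name        TEXT NOT NULL,
    latitude    REAL NOT NULL,
    longitude   REAL NOT NULL,
    elevation_m REAL
);
CREATE TABLE observation (
    station_id TEXT NOT NULL REFERENCES station(station_id),
    element    TEXT NOT NULL,      -- e.g. 'TMAX', 'PRCP'
    obs_time   TEXT NOT NULL,      -- ISO 8601 timestamp
    value      REAL,               -- NULL means missing
    qc_flag    TEXT DEFAULT 'unchecked',
    PRIMARY KEY (station_id, element, obs_time)
);
""")

# Illustrative rows (values are examples, not authoritative records)
conn.execute("INSERT INTO station VALUES ('06610', 'Payerne', 46.81, 6.94, 490)")
conn.execute("INSERT INTO observation VALUES ('06610', 'TMAX', '2020-07-01', 28.4, 'good')")

row = conn.execute("SELECT value FROM observation WHERE element = 'TMAX'").fetchone()
print(row[0])  # prints 28.4
```

Keeping station metadata in its own table means a station relocation or rename is recorded once, rather than repeated across millions of observation rows; this separation is a large part of what makes such a model maintainable.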
An important part of the quality management of an NMHS is the data Quality Control (QC) process. QC should be designed to check the quality of the whole data flow, ensuring that data are checked and, to the extent possible, error-free. Errors can creep into the data at numerous stages, including at the station site, in the instrument or sensor, and during data transmission or data entry. These errors must be detected and eliminated and, where possible, replaced by the correct values, while the original values are also retained.
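A minimal sketch of one such check, a gross range check with hypothetical plausibility limits, might look like the following; real QC chains add internal-consistency, temporal and spatial checks on top of this. As the text recommends, the original value is always retained alongside its flag rather than silently overwritten.

```python
# Hypothetical plausibility limits per element (degC for TMAX, mm/day for PRCP).
PLAUSIBLE = {"TMAX": (-80.0, 60.0), "PRCP": (0.0, 500.0)}

def range_check(element, value):
    """Gross range check: return the original value paired with a QC flag."""
    if value is None:
        return (value, "missing")
    lo, hi = PLAUSIBLE[element]
    if lo <= value <= hi:
        return (value, "pass")
    # The original value is kept; any correction is a separate, documented step.
    return (value, "suspect")

print(range_check("TMAX", 28.4))   # ('pass' flag: plausible maximum temperature)
print(range_check("TMAX", 128.4))  # ('suspect' flag: e.g. a decimal-entry error)
```

Flagging rather than deleting keeps the audit trail intact: a later analyst can see both what was reported and what QC concluded about it.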
[Technical information] about data management can be obtained through the WMO’s Climate Database Management System website.
The exchange of data between NMHSs is essential for climate monitoring and various applications. This may include both the storage and use of data (and metadata) from other countries in the database of one NMHS, and the transmission of data to Global and Regional data centres.
[More in depth information] A list of Global and Regional Data Centres can be found here.
WMO Member states are obliged to share data and metadata with other Members of WMO. Two categories of data have been identified in the relevant agreements: "essential" and "additional". A minimum set of "essential" data must be made available with "free and unrestricted access"; however, Members may place more information in the "essential" category than this minimal set.
[Technical information] The conditions under which information should be shared between Members and with third parties are set out in Resolution 40 (Cg-XII) and Resolution 25 (Cg-XIII), as reaffirmed in Executive Council resolutions.
Members of WMO volunteer a subset of their stations to be part of various international observing networks, such as the GCOS Surface Network (GSN) and the GCOS Upper-Air Network (GUAN).
Nomination of stations into these networks implies an obligation to share the data internationally using standard procedures and transmission protocols. Several data centres are linked to the WMO Global Telecommunication System (GTS) of the WMO Information System (WIS), and thereby constitute international climate data archiving centres serving various users.
[More in depth information] A monthly set of global climate data is kept at the National Climatic Data Center.
|© World Meteorological Organization, 7bis, avenue de la Paix, Case postale No. 2300, CH-1211 Geneva 2, Switzerland