12. QUALITY CONTROL AND ASSURANCE
12.1 Information transfer to Agency
The Topic Centre on Catalogue of Data Sources is currently working on many of these aspects of the environmental information network. For example, there must be a common language for determinants, sampled media and units, usually codified in a data dictionary. For water quality and quantity information there will be a requirement for aggregated data, rather than raw data, to be transferred to the Agency. As well as specifying codes, formats etc. for transfer, the Agency will have to specify the type of information. For example, monitoring information on water should contain site means, standard errors, confidence limits, maxima, minima and percentiles. In this way the variability and validity of spatial and temporal comparisons can be assessed and quantified. Details of analytical procedures, methods, limits of detection and quality control are also likely to be required. The following sections, therefore, briefly touch on some of the issues that will in time be addressed by the Agency with support from the appropriate Topic Centre(s).
12.2 Data quality control
12.2.1 Handling data
The first stage in ensuring the quality of the collected sampling data is the appropriate choice of storage format. The data should be in a form which allows access to all relevant sample details (such as date and time of sampling, grid reference, etc.), which allows the data to be easily examined for erroneous entries, and which permits the data to be divided into subsets as desired. An ideal storage medium is a database system like Microsoft Access, Borland DBase IV or the Oracle RDBMS.
As well as choosing the data format, a further requirement is that all necessary sampling information is recorded alongside the actual sample value. This is important as once the data has been entered and stored it is likely to be difficult, if not impossible, to add retrospectively the missing information. This information will be needed not only for the purposes of the monitoring scheme, but also to help validate the data.
If the data are produced in a computer readable format at the time of sampling, then the direct transfer of the data onto computer will minimise human errors caused by re-entering the data.
In order to prevent mistakes from being made whilst transferring data from one user to another, a universally agreed data transfer format should be used. One example of a standard format is ASCII files (ordinary text) with comma delimited fields and one sample value, plus other details about the sample, per line. Although this format does not make the most efficient use of space, it allows the data to be read into a new database with little or no manipulation of the transfer files. This avoids errors and saves time and money. An example of how the information might look after importing into a database is given below.
Date | Time | Grid Ref. | Sample Code | Determinant Code | Units | Sample Value
12/3/89 | 10:59 | 637098 224573 | 00102 | 623 | mg/l | 0.34
12/3/89 | 13:59 | 637098 224573 | 00103 | 079 | mg/l | 2.307
12/3/89 | 13:59 | 637098 224573 | 00103 | 623 | mg/l | 0.796
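The comma-delimited transfer format described above can be read into labelled records with very little code before loading into a database. A minimal sketch (the field names are taken from the example table; the embedded data stands in for a real transfer file):

```python
import csv
import io

# Comma-delimited ASCII transfer file: one sample value, plus the
# supporting details about the sample, per line.
transfer_data = """date,time,grid_ref,sample_code,determinant_code,units,value
12/3/89,10:59,637098 224573,00102,623,mg/l,0.34
12/3/89,13:59,637098 224573,00103,079,mg/l,2.307
12/3/89,13:59,637098 224573,00103,623,mg/l,0.796
"""

# csv.DictReader keeps each field labelled, so records can be checked
# for erroneous entries and divided into subsets as desired.
records = list(csv.DictReader(io.StringIO(transfer_data)))
print(len(records))         # number of samples read
print(records[0]["value"])  # first sample value, as text
```

Because every line carries its own field values, the file can be read into a new database with little or no manipulation, as the text notes.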
The EU and United Nations have invested considerable resources in providing efficient solutions to the problems associated with data transfer. The UN developed EDIFACT (Electronic Data Interchange For Administration, Commerce and Transport) as a worldwide standard. The EC sponsored the application of this message system to environmental data exchange through the TEDIS programme. This system standardises the information format and ensures that all of the required supporting information is sent with the message. This concept has advantages in that one common interface can be used for the transfer of data. Unfortunately, the current system falls short of current EEA requirements as it contains no data dictionary to standardise the codes associated with determinants, river sampling sites etc.
12.2.2 Detection of incorrectly entered data
The simplest form of check on entered data is to identify those values which fall outside the expected range. These apparently outlying values can then be verified, changed or discarded as appropriate. It is very important to note that data should only be discarded when they are definitely known to be incorrect. Outliers which occur due to random variation are valid values and their exclusion at this stage can bias results. Range checking methods are listed below.
Another method of quality checking is to use a statistical quality assurance scheme, in a similar way to analytical quality control. A number of data records are selected at random (with replacement) and checked for mistakes. The proportion of errors in the database is estimated from the proportion of errors in the randomly selected records, and a confidence interval for the proportion is also estimated. Quality standards are being met if the true proportion of errors is below some prescribed level with a certain level of confidence.
For example, suppose that the proportion of errors must be no more than 1% with 95% confidence. Table 12.1 below shows the one-sided 95% confidence intervals for different numbers of observed errors from 500 randomly selected records (Ellis, 1989).
Table 12.1 One-sided 95% confidence intervals for the true proportion of errors based on 500 randomly checked records
Number of errors | 1-sided 95% CI for true proportion of errors
2 | [0%, 1.3%]
1 | [0%, 0.9%]
0 | [0%, 0.6%]
As can be seen from the above table, if more than one error is observed then the quality standards are not being met and remedial action may be necessary. A disadvantage of such a statistical quality control scheme is that it can be expensive to implement.
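The one-sided bounds in Table 12.1 can be reproduced from the binomial distribution of the number of errors found in the checked records. A sketch using only the standard library (the exact, Clopper-Pearson style bound is computed by bisection; the function name is ours):

```python
import math

def upper_conf_bound(errors, n=500, alpha=0.05):
    """Smallest error proportion p such that observing `errors` or
    fewer mistakes in n randomly checked records has probability
    <= alpha. Found by bisection on the binomial CDF."""
    def cdf(p):
        return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(errors + 1))
    lo, hi = 0.0, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if cdf(mid) > alpha:
            lo = mid   # observing this few errors is still too likely
        else:
            hi = mid
    return hi

# Reproduces Table 12.1: upper bounds of roughly 0.6%, 0.9% and 1.3%
# for 0, 1 and 2 observed errors out of 500 checked records.
for x in (0, 1, 2):
    print(f"{x} errors: [0%, {upper_conf_bound(x) * 100:.1f}%]")
```

This makes the decision rule in the text concrete: with 2 errors the upper bound (1.3%) exceeds the prescribed 1% level, so the quality standard is not demonstrably met.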
12.2.3 Analytical limits of detection and missing values
An agreed system of marking sample values below or above analytical limits of detection (LoD) should be used by all parties. The best system is to include an extra field in the database to indicate the state of the sample (for example, the field could contain a minus sign for samples below the LoD, a plus sign for samples above the LoD, and a blank if the sample was normal).
A convenient way of marking a sample as missing is to replace its value with some non-numeric marker, such as an asterisk.
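The flag-field and missing-value conventions above can be applied mechanically when reading stored values back. A minimal sketch (the marker characters follow the text; the function name and record layout are ours):

```python
def parse_sample(value_field, flag_field):
    """Interpret a stored sample value together with its LoD flag:
    '-' marks a result below the limit of detection, '+' a result
    above it, and blank a normal result; an asterisk in the value
    field marks a missing sample."""
    if value_field == "*":
        return {"status": "missing", "value": None}
    status = {"-": "below LoD", "+": "above LoD", "": "normal"}[flag_field]
    return {"status": status, "value": float(value_field)}

print(parse_sample("0.34", ""))   # normal result
print(parse_sample("0.01", "-"))  # below the limit of detection
print(parse_sample("*", ""))      # missing sample
```

Keeping the flag in a separate field, as the text recommends, means the value field itself stays numeric and can still be summarised statistically.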
12.3 Analytical performance
The analytical methods described in Appendix C are the techniques commonly used in laboratories routinely analysing these determinants. This does not, however, preclude the use of other methods provided that the analytical performance can be proved to be adequate. They are typically generic methods (e.g. ICP-MS, flame photometry etc.), with most of the references being standard methods drawn up by the UK's Standing Committee of Analysts (SCA). There are of course international organisations, such as the European Committee for Standardization (CEN) and the International Organization for Standardization (ISO), producing similar standard methods which would be equally relevant.
12.4 Analytical quality control
12.4.1 Background
Analytical Quality Control (AQC) is the term used to describe the procedures adopted to ensure that analytical measurements are of adequate accuracy for their intended purpose. It is worth emphasising that, in any form of monitoring, the aim should not be to seek the ultimate achievable accuracy. The tasks are: (i) to establish sufficient control over measurement errors to allow clear and accurate interpretation; and (ii) to maintain consistency of measurement so that any temporal changes of interest can be discerned.
AQC is the principal practical component of a system of Quality Assurance. Other aspects of Quality Systems (e.g. staff training, instrument maintenance, adequate systems of records) are also important to ensure satisfactory operation of a monitoring programme. For example, it is of little consequence to achieve adequate accuracy, if samples cannot be identified clearly. However, these issues are outside the scope of this section.
12.4.2 Summary of approach to analytical quality control
The following summarises the essential features of Quality Control activities in laboratories undertaking water quality monitoring. The approach is described more fully in the European Standard guidance document "Guide to Analytical Quality Control for Water Analysis" CEN TC230 WG1 TG4, N120.
Laboratories should carry out the following procedures in sequence and obtain satisfactory results before an analytical system is used for routine analysis. The following stages should be observed:
It is emphasised that the largest part of AQC effort should be expended on (d), above. The participation on inter-laboratory tests is an important supplement to routine within-laboratory quality control, rather than a substitute for it.
12.4.3 Within-laboratory quality control
Routine quality control within a laboratory is based on the use of control charts. The laboratory must analyse a control sample at least once in each batch of analysis. The results of these control analyses are used to plot a control chart which is used to maintain the analytical system in a state of statistical control.
The control sample should be chosen such that it is subject to the same potential sources of error as samples analysed routinely. As a minimum requirement, the control sample should be a solution which contains a known concentration of determinant no greater than the level of interest. Where sample concentrations are greater than the level of interest, then additional control samples should be used to reflect sample concentrations. The type and frequency of use of control materials will depend on the analytical technique and the nature and likely sources of error which may affect results. Normally, between 5% and 20% of all samples analysed should be control samples. All control samples should be subject to the full analytical procedure. The results for all control analyses should be recorded.
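Maintaining a control chart from these recorded control analyses can be sketched as follows. The warning and action limits at ±2 and ±3 standard deviations are a common Shewhart-chart convention, assumed here rather than taken from the text, and the function names are ours:

```python
import statistics

def control_limits(control_results):
    """Derive chart limits from past control-sample results."""
    mean = statistics.mean(control_results)
    sd = statistics.stdev(control_results)
    return {"mean": mean,
            "warning": (mean - 2 * sd, mean + 2 * sd),
            "action": (mean - 3 * sd, mean + 3 * sd)}

def check_batch(result, limits):
    """Classify one control analysis from a routine batch."""
    lo_a, hi_a = limits["action"]
    lo_w, hi_w = limits["warning"]
    if not lo_a <= result <= hi_a:
        return "out of control"  # do not release the batch's results
    if not lo_w <= result <= hi_w:
        return "warning"         # investigate the analytical system
    return "in control"

# Control sample of known concentration 5.0 mg/l; results from
# earlier control analyses establish the limits.
history = [5.02, 4.97, 5.05, 4.99, 5.01, 4.96, 5.03, 5.00]
limits = control_limits(history)
print(check_batch(5.01, limits))
print(check_batch(5.60, limits))
```

A result breaching the action limits signals loss of statistical control, triggering the documented actions described below.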
Where the limit of detection is critical (e.g. for calculation of contaminant loads), duplicate blank determinations should be made in each routine batch of analyses. The limit of detection should then be re-estimated at 11-batch intervals from these measurements. Reporting limits should be based on the most recent estimate of the limit of detection.
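The re-estimation of the limit of detection from duplicate blanks might be sketched as below. The text specifies the 11-batch interval but not the estimator, so this sketch assumes a common convention: LoD = 3 × the within-batch standard deviation of the blank, with that standard deviation estimated from duplicate differences.

```python
def estimate_lod(blank_pairs):
    """Re-estimate the limit of detection from duplicate blank
    determinations collected over the last 11 batches.

    Assumed convention: LoD = 3 * within-batch sd of the blank,
    where for n duplicate pairs with differences d,
    sd = sqrt(sum(d^2) / (2 * n))."""
    n = len(blank_pairs)
    ss = sum((a - b) ** 2 for a, b in blank_pairs)
    within_batch_sd = (ss / (2 * n)) ** 0.5
    return 3 * within_batch_sd

# Duplicate blank results (mg/l) from 11 routine batches.
blanks = [(0.010, 0.013), (0.008, 0.011), (0.012, 0.009),
          (0.010, 0.010), (0.011, 0.014), (0.009, 0.012),
          (0.013, 0.010), (0.008, 0.010), (0.011, 0.011),
          (0.012, 0.014), (0.010, 0.008)]
print(f"Reporting limit: {estimate_lod(blanks):.4f} mg/l")
```

The most recent such estimate would then be used as the reporting limit, as the text requires.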
It is essential that the laboratory has adequately documented procedures which define loss of statistical control and specific actions to be taken when an out of control condition arises. Records of breaches of the control rules need to be maintained and, as a minimum, should include:
The results of analyses obtained using a system not in statistical control should not be released, except under exceptional circumstances. Any such results should be identifiable for future examination and audit. The circumstances under which such results may be released should be documented clearly and shall include the specification that the cause of the out of control condition must first be identified and shown not to affect results for the analysis of samples.
The control chart should be reviewed periodically and the control limits updated if necessary. The results of all current quality control analyses should be taken into account in calculations of performance and in updating charts, apart from out of control values for which the cause has been identified.
Unless it is agreed otherwise, the laboratory should adhere to the test protocol for an inter-laboratory exercise. Samples provided in proficiency testing schemes should be treated as far as is possible in the same way as routine samples with respect to storage, registration, analysis and reporting. Routine AQC procedures should be applied. In particular, any replication of analysis carried out as part of an inter-laboratory test should as far as is possible be 'blind'. Individual replicates need to be submitted for analysis independently and without reference to one another. No more than the specified number of determinations should be made.
Summary of approach to laboratory AQC
Laboratories should carry out the following procedures in sequence and obtain satisfactory results before any analytical system is used for routine analysis:
12.4.4 Inter-laboratory quality control
Laboratories should also participate in suitable external inter-laboratory quality control schemes involving the distribution of check samples. A sample check scheme typically entails the organising laboratory distributing samples of different matrices (e.g. fresh and salt water) and determinants (e.g. metals and organic substances) to participating laboratories. Analysis is undertaken by the participating laboratory and the results are returned to the organising laboratory. This provides a continuous check on the accuracy and comparability of analytical results obtained in the participating laboratories, and identifies the determinants for which improved accuracy is required, to which each laboratory should assign priority within its own analytical quality control work.
12.4.5 National and international quality assurance programmes
There are examples of national and international quality assurance programmes in some EEA States, and these could form the basis of assuring at least the quality of chemical data reported to the Agency.
Table 12.2 summarises the national analytical quality control programmes that were reported to be in use in 1992/93 by 12 of the 17 EEA Member States (ERM, 1993 cited in Groot and Villars, 1995). It can be seen that most countries reported having some national analytical quality control programme in place.
There may also be a need to establish international quality assurance programmes. Such programmes already exist for marine waters, for example the QUASIMEME programme, which currently supports 90 laboratories in Europe which submit data to international marine monitoring programmes (OSPARCOM, HELCOM, MEDPOL, ICES). Under Article 2 of the Agency Regulation the EEA is required to co-operate with certain organisations, such as the Joint Research Centre (JRC), on certain tasks. The JRC runs a sample check, and a reference material production and dissemination programme, AQUACON, and may, therefore, have an overseeing role in assuring the analytical quality of data submitted to the Agency.
Table 12.2 Summary of analytical quality control measures in some EEA Member States (ERM, 1993 cited in Groot and Villars, 1995)
Country | Analytical Quality Control
Belgium | Yes. Includes the use of recovery efficiency, blank samples and analytical standards.
Denmark | Yes. Internal AQC includes control charts and inter-laboratory comparisons.
France | Yes. Internal AQC, with many laboratories formalising procedures in a Quality Manual.
Germany | Yes. Internal AQC protocol including recovery checks, blank tests and use of different analytical methods for confirmation.
Greece | No. No formal AQC procedures currently established.
Ireland | Yes. Internal AQC protocol including reference standards, spiked samples and extraction efficiency tests.
Italy | Yes. Internal AQC including recovery efficiencies, blank samples and analytical standards.
Luxembourg | Yes.
Netherlands | Yes. Internal AQC protocols including control charts, reference samples for recovery checks, blank samples and inter-laboratory comparisons.
Portugal | Yes. Internal AQC including control charts and reference standards.
Spain | Yes. Internal AQC procedures applied.
UK | Yes. Internal AQC including control charts, reference standards, spiked samples, recovery efficiency tests, etc. Also participate in inter-laboratory checks, and all are externally certified.
For references, please go to https://eea.europa.eu./publications/92-9167-023-5/page015.html