Civil Engineering Systems

The three main elements of civil engineering (structures, geomechanics and water resources) are united through the idea of systems. Our research is therefore either performed jointly with other research groups or is generic to all of them. Specific research focuses on developing an understanding of uncertainty and risk in civil engineering projects so that they can be managed and controlled.

We also work closely with staff from the social science, mathematics and information technology disciplines and with industry.

Our research can be summarised under the following major headings:

Hazard & Risk Management
Vulnerability
Uncertainty Analysis
Structural Safety
Reflective Practice
Observational Engineering
Monitoring of Full Scale Structures
Life Lines
Ethics
Physical Process Modelling
Egan Initiative
Machine Learning
Coastal Engineering
Oil Exploration
Process Management
Water & Environmental Management

Uncertainty Analysis

The mathematical analysis of uncertainty has its origins in probability theory, but the fundamental limitations of that theory have led to the development of alternatives. Much of the early work on the application of fuzzy sets and fuzzy logic to civil engineering was done at Bristol. These ideas have led to a powerful yet simple theory of evidential support based on interval probability, in which an interval number is used to represent a probability measure. In classical probability theory, precision and truthlikeness are in conflict: the more precise the definition of a concept, the higher its information content but the less likely it is to be true. The interval representation of probability allows a compromise between these requirements and contains classical reliability as a special case.

The three extremes of [0,0], [1,1] and [0,1] represent the cases 'certainly false', 'certainly true' and 'don't know' respectively. Interval probability theory is thus designed for use in problems involving sparse data and incomplete and possibly inconsistent knowledge. The probability of some action effects causing damage (e.g. dead load) can be defined fairly well, but for other action effects (e.g. accidental loads) only a measure of belief can be obtained. Interval probability theory has been used successfully for the evaluation of seismic vulnerability and the damage assessment of corroded beams.
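The interval representation can be sketched in a few lines of code. The class name, the negation rule and the width measure below are assumptions made for illustration, not the published formulation of interval probability theory:

```python
# Illustrative sketch only: an interval [lo, hi] stands for bounds on the
# probability of a proposition. Names and rules here are assumptions made
# for this example, not the published theory.

from dataclasses import dataclass

@dataclass(frozen=True)
class IntervalProb:
    lo: float   # least support for the proposition
    hi: float   # greatest support (1 minus the support against it)

    def __post_init__(self):
        assert 0.0 <= self.lo <= self.hi <= 1.0, "need 0 <= lo <= hi <= 1"

    def negate(self):
        # "not A": support for and against swap roles
        return IntervalProb(1.0 - self.hi, 1.0 - self.lo)

    def width(self):
        # Residual ignorance: 1 for total ignorance, 0 for a precise probability
        return self.hi - self.lo

CERTAINLY_FALSE = IntervalProb(0.0, 0.0)
CERTAINLY_TRUE = IntervalProb(1.0, 1.0)
DONT_KNOW = IntervalProb(0.0, 1.0)
```

Note that negating total ignorance leaves it unchanged, which is exactly the behaviour a point-valued probability of 0.5 cannot express.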



Vulnerability

There is a recognised need to design structures to be robust and to avoid possible progressive collapse mechanisms. However, there is as yet no accepted theory of robustness to help structural engineers. SCOSS (the Standing Committee on Structural Safety) has identified this need, as have a number of authors. The term robustness implies strength and sturdiness in all possible limit states, regardless of any actions on the system. One insight into a lack of robustness is gained by identifying how a system is vulnerable, since this indicates where it is weakest.

The problem of identifying how a structure is weakest has been addressed at Bristol. A theory of structural vulnerability for two-dimensional structures has been developed and computer programs written. Recent work has led to a generalisation of the theory that is applicable across a wide range of systems, including water pipework and traffic flows.

Structural vulnerability theory is an innovative systems theory of the form of a structure. Its purpose is to help provide structural integrity by addressing the way in which a structure is connected together. The theory enables the form of a structure to be described so that the quality of its connectivity can be measured; this quality is called the 'well-formedness' of the structure. 'Weak links' are thus identified so that they may be redesigned or suitably protected and monitored, and particular ways in which a structure might fail can be pinpointed. The emphasis of structural vulnerability theory is not, however, the usual one of structural response analysis, where a structure is examined under specific loading conditions: in vulnerability theory the action causing damage can be anything, e.g. dead load, wind load, accidental damage or terrorist attack.
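As a crude illustration of the 'weak link' idea, a structure can be modelled as a graph of joints and members, flagging any member whose loss alone disconnects the graph. This toy sketch is not the theory's well-formedness measure, which assesses the quality of connectivity rather than bare connectedness:

```python
# Toy illustration only: joints are nodes, members are edges, and a
# "weak link" here is any member whose removal disconnects the graph.
# The actual vulnerability theory uses a richer well-formedness measure.

def connected(nodes, edges):
    """Depth-first check that every node is reachable from every other."""
    nodes = set(nodes)
    if not nodes:
        return True
    start = next(iter(nodes))
    seen, stack = {start}, [start]
    while stack:
        n = stack.pop()
        for a, b in edges:
            if a == n and b not in seen:
                seen.add(b); stack.append(b)
            elif b == n and a not in seen:
                seen.add(a); stack.append(a)
    return seen == nodes

def weak_links(nodes, edges):
    """Members whose loss alone disconnects the structure (graph bridges)."""
    return [e for e in edges
            if not connected(nodes, [f for f in edges if f != e])]
```

For example, in a frame whose members B-C, C-D and B-D form a triangle hung off a single member A-B, only A-B is flagged: the triangle provides alternative load paths, the lone member does not.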


Physical Process Modelling

The work on physical process modelling is based on an Interacting Objects Process Model (IOPM) developed at Bristol. The IOPM integrates ideas from systems thinking, software engineering, artificial intelligence and traditional engineering, and has led to an enhanced capability for simulating complex physical processes. The IOPM also provides the basis of a theory of 'appropriate physics' to tackle the integration of quantitative and qualitative knowledge.

The model, initially developed for solid mechanics problems, has undergone several enhancements over the last eight years. It has been encoded with finite element relations and has been used to analyse typical structural dynamics problems. Its structure is suitable for different computer architectures and its implementation on a massively parallel Connection Machine with 8192 processors showed speed-ups of the order of three. It has also been used to analyse the chaotic behaviour of non-linear systems. A query system maps the quantitative simulation results into qualitative form. A graphical user interface facilitates the process of querying and analysing the simulation results.
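The interacting-objects idea can be illustrated, in heavily simplified form, by a one-dimensional chain of masses coupled by springs, where each object updates its own state from information gathered from its neighbours. All names here are invented for the example and do not come from the IOPM itself:

```python
# Heavily simplified illustration of the interacting-objects idea: a 1-D
# chain of masses coupled by linear springs. Each Mass object holds only
# its own state and computes its force by querying its neighbours.
# Names are invented for this example, not taken from the IOPM.

class Mass:
    def __init__(self, m, x, k):
        self.m = m                # mass
        self.x = x                # position
        self.v = 0.0              # velocity
        self.k = k                # stiffness of the coupling springs
        self.neighbours = []      # the other objects this one interacts with

    def force(self):
        # Spring force proportional to relative displacement of each neighbour
        return sum(self.k * (n.x - self.x) for n in self.neighbours)

def simulate(masses, dt, steps):
    for _ in range(steps):
        forces = [m.force() for m in masses]   # gather: objects query neighbours
        for m, f in zip(masses, forces):       # update: each object advances itself
            m.v += f / m.m * dt                # semi-implicit Euler step
            m.x += m.v * dt
```

The gather/update split, in which every object computes locally before any state changes, is the property that makes such a structure amenable to parallel computer architectures.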

More recent work on the IOPM has led to a generalised structure in which objects are seen as a cellular structure, arranged in a hierarchy and containing the genetic information about the physical system being modelled. This conceptual framework can be developed for different problem domains.


Hazard and Risk Management

Hazard is defined as 'the set of preconditions for failure'. Early work centred on human factors, uncertainty analysis and system uncertainty. The quality of an engineering calculation depends on many diverse factors, from the testing of basic theories in a laboratory to the quality of judgements about the relationship between theory and practice. All these factors have different degrees and types of uncertainty which need to be assessed. Problems that have been examined so far are fatigue in structural steelwork and the geotechnical design of retaining walls.


Monitoring of Full Scale Structures

This area is concerned with the analysis and interpretation of data. Signal processing techniques are being integrated with knowledge-based systems through pattern recognition based on relational inference. Data from the non-destructive testing of piles and from measurements taken from dams have already been used. The two-fold aim was to interpret the meaning of a confused signal and to put together a structural diagnosis from a very large number of diverse signals. This work is now being developed to support more general monitoring activities.



Ethics

The aim of this research is to improve our understanding of the ethical basis for a sustainable technology. We are testing the hypothesis that science and engineering are inseparable from questions of right action and moral values. In ancient and medieval times, moral and scientific reasoning were not separated; since the Renaissance there has been a strong tendency to treat them separately. Scientific activity has been conducted in a manner directed only by logic and empirical facts, and scientists have tried to purge their research of personal emotions and feelings, personal and social values, and preferences or prejudices. As a result, most engineers and scientists, unless they have made individual efforts to compensate for the narrowness of their education, are not equipped to reflect critically on the moral dilemmas they will confront. This poverty, combined with the ever-increasing power of technology, has brought about one of the greatest problems of our time: the unwelcome and negative aspects of technology. We believe that scientists and engineers need to bring moral and scientific reasoning together in engineering practice.


Observational Engineering

The observational method is an efficient approach to dealing with uncertainty in the ground: it allows the design and construction process to react, in a systematic way, to the ground conditions, characteristics and responses that are observed. The objective of our research is to develop a holistic, generic process model of the observational method of design, using geotechnical design as a basis for development and then expanding the scope to cover other areas of design under uncertainty.

The process model that has been developed is hierarchical: careful design of the model enables its generic character to be maintained and applied at several levels of the design process. The model makes explicit the roles and sub-processes involved in the design process, the interactions between them and the associated uncertainties. These interactions often cross traditional boundaries within or between organisations, so an understanding of them by all players is important for the successful application of observational design; the process model can aid this understanding.

The model has been developed from a number of quite different observational design case histories. Using a Grounded Theory approach, data have been gathered from a range of projects in various countries, both by interviewing people involved with observational design projects and by recording the process on site. The model will initially describe the current situation, from which improvements to the process will be identified; by applying the model to real projects, business benefits can then be measured. A possible future development is to associate semi-quantitative measures of uncertainty with each process; running the model in real time on a project computer network would then help manage uncertainty throughout the design.
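A minimal sketch of what such a hierarchical process model with attached uncertainty grades might look like is given below; the structure and all names are hypothetical, not the Bristol model itself:

```python
# Hypothetical sketch only: each process has an owning role, optional
# sub-processes, and a coarse semi-quantitative uncertainty grade that
# can be rolled up through the hierarchy. Names are invented here.

from dataclasses import dataclass, field

GRADES = {"low": 1, "medium": 2, "high": 3}

@dataclass
class Process:
    name: str
    role: str                     # who owns this step (designer, contractor, ...)
    uncertainty: str = "low"      # coarse grade attached to this process
    subprocesses: list = field(default_factory=list)

    def worst_uncertainty(self):
        """Roll the most severe grade up through the hierarchy."""
        grades = [GRADES[self.uncertainty]] + \
                 [GRADES[p.worst_uncertainty()] for p in self.subprocesses]
        return {v: k for k, v in GRADES.items()}[max(grades)]
```

Rolling the worst grade upward mirrors the point made above: an uncertainty buried in one sub-process, possibly owned by a different organisation, must remain visible at the level where the design decisions are made.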


Reflective Practice

It has long been felt that an ability to identify event sequences that have in the past led to hazards would be very valuable, especially if the diagnostic use of these event sequences in the study of new projects can lead to the prediction of potential problems. A new approach to the recognition of continually recurring patterns in engineering failures is proposed. The research has three objectives: first, to develop an audit model for the identification of failure and hazard; second, to develop a model for diagnosing the necessary action through which failure and hazard can be managed; and last, to produce a generic learning model that may be able to predict the occurrence of hazards.

This learning model is based on the hypothesis that decision-making is a dynamic process, supported by an interactive reflective process memory loop between the memory and the decision-making processes. It is suggested that, for the system to find the most appropriate solutions, it should store past case histories organised by process models of the positive interactions that they achieve and the negative interactions that they avoid. Central to this view is the idea that a system needs three kinds of knowledge: first, a process-model memory indexed by the positive interactions that have been achieved and the negative interactions that have been avoided; second, a process hierarchy that discriminates between those positive and negative interactions; and last, an evidence hierarchy of those positive and negative interactions, which is used to judge the relative justification of decisions made by the system. This view of decision-making has been termed a Case-Based Reflective Process Memory.

The intention is to demonstrate that this view of decision-making simultaneously supports, and is supported by, a learning system that incorporates new cases into the system's memory, so that the system gains from case failures as well as case successes. The system will create an index of aspects associated with the positive and negative interactions and, by recognising those aspects, will anticipate hazards, thus helping the human operator to avoid them.
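A minimal sketch of a case memory indexed by positive and negative interactions might look as follows; the class and the retrieval rule are illustrative assumptions, not the system described above:

```python
# Illustrative sketch only: past cases are indexed by the interactions
# they achieved or avoided, and retrieval ranks cases by their overlap
# with the interactions observed on a new project. Names invented here.

class CaseMemory:
    def __init__(self):
        self.cases = []   # (name, positives achieved, negatives avoided)

    def store(self, name, achieved, avoided):
        self.cases.append((name, set(achieved), set(avoided)))

    def recall(self, observed):
        """Rank stored cases by interactions shared with the new situation."""
        observed = set(observed)
        scored = [(len(observed & (ach | avd)), name)
                  for name, ach, avd in self.cases]
        return [name for score, name in sorted(scored, reverse=True) if score]
```

Because both achieved and avoided interactions contribute to the index, a new project showing an aspect that a past case only narrowly avoided still retrieves that case, which is how the memory can flag a hazard before it recurs.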


Egan Initiative

The construction industry was invited by Sir John Egan in 1998 to rethink how it works and to accept the challenge of improving its performance further. Our research has analysed and developed a range of tools to help the industry work differently.
