Research Groups and Current Projects
Our supervisors are drawn from a number of existing research groups within the School of Informatics and the Edinburgh Parallel Computing Centre. You can find out more about these groups by browsing their pages below.
Edinburgh Parallel Computing Centre (EPCC) targets strategic software technologies and applications, preparing them for current and next generation HPC architectures.
Institute for Computing Systems Architecture (ICSA) is primarily concerned with the architecture and engineering of future computing systems. Its fundamental research areas cover Compilers & Architectures, Parallel Computing and Wireless Networking.
Centre for Intelligent Systems and their Applications (CISA) develops theories of knowledge representation and inference and engineers systems for modelling, automating and supporting this activity.
Laboratory for Foundations of Computer Science (LFCS) aims to achieve a foundational understanding of problems and issues arising in computation and communication through the development of appropriate and applicable formal models and mathematical theories.
Institute of Perception, Action & Behaviour (IPAB) links computational action, perception, representation, transformation and generation processes to real or virtual worlds: statistical machine learning, computer vision, mobile and humanoid robotics, motor control, graphics and visualization.
Current Research Projects
Below are some examples of projects on which CDT in Pervasive Parallelism supervisors and students are currently working. Click on the project names or logos to find out more.
From Data Types to Session Types
Concurrency and distribution are among computing's most pressing problems. The data type is one of computing's most successful ideas. Just as data types codify the structure of data, session types codify the structure of communication, making concurrent and distributed software more reliable and easier to construct, and they will therefore play a crucial role in many aspects of software development.
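A session type can be pictured as a protocol that a channel endpoint must follow step by step. The sketch below is purely illustrative and is not based on the project's own tools: real session type systems check conformance statically, at compile time, whereas this toy `Channel` class (all names invented here) checks each operation against a declared protocol at run time.

```python
from typing import Any, List, Tuple

class SessionError(Exception):
    """Raised when a send or receive violates the declared protocol."""

class Channel:
    """A channel endpoint whose 'session type' is modelled as an ordered
    list of ("send" | "recv", payload_type) steps."""

    def __init__(self, protocol: List[Tuple[str, type]]):
        self.protocol = list(protocol)
        self.step = 0
        self.outbox: List[Any] = []  # messages sent by this endpoint
        self.inbox: List[Any] = []   # messages waiting to be received

    def _advance(self, action: str, payload_type: type) -> None:
        if self.step >= len(self.protocol):
            raise SessionError("session already complete")
        want_action, want_type = self.protocol[self.step]
        if action != want_action or payload_type is not want_type:
            raise SessionError(
                f"expected {want_action} of {want_type.__name__}, "
                f"got {action} of {payload_type.__name__}")
        self.step += 1

    def send(self, value: Any) -> None:
        self._advance("send", type(value))
        self.outbox.append(value)

    def recv(self, expected_type: type) -> Any:
        self._advance("recv", expected_type)
        return self.inbox.pop(0)

# A client that must first send an int request, then receive a str reply:
chan = Channel([("send", int), ("recv", str)])
chan.send(42)              # allowed: matches step 1 of the protocol
chan.inbox.append("done")  # stand-in for the other endpoint's reply
reply = chan.recv(str)     # allowed: matches step 2
# A further chan.send(7) would raise SessionError: the session is complete.
```

Sending out of order, sending the wrong type, or continuing past the end of the protocol all raise an error, which is the guarantee session types provide statically.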
Concurrency and multithreading defects are among the worst bugs software can have. They tend to be intermittent, and so are detected only by chance during testing. Their effects can be disastrous: deadlocks can hang the system, and race conditions can silently corrupt important data values without reporting any error. ThreadSafe, a product of Contemplate Ltd, is a static analysis tool, based on cutting-edge academic research, for finding and diagnosing dangerous concurrency bugs in Java code.
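The silent corruption a race condition causes can be reproduced in a few lines. The sketch below is illustrative only and unrelated to ThreadSafe's internals: two threads perform a non-atomic read-modify-write on a shared counter, and a barrier forces the unlucky interleaving to happen deterministically, so one update is lost and no exception is ever raised.

```python
import threading

counter = {"value": 0}
barrier = threading.Barrier(2)

def unsafe_increment():
    v = counter["value"]      # both threads read the same value (0)
    barrier.wait()            # force both reads to finish before either write
    counter["value"] = v + 1  # both write 1: one increment is silently lost

threads = [threading.Thread(target=unsafe_increment) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter["value"])  # 1, not 2 — corrupted data, no error reported
```

In real code the bad interleaving occurs only occasionally, which is exactly why such bugs slip through testing and why static analysis, which examines all interleavings, is valuable.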
The CRESTA project develops techniques and solutions that address the most difficult challenges of computing at the exascale. The worldwide supercomputing community has set itself the challenge of delivering an exaflop (a million million million calculations per second) by the end of this decade, and CRESTA is committed to keeping Europe at the forefront of meeting it.
We have a range of projects and collaborations in the area of programming language security for systems which depend on the mobility of code, data, or both. Common themes include ensuring the secure use of resources such as time, memory space, or network access; and the verification of security with machine-checked proof.
In the PASTA project we seek to automate the design and optimisation of customisable embedded processors. We do this by creating tools that are able to learn about the physical characteristics of the underlying silicon technology, and use that knowledge to synthesise the structure of an embedded processor.
This project aims to prevent software stagnation by investigating new techniques to automatically learn how to utilise new multi-core platforms. Using ideas and techniques first developed in artificial intelligence, we will develop a system that automatically learns how to adapt software to work on new platforms.
A Quantitative Approach to Management and Design of Collective and Adaptive Behaviours
The main objective of the QUANTICOL project is the development of an innovative formal design framework that provides a specification language for collective adaptive systems (CAS) together with a large variety of tool-supported, scalable analysis and verification techniques. These techniques are based on stochastic process algebras, mean-field and continuous approximation, and control theory. The framework will provide scalable support for verifying the models developed within it, and will also enable experimentation with, and discovery of, new design patterns for emergent behaviour and control over spatially distributed CAS.
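The mean-field idea mentioned above can be illustrated with a toy model. This is a generic sketch, not QUANTICOL's actual framework, and the epidemic-style rates are invented: instead of simulating a large population of stochastic agents individually, one integrates deterministic ODEs over the fraction of agents in each state, which scales independently of population size.

```python
def mean_field_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, steps=2000):
    """Euler integration of a mean-field SIR-style model: s, i, r are the
    fractions of agents that are susceptible, infected, and recovered."""
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        new_inf = beta * s * i   # contact-driven transitions s -> i
        new_rec = gamma * i      # recovery transitions i -> r
        s -= new_inf * dt
        i += (new_inf - new_rec) * dt
        r += new_rec * dt
    return s, i, r

s, i, r = mean_field_sir()
# The three fractions always sum to 1: the flows between states cancel.
```

The same deterministic-approximation approach underlies the scalable analysis techniques the project develops, where the agent populations are CAS components such as users of a shared transport or smart-grid system rather than infected individuals.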
How can we use rigorous maths to improve the quality of mainstream computer systems? REMS is a 6-year EPSRC-funded Programme Grant to do just that, bringing together an exciting combination of researchers in systems (architecture, operating systems, and networks) and in semantics (programming languages, automated reasoning, and verification) at Cambridge, Imperial, and Edinburgh.