Research seminars archive

This archive comprises details of research seminars from 1997 to 2007. We do not currently publish an archive of seminars after 2007.

Improved score-performance matching using both structural and temporal information from MIDI recordings

Bruno Gingras: 20 Nov 2007

In order to study score-based music performance, one has to determine the corresponding score note for every performance note, a process called score-performance matching. Since a typical performance may contain thousands of notes, researchers have developed algorithms that automate this procedure. Such algorithms are called matchers. Automated matching is a complex problem due to the use of expressive timing by performers and the presence of notes that are unspecified in the score, such as performance errors and ornaments. Automated matchers typically use performance data extracted from MIDI recordings. In the last two decades, several scholars, such as Puckette & Lippe (1992), Large (1993), and Heijink (1996), have developed such matchers. For the most part, these algorithms use structural information, such as pitch and chronological succession, but do not use timing information. As a result, most matchers cannot deal satisfactorily with ornamented performances or performances that exhibit extreme variations in tempo. In an attempt to solve these issues, the author developed a matcher that relies both on structural information and on a temporal representation of the performance, which is obtained by sequentially tracking local tempo changes on a note-by-note basis and mapping performance events to the corresponding score events. This allows the matcher to generate an accurate match even for heavily ornamented performances. Furthermore, this matcher can identify and categorize all common types of errors and ornaments. 
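
The tempo-tracking approach lends itself to a compact sketch. The following toy matcher (hypothetical data structures, not the author's actual algorithm) predicts where each score note should fall in performance time, matches the nearest unmatched performance note of the same pitch, and updates its local tempo estimate from each confirmed match:

```python
# A toy tempo-tracking matcher: predicts where each score note should fall
# in performance time, matches the nearest same-pitch performance note, and
# updates the local tempo estimate from each confirmed match.

def match(score, performance, window=0.35):
    """score: list of (beat, pitch); performance: list of (seconds, pitch),
    both sorted by time. Returns a list of (score_index, perf_index) pairs."""
    matches = []
    used = set()
    tempo = 0.5                      # initial guess: seconds per beat
    last_beat, last_time = 0.0, 0.0
    for i, (beat, pitch) in enumerate(score):
        predicted = last_time + (beat - last_beat) * tempo
        # candidates: same pitch, unmatched, near the predicted time
        best = None
        for j, (t, p) in enumerate(performance):
            if j in used or p != pitch or abs(t - predicted) > window:
                continue
            if best is None or abs(t - predicted) < abs(performance[best][0] - predicted):
                best = j
        if best is not None:
            used.add(best)
            matches.append((i, best))
            t = performance[best][0]
            if beat > last_beat:     # update local tempo from this match
                tempo = (t - last_time) / (beat - last_beat)
            last_beat, last_time = beat, t
        # unmatched score notes are omissions; leftover performance notes
        # are insertions (errors or ornaments) to be categorized separately
    return matches
```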

In Search of the Goosebump Factor - A Blueprint for Emotional Music Recommenders

Stephan Baumann: 15 Nov 2007

Music Information Retrieval (MIR), an interdisciplinary research discipline, has achieved impressive progress over the last decade. Pandora, Last.fm and MyStrands are successful commercial web services offering previously unavailable convenience to customers. Although such systems compute personalised recommendations based on relevance feedback on top of content-based, expert-based or community metadata, embedding emotional context remains a challenge. In my talk I will sketch a blueprint for the architecture of an emotional music recommender designed to address this problem. The approach is in its infancy, but we have already developed the core ingredients. Lifestream aggregation from Web 2.0 platforms and the analysis of blog postings will be aligned with the analysis of song lyrics. Furthermore, we propose an open Web 2.0 platform to collect personal descriptions of "goosebump sensations" experienced when listening to music. This collection will be made available to researchers in the field to serve as a common ground for training emotional classifiers.

Designing the rock/pop sound-box, 1966-72

Ruth Dockwray: 1 Nov 2007

Stereophony has significantly impacted the way songs are produced and experienced, enabling the creation of a performance that exists exclusively on the record – this virtual performance can be conceptualised in terms of the ‘sound-box’ (Moore 1992), a four-dimensional virtual space. Within this, sounds can be located through: lateral placement within the stereo field; foreground and background placement due to volume; height according to sound vibration frequency; and time. The period from the mid-1960s to the early 1970s saw the gradual establishment of a ‘normative’ position for sound-sources within the sound-box, which we term the ‘diagonal mix’. In this talk I shall discuss the results of an AHRC-funded project to investigate the establishment of this normative mix. Using 3D rendering software, I will illustrate the diversity of sonic placement in tracks across the years (1966-72) and different styles (psychedelic rock, hard rock and easy listening). The main sound-box configurations, including the disparate mixes evident in the mid-1960s and the normative mix, will be presented by means of a ‘taxonomy of mixes’.

Investigations of Bowed String Performance Through Measurement of Violin Bowing Technique

Diana Young: 20 Sept 2007

Virtuosic bowed string performance in many ways exemplifies the incredible potential of human physical performance and expression. A great deal is known about the physics of the violin family and those factors responsible for its sound capabilities. However, there remains much to be discovered about the intricacies of how players control these instruments in order to achieve their characteristic range and nuance of sound. Today, technology offers the ability to study this player control under realistic, unimpeded conditions to lead to greater understanding of these performance skills. Presented here is a method for investigation of bowed string performance that uses a playable hardware measurement system to capture the gestures of right hand violin bowing technique. This measurement system was optimized to be small, lightweight, and portable and was installed on a carbon fiber violin bow and an electric violin to enable study of realistic violin performances. 

The Bol Processor project: Rule-based representation of music

Bernard Bel: 14 May 2007

The Bol Processor project originated in 1980 as a word processor facilitating the transcription of the quasi-onomatopoeic syllables used as an oral notation system for Indian drumming. It grew into an expert system mimicking the ability to compose variations on a musical theme or assess their acceptability. Pattern grammars (a subset of type-2 formal grammars) proved appropriate for modelling the musical system under study. In 1989 a numeric-symbolic learning device was developed for inferring grammars from examples. The next implementation (BP2) addressed the issue of music composition in the MIDI and Csound environments of electronic music. The new challenge was to deal with superimposed sequences of events (polyphony) within the framework of text-oriented rewriting systems. This was achieved by means of polymetric representation. In a classical electronic music environment, musical works are represented on scores analogous to Western staff notation.
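
To make the idea of a pattern grammar concrete, here is a toy rewriting grammar in the spirit of the Bol Processor (the rules and syllables are invented for illustration): nonterminals are rewritten until only drum syllables (bols) remain, so each derivation composes a variation on the theme.

```python
# A toy rewriting grammar: nonterminals (S, A, CAD) are expanded by randomly
# chosen rules until only terminal bols remain, generating theme variations.

import random

RULES = {
    "S":   [["A", "A", "CAD"]],                      # two variations plus a cadence
    "A":   [["dha", "ge", "na"], ["dha", "ti", "ge", "na"]],
    "CAD": [["dha", "-", "dha"]],
}

def generate(symbol):
    if symbol not in RULES:                          # terminal bol
        return [symbol]
    expansion = random.choice(RULES[symbol])
    return [bol for s in expansion for bol in generate(s)]

print(" ".join(generate("S")))
```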

Modelling the well-formedness of musical pitch structures: Intonation, modulations and pitch spelling

Aline Honingh: 29 Mar 2007

A model describing musical pitch structures such as scales and chords has been developed using the geometrical properties of convexity and compactness. Using this approach, a number of music-related problems have successfully been modelled. In this talk I will explain the convexity and compactness of musical pitch structures and their application to three music-related problems. The first problem involves the question of what the preferred intonation of a specific chord is. Since intonation can be influenced by many factors, in this approach only chords in isolation (without musical context) are studied. The second study is about automatically finding modulations in a piece of music of which we have only the pitch information under octave equivalence. The last study encompasses pitch spelling, known as the problem of how to transform pitches encoded as MIDI into note names.
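
The pitch-spelling problem is easy to state with a small example: one MIDI number maps to several note names. The naive speller below (an illustration, not Honingh's method, which instead prefers the most compact set of note names) simply chooses sharps or flats wholesale:

```python
# Pitch spelling is ambiguous: one MIDI number maps to several note names.
# This naive speller just prefers sharps or flats globally; a serious
# speller must choose per note from the musical context.

SHARP = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']
FLAT  = ['C', 'Db', 'D', 'Eb', 'E', 'F', 'Gb', 'G', 'Ab', 'A', 'Bb', 'B']

def spell(midi, flats=False):
    names = FLAT if flats else SHARP
    return f"{names[midi % 12]}{midi // 12 - 1}"

print(spell(61), spell(61, flats=True))  # C#4 Db4: one MIDI key, two spellings
```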

Robust and Efficient Multimedia Retrieval for Music and Motion Data

Meinard Müller: 14 June 2007

In this talk, we introduce concepts and algorithms for robust and efficient multimedia retrieval in the presence of variations. By means of two different types of multimedia data -- waveform-based music data and human motion data -- we discuss strategies for handling object deformations and variability in the given data. To illustrate the kind of problems encountered in content-based retrieval, we outline some typical query-by-example scenarios. In the music domain, one often has multiple realizations of one and the same piece of music such as audio recordings of different interpretations and arrangements. 

The Study of Persistence and Change in Meter using Inner Metric Analysis

Anja Volk: 22 Mar 2007

In this talk we give an overview of different applications of Inner Metric Analysis (IMA) in the areas of computational music analysis, perception and classification. IMA studies the rhythmic-metric structure evoked by the notes of a piece of music by assigning to each note a metric weight. The method uses persistent and regular pulses at all grid levels and phase shifts to induce the metrical patterns present in the piece. It allows a comparison of how strictly a piece of a certain genre follows a metric hierarchy relative to another piece. Stable metrical patterns allow, for instance, the classification of dance rhythms into different genres using IMA.
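
One published formulation of the metric weight is easy to sketch: every maximal arithmetic progression of note onsets (a "local meter") of length at least three contributes its length raised to a power p to the weight of each onset it contains. The code below follows that description and is only an illustration, not the authors' implementation:

```python
# Metric weights in the spirit of IMA: each maximal arithmetic progression
# of onsets with >= 3 members adds len**p to the weight of its onsets.

def metric_weights(onsets, p=2):
    onsets = sorted(set(onsets))
    S = set(onsets)
    weights = {o: 0 for o in onsets}
    for i, start in enumerate(onsets):
        for nxt in onsets[i + 1:]:
            d = nxt - start
            if (start - d) in S:        # not maximal: extends to the left
                continue
            run = [start]
            t = start + d
            while t in S:               # extend as far right as possible
                run.append(t)
                t += d
            if len(run) >= 3:           # a local meter
                for o in run:
                    weights[o] += len(run) ** p
    return weights

print(metric_weights([0, 1, 2, 3, 4, 6, 8]))
```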

An information theoretic characterisation of auditory encoding

Tobias Overath: 22 Feb 2007

The entropy metric derived from information theory provides a means to quantify the amount of information transmitted in acoustic streams like speech or music. The encoding of such streams in the brain has been proposed to be critically dependent on a 'computational hub' in human auditory cortex, the planum temporale (PT). 
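
As a point of reference (an assumed illustration, not taken from the talk), the Shannon entropy of a symbol stream is the quantity typically varied in such studies to manipulate how predictable a pitch sequence is:

```python
# Shannon entropy of a symbol stream, in bits per symbol.

from collections import Counter
from math import log2

def entropy(stream):
    counts = Counter(stream)
    n = len(stream)
    return -sum(c / n * log2(c / n) for c in counts.values())

print(entropy("AAAAAAAA"))   # 0.0 bits: fully predictable
print(entropy("ABCDABCD"))   # 2.0 bits: four equiprobable symbols
```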

MAMA: Using ideas from speech act theory to support musical interaction

Dave Murray-Rust: 18 Jan 2007

There are large areas of intersection between computational treatments of dialogue and music, as well as areas of incompatibility. In this talk, I will discuss some components of my Musical Act Theory, and what my conception of a musical agent is in relation to it. I will present some aspects of the design of my system, and how these relate to the theoretical concerns. Finally, I will discuss two uses of the system, one experimental and one exploratory, and give some thoughts about where the system could move in the future.

A Generative Theory of Shape

Friday 24 November 2006, noon (Ben Pimlott building seminar room)

Professor Michael Leyton
Rutgers University, USA

Abstract: This talk gives an introduction to Leyton's book A Generative Theory of Shape (Springer-Verlag, 2001). The purpose of the book is to develop a generative theory that has two properties regarded as fundamental to intelligence – maximizing transfer of structure and maximizing recoverability of the generative operations. These two properties are particularly important in the representation of complex shape – which is the main concern of the book. The primary goal of the theory is the conversion of complexity into understandability. For this purpose, a mathematical theory is presented of how understandability is created in a structure. This is achieved by developing a group-theoretic approach to formalizing transfer and recoverability.

Ben Finn: Causation in music analysis and algorithmic composition

Monday, 18 December 2006, 5pm (Ian Guilland Lecture Theatre)

Causation is fundamental to numerous aspects of the world, though little discussed in musical contexts. Many apparently disparate aspects of music can be unified and explained if they are seen as arising from causal relations between notes, harmonies, phrases, sections, and so on. As causation can be mathematically characterized, it also has considerable potential for generating music on computer (which will be demonstrated). Additionally, the underlying causal structure of music reveals deep connections with various other fields such as economics, physics, and psychology.

Living with creative machines

Professor George E. Lewis
Columbia University, USA

Abstract: The computer has become an irreversible part of cultural and social histories of the arts, in which improvisation has long served as a site for interdisciplinary exploration, exchanges of personal and cultural narratives, and the blurring of boundaries between art forms. For George Lewis, living, working, and performing with creative machines of his own design is closely intertwined with the study of how improvisation produces knowledge and meaning. As a kind of computer music-making embodying African-American aesthetics and musical practices, Lewis's work intersects with critical histories of new media, interactive music, and American experimentalism, as well as ethnographic and historical work on improvisation.

Where do heuristics come from? Using abstraction to speed up search

Tuesday 8 February 2005, 1pm

Professor Rob Holte 
University of Alberta

Abstract: This talk is aimed at AI researchers and students interested in methods for creating heuristics to be used by algorithms such as A* and IDA*. The talk will focus primarily on pattern databases, a general-purpose technique for defining heuristics that has revolutionized heuristic search, leading to breakthroughs such as the first optimal solutions for random instances of Rubik's Cube. 
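
The construction is easy to sketch on a small example. The toy pattern database below (an illustration, not Holte's code) erases all 8-puzzle tiles except {1, 2, 3}, breadth-first searches backwards from the abstracted goal, and tabulates exact abstract distances, which then serve as an admissible heuristic for A* or IDA*:

```python
# A toy pattern database for the 8-puzzle: distances in the abstract space
# (where tiles outside the pattern are indistinguishable) lower-bound real
# solution costs, giving an admissible heuristic.

from collections import deque

PATTERN = {1, 2, 3}
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)           # 0 is the blank

def abstract(state):
    return tuple(t if t in PATTERN or t == 0 else -1 for t in state)

def neighbours(state):
    b = state.index(0)
    r, c = divmod(b, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            s = list(state)
            s[b], s[nr * 3 + nc] = s[nr * 3 + nc], s[b]
            yield tuple(s)

def build_pdb():
    start = abstract(GOAL)
    dist = {start: 0}
    queue = deque([start])
    while queue:                              # breadth-first search backwards
        s = queue.popleft()
        for n in neighbours(s):
            if n not in dist:
                dist[n] = dist[s] + 1
                queue.append(n)
    return dist

pdb = build_pdb()
h = lambda state: pdb[abstract(state)]        # heuristic lookup for A*/IDA*
print(h((2, 1, 3, 4, 5, 6, 7, 8, 0)))
```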

Robert Holte is Professor of Computing Science at the University of Alberta, Canada. His current research interests are machine learning, abstraction and search, combinatorial auctions, and intelligent information access. He is an editor of the journal Machine Learning.

3D Technologies Applied to Cultural Heritage

Monday 21 February 2005, 4-5pm

Guy Godin 
National Research Council of Canada

Abstract: 3D imaging, modeling and visualisation technologies have been an active research area of computer vision and graphics. Applications to cultural heritage have emerged as a key focus of interest in this field: they present a wide spectrum of sizes, complexities, materials, and tasks for 3D techniques. The National Research Council of Canada has been actively involved in this area and a participant in several collaborative projects. In this talk, we will describe work on the development of active optical 3D sensors, of processing algorithms for shape and appearance modeling as well as for analytical tasks, and of high-resolution visualisation techniques.

Guy Godin, Senior Research Officer, Visual Information Technology Group, National Research Council of Canada Ottawa, Ontario, Canada.

How to Destroy Manufacturing Industry

Wednesday 27 April 2005, 5-6pm

Adrian Bowyer
University of Bath, UK

Abstract: In the mid twentieth century John von Neumann proposed a Universal Constructor - a machine that could copy itself. Since then a number of people have realised his idea, both in simulation, and physically. However, in the case of physical implementations, all current systems require a supply of very complicated and intricate building blocks. Rapid Prototyping is the engineer's name for 3D printing - the technology that allows a computer to print out three-dimensional plastic parts directly. This has been growing in industrial importance for the last 15 years, and many different ways of making such machines have been developed. Rapid prototyping machines are now at a stage where they should be able to manufacture many of their own component parts. 

Adrian Bowyer is Senior Lecturer in the Mechanical Engineering Department, University of Bath.

Techniques for Computational Analysis of Triadic Harmony

Tuesday 3 May 2005, 4-5pm

Craig Sapp 
CCRMA, Stanford University, USA

Abstract: There are many motivations for developing automatically generated harmonic analyses of music. First and foremost, it saves the tedium of carrying out the analyses manually; this reason alone should make many music-theory students very happy. Secondly, it is a useful step towards giving computers an intrinsic understanding of music, which is necessary for applications such as automatic performance and generative composition. For example, a straight performance of a musical score (such as a quantized MIDI file) by a computer can be quite boring and difficult to listen to. Harmonic analysis can be used to identify the underlying syntax of the music, which can then be used to make automatic performance more interesting.
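
The most basic step of such an analysis can be sketched in a few lines (an assumed illustration, not Sapp's algorithm): match the pitch classes of a sonority against root-position triad templates in all twelve transpositions:

```python
# Label a sonority by comparing its pitch-class set against triad templates
# transposed to every possible root.

TEMPLATES = {'major': {0, 4, 7}, 'minor': {0, 3, 7},
             'diminished': {0, 3, 6}, 'augmented': {0, 4, 8}}
NAMES = ['C', 'C#', 'D', 'Eb', 'E', 'F', 'F#', 'G', 'Ab', 'A', 'Bb', 'B']

def label(midi_notes):
    pcs = {n % 12 for n in midi_notes}
    for root in range(12):
        for quality, shape in TEMPLATES.items():
            if pcs == {(root + i) % 12 for i in shape}:
                return f"{NAMES[root]} {quality}"
    return "no triad"

print(label([60, 64, 67]))   # C major
print(label([57, 60, 64]))   # A minor, recognised in any inversion
```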

Craig Sapp, Center for Computer Research in Music and Acoustics (CCRMA), Stanford University

Audio Information Retrieval for Creativity Support

Thursday 21 January 2004, 4pm

Dr. Michael Casey

Goldsmiths College, University of London, UK

In this talk I will give an overview of audio-based music information retrieval (MIR), describing the application of audio feature extraction, similarity measurement and pattern processing to creative systems. The talk will also include a brief discussion of the standardisation of such tools within the MPEG-7 International Standard and an account of how international standardisation of multimedia information retrieval technologies is changing the global creative landscape.

Machine Consciousness: Does the Idea Make Sense?

Thursday 19 February 2004, 4pm

Prof. Igor Aleksander

Dept. of Electrical Engineering, Imperial College, University of London

Abstract: Since 2001 there has been a series of conferences on this topic spanning approaches from the functional (behaviour is an indication of consciousness) to the material end (mechanism is an indication of consciousness). I shall take a brief look at this spectrum and then discuss our own material approach. This is broken down into five axioms, introspectively obtained. 

New technology, New art: studies of collaborative research in art and technology

Thursday 26 February 2004, 4pm

Prof. Ernest Edmonds

University of Sydney, Australia.

Abstract: The seminar describes research collaborations between an HCI team and artists who were artists-in-residence with the group. The collaborations resulted in new performance art works, new interactive works, and new instruments and art forms. The research included a full study of the processes of collaboration and innovation.

Computer Consciousness: what is it and should it be?

Wednesday 4 March 2004, 4pm

Prof. Steve Torrance

Universities of Sussex & Middlesex, UK.
Email: stevet (@sussex.ac.uk)

Abstract: Steve Torrance is interested in materialist and computational theories of consciousness, and in the place of the study of consciousness within science. He is examining the differing conceptions of science and mind that emerge from AI-based cognitive science and from contrasting approaches, which are critical of computationalism. 

The Spiral Array and Computational Models for Tonal Induction and Segmentation

Elaine Chew: 25 May 2004

This talk centers around a geometric representation for tonality called the Spiral Array (Chew, 2000). The model starts from a spiral configuration of the Harmonic Network (the Harmonic Network, also known to transformational music theorists as the tonnetz, has been described by Longuet-Higgins), and successively generates higher level tonal entities as convex combinations of their lower level components. 
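
Following Chew's published definition (with arbitrary parameter values here), pitches sit on a helix indexed along the line of fifths, and a higher-level object such as a chord is a convex combination, or "centre of effect", of its component pitch positions:

```python
# The Spiral Array's core construction: pitches on a helix indexed along the
# line of fifths; chords as weighted convex combinations of pitch positions.

from math import sin, cos, pi

R, H = 1.0, 0.4          # helix radius and per-fifth rise (model parameters)

def pitch_position(k):   # k = index on the line of fifths (C=0, G=1, D=2, ...)
    return (R * sin(k * pi / 2), R * cos(k * pi / 2), k * H)

def centre_of_effect(indices, weights):
    pts = [pitch_position(k) for k in indices]
    return tuple(sum(w * p[d] for w, p in zip(weights, pts)) for d in range(3))

# C major triad: C (k=0), G (k=1), E (k=4), weighted root > fifth > third
print(centre_of_effect([0, 1, 4], [0.5, 0.3, 0.2]))
```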

Some Stories Concerning the Mutated Observer

Wednesday 12 May 2004, 4pm

Dr. Warren Neidich

University of Sydney, Australia

Abstract: Warren Neidich's multi-media and interdisciplinary practice operates at the margins of multiple discourses, including photography, cinema and new media, as they inform and constitute visual/haptic culture, resulting in new forms of subjectivity.

Art & the Internet

Thursday 3 June 2004, 4pm

Venue: Seminar Room, Department of Computing, 25 St. James, New Cross

Dr. Mark Tribe 

Columbia University, New York, USA.

Abstract: An unlikely alliance of artists, technologists and intellectual property scholars has formed to create alternative models for the sharing of all kinds of content, from art and scholarship to software code. The corporations have won, at least temporarily, the battle for legal access to the content they control.

RELAX

Friday 11 June 2004, 12pm

Dr. Matt Jones

University of Waikato, New Zealand

Abstract: In this talk, we'll explore how a handheld computer can be used to enable interactive experiences that vary in pace from fast and immediate through to reflective and delayed. We describe a system that asynchronously combines an offline handheld computer and an online desktop Personal Computer, and discuss some results of an initial user evaluation.

A Hybrid Decision Tree/Genetic Algorithm for Data Mining

Wednesday 12 March 2003, 4pm

Dr. Alex A. Freitas 

Computing Laboratory, University of Kent at Canterbury

Graphical language games: Representational form and communicative use

Wednesday 12 February 2003, 4pm

Dr. Pat Healey

Dept. of Computer Science, Queen Mary, University of London

Neural Networks and Evolutionary Algorithms for Advanced Intelligent Control, Bioinformatics and Path Planning

Wednesday 11 December 2002, 4pm

Dr. Dimitris Dracopoulos

Geometric Representations of Symbolic Sequences

Wednesday 30 October 2002, 4pm

Dr. Peter Tino

Neural Computing Research Group, Aston University, Birmingham

Evolving Rule-based models and their applications

Wednesday 22 May 2002, 4pm

Dr. Plamen Angelov

Dept. of Civil Engineering, Loughborough University

Testability Transformation

Wednesday 15 May 2002, 4pm

Dr. Mark Harman

Dept. of Information Systems and Computing, Brunel University

Estimating errors for spectral approximations

Tuesday 7 May 2002, 4.30pm

Dr. Oscar Bandtlow

Strange Convex Sets Arising In Dynamics

Tuesday 7 May 2002, 3pm

Dr. Oliver M Jenkinson

School of Mathematical Sciences, Queen Mary, University of London

Intelligent Adaptive Systems

Wednesday 24 April 2002, 4pm

Dr. George Magoulas

Dept. of Information Systems and Computing, Brunel University

Gene Expression Programming and the Automatic Induction of Computer Programs

Wednesday 17 April 2002, 4pm

Dr. Candida Ferreira

Chief Scientist of Gepsoft

Bayesian Modelling of Neuroimaging Data

Wednesday 10 April 2002, 4pm

Dr. Will Penny

University College London

From Natural to Artificial Immune Systems

Wednesday, 27 March 2002, 4pm

Dr. Jon Timmis

Computing Laboratory, University of Kent at Canterbury

Support Vector Machines in Combinatorial Chemistry

Wednesday 20 March 2002, 4pm

Dr. Sean Holden

Department of Computer Science, University College London

Order bounded operators on classical function spaces

Tuesday 12 March 2002, 4pm

Professor Hans Jarchow

University of Zurich

Foundations of Genetic Programming

Wednesday 6 March 2002, 4pm

Professor Riccardo Poli

Department of Computer Science, University of Essex

Computing Patterns in Strings

Wednesday 27 February 2002, 4pm

Professor Bill Smyth

McMaster University, Canada

What is a good genotype-phenotype mapping for the evolution of computer programs?

Wednesday 20 February 2002, 4pm

Dr. Julian Miller

School of Computer Science, University of Birmingham

Inductive Genetic Programming of Polynomial Networks

Wednesday 30 January 2002, 4pm

Dr. Nikolay Nikolaev

Goldsmiths College, University of London

Language is Never, Ever, Ever Random

Tuesday 30 January, 4.30pm

Dr Adam Kilgarriff, University of Brighton.

An Introduction to the Theory of Quasigroups, and Applications

Tuesday 20 February, 4.30pm

Prof. Scerbakov, Visiting Department of Mathematics, University of Reading.

Some connections between Bernoulli convolutions and analytic number theory

Tuesday 27 February, 4.30pm

Dr T. Hilberdink, University of Reading

Program Schemes, Finite Model Theory and Computational Complexity

Tuesday 6 March, 4.30pm

Prof Iain Stewart, University of Leicester

Analyzing the cache behaviour of non-uniform distribution sorting algorithms

Tuesday 13 March, 4.45pm

Professor Rajeev Raman, Computer Science Group, Dept of Maths & Comp. Sci, University of Leicester, Leicester, LE1 7RH.

Extensions of the Stone-Weierstrass Theorem

Tuesday 20 March

Speaker: Dr Barnaby Sheppard (University of North London)

Experimental evaluation of algorithmic solutions for generalized network flow problems

Tuesday 15 May

Dr Tomasz Radzik, Dept of Computer Science, King's College.

Fragmentability of graphs

Tuesday 22 May

Prof. Graham Farr, School of Computer Science, Monash University, Victoria, Australia

Checking States and Transitions of a set of Communicating Finite State Machines 

Tuesday 24 October, 4.30pm 

Dr Robert M. Hierons, Department of Information Systems and Computing, Brunel University.

Augmenting Connectivity in Graphs 

Tuesday 7 November, 4.30pm 

Dr Tibor Jordan, Department of Operations Research, Eötvös University, Budapest, Hungary.

Channel assignment and multicolouring of subgraphs of the triangular lattice 

Tuesday 14 November, 4.30pm 

Frederic Havet, Oxford 

Set functions and clustering: quasi-convexity and greedy optimization with monotone linkage functions

Tuesday 28 November, 4.30pm 

Prof. Boris Mirkin, Birkbeck College, University of London

In data of complex structure, such as graph or image or protein fold, a linkage function d(i,S) may capture important information on similarity between elements i and subsets S of a finite set I. A set function F(S), defined as minimum of d(i,S) over all i from S (or, in a different setting, over all i from I-S), satisfies a so-called quasi-convexity condition if and only if d(i,S) is monotone over S. Greedily seriating I according to monotone d(i,S) leads to global maxima of F(S), which can be used for finding cluster patterns in data.
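
The greedy procedure is short enough to sketch. With an invented toy linkage on points of a line (monotone, since adding elements to S can only bring i's nearest neighbour closer), one peels off the minimum-linkage element at each step and returns the suffix of the peeling order that maximizes F:

```python
# Greedy seriation with a monotone linkage: peel off the element of minimum
# linkage at each step, then return the suffix of the peeling order that
# maximizes F(S) = min over i in S of d(i, S).

def d(i, S):                       # toy linkage: minus the nearest-neighbour gap
    gaps = [abs(i - j) for j in S if j != i]
    return -min(gaps) if gaps else float('-inf')

def greedy_max_F(I):
    remaining, order = set(I), []
    while remaining:
        i = min(remaining, key=lambda x: d(x, remaining))
        order.append(i)
        remaining.discard(i)
    suffixes = (set(order[k:]) for k in range(len(order)))
    return max(suffixes, key=lambda S: min(d(i, S) for i in S))

print(greedy_max_F([0, 1, 2, 10, 11, 12, 50]))   # peels the outlier 50 first
```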

Secret sharing with disenrolment

Tuesday 5 December, 4.30pm 

Prof. Peter Wild, Department of Mathematics, Royal Holloway, University of London. 

The NewsTools Project 

Tuesday 12 December, 4.30pm 

Dr Anthony Hunter, Dept of Computer Science, UCL, University of London. 

The NewsTools Project is aimed at developing tools for analysing and merging news reports that are represented in the form of structured text.

Structured text is an idea implicit in a number of approaches to handling information such as news reports. An item of structured text is a set of semantic labels together with a word, phrase, sentence, null value, or a nested item of structured text, associated with each semantic label. As a simple example, a report on a corporate acquisition could use semantic labels such as "buyer", "seller", "acquisition", "value", and "date". Some news agencies store news reports as structured text. In addition, new technologies, such as information extraction and XML, will massively increase the amount of structured text available.
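
The corporate-acquisition example renders naturally as a nested item of structured text; the encoding below is illustrative, not the project's actual representation:

```python
# A report as structured text: semantic labels map to a word or phrase, a
# null value, or a nested item of structured text.

report = {
    "buyer":       {"name": "Acme plc", "sector": "engineering"},  # nested item
    "seller":      "Widget Holdings",
    "acquisition": "Widget Ltd",
    "value":       "12m GBP",
    "date":        None,           # null value: not stated in the report
}
```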

Domain Theory and Analytic Sets 

Tuesday 8 February 2000 - John Howroyd, Goldsmiths College

Efficient Strategies for Selective Mutation Testing 

Tuesday 14 March 2000 - Len Bottaci, Hull University

The talk introduces the mutation testing technique for unit testing of software and the notion of test set efficiency. The various forms of selective mutation testing are described in terms of their effectiveness and cost saving. A heuristic for efficient operator selective mutation is proposed. With the aim of finding efficient selective mutation testing strategies, operator selective mutation is compared to random selective mutation in an experiment using 11 programs. It turns out that the choice of most efficient strategy depends on the degree of testing confidence required.
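
For readers unfamiliar with the technique, a minimal example (invented, not from the talk): a mutation operator makes one small syntactic change to the unit under test, and a test set is judged by whether it "kills" the mutant, i.e. distinguishes it from the original:

```python
# One relational-operator mutation of a unit under test, and a test set
# evaluated by whether it kills the mutant.

def max_orig(a, b):
    return a if a > b else b

def max_mutant(a, b):              # mutation: '>' replaced by '<'
    return a if a < b else b

tests = [(1, 2), (5, 5)]
killed = any(max_orig(a, b) != max_mutant(a, b) for a, b in tests)
print("mutant killed:", killed)    # True: test (1, 2) distinguishes them
```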

Indexing texts and data compression 

Wednesday 15th March 2000 - Professor Maxime Crochemore, University of Marne-la-Vallée

Automated V&V for High Integrity Systems: A Targeted Formal Methods Approach

Tuesday 21st March 2000 - Simon Burton, York University

This presentation describes the intermediate results of a project to develop automated, high-integrity software verification and validation techniques for aerospace applications. Automated specification validation and test case generation are made possible by the targeted use of formal methods. Specifically, the restricted domain of use is exploited to reduce the set of mathematical problems to those that can be solved using constraint solvers, model checkers and automated proof tactics. The practicality of the technique is enhanced by the tight integration of the formal methods with intuitive specification notations, existing specification modelling tools and a traditional software development process. The talk presents evidence to support an emerging appreciation amongst the software engineering community that, for the benefits of formal methods to be widely exploited in industry, an approach must be taken that integrates formal analysis with intuitive engineering notations, traditional engineering approaches and powerful tool support.

Problem Structure and Analysis

Tuesday 5 October - Michael Jackson

Introduction to Secret Codes and Digital Signatures

Tuesday 19 October and Wednesday 20 October 1999 - Nigel Smart, Hewlett Packard.

I will explain, to a general audience, the basics of the modern science of cryptography. I will discuss various real-world scenarios where codes and signatures are needed to make electronic commerce possible. I will explain the background behind some of the public policy decisions facing governments today. Finally I will look to the future to see where we are going and what problems need to be tackled. I will assume no background on the part of the audience, so the talk will be accessible to anybody from A-level students upwards.

An Implementation of Amorphous Program Slicing

5pm, Thursday 21 October 1999 - David Binkley [Loyola College]

Traditional program slicing simplifies a program by deleting statements and expressions which do not affect a computation of interest in the program. The simplification power of traditional slicing is limited by the restriction to deletion as the only simplifying transformation. Amorphous program slicing retains the semantic property that a slice preserves a projection of the original program's behavior, but drops the syntactic requirement that the slice is a syntactic subset of the original program. This allows for greater simplification, improving the applicability of slicing to problems where syntax is relatively unimportant.
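
A small hypothetical example makes the distinction concrete: a traditional slice may only delete statements, while an amorphous slice may also transform the ones that remain:

```python
# Original program; the slicing criterion is the final value of `total`.

total = 0
count = 0                 # deleted by the traditional slice (no effect on total)
for x in [1, 2, 3]:
    total = total + x
    count = count + 1     # deleted by the traditional slice

# An amorphous slice may go further and rewrite what remains, preserving
# only the projection of behaviour that matters:
total = sum([1, 2, 3])
```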

Reducing the Cost of Regression Testing

5pm, Friday 22 October 1999 - David Binkley [Loyola College]

Software maintainers are faced with the task of regression testing: retesting a modified program on an often large number of test cases. The cost of regression testing can be reduced if the size of the program that must be retested is reduced and if old test cases and old test results can be reused. Two complementary algorithms for reducing the cost of regression testing are presented. The first produces a program named 'differences', which captures the semantic change between 'certified', a previously tested program, and 'modified', a changed version of 'certified'. It is more efficient to test 'differences', because this omits unchanged computations of 'modified'. This algorithm is based on the semantic properties of program slices. Essentially, two programs that share a slice share a computation.

Some extremal problems in approximation theory

Tuesday 9 November 1999 - Christina Dragonova, Sofia University/UCL

Animating the Semantics of VERILOG using Prolog

Tuesday 16 November 1999 - Jonathan Bowen, University of Reading

The logic programming language Prolog is used to provide a rapid-prototype simulator for the VERILOG Hardware Description Language (HDL). The simulator is based on an operational semantics of a significant subset of the language. Using this approach allows the exploration of sometimes subtle behaviours of parallel programs and the possibility of rapid changes or additions to the semantics of the language covered. It also acts as a check on the validity of the original operational semantics.

Codes, trellises and matroids

Tuesday 30 November 1999 - Peter Cameron, Queen Mary and Westfield

There is a very close link between linear codes and representable matroids: given a matrix, the rows span the code, and the columns represent the matroid. Information can be transferred back and forth between code and matroid. For example, Greene showed that the weight enumerator of the code is a specialisation of the Tutte polynomial of the matroid. Trellises provide a method for decoding a linear code used over an analog transmission line without the need for analog-to-digital conversion. It is important to be able to find the smallest trellis which represents a given code. This invariant, unlike more familiar ones, depends on the order of the coordinates; it is not invariant under permutations of coordinates. Nevertheless, it can be calculated from the matroid in a natural way. The talk will be an elementary exposition of these ideas, with several illustrative examples.
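
The code side of this correspondence is easy to compute directly for a tiny example (an illustration only): enumerate the codewords spanned by a generator matrix over GF(2) and tally Hamming weights to obtain the weight enumerator:

```python
# Enumerate the code spanned by the rows of G over GF(2) and tally Hamming
# weights, giving the coefficients A_w of the weight enumerator sum A_w x^w.

from itertools import product
from collections import Counter

G = [[1, 0, 1],          # rows span the code; columns give the matroid
     [0, 1, 1]]

def codewords(G):
    for coeffs in product([0, 1], repeat=len(G)):
        yield tuple(sum(c * row[j] for c, row in zip(coeffs, G)) % 2
                    for j in range(len(G[0])))

weights = Counter(sum(word) for word in codewords(G))
print(sorted(weights.items()))   # [(0, 1), (2, 3)]: one word of weight 0, three of weight 2
```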

Testing the safety of a computer-controlled railway signalling device

Tuesday 14 December 1999 - Jim Woodcock, Oxford University (PRG)

Dependence Analysis of Concurrent Programs and Systems

Tuesday 1 September 1998 - Professor Jingde Cheng, Kyushu University, Japan

In this talk, I survey the program dependence analysis technique for concurrent programs and its applications from the viewpoint of software engineering. I present various primary program dependences which may exist in a concurrent program, a general approach to define, analyze, and represent these program dependences formally, and applications of an explicit program dependence representation for concurrent programs in various software engineering activities. I also suggest some research problems on dependence analysis of concurrent systems.

Algebraic Specifications: Assessing their Quality and their Effectiveness in Specification-based Testing

Tuesday 29 September 1998 - Dr Martin Woodward, Liverpool University

There is an increasing trend towards more formality in the development of specifications of software systems in order to reduce the likelihood of errors as early as possible in the development process. The algebraic approach to specification, with its equational form, leads to the added advantage of executability via the process of term-rewriting. Nevertheless, algebraic specifications themselves may be developed which are erroneous or simply of poor quality. This talk will focus on pragmatic techniques and tools to test algebraic specifications and to assess their structural quality. Also, brief mention will be made of some empirical work to assess the effectiveness of test data that is automatically generated from algebraic specifications for testing equivalent procedural implementations.
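
A minimal sketch of executability as a testing aid (with invented stack axioms, not an example from the talk): each equation of the specification can be checked directly against an implementation on sampled terms:

```python
# Checking algebraic stack axioms against an implementation by execution.

def push(s, x): return s + [x]
def pop(s):     return s[:-1]
def top(s):     return s[-1]

# axioms: pop(push(s, x)) = s   and   top(push(s, x)) = x
for s, x in ([[], 1], [[1, 2], 3]):
    assert pop(push(s, x)) == s
    assert top(push(s, x)) == x
print("axioms hold on sampled terms")
```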

Predicting Test Effectiveness

Tuesday 20 October 1998 - Dr Marc Roper, Strathclyde University

Recently, capture-recapture models have been applied to software inspection data to predict the number of defects remaining in a document after inspection. However, there are significant differences between the inspection and testing processes, so it is not safe to assume that such techniques will be as successful in the testing domain. This talk investigates the applicability of capture-recapture models to testing by taking into consideration the characteristics of the testing process and comparing the performance of the various existing models. The models are evaluated with empirical data from studies which compared the relative merits of different testing techniques, and the factors affecting the accuracy of the models are analysed.
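
The simplest capture-recapture estimator, Lincoln-Petersen, fixes the idea (the talk compares more elaborate models): if two testers independently find n1 and n2 defects with m in common, the total defect count is estimated as n1*n2/m:

```python
# Lincoln-Petersen estimate of total defects from two independent samples.

def lincoln_petersen(n1, n2, m):
    return n1 * n2 / m             # estimated total defects

total = lincoln_petersen(12, 10, 6)           # e.g. 12 and 10 defects, 6 shared
print(total, "estimated;", total - (12 + 10 - 6), "still undetected")
```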

When are Hard Problems Genuinely Hard?

An Introduction to Phase-transition Phenomena and the Implications for Efficient Algorithms

Tuesday 27th October 1998 - Dr Paul Dunne, Liverpool University

All of the classical examples of computational problems for which only inefficient and infeasible algorithms are known seem to reduce to the general problem of searching a given structure, such as a graph, in order to decide whether the structure possesses a particular property of interest, e.g. does a graph contain a path which visits each vertex exactly once?

Using Real Paper To Communicate with Computers

Tuesday 10 November 1998 - Professor Heather Brown, University of Kent at Canterbury

This talk covers two novel methods of using paper to communicate with computers. Both are designed to allow paper to become an active part of a computer system. The first method involves the use of a computer-enhanced desk known as a DigitalDesk. The DigitalDesk can recognise pages placed on the desktop and can match them up with corresponding electronic versions. This allows users to access or manipulate information in the electronic version via natural actions such as pointing at text on the page. I shall describe the `Active Alice' project which used a DigitalDesk to turn a normal paperback version of `Alice's Adventures in Wonderland' into an integral part of a grammar lesson for students. The second method uses Xerox's `intelligent paper'. This looks like ordinary paper but contains invisible information that can be detected by a special pointer device.

Linking informal and formal requirements: a tool based on the B-Toolkit and DOORS

Tuesday 1 December 1998 - Dr Jeremy Dick, Quality Systems & Software Ltd, Oxford

On the rank of random matrices over finite fields

Tuesday 8 December 1998 - Dr Colin Cooper, University of North London

Topological Design Theory

Tuesday 26 January 1999 - Professor Mike Grannell, Open University

Embedding a graph in a surface produces faces. These faces may be regarded as the blocks of a combinatorial design. Given the general difficulty of effecting such embeddings, it is perhaps not surprising that the design-theoretical aspect seems to have hitherto attracted little attention. In his book on the solution of the Heawood conjecture (Map Color Theorem, Springer-Verlag, 1974), Gerhard Ringel describes embeddings of the complete graph K_n in orientable surfaces of minimal genus.

Testing Refinements of State-based Formal Specifications

Tuesday 2 February 1999 - Dr John Derrick, University of Kent at Canterbury

A specification provides a concise description of a system, and can be used as both the benchmark against which any implementation is tested, and also as a means to generate tests. Formal specifications have potential advantages over informal descriptions because they offer the possibility of reducing the costs of testing by automating part of the testing process. This observation has led to considerable interest in developing test generation techniques from formal specifications, and a number of different methods have been derived for state based formalisms such as Z, B and VDM. 

Hierarchies of program schemes

Tuesday 9 February 1999 - Professor Iain Stewart, Leicester University

Finite model theory, that is, the study of the expressive power of logics on finite structures, is a thriving research area mid-way between mathematics and computer science. Descriptive complexity theory is one aspect of finite model theory dealing explicitly with the relationship between computational complexity theory and finite model theory. Typical results in descriptive complexity theory are Fagin's Theorem that NP is the class of problems definable by sentences of existential second-order logic, and the Immerman-Vardi theorem that P is the class of problems on ordered structures definable by the sentences of least-fixed point logic.

Design for test 

Tuesday 16 February 1999 - Professor Mike Holcombe, Sheffield University

Re-engineering: One approach to Coping with Constant Software Changes

Tuesday 9 March 1999 - Dr Hongji Yang, De Montfort University

With the advent of new architectures, the need to introduce new functionality and improvements in design techniques, there is a strong need to re-engineer existing software systems while maintaining their continuity of use. By re-engineering, we mean examining and altering an existing subject system to reconstitute it in a new form. This process encompasses a combination of various sub-processes, mainly reverse engineering and forward engineering.

Optimisation Based Approaches to the Synthesis of Hard Real Time Systems

Tuesday 16 March 1999 - John Clark, York University

Developing safety critical real-time systems is hard – very hard. Most design techniques and methods of refinement deal predominantly with (so-called) functional requirements. Safety critical systems also have to satisfy crucial non-functional properties, e.g. hard time constraints. One would like to be able to generate designs at various levels that make best use of available components (or intended components with postulated properties) taking into account required system reliability, the ability to meet all deadlines and costs of development/maintenance etc. 

Tuesday 23 March 1999 - Richard Pearce, Wye College

Jordan approach to representations of the Lorentz group

Tuesday 30th March 1999 - Professor Bernie Russo, University of California

Engineering and re-engineering software for change

Tuesday 27th April 1999 - Professor Keith Bennett, Durham University

I'd like to present results from a ten-year research project on the application of formal methods to the reverse engineering of industrial-scale software. We have used a transformation-based system to obtain design abstractions from real source code, written in languages such as Assembler and Jovial! This work has stimulated many theoretical problems as well as practical issues in developing good tool support.

Potts Models, Chromatic Polynomials, and All That

Tuesday 3rd August 1999 - Professor Alan Sokal (New York University)

The q-state Potts model is a statistical-mechanical model that generalizes the well-known Ising model. It can be defined on an arbitrary finite graph G, and its partition function is a polynomial that encodes much important information about G (including its chromatic polynomial and its reliability polynomial). The complex zeros of the Potts-model partition function are of interest both to statistical mechanicians (in connection with the Lee-Yang picture of phase transitions) and to combinatorialists. I begin by giving an introduction to all these problems.
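
One concrete handle on "the partition function encodes the chromatic polynomial": the chromatic polynomial P(G, q) counts proper q-colourings and is the Potts partition function specialised at v = -1. A naive deletion-contraction evaluator (exponential time, fine for tiny graphs) computes it:

```python
# Deletion-contraction: P(G, q) = P(G - e, q) - P(G / e, q).

def chromatic(edges, n, q):
    """edges: list of (u, v) pairs on vertices 0..n-1; returns P(G, q)."""
    if not edges:
        return q ** n                          # n isolated vertices
    (u, v), rest = edges[0], edges[1:]
    if u == v:                                 # self-loop: no proper colouring
        return 0
    # contraction: merge v into u, drop parallel edges, relabel vertices > v
    merged = [(a if a != v else u, b if b != v else u) for a, b in rest]
    merged = list({(min(a, b), max(a, b)) for a, b in merged})
    relabel = lambda x: x - 1 if x > v else x
    merged = [(relabel(a), relabel(b)) for a, b in merged]
    return chromatic(rest, n, q) - chromatic(merged, n - 1, q)

triangle = [(0, 1), (1, 2), (0, 2)]
print(chromatic(triangle, 3, 3))               # 6 proper 3-colourings
```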

Measuring Test Effectiveness - An Experiment on Operational Avionics Software

Tuesday October 28 1997 - Stuart Reid, Cranfield

A new approach to some classical results in Analysis and applications

Thursday Nov 6 1997 - Professor Maria-Victoria Velasco, University of Granada

A Graph-Theoretic Model of Human Memory

Tuesday 18 November 1997 - Professor Robin Whitty, South Bank University

Random Necklaces, Graphs and Molecules: The Challenge of Unlabelled Sampling

Tuesday November 25 1997 - Dr. Leslie Ann Goldberg, University of Warwick

Transparent Access To Distributed Information Sources

Wednesday December 3 1997 - Dr. Norman Paton, Manchester University

Test adequacy criteria and their relations to fault detecting ability and test cost

Tuesday December 9 1997 - Dr Hong Zhu, Institute of Computer Software, Nanjing University

Constructions for cubic graphs with large girth

Tuesday January 20 1998 - Professor Norman Biggs, LSE

Predicting software project effort using machine learning technology

Tuesday February 10 1998 - Professor Martin Shepperd, Bournemouth University

Discretely observed multivariate continuous time autoregressive processes

Thursday February 12 1998 - Georgi Boshnakov (Bulgaria, visiting Goldsmiths)

Application of Network Flows to Rearrangement of Series

Tuesday February 24 1998 - Professor Nash-Williams, Reading University

A Bayesian semiparametric accelerated failure time model and extensions

Thursday March 12 1998 - Bani Mallick (Imperial College)

A latent trait model for a mixture of binary, polytomous, normal and gamma items

Thursday March 19 1998 - Irini Moustaki (LSE)

Radio Channel Assignment

Tuesday March 24 1998 - Dr Robert Leese, Oxford University

The automation of software testing using genetic algorithms

Tuesday, 28 April 1998 - Professor Bryan Jones, Glamorgan University

Software Quality Management in practice - how effective is ISO 9000?

Tuesday May 12 1998 - Professor Frank Bott, Aberystwyth University

A partial order on permutations with applications to Computer Science

Thursday May 14 1998 - Professor Michael Atkinson, University of St Andrews

Symmetry in graph labeling problems

Wednesday May 27 - Douglas Rogers. 

Personalizable Hyperlink-based Systems

Tuesday July 14 - James Ohene-Djan.