In this thesis we examine the possibilities of integrating specifications written in different formalisms used in the description of programming languages within a single framework. We suggest that the theory of institutions provides a suitable background for such integration, and we develop descriptions of several formalisms within this framework. While we do not merge the formalisms themselves, we see that it is possible to relate modules from specifications in each of them, and this is demonstrated in a small example.
The focal point of the thesis is the provision of the semantics for a small language in Z. We believe our approach to be unique, in that we have not only described the semantic mapping functions using Z, but we are also taking Z to be our semantic domain; that is, our semantics will map a program into a corresponding Z specification. We assert the usefulness of such a specification as a basis for the analysis of the program. The semantics are analysed in detail, and further work is done towards the unification of various aspects of program analysis under the Z notation, and in particular, under our predicate version of the schema calculus; the use of this notation and calculus to explore the link between programs and specifications is a dominant theme throughout the thesis.
Fuzzy logic technology has been successfully applied in many industrial control applications, making a number of difficult problems easier to solve. Using Fuzzy Logic provides a clear account of the principles of fuzzy logic based design, from a computer and electronics engineering perspective. Structured in a pedagogical manner, the book incorporates the latest fuzzy logic techniques, emphasizing hardware and software design for fuzzy systems and fuzzy logic development tools.
In this paper, we present a new transformation algorithm which automatically transforms pi-calculus processes to remove all explicit internal communication. We prove that the transformation preserves full bisimulation equivalence, and also that the transformation always terminates.
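The kind of step the transformation eliminates can be illustrated with the standard pi-calculus communication rule (this is the general reduction rule, not the paper's algorithm): a process sending a value v on a restricted internal channel a to a waiting receiver performs a silent tau-step,

```latex
(\nu a)\,\bigl(\overline{a}\langle v\rangle.P \;\mid\; a(x).Q\bigr)
\;\xrightarrow{\;\tau\;}\;
(\nu a)\,\bigl(P \;\mid\; Q\{v/x\}\bigr)
```

Removing such internal communications while preserving full bisimulation equivalence means the transformed process must remain indistinguishable from the original to any external observer.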
In this paper we describe the design and implementation of a system for representing context-free grammars in C++. The system allows for grammar representation at the object level, providing enhanced modularity and flexibility when compared to traditional generator-based approaches. We also describe the transformation of grammar flow analysis problems into an object-oriented framework using the Visitor pattern, as well as the implementation of a top-down LL(1) parser. As such, this work represents the synthesis of three presently disparate fields in parser design and implementation: combinator parsing, fixpoint-based grammar flow analysis, and object-oriented design.
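The combination of grammar symbols as objects, a visitor, and fixpoint iteration can be pictured as follows. This is a Python sketch with invented names, not the paper's C++ system; it computes the classic "nullable" grammar flow analysis (which symbols can derive the empty string) by iterating a visitor pass until nothing changes:

```python
# Grammar symbols as objects; a Visitor computes the "nullable"
# grammar flow analysis by fixpoint iteration.  Illustrative only.

class Terminal:
    def __init__(self, sym):
        self.sym = sym
    def accept(self, v):
        return v.visit_terminal(self)

class NonTerminal:
    def __init__(self, name):
        self.name, self.alternatives = name, []
    def accept(self, v):
        return v.visit_nonterminal(self)

class NullableVisitor:
    """One pass of the nullable analysis; iterate to a fixpoint."""
    def __init__(self):
        self.nullable = {}          # non-terminal name -> bool
        self.changed = False
    def visit_terminal(self, t):
        return False                # a terminal never derives the empty string
    def visit_nonterminal(self, n):
        old = self.nullable.get(n.name, False)
        # nullable iff some alternative consists entirely of nullable symbols
        new = any(all(self.lookup(s) for s in alt) for alt in n.alternatives)
        self.nullable[n.name] = new
        self.changed = self.changed or new != old
        return new
    def lookup(self, s):
        if isinstance(s, NonTerminal):
            return self.nullable.get(s.name, False)
        return s.accept(self)

def solve(nonterminals):
    """Apply visitor passes until the analysis reaches a fixpoint."""
    v = NullableVisitor()
    while True:
        v.changed = False
        for n in nonterminals:
            n.accept(v)
        if not v.changed:
            return v.nullable

# Example grammar:  S -> A 'b' | A ;  A -> 'a' | (empty)
a, b = Terminal('a'), Terminal('b')
S, A = NonTerminal('S'), NonTerminal('A')
S.alternatives = [[A, b], [A]]
A.alternatives = [[a], []]
```

Here the second pass discovers that S is nullable through its alternative [A], once the first pass has established that A is nullable; this dependence is what makes the fixpoint iteration necessary.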
Dynamic quantitative measurements of bytecode and stack frame usage by Eiffel and Java programs on the Java Virtual Machine (JVM) are made. Two Eiffel programs are dynamically analysed while executing on the JVM and the results compared with those from the Java programs. The aim is to examine whether properties like instruction usage and stack frame size are properties of the Java programming language itself or are exhibited by Eiffel programs as well. Investigations analyse how the different assertion-checking options and optimisations available with the SmallEiffel compiler affect bytecode and stack frame usage.
In this paper we present and discuss an implementation of combinator parsers in Standard ML. Several parsers are presented, ranging from a conventional backtracking parser to a predictive parser based on dynamically calculated lookahead sets. All of these parsers are based on the same principle of re-interpreting the standard union and concatenation operators of a context-free grammar, with the necessary overloading of these operators being provided via ML's module language.
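The re-interpretation idea can be sketched outside ML as well. The fragment below is illustrative only (the paper's implementation uses Standard ML's module language, not Python): it overloads `|` for a grammar's union and `+` for concatenation on a backtracking parser type, so a grammar expression reads almost like the grammar itself:

```python
class Parser:
    """A parser is a function from (input, position) to a list of
    (result, new position) pairs; an empty list means failure."""
    def __init__(self, fn):
        self.fn = fn
    def __call__(self, s, i=0):
        return self.fn(s, i)
    def __or__(self, other):
        # union: collect the results of both alternatives (backtracking)
        return Parser(lambda s, i: self(s, i) + other(s, i))
    def __add__(self, other):
        # concatenation: run self, then other from where self stopped
        def seq(s, i):
            return [(v + w, j2)
                    for (v, j1) in self(s, i)
                    for (w, j2) in other(s, j1)]
        return Parser(seq)

def lit(c):
    """Recognise a single literal character."""
    return Parser(lambda s, i: [(c, i + 1)] if i < len(s) and s[i] == c else [])

# a tiny grammar:  S -> 'a' 'b' | 'a' 'c'
S = lit('a') + lit('b') | lit('a') + lit('c')
print(S('ac'))  # one successful parse: [('ac', 2)]
```

A predictive variant would instead consult a lookahead set before committing to an alternative, avoiding the wasted work of trying both branches.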
A variety of specification formalisms exist for the precise description of programming language semantics; one such formalism is action semantics, which strongly emphasises modularity in the specification, coupled with a sophisticated set of "high-level" specification constructs. In this paper we describe the translation of action notation into the functional programming language CAML, a variant of ML. Because of its high level of abstraction, the notation itself demands a formal specification in terms of a more primitive notation, and it is this "kernel" notation that we implement.
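As a rough picture of what a kernel-level implementation involves, an action can be modelled as a function from incoming transient data to an outcome. The Python caricature below is purely illustrative (the paper targets CAML, and only `then` echoes a genuine action-notation combinator; the other names are invented here):

```python
# An "action" is modelled as a function from incoming transient data
# to either ('completed', data) or ('failed',).  Illustrative only.

def give(v):
    """Action that completes, giving the datum v."""
    return lambda data: ('completed', v)

def fail():
    """Action that always fails."""
    return lambda data: ('failed',)

def then(a1, a2):
    """Sequencing: the data given by a1 flows into a2;
    failure of a1 short-circuits the whole action."""
    def act(data):
        out = a1(data)
        if out[0] == 'failed':
            return out
        return a2(out[1])
    return act

def alt(a1, a2):
    """Invented alternative combinator: try a1; if it fails, do a2."""
    def act(data):
        out = a1(data)
        return a2(data) if out[0] == 'failed' else out
    return act

# "give 1 then give-the-successor", falling back to "give 0" on failure
succ = lambda data: ('completed', data + 1)
prog = alt(then(give(1), succ), give(0))
```

Even this toy version shows why a compositional functional language is a comfortable target: each combinator is just higher-order function composition with an outcome discipline.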
One of the advantages of formally specifying a programming language's semantics is that it provides a formal foundation for proving properties of programs, and equivalences between program constructs. Action Semantics provides a framework for constructing modular specifications, as well as a foundation (using its basic action laws) for deriving these proofs. In this paper we discuss the use of the Coq proof assistant (based on higher-order constructive logic) to build such proofs.
Institutions are abstract frameworks used for the description of
formal systems; one common application is giving formal semantics for
languages used in formal specification. This facilitates comparison
and integration of specifications written using these formalisms.
Two of the main formalisms used in the description of programming
language syntax are regular expressions and context-free grammars. In
this report we treat these as separate specification formalisms, and
construct an institution for each.
Building on previous work in programming language syntax, this paper gives institutions for some formalisms used in the description of programming language semantics. Specifically, we construct institutions for attribute grammars and van Wijngaarden grammars, and examine the possibility of extending the framework to include denotational- and axiomatic-style definitions.
The theory of fuzzy sets was developed as an extension of classical Boolean logic and has been applied to both low-level and mid-level computer vision problems. Viewing a digital image as an array of fuzzy singletons, where each pixel's membership represents the strength of some property at that point, allows the use of fuzzy logic in image processing. The paper contains a comparison of fuzzy and non-fuzzy approaches to the image enhancement, thresholding, noise suppression and edge detection problems. We then describe an attempt at image analysis using fuzzy geometric properties, in particular fuzzy perimeter, fuzzy compactness and the index of area coverage. It was hoped that, by extracting these properties for simple two-dimensional shapes, shape identification could be achieved. Fuzzy perimeter is then examined in some detail, and experimental results are presented and discussed.
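As a sketch of one such property: Rosenfeld's fuzzy perimeter sums the membership differences across adjacent pixel pairs, and reduces to the ordinary perimeter on a crisp (0/1) image. The Python below is an illustration of that definition, not the paper's implementation:

```python
def fuzzy_perimeter(mu):
    """Fuzzy perimeter of a membership array mu (values in [0, 1]):
    the sum of |mu(p) - mu(q)| over 4-adjacent pixel pairs (Rosenfeld)."""
    rows, cols = len(mu), len(mu[0])
    p = 0.0
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:                     # horizontal neighbour
                p += abs(mu[i][j] - mu[i][j + 1])
            if i + 1 < rows:                     # vertical neighbour
                p += abs(mu[i][j] - mu[i + 1][j])
    return p

# A crisp 2x2 square of 1s inside a 4x4 image has fuzzy perimeter 8,
# agreeing with its ordinary perimeter of 8 unit edges.
img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
```

On genuinely fuzzy memberships the measure varies smoothly with the pixel values, which is what makes it usable for shape properties such as compactness (area divided by the square of the perimeter).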
In this paper the design, implementation and operation of a fuzzy development tool are discussed. The fuzzy development tool consists of a graphical programming tool and a hardware-independent compiler. The graphical programming tool, which runs on MS-Windows, provides an effective means of capturing and designing the fuzzy knowledge base. The fuzzy language compiler, which translates the input fuzzy knowledge base and fuzzy reasoning rules into a portable module in C, allows users to explore the power of the fuzzy control approach and to incorporate the intelligent control module into the overall design of the fuzzy system.
The basic building block of a Z specification is the schema, and schemas can be used in logical expressions as if they were propositions. This paper presents an enhancement of this schema calculus which allows a predicate-like version of a schema to be used; this facilitates operations in which the variables of a schema need not all be treated as a single unit. We explore the use of this notation in the standard Z schema operations, in the proof obligations for specification refinement, and in the formulation of a weakest prespecification operator.
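As a rough illustration of the underlying idea (written in plain Z, not the paper's extended calculus), a horizontally-defined schema can appear in a predicate position, where it stands for its own defining property over the variables in scope:

```latex
S \mathrel{\widehat{=}} [\, x : \mathbb{N} \mid x > 0 \,]
\qquad\text{so that, used as a predicate,}\qquad
S \land x < 10 \;\Leftrightarrow\; x \in \mathbb{N} \land 0 < x \land x < 10
```

The paper's predicate version refines this further, so that individual variables of a schema can be constrained or quantified without always carrying the whole schema along as a unit.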
Revised: 7th October 1998
Contact: James Power