1. Task environment
The earlier version of Bol Processor, namely BP1, was built on a model of "pattern grammars" enabling both the production and fast parsing of sentences (Bel 1987a-b). The grammar format in BP2 is much less restrictive as far as production is concerned. A sound interface has been implemented (using the MIDI standard) which is not geared towards any particular synthesiser; it accepts musical input from any MIDI device.
Below is a block diagram showing the interaction of modules in BP2:
Fig. 1: A block diagram of Bol Processor BP2
Three fields are used for storing grammars, items generated by the inference engine, and sound-object prototypes. Items are represented as structures (strings and sets) of symbols. Each symbol is mapped to a single sound-object prototype or a note. The interpreter generates MIDI codes or Csound score lines given the symbolic structure of an item and the properties of sound-object prototypes.
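For illustration, the mapping from symbols to sound-object prototypes may be sketched as follows in Python; the class, field names and message values are hypothetical and do not reflect BP2's internal data structures.

from dataclasses import dataclass, field

@dataclass
class SoundObjectPrototype:
    """A prototype holding the data the interpreter needs for one symbol."""
    name: str
    midi_messages: list = field(default_factory=list)  # pre-recorded MIDI codes
    properties: dict = field(default_factory=dict)      # metrical/topological properties

# The three storage fields reduced to Python containers (assumption):
grammars = {}                       # production rules with weights
item = ["a", "b", "a", "-"]         # one generated item: a string of symbols
prototypes = {
    "a": SoundObjectPrototype("a", midi_messages=[(0x90, 60, 100)]),  # note-on C4
    "b": SoundObjectPrototype("b", midi_messages=[(0x90, 62, 100)]),  # note-on D4
}

# Each symbol of the item resolves to one prototype; "-" (a silence) has none.
resolved = [prototypes.get(symbol) for symbol in item]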
Interpretation is in two stages. Musical items (which may be polyphonic) are represented as strings of symbols in a syntactic form called a polymetric expression. First, a mapping is calculated between the sound-objects contained in a polymetric structure and a set of symbolic dates. Symbolic time, here, is an arbitrary ordered set that permits the ordering of sound-objects. The mapping of symbolic to physical time is called the interpretation of polymetric expressions. In this process, missing information regarding the ordering of sound-objects (along symbolic time) is inferred (Bel 1991, 1992a). Then BP2 proceeds to the time setting of the sound-object structure, represented as a complete polymetric expression. Start/clip dates of all sound-objects are calculated, yielding the dates of MIDI messages or Csound events. Sound-object properties are taken into account during time setting only.
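The two stages may be sketched as follows, under strong simplifying assumptions: a polymetric expression is reduced to nested Python lists (a tuple standing for concurrent sequences), symbolic dates are plain integers, and physical time is obtained from a constant time base. The sketch ignores sound-object properties and the rescaling of concurrent sequences performed by BP2; all names are hypothetical.

def assign_symbolic_dates(expr, start=0):
    """Stage 1: order sound-objects along symbolic time (integer dates here)."""
    events, t = [], start
    for element in expr:
        if isinstance(element, tuple):          # concurrent sequences start at the same date
            ends = []
            for sequence in element:
                sub_events, end = assign_symbolic_dates(sequence, t)
                events.extend(sub_events)
                ends.append(end)
            t = max(ends)
        else:                                   # a single symbol (one sound-object)
            events.append((element, t))
            t += 1
    return events, t

def time_setting(events, time_base=0.5):
    """Stage 2: map symbolic dates to physical start dates (in seconds)."""
    return [(symbol, date * time_base) for symbol, date in events]

events, _ = assign_symbolic_dates(["a", (["b"], ["c", "d"]), "e"])
print(time_setting(events))
# [('a', 0.0), ('b', 0.5), ('c', 0.5), ('d', 1.0), ('e', 1.5)]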
The block diagram indicates that external control can be exerted on the inference engine, the grammars, and the interpretation module. Specific MIDI messages may be used to change weights in grammars, the time base, and the nature of time (striated/smooth). These messages may also be used for synchronising events during their performance and even for assigning computation time limits. Such features are currently used in improvisational rule-based composition.
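Such control may be sketched as a dispatcher for incoming MIDI Control Change messages; the controller numbers and parameter names below are assumptions chosen for illustration, not BP2's actual assignments.

state = {
    "rule_weights": {"S -> a b": 5, "S -> b a": 5},
    "time_base_ms": 500,
    "striated_time": True,                  # False would stand for "smooth" time
}

def on_control_change(controller: int, value: int) -> None:
    """Dispatch a MIDI Control Change message to a BP2-like parameter (assumed mapping)."""
    if controller == 20:                    # (assumed) weight of one production rule
        state["rule_weights"]["S -> a b"] = value
    elif controller == 21:                  # (assumed) time base, scaled to milliseconds
        state["time_base_ms"] = 100 + 10 * value
    elif controller == 22:                  # (assumed) striated/smooth time switch
        state["striated_time"] = value >= 64

on_control_change(21, 40)                   # sets the time base to 500 ms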
Several BP2s may be linked together and to other software devices such as MIDI sequencers. Messages on different MIDI channels and Apple Events may be used for communicating between machines or for controlling several sound processors. It must be kept in mind that a "sound-object" is not necessarily a sound-generating process: depending on the implementation, it may contain any kind of control/synchronisation messages as well.
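Continuing the earlier sketch, a sound-object prototype may carry only a synchronisation message rather than note data; the class and containers are the hypothetical ones introduced above.

# A "sound-object" whose only content is a synchronisation message (no sound produced).
sync = SoundObjectPrototype(
    "sync",
    midi_messages=[(0xF8,)],                # MIDI real-time clock tick
)
prototypes["sync"] = sync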