OM# (om-sharp) is a computer-assisted composition environment derived from OpenMusic: a visual programming language dedicated to the generation and processing of musical structures. The environment is based on Common Lisp and allows creating programs that are interpreted in this language. Visual programs are made by assembling and connecting icons ("boxes") representing Lisp functions and data structures, built-in control structures (e.g. loops), and other program constructs. The visual language can be used for general-purpose programming and can reuse any existing Common Lisp code. A set of built-in tools, editors, and libraries (including common music notation, MIDI, 2D/3D curves, audio, ...) makes it a powerful environment for music composition.
→ OM# is available on macOS, Windows, and Linux
OM# brings a number of new tools and features to your computer-assisted composition environment:
- Improved patching interface and environment: easier box inspection, display control, automatic alignment, connections, etc.
- No workspace to set up: open your documents and simply organize them in your usual file system.
- Interactive visualization of Lisp code corresponding to visual programs.
- A native implementation of the reactive mode for visual program execution.
- New loops: embed iterative processes in standard patches, using a collection of new collectors and memory utilities.
- A set of interface components (list selection, switch, button, slider, ...). Lock your patch for faster interaction.
- A redesigned sequencer interface including dual program/tracks-based visualization, meta-programming tools, and reactive execution modes.
- Human-readable, easily editable text format for patches and other documents. Patches can be read and edited as text.
- New score editors, BPF/BPC editors, etc. Nicer display and easier editing.
- Versatile containers handling the storage, visualization, and editing of collections of objects.
- A time-based model for "executable" objects, including dynamic function execution and data sending/transfer.
- Dynamically allocated audio buffers.
- Tools and editors for the representation and manipulation of musical objects (scores, sounds, MIDI tracks, ...).
- A framework for handling OSC structures and communication.
- ...
→ See also this ICMC paper (2017) for a quick overview.
→ Use Discussions to report problems, suggest features or enhancements, or just discuss the project!
OM# is a free software distributed under the GPLv3 license.
→ Source repository: https://github.com/cac-t-u-s/om-sharp/
As a Common Lisp program, OM# can be considered an extension of Lisp that includes the application's specific built-in features. The application is developed with the latest LispWorks compiler (7.1.2), which provides multi-platform support and graphical/GUI toolkits in Common Lisp. A limited "Personal" edition of LispWorks 7 is now available: its limited heap size requires compiling the sources in several successive runs, and it cannot produce new OM# executables; however, it does allow loading, running, and editing the program from its sources.
Alternatively, the OM# executable also includes a Lisp interpreter which can load and evaluate modifications and extensions of the program sources.
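As an illustration of this Lisp foundation, a source file loaded this way can contain any standard Common Lisp definitions. The sketch below is hypothetical: the file name, package name, and function are illustrative only and not part of any documented OM# API.

```lisp
;; hypothetical-extension.lisp
;; A plain Common Lisp file that the OM# interpreter could load and
;; evaluate at runtime. The :om package name is an assumption made
;; here for illustration.
(in-package :om)

(defun semitones->ratio (n)
  "Convert an interval in semitones to a frequency ratio."
  (expt 2 (/ n 12)))

;; (semitones->ratio 12) => 2 (one octave doubles the frequency)
```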
OM# can load patches created in OpenMusic. See how to import OpenMusic patches. Most OpenMusic external libraries are easily portable (or already ported): see how to create or adapt a library. Report any problems porting or converting libraries or patches on the Discussions forum.
External libraries are packages containing additional code that can be loaded dynamically in an OM# session. A few dedicated libraries are distributed here (see the list below), as well as a number of compatible OpenMusic libraries.
OM# external libraries are structured as a simple folder, called either "libname" or "libname x.y" (where "libname" is the name of the library, and "x.y" is a version number), containing a loader file named libname.olib (or .omlib).
→ Unzip the external libraries in a common container directory and specify this directory in Preferences/Libraries.
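For instance, with the "mathtools" library from the list below, an installed layout could look like this (the container directory name is arbitrary):

```
om-libraries/            <- common container directory, set in Preferences/Libraries
└── mathtools 1.0/       <- "libname x.y" (the version suffix is optional)
    └── mathtools.olib   <- the loader file
```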
| OM# libraries | |
|---|---|
| csound | A simple interface with the Csound synthesis language. |
| odot | Support for OSC encoding/decoding using CNMAT's libo library and the "o." expression language. |
| spat | A connection with IRCAM's Spat library for spatial audio control and rendering. |
| mathtools | An adaptation of OpenMusic's "Mathtools" package for mathematical music analysis and representations. |
| Compatible OpenMusic libraries | | |
|---|---|---|
| "Classics" | Externals/DSP tools | Third-party |
About the general design and implementation of OM#:
- Score Objects in OM#. Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’20/21), Hamburg, Germany (online), 2021.
- Next-generation Computer-aided Composition Environment: A New Implementation of OpenMusic. Jean Bresson, Dimitri Bouche, Thibaut Carpentier, Diemo Schwarz, Jérémie Garcia. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Timed Sequences: A Framework for Computer-Aided Composition with Temporal Structures. Jérémie Garcia, Dimitri Bouche, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’17), A Coruña, Spain, 2017.
- Computer-aided Composition of Musical Processes. Dimitri Bouche, Jérôme Nika, Alex Chechile, Jean Bresson. Journal of New Music Research, 46(1), 2017.
OM# was also used as a support for research and production in a number of other projects:
- Composing Structured Music Generation Processes with Creative Agents. Jérôme Nika, Jean Bresson. Joint Conference on AI Music Creativity (AIMC 2021), Graz, Austria (online), 2021.
- Instrumental Radiation Patterns as Models for Corpus-Based Spatial Sound Synthesis: Cosmologies for Piano and 3D Electronics. Aaron Einbond, Jean Bresson, Diemo Schwarz, Thibaut Carpentier. International Computer Music Conference (ICMC), Pontificia Universidad Católica de Chile (online), 2021.
- OM-AI: A Toolkit to Support AI-Based Computer-Assisted Composition Workflows in OpenMusic. Anders Vinjar, Jean Bresson. Sound and Music Computing conference (SMC'19), Málaga, Spain, 2019.
- Musical Gesture Recognition Using Machine Learning and Audio Descriptors. Paul Best, Jean Bresson, Diemo Schwarz. International Conference on Content-Based Multimedia Indexing (CBMI'18), La Rochelle, France, 2018.
- From Motion to Musical Gesture: Experiments with Machine Learning in Computer-Aided Composition. Jean Bresson, Paul Best, Diemo Schwarz, Alireza Farhang. Workshop on Musical Metacreation (MUME2018), International Conference on Computational Creativity (ICCC’18), Salamanca, Spain, 2018.
- Symbolist: An Open Authoring Environment for End-user Symbolic Notation. Rama Gottfried, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR'18), Montreal, Canada, 2018.
- Landschaften – Visualization, Control and Processing of Sounds in 3D Spaces. Savannah Agger, Jean Bresson, Thibaut Carpentier. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Interactive-Compositional Authoring of Sound Spatialization. Jérémie Garcia, Thibaut Carpentier, Jean Bresson. Journal of New Music Research, 46(1), 2017.
- o.OM: Structured-Functional Communication between Computer Music Systems using OSC and Odot. Jean Bresson, John MacCallum, Adrian Freed. ACM SIGPLAN Workshop on Functional Art, Music, Modeling & Design (FARM’16), Nara, Japan, 2016.
- Towards Interactive Authoring Tools for Composing Spatialization. Jérémie Garcia, Jean Bresson, Thibaut Carpentier. IEEE 10th Symposium on 3D User Interfaces (3DUI), Arles, France, 2015.
Support the development of OM#: Buy me a coffee!
Design and development: J. Bresson, with contributions by D. Bouche, J. Garcia, A. Vinjar, and others. This project uses code and features from the OpenMusic project by IRCAM - STMS lab.
Contact: https://j-bresson.github.io