The Liquid Interface for Musical Expression (LIME)

EditedArts Facebook Page

Demonstration and Audio Description
Project Blog


Video link

Collaboration with Vytautas Niedvaras
A musical performance using embodied technologies. We made a set of sensor-equipped smart gloves that track the movement of the performer's hands along with the relative positions of their fingers; from this data we detect different types of motion and gesture. Used in combination, these let the performer control the sound output. We also control a TENS muscle stimulator attached to the performer, overriding their control of movement when a current is applied.
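As an illustration of the gesture-detection step, a minimal sketch in Python. The sensor names, thresholds and gesture labels here are invented for the example; the performance's actual mapping is not shown on this page.

```python
def classify_gesture(flex, accel_mag, fist_thresh=700, wave_thresh=1.5):
    """Map raw glove readings to a coarse gesture label.

    `flex` is a list of per-finger flex-sensor readings (higher = more
    bent) and `accel_mag` is the magnitude of hand acceleration in g.
    All thresholds and labels are illustrative placeholders.
    """
    if all(f > fist_thresh for f in flex):
        return "fist"        # all fingers curled past the bend threshold
    if accel_mag > wave_thresh:
        return "wave"        # fast, open-handed motion
    return "open"            # resting open hand

# A control loop could then route each label to a sound parameter,
# e.g. "fist" mutes the output while "wave" raises a filter cutoff.
gesture = classify_gesture([720, 740, 731, 755, 760], accel_mag=0.9)
```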

Footage from self-curated performance evening: Event Page

False Efficiency


Video Clip

Collaboration with Rosie Gibbens
"Using sensors and microphones on my body and the table, an unsettling soundscape is created as I perform the gestures. It builds in intensity at moments of increased movement. The stamp cues whispered phrases from Ivanka’s book. These were chosen by me for their corporate jargon, aspirational nature or gendered associations"

Shown as part of the ‘Youth Board Presents’ at Toynbee Studios. April 2018. Photos by Greg Goodale.

Strava Swarm

Experimenting with the Dispersive Flies Optimisation (DFO) swarm algorithm to analyse real route data taken from the Strava API. This project explores the relationships between the natural characteristics of swarms and their behaviours within different landscapes.

Vibrace, an alternative approach to wearable technology

Hardware hackathon at Google Campus. A prototype for a smartwatch that communicates with multiple vibration motors. We got by with a lot of improvisation and experimentation, and managed to produce our first working model over the weekend.

More prototype photos

From here we went on to build a first prototype to pitch to potential supporters. We've gained interest from our university careers service, who have entered us into the Goldsmiths Innovation Awards. With this we hope to move on to building marketable prototypes as the next stage of bringing Vibrace to market.


First university group project. As a team of five we built a musical instrument app based on the flocking algorithm. Our collective programming experience was close to nothing, but we learned a lot over the two months working together.

pHloc() web app (Recommended use with touchscreen)
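The flocking behaviour behind an app like this can be sketched with Reynolds' three boid rules. This Python version is illustrative only, not the pHloc() implementation, and its weights, radius and world size are made-up values.

```python
import random

class Boid:
    """A point agent with a 2-D position and velocity."""
    def __init__(self):
        self.pos = [random.uniform(0, 100), random.uniform(0, 100)]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def step(boids, radius=20.0, sep_w=0.05, ali_w=0.05, coh_w=0.005):
    """One update of the three flocking rules, then move and wrap."""
    for b in boids:
        neighbours = [o for o in boids
                      if o is not b and dist(b.pos, o.pos) < radius]
        if not neighbours:
            continue
        n = len(neighbours)
        for d in range(2):
            centre = sum(o.pos[d] for o in neighbours) / n
            avg_vel = sum(o.vel[d] for o in neighbours) / n
            away = sum(b.pos[d] - o.pos[d] for o in neighbours) / n
            b.vel[d] += coh_w * (centre - b.pos[d])    # cohesion
            b.vel[d] += ali_w * (avg_vel - b.vel[d])   # alignment
            b.vel[d] += sep_w * away                   # separation
    for b in boids:
        for d in range(2):
            b.pos[d] = (b.pos[d] + b.vel[d]) % 100     # wrap the world
```

To turn a flock into an instrument, properties such as the flock's centroid or spread could be mapped to pitch or amplitude each frame.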

Unity experiments

Full Photo Gallery
LAB colour space

Using objects in a simple playground to represent different colour spaces, starting with XYZ-to-RGB conversions showing pure green, yellow and cyan. I then wrote an XYZ-to-CIELAB function, displaying a spectrum between red and purple. CIELAB is an approximately perceptually uniform colour space in three dimensions, allowing for smoother gradients and colour schemes.
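The XYZ-to-CIELAB conversion itself follows the standard CIE formula; a minimal Python version, assuming a D65 white point (the original project's code is not shown here):

```python
def xyz_to_lab(x, y, z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ to CIE L*a*b*, relative to a D65 white point."""
    def f(t):
        # Cube root above the linearity threshold, linear segment below it
        d = 6 / 29
        return t ** (1 / 3) if t > d ** 3 else t / (3 * d * d) + 4 / 29

    fx, fy, fz = (f(c / w) for c, w in zip((x, y, z), white))
    L = 116 * fy - 16          # lightness
    a = 500 * (fx - fy)        # green-red axis
    b = 200 * (fy - fz)        # blue-yellow axis
    return L, a, b

# The white point itself maps to L* = 100, a* = b* = 0
print(xyz_to_lab(95.047, 100.0, 108.883))  # → (100.0, 0.0, 0.0)
```

Interpolating between two colours in L*a*b* and converting back gives the perceptually even gradients mentioned above.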

Fruit Salad Music


Video link



William Primett

Goldsmiths, University of London. Immersing myself in digital art, animation, sound programming and innovative design.

Contact / Pages:
w.primett [at]
Resident Advisor


  • C++, Java, Python, ChucK, TensorFlow, Keras
  • openFrameworks, Processing, Maya, Unity
  • Max MSP, Ableton Live, Adobe Suite