Open Arts Academy

Master's Thesis in Integrated Digital Media. May 2021.

Prototype || Thesis Report

Open Arts Academy (abbreviated here as ‘OAA’) is a platform and community dedicated to effective, empathetic, and equitable computer science education for individuals who, for any reason, will not attend or have not attended a formal computer science program. Instead, OAA's users are people who intend to use computer science as a means to an end for developing their own projects, such as personal websites, artistic projects and portfolios, or small business websites.

The intended audience of Open Arts Academy is learners who are working on a project and using a particular language for self-actualization, rather than to get a job as a developer. These means-to-an-end users could be called artists, hobbyists, tinkerers, etc.

The bulk of this thesis involved rethinking the way programming tutorials are presented and ideating new methods for encouraging critical problem solving as a means of learning computer science skills, as opposed to memorizing the "magic words" of a particular language.

See the above links to view the website prototype for Open Arts Academy and to read my thesis report.


Convolution Reverb Tutorial

WebAudio Tutorial. February 2021.

Tutorial || Demo

This is an intermediate tutorial that explains how to use the Web Audio API's ConvolverNode to create convolution reverb. In the tutorial, learners go through the process of building a web app that plays an audio file (triggered by pressing the a, s, d, and f keys) through a convolution reverb that can be changed (by clicking the buttons).
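
For readers curious about the underlying pattern before opening the tutorial, here is a minimal sketch of the idea: a sample routed through a ConvolverNode loaded with an impulse response. The file paths and key mapping are placeholders, not the tutorial's actual assets.

```js
const ctx = new AudioContext();

// Fetch and decode an audio file into an AudioBuffer
async function loadBuffer(url) {
  const response = await fetch(url);
  return ctx.decodeAudioData(await response.arrayBuffer());
}

async function setup() {
  // The ConvolverNode applies the reverb of whatever impulse response it holds
  const convolver = ctx.createConvolver();
  convolver.buffer = await loadBuffer('impulses/hall.wav'); // placeholder path
  convolver.connect(ctx.destination);

  const sample = await loadBuffer('samples/note.wav'); // placeholder path

  // Play the sample through the reverb when one of the mapped keys is pressed
  document.addEventListener('keydown', (event) => {
    if (['a', 's', 'd', 'f'].includes(event.key)) {
      if (ctx.state === 'suspended') ctx.resume(); // browsers require a user gesture
      const source = ctx.createBufferSource();
      source.buffer = sample;
      source.connect(convolver);
      source.start();
    }
  });
}

setup();
```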


Socket Step Sequencer

Collaborative Musical Web App. December 2020.

Socket Step Sequencer allows users to collaborate with their musical partners from anywhere in the world. The web app features an 8-note scale for melody and harmony that can be set to major, minor, or pentatonic. It also has rhythm samples and a BPM slider to change the speed of the loop.

Socket Step Sequencer was built with JavaScript, Tone.js, p5.js, Node.js, and Socket.io, and is hosted on Heroku.
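
As a rough sketch of how the collaborative part can work (the event name and grid shape here are illustrative, not the app's actual ones), each client emits its step changes and the server rebroadcasts them to everyone else:

```js
// server.js — relay step toggles between connected collaborators
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.on('toggleStep', (step) => {
    // step: { row, col, on } — rebroadcast to everyone except the sender
    socket.broadcast.emit('toggleStep', step);
  });
});
```

```js
// client.js — keep a shared on/off grid in sync across collaborators
const socket = io();
const grid = Array.from({ length: 8 }, () => Array(16).fill(false));

function toggleStep(row, col, on) {
  grid[row][col] = on;                          // apply the local edit
  socket.emit('toggleStep', { row, col, on });  // tell the server about it
}

socket.on('toggleStep', ({ row, col, on }) => {
  grid[row][col] = on;                          // mirror a collaborator's edit
});
```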

Try Socket Step Sequencer here.
View the GitHub repo here.

Invisible Intentions

Live Streamed Sound Installation. April 2020.

What started out as an immersive sound installation using data collected from plants turned into a live stream after the outbreak of COVID-19. Invisible Intentions is an audio/visual installation dedicated to granting sonic agency to plants and bringing attention to plant intentionality.

Prompted by my personal interest in climate change and by how little most people know about the knowledge and intentions of plants, this piece regards plants as living, breathing organisms with their own needs, aspirations, and self-expression. The piece consists of sensors that read data from the plants. The data is collected and interpreted sonically, then sent (along with a live video of the plants) to a Twitch live stream that runs for long periods of time. The sonic experience of the installation is meant to be somewhat peaceful and atmospheric. Users in the chat are able to use commands to change different aspects of the piece, including the color of the LED lights around the plants and parameters of the generated sound. This piece targets those who may think of plants as inanimate objects, without agency.

Above is an example of the Twitch stream.
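
A small illustrative sketch of the chat-command idea follows; the command names and parameters are hypothetical, not the stream's actual commands.

```js
// Hypothetical command names and state — not the stream's actual commands
const state = { ledColor: 'green', reverbWet: 0.5 };

function handleChatCommand(message) {
  const [command, arg] = message.trim().split(/\s+/);
  switch (command) {
    case '!color':
      state.ledColor = arg;                                   // e.g. "!color purple"
      break;
    case '!reverb':
      // clamp to 0..1, e.g. "!reverb 0.8" makes the sound more washed out
      state.reverbWet = Math.min(1, Math.max(0, parseFloat(arg) || 0));
      break;
  }
  return state;
}

// A Twitch chat client (e.g. tmi.js) would call this once per chat message
console.log(handleChatCommand('!color purple'));
```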


Softwarpipes

New Musical Instrument. November 2019.

Softwarpipes is an instrument developed using MaxMSP, an Arduino, and the TouchOSC app for iPhone. It exists as part of a larger project examining the political history of the bagpipes and placing it in a modern context. To get the proper sound of a bagpipe, I developed an automatic grace-noting system controlled by the Arduino in one hand, simulating the chanter. The other hand, holding an iPhone running TouchOSC, simulates the gesture of putting pressure on the bag, with the phone's rotation data changing the timbre and additive frequencies of the drone.
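
The rotation-to-drone mapping can be illustrated with a small sketch, written here in JavaScript/Web Audio rather than the actual MaxMSP patch, and with an assumed pitch and partial count: each partial of an additive drone is re-weighted as the rotation value changes.

```js
const ctx = new AudioContext();
const fundamental = 110; // assumed drone pitch

// Build five partials of an additive drone, each with its own gain
const partials = [1, 2, 3, 4, 5].map((n) => {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = fundamental * n;
  gain.gain.value = 1 / n;                  // default 1/n rolloff
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  return { gain, n };
});

// Called with a normalized rotation value (0..1), e.g. received from TouchOSC;
// higher values push energy into the upper partials, brightening the timbre.
function setRotation(rotation) {
  partials.forEach(({ gain, n }) => {
    const weight = (1 / n) * (0.2 + rotation * (n / 5));
    gain.gain.setTargetAtTime(weight, ctx.currentTime, 0.05);
  });
}
```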


Natural Artifice

Audio/Visual Installation. October 2019.

Natural Artifice is a project that explores plant intelligence and intentionality. The piece is a Max/Jitter patch based on an L-System GUI developed by Robin Parmar, which can be viewed here: L-System GUI. The patch contains an L-System generator that interacts with a video asset from the 1978 documentary The Secret Life of Plants, in addition to a frequency generator that emits sine waves based on the L-System's position. It also includes a voiceover discussing plant intelligence and our perception of it. A link to a screen capture of the patch is provided here: Natural Artifice.
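
For readers unfamiliar with L-Systems, the core rewriting idea looks roughly like this (a generic sketch with an example rule set, not the rules used in the patch):

```js
// Rewrite an axiom by applying production rules a fixed number of times
function lSystem(axiom, rules, iterations) {
  let current = axiom;
  for (let i = 0; i < iterations; i++) {
    current = [...current].map((symbol) => rules[symbol] ?? symbol).join('');
  }
  return current;
}

// Example branching rule set; each resulting symbol could be mapped to a
// drawing step or, as in the piece, to a sine-wave frequency.
console.log(lSystem('F', { F: 'F[+F]F[-F]F' }, 2));
```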


Bodies Bound in Binary

Audio/Visual Performance. May 2019.

Bodies Bound in Binary is an experimental audio/visual performance by Melissa Canavan and me. The audio and visual components of the performance work together to tell the narrative of a lost relationship and how the protagonist's memories of it become corrupted over time. Both the audio and video patches for the performance were written by me in MaxMSP. The audio patch consists of a vocoder in which the audio feed is multiplied by saw wave frequencies controlled by a MIDI controller. These are then repeated using a looping pedal to create a backing track over which the lyrics to my song, "Bodies Bound in Binary," are performed. The video patch uses video compositing, multiplying the ARGB channels of multiple prerecorded video clips in ways controlled live by Melissa to create generative effects that are unique to each performance.
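
The core audio idea (multiplying a live feed by a saw wave) can be sketched outside of Max as follows; the frequency value and routing are assumptions for illustration, not the performance patch itself.

```js
const ctx = new AudioContext();

async function start(sawFrequency = 220) {
  // Live input (the vocal feed in the performance)
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const mic = ctx.createMediaStreamSource(stream);

  // A GainNode acts as the multiplier: with a base gain of 0, the saw wave
  // driving gain.gain fully determines the output level, so the result is
  // roughly mic * saw.
  const modulator = ctx.createGain();
  modulator.gain.value = 0;

  const saw = ctx.createOscillator();
  saw.type = 'sawtooth';
  saw.frequency.value = sawFrequency; // in the piece, set from the MIDI controller
  saw.connect(modulator.gain);
  saw.start();

  mic.connect(modulator).connect(ctx.destination);
}
```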


Till I See You Again

GIF made using video compositing and color streaking in MaxMSP. Lucas White. March 2019.

Self Portrait

GIF made using video compositing and color streaking in MaxMSP. Lucas White. April 2018.

Charmacy

GIF made using video compositing and color streaking in MaxMSP. Lucas White. April 2018.

New Con

Music Video made for Felix Simpson-Orlebar using video compositing and coloration in MaxMSP. Lucas White. October 2017.



Near Far Nowhere

Interactive Installation Performance. May 2018.

Near Far Nowhere was a live audio/visual performance by Lucas White utilizing MaxMSP. Lucas created a Max patch that enabled him to composite different video clips on top of one another, controlling how their channels were combined in real time. The performance featured Lucas on voice and visuals, Sage Beasley (who co-wrote the song with Lucas) on piano, and Esteban Espana on guitar. The videos included in the performance starred Lucas and were shot by Katrina Blasco.


The Flame's Truth

Still and Moving Images Fall 2017. Joris Kwong, Ari Bennett & Lucas White. December 2017.

In The Flame's Truth, the protagonist (played by me) takes extreme measures to destroy a past that has come back to bite him.

This film was part of the 'Genre Study' final project for Still & Moving Images. Ari, Joris, and I chose to focus on the styles and tropes of Westerns and Noirs and to bring them out of their original time periods and settings into the present-day city. Our attempt at recreating the Noir style shows in the lighting and the story itself, while the Western style is reflected in the duel, the extreme close-ups on the eyes, and the big, bold font choice. The film was also scored by me and the other actor, Sage Beasley.