My name is Drew Harris. I'm a musician and music producer, software engineer and hobbyist hardware creator. I currently work at Bandcamp. I graduated in 2013 from the University of Victoria in Canada with a Bachelor's degree in Electrical Engineering. During my degree I completed several internships as a software developer: as a node.js developer at exfm in New York City, as a Java developer at Alcatel-Lucent in Ottawa, and two at the University of Victoria, as a Python developer in the high energy physics lab and at the Computer Help Desk.
Along with software and hardware development, I compose and produce electronic music under several monikers, including Germany Germany, radioseven and Toy Camera. I've performed in Vancouver, New York City, Paris, and Berlin, among other locations, and I've released a total of 8 albums independently.
Please contact me at email@example.com for any inquiries.
Visual music performance accompaniment with MIDI and an Arduino, built for Germany Germany live performances.
Four strings of white LEDs are connected to the lightarray controller, a box containing an Arduino and switching transistors that amplify its output signals. MIDI control change messages (serial data) describing a four-channel light pattern, defined by a set of envelope parameters, are passed from Ableton Live on a connected computer to the lightarray controller over USB. The controller draws each channel's pattern on its corresponding LED string using pulse width modulation (PWM), amplified by the transistors, and also transmits the same information over radio frequency (RF) to remotely connected light strings. A newer iteration of this project, combining the lightarray controller with a hardware audio player, is in the works.
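The actual firmware runs in C++ on the Arduino, but the per-channel envelope math can be sketched in a few lines of Python. Everything here (parameter names, linear attack/decay shape, 0-255 PWM range) is my assumed illustration, not the real firmware:

```python
def envelope_level(t_ms, attack_ms, decay_ms, peak=255):
    """Brightness (0-255 PWM duty cycle) at t_ms after a trigger,
    for a simple linear attack/decay envelope. Hypothetical shape;
    the real lightarray firmware is Arduino C++."""
    if t_ms < 0:
        return 0
    if t_ms < attack_ms:
        # ramp up to peak over the attack phase
        return round(peak * t_ms / attack_ms)
    t = t_ms - attack_ms
    if t < decay_ms:
        # ramp back down to zero over the decay phase
        return round(peak * (1 - t / decay_ms))
    return 0
```

On the Arduino the resulting level would be fed to `analogWrite()` on each of the four PWM pins, with the transistors supplying the current the LED strings need.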
The original version (v1) of the project was presented at different performances in New York City, including Glasslands Gallery in Brooklyn. Since then, it has gone through different iterations and is consistently part of the Germany Germany performances.
A simple photo gallery, using Flickr as the photo storage backend.
The application relies on exfm's audio stream database and uses a WebSocket to maintain an open connection between client and server. When a client makes a change to a playlist (add song, change order, change title), the changes are propagated to all connected clients. This project is a work in progress and is currently in the 'alpha' stage.
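The fan-out logic at the heart of this is simple: apply the change to the shared playlist, then broadcast the same change to every open WebSocket. A minimal Python sketch of that server-side logic (the real app is node.js, and the operation names here are my own):

```python
import json

class PlaylistHub:
    """Sketch of the server-side playlist fan-out. Clients are any
    objects with a .send(str) method (e.g. WebSocket wrappers)."""

    def __init__(self):
        self.playlist = []   # ordered list of song ids
        self.clients = []    # connected clients

    def connect(self, client):
        self.clients.append(client)

    def apply(self, change):
        # apply the change to the canonical server-side state
        op = change["op"]
        if op == "add":
            self.playlist.append(change["song"])
        elif op == "move":
            song = self.playlist.pop(change["from"])
            self.playlist.insert(change["to"], song)
        # broadcast so every client converges on the same playlist
        message = json.dumps(change)
        for client in self.clients:
            client.send(message)
```

Because every client receives the same ordered stream of changes, they all converge on the same playlist state without resending the whole list.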
Written for the New York City Music Hackathon in October, multi is a node.js application to create dynamic multi-user audio installations, making use of speakers in laptops, mobile devices and tablet computers.
A multi server runs on one central computer connected to a local area network. Multi-track compositions are split into separate mp3 files and loaded into the multi server (up to 10 tracks were tested). Client computers - devices running a browser implementing the HTML5 Web Audio API - then connect to a web server run by multi and are presented with a simple web interface. Upon client connection, the multi server opens a WebSocket connection to the client, passes the URL of the assigned audio file as a WebSocket message and tells the client to buffer the file and prepare for playback. Once all clients are connected and buffered, the multi server is notified. Upon triggering of playback, all clients will (theoretically) begin playback simultaneously, presenting a surround-sound effect using only pre-existing speakers.
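The server-side bookkeeping described above can be sketched compactly: assign each new client a track, record buffering status, and report when everyone is ready. This is an illustrative Python sketch (the real multi is node.js, and the round-robin assignment scheme and method names are my assumptions):

```python
class MultiServer:
    """Sketch of multi's connection/buffering bookkeeping."""

    def __init__(self, track_urls):
        self.track_urls = track_urls
        self.clients = {}  # client id -> {"url": str, "buffered": bool}

    def connect(self, client_id):
        # assign tracks round-robin so all stems get covered first
        url = self.track_urls[len(self.clients) % len(self.track_urls)]
        self.clients[client_id] = {"url": url, "buffered": False}
        return url  # sent to the client as a WebSocket message

    def mark_buffered(self, client_id):
        # client reports its Web Audio buffer is ready for playback
        self.clients[client_id]["buffered"] = True

    def all_ready(self):
        return all(c["buffered"] for c in self.clients.values())
```

Once `all_ready()` is true, the server would broadcast a single "play" message to every WebSocket to trigger (near-)simultaneous playback.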
Unfortunately, due to network-related latency, delays of up to 300ms occurred after triggering playback, especially on mobile devices. The Web Audio API is still very new and different browsers implement it differently. This project is still a work in progress.
A small project, intended as an experiment to determine feasibility. Livectl is a node.js application to link audio performance and processing software over networked computers using WebSockets.
A server instance of livectl is run on a central server. Client instances connect to the server and present their MIDI inputs and outputs. Currently, all messages are broadcast to all clients, but a patchable web interface was planned. Unfortunately, the latency was found to be too unpredictable for live keyboard performance, though patching of audio applications such as Ableton Live worked well. This may be useful in a multi-computer electronic performance. This project is still a work in progress.
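Relaying MIDI over WebSockets mostly means unpacking the standard 3-byte channel messages into something serializable. A sketch of that decoding step (field names are mine; livectl itself is node.js):

```python
def parse_midi(msg):
    """Split a 3-byte MIDI channel message into its parts before
    relaying it as a WebSocket payload. The status byte packs the
    message type in the high nibble and the channel in the low
    nibble, e.g. 0x90 = note-on, channel 1."""
    status, data1, data2 = msg
    return {
        "type": status & 0xF0,     # note-on, note-off, CC, ...
        "channel": status & 0x0F,  # 0-15 (channels 1-16)
        "data1": data1,            # e.g. note number or CC number
        "data2": data2,            # e.g. velocity or CC value
    }
```

The dict can then be JSON-encoded and broadcast to every connected client, which re-packs it into raw MIDI bytes for its local outputs.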
Visual music performance accompaniment with openFrameworks and iOS, built for the Germany Germany live performance.
ViLA, Virtual Light Array, is the mobile companion to lightarray. Written in C++ with the openFrameworks toolkit, ViLA is my first iOS application. Similar to lightarray, ViLA receives a light pattern in real-time through UDP network broadcast packets. The light patterns are created in Ableton Live in the same way as the patterns for the physical lightarray installation.
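On the receiving end, each broadcast packet just needs to be unpacked into per-channel brightness levels. The actual wire format isn't documented here, so this Python sketch assumes the simplest possible layout - four unsigned bytes, one 0-255 level per channel (ViLA itself does this in C++/openFrameworks):

```python
import struct

def decode_packet(payload):
    """Decode a hypothetical lightarray UDP packet: four unsigned
    bytes, one brightness level (0-255) per channel. Assumed layout
    for illustration only."""
    return struct.unpack("4B", payload)
```

A real receiver would bind a UDP socket to the broadcast port and call this on each datagram, then map the four levels onto the on-screen light strips.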
The application works well with minimal latency on an ad hoc network, but UDP packets are dropped frequently on a public wireless router.
A weekend project to learn the Flask Python web framework.
Photos are batch uploaded in a zip file, which is extracted and parsed by the Python web application. Information about each image is retrieved at view time via an AJAX request.
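The core of the upload handler - unpacking the zip in memory and keeping only the image entries - can be done with the standard library alone. A sketch of that step (function name and accepted extensions are my assumptions; the real app wraps this in a Flask view):

```python
import io
import zipfile

def extract_photos(zip_bytes, extensions=(".jpg", ".jpeg", ".png")):
    """Unpack a batch-uploaded zip in memory and return a mapping of
    image filename -> raw bytes, skipping non-image entries."""
    photos = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for name in archive.namelist():
            if name.lower().endswith(extensions):
                photos[name] = archive.read(name)
    return photos
```

In a Flask view this would be called on `request.files["archive"].read()`, with each extracted image then handed off to the Flickr backend.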
A social/collaborative/community blog network.
Journal is a blog/community application built on the Django Python web framework. It was an experiment to create a smaller network of people that would communicate using text, images, audio and video. A planned extension was to have different communities for different interest groups, each posting its own specific material. Journal allowed private posting, with the intention of keeping a journal of activity that could be partially public and partially private.
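The partial-visibility rule reduces to a simple filter: public posts are visible to everyone, private posts only to their author. A Python sketch of that rule (field names are assumptions; the real app expressed this as a Django queryset):

```python
def visible_posts(posts, viewer):
    """Return the posts a given viewer may see: all public posts,
    plus the viewer's own private posts. Post field names are
    illustrative, not Journal's actual schema."""
    return [
        p for p in posts
        if p["visibility"] == "public" or p["author"] == viewer
    ]
```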
Journal is no longer maintained and was never implemented in a large scale. It was an experiment, as are all of these projects.