Researcher, engineer, hacker, and transhumanist working to understand, enhance, and interface with the human mind.

I believe we are all cyborgs: intelligent biological agents upgraded by the technology we create. This opens up the possibility of transcending the cognitive limitations imposed by disease, age, and evolution to improve people's lives. I want to continue down this path, both building and living as the next generation of human intelligence and experiential enhancement. This work includes tools that enhance intelligence and experience across knowledge, memory, learning, conversation, social interaction, health, music, sensing, and more, realized through wearable computers, brain-computer interfaces, and beyond.

Recently, I've been working on contextual search engine smart glasses and all-day wearable brain sensing.

Background

My hard skills lie in software + electrical engineering and signals (see the big ol' list in my CV), and my interests focus on HCI, psychology + neuroscience, infrastructure, communication, and adventure. I've collaborated with university research labs at UofT (Steve Mann), UofA (Kyle Mathewson), and NUS (HCI Lab), attended the Wolfram Summer School, built brain-sensing glasses at Blueberry, road-tripped across Canada in a cybernetic mobile research lab, and completed an engineering degree (BESc).

Please feel free to reach out.

Current (2022):

Featured Work/Fun

In no reasonable order...

BrainJam: Electric Music Brain Stimulation - tACS, GVS, tDCS Wearable

See all projects...

Experience a whole new dimension of music with BrainJam, a wearable neurotechnology system that lets a user experience music and galvanic vestibular stimulation in sync. (Collab: Jeremy Stairs)
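
To make "in sync" concrete, here is a minimal Python sketch of one way such a system could work: derive a control signal from the music's amplitude envelope and use it to scale the stimulation waveform, capped at a safe current. The sample rates, current limit, and function name are illustrative assumptions, not BrainJam's actual implementation.

```python
# Minimal sketch (not BrainJam's actual code): turn a song's amplitude envelope
# into a control signal for a galvanic vestibular stimulation driver.
import numpy as np
from scipy.io import wavfile
from scipy.signal import hilbert, resample

STIM_RATE = 1000       # samples/s sent to the (hypothetical) current driver
MAX_CURRENT_MA = 1.0   # hard safety cap on stimulation amplitude (assumed)

def gvs_waveform_from_music(wav_path: str) -> np.ndarray:
    """Return a stimulation-amplitude time series (mA) that follows the music."""
    fs, audio = wavfile.read(wav_path)
    if audio.ndim > 1:                               # mix stereo down to mono
        audio = audio.mean(axis=1)
    envelope = np.abs(hilbert(audio.astype(float)))  # how "loud" the music is
    envelope /= max(envelope.max(), 1e-9)            # normalize to 0..1
    n_out = int(len(audio) / fs * STIM_RATE)
    control = resample(envelope, n_out)              # match the stimulator's rate
    return np.clip(control, 0.0, 1.0) * MAX_CURRENT_MA
```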

Wearable Intelligence System

Description and Code

1. A software framework to serve as the backend for a number of Wearable Computing research experiments, use cases, and applications.
2. Baked-in tools to upgrade human intelligence with smart glasses, including conversational intelligence, social intelligence, memory, knowledge, and thinking upgrades.
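
To give a flavour of what a backend like this does, here is a minimal sketch of one possible loop: the glasses stream transcribed speech to a server, and the server pushes back contextual information to show on the heads-up display. The WebSocket transport, port, message format, and glossary lookup are all illustrative assumptions, not the project's real API.

```python
# Minimal sketch (not the project's real backend): glasses stream transcript
# text over a WebSocket; the server replies with contextual info for the HUD.
import asyncio
import json
import websockets  # pip install websockets

# Toy "knowledge" lookup standing in for real definition / search services.
GLOSSARY = {
    "ssvep": "steady-state visual evoked potential",
    "tacs": "transcranial alternating current stimulation",
}

async def handle_glasses(websocket, path=None):  # path kept for older websockets versions
    async for message in websocket:
        transcript = json.loads(message)["text"]
        hits = {w: GLOSSARY[w] for w in transcript.lower().split() if w in GLOSSARY}
        if hits:  # only push to the display when there is something useful to show
            await websocket.send(json.dumps({"display": hits}))

async def main():
    async with websockets.serve(handle_glasses, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```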

Mobilab: 40-Year-Old RV Mobile Hacker Lab

See all projects...

I converted a 40-year-old RV into a mobile hacker lab for wearable computing adventures across North America.

Wearable Social Assistant

Article

Code

Improving users' emotional intelligence, with a focus on autistic users, via a computer co-processor that provides insights into the non-verbal communication being displayed around them.

A custom wearable computer with processor, camera, microphone, stereo sound, WiFi + Bluetooth, onboard power, and all-day battery life. The social tools AI software stack runs directly on the wearable.
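
As a rough illustration of the software side, here is a minimal sketch of the kind of loop such a co-processor could run: detect faces in the camera feed, classify each one's expression, and surface the result to the wearer as a cue. The face detector is standard OpenCV; classify_expression is a hypothetical placeholder for a trained expression-recognition model, not the assistant's actual stack.

```python
# Minimal sketch (not the project's software stack): scan the wearable's camera
# feed for faces, classify each expression, and report the results as cues.
import cv2  # pip install opencv-python

# classify_expression() is a hypothetical stand-in for a trained facial
# expression model (e.g. returning "happy", "neutral", "confused").
def classify_expression(face_bgr) -> str:
    raise NotImplementedError("plug in an expression-recognition model here")

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def social_cues(frame_bgr) -> list[str]:
    """Return one expression label per detected face in a camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [classify_expression(frame_bgr[y:y + h, x:x + w]) for (x, y, w, h) in faces]
```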

Human Eye as a Camera

See all papers...

A wearable computer that can scan the visual cortex and image what the subject is looking at.

This was accomplished using a wearable SSVEP EEG system.
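
For context, SSVEP decoding works by tagging regions of the visual field with distinct flicker frequencies and checking which frequency dominates the EEG power spectrum over the visual cortex; whichever flicker the wearer attends to shows up strongest. A minimal Python sketch of that idea (the sample rate, candidate frequencies, and function are assumptions for illustration, not the system's code):

```python
# Minimal sketch of SSVEP detection: the attended flicker frequency shows
# elevated power in occipital EEG, so score each candidate frequency's power.
import numpy as np
from scipy.signal import welch

FS = 250.0                                # EEG sample rate in Hz (assumed)
FLICKER_FREQS = [8.0, 10.0, 12.0, 15.0]   # candidate stimulus frequencies (assumed)

def detect_attended_frequency(occipital_eeg: np.ndarray) -> float:
    """Return the flicker frequency with the most power in a 1-D occipital channel."""
    freqs, psd = welch(occipital_eeg, fs=FS, nperseg=int(FS * 2))
    scores = [psd[(freqs >= f - 0.25) & (freqs <= f + 0.25)].mean()
              for f in FLICKER_FREQS]
    return FLICKER_FREQS[int(np.argmax(scores))]

# Quick check with synthetic data: a 12 Hz "response" buried in noise.
t = np.arange(0, 10, 1 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + np.random.randn(t.size)
print(detect_attended_frequency(eeg))     # expected: 12.0
```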

Collaboration with Steve Mann, Derek Lam, Kyle Mathewson, Jeremy Stairs, Jesse Hernandez, Georges Kanaan, Luke Piette, and Humza Khokhar.

Audio Evoked Potentials BCI (early prototype)

A rapidly prototyped EEG BCI that uses neural speech entrainment to identify moments in received speech that the listener finds relevant, tagging those events in memory. The wearable computer listens to audio, runs it through a digital signal processing pipeline, and correlates the processed audio with filtered EEG data in real time. Using this Audio Evoked Potentials prototype, we built a wearable headband with light indicators that display the wearer's auditory attention.
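
The core signal-processing step is simple to sketch: extract the amplitude envelope of the incoming speech, band-pass the EEG to the low frequencies where cortical speech tracking lives, and correlate the two; a high correlation suggests the listener is locked onto the speech. The sample rate, band edges, and function below are illustrative assumptions, not the prototype's code.

```python
# Minimal sketch of the envelope-tracking idea (illustrative, not the prototype's
# pipeline): correlate the speech envelope with low-frequency EEG and treat a
# high correlation as a sign the listener is entrained to the speech.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, resample

FS_EEG = 250  # EEG sample rate in Hz (assumed)

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def entrainment_score(audio: np.ndarray, eeg: np.ndarray) -> float:
    """Pearson correlation between the speech envelope and 1-8 Hz EEG.
    A real-time system would do this over sliding windows and a range of lags."""
    envelope = np.abs(hilbert(audio))          # amplitude envelope of the speech
    envelope = resample(envelope, eeg.size)    # put it on the EEG sampling grid
    eeg_low = bandpass(eeg, 1.0, 8.0, FS_EEG)  # band where cortical tracking lives
    env_low = bandpass(envelope, 1.0, 8.0, FS_EEG)
    return float(np.corrcoef(env_low, eeg_low)[0, 1])
```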

Collaboration with Kyle Mathewson at the APP Lab and Jeremy Stairs.