Researcher, engineer, hacker, and transhumanist working to understand, enhance, and interface with the human mind.
I believe we are all cyborgs: intelligent biological agents upgraded by the technology we create. This opens the possibility of transcending cognitive limitations imposed by disease, age, and evolution to improve people's lives. I want to continue on this path, creating, and living with, the next generation of human intelligence and experiential enhancement. This work spans tools that enhance intelligence and experience across knowledge, memory, learning, conversation, social interaction, health, music, and sensing, realized through wearable computers, brain-computer interfaces, and more.
Recently, I've been working on smart glasses with a contextual search engine and on all-day wearable brain sensing.
Researching + learning about memory, cognition, brain sensing + stimulation, and UI
Launching the H20 Smart Glasses Community
Open-source projects contributor; see GitHub
In no particular order...
Experience a whole new dimension of music with BrainJam: a wearable neurotechnology system that lets a user experience music and galvanic vestibular stimulation in sync. (Collab: Jeremy Stairs)
1. A software framework serving as the backend for a number of Wearable Computing research experiments, use cases, and applications.
2. Baked-in tools to upgrade human intelligence with smart glasses, including conversational intelligence, social intelligence, memory, knowledge, and thinking upgrades.
I converted a 40-year-old RV into a mobile hacker lab for wearable computing adventures across North America.
Improving the emotional intelligence of users, with a focus on autistic users, via a wearable co-processor that provides insights into the non-verbal communication displayed around the wearer.
A custom wearable computer with a processor, camera, microphone, stereo sound, WiFi + Bluetooth, power management, and all-day battery life. The social-tools AI software stack runs on the wearable itself.
A wearable computer that reads activity from the visual cortex to determine what the subject is looking at.
This was accomplished using a wearable SSVEP (steady-state visually evoked potential) EEG system.
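The core of an SSVEP system is frequency tagging: each visual target flickers at a distinct rate, and the occipital EEG contains extra power at the frequency of whichever target is attended. As a rough illustration of the idea (not the actual system's code; the sampling rate, flicker frequencies, and harmonic count here are all assumptions), classification can be as simple as comparing spectral power at the candidate frequencies:

```python
import numpy as np

FS = 250  # Hz; an assumed, typical EEG sampling rate
TARGET_HZ = [8.0, 10.0, 12.0, 15.0]  # hypothetical flicker frequencies

def ssvep_classify(eeg, fs, targets, harmonics=2):
    """Return the flicker frequency with the most spectral power in the
    EEG channel; the attended target drives a response at that frequency
    and its harmonics."""
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, 1 / fs)
    scores = []
    for f in targets:
        # Sum power at the fundamental and its harmonics
        power = sum(spectrum[np.argmin(np.abs(freqs - h * f))]
                    for h in range(1, harmonics + 1))
        scores.append(power)
    return targets[int(np.argmax(scores))]

# Synthetic occipital channel: a 12 Hz SSVEP response buried in noise
rng = np.random.default_rng(1)
t = np.arange(0, 4, 1 / FS)
eeg = np.sin(2 * np.pi * 12.0 * t) + 0.8 * rng.normal(size=t.size)
decoded = ssvep_classify(eeg, FS, TARGET_HZ)  # picks 12.0
```

Real systems typically replace the raw FFT comparison with canonical correlation analysis against reference sinusoids, but the frequency-tagging principle is the same.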
A rapidly prototyped EEG BCI that uses neural speech entrainment to identify moments in received speech the listener finds relevant, tagging these events in memory. The wearable computer listens to audio, runs it through a digital signal processing pipeline, and correlates the processed audio with filtered EEG data in real time. We built a wearable headband with light indicators representing the wearer's auditory attention, using the Audio Evoked Potentials system prototype.
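The pipeline described above can be sketched in a few lines: extract the speech amplitude envelope, band-limit the EEG to the delta/theta range where entrainment is usually measured, and correlate the two in short windows. This is a minimal illustration under assumed parameters (sampling rate, 1-8 Hz band, 2 s windows), not the prototype's actual implementation:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 250  # Hz; assumed shared sampling rate for audio envelope and EEG

def bandpass(x, lo, hi, fs):
    """Zero-phase Butterworth bandpass filter."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def speech_envelope(audio, fs):
    """Amplitude envelope via the Hilbert transform, band-limited to
    1-8 Hz, where speech entrainment is typically measured."""
    return bandpass(np.abs(hilbert(audio)), 1.0, 8.0, fs)

def attention_scores(audio, eeg, fs, win_s=2.0):
    """Windowed Pearson correlation between the speech envelope and
    delta/theta-band EEG; higher scores suggest stronger entrainment,
    i.e. moments the listener likely found relevant."""
    env = speech_envelope(audio, fs)
    neural = bandpass(eeg, 1.0, 8.0, fs)
    win = int(win_s * fs)
    return np.array([
        np.corrcoef(env[s:s + win], neural[s:s + win])[0, 1]
        for s in range(0, len(env) - win + 1, win)
    ])

# Synthetic demo: a 40 Hz carrier amplitude-modulated at 3 Hz stands in
# for speech; the "EEG" partially tracks its envelope.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
audio = (1.0 + 0.8 * np.sin(2 * np.pi * 3 * t)) * np.sin(2 * np.pi * 40 * t)
eeg = speech_envelope(audio, FS) + 0.2 * rng.normal(size=t.size)
scores = attention_scores(audio, eeg, FS)
relevant = scores > 0.3  # windows flagged as "attended" for memory tagging
```

In the actual headband, scores like these would drive the light indicators and decide which moments get tagged; the 0.3 threshold here is illustrative only.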