Colin Owens

How do we build
tools that match
how people think?

Twenty years finding out. Now running aboutface.io.

Now

Research, synthesize,
build.

I run aboutface.io — an R&D lab exploring human-AI collaboration. The question is the same one I've been asking since graduate school: how do we build tools that work with how people actually perceive, rather than how computers work?

Project

Weave

Humans and AI, working together. A project exploring the texture of real collaboration between people and AI systems — how work changes, what gets better, what doesn't.

Project

aihates.me

An AI that files complaints about humans — because someone should. A research project with a sense of humor, examining AI perspective on human behavior.

Ongoing

Teaching & speaking

Faculty at RISD and Lesley University. Visual Systems, Design for Dynamic Media. The classroom as a laboratory for the same questions.

Principles

The things that
stayed constant.

Twenty years across very different domains — music technology, fintech, advertising platforms, enterprise software, academic research — and the same ideas keep surfacing. These aren't methods. They're the lens.

Systems thinking

Everything connects. A button is never just a button — it's the end of a chain of decisions about architecture, language, mental model, and trust. Design the system, not the surface.

Senses are interconnected

We don't see and hear separately. We perceive as one system. The best tools are built around how people actually experience the world — not how data is organized in a database.

Tools serve ideas

Technology is the vehicle, not the destination. The question is always: what does this make possible? What does it make easier to think?

Context over technique

Anchored to problems, not methods. The right tool for this situation. Frameworks are useful until they aren't — the skill is knowing when to set them down.

Natural metaphors

Physics, not interfaces. When software borrows the logic of the physical world — gravity, collision, space — it becomes intuitive in a way that no onboarding flow can replicate.

Background

How it got here.

The research started with how our senses work together. That led to a music mixing company. That led to teaching. Teaching led to building products at the largest technology companies in the world. All of it led to the same question, restated.

We build tools around how computers work.
We should build them around how people perceive.

2009

MFA, Graphic Design — MassArt

Thesis: how our senses interconnect, and what that means for the tools we build. Synaesthesia as a model for dynamic media.

2007–2012

Founder — Shapemix

Music mixing as physics. Raised $2M. Launched four iOS apps, including a SPIN Magazine co-branded remixing contest. Patent US20110271186A1. The thesis, made real.

2007–2018

Faculty — RISD, Lesley University, Emerson College

Visual Systems, Design for Dynamic Media, Data Visualization for Journalism.

2014–2015

Designer — Stackdriver → Google

From startup to Google Cloud Monitoring. Developer tools. Data visualization library for Google Cloud. The scale problem: how do you make infrastructure legible to the people who depend on it?

2019–2021

Principal Designer — Fidelity Investments

Fractional trading, mobile design systems, trading flows. $2M in daily profits on launch. The same question in fintech: how do you make a complex system feel navigable?

2022–2023

Lead Designer — Meta

Ads Manager. Machine-learning guidance. Audience expansion. Giving millions of advertisers meaningful control over machine-learning systems.

2026–

Founder — aboutface.io

R&D lab for human-AI collaboration. The question, restated for the current moment: how do we build AI tools that work with how people think?