Global Innovation Design (MA/MSc)

Harrison Tan

I am a multidisciplinary designer and formerly a design outsider. At university, I studied economics and historical sociology before discovering the joy in making things with code and interactive media the summer before graduating. I then went on to wear many hats professionally – mostly sequential, occasionally stacked – from software engineering to product management.

As a member of NYU Abu Dhabi's inaugural class, I nurtured an enduring curiosity for the world and a penchant for action. That desire to roll up my sleeves led me to Venture For America where, as a fellow, I learned about entrepreneurship. Putting theory to practice, I joined an early team of four at Avhana Health, a healthcare startup where I designed and developed clinical decision support tools to improve preventative care.

While the San Francisco Bay Area is home, I have had the good fortune to live in Abu Dhabi, Baltimore, Berlin, New York City, and now London. The range of people I’ve met, cultures I’ve encountered, and experiences I’ve lived through continue to expand my imagination and enable me to navigate with greater awareness of and sensitivity to other ways of living.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


Experience

Founder, Tangently

Volunteer, Frontline Aid

Volunteer, Capoeira4Refugees

Director of Integration, Avhana Health

Fellow, Venture For America


Education

MA in Global Innovation Design, Royal College of Art

MSc in Global Innovation Design, Imperial College London

BA in Economics, cum laude, New York University Abu Dhabi

Web Development Immersive, General Assembly San Francisco


Selected Events

Imperial Enterprise Lab Venture Catalyst Challenge 2021, Participant & Team Lead

RCA x Logitech Grand Challenge 2020, Finalist Group Member & Team Lead 

RCA x Land Rover Masterclass: Turning Concept Into Reality, Participant

Tsinghua University 2020 Exchange Exhibition, Exhibitor


I'm an {embodied human}-centered designer and engineer inspired by how humans – when playing, dancing, even fighting – make sense of and move within the world. And the joy found in this process is what drives me to design interactions that enrich people’s relationship with their closest tools and the environments they inhabit.

I came to this realization while writing my dissertation, which was later awarded a distinction. Titled Embodied Design: ago ergo sum, it began with an investigation of the philosophical and physiological basis of interactions – from embodied cognition to sensory-motor (perception-action) couplings. It then looked to emerging technologies in soft robotics, cyberphysical systems, and extended reality to speculate on how embodied designers might create richer interactions through continuous modulation. (read here)

Taking a step in that direction, my final project Holodeck re-imagines tabletop game playtesting for a more inclusive and immersive cyberphysical future. Holodeck explores how cyberphysicality might change the relationship between player, creator, and the game itself. With a stage set by the modern renaissance in crowdfunded tabletop games, Holodeck creates new interactions that connect the creativity of creators to the curiosity of players in more accessible and interactive ways.

Holodeck Intro — a minute-and-a-half video introducing the concept and project

The rise of crowdfunding has reshaped the relationship between game creators and players. More than a shift in funding, creators often bring the crowd of project backers into the design, development, and playtesting process itself. Even the idea of what makes up a tabletop game has expanded, with soundtracks and animations thrown into the mix.

Holodeck lets creators go beyond the status quo of print-and-play playtesting kits by making it simple to virtualize their game assets, creating a digital twin that can be played across devices. As a virtual yet in-person gaming and playtesting platform, it offers experiences that are both more accessible and more interactive.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Blind tabletop game enthusiasts currently shoulder the burden of modding physical game pieces to make games accessible via stickers, braille embossing, or even NFC/RFID tags. With Holodeck, creators contribute textual, positional, and pictorial information such as card text, card layout, and alternative image text during the virtualization process. This means game assets like cards are voiceover-ready right out of the box. Cards become consistent gestural soundboards – accessible through virtual augmentation rather than tedious physical modification. And with the addition of computer vision and image segmentation, this virtual augmentation could even extend into the real world of physical cards.

In addition to inclusivity, cyberphysical tabletop games open the door to immersive gameplay currently seen only in video games. Game events such as playing a card could trigger sound effects and animations – mixed media assets already being produced by many tabletop game creators. A magical future awaits as developments in cyberphysical systems, spatial audio, and holographic volumetric displays promise to bring the world of tabletop games to life.

A virtual hand of cards
A virtual hand of cards — Players can view and rearrange their hands while sending cards to the board or potentially other players.
A virtual tabletop
A virtual tabletop — Players can move, rotate, and flip cards on the table while also sending cards to other players or themselves.

Holodeck is a gaming and playtesting platform for in-person play that captures the physicality and sociality of the tabletop experience. Players begin by scanning the QR code in front of them. Once they join, players can draw cards from the table, play cards to the table, or swap them amongst themselves. Holodeck grounds the virtual in the physical by virtualizing how we interact with physical cards, keeping the interactions between players’ hands and table, and setting aside space for the table at the center of the tabletop experience.

* as of June 2021, Holodeck is under development at playholodeck.com

Print-and-play kits
Print-and-play kits — Print-and-play kits are the status quo for remotely delivered in-person playtesting, requiring players to print and cut out the cards for each new iteration or expansion of a game.
Virtualization
Virtualization — Creators virtualize their tabletop games by uploading game assets – image, audio, video – along with a spreadsheet outlining textual, pictorial, and positional information for those assets.

Tabletop game creators have long been digitizing their game assets and distributing them as print-and-play kits. More recently, they have also begun to virtualize their game assets on platforms meant for remote play such as Tabletop Simulator and Tabletopia – and increasingly so since the pandemic began. Holodeck lets these creators offer an accessible and immersive experience tailored for in-person play.
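The virtualization step described above – game assets paired with a spreadsheet outlining textual, pictorial, and positional information – can be pictured as a simple parsing routine. This is a minimal sketch: the column names (`asset_id`, `card_text`, `alt_text`, `layout_zone`) are assumptions for illustration, not Holodeck's actual schema.

```python
import csv
import io

# Hypothetical spreadsheet contents -- the real schema is not
# documented here; these columns merely illustrate the idea of
# textual (card_text), pictorial (alt_text), and positional
# (layout_zone) information travelling alongside the assets.
SAMPLE_SHEET = """\
asset_id,card_text,alt_text,layout_zone
hero_01,Play to draw two cards.,A knight raising a banner,hand
relic_02,Discard to heal 3.,A glowing amulet,board
"""

def load_card_records(sheet_text):
    """Read one record per card, keeping the fields a screen
    reader or renderer would need to augment the card virtually."""
    reader = csv.DictReader(io.StringIO(sheet_text))
    return list(reader)

records = load_card_records(SAMPLE_SHEET)
print(records[0]["alt_text"])  # -> A knight raising a banner
```

Because each record carries its own alternative text and layout information, cards arrive voiceover-ready, in the spirit of the accessibility goals described above.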

A special thanks to Will Kirkpatrick at Jellybean Games and Brad O'Farrell at Cantrip Games for speaking with me about their experience crowdfunding tabletop games as well as granting me permission to virtualize their games for Holodeck.

Design research with blind players
Design research with blind players — Quotes describing what makes virtual tabletop games accessible
Voiceover Accessibility — A demonstration of Holodeck's custom voiceover/screen reading functionality

Holodeck lets blind players interact with cards in new ways. A player can flip through the list of cards in their virtual hand with one hand while screen reading the entirety of the currently selected card with the other. This improves on most screen readers, where reading is a read-only interaction that requires an additional step to select a card. Additionally, feedback on the difficulty of drag-and-drop interactions led to drop zones that double as buttons for grouping and sending cards.

These interactions were inspired by design research conducted with blind tabletop game enthusiasts found through Sightless Fun, a podcast and website on the accessibility of modern tabletop games. Key insights included the importance of easy-to-navigate information hierarchies and controllable delivery of information when designing accessible interfaces for blind players. These findings helped shift the design away from more skeuomorphic interfaces.

A special thanks to Ertay Shashko, Brian Counter, and Ryan Peach for speaking with me about their experiences as blind tabletop game enthusiasts.

Holodeck 2.0
Holodeck 2.0 — Emerging technology in mixed reality, holographic volumetric displays, and spatial audio promise to extend tabletop games into new dimensions.
Holodeck 2.0: A First Step
Holodeck 2.0: A First Step — A soundtrack and soundboard created for Story Wars, a crowdfunded game from Cantrip Games, meant to set the ambiance during gameplay.

The transition from purely physical to cyberphysical tabletop games opens the door to new levels of immersion and interactivity. Even today, DIY projects such as Pepper's Cone have introduced inexpensive 3D holograms to traditional 2D displays, while commercial headphones have made spatial audio mainstream. This means that creators can begin to interweave mixed media content into their tabletop games.

For example, the crowdfunding campaign for Story Wars, a storytelling card game, included an animated cartoon as well as a soundtrack and soundboard. While the cartoon existed outside of gameplay, Cantrip turned the soundtrack into a soundboard so players could loop the song that matches the current battlefield card to set the ambiance during gameplay. With Holodeck, accompanying mixed media from animations to sounds could be seamlessly triggered by in-game events.

This media could even be dynamically created in the near future with generative algorithms that mix and blend multiple virtual game assets. And with the incorporation of the sensors and actuators that make up cyberphysical systems, players could magically interact with both the physical and virtual layers of tabletop games.
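One way to picture event-triggered media like the Story Wars soundboard is a lookup from game events to the mixed-media assets creators already produce. The event names, card identifiers, and file names below are hypothetical; the text does not specify Holodeck's actual event model.

```python
# Hypothetical cue table: (event, card) pairs map to media assets,
# with "any" acting as a wildcard fallback for the card slot.
MEDIA_CUES = {
    ("card_played", "battlefield_swamp"): "swamp_theme.ogg",
    ("card_played", "battlefield_castle"): "castle_theme.ogg",
    ("card_flipped", "any"): "card_flip.wav",
}

def media_for_event(event, card_id):
    """Return the media asset cued by a game event, if any:
    exact (event, card) matches win, then the wildcard entry."""
    return (MEDIA_CUES.get((event, card_id))
            or MEDIA_CUES.get((event, "any")))

print(media_for_event("card_played", "battlefield_swamp"))
# -> swamp_theme.ogg
```

A table-driven design like this would let creators add cues by editing data rather than code, in the same spirit as the spreadsheet-based virtualization step.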

resonance intro — a two-minute video introducing the concept and project

Inspired by a tap on the shoulder that tells you where to turn, resonance explores how we might experience the spatial dimension of sound through touch. It takes the form of a wooden bead necklace holding three vibration motors, each nested between a pair of gold spacers. It listens for commands via Bluetooth and translates them into positioned haptic patterns on the back of the wearer’s neck.
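The command-to-vibration flow might look like the sketch below: a single byte selects a motor position, and a pulse-width-modulation ramp eases the vibration in and out (as in the haptics experiments). The one-byte protocol, motor labels, and step count are assumptions for illustration; the actual firmware on the Arduino Nano BLE Sense is not detailed here.

```python
def ease_in_out(steps, peak=255):
    """Triangular PWM duty-cycle ramp: climb to peak, then mirror
    back down, so each pulse eases in and out rather than buzzing
    on at full strength."""
    up = [round(peak * i / steps) for i in range(1, steps + 1)]
    return up + up[-2::-1]

def haptic_pattern(command_byte, steps=4):
    """Map a hypothetical one-byte Bluetooth command to a motor
    position on the neck and the duty cycles to drive it with."""
    motor = ("left", "center", "right")[command_byte]
    return motor, ease_in_out(steps)

motor, duties = haptic_pattern(1)
print(motor, duties)
```

Driving the motors with an eased ramp rather than a square pulse is what makes the sensation read as a gentle tap instead of a jolt.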

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Presence of sound expands situational awareness while directionality of sound extends spatial awareness – together they afford spatial situational awareness. The design rationale for focusing on sounds coming from behind is that humans can only see what is in front of them and therefore rely predominantly on their hearing to be aware of what’s happening behind them. In the spirit of inclusive design, resonance imagines how sensory substitution could make the public soundscape more accessible.

This project seeks to remedy some of the dangers and difficulties that arise when sound in the environment goes unheard – whether from disability or situational limitations. When the public soundscape is inaccessible, everything from warnings to look out, to requests for attention, to cries for help goes unperceived. Research into deaf culture and headphone culture revealed how often this results in situations ranging from the embarrassing – inadvertently blocking an aisle as requests to pass are seemingly ignored – to the life-threatening – stepping off the curb without sensing ringing bike bells or honking car horns.

Ideation
Ideation — early notes, questions, and sketches
Materialization
Materialization — an intentional "non-techy" aesthetic
Form
Form — inspired by prayer beads and worry beads, and especially how they are worn, held and touched
Experiments With Sensing
Experiments With Sensing — low cost sensors managed sound detection, but not sound localization
Wiring & Circuitry
Wiring & Circuitry — an Arduino Nano BLE Sense listens to bytes sent via Bluetooth and actuates motors
Experiments With Haptics
Experiments With Haptics — pulse-width modulation allows for gentle haptic patterns that ease in and out
Meet resonance
Meet resonance — the final prototype thus far, and portable thanks to two battery packs that power the microcontroller and motor
Moving Forward
Moving Forward — this diagram depicts how a haptic wearable could potentially interface with a headphone paired to a smartphone
resonance
resonance — a sound or vibration produced in one object that is caused by the sound or vibration produced in another...

Beyond directionality, there are other dimensions of sound to explore, including proximity, loudness, and – in the subtleties of human speech – emotion and tone. How might we continue to enrich haptic interactions by incorporating these dimensions?

And on the technological front, how might we tap into sophisticated beamforming microphone arrays already embedded in headphones and hearing aids?

More generally, how might we apply this overarching concept of augmenting spatial situational awareness to other contexts such as biking or driving?

Finally, how might we harness touch to convey other more-than-human sensing modalities from radar to magnetoreception?

Medium:

wooden beads, gold-plated spacer beads, silicone tubing, coin vibration motors, Arduino Nano BLE Sense
J{AI}NE DOE: AI-Enhanced Cyberstings
J{AI}NE DOE: AI-Enhanced Cyberstings — Swarming online sex trafficking with AI-enhanced cyberstings
Ideation
Ideation — How might we make the online environment a more hostile place for sex traffickers?
J{AI}NE DOE: Create & Interact
J{AI}NE DOE: Create & Interact — J{AI}NE DOE assists with the key stages of running cyberstings, beginning with creating and embedding victim personas and decoys before interacting with potential traffickers.
J{AI}NE DOE: Monitor & Analyze
J{AI}NE DOE: Monitor & Analyze — J{AI}NE DOE also helps monitor and analyze interactions with potential traffickers by leveraging artificial intelligence and the collective intelligence of counter-trafficking experts.
Research
Research — In 2016, 55% of those who entered the life in 2015 reported meeting their trafficker via text, website, or app.
Figma Prototype — A run through of our clickable prototype for J{AI}NE DOE

J{AI}NE DOE is a digital platform that extends the ability of activists and law enforcement to catch online sex traffickers by streamlining, automating, and scaling the deployment of AI victim personas to combat them.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Much like cyberbullies and online trolls, sex traffickers take advantage of the anonymity offered by the Internet. Traffickers who deceive, recruit, and groom their victims online find the Internet and social media platforms far more hospitable than the real world. With minimal law-enforcement presence, the virtual world makes it easier for them not only to hide their intentions but also to scale their operations. As a result, traffickers increasingly go online to find their victims – a trend that has only accelerated during the pandemic.

J{AI}NE DOE seeks to introduce some of the friction felt in the physical world into the online space, making it a more hostile environment for traffickers by supporting counter-trafficking cyber operations such as cyberstings. Traditionally, running these operations requires a significant investment of time and effort – from generating and maintaining active fake personas and engaging and monitoring suspected traffickers to assembling a case against identified ones. J{AI}NE DOE is meant to be a platform that supports activists and law enforcement at every step of the way.

J{AI}NE DOE was one of 25 teams selected from a pool of over 270 applicants to participate in the Venture Catalyst Challenge run by the Imperial Enterprise Lab.

Kaleidocycle Box: Process
Kaleidocycle Box: Process — a collage from start to finish
Kaleidocycle Box: In Motion — a boomeranged video of the revolving box

Given a design brief to create a box that reflects who I am, I soon realized that my box must express a desire to move and be moved – to feel alive. I stumbled into the world of transformable structures and linkages that afford movement. Intrigued by the continuously twisting motion of the kaleidocycle, I set out to carve a box out of one by removing a set of faces – one from each of the six tetrahedra – to form three compartments.

I began with a simple folded paper prototype, then moved to hand cut cardboard, before finally arriving at laser cut wood. After much sanding to account for the dimensionality that comes with depth, I was happy to discover how alive it felt in motion. There is a gripping moment in each rotation where it almost springs out of your hands.

Medium:

wood, tape