Empty Surfaces

Envisioning the future of ubiquitous user interfaces

At Carnegie Mellon University, I was challenged to "design a responsive and scalable User Interface that connects the experience across multiple devices ranging from a mobile phone, to digital whiteboards, to interactive kiosks." In response, I conceptualized a user interface that helps people do digital visual thinking work on any surface, suggesting that one day we won't carry our devices with us; instead, our digital experience will be a ubiquitous part of the environment around us. I also explored the idea that recognizing which specific fingerprint is touching a surface could unlock richer multi-touch functions.


Process

Initial Concept

I came up with the concept of a smart pen. The pen lets users enjoy the tactile nature of physical pen and paper while it creates a digital copy of their penned work in real time. Their work is saved in a digital format so they can search their written content, convert handwriting into text, and make edits digitally.

The smart pen also lets users send texts, emails, Facebook posts, tweets, and more. They simply hold down a button on the pen and write a command such as "email Mom", release the button and write the body of the message, then hold the button again and write "send". The email is instantly delivered to the intended recipient.
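As a purely conceptual illustration of this hold-to-command flow (not part of the original project; every name below is hypothetical), the interaction could be modeled as a small state machine that switches between note-taking, command, and message modes based on whether the button is held:

```typescript
// Hypothetical sketch of the hold-button command flow described above.
type Mode = "note" | "command" | "message";

class SmartPen {
  private mode: Mode = "note";
  private command = "";
  private body: string[] = [];

  // Called whenever the handwriting recognizer produces a new phrase.
  onPhrase(phrase: string, buttonHeld: boolean): void {
    if (buttonHeld && this.mode === "note") {
      this.mode = "command";        // e.g. the user writes "email Mom"
      this.command = phrase;
    } else if (buttonHeld && this.mode === "message" && phrase === "send") {
      this.dispatch();              // hold the button again and write "send"
    } else if (!buttonHeld && this.mode === "command") {
      this.mode = "message";        // button released: writing the message body
      this.body.push(phrase);
    } else if (this.mode === "message") {
      this.body.push(phrase);
    }
  }

  private dispatch(): void {
    console.log(`${this.command}: "${this.body.join(" ")}"`);
    this.mode = "note";
    this.command = "";
    this.body = [];
  }
}

// Example: "email Mom" (button held), "Happy birthday!" (released), "send" (held).
const pen = new SmartPen();
pen.onPhrase("email Mom", true);
pen.onPhrase("Happy birthday!", false);
pen.onPhrase("send", true);
```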


Revising the Concept

After getting some feedback from my professor, I identified problems and opportunities with my concept. I needed to push the ubiquitous interface idea further. What is the future of pens, paper, and screens? After considering this for a while, I realized...

I don't think we will be carrying our devices with us forever. One day, screens will be embedded in our existing environment, and all we will have to do is log on.


Next, I took a step back from the "smart pen" idea and ditched paper altogether. If the future of screens is not carrying one around all the time, then one day everyday surfaces may themselves be interactive screens. Assuming that could become a reality, I altered my concept and designed a user experience that could take place on several different surfaces.


I came up with a concept that enables a person to start a visual thinking project on one surface and resume it on a different surface by logging in with a biometric fingerprint. I considered collaboration possibilities and different input methods such as touch gestures, an invisible keyboard, and a stylus. I also started to experiment with multi-touch interaction.
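To make the fingerprint handoff concrete, here is a minimal sketch of one way it might work; this is my own illustration under assumed names (SessionService, Surface, the "fp:alice" hash), not part of the original concept work. A session service maps a fingerprint to saved work and restores it on whichever surface the user touches:

```typescript
// Hypothetical sketch: resuming a visual-thinking session on a new surface
// via fingerprint login. All types and names are invented for illustration.
interface Surface {
  id: string;                            // e.g. "office-whiteboard"
  render(canvasState: string): void;
}

class SessionService {
  // Maps a fingerprint hash to the user's saved canvas state.
  private sessions = new Map<string, string>();

  save(fingerprintHash: string, canvasState: string): void {
    this.sessions.set(fingerprintHash, canvasState);
  }

  // Touching any surface with a recognized fingerprint restores the work there.
  resume(fingerprintHash: string, surface: Surface): void {
    const state = this.sessions.get(fingerprintHash);
    if (state !== undefined) {
      surface.render(state);
    }
  }
}

// Example: start a mind map on a desk, pick it up later on a whiteboard.
const service = new SessionService();
service.save("fp:alice", "mind-map: project kickoff");
const whiteboard: Surface = {
  id: "office-whiteboard",
  render: (state) => console.log(`Showing "${state}" on office-whiteboard`),
};
service.resume("fp:alice", whiteboard);
```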