What if your exoself could touch reality? This thesis project explored what might happen if people could create a connection between their digital self and the objects, places, or people they usually interact with through touch. The end results of the concept Shopal were a digital information service named AIA (Artificial Intelligence Assistant) and a collection of smart at-hand wearables called SIGIL.
AIA filters information about products the user touches according to the user's interests, based on the user's digital self. SIGIL lets the user scan objects with a natural movement pattern, by holding or touching them, and also serves as the channel that delivers AIA's filtered information to the wearer.
The goal of Shopal was to facilitate consumer decisions, give companies a way to deliver product information to their customers, and ultimately offer users a way to visualize and manage their exoself.
Inspiration and Method
Originally, this project set out to explore alternative ways to enable services traditionally handled through plastic ID and payment cards, without the need to carry said cards or a wallet.
Throughout the project, research was gathered on wearable technology, NFC security, and emerging trends in society, commerce, and user behaviour. These findings were then explored through workshops and web questionnaires.
The process led to a turning point where the focus shifted from enabling mobile payments and ID to exploring what ID actually means in the digital era. The Digital Self became a focal point, and the project began exploring ways to give users personal control over the information stored about them on the web, and then to apply this digital information to the physical world. This enables a "phygital" reality where the physical world draws assets from the digital one, whereas traditionally the opposite direction is more common.
The end result is a framework concept for how users would like to manage an artificial intelligence assistant, both in terms of the digital service itself and in terms of example physical smart tools that allow a user to navigate AIA through the physical world.
The tools proposed in this project are a collection of smart at-hand wearables codenamed SIGIL, which let users connect with the products they touch through NFC (Near Field Communication) technology. The concept assumes a future where products are expected to carry a plastic NFC chip, in order to supply customers with more information than today's mandatory contents labels can.
AIA helps the user navigate their exoself data and uses it to filter information about a scanned product, in order to swiftly determine what the user usually wants to know about that kind of product. This information is then sent to the user's smartphone, which in turn can relay it via Bluetooth to the user's SIGIL wearable, enabling feedback based on how well the product meets the user's demands.
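The filtering step described above can be sketched in code. The following is a minimal illustration, not part of the thesis itself: all names, data fields, and the interest-profile structure are hypothetical assumptions. It shows the core idea of reducing a product's full NFC data to the attributes a user cares about, and producing a single met/not-met verdict that a wearable could render as simple feedback.

```python
# Hypothetical sketch of AIA's filtering step: given the full data read
# from a product's NFC chip and a profile of the user's interests, keep
# only the attributes the user cares about and signal whether the
# product meets the user's expectations. All names are illustrative.

from dataclasses import dataclass
from typing import Callable, Any


@dataclass
class Expectation:
    attribute: str                       # e.g. "origin", "sugar_g"
    predicate: Callable[[Any], bool]     # tests the product's value


def filter_product_info(product: dict, profile: list[Expectation]) -> tuple[dict, bool]:
    """Return the subset of product data relevant to the user,
    plus a single ok/not-ok verdict for feedback on the wearable."""
    relevant = {e.attribute: product.get(e.attribute) for e in profile}
    verdict = all(e.predicate(product.get(e.attribute)) for e in profile)
    return relevant, verdict


# Example: a user who prefers locally produced goods with at most 10 g of sugar.
profile = [
    Expectation("origin", lambda v: v == "local"),
    Expectation("sugar_g", lambda v: v is not None and v <= 10),
]
info, ok = filter_product_info(
    {"origin": "local", "sugar_g": 4, "price": 25}, profile
)
```

Here `info` contains only the origin and sugar attributes, omitting the price the user did not ask about, and `ok` is the boolean that the phone would forward over Bluetooth to drive the wearable's feedback.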
The end effect is a service and product that understand what the user touches and what the user expects from such a product, and that then tell the user whether these expectations are met, all through something as natural as touching or holding the product.