The architecture bureau Kollision has developed a series of interactive installations that serve as a playful communication platform and infrastructure for the exhibition Smart Use of Energy at the Danish Design Centre in Copenhagen. The exhibition communicates the idea that Denmark aims to become a world leader in environmentally friendly transportation as it faces the challenges of climate change. The exhibition presents a variety of solutions, among them how the cars of the future can run on green energy from wind turbines while also serving as large batteries that store excess wind-generated power.
Kollision’s main contribution is an RFID-based infrastructure that lets visitors charge their tickets like batteries by blowing on a fan at the entrance. The ‘energy’ on the ticket is then used to power the 14 installations in the exhibition. At the individual installations, visitors can watch movies of experts explaining new challenges and potentials of wind energy – as long as their tickets hold a charge. When the battery runs low, visitors have to return to the fan to recharge it.
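The ticket-as-battery idea boils down to a small piece of state logic. The sketch below is purely illustrative – the class name, charge units, and drain rates are our assumptions, not Kollision’s implementation:

```python
# A minimal sketch of the exhibition's ticket-as-battery logic.
# All names and numbers here are illustrative assumptions.

class TicketBattery:
    MAX_CHARGE = 100  # arbitrary charge units

    def __init__(self, tag_id):
        self.tag_id = tag_id  # the ticket's RFID tag identifier
        self.charge = 0

    def recharge_at_fan(self, spin_units):
        """Blowing on the fan at the entrance adds charge, capped at full."""
        self.charge = min(self.MAX_CHARGE, self.charge + spin_units)

    def drain(self, units_per_second, seconds):
        """Watching a movie at an installation drains the ticket."""
        self.charge = max(0, self.charge - units_per_second * seconds)
        return self.charge > 0  # installation keeps playing while True

ticket = TicketBattery("tag-0042")
ticket.recharge_at_fan(60)
still_powered = ticket.drain(units_per_second=1, seconds=30)
```

The appeal of the design is that the state lives on (or is keyed by) the ticket itself, so every installation can share one simple charge model.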
Read more about the project (in Danish).
A glance at this smart bag from the not-so-distant future instantly tells you whether you have forgotten any of the necessities of everyday life: keys, wallet, and cell phone.
The bag uses RFID technology to determine whether the items are present. An LED display designed to show emoticons is mounted on the front of the bag, indicating which of the items you have forgotten. When you’re ready to go, the bag smiles at you; until then it shows emotions ranging from angry to not-completely-satisfied.
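The core of the bag is a simple mapping from detected tags to an expression. A hedged sketch, assuming each necessity carries an RFID tag and the reader reports the set of tags in range (the item names and expression labels are ours):

```python
# Sketch of the Ladybag's presence check. The required items and the
# mapping from "how much is missing" to an emoticon are assumptions.
REQUIRED = {"keys", "wallet", "cell phone"}

def emoticon(detected_tags):
    """Map the number of missing required items to a facial expression."""
    missing = REQUIRED - set(detected_tags)
    if not missing:
        return "smile"
    if len(missing) == 1:
        return "not-completely-satisfied"
    return "angry"

print(emoticon({"keys", "wallet", "cell phone"}))  # smile
```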
The Ladybag Project was developed by a team of Canadian students from Simon Fraser University. They have created various prototypes, which can be seen on the project’s website.
If you’re interested in creative uses of RFID tags, check out our previous posts on the technology.
Graphical output from a computer application is projected onto the table, which has a set of RFID readers attached underneath. Each of these readers defines an active zone on the table surface. The table can be controlled by a number of intelligent objects equipped with RFID tags, all of which are designed to fit the specific theme of the table. When an object is placed in an active zone, its RFID tag is read by the reader underneath. Activating a tag thus triggers a reaction from the table according to the tag function, the location, and other tags placed in the same zone. The reactions can be edited in the iLand administration module, and include playing a video sequence in a window on the table surface or playing a sound clip through the integrated surround sound equipment.
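The trigger behaviour – a reaction chosen from the tag, its zone, and the other tags already in that zone – can be sketched as a lookup table. This is our simplified reading of the description, not iLand’s actual code; the tag names and reactions are invented:

```python
# Sketch of iLand-style triggering: a reaction depends on the tag placed,
# the zone it lands in, and the tags already present in that zone.
# The reaction table stands in for what the administration module edits.

reactions = {
    # (tag, zone, frozenset of other tags in the zone) -> reaction
    ("windmill", "zone-1", frozenset()): "play video: wind power intro",
    ("windmill", "zone-1", frozenset({"battery"})): "play sound: charging hum",
}

zones = {"zone-1": set()}  # tags currently placed in each zone

def place_object(tag, zone):
    others = frozenset(zones[zone])  # tags already in the zone
    zones[zone].add(tag)
    return reactions.get((tag, zone, others), "no reaction")

print(place_object("windmill", "zone-1"))  # play video: wind power intro
```

Keying reactions on the whole zone context is what lets combinations of objects behave differently from single objects.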
iLand can be used for different scenarios – from handling advanced simulation games to setting up simple edutainment for children.
[Image captions: The table can be controlled by objects equipped with RFID tags. When an RFID tag is read by the reader, it can be set to trigger the playing of a video sequence.]
Hørbar/Audiobar is a responsive environment created by the Danish artist Mogens Jacobsen. The project was commissioned by The Museum for Contemporary Art in Roskilde, Denmark. Mogens Jacobsen was given the task of finding a new way to present the museum’s vast collection of international audio-art.
The result is Hørbar/Audiobar, which consists of two rooms: a bar room for collective interaction and a study lounge for more in-depth exploration. In each room users can listen to different audio-art clips by moving RFID-equipped bottles across a table with an RFID reader built into it. A new and playful way to communicate audio-art.
Hørbar/Audiobar is exhibited at The Museum for Contemporary Art in Roskilde until April 15.
[Image caption: The user can interact and listen to different audio-art clips by moving RFID-equipped bottles across the table.]
Smart Urban Intelligence is a project by Japanese artist Ryota Kimura. The project is based on SUICA, an RFID-equipped card used in the Tokyo subway system. The ‘Smart Urban Intelligence’ system reads out the data from these cards and interprets it in its own arbitrary way.
The Smart Urban Intelligence project is a mixture of reality and fiction. It exaggerates and plays on situations which will be an integrated part of our future, as surveillance becomes ever more present in our computerized society.
The system reads actual data from the railway smartcard, and then displays the card’s travel history in the form of motion video and a visualized route map in real time. At the same time, a bot automatically analyses the history from the card and tells its story to the user: where the card holder lives, what his favourite places to visit are, when he returns home, when he stays out overnight, and so on. For the bot, the data are the only real thing, and it analyses, interprets, and evaluates them from its limited and circumscribed view.
The Smart Urban Intelligence project was presented at this year’s Ars Electronica Festival.
[Image captions: The ‘Smart Urban Intelligence’ system reads out data from RFID-equipped cards used in the Tokyo subway system. The system reads actual data from the railway smartcard and displays a visualized route map in real time.]
USED clothing is a concept developed by Martin Mairinger. By adding a virtual component to second-hand clothes, the USED clothing concept turns clothes into a new kind of storage medium. Using RFID and web technology, digital information about a certain piece of clothing can be stored and read. That is, in the USED clothing shop customers can buy and sell used shirts, pants, and so on, each of which has a unique story attached in the form of digital information.
A prototype of the concept has been developed in co-operation with the Ars Electronica Futurelab.
For a Flash presentation of the concept, look under Media.
[Image captions: RFID technology is used to enable storage and reading of digital information. In the USED clothing shop customers can buy and sell clothes with a unique story attached.]
bYOB (build your own bag), is a research project by the Object-Based Media Group at the MIT Media Lab. bYOB is a computationally enhanced modular textile system that allows any user to construct intelligent fabric objects such as bags and clothing. The bYOB modules consist of a microcontroller and LEDs housed within a fabric shell. When modules are snapped together to build an object, they start communicating with one another and the environment.
Applications include:
Light Sensing: The fabric of the bag illuminates when there is not enough natural or artificial light to see inside.
Object Detection: RFID tags are used to detect whether important objects (e.g. cell phones, keys, and wallets) are nearby, and the user is alerted through the fabric’s light and sound actuation if items are missing. A bYOB-constructed bag will ensure that you never leave the house without your keys or cell phone!
Network Detection: Information about the existence and strength of wireless networks is communicated to the user through an ambient light display, making otherwise invisible networks visible. In this way you will always know when and where to log on!
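The network-detection application is essentially a mapping from signal strength to light intensity. A small sketch under our own assumptions (the RSSI range and LED scale are typical values, not bYOB’s):

```python
# Sketch of bYOB's network-detection idea: map a hypothetical Wi-Fi
# signal-strength reading (RSSI in dBm) to an ambient LED level.

def led_level(rssi_dbm):
    """Scale RSSI from a typical range (-90 weak .. -30 strong) to 0-255."""
    clamped = max(-90, min(-30, rssi_dbm))
    return round((clamped + 90) / 60 * 255)

print(led_level(-30))  # full brightness: 255
```

The same pattern – sensor reading in, ambient actuation out – covers the light-sensing application as well, with a photocell reading in place of RSSI.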
[Image captions: By snapping together the bYOB modules you can build any object. The fabric of the bag illuminates when there is not enough light to see inside.]
MouseField is a simple and versatile input device for ubiquitous computing. It consists of an RFID reader and motion sensors. When an object with an RFID tag is placed on the MouseField, the device detects the object via the RFID reader and, moreover, detects the direction and rotation of the object via the motion sensors.
One example application of MouseField is controlling a music player. Various CDs are saved on a computer, and the CD covers represent the saved music. With each CD cover equipped with an RFID tag, the user can start playing music by placing a cover on the MouseField, choose between songs by sliding the cover up and down, or change the volume by rotating it.
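The music-player behaviour amounts to dispatching on the two sensor channels: tag events from the RFID reader and movement events from the motion sensors. A hedged sketch – the event names, sign conventions, and album title are illustrative assumptions:

```python
# Sketch of interpreting MouseField events for the music-player example.
# "tag_detected" comes from the RFID reader; "slide" and "rotate" come
# from the motion sensors, with positive values meaning up / clockwise.

def handle(event, value=None):
    if event == "tag_detected":
        return f"play album {value}"  # a CD cover was placed on the device
    if event == "slide":
        return "next track" if value > 0 else "previous track"
    if event == "rotate":
        return "volume up" if value > 0 else "volume down"
    return "ignore"

print(handle("tag_detected", "Kind of Blue"))  # play album Kind of Blue
```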
Check out the video of the application of MouseField as a Music Player.
[Image captions: The MouseField consists of two motion sensors (taken from a standard optical mouse) and an RFID reader. The user can change the sound volume by rotating the CD cover.]
BusinessWeek Online’s feature on the IBM Store of the Future provides a sneak preview of several technologies developed by IBM to make shopping a more personalized experience. The innovations include The Everywhere Display, which uses a projector, mirrors, and software to turn any store surface into a virtual interactive touchscreen. In one case The Everywhere Display is used to guide customers to previously ordered products by projecting arrows on the floor of the store.
Another innovation is the Veggie Vision Scale which uses a camera to identify the kind of fruit consumers place on the scale and price the fruit without the customer having to push any buttons.
IBM’s Personal Shopping Assistant uses RFID (Radio Frequency Identification) and GPS (Global Positioning System) technologies to alert customers to promotions and personalized discounts as they walk through the aisles of a supermarket.
Read more about The Everywhere Display research project.
[Image captions: The Personal Shopping Assistant alerts customers to promotions and personalized discounts as they walk through the aisles of a supermarket. The Veggie Vision Scale prices the fruit without the customer having to push any buttons.]
NTT Communications and the Japanese drugstore chain Seijo are testing a new RFID-enabled system that allows customers to try on make-up without having to apply it. The system consists of a traditional make-up mirror with a computer monitor in place of the mirror. A video camera is mounted just above the screen, and the physical make-up samples are tagged with RFID chips. When a sample is placed on the pad in front of the user, the system looks up the product details that match the chosen product. Visual recognition technology is used to manipulate the on-screen image of the user’s face to simulate the application of the chosen make-up.
Additionally, the system recommends matching shades of eye shadow, eyeliner, and other make-up items, and allows the customer to simulate what these would look like in combination with the make-up that was initially selected.
Ultimately, the user gets a printout of the recommendation, including a picture of the simulation grabbed from the video.
[Image captions: The physical make-up samples are tagged with RFID chips. Visual recognition technology is used to manipulate the on-screen image of the user’s face to simulate the application of the chosen make-up.]
Natural Interaction is both a website and a concept that focuses on inventing, designing, and creating solutions that interact with users in a natural way. Natural Interaction focuses on understanding the gestures, movements, and expressions of the user, and a natural interface is one that lets users explore and use the technology without the sense of interacting with a machine. The implementation of natural interaction, among other things, makes use of computer vision techniques, RFID-technology, 3D-graphics, and video projections. Natural Interaction is by nature intuitive, and the user doesn’t need to wear any device or learn any specific instructions.
At the website, several projects can be found. Alessandro Valli, the main creator of the projects, is associated with the Media Integration and Communication Center (MICC) at the University of Florence. Read his comprehensive notes on Natural Interaction to learn more about his research work.
We have singled out some projects and videos:
Watcher: A 3D virtual character observes people’s actions and behaviours. The project focuses on enabling visual attention from the virtual character, when multiple persons or objects are in the field of vision of this character. Video.
Ely the Explorer is an interactive play system created at the Umeå Institute of Design at Umeå University in Sweden. The purpose of the project is to explore the potential of interactive play systems as tools for collaborative learning.
The concept is targeted at children and consists of a multi-user unit and a set of interconnected physical and virtual tools. The physical and virtual tools are connected using RFID technology, and a LAN enables children to transfer information between the units.
The play is built around a group of characters (Elys) going on a journey of discovery around the world, hence the name of the game. One of the components of the game is Ely, both a physical toy and a virtual character, who guides the children throughout the game. Ely carries a backpack which has a PDA with a small display built into it. Another central component is the multi-user unit, which functions as a teleporter in the game. The game is initiated by putting the physical toy inside the teleporter. Ely travels to different countries with the teleporter, and tasks related to this journey are put forward, e.g. documenting the journey through words, images, drawings, and sounds using both the physical and virtual tools.
Check out the prototype and other ideas for supporting collaborative learning.
[Image captions: The physical Ely carries a backpack which has a PDA with a small display built into it. The Teleporter is a multi-user unit with a tabletop touch screen.]
IDEO and Prada have worked together in research aimed at exploring how new technology can be used in retail stores. The collaboration has led to the creation of an interactive dressing room that augments the experience of trying on clothes.
The interactive dressing room consists of an eight-foot-square booth with Privalite glass walls that switch from transparent to translucent when the room is occupied. Additionally, the customer can choose to switch the doors back to transparent, thereby enabling onlookers waiting outside the dressing room to take a look at the clothes he or she is trying on.
The dressing room also holds a closet equipped with RFID technology. As garments are hung in the closet, their RFID tags are automatically scanned and detected by means of an RF antenna embedded in the closet. Information related to the garment is then displayed on an interactive touch screen, making it possible for the customer to select alternate sizes, colours, styles etc. Last but not least, the interactive dressing room holds a video-based Magic Mirror which delays the image as the customer turns in front of the mirror, enabling the customer to view herself in slow motion from all angles.
[Image caption: The Interactive Dressing Room offers an augmented experience when trying on clothes.]
Digital Experience inspires a lot of people by showing exciting, interesting, innovative, or even odd uses of new technology in an experience-oriented context. If you have a good example that we haven't already covered, we would appreciate your suggestion.
Digital Experience is a weblog on interaction and experience design.
The site is created and maintained by CAVI, the Centre for Advanced Visualization and Interaction at Aarhus University. Contributions are welcome.
More about this site.