DNA11 creates unique portraits through an innovative combination of science and art. By using DNA samples as the starting material, each abstract art piece becomes as unique as a human being.
The co-founders of DNA11, Adrian Salamunovic and Nazim Ahmed, spent months experimenting with different ways of representing unique DNA fingerprints. Finally, they came up with a method that noninvasively captures the organic beauty of DNA, and they developed a way of translating the images of DNA into one-of-a-kind abstract artwork.
Their customers include ABSOLUT VODKA, which ordered a piece featuring the DNA of fruits used in their products.
More on how these unique pieces of artwork are created.
Order your own DNA portrait.
Related projects: GFPixel.
|DNA portraits.||Decorate your living room with your own DNA portrait.|
Ely the Explorer is an interactive play system created at the Umeå Institute of Design at the University of Umeå in Sweden. The purpose of the project is to explore the potential of interactive play systems as tools for collaborative learning.
The concept is targeted at children and consists of a multi-user unit and a set of interconnected physical and virtual tools. The physical and virtual tools are connected using RFID technology, and a LAN enables children to transfer information between the units.
The play is developed around a group of characters (Elys) going on a journey of discovery around the world, hence the name of the game. One of the components of the game is Ely, both a physical toy and a virtual character, who guides the children throughout the game. Ely carries a back-pack which has a PDA with a small display built into it. Another central component is the multi-user unit, which functions as a teleporter in the game. The game is initiated by putting the physical toy inside the teleporter. Ely travels to different countries with the teleporter, and tasks related to this journey are put forward, e.g. documenting the journey through words, images, drawings and sounds using both the physical and virtual tools.
Check out the prototype and other ideas for supporting collaborative learning.
|The physical Ely carries a back-pack which has a PDA with a small display built into it.||The Teleporter is a multi-user unit with a tabletop touch screen.|
The concept of force feedback can be explained as opposing the movement of the hand in the same way that an object squeezed between the fingers resists this movement. By way of example: When you squeeze an orange, you can sense the oppositional forces of the orange. Force feedback is about imitating these oppositional forces. This can be done in various ways, e.g. by using a data glove that, in the absence of a real object, recreates the forces applied by the object (in our example the orange) on the human hand.
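The oppositional force of the orange example can be sketched as a simple virtual spring, a common starting point in haptics. This is a toy illustration with invented stiffness and actuator-limit values, not the model any particular device uses:

```python
def opposing_force(penetration_mm, stiffness=0.8, max_force=4.0):
    """Force feedback as a virtual spring: the deeper the fingers press
    into the virtual object, the harder it pushes back (F = k * x)."""
    if penetration_mm <= 0:
        return 0.0                        # fingers not yet touching the object
    force = stiffness * penetration_mm    # Hooke's law
    return min(force, max_force)          # clamp to what the actuator can render
```

Squeezing the virtual orange 2 mm would thus push back with a force of 1.6 units, and pressing harder eventually saturates at the actuator limit.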
Immersion develops technologies based on the principle of force feedback, so-called haptic technologies. Their focus areas include, among others, gaming and 3D interaction.
In the field of gaming, Immersion has developed products that augment the experience of using computer and console gaming platforms, arcade and simulation products, and casino and mobile gaming products by means of force feedback.
In computer and console gaming platforms, the experience of “feeling” the game extends from rumble vibrations to full-force feedback built into gamepads, joysticks and racing wheels.
Another example of force feedback used in an experience enhancing way is the use of touch screens capable of “touching back” in casino gaming. Users can actually feel how onscreen buttons depress and spring back, thereby supplying the feeling of pressing physical buttons.
In the field of 3D interaction, Immersion has developed products such as the CyberGrasp™ system, which enables the user to “reach into the computer” and grasp computer-generated objects. Additionally, the CyberGrasp™ system makes it possible to feel the size and shape of computer-generated 3D objects in a simulated virtual world.
Check out examples of different applications of Immersion’s 3D interaction products.
|Immersion has developed various kinds of data gloves that make use of the principle of force feedback.||The touch screen capable of “touching back”.|
Slow Technology is a design concept in the domain of interaction design, developed at the PLAY Research Studio at the Interactive Institute in Gothenburg, Sweden. The purpose of Slow Technology is to gain an understanding of the aesthetics of computational technology as a material in the design of everyday things.
As computers become ubiquitous, Slow Technology acknowledges the need for focusing on designing technology aimed at reflection and moments of mental rest rather than efficiency in performance. Slow Technology can be slow in various ways, e.g. in the way that it takes time to understand how the technology works, it takes time to use the technology, and it takes time to find out why the technology works the way it does.
Design projects that follow the design agenda of Slow Technology include:
The ChatterBox: The installation picks up emails and electronic documents sent around an office, for example. The ChatterBox subsequently generates new sentences based on the collected data. These sentences are then visualized in a public place at the office. Video of ChatterBox.
Informative Art: Informative Art consists of computer-augmented works of art that are both aesthetic objects and information displays. Examples of Informative Art include experiments with dynamic mapping of information structures into Mondrian-like compositions, e.g. a visualization of what the weather is like in six countries around the world. Video of Informative Art.
Further information on Informative Art.
Expressions: Explores the basic acts of using information technology, i.e. reading and writing information. We write information when we use our keyboard or mouse, and we read information in the shape of graphics and text on our computer screen. In the Expression project the focus is on reading and writing information in our everyday life through everyday activities. This has led to the creation of different devices to be used as part of our everyday environment, e.g. the Fan House, the Sail House, the Chest of Drawers and the Paper Recycler.
Read more and check out examples.
The Textile Displays: This project is not primarily concerned with integrating computers into fabric, e.g. creating intelligent materials or new kinds of displays. Instead, the focus is on using the unique spatial properties of textiles to manifest temporal structures generated by computational processes. An example of this way of combining textiles and computers is The Information Deliverer, which is a device for delivering information, in this case news, by means of blowing out pieces of fabric from large plastic tubes.
More about experimental design work on textiles at the Interactive Institute.
Related projects: Lumen Interactive Visual and Shape Display.
|In this example of Informative Art the colours of the rectangles indicate the weather conditions, and the size of the coloured rectangle indicates the temperature.||The Information Deliverer delivers information by means of blowing out pieces of fabric from large plastic tubes.|
IDEO and Prada have worked together in research aimed at exploring how new technology can be used in retail stores. The collaboration has led to the creation of an interactive dressing room that augments the experience of trying on clothes.
The interactive dressing room consists of an eight-foot-square booth with Privalite glass walls that switch from transparent to translucent when the room is occupied. Additionally, the customer can choose to switch the doors back to transparent, thereby enabling onlookers waiting outside the dressing room to take a look at the clothes he or she is trying on.
The dressing room also holds a closet equipped with RFID technology. As garments are hung in the closet, their RFID tags are automatically scanned and detected by means of an RF antenna embedded in the closet. Information related to the garment is then displayed on an interactive touch screen, making it possible for the customer to select alternate sizes, colours, styles etc. Last but not least, the interactive dressing room holds a video-based Magic Mirror which delays the image as the customer turns in front of the mirror, enabling the customer to view herself in slow motion from all angles.
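The tag-to-information step in the closet amounts to a database lookup keyed by RFID tag ID. A minimal sketch with invented tag IDs and garment records (the actual Prada system's data model is not public):

```python
# Hypothetical garment database keyed by RFID tag ID (all values invented).
GARMENTS = {
    "04:A2:19:7F": {"item": "wool coat", "size": "M",
                    "other_sizes": ["S", "L"], "colours": ["black", "camel"]},
}

def on_garment_hung(tag_id, db=GARMENTS):
    """Called when the closet's RF antenna detects a tag; returns the
    information to be shown on the dressing room's touch screen."""
    record = db.get(tag_id)
    if record is None:
        return {"error": "unknown tag"}
    return record
```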
|The Interactive Dressing Room offers an augmented experience when trying on clothes.|
Touchlight is a new interactive display technology developed by researcher Andy Wilson from Microsoft Research. The Touchlight technology was presented at the SIGGRAPH 2005 Emerging Technologies venue and focuses on using different surfaces as alternative computer interfaces.
The technical implementation of Touchlight combines a translucent holographic film projection material with computer vision techniques. Two infrared video cameras, one still camera, and an infrared illuminator are mounted behind the projection material. Alignment of the output images of the two video cameras combined with image processing techniques enables the user to interact with the screen. The image becomes bright where the user’s hands are touching or nearly touching the screen, thereby enabling the user to draw light across the surface of the screen with his hands.
The characteristics of the projection material that constitutes the screen make it possible to project onto the material and see through it at the same time. Because of the transparency of the screen, it is possible to use the digital still camera behind the screen to capture a high resolution picture of an object placed on the surface. Optical flow techniques are then used to enable the user to interact with the picture in a natural and fluid way, e.g. rotate and scale the picture by merely touching and pulling the image.
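The near-touch detection can be illustrated with a toy version of the two-camera idea: because of parallax, the aligned infrared images only agree on a bright spot where the hand is at (or very near) the screen surface. This is a simplification for illustration, not Microsoft's actual pipeline:

```python
def touch_mask(cam_a, cam_b, threshold=200):
    """Mark pixels where BOTH aligned infrared images are bright (0-255).
    Away from the screen, parallax makes the two views disagree, so only
    hands touching or nearly touching the surface produce a 1."""
    return [[1 if min(a, b) >= threshold else 0
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(cam_a, cam_b)]
```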
Check out the explanatory video of Touchlight and a presentation of Touchlight at the Microsoft Research’s faculty summit.
Related projects: The Cabinet, Natural Interaction, The Multi-Touch Sensing Through FTIR Display, tabulaTouch and Tangent.
|Touchlight enables you to draw light across the surface of the screen with your hands.||Optical flow techniques make it possible to rotate and scale the high resolution picture by merely touching and pulling the image.|
The installation presents content related to the German art exhibition documenta in digital form, and consists of a touch-sensitive projection surface which enables the visitor to interact in a very direct manner and thereby choose among different content related to the documenta exhibitions. The installation changes between this interactive mode, where visitors can choose among different content at will, and a more linear mode which gives the visitors an audio-visual tour of the city of Kassel.
The project was initiated as part of the 50th anniversary celebration of the documenta exhibition, and as part of the city of Kassel’s candidacy to become the Cultural Capital of Europe in 2010.
|The 11-metre-long interactive table runs alongside posters from the preceding documenta exhibitions.||By touching the surface you can activate information. In this case, information about the artist Jeff Wall.|
Blinkenlights was a series of installations that used building facades as displays. The installations worked by controlling the lights behind a building’s windows to create a monochrome matrix display.
A notable example was the Blinkenlights Pong, in which users could dial a designated number and use their mobile phones to control a paddle in a game of Pong against either a computer or another human player.
Users could also use custom-made Blinkenpaint software to make small love letters that were displayed on facades.
Related projects: SPOTS.
|Blinkenlights Loveletters.||Play Pong on building facades.|
This year’s Christmas special is dedicated to the house of the Williams family in Deerfield Township, Ohio, US. Carson Williams, who works as an electrical engineer, has created an amazing audio-visual installation in his front garden. The installation consists of 16,000 lights which are programmed to synchronize with various tunes. The lights are controlled by means of 88 Light-O-Rama channels. Light-O-Rama is a programmable light controller that enables you to create moving lighting effects in any environment. Each minute of music required three hours of programming to get the lights to move in sync with the music.
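The idea of programming lights against a musical timeline can be sketched as a time-indexed sequence of channel events. The event format below is invented for illustration and has nothing to do with Light-O-Rama's real sequence files:

```python
# A toy sequence: (time_in_seconds, channel, on/off) events, sorted by time.
SEQUENCE = [(0.0, 1, True), (0.5, 2, True), (1.0, 1, False), (1.5, 2, False)]

def channel_states(sequence, t, n_channels=88):
    """Replay all events up to time t and return each channel's state,
    so the lights can be driven in sync with the music's playback clock."""
    states = {ch: False for ch in range(1, n_channels + 1)}
    for when, channel, state in sequence:
        if when <= t:
            states[channel] = state
    return states
```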
Among this year’s tunes are the Trans-Siberian Orchestra’s “Wizards in Winter” and Barbra Streisand’s “Jingle Bells”. The music is broadcast with a low power FM transmitter, making it possible for passers-by to listen to the music on their car radios while watching the light show.
|The 16,000 lights are programmed to move in sync with the music.|
With the I/O Brush, the world becomes your palette. It is as simple as that. The I/O Brush is a new drawing tool, created at the MIT Media Lab, which makes it possible to explore colours, textures, and movements found in your everyday surroundings.
The I/O Brush looks like a regular physical paintbrush, but instead allows you to pick up colours, textures, and movements from objects surrounding you, and use them as your own special “ink”. It lets you control both input and output, unlike a traditional paintbrush with a fixed colour and texture.
The technical implementation of the I/O Brush system consists of two components: the brush and the drawing canvas. The brush has a small CCD video camera with a ring of light bulbs around it built into the tip of the brush. Touch sensors, also built into the brush, measure the pressure being applied to the bristles. As the brush touches a surface, the lights are turned on briefly to make sure there is enough light for the camera. The system grabs the frames from the camera and stores them in a software program. There are three modes for picking up the “ink”. The texture mode captures a snapshot of the brushed surface consisting of one frame and lets you paint with that particular frame. The colour mode computes the RGB values of all of the pixels in the captured frame, and the most common RGB value is then used as ink. This enables the user to paint with a solid colour. In the movement mode, up to 30 consecutive frames are grabbed, and the user can thereby paint with this grabbed movement.
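The colour mode's "most common RGB value" step is straightforward to sketch, assuming the captured frame is a grid of (R, G, B) tuples:

```python
from collections import Counter

def solid_ink(frame):
    """Colour mode: tally every pixel's RGB value in the captured frame
    and return the most common one as the brush's solid 'ink' colour."""
    pixels = [px for row in frame for px in row]
    return Counter(pixels).most_common(1)[0][0]
```

A frame dominated by red pixels would thus yield red ink, regardless of a few stray pixels of other colours.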
The canvas consists of a large touch screen with a back projection screen. As an additional feature, the brush strokes are linked to movie clips documenting where the user has picked up that particular material. So as the user touches the screen, a movie clip will be played showing where the material was picked up.
Watch the video of the I/O Brush in action.
|The world is your palette. You can pick up every colour, texture and movement surrounding you.||The components of the I/O Brush.|
The aim of Drumhead is to explore a new way of creating a musical controller; a controller that evokes an empathic response in addition to the more conventional forms of responses.
The implementation of Drumhead consists of a polystyrene wig stand onto which is projected a video of a face. When the wig stand is struck by a drumstick, a sound is played and the face responds to being struck. This way the novelty of hitting an object not usually considered to be a percussion instrument is combined with the innovativeness of equipping this same object with a human appearance and response.
The technical construction of Drumhead consists of a piezoelectric transducer inserted into the back of the wig stand. The output of the transducer is directed to the audio input of a computer which drives a piece of software. The software controls the playback of the QuickTime movie of the face, as well as triggering a sound. So when the transducer senses a strike, a sound is played and the video is rewound to a predetermined location in which the face has an appropriate expression.
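The trigger logic described above boils down to a threshold on the transducer's signal level. The threshold value, frame number and action names below are invented for illustration:

```python
REACTION_FRAME = 42   # hypothetical frame where the face shows a struck expression

def handle_sample(level, threshold=0.3):
    """One transducer reading: a level above the threshold counts as a
    strike, which plays the drum sound and rewinds the face movie to the
    predetermined reaction frame."""
    if level < threshold:
        return []                                    # no strike detected
    return [("play_sound", "drum_hit"),
            ("seek_movie", REACTION_FRAME)]
```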
Drumhead was created by Murat N. Konar.
Check out the video of Drumhead.
Go behind the scenes with the making of Drumhead.
|When the wig stand is struck by a drumstick, a sound is played and the face responds with an appropriate expression.||A piezoelectric transducer is inserted into the back of the wig stand.|
Body Movies is a relational architecture installation by Rafael Lozano-Hemmer. The installation is interactive and is designed to transform the Schouwburg Square in Rotterdam by involving the passers-by in the implementation of the piece.
Body Movies consists of a projection onto the 90-metre-long and 22-metre-tall facade of the Pathé cinema in Rotterdam. Projectors with robotic controllers are placed on two towers facing the building. The projectors show more than 1,000 portraits taken on the streets of Rotterdam, Madrid, Mexico and Montreal. Two 7,000-watt light sources are placed at ground level in front of the building, and the portraits are completely washed out by this bright light. However, as people move across the square, their shadows are projected onto the building, and the portraits are revealed within the shadows. Audible feedback is given to the participants in the square when a portrait is revealed. A camera-based tracking system monitors the location of the shadows. When the shadows match all of the portraits in a given scene, a MIDI signal is sent to the robotic controller to trigger a complete blackout, which is then followed by a new series of portraits.
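The blackout condition reduces to a set check over the tracking system's output. A sketch with invented portrait identifiers, standing in for whatever representation the real tracker uses:

```python
def scene_complete(revealed, scene_portraits):
    """True once every portrait in the current scene has been matched by a
    shadow -- the cue to send the MIDI signal that triggers the blackout
    and advances to the next series of portraits."""
    return set(scene_portraits).issubset(revealed)
```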
A printed explanation and a projection of the computer interface lets the participants know how the installation works.
Check out the explanatory video.
|The shadows reveal portraits of people in the streets of Rotterdam, Madrid, Mexico and Montreal.||The projected computer interface lets the participants know how the installation works.|
This interactive art installation enables its users to explore pre-recorded movie content in an entirely new way. Where modern digital players focus on allowing users to make temporal changes when playing a movie, such as stopping, rewinding and skipping, the Khronos Projector allows the user to explore a new dimension. By touching a deformable projection screen, it is possible to send parts of the image forwards or backwards in time. The motivation for creating the Khronos Projector is to free the viewer from the enforced adoption of a single point of view in time and space when watching a movie, i.e. making it possible to change perspective.
The installation set-up consists of a video projector, a large deformable projection screen, and a sensing mechanism that is capable of registering the deformation of the projection screen material in real time. The deformable screen is made of a thin, translucent, elastic fabric and the interaction is registered using infrared light and a CCD camera. The result is an interface that delivers position and pressure information in real time, and thereby allows the user to interact by touching the material with his hands or feet or even by throwing things at the screen.
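The per-pixel effect can be sketched as mapping each pixel's deformation depth to a frame index in the pre-recorded movie: untouched areas show the newest frame, while pressed areas reach back in time. A simplification with invented parameters, not the project's actual algorithm:

```python
def warped_frame_indices(deformation, n_frames, max_shift=30):
    """Each pixel's deformation depth (0.0 = untouched, 1.0 = fully pressed)
    pushes that part of the image back in time by up to max_shift frames,
    clamped so we never reach before the start of the recording."""
    newest = n_frames - 1
    return [[max(0, newest - int(depth * max_shift)) for depth in row]
            for row in deformation]
```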
Check out videos of the installation.
|Physical interaction by touching the deformable screen.||Khronos projection.|
Afterwords is a video installation that creates a poetic experience which visitors are encouraged to explore through their presence and movement.
The installation consists of two opposing screens which together create the interactive space. The rear screen is of a translucent green material and is illuminated from behind by fluorescent tubes, while the front screen is an opaque surface that displays imagery from a ceiling-mounted projector. A video camera is placed at floor level of the front screen; as a person moves between the screens, this person’s image is captured by the video camera, processed by a computer, and finally a stylized full size image of the person is projected onto the front screen. A single line of text floats above the imagery. It is possible to interact with the text in different ways, e.g. touch it to change the text. If more than one person is present in the installation at a time, the projected imagery of these persons will merge as they touch or walk by one another. Their individual texts will disappear and a single text will appear above the formed group.
|A visitor interacts with the installation.||Two persons form a group with a shared text appearing above.|
Digiwall looks like a traditional climbing wall, but is actually a computer game that you climb, offering a new and different experience combining sound, light, music, and play.
The installation is an alternative to computer games that require the user to be sedentary. The concept of Digiwall is to maintain the excitement of computer games while supplementing this with an educational and physical dimension.
The installation is a hybrid between a climbing wall, a computer game, and a musical instrument, and consists of a nearly 16-square-metre wall with 144 climbing-holds. The climbing-holds produce sounds when touched and can be lit up from the inside to show the user which way to climb. Each climbing-hold contains a sensor that communicates with a computer, and by gripping and stepping on the holds in different combinations, the users can create pieces of music or combine sound and light into pieces of art. The functionalities of the climbing-holds are reprogrammable, i.e. it is possible to change their functionality and thereby create new and different experiences for the users.
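The reprogrammable hold-to-sound idea can be sketched as a swappable lookup table from hold index to note. The note numbers and layout below are invented, not Digiwall's actual mapping:

```python
def make_scale_mapping(base_note=36):
    """One possible mapping of the 144 holds to MIDI note numbers; because
    the holds are reprogrammable, swapping this table for another one
    changes the whole experience."""
    return {hold: base_note + (hold % 24) for hold in range(144)}

def notes_for_grip(held_holds, mapping):
    """Gripping or stepping on several holds at once sounds their notes together."""
    return sorted(mapping[h] for h in held_holds)
```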
Digiwall is created by The Interactive Institute in Piteå, Sweden.
|Climbing the wall, users can create pieces of music.|
Digital Experience inspires a lot of people by showing exciting, interesting, innovative or even odd uses of new technology in an experience-oriented context. If you have a good example that we haven't already covered, we would appreciate your suggestion.
Digital Experience is a weblog on interaction and experience design.
The site is created and maintained by CAVI, the Centre for Advanced Visualization and Interaction at Aarhus University. Contributions are welcome.
More about this site.