Ash at CIID

Ashwin Rajan's blog while at the Copenhagen Institute of Interaction Design.

Posts Tagged ‘sensor’

Frontline gloves – Tech Testing


Critical technological components for demonstrating the value of the Frontline gloves concept were 1. the proximity sensor for sensing distances in smoky/dark/low-visibility environments, and 2. basic communication between firefighters made possible by gesture recognition, using bend-sensing technology. Putting together quick and dirty prototypes of these two components helped us test their viability early. Here are a couple of videos of the tests.

Testing Bend Sensors
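
The exact test sketch isn’t reproduced here, but a bend-sensor test amounts to little more than reading the flex sensor and watching the values change as the finger bends. A minimal Arduino sketch of that kind of test, assuming the sensor is wired as a voltage divider into analog pin A0 (pin choice and wiring are illustrative):

// Minimal bend-sensor test (illustrative; pin and wiring are assumptions).
// The flex sensor forms one half of a voltage divider feeding analog pin A0.

const int BEND_PIN = A0;   // analog input from the flex-sensor voltage divider

void setup() {
  Serial.begin(9600);      // print readings so they can be watched in the serial monitor
}

void loop() {
  int raw = analogRead(BEND_PIN);   // 0–1023, changes as the sensor bends
  Serial.print("bend: ");
  Serial.println(raw);
  delay(100);                       // ~10 readings per second is plenty for a test
}

Watching the raw numbers in the serial monitor is enough to pick sensible thresholds for gestures later on.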

Testing Proximity Sensors (Ultrasound)
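
Again, the video shows the physical test; in code, an ultrasonic range test is just a trigger pulse followed by timing the echo. A rough sketch, assuming a standard trigger/echo ultrasonic module (the exact module and pin numbers are assumptions):

// Minimal ultrasonic range test (illustrative; module and pins are assumptions).
// A short trigger pulse is sent, then the echo pulse width is converted to centimetres.

const int TRIG_PIN = 7;
const int ECHO_PIN = 8;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // 10 µs trigger pulse starts a measurement
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // echo pulse width (µs) is proportional to distance; ~58 µs per cm out and back
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);   // time out after 30 ms if no echo
  long distanceCm = duration / 58;

  Serial.print("distance (cm): ");
  Serial.println(distanceCm);
  delay(200);
}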


Written by Ashwin Rajan

February 2, 2009 at 12:45 am

Frontline gloves – an attempt at miniaturization


In our initial ideas, the Frontline gloves product concept consisted of two distinct parts connected by wires: 1. the upper-hand area of the glove, with proximity sensing and signalling capabilities, and 2. the brain and software nestled into a pouch further up the arm (perhaps integrated into the end of the long arm of the glove). The latter would consist essentially of an Arduino, an XBee shield for wireless communication with the paired glove, and a battery pack for powering the whole setup.
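
We didn’t publish the glove firmware, but the communication layer is simple in principle: with the XBee shield in its default transparent mode, the radio mirrors the Arduino’s hardware serial port, so whatever one glove writes with Serial arrives at the paired glove as incoming bytes. A rough sketch of that loop – the pin choices and the one-byte ‘gesture code’ protocol are illustrative assumptions, not the actual firmware:

// Rough sketch of glove-to-glove signalling over an XBee shield in transparent mode.
// Pin numbers and the one-byte message format are illustrative assumptions.

const int GESTURE_BTN_PIN = 2;    // stand-in for a recognised gesture (e.g. a bend-sensor threshold)
const int SIGNAL_LED_PIN  = 13;   // stand-in for the glove's signalling output (LED/vibration)

void setup() {
  Serial.begin(9600);                     // the XBee shield sits on the hardware serial port
  pinMode(GESTURE_BTN_PIN, INPUT_PULLUP);
  pinMode(SIGNAL_LED_PIN, OUTPUT);
}

void loop() {
  // Outgoing: when a gesture is detected, send a one-byte code to the paired glove.
  if (digitalRead(GESTURE_BTN_PIN) == LOW) {
    Serial.write('A');                    // 'A' = "attention" gesture (made-up code)
    delay(300);                           // crude debounce
  }

  // Incoming: any code received from the other glove lights the signal for a moment.
  if (Serial.available() > 0) {
    char code = Serial.read();
    if (code == 'A') {
      digitalWrite(SIGNAL_LED_PIN, HIGH);
      delay(500);
      digitalWrite(SIGNAL_LED_PIN, LOW);
    }
  }
}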

But before long we found ourselves exploring ideas around miniaturizing the product (encouraged by teacher Vinay.V) by combining both parts in one – within the box on the hand of the glove itself (photo above). This, we learnt, would be possible if we got rid of the Arduino board by detaching its chip, clock and a few other components and mounting them onto a much smaller custom-made board.

The integrated single-piece set up of components.


The Tradeoff

Going further, we decided to use the complete XBee shield setup as is, including its board, for this prototype. As a result, we traded two separate components connected by wires for a single, larger component fully integrated into the hand of the glove, battery pack included.

The increasing size of the housing for components as prototyping progressed.


Testing how the box would fit and feel on the glove.


How the setup would fit together.



Frontline gloves – concepts and prototypes


I posted a short note on our recent project in Tangible User Interfaces where we decided to work on wearables for firefighters. Here are some photos of initial sketches and prototypes. More about the actual features of the glove in posts ahead.

Rapidly created scenarios helped us better understand how technology-enhanced gloves could answer critical needs of firefighters in real fire situations.


Because we were working with a set of four or five critical user needs (finalized from researching papers and ongoing projects in wearables for firefighting), the first concept of our product became loaded with features – a classic case of ‘featuritis’.

An all-inclusive first version of the glove.


Exploring possibilities and uses of gesture recognition in the gloves.


Constant visual and verbal feedback from teachers helped us iterate on issues around form, function, interaction, physical limitations and user interface.

In feedback sessions from teachers - drawing by Alex Wiethoff.


In feedback sessions from teachers - drawing by Christopher Scales.


Bend sensors came up as a great option for adding gestural recognition possibilities in a prototype.


Soon we made the leap to testing and working with an actual prototypical glove.

Experimenting with a real glove helped explore issues of viability of gestures, user interface details, etc.
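
The gesture recognition itself can stay very simple at the prototype stage: calibrate each flex sensor on the glove, then threshold the readings and map combinations of bent fingers to a handful of signals. A sketch of that idea – sensor pins, thresholds and the gesture set are illustrative assumptions, not our final mapping:

// Illustrative gesture classification from two flex sensors (index and middle finger).
// Pins and thresholds are assumptions; real values come from calibrating on the glove.

const int INDEX_PIN  = A0;
const int MIDDLE_PIN = A1;
const int BENT_THRESHOLD = 600;   // analogRead value above which a finger counts as bent

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool indexBent  = analogRead(INDEX_PIN)  > BENT_THRESHOLD;
  bool middleBent = analogRead(MIDDLE_PIN) > BENT_THRESHOLD;

  if (indexBent && middleBent) {
    Serial.println("gesture: fist");        // e.g. "stop / danger"
  } else if (!indexBent && middleBent) {
    Serial.println("gesture: pointing");    // e.g. "this way"
  } else {
    Serial.println("gesture: open hand");
  }
  delay(200);
}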


Tap is the new click – Dan Saffer


 

Designing Gestural Interfaces 01


Dan Saffer’s talk at CIID some weeks ago was about his new book on designing for gestural interaction. His detailed and convincing talk clearly indicated that we have a new paradigm of interaction challenge on our hands, one hastened by the radical increase in networks of sensors and other technologies with the potential to create context-aware ‘ecosystems’, fast emerging in advanced urban environments around the world. Why, there are even DIY (prosumer?) versions of these things.

Dan’s enlightening presentation highlighted the various subtleties of the domain as well as some of the first rules of thumb for gestural interface design. Issues highlighted included the limitations of current interaction modes, the (under-explored) importance of ergonomics, types of interactive gestures, the preponderance of sensors, an overview of notation, prototyping for … phew! Immersive stuff, literally. But I will let you find the real thing for yourself in his soon-to-be-released book. There’s also Dan’s exclusive wiki on the subject.

We had our own shot at tinkering around with the delightful Arduino microcontroller some days later, and took the opportunity to develop our own gestural interfaces. My favorite was this one – RubberBots – for its degree of sensitivity and emotional subtlety in response to interaction.
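
RubberBots itself isn’t documented here, but the ‘emotional subtlety’ in sensor-driven pieces like this usually comes from one simple trick: smoothing the input so the actuator eases towards the sensed value instead of twitching with it. A sketch of that technique, assuming an analog sensor on A0 driving a hobby servo on pin 9 (not the RubberBots code):

// Not the RubberBots code – just the smoothing trick that gives a sensor-driven
// actuator a softer, less twitchy response. Pins are assumptions.
#include <Servo.h>

const int SENSOR_PIN = A0;
const int SERVO_PIN  = 9;

Servo servo;
float smoothed = 0;                 // exponentially smoothed sensor value

void setup() {
  servo.attach(SERVO_PIN);
}

void loop() {
  int raw = analogRead(SENSOR_PIN);
  smoothed += 0.05 * (raw - smoothed);           // small factor = slow, "gentle" response
  int angle = map((int)smoothed, 0, 1023, 0, 180);
  servo.write(angle);
  delay(20);
}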

Written by Ashwin Rajan

December 3, 2008 at 8:29 pm

Visualizing Wind


This mini-project was executed as part of the Computational Design course at CIID some weeks back. My objective was to use a Nintendo Wii remote to record simple wind data by suspending it in an open location, and then to use the recorded data to create visualizations programmed in Processing. More details follow.

The Concept:
The intent is to show a concept for visualizing simulated wind data as it would be available from a sensor located on a windmill.
A real stakeholder – a wind data analyst from a leading Danish consultancy – was interviewed to understand the key challenges in wind turbine design.
It is common in the Danish wind industry (and elsewhere) to record wind data at very small intervals of time. By understanding wind patterns in terms of properties such as acceleration and constancy, engineers are able to go beyond the physical limitations of turbine design and evolve increasingly efficient and productive systems.
With over 20 sensors recording different parameters on a single windmill, analysts often face a veritable mountain of data (down to individual seconds). In this context, visualizing such data in a way that facilitates comparison, causality and multivariate evidence becomes key. The poster describes briefly how some of these goals were met.

The delightful ‘Wiimote’ was used in this experiment to mimic the sensor.

Design Context: A real-world scenario
Location: a wind farm out in the North Sea
- 72 turbines
- 20 sensors on each turbine
- Each second of wind data recorded
- Data from 8 years archived for analysis

- The peak production of windmills of all capacities is 60% of full capacity, due to physical limitations
- Measuring the constancy of wind is of most interest to wind analysts and windmill designers
- Acceleration of wind is the biggest deterrent to production, as it wears out the material the most; the least load on the material comes from constant wind
- The main difficulty in real-world wind turbine design isn’t generating the most electricity at a given speed – it is making blades that work across a range of wind speeds
- The design challenge is to measure and visualize wind data in a way that helps engineers interpret the ‘acceleration’, ‘lift’ and ‘orientation’ of the wind.

Using the Wii Remote as the principal sensor
The Wii remote’s accelerometer stands in for the acceleration of the wind through its x, y, z acceleration readings.
- In order to render the acceleration of the wind holistically in a visual manner, we focused on gathering data along the x and y axes and rendered the z axis insignificant – by suspending the sensor (in this case, the Wiimote) from a cord of fixed length.
- Because of the position of the Wii remote when recording the raw wind data, the x and y coordinates are the principal axes in this experiment (see the sketch of the per-sample computation below). This has two benefits:
  - The z sensor contributes nothing to the measured acceleration, which simplifies the calculation by reducing the number of variables involved.
  - The z axis becomes dedicated to measuring ‘lift’, denoted here by the small series of arcs at the bottom.
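
The Processing sketch itself isn’t reproduced here, but the per-sample arithmetic behind the drawing is small: the in-plane wind ‘acceleration’ is the magnitude of the x and y accelerometer readings, while z is set aside for the lift arcs. A plain C++ sketch of that computation (the data structure and field names are made up for illustration):

// Per-sample computation behind the visualization (illustrative; field names are made up).
// x and y give the in-plane acceleration magnitude; z is kept aside for the "lift" arcs.
#include <cmath>
#include <cstdio>

struct WiiSample {
  double ax, ay, az;   // raw accelerometer readings for one recorded moment
};

int main() {
  WiiSample s = {0.12, -0.34, 0.05};                      // example sample

  double planar = std::sqrt(s.ax * s.ax + s.ay * s.ay);   // drives the main acceleration plot
  double lift   = s.az;                                   // drives the small arcs at the bottom

  std::printf("in-plane acceleration: %.3f, lift: %.3f\n", planar, lift);
  return 0;
}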

Anatomy of a second – Composite View

 

And another mode of visualizing the same data set …

Anatomy of a second 02

 

 

Written by Ashwin Rajan

November 30, 2008 at 9:00 pm