Friday, 1 July 2011

The Haptic Cow, Westfield Touchscreen encounter and some interesting research

On Tuesday I went to Haptics Day at the British Library in London.  Haptics is the study of the sense of touch, and there were several exhibits both on display and available for touching.  If you’re squeamish I wouldn’t read or listen much further, but if you’re a keen fan of The Archers on BBC Radio 4 you will know that a vet examining a pregnant cow carries out a certain internal procedure as part of a check up. 

Passing this knowledge on from vet to student has been rather difficult, and the Royal Veterinary College has developed a Haptic Cow which simulates the feeling a vet would get in making ‘an examination’ of the animal.  Not only do you get to ‘feel’ the insides of a cow, but the operator can vary the status of the cow so that the sensor on your finger gives a touch representation of what is going on inside, which would of course be invisible to the vet. 

While I was waiting for my turn to ‘observe’ the displays, the sound of a cow mooing drifted through the lobby of the British Library.  Many a manuscript researcher could have been dreaming of writing a script involving Ruth Archer and the Ambridge vet regarding the state of Bluebell’s insides.  For those of you on Twitter, you may remember I tweeted about this on Tuesday evening. 

Another demonstration of the use of Haptics involved the training of an anaesthetist in delivering an epidural.  If this hurt the patient, a scream could be heard, and when it came to my turn one of the students commented that I had done better than an anaesthetist, having delivered a pain-free injection at the right point.   I did explain that as a recipient of lumbar punctures 10 years ago, I was well aware of what it is to be given a needle between the vertebrae. 

A very interesting exhibition, and I hope this develops into some benefit for all of us.  Haptics has many applications for making what is unseen accessible by touch.  An obvious application would be in touch screen information boards and even tablet computers.  Instead of an audio prompt there could be a virtual touch prompt, in the same way that Braille gives users a tactile alphabet for reading a word. 

I can’t do Braille but I did ‘run’ into a large touch screen information point at the Westfield Shopping Centre in Shepherd’s Bush in London.  Getting to Westfield is easy enough as both the London Underground Central Line and London Overground stations are near. 

I drifted into the Westfield Village, a modern shopping mall with lots of glass, very metallic and monochromatic.  In fact, late on Wednesday afternoon it was surprisingly empty, and the absence of colour makes it difficult, with my limited vision, to recognise whether I am inside or outside a given shop.  The background mood music is pervasive, and there were no apparent staffed help points apart from the touch screen, which I enjoyed poking with my finger.  I noticed the screen change but hadn’t a clue what I had pressed, nor what the answer was. 

The centre is interesting enough if you’re into retail therapy, but I would prefer central London or one of the suburban high streets any day to this.  It’s obviously relatively safe and traffic free, though it seems to lack a sense of community.  Maybe I was unlucky and visited on an ‘off’ day.

Now here is a bit of encouraging news.  Through Twitter, I have met a researcher at Loughborough University involved in a study of accessibility, primarily for wheelchair users, though of course we all benefit from the tactile markings, crossings and dropped kerbs which make crossing the road safe for us.  The project is called the Free Traveller project, and the section below has been extracted from information I received from the project leader, Christopher Parker:

“The project is the ‘public front’ for an experiment which I am running, researching how different forms of information influence the user in terms of information quality, authority and usability.

...  the aim of the experiment is …  to understand how different forms of information effect the user. However, by choosing a ‘vulnerable’ user group, the results of the experiment will also be useful in influencing accessibility planning…

While blind people were considered as a potential user community since they experience similar (but different) risks when travelling, it was deemed a little too complex for this experiment. This was because while wheelchair users may be easily categorised by the mobility provided by their wheelchair (easy to group participants), it may be difficult to assess the level of visual impairment in a group of participating blind persons. Also, there is a potential complexity with running the experiment online. While I am aware that a good proportion of those registered blind in the UK would be able to access the website successfully, their experience of the information would be partly dependent upon the degree of their visual impairment. This would introduce a hard to verify and control variable, relative to the online experience a wheelchair user may have.

I would however like to point you towards a project which I am involved with, which will hopefully be able to make use of the outcomes from this experiment; Access Advisor. …

At the moment the project is in its infancy, but within 6 months it will be launched properly and is definitely worth following.”