To start the ball rolling this month I want to look at a development that got a mention in various technology forums recently: the Mind Reading Computer. It sounds a little scary at first, but on reading through the detail I see that this is actually a computer that reads facial expressions and then infers what is being thought about, or, to give it its formal book chapter title: “Real-time inference of complex mental states from facial expressions and head gestures”.
Couple this idea with an android that Osaka University’s Hiroshi Ishiguro has developed in his own image, which can re-create vaguely life-like facial expressions through motion capture in real time, and you have the makings of a self-perpetuating universe, where humans need not intervene but can simply watch a scenario unfold before their eyes.
It’s easy to imagine a crowd of life-like androids sitting around a room and interpreting each other’s facial expressions and head gestures, having been seeded by a discussion-starter MC-android with an appropriate (perhaps provocative) series of statements and gestures. This could be as entertaining as watching table-side banter between Christian women and Muslim men at a speed-dating session, or, equally, as banal as watching Big Brother Up Late: The Poolside Sessions. Any closed-loop situation such as this has the potential to fall on either side of the line when you think about it.