Humans and computers are getting closer, and gestures are just the beginning, says SMART's Edward Tse
Edward Tse: SMART Technologies
When interactive whiteboards became a quasi-orthodoxy in schools, many pundits lost interest, making cynical asides about the "sage on the stage". But gesture interfaces, popularised on devices like the iPhone, mean the game is changing.

And when you consider that "pinch" and "expand" are just the first entries in a new, developing vocabulary of gestures, it's time to talk to an expert, like SMART Technologies' Edward Tse, for an indication of the interactive advances to come, especially when speech and position are added. Welcome to the world of proxemics.

Edward Tse, SMART's project research leader, is a 360-degree technologist. He is just as happy spearheading a team to develop clever new products like the SMART Table as he is demonstrating developments to lay users at technology events like BETT (see YouTube clip below, from 2009). And it is all delivered with a personal and professional enthusiasm, grounded in pedagogy, that's highly infectious.

He's happy to gaze into the technology crystal ball, but his research work and his role in the education market mean that he is thoroughly grounded. "A lot of times when we predict the future we often compose it of things that are present today," he says. "The classroom of tomorrow is filled with the technologies of today.

"I think Smart has a different approach where we examine the emerging pedagogical practices that are happening in schools and we build technologies and tools to simplify this process within the classroom. So really we are not trying to just create new technology per se we are also trying to figure out what’s happening within schools. For example, with the Smart Table what we were originally thinking about was learner-centric pedagogy – people working within a classroom – and we found that in the kindergarten grade three ages a lot of the children in those ages are working in small groups and we wanted to create technology that would support this kind of learner-centric pedagogy. And so the Smart Table wasn’t designed for teachers; instead it was designed for learners working together."

http://www.youtube.com/watch?v=xMKUZhgikTg

http://www.youtube.com/watch?v=_hn0Ga4cJXY
Edward Tse presenting gesture techniques at BETT back in 2009

Innovation doesn't just incubate in corporate labs under non-disclosure agreements and then pop out as perfectly formed products. There are whole research communities out there, with roots in academia too. Edward Tse is involved with The SurfNet Network, a Canadian research alliance of "academic researchers, industry partners, and government collaborators". Its aims are to "improve the development, performance, and usability of software applications for surface computing environments: nontraditional digital display surfaces including multi-touch screens, tabletops, and wall-sized displays. Surfaces naturally support group work and collaboration."

He also attends the annual ACM International Conference on Interactive Tabletops and Surfaces (SMART is a champion sponsor). This all makes for a rich community of expertise, and it's no surprise that surface computing was the subject of Edward Tse's PhD (see his two PhD research videos below). It was at this conference that he saw ice blocks being used to create interactive walls (a solution reactive to infrared light was mixed with the water): "We are going to see more and more creative uses and I think one of the really good venues for learning about these esoteric uses is the Interactive Tabletop and Surfaces conference. We have a community that explores just multi-touch and explores what can be done if we find new, funky ways of doing interaction."

The growing ubiquity of gestures in what he calls "the casual computing environment" fires Edward Tse's enthusiasm for future developments. Most people, he points out, don't want to be “hunkered down” over a computer for most of the time with a keyboard and mouse. It could be that they are at a meeting, for example, where the computing takes a social form, “where it’s really about the face-to-face communications that people have over the table so you don’t have one person who is the controller of all the interaction”.

Interactive technology is also making an appearance in restaurants and bars, where it is highly functional rather than just an attractive innovation. Until recently, visitors back from the US would regale friends with stories of ordering food and drink from their "interactive table" in a Las Vegas bar or restaurant. But now London has its own oriental fusion restaurant, Inamo, where the table surfaces are all part of the service, and a core of the business too. It works, it's popular and it's catching on.

'We are building a language of multi-touch'

The first touch gestures are, of course, just that – a beginning. "I feel that what we are doing is building a language of multi-touch," says Edward Tse, "and really we are starting to work on the very basis of that language at this time, which is the vocabulary.

"Essentially teachers are learning this multi-touch vocabulary. They are learning that, OK, there are certain gestures that you can do with multiple fingers and this is what they will do with a computer. But eventually they become a part of our own language, part of the way that we communicate with other people. It doesn’t matter if you are in Canada or you’re in China – those same gestures mean the same thing everywhere around the world.

"And so my feeling is 'What does this have, or how will it affect teachers?' I feel it’s going to change the vocabulary. It’s going to increase the vocabulary that teachers have to communicate with their students. All of their students will be familiar with these gestures as well and they will know what that means. As we start building more and more of these gestures, we have a more and more powerful way of communicating between teacher and students. And the students can also communicate with each other using this language as well.

"My feeling with multi-touch, especially over the large display is that we’re moving in to this notion that, instead of just interacting with the computer, we’re starting to build not just a way to communicate with computers but also ways to communicate with other people. What I mean is these multi-touch gestures that you’re seeing – for example the scaling and pinching – they’re starting to mean something that’s not just important for the computer but also for other people." (See videos below of Edward Tse's PhD work on speech and gestures over digital tables.)

http://www.youtube.com/watch?v=UyjycmOXm1E

http://www.youtube.com/watch?v=RuU2smVPX3g

For those who like their future Blade Runner style, the next stage will be to bring in speech and proximity, hence the relatively new study of proxemics. This is where humans and computers get closer, and the computer literally knows where you are and even has a sense of what you are doing (check out the videos). The challenge for educational technologists here is understanding and 'taming' the technology before mapping it to the pedagogy.
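In proxemic interaction research, a display typically changes its behaviour according to how far away a tracked person is. The zone names and distances below are illustrative assumptions only, loosely echoing the proxemic zones of anthropologist Edward T. Hall rather than any particular product:

```python
def proxemic_zone(distance_m):
    """Map a tracked person's distance from a display (in metres)
    to an interaction zone. Thresholds are illustrative only."""
    if distance_m < 0.5:
        return "personal"      # close enough to touch: full interaction
    if distance_m < 2.0:
        return "social"        # show detail, invite approach
    if distance_m < 6.0:
        return "public"        # ambient content, attract attention
    return "out-of-range"      # display idles

print(proxemic_zone(0.3))  # close enough to touch: "personal"
```

The interesting design work happens at the transitions – a wall display that wakes as someone crosses into the "social" zone is the kind of behaviour shown in the proxemics videos below.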

This greater level of interaction allows the pedagogy to become more kinaesthetic and active. You only have to look to gaming technologies like the Wii and Kinect and apply just a little imagination to guess what this can mean for activities like PE and dance.

"Speech and gestures are very much the same communication system," explains Edward Tse. "So computers need to get better at understanding human intent and they need to know when we want to interact with each other and when we don’t. Multi-touch is nice in that the only time we want to interact with computers is when we are touching on the surface. But this becomes harder with technologies like speech, so computers need to get an understanding of the way that people communicate, and I think we are getting there with multi-touch.

"That’s the first step, with the gestures. I do think we need to continue to do more in terms of combining both the speech and the gesture together because that’s where the real context and the real information about what we intend or what we want the computer to do is located."

'Kids learn better by doing and we need to have technology that can support that'

The bottom line for education companies, however, is to provide teachers and learners with the tools and the content to create excellent learning experiences, and to further develop them. Edward Tse explains: "We are moving away from teacher-centred pedagogy to something that’s much more learner-centric or actually teacher and learner-centric.

"We are finding now, especially with kinesthetic learning, that kids learn better by doing and we need to have technology that can support that. So we have the new 800 series Smartboard to support people working anywhere around the board and these kinds of activities are starting to become more and more an important part of the classroom. Our technologies are really trying to get ready for when we start to see this shift in the educational pedagogy."

There's one more powerful and constantly replicating driver to this heady mix of technology, research and product development, and that's children: "Children are growing up with touch and they are changing their expectations. They don’t have the same notion that we have that interacting with computers will be a single cursor, a mouse and a keyboard.

"Before they’ve even learned the alphabet, they’re learning how to read or they are learning letters right on their phone. Their first exposure to computing is through a touch-screen or through a touch-surface and, when we get up to the age of kindergarten to grade three, kids already know how to interact with technology like the Smart Table. So we don’t need to do any training, there’s no learning involved; it’s just something that they are already used to, they pick up right away.

"This is really important because we feel that touch actually may be one way of getting really young kids into computing much earlier. One of the things we are learning is that kids can interact with touch-screens much earlier than they can with a traditional computer with a keyboard and a mouse.

"What I am expecting for the longer term is that multi-touch becomes a large part of our vocabulary, becomes a larger part of the language in which we use to communicate with other people."

Dr Edward Tse is project research leader with SMART Technologies

BETT 2011, January 12-15
Olympia, London

SMART Technologies: stand B50

More information

http://edwardtse.com
Wikipedia on proxemics


http://www.youtube.com/watch?v=OHm9teVoNE8
Video of the proxemic interactions of Nicolai Marquardt from the University of Calgary, who works under the SurfNet research alliance

http://www.youtube.com/watch?v=bbtrI6GjBsk
Video of multi-touch ice installation

http://www.youtube.com/watch?v=bwP4vBgHixI
Apple’s vision of the future circa 1988 (is that a nascent iPad, 22 years before launch?)



