Engaging and interacting with users - engagement and movement patterns

I’m looking for behaviours/algorithms/API support so that Jibo can simulate human engagement involving single and multiple users. Obviously you will be providing lots of tools to simulate human engagement; I’ll just give some use-case input concerning one problem area in my skill design.

Let’s just start with movement patterns as Jibo interacts with multiple users, beyond just rotating Jibo’s head mindlessly left to right and back again ;)

When Jibo talks, he is socially inclusive and gives attention to people (participants). Each person likes that! Jibo engages her/him personally while attending to the needs of the whole group. Jibo can execute patterns of attention toward a particular person or persons, and these patterns can be extended by the skill. Humans learn these patterns: engage someone by looking into their eyes, orient your body a bit toward the person, ask questions, smile, laugh, etc. Our heads and bodies orient to others, so where do we look next? We keep an eye on newcomers, and perhaps widen our gaze to be socially inclusive, etc.
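
To make this concrete, here’s a rough TypeScript sketch of how a skill might score who to attend to next. Everything here (the `Participant` shape, the weights, `pickNextTarget`) is my own invention for illustration, not an actual SDK API:

```ts
// Hypothetical participant record tracked by the skill.
interface Participant {
  id: string;
  position: { x: number; y: number; z: number }; // perceived world-frame location
  lastAttendedMs: number; // when Jibo last looked at this person
  isSpeaking: boolean;
  isNewcomer: boolean;    // recently entered the group
}

// Score each participant: favour the current speaker, people we have
// neglected for a while, and newcomers who should be acknowledged.
function attentionScore(p: Participant, nowMs: number): number {
  const neglectSec = (nowMs - p.lastAttendedMs) / 1000;
  return (
    neglectSec * 1.0 +
    (p.isSpeaking ? 5.0 : 0.0) +
    (p.isNewcomer ? 3.0 : 0.0)
  );
}

// Pick the participant with the highest attention score.
function pickNextTarget(group: Participant[], nowMs: number): Participant | undefined {
  return group.reduce<Participant | undefined>(
    (best, p) =>
      best === undefined || attentionScore(p, nowMs) > attentionScore(best, nowMs)
        ? p
        : best,
    undefined
  );
}
```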

What’s important is that there is always some variation in how each of us converses with others in a group. Our moves are not simply sweeps from left to right and back again.
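
For example, instead of always picking the top-scored person (which degenerates into a predictable sweep), the scheduler could sample the next gaze target in proportion to its score and jitter the dwell time. A sketch, building on the hypothetical `Participant`/`attentionScore` above:

```ts
// Sample the next gaze target with probability proportional to its score,
// so the pattern varies from round to round.
function sampleNextTarget(group: Participant[], nowMs: number): Participant {
  const scores = group.map((p) => attentionScore(p, nowMs));
  const total = scores.reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (let i = 0; i < group.length; i++) {
    r -= scores[i];
    if (r <= 0) return group[i];
  }
  return group[group.length - 1];
}

// Dwell somewhere between 1.5 and 4 seconds, so gaze shifts feel organic
// rather than metronomic.
function dwellTimeMs(): number {
  return 1500 + Math.random() * 2500;
}
```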

As Jibo gives focus/attention, the skill can listen for the entities being detected; from each entity we get a location, etc.
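
I imagine the hook looking something like the event-driven sketch below; the `perception` emitter, the event name, and the entity shape are all assumptions on my part, not a documented API:

```ts
import { EventEmitter } from "events";

// Hypothetical shape of a detected entity as delivered to the skill.
interface DetectedEntity {
  entityId: number;
  name?: string;                                  // present only if identified
  location: { x: number; y: number; z: number };  // where Jibo perceives them
}

// Stand-in for the SDK's perception stream.
const perception = new EventEmitter();

perception.on("entity-detected", (entity: DetectedEntity) => {
  // From the entity we get a location Jibo can orient toward.
  console.log(`Entity ${entity.entityId} at`, entity.location);
});
```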

The skill can react, look up the person’s name, and perhaps ask a question. Or, if there are entities that don’t have a name, the skill runs a behaviour to register those persons, etc.
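
Something like this branching, again with made-up helper names (`say`, `runRegistrationBehaviour`) standing in for real skill behaviours, and the `DetectedEntity` shape from the sketch above:

```ts
// React to the person Jibo is currently attending to: greet known people
// by name, register unknown ones.
async function onAttend(entity: DetectedEntity): Promise<void> {
  if (entity.name !== undefined) {
    // Known person: greet by name and perhaps ask a question.
    await say(`Nice to see you, ${entity.name}! How was your day?`);
  } else {
    // Unknown person: run a behaviour that registers them.
    await runRegistrationBehaviour(entity);
  }
}

// Stand-ins for skill behaviours (assumed, not SDK calls).
async function say(text: string): Promise<void> {
  console.log(`Jibo says: ${text}`);
}
async function runRegistrationBehaviour(entity: DetectedEntity): Promise<void> {
  console.log(`Registering entity ${entity.entityId}...`);
}
```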

The skill might pause the interaction behaviour to spend a little more time on the person, and then move on or break off for whatever reason.
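
A minimal sketch of that pause/resume idea, assuming the interaction loop checks a paused target each cycle (all names hypothetical):

```ts
class InteractionLoop {
  private pausedOn: Participant | null = null;

  // Linger on one person; auto-resume so Jibo never gets stuck.
  pauseOn(p: Participant, maxMs = 10_000): void {
    this.pausedOn = p;
    setTimeout(() => this.resume(), maxMs);
  }

  resume(): void {
    this.pausedOn = null;
  }

  // While paused, keep attending to the same person; otherwise rotate
  // through the group as usual.
  tick(group: Participant[], nowMs: number): Participant {
    return this.pausedOn ?? sampleNextTarget(group, nowMs);
  }
}
```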

Patterns of interaction: these might be set up based on profiles suited to Jibo’s personality.
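
As a rough idea, a profile could just be a bundle of tuning knobs the skill loads at startup; the fields and values below are invented for illustration:

```ts
// Hypothetical interaction-pattern profile: parameters the scoring and
// dwell logic above would read instead of hard-coded constants.
interface InteractionProfile {
  name: string;
  speakerBias: number;            // how strongly gaze favours the current speaker
  newcomerBonus: number;          // extra attention for people who just arrived
  dwellRangeMs: [number, number]; // min/max time spent looking at one person
}

const profiles: InteractionProfile[] = [
  { name: "attentive-host", speakerBias: 5.0, newcomerBonus: 4.0, dwellRangeMs: [2000, 5000] },
  { name: "shy",            speakerBias: 2.0, newcomerBonus: 1.0, dwellRangeMs: [1000, 2500] },
];
```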

Best, Bob
