Is Jibo’s emotive system persistent? And if so, do we have access to it via the SDK? For example, would I be able to query how Jibo is feeling (e.g., sad, happy, or tired) at any time from my skill so that I could alter his output accordingly?
And continuing that thought, would we be able to access the user’s emotional state as well?
Our team is still working on Jibo’s emotion system, and we will announce more details when we are ready. In the meantime, you can definitely emulate this yourself: think about how Jibo would react to certain responses or requests, and plan for those reactions in your skill.
For example, in your turn-around skill, how does Jibo feel about being told to turn around? How does Jibo feel the first time the user tells him to stay turned around? How do you think the user or Jibo would feel after repeated requests?
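One way to emulate this today is to keep a small mood state inside your own skill and update it as requests come in. Below is a minimal, hypothetical sketch in plain JavaScript; the `MoodTracker` class and its method names are our own invention for illustration, not part of the Jibo SDK.

```javascript
// Hypothetical per-skill mood tracker (NOT a Jibo SDK API).
// The skill keeps its own emotional state, nudges it on each request,
// and picks dialogue lines to match the current mood.

const MOODS = ['happy', 'content', 'annoyed', 'sad'];

class MoodTracker {
  constructor() {
    this.level = 1; // start out 'content'
  }

  // Call each time the user repeats the same request
  // (e.g. telling Jibo to stay turned around again).
  onRepeatedRequest() {
    this.level = Math.min(this.level + 1, MOODS.length - 1);
  }

  // Call when the user does something nice (thanks Jibo, lets him turn back).
  onPositiveInteraction() {
    this.level = Math.max(this.level - 1, 0);
  }

  get mood() {
    return MOODS[this.level];
  }

  // Choose a line of dialogue appropriate to the current mood.
  responseFor(request) {
    switch (this.mood) {
      case 'happy':   return `Sure, I'd love to ${request}!`;
      case 'content': return `Okay, I'll ${request}.`;
      case 'annoyed': return `Again? Fine, I'll ${request}...`;
      default:        return `If you really want me to ${request}.`;
    }
  }
}

const tracker = new MoodTracker();
console.log(tracker.responseFor('turn around')); // first ask, while 'content'
tracker.onRepeatedRequest();
console.log(tracker.mood); // after a repeat: 'annoyed'
```

A structure like this also leaves you well placed to swap in the real emotive system later: the places where your skill reads `tracker.mood` are exactly where you would read Jibo’s actual emotional state instead.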
Definitely check out our design style guide and speech style guide for more information and guidance. See what you can come up with, and you will be well prepared when we release more details about his emotive system.