Facial Expression Design

For Robot

My Contribution (Dec 2019 – Feb 2020)

Define facial expression interaction for the service robot system

Coding in Arduino

Overview

表情切換.gif

I defined a personality for the robot and studied the scenarios the robot goes through while executing its tasks.

 

Based on the risk level of each scenario, I used expression design to help users understand the robot's mood.

 

There is a huge difference between an expressionless robot and an expressive one, and that difference shapes how users perceive the commercial brand. An expressionless robot is just an automated tool to its users, but a robot with expressions invites empathy during interaction and makes the communication feel more intelligent and engaging.

My Design Process

研究著录.png
1. User Research

方案管理.png
2. Solution

设计.png
3. Coding

测试 (1).png
4. Test

User Research

蚁人与场景交互物调研示意图.png

During exhibitions and competitive product research, I found that people often did not understand the robot's intentions. When people saw the robot approaching, they might pat it or talk to it, even though the robot was on a task and could not interact with them.


Because people cannot tell the status of the robot's current task, they assume the robot is ignoring them, which degrades the human-robot interaction experience.



Therefore, the Alibaba robot defines a dedicated lighting specification for its task scenarios, so the robot can prompt people with light, warn them of dangerous situations, or provide rich visual guidance through colorful changes in light color.

蚁人与场景交互物调研分析表.png

Solution

Traditional robot design

將板子放在機器人上.gif

Traditional robot design focuses on user interfaces that guide users in operating the robot so it can help them accomplish certain tasks.

Facial Expression Design Setting

Smart robot design

屏幕快照 2020-08-23 上午2.22.54.png
表情切換.gif

To communicate with people more smoothly, the new generation of robots needs a simpler and more direct form of symbolic expression.

Design Solution

Facial Expression Design 

Inspired by the many sticker and emoji packs popular on social networks, I began to consider whether facial expressions could supplement the visual experience alongside voice interaction and UI interaction.

wink.jpg
震惊(确定).jpg
正常.jpg
左看.jpg
晕.jpg
观察.jpg
右看.jpg
不耐烦.jpg
沮丧.jpg
闭眼.jpg
心水(喜欢).jpg
翻白眼.jpg

Interactive Prototype

谨慎.gif
眨眼2.gif

Coding

The original plan was to hand the coding of the facial expression settings over to engineers, but they were too busy with their own projects, so I learned simple hardware programming by myself through online video tutorials.

表情測試.gif
屏幕+字.gif

At first, I bought two kinds of dot-matrix displays and hesitated over whether to develop on the left one or the right one.

 

However, I found that the left display can only be programmed with MakeCode, which has many limitations, so I settled on the LED dot-matrix display on the right and programmed it with Arduino.

The LED board is a 32×16 matrix.
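As a rough illustration (not the actual project code), a 32×16 RGB panel of this kind is typically brought up on an Arduino with Adafruit's RGBmatrixPanel library, roughly as in the sketch below. The panel type, wiring, and pin numbers here follow the library's standard example and are assumptions, not the ones used in this project.

```cpp
// Minimal sketch, assuming a 32x16 HUB75-style RGB panel on an Arduino Uno
// driven with the Adafruit RGBmatrixPanel library (example wiring).
#include <RGBmatrixPanel.h>

#define CLK 8
#define OE  9
#define LAT 10
#define A   A0
#define B   A1
#define C   A2

// 32x16 panels use the A/B/C row-select lines; 'false' = no double buffering.
RGBmatrixPanel matrix(A, B, C, CLK, LAT, OE, false);

void setup() {
  matrix.begin();
  matrix.fillScreen(matrix.Color333(0, 0, 0));      // clear the panel

  // Draw two simple "eyes" as filled circles, as a stand-in for a real frame.
  matrix.fillCircle(10, 8, 3, matrix.Color333(0, 7, 7));
  matrix.fillCircle(21, 8, 3, matrix.Color333(0, 7, 7));
}

void loop() {
  // Static frame; expression switching is sketched further below.
}
```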

-----

As a liberal arts student, I pick up programming relatively slowly, so it took me about two weeks of study and another two weeks to turn the facial expression design set into animations on the LED board.

我寫的表情+Arduino.gif
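The expression sketches themselves are not reproduced on this page, so the following is only a sketch of the general pattern for putting a designed expression set onto the panel: each expression is exported as a 1-bit 32×16 bitmap stored in flash, and the sketch cycles through the frames. The frame names, pin assignments, timing, and bitmap data are placeholders, not the real design assets.

```cpp
#include <RGBmatrixPanel.h>
#include <avr/pgmspace.h>

#define CLK 8
#define OE  9
#define LAT 10
#define A   A0
#define B   A1
#define C   A2

RGBmatrixPanel matrix(A, B, C, CLK, LAT, OE, false);

// Each expression frame is a 32x16 monochrome bitmap: 4 bytes per row,
// 16 rows = 64 bytes. These arrays are placeholders (all pixels off);
// the real frames would be exported from the expression designs.
const uint8_t FRAME_NEUTRAL[64] PROGMEM = { 0 };
const uint8_t FRAME_WINK[64]    PROGMEM = { 0 };

const uint8_t* const FRAMES[] = { FRAME_NEUTRAL, FRAME_WINK };
const uint8_t FRAME_COUNT = sizeof(FRAMES) / sizeof(FRAMES[0]);

void drawFrame(const uint8_t *frame, uint16_t color) {
  matrix.fillScreen(0);
  // drawBitmap() comes from Adafruit_GFX: width 32, height 16, 1 bit per pixel.
  matrix.drawBitmap(0, 0, frame, 32, 16, color);
}

void setup() {
  matrix.begin();
}

void loop() {
  static uint8_t current = 0;
  drawFrame(FRAMES[current], matrix.Color333(0, 7, 7));
  current = (current + 1) % FRAME_COUNT;  // switch to the next expression
  delay(1500);                            // hold time per expression (tunable)
}
```

The hold time and the Color333() values are exactly the kinds of parameters that were later tuned on the actual robot, as described in the Test section.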

Test

將板子放在機器人上.gif
表情切換.gif

When the expressions were finished, I compiled the Arduino code into a hex file and handed it to the software engineer to hook up the relevant communication protocol. The LED dot-matrix display then went to the mechanical engineer, who installed it in the robot so the results could be seen on the actual machine.
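The actual communication protocol the software engineer used is not documented here. As a purely hypothetical illustration of what the integration point could look like, the sketch below lets a host controller select an expression by sending a single index byte over serial; all names and values are assumptions.

```cpp
// Hypothetical integration sketch: host sends one byte over serial,
// the Arduino shows the expression with that index.
#include <RGBmatrixPanel.h>
#include <avr/pgmspace.h>

#define CLK 8
#define OE  9
#define LAT 10
#define A   A0
#define B   A1
#define C   A2

RGBmatrixPanel matrix(A, B, C, CLK, LAT, OE, false);

// Placeholder 32x16 1-bit frames (all off); real frames come from the designs.
const uint8_t FRAME_NEUTRAL[64] PROGMEM = { 0 };
const uint8_t FRAME_ALERT[64]   PROGMEM = { 0 };
const uint8_t* const FRAMES[] = { FRAME_NEUTRAL, FRAME_ALERT };

void showExpression(uint8_t index) {
  if (index >= sizeof(FRAMES) / sizeof(FRAMES[0])) return;  // ignore bad IDs
  matrix.fillScreen(0);
  matrix.drawBitmap(0, 0, FRAMES[index], 32, 16, matrix.Color333(0, 7, 7));
}

void setup() {
  Serial.begin(9600);
  matrix.begin();
  showExpression(0);   // neutral face by default
}

void loop() {
  if (Serial.available() > 0) {
    showExpression(Serial.read());   // host picks the expression by index
  }
}
```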

Once the LED matrix screen was installed on the actual machine, I recompiled the files several times, adjusting the color values and the rate of expression changes to make the expressions look more vivid on the robot.