Deliverable 4 - Worked Hours Report
In this part we present the arm component of the expressions. It will be integrated with the facial expressions shown on the display in the integration deliverable (D5). We synchronized the motors and implemented the movement code in Python, mainly using the adafruit_servokit library. We also used threads to make the movements more fluid, since there are two motors per arm and one in the head.
“Celebrating” Arms and Head Configuration
“Happy” Arms Configuration
“Thinking” Arms Configuration
“Dancing” Arms Configuration
“Asking” Arms Configuration
“Sad” Arms Configuration
Below are the facial expressions that will be shown on the display. The display is an ILI9341 Arduino shield, and we are using a C library to control it from a Raspberry Pi:
“Sad” facial expression