So here we are at the finish line for this design challenge, and for my first attempt at a design challenge. A huge pat on the back to my partner dougw, who had the lion's share of the design and manufacturing portions of the project.
In this final blog I will post my test code and the final voice control Python script.
I started out planning a dual GUI and voice control Python program that would be able to control and set up the spice platter either by voice commands or by the touchscreen. I then found that designing and implementing a GUI in Python is not as simple or drag-and-drop as I would have liked, so the GUI was abandoned in favour of voice control only. Once I started digging around with the Voice Assistant to see what I could do with it, I realized that, despite some limitations, it is quite good for general queries and simple commands.
Pros:
- Once linked with my Gmail account, I could use the Pi assistant to locate all the Android devices registered on my account (a couple of phones and a tablet).
- Queries made to the assistant on one of my other devices (from a room where the Pi assistant was not located) would be answered on the Pi. This is also a con, but only if the devices were on different floors of the house.
-> This also only worked when all the devices were on the same wireless network
- I used VNC to connect to my Pi, and if the IP address of my Pi changed, I could just ask it what the IP address was.
- It tells jokes. My daughter says they are bad jokes, so I know they are really, really good ones...
- The AIY Voice Kit is a slightly less capable Google Home or Home Mini device. This is also a con.
- Google has made it very easy to program custom commands.
- It can add calendar events to my Google Calendar.
Cons:
- Not all Google services are available on the Pi assistant (Play Music, for example). On my phones and tablet I can say 'OK Google, play some music' and it will start playing music. The Pi assistant can't do that yet.
- Cannot access phone functions through the assistant (can't send an SMS message or dial a phone number)
- Cannot access Gmail
- Cannot break apart a command into smaller parts (yet):
-> I wanted to make a command for loading spices into the platter: 'OK Google, load <some spice name> into <some jar location>'. I could not figure out or find a way to have the assistant capture the whole phrase and then let me programmatically pull the spice name and jar location out of the command (a sketch of how that parsing might look follows this list).
- Sometimes you need to make sure you enunciate your words. 'Spice of Pi' was sometimes interpreted as 'Spice of Thai', which apparently is a Thai restaurant about 2000 km from my location. I wonder if they deliver...
- English words... Thyme and time sound the same.
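That said, the Assistant library does hand you the full transcript of what was said (event.args['text'] in the SpiceofPi script further down), so the phrase itself can be picked apart in Python once you have it as a string. Here is a minimal sketch of what that parsing could look like; the helper name and the exact phrase format are just assumptions for illustration, not something the finished script does:

import re

# Hypothetical helper: pull the spice name and jar location out of a phrase like
# 'load basil into jar 7'. Assumes the full transcript is already available as a
# plain string, the way event.args['text'] is in the SpiceofPi script below.
LOAD_PATTERN = re.compile(r'load (?P<spice>.+) into (?:jar )?(?P<location>\d+)')

def parse_load_command(text):
    match = LOAD_PATTERN.search(text.lower())
    if match is None:
        return None
    return match.group('spice').strip(), int(match.group('location'))

# Example: parse_load_command('Load basil into jar 7') returns ('basil', 7)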
All in all, I found the Voice Kit to be a fun tool to use in this project. As time goes on it can only become more capable and easier to program for.
Here is the code for testing and determining the proper angle to be used for the lifting mechanism.
#!/usr/bin/env python3

import aiy.audio
import aiy.cloudspeech
import aiy.voicehat
from gpiozero import AngularServo
import time


def main():
    recognizer = aiy.cloudspeech.get_recognizer()
    recognizer.expect_phrase('inner')
    recognizer.expect_phrase('outer')
    recognizer.expect_phrase('middle')
    button = aiy.voicehat.get_button()
    aiy.audio.get_recorder().start()

    # In the next line the first parameter is the servo's actual GPIO number, not the servo number:
    # servo 0 would be 26
    # servo 1 would be 6
    # servo 2 would be 13
    # servo 3 would be 5
    # servo 4 would be 12
    # servo 5 would be 24
    s = AngularServo(5, min_angle=-45, max_angle=30, frame_width=10/1000)

    while True:
        s.detach()
        print('Press the button and speak')
        button.wait_for_press()
        print('Listening...')
        text = recognizer.recognize()
        if text is None:
            print('Sorry, I did not hear you.')
        else:
            print('You said "', text, '"')
            if 'inner' in text:
                print('Lifting inner bottle')
                s.angle = -45
                time.sleep(1)
            elif 'outer' in text:
                print('Lifting outer bottle')
                s.angle = 20
                time.sleep(1)
            elif 'middle' in text:
                print('Returning to middle')
                s.angle = -20
                time.sleep(1)


if __name__ == '__main__':
    main()
Here is the code for determining the rotation angle.
#!/usr/bin/env python3

import aiy.audio
import aiy.cloudspeech
import aiy.voicehat
from gpiozero import AngularServo
import time


def main():
    recognizer = aiy.cloudspeech.get_recognizer()
    recognizer.expect_phrase('meat')
    recognizer.expect_phrase('cheese')
    recognizer.expect_phrase('water')
    recognizer.expect_phrase('milk')
    recognizer.expect_phrase('toaster')
    recognizer.expect_phrase('oven')
    recognizer.expect_phrase('knife')
    recognizer.expect_phrase('spoon')
    recognizer.expect_phrase('pan')
    recognizer.expect_phrase('pepper')
    recognizer.expect_phrase('salt')
    recognizer.expect_phrase('spatula')
    recognizer.expect_phrase('sugar')
    recognizer.expect_phrase('basil')
    recognizer.expect_phrase('lemon')
    recognizer.expect_phrase('clove')
    button = aiy.voicehat.get_button()
    aiy.audio.get_recorder().start()

    # In the next line the first parameter is the servo's actual GPIO number, not the servo number:
    # servo 0 would be 26
    # servo 1 would be 6
    # servo 2 would be 13
    # servo 3 would be 5
    # servo 4 would be 12
    # servo 5 would be 24
    s = AngularServo(6, min_angle=-180, max_angle=180, frame_width=10/1000)

    while True:
        s.detach()
        print('Press the button and speak')
        button.wait_for_press()
        print('Listening...')
        text = recognizer.recognize()
        if text is None:
            print('Sorry, I did not hear you.')
        else:
            print('You said "', text, '"')
            if 'meat' in text:
                print('Moving servo to bottle one')
                s.angle = -176
                time.sleep(4)
            elif 'cheese' in text:
                print('Moving servo to bottle two')
                s.angle = -155
                time.sleep(4)
            elif 'water' in text:
                print('Moving servo to bottle three')
                s.angle = -135
                time.sleep(4)
            elif 'milk' in text:
                print('Moving servo to bottle four')
                s.angle = -113
                time.sleep(4)
            elif 'toaster' in text:
                print('Moving servo to bottle five')
                s.angle = -95
                time.sleep(4)
            elif 'oven' in text:
                print('Moving servo to bottle six')
                s.angle = -73
                time.sleep(4)
            elif 'knife' in text:
                print('Moving servo to bottle seven')
                s.angle = -53
                time.sleep(4)
            elif 'spoon' in text:
                print('Moving servo to bottle eight')
                s.angle = -31
                time.sleep(4)
            elif 'pan' in text:
                print('Moving servo to bottle nine')
                s.angle = -18
                time.sleep(4)
            elif 'pepper' in text:
                print('Moving servo to bottle ten')
                s.angle = 10
                time.sleep(4)
            elif 'salt' in text:
                print('Moving servo to bottle eleven')
                s.angle = 36
                time.sleep(4)
            elif 'spatula' in text:
                print('Moving servo to bottle twelve')
                s.angle = 58
                time.sleep(4)
            elif 'sugar' in text:
                print('Moving servo to bottle thirteen')
                s.angle = 85
                time.sleep(4)
            elif 'basil' in text:
                print('Moving servo to bottle fourteen')
                s.angle = 107
                time.sleep(4)
            elif 'lemon' in text:
                print('Moving servo to bottle fifteen')
                s.angle = 129
                time.sleep(4)
            elif 'clove' in text:
                print('Moving servo to bottle sixteen')
                s.angle = 151
                time.sleep(4)


if __name__ == '__main__':
    main()
And finally, the SpiceofPi script.
#!/usr/bin/env python3

import logging
import subprocess
import sys

import aiy.assistant.auth_helpers
import aiy.audio
import aiy.voicehat
from google.assistant.library import Assistant
from google.assistant.library.event import EventType
from gpiozero import AngularServo
import time

logging.basicConfig(
    level=logging.INFO,
    format="[%(asctime)s] %(levelname)s:%(name)s:%(message)s"
)


def power_off_pi():
    aiy.audio.say('Good bye!')
    subprocess.call('sudo shutdown now', shell=True)


def reboot_pi():
    aiy.audio.say('See you in a bit!')
    subprocess.call('sudo reboot', shell=True)


def say_ip():
    ip_address = subprocess.check_output("hostname -I | cut -d' ' -f1", shell=True)
    aiy.audio.say('My IP address is %s' % ip_address.decode('utf-8'))


def get_spice(SpinAngle, LiftAngle, s, l):
    l.angle = -20            # return the lifter to its middle position
    time.sleep(1)
    s.angle = SpinAngle      # rotate the platter to the requested jar
    print('Moving platter')
    time.sleep(4)
    l.angle = LiftAngle      # lift the inner or outer jar
    print('Lifting bottle')
    time.sleep(1)
    print('Enjoy your spice!')
    print('Say "OK, Google" then speak, or press Ctrl+C to quit...')


def process_event(assistant, event, s, l):
    status_ui = aiy.voicehat.get_status_ui()
    if event.type == EventType.ON_START_FINISHED:
        status_ui.status('ready')
        if sys.stdout.isatty():
            print('Say "OK, Google" then speak, or press Ctrl+C to quit...')

    elif event.type == EventType.ON_CONVERSATION_TURN_STARTED:
        status_ui.status('listening')

    elif event.type == EventType.ON_RECOGNIZING_SPEECH_FINISHED and event.args:
        print('You said:', event.args['text'])
        text = event.args['text'].lower()
        if text == 'turn off':
            assistant.stop_conversation()
            print('going to power off')
            power_off_pi()
        elif text == 'restart':
            assistant.stop_conversation()
            print('going to reboot')
            reboot_pi()
        elif text == 'ip address':
            assistant.stop_conversation()
            say_ip()
        elif text == 'can i have allspice':
            assistant.stop_conversation()
            get_spice(-176, 20, s, l)
        elif text == 'can i have basil':
            assistant.stop_conversation()
            get_spice(-155, 20, s, l)
        elif text == 'can i have caraway':
            assistant.stop_conversation()
            get_spice(-155, -45, s, l)
        elif text == 'can i have cardamom':
            assistant.stop_conversation()
            get_spice(-135, 20, s, l)
        elif text == 'can i have cayenne pepper':
            assistant.stop_conversation()
            get_spice(-113, 20, s, l)
        elif text == 'can i have celery seed':
            assistant.stop_conversation()
            get_spice(-113, -45, s, l)
        elif text == 'can i have chili seasoning':
            assistant.stop_conversation()
            get_spice(-95, 20, s, l)
        elif text == 'can i have cinnamon':
            assistant.stop_conversation()
            get_spice(-73, 20, s, l)
        elif text == 'can i have clove':
            assistant.stop_conversation()
            get_spice(-73, -45, s, l)
        elif text == 'can i have coriander':
            assistant.stop_conversation()
            get_spice(-53, 20, s, l)
        elif text == 'can i have cumin':
            assistant.stop_conversation()
            get_spice(-31, 20, s, l)
        elif text == 'can i have curry':
            assistant.stop_conversation()
            get_spice(-31, -45, s, l)
        elif text == 'can i have dill weed':
            assistant.stop_conversation()
            get_spice(-18, 20, s, l)
        elif text == 'can i have ginger':
            assistant.stop_conversation()
            get_spice(10, 20, s, l)
        elif text == 'can i have marjoram':
            assistant.stop_conversation()
            get_spice(10, -45, s, l)
        elif text == 'can i have mexican chili powder':
            assistant.stop_conversation()
            get_spice(36, 20, s, l)
        elif text == 'can i have nutmeg':
            assistant.stop_conversation()
            get_spice(58, 20, s, l)
        elif text == 'can i have oregano':
            assistant.stop_conversation()
            get_spice(58, -45, s, l)
        elif text == 'can i have paprika':
            assistant.stop_conversation()
            get_spice(85, 20, s, l)
        elif text == 'can i have peppercorns':
            assistant.stop_conversation()
            get_spice(107, 20, s, l)
        elif text == 'can i have rosemary':
            assistant.stop_conversation()
            get_spice(107, -45, s, l)
        elif text == 'can i have sesame seeds':
            assistant.stop_conversation()
            get_spice(129, 20, s, l)
        elif text == 'can i have tarragon':
            assistant.stop_conversation()
            get_spice(151, 20, s, l)
        elif text == 'can i have thyme':
            assistant.stop_conversation()
            get_spice(151, -45, s, l)
        elif text == 'can i have time':
            # 'thyme' is often recognized as 'time', so treat them the same
            assistant.stop_conversation()
            get_spice(151, -45, s, l)
        else:
            logging.error("Not a Spice of Pi command.")

    elif event.type == EventType.ON_END_OF_UTTERANCE:
        status_ui.status('thinking')

    elif event.type == EventType.ON_CONVERSATION_TURN_FINISHED:
        status_ui.status('ready')

    elif event.type == EventType.ON_ASSISTANT_ERROR and event.args and event.args['is_fatal']:
        sys.exit(1)


def main():
    credentials = aiy.assistant.auth_helpers.get_assistant_credentials()

    # In the next two lines the first parameter is the servo's actual GPIO number, not the servo number:
    # servo 0 would be 26
    # servo 1 would be 6
    # servo 2 would be 13
    # servo 3 would be 5
    # servo 4 would be 12
    # servo 5 would be 24
    spinner = AngularServo(6, min_angle=-180, max_angle=180, frame_width=10/1000)
    lifter = AngularServo(5, min_angle=-45, max_angle=30, frame_width=10/1000)

    with Assistant(credentials) as assistant:
        for event in assistant.start():
            process_event(assistant, event, spinner, lifter)


if __name__ == '__main__':
    main()
Since the actual platter had to be disassembled for transportation, here are the instructions to recalibrate the angles for spinning the platter and lifting the jars:
Put the correct GPIO number for your servos in the lines with AngularServo(XX, ...).
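For reference, here is the servo-position-to-GPIO mapping from the comments in the scripts above, written out as a small lookup table. This is just a sketch (the SERVO_GPIO name is mine); use whichever header positions you actually wired the spinner and lifter to:

from gpiozero import AngularServo

# Voice HAT servo header position -> BCM GPIO number (from the comments in the scripts above)
SERVO_GPIO = {0: 26, 1: 6, 2: 13, 3: 5, 4: 12, 5: 24}

# Example wiring: spinner on servo position 1 (GPIO 6), lifter on servo position 3 (GPIO 5),
# matching the numbers used in the SpiceofPi script
spinner = AngularServo(SERVO_GPIO[1], min_angle=-180, max_angle=180, frame_width=10/1000)
lifter = AngularServo(SERVO_GPIO[3], min_angle=-45, max_angle=30, frame_width=10/1000)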
You may need to make sure the angles are correct for your servos. Both test scripts are triggered by the push button, so you can check the position of the platter and lifter and make sure each angle is correct.
                                    Top of jar
                                      _____
 larger number moves the jar left    |     |    smaller number moves the jar right
                                     |_____|
                                    Bottom of jar
I would only connect one servo at a time when testing/determining the proper angles for the servos.
Once you determine the proper angles, change the ones that need changing in the SpiceofPi script.
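If the angles end up changing often, one option (just a suggestion, not how the script above is currently written) is to collect them into a single lookup table near the top of the script, so recalibration only touches one place. A short sketch, using values taken from the SpiceofPi script:

# Hypothetical lookup table: spice name -> (platter rotation angle, lift angle).
# The values shown are taken from the elif chain in the SpiceofPi script above.
SPICE_ANGLES = {
    'allspice': (-176, 20),
    'basil':    (-155, 20),
    'caraway':  (-155, -45),
    'cardamom': (-135, 20),
    # ... remaining spices follow the same pattern ...
    'thyme':    (151, -45),
}

def handle_spice_request(text, spinner, lifter):
    # Strip the leading 'can i have ' and look the remainder up in the table
    spice = text.replace('can i have ', '').strip()
    if spice in SPICE_ANGLES:
        spin_angle, lift_angle = SPICE_ANGLES[spice]
        get_spice(spin_angle, lift_angle, spinner, lifter)   # get_spice() as defined in the script above
        return True
    return False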
One thing I don't think I put in the instruction text file on the SD card I gave Doug: all of these .py files are located in the /AIY-voice-kit-python/src directory.
And finally how I numbered all the jar locations...
Design Challenge Links:
Project Links:
Blog Glenn 1 - AIY Voice Kit Unboxing
Blog Doug 2 - The Block Diagram and Bill of Materials
Blog Doug 3 - Spice Jar Lift Mechanism
Blog Glenn 2 - Firmware Considerations
Blog Doug 5 - Platter Rotation Mechanism
The Spice of Pi - Blog Glenn 3
Blog Doug 6 - 3D Printed Platter Parts
Blog Doug 7 - Main Drive Assembly
Blog Doug 8 - Working Carousel
Blog Doug 9 - Google Assistant
The Spice of Pi - Glenn blog #4
The Spice of Pi - Glenn blog #5
The Spice of Pi - Glenn blog # 6
The Spice of Pi - Glenn Blog # 7
The Spice of Pi - Glenn Blog #8