aiy.assistant¶
APIs that simplify interaction with the Google Assistant API in one of two ways: using either aiy.assistant.grpc or aiy.assistant.library, corresponding to the Google Assistant Service and Google Assistant Library, respectively. Which of these you choose may depend on your intentions.
The Google Assistant Service provides a gRPC interface that is generally more complicated to use. However, the aiy.assistant.grpc API offered here does not expose that interface directly. Instead, it completely wraps the google.assistant.embedded.v1alpha2 APIs, taking care of all the complicated setup for you and handling all response events. Thus, if all you want is to build a basic version of the Google Assistant, using aiy.assistant.grpc is easiest because it requires the least amount of code. For an example, see src/examples/voice/assistant_grpc_demo.py.
On the other hand, aiy.assistant.library is a thin wrapper around google.assistant.library. It overrides the Assistant.start() method to handle device registration, but beyond that, you can and must use the google.assistant.library APIs to respond to all events returned by the Google Assistant. As such, aiy.assistant.library provides more control, allowing you to build custom device commands based on conversation with the Google Assistant. For an example, see src/examples/voice/assistant_library_with_local_commands_demo.py.
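The pattern in that demo can be sketched as follows. This is a minimal sketch, not the demo itself: the command table and the local-command action are placeholder assumptions, while EventType.ON_RECOGNIZING_SPEECH_FINISHED and stop_conversation() come from google.assistant.library. The entry point is shown but not invoked, since it requires a configured Voice Kit.

```python
def match_local_command(text):
    # Map a recognized transcript to a local command name (or None).
    # The command table here is a placeholder; add your own phrases.
    commands = {
        'turn on the light': 'light_on',
        'turn off the light': 'light_off',
    }
    return commands.get(text.strip().lower())

def main():
    # Requires a configured Voice Kit, so imports are kept local here.
    from google.assistant.library.event import EventType
    from aiy.assistant import auth_helpers
    from aiy.assistant.library import Assistant

    credentials = auth_helpers.get_assistant_credentials()
    with Assistant(credentials) as assistant:
        for event in assistant.start():
            if (event.type == EventType.ON_RECOGNIZING_SPEECH_FINISHED
                    and event.args):
                command = match_local_command(event.args['text'])
                if command:
                    # Suppress the Assistant's own answer and act locally.
                    assistant.stop_conversation()
                    print('Running local command:', command)

# On the Voice Kit, run main(); it is not invoked here.
```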
Additionally, only aiy.assistant.library includes built-in support for hotword detection (such as “Okay Google”). However, if you’re using the Raspberry Pi Zero (provided with the V2 Voice Kit), you cannot use hotword detection because that feature depends on the ARMv7 architecture and the Pi Zero has only ARMv6. So that feature of the library works only with Raspberry Pi 2/3; if you’re using a Pi Zero, you must instead use the button or another type of trigger to initiate a conversation with the Google Assistant. (Note: The Voice Bonnet can be used on any Raspberry Pi.)
Tip: If all you want to do is create custom voice commands (such as “turn on the light”), then
you don’t need to interact with the Google Assistant. Instead, you can use aiy.cloudspeech
to convert your voice commands into text that triggers your actions.
Note
These APIs are designed for the Voice Kit, but have no dependency on the Voice HAT/Bonnet specifically. However, they do require some type of sound card attached to the Raspberry Pi that can be detected by the ALSA subsystem.
aiy.assistant.grpc¶
Enables a conversation with the Google Assistant, using the Google Assistant Service, which connects to a streaming endpoint over gRPC.
This gRPC service is typically more complicated to set up than the Google Assistant Library, but this API takes care of all the complexity for you. You simply create an instance of AssistantServiceClient, then start the Google Assistant by calling conversation().
This API provides only an interface to initiate a conversation with the Google Assistant. It speaks and prints all responses for you—it does not allow you to handle the response events or create custom commands. For an example, see src/examples/voice/assistant_grpc_demo.py.
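As a sketch of that flow (assuming a configured Voice Kit with credentials at ~/assistant.json; the button-press trigger and volume value are illustrative choices, not requirements). The entry point is defined but not invoked, since it requires the hardware:

```python
def main():
    # Requires a configured Voice Kit; imports are local for that reason.
    from aiy.assistant.grpc import AssistantServiceClient
    from aiy.board import Board

    # The client loads ~/assistant.json and sets up the gRPC channel itself.
    client = AssistantServiceClient(language_code='en-US',
                                    volume_percentage=80)
    with Board() as board:
        while True:
            board.button.wait_for_press()  # Press the button, then speak.
            client.conversation()          # Speaks and prints the responses.

# On the Voice Kit, run main(); it is not invoked here.
```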
If you want to integrate custom device commands with the Google Assistant using the gRPC interface,
instead use the Google Assistant Service directly. For an example, see this gRPC sample.
Or instead of interacting with the Google Assistant, you can use aiy.cloudspeech
to convert your voice commands into text that triggers your actions.
class aiy.assistant.grpc.AssistantServiceClient(language_code='en-US', volume_percentage=100)¶
Bases: object
Provides a simplified interface for the EmbeddedAssistant.
Parameters:
- language_code – Language expected from the user, in IETF BCP 47 syntax (default is “en-US”). See the list of supported languages.
- volume_percentage – Volume level of the audio output. Valid values are 1 to 100 (corresponding to 1% to 100%).
conversation(deadline=185)¶
Starts a conversation with the Google Assistant.
The device begins listening for your query or command and waits indefinitely. Once it completes a query/command, it returns to listening for another.
Parameters:
- deadline – The amount of time (in seconds) to wait for each gRPC request to complete before terminating.
volume_percentage¶
Volume level of the audio output. Valid values are 1 to 100 (corresponding to 1% to 100%).
class aiy.assistant.grpc.AssistantServiceClientWithLed(board, language_code='en-US', volume_percentage=100)¶
Bases: aiy.assistant.grpc.AssistantServiceClient
Same as AssistantServiceClient, but also turns the Voice Kit’s button LED on and off in response to the conversation.
Parameters:
- board – An instance of Board.
- language_code – Language expected from the user, in IETF BCP 47 syntax (default is “en-US”). See the list of supported languages.
- volume_percentage – Volume level of the audio output. Valid values are 1 to 100 (corresponding to 1% to 100%).
conversation(deadline=185)¶
Starts a conversation with the Google Assistant.
The device begins listening for your query or command and waits indefinitely. Once it completes a query/command, it returns to listening for another.
Parameters:
- deadline – The amount of time (in seconds) to wait for each gRPC request to complete before terminating.
volume_percentage¶
Volume level of the audio output. Valid values are 1 to 100 (corresponding to 1% to 100%).
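A hedged sketch of using this class (assuming a configured Voice Kit; the button-press loop mirrors the grpc demo, but the details here are illustrative). The entry point is defined but not invoked, since it requires the hardware:

```python
def main():
    # Requires a configured Voice Kit; imports are local for that reason.
    from aiy.assistant.grpc import AssistantServiceClientWithLed
    from aiy.board import Board

    with Board() as board:
        # The button LED turns on and off as the conversation progresses.
        client = AssistantServiceClientWithLed(board=board,
                                               language_code='en-US',
                                               volume_percentage=100)
        while True:
            board.button.wait_for_press()
            client.conversation()

# On the Voice Kit, run main(); it is not invoked here.
```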
aiy.assistant.library¶
Facilitates access to the Google Assistant Library, which provides APIs to initiate conversations with the Google Assistant and create custom device commands.
This includes a wrapper for the Assistant class only. You must import all other Google Assistant classes directly from the google.assistant.library module to handle each of the response events.
Note
Hotword detection (such as “Okay Google”) is not supported with the Raspberry Pi Zero (only with Raspberry Pi 2/3). If you’re using a Pi Zero, you must instead use the button or another type of trigger to initiate a conversation with the Google Assistant.
class aiy.assistant.library.Assistant(credentials)¶
Bases: google.assistant.library.Assistant
A wrapper for the Assistant class that handles model and device registration based on the project name in your OAuth credentials file (assistant.json).
All the Assistant APIs are available through this class, such as start() to start the Assistant and start_conversation() to start a conversation, but they are not documented here. Instead, refer to the Google Assistant Library for Python documentation.
To get started, you must call get_assistant_credentials() and pass the result to the Assistant constructor. For example:

    from google.assistant.library.event import EventType
    from aiy.assistant import auth_helpers
    from aiy.assistant.library import Assistant

    credentials = auth_helpers.get_assistant_credentials()
    with Assistant(credentials) as assistant:
        for event in assistant.start():
            process_event(event)

For more example code, see src/examples/voice/assistant_library_demo.py.
Parameters:
- credentials – The Google OAuth2 credentials for the device. Get this from get_assistant_credentials().
aiy.assistant.auth_helpers¶
Authentication helper for the Google Assistant API.
aiy.assistant.auth_helpers.get_assistant_credentials(credentials_file=None)¶
Retrieves the OAuth credentials required to access the Google Assistant API.
If you’re using aiy.assistant.library, you must call this function and pass the result to the Assistant constructor.
If you’re using aiy.assistant.grpc, you do not need this function because AssistantServiceClient calls it during initialization (using the credentials file at ~/assistant.json).
Parameters:
- credentials_file – Absolute path to your JSON credentials file. If None, it looks for the file at ~/assistant.json. To get a credentials file, follow these instructions.
Returns:
The device OAuth credentials, as a google.oauth2.credentials.Credentials object.
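For illustration, both call styles look like this (a sketch assuming the AIY libraries are installed and a valid credentials file exists; the non-default path is hypothetical):

```python
from aiy.assistant import auth_helpers

# Default: reads the credentials file at ~/assistant.json.
credentials = auth_helpers.get_assistant_credentials()

# Or pass an absolute path to a JSON credentials file elsewhere
# (this path is illustrative, not a convention of the library):
credentials = auth_helpers.get_assistant_credentials(
    credentials_file='/home/pi/my-credentials.json')
```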