page.title=Optimizing Content for the Assistant
page.metaDescription=Support contextually relevant actions through the Assist API.
page.tags=assist, accessibility, now, now on tap
meta.tags="assistant", "marshmallow", "now"
page.image=images/cards/card-assist_16-9_2x.png

page.article=true
@jd:body

<div id="tb-wrapper">
<div id="tb">
    <h2>In this document</h2>
    <ol>
      <li><a href="#assist_api">Using the Assist API</a>
      <ol>
        <li><a href="#assist_api_lifecycle">Assist API Lifecycle</a></li>
        <li><a href="#source_app">Source App</a></li>
        <li><a href="#destination_app">Destination App</a></li>
      </ol>
      </li>
      <li><a href="#implementing_your_own_assistant">Implementing your
      own assistant</a></li>
    </ol>
  </div>
</div>

<p>
  Android 6.0 Marshmallow introduces a new way for users to engage with apps
  through the assistant.
</p>

<p>
  Users summon the assistant with a long-press on the Home button or by saying
  the {@link android.service.voice.AlwaysOnHotwordDetector keyphrase}. In
  response to the long-press, the system opens a top-level window that displays
  contextually relevant actions for the current activity. These potential
  actions might include deep links to other apps on the device.
</p>

<p>
  This guide explains how Android apps use Android's Assist API to improve the
  assistant user experience.
</p>


<h2 id="assist_api">Using the Assist API</h2>

<p>
  The example below shows how Google Now integrates with the Android assistant
  using a feature called Now on Tap.
</p>

<p>
  The assistant overlay window in our example (steps 2 and 3 in Figure 1) is
  implemented by Google Now through a feature called Now on Tap, which works
  in concert with Android platform-level functionality. The system allows the
  user to select the assistant app (Figure 2); the selected assistant obtains
  contextual information from the <em>source</em> app using the Assist API,
  which is part of the platform.
</p>


<div>
  <img src="{@docRoot}images/training/assistant/image01.png">
  <p class="img-caption" style="text-align:center;">
    Figure 1. Assistant interaction example with the Now on Tap feature of
    Google Now
  </p>
</div>

<p>
  An Android user first configures the assistant and can change system
  options, such as whether the assistant may use the text and view hierarchy
  and a screenshot of the current screen (Figure 2).
</p>

<p>
  From there, the assistant receives the information only when the user
  activates assistance, such as when they tap and hold the Home button
  (shown in Figure 1, step 1).
</p>

<div style="float:right;margin:1em;max-width:300px">
  <img src="{@docRoot}images/training/assistant/image02.png">
  <p class="img-caption" style="text-align:center;">
    Figure 2. Assist &amp; voice input settings (<em>Settings/Apps/Default
    Apps/Assist &amp; voice input</em>)
  </p>
</div>

<h3 id="assist_api_lifecycle">Assist API Lifecycle</h3>

<p>
  Going back to our example from Figure 1, the Assist API callbacks are invoked
  in the <em>source</em> app after step 1 (user long-presses the Home button)
  and before step 2 (the assistant renders the overlay window). Once the user
  selects the action to perform (step 3), the assistant executes it, for
  example by firing an intent with a deep link to the (<em>destination</em>)
  restaurant app (step 4).
</p>

<h3 id="source_app">Source App</h3>

<p>
  In most cases, your app does not need to do anything extra to integrate with
  the assistant if you already follow <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">accessibility best
  practices</a>. This section describes how to provide additional information
  to help improve the assistant user experience, as well as scenarios, such as
  custom Views, that need special handling.
</p>

<h4 id="share_additional_information_with_the_assistant">Share Additional Information with the Assistant</h4>

<p>
  In addition to the text and the screenshot, your app can share
  <em>additional</em> information with the assistant. For example, your music
  app can choose to pass current album information, so that the assistant can
  suggest smarter actions tailored to the current activity.
</p>

<p>
  To provide additional information to the assistant, your app provides
  <em>global application context</em> by registering an app listener and
  supplies activity-specific information with activity callbacks as shown in
  Figure 3.
</p>

<div>
  <img src="{@docRoot}images/training/assistant/image03.png">
  <p class="img-caption" style="text-align:center;">
    Figure 3. Assist API lifecycle sequence diagram.
  </p>
</div>

<p>
  To provide global application context, the app creates an implementation of
  {@link android.app.Application.OnProvideAssistDataListener} and registers it
  using {@link
  android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener)}.
  To provide activity-specific contextual information, the activity overrides
  {@link android.app.Activity#onProvideAssistData(android.os.Bundle)}
  and {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)}.
  The two activity methods are called <em>after</em> the optional global
  callback is invoked. Because the callbacks execute on the main thread, they
  should complete <a href="{@docRoot}training/articles/perf-anr.html">promptly</a>.
  The callbacks are invoked only when the activity is <a href=
  "{@docRoot}reference/android/app/Activity.html#ActivityLifecycle">running</a>.
</p>

<h5 id="providing_context">Providing Context</h5>

<p>
  {@link android.app.Activity#onProvideAssistData(android.os.Bundle)} is called
  when the user requests assistance and the system builds a full {@link
  android.content.Intent#ACTION_ASSIST} Intent with all of the context of the
  current application, represented as an instance of {@link
  android.app.assist.AssistStructure}. You can override this method to place
  into the bundle anything you would like to appear in the
  <code>EXTRA_ASSIST_CONTEXT</code> part of the assist Intent.
</p>
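
<p>
  A minimal sketch of such an override follows; the extras keys and values are
  hypothetical:
</p>

<pre class="prettyprint">
&commat;Override
public void onProvideAssistData(Bundle data) {
  super.onProvideAssistData(data);
  // Hypothetical extras describing what the user is viewing; the
  // assistant can read them from the EXTRA_ASSIST_CONTEXT part of
  // the assist Intent.
  data.putString("com.example.music.ALBUM_TITLE", "Album Title");
  data.putInt("com.example.music.TRACK_NUMBER", 7);
}
</pre>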

<h5 id="describing_content">Describing Content</h5>

<p>
  Your app can implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)}
  to improve the assistant user experience by providing references to content
  related to the current activity. You can describe the app content using the
  common vocabulary defined by <a href="https://schema.org">Schema.org</a>
  through a JSON-LD object. In the example below, a music app provides
  structured data to describe the music album the user is currently
  viewing.
</p>

<pre class="prettyprint">
&commat;Override
public void onProvideAssistContent(AssistContent <strong>assistContent</strong>) {
  super.onProvideAssistContent(<strong>assistContent</strong>);

  String structuredJson = <strong>new </strong>JSONObject()
       .put(<strong>"@type"</strong>, <strong>"MusicRecording"</strong>)
       .put(<strong>"@id"</strong>, <strong>"https://example.com/music/recording"</strong>)
       .put(<strong>"name"</strong>, <strong>"Album Title"</strong>)
       .toString();

  <strong>assistContent</strong>.setStructuredData(structuredJson);
}
</pre>

<p>
  Custom implementations of {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)}
  may also adjust the provided {@link
  android.app.assist.AssistContent#setIntent(android.content.Intent) content
  intent} to better reflect the top-level context of the activity, supply
  {@link android.app.assist.AssistContent#setWebUri(android.net.Uri) the URI}
  of the displayed content, and fill in its {@link
  android.app.assist.AssistContent#setClipData(android.content.ClipData)} with
  additional content of interest that the user is currently viewing.
</p>
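
<p>
  For example, an activity might supply the web URI of the content it
  displays, roughly as follows (the URI is hypothetical):
</p>

<pre class="prettyprint">
&commat;Override
public void onProvideAssistContent(AssistContent assistContent) {
  super.onProvideAssistContent(assistContent);
  // Hypothetical web URI for the album the user is viewing.
  assistContent.setWebUri(
      Uri.parse("https://example.com/music/album/42"));
}
</pre>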

<h4 id="default_implementation">Default Implementation</h4>

<p>
  If your activity implements neither the {@link
  android.app.Activity#onProvideAssistData(android.os.Bundle)} nor the {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)}
  callback, the system still proceeds and passes the automatically collected
  information to the assistant unless the current window is flagged as
  <a href="#excluding_views">secure</a>.
  As shown in Figure 3, the system uses the default implementations of {@link
  android.view.View#onProvideStructure(android.view.ViewStructure)} and {@link
  android.view.View#onProvideVirtualStructure(android.view.ViewStructure)} to
  collect text and view hierarchy information. If your view implements custom
  text drawing, override {@link
  android.view.View#onProvideStructure(android.view.ViewStructure)} and call
  {@link android.view.ViewStructure#setText(java.lang.CharSequence)} to provide
  the assistant with the text shown to the user.
</p>
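
<p>
  A sketch for a hypothetical view that draws its text manually:
</p>

<pre class="prettyprint">
public class TickerView extends View {
  // Text drawn manually in onDraw(); the default implementation of
  // onProvideStructure() cannot see it.
  private CharSequence displayedText;

  public TickerView(Context context, AttributeSet attrs) {
    super(context, attrs);
  }

  &commat;Override
  public void onProvideStructure(ViewStructure structure) {
    super.onProvideStructure(structure);
    // Expose the custom-drawn text to the assistant.
    structure.setText(displayedText);
  }
}
</pre>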

<p>
  <strong>In most cases, implementing accessibility support will enable the
  assistant to obtain the information it needs.</strong> This includes
  providing {@link android.R.attr#contentDescription
  android:contentDescription} attributes, populating {@link
  android.view.accessibility.AccessibilityNodeInfo} for custom views, making
  sure custom {@link android.view.ViewGroup ViewGroups} correctly {@link
  android.view.ViewGroup#getChildAt(int) expose} their children, and following
  the best practices described in <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">“Making Applications
  Accessible”</a>.
</p>
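
<p>
  For instance, adding a content description to an image-only control helps
  both accessibility services and the assistant; the resource IDs below are
  hypothetical:
</p>

<pre class="prettyprint">
ImageButton playButton = (ImageButton) findViewById(R.id.play_button);
playButton.setContentDescription(getString(R.string.play_album));
</pre>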

<h4 id="excluding_views">Excluding views from the assistant</h4>

<p>
  An activity can exclude its views from the assistant. This is accomplished
  by setting the {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} layout parameter of the WindowManager, which must be done
  explicitly for every window created by the activity, including dialogs. Your
  app can also use {@link android.view.SurfaceView#setSecure(boolean)
  SurfaceView.setSecure} to exclude a surface from the assistant. There is no
  global (app-level) mechanism to exclude all views from the assistant. Note
  that <code>FLAG_SECURE</code> does not cause the Assist API callbacks to stop
  firing: an activity that uses <code>FLAG_SECURE</code> can still explicitly
  provide information to the assistant using the callbacks described earlier
  in this guide.
</p>
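
<p>
  A minimal sketch of flagging an activity's window as secure (the layout
  resource is hypothetical):
</p>

<pre class="prettyprint">
&commat;Override
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  // Exclude this window's content from the assistant (and from
  // screenshots); repeat for every window the activity creates.
  getWindow().setFlags(WindowManager.LayoutParams.FLAG_SECURE,
      WindowManager.LayoutParams.FLAG_SECURE);
  setContentView(R.layout.activity_main);
}
</pre>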

<h4 id="voice_interactions">Voice Interactions</h4>

<p>
  Assist API callbacks are also invoked upon {@link
  android.service.voice.AlwaysOnHotwordDetector keyphrase detection}. For more
  information, see the <a href="https://developers.google.com/voice-actions/">voice
  actions</a> documentation.
</p>

<h4 id="z-order_considerations">Z-order considerations</h4>

<p>
  The assistant uses a lightweight overlay window displayed on top of the
  current activity, and the user can summon the assistant at any time.
  Therefore, apps should not create permanent {@link
  android.Manifest.permission#SYSTEM_ALERT_WINDOW system alert}
  windows that interfere with the overlay window, as shown in Figure 4.
</p>

<div style="">
  <img src="{@docRoot}images/training/assistant/image04.png">
  <p class="img-caption" style="text-align:center;">
    Figure 4. Assist layer Z-order.
  </p>
</div>

<p>
  If your app uses {@link
  android.Manifest.permission#SYSTEM_ALERT_WINDOW system alert} windows, it
  must remove them promptly; leaving them on screen degrades the user
  experience.
</p>
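
<p>
  For example, an overlay previously added through the WindowManager can be
  removed as soon as it is no longer needed; <code>mOverlayView</code> is a
  hypothetical field holding the added view:
</p>

<pre class="prettyprint">
WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
if (mOverlayView != null) {
  wm.removeView(mOverlayView);
  mOverlayView = null;
}
</pre>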

<h3 id="destination_app">Destination App</h3>

<p>
  The matching between the current user context and potential actions displayed
  in the overlay window (shown in step 3 in Figure 1) is specific to the
  assistant’s implementation. However, consider adding <a href=
  "{@docRoot}training/app-indexing/deep-linking.html">deep linking</a> support
  to your app. The assistant will typically take advantage of deep linking. For
  example, Google Now uses deep linking and <a href=
  "https://developers.google.com/app-indexing/">App Indexing</a> in order to
  drive traffic to destination apps.
</p>
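
<p>
  On the receiving side, a destination activity might handle the deep link it
  was launched with along these lines; the URI format and helper method are
  hypothetical:
</p>

<pre class="prettyprint">
&commat;Override
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  // Handle a deep link such as https://example.com/restaurant/123
  // fired by the assistant.
  Uri data = getIntent().getData();
  if (data != null) {
    showRestaurant(data.getLastPathSegment()); // hypothetical helper
  }
}
</pre>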

<h2 id="implementing_your_own_assistant">Implementing your own assistant</h2>

<p>
  Some developers may wish to implement their own assistant. As shown in Figure
  2, the Android user can select the active assistant app. The
  assistant app must provide an implementation of {@link
  android.service.voice.VoiceInteractionSessionService} and {@link
  android.service.voice.VoiceInteractionSession}, as shown in <a href=
  "https://android.googlesource.com/platform/frameworks/base/+/android-5.0.1_r1/tests/VoiceInteraction?autodive=0%2F%2F%2F%2F%2F%2F">
  this example</a>, and it requires the {@link
  android.Manifest.permission#BIND_VOICE_INTERACTION} permission. The assistant
  can then receive the text and view hierarchy, represented as an instance of
  {@link android.app.assist.AssistStructure}, in {@link
  android.service.voice.VoiceInteractionSession#onHandleAssist(android.os.Bundle,
  android.app.assist.AssistStructure,android.app.assist.AssistContent) onHandleAssist()}.
  It receives the screenshot through {@link
  android.service.voice.VoiceInteractionSession#onHandleScreenshot(android.graphics.Bitmap)
  onHandleScreenshot()}.
</p>
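
<p>
  A skeletal sketch of the two required classes (the class names are
  hypothetical):
</p>

<pre class="prettyprint">
public class MySessionService extends VoiceInteractionSessionService {
  &commat;Override
  public VoiceInteractionSession onNewSession(Bundle args) {
    return new MySession(this);
  }
}

class MySession extends VoiceInteractionSession {
  MySession(Context context) {
    super(context);
  }

  &commat;Override
  public void onHandleAssist(Bundle data, AssistStructure structure,
      AssistContent content) {
    // structure contains the text and view hierarchy of the source app;
    // content may carry its structured data and content intent.
  }

  &commat;Override
  public void onHandleScreenshot(Bitmap screenshot) {
    // Delivered only if the user enabled screenshot sharing in settings.
  }
}
</pre>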