At Nuxeo we are always on the lookout for opportunities to give the Nuxeo Platform an edge. Combining this with my love for Amazon products, a few months back I created a Nuxeo Alexa skill that lets you search the documents in the Nuxeo Platform through Alexa. When the new Echo Show was released, I saw an opportunity to improve the Nuxeo Alexa experience - by adding the ability to display documents on the Echo Show screen. This means you can ask Alexa to search your digital assets and show the results on screen without lifting a finger!

Let's take a deep dive into this skill and see it in action.

Recap of the Previous Skill

In my previous Alexa blog, I showed you how to develop a skill with an authentication screen to link your Alexa account to your Nuxeo instance.

We had 3 main commands:

  • Search to search your documents
  • GetLast to see your latest documents
  • Tasks to see your latest tasks

The architecture looked like:

Architecture

Now let's take a look at the steps to display the search results on Echo Show.

Handle Display

To display data on the Echo Show, you just need to include an additional directive in your answer.

  {
    "type": "Display.RenderTemplate",
    "template": {...}
  }

You currently have 6 different layout templates available:

  • BodyTemplate1
  • BodyTemplate2
  • BodyTemplate3
  • BodyTemplate6
  • ListTemplate1
  • ListTemplate2

Tasks

For the tasks I decided to go with ListTemplate1, as the thumbnails are not critical there. I updated our code and added these lines. Here is the Before and After code:

Before

var tasks = '';
var i = 1;
for (let id in doc.entries) {
    let res = "Task " + i++ + " " + ...;
    tasks += res;
    response.say(res);
}

After

var tasks = '';
var listItems = [];
var i = 1;
for (let id in doc.entries) {
    let res = "Task " + i++ + " " + ...;
    tasks += res;
    response.say(res);
    listItems.push(
      {
        ...
        "textContent": {
        "primaryText": {
              ...
            }
        }
      });
}
let template = {
  "type": "ListTemplate1",
  "token": "workflow",
  "title": "Your workflow tasks",
  "listItems": listItems
};
response.directive({"type": "Display.RenderTemplate", "template": template });

To display our documents from a Search or GetLast request, I chose ListTemplate2, as I wanted to display the document thumbnail.

But as the thumbnail URL requires authentication, the retrieval of the thumbnail would fail. So we need to trick the system and proxy the request. Instead of using the thumbnail REST API URL:

{nuxeo_url}/api/v1/id/{uuid}/@rendition/thumbnail

We will proxy it through our authentication server:

https://nuxeo-auth.loopingz.com/thumbnail.php?token={token}&uid={uid}
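In the skill code, each list item's image source can then point at the proxy. Here is a minimal sketch, assuming a hypothetical `buildThumbnailUrl` helper and a simplified document shape (`uid`, `title`) - the real skill code may differ:

```javascript
// The authentication server endpoint from above
const AUTH_PROXY = 'https://nuxeo-auth.loopingz.com/thumbnail.php';

// Hypothetical helper: build the proxied thumbnail URL for one document
function buildThumbnailUrl(token, uid) {
  return AUTH_PROXY + '?token=' + encodeURIComponent(token)
                    + '&uid=' + encodeURIComponent(uid);
}

// Turn a document entry into a ListTemplate2 item whose image goes
// through the proxy instead of hitting the Nuxeo REST API directly
function toListItem(doc, token) {
  return {
    "token": doc.uid,
    "image": {
      "sources": [{ "url": buildThumbnailUrl(token, doc.uid) }]
    },
    "textContent": {
      "primaryText": { "type": "PlainText", "text": doc.title }
    }
  };
}
```

Note the `encodeURIComponent` calls: the token is base64 and may contain characters that would otherwise break the query string.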

Proxy Thumbnails

The new architecture looks like:

New Architecture

The thumbnail proxy is pretty simple - it uses the auth token to initialize the Nuxeo PHP API client and then does a GET on the thumbnail rendition. In case of failure, it sends a default image for the document.

...
try {
   $client = new NuxeoClient($url);
   $client = $client->setAuthenticationMethod(new TokenAuthentication($token));

   $doc = $client->automation('Document.Fetch')->param('value', $uid)->execute(Document::className);
   $thurl = $url."/api/v1/id/".$uid."/@rendition/thumbnail";
   $res = $client->get($thurl);
   header("Content-Type: ".$res->getContentType());
   header("Content-Length: " . $res->getContentLength());
   print($res->getBody());
} catch (Exception $e) {
  defaultImage();
}

Security

As the token is now passed in URLs and was only base64-encoded, I decided to secure it further with AES-256, using a shared secret between the authentication server and the Lambda function.

So I added the encryption to the linker, and the decryption to the thumbnail proxy.

require 'secret.php';

function encrypt($data) {
  global $SECRET;
  $pass = openssl_digest($SECRET,"sha256", true);
  $iv = openssl_random_pseudo_bytes(16);
  $algo = "aes-256-ctr";
  $options = OPENSSL_RAW_DATA;
  $encData = openssl_encrypt($data, $algo, $pass, $options, $iv);
  return base64_encode($iv.$encData);
}

function decrypt($data) {
  global $SECRET;
  $pass = openssl_digest($SECRET,"sha256", true);
  $algo = "aes-256-ctr";
  $options = OPENSSL_RAW_DATA;
  $plain = base64_decode($data);
  return openssl_decrypt(substr($plain,16), $algo, $pass, $options, substr($plain,0,16));
}

Then the trick was to align the encoding between PHP and Node.js in the Alexa skill:

const crypto = require('crypto');
const secret = require('./secret');

const algo = 'aes-256-ctr';
// Same key derivation as the PHP side: sha256 digest of the shared secret
const key = crypto.createHash('sha256').update(secret).digest();

...

let tokenRaw = request.getSession().details.accessToken;
let raw = Buffer.from(tokenRaw, 'base64');
// First 16 bytes are the IV, the rest is the ciphertext
let iv = raw.slice(0, 16);
let enc = raw.slice(16);
let decipher = crypto.createDecipheriv(algo, key, iv);
let dec = decipher.update(enc, undefined, 'utf8');
dec += decipher.final('utf8');

This encryption is pretty standard - we have a shared 256-bit key and we generate a random initialization vector (IV). We use both to encrypt the data, then concatenate the IV with the encrypted data.

Let's See It in Action

Here's a demo of what it looks like:

You can find the sources in the GitHub repository. Try it out and let us know what you think!