[Tips] Extending Native Types in TypeScript


Sometimes we need to extend a native class (object). Inheritance is the usual way, but some meta-programmers take a different direction: they extend the native class by editing it directly. The benefit is that you don’t need to define a new extended class, but there are some points you should take care of.

Let’s see how this works in TypeScript.

TypeScript provides this feature; it is called ‘Declaration Merging’.

In this example, the Array object is extended by adding a uniq method, which eliminates duplicated items in an array.


export {}

declare global {
    interface Array<T> {
        uniq (comparer: (value1: T, value2: T) => boolean): T[];
        contains (value: T): boolean;
    }
}

/* eslint no-extend-native: ["error", { "exceptions": ["Array"] }] */
Array.prototype.uniq = function<T> (this: T[], comparer: (value1: T, value2: T) => boolean): T[] {
    return this.filter((value1, index1, array1): boolean => {
        const index2 = array1.findIndex((value2): boolean => comparer(value1, value2))
        return (index1 === index2)
    })
}
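The interface above also declares a contains method that is never implemented in the snippet. A minimal companion implementation under the same declaration-merging pattern could look like this (the body shown is my assumption, not from the original post):

```typescript
export {}

declare global {
    interface Array<T> {
        contains(value: T): boolean;
    }
}

// Minimal sketch pairing with the declared `contains` method:
// true when the value is present by strict equality.
/* eslint no-extend-native: ["error", { "exceptions": ["Array"] }] */
Array.prototype.contains = function<T> (this: T[], value: T): boolean {
    return this.indexOf(value) !== -1
}
```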

To use this extension library, import it like this:

import './array.extensions'

['aaa', 'bbb', 'ccc', 'bbb', 'ccc']
.filter(elem => elem === 'ccc')
.uniq((v1, v2) => (v1 === v2))

[{ID:'aaa'}, {ID:'bbb'},{ID:'ccc'},{ID:'bbb'},{ID:'ccc'}]
.filter(elem => elem.ID === 'ccc')
.uniq((v1,v2) => (v1.ID === v2.ID))
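As a self-contained sketch of what the chain above computes, the same filter-by-first-index logic can be written as a plain helper (so it runs without the prototype extension):

```typescript
// Same logic as `uniq`: keep an element only if it is the first
// element for which the comparer matches.
function uniqBy<T>(arr: T[], comparer: (a: T, b: T) => boolean): T[] {
  return arr.filter((v1, i1, a) =>
    a.findIndex((v2) => comparer(v1, v2)) === i1)
}

const rows = [{ ID: 'aaa' }, { ID: 'bbb' }, { ID: 'ccc' }, { ID: 'bbb' }, { ID: 'ccc' }]
const onlyCcc = rows.filter((e) => e.ID === 'ccc')
console.log(uniqBy(onlyCcc, (a, b) => a.ID === b.ID)) // → [ { ID: 'ccc' } ]
```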


If you use ESLint, you should allow an exception to the no-extend-native rule, like this:

/* eslint no-extend-native: ["error", { "exceptions": ["Array"] }] */

Also, be aware that this feature pollutes the native namespace. Take care not to clash with existing or future properties when you add to (or extend) a native object.

Simple Mocking of the DynamoDB Data Mapper in Jest

DynamoDB Data Mapper is an open-source project from awslabs. It is very helpful as an ORM library for your application. But testing is somewhat tricky, because Query and Scan return a QueryIterator (similar to an AsyncIterator).

DynamoDB data mapper


import { DataMapper } from '@aws/dynamodb-data-mapper'

describe('test', () => {
  beforeEach(() => {
    const asyncIteratorMock: any = {}
    asyncIteratorMock[Symbol.asyncIterator] = async function*() {
      yield 'hoge'
      yield 'hage'
      yield 'huge'
    }
    jest.spyOn(DataMapper.prototype, 'query').mockImplementation(() => {
      return asyncIteratorMock
    })
  })

  afterEach(() => {
    jest.restoreAllMocks()
  })
})
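The core of the trick is that any object carrying a Symbol.asyncIterator generator can be consumed with for await, just like a QueryIterator. A standalone sketch, outside of Jest:

```typescript
// Any object with a [Symbol.asyncIterator] generator behaves like a
// QueryIterator from the consumer's point of view.
const asyncIterableMock = {
  async *[Symbol.asyncIterator]() {
    yield 'hoge'
    yield 'hage'
    yield 'huge'
  },
}

async function collect(): Promise<string[]> {
  const items: string[] = []
  for await (const item of asyncIterableMock) {
    items.push(item)
  }
  return items
}

collect().then((items) => console.log(items)) // → [ 'hoge', 'hage', 'huge' ]
```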

I hope this example is helpful to you.

Alexa Day 2019


Space Alpha Sannomiya – KOBE, Japan

A conference fully focused on Amazon Alexa, and on the surrounding ML and IoT technologies of AWS.
– Shift to Voice First –

Attendees 200 (Registrations 240)
Supporters: 24 https://alexaday2019.aajug.jp/supporters/
Speakers: 18 + 7 challengers. https://alexaday2019.aajug.jp/speaker/
Staff: 17

This is fully organized by volunteers from the AAJUG (Amazon Alexa Japan User Group) and JAWS-UG (Japan AWS User Group) community.

For Designers, Builders, and everyone interested.

There were sessions covering many topics: VUX design, development deep dives, operations, analytics, collaboration with machine learning, case studies in cooking, Alexa skills from a transportation company, and workshops.


Workshop for families

For families, there was a workshop on making a robot controlled by Amazon Alexa.
All programming was done with Node-RED: visual-style programming plus assembling a board.


Blueprints Lab

Recently, Alexa Blueprints was launched in Japan. There was a small Blueprints lab with an Alexa evangelist. Everyone was surprised at how quickly a skill could be launched!


After Party

Many attendees participated in the After Party, with many ‘real’ discussions and a mixing of all the contributors.

Go Next

We changed the name of this event to ‘Voice Con Japan’ to be more global.
Enjoy the voice community worldwide, and enjoy more human ways of interaction.

Let’s build a simple observation system with notifications, powered by AWS DeepLens and Amazon Alexa.

Today’s Goal

The goal of this post is for you to have fun and to see that using AI and Alexa is easier than you might think.

Background Story

  • I’m Japanese, and I really miss Japanese food.
  • Fortunately, I can get really fresh fish every weekend at the market.
  • I want to store these fish longer, so I decided to make dried fish.

I tried drying fish on our balcony. And then they came from the sea to pick up my lovely fish.

Thus, I decided to observe our enemies to save my fish.

Designing the Observation


1. AWS Deeplens

The DeepLens makes the power of AI easier to use. We can deploy a pre-built training model to the DeepLens in several simple steps.


2. Amazon Echo

To notify me, I could use an Amazon Echo.

Let’s get cooking.

Setting Up and Deploying the Object Detection Model to the DeepLens

Left : selecting Project template on the deeplens console
Middle: MQTT topic filter on the deeplens console
Right: testing dialog on the AWS IoT console

The DeepLens provides Project Templates to implement models more easily.

Select Object Detection in the project templates.


MQTT is a lightweight M2M protocol. When the model is deployed to the DeepLens, an MQTT topic that publishes the detection status is deployed too. You can see it on the DeepLens console.

Also, you can test the message handling in the AWS IoT Core console. You can access it from the Project output column.

As you can see, a simple message is received in the AWS IoT Core console, like this:

{ "chair": <percentage of confidence> }

When a bird comes, the DeepLens will send a message as follows:

{ "bird": <percentage of confidence> }

Making the Alexa Skill with Notifications

OK, now we have the object detection part. Let’s make an Alexa skill to accept notifications. To do this, we have to configure the manifest file to use the Alexa Proactive Events API. The API provides the capability to send notifications using schemas that Alexa defines.

Defined Schemas

Unfortunately, we can only use the predefined schemas. So in this demo, I use the WeatherAlert schema as an alternative, treating the bird as if it were a storm. 🙂


If you want to add a schema, you can send a request for a new schema in the Alexa Developer Forum.

Using the Proactive Events API requires only a really simple modification to the Alexa skill manifest (skill.json): you just add the permissions block and the events block.
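For reference, the additions look roughly like the fragment below. This is a sketch based on the Proactive Events documentation, not the exact file from this project; the endpoint URI is a placeholder, so check your own skill.json.

```json
{
  "manifest": {
    "permissions": [
      { "name": "alexa::devices:all:notifications:write" }
    ],
    "events": {
      "publications": [
        { "eventName": "AMAZON.WeatherAlert.Activated" }
      ],
      "endpoint": {
        "uri": "arn:aws:lambda:..."
      }
    }
  }
}
```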

Then you can deploy by using the ASK CLI.

ask new
git clone https://github.com/haruharuharuby/server-room
ask deploy


After the deployment succeeds, open the Alexa Developer Console and check the ClientId and ClientSecret in the permissions dialog. (These are used later.)

Deploy Lambda function

Until now, you deployed the two front-end interfaces individually (the DeepLens and the Alexa skill). So let’s connect them to each other! This Lambda code does two things:

  • Filtering the messages to pick up the specific word ‘bird’
  • Calling the Alexa Proactive Events API
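A rough sketch of those two steps is below. The function names and the threshold are illustrative assumptions; the actual code lives in the linked repository.

```typescript
// Step 1: the IoT Rule already selects the 'bird' key, so the event
// is a map like { bird: 0.92 }; keep only confident detections.
function shouldNotify(event: Record<string, number>, threshold = 0.8): boolean {
  return typeof event.bird === 'number' && event.bird > threshold
}

// Step 2 (payload shape only): the Proactive Events API expects an
// event like this for the WeatherAlert schema used in this demo.
function buildWeatherAlertEvent(now: Date, expiry: Date) {
  return {
    timestamp: now.toISOString(),
    referenceId: `bird-${now.getTime()}`,
    expiryTime: expiry.toISOString(),
    event: {
      name: 'AMAZON.WeatherAlert.Activated',
      payload: { weatherAlert: { alertType: 'THUNDER_STORM' } },
    },
    relevantAudience: { type: 'Multicast', payload: {} },
  }
}
```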

To deploy this function, you need two steps.

Step 1: Add the clientId and clientSecret to the Parameter Store in AWS Systems Manager

If you can use the AWS CLI, run this script. (Of course, you can also set them in the AWS Management Console.)

aws ssm put-parameter --type String --name bird-detection-client-id --value <your-client-id>
aws ssm put-parameter --type String --name bird-detection-client-secret --value <your-client-secret>
aws ssm put-parameter --type String --name bird-detection-topic-filter --value <your-mqtt-topic-filter>

Step 2: Deploy the Lambda function with the Serverless Framework

The Serverless Framework is really useful for deploying a function and the resources around it. This script deploys the Lambda function and sets the trigger from AWS IoT Rules.

git clone https://github.com/haruharuharuby/bird-detect-message-handler
cd bird-detect-message-handler
serverless deploy


After deploying, you can see the Lambda function, with its trigger set from AWS IoT Rules.

The current configuration assumes the message reports at least 80% confidence for ‘bird’.

Note: AWS IoT Rules

AWS IoT Rules is a feature for filtering messages with an SQL-like query.

If you want to change the topic filter, or notify another skill, you just modify serverless.yml:

    handler: handler.handler
    events:
      - iot:
          name: 'birdDetection'
          sql: "select bird from '${self:custom.iot.topicFilter}' where bird > 0.800"
    environment:
      STAGE: ${self:custom.stage}
      CLIENT_ID: ${self:custom.alexa.clientId}
      CLIENT_SECRET: ${self:custom.alexa.clientSecret}
      PROACTIVE_AUTH_ENDPOINT: https://api.amazon.com
      PROACTIVE_EVENT_ENDPOINT: https://api.amazonalexa.com
      ALEXA_NOTIFICATION_EXPIRY_MINUTES: ${self:custom.alexa.notificationExpiryMinutes}

If you are in the EU or Asia Pacific region, you should change the PROACTIVE_EVENT_ENDPOINT to the appropriate one.


Let’s take a check!

This film is from testing. I set the AWS IoT Rules as follows:

select person from <<topic-filter>> where person > 0.600

All done! We used the power of AI and Alexa without deep knowledge of machine learning 🙂

Let’s enjoy Alexa style 🙂





Alexa Dev Summit 2018 Tokyo

This is a report on Alexa Dev Summit 2018 @ Meguro, Tokyo, Japan. It was a big conference focused on Amazon Alexa. I attended to talk about our community (Amazon Alexa Japan User Group, a.k.a. AAJUG). I was surprised by the attendees, who had a big passion for Amazon Alexa.

Community Session.

This post is a summary, with some additions, of what I wanted to say in the session.

First Part: Are you enjoying VUX?

VUX is a natural way for people and devices to interact.
GUIs tend to become systematic; people had to get along with the system, so they expected the system to work completely, even if there was just one small failure.
In the VUI world, we may be able to accept several failures in a conversation. When I talk with my friends and miss part of what was said, I just ask again. Nobody gets angry.

Basically, voice is ambiguous, so we speak to each other on the premise that mistakes happen.

I certainly understand there are situations where VUX must not make mistakes (e.g. a medical scene), but I think such use cases are few.

VUX may bring a generosity to how we connect with technology. That’s why I like VUX interaction.

Second Part: Lightning Talks

Three community members talked about their experiences.

Jun Kawaoka

He has been making an Alexa skill in Sapporo, called “the trash calendar”. The skill pulled the trash-collection data published by the government. Unfortunately, the data was a PDF. The government then converted it to open data. I think this is a really nice example of a government being moved by an Alexa skill.

Rie Motoki

She is a VUX designer who is really passionate about the voice experience. She talked about why she dove deep into the community.


Miso Tanaka

He is not an engineer, but he has launched several Alexa skills and many prototypes linking IoT devices to Alexa.

Third Part: Through the community

At the networking party (Day 1).

The community is an opportunity to share what you know about the things you love. For me, that was Alexa. AAJUG has over 450 members; it grew by more than 300 members in a year. But I think the size of the community does not matter, because the freshness of a community depends on how many members have a big passion.
What I wanted to say is: “Share your Alexa knowledge and experience at the community!”
You will get big feedback in proportion to how much you share.
If you are hesitating to share your Alexa knowledge, that hesitation is needless.

  • You don’t need to provide perfectly correct knowledge.
  • You don’t need to provide really difficult knowledge.
  • When you feel your experience might be a little bit useful for other people, please talk about it at the community.

Also, AAJUG is not only a developer community. Designers, planners, sellers, marketers, and anyone else can join us. We hope people will collaborate with each other naturally in the AAJUG community.




Thanks to all attendees, staff, and all Amazonians.

Voice User Experience should be more human.

Amazon would like to make Alexa more HUMAN CENTRIC. You should not convert your web services into an Alexa skill directly. What is the human way? Let’s consider these features we already have:

  • We already have experience. In conversation, we always complement information from our experience.
  • We often extend our conversation with information stored between us.
  • We usually take care of the other person in conversation.
  • We store some keywords from our conversation for the next time.

In many GUI systems, there are implementations that do not move forward unless every condition is satisfied. This is not an appropriate way to build a Voice User Experience.


First Time
You:       Alexa, open booking concierge
Alexa:   Welcome to booking concierge. What’s your name?

You:       hugtech
Alexa:    Welcome hugtech. ….

Second Time
You:       Alexa, open booking concierge
Alexa:   Welcome to booking concierge. What’s your name?

You:       (I already told my name to you….)

That is not human-like! When you open the skill for the second time, how about this?

Second Time
You:       Alexa, open booking concierge.
Alexa:   Welcome. I’m happy to see you again. 

In addition to this, we usually use several greetings; there are many patterns. Shall we vary the greeting each time after the second visit?

Third Time
You:       Alexa, open booking concierge
Alexa:   Welcome back, hugtech.

Re:Cap Re:Invent

re:Invent is the biggest AWS conference in the world. This post is a self-report (retrospective) on re:Invent.



Meet with Alexa EU Team

I joined a breakfast with the Alexa EU team.

I helped to make AWS Certification in August.

  • Assisted in the development of AWS Certification Examinations by serving as a subject matter expert.

They are Amazonians, other Alexa experts, and AWS Heroes.


Stay together

We stayed in a condominium with JAWS-UG community members. It was the best place, right in the center of Las Vegas!

Also, all the JAWS-UG KOBE core organizers gathered in Vegas for the first time. It took three years. That was my pleasure!

Meet Alexa Champions


As Alexa Champions, Hide and I joined the Alexa Champions meetup.

We introduced our activities and discussed how to enhance the Japanese NLU.

In the future, They may join us.



Ruby! Ruby! Ruby! Lambda! Ruby! Ruby! Ruby!

Finally! AWS Lambda officially supports Ruby!

There were many useful new services and features.


AWS Lambda Supports Ruby



Lambda Layers


With Lambda Layers, we can consolidate common logic into a layer.


Step Functions supports eight more services


Step Functions can now integrate directly with Amazon ECS, AWS Fargate, Amazon DynamoDB, Amazon SNS, Amazon SQS, AWS Batch, AWS Glue, and Amazon SageMaker.

So we can do more with less code.


For Amazon Alexa

Amazon Personalize


I had been wondering how to personalize an Alexa skill.

Amazon Personalize analyzes consumer needs and responds automatically, using the power of ML.


Machine Learning Model Marketplace.


Alexa will grow with Amazon SageMaker to enhance the user experience.

But many people are not familiar with machine learning.

I think a marketplace of ML models makes it much easier to implement ML technologies.



I joined 3 workshops.

I was really excited by the AVS Echo Show development kit!



I recommend 2 tutorials.




Foot Print


AWS-UG Booth

There was a community booth on the Expo floor.

I could talk with user group leaders from other countries.


Martijn (organizer of the AWS User Group Amsterdam) came to our booth.

He shared his knowledge beside our booth.

That was precious time.


Community Workshop



At night, in the community leaders workshop, Hiromi Ito spoke as a representative of JAWS-UG.

She is really passionate about community activities.



We were fired up in the front row!!!




Account Linking with Amazon Cognito by Authorization Code Grant


If you provide a SaaS, adding a voice experience to your service is a good choice for your customers.

A voice experience can make your service easier for customers to use.

In this post, we show you how to build a serverless OAuth infrastructure and combine it with your Alexa skill.

Step by Step Summary

  1. Constructing Amazon Cognito UserPool
  2. Configuring Client App
  3. Create your Custom skill with Account Linking
  4. Acquire OAuth token



1. Constructing Amazon Cognito UserPool

First, create a User Pool on AWS.

Access the Management Console and select Cognito. Then select “Manage User Pools”.


Select “Create a user pool”.

Select “Review defaults”.


In this post, all configuration is left at the defaults. (Users are authenticated by email.)

Select “Add Clients”.


Before going on, check the client information.

The “client_id” and “App client secret” will be used in Step 3.


2. Configuring Client App

There are two relevant authorization grant types in OAuth2: the implicit grant and the authorization code grant.

The implicit grant fits use cases where the user needs to authenticate again every time the access token expires.

The authorization code grant fits use cases where the application needs to refresh the access token automatically.

For an Alexa skill, the authorization code grant is the better way to acquire an access token, because Alexa has a feature that refreshes the access token automatically.
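Under the hood, that automatic refresh is a standard OAuth2 call against Cognito’s /oauth2/token endpoint. A sketch of the request Alexa makes on your behalf (shown only to illustrate the grant; the function name and values are placeholders):

```typescript
// Builds the refresh-token request that Cognito's /oauth2/token
// endpoint expects: a form-encoded body plus HTTP Basic auth with
// the client credentials. (Illustrative; Alexa performs this for you.)
function buildRefreshRequest(clientId: string, clientSecret: string, refreshToken: string) {
  const body = new URLSearchParams({
    grant_type: 'refresh_token',
    client_id: clientId,
    refresh_token: refreshToken,
  }).toString()
  const basic = Buffer.from(`${clientId}:${clientSecret}`).toString('base64')
  return {
    // POST this to https://<your-cognito-domain>/oauth2/token
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      Authorization: `Basic ${basic}`,
    },
    body,
  }
}
```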


The configuration for this is really easy: just check the “Authorization code grant” checkbox, and for email-based authentication, check “aws.cognito.signin.user.admin” in the Scopes.


For the domain name, most use cases need a custom domain for authentication. Add your own domain and enter the “Domain name” and an “AWS managed certificate”.


Attaching is in progress….


You will find the alias target (“xxxxxx.cloudfront.net”) on the screen. Add the URL as an alias record (A record) to your Hosted Zone in Route 53.

When attaching is finished, the domain status transitions to “ACTIVE”.


Almost done.

For the test, add the redirect URL to the Callback URL(s).


OK, let’s access the sign-in page.



3. Create your Custom skill with Account Linking

Make a custom skill using the Fact Skill tutorial.


(Any other skill that can be invoked can be used as an alternative.)


Then, in the Alexa Developer Console, add the client information.

On the same screen, write down the Redirect URLs. These URLs are needed later.


Back on the Amazon Cognito screen, set the three Redirect URLs in the Callback URL(s).


Done. You have finished configuring Account Linking!

Let’s test!



4. Acquire the Access Token

When the skill is invoked, you can find the access token in the request.

(CloudWatch Logs)

Code snippet for the ASK SDK for Node.js

const Axios = require('axios');

const HelloWorldIntentHandler = {
  canHandle(handlerInput) {
    return handlerInput.requestEnvelope.request.type === 'IntentRequest'
      && handlerInput.requestEnvelope.request.intent.name === 'HelloWorldIntent';
  },
  async handle(handlerInput) {
    const accessToken = handlerInput.requestEnvelope.session.user.accessToken;
    const speechText = 'Hello World!';
    const headers = { Authorization: `Bearer ${accessToken}`, 'Content-Type': 'application/json' };
    await Axios.get(url, { headers }); // url: your resource server's endpoint
    return handlerInput.responseBuilder
      .speak(speechText)
      .withSimpleCard('Hello World', speechText)
      .getResponse();
  },
};



With Cognito, you can build the scaffolding of an OAuth2 flow much more easily.

You don’t need to maintain the user resource in your own database.

And you can build a Voice User Experience on top of your existing customer resources.


For more information:




A more natural interface. A more human interface.

HUMAN CENTRIC (Dr. Werner Vogels at re:Invent 2018)







Alexa Champions

I was given the honor of being named an Alexa Champion today.

I have been holding Amazon Alexa meetups in Japan, but now I live in the Netherlands.

Surely my situation is a little bit strange; I understand that.

But I would like to continue my Alexa community activities to spread the Alexa experience.

See my bio on Champions Gallery.


AAJUG Online Discussion #01

AAJUG is short for the Amazon Alexa Japan User Group.

The first online meetup took place on 4 June 2018.


Thank you to all attendees!


  • self-introductions
  • about your skill
  • how do you usually do testing?
  • how to test SSML
  • which is better, Node or Python?
  • We wish the whole room could be a Smart Home.
  • Clova, Google Home, Alexa, and LINE
  • How about the microphone performance of the Spot and the Show?
  • Which is better for designing a skill: many conversations, or minimised ones?
  • announcements (meetups, events, etc.)

URL’s in the Topics

Python tutorial: a 5-minute Alexa skill


Alexa Skill: Backlog for Amazon Alexa


Alexa Skill: MHW JAKUTEN checker

Alexa Skill: Mesoko URANAI



tips Github/URL



[Amazon Key]





Send me PRs and issues!



See you at the next online discussion, and enjoy the voice experience!


FB Group





Invitation link