Building a HdM Alexa Skill – Part 1

Introduction to the Project

It is becoming obvious that chatbots are quickly emerging as the next big thing. This is especially evident in the recent flood of software development kits published by major tech companies to encourage developers to build applications for their ecosystems.

Microsoft offers an SDK to enable chatbots in Skype, Facebook is implementing chatbots into its own messenger, Google is opening up its voice assistant to the public and Amazon already offers more than 3,000 custom skills for its Alexa voice service. Combined with smart speakers – take Amazon Echo, Google Home or the very recently announced device integration of Cortana by Microsoft as examples – seamless voice interaction is being introduced into our everyday lives and opens up a whole new market for developers to integrate their products into the daily routines of their users.

To gain expertise with this new technology we decided to develop our own voice service. Since – as of December 2016 – Amazon is slowly releasing its smart speakers Echo and Dot in Germany, we took the opportunity to be among the first to release a German-speaking skill.

Now that we knew how to implement the project, we needed to decide on what to implement. It was important to us that we don't just build a product that only serves us as a means to learn and study but something that benefits others as well. The idea of a lightweight knowledge system covering the basic functions needed to get through a regular university day at the HdM seemed fitting, since we personally struggled with ever-changing lecture halls. While there are ways to quickly look up important updates using the mobile application or one of the information displays distributed throughout the buildings, our team took the opportunity to further simplify daily student life while using as much state-of-the-art technology as possible.

The video below gives an overview of the functionality of the finished project as a brief preview of what to expect.

Task Overview & Team Setup

Since we were starting a new project from scratch, we wanted to develop it using new and sophisticated software tools (e.g. serverless computing). At the same time, we faced a challenge concerning the hardware: at the time of writing, the Amazon Echo / Dot devices are only available by invitation (and Amazon is very sparing in handing them out). So, to be more independent, we initially decided to build our own Alexa-powered device using a Raspberry Pi – which led to a whole other problem (more on that in the corresponding blog post). While two of our group worked on the Raspberry Pi – and later joined the skill development – Jonas and I wrote the actual skill using Node.js while heavily relying on test-driven development. To seamlessly and automatically integrate our code from Bitbucket into AWS Lambda, we settled on Jenkins, which was set up and configured by Malte.

Team Member – Activity
Andreas Fliehr – Alexa Implementation on Raspberry Pi & Skill Development
José Egas – Alexa Implementation on Raspberry Pi & Skill Development
Eric Schmidt – Skill Development
Jonas Scheffner – HdM API Wrapper & Skill Development
Malte Vollmerhausen – Jenkins

Distribution of Labour

Obtaining the Data: Writing a Client Application

Before we could start writing the actual skill, we needed access to the relevant information from various sources (the university website, the menu schedule, etc.). Fortunately for us, an API is available which already parses the data from these sources and returns a JSON object containing the information. Jonas wrote a straightforward wrapper module for this API using Node.js, which we later used to obtain the information we needed (the module is available on npm). Once the wrapper was completed, we were able to start with the actual skill development.
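To illustrate, querying data through such a wrapper could look roughly like the following sketch (the module name hdm-client and the menu method are assumptions for illustration – see the client repository linked below for the actual API):

// Hypothetical usage of the HdM API wrapper (all names are illustrative)
const hdm = require('hdm-client');

// Fetch the menu for a given date and print the dishes
hdm.menu('2017-01-16', function (err, dishes) {
    if (err) return console.error(err);
    dishes.forEach(function (dish) {
        console.log(dish.name);
    });
});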

Coming Up

The following blog posts include in-depth coverage of the implementation techniques and the reasoning behind using Amazon's Alexa Voice Service (AVS), serverless computing with AWS Lambda, Jenkins as the continuous integration platform and Node.js as the programming environment, all while heavily relying on test-driven development. We hope you find the upcoming articles informative and that we can give you a little insight into our way of working.

Further remark: We plan to publish the skill and make it accessible through the Amazon skill store! We will update this article as soon as it is available.

Resources

Resource – Link
Skill Repository – https://bitbucket.org/jscheffner/hdm-alexa-skill
Client Repository – https://bitbucket.org/jscheffner/hdm-node-client

Automate deployment with the Unreal Engine using the Unreal Automation Tool (UAT)

The Unreal Engine 4 from Epic Games is a powerful tool to create any type of game or even general application; however, the built-in automation and build system is barely documented, if at all. This post shows the necessary steps to build, cook and package a game using the Unreal Automation Tool (UAT) and gives a brief overview of the somewhat hidden tools.

Terminology and Engine Types

Before we can delve into the automation system we need to define some terminology. The central tool we are going to use is the Unreal Automation Tool (UAT). This tool is the primary entry point for any sort of automation inside the engine, as well as for building and packaging an application. The UAT provides commandlets, which are usually a set of commands to be run inside the engine's ecosystem. The UAT is started using the appropriate script for the underlying operating system (*.sh for Linux, *.command for Mac and *.bat for Windows); these scripts are located in Engine/Build/BatchFiles. For the UAT this is the RunUAT script. Appending -list returns a list of available commands, and more information about the UAT can be retrieved using the -help switch. The same applies to each commandlet: just run RunUAT <Commandlet> -help to get more information and the possible switches for that command.
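For example, on Windows the following two invocations print the available commandlets and the help for one of them (paths are relative to the engine root):

Engine\Build\BatchFiles\RunUAT.bat -list
Engine\Build\BatchFiles\RunUAT.bat BuildCookRun -help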

It is important to know which type of engine the UAT is run on. There are three different engine types, and they require some changes to the command line depending on which one is used.
The first and most common engine type for smaller projects is the so-called Rocket build. The Rocket build is what you get when you install the engine via the Epic Games Launcher. This is a pre-built, ready-to-use version of the engine. Depending on the choices you select in the launcher, it contains all necessary dependencies to package a project for all desktop and mobile targets.

The next type of engine is the source build. This build type is what you get when you clone or download the Unreal Engine repository from GitHub (provided you have linked your GitHub account to your Epic Games profile).
This is the most versatile type of engine, as the source of the engine can be changed and recompiled. Licensees for console development also get access to extra source code that can be compiled in for console support. While this engine type is the most versatile, it comes at the cost that everything needs to be compiled: an editor build usually takes 15 to 60 minutes, depending on the system used, and the result takes up more than four times the disk space of a pre-built engine.

To speed up development for your team, one can create an installed build from a source engine. The supported platforms can be chosen when creating this build. This type is supported starting with 4.13 and can be created easily; see this link for more information. An installed build is very similar to the Rocket build, however it can contain your own changes and additional target platforms.

One last tool we need for automatic deployment is the Unreal Build Tool (UBT), which, as the name suggests, is the main tool for building source code inside the engine. It seems the UAT is currently meant to be run from inside the editor: when trying to package a clean version of a project using just the UAT, it fails because the project's editor DLLs are missing. To create the missing DLLs we use the UBT to build the editor target for our project, and then we are good to go.

Step 1: Building the Editor Target

Before we can launch the UAT we first need to compile the editor target for our project to get up-to-date versions of our editor DLLs. To build those DLLs we run the UBT for our project's editor target, for our operating system (e.g. Win64), in the Development configuration:

Engine\Build\BatchFiles\Build.bat <ProjectName>Editor Win64 Development <PathToProjectFile>.uproject -WaitMutex

For a clean build, either run the Clean script beforehand or use the Rebuild script instead of the Build script. The -WaitMutex switch tells the build tool to wait for the global mutex for this UBT instance; omitting it makes the UBT return an error if it is currently in use elsewhere.
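A full rebuild of the editor target could accordingly look like this (same arguments as above, only the script changes):

Engine\Build\BatchFiles\Rebuild.bat <ProjectName>Editor Win64 Development <PathToProjectFile>.uproject -WaitMutex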

Step 2: BuildCookRun

The complete packaging process is available through the BuildCookRun commandlet inside the UAT. As the name of the commandlet suggests, this is a three-part process:

  • Build: Compile the engine, engine plugins, project plugins and the project itself with all necessary runtime modules, which usually creates a single executable file at the end.
  • Cook: Convert all referenced assets to the respective runtime formats for the target platform (e.g. on Windows, convert textures to the DDS format), compile any still-missing shaders, compile Blueprints to their binary representation and strip out any editing information.
  • Run: The last step can perform a multitude of actions. In the scope of build automation we usually want to package our assets into pak files and archive the complete project into a folder for further processing (e.g. uploading to Steam). Other features include automatic deployment to devices connected via network (or, for mobile, via USB) and starting the game on the device. These are usually not part of a fully automated process and will not be described much further here.

A complete command line for this process could look like this:

call "Engine\Build\BatchFiles\RunUAT.bat" BuildCookRun -Project="<ProjectPath>.uproject" -NoP4 -NoCompileEditor -Distribution -TargetPlatform=Win64 -Platform=Win64 -ClientConfig=Shipping -ServerConfig=Shipping -Cook -Map=List+Of+Maps+To+Include -Build -Stage -Pak -Archive -ArchiveDirectory=<ArchivePath> -Rocket -Prereqs -Package

Let’s go through this one by one:

  • BuildCookRun: We want to use the BuildCookRun commandlet
  • -Project="<ProjectPath>/<ProjectName>.uproject": Required parameter, the absolute path to your uproject file
  • -NoP4: We do not want to interact with Perforce during this build (opposite would be: -P4)
  • -NoCompileEditor: As far as I know this seems to be broken; omitting this flag should build the editor parts we previously built using the UBT, however at least for me this does not work.
  • -Distribution: Mark this build for distribution (especially on mobile platforms this creates a distribution package, which usually means using distribution certificates)
  • -TargetPlatform=<Platform1>+<Platform2>: For which platforms we want to package (separated with a +)
  • -ClientConfig=Shipping: Which configuration we want to package, options are Debug, Development, Test and Shipping
  • -ServerConfig=Shipping: Which configuration to build the server target in; same options as for -ClientConfig
  • -Cook: We want to run the cook step
  • -Map=List+Of+Maps+To+Include: Specific list of map names to include, separated using a +. If omitted, the maps specified in the project settings are used
  • -Build: We want to run the build step
  • -Stage: Save the cook result in a staging directory
  • -Pak: Use pak files instead of plain file system directories
  • -Archive: We want to archive the complete output into a directory
  • -ArchiveDirectory=<ArchivePath>: The path to archive the project to
  • -Rocket: We are using an installed/Rocket build
  • -Prereqs: Include Unreal Engine Prerequisites installer
  • -Package: Create a package for the target platform (e.g. an app file on Mac, apk on Android or ipa on iPhone)

Note that all switches are case insensitive (except for paths on case-sensitive platforms of course).

The above collection of switches and parameters is a solid basis for packaging a ready-to-run application and fully suffices. The next list shows a couple more useful switches to consider:

  • -Compile: This switch is usually required on source builds. It tells the UAT to compile itself before running any commandlets; on Installed/Rocket builds, however, this will result in an error, as the sources for the UAT are not part of those engine distributions.
  • -UnversionedCookedContent: Omit versions in assets; all loaded assets are assumed to be the current version.
  • -EncryptIniFiles: Encrypt your ini files, which makes it hard to tamper with them
  • -CreateReleaseVersion=<VersionName>: Creates information about this version inside your project folder under Releases; this information can be used for patches and DLC (see the example after this list)
  • -BasedOnReleaseVersion=<VersionName>: This build is based on the given version; used for patches and DLC to only include new or modified files.
  • -Compressed: Compress the pak files for smaller disk usage (and increased loading times)
  • -Iterate: Only cook items that have not already been cooked (when running on the same directory as a previous build); if omitted, all content is cooked
  • -CookAll: Cook all content, not only referenced assets
  • -CookOnTheFly: Does not cook the content, but starts the cook process in server mode; a game can connect to this server using the -FileHostIP=<IP> parameter, and the server will then cook requested content on the fly.
  • -Run: Start the project after deployment
  • -Device=<DeviceName1>+<DeviceName2>: Device name to run the game on (separated with +)
  • -NullRhi: Run the game without any render hardware interface (e.g. for running unit tests on headless devices)
  • -NativizeAssets: Supported since 4.13, allows C++ code generation from Blueprint scripts
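As referenced above, a shipping build that additionally records release information for later patches and DLC could, for example, look like this (the version name 1.0 is illustrative; all switches are explained above):

call "Engine\Build\BatchFiles\RunUAT.bat" BuildCookRun -Project="<ProjectPath>.uproject" -NoP4 -TargetPlatform=Win64 -ClientConfig=Shipping -Build -Cook -Stage -Pak -Compressed -Archive -ArchiveDirectory=<ArchivePath> -Rocket -Package -CreateReleaseVersion=1.0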

Step 3: Profit!

With all this information it should be easy to successfully integrate the engine deployment into your automation process.

If you happen to use Buildbot as your CI/CD framework, you may want to take a look at my plugin to simplify interfacing with the UAT from Buildbot.

Kons and Kudder

What the heck are „Kons“ and „Kudder“?

Part 1: Introduction

Thanks to Mr. Gittinger, I now know what these terms mean. They are snippets from descriptions of how to build a Scheme interpreter. Yes, Scheme is a programming language (see https://de.wikipedia.org/wiki/Scheme); it belongs to the „Lisp“ family, a family of programming languages structured around lists. This type of programming language is very simple to understand and quick to implement. An element of a Scheme list consists of two parts: the first is the „Kons“ – the value of the element – and the second is the „Kudder“ – the link to the next list element or to the end of the list (in traditional Lisp terminology, the car and the cdr).
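A minimal sketch of this structure in JavaScript (purely illustrative – the actual interpreter of this project does not look like this):

// A list element holds a value (Kons) and a link to the next element (Kudder).
// The end of the list is marked with null.
function cons(kons, kudder) {
    return { kons: kons, kudder: kudder };
}

var list = cons(1, cons(2, cons(3, null))); // the list (1 2 3)

// Walk the list by following the Kudder links
for (var cell = list; cell !== null; cell = cell.kudder) {
    console.log(cell.kons);
}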

So now we know what Kons and Kudder are, but what is the connection to our System Engineering course? Hmm, OK – the program I wrote is a Scheme interpreter (with Kons and Kudder), and the method I used to write it is based on the twelve-factor app. Somebody may ask how a Scheme interpreter can be thought of as a “software as a service” application? Good question. But this is just a student project; the main goal is to learn about nice deployment pipelines and good-looking, well-tested code.

In the next episode we will take a closer look at my build pipeline with Jenkins, CMake and unit tests.

Snakes exploring Pipelines – A “System Engineering and Management” Project

Part 4: Jenkins and Wrap Up

This series of blog entries describes a student project focused on developing an application by using methods like pair programming, test driven development and deployment pipelines.

Our first blog entry for this year will also be the final one for this project, but to ease your pain of separation, we saved a highlight for the very end: Jenkins integration!
Starting with a short introduction to Jenkins, this entry will guide you through installation and configuration.

Continue reading

Snakes exploring Pipelines – A “System Engineering and Management” Project

Part 3: Coding Guidelines

This series of blog entries describes a student project focused on developing an application by using methods like pair programming, test driven development and deployment pipelines.

Coding guidelines and methodologies are an important part of any professional software development process (like ours 😀 ), so we'll deal with these in today's blog entry. A fancy system we learned about in the course was the 12-factor methodology. It is mainly applicable to web-based software-as-a-service (SaaS) applications and defines high-level aspects an application should follow in order to remain robust and maintainable throughout its lifecycle.

Continue reading

Snakes exploring Pipelines – A “System Engineering and Management” Project

Part 2: Initial Coding

This series of blog entries describes a student project focused on developing an application by using methods like pair programming, test driven development and deployment pipelines.

Onwards to the fun part: the actual coding! In this blog entry, we will focus on test-driven development. As we learned in the course, the very first task was to set up unit tests with JUnit, and so we did. As a quick reminder (or an introduction), here is the basic concept of test-driven development:

Continue reading

Snakes exploring Pipelines – A “System Engineering and Management” Project

Part 1: Tool Setup

This series of blog entries describes a student project focused on developing an application by using methods like pair programming, test driven development and deployment pipelines.

Welcome to the next part of our project, on its way to becoming the Snake game with the very best underlying code base ever. (If you somehow missed the introduction to our project, you may want to give it a read: Click.) Today, we'll take a look at the most important tools we use for building and versioning our code: Eclipse, Gradle and GitHub.

Continue reading

Snakes exploring Pipelines – A “System Engineering and Management” Project

Part 0: Introduction

This series of blog entries describes a student project focused on developing an application by using methods like pair programming, test driven development and deployment pipelines.

Once upon a time, which was about one and a half months ago, an illustrious group of three students came together, united by the shared interest to learn more about systems engineering and software development in the aptly named course “System Engineering and Management”.

As we had to choose a project to work on for the course, we decided to explore modern, professional development methodologies as employed in software development companies.
In particular, we wanted to focus on pair programming and test driven development with unit tests for the actual coding, as well as using a build and deployment pipeline for continuous integration, supported by tools and a version control system.
Continue reading