What channels you’re using matters! Test out on all your desired platforms before publishing code.

Testing out the limits of the Bot Framework, I tried to create multi-line responses for my bot. The Text property of replies supports Markdown, so I thought it would be easy enough to implement. However, I quickly realized it didn’t always look the way I wanted. Here are some tips for getting your responses to look just right =).

These examples all use a reply created with

reply = context.MakeMessage();

and send it with PostAsync, since my responses are all Tasks:

  await context.PostAsync(reply);

Multi-Line Responses

You must use \n\n in the string.


reply.Text = "Hi I'm one line \n\n" +
"I'm line two \n\n" +
"I'm line three";

Output Web:

Output Facebook:



In a list you must have the newline syntax \n\n as well as a ‘*’ with a space after it; be careful here, since ‘*’ is also used for italics. You can also see that the spacing is slightly different between the two channels.


reply.Text = "Hi I'm one line \n\n" +
"* Item 1 \n\n" +
"* Item 2";

Output Web:

Output Facebook:


Block Quote with Horizontal Rule

Quoted text must have ‘>’ with one space after it to denote that the next chunk of text is a quote. The horizontal rule is marked by ‘---’. We can see the limitations between channels even more clearly in this example.


  reply.Text = "Block quote below bar \n\n" +
    "---" +
    "\n\n > Something about life. I'm an existential quote \n\n" + 
    "-BOT ";

Output Web:


Output Facebook: 


Headers / Bold, Italics and Strike Throughs

This time there are drastic differences between Facebook and the Web. Note that with headers you must type \n\n after the header text, or the entire string will be treated as part of the header. Typing ‘***’ around text gets you bold italics. However, Facebook does not register ANY of these.


  reply.Text = "# Don't know if I need new lines \n\n" +
     "~~You **don't** *need* new lines~~ \n\n" +
     "***yes you do***";

Output Web:


Output Facebook: 


Links and Pictures in an Ordered List

Some more differences between Facebook and the Web, but fewer this time. Remember to put \n\n after every item in your list and to leave a space after the ‘.’ following the number.


  reply.Text = "### List \n\n" +
     "1. Link: [bing](http://bing.com) \n\n" +
     "2. Image Link: ![duck](http://aka.ms/Fo983c)";

Output Web:


Output Facebook:


Hope this little guide helps.

Happy Hacking!

– TheNappingKat


Microsoft released their new Bot Framework early this year at the Build conference. So, naturally, I wanted to create my own, eventually integrating it into a game. In this post I talk about some of my learnings and what the Bot Framework provides.

I decided to work with the Microsoft Bot Connector, part of the Microsoft Bot Framework, as a way to get my bot up and running on the most platforms as quickly as possible. I haven’t worked with bots in the past, so this was my first dive into the territory. My bot was built in C#; however, Microsoft’s Bot Framework bots can also be built in Node.js. My colleague Sarah wrote a post about getting started with Node here: https://blogs.msdn.microsoft.com/sarahsays/2016/06/01/microsoft-bot-framework-part-1/

The bot I wanted to create was a simple chat bot that I could build upon for interactivity with users. If you’re familiar with The 100, you’ll figure out what my bot does. All the code for what I did can be found here: https://github.com/KatVHarris/ALIEbot

What I used

Microsoft Bot Framework is super powerful and makes it easy to create a bot of your own. You can use any of the following to get started:

  • Bot Connector
  • Bot Builder C#
  • Bot Builder Node.js

I used the Bot Connector, an easy way to create a single back-end and then publish to a bunch of different platforms called Channels.

I started out by following the steps in the getting started section of the docs and downloaded the Bot Template for Visual Studio here: http://docs.botframework.com/downloads/#navtitle

**Note: It’s really important that Visual Studio is updated in order to use this, and that you download the web tools during Visual Studio setup.**

**Another note: if you have never downloaded a template for Visual Studio before, here are some instructions: http://docs.botframework.com/connector/getstarted/#getting-started-in-net. You’ll have to save the zip into the %USERPROFILE% folder on your computer.**

Set Up

Open a new project with the Bot Template, and install the NuGet package for Microsoft’s Bot Builder: install-package Microsoft.Bot.Builder

Message Controller

The file that dictates the flow of responses is the MessageController.cs in the “Controllers” folder. The class handles system messages and allows you to control what happens when a message comes through.

Adding the following conditional statement to the Post function allows you to tailor the response to the user.

Let’s create a simple response:

public async Task<Message> Post([FromBody]Message message)
{
    if (message.Type == "Message")
        return message.CreateReplyMessage($"You said: {message.Text}");
    return HandleSystemMessage(message);
}

Now you can stick with this model and add in bits of functionality but I like to add a more powerful messaging system with Dialogs.


**Now, there are slight differences between the Bot Connector Dialogs for Node vs. C#. Everything in this post pertains to the C# version.**

Bot Builder uses dialogs to manage a bot’s conversations with a user. The great thing about dialogs is that they can be composed with other dialogs to maximize reuse, and a dialog context maintains a stack of the dialogs active in the conversation.

To use dialogs, all you need to do is add the [Serializable] attribute to your class and implement the IDialog<> interface.

Dialogs handle asynchronous communication with the user. Because of this, the MessageController will instead use the Conversation class to make an async call to a Dialog Task that uses the context to create a reply message with more functionality. What does that all mean? It means that with dialogs, you can carry on a conversation with a user asynchronously when certain keywords are triggered. For example, if the user types the keyword “reset”, we can use a PromptDialog to add a confirmation. One of the most powerful ways of creating an actual dialog between the user and the bot is to add Chain Dialogs.
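Putting that together, a minimal dialog looks something like this (a sketch based on the EchoBot sample; EchoDialog and MessageReceivedAsync are illustrative names):

```csharp
[Serializable]
public class EchoDialog : IDialog<object>
{
    public async Task StartAsync(IDialogContext context)
    {
        // Wait for the first message from the user.
        context.Wait(MessageReceivedAsync);
    }

    public async Task MessageReceivedAsync(IDialogContext context, IAwaitable<Message> argument)
    {
        var message = await argument;
        var reply = context.MakeMessage();
        reply.Text = $"You said: {message.Text}";
        await context.PostAsync(reply);
        // Loop: wait for the next message.
        context.Wait(MessageReceivedAsync);
    }
}
```

The MessageController then hands incoming messages to the dialog with Conversation.SendAsync(message, () => new EchoDialog());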

Chain Dialogs

Explicit management of the stack of active dialogs is possible through IDialogStack.Call and IDialogStack.Done, explicitly composing dialogs into a larger conversation. It is also possible to implicitly manage the stack of active dialogs through the fluent Chain methods. To look at all the possible ways to respond to a user with Dialogs, check out the EchoBot sample on GitHub.
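For example, an echo conversation can be written as a single fluent chain (a hedged sketch; exact Chain method names may vary between Bot Builder versions):

```csharp
// Wait for a message, project it to a reply string,
// post it back to the user, and loop forever.
var echoChain = Chain
    .PostToChain()
    .Select(msg => $"You said: {msg.Text}")
    .PostToUser()
    .Loop();
```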

Publishing your Bot

Okay, now that you have tested your bot and gotten it to respond to your users, how do we publish? The steps for getting your bot on the Bot Connector are here: http://docs.botframework.com/directory/publishing/#navtitle

TIP 1: Update Visual Studio and tools

As I said earlier, make sure all of your tools are on the latest update. My web tooling was not on the latest version when I first tried to publish my bot, so the directions were slightly different from the tutorial and caused issues later.

TIP 2: Don’t skip any of the steps

The first time I published my bot it didn’t work. I still have no idea why, but I believe it was because I missed a minor step in the creation process.

TIP 3: It should work immediately

Your bot should work immediately after you activate the web channel. If it doesn’t, check your code again. My first bot was not working immediately, and I ended up just registering a new one with the same code. That worked.

TIP 4: Web disabled

If you look at my channel picture, you can see that the web channel is registered, but its status says “disabled”.

Don’t worry about this. Your bot’s web channel should still work.

TIP 5: Registering your bot

You don’t need to register your bot for it to work. Registering your bot will allow it to be in the publish gallery later. Make sure your bot does something useful before submitting as well. Simple chat bots do not count.

That’s it! You should have a bot published and all ready to Chat with.

Next Steps – LUIS

Okay, so there are many ways that your bot can respond to your user. However, specific keywords are needed, and that is less user-friendly and conversational than we would like. To make our bot respond to the natural language users will most likely be using, we need to integrate LUIS, which I’ll talk about in part 2.

Reading the Bot Framework Docs was extremely helpful when getting started, so if you haven’t looked at them I recommend you take a look here: http://docs.botframework.com/

The Microsoft Bot Framework is open source, so you can help contribute to the project here: Microsoft/BotBuilder. They also have more samples included in their source code.

Happy Hacking!


Error Fixing

Here is a list of errors I’ve seen while working with the HoloLens Emulator. I’ll be adding to the post regularly. If I’ve missed something, please comment below and I’ll add it.

For this demo I was following the HoloAcademy Origami Tutorial: https://developer.microsoft.com/en-us/windows/holographic/holograms_101e

System Specs:

  • Windows 10 Enterprise
  • Intel(R) Core(TM) i5-4300U
  • 2.50 GHz
  • RAM – 8GB
  • 64 bit, x64 processor

Error – Exception Code 0xc0000409 Error

Solution: Check Versions of Unity and VS tools as well as Emulator Version

This error occurred because the latest Unity Editor build from the download link was not compatible with the Emulator Unity Tools for Visual Studio. The Unity editor version I had was 5.4.0b14. The Origami demo and the Emulator tools currently work with b10; if you’re looking at this post several months down the line, just make sure your versions are compatible. Also make sure you have VS Update 2 installed.


Error CS0234: The type or namespace name ‘WSA’ does not exist in the namespace ‘UnityEngine.VR’. Are you missing an assembly reference?

Solution: Make sure the correct version of Unity and VS Emulator Tools are installed. Then make sure the correct version of UWP tools are installed.

You should have the 10.0.10586 UWP tools, not the Win 10 SDK 10.0.10240. The Win 10 SDK at the moment conflicts with the tools for some reason; it did when I was trying to deploy my project. This may change in the future.

Error – Connectivity.Remote.Device.Ping()

Solution: Check whether Remote Tools version 10.0.10586 is installed; if not, download the Remote Tools.

Error – Project not Deploying

There are a number of reasons for faulty deployment

Solution: Wait. The first time I ran the Emulator it took 15 minutes to run and load my app.

Solution: Make sure your versions are correct.

Solution: Look at the project in Visual Studio; make sure there are no popup windows that are halting the debugging and stopping the deployment. The first time you run the emulator, Visual Studio will ask you if you want to continue debugging in Emulator mode; if you select “Continue (always use this option)”, the deployment process won’t hang waiting for your permission.

Solution: Make sure you don’t have too many other programs running

Error – Project is Deploying to Emulator but not starting

Solution: Hit the plus on the right side of the menu Window.

This will take you to All Apps Running in the Emulator; you should see your app there.

Error – No ‘Home’ or ‘Menu’ Window in emulator

Solution: Hit the Windows Key. If that doesn’t work restart.

Error – Stuck in Windowed Mode of the Emulator

If you see the Unity Logo with a white screen you will be stuck in Windowed mode of the Emulator and be unable to run your app.

Solution: Turn off emulator. Clean your Solution. Build it. Then hit Run again for the Emulator. The emulator is still new and sometimes will get stuck.

Error – HoloLens Emulator is not appearing in Visual Studio Devices drop-down

Solution: Make sure the tools are downloaded and you are in x86 mode with Release Mode selected

So those are the main ones. Again I’ll keep adding them. Let me know what cool projects you’re working on and if you ran into more errors that I can add =)

Happy Hacking!



Unity Trail Renderer v. Particle System

Hey all! This video is a continuation of the Particle System videos for polishing your game:

Happy Hacking!



So in the last post I talked about Particle System basics. In this post I’ll delve deeper into using particle systems in your game and talk about particle shaders and textures.

Textures, Texture Sheets, and Shaders

Usually when we talk about Textures and Shaders we are talking about applying them to a material for a gameobject in your scene. For those of you who are new to working on the front-end side of development, a Texture determines what an object will look like (color, designs, etc.) while a Shader determines what attributes the texture will have (shininess, transparency, reflectivity, etc.).

To be more specific, a Shader is defined as:

The method to render an object. This includes code and mathematical calculations that may include the angles of light sources, the viewing angle, and any other relevant calculations. Shaders can also specify different methods depending on the graphics hardware of the end user. The parameters that can be customized in the material inspector, such as texture maps, colors and numeric values.

Most developers only interact with Unity’s standard shader. It’s very powerful and can achieve many of the effects that developers want, like glass, metallic, matte, or holographic appearances for their objects. However, in addition to the Standard Shader, there are a number of other categories of built-in shaders for specialized purposes:

  • FX: Lighting and glass effects
  • GUI and UI: For user interface graphics
  • Mobile: Simplified high-performance shader for mobile devices
  • Nature: For trees and terrain
  • Particles: Particle system effects
  • Skybox: For rendering background environments behind all geometry
  • Sprites: For use with the 2D sprite system
  • Toon: Cartoon-style rendering
  • Unlit: For rendering that entirely bypasses all light & shadowing
  • Legacy: The large collection of older shaders which were superseded by the Standard Shader

Using different Textures with Shaders

So far we have only used the default particle texture with the default particle shader


Let’s import more assets to work with.

Assets > Import Package > Particles

This package comes with a variety of different textures, materials, and prefabs we can use and edit for our purposes. It also imports some default particle shaders for us to use.

Depending on the effect you want to accomplish, textures can be a single image or a sheet of images used like an animation sheet; and depending on how you want the texture to render in your scene, you need to use a corresponding shader.

Now, I’m not going to explain the intricacies of shaders and how they work; that’s way too in-depth for a short blog post. But I’ll touch on some basics of the two main particle shaders and the types of textures they work with.

Additive Shader

Look under the Prefabs folder: Assets > Standard Assets > Prefabs

Select the Afterburner prefab and drag it into your scene. This prefab is a great example of a Particle System using Additive Shaders; what’s even better is that it’s using the same Texture as the Default-Particle System.


Additive blending is the type of blending we do when we add different colors together and display the sum. This is the way our vision works with light, and it is how we can perceive millions of different colors on our monitors: they are really just blending three different primary colors together. You can read more about it here: http://www.learnopengles.com/tag/additive-blending/

This type of blending has many uses in 3D rendering, such as in particle effects which appear to give off light or overlays such as the corona around a light, or a glow effect around a light saber. If you go through the inspector of the Afterburner prefab you can see the specific settings for each of the particle system modules.

Alpha Blended Shader

Now, to show you Alpha Blended particles, go into the same folder as before, select DustStorm, and drag it into your scene. You can delete the Afterburner prefab. The DustStorm particle system uses a different texture than the default and afterburner systems.


This cloud has a black (0) alpha channel, making the black parts of the image not affect any of the pixels in the layers under it.

Think of a layer of colored glass or colored semitransparent plastic on top of something of another color. That’s the effect accomplished with alpha blending. Alpha blending is a technique that combines two colors, allowing for transparency effects. The alpha blending equation is as follows:

final color = src color * src alpha + dest color * (1-src alpha)

What this does is linearly interpolate between the two colors according to the alpha value. If a src pixel has an alpha of 0.8, it contributes 80% of the final color while the destination pixel contributes only 20% of the new pixel’s color. This means the lower the source pixel’s alpha, the larger the contribution of the destination pixel. To read more about it, check out this link: https://takinginitiative.wordpress.com/2010/04/09/directx-10-tutorial-6-transparency-and-alpha-blending/
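As a quick sanity check, the equation is easy to code up per channel (a small illustrative helper, not part of any particular graphics API):

```csharp
// Straight alpha blending of a single color channel; all values in 0..1.
static float AlphaBlend(float srcColor, float dstColor, float srcAlpha)
{
    return srcColor * srcAlpha + dstColor * (1f - srcAlpha);
}

// Example: white (1.0) with alpha 0.8 over black (0.0)
// → 1.0 * 0.8 + 0.0 * 0.2 = 0.8
// Additive blending, by contrast, simply sums the channels (clamped to 1).
```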

Let’s see the difference between the two shaders. Go into the inspector and change the Shader of the DustStorm to Particles/Additive instead of Particles/AlphaBlended.


Now you can see that the dust has a glow about it as it layers, instead of the transparent quality it had earlier.

Yay! Shaders and Textures complete! In the next post I’ll show you how to make your particles interact with objects in your scene.

Happy Coding!



So far in this series I’ve shown you how to make an infinite runner, talking about spawners, destroyers, character controls, handling collisions, and music. Now I’ll show you another way to polish your game: Particle Effects. Particle Effects are an easy way to add that extra wow factor to your game without too much effort.

There are several things you can do with the Particle System Unity provides. You can make streams of water, fireworks, explosions, lightning bolts, OR ominous yellow radioactive acid fog (a The 100 reference; you should watch it, it’s on Netflix). I find the best way to learn how to create all the different effects is to play with them yourself. So, by the end of this post, you will have a barrel leaking yellow radioactive material, shown below.


I’ll also talk about WHY the particles look the way they do and HOW we can affect them through the editor.

Understanding Particle Systems

Particle Effects are generated by a Particle System. Any gameobject within Unity can have a Particle System component.

A Particle System is, at its core, a spawner that spawns texture sheets at a specific location, with certain properties defining lifetime and behavior. We’ve done something like this before. However, this time Unity provides a nice Inspector menu to tweak the behavior and lifetime of the objects being spawned. Also, Unity’s Particle System is dynamic: the emission and lifetime settings affect the overall behavior of the system, but the individual particles can also change over time.

Unity also provides lots of documentation on each module of the Particle System. Modules are the different sections in the Particle System Component in Unity. Now, I won’t go over all the modules of a Particle System and their properties but I’ll definitely highlight the popular ones as we make the prefab.

  1. Main, Emission
  2. Shape
  3. Color Over Time
  4. Size over lifetime
  5. Collision
  6. Sub Emitters
  7. Renderer

Step 1: Set Up and Default Emitter

So in a new scene I’ve added a cylinder and a plane, along with an empty GameObject called Particles. I then rotated the Particles object -90 degrees on the X axis of its transform and added a Particle System component to it.


Once you’ve added the particle system to your scene, you will see little white glowing balls of light being generated and then disappearing. This is because of Unity’s default settings for its Particle System.


The little balls of light are created because that’s the current texture assigned to the system. The texture can be changed to whatever you want, like pictures of your face or smoke. I’ll talk more about textures and shaders later on. The point at which the particles are generated is called the Emitter.

Step 2: Main

The Main module of the Particle System controls properties of the system overall.


Duration is the length of time the system will run. Most of the properties in this module control the initial behavior of the particles from the emitter. I’ve made the duration 5.00 which translates to 5 seconds.

Looping – This boolean value triggers whether or not the system will loop. If selected it will then repeat in 5 second (the duration) intervals.

**SIDE NOTE: Start Delay, Lifetime, Speed, and Size can be customized by selecting the down arrow on the side and choosing one of the following options: Constant, Curve, Random Between Two Curves, and Random Between Two Constants. These options allow the system to behave dynamically, emitting particles at different sizes, at different times, and with different speeds.**

Start Delay: 0 – How many seconds after the system start would you like the first particles to be emitted.

Start Lifetime: 5 – Lifetime for the individual particles emitted

Start Speed: 2 – How fast the particle travels outward from the emitter

Start Size: 4, 7 (Random between two constants)- How large the particles will be

Start Color – Yellow and Light green. I also selected Random Between Two Colors. You can choose whatever colors you want.

Gravity Modifier – 0.1 – Scales the gravity value set in the physics manager. A value of zero switches gravity off. With the Gravity Modifier you do not need to use a lot of force; play around and see how little gravity you need to force your particles straight down after emission… Hint: it won’t be a lot.

Main should now look like this.
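The same inspector settings can also be reproduced from script. Here’s a hedged sketch mirroring the values above (note: the ParticleSystem.MainModule scripting API comes from a newer Unity version than the one used in this post, and must run before the system starts playing):

```csharp
using UnityEngine;

public class BarrelParticles : MonoBehaviour
{
    void Awake()
    {
        var main = GetComponent<ParticleSystem>().main;
        main.duration = 5f;       // the 5-second cycle from the inspector
        main.loop = true;
        main.startDelay = 0f;
        main.startLifetime = 5f;
        main.startSpeed = 2f;
        // Random between two constants, as selected in the inspector.
        main.startSize = new ParticleSystem.MinMaxCurve(4f, 7f);
        main.startColor = new ParticleSystem.MinMaxGradient(Color.yellow, Color.green);
        main.gravityModifier = 0.1f;
    }
}
```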


Step 3: Emission

Rate – 3, 5 – Random Between Two Constants. Unity explains that the rate of emission can be constant or can vary over the lifetime of the system according to a curve. If Distance mode is selected, a certain number of particles are released per unit of distance moved by the parent object. This is very useful for simulating particles that are actually created by the motion of the object (e.g., dust from a car’s wheels on a dirt track). However, Distance mode is only available when Simulation Space is set to World in the main Particle System section.


Step 4: Shape

The Emitter can take multiple shapes:

  • Sphere
  • Hemisphere
  • Cone
  • Box
  • Mesh
  • Mesh Renderer
  • Skinned Mesh Renderer
  • Circle
  • Edge

I’ve selected Cone since I want the particles to fountain out of the Barrel.

Angle – 30

Radius – 1 – Same as the radius of the barrel.

Emit from: Base

Step 5: Color Over Lifetime

Color Over Lifetime can be set with one Gradient or Random Between Two Gradients. I’ve chosen Gradient and set the color to white, then changed the alphas at the beginning and end to 0 to fade in and then fade out, creating a radiating glow from the particles. You can change the colors and alphas for the gradient by clicking on the gradient field.
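For reference, the same fade-in/fade-out gradient can be built in code (again, the module scripting API shown here is from a newer Unity version than this post uses):

```csharp
var col = GetComponent<ParticleSystem>().colorOverLifetime;
col.enabled = true;

var grad = new Gradient();
grad.SetKeys(
    // The color stays white the whole time...
    new[] { new GradientColorKey(Color.white, 0f), new GradientColorKey(Color.white, 1f) },
    // ...while alpha fades from 0 up to 1 and back down to 0.
    new[] { new GradientAlphaKey(0f, 0f), new GradientAlphaKey(1f, 0.5f), new GradientAlphaKey(0f, 1f) });
col.color = grad;
```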


Step 6: Size Over Lifetime

To edit the Size over Lifetime curve, you need to open the particle editor. At the top of the Particle System component in the Inspector, click Open Editor…


Once the editor has popped up, you will see the settings for your particle system as well as a graph. Unity has some default curves for you to choose from at the bottom of the grid. My curve starts at .5 and arcs up to 1.


You can also use the editor to tweak the modules that we were editing before.

Step 7: Collision

The collision module determines whether or not the particles will collide with 3D, 2D, or plane objects. I have it selected because on collision I want a secondary effect to happen, which we can edit with the sub-emitter module.


I’ve set my particles to collide with objects in 3D world space, with no dampening or bounce, so they stay on the ground and keep their velocity. I also want dynamic collisions, but not with the system itself. I could also check the box for Send Collision Messages, but I’ll save that for later.

Step 8: Sub Emitters

Sub Emitters are secondary particle systems that are parented to the initial particle system. They are great for creating secondary effects on actions. Unity likes using the example of a bullet that might be accompanied by a puff of powder smoke as it leaves the gun barrel, and a fireball that might explode on impact; those sub-emitters would be added to the Birth and Collision sections respectively. Since sub-emitters are simply particle systems, they can have sub-emitters of their own, allowing you to create complex effects like fireworks. However, BE CAREFUL: this can lead to an enormous number of particles in your scene and slow performance greatly.

When you initially look at the Sub Emitters module, it contains three sections (Birth, Collision, and Death) with two slots each, saying “None”.


As I mentioned before, I want my particles to have a secondary effect on collision. To add a sub-emitter, I simply have to select the plus sign next to the first Collision slot. A default sub-emitter of white lights will be created. To prevent the onslaught of pretty white lights filling up your scene, get ready to click Stop on the particle system in the Scene tab. This will save you a lot of grief, as your machine might slow down a bit.


Now to save time I’ll screen shot the modules of the sub-emitter for you to copy.




I’ve only changed a few things. The shape of the sub-emitter is a hemisphere, and the color over lifetime is a bit more colorful. Also notice that the Collision module is on as well but I do not have any more sub-emitters attached.

Step 9: Renderer

Now this module is related to the texture sheet that is attached to the particle system. Overall the Renderer determines how a particle’s image or mesh is transformed, shaded and overdrawn by other particles. Since we are using the default texture sheet I’ve kept the Renderer settings on their default as well.


Tweaking the System

Now there are several other examples that you can checkout from Unity in the editor. Simply go to Assets and Import the Particles Package


And there you have it! Your first Particle System! Now take what you’ve learned and play around with more of the settings. I did skip over some things, like Send Collision Messages, and the Texture Sheet Animation module. I will cover those in my next post where we have these systems interact with our player.

Happy Coding!


Music, Unity

So in Part 1 of Music and Sound I talked about creating your own game music. Now I’ll show you how to integrate the music into Unity.

Unity updated many things about their engine in the latest release. One of the biggest overhauls they did was toward sound creation and manipulation within the engine. This post will go over the basics of using the sound engine but if you want to take it to the next level here is a link to Unity’s advanced tutorial on making sound groups and mixers: http://blogs.unity3d.com/2014/07/24/mixing-sweet-beats-in-unity-5-0/.

Background Music

Sound and music are sometimes the most underrated things game developers take into consideration when making their games. What they don’t realize is that this extra addition can take a game from great to epic. Imagine Super Mario or any of the Final Fantasy games without their phenomenal scores. The ability to completely immerse the player would be lost. Movies use music to manipulate the mood of the audience in order to provoke certain emotions: fear, sadness, anxiety. Games can use music to do the same. For example, speeding up music can create a feeling of intensity and urgency.

Implementing Sounds in Unity

Before we begin here are some Unity terms that you should know:

  • Audio Listener – controls audio output to the headphones.
  • Audio Source – where the sound comes from
  • Audio Clip – sound file
  • Audio Mixer – Controls Audio Groups
  • Audio Group – Channel for certain clips and effects

To create sound in Unity you need at least three things. I’ll leave Audio Mixer and Audio Groups out for now.

  1. Audio Clip
  2. Audio Source
  3. Audio Listener

Step 1: Create Audio Game Objects

In your hierarchy, create an empty gameobject and call it SoundManager. I like to organize my sounds by putting them in an empty SoundManager object.

Step 2: Add Audio Source to the SoundManager

Step 3: Add Audio Clip to source

Create an Audio folder in your “Assets” folder. This folder will hold all your sound clips.


Take the sound file you created from the previous post, and with your Explorer (or Finder for Macs) move it to this Audio folder.

Then add this clip to the Audio Clip area of the Audio Source you added to SoundManager.


Make sure that the Loop and Play On Awake boxes are selected as well. This will ensure that your music plays throughout your game and starts at the creation of the SoundManager object.
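If you prefer to wire this up from code instead of the inspector, the same setup looks roughly like this (musicClip is a placeholder for the clip you imported):

```csharp
// Attach an AudioSource to the SoundManager and configure it.
var source = gameObject.AddComponent<AudioSource>();
source.clip = musicClip;   // your imported AudioClip
source.loop = true;        // play throughout the game
source.playOnAwake = true; // start when SoundManager is created
```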


Step 4: Add SoundManager Script

In your Scripts folder, create a new script and call it SoundManager. This script can control certain effects that can be applied to your clips, like pitch controls; however, I’m only using it to ensure the game music is playing.

public AudioSource musicSource; // assign the Audio Source in the inspector
public static SoundManager instanceSM = null;

void Awake () {
    // Keep a single SoundManager alive.
    if (instanceSM == null)
        instanceSM = this;
    else if (instanceSM != this)
        Destroy(gameObject);
}

// Update is called once per frame
void Update () {
    if (!musicSource.isPlaying)
        musicSource.Play(); // restart the music if it ever stops
}
And there you have it. If you press play your music should start playing. I’ll continue talking about adding sound effects to specific game objects in my next post, as well as getting into Audio Mixers.

Happy Coding


Music, Unity

For game developers, worrying about the music is one of the last things on our minds. It shouldn’t be. Music gives games an extra layer of professionalism that immerses players in the gaming experience.

Now, there are a bunch of free loops online. And if you want something professional, you can always pay someone to write a small loop for you. However, for those of us developers who want our games to have a unique sound (and have an hour or two to kill), we can create our own music loops.

One of my favorite tools for this is Audio Sauna. It’s a really good offline/online music editor and gives games a very 80s feel with the synths and samples it uses.

IT’S SOOOO EASY TO USE! Don’t just have the same generic garage band samples that everyone has. Trust me. There. Is. A. Better. Way.

Now, I am no composer, producer, or music maker, so this is just my 2 cents on how to create a song for your game.

Intro to Audio Sauna

To get started, you can go to the Audio Sauna site:


They have a great online studio for quick sound loop generation. When you open the studio you should see a screen like this:


Do you see it? Great! Okay, so what are we looking at? The studio works like most big audio applications: a menu bar at the top, editing hotkeys below that, your main editing area in the middle with the source input area, and the mixer at the bottom.

Creating Music

Step 1: Choosing Sound Inputs

The art of making loops, beats, and music in the studio is creating rhythms and layering different inputs together. The studio already has one of the default inputs up: the Analog Synth. Down in the Mixer you can see that the “Analog Synth” is colored red instead of blue. This indicates that it is selected and is the current track you are editing.

Let’s take a closer look at the Analog Synth input. It seems pretty daunting at first, but after playing around with it for a while it gets easier to understand. On the left, Audio Sauna has created some presets for you to use.


Now the AWESOME thing about this tool is that you can use your QWERTY keyboard as a regular musical keyboard! “But wait!” you exclaim, “there are way more keys on a musical keyboard than on a computer one.” Yes. The tool gets around this caveat with an octave button that shifts your input up and down the musical keyboard’s keys.


See, my mouse isn’t anywhere on the screen, but when I hit a keyboard key the corresponding musical keyboard key depresses.


**I’m just noticing how confusing it’s getting to reference keys on your computer keyboard versus keys on the musical keyboard, so from here on I’m going to call the musical keyboard’s keys “mkeys”.

Step 2: Writing Notes

You might have noticed that the side mkeyboard is also depressed.


The mkeyboard in the main editing area is the full standard mkeyboard, and it shows you where your note will be placed in the sheet timeline. There are two ways of adding notes to your loop’s “sheet music”: one, using your keyboard; two, using the pencil edit tool right on the timeline.


The Pencil tool allows you to click inside the timeline sheet and create notes that will be triggered on play. If you exit out of the Analog Synth input editor, you will see the plain editing sheet. However, the notes you create will all be on the short 1/16 beat. To change how long a note lasts, just click on the 1/16 and change it to a larger fraction.


At the top of the sheet you can see a green bar, which signifies the length of the loop.

Step 3: Recording Music

Now to actually save any noise you have been creating you need to record your key input.

Let’s get the input tool back: double clicking the red “Analog Synth” in the mixer will bring it back into view. Click on the keyboard to start using your computer keyboard to manipulate the mkeys, then hit the record button on the bottom.


The green bar is now highlighted over with a red one, signifying that your keystrokes are being recorded. Each time you press a key, a red mark should appear, one 1/16 at a time.


Great! Now keep playing around with the beat until you fill up the loop.

Step 4: Layering beats

I mentioned that the tool is meant for layering sounds on top of one another, so let’s make those other beats. Click the horizontal bars at the top right.


A new screen should appear. Currently the Analog Synth is selected and is the only track that has notes in it.


Now double click on the FM-Synth. The tool will then change back to the editor view, but for the FM-Synth.


As you can see below, the FM-Synth is now red in the mixer. However, the Analog Synth input editor is still out.


To get the FM-Synth input tool out, double click on the red FM-Synth name in the mixer.


A new input should appear now, like the one above. This time the presets are on the right-hand side. Select one of them and begin hitting the keys to hear what kind of sound it makes.

This editor works the same as the one before: either with the pencil, or in record mode with the keyboard, you can create your timeline sheet music. There is also a Sampler input that you can use; just repeat the same process with it.

Step 5: Saving the Loop

After you have composed your amazing, awesome mix, let’s get it into Unity. From the File menu there are two options you should be familiar with: Save to My Computer and Export Loop as Audio File.


Use Save to My Computer when you haven’t quite finished your mix and would like to work on it later; the tool saves your work as a session that you can reopen with the tool again. Export Loop as Audio File is the one we need when we are done with our loop and want to put it into Unity.

The next blog post in the series will cover adding your saved loop into Unity!

Happy Mixing!



Implementing Unity Ads

In my last post Unity Ads (part 1) I talked about the importance of Ads, why you need them, and how you should implement them.

Step 1: Sign-Up

Make sure you have a Unity Account before you get started. You will also need a Unity Services account as well. To get one go to https://unity3d.com/services/ads to sign-up and try the free beta.


Click “Earn Money with Your Games”.


If you have never created an ad service before your dashboard will prompt you to create a new Ad Project. Do NOT create a new project.  If you create a new project you will have to link the ID that is generated from your editor to this project. Instead, we will create the Ad portion of your project in Unity’s Project Editor.


Step 2: Enable Unity Services

Now that you have an account, you need to open your game in Unity. In the upper right-hand corner of the Editor you should see the Unity Services tab (if you don’t see the tab, hit the cloud icon in the upper right).


Before you can start using Ads, you need to create a Unity Project ID so you can collect your data and view it with Unity’s Analytics Dashboard.

Select “Create” to have Unity create a Unity Project ID.


Step 3: Enable Unity Ads

After you’ve turned on services and generated your Unity Project ID, you should see the available services that Unity provides within the Editor. Currently Analytics and Ads are the only ones available; however, multiplayer options and Cloud Build are in the pipeline for future integration and use with the editor.

Turn on Ads by clicking the “Off” toggle to “On” for Ads.


To link this project with our online portal simply click “Go to Dashboard”.


The portal for your project should now open in your browser.


Step 4: Ads Integration

Now that your project is linked to Unity Ad Services, let’s get some ads in your code. There are two types of Unity Ads:

  1. Simple
  2. Rewarded

To explain these two types, I’ll start with Rewarded. Rewarded ads keep track of whether the player skipped your ad or watched it to completion, which allows the developer to reward the player after the advertisement finishes.

Simple ads just run in your application without any further interaction with it.

Unity provides some sample code for you to run in your project, which makes it really easy to plug and play with the feature.
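As a rough sketch of what that plug-and-play code looks like, here’s a minimal script along the lines of Unity’s sample for the 5.x `UnityEngine.Advertisements` API. The class and method names (`AdManager`, `ShowSimpleAd`, `ShowRewardedAd`) are my own; only `Advertisement.IsReady`, `Advertisement.Show`, `ShowOptions`, and `ShowResult` come from the Ads API.

```csharp
using UnityEngine;
using UnityEngine.Advertisements;

public class AdManager : MonoBehaviour
{
    // Simple ad: just show it whenever one is ready.
    public void ShowSimpleAd()
    {
        if (Advertisement.IsReady())
        {
            Advertisement.Show();
        }
    }

    // Rewarded ad: pass a callback so we know how the player left the ad.
    public void ShowRewardedAd()
    {
        if (Advertisement.IsReady())
        {
            var options = new ShowOptions { resultCallback = HandleShowResult };
            Advertisement.Show(options);
        }
    }

    private void HandleShowResult(ShowResult result)
    {
        switch (result)
        {
            case ShowResult.Finished:
                Debug.Log("Ad finished; grant the player their reward here.");
                break;
            case ShowResult.Skipped:
                Debug.Log("Ad skipped; no reward.");
                break;
            case ShowResult.Failed:
                Debug.Log("Ad failed to show.");
                break;
        }
    }
}
```

Attach this to a GameObject and call `ShowRewardedAd()` wherever your game offers the reward; the `resultCallback` is what distinguishes a rewarded ad from a simple one.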


In order to test out your ads, make sure that the “Enable Test Mode” option is selected.


Also ensure that you are building to the iOS or the Android Platform.


When you run the code the ads should look like this:


And there you have it. Integrated ads in your game!

Happy Hacking!



Why you want ads

We live in a Free to Play world where most users have grown accustomed to mobile games being free. There are a select few who are willing to pay an up-front fee, but your game will already need to be popular, have player support, or have a good amount of attention to sell it. And of course, console games are an entirely different story.

So how do you get revenue from users who don’t want to spend any money? Ads. Advertisements allow you to have a steady revenue stream as your users play your game.

Now I know what you’re thinking: Ads are horrible, they distract from game play, they take up screen real estate, and they take away from the experience…And yes, they can – if the developer doesn’t design the game to incorporate them in an immersive, consumable way.

In this post, I’ll go over some of the main points of integrating ads into your game and how to do it in Unity’s new services. If you want to read more about it, Unity has a few articles and blog posts about best practices and tips from other indies that successfully incorporate ads in their games:

Implementing Ads Well

Don’t disrupt game play

Have you ever played a game with an annoying banner for a random ad across the top? The answer is most likely yes. One way to keep advertisements from disrupting game play is to decide at the beginning of the design process that you are going to make a freemium game. Knowing during the design phase that you plan on including ads allows you to tailor your game to use ads strategically instead of slapping them on at the last minute.

Make ads part of the experience of your game. Currently, Unity ads will play full screen and are video based. You can have simple ads or you can have advanced ads that can tell developers if users actually watched the ad or not.

Ad Incentives

To incorporate Ads into the experience of your game you can use ads with an incentive system. For example:

A user is playing your infinite runner and they die. Instead of simply showing their score and a play-again button, they see a screen that shows their score and a choice: 1) restart and try again, or 2) watch an ad and continue from where they died.

That way, game play has not been disrupted and the ad shows at a natural pause in the game. Also, because the player is rewarded, they don’t associate dying with being forced to watch ads, which would sour their experience with your game.
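The infinite-runner scenario above could be wired up roughly like this, a sketch using the 5.x Ads API; `DeathScreen`, `ResumeFromDeathPoint`, and `ShowScoreScreen` are hypothetical hooks into your own game, not part of any library.

```csharp
using UnityEngine;
using UnityEngine.Advertisements;

public class DeathScreen : MonoBehaviour
{
    // Called when the player chooses "watch an ad to continue".
    public void OnContinuePressed()
    {
        if (!Advertisement.IsReady())
        {
            ShowScoreScreen(); // no ad available, fall back to normal game over
            return;
        }

        Advertisement.Show(new ShowOptions
        {
            resultCallback = result =>
            {
                // Only resume if the player actually watched the whole ad.
                if (result == ShowResult.Finished)
                    ResumeFromDeathPoint();
                else
                    ShowScoreScreen();
            }
        });
    }

    private void ResumeFromDeathPoint() { /* hypothetical: respawn where the player died */ }
    private void ShowScoreScreen()     { /* hypothetical: normal game-over flow */ }
}
```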

Don’t break the game

Be wary of adding too much power to incentives.

The incentives and rewards that players receive for watching your ad should not give them an unfair advantage in the game. If your game involves some sort of currency, the reward shouldn’t be large enough to disrupt play. Balance the frequency of showing ads: show them too often and you’ll irritate players, which might cause them to stop playing your game altogether.
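One simple way to balance frequency is a cooldown so players never see two ads back to back. This is just one possible approach; `AdThrottle` and the 180-second window are my own illustrative choices to tune per game, and only `Advertisement` and `Time.time` come from Unity.

```csharp
using UnityEngine;
using UnityEngine.Advertisements;

public class AdThrottle : MonoBehaviour
{
    private const float CooldownSeconds = 180f;        // minimum gap between ads
    private float lastAdTime = -CooldownSeconds;       // allow the first ad immediately

    // Call this at natural pause points; it silently skips if it's too soon.
    public void TryShowAd()
    {
        if (Time.time - lastAdTime < CooldownSeconds) return; // too soon, skip
        if (!Advertisement.IsReady()) return;                 // no ad loaded yet

        lastAdTime = Time.time;
        Advertisement.Show();
    }
}
```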

How to Implement

Unity 5.2 and higher has Ads support integrated into the editor. It’s a great tool for testing your ad experience. In my next post, I’ll show you how to integrate ads the way I did with Unity Analytics here: Unity Gaming: Analytics

Happy Hacking