It works! I managed to get HoloLens inputs working with the LUIS integration I built before in Unity. The project melds phrase recognition with dictation from the HoloLens Academy GitHub example and then pings the LUIS API.
Okay, so this is a very short post. Most of the harder parts were completed earlier in my LUIS post, and the HoloLens Academy code helped a lot. I'll just mention some of the pains I went through and how it all works.
Phrase Recognition vs. Dictation
HoloLens has three main types of input control for users: Gaze, Gesture, and Voice. This application focuses on the last one. Voice input uses the Speech library for Windows.
using UnityEngine.Windows.Speech;
This library allows the HoloLens to use Phrase Recognition to trigger specific actions or commands in your project. Yet that defeats the point of natural language processing. In order to interact with LUIS, we need to feed in what the user is actually saying. To do this, I integrated the Communicator class from the HoloLens example into my project. This class handles the project's Phrase Recognition, but it also handles dictation, enabling natural language to be captured from the user. My Communicator is slightly different from the HoloLens example's because of the LUIS interactions, as well as some reworking of the code to allow multiple dictation requests.
Dictation is activated by Phrase Recognition commands, so there's no need to tap to activate.
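To show the shape of this, here's a minimal sketch of starting dictation from a keyword using UnityEngine.Windows.Speech (the class name and handler are my own; the actual Communicator class does more than this):

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

public class DictationTrigger : MonoBehaviour
{
    private KeywordRecognizer keywordRecognizer;

    void Start()
    {
        // "ally" rather than "Allie": the recognizer only matches this spelling
        keywordRecognizer = new KeywordRecognizer(new[] { "ally" });
        keywordRecognizer.OnPhraseRecognized += OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        // Phrase recognizers must be shut down before dictation can start
        PhraseRecognitionSystem.Shutdown();
        // ... start the DictationRecognizer here ...
    }
}
```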
Dev Notes
I did have some speech/phrase recognition trouble. The original phrase to activate dictation was "Allie" (the character on The 100 that my original bot project is based on); however, the recognizer doesn't recognize that spelling of her name. Changing it to "ally" caused the recognizer to trigger. The DictationRecognizer is similar to the PhraseRecognizer in that it also doesn't recognize the spelling of many names; for example, I would say "Tell me about Clarke." and the dictation recognizer would write "tell me about clark.". To fix the dictation errors I used Regex to replace the spellings before querying the LUIS API. One could also change the LUIS model to accept the speech-recognition spellings, but because multiple bots and applications are connected to my LUIS API I couldn't implement that solution.
private void DictationRecognizer_DictationResult(string text, ConfidenceLevel confidence)
{
    // Check to see if dictation is spelling the names correctly
    text = Checknames(text);

    // 3.a: Append textSoFar with latest text
    textSoFar.Append(text + ". ");

    // 3.a: Set DictationDisplay text to be textSoFar
    DictationDisplay.text = textSoFar.ToString();
}
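The Checknames helper isn't part of the HoloLens Academy sample; a minimal sketch of the Regex replacement I described might look like this (the method body is illustrative, assuming just the two names above):

```csharp
using System.Text.RegularExpressions;

// Hypothetical implementation: replace the recognizer's phonetic
// spellings with the spellings my LUIS model was trained on.
private string Checknames(string text)
{
    // \b ensures whole-word matches; IgnoreCase covers the
    // recognizer's lowercasing ("clark" -> "Clarke").
    text = Regex.Replace(text, @"\bclark\b", "Clarke", RegexOptions.IgnoreCase);
    text = Regex.Replace(text, @"\bally\b", "Allie", RegexOptions.IgnoreCase);
    return text;
}
```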
Anyway that’s all there is to it. All the major code is in the:
What channels you’re using matters! Test out on all your desired platforms before publishing code.
Testing out the limits of the Bot Framework, I tried to create multi-line responses for my bot. The Text property of replies accepts Markdown, so I thought it would be easy enough to implement. However, I quickly realized it didn't always look the way I wanted. Here are some tips to get your responses looking just right =).
These examples all build the reply with
reply = context.MakeMessage();
and send it with the PostAsync method, since my responses are all Tasks.
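For context, each snippet below slots into a pattern roughly like this (the method name is a placeholder; only the Text assignment changes between examples):

```csharp
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;

// Hypothetical wrapper showing where the reply.Text examples fit.
private async Task SendFormattedReply(IDialogContext context)
{
    var reply = context.MakeMessage();
    reply.Text = "Hi I'm one line \n\n I'm line two"; // swap in any example below
    await context.PostAsync(reply);
}
```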
reply.Text = "Hi I'm one line \n\n " +
"I'm line two" +
"I'm line three?" ;
Output Web:
Output Facebook:
Lists
In a list you must have the new-line syntax \n\n as well as a '*' with a space after it. Be careful here, since '*' is also used for italics. You can also see that the spacing is slightly different between the two channels.
Input:
reply.Text = "Hi I'm one line \n\n" +
"* Item 1 \n\n" +
"* Item 2 " ;
Output Web:
Output Facebook:
Block Quote with Horizontal Rule
Quoted text must have '>' with one space after it to denote that the next chunk of text is a quote. The horizontal rule is marked by '---'. We can see the limitations between channels even more clearly in this example.
Input:
reply.Text = "Block quote below bar \n\n" +
"---" +
"\n\n > Something about life. I'm an existential quote \n\n" +
"-BOT ";
Output Web:
Output Facebook:
Headers / Bold, Italics and Strike Throughs
This time there are drastic differences between Facebook and the Web. Note that with headers you must type \n\n after the header text or the entire string will be parsed as part of the header. Typing '***' around text gives bold italics. However, Facebook does not render ANY of these.
Input:
reply.Text = "# Don't know if I need new lines \n\n" +
"~~You **don't** *need* new lines~~ \n\n" +
"***yes you do***";
Output Web:
Output Facebook:
Links and Pictures in an Ordered List
Some more differences between Facebook and the Web, but fewer this time. Remember to put \n\n after every item in your list and to leave a space after the '.' following the number.
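The input snippet for this example didn't survive here; as an illustration, an ordered list with a link and a picture in the same style as the earlier examples would look something like this (the URLs are placeholders):

```csharp
reply.Text = "1. [A link](https://example.com) \n\n" +
             "2. ![A picture](https://example.com/image.png) \n\n" +
             "3. A plain item ";
```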
So in the last post I talked about the Particle System basics. In this post I’ll be delving deeper in using particles systems in your game and talk about Particle Shaders and Textures.
Textures, Texture Sheets, and Shaders
Usually when we talk about Textures and Shaders we are talking about applying them to a material for a gameobject in your scene. For those of you who are new to working on the front-end side of development, a Texture determines what an object will look like (color, designs, etc.) while a Shader determines what attributes the texture will have (shininess, transparency, reflectivity, etc.).
To be more specific, a Shader is defined as:
The method to render an object. This includes code and mathematical calculations that may include the angles of light sources, the viewing angle, and any other relevant calculations. Shaders can also specify different methods depending on the graphics hardware of the end user. A shader also defines the parameters that can be customized in the material inspector, such as texture maps, colors, and numeric values.
Most developers only interact with Unity’s standard shader. It’s very powerful and can achieve many of the effects that developers want, like glass, metallic, matte, or holographic appearances for their objects. However, in addition to the Standard Shader, there are a number of other categories of built-in shaders for specialized purposes:
FX: Lighting and glass effects
GUI and UI: For user interface graphics
Mobile: Simplified high-performance shader for mobile devices
Nature: For trees and terrain
Particles: Particle system effects
Skybox: For rendering background environments behind all geometry
Sprites: For use with the 2D sprite system
Toon: Cartoon-style rendering
Unlit: For rendering that entirely bypasses all light & shadowing
Legacy: The large collection of older shaders which were superseded by the Standard Shader
Using different Textures with Shaders
So far we have only used the default particle texture with the default particle shader
Let’s import more assets to work with.
Assets > Import Package > Particles
This package comes with a variety of different textures, materials, and prefabs we can use and edit for our purposes. It also imports some default particle shaders for us to use.
Depending on the effect you want to accomplish Textures can be a single image or a sheet of images that is used like an animation sheet; and depending on how you want the texture to render in your scene you need to use a corresponding shader.
Now, I'm not going to explain the intricacies of shaders and how they work; that's way too in-depth for a short blog post. But I'll touch on some of the basics of the two main particle shaders and the types of textures they work with.
Additive Shader
Look under the Prefabs folder: Assets > Standard Assets > Prefabs.
Select the Afterburner prefab and drag it into your scene. This prefab is a great example of a Particle System using Additive Shaders; what’s even better is that it’s using the same Texture as the Default-Particle System.
Additive blending is the type of blending we do when we add different colors together and display the result. This is the way our vision works with light, and it is how we can perceive millions of different colors on our monitors: they are really just blends of three different primary colors. You can read more about it here: http://www.learnopengles.com/tag/additive-blending/
This type of blending has many uses in 3D rendering, such as in particle effects which appear to give off light or overlays such as the corona around a light, or a glow effect around a light saber. If you go through the inspector of the Afterburner prefab you can see the specific settings for each of the particle system modules.
Alpha Blended Shader
Now, to show you Alpha Blended particles, go into the same folder as before, select DustStorm, and drag it into your scene. You can delete the Afterburner prefab. The DustStorm Particle System uses a different texture than the default and Afterburner systems.
This cloud has a black (0) alpha channel, making the black parts of the image not affect any of the pixels in the layers under it.
Think of a layer of colored glass or colored semitransparent plastic on top of something of another color. That’s the effect accomplished with Alpha blending. Alpha blending is a blending technique that allows for the combination of two colors allowing for transparency effects. The alpha blending equation is as follows:
final color = src color * src alpha + dest color * (1-src alpha)
What this does is linearly interpolate between the two colors according to the alpha value. If a src pixel has an alpha of 0.8, it contributes 80% of the final color while the destination pixel contributes only 20% of the new pixel's color. This means the lower the source pixel's alpha, the larger the contribution of the destination pixel. To read more about it check out this link: https://takinginitiative.wordpress.com/2010/04/09/directx-10-tutorial-6-transparency-and-alpha-blending/
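As a tiny illustration, the equation above translates directly into code (plain C#, nothing Unity-specific; the helper name is mine):

```csharp
// Blend one color channel of a source pixel over a destination pixel.
// With srcAlpha = 0.8 the source contributes 80% of the result
// and the destination contributes the remaining 20%.
static float AlphaBlend(float srcColor, float dstColor, float srcAlpha)
{
    return srcColor * srcAlpha + dstColor * (1f - srcAlpha);
}
```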
Let’s see the difference between the two shaders. Go into the inspector and change the Shader of the DustStorm to Particles/Additive instead of Particles/AlphaBlended.
Now you can see that the dust has a glow about it as it layers, instead of the transparent quality it had earlier.
Yay! Shaders and Textures complete! In the next post I’ll show you how to make your particles interact with objects in your scene.
So far in this series I’ve shown you how to make an infinite runner; talking about spawners, destroyers, character controls, handling collisions, and music. Now I’ll show you another way to polish your game; Particle Effects. Particle Effects are an easy way to add that extra wow factor to your game without too much effort.
There are several things you can do with the Particle System Unity provides. You can make streams of water, fireworks, explosions, lightning bolts, OR ominous yellow radioactive acid fog (a The 100 reference; you should watch it, it's on Netflix). I find the best way to learn how to create all the different effects is to play with them yourself. So, by the end of this post, you will have a barrel leaking yellow radioactive material, shown below.
I'll also talk about WHY the particles look like they do and HOW we can affect them through the editor.
Understanding Particle Systems
Particle Effects are generated by a Particle System. Any gameobject within Unity can have a Particle System component.
A Particle System is, at its core, a spawner that spawns texture sheets at a specific location, with certain properties defining lifetime and behavior. So we've done something like this before. However, this time Unity provides a nice Inspector menu to tweak the behavior and lifetime of the object being spawned. Unity's Particle System is also dynamic: the emission and lifetime settings affect the overall behavior of the system, but the individual particles can also change over time.
Unity also provides lots of documentation on each module of the Particle System. Modules are the different sections in the Particle System Component in Unity. Now, I won’t go over all the modules of a Particle System and their properties but I’ll definitely highlight the popular ones as we make the prefab.
Main, Emission
Shape
Color Over Time
Size over lifetime
Collision
Sub Emitters
Renderer
Step 1: Set Up and Default Emitter
So in a new scene I’ve added a Cylinder and a plane to the scene along with an empty GameObject and called it Particles. I then rotated the Particles object -90 in the X axis of the transform and added a particle system component to it.
Once you've added the particle system to your scene you will see little white glowing balls of light being generated and then disappearing. This is because of Unity's default settings for its Particle System.
The little balls of light are created because that’s the current texture assigned to the system. The texture could be changed to whatever you want, like pictures of your face or smoke. I’ll talk more about textures and shaders later on. The point at which the particles are generated is called the Emitter.
Step 2: Main
The Main module of the Particle System controls properties of the system overall.
Duration is the length of time the system will run. Most of the properties in this module control the initial behavior of the particles from the emitter. I’ve made the duration 5.00 which translates to 5 seconds.
Looping – This boolean value triggers whether or not the system will loop. If selected it will then repeat in 5 second (the duration) intervals.
SIDE NOTE: Start Delay, Lifetime, Speed, and Size can be customized by selecting the down arrow on the side and choosing one of the following options: Constant, Curve, Random Between Two Curves, or Random Between Two Constants. These options allow the system to behave dynamically, emitting particles at different sizes, at different times, and with different speeds.
Start Delay: 0 – how many seconds after the system starts before the first particles are emitted
Start Lifetime: 5 – Lifetime for the individual particles emitted
Start Speed: 2 – How fast the particle travels outward from the emitter
Start Size: 4, 7 (Random between two constants)- How large the particles will be
Start Color – Yellow and Light green. I also selected Random Between Two Colors. You can choose whatever colors you want.
Gravity Modifier – 0.1 – Scales the gravity value set in the physics manager. A value of zero will switch gravity off. With the Gravity Modifier you do not need to use a lot of force. Play around and see what little gravity you need to force your particles straight down after emission… Hint, it won’t be a lot.
Main should eventually look like this now.
Step 3: Emission
Rate – 3, 5 – Random Between Two Constants. Unity explains that the rate of emission can be constant or can vary over the lifetime of the system according to a curve. If Distance mode is selected then a certain number of particles are released per unit of distance moved by the parent object. This is very useful for simulating particles that are actually created by the motion of the object (eg, dust from a car’s wheels on a dirt track). However, Distance mode is only available when Simulation Space is set to World in the Particle System section.
Step 4: Shape
The Emitter can take multiple shapes:
Sphere
Hemisphere
Cone
Box
Mesh
Mesh Renderer
Skinned Mesh Renderer
Circle
Edge
I’ve selected Cone since I want the particles to fountain out of the Barrel.
Angle – 30
Radius – 1 – Same as the radius of the barrel.
Emit from: Base
Step 5: Color Over Lifetime
Color Over Lifetime can be set with one Gradient or Random Between Two Gradients. I've chosen Gradient, set the color to White, and then changed the alphas at the beginning and end to 0 so the particles fade in and fade out, creating a radiating glow. You can change the colors and alphas for the gradient by clicking on the gradient field.
Step 6: Size Over Lifetime
To edit the curve of Size Over Lifetime you need to open the editor for particle editing. At the top of the Particle System component in the Inspector, click Open Editor…
Once the editor has popped up you will see the settings for your particle system as well as a graph. Unity has some default curves for you to choose from at the bottom of the grid. My curve starts at .5 and arcs up to 1.
You can also use the editor to tweak the modules that we were editing before.
Step 7: Collision
The collision module determines whether or not the particles will collide with 3D, 2D, or plane objects. I have it selected because on collision I want a secondary effect to happen, which we can edit with the sub-emitter module.
I’ve chosen my particles to collide with objects in the 3D World space, with no dampening force or bounce on the particles so they stay on the ground and keep their velocity. I also want dynamic collisions but not with itself. I could also check the box for Send Collision Message, but I’ll save that for later.
Step 8: Sub Emitters
Sub Emitters are secondary particle systems that are parented to the initial particle system. They are great for creating secondary effects on actions. Unity likes using the example of a bullet that might be accompanied by a puff of powder smoke as it leaves the gun barrel, and a fireball that might explode on impact; those sub-emitters would be added to the Birth and Collision sections respectively. Since sub-emitters are simply particle systems, they can have sub-emitters of their own, allowing you to create complex effects like fireworks. However, BE CAREFUL: this can lead to an enormous number of particles in your scene and slow performance greatly.
When you initially look at the Sub Emitter Module it contains three sections, Birth, Collision, Death with two slots each, saying, “None”.
As I mentioned before I want my particles to have a secondary effect on their collision. So to add a sub-emitter to my scene I simply have to select the plus sign next to the first Collision slot. The default sub-emitter of white lights will be created. To prevent the onslaught of pretty white lights filling up my scene, get ready to click stop on the particle system in the scene tab. This will save you a lot of grief as your machine might slow down a bit.
Now to save time I’ll screen shot the modules of the sub-emitter for you to copy.
I’ve only changed a few things. The shape of the sub-emitter is a hemisphere, and the color over lifetime is a bit more colorful. Also notice that the Collision module is on as well but I do not have any more sub-emitters attached.
Step 9: Renderer
Now this module is related to the texture sheet that is attached to the particle system. Overall the Renderer determines how a particle’s image or mesh is transformed, shaded and overdrawn by other particles. Since we are using the default texture sheet I’ve kept the Renderer settings on their default as well.
Tweaking the System
Now there are several other examples that you can check out from Unity in the editor. Simply go to Assets and import the Particles package.
And there you have it! Your first Particle System! Now take what you’ve learned and play around with more of the settings. I did skip over some things, like Send Collision Messages, and the Texture Sheet Animation module. I will cover those in my next post where we have these systems interact with our player.
So in Part 1 of Music and Sound I talked about creating your own game music. Now I’ll show you how to integrate the music into Unity.
Unity updated many things about their engine in the latest release. One of the biggest overhauls they did was toward sound creation and manipulation within the engine. This post will go over the basics of using the sound engine but if you want to take it to the next level here is a link to Unity’s advanced tutorial on making sound groups and mixers: http://blogs.unity3d.com/2014/07/24/mixing-sweet-beats-in-unity-5-0/.
Background Music
Sound and music are sometimes the most underrated things game developers take into consideration when making their games. What they don't realize is that this extra addition can take a game from great to epic. Imagine Super Mario or any of the Final Fantasy games without their phenomenal scores; the ability to completely immerse the player would be lost. Movies use music to manipulate the mood of the audience in order to provoke certain emotions: fear, sadness, anxiety. Games can use music to do the same. For example, speeding up music can create a feeling of intensity and urgency.
Implementing Sounds in Unity
Before we begin here are some Unity terms that you should know:
Audio Listener – receives sound from Audio Sources in the scene and outputs it to the speakers or headphones (there is typically one, on the main camera)
Audio Source – where the sound comes from
Audio Clip – sound file
Audio Mixer – Controls Audio Groups
Audio Group – Channel for certain clips and effects
To create sound in Unity you need at least three things. I’ll leave Audio Mixer and Audio Groups out for now.
Audio Clip
Audio Source
Audio Listener
Step 1: Create Audio Game Objects
In your hierarchy create an empty gameobject and call it SoundManager. I like to organize my sounds by putting them in an empty SoundManager object.
Step 2: Add Audio Source to the SoundManager
Step 3: Add Audio Clip to source
Create an Audio folder in your “Assets” folder. This folder will hold all your sound clips.
Take the sound file you created from the previous post, and with your Explorer (or Finder for Macs) move it to this Audio folder.
Then add this clip to the Audio Clip area of the Audio Source you added to SoundManager.
Make sure that the Loop and Play On Awake boxes are selected as well. This will ensure that your music plays throughout your game and starts at the creation of the SoundManager object.
Step 4: Add SoundManager Script
In your scripts folder create a new script and call it SoundManager. This script can control certain effects that can be applied to your clips, like pitch controls; however, I'm only using it to ensure the game music keeps playing.
using UnityEngine;

public class SoundManager : MonoBehaviour {

    public AudioSource musicSource;
    public static SoundManager instanceSM = null;

    // Use this for initialization
    void Awake () {
        // Keep a single SoundManager alive across scene loads
        if (instanceSM == null)
            instanceSM = this;
        else if (instanceSM != this)
            Destroy(gameObject);
        DontDestroyOnLoad(gameObject);
    }

    // Update is called once per frame
    void Update () {
        // Restart the music if it has stopped for any reason
        if (!musicSource.isPlaying)
        {
            musicSource.Play();
        }
    }
}
And there you have it. If you press play your music should start playing. I’ll continue talking about adding sound effects to specific game objects in my next post, as well as getting into Audio Mixers.
For game developers, worrying about the music is one of the last things on our minds. It shouldn't be. Music gives games the extra layer of professionalism that immerses players in the gaming experience.
Now there are a bunch of free loops online. And if you want something professional you can always pay someone to write a small loop for you too. However, for those of us developers that want our games to have a unique sound, (and have an hour or two to kill) we can create our own music loops.
One of my favorite tools for this is Audio Sauna. It's a really good online/offline music editor and gives games a very '80s feel with the synths and samples it uses.
IT’S SOOOO EASY TO USE! Don’t just have the same generic garage band samples that everyone has. Trust me. There. Is. A. Better. Way.
Now, I am no composer, producer, or music maker, so this is just my 2 cents on how to create a song for your game.
Intro to Audio Sauna
To get started you can go to the audio sauna site:
They have a great online studio for quick sound loop generation. When you open the studio you should see a screen like this:
Do you see it? Great! Okay, so what are we looking at? The studio works like most big audio applications: a menu bar at the top, some editing hotkeys below that, your main editing area in the middle with the source input area, and the mixer at the bottom.
Creating Music
Step 1: Choosing Sound Inputs
So the art of making loops, beats, and music in the studio is creating rhythms and layering different inputs together. The studio already has one of the default inputs up: the Analog Synth. Down in the Mixer you can see that "Analog Synth" is colored red instead of blue. This indicates that it is selected and is the track you are currently editing.
Let's take a closer look at the Analog Synth input. It seems pretty daunting at first, but after playing around with it for a while it gets easier to understand. On the left, Audio Sauna has created some presets for you to use.
Now the AWESOME thing about this tool is that you can use your QWERTY keyboard as a regular musical keyboard! "But wait?!" you exclaim, "There are way more keys on a musical keyboard than on a computer one." Yes. The tool gets around this caveat with an octave button that bounces your input up and down the musical keyboard's keys.
See my mouse isn’t anywhere on the screen but when I hit the keyboard key the corresponding musical keyboard key depresses.
I'm just noticing how confusing it is getting to reference a key on your keyboard versus a key on your musical keyboard, so I'm going to call the musical keyboard keys "mkeys".
Step 2: Writing Notes
You might have also noticed that the side mkeyboard is also depressed.
The mkeyboard in the main editing area is the full standard mkeyboard, and it shows you where your note will be placed in the sheet timeline. There are two ways of adding notes to your loop's "sheet music": one, using your keyboard; two, using the pencil edit tool right on the timeline.
The Pencil tool allows you to click inside the timeline sheet and create notes that will be triggered on play. If you exit out of the Analog Synth input editor, you will see the plain editing sheet. However, the notes you create will all be on the short 1/16 beat. To change how long a note lasts, just change the 1/16 to a larger fraction by clicking on it.
At the top of the sheet you can see a green bar; it signifies the length of the loop.
Step 3: Recording Music
Now to actually save any noise you have been creating you need to record your key input.
Let's get the input tool back: double-clicking the red "Analog Synth" label brings it back into view. Click on the keyboard to start using your keyboard to play the mkeys. Now hit the record button at the bottom.
The green bar is now highlighted over with a red one, to signify that it is recording your key strokes now. Each time you press a key a red mark should appear one 1/16 at a time.
Great! Now keep playing around with the beat until you fill up the loop.
Step 4: Layering beats
So I mentioned that the tool is meant for layering sounds on top of one another. Let's make those other beats. Click the horizontal bars at the top right.
A new screen should appear. Currently the Analog Synth is selected and is the only track that has notes in it.
Now double click on the FM-Synth. The tool will then change back to the editor view, but for the FM-Synth.
As you can see below the FM-Synth is now red in the mixer. However, the Analog Synth input editor is still out.
To get the FM-Synth input tool out double click on the red FM synth name in the mixer.
A new input should appear now, like the one above. This time the presets are on the right-hand side. Select one of them and begin hitting the keys to hear what kind of sound will be made.
This editor works the same as the one before. So either with the pencil, or in record mode with the keyboard, you can create your timeline sheet music. There is also a Sampler input that you can use. You will have to repeat the process.
Step 5: Saving the Loop
So after you have composed your amazing, awesome mix, let's get it into Unity. From the File menu there are two options you should be familiar with: the Save to My Computer option and Export Loop as Audio File.
You should use the Save to My Computer option when you haven’t quite finished your mix and would like to work on it later. The Tool saves what you’ve done as a session that you can reopen with the tool again. The Export Loop as Audio File is the one we need when we are done with our loop and want to put it into Unity.
The next blog post in the series will be adding your saved Loop into Unity!
In my last post Unity Ads (part 1) I talked about the importance of Ads, why you need them, and how you should implement them.
Step 1: Sign-Up
Make sure you have a Unity Account before you get started. You will also need a Unity Services account as well. To get one go to https://unity3d.com/services/ads to sign-up and try the free beta.
Click “Earn Money with Your Games”.
If you have never created an ad service before your dashboard will prompt you to create a new Ad Project. Do NOT create a new project. If you create a new project you will have to link the ID that is generated from your editor to this project. Instead, we will create the Ad portion of your project in Unity’s Project Editor.
Step 2: Enable Unity Services
Now that you have an account, you need to open your game in Unity. In the upper right-hand corner of the Editor you should see the Unity Services tab (if you don't see the tab, hit the cloud icon in the upper right).
Before you can start using Ads you need to create a Unity Project ID so you can collect your data and view it with Unity's Analytics Dashboard.
Select “Create” to have Unity create a Unity Project ID.
Step 3: Enable Unity Ads
After you've turned on services and generated your Unity Project ID, you should see the available services that Unity provides within the Editor. Currently Analytics and Ads are the only ones available; however, multiplayer options and Cloud Build are in the pipeline for future integration with the editor.
Turn on Ads by clicking the “Off” toggle to “On” for Ads.
To link this project with our online portal simply click “Go to Dashboard”.
The portal for your project should now open in your browser.
Step 4: Ads Integration
Now that your project is linked to Unity Ad Services, let’s get some ads in your code. There are two types of Unity Ads:
Simple
Rewarded
To explain these two types, I’ll start with Rewarded. Rewarded Ads keep track of whether or not the player has skipped your ad, or has completed watching the ad. This allows the developer to then reward the player after the completion of the advertisement.
Simple Ads are ads that simply run in your application without any other interaction with your application.
Unity provides some sample code for you to run in your project, which makes it really easy to plug and play with the feature.
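That sample boils down to something like this sketch using the UnityEngine.Advertisements API of that era (the class name and reward logic here are placeholders, not Unity's exact sample):

```csharp
using UnityEngine;
using UnityEngine.Advertisements;

public class AdExample : MonoBehaviour
{
    // Hypothetical hook: call this from your game-over screen, etc.
    public void ShowRewardedAd()
    {
        if (Advertisement.IsReady())
        {
            Advertisement.Show(new ShowOptions { resultCallback = HandleShowResult });
        }
    }

    private void HandleShowResult(ShowResult result)
    {
        switch (result)
        {
            case ShowResult.Finished:
                // The player watched the whole ad: grant the reward here
                break;
            case ShowResult.Skipped:
                // The player skipped: no reward
                break;
            case ShowResult.Failed:
                Debug.LogError("The ad failed to be shown.");
                break;
        }
    }
}
```

The resultCallback is what makes Rewarded ads possible: only the Finished case grants the reward.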
In order to test out your ads, make sure that the “Enable Test Mode” button is selected
Also ensure that you are building to the iOS or the Android Platform.
When you run the code the ads should look like this:
And there you have it. Integrated ads in your game!
We live in a free-to-play world where most users have grown accustomed to mobile games being free. There are a select few who are willing to pay an up-front fee, but your game will already need to be popular, have player support, or a good amount of attention to sell it. And of course, console games are an entirely different story.
So how do you get revenue from users who don’t want to spend any money? Ads. Advertisements allow you to have a steady revenue stream as your users play your game.
Now I know what you're thinking: ads are horrible, they distract from gameplay, they take up screen real estate, and they take away from the experience... And yes, they can, if the developer doesn't design the game to incorporate them in an immersive, consumable way.
In this post, I’ll go over some of the main points of integrating ads into your game and how to do it in Unity’s new services. If you want to read more about it, Unity has a few articles and blog posts about best practices and tips from other indies that successfully incorporate ads in their games:
Have you ever played a game with an annoying banner across the top for a random ad? The answer is most likely yes. One way to keep advertisements from disrupting gameplay is to decide at the beginning of the design process that you are going to make a freemium game. Knowing that you plan on including ads during the design phase allows you to tailor your game to use ads strategically instead of slapping them on at the last minute.
Make ads part of the experience of your game. Currently, Unity ads will play full screen and are video based. You can have simple ads or you can have advanced ads that can tell developers if users actually watched the ad or not.
Ad Incentives
To incorporate Ads into the experience of your game you can use ads with an incentive system. For example:
A user is playing your infinite runner and they die. Instead of simply showing their score and a play-again button, they see a screen that shows their score and a choice: 1) restart and try again, or 2) watch an ad and continue from where they died.
That way, game play has not been disrupted and the ad shows at a natural pause in the game. Also, because the player is being rewarded, they don’t associate dying with watching ads, which would otherwise give them a bad experience with your game when they die.
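The continue-for-an-ad flow described above maps onto the advanced, callback-based API: ShowOptions lets you react to whether the player actually finished watching. This is a sketch under that assumption; RespawnAtDeathPoint and ShowGameOverScreen are placeholder methods standing in for your own game logic:

```csharp
using UnityEngine;
using UnityEngine.Advertisements;

public class ContinueWithAd : MonoBehaviour
{
    // Called when the player chooses "Watch an ad and continue".
    public void ShowContinueAd()
    {
        if (!Advertisement.IsReady())
            return;

        var options = new ShowOptions { resultCallback = HandleResult };
        Advertisement.Show(null, options);
    }

    void HandleResult(ShowResult result)
    {
        if (result == ShowResult.Finished)
        {
            // Only reward the player if they watched the whole ad.
            RespawnAtDeathPoint();
        }
        else
        {
            // Skipped or failed: fall back to the normal game-over flow.
            ShowGameOverScreen();
        }
    }

    void RespawnAtDeathPoint() { /* hypothetical: restore player state */ }
    void ShowGameOverScreen() { /* hypothetical: score + play-again UI */ }
}
```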
Don’t break the game
Be wary of adding too much power to incentives.
The incentives and rewards that players receive for watching your ad should not give them an unfair advantage in the game. If your game involves some sort of currency, the reward shouldn’t be great enough to disrupt play. Balance the frequency of showing ads: show them too often and you’ll irritate players, which might cause them to stop playing your game altogether.
How to Implement
Unity 5.2 and higher has Ads support integrated into the editor. It’s a great tool for testing your ad experience. In my next post, I’ll show you how to integrate ads the way I did with Unity Analytics here: Unity Gaming: Analytics
So in Part 1 of Unity Gaming: Analytics, I talked about the importance of analytics, what it is, why you would need them, and how to understand the data. This part will go over how to integrate them into your game and connect to the Unity Analytics Dashboard. Remember these new analytics are still in Beta and can only work with Unity 5.2 or above. Let’s get started.
Step 1: Sign-Up
Make sure you have a Unity Account before you get started. You will also need a Unity Services account. To get one, go to https://unity3d.com/services/analytics to sign up and try the free beta.
Step 2: Enable Unity Services
Now that you have an account, you need to open your game in Unity. In the upper right-hand corner of the Editor you should see the Unity Services tab (if you don’t see the tab, hit the cloud icon in the upper right).
Before you can start using Analytics, you need to create a Unity Project ID so you can collect your data and view it with Unity’s Analytics Dashboard.
Select “Create” to have Unity generate a Unity Project ID (if you already created a project ID in the Analytics Dashboard tool, you can use that ID to connect to your game with the “I already have a Unity Project ID” link below the Create button).
Step 3: Enable Analytics
After you’ve turned on Services and generated your Unity Project ID, you should see the available services that Unity provides within the Editor. Currently, Analytics and Ads are the only ones available; however, multiplayer options and Cloud Build are in the pipeline for future integration with the editor.
Turn on Analytics by flipping its toggle from “Off” to “On”.
The Services tab will then open to the Analytics section. Click the “Enable Analytics” button.
Step 4: Analytics Integration and Validation
To view and test your analytics you now need to go to the Analytics Dashboard found online at https://analytics.cloud.unity3d.com. The easiest way to get there is to click on “Go to Dashboard” (make sure you’re connected to wifi).
The link will open your default browser and navigate you to the integration tab on your Unity Dashboard.
To find out if your Analytics Services are correctly integrated, navigate through the documentation by clicking the Next button. You’ll see a “Play to Validate” page.
Go back to your application and Play it in the editor. The empty box on the Dashboard should now be displaying data about your game.
Troubleshooting
If there is no data being displayed, stop your game and give the system time to refresh the dashboard. If it still isn’t working make sure that the Project ID in the Dashboard and the Project ID in the editor are the same.
Step 5: Write Custom Events
Now all that’s left to do is figure out what data is important to learn how users are interacting with your game. The next post will explain how to write code to collect custom information specific to your game/application.
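As a small preview, a custom event in Unity 5.2+ is a single call to Analytics.CustomEvent with an event name and a dictionary of values. The event and field names below are purely illustrative:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Analytics;

public class DeathTracker : MonoBehaviour
{
    // Report where and how long the player survived when they die.
    // "playerDeath", "level", and "secondsAlive" are placeholder names.
    public void ReportDeath(string levelName, float secondsAlive)
    {
        Analytics.CustomEvent("playerDeath", new Dictionary<string, object>
        {
            { "level", levelName },
            { "secondsAlive", secondsAlive }
        });
    }
}
```

Events like this show up in the Dashboard, where you can chart them against your other metrics.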