The Unreal Virtual Production Fellowship is a month-long intensive program put on by Epic. The course ran for four weeks, and I was part of cohort number 1. During the program we learned the Unreal basics surrounding the engine's capabilities for virtual production, from materials and lighting to Sequencer and the Cine Camera.
This project was created during the fellowship. I did the lighting, facial animations, cinematography, and blueprints, modified the environments, and put it all together.
Want to see my final project or participate in the fellowship? The links are below.
The Real-Time Shorts Competition was a 30-day short-film challenge for Unreal, with individuals and teams working remotely from around the world. MacInnes Studios supplied the Unreal characters + scene files; real-time filmmakers supplied creativity + vision + skills. An illustrious panel of judges picked the best.
This is my first short in Unreal, using version 4.24. The characters and most of the animations were provided by MacInnes Studios. I did the lighting, parts of the animation, cinematography, and voice acting.
Want the code for this project? You can clone the repo from GitHub. I also have tutorials about how I created the project on my blog here.
MRTK-Unity is a Microsoft-driven open-source project that provides a set of components and features to accelerate cross-platform MR app development in Unity. It provides the basic building blocks for Unity development on HoloLens, Windows Mixed Reality, and OpenVR. It's also designed as an extensible framework that gives developers the ability to swap out core components, and it supports a wide range of platforms.
I was one of the designers for the MRTK and also developed new features on our internal toolkit that we could port over to the public. Our team was in charge of bridging the gap between what Microsoft R&D was doing internally and what we shared with community devs.
This project combines the power of Microsoft Cognitive Services and Unity. Using HoloLens voice recognition combined with the Bing Speech to Text service and Microsoft's natural language processing service (LUIS), users are able to query the bot for information with only their voice.
I created this project in an attempt to find better user interactions with HoloLens, since tapping is physically impossible for some users and voice commands need to be precise in order for the built-in phrase handler to work. This method uses natural language processing to interpret a user's intent rather than a specific phrase. I did all of the programming and design work for the project.
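To give a rough idea of the approach, here is a minimal sketch of the intent lookup, written in TypeScript purely for illustration (the HoloLens app itself is built in Unity). It assumes the LUIS v2 REST endpoint shape; the app ID, key, region, and intent names are placeholders, not the project's actual values.

```typescript
// Hypothetical sketch: send a transcribed utterance to a LUIS endpoint and
// read back the top-scoring intent, instead of matching an exact phrase.
interface LuisResult {
  query: string;
  topScoringIntent: { intent: string; score: number };
  entities: { entity: string; type: string }[];
}

async function interpretUtterance(utterance: string): Promise<LuisResult> {
  const appId = "<luis-app-id>";    // placeholder
  const key = "<subscription-key>"; // placeholder
  const url =
    `https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/${appId}` +
    `?subscription-key=${key}&q=${encodeURIComponent(utterance)}`;

  const response = await fetch(url);
  return (await response.json()) as LuisResult;
}
```

The benefit is that "who plays Eleven" and "tell me about the actress who plays Eleven" can both resolve to the same intent, so the interaction doesn't hinge on the user saying one exact phrase.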
This experience was my first creation working with HoloLens. It utilizes spatial mapping and gaze events to allow the user to bring scenes from the show Stranger Things into their living room. This project was later ported to the Microsoft Mixed Reality headsets, covering both the VR and AR sides of the spectrum.
This project was created during an internal two-day hackathon and was my first experience developing for HoloLens. I developed this project on my own and used art and sounds from the Unity Asset Store. I also ported the application to VR on my own.
Want the code from this project? You can clone the repo from GitHub. I also gave a talk about my experience at Vision Summit 2017.
This bot was created to demonstrate the power of the new Microsoft Bot Framework. It uses Dialogs and Microsoft Cognitive Services LUIS, a natural language understanding platform, to interpret user input and output show information.
I created this project in an attempt to learn the new Microsoft Bot Framework. I eventually ported the bot to Unity as well to experiment with new methods of gameplay interaction, and it is what led to the HoloBot I created.
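For flavor, here is a hedged sketch of how a bot like this might wire Dialogs to a LUIS recognizer, shown with the botbuilder v3 Node SDK in TypeScript. It's illustrative only: the intent and dialog names are placeholders, and the actual bot may have been built with a different Bot Framework SDK.

```typescript
// Hedged sketch with the botbuilder v3 Node SDK. "GetShowInfo" is a
// placeholder intent name, not necessarily what the real bot uses.
import * as builder from "botbuilder";
import * as restify from "restify";

const connector = new builder.ChatConnector({
  appId: process.env.MICROSOFT_APP_ID,
  appPassword: process.env.MICROSOFT_APP_PASSWORD,
});
const bot = new builder.UniversalBot(connector);

// Score every incoming message against a LUIS model instead of exact matches.
bot.recognizer(new builder.LuisRecognizer(process.env.LUIS_MODEL_URL as string));

// A dialog triggered whenever LUIS scores the "GetShowInfo" intent highest.
bot.dialog("GetShowInfo", (session) => {
  session.endDialog("Here is the show information you asked for...");
}).triggerAction({ matches: "GetShowInfo" });

// Expose the messaging endpoint the Bot Framework channels post to.
const server = restify.createServer();
server.post("/api/messages", connector.listen());
server.listen(Number(process.env.PORT) || 3978);
```

Registering the LuisRecognizer on the bot means each incoming message is scored against the LUIS model, and triggerAction routes the conversation to whichever dialog matches the top intent.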
This project was part of my Make a Game Without Coding talk and series. The game is adapted from the show Stranger Things and follows Nancy as she hunts, or is hunted by, the Demogorgon. The game was built using Construct 2. It is all hosted on Microsoft Azure and can be played here: Nancy's Quest Game.
I created this project to demonstrate what is possible without knowing how to code. I did all of the game logic with Construct 2. All of the art and animation is original and was created by me as well.
This is an open-source project maintained by both Microsoft and the developer community, and it can be found on GitHub.
Trello is a web-based project management application used by many indie game developers and startups, as well as small to medium-sized businesses. This plugin, available in the VS Code Marketplace, was created to increase developer productivity by managing cards from within the IDE. Coded in TypeScript in VS Code itself, the plugin uses a variety of callback functions to authenticate users and manage lists of cards from your project boards.
I created this project as part of an internal three-day hackathon at Microsoft. I continued to iterate on it so I could submit it to the Visual Studio Code Marketplace. This was a solo project I did to improve my TypeScript and plugin-development skills.
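As a rough illustration of the core idea, here is a minimal TypeScript sketch of a command that pulls cards from a Trello board and surfaces them in a quick pick. The Trello endpoint follows the public REST API, but the command name, board ID, and credentials are placeholders, and the real extension's auth flow is more involved.

```typescript
// Hypothetical sketch of the plugin's core loop: fetch cards from a board,
// show them in a quick pick. Assumes a global fetch (Node 18+ extension host).
import * as vscode from "vscode";

interface TrelloCard {
  id: string;
  name: string;
  desc: string;
}

async function fetchCards(boardId: string, key: string, token: string): Promise<TrelloCard[]> {
  const url = `https://api.trello.com/1/boards/${boardId}/cards?key=${key}&token=${token}`;
  const response = await fetch(url);
  return (await response.json()) as TrelloCard[];
}

export function activate(context: vscode.ExtensionContext) {
  const disposable = vscode.commands.registerCommand("trello.showCards", async () => {
    // Credentials would normally come from settings or an auth flow.
    const cards = await fetchCards("<board-id>", "<api-key>", "<token>");
    const picked = await vscode.window.showQuickPick(
      cards.map((c) => ({ label: c.name, detail: c.desc }))
    );
    if (picked) {
      vscode.window.showInformationMessage(`Selected card: ${picked.label}`);
    }
  });
  context.subscriptions.push(disposable);
}
```

In a real extension the trello.showCards command would also be declared under contributes.commands in package.json so it shows up in the Command Palette.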
Want the code for the project? You can clone the repo from GitHub, or download it from the VS Code Marketplace.
This project was started in order to create more immersive controls for VR. Originally developed in 2014/2015 as an infinite runner game for the Oculus Rift DK2, the project used Kinect integration to improve user presence.
After using the gamepad in VR, I started playing around with different input methods to better improve a user's presence. I developed this project on my own, writing all of the code and designing the game, UI, and interactions. I even made the music. However, the textures and particle effects are from the Asset Store.