Over the past several years we’ve seen an explosion of interest from students looking to work with or learn from our team. In an ideal world, we would give everyone who asks the opportunity to fill a paid internship (we try not to let anyone work for free).
As an alternative, we offered a select group of college students interested in an internship the chance to take part in an L2D-guided, team-based design and development incubation process, which we dubbed L2D Summer School. Essentially, we created a space for them to build something cool of their own devising while giving them feedback along the way. We provided the space, advice from our senior-level team members, and the occasional ice-cream bar to help them along. Here is how we presented Summer School:
Running between Monday July 9th and Friday August 9th, the L2D Summer School program will give four talented young people the opportunity to work collaboratively, within a small team and under the supervision of L2D senior staff members, on a moonshot project of their own design. This project, which may or may not be applicable to any current L2D work, can include aspects of web, physical interactive, and game design, but will not be limited to one silo or discipline.
To clarify the general direction of the task, we also asked the Summer School students to “Create a project that is collaborative and reflective of team members’ respective disciplines, while also stretching and challenging each team member outside their comfort zone in order to learn within other areas of concentration.”
This piece will outline, in their words, what they were able to accomplish in only a month. Spoiler alert: what they managed with basically three half-days per week, all while performing a major pivot about halfway through the process, will blow your mind. The team included Libby Michael and Chad Tetzlaff (design), and Dustin Smoote and Jonathan Waters (engineering). Warning: parts of this get very technical. If you want a TL;DR, head on down to the bottom.
Project Description
After a quick brainstorming process (that looked something like this) the team had a good understanding of what they wanted to build. Here is what they came up with:
Initially we came up with the idea to host an interactive exhibition, using Xbox Kinect cameras to track movement of people in space to then create a digital interaction with objects/projection mappings in the room.
The team moved down this path for roughly two weeks, creating some pretty cool interactions using Kinect cameras, data visualization, and the Unity real-time creation platform, but after some debate they decided to completely change direction. Not the most advisable path for a project that needed to be completed in a month, but they did it in order to produce the best possible project in the available time. Here is what they wanted to accomplish after the pivot:
After being unable to settle on a uniform and concise original idea, we landed on the thought of creating an AR app influenced by the concept of a Rube Goldberg machine. This new idea would be an AR physics-based game: you would use your smartphone to track cards, which would then be translated into 3D objects. We initially pushed to design 10 levels and create a sandbox mode. The design process was heavily influenced by a ’90s visual aesthetic (terrazzo, Taco Bell interior design, math textbooks, etc.). To create the app we used an AR plugin for Unity called Vuforia. Each card that was scanned was linked to a 3D object created by Libby, our 3D designer. The app is cross-compatible for Android and iOS.
Making it happen:
Since the plan was to build an app, it needed to be available on both Android and iOS. Here’s how to make that happen in a short period of time (excluding the approval process, which admittedly can be arduous at times).
Opening the app:
Unity can export projects to Android and iOS devices as running applications, via the Android SDK and Xcode, respectively. Once the project has been exported from Unity, the app icon can be located on the device and tapped to run the app, just like any other mobile application downloaded from Google Play or the App Store.
Detecting the card:
- Using the Vuforia plugin in our team’s Unity build, the DefaultTrackableEventHandler.cs script provided by Vuforia served as a template for our custom event handler script, named VFEventObject.cs.
- In the VFEventObject.cs script, object tracking is handled by three methods: OnTrackableStateChanged(), OnTrackingFound(), and OnTrackingLost().
- These three methods influence the conditional logic within the Update() method, which renders the 3D meshes over their associated trackable objects.
- OnTrackableStateChanged() uses constant tracking-state values, inherited from Vuforia’s ITrackableBehaviour class, to determine and update the dynamic state of the trackable object by comparing the previous status to the new status.
If the trackable object has been detected in OnTrackableStateChanged(), OnTrackingFound() is called, which enables the 3D mesh associated with the trackable object (playing cards, in our case) and sets the boolean variable “tracking” to true. Alternatively, if tracking is lost between the previous status and the new status, OnTrackingLost() is called, which sets “tracking” to false. The “tracking” variable drives the conditional logic in the Update() method of the VFEventObject script, where the orientations of the 3D meshes associated with each card are configured to behave uniformly while being tracked.
Here's what that looks like:
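The handler described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the team's actual VFEventObject.cs: it follows the shape of Vuforia's DefaultTrackableEventHandler template, and the status constants and renderer toggling are assumptions based on that template.

```csharp
using UnityEngine;
using Vuforia;

// Hedged sketch of VFEventObject.cs, modeled on Vuforia's default
// trackable event handler. Exact member names are assumptions.
public class VFEventObject : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;
    private bool tracking = false;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by Vuforia whenever the card's tracking status changes;
    // the old and new statuses are compared to decide what happened.
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            OnTrackingFound();
        }
        else
        {
            OnTrackingLost();
        }
    }

    // Enable the 3D mesh tied to this card and flag it as tracked.
    private void OnTrackingFound()
    {
        foreach (var r in GetComponentsInChildren<Renderer>(true))
            r.enabled = true;
        tracking = true;
    }

    // Hide the mesh and clear the flag when the card leaves the frame.
    private void OnTrackingLost()
    {
        foreach (var r in GetComponentsInChildren<Renderer>(true))
            r.enabled = false;
        tracking = false;
    }
}
```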
Spawning the shape:
Within the VFEventObject.cs script, the Update() method handles the transformations of the 3D meshes, or shapes, in conjunction with the orientation of the playing cards. For example, as a playing card is rotated 90 degrees, the 3D shape follows and rotates along with the card. The shapes’ transformations are locked onto a single x-plane, so that the ball always aligns with the shapes on that plane during gameplay. The Update() method also gives the user the ability to select shapes and lock them into place, so that they remain on the display even after the card has been removed.
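A minimal sketch of what that Update() method might look like, assuming a `shape` transform, a `locked` flag, and a `lockPlaneX` constant (all hypothetical names; the `tracking` flag is the one described earlier):

```csharp
// Hypothetical continuation of VFEventObject.cs. The shape follows
// its card while tracked, pinned to a single x-plane, and a tap on
// the shape locks it in place so it survives card removal.
private Transform shape;      // the 3D mesh spawned for this card
private bool locked = false;  // true once the player pins the shape
private float lockPlaneX = 0f;

void Update()
{
    if (tracking && !locked)
    {
        // Follow the card's orientation, but constrain the shape to
        // one x-plane so it stays aligned with the falling ball.
        Vector3 p = transform.position;
        shape.position = new Vector3(lockPlaneX, p.y, p.z);
        shape.rotation = transform.rotation;
    }

    // A tap that hits this shape locks it into place on the display.
    if (Input.GetMouseButtonDown(0))
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit) &&
            hit.transform == shape)
        {
            locked = true;
        }
    }
}
```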
Interactivity with the Ball:
The game ball is a 3D sphere GameObject, one of Unity’s default primitive shapes, that has been UV mapped and given a custom texture. The ball has a Rigidbody component applied to it, so that gravity acts on its mass. In addition to applying physics to the game ball, the shapes that serve as obstacles during gameplay are configured with BoxCollider components, so that any collision between the ball and a shape is detected, allowing the ball to interact with the 3D meshes. Without the box colliders applied to the shapes, the ball would simply fall through the obstacles.
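The component setup just described can be sketched in a few lines. This is an illustrative snippet, not the team's code; in practice these components were likely attached in the Unity editor rather than at runtime, and `ConfigurePhysics` is an assumed name.

```csharp
using UnityEngine;

// Hedged sketch: give the ball physics and the obstacles colliders,
// matching the component choices described in the text.
static void ConfigurePhysics(GameObject gameBall, GameObject[] shapes)
{
    // A Rigidbody gives the ball mass and lets gravity act on it.
    Rigidbody rb = gameBall.AddComponent<Rigidbody>();
    rb.useGravity = true;

    // Each obstacle shape needs a collider; without one, the falling
    // ball would pass straight through it.
    foreach (GameObject shape in shapes)
        shape.AddComponent<BoxCollider>();
}
```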
The configuration for the ball’s initial position and release during gameplay is handled in the SceneController.cs script, which handles the load and exit setup for the current level selected by the user.
Below are the methods in SceneController.cs which setup the visibility, functionality, and transformation of the game ball:
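Since the original listing isn't reproduced here, the following is a hypothetical reconstruction of those SceneController.cs methods based on the description above; the method and field names (`SetupBall`, `ReleaseBall`, `startGate`) are assumptions.

```csharp
using UnityEngine;

// Hedged sketch of the SceneController.cs ball-handling methods:
// position the hidden ball for the loaded level, then make it
// visible and let physics take over.
public class SceneController : MonoBehaviour
{
    public GameObject gameBall;
    public Transform startGate;

    // Place the (hidden) ball above the start gate when a level loads.
    public void SetupBall()
    {
        gameBall.SetActive(false);
        gameBall.transform.position =
            startGate.position + Vector3.up * 0.5f;
        // Kinematic while staged, so it doesn't fall prematurely.
        gameBall.GetComponent<Rigidbody>().isKinematic = true;
    }

    // Reveal the ball and hand it over to the physics engine.
    public void ReleaseBall()
    {
        gameBall.SetActive(true);
        gameBall.GetComponent<Rigidbody>().isKinematic = false;
    }
}
```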
Completing the Game
To create an objective-based game, each level design has a single start gate and a single end gate, positioned relative to the start gate. Both become visible on the display once the start-gate playing card has been tracked by the smartphone’s camera. Like the other obstacle shapes in the game, the end-gate GameObject has a box collider attached, so that when the game ball collides with the end gate, the SceneController script exits the level, clears the visible targets, and loads the “Win Screen”. The “Win Screen” page is accessed through the MenuController script, which provides UI navigation for launching the menu and levels pages throughout the runtime of the application. The MenuController methods are extended and reused within the SceneController script for additional menu navigation between loading and exiting levels during gameplay.
Below is the endLevel() method in the SceneController script, using MenuController method ShowLevelComplete() to launch the “Win Screen” page within the UI menu.
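As the original listing isn't shown here, the following is a hedged reconstruction of endLevel() from the description: only endLevel() and ShowLevelComplete() are named in the text, and the trigger wiring and helper names are assumptions.

```csharp
// Hypothetical sketch of SceneController.cs's level-exit path.
public MenuController menuController;

void endLevel()
{
    gameBall.SetActive(false);          // hide the ball
    ClearVisibleTargets();              // assumed helper: clear targets
    menuController.ShowLevelComplete(); // launch the "Win Screen" page
}

// In a setup like this, the end gate's collider would report the
// ball's arrival (e.g. via a trigger event) and invoke endLevel().
void OnEndGateReached(Collider other)
{
    if (other.gameObject == gameBall)
        endLevel();
}
```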
Our Conclusion
To us, the 1st Annual L2D Summer School was a rousing success. Jonathan, Dustin, Libby, and Chad were able to pull together to create an app, with a unique aesthetic, that was actually playable… and fun! Currently, the team is refining the app, and it will be available as a full game pretty soon. Keep an eye out and follow along with the process here.
Internally, providing an opportunity for students to learn, have fun and build something as a group was invigorating. It provided an opportunity for our senior level team members to gain further understanding through mentorship and an additional excuse to eat popsicles and ice cream during a blazing hot Chattanooga summer.
TLDR:
Within a month, the Summer School team created an AR-based app with multiple functional levels. They used Unity and the Vuforia plugin to craft a game where users place cards on a table or a wall. The cards spawn 3D objects that mirror the shapes printed on them. The goal is to get a ball that appears at the top of the screen to fall into a bucket-like shape at the bottom, so you arrange the cards to form a path for the ball. As you progress through the levels (9 in total), the bucket gets farther and farther from where the ball drops, making it more difficult to arrange the cards in a way that gets the ball into the bucket.
The senior members of L2D provided feedback occasionally, but all of the work was done by Jonathan, Libby, Chad, and Dustin. Check out their progress here if you’re intrigued. Oh, and what Summer School experience would be complete without a corresponding playlist? Enjoy.