WanderMath is an Entertainment Technology Center (ETC) project making an Augmented Reality (AR) experience inspired by Math Walks, to help 4th and 5th graders engage in open-ended math adventures.  

Our interdisciplinary team consists of five students:

Katherine Wheeler, co-producer of the team, took charge of external production, such as client communication, external resource connections, and the playtest schedule, as well as contributing to game design. Kat is also the voice actress of our character Finley.

Leslie Jing, co-producer of the team, took charge of internal production, such as scheduling, team and faculty meetings, design, and updates to our project website. Leslie also helped with game design, art, and tech.

Sophia Videva is the solo artist on our team. She created our lovable character Finley from scratch through to animation, and also made our team logo.

Yuanqin Fan and Jacob Li are the programmers on our team. They taught themselves AR from scratch and overcame many difficulties with the help of the ETC.

The Center for Arts and Education at West Liberty University is our client, providing us with guidance throughout the semester. Our goal is to help kids not only “see math” but also play with it, in pursuit of a critical, conceptual understanding of math.

By the end of the semester, we successfully made three activities: the Birthday Party, the Skate Park, and the Carnival, covering geometry concepts such as folding the nets of 3D shapes, calculating the volume of prisms, and graphing points on the coordinate plane. Students can freely explore the AR math world under their teachers’ guidance.

We also published WanderMath to the Google Play Store, making it an app that can be downloaded anywhere in the world, and wrote a Teacher Resource Guide covering the app and each activity in it.

Project Achievements

1. Production

1.1 Time schedule

  1. Semester-long plan

At the beginning of the semester, we didn’t have a long-term plan. It was only after the first process-grade meeting with our instructors, Michael Christel and Thomas Corbett, that we mapped out the whole semester and realized how little time was left. Fortunately, we caught this early enough to complete all three activities; otherwise, time could easily have slipped away without our noticing.

  2. Reserve time for the unexpected

We set our internal deadlines early: Week 12 was the final week for development and playtests. We also reserved the whole of Week 11 for polishing the project, and treated the weeks before it as buffer for the unexpected, such as falling ill or wrestling with stubborn bugs. This way, we finished the activities on time and didn’t need to rush in the last few weeks.

1.2 Communication

  1. Within the team

We communicated very well within the team. We not only talked about the project but also chatted about life, food, and clothing, and even taught each other our languages! We had two team-bonding events this semester, both on playtest days. The first was on Oct. 10th: after a big meal at Hofbräuhaus, we ran our first playtest in the Tech Playground. The second was lunch on Nov. 6th, the ETC Playtest Day, when we ordered way too much pizza and invited all the other teams to enjoy it with us! Through Slack and WeChat, we stayed in touch outside core hours, which improved our efficiency and flexibility.

  2. With instructors

Beyond the weekly faculty meetings where we updated our instructors on our progress, we also met with Mike almost every Wednesday and Friday to show him each update we made and raise any concerns from the week. This way, we received timely feedback from our instructors and could adjust our plan or development quickly.

2. Tech 

2.1 Activity 1

  1. Folding cubes in AR space

The biggest challenge we faced was letting users manipulate 3D AR content through a 2D phone screen. We did a fair amount of research and tried multiple strategies to make the interaction as intuitive as possible while keeping on-screen instructions to a minimum.

We started out by reading the 2D screen position of a swipe to decide the fold direction (left vs. right). This worked initially, but as users moved around the object, the detection broke down; it was inaccurate enough to make users more frustrated than entertained. So we switched to raycasting into the 3D scene to track the finger’s movement in world space instead, which worked much better.

However, a problem arose when a user tried to fold from the opposite side: the fold would not respond, since the drag direction in world space was reversed. This didn’t happen often, so we left it until late in the semester. After much research, we tried adding a look-at script so objects would turn to face the user and the drag direction would stay fixed, but we removed it because people want to view the cubes from multiple angles, not just one. We then tried detecting the user’s position relative to the object and flipping the logic accordingly; this works for a single object but fails badly with multiple objects when the user stands among them. Our final solution was a forgiving mode: if a user has been touching an object for more than 5 to 10 seconds, we assume they are having trouble folding the cube and let them complete the fold with a simple tap.
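The forgiving-mode check boils down to a per-object touch timer. Our implementation lived in Unity C#; the sketch below is a language-agnostic reconstruction of the logic, with all names (`FoldTarget`, `FORGIVE_AFTER_SECONDS`) invented for illustration:

```python
FORGIVE_AFTER_SECONDS = 5.0  # threshold before we fall back to tap-to-fold

class FoldTarget:
    """Tracks how long a user has been touching a face without folding it."""
    def __init__(self):
        self.touch_started_at = None
        self.folded = False

    def on_touch_begin(self, now):
        if self.touch_started_at is None:
            self.touch_started_at = now

    def on_touch_end(self):
        self.touch_started_at = None

    def in_forgiving_mode(self, now):
        """True once the same face has been held past the threshold."""
        return (self.touch_started_at is not None
                and now - self.touch_started_at >= FORGIVE_AFTER_SECONDS)

    def on_tap(self, now):
        # Normally a tap does nothing (folding needs a drag), but in
        # forgiving mode a simple tap completes the fold.
        if self.in_forgiving_mode(now):
            self.folded = True
```

The key design point is that the drag gesture is never disabled; forgiveness only adds a second, easier path to the same result.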

  2. Instantiating objects in AR space

We were also looking for ways to make objects appear in 3D space in a way that felt logical. Our initial idea of having objects float down from above read as a bug to players, so we leaned into the magic instead, using particle effects to make objects appear and disappear.

2.2 Activity 2

  1. Drawing accurately in AR space

This was definitely the most challenging activity we built. The major issue was calculating distances accurately despite drift in the AR world. Our initial design acknowledged that drift exists, so we built in varying margins of error to keep the activity working. However, it took a huge amount of trial and error to find the right tolerances for gameplay and calculations.

However, as we tested the activity more through external playtests arranged by our instructor, we noticed that the extent of drift varies widely from session to session. Although our initial approach covered about 80% of normal play sessions, it was not foolproof, and it got worse when the game was initialized on arbitrary surfaces.

As such, we changed our approach: instead of caching real-world object locations, we keep a dictionary mapping virtual objects to real-world objects and vice versa. When a drawing gesture comes in, we update the mapped virtual objects so the link stays current. Then, whenever we need an object’s location, we go through the virtual object to look up the real-world object and read its position fresh, which accounts for drift. This has worked 100% of the time so far, and we are happy we made the change.
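The essence of the change is to defer every position read to use time rather than caching coordinates at setup time. A minimal sketch of that idea (names like `VirtualObject` and `landmarks` are illustrative; in the project this lived in C# on top of ARFoundation):

```python
class VirtualObject:
    """Stands in for an anchored scene object whose transform the
    AR tracking layer keeps up to date as the session drifts."""
    def __init__(self, position):
        self.position = position  # (x, y, z), refreshed by tracking

# Map real-world landmarks to virtual stand-ins instead of storing raw
# coordinates; re-reading .position at use time absorbs drift.
landmarks = {
    "ramp_corner_a": VirtualObject((0.0, 0.0, 0.0)),
    "ramp_corner_b": VirtualObject((1.0, 0.0, 0.0)),
}

def distance(name_a, name_b):
    """Compute a length from the *current* virtual positions, not stale ones."""
    pa = landmarks[name_a].position
    pb = landmarks[name_b].position
    return sum((a - b) ** 2 for a, b in zip(pa, pb)) ** 0.5

# If the AR session drifts, the tracking layer rewrites .position and the
# next distance() call sees the corrected value automatically.
```

The cached-coordinate approach fails precisely because a number copied out of the scene can never be corrected; an indirection through the live object can.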

  2. Pulling from multiple edges using ProBuilder

As objects in Unity have only a single pivot point, pulling up multiple edges to form a ramp requires extra logic. The most basic solution would be to instantiate several variants of the object with different pivots and, depending on where the object is touched, swap in the right one. This would work in an ordinary game, but in AR, where drift is possible, users could see “multiple objects” where there should be one.

Hence, we did more research and found ProBuilder, a tool for building customized shapes for level design. It is mainly used before gameplay, but it also provides APIs for runtime interaction. Resources on this are minimal, so we spent a considerable amount of effort figuring out how to raise edges and the top face, making a single ProBuilder object dynamic at runtime.
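ProBuilder’s runtime API has its own types and rebuild steps, but the core operation, translating the vertices an edge shares and deforming one mesh in place, can be sketched independently of the engine. All names below are illustrative, not ProBuilder API:

```python
def raise_edge(vertices, edge_indices, height):
    """Translate the vertices of one edge upward, deforming a single mesh
    instead of swapping between pre-made variants (which can visibly
    separate from each other when the AR session drifts)."""
    for i in edge_indices:
        x, y, z = vertices[i]
        vertices[i] = (x, y + height, z)
    return vertices

# A unit quad lying flat on the ground, listed counter-clockwise.
quad = [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)]

# Raising the far edge (vertex indices 2 and 3) turns the quad into a ramp.
ramp = raise_edge(quad, [2, 3], 0.5)
```

In ProBuilder the equivalent step is followed by rebuilding the mesh so Unity picks up the new vertex positions; the single-object property is what avoids the “multiple objects” artifact described above.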

2.3 Activity 3

  1. Mainly reuses mechanics from Activities 1 and 2

At the beginning of Activity 3’s development, our team planned to work asynchronously: one programmer kept working on the second activity while the other started implementing basic features with placeholder assets. This let us make progress on both activities and catch up with our original development schedule.

To fit this asynchronous schedule, we spent some time on scope management for Activity 3 and came up with the carnival-booth idea, which is both easy to implement and fun to play. We reused the folding and dragging mechanics and implemented the coordinate-based water-ball shooting gameplay, saving the time we would otherwise have spent researching brand-new AR features. With the development experience from the previous two activities, the basic feature demo was done within two days and almost fully implemented in the following week.

Although development went smoothly, we had some trouble on the gameplay-design side. We had planned a slingshot-like mechanic to give players more freedom to walk around during gameplay, but drift in the AR world made it hard for us to make slingshotting feel as good as it does on a PC. After weighing the trade-offs of each solution and the suggestion from our instructor, we pivoted to the “cannon booth” idea, which kept the educational purpose (learning X/Y coordinates) without changing the gameplay much.
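The educational core of the cannon booth reduces to checking a chosen (x, y) coordinate against the target’s position on the grid. A minimal sketch of that check; the tolerance value is illustrative, not the one we shipped:

```python
def is_hit(aimed, target, tolerance=0.5):
    """Return True when the aimed grid coordinate lands within
    `tolerance` units of the target on both axes."""
    return (abs(aimed[0] - target[0]) <= tolerance
            and abs(aimed[1] - target[1]) <= tolerance)

# Target sits at (3, 4) on the booth's coordinate plane.
print(is_hit((3, 4), (3, 4)))  # the exact answer
print(is_hit((2, 4), (3, 4)))  # off by one on the x axis
```

A per-axis tolerance fits the learning goal: a miss on x alone tells the student which coordinate to reconsider.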

2.4 Generic changes

  1. Unity Build settings change

The generic build settings recommended for our initial AR setup by the YouTube video we followed did not work. After some research and experimentation, we changed the scripting backend from Mono to IL2CPP and ticked ARM64 instead of ARMv7 for the target architectures. Build times increased, but the application now crashes only on very rare occasions.

  2. Instantiating objects more accurately

Throughout the semester, we worked on the drifting issue, not solving it completely but certainly alleviating it. Following several guides, we eventually added ARAnchor to reduce the drift level. This works for objects that stay fixed from start to end, but not for moving objects like Finley. Although not foolproof, it definitely helped reduce drift. We used anchors sparingly, as they add to the computational load on AR resources, and they need to be cleared properly as well.

  3. ARFoundation sample resources

As ARFoundation is still evolving constantly, some YouTube videos you find while researching may be outdated. The most useful resource we have explored so far is the arfoundation-samples repository, which is maintained by Unity. By building the sample application on a phone, you can try each technology and see directly how it works, which complements the online API documentation. If you are looking for more advanced functionality, you can also check out arfoundation-demos, which may require additional external resources.

  4. Deploying to both iOS and Android

Although our application was built primarily for Android, we also managed to build it for iOS. Minimal changes are required, as ARFoundation supports both platforms, and the Android build settings carry over. The main (and most difficult) step is obtaining an iOS certificate from a developer account.

  5. Preparing for publishing on the app store

There is a 100 MB restriction, and our application had grown to 150 MB and counting near the end of the semester. So we looked for online resources on optimizing builds, for example, tips for reducing build file sizes. These mainly revolved around removing unused assets, which on its own was a minimal change, though we did put in a lot of effort to remove unused audio files and assets to keep the project cleaner.

What actually helped was reducing the size of the assets we did use, as reported in the Unity editor log. For example, a texture PNG that is 5 MB on disk can take up far more space after the build. We mainly changed the texture import settings: max size from the default of 2048 down to 512 pixels, compression set to normal quality, and the resize algorithm left as Mitchell. We also applied crunch compression at 75% quality for even smaller sizes, and excluding the plugin for the ARFoundation Remote Installer helped as well. After playing through a few times, we saw no visible drop in the quality of objects in the scene on our Pixel 3, Pixel 4, or Samsung tablet.
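Most of the size win comes from the max-size cap: an uncompressed RGBA texture costs width × height × 4 bytes (plus roughly a third more for the mip chain), so capping 2048-pixel textures at 512 pixels is about a 16× cut before compression even applies. A quick back-of-the-envelope check:

```python
def texture_bytes(size, bytes_per_pixel=4, mipmaps=True):
    """Approximate in-memory cost of a square uncompressed texture."""
    base = size * size * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base  # mip chain adds ~1/3

MB = 1024 * 1024
print(texture_bytes(2048) / MB)  # roughly 21.3 MB at max size 2048
print(texture_bytes(512) / MB)   # roughly 1.3 MB at max size 512
```

Format compression (ETC/ASTC, crunch) then shrinks both numbers further, but the 16× ratio between them is what the max-size setting buys.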

3. Art

3.1 Finley

Out of all the art assets in this project, Finley took by far the most work. While modeling and rigging were not new to Sophia, one of the biggest things we wanted to get right from the start was an organized process kept as modular as possible. While Finley’s model and rig were being finalized, the programmers worked with a rough blocked-out model to get a sense of Finley’s scale and space, substituting in the final file as animations were finished. Coming from a film background, Sophia was used to working with feature-film rigs that have no game-engine requirements and allow complex constraints, locators, and scripts that can’t be exported to Unity. Finley’s rig was a new challenge in that all of the joints had to be connected, all blendshapes had to be baked into the animation, and the animation had to be baked with stepped playback to imitate a 12-frames-per-second feel.

Another new obstacle was altering the rig in real time while keeping the character appealing in AR. Finley was animated in Maya to look up at the guest, but we also had to account for students of different heights and for moving tablets. So, working with Sophia, the programmers took specific joints in the rig and had them follow the camera, overriding part of the rotation animation on those joints. That way, Finley could connect more with the player by constantly looking at them.
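The joint override amounts to computing, each frame, the yaw from the head joint toward the camera and clamping it so Finley never twists unnaturally. The math, sketched in Python (the clamp angle is illustrative; ours was tuned by eye in Unity):

```python
import math

def head_yaw_toward(head_pos, camera_pos, max_yaw_deg=70.0):
    """Yaw (degrees) to rotate the head joint so it faces the camera,
    clamped so the neck never twists past max_yaw_deg."""
    dx = camera_pos[0] - head_pos[0]
    dz = camera_pos[2] - head_pos[2]
    yaw = math.degrees(math.atan2(dx, dz))  # 0 means camera straight ahead (+z)
    return max(-max_yaw_deg, min(max_yaw_deg, yaw))

# Camera directly ahead of Finley: no turn needed.
print(head_yaw_toward((0, 1, 0), (0, 1.6, 2)))
# Camera far off to the right: the turn clamps at the limit.
print(head_yaw_toward((0, 1, 0), (5, 1.6, 0.5)))
```

In Unity this kind of override typically runs after the animator has posed the skeleton (e.g. in LateUpdate), so the look-at wins over the baked rotation on just that joint.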

Apart from the technical challenges, Finley’s animation itself was another obstacle to overcome. I (Sophia) had some experience animating characters, but this semester really pushed me to create more polished and stylized animations. It required me to record many takes of reference footage, gave me more experience working with animation layers in Maya, and gave me more insight into keeping an organized animation setup in Unity.

3.2 Environment Assets

One of the first things Sophia pushed for at the beginning of the semester was a simple, stylized environment. Because our team had only one artist, we wanted to trim the art scope as much as we could. To make each activity’s environment cohesive, we used the same color palette in every scene, with the addition of a toon shader.

3.3 Version Control

This semester, the tech team decided to use GitHub for programming version control. Sophia was unfamiliar with that workflow, and because she would not be making changes to the actual scenes, we ended up using Google Drive to deliver art assets to the tech team. Models were imported into a local Unity project on Sophia’s computer, and the prefabs and animator controllers she created were exported as a Unity asset package. The package, along with example screenshots showing how to set it up, was uploaded to Drive and posted in the WanderMath Slack channel. While not as quick as Perforce, this worked well for our team, and we managed to keep our files organized.

Possible Improvements

1. Production

This semester, we ran about five playtests in five different locations. Looking back, however, we were not well prepared for them.

We should have discussed the questionnaire together beforehand to better understand our audience and measure what we achieved. This was understandable for the first two playtests, since we didn’t know who would come to play our activities; we were basically sitting there waiting for kids to pass by, so it was hard to prepare questions that every guest could answer.

For the last playtest at WLU, however, we should have prepared questions covering not only the math concepts in our game but also how the kids felt about the activities: for example, whether kids who dislike math came away feeling better about it. Though it’s hard to get statistically solid results from playtesting with kids, we should still work toward that.

2. Tech

2.1 Approach activities from easy to difficult

It might have been better to design our activities in order from easy to difficult. For Activity 1, we had to learn folding, scaling, animating, and instantiating. For Activity 2, we additionally needed accurate measurements, drift handling, detection systems, and so on. For Activity 3, we could reuse all the technologies we had learned. Looking back, planning things out first and choosing a better sequence for the activities might have served us better.

2.2 Spend time and effort to make it more realistic with lighting

We spent a considerable amount of time building the mechanics, but everything we made looks magical rather than grounded in the real world. We are not sure how much realism is expected in AR, but given more time, we would research and apply realistic lighting in AR if possible.

2.3 Better time and task planning on individual activities

In general, we made a task list for each activity and allocated time to it, but we should also have budgeted time for playtesting and for resolving issues after each activity. Moreover, as we dug into the tasks, we kept discovering new ones that pushed past the deadlines we had originally agreed on. A good example is Activity 2, where we thought drawing would be the main mechanic, yet a huge part of the work boiled down to handling drift and to the mechanics triggered after a shape is drawn correctly.

3. Art

Looking back at the semester, there are a few things in each activity that could be improved. Because Finley’s animations took the most time out of each week’s art tasks, UI assets and feedback-oriented environment assets were pushed to the lowest priority. Our playtests later showed that we should have spent far more time on these assets, such as highlights and handles guiding the player where to tap. Our initial designs for each activity covered where the user should interact with the screen, but less so how to intuitively guide them there. The environment assets added to fix these confusing interactions were simple, a bit boring, and created only in our final week. While Finley received overwhelmingly good reviews from both faculty and playtesters, the environment was not at the same level of polish and sometimes made the gameplay less intuitive.


To sum up, we made a buffet of AR experiences for 4th and 5th graders, helping them build a conceptual understanding of math with the help of AR visualization. We have also published it to the Google Play Store so that teachers and students can play it anywhere, anytime. All in all, this was a fruitful semester, and we appreciate everyone who helped us along the way.