Ongoing Blog

Hi everyone, it’s the week before finals! We’ve just ended the open house tours, and we had a blast showing off our work to all of you who came by! Thanks to everyone who showed up; we loved talking with each and every one of you. We’ve finished most of the work on our project, so all that’s left is actually talking about it in next week’s presentations.

What’s Left

We’re spending most of the limited time we have left polishing our work and preparing our presentation for finals. That includes capturing videos of the experience, including footage for our new trailer:

 
 

Hope you’ve enjoyed hearing from us all this time. Sadly, all good things must come to an end.

Wrap Up

It’s been a crazy 14 weeks. Thank you all for staying with us and reading our blogs and helping us out with our project. We couldn’t have done this without all of you! This is the final blog post for this project, but look forward to hearing from many of us next semester, as we move on to other things. This is VESP, signing out!

Hey everyone! The semester is nearing the end, with only one more week and blog post after this. We’re still working hard on the project, but we’ve shifted more into smaller tweaks and other facets of the project (like our deliverables) rather than adding more features last minute.

What We’ve Been Up To

This week we had Softs presentations, so we ended up meeting with a lot of faculty. They gave us lots of feedback, so we’ve spent a bunch of time sorting through it and working out what we need to do with our limited remaining time. We want to thank everyone who came; we hope you enjoyed seeing our work!

Drafts of all our deliverables were also due this week, so we’ve been progressing on that part of our work as well.

On the topic of the experience itself, we’ve been touching up parts here and there. We have a new pedestal model for transitioning out of the spirit scene, new elephant animations, and an updated elephant scene user interface, which uses the watch from the squid scene. The squid has a new goal and collection mechanic, and the elephant landscape has been adjusted. On top of these, plenty of even smaller changes have been made across the project.

 

Next week is the last week before finals. We’ll be wrapping up everything from this semester. We’re still planning on making changes even into next week, as we’re still scheduling playtests and incorporating feedback, but the changes should be minor since we need to finish everything up. Hope to see you back for our final blog update!

Hey everyone! Softs is coming up next Monday, so we’ve been getting things done to prepare for that. We’ve got a new trailer up in our media section, but I’ll still link it here for you!

 

 

What We’ve Been Up To

Last week we spent a bunch of time getting our elephant scene up and running again. It had been broken for a while, but now it’s better than ever. The elephant walks better, the sound gradually fades as your hands get closer to the ground, and the world is now fleshed out instead of missing pieces.
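
For the curious, here’s a minimal sketch of the kind of height-to-volume mapping behind a fade like that, assuming a simple linear falloff between two hand heights. The function name and thresholds are hypothetical tuning values, not our actual code:

```python
def ambient_volume(hand_height, fade_start=0.5, fade_end=0.1):
    """Map the player's hand height (meters above the floor) to a volume
    multiplier: full volume above fade_start, silent below fade_end,
    linear in between."""
    if hand_height >= fade_start:
        return 1.0
    if hand_height <= fade_end:
        return 0.0
    return (hand_height - fade_end) / (fade_start - fade_end)
```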

Other than the elephant stuff, we’ve been doing work in the other animal scenes too. The bat and squid scenes both have environment updates, with the jungle gaining more variety and the squid scene getting more content overall. The squid also has new animations in. The bat scene now has a cool progress tracker, so it’s easier to see how many moths you’ve eaten.

Over on the sound side, things have been progressing nicely. The elephant scene’s sounds have been completely revamped, and we’ve added sounds to the spirit scene. The bat now has a docent too, so every animal scene has one, and we’ve added a way to trigger the docent in each scene so the player can start it when they’re ready. All the sounds have now been converted to FMOD, so our sound designer can tweak everything to their satisfaction, and has been doing just that!

 

Next week is Softs, so we’ll be presenting this trailer and getting feedback. Our work on the experience is close to complete. There are still things here and there that we want to work on, change, and fix, though, so we’re not done yet! See you all next week!

 

Hello everyone! 

This week ended with CMU’s carnival. We had a lot of fun catching up with the rest of our peers in other projects, as well as hanging out with the staff in our open brunch session. It was a great ending to a productive week working on our project.

What We’ve Been Up To

We’re still making consistent progress on our experience. This week we focused on improving our squid experience. The squid’s audio has been moved to FMOD and iterated on, so now our sound designer has full control over how the sound plays in the squid scene. The squid’s model has also been upgraded, although that is not yet in the scene. Some of the scripts have been updated too: the big squid which appears in the scene now tracks whether you are properly hiding or not.

Outside of the squid scene, we’ve been working on fixing our controller bindings and buttons. Now the project should run properly on any VR device (or at least as properly as it normally runs). This has been pretty hard to track and test – since none of us has a second VR device to try changes on – so there are probably still a few kinks to work out.

In other scenes, we’ve been working on the elephant, so that you can see the vibrations in the ground coming towards you rather than only feeling them. It’s a pretty cool effect which we hope shows up well.
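
As a rough illustration (not our actual implementation), an effect like this can be driven by an expanding ring whose radius grows with time; the wave speed and names below are made-up stand-ins:

```python
import math

WAVE_SPEED = 40.0  # meters per second; a made-up tuning value

class GroundRipple:
    """An expanding ring that visualizes one vibration traveling outward
    from its source across the ground plane (positions are (x, z))."""
    def __init__(self, source_pos, start_time):
        self.source_pos = source_pos
        self.start_time = start_time

    def radius(self, now):
        """How far the ring has traveled since the vibration started."""
        return WAVE_SPEED * (now - self.start_time)

    def has_reached(self, player_pos, now):
        """True once the visible ring has passed the player's position."""
        dx = player_pos[0] - self.source_pos[0]
        dz = player_pos[1] - self.source_pos[1]
        return self.radius(now) >= math.hypot(dx, dz)
```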

 

Next week is our last week before softs, and we still have a lot of things to do. We’re now focusing on our elephant and spirit scenes. We’re also getting some playtests done, squashing bugs we find along the way, and streamlining things that playtesters find unintuitive. By the end of next week, we’ll have videos and a build ready to present. The end is in sight. See you next week!

Hi everyone.

Hope you had a great holiday weekend like we did! Some of us had family events on, and some of us went travelling. The rest of us have been hard at work refining our experience (as we always are!).

What We’ve Been Up To

As our refining of the bat scene winds down, we’ve moved on to refining our squid scene. We’ve been thinking a lot about how to add agency for the player, since there isn’t much for the player to do in our current iteration of the squid scene. We’re thinking about what goals the player could be given and what would be achievable for them. Other than that, we’re working on smaller things, such as updating the sounds, models, and the environment around our squid.

In other news, our scenes are now connected! We’ve worked on a menu screen, so now you can travel back to the scene selection scene from any of the animal scenes. Also, the scene is somewhat paused while you’re in a menu, so you can drop out and do some real life stuff before coming back to continue (not that we’re expecting you to want to tear yourself away from our game, but things can happen).

Finally, we’ve also been working on our scene selection scene, which we’ve been calling our spirit scene. It’s just there to bring you to the animal scenes, but hopefully we can make it all feel connected and natural, not to mention awesome looking.


We’re still grinding hard and will be right up until the deadline. Softs are coming up soon, so we’re definitely feeling the pressure!

See you all next time!

Hello everybody!

This week we continued working on our final experience. Updates will be a bit shorter from now on, as we switch from prototyping to refining our experience and have fewer new things to show off. However, we will continue doing weekly updates, so that all of you lovely people know where we are in our progress!

What We’ve Been Up To

We’ve been focusing a lot on our first experience – the bat – this past week. We have all-new models, animations, sounds, visual effects, and flying mechanics! We were pretty happy with the interactions in the scene to begin with, so a lot of the polish just makes everything feel better rather than changing up the gameplay. We’ve also adjusted the player slightly, making them smaller to bring them closer to the world and the things in it. We’re still working on visualizing the player: before, we were using the default controller model from the SteamVR demo; now we’ve been experimenting with a stick for the bat to hang off of, like a tree branch. It’s still very rough, so we’ll keep trying things and keep you up to date.

We’ve also started working on a scene selection screen. It’s still very basic in terms of visuals, but it serves its purpose as a way to start the game and choose which experience you want to enter. Currently, we’re using the point-and-click method from our squid camouflaging mechanic; we’re considering whether we can make it more immersive while keeping the intuitiveness that point-and-click provides.
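
Under the hood, point-and-click in VR typically reduces to a ray test from the controller. Here’s a minimal sketch of that idea against a spherical target; the function and the sphere targets are hypothetical illustrations, not our menu code:

```python
import numpy as np

def ray_hits_sphere(ray_origin, ray_dir, center, radius):
    """Return True if the controller's pointing ray intersects a spherical
    selection target. ray_dir is assumed to be normalized."""
    origin = np.asarray(ray_origin, dtype=float)
    direction = np.asarray(ray_dir, dtype=float)
    center = np.asarray(center, dtype=float)
    to_center = center - origin
    t = np.dot(to_center, direction)   # closest approach along the ray
    if t < 0:
        return False                    # target is behind the controller
    closest = origin + t * direction
    return np.linalg.norm(center - closest) <= radius
```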

Finally, the last thing we want to mention is that we’re focusing more on playtesting than before. Now that our final experience is up and running, we have an easier way to test all our experiences together. We’ve already started contacting people and arranging dates. If you want to try out our game, we’d love to have you! Just contact us through our “About” page.

Wishing you all a happy holiday!
See you next time!

Hi everyone! We’re just a week into focusing on our final experience and we’re going full force! Firstly, there’s no video this week because we are in the middle of a bunch of changes to a variety of our old prototypes and none of them are ready yet, but look forward to seeing them soon!

What We’ve Been Up To

As a refresher from last week: we’ve decided to make our final experience a combination of our bat, squid, and elephant experiences. Over the past week, we’ve been compiling a list of things we want to iterate on and polish from our older prototypes, and discussing how we want to combine the experiences into one. We had an idea for a grand narrative and spent a long time discussing and planning for it, but due to time pressure we ultimately decided to keep it simple so that we can focus on the animal experiences. So for now, we’re planning on implementing a simple animal selection scene. Other than that, we’ve started refining our bat prototype. The general mechanics are the same, but there are many tweaks to the audio, visuals, and general feel of the experience, so we hope you enjoy what we’ve done!

Project Management

Since we are working on a bigger project now, we’ve started utilizing task managing software. We’re using AirTable to organize and assign our tasks.

Tasks in progress – secretly! Look forward to seeing what this becomes!

We’re also using AirTable’s automation features, along with Zapier’s! With automation, our pipeline notifies each of us when there are tasks to be taken care of, so we never miss a beat!

The end of the semester is in sight, and we’re all really excited for our big finale. Tune in for our upcoming blogs!

Hey everyone! This week we had 1/2s presentations, which means that we are roughly halfway through this project and our semester! We spent a lot of time preparing for halves, so we didn’t do too much work on the project itself. If you missed our presentation, the video is up on YouTube, go check it out! (along with the other presentations for day 3)

 

Updated Videos

We redid some of our earlier videos for our presentation, take a look at them below!

General Updates

We are done with the prototyping phase of our project. Initially, we intended to continue prototyping a bit past halves, covering 6 animals in total. After some consideration and the feedback we got at quarters, we cut that time and scope down to the 4 animals we have already shown.

With the extra time this frees up, we want to focus more on our final experience, which we have expanded in scope to include 3 animals, so that players can appreciate the differences and variety between the senses.

What Happens Now

We start work again! This time we have more room to think things through, expand on our experience, and polish it, because we will be working on it throughout the rest of the semester. That being said, our blog updates might be a little less exciting, because we aren’t sprinting through rapid prototypes anymore. Still, we’ll do our best to keep you informed, excited, and entertained! Tune in next week as we begin our final stretch!

Coming Up Next Week…

Ongoing production updates!

Elephants

We are feeling elefantastic about these socially intelligent and emotionally gifted pachyderms!

Exploration this past week centered on infrasound: low-frequency vibrations that elephants can feel through the earth.

 

Elephants can use this infrasonic knowledge to adjust their long-range navigation to better serve their goals and needs. In many cases, that means finding other members of their species and avoiding potentially dangerous weather that could be headed their way!

Features

In this experience, you are in a giant savanna. The world is vast and filled with things to explore. There is a day/night cycle, and clouds move across the sky. It is the most expansive world we have created thus far.

You are accompanied by an elephant as you try to lead it back to its herd. The elephant follows you, and once you have decided on a direction and walk that way (in real life), the whole scene moves in that direction, bringing you closer to (or farther from!) the herd.

To figure out which direction you need to go, you have access to the elephant’s infrasonic detection. When your controllers are close to the ground, you enter an “underground” state, where you can feel vibrations through your controller. The vibrations are on a delay based on the distance from the source to your controller, so you can triangulate where the vibrations are coming from. Since the vibrations are from the rumblings of the elephant’s herd in the distance, you can figure out where the herd is and guide the lost elephant back home.
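
To make the delay idea concrete, here’s a minimal sketch assuming vibrations travel at a fixed, made-up speed, so the controller nearer the herd buzzes first. All names and numbers are illustrative stand-ins for our actual implementation:

```python
import math

SEISMIC_SPEED = 60.0  # m/s; a gameplay tuning value, not a real-world figure

def haptic_delay(source_pos, controller_pos):
    """Seconds between the herd's rumble and the buzz on one controller.
    Positions are (x, z) on the ground plane; delay grows with distance."""
    dx = source_pos[0] - controller_pos[0]
    dz = source_pos[1] - controller_pos[1]
    return math.hypot(dx, dz) / SEISMIC_SPEED

def schedule_pulses(rumble_time, herd_pos, left_pos, right_pos):
    """Return (left_buzz_time, right_buzz_time). The controller nearer the
    herd buzzes first, which is what lets the player triangulate."""
    return (rumble_time + haptic_delay(herd_pos, left_pos),
            rumble_time + haptic_delay(herd_pos, right_pos))
```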

Considerations

Dr. Design here with some of our big ontological dilemmas!

For the sake of true experimentation, we always want to use novel traits in each prototype. For this one, we used a novel…

  1. Animal (elephant)
  2. Sense (infrasonic tactility)
  3. Mapping (highly focused on haptics, kinesthetic placement, sonic feedback)
  4. Environment (Savannah)
  5. Locomotion (passive tracking, light-speed jumps)
  6. Interactivity (low-input >> high-attention to sensory stimuli)
  7. Player Scale (continental — you are hundreds of feet tall!)
  8. Objective (Reunification of the herd/parade)
  9. Sense Objects (Weather, movements of other individuals)
  10. Emotional Arc (loneliness <–> social connection)
 

Mr. General Programmer here again, to talk about locomotion and other VR development topics. For movement, we decided to try something with more physicality. In this experience, we use the full play area, with the player walking around inside the environment. The elephant follows the player, and the entire room teleports if the elephant walks far enough away from the center of “the room”. This worked out in some ways: since the elephant follows your position, it’s easy to navigate in the direction you want to go, and it doesn’t require any buttons, so players grasp the control scheme quickly. However, every time you wanted to travel, you needed to wait for the elephant to amble over to you, which took longer than was fun (it is a casual, slow-moving elephant, after all). Ultimately, the long delay in movement makes me think this is probably not an ideal method of locomotion. Even with a faster animal, I struggle to see this working well, because then the animal would catch up to you too quickly and you wouldn’t get to experience it walking around.
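
For flavor, here’s a minimal sketch of the room-teleport check described above; the threshold and names are hypothetical tuning values rather than our real ones:

```python
import math

TELEPORT_RADIUS = 4.0  # meters from the room center before the room jumps

def maybe_teleport_room(room_center, elephant_pos):
    """If the elephant has wandered far enough from the center of the play
    space, snap the whole 'room' to the elephant (the light-speed jump)."""
    dx = elephant_pos[0] - room_center[0]
    dz = elephant_pos[1] - room_center[1]
    if math.hypot(dx, dz) > TELEPORT_RADIUS:
        return elephant_pos   # recenter the play space on the elephant
    return room_center        # otherwise leave the room where it is
```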

Another thing that came up during our meeting with our advisors: Heather raised a point about accessibility. To accommodate this, we added a button which toggles whether you are automatically in the “underground” state. While we hadn’t set out to design specifically for impaired players (because we don’t want to constrain our creations artificially), the toggle ended up being useful in testing, since my setup is wonky and it’s hard to physically hold my controllers low enough to enter “underground” mode without them losing connection to the base station. It was a good reminder that making our features accessible matters: people need to be able to actually reach the cool things we implement without getting frustrated trying to make them work.

Future Iterations

Even though we feel this is one of our better prototypes, that doesn’t mean there isn’t room for additions and improvements. For starters, right now the rumble and haptics are driven solely by the sounds from the herd. We could expand them to react to other sources in the environment, such as predators or weather conditions which you need to avoid. This adds an additional layer of thought: interpreting what the vibrations you feel could be, rather than just where they are coming from.

Another thing we could add is different states for the elephant. Our savanna environment follows a day/night cycle, so it would be interesting if the elephant also exhibited different behavior patterns during each of these phases, or had needs which you are required to fulfill.

One last interesting thing we could include would be walking around with the herd once you find them. Currently, you win when you reach the herd, and they are just hanging out at a random spot in the world. It would be a different experience to be with a group of elephants rather than just one, and you could walk towards a new goal, maybe searching for a single lost elephant rather than rejoining a group.

Coming Up Next…

                            …Halves are coming up! What are we up to next?
🐘🦑🦇🐝

Squids

It’s almost the end of the week, and our squid prototype is moving along smoothly. Since we gave ourselves 1 1/2 weeks for this prototype due to 1/4s, we wanted to be done by Wednesday. It went on a bit longer than we would’ve liked, but the end result looks great because of it.

Features

For this experience, we wanted to try out a different goal from our other 2 experiences. Since squids are considered a delicacy by all manner of ocean life, we experimented with having the squid hide from a predator instead of searching for something on its own. To that end, we have an environment that the player can move around in with their squid. There are 2 phases: a peaceful phase and a dangerous phase. In the peaceful phase, many other small squid swim around with the player, and calm music plays. When it transitions to the dangerous phase, the small squids disappear and a giant squid passes overhead. The music turns a lot more foreboding, and we hope the player feels threatened by the situation. After the big squid travels far enough, the scene transitions back to the peaceful phase, and the small squids return.

The player controls a squid in the world. They can move with a slingshot motion: pull the trigger, pull back the controller, and release the trigger. It’s similar to nocking and firing an arrow in VR bow demos, but single-handed rather than two-handed. This lets you casually move around with one hand while doing something else with the other!

The squid can see polarized light: the orientation and angle at which light travels. We visualized this by exaggerating normal maps and contrast, emphasizing how surfaces at different angles reflect light from the source differently.

The player can also somewhat control the squid’s chromatophores. Squids have the ability to alter how light reflects off of them, letting them change color and more, blending in with their surroundings. Players can point and click on objects to camouflage their squid to look like that object; the squid’s color, and even some of how light reflects off of it, changes. If you get a chance to try the experience, try clicking on a variety of things!
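
To give a feel for the slingshot mechanic, here’s a minimal sketch under simplifying assumptions; the max draw distance, launch speed, and class names are all made-up tuning values:

```python
import numpy as np

MAX_DRAW = 0.6    # meters of pull for full power (made-up tuning value)
MAX_SPEED = 5.0   # squid launch speed at full draw (made-up tuning value)

class SlingshotMove:
    """One-handed slingshot: squeeze the trigger to anchor, pull the
    controller back to charge, release to launch the squid."""
    def __init__(self):
        self.anchor = None

    def trigger_down(self, controller_pos):
        self.anchor = np.asarray(controller_pos, dtype=float)

    def trigger_up(self, controller_pos):
        if self.anchor is None:
            return np.zeros(3)
        pull = self.anchor - np.asarray(controller_pos, dtype=float)
        self.anchor = None
        dist = np.linalg.norm(pull)
        if dist < 1e-6:
            return np.zeros(3)
        charge = min(dist, MAX_DRAW) / MAX_DRAW    # 0..1 power from draw length
        return (pull / dist) * charge * MAX_SPEED  # launch velocity
```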

Considerations

Mr. General Programmer here again with updates on locomotion. This time we tried a charging mechanic with the squid. Squids draw in water and squirt it out with force to move, a bit like an impulse thruster. We tried to capture the feel of this 2-step process: building up, then letting go. Implementation-wise, this is reminiscent of all the VR bow experiences, with drawing an arrow and shooting it. Since we didn’t need the front half of the bow, that freed up one hand for other functions. In terms of feel, drawing back with 1 hand is slightly awkward, as you lose your physical point of reference once you start charging. As such, we leaned on haptics to indicate the current charge – more vibration for more power – which, combined with the squid visibly pointing in the direction it will fire, gives decent feedback. Future experiments might include tracking how long the trigger is held instead of how far the controller is pulled. One thing that stood out: depending on a player’s handedness, which hand controls movement and which controls everything else matters a lot for intuition. This is the first time we encountered this, since the bee uses both hands and the bat only uses 1 controller in total.
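
Here’s a small sketch of the two charge mappings discussed above (distance pulled versus time held), plus the haptic amplitude tied to the charge; all numbers and names are hypothetical:

```python
def charge_from_distance(draw_dist, max_draw=0.6):
    """Charge in [0, 1] from how far back the controller was pulled."""
    return max(0.0, min(draw_dist / max_draw, 1.0))

def charge_from_hold_time(seconds_held, full_charge_time=1.5):
    """The alternative mentioned above: charge from trigger hold time."""
    return max(0.0, min(seconds_held / full_charge_time, 1.0))

def haptic_amplitude(charge):
    """More vibration for more power, standing in for the physical
    reference point you lose when drawing back one-handed."""
    return charge  # 0..1 amplitude fed to the controller's haptic pulse
```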

Future Iterations

As always, we have this section so that we can think about what to add and work on if we continue with this animal for the final phase.

One big thing we need is for the environment and other animals to react to the squid. Right now, even though the setting has the squid hiding from a predator, the predator just follows its path and ignores the squid. Changing this means figuring out how the predator should react if it sees the squid, and how much of a failure state we want for players, so it’s going to take a bit of work to get running.

Along the same lines, right now we have a squid with the cool ability to change what it looks like, but it doesn’t do anything with that power. We could find more ways to use it, for example drawing on how real squids communicate through their chromatophores and letting our squid interact with the other friendly small squids in the environment.

Another thing we could do is add a goal, so that the player has something to aim for. One simple option is having the player travel from their current position to somewhere far away. This also changes their environment as they play, making them pay attention to their surroundings as the things they can blend into keep changing. It would be quite an undertaking, though, as it would require a lot more level design than our current prototype has.

Coming Up Next…

                         …Elephants! 🐘

Hi everyone! We know we promised squids in the last blog entry, but we wanted to pivot for a bit to discuss some other things. Squids will have to come later, when we have our prototype fully operational. Firstly, we had our 1/4s walkaround and sit-down this week. We would like to thank everyone for all of your helpful and insightful advice and opinions! It really meant a lot to us. That being said, we had a lot of varied opinions to sift through, and so we’ll be discussing some of that today.

New Videos

First up, we had new videos to show for our first two prototypes. Re-introducing, the Bat and Bee!

First-person vs. Third-person

One piece of feedback we got early in the walkarounds was that people didn’t get the feeling of being a bat from watching our bat prototype. Unlike the bee prototype, where you play as the bee, you only control the bat in that prototype. They were obviously right, since you aren’t the bat in the experience. We discussed this a lot, since some of us preferred the bat experience over the bee experience: being able to see and move the bat gave them a closer connection to it, compared with the bee, which you can’t see and can’t immediately relate to. After much back and forth, we ultimately agreed that it was worth losing the ability to actually be the animal in exchange for the extra empathy and relatability that comes from hanging out with the animal. As such, we have updated our Madlibs phrase, which those of you who have seen our pitch or read our week 1 blog will recognize:
“You and a(n) animal pursuing objective using sense.”
We hope this phrase will be more effective in achieving our newly updated goal – increasing and promoting empathy with animals and our natural environment.

Senses

We’ve spent a lot of this blog and our videos showing off what we’ve done in our experiences and prototypes. A lot of it is only tangentially related to the senses, and having too many extras might detract from the core idea of using an animal’s senses to accomplish something. Also, our humanization of these senses has so far been largely visual, which is limited in scope and something we want to expand beyond, especially since we fought so hard to use VR precisely so we could explore other methods of feedback. With our new goal, that isn’t as big a concern anymore, but it is definitely something we want to improve. Right now we don’t have any good ideas to remedy this, but it is on our minds as we continue iterating and prototyping throughout the rest of the semester.

Animal Bits and Pieces

We received a lot of interesting feedback regarding animals and other things in general. To start with, at walkarounds we showed the list of animals we were planning to work on during our prototyping phase. We chose them based on the senses we wanted to explore. Since then, we’ve gotten feedback on other factors we might want to consider as well. Some things that were brought up include:
  1. Animal size (if they are similar size to us, we can more directly imagine the things being experienced differently from the animals’ point of view)
  2. Animal proximity to us (people wanted to experience animals they were close to i.e. dogs, cats, etc.)
  3. External interests (we brought forth the idea of bringing this idea to zoos and museums, so they would need to care about the animals we choose)
  4. Animal actions (as brought up earlier, we are trying to implement senses in more interesting ways, and one way to do that is to choose animals which don’t move around as much, so we can focus elsewhere)
That being said, we now have to evaluate each animal more carefully, as there are many reasons why we might want to include or exclude any specific one. Our new lineup will likely still stick close to this list (due to inertia), but the list is now much more tentative and subject to change; it is more a guideline of things we want to do than a definitive plan. We are even considering changing how many animals we prototype, either to give ourselves more time to explore or, alternatively, more time to work on our final experience. Be sure to keep up with our blog so you are up to date on all our decisions!

Closing Thoughts

This week has been amazing for us in terms of progression towards our overall image and goal for this project. We loved hearing all of your thoughts and feelings, and it challenged us to really think about our direction and where we are headed. We are always open to feedback, so please send us more!

Coming Up Next…

                         …Squids! 🦑

(For real)

Bees

Week 3 is nearing its end, and we have completed our bee prototype! As stated in our last blog, we made this prototype in 1 week to try out different sprint timings. Generally, we are happier with the 1-week format, as we feel it gives us enough time to accomplish what we want with these prototypes. That being said, our next sprint is going to be 1 1/2 weeks long, because 1/4s are coming up and we aren’t sure how much time we will lose to that process.

Features

In this prototype we experience being a bee in a field of flowers. The flower placement is procedurally generated, with a new field being generated each time you load the prototype (although if you wanted to stick with a specific arrangement, that is possible too!).
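
As a minimal sketch of “procedurally generated, but repeatable”: seed the random number generator to lock in a specific arrangement, or leave it unseeded for a fresh field every load. The counts, sizes, and names below are made up, not our actual generator:

```python
import random

def generate_flower_field(count=200, extent=30.0, seed=None):
    """Scatter flower positions (x, z) across a square field. Pass a fixed
    seed to reproduce one specific arrangement; leave it as None to get a
    brand-new field every time the prototype loads."""
    rng = random.Random(seed)
    return [(rng.uniform(-extent, extent), rng.uniform(-extent, extent))
            for _ in range(count)]
```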

There are a lot of other bees in the field too! They fly from flower to flower, collecting pollen – and neutralizing the flower’s electromagnetic charge in real life – from flowers which they touch. Our bee also wants to collect pollen, and it won’t be stopped by other bees!

When flowers are left alone, they slowly produce pollen. In real life, they also accumulate an electromagnetic charge from being out in the sun. Bees are able to sense this charge, so they can tell whether a flower has lots of pollen on it or whether another bee has recently claimed the pollen for itself. In effect, the other bees are telling our bee which flowers to visit, without talking or signing to each other at all.
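
A minimal sketch of this loop as a game system, assuming charge regrows linearly while a flower is untouched and a bee’s visit zeroes it out; the rates and names are made-up stand-ins:

```python
CHARGE_RATE = 0.05  # charge regained per second in the sun (made up)
MAX_CHARGE = 1.0

class Flower:
    """A flower slowly rebuilds pollen and electromagnetic charge while it
    is left alone; a visiting bee takes the pollen and neutralizes it."""
    def __init__(self):
        self.charge = MAX_CHARGE

    def update(self, dt):
        """Called every frame: regrow charge toward the maximum."""
        self.charge = min(self.charge + CHARGE_RATE * dt, MAX_CHARGE)

    def visit(self):
        """A bee lands: yield pollen in proportion to the current charge,
        then zero it out so other bees can sense it was just claimed."""
        pollen = self.charge
        self.charge = 0.0
        return pollen
```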

Our bee can fly to tasty flowers which other bees haven’t visited. By holding the VR controllers and squeezing the triggers, the player simulates the bee flying with its wings. The motion is best described as airplane-like, with arms extended; players can tilt their bodies to influence the direction of the bee.
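
For a rough idea of how tilt-steered flight like this can be computed, here’s a simplified sketch, not our actual controls; the constants and names are hypothetical, and hand positions are (x, y, z) with y up:

```python
import numpy as np

THRUST = 1.5          # deliberately low, for a lazy, wobbly bee
TURN_RATE = 1.2       # radians/sec of turn at full bank
NEUTRAL_HEIGHT = 1.3  # hand height (m) for level flight; all values made up

def steer_bee(heading, left_hand, right_hand, dt):
    """Airplane-style steering: a height difference between the two
    outstretched hands banks the bee and turns its heading; raising or
    lowering both hands together makes it climb or dive."""
    bank = right_hand[1] - left_hand[1]   # roll from tilting the arms
    heading += TURN_RATE * bank * dt      # banking turns the bee
    climb = (left_hand[1] + right_hand[1]) / 2.0 - NEUTRAL_HEIGHT
    velocity = THRUST * np.array([np.cos(heading), climb, np.sin(heading)])
    return heading, velocity
```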

Considerations

Continuing last week’s discussion, we tried airplane controls for this prototype. Another idea we considered was thrusters, similar to how Iron Man flies, but we settled on airplane controls because they are more intuitive for our target audience and feel more natural, as opposed to the futuristic image Iron Man controls would give. Generally, we think it worked out quite well. We made the thrust quite low, and the motion is somewhat wobbly because it’s hard to hold the angle of thrust in one direction for long. Our programmer thinks this suits the image of a bee lazily wobbling from flower to flower. On the other hand, our producer said it was quite a workout, which is the opposite of what a bee would feel, since bees can cover long times and distances on very little energy. Also, if we put this on a fast-moving animal, it would quickly cause a lot of motion sickness.

Future Iterations

If we continue working on this prototype, some of the features we could work on include refining the bee AI and expanding on the flowers.

Right now each bee acts independently as a singular entity. However, real bees are more structured than that, with communication and signals. If we iterate on this, the bees could be controlled by a swarm director which makes them act in unity with one another, mirroring their real-life behavior.

Along with this, we could let the player exchange information with other bees. This could be used to influence the other bees’ actions, giving the player more room to shape their surroundings. Potentially, we could add more structure to the game, with a beehive and queen bee giving out objectives to the player.

For flowers, currently you play in a single field, where the bees visit each flower with equal weight. We could expand the environment to include multiple fields, with bees rotating between them as they deplete the pollen in one. This gives the player more direction: the flowers in their field dry up, and they need to search for greener pastures.

Coming Up Next…

                         …Squids! 🦑

Bats

For the first and second week of our project, we made a prototype of a Bat in a tropical Indonesian rainforest!

Starting out, we were (and still are!) unsure whether to go with 1-week or 2-week prototype sprints. Initially we wanted to go with 1-week sprints; however, some of our seniors and advisors cautioned that it might be too ambitious. To resolve this, we decided to start with a 2-week sprint followed by a 1-week sprint, and gauge what feels like a good pace after that. We made our first prototype a 2-week sprint to leave room for adjusting to the new school term and getting administrative things ready. So here we are!

Features

In this scene, we have a bat which flies around a rainforest. The bat follows your controller’s position, accelerating to catch up to it if necessary. If you hold down the controller’s trigger, the bat emits its calls. The harder you squeeze the trigger, the faster and higher-pitched the calls become. In real life, bats use their lower-frequency calls as long-range sonar, since low frequencies travel much farther than higher-pitched calls. They use their higher-frequency calls to get more detail on close objects, such as when catching nearby prey.
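
A minimal sketch of a trigger-to-call mapping like the one described; the specific rates and pitch range are made-up tuning values, not our real numbers:

```python
def call_parameters(trigger_pressure):
    """Map trigger pressure (0..1) to the bat's call. Light pressure gives
    slow, low-pitched long-range calls; full pressure gives rapid,
    high-pitched close-range calls. All numbers are tuning guesses."""
    calls_per_second = 2.0 + 8.0 * trigger_pressure
    pitch_multiplier = 1.0 + 1.5 * trigger_pressure  # playback-rate scale
    return calls_per_second, pitch_multiplier
```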

The calls from the bat are not only audible, as our main theme for this prototype is echolocation! Using the power of technology, we have translated the bat’s immense auditory ability into visual language. While the bat is calling out, a visual pulse is emitted from the bat in an arc in front of it, illuminating the rainforest. This wave mimics the bat’s ability to use echolocation to “see” in front of it: objects the wave touches reflect the sound back to the bat, letting it judge how far away they are.
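
As a simplified sketch of the pulse (ignoring the forward-arc check for brevity), an object lights up while the expanding call front passes over it; the speed and width are hypothetical tuning values:

```python
import math

PULSE_SPEED = 20.0  # m/s, tuned for readability rather than realism
PULSE_WIDTH = 2.0   # thickness of the visible band (made-up values)

def pulse_reveals(bat_pos, obj_pos, time_since_call):
    """An object is lit while the expanding call front passes over it."""
    front = PULSE_SPEED * time_since_call
    dist = math.dist(bat_pos, obj_pos)
    return front - PULSE_WIDTH <= dist <= front
```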

In the rainforest, there are also other living beings – moths! Moths fly around lazily from point to point, unable to hear the calls from the bat. This makes them easy targets for our bat, who loves to eat them (actually, that’s not completely true, but we’ll leave it for you to find out on your own when you get a chance to play our game!).

Considerations

One concern our general programmer had was his inexperience with VR coding and best practices for handling game mechanics, especially movement. He has taken the opportunity to experiment with a different movement system during each of these prototyping sprints. For the bat, the player stands in the middle of the world, directing the bat around with their controller. He chose this because it was the easiest to conceptualize (he did not have his VR headset for this sprint!) and implement. While this is very straightforward and intuitive for players, it puts a hard limit on how far the player can move the bat, essentially confining it to the box defined by the player’s physical play area. This constricted space doesn’t really suit the bat, which is notable for its speed and nimbleness, something it often relies on to avoid predators. Also, since the bat’s movement is front and center, time and effort needs to be spent making it as smooth and familiar as possible, because it can and will detract from the immersion of the world if done improperly.
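
A minimal sketch of follow-with-catch-up steering like this, where the bat’s velocity is blended toward the controller each frame; the speed and acceleration constants are made up:

```python
import numpy as np

MAX_SPEED = 6.0  # caps how fast the bat chases the controller (made up)
CATCH_UP = 4.0   # how aggressively it accelerates toward the target (made up)

def follow_step(bat_pos, bat_vel, target_pos, dt):
    """Per-frame update: steer the bat's velocity toward the controller's
    position so it smoothly catches up instead of snapping to it."""
    bat_pos = np.asarray(bat_pos, dtype=float)
    bat_vel = np.asarray(bat_vel, dtype=float)
    to_target = np.asarray(target_pos, dtype=float) - bat_pos
    dist = np.linalg.norm(to_target)
    if dist > 1e-6:
        # Head for the target, but never so fast that we overshoot it.
        desired = to_target / dist * min(MAX_SPEED, dist / dt)
    else:
        desired = np.zeros(3)
    blend = min(CATCH_UP * dt, 1.0)
    bat_vel = bat_vel + (desired - bat_vel) * blend
    return bat_pos + bat_vel * dt, bat_vel
```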

Future Iterations

Ultimately, we plan to return to one of these prototypes in the latter half of the semester and continue polishing and tuning it into an amazing experience for everyone. As such, we plan to have this section for each of our prototypes, so that we are always looking toward the future and the exciting possibilities each prototype holds! Of course, general polish and tweaks are always expected, so this section is reserved for standout features.

One cool piece of vegetation we did not get into our prototype is the pitcher plant. Pitcher plants and bats have a symbiotic relationship: bats provide the plants with nutrition, and the plants provide the bats with vision. The plants have a hollow center which amplifies the bat’s call, allowing it to “see” much farther than it normally could. Adding pitcher plants to the scene and having them echo our bat’s echolocation calls could produce some amazing visuals, with players wanting to stay close to them all the time, just like in real life!

Another thing we would be excited to add is material-aware shaders and visuals. Currently, our bat sees the world as we would see it when it uses its echolocation. However, echoes react to materials differently than light reflecting off surfaces does: sound is mainly affected by an object’s texture, while light is affected by reflectivity and absorption. As such, what a bat “sees” through echolocation would be vastly different from what people see with their eyes. Exploring what “colors” that would translate to, and how we could interpret them, is an exciting avenue for progression.

Coming Up Next…

                         …Bees!  🐝

We Are VESP

Hi Everyone! Thank you for reading our blog and coming to our website. Virtual Extended Sensory Palate – VESP for short – is a team at the ETC working on an edutainment experience that brings real animal senses to people through virtual reality. We are passionate about the natural world and want to help spread this passion to people just like you! We plan to start with small prototypes based around different animals, utilizing their specific senses. Each prototype will be centered on the following phrase: “You are a(n) animal pursuing objective using sense.” Ultimately, we plan to produce a final experience based on one of these prototypes, alongside a PR package to give to various locations to show potential in this field and to encourage further exploration. We hope you enjoy your journey as much as we will enjoy making it!

Meet the Team Members

Noah Kankanala

  • Our Sound Designer and Composer
  • Uses his water bottle as a mic stand

Lewis Koh

  • Our Koh Lewis
  • Currently in Singapore

Stefani Taskas

  • Our Shader Programmer
  • Plays D&D

Sophia Videva

  • Our Technical Artist and Rigger
  • Lives with a cat

Jack Wesson

  • Our Producer and Design Director
  • Proficient in Elvish

Coming Up Next Week…

                                           …Bats! 🦇