All posts by Ryan

VR DevLog 14: Audio

Posted in Audio, Journal, Super Mega Mega, VR

One of the things I've always wanted to focus on is music and SFX. After chatting with a mate during the EBExpo, I think I've decided how to proceed with the audio for SMM.

Most of the time I use third-party audio or create my own simple loops. This time it would be nice to build a more dynamic audio system early on in the project.

I'd like the style to be very retro, but not necessarily exclusively so. After discovering Unity's support for MOD files, I'm very tempted to try using this system. My only problem here is generating the music; I don't have much experience with the tools (in general, my music-making skills are pretty low), so this would be a good chance to learn.

I don't even know where it will end up… which is sort of the best thing about it. Create as many cool loops as I can and try to make them play at the right time… what could go wrong with that? Dynamic song generation. Perhaps I can group all the commonly themed loops and arrange them randomly?
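To make that a bit more concrete, here's the sort of thing I have in mind. This is only a rough Unity sketch (class and field names are placeholders, not actual SMM assets): two AudioSources take turns queuing a randomly chosen loop from a themed group, scheduled on the DSP clock so the music never gaps.

    using UnityEngine;

    public class LoopShuffler : MonoBehaviour
    {
        public AudioClip[] themedLoops; // loops sharing a theme/key/tempo
        public AudioSource sourceA;
        public AudioSource sourceB;

        double nextStartTime;
        bool useA = true;

        void Start()
        {
            nextStartTime = AudioSettings.dspTime + 0.1;
            ScheduleNext();
        }

        void Update()
        {
            // Queue another loop shortly before the current one runs out.
            if (AudioSettings.dspTime > nextStartTime - 0.5)
                ScheduleNext();
        }

        void ScheduleNext()
        {
            AudioClip clip = themedLoops[Random.Range(0, themedLoops.Length)];
            AudioSource source = useA ? sourceA : sourceB;
            source.clip = clip;
            source.PlayScheduled(nextStartTime); // sample-accurate start time
            nextStartTime += clip.samples / (double)clip.frequency;
            useA = !useA;
        }
    }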

-Ryan

VR DevLog 13: EBExpo and beyond

Posted in Journal, Super Mega Mega, VR

Firstly, if you'd like to grab the build of Super Mega Mega that was playable at EBExpo, go here.


Once again it's been a little while since the last post. Last time I spoke about deciding to focus on random level generation… which I've now decided to completely disregard in favour of a more hand-crafted approach.

The main problem here is the 'multi-plane' nature of the levels. It just makes it too hard to get any sort of interesting gameplay when the level is random, even when using the Spelunky approach of hand-made segments arranged randomly.

So anyway, I had an exhibition booth for Super Mega Mega at the EBExpo for three days. Man, it was a long time. I spent most of the time by myself, with a little help from the Rizzle (cheers dude!) and RyanB, which helped me stay sane and gave me a couple of breaks to eat and walk around the halls.

What did I learn/achieve from the event? Well, at first I was a bit dubious about whether it was worth attending. This opinion didn't change during the first few hours… there were a lot of very young kids, totally not the audience for either the Rift or Super Mega Mega. But they still seemed to love it, and soon more varied ages started arriving. I can only remember about two times during the whole three days (about 35 hours of actual show time) that there wasn't someone playing the game… that's a lot of people.

The hardest thing was extracting a meaningful play test from someone who is trying the Oculus Rift for the first time. Most people are overwhelmed and struggle to learn even the most basic platforming mechanics. There were a few people who really 'got' the game almost instantly and destroyed all the levels with ease. That was really encouraging. It also helped me cement the game's focus going forward.

So overall… awesome event, and I met some great people. Now to use the feedback to finish off the game design and then get to work on development. If we're going to release near the Rift's consumer version, we'll have to get a move on!

Here are a couple of photos… I should have taken more but forgot…

Cheers!

-Ryan

VR DevLog 10

Posted in Journal, Screenshots, VR

I took a little development break these past couple of weeks, mainly to prepare for the birth of my new kid (it's a boy, yay)… but I've had some time to start getting back into some work…

– One of the problems I noticed when using ray casts for the "is the player looking at something?" test is their high accuracy. Most of the time the player is not actually looking directly at their desired target, so it's not obvious why it isn't being highlighted, and it can be frustrating moving the head around to try to get the exact positioning.

I'm trying to ditch the ray casting and use the Unity trigger system to filter the objects that enter/exit/stay in the player's view. First tests say it will work well. The technical parts work fine this way (so did ray casting)… but now there's more room for a fudge factor to play with. Previously I would have had to cast a bunch more rays to get a better indication of what the player could possibly be looking at. I won't know for sure until I get some more play testing done.
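Something along these lines (a rough sketch with my own names, not the actual SMM code): a long, narrow trigger volume parented to the camera collects everything roughly in front of the player, and each frame we pick whichever candidate is closest to the centre of view. The volume needs a kinematic Rigidbody so Unity fires trigger events against static level colliders.

    using System.Collections.Generic;
    using UnityEngine;

    [RequireComponent(typeof(Collider), typeof(Rigidbody))]
    public class GazeVolume : MonoBehaviour
    {
        readonly HashSet<Transform> candidates = new HashSet<Transform>();

        public Transform Current { get; private set; }

        void OnTriggerEnter(Collider other) { candidates.Add(other.transform); }
        void OnTriggerExit(Collider other)  { candidates.Remove(other.transform); }

        void Update()
        {
            // The "fudge factor": the best candidate is the one whose direction
            // is closest to the camera's forward vector, not an exact ray hit.
            Transform best = null;
            float bestDot = 0.9f; // tolerance; lower = more forgiving
            foreach (var t in candidates)
            {
                Vector3 dir = (t.position - transform.position).normalized;
                float dot = Vector3.Dot(transform.forward, dir);
                if (dot > bestDot) { bestDot = dot; best = t; }
            }
            Current = best;
        }
    }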

– A quick improvement for the demo is to make the player character visible when behind blocks… adding this will greatly improve the playability in a few parts of the game. It will also mean I don't have to worry about hiding the player as much. Could maybe use it for enemies and things too… don't know yet.
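One likely way to do it (a sketch that assumes the player uses a plain MeshRenderer and that a silhouette material with a "ZTest Greater" shader exists): draw a second copy of the player mesh that only passes the depth test when the real one is hidden behind geometry.

    using UnityEngine;

    [RequireComponent(typeof(MeshFilter))]
    public class OccludedSilhouette : MonoBehaviour
    {
        public Material silhouetteMaterial; // assumed: its shader uses ZTest Greater

        void Start()
        {
            // Child object reusing the same mesh, drawn with the silhouette material.
            var child = new GameObject("Silhouette");
            child.transform.SetParent(transform, false);
            child.AddComponent<MeshFilter>().sharedMesh = GetComponent<MeshFilter>().sharedMesh;
            child.AddComponent<MeshRenderer>().sharedMaterial = silhouetteMaterial;
        }
    }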

– The game still needs more VR-specific game mechanics. I'm still experimenting with a few different things. I'll have to add them in for the next play test for sure.

(Screenshots: 20130803-152834.jpg, 20130803-152933.jpg)

out!

-Ryan


VR DevLog 9: Post Prototype

Posted in Journal, VR

Prototype complete.

Now that's done, it's time to start fleshing out more of the mechanics and level structures.

I'll be avoiding anything theme/art related for a little while longer and just stick to adding a bunch of the platformer 'tools' (moving platforms, variable physics, enemy types, bosses, etc.). This should provide a good base to build the rest of the game upon.
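Not the real SMM implementation, just the flavour of 'tool' I mean: a platform that ping-pongs between two points, which level pieces can reuse with different speeds and end points.

    using UnityEngine;

    public class MovingPlatform : MonoBehaviour
    {
        public Vector3 pointA;
        public Vector3 pointB;
        public float speed = 1f;

        void Update()
        {
            // PingPong bounces t between 0 and 1 over time.
            float t = Mathf.PingPong(Time.time * speed, 1f);
            transform.position = Vector3.Lerp(pointA, pointB, t);
        }
    }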

I created a few hacks during the prototype phase, and I'll need to throw a bunch of them away and re-make them in a nicer, more user-friendly way. For example, each Unity scene stands on its own with no shared data or configuration files. That makes it nice and easy to load/reset a level, but it also means every level needs to be updated if any of the global configs change. This is first on the list!
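The direction I'm leaning towards (a sketch only; the asset name and fields are placeholders): a single ScriptableObject asset in Resources that every scene reads from, so a global value changes in one place instead of in every level scene.

    using UnityEngine;

    public class GlobalConfig : ScriptableObject
    {
        public float gravity = -20f;
        public float playerMoveSpeed = 6f;

        static GlobalConfig instance;
        public static GlobalConfig Instance
        {
            get
            {
                // Loaded once from Resources/GlobalConfig.asset (path is an assumption).
                if (instance == null)
                    instance = Resources.Load<GlobalConfig>("GlobalConfig");
                return instance;
            }
        }
    }

Then any level script can read GlobalConfig.Instance.gravity (or whatever) instead of carrying its own copy.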

The level editor will need to flow a lot better than it currently does. I've had to manually place a few pieces of the levels because I didn't have time to create the editor support. None of it is particularly difficult; it's just a matter of getting in there. Manually placing tiles instead of painting them is some kind of hell.
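For reference, the kind of 'painting' I mean looks roughly like this (my own names; it assumes a LevelGrid component holding the tile prefab, and the painter class would live in an Editor folder): click in the Scene view, snap the point to a grid cell, and drop the prefab there.

    // Runtime component (its own script): holds the prefab being painted.
    using UnityEngine;

    public class LevelGrid : MonoBehaviour
    {
        public GameObject tilePrefab;
    }

    // Editor-only script (Editor folder): paints tiles on Scene-view clicks.
    using UnityEngine;
    using UnityEditor;

    [CustomEditor(typeof(LevelGrid))]
    public class LevelGridPainter : Editor
    {
        void OnSceneGUI()
        {
            var grid = (LevelGrid)target;
            Event e = Event.current;
            if (e.type != EventType.MouseDown || e.button != 0) return;

            // Project the mouse click onto the grid's z = 0 plane.
            Ray ray = HandleUtility.GUIPointToWorldRay(e.mousePosition);
            Plane plane = new Plane(Vector3.forward, Vector3.zero);
            float distance;
            if (!plane.Raycast(ray, out distance)) return;

            // Snap to whole-unit cells and spawn the tile there.
            Vector3 p = ray.GetPoint(distance);
            Vector3 cell = new Vector3(Mathf.Round(p.x), Mathf.Round(p.y), 0f);
            var tile = (GameObject)PrefabUtility.InstantiatePrefab(grid.tilePrefab);
            tile.transform.position = cell;
            tile.transform.SetParent(grid.transform);
            Undo.RegisterCreatedObjectUndo(tile, "Paint Tile");
            e.Use();
        }
    }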

During my short time at PAX AUS I managed to snag some time on the newer HD Oculus Rift. The first thing you notice is the massive improvement in clarity, especially in the effective viewing area directly in front of the eye. You have to look for the screen-door effect, and it's quickly forgotten about. Text is actually readable now, making it a bit easier to get some useful menu interfaces working.

I'll be ordering the next dev kit whenever it's ready. I don't know what specs the consumer Rift will have, but I don't think it will be far off. I'd be happy with just positional tracking added to the HD screen. More improvements can come later!

So far I'd say any version of the Rift will have a limited session time for most people. The biggest factor here is how quickly the brain tires when everything in the scene is in sharp focus all the time. We need eye-tracking support to allow dynamic depth-of-field calculations. Hopefully that's not too far off.

Back to it!
-Ryan
