Flappy Jam complete!
Made a new game in a couple of days with Aaron and the Robomuffin crew. (Same guys from the recent GGJ24)
Pretty much just for fun…
Get it here or on Google Play
Now something a bit more technical and rambling… But not too bad 😉 … again this will help me more than anyone else 😛
With the initial prototype for Super Mega Mega, the entire world dynamically rotates around its centre point.
This was chosen to ensure the VR head tracking camera wouldn’t have any major issues with things like motion sickness.
Also, the player movement code becomes a lot simpler when the player isn’t actually moving and the level just rotates on a single axis.
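To make that concrete, here’s a minimal sketch of the “rotate the world, not the player” setup. It’s purely illustrative; the LevelRotator name, the input axis and the rotation speed are placeholders, not the actual project code.

```csharp
using UnityEngine;

// Minimal sketch: the player stays fixed at the centre of the world and
// lateral input spins the level root around a single vertical axis instead.
// All names and values here are illustrative placeholders.
public class LevelRotator : MonoBehaviour
{
    public float rotationSpeed = 45f; // degrees per second at full input

    void Update()
    {
        float input = Input.GetAxis("Horizontal");

        // Rotating the level opposite to the input makes it feel like the
        // player is moving around the cylinder, while the head-tracked
        // camera itself never translates or yaws.
        transform.Rotate(Vector3.up, -input * rotationSpeed * Time.deltaTime, Space.World);
    }
}
```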
The problem now is performance. I’ve started pushing the detail level of all the levels up a lot, and that’s increased the draw call and vertex counts significantly.
Deferred rendering is being used to get some nice lighting and post effects but it really hammers the draw call count.
To reduce this I have a few options:
– Reduce the vertex count of the individual blocks significantly (Unity dynamic batching doesn’t allow more than 900 vertex attributes, which means about 200-300 vertices for me), which would make things look shitty.. it’s potentially only an option if the lost detail is replaced with normal maps. That would add a lot more work to my pipeline…. not cool, but it has potential.
– Leave it where it is and just decrease the level sizes. Not cool either! Seeing the tall tower stuff is one of the coolest bits.
– Switch all the mesh blocks to static objects and change the movement code to adapt. This is the option I’m testing first. Draw calls are reduced significantly, but now the camera has to be rotated to follow the player (rough sketch below). This might screw with your head when wearing the VR headset…. must really test this ASAP. If it doesn’t cause any problems I’ll switch all the levels into this static mode. Re-writing the movement will be a pain but it’s a one-off thing.
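For that static option, what I mean by rotating the camera to follow the player is roughly the following. Again just a sketch under assumptions; CameraPivotFollow and its fields are made-up names, and whether this feels OK in the Rift is exactly the thing that needs play testing.

```csharp
using UnityEngine;

// Sketch of the static-geometry option: the level no longer rotates, so a
// camera pivot sitting at the world centre rotates to track the player's
// angle around the cylinder instead. Illustrative names only.
public class CameraPivotFollow : MonoBehaviour
{
    public Transform player;       // the player, which now genuinely moves
    public Transform worldCentre;  // centre point / axis of the cylindrical level

    void LateUpdate()
    {
        Vector3 toPlayer = player.position - worldCentre.position;
        toPlayer.y = 0f;

        // Yaw angle of the player around the level's vertical axis.
        float angle = Mathf.Atan2(toPlayer.x, toPlayer.z) * Mathf.Rad2Deg;

        // The VR camera is parented under this pivot, so head tracking still
        // works locally while the pivot supplies the follow rotation.
        transform.rotation = Quaternion.Euler(0f, angle, 0f);
    }
}
```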
If anyone has any pearls of wisdom here.. feel free to drop them on me @bluntgames
Anyway it’s good to be back in the middle of it all. Hoping to be able to share some awesome new stuff early in the new year!
-Ryan
p.s.
After writing this I think I’ve decided to embrace the ‘all normal-mapped future’. This will keep the vertex count of the blocks restricted, but I hope that will be nicely compensated for by the normal-map detail. My only concern is the loss of quality in the silhouettes. I’ve also solved the non-manifold mesh generation issue that was plaguing the tool used for the current models. Perhaps the best solution is a happy mix of dynamically batched models and auto-combined models (something like the sketch below).
The attraction of having everything in the world dynamic (destructible and movable) is far too appealing to ignore.
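If that mix ends up being the answer, the ‘auto-combined’ half could be as simple as Unity’s runtime static batching over everything that never moves, while the destructible blocks stay as separate low-vertex meshes for dynamic batching. A rough sketch, with LevelBatcher and staticRoot being hypothetical names:

```csharp
using UnityEngine;

// Sketch of the "happy mix": merge all geometry that never moves into static
// batches at load time, and leave destructible/movable blocks as separate
// low-poly meshes so dynamic batching can still pick them up.
public class LevelBatcher : MonoBehaviour
{
    public GameObject staticRoot; // parent of every block that never moves

    void Start()
    {
        // Combines all meshes under staticRoot into shared static batches.
        // The trade-off: anything combined here can no longer move.
        StaticBatchingUtility.Combine(staticRoot);
    }
}
```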
The word of the day is Random.
This is what the levels are going to be in Super Mega Mega.
The initial intention was to create hand-crafted levels with a story flowing through them. Instead it’s now going to be a randomly generated cylindrical level, with story elements woven into the gameplay without directly controlling the flow.
A couple of games out there are already doing the randomly generated level thing… (not just a couple… lots) so it’s not an unsolved problem. I don’t see there being too many problems…… but I’ve been wrong before.
The first thing to do is adapt the current level editor structures to allow me to create the smaller level building blocks to use in the generator. I think this is the way Spelunky handles the problem and it should be a good fit.
I haven’t decided on the blocks/level dimensions yet. It’s a bit hard to decide until I’ve tried it out a bit. The nature of the 360 degree levels means my first try might not work at all. It’s just a matter of giving it a go. Time to code up the system and plug some numbers in with some test data and see if it makes a decent level.
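As a first pass, the generator could be as dumb as picking a random hand-made block template for each cell of a grid that wraps 360 degrees around the cylinder. The names and numbers below (CylinderLevelGenerator, 12 columns, 8 rows, the radius) are just placeholders to plug test data into, not decided dimensions:

```csharp
using UnityEngine;

// Naive sketch: fill a grid that wraps around the cylinder with randomly
// chosen hand-authored block templates, each facing in towards the axis.
public class CylinderLevelGenerator : MonoBehaviour
{
    public GameObject[] blockTemplates;  // small hand-made level chunks
    public int columns = 12;             // chunks around the circumference
    public int rows = 8;                 // chunks up the tower
    public float radius = 20f;           // distance from the central axis
    public float rowHeight = 6f;         // vertical spacing between rows

    void Start()
    {
        for (int row = 0; row < rows; row++)
        {
            for (int col = 0; col < columns; col++)
            {
                // Pick a random template and place it on the cylinder surface.
                GameObject template = blockTemplates[Random.Range(0, blockTemplates.Length)];
                float angle = col * (360f / columns);
                Quaternion rot = Quaternion.Euler(0f, angle, 0f);
                Vector3 pos = rot * (Vector3.forward * radius) + Vector3.up * (row * rowHeight);

                // Face the block back towards the central axis and keep the
                // whole level under this generator object.
                var go = (GameObject)Instantiate(template, pos, rot * Quaternion.Euler(0f, 180f, 0f));
                go.transform.parent = transform;
            }
        }
    }
}
```

A real version would need connectivity rules between neighbouring templates (the Spelunky trick), but even this naive pass should show whether the block dimensions feel right at a given radius.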
Needed to make a better logo, so here’s a new revision. I haven’t decided if it’s the final one yet.
I took a little development break these past couple of weeks, mainly to prepare for the birth of my new kid (it’s a boy, yay!)… but I’ve had some time to start getting back into some work…
– One of the problems I noticed when using ray casts for the “Is the player looking at something?” test seems to be their high accuracy. Most of the time the player is not actually looking directly at their desired target, and it’s not obvious why it isn’t being highlighted, or it can be frustrating moving the head around to try to get the exact positioning.
I’m trying to ditch the ray casting and use the Unity trigger system to filter out the objects that enter/exit/stay in the player’s view (rough sketch after this list). First tests say it will work well. The technical parts work fine this way (so did ray casting)… but now there’s more room for a fudge factor to play with. Previously I would have had to cast a bunch more rays to try and get a better indication of what the player could possibly be looking at. I won’t know for sure until I get some more play testing done.
– A quick improvement for the demo is to make the player character visible when behind blocks….. adding this will greatly improve the playability in a few parts of the game. It will also mean I don’t have to worry about hiding the player as much. Could also maybe use it for enemies and things.. don’t know yet.
– The game still needs more VR specific game mechanics. I’m experimenting with a few different things still. I’ll have to add them in for the next play test for sure.
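For the trigger-based look test from the first point above, the rough shape is a trigger volume parented to the camera that collects candidates, plus an angle-based fudge factor to pick the best one. GazeCandidateFilter and maxGazeAngle are illustrative names/values, and one side of the trigger pairing still needs a (kinematic) Rigidbody for Unity to fire the events:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: a trigger collider (e.g. a long box or cone mesh) parented to the
// camera gathers "things the player might be looking at"; the smallest angle
// from the view direction wins, so head aim doesn't need to be pixel-perfect.
[RequireComponent(typeof(Collider))]
public class GazeCandidateFilter : MonoBehaviour
{
    public float maxGazeAngle = 20f; // fudge factor: how forgiving the highlight is

    readonly List<Transform> candidates = new List<Transform>();

    void OnTriggerEnter(Collider other) { candidates.Add(other.transform); }
    void OnTriggerExit(Collider other)  { candidates.Remove(other.transform); }

    // Returns the candidate closest to the centre of the view, or null.
    public Transform BestCandidate()
    {
        Transform best = null;
        float bestAngle = maxGazeAngle;
        foreach (var c in candidates)
        {
            if (c == null) continue; // object may have been destroyed while inside

            float angle = Vector3.Angle(transform.forward, c.position - transform.position);
            if (angle < bestAngle)
            {
                best = c;
                bestAngle = angle;
            }
        }
        return best;
    }
}
```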
out!
-Ryan
Oh man, going to PAX AUS for one day was simultaneously awesome and extremely depressing.
I made the decision to only display for 1 of the 3 days. That one day was really cool! Met a bunch of great people and demo’d the prototype for a solid 8 hours. (Side note…. showing off an Oculus Rift title is quite exhausting compared to more fire-and-forget game demos.) I got to check out the HD version of the Rift and got a mention from the Oculus dudes!
Then heading home sucked…. I had to tune out from all Twitter and gamedev news feeds. I felt so bad for not being there, but I’d feel a lot worse if I missed the birth of my next kid >_>
Next year will be different! There’s no fucking way I’ll be having another kid that soon so it’s clear skies!
The other slightly less depressing thing is the lack of progress on the other game we’ve been working on. It hit a bit of a brick wall recently. We had the intention of showing it at PAX too…. fucked that up for sure!
No matter… time to get this prototype build posted online and get working on the next version. Need to touch base with a few of my other projects too, spread the love a bit.
Here are a few more photos from the event, it went so fast I barely managed to breathe let alone take in my surroundings!
-Ryan
Another bunch of time spent doing things…
– Focused on making this little tech demo into a working game demo. Progress made but lots left to do.
– Biggest push has been on the gameplay and game flow. Each weapon in the game will be designed to have an alternate mode that uses the head tracking in some way.
– Played with an interesting way to bring more depth into the game… Basically letting the player switch planes of movement to be further from or closer to the view point. It unlocks lots of gameplay options too.
– On a side note, it’s been hard writing these logs more often. It seems I just need to start typing without any real plans and things just fall onto the page.
– Menus suck in VR! Just putting everything into 3D now and attaching it to the right eye transform (rough sketch after this list).
– Music and SFX are looming large on the list of things that really need to be in there! I’m setting aside the next few nights for doing this. I’m pretty amateur at the music stuff!
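For the 3D menu point above, the simplest version is just a plain 3D menu object held a fixed distance in front of whichever eye/camera transform it’s given. VrMenuAnchor, eyeTransform and the distance are placeholder names/values:

```csharp
using UnityEngine;

// Sketch: keep a 3D menu floating in front of the view so it is always
// readable inside the headset (screen-space overlays don't work in VR).
public class VrMenuAnchor : MonoBehaviour
{
    public Transform eyeTransform; // e.g. the right-eye camera transform
    public float distance = 1.5f;  // metres in front of the eye

    void LateUpdate()
    {
        transform.position = eyeTransform.position + eyeTransform.forward * distance;

        // Match the eye's rotation so the menu always squarely faces the view.
        transform.rotation = eyeTransform.rotation;
    }
}
```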
HEYO!!!!
It’s been two weeks since I got back home from GDC. In that time I’ve completed approximately nothing and probably gone backwards on some stuff due to lack of attention!
I blame the air travel and time zone shifting. It really fucked me up and I need to find a better way to deal with it next time! It could also be due to the general intensity of the week at GDC. I’ve felt simultaneously inspired and motivated but have been entirely too exhausted to do anything about it.
This week has been a lot better. I’ve felt sane again after a couple of days spent with the family and going to a few local sporting events. Also, starting a bit of physical fitness activity has cleared the head a bit.
Overall I think I’ll need to plan my schedule better around the GDC time next year.
I’ll blame GDC here again… I failed to make anything resembling a game during March! I’m pretty disappointed in the effort. I really thought I’d be able to pump something out. I had a bit of crazy real life shit leading up to my trip that put a block on getting anything done. Oh well. I haven’t decided if I’m going to resume the effort yet.
So wtf am I doing now after the recovery? Well, we initially considered switching back to the tactical shooter we’ve been prototyping for a while, but we realised we still don’t have all the pieces in place to execute it well enough. This led us to come up with something different…. Project:M, which will lead us into PAX AUS and beyond! The details will be saved for the PAX event… meanwhile…
One of the other games we’ve played with recently is a new mode variant of Missile Control called Onslaught. It focuses more on a high score chase and uses a gamepad as the primary control system (touch screens still work fine). This is the game we are looking at bringing to the Ouya. I also plan on patching the game mode into Missile Control on iOS/Android. Here’s a screenie running on the Ouya:
So that should be a bit of fun! More info soon.
-Ryan