Wednesday, December 18, 2013

pile of thoughts: Indie Megabooth, Death in July, photon cloud, Juiciness, Internet Simulator 2014

Ok! video games!

Let's get started. I will be posting today about a pile of thoughts. I have resumed development on Bad Things, and they are progressing!!! I have refactored the item system into something that is non-ultra-terrible. I guess I really just wanted multiple inheritance in C# but no taco. And now I am focusing on getting multiplayer working again. Multiplayer stuff is hard.
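
For the curious, the item refactor mostly amounts to the standard C# consolation prize for not having multiple inheritance: small capability interfaces plus component composition. A minimal sketch of the pattern (made-up names, not the actual Bad Things code):

using UnityEngine;

// Items implement whichever capabilities they actually support.
public interface IUsable { void Use(); }
public interface IWeldable { void Weld(float amount); }

public class WrenchWelder : MonoBehaviour, IUsable, IWeldable
{
    public void Use() { /* swing the wrench */ }
    public void Weld(float amount) { /* close the seam */ }
}

public static class ItemHelper
{
    // Consumers ask "can you weld?" instead of switching on concrete item types.
    public static void TryWeld(GameObject item, float amount)
    {
        foreach (MonoBehaviour component in item.GetComponents<MonoBehaviour>())
        {
            IWeldable weldable = component as IWeldable;
            if (weldable != null) { weldable.Weld(amount); return; }
        }
    }
}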

The Indie Megabooth

I have submitted ('submote') Bad Things Happen in Space to the Indie Megabooth for PAX East 2014!! Yay! In accordance with this venerable tradition, I will now work furiously on getting a fairly fully featured multiplayer build ready for PAX. I know that getting into the megabooth is nowhere near guaranteed, but I still want to have a game to show off in case I do make the cut. Also, this ungiving up has something to do with dying in July (more on that below). The megabooth is more than a sweet way to show off your game at PAX; it's a great community of people who are all working together to empower independent developers everywhere. It's kind of crazy and awesome that something like the megabooth exists at all.

Death in July

In July: Pow! I'm dead. Not really. However, in the terrible month of July my student loan payments kick back in. And then, if I don't have a video game that makes money, I will probably have to get a job. This is a somewhat upsetting prospect. I would really like to be an indie developer that makes money by then! Is this impossible? No. Is this going to be difficult? eff-yes. If I have to, I would like to get a job doing graphics programming. Will I be skilled enough at graphics programming by then to get a job doing it? The internet seems to think that there is no such thing as a junior graphics programmer. Only senior positions exist? Maybe? I know a lot of graphics programming things, but I don't know if I am good enough. I could be a technical artist or something, but I am not that good of an artist. Or I could just, like, get a job doing something I hate, like computer engineering. Or the fuel cell thing could really take off. Though, that's pretty unreliable. So, anyways: try to make a video game by then!!!

Photon Cloud

Photon networking for Unity is a sweet package that lets you host your networking in ~the cloud~ which is totally awesome. The API is a little bit cumbersome, but I guess that's to be expected for a non-native package. It's free for 20 concurrent users, and you can scale it up after that. 20 is enough for testing, certainly. The cloud handles all the NAT punching and server hosting, which is great if you're a guy who doesn't want to punch NATs or host servers. I think I must be willing to crush my dreams and kill my fun non-network-compatible mechanics. I've got ideas to replace them. A reactor room that has real power management stuff. A warp core. Weapons still need work and thinking about. And maybe, just maybe, the entropy capacitor will survive in some capacity.
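
For reference, the connect-and-join flow in Photon is pretty small. I'm writing this from memory, so treat the exact callback names as approximate rather than gospel:

using UnityEngine;

public class CloudConnector : MonoBehaviour
{
    void Start()
    {
        // Uses the app ID configured in the PhotonServerSettings asset.
        PhotonNetwork.ConnectUsingSettings("v0.1");
    }

    void OnJoinedLobby()
    {
        PhotonNetwork.JoinRandomRoom();
    }

    void OnPhotonRandomJoinFailed()
    {
        // No one is hosting a ship yet, so create the room ourselves.
        PhotonNetwork.CreateRoom(null);
    }

    void OnJoinedRoom()
    {
        Debug.Log("Joined with " + PhotonNetwork.playerList.Length + " players aboard");
    }
}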

Concern with Juiciness

I have been watching presentations on this elusive concept of Juiciness. Juiciness, as far as I can tell, is the idea of rewarding every action the player takes with some kind of visceral response. Menus wiggle; things make sounds, move around, explode, pop, morph, sparkle; and then there's the ever-powerful screenshake. Usually this is visual, but something that really stands out for me is the SFX design in Dust: An Elysian Tail. Every single menu interaction has a separate and visceral sound effect associated with it. It's super satisfying just navigating the menus. Making your game juicy is very important. But! I'm not quite sure when to start injecting juice into my game. Should I do it before the mechanics are sorted out? Should my prototypes be juicy? I don't know.
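
Screenshake, at least, is about the cheapest juice there is. The generic version is just a coroutine that jitters the camera and decays (nothing from Dust, just the usual technique):

using System.Collections;
using UnityEngine;

public class ScreenShake : MonoBehaviour
{
    // StartCoroutine(Shake(0.2f, 0.3f)) on the camera when something explodes.
    public IEnumerator Shake(float duration, float magnitude)
    {
        Vector3 rest = transform.localPosition;
        float elapsed = 0f;
        while (elapsed < duration)
        {
            float falloff = 1f - (elapsed / duration); // ease out over time
            Vector2 jitter = Random.insideUnitCircle * magnitude * falloff;
            transform.localPosition = rest + new Vector3(jitter.x, jitter.y, 0f);
            elapsed += Time.deltaTime;
            yield return null;
        }
        transform.localPosition = rest;
    }
}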

Internet Simulator 2014

also known as reddit: the video game. You wander around a house (or an environment, or whatever) that is filled with cats doing funny or cute things. Your job is to take pictures of cats, assign captions to them, pick the appropriate (simulated) subreddit, and post them on the (simulated) internet to gain simulated fake internet points. Added advantage: you can post these simulated cat images on the real internet to get real fake internet points! It's basically Pokémon Snap, but with cats. How well did Pokémon Snap do? I thought it was a fun game, and we haven't really done anything like that since then. Not that I know of, at least. The game will rate you on your composition, funniness, cuteness, and number of cats. Also you have to pick an appropriate caption and choose which subreddit will give you the most karma for your image. I haven't fully figured out exactly how the caption system will work, but maybe it can actually hook into your real reddit account and check your real karma, and rate you on that!? I am pretty excited, but honestly, I probably won't get around to working on this game for quite a while.

Thursday, December 12, 2013

Ungiving up on Bad Things Happen in Space

OK! Short post this time.

So I will actually now be trying as hard as possible to make a playable multiplayer demo of Bad Things available for PAX East! Which is in April. So 4 months or something! Seems possible. I think I will stop crying about how I'll have to kill fun mechanics and just do it! Yeah! Video games!!!

Tuesday, December 3, 2013

Other Things Happen in Space

Hi! I'm Kelly. I make video games. Well kind of. Maybe.

Anyways, I haven't posted in like a month now. I've been really busy with fuel cells. Making money! Yay! But that's not a really great reason to not write a blog post for a long time. There is also the fact that my laptop is currently employed 24/7 in mining bitcoins, so I can't really participate in community writing sessions anymore. I have to write these posts on my PC. It's a little annoying, but I will survive. Maybe.

Bad Things

So I'm just gonna come out and say it. I'm probably not going to be working on Bad Things Happen in Space for a while. I've gotten to a point in development where the game seems a little too ambitious, not all the design elements are there, and the mechanics aren't solid. When I went to Boston Postmortem and heard Eitan from Firehose Games talk about what indies need to do to survive, he said a lot of important things (many of which I am taking to heart), but what struck me most is that they all have passion for the games they're making. I have lost my passion to make Bad Things Happen in Space. I'm not saying I won't ever pick it back up and start working on it again, but there are several other, smaller projects that I'd like to work on and maybe get something actually finished.

So, what else?

Here's a couple of things. There's Magenta! The kitten eating ghost; I mentioned that before. I'm pretty deep into a fluid mechanics GPGPU simulation thingy (in the same vein as echobox), which I think I might want to adapt for Magenta! and other projects. I'm meeting with Seth Alter of Subaltern Games this Thursday (12/5/2013) to talk about maybe working on his next project. And I've got other ideas for things, too. Vorift, a first person eating simulator in the Rift; several QWOP-style control-in-great-detail games; a Pikmin-like game where you control bugs, based on Skitter from Worm; and finally a sort of Minecraft on a sphere.

A few of these are just interesting technology pieces I'd like to try and implement, but some of them might turn into actually fun games. Once I get the fluid dynamics simulation figured out, I'll likely try my hand at the Spherecraft thing or create a demo-prototype thing for Magenta! I can feel the passion for those things burning deep in my passion glands. It does suck that I am going to be spending at least half of my time working on fuel cell stuff until the end of the year, so I won't get to work on those things as much as I'd like. Though, I do have this weird effect of actually being more productive when I have less time. I'm not quite sure how that works.

When I finish the fluid mechanics simulation, you can probably expect me to write a post about all the nitty gritty details. Or at least expect me to say I'll write up a post on it. Fluid simulation is a pretty intense topic, but it also overlaps in a lot of ways with the acoustic propagation stuff I've been working on for echobox. The fluid mechanics is, in a sense, more general than the acoustic stuff because there have been so many more people working on it. You get to see all the other ways people have tried doing real time simulations of physical phenomena. In a way, I feel more equipped to work on echobox now than I ever have before. Too bad.

Also, I suppose I should change the name of the blog? Maybe just Pyromuffin? Cool.

Tuesday, November 5, 2013

The Oculus VR meetup

Today I am going to write about the Oculus VR meetup, my presentation, and the weird attention that echobox has been receiving lately. Last Saturday (November 2nd) was the Oculus VR meetup at the Microsoft NERD center in Cambridge.

wtf is an echobox

If you don't recall my posts from a while back, echobox (link to the demo on the downloads page) is my entry in the Oculus VRjam. It was a project I made in 3 weeks specifically for use with the Rift. It's an acoustic wave propagation simulation and visualization. It's a big pile of math, and you can get a lot of the details in the presentation (also on the downloads page). I've gotten a lot of comments like, "oh it's so trippy," or "this should be in a club," or "I feel like I'm inside the sun." All that stuff is great, but I don't know how to sell trippy sun simulations to clubs.



Apparently, there is quite some interest in echobox. I've gotten some very positive replies on my vrjam forum topic, and there were people lined up at my booth all day, and some people have really shown interest in it. I admit, it is probably more novel than Bad Things, so I'm collecting ideas for what kind of game I should turn echobox into. Send me your ideas!



Oculus VR meetup

The Microsoft NERD center is actually a pretty cool place. There was free food and drinks, and the view was pretty nice. It's got some sweet colored chairs, and in fact, you can see how dumb I look with a rift strapped to my face.


There were dudes lined up at my booth the whole day. I'm guessing at least 200 people got to do the demo. The framerate was a lot worse than usual for some reason, and the bug that stops you from getting to level 4 was in full force: no one was able to ascend to the interplanetary platform and witness the splendor thereupon. It may be the case that no one ever will, now that the source is lost. Ah, too bad, so sad.

When I wasn't doing the demos, Gosia, my fearless minion, manned the booth. I sneaked upstairs and saw Palmer's keynote presentation. Nate Mitchell, VP of something, wasn't able to make it, so Palmer had to try to give his part of the presentation too! It was kind of funny, but Palmer did well covering most of the topics. I also got to see Elliott's presentation on Unity and VR, and Alex's presentation on AAAAAaaaaaaAAAAaaaaCULUS. It is clear that no one knows how to do VR.

I gave my presentation and it was mostly over everyone's heads. No one had any comments, and I doubt people were even paying attention because no one laughed at my jokes. Or maybe my jokes were bad... but I thought that they were good. I'm pretty sure someone was actually asleep. That's fine though! It was my first time giving a presentation on game development stuff, and I only started not that long ago, so it's only up from here!

The secret vip dinner

After the meetup there was a secret dinner for VIPs only. That's right, someone thought I was Very Important. I got to talk to Palmer about his secret things that I shall not repeat. Well, I will repeat them if you pay me enough, but whatever. I also got to speak with Peter Giokaris about his work on the Unity integration, which was really great. I talked with Will Brierly about Soda Drinker Pro, and also some other academia guys (one of whom did his PhD on ocean wave stuff, quite related to echobox), and it was suggested that I try to write a paper on my work simulating real time acoustic propagation. Unfortunately, I am not in academia, and also I only have a very "omg I can't believe this worked, and I have no idea why" understanding of what I am doing.

There was also steak tartare. I didn't partake.

Monday, November 4, 2013

VR Meetup downloads

This isn't going to be a long blog post, just putting information here for the people at the VR meetup. I gave a presentation on the 2nd of November, I'm pretty sure it was mostly over everyone's heads, but that's fine. I did end up meeting some super cool people at the afterparty thing, and I did get to speak with Palmer if ever-so-briefly. Anyways, here's the stuff. It's all going to be hosted on my google drive, so if the links go down, send me an email: kelly@pyromuffin.com

echobox download:


Hyper echobox repo:


VR meetup presentation:


Bad Things Happen in Space: panic build:



and lastly, as a note, I will be doing fuel cell work for all of this week, probably, so if there is no blog post, that is why. Also, I should probably put these links on a dedicated downloads page.




Tuesday, October 22, 2013

Reverse Engineering Unity Games

Hi. Today I will be writing about my efforts regarding reverse engineering unity games. As you may have read in a previous post, I have recently lost all my project files in a horrific idiot hard drive reformatting disaster. I was able to get my final builds back from some people I had sent it to... and actually, it just occurred to me yesterday that I had sent Unity QA a repro project for echobox that had almost all of the final source in it (except for the visualization portions and the level design), but they have yet to reply about sending me the project.

A bug or a feature?

So it turns out that reverse engineering Unity games is actually quite possible, and even somewhat easy. Is this a feature or a bug? In my case, I have a legitimate reason to reverse engineer as much code as possible from my game executables, because it will save me a ton of work trying to rebuild what I've lost. For others, I imagine this is somewhat of a headache. There is an extensive forum post about this, but that's mostly useful for background information.

There are two tools that I used in the reverse engineering process:
a sketchy Unity asset unpacker, and .NET Reflector. The asset unpacker just takes your sharedassets0.assets file and spits out all the files in it; most of that is undecipherable garbage, but some of it is helpful. .NET Reflector is a magical program written by timeless space wizards that takes .NET assemblies and turns them into C# code.

So there are a few major categories of assets in a unity project that one might want to recover:
  • Art/models
  • Scenes, gameobjects, and prefabs
  • Scripts
  • Shaders
  • Compute shaders
  • Other things?
I will discuss each of these and their recoverabilities.

Art and Models

Echobox didn't actually have any custom models made for it; everything in the level is made from primitives and Megafiers (a procedural mesh modification plug-in), so I didn't import any mesh assets. The same is true for textures. The asset unpacker supposedly will recover models and textures, but Unity converts them to a weird format. I would rate texture recovery to be about 4 reindeer skulls (the arbitrary unit of measurement for asset recoverability), and models to be 3 reindeer skulls.

Scenes, gameobjects, and prefabs

Nope, as far as I can tell, there is no way to get this stuff back. The asset unpacker exports a lot of stuff which may or may not be prefabs, but they aren't named, and most of them are actually empty files, so I have no idea how to get back any of these things. Zero reindeer skulls.

Scripts

Ding ding ding. You can get these back! .NET Reflector allows you to get pretty much all your script source code back. There is some compiler garbage left in for things like coroutines, and also some weird stuff for structs, but otherwise you get all your code. Class variables even keep their names; local variables get renamed to things like vector1, vector2, float1. If you understand the general structure of your code, this is by far the most recoverable portion of your project. I suppose that if you didn't write it in C#, it might be harder, but whatever. 32 reindeer skulls.
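
To give a feel for it, here's a made-up before/after in the spirit of what Reflector hands back (an illustration, not real output from my project):

// What you wrote:
Vector3 ComputeRecoil(Vector3 direction, float power)
{
    Vector3 impulse = direction.normalized * power;
    float damping = Mathf.Clamp01(power / maxPower);
    return impulse * damping;
}

// What you get back: same structure, class members keep their names,
// locals and parameters turn into vector1, float1, and friends.
public Vector3 ComputeRecoil(Vector3 vector1, float float1)
{
    Vector3 vector2 = vector1.normalized * float1;
    float float2 = Mathf.Clamp01(float1 / this.maxPower);
    return vector2 * float2;
}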

Shaders

Shaders come out of the asset unpacker, and they are actually not garbage, kind of. They have the usual Unity ShaderLab stuff about passes, subshaders, blending types and so on, and then it's followed by a big string that looks like "aalkjapsaaaaajajaaakaalbbaaabacaaaaaaaaacbacbaaa..." for about 100 lines. I expect that this is the GPU/DX11 bytecode for your shader. This might be a little different for other platforms, or Unity in non-DX11/HLSL mode, so I don't know. The shader will actually work in Unity, I think, but you can't really make changes to it. I don't know how to reverse engineer the bytecode, and I can't even find a resource on what the big string actually is. It is valid ASCII, so I don't think it's raw bytecode, or you'd see a lot of unprintable garbage too; there is something going on here. I don't know. I bet Aras would know. I don't know if Unity can officially condone or support reverse engineering, even if it's for legitimate purposes. "aaaabbabaacaacabaccaaa" reindeer skulls.

Compute shaders

These don't even show up in the assets? I don't know where they're included in a unity project. I've dug through the binaries, and didn't even see words that would correspond to important things that are in compute shaders. So they must be compiled somewhere in a way that makes them something. Zero reindeer skulls.

Other Things?

Animations, Mecanim state machines, terrains, sounds: I don't know, because none of these things showed up in the Unity asset unpacker in any identifiable way. And also I didn't actually have any Mecanim assets, terrains, or sound files in my game. ?? reindeer skulls.


Hyper Echobox recovery progress


I'm at about 60% now. Here's the GitHub; you're welcome to try it out if you'd like. Propagation and raymarching are back, but voxelization of the scene is not working because RWTexture writes are not supported in pixel shaders from camera drawing (I have posted some things on the forums about this). Also the visualization isn't tuned; it's just the old boring red and green again. And it's super not flexible. By that I mean it is hardcoded to be a certain size and to only work in a cube centered around <128,128,128>. I figure I should be ready for the presentation. A few more days and I'll have it back to where it was, at least from a technology standpoint.

Thanks for reading! Comments and questions are appreciated.

Wednesday, October 16, 2013

Bad Things Happen to Bad Things and Everything Else

Hi! Ok, so while I was trying to upgrade my PC to Windows 8.1 RC, I ended up deleting all my game project files forever! And also all my documents. Basically, I've lost about 30 hours of work on Bad Things (the last three days of the ultracrunch) and everything else from the beginning of July. Yep, that means the source for echobox is gone forever. Which is bad. As I mentioned before, I have to give a presentation on echobox and the Rift integration in about two weeks. So this is pretty terrible. Of course, the actual executables are still available for the panic build and the echobox demo. Those might prove useful for decompiling later, but I don't know if it's worth spending time figuring out how to get that information (a cursory search suggests that it is not that hard!?).

Should I give up on Bad Things?

Well, I guess the major advantage of continuing to work on Bad Things instead of starting, or resuming work on other game projects is that I've already got a bunch of people interested in Bad Things. And if I don't make Bad Things, then, obviously, it will never get made, and therefore I'll never get to play it. The whole reason I'm making this game in the first place is that it's the game that I want to play!

Some disadvantages:
  • I will have to redo about 30 hours of fairly monotonous work.
  • It is unclear to me that I will be able to solve some of the mechanical problems in the demo.
  • It is unclear to me that I will be able to find a suitable multiplayer implementation for the physics based interactions.
Keep in mind, I am just exploring options, and the default action here is to keep working on Bad Things. I'll talk a little bit about the game I was working on before.


Magenta! The Kitten Eating Ghost

Note that Magenta is actually lavender. And the squid in a box is not particularly relevant, except that there are octopi in the game. Magenta is a multiplayer action platformer about eating kittens and defeating puppies. Think of it as Castlevania meets Megaman meets Castle Crashers. There is lore. There is a lot of lore. In fact, I have spent probably way too much time inventing the universe in which this game takes place.

You run around, and each kitten is supposed to be a unique challenge to eat. Each kitten you eat confers an ability that will sometimes help you and sometimes not really help. The attitude of the game is "neither happy nor sad" with a hefty helping of 'lomgcao', which is a word that I invented that means something like "actively not trying." The power of a ghost is proportional to its ability to be lomgcao. And there is dubstep. A ghost is a planar convergence tied to the Nexus via wub. The Nexus is a large laserite crystal that is lodged in the center of ghostown.

The advantage of making Magenta! is that the mechanics are pretty much already sorted out. I mean, they're simple, the ways of interacting with things are limited to jumping, using your essential artifact (for Magenta, this is fangs, for other ghosts, this varies), and using your tongue to deal damage without wub and to also eat kittens. And to invoke a kitten. The kittens want to be eaten. The interface is the digestive system of the ghost. 

The reason I stopped working on Magenta! is that I needed something less ambitious and more well defined than Magenta. Now that Bad Things is becoming less well defined, it would be kind of fun to try and go back to working on Magenta.

Immediately:

I will spend the next week remaking echobox and perhaps integrating it with Hyper Rave Cube. This is necessary because I need to have some source to show off for the presentation, and also I need some time for my Ugh field to dissipate.

If you've got any thoughts on what I should do, or what you'd like to see, feel free to comment. Thanks!

Wednesday, October 2, 2013

Networking Things in Boston

Yeah! I've been doing fuel cells work again, and also Final Fantasy XIV (shhhh). So, the post today is going to be about some cool networking things in the Boston area. But first:

Feedback on the prototype

The feedback so far has been mostly Bad! Which is Good if you're trying to make a game that is Bad? Or is it? I'm unsure. The feedback has been good, I mean, in that the feedback is that the game is bad. Which is totally understandable. Rami said he'd play it, but has yet to do so, or at least tell me his thoughts on it. Some interesting bits of feedback:
  • Art scale is off
  • There is no way to quit without alt-f4
  • Without reading the instructions, the game is impossible to play
  • The mouse cursor shows up while you're playing
  • Putting stuff in the reactor isn't fun
  • It's hard to tell which room is which from the map
  • There are some bugs with barrels and the megatongs and also barrels and the oxygen station
  • The oxygen station doesn't make sense to anyone, even if you do read the instructions
So that's cool. Some of those things are just bugs that I need to fix, but some of them are more conceptual. Mostly, the reactor thing needs to be made more fun, and I also need to think of a way to make the map room more intuitive. The oxygen system does make sense; it just needs a better explanation. Hopefully I will be able to wrestle my willpower away from FFXIV and actually do some game dev soon, but its jaws are powerful.

The Oculus/Matrix partners Rift party thing

This was kind of weird. I don't know what Matrix Partners does, but apparently they are some cool investment thing that gave Oculus 60 million dollars to make the king of head mounted displays. Their office is in some super high end building in Cambridge, and parking is free if you know the magic word. You go up to the 17th floor and wander through these big glass sliding doors and then you're in a monochromatic color scheme conference zoo. Everything is a tasteful shade of off-white and there is a super slick desk with a secretary greeter person who asks you to sign in. After that, it's just a big island table (covered in pizza, for this event) and various places to sit. It's really weird to have an office just to have a place for conferences, but I guess so.

At the event, they had two of the high-def Rift prototypes (1080p panels), and trying one of them out in what seemed like the Unreal 4 demo was really, really impressive. You can still see the pixels, but it's good enough. I mean, in the same way as with the first dev kit, you can kind of forget about the pixels, but with the high def prototype it's even better because the pixels aren't that apparent unless you're looking for them. The demo that I saw was a snowy outside area of a castle, and then inside there was some lava and a mean golem guy that yells. And also you can shoot crazy GPU particle effects out of your face (as usual). Once I got to the golem thing, it growled at me, and then I pressed some of the d-pad things and the screen tore in half and oops, I broke it. But it was cool up until then!

I met a bunch of the Boston Indies guys there, and also Benjamin and Antonio. Benjamin is the guy putting together the talk on Rift development that's happening in November, and Antonio is the king of Matrix Partners who apparently decided to give all the funding to Oculus. I have his card, and supposedly there is a way to get "face time" with him so that I can do something? I mean, is my game dev thing the kind of thing that receives "seed funding", or am I just a guy trying to make a video game?
I don't know, but I haven't gotten an email from anyone regarding a big check yet. Oh, and I misheard Dave Evans about demo night at the Boston Indie Games Collective.

The Boston Indie Games Collective

Now this is super cool. The indie collective is a group of local game developers that rent out a space in a big room; they all work on games independently but can also talk to each other and get feedback on stuff, and it's pretty much way better than making games in your room by yourself. It's somewhere in Cambridge, and it's hard to find unless you know the name of the place that rents the space (Intrepid Labs). In fact, it was fairly unlucky because I searched the first, second, and third floors, and it just happened to be in the attic, suspended from the ceiling by big metal poles. I showed up for the mythical demo night and, well, it wasn't demo night. But that didn't matter, because I had super awesome conversations with a bunch of the Boston Indies guys like Trevor, Erik, Luigi, and Elliott. We talked about everything from networking code to the graphics pipeline. I saw Trevor's rhythm monkey strategy game and Erik's disco dodgeball, and also there was free beer. But I don't drink, so free ginger ale? Yeah.

I asked about getting some space so that I could work with them at the indie games collective, but unfortunately there is no availability until some of the other companies move out. Dang.

Some cool talk about Rift game dev in November

Here's the link to the Eventbrite. I was asked to give a talk on echobox and the Rift integration. Apparently the idea is to invite game developers, students, and faculty to show off how cool the Boston community is for game developers, and also for learning about game development. Benjamin said that there are even some acoustics R&D people who are interested in hearing about how I did the acoustic propagation for echobox. Which is exciting. When I released echobox, nobody cared, which is a shame because I think some of the technology is really cool and hasn't really ever been done before in a game. So it's neat that some people are actually interested in it now. But this also means that I have to put together a presentation for it. I could talk a lot about echobox, but unfortunately they want to limit it to stuff about the Rift, and unfortunately again, the Rift integration was pretty easy (except for some adjustments per-eye in the ray-marching algorithm). I think I'll figure out how to sneak in a bunch of stuff about the simulation, because that is the coolest part. Though, it's only a half hour. Anyways, if you are in the Boston area on November 2nd, you should totally go and see me give a talk there!

So that's all the cool stuff that I've been going to in Boston, and if you see me around, be sure to say Hi, or bye, or something, anything really.

Wednesday, September 25, 2013

Bad Things Happen in Space: Panic Build

So it's done. I completed Rami's challenge... only 13 hours late. I don't really know what the conditions were for him to play my game or what I'd get or if anything would happen at all if I completed it on time. I suppose that the main idea behind the challenge was to get me to build a prototype at all - and it worked!

Here's the link: https://docs.google.com/file/d/0B209xEE7EEixNDNERDNlMlhmbU0/edit?usp=sharing

I don't know what the bandwidth limits are like on google drive (cursory search indicates within the range of 600 mb/day), but if the link doesn't work, try again later. Or tell me, and if it gets to be too much of a problem, I'll host it somewhere else.

I'll copy and paste the shitty instructions I wrote:

controls:
wasd to move
shift to sprint
left ctrl to crouch
space to jump (though you can't jump very high without hitting your head)
e to interact
q to drop an item if you're holding one
each tool has two modes of usage: 
wrench-welder: left click to hit fractured debris back into place, right click to weld the seams 
entropy capacitor: right click to float/attract cubes, left click to shoot them 
megatongs: right click to grapple fuelium rod, left click hold to charge forcelight, left click release to launch fuelium rod.
Tips: Press start server to start playing. Once you've figured out the ship, the helm is where you can access a few different scenarios.
Performance is really shitty, so you'll probably need a really good computer to get decent frame rates. 
The entropy capacitor (cube sphere gun thing) puts out fires if you shoot cubes at fires. 
You can open up specific doors from the map in the security room. 
That weird glowing chamber thing in the life support room will heal you if you step into it. 
the weapons room has been temporarily re-purposed for oxygen canister storage. 
The megatongs are used by right clicking on a fuelium rod and then holding down left click (while still holding right click) until your light glows big to shoot out the fuelium rod (probably into the big swirling reactor thing...) 
There are 4 distinct and bad scenarios to try.


Also some minor news: I have been asked to give a talk about echobox and the Rift at some unannounced conference thing in November. The Rift integration was actually one of the easiest parts of echobox, but that's not saying much.

The Ultracrunch

Apparently I have the tendency to ultracrunch before deadlines. For instance, not sleeping or eating for the three days prior to the deadline. The same thing happened when I was working on echobox (and to a lesser extent with Blood too) right before it was due. I have a feeling that this is unsustainable and probably extremely unhealthy. I felt like I was constantly about to pass out or die. Productivity goes way down too. Unfortunately, if there isn't a deadline, I have a tendency to focus on infrastructure and technology rather than actual gameplay mechanics. I fall into the trap of "making it easier to make a game" rather than actually making a game. I should find some way to impose actual real deadlines and react to them in a way that is less torturous, because ultracrunching is ridiculously stressful.

Team work(!??!)

Yeah! Bad Things may be an anticooperative game, but that doesn't mean that making Bad Things has to be. I enlisted the expert (zero previous experience) modeling skills of a friend (Gosia) to make some of the art in the Panic Build, and I am also teaming up with a super sweet (no joke this time) game design dude (Joe) to help me hammer out the mechanics and concepts. Like balancing realism vs rule of cool vs fun. Specifically, Joe really helped me narrow down the scope into a viable set of systems and especially helped with clarifying how the oxygen system should work. Eventually (probably never) I'll just implement ship-wide Navier-Stokes math for the oxygen flow and let physics figure it out, but until then, I'll use a volume/concentration based approach.
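
To make the volume/concentration idea concrete, it's something in this vein (names invented, and the rate constant is a knob, not physics):

using UnityEngine;

public class OxygenRoom : MonoBehaviour
{
    public float volume = 100f;    // room volume
    public float oxygen = 21f;     // oxygen "stuff" currently in the room
    public OxygenRoom[] neighbors; // rooms connected by open doors/vents

    public float Concentration { get { return oxygen / volume; } }

    void FixedUpdate()
    {
        // Push oxygen down the concentration gradient. Each pair gets processed
        // from both sides, which just doubles the effective rate; fine for a sketch.
        foreach (OxygenRoom other in neighbors)
        {
            float gradient = Concentration - other.Concentration;
            float moved = gradient * Mathf.Min(volume, other.volume) * 0.5f * Time.fixedDeltaTime;
            oxygen -= moved;
            other.oxygen += moved;
        }
    }
}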

Good/Bad Things

I'll outline some stuff I am particularly pleased with, and some stuff I think is the weakest:

Good Things:
  • I really really like the look of crisp high resolution text as part of the environment. It makes it really seem like a very cohesive experience.
  • Hitting fracture debris with the wrench is super satisfying.
  • Shooting entropy cubes at fire looks really cool. Actually, maybe I like everything about the entropy capacitor.
  • Managing oxygen canisters really hammers in that sense of being-alone-in-space. That feeling of don't fuck this up or you'll die.
  • Actually getting the area behind the fracture sites to look like outer space (probably the most novel technical trick in the build).
  • The door control map actually works!
  • Damn, I love floating cubes.
Bad Things:
  • Multiplayer is broken. There are some panic hacks that are super-not-multiplayer-implementable. And also the physics. Physics-based stuff in multiplayer games tends to not work out very well. We'll see.
  • Welding fracture pieces together is hard, there's not a lot of feedback to know what you can weld, or how much you need to weld, or when welding is done.
  • Picking up fuelium rods with the megatongs is too floaty; it needs to be more gravity-gun-like. This isn't hard to implement, but I didn't want to spend too much time on getting it perfect for the panic build. Similarly, with shooting fuelium rods into the reactor, it is hard to predict where the rod will go, and there's not enough feedback when you do get it into the reactor. And most of all, it's just not very fun or mechanically satisfying.
  • Ship power isn't important enough. Currently some systems shut down if the power is too low, but that's about it. The oxygen system stops pumping, the doors require manual work to open, you can't warp, and the lights go dark (though indirect lighting is still there... thanks, lightmaps; I'll probably fix this eventually). Idea: sensors and door labels should not work either. Eventually you'll die from not having enough oxygen, but that's about the only threat to your life from power loss.
  • Performance is horrible. Most of the frame time is spent "culling". I don't think that's normal, and it might be a bug in the 4.3 beta that I'm using. I get 60 fps in bad areas and 150 fps staring at a wall, and that's on a Haswell i7 and a GTX 780, so performance is going to be mostly unplayable on merely mortal computers. And the occlusion culling system is broken too, so that doesn't particularly help. Also, I'm rendering the scene 3 times per frame (this can probably be fixed, eventually). Oops.
  • Difficulty is super high. Especially considering welding fracture pieces works poorly in the first place, you don't have much room for error when repairing a fracture. You have about 5 seconds before you start taking damage from asphyxiation, and then about 20 seconds after that you'll die, if you're at full health. Balancing this is hard, but not fundamentally problematic. Also fire is really dangerous. And there's no way to repair the hull yet. It's a prototype after all.
  • Not enough scenarios or compelling situations. The power goes out, the oxygen vents, fire breaks out, and fractures occur. None of these things occur simultaneously in the current scenarios (and if they did, they would probably kill you), and having conflicting priorities is where real panic comes from.
  • I didn't have enough time to finish the weapons room, or any weapon systems or ship combat or anything like that, so that will come eventually.
Other Things:

  • Graphics programming is amazing. I feel super cool about solving the problem of getting the fracture pattern to show up behind the actual fracture. I render the fracture geometry from a closed position with a shader that outputs all black, and then use that as a texture for the space behind it (there's a sketch of this after the list).
  • I had to write my own hinge physics system (more vector math) to get the fracture debris to behave like I wanted it to. Unity's collision conditions are kind of a pain to work with, but I think I've worked around it for the most part.
  • Getting the fracture geometry to actually work in at arbitrary rotations was kind of difficult. I did the proof of concept oriented along the major scene axes, and it required some tough vector math (actually just cross products) to un-hardcode those assumptions. 
  • The Unity beta is full of bugs. Again, this is my fault for trying to do game dev on beta builds, but still, man: every time you remove an item with occlusion culling on, the editor crashes. Change a script in play mode, editor crash. Do anything while baking lightmaps and you'll get a blue screen of death (maybe the fault of overclocking). And the new MonoDevelop can be super slow for some reason.
  • There was a super crisis problem that took two hours to fix. For some reason the editor builds were working fine, but the standalone executables were just showing a blank screen. Turns out it had something to do with the camera order, but I don't even know why that's happening. I put a band-aid on it and it seems to be working now.
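
Here's the promised sketch of the fracture-mask trick from the first bullet, with invented names (maskCamera, blackShader, _MaskTex) rather than the actual Bad Things code:

using UnityEngine;

public class FractureMask : MonoBehaviour
{
    public Camera maskCamera;      // disabled camera that sees only the fracture geometry, posed closed
    public Shader blackShader;     // trivial shader that outputs solid black
    public Material spaceMaterial; // the outer-space material behind the hull

    public RenderTexture BakeMask()
    {
        RenderTexture mask = new RenderTexture(512, 512, 16);
        maskCamera.targetTexture = mask;
        // Replacement shader: everything this camera draws uses blackShader.
        maskCamera.RenderWithShader(blackShader, "");
        maskCamera.targetTexture = null;
        // The black silhouette masks where the starfield should show through.
        spaceMaterial.SetTexture("_MaskTex", mask);
        return mask;
    }
}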
So that's about it. Feedback would be super great if you have any.

Wednesday, September 18, 2013

Moving, VRJam, Bad Things, Rami's Challenge

Hey Everyone! I've been doing a lot of stuff over the last month. Let's start at the beginning.

Moving to Boston

Yeah, I did it. This place is cool, but expensive, and what do people do with their cars? My roommates are the coolest dudes/ladydudes and they're actually probably even smarter than me. In fact, I am writing this blog post during sanctioned house writing practice. Otherwise, I would be furiously developing Bad Things for a reason I will reveal shortly.

Oculus VRJam

Yeah! I made a super sweet technology showcase thing for the Rift that apparently nobody cared about.



I didn't place as a finalist, and my guess is because:
1) I didn't actually make a game (it's a music visualization experience).
2) some bugs.
3) needs a very high end gaming computer to run.
If you have a Rift devkit you can try to download and play it here. Keep in mind that you'll need a super fast computer. It runs at 60 fps on my development machine which has a gtx 780, and it ran at 10 fps on my laptop. The reasons for this are clear to me (huge compute shaders and ray-marching), but the fidelity tradeoff was not one that I was willing to make. The jurors had very fast computers (or so I heard), so I don't expect this was a major problem for them, but it might be for everyone else.

It's a little bit disappointing that I didn't make it as a finalist; I thought I had a pretty strong entry. They didn't give any kind of feedback or rating, so I don't know what or why. Oh well. There are pieces of echobox I can possibly reuse and sell on the asset store. I'm thinking I could create some kind of plugin for the acoustic simulation or a plugin for soundcloud streaming. Btw, my game has a physical acoustic sound propagation simulation and it also streams music from soundcloud. Why? Because I am crazy.

So what about those Bad Thing-things?

Err, yeah. So there was the fuel cells thing, and then the VRJam was three weeks, and then I was moving, and now I am doing it! Ok! The level editor still isn't done (though I have saving and loading now). Technology-wise, echobox uses a rasterization based voxelization technique that I think I can adapt for use in Bad Things to implement *~Sparse Voxel Octree Global Illumination~*. But that will have to wait. At Boston Indies a few days ago, I met a super cool dude...

Rami's Challenge: Panic Build

Yes, it's true. +Rami Ismail of Vlambeer (@tha_rami) challenged me to stop developing technology for my game and just go and f-ing make a playable build so people can actually go and test whether or not the mechanics are fun. IN 7 DAYS. FROM TUESDAY. (It's Wednesday now.) Ahhhhhh. He's my Indie Game Dev Idol, so I can't possibly resist. It's a bit unlucky because my game is particularly unplayable right now, with everything taken apart for the level editor. And I had to do the fuel cells yesterday (one wasted day). I spent all day today fighting with the level editor to get a basic level going. Looks like I am going to be taking the air particle system out for this Panic Build. I've been thinking of mechanics frantically:

Throw fuelium rods into the reactor for ship power using megatongs, load weapons into tubes, crank laser batteries, secret button presses for recalibrating life supports. Cutting off limbs for the bio-replicator, the entropy capacitor (formerly known as the entropy amplifier), and the welding wrench. I think I can get all those things into the game in some form, BUT IS THAT ENOUGH?! I don't know. Is it even a game? Do I need a scenario? What will drive the pace of these events? Panic Build.

Monday, August 5, 2013

Indiecade and Oculus Rift vrjam

Hey! It's been two weeks since I've last posted an update and again, it's because I haven't actually done that much work on Bad Things. I've got some boring level editor stuff working, like saving the level using XML serialization, but loading it up isn't quite there yet. I'm still doing fuel cell work too, but that should be over this week. At any rate, I'm going to be working on the Oculus and Indiecade Rift vrjam thing! Also:

1000 blog views! Party!

That's a mouthful

Yeah, but it's awesome. There's a bunch of prizes, and, well, it'll be cool to make some cool stuff for the Rift. My idea is to make a game about perspective. People don't usually get to see things at different scales; I think it will be cool to see things from an unusual perspective. Perhaps the scale of an insect will be interesting? There's Pikmin, but that's all I can think of. Also there's...

Echolocation

All you have to do is implement sound wave propagation in a video game and use it to show the environment. This paper holds the secrets of Finite Difference Methods for the wave equation. I have no idea what any of that means (well, maybe a little). First, here's some cool garbage:

Compute shaders are broken.
video

But wait! I actually got it to work:


What is this sorcery?

The magic of compute shaders! Essentially, I just implemented the equation in the paper:

$$p_{i,j}^{t+1} = 2\,p_{i,j}^{t} - p_{i,j}^{t-1} + \Delta t^{2}\,s_{i,j} + A\left(p_{i-1,j}^{t} + p_{i+1,j}^{t} + p_{i,j-1}^{t} + p_{i,j+1}^{t} - 4\,p_{i,j}^{t}\right), \qquad A = \frac{c(x,y)^{2}\,\Delta t^{2}}{\Delta x^{2}}$$

This equation is the discrete form of the pressure field created by a wave moving through a medium. The pressure field, along with the speed of propagation in the medium (c(x,y) in the equation), is all the information you need to describe the wave system. Here's the final line of my compute kernel:
float next = (2 * current) - previous + (t * source) + A * (leftinfo + rightinfo - (4 * current) + downinfo + upinfo);

which mirrors that equation. There's also like 60 lines above that, but those aren't important. Oh, and a C# script that goes along with it to manage all the buffer swapping. I ran into a lot of bugs, as you can see in the garbage images, and that's one of the problems with implementing magic that you can't figure out yourself because of silly things like "discrete calculus" and "partial differential equations". There's an important condition that you have to be aware of:

$$\frac{c\,\Delta t}{\Delta x} \le \frac{1}{\sqrt{2}}$$

I don't know why (what is a CFL? Courant–Friedrichs–Lewy, it turns out), but if you don't make sure this condition is satisfied (my time interval was much too large), you get something boring like:

video

Oh, and also my buffer swapping modulus tricks were totally incorrect. A more explicit if...else structure cleared that up for me.
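
The explicit structure is something like this on the C# side: three textures rotating roles every step, no modulus in sight (reconstructed from memory, not the exact echobox script; it assumes the kernel uses [numthreads(8,8,1)]):

using UnityEngine;

public class WaveSim : MonoBehaviour
{
    public ComputeShader waveShader;
    RenderTexture previous, current, next;
    int kernel;

    void Start()
    {
        kernel = waveShader.FindKernel("Propagate");
        previous = MakeField(); current = MakeField(); next = MakeField();
    }

    RenderTexture MakeField()
    {
        RenderTexture rt = new RenderTexture(512, 512, 0, RenderTextureFormat.RFloat);
        rt.enableRandomWrite = true; // needed for RWTexture2D writes in the kernel
        rt.Create();
        return rt;
    }

    void Update()
    {
        waveShader.SetTexture(kernel, "Previous", previous);
        waveShader.SetTexture(kernel, "Current", current);
        waveShader.SetTexture(kernel, "Next", next);
        waveShader.Dispatch(kernel, 512 / 8, 512 / 8, 1);

        // Rotate roles: what we just wrote becomes "current", the old "current"
        // becomes "previous", and the stale "previous" texture gets reused.
        RenderTexture stale = previous;
        previous = current;
        current = next;
        next = stale;
    }
}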

Pressure fields

If you didn't know, sound is an oscillating pressure field. When you hear sound, your eardrum simply gets pushed in and out really, really fast, and your brain transforms that into sound qualia. If you were to sample the pressure field (red for positive, green for negative) at a rate of 48 kHz, you could actually get sound out of this simulation. The same people wrote another paper about doing just that, except that it took them something like 3 hours of computation per second of audio to sample enough pressure information. That's not exactly realtime. Luckily, I won't be using it for that. I'm going to try to implement echolocation in my vrjam game. But first I need to figure out how this generalizes to 3D, and also how to voxelize the game scene. I'll be working on that over the next week. Here's a final video showing off attenuation and an oscillating pressure field (sound waves!):



The frequency in this video goes from 1 Hz to 25 Hz. Unfortunately, I can't go much higher than 60 Hz because that's half the frame rate. I'm literally setting pressure field values and toggling them positive/negative once the period interval elapses, so I can't do it any faster than half the frame rate (60 Hz on a 120 Hz monitor with vsync enabled).
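
The source toggling is exactly as dumb as it sounds; something like this (field names invented):

using UnityEngine;

public class ToneSource : MonoBehaviour
{
    public ComputeShader waveShader;
    public float frequency = 25f; // Hz, capped at half the frame rate
    public float amplitude = 1f;
    float timer;
    float sign = 1f;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= 0.5f / frequency) // half a period elapsed: flip the pressure sign
        {
            sign = -sign;
            timer -= 0.5f / frequency;
        }
        waveShader.SetFloat("sourcePressure", sign * amplitude);
    }
}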

I hope you guys find this interesting. I was yelling earlier when I went from glitchy squares to actual propagation simulation, I was so excited. It'll be tricky to try and go from 2D to 3D without a paper to give me magic equations, but I think I can do it. I have no idea how to discretize a partial differential equation, but if I have to learn discrete calculus to do it, then so be it.

All the code for this will be up on github in the next few days. I'll make a post when I get the repository set up. 

Tuesday, July 23, 2013

Level Editors are Boring in Space

It's true! Writing a level editor isn't super fun. Burn down the level editor:

The fire still works! Kinda.
First, a quick update before we get on to the level editor. I am (not) secretly a fuel cell system controls programmer. Recently, a company I've previously contracted for contacted me to do some crazy-not-enough-time rapid fuel cells programming, and I also have to build the whole system by myself because the other engineer is harvesting his crops (not a metaphor). In my efforts to quit my job and make video games, I have become a fuel cell system engineer. Oops. Luckily, it's only for the next week or two, so progress on Bad Things should be more exciting after that.

Also, Unity 4.2 is officially out. Now I can talk about all the beta secrets! Err, well, I pretty much did that anyways. I am not the best with confidential information. I am slightly surprised they released this version, because some stuff is still broken (TC particle settings don't show up). Also also, I've finally purchased a Unity Pro license, so no more sneaking by on 30 day trials and beta keys. $1,500 is a lot, but the Unity team deserves every cent of it. Oh, and if you check out the release notes, near the very bottom it says:

  • Shuriken: Fix collision module such that trigger objects do not cause particle collisions.
You can thank me for that. I reported that bug and harassed the developers for the fix. I've made my mark on the world.

Level editing

Light directions as textures!
I find that I am less productive if I don't have a feature set in mind, so I'll put down some features here to help clear that up:

Done:
Lay out tiles for rooms.
Make walls from tile selection and group section into a room.
Make doors from walls.
Automatically create necessary oxygen system components.

Not done:
Make roof pieces.
Make walls from doors.
Be able to save and load layouts.
Unroom walls and tiles.
Select from a set of default room names (via drag and drop??)
Go back to tile editing mode after the tiles have been cleaned up.
Finalize level, removing level editor garbage.

That's all?

Chameleon, cobalt, and misty rose
Yeah, totally. I'm pretty sure I'll get roof pieces and walls-from-doors done fairly quickly. I think saving layouts is a good place to start for the rest of the stuff, because of some intangible hunch. Also, once the level editor is done, I'll release a Unity scene file with all the level editing stuff so you guys can play around with it if you want. As always, questions and comments are appreciated.

Tuesday, July 16, 2013

iamagamer game jam: Blood

I didn't actually do much development for Bad Things this week; instead I was focused on finishing up at Intel, and then there was the iamagamer.ca game jam this weekend. Along with two other people, I produced a game in 48 hours. The link to download it is here: http://jam.iamagamer.ca/submissions/53-blood, though I'll probably put up a download on this blog on an 'other projects' page.

Blood!

The topic was "Strong female protagonist", and my first idea was to take the concept literally and make a game about a female body builder. After some discussion among my friends, I decided to pitch a game about bleeding. One person put their name on it (someone I knew already), then I put my name on it, and we got a third person. Luckily for us, none of us had ever done a game jam before, and I was the only one with game development experience.

We settled on an action platformer where your blood is both your life and your ability to attack. Each weapon or spell you use would remove some of your blood, and on top of bleeding constantly, that makes for a very interesting decision process. You can't wait around too long because you'll bleed to death and you can't attack without planning or you'll bleed to death.

The initial scope was waaay too big, as it always is, but we finished with two enemies, a level, a blood sword, running, attacking, and idle animations, and some sound effects stolen from Kirby Super Star put in during the last 20 minutes. It turns out that the particle effects used in the game were too intense for mere laptops, so we had to demo the thing on my PC. Yes, I brought my tower to a game jam... I've since optimized the game by turning down the physics frame rate and changing the world particle collisions to planar particle collisions. I expect it will run on most computers now.

Game jams are crazy

It was 48 hours of intense, exhausting effort, and I think we built something that we can be proud of. It feels like we were one of the teams that got the most done, and I believe it's impressive that we got so much done considering our team size and experience. I don't know if I'll do another game jam any time soon, or at least until I forget how exhausting it was. Not to say it wasn't fun, but it didn't exactly achieve the goal of showing other people what I could do on a team, or of networking very well (because I already knew one of the team members, and the other quit halfway through and tried as hard as possible to remain completely anonymous, including making up new emails for the event and then deleting the accounts after it was over).

The boston scene

The Boston scene is a lot stronger than I expected. There's a ton of people and they're all really cool. It's really fun to meet other game developers; my current social circles do not contain very many creative types, and game devs are the same intersection of technical/creative that I pathologically exhibit. I've met a ton of audio guys, some artists, some modelers, and a bunch of engine guys. I'm not usually super extroverted, but I can power up my social abilities when faced with the consequences of network-or-die. I've spent the last 4 days traveling back and forth to Boston every day, so it will certainly be more convenient when I move there.

Normal development of Bad Things should resume this week. Here's some level editor progress to smooth things over.


Sunday, July 7, 2013

Surprise level editor!

Yes, the next thing I am going to be working on is the level editor. I hadn't actually planned to do this for quite a while, because it seemed like a nice but not essential feature to have. For reasons I will make clear shortly, I need to build the level editor before I can continue.

What is this garbage?
Menus are the best showcase of my work.

Wrenching animations

I'm so sorry for the pun. What I really wanted to work on after the procedural fracturing was repairing the fracture sites. Hit the debris with a wrench until it's close to its non-fractured position, and then weld it into place. Here's a spiffy gif of the wrench animation:
+1 for perfectly looping wrench
Here is a video, but it's not the greatest. Everything is 100% broken at the moment; I guess that's what I get for building my game with the Unity betas.


The video shows off some cool stuff, even if it looks horrible. There's the inverse kinematics on the hand placement. Inverse kinematics is a way of calculating the shoulder, elbow, and wrist configuration required to place a hand at a given position. There's also the physics on the debris, and, well, you can see the lighting is broken (Unity's fault) and the scale is all weird (my fault).
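
The Mecanim side of the hand placement is pleasantly small. Something in this spirit (the grip target is a made-up name):

using UnityEngine;

public class WrenchHandIK : MonoBehaviour
{
    public Transform wrenchGrip; // where the hand should end up on the wrench
    Animator animator;

    void Start() { animator = GetComponent<Animator>(); }

    // Called by Unity on animator layers with the IK Pass checkbox enabled.
    void OnAnimatorIK(int layerIndex)
    {
        animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
        animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1f);
        animator.SetIKPosition(AvatarIKGoal.RightHand, wrenchGrip.position);
        animator.SetIKRotation(AvatarIKGoal.RightHand, wrenchGrip.rotation);
    }
}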

Everyone is three feet tall

That's right. The reference I was using for the scale and level layout pretty much has everyone being three feet tall. After making the wrench animation, I naturally tried to apply it to a character and animate it. The problem here is that a standard character model is 1.8 meters tall, or almost twice the height of the ship. Changing the scale of the ship to be twice as tall reveals just how incredibly cramped the place is.

It's interesting how making the place bigger actually makes it look smaller.

Scaling the character models is a no-go, because you'd have to scale down all the animations too, and reimporting stuff is a nightmare in general. The ultimate conclusion is that I'm going to have to build the level editor a lot sooner than I expected. I need to test out different scales and room sizes, and I can't do that easily with my current setup. As for the level editor itself, Unity editor scripting is in some respects quite easy, but implementing a custom tool for laying out floor pieces and walls is difficult. There is some artificial difficulty here, because most of the problems are caused by the complete lack of documentation. My techniques thus far have relied on snippets from the forums and wishes. Example: Editor.OnSceneGUI()'s documentation consists of:
function OnSceneGUI () : void 
Description 
Lets the Editor handle an event in the scene view.
In the OnSceneGUI you can do eg. mesh editing, terrain painting or advanced gizmos If call Event.current.Use(), the event will be "eaten" by the editor and not be used by the scene view itself.
That's all you get. Now go implement a level editor!
...It's coming along, but there are bugs and I don't really know what I'm doing. Expect to hear more about the level editor next time. Also, it's my last week at Intel(!!!). Let's all celebrate this new era of starvation and potential homelessness!

Friday, June 28, 2013

Procedural fracturing: Mesh generation and physics

This will probably be the last post on procedural fracturing. Why? Because it's mostly done!  In a few hours of heroic brain effort, I went from this:
Lines are drawn between centroids and vertices in this image.
There's also an axis pointing right for debug purposes.
To this:
A fully procedural mesh!

The little things

After so much brain destruction over this, there were two bugs: one in finding the center of each circumcircle (I was dividing by zero somewhere, getting secret NaNs), and another in the centroid sorting algorithm. I also facepalmed really hard because the first sentence on the wiki page for polygon triangulation is "A convex polygon is trivial to triangulate in linear time, by adding diagonals from one vertex to all other vertices." Oops. I threw out the copy/paste ear clipping algorithm and just wrote my own thing to add diagonals. It's funny how something can seem so simple and obvious in retrospect. The only downside to this technique is that the geometry created isn't really suitable for animation; I'll be avoiding this by joining vorons together instead of animating vorons individually.
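
For the record, "adding diagonals" really is the whole algorithm. Assuming the vertices are already sorted around the centroid, the fan looks like this:

// Fan triangulation of a convex polygon: every triangle shares vertex 0.
// Returns an index buffer you can assign straight to Mesh.triangles.
int[] TriangulateConvex(int vertexCount)
{
    int[] triangles = new int[(vertexCount - 2) * 3];
    for (int i = 0; i < vertexCount - 2; i++)
    {
        triangles[i * 3 + 0] = 0;
        triangles[i * 3 + 1] = i + 1;
        triangles[i * 3 + 2] = i + 2;
    }
    return triangles;
}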

Oh, and the extrusion algorithm I hypothesized in the previous post actually worked! ("I can't believe this actually worked!") Here's a video demonstrating mesh generation and some simple rigidbody physics.


Physics.Raycast() or why I love Unity

Now that we have actual GameObjects and meshes, we can do away with all this math and start brute forcing things! The wonder of Physics.Raycast() is one of the reasons I started looking into game development in the first place.
Raycast is actually a very descriptive name because it is a magic spell. You pick an origin and a direction, and it will tell you what is in that direction. Every time I use this function I feel like a powerful wizard - able to conjure eldritch data from the very aether! Of course, such great power comes with a cost. It is quite an expensive algorithm to run; you can't do a million raycasts each frame and expect your game to perform decently. Which is fine; we only have to do one raycast per voron, and only when we fracture stuff.

What is all this for? We're going to use it to figure out how to join the extruded vorons together to create a cool looking fragmentation/bending pattern. From the perspective of each voron: do a raycast in the direction pointing away from the center of the impact site. If you don't hit anything, you're an edge piece; if you hit another voron, create a physics hinge joint between the two. Maybe also add a probability (10%?) of not creating a hinge, so that there can be some debris on the ground. Then you have to set up the hinge position and axis: the hinge anchor sits at the midpoint between the two voron sites, and the hinge axis runs along the perpendicular bisector of the vector between them.

This shows the hinges and voronoi sites, though it's a bit buggy here.
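In code, the join step might look something like this. A sketch only: "VoronFragment" and its fields are made-up names, and it assumes the wall's face normal is the fragment's forward axis.

using UnityEngine;

public class VoronFragment : MonoBehaviour
{
    public Vector3 site;             // this fragment's voronoi site (world space)
    public float hingeChance = 0.9f; // so ~10% of pieces stay loose as debris

    public void TryJoin(Vector3 impactPoint)
    {
        // Cast away from the impact; hitting nothing means we're an edge piece.
        Vector3 awayFromImpact = (site - impactPoint).normalized;
        RaycastHit hit;
        if (!Physics.Raycast(site, awayFromImpact, out hit))
            return;

        VoronFragment other = hit.collider.GetComponent<VoronFragment>();
        if (other == null || Random.value > hingeChance)
            return;

        HingeJoint hinge = gameObject.AddComponent<HingeJoint>();
        hinge.connectedBody = other.GetComponent<Rigidbody>();

        // Anchor at the midpoint between the two sites...
        Vector3 mid = (site + other.site) * 0.5f;
        hinge.anchor = transform.InverseTransformPoint(mid);

        // ...with the axis perpendicular to the site-to-site vector
        // (lying in the wall plane, i.e. along the bisector).
        Vector3 between = (other.site - site).normalized;
        Vector3 axis = Vector3.Cross(between, transform.forward).normalized;
        hinge.axis = transform.InverseTransformDirection(axis);
    }
}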

There is also the problem of nightmare vorons. These guys have three voronoi sites that are nearly collinear, so they produce a vertex that is very, very far away. You can see this in the video: some fragment pieces are much longer than the others, and they get stuck in the roof or floor. The solution is rather simple: during vertex generation, just check whether a vertex is too far away from the center, and reject it. While I was trying to implement this, I did create a pretty cool looking bug:

Nightmare vorons! *shudder*
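The rejection test itself is basically a one-liner (maxDistance is whatever tuning value ends up looking right):

using UnityEngine;

static class NightmareFilter
{
    // Discard vertices that land absurdly far from their voronoi site.
    public static bool IsNightmare(Vector2 vertex, Vector2 site, float maxDistance)
    {
        return (vertex - site).sqrMagnitude > maxDistance * maxDistance;
    }
}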
Here are a few cool looking examples of the nearly final result of all this work:


Wall pieces are a bit thicker in this one.


Of course, it needs a bit of tuning (maybe the fragments look a bit too round?), and, well, the wall isn't actually broken behind the 'impact' yet. If it's kind of hard to tell what's going on, just wait till you see it in 3D. It's pretty awesome. I hope you guys like this. This is by far the most intellectually challenging feature I've implemented. As always, feel free to leave comments or questions. Thanks!

Friday, June 21, 2013

Procedural Geometry from Fracturing

Here's the follow-up to the procedural fracturing post I wrote two weeks ago. As for the assets roadmap I talked about in the last post, I've decided that it doesn't quite make sense to go over the assets at this point, when so much is still up in the air. I'm thinking of creating a dedicated roadmap/status page that gives the current progress on everything, for convenient access and updating. Anyways, here's the juice:

Why can’t I hold all these vertices?

Last time we went over how to get the vertices out of a voronoi diagram, and I sort of-maybe-not-really fixed the problem I was having earlier with missing vertices. I scrapped the whole compute shader business and switched it over to Unity C# code, and along the way I discovered LINQ (which is like SQL for code?!?!), but that's another story. Without the compute shader, debugging became both easier and harder. Easier because I could now draw gizmos (3D wireframe objects that are only drawn in the scene view) to show where my vertices are in 3D space, and, you know, you can actually use a debugger with C#. Unfortunately, it also removed the ability to see the whole diagram as a texture... so at this point, I don't really know if I am missing some vertices, because I can't use my cool human mind brain pattern matcher to spot where they're missing. At any rate, I have some (possibly incomplete) set of vertices, and I would like to turn them into actual triangle meshes for rendering and animation. There are two challenges here: triangulation and vertex sorting.

Voron gizmos. Voronoi sites are in white, vertices are various colors.
Try and see if you can find any missing vertices. (no really, please do).
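For reference, drawing gizmos like these takes almost no code. A sketch (class and field names are mine):

using UnityEngine;

public class VoronGizmos : MonoBehaviour
{
    public Vector3[] sites;    // voronoi sites, drawn in white
    public Vector3[] vertices; // recovered voron vertices, drawn in colors

    // Only runs in the scene view; nothing is drawn in the actual game.
    void OnDrawGizmos()
    {
        if (sites == null || vertices == null)
            return;

        Gizmos.color = Color.white;
        foreach (Vector3 s in sites)
            Gizmos.DrawSphere(s, 0.05f);

        foreach (Vector3 v in vertices)
        {
            // Hash the position into a color so neighboring vertices differ.
            Gizmos.color = new Color(
                Mathf.Repeat(v.x * 7.31f, 1f),
                Mathf.Repeat(v.y * 3.17f, 1f),
                Mathf.Repeat((v.x + v.y) * 5.13f, 1f));
            Gizmos.DrawSphere(v, 0.05f);
        }
    }
}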

Triangulation

Terrible garbage (back)
So, we possibly have the vertices for each voron/polygon, and from inspection, some of these vorons have fewer than 3 vertices (possibly an indication of a bug), so make sure you check that each polygon has at least three vertices, because you can't draw anything that can't be turned into triangles. Which brings me to the next fun (terrible) challenge. Maybe you don't know this, but to draw anything in three-dee graphics you have to have triangles, and not just any triangles: the order of the points in each triangle is important (we'll get to that later). If you have more than 3 vertices in your polygon, how do you turn that into triangles? There are a few methods for this, some more robust than others, but it's a confusing topic in general. We can be smart(?) about it and use the knowledge that voronoi cells are guaranteed to be convex to help us a little bit, or maybe we can just copy and paste the ear clipping algorithm from the Unify wiki. Running the ear clipping algorithm on my raw voron vertices results in this garbage:

Terrible garbage (front)

One of the problems here is vertex sorting; the other is that the geometry is actually infinitely thin. Sorting the vertices fixes the former, and extruding the polygons (covered below) fixes the latter.

Vertex sorting

The direction a triangle 'faces' is determined by its winding (which also determines the normal direction). If your triangle isn't facing toward the camera, it isn't rendered, as long as a common optimization called backface culling is enabled. The winding and normal direction are given by the left-hand rule: literally take your left hand and curl it in the direction of the order of the vertices, stick your left thumb out, and that is the way the triangle is facing. The problem with the vertices given by a voronoi diagram is that they come in a completely nonsensical order. Without sorting them, the polygons are invisible or have holes in them.

My first idea was to sort by starting at the beginning of the vertex list, finding the vertex nearest to it, removing the first vertex, then finding the vertex nearest to that one, and so on. Sorry if that's hard to follow, but it works because the vorons are guaranteed to be convex. I didn't spend much time on this technique, because I found out about centroid-based sorting soon after.

Centroid based sorting

After reading this page, I decided to try sorting the vertices in a polygon by their angle from the centroid. The centroid of a polygon is the average position of all of its vertices: (sum of x positions / n, sum of y positions / n) for n vertices. You then find the angle between some reference direction (which you call zero degrees) and the vector from the centroid to each vertex. The page I linked explains it in decent detail. To get a left-handed winding, you want to sort so that the vertices with the highest angle come first. C# has a neato List<T> method called .Sort() which takes a comparison function and sorts the list for you! Sweet! Just write something that compares two vertices by their centroid angle and pass it to .Sort(). Done! Well, sort of. I'd post a picture of the sorted vertices, but I actually haven't finished this part yet, oops.
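The comparison-function version looks roughly like this; sorting by descending angle gives the winding described above (names are mine):

using System.Collections.Generic;
using UnityEngine;

static class VoronSorter
{
    public static void SortByCentroidAngle(List<Vector2> verts)
    {
        // Centroid = average of all vertex positions.
        Vector2 centroid = Vector2.zero;
        foreach (Vector2 v in verts)
            centroid += v;
        centroid /= verts.Count;

        // Sort by angle around the centroid, highest angle first.
        verts.Sort((a, b) =>
        {
            float angleA = Mathf.Atan2(a.y - centroid.y, a.x - centroid.x);
            float angleB = Mathf.Atan2(b.y - centroid.y, b.x - centroid.x);
            return angleB.CompareTo(angleA);
        });
    }
}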

Thickening the dimensions 

The rest of this post contains my thoughts on how to create an actual 3D volume from the sorted vertices. To get thickness out of these polygons, a simple extrusion will do. In Maya, you just hit the extrude button on your mesh and you're done! Oops again, we're not using Maya; we're creating vertices at runtime using a pile of half-baked algorithms held together by hopes and dreams. Ask my friends how many times I have incredulously announced, "I can't believe this worked!"
The process I am imagining goes like this:
  1. Duplicate all vertices and triangles, and offset them by some thickness amount in whatever direction. We’ll call these vertices’ (prime).
  2. Add triangles to the triangle list that connect each vertex, its vertex', and one of the adjacent vertices. The next triangle begins with the vertex across from the last one you chose. Repeat all the way around the edge until you have a full ring.
  3. Cross your fingers.
My probability estimate of this working is 65%. (edit: it actually worked; there's a sketch of it below)
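Here's the sketch: duplicate the ring, offset it, and stitch the two rings together with side quads (two triangles per edge). Names are mine; "ring" is the sorted polygon outline and "capTriangles" is its triangulation.

using System.Collections.Generic;
using UnityEngine;

static class VoronExtruder
{
    public static Mesh Extrude(Vector3[] ring, int[] capTriangles, Vector3 offset)
    {
        int n = ring.Length;
        var verts = new List<Vector3>(ring);   // front ring
        for (int i = 0; i < n; i++)
            verts.Add(ring[i] + offset);       // back ring: the vertices' (prime)

        var tris = new List<int>(capTriangles); // front cap
        for (int i = 0; i < capTriangles.Length; i += 3)
        {
            // Back cap: same triangles with offset indices, winding reversed
            // so the face points the other way.
            tris.Add(capTriangles[i] + n);
            tris.Add(capTriangles[i + 2] + n);
            tris.Add(capTriangles[i + 1] + n);
        }

        // Side wall: one quad (two triangles) per edge of the ring.
        for (int i = 0; i < n; i++)
        {
            int j = (i + 1) % n;
            tris.Add(i); tris.Add(i + n); tris.Add(j);
            tris.Add(j); tris.Add(i + n); tris.Add(j + n);
        }

        var mesh = new Mesh();
        mesh.vertices = verts.ToArray();
        mesh.triangles = tris.ToArray();
        mesh.RecalculateNormals();
        return mesh;
    }
}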

That’s it for this week. I’ll post up some pictures of my completed procedural fracturing stuff as soon as I get it working. As always, comments, questions, and requests are appreciated.

Edit: that was quick

Turns out the algorithm I'd been using for finding the circumcircle center was incorrect. After implementing the algorithm from the Wikipedia page, things are starting to look a lot less garbage (though still garbage).

Better garbage!
Even more voron gizmos!
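For anyone else chasing the same bug, the corrected circumcenter math (from the Wikipedia "Circumscribed circle" page) looks like this in 2D; a sketch, with names of my own choosing. Note the denominator: it goes to zero when the three points are nearly collinear, which is exactly where secret NaNs (and nightmare vorons) come from.

using UnityEngine;

static class Circumcircle
{
    public static Vector2 Center(Vector2 a, Vector2 b, Vector2 c)
    {
        // Denominator: approaches zero as a, b, c become collinear.
        float d = 2f * (a.x * (b.y - c.y) + b.x * (c.y - a.y) + c.x * (a.y - b.y));

        float aa = a.x * a.x + a.y * a.y; // squared lengths, reused below
        float bb = b.x * b.x + b.y * b.y;
        float cc = c.x * c.x + c.y * c.y;

        float ux = (aa * (b.y - c.y) + bb * (c.y - a.y) + cc * (a.y - b.y)) / d;
        float uy = (aa * (c.x - b.x) + bb * (a.x - c.x) + cc * (b.x - a.x)) / d;
        return new Vector2(ux, uy);
    }
}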