Friday, June 28, 2013

Procedural fracturing: Mesh generation and physics

This will probably be the last post on procedural fracturing. Why? Because it's mostly done!  In a few hours of heroic brain effort, I went from this:
Lines are drawn between centroids and vertices in this image.
There's also an axis pointing right; it's there for debug purposes.
To this:
A fully procedural mesh!

The little things

After so much brain destruction over this, it turned out there were just two bugs: one in finding the center of each circumcircle (I was dividing by zero somewhere, getting secret NaNs), and another in the centroid sorting algorithm. I also facepalmed really hard because the first sentence on the wiki page for polygon triangulation is "A convex polygon is trivial to triangulate in linear time, by adding diagonals from one vertex to all other vertices." Oops. I threw out the copy/paste ear clipping algorithm and just wrote my own thing to add diagonals. It's funny how something can seem so simple and obvious in retrospect. The only downside to this technique is that the geometry it creates isn't really suitable for animation; I'll work around that by joining vorons together instead of animating each voron individually.
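For the curious, the diagonal/fan approach boils down to just a few lines. This is a minimal sketch (the method name and the Mesh-ready index array are mine for illustration, not copied from the project), assuming the vertices are already sorted into winding order:

```csharp
using System.Collections.Generic;

// Fan triangulation for a convex polygon: add diagonals from vertex 0 to every
// other vertex, pairing it with each consecutive edge. Assumes the vertex list
// is already sorted into a consistent winding order.
static int[] FanTriangulate(int vertexCount)
{
    var triangles = new List<int>();
    for (int i = 1; i < vertexCount - 1; i++)
    {
        triangles.Add(0);       // the shared fan vertex
        triangles.Add(i);       // current vertex
        triangles.Add(i + 1);   // next vertex around the polygon
    }
    return triangles.ToArray(); // indices ready for Mesh.triangles
}
```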

Oh, and the extrusion algorithm I hypothesized in the previous post actually worked! ("I can't believe this actually worked!") Here's a video demonstrating mesh generation and some simple rigidbody physics.


Physics.Raycast() or why I love Unity

Now that we have actual GameObjects and meshes, we can do away with all this math and start brute forcing things! The wonder of Physics.Raycast() is one of the reasons I started looking into game development in the first place.
Raycast is actually a very descriptive name because it is a magic spell. You pick an origin and a direction, and it will tell you what is in that direction. Every time I use this function I feel like a powerful wizard - able to conjure eldritch data from the very aether! Of course, such great power comes with a cost: it is quite an expensive algorithm to run, and you can't do a million raycasts each frame and expect your game to perform decently. Which is fine: we only have to do one raycast per voron, and only when we fracture stuff.
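If you've never seen the spell cast, here's about the simplest possible incantation (a toy example, not code from the game): attach this to any object and it reports whatever sits within 100 units in front of it.

```csharp
using UnityEngine;

public class RaycastExample : MonoBehaviour
{
    void Update()
    {
        // Cast a ray from this object's position along its forward direction.
        // Raycast returns true and fills 'hit' if anything with a collider is there.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, 100f))
        {
            Debug.Log("Conjured from the aether: " + hit.collider.name +
                      " at distance " + hit.distance);
        }
    }
}
```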

What is all this for? We're going to use it to figure out how to join the extruded vorons together to create a cool looking fragmentation/bending pattern. From the perspective of each voron, do a raycast in the direction pointing away from the center of the impact site. If you don't hit anything, you're an edge piece; if you hit another voron, you create a physics hinge joint between the two. Maybe also have a probability (10%?) of not creating a hinge so that there can be some debris on the ground. Then you have to set up the hinge position and axis: the hinge position is the midpoint between the two voron sites, and the hinge axis runs along the bisector of that site-to-site vector (perpendicular to it).
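Here's roughly what I have in mind, sketched out in code. Everything here is provisional - the component name, the wallNormal parameter, the join distance, and the 10% debris chance are placeholders for whatever the real fracture code ends up using:

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class VoronJoiner : MonoBehaviour
{
    public Vector3 site;            // this fragment's voronoi site (world space)
    public float joinDistance = 2f; // how far to look for a neighboring voron

    public void TryJoin(Vector3 impactPoint, Vector3 wallNormal)
    {
        Vector3 awayFromImpact = (site - impactPoint).normalized;
        RaycastHit hit;
        if (!Physics.Raycast(site, awayFromImpact, out hit, joinDistance))
            return; // nothing out there: this voron is an edge piece

        VoronJoiner other = hit.collider.GetComponent<VoronJoiner>();
        if (other == null || Random.value < 0.1f)
            return; // not a voron, or skip the hinge so some debris falls loose

        // The hinge sits halfway between the two sites; its axis lies in the
        // wall plane, perpendicular to the line between the sites.
        Vector3 midpoint = (site + other.site) * 0.5f;
        Vector3 axis = Vector3.Cross(other.site - site, wallNormal).normalized;

        HingeJoint hinge = gameObject.AddComponent<HingeJoint>();
        hinge.connectedBody = other.GetComponent<Rigidbody>();
        hinge.anchor = transform.InverseTransformPoint(midpoint); // anchor is in local space
        hinge.axis = transform.InverseTransformDirection(axis);   // and so is the axis
    }
}
```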

This shows the hinges and voronoi sites, though it's a bit buggy here.

There is also the problem of nightmare vorons. These guys have three voronoi sites that are close to collinear, so they produce a vertex that is very, very far away. You can see this in the video: some fragment pieces are much longer than the others, and they get stuck in the roof or floor. The solution is rather simple: just check whether a vertex is too far away from the center during vertex generation. While I was trying to implement this, I did create a pretty cool looking bug:
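The check itself is about one line of LINQ. A sketch (the cutoff and the names are arbitrary, just to show the shape of it):

```csharp
using UnityEngine;
using System.Collections.Generic;
using System.Linq;

// Reject any generated vertex that lands too far from its site; nearly-collinear
// site triplets push circumcircle centers way out, creating nightmare vorons.
static List<Vector2> RejectRunawayVertices(Vector2 site, IEnumerable<Vector2> vertices,
                                           float maxDistance)
{
    return vertices.Where(v => (v - site).magnitude <= maxDistance).ToList();
}
```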

Nightmare vorons! *shudder*
Here are a few cool looking examples of the nearly final result of all this work:


Wall pieces are a bit thicker in this one.


Of course, it needs a bit of tuning (maybe the fragments look a bit too round?), and well, the wall isn't actually broken behind the 'impact' yet. If it's kind of hard to tell what's going on, just wait till you see it in 3D. It's pretty awesome. I hope you guys like this. This is by far the most intellectually challenging feature I've implemented. As always, feel free to leave comments or questions. Thanks!

Friday, June 21, 2013

Procedural Geometry from Fracturing

Here’s the follow up to the procedural fracturing post I wrote two weeks ago. As for the assets roadmap I talked about in the last post, I have decided that it doesn’t quite make sense to go over the assets at this point, when so much of that is still up in the air. I’m thinking of creating a dedicated roadmap/status page that gives the current progress on all stuff, for convenient access and updating. Anyways, here’s the juice:

Why can’t I hold all these vertices?

Last time we went over how to get the vertices out of a voronoi diagram, and I sort of-maybe-not-really fixed the problem I was having earlier with missing some of the vertices. I scrapped the whole compute shader business and switched it over to Unity C# code, and along the way I discovered LINQ (which is like SQL for code?!?!) but that's another story. Without using a compute shader, debugging became both easier and harder. Easier because I could now draw gizmos (3D wireframe objects that are only drawn in the scene view) to show where my vertices are in 3D space. And, you know, you can actually use a debugger for C#. Unfortunately, it also removed the ability to see the whole diagram as a texture... so at this point, I don't really know whether I am missing some vertices, because I can't use my cool human mind brain pattern matcher to detect the vertices or to see where they are missing. At any rate, I have some (possibly incomplete) set of vertices, and I would like to turn them into actual triangle meshes for rendering and animation. There are two challenges here: triangulation and vertex sorting.
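If you haven't used gizmos before, this is the kind of thing I mean - a rough sketch of a debug component (field names invented for the example) that draws a sphere at every site and vertex in the scene view:

```csharp
using UnityEngine;

public class VoronGizmos : MonoBehaviour
{
    public Vector3[] sites;     // voronoi sites, drawn in white
    public Vector3[] vertices;  // computed voron vertices, drawn in various colors

    // OnDrawGizmos only runs in the editor, so none of this ships with the game.
    void OnDrawGizmos()
    {
        if (sites == null || vertices == null) return;

        Gizmos.color = Color.white;
        foreach (var s in sites)
            Gizmos.DrawSphere(s, 0.05f);

        for (int i = 0; i < vertices.Length; i++)
        {
            // Derive a stable color from the index so each vertex keeps its color.
            Gizmos.color = Color.HSVToRGB((i * 0.618f) % 1f, 1f, 1f);
            Gizmos.DrawSphere(vertices[i], 0.03f);
        }
    }
}
```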

Voron gizmos. Voronoi sites are in white, vertices are various colors.
Try and see if you can find any missing vertices. (no really, please do).

Triangulation

Terrible garbage (back)
So, we possibly have the vertices for each voron/polygon, and from inspection, some of these vorons have fewer than three vertices (possibly an indication of a bug). Make sure you check that each polygon has at least three vertices, because you can't really draw anything that can't be turned into triangles. Which brings me to the next fun (terrible) challenge. Maybe you don't know this, but to draw anything in three-dee graphics you have to have triangles, and not just any triangles - the order of the points in each triangle is important (we'll get to that later). If you have more than three vertices in your polygon, how do you turn that into triangles? There are a few methods for this, some more robust than others, but it's a confusing topic in general. We can be smart(?) about this and use the knowledge that voronoi diagram sections are guaranteed to be convex to help us a little bit, or maybe we can just copy and paste the ear clipping algorithm located on the Unify wiki. Running the ear clipping algorithm on my raw voron vertices results in this garbage:

Terrible garbage (front)






One of the problems here is vertex sorting; the other is that the geometry is actually infinitely thin. We'll need to sort the vertices before we can fix the latter.



Vertex sorting

The direction a triangle 'faces' is determined by the winding of the triangle (which also determines the normal direction). If your triangle isn't facing toward the camera, it isn't rendered, as long as a common optimization called backface culling is enabled. The winding and normal direction are given by the left hand rule: literally take your left hand and curl it in the direction of the order of the vertices, stick your left thumb out, and that is the way the triangle is facing. The problem with the vertices given by a voronoi diagram is that they come in a completely nonsensical order. Without sorting them, the polygons are invisible or have holes in them.
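In code form, the facing direction is just a cross product of two edges. A tiny sketch (not project code) for checking which way a triangle points:

```csharp
using UnityEngine;

static class TriangleFacing
{
    // Face normal of a triangle wound v0 -> v1 -> v2. If this points away from
    // the camera, backface culling will skip the triangle; swapping any two
    // vertices flips the winding, and therefore the normal.
    public static Vector3 FaceNormal(Vector3 v0, Vector3 v1, Vector3 v2)
    {
        return Vector3.Cross(v1 - v0, v2 - v0).normalized;
    }
}
```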

My first idea was to sort the vertices by starting at the beginning of the vertex list and finding the nearest vertex, then removing the first vertex and finding the vertex nearest to that one, and so on. Sorry if that is super hard to follow, but it works because the vorons are guaranteed to be convex. I didn't spend too much time on this technique because I found out about centroid based sorting soon after.

Centroid based sorting

After reading this page, I decided to try sorting the vertices in a polygon by their angle from the centroid. The centroid of a polygon is the average position of all of its vertices: (sum of x positions / n, sum of y positions / n). You can find the angle between some starting direction (which you'd call zero degrees) and the vector from the centroid to each vertex. The page I linked explains it in decent detail. To get a left handed winding, you'd want to sort the vertices so that the ones with the highest angle come first. C# has a neato List<T> method called .Sort() which takes a comparison function and sorts the list for you! Sweet! Just write something that compares two vertices by their centroid angle and pass it to .Sort(). Done! Well, sort of. I'd post a picture of the sorted vertices, but I actually haven't finished this part yet, oops.
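Here's roughly what that comparison looks like, sketched for vertices lying in a 2D plane (I haven't finished my version, so treat this as the idea rather than the actual code):

```csharp
using UnityEngine;
using System.Collections.Generic;
using System.Linq;

static class CentroidSort
{
    // Sort a convex polygon's vertices around their centroid, highest angle first.
    public static void SortAroundCentroid(List<Vector2> verts)
    {
        // The centroid is just the average of the vertex positions.
        Vector2 centroid = new Vector2(verts.Average(v => v.x),
                                       verts.Average(v => v.y));

        // Compare two vertices by the angle of the centroid-to-vertex vector.
        verts.Sort((a, b) =>
        {
            float angleA = Mathf.Atan2(a.y - centroid.y, a.x - centroid.x);
            float angleB = Mathf.Atan2(b.y - centroid.y, b.x - centroid.x);
            return angleB.CompareTo(angleA); // descending: highest angle first
        });
    }
}
```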

Thickening the dimensions 

The rest of this post contains my thoughts on how to create an actual 3D volume from the sorted vertices. To get thickness out of these polygons, a simple extrusion would work for me. In Maya, just hit the extrude button on your mesh and you’re done! Oops again, we’re not using Maya, we’re creating vertices at runtime using a pile of half baked algorithms which are held together by hopes and dreams. Ask my friends how many times I have incredulously announced, “I can’t believe this worked!”
The process I am imagining goes like this (there's a rough code sketch after the list):
  1. Duplicate all vertices and triangles, and offset them by some thickness amount in whatever direction. We’ll call these vertices’ (prime).
  2. Add triangles to the triangle list that contain the vertex, vertex’ and one of the adjacent vertices. The next triangle begins with the vertex across from the last vertex you chose. Repeat all the way around the edge until you have a full ring.
  3. Cross your fingers.
My probability estimate of this working is 65%. (edit: it actually worked)
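Here's what those steps might look like for a single voron, assuming the outline is already sorted (this is just my sketch of steps 1 and 2, not the actual implementation):

```csharp
using UnityEngine;
using System.Collections.Generic;
using System.Linq;

static class VoronExtruder
{
    // Extrude a sorted convex 2D outline along +Z by 'thickness'.
    public static Mesh Extrude(List<Vector2> outline, float thickness)
    {
        int n = outline.Count;
        var verts = new List<Vector3>();
        verts.AddRange(outline.Select(p => new Vector3(p.x, p.y, 0f)));        // original ring
        verts.AddRange(outline.Select(p => new Vector3(p.x, p.y, thickness))); // the vertices' (prime)

        var tris = new List<int>();
        for (int i = 1; i < n - 1; i++)
        {
            tris.AddRange(new[] { 0, i, i + 1 });          // front cap (fan)
            tris.AddRange(new[] { n, n + i + 1, n + i });  // back cap, reversed winding
        }

        for (int i = 0; i < n; i++)
        {
            int next = (i + 1) % n;
            // Two triangles per edge: vertex, vertex', and the adjacent vertices.
            tris.AddRange(new[] { i, i + n, next });
            tris.AddRange(new[] { next, i + n, next + n });
        }

        var mesh = new Mesh();
        mesh.vertices = verts.ToArray();
        mesh.triangles = tris.ToArray();
        mesh.RecalculateNormals();
        return mesh;
    }
}
```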

That’s it for this week. I’ll post up some pictures of my completed procedural fracturing stuff as soon as I get it working. As always, comments, questions, and requests are appreciated.

Edit: that was quick

Turns out the algorithm I've been using for finding the circumcircle center was incorrect. After implementing the algorithm from the Wikipedia page, things are starting to look a lot less like garbage (though still garbage).

Better garbage!
Even more voron gizmos!

Friday, June 14, 2013

Road Map: Programming

Hi everyone! Here's a first draft of the road map for Bad Things Happen in Space. This is my first attempt at collecting my thoughts into a roadmappy structure, so it will be a little rough. Of course, everything is super subject to change and probably not going to resemble the final product. Part one includes an overview of programming tasks. Part two will have an overview of assets, and probably some other stuff. But first, LENS FLARES!

*Chorus of angels*

High Level Overview

or what I need to do to launch the kickstarter:

Mechanics programming (two/three months)
Design tools and first ship (one week)
Assets for alpha release (one month? maybe less)
Produce video (no idea, two weeks maybe?)

Kickstarter! (four to five months out?)


During that time, I will also need to find somewhere to live for August and also move to a storage container, and then move out of the storage container into Boston some time in September, so these estimates are only relevant to Actual Time Spent Working on Game.

Programming Tasks

This section outlines what I definitely want to get done for alpha release, and is probably even less indicative of what the final game will include. Each subheading will include some stuff that I would consider to be in that section, and I will order each bullet from most done to least done.

Player mechanics (40% complete)
  • Moving (done) 
  • Damage / healing / dying (mostly done) 
  • Using stuff (sort of done) 
  • Reviving (currently broken) 
  • Combat (not started) 
  • Incompetence (not started) 
  • Perks (not started) 

Fire (80% complete)
  • Starting (done) 
  • Spreading (done) 
  • Extinguishing (mostly done) 
  • Damaging (mostly done) 
  • Appearance (needs some work) 

Oxygen system (70% complete)
  • Asphyxiation (done) 
  • Air Flow (mostly done) 
  • Appearance (GPU particle system is currently broken in Unity betas... sort of done) 

Item Mechanics (-10% complete???)
  • Undo permanent fire extinguisher (negative done) 
  • Item planning (started) 
  • Using items (started) 
  • Picking up, spawning, equipping items (not started) 
  • Individual item mechanics (maybe started) 

Ship mechanics (20% complete)
  • Procedural fracturing (in progress) 
  • Component damage (started) 
  • Hull damage/destruction (not started) 

Station mechanics (10% complete)
  • Station planning (started) 
  • General station infrastructure (barely started) 
  • Individual station mechanics (not started) 

Multiplayer (30% complete)
  • Basic synchronization (mostly done) 
  • Prediction/smoothing (not started) 
  • Connection management / assistance (not started) 

Interface, menu, options (10% complete)
  • Interface (started) 
  • Menus (not started) 
  • Options (not started) 

Rendering (70% complete)
  • Physical lighting (mostly done... some stupid stuff persists and needs fixing) 
  • Effects (started) 
  • Reflections (not started) 

Ship building tools (40% complete)
  • Air flow calculation and baking (mostly done) 
  • Ship component organization (mostly done) 
  • Layout tools (not started) 

Scenario stuff (5% complete)
  • Scenario planning (started) 
  • Scenario tools (not started) 
  • Scenario mechanics programming (not started) 

Peripheral Integration (40% complete)
  • Oculus Rift support (needs tuning)
  • Sixense Hydra support (needs a lot of tuning)

So, my intuition says that alpha release is at least 20% complete at this point, and note that not all tasks are weighted equally in time or importance. If you’d like, feel free to leave feedback (via comment or directly to kelly@pyromuffin.com), including stuff you think I’m missing or stuff you’d like to see.

Friday, June 7, 2013

Procedural fracturing for kinetic impacts

So, here is the first post that actually has to do with the development of Bad Things. I've got a few ideas for posts - some are overview/roadmappy things, and some are more retrospective - but I thought I'd start with what I'm actually working on right now. Before we get started, if you haven't noticed, I've included some pages on the sidebar for About, Images, and Videos.

wtf is procedural fracturing?

The idea here is that when a kinetic weapon (something that is impact or kinetic energy based) collides with the hull of the ship, it should create an entry point that appears to have a broken hole in the center, and shards of bent and peeled metal around the impact site. There are several ways to go about this. You could just create some animated meshes that look like bent metal going through various states of broken-ness or repaired-ness, but that's boring: you'll see the same impact model over and over again, and repairing it will always be the same task. What we want is some way to create a believable (or at least interesting) destruction shape, procedurally and quickly. After looking into some procedural destruction techniques, I have settled on using Voronoi diagrams to specify the vertices of the model I'll use for destruction.
Sample fracture pattern from early Voronoi diagram compute shader.

Vorons

The regions (which I have deemed "vorons") seen in the image above are created by choosing some points or "sites" on a 2D plane and determining to which point each pixel is closest. A neat property of vorons is that each voron is guaranteed to be convex - which is an important property for procedurally creating 3D models. Choosing sites carefully, we can create a fracture pattern that looks pretty cool. Making these images in a compute shader is stupid easy, but getting useful information out of them is a bit trickier.
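For flavor, here's what the per-pixel test looks like as plain C# on the CPU (the real thing was a compute shader, and these names are just for the example):

```csharp
using UnityEngine;

static class VoronoiTexture
{
    // Color every pixel by whichever site it is closest to.
    public static Texture2D Draw(Vector2[] sites, Color[] siteColors, int width, int height)
    {
        var tex = new Texture2D(width, height);
        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                int closest = 0;
                float best = float.MaxValue;
                for (int i = 0; i < sites.Length; i++)
                {
                    float d = (new Vector2(x, y) - sites[i]).sqrMagnitude;
                    if (d < best) { best = d; closest = i; }
                }
                tex.SetPixel(x, y, siteColors[closest]);
            }
        }
        tex.Apply();
        return tex;
    }
}
```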

So, the ultimate goal here is to get the vertices that make up each voron. The vertices of a voron are the points that are equidistant from exactly three sites. If this is confusing, look at the diagram below and think about it for a few seconds until it isn't.
Site is circled in orange, vertices are in light blue.

We can use the vertices of each voron to create a 3D model that shows how the hull might fracture when impacted by a kinetic weapon. The center, fully enclosed region is the part that is actually destroyed; imagine the shards that surround the center bent back on themselves.

Back to getting the vertices. To do this, I first thought I could brute-force my way into finding the vertices by sorting the sites by distance per pixel and checking whether the closest distances were really close to each other - and then pow! If the distances were close enough, your pixel is a vertex! But no, pixels are not nice enough for this; they are too imprecise. Every time I tried this, even after finely adjusting the tolerances, there were either too many pixels for some vertices or not enough for others. Ah, too bad so sad. This image based technique will not work for me, unfortunately. If only there were an analytical way to find the vertices...

Circumcircle Centers

I did some research on Voronoi diagrams and I came across this little gem: http://www.cs.tufts.edu/comp/163/notes05/voronoi_handout.pdf
This handout has a bunch of information on Voronoi diagrams and a lot of it is way more than I need, but I did find something that I could use to calculate the vertices of each voron analytically:

To get the vertices for a certain site (let's call it sexySite), we just need to create all unique triplets of sites that include sexySite, find the center of the circumcircle created by each triplet, and lastly, check that there are no other sites within that circumcircle. That last condition is important; without it we would get the intersection of every three sites that include sexySite, not just the ones that make up the voron we want the vertices for. Run this algorithm for each site, and you get the vertices that make it up! Sure, it's kind of slow, but that's fine if you're only doing it once in a while. Oh, and also, this doesn't work for extreme sites. These aren't just extra badass risk-taking sites; they are the ones on the outside of the diagram that are unbounded on one side... so to fix that problem, just choose your sites so they are surrounded by some extreme sites that you don't care about.
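For the curious, here's a sketch of the per-triplet test. The circumcenter formula is the standard Cartesian one (the same one on the Wikipedia circumscribed circle page); the tolerances and the guard against nearly-collinear triplets are my own additions for the example, not necessarily what the game uses:

```csharp
using UnityEngine;

static class VoronVertices
{
    // Try to turn one triplet of sites into a voron vertex: compute the
    // circumcircle center, then make sure no other site lies inside the circle.
    public static bool TryGetVertex(Vector2 a, Vector2 b, Vector2 c,
                                    Vector2[] allSites, out Vector2 vertex)
    {
        vertex = Vector2.zero;
        float d = 2f * (a.x * (b.y - c.y) + b.x * (c.y - a.y) + c.x * (a.y - b.y));
        if (Mathf.Abs(d) < 1e-6f) return false; // (nearly) collinear sites: no circumcircle

        float ux = (a.sqrMagnitude * (b.y - c.y) + b.sqrMagnitude * (c.y - a.y)
                  + c.sqrMagnitude * (a.y - b.y)) / d;
        float uy = (a.sqrMagnitude * (c.x - b.x) + b.sqrMagnitude * (a.x - c.x)
                  + c.sqrMagnitude * (b.x - a.x)) / d;
        vertex = new Vector2(ux, uy);

        // The circumcircle must be empty of all other sites.
        float radius = (vertex - a).magnitude;
        foreach (var s in allSites)
        {
            if (s == a || s == b || s == c) continue;
            if ((vertex - s).magnitude < radius - 1e-5f) return false;
        }
        return true;
    }
}
```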

Current State

Right now, I'm working out the bugs in getting all the vertices from a set of sites. There is something weird going on where I'm not getting the furthest vertices on the left or right sides... I've narrowed it down to an optimization, but as far as I can tell the optimization really shouldn't affect this. Once I've got that working reliably, I'll start bringing these vertices together into a mesh, and then figure out how to bend them in a way that looks cool!

Monday, June 3, 2013

Bad Things Happen in Space

Hello everyone! As you may or may not know, I've recently put in notice to quit my job at Intel to work on my game full time. This is where I'll be cataloging the development process for my game: Bad Things Happen in Space. The goal here is to update this development blog at least once a week, but updates may come more frequently. I'll be talking about what I'm working on, what some challenges are, and what I plan to do. The posts will be fairly technical, so keep that in mind. In the near future, I'd like to set up some other pages on this site for media and demos and stuff, so stay tuned! (bad) Things are just getting started!