This will probably be the last post on procedural fracturing. Why? Because it's mostly done! In a few hours of heroic brain effort, I went from this:
Lines are drawn between centroids and vertices in this image.
There's also an axis pointing right for debug purposes.
To this:
A fully procedural mesh!
The little things
After so much brain destruction over this, it came down to two bugs: one in finding the center of each circumcircle (I was dividing by zero somewhere and getting secret NaNs), and another in the centroid sorting algorithm. I also facepalmed really hard, because the first sentence on the wiki page for polygon triangulation is "A convex polygon is trivial to triangulate in linear time, by adding diagonals from one vertex to all other vertices." Oops. I threw out the copy/pasted ear clipping algorithm and just wrote my own routine to add diagonals. It's funny how something can seem so simple and obvious in retrospect. The only downside to this technique is that the resulting geometry isn't really suitable for animation; I'll be working around that by joining vorons together instead of animating them individually.
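For anyone curious, that diagonal fan is about as small as triangulation code gets. Here's a minimal sketch, assuming the cell's vertices are already sorted counter-clockwise around the centroid (the class and method names are just for illustration):

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class ConvexTriangulator
{
    // Fan triangulation: every triangle shares vertex 0, so the diagonals
    // from that vertex to all the others split a convex polygon in linear time.
    public static int[] Triangulate(IList<Vector3> verts)
    {
        var indices = new List<int>();
        for (int i = 1; i < verts.Count - 1; i++)
        {
            indices.Add(0);
            indices.Add(i);
            indices.Add(i + 1);
        }
        return indices.ToArray(); // (n - 2) triangles for an n-gon
    }
}
```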
Oh, and the extrusion algorithm I hypothesized in the previous post actually worked! ("I can't believe this actually worked!") Here's a video demonstrating mesh generation and some simple rigidbody physics.
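I won't re-derive the extrusion here (the details are in that previous post), but a rough sketch of turning a flat convex cell into a prism could look something like the following. BuildPrism, the outline parameter, and the winding choices are my own assumptions, not the project's actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class CellExtruder
{
    // Extrude a convex outline (assumed counter-clockwise, in the XY plane)
    // backwards along -Z to form a prism. Winding may need flipping depending
    // on which side should be visible in your scene.
    public static Mesh BuildPrism(IList<Vector2> outline, float thickness)
    {
        int n = outline.Count;
        var verts = new Vector3[n * 2];
        for (int i = 0; i < n; i++)
        {
            verts[i]     = new Vector3(outline[i].x, outline[i].y, 0f);         // front cap
            verts[i + n] = new Vector3(outline[i].x, outline[i].y, -thickness); // back cap
        }

        var tris = new List<int>();
        for (int i = 1; i < n - 1; i++)
        {
            // Front cap: the same diagonal fan as above.
            tris.Add(0); tris.Add(i); tris.Add(i + 1);
            // Back cap: the same fan with reversed winding so it faces the other way.
            tris.Add(n); tris.Add(n + i + 1); tris.Add(n + i);
        }
        for (int i = 0; i < n; i++)
        {
            // Side walls: one quad (two triangles) per outline edge.
            int j = (i + 1) % n;
            tris.Add(i); tris.Add(j); tris.Add(i + n);
            tris.Add(j); tris.Add(j + n); tris.Add(i + n);
        }

        var mesh = new Mesh { vertices = verts, triangles = tris.ToArray() };
        mesh.RecalculateNormals();
        return mesh;
    }
}
```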
Physics.Raycast() or why I love Unity
Now that we have actual GameObjects and meshes, we can do away with all this math and start brute forcing things! The wonder of Physics.Raycast() is one of the reasons I started looking into game development in the first place.
Raycast is actually a very descriptive name, because it is a magic spell. You pick an origin and a direction, and it tells you what is in that direction. Every time I use this function I feel like a powerful wizard, able to conjure eldritch data from the very aether! Of course, such great power comes with a cost: it's quite an expensive operation, and you can't do a million raycasts each frame and expect your game to perform decently. Which is fine; we only need one raycast per voron, and only when we fracture something.
What is all this for? We're going to use raycasts to figure out how to join the extruded vorons together and create a cool-looking fragmentation/bending pattern. From the perspective of each voron, do a raycast in the direction pointing away from the center of the impact site. If you don't hit anything, you're an edge piece; if you hit another voron, you create a physics hinge joint between the two. Maybe also add a probability (10%?) of not creating a hinge, so that there can also be some debris on the ground. Then you have to set up the hinge position and axis: the hinge position is the midpoint between the two voron sites, and the hinge axis is the bisector of that vector (i.e. perpendicular to it).
This shows the hinges and voronoi sites, though it's a bit buggy here.
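A rough Unity-flavored sketch of that joining pass might look like this. The names (VoronJoiner, site, impactPoint, debrisChance) and the use of the neighbour's center of mass as a stand-in for its voronoi site are my assumptions, not the actual project code:

```csharp
using UnityEngine;

public class VoronJoiner : MonoBehaviour
{
    public float raycastDistance = 1f;
    [Range(0f, 1f)] public float debrisChance = 0.1f; // ~10% of pieces get no hinge

    public void Join(Rigidbody voron, Vector3 site, Vector3 impactPoint)
    {
        // Cast away from the impact site; an edge piece hits nothing.
        Vector3 away = (site - impactPoint).normalized;
        if (!Physics.Raycast(site, away, out RaycastHit hit, raycastDistance))
            return; // edge piece, leave it free

        Rigidbody neighbour = hit.rigidbody;
        if (neighbour == null || Random.value < debrisChance)
            return; // occasionally skip the hinge so some pieces fall as debris

        // Stand-in for the neighbour's voronoi site.
        Vector3 neighbourSite = neighbour.worldCenterOfMass;

        var hinge = voron.gameObject.AddComponent<HingeJoint>();
        hinge.connectedBody = neighbour;

        // The hinge sits halfway between the two sites...
        Vector3 mid = (site + neighbourSite) * 0.5f;
        hinge.anchor = voron.transform.InverseTransformPoint(mid);

        // ...and its axis is perpendicular to the site-to-site vector,
        // here taken in the plane of the wall (assumed to face +Z).
        Vector3 between = (neighbourSite - site).normalized;
        Vector3 axis = Vector3.Cross(between, Vector3.forward);
        hinge.axis = voron.transform.InverseTransformDirection(axis);
    }
}
```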
There is also the problem of nightmare vorons. These guys have three voronoi sites that are nearly collinear, so they produce a vertex that is very, very far away. You can see this in the video: some fragment pieces are much longer than the others and get stuck in the roof or floor. The fix is rather simple: during vertex generation, just check whether a vertex is too far away from the cell's center (a minimal sketch of that check follows the image below). While I was trying to implement this, I did create a pretty cool looking bug:
Nightmare vorons! *shudder*
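For reference, the distance check can be as small as this. The clamping behaviour and the maxRadius parameter are assumptions on my part; rejecting the vertex (or the whole cell) outright would work just as well:

```csharp
using UnityEngine;

public static class VoronSanitizer
{
    // Near-collinear voronoi sites push a circumcenter absurdly far away;
    // pull any such runaway vertex back within maxRadius of the cell's site.
    public static Vector2 ClampVertex(Vector2 vertex, Vector2 site, float maxRadius)
    {
        Vector2 offset = vertex - site;
        if (offset.sqrMagnitude > maxRadius * maxRadius)
            return site + offset.normalized * maxRadius;
        return vertex;
    }
}
```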
Here are a few cool-looking examples of the nearly final result of all this work:
Wall pieces are a bit thicker in this one.
Of course, it needs a bit of tuning (maybe the fragments look a bit too round?), and, well, the wall isn't actually broken behind the 'impact' yet. If it's kind of hard to tell what's going on, just wait till you see it in 3D. It's pretty awesome. I hope you guys like this. This is by far the most intellectually challenging feature I've implemented. As always, feel free to leave comments or questions. Thanks!