Hellfire010 — Posted September 8, 2007

NOTE: This post is based on how I think this whole thing works, and it is more than possible that I am way off, but it should hold some relevance either way. I was wondering what the falloff rate is for the most common blurs (Gaussian, feather, etc.). My example assumes you're blending a solid area into a transparent area; it would obviously work the same for, say, blue blending into red, I just didn't feel like working through the separate R, G, and B channels.

Where:
An = current pixel alpha
An-1 = previous pixel alpha
x = blur rate, depending on distance and so on

Is it simply:
An = An-1 - x
or does it follow a more complicated formula? Basically, I'm wondering whether the falloff is linear or exponential. If it is only linear, could it be done exponentially, and would that have any realistic use?
BoltBait — Posted September 8, 2007

Blur works like this: http://en.wikipedia.org/wiki/Gaussian_blur
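To make the falloff concrete: in a Gaussian blur, a neighbouring pixel's weight drops off as exp(-d² / 2σ²) with distance d, so the falloff is exponential in the squared distance rather than linear. Below is a minimal sketch of a 1-D Gaussian kernel; the `gaussian_kernel` function and its parameters are illustrative only, not how Paint.NET actually implements it.

```python
import math

def gaussian_kernel(radius, sigma):
    """Normalized 1-D Gaussian weights: w(d) ~ exp(-d^2 / (2*sigma^2))."""
    weights = [math.exp(-(d * d) / (2.0 * sigma * sigma))
               for d in range(-radius, radius + 1)]
    total = sum(weights)           # normalize so the weights sum to 1
    return [w / total for w in weights]

# Each output pixel is a weighted average of its neighbours using these
# weights; successive differences between weights are NOT constant, which
# is exactly the non-linear falloff asked about above.
k = gaussian_kernel(3, 1.0)
```

In a full 2-D blur the same kernel is applied once horizontally and once vertically, since the 2-D Gaussian is separable.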
Hellfire010 (Author) — Posted September 9, 2007

Oh, not as simple as I had hoped. Thanks for that.