The Value of Personal Projects

November 11, 2019

Personal projects can be a great way to explore ideas, concepts, or frameworks before you use them on an actual work project. They’re somewhat of a staple of SEP’s Hackathons, so I’d like to highlight some of my philosophies around personal projects and the value I’ve gotten out of them. I worked on all the projects I’ll write about at multiple Hackathons, usually with other people.

To Make Personal Projects Last, Give Them a Hook

While there is nothing inherently wrong with doing a side project purely for the sake of learning, I find that these kinds of “dry” projects run out of steam fairly quickly.

While work projects are defined by budgets of money and time, personal projects are limited by a budget of motivation. If the project feels too much like work, you’re probably going to be “bankrupt” before too long, and your repo will join the ranks of abandoned efforts collecting dust on GitHub.

If there’s a particular technology I’m interested in exploring, my strategy is to pair it with an existing hobby or interest of mine. Maybe this seems obvious, but in my experience, if I have a reason to keep working on something, I’ll keep finding new avenues to explore and concepts to learn about that weren’t apparent to me at first.

In this series of posts, I’ll highlight three of the major personal projects I’ve had over the years and the value I’ve gotten from them. We’ll cover the first one in the remainder of this post.

A World Record Buddhabrot

The Buddhabrot is a variation of the famous Mandelbrot Set fractal that’s substantially more computationally expensive to render. If none of that makes sense to you, the easiest way to explain it is that it’s a mathematically defined image that has potentially infinite detail. No matter how big of an image you want, you can create a version at that size and there will be interesting stuff to see.
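
To make the “more expensive” part concrete: a standard Mandelbrot rendering colors each pixel by how quickly its own point escapes, while the Buddhabrot accumulates the *entire trajectory* of every escaping sample point. Here’s a minimal sketch of that core loop (illustrative C# with made-up names, not my actual renderer):

```csharp
// Minimal Buddhabrot sketch: sample random points c, iterate z = z^2 + c, and
// for points that escape, bump a counter for every position their trajectory
// visited. Pixel brightness comes from those visit counts.
using System;
using System.Numerics;

class BuddhabrotSketch
{
    const int Size = 512;            // tiny compared to 68 gigapixels!
    const int MaxIterations = 1000;

    static readonly long[,] Counts = new long[Size, Size];

    static void Main()
    {
        var rng = new Random(42);
        for (int sample = 0; sample < 1_000_000; sample++)
        {
            var c = new Complex(rng.NextDouble() * 4 - 2, rng.NextDouble() * 4 - 2);
            AccumulateTrajectory(c);
        }
        // Counts now holds raw trajectory densities, ready for tone mapping.
    }

    static void AccumulateTrajectory(Complex c)
    {
        var trajectory = new Complex[MaxIterations];
        Complex z = Complex.Zero;
        int steps = 0;

        while (steps < MaxIterations && z.Magnitude <= 2.0)
        {
            trajectory[steps++] = z;
            z = z * z + c;
        }

        // Only points that escape (i.e. are NOT in the Mandelbrot set) contribute,
        // and each one touches many pixels - hence the extra computational cost.
        if (z.Magnitude > 2.0)
        {
            for (int i = 0; i < steps; i++)
            {
                int x = (int)((trajectory[i].Real + 2.0) / 4.0 * Size);
                int y = (int)((trajectory[i].Imaginary + 2.0) / 4.0 * Size);
                if (x >= 0 && x < Size && y >= 0 && y < Size)
                    Counts[x, y]++;
            }
        }
    }
}
```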

In my case, I ended up creating what I believe was, at the time, the world’s largest rendering: 68 gigapixels. Why? Because it was fun! It doesn’t do anything beyond looking cool, but that was reason enough. I didn’t even set out to break any records, but each time I made it larger I found new challenges to tackle.

A Plethora of Learning Potential

Some of the things I touched upon or learned while doing this project were:

  • Lots of topics related to parallel computing and general optimizations:
    • Multithreading using .NET’s Task Parallel Library
    • Using SIMD in .NET. Most people think that parallel computing begins and ends with threading, but CPUs have specialized instructions for processing sets of data in parallel.
    • An exploration into OpenCL for targeting the GPU in addition to the CPU
      • Did you know that GPUs are usually much, much, much slower at doing math with double instead of float? Depending on your application, you might get the most bang for your buck by sticking to CPUs!
      • It was also satisfying to verify that if you’re sticking to just the CPU, properly optimized .NET code can be perfectly competitive.
  • Image processing:
    • Turning numerical data into colors is harder than it sounds if you want nice gradients! Human color perception is not linear, and this has a big impact on how we perceive things.
    • If you want to efficiently save off enormous quantities of images in .NET, the straightforward SetPixel method on Bitmap is massively slow. Jumping through the hoops to set all of the image data at once really pays off when you’re generating thousands of images!
    • Image compression and how much you can get away with before things start looking noticeably worse
  • How to display a gargantuan image using a mapping framework (in my case, Leaflet). It doesn’t make sense to create a single image with 68,719,476,736 pixels since your computer would melt trying to open it. It turns out this is a solved problem since it’s essentially how mapping frameworks that show satellite imagery work.
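
To give a flavor of the Task Parallel Library point above: since every Buddhabrot sample is independent, the sampling loop parallelizes naturally with Parallel.For. A sketch (illustrative, not my actual code) using thread-local tallies so the threads don’t fight over shared state:

```csharp
// Sketch of parallelizing independent samples with the Task Parallel Library.
// Each thread accumulates into its own local tally, which is merged exactly
// once per thread at the end - no per-iteration locking.
using System;
using System.Threading.Tasks;

class ParallelSketch
{
    static void Main()
    {
        long totalEscaped = 0;
        object gate = new object();

        Parallel.For(0, 1_000_000,
            // localInit: each thread starts with its own tally of zero.
            () => 0L,
            // body: runs many iterations per thread, updating only local state.
            (i, loopState, localEscaped) =>
            {
                if (SampleEscapes(new Random(i))) localEscaped++;
                return localEscaped;
            },
            // localFinally: merge each thread's tally once, under a lock.
            localEscaped => { lock (gate) totalEscaped += localEscaped; });

        Console.WriteLine($"Escaped samples: {totalEscaped}");
    }

    static bool SampleEscapes(Random rng)
    {
        double cr = rng.NextDouble() * 4 - 2, ci = rng.NextDouble() * 4 - 2;
        double zr = 0, zi = 0;
        for (int n = 0; n < 500; n++)
        {
            double t = zr * zr - zi * zi + cr;
            zi = 2 * zr * zi + ci;
            zr = t;
            if (zr * zr + zi * zi > 4) return true;
        }
        return false;
    }
}
```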
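
The SetPixel point deserves a concrete example too. Here’s the general shape of the faster path (a sketch, assuming 32-bit BGRA pixel data already computed): lock the bitmap’s backing buffer with LockBits and copy the whole frame in one call, instead of making width × height individual SetPixel calls.

```csharp
// Writes a full frame of 32-bit pixel data in one bulk copy via LockBits.
// Each SetPixel call pays per-call overhead; LockBits exposes the raw buffer.
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

class FastBitmapWriter
{
    public static void Save(byte[] bgraPixels, int width, int height, string path)
    {
        using (var bitmap = new Bitmap(width, height, PixelFormat.Format32bppArgb))
        {
            var rect = new Rectangle(0, 0, width, height);
            BitmapData data = bitmap.LockBits(rect, ImageLockMode.WriteOnly, bitmap.PixelFormat);

            // For 32bpp formats each row is width * 4 bytes (already 4-byte
            // aligned), so one flat copy covers the entire image.
            Marshal.Copy(bgraPixels, 0, data.Scan0, bgraPixels.Length);

            bitmap.UnlockBits(data);
            bitmap.Save(path);
        }
    }
}
```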
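
The tile-pyramid math behind that last bullet is satisfyingly simple. Assuming Leaflet’s default 256-pixel tiles, a 262,144 × 262,144 image (that’s 2^18 pixels per side, or 68,719,476,736 pixels total) is covered by 2^z × 2^z tiles at zoom level z, so only the handful of tiles on screen ever need to load:

```csharp
// Tile-pyramid arithmetic for serving a 2^18 x 2^18 pixel image the way
// mapping frameworks serve satellite imagery: fixed-size tiles per zoom level.
using System;

class TilePyramid
{
    const int TileSize = 256;              // Leaflet's default tile size
    const long FullResolution = 1L << 18;  // 262,144 pixels per side

    static void Main()
    {
        // Deepest zoom: enough 256px tiles to span the full image width.
        int maxZoom = 0;
        while ((long)TileSize << maxZoom < FullResolution) maxZoom++; // 10

        long totalTiles = 0;
        for (int z = 0; z <= maxZoom; z++)
        {
            long tilesPerSide = 1L << z;   // zoom z has 2^z x 2^z tiles
            totalTiles += tilesPerSide * tilesPerSide;
            Console.WriteLine($"zoom {z}: {tilesPerSide} x {tilesPerSide} tiles");
        }
        // Geometric series: (4^11 - 1) / 3 tiles across all levels.
        Console.WriteLine($"total tiles: {totalTiles}");
    }
}
```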

Results

My Buddhabrot rendering ended up being one of the more successful projects I’ve done. One of the things I’m happiest about is that I wrote an in-depth explanation of it, which I encourage you to check out if you want more information. It’s been really nice to have a solid writeup to look back on and to show other people.

My 68 gigapixel version is generously hosted by SEP and you can see it here. (Temporarily unavailable while it finds a new home!)

Next time we’ll look into a project that combines the fringes of machine learning with an early 90s video game…