
You Have Failed Me For the Last Time, Nemetschek

Or, Render Options 2023

Some years ago, as I cast[1] around for a rendering workflow that would be at once inexpensive, fast, and high-quality (a trilemma that I and countless others have commented upon), I found myself considering a veteran 3DCC suite, Cinema 4D, and I made the decision at the time to buy into C4D as my “nice” rendering platform.

The reasons for this were solid at the time and, I think, remain so today: this is a program that has been around for a long time, and it is aimed at that curious intersection of artistic and technical folks in a way that not many other programs are. I liked that the interface isn’t too streamlined; after all, you need lots of different inputs available to you if you’re going to create the widest possible range of looks and scenes and what have you. The cost was a secondary consideration: if the program was making me money, and the offset was high enough, then I could justifiably keep it around and not worry too much about the annual fee to keep the suite current.

Early on, I did not have a computer with a dedicated “good” graphics card. For many years, my workflow was conducted on a mid-2013 MacBook Pro, a workhorse of a computer that I got many, many miles out of. Sadly, it developed a bad habit of randomly shutting off (“off” as in requiring a cold start to recover), and I was never able to find a reliable fix or even a cause. Clearly this was some sort of worsening hardware degradation, and it, along with three (!) battery swaps, convinced me to retire the machine. I’ve since moved to Windows as my platform of choice, partly in protest of the increasing iOS-ification of macOS, and partly because of the relatively lower cost of the hardware. Anyway.

The point is that that laptop had a reasonable graphics card, but not a particularly beefy one, at least by 2020 standards, when I started looking for a replacement. During this time I also invested in a render machine for my home: an AMD-based homebrew tower with an NVIDIA 3070 Ti card for graphics. Certainly more than capable of rendering scenes faster than the Mac. Or at least, you’d be forgiven for thinking so.

Upon installing C4D on the render machine[2], I started testing various scenes. What struck me as odd was that they never seemed to be much faster than on the Mac laptop; even relatively basic scenes – albeit ones with lots of light sources – rendered far more slowly than I thought they should. And because life is life, and I had shows and children who need braces and other more important things to worry about, I didn’t get around to devoting serious noggin time to this issue until recently, when I abruptly realized something about Cinema 4D that I perhaps should have before now:

  1. It does not come with a GPU-based render engine of its own

This was quite the realization. One of my primary motivations for purchasing C4D and installing it on the render machine was to be able to turn around client asks for new looks very quickly. After Hantmade’s Stage software collapsed and its developer vanished off the face of the earth, I had cast around for a realer-time solution for client renders and ultimately settled on L8, which – while fast – comes with its own set of downsides to achieve and maintain that speed. It caps the number of 3D objects it will handle in a scene, and it doesn’t do advanced material effects. None of this is really an issue, because L8 isn’t trying to be a render engine; it’s trying to be (and succeeding quite nicely at being) a realtime visualization solution. I have used it to create pretty things for clients for a few years, and clients have been reasonably happy with the results, but it’s the wrong tool for this particular job. All this to say: none of this is meant to bash L8 or its developer; it’s a very capable bit of software.

Cinema, however, is trying to be a program that people render scenes out of, and once I realized that it hadn’t been using the GPU for renders, I set out in search of, well, a GPU render engine for it. And here, dear reader, is where I fell down the rabbit hole of software. While there are many, many GPU-based rendering options for Cinema, they’re all bloody expensive, come with significant learning curves, or just don’t work, at least not right out of the box.

Take Redshift. This was sort of the de facto rendering solution that many C4D users turned to, and it still seems to be high on the list of popular engines. Then Maxon / Nemetschek acquired Redshift in 2019 and jacked the price up. Now, if I want to use Redshift as my rendering solution, it’ll cost an additional $220. That’s on top of the $700 annually that I’ve been paying to keep C4D current. That’s just too expensive for me, so I crossed it off the list.

What of the other render engines out there? I didn’t find any of them particularly great, either because the settings needed to get volumetric lighting right were too fiddly, or because the engine was not really optimized for what I was trying to do – many of these are designed, with purpose-built tools, for architectural design and renders, not for creating lots of brightly-colored lights cutting through a haze-filled arena. While others might have been good, my inability to get an actual render out of them after ten minutes or so of fiddling also knocked them off the list. It’s possible that I wasn’t trying very hard with any of them, either, because they’re all pretty spendy. I want something that works out of the box without too much fuss; for the money they’re all asking, I think that’s a reasonable ask. At least one (Corona) had a hellawhack nutty self-hosted license server scheme that cost me a solid thirty minutes of troubleshooting, because the plugin can’t tell the difference between “localhost” (which it fills in as the default itself) and “127.0.0.1”.

Years ago, when I dabbled in the esoteric arts of Linux, I heard about Blender. The thing I remember most about trying to use it was just how monumentally backward the user interface was; I could accomplish next to nothing because the default mouse setting was right-click-to-select, in contravention of almost every other piece of software ever. These being the dark times before you could find free tutorials aplenty on YouTube, figuring out how to actually use the damn thing proved too much of a hassle, so I decided not to play around with it much – and I wasn’t doing much in the way of 3D graphics back then, anyway. (Right click to select? “X” to delete instead of…[Delete]? …why?)

But necessity is the mother of trying something old again, so as I sputtered out trying to find a reasonably-priced hardware rendering engine, I figured I’d check out what the Blender Foundation has been up to recently. Turns out: quite a lot. Blender now ships with two GPU-accelerated render engines, EEVEE and Cycles, and comes at the quite attractive starting price of $0. I decided to download and try it, and boy, am I glad that I did. The Blender Foundation has cleaned things up significantly since I was last messing around with the software in 2008 or so.
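(For anyone following along at home, switching engines and pointing Cycles at the GPU takes only a few lines in Blender’s Python console. This is a minimal sketch using the stock bpy preferences API; whether you want CUDA or OptiX depends on your card.)

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'        # or 'BLENDER_EEVEE' for the rasterizer
scene.cycles.device = 'GPU'

# Tell Cycles which compute backend to use; OPTIX suits RTX-class cards.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPTIX'   # 'CUDA' also works on NVIDIA hardware
prefs.get_devices()
for dev in prefs.devices:
    dev.use = True                    # enable every detected device
```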

For starters, it seems that around 2013 the developers had an epiphany, realized that their bizarre, counter-intuitive UI choices were preventing adoption by all but the weirdest programmer / engineer types, and set about altering the software’s workflow to align more closely with industry-standard suites like C4D and 3D Studio Max. Perhaps even more importantly, the Blender Market was established, hosting an absolute treasure trove of paid and free plugins. Lots of these are simply from people who are adept with geometry nodes – which is fine; I’ve bought a few because they’ll save me time – but there are also some really specialized ones, including some specifically for my industry. One in particular, Stage Lighting Kit, has been extremely worth the $60 I paid for it.

At its base, it’s a set of very simple moving-light models that output both a luminous cone and a light source at the same time, nicely mimicking the look of an automated lighting beam. Because it’s doing “fake beams,” the plugin is extremely lightweight. I haven’t really pushed the limits on either my render machine or my laptop, but with 30-some-odd lights going at once, I notice no discernible FPS drop on RENDERMAN and only slight drops on the laptop[3]. My 3070 can render a scene in around 2 seconds, and the same scene takes around 12 seconds on the laptop. We’re talking full resolution here, folks, whereas a piddly 640×480 render would have taken minutes in C4D with Stage. The advantages of hardware rendering with optimized assets simply can’t be overstated; it’s game-changing. I’ve also experimented with changing the resolution of renders in Blender, and it doesn’t seem to make much of a difference: the engines take about the same amount of time at low resolutions as they do at full ones, so there’s no real advantage to downsizing, because you don’t gain any speed[4].
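(If you want to check that resolution observation on your own scenes, here’s the sort of rough timing harness I’d run from Blender’s Python console: nothing fancy, just the stock bpy render operator wrapped in a timer.)

```python
import time
import bpy

scene = bpy.context.scene

def timed_render(x, y):
    """Render the current scene at x-by-y and return wall-clock seconds."""
    scene.render.resolution_x = x
    scene.render.resolution_y = y
    scene.render.resolution_percentage = 100
    t0 = time.perf_counter()
    bpy.ops.render.render(write_still=False)  # render without saving to disk
    return time.perf_counter() - t0

for res in [(640, 480), (1920, 1080), (3840, 2160)]:
    print(f"{res[0]}x{res[1]}: {timed_render(*res):.2f}s")
```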

Where before I relied on L8 to knock out client renders, my workflow can now transition to something like this: design / modeling in Vectorworks (because that’s unfortunately never going away), hiding the moving-light layers, and then re-creating those assets with the Stage Lighting Kit ones in Blender. I’ve also created my own “seated crowd” plugin to generate n rows and n columns of seated audience members, using instancing because it’s just crazy fast in Blender (a sketch of the idea follows below). This means I can save about $1,000 a year on a Maxon subscription and just let C4D run out. It’s still a fine piece of software, and while the learning curve for certain Blender features will be a bit tougher (because I’m used to C4D’s workflow, and because no 3DCC software is easy to pick up), I’m quite happy leaving it alone for the odd times I need to open an old file for conversion or whatever, or to use some plugin for which a Blender equivalent does not yet exist.
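(The core of that crowd plugin is nothing exotic: just linked duplicates sharing one mesh. Here’s a stripped-down sketch; the object name “SeatedPerson” and the spacing values are placeholders, not anything from the actual plugin.)

```python
import bpy

def seated_crowd(person_obj, rows, cols, row_spacing=1.1, col_spacing=0.6):
    """Fill a rows-by-cols grid with linked duplicates of one seated figure.

    Linked duplicates share the same mesh datablock, so memory use and
    viewport cost stay low even with tens of thousands of instances.
    """
    for r in range(rows):
        for c in range(cols):
            inst = person_obj.copy()  # new object, mesh data still shared
            inst.location = (c * col_spacing, r * row_spacing, 0.0)
            bpy.context.collection.objects.link(inst)

# e.g. seated_crowd(bpy.data.objects["SeatedPerson"], rows=40, cols=60)
```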

Of course, there are trade-offs, as there always are with any software. The first is the lack of global haze noise, which I can probably work around by exporting EXRs and applying the haze as a noise map / luminance layer. I used to do this with Cinema anyway, because Stage’s assets didn’t use global haze noise, either. (The custom ones that I built did.) It might also be possible (and probably is) to automate this step in Blender’s compositor.
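(As a rough proof of concept, the compositor can be driven from Python. The sketch below screens a legacy Clouds texture over the render as a stand-in for haze; treat the node choices and the 0.15 strength as assumptions on my part, not a recipe.)

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

# Reuse the default nodes if they exist, otherwise create them.
rl = tree.nodes.get("Render Layers") or tree.nodes.new("CompositorNodeRLayers")
comp = tree.nodes.get("Composite") or tree.nodes.new("CompositorNodeComposite")

# A legacy Clouds texture stands in for the haze's luminance variation.
haze_tex = bpy.data.textures.new("HazeNoise", type='CLOUDS')
tex_node = tree.nodes.new("CompositorNodeTexture")
tex_node.texture = haze_tex

# Screen the noise over the render so bright areas read as hazier.
mix = tree.nodes.new("CompositorNodeMixRGB")
mix.blend_type = 'SCREEN'
mix.inputs['Fac'].default_value = 0.15   # haze strength, to taste

tree.links.new(rl.outputs['Image'], mix.inputs[1])
tree.links.new(tex_node.outputs['Color'], mix.inputs[2])
tree.links.new(mix.outputs['Image'], comp.inputs['Image'])
```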

The other issue is more fundamental to the plugin’s speed: the reason it’s so fast is that it creates glowing cones of light rather than real (virtual) beams of light, with rays traced to wherever they land. It’s much, much faster, and in general it looks just fine. Until, that is, you intersect some geometry and your light beam goes right through it. While it might be possible to do some sort of weird, fancy boolean operation that cuts off parts of the “beam” when it intersects geometry we don’t want it to pass through, doing so would likely erase at least part of the speed advantage that the “glowing fake beam” technique affords in the first place, at which point it might be better to just render actual volumetric light beams. Luckily, there’s a separate plugin that does just this (Theatrix), and it will be useful for the times when I need light beams to intersect geometry in a realistic way.
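(For the curious, the boolean idea would look something like this in bpy: one DIFFERENCE modifier per beam, with hypothetical object names. The modifier re-evaluates whenever the beam or the blocker moves, which is exactly where I’d expect the speed advantage to leak away.)

```python
import bpy

def clip_beam(beam_obj, blocker_obj):
    """Cut the fake-beam cone wherever it passes through blocking geometry.

    A DIFFERENCE boolean modifier re-evaluates per frame as objects move,
    which is the cost that likely eats the fake-beam speed advantage.
    """
    mod = beam_obj.modifiers.new(name="BeamClip", type='BOOLEAN')
    mod.operation = 'DIFFERENCE'
    mod.object = blocker_obj
    return mod

# e.g. clip_beam(bpy.data.objects["Beam.001"], bpy.data.objects["SetPiece"])
```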

So there you have it: I’ve saved quite a bit of money and time by switching to the free software solution. It’s remarkable, really. Usually the free suite that competes with a commercial package isn’t as good, or as polished, or as easy to use, or it comes with what Andrew Price (Blender Guru) calls “anticonventions,” where the software – usually for reasons of overly-rigid philosophy – eschews modern software conventions while sneering that it’s “AKCHUALLY BETTER THIS WAY.” But in this case, while Blender (and really, any 3DCC software) comes with its share of, ah…quirks, including at least some of the aforementioned anticonventions, and there are things I think could be improved, it has made significant progress in the last several years. I plan on sending at least some of the money I’ve saved on C4D to the Blender Foundation as a way of saying thanks, and to encourage further development.

But this isn’t just me heaping praise on Blender; this is also (yet another) indictment of Nemetschek, a company I’ve grown to feel increasing antipathy toward over the last several years, especially with the recently-announced price increases for Vectorworks licensing. I still think Cinema 4D is a capable and excellent software suite that accomplishes much of what it sets out to do, but failing to include a GPU-based render engine is clearly a (more or less forced) cash grab at the expense of their users. Acquiring Redshift essentially boxed them in when it comes to offering one within the software: who would pay for their shiny new IP if a GPU engine were bundled for free? And so, to chase dividends for the shareholders, they chose to hobble the functionality of their flagship 3DCC software and perform a wallet extraction on their userbase. Should they have offered Redshift for free to users already paying the subscription fee? Absolutely. They would have recouped the cost, and gained a lot of goodwill in the process.

And so I, a working designer in this business, am doing the very American™️ thing and voting with my wallet, and I choose to send my hard-earned cash to a company that, while an underdog, is doing the right thing and doing it well. I’m sure Maxon / Nemetschek doesn’t give a crap, but that’s sort of the point. They never do.

At least for now, I’m saving money and producing excellent pretty things to show clients, so I’m happy.

[1] Minor pun intended.

[2] It’s called RENDERMAN.

[3] Update: the laptop absolutely does experience some FPS drops, but this is mostly from the 20K+ animated audience-member models populating the arena.

[4] This might turn out to be a fluke of the scene I was messing with, or something else fundamental to the way I’m doing these renders. I’ll admit that this surprised me, because it flies in the face of my long-held and usually-correct intuitions about how long software should take to Do Things. My gut is telling me that the lack of a speed penalty for increased resolution is illusory.
