Python – Blender still image rendering using Hadoop


Blender still image rendering using Hadoop

I’ve been searching online but haven’t found any solid ideas for how to approach this task.

My requirements:

These are set in stone:

  • Hadoop MapReduce
  • Blender
  • Any software/library must be open source

Details:

I’m using Python to generate scenes in Blender. I have this working perfectly and have created a camera path. When I open the generated .blend file in the UI, everything is as it should be. I’m currently rendering still images from Python, but even after lowering the quality settings I’m only getting about 1 frame per second. After rendering all the still images, I use Blender to assemble the frames into a video sequence. All of these steps run fine on their own, but it takes about 5 minutes to render a 30-second video.
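For reference, here is a minimal sketch of the kind of per-frame still rendering described above, assuming the standard bpy API inside Blender; the file path, output directory and quality settings are placeholders, not taken from the original script.

# Run inside Blender, e.g.: blender -b myanim.blend -P render_stills.py
import bpy

scene = bpy.context.scene
scene.render.image_settings.file_format = 'PNG'
scene.render.resolution_percentage = 50  # lowered quality to speed up renders

# Render every frame of the camera path to a numbered still image.
for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    scene.render.filepath = "//stills/frame_%04d" % frame
    bpy.ops.render.render(write_still=True)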

Goal/Question:

I need to render along the camera path and output the result in a video format. As you probably know, a 1-minute video can take a long time to render on a single machine, and I will be rendering videos of different lengths, ranging from a few seconds to an hour. The idea is to use Hadoop to distribute the rendering across a cluster to reduce rendering time.

I’m familiar with MapReduce, but not sure how to break this problem up for it. I’m looking for ideas on how to approach this to reduce render times. Any suggestions would be appreciated.
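For concreteness, one way such a split could look (purely an illustrative sketch, not part of the original question): treat each frame range as one input record and let a Hadoop Streaming mapper shell out to Blender’s command line. The .blend file name below is an assumption.

#!/usr/bin/env python
# Hypothetical Hadoop Streaming mapper: each input line is "start end",
# and the mapper renders that frame range with Blender's CLI.
import subprocess
import sys

for line in sys.stdin:
    start, end = line.split()
    subprocess.check_call([
        "blender", "-b", "myanim.blend",
        "-s", start, "-e", end, "-a",
    ])
    # Emit the finished range so a reducer can collect/verify the chunks.
    print("%s-%s\tdone" % (start, end))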

Solution

I’m not familiar with MapReduce and don’t think it’s suitable for what you’re trying to do, at least not with any existing solutions.

What you are looking for is what’s called a render farm: multiple machines each render part of the final result. For animation this is easy to split up by having each machine render a different range of frames, and there are also ways to split a still image so that each machine renders part of it.

To get started quickly, there are commercial render farms like Render Street, which have machines set up to render for you, and free services like SheepIt, which distribute rendering between users’ computers.

If you have several *nix machines available with Blender installed, you can use Blender’s CLI arguments to render a different range of frames on each machine:

ssh m1.local 'sh -c "blender -b myanim.blend -s 1 -e 10 -a"'
ssh m2.local 'sh -c "blender -b myanim.blend -s 11 -e 20 -a"'
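
Along the same lines, here is a short sketch of how those per-machine frame ranges could be generated and launched automatically; the host names, frame count and chunk size are assumptions for illustration.

# Split a frame range into equal chunks and start one Blender render
# per machine over ssh (host names and frame range are placeholders).
import subprocess

hosts = ["m1.local", "m2.local", "m3.local"]
frame_start, frame_end = 1, 750  # e.g. 30 seconds at 25 fps
chunk = (frame_end - frame_start) // len(hosts) + 1

procs = []
for i, host in enumerate(hosts):
    s = frame_start + i * chunk
    e = min(s + chunk - 1, frame_end)
    cmd = 'blender -b myanim.blend -s %d -e %d -a' % (s, e)
    procs.append(subprocess.Popen(["ssh", host, cmd]))

for p in procs:
    p.wait()  # wait for all remote renders to finish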

To run your own render farm: for a simple farm that you administer yourself, there is the Network Render add-on included with Blender; for more advanced setups, look into Flamenco.
