Using Blender to render a scene with the newly compiled YafRay does seem to work.
The only thing I noticed is that when the scene uses an HDRI environment map, it doesn't show up in the render.
In the Blender scene, however, the map is clearly present and shows up in the world material preview pane.

This could also be due to new settings in Blender that I'm not aware of... yet...

Well... I own a small Indy, but this thing loads in 2 seconds or less! Even on an Indy I can manage smaller objects and meshes with no problem, and quite fast! Whoever optimised this thing did it like a pro! Regards! Of course, rendering on an Indy is no picnic, but people with Octanes and up will surely get the most out of this baby...

Today I imported some DXFs and played around a little... maybe I'll post some examples here if I have the time... I had never used Blender before... but this one rocks!!!

The problem with importing files and missing file names is fixed, and the libiconv problem is fixed too, but I get a warning window saying that I must adjust the $PYTHONHOME path, and I don't know where to set it up...
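One way to deal with that warning is to set the PYTHONHOME environment variable in the shell before launching Blender, so its bundled Python scripts can find the standard modules. A minimal sketch follows; the /usr/local prefix is an assumption, so point it at wherever your Python installation actually lives (the directory that contains lib/pythonX.Y):

```shell
# Hypothetical prefix -- adjust to your actual Python install location.
# Blender reads PYTHONHOME at startup to locate the Python standard library.
export PYTHONHOME=/usr/local
echo "PYTHONHOME is set to: $PYTHONHOME"
```

To make it permanent, the export line can go in ~/.profile (or, for the csh that IRIX defaults to, use `setenv PYTHONHOME /usr/local` in ~/.cshrc instead).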

I found out today that when you disable XML output in the YafRay render options tab in Blender, it does work!
That is, it will render the cornell.blend test file and make it look very good (as usual).

I also discovered that when rendering the same scene on a MacBook (Intel inside...), it looks very different from the one I rendered on an Octane2.
In my scene there's a moving object made of glass; on the SGI it looks really realistic, but on
my MacBook it looks like the glass has a foggy texture.

Can anyone do a test render with YafRay under Blender on an SGI and on another platform, and confirm this?

So although my SGI takes a whole lot longer to render than my OS X machine, I still prefer it because the final result looks so much better!