The July 2014 update is another big one, introducing new nodes like »Fit NURBS«,
»Curls advanced«, »Curvature Amplifier«, »Grouping in Form«, »Follow CurveList«,
»Stretch Hair« and »Hair Filler Rounded«. For a full description, see the
si-community thread and the updated documentation.

The Kristinka Hair toolset is a new and unique way to set up, style and simulate hair using ICE nodes:

- A set of fully customizable ICE nodes, scalable from only a few basic compounds for building basic hair up to very complex structures.
- Hair styling that always considers the whole shape of the hair; styling works well for short and for long hair.
- Unlimited hair length and an unlimited number of hair segments.
- Automatic, procedural generation of details, always with full control. Locks, clumps, curls and turbulence are created by ICE compounds.
- Additional modifiers, like cutting hairs by external geometry, constant strand length for keyframe animation, resampling and subdividing strands, morphing with another hair, and modulating the hair's distribution over the emitter, so the user can increase density in the most visible areas.
- Full support for Softimage's built-in Strand Dynamics Framework simulation engine.
- Only factory ICE nodes were used, so it should work nicely with any Softimage version from 7.01 on.

JPWestmas wrote:
So my question is, what would be the best way to keyframe these nurbs deformers (not procedurally but by eyeballing them and placing by hand), which in turn would drive the hair strands.

Hello

Here you go. Deformations are applied in pairs (that's the usual method in ICE): a static and a deformed emitter, and a static and a deformed hull. Everything should stay at zero transform; only deformation is "safe". The hull must be a mesh, only. Generally you want a higher mesh resolution than strand resolution, and a hull as smooth as possible, because it uses a fast, "faceted" method for the transfer.

I've added some test animation.

The 'deform by hull' node goes anywhere after the 'form' nodes (nodes which usually have some geometry input). For reasonable performance, it should go before the hair filler. For later use, just export the node into the same folder where the kH stuff is.
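To make the "faceted" pair-based transfer concrete, here is a minimal Python/NumPy sketch, my own illustration rather than the actual kH compound: each point is located against the static hull, stored as barycentric coordinates plus an offset along the triangle normal, and rebuilt on the deformed hull. The brute-force closest-triangle search and the per-face (non-interpolated) reconstruction are what make this kind of method fast but faceted.

```python
import numpy as np

def barycentric(p, a, b, c):
    # Barycentric coordinates of p projected onto triangle (a, b, c).
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return 1.0 - v - w, v, w

def transfer(points, static_verts, deformed_verts, tris):
    # Triangle centroids on the static hull, for a crude closest-triangle search.
    cents = static_verts[tris].mean(axis=1)
    out = np.empty_like(points)
    for i, p in enumerate(points):
        t = tris[np.argmin(((cents - p) ** 2).sum(axis=1))]
        a, b, c = static_verts[t]
        u, v, w = barycentric(p, a, b, c)
        n = np.cross(b - a, c - a)
        n /= np.linalg.norm(n)
        # Signed offset of the point along the static triangle's normal.
        h = (p - (u * a + v * b + w * c)) @ n
        # Rebuild the point on the deformed hull with the same local coords.
        a2, b2, c2 = deformed_verts[t]
        n2 = np.cross(b2 - a2, c2 - a2)
        n2 /= np.linalg.norm(n2)
        out[i] = u * a2 + v * b2 + w * c2 + h * n2
    return out
```

With a denser, smoother hull the per-face error shrinks, which matches the advice above to keep the hull resolution higher than the strand resolution.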


Hey man, that's excellent! I was thinking about bringing in some characters I animated in messiah. I assume at this point that I could very well constrain the scalp surface of the wig to the vertices of a head that has a point cache on it? In other words, what is the method you use to constrain the scalp triangles to the head triangles? Is it using a point constraint? I'll keep looking ^.^.

JPWestmas wrote:I was thinking about bringing in some characters I animated in messiah. I assume at this point that I could very well constrain the scalp surface of the wig to the vertices of a head that has a point cache on it? In other words, what is the method you use to constrain the scalp triangles to the head triangles? Is it using a point constraint? I'll keep looking ^.^.

1: If a simple, parent-like connection is enough, you could find one polygon on the cached mesh which doesn't deform itself, usually one on top of the head. Create a cluster from this polygon and constrain a null to the cluster (object-to-cluster constraint). Don't forget to activate both tangent and normal.
Use the null as a parent, as a single deformer for an envelope, whatever. kH would like a 'single deformer for envelope'.
I've used this successfully back in the XSI 5 or 6 days for attaching to a point-cached import from Max.

2: The simplest ICE way would be this one. I've used exactly the same method in the post above, but on strands, so you can't use the kH node on a mesh. It's a non-interpolated, 'faceted' method, but it's fast.

3: For nice smooth interpolation, a bit expensive... there is the XSI Cage op. If you set the falloff to a very small value, it's not so expensive.


Thanks for the hints; that's all I really need, pushes in the right direction. That polycluster constraint sounds great.

I just did a simple test (I know, I take forever) where I exported your head, the hull deform surface and the scalp polymesh emitter to messiah. I put some bones in the meshes and exported the MDD files. Then I brought those cache files back into SI and applied them to the model with Point Oven, and it worked perfectly! The hairs appeared to be stable too!

JPWestmas wrote:
I just did a simple test (I know, I take forever) where I exported your head, the hull deform surface and the scalp polymesh emitter to messiah.

Poor my head  BTW, if you do something with the 'emitter' mesh, like freezing, exporting and importing back: there is a small ICE node that should reside on this mesh, called 'initialize polymesh emitter' or something. Just re-apply it if it was lost for any reason.

cheers


Hehe, I meant your second head. Luckily, I didn't have to import the mesh to get the animated deformation to work; I only had to import the point cache back onto the original head, and it worked just fine because the point order was the same. =)

What are the equivalents of the version 2 vs. version 3 nodes of Kristinka?

I have a hairstyle that I'm using for rendering (not simulation) and I'm trying to get the old setup to work. In my case I was using a kH2 emit hair (NURBS emitter and point cloud for styling), but in the new kH3 emit from NURBS I can't find any way to use the previous point cloud for styling, as there is no 'guide in name' function anymore.

Hi Mathaeus
So I finally got around to posting some tests I did with the simulation models you gave me. I found that with the one-point simulation I still got a lot of penetration of the collision object. The strand dynamics one was a lot better (though quite a bit more of a workflow). However, even though it is better, I still get some penetration issues. It is not too bad, since the toon shading is very forgiving. Let me know if there is anything I can do to help the penetration issues. Also let me know what you think of the test.
BTW, I did try Maya Nhair. It was easy to learn and set up, and the collisions are absolutely great even on a light mesh. The problem is in rendering and getting it to match the mesh when rendering a sequence. If I render a frame it matches perfectly, but if I render a sequence it is off, even if the Nhair is cached. Maya Nhair is still a work in progress and I have people at Autodesk looking into the problems. Will post when I get the issues resolved.

OK, I just scrambled the aircraft to gather the whole simulation department of Kristinka technologies.

Now seriously: probably the movement is too fast for the "Push Strands outside geo" compound (or whatever it's called, there are a few versions around). As I said before, there is a "get closest location" node inside, which searches for the closest surface; according to the point normal, it decides whether a strand segment is inside the geometry. If so, it pushes the strand segment to the closest surface.

With fast movement, the closest surface could be on the opposite, unwanted side. With even faster movement, the compound will think that nothing important happened, and the strand will just pass through the geo.

In short, my only advice is to try it with slower motion. Also, if you didn't already, create a special collision geometry, as convex as possible, that is, without holes for eyes, mouth and so on.
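For illustration only, here is a small Python/NumPy sketch of the inside test described above, not the actual "Push Strands outside geo" compound. The surface is represented as sampled points with outward normals, an assumption made to keep the example self-contained:

```python
import numpy as np

def push_outside(segment, surf_points, surf_normals, margin=0.0):
    # Closest surface sample to the strand segment position.
    i = np.argmin(((surf_points - segment) ** 2).sum(axis=1))
    p, n = surf_points[i], surf_normals[i]
    # A negative dot product means the segment sits behind the surface,
    # i.e. inside the geometry.
    depth = (segment - p) @ n
    if depth < margin:
        # Snap the segment back onto (or just above) the surface.
        return p + n * margin
    return segment
```

Since the test only looks at the closest sample, a segment that moves far in one step can find its closest point on the opposite side, or skip clear through thin geometry, which is exactly the fast-motion failure described above.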

Thanks! I will give those ideas a try. At this point I think it's awesome and may even render the hair as a different pass, etc. But it seems to be the best hair solution for me so far. Thank you so much for giving us this tool as an option.
John

Here is an approximately 5 MB WMV of a Syflex simulation test.
I didn't tweak the sim that much. Anyway, simulation time looks promising: about 2-6 FPS for about 250 guides. The 2 FPS figure is with self-collision; here it's just a smooth pin instead. All that with my single-threaded Syflex. I hope I'll add a full setup for mesh extrusions and Syflex in an update.
