In order to gradually test a possible workflow involving both Rhino and MoI,
and to get to know MoI better (I'm playing with the trial version, which has no expiration but has saving disabled),
I'm wondering about two things:

1) Is there an equivalent of Rhino 3D's SplitEdge command in MoI?

Or, if not, how can I achieve the same result in an efficient way?

The files 01_egdes_to_split.3dm, 02_edge_to_split.jpg, 03_splitted_edge.jpg and 04_moi_filleted_splitted_edges.jpg
illustrate the task I want to accomplish.
The file 04_moi_filleted_splitted_edges.jpg is an example of what I might get by filleting the two split edges.

2) What are the unwanted lines that I get while in shaded mode, as illustrated by the file 05_garbage_lines.jpg?
Is there a way to avoid them?

Yes, in MoI the Edit > Trim command can be used to cut an edge. You don't need to create an isoparm separately; you can use the "Add trim points" option in the Trim command to indicate the spots where you want to cut. So select the edge, run Edit > Trim, click "Add trim points", place a point at the desired split location, and then right-click to finish the Trim command.

> 2) What are the unwanted lines that I get while in shaded mode, as illustrated by the
> file 05_garbage_lines.jpg?
> Is there a way to avoid them?

Those are a display artifact of the realtime display. They are a result of pulling edges and curves towards the eye point to avoid a different and worse artifact of the edges being partially submerged in the surfaces. See here for more information: http://moi3d.com/forum/index.php?webtag=MOI&msg=3933.3

There isn't currently any way to avoid that. It's normal to see various kinds of display artifacts in the window display, because the display is focused on doing things quickly so that it stays responsive; it's not focused on making a "pixel perfect" image for a final illustration, since doing that takes a lot more calculation and waiting.

To get a better quality display, use something that's focused on image quality rather than mainly on speed: a rendering program, or MoI's export to PDF or AI format, should generate a better quality image.

It's a pity that those display artifacts cannot be avoided in the viewport display,
because I've verified that those artifacts are also captured by both the getViewport.renderToClipboard() and getViewport.render() commands,
two commands that would be very useful for creating presentation images of 3D models.
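For reference, capture calls like these are typically run from a MoI shortcut-key script. Here's a minimal sketch; the `moi.ui.mainWindow.viewpanel` object path and the width/height parameters are assumptions based on common forum snippets, so check them against your MoI version:

```javascript
// MoI shortcut-key script (runs inside MoI only, not a standalone program).
// Captures the 3D viewport's realtime display to the clipboard, so any
// display artifacts visible in the viewport end up in the capture as well.
script: /* Copy 3D view to clipboard */
moi.ui.mainWindow.viewpanel.getViewport('3D').renderToClipboard(1920, 1080);
```

Because this goes through the same realtime display engine as the viewport, it inherits the edge-pulling artifacts described above.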

Anyway, do you think the next V4 version will include an optional better display mode
when a discrete video card is used, like those offered by the Nvidia Quadro series?
(I own a Quadro K3100M 4 GB, and I'm planning an upgrade to a K5100M 8 GB if possible.)

> because I've verified that those artifacts are also captured by both the
> getViewport.renderToClipboard() and getViewport.render() commands, two commands that
> would be very useful for creating presentation images of 3D models.

Yes, that's to be expected because both of those methods use the viewport realtime display
mechanism. They are meant to just capture the display and are not really intended for
making final presentation graphics.

For better quality presentation images, the PDF or AI exports will work much better; those are
the functions that are oriented towards producing a high quality graphic result rather than
being primarily focused on speed like the viewport display is. The other main way to get
high quality renders is to export to a rendering program.

> Anyway, do you think the next V4 version will include an optional better display
> mode when a discrete video card is used, like those offered by the Nvidia Quadro series? <....>

I don't know yet what will be done in this area for V4. I have a few ideas on how to improve
that particular display artifact, but it has to be done quite carefully in order to not degrade
the performance of the display engine. So any changes have to be tested quite a bit to make
sure they don't come at too high of a performance cost. That type of stuff involves quite
a bit of work, and it's difficult for me to predict when such things will be completed. It's very
important for the display to be quick so that it can be interactive; the display engine is just
overall much more focused on speed than on trying to make a presentation type graphic.

Any future improvements in this area will probably involve calculations done by the CPU
before stuff is sent to the graphics card, and so probably won't be dependent on any
specific video card.