Macro performance [bad]

Is there anything I can do to improve the performance when I repeatedly execute a keyboard macro?

I frequently use macros to perform repetitive tasks - maybe executing the macro 100 times (or more) by calling

SelectedView.RaiseEditCommand(macro)

within a loop. (My app allows the user to specify how many times the macro - or any keystroke - will be executed.)

I have a simple macro that just does the following:

Press 'Delete' 8 times
Go to the end of the line
Press 'Backspace' 10 times
Press Space
Press Delete

I use this to convert sections of the language definition files from my old Editor component into the format required by your component. Since there may be hundreds of tokens in each section of the file I generally need to execute it 40 to 400 times.

Using the old editor to do this I can execute the macro 100 times in 2.5 seconds. Using your editor the numbers vary widely - but are always bad - 14 to 46 seconds.

This is AFTER I added code to do the following:

SuspendPainting()
Document.LexicalParsingEnabled = False

In fact adding the above did not seem to make much of a difference.

Note that I also group changes in the UndoRedo buffer [Document.UndoRedo.StartGroup()] so that a user can undo all the changes with one click. Undo appears to take slightly longer than the original application of the changes. [19 to 50 seconds]
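For reference, my loop currently looks roughly like this (a sketch, not my exact code; EndGroup() and ResumePainting() are my assumptions for the counterparts of the StartGroup() and SuspendPainting() calls mentioned above):

```vb
' Sketch of the macro loop with painting, lexical parsing and undo grouping
' suspended. EndGroup()/ResumePainting() are assumed counterparts of the
' methods named above, not verified member names.
syntaxEditor.SuspendPainting()
syntaxEditor.Document.LexicalParsingEnabled = False
syntaxEditor.Document.UndoRedo.StartGroup()
Try
    For i As Integer = 1 To repeatCount
        syntaxEditor.SelectedView.RaiseEditCommand(macro)
    Next
Finally
    syntaxEditor.Document.UndoRedo.EndGroup()
    syntaxEditor.Document.LexicalParsingEnabled = True
    syntaxEditor.ResumePainting()
End Try
```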

Is there any way to speed up repetitive executions of a macro? Maybe some kind of 'compile' like option?

I can avoid the problem myself by simply using my older app, but that's not something I can suggest to my customers once we release the new version of our app that uses your Editor control instead of CodeMax.

I tried:

syntaxEditor.SelectedView.Selection.SuspendEvents()

instead of:

syntaxEditor.SuspendPainting()

('Selection' seems to be a strange place to suspend events that appear to apply to the entire view.)

It now seems a little more consistent [14 to 25 secs vs 14 to 46 secs previously] but still considerably longer than the 2.5 secs in my old editor.

I think I may know why [at least in part] the old editor is so much faster. It has the concept of a 'Repeat Count'. I can set the repeat count to 10 and press the 'a' key to insert a string of 10 a's, for example.

This is considerably faster since it inserts a single string and then performs all updates/events instead of using 10 separate 'inserts' each followed by their events.

This would be used in my macro for the 8 consecutive 'Delete' key presses and the 10 consecutive backspaces. So instead of executing 21 separate commands the old editor only executes 5.

This might therefore account for about half the difference in performance.

I always knew that the Repeat Count was a very useful feature but I had not realized it was also a performance enhancement. I have added my own version of Repeat Count to my editor but unfortunately my implementation does not have any effect on macros. I handle KeyPress to convert "Repeat 'key'" into 'insert string of key', and KeyDown to handle Backspace, Delete, Tab, Enter etc. [either inserting a string or deleting a text range]. I also handle Paste, Indent, etc. by executing them within a loop. But I can't convert a macro - with its consecutive key presses - into anything more performant.

Maybe you could consider adding the concept of a RepeatCount to your product (and Command object). (It needs a method to set it - which applies only to the next command executed - but it also needs a way to associate a count with each command in a macro.)
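To make the suggestion concrete, the usage I have in mind would look something like this (entirely hypothetical - none of these members exist in the product today):

```vb
' Hypothetical API only - illustrating the RepeatCount idea.
' RepeatCount would apply to the next command executed and then reset.
syntaxEditor.SelectedView.RepeatCount = 8
syntaxEditor.SelectedView.RaiseEditCommand(deleteCommand) ' one optimized 8-character delete
```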

This is not only very useful within macros (and for repeatedly executing a macro), it is also very useful when creating test data. If I need a 200 character string to load into a table column I simply set the count to 200 and press a key. If I try to create the string any other way it is highly susceptible to length errors. Similarly, when creating numeric test data I can test the boundary conditions of a 38 digit number very easily by setting the count to 38 and pressing the 9 key. This is a feature my customers (developers) frequently use.

Sorry, it looks like the SuspendParsing method is internal. Thanks for the repeat count suggestion.

An idea that might help you, you can make your own MacroCommand instances programmatically. So what you could do is iterate the items in the normal MacroCommand and while iterating, look for opportunities to combine the various child commands into others, and put the results in the new MacroCommand instance. Then use that instance instead.

For instance, if you see that there are 10 DeleteCommands in a row, maybe you make a custom EditCommand-based class that provides a single modification that is optimized for accomplishing x number of sequential deletes.
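Such a rewriting pass might look roughly like this; the way child commands are enumerated and added on MacroCommand, and the DeleteRunCommand class itself, are assumptions for the sake of the sketch:

```vb
' Sketch: collapse runs of consecutive DeleteCommand children into one
' custom command. DeleteRunCommand is a hypothetical EditCommand subclass
' that performs N sequential deletes as a single modification, and the
' Commands collection member is assumed.
Function OptimizeMacro(ByVal source As MacroCommand) As MacroCommand
    Dim result As New MacroCommand()
    Dim deleteRun As Integer = 0
    For Each child As EditCommand In source.Commands
        If TypeOf child Is DeleteCommand Then
            deleteRun += 1
        Else
            If deleteRun > 0 Then
                result.Commands.Add(New DeleteRunCommand(deleteRun))
                deleteRun = 0
            End If
            result.Commands.Add(child)
        End If
    Next
    If deleteRun > 0 Then result.Commands.Add(New DeleteRunCommand(deleteRun))
    Return result
End Function
```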

Right now TypingCommand doesn't have public properties for the KeyChar and Overwrite flags but we will add those for the next maintenance release in case those would help you during any optimizations.

I'm starting to get more complaints about performance now that customers are finally upgrading to the new version of my product that uses Syntax Editor rather than CodeMax.

You mentioned an internal method called SuspendParsing().

Does this suspend the parsing without effectively switching to the 'Text' mode and then having to reparse everything when the parsing is switched back on? Currently I use Document.LexicalParsingEnabled = False/True which adds overhead when it is set back to true - sometimes a lot if it is a big document.

If SuspendParsing() avoids, or reduces, this overhead could you possibly make it a public method so that I can use it in cases like these.

The problem with the WinForms version's older design is that if you do a bunch of text modifications in a row, a lot of the code that watches text changes (to update things like outlining, indicators, layout, etc.) processes each of those modifications individually, and the time can add up.

When we rewrote the product for WPF, we moved to a much better design where you create a single text change that can have one or more edit operations inside it. The entire text change (which could contain hundreds of edit operations) is executed as a single atomic change, making large multi-edits execute much faster.
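Roughly, that WPF-style model looks like the following (a sketch of the shape of the API, not exact member names):

```vb
' Sketch of the WPF "single atomic text change" model described above.
' CreateTextChange/DeleteText/Apply are illustrative of the shape only.
Dim change = document.CreateTextChange(TextChangeTypes.Custom)
For Each range In rangesToDelete
    change.DeleteText(range)   ' queue an edit operation; nothing happens yet
Next
change.Apply()                 ' one atomic change, one undo entry, events fire once
```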

But back on WinForms, the SuspendParsing() would just prevent the parser from kicking off after the first modification you make. The subsequent ResumeParsing() allows parsing to start again. I'm not sure that would really help you here though because say you kick off 100 modifications. The first one will change some text and tell the parser (which should be running on a worker thread to not impact the UI thread as long as you started the semantic parse service) to kick off on that change. Then your other modifications occur, which will tell the parser that any pending parse requests are invalid and it will keep invalidating parse requests (that aren't currently running in the parser) until the last modification.
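If those methods were made public, the intended usage would presumably look like this (a sketch only - SuspendParsing()/ResumeParsing() are internal today, and placing them on the document object is an assumption):

```vb
' Sketch: batch modifications between suspend/resume so that at most one
' parse request survives to run at the end.
syntaxEditor.Document.SuspendParsing()
Try
    For i As Integer = 1 To 100
        syntaxEditor.SelectedView.RaiseEditCommand(macro)
    Next
Finally
    syntaxEditor.Document.ResumeParsing() ' parsing may kick off again here
End Try
```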

You had mentioned the repeat count idea in the past and we had said you could try making a custom edit command to combine certain actions in a row that you detect to help perf. Did you ever try that? I'd probably recommend that for your situation.

I use only the Lexical parser - based on definitions in XML files that are loaded as needed - so I think the parsing is done in the UI thread.

I'm assuming SelectedView.Selection.SuspendEvents() also prevents it from firing the various triggers used for outlining etc. However since I currently disable Lexical parsing I guess that part is effectively switched off anyway. I was hoping that SuspendParsing() would allow me to temporarily stop the parsing without having to effectively switch to the text parser and then back to whichever SQL parser I'm using.

I did not get a chance to really look at 'merging' commands in a macro. The problem is that the only things I could probably merge are typed characters - merged into a single insert. I guess I could merge multiple consecutive Delete keys also but that is harder since it is somewhat different than merging typed characters. Similar problem with multiple consecutive arrow keys. (If each command had an associated 'repeat count' then all those would be simple ...)

I need to make other changes for macros (like associating a shortcut with a macro and storing the current 'Find' options and reapplying them before executing the macro) so I'll look at merging typed characters when/if I get approved for that enhancement.

It also appears that each individual command is recorded in the Undo buffer since executing Undo after executing a macro many times takes even longer than executing the macros themselves.

Yes if you are using dynamic languages (XML file definitions only), then they are just lexing and not semantic parsing. The Suspend/ResumeParsing only affects semantic parsing. Lexing is done in the UI thread and a poor original design choice of the WinForms version was to have it fully lex changes to documents, even if those changes are way down past the visible range of the view showing the document. That is something we did much better in the newer WPF version where the only lexing that occurs is that needed to get text tokenized that is before and including text visible in a view. Thus the WPF version can load and display a large multi-MB file instantly, while the WinForms version can take a few seconds.

The selection event suspension just prevents selection changed events from firing when you execute multiple text modifications in sequence. It doesn't affect anything else that is processing the multiple text modifications themselves along the way though.

Yes, in the older WinForms version, multiple sequential modifications are each their own undo item, whereas in the WPF version you can cache up many text edits into a single atomic "text change". That will be one undo/redo entry, in addition to firing various text change events a single time for the entire change, versus how in WinForms every single modification is its own "text change" and thus the events that update views, etc. fire for each modification. That can be a performance bottleneck for large sequential changes in the WinForms version.

If you want to put together a simple sample project showing some performance issues, feel free to do so and send that to our support address. Be sure to reference this thread and rename the .zip file extension of what you send so it doesn't get spam blocked. We can have a look and see if we have any suggestions offhand to speed things up based on the current object model in the WinForms version. It may not help but we could at least have a look.

Do you have any plans to port the newer object model to the WinForms version?

Based on a survey done by the other large component vendor I use, WinForms is still used by far more developers than any of the other technologies they support. (They develop all enhancements on WPF and then port them to WinForms, then to Silverlight, JS, etc. - usually within a few months.)

Porting the newer API back to WinForms is something we had started and stalled on. It would be ideal for us to have the WinForms version on the same newer API since then it would receive all the updates we do to the other platforms, and could be kept in sync very easily. While the backend text API ports easily (and has been ported), the UI portion is drastically different because WinForms relies on raster drawing instead of retained UIElements. We still do hope to revisit this though.

As for the slowdown you're seeing, that doesn't sound like it should be happening. The same changes on the same text should be occurring in relatively constant time. Again, we can take a look if you send us a sample.