I have been trying out the built-in widgets in LiveCode 8 and could not find what messages the widgets generate in the API docs. For example, with the navigation bar I assumed it would send a mouseUp with a parameter indicating which icon was clicked. Not so: no mouseUp message was sent. Turning on the message watcher I discovered that clicking on the navigation bar sends a "navigate" message. For the title bar widget I tried the message watcher and found that it sent two messages: a "headerAction" message with the action parameter "new" was sent by the plus icon, and a "back action" message was sent by clicking the "<Back>" button. Somewhere in the widget docs there needs to be a section for the messages sent by each widget, just as there is a section for its properties.

Anyway, that is not the main question I wanted to raise. While I was trying to learn the messages sent by the widgets with the message watcher, I realized that those widgets do not send along the regular set of mouse messages that legacy objects send, i.e. mouseEnter, mouseLeave, mouseUp, mouseDown etc. Looking at the .lcb files that Trevor has referred to in his LCB tutorials, I realized that a widget will only send mouse events if it has a public handler such as OnMouseUp() which contains post "mouseUp" to my script object in it.
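To illustrate, here is a minimal sketch (a fragment of a widget definition, based on the pattern in Trevor's tutorials; the exact post syntax is as I understand it from the LCB docs) of a widget explicitly forwarding mouse events to its script object:

```
-- Sketch only: a widget forwards engine events to its script object
-- solely when the author posts them from a public handler.
public handler OnMouseUp()
   -- re-emit the engine event as a classic "mouseUp" message
   post "mouseUp" to my script object
end handler

public handler OnMouseEnter()
   post "mouseEnter" to my script object
end handler
```

Without handlers like these, the widget's script object never sees mouseUp or mouseEnter at all.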

So having these mouse messages in a widget appears to be optional. My question is: is it a good idea not to have widgets automatically handle and send along messages such as mouseEnter, mouseLeave etc.? The legacy controls do, and as a long-time LiveCode user you expect that behaviour and may at times take advantage of these messages. If the widget author hasn't included the mouse messages, that prevents the user from using them.

Are there benefits to not automatically sending all of the mouse messages as a legacy control would? Could there be a standard option in a widget to have it handle all mouse messages? That would make it easier for the widget builder to include them.

Martin Koob wrote: Looking at the .lcb files that Trevor has referred to in his LCB tutorials

I can't help with the original question, so I'm very sorry for hijacking, but can you point me to Trevor's tutorials? I am struggling even to begin with LCB, and the leap from concept to even the very first, most basic widget is too far for me to jump.

Thanks very much.
My first reaction to your original question is that I would expect "usual" messages like mouseUp etc. to be generated automatically as well. I am not sure whether LC8 just needs to reach more maturity for that to become part of it, or whether my understanding needs to reach more maturity to see why not.

I think the current behavior (developer implements all messages) is the correct behavior. A widget is a blank canvas for creating all sorts of UI elements, some of which don't need to generate the traditional messages that legacy controls do. Generating messages comes at a cost. If a control doesn't need to generate a message then it is more efficient not to.

Think about a control such as a bar chart. What good is a general mouseDown event on a bar chart? Does it provide any real value? Perhaps. But maybe a message for clicking on a bar in the graph makes more sense. The developer would implement a <barClicked> message instead.
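As a sketch of what Trevor describes (the handler body and the bar-lookup helper are hypothetical, not from any shipping widget), a bar chart widget might translate a click into a higher-level message something like this:

```
public handler OnMouseUp()
   -- BarIndexAtPoint is a hypothetical helper that maps a point
   -- to the index of the bar under it, or 0 if none
   variable tBar as Integer
   put BarIndexAtPoint(the mouse position) into tBar
   if tBar > 0 then
      -- post a meaningful message instead of a raw mouseUp
      post "barClicked" with [tBar]
   end if
end handler
```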

Looking over the controls that I've created, I haven't needed mouseMove, mouseUp/Down, etc. in many of them. I have a segmented control that displays arrow head types and the only message I need is segmentSelected. I have another control that needs to trigger mouseDown in part of the control but clicking elsewhere sends a <ShowMenu> message that needs to be acted on. Sending mouseDown if the user clicked anywhere would actually cause incorrect behavior.

Now, while I think that the current behavior (developer defines which messages are sent) is the correct one, I'm not against making it simpler to automatically generate those messages so the developer doesn't have to roll them by hand when creating something like a button.

I understand the attractiveness of the blank canvas for control authors and the efficiency of not sending messages for every mouse interaction in a widget. You suggested having an automatic way of generating the messages for the author if they want the mouse messages generated in their controls. What do you think about the option of a standard widget property, selectable in the property inspector, that would make the widget as a whole (not each element in the widget) generate the standard mouse messages that have not been handled by the author? (I.e. if the author caught mouseDown with an OnMouseDown() handler and posted mouseDown, it would not be generated again by this setting.)

The reason I ask this is that sometimes the end user may see a way to interact with your widget that you did not intend.

For example, I use line graphics as controls and take advantage of the mouseEnter, mouseLeave and mouseUp messages to interact with them. One could say there is no need for a graphic to generate these messages, since a graphic's purpose is just to display a shape on the screen. However, the fact that the graphic behaves like any other LiveCode object allows for creative uses of it, perhaps beyond what was originally intended.

An example of how this might apply to widgets: say I had the navigation bar widget but wanted something to happen when the mouse entered or left it. If the author of the widget did not handle mouseEnter and mouseLeave with OnMouseEnter() and OnMouseLeave() handlers that post the mouseEnter and mouseLeave messages respectively, I would not be able to do that. I would instead have to create another object covering the area around the navigation bar and use its mouseLeave and mouseEnter messages to track when the mouse entered or left the navigation bar. Having a way to turn on these mouse messages in the property inspector for the navigation bar would let me add those messages to the widget so I could handle them.

I am not suggesting that this setting should cause the mouse messages to be sent for each element within the widget, just for the widget as a whole. The author may want the messages turned on as a default, so the default setting for that property would be generateStandardMouseMessages = true. With this option available, authors would not have to code all of the OnMouseUp(), OnMouseDown(), etc. handlers in their widgets.

If it is not possible or advisable to have the widget as a whole behave in this way, it will mean that widgets are a different class of object than legacy objects. As an experienced user I find this difference a bit jarring as I start working with widgets; however, I am sure I will get used to it as I work with them. For new users I think it makes learning LiveCode a bit more difficult: they first learn all the mouse messages for legacy controls, then learn that widgets may or may not use some or all of these. Again, it is something they will be able to learn, but the learning curve gets a bit steeper.

Anyway I am interested in the thoughts of widget authors like Trevor, the LiveCode team and end users of widgets on this.

I think the mouse location events and mouse status functions should be generated by the widget as a whole and should not cause the type of problems Trevor was referring to.

The mouse button events may be more problematic if they were generated in addition to the messages that the widget author has included. I still think it would be good to give the user of the widget the option to turn them on, but I am wondering what others think about the pros and cons.

Here are a couple of interesting things I found while trying out the widgets and mouse messages.

First, working with the navigation bar widget I found that mouseRelease is not handled: if the user clicks and holds the mouse button, then drags the mouse away and releases it outside of the navigation bar, the icon clicked on is still selected. With a button, mouseUp would not be triggered in this case. This may be something to be resolved by the widget author ensuring that mouseRelease is handled, rather than something resolved by automatically generating mouseRelease.

Second, I found that the keyboard shortcut for editing the script of an object (holding the Option and Command keys while clicking the object with the browse tool) does not work with all widgets; e.g. it works with the SVG widget but not the navigation bar or the icon picker widgets. Should this be standard in widgets?

1. I would think that the mouseControl() would return the widget.
2. I don't think there should be an option to turn on any messages that the developer did not explicitly program the widget to send. This could potentially break the intended behavior of the widget and that should not be possible.
3. New users (at least in the future) probably won't be learning about legacy control behavior before learning about widgets. Widgets will be the standard. So I don't think widgets should be designed a certain way just because legacy controls behave that way.
4. I think that messages like mouseEnter/mouseLeave could be sent automatically for the widget. Seeing as it is a single control to the engine I would think that you would want those messages to be sent.

At the moment it is the widget's responsibility to generate all messages which the user might want to interact with. We're trying to go with a 'bottom-up' approach to expanding the widget feature-set - i.e. rather than presupposing the features we need, we are adding them when we have direct use-cases to follow to ensure we are doing so in the 'correct' way (relative to some definition of 'correct', anyway).

There are (at least) two points of view you could take with regard to the handling of mouse messages and widgets (and controls in general). The first is that script should get a chance to process mouse messages first, and then pass them to the widget; the second is that the widget should get the mouse messages first and then dispatch them to script. The current legacy controls do a mixture of the two - some (legacy) controls dispatch a mouse message to script and then, if it is passed, take action; whilst some do the action and then dispatch to script. The slight inconsistency here is why it is not entirely obvious what the 'correct' behavior should be, and thus why widgets currently have complete control (it being the most flexible choice).

I suspect the most sensible option here is something along the following lines:

Mouse location events (mouseMove, mouseEnter, mouseLeave etc.) should be seen as notifications rather than 'actionable' things - they can be used to track mouse movements in script in ways which have no impact on widgets or controls directly at all. This suggests that they should always be sent to script and to widgets - the behavior of one should not impact the behavior of the other.

Mouse button events should be seen as gestures rather than individual events - i.e. a mouse click gesture is the message sequence MouseDown, MouseStillDown, ..., MouseUp/MouseRelease. When such a sequence starts with MouseDown this is sent to script first. If script does not pass the message then it then owns that complete gesture sequence and the widget will not get any part of the gesture. If script passes the message then the OnMouseDown handler fires in the widget and things continue as normal. Since we only deal in complete gestures, you can't have a case where some script 'breaks' the widget because it passes some parts of the event sequence and not others; script can 'hijack' mouse gestures as needed safe in the knowledge that the widget won't ever be aware such a gesture existed (if script does not pass the initial mouseDown); and perhaps most importantly, script will still see the standard mouse messages for widgets, regardless of whether the widget developer needs / wants / has handled them internally to the widget.
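To make the 'hijack' idea concrete, here is a sketch in LiveCode Script of what a widget's script object might contain under this proposed gesture model (the handler names called inside are hypothetical placeholders):

```
-- In the widget's script: handling mouseDown WITHOUT passing it
-- would "grab" the whole click gesture, so the widget
-- implementation never sees OnMouseDown or OnMouseUp for it.
on mouseDown
   showMyCustomPressFeedback -- hypothetical handler elsewhere in the script
   -- no "pass mouseDown", so script owns the rest of the gesture
end mouseDown

on mouseUp
   -- under the proposal, script still gets the matching mouseUp
   doMyCustomAction -- hypothetical handler elsewhere in the script
end mouseUp
```

If the script instead passed (or did not handle) mouseDown, the widget would process the gesture as normal and the script would still receive the resulting messages afterwards.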

In terms of the other things mentioned - mouseControl not working is a bug; as is the fact that the navigation bar doesn't check the location of where the click ends to check whether an action should occur.

The reason I ask this is that sometimes the end user may see a way to interact with your widget that you did not intend

trevordevore:

2. I don't think there should be an option to turn on any messages that the developer did not explicitly program the widget to send. This could potentially break the intended behavior of the widget and that should not be possible.

My 2 cents:
Trevordevore, you created a rotating-SVG widget and it seemed you had intended it for use only as a busy indicator, but I wanted to use it to drive my knob control. I had to go into the .lcb source, add the appropriate mouse-firing code and recompile to get it to be usable for my purposes. If there were a way to force the widget to send the standard mouse messages that the standard controls have normally sent in the past, then I could have used your widget as-is. IMHO this is the way it should be by default. Perhaps allow the widget dev to override this behavior by setting some flag. I do see your point about the need to override, because you could have a widget that contains multiple objects that would need to send/receive individual mouse messages and not allow messages in dead areas in order to work properly. I'm thinking of a piano-keyboard / piano-roll-view widget here!

It seems to me that LiveCode script should have first dibs at trapping the mouse messages. It was my understanding that widgets and lcb-libs sit at the same level as the engine in the message-path hierarchy. Since the traditional behavior was to allow a script to trap and not-pass along messages through the rest of message-path hierarchy, it seems like this is the way widgets should behave as well.

Trevordevore, you created a rotating-SVG widget and it seemed you had intended it for use only as a busy indicator, but I wanted to use it to drive my knob control. I had to go into the .lcb source, add the appropriate mouse-firing code and recompile to get it to be usable for my purposes.

Martin - Mark explained the problem well in his post and also provided (what I think to be) the proper solution. If LiveCode treats the events associated with a click as an entire gesture that the script can ignore/pass/or catch then there isn't a problem with message order (who gets message first? widget or script?) or breaking a widget because a mouseDown is passed but not a mouseUp. The widget can't be broken and the script writer can override behavior as needed.

I am still a bit unsure about the order in which messages travel through the message path where widgets are involved.

First off, I just want to be clear about what you mean by "the script" and "the widget". The script is the LiveCode script you see when you edit the script of the widget in the IDE, and the widget is the LCB script?

So if I understand this correctly
- when a mouseDown handler (or another mouse button event handler?) is in the LiveCode script, the LiveCode script hijacks the mouse gesture and no part of that gesture is seen by the widget, unless it is passed explicitly.

- if there are no mouse button event handlers in the LiveCode script of the widget, then the widget receives those mouse button events and can handle them using the OnMouseUp(), OnMouseDown() handlers etc. in the LCB script.

In those LCB mouse button event handlers you can post mouseUp, mouseDown etc. Are those then sent back to the LiveCode script of the widget?

Sorry it is taking me a while to get this. Thanks again for the help with this.

Okay so there are two things here... How things work at the moment (1) and how I propose things should evolve (2).

The code you write in LCB (let's call it the 'widget implementation') sits at the same level as the C++ code in the engine which controls the 'legacy' controls. There's a C++ wrapper control in the engine (MCWidget) which provides a host environment for the LCB code. It manipulates and marshals various parts of the event stream and other aspects of the engine to give a reasonably 'clean' set of events which the widget implementation sees. From the high-level perspective you have a widget object on which you can set a script (let's call this the 'script object'). You can think of it as the script object being the realization of a widget implementation in the LiveCode environment.

(1) At the moment widget implementations are entirely responsible for dispatching all messages to their script object. This means that if a widget implementation does not react to, or post the mouse event messages, the script object will not see them. This mirrors how the legacy controls are written in the engine - they, for the most part, individually control what event messages are sent to the script objects and when.

(2) My proposal is to change this slightly to ensure there is a reasonable level of 'default behavior' in widgets, generalizing and improving the current (although not entirely consistent) default behavior amongst the existing legacy controls. Using a mouse click as an example, here is my current suggestion of how it would work:

The engine receives a mouseDown event on a widget.
A mouseDown message is sent to the script object of the widget.
If the mouseDown message is handled (and not passed) then the script object will be considered to have 'grabbed' the mouse click gesture and will then receive the mouseUp / mouseRelease message at some later point - the widget implementation will not see the OnMouseDown event or the subsequent OnMouseUp.
If, on the other hand, the mouseDown message is not handled then the widget will receive the OnMouseDown event and subsequently the OnMouseUp event. In this case script would still receive the mouseUp / mouseRelease message *after* the widget has processed the OnMouseUp event - this is to ensure that (just like widgets) the script object always gets matched pairs of mouseDown / mouseUp.

With this logic, the script of a widget can choose to 'block' the widget implementation from processing the mouse click gesture but if it chooses not to then the widget will handle it.

For the other mouse events, such as mouseEnter / mouseLeave / mouseMove then they are just notifications so script and the widget will get the events to process.

This logic should mean that any widget which does not handle mouse click events will still allow the script object to handle them (without any additional work); and widgets that do handle mouse click events still generate the script messages which you might expect.

Let's take an example of a push button widget. I'm imagining here that the push button widget will dispatch a 'clicked' message to script when a mouseUp event is received within its bounds. The order of events would be as follows:

In the case the script object passes (or does not handle) the mouseDown event:
mouseDown to script object
OnMouseDown to widget
...
OnMouseUp to widget
clicked to script object
mouseUp to script object

In the case the script object handles (and does not pass) the mouseDown event:
mouseDown to script object
...
mouseUp to script object

This hopefully gives the best of both worlds - script can control whether widgets handle the mouse gestures, and still get notification about significant points in the gesture; whilst widgets are guaranteed to get a mouse gesture to process in entirety (if script decides it should be so) and thus an opportunity to reinterpret the gesture as a higher-level event (such as clicked, or 'barClicked' in the example of a graphing widget as suggested by Trevor).
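The widget side of this push button example could be as simple as the following LCB sketch (the widget name is made up, and the handler body is my assumption about how the proposal would look, not working code from the engine):

```
widget com.example.pushbutton

public handler OnMouseUp()
   -- Under the proposed gesture model, this only runs if the script
   -- object passed (or did not handle) the initial mouseDown.
   -- A real widget should also check that the release happened
   -- inside its bounds before treating the gesture as a click,
   -- per the navigation bar bug noted above.
   post "clicked" to my script object
end handler

end widget
```

The engine would then deliver the mouseUp message to the script object after this handler completes, matching the first event sequence above.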