I’m making polygon objects for tiles in a tileset and then reading those values into Unity, but I appear to have run into a behavior of Tiled that doesn’t make sense to me.

In my test scenario (to make sure I wasn’t crazy) I am making a collider object using the “insert polygon” tool in the tileset editor. The collider/polygon in question is an isosceles right triangle with the 90 degree corner in the top-right corner of the tile.

Here’s the problem: when I draw the polygon, the resulting coordinates for each point, as shown in the resulting .tsx file, differ depending on which corner I start in. If I start drawing the polygon in the top-left corner, I get the following:

<polyline points="0,0 32,0 32,32 0,0"/>

If I start in the top right corner this is the result:

<polyline points="0,0 0,32 -32,0 0,0"/>

All the points are in exactly the same positions relative to the tile, and the resulting polygon is in the same orientation relative to the tile. But the position data for the points differs.

I fail to understand why this is happening, and it’s really screwing with how I import this data into Unity.
Is there a way to make these positions consistent regardless of order? Or is there some other element to how this part of Tiled works that I need to understand?

Also, if this is intended behavior, it should at least be mentioned in the documentation. Perhaps it is there, but I was unable to find it.

Thanks.

EDIT: I should note that my tests were all done with clockwise placement of points.

The positions are relative to the starting point, which is always taken to be 0,0. If you want consistency, then you’d have to draw your shapes consistently - always start in the same corner (e.g. the top-left-most point), and always go in the same direction (clockwise or counterclockwise).
To get the points’ positions relative to the tile, you need to add the parent object’s x and y coordinates to each point.
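That shift can be sketched like this (a minimal example: the polyline string comes from the top-right drawing above, and the parent `<object>` carrying `x="32" y="0"` is an assumption about where Tiled anchored the first click; the helper name `tile_relative_points` is just for illustration):

```python
import xml.etree.ElementTree as ET

# Hypothetical <object> fragment as it might appear in a .tsx file:
# the polyline points are relative to the object's own x/y position.
OBJECT_XML = '''
<object x="32" y="0">
  <polyline points="0,0 0,32 -32,0 0,0"/>
</object>
'''

def tile_relative_points(object_xml):
    """Shift a polyline's points into tile coordinates
    by adding the parent object's x and y to each point."""
    obj = ET.fromstring(object_xml)
    ox, oy = float(obj.get("x")), float(obj.get("y"))
    pairs = obj.find("polyline").get("points").split()
    points = []
    for pair in pairs:
        px, py = map(float, pair.split(","))
        points.append((px + ox, py + oy))
    return points

print(tile_relative_points(OBJECT_XML))
# The triangle drawn from the top-right corner now yields the same
# tile-relative positions as the top-left drawing: (32,0), (32,32), (0,0),
# closing back at (32,0) -- only the starting point in the list differs.
```

The same shift applies to any object shape in the collision editor, since every child point is stored relative to the object's own position.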

I’d also prefer the point coordinates to be relative to the tile’s pixels to begin with. The direction isn’t something that can be “fixed” (normalized), though, as there are reasons one may need CW or CCW lines, and it’s good to have easy access to both.

I thought about your suggestion as a workaround, but what if your collider/polygon doesn’t need a point in the common starting corner? Take my triangle tile example: if my start point for all my polygon colliders is the top-left corner, but then I add a tile that is a vertically flipped version of the image in my original post, the top-left corner shouldn’t have a point. Which means the workaround doesn’t cover all cases.

This origin-as-the-starting-point approach seems pretty poorly conceived. I would love it if the developer could clarify why in the world he chose it; it would be nice to at least understand. But I’ll settle for a workable solution. Unfortunately, since the starting point is the origin, I can’t think of any way to get consistent import results, because I have no way of knowing which corner the origin is in. It’s totally relative, with no consistent reference point.

if your collider/polygon doesn’t need a point in the common starting corner?

That’s why I said “e.g. top-left-most point”. This need not be the literal top-left corner; it can be anything else.
Really, even that isn’t necessary as long as your direction is consistent: once you add the object location, you’ll have a fairly consistent set of coordinates relative to the top-left corner (which is 0,0 no matter where your starting point is). Having a consistent-ish starting point is only a convenience for visualizing the math; it doesn’t actually impact anything once you’re dealing with tile-relative coordinates and consistent directions.

Forgetting to add the object coordinates seems to be the main source of your difficulties. (But as I mentioned before, the fact that this needs to be done at all seems silly to me; I don’t know why the coordinates aren’t relative to the tile’s top left in the first place.)