I know it's not earcut's goal to be perfect, just to be acceptable in practice. What I'm wondering is whether it's possible to run earcut as a fast optimistic pass and fall back to something slower but more robust for problematic geometry. That of course requires earcut to be aware of when it might be wrong, or, more conservatively, of when the input is possibly problematic. Is there a way to use earcut "conservatively" and bail out with an error if the result isn't guaranteed to be correct?
I know this is probably asking a lot, but it would be great to get earcut's performance in the 95% of cases it can handle, and otherwise fall back to something like tess2.
Put another way: is it clearly known when and where earcut can go wrong? Is it realistic to categorize (hopefully efficiently) inputs that are or are not simple in the eyes of earcut? Even with some performance overhead, if earcut could be used half of the time it would still be a win.
PS: I saw deviation, but I was thinking of something more along the lines of a quick pre-pass or an early bail-out, not a post-hoc analysis. Also, my instinct tells me that just comparing the signed area might not always work. Couldn't there be multiple errors that cancel each other out area-wise?
Thanks
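For concreteness, the kind of two-stage pipeline I have in mind might look roughly like this sketch. robustTriangulate and the tolerance are hypothetical placeholders, not anything earcut provides; earcut.deviation is used only because it's the closest existing correctness signal:

const earcut = require('earcut');

// Optimistic pass: trust earcut unless the area check suggests otherwise.
// The 1e-10 tolerance is an arbitrary assumption, not something earcut prescribes.
function triangulateOptimistically(vertices, holes, dimensions, tolerance = 1e-10) {
    const triangles = earcut(vertices, holes, dimensions);
    const deviation = earcut.deviation(vertices, holes, dimensions, triangles);
    if (deviation <= tolerance) return triangles;
    return robustTriangulate(vertices, holes, dimensions);
}

// Hypothetical fallback: wire this up to tess2 or another robust triangulator.
function robustTriangulate(vertices, holes, dimensions) {
    throw new Error('robust fallback not implemented in this sketch');
}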
After getting a triangulation, you can verify its correctness with earcut.deviation:
var deviation = earcut.deviation(vertices, holes, dimensions, triangles);
It returns the relative difference between the total area of the triangles and the area of the input polygon; 0 means the triangulation is fully correct. Earcut will always produce slight deviations when some edges intersect each other (and big ones in some rare cases), and this will show up in the check result.
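A minimal usage sketch, assuming the flat input format from the earcut README (the 1e-10 threshold is an arbitrary choice, not something earcut defines):

const earcut = require('earcut');

// Square outer ring with a square hole, flattened to earcut's input format.
const data = earcut.flatten([
    [[0, 0], [100, 0], [100, 100], [0, 100]], // outer ring
    [[20, 20], [80, 20], [80, 80], [20, 80]]  // hole
]);

const triangles = earcut(data.vertices, data.holes, data.dimensions);
const deviation = earcut.deviation(data.vertices, data.holes, data.dimensions, triangles);

// 0 means the triangle areas add up exactly to the polygon area;
// a value above a small tolerance suggests earcut struggled with the input.
if (deviation > 1e-10) {
    console.warn('earcut output may be unreliable, deviation =', deviation);
}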
One thought is adding an option to bail out when earcut reaches the earcutSplit phase (which usually means it's having trouble and is entering slow, desperate mode). Although I'm not sure the result is guaranteed to always be correct when earcut doesn't reach that phase; that would need checking.