diff --git a/docs/rendering/animation-systems.md b/docs/rendering/animation-systems.md
index 6d19f74..284541b 100644
--- a/docs/rendering/animation-systems.md
+++ b/docs/rendering/animation-systems.md
@@ -23,11 +23,13 @@ FrameDisplayTimeInMilliseconds = AnimationClip.FrameTimeInIntervals * 25;
 
 Before diving into the accuracy of this estimate (_spoiler alert - it's slightly off_), some questions immediately spring to mind:
 - Why is the frame time given in "intervals" (as ACT editor calls them) in the first place?
-- Where does the 25 ms time value that has so far been used by the community actually come from?
+- Where does the 25 ms time value that has so far been used by the community actually come from? [^1]
 - Is it at all correct, and is there a way to precisely measure sprite animation times to verify/disprove this?
 
 The following sections attempt to answer these questions, as far as is realistically possible.
 
+[^1]: The earliest known source appears to be [an earlier specification of the ACT format](https://web.archive.org/web/20200220130616/http://mist.in/gratia/ro/spr/ActFileFormatFix.html), which mentions a tool called actOR.
+
 ### Animation State Updates
 
 In order to answer the first question, a bit of guesswork is needed. There's of course no way of telling for sure why this peculiar unit of measurement was chosen by the game's developers. But knowing a bit about the different animation systems in the client, a likely explanation is that sprite animations use a different mechanism altogether (e.g., clock-based state machines).