There are areas where performance is known to be bad, and it would be useful to have more measurement and understanding of this overall. Specifically, these are the issues I've wondered about:
JSON, and especially encoding/json. I'd guess that (un)serializing everything is a majority of CPU time in anything with dynamic data.
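A minimal benchmark sketch of how that could be measured with the standard testing package; the object shape and names below are illustrative assumptions, not the project's actual wire format:

```go
// Benchmark sketch for measuring encoding/json cost on dynamic property data.
// The map shape below is an assumption, not the actual message format.
package bench

import (
	"encoding/json"
	"testing"
)

var sampleObject = map[string]interface{}{
	"identifier": "obj_1",
	"data": map[string]interface{}{
		"title":   "Example",
		"count":   42,
		"enabled": true,
		"items":   []interface{}{1.0, 2.0, 3.0},
	},
}

func BenchmarkMarshalObject(b *testing.B) {
	b.ReportAllocs()
	for i := 0; i < b.N; i++ {
		if _, err := json.Marshal(sampleObject); err != nil {
			b.Fatal(err)
		}
	}
}

func BenchmarkUnmarshalObject(b *testing.B) {
	raw, _ := json.Marshal(sampleObject)
	b.ReportAllocs()
	for i := 0; i < b.N; i++ {
		var out map[string]interface{}
		if err := json.Unmarshal(raw, &out); err != nil {
			b.Fatal(err)
		}
	}
}
```

Running `go test -bench . -benchmem` against data shaped like the real messages would show how much of the cost is allocation inside encoding/json versus the traversal itself.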
Update property values individually after changes (#15): property changes currently update the entire object, so when any property changes, everything is serialized and sent, with no compression. Likewise, on the Qt side, change signals are emitted for every property.
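A hypothetical sketch of what a per-property update could look like instead of re-marshaling the whole object; none of these type or field names come from the codebase, they only illustrate the shape of a delta-based update message:

```go
// Hypothetical delta-update sketch: send only the properties that changed.
package sketch

import "encoding/json"

// propertyUpdate carries just the dirty properties of one object.
// The field names and "UPDATE" command are assumptions for illustration.
type propertyUpdate struct {
	Command    string                 `json:"command"`
	Identifier string                 `json:"identifier"`
	Changed    map[string]interface{} `json:"changed"`
}

// encodeChanged builds an update message from a dirty-property set rather
// than re-marshaling every property of the object.
func encodeChanged(id string, dirty map[string]interface{}) ([]byte, error) {
	return json.Marshal(propertyUpdate{
		Command:    "UPDATE",
		Identifier: id,
		Changed:    dirty,
	})
}
```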
Marshaling a QObject requires an extra recursive scan of the object (through reflection) for object initialization. I think this is significant, but if the serialization were a bit more customized, both could happen in a single pass.
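A rough sketch, under assumptions about how exported objects are represented, of folding the initialization scan into the marshal pass so the object is only walked once. `objectRef` and the traversal below are illustrative and intentionally incomplete (pointers, slices, and maps are not recursed, and struct tags are ignored):

```go
// Hypothetical sketch: collect nested object references while marshaling,
// instead of doing a separate reflection pass for initialization.
package sketch

import (
	"encoding/json"
	"reflect"
)

// objectRef stands in for whatever interface exported QObjects satisfy.
type objectRef interface {
	Identifier() string
}

// marshalAndCollect serializes v and appends any objectRef values found
// along the way to seen, so initialization can reuse the same traversal.
func marshalAndCollect(v interface{}, seen *[]objectRef) ([]byte, error) {
	if ref, ok := v.(objectRef); ok {
		*seen = append(*seen, ref)
	}
	rv := reflect.ValueOf(v)
	switch rv.Kind() {
	case reflect.Struct:
		out := make(map[string]json.RawMessage, rv.NumField())
		t := rv.Type()
		for i := 0; i < rv.NumField(); i++ {
			if !t.Field(i).IsExported() {
				continue
			}
			enc, err := marshalAndCollect(rv.Field(i).Interface(), seen)
			if err != nil {
				return nil, err
			}
			out[t.Field(i).Name] = json.RawMessage(enc)
		}
		return json.Marshal(out)
	default:
		// Sketch only: non-struct values are handed straight to encoding/json,
		// so references nested inside slices or maps would be missed here.
		return json.Marshal(v)
	}
}
```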
Models? A lot may change here. Overall, they share the same problems with serializing large amounts of data. There are also interesting questions about how to keep data on the client side and what caching might make sense.
On the Qt side, the performance of the types and conversions is unknown to me. There is a lot of conversion between QJsonValue and QJSValue, and between those and the raw types needed for metacalls. I'd be surprised if all of this is being done optimally.