Hi,
My team needs to handle a scenario where we receive a continuous stream of data points (training data).
Our plan is that once we collect a specific number of samples, we will train an ML.NET model from them.
These data points will keep coming, so we may need to retrain the model periodically to accommodate the incremental data.
Questions regarding this:
1. After we train the ML.NET model the first time, is it OK to discard the original dataset? (We would save on storage, and since data keeps arriving continuously we cannot keep it forever.)
2. When we retrain the ML.NET model (incremental learning), is there a risk of losing accuracy?
3. Is there a better way to do this? (A rough sketch of what we have in mind is below.)
4. What are the best practices for managing the original data, whether we do batch learning or incremental learning?
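For reference, here is a rough sketch of the retraining pattern we are considering (the file names, the `DataPoint` class, and the sample values are placeholders). It relies on ML.NET's ability to warm-start certain trainers, such as `OnlineGradientDescent`, from an existing model's parameters, so only the newly arrived batch is fed to each retraining pass:

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;
using Microsoft.ML.Trainers;

// Hypothetical shape of the streamed data points.
public class DataPoint
{
    public float Feature1 { get; set; }
    public float Feature2 { get; set; }
    public float Label { get; set; }
}

public static class Retrain
{
    public static void Main()
    {
        var mlContext = new MLContext();

        // Load the data-preparation pipeline and the trained model, saved separately
        // after the initial training run ("prep.zip" / "model.zip" are placeholder names).
        ITransformer dataPrep = mlContext.Model.Load("prep.zip", out DataViewSchema prepSchema);
        ITransformer trainedModel = mlContext.Model.Load("model.zip", out DataViewSchema modelSchema);

        // Extract the learned weights so retraining can warm-start from them.
        var originalParameters =
            ((ISingleFeaturePredictionTransformer<object>)trainedModel).Model
                as LinearRegressionModelParameters;

        // Only the newly arrived batch is used here, not the original dataset.
        DataPoint[] newBatch =
        {
            new DataPoint { Feature1 = 1.1f, Feature2 = 0.3f, Label = 2.0f },
            new DataPoint { Feature1 = 0.9f, Feature2 = 0.7f, Label = 1.8f },
        };
        IDataView newData = mlContext.Data.LoadFromEnumerable(newBatch);
        IDataView transformedNewData = dataPrep.Transform(newData);

        // Passing the original parameters to Fit continues training from them
        // instead of starting from scratch (supported by online/SGD-style trainers
        // such as OnlineGradientDescent or AveragedPerceptron).
        var retrainedModel = mlContext.Regression.Trainers
            .OnlineGradientDescent()
            .Fit(transformedNewData, originalParameters);

        // Overwrite the saved model with the updated one.
        mlContext.Model.Save(retrainedModel, transformedNewData.Schema, "model.zip");
    }
}
```

As far as we understand, only some trainers support this kind of warm-start retraining, which is partly why we are asking whether there is a better overall approach.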
Thanks