This repository was archived by the owner on Dec 1, 2022. It is now read-only.

Adding support for autoscaling based on a table tag as well as custom per-table config. #59

Open · wants to merge 7 commits into master

Conversation

CSilivestru

I'd like to be able to autoscale tables based on an autoscaling tag set on the table. This PR allows the end user to either use the default autoscaling tag name or specify one through an environment variable :-).
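A minimal sketch of the tag check described above (the function names and the `AUTOSCALE_TAG_NAME` variable are illustrative, not the PR's actual identifiers): the tag name defaults to a constant unless an environment variable overrides it, and a table opts in when it carries that tag.

```javascript
// Default tag name, overridable through an env var (name is hypothetical).
const DEFAULT_AUTOSCALE_TAG = 'autoscale';

function getAutoscaleTagName(env) {
  return env.AUTOSCALE_TAG_NAME || DEFAULT_AUTOSCALE_TAG;
}

function isAutoscalingEnabled(tableTags, env) {
  // tableTags has the shape returned by DynamoDB's ListTagsOfResource,
  // e.g. [{ Key: 'autoscale', Value: 'true' }].
  const tagName = getAutoscaleTagName(env);
  return tableTags.some(tag => tag.Key === tagName && tag.Value === 'true');
}
```

Because the tag lives on the table rather than in the code, flipping autoscaling on or off needs no Lambda redeploy.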

In addition, I wanted a way to keep latency down while still supporting custom, moderately flexible, per-table configs. I included the idea of a CustomProvisioners file whose keys are the table names you want the config to apply to, and whose values are table-specific versions of the default config.

E.g.:

{
  table_name_1: {
    ReadCapacity: { Min: 5, Max: 100 ...},
    WriteCapacity: { Min: 1, Max: 50 ...},
  ...
  },
  table_name_2: { ... }
}
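The lookup itself can be sketched as follows (a sketch only; the `getConfigForTable` helper and the sample capacity values are illustrative, mirroring the example config above rather than the PR's actual code): if the table name appears in the custom provisioners map, use that entry, otherwise fall back to the defaults.

```javascript
// Baseline config applied to any table without a custom entry.
const defaultConfig = {
  ReadCapacity: { Min: 1, Max: 10 },
  WriteCapacity: { Min: 1, Max: 10 },
};

// Keys are table names; values override the defaults (values are examples).
const customProvisioners = {
  table_name_1: {
    ReadCapacity: { Min: 5, Max: 100 },
    WriteCapacity: { Min: 1, Max: 50 },
  },
};

function getConfigForTable(tableName) {
  // Object.assign ignores an undefined source, so unknown tables
  // simply get a copy of the defaults.
  return Object.assign({}, defaultConfig, customProvisioners[tableName]);
}
```

A shallow merge like this keeps the common case (no custom entry) identical to the current default behavior, which is what makes the change backwards compatible.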

I'm quite new to dynamo and this autoscaler project in general so any feedback is always appreciated. I tried to design this in a way that would still be backwards compatible to all current defaults.

Commit messages:

- We can afford the extra 20c per month.
- Make the keys in this file the same as the table you want the config to apply to.
- This would allow us to turn autoscaling on and off of tables without requiring changing the lambda function code and redeploying it.
- If it's there, great! Use it. Otherwise, just stick with the default.
@tmitchel2 (Member)

Hey, the use case is totally valid. The issue at the moment is that everyone's requirements are different, and accepting them all into the main repo would quickly turn it into a mess.

For this reason, in my spare time I've been completely refactoring the project on my experimental branch, which will allow for better pluggability of custom features. It's not complete yet, but support for this kind of thing will definitely be accepted once it's ready.

@CSilivestru (Author)

Ok that sounds good -- I will just continue to run off of my branch and keep an eye on progress here as best I can :-). Thanks for your quick response, btw!
