Integration testing via pytest-docker #315
Conversation
Tests pass in GitHub; I'm happy enough for now to mostly leave it alone until you all get a chance to review.
more to clean up
netbox-docker should let us pull the latest minor release by just specifying the major release number, e.g. v2.9 will pull the v2.9.11 image. At least, I'm pretty sure. I think I'd rather manually specify which releases we're testing against. That way when a new version comes out we don't have the potential to randomly start failing tests. It's nice to know, but probably frustrating on a PR that deals with something else entirely.
Yep, those all make sense, will have to familiarize myself with pytest-docker before I can help here.
Could just provide a more specific path to the unit tests if you wanted to bypass, e.g.
This is a great start, had no idea we could do all this in pytest. I'd like to convert at least a few of the fixture-based tests over to this to get a better idea of how we'd use it in practice. Particularly curious about how we can best handle dependencies, like how testing devices requires us to add a device_type which, in turn, requires a manufacturer, etc.
Ah yeah, great point, I totally understand that. So then right now we would hard code the short versions 2.8, 2.9, and 2.10 only and let the containers spin up to whatever the latest patch version is automatically, correct? That works just fine, as I'm only testing the major/minor version at the moment anyway because of the version reporting pre-2.10.
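Pinning to minor releases like that could look something like a parametrized version list in conftest. A minimal sketch, assuming pytest-docker drives the containers; the fixture name and list are illustrative, not this repo's actual code:

```python
# Hypothetical sketch: pin the NetBox minor versions under test.
# netbox-docker resolves a minor tag like "v2.9" to its latest patch image,
# so bumping this list is a deliberate, reviewed change rather than a
# surprise CI failure when upstream releases something new.
import pytest

NETBOX_VERSIONS = ["v2.8", "v2.9", "v2.10"]

@pytest.fixture(params=NETBOX_VERSIONS, scope="session")
def netbox_version(request):
    """Yield each pinned NetBox version once per test session."""
    return request.param
```

Any test that takes `netbox_version` as an argument would then run once per pinned version.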
Ah I meant mostly as a way to single out and run only the fast-running unit tests, to make sure your code at least has no major holes in it before running a much longer (comparatively - there are no tests yet, but I'm sure the integration suite will just get longer and longer as time goes on) integration test. It's been handy to me a few times in other repos, but that may just be my style; if you don't see a need then that's good with me :)
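One common way to single out the fast unit tests, as discussed above, is a pytest marker rather than a path glob. This is a sketch of the general technique, not necessarily this repo's layout:

```python
# Sketch of marker-based test selection (assumed layout, not this repo's actual one).
# The marker would be registered in pytest.ini / setup.cfg:
#   [pytest]
#   markers =
#       integration: slow tests that spin up Docker containers
#
# Then selected at the command line:
#   pytest -m "not integration"   # fast unit tests only
#   pytest -m integration         # container-backed tests only
import pytest

@pytest.mark.integration
def test_full_device_lifecycle():
    """Placeholder integration test; would talk to a live NetBox container."""
    assert True

def test_pure_logic():
    """Placeholder unit test; no containers needed."""
    assert 1 + 1 == 2
```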
Yes! I wanted to bring up the dependency issue myself as well once you had looked this over. There is the https://github.com/netbox-community/devicetype-library repo that we could import a few device types from to test with after the containers are spun up.. manufacturers are literally just a name, so those can be read from the yaml config for each of the device types we would load and populated at the same time. And we could pin a version of that repo as well (bumping occasionally) so that no failures are introduced spontaneously by external updates. This solves for servers and switches, but I haven't seen any sort of patch panels included in the repo yet.. I may be wrong though. We could just send up a PR to add a few of those there, or perhaps keep a few yaml models in this repo.. we wouldn't need many. Now that I think about it, they may not allow patch panels in that repo because of how much of a gray area they are implementation-wise - one setup may not have the same needs as another. Thank you for checking this out, I'll go ahead and get started on the points you OK'd above.
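The devicetype-library import could be sketched roughly like this. The field names mirror that library's usual YAML shape (`manufacturer`, `model`, `slug`, `u_height`), but the helper itself is hypothetical:

```python
# Rough sketch of turning one devicetype-library YAML definition into the
# payloads NetBox's API expects. The definition dict is assumed to come from
# yaml.safe_load() on a file in the devicetype-library repo.

def payloads_from_definition(definition):
    """Split a parsed device-type YAML into manufacturer and device-type payloads.

    The manufacturer must be created first, since the device type depends on it.
    """
    manufacturer = {
        "name": definition["manufacturer"],
        "slug": definition["manufacturer"].lower().replace(" ", "-"),
    }
    device_type = {
        "model": definition["model"],
        "slug": definition["slug"],
        "u_height": definition.get("u_height", 1),
    }
    return manufacturer, device_type
```

With a pynetbox-style client this would feed something like `nb.dcim.manufacturers.create(...)` followed by `nb.dcim.device_types.create(...)`, in that order, which is exactly the dependency chain mentioned above.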
Yep, I think the API only makes breaking changes between major releases. If not, we can revisit pinning to minor releases.
I follow. I was just saying we can do that now if we provide a more specific path to pytest. Running the old fixture-based tests is a little tough going that route, though those are hopefully on the way out anyways. :)
Yeah, think I like that approach. We should still test adding a device-type from the API, but it'll probably get old having to make all the ones we'll need for some of the interface and cable tests.
Gotcha, sounds good then. I have most of the changes done locally now; I just need a little more time to clean it up before sending. After that set goes in I'll add a basic setup of the devicetype-library integration as well to get some initial devices populated in the containers.
Basic pytest args should work as expected now.. currently netbox 2.8 does not spin up correctly though via netbox-docker. I'll have to look into that if you want to test that far back. I also added a comment about the container cleanup hacks only being needed until avast/pytest-docker#33 gets resolved.
Nice, looks good!
hrm, what's it say? Was able to find a tag for it on docker-hub. I feel like we probably should test at least that far back. 2.9 is only ~6mo old. Once this is settled though, we probably should come up with some kind of policy around minimum supported version.
It's an error in the "netbox" container when running the setup so I assumed it was an issue in the 2.8 tagged image. I'll dig a bit. Last lines:
Ah yeah, getting the same thing.
I asked how far back they want to test in the NTC slack.. if I can't get an answer there I'll open an issue and ask directly in the repo. First iteration of device imports works now.. let me know what you think. I added a basic test as well. |
There are many options for how to set up testing; I only guessed for the test I added in the last change. If you want to change it, feel free.
OK.. from talking in the NTC channel one of the maintainers wrote this in response to my post:
How would you feel about checking out multiple copies (different tags) of the netbox-docker repo so we could run as many versions as we want? It would need a little more code and maintenance, but I think the flow would be fairly simple still. Just a thought.
Yeah, saw the reply in the NTC slack. That makes sense. We'll probably need to map repo & NetBox version together at some point anyways. So yeah, we could use it, but I don't think we have to have it immediately.
- Added support for older netbox versions
- Re-added some changes that I had accidentally removed with a force push earlier when fixing a messed up merge
…netbox into master.pytest-docker
Woohoo! Well I had some time and decided to just do it. Checking it out side by side works with a few minor tweaks. The test run time is up to about 4 minutes though; maybe we should bump the max timeout pytest-docker will wait for the containers to spin up from 5 min to maybe 8?
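The side-by-side checkout idea could look roughly like this. The repo URL is real, but the tag mapping and helper are illustrative; in practice the version-to-tag map would be maintained by hand as netbox-docker releases come out:

```python
# Illustrative sketch: plan one clone of netbox-docker per NetBox version,
# each checked out at a tag believed to support that version. The function
# only builds the git commands; a caller would run them via subprocess.
from pathlib import Path

NETBOX_DOCKER_URL = "https://github.com/netbox-community/netbox-docker"

def checkout_plan(version_to_tag, base_dir="."):
    """Return the git commands that would create one checkout per version."""
    commands = []
    for version, tag in version_to_tag.items():
        dest = str(Path(base_dir) / f"netbox-docker-{version}")
        commands.append(
            ["git", "clone", "--depth", "1", "--branch", tag, NETBOX_DOCKER_URL, dest]
        )
    return commands
```

Keeping it as a plan of commands (rather than cloning eagerly) makes the mapping easy to inspect and test without touching the network.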
Oh snap, nice work! Yeah, bumping the timeout makes sense. I wonder if, ultimately, we'd be better off putting these in their own workflow 🤔. Anything specific you want help on?
Ah yeah I gotcha. To split out to its own workflow all we would need is some args to select only unit vs integration tests, but that's doable, I've done it before. Right now I think it's fully functional.. I'd like your input on the current implementation of the setup and actual tests, and to know what style or content you want in the docstrings for args/returns; I'll check out the rest of the codebase for examples in a bit. I think it's just about ready now. I'll bump the timeout for container generation as well for now.
class TestSimpleServerRackingAndConnecting:
Instead of setting up class-based tests where we would have to do lots of spin-up repeatedly, we could also use something like the pytest-depends module to just spin up some commonly used objects and reuse them a lot.. but that has its own caveats as well. Maybe a mix of the two is best?
Seems to work now with
Ok, yeah, made a quick pass at it. I might have to stew on the functional approach you have in test_dcim a little more. My fear is that it'll quickly get out of hand with a multitude of tests that do a lot of the same thing (e.g. making devices), and it might be difficult to tell exactly what's under test. OTOH, we might run into the same problem either way.
Can you select only unit tests currently though? They are pretty spread out, unless you do a glob match of
Yeah, I worry about it also. I had just left a comment about using dependencies via pytest-depends as an alternative, you just have to be careful about not modifying the items between the dependent tests. Maybe tests that would modify items would use classes with dedicated objects and tests that only need to read objects could use commonly created objects via dependencies? |
I think that's fair, and yeah, it'd depend on deprecating the fixture-based tests for these so that eventually we only have
Hey I hope things are going well :) Is there anything else you need done on this at this time, or are you just holding on it for testing? I only ask because I should have some spare time this weekend to put in if needed. Thanks
👋 I don't think so, think it's a great foundation. I've been meaning to spend some time on it to see if we could generate tests similar to how we do with the fixture-based ones (i.e. test each endpoint with a CRUD request). Wanted to do that today, but it doesn't look like that's going to happen. I'll see if I can't take a swing at that over the weekend or sometime next week.
Oh no rush, I was just checking in. Don't rush it on my part, as I won't be able to extend this much in the next few weeks anyhow. Sounds good, if I can help set up a few tests closer to how you want just let me know. Have a good weekend!
Didn't make it very far into my idea before realizing it's probably too involved to get done in this PR. I'll CC you when I get a chance to finish it out. Many thanks for tackling this!
Ah right on, no worries thank you for the assist and for the module itself! So just to confirm, I should wait to actually build any tests until you get a solid framework pattern going right? |
Probably, just so we're not duplicating effort, but it might be a few weeks before I can get to it. The idea was that we could basically set up session-scoped fixtures of anything that would get used more than once (e.g. site, rack(s), devices, etc.). Then we'd go through each endpoint methodically by running create, retrieve, update, and finally delete operations on each. Obviously for stuff we've created in the fixtures we wouldn't run another create, but that also means coordinating our tests so that we deleted things in the right order. There's a few implementation details like that I still need to get into, but that's the gist. I just figured a more methodical approach would mean better coverage and fewer redundancies when we did identify gaps. You're a lot more familiar with it than I am though; so does that make sense?
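The session-scoped fixture idea might be sketched like this. The `api` fixture is assumed to be a pynetbox-style client pointed at a test container, and the object names are illustrative; the creation logic lives in plain helpers so it stays easy to test in isolation:

```python
# Sketch of session-scoped fixtures for shared objects. `api` is assumed to
# be provided by another fixture (a pynetbox client for the container under
# test); endpoint names follow NetBox's dcim API layout.
import pytest

def make_site(api):
    """Create the one site shared by every test in the session."""
    return api.dcim.sites.create(name="test-site", slug="test-site")

def make_rack(api, site_id):
    """Create a rack inside the shared site."""
    return api.dcim.racks.create(name="rack-1", site=site_id)

@pytest.fixture(scope="session")
def site(api):
    return make_site(api)

@pytest.fixture(scope="session")
def rack(api, site):
    # The rack fixture depends on the site fixture, mirroring the
    # dependency chain discussed above.
    return make_rack(api, site["id"])
```

Tests that only read could share these objects, while tests that mutate would create their own, which lines up with the class-vs-dependencies mix suggested earlier.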
So essentially there would be thousands of fixtures then, right? Like
😬 I hope not, maybe at most a few dozen. Not looking to test each device-type, just a more fundamental test of whether we're interacting with the endpoint properly. A cursory example would be where we create a couple devices as session-scope fixtures. Then when we got to say the
Right on :) Sounds good to me! |
No rush here either - I don't have this at a place it's ready to merge yet but I would love to get your input on the direction to take. So far I have a very basic integration system set up but it needs a lot of polish added.
Running list of goals:
- Ability to automatically determine which versions of netbox to test against. Do you know what to test against here? Off the top of my head I would think to scrape the latest versions from the netbox releases github page and pull the latest version of each of the 3 most recent major/minor combos.. so like right now it would be v2.10.2, v2.9.11, and v2.8.9. Thoughts?
- Add a pytest flag to disable all integration tests? Should the default be enabled or disabled?
- Is there anything else that should be done for a basic integration suite feature add?
The short version of what this does is that when pytest runs it clones the netbox-docker repo and then reads the docker-compose.yml file included in that repo as a base config, but renames some things in there (mostly prefixing names with the netbox version for that group of containers, and putting them all in a dedicated network for that netbox version) so that we can spin up multiple versions of netbox at once by writing out multiple slightly altered docker compose files based off of that base config. Is there a better way to test against multiple versions at once?

Right now, with me testing against 2 versions of netbox, the total run time is about 2 and a half minutes. Adding more netbox versions does not really add much additional time, just load on the system, since most of the time is spent populating the database and caching static files from what I've seen so far.
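The per-version compose rewriting described above could be sketched as a small transform over the parsed compose file. This assumes the base config has already been loaded (e.g. with `yaml.safe_load`) into a dict; the key names follow the usual docker-compose layout:

```python
# Sketch: take the base docker-compose config from netbox-docker (as a dict)
# and produce a per-version variant with prefixed service names and a
# dedicated network, so several NetBox versions can run side by side.
import copy

def compose_for_version(base, version):
    """Return a deep copy of `base` scoped to one NetBox version."""
    cfg = copy.deepcopy(base)
    network = f"{version}-net"
    services = {}
    for name, service in cfg.get("services", {}).items():
        service = dict(service)
        # Attach every container in this group to the version's own network.
        service["networks"] = [network]
        services[f"{version}-{name}"] = service
    cfg["services"] = services
    cfg.setdefault("networks", {})[network] = {}
    return cfg
```

Each result would then be written back out (e.g. `yaml.safe_dump`) as one of the slightly altered compose files mentioned above.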
This is just a jumping off point, let me know what you think. Thanks :)