Proposal: Use DB migrations for keeping the database up to date. #141
I'm not too worried about that; I think a dev should commit a migration for anything they want to keep. The big downside IMO is the extra work it involves, but I don't see a better way. Maybe we could build a tool that would automate part of the work to create a migration, but that'd be a future iteration.
I love the idea, but I think we may need to solve some other problems first. I think the Meta Environment was a good step forward from where we were when it was created, but it hasn't really been successful. I think the main reason for that is that the majority of Meta committers (myself included) don't use it as their daily environment, so they don't keep it in sync with production, fix problems with it, etc. It's "only" a tool to help non-committers, which makes it extra work on top of our already packed schedules. I think we need to consider switching to a model where committers and non-committers use the same environment. That could mean pivoting the Meta Environment, but could also mean deprecating it in favor of something new. It'd probably be good to decide that before we invest a lot of time/energy in something like this.
Great point! If you find some time, do you mind listing some of the shortcomings here? Specifically, why don't you use the meta environment?
There's probably a few more that aren't jumping out at me, but:
For the past few years I've just been using a local MEMP setup, where I can have everything running at once, without any performance issues.
The meta environment provisioner is unnecessarily complex, which is why it's slow. Otherwise performance is rather good, and on Windows it's identical to Docker when using Hyper-V (since Docker uses a Hyper-V VM running Alpine Linux as a container host). I would also note that WP Env and Docker require Windows 10 Pro/Enterprise. Windows 10 Home users would be excluded from contributing if WP Env were adopted. Likewise, the WordCamp docker image is very tightly coupled in the way it's built, making the setup scripts non-portable. Using them in VVV/WSL/other containers/natively is a non-starter.
Note that you can
There are plugins in the meta environment that aren't on production and vice versa. Closed source isn't so much a concern as just making sure the plugins and versions are accurate. I don't see why automation couldn't solve this.
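One hedged way to automate that parity check, assuming plugin lists are exported from each environment with `wp plugin list --fields=name,version --format=csv` (the file names here are made up for illustration):

```shell
#!/bin/bash
# Hypothetical helper: compare plugin name/version lists exported from
# production and the meta environment, e.g. via:
#   wp plugin list --fields=name,version --format=csv > production-plugins.csv
# Prints entries that exist in only one of the two environments.
compare_plugin_lists() {
    # comm(1) needs sorted input; -3 suppresses lines common to both files
    comm -3 <(sort "$1") <(sort "$2")
}
```

In `comm -3` output, production-only entries appear in the first column and meta-only entries in the indented second column, which gives a quick drift report.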
I was referring to things like TTFB, but that may have improved in the past few years. It'd be great to speed up the provisioner as well, though.
That's a great point.
That wouldn't solve the UX issue for me, but you mentioned Traefik elsewhere, which might. Although then I'd still have to have everything running at the same time. Maybe I'd just end up using the provision scripts and database, and running my own MEMP for the rest. That's just me, though, and most folks would probably want a container, including other committers. As long as most committers were using the container, and almost all committers were using the provisioning & db scripts, then that could work. If you think it's possible to solve the above problems w/ a refactor, then a new issue to discuss the details would be great! If that's solved, we could circle back here to discuss better ways to keep the sample db up to date.
I'd also note that Vagrant VMs don't use
I would ideally prefer a solution that is environment independent, so anyone could choose to use VVV, MEMP, XAMPP or whatever they like. I personally prefer a docker env and do not currently have Vagrant installed on my system. I understand that moving away from VVV for this might be vastly out of scope though, so having to install VVV to get it working would be acceptable for me. I am afraid I do not have enough insight into how the current setup works or why it would never work for me. Primarily I was lacking essential data, and I have a feeling that some of the systems likely rely on things in the closed-source part of the .org site, which I do not have access to. Happy to be a tester for any changes proposed that could ease the path to getting this running for others :)
@pattonwebz, I think @dd32 has tested out a Docker env for the theme directory specifically. That might help in the meantime, and could provide provision scripts, etc. that WME could adapt in the future, if we decide to keep it going.
Just noting that it is now possible to install Docker on Windows Home. |
Same here, I use a custom WAMP stack and would ideally like to use it with Meta Environment too, if possible :) Though I guess a Docker environment might also work for me. |
Goals
Summary
As opposed to using a database snapshot, write migrations to create new tables and insert test environment data.
Pros
- diffs
Cons
Option 1 (My Preference)
Plugin Driven
We can write a plugin to manage the database update. A plugin would be useful because it won't be coupled to provisioning, which we have identified we want to revisit. This will also keep the codebase in a structure & language the community understands and can contribute to.
The plugin should:
- `wpdb`
Potential Folder Structure for Repo
We could also move the migrations closer to each codebase by keeping them in the same wp-content folder as the site.
Option 2
Update VVV to run migrations
Modify the current `.sh` file to run the migrations as a new step in the process.
How can we get started?
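For Option 2, the new provisioning step could look roughly like this. It's a sketch under stated assumptions: migrations are numbered `.sql` files in a `migrations/` directory, applied with `mysql`, and tracked in a state file; none of these names come from the actual repo layout.

```shell
#!/bin/bash
# Apply any .sql migrations that have not run yet, in filename order,
# recording each applied file in a state file so reruns are no-ops.
# MIGRATE_CMD can be overridden (e.g. in tests); it defaults to mysql.
run_migrations() {
    local dir="$1" state="$2" db="$3"
    touch "$state"
    for file in "$dir"/*.sql; do
        [ -e "$file" ] || continue    # no migrations yet
        local name
        name=$(basename "$file")
        if ! grep -qxF "$name" "$state"; then
            "${MIGRATE_CMD:-mysql}" "$db" < "$file"  # apply the migration
            echo "$name" >> "$state"                 # mark as applied
        fi
    done
}
```

Because applied migrations are recorded, the provisioner can call this step on every run without re-inserting data, which matches how most migration tools behave.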
Regardless of the approach, I think we could get started by pulling out the sql `insert` statements from the current db snapshot into a sql file, which we would consider the first migration.
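That first step could be sketched like this, assuming the snapshot writes one complete `INSERT` statement per line (as mysqldump does by default); the function name and any file paths used with it are illustrative, not the repo's actual ones.

```shell
#!/bin/bash
# Pull the INSERT statements out of the existing db snapshot so they
# can become the body of the first migration file.
extract_inserts() {
    # $1 = current db snapshot, $2 = destination migration file
    grep -E '^INSERT INTO' "$1" > "$2"
}
```

Usage would then be something like `extract_inserts snapshot.sql 0001-initial-data.sql`; multi-line or extended INSERT syntax would need a smarter parser than this line-based grep.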