
unknown DB column error received #2
Open

leonstafford opened this issue Nov 3, 2020 · 2 comments

@leonstafford (Contributor)
State: I had deactivated/deleted WP2Static core a few times after activating the crawl addon. I didn't do an export initially, so I can't confirm whether this error was present then, too.

To investigate

2020/11/03 11:39:13 [error] 79#79: *115 FastCGI sent in stderr: "PHP message: WordPress database error Unknown column 'crawled_time' in 'where clause' for query SELECT id, url FROM wp_wp2static_urls
             WHERE crawled_time IS NULL OR crawled_time <= '2020-11-03 11:39:13'
             LIMIT 20 made by do_action('wp_ajax_wp2static_run'), WP_Hook->do_action, WP_Hook->apply_filters, WP2Static\Controller::wp2staticRun, WP2Static\Controller::wp2staticHeadless, WP2Static\Controller::wp2staticCrawl, do_action('wp2static_crawl'), WP_Hook->do_action, WP_Hook->apply_filters, WP2StaticAdvancedCrawling\Crawler::wp2staticCrawl, WP2StaticAdvancedCrawling\Crawler->crawlSite, WP2StaticAdvancedCrawling\CrawlQueue::getChunkPHP message: PHP Fatal error:  Uncaught UnexpectedValueException: RecursiveDirectoryIterator::__construct(/usr/html/wp-content/uploads/wp2static-processed-site): failed to open dir: No such file or directory in /usr/html/wp-content/plugins/wp2static-addon-zip/src/ZipArchiver.php:30
Stack trace:
#0 /usr/html/wp-content/plugins/wp2static-addon-zip/src/ZipArchiver.php(30): RecursiveDirectoryIterator->__construct('/usr/html/wp-co...', 4096)
#1 /usr/html/wp-content/plugins/wp2static-addon-zip/src/Controller.php(87): WP2StaticZip\ZipArchiver->generateArchive('/usr/html/wp-co...')
#2 /usr/html/wp-includes/class-wp-hook.php(287): WP2StaticZip\Controller->generateZip('/usr/html/wp-co...', 'wp2static-addon...')
#3 /usr/html/wp-includes/class-wp-hook.php(311): WP_Hook->apply_filters(NULL, Array)
#4 /usr/html/wp-includes/plugin.php(478): WP_Hook->do_action(Array)
#5 /usr/html/wp-content/plugins/wp2static/src/Controller.php(635): do_action('wp2static_deplo...', '/usr/html/wp-co...', 'wp2static-addon...')
#6 /usr/html/" while reading response header from upstream, client: 172.17.0.1, server: , request: "POST /wp-admin/admin-ajax.php HTTP/1.1", upstream: "fastcgi://unix:/var/run/php7-fpm.sock:", host: "localhost:4588", referrer: "http://localhost:4588/wp-admin/admin.php?page=wp2static"
@leonstafford (Contributor, Author)

Confirmed again via WP-CLI:

/usr/html # wp wp2static crawl
[2020-11-28T01:16:52+00:00] Starting crawling
[2020-11-28T01:16:52+00:00] Starting to crawl detected URLs.
[2020-11-28T01:16:52+00:00] Adding discovered URLs.
[2020-11-28T01:16:52+00:00] Crawling with a chunk size of 20
[2020-11-28T01:16:52+00:00] Crawling sitemaps.
[2020-11-28T01:16:52+00:00] Using CrawlCache.
[2020-11-28T01:16:52+00:00] Crawling all URLs.
WordPress database error Unknown column 'crawled_time' in 'where clause' for query SELECT id, url FROM wp_wp2static_urls
WHERE crawled_time IS NULL OR crawled_time <= '2020-11-28 01:16:52'
LIMIT 20 made by include('phar:///usr/bin/wp/php/boot-phar.php'), include('phar:///usr/bin/wp/vendor/wp-cli/wp-cli/php/wp-cli.php'), WP_CLI\bootstrap, WP_CLI\Bootstrap\LaunchRunner->process, WP_CLI\Runner->start, WP_CLI\Runner->run_command_and_exit, WP_CLI\Runner->run_command, WP_CLI\Dispatcher\Subcommand->invoke, call_user_func, WP_CLI\Dispatcher\CommandFactory::WP_CLI\Dispatcher{closure}, call_user_func, WP2Static\CLI->crawl, WP2Static\Controller::wp2staticCrawl, do_action('wp2static_crawl'), WP_Hook->do_action, WP_Hook->apply_filters, WP2StaticAdvancedCrawling\Crawler::wp2staticCrawl, WP2StaticAdvancedCrawling\Crawler->crawlSite, WP2StaticAdvancedCrawling\CrawlQueue::getChunk
[2020-11-28T01:16:52+00:00] Crawling complete. 0 crawled, 0 skipped (cached).
[2020-11-28T01:16:52+00:00] Crawling completed

@leonstafford (Contributor, Author)

Deactivating and reactivating the plugin from this state results in:

2020/11/28 01:22:06 [error] 73#73: *292 FastCGI sent in stderr: "PHP message: WordPress database error Duplicate entry 'wp2static-addon-advanced-crawling' for key 'PRIMARY' for query INSERT INTO wp_wp2static_addons (slug,type,name,docs_url,description) VALUES ('wp2static-addon-advanced-crawling','crawl','Advanced Crawling','https://github.com/WP2Static/wp2static-addon-advanced-crawling','Provides advanced crawling options') made by activate_plugin, include_once('/plugins/wp2static-addon-advanced-crawler/wp2static-addon-advanced-crawling.php'), run_wp2static_addon_advanced_crawling, WP2StaticAdvancedCrawling\Controller->run, do_action('wp2static_register_addon'), WP_Hook->do_action, WP_Hook->apply_filters, WP2Static\Addons::registerAddon" while reading response header from upstream, client: 172.17.0.1, server: , request: "GET /wp-admin/plugins.php?action=activate&plugin=wp2static-addon-advanced-crawler%2Fwp2static-addon-advanced-crawling.php&plugin_status=all&paged=1&s&_wpnonce=5bc262ffdd HTTP/1.1", upstream: "fastcgi://unix:/var/run/php7-fpm.sock:", host: "localhost:4844", referrer: "http://localhost:4844/wp-admin/plugins.php?plugin_status=all&paged=1&s"

Looks like we need more robust checking/migrating of DB state. My preference would be to look for all the fields/seed data we require and, if anything doesn't exist, blow it all away and recreate as if it were a new installation. The worst case then should be that we lose the caching benefits for one run immediately after a breaking schema change in an addon like this.
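
A minimal sketch of that approach (not the plugin's actual code): a hypothetical helper that checks for the columns the addon's queries need and, if anything is missing, drops and recreates the table as on a fresh install. The column list and CREATE TABLE body are illustrative only; the real wp_wp2static_urls schema has more fields.

// Sketch only: verify required columns exist, otherwise recreate the table.
// Helper name is hypothetical; schema below is illustrative, not the real one.
function wp2static_advanced_crawling_ensure_urls_schema() : void {
    global $wpdb;

    $table_name = $wpdb->prefix . 'wp2static_urls';

    // Columns the failing query relies on; crawled_time is the one missing here.
    $required_columns = [ 'id', 'url', 'crawled_time' ];

    // First column of DESC output is the field name; empty if the table is absent.
    $existing_columns = (array) $wpdb->get_col( "DESC {$table_name}", 0 );

    if ( array_diff( $required_columns, $existing_columns ) ) {
        // Blow it away and recreate; worst case we lose the crawl cache for one run.
        $wpdb->query( "DROP TABLE IF EXISTS {$table_name}" );

        $charset_collate = $wpdb->get_charset_collate();

        require_once ABSPATH . 'wp-admin/includes/upgrade.php';

        dbDelta(
            "CREATE TABLE {$table_name} (
                id MEDIUMINT(9) NOT NULL AUTO_INCREMENT,
                url VARCHAR(2083) NOT NULL,
                crawled_time DATETIME DEFAULT NULL,
                PRIMARY KEY  (id)
            ) {$charset_collate};"
        );
    }
}

Running something like this from the addon's activation hook, and again before a crawl starts, would cover both this missing-column state and future breaking schema changes.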

@john-shaffer what little MySQL knowledge I may have had seems to have disappeared - any thoughts on fixing this? Was there already a similar fix in WP2Static core that I should be able to copy and paste into this?
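
For the duplicate-entry error on reactivation, one option (a sketch, assuming the registration currently does a plain INSERT into wp_wp2static_addons) is to make the insert idempotent with ON DUPLICATE KEY UPDATE, since the log shows slug is the primary key. The values are copied from the log; the surrounding code is hypothetical.

// Sketch of an idempotent addon registration; re-activation then just updates
// the existing row instead of raising the duplicate-entry error above.
global $wpdb;

$table_name = $wpdb->prefix . 'wp2static_addons';

$wpdb->query(
    $wpdb->prepare(
        "INSERT INTO {$table_name} (slug, type, name, docs_url, description)
         VALUES (%s, %s, %s, %s, %s)
         ON DUPLICATE KEY UPDATE
             type = VALUES(type),
             name = VALUES(name),
             docs_url = VALUES(docs_url),
             description = VALUES(description)",
        'wp2static-addon-advanced-crawling',
        'crawl',
        'Advanced Crawling',
        'https://github.com/WP2Static/wp2static-addon-advanced-crawling',
        'Provides advanced crawling options'
    )
);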
