Sitemaps that exist as a single file are usually the small ones that are easy to look over manually. The ones that use many tiered layers of indirect references are exactly the ones where a tool is most valuable. One example of a complex multi-file sitemap: https://www.apple.com/sitemap.xml
The sitemap implementation I found here appears to make some other overly simple assumptions. Roughly:
If it finds /sitemap.xml at the root, it doesn't look in robots.txt, whereas I believe both can be valid at the same time
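To illustrate what I mean: discovery could merge both sources rather than stopping at the first hit. A minimal sketch (the function names here are hypothetical, not from this tool's codebase):

```python
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import URLError


def sitemap_urls_from_robots(robots_txt):
    """Extract every 'Sitemap:' directive from robots.txt content.
    robots.txt may list any number of these, and they can coexist
    with a conventional /sitemap.xml at the site root."""
    urls = []
    for line in robots_txt.splitlines():
        if line.lower().startswith("sitemap:"):
            urls.append(line.split(":", 1)[1].strip())
    return urls


def discover_sitemaps(base_url):
    """Collect candidate sitemap URLs from BOTH robots.txt and the
    conventional root location, since both can be valid at once."""
    candidates = set()
    try:
        with urlopen(urljoin(base_url, "/robots.txt")) as resp:
            text = resp.read().decode("utf-8", "replace")
            candidates.update(sitemap_urls_from_robots(text))
    except URLError:
        pass  # no robots.txt is fine; fall through to the root guess
    candidates.add(urljoin(base_url, "/sitemap.xml"))
    return candidates
```

The point is just that finding one source shouldn't short-circuit checking the other.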
This is a nice tool; I'll certainly be using it a lot more moving forward.
However, when testing a website that has a sitemap index file, I noticed that it doesn't recursively parse the sitemaps within:
No biggie, but it would be good to see the full result set if possible.
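For reference, the recursion is fairly small once you branch on the root element. A sketch of the idea, assuming a caller-supplied fetcher (this is not the tool's actual code):

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol for both indexes and urlsets
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def collect_page_urls(xml_text, fetch):
    """Recursively expand a sitemap or sitemap index into page URLs.

    `fetch` is a callable (url -> XML text) supplied by the caller, so
    the traversal itself needs no network access. Note: no depth limit
    or cycle detection here; a real crawler would want both.
    """
    root = ET.fromstring(xml_text)
    urls = []
    if root.tag == f"{NS}sitemapindex":
        # An index nests other sitemaps, which may themselves be indexes
        for loc in root.iter(f"{NS}loc"):
            urls.extend(collect_page_urls(fetch(loc.text.strip()), fetch))
    else:
        # A plain <urlset>: its <loc> entries are the page URLs
        urls.extend(loc.text.strip() for loc in root.iter(f"{NS}loc"))
    return urls
```

That would let the tool report the full result set even for multi-tier indexes like the Apple example above.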