XML Sitemap parser

An easy-to-use PHP library to parse XML Sitemaps compliant with the Sitemaps.org protocol.

The Sitemaps.org protocol is the leading standard and is supported by Google, Bing, Yahoo, Ask and many others.

Features

Formats supported

  • XML .xml
  • Compressed XML .xml.gz
  • Robots.txt rule sheet robots.txt
  • Line-separated text .txt (disabled by default)

Installation

The library is available via Composer. Add this to your composer.json file:

{
    "require": {
        "vipnytt/sitemapparser": "^1.0"
    }
}

Then run composer update.
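
Alternatively, require the package from the command line:

composer require vipnytt/sitemapparser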

Getting Started

Basic example

Returns a list of URLs only.

use vipnytt\SitemapParser;
use vipnytt\SitemapParser\Exceptions\SitemapParserException;

try {
    $parser = new SitemapParser();
    $parser->parse('http://php.net/sitemap.xml');
    foreach ($parser->getURLs() as $url => $tags) {
        echo $url . '<br>';
    }
} catch (SitemapParserException $e) {
    echo $e->getMessage();
}

Advanced

Returns all available tags for both sitemaps and URLs.

use vipnytt\SitemapParser;
use vipnytt\SitemapParser\Exceptions\SitemapParserException;

try {
    $parser = new SitemapParser('MyCustomUserAgent');
    $parser->parse('http://php.net/sitemap.xml');
    foreach ($parser->getSitemaps() as $url => $tags) {
        echo 'Sitemap<br>';
        echo 'URL: ' . $url . '<br>';
        echo 'LastMod: ' . $tags['lastmod'] . '<br>';
        echo '<hr>';
    }
    foreach ($parser->getURLs() as $url => $tags) {
        echo 'URL: ' . $url . '<br>';
        echo 'LastMod: ' . $tags['lastmod'] . '<br>';
        echo 'ChangeFreq: ' . $tags['changefreq'] . '<br>';
        echo 'Priority: ' . $tags['priority'] . '<br>';
        echo '<hr>';
    }
} catch (SitemapParserException $e) {
    echo $e->getMessage();
}

Recursive

Parses any sitemap detected while parsing, to get a complete list of URLs.

Use the url_black_list configuration option to skip sitemaps that are part of a parent sitemap (exact match only); a sketch follows the example below.

use vipnytt\SitemapParser;
use vipnytt\SitemapParser\Exceptions\SitemapParserException;

try {
    $parser = new SitemapParser('MyCustomUserAgent');
    $parser->parseRecursive('http://www.google.com/robots.txt');
    echo '<h2>Sitemaps</h2>';
    foreach ($parser->getSitemaps() as $url => $tags) {
        echo 'URL: ' . $url . '<br>';
        echo 'LastMod: ' . $tags['lastmod'] . '<br>';
        echo '<hr>';
    }
    echo '<h2>URLs</h2>';
    foreach ($parser->getURLs() as $url => $tags) {
        echo 'URL: ' . $url . '<br>';
        echo 'LastMod: ' . $tags['lastmod'] . '<br>';
        echo 'ChangeFreq: ' . $tags['changefreq'] . '<br>';
        echo 'Priority: ' . $tags['priority'] . '<br>';
        echo '<hr>';
    }
} catch (SitemapParserException $e) {
    echo $e->getMessage();
}
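
To skip a specific child sitemap during recursive parsing, blacklist its exact URL. A minimal sketch (the blacklisted URL is a hypothetical example):

use vipnytt\SitemapParser;
use vipnytt\SitemapParser\Exceptions\SitemapParserException;

try {
    $parser = new SitemapParser('MyCustomUserAgent', [
        'url_black_list' => [
            // Hypothetical child sitemap; skipped only on an exact URL match
            'http://www.google.com/edu/sitemap.xml',
        ],
    ]);
    $parser->parseRecursive('http://www.google.com/robots.txt');
    foreach ($parser->getURLs() as $url => $tags) {
        echo $url . '<br>';
    }
} catch (SitemapParserException $e) {
    echo $e->getMessage();
}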

Parsing line-separated text strings

Note: This is disabled by default to avoid false positives when XML is expected but plain text is fetched instead.

To disable strict standards mode, pass ['strict' => false] as the second constructor parameter.

use vipnytt\SitemapParser;
use vipnytt\SitemapParser\Exceptions\SitemapParserException;

try {
    $parser = new SitemapParser('MyCustomUserAgent', ['strict' => false]);
    $parser->parse('https://www.xml-sitemaps.com/urllist.txt');
    foreach ($parser->getSitemaps() as $url => $tags) {
        echo $url . '<br>';
    }
    foreach ($parser->getURLs() as $url => $tags) {
        echo $url . '<br>';
    }
} catch (SitemapParserException $e) {
    echo $e->getMessage();
}

Throttling

  1. Install the middleware:
composer require hamburgscleanest/guzzle-advanced-throttle
  2. Define host rules:
$rules = new RequestLimitRuleset([
    'https://www.google.com' => [
        [
            'max_requests'     => 20,
            'request_interval' => 1
        ],
        [
            'max_requests'     => 100,
            'request_interval' => 120
        ]
    ]
]);
  3. Create the handler stack:
$stack = new HandlerStack();
$stack->setHandler(new CurlHandler());
  4. Create the middleware:
$throttle = new ThrottleMiddleware($rules);

// Invoke the middleware
$stack->push($throttle());

// OR: alternatively call the handle method directly
$stack->push($throttle->handle());
  5. Create the client manually:
$client = new \GuzzleHttp\Client(['handler' => $stack]);
  6. Pass the client as an argument or use the setClient method:
$parser = new SitemapParser();
$parser->setClient($client);
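
Put together, the steps look like this. A minimal sketch; the middleware namespaces in the use statements are assumptions, so check the package's own README for the authoritative ones:

use GuzzleHttp\Client;
use GuzzleHttp\Handler\CurlHandler;
use GuzzleHttp\HandlerStack;
use hamburgscleanest\GuzzleAdvancedThrottle\RequestLimitRuleset; // assumed namespace
use hamburgscleanest\GuzzleAdvancedThrottle\Middleware\ThrottleMiddleware; // assumed namespace
use vipnytt\SitemapParser;

// At most 20 requests per second, and 100 requests per 2 minutes, against google.com
$rules = new RequestLimitRuleset([
    'https://www.google.com' => [
        ['max_requests' => 20, 'request_interval' => 1],
        ['max_requests' => 100, 'request_interval' => 120]
    ]
]);

$stack = new HandlerStack();
$stack->setHandler(new CurlHandler());
$stack->push((new ThrottleMiddleware($rules))->handle());

$parser = new SitemapParser();
$parser->setClient(new Client(['handler' => $stack]));
$parser->parseRecursive('http://www.google.com/robots.txt');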

More details about this middleware are available here

Automatic retry

  1. Install the middleware:
composer require caseyamcl/guzzle_retry_middleware
  2. Create the handler stack:
$stack = new HandlerStack();
$stack->setHandler(new CurlHandler());
  3. Add the middleware to the stack:
$stack->push(GuzzleRetryMiddleware::factory());
  4. Create the client manually:
$client = new \GuzzleHttp\Client(['handler' => $stack]);
  5. Pass the client as an argument or use the setClient method:
$parser = new SitemapParser();
$parser->setClient($client);
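
Assembled, the retry setup is only a few lines. A minimal sketch, assuming the middleware's GuzzleRetry namespace (verify against the package's README):

use GuzzleHttp\Client;
use GuzzleHttp\Handler\CurlHandler;
use GuzzleHttp\HandlerStack;
use GuzzleRetry\GuzzleRetryMiddleware; // assumed namespace
use vipnytt\SitemapParser;

$stack = new HandlerStack();
$stack->setHandler(new CurlHandler());
// Automatically retries failed requests (e.g. HTTP 429/503) with a back-off delay
$stack->push(GuzzleRetryMiddleware::factory());

$parser = new SitemapParser();
$parser->setClient(new Client(['handler' => $stack]));
$parser->parse('http://php.net/sitemap.xml');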

More details about this middleware are available here

Advanced logging

  1. Install the middleware:
composer require gmponos/guzzle_logger
  2. Create a PSR-3 style logger:
$logger = new Logger();
  3. Create the handler stack:
$stack = new HandlerStack();
$stack->setHandler(new CurlHandler());
  4. Push the logger middleware to the stack:
$stack->push(new LogMiddleware($logger));
  5. Create the client manually:
$client = new \GuzzleHttp\Client(['handler' => $stack]);
  6. Pass the client as an argument or use the setClient method:
$parser = new SitemapParser();
$parser->setClient($client);
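
A combined sketch, using Monolog as the PSR-3 logger and an assumed LogMiddleware namespace (both are illustrative choices; check the package's README):

use GuzzleHttp\Client;
use GuzzleHttp\Handler\CurlHandler;
use GuzzleHttp\HandlerStack;
use Gmponos\GuzzleLogger\Middleware\LogMiddleware; // assumed namespace
use Monolog\Handler\StreamHandler;
use Monolog\Logger;
use vipnytt\SitemapParser;

// Any PSR-3 logger works; Monolog writing to a file is used here for illustration
$logger = new Logger('sitemap');
$logger->pushHandler(new StreamHandler('guzzle.log'));

$stack = new HandlerStack();
$stack->setHandler(new CurlHandler());
$stack->push(new LogMiddleware($logger));

$parser = new SitemapParser();
$parser->setClient(new Client(['handler' => $stack]));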

More details about this middleware's configuration (log levels, when to log, and what to log) are available here

Additional examples

Even more examples are available in the examples directory.

Configuration

Available configuration options, with their default values:

$config = [
    'strict' => true, // (bool) Disallow parsing of line-separated plain text
    'guzzle' => [
        // GuzzleHttp request options
        // http://docs.guzzlephp.org/en/latest/request-options.html
    ],
    // URLs to skip when parsing sitemaps that contain multiple other sitemaps (exact match only)
    'url_black_list' => []
];
$parser = new SitemapParser('MyCustomUserAgent', $config);

If a User-agent is also set via the GuzzleHttp request options, it takes priority and replaces the User-agent set in the constructor.