Missing page views due to page size limit defaulting to 1000 #337
Comments
@shanerutter-kempston Are you using the latest version of the plugin, v4.4.6?
I see, I've updated to that version now, but I'm still getting the same error after it's processed a couple of days of data. I found that if I wait an hour or so and then continue the import, it works fine for another couple of days, then fails again with the same issue.
@shanerutter-kempston Can you check the token grant rate graph? It will be in your …
I am assuming the rate limits are the cause of this issue.
@shanerutter-kempston Can you confirm whether your OAuth app is internal or external?
@AltamashShaikh It's an external app; I have just published it and left it running for a couple of hours. It pulled through more data, but eventually ended with the same error. Screenshots of the screens you requested:
@shanerutter-kempston I am still checking why we would get this error after running the import for a few hours. Do you have any idea how much data it imports before throwing the error? I am trying to reproduce the issue but haven't been able to. Could you run the import with verbose logging and share the log file here?
@AltamashShaikh Nothing that indicates a problem, other than the Google API responding with invalid credentials. I have worked around the issue by setting up a cronjob to run the CLI import command each hour; it's managed to import a year's worth of data so far.
@shanerutter-kempston Have you set …
Without the -vvv, but yes. It appears the Google API every now and again rejects the credentials, but if you set up the CLI to run every hour, it continues the import and Google accepts the same credentials again without issue. It's a strange one... Not the best way, but it's at least getting my data downloaded now.
@shanerutter-kempston Strange. If you have already set up an archiving cron, then there is already a task which runs every hour, and you don't need to do this separately. This is the guide to set up the auto-archiving cron, which will trigger this task.
Finding that Matomo is massively under-reporting compared to GA. I found that if I do an export of the page URLs from Matomo, there always seems to be a maximum of 1000 unique pages, but in Analytics we have 20k unique page URLs for that same day.
I have done some checking of the Analytics API and some quick testing, and it appears the Reporting API defaults to a page size of 1000 results. I made a quick modification in the following file:
Google\GoogleQueryObjectFactory.php
After line 58 I added $request->setPageSize(100000);
and did a quick import, and can now see it's pulling all unique URL page views through. However, it only gets a couple of days, maybe a month, of data before it crashes.
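For context, here is a minimal sketch of what that change amounts to, using the Reporting API v4 classes from the official google/apiclient PHP package. This is not the plugin's actual code: the view ID, dates, metric and dimension are placeholder assumptions, and only the setPageSize() call reflects the modification described above.

```php
<?php
// Minimal sketch (not GoogleQueryObjectFactory.php itself): building a
// Reporting API v4 report request with an explicit page size.
require 'vendor/autoload.php';

$dateRange = new Google_Service_AnalyticsReporting_DateRange();
$dateRange->setStartDate('2022-01-01');
$dateRange->setEndDate('2022-01-01');

$pageviews = new Google_Service_AnalyticsReporting_Metric();
$pageviews->setExpression('ga:pageviews');

$pagePath = new Google_Service_AnalyticsReporting_Dimension();
$pagePath->setName('ga:pagePath');

$request = new Google_Service_AnalyticsReporting_ReportRequest();
$request->setViewId('VIEW_ID');            // placeholder view ID
$request->setDateRanges([$dateRange]);
$request->setMetrics([$pageviews]);
$request->setDimensions([$pagePath]);

// Without this call the API falls back to its default of 1,000 rows per page,
// which matches the ~1,000 unique page URLs seen in the import.
// 100000 is the documented maximum page size for the Reporting API v4.
$request->setPageSize(100000);
```

Note that 100,000 is the maximum pageSize the Reporting API v4 accepts; result sets larger than that are still split across multiple pages via nextPageToken.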