Hi, yesterday I noticed that the OG crawler always gets a 503 code (checked with the Sharing Debugger), so links aren't displayed correctly. I changed robots.txt and explicitly allowed "FacebookBot", but no change. The site is reachable with the default user agent. If I change the user agent to "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)" in my web browser, the site isn't reachable. Does anyone know how to fix this?
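In case it helps anyone reproduce this outside the browser: a minimal sketch in Python using the requests library (the URL is a placeholder for the affected page; the user-agent string is the one the OG crawler sends):

```python
import requests

URL = "https://example.com/"  # placeholder: the page you test in the Sharing Debugger

# The user-agent string Facebook's OG crawler (facebookexternalhit) sends
FB_UA = "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"

# Fetch the page once with the default user agent and once as the crawler
for name, headers in [("default UA", {}), ("facebookexternalhit", {"User-Agent": FB_UA})]:
    resp = requests.get(URL, headers=headers, timeout=10)
    print(f"{name}: HTTP {resp.status_code}")
```

If the second request returns 503 while the first returns 200, the block is clearly keyed to the user agent.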
After checking the log files several times, I made the following observation: since 10.07.24 the Facebook crawler in my case sends its requests over the HTTP/2.0 protocol, and no longer over HTTP/1.1 as before. Apparently the web server of my provider does not handle these HTTP/2.0 requests, which leads to the 503 messages.
No chance to find the cause of the error so far:
- no special entries in the WP error log
- no special entries in the access log
- provider has HTTP/2.0 enabled
- provider is not much help
- all WP plugins deactivated, no change
I think it has to do with the user agent and HTTP/2.0, but I couldn't find the cause. If anyone has an idea, help is highly appreciated.
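One way to narrow that down: a small sketch with Python's httpx library (assuming the optional HTTP/2 extra is installed, pip install "httpx[http2]"; the URL is again a placeholder) that sends the same request with the Facebook user agent once over HTTP/1.1 and once over HTTP/2:

```python
import httpx

URL = "https://example.com/"  # placeholder: the affected page
FB_UA = "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"

# Send the identical request over both protocol versions. http2=True only
# enables HTTP/2 negotiation (via ALPN); if the server doesn't offer it,
# httpx silently falls back to HTTP/1.1, which resp.http_version reveals.
for http2 in (False, True):
    with httpx.Client(http2=http2, headers={"User-Agent": FB_UA}, timeout=10) as client:
        resp = client.get(URL)
        print(f"{resp.http_version}: HTTP {resp.status_code}")
```

If only the HTTP/2 request is answered with 503, the problem sits in the provider's HTTP/2 front end rather than in WordPress itself.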
If you don't even have access to your server logs, you cannot provide the respective log details, and you probably want to change your hosting provider. Some providers (or some WP plugins) block recurring requests because they may identify them as "spam"; however, without access to the logs, investigating this is a waste of time.
Lars, you are right, I'm giving up. But surprise: since this morning it works again. I don't know why, but I'm happy. ;-) Thanks for your replies.
Check your server logs
Hello, that's easy to say. As this is a hosting package, access to the full server logs is not possible. The log excerpts available to me only show that GET requests with the Facebook user agent are immediately answered with a 503. I have contacted my provider's support, who unfortunately show little interest in finding a solution. The cause will probably remain a mystery.