
Googlebot seems to be crawling inside my jQuery file, picking up a link ending in /a that doesn't exist on my site, and then reporting it as a 404 error.

http://www.mySite.com/a

The site validates green at the W3C.

The "/a" is coming from inside jQuery itself. Edit: The following is a line of code within jQuery v1.5 and 1.5.2 (the only two I looked inside)

<a href='/a' style='color:red;float:left;opacity:.55;'>a</a>

For now, I'm redirecting it in .htaccess before it gets out of hand...

Redirect 301   /a   http://www.mysite.com
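
Note that Apache's Redirect directive matches by URL-path prefix, so the rule above would also catch paths like /about. A minimal sketch of a tighter rule, assuming Apache with mod_alias:

# Anchor the match so only /a itself is redirected (assumes Apache mod_alias)
RedirectMatch 301   ^/a$   http://www.mysite.com/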

Does anyone know why/how the Googlebot would go inside jQuery?


EDIT:

I've since blocked the jQuery file with robots.txt, but I really wasn't expecting Googlebot to crawl into external JavaScript files.
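
For reference, the block looks roughly like this (the file path here is hypothetical; adjust it to wherever the library is actually served from):

# Hypothetical path -- adjust to the actual location of the jQuery file
User-agent: *
Disallow: /js/jquery-1.5.2.min.js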


EDIT 2:

The following is a response from Google employee JohnMu on this issue in the thread I started at Google Groups. Looks like I'm going to do the 301 after all.

JohnMu (Google Employee) wrote:

Hi guys

Just a short note on this -- yes, we are picking up the "/a" link for many sites from jQuery JavaScript. However, that generally isn't a problem: if we see "/a" as being a 404, then that's fine for us. As with other 404-URLs, we'll list it as a crawl error in Webmaster Tools, but again, that's not going to be a problem for crawling, indexing, or ranking. If you want to make sure that it doesn't trigger a crawl error in Webmaster Tools, then I would recommend just 301 redirecting that URL to your homepage (disallowing the URL will also bring it up as a crawl error - it will be listed as a URL disallowed by robots.txt).

I would also recommend not explicitly disallowing crawling of the jQuery file. While we generally wouldn't index it on its own, we may need to access it to generate good Instant Previews for your site.

So to sum it up: If you're seeing "/a" in the crawl errors in Webmaster Tools, you can just leave it like that, it won't cause any problems. If you want to have it removed there, you can do a 301 redirect to your homepage.

Cheers

John


1 Answer


It looks like jQuery uses that markup as a test template to determine browser support for various features. I'm not sure why this would ever be seen by Googlebot, though. I wasn't aware that web crawlers typically ran any JavaScript; that would mean they are actually functioning as a web browser (which one, I wonder?). Seems unlikely.

(Edit: see how do web crawlers handle javascript; it indicates that Google may try to pull some URLs out of scripts. I'm surprised it isn't programmed to recognize something that's part of jQuery. Do you use a nonstandard name for the include?)
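
For context, the markup comes from jQuery's load-time feature detection. Roughly paraphrased (a sketch of the v1.5 support checks, not the exact library source), it builds a detached element, sets that string as its innerHTML, and reads properties back to detect browser quirks; the link never appears in the page:

  // Paraphrased sketch of jQuery 1.5's support checks, not the library source.
  // The div is never appended to the document, so the '/a' link never renders.
  var div = document.createElement("div");
  div.innerHTML = "   <link/><table></table><a href='/a' style='color:red;float:left;opacity:.55;'>a</a><input type='checkbox'/>";
  var a = div.getElementsByTagName("a")[0];
  var support = {
      // IE strips leading whitespace from innerHTML
      leadingWhitespace: div.firstChild.nodeType === 3,
      // does getAttribute return the literal '/a' or an absolute URL?
      hrefNormalized: a.getAttribute("href") === "/a",
      // can opacity be read back from the inline style?
      opacity: /^0.55$/.test(a.style.opacity)
  };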

Alternatively, is there any chance that the header for your jQuery include is not correct? Maybe it's being served with an HTML MIME type, which most browsers probably wouldn't care about since the type is also set by the script include, but a bot might decide to parse it.
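
You can check what the server actually sends with curl -I against the script URL; if the Content-Type comes back as text/html, a one-line .htaccess fix (assuming Apache with mod_mime) would be:

# Ensure .js files are served with a JavaScript MIME type (Apache mod_mime)
AddType application/javascript .js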

In any event, rather than setting a redirect, why don't you just use robots.txt? Add this line:

Disallow: /a
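
Keep in mind that a Disallow rule needs to sit inside a User-agent group, and that plain patterns match by prefix, so /a would also block /about and anything else starting with /a. Googlebot supports the $ end-of-URL anchor, so a tighter sketch would be:

User-agent: *
Disallow: /a$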

You could also try fixing jQuery itself. Obfuscating the link a little would probably do the trick, e.g. change the offending line to something like:

  div.innerHTML = "   <link/><table></table><"+"a hr"+"ef='/a'"
  +" style='color:red;float:left;opacity:.55;'>a</a><input type='checkbox'/>";

If Google is smart enough to actually parse string concatenations, which would shock me, you could go one step further and assign something like "href" to a variable and then concatenate with that. I can't believe their JS scanner would go that far; that would basically amount to running it.
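
A rough sketch of that idea, as a hypothetical patch to a local copy of jQuery (not the stock source):

  // Hypothetical patch: keep the attribute name out of any single string
  // literal so a scanner that only joins adjacent literals never sees href='/a'.
  var attr = "hr" + "ef";
  div.innerHTML = "   <link/><table></table><a " + attr + "='/a'" +
      " style='color:red;float:left;opacity:.55;'>a</a><input type='checkbox'/>";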

  • Yeah... adding that to the robots.txt file seems like a better idea than a 301. I'm not alone with this issue: there are a few others on Stack Overflow reporting this "/a" problem, as well as a few more in the Google forum complaining about Googlebot getting inside the JavaScript. – Sparky Apr 21 '11 at 21:08
  • No, I did not rename the jQuery file. I host it myself and do not use the CDN. – Sparky Apr 21 '11 at 21:13
  • Updated the post with an idea... I don't have time to play with this right now, but it seems like it could be easily fixed. There's an issue report here: http://bugs.jquery.com/ticket/7659 but nobody seems too interested in dealing with it... – Jamie Treworgy Apr 21 '11 at 21:14
  • Everything passes W3C validation and this is my include... – Sparky Apr 21 '11 at 21:15
  • Yeah, I'm sure it's not you at this point; I was just asking to be sure. – Jamie Treworgy Apr 21 '11 at 21:15