
I am currently optimizing my site for search engines. It is mainly a database-driven site. I am using C# on the back end, but the database content is loaded via jQuery AJAX and a web service, so it is not in the HTML at the point when bots crawl it. My site works something like an online supermarket: there are thousands of items in my database, users can load one or more of them onto the page at a time, and the page does not change significantly once items are loaded.
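To illustrate, this is roughly how an item is loaded at the moment (the service path and field names here are made up):

    // jQuery call to the web service that returns an item's details
    $.ajax({
        type: 'POST',
        url: 'ProductService.asmx/GetProduct',
        contentType: 'application/json; charset=utf-8',
        data: JSON.stringify({ id: 101 }),
        success: function (result) {
            // inject the returned markup into the page
            $('#detail').html(result.d);
        }
    });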

My question is, how (if at all) can I get my database contents indexed? I was thinking of having an anchor that links to an aspx page (e.g. mydatabase.aspx) which loads all of my database items as one big HTML list. Then, using jQuery, I would make the anchor invisible to users. The data would still be accessible to users, but not via this link; it would be accessed through the jQuery interface I have created.
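Roughly what I mean (the id is made up):

    <!-- link left in the markup for bots, pointing at the full listing page -->
    <a id="databaseLink" href="mydatabase.aspx">All items</a>

    <script>
        // hide the link from users once the page has loaded
        $(function () {
            $('#databaseLink').hide();
        });
    </script>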

The thing is, I don't really want users to see this big, messy list - would Google show this page (e.g. www.mysite.com/mydatabase.aspx) as a search result? Also, would Google see this as a "keyword-rich" spam page? I have done quite a lot of research but found nothing on this, only instructions for PHP. Please help - I'm not sure what to do and need to know the best way to go about this.

  • I definitely wouldn't do a database dump to some page that bots will hit. Here is a useful resource for making AJAX apps crawlable: https://developers.google.com/webmasters/ajax-crawling/ – lbstr Dec 13 '12 at 20:11
  • Hi, thank you for your quick reply. I saw this, but I thought (a) it might apply to old-school AJAX, and (b) my site requires that users search for the items from the database before they appear on the page. So surely a bot couldn't do this and access all 7,000+ items in my database without some kind of user input? – user1192900 Dec 13 '12 at 20:42
  • Sorry if I sound really dense - I am a newbie. – user1192900 Dec 13 '12 at 20:42
  • How about this? I've never tried it, but it might do what you're looking for (at least for Google): https://developers.google.com/search-appliance/documentation/68/admin_crawl/database_crawl_serve – lbstr Dec 13 '12 at 21:08
  • @lbstr That information only works for the Google Search Appliance (hardware) - not Google web search. – Mike Hudson Dec 13 '12 at 22:30

1 Answer


It's a shame you haven't taken the progressive enhancement approach, as it would have meant starting with standard HTML output that's crawlable and then adding the behavioural layer (AJAX) on top for the user experience.
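As a rough sketch (the element names are hypothetical): the server renders plain product links, and jQuery then intercepts clicks to load details via AJAX, so crawlers and users without JavaScript can still follow the ordinary hrefs:

    <ul id="products">
        <li><a href="/product.aspx?id=101">Baked beans</a></li>
        <li><a href="/product.aspx?id=102">Basmati rice</a></li>
    </ul>

    <script>
        // enhancement layer: hijack clicks and load the detail panel via AJAX;
        // bots and users without JavaScript simply follow the plain links
        $('#products').on('click', 'a', function (e) {
            e.preventDefault();
            $('#detail').load(this.href + ' #detail');
        });
    </script>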

Providing a single file (e.g. mydatabase.aspx) that lists all of your products provides no real value, for the reason you gave - it would just be a big, useless list, with no editorial content or relevance for each link.

You're much better off taking another look at your information architecture and trying to ensure that each product is accessible by its own unique URL, then classifying the products into groups (result pages), being careful to think about pagination.
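A minimal sketch of what that could look like in ASP.NET 4 Web Forms (the route names, URL patterns and pages are all hypothetical):

    // Global.asax.cs - register one crawlable URL pattern per page type
    using System;
    using System.Web.Routing;

    public class Global : System.Web.HttpApplication
    {
        void Application_Start(object sender, EventArgs e)
        {
            // e.g. /category/tinned-goods - a "canned search" result page
            RouteTable.Routes.MapPageRoute(
                "category", "category/{name}", "~/Category.aspx");

            // e.g. /product/baked-beans - one unique URL per product
            RouteTable.Routes.MapPageRoute(
                "product", "product/{slug}", "~/Product.aspx");
        }
    }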

You can still make this act like a single-page application using AJAX, but you'd want to look into the HTML5 History API to achieve this in a search-engine-friendly way.
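For example (the function and selector names are made up), you could push a real URL whenever an item is loaded via AJAX and restore it on back/forward navigation, so every state the user reaches also exists at a crawlable address:

    // record the product's real URL when it is loaded via AJAX
    function showProduct(url) {
        $('#detail').load(url + ' #detail');
        history.pushState({ url: url }, '', url);
    }

    // restore the right content when the user navigates back/forward
    window.onpopstate = function (e) {
        if (e.state && e.state.url) {
            $('#detail').load(e.state.url + ' #detail');
        }
    };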

Mike Hudson
  • I am just not sure how to make each product accessible by a unique URL. I know I can use query strings, but even the category links are created dynamically in response to a search. Should I generate search results with the server, then make them link to pages using query strings? Would a bot be able to access all of this when no links are present in the original HTML? Or should I create a link that essentially functions as a "view all database categories" page, which leads to server-generated, branching web pages for all of my items? Isn't this just a roundabout way of making a list? Thank you – user1192900 Dec 14 '12 at 00:01
  • This is the major dilemma with search-based database websites. Search engines can't conduct a search - so you need to replicate this in a static way as best as possible. Creating a classification system (categories) and providing a canned search of each category is the first step - then having a link to every product in each category is the next (see the sketch below these comments). – Mike Hudson Dec 14 '12 at 05:22
  • OK, so just to clarify: my next step should be to have a static link on one of my pages that leads to a page of category links, each of which leads to a page of sub-categories and then, in turn, to pages with products - all of these obviously not individual HTML pages but generated using query strings? One more question: I heard that Google likes "clean" URLs, i.e. those without question marks. Would the use of query strings impede my searchability anyway? Thanks :) – user1192900 Dec 14 '12 at 10:21
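A rough sketch of the "canned search" category page described above, assuming the routes from the earlier example and a hypothetical GetProductsInCategory helper that would reuse the logic behind the existing web service:

    // Category.aspx.cs - server-renders a link to every product in a
    // category, so a crawler can follow them without running JavaScript
    using System;
    using System.Collections.Generic;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public partial class Category : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // the {name} segment from the /category/{name} route
            string category = (string)Page.RouteData.Values["name"];

            foreach (var product in GetProductsInCategory(category))
            {
                // plain, server-rendered anchors - crawlable with no JavaScript
                productLinks.Controls.Add(new HyperLink
                {
                    NavigateUrl = "/product/" + product.Slug,
                    Text = product.Name
                });
            }
        }

        // hypothetical helper - in practice this would query your database
        private IEnumerable<Product> GetProductsInCategory(string name)
        {
            return new List<Product>();
        }
    }

    public class Product
    {
        public string Slug { get; set; }
        public string Name { get; set; }
    }

Here, productLinks would be a PlaceHolder control declared on Category.aspx.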