
Geeklog Forums

Cloaking -- I want to make the content of a block depend on the user agent


martingale

Anonymous
I have an ad block on my site that I would like to suppress if a spider comes along. I don't want the ads to appear in the page cache on Google or MSN. Instead I would like to return some meta text / keywords / description to the spider where the ad would normally be, to help it index my site better. I just want to change the content of this one block, nothing else.

You can tell if it's a spider based on the user agent. I could provide a list of the user agents that appear for msnbot, googlebot, etc., and the idea would be if one of those showed up you wouldn't get the ads.

Any idea how to do this? Either shutting off the block for those user agents, or having the content of the block be different, would work.
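
Roughly the idea, as a sketch (I'm assuming this would live in a Geeklog PHP block, which as far as I know needs a phpblock_ prefix on the function name; the function name, the bot list and the returned markup below are all placeholders, not working ad code):

<?php
// Sketch only: pick the block content based on the visitor's user agent.
function phpblock_ads_or_keywords()
{
    // Placeholder list of user-agent substrings for the spiders I care about.
    $spiders = array('googlebot', 'msnbot', 'slurp', 'teoma');

    $ua = isset($_SERVER['HTTP_USER_AGENT'])
        ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';

    foreach ($spiders as $spider) {
        if (strpos($ua, $spider) !== false) {
            // A spider: return descriptive keyword text instead of the ads.
            return 'Keywords and a short description of the site would go here.';
        }
    }

    // A regular browser: return the normal ad markup (placeholder).
    return '<!-- normal ad code here -->';
}
?>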


Dirk

Site Admin
Admin
Registered: 01/12/02
Posts: 13073
Location: Stuttgart, Germany
Quote by martingale: You can tell if it's a spider based on the user agent.

You are aware that Googlebot comes along as a regular browser on occasion? Google checks for sites that serve different content to spiders than to regular browsers and penalises them by ranking them down ...

bye, Dirk

martingale

Anonymous
Quote by Dirk:
Quote by martingale: You can tell if it's a spider based on the user agent.

You are aware that Googlebot comes along as a regular browser on occasion? Google checks for sites that serve different content to spiders than to regular browsers and penalises them by ranking them down ...

bye, Dirk


Yes, but you are incorrect that there is a penalty for cloaking per se. There is a penalty for cloaking that misrepresents your site. Not the same thing at all.

What is OK: Returning content that is appropriate for a particular browser, returning different content based on geographical region, omitting content that a browser would ignore anyway, deciding whether to return ads and which ads to return based on any factor.

What is not OK: Returning misleading content designed to get your page indexed for one topic when your site is really about another topic (i.e. you rank for "recipe" but you are really a "casino").

Practically any site running an ad server does some kind of cloaking; sites like CNN do it based on geography. Cloaking does not result in a penalty. It is abusive and misleading practices implemented through cloaking that lead to a penalty.

There's nothing wrong with deciding not to serve ads to a spider, or, to put it another way, choosing to advertise myself when the visitor is a spider. It's stupid, and maybe even a violation of my TOS with Google, to have my ad blocks being served out of some search engine's cached pages.

martingale

Anonymous
I mean, put it this way: do you really think that every site that returns one page to IE and a different page to Mozilla/Firefox is getting banned from Google for "cloaking"? Obviously not. It's what you DO with cloaking that gets you into trouble.

Choosing what to return based on user-agent is something that a zillion billion sites on the net do routinely for the practical reason that something that works in IE doesn't work in Mozilla.

What I want to do, decide which users to return ads to, is also something that a zillion billion sites do, including CNN, AskJeeves, etc.; all the major sites on the net return different ads to different users based on browser and IP.


jlawrence

Forum User
Chatty
Registered: 12/30/04
Posts: 49
Location: Plymouth, Devon, UK
Quote by martingale: I mean put it this way, do you really think that every site that returns one page to IE and a different page to Mozilla/Firefox is getting banned from Google for "cloaking"? Obviously not. It's what you DO with cloaking that gets you into trouble.


BULL.
Changing content dependent on browsers is not the same as changing the content that bots see.
If Googlebot decides that you're cloaking content to the bot, you won't get penalised - you're right in that - you'll get dropped from the index completely.
Cloaking with regard to the bots is a massive no-no. Do it and get caught, and you will get dropped from the index - end of story and no appeal.

That said, it's easy to do :) Just use PHP's $_SERVER['HTTP_USER_AGENT'].
Search the user agent for "googlebot" (or even just "bot") and serve appropriate content.
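Something along these lines (untested sketch; the echoed strings are just placeholders for your real ad code and keyword text):

<?php
// Untested sketch: checking for 'bot' also matches googlebot, msnbot, etc.
$ua = isset($_SERVER['HTTP_USER_AGENT'])
    ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';

if (strpos($ua, 'bot') !== false) {
    // Looks like a spider: serve whatever you want it to index (placeholder).
    echo 'Description and keywords for the spider.';
} else {
    // Regular browser: serve the ads as usual (placeholder).
    echo '<!-- normal ad markup here -->';
}
?>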
www.plymouthcricketclub.com - providing cricket for all ages in the Plymouth area.
