Search engine crawling errors
sreerag
Joined: 2010-01-17
Posts: 11
Google is crawling unwanted URLs on my site and generating a large number of server errors and 404 errors. Examples of such URLs are ?g2_view=rss.SimpleRender, ?g2_view=panorama.Panorama&g2_itemId=, and ?g2_view=webdav.WebDavMount&g2_itemId=. How can I block search engines from crawling these URLs?
Dayo
Joined: 2005-11-04
Posts: 1642
Put them into your robots.txt. Here is a selection from mine:

User-Agent: *
Disallow: *g2_view=ecard.SendEcard*
Disallow: *g2_view=rss.SimpleRender*
Disallow: *g2_view=core.UserAdmin*
Disallow: *g2_view=comment*
Disallow: /gallery2/popular*
Disallow: *slideshow.html*
Disallow: *g2_controller*
Disallow: *admin*
Disallow: *key/*
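[Editor's note: a minimal sketch, not from this thread, of how Google-style wildcard Disallow rules match URL paths. The pattern list is copied from the selection above; `is_blocked` is a hypothetical helper, and note that Python's built-in urllib.robotparser does not honour `*` wildcards, which is why the matching is simulated with regular expressions here.]

```python
import re

# Disallow patterns from the robots.txt selection above (Google-style wildcards).
DISALLOW_PATTERNS = [
    "*g2_view=ecard.SendEcard*",
    "*g2_view=rss.SimpleRender*",
    "*g2_view=core.UserAdmin*",
    "*g2_view=comment*",
    "/gallery2/popular*",
    "*slideshow.html*",
    "*g2_controller*",
    "*admin*",
    "*key/*",
]

def is_blocked(path: str) -> bool:
    """Return True if the path (including query string) matches a Disallow rule.

    Google treats '*' as "any sequence of characters" and matches each rule
    against the start of the path, so every pattern is anchored on the left.
    """
    for pattern in DISALLOW_PATTERNS:
        regex = "^" + re.escape(pattern).replace(r"\*", ".*")
        if re.match(regex, path):
            return True
    return False

print(is_blocked("/gallery2/main.php?g2_view=rss.SimpleRender"))  # blocked
print(is_blocked("/gallery2/v/photos/trip/"))                     # not blocked
```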
sreerag
Joined: 2010-01-17
Posts: 11
Thanks a lot, sir. Which robots.txt file do I need to update? I have three robots.txt files: one in the root directory (home) and the other two under public_html. http://cinespot.net/robots.txt
Dayo
Joined: 2005-11-04
Posts: 1642
public_html
smartblogger
Joined: 2013-08-07
Posts: 1
Is there any way to do this other than blocking via robots.txt?
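[Editor's note: one common alternative, an assumption on my part rather than an answer given in this thread, is to leave the URLs crawlable but tell search engines not to index them by sending an X-Robots-Tag response header. A sketch for Apache with mod_rewrite and mod_headers enabled might look like this; the G2_NOINDEX variable name is made up for illustration.]

```
# .htaccess sketch: mark any URL whose query string contains g2_view=
# as noindex instead of disallowing it in robots.txt.
RewriteEngine On
RewriteCond %{QUERY_STRING} g2_view=
RewriteRule .* - [E=G2_NOINDEX:1]
Header set X-Robots-Tag "noindex, nofollow" env=G2_NOINDEX
```

Unlike a robots.txt Disallow, this lets the crawler fetch the page (so the requests and any 404s still occur) but removes it from the index.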