
robots.txt
#1

So I'm going to be using robots.txt to block some of the results in Google's SERPs, such as the ones that end with ?cmtx_sort=5, ?cmtx_sort=1, ?cmtx_sort=6, etc. To see the train wreck that is my SERP at the moment, click here.

Would this robots.txt work?
User-agent: *
Disallow: /*?


In my head, the Disallow: /*? directive will block any URL that includes a '?'.
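If you want to sanity-check the pattern before Google recrawls, here's a rough Python sketch of how Google-style wildcard matching works. It's my own minimal reimplementation for testing, not Google's actual matcher, and rule_matches is just a name I made up:

import re

def rule_matches(rule, path):
    # Translate a Google-style Disallow rule into a regex:
    # '*' matches any run of characters, a trailing '$' anchors the
    # rule to the end of the URL, and plain rules match by prefix.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# URLs with a query string match /*? ; plain pages don't.
print(rule_matches("/*?", "/page.php?cmtx_sort=5"))  # True
print(rule_matches("/*?", "/page.php"))              # False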
#2

Yeah, or this might be better:

Disallow: /*?*

Best thing to do is to use the Google Webmaster Tools to test it:

1. On the Webmaster Tools Home page, click the site you want.
2. Under Crawl, click Blocked URLs.
3. If it's not already selected, click the Test robots.txt tab.
4. Copy the content of your robots.txt file, and paste it into the first box.
5. In the URLs box, list the URLs to test against.
6. In the User-agents list, select the user-agents you want.
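If you'd rather check from the command line instead, here's a rough local alternative to steps 4-5. It reuses the rule_matches sketch from post #1, ignores User-agent grouping for simplicity, and the site and URLs below are placeholders:

import urllib.request

SITE = "https://www.example.com"   # placeholder -- use your own site
TEST_URLS = ["/page.php?cmtx_sort=5", "/page.php?cmtx_sort=1", "/page.php"]

# Fetch the live robots.txt and collect its Disallow rules.
with urllib.request.urlopen(SITE + "/robots.txt") as resp:
    lines = resp.read().decode("utf-8", errors="replace").splitlines()

rules = []
for line in lines:
    if line.lower().startswith("disallow:"):
        value = line.split(":", 1)[1].strip()
        if value:                      # an empty Disallow allows everything
            rules.append(value)

for url in TEST_URLS:
    blocked = any(rule_matches(rule, url) for rule in rules)
    print(url, "->", "blocked" if blocked else "allowed")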

#3

Yeah Steven, it works with one or two asterisks. Thanks for the Webmaster Tools tip. Could you explain how the second asterisk changes the URL blocking?
#4

Just one asterisk is fine then. I thought you might need two, but the trailing asterisk is redundant: robots.txt rules already match by prefix, so /*? and /*?* block exactly the same URLs.
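A quick way to convince yourself, using the rule_matches sketch from post #1: both patterns match exactly the same URLs.

# Prefix matching means the trailing '*' adds nothing:
for path in ["/page.php?cmtx_sort=5", "/index.php?x=1&y=2", "/page.php"]:
    assert rule_matches("/*?", path) == rule_matches("/*?*", path)
print("Both rules behave identically.")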


