Posts: 15
Threads: 5
Joined: Jul 2013
So I'm going to use robots.txt to block some of the results in Google's SERPs, such as the ones ending with ?cmtx_sort=5, ?cmtx_sort=1, ?cmtx_sort=6, etc. To see the train wreck that is my SERP at the moment, click here.
Would this robots.txt work?
User-agent: *
Disallow: /*?
In my head, the Disallow: /*? directive will block any URL that includes a '?'.
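If it helps to sanity-check the idea, here is a minimal sketch of how Googlebot-style wildcard matching works: '*' matches any sequence of characters, and the pattern is anchored at the start of the path. Note this is a hand-rolled illustration, not Google's actual matcher (and Python's built-in urllib.robotparser does not support the '*' extension).

```python
import re

def robots_pattern_to_regex(pattern):
    # Escape regex metacharacters, then restore robots wildcards:
    # '*' matches any character sequence; a trailing '$' anchors the end.
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_blocked(disallow_pattern, url_path):
    # Match the pattern against the path plus query string.
    return bool(robots_pattern_to_regex(disallow_pattern).match(url_path))

# 'Disallow: /*?' blocks any URL whose path-plus-query contains a '?'
for path in ("/article?cmtx_sort=5", "/article?cmtx_sort=1", "/article"):
    print(path, "blocked" if is_blocked("/*?", path) else "allowed")
```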
Posts: 2,890
Threads: 59
Joined: Jun 2010
Yeah, or this might be better:
Disallow: /*?*
Best thing to do is to use the Google Webmaster Tools to test it:
1. On the Webmaster Tools Home page, click the site you want.
2. Under Crawl, click Blocked URLs.
3. If it's not already selected, click the Test robots.txt tab.
4. Copy the content of your robots.txt file, and paste it into the first box.
5. In the URLs box, list the site to test against.
6. In the User-agents list, select the user-agents you want.
Have you completed the interview?
Posts: 15
Threads: 5
Joined: Jul 2013
Yeah Steven, it works with one or two asterisks. Thanks for the Webmaster Tools tip. Could you explain how the second asterisk changes the URL blocking?
Posts: 2,890
Threads: 59
Joined: Jun 2010
Just one asterisk is fine, then. I thought you might need two, but it's okay: the trailing '*' can only match more characters after the '?', and since the pattern isn't anchored at the end, anything after the match point is already allowed through.
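A quick sketch (my own hand-rolled matcher, not Google's actual code) showing the two patterns behave identically, since an unanchored pattern's trailing '*' can match the empty string:

```python
import re

def matches(pattern, path):
    # Googlebot-style matching (sketch): '*' = any character sequence,
    # and the comparison is anchored at the start of the path only.
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    return re.match(regex, path) is not None

# '/*?' and '/*?*' block and allow exactly the same URLs.
for pattern in ("/*?", "/*?*"):
    print(pattern, matches(pattern, "/page?cmtx_sort=5"), matches(pattern, "/page"))
```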