I have a website in the "web design" niche. A couple of small businesses that I built sites for went out of business and let their websites expire. I had done some good work on those sites, so I decided to keep some of the content on my own website under my portfolio. At first I thought this was the reason my rankings dropped after Penguin, so I added the URLs of those few directories to my robots.txt so Google does not crawl them, and I also submitted requests to Google to remove the URLs from the index. If I do a search in Google like site:http://mywebsite.com I still see pages from those directories under omitted results, but under each page it says:
"A description for this result is not available because of this site's robots.txt – learn more."
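For reference, the relevant part of my robots.txt looks something like this (the directory names here are just placeholders for my actual portfolio folders):

```
User-agent: *
Disallow: /portfolio/old-client-1/
Disallow: /portfolio/old-client-2/
```

From what I understand, this only stops Googlebot from crawling those pages, which would explain why the URLs can still show up in site: results with that "description not available" message.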
So my question is: do you guys think this is still hurting my website? It's pretty obvious that a web design site shouldn't have content about "construction", "music", etc., but I thought that because I have blocked the URLs with the robots file I might be OK.
Please advise. What do you guys think?