I work as an SEO consultant for a large website. The main content of the site is detail pages for listings (accommodation, restaurants, etc.).
Because of bad SEO we ended up with literally hundreds of thousands of pages indexed. We fixed this in the summer: all the bad URLs now return 301 redirects, and nothing links to that content anymore. Half a year later, nothing has changed; the pages are still in Google's index. I googled around and found 3 possible solutions:
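As a side note, this is roughly how I spot-check that the old URLs really do return 301. It is a minimal sketch using only the Python standard library; the URL in the usage comment is made up:

```python
import urllib.request
import urllib.error


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so we can observe the raw status code."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


# build_opener() replaces the default redirect handler with ours
_opener = urllib.request.build_opener(NoRedirect)


def redirect_status(url):
    """Return (status_code, Location header or None) for url,
    without following any redirect."""
    try:
        resp = _opener.open(url)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        # 3xx responses land here because we declined to handle the redirect
        return e.code, e.headers.get("Location")


# Usage (hypothetical URL):
#   status, location = redirect_status("https://www.example.com/old-object-1")
#   print(status, location)   # expect 301 plus the redirect target
```

Running this over a dump of the old URLs at least confirms the redirects themselves are in place.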
1. Use the Google removal tool. It was designed to remove only a few pages, and according to Google it should be used only for urgent removals. I could write a script that automatically submits pages to the removal tool, but that is against their policies (if the URLs all lived in one folder I could remove the whole folder at once, but they don't).
2. Add rules to robots.txt. Google understands wildcards in robots.txt, so I can define rules matching these URLs. The problem with this solution is that Google will stop crawling the pages, but they will remain in the index. I couldn't find an answer on when, or whether, Google ever drops them from the index.
I put a few of my bad pages into robots.txt a month ago and they are still indexed.
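For reference, this is the kind of wildcard rule I mean. The paths here are made up, just to show the pattern syntax Google supports (`*` matches any sequence of characters, `$` anchors the end of the URL):

```
User-agent: Googlebot
# Block the old object detail pages (example paths, not my real ones)
Disallow: /object.php?id=*
Disallow: /*-old-listing$
```

As noted above, though, this only blocks crawling; it does not by itself remove already-indexed pages.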
3. Put the links back on my site. Google would find them, follow them, hit the 301s, and deindex the old URLs. But I am not sure whether this would harm my site.
I am at a loss here. Do you know a better solution?