Our search result pages were being indexed by Google until a few months ago. We first added a noindex,nofollow meta tag to the search results pages, and soon afterwards added /search.php to robots.txt. Since then, the Index Status count in Google Webmaster Tools (GWT) has risen by more than 20,000 to over 100,000. A site: search also returns over 140,000 pages, and site:example inurl:search shows almost 50,000 results.
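For reference, the setup described above amounts to something like the following (the exact path and tag placement are illustrative, assuming the search results are served from /search.php):

```
# robots.txt
User-agent: *
Disallow: /search.php
```

with each search results page also emitting:

```html
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```

Note that once the Disallow rule is in place, crawlers that honor robots.txt will not fetch /search.php at all, so they have no opportunity to see the noindex tag on those pages.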
Did I jump the gun by adding search to robots.txt, which just made the pages uncrawlable while leaving them in Google's index? Should I remove search from robots.txt and give Google more time to see the noindex tag and drop those pages?