Hello all. I'm a web and graphic designer who, due to circumstances, has been left to manage our companies' sites and SEO. Our websites run on a manufacturing quoting and ordering system developed by our former programmers; it powers multiple sites, each dealing with a different type of product, but they all share the same root directory.
We hired an SEO firm to help us get a few keywords to the top of Google on our different domains (Shortrunpro.com, FederalBrace.com, Killarneymetals.com, ComputerBracketSolutions.com and BisonBuilt.com), and all was well for a while until Google's latest set of changes started causing our rankings to fall.
One of their requests was that we add a sitemap.xml for our sites. As my title suggests, all of our sites, despite being on different domains, use a skin system that gives each its own look and even content, but they all share the same root directory, so putting a sitemap.xml in there for one site would have adverse effects on the others. The same goes for robots.txt files, though that is less of a concern to me at this point.
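One idea I had, though I don't know if it's sound, is a host-based rewrite so that each domain serves its own file out of the shared root. A rough sketch of what I mean, assuming our server is Apache with mod_rewrite enabled, and with hypothetical per-domain files like sitemap-federalbrace.xml that I would have to generate myself:

    # .htaccess in the shared root (sketch, assuming Apache + mod_rewrite)
    RewriteEngine On

    # When sitemap.xml is requested, serve a file specific to the domain
    RewriteCond %{HTTP_HOST} (www\.)?federalbrace\.com$ [NC]
    RewriteRule ^sitemap\.xml$ sitemap-federalbrace.xml [L]

    RewriteCond %{HTTP_HOST} (www\.)?shortrunpro\.com$ [NC]
    RewriteRule ^sitemap\.xml$ sitemap-shortrunpro.xml [L]

    # ...and likewise for the other three domains, plus the same
    # pattern for robots.txt if that ever becomes necessary

Is something along those lines a reasonable approach, or is there a better-supported way to do this?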
My question is: given the situation our sites are in, what is the best way to handle this so that each site gets indexed properly without accidentally indexing the other domains' links as part of it? Can you just list multiple domains' URLs in a single sitemap, like the snippet below, and trust that Google and the other search engines will be smart enough to pick out which URLs belong to each domain and ignore the rest?
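To be concrete about what I'm asking, I mean a single shared sitemap.xml that mixes entries for several domains, something like this (the paths here are made up for illustration):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.federalbrace.com/some-product-page</loc>
      </url>
      <url>
        <loc>http://www.shortrunpro.com/some-other-page</loc>
      </url>
      <!-- ...entries for the other three domains... -->
    </urlset>

Would the search engines accept that, or does each domain need its own file?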
Thanks. I look forward to learning from this community as an active member rather than just passively following it as a guest, as I have so far.