Just need a little advice. We have a UK domain, which is our main site, and an EU domain that is a carbon copy of the UK site but with prices in Euros. At the moment the EU site has a robots.txt file disallowing all pages to prevent duplicate content issues, and its main traffic sources are paid and email. I want to remove the disallow but don't want to create duplicate content problems. Am I right in thinking that if we add a canonical tag to every EU page, citing the corresponding UK page as the original content, this will let us lift the robots.txt disallow without causing duplicate content issues?
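For reference, the setup I'm describing would look something like this on each EU page (domains and paths below are placeholders, not our real URLs):

```html
<!-- In the <head> of each page on the EU site, pointing at the
     matching page on the UK site (example domains only) -->
<link rel="canonical" href="https://www.example.co.uk/some-product" />
```

The robots.txt on the EU site would then go from `Disallow: /` to an empty `Disallow:` so the pages can be crawled and the canonical tags actually seen.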