As we all know, the United Kingdom has recently pulled off the greatest U-turn in geopolitical history, slowly but surely infiltrating American culture. The best way to do that, of course, is by showing up on American television a lot. Let's take a look at some Brits who are taking Hollywood by storm.
