I'm really curious, what do Europeans think of the wild west?
That Clint Eastwood westerns are awesome, that Once Upon a Time in the West is the best western evah, and that I love Dances with Wolves, both the book and the movie. As far as historical events are concerned, when the wild west is mentioned I mostly think of the struggles between the Natives and the white settlers. I won't go into that further; I don't want to derail the thread with such a sensitive subject.
I'm really tempted to buy Red Dead Redemption when it comes out, even though I can't really afford it. It looks great.