The Thriller
As I grow older, I feel that one thing America can offer other people is hope. Whether you're Black, foreign-born, or LGBT, America should be more of an idea, a standing up for human rights, rather than a blood-and-soil nation limited to short-term wealth generation. I really felt we lost our sense of that for a few years there, and it feels like many now recognize that if America doesn't stand up for democratic principles and human rights, who will? I wish more people would challenge themselves to look beyond their immediate, selfish day-to-day needs.