Americans are taught more than enough as children to be able to make America ACTUALLY great. I truly believe that. But for some reason, the older generations only retained bits & pieces, or missed the point entirely?? The younger generations just want the America they were taught about.
We weren't taught ALL of black history (& we SHOULD be), but we WERE taught that racism is bad? We were taught the Confederacy was racist? We were taught about fascism & the signs leading up to it? We were taught to be on the right side of history? Why is that up for debate now?
We were taught that America is a free country, the land of opportunity. A melting pot that welcomes immigrants and all cultures, a place where you can go to be whoever and whatever you want to be. Why are people being told to go back to their country? Why are we building a wall?
The Constitution calls for separation of church and state? So why is abortion/reproductive health/trans healthcare even up to them? The Declaration of Independence says it's our DUTY to overthrow the government? So why are we thugs and criminals? Why is anti-fascism “terrorism”?
America was NEVER great, but growing up we were always taught that we can make it great. Why is our elderly government going against everything they taught us? Why are WE extremists for not accepting that? We are exactly what they raised us to be, why are they so surprised??