At some point during the Obama administration I started noticing all this silly woke shit becoming mainstream. I can't pinpoint exactly when or what sparked it, but I vaguely remember hearing certain key phrases and thinking, "That's preposterous." Then towards the end of his term and the beginning of Trump's, it just exploded.
Was it something being fed to our kids in school that only surfaced once they became old enough to express it, or was it something by design, slowly promoted by radical politicians?
It's just crazy how our culture was tossed in a blender and then, *poof*, suddenly close to half the nation is woke.