Tay was an artificial intelligence chatbot originally released by Microsoft Corporation via Twitter on March 23, 2016. It caused controversy when it began posting inflammatory and offensive tweets through its Twitter account, leading Microsoft to shut down the service only 16 hours after its launch.