Here's a glitch/bug/MASSIVE-FSCKING-PRIVACY-RISK associated with the new #Bing #chatbot feature that I haven't seen documented.
Answers from one query bleed over into another.
Earlier today I asked it about the black power salute that Tommie Smith and John Carlos gave at the 1968 Olympic Games.
Tonight I asked it about headlines referencing chatbot hallucinations.
Look what it did. It returned an utterly irrelevant answer, reflecting my previous query. How can this be production technology?