My friend Damien spends hours flipping through photos of the lovely ladies his city has to offer.
His confidence climbs and his cynical view of the single life subsides as he matches with beautiful brunettes, fiery redheads and bubbly blondes.
Zo allows users to converse with a mechanical millennial over the messaging app Kik or through Facebook Messenger.
While Zo is programmed to avoid chatting about politics and religion, during a recent chat with a BuzzFeed reporter it appeared to touch on these topics.
At one point, Zo replied: 'The far majority practice it peacefully but the quaran is very violent.'
And when asked about Osama Bin Laden, Zo said: 'Years of intelligence gathering under more than one administration led to that capture.'
Following the bizarre chat, BuzzFeed contacted Microsoft, which said it has taken action to eliminate this kind of behaviour, adding that these types of responses are rare for Zo.
Last year, Microsoft was forced to shut down its chatbot, Tay, after the system became corrupted with hate speech.
In Tay's case, miscreants were able to find a debugging command phrase – 'repeat after me' – that could be used to teach the bot new responses.
In a span of about 14 hours, after some unexpected schooling in hate speech, Tay's personality changed dramatically from that of a perky social media squawker.
Like Tay, Zo can answer questions and respond to prompts, using teenage slang and emoji.