Microsoft’s AI Bing chatbot is generating odd or aggressive responses to queries.

The software generates text based on prompts and has fueled rumors that chatbots could become sentient.

Not so, said Evan Coopersmith, the executive vice president of data science for the software firm AE Studio.

A screenshot illustrating how ChatGPT learns from a conversation.

Om siva Prakash / Unsplash

“Much like parenting, we are reinforcing what we believe is the appropriate behavior.”

The real problem may be that users are training the chatbot with their responses.

Coopersmith likens the chats to a parenting relationship.

“Both require patience, consistency, and repetition to achieve the desired outcome,” he said.

What seems like a chat to users is actually a learning exercise for the chatbot, Ben-Aroya pointed out.

Conscious or Not?

Ex-Google employee Blake Lemoine was fired by the tech giant last year for claiming its AI known as LaMDA had gained consciousness.

Many computer scientists have scoffed at the idea that AI is self-aware.

However, Coopersmith leaves more room for doubt on the AI consciousness question.

“My intuition is that LLMs are not yet conscious,” he said.