Opinion

Screw It, Put a Chatbot in Every App

Companies are eyeing Bing’s AI experiment and trying to find a way to capitalize on the chatbot hype. Snap and Meta are just the first.

by Ian Carlos Campbell
[Image: The new Bing experience. Credit: Microsoft]

Microsoft’s going full steam ahead, integrating “the new Bing” up and down its products and services. The push has generated enough interest that the hype among companies big and small has turned once again to artificial intelligence, specifically text-based interfaces like chatbots.

Snap, the creator of Snapchat and oddball hardware like the Pixy drone and Spectacles smartglasses, announced earlier this week it would offer its own version of OpenAI’s ChatGPT, called My AI. Meta, not to be outdone, announced similar plans to provide AI “experiences with text” in apps like WhatsApp and Messenger, and eventually “AI personas.”

These companies are early to what could be a land rush on chatbot and AI-powered text generators, similar in many ways to the explosion of social video formats like “Stories” that have plagued apps and services over the last few years. Talking to a chatbot doesn’t make sense for every app, but there’s a good chance we’ll have some opportunity to speak to a branded one in most popular apps soon, just because the interest is there.

And I don’t think that’s necessarily a bad thing. The risks implied by these new AI services are clear; they’re possible sources of misinformation, bias, and rampant intellectual property theft. But those risks might also be why less serious integrations — the unmotivated, experimental, or hype-driven kind — could be a good thing. Spamming chatbots everywhere might just lead us to discover their safer and more entertaining purpose.

Flavors of Personality

[Image: My AI even has its own dedicated Bitmoji. Credit: Snap]

I think both Snap and Meta’s plans illustrate what I mean. My AI exists inside Snapchat’s chat section and is designed to be personalized — Snap’s announcement encourages users to give My AI a name and custom chat wallpaper. Snap’s using OpenAI’s ChatGPT large language model (LLM), much like Bing, but “customized for Snapchat.” The company doesn’t have a real purpose for the chatbot in mind beyond the same kind of experimentation you can already do with ChatGPT, and it’s currently limiting My AI to subscribers of Snapchat+.

My AI is basically a very complex toy you can talk to in much the same way you might casually talk with friends throughout the day, but one that could become more personalized and distinct over time as Snap expands its customizations. That could even include incorporating language models other than OpenAI’s over time, Snap CEO Evan Spiegel told The Verge.
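Spiegel’s comment about swapping in other language models suggests one plausible shape for this kind of product: a thin persona layer sitting on top of an interchangeable model backend. The sketch below is purely illustrative, not anything Snap has described; `make_persona_bot`, `echo_backend`, and the prompt text are all invented for the example.

```python
# Illustrative only: a persona "customization" layer that wraps any
# chat-model backend, so the underlying LLM could be swapped out
# without touching the persona on top.

from typing import Callable, Dict, List

Message = Dict[str, str]
Backend = Callable[[List[Message]], str]  # takes chat history, returns a reply


def make_persona_bot(backend: Backend, name: str, persona_prompt: str):
    """Return a chat function that keeps history and prepends a persona prompt."""
    history: List[Message] = [{"role": "system", "content": persona_prompt}]

    def chat(user_text: str) -> str:
        history.append({"role": "user", "content": user_text})
        reply = backend(history)
        history.append({"role": "assistant", "content": reply})
        return f"{name}: {reply}"

    return chat


# A stand-in backend for the sketch; a real product would call an LLM here.
def echo_backend(history: List[Message]) -> str:
    return f"(reply to: {history[-1]['content']})"


my_ai = make_persona_bot(
    echo_backend, "My AI", "You are a friendly companion inside a chat app."
)
print(my_ai("hey, what's up?"))
```

Because the backend is just a function from chat history to a reply, switching providers means swapping one callable for another; the name, prompt, and accumulated conversation all stay put.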

Meta has taken a more measured approach, releasing a research-oriented large language model it calls LLaMA before promising to incorporate any more AI features into its existing services. But Mark Zuckerberg’s mention of “AI personas” is intriguing. Meta’s past public chatbot experiments have broadly been disappointments, but as it stands, one of the things avid users have found most intriguing about the new Bing (and ChatGPT before it) is the vague sense of personality these chatbots seem to have.

We can’t attribute human emotions to what some have humorously called “spicy autocomplete,” but if one of Meta’s future “AI personas” has a distinct enough vibe that we could tell it apart from Snapchat’s My AI? That might be interesting to see.

From the dramatic coverage of the new Bing’s occasionally odd behavior in the press, it’s obvious these chatbots can be engaging and provoke a strong emotional response, even if you aren’t willingly fooling yourself into thinking they possess intelligence of any kind. To me, that seems like proof enough that Snap is right, and having an entertaining entity to talk to might be purpose enough.

Branded Nightmare

That isn’t without its own terrible end scenarios. “Voicey” branded accounts on Twitter are a regular annoyance on the platform, and instances where relatability is leveraged to sell products frequently range from chuckle-worthy to disturbing. There’s a real chance having a branded chatbot could go that way without any misguided human input. Train a model on the right company messaging and include the right limitations and you could get something even more concerning than an off-color post.

There’s also the question of money. Snap’s and Meta’s social platforms are still driven by advertising, and an extra chat screen is nothing if not another place to put ads, likely in an even more insidious way. Reuters reported that Microsoft planned to incorporate ads into responses from the new Bing in much the same way promoted links are placed in search results. I could see competing chatbots succumbing to the pressure to be another place to “discover” “great content from partners” in the same way algorithmic feeds are.

The added ickiness of more consciously affiliating yourself with a specific app or platform might be my real concern, though. When Replika, an AI companion service, reportedly limited its app’s ability to have sexual conversations with users, many people lost it, according to Motherboard. I can’t imagine ever feeling that strongly about a chatbot in a social app, but that kind of deep affinity could easily be abused if it’s made possible.

Let Chaos Reign (Within Reason)

With the possible risks in mind, leaning into the chatbot hype feels better than avoiding it entirely, especially for apps and services like Snapchat or WhatsApp. New use cases are often discovered by users before the companies making the apps, as was the case for many of Twitter’s most-used features, and when it comes down to it, chatting with a text generator is social, however sad that sounds.

Bing could upend search, but it’s an inherently conservative vision for the kind of place an AI bot could have in online culture (and not, I’ll note, conservative in terms of prioritizing safety). We need more access to chatbots, not less, to figure out what they’ll actually be used for, and if that means putting them in places where they aren’t necessary, then so be it.
