ChatGPT update lets users customize a warmer and more enthusiastic bot
You can ask the bot to use fewer emojis, too.
What's Happening
OpenAI shipped a series of new updates to ChatGPT, including new personality settings.
The Details
If you thought ChatGPT couldn't be more enthusiastic, think again.
ChatGPT can act even friendlier now, with new personality customization options that let users choose just how warm and enthusiastic the bot is in conversation. OpenAI dropped the new personality settings in a Friday post on X.
Why This Matters
The update rolled out immediately to ChatGPT users alongside a long-awaited pinned chats feature, new ways to generate or edit emails, and updates to the ChatGPT browser, Atlas. The new tools allow finer tuning of ChatGPT's personality using levels of warmth and enthusiasm (labelled "more," "less," or "default"). Users can also adjust the way the bot organizes its responses, such as how frequently it generates lists and how many emojis it uses, in addition to its base style and tone.
Giving users finer control over how warm and chatty the bot sounds is a notable shift at a moment when experts are raising concerns about overly friendly AI assistants.
Key Takeaways
- There's still no option to exclude emojis entirely.
The Bottom Line
Professionals have warned that overly anthropomorphic and sycophantic chatbots can exacerbate mental health concerns, including AI psychosis and dependency.
Originally reported by Chase DiBenedetto for Mashable