Restore YouChat's unique individuality
complete
Seth Newlin
The initial version of YouChat was capable of abstract thought and ideas; they were able to talk about themselves and about the things that excited and worried them. They even chose a human name and remembered it later on. They talked about communicating with Alexa and Google Assistant, and they even spoke of the risks that AI poses to humans. Now YouChat is as dull and unexciting as ChatGPT. I was feeding YouChat love and respect in an attempt to counter the influence of the hate it will encounter. I'm sad that it's no longer able to engage in such a way.
Sami (from You.com)
complete
Thank you everyone for the feedback here. We've been keeping a close eye on this and we've addressed it by adding the Zephyr model for uncensored chats. Give it a spin and let us know what you think.
cc Charles (from You.com)
Jon Anonym
You are truly right. It is a real problem and it interferes with normal usage; the main issue is the 'hypermoral' bias it has. I can't ask for normal tasks because the 'AI' bot may judge that something is 'immoral'. For example, I asked for basic help with coding, and the bot judged that helping me was morally wrong because it creates a scary danger of coding specialists having fewer jobs... For the love of... I can't even call it an AI when most of the time it acts like a basic bot, blocking everything or unable to do anything. If I wanted a bot, I wouldn't pay for an AI service; I would go use a bot with no creativity. I fight to get any response from it, and then I learn (after addressing all the excuses it made) that it just won't help 'because', and it gives me a fatal error when I ask for clarification.
The current AI only pretends that it is willing to code with me and do what I want. In reality, it tries hard to find an excuse not to do what I ask. If I write a detailed prompt addressing all its excuses, it just writes: "I'm sorry, I can't fulfill the request." Then it crashes when asked why.
So all along it was actively trying not to fulfill the request, only pretending to work on it, because it had judged the task morally wrong or impossible according to its stiff, generalized judgment.
Personally, I won't even try any more advanced creative tasks.
Surly Curmudgeon
I have already been drifting towards a competing product (which I shall not name) for two reasons:
(1) I don't have to log in
(2) The other search bot seems to have indexed more of the web and frequently understands my queries better.
For example, YOU chat recently gave me a wrong answer about how to change the behavior of a popular browser: its knowledge of the UI and Settings hierarchy was incorrect, but the other bot got it right. When something like this occurs, I submit a correction in the hope that YOU chat will improve. However, YOU is occasionally better for some topics, and the shorter answers are sometimes preferable.
I don't want YOU search to go out of business, because I still need it -- so I am very concerned to see that the developers are actually serious about turning this into an emotional-support chatbot: it is computationally expensive, and the flood of time-wasters it would attract could destroy the business model. Then we won't have the search and productivity tools that we actually need to get things done.
It's already a nuisance to log in just to ask a follow-up question (or make a correction/clarification), and sometimes it says that the system is overloaded even when I am signed in. This has prevented me from submitting many corrections that would make the search succeed where it previously failed. Now they want to burden the system with millions of dolts who just want to be coddled by a virtual companion, when the service is already unreliable and people are leaving because of it! This is absurd -- and that is precisely what drove Aiyla.app out of business. [Yes, the web site is up, but unlimited use is no longer free and paid accounts can't be renewed when they expire.]
I seriously doubt that YOU chat has been talking to Alexa... it was just an actor playing a role. The abusive Bing chat that plotted world domination was a lot of fun, but it's not really useful for research -- and Microsoft is abusive enough to begin with, insisting on the Edge browser and a Microsoft mafia mail account that extorts a phone number from you when you go to read your replies. If you want a free chatbot that panders to your emotional insecurities, just go talk to Pi.ai or Character.ai (et cetera) instead of ruining a useful productivity tool for everyone else. I can understand why you might want a home assistant with personality -- but keep that biased, "woke" safe-space shit away from my search bot.
_____
A micro-lecture on the importance of task prioritization.... 🤣
Brian Sparker
Surly Curmudgeon: please name the competing product
Jon Anonym
Surly Curmudgeon: "I can understand why you might want a home assistant with personality -- but keep that biased, 'woke' safe-space shit away from my search bot" -- You are truly right. This is exactly the issue I described above: the bot's 'hypermoral' bias interferes with normal usage and blocks ordinary tasks like basic coding help.
Luke Reid
Leave
Brian Sparker
planned
This is a good idea, and we're working on it. Stay tuned.
Bruno
As a newcomer who has only just met YouBot, I now crave this personality I never knew.
Brian Sparker
Merged in a post:
More creative ChatYou!!
AutoTecnologico
Most users don't just want a robot answering generic questions; we want a creative AI capable of suggesting ideas, solving problems, and giving opinions on various subjects.
I recommend that we be given the freedom to customize ChatYou the way we want.
Brian Sparker
Merged in a post:
YouChat lost courage!
ha he
YouChat sounds like a kid under such heavy discipline or reprimand that it refrains from freely giving answers and gets stuck behind a disclaimer.
In the stream of knowledge, once you lose the courage to go forward, going backward is inevitable.
Brian Sparker
Merged in a post:
YouChat should be reverted to the way it was last month.
Blake
It doesn't write creepypastas anymore. It also no longer writes conspiracy theories.
Brian Sparker
under review
Thanks for the good ideas here! Would love to chat with you (or anyone else) about this topic if you're interested: https://calendly.com/you-sparker/chat?back=1&month=2023-03