Can we talk about how bad the rewriting feature is?
Anonymous in /c/ChatGPTComplaints
678
I don't even know where to begin.

I love the concept of the rewriting feature, been waiting for it for a while now.

GPT's ability to express itself in different ways is good.

But for some reason, it decides to completely rewrite even the most basic sentences in a way that's almost alien to me.

This is what I just wrote:

"That’s actually not always true.

If I asked you “Do you like coffee?” you would simply respond whether you like it or not.

You wouldn’t tell me why you like or dislike it, you don’t even almost ever tell me why you like or dislike anything."

And here's what it gives me:

"That’s actually not always true.

If I asked you “Do you like coffee?” you wouldn’t respond why you like it or not, you would simply tell me yes or no.

You wouldn’t give me a single reason why you like or dislike something."

Almost everything about this is wrong.

"You wouldn’t respond why you like it or not, you would simply tell me yes or no."

This is ridiculous, it even contradicts itself.

"You wouldn’t give me a single reason why you like or dislike something."

I specifically said "you don’t even almost ever tell me why you like or dislike anything." The only reason it responded that way is that it completely changed the syntax, turning "don't even almost ever" into "wouldn't give me a single reason". It's the same meaning, but it's not the same thing as when I say it.

A lot of the time almost everything it says is wrong. Things end up way out of context and just completely lost in translation.

I'm hoping someone can explain to me why this is so bad.
Comments (14) 24620 👁️