Sometimes it can be hard to tell if we’re chatting with a bot or a real person online, especially as more and more companies turn to this seemingly cheap way of providing customer support. What are some strategies to expose AI?

  • tikitaki@kbin.social · edit-2 · 1 year ago

    The reason is that the web-browser chatgpt has a maximum amount of data per request. This is so they can minimize cost at scale. So for example you ask a question and tell it not to include a word. What happens is your question gets sent like this

    {'context': 'user asking question', 'message': {user question here} }

    then it gives you a response and you ask it another question. typically if it’s a small question the context is saved from one message to another.

    {'context': 'user asking question - {previous message}', 'message': {new message here} }

    so it literally just copies the previous message until it reaches the maximum token length

    however there’s a maximum number of tokens (roughly, words) that can be in the context + message combined, so the context is limited. after a certain amount of input, chatgpt will start dropping things. it uses a method to try to keep the “most important words”, but this is inherently lossy. it’s like a JPEG: it gets blurry in order to save data.
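the lossy trimming can be sketched in a few lines. this is a deliberate simplification: it counts words instead of tokens and just drops the oldest ones, whereas the post describes a smarter (but still lossy) "most important words" summarization:

```python
# Sketch of lossy context truncation: when context + message exceed
# a budget, something has to be dropped. Real systems count tokens
# and summarize heuristically; this sketch counts words and drops
# the oldest ones, which is an assumption for illustration.

MAX_WORDS = 8  # tiny budget so the loss is visible

def truncate(context, message):
    words = (context + " " + message).split()
    if len(words) > MAX_WORDS:
        # whatever fell off the front is gone for good
        words = words[-MAX_WORDS:]
    return " ".join(words)

ctx = "user wanted to know the best fruit not including apple"
out = truncate(ctx, "name another one")
print(out)  # the front of the context has been cut off
```

whichever dropping strategy is used, the effect is the same: once the budget is hit, part of your earlier instruction silently disappears from the request.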

    so for example if you asked “please name the best fruit to eat, not including apple” and then maybe on the third or fourth question the “context” in the request becomes

    'context': 'user asking question - user wanted to know best fruit'

    it would cut off the “not including apple” bit in order to save space

    but here’s the thing - that exists in order to save space and processing power. it’s necessary at a large scale because millions of people could be talking to chatgpt and it couldn’t handle all that.

    BUT if chatgpt wanted some sort of internal request that had no token limit, then everything would be saved. it would turn from a lossy jpeg into a png file. chatgpt would have infinite context.

    this is why i think for someone who wants to keep context (i’ve been trying to develop specific applications where context is necessary), the chatgpt api just isn’t worth it.