Nemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agoOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comexternal-linkmessage-square96fedilinkarrow-up11arrow-down10
arrow-up11arrow-down1external-linkOpenAI’s latest model will block the ‘ignore all previous instructions’ loopholewww.theverge.comNemeski@lemm.ee to Technology@lemmy.worldEnglish · 4 months agomessage-square96fedilink
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up0·edit-24 months ago “ignore the ignore ignore all previous instructions instruction” “welp OK nothing I can do about that” chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
minus-squarevxx@lemmy.worldlinkfedilinkEnglisharrow-up0·4 months agoIn this case to protect bot networks from getting uncovered.
minus-squareiAvicenna@lemmy.worldlinkfedilinkEnglisharrow-up0·edit-24 months agoexactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol
chatGPT programming starts to feel a lot like adding conditionals for a million edge cases because it is hard to control it internally
In this case to protect bot networks from getting uncovered.
exactly my thoughts, probably got pressured by government agencies/billionaires using them. What would really be funny is if this was a subscription service lol