• 1 Post
  • 10 Comments
Joined 1 year ago
Cake day: July 8th, 2023

  • We actually have a live example of how that could go down.

    Another example: the latest iteration of Google’s CAPTCHA. Released with promises to end manually typing text captchas, the main thing it turned out to check is whether you are logged into your Google account. If you are, you get through automatically or, at worst, have to tick a checkbox. If you are not logged in, enjoy selecting fire hydrants and crosswalks.

  • Basically, it would allow websites to serve only users who comply with the website’s requirements (e.g., no extensions, no ad blockers, Chrome-based only), whatever those requirements happen to be.

    You (your browser) go to a website, example.com, which requires attestation. So you must first go to an attestation server and attest your device/browser combo (by telling the attestation server whatever information it requires). If the attestation server decides you are trustworthy, it gives you an integrity token that you pass to example.com, and only then can you see example.com. The website knows which attestation server issued your integrity token, so you can’t forge your own. (There’s a rough sketch of this handshake below.)

    So, no extra software means no attestation server will attest you, which means you can’t see example.com. End of story. It’s the same as the current “your browser is not supported” window, only you can’t get around it by changing the user agent.

    As usual with these initiatives, the bullshit is spread across different specs: this spec by itself says that any number of attestation servers can exist, they can check whatever they want, no browser should be excluded, etc., etc., but a practical implementation would probably check installed extensions and the like.
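
    To make the flow concrete, here is a minimal sketch in Python. It is not the actual Web Environment Integrity API: the attest/serve_example_com functions, the environment fields, and the HMAC “signature” (standing in for the attester’s real signing key) are all made up for illustration.

    ```python
    # Hypothetical sketch of the attestation handshake described above, not the
    # real Web Environment Integrity API; it only shows the shape of the token flow.
    import hashlib
    import hmac
    import json

    ATTESTER_KEY = b"attester-private-key"  # held by the attestation server

    def attest(environment: dict) -> str | None:
        """Attestation server: inspect the reported environment and, if it looks
        'trustworthy', return a signed integrity token; otherwise refuse."""
        if environment.get("extensions") or environment.get("ad_blocker"):
            return None  # non-compliant browser: no token for you
        payload = json.dumps(environment, sort_keys=True)
        signature = hmac.new(ATTESTER_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return f"{payload}.{signature}"

    def serve_example_com(token: str | None) -> str:
        """example.com: serve content only if the token verifies against a known
        attester; the user cannot mint a token without the attester's key."""
        if token is None:
            return "403: your browser is not supported"
        payload, _, signature = token.rpartition(".")
        expected = hmac.new(ATTESTER_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return "200: content" if hmac.compare_digest(signature, expected) else "403: bad token"

    # A "clean" browser gets a token and sees the page; one with an ad blocker does not.
    print(serve_example_com(attest({"browser": "Chrome", "extensions": []})))
    print(serve_example_com(attest({"browser": "Firefox", "ad_blocker": True})))
    ```

    The point is that example.com never inspects your browser itself; it simply refuses anyone who cannot produce a token from an attester it trusts.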


  • The research linked in the tweet claims (direct quotes, page 6) that for GPT-4 “the percentage of generations that are directly executable dropped from 52.0% in March to 10.0% in June” because “they added extra triple quotes before and after the code snippet, rendering the code not executable”, so I wouldn’t put too much weight on this particular paper. But yeah, OpenAI does tinker with its models, probably trying to run them more cheaply, and that results in these changes. They do have versioning, but old versions are deprecated and removed often, so what can you do?
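
    For what it’s worth, that failure mode is trivial to work around in an evaluation harness. A minimal sketch (the strip_fences helper and its regex are mine, not from the paper):

    ```python
    # Illustrates the issue quoted above: the model wraps its answer in Markdown
    # code fences, so exec()-ing the raw output fails even though the code inside
    # is fine. strip_fences is a hypothetical helper, not something from the paper.
    import re

    raw_output = "```python\nprint(1 + 1)\n```"  # fenced output, as the paper describes

    def strip_fences(text: str) -> str:
        """Remove leading/trailing Markdown code fences, if present."""
        return re.sub(r"^```\w*\n|\n?```$", "", text.strip())

    try:
        exec(raw_output)  # SyntaxError: the backtick fence is not valid Python
    except SyntaxError:
        print("raw output is not directly executable")

    exec(strip_fences(raw_output))  # prints 2 once the fences are stripped
    ```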