  • Generative AI doesn’t get any training from being used. The explosion in public AI offerings falls into three categories:

    1. Saves the company labor by replacing support staff
    2. Used to entice users by offering features competitors lack (or as catch-up after competitors have added it for this reason)
    3. Because AI is the current hot thing that gets investors excited

    To make a good model you need two things:

    1. Clean data that is tagged in a way that allows you to grade model performance
    2. Lots of it

    User data might meet need 2, but it fails at need 1. Running that unlabeled user data through neural networks to make it more exploitable (more accurate interest extraction, etc.) makes sense, but training on it doesn’t.

    This is clearly demonstrated by Google’s search AI, which learned lots of useful info from Reddit but also learned absurd lies with the same weight. Not just overtuned-for-confidence lies, straight up glue-the-cheese-on lies.




  • I feel the need to point out that a float isn’t an integer with a decimal part stuck on. A floating point number is called that because the decimal point “floats”: how much precision you get on either side of it depends on the magnitude of the number.

    It’s actually stored as an exponent and a value to apply the exponent to (the significand). That lets you express incredibly tiny numbers and incredibly large numbers, but the gaps between representable numbers are inconsistent: they get wider as the numbers get bigger.
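
    A quick Python sketch (assuming the usual IEEE 754 doubles, which is what Python’s float is on essentially every platform; math.ulp needs Python 3.9+) shows the exponent/significand split and how the gap between neighbouring representable numbers grows with magnitude:

        import math

        # math.frexp splits a float into significand * 2**exponent,
        # which mirrors how the hardware stores it (sign, exponent, significand).
        for x in (0.15625, 3.0, 1e18):
            significand, exponent = math.frexp(x)
            print(f"{x!r} = {significand} * 2**{exponent}")

        # math.ulp gives the gap to the next representable float.
        # It is tiny near 1.0 and enormous near 1e18.
        for x in (1.0, 1e6, 1e18):
            print(f"gap above {x!r}: {math.ulp(x)}")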

    You know how 10 / 3 * 3 is often not 10 because the decimal representation loses the repeating .33? With floats you run into the same issue, just in much less predictable places.
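
    For example, in Python (the same story plays out in any language using IEEE 754 doubles):

        # In decimal, 10 / 3 loses the repeating 3s; in binary floats,
        # even "simple" decimals like 0.1 can't be stored exactly.
        print(0.1 + 0.2)          # 0.30000000000000004
        print(0.1 + 0.2 == 0.3)   # False

        # Adding 0.1 ten times drifts away from 1.0 in the same way.
        total = sum([0.1] * 10)
        print(total)              # 0.9999999999999999
        print(total == 1.0)       # False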