• 0 Posts
  • 17 Comments
Joined 3 years ago
cake
Cake day: July 11th, 2023

  • The whole premise of deep think and similar in other models is to come up with an answer, then ask itself if the answer is right and how it could be wrong until the result is stable.

    The seahorse emoji question is one that trips up a lot of models (it’s a Mandela effect thing: the emoji doesn’t exist, but lots of people remember it and are consequently firm that it’s real). I asked GLM 4.7 about it with deep think on, and it wrote about two dozen paragraphs trying to think of everywhere a seahorse emoji could be hiding: whether it was in a previous or upcoming standard, whether there was another emoji that might be mistaken for a seahorse, etc. It eventually decided that it didn’t exist, double-checked that it wasn’t missing anything, and gave an answer.

    It was startlingly like stream of consciousness of someone experiencing the Mandela effect trying desperately to find evidence they were right, except it eventually gave up and realized the truth.

    EDIT: Spelling. Really need to proofread when I do this kind of thing on my phone.
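The answer-then-critique loop described above can be sketched in a few lines of Python. This is a toy illustration with a stub standing in for the model; the prompts, loop structure, and stub responses are my own assumptions, not any vendor's actual "deep think" implementation.

```python
def self_check_answer(model, question, max_rounds=5):
    """Generate an answer, then repeatedly critique it until it stops changing."""
    answer = model(f"Answer this question: {question}")
    for _ in range(max_rounds):
        revised = model(
            f"Question: {question}\nDraft answer: {answer}\n"
            "Check this draft: could it be wrong? Give a corrected answer."
        )
        if revised == answer:  # the result is stable, so stop
            break
        answer = revised
    return answer

# Stub model: first insists the emoji exists, then corrects itself and holds firm.
responses = iter([
    "Yes, there is a seahorse emoji.",
    "There is no seahorse emoji in Unicode.",
    "There is no seahorse emoji in Unicode.",
])
model = lambda prompt: next(responses)

print(self_check_answer(model, "Is there a seahorse emoji?"))
# prints "There is no seahorse emoji in Unicode."
```

The stopping condition here (answer unchanged between rounds) is the simplest way to model "until the result is stable"; real systems presumably use something more elaborate.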



  • They couldn’t do that from one photo though, they’d need several examples all believed to be the same guy. A swirl like that preserves some of the information and you can reverse it, but the lost data is lost. Do that for several photos and you can get enough preserved bits to piece something together.

    Same idea for some other kinds of blurs or mosaics. Black boxes, not so much - you’ve got no data to work with, so anything you tried to reconstruct would be more or less entirely fantasy.
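To make the reversibility point concrete, here's a toy numpy sketch (my own illustration, not an actual deswirling technique). A real swirl is a geometric distortion whose interpolation destroys some bits, which is why multiple photos help; this models only the reversible rearrangement part, with a black box for contrast.

```python
import numpy as np

rng = np.random.default_rng(42)
image = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)

# Model a swirl as a known, deterministic pixel permutation:
# the values are rearranged, not destroyed, so it can be undone exactly.
perm = rng.permutation(image.size)
swirled = image.ravel()[perm].reshape(image.shape)

# argsort(perm) is the inverse permutation; applying it recovers the original.
recovered = swirled.ravel()[np.argsort(perm)].reshape(image.shape)
print(np.array_equal(recovered, image))  # prints True: every pixel survives

# A black box overwrites the values instead; no transform can bring them back,
# because no information about the original pixels remains in the file.
redacted = image.copy()
redacted[2:6, 2:6] = 0
```

The interpolation loss in a real swirl means each photo only yields a noisy partial reconstruction, which is why several photos of the same face were needed to piece something together.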





  • …and it did.

    Bill Clinton is very, very good at not perjuring himself while not admitting to things he doesn’t want to admit. He “did not have sexual relations with that woman” because he made them define “sexual relations”, and the definition they supplied did not include receiving oral. The definition of “is” mattered because whether it meant only currently, or also at any time in the past (i.e., does “is” include “was”), would change what answer he’d have to give to avoid perjuring himself.

    He’s all up for this testimony because this is exactly his wheelhouse: he thinks they can embarrass him but can’t pin him for pedophilia, and he wants it to be very public because he’s got enough dirt on enough major conservatives that they won’t want it done publicly - which might get him out of even having to be embarrassed.





    Basically they figured out a way to train AI to recognize Reddit threads going viral (and/or predict which ones will), figure out which of those will rate highly in Google results and tend to be used as sources by the biggest LLMs, and post in those threads about whatever you want to generate attention for. So: an overcomplicated way of automating advertising - optimized posting to convince LLMs to talk about whatever you want to advertise.

    I’ve always said this kind of SEO was always going to happen; Google is at fault for letting the search-optimized result and the best result for what the user is actually asking for drift apart. We’re now going to see either LLMs selling whatever this tactic gets used on, or essentially a sort of adblock built into LLM training and search APIs to keep it from working - to make LLMs less likely to fall for native advertising/astroturfing.



    So AI is a nice new technological tool in a big toolbox, not a technological and business revolution that would justify the stock market valuations around it, the investment money sunk into it, or the huge amount of resources (such as electricity) it uses.

    Specifically for Microsoft, there doesn’t really seem to be any area where MS’s core business value for customers gains from adding AI, in which case this “AI everywhere” strategy is an incredibly shit business choice that just burns money and damages brand value.

    It’s a shiny new tool that is really powerful and flexible, and everyone is trying to cram it everywhere. Eventually, most of those attempts will collapse in failure, probably causing a recession, and afterward the useful use cases will become part of how we all do things. AI is now where the internet was in the late 80s - just beyond the point where it’s only some academics fiddling with it in research labs, but not in any way a mature technology.

    Most gaming PCs from the 2020s can run a model locally, though it might need to be a pruned one - so maybe a little farther along.