

Still doable, but you can’t have external RAM, so the lack of RAM is the bigger issue.


It is not a lot, but it is not that hard to extend storage, for example with an external SSD/HDD or a NAS.


Honestly, I look at this and I see a future where everyone runs their own Federated services
I very much doubt it. People in general just don’t care, or, even worse, they think it is a good thing because it keeps the baddies out (it won’t) and will protect them (it won’t).
It seems like very dark times are ahead. Self-hosting is good, but it also gives the other side a bright shining beacon to go after. If they already know everything about 99% of the population, they can spend a lot more resources really getting to know the last percent that actively tries to avoid it.


I used to live in a place with free buses, and you still had to get a card and tap on/off, most likely so they could track which routes are popular.


I don’t know why, but my guess would be: everyone involved knows it is bullshit, the people working there, management, etc… but it gives a convenient loophole to fire anyone who starts to stir something up, “oh, he/she failed the polygraph.”
The people working there know it, so they are more likely to stay in line so they can “pass” their annual test.


In his defense, the polygraph is just pseudo-science bullshit. You “fail” or “pass” depending on what result the person administering it wants. It is just made up.


If they have, then good. I wasn’t sure it was doable with Google’s current signing process. It is then highly unlikely that someone has tampered with them (it is far easier to target the site displaying the “correct” fingerprint).
However, my original point still stands: just because something is open source doesn’t in itself mean that a bad actor can’t tamper with it.


And Signal is open source so, if it did anything weird with private keys, everyone would know
Well, no. At least not by default, since you are running a compiled version of it. Someone could inject code you don’t know anything about before compilation that, for example, leaked your keys.
One way to be more confident no one has would be reproducible builds that you can recreate yourself and then compare the file fingerprints. But I do not think that is possible, at least on Android, as Google holds the signing keys to apps.
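To illustrate what that fingerprint comparison would look like, here is a minimal sketch; the file names are hypothetical and this is not Signal’s or Google’s actual verification tooling, just the general idea of comparing a build you made yourself against the one you were given:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: an APK you compiled yourself from the published
# source, and the one actually distributed to your phone.
local_build = sha256_of("app-built-from-source.apk")
distributed = sha256_of("app-from-store.apk")

if local_build == distributed:
    print("Fingerprints match: the distributed binary is the code you built.")
else:
    print("Fingerprints differ: the builds are not byte-identical "
          "(could be tampering, or just a non-reproducible build).")
```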


Get you hooked to the extreme convenience, much like a drug addict, and then pump up the price or flood every prompt with ads.
There is a big difference between “normal” SaaS and LLMs.
In a normal SaaS you get a lot of benefit from being at scale. Going from 10000 to 1000000 users is not that much harder than going from 1000 to 10000. Once you have your scaling set up, you can just add more servers and/or data centers. But most importantly, the cost per user goes waaay down.
With AI it just doesn’t scale like that; the 500000th user will most likely cost about as much as the 5th. So I don’t think doing a Netflix/Spotify/etc. is going to work unless they can somehow make it a lot cheaper per user. OpenAI fails to turn a profit even on its most expensive tiers.
Edit: to clarify, obviously you get some small benefits from being at scale, like better negotiating power and already having server racks, etc. But a traditional SaaS gets those same benefits as well, plus so much more that an LLM service doesn’t, because the cost per user doesn’t drop.
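To make the scaling point concrete, here is a toy back-of-the-envelope comparison; every number is made up purely for illustration, not a real figure from any provider:

```python
# Toy numbers, purely illustrative: compare per-user cost when fixed
# costs dominate (typical SaaS) vs. when per-user compute dominates
# (LLM inference). None of these figures come from real companies.
def cost_per_user(fixed_monthly: float, marginal_per_user: float, users: int) -> float:
    return fixed_monthly / users + marginal_per_user

for users in (1_000, 10_000, 1_000_000):
    saas = cost_per_user(fixed_monthly=50_000, marginal_per_user=0.05, users=users)
    llm = cost_per_user(fixed_monthly=50_000, marginal_per_user=20.00, users=users)
    print(f"{users:>9} users: SaaS ~ ${saas:.2f}/user, LLM ~ ${llm:.2f}/user")

# The SaaS per-user cost collapses as users grow; the LLM per-user cost
# quickly flattens at the marginal compute cost and stops dropping.
```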
You sound surprised?