Training is transformative use.
A gigabyte of linear algebra that can rap about the Silmarillion is plainly not just copying. It’s not even large enough to contain a meaningful fragment of every book that shaped it.
Protecting fair use is more important than any hate-boner toward chatbots.


Oh fuck off.
A site cannot send someone a video and then be mad that they still have it.


All both of them?
Or like a hundred million?
Sony pulls this shit with every new PlayStation, through the ingenious and difficult process of not making enough. “PS3 sold out at launch! New shipment sold out again! And again!” Meanwhile they’d moved fewer total units than the 360 in the same timeframe, but Microsoft made one big shipment instead of three small ones.
Mask on, mask off.

On some level - just keep being your old fake self.
Fuck are they gonna do about it? They don’t exist anymore.


Are you responding to someone who said, ‘all use is forced?’ Because I didn’t.
If you want this in your browser, great, there’s these things called extensions. The fact a goddamn LLM is standard, but DownThemAll is barely tolerated, speaks to completely fucked priorities at Mozilla.


‘We pushed shit on everyone and a bunch of them are using it’ never ever ever vindicates pushing shit on people. People use shit that’s pushed on them! Do you know how low that bar is?!
I am a vocal defender of the underlying technology, and this is still some bullshit.


Which is awful marketing, because of how real people respond to that.


The bubble continuing ensures the current paradigm soldiers on, meaning hideously expensive projects shove local models into people’s hands for free, because everyone else is doing that.
And once it bursts, there’s gonna be an insulating layer of dipshits repeating “guess it was nothing!” over the next decade of incremental wizardry. For now, tolerating the techbro cult’s grand promises of obvious bullshit means the unwashed masses are interpersonally receptive to cool things happening.
Already the big boys have pivoted toward efficiency instead of raw speed at all costs. The closer they get to a toaster matching current tech with a model trained for five bucks, the better. I’d love for VCs to burn money on experimentation instead of scale.


This is the real future of neural networks. Trained on supercomputers - runs on a Game Boy. Even in comically large models, the majority of weights are negligible, and local video generation will eventually be taken for granted.
Probably after the crash. Let’s not pretend that’s far off. The big players in this industry have frankly silly expectations. Ballooning these projects to the largest sizes money can buy has been illustrative, but DeepSeek already proved LLMs can be dirt cheap. Video’s more demanding… but what you get out of ten billion weights nowadays is drastically different from six months ago. A year ago, video models barely existed. A year from now, the push toward training on less and running on less will presumably be a lot more pressing.
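The “majority of weights are negligible” point is easy to demonstrate with magnitude pruning on a toy layer. This is a sketch only, with the illustrative assumption that Gaussian random weights stand in for a trained layer; real model weights are heavier-tailed, which makes pruning even more forgiving than this worst-ish case:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "layer": a dense weight matrix and an input activation vector.
# (Assumption: random Gaussian weights as a stand-in for a trained layer.)
W = rng.normal(size=(512, 512))
x = rng.normal(size=512)

# Magnitude pruning: zero out the half of the weights closest to zero.
threshold = np.quantile(np.abs(W), 0.50)
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0)

dense = W @ x
sparse = W_pruned @ x

# Cosine similarity between the dense and half-pruned outputs:
# dropping 50% of the weights barely moves the layer's output direction.
cos = float(np.dot(dense, sparse) / (np.linalg.norm(dense) * np.linalg.norm(sparse)))
print(cos)
```

Half the parameters gone, and the layer’s output barely changes direction, which is the whole premise behind shrinking supercomputer-trained models down to toaster size.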


Excellent news. It’s ridiculous that matrix algebra was turned into proprietary software.


Oh it’s definitely not just Google. Apple’s been this fucked since 2007. But since this is the Android community, it’s helpful to stay on-message.


Shatter this corporation.
All software beneath my web browser is also free, and none of it would dare.
Low friggin’ bar.


Search for Lightning / 8-step / 4-step / turbo LoRAs.
Try Chroma.


“Where do you think we are right now?”
Downvoters don’t know what “fair use” means. Or do, but would rather work backwards from kneejerk opposition to an outcome.
A robot read every book in the library. That’s what libraries are for. If it can’t reproduce any book more closely than a Wikipedia summary, and serves a different purpose - that’s a protected work.