by zoobab on 10/28/25, 4:32 PM with 118 comments
by malicka on 10/28/25, 4:52 PM
> Any modified versions, derivative works, or software that incorporates any portion of this Software must be released under this same license (HOPL) or a compatible license that maintains equivalent or stronger human-only restrictions.
That’s not what copyleft means, that’s just a share-alike provision. A copyleft provision would require you to share the source code, which would be beautiful, but it looks like the author misunderstood…
by hackingonempty on 10/28/25, 5:50 PM
Open source licenses grant permission to exercise the rights held exclusively by the author/copyright-holder: making copies, making derivative works, distribution.
An open source license guarantees others who get the software are able to make copies and derivatives and distribute them under the same terms.
This license seeks to gain additional rights, namely the right to control who uses the software, and offers nothing in exchange.
IANAL, but I think it needs to be a contract with consideration and evidence of acceptance and all that to gain additional rights. Just printing terms in a copyright license won't cut it.
by charles_f on 10/28/25, 5:13 PM
> I am not a legal expert, so if you are, I would welcome your suggestions for improvements
> I'm a computer engineer based in Brussels, with a background in computer graphics, webtech and AI
Particularly when they've already established they don't care about infringing standard copyright
by Galanwe on 10/28/25, 6:18 PM
It has been abundantly clear that AI companies can train on whatever they want, and nobody will enforce anything.
Realistically speaking, even if you could prove someone misused your software as per this license, I don't expect anything to happen. Sad but true.
At this point, I don't care about licensing my code anymore, I just want the option to block it from being accessed from the US, and force its access through a country where proper litigation is possible.
by amiga386 on 10/28/25, 6:25 PM
It's actually very useful for bots to crawl the public web, provided they are respectful of resource usage - which, until recently, most bots have been.
The problem is that shysters, motivated by the firehose of money pointed at anything "AI", have started massively abusing the public web. They may or may not make money, but either way, everyone else loses. They're just ignoring the social contract.
What we need is collective action to block these shitheads from the web entirely, like we block spammers and viruses.
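The spam analogy suggests a concrete mechanism: shared DNS blocklists (DNSBLs), the same infrastructure mail servers already use to reject known spammers. A minimal sketch of a lookup against a hypothetical shared blocklist of abusive crawler IPs (the zone name `ai-bots.example-dnsbl.invalid` is made up for illustration):

```python
import socket

def reverse_ip(ip: str) -> str:
    """DNSBLs key on the reversed-octet form of an IPv4 address."""
    return ".".join(reversed(ip.split(".")))

def is_blocklisted(ip: str, zone: str = "ai-bots.example-dnsbl.invalid") -> bool:
    """Return True if `ip` is listed in the (hypothetical) blocklist zone.

    A DNSBL publishes an A record for listed addresses; a failed
    lookup (NXDOMAIN) means the address is not listed.
    """
    query = f"{reverse_ip(ip)}.{zone}"
    try:
        socket.gethostbyname(query)  # any answer at all means "listed"
        return True
    except socket.gaierror:
        return False
```

A web server could run a check like this per request and return 403 for listed crawlers; as with spam, the hard part is the collective maintenance of the shared list, not the lookup itself.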
by kragen on 10/28/25, 4:48 PM
by ronsor on 10/28/25, 5:46 PM
> Any contract term is void to the extent that it purports, directly or indirectly, to exclude or restrict any permitted use under any provision in
> [...]
> Division 8 (computational data analysis)
by gwbas1c on 10/28/25, 5:48 PM
I don't know how you can post something publicly on the internet and say "this is for X; Y isn't allowed to view it." I don't think there's any kind of AI crawler that's savvy enough to know that it has to find the license before it ingests a page.
Personally, beyond reasonable copyrights, I don't think anyone has the right to dictate how information is consumed once it is available in an unrestricted way.
At a minimum anything released under HOPL would need a click-through license, and even that might be wishful thinking.
by gampleman on 10/28/25, 4:58 PM
But EU jurisdictions? I'm quite curious where this will go. Europe is much keener to protect natural persons' rights against corporate interests in the digital sphere, particularly since it has much less to lose, the EU digital economy being much weaker.
I could imagine ECJ ruling on something like this quite positively.
by gnfargbl on 10/28/25, 5:31 PM
Assuming a standard website without a signup wall, this seems like a legally dubious assertion to me.
At what point did the AI bot accept those terms and conditions, exactly? As a non-natural person, is it even able to accept?
If you're claiming that the natural person responsible for the bot is responsible, at what point did you notify them about your terms and conditions and give them the opportunity to accept or decline?
by ApolloFortyNine on 10/28/25, 4:54 PM
Probably rules out any modern IDE's autocomplete.
Honestly, with the wording 'chain of use', even editing the code in vim but using ChatGPT for some other part of the project could be argued to be part of the 'chain of use'.
by jenadine on 10/29/25, 3:18 AM
So that means the source code must be handwritten, never put on a computer, and therefore never downloaded.
Not sure source code that can't even be compiled/interpreted by a computer is so useful.
Perhaps for cooking recipes at best?
by Terr_ on 10/28/25, 5:17 PM
The basic idea is that the person accessing your content to put it into a model agrees your content is a thing of value and in exchange grants you a license to anything that comes out of the model while your content is incorporated.
For example, suppose your art is put into a model and then the model makes a major movie. You now have a license to distribute that movie, including for free...
by tmtvl on 10/29/25, 1:11 PM
by GaryBluto on 10/28/25, 4:39 PM
Not to mention that all you'd need to do is get an LLM to rewrite said programs just enough to make it impossible to prove it used the program's source code.
by pointlessone on 10/29/25, 8:54 AM
No definitions. What is AI for the purpose of this license? What is a natural person?
At some point the text makes a distinction between AI, ML, and autonomous agents. Is my RSS reader an autonomous agent? It is an agent as defined by, say, the HTTP spec or the W3C. And it’s autonomous.
Author also mentions that any MIT software could use this instead. It most certainly could not. This is very much not an open source license and is not compatible with any FLOSS license.
I don’t see it taking off in any meaningful way given how much effort is required to ensure compliance. It also seems way too easy to sabotage deployments of such software by maliciously directing AI agents at them. Heck, even at the public source code. Say OP publishes something under this license and an AI slurps it from the canonical repo. What’s OP gonna do?
by kordlessagain on 10/28/25, 6:23 PM
You've already violated section 1(b) by having an AI parse it, which is technically covered by the fair use doctrine.
This makes it more of a philosophical statement than a functional legal instrument.
by Animats on 10/29/25, 1:58 AM
This is probably not a viable idea, even though LLM-based programming currently is not very good.
by falcor84 on 10/28/25, 5:08 PM
If I'm reading this and the license text correctly, it treats the AI as a principal in itself, but to the best of my knowledge, AI is not considered by any regulation to be a principal, but rather only a tool controlled by a human principal.
Is it trying to prepare for a future in which AIs are legal persons?
EDIT: Looking at it some more, I can't but feel that it's really racist. Obviously if it were phrased with an ethnic group instead of AI, it would be deemed illegally discriminating. And I'm thinking that if and when AI (or cyborgs?) are considered legal persons, we'd likely have some anti-discrimination regulation for them, which would make this license illegal.
by cestith on 10/28/25, 6:32 PM
by ddalex on 10/28/25, 6:48 PM
by frizlab on 10/28/25, 9:38 PM
by warpspin on 10/29/25, 11:37 AM
by cortesoft on 10/28/25, 6:27 PM
I have a feeling that would be hard to do in such a way that it accomplishes what the author is trying to accomplish.
by alphazard on 10/28/25, 5:42 PM
Licenses have been useful in the narrow niche of extracting software engineering labor from large corporations, mostly in the US. The GPL has done the best job of that, as it has a whole organization dedicated to giving it teeth. Entities outside the US, and especially outside of the West, are less vulnerable to this sort of lawfare.
by 1gn15 on 10/28/25, 6:23 PM
by 999900000999 on 10/28/25, 6:24 PM
by tptacek on 10/28/25, 4:42 PM
1. Does an AI "reading" source code that has been otherwise lawfully obtained infringe copyright? Is this even enforceable?
2. Why write a new license rather than just adding a rider to the AGPL? This is missing language the AGPL uses to cover usage (rather than just copying) of software.
by zkmon on 10/28/25, 5:57 PM
by constantcrying on 10/28/25, 5:13 PM
Suppose the software I downloaded is scanned by a virus scanner which uses AI to detect viruses. Who is in violation? How do you even meaningfully know when it has accessed the software, and what happens if it does?
This license also violates the basic software freedoms. Why should a user not be allowed to use AI on software?
by bakugo on 10/28/25, 4:59 PM
by rgreekguy on 10/28/25, 4:56 PM
by dmitrygr on 10/28/25, 6:01 PM
https://www.searchengineworld.com/perplexity-responds-to-clo...
by Imustaskforhelp on 10/28/25, 7:20 PM
This is interesting, but at the same time IANAL, and I have a question regarding the backend systems.
Suppose I have AGPL software, think a photo editing web app, and a customer takes a photo and reshapes it or whatever and gets a new photo; saying that the new photo somehow becomes part of the AGPL is weird,
but the same thing is happening here. If a backend service uses it, my question is: what if someone creates a local proxy to that backend service and the AI scrapes that local proxy, or someone copies the output and pastes it into an AI? I don't understand it, since I feel like there isn't even a proper definition of AI, so could it theoretically cover everything automated? What if it isn't the AI which directly accesses it?
Another thing is that the backend service could have user input, think a backend service like Codeberg / Forgejo / Gitea etc.
If I host a git server using software which uses HOPL, wouldn't that also inherently somehow enforce terms and conditions on the code hosted on it?
This seems a genuinely nice idea and I have a few interesting takes on it
Firstly, what if I take FreeBSD, which is under the permissive BSD license IIRC, try to add a HOPL license to it (or its equivalent in future?), and then build an operating system?
Now, technically wouldn't everything be part of this new Human-Only BSD (HOB), lol? I am not sure, but this idea sounds damn fascinating: imagine a cloud where I could just change the operating system, proudly mention it runs on HOB, and it would try to enforce limits on AI.
What I am more interested in is text: can I theoretically write this comment under a human-only public license?
What if I create a service like Mataroa, but where the user who wants to write a blog specifies that the text itself becomes HOPL? This could limit their sense of frustration regarding AI, knowing that they are trying to combat it.
Also, I am not sure if legally speaking this thing could be done; it just seems like a way for people to legally enforce robots.txt, if it works, but I have my questions, as I shared, and even more.
It would be funny if I wrote things with AI and then created a HOPL license
Something like HOPL + https://brainmade.org/ could go absolutely bonkers for making a human-interacts-with-human sort of thing, or at least trying to achieve that. It would be a fun social experiment if we could create a social media site trying to do this, but as I said, I doubt it would work other than as a way to send a message right now. But I may be wrong, I usually am.
by ferguess_k on 10/28/25, 4:49 PM
by ukprogrammer on 10/28/25, 5:34 PM