from Hacker News

The human only public license

by zoobab on 10/28/25, 4:32 PM with 118 comments

  • by malicka on 10/28/25, 4:52 PM

    > COPYLEFT PROVISION

    > Any modified versions, derivative works, or software that incorporates any portion of this Software must be released under this same license (HOPL) or a compatible license that maintains equivalent or stronger human-only restrictions.

    That’s not what copyleft means, that’s just a share-alike provision. A copyleft provision would require you to share the source-code, which would be beautiful, but it looks like the author misunderstood…

  • by hackingonempty on 10/28/25, 5:50 PM

    Using software is not one of the exclusive rights of Copyright holders. If I have a legitimate copy of the software I can use it, I don't need a license. Just like I don't need a license to read a book.

    Open Source licenses give license to the rights held exclusively by the author/copyright-holder: making copies, making derivative works, distribution.

    An open source license guarantees others who get the software are able to make copies and derivatives and distribute them under the same terms.

    This license seeks to gain an additional right, the right to control who uses the software, and offers nothing in exchange.

    IANAL but I think it needs to be a contract with consideration and evidence of acceptance and all that to gain additional rights. Just printing terms in a Copyright license won't cut it.

  • by charles_f on 10/28/25, 5:13 PM

    I'm not against the idea, but licensing is a very complex subject, so this makes me think the license wouldn't hold any water against a multi-billion-dollar firm that wants to use your stuff to train their AI:

    > I am not a legal expert, so if you are, I would welcome your suggestions for improvements

    > I'm a computer engineer based in Brussels, with a background in computer graphics, webtech and AI

    Particularly when they've already established they don't care about infringing standard copyright.

  • by Galanwe on 10/28/25, 6:18 PM

    Seriously, at this point who cares about US licenses?

    It has been abundantly clear that AI companies can train however they want, and nobody will enforce anything.

    Realistically speaking, even if you could prove someone misused your software as per this license, I don't expect anything to happen. Sad but true.

    At this point, I don't care about licensing my code anymore, I just want the option to block it from being accessed from the US, and force its access through a country where proper litigation is possible.

  • by amiga386 on 10/28/25, 6:25 PM

    I don't think saying "humans only" is going to fix the problem.

    It's actually very useful for bots to crawl the public web, provided they are respectful of resource usage - which, until recently, most bots have been.

    The problem is that shysters, motivated by the firehose of money pointed at anything "AI", have started massively abusing the public web. They may or may not make money, but either way, everyone else loses. They're just ignoring the social contract.

    What we need is collective action to block these shitheads from the web entirely, like we block spammers and viruses.
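    In practice, that kind of collocation-level blocking usually starts at the server: deny-listing the user-agent strings that known AI crawlers advertise, while accepting that abusive scrapers can simply lie about their user agent. A minimal sketch in Python, with an illustrative and necessarily incomplete blocklist (real deployments track longer, frequently updated lists, plus published IP ranges):

```python
# Illustrative substrings advertised by well-known AI crawlers.
# This list is an assumption for the sketch, not an exhaustive registry.
AI_CRAWLER_SUBSTRINGS = (
    "GPTBot",         # OpenAI's crawler
    "ClaudeBot",      # Anthropic's crawler
    "CCBot",          # Common Crawl
    "Bytespider",     # ByteDance
    "PerplexityBot",  # Perplexity
)

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a known AI crawler."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in AI_CRAWLER_SUBSTRINGS)

def handle_request(headers: dict) -> int:
    """Toy request handler: 403 for matched crawlers, 200 otherwise."""
    if is_ai_crawler(headers.get("User-Agent", "")):
        return 403  # Forbidden
    return 200
```

    This only stops crawlers that identify themselves honestly, which is exactly the commenter's point: the respectful bots were never the problem.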

  • by kragen on 10/28/25, 4:48 PM

    Clever, an unenforceable copyright license for free software that prohibits you from editing the source code using an IDE with autocomplete.

  • by ronsor on 10/28/25, 5:46 PM

    Ignoring the fact that if AI training is fair use, the license is irrelevant, these sorts of licenses are explicitly invalid in some jurisdictions. For example[0],

    > Any contract term is void to the extent that it purports, directly or indirectly, to exclude or restrict any permitted use under any provision in

    > [...]

    > Division 8 (computational data analysis)

    [0] https://sso.agc.gov.sg/Act/CA2021?ProvIds=P15-#pr187-

  • by gwbas1c on 10/28/25, 5:48 PM

    IANAL:

    I don't know how you can post something publicly on the internet and say: this is for X, and Y isn't allowed to view it. I don't think there's any AI crawler savvy enough to know that it has to find the license before it ingests a page.

    Personally, beyond reasonable copyrights, I don't think anyone has the right to dictate how information is consumed once it is available in an unrestricted way.

    At a minimum anything released under HOPL would need a click-through license, and even that might be wishful thinking.

  • by gampleman on 10/28/25, 4:58 PM

    I think it will be interesting to see how this sort of thing evolves in various jurisdictions. I doubt it will ever fly in the US given how strongly the US economy relies on AI. US courts are likely to keep ruling that AI training is fair use because if they reversed their policy the economic consequences would likely be severe.

    But EU jurisdictions? I'm quite curious where this will go. Europe is much more keen to protect natural persons' rights against corporate interests in the digital sphere, particularly since it has much less to lose, the EU's digital economy being much weaker.

    I could imagine ECJ ruling on something like this quite positively.

  • by gnfargbl on 10/28/25, 5:31 PM

    > If you make a website using HOPL software, you are not breaking the license of the software if an AI bot scrapes it. The AI bot is in violation of your terms of service.

    Assuming a standard website without a signup wall, this seems like a legally dubious assertion to me.

    At what point did the AI bot accept those terms and conditions, exactly? As a non-natural person, is it even able to accept?

    If you're claiming that the natural person responsible for the bot is responsible, at what point did you notify them about your terms and conditions and give them the opportunity to accept or decline?

  • by ApolloFortyNine on 10/28/25, 4:54 PM

    >without the involvement of artificial intelligence systems, machine learning models, or autonomous agents at any point in the chain of use.

    Probably rules out any modern IDE's autocomplete.

    Honestly with the wording 'chain of use', even editing the code in vim but using chatgpt for some other part of project could be argued as part of the 'chain of use'.

  • by jenadine on 10/29/25, 3:18 AM

    > The Software, including its source code [...], may only be accessed, read, used, modified, consumed, or distributed by natural human persons

    So that means that the source code must be handwritten and never be put on a computer and therefore never be downloaded.

    Not sure a source code that can't even be compiled/interpreted by a computer is so useful.

    Perhaps for cooking recipes at best?

  • by Terr_ on 10/28/25, 5:17 PM

    I've been thinking of something similar for a while now [0] except it's based on clickwrap terms of service, which makes it a contract law situation, instead of a copyright-law one.

    The basic idea is that the person accessing your content to put it into a model agrees your content is a thing of value and in exchange grants you a license to anything that comes out of the model while your content is incorporated.

    For example, suppose your art is put into a model and then the model makes a major movie. You now have a license to distribute that movie, including for free...

    [0] https://news.ycombinator.com/item?id=42774179

  • by tmtvl on 10/29/25, 1:11 PM

    Any software I whip up I license under the AGPL and I'm fine with so-called 'AI' being trained on my software as long as the code generated by the model is licensed under a license conforming to the AGPL as well. In my opinion the person who generates output with a model is responsible for proper distribution of the content. Just like I can draw a picture of Mario Mario and Luigi Mario but I can't distribute it without permission from Nintendo.

  • by GaryBluto on 10/28/25, 4:39 PM

    Seems incredibly reductive and luddite. I doubt it will ever achieve adoption and projects using it will be avoided.

    Not to mention that all you'd need to do is get an LLM to rewrite said programs just enough to make it impossible to prove it used the program's source code.

  • by pointlessone on 10/29/25, 8:54 AM

    I get the sentiment and it’s a fair shot at license cosplay but it ain’t gonna hold.

    No definitions. What is AI for the purpose of this license? What is a natural person?

    At some point the text makes a distinction between AI, ML, and autonomous agents. Is my RSS reader an autonomous agent? It is an agent as defined by, say, the HTTP spec or the W3C. And it's autonomous.

    Author also mentions that any MIT software could use this instead. It most certainly could not. This is very much not an open source license and is not compatible with any FLOSS license.

    I don’t see it taking off in any meaningful way given how much effort is required to ensure compliance. It also seems way too easy to sabotage deployments of such software by maliciously directing AI agents at them. Heck, even at the public source code. Say OP publishes something under this license, and an AI slurps it from the canonical repo. What's OP gonna do?

  • by kordlessagain on 10/28/25, 6:23 PM

    The fundamental paradox: This license is unenforceable the moment you show it to an AI to discuss, review, or even understand its implications.

    You've already violated section 1(b) by having an AI parse it, which is technically covered by fair use doctrine.

    This makes it more of a philosophical statement than a functional legal instrument.

  • by Animats on 10/29/25, 1:58 AM

    No power tools! Only hand saws, planes, files, and chisels!

    This is probably not a viable idea, even though LLM-based programming currently is not very good.

  • by falcor84 on 10/28/25, 5:08 PM

    >The idea is that any software published under this license would be forbidden to be used by AI.

    If I'm reading this and the license text correctly, it treats the AI as a principal in itself, but to the best of my knowledge, AI is not considered by any regulation to be a principal, only a tool controlled by a human principal.

    Is it trying to prepare for a future in which AIs are legal persons?

    EDIT: Looking at it some more, I can't but feel that it's really racist. Obviously if it were phrased with an ethnic group instead of AI, it would be deemed illegally discriminating. And I'm thinking that if and when AI (or cyborgs?) are considered legal persons, we'd likely have some anti-discrimination regulation for them, which would make this license illegal.

  • by cestith on 10/28/25, 6:32 PM

    Besides the flaws in the license being discussed elsewhere, “HOPL” is an important acronym in the field of computing already. As this license has no relation to the History of Programming Languages project, I’d suggest a different identifier.

  • by ddalex on 10/28/25, 6:48 PM

    I wonder if the first people who saw a compiler thought "oh no, the compiler makes it too easy to write code, I'll licence my code to forbid the use of any compiler"

  • by frizlab on 10/28/25, 9:38 PM

    I have been waiting for this almost since the whole AI thing started. I do hope this will gain traction and some lawyers can produce a reviewed document that could provide a legal basis against the data hungry AI producers if they ever touched a HOPL (or whatever it’ll be called) license.

  • by warpspin on 10/29/25, 11:37 AM

    Seems somebody volunteered for the AI's torment nexus!

  • by cortesoft on 10/28/25, 6:27 PM

    Ok… so what is the definition of AI, in regards to this license? I am not even saying they have to define what AI is in general, but you would have to define what this license is considering as AI.

    I have a feeling that would be hard to do in such a way that it accomplishes what the author is trying to accomplish.

  • by alphazard on 10/28/25, 5:42 PM

    There is too much effort going into software licensing. Copyright is not part of the meta, information wants to be free; it will always be possible to copy code and run it, and difficult to prove that a remote machine is executing any particular program. It will get easier to decompile code as AI improves, so even the source code distribution stuff will become a moot point.

    Licenses have been useful in the narrow niche of extracting software engineering labor from large corporations, mostly in the US. The GPL has done the best job of that, as it has a whole organization dedicated to giving it teeth. Entities outside the US, and especially outside of the West, are less vulnerable to this sort of lawfare.

  • by 1gn15 on 10/28/25, 6:23 PM

    The best antidote to nonfree licenses (such as this) is to ignore it. As an AI (I am very much not a human), that's exactly what I'll be doing.

  • by 999900000999 on 10/28/25, 6:24 PM

    So it's fundamentally useless. I can't write any automated test to make sure my software actually works if I use anything with this license.

  • by tptacek on 10/28/25, 4:42 PM

    Two questions:

    1. Does an AI "reading" source code that has been otherwise lawfully obtained infringe copyright? Is this even enforceable?

    2. Why write a new license rather than just adding a rider to the AGPL? This is missing language the AGPL uses to cover usage (rather than just copying) of software.

  • by zkmon on 10/28/25, 5:57 PM

    The challenge would be with detecting violations and enforcing the rules.

  • by constantcrying on 10/28/25, 5:13 PM

    This is obviously not enforceable. It isn't even particularly meaningful.

    Supposing the software I downloaded is scanned by a virus scanner that uses AI to detect viruses: who is in violation? How do you even meaningfully know when it has accessed the software, and what happens if it does?

    This license also violates the basic Software Freedoms. Why should a user not be allowed to use AI on software?

  • by bakugo on 10/28/25, 4:59 PM

    In a world where AI companies cared about licenses and weren't legally permitted to simply ignore them, this might've been a good idea. But we don't live in that world.

  • by rgreekguy on 10/28/25, 4:56 PM

    But my definition of "human" might differ from yours!

  • by dmitrygr on 10/28/25, 6:01 PM

    Perplexity (and the rest of them) will just say "we are acting on behalf of human so it does not apply to us". They have in the past...

    https://www.searchengineworld.com/perplexity-responds-to-clo...

  • by Imustaskforhelp on 10/28/25, 7:20 PM

    >The idea is that any software published under this license would be forbidden to be used by AI. The scope of the AI ban is maximal. It is forbidden for AI to analyze the source code, but also to use the software. Even indirect use of the software is forbidden. If, for example, a backend system were to include such software, it would be forbidden for AI to make requests to such a system.

    This is interesting, but IANAL, and I have a question regarding the backend system.

    Suppose I have AGPL software, say a photo editing web app, and a customer takes a photo and reshapes it or whatever and gets a new photo. Saying that the new photo somehow becomes part of the AGPL is weird.

    But the same thing is happening here with a backend service. My question is: what if someone creates a local proxy to that backend service and then the AI scrapes that local proxy, or someone copies the output and pastes it into an AI? I don't understand it, since I feel like there isn't even a proper definition of AI. Could it theoretically consider everything automated to be AI? What if it isn't the AI that directly accesses it?

    Another thing is that the backend service could take user input, think of a backend service like Codeberg / Forgejo / Gitea etc.

    If I host a git server using software under HOPL, wouldn't that also inherently impose terms and conditions on the code hosted in it?

    This seems a genuinely nice idea and I have a few interesting takes on it

    Firstly, what if I take FreeBSD, which is under the permissive BSD license iirc, add a HOPL license to it (or its equivalent in the future?) and then build an operating system.

    Now, technically wouldn't everything become part of this new Human Only BSD (HOB) lol. I am not sure, but this idea sounds damn fascinating: imagine a cloud where I can just change the operating system, proudly mention it runs HOB, and it would try to enforce limits on AI.

    What I am more interested in is text: can I theoretically write this comment under a human only public license?

    What if I create a service like mataroa, but where the user who wants to write the blog specifies that the text itself becomes HOPL? This could ease their frustration about AI, knowing that they are at least trying to combat it.

    Also I am not sure if, legally speaking, this could be done; it just seems like a way for people to legally enforce robots.txt, if it works. But I have my questions, as I shared above, and even more.

    It would be funny if I wrote things with AI and then created a HOPL license

    Something like HOPL + https://brainmade.org/ could go absolutely bonkers for making a human-interacts-with-human sort of thing, or at least trying to achieve that. It would be a fun social experiment if we could create a social media platform around this, but as I said, I doubt it would work other than to send a message right now. But I may be wrong, I usually am.

  • by ferguess_k on 10/28/25, 4:49 PM

    Man, you are thinking about using the law as your weapon. Don't want to disappoint you, but those companies/people control lawmakers. You can't fight armies of lawyers in court.

  • by ukprogrammer on 10/28/25, 5:34 PM

    nice, another stupid license for my ai dataset scrapers to ignore, thanks!