• 0 Posts
  • 7 Comments
Joined 1 year ago
Cake day: June 9th, 2023


  • Not what they did on the surface (limiting source to only customers); that’s allowed by the GPL. But they went beyond that, which imo makes them non-compliant.

    1. RH will cancel your access/agreement if you share the GPL’d source with others. That’s directly forbidden by section 6 of the GPLv2. RH is free to cancel your agreement when they want, but not because you exercised your rights under the GPL.

    2. Once your agreement is canceled, you also lose access to the matching source for other GPL’d packages installed on your system. RH could offer other methods to be in compliance, but as far as I know, they have not.


  • Again, less than half of RHEL is even software released under the GPL.

    I would be completely shocked if this were true. I’m calling BS here.

    I used to be my company’s primary contact for our Red Hat TAM for almost 13 years. Our TAMs were very proud to claim that all of RHEL was FOSS software, licensed under the GPL or sometimes other FOSS licenses.

    I spun up a RHEL 9.2 instance and ran:

    $ sudo dnf list --all | wc -l
    6671
    $ dnf info --all | grep "^License .*:.*GPL.*" | wc -l
    4344
    $ python -c "print(4344/6671 * 100)"
    65.11767351221705
    

    So 65% of RHEL 9’s packages are under a GPL license.

    Much of the software that is GPL was authored by Red Hat themselves. According to the text of the GPL itself, Red Hat is not required to distribute the code to the totality of the RHEL distribution or even to more than half the code.

    Half?!? Again, where are these mysterious numbers coming from?

    It doesn’t matter whether Red Hat authored those packages or not. What matters is whether they were distributed under a GPL license. If you’re claiming that Red Hat multi-licensed the GPL’d packages that they exclusively wrote so they don’t have to comply with the GPL, please point those out to me (or at least a few), so I can check them out.


  • To solve your DRY problem, you may not realize it, but you can generate target rules using the built-in eval and foreach functions plus a user-defined newline macro. Think of it as a preprocessor step.

    For example:

    # This defines a new-line macro.  It must have two blank lines.
    define nl
    
    
    endef
    
    # Generate two rules for ansible playbooks:
    $(eval $(foreach v,1 2,\
    .PHONY : ansible.run-playbook$v $(nl)\
    \
    ansible.run-playbook$v : ensure-variables cleanup-residue | $$(ansible.venv)$(nl)\
    ansible.run-playbook$v :;\
    	... $(nl)\
    ))
    

    I winged it a bit for you, but hopefully I got it right, or at least right enough you get what I’m doing with this technique.
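
    For reference, here’s roughly what that $(eval ...) call should hand back to the parser after expansion (the $$(ansible.venv) collapses to a single $ once eval expands it):

    .PHONY : ansible.run-playbook1
    ansible.run-playbook1 : ensure-variables cleanup-residue | $(ansible.venv)
    ansible.run-playbook1 :; ...
    
    .PHONY : ansible.run-playbook2
    ansible.run-playbook2 : ensure-variables cleanup-residue | $(ansible.venv)
    ansible.run-playbook2 :; ...
    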


  • You may like an approach I came up with some time ago.

    In my included file that’s common among my Makefiles:

    # Ensure the macro named is set to a non-empty value.
    varchk_call = $(if $($(1)),,$(error $(1) is not set from calling environment))
    
    # Ensure all the macros named in the list are set to a non-empty value.
    varchklist_call = $(foreach v,$(1),$(call varchk_call,$v))
    

    At the top of a Makefile that I want to ensure certain variables are set before it runs:

    $(call varchklist_call,\
            INSTDIR \
            PACKAGE \
            RELEASE \
            VERSION)
    

    I usually do these checks in sub-Makefiles to ensure someone didn’t break the top level Makefile by not passing down a required macro.
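
    If you want to see the failure mode for yourself, here’s a quick self-contained sketch (the file path and the INSTDIR/VERSION names are made up for the demo):

    ```shell
    # Write a minimal Makefile that uses the two macros above.
    cat > /tmp/varchk-demo.mk <<'EOF'
    varchk_call = $(if $($(1)),,$(error $(1) is not set from calling environment))
    varchklist_call = $(foreach v,$(1),$(call varchk_call,$v))
    
    $(call varchklist_call,INSTDIR VERSION)
    
    all: ; @echo ok
    EOF
    
    # With both macros set, the build proceeds and prints "ok":
    make -f /tmp/varchk-demo.mk INSTDIR=/opt VERSION=1.0
    
    # With one missing, make stops immediately with the $(error ...) message:
    make -f /tmp/varchk-demo.mk VERSION=1.0 || true
    ```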


  • I’ve written hundreds (thousands?) of GNU Makefiles over the past 30 years and never needed to unconditionally run particular targets before all others. GNU Make is a rule-based language. What you’re attempting is to impose an imperative programming model onto a rule-based one, and you’re going to run into trouble coding in a model that isn’t the tool’s native one.

    Can you explain what you mean by “check the environment”, and why you’d need to do that before anything else?

    For example, in the past I’ve wanted to determine whether (and which) particular command was installed, so I have this near the top of my Makefile:

    container_command_defaults = podman docker
    container_command_default_paths := $(shell command -v $(container_command_defaults))
    
    ifndef container_command
      container_command = $(firstword $(container_command_default_paths))
      ifeq ($(container_command),)
        $(error You must have docker or podman installed)
      endif
    endif
    

    Using the := operator with $(shell ...) runs a command while GNU Make is initially parsing your Makefile. Normally the := assignment operator is antithetical to a rule-based language, so you want to limit its use as much as possible, but unusual exceptions do exist.
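
    As a contrived illustration of the timing difference (variable names are made up):

    # := expands the right-hand side once, while the Makefile is parsed:
    parse_time := $(shell date +%s)
    
    # = defers expansion, so the shell command would re-run at every reference:
    every_time = $(shell date +%s)
    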

    I’m also unclear what you mean by “ensure variables are set”. What kind of variables?

    The above snippet shows how you can check if a makefile variable is set when the Makefile is first parsed, if not, declare an error and exit. (The same approach works for environment variables too.)
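
    For an environment variable specifically, you can sketch the same parse-time check with $(origin) (DEPLOY_ENV is a made-up name):

    ifeq ($(origin DEPLOY_ENV),undefined)
      $(error DEPLOY_ENV must be set in the calling environment)
    endif
    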

    Preparing a particular layout ahead of time is not the best approach. I’d suggest a given layout is nothing more than dependencies that should be declared as such.

    Also, unconditionally running specific targets or rules can lead to trouble later as your Makefile grows. You may eventually add targets that, say, provide information about the build’s state or run checks or tests. You wouldn’t want those targets to go off and build an entire tree of directories for you or take other unnecessary actions.

    If you want to ensure certain directories are present, add them as order-only prerequisites for those targets, after the | character. For example:

    build_directory ?= build
    build_make = $(MAKE) ...
    targets = ...
    
    all: FORCE | $(build_directory)
    	$(build_make) $(targets)
    
    $(build_directory):
    	mkdir -p -- '$@'
    
    # Empty target with no prerequisites, so "all" always runs its recipe.
    FORCE:
    

    Even though I’ve been writing GNU Makefiles for decades, I still am learning new stuff constantly, so if someone has better, different ways, I’m certainly up for studying them.


  • Since being forced to use this terrible communication method in my teams and groups, I’ve been copy-and-pasting good Q&A threads into text files that I push to an enterprise GitHub repo for permanent storage. That way, other engineers and I can use GitHub’s search, or clone the repo locally, grep it, and even contribute back with PRs. Sometimes from there it turns into a wiki, but that’s pretty rare. My approach is horribly inefficient and so much stuff is still lost, but it’s better than Discord’s search or dealing with Confluence.
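
    The workflow is nothing fancy; something like this, with made-up paths and file names:

    ```shell
    # One plain-text file per saved thread, grouped by topic.
    mkdir -p qa-archive/make
    
    cat > qa-archive/make/order-only-prereqs.txt <<'EOF'
    Q: How do I make sure a build directory exists before a target runs?
    A: Declare it as an order-only prerequisite, after the | character.
    EOF
    
    # Later, anyone with a clone can grep the whole archive:
    grep -ril "order-only" qa-archive
    ```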