83 Matching Annotations
  1. Jun 2025
  2. May 2025
    1. It isn't strictly necessary, but set -euxo pipefail turns on a few useful features that make bash shebang recipes behave more like normal, linewise just recipes: set -e makes bash exit if a command fails. set -u makes bash exit if a variable is undefined. set -x makes bash print each script line before it's run. set -o pipefail makes bash exit if a command in a pipeline fails. This last option is bash-specific, so it isn't turned on in normal linewise just recipes.
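
      As a minimal sketch of where this goes (the recipe name and commands below are made up, not from the annotated page), a bash shebang recipe in a justfile might look like:

      ci:
          #!/usr/bin/env bash
          set -euxo pipefail
          # -e: stop if make fails; -u: error on undefined variables; -x: echo each line
          make all
          # pipefail: this line fails if `make test` fails, even though its output goes through tee
          make test | tee test.log
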
    1. #!/usr/bin/env npx ts-node
      // TypeScript code

      Whether this always works on macOS is unknown. There could be some magic with node installing a shell command shim (thanks to @DaMaxContext for commenting about this). This doesn't work on Linux because Linux passes everything after env in the shebang line to it as a single argument, instead of splitting on spaces. Or it doesn't work on Linux if the node command shim isn't present (not confirmed that's how it works, but in any case, in my testing it doesn't work in Linux Docker containers). Either way, npx ts-node gets treated as a single executable name that has a space in it, which obviously won't work, as that's not an executable.
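
      One way to see why: when a script with that shebang is run on Linux, the kernel hands everything after /usr/bin/env to env as one argument, roughly (the script name foo.ts is a made-up example):

      /usr/bin/env 'npx ts-node' ./foo.ts

      so env looks for a program literally named "npx ts-node". A commonly cited workaround, assuming an env that supports -S (GNU coreutils 8.30+ or the BSD env shipped with macOS) to split the rest of the line into separate arguments, is:

      #!/usr/bin/env -S npx ts-node
      // TypeScript code
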
  3. Apr 2025
  4. Sep 2024
  5. Sep 2023
  6. Aug 2022
  7. Jul 2022
  8. Jun 2021
    1. Since looping over the positional parameters is such a common thing to do in scripts, for arg defaults to for arg in "$@". The double-quoted "$@" is special magic that causes each parameter to be used as a single word (or a single loop iteration). It's what you should be using at least 99% of the time.
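
      A minimal sketch of the two equivalent forms (the script name and arguments are made up):

      #!/usr/bin/env bash
      # `for arg` with no `in ...` list is the same as `for arg in "$@"`
      for arg; do
          printf 'got: %s\n' "$arg"
      done

      Invoked as ./args.sh "one two" three, this prints two lines (got: one two, then got: three), because the quoted "$@" keeps each positional parameter as one word, spaces and all.
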
    1. There is one very important reason for enabling job control to be useful inside scripts: the side-effect it has of placing background processes in their own process groups. This makes it much, much easier to send signals to them and their children with one simple command: kill -<signal> -$pgid. All other ways of dealing with signaling entire trees of processes either involve elaborate (sometimes even recursive) functions, which are often bug nests, or risk killing the parent in the process (no pun intended).
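
      A minimal sketch of that pattern (the worker command is hypothetical):

      #!/usr/bin/env bash
      set -m                           # enable job control: each background job gets its own process group

      some_worker --spawn-children &   # hypothetical command that forks its own children
      pgid=$!                          # with job control on, the job's PGID equals the leader's PID

      sleep 10
      kill -s TERM -- "-$pgid"         # a negative PID signals the whole process group at once
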
    1. while (( "$#" )); do
         case "$1" in
           -a|--my-boolean-flag)
             MY_FLAG=0
             shift
             ;;
           -b|--my-flag-with-argument)
             if [ -n "$2" ] && [ ${2:0:1} != "-" ]; then
               MY_FLAG_ARG=$2
               shift 2
             else
               echo "Error: Argument for $1 is missing" >&2
               exit 1
             fi
             ;;
           -*|--*=) # unsupported flags
             echo "Error: Unsupported flag $1" >&2
             exit 1
             ;;
           *) # preserve positional arguments
             PARAMS="$PARAMS $1"
             shift
             ;;
         esac
       done

       # set positional arguments in their proper place
       eval set -- "$PARAMS"
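
       For example, a script built around this parser (parse.sh here is a made-up name) could be invoked as:

       ./parse.sh -a --my-flag-with-argument value pos1 pos2

       after which MY_FLAG is 0, MY_FLAG_ARG is value, and the eval set -- "$PARAMS" line restores pos1 and pos2 as "$1" and "$2".
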
  9. May 2021
    1. For filter-branch, using pipelines like git ls-files | grep -v ... | xargs -r git rm might be a reasonable workaround but can get unwieldy and isn't as straightforward for users; plus those commands are often operating-system specific (can you spot the GNUism in the snippet I provided?)
  10. Apr 2021
    1. can be easily invoked directly from shell prompt or script

      Can't expect / unbuffer / etc. (whatever this is attempting to contrast itself with) be easily invoked directly from shell prompt or script too??

      Okay, I guess you have to know more about how expect is invoked to understand what they mean. One glance at the examples, comparing them, and all becomes clear:

      #!/bin/sh
      empty -f -i in -o out telnet foo.bar.com
      empty -w -i out -o in "ogin:" "luser\n"
      

      I didn't realize that expect is required/expected (no pun intended) to be used in scripts with its own shebang line:

      #!/usr/bin/expect
      
      spawn telnet foo.bar.com 
      expect ogin {send luser\r}
      

      That does make it less easy/normal to use expect within a shell script.

      I was coming to the expect project from/for the unbuffer command, which, by contrast, is quite easy to include/use in a shell script -- almost the same as empty, in fact. (It seems like almost a mismatch to have the unbuffer command in the expect toolkit, then. Or is the expect command the only odd one out in that toolkit?)
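
      For comparison, unbuffer drops straight into an ordinary shell pipeline (the wrapped command and log file below are just placeholders):

      #!/bin/sh
      # unbuffer runs the command under a pseudo-terminal, so the command keeps
      # line-buffering its output instead of block-buffering it into the pipe
      unbuffer some_slow_command | tee output.log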

  11. Mar 2021
  12. Feb 2021
  13. Nov 2020
    1. The potential problem: if second_task fails, third_task will not run, and execution will continue to the next line of code - next_task, in this example. This may be exactly the behavior you want. Alternatively, you may be intending that if second_task fails, the script should immediately exit with its error code. In this case, the best choice is to use a block - i.e., curly braces:

       first_task && {
           second_task
           third_task
       }
       next_task

       Because we are using the -e option, if second_task fails, the script immediately exits.
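
       A small self-contained sketch of that behavior, with true and false standing in for the tasks:

       #!/usr/bin/env bash
       set -e

       true && {
           false                  # stands in for second_task failing
           echo "third_task"      # never runs
       }
       echo "next_task"           # never runs either: with -e, the failure above exits the script
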
    1. It starts truncating its output (shortening strings with ...) once you pipe that output into grep. That is quite unacceptable. When I am checking in a script whether something is inhibited, I should have all possible information available, and not have to consider whether a string that is perfectly readable on a wide terminal will get truncated when piped into a tool.
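
      The mechanism behind this kind of behavior is usually a tty check: a tool formats output differently when stdout is not a terminal. A generic shell illustration of that check (not the specific tool being complained about here):

      if [ -t 1 ]; then
          echo "stdout is a terminal: print full, human-readable strings"
      else
          echo "stdout is a pipe or file: some tools switch format here, e.g. truncating with ..."
      fi
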
  14. Oct 2020
  15. Jun 2020
  16. May 2020
  17. Apr 2020
  18. Feb 2020
  19. Dec 2019
  20. Nov 2019
  21. Sep 2019
  22. Feb 2017