Looks great. Personally, I have been using a "cache" function to cache the output of expensive commands. It has helped me iterate faster on pipelines that call into APIs.
function cache {
    local cmd="$*"
    # base64 output can contain "/" and newlines, which would break the
    # file path, so strip newlines and map "/" and "+" to safe characters.
    local name
    name=$(printf '%s' "$cmd" | base64 | tr -d '\n' | tr '/+' '_-')
    if [[ -f ./cache/$name.exit ]] && [[ $(cat "./cache/$name.exit") -eq 0 ]]; then
        echo "cached" >&2  # report cache hits on stderr so they don't mix with the output
    else
        mkdir -p ./cache
        eval "$cmd" > "./cache/$name.out"
        echo $? > "./cache/$name.exit"
    fi
    cat "./cache/$name.out"
}
cache ls /
I like this. I currently do the "plumbing" the OP refers to using files. That has generally worked for me, but I want to give this tool a test run and see whether it's faster than my current approach and this cache command.
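For concreteness, the file-based plumbing I mean looks something like this (the paths and stage commands are just illustrative stand-ins): each stage reads the previous stage's output file and writes its own, so any failed stage can be re-run on its own without repeating the earlier ones.

```shell
#!/bin/sh
set -e
mkdir -p work

# Stage 0: fetch raw data (stubbed here with printf instead of a real API call).
printf 'b\na\nc\nb\n' > work/raw.txt

# Stage 1: sort the raw records.
sort work/raw.txt > work/sorted.txt

# Stage 2: drop adjacent duplicates from the sorted file.
uniq work/sorted.txt > work/deduped.txt

cat work/deduped.txt
```

If stage 2 is wrong, I can tweak it and re-run just that line against work/sorted.txt, which is the same iteration speedup the cache function buys.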