
Looks great. Personally, I have been using a "cache" function to cache the output of expensive commands. This has helped me iterate faster on pipelines that call into APIs.

  function cache {
      cmd="$*"
      # base64 output can contain '/' and newlines, which would break
      # the file path, so translate/strip them to get a safe cache key
      name=$(echo "$cmd" | base64 | tr -d '\n' | tr '/+' '_-')
      mkdir -p ./cache
      # Re-run unless a previous run succeeded (exit status 0)
      if [[ ! -f "./cache/$name.exit" ]] || [[ $(cat "./cache/$name.exit") -ne 0 ]]; then
          eval "$cmd" > "./cache/$name.out"
          echo $? > "./cache/$name.exit"
      fi
      cat "./cache/$name.out"
  }

  cache ls /


I use something similar, created by a friend of mine, called 'bash-cache': https://bitbucket.org/dimo414/bash-cache/src/default/

It hooks specific functions in Bash and keys off of their arguments.
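For anyone curious, the function-hooking idea can be sketched roughly like this. To be clear, this is a from-scratch sketch, not bash-cache's actual interface: the `memoize` helper, the cache layout, and `CACHE_DIR` are all made up for illustration. It shadows a function with a wrapper that keys a file cache off the function name plus its arguments:

```shell
#!/usr/bin/env bash

# The original (expensive) function; sleep stands in for real work.
slow_double() {
    sleep 0
    echo $(( $1 * 2 ))
}

# Wrap an existing function with an argument-keyed file cache.
memoize() {
    local fn=$1
    # Save the original definition under a new name, then shadow it.
    eval "orig_$(declare -f "$fn")"
    eval "$fn() {
        local key=\"\$(echo \"$fn \$*\" | base64 | tr -d '\n' | tr '/+' '_-')\"
        local dir=\${CACHE_DIR:-/tmp/fncache}
        mkdir -p \"\$dir\"
        if [[ ! -f \"\$dir/\$key\" ]]; then
            orig_$fn \"\$@\" > \"\$dir/\$key\"
        fi
        cat \"\$dir/\$key\"
    }"
}

CACHE_DIR=$(mktemp -d)
memoize slow_double
slow_double 21   # runs the real function and caches the result
slow_double 21   # served from the cache
```

Callers keep using the original function name, so nothing downstream has to change.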


I like this. I currently do the "plumbing" the OP refers to using files. That has generally worked for me, but I want to give this tool a test run and see if it's faster than my current approach and this cache command.

