For a while now, I’ve used this script to set up npm on a new machine:
```shell
# Declare where you want your global packages installed to.
npm config set prefix '~/.local/share/npm'

# Ensure the directory exists.
mkdir -p ~/.local/share/npm

# Change `.zshrc` to `.bashrc` if appropriate, but consider instead
# switching to zsh.
SHELL_PROFILE=~/.zshrc

# Make sure the executables folder is in your $PATH.
echo 'export PATH="$HOME/.local/share/npm/bin:$PATH"' >> $SHELL_PROFILE

# Make sure global node modules can find their dependencies correctly (think `yo`).
echo 'export NODE_PATH="$NODE_PATH:$HOME/.local/share/npm/lib/node_modules"' >> $SHELL_PROFILE

# Make the changes take effect.
source $SHELL_PROFILE

# Install npm as local user.
npm i -g npm
```
This does away with the need to install global node modules (such as `npm` itself) with `sudo`; instead of installing under `/usr/local` (or wherever the default location is for your OS), which requires super-user permissions, this setup installs those packages to a folder under your home directory.
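The reason this works is nothing npm-specific: the shell resolves commands left-to-right through `$PATH`, so any executable in the prefix's `bin` directory wins over a system copy. A minimal sketch (using a throwaway temp directory and a hypothetical `fake-npm` script to stand in for a globally installed package):

```shell
# Simulate the per-user prefix layout with a throwaway directory.
PREFIX=$(mktemp -d)
mkdir -p "$PREFIX/bin"

# A stand-in for a globally installed package's executable.
printf '#!/bin/sh\necho "local npm"\n' > "$PREFIX/bin/fake-npm"
chmod +x "$PREFIX/bin/fake-npm"

# Prepending the prefix's bin directory makes the shell find it first,
# no sudo required.
PATH="$PREFIX/bin:$PATH"
fake-npm   # prints "local npm"
```

This is exactly what the `export PATH=...` line in the setup script does for the real `~/.local/share/npm/bin` directory.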
However, it turns out that there are further optimizations you can make to your npm workflow. For instance, you can change the default values shown when you run `npm init`:
```shell
npm config set init.author.name "Ryan Kennedy"
npm config set init.author.email "firstname.lastname@example.org"
npm config set init.author.url "https://www.rmkennedy.com"
npm config set init.license "MIT"
npm config set init.version "0.0.0-develop"
```
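Under the hood, `npm config set` just writes `key=value` pairs to your per-user `~/.npmrc`, so after running the commands above that file would contain something like:

```ini
init.author.name=Ryan Kennedy
init.author.email=firstname.lastname@example.org
init.author.url=https://www.rmkennedy.com
init.license=MIT
init.version=0.0.0-develop
```

You could also edit that file directly; the CLI commands are just a convenience that keeps you from worrying about the format.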
I’m still trying to figure out how to improve npm’s performance; I can’t use Yarn yet because it doesn’t quite respect all the options in