When laziness is efficient: Make the most of your command line
A terminal is never just a terminal. An elaborate prompt can mean someone digs deeply into optimizing the tools she uses, while the information it contains can give you an idea of what kind of engineering she’s done. What you type into the command line can tell you about environment variables, hidden configs, and OS defaults you never knew about. You can make it speak shorthand only known to your terminal and you. And all of it can help you work more efficiently and effectively.
Bash (a term used for both the Unix shell and the command language; I’ll be using the second meaning in this post) is usually a skill mentioned only in job descriptions for site reliability engineers and other ops jobs. Those same job posts often ask for automation skills, which is a positive way of asking for someone who’s professionally lazy in a way that results in efficiency. The good news is that developers can also learn a few tricks from the land of ops to make their days easier and their work better.
Your own personal(ized) terminal
There are lots of ways to customize your command line prompt and terminal to make you more efficient at work. We’ll start with possibly the most powerful one: meet ~/.bashrc and ~/.bash_profile.
This file exists under several different names, depending on your OS and what you’re trying to accomplish, and it can hold a lot of things that can make your life easier: shorter aliases for common commands, your custom PATH, Bash functions to populate your prompt with environment information, history length, command line completion, default editors, and more. With a little observation of your terminal habits (and a little knowledge of Bash, the command language used in many terminals), you can put all kinds of things in here that will make your life easier.
Which file you use depends on your OS. This post gives a rundown on the purposes and tradeoffs of the two files. If you use a Mac, though, use ~/.bash_profile. Run source ~/.bash_profile once you’ve saved your changes, so they’re live in your terminal (or just close your terminal window and open a new one).
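To make this concrete, here’s a sketch of what a starter file might contain. Everything in it is illustrative rather than required: the aliases, the ~/bin directory, and the history sizes are all choices you’d adapt to your own habits.

```shell
# A sketch of a starter ~/.bash_profile; every alias, path, and value
# here is illustrative rather than required.

# Shorter names for common commands
alias gc='git commit'
alias gco='git checkout'

# Put a personal scripts directory on PATH (assumes you keep one at ~/bin)
export PATH="$HOME/bin:$PATH"

# Keep more command history than the default
export HISTSIZE=10000000
export HISTFILESIZE=10000000
```

Once it’s saved, source the file (or open a new terminal) and the aliases and variables are live.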
What else should you put in your beautifully customized new file? Let’s start with aliases.
When automating things in ops work, I watch what operations I do more than a couple of times, make notes on what I did, and put those on a list of likely script ideas. Once it’s clear I’ll be doing it again and again, I know it’s worth the time to put a solution into code. You can do the same thing with your own habits. What commands are you typing all the time? What values are you frequently using? These can all be aliases.
For example, git commit and git checkout can become gc and gco (or whatever matches your mental map of abbreviations). But you can go further than that by aliasing long commands with lots of flags and arguments. Here’s how to make one:
alias preferredAlias='commandToAlias'
alias is the Bash command here (you can make an alias directly on the command line too, though it will only be available for that session until you close that terminal). preferredAlias is your nice, short name for commandToAlias, the longer, more cumbersome command you’re typing all the time. Don’t put a dollar sign before the alias name, don’t put spaces around the =, and don’t forget the single straight quotes around the command you’re aliasing. You can also chain commands together using &&. Ever sat next to someone whose command line navigation was completely opaque because they’d optimized their work into a flurry of short aliases? Now you can be that person, too.
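As a quick sketch of both patterns (the names and the commands being wrapped are just examples, not recommendations):

```shell
# A short name for a long, flag-heavy command
alias ll='ls -lAhF'

# Two commonly paired commands chained with && under one name
alias pullup='git fetch && git pull'
```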
Here are a couple I use:
mkcd() { mkdir "$1" && cd "$1"; } (consolidating a common pair of operations; this one has to be a function rather than an alias, because Bash aliases don’t expand positional parameters like $1, which takes the first argument, in this case the new directory you want to cd into)
tfplan='terraform init && terraform plan' (preventing a common mistake for me; this can be used to chain any two commonly paired commands)
If you frequently work across different OSes (varying flavors of Linux, Mac OS), you can go a little further by creating multiple tailored dotfiles that assign slightly differing commands that achieve the same thing to the same alias. No more remembering the minute differences that only come up every month or two—it’s the same couple of characters wherever you are. If you’re prone to misspelling commands (looking at you, gerp), you can alias those too.
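One way to wire that up is to branch on uname in your main dotfile and pull in an OS-specific file. This is a sketch; the ~/.aliases_* file names are my own convention for the example, not a standard.

```shell
# In ~/.bash_profile: pick up OS-specific aliases from a separate file.
# The ~/.aliases_* names are just this example's convention.
case "$(uname -s)" in
  Darwin) os_aliases="$HOME/.aliases_macos" ;;
  Linux)  os_aliases="$HOME/.aliases_linux" ;;
  *)      os_aliases="" ;;
esac
if [ -n "$os_aliases" ] && [ -f "$os_aliases" ]; then
  . "$os_aliases"
fi

# A typo-forgiving alias works the same everywhere
alias gerp='grep'
```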
Now let’s look at another capability of dotfiles: customizing your prompt.
A constant source of truth on the command line
Your terminal prompt is one of the places you can be kindest to yourself, putting what you need in there so you don’t have to type pwd all the time or wonder exactly how long ago you typed that fateful command. At a minimum, I suggest adding a timestamp with minutes to it; that way, if you need to backtrack through recent work to tie cause to effect, you can precisely anchor an action’s time with minimal work. Beyond that, I also suggest adding your working directory and current git branch. My go-to tool for setting this up inexpensively is EzPrompt, which lets you drag and drop your desired prompt elements and returns the Bash you need to add to ~/.bash_profile. It’s a good, simple start when you’re first cultivating your dotfiles.
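If you’d rather hand-write it, here’s a rough sketch of that kind of prompt. The parse_git_branch helper is a common hand-rolled pattern, not EzPrompt’s exact output, and the layout is just one possibility.

```shell
# Prompt pieces for ~/.bash_profile: HH:MM timestamp, working directory,
# and current git branch. parse_git_branch is a hand-rolled helper,
# not EzPrompt's exact output.
parse_git_branch() {
  git branch 2>/dev/null | sed -n 's/^\* //p'
}
export PS1='[\A] \w $(parse_git_branch)\$ '
```

Outside a git repository the helper prints nothing, so the branch slot simply stays empty.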
If you want to get a little more involved, you can try something like Powerline, which looks slick and offers more involved status information. And if you want to roll your own, self-educate about how to work with colors in the terminal and the elements you can add to your prompt. There’s a whole galaxy of options out there, and Terminals Are Sexy provides guidance to some of the constellations you can explore. Hand-crafted customization is a great way to get used to Bash syntax. If you’re looking to do something more complex with a lengthier command, Pipeline provides an interactive environment to help you refine your output, showing you what your command produces as you edit it.
Once you’ve gotten your file how you like it, do the extra step of creating a personal dotfiles repo. Keep it sanitized (so no keys, tokens, or passwords), and you’ll have safe access to your familiar prompt and whatever other settings you love at every new computer you work on.
You’ve made your prompt your friend. Next, let’s look at making what comes after that into an ally too.

The just-enough approach to learning Bash
Bash can be a lot, even when you deal with it every day (especially if some of the codebase comes from someone with an aversion to comments). Not every dev must know Bash, but every dev will benefit from knowing at least some. If nothing else, it helps you understand exactly what’s happening when you use some long, pasted wget command to install a new program.
The good news is that, with a few strategies, you can navigate most of the Bash you’re likely to encounter without having to become an expert. One of my favorite tools is Explainshell. It can be difficult to get a good, succinct, and completely relevant explanation for what a sample Bash command means, particularly when you get four or five flags deep into it. Man pages are always a good place to start, but Explainshell is an excellent complement. Paste in your command, and the site breaks down each piece so that you actually know what that long string of commands and flags from that seven-year-old Q&A does.
Sometimes, half the work of navigating the command line is figuring out what subcommands are available. If you’re dealing with a complex tool (looking at you, AWS CLI) and find yourself referring to the docs more often than you’d like, take a minute to search for an autocomplete feature for it. Sometimes autocomplete is available as a separate but still official package; other times, a third party has made their own complementary tool. That’s one of the joys of the command line: you will rarely encounter a problem that’s unique to you, and there’s a good chance someone has been annoyed into action and fixed it.
If you end up continuing to work with the command line (and I hope you do), getting acquainted with pipes demystifies a lot of this work. A pipe in Linux is when you use the | symbol to chain together commands, piping output from one to another. In Unix and Linux, each tool was designed to do one thing well, and these individual tools can then be chained together as needed to satisfy more complex needs. This is a strategy I use a lot, particularly when I need to create and sift through output in the terminal.
My most common pipe involves adding | grep -i "$searchTerm" after a command with long output I’d prefer not to pick through manually, if I’m only searching for one thing. (You can use -A and -B to add lines of context after and before each match, respectively, with the number of lines you want as a parameter after each flag. See the grep man page to learn more.)
Also useful: piping the output to less, which is better if I do want to scroll through the whole output, or at least navigate it and search within the open file, using /$searchTerm, n to see the next match, and N to see the previous. You can also use cut or awk to manipulate the output, which is particularly useful if you need to create a file of that output with a very specific format. And if you find yourself parsing JSON output much, getting acquainted with jq can save you some time.
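To make the pattern concrete, here’s a sketch of those pipes on synthetic input; the text being piped is invented for the demo, and in real use you’d put your own long-winded command on the left side of the |.

```shell
# Keep only matching lines, case-insensitively
printf 'Error: disk full\ninfo: ok\nERROR: timeout\n' | grep -i 'error'

# Pull out the first space-separated field of each line with cut
printf 'alpha 1\nbeta 2\n' | cut -d ' ' -f 1    # prints "alpha" then "beta"

# awk can do the same and more, like arithmetic on a field
printf 'alpha 1\nbeta 2\n' | awk '{ print $2 * 10 }'    # prints 10 then 20
```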
Let’s look at some of the other conveniences the command line offers. sudo !! repeats your previous command with sudo pasted in front of it. (The !! is Unix/Linux shorthand for “the previous command” and can be used in other situations too.) So if you ran something fairly involved but forgot that it needed root-level permissions, just use sudo !!. Similarly useful: !$, which gives you the value of the last argument of the previous command, so ls ~/Desktop and cd !$ would show you the files in ~/Desktop and then move you to that directory. And if you need to return to your previous directory and don’t remember the whole path, just type cd - to jump back to the directory you were in before your last cd.
Faster navigation in text
Here’s a seemingly simple thing I learned a few years ago that regularly startles even long-tenured engineers. Did you know that you can click into the middle of a line in your terminal? In terminals that support it (the macOS Terminal does; many Linux terminal emulators don’t), Alt-click (Option-click on a Mac) will move your cursor to where you need to go. It still requires moving your hands off the keyboard, so it’s a little clunky compared with some keyboard navigation. But it’s a useful tool, and oddly impressive—I’ve stunned people by doing that in front of them and then got the joy of sharing it with them. Now you know it too.
The keyboard shortcut methods of moving your cursor can be equally impressive, though. You can get a lot of mileage out of terminal keyboard shortcuts (to say nothing of making your work a little easier). You can jump to the beginning or end of the line with ctrl-A or ctrl-E, cut from your cursor to the beginning of the line with ctrl-U, or delete the previous word with ctrl-W. Here’s Apple’s long list of keyboard shortcuts for the terminal, which generally work on a Linux command line too. I suggest picking a couple you want to adopt, writing them on a sticky note and putting it on your monitor, and making yourself do it the new way until it feels natural. Then move to the next commands you want to commit to muscle memory, and soon enough, you too can be very efficient… if very confusing to watch for those who don’t work this way. (But then you get to do the kind thing of teaching them the thing you just learned, and the cycle continues.)
Time travel, terminal style
If you only need to refer to your last command, !! or just arrowing up and down are great, straightforward options. But what if you need to dig deeper into the past? To search your terminal history, type ctrl-R and then begin typing. Want to see the whole thing? Just type history.
The Mac default is 500 history entries, which is not that much for a heavily used terminal. You can check your history length with echo $HISTFILESIZE. Want to increase its retention? Time to edit ~/.bash_profile again. Just set HISTSIZE and HISTFILESIZE to a very large number—10000000 is a good option. Add export HISTSIZE=10000000 and export HISTFILESIZE=10000000 to ~/.bash_profile (and don’t forget to source ~/.bash_profile again or open a new terminal window for it to take effect). For more details on the difference between these two variables, check out the accepted answer here.
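Collected in one place, the history section of your dotfile might look like this sketch. The two optional Bash-specific lines at the end (deduplication and appending rather than overwriting the history file) are common additions, not requirements.

```shell
# History settings for ~/.bash_profile; values are a matter of taste
export HISTSIZE=10000000        # commands kept in memory for the session
export HISTFILESIZE=10000000    # lines kept in ~/.bash_history on disk

# Optional, Bash-specific extras:
# skip duplicates and space-prefixed commands, and drop older duplicates
export HISTCONTROL=ignoreboth:erasedups
# append to the history file instead of overwriting it between sessions
shopt -s histappend
```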
Now that your history is (more) infinite, it might be good to know how to clean it up. It lives at ~/.bash_history, which means you can delete it entirely with rm ~/.bash_history.
But let’s look at some of the other information accessible via the command line: environment variables.
Your terminal’s hidden values: revealed!
Environment variables can come from many different places. Some are just part of your OS; you can see some common ones here. Others may be put in place via ~/.bash_profile, when you set them yourself in the terminal, or via config or other files run on your system. It’s quick and easy to type echo $varName in the terminal and see if a specific value is set, but what if you don’t know what variables have been set? That’s where set, printenv, and env come in.
These three programs overlap some in output but aren’t identical. Here’s a quick rundown:
set is more complete and will include variables you’ve set in addition to the ones inherent to your environment.
printenv and env offer similar output of built-in environment variables, but env has more robust capabilities beyond printenv’s simple display purposes, including running a program in a modified environment. The accepted answer here provides some deep history about the existence of both commands and how and why they differ.
You’ll likely get what you need with set, though. The output is longer, which means you’re more likely to need to pipe to grep or less, but it’s also more likely that you’ll find what you’re looking for.
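For example, here’s the difference in a quick sketch; FAVORITE_EDITOR is an invented variable just for the demo.

```shell
# set lists shell variables too, so a plain (unexported) variable shows up
FAVORITE_EDITOR=vim   # hypothetical variable, just for the demo
set | grep -i 'favorite_editor'

# printenv and env only show exported environment variables,
# so greps against them find nothing until the variable is exported
export FAVORITE_EDITOR
printenv | grep -i 'favorite_editor'
```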
Better living through ops skills
You’ve learned how to customize your command line and make it friendlier for troubleshooting. You’ve learned how to unearth surprise values hiding in your local environment variables. You’ve learned some of how to look like a wizard with aliases and keyboard shortcuts. And I bet you can start spreading the good word of ~/.bash_profile. There’s more to Bash and terminal tricks than we’ve laid out here, but it’s yours to discover online—or just ask your friendly local ops engineer out for coffee and ask them their favorite terminal customization. You’ll probably learn more than you expect.
40 Comments
You do realise that alias `mkcd=’mkdir $1 && cd $1’` doesn’t work? It needs to be a function. And you have the wrong kind of quotes as well. You say “don’t forget the single straight quotes around the command you’re aliasing” and then you use smart quotes in your code snippets.
The quotes issue is because of how the blog/font formatted it. It shouldn’t do that, and I’m looking into it right now.
Don’t worry. Sysadmins are used to web CMSes breaking quotes. Never copy and paste from a website. Type (and think) it out.
I completely agree with the post! I’m personally a professionally lazy dev, and I hand tasks off to automation every time I can. It gives me more time to do other tasks and improves overall project maintenance.
> `mkcd='mkdir $1 && cd $1'`
Bash does not expand positional parameters in aliases, but even if it did, leaving them unquoted would still be an awful mistake.
A working alternative would be declaring a function like this:
```
mkcd() {
  mkdir "$*" && cd "$_"
}
```
The main differences are:
* Replacement of the alias by a function.
* Quoting of all arguments.
* Use of all the function arguments as part of the path.
For more information, check this StackOverflow question: https://stackoverflow.com/q/7131670
Post scriptum: I miss Markdown on these comments. For the sake of completeness, I’m going to test some html tricks just in case:
Chill dave
zsh aliases do recognize “$1” et al as arguments. (So does csh/tcsh.) bash aliases are much more restrictive. Functions are better except for the simplest cases.
What an attitude, wow.
Change single quotes to double. You need double quotes when using a variable like $1.
You can also search github for dotfiles repos and glean that way.
I’d use mkcd=$(mkdir $1 && cd $1). It is far easier to type $() than looking for ` on the keyboard. Backticks are one of the characters that isn’t in a standard place on all keyboards, whereas $( and ) are.
Who hurt you, @DavidPostill, oh King of Ackchyually? You do realize you don’t have to be an ass
I really like that you suggest “Hand-crafted customization is a great way to get used to Bash syntax.” That’s cool! I often see beginners struggling with ohmyzsh, without ever understanding it. Hand-crafting it is a very good advice!
But please change
grep -i $searchTerm
to
grep -i "$searchTerm"
and other occurrences of unquoted variables!
The problem with aliases? You go to a different machine which doesn’t have them and you don’t recall the original command.
“export HISTSIZE=10000000” – if you have that many different commands, you should consider not to run all your stuff on a single machine. Use virtualization.
You clearly didn’t read the article…
I feel that people who read
alias $preferredAlias='$commandToAlias'
will be trying to declare aliases with a dollar sign before the name, which isn’t right.
Alt-click doesn’t work for me in RHEL/Gnome/Bash. What a tragedy.
Is ALT+Click to move the cursor an Apple-only feature? It doesn’t work on my Ubuntu gnome-terminal.
“!$, which gives you the value of the first argument of the previous command” should be changed to “last argument of the previous command”.
+1
I’m surprised nobody has mentioned FZF
“Did you know that you can click into the middle of a line in your terminal? Alt-click will move your cursor to where you need to go.”
Is this specific to the Mac terminal? It has no effect in stock xterm/urxvt/LXTerm/Gnome-terminal.
x="cd .." is handy
I first started using computers in the early 90’s, on machines that were old then and ran MS-Dos. I’ve used versions 3.X to 6.22. Command line interface was the standard then, but as soon as I got Win 3.1, I virtually stopped using the command line. There was still a fair amount of troubleshooting I had to do with the Dos prompt, but I avoided as much as possible. That continued through all the jobs I had as a computer tech for 15 years as well as the programming I’ve done during those years until now.
I found that it was an immense amount of work to not just know the command line commands, but to learn them without any sort of reference. Everything was a /h with a more than screen full of text that was hard to read and barely gave any info on why to use one switch over another. And trying to keep the whole file system in your brain just isn’t possible anymore. Even my own data directories are far too complex to remember everything without visual cues. Sure, I can “dir /p” every directory, then CTRL-C to see where I’m going, but opening a window, scrolling to the folder I need and double-clicking on it is much faster and easier.
The only thing I find lazy about the command line is when a fellow software developer doesn’t bother to write a GUI for their utility and resorts to a command line with switches. It’s even worse when they don’t even bother to document what all they are or do. If you can group your options into small sections like the standard “File”, “Edit”, etc. menus, it’s much easier for the user to understand what they are doing, not to mention what the options of the software are. Unless the utility only does one thing and it’s only ever going be be used in a script, it needs to be made user friendly.
I know my views aren’t “modern” in some people’s world, but I did command line 20+ years ago. It was clunky and difficult then, it hasn’t changed significantly since, and I very much prefer to not go back, especially when UI’s aren’t generally that hard to build. I know people will point out the exceptions of when GUI’s are difficult to design for the utility, but they don’t negate the 95% of the time where a GUI is much a better design choice. I’ve had this “discussion” more than once in my +8 years of professional software development. Unfortunately, the people who love command lines have an almost religious affinity for it, so I know my arguments will be dismissed almost out of hand.
I mean, if you spent 1/10th the time on the UI as you did on the functionality, you’d have a great GUI that people could actually use, instead of a CLI that only you know how to use. But what do I know, right?
computercarguy: I used to think exactly the same. And indeed, a nice GUI is something to love and treasure….if you can find one. Microsoft seems intent on sabotaging their operating system and their users. OSX is likewise circling the drain. Android is shit and so is iOS, for various reasons. The Linux desktop experience likewise leaves much to be desired.
The revelation I had, what people had been trying to teach me for so long, is there’s simply NO GUI which can equal the power and flexibility of the command line. I won’t say there never will be, as there most likely will be one invented some day, but we’re nowhere near that point right now.
This was what tripped me up for so long in the world of Linux. I kept waiting for the “day of Linux on the desktop” to arrive, stumbling from one run of the mill GUI to the next, not realizing the enormous amount of power and control I was missing out on.
And indeed, learning the UNIX / Linux shell is breathtakingly infuriating and ridiculous at times, due to the large amount of functionality that’s available plus the low ‘discoverability’ of it, sometimes nonsensical design, combined with the fact that the ecosystem is a great big hodge-podge of stuff thrown together loosely sharing the same philosophy but with small and radical departures here and there, corresponding to the individual developer’s whims and fancies, which means you have to learn more than you otherwise ought to in a sanely designed system with consistent design from the top down–like Windows used to be, when David Cutler designed it that way, before Microsoft wrecked it starting with Windows NT4 and onward.
If you do take the time and effort to go down the Linux path however, dedicating yourself to learning the command line, various common and helpful UNIX / Linux commands, their basic usages and role they fulfill, and start chaining them together, writing scripts, you can build systems of great power and flexibility–pretty much automating your life, in ways that simply **cannot be done** by using existing off the shelf software. Instead of being at the whims of software developers, now you *become* the developer of your own digital ecosystem. The feeling of power, and actual power that comes from this is incredible. You start with ‘bash’ shell scripting then can progress from there, to awk, perl, python, C, etc, and at every step your power grows and your capabilities increase.
If you want to dip your toes in the water and give Linux a try, there’s a million distros (versions) out there, which is confusing for a noob. A good recommendation is Puppy Linux for a small, light, easy to use, complete out of the box distro. I recommend staying away from Debian, Ubuntu, Linux Mint, Fedora, Red Hat, and other ‘big names.’ Check out Funtoo (funtoo.org) later when you want to upgrade to something more customizable. Have fun….
(By the way, I’m a car guy also…I build Cadillac big blocks and Ford 2.3 engines)
I too started in MS-DOS on underpowered hardware and with limited resources for explaining what was available and why I would care. There was no internet (at least not for me) and building a certain level of proficiency on the DOS command line took patience and persistence.
Windows came along, and aside from a handful of word processors/ text editors, I lived mostly in spreadsheets – Lotus 123, Borland Quattro Pro, and ultimately Microsoft Excel. Still no internet, at least at the start, but there were more books available to reference.
Fast forward on this path, along came the f-ing ribbon, which destroyed my proficiency and productivity with Excel, and made for a generally miserable work experience. A lot of clicky-clicky and flicking eyes to find what I already knew existed, but which was seemingly hidden. At this point I am confident that that UI will never work for me, and it doesn’t look like we can ever go back.
A few years ago I decided to commit to returning to my roots and learn Bash, along with all of the great Linux tools. Maybe it’s just me, but what I discovered, aside from regaining the efficiency I used to have, is that I have a much better focus working from the command line – keeping the important things in the front of mind, and not having to move my hand to a mouse, or divert my eyes from my task to find and click on an icon.
One thing I’ve started doing that helps organize your aliases is to have namespaces for them.
For example, I use kubernetes a lot and I have a few aliases that I prefix with “k.”
So now I can type “k.” + tab to get a list of all aliases I have setup.
I’m also a car guy 😉 I built and supercharged/tuned boxer engines for fun! 🙂
Coolest thing I’ve found is Ctrl+R reverse bash history search 🙂
> Just type history
Or, just type ‘h’ after you alias it to history 🙂
and do the same for grep, so now search history with:
h | g 'some command'
I prefer it to ctrl-r, as you see all the results at once (pipe to less if you don’t want to pollute the terminal buffer). With history numbers, then you can use:
% !123
If 123 was the hist number of the command you wanted, and you want to repeat it.
Couple more options for controlling history from .bashrc or equiv:
export HISTCONTROL=ignoreboth:erasedups
# append to the history file, don’t overwrite it
shopt -s histappend
erasedups means if you repeat ‘ls’ 10 times on command line, only one ‘ls’ entry is added to the history file.
The histappend means your history will be persistent between terminal sessions. (Don’t know if shopt is a thing on Macs, it is in Linux).
awesome post! Although not everything might work as described it still gives a lot of info, well done!
I really enjoyed this post! I always like learning new bash skills and there were some new tidbits and reminders in here that I could use. I haven’t seen explainshell before and now it’s bookmarked because I always find reading man pages directly to be a pain.
I have a bunch of related aliases in sets, and tend to add a helper to each set, like: `alias gh='alias | grep git'`
`alias` without args prints all aliases defined in the shell, and I pipe that to grep so that, in this case I get a summary of aliases related to git, and I can refresh my memory of the aliased command 👍
set -o vi
This is good stuff but it assumes that you already understand how the basic UNIX shell commands work. If you don’t, then you have a bit of a hill to climb to understand the techniques described here.
Back in the nineteen seventies there was a very useful computer-based teaching tool called “learn”. Written by one of the original UNIX gurus, it ran on a UNIX system and taught the rudiments of the shell commands. Unfortunately, by the time Linux came along, it had already been dropped from the system and forgotten. So it never found its way into Linux, the version of UNIX that most people now use.
It was written in an old version of C and eventually didn’t even compile, due to changes in the language. Last year I pulled it out of an archive, fixed the syntax errors and produced a Docker build for it to make it easier to install and run.
If you are new to shell commands or you have a colleague who is, you may find that Learn provides a flying start. You can find it here: https://github.com/goblimey/learn-unix.
!$ will repeat the last argument of the previous command, not the first
Shells are great power tools. People who only know how to do things through a GUI, or know a little shell but not really enough to wield it properly, probably can’t understand what they are missing.
However, there is a lot of room for improvement in those old shells like bash. Check out nushell:
https://github.com/nushell/nushell
It’s young, and there are some things I’d like to see done differently, but it’s a big step in the right direction.
esc .
Is my favourite
Actually, there is one way Microsoft cannot be beat, whether by Unix, Mac, or Linux: if you do a lot of testing from the command line inside a program. I want to enter the same complex lines over and over again, WITHIN AN APPLICATION THAT IS NOT TERMINAL, but looks like it. That is, it is text-mode input in a read-process loop.
What I’m talking about here is Python. I start my python program from the terminal: python myprog.py, and type a long command, and maybe mess up only one character in the middle. If I press the up arrow I just get the escape sequence for the up arrow key. But on MS, I get exactly the same behavior on the command line with no preparation in my program. In fact no one has ever told me how I could prepare my program on *X to do this. On MS, every program seems to inherit terminal command line editing seamlessly.
P.S. I am a very power user on *X, always using VI and set -o vi. My preference except for the situation noted above. If I go into heavy testing, have to switch to a MS system to save time.
I’m not sure I understand your problem, but if it is with accessing the history in the python REPL it might simply be a case of needing some additional readline configuration.
I have the following in my ~/.inputrc configuration, and can easily navigate history in the python REPL using my j and k keys. I don’t typically use the other mappings, but testing them now I can confirm they work just as well.
I don’t know if this is what does the trick, but I can’t find any other customizations that I have made which would be relevant.
$if mode=vi
set keymap vi-command
"\e[A": history-search-backward
"\e[B": history-search-forward
k: history-search-backward
j: history-search-forward
set keymap vi-insert
"\e[A": history-search-backward
"\e[B": history-search-forward
"\C-w": backward-kill-word
"\C-p": history-search-backward
"\C-l": clear-screen
$endif
Here is a custom Windows Prompt bat file for Windows 10+. The ANSI.SYS color coding was re-introduced in Windows 10.
https://github.com/EnsignRutherford/Windows-Prompt/tree/main
Winpmt.bat will create a custom prompt for Windows. It also can be installed so it runs first before CMD starts up to have every CMD instance have the same custom prompt.