32

After administering Unix or Unix-like servers, what tools (command-line preferably) do you feel you cannot live without?

HopelessN00b
  • 54,273

42 Answers

51

GNU screen - essential when you're managing large numbers of systems and don't want to have a dozen terminal windows open.

Murali Suriar
  • 10,406
34

Some I know that I cannot live without...

  • tee - allows simultaneous writing to STDOUT (standard output) and a file. Great for viewing information and logging it for later.

  • top - the task manager of UNIX, gives a great overview of the system.

  • tail -f - allows you to view appended data as a file grows, great for monitoring log files on a server.

  • grep - Global Regular Expression Print, great for searching the system for data in files.

  • df - reports free and used disk space on mounted filesystems.

  • du - reports disk usage of a certain file/directory.

  • less - needed to view man pages! also useful for viewing output of commands in an easily seekable manner.

  • vim/Emacs/nano/pico/ed - whatever your text editor of choice may be, self explanatory of why it's needed.
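To make a couple of these concrete, here is a contrived pipeline (file names invented) that views and logs grep output in one go:

```shell
# Contrived demo: watch matching lines AND keep a copy of them.
printf 'INFO start\nERROR disk full\nINFO done\n' > app.log   # sample "log"

grep -i 'error' app.log | tee errors.txt   # print matches and save them to a file
df -h . > /dev/null                        # free/used space on this filesystem
du -sh . > /dev/null                       # disk usage of the current directory
```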

26

lsof to determine which processes are using a file or directory (useful when trying to figure out what is preventing a filesystem from being unmounted)

netstat to determine which processes are using network connections (especially useful when trying to figure out which daemon is bound to a certain port)

erichui
  • 270
19

Learn all the basic tools, but learn Perl.

Perl is ideal for manipulating text, and since un*x operators live on text files, pipes, input and output, Perl is a great fit.

The added bonus is that Perl is cross-platform, and if you have to do some work on a Windows box you have an easily installable language (just drop a Perl directory on the server) that you already know.

And on that train of thought, get Cygwin as well. If you are a un*x admin and have to work on a Windows box (even your desktop), having ls, rm, grep, sed, tail, etc. saves you a lot of time when switching OSes.

Mark Nold
  • 285
18
  • sed
  • awk

The forgotten grandfathers of modern systems scripting. I know Perl gets most of the love (along with Bash scripting, Python, Ruby, and [insert your favorite scripting language here]), and don't get me wrong, I love Perl. I make use of it almost daily.

But sed and awk should not be forgotten, overlooked, or ignored. For a lot of cases, sed and awk are the best tools for the job. Quick examples are command line filtering with sed, and quick and dirty log processing with awk. Both could be done in Perl, but will require more work and development time.
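A quick sketch of both use cases just mentioned (file contents invented):

```shell
# sed for command-line filtering: strip comments and blank lines from a config.
printf '# comment\nport = 80\n\nhost = db1\n' > app.conf
sed -e 's/#.*$//' -e '/^[[:space:]]*$/d' app.conf

# awk for quick-and-dirty log processing: count hits per status code.
printf 'GET / 200\nGET /x 404\nGET / 200\n' \
  | awk '{ count[$3]++ } END { for (c in count) print c, count[c] }'
```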

13

rsync, especially in concert with ssh. It allows simple efficient copying of files from host to host. How did we ever cope without ssh and rsync? :-)

pQd
  • 30,537
dr-jan
  • 432
12

Face it - sooner or later you'll deal with the network as well. mtr, tcpdump and tshark are really useful for seeing what's happening.

pQd
  • 30,537
12

Netcat.

  • Test if TCP services are listening.
  • Perform transactions against plaintext protocols such as SMTP.
  • Quick insecure data transfers between machines.
  • Telnet client emulation.

The network swiss army knife, as they say.

Dan Carley
  • 26,127
9

I use most of the tools already listed, but here's one no one has touched on yet:

Puppet - system for automating system administration tasks

9

For quick scripts, automation, etc:

  • bash
  • perl

To connect to your *NIX server:

  • OpenSSH (Linux client)
  • PuTTY (Windows client)

Paul
  • 133
6

A couple of handy tools I haven't seen mentioned yet:

  • dstat --nocolor (overview of cpu-, disk-, net-usage)
  • iftop (nice dynamic overview of network traffic)
  • ccze (colour logfiles nicely)
  • ssh tunnels (can be useful once in a while; see the manual; -R)
  • expect (automate interactive, chatty, dialog-style interfaces; nice if you're in a pinch)
asjo
  • 1,318
6

For scripting:

6

Most of the standard ones are included in other answers, so I'll go for non-standard ones:

  • htop — great for process management;
  • pinfo — lynx-like browser for info and man pages.
vartec
  • 6,277
5

ClusterSSH

ClusterSSH controls a number of xterm windows via a single graphical console window to allow commands to be interactively run on multiple servers over an ssh connection.

Tom Feiner
  • 18,598
4

ssh, Vim, htop, su, Python, ls, cd, screen, du, tar :)

4

pv: Displays the progress of long operations that can be redirected. http://www.ivarch.com/programs/pv.shtml

Useful when you want to monitor something that is going to take ages, like copying/compressing a raw block device over the network (which is how I take paranoia backups of my 8 GB netbook before tinkering with anything major like tweaking filesystem settings).

Also: I'll second the votes for ssh, rsync, screen, htop and netcat as mentioned by people above - all of which are more important than pv, but pv had not been mentioned yet. In fact pv is often a useful addition when piping stuff to or from netcat.

4

vmstat 1

Gives you a great overview of system behaviour.
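For reference, a sketch of how to read it (column names assume the procps vmstat found on most Linux systems):

```shell
# One sample per second, three samples, then exit.
# First-glance columns: r (runnable processes), si/so (swap in/out),
# wa (CPU time spent waiting on I/O) -- sustained si/so or high wa is bad news.
vmstat 1 3
```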

3

Most of these tools are made much more powerful by Bash "programmable completion" - you can tab-complete things like command-line options, or the name of a package with "apt-get install". It also limits completion to relevant files - for example, "unzip" will only complete supported archive files.

It really is the mutts - if you have never tried it you probably just need to fiddle your .bashrc:

if [ -f /etc/bash_completion ]; then
    . /etc/bash_completion
fi

Certainly this is true on Ubuntu and Debian. You may need to get the package on some Linux distributions.

Tom Newton
  • 4,251
3

tar pipe!

Piping the output of tar to another utility, to tar running on the same box, or to tar running over SSH is my favorite old-school Unix move for moving files from one place to another.

This also gives you the Windows-style option of copying one folder onto another and ending up with the source's files merged into the destination directory.
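A minimal sketch of both forms (directory and host names invented; only the local one is run):

```shell
# Local tar pipe: preserves permissions, ownership and symlinks.
mkdir -p srcdir destdir
echo 'data' > srcdir/file.txt
( cd srcdir && tar cf - . ) | ( cd destdir && tar xf - )

# The over-SSH variant (illustrative only):
#   tar cf - /etc | ssh otherhost 'cd /backup && tar xf -'
```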

3

zsh as a shell

It is especially efficient with grml.org's extensions/setup.

cstamas
  • 6,917
3

sudo.

Seriously though, tail -f is useful.

3

iotop is a top-like program for monitoring I/O access to your disks.

Emilio
  • 55
3

Some that haven't been mentioned before:

  • head/tail
  • diff
  • pstree
  • tar
  • gzip/bzip
  • watch
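For instance, diff in unified format (sample files invented):

```shell
printf 'alpha\nbeta\ngamma\n'  > v1.txt
printf 'alpha\nBETA\ngamma\n'  > v2.txt
diff -u v1.txt v2.txt || true   # diff exits non-zero when the files differ
```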
CK.
  • 1,173
2

A few things overlooked I wanted to mention.

  • vim -d - split-screen console diff that makes it very easy to see the differences between files
  • pdsh - allows you to easily run a command over as many systems as you want, either serially or in parallel (I am a cluster admin; I can't function without it.)
  • nmon - like top on crack. It gives you a great idea of what is going on on a system on a single screen. You can see disk I/O, network I/O, CPU usage, and memory usage in real time. At the very least, a real fun thing to play with when profiling a system.

Oh, and I forgot to mention: when scripting, I believe you should always use Korn. I hate Korn (not the band. I love the band :-P), but it's literally everywhere. You can take a script and move it between Solaris, AIX and Linux and not have to worry about whether or not the admin had the decency to install Bash.

2

The shutdown command.

user237419
  • 1,653
2

One tool sometimes very handy is nohup. I use it to run scripts that last for a long time using remote SSH clients.
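The shape of it, with a self-contained stand-in for the long-running script (names invented):

```shell
# Real-world form (not run here):
#   nohup ./long_migration.sh > migration.log 2>&1 &
# ...then log out; the job keeps running, immune to SIGHUP.

# Self-contained demo of the same shape:
nohup sh -c 'echo job finished' > job.log 2>&1 &
wait $!    # only so the demo can inspect job.log; normally you'd just log out
```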

2

I love AWK as well as "for" on the command line.

Especially to build up a list of commands I want to run and then execute them all at once.
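The pattern being described, sketched with invented file names:

```shell
# First pass: print the commands you're about to run (a dry run).
for name in one two three; do
  echo "touch demo-$name"
done

# Second pass: drop the echo and actually execute them.
for name in one two three; do
  touch "demo-$name"
done
ls demo-*
```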

Brian G
  • 375
2

  • vi
  • find
  • ssh
  • AWK
  • sed
  • netcat
  • tar
  • ps
Maxwell
  • 5,086

2

man - to read the man pages.

elinks - to check Google, 'cause I sure as hell can't remember everything.

And attention to detail & tenacity, because without them I just waste time.

pjd
  • 131

2

screen is a must, especially with a good .screenrc file. I have it configured to display visually which window I'm in, and I can move between windows with Ctrl+Arrow. For a single ssh session and multiple shells, it is a life saver.
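A minimal .screenrc along those lines (the bindkey escape sequences are assumptions and vary by terminal):

```
# Show the window list on the last line, highlighting the current window.
hardstatus alwayslastline "%{= kw}%-w%{= BW}%n %t%{-}%+w"

# Ctrl+Left / Ctrl+Right to switch windows (sequences differ per terminal).
bindkey "^[[1;5D" prev
bindkey "^[[1;5C" next
```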

Nasko
  • 727

2

Some additional answers can be found in this similar question.

Vagnerr
  • 1,275
1

  • Bash
  • Vim
  • iostat
  • ps
  • top
  • lsof
  • strace
  • tcpdump
  • netstat
  • find
  • grep
  • Perl
  • sed
  • tail
  • dig
  • traceroute

Where possible, prefer the GNU versions of the above over the proprietary versions.

Jason Tan
  • 2,792
1

  • rsync running over ssh to keep things consistent... in multiple directions (-gloptru[n]c)
  • Vim and vimdiff to edit with 'folding' and viewing differences in scripts, logs, etc.
  • Perl and (Ba)sh for scripting and analysis
  • cURL (and maybe Wget) for posting/fetching data from ...
  • Apache to webify them all (or at least create point-n-click admin tools)

ericslaw
  • 1,592
1

Perl and Vim. In that order. Anything else, I can use Perl to emulate somehow.

1

All the standard commands and utilities (Bash, grep, sed, AWK, find, xargs, ssh, Vim, etc.)

  • lsof - awesome in so many ways; I love to use it for finding open ports AND the files associated with that process.
  • screen - for multi-session awesome.
  • tcpdump - it's funny how many application problems are really weird network issues.
  • Ruby - makes more sense to me than Perl; becoming wildly popular for SA work.
  • Chef - configuration management system.
  • Capistrano - ssh in a for loop, but less crappy. And in Ruby.
  • Rake - more sensible than make.

jtimberman
  • 7,665
1

These are the tools I use on a daily basis (as a developer more than a system administrator):

  • zsh
  • lsof
  • ps
  • ack (or grep)
  • find
  • svn
  • Python
  • tar
  • which
  • fortune (a guy has to keep his sanity somehow)
1

Simple, basic, but still essential:

ps - report a snapshot of the current processes.

free - Display amount of free and used memory in the system.

w - Show who is logged on and what they are doing.

gimel
  • 1,203
1

pkill or ps for killing processes.

If you want to kill any process with a given name, under a certain directory, or matching any string you require, you can:

kill `ps -ef | grep <blah> | grep -v grep | awk '{print $2}'`

or, more simply, pkill -f <blah>.
1

nmon

Haven't seen anyone mention this yet.

The nmon tool is designed for AIX and Linux performance specialists to use for monitoring and analyzing performance data, including:

  • CPU utilization
  • Memory use
  • Kernel statistics and run queue information
  • Disk I/O rates, transfers, and read/write ratios
  • Free space on file systems
  • Disk adapters
  • Network I/O rates, transfers, and read/write ratios
  • Paging space and paging rates
  • CPU and AIX specification
  • Top processes
  • and more

It can be run in file mode, which generates a big CSV file. IBM also provides an Excel macro for parsing this and turning it into awesome graphs, although you do need a Windows VM for that.

nagios and munin for monitoring and graphing.

Kura
  • 243
0

  • atop - yet another top alternative, great for monitoring changes in processes
  • strace/ltrace - for tracking down those REALLY annoying bugs
  • ldd - track down broken library dependencies
  • cron, logrotate ;)

Of course, beyond the command line, you need Nagios/Cacti/MRTG/etc.

allaryin
  • 323
0

Learn Vim or Emacs inside and out!

For text editing:

  • grep
  • sed
  • AWK

For network tools:

  • Nmap
  • dig

XTZ
  • 183
0

munin is a great tool for doing capacity analysis and review, but you need to set it up before you need it. We install it as a standard part of every server install we do.