Emulating Linux ‘man’ in PowerShell
This is a post which, I suspect, will be of very little interest to anyone besides myself. Then again, the target audience for this blog is the hypothetical collection of people, however few, who happen to be Weird in many of the same ways that I am, so here we are.
I’ve spent the five-ish years (thus far) of my career as a professional software developer in a single, Windows-oriented workplace; and yet I am, unapologetically, a Linux enthusiast. My personal computer(s) have run some variety of Linux continuously for the past three years, with less-permanent dalliances throughout the preceding decade; I drink my 2–3 cups of tea each weekday from a mug featuring a lightly-modernized take on a certain penguin’s jovial visage. Though I have become skilled in the dark arts[1] of PowerShell scripting, I rely on a handful of *nix-like aliases—some built-in, some added by me—to make my computing experience at work a bit more comfortable.
Several months ago,[2] I was working with a handful of cross-platform/*nix-native utilities (curl, scp, file, to name a few) and found myself frustrated by the need to execute a Google search every time I needed a refresher on their options. “Wouldn’t it be great,” I said to myself, “if I could just type man curl into my PowerShell prompt and have done with it.”[3]
And thus, the idea was born: write a PowerShell function which, given a *nix command name, took me to a web-ified man page for that command. I decided that doing the actual reading of the docs in my web browser was acceptable, rather than expending energy trying to obtain or convert versions suitable for displaying in PowerShell directly. The chief frustration of my previous workflow was the part where I had to Alt+Tab away from my terminal, open a new browser tab, run the search, and then click on a link in the Google results; whereas now, I run man curl right there in the terminal and my browser magically pops into focus with the appropriate info. Yes, I know that sounds petty.
My antediluvian Google searches typically led me to either linux.die.net or man7.org. Both have nicely predictable URL schemes; I selected man7 for its nicer text formatting. Their URLs all have the form
https://man7.org/linux/man-pages/man{{section}}/{{program}}.{{section}}.html

where {{section}} is the section number within the *nix manual where the program is found. In practice, this is almost always 1 (user commands) for anything I care about. For example:

https://man7.org/linux/man-pages/man1/curl.1.html

So if I assumed the section number was, in fact, 1, all I had to do was accept a parameter for the program name, plug it into the URL, and Start-Process.
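That first draft could be almost trivially small. A minimal sketch (the man7 URL scheme is as above; everything else here is my own naming):

    # Naive first cut: assume every page lives in section 1.
    function man {
        param([string]$Program)
        $url = "https://man7.org/linux/man-pages/man1/$Program.1.html"
        # Handing a URL to Start-Process opens it in the default browser
        Start-Process $url
    }

Typing man curl would then open the curl(1) page directly—as long as the guess about section 1 held.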
But I didn’t want my browser giving me the ol’ Error 404 if I flubbed the command name, or if I wanted to read about e.g. sudo (section 8: system management commands) or the fstab file (section 5: file formats and configuration files). I’ve never had manual sections memorized (and there’s a good chance I never will), so it seemed to me the sensible thing to do would be to insert each section number into the URL, curl --head them one-by-one to check whether they exist, and finally Start-Process the first one that doesn’t produce an error to open it in my browser. And if none of those URLs exist, Start-Process anyway with another URL that opens a site-specific Google search for any man7 webpage that contains my program name.
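The existence check itself is just an HTTP HEAD request. A sketch of the idiom (note that I call curl.exe explicitly here, since Windows PowerShell aliases a bare curl to Invoke-WebRequest, which would choke on these flags):

    $url = "https://man7.org/linux/man-pages/man1/curl.1.html"
    # --head sends a HEAD request; --fail makes curl exit nonzero on a 404
    curl.exe --silent --head --fail $url | Out-Null
    if ($?) { "page exists" } else { "no such page" }

$? is PowerShell’s automatic “did the last command succeed” variable, so a 404 from curl shows up as $? being $false.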
As it turns out, this is a pretty good plan, since that’s basically what man actually does! I lifted the search order[4] straight from man man, minus section 9 since, also per man man, that section is “Non standard [sic]”[5] and it is very unlikely that I will need to read about kernel routines while in Windows.[6]
By this point, I was in pretty good shape, but I ran into one last problem. Some pages, like crontab, exist in more than one section (1 and 5). If I type man crontab, I’ll get the crontab command every time; yet, I’m more likely to be interested in the crontab file. The real man has a solution: specify the section number first, e.g. man 5 crontab. This was straightforward to add to my man impostor, but it required a different approach for parsing the arguments.
You see, PowerShell functions are meant (via param()) to accept named parameters, in any order:

man -Section 5 -Title crontab

but you can omit the names as long as you specify the right number of arguments in the same order they were defined:

man 5 crontab

The problem being that now, if I wanted to invoke man without specifying a section, I had to give the -Title flag, or else it would think my one argument was the section number:

man -Title curl

This would not do.
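For illustration, the param() version I rejected would have looked something like this (hypothetical code; the parameter names match the examples above):

    function man {
        param($Section, $Title)
        # ...
    }
    man -Section 5 -Title crontab   # fine
    man 5 crontab                   # fine: positional binding, in declared order
    man curl                        # wrong: "curl" binds to $Section

That last call is the deal-breaker: with no names given, PowerShell binds positionally, so the single argument lands in the first declared parameter.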
Fortunately, you can skip param() and get arguments purely by their position: $Args.Length tells me whether there was one argument (the title) or two (the section and the title), and then $Args[0], $Args[1], etc. allow me to access each value in turn.
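A stripped-down sketch of that $Args approach, separate from the real function below:

    function Get-ManArgs {
        # With no param() block, every argument lands in the automatic $Args array
        if ($Args.Length -gt 1) {
            "section=$($Args[0]), title=$($Args[1])"
        }
        else {
            "title=$($Args[0])"
        }
    }
    Get-ManArgs 5 crontab   # section=5, title=crontab
    Get-ManArgs curl        # title=curl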
The final code (for now, anyways) is below. Note that PowerShell has a built-in alias translating man to its own Get-Help cmdlet, so you’ll have to put Remove-Alias man in your profile alongside this function in order to make use of it.
function man {
    # Search order described in `man man` under DEFAULTS
    $sections = @(
        1, # User commands (Programs)
        8, # System management commands
        3, # Library calls
        2, # System calls
        5, # File formats and configuration files
        4, # Special files (devices)
        6, # Games
        7  # Overview, conventions, and miscellaneous
    )
    $found = $false
    if ($Args.Length -gt 1) {
        # Two arguments: explicit section, then program name
        $section = $Args[0]
        $program = $Args[1]
        if ($sections -contains $section) {
            $url = "https://man7.org/linux/man-pages/man$section/$program.$section.html"
            # HEAD request; --fail makes curl exit nonzero on a 404
            curl --silent --head --fail $url | Out-Null
            if ($?) {
                Start-Process -Path $url
                $found = $true
            }
        }
        else {
            Write-Error "$section`: not a manual section"
        }
    }
    else {
        # One argument: try each section in search order until a page exists
        $program = $Args[0]
        foreach ($section in $sections) {
            $url = "https://man7.org/linux/man-pages/man$section/$program.$section.html"
            curl --silent --head --fail $url | Out-Null
            if ($?) {
                Start-Process -Path $url
                $found = $true
                break
            }
        }
    }
    if (!$found) {
        # Fall back to a site-restricted Google search of the man7 pages
        $url = "https://www.google.com/search?q=$program&sitesearch=man7.org%2Flinux%2Fman-pages&sa=Search+online+pages"
        Start-Process -Path $url
    }
}
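With the function and Remove-Alias man in my profile, usage looks like this (a sketch of expected behavior; the fall-through depends on man7 returning 404s for pages that don’t exist):

    man curl        # opens https://man7.org/linux/man-pages/man1/curl.1.html
    man 5 crontab   # opens the crontab(5) file-format page directly
    man fstab       # tries sections 1, 8, 3, 2 before landing on fstab(5)
    man notreal     # no page found; falls back to the site-restricted search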
Footnotes

1. PowerShell is actually quite nice for scripting, in that cmdlets are named and optioned with a degree of consistency which bash’s organic approach could never hope to achieve, and also in that I am accustomed to handling data in the form of objects rather than pure text. However, certain things like extracting a single text value via regex are so much more complicated via the likes of Select-String that I often wish I could just invoke grep instead. ↩︎
2. June. It was June. I am very, very slow at converting ideas into finished blog posts. In this case, some of that time went into figuring out how to get PowerShell syntax highlighting into this Hugo theme, but nonetheless it was mostly a case of not sitting down to write for weeks at a time. ↩︎
3. Yes, I pronounced the “code snippet” formatting when I spoke the words man curl. ↩︎
4. Contrary to what you might expect, it does not search the sections in sequential order. ↩︎
5. That’s right, I just [sic]ed a quote for lacking a hyphen. ↩︎
6. Or ever, really. ↩︎