Monday·15·February·2021
Starting a GNU Screen session via SSH’s ~/.ssh/config //at 05:50 //by abe
This is more or less a followup to this blog posting of mine about AutoSSH and GNU Screen from nearly ten years ago — which by the way is still valid and still the way I use SSH and GNU Screen.
Recently a friend asked me how to automatically start or reconnect to a GNU Screen session directly via OpenSSH’s configuration file. Here’s how to do it:
Add an entry to ~/.ssh/config similar to this one:

Host screen_on_server
  Hostname server.example.org
  RequestTTY yes
  RemoteCommand screen -RD
and then just call ssh screen_on_server and you’ll get connected to an existing screen session if present; otherwise a new one will be started.
This should work with tmux, too, but might need different commandline options.
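For tmux, an equivalent entry might look like this untested sketch (the host alias tmux_on_server and the session name main are made up for illustration); tmux new-session -A attaches to the named session if it already exists and creates it otherwise:

Host tmux_on_server
  Hostname server.example.org
  RequestTTY yes
  RemoteCommand tmux new-session -A -s main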
Tagged as: CLI, GNU, OpenSSH, Screen, SSH, ssh_config, tip
Monday·12·October·2020
Git related shell aliases I commonly use //at 14:28 //by abe
Hope this might be an inspiration to use these or similar aliases as well.
Tagged as: CLI, git, git-annex
Tuesday·26·November·2013
Showing packages newer than in archive with aptitude //at 22:14 //by abe
It happens quite often that I install a manually built, newer version of some package on a machine. Occasionally I forget to remove it or to downgrade it to the version in the APT repo.
$ apt-show-versions | fgrep newer
easily finds those packages.
But usually when doing such a check, I want this list of packages in my aptitude TUI to have a look at the other versions of each package and to take action. And I don’t want to search for each of the packages manually.
This can be done with the following “one-liner”:
# aptitude -o "Aptitude::Pkg-Display-Limit=( `apt-show-versions | fgrep newer | awk -F '[ :]' '{printf "~n ^"$1"$ | "}' | sed -e 's/| *$//'` )"
It uses apt-show-versions’ output, searches for the right packages, takes the first column and transforms it into an aptitude search pattern matching all packages whose name is exactly one of the listed packages.
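For illustration: if apt-show-versions had reported the two hypothetical packages foo and bar as newer than in the archive, the pipeline would generate the display limit

( ~n ^foo$ | ~n ^bar$ )

i.e. an OR of exact name matches.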
But this solution is quite ugly and slow. So I wondered if this is also doable with pure aptitude search patterns, which would likely also be faster.
And after some playing around I found the following working aptitude search term:
~i ?any-version(!~O.) !~U !~o
This matches all packages which are installed and which have a version that has no origin, i.e. no associated APT repository. Since this also matches all packages on hold as well as all packages not available in any archive, I use !~U !~o to exclude those packages from that list again.
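For reference, the individual search terms used here roughly mean the following (see the aptitude reference manual for details):

- ~i : the package is installed
- ?any-version(…) : at least one version of the package matches the enclosed pattern
- ~O. : the version’s origin matches the regular expression “.”, i.e. it has some origin (negated above to find versions without one)
- ~U : the package is upgradable
- ~o : the package is obsolete, i.e. installed but not downloadable from any archive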
Since nobody can remember that nor wants to type it every time it’s needed, I added the following alias to my setup:
alias aptitude-newer-than-in-archive='aptitude -o "Aptitude::Pkg-Display-Limit=~i ?any-version(!~O.) !~U !~o"'
Only caveat so far:
It seems to also match packages from APT repos which haven’t set an “Origin”. This should not happen with any Debian or Ubuntu APT repository, but seems to happen occasionally with privately run APT repositories.
And using ~A instead of ~O, i.e. ~i ?any-version(!~A.), does not work for this case either, even though it matches installed packages of which versions exist that are not in any available archive. But unfortunately aptitude seems to remember in some way if a package was in some archive in the past, so this only shows packages installed with dpkg -i, but not packages removed from e.g. unstable while older versions are still available in stable.
Tagged as: alias, apt-show-versions, aptitude, awk, CLI, Debian, filter, grep, one-liner, Package Management, Quoting, UUUCO
Wednesday·02·October·2013
How to make wget honour Content-Disposition headers //at 16:12 //by abe
Download links often point to CGI scripts which actually generate (or just fetch, i.e. proxy) the actual file to be downloaded, e.g. URLs like http://www.example.com/download.cgi?file=foobar.txt.

Most such CGI scripts send the real file name in the Content-Disposition header as specified in the MIME specification.
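Such a header typically looks like this (file name chosen to match the example URL above):

Content-Disposition: attachment; filename="foobar.txt"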
All browsers I know (well, at least those I use regularly :-) handle that perfectly and propose the file name sent in the Content-Disposition header as the file name for saving the downloaded file, which is usually exactly what I want.
All browsers do that, …, just not my favourite commandline download tool GNU Wget … Downloading the above URL with wget would look like this with default settings:
$ wget 'http://www.example.com/download.cgi?file=foobar.txt'
--2013-10-02 16:04:16--  http://www.example.com/download.cgi?file=foobar.txt
Resolving www.example.com (www.example.com)... 93.184.216.119, 2606:2800:220:6d:26bf:1447:1097:aa7
Connecting to www.example.com (www.example.com)|2606:2800:220:6d:26bf:1447:1097:aa7|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2020 (2.0K) [text/plain]
Saving to: `download.cgi?file=foobar.txt'

100%[============================================>] 2,020       --.-K/s   in 0s

2013-10-02 16:04:24 (12.5 MB/s) - `download.cgi?file=foobar.txt' saved [2020/2020]
Meh!
But luckily Wget can do that; it’s just not enabled by default, because it’s an experimental and possibly buggy feature, at least according to the man page. Well, works for me! :-)
You can easily enable it by default for either your user or the whole system by placing the following line in your ~/.wgetrc or /etc/wgetrc:

content-disposition = on
Given that the CGI script sends an appropriate Content-Disposition header, the above output now looks like this:
$ wget 'http://www.example.com/download.cgi?file=foobar.txt'
--2013-10-02 16:04:16--  http://www.example.com/download.cgi?file=foobar.txt
Resolving www.example.com (www.example.com)... 93.184.216.119, 2606:2800:220:6d:26bf:1447:1097:aa7
Connecting to www.example.com (www.example.com)|2606:2800:220:6d:26bf:1447:1097:aa7|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2020 (2.0K) [text/plain]
Saving to: `foobar.txt'

100%[============================================>] 2,020       --.-K/s   in 0s

2013-10-02 16:04:24 (12.5 MB/s) - `foobar.txt' saved [2020/2020]
Now Wget does what I mean!
You can also set this as a flag on the commandline, but typing wget --content-disposition … every time is surely not what I want. ;-)
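If you’d rather not touch any wgetrc, a shell alias would be an alternative, e.g. in your ~/.bashrc or ~/.zshrc (a sketch, not from the original article):

alias wget='wget --content-disposition'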
Tagged as: CGI, CLI, Content-Disposition, download, howto, HTTP, Shell, UUUCO, wget
Saturday·17·November·2012
deepgrep: grep nested archives with one command //at 02:00 //by abe
Several months ago, I wrote about grep everything and listed grep-like tools which can grep through compressed files or specific data formats. The blog posting sparked several magazine articles and talks by Frank Hofmann and me.
Frank recently noticed that we had nevertheless missed one more or less mighty tool so far. We missed it because it’s mostly unknown, undocumented and hidden behind a package name which doesn’t suggest a real recursive “grep everything”:
deepgrep
deepgrep is part of the Debian package strigi-utils, a package which contains utilities related to the KDE desktop search Strigi.
deepgrep especially eases searching through tarballs, even nested ones, but it can also search through zip files and OpenOffice.org/LibreOffice documents (which are actually zip files).
deepgrep seems to support at least the following archive and compression formats:
- tar
- ar, and hence deb
- rpm (but not cpio)
- gzip/gz
- bzip2/bz2
- zip, and hence jar/war and OpenOffice.org/LibreOffice documents
- MIME messages (i.e. files attached to e-mails)
A search in an archive which is deeply nested looks like this:
$ deepgrep bar foo.ar
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz/foo.txt:foobar
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz/foo.txt:bar
deepgrep, though, neither seems to support any LZMA-based compression (lzma, xz, lzip, 7z), nor does it support lzop, rzip, compress (.Z suffix), cab, cpio, xar, or rar.
Further current drawbacks of deepgrep:
- Nearly no commandline options, especially none of the common grep options
- No man-page or other documentation
- Exit code not related to search results; you have to check the output to see if something has been found (but see the workaround sketched after this list)
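The exit-code drawback can be worked around by piping deepgrep’s output through grep, which does set a meaningful exit code. An untested sketch (the wrapper name dg is made up):

dg() { deepgrep "$@" | grep .; }

Since grep . matches any non-empty output line, the wrapper exits with 0 exactly if deepgrep printed at least one match.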
deepfind
If you just need the file names of the files in nested archives, the package also contains the tool deepfind, which does nothing else than list all files and directories in a given set of archives or directories:
$ deepfind foo.ar
foo.ar
foo.ar/foo.tar
foo.ar/foo.tar/foo.tar.gz
foo.ar/foo.tar/foo.tar.gz/foo.zip
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz/foo.txt
As with deepgrep, deepfind does not implement any common options of its normal sister tool find.
[The following part has been added on 17-Nov-2012]
As with deepgrep, it also doesn’t seem to support any of the more modern or more exotic compression formats, i.e. it fails on modern Debian binary packages which use xz compression for the data part:
$ deepfind xulrunner-18.0_18.0\~a2+20121109042012-1_amd64.deb
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/debian-binary
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/triggers
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/preinst
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/md5sums
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/postinst
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/control
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/data.tar.xz
[End of part added at 17-Nov-2012]
Dependencies
The package strigi-utils doesn’t pull in the complete Strigi framework (i.e. no daemon), just a few libraries (libstreams, libstreamanalyzer, and libclucene). On Wheezy it also pulls in some audio/video decoding libraries which may make some server administrators less happy.
Conclusion
Both tools are quite limited to some basic use cases, but they can be worth a fortune if you have to work with nested archives. Nevertheless the claim in the Debian package description of strigi-utils that they’re “enhanced” versions of their well-known counterparts is IMHO disproportionate.
Most of the missing features and documentation can be explained by the primary purpose of these tools: being a backend for desktop searches. I guess there wasn’t much need for proper commandline usage yet. Until now. ;-)
42.zip
And yes, I was curious enough to let deepfind have a look at 42.zip (the one from SecurityFocus; unzip seems not able to unpack 42.zip from unforgettable.dk due to a missing version compatibility), and since it just traverses the archive sequentially, it has no problem with that, needing just about 5 MB of RAM and a lot of time:
[…]
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page e.zip
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page e.zip/0.dll
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page f.zip
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page f.zip/0.dll
deepfind 42.zip  11644.12s user 303.89s system 97% cpu 3:24:02.46 total
I won’t try deepgrep on 42.zip, though. ;-)
Tagged as: 42.zip, ar, bzip2, CLI, CLucene, deb, deepfind, deepgrep, efho, find, grep, gzip, jar, KDE, LibreOffice, Lucene, odt, OpenOffice.org, Rant, rpm, strigi, tar, UUUT, war, zip
Tuesday·10·January·2012
Illegal attempt to re-initialise SSL for server (theoretically shouldn’t happen!) //at 02:52 //by abe
After dist-upgrading my main Hetzner server from Lenny to Squeeze, Apache failed to come up, barfing the following error message in the alphabetically last defined and enabled virtual host’s error log:
[error] Illegal attempt to re-initialise SSL for server (theoretically shouldn't happen!)
Well, this is not theory but the real world, and it did happen. And it took me a while to find out what was wrong with the configuration, even though it had worked with Lenny’s Apache version.
To save others from searching as long as I did, here’s the solution:
Look at all enabled sites, pick out those which have a VirtualHost on port 443 defined, and verify that all these VirtualHost containers have their own “SSLEngine On” statement. If at least one is missing, you’ll run into the above-mentioned error message.
And it won’t necessarily show up in the error log of those VirtualHosts which are missing the statement but only in the last VirtualHost (or the last VirtualHost on port 443).
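A VirtualHost on port 443 should hence look roughly like this minimal sketch (server name and certificate paths are made up):

<VirtualHost *:443>
        ServerName www.example.org
        SSLEngine On
        SSLCertificateFile    /etc/ssl/certs/www.example.org.pem
        SSLCertificateKeyFile /etc/ssl/private/www.example.org.key
</VirtualHost>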
To find the relevant site files, I used the following one-liner:
grep -lE 'VirtualHost.*443' sites-enabled/*[^~] | \
  xargs grep -ci "SSLEngine On" | \
  grep :0
Should work for all sites which have defined just one VirtualHost on port 443 per file.
I suspect that the rise of SNI made Apache’s SSL implementation more picky with regard to VirtualHosts.
Oh, and kudos to this comment to an article on Debian-Administration.org because it finally pointed me in the right direction. :-)
Tagged as: Apache, CLI, commandline, Debian, error, experience, grep, HTTPS, KMMR, Lenny, Squeeze, SSL, xargs
Thursday·22·September·2011
Emacs Macros: Repeat on Steroids //at 16:06 //by abe
vi users have their . (dot) redo command for repeating the last command. The article Repeating Commands in Emacs in Mickey Petersen’s blog Mastering Emacs explained Emacs’ equivalent for that, namely the command repeat, by default bound to C-x z.
I seldom use it though, as I mostly have to repeat a chain of commands. What I use instead are so-called Keyboard Macros.
For example, for the CVE-2011-3192 vulnerability in Apache I added a line like

Include /etc/apache2/sites-common/CVE-2011-3192.conf

to all VirtualHosts.
So I started Emacs with all the relevant files:

grep CVE-2011-3192 -l /etc/apache2/sites-available/*[^~] | xargs emacs &
To remove those “Include” lines again, M-x flush-lines is probably the easiest way in Emacs. So for every file I had to call flush-lines with always the same parameter, save the buffer and then close the file or, in Emacsish, “kill” the buffer.
So while working on the first file I recorded my actions as a keyboard macro:
- C-x ( : Start recording
- M-x flush-lines<Enter>CVE-2011-3192<Enter> : flush all lines which contain the string “CVE-2011-3192”
- C-x C-s : save the current buffer
- C-x C-k<Enter> : kill the current buffer, i.e. close the file
- C-x ) : Stop recording
Then I just had to call the saved macro with C-x e. It flushed all lines, saved the changes and switched to the next remaining file by closing the current one, all with three key-strokes. And to make it even easier, from the second occasion on I only had to press e to call the macro directly again. So I just pressed e a bunch of times and had all files edited.
(In this case I used git diff afterwards to check that I didn’t wreck anything by half-automating my editing. :-)
Of course there are other ways to do this, too, e.g. using sed or so (see the sketch below), but I still think it’s a neat example showing the power of keyboard macros in Emacs. More things you can do with Emacs Keyboard Macros are described in the EmacsWiki entry Keyboard Macros.
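For comparison, the sed variant might look like this untested sketch, which deletes the matching lines in-place in all relevant files:

sed -i '/CVE-2011-3192/d' /etc/apache2/sites-available/*[^~]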
And if you still miss vi’s . command in Emacs, you can use dot-mode, an Emacs mode currently maintained by Robert Wyrick which more or less automatically defines keyboard macros and lets you call them with C-..
Tagged as: Apache, CLI, CVE, CVE-2011-3192, dot-mode, Emacs, EmacsWiki, git, macro, Other Blogs, redo, repeat, vi, xargs