
Monday·15·February·2021

Starting a GNU Screen session via SSH’s ~/.ssh/config //at 05:50 //by abe

from the helpful-usage-patterns-never-die dept.

This is more or less a followup to this blog posting of mine about AutoSSH and GNU Screen from nearly ten years ago — which, by the way, is still valid and still the way I use SSH and GNU Screen.

Recently a friend asked me how to automatically start or reconnect to a GNU Screen session directly via OpenSSH’s configuration file. Here’s how to do it:

Add an entry to ~/.ssh/config similar to this one:

Host screen_on_server
    Hostname server.example.org
    RequestTTY yes
    RemoteCommand screen -RD

and then just call ssh screen_on_server and you’ll get connected to an existing screen session if present; otherwise a new one will be started.

This should work with tmux, too, but might need different command-line options.
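
For example, with tmux: a sketch, assuming a tmux version whose new-session command supports the -A flag (attach if the session exists, create it otherwise); the session name main is just a placeholder:

Host tmux_on_server
    Hostname server.example.org
    RequestTTY yes
    # -A: attach to session "main" if it exists, create it otherwise
    RemoteCommand tmux new-session -A -s main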

Monday·12·October·2020

Git related shell aliases I commonly use //at 14:28 //by abe

from the git-rules--p dept.

  • ga="git annex"
  • gap="git add -p"
  • amend="git commit --amend"

Hope this might be an inspiration to use these or similar aliases as well.
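
For completeness, in bash or zsh these are defined like this, e.g. in ~/.bashrc or ~/.zshrc:

# Git-related shell aliases
alias ga="git annex"
alias gap="git add -p"
alias amend="git commit --amend"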

Tuesday·26·November·2013

Showing packages newer than in archive with aptitude //at 22:14 //by abe

from the handy-aptitude-TUI-filters dept.

It happens quite often that I install a manually built, newer version of some package on a machine. Occasionally I forget to remove it or to downgrade it to the version in the APT repo.

$ apt-show-versions | fgrep newer

easily finds those packages.

But usually when doing such a check, I want this list of packages in my aptitude TUI, to have a look at the other versions of each package and to take action. And I don’t want to search for each of the packages manually.

This can be done with the following “one-liner”:

# aptitude -o "Aptitude::Pkg-Display-Limit=( `apt-show-versions | fgrep newer | awk -F '[ :]' '{printf "~n ^"$1"$ | "}' | sed -e 's/| *$//'` )"

It takes apt-show-versions’ output, filters for the relevant packages, takes the first column and transforms it into an aptitude search pattern matching all packages whose name is exactly one of the listed names.

But this solution is quite ugly and slow. So I wondered if this is also doable with pure aptitude search patterns, which would likely also be faster.

And after some playing around I found the following working aptitude search term:

~i ?any-version(!~O.) !~U !~o

This matches all packages which are installed and which have a version with no origin, i.e. no associated APT repository. Since this also matches all packages on hold as well as all packages not available in any archive, I use !~U !~o to exclude those packages from the list again.
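
The same pattern also works non-interactively with aptitude search, e.g. if you just want the list printed:

$ aptitude search '~i ?any-version(!~O.) !~U !~o'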

Since nobody can remember that, nor wants to type it every time it’s needed, I added the following alias to my setup:

alias aptitude-newer-than-in-archive='aptitude -o "Aptitude::Pkg-Display-Limit=~i ?any-version(!~O.) !~U !~o"'

Only caveat so far:

It seems to also match packages from APT repos which haven’t set an “Origin”. This should not happen with any Debian or Ubuntu APT repository, but seems to happen occasionally with privately run APT repositories.

And using ~A instead of ~O, i.e. ~i ?any-version(!~A.), does not work for this case either, even though it matches installed packages which have versions that are in no available archive. Unfortunately aptitude seems to remember in some way whether a package was in some archive in the past, so this only shows packages installed with dpkg -i, but not packages which were removed from e.g. unstable while older versions are still available in stable.

Wednesday·02·October·2013

How to make wget honour Content-Disposition headers //at 16:12 //by abe

from the DWIM dept.

Download links often point to CGI scripts which actually generate (or just fetch, i.e. proxy) the actual file to be downloaded, e.g. URLs like http://www.example.com/download.cgi?file=foobar.txt.

Most such CGI scripts send the real file name in the Content-Disposition header as specified in the MIME specification.
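
Such a response typically contains a header like this (a minimal sketch of the relevant lines):

HTTP/1.1 200 OK
Content-Type: text/plain
Content-Disposition: attachment; filename="foobar.txt"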

All browsers I know (well, at least those I use regularly :-) handle that perfectly and propose the file name sent in the Content-Disposition header as the file name for saving the downloaded file, which is usually exactly what I want.

All browsers do that … just not my favourite command-line download tool, GNU Wget. Downloading the above URL with wget looks like this with default settings:

$ wget 'http://www.example.com/download.cgi?file=foobar.txt'
--2013-10-02 16:04:16--  http://www.example.com/download.cgi?file=foobar.txt
Resolving www.example.com (www.example.com)... 93.184.216.119, 2606:2800:220:6d:26bf:1447:1097:aa7
Connecting to www.example.com (www.example.com)|2606:2800:220:6d:26bf:1447:1097:aa7|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2020 (2.0K) [text/plain]
Saving to: `download.cgi?file=foobar.txt'

100%[============================================>] 2,020       --.-K/s   in 0s

2013-10-02 16:04:24 (12.5 MB/s) - `download.cgi?file=foobar.txt' saved [2020/2020]

Meh!

But luckily Wget can do that; it’s just not enabled by default — because it’s an experimental and possibly buggy feature, at least according to the man page. Well, it works for me! :-)

You can easily enable it by default, either for your user or for the whole system, by placing the following line in your ~/.wgetrc or in /etc/wgetrc respectively:

content-disposition = on

Provided the CGI script sends an appropriate Content-Disposition header, the above output now looks like this:

$ wget 'http://www.example.com/download.cgi?file=foobar.txt'
--2013-10-02 16:04:16--  http://www.example.com/download.cgi?file=foobar.txt
Resolving www.example.com (www.example.com)... 93.184.216.119, 2606:2800:220:6d:26bf:1447:1097:aa7
Connecting to www.example.com (www.example.com)|2606:2800:220:6d:26bf:1447:1097:aa7|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2020 (2.0K) [text/plain]
Saving to: `foobar.txt'

100%[============================================>] 2,020       --.-K/s   in 0s

2013-10-02 16:04:24 (12.5 MB/s) - `foobar.txt' saved [2020/2020]

Now Wget does what I mean!

You can also set this as a flag on the command line, but typing wget --content-disposition … every time is surely not what I want. ;-)

Saturday·17·November·2012

deepgrep: grep nested archives with one command //at 02:00 //by abe

from the grep-revisited dept.

Several months ago, I wrote about grep everything and listed grep-like tools which can grep through compressed files or specific data formats. The blog posting sparked several magazine articles and talks by Frank Hofmann and me.

Frank recently noticed that we had nevertheless missed one more or less mighty tool so far. We missed it because it’s mostly unknown, undocumented and hidden behind a package name which doesn’t suggest a real recursive “grep everything”:

deepgrep

deepgrep is part of the Debian package strigi-utils, a package which contains utilities related to the KDE desktop search Strigi.

deepgrep especially eases searching through tarballs, even nested ones, but can also search through zip files and OpenOffice.org/LibreOffice documents (which are actually zip files).

deepgrep seems to support at least the following archive and compression formats:

  • tar
  • ar, and hence deb
  • rpm (but not cpio)
  • gzip/gz
  • bzip2/bz2
  • zip, and hence jar/war and OpenOffice.org/LibreOffice documents
  • MIME messages (i.e. files attached to e-mails)

A search in an archive which is deeply nested looks like this:

$ deepgrep bar foo.ar
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz/foo.txt:foobar
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz/foo.txt:bar

deepgrep, though, seems to support neither any LZMA-based compression (lzma, xz, lzip, 7z) nor lzop, rzip, compress (.Z suffix), cab, cpio, xar, or rar.

Further current drawbacks of deepgrep:

  • Nearly no commandline options, especially none of the common grep options
  • No man-page or other documentation
  • Exit code is not related to the search results; you have to check the output to see whether something was found (see the workaround sketch below)
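
The missing exit code can be approximated by piping deepgrep’s output through grep; a minimal workaround sketch:

$ deepgrep bar foo.ar | grep -q . && echo found || echo nothing found

grep -q . exits with 0 exactly if deepgrep printed at least one line.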

deepfind

If you just need the file names of the files in nested archives, the package also contains the tool deepfind, which does nothing but list all files and directories in a given set of archives or directories:

$ deepfind foo.ar
foo.ar
foo.ar/foo.tar
foo.ar/foo.tar/foo.tar.gz
foo.ar/foo.tar/foo.tar.gz/foo.zip
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz
foo.ar/foo.tar/foo.tar.gz/foo.zip/foo.tar.bz2/foo.txt.gz/foo.txt

As with deepgrep, deepfind does not implement any of the common options of its normal sister tool find.
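
Simple find-style filtering can though be approximated by piping its output through standard tools; a sketch:

$ deepfind foo.ar | grep '\.txt$'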

[The following part has been added on 17-Nov-2012]

As with deepgrep, it also doesn’t seem to support any of the more modern or more exotic compression formats, i.e. it fails on modern Debian binary packages which use xz compression for the data part:

$ deepfind xulrunner-18.0_18.0\~a2+20121109042012-1_amd64.deb
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/debian-binary
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/triggers
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/preinst
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/md5sums
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/postinst
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/control.tar.gz/control
xulrunner-18.0_18.0~a2+20121109042012-1_amd64.deb/data.tar.xz

[End of part added at 17-Nov-2012]

Dependencies

The package strigi-utils doesn’t pull in the complete Strigi framework (i.e. no daemon), just a few libraries (libstreams, libstreamanalyzer, and libclucene). On Wheezy it also pulls in some audio/video decoding libraries which may make some server administrators less happy.

Conclusion

Both tools are quite limited to some basic use cases, but can be worth a fortune if you have to work with nested archives. Nevertheless, the claim in the Debian package description of strigi-utils that they’re “enhanced” versions of their well-known counterparts is IMHO disproportionate.

Most of the missing features and documentation can be explained by the primary purpose of these tools: being a backend for desktop search. I guess there wasn’t much need for proper command-line usage yet. Until now. ;-)

42.zip

And yes, I was curious enough to let deepfind have a look at 42.zip (the one from SecurityFocus; unzip seems unable to unpack 42.zip from unforgettable.dk due to a version incompatibility), and since it just traverses the archive sequentially, it has no problem with it, needing just about 5 MB of RAM and a lot of time:

[…]
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page e.zip
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page e.zip/0.dll
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page f.zip
42.zip/lib f.zip/book f.zip/chapter f.zip/doc f.zip/page f.zip/0.dll
deepfind 42.zip  11644.12s user 303.89s system 97% cpu 3:24:02.46 total

I won’t try deepgrep on 42.zip, though. ;-)

Tuesday·10·January·2012

Illegal attempt to re-initialise SSL for server (theoretically shouldn’t happen!) //at 02:52 //by abe

from the as-soon-as-you-do-it-right,-it-actually-works dept.

After dist-upgrading my main Hetzner server from Lenny to Squeeze, Apache failed to come up, barfing the following error message in the alphabetically last defined and enabled virtual host’s error log:

[error] Illegal attempt to re-initialise SSL for server (theoretically shouldn't happen!)

Well, this is not theory but the real world, and it did happen — and it took me a while to find out what was wrong with the configuration, despite it having worked with Lenny’s Apache version.

To spare others searching as long as I had to, here’s the solution:

Look at all enabled sites, pick out those which define a VirtualHost on port 443, and verify that each of these VirtualHost containers has its own “SSLEngine On” statement. If even one is missing, you’ll run into the above-mentioned error message.

And it won’t necessarily show up in the error log of the VirtualHosts which are missing the statement, but only in that of the last VirtualHost (or the last VirtualHost on port 443).
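
For reference, a minimal sketch of what every VirtualHost on port 443 needs (host name and certificate paths are placeholders):

<VirtualHost *:443>
    ServerName www.example.org
    # Exactly this statement must be present in each port-443 VirtualHost:
    SSLEngine On
    SSLCertificateFile    /etc/ssl/certs/example.org.pem
    SSLCertificateKeyFile /etc/ssl/private/example.org.key
</VirtualHost>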

To find the relevant site files, I used the following one-liner:

grep -lE 'VirtualHost.*443' sites-enabled/*[^~] | \
  xargs grep -ci "SSLEngine On" | \
  grep :0

This should work for all site files which define just one VirtualHost on port 443 per file.

I suspect that the rise of SNI made Apache’s SSL implementation more picky with regard to VirtualHosts.

Oh, and kudos to this comment to an article on Debian-Administration.org because it finally pointed me in the right direction. :-)

Thursday·22·September·2011

Emacs Macros: Repeat on Steroids //at 16:06 //by abe

from the .-for-Emacsen dept.

vi users have their . (dot) redo command for repeating the last command. The article Repeating Commands in Emacs in Mickey Petersen’s blog Mastering Emacs explained Emacs’ equivalent for that, namely the command repeat, by default bound to C-x z.

I, though, seldom use it, as I mostly have to repeat a chain of commands. What I use instead are so-called keyboard macros.

For example, for the CVE-2011-3192 vulnerability in Apache, I added a line like Include /etc/apache2/sites-common/CVE-2011-3192.conf to all VirtualHosts.

So I started Emacs with all the relevant files: grep CVE-2011-3192 -l /etc/apache2/sites-available/*[^~] | xargs emacs &

To remove those “Include” lines again, M-x flush-lines is probably the easiest way in Emacs. So for every file I had to call flush-lines with the same parameter, save the buffer, and then close the file, or — in Emacsish — “kill” the buffer.

So while working on the first file, I recorded my actions as a keyboard macro:

C-x (
    start recording
M-x flush-lines<Enter>CVE-2011-3192<Enter>
    flush all lines which contain the string “CVE-2011-3192”
C-x C-s
    save the current buffer
C-x k<Enter>
    kill the current buffer, i.e. close the file
C-x )
    stop recording

Then I just had to call the saved macro with C-x e. With three keystrokes it flushed all lines, saved the changes and switched to the next remaining file by closing the current one. And to make it even easier, from the second occasion on I only had to press e to call the macro directly again. So I just pressed e a bunch of times and had all files edited. (In this case I used git diff afterwards to check that I didn’t wreck anything by half-automating my editing. :-)

Of course there are other ways to do this, too, e.g. using sed, but I still think it’s a neat example of the power of keyboard macros in Emacs. More things you can do with Emacs keyboard macros are described in the EmacsWiki entry Keyboard Macros.
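
For comparison, a sketch of the sed variant (same file selection as in the grep call above; GNU sed’s -i edits the files in place):

$ grep -l CVE-2011-3192 /etc/apache2/sites-available/*[^~] | \
    xargs sed -i '/CVE-2011-3192/d'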

And if you still miss vi’s . command in Emacs, you can use dot-mode, an Emacs mode currently maintained by Robert Wyrick, which more or less automatically defines keyboard macros and lets you call them with C-..
