About Paul Seward

Paul is a Linux sysadmin looking after the servers behind the ResNet and eduroam networks, and the main campus DNS infrastructure at the University of Bristol. He's been using unix of one flavour or another for more than 2 decades, and is still constantly surprised by useful commands he didn't know existed.

Lightning talks: the lizard brain of unix…

I’ve been spending some of my spare time recently trawling youtube looking for lightning talks from unix/devops conferences. Lightning talks have the advantage that you can skip through presenters or topics you’re not interested in, and from time to time you find a presenter or a topic which tickles you.

Top of my list this week is Bryan Cantrill (used to work for Sun, now works for Joyent) presenting at Surge, which is described on the conference website as “Now in its fourth year, Surge has become the conference on scalability and performance engineering.”

Bryan clearly knows his stuff, and is a very entertaining speaker. Each of these talks is about 15 minutes long, but note that they come right at the end of each video.

Surge 2012: Two unix commands you’ve probably never heard of, and certainly don’t use
The sound is a bit ropey at the beginning of the 2012 talk, but gets better when he works out how to switch his mic on.

Surge 2013: “tail -f”

If you’ve got any other tips for videos of talks (lightning or otherwise) that you think I should watch, drop a link in the comments and I’ll add them to my playlist of “things to watch while I spend my evenings working on other things”.

Quicktip – mp4 playback in chromium

I use Ubuntu as my main desktop both at work and at home, and chromium as my default browser. For ages it’s been annoying me that some videos on youtube don’t play. Well, I’ve finally gotten around to fixing it.

It seems mp4 playback is missing from chromium on ubuntu, and can be added with:

sudo apt-get install chromium-codecs-ffmpeg-extra
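If you want to check the package went in OK, asking dpkg should now show it as installed:

dpkg -l chromium-codecs-ffmpeg-extra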

Restart chromium and you can go back to watching kittens falling off tables.

More Juniper VPN Client…

As mentioned at the bottom of my previous post, it is possible to connect to the Juniper VPN without using the Java GUI; you just need the right command line arguments.

Well, I’ve worked out what those are.

./ncsvc -h uobnet.bris.ac.uk -f ./uobnet.crt -r UoB-Users -u ab12345

(replacing ab12345 with your UoB username, obviously)
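If you find yourself typing that a lot, a small wrapper script saves remembering the arguments.  This is only an untested sketch – it assumes ncsvc and uobnet.crt live in the same directory as the script, and that ncsvc has already been made setuid root (otherwise run it via sudo):

#!/bin/bash
# Untested convenience wrapper around ncsvc.
# Assumes ncsvc and uobnet.crt sit in the same directory as this script.
DIR="$(cd "$(dirname "$0")" && pwd)"
read -p "UoB username: " UOB_USER
exec "$DIR/ncsvc" -h uobnet.bris.ac.uk -f "$DIR/uobnet.crt" -r UoB-Users -u "$UOB_USER"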

The list of 32bit dependencies is significantly smaller (as it doesn’t involve Java!) and on Ubuntu the client seems to be satisfied with:

sudo apt-get install libc6-i386 lib32z1 lib32nss-mdns

As before, the above has had very little testing at all.  Any reports of what it does on non-Ubuntu systems would be very welcome indeed!

Making the Juniper VPN Client work…

As well as the 32bit rpm version of the Linux client, Juniper have a browser-based Java applet for all OSes – which also works on Linux.

It’s not as pretty as the windows/mac client, but it can be made to work – and if you rummage around in it enough it contains a non-java executable at its core which may be useful as the basis of a homebrew non-java application.

Caveats:
This blog post describes a way to get connected to the new UoB VPN using a Linux client.  It’s not tested enough, or foolproof enough, to qualify as “official” documentation.  It’s more a record of “what I did to make it work”.

We’ve still got a fair way to go before we can publish any of this fully alongside the other instructions.

So try it! If it works for you, great – let us know in the comments!  If I’ve got something wrong, or you’ve worked around one of the problems I hit, let us know in the comments too!

If it doesn’t work for you we’d like to hear about that too, but asking for help via the bris-linux mailing list or the IRC channel is probably going to be more productive than asking in the comments.

Ubuntu 12.04 and 12.10 (64bit)
The GUI for the VPN client is a 32bit Java application.  Ubuntu doesn’t appear to install Java at all as part of the base install, so if you’re starting from a point where you have no Java support at all, it’s fairly straightforward:

 sudo apt-get install icedtea-7-plugin openjdk-7-jre:i386

The icedtea-7-plugin package will pull in the 64bit version of Java as a dependency, and the 64bit version is chosen by preference.  You can change this by running:

sudo update-alternatives --config java

And select the 32bit version from the list.

At this point, you’ve got 32bit java installed and selected as your default version of java.  You also have the icedtea java plugin for 64bit web browsers installed.
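If you want to double-check which version is now the default, running the following will tell you – the 64bit build normally identifies itself as a “64-Bit Server VM”, the 32bit one doesn’t:

java -version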

Now point your web browser at https://uobnet.bris.ac.uk and sign in using your UoB username and password.  When the page loads, you should see a line which says “Network Connect” – click the Start button.

You’ll probably get a popup warning saying that you need to make sure you’ve got all the required 32bit libraries installed.  You have (we’ve just done that!) so click “Yes” – you’ll also get a couple of popups asking if you’re sure you trust the certificate the application was signed with.

The first time you run Network Connect it will pop up a terminal window and ask you to put in your (local) sudo password.  This is because the underlying vpn widget needs to run as root, and this popup will setuid the binary so you don’t need to sudo every time you connect.
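As far as I can tell, what that popup does is roughly equivalent to running something like the following against the ncsvc binary the applet has unpacked (that’s an educated guess on my part, rather than anything I’ve verified):

sudo chown root ncsvc
sudo chmod +s ncsvc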


With a bit of luck and a following wind, you’ll get an ugly box pop up which says you’re connected!

Fedora/RedHat/CentOS/ScientificLinux
I haven’t tested any of these, but as far as I can tell, doing this would probably work:

sudo yum install java-1.7.0-openjdk.i386 icedtea-web.i386

Then follow the rest of the Ubuntu instructions.
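The one step I’d expect to differ is the update-alternatives one – on the RedHat family the equivalent command should be (again, completely untested):

sudo alternatives --config java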

I don’t want Java in my Browser!
Neither do I, to be honest. I have the plugin disabled most of the time and enable it only when I want to connect to the VPN.

The version of the client which Juniper distribute as an rpm appears to be a packaged version of the java client which you can load from the browser as described above.

As yet, I’ve not had a chance to do any testing of the rpm version.  Details of that will follow in later posts…

I don’t want Java at all!
As far as I can tell, Java is only required for the GUI. It’s a wrapper around a 32bit binary executable which could be called from the command line (or wrapped in a non-Java GUI or script, I suppose).

I haven’t managed to find the right magic incantation to get it to actually build a VPN tunnel yet, but this is where I’ve got to so far:

  1. Download the Java applet by going to https://uobnet.bris.ac.uk/dana-cached/nc/ncLinuxApp.jar – if it asks you to log in, do so using your UoB username and password.
  2. Unpack the archive with
    unzip ncLinuxApp.jar
  3. That should give you a folder full of files, which we need to go into:
    cd ncLinuxApp
  4. There’s a helper script we need to run first, which grabs the SSL certificate in use by the VPN connection.  Run it like this:
    bash ./getx509certificate.sh uobnet.bris.ac.uk uobnet.crt
  5. The binary which actually builds the tunnel is "ncsvc", which is a 32bit executable.  It appears to require 3 packages on Ubuntu:
    sudo apt-get install libc6-i386 lib32z1 lib32nss-mdns

However, at that point I’m stuck.  It looks very much like you should be able to build a tunnel by making ncsvc setuid and then calling it with the appropriate parameters:

sudo chown root ncsvc
sudo chmod +s ncsvc
...

Unfortunately I can’t make it work.

I’ve made it work!  See this followup blog post for details.

I don’t like closed source binaries either..
As Ian has mentioned previously, we’ve not managed to get the network-manager-vpnc client to connect at all.  We’ll probably come back to that after prodding at rpms, but in the meantime, if you manage to make it work please let us know!

Further Reading
Most of the above is information culled from the following links.  They may be helpful if you fancy having a prod at this yourself.

Some of those links contain Perl script wrappers for ncsvc, but as I’ve not had a chance to read every line of that code to make sure it’s not malicious in any way, I really can’t recommend you use them!

Forcing cssh to use IPv4

The majority of the servers I look after are managed by puppet[1], but we have a suite of 10 older servers which fall outside of that management environment.

We’re slowly replacing them with Shiny! New! Managed! Servers!™ but until we’ve completed that work we’re stuck with limited management tools, doing things manually at the command line, like it’s still the 20th century[2].

Occasionally we need to do the same thing on every server (eg kick off a “yum update” or whatever) and using ssh to connect to them all one at a time is tedious.

So, we use cssh, which is a scary… dangerous… “powerful” tool that spawns a load of ssh sessions and allows you to send the same keypresses to all the servers at once.  I don’t like using it – it feels like a really dirty way to admin a bunch of hosts – but sometimes it’s a necessary evil.

As long as you’re careful what you type, and don’t do anything daft like “sudo bash” then you can keep a lid on the fear.

One of the “features” of this bundle of 10 servers is that they’re dual stack IPv4 and IPv6, but ssh is only accepting connections on the IPv4 address.

If you’re connecting to these one at a time, “ssh -4 server.name.bris.ac.uk” will sort that out for you, but it took a little more rummaging in man pages to come up with the cssh alternative, and that’s the real purpose of this post.

Today’s Tip
To force cssh to use IPv4 when connecting to a bunch of hosts, use:

cssh --options="-o AddressFamily=inet" <host list>
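If you’d rather not type that every time, cssh can also pick up extra ssh arguments from its per-user config file (~/.clusterssh/config on recent versions, ~/.csshrc on older ones).  I’ve only tested the command line version above, but a line like this in the config file should do the same job:

ssh_args = -o AddressFamily=inet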

[1] Other config management systems are available 🙂
[2] To be fair, sometimes it’s also useful to kick off a puppet run everywhere

Government Digital Service (GDS) Tidbits

I’ve recently started reading the blog run by the Government Digital Service (the team behind .gov.uk) and there have been a handful of particularly interesting posts over the last week or so.

In a lot of ways, the GDS team is a similar organisation to some parts of IT Services, and they work rather like how we *could* be working in terms of agile/devops methodologies and workflows.  They’re doing a lot of cool stuff, and in interesting ways.

Most of it isn’t directly unix related (it’s mostly web application development) but when I spot something interesting I’ll try and flag it up here.  Starting with a couple of examples…

“Three cities, one alpha, one day”
http://digital.cabinetoffice.gov.uk/2013/07/23/three-cities-one-alpha-one-day/

This is a video they put up in a blog post about rapidly developing an alpha copy of the Land Registry property register service.  This rapid, iterative, user led development approach is an exciting way to build a service, and it’s interesting to watch another team go at something full pelt to see how far they can get in a day!

I’d have embedded it, but for some reason wordpress is stripping out the embed code…

FAQs: why we don’t have them
http://digital.cabinetoffice.gov.uk/2013/07/25/faqs-why-we-dont-have-them/

Another article which jumped out at me was this one about why FAQs are probably not the great idea we all thought they were.

Like many teams, the team I work in produces web based documentation about how to use our services, and yes, we’ve got our fair share of FAQs! Since I read this article, I’ve been thinking about working towards getting rid of them.

Instead of thinking about “what questions do we get asked a lot?” perhaps we really should be asking “why do people ask that question a lot?” and either eliminate the need for them to ask by making the service more intuitive, or make it easier for them to find the answer themselves by changing the flow of information we feed them.

I doubt we can eliminate our FAQs entirely; they’re useful as a way of storing canned answers to problems outside our domain of control – eg for things like “how do I find out my wireless mac address?”  However, if we can fix the root cause where the problem is within our domain, we reduce the list of items on our FAQ, which makes them clearer and easier to use if people do stumble across them – and still gives us a place to store our canned answers.

Ideally I think those canned answers would better live in a knowledgebase, or some kind of “smart answers” system.  Which brings me on to my last example…

The Smart Answer Bug
http://digital.cabinetoffice.gov.uk/2013/07/31/the-smart-answer-bug/

“Smart Answers” is an example of an expert system, which guides customers through a series of questions in order to narrow down their problem and offer them helpful advice quickly and easily.

The gov.uk smart answers system had a fault recently, which was only noticed because their analytics setup showed up some anomalous behaviour.

It turned out to be a browser compatibility fault with the analytics code, but the article really shows the power of gathering performance and usage data about your services.

Although the example in the post is about web analytics, we can gather a lot of similar data about our servers, and infer the presence of service faults by analysing the results.

If we do that analysis well (and in an automated way) we can pick up faults before our users even notice a problem.

Waving goodbye to Usenet


The University of Bristol has had a Usenet service for over 25 years, and it’s finally time for our Eternal September to end.  The service is set to be switched off on 28th August 2013.

Working out exactly how long Bristol has had a Usenet service isn’t all that easy, as most of the people involved in the early days aren’t at Bristol any more.

The earliest posting I’ve found so far [1] is from 3rd March 1988, which came via a machine in Computer Science.

Does this mean Bristol first got access to Usenet in 1988?  I’m not sure.

I think it’s likely that before 1988 messages were posted from Bristol via UUCP (messages from 1986 exist which were posted via UUCP from Bath, and it’s probably reasonable to assume that Bristol was doing similar things at the time), but without knowing the naming/addressing scheme in use they’re not going to be easy to find.

In 1998, the University of Bath needed to replace the hardware running their own Usenet service, and shortly afterwards the hardware in use at Bristol was also due for renewal.  Paul Smee suggested sharing a service between Bristol and Bath; Bath agreed, and have been running news.bris.ac.uk for us ever since.

In 2004, Paul Smee retired, and as Paul Seward was the last person to ask a Usenet-related question before Paul Smee’s retirement (and had the same initials), Paul was given responsibility for supporting the service at the Bristol end – a task which has entailed sending a whopping 8 emails in the last 9 years! (Although only 1 of those was actually to an end user.)

In 2010, Janet retired the national Usenet peering service which Bath were using to keep the server supplied with articles.  Bath restructured the service to peer directly with 7 other institutions around the world and kept it going.

Here we are over 25 years later, and Bath have decided that usage of the service has dwindled to the point that it’s not worth keeping it going.  In the last 3 months of the service only 9 machines at Bristol have connected to it to read news.  Given that we’ve got more machines than that in my office alone (and I’m responsible for one of the machines on the list from Bath!), I’m inclined to agree with them.

I’ll personally be sad to see it go. I’ve been on Usenet since 1995 – and was actually introduced to it (and email) before the Web.  I’ve met some of my best friends on Usenet, and although the groups I follow have reduced in traffic significantly over the last couple of years (from 40+ posts a day to 10-15 posts a day), I can’t give up my news habit completely and I’m looking for a new (free!) server.

Competition time!

Internet archaeology isn’t easy, and I’m pretty confident that earlier Usenet postings from Bristol exist.  Google Groups has an “advanced search” option which allows you to search their archive back as far as 1981.

If you can find a message from before 3rd March 1988, go into the “more message actions” dropdown list (top right of the message) and select “Show original” – either slap the URL of the resulting page in the comments for this page, or post the contents of the Message-ID: line.

The person who finds the earliest message before news.bris.ac.uk disappears on 28th August 2013 will win a selection of Fabulous! Prizes!™

What’s this all about?

Every blog needs a first post, and this is it!

What’s the scope of this blog?
Very broadly, the intention is that this blog will fill up with information about Unix and Linux at the University of Bristol.

“It’s all just unix”
We’re hoping to post about cool stuff we’re working on, interesting unix-related topics we’re thinking about, and talks we’ve seen, as well as providing a place to share current practice, evolve best practice, etc.  The blog is being driven by members of the Virtual Unix Team, so it will hopefully feature articles from unix/linux administrators around the university, giving a broad range of content.

Unix and Linux use at the University of Bristol is very varied[1], but at the end of the day, it’s all just a slightly different spin on unix so it’s all on topic here!

Local community support
The other purpose of this blog is to pull together (and advertise!) the various sources of community support for linux systems.  We’ve written a first pass at a page about these here: https://unix.bris.ac.uk/resources-for-sysadmins – it’ll be expanded as things change, and we’re open to feedback about it, so if you know of a source of support that’s not listed, please let us know!

[1] Off the top of my head I can think of systems running Solaris, RedHat Enterprise, Debian, CentOS, Ubuntu, Scientific Linux, Gentoo, SuSE, FreeBSD, NetBSD…