GB Colemak

My laptop died last week (conveniently during a trip to UCL when I really wanted it to work) and so I took the opportunity to upgrade it to the beta version of Ubuntu Hardy. First impressions are all good --- everything becomes progressively more Mac-like and shiny --- although I have some issues with the usability of the Compiz window effects, pretty though they undoubtedly are. The upgrade also made me think again about my Colemak keyboard layout, which has been necessary on my laptop since I prised all the keys off 6 months ago and put them back down in a different order.

On the whole I have no complaint with Colemak --- I don't get any RSI symptoms on my laptop keyboard anyway, so it's not really solving any problem, but the combination of curiosity, perversity, defying misplaced historical convention, and generally checking that my mind is still able to learn new things still keeps me at it. Actually, that's a simplification: all but four of the keys are now in the correct Colemak positions. The remaining four are troublesome because the two index-finger "anchor" keys on the home row attach to the laptop body differently from the rest of the keys. So "t" & "f", and "j" & "n" are reversed, which still confuses me. So much for my touch-typing skills. Anyway, as Shai Coleman himself has said, even if you don't want to learn a new keyboard layout, backspacing is a much better use for the Caps Lock key than actually locking caps, so it's worth adding that feature to your keymap, whatever layout you're using.

In the back of my mind, though, has always been the concern that I had to do some pretty arcane stuff with the X server xkb configuration to add this non-standard keymap. Now life is a bit easier for the Colemak user: Ubuntu Hardy includes it out of the box, thus proving that someone was more successful than me in working out how to get it in there. However, it's the usual US Colemak keymap, with at-signs, quotes and tildes in places that seem weird to us Limeys. And no sterling sign!

So, for all those Brits using Colemak keyboard layouts on Linux (hmm, that must be a lot of people, huh?), here's an xkb patch to make everything as it should be:

To /etc/X11/xkb/symbols/gb, add this section:

partial alphanumeric_keys
xkb_symbols "colemak" {
    include "us(colemak)"
    name[Group1]="United Kingdom - Colemak";
    key <AE02> { [          2,   quotedbl,     twosuperior,  oneeighth ] };
    key <AE03> { [          3,   sterling,   threesuperior,   sterling ] };
    key <AE04> { [          4,     dollar,        EuroSign, onequarter ] };
    key <AC11> { [ apostrophe,         at, dead_circumflex, dead_caron ] };
    key <TLDE> { [      grave,    notsign,             bar,        bar ] };
    key <BKSL> { [ numbersign, asciitilde,      dead_grave, dead_breve ] };
    key <LSGT> { [  backslash,        bar,             bar,  brokenbar ] };
    include "level3(ralt_switch_multikey)"
};

That was the bit that actually did the work of remapping the offending keys. Now make sure that it's declared properly to X.org and Gnome, by adding this to the "GBr" section in /etc/X11/xkb/base.xml:

<variant>
    <configItem>
      <name>colemak</name>
      <description>Colemak</description>
    </configItem>
</variant>

It should be pretty obvious where to put it: search for "GBr" and insert this before the corresponding <variant> entry for the Dvorak map. Now restart X (or get it to re-read the xkb config) and you should be able to select Colemak as a GB keyboard variant. Woo.

This has now been submitted as a patch to the FreeDesktop XKeyboardConfig project, and so will hopefully make it into some mainstream distributions by the next set of releases.

A much-needed holiday...

We're back from holiday now - a week of skiing in La Plagne with Jo's brothers and friends. On the whole it was great, not least just to get away and chill a bit. The size of the group, and the variation in skiing level, made coordinating things a bit awkward, though: it's difficult when you have to get back across the valleys in time for lunch or the end of ski-school. On getting back to work, I really appreciate how simple life is when all you need to concern yourself with is "can I get down this route without a nasty accident?" :) I remember feeling the same last year after 10 days of ice climbing... maybe the time of year makes the feeling that bit more intense.

I bought Didier Givois' "Les Cles de Paradiski" book in Plagne Centre on the first day (he was prepared to post me a copy beforehand but the expense didn't seem worthwhile - it's a heavy book) and spent the evenings poring over it with the help of a map. It's a pretty good book but, coming from a climbing background, I'd have preferred clear photos and a route description for each route in place of some of the spectacular but generic "awesome skiing" photos that take up a lot of pages. Maybe he wants to protect people from themselves by not providing enough info. Maybe he knows that most people will never ski the routes and will just enjoy having a chunky coffee table book. Maybe he just likes big photos. But to me it seems there's no reason why a pocket-sized route guide, as you'd get for climbing, wouldn't work for off-piste skiing. The same safety disclaimers would apply: it's not the information about routes that leads to injury --- it's unprepared and over-confident users that do that. (I'm one to talk, though: since no-one else in our group was up for "silly skiing", I had to pick the routes with little avalanche hazard and do them alone. Not ideal.)

I also read Mark Twight's collected articles, Kiss or Kill while I was away. Mark is as forceful and controversial a writer as he is a climber, and the subtitle "Confessions of a Serial Climber" sums things up pretty well --- his take on climbing is far more intense, uncompromising and psychopathic than your average weekend warrior, myself included. On emotional baggage, he repeatedly declares that he's "got good with the knife", cutting away his ties to friends, family and lovers when the chance to die on a mountain comes a-calling. Dark stuff, and refreshingly different from most climbers' writing. Reading the book while in a ski resort was an interesting contrast --- skiing is the perfect example of a commoditised, homogenised sport, whose purer forms are these days a relative obscurity. Every time I left the pistes and headed for a couloir, trudging upward high above the neon-suited goons, I thought of Kiss or Kill. Every time I returned to the bars and creperies I recognised a bit of the Twight darkness in my loathing of the "dude, I did a black run... hardcore" attitude. And all the dudes with cool hats in the lift queues... argh: life would be better if the plastic boot and the carving ski hadn't been invented. I found this time, more than any other ski trip, that skiing between markers, on maintained, secured pistes no longer holds much interest for me --- what I like these days in skiing is the bits that most resemble offshoots of mountaineering. This resonated with the book --- what I find unappetising in resort skiing is much the same as what rubs Twight up the wrong way about ice climbing comps. Not that my skiing really compares to super-light, alpine style mountaineering --- that way lies undeserved ego massage --- but it was reassuring to find a voice that I could agree with while hanging out in resortsville.
I suspect the reason his books interest me is that I recognise a lot of my own motivation for pushing myself in climbing or skiing in Mark's introspections, only watered down a thousandfold. He'd probably hate me for compromising.

Anyhow, we're back and I'm reeling this off at the end of the first day back at work. It was never going to be good, but with a seminar to give in a week's time, no data or slides for it and a plethora of things to do in-between, it's not been a good 'un. Plus, all my suspicions about crappy numerical algorithms in Rivet seem to have come true in the last week --- at least Hendrik Hoeth has been paying attention and sorting out the stuff I haven't had the time to work on. Damn, not enough time. At times like these I feel like wielding the knife myself a bit, and cutting away a bunch of the things that I'm meant to be responsible for. But it'll pass --- give it a couple of weeks and I'll be adding more projects to the TODO list again...

Doing two things at once

I like this bug that I found in a colon-separated path splitting function in some of our C++ code:

while (size_t pos = pathstr.find(":") != string::npos) {
    // split string at position pos, i.e. the next colon...
}

I couldn't see the bug, so I re-wrote it without the cheeky 2-in-1 assign-and-compare:

while (true) {
    size_t pos = pathstr.find(":");
    if (pos == string::npos) break;
    // split string at position pos...
}

It worked immediately. What happens with the code above is that the path string always gets chopped one character at a time, rather than at the colons. This is because the != operator takes precedence over the assignment (=) operator: the first level of evaluation effectively gives while (size_t pos = true) ... as long as there's a colon somewhere in the string. Automatic type conversion then converts the boolean true to an integer 1 and the string gets chomped 1 character at a time until no more colons are found.
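For the record, if you do want the 2-in-1 form, parentheses around the assignment fix the precedence. Here's a minimal sketch using a hypothetical split_path helper (not the actual function from our code):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical helper, not the real Rivet function: split a colon-separated
// path string into its components.
std::vector<std::string> split_path(std::string pathstr) {
    std::vector<std::string> parts;
    std::string::size_type pos;
    // The parentheses around the assignment are essential: without them,
    // != binds tighter and pos would get the boolean comparison result (1).
    // Note that the declaration has to move out of the condition for this.
    while ((pos = pathstr.find(":")) != std::string::npos) {
        parts.push_back(pathstr.substr(0, pos));  // chunk up to next colon
        pathstr.erase(0, pos + 1);                // drop chunk and colon
    }
    parts.push_back(pathstr);  // whatever remains after the last colon
    return parts;
}
```

Whether that's clearer than the explicit break version above is a matter of taste, but at least it chops at the colons.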

Take the hint: don't try to do two things at once, even if you can.

Another STFC letter

Our Durham City MP, Roberta Blackman-Woods, came to visit us in the physics department earlier today, to discuss the funding crisis in particle physics and astrophysics brought on by our piss-poor research council and a government department which didn't realise the significance of inflation or full economic costing on academic grants. They also forgot to include the VAT on the 2012 Olympics, so maybe this shouldn't be such a big shock. The difference is that no-one said "oh well, we'll just have to drop the 1500 metres" --- the money just appeared, hundreds of times more of it than the STFC budget for the same period.

Anyway, the session wasn't bad --- our MP at least appears to be on our side and sympathetic to concerns about young researchers having their careers cut short and the knock-on impact of these cuts on physics outreach --- but tellingly she didn't actually commit to doing anything either. So here's another letter winging its way to her courtesy of WriteToThem, sadly before I've had a response to the last one.

Dear Roberta Blackman-Woods,

Sorry to pre-empt your reply to my earlier letter (25 January), but I was in your discussion session with the Durham University physics department earlier today, re. the STFC funding crisis, and another letter now seems appropriate.

Firstly, thanks for attending: it was reassuring to find that you are sympathetic to our point of view and, like us, highly sceptical of the "everything's just fine" spin being applied by STFC's management. Why our research council is failing to complain about cuts which are decimating its own research community has thoroughly bewildered many of us.

Since today's session was short, I'm sure there were many unanswered questions. A lot of time was spent - necessarily, I think - on the general concerns of the research community, especially young researchers such as myself and our postgraduate students. However, I'm concerned about the details of what happens next: you said that we are "formidable campaigners", but in a recent meeting of young researchers in the IPPP it was evident that we are frustrated with our lack of representation through STFC, the lack of communication or consultation and the lack of rapid preventative action from government.

I'm sure we will take account of your suggestion to provide community submissions to the Select Committee. As Gudi Moortgat-Pick pointed out, though, a timescale of 6-12 months may be quick in parliamentary terms, but the economic benefit to be reaped from the ILC project will be decided during that period. Since the UK will not be a member at that time, it will miss the boat on ILC construction and engineering contracts. As you rightly said, we have had 8 years of welcome growth in science funding: that makes it all the more painful that projects which have taken all those 8 years of work (and hundreds of millions of pounds) are now being cancelled just as they come to fruition. This must be avoided as a matter of urgency - long term big science needs a correspondingly long term funding commitment.

What many of us would like to know is whether there is any possibility of an emergency settlement to STFC which would enable the UK interest in the ILC and Gemini programmes to be kept alive until the Wakeham review concludes. Our current "best case scenario" is that 12 months of economic devastation will be wreaked on the fundamental physics community, the opportunity for UK economic payback from ILC research will be missed, and then the Wakeham report declares "oops, we shouldn't have done that". By that time, it will be too late to right the wrong.

We would also like to know what we can do to ensure that the failure of STFC to represent or consult its scientists on major science funding decisions is put to an end, and does not happen again. Our gripe is not primarily with the Government, who appear to be unwitting partners in this disaster (admittedly the ministers' responses have not done them any favours), but with STFC, who have failed to communicate the effects that the "flat" 13% funding settlement would have on the research programme. We need the support of the Government to undo the damage done by this funding council mismanagement.

Finally, is there anything you can do in the near future to help our case? I hope that the fact that you are on the Select Committee does not prohibit you from holding an official opinion on this topic - from today's discussion it seemed obvious that you do hold a view and that it is compatible with that of our community. Perhaps you or another MP who considers this to be an important issue could sponsor an early day motion to provide for interim STFC funding to avoid irreversible effects until the Wakeham report is published. Are there any other official routes left to be investigated?

Once again, thanks for visiting us today and answering our questions. I hope we managed to also answer some of yours and that you will do what you can to avoid unnecessary devastation to our research, its economic associations and the public enthusiasm for science to which our field is so crucial.

Yours sincerely,
Dr Andy Buckley
IPPP, Durham University

65a96a859ce10d963c64/452702349059e20f34d9
(Signed with an electronic signature in accordance with subsection 7(3) of the Electronic Communications Act 2000.)

(Postscript: this never received a reply and there was no evidence of any action being taken. As on several other issues I raised with this MP during my time in Durham, there was a lot of apparent sympathy but never any action that might rock the Labour boat.)

STFC crisis letter

Having read again Ian Pearson MP's dismal response to the STFC young researchers' letter, I finally got round to sending my MP a letter on the subject today, via WriteToThem, which is copied below. Since I'm trying to squeeze understanding this situation in among a lot of other work, I hope I haven't got the wrong end of the stick too much. Also, apologies for the crappy English --- I didn't have time to finesse it, or find a better word than "decimate" for describing 25% cuts.

Anyway, here it is, for the record...

Dear Roberta Blackman-Woods,

I am sure that you have already had several letters in recent weeks about the funding crisis at the Science and Technology Facilities Council (STFC), not least from your constituents who work in or with the Ogden Centre for Fundamental Physics at Durham University.

While I have been remiss in not contacting you on this issue before now, I feel that the letter to young researchers from science minister Ian Pearson MP is deserving of a response. For your reference, the original letter to John Denham MP from 559 young scientists can be found online at http://cern.ch/james.jackson/DenhamLetterFinal.pdf, Mr Pearson's original (and insultingly jumbled) reply can be found at http://cern.ch/james.jackson/MinisterLetterReply.pdf, and a re-sent, tidied-up version here: http://www.hep.ucl.ac.uk/~markl/pp/Reply_jj2.pdf. I'll be referring to the last of these, since it is actually coherent, if misleading.

The letter spends a great deal of time missing the point by emphasising the overall increase in science funding. We are aware of this, but it raises the question of how fundamental science research can be cut at the same time as increasing overall funding. The minister is then careful to describe the STFC budgets in terms of raw sums, without acknowledging the obvious effect of inflation, and less obvious factors such as the concurrent move to full economic costing of research grants, the removal of exchange rate protection on international research subscriptions (e.g. for CERN) and the fact that most of the budget is already ring-fenced. The effect is that the 13.6% increase over 3 years, of which he is remarkably uncritical, translates into an £80m shortfall over 3 years. This is not big money when compared to the costs of many other government schemes, but it threatens the future of this whole area of science research in the UK.

The rub is that this £80m leads to a 25% shortfall in real grants --- a violent blow to a research area in which the UK is internationally prominent and successful. This will create serious problems for Durham University's physics department: we receive more than 50% of our research funding from STFC. To make matters worse, the STFC delivery plan in the face of these cuts involves focusing its efforts on the industrially-connected parts of its research portfolio. This will take more funding away from particle physics and astronomy in universities. We are in serious danger of paying huge subscriptions to international research sites like CERN, but having no money to pay the university research staff who would use them.

The minister also attempts to invoke the Haldane Principle to distance the Government from these problems, by saying that the decisions of where to make cuts have been made purely by STFC. While this may be true, it's obviously rather hard to reconcile the need for cuts in STFC research grants with the "budget increases allocated to it." It's also evident that these decisions would not have been necessary if the budget had been sufficient to cover STFC's operating costs. The DIUS was briefed by the STFC that a flat cash settlement would result in huge and unacceptable cuts to research, and still a flat cash settlement was chosen by DIUS: this was either intentional, in contradiction of DIUS's own 2006 white paper Next Steps, or incompetent. With this settlement, since STFC had no freedom to withdraw from the bulk of its ring-fenced expenditures, the result is that they have had very little room to manoeuvre at all. Their decision, to prioritise funding on the more practical and industrially connected aspects of their portfolio, may be understandable, but the fundamental research that was previously funded via the PPARC research council is definitely getting the raw end of this deal.

I should emphasise the importance of this "blue skies" research, for it is not as isolated or irrelevant as it may be perceived to be. Without basic physics research, many lucrative and important practical technologies such as the MRI scanner could not have been developed. The World Wide Web was an offshoot of particle physics research. Additionally, particle physics and astronomy are the "poster boy" subjects which encourage young people to get into physics in the first place: most end up either researching in more directly practical areas or taking their numeracy and computing skills into industry. With UCAS intake levels for physics already worryingly low, cutting the very area which most encourages the intake seems completely contradictory to the overall Government policy on education and science.

It's not too late to do something about this situation: interim funding of £20m until the Wakeham review reports in the autumn would avoid going past the point of no return. Given the contradiction between the consequences of this policy and the DIUS Next Steps 2006 white paper, it seems evident that this is an unintentional situation, which threatens to disproportionately decimate the research being done by STFC researchers and the international reputation of Britain as a supporter of fundamental research. Please do what you can to ensure that these dismal predictions do not come to pass.

Yours sincerely,
Dr Andy Buckley
Institute for Particle Physics Phenomenology
Department of Physics
Durham University

65a96a859ce10d963c64/452702349059e20f34d9
(Signed with an electronic signature in accordance with subsection 7(3) of the Electronic Communications Act 2000.)

Particle physics crisis

As some might be aware, the Government either has a death wish for fundamental science in the UK, or is just showing its affection for it in a very strange way: cutting our funding by more than a quarter.

There's a complex back-story to this, but sort-of-briefly:

  • Back in July 2007 the Particle Physics and Astronomy Research Council (PPARC --- the body that people like me get their research money and wages from) was merged with another council, CCLRC, to form the Science and Technology Facilities Council. STFC does the same fundamental science as PPARC did, plus a range of more industrially-connected science which needs big technology. This set off the spidey-senses of a lot of PPARC scientists, as our research doesn't directly hit the fashionable "technology transfer" targets. But the merger was promised to be properly funded, i.e. to ensure that deficits from CCLRC wouldn't impinge on the ex-PPARC projects, so everyone breathed a sigh of relief.
  • Since at least November 2007, STFC has been aware of a Government funding deal from the Department for Innovation, Universities and Skills (DIUS) which effectively gives it a "flat cash" funding projection for the next 3 years. While this looks okay, due to science research funding concurrently changing to a full economic costing (FEC) model, it will actually leave STFC with a deficit of more than 25% on the part of the budget that isn't already ring-fenced for international subscriptions like CERN.
  • Despite vigorous campaigning from scientists, the STFC has announced its "Delivery Plan" to meet this constraint, consisting of a full UK withdrawal from the International Linear Collider programme and the southern Gemini telescope, as well as an expected >10% cut in the number of postdoctoral research staff (i.e. people like me). The cuts are likely to lead to closure of many university physics departments, but the Government continues to claim that it's increasing funding, despite being perfectly aware that such a naive picture doesn't acknowledge the effects of FEC.

There's lots more information in these places, if your interest is piqued:

Anyway, frankly, it's rather disappointing to start a research career and then find your subject being shut down a few years later, for the sake of what is really very little money: STFC's deficit is £80m over 3 years, which isn't much compared to the roughly £1.5bn per year for the Iraq war, £12bn for the dysfunctional, unwanted NHS Connecting for Health scheme, potentially £30bn over Northern Rock, £8bn, or maybe more now, for the 2012 London Olympics, probably the upper end of £6bn-20bn for the "National Identity Management Scheme" (ID cards++) and so on, ad infinitum. Naturally, this really makes us feel wanted; maybe we "blue skies" scientists should have kept the Web and the MRI scanner to ourselves.

Anyway, lest you get the impression that STFC are righteously and courageously defending particle physics, astronomy and suchlike against an onslaught from a government obsessed with financial short-termism, take a look at this BBC article about the STFC CEO's testimony before the DIUS funding committee on Monday. In particular, note that PPARC science hasn't done terribly well out of the merger: Mason cites Diamond, space science and laser research (all formerly CCLRC projects) as doing well, while "We have had to constrain some investments (particularly in the particle physics and astronomy programme)". No shit.

Anyway, this all looks like we're going to hell in a handcart, but good for the IOP and the RAS for doing what STFC should have done all along and proposing a sensible, affordable way to postpone these cuts until after the situation has been reviewed. As they say, "We're talking about £20m --- it's not a terrific amount of money, in order not to allow things to go beyond the point of no return." Hear, hear.

Just for the record: upgrading to AMD 64 with Ubuntu Gutsy

I just upgraded my old (3+ years... I guess that's old these days) 32 bit single-core Celeron PC to a 64 bit dual-core AMD X2 processor, along the way taking the opportunity to pick a replacement motherboard that does all the video, LAN, Firewire etc. things for which I was previously using extension cards. No big news, except that I'm happy to report that no OS upgrade was needed: my 32 bit Ubuntu Gutsy has neatly popped itself into 32 bit compatibility mode, so I don't need to do a major software reinstall at a busy time. Cool!

The only thing that was a bit of a faff was getting the video output in the right mode, and that was entirely because of nVidia's binary-only driver distribution --- Ubuntu needs to be prodded (via the GUI) to enable this driver, since it isn't distributed in an open-source-compatible way. Otherwise everything is perfect and it's nice to have the two processor load trails appearing in my system monitor when I compile things now :-)

"So this is the New Year...

...and I don't feel any different." Thank you, Death Cab for Cutie, for proving once again that every occasion has a melancholy song to accompany it.

Not much to report from Buckley Towers, other than that pre-Christmas was pretty hectic, early 2008 looks set to be just as busy, and the last two weeks have been a wonderfully welcome break among friends, relatives etc. We headed off to Belfast in the car (and complete with cat) on 21st December and generally vegged in my parents' new house for the next 9 days. Christmas itself was lovely, it was good to see people again and I even managed to see some school friends for the first time in almost 10 years. The jury is out on whether my football skills have improved at all in the last decade. We got back from Belfast just in time to see Jo's brother Martin for an evening, brother James for an hour and then zipped down to Cambridge for an impromptu New Year's celebration. Cracking stuff, though the mileage was pretty high... a dent in our collective resolution to be yet more ethically conscious this year :(

This break was also unusual in that my parents haven't got an internet connection yet, so I had to go cold turkey for almost a fortnight. It's good to be back in electronic contact with the world, although I haven't yet dared look at my email. I'm now relishing the chance to sit down in front of a computer to organise things, write some code, buy things, plan buying more things, get back to work, scan another night-climbing book and write inanities (like this). Sad but true.

Mad props (and some random warbling)

I actually wrote this about a month ago, but for various reasons it hasn't emerged 'til now... so if it seems hokey and out of date, I blame the posting delay rather than my intrinsic lack of insightfulness :)

Nathan recently blogged about Python's neat approach to getter and setter methods, namely the property() constructor.

This allows programmers to seamlessly "upgrade" from naive direct access to class members, e.g. foo.x, to contract-maintaining getter and setter functions without changing the interface, by declaring

def getX(self):
    ...
def setX(self, newx):
    ...
x = property(getX, setX)

or, the even neater new read-only property decorator form

@property
def x(self):
    ...

and its optional setter extension:

@x.setter
def x(self, newx):
    ...

This is very sweet and I'm ashamed to say that despite writing more Python than Nathan I wasn't aware of it. (This lack of advertisement is neither new nor unique to properties --- Python's "special" method names aren't exactly promoted in the official docs, either.) As well as (as Nathan points out) breaking the tedious routine of getter/setter boilerplate code ("just in case it becomes huge and complex later..."), first class properties also have a nice aesthetic consequence which is that there is only one standard way to access data members. While I would squirm a bit at seeing foo.getx() rather than foo.getX(), I have no problem at all with foo.x. Obviously this is stupidity on my part, but not having the room to build opposing schools of coding convention is in its way rather liberating. We're so accustomed to seeing property access denoted by parentheses that I find this notation rather refreshing: it used to be that you had to go to rather obscure, usually functional, languages to get such a change of syntactic scenery, so it's nice to see the architects of a major and popular language thinking beyond the more redundant wrinkles of C-centric syntax.

On the other hand, while this is undoubtedly cool, I do still have The Fear when it comes to deploying Python on big projects. The lack of any significant pre-run type-checking still scares me and I sometimes get the feeling that dynamically typed languages are a fast-track route to unmaintainable code unless some serious development discipline is wielded. On the other hand, the biggest bits of Python code I've written are more maintainable than they have any right to be! Ideally, Python would have some sort of "static duck-type checking", i.e. attempting to verify that the required object methods should normally be present. That's impossible generally, perhaps to the extent that it's a completely dumb idea, but with C++ I find that ditching polymorphic typing via inheritance hierarchies in favour of template-based duck typing can be very handy. Of course, the downside is that the rest of C++ is evil. (Largely due to its unholy trinity of raw pointers, manual memory management and the legacy can of worms that is the C preprocessor. A recent long discussion with an experienced C++er has only reinforced my opinion that most clever C++ tricks and idioms exist to fix problems introduced by one of these three. And the innocuous looking preprocessor is responsible for a good half of those. Work I've done since then on heavily templatey (i.e. header-obsessive) C++ code has reinforced it even more.)
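To sketch what I mean by template-based duck typing in C++ (with toy Circle/Square types of my own invention, nothing to do with any real framework): a function template compiles for any type with the right methods, no inheritance in sight.

```cpp
// Toy types, purely illustrative: no common base class, no virtual functions.
struct Circle {
    double radius;
    double size() const { return 2 * 3.14159265358979 * radius; }
};

struct Square {
    double side;
    double size() const { return 4 * side; }
};

// Compile-time duck typing: this instantiates for *any* T with a size()
// method returning something convertible to double. Pass a type without
// size() and you get a (famously verbose) compile error rather than a
// runtime surprise.
template <typename T>
double perimeter(const T& shape) {
    return shape.size();
}
```

The checking happens at compile time, which is the closest C++ gets to the "static duck-type checking" I'm wishing Python had.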

Java is much better --- not only does it have a garbage collector and a compilation model without medieval overtones, but since version 1.5 we also benefit from a more powerful and subtle generics system than C++, built-in reflection and the killer metadata feature: annotations. Even more cool is the potential of the combination of Java with Jython: the Jython interpreter speaks and writes Java bytecode, meaning that true interchangeability of code objects is possible. Not only possible, but easy: Jython can run Java library classes out of the box without the need for wrapper systems like SWIG and Boost.Python. I think this really is the way of the future: a common bytecode on top of a lightweight VM allows programmers to choose appropriate languages for different tasks without either reinventing wheels or breaking a sweat. This isn't just fanciful extrapolation, either: the JVM apparently now plays host to over 200 languages including Scala, a neat functional OO language, Jython, BeanShell, Pnuts and Groovy. Maybe the time of the virtual machine has come after all.

Aaargh! C++ drives me round the bend (again)

I'm mentally exhausted, and I blame C++. I've spent a good chunk of my work time and mental energy for the last couple of weeks trying to rewrite the CLHEP physics vectors, matrices and Lorentz boosts, and it seems like I've been fighting the language every step of the way. Here's a selection of the C++ joys I've encountered:

  • Since my vectors and matrices are templated, I have to put the code in header files (cos that's the mark of a really good generics implementation). And so I have to recompile every class in the package every time I make a minor change to the vectors.
  • It's nice/good practice to have vector and matrix functions defined as external functions and implement the class methods in terms of them - hence I need a vast selection of fiddly, templated forward declarations and friend statements before every class.
  • I also want to use templated constructors, but C++ templates are so dumb that I have to go through some "traits" nonsense to stop my Matrix<N> operator<< from thinking that it should try to copy construct itself as a four-vector.
  • Since this is just about the most canonical application where operator overloading is relevant rather than just syntactic sugar, I need to implement the whole slew of operator<, operator==, operator+, operator*=... even though each group of four or so of these things is really just derived from one operation.

The really painful thing is that I'm aware of the D language and I know that it solves all of these problems. In fact, C++ is the only OO language that I might use which has all of these wrinkles, but I really have no choice: this painful procedure has to be done in C++, because that's the language that the HEP community has decided upon. We might as well have decided to poke ourselves in the eye once every 5 minutes for the next ten years.

I know that I'll keep painfully grinding along for the duration of this mini-project until I've got what I think will be the nicest set of physics vector classes on the market, which will be some consolation. But in the back of my mind, I'll be trying to work out how our community can possibly steer itself (or be steered) towards a more modern, less masochistic language. Usually, the people who make the decisions are the ones who don't write the code: as a very code-centric, and generally computing-aware physicist, I want to know if there is any way that I can change things, in at least this one case.

While I really like the look of D, I'm not sure it offers much, if anything, over Java: but the elimination of a VM and hence the overhyped stigma of slowness might be the psychological kick up the arse needed by our community. Yes, I know about native Java compilation with gcj; yes, I know that the benchmarks for Java's speed really aren't that bad; and no, I don't think physicists can be convinced that it's fast enough. The language shootout results for D vs. Java and D vs. C++ are of interest here and any HEPers with an interest in improving analysis frameworks and simulation code should have a look. Since I think this, and a few other selected aspects of HEP computing are worth a longer discussion, I promise (threaten?) to return to the breach when I'm a bit less knackered.

But why leave on a negative note? Get a load of the work Sergei has been doing on his brilliant jHepWork Java analysis framework. Given that one guy, in a year, has produced a more compelling package than 15 years of C++ development on ROOT, I think it can be safely said that the language argument is more substantial than just syntactic sugar. If only it had a catchier name...