2009/12/29

The Urge

About a year ago I wrote:

I accepted long ago that my mind latches onto ideas with a terrible grip, and it’s inevitable that something I’m currently spending time on will overwhelm my concentration, to the detriment of all other tasks and thoughts.

I’m starting to realise that I can predict when this will happen. It usually happens when there’s a natural lull in my work, such as the time just after completing a paper or finishing off some code. In the quiet time when my brain starts asking and wondering “What’s next?”, I’ll queue up a bunch of possibilities, and regardless of their order of priority there’ll sometimes be one that simply overruns my thoughts. I’ll be able to picture in detail the stages of the task and exactly how to get started, and I’ll feel an urge to get working on it; generally I find that there’s no point resisting at this stage and I’d better drop everything to focus on this one thing.

If I don’t drop everything (and attempt to work on what is most important rather than what is most motivating) I’ll generally start drifting into the fun work anyway, in and out of work hours, and it’ll slowly take over; in the meantime, I’ll have been attempting to multitask between a motivated task and an unmotivated one, and I’ll generally have been less useful than if I’d concentrated on the former alone.

(Of course, there’s always the backlog that accrues during a time of blinkered/focussed working, and it’s never fun to sort that out after emerging for breath. Still, it’s often from the backlog that the ideas for “what’s next?” emerge.)

I think that learning how to transition continually between these focussed tasks is really the goal of productive work. It’s the dead time in between that’s soul-killing, when you finish a week or a month or a season and ask yourself “What have I been doing all this time?”.

2009/12/26

Brief comments on: Everything and More (2003) by David Foster Wallace

I don’t really intend to go on for too long about Everything & More, D.F.W.’s book about the history and philosophy behind infinity and maths. Suffice it to say that you’ll either like it or not depending primarily on two factors:

  • Like: If you are a D.F.W. fan (which counts me in)
  • Dislike: If you know a lot about the subject matter (i.e., you’re a set theorist–mathematician yourself, which counts me out)

This is assuming, also, that you’re the type of person who likes to read unique works of literature and/or popular science–type things. Without a fairly good working knowledge of maths, be prepared to work hard to follow along with the good proportion of the book that deals with technical content; it’s not impossible, however: the friend who lent the book to me managed to get through without much background knowledge.

Apparently there are a certain number of technical inaccuracies, or elements of the story that are simply wrong. I’m not one to judge these (the only error I ever saw was a basic oversight-style typing mistake in a big list of examples of the application of differential equations, or something like that, in which was written F = m dx/dt), but it’s easy to find numerous critiques of the book in which the technical content is rather heavily denounced. However: it doesn’t matter. First of all, in some cases I’m sure D.F.W. was aware of the technical problems his popular descriptions required; he admits as much quite early on. But secondly, this is not a book from which to learn set theory in a mathematical sense. Just as you read Gödel, Escher, Bach to get a taste of some rather meaty mathematics in the context of a much broader philosophical discussion, Everything & More does a truly excellent job of elucidating how all the things we learnt in school and university about abstract ideas such as irrational numbers and limits tending to infinity really were huge mathematical/philosophical problems back in the day, and why we should do better than take them for granted.

The best example of this is the way anyone who’s done a little university-level maths can reject Zeno’s paradox because they’ve been taught about the convergence of infinite series. How can you cross the road if you first have to reach half-way, and before you reach half-way you must reach the quarter-way mark, and before that get to one-eighth of the way, and before that one-sixteenth, … , ad infinitum? Anyone who’s studied the maths “intuitively” knows that this particular infinite sum equals one, q.e.d., but this result requires them to have already abstracted in their head the very idea of an infinite sum itself as something that is actually possible. It’s not exactly easy to explain how this works without using terms like “tends towards infinity” that a priori assume that infinity is an abstract concept that can be used to explain infinite sums. The formulation of a rigorous (explanation of a) proof is sort-of the main goal of the book (plus fleshing out a good amount of material about how this happened historically, and the effects this had on the mathematics of the time leading into the current era).
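For concreteness, here is the standard modern resolution as a minimal worked sketch (this is just the textbook calculation, not a quotation of how the book presents it): the partial sums of the halving series creep up towards one, and the “infinite sum” is then defined as their limit.

% Partial sums of Zeno's halving series, and the limit that defines the infinite sum.
\[
  S_N = \sum_{n=1}^{N} \frac{1}{2^n} = 1 - \frac{1}{2^N},
  \qquad
  \sum_{n=1}^{\infty} \frac{1}{2^n} = \lim_{N\to\infty} S_N = 1 .
\]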

This is a book about how certain results were discovered, with a sufficient explanation of those results to expand your mind a little or a lot. And like all of D.F.W.’s writing, even just going along for the literary ride is well worth the effort.

2009/12/08

Dean Allen on philosophy of self

In explaining something he did about something he made, Dean Allen writes:

I’ve spent the past year or so reading and writing and doing my level best to chip away at 40 years of belief in the logical fallacy that one’s identity – meaning self-worth, self-image, whatever you want to call it – can accurately be measured in the thoughts of others. Much as you and I may enjoy being encouraged through recognition and praise and dislike being saddened by rejection or indifference […], deriving personal value from these transactions in the absence of a well-formed internal frame of reference through which you can decide on your own what does and doesn’t work, and subsequently accept the opinions of others as feedback, is just plain faulty thinking, of the sort that makes otherwise capable, centred people all loopy and weird.

2009/11/10

Coders at Work

I recently bought and read Coders at Work, a collection of interviews with past and present people of influence in the programming world. Very easy book to casually read. A few typographical problems, but I have fairly low expectations. Full of fascinating stories that really made me think about how far we’ve come in some areas and how little we’ve progressed in others. I recommend it if you’re into that sort of thing.

I found three quotations particularly poignant. I don’t have anything to add to them; make of them what you will.

Douglas Crockford:

My interest in programming is helping other people to do programming, designing a language or a programming tool specifically so that it’s more accessible to more people—the thing that got Smalltalk started. Smalltalk went in a different direction, but the initial direction was really attractive to me. How do we build a language specifically for children or how do we build a language specifically for people who don’t think of themselves as programmers?

Ken Thompson:

I think by necessity algorithms—new algorithms are just getting more complex over time. A new algorithm to do something is based on 50 other little algorithms. Back when I was a kid you were doing these little algorithms and they were fun. You could understand them without it being an accounting job where you divide it up into cases and this case is solved by this algorithm that you read about but you don’t really know and on and on. So it’s different. I really believe it’s different and most of it is because the whole thing is layered over time and we’re dealing with layers. It might be that I’m too much of a curmudgeon to understand layers.

Fran Allen:

Isaac Asimov made a statement about the future of computers—I don’t know whether it’s right or not—that what they’ll do is make every one of us much more creative. Computing would launch the age of creativity. One sees some of that happening—particularly in media. Kids are doing things they weren’t able to do before—make movies, create pictures. We tend to think of creativity as a special gift that a person has, just as being able to read and write were special gifts in the Dark Ages—things only a few people were able to do. I found the idea that computers are the enablers of creativity very inspiring.

Information wrangling

I’m not particularly happy with the state of how I collect, absorb, share, and store generic information electronically. (On the whole, I’m generally happy with how I browse information, however. It’s not too hard to spend too much time reading about interesting things.)

  • I use Google Reader to browse information, from which I can also share items I find interesting.

  • I use Delicious to share other things that I’ve read from plain old “web browsing”.

  • On my computer, I store in a BibTeX database a subset of whatever appears in the above two public feeds plus “other things” that I haven’t shared for whatever reason.

    The BibTeX database also contains links to local content on my machine, so I can still read articles and watch videos that I’ve collected, even away from a network.

  • Keywords or tags are used separately both in Delicious and in BibTeX to help organise the items therein, but the tagging is inconsistent and not kept in sync.

It’s all a bit of a mess. To top it all off, I still don’t have any good way of organising all this local information so that I can browse through it in a way that doesn’t remind me of combing through a poorly-maintained database (which is exactly what it is).

Furthermore, I have no way of sharing with anyone the organisation of my actual research literature. Someone coming along to do a literature review similar to mine will either have to read my thesis or start from scratch, and even reading my literature review will hardly give a good overview for their research interests.

There are sites like Cite-U-Like and so on which aim to make academic literature reviewing a more “social” activity, but the fact that they are web applications means that I still need to locally sync my databases whenever I add new content. Manual syncing is simply not a useful solution. (Aside: I’m looking at you, Things.app.)

Furthermore, I can’t maintain an online database of research and be able to store the papers offline in a local database that contains a superset of the online content. In other words, I don’t want to maintain a folder structure and file-naming scheme for content that’s mirrored in a database.

All of these factors indicate to me that there’s still something missing to tie together all of these different aspects of information wrangling. I hope a solution rears its head at some stage in the not-too-distant future. What’s the point of 3D animation on the web if we don’t have similarly-advanced information systems with which to play?

A measure of low academic funding

The common theme in Australian academia is that lecturers are hired on their research merits but must spend too much of their time teaching and performing administrivia, which ends up sucking the life out of the research side of their job.

Certainly in my department, the more established a lecturer becomes, the lower their publication count trends, in general.

Now, I know publications are not everything, because a professor will often be sitting on top of a pile of other researchers who are doing the grunt work below them. Still, in my opinion a healthy researcher should be writing at least some of their own papers, to indicate that they are actually doing some of the research.

Could this be measured to a degree by looking at the publication output of every research group in every school in every university in Australia? Interesting results might appear after crunching the numbers on things like

  • Normalised number of publications per person.
  • Ratio of professors to academics to postdocs to postgrads.
  • Weighted number of publications per role according to their ratio in the research group.

For example, let’s say that the ratio of roles is something like 1:4:6:20 (I have no idea if these numbers are feasible or representative). Papers published by a person in each role should be weighted by the inverse of these numbers, so the total number of publications per role is given equal weight (one paper by a professor is equal to twenty papers by the PhD students). Justify this by saying the time of a professor is twenty times more valuable than that of a grad student.

Now sum the papers per role according to these weights; if you see a large discrepancy from a 1:1:1:1 ratio of weighted publications, something is looking a little fishy. (This is just a hypothesis; I presume the numbers could be analysed a posteriori to determine that a healthy ratio might look like 1:2:4:3 or whatever.)
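To make the weighting concrete, here’s a minimal worked sketch with invented numbers (the paper counts below are purely hypothetical, chosen only for illustration): divide each role’s raw paper count by its share of the headcount ratio, then compare the weighted counts across roles.

% W_r = weighted publication count for role r, given headcount ratio k_r and raw paper count P_r.
\[
  W_r = \frac{P_r}{k_r}, \qquad
  (k_r) = (1, 4, 6, 20), \quad
  (P_r) = (2, 12, 30, 80)
  \;\Rightarrow\;
  (W_r) = (2, 3, 5, 4),
\]

i.e., a weighted ratio of 2:3:5:4, whose departure from 1:1:1:1 is exactly the kind of discrepancy described above.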

Even if this isn’t at all a suitable way to gauge the research health of a research group, it should at least be useful for comparing the publication output of similar groups around the country in more detail. If it turns out that RMIT’s academics publish far more than Adelaide’s (after you’ve removed the masking influence of high-publishing postdocs, say), isn’t that a difference worth investigating?

Of course, my opinion on all of this comes back to basic funding at a tertiary level: the number of academics per student should be increased to give the academics more time for research. My aim in discussing ideas like “weighted numbers of publications per research role” is biased towards indicating this at some level. But I don’t have the time to actually look into the matter. (Especially since I’m not actually part of the system at the moment.)

Does anyone do “research on engineering research”? In my opinion, they should.

2009/10/26

Knuth on addiction

Donald Knuth, Adventure (PDF):

Clearly the game was potentially addictive, so I forced myself to stop playing — reasoning that it was great fun, sure, but traditional computer science research is great fun too, possibly even more so.

2009/10/22

The nature of procrastination

When your mind is distracted to the point of not starting an impending task.

That’s not quite it.

The adage is that after being interrupted it takes fifteen minutes to get back into the “flow” of working.

The first fifteen minutes, then, is crucial. Before even starting something. If your mind slides off track at any point before the fifteen minutes is up, gotta start again.

But how often does my mental trigger kick in to read email, refresh feeds, check newsgroups (and, in the past, Twitter and Facebook and New York Times and …)? In fact, I still get mental triggers to visit news sites I haven’t read in years. Which scares me, frankly. What sort of rut did I get myself into that my brain still brings it up multiple times every single day even if I never (in years, now) respond to it? Is that the side-effect of the addictions of youth? When will it go away?

(To see if you have such mental triggers of your own, close all the windows on your screen and open up a fresh and blank browser window and try and think of nothing. The first thing that pops into your head to type into the address bar?)

Worse than all of the above, how easy is it to slide from the hard tasks of involved research writing/reading/coding to the easy tasks of fixing bugs or renaming variables in my latest toy project? Especially if you can trick yourself into thinking that your toy project can substitute for your real work.

Sometimes, I even get a mental trigger to write things on the internet, things which people already know and which don’t help either the people reading them or the person writing them.

Unlike Merlin Mann. (I linked him before, but no harm in repetition.) He says everything on the topic much better than I’ll ever be able to (albeit this time in an uncharacteristically difficult-to-quote way):

…developing those invaluable tolerances [to “stick with [your work] at the time you’re most tempted to run away”] requires the exercise of some very small muscles. The muscles are super-hard to locate, and once you do find them, they hurt like a bitch to exercise.

Ain’t that the depressing truth.

Well, I’m off to do the dishes. And then get back to work.

2009/10/12

Matlab vectorisation

Wikipedia tells me that Donald Knuth said:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

(I figure you don’t need a citation when you mention “Wikipedia” in the sentence.)

Back in the old days, the fast way to do things in Matlab was to use “vectorised” code which operated on entire arrays rather than on individual elements; loops were the devil. More recently (2003-ish), Matlab gained a just-in-time compiler, eliminating the old bottleneck. (Update: Thanks, Ben, for pointing out my mistake there; not sure what I was thinking when I wrote 2007. Perhaps I didn’t get access to Matlab-with-JIT until some time later; I forget.)

But you still sometimes see advice to use vectorised code whenever possible. In short, this is a bad idea on performance grounds alone.

For example, the above-linked advice gave the trivial example:

% Extremely slow:
for i = 1:length(x)
   x(i) = 2*x(i);
end

% Extremely fast:
x = 2*x;

Interested, I tested this out.

[Update: Ha, “I tested this out” completely incorrectly, because I was hasty and hadn’t used Matlab in a while. So don’t mind me on that particular point. However, the following still stands:]

Your rule of thumb should be: write the code that makes the most sense when you’re writing it. If it’s slow, try and fix it then. Vectorised code can get damned hard to write and harder to read. It’s only worth it if it saves you real time running the code. And I’m talking hours and hours of time difference here.

When you write x=2*x, you should do so simply because that’s the clear logical representation of the operation “multiply each element of x by two”. But just because you use vectorised code here doesn’t mean you always should.

GitHub from the command line

Here are a couple of shell aliases that I’ve found useful recently. To use, add them to your .bash_profile file. All of these commands are intended to be run from the working directory of a Git repository that is connected to GitHub.

  • Used in the following, more useful, commands, this returns the GitHub identifier for a repository: (E.g., wspr/thesis)

    alias githubname='git config -l | grep '\''remote.origin.url'\'' | sed -En   '\''s/remote.origin.url=git(@|:\/\/)github.com(:|\/)(.+)\/(.+).git/\3\/\4/p'\'''
    
  • On Mac OS X, this opens your GitHub project in the default browser: (I’m guessing it needs some adjustment to work under Linux.)

    alias github='open https://github.com/`githubname`'
    
  • Similarly, this one opens up the Issues page:

    alias issues='open https://github.com/`githubname`/issues'
    
  • Finally, this one returns the number of open issues in the GitHub project:

    alias countissues='curl -s http://github.com/api/v2/yaml/issues/list/`githubname`/open -- | grep '\''\- number:'\'' | wc -l'
    

Via the GitHub Issue API, it’s possible to extract all sorts of useful information programmatically that could come in handy for project management. Use the output of this URL to get all the juicy info:

http://github.com/api/v2/yaml/issues/list/`githubname`/open

For example, I’d like to write a script to report summary details of all open issues across all of my projects/repositories. Saving it up for a rainy day.
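As a starting point, here’s a rough and untested sketch of the sort of thing I have in mind, in the same spirit as the aliases above. The repository names are just placeholders, and I’m assuming that each issue in the YAML output carries number: and title: fields (the countissues alias above only relies on number:):

#!/bin/bash
# Rough sketch: print the open issues for a hand-maintained list of repositories.
# The repository names below are placeholders; substitute your own.
repos="wspr/thesis wspr/pstool"

for repo in $repos ; do
  echo "== $repo =="
  curl -s "http://github.com/api/v2/yaml/issues/list/$repo/open" \
    | grep -E '(number|title):'
done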

It would also be interesting to write a script to run before pushing that checks which issues you’re closing (via the closes #xyz commit log interface) and shows a brief summary for confirmation before sending off the commits. That’s for a rainy weekend.
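And here’s a similarly rough sketch of that second idea, which stops short of pulling the issue summaries from the API: it just scrapes the outgoing commit messages for closes #xyz references and asks for confirmation before pushing (the origin/master..HEAD range assumes you’re pushing master).

#!/bin/bash
# Rough sketch: list the issues that the outgoing commits claim to close, then confirm the push.
issues=$(git log origin/master..HEAD \
  | grep -ioE 'close[sd]? #[0-9]+' \
  | grep -oE '[0-9]+' \
  | sort -nu)

echo "These commits claim to close issues: $issues"
read -p "Push now? [y/N] " reply
[ "$reply" = "y" ] && git push origin master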

2009/09/07

pdfTeX on Windows upgrade

It’s not often in the (La)TeX world that new versions of things break existing documents and packages. Unfortunately this has just happened to me.

I maintain a LaTeX package pstool (written in collaboration with Zebb Prime) that sort of uses TeX as a portable scripting language to perform external graphics processing during the typesetting run. (The main benefit is allowing you to use psfrag in pdfLaTeX.)

Since it executes programs on the command line, it needs to know whether it is being run on Windows or on some sort of *nix variant (Linux, Mac OS X, etc.). The package I use to detect this, ifplatform, is also written by me (in collaboration with Johannes Große), and it uses a heuristic based around the difference in how the shells in Windows/*nix deal with quoted material.

Unfortunately, this heuristic no longer works in TeX Live 2009 for Windows, as reported by Joseph Wright. “That’s okay,” I said, “Few people will be using TeX Live 2009 on Windows yet — no rush to fix the problem immediately.”

Unfortunately again, I’ve now discovered that the problem also exists in MiKTeX 2.8, which has been released in the past week, and which is much more widely used than TeX Live on Windows. I’ve received several good ideas on how to fix the problem via the tex-live mailing list, so it just comes down to trying a few of them out with the help of some of my Windows-using friends.

But for now, be warned: ifplatform (v0.2 from 2007) and hence pstool and auto-pst-pdf currently do not work in TeX Live 2009 on Windows and MiKTeX 2.8. A new version to fix this problem will be forthcoming.

Update: The new version of pstool is now available from CTAN, and it fixes the problem above.

2009/08/31

Burning happiness

David Sedaris: (‘Laugh, Kookaburra’ in The New Yorker)

This was not a real stove but a symbolic one, used to prove a point at a management seminar she’d once attended. “One burner represents your family, one is your friends, the third is your health, and the fourth is your work.” The gist, she said, was that in order to be successful you have to cut off one of your burners. And in order to be really successful you have to cut off two.

Doris Kearns Goodwin: (TED Talk, about 30 seconds in)

My mind keeps wandering back to a seminar that I took when I was a graduate student at Harvard with the great psychologist Erik Erikson. He taught us that the richest and fullest lives attempt to achieve an inner balance between three realms: work, love, and play. And that to pursue one realm to the disregard of the others is to open oneself to ultimate sadness in older age; whereas to pursue all three with equal dedication is to make possible a life filled not only with achievement but with serenity.

Small trade-off there. I know which one I’d rather tend towards.

2009/08/17

A few minutes with "The Now Habit"

I’m skimming through a section of “The Now Habit” loaned to me by a friend, and while parts of it do resonate quite well with me, there are sections that strike me as vaguely repugnant:

Jeff was stuck. He felt guilty about not making a contribution to his field and was feeling pressure from his colleagues to publish. But he was unwilling to make the commitment to the long hours of solitary work that were required to read professional journals and to write.

“He was unwilling” to do these things? That doesn’t strike the right chord with me. I’m perfectly willing to commit myself to my thesis. I just happen to break that commitment rather more frequently than I should.

As an aside, I can’t stand this style of self-help writing in which a supposedly-true pithy anecdote is given that fits the relevant points being made. What was Jeff’s solution, by the way? To spend two months acting in a play, and then to use the empty hole left behind, once he no longer needed to spend twenty hours a week rehearsing, to write. Well, not exactly a general solution, but the idea seems to be that if you spend time in concentrated pleasure, you’ll balance your life out enough to stop procrastinating.

What did I like about the section of the book? Here’s a quote of what sounded true to me:

The promise of future rewards for hard work has little control over what we choose to do now. Instead, the more immediate and definite rewards of life, such as leisure, seeing friends, and eating ice cream, are immediately and definitely followed by tangible pleasures and have, therefore, a higher probability of occurring.

This sums up the exact feeling I have towards procrastination. It’s not an anxiety thing or a fear of failure thing, or any of the other explanations given in this book; it’s just that there are so many short-term fun and rewarding things to do. Doing them is a hard habit to break.

2009/08/10

Mental effort directed against disposition and desire

It’s a problem with a well-known solution. But one that’s more easily said than done.

Nikola Tesla:

The possibilities of will-power and self-control appealed tremendously to my vivid imagination, and I began to discipline myself. Had I a sweet cake or a juicy apple which I was dying to eat I would give it to another boy and go through the tortures of Tantalus, pained but satisfied. Had I some difficult task before me which was exhausting I would attack it again and again until it was done. So I practiced day by day from morning till night. At first it called for a vigorous mental effort directed against disposition and desire, but as years went by the conflict lessened and finally my will and wish became identical.

Merlin Mann:

Given that your fears know you too well, they can capitalize on any uncertainty that they know you’d find intolerable. So, even a surprisingly trivial matter […] can suddenly seem extremely important and will swiftly divert your attention from the cool stuff you’d like to be doing onto….oh, whatever that other stuff might be. Better find out.

Procrastination and fractured attention is an addiction I’m terribly far from kicking. Even with a tidy desk, a tidy shelf, and long-past deadlines.

2009/08/06

Least easily distinguished

Edgar Allan Poe, Graham’s Magazine:

After reading all that has been written, and after thinking all that can be thought, on the topics of God and the soul, the man who has a right to say that he thinks at all, will find himself face to face with the conclusion that, on these topics, the most profound thought is that which can be the least easily distinguished from the most superficial sentiment.

2009/07/31

Not exactly DRM for news

I’m a couple of days late with this one. Ars discusses the seemingly ludicrous plan by the Associated Press to improve its news platform online, in which they claim to be able to deliver news online with a ‘tracking beacon’. The whole press release/news story sounded very odd, in that nothing they mentioned seemed possible with standard web technologies.

Their poor excuse for an info-graphic has received some hilarious commentary. I think I just like profanity in my parody.

But after some digging it seems to make sense after all, and my suspicions were confirmed that nothing they originally wrote actually describes what their technology does. The best description was a comment by ‘deet’ on the Ars article (for which the permalink seems broken, so you’ll have to scroll down and find it):

Look at this embarrassing DRM verbiage as a kind of sideshow for the old folks.

and

AP’s hope seems to be that this new specification for online delivery of AP member content will slow, stop, or at least reveal the activities of the more blatant rippers-off, while giving useful tools to legitimate publishers for monitoring and controlling the use of their content, which is entirely within their prerogative. Obviously, and as with any security system, a sophisticated attacker can circumvent these measures. And the AP knows this. What’s great about the tagging system is, if you’re a legit publisher, the tags had better be there. If the tags are missing, well, be prepared to hear from AP.

(Don’t just read these snippets, the whole comment is longer and interesting.)

Their plan seems to come down to some HTML metadata that, if present, flags the content as legitimate, and if absent (or used incorrectly) yells loud and clear that the text has been misappropriated. The big problems once this information becomes available and widely used are (a) to get people who are entitled to the content to use the metadata correctly, and (b) to somehow track down non–fair-use quotations of the text without being overwhelmed with false positives. (The former being necessary to even have a hope in hell of achieving the latter, assuming that simply looking at the domain name of the hosted content isn’t enough.)

It’s not clear to me that this latter problem is made any easier by the absence of some metadata.

2009/06/29

Academic funding, and medical data

An interesting article over at the New York Times on cancer research strikes a chord with the way academic funding works in general. Too little money to go around, so those who play the game the best get the (research) money. Usually, those who get the money produce results, so I couldn’t say the system is entirely broken. It’s the lack of money that is the bigger problem. (Cue comparisons with defence budgets.)

But much more interesting than the article is one of the comments left. It rings true with my own thoughts on how the government and the health care system should be feeding data back into the research arena:

The problem with medical research, generally, and cancer research, in particular, is that the amount and range of data that is used to map the causes and course of disease is far less than it should be.

[…]

We must gather consistent and substantial data on large numbers of the population, both those who appear well and those who are ill. […] The best way to accomplish that is to provide every citizen who wants it a substantial medical checkup either once or twice each year. All of that data would be placed in a database maintained by the National Institutes of Health and would be available to any credentialed researcher. Identity of patients would be shielded by assignment of an anonymous ID, that permits tracking of that patients medical history, but that does not otherwise disclose who the patient is.

This would be a gold mine of data, and serve not only to help understand the health of the nation but also to improve it. A gargantuan effort, most definitely, but also an enormous reward.

Pity the idea itself is too far outside of the box.

2009/05/26

Australian five cent piece? Good riddance

In March last year, The New Yorker published an article on the penny in America. Clearly, it’s a ridiculous amount of money, and the article goes into detail about why it still exists and what a nuisance it is.

The case in America is more extreme (as in all things, seemingly), but Australia is facing a similar issue now over its 5¢ piece. A report or rumour on moves to scrap the thing has prompted objection from the Queensland Consumers Association, but their worries really sound ill-considered: “no matter what they do with the coinage, they manage to make sure the consumer doesn’t win”. Good luck with that particular argument.

Okay, let’s say you want to buy something that costs less than a dollar and it’s rounded up by, at most, 5¢; your purchase will increase by some shocking 5%–10%, but it’s only 5¢ maximum at any one time. It’s such a negligible amount compared to the overall cost of the weekly shopping that I’m rather appalled someone (speaking on behalf of the consumer) would seriously suggest it’s anti-consumer.

The efficiencies in eliminating this coin (which costs the mint on the order of four million dollars a year) far outweigh any nostalgia one may feel towards the little guy. You can’t imagine how futile and frustrating it feels to count hundreds of the things to balance a till when their sum comes to less than 0.5% of the total balance.

I’m very happy to look forward to saving precious minutes every night I count the till at Chocolate Bean.

2009/05/01

What's in an ‘a’?

When do you know that you’re correct? For me, in the world of grammar, the answer is rarely.

I’m reviewing the changes made to a paper of mine by the production team at Elsevier, and they’ve changed my sentence

An example of a system with such behaviour is…

to (emphasis added)

An example of a system with such a behaviour is…

Well, it doesn’t look right to me. And Google reports half as many hits for the latter compared to the former.

But I’m not confident enough in my knowledge of English to call them out on what looks to me like, at best, a matter of style.

2009/03/31

An abbreviated "git log"

I’ve been getting more into (the version control system) Git as I’ve worked more on LaTeX3 code (mostly through git-svn, although I’m also using Git for my PhD work).

Here’s an alias that quickly calls up a one-per-line list of recent commits:

[alias]
    recent = log --pretty=oneline --abbrev-commit -n 10

Update June 2010: I now use the following, which reports all of the commits made since the last push to the remote origin/master:

git log --oneline origin/master..HEAD

I find this much more useful in getting my head back on track after coming back to some code that I've been playing with but haven't made public yet. (End update.)

Add this to your .gitconfig file. Then call git recent to get a quick overview of what’s been going on recently. Saves calling up GitX when you just want to remember what’s going on.

Output looks like:

$ git recent
462e945 set_pTF for l3io
[...]
ccf25cf set_pTF for l3box
48c9f59 Rename \prg...nonexpandable to \prg...unexpandable
cccc41a \prg_set_pred.. improved
8bca027 New version of cs_generate_variant

Git is nice (but intimidating) in that it allows you to make all these friendly modifications but you need to get some pointers on using it all.

2009/03/24

Writing via iPhone

I was really hoping that this site would have a decent mobile view for writing. Alas. Here's hoping for MarsEdit for iPhone sooner rather than later.

Not that typing on this li'l thing feels very good on my thumbs. But that's what a childhood of Mario was supposed to prepare me for, right?

On an unrelated note, am I the only one that’s bugged by the fact that the iPhone keyboard uses curly quotes/apostrophes on its key caps, but hitting them gives you their 'dumb' (or straight) variants?

Oh, I’ve just happily realised that you can press and hold the keys to get glorious “curly” quotes. Still wish they could be a bit cleverer and auto-correct themselves, however.

2009/03/03

TeX Live 2009 freeze date

I’m not sure how many TeX/LaTeX developers read what I write here, but nonetheless.

TeX Live is the major distribution for TeX and LaTeX and related programs, released yearly by the TeX Users Group and coordinated by the tireless Karl Berry. TeX Live contains essentially everything on CTAN modulo the non-free branch, and is now the only supported distribution for Linux and Unix systems, including Mac OS X. (Windows users also have the option of using MiKTeX.)

Karl has just contacted me about fixing up a couple of my packages to go along nicely with a new feature that will be in TeX Live 2009 — a new “shell escape” mode that is enabled by default but accepts only a restricted (and customisable) set of commands.

Full shell escape can’t be turned on by default in TeX Live because it is a security hazard; one could write an obfuscated TeX document to delete your home folder, for example. However, it’s absolutely essential for some of the more convenient features that I (at least) rely on: being able to convert EPS figures on the fly (Heiko Oberdiek’s epstopdf); being able to pre-compile psfrag graphics from Matlab and Mathematica (see the pstool package); and so on.

So I have to go and look at auto-pst-pdf and the aforementioned pstool to make sure they behave correctly in this new mode, and to see if they can be improved to take advantage of it.

The initial freeze date for TeX Live 2009 is March 31, which has kind of snuck up on me. While TeX Live won’t be ready for some time after that (ironing out the wrinkles can sometimes take months), it would be unfortunate to miss the date.

I’ll have to go and check to see what we can do about the expl3 code, too…

2009/02/24

Declaring attention bankruptcy

Something changed in me early this year.

I don’t think it’s related to my going to Thailand for a holiday, but it might be something to do with my lack of doing anything for a terribly long period of time around Christmas and well, well into the New Year.

I’d like to say something grew in me, like a desire to simplify and quieten my mind. Spend less time absorbing others’ information and start creating my own. That might be an overly romantic take, however.

I left Twitter; basically, I stopped using it. I don’t exactly like this state of affairs, because there is worth to the service; it’s just that I started being overwhelmed by too many people. So I’ll trim down the list of people I follow, but not just yet. Still avoiding it, for now, for the most part.

I’ve stopped reading whatever the hell I was reading every day. Once I started unsubscribing from a few RSS feeds, I couldn’t stop — the whole stack collapsed and I’ve decimated (well, bit-shifted right twice would be more accurate, I suppose) the number of sites I’m following. Which has decreased even further the number of “New Items” appearing in NetNewsWire every day, since I’m now only really paying attention to “low frequency, high quality” writing. The effect on my reading habits has been profound; I’m not really linking things on del.icio.us at the moment. People would ask me how I’d find such random/interesting articles all the time. Well, spending a lot of time reading is how.

Instead, I’ve been working on the LaTeX3 Project for the most part. Writing test suites and discussing improvements to the syntax and plans for the future. It’s been really satisfying to actually get some stuff done, even if I recognise that my obsession with “collecting information” (in the guise of reading a lot every day) has been replaced by “I wonder what I can do next on the expl3 code”.

I accepted long ago that my mind latches onto ideas with a terrible grip, and it’s inevitable that something I’m currently spending time on will overwhelm my concentration, to the detriment of all other tasks and thoughts.

And this LaTeX3 work has all been a major distraction from my “real task” — I’ve got a thesis lingering over my head. Tomorrow I discuss with the academics what I’m to do about that, and I anticipate a great deal of soul-crushing acceptance of what there is left to do, how much work it will be, and how long it will take. Soul-crushing, because I should already know this but refuse to admit to the answers.

But sometime soon, the focus of my attention will finally shift back to this weighty document. And damn won’t it be nice to have the monkey off my back.

2009/01/29

Academic English

David Foster Wallace: (emphasis mine)

In other words, it is when a scholar’s vanity/insecurity leads him to write primarily to communicate and reinforce his own status as an Intellectual that his English is deformed by pleonasm and pretentious diction (whose function is to signal the writer’s erudition) and by opaque abstraction (whose function is to keep anybody from pinning the writer down to a definite assertion that can maybe be refuted or shown to be silly). The latter characteristic, a level of obscurity that often makes it just about impossible to figure out what an [Academic English] sentence is really saying, so closely resembles political and corporate doublespeak (“revenue enhancement,” “downsizing,” “pre-owned,” “proactive resource-allocation restructuring”) that it’s tempting to think AE’s real purpose is concealment and its real motivation fear.

From “Tense Present: Democracy, English, and the Wars over Usage” published in Harper’s Magazine, 2001. And that’s not the only good bit. A monstrously tremendous essay.

2009/01/21

George Orwell on bad English

An article I was reading in the New Yorker on past Presidents’ Inaugural speeches referenced an essay by George Orwell called “Politics and the English Language”, in which he discusses one variety of bad writing.

You know criticism is good when you recognise yourself in the examples being criticised. Call it a knack for knowing your own failings. But the essay itself is rather long; I’d like to share some of the better quotes.

From the end of the essay, the origin of some oft-heard advice:

(i) Never use a metaphor, simile or other figure of speech which you are used to seeing in print.

(ii) Never use a long word where a short one will do.

(iii) If it is possible to cut a word out, always cut it out.

(iv) Never use the passive where you can use the active.

(v) Never use a foreign phrase, a scientific word or a jargon word if you can think of an everyday English equivalent.

Some argue that these rules lead to overly simplified writing, but I’d say first that only people who understand the rules are allowed to break them. A description of one who does not understand these rules:

The writer either has a meaning and cannot express it, or he inadvertently says something else, or he is almost indifferent as to whether his words mean anything or not.

I’m certainly not one to talk. It’s far too easy to bang out a few (too many) words and be happy that someone, somewhere might be reading them. Or use those words as a crutch to remember some vaguely related point.

His translation of Ecclesiastes to “modern English” is brilliant. From:

I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.

To:

Objective consideration of contemporary phenomena compels the conclusion that success or failure in competitive activities exhibits no tendency to be commensurate with innate capacity, but that a considerable element of the unpredictable must invariably be taken into account.

I sure can see some of my own writing echoed in that example. It works so well because the translation does sound lucid and intelligent.

And finally:

[M]odern writing at its worst does not consist in picking out words for the sake of their meaning and inventing images in order to make the meaning clearer. It consists in gumming together long strips of words which have already been set in order by someone else, and making the results presentable by sheer humbug. The attraction of this way of writing is that it is easy.

George, you are damn right.

2009/01/19

Harvey (1950)

Just watched Harvey. What a wonderful movie. Puts me in mind of The Man in the White Suit, not for story or anything like that but for the feeling and the extraordinary acting and the uplifting philosophy shining from the whole thing.

They just don’t make movies like that these days, or not ones that I see anyway.

Elwood P. Dowd:

Harvey and I sit in the bars… have a drink or two… play the juke box. And soon the faces of all the other people they turn toward mine and they smile. And they’re saying, “We don’t know your name, mister, but you’re a very nice fella.” Harvey and I warm ourselves in all these golden moments. We’ve entered as strangers — soon we have friends. And they come over… and they sit with us… and they drink with us… and they talk to us. They tell about the big terrible things they’ve done and the big wonderful things they’ll do. Their hopes, and their regrets, and their loves, and their hates. All very large, because nobody ever brings anything small into a bar. And then I introduce them to Harvey… and he’s bigger and grander than anything they offer me. And when they leave, they leave impressed. The same people seldom come back; but that’s envy, my dear. There’s a little bit of envy in the best of us.

(Quote thanks to IMDB.)