Markissism — Have grade point averages made higher education completely lose the plot?

“I would like to take this course because the subject is highly interesting to me but I am worried about the risk. Suppose I scored 100% in some component, is there any chance my marks could be adjusted down just because other people have performed well? I have worked really hard in my academic career to build up a WAM [weighted assessment mark, see also grade point average or GPA] of XX and I’d rather take another course if this has any chance of dropping.” — Paraphrased from numerous anonymous later-year students asking about a first year course.

Of all the years to start a new admin role, 2020 was a rather interesting one. Notwithstanding all the coronavirus uncertainty over international student arrivals early on, the urgent decamp from campus to fully-online teaching in the space of weeks around late March, and the rapid adaptation to change in everything from lab classes to final exams thereafter, probably the biggest eye-opener for me this entire year has been the looming train-wreck that our recent obsession with the minutiae of metrics in the higher education sector holds for the future of student learning.

A favourite story from my undergraduate years is my first tutorial for Higher Complex Analysis in second year. It started with the tutor stating point-blank: “Welcome to higher complex analysis. This is a hard course. The typical fail rate is 50%. Look at the person next to you… One of you is going to fail.” To my right was Mr Sydney Grammar with his ATAR (Australian Tertiary Admission Rank, then called the TER) of 99.95, and to my left was Miss Elite Private School with a TER not far off. Half the room was the cream of Sydney’s elite private and selective public schools, and I was not from one of them. As Hunter would say…

A modern student (and some follow my twitter, so may read this) would probably ask why on earth I would risk a scraping pass or a credit when I could just walk an HD in the ordinary level course. What about my WAM? The answer is that no one cared about WAM back then; it didn’t even exist, at least as far as we students could see. What’s more, we barely even cared about marks, at least not in the way a modern student does. An HD was always nice, and a credit was a sign that you could have done better. Besides, the marks arrived by snail mail in the middle of the long university breaks, when our minds were on better things.

Back to the question of why, though? Because I wanted to push myself and see if I could hack the pace in higher maths courses. That was the culture: if you took yourself seriously as a student, you set out to take courses because they a) seriously challenged you and/or b) were interesting. Doing easy courses was just boring and ‘being soft’. And in the courses that didn’t meet a) or b), the mantra was often something along the lines of “50 is a pass, 51 is a waste of effort that could have been spent elsewhere”. There were courses that had legendary status in terms of difficulty, and they attracted serious interest. Taking a shot was respected, and coming out with a lot less than an HD was totally fine, and definitely not cause for major anxiety.

I think about what would happen if the first tutorial for higher complex in 2020 started like it did in 1994. Some would probably drop the course, but that likely happened in 1994 too. I’m sure the tutor was ‘putting the fear’ into us so we wouldn’t underestimate the course or lack solid commitment (Garth Gaudry was the lecturer, really nice guy, but he absolutely did not ‘pull punches’; it was fierce). Nowadays, there would probably also be a stack of emails like the one at the top to deal with, questioning exactly how the marking will work to protect WAMs, and possibly a stack of formal complaints about the tutor for intimidation, causing anxiety, or suggesting unfair marking practices.

“Their mask reflects what you seek and that is what makes it so nice at first. A manufactured mirror of your dreams” — Tracy Malone

But why are we not surprised? When we ourselves fill our funding proposals with meaningless journal impact factors and obsess/boast about our h-indices. When in a year full of important crises to solve, key amongst them the ever-declining funds available for teaching and research, some see high importance in coming up with yet another global university ranking scheme to measure by and boast about. When our students receive an email soon after their exams with their marks, and a key line is their term and overall WAMs… to several decimal places! And it’s not just my own campus — every campus in Australia and many around the world are no different. Look at any modern graduate CV and more likely than not there is a WAM or GPA or equivalent, prominently placed, given to decimal places. With all of this, we all soon become like an academic version of Narcissus, staring into our pool, constantly all-absorbed by our own key performance indicator, letting it dominate our whole existence.

I find it personally amusing, particularly for science and engineering students, because when you come into a first year physics course, what is the first thing that you learn? Uncertainty and significant figures. You start by asking how accurately you can measure things, and then how to present your results so that your level of certainty is clear. For example, if a quantity is 110 plus or minus 15, then writing 109.4857843 with no stated uncertainty is a glaring misrepresentation of your level of trust in that number. All those decimal places are meaningless garbage with an uncertainty at the +/- 15 level, and we dock students’ marks accordingly in their assessments for bad use of significant figures.
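To make the convention concrete, here is a minimal sketch using the numbers from the example above. The rounding rule shown is one common first-year convention (round the value to the decimal place of the uncertainty’s leading digit), not a universal standard:

```python
import math

# A value measured as 110 +/- 15: quoting extra digits misrepresents
# how well the quantity is actually known.
value, uncertainty = 109.4857843, 15

# Round the value to the decimal place of the uncertainty's leading
# digit (here the tens place, since the uncertainty is 15).
place = math.floor(math.log10(uncertainty))  # 15 -> 1 (tens place)
reported = round(value, -place)

print(f"{reported:.0f} +/- {uncertainty}")  # prints: 110 +/- 15
```

All the digits beyond the tens place are discarded, because the uncertainty swamps them; exactly the point we dock marks over.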

Meanwhile, here we are, presenting grade point averages and weighted assessment marks to several decimal places. The obvious question: what is our real uncertainty on such a number? From a statistical perspective alone, it’s at the very least the standard deviation of the numbers making up that average. Yet how many students have all their grades so uniform that the standard deviation is less than 1? The decimal places are meaningless from the outset, and we haven’t even got down to the uncertainties in the underlying grades yet. Amusingly, I can go look up my own transcript, which now includes a WAM (the printed transcript with my testamur from 1997 has no such number). My WAM is a touch over 85, and the standard deviation… 8.7. Yet there is my WAM in the system as 85.733, presented as though that 3 parts in 1000 is a truly meaningful digit that an employer can bank on. Am I better than someone with 85.732 and worse than someone with 85.734, truly?
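As a sketch of the point (the marks below are invented for illustration, not anyone’s real transcript), watch how a three-decimal ‘WAM’ sits on top of a spread of several whole points:

```python
import statistics

# Hypothetical transcript: 30 course marks, all invented for illustration.
marks = [92, 88, 78, 95, 81, 74, 90, 86, 79, 97,
         83, 88, 71, 93, 85, 76, 89, 82, 94, 80,
         87, 75, 91, 84, 96, 77, 88, 83, 90, 79]

wam = statistics.mean(marks)      # the headline number on the transcript
spread = statistics.stdev(marks)  # the uncertainty hiding behind it

print(f"WAM    = {wam:.3f}")    # presented to three decimal places...
print(f"st dev = {spread:.1f}")  # ...despite a spread of several points
```

By the significant-figures rule taught in first year physics, a spread that large means even the units digit of the average is marginal, let alone the decimals.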

And at least locally, the problem is exacerbated by the culture set in high school, where the fixation is on another single numerical score — the Australian Tertiary Admission Rank (ATAR). However, the two are on very different levels in terms of uncertainty and statistical significance. The ATAR is a number between 0 and 100, given to 2 decimal places, that reflects a ranking rather than a statistical average mark. For example, an ATAR of 95 means you are in the top 5% of your age group (based on age, not on year level). The statistical set underlying the ranking has approximately 55,000 students in it, and there is due consideration for the course set taken, relative difficulty, etc., with a rigorous and reported process. It may well be meaningful at its smallest increment (0.05), and it certainly is at the integer level.

The easy mistake is to assume a WAM or GPA is similar. But a WAM is not a ranking, nor is it statistically underpinned by all the courses of 55,000 students. It is little more than the raw average of maybe 30-odd scores between 0 (in practice, 50) and 100, often with no weighting for course difficulty or year level, and it inherently lacks statistical significance or tight certainty (i.e., a low standard deviation). To treat it as being as meaningful as the ATAR would be a huge mistake, yet, for the most part, that’s exactly what most people do.

The disturbing thing is that we as a sector present WAM/GPA on official documents as though it has any real meaning at all — we pull the same bullshit charade that we do with h-index, impact factor and university rankings — with our students as the victims this time rather than ourselves. Given a number so prominent and so delightfully trivial to discriminate on, it would be utterly miraculous if employers didn’t do this, and do so en masse. It may not be the thing that decides between two final candidates (I’d hope not; I’d seriously disrespect as utterly negligent any employer who did so), but it almost certainly decides the first cut, with nary a deeper look at the strengths and weaknesses that make up the standard deviation in the marks. With WAM determining your foot in the door to the higher levels of recruitment processes, it’s entirely clear why students live in absolute terror of this number, with its whims influencing their every decision. Particularly given that nowhere have I ever seen a clear statement on transcripts and other such documents about the uncertainty in such a number, or a warning about the extreme care that should be exercised in its interpretation.

“That was one of the problems with the Narcissus figure. Here is a face looking at a face, and the problem is the image of the thing is never actually the thing. You try and grab it and it’s not there. It’s water. It disappears.” — Jane Alison

There’s a destructive assumption inherently built into WAM: that it somehow accurately measures student merit. This is a furphy for a number of reasons. First and foremost is the Matthew effect, namely that students from better socio-economic backgrounds, with access to elite private schools, are better prepared to come into university, hit the ground running, and score highly in their first 1-2 years of studies. Many students go to schools where they were lucky to be adequately prepared for the Higher School Certificate (I taught myself both my HSC chemistry electives, plus components of other courses, to optimise my performance chances), let alone prepared in advance to excel at university. For many students, that first year or two of adjusting just entrenches the privilege disparities of high school into their WAM and their chances after university, effectively turning meritocracy into stealth-cover for aristocracy. And that’s before we look at other factors influencing the ability to excel at university, such as having the luxury of parents who can support you to work full time on your studies versus needing to carry a job to pay the rent and eat.

Second is the lack of any real difficulty weighting. For example, whether you get a 95 in complex analysis or a 73 in higher complex analysis, those 6 units of credit count the same in many WAM or GPA schemes. Yet who would you consider the better student — the one who took a shot at the hard subject and did well, or the one who didn’t push themselves at all and banked the super-high grade? Here we see our first perverse incentive — it discourages students from doing harder subjects in the quest of fully extending their knowledge. It is like an Olympic diving contest where every dive has a difficulty score of 1 — why would you risk a front four-and-a-half when you can just execute a perfect pin-drop?

The third follows from the second. Many of the WAM/GPA schemes I’ve seen don’t account for the year level of the course either. This opens an enormous perverse incentive known as ‘WAM-gaming’: you invest your electives in subjects at low levels (1st year), focusing on those with a reputation for easier assessment or higher grades, since they numerically improve your WAM. This incentivises behaviour like we see in the quote at the start of this post — students taking courses solely for their impact on a numerical score. Is this what we want to see in higher education? I’d argue it isn’t, yet that’s what we achieve.

Fourth, there’s no real accounting for the type of assessment either. A course could be almost all small pieces of assessment, such as labs or online quizzes; it could be dominated by a whole-of-term major work, like a portfolio; or it could be almost entirely a tail-loaded pressure assessment, for example, a final exam. These assessment types carry different levels of both difficulty and amenability to invigilation (i.e., the ability to prevent cheating). And within some assessment types there are also issues with measuring aptitude for the assessment type rather than real ability, as well as poor assessment design.

Regarding aptitude versus ability, the classic example is heavily weighted final exams. Much has already been written on this, see here or here, but a big issue is that they often tend to test a student’s aptitude for doing exams more than they test actual ability in the taught material. Like many students, I learned to become good at exams, to put the pressure aside and do all the things that enabled me to snaffle up marks like a dog chasing treats. I’m not convinced they tested my knowledge and understanding that much, even the well designed ones (I’ll get to the poorly designed ones in a moment). They were almost entirely about short-term cramming of huge volumes of algebra, and I was good at the mental tricks required for that. I’ve also seen enough cases of the inverse — perfectly good students who clearly understand the material, perhaps even better than I did and still do, but who seem to collapse in a heap under exam pressure and massively under-perform. And the most ridiculous part of the whole lot — a real workplace is nothing like a final exam, so what is the actual point? My job, even as a professional physicist, is not dumping algebra onto a page from my brain under time pressure. Ultimately, we use a contrived and pointless exercise to generate a meaningless number with high uncertainty that then determines career prospects. It’s literally insane.

And that’s before I even get to the highly variable quality of assessment in higher education. How poor the assessment design can get is best illustrated by another example from my undergrad student days. A lecturer, who I won’t name, gave a horrifically hard exam the year before us. It was carnage; on raw marks for just that exam, the whole class was sure they had failed spectacularly. We spent weeks with the previous year’s paper, teasing out reasonable solutions to all the problems — putting students under some time pressure is one thing, but giving them a 3-week assignment to solve in 2 hours is another. Anyway, said lecturer made the amateur error of not knowing we had access to the past paper and gave the same exam two years in a row. Needless to say, we were prepared — we knocked the damn thing out of the park. The lecturer was stunned and told us all he suspected we had cheated, as we couldn’t all be that smart. No shit, Sherlock. But the point is that, unlike, e.g., the ATAR, there is sometimes very little accounting for continuity of assessment quality between years or across cohorts in a single institution, let alone between one university and another. There isn’t even standardisation of curriculum and assessment at the national level in terms truly comparable to the ATAR. All of these WAMs and GPAs measure different things by different rules in sub-statistical ways, even within a single university, let alone between them.

I wrote earlier about the uncertainty arising from the statistics of averaging being at the 5-10 point level; that’s before we even account for uncertainty in the underlying 100-point grades for each course. Those too are potentially no better than +/- 5-10 points in real terms at best, perhaps more given an exam can be a good day or a bad day entirely at random over the cohort. By the time you get to the bottom of the four effects above, you’re probably looking at a real uncertainty up at the +/- 25 level, and at that point, you really can’t say a lot more than not competent, competent and very competent. Ultimately, a measure like WAM or GPA is about as predictive of performance for any job you’d like to choose, including being an academic, as height is for performance in basketball. There is some correlation, but if you want to make accurate predictions from the integers or decimal places, you are demonstrating nothing but your own foolishness…
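One way to see how such effects could stack up is the standard quadrature sum for independent uncertainties. The component sizes below are purely illustrative assumptions on my part, roughly matching the magnitudes discussed above; they are not measured figures:

```python
import math

# Assumed, illustrative uncertainty components on a WAM (in points),
# treated as independent and combined in quadrature.
components = {
    "spread across courses (averaging statistics)": 9,
    "grading/assessment noise (good day, bad day)": 10,
    "difficulty and year-level mix": 10,
    "cohort and institution differences": 15,
}

# Quadrature sum: sqrt of the sum of squares of the components.
total = math.sqrt(sum(u**2 for u in components.values()))
print(f"combined uncertainty ~ +/- {total:.0f} points")
```

Even with these rough inputs, the combined figure lands in the low twenties, which is why at that point a three-band description is about all the resolution the number honestly supports.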

“There is no innovation and creativity without failure. Period.” — Brené Brown

Yet, despite all this, WAM or GPA has come to be the dominant factor motivating student decisions in many universities in Australia and internationally. It makes students into risk-averse, highly-stressed individuals focused solely on chasing a number, much like the academics who teach them, endlessly chasing citations and h-indices, and the organisational leaders entirely beholden to whatever it takes to jump one spot in some ranking table or other. From bottom to top, our obsession with the minutiae of single metrics has made us totally lose the plot over the last two decades, and in particular, totally lose sight of our actual mission — producing a next generation of young professionals who are not just highly trained but well-adjusted individuals ready to contribute to society.

We profess to value innovation, creativity, leadership, resilience, and in particular, an ability to think critically in a world awash with knowledge and information. We expect our students to think about uncertainty, statistical significance and the search for true meaning in noise. And then what do we do? We construct for them a totally bullshit metric of the kind we know should earn you a fail in any proper academic course, and happily turn a blind eye while everyone buys into it. We sell them the first few grips up a greasy pole of stupid sub-statistical performance measures that are about as much true measures of achievement as pixies and bats in the desert.

So what is the solution?

I am becoming a huge fan of abolishing WAM/GPA and numerical grading on all courses entirely. Throw the whole lot in the bin: it’s largely meaningless, detrimental to education, and, as we’re seeing now in the 2020s, driving a massive online industry of contract cheating and proctoring wars that I think can only lead the sector down a path to its own destruction. Contract cheating is our own stupidity being weaponised for private profit, just as our obsession with impact factor and ‘prestige’ journals has been similarly weaponised.

A three-point grading system for each course should be more than sufficient: something like not competent, competent and competent with merit*. The first is only permanently recorded if it’s not ultimately substituted by one of the other two grades. The third gives students something to stretch for. This is more than enough to give any employer an initial triage on competence and ability, and to enable proper management of degree progression. We can still have marks for individual assessments; it’s a good thing for students to have a measure of how they are performing. But there’s no need for those to be tallied and documented on a transcript.

A breakdown into three-point grades gives students room to push themselves and be adventurous with their learning. It gives them the space to sometimes even fail or underperform on a task in a way that’s invisible, thereby enabling all the life lessons that failure brings, e.g., resilience, ability to be positively self-critical, etc. to be achieved without life-long punishment. The simplified grading system also lends itself to assessment that is more continuous across a course, producing less negative stress for the students as a result, with the ability to implement diversity of assessment type and, where required, ‘hurdle requirements’, to enable proper confirmation of competence.

Ultimately, we have to make a decision about what higher education is about and focus on doing it. Is it about developing students or is it about taking money for poorly measured credentials? I would argue our mission is the former, and if we aren’t careful, our obsession with the latter is going to get entirely in the way of our mission and render us ripe for disruption.


*An alternative would perhaps be a four-point grading system, with not competent, competent (equivalent to current PS/CR) and perhaps two bands of competent with merit, which would cover off the current DN and HD grades. This would in some senses be the minimal change that essentially is just a dropping of all insignificant numerical digits, rebadging of the grades, and proper admission of the true uncertainties involved in assessment.

The ‘mental game’ of exams.

I did a revision session for one of my undergrad courses today, and I like to talk a little in these about exam technique, as it’s a topic that isn’t formally taught and often under-appreciated by undergrad students as an important element of performance.

I teach a 2nd year course, and exams in 2nd year are hard in an interesting way. The exam style across 2nd, 3rd & 4th year is similar, but often quite different from 1st year. The answer path is often not as straightforward, and technique can have more bearing on your outcome than it does in 1st year exams. The problem is that this isn’t obvious as a 2nd year; many students don’t realise it until it’s too late, e.g., they get to 4th year. Certainly, in higher year undergrad exams, simply studying the notes and doing the quiz problems isn’t the complete package.

I talked in the revision session about making sure your background maths skills are tight, learning to both read the question and read ‘into’ the question what the examiner is looking for, how to get part marks well, etc. I might write a separate post on those aspects another time.

In this post, I want to focus on what happens outside the normal thinking about exams a bit, what I like to call the ‘mental game’ of exams. Two points on this:

1. Don’t study right up until the exam: As an undergrad, I wouldn’t study the day before the exam or on the day of the exam.

If you’re a racer (triathlon, marathon, cycling, whatever) one thing you learn is that you don’t train the night before a race. If you do, your performance sucks. Your body is still recovering, it can’t give you full performance. Instead, you ramp your training down and don’t train at all the few days before the race. It gives your body time to replenish supplies, repair muscles, refresh and reset your mind. You wake up on race day firing on all cylinders, mentally and physically.

Exams are like races in many ways. They’re the bit at the end of the training where it really counts. And you want to be on your peak performance for it (more on that in Point 2). If you study right up to the exam, your mind is tired. And it’s still trying to digest all this stuff you’ve been jamming in there for days on end right when you want it to shift focus to calmly working through the problems. If you study like crazy until the last minute, you inevitably get mental blanks and disordered thinking much like you get stitches and breathless patches if you train right up to a race.

I would do other things within a day or so of the exam. It might be some light study on another exam further out if I had a crowded exam timetable but more often I’d just read a book or watch tv or go for a walk or go out with friends. Anything to get my mind off the exam and studying and the topic of the course.

An interesting thing happens when you do this. Occasionally your brain throws messages at you. For example, you’re watching GoT and Tyrion is in the middle of his latest scheme and suddenly ‘hey, I don’t get this thing about the inertia tensor, why was one of the terms on the diagonal zero for that plate thing? Does that mean something?’ If I was in the middle of something, I’d just note the question and sort it out later. If I wasn’t, I’d go away and find the answer. Don’t use this as an excuse to get back into intense study (e.g., ‘oh man, I’m not ready, I have to study more’), just fix the disconnect your brain told you about and get back to relaxing until it happens again.

If I ever looked at my notes on the day of an exam, it was only to answer the questions my brain threw up. Nothing you put in your head on exam day is going to stay. I would literally just let my brain say ‘I want you to check x for me please’, and I’d check x. And then I’d find something else to do until it hit me with something else to check.

Sometimes your brain also does the opposite: rather than asking, it tells. It might throw some weird logical connection at you, and when you follow it, you realise you suddenly understand a connection in the subject better. Likewise, let yourself follow these ‘brain farts’ just to where they are resolved; don’t let them lull you back into intense study. Your brain doesn’t want more to chew on, it wants time to chew on what it already has.

I would also do things like go on walks or do mundane things like wash the car (the more boring the better) to try and encourage this because it’s what you want to happen. It’s your brain processing what you studied earlier. But you need to stop with enough time before the exam to let this happen. If you study until the last minute, there’s just no chance.

One warning, if you go out with friends to relax the day before, keep it contained. There’s relaxing and going nuts. Doing an exam with a hangover is strongly not advised (trust me).

2. Prepare physically and mentally for the exam: Another aspect of ‘race management’ in sports is pre-race and post-race. And many of these ideas hold for exams too.

After 13 years teaching, I’ve seen my fair share of students unravel in an exam. It’s an unfortunate reality and no fun for anyone (and I came close to doing so myself once as a 1st year, so I do know the feeling). You chat to these students later, and you find out they crammed for 18 hours the day before, crashed at 2am, slept terribly, woke late, were in a mad rush to make the exam, couldn’t find parking… no wonder they fell apart. Some have been sleep deprived for days, bingeing on caffeine to extend their study hours. Their mind is screaming away with anxiety, they can’t focus… they’re completely paralysed by the first question that requires a little stretch in mental effort, and it spirals downhill from there.

This is obviously no good; you simply cannot operate like this.

The important thing to realise is you can overcome this, much like any fear. But… it takes conscious effort and practice. You have to work at it. You mightn’t get it the first few times, and sometimes you have it cracked and there’s relapses, but with work you can get there.

At the very least, you can turn exams from terrifying and paralysing (distress) into just nervous energy (eustress). And that is good: the little buzz of adrenaline that comes from ‘it’s time for an exam’, or from standing on the beach in a wetsuit waiting for the starter’s gun, is energy that you can learn to turn into laser-like focus on the task.

Some tips you might want to think about:

  • Make sure you get a proper night’s sleep before the exam. This obviously is not going to happen if you are studying in the evening; your brain will just be buzzing. It’s another reason I wouldn’t study the day before the exam. Instead, I would do whatever I needed to make sure I got a good solid night’s sleep. Being well rested has a massive effect on performance and boosts you much more than some last-minute cramming.
  • Get up at a sensible time before the exam. You don’t want to have to rush and panic; that just makes you stressed and anxious, and sets your whole day up like that. Instead, kick off the day with something that puts you in a good mood. Find something funny to watch. Catch up with some friends you can talk about other stuff with. Play with the dog. Whatever it is that does it for you, put it at the start of exam day. Starting the morning well sets you up for a really positive day.
  • Have a proper breakfast. If caffeine is your thing, have some, but not too much.
  • Less relevant for online exams, but be there early. Again, this is anxiety management; you want to reduce stressors in the exam lead-up. For an online exam, make sure your space for the exam is ready well ahead of time and in a set-up you like.
  • Leave some time to ‘get in the zone’ before the exam. Put everything after the exam out of your head; you can’t afford to think about that. As extreme skier Doug Coombs says in Warren Miller’s Journey: “If you’re scared, the world is shaking, you’re thinking about the future, thinking about the consequences… that’s not right, that’s no good.” You’re just not going to be able to function at your mental peak on a crucial task if you’re preoccupied with something other than that task! Same deal with the swim leg in a triathlon: if you’re thinking about sharks or drowning in the sea, you’re obviously not going to swim well, your navigation is off, panic messes up your breathing, it’s no good. Learn to put bad talk out of your head. Often it’s a matter of just steering your thoughts away from it. Going back to the swimming example, when my brain screams ‘but what about those sharks, hey’ during a race and I feel that sort of pre-panic happen, I just start thinking about my stroke: is my hand entry good, is my breathing pace right, etc. I basically talk my brain back to calm by giving it something else to think about. Work out how to do the same in exams, even if it’s something as dumb as thinking about how cool your calculator is and taking a few slow breaths. On exam day, no amount of positive self-talk is too much. Trust your preparation. Back yourself. If music is your thing to get in the zone before an exam, do that. If talking to people about anything but the exam is, do that. Work out what makes you feel comfortable and gets your mind off the anxieties of the exam.
  • Find an easy question to start on. If you get stuck, just pass and go to the next question. Like Point 1, sometimes your brain just needs to process the question for a while.
  • Mentally reward yourself for good answers in the exam (‘yeah, nice work’), especially early on. The positive self-talk before the exam should continue during it. Remind yourself that you don’t need to get every question 100% right to make it through. It’s easy to think you haven’t done enough when you probably have. Don’t be overly tough on yourself, encouraging yourself is better.
  • If you need to stop for a minute or two, take a mental break and calm yourself down, just do it. That’s better than pushing on and fighting the panic. This is actually how I bailed myself out of the panic in my own 1st year exam way back when. I stopped, took a few moments to get some perspective (‘look, it’s just one exam, it’s not the end of the world, work out what you can offer against the question, grab all the part marks you can, and we’ll just hope that’s enough.’) and then pushed on. Don’t be afraid to do this. There’s no rule saying you have to be writing every second of the exam time.

And lastly… make yourself a time to ‘debrief’ after each exam. It’s one thing that never really comes up in science for some reason, but in sports it happens after every race/game, and it’s common in the military after missions too, which often have an ‘exam-like’ stress profile and require the same mental management games.

  • Try to do your review dispassionately. Look both at what you did well and what you could do better (note the language, not what was bad, not what was dreadful, what you can do better — see next point below). There *will* always be both sides so look for both sides. Take note of those lessons for next time, they are exactly how you get better at this. Remind yourself before your next exam what you wanted to do better last time.
  • Don’t be down on yourself. You did your best, you can’t change how it went, and beating yourself up about it certainly isn’t going to help anything. Focus on what you did well and what the lessons are, it will make you better at exams and it will make you happier too.
  • Accept that sometimes you have a bad day. We all have them. It’s not the end of the world. You’re still alive, you still have friends, you still have a place to live and food to eat. Retain your perspective. Many of your professors have failed exams before, failed courses before, had other things go wrong (I fell off a stage during a talk at a conference once, for example). Same in sports, even the best have come dead last in races, stopped in the swim leg freaking out about sharks, crashed a bike. You definitely aren’t the only one. All you can do is pick yourself up, dust yourself off, learn the lessons and fight on.
  • Irrespective of how you did, reward yourself. You survived! If you can take the rest of the day off, do it. Find something fun to do. Whatever you do, don’t just jump straight into study again.

Happy examing!

Game of Gowns — The spoils of #ponzidemia

We live in interesting times… the worst nightmare for a university sector heavily financed by foreign-student revenues has arrived: A global pandemic. Borders closed, revenues declining precipitously, and a government reluctant to bail the sector out.

Salaries in Australian higher education have been a topic I’ve wanted to get to for some time, but I’ve avoided it a little as it’s depressing and inflammatory. There are definitely winners and losers. And like any corporation, the winners are almost universally at the top. Now the crisis, the resulting quarantine lock-downs, and two articles in the media this week have converged to push me over the precipice on a long-delayed project.

Let me begin with the two articles. It started for me with Merlin Crossley’s opinion piece in the Sydney Morning Herald this week. I agree with some parts of the article, but one part in particular got my ‘yeah, nah, that doesn’t square up’ going. It was this:


That’s not untrue, but if you think deeply about it, that picture is not completely true either. Sure, there’s no ‘profit’ in the corporate balance-sheet sense, where you take revenue minus costs and return it as a dividend to shareholders who bought equity in your company on the stock market. But… some people sure make a hell of a lot more money out of this new system than they would have at the standard Australian public university of the 1990s and earlier! They might not be shareholders per se. They’re more like employees who get serious returns on the business turning over ever-greater fee revenues.

The self-interest of increasing pay packets in a climate of reduced regulation, spread across dozens of executives over two dozen institutions and 20 years, has driven what anyone with a pair of eyes and some critical assessment skills can see is a massive corporatisation of a public service once run efficiently to minimise costs to the public who use it. Gone are the days when an entire bachelors degree would set a student back maybe $10,000 at most (or even be free going far enough back). Now we charge them several times that, all for super-flashy campuses that look like Westfield malls and courses so compressed as to be almost impossible to teach and impossible to learn.

Now imagine you’re stuck at home in quarantine, without work, and the education minister suggests you might want to retrain. Since huge numbers of Australians now have an undergraduate degree, the options are inevitably postgraduate ones without Commonwealth support. So you go look at a Grad. Cert. or a Masters and see that an online degree is going to cost you $30,000/year minimum. Ouch, that’s a lot of money upfront when you’ve just lost an income. The article’s appeal to being simply ‘a registered charity’ that’s all about the community is going to grate with many, whose responses will range from a snort to spat coffee, I suspect.

The other was Michael Sainsbury’s article from a few days earlier. This made an attack commonly levelled at the modern higher education sector: that VCs and other senior executives earn far too much for institutions that are public sector organisations. And that if they want to run them like corporations, and earn corporate salaries in the process, they need to accept some of the corporate sector’s risk-management responsibilities, like not putting all your eggs in one basket. The article has some fair points amongst a sea of attempts to whip up sensationalism over the salary numbers. One colleague told me the article was unfair; another told me to go look at the top ASX100 CEO salaries and see if I still think it’s valid. I’ll get to that below…

The truth, as with most things in the media these days, is somewhere in the middle. So I figured I’d try to find the reality, and then use some numbers to point at where things really are. Two days of data mining follow below, much of it spent raking through 25 years of university enterprise agreements on the Fair Work Commission website.

Back into history: At the core here is the ‘corporatisation’ of the modern university. No one knows how or when it was decided; it just sort of happened. Historically, the major universities in Australia were public organisations, with staff essentially an arm of the public service. You can see this as late as 1995, when salaries across the universities were standardised and the academic pay charts matched those in the Australian Public Service enterprise agreements (EAs). It didn’t matter whether I worked at UTS or Monash, Murdoch or Melbourne: a Senior Lecturer (B6) earned $50,111 in 1995 and a Professor (E1) earned $80,176. Higher positions were just loadings. For example, in the University of Wollongong EA from 2000, you can see a Head of School’s loading of $4,648 and a Dean’s loading of $20,359 on a $92,968 E1 salary. I’m pretty sure that back in the 80s and 90s the executive branch (Dean and above) was nowhere near as large as it is now; the faculty was a small office, and there was maybe a DVC or two.

The regulation of salaries ended around 1997, and the universities all went their own way soon after, at least at the standard levels B6 through E1, which you can easily map as a function of time. Executive salaries are much harder; there’s little data available, even in the few ‘senior staff EAs’ that were popular in the mid-00s, so I’ll just do some numbers on them later. There’s also the oft-forgotten workforce of universities, the Ph.D. students; I will get to them too.

Does International Student Percentage Affect Salary? The first thing I wanted to look at was the effect of international students, since it’s something Sainsbury’s article alludes to. There was a nice plot from the Centre for Independent Studies recently, which I show below. International student percentages range from below 5% to as much as 45%; surely that should have some effect on salary.

I chose as my set to analyse eight from the top 11 in the graph above: RMIT, Wollongong (UoW), Monash, Murdoch, Melbourne, UNSW, UTS and UQ. A mix of Go8 and non-Go8, spread geographically; some in capitals, some not. I also took three with low internationalisation levels: University of New England (UNE), University of Western Sydney (UWS — rebranding, bah, I’m a Campbelltown kid, it’ll always be UWS to me 🙂 ) and University of Tasmania (UTas). I went for more established universities here, mostly because I really want to get back to 1995 with all of them, so I can see how they all evolved from a common salary level.

To keep this sensible, I went for the top rung of academic Level B, to capture a relatively junior academic who is nonetheless almost certainly permanent, having made Step 6. The other obvious one to go after was Professor (E1); I will deal with the mysterious ‘super professors’ later on. Let’s not waste any more time; onto my first graphs.

[Graphs: B6 salaries and E1 salaries over time, with CPI projections]

The universities with high international fractions are blue diamonds; those with low international fractions are red-orange circles. The trend lines are the average in green and a CPI projection of the 1995 salary in purple (using the RBA’s online inflation calculator). I don’t care much for people picking which campus is which (it’s not that relevant), but you can ask me if you want the data (happy to share it). Ultimately, it’s not clear that being at a campus with a higher international fraction confers much pay advantage at a given level for non-executive staff. But there’s a missing piece in the puzzle here, which requires a quick look at another graph, this one on the breakdown of the academic workforce.
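As an aside, the purple ‘CPI projection’ lines are just simple compounding: take the 1995 salary and inflate it year by year. A minimal sketch of that construction (the rates below are illustrative placeholders, not the actual RBA figures used for the graphs):

```python
# Sketch of the CPI-projection lines: compound the 1995 base salary by
# each year's CPI rate. The 2.5% rates here are illustrative placeholders;
# the real lines use year-by-year values from the RBA inflation calculator.

def cpi_project(base_salary, annual_cpi_rates):
    """Return the salary series obtained by compounding annual CPI rates."""
    series = [base_salary]
    for rate in annual_cpi_rates:
        series.append(series[-1] * (1 + rate))
    return series

# The 1995 B6 salary of $50,111 through three illustrative years at 2.5%:
print([round(s) for s in cpi_project(50_111, [0.025] * 3)])
```

The same function, fed the real CPI series, generates the purple curves in both graphs.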


The data above is national and includes both permanent and casual academic staff. I can see two things in it. The first is the huge growth in Lv A, which is essentially just a big increase in casualisation in the higher education sector. Casualisation keeps Lv A’s at Lv A either through churn (Ph.D. students come, teach, move on and are replaced by new students) or by preventing rises through the system (a side effect of churn: you are easily replaced if you don’t like your level). The other is the huge growth in Lv E and, to some extent, Lv D. My guess, and I can’t support it with available data, is that the fattening of Lv D and Lv E has happened more strongly at the campuses where the international fraction is higher. I can back it with anecdata (i.e., talking to lots of colleagues at lots of different campuses about what their campus demographics look like), but to be solid on this, someone really needs to get the same data as above, comparing 1996, the mid-00s and recent years, fine-grained enough to see individual universities.

Let me be clear: what I’m pointing to here is that the profit for staff from increased internationalisation comes less from a rise in the salary at a given academic level and step, and more from easier access to higher academic levels. The salary tables are always going to be pinned by inter-university competition for staff — they’re the thing a new hire can see when you are courting them during recruitment. Promotion processes are less transparent. And it is not as simple as promotion just ‘being easier’ at some places than others; there’s also a strong component of the Matthew effect involved. But either way, there’s a trend.

If nothing else, it’s much as a colleague at another campus quipped when I congratulated him on making Prof. many years ago: “It’s nice, but Level E is really just the new Level B.” Given this, I think it’s simply untrue for any staff member who has moved up the system into the ever-fattening Lv D and E to say they haven’t personally gained from internationalisation: not only are you sliding up an exponential, you are jumping up a cascade of exponentials; it’s a double-whammy. This is not an insubstantial systemic profit, because a small win integrated over a large number of people has a pretty big cost. And you certainly can’t miss it when you travel: when people in countries with strongly public (non-corporatised) university systems ask what the salaries are like here, just watch their eyes when you give the answer.

What about the public sector then? To further explore growth at a given academic level/step, I did some comparison with what’s left of public sector science and technology here. The obvious choice is CSIRO, of course; in the deep past, the universities and CSIRO were close counterparts in the ecosystem of science and technology in Australia. To do this comparison, I looked at the CSIRO levels closest in 1995 salary to B6 and E1: Levels 6.1 and 8.2. I also tracked their highest level, 9M, in my data, but only present it in my final graph at the end. To get a ‘control sample’, I also looked at the Australian Public Service system, which is tricky, as it restructured in the late 90s and some of the EAs aren’t available for the early 00s. So I tracked the bottom and top of the APS executive levels (EL1.1 and EL2.7) in recent years, just to see what government non-science salaries are doing.

The results knocked my socks off at first, and I suspect I may instigate an insurrection in CSIRO with the graph below (hold onto your hats kids).

[Graph: sector comparison at Level B — universities, CSIRO and the APS]

The CSIRO salaries tracked well early on. Taking my academic hat off, I’m not surprised: they’re somewhat closer to the applied/industrial side, so the salaries should be higher (just the old money-freedom continuum of academia). Something alarming happens in 2012, though. If you look back at the university data, you can see the same inflection in the average line, but it’s less severe. I suspect this is the late days of Gillard/Rudd and the quest to regather a surplus, followed by the austerity of Abbott/Hockey. Either way, CSIRO staff are losing ground lately; they aren’t even keeping up with salary progression rates for the APS any more!

One hypothesis for this could be that government is withdrawing funding for science across the board — both CSIRO and universities — and the universities are doing a better job of filling the gap by selling educational services internationally. Still, there isn’t a massive divergence here to suggest that internationalisation is massively beneficial at fixed level/step. It’s still the level-promotion aspect I mentioned earlier where I think the big gains come.

I might come back to staff below, because if I’m getting into promotion, I need to look at the top end. But first I want to look in the other direction, to the true workhorse of the university research sector: the Ph.D. students.

Corporations love cheap labour: One of the long-standing ethical issues I’ve had with higher education for some time now has been the way it uses Ph.D. students. A lot has changed in 25 years, let’s go back in time before I look at some data.

I was a university student of the 1990s. I did my first year in 1993, completed a B.Sc. (Hons) in Physics in 1996, and then did a Ph.D. from 1997 to early-mid 2000 before leaving for the US for a postdoc. The only reason I could afford to do this was that university was relatively well subsidised by the Australian public. I grew up a few blocks from one of Sydney’s notoriously nasty suburbs (Airds) in a family of 7. We weren’t in total poverty, but we were hardly well off either. It was only Whitlam’s legacy that meant I even stood a chance of getting where I am now: I could defer the fees and not end up in massive debt in the process. I finished my 4 years with a HECS debt of $9,700 ($16,742 in 2019 dollars). Were I to do the same degree today, I would be up for $38,100 ($22,000 in 1996 dollars) with domestic fee support. Without that support, I’d be in debt to the tune of $191,000 at the age of 21! Even so, owing $38,000 and owing $9,700 are very different propositions at that age, especially when you look at the massive differences in employment opportunities, wages growth and the housing sector between 1995 and 2020.

A Ph.D. was also a different prospect back in the 1990s. The stipend was reasonable, the cost of living in Sydney was even more reasonable, and the future prospects were quite strong on the opportunity side: postdoc positions were easily obtained, and it was clear that plenty of academic positions would be opening up in the future too. I found a $150/week apartment in Bondi Beach, a place where, it should be said, no one of sane mind actually wanted to live in 1997 (it took the Olympics and reality TV to bring people back). I kept my finances tight and got to work.

If I look to 2020, I really do have questions about whether I would make the same decisions I made in 1995. The ratio of RTP stipend to cost of living is now, quite frankly, poor. The job prospects are similarly dire, to the point where the postdoc in Australia is rapidly becoming an extinct species (see my earlier post on this). And as for academia, why would you want to get in on the bottom level of an obvious Ponzi scheme, especially one this ripe for financial collapse?

That said, I’m totes recruiting, kids, because… well… do I have any choice? Generating the ever-growing output metrics required to compete in this system on my own, with my teaching and admin load, yeah right. And we can forget hiring postdocs in the not-too-distant future, they’re becoming a luxury you cannot afford without multiple grants (plugging my post again, so shameless 🙂 ). What one really needs is Ph.D. students, they are the labour force of modern academic science after all. They truly are the ones who get the real work done, and very cheaply too. To quote a certain past PVC “Why hire a postdoc? They’re too expensive. No, ask for the postdoc, but when you get the salary, hire 4 students instead, it’s so much cheaper”. I’ll leave debates of efficacy for the pub, but let’s drill into the economics on Ph.D. students. I’ve wanted to do this for a long time now, and I keep putting it off as it’s incredibly depressing. Let’s do this in two graphs.

[Graph: ‘Wins at the top’ — annual salaries vs the APA/RTP and APS Cadet rate, log scale]

The plot above has E1 and B6 data from earlier, along with APA/RTP in green and the salary for an Australian Public Service Cadet in red. I’ve gone with the log axis to make this less embarrassing (you’ll see why if you read that other post I keep plugging), and I’ve dashed some of the CPI lines for distinguishability. Before I unleash years of pent up bile, let me drop my second graph on the table.

[Graph: life on a Ph.D. stipend — weekly budget over time]

In the second graph I go from annual to weekly, and I dug back into my archives for some of my budgets from that era. In 1996, when I started, my rent was $150/week; there was no way I could do a Ph.D. with a 1.5-2 hour commute each way every day. I had already spent a decent part of my honours year sleeping on the couch of the physics student society common room (and made good friends with the cleaners and security to get away with it; people tight for cash can always spot people tight for cash). My living costs were about $100/week; tinned spaghetti on toast was a regular meal. That left $40 discretionary, which was a reasonable amount in 1996 dollars. Thankfully, there was casual teaching on campus, and my supervisor gave me a top-up: the same $5,000 amount we have now, actually, which hasn’t moved partly to prevent financial arms races for students and partly, more recently, because funding is always tight. It is one of many things associated with Ph.D. student funding that shifts at a truly glacial pace (travel scholarships are another), especially compared with how rapidly salaries across the board, and particularly at executive level, have grown. Did someone say we were reinvesting the revenue back into the system? Because if we are, the Ph.D. students certainly aren’t seeing much of it! (n.b. some universities are shifting on this, to their credit, but it’s hardly universal).

It’s interesting to project my scenario from 1995 forward. I’ve had the APA data needed to do it for a while; it was just knowing how ugly the numbers were likely to be that stopped me. The black diamonds are the base income, and the tricky bit is rent. I’ve taken the rent and done three things: a simple CPI projection (the purple line), a proper analysis using rental yield history data (ABS; happy to supply, but it’s a distraction to show) in yellow, and the real rent in red. I just use CPI for living costs (it’s what CPI is, after all 🙂 ), and then discretionary is the APA minus rent and living costs. It looks ok if you assume the projected rent, until you realise that $40 in 1996 dollars is $70 in 2019 dollars, yet the discretionary spend is still $40 in 2019. The APA/RTP really has been shaped to ensure it’s ‘just enough’. To highlight that, I’ve put in the poverty line (50% of median salary) for both 1995 and 2020, and yes, I’ve already had one person on twitter comment that they are glad to see it there, as they always thought it felt that close.
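The budget arithmetic behind this graph is simple but worth making explicit. A minimal sketch, where the $290/week stipend figure is my own approximation backed out of the 1996 numbers above (rent plus living plus discretionary):

```python
# Weekly budget model used for the second graph. The $290/week stipend is
# an assumption backed out of my 1996 figures; the rent and living costs
# are the ones quoted in the text.

def weekly_discretionary(stipend, rent, living):
    """Discretionary spend: whatever survives rent and living costs."""
    return stipend - rent - living

def poverty_line(median_weekly_salary):
    """The poverty line as plotted: 50% of the median salary."""
    return 0.5 * median_weekly_salary

# 1996: $150 rent and $100 living, leaving the $40 discretionary quoted above.
print(weekly_discretionary(290, 150, 100))  # -> 40
```

Swap in each year’s stipend, rent and CPI-adjusted living costs and you have the whole time series.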

The problem is… the projected rent data is a national figure, and as we all know, universities are typically in cities and often close to the CBD. Rents there grow much more explosively. I first saw this just before I moved out in 2000: the Olympics were coming, and the landlord wanted to increase my rent by 20%, but preferred that I move out so they could charge even more (the letter was something like ‘if you intend to move out by a given date, we will forgo the increase’). The rent got smashed upwards again in 2016, when the place was sold (possibly as a deceased estate) to an investor. Either way, rents in Sydney are nothing like the yellow projection, and it’s easy for an APA/RTP to evaporate on housing costs in the major capitals. Casual work for Ph.D. students in the modern era is a necessity rather than a way to buy discretionary spend.

But let me come back to the first graph for a second, because when you look at it after the second graph, there’s something sinister. The APS Cadet salary, which started well below the APA, has actually risen above it with time. The thing to realise is that the cadet salary is essentially a short-term paygrade in the APS for interns and temporaries (e.g., vacation students); no one is meant to be on cadet pay for long. But when you take an RTP, you’re on it for 3-5 years. There are Trainee and Graduate Trainee levels above cadet in the APS that, at 2018 rates, are $43,750 and $58,231, i.e., 162% and 215% of an RTP. To put it bluntly: in salary terms you would be over twice as well off going to work for the Australian government as a trainee instead! And that’s before we even get to the private sector. That the APA/RTP has tracked like this in real-world terms is remarkable; the only thing I can think of that’s as bad is Newstart. The people who generate most of Australia’s scientific productivity get paid peanuts to do so, folks…
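Those percentages are easy to check. A quick sketch, where the 2018 RTP base stipend is an assumption (roughly $27,000) and the APS figures are the ones cited above:

```python
# Comparing the cited 2018 APS trainee rates against the RTP base stipend.
# The RTP figure here is an assumption (approximately the 2018 base rate);
# the APS Trainee and Graduate Trainee salaries are as quoted in the text.

RTP_2018 = 27_082  # assumed 2018 RTP base stipend, AUD/year

APS_RATES = {
    "Trainee": 43_750,
    "Graduate Trainee": 58_231,
}

for name, salary in APS_RATES.items():
    print(f"{name}: {salary / RTP_2018:.0%} of an RTP stipend")
```

With that assumed stipend, the ratios land on the 162% and 215% figures quoted above.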

And before people tell me that the lower pay during an RTP pays off later as a postdoc and academic, let me point out that it did, once, back in the late 90s and early 00s. But then we used internationalisation to build an oversupply of Ph.D. graduates, because we were addicted to them as a cheap labour force. It enabled us to funnel money into academic salaries and fancy buildings when the one thing we absolutely should have been doing was building a sustainable and equitable sector. Sainsbury does have a point in his article; he’s just missing some parts of the collective dereliction of duty by higher education management in Australia. This is not just one VC or one university. Stating names and numbers is a sideshow. This is over two decades of endless short-sightedness, ‘get-rich-quick’ schemes and league-table games from an entire sector of people demanding the sort of salaries people pay to have things managed properly, and then utterly failing to do so. Maybe academia does need to be burned to the ground, because I really don’t know how we fix it in its current state.

Back to the top: I want to come back now to a final graph to finish up the discussion. The thing that caught me with Sainsbury’s article is that it’s full of large numbers, and numbers are hard to gauge accurately as just numbers, because emotion becomes your ruler and emotion has a non-linear scale. I prefer to see them on a graph alongside a bunch of other things, so I set out to make that graph.

My colleague of the ‘E is the new B’ quip saw me post the Sainsbury article somewhere, threw the top 10 ASX100 CEO salaries at me, and said: ‘Are the salaries that large? They are very modest by industry standards.’ Sure, but last time I checked, we still worked for the public sector. Our job is to train the next generation and produce science, not maximise revenues and generate growth. It’s exactly as Merlin says (although I still find the registered-charity thing a little bit of a stretch, given we put students into $38,000 of debt at age 21 as part of our operation).

The thing about CEO salaries is that people only see the big ones. Those big salaries are the exception rather than the rule, and using them is a convenient way to make the VC salaries look small and shrink the appearance of greed and self-interest involved. The data in the figure below is partly my own and partly from the Australian Council of Superannuation Investors (ACSI) Annual Survey of ASX200 Chief Executive Remuneration, which does a good job of covering the true CEO salary spectrum. Note that below there is a ‘fixed salary’, which is what they get paid even if bonuses are zero, and a ‘realised salary’, which is the whole package, some of which may be equity (whether it is liquidated immediately or not is a separate question).


To explain the colour codes: green is corporate, blue is public service, red is academic, and yellow entries are just benchmarks for context (my control samples). I had to put a log scale on this as well, for the same reason as before: it’s less embarrassing (I can share the non-log version later, perhaps). I’ve also included what I call a ‘super professor’. These are a growing breed in the internationalisation era: people of sufficient ‘merit’ that they sit above the publicly listed pay scales. I’ve had to estimate what they get paid, and my figure might be towards the top end across the sector (folks can comment; it’s hard to get good info on this for obvious reasons). The ‘super professor’ is essentially the core of my colleague’s comment that ‘Lv E is the new Lv B’: E1 is just the entry point to a new range of clandestine pay scales. The transparency on them varies; some campuses just don’t advertise them, others will try to deny they even exist. In some ways, you can consider these as returning an extra share of revenue to a special class of employees — more what a corporation does than a public sector organisation. Then there are all the executive salaries, which are unknown and not limited to academics; I’ve heard of several cases of non-professorial admin staff on upwards of $0.5M in this sector. The VC is always at the top, obviously, and if you look at that graph, they truly sit amongst the corporate salaries now. The VC minimum there is an exception rather than a rule too: it is one single VC that’s below the PM (see here — also, I don’t consider the University of Divinity to be serious, sorry).

The interesting thing is what happens if you extrapolate the VC-as-top-salary logic to the public sector as a whole. Clearly, in the public sector the top of the pyramid is the PM, and even the education minister would outrank any individual vice chancellor. Yet the minimum VC package is nearly double the education minister’s salary, and the highest is more than four times it! You can see why the education minister might be a little unhappy about the salary situation given the graph above… and might laugh at any suggestion that we’re still a public sector organisation in dire need of a bail-out. Executive pay cuts of 20% are barely a sip of the glass when you look at the real numbers!

You’ll also note that the VC salary bracket sits neatly amongst the ASX 101-200 salaries and well above what’s typical for CEOs outside the ASX200. As Sainsbury points out, we’re way beyond even the maximum end of the charity sector on CEO salaries, let alone the typical (don’t forget the log scale!).

For some final context, let’s put VC salaries in historical perspective with two examples. Prof. Spence at U. Sydney had a 2017 salary of $1,445,000, which translates to $845,000 in 1995 dollars. I’m pretty sure VC salaries were nowhere near that large back in 1995, given an E1 was $80,000 and a Dean’s loading would have been less than $20,000. Maybe someone long retired can go on the record and tell us what VC salaries were really like back then. And there’s always the old chestnut that you need to pay good money to get good leaders who deliver good results. If that’s true, then clearly Sydney University should be well ahead of ANU, given Brian Schmidt does the same job for a reported $618,000 in 2017. That’s $361,500 in 1995 dollars, and might be somewhat closer to an actual VC salary of those days (albeit a fat one, perhaps). Amusingly, ANU is ranked well above U. Sydney — so much for money buying performance.
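For the curious, the 1995-dollar figures above come from a single cumulative CPI deflation. A sketch, assuming a 1995-to-2017 factor of about 1.71 (the RBA’s online inflation calculator gives the precise value; treat mine as an approximation):

```python
# Deflating the 2017 VC salaries back to 1995 dollars. The cumulative
# CPI factor of 1.71 (1995 -> 2017) is an approximation; the precise
# value comes from the RBA online inflation calculator.

CPI_1995_TO_2017 = 1.71  # assumed cumulative inflation factor

vc_salaries_2017 = {
    "Spence (U. Sydney)": 1_445_000,
    "Schmidt (ANU)": 618_000,
}

for name, salary in vc_salaries_2017.items():
    print(f"{name}: ${salary / CPI_1995_TO_2017:,.0f} in 1995 dollars")
```

With that factor, the two packages deflate to roughly the $845,000 and $361,500 quoted above.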

I think it’s quite clear that there are definite winners and losers from 2.5 decades of internationalising the higher education sector, and that some have profited very handsomely from the whole exercise. It makes any claim that this is ‘all for the community’ likely to be received a little distastefully, I think.

It’s clear that the closer you are to the top and the earlier you entered, the better your wins are. The academic system was historically easier to navigate. Grants were easier to get and more likely to be both funded and fully funded. The publishing system was easier, and the productivity demands in terms of volume and rate of output were less severe and more easily resourced. This fed heavily off ramping international enrolments and investment in the 00s, and easy access to cheap labour in the form of Ph.D. students. Much like CO2 in the atmosphere, in the early days the swelling numbers of Ph.D. graduates and young postdocs seeking fellowships didn’t matter — lots of the old guys were retiring, particularly since the old super system was so generous, and there was still room in the funding systems and department structures. Positions were filled, and then things got tighter and tighter as the years went on; the early 10s were hard, the late 10s got really tough. It’s been glaringly obvious for some time that there’s a big resourcing problem coming. I can’t work out whether the folks running this show are ignorant, negligent or wilfully blind to it. Do they not know? Are their heads in the sand? Is it the next VC’s problem? Where are the opinion pieces pointing to a solution, rather than trying to shove the bill for the expensive dinner back onto the taxpayer?

One thing’s for sure: higher up in the system, the Matthew effect tends to protect you. Lower down, it’s rapidly becoming carnage. You’ll have seen Darren Saunders talk about the issues on the funding front, and he isn’t the first or the last of the researchers now in their 30s and 40s who will end up closed out of the funding system and needing to put their research efforts on ice. Everyone I know on campus in their 30s and 40s is sure it’s coming for them. For all the bold talk about revenues being invested back into research and teaching operations, it always seems to be the funds that keep labs going when external funding runs dry that get cut first. Meanwhile, certain labs flush with cash get more funnelled in, so much that they sometimes barely know what to do with it. We all know where the best equipment is! And where the notepads and plastic pens and plastic rulers with the research centre logo on them are when the stationery supply runs dry. And it’s not like the 80s, where you probably got funded unless your proposal was really crappy; now your proposal can be supremely awesome and still miss out again and again. Only the upper 2-sigma tail gets any external funding these days. The reality of ‘reinvesting back into research’ is that most young researchers are running on fumes, burning themselves out and destroying their mental health in competitive processes that are massively stacked against them.

Nah, we (and I mean the sector as a whole) built a Ponzi. We got told to look to international opportunities to help supplement the extra money we kept asking for. We aren’t responsible for being asked to do that, sure. But we sure as hell are responsible for the way we responded in the follow-up.

We could have done that maturely, with an eye to sustainability, without behaving like hungry pigs at a money trough. We could have done it in such a way that we didn’t jack up the fees on our local students and saddle them with five-digit debt at age 21. We could have done it in such a way that students weren’t admitted despite insufficient preparation just to rake in fee money (and without then nuking people for calling that out as a problem). We could have done it in such a way that the people who generate the results at the coalface and do the painful parts of the teaching — our Ph.D. students — don’t have to live just above the poverty line while watching the VC’s annual salary grow to exceed the median house price in our most expensive capital cities. We could have invested in sustainable research funding and management structures, so that individual researchers don’t have to ride a rollercoaster of lab bankruptcy when a grant proposal fails, try to reactivate when some money finally rolls in, and then stare down bankruptcy again, all the while watching more and more new ‘strategic hires’ get helicoptered in to fight for the same shrinking pool of cash. We could have not treated the system like some mix of get-rich-quick scheme and league-table pissing contest.

And now, as a sector, I think we are all going to pay a very handsome price for our folly. All the gnashing of teeth and pleas of ‘it’s not our fault’ are not going to get us very far. Some of us will walk away with some very fat bank accounts. Others? Well… Death Amway.

Working from home: How to get sh*t done.

Coronavirus has forced a lot of people who normally don’t work from home to start working from home. I even have academic colleagues who, for the first time, have had to buy a desk and a chair and get a proper home office running.

I have been working from home for years now. What started as bad (workaholism) evolved into a practice of working from home 1-3 days a week on a regular basis depending on my tasks. My typical is probably 2, I will come down to 1 if there’s a lot requiring me on campus and I will come up to 3 when I need a higher density of solid blocks of focussed time (writing grants or a new course). I’ve learned a lot of lessons along the way, so here are some tips to those new to this game.

Initial disclaimer: Yes, I don’t have kids, and yes, I know not all of the things below work or can even be implemented by everyone. This is just what works for me, take what you want from it. I think the main point below is to find your own way…

No particular order on these points, mostly because they kind of tie in with each other in multiple ways.

1. Environment matters: Probably top of the pile for me; if your environment sucks, your productivity will suffer. It’s hard at short notice, but try to build an environment conducive to work. I long ago invested in a proper desk and chair, get the ergonomics right. If you can, try to get your IT set-up similar to work too. I have the same keyboard and trackball at home as at work, I also run the same monitor setup (2 x portrait side by side plus small landscape on the right — laptop at home, old monitor at work) and I run the same file system at both too (C: OS and D: my files, which I sync bidaily using FreeFileSync and a flash HD — nice side-effect, I always have 3 backups done daily!!). I run the same software on both systems too. Moving from home to work and vice versa is completely seamless for me and not reliant on network stability. I can literally stop at work, sync, ride home, sync, and get started again.

Thinking beyond my desk, make sure you’re in a quiet area with good light, good airflow, a nice outlook if you can get it, and most importantly, a place where you can minimise distractions. If you like silence, do what you can (noise cancelling headphones, earplugs, whatever). If you like music, get that set-up well.

It might sound like I invest a lot in this… yes absolutely. If your environment is unworkable, your productivity suffers and eventually your income will too. This really is spending money to make money, and we all know you need to do that sometimes.

2. Plan & prioritise: You should have a good plan even ‘at work’ but it becomes more important when working from home because if you don’t plan your time then no one else will. In the current Coronavirus ‘remote working’ world, with many things becoming asynchronous, your calendar will stop planning your time as well. This can be dangerous in many ways. Firstly, working at home it’s hard to keep tabs on your time — some will work a lot less, some (like me) will just end up working more, even to the point of working to burnout. Also, working at home, it’s easy to lose focus and devote time to things that don’t carry impact. Ultimately, you can easily be like the truck stuck in the mud, wheels desperately spinning, but getting no traction and going nowhere.

When you work from home, make sure you have day, week and month plans. Have ‘to do’ lists at all three scales and prioritise them. Tick things off when they are done. Not only does this help you know what to spend your time on, but it helps you realise how much you are getting done. One thing about working from home is that you save some time in your day from the commute to and from. For me it’s about 1 hour a day, possibly a little more. I tend to reclaim half of this as personal time and use the other half as planning time. One clever trick here: if you can, put your planning time on the end of your exercise time. You can then use your exercise time to work through all the thinking, and then just empty your mind onto the page once you get home.

3. Big blocks: I like big blocks and I cannot lie… in fact in normal times, my work from home days are my ‘big blocks’ days since they are the days where I can knock out most of my distractions. The need to do this in a coronavirus remote working scenario is even greater — with everything going asynchronous it means there’s always stuff flying all over the place and your time is always chopped to pieces. Without big blocks, you just cannot get major tasks done. You spend your whole life being reactive rather than proactive.

For me, on a ‘big blocks’ day, I have only two tasks scheduled: a main and a reserve. The main can take up the whole day, and my big blocks days are the days I’m most willing to work ‘over time’ because of the way ‘flow’ works. It takes time to build momentum into big block tasks, often 1-2 hours, and I don’t want to lose that investment while I’m still getting high output from it. I will attack that main task until it hits a wall, which will either be: a) I finished it, b) I’ve hit a roadblock I can’t solve today, or c) I’ve gone all day, my output is waning, and I need another solid block of time to finish the task. The reserve task is there for when this happens — if I still have enough time left in the day to put solid hours to the reserve I will switch to it and dig in. I won’t always switch to reserve though, if I can’t do it justice, I will often just turn the rest of the day into mopping up pieces (delayed emails, planning, etc).

4. Getting in the zone: Basically this is knowing how to find ‘flow’ and is particularly important on big blocks days. For those who don’t know what I mean by ‘flow’, it’s that mental state you get in when you are heavily engaged in a task — all the distractions fall aside, there is just you and the task and you smash away at the task. It is particularly useful for any writing task, almost to the point now where I don’t want to write unless I know I really do have several hours to push through the ‘flow finding’ phase to proper flow. What you need to do is learn how you find that place for yourself quickly. For me, it’s most effectively done with a) push out all distractions, b) 2-3 minutes somewhere quiet to prep myself for the task — what am I going to do, remind myself of the important bits, etc., c) get the right music going, settle in at the desk, d) either edit the few paragraphs before where I need to write (or if it’s a blank page write some rough rubbish close to the topic), and then e) hopefully I just slide on in to where the work’s needed and get rolling.

Finding the zone is like good running form or swimming stroke. You have to work on it. Find what works for you. Critically analyse your approach, work on developing it.

5. Cluster the small stuff: As much as possible, cluster all the little tasks together and most importantly, don’t let them chop up your big blocks. That doesn’t mean I completely ignore anything small in a big block, sometimes a tiny easy task is a nice ‘mental break’ but do them on your terms strictly — they are breaks not distractions. I usually keep at least one day for small stuff and usually that’s the day with the most meetings in it (currently Friday for me). I also keep a little block of it on Monday morning, partly to ‘shovel snow’ from the weekend and partly because easy tasks are a good way to ‘restart the engine’ after the weekend. The other place I like small stuff clusters is in a big blocks day when the reserve task isn’t viable.

6. Schedule start and finish: Perhaps obvious, but no, you probably shouldn’t sleep to 1pm working from home and you shouldn’t be going until 5am either. Try to keep somewhat normal hours for yourself. On a work from home day, I’m always ‘at the desk’ by 9am without fail, and I will have a finish time for myself as well that sometimes depends on my plan for the day. I am in bed by midnight without fail also, and rarely work right up until then; it has to be some really exceptional flow on a task that warrants running late. As a famous football coach once said, ‘nothing much that’s good happens after midnight’, and that holds as much for work as it does for a night on the town.

7. Take breaks & get exercise: Still important, and you should see working from home and the flexibility of schedule it affords as an opportunity rather than a cost. For example, I do some mix of running and swimming to stay in shape. In the normal 9-5, swimming is hell, the lanes are always most busy before 10am and after 4pm. I make my work from home days my swim days and my office days my run days. I then put my swim at either 10:30, to fit between the morning crowd and lunch crowd, or 2:30, to fit between the lunch crowd and afternoon crowd. Aside from getting a relatively peaceful swim in, I have the added benefit of being able to break up my day, sometimes use the exercise as thinking time, etc. I sometimes also even jam a nap into a work from home day, something I could never really do in the office. Set a 30 min timer on the phone, crash on the lounge, get back to work after. A key thing in working from home is learning how to maximise your energy and effectiveness in the process. Little things like building exercise and rest in well really matter.

8. Eat properly: Easy when working from home to not keep to meal times, snack all over the place, eat half the pantry as a procrastination tool… Might seem obvious and trivial, but if you don’t stick to sensible eating, your energy and focus will lurch all over the place, making you less effective not more. Stick to routines and keep some discipline on this aspect. I have my usual breakfast on a work from home day, and a scheduled lunch and dinner. I usually don’t have open snacks in the house to reduce temptation on this front. The ‘open’ is important: if there’s nothing open it’s less tempting, but you still need something around in case friends pop over.

9. Shut the world out: Really essential to minimising distractions and getting flow going. I never have the television on when working, ever, no radio or podcasts either. If it’s music it’s albums or streaming with no adverts and it’ll be stuff that doesn’t chew too much mental effort up, i.e., music I’ve listened to a lot before so the novelty of it isn’t drawing neurons away from task. If someone is talking about stuff in whatever I listen to, boom, the focus is gone. It’s why people have to walk into my office and scare the crap out of me at work — my noise-cancelling headphones are my tools to shut the distracting voices out.

I sometimes even take this to extreme levels as a focus strategy, putting a single album on endless repeat through noise-cancelling headphones just to ‘lock in’ to the zone. For example, my review on the 0.7 anomaly was almost entirely written to Guns’n’Roses’ ‘Chinese Democracy’ album [high rotation for several months, I know every note of that album] and one of my ARC grants this year to Hole’s ‘Live Through This’, which I’m also listening to while writing this. Whatever album it is, it is on only when I’m working on that task, and it’s almost to the point where, like Pavlov’s dog, my brain is trained that that music means ‘focus all to writing task’.

Blocking out the world also means the internet. Hide your phone in the back of the lounge, turn your email client off, ban yourself from social media. Shut down the things that send pointless notifications (Teams is especially bad for this, I hate Teams). I’m sometimes a little relaxed on this, it depends on the task, because it can also make a good mental reset for me, but it needs to be short doses. Sometimes you want none at all. Writing this post I haven’t looked at anything but the text on this page…

10. Brief & debrief: The first half is kind of obvious, give yourself 5-10 min at the start of the day to think about what your strategy for the day is. What are your main tasks? What are your priorities? What type of day is it? Big blocks, lots of little things? When are your meetings (if any)? What’s the smartest way to assemble the day?

The second half is often not. People get to the end of the day, and they just stop working without looking back at the day. When you finish up, find 5-10 minutes to ‘debrief’ your day. How did it go? What is unfinished that needs to be put into a future day? What did you get done and what was the most effective part of the day? What was the least effective part of the day and what lessons are in that to get better at working like this? Celebrate your little wins, just observe what didn’t work perfectly without being down on yourself for it.

Some of this is managing your own morale and expectations. Seeing that you achieved stuff at the end of the day keeps you feeling ok about yourself. It also helps you become more realistic about what you can and cannot achieve in a day.

I put this one last because this is going to be particularly important in the Coronavirus hellscape where we have to work from home every day, again and again. This is working from home ‘for the long haul’ and it requires some extra effort in mental management. It’s going to be really easy to feel demotivated, unengaged, unproductive and unhappy if you really don’t have good methodology for working from home. But it doesn’t have to be this way if you focus on evolving to a situation that you can work with.

Bonus 11. Try not to punish yourself when working at home fails: A big suggestion for beginners is to forget perfect on this; it never happens, even for seasoned ‘work from home’ folks like me. Some of my work from home days in the past have been epic: started at 9am, got to 10pm having written half a paper, with my meals and a swim jammed in. I find my zone easily, the flow is fast and strong, everything seems to hang together. I wind down with an hour and a half on the bass and head off to bed feeling awesome about a massively productive day.

Others are bloody awful, the main task ends abruptly at 10:15am, there’s a book I need and it’s on my shelf in the office. Continuing without it is a waste of time. It’s super-annoying as I was primed for days to smash that task. I go for a swim to try and get over it, come back, start on the reserve task, get an hour in and I’m just not feeling it. Nothing’s working, can’t find the zone, want to smash something because my brain keeps stewing on the main task for the day. I can’t make my brain refocus, I ask nicely, it calls me a jerk. I go to small tasks instead and just pack it in at dinner time as the day is a write-off. Sometimes it just doesn’t work, the mojo isn’t there. Not much you can do.

Don’t rip yourself up about working from home being hard, there’s not much good in it. Just treat each day as a separate day and look at how to get better at it with time. Try to avoid the unrealistic expectations generated by HBR and similar stuff on this… the management gurus and associated meritocrats are always looking to turn your productivity into their easy profit. Ignore them as much as you can for anything but tips that seem easy to try and abandon if they fail. The better approach is to just hold to a process of plan, brief, put in a day’s work, debrief, take your wins, sleep, rinse, repeat. Focus on the long game, try to put the little daily ups and downs to the side. You know what you want to get done, pick the important parts of that, divert as much productive time as you can to them, and try to stay positive along the way.

In the end, all you can do is all you can do, right?

The real merit in emeriti…

Life has just reminded me that I’ve been wanting to write this one for a while. I won’t go into that, but a question I’ve long contemplated is: What is the point of emeritus professors? Why have them? What value do they even bring?

Most of us have crossed our fair share of emeritus professors. The old guys (they’re almost universally male) who, on reaching forced retirement at 65 or thereabouts, decide the last thing they want to do is stop being a professor, and so, in exchange for freeing up a salary to redeploy on someone much younger and, in many cases, working on something that’s important now rather than 30 years ago, the university gives them a title and a desk and an institutional affiliation.

Many are quite benign. They come and go for a little while, slowing down as the grant money runs out and they gradually find more interesting things to do. They probably don’t add a lot to the department as a whole, but they don’t cost it a lot either, so why not? It’s like the academic equivalent of a golden handshake, I guess, since many of us seem to see more work as a reward somehow (strange, I know).

Some range from mildly annoying to a right pain in the neck, continuing to pursue their own interests, even if they’re sometimes 100% anti-parallel to the interests of the department, via some mix of Machiavellianism, threats, intimidation or militant resistance. I will spare said emeriti the indignity of having their sins laid bare here, but I’m sure enough of us have seen it for my earlier words to be justified. Others do their best to keep the department culture and/or demographics trapped in the 1960s or 70s or whatever decade they consider to be the golden years of academia.

And the trouble with an emeritus position is that it’s easily granted from above and then conveniently forgotten, while the holder lives on another 20-30 years, popping out of the shadows occasionally for a short spree of trouble like a departmental cold sore.

And then, more rarely, there’s the really good emeritus professor. Some of us know examples, often on other campuses as they aren’t nearly as common. And when I’ve talked to people on this topic in the past, it’s interesting how a small handful of names always get mentioned. These are the ones who bring true value to the role, which becomes more like that really nice grandparent you sometimes met as a child. They know everyone in the department, and not only do they get along with them well but they see it as their role to advance their colleagues’ interests as much as or even more than their own. They are helpful with advice, not in a smug or arrogant or condescending way, not chewing your ear off or pontificating loudly at every morning tea, but more in the way that people all feel like they can go seek their counsel when needed, without judgement or self-interest getting in the way. They realise that their job, as an emeritus professor, is to build legacy, not by pushing to grow their own research (that time has passed), but by being there for the department they worked in, helping to advance the next generation by offering them their one key advantage: years of experience at having to work through many of the issues that their junior colleagues are encountering for the first time. And that’s often less a job of telling people what to do than hearing them out and then guiding them to their own smarter decisions, bearing in mind that modern contexts can often be quite different to 20 or 30 years ago.

I think universities could be far more discerning in their decisions of who to give this role to, and more to the point, they should completely change their definition of ‘merit’ from thinking about just E for academic excellence and more to thinking about the other two letters ‘us’. Yep, it might be a terrible pun, but the merit in emeritus should be about excellence for us, namely the department that will be hosting them for however many years. There should be wide consultation with the department on who gets given the role, from the top to the bottom. There should be references sought, selected by people other than the candidate. There could even be an anonymous vote within the department that requires more than 2/3 to vote yes for the role to be granted. The role and requirements should be more carefully specified, and more tightly focussed on the ability to humbly serve others using outstanding interpersonal skills and impeccable ethics and morals than on past research performance alone.

Start looking at merit the right way, and we might start seeing more of the good ones and a lot less of the bad ones.

Workloads — Academia’s less spoken of equity issue

Discussions with more junior colleagues (some would call it ‘mentoring’, I hate the term as it has become badly misused now) often end up in a place like this:

Colleague: “I’m having to do <task x> but it’s not my job…”

Me: “Wait, well whose job is it?”

Colleague: “Oh, it’s Professor X, he’s supposed to handle Cohort X and I do Cohort Y, but the final sessions are held together and he doesn’t do his bit, which means I have to do it to make sure the whole thing doesn’t collapse.”

Me: “Ok, and your colleagues know about this? And they’re ok with it? Your head of school knows?”

Colleague: “Yeah, everyone knows he never does anything properly, but what can you do about it? They just roll their eyes and go ‘Yeah, but that’s just Prof. X, what can we do about him?‘.”

Me: “Lots of things actually. Does your school have a workload model? He shouldn’t be getting credit for the role, and you should get the credit for carrying his load instead.”

Colleague: “Pfft, good luck, as if we’d have anything like that. They’re thinking of just giving me the whole of <task x>, so I’ll have even less time for research.”

Me: “And if you had a workload model, you’d get credit for that, and he would have to do something else to make up for not doing <task x> properly. A big part of your problem is the lack of a proper workload model…”

I’ve had this conversation with so many junior academics that I’m well tired of it. And once you’ve had it enough times, you realise that the unquantified work always gets displaced to the people least able to say no to it. And that some people are more able to get away with ‘feigned incompetence’ than others. This is the dreaded ‘basketcase’ exclusion, often turning up in conversation as:

Me: “So what about the other roles? Is Prof. X contributing to other chores of the school?”

Colleague: “No, he never gets given anything because he’s always useless at it.”

Me: “Wow, he must be terrible at research then. No money, no students, no papers coming out?”

Colleague: “Not at all, he’s perfectly competent when it comes to those things, and other things that interest him.”

These problems are hard to fix, but one route that helps a little is a good school workload model. It’s not the silver bullet that instantly fixes all problems, but it goes at least some way to evening out workloads. It doesn’t fully fix the ‘basketcases’ issue either, but it can help a little, especially in the hands of a good head of school. See it as a small piece in a larger bunch of carrots and sticks a head of school might have to manage loads fairly.

Back to ‘mentoring’ (I’m so LinkedIn), the conversations above often end up with:

Colleague: “So you have a workload model in your school then? How does it work?”

I often give them a brief run-down, but for wider benefit, I think I’ll do a thorough job here, given the number of times I’ve had the conversation above. But, before I do, two quick disclaimers:

  1. I did not invent this model. It pre-dates me joining my school and it has evolved significantly over the years. It goes so far back I don’t even know who to credit for starting it. It certainly wasn’t me, but I wish it was.
  2. It is not perfect (no model is), but it’s much better than nothing, and it continues to do a great job after well over a decade of ‘evolution’ in the school.

To do this explanation well, let me lay my 2020 load allocation on the table. I can’t show it for others in my school (obvious privacy issue), and actually, I haven’t even seen it for others (yet). That doesn’t mean it’s totally non-transparent though, I’ll get to that near the end. I’ve also stripped the student names off the bottom below…


First observation: Adam is a busy guy! But seriously, let me run from the top. We basically start by adding up all the workload units for the year (WLU). There’s a formula for this that accounts for all the teaching that needs to be done, plus all the admin, plus some fat to account for research deductions. We then divide this by the number of staff to get the annual load. This year it’s 416, but it fluctuates from year to year as staff numbers and load requirements shift around.

Any carryovers then get deducted; this mechanism is there to prevent people from being overloaded year after year until they get crushed and thrown on the scrapheap. Research is broken into 3 components: publications, income and students. There are formulas that convert reality to WLU amounts. The crucial thing here is that there is a cap on your research deductions of 100 WLU (it’s actually why my papers & students are low; they are legacy claims, as I didn’t bother submitting this year because I knew $ alone would put me over 100 WLU anyway, lucky me!). This cap is a good thing: it provides a strong incentive to do research, but it doesn’t mean someone lucky enough to have huge grants or huge student numbers or whatever can totally skip their teaching and admin roles.

So by the time I get to the table, I’m at 323.2 (I owe a little from some shuffling last year as I was on sabbatical, hence the negative carryover). The table has all the courses, with common rates across all staff. I’ve taught PHYS2111 before, so I only get a rate of 3 here, which covers the hour of delivery plus 2 of preparation. PHYS1241 and PHYS2113 are new courses, so my rate is higher to account for preparation — 5 might not be enough, but really, my prep on PHYS2111 is sometimes closer to 1 hour than 2, so it evens out a little in the end. Tute loads are counted accordingly. Some people will have other things, like labs. Exam marking gets handled by a similar scheme run in parallel, again to equitably divide marking up, since our first year marking dwarfs everything else.

The allowances down the bottom are for the admin tasks. Being course coordinator carries a little extra work, so that’s accounted for. The bigger tasks in the school carry some really hefty loads, and those might not go all the way to covering the real time needed, but often those big tasks are taken by people at higher rank as leadership roles (they carry rewards in career advancement instead).

Right at the bottom, you get a total, for me 463.8 out of 416, which gives me 47.8 as credit into next year, i.e., that second row at top next year for me will be +47.8 rather than -7.2.
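For concreteness, the balance arithmetic above can be sketched in a few lines of code. To be clear, this is my own toy reconstruction, not the school’s actual spreadsheet; the function names are invented, and the raw research claim figure is hypothetical (all I know is that mine exceeded the cap):

```python
# Toy reconstruction of the WLU balance arithmetic, illustrative only.

RESEARCH_CAP = 100.0  # research deductions are capped at 100 WLU

def wlu_balance(annual_load, carryover, research_claims, teaching_admin):
    """Return (remaining_before_table, total, next_year_carryover).

    carryover:       last year's balance (negative means you owe load)
    research_claims: raw WLU from publications + income + students
    teaching_admin:  WLU credited for courses, tutes, marking and admin roles
    """
    research = min(research_claims, RESEARCH_CAP)  # apply the cap
    credited_so_far = carryover + research
    remaining = annual_load - credited_so_far      # still to cover via teaching/admin
    total = credited_so_far + teaching_admin
    return remaining, total, total - annual_load

# My 2020 numbers; research_claims of 150 is a made-up stand-in for "over the cap"
remaining, total, credit = wlu_balance(416, -7.2, 150.0, 371.0)
print(round(remaining, 1), round(total, 1), round(credit, 1))  # 323.2 463.8 47.8
```

Note how the cap does its job in the `min()`: no matter how large the raw research claims get, only 100 WLU comes off the teaching and admin side.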

Let me answer some questions that I can guess a reader might have about this.

Do carryovers really work? Yep, and I certainly know of cases where someone has accrued a high enough negative balance that they’ve been taken off teaching for a term to bring their balance back towards zero. Likewise, you won’t get away with not carrying your weight for long; your positive balance eventually catches up with you. There are serious discussions that involve “We can’t do X as Y will be overloaded”, but with a real quantitative aspect behind it, not an arbitrary perception or weird self-reporting.

Do you do swaps? Sometimes. We often do a little one-off cover for each other as ‘pay ya back later’. But I’ve had to skip a few weeks before to make something overseas, and then I’ve given the relevant WLU across to the person who covered me, as I knew it would be hard to eventually make that back up ‘in kind’. This works nicely; it means picking up for others doesn’t have to be a charity case.

What about transparency? We don’t get to see each other’s details above, but you don’t need to, and I think that would carry some privacy issues and be a little bad for esprit de corps. I call it ‘semi-transparent’ in that the school executive (exec) sees them all, and with a good head of school and a sensibly sized and staffed exec, you know, at the very least, that there is unlikely to be anything corrupt happening in the scheme, because one of them is going to call it out as unfair. That said, it also provides scope to manage cases where adjustments are required, e.g., illness, disability, etc., in some fair manner. This is one aspect where a workload model is not a silver bullet; in the hands of a head of department who disposed of their executive and decided to run special deals, a scheme like this is probably going to be only marginally better than none at all.

I might leave other questions to the comments, I’m nearing my word limit, but one or two parting thoughts on the scheme more broadly.

The scheme is only as good as the people running it — it absolutely requires a fair head of school and a good executive to work really well. That’s a governance issue I might talk about elsewhere. Thinking more broadly, if I were a DVC-EDI, for example, one thing I would be pushing for would be a version of this rolled out in every school, and potentially an annual review at Faculty level, less of the actual allocations and more of how the scheme is being run. There are some ‘arbitraries’ in the model above; for example, why is Teaching Director 112 and not 12 or 120,000? Why is some other role 56 and another 27? These numbers are reviewed pretty regularly, and probably also worth setting at individual school level, but with someone looking from above for cases where there is bad skew, in instances of a sub-optimal head & exec, for example. They should be justified upwards rather than set from above. From a higher level, this knowledge is useful: it enables you to better know the reality of your organisational human resources if you’re trying to make major changes, e.g., to teaching structures.

And with my remaining handful of words: Junior academics who are getting unfairly loaded, here’s a model. If your school doesn’t have one, push for one. If it helps to use my blog as an example, feel free. Having worked in a school with the model above, I am truly grateful for it, and would hate to work in a school that doesn’t have one as well structured as this one is.


Every time I fly home from one of ‘these conferences’ I stare out the window wondering how the hell this keeps happening… that bad taste in the mouth from yet another program lacking in diversity and novelty. While this week’s conference wasn’t guilty of the most heinous and thoroughly discouraged practice, namely an all-male plenary line-up, it had a number of things that really make me question why I even registered. Where do I begin?

The most festering sore…the panels. Oh my god, what an epic fail. On the Sunday there’s a ‘what makes a great leader’ panel, two men & one woman, and then, as if leadership is something women need extra education on perhaps (I’m still trying to work out what the actual underlying message was), there’s a ‘women in leadership’ panel, all-woman panel & chair, on the Tuesday. Got to love the token ‘women’s issues’ panel, especially after so many iterations, just to absolve yourself of your other sins… and man, were there sins!

On the program for Wednesday was an all-male panel on how to get your first job. Oh, the irony of three senior male academics at the top of the Ponzi talking to an audience where most have Buckley’s of getting a similar shot at being an ‘academic for life’. And the pièce de résistance, a 6-person panel on academic-industry relationships, all male including the chair.

Some of you will have read my prior blog post on minimum standards for conferences. I got ambushed on this one. I had already registered, the conference having met the threshold on speakers at my 2020 standard, only to see those panels and want my money back. I stupidly walked into a mistake I pointed out on twitter last year – paying your registration to conferences with bad practices is simply enabling them financially. So, I felt duty bound to make a stand on this one to redeem myself.

It began about a month ago, when the organisers spammed an advertisement about the panels to the conference email list. I sent an email to two of the chairs that I knew personally, one of whom I’ve raised this issue with several times before, telling them in no uncertain terms how bad a message this sends to people in their audience, and offering to assist with suggestions to remedy the situation.

I never even got a reply.

I arrived at the conference, the updated program on the app, still the same panels.

OK, time to step it up then, so I decided to lob a grenade at the ‘women in leadership’ panel. I can’t remember my exact wording, something along the lines of “This may be a controversial question, but one can’t help but notice that the next two panels at the conference are all-male panels, including one with 6 men on academic-industry partnerships, a topic I think any of you would be very qualified to talk about. Clearly, we are well beyond awareness on this issue, so what do we start doing to stop this from happening?”

Amanda Ellis gave a really good answer to this, something like “It is time for men to step up and do something about this because women can only do so much. Start refusing to be on biased panels, start making it clear to organisers that this is not ok.” I think this is some of the solution, let me come back to it below, but where did things go after this…

Wednesday’s panel arrived, now with three men and one woman. OK, we’ve gone from manel to token, but it’s better than nothing. Not exactly what Amanda was suggesting, and I would have had much more respect had one of those three panelists actually made the sacrifice to lead by example and clear a chair to get to 33% rather than just 25%. Optics matter, particularly in leadership.

Thursday’s panel was a howler, and a good example of why last-minute fixes are shit for everyone. The 6-man panel had gone to 6 men and 2 women, who had clearly been dragged into this in a hurry. Why do I say clearly? Well, each panel member got to introduce themselves; the men all had a stack of curated slides, something of a mini-talk so to speak. The two women had no slides at all, and got to be the last two to introduce themselves. As if the program and seating layout hadn’t already made it obvious enough that they were an afterthought. The arrangement and chairing of this panel can only be described as woeful (I am being diplomatic). The panel had 40 minutes available, and 35 minutes were consumed by the ‘introductions’, the last two shorter than most, such that there was barely 5 minutes left for discussion. Two questions were put forth; one was barely answered, the other never even got that. And that, folks, was it. I felt pretty shit after this. All I seem to have achieved by calling the conference manels out was to put two of my female colleagues into the shit-sandwich of being pressured at the last minute to give their best performance in a shemozzle that was little more than an insulting waste of their time (I sincerely apologise to them both on behalf of the people who should have done a better job so that this never had to happen).

An interesting question is: Does this have any blowback against the organiser of that panel or of the other panels? How about the conference chairs? After all, the chairs have a leadership and oversight role and therefore a responsibility to ensure quality, equity and diversity in their program. Of course not. We all go home, some of us pretty disappointed at the whole thing. They put their contribution as a line in their CV, get positive credit for it (accumulate merit-cookie), and move onto the next conference. Where it all happens again, and again, and again.

This experience, and Amanda’s answer, got me thinking. Surely the solution isn’t entirely down to men taking a stand when invited onto panels. Sure, that’s some of it, I 100% agree, but it cannot be all of it. What we need here are some mechanisms of accountability for how these conferences are organised.

There are limits to what you can do. Conference series are somewhat independent structures; you can’t act punitively on them easily. The chairs are also hard to exert pressure on. They’re not going to own up in their CV to “Chair of Conference X, which, by the way, was a total sausagefest” or “Chair of panel on Y, which descended into farce because I can’t manage a piss-up in a brewery”. Neither can you exert pressure via funding schemes, for example, simply because these things aren’t really the assessment criteria, and taking out your grudge that way is just unethical.

So how the hell do we do it? I can see a few mechanisms here.

The first would be via Athena Swan awards. One possibility here would be to have every conference and its outcomes tied to the conference chair’s own organisation’s Athena Swan measures. If there are co-chairs, then this would be divided equally between the co-chairs’ organisations. Since this has an impact on something a university cares about in terms of PR, it would mean that the university’s DVC Equity & Diversity can lean hard on the conference chair. This could include rescinding financial support, since that often comes from the home university of the conference chair, but could also extend to punitive measures directly on the chair from a HR perspective, e.g., for damaging university reputation. After all, this conference I was at had a hell of a lot of University of Queensland branding floating around, and I will forever associate UQ with that conference. I doubt the top folks at UQ would like to hear me say that (esp. since I once almost became academic staff there). This Athena Swan approach could be both a stick and a carrot – particularly good conferences should be able to advance a campus’s Athena Swan measures as much as the bad conferences hurt them. And there could be financial rewards for staff for doing so!

The second would be to start advocating for conference sponsors to make demands of the conferences they sponsor. A good route to achieving this would be to start publicly asking the sponsors why they are supporting and enabling the most egregious cases, whilst showing gratitude for supporting positive action. Sponsors will be even more responsive to negative PR than the universities; they sponsor entirely for marketing reasons and their PR really matters. We could start by encouraging sponsors to write minimum equity standards into their sponsorship contracts, for example. It’s good for these businesses because better diversity of speakers and panels means larger, more engaged audiences, and an increased customer base to interact with.

The third would be for delegates to become more savvy (myself included after this experience). What I learned this time around is that early-bird registration is perhaps not the best option. Sure, it saves some money (in this case about 14%). But that’s peanuts compared to going to a conference that disappoints you the entire time and sends you home feeling annoyed or uninspired. I would happily pay the $160 early-bird gap out of my own pocket to undo my decision to attend knowing what I was getting into. By delaying registration, you have more leverage over the committee. They take your emails more seriously if you say ‘if you don’t fix this, I just won’t register’. After all, nothing makes a conference committee panic like low registrations. There’s significant ‘upfront’ in any conference budget, and most of the savings given to early-bird registrants are really a ‘certainty cost’ to the conference budget.

The fourth would be for delegates to be more discriminating. There are a lot of conferences on the market, more than we can ever obtain grant money to attend. They can also be expensive – registration is regularly at the $1k+ level now. And that’s before you include travel and accommodation. If all that’s going to happen is you are going to see the same old plenary speakers over and over again, or be subjected to monocultured panels, or be in a program so disorganised that it might as well have been put together Nostradamus style (throw the abstracts down the stairs, pick them up, and that’s the talk order), then why waste the money? If there are only going to be 13 uninterested people, half of them looking at their phones, during your talk, then why spend the $1k at all? Your talk opportunity is compromised by the lack of audience, and your listening opportunity is compromised by having to hear the same old plenaries again and again.

There’s always better opportunities, right?

This is particularly the case for the big ‘meritocracy & showcase’ conferences, which are more often than not totally soul-destroying experiences that I end up wishing I had never attended (MRS Fall is the only exception). Especially when you never have much hope of ever getting a major talk because the same old people keep being given the slots ‘because it’s meritocracy, stupid’. This conference included at least 3 plenaries that I had already seen more than twice in 5 years. My tally for one of them now stands at: seen twice, skipped 3 times, and attended once while listening to Amon Amarth on my AirPods just for some giggles, as I was there for the next speaker, who I’d also seen before but wanted to see their two new slides. In another upcoming major conference, the same Nobel laureate has spoken at it every single time I’ve ever gone since I was a Ph.D. student, and the talk is always much the same. We’ve often joked that when he dies, they’ll wheel him out, Weekend at Bernie’s style, stiff with merit, just to watch him decay for 45 minutes and bask in his meritorious glory. Meanwhile, you go to a standard session and see some really cool stuff crammed desperately into contributed slots that would make a great plenary, but no one on the committee has the imagination, and the meritocrats at the top would never give them the chance. Especially when committees are using plenary slots to buy influence.

Feeding off this, the fifth would be to start looking for ways to be visible that don’t include the conference circuit. After all, in a carbon-constrained world, we’re going to need to stop going to so many conferences (at least until we’ve decarbonised our transport options). The one positive I saw in this conference was that some plenary speakers didn’t come but delivered their talks by Zoom. The quality was just as good as in person; if anything, in the giant theatre you could actually see them without needing binoculars. Why can’t we watch talks in our own time without conferences? Why can’t an academic website host your ‘plenary talk’ for people interested to see at any time? You don’t need conferences to give you a platform; the internet is your platform.

I’ve hit my word limit, so I might stop there and leave this for discussion in the comments? What other clever ways can we end these bad conferences? Feel free to make some suggestions.

Zen and the Art of Academic Mind Maintenance

Academic life is extremely busy, and gets soul-crushingly more so each year as the ruling clown show attempts to wring every last microdrop of blood from the stone to prevent the ultimate collapse of ponzidemia. It’s easy to think about little else in the desperate rush to keep your head above water and survive the surging torrent of endless work. But sudden tragedy has an interesting way of ripping your attention away…

Life also has an interesting sense of humour in the timing of events. One week you’re paddling hard as usual, happy to have survived the rapids of the teaching term for the calmer waters of a well overdue sabbatical. On the weekend you get an 8am phone call that no academic ever wants — a student you are close to has taken their own life — and then, less than a ridiculously busy & traumatic fortnight later, you find yourself alone and relatively unloaded in a new country for six months… no friends, partner still working back at home, a huge language barrier and a shoebox apartment living out of two suitcases, with endless hours to think about your predicament. Watching from afar, as everything back home finds its way back towards equilibrium without you. Powerless, isolated, invisible.

If there’s any good to come from this, it’s that it somewhat forcefully presents the opportunity to ask yourself some really tough questions, and then to think very deeply about them. And being in Japan, a country with strong Buddhist traditions, this is well facilitated — there are places and rituals perfectly designed for deep mental explorations, be it sitting silently in some ancient temple or a solo hike through mountainous jungle in the near 100% humidity of August, sweating bullets and trying to stave off heatstroke, snakes and hornets.

Six months of meditations is too much for a blog post, so let me hit my top three. Maybe they’re pointless drivel, maybe they’re obvious and I just woke up, maybe they change the way you see things too. No particular order.

1. Talk is noise, only the actions matter: 99% of those who read this will have done so via social media. Social media should have been a blessing. It has become an utter curse. I became so horrified with what it had become, and how it made me feel and act, that I deactivated & burned my twitter account, by that point aptly named 死亡フラグ, deleted facebook from every device I owned, and for several months only used instagram to share my best Japan photos and Messenger to keep contact with non-work friends and family. I also ignored all news sources except The Japan Times, which I only used to keep up with big local events (e.g., typhoon warnings, sumo results, etc.).

It was fantastic, I immediately felt a huge weight lifted off my shoulders, particularly on burning my twitter account. The ‘crack addict’ impulsive phone-checking was gone within days. I was once again aware of the world around me, in a good way. Even today I still seriously think about never going back to twitter, having missed only the occasional contact with a few new friends I made during my time there.

But as an interesting experiment, I phoenixed my account just before getting home, and set myself the rule of no tweeting, just observation only, for several weeks. Essentially, it was a sort of ‘twitter meditation’, where I let the tweets just pop up, observed them dispassionately, and let them disappear again, without interacting. I just asked myself at the end of short sessions looking at it: What do you see in all these 280 character chunks?

You know what I noticed almost immediately: 99.99% of what you see on social media is pointless irrelevant bullshit. Toxic crap. Brain farts that shouldn’t exist outside the brain that farted them. People hurling outrage and abuse at each other, flipping out at the slightest perceived infraction, jumping totally off the deep end. The outrage floats in this endless sea of irrelevance, tweets that only exist to seek validation from a public that doesn’t even care, as though we lived in a universe where anything not tweeted (and not liked or retweeted) simply doesn’t exist, or worse, makes you a loser or a nobody. It’s the electronic equivalent of a septic tank, where we all contribute our mental excrement several times a day.

Another thing you notice quickly is what I’ve come to call ‘virtual signalling’ (as you can probably tell, I enjoy coming up with new terms). It’s the same ‘virtue signalling’ we normally think of — whether you consider that good leadership or an insult is irrelevant — except that there’s no real action tied to it at all. It’s just a glib statement of desired behaviour, a virtuality rather than an actuality. You see it a lot in academic accounts, and anywhere there’s an easy perception that if you aren’t ‘lefty enough’ you will get shunned by the community (I don’t believe left and right have relevance any more, but I won’t digress, the usage I just made is clear enough). I won’t give specific examples, lest I have them pinned as me firing arrows at specific targets, but let’s just say, it’s interesting to notice the dissonance between the tweets and actions of some you know both as tweeters and in real life. It’s not everyone, but it’s enough that you notice.

Indeed, the whole merry-go-round of twitter, LinkedIn, facebook, etc is interesting when you really step back from it for a while having been deep inside it. It really reinforces that classic old chestnut: “Talk is cheap, actions are expensive.” The big lesson in all this, maybe we all need to say a lot less and start doing a lot more. Be the change you want to see in the world. If you are being it, then you don’t need to talk about it because it’s happening. And people see actions, and talk about them — if others are talking, then you don’t have to say so much. On the flip-side, if you have to make a lot of noise, maybe it’s because the action isn’t there.

This isn’t to say that only the silent do anything, or that anyone on twitter is all pointless words and no action, or that talking about action nullifies action — one side effect of twitter is learning to figure out the worst misconstruction of anything you say and how it can then be used to clobber you over the head. But if you’re an academic on twitter, I can highly recommend taking a few weeks of silence to dispassionately look at the flowing river of toxic shit spewing forth and what your contribution to it is. How does it make you look? How does it make you feel? How can you do better? And I don’t mean tweet better, I mean DO better, in your actions. We can all do better, myself included.

2. Worry more about your substance than your achievements: One thing that’s always disturbed me about academia is how easy it is to lose the person for the achievements. There are two sides to this coin. The first is frequently highlighted in a ‘meritocratic’ context. An often-used example is someone with questionable behaviour, e.g., wandering hands, being endlessly excused because their achievements are so great that they outweigh (for some) the bad behaviour. I’ve always been amused when you hear all about Scientist X and their amazing achievements from a colleague, and you ask ‘yeah, but are they a good person?’ and get weird looks or blank stares in return. As though that’s a completely stupid/pointless question to ask. Or worse, you find out they are human trash, somehow excused by academic brilliance.

The second is perhaps less commonly pointed out. It’s easy to become so desperate to survive a system that cares only about maximising a certain set of achievements that you can easily lose yourself as a person, or worse, follow dark paths in trying to satisfy them. The system builds a ‘win at all costs’ approach (the dreaded meritocracy again), that soon sees you, e.g., sandpapering a cricket ball to get the edge that wins the match, or turning a blind eye to such… Humility, generosity, honesty & fair dealings can all fall so easily to the wayside, only to be replaced with sniping, gossip, mafia tactics, backstabbing, dishonest promises, etc. All justified by just being ‘what you had to do to survive’. Is survival really worth it if this is what you have to do?

Tragedy is a good catalyst for escaping the meritocratic mental merry-go-round… Nothing switches your mind away from metrics like being harshly reminded of mortality. What’s the point of having a life if you don’t enjoy it; if you just spend it all chasing achievements to help managers with over-commitment issues achieve overpromised KPIs, or to impress merit-obsessed colleagues you don’t even like?

All the little quibbles and pointless tasks and chasings to pad out CVs really pale into insignificance. The stuff that really matters becomes a bit more crystal clear — the good people around you, the people you work to serve, i.e., the students and the general public, your kids or family or friends or hobbies. It all starts to matter more than the utterly pointless and unending quest to just push up university league tables.

Probably the biggest question you dwell on is: If I died tomorrow, what would people say about me? I don’t know about the rest of you, but a eulogy full of academic achievements would, in my eyes, be a damning insult. I wrote 300 papers or gave 27 million plenary talks, who cares. I managed to better understand Phenomenon X or won Prize Y, so what. I’d much prefer to think my colleagues thought I was a mostly decent guy (no one is ever perfect) who was sometimes fun to be around, that my undergrads enjoyed my teaching and thought I gave my best to teaching them difficult stuff, that the folks who worked in my research group enjoyed being there and had a good environment to work in that enabled them to achieve their best without destroying themselves. Nothing in the metrics points to this — you can have insanely good metrics and be terrible at all these things, and you can have rotten metrics and be thought of highly!

None of us are perfect, I’m a jerk and an arsehole too sometimes; this isn’t a demand for perfection. It’s really just a quest to not forget the value of the things that cannot be easily quantified in workplace environments. A focus on Quality, not in the usual corporate bullshit sense of the word, but rather in the sense talked about by Robert Pirsig in his classic Zen and the Art of Motorcycle Maintenance (Richard Buckland put me onto this book; I highly recommend it as a read for all academics too).

As we head towards the inevitable collapse of the academic system (see below; at the very least we have passed its golden days), I think one of my favourite sayings in academia rings truer than ever:

“Everybody in this game is smart, hardworking and good at what they do; distinguish yourself by being kind.”

3. Death exists for a reason, but we now need organisations rather than people to die: Imagine people didn’t die — angry male boomers would be the least of our problems; Genghis Khan and Adolf Hitler would be kicking up real storms. Death exists as a necessity for evolution. The way we advance is with the old being replaced by the new. Typewriters and trilobites, both left in the past for newer things.

What’s interesting is how easily and suddenly people die, even when they shouldn’t, while organisations that are obviously stuck in the past and should have died long ago are somehow not just still with us, but as protected and powerful as ever.

I love Japan, there are lots of positives to that country. But one downside that totally blew my mind was the bureaucracy, which is literally stuck in the 1980s. Outsiders and tourists imagine/see Japan as a very modern country, a producer of much technology. They produce it, sure… but in lots of places, they just don’t use it. At all. So if you live there for a bit, you have to go register your address, join the health system and pension system, open a bank account. I don’t know about everyone else, but this process is eye-opening and somewhere between amazing, ridiculous and horrifying (perhaps all three in some acid-fueled kaleidoscopic sequence). Large staffed offices await, with floors full of photocopiers and file shelves. Address details are stored on paper sheets in lever-arch folders, found by consulting paper maps in other lever-arch folders. Forms are filled out on paper, photocopied and filed. Banking is done using bank books (I haven’t had one since I was a ‘dollarmite’ aged 9), which you can put into an ATM to have the transactions printed on it (that lovely sound of line-scanning mechanical printers leaking out of the machine). When you head home, they can see your final health insurance payment in the computer system, it says so right there on the screen, but… the stamped paper bill stub isn’t in the file folder because the konbini hasn’t sent it yet, so back home you go to get your stamped paper receipt to prove the payment. Everything is in cash, including when you have to be paid some expenses for giving a talk at a university, whereupon you can have the great (and weird and funny) experience of being quietly handed an ‘envelope stuffed with cash’. I could go on, but that’s enough to get the point… This thing is obviously not sustainable. The question isn’t if it will go (it has to), it’s just how long conservatism can sustain it before it dies, and how spectacular that death is.

It’s staggering, and when you see it, it really makes you think in a different way about how organisations can be completely resistant to change and at the same time, somehow stave off dying.

Which brings me back to academia, which is a close rival to the Japanese bureaucracy for the thing in most dire need of total disruptive revolution. I dealt with scientific societies in my previous post; many of these dinosaur-closets are in dire need of an end.

There’s the journals. In return for insane profit margins, we give them our work, labour and mental anguish (as they endlessly stuff us around in the reject-reformat-resubmit dance) for free. The system needlessly slows down our advances — if I could reclaim the time lost to stuffing around with journals, I’d easily be 50% more productive — to the point where any organisation not in the journal space, e.g., start-ups, can easily outpace us on competing work. The only reason we’re wedded to this system is the conservatism of the scientific meritocracy — we don’t need these journals to publish, we could do that with arXiv. We use the journal system as an initial proxy to arbitrate quality so we can run the meritocracy that underpins the league tables. The league tables protect the publishing companies, the publishing companies protect the league tables. And it’s why we spend forever going around in circles trying to get anything published these days. You have to pitch at the top journals to survive the meritocracy, and they have to reject you as part of the meritocracy that enables them to survive. Welcome to the Hotel California — “We are all just prisoners here of our own device”.

There’s the conferences. A few are good, but most are the same tired old format, with the same old blokes giving the same old plenary talks over and over, and a bunch of showcasing. It isn’t long before you’ve heard it all before. Physics Today once famously noted that the obituaries were one of the most popular parts of their magazine, mostly for the young rejoicing in the fact that a slot has opened now that yet another dinosaur is gone — the irony of the obituaries sitting adjacent to the academic position openings section was not to be missed!

There’s the funding agencies, with their ever-growing form requirements and ever-growing justifications of this and that, just to compete for a diminishing pool that’s salami-sliced ever thinner, unable to keep up with a growing horde of applicants. And then you look at the awardees — so many instances of the same CIs with the same essential title and the same essential project description that got funded 3 years ago and 6 years ago and 9 years ago and 12 years ago… I’d love to join a project to mine the ARC database for some choice examples of the same essential project being funded again and again for well over a decade, as I know quite a few examples where the outcome promised a decade ago still hasn’t been achieved, and likely won’t be come 2030 either, yet it still gets sold as a fundable proposal! The instance where CI X has new co-CIs and is doing something very different to 5 or 10 years ago is comparatively rare — probably a sign that there’s success to be had in simply not evolving in academia. The sort of topic and field jumps that you see people make in private sector research are basically impossible in the academic system; you’d be instablocked on track-record grounds.

There’s the universities. Now just slaves to gaming the international league tables as their sole reason to exist. The league tables trump everything; it’s all just a game to wring more out of the same old machine, even at the expense of the tangible core outcomes, just to be able to wave about a jump of a few places in Ranking Table X in your PR. Everything becomes about the numbers, the substance be damned. It’s like taking Jackson Pollock’s Blue Poles, calculating the fractal dimension, burning the canvas to ash, putting the number up on the wall of the National Gallery in neon lights, and expecting the public to still come and be engaged with the art. Does anyone really think that SpaceX, or Cochlear, or Vestas care as much about where they are on similar league tables, e.g., who’s the biggest company, to the point where it drives the entire executive strategy? Does anyone really think Ibanez would alienate all their customers by focusing solely on 7-string basses, just so they can be the company producing the basses with the highest number of strings? But recklessly compressing your teaching program, all the while putting your students under greater stress than ever, driving them off your campus and out of your enrollments, just to jump a few spots higher on some pointless table? Totally cool and normal. Top academic leadership. Education be damned, right? The money still comes in, as they can’t get a job without a degree… as long as we’re the number one choice in ponzidemia. How long before businesses see university degrees as disruptable, because students can’t learn what they need due to how university programs are structured, and just hire kids without degrees and train them in-house? If universities just stay focused on this pointless pissing contest, that disruption may not be so far away.

The whole business of academia seems endlessly stuck in the past. Managed by folks who’ve never done anything but be in academia. Chosen to be managers by managers stuck in the same rigid thinking simply because they’re correctly stuck in the same rigid thinking (the ‘safe pair of hands’ problem). After all, once you’re in, there’s no incentive to leave academia, simply because you know 100% there’s absolutely no way you would ever be allowed back in the door. Welcome to the Hotel California — “You can check out any time you like but you can never leave.”

One thing’s for sure, something’s got to give… eventually. We have a system that’s stuck in the past, with leadership that cannot see past a) doing what we always do, and b) tightening any screw they can to milk more out for league table supremacy without violating a). All with severe constraints coming at decadal scale. There is no creative leadership on the sustainability of the business — it’s simply keep the accelerator to the floor and hope the car doesn’t run out of fuel or crash. Government funding is declining, yet every DVC-R at every campus claims they are going to grow their share of government research income. They can’t all be right. We do a terrible job of convincing the public we are important, meaning that funding pool will keep declining for the foreseeable future. We’re generating oversupplies of graduates, who then can’t get jobs but are stuck with massive debt; there’s an adverse feedback loop coming on this I’m sure. Even postdocs are almost unaffordable now. International markets are shifting quite strongly, such that traditional supply streams may soon run dry. Yet, every DVC-A at every campus claims they are going to grow their share of enrollments, both local and international. They can’t all be right either. Ponzidemia has to collapse at some point, I just can’t work out where it fails first. One of the many fantasies propping this system up is gonna be the straw that breaks the donkey’s back. But which one is it? When it all lets go, which is the campus that dies first? Because companies collapse, and now that universities are essentially companies (future post), the time has to be coming that universities collapse too. I’m still waiting to hear of a single person in the academic leadership system that doesn’t have their head buried in the sand on this.

And if you think I’m looking too hard for a problem here, take a look at modern politics. Similar issues. The organisations have forgotten their core business for different masters, they’re run by people who’ve never done other things, many have been ‘political class’ since they joined undergrad student politics, the bases are angry and unsatisfied and feeling ignored and demanding change. The right change agent just needs to come along… maybe it’s the same for academia.

Scientific Societies — The standard you walk past is the standard you accept… start demanding better.

Scientific societies… for any young researcher they’re a part of the scientific landscape that you cannot avoid stumbling upon before getting far into your career. A few are good, and serve a very useful purpose in connecting communities. Many are mediocre, poorly managed relics of a bygone era trying to stay alive in a modernising world. Some are downright rotten, and little more than old-boy mafias designed to protect and advance the interests of the few at the expense of the many. Across the board, they are good at luring in new members to keep themselves growing. It is easy to feel flattered by their interest in your engagement, and to imagine them as another ladder to your advancement…

… as someone who, like many, has fallen into the honeytrap, it can be interesting later on to look back at what you’ve seen. At what gave you true value. At what was so unbelievably cringeworthy that it made you wonder how such a thing can ethically exist at all! Of the many societies I’ve joined, there are very few I still remain in, and even those I do remain in, I question staying, less in terms of ethics and more in terms of value to a member against money spent. There are some I saw real value in, some I’ve left because I didn’t see the value proposition, and there are others that, could I burn them to the ground and piss on the ashes, I would do so in a heartbeat.

Most of my writings on this blog are aimed at giving those who follow me some useful wisdom that I wish had been passed down to me, to save me some grief. And so, for those new to the great Ponzi scheme of academia, I want to give you some things to look for when choosing which societies to invest your money and effort in, and which should be left to go the way of the dinosaurs that run them.

To keep it short, let me get to bullet points…

  1. Is the organisation democratic? How is the leadership decided? Is it elected? Is it appointed? Do people come in, do their term, hand over and move on? Or is it the same names rotating around a circle of positions, perhaps even keeping the same position for decades? When candidates are put up for election, are they all just old white blokes from ‘Sandstone/Ivy league’ institutions, or is there some proper diversity of representation?
  2. Is the organisation transparent? How well can the members see inside the goings-on of the management of the organisation? Is there an AGM? Are minutes published? Are the budget sheets well accounted for? Can you see what the outgoings vs the incomings are, and how they provide you value? If you talk to the leadership, do they care about your interests and opinions, or do they just talk down to you or ignore you completely as you aren’t very important (after you’ve paid your dues, of course)? Very few professional organisations are even remotely transparent on this; for some, the only transparency comes by accident from their complete incompetence at InfoSec. Don’t be afraid to do your homework!
  3. What the hell do they actually do that provides you value? Professional societies do lots of things. Hold conferences (like we need more of them), manage journals (like we need more of them), give out awards (like we need more of them). Some exist almost entirely as lobbying organisations. Some are businesses aiming to make a profit off conferences and journals. How much is the membership fee? What do *you personally* gain from that membership fee? Or is it someone else’s gain? Or is it sitting in a bank account as ‘assets’ that you aren’t even aware exist? Imagine the organisation collapsed: who would get those assets? Where would they go? Would they be divided and refunded to members?
  4. How much do they advance diversity and progress in the field? Are their conferences endlessly filled with all-male plenary and keynote line-ups, with the same old blokes talking meeting after meeting? With insulting sessions on why women and minorities don’t remain in the field, despite the organisation’s obviously exclusionist policies? Or do they actually put real effort into exposing the community to new and different people with new and different ideas, and into embracing diversity? To what extent are the prizes just going to people already in the organisational leadership through a closed, opaque process, or are they given to people with no prior connection through fair and properly transparent nomination processes with well-publicised guidelines and outcome statistics? How much correlation is there between past awardees of any award from the organisation and people who have held leadership positions in that organisation at some time (past or present)? Are their journals diverse, or just a vehicle for an easy publishing ride for members of the club?
  5. Is the advocacy of the organisation really for everyone, or is this about advancing the interests of the inner sanctum first? This can be hard to tease out without getting close enough to see how the interactions really work. This last point is perhaps less a criterion to join and more a criterion to decide to jump overboard with your cash/time and start swimming for new shores.

Ultimately, the list above, and some recent pieces of satire — Part 1, Part 2 and Part 3 — are designed to make you all think a little bit harder about whether the professional organisation you intend to join is worth joining, and whether the one you’re in is worth staying in.

The latter is an important point — staying with a dodgy professional organisation is actually doing yourself and your field a disservice. In the words of Lt. Gen. David Morrison, “The standard you walk past is the standard you accept.” By handing your hard-earned salary over in membership fees, or your research grant monies in conference registration or publishing fees, to dodgy professional organisations, you are implicitly endorsing and supporting their activities. You can say whatever you want, but money talks: if your actions are incongruent with your words, you are advancing bad interests.

Do your homework in advance. Once you’re in, don’t be afraid to walk away from these things if they aren’t serving your interests. Me personally, I’m down to only two professional organisations, and I’m currently questioning both (less in terms of morals, more in terms of value).

Don’t be afraid to let some of these organisations actually die. Death is part of evolution; it’s why we no longer have to worry about being eaten by sabre-toothed tigers. Also, don’t be afraid to start your own organisations… Remember, all of these organisations started somewhere, often from small groups of young people keen on making change. If you do this, avoid the established societies like the plague (see Part 1). You don’t need them. They will only seek to co-opt you to serve their interests. Stay fiercely independent.

As for me, now that I’ve spent a week ‘sucking out the venom’ from my years of pain with professional organisations (not all bad, some really do make great contributions, and those contributions should be visible and the management transparent and humble), it’s time to focus on other things…


Dealing with Referees at journals

This post follows an interesting recent job as an adjudicating editor for a paper that underwent two rounds of review, was rejected, and came back under appeal. The first round of reports was: reconsider, and submit elsewhere. The second round was: submit elsewhere, accept, and submit elsewhere. This scenario is most authors’ worst nightmare: almost there but not quite. The appeal was textbook “How not to convince an editor to overturn the referees”, which is sad because it can be done, but it takes a very calm head and a good tactical approach. As an adjudicating referee or editor, it’s far from the first time I’ve seen authors write an appeal that’s about as successful as the Vasa, so I figured I’d post some useful observations on how (and how not) to deal with referees when the reports that come back aren’t a full hand of accept or accept with very minor revisions. (n.b., the points below are not limited to that most recent paper or entirely motivated by it.)

Being annoyed is ok, just don’t let it show: Everyone hates rejection. The key is to not show it. If that means sitting on the reports for two weeks, fine. If that means walking around the corridor telling everyone how much those comments suck so they can remind you that they get the same too, that’s also fine. Running, boxing, smashing plates, whatever it takes: expel the rage, get a zen-like calm, then move on.

Ad hominem attacks on referees never work: Seems obvious, I know, but you see them again and again. The editor is always going to side with their referees unless they’ve said something mind-blowingly stupid or inappropriate, and even then, you can be tactful about it. Questioning the referee’s competence, haste, or mental state is never helpful. Accusing them of impropriety, deception, etc. likewise never works.

Forget guessing who they are: You are always wrong. I’ve only ever seen one exception. It was clear the authors thought Person X was Referee F. Person X was certainly a referee, but they were Referee Q instead, and actually on their side. Some referees are good at masking themselves. They have the advantage in that. Don’t play this game, you never win.

Be unemotional in your response: A good editor will know you’re angry and just ignore any emotion and vitriol in your argument. A bad editor will let it colour their view of you. Either way, it’s not the shown emotion that’s your worst enemy here; it’s the fact that the emotion kills your ability to carry off a cool, objective argument. If you have a strong point, you can back it with facts. You don’t need emotion. If anything, you can be completely concessionary and your argument will still draw blood if it’s sharp.

If you disagree, back it with facts and literature: The editor is not a fool. They will back their referees as an a priori position, always, but they are open to logical argument. Make sure that argument is as clear for a non-expert as you can make it but absolutely backable with evidence. It can help to point out what the referee is right about along with what they are not right about, because it’s rare that someone is 100% wrong. Be reasonable.

Make sure your logic is bulletproof: Try to anticipate counterarguments: where can you be defeated? Avoid things that are easily shot down. For example, saying that the fact that a report was returned within a few days means a proper job hasn’t been done is dangerous. Does the fact that a referee took 3 months mean they thus spent the entire 3 months on the report and it’s intrinsically better as a result? Probably not. Argument killed. Don’t put forward an argument that you cannot logically defend.

Always look to make concessions: Taking up the “The referees are stupid, we’re changing nothing…” heel-dragging stance will get you nowhere. In contrast, pointing out that a referee is correct, or perhaps not completely correct but you can see why they got confused, and that you’ve made changes x, y and z to make sure no one else gets confused the same way, will get you everywhere. Editors love the hell out of this. The referees can be wrong and the editors will still find their value. They are the canary in the coal mine for your paper: they point out the obvious flaws, and statistically, a flaw that snagged one of your n referees can be expected to snag a similar fraction of your readers. The referees are a small, discrete sample, but they are a sample. Fix the holes.

No tinfoil hats: Do not ever go down the path of accusing the whole set of referees of some kind of intrinsic collective bias against your field or against experimental/theoretical papers or against your work. The referees don’t know who the other referees are. They cannot be part of some crazy Illuminati-style conspiracy against you. Even if you suspect it, just don’t even go there…

Look where the train crashes: If a referee gets hung up on some aspect of the paper and then says they can’t see the significance or impact or whatever, it’s usually a sign that you lost them. A lost referee is rarely going to write an eight-word report: “I don’t get any of this, reject it”. What they normally do is pick it apart on the bits they know, and then reject the paper on the significance criteria. This can look like they only made technical arguments and have no justification to reject. The reality for an editor, who remember knows who the referees are, is that it shows how broad your audience really is: you can rely on readers the same distance from your field as that referee having equal struggles with the paper. The clue for you is that the hang-ups show where you start doing a bad job of keeping your readers. This can be a powerful tool for improvement. Don’t think ‘what’s wrong with the referees?’; think instead ‘what do their comments tell me about how far they got in before I lost them?’. It could well be the first column, in which case you have a lot of work to do. All is not lost, though: I have seen this fixed and turned into acceptances before (and done it myself too). It just takes some very clever mastery of techniques for handling referees.

The higher the journal, the broader the referees: It stuns me how many people submit to high-impact journals somehow expecting that the set of referees will only contain experts in their tiny little subtopic, and then decide the way to rebut referees is to attack their technical expertise. The higher journals never give you this set of referees; they want to find out what will happen to this paper with a diverse (read: non-expert) audience. That doesn’t mean they’ll send you to people who are clueless about the topic. But they will send you to people who are far enough away that they’ll get lost if you haven’t written your introduction and conclusions exceptionally well. If your referees don’t get it, I’m sorry, but it’s not because they aren’t expert enough to appreciate it; it’s because you haven’t done a good enough job of writing for a broad audience. The only way to make amends in an editor’s eyes is to resubmit with major improvements on this front, no matter what round of revision it is. If you don’t, an editor can rarely help you, because the referees’ comments stand and there’s really no way out of the fact that if they didn’t get it, the readership probably won’t either.

Just because you made revisions doesn’t mean a referee won’t still reject: Another flawed expectation from some authors. Some referees have a high bar, particularly at the top journals. Sometimes the paper is improved, but it’s still just not there. Some authors take a lottery approach to this: they grudgingly give concessions and go through rounds of revise-and-resubmit or revise-and-submit-elsewhere until they find a set of referees that will take the work. The better approach is to keep working on improving your writing and upping the effort you put into revisions. On the latter, always give 110%; go the extra mile if you can. Referees and editors are so used to crappy rebuttals and pathetic changes that even moderate effort can have a massive positive impact.

In the end, the key points here are: respect the referees, look for the hidden clues about how to make improvements, back any disagreement with facts and literature, and sell the improvements you’ve made rather than going head to head. I’ve never seen anyone go from reject to accept simply by going to war with the referees. It’s a losing strategy 100% of the time.