It’s been a while between posts. I’ll begin with two caveats. 1. I enjoy writing in stream of consciousness; think fast, write fast, let the warts be topics for discussion. Over-refined arguments are conversation killers. 2. I was an agnostic forced through the Catholic school system. I took joy in arguing contrary points to extreme lengths just to see how far I could get defending them. A loss was often a win.
If you don’t like challenging your thinking, don’t read this article. If imperfect arguments drive you nuts, don’t read this article.
Cue 2015. Twitter is full of discussions of #ponzidemia and the unsustainability of academia. The ‘anointed’ professors get money, often for stuff they’re not best qualified for; track record is king. Junior academics are on the breadline; they spend all their lives submitting proposals, only to have all or almost all of them rated highly and rejected. Sometimes those ideas appear the next year, authored by ‘anointed’ professors. We have postdocs and superdocs and probably soon megadocs. We have seas of Ph.D. students, who come in ‘bright-eyed and bushy-tailed’ and, 4 years later, are burnt out and wondering if they’ll even have a future after living on the breadline on a measly stipend while working 60+ hours a week. Their future: work 60+ hours a week for 15 years, and if you’re lucky, you might have a <5% (and falling) chance at some sort of stable position. Don’t have kids, don’t have a life, don’t settle in any given place or it’s probably game over. Or just leave; sorry, the university thanks you for the business, be sure to return your graduation garments on time.
Enough woe, we all know it, you all get the picture. How the hell can we fix this system?
Probably not by keeping the status quo. So, what’s the alternative? Is there one? Is there a radical solution? Perhaps… but I think it involves breaking the horrible entanglement between the two core businesses of a university: teaching and research. I see this historical artefact as a cause of many problems in modern academia.
Before I get to a possible solution, I’ll declare two things. First, the other night I read this great article about the value of monopolies — “Competition is for losers” by Peter Thiel in the Wall Street Journal. It says some really interesting things about innovation in competitive and monopoly environments. Second, I also read this great article about HP labs and the future of computing — “Machine Dreams” by Tom Simonite in MIT Technology Review. Both articles made me realise the massive innovative power that can be achieved when you create an environment where a bunch of really smart people are put together into well-structured teams with a common mission and, most importantly, strong, continuous financial and managerial support. This already happens sometimes; thinking locally, I see this a little in some of the Australian Research Council Centres of Excellence (ARC CoEs), although they are often too small, too focussed and sometimes too closed/narrow by competition and fund limits. Places like CERN and some of the medical institutes that sit outside universities are other examples.
Science is a different game now — once upon a time you could have an academic, a handful of students, and a bit of cash, let them have their own free ideas and get good to great things out. As science has become more ‘pointy’ you now need bigger teams working with more resources over longer timescales to achieve outcomes of real substance. What happens more frequently now is this: you have an academic and a handful of students rushed to graduate by low stipends and a university seeking cash for ‘on time completion’. The academic is flattened by a teaching load and writing a sea of proposals (mostly full of bureaucracy nowadays) — the students often struggle to get their supervisor’s time, let alone get them to spend half a day in the lab with them when it’s needed (often). Now add a <20% success rate of getting at best half the cash they need to do a proper job. The result is running scaled down ideas, often with the truly innovative bits shaved off, because they’re invariably the most resource intensive parts. The outcome is usually a bunch of undercooked heavily-hyped papers, put out because they must be to maintain competitive track records. The papers often report results that are irreproducible, if not because crucial details are omitted to maintain competitive advantage, then because they are flukey one-offs with low yield. There isn’t a lot of serious innovation in this, nor a lot of intellectual enjoyment — everyone is unhappy, fighting for survival rather than doing the brilliant, edgy science they actually dream of.
The problem is ultimately one of resources, not just money but also time. So… here’s a possible solution (one that would require some serious resolve to implement, admittedly, something few governments or organisations have these days…).
We take universities and we strip all of the research out of them, every bit of it. No university does a single scrap of research under this model — they are training organisations pure and simple. The ‘academics’ employed by a university are there for one purpose — to teach undergraduates and teach them really well. Their ‘core business’ is no longer conflicted by any thoughts of doing actual research. Undergraduates don’t need academics who do research; what they need is academics who teach well, and who have the time to help them learn the subject. More time than academics have in the current system, where they immediately race off after class, or worse, actively dodge students, so they can focus time on writing papers and grants to further something other than their teaching. As far as undergraduates are concerned, you could even say that having academics do research actively harms them because it robs them of the human interaction they need to learn technical subjects with depth.
Before you all start screaming ‘no research, but how could you!’ bear in mind I’m talking about the academics not the students here. Students doing research is an entirely separate issue — they can access research through internships at the research institutes discussed below. They can also see a little of it in well-designed undergraduate lab classes, where one can teach the approach, albeit without doing actual research (i.e., generating new knowledge, which rarely happens in undergraduate classes anyway, as the leading edge is too far beyond them and it takes too much time). This will mean you might want academics who’ve done research before joining the university; that’s fine, but they shouldn’t be trying to do two jobs at once.
Some will say ‘but how do we attract students?’ If you talk to undergrads, you’ll find they mostly aren’t attracted by research until they get a couple of years in and are indoctrinated by the system. It certainly doesn’t determine which university they choose to attend when leaving high school, beyond our circular convention that the best universities are the ones with the best research. Students choose to go to University X because it’s the best, and they know this from the marketing, not from an actual rigorous, informed personal assessment of the research (try talking to them about this, you’ll see exactly what I mean — there are only a few exceptions). If we ranked the universities instead purely on teaching quality, and built all the marketing around that, then they’d still behave exactly as they do now. But the nice outcome would be that they choose University X for the right reasons: because it’s the best at what they are going there to get from it — an education. Pulling research out of universities removes this confusion about core business.
We then take the research and we move it into research institutes that are entirely separate from the universities, even if they might be conveniently located (e.g., up the road, or across town). The folks working in institutes are all full-time researchers; if they teach, it’s the rare ‘guest lecture’ at the nearby university, perhaps one or two a year. The research institutes are proper research institutes, in the sense that they have a specific focus encompassing all expertise in that area for a given nation or state (i.e., they are true monopolies). They have a proper management structure from top to bottom, such that research directions are decided from above by people who know what is/isn’t going to be properly innovative and are then fully and properly resourced to achieve a proper outcome. The funding could be pure government (e.g., like an ARC CoE) or pure industry (e.g., Google/HP) or pure philanthropy (e.g., Victor Chang Institute) or some mix, but it needs to be at a level where researchers are resourced well and kept continually active — not like they often are now in academia, where they spend more time begging for meagre funds via massive proposals at atrocious success rates than doing anything else.
Employment at the institutes would be at 5 levels. The entry level is interns — these are masters students from nearby universities seeking experience during their studies, with payment by stipend as they are now. The largest ranks would comprise junior fellows and technical staff. Junior fellows are what we currently view as Ph.D. students (i.e., freshly graduated Masters students), but I would abolish the Ph.D. entirely; it is a historical artefact in this system — one that basically amounts to several years of indentured slave labour and payment of a small ransom to have access to jobs in the sector. This would mean paying these people a proper salary right from scratch, which is only fair given their vital role.
Technical staff are as we have now — less focussed on research itself, more focussed on doing the jobs that are essential to research being done efficiently. The second highest tier would be the senior fellows — they are the postdocs and junior academics of today. The top tier would be the scientific management — they are the professors of today’s system. All of these people are paid on a more continuous scale at a value that’s fair to their contribution and career progression — it seems outrageous to me that Ph.D. students get paid 1/7th of what a professor makes; paying them properly also means we can expect more of them in terms of capability, professionalism and output.
The junior fellows are on 5 year non-continuing contracts; everyone else is on a 10 year renewable contract with review at 5 years — no one in the organisation has a permanent position, including the management. Employment levels are set by management to be sustainable such that the organisation is optimally productive, with stipulations on working in teams, minimum resourcing, reallocation of people between projects/teams and management structure. There are also proper targets for workplace diversity and workload management — staff should work hard because they want to, when they feel they should, not as a prerequisite to survival within the system. This would be enabled by changing the way staff are assessed.
Only junior and senior fellows are assessed based on output and outcomes, but they are judged from a team productivity perspective. Comments like “blah isn’t first or last author enough, therefore they contribute nothing” should never be heard again — teams are never 90% two members and a load of passengers, these sorts of attitudes are absolutely destructive to good collaborative and collegial science. Note that since there is no longer a grant system available to these fellows — funding runs down through the management system much like in industry — career assessment is only really needed internally. This means that performance can be more properly judged and managed, and skills beyond ‘how many first author Nature/Science papers can you get for yourself?’ can be properly valued. It enables people to better survive career breaks, be they for professional reasons or family reasons.
Management are instead assessed by ‘360 degree feedback’, weighted, say, at 70% from junior and senior fellows (anonymous upward assessment) and 30% from peers and above. It focuses entirely on the extent to which a manager enables the teams below them, and the institute as a whole, to be maximally productive. For management, it is much more about creating a legacy at the junior levels and investing in the future than it is about feathering their own nests or driving their own output. Management are there to inspire, enable and encourage, rather than to slave-drive and claim credit to advance their own metrics.
This model would necessarily mean fewer people in research, but not necessarily reduced employment — there are now two separate systems needing staffing: one devoted to research and one to teaching. In both cases the employees no longer have a major conflict of interest — they either do teaching and do it well, or they do research and do it well. They are not trying to do both, ultimately doing each to less than the level they could, because there are only so many hours in a day and so much cash to go around. On both sides, you will have fewer burn-out victims, destroyed by working insane hours trying to do two jobs really well — choose one, do it properly. In both cases, if at the end of their contract they aren’t doing it well enough, then they go do something else (or move from one system to the other). If anything, there should be encouraged turnover, and perhaps even schemes to cleanly, fairly and properly ‘manage’ people out of the system into other careers, e.g., politics, public sector, etc., where intelligent people who can reason well are sorely needed.
This model should also mean less time wasted on intense competition for dwindling resources — as “Competition is for losers” by Peter Thiel in the Wall Street Journal argues, we deliberately create national/state research monopolies that are resourced to a level where they can properly go after innovative ideas. This is not to say competition is eliminated entirely — it can go on as a contest of ideas inside the institute — but clear decisions on resourcing are then made. This enables innovation directions to be ‘shaken down’ efficiently, with the best ones properly resourced so they result in proper outcomes, not underbaked outcomes due to resource starvation.
Finally, it means that universities are properly competing on their actual core business, which is teaching the next generation advanced ideas, not some other core business, i.e., research, that happens to be entangled into the same institution by history. I can see how a university needed academics to do both teaching and research in the 1700s, 1800s, and even early to mid 1900s, but in the modern era it’s entirely unnecessary. The leading edge of research is generally so far beyond undergraduate studies that it’s no longer essential to have research and teaching in the same place.
In the end, what many academics are really tired of is trying to do two jobs, neither of which they can ever feel they’re doing to the fullest of their ability. Focus entirely on teaching in a university and you are a pariah; your ability to rise through the ranks is heavily compromised by this crazy conflict where university rankings are fuelled by research output over teaching quality. Try to do research, often with limited time and limited resources, and it is a nightmare to retain a competitive edge when improperly and inequitably resourced and hit with teaching loads that often vary between individuals (and are sometimes used punitively).
If we really want true innovation, we need to do it properly, and perhaps separating these two competing core businesses is the best way… some of you will note that structures like this already exist in many places. Sure. But I often wonder now if it’s how it should be everywhere…