academia, academic writing, Coronavirus, Higher Education, proofreading

Covid-19: one year on

It’s been a little over a year since the Covid-19 pandemic made its entrance on European soil. First stop: Italy. I still remember the day the first Italian victim was announced, one Friday in late February. I had just picked up my C2 English Proficiency certificate that morning, and as I started preparing my lunch I turned on the news, which was all about Covid.

Several lockdowns and millions of face masks later, it now seems the worst is over. Perhaps we are soon to be “reborn with a flower”, as the Italian anti-Covid vaccination programme is called. But the consequences, I fear, will be felt for many years. It’s not just lives that have been lost: jobs, businesses, (movie) theatres, archives, trade shows, festivals, street markets…everything has been affected. Our habits have changed, too, though hopefully not for good.

Although academia was hit as hard as other sectors, especially in the Humanities, as a proofreader/translator I haven’t suffered the consequences too much. On the contrary, several returning clients and word of mouth have kept me busy for most of the past 15 months. In fact, this must have been the busiest year of my post-academic career so far!

This has also meant not being able to write new blog posts. But, now that I’ve had a chance to take a (much needed) rest, I plan to make up for this absence, starting with a new post coming up soon, on gendered language in academic writing: is gender neutrality really the key?

In the meantime, if you’ve missed any of my older posts, please check out the archive below.

Stay safe and stay tuned!

Budding poppy (photo by Andrea Hajek)
academia, Higher Education, mental health, PhD, research

Breaking out of the cocoon that is called a PhD

New round of posts about the “fabulous four” core activities in academia: writing, teaching, research, and dissemination. Episode 2: research.

Breaking out of the cocoon that is called a PhD

The topic of mental health in higher education is usually debated in relation to the rising number of mental health incidents among students, or the growing stress levels experienced by staff.

The Guardian, for example, recently revealed how “British universities are experiencing a surge in student anxiety, mental breakdowns and depression”. The topic gained media attention after a series of suicides affected one university in particular, between 2016 and 2018, although it appears to be a more widespread phenomenon.

Similarly, workloads have become unmanageable for many academics, who have to juggle teaching, marking, course development, board meetings, examining theses, producing 4* publications, grant income, and so on. In fact, staff are increasingly falling back on counselling and occupational health services in universities, as a recent HEPI report by Dr Liz Morrish, entitled Pressure Vessels: The epidemic of poor mental health among higher education staff, has shown.

Who tends to be overlooked in this context is the PhD student. Embarking on a PhD project is exciting: you’re at the start of a – hopefully great – career, maybe you’ve won a PhD studentship, or your university is highly ranked. BUT, it can also be really daunting: you need to produce something good now, become an expert in your field, and you have to do it all by yourself.

In sum, doing a PhD can be empowering, but it can also shred your self-confidence. Competitive fellow students. Insensitive dinosaur-professors slagging you off at conferences. Pressure being put on you to finish your thesis on time. Feeling left out when you can’t join your peers because you’re having to work on the side.

Sure, you have a supervisor, but they won’t be holding your hand or chasing you up. You have to learn how to manage your time, develop research skills, and write academically.

Now, it’s OK to make mistakes. As I replied to one of my proofreading clients when she apologised for her “crappy” writing: it’s a PhD student’s prerogative to write in a crappy way. For the record, she got the highest score for quality of written English, and she didn’t only have my proofreading skills to thank for it.

Nevertheless, a PhD puts a lot of pressure on people. It can be daunting.

Doing research – be it in a public archive or at home – and writing a thesis is generally a solitary routine, which can be thoroughly demotivating, especially if you don’t have an office to go to.

I used to migrate between home, library, and computer rooms (regularly leaving behind my pen drive), and I remember how it made me feel lost. I was the only new PhD student in my year, and those who had started their PhDs in previous years were all teaching, or doing it part time. I hardly ever saw them, and we never organised anything as a group.

This is another issue: PhD isolation also prevents you from discussing your research with other people, and as a result, it’s harder to grow as a scholar. Talking about your project, what it’s about and why it’s so interesting, can help you clarify your ideas, maybe even open your eyes to things you’d taken for granted, or that had escaped your attention.

So if you risk being isolated during your PhD, here are five practical tips on how to break out of that cocoon:

  1. Groups. See if there’s a postgraduate community in your department or school, and if there isn’t, why not start one? It can be as simple as having lunch together or a coffee break in the common room. And if you’re a “PhD orphan”, have a look outside your department as well.
  2. Seminars. Your department or school is bound to be running postgraduate seminars, which are nice and informal. Also try to attend some postgraduate conferences in your discipline (some of which award travel grants for unfunded students). Avoid major conferences, though, at least in the first two years of your PhD, unless you have a smashing conference paper that you know will make you a superstar overnight.
  3. Training. Most universities offer a postgraduate training programme, which is free to attend and usually comes with coffee and biscuits. It’s a great way to meet new people and to take a break from your research/writing routine.
  4. Study areas. Another good way to break out of your cocoon is to do your research in a space where you’re bound to meet your peers. This is particularly relevant if you work from home, where it’s easy to get demotivated or distracted. Instead, identify a public space where you work well but can also talk to people. It gives your day structure, and it will get you out of your pyjamas!
  5. Social media. I didn’t start using Twitter until fairly recently, but I see a lot of PhD students and ECRs talking to each other there, sharing feelings of anxiety and frustration, but also supporting each other (for example at #PhDchat). I guess it’s not as nice as chatting with people over tea and biscuits, but it’s a fair alternative. And it’s always good to get a rant out of your system.

Now all this might not be your thing. Maybe you’re perfectly happy working from home, and your weekly yoga class is all you need to satisfy your socialising needs. Maybe your PhD community includes someone like Sheldon from The Big Bang Theory, and you’d rather have a quiet lunch on a bench outside, taking advantage of a beautiful autumn day.

That’s fine. As long as you develop a routine that you’re comfortable with.

***

Student staring out the window at Utrecht University Library (photo by Andrea Hajek)

***

Previous post about doing academic research:

It’s that time of REF again!

academia, academic writing, proofreading, publishing, style guide

Commas save lives

New round of posts about the “fabulous four” core activities in academia: writing, teaching, research, and dissemination. Episode 1: writing.

Commas save lives 

I recently discovered that 24 September is National Punctuation Day. Well, in the States it is. Yes, people actually celebrate punctuation! A certain Jeff Rubin launched it, and even designed a website, which is all about punctuation: rules, gadgets, games.

I guess fetishes come in all shapes and sizes.

Right, I’m taking the mickey out of poor Jeff. Actually, punctuation is quite important. If you look up #PunctuationDay on Twitter, you get a string of Tweets featuring tons of examples of punctuation gone wrong.

The most popular seem to be along the lines of “Let’s eat grandpa” or “Let’s eat grandma”. Not sure why grandparents are such popular objects in these examples, but it does prove a point: commas save lives.

In my job as a proofreader, some of the errors I most regularly encounter involve punctuation. In this blog post I want to have a closer look at commas. As easy as they may seem, they are among the most common sources of punctuation errors.

So here are my top 3 tips on how to use commas correctly:

  1. Commas are mainly used when two independent clauses are joined by words such as “and”, “or”, and “but”:

I did the exam, and I went down to the pub.

Only leave out the comma when the subject is omitted before the second verb (“went”):

I did the exam and went down to the pub.

What’s important to remember is that commas can’t join clauses by themselves, as in this sentence: I did the exam, I went down to the pub. Here it’s best to just add a conjunction (“and”). In other cases you might need to do more, like splitting the sentence up with a period or a semicolon.

  2. Commas are also used a lot to separate items in a series of three or more:

I had wine, cheese, and crackers.

Note that I’ve added a comma before “and”. This is called an Oxford (or serial) comma, and it’s used a lot in the US – less so in Britain. It serves to avoid ambiguity, especially if the list already contains conjunctions. For example, in this sentence “and” is used twice: I had wine, cheese and crackers and strawberries.

As a result, it’s not clear whether cheese, crackers, and strawberries represented one dish, or were eaten separately. You wouldn’t eat a cracker with cheese and a strawberry on top, would you? If we add a comma after “crackers”, though, the situation is clearer:

I had wine, cheese and crackers, and strawberries.

Ah, that brings back so many good memories of conference drinks…

  3. A third error I often come across is when a nonessential clause is NOT set off from the main sentence. A nonessential clause contains information that you can leave out of the sentence without changing its overall meaning:

Libraries, which are full of dusty bookshelves, aren’t my cup of tea.

If we were to remove “which are full of dusty bookshelves”, the meaning of the sentence as a whole – namely that you don’t like libraries – doesn’t change. It’s not essential information, so you would use a comma to set it off from the rest of the sentence.

Things change when you’re dealing with an essential clause, also called a restrictive clause (because it restricts the noun):

I don’t like libraries that look like coffee lounges.

The restrictive clause “that look like coffee lounges” says you don’t like a certain type of library, not that you don’t like all libraries. You do, just not this kind of library! In other words, the restrictive clause gives relevant information, and without it the meaning of the overall sentence changes. It’s integral to the sentence, so it can’t be set off by commas.

So you see, commas should never be underestimated, and must be used wisely and responsibly. Not just for the sake of poor old grandma.

***

For more detailed explanations, examples and tests, check out these online resources:

Blue Book of Grammar

University of Bristol grammar tutorial (followed by quiz)

Punctuation slide show (by William E. Sledzik)

Dusty bookshelves at Utrecht University Library (photo by Andrea Hajek)

***

Previous post about academic writing:

The truth about notes

academia, Higher Education, student feedback, teaching

Lest we drown in student feedback (episode 4 of the Fabulous Four)

Fourth and last (for now) in a series of posts, all drawn from my own – often painful – experience of the academic world, about the “fabulous four” core activities in academia: writing, teaching, research and dissemination (in random order). Episode 4: teaching.

Lest we drown in student feedback

(warning: this blog post contains a rant)

Not long ago I stumbled upon a heated discussion on Twitter, regarding the University of Leeds’ plans to implement an anonymous student feedback scheme. I couldn’t access the article reporting this fact (couldn’t be bothered to pay the subscription fees, sorry), but apparently module leaders would need to respond to the anonymous student posts on the discussion board within five working days. Sure, why not just install self-flagellation poles in every faculty building. Mea culpa, mea culpa, mea culpa.

In the debate, people particularly expressed their concern about teaching staff being confronted with anonymous attacks upon their gender, ethnicity, class, physical appearance and clothing choices, even their teaching style.

They’re not exaggerating. Research has shown that the ethnicity of teaching staff can affect the outcomes of student feedback, which in the UK is gathered mainly through the National Student Survey (NSS). I have also heard accounts of students complaining about too much feminism, for example. I love this reply from the author of an Academics Anonymous article on the topic:

The course content reflects academic research and theory on the subject and is not up for discussion. My response to the […] feedback was to include more content on feminism, not less.

Student feedback is not always useful. Given that participation is voluntary, response rates tend to be low, and when it is made compulsory things get worse: students are asked to comment on a course they might not have attended, or on a topic they simply didn’t like, leading to useless comments like the one cited above. Let’s face it: students don’t necessarily know how to assess teaching. We might just as well introduce HappyOrNot smiley terminals across campus, like those at airports or in public toilets.

In my own experience, the whole student feedback business seems to be mostly a matter of ticking boxes and scoring high in the NSS. In fact, one university department was even caught instructing students to falsify their approval ratings, telling them that nobody would want to employ them if the responses were negative.

Yep, once more it all comes down to rankings, numbers, assessments. What is it about universities always wanting to score highest, or be the best? There’s even such a thing as the Top 20 UK University Campus Awards, which in my opinion have been more beneficial to the building, furniture and gardening industries than to students and staff.

Let me tell you just this one anecdote, about a criticism I received of a grammar revision class I ran in the last year of a 3-year postdoctoral fellowship. The purpose of the course was to revise grammar topics the students (in their second year) had studied the previous year. They had to buy a textbook, from which we selected topics and exercises, and then I added some exercises of my own, occasionally also using songs and other unconventional sources – y’know, to make it all a bit less boring.

So basically I would refresh their memory by briefly explaining the topic, using the whiteboard, and then we would do exercises. If the topic covered multiple lessons I would do a brief recap. As they say, repetita iuvant. Additionally, I told students where to find the grammar rules explained in the textbook, and I had published a schedule on Moodle, the university’s online learning platform, with the weekly topics listed. In sum, anyone who missed class could stay on track.

In theory.

In practice, they didn’t. Quite a lot of students, in fact, skipped class every other week or so, especially the one at 9am. Maybe they thought attendance was optional, or believed they were doing an online, self-study degree?

What’s worse is that most of the students didn’t buy the textbook. The result was me regularly facing half a dozen lost souls staring at me as if I were the Dalai Lama doing a tap dance, often stubbornly justifying their failure to do any exercises by the fact that they had missed class. Poor lambs. As if that was my responsibility.

But see, that’s precisely the problem: I was responsible for making the course material available to them, but in the way THEY wanted it. Now I realize I’m only seeing this from one angle, and who knows what kind of study or workload the students might have been struggling with, but when I was an undergrad I always tried to catch up if I missed class, copying notes from fellow students, or at the very least doing the required reading. Is that so hard?

It felt as if I had to deliver a product, and if it didn’t reach the student-consumer, even through their own fault, I was nevertheless to blame.

What really blew my mind, though, was the student feedback: I hadn’t uploaded Powerpoints to Moodle. POWERPOINTS. So they could catch up when missing a class.

Now I know my last blog post was all in favour of PowerPoint, and I maintain that stance, but obviously it depends on the context. I actually used PPT a lot in my classes, just not in this one. After all, it was a revision class, with the grammar explained in the textbook and then briefly illustrated in class: why should I copy and paste all the rules into a PPT presentation? All I did was give concrete examples of the grammar rules on the whiteboard. Had I used a PPT instead, the students would undoubtedly have complained that the slides were too concise.

Anyway, in my response to the feedback I explained my teaching method for this specific course, and pointed out that many students were playing hooky, a fact that was backed up by the attendance forms. Surely this would be a point of concern for the College, much more so than my not using Powerpoint?

And yet, my line manager – actually a staunch supporter of traditional, non-visual teaching methods – completely disappointed me when I read his proposed solutions to the criticism raised, something he had to communicate to the College as part of the whole student feedback process. He probably didn’t think I would read the report, but I did.

He literally wrote that next year there would be a new teacher. As if the problem was the teacher, not the students skipping class à gogo.

Obviously the issue here is not the line manager. Lord knows how many of these reports he has to send off to his superiors every year. It does show, though, how ineffective student feedback can be, and the harsh impact it can have on staff. But let’s not throw the baby out with the bathwater. Here’s my recipe for a more humane approach to student learning experiences:

  • First, course convenors need to make it absolutely clear to students what is expected of them (also in terms of attendance), and what they should expect from the course.
  • Second, students need to be given proper instructions on how to assess teaching – a simple “be as objective as possible” sentence at the top of the survey will not suffice. And I would add, also tell them not to slag off teachers.

Most of all, I think we ought to do away with the anonymity culture behind student surveys. Maybe get them to share their experiences in a group chat run by an intermediary, who will liaise (so to speak) between students and teachers, making the former understand their responsibilities and filtering out potentially unfounded criticism when reporting back to the latter.

Wouldn’t that be a lot nicer?

 

academia, conferences, dissemination, Higher Education

What’s the point of PowerPoint? (episode 3 of the Fabulous Four)

Third in a series of posts, all drawn from my own – often painful – experience of the academic world, about the “fabulous four” core activities in academia: writing, teaching, research and dissemination (in random order). Episode 3: dissemination.

What’s the point of PowerPoint?

As the academic year draws to an end, conference-goers get on their way. PowerPoint (PPT) has, by now, become an almost indispensable visual aid at conferences and in the classroom, certainly in the UK. It is gaining ever more momentum, even if some are highly critical of it. But, as one person commented in a recent Twitter debate on the topic, “PPT snobbery is just bullshit for people who like to pretend they’re doing a Ted Talk.”

PPT is particularly frowned upon in Italy, I have found, perhaps because of a certain distrust of technological forms of communication? Or maybe the rhetorical tradition – in its original, oral form – is more rooted in the Italian academic context? That said, not all Italians master the art of rhetorical speaking. I vividly recall a conference of the Society for Italian Studies, where an Italian bloke totally missed the mark in terms of presenting. I don’t recall the topic of his talk (which shows just how terrible his presentation skills were), but I do remember how tedious, almost tormenting, it was to sit through his presentation. Yes, “sitting through” is the right description, and I’m not just talking about the audience! In fact, the speaker basically sat behind a table, sliding down the chair as if he were watching telly on a lazy Friday night, his shoulders slumping to one side as the corresponding arm rested on the table edge, one hand barely holding up the paper while the other hid in his pocket. He read the whole paper without bothering to look up at the room, obviously going way too fast. I can’t recall how good his English was, but if you add bad pronunciation to the mix, well, you have a worst-case scenario.

Apart from being truly indispensable in certain disciplines, such as art history or film studies, PPT can be a really good visual aid, both for the not-so-confident/skilled public speaker and for the audience. Provided it is considered just that: an AID.

In fact, a lot of people make excessive or bad use of PPT, which results in equally ineffective presentations. This includes established scholars. I once attended a keynote lecture where the speaker seemed to have copied and pasted his talk, or a large part of it, into a PPT presentation: his slides were packed with text, text and still more text, one slide after another…impossible to read while also trying to listen.

So, if you’re guilty of the above or any other misuse of PPT (e.g., flashing colours, flickering lights), or if you’re one of those people who speed through their slides as if they’re worried they’ll miss their flight, here are five tips on how to deliver a decent, and effective, PPT presentation:

  1. Don’t stand in front of the slides, but on the side: if you’re right-handed stand left and vice versa.
  2. Don’t overcrowd. Forty words or so is enough for one slide.
  3. Don’t prepare too many slides: my advice is 10 to 12 slides for a 20-minute talk.
  4. Choose your background wisely: Microsoft offers a lot of PPT templates, but they’re not always appropriate. See if your university has its own PPT template – they always look smart! Also avoid dark backgrounds: dark writing on a light background works best.
  5. Limit, or indeed avoid, excessive clip art and animation features. If you do need to make several elements appear on the same slide at different moments, practise this in advance, marking the points at which each element is to appear.

By way of example, have a look at one of my own presentations, on 1968 and the Italian right (ASMI annual conference 2008): Hajek PPT 1968. Note how I’ve added sources when using images from the Internet – well, except for the iconic photo of the French “Marianne” (by Jean-Pierre Rey). It must have escaped my attention! Also, looking at it now, I would have added a reference – on the first slide – to the name of the association organising the conference, and a final slide containing my email address and social media handles.

To conclude, only use PPT if it makes sense or if you think it will help you. But it’s not a must. Different things work for different people: some are great public speakers, others less so. The point I want to make is that PPT can contribute to inclusivity by giving inexperienced or nervous speakers confidence, or simply a visual aid to help them deliver a clear and structured presentation. That also goes for the audience, especially those who aren’t native English speakers, as it helps them follow the narrative and catch up if they miss a word.

In sum, if done correctly PowerPoint is really helpful, and audiences will thank you. I certainly would.

Note: other than building on my own conference experience, for some of the tips described above I have drawn inspiration from the “Presenting to an Academic Audience” course led by Dr Steve Hutchinson, from Hutchinson Training & Development Ltd. 

academia, Higher Education, job market, research

It’s that time of REF again! (episode 2 of the Fabulous Four)

Second in a series of posts, all drawn from my own – often painful – experience of the academic world, about the “fabulous four” core activities in academia: writing, teaching, research and dissemination (in random order). Episode 2: research.

It’s that time of REF again!

REF.

Research Excellence Framework.

If you study or work in a UK higher education context then you are bound to have heard of it. And if you haven’t, you will.

In short, the Research Excellence Framework is a national assessment of the quality of UK higher education research. Last undertaken in 2014, the next REF will take place in 2021. Expert panels will assess three elements for each submission: research outputs, impact and environment. These elements will form the overall quality profile of an institution. Each eligible member of staff has to submit 1 to 5 research outputs, which will mostly be publications.

I contributed to the REF of 2014, when I had only just been awarded a three-year postdoctoral fellowship at a Scottish university. My first monograph was being prepared for publication, while a number of journal articles and a special issue had been accepted or recently published. I remember the Head of Department – who was collecting data for the REF submission – sounding awfully pleased as I sent him details of all these publications. Too bad my “impressive” contribution to the School’s pool of outputs didn’t leave a lasting mark on him: in a School newsletter published three years later, he forgot to mention me in a list of staff members who had left. I guess that’s the REF for you.

The REF is, in fact, quite an opportunistic business, as was its predecessor: the RAE (Research Assessment Exercise). Back in 2007 I approached a British university for a doctoral project – I had recently finished an MA and was temporarily selling clothes at the weekly market of Bologna (Italy), where I had participated in an Erasmus exchange programme. Having failed to win a fellowship competition at my home university in the Netherlands, and without much hope of getting into the Italian academic world, I decided to seek my fortune in the UK. Unfortunately, it was too late to apply for any funding schemes, but I was offered a fee waiver, provided that I started my PhD in July rather than September 2007. As I didn’t see myself selling clothes at the market for much longer, I accepted, even a bit flattered that they had offered me a fee waiver, thinking that my project must have sounded really interesting to them. Oh the naivety! It seems, in fact, that the offer of a fee waiver and earlier start date allowed the Department to count me among its PhD cohort for the upcoming RAE, a strategy other departments had apparently also applied, as I discovered later on when talking to other PhD students.

So you see, it all comes down to numbers, assessments, rankings. Some institutions don’t even bother to cover this up: recently, a job vacancy was posted in which a university explicitly wrote that it was seeking to recruit a postdoctoral researcher to help deliver high-quality outputs for REF 2021, in particular for impact case studies. Given that the position was fixed-term (one year) and part-time, and that self-motivation was among the required skills, we may deduce that the university was looking for someone to help organise exhibitions or write up reports on behalf of overworked staff members…And then to have the courage to write that the successful applicant could “progress” to a higher rate – on a one-year contract?!

It’s not unusual. I constantly come across one-year job postings, and they all sound pretty much the same. I understand precarious researchers feel their heart leap with joy whenever such a position opens within their discipline, or maybe at the same university where they are completing a PhD or a postdoc, but it’s usually a trap. Of course there is a chance that, once you get a foot in the door, you may eventually obtain a permanent position. And it does occasionally happen. Most of the time, though, it doesn’t.

I recall a temporary teaching post opening up at the university where I was completing my postdoc. I decided not to apply, because I knew all my time would go into teaching, marking and admin, with no time left to do any serious research or publish articles (which counts A LOT when applying for lecturer jobs). All this for not even a year’s contract, as the job would only cover the teaching and marking period (September-May). How to keep paying your rent and bills during the summer recess apparently didn’t interest the School, a clear indicator that it had no intention whatsoever of extending the position beyond the contract itself. In the end, the School did actually create a permanent lecturer position the following year, but apparently the person who got the one-year teaching job – and who obviously applied for the lecturer job – wasn’t hired for it. Instead, they offered her maternity cover: better than nothing, but hardly what she had bargained for.

In sum, as the REF 2021 deadline approaches, many HE institutions will be recruiting short-term staff members to help stack their REF submissions. My advice is to refrain, if you can afford it. Early career researchers deserve respect and support, as well as long-term perspectives: job- and research-wise. You are not numbers or boxes to tick, but qualified scholars who need some level of stability and security to do their job. As important as even a nine-month job might seem for your bank account or CV, pouring energy and academic capital into it only to find yourself applying for new jobs – a really time-consuming part of academic life – within less than a year, well, it’s not really worth it, is it?

That said, the REF also puts much pressure on those in a permanent job, and there is a steadily rising wave of UK academics leaving permanent jobs (as this Twitter thread demonstrates). This is obviously not a result of the REF alone, but granted, it doesn’t help either. Maybe universities themselves aren’t even to blame entirely.

So what can we do about this? I would say: be more selective when applying for jobs, and most of all, believe in yourself and in your skills. You’re not a number but a person, with academic capital, and if universities want that, they need to give something back.

academia, academic writing, proofreading, publishing, style guide

The truth about notes (episode 1 of the Fabulous Four)

First in a series of posts, all drawn from my own – often painful – experience of the academic world, about the “fabulous four” core activities in academia: writing, teaching, research and dissemination (in random order). Episode 1: writing.

The truth about notes 

I recently got into a discussion about punctuation rules on Twitter. Yes, people do actually debate these things, including on social media, even if many don’t have an open mind on the matter, or so I found.

In retrospect, I think the question (accompanied by a short survey) that was meant to spark the debate wasn’t a very useful one: “do the footnote numbers go before or after the period?” One could also tick a third box, “it depends”, and since we’re talking about punctuation a fourth box was added, about using the Oxford comma – completely irrelevant, really, to the question of where notes should be placed in the text.

In my job as a proofreader, I work almost exclusively with Italian native speakers, and one of the most common errors I come across when proofreading their works is the placement of notes before punctuation marks. I know this is common in Italian academia, but I’ve honestly never seen it in any English-language publications in my discipline.

And I have a good piece of literature to back me up: New Hart’s Rules: The Oxford Style Guide (Oxford University Press, 2nd edition, 2014), endorsed by the Society for Editors and Proofreaders (and highly recommended when I took my first proofreading course). It is described as the essential desk guide for all writers and editors. So what does it say on the positioning of notes? I’m sure you’re dying to know.

Here it is:

The reader is referred to a footnote or endnote by a cue in the text. This normally takes the form of a superior Arabic number. The cue is placed after any punctuation (normally after the closing point of a sentence). If, however, it relates only to text within parentheses it is placed before the closing parenthesis. (pp. 332–333)

Granted, my initial comment in this thread – where I replied to an Italian academic talking about different approaches being used in different languages (something I agreed with) – must have come across the wrong way, and perhaps too strongly, judging from the defensive tone she subsequently took. In reality I only meant to vent my frustration at having to fix footnotes all too often when proofreading English texts written by Italian academics – trust me, moving misplaced notes is absurdly boring, and a real drag when using track changes, as these will upset the numbering of the whole note system.

Apparently, though, different practices – and mindsets – exist. So, in spite of the survey resulting in a majority (45%) voting for footnotes after the period (against 32% voting before), I read several confused or opinionated replies to the thread, including a few likes and comments aimed at proving me wrong. I even got a sort of mansplaining from a couple of dudes who apparently couldn’t be bothered to produce any constructive, or even vaguely intelligent, criticism. Instead, they dismissed my reference to the poor Oxford Style Guide as “booooooring” (wrong spelling, I replied), and pointed to a typo introduced by my phone’s autocorrect (being set to Italian rather than English). Wow, really put me on the spot there, big man!

Clearly, there is no golden rule, even when you’re writing in the same language. Whatever “God” or the Polish football manager wishes to believe. UK spelling differs from US spelling, and punctuation, too, follows its own rules depending on geography and disciplinary differences. And on the style guides used in those disciplines. Which is why a proofreader can be of great help.

Period.