Just when you think it’s over

August 2, 2021 2 comments

“The past is never dead. It’s not even past,” memorably wrote William Faulkner. He was right. You really have to give the past some credit — it’s everlasting and all-consuming. Just when you think it’s all buried, it keeps coming back like a plague, in the most disturbing way.

The story here is about antisemitism in academia. These days, in my professional life as a mathematician, I rarely get to think about it. As it happens, I’ve written about antisemitic practices in academia and what happened to me on this blog before, and I didn’t plan to revisit the issue. After thirty years of not having to deal with that I was ready to let it go… Until today. But let me start slowly.

The symbolism

In American universities, antisemitism was a widespread practice for decades, one that went out of fashion along with the slide rule and the French curve. This is extremely well documented. The world at large can be going crazy wild in its Jew-hatred, but within the confines of a good US university, what do I care, right?

The symbolism is still there, of course. If you squint a little you see it all over the place. Like a long-abandoned tombstone in the town center that everyone averts their eyes from when passing by, it is a visual reminder of the past nobody wants to think about. Think of the mass murderer Vladimir Lenin, very prominently featured in Red Square and still lauded all over. Or an even greater mass murderer, Joseph Stalin, who still has some streets named after him, some statues still standing in front of a museum at his birthplace in Gori, Georgia, and who is buried just a few meters behind Lenin. Thousands of tourists pass by these symbols. Everyone’s happy. Same with past antisemitism — nobody cares…

The news has come to Harvard

When it comes to antisemitic symbolism in academia, it’s worth mentioning Harvard University, which stands tall in its obliviousness. For example, the rather beautiful Lowell House is named after Harvard President Lawrence Lowell, who was famous for instituting Jewish quotas. In 2019 the issue was brought up much too often to be ignored. In its infinite wisdom, Harvard addressed it by keeping the name but taking down Lowell’s portrait in the dining room. Really! How evenhanded of them — Jews can now feel welcome, totally safe and protected… Not that Harvard learned much of anything from this sordid episode, but that’s to be expected I suppose. After all, Harvard never apologized.

Or take the Birkhoff Library at the Harvard Math Department (where I got my Ph.D.), which is named after George Birkhoff, well known for his antisemitic rhetoric and hiring practices, and whom Albert Einstein called “one of the world’s great anti-Semites.” If you don’t know what I am talking about, read Steve Nadis and S.-T. Yau’s book, which is surprisingly honest on the matter.

Of course, some things are too much even for Harvard. James Conant was a Harvard President who followed Lowell both as president and in his love of Jewish quotas. He is also famous for being a Nazi sympathizer. Although still occasionally honored by Harvard (see the named professorship there), apparently this is a source of embarrassment best erased from history and not discussed in polite company. Other educational institutions are much less skittish, of course. Wikipedia helpfully points to Conant Elementary in Michigan and Conant High School in Illinois. I guess these places are OK with Conant’s legacy.

And now this

Consider the present-day case of Yaroslav Shitov, which was pointed out to me last week. Shitov is a prolific mathematician lauded by Gil Kalai, by Numberphile, by the AMS News blog, and on the pages of Quanta Magazine for his recent work. It turns out he is a rabid antisemite (among other things). The screenshots below (in Russian), taken from his social media account, are so odious that I refuse to translate them, so as not to give them more credence. In fact, if you can’t read Russian, you are better off — even reading this dreck makes you feel dirty.

I don’t have much to say about this person. I never met him and have no insight into where this filth is coming from (not that I care). I do have a suggestion on what to do, and it’s called shunning by the math community. Please ignore this person as much as possible! Never invite him to give talks at seminars or conferences. Refuse to referee his papers. If you are an editor, return his submissions without handling them. Don’t speak to him or shake his hand. If he is in the audience, refuse to give your talk until he leaves. If you must cite his papers, do that without mentioning his name in the main body of the paper. He represents the ugly past that is best kept in the past…

The problem with combinatorics textbooks

July 3, 2021 Leave a comment

Every now and then I think about writing a graduate textbook in Combinatorics, based on some topics courses I have taught. I scan my extensive lecture notes, think about how much time it would take, and whether there is even a demand for this kind of effort. Five minutes later I remember that YOLO, exhale deeply, and put the idea out of my mind for a while.

What’s wrong with Combinatorics?

To illustrate the difficulty, let me begin with two quotes which contradict each other in the most illuminating way. First, from the Foreword by Richard Stanley to (his former student) Miklós Bóna’s “A Walk Through Combinatorics” textbook:

The subject of combinatorics is so vast that the author of a textbook faces a difficult decision as to what topics to include. There is no more-or-less canonical corpus as in such other subjects as number theory and complex variable theory. [here]

Second, from the Preface by Kyle Petersen (Stanley’s academic descendant) to his elegant “Inquiry-Based Enumerative Combinatorics” textbook:

Combinatorics is a very broad subject, so the difficulty in writing about the subject is not what to include, but rather what to exclude. Which hundred problems should we choose? [here]

Now that this is all clear, you can probably insert your own joke about the importance of teaching inclusion-exclusion. But I think the issue is a bit deeper than that.

I was thinking about this while updating my “What is Combinatorics” quotation page (see also my old blog post on this). You can see a complete divergence of points of view on how to answer this question. Some make the definition or description very broad (sometimes even ridiculously broad), some relatively narrow, some are overly positive, while others are revoltingly negative. And some basically give up and say, in effect, “it is what it is”. This may seem puzzling, but if you concentrate on the narrow definitions and ignore the rest, a picture emerges.

Clearly, these people are not talking about the same area. They are talking about sub-areas of Combinatorics that they know well, that they happened to learn or work on, and that they happen to like or dislike. Somebody made a choice about what part of Combinatorics to teach them. They made a choice about what further parts of Combinatorics to learn. These choices are increasingly country- or culture-dependent, and become formative in people’s minds. And they project their views of these parts of Combinatorics onto the whole field.

So my point is — there is no right answer to “What is Combinatorics?“, in the sense that all these opinions are biased to some degree by personal education and experience. Combinatorics is just too broad a category to describe. It’s a bit like asking “what is good food?” — the answers would be either broad and bland, or interesting but very culture-specific.

Courses and textbooks

How should one resolve the issue raised above? I think the answer is simple. Stop claiming that Combinatorics, or worse, Discrete Mathematics, is one subject. That’s not true and hasn’t been true for a while. I talked about this in my “Unity of Combinatorics” book review. Combinatorics comprises many sub-areas, see the Wikipedia article I discussed here (long ago). Just accept it.

As a consequence, you should never teach a “Combinatorics” course. Never! Especially to graduate students, but to undergraduates as well. Teach courses in any and all of these subjects: Enumerative Combinatorics, Graph Theory, Probabilistic Combinatorics, Discrete Geometry, Algebraic Combinatorics, Arithmetic Combinatorics, etc. Whether in introductory or advanced versions, there is plenty of material for each such course.

Stop using these broad “a little bit about everything” combinatorics textbooks, which also tend to be bulky, expensive and shallow. It just doesn’t make sense to teach both the five color theorem and the Catalan numbers (see also here) in the same course. In fact, this is a disservice to both the students and the area. Different students want to know about different aspects of Combinatorics. Thus, if you are teaching the same numbered undergraduate course every semester, you can just split it into two or three, and fix different syllabi in advance. The students will sort themselves out and choose the courses they are most interested in.

My own teaching

At UCLA, with the help of the Department, we split one Combinatorics course into two, titled “Graph Theory” and “Enumerative Combinatorics”. They are broader, in fact, than the titles suggest — see Math 180 and Math 184 here. The former turned out to be quite a bit more popular among applied math and non-math majors, especially those interested in CS, engineering, data science, etc., but also those from the social sciences. Math majors tend to know a lot of this material and flock to the latter course. I am not saying you should do the same — this is just an example of what can be done.

I remember going through a long list of undergraduate combinatorics textbooks a few years ago, and finding surprisingly little choice for the enumerative/algebraic courses. Of the ones I liked, let me single out Bóna’s “Introduction to Enumerative and Analytic Combinatorics” and Stanley’s “Algebraic Combinatorics”. We now use both at UCLA. There are also many good Graph Theory course textbooks of all levels, of course.

Similarly, for graduate courses, make sure you keep the subject relatively narrow and clearly defined. Like a topics class, except accessible to beginning graduate students. A low entry barrier is an advantage Combinatorics has over other areas, so use it. To give examples from my own teaching, see the unedited notes from my graduate courses:

Combinatorics of posets (Fall 2020)

Combinatorics and Probability on groups (Spring 2020)

Algebraic Combinatorics (Winter 2019)

Discrete and Polyhedral Geometry (Fall 2018) This is based on my book. See also videos of selected topics (in Russian).

Combinatorics of Integer Sequences (Fall 2016)

Combinatorics of Words (Fall 2014)

Tilings (Winter 2013, lecture-by-lecture refs only)

In summary

In my experience, the more specific you make the combinatorics course, the more interesting it is to the students. Don’t be afraid that the course will appear too narrow or too advanced. That’s a stigma from the past. Create a good course and the students will quickly figure it out. They do have their own FB and other chat groups, and spread the news much faster than you could imagine…

Unfortunately, there is often no good textbook to cover what you want. So you might have to work a little harder to scout the material from papers, monographs, etc. In the internet era this is easier than ever. In fact, many extensive lecture notes are already available on the web. Eventually, all the appropriate textbooks will be written. As I mentioned before, this is one of the very few silver linings of the pandemic…

P.S. (July 8, 2021) I should have mentioned that in addition to “a little bit about everything” textbooks, there are also “a lot about everything” doorstopper-size volumes. I sort of don’t think of them as textbooks at all, more like mixtures of a reference guide, encyclopedia and teacher’s manual. Since even the thought of teaching from such books overwhelms the senses, I don’t expect them to be widely adopted.

Having said that, these voluminous textbooks can be incredibly valuable to both the students and the instructor as a source of interesting supplementary material. Let me single out the excellent recent “Combinatorial Mathematics” by Doug West, written in the same clear and concise style as his earlier “Introduction to Graph Theory”. It is priced modestly (for 991 pages), and I recommend it as “further reading” for all combinatorics courses, even though I strongly disagree with the second sentence of the Preface, per my earlier blog post.

How to fight the university bureaucracy and survive

June 27, 2021 Leave a comment

The enormity of the university administration can instill fear. How can you possibly fight such a machine? Even if an injustice happened to you, you are just one person with no power, right? Well, I think you can. Whether you succeed in your fight is another matter. But at least you can try… In this post I will try to give you some advice on how to do this.

Note: Initially I wanted to make this blog post light and fun, but I couldn’t think of a single joke. Somehow, the subject doesn’t inspire… So read this only if it’s relevant to you. Wait for future blog posts otherwise…

Warning: Much of what I say is relevant to big state universities in the US. Some of what I say may also be relevant to other countries and university systems, I wouldn’t know.

Basics

Who am I to write about this? It is reasonable to ask if any of this is based on my personal experience of fighting university bureaucracies. The answer is yes, but I am not willing to make any public disclosures, to protect the privacy of all parties involved. Let me just say that over the past 20 years I have had several relatively quiet and fairly minor fights with university bureaucracies, some of which I won rather quickly by being right. Once, I bullied my way into victory despite being in the wrong (as I later learned), and once I won on a difficult (non-personal) political issue by being cunning and playing a really long game that took almost 3 years. I didn’t lose any, but I did refrain from fighting several times. By contrast, when I tried to fight the federal government a couple of times (on academic matters), I lost quickly and decisively. They are just too powerful…

Should you fight? Maybe. But probably not. Say you complained to the administration about what you perceive to be an injustice to you or to someone else. Your complaint was denied. This is when you need to decide if you want to start a fight. If you do, you will spend a lot of effort and (on average) probably lose. Administrations are powerful and know what they are doing. You probably don’t, otherwise you wouldn’t be reading this. This blog post might help you occasionally, but it won’t change the big picture.

Can you fight? Yes, you can. You can win by being right and convincing bureaucrats to see it that way. You can win by being persistent when others give up. You can also win by being smart. Big systems have weaknesses you can exploit, see below. Use them.

Is there a downside to winning a fight? Absolutely. In the process you might lose some friends, raise some suspicions from colleagues, and invite retribution. On the positive side, big systems have very little institutional memory — your win and the resulting embarrassment to the administration will be forgotten soon enough.

Is there an upside to losing a fight? Actually, yes. You might gain the respect of some colleagues as someone willing to fight. In fact, people tend to want to be friends/friendly with such people out of self-preservation. And if your cause is righteous, this might help your reputation in and beyond the department.

Why did I fight? Because I just couldn’t go on without a fight. The injustice, as I perceived it, was eating me alive, and I had a hunch there was a nonzero chance I would win. There were some cases when I figured the chances were zero, and I didn’t need the grief. There were cases when the issue was much too minor to waste my energy on. I don’t regret those decisions, but having grown up in this unsavory part of Moscow, I was conditioned to stand up for myself.

Is there a cost to not fighting? Yes, and it goes beyond the obvious. First, fighting bureaucracy is a skill, and every skill takes practice. I remember when I tried to rent an apartment in Cambridge, MA — some real estate agents would immediately ask if I went to Harvard Law School. Apparently it’s a common practice for law students to sue their landlords, as a sort of “extra credit” homework exercise. Most of these lawsuits would quickly fail, but the legal proceedings were costly to the owners.

Second, there is a societal cost. If you feel confident that your case is strong, your winning might set a precedent which could benefit many others. I wrote on this blog once how I dropped (or never really started) a fight against the NSF, even though they clearly denied me the NSF Graduate Fellowship in a discriminatory manner, or at least that’s what I continue to believe. Not fighting was the right thing to do for me personally (I would have lost, 100%), but my case was strong and the fight itself might have raised some awareness of the issue. It took the NSF almost 25 years to figure out that it was time to drop the discriminatory GRE requirement.

Axioms

  1. If it’s not in writing it never happened.
  2. Everyone has a boss.
  3. Bureaucrats care about themselves first and foremost. Then about people in their research area, department and university, in that order. Then undergraduates. Then graduate students. You are the last person they care about.

How to proceed

Know your adversary. Remember — you are not fighting a mafia, a corrupt regime or the whole society. Don’t get angry, fearful or paranoid. Your adversary is a group of good people who are doing their jobs as well as they can. They are not infallible, but probably pretty smart and very capable when it comes to bureaucracy, so from a game theory point of view you may as well assume they are perfect. When they are not, you will notice — that’s the weakness you can exploit, but don’t expect that to happen.

Know your rights. This might seem obvious, but you would be surprised to know how many academics are not aware they have rights in a university system. In fact, it’s a feature of every large bureaucracy — it produces a lot of well-meaning rules. For example, Wikipedia is a large project which has survived for 20 years, so unsurprisingly it has a large set of policies enforced by an army of admins. The same is probably true about your university and your department. Search the web for the faculty handbook, university and department bylaws, etc. If you can’t find them anywhere, email the assistant to the Department Chair and ask for a copy.

Go through the motions. Say you think you were slighted. For example, your salary was not increased (enough), you didn’t get a promotion, you got too many committee duties assigned, your sabbatical was not approved, etc. Whatever it is, you are upset, I get it. Your first step is not to complain but to go through the motions and email inquiries. Email the head of the department, the chair of the executive committee, your faculty dean, etc., whoever is the decision maker. Calmly ask them to explain the decision. Sometimes it was an oversight, and it is corrected with a quick apology and “thanks for bringing this up”. You win, case closed. Other times you get a convincing explanation with which you might agree — say, the university is on a salary freeze so nobody got a salary increase, see some link. Again, case closed.

But in other cases you receive an argument with which you disagree (say, “the decision was made based on your performance in the previous year”), a non-answer (say, “I am on sabbatical” or “I will not be discussing personal matters by email”), or no answer at all. These are the cases you need to know how to handle, and all such cases are a little different. I will try to cover as much territory as possible, but will surely miss some cases.

Ask for advice. This is especially important if you are a junior mathematician and feel a little overwhelmed. Find a former department chair, perhaps a professor emeritus, and have a quiet chat. Old-timers know the history of the department, who the university administrators are, what the rules are, what happened to previous complaints, what would fly and what wouldn’t, etc. They might also suggest who else you should talk to who would be knowledgeable and helpful in dealing with the issue. With friends like these, you are in good shape.

Scenarios

Come by for a chat. This is a standard move by a capable bureaucrat. They invite you for a quick discussion, maybe sincerely apologize for “what happened” or “if you are upset”, and promise something which they may or may not intend to keep. You are supposed to leave grateful that “you are heard”, and nothing is really lost from the admin’s point of view. You lost.

There is only one way to counter this move. Agree to a meeting — play nice and you might learn something. Don’t record in secret — it’s against the law in some states. Don’t ask if you can record the conversation — even if the bureaucrat agrees, you will hear nothing but platitudes then (like “we in our university strive to make sure everyone is happy and successful, and it is my personal goal to ensure everyone is treated fairly and with respect”). This defeats the purpose of the meeting, moving you back to square one.

At the meeting, do not agree to anything, and never say yes or no to anything. Not even to the routine “No hard feelings?” Just nod, take careful notes, say “thank you so much for taking the time to have this meeting” and “this information is very useful, I will need to think it over”. Do not sign anything. If offered a document to sign, take it with you. If implicitly threatened, as in “Right now I can offer you this, but once you leave this office I can’t promise…” (this is rare but does happen occasionally), ignore the threat. Just keep repeating “Thank you so much for informing me of my options, I will need to think it over.” Go home, think it over and talk to somebody.

Get it all in writing. Within a few hours of the meeting, send the bureaucrat an email with your notes. Start this way: “Dear X, this is to follow up on the meeting we had on [date] regarding the [issue]. I am writing this to ensure there is no misunderstanding on my part. At the meeting you [offered/suggested/claimed/threatened] …. Please let me know if this is correct, and what the details of … are.”

A capable bureaucrat will recognize the move and will never go on record with anything unbecoming. They will accept the out you offered and claim that you indeed misunderstood. Don’t argue with that — you have them where you want them. In lieu of the misunderstanding they will need to give a real answer to your grievance (otherwise what was the point of the meeting?). Sometimes a bureaucrat will still resort to platitudes, but now that they are in writing, that trick is harder to pull off, and it leads us to a completely different scenario (see below).

Accept the win. You might receive something like this: “We sincerely apologize for [mistake]. While nothing can be done about [past decision], we intend to [compensate or rectify] by doing…” If this is a clear unambiguous promise in writing, you might want to accept it. If not, follow up about details. Do not pursue this any further and don’t make it public. You got what you wanted, it’s over.

Accept the defeat. You might learn that the administration acted by the book, exactly the way the rules/bylaws prescribe, and that you were not intentionally discriminated against in any way. Remain calm. Thank the bureaucrat for the “clarification”. It’s over.

Power of CC. If you receive a non-answer full of platitudes or no email reply at all (give it exactly one week), then follow up. Write politely “I am afraid I did not receive an answer to [my questions] in my email from [date]. I would really appreciate your response to [all issues I raised]. P.S. I am CC’ing this email to [your boss, boss of your boss, your assistant, your peers, other fellow bureaucrats, etc.] to let them know of [my grievance] and in case they can be helpful with this situation.” They will not “be helpful”, of course, but that’s not the point. The CC move itself has immense power, driven by bureaucrats’ self-preservation. Most likely you will get a reply within hours. Just don’t abuse the CC move — use it when you have no other moves to play, as otherwise it loses its power.

Don’t accept a draw. Sometimes a capable bureaucrat might reply to the whole list on CC and write “We are very sorry [your grievance] happened. This is extremely atypical and related to [your unusual circumstances]. While this is normally not appropriate, we are happy to make an exception in your case and [compensate you].” Translation: “it’s your own fault, you brought it on yourself, we admit no wrongdoing, but we are being very nice and will make you happy even though we really don’t have to do anything, not at all.” While other bureaucrats will recognize the move and that there is an implicit admission of fault, they will stay quiet — it’s not their fight.

Now, there is only one way to counter this, as far as I know. If you don’t follow up, it’s an implicit admission of “own fault”, which you don’t want, as the same issue might arise again in the future. If you start explaining that it’s really the bureaucrat’s fault, you seem vindictive (as in “you already got what you wanted, why do you keep pushing this?”), and other bureaucrats will close ranks, leaving you worse off. The only way out is to pretend to be just as illogical as the bureaucrat pretends to be. Reply to the whole CC list something like “Thank you so much for your apology and understanding of my [issue]. I am very grateful this is resolved to everyone’s satisfaction. I gratefully accept your sincere apology and your assurances that this will not happen again to me nor to anyone else at the department.”

A capable bureaucrat will recognize they are fighting fire with fire. In your email you sound naïve and sincere — how do you fight that? What are they going to do — reply “actually, I didn’t issue any apology as this was not my fault”? Now that seems overly defensive. And they would have to reply to the whole CC list again, which is not what they want. They are aware that everyone else knows they screwed up, so reminding everyone with a new email is not in their interest. And there is a decent chance you might reply to the whole CC list again with all that sugarcoated unpleasantness. Most likely, you won’t hear from them again, or you will get just a personal (non-CC’d) email which you can ignore regardless of the content.

Shifting blame or responsibility. That’s another trick bureaucrats employ very successfully. You might get a reply from a bureaucrat X to the effect of “don’t ask me, these are rules made by [people upstairs]” or “As far as I know, person Y is responsible for all this”. This is great news for you — a tacit validation of your cause and an example of a bureaucrat putting their own well-being ahead of the institution. Remember, your fight is not with X, but with the administration. Immediately forward both your grievance and the reply to Y, or to X’s boss if no names were offered, and definitely CC X “to keep you in the loop on further developments on this issue”. That immediately pushes the bureaucracy into overdrive, as it starts playing musical chairs in the game of “whose fault is this and what can be done”.

Like with musical chairs, you might have to repeat the procedure a few times, but chances are someone will eventually accept responsibility just to stop the embarrassment from going in circles. By then, there will be so many people on the CC chain that your issue will be addressed appropriately.

Help them help you. Sometimes a complaint puts the bureaucrat into a stalemate. They want to admit that an injustice happened to you, but numerous university rules forbid them from acting to redress the situation. To get around these rules, they would have to take the case upstairs, which brings its own complications for everyone involved. Essentially, you need to throw them a lifeline by suggesting some creative solution to the problem.

Say, you can write “while I realize the deadline for approval of my half-year sabbatical has passed, perhaps the department can buy out one course from my Fall schedule and postpone teaching the other until Spring.” This moves the discussion from the “apology” subject to “what can be done”, a much easier bureaucratic terrain. While the bureaucrat may not agree with your proposed solution, your willingness to do without an apology will earn you some points and perhaps lead to a resolution favorable to all parties.

Now, don’t be constrained in your creativity when thinking up such a face-saving resolution. It is a common misconception that university administrations are very slow and rigid. This is always correct “on average”, and holds for all large administrative systems where responsibility is distributed across many departments and individuals. In reality, when they want to, such large systems can turn on a dime by quickly utilizing their numerous resources (human, financial, legal, etc.). I’ve seen it in action, it’s jaw-dropping, and it takes just one high-ranking person to take up the issue and make it a cause.

Making it public. You shouldn’t do that unless you already lost but keep holding a grudge (and have tenure to protect you). Even then, you probably shouldn’t do it unless you are really good at PR. Just about every time you make grievances public you lose some social points with people who will hold it against you, claim you brought it on yourself, etc. In the world of social media your voice will be drowned out, and your case will either be ignored or take on a life of its own, with facts distorted to fit a particular narrative. The administration will close ranks and refuse to comment. You might end up worse off than when you started.

The only example I can give is my own combative blog post, which remains by far my most widely read post. Everyone just loves watching a train wreck… Many people asked why I wrote it, since it made me a persona non grata in the whole area of mathematics. I don’t have a good answer. In fact, that area may have lost some social capital as a result of my blog post, but it hasn’t changed at all. Some people apologized, that’s all. There is really nothing I can do and they know it. The truth is — my upbringing was acting up again, and I just couldn’t let it go without saying “Don’t F*** with Igor Pak”.

But you can very indirectly threaten to make it public. Don’t do it unless you are at the endgame, dealing with a high-ranking administrator, and things are not looking good for you. Low-level university bureaucrats are not really afraid for their jobs. For example, the head of the department might not even want to occupy the position, and is fully protected by tenure anyway. But deans, provosts, etc. are often fully invested in their positions, which come with a substantial salary hike. If you have a sympathetic case, they wouldn’t want to be featured in a college newspaper as denying you some benefits, regardless of the merits. They wouldn’t be bullied into submission either, so some finesse is needed.

In this case I recommend you find the email of some student editor of a local university newspaper. In your reply to the high-ranking administrator write something like “Yes, I understand the university position in regard to this issue. However, perhaps [creative solution]”. Then quietly insert the editor’s email into the CC. In the reply, the administrator will delete the email from the CC “for privacy reasons”, but will google to find out who is being CC’ed. Unable to gauge the extent of the newspaper’s interest in the story, the administrator might choose to hedge and help you, by throwing money at you or by mollifying you in some creative way you proposed. Win–win.

Final word

I am confident there will be people on all sides who disagree collectively with just about every sentence I wrote. Remember — this blog post is not a recommendation to do anything. It’s just my personal point of view on these delicate matters, which tend to go undiscussed, leaving many postdocs and junior faculty facing their grievances alone. If you know a good guide on how to deal with these issues (beyond Rota’s advice), please post a link in the comments. Good luck everyone! Hope you will never have to deal with any of that!

Why you shouldn’t be too pessimistic

May 13, 2021 2 comments

In our math research we make countless choices. We choose a problem to work on, decide whether its claim is true or false, what tools to use, what earlier papers to study which might prove useful, who to collaborate with, which computer experiments might be helpful, etc. Choices, choices, choices… Most of our choices are private. Others are public. This blog post is about wrong public choices that I made, misjudging some conjectures by being overly pessimistic.

The meaning of conjectures

As I have written before, conjectures are crucial to the development of mathematics and to my own work in particular. The concept itself is difficult, however. While traditionally conjectures are viewed as some sort of “unproven laws of nature”, that comparison is wildly misleading, as many conjectures are descriptive rather than quantitative. To understand this, note the stark contrast with experimental physics: many mathematical conjectures are not particularly testable yet remain quite interesting. For example, if someone conjectures there are infinitely many Fermat primes, the only way to dissuade such a person is to actually disprove the claim.

There is also an important social aspect of conjecture making. A person who poses a conjecture is credited with a certain clairvoyance, respected by other people in the area. Predictions are never easy, especially those of a precise technical nature, so some bravery or self-assuredness is required. Note that social capital is spent every time a conjecture is posed. In fact, a lot of it is lost when the conjecture is refuted, you come out even if it’s proved relatively quickly, and you gain only if the conjecture becomes popular or is proved, possibly many years later. There is also a “boy who cried wolf” aspect for people who make too many conjectures of dubious quality — people will just tune out.

Now, for the person working on a conjecture, there is also a betting aspect one cannot ignore. As in, are you sure you are working in the right direction? Perhaps the conjecture is simply false and you are wasting your time… I wrote about this all before in the post linked above, and the life/career implications for the solver are obvious. Success in solving a well-known conjecture is often regarded much more highly than a comparable result nobody asked about. This may seem unfair, and there is a bit of celebrity culture here. Think about it this way — two lead actors can have similar acting skills, but the one who is a star will usually attract a much larger audience…

Stories of conjectures

Not unlike what happens to papers and mathematical results, conjectures also have stories worth telling, even if these stories are rarely discussed at length. In fact, these “conjecture stories” fall into a few types. This is a little bit similar to the “types of scientific papers” meme, but more detailed. Let me list a few scenarios, from the least to the most mathematically helpful:

(1) Wishful thinking. Say, you are working on a major open problem. You realize that a famous conjecture A follows from a combination of three conjectures B, C and D whose sole motivation is their applications to A. Some of these smaller conjectures are beyond the existing technology in the area and cannot be checked computationally beyond a few special cases. You then declare this to be your “program” and prove a small special case of C. Somebody points out that D is trivially false. You shrug, replace it with a weaker D’ which suffices for your program but is harder to disprove. Somebody writes a long state-of-the-art paper disproving D’. You shrug again and suggest an even weaker conjecture D”. Everyone else shrugs and moves on.

(2) Reconfirming long-held beliefs. You are working in a major field of study, aiming to prove a famous open problem A. Over the years you proved a number of special cases of A and became one of the leaders of the area. You are very optimistic about A, discussing it in numerous talks and papers. Suddenly A is disproved in some esoteric situations, undermining the motivation of much of your older and ongoing work. So you propose a weaker conjecture A’ as a replacement for A, in an effort to salvage both the field and your reputation. This makes everyone in the area happy, and they completely ignore the disproof of A from this point on, pretending it’s completely irrelevant. Meanwhile, they replace A with A’ in all subsequent papers and beamer talk slides.

(3) Accidental discovery. In your ongoing work you stumble upon a coincidence. It seems all objects of a certain kind have some additional property making them “nice”. You are clueless as to why that would be true, since being nice belongs to another area X. Being nice is also too abstract to be checked easily on a computer. You ask a colleague working in X whether this is obvious/plausible/can be proved and receive No/Yes/Maybe answers to these three questions. You are either unable to prove the property, or uninterested in the problem, or don’t know much about X. So you mention it in the Final Remarks section of your latest paper, in the vain hope that somebody reads it. For a few years, every time you meet somebody working in X you mention to them your “nice conjecture”, so much so that people laugh at you behind your back.

(4) Strong computational evidence. You are doing computer experiments related to your work. Suddenly certain numbers appear to have an unexpectedly nice formula or generating function. You check with the OEIS and the sequence is indeed there, but not with the meaning you wanted. You use the “scientific method” to get a few more terms, and they indeed support your conjectural formula (a toy version of this check is sketched after the list). Convinced this is not an instance of the “strong law of small numbers”, you state the formula as a conjecture.

(5) Being contrarian. You think deeply about famous conjecture A. Not only do you realize that there is no way one can approach A in full generality, but also that it contradicts some intuition you have about the area. However, A was stated by a very influential person N, and many people believe in A, proving it in a number of small special cases. You want to state a non-A conjecture, but realize the inevitable PR disaster of people directly comparing you to N. So you either state that you don’t believe in A, or that you believe in a conjecture B which is either slightly stronger or slightly weaker than non-A, hoping that history will prove you right.

(6) Being inspirational. You think deeply about the area and realize that there is a fundamental principle underlying certain structures in your work. Formalizing this principle requires a great deal of effort and results in a conjecture A. The conjecture leads to a large body of work by many people, and even some counterexamples in esoteric situations, leading to various fixes such as A’. But at that point A’ is no longer the goal, but more of a direction in which people work, proving a number of A-related results.

Obviously, there are many other possible stories, and some stories are a mixture of several of these.
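
Speaking of scenario (4), here is a toy version of that check, with stand-in objects of my own choosing: brute-force a few terms of a sequence (here, ballot sequences, which are counted by the Catalan numbers) and compare them against a guessed closed form.

    from math import comb
    from itertools import product

    def count_ballot_sequences(n):
        """Brute-force count of +1/-1 sequences of length 2n with all
        partial sums nonnegative and total sum zero -- a stand-in for
        whatever objects the computer experiments are enumerating."""
        count = 0
        for seq in product((1, -1), repeat=2 * n):
            total, ok = 0, True
            for step in seq:
                total += step
                if total < 0:
                    ok = False
                    break
            if ok and total == 0:
                count += 1
        return count

    # Compare the data against the guessed closed form C(2n, n)/(n + 1).
    for n in range(1, 9):
        data = count_ballot_sequences(n)
        guess = comb(2 * n, n) // (n + 1)
        print(n, data, guess, data == guess)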

Why do I care? Why now?

In the past few years I’ve been collecting references to papers which solve or make some progress towards my conjectures and open problems, and putting links to them next to my papers on my research page. Turns out, over the years I made a lot of those. Even more surprisingly, there are quite a few papers which address them. Here is a small sampler, in random order:

(1) Scott Sheffield proved my ribbon tilings conjecture.

(2) Alex Lubotzky proved my conjecture on random generation of a finite group.

(3) Our generalized loop-erased random walk conjecture (joint with Igor Gorodezky) was recently proved by Heng Guo and Mark Jerrum.

(4) Our Young tableau bijections conjecture (joint with Ernesto Vallejo) was resolved by André Henriques and Joel Kamnitzer.

(5) My size Ramsey numbers conjecture led to a series of papers, and was completely resolved only recently by Nemanja Draganić, Michael Krivelevich and Rajko Nenadov.

(6) One of my partition bijection problems was resolved by Byungchan Kim.

The reason I started collecting these links is kind of interesting. I was very impressed with George Lusztig and Richard Stanley‘s lengthy writeups about their collected papers that I mentioned in this blog post. While I don’t mean to compare myself to these giants, I figured the casual reader might want to know if a conjecture in some paper had been resolved. Thus the links on my website. I recommend others also do this, as a navigational tool.

What gives?

Well, it looks like none of my conjectures have been disproved yet. That’s good news, I suppose. However, by going over my past research work I did discover that on three occasions when I was thinking about other people’s conjectures, I was much too negative. This is probably the result of my general inclination towards “negative thinking”, but each story is worth telling.

(i) Many years ago, I spent some time thinking about Babai’s conjecture, which states that there are universal constants C, c > 0 such that for every finite simple group G and generating set S, the diameter of the Cayley graph Cay(G,S) is at most C (log |G|)^c. There has been a great deal of work on this problem, see e.g. this paper by Sean Eberhard and Urban Jezernik which has an overview and references.
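
To make the quantity in the conjecture concrete, here is a minimal sketch that computes the diameter of a small Cayley graph by breadth-first search; the group S_6 and the transposition-plus-cycle generating set are arbitrary illustrative choices, nothing more.

    from collections import deque

    def cayley_diameter(n, gens):
        """Diameter of Cay(S_n, gens) by breadth-first search from the identity.
        Permutations are tuples p with p[i] = image of i; gens should be
        closed under taking inverses so the graph is undirected."""
        identity = tuple(range(n))
        dist = {identity: 0}
        queue = deque([identity])
        while queue:
            p = queue.popleft()
            for g in gens:
                q = tuple(p[g[i]] for i in range(n))  # compose p with g
                if q not in dist:
                    dist[q] = dist[p] + 1
                    queue.append(q)
        # Cayley graphs are vertex-transitive, so the eccentricity of the
        # identity equals the diameter of the whole graph.
        return max(dist.values())

    # S_n generated by the transposition (0 1) and the n-cycle (0 1 ... n-1).
    n = 6
    transposition = (1, 0) + tuple(range(2, n))
    cycle = tuple(range(1, n)) + (0,)              # i -> i + 1 (mod n)
    cycle_inverse = tuple((i - 1) % n for i in range(n))
    print(cayley_diameter(n, [transposition, cycle, cycle_inverse]))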

Now, I was thinking about the case of the symmetric group, trying to apply arithmetic combinatorics ideas and going nowhere. In my frustration, in a talk I gave (Galway, 2009), I wrote on the slides that “there is much less hope” of resolving Babai’s conjecture for A_n than for simple groups of Lie type of bounded rank. Now, strictly speaking that judgement was correct, but much too gloomy. Soon after, Ákos Seress and Harald Helfgott proved a remarkable quasi-polynomial upper bound in this case. To my embarrassment, they referenced my slides as a validation of the importance of their work.

Of course, Babai’s conjecture is very far from being resolved for A_n. In fact, it is possible that the diameter is always O(n^2). We just have no idea. For simple groups of Lie type of large rank the existing worst-case diameter bounds are exponential and much too weak compared to the desired bound. As Eberhard and Jezernik amusingly wrote in the paper linked above, “we are still exponentially stupid”…

(ii) When he was my postdoc at UCLA, Alejandro Morales told me about a curious conjecture in this paper (Conjecture 5.1), which claimed that the number of certain nonsingular matrices over the finite field F_q is polynomial in q with positive coefficients. He and his coauthors proved the conjecture in some special cases, but it was wide open in full generality.

Now, I had thought about this type of problem before and was very skeptical. I spent a few days working on the problem to see if any of my tools could disprove it, and failed miserably. But in my stubbornness I remained negative and suggested to Alejandro that he should drop the problem, or at least try to disprove rather than prove the conjecture. I was wrong to do that.

Luckily, Alejandro ignored my suggestion and soon after proved the polynomiality part of the conjecture together with Joel Lewis. Their proof is quite elegant and uses certain recurrences coming from rook theory. These recurrences also allow a fast computation of these polynomials. Consequently, the authors ran a number of computer experiments and disproved the positivity-of-coefficients part of the conjecture. So the moral is not to be so negative. Sometimes you need to prove a positive result first before moving to the dark side.

(iii) The final story is about the beautiful Benjamini conjecture in probabilistic combinatorics. Roughly speaking, it says that for every finite vertex-transitive graph G on n vertices with diameter O(n/log n), the critical percolation constant satisfies p_c < 1. More precisely, the conjecture claims that there is p < 1 - ε such that p-percolation on G has a connected component of size > n/2 with probability at least δ, where the constants ε, δ > 0 depend on the constant implied by the O(·) notation, but not on n. Here by “p-percolation” we mean a random subgraph of G obtained by keeping each edge with probability p and deleting it with probability 1 - p, independently for all edges of G.
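
To make the setup concrete, here is a quick simulation sketch of p-percolation. The graph (the discrete torus, i.e. the Cayley graph of Z_m × Z_m with the standard generators) and the values m = 30, p = 0.7 are just illustrative choices on my part, not anything prescribed by the conjecture.

    import random

    def torus_edges(m):
        """Edge list of the m-by-m discrete torus, i.e. the Cayley graph of
        Z_m x Z_m with the standard generators -- a simple vertex-transitive
        test case whose diameter is about m = sqrt(n)."""
        edges = []
        for x in range(m):
            for y in range(m):
                edges.append(((x, y), ((x + 1) % m, y)))
                edges.append(((x, y), (x, (y + 1) % m)))
        return edges

    def largest_component(vertices, edges, p):
        """Keep each edge independently with probability p (p-percolation),
        then return the size of the largest connected component (union-find)."""
        parent = {v: v for v in vertices}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v
        for u, v in edges:
            if random.random() < p:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv
        sizes = {}
        for v in vertices:
            r = find(v)
            sizes[r] = sizes.get(r, 0) + 1
        return max(sizes.values())

    m, p, trials = 30, 0.7, 100
    vertices = [(x, y) for x in range(m) for y in range(m)]
    edges = torus_edges(m)
    n = len(vertices)
    hits = sum(largest_component(vertices, edges, p) > n / 2 for _ in range(trials))
    print(f"fraction of trials with a component of size > n/2: {hits / trials:.2f}")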

Now, Itai Benjamini is a fantastic conjecture maker of the best kind, whose conjectures are both insightful and well motivated. Despite the somewhat technical claim, this conjecture is quite remarkable, as it suggested a finite version of the “p_c < 1” phenomenon for infinite groups of superlinear growth. The latter is the famous Benjamini–Schramm conjecture (1996), which was recently proved in a remarkable breakthrough by Hugo Duminil-Copin, Subhajit Goswami, Aran Raoufi, Franco Severo and Ariel Yadin. While I always believed in that conjecture and even proved a tiny special case of it, finite versions tend to be much harder in my experience.

In any event, I thought a bit about the Benjamini conjecture and talked to Itai about it. He convinced me to work on it. Together with Chris Malon, we wrote a paper proving the claim for some Cayley graphs of abelian groups and some more general classes of groups. Despite our best efforts, we could not prove the conjecture even for Cayley graphs of abelian groups in full generality. Benjamini noted that the conjecture is tight for products of two cyclic groups, but that justification did not sit well with me. There seemed to be no obvious way to prove the conjecture even for the Cayley graph of S_n generated by a transposition and a long cycle, despite the very small O(n^2) diameter. So we wrote in the introduction: “In this paper we present a number of positive results toward this unexpected, and, perhaps, overly optimistic conjecture.”

As it turns out, it was us who were being overly pessimistic, even if we never actually stated that we believed the conjecture to be false. Most recently, in an amazing development, Tom Hutchcroft and Matthew Tointon proved a slightly weaker version of the conjecture by adapting the methods of Duminil-Copin et al. They assume an O(n/(log n)^c) upper bound on the diameter, which they prove is sufficient, for some universal constant c > 1. They also extend our approach with Malon to prove the conjecture for all Cayley graphs of abelian groups. So while the Benjamini conjecture is not completely resolved, my objections to it are no longer valid.

Final words on this

All in all, it looks like I was never formally wrong even if I was a little dour occasionally (Yay!?). Turns out, some conjectures are actually true or at least likely to hold. While I continue to maintain that not enough effort is spent on trying to disprove the conjectures, it is very exciting when they are proved. Congratulations to Harald, Alejandro, Joel, Tom and Matthew, and posthumous congratulations to Ákos for their terrific achievements!

The Unity of Combinatorics

April 10, 2021 1 comment

I just finished my very first book review for the Notices of the AMS. The authors are Ezra Brown and Richard Guy, and the book title is the same as the title of this blog post. I had mixed feelings when I accepted the assignment to write this. I knew it would take a lot of work (I was wrong — it took a huge amount of work). But the reason I accepted is that I strongly suspected there is no “unity of combinatorics”, so I wanted to be proved wrong. Here is how the book begins:

One reason why Combinatorics has been slow to become accepted as part of mainstream Mathematics is the common belief that it consists of a bag of isolated tricks, a number of areas: [very long list – IP] with little or no connection between them. We shall see that they have numerous threads weaving them together into a beautifully patterned tapestry.

Having read the book, I continue to maintain that there is no unity. The book review became a balancing act — how do you write a somewhat positive review if you don’t believe in the mission of the book? Here is the first paragraph of the portion of the review where I touch upon themes very familiar to readers of this blog:

As I see it, the whole idea of combinatorics as a “slow to become accepted” field feels like a throwback to the long forgotten era. This attitude was unfair but reasonably common back in 1970, outright insulting and relatively uncommon in 1995, and was utterly preposterous in 2020.

After a lengthy explanation I conclude:

To finish this line of thought, it gives me no pleasure to conclude that the case for the unity of combinatorics is too weak to be taken seriously. Perhaps, the unity of mathematics as a whole is an easier claim to establish, as evident from [Stanley’s] quotes. On the other hand, this lack of unity is not necessarily a bad thing, as we would be amiss without the rich diversity of cultures, languages, open problems, tools and applications of different areas.

Enjoy the full review! And please comment on the post with your own views on this alleged “unity”.

P.S. A large part of the book is freely downloadable. I made this website for the curious reader.

Remark (ADDED April 17, 2021)
Ezra “Bud” Brown gave a talk on the book illustrating many of the connections I discuss in the review. This was at a memorial conference celebrating Richard Guy’s legacy. I was not aware of the video until now. Watch the whole talk.

2021 Abel Prize

March 17, 2021 2 comments

I am overjoyed by the news of the Abel Prize being awarded to László Lovász and Avi Wigderson. You can now see three (!) Abel laureates discussing Combinatorics — follow the links in this blog post from 2019. See also Gil Kalai’s blog post for further links to lectures.

My interview

March 9, 2021 1 comment

Readers of this blog will remember my strong advocacy for taking interviews. In a surprising turn of events, Toufik Mansour interviewed me for the journal Enumerative Combinatorics and Applications (ECA). Here is that interview. I am not sure I am the right person to be interviewed, but if you want to see Toufik’s other interviews — click here (I mentioned some of them earlier). I am looking forward to reading interviews with many more people in ECA and other journals.

P.S. The interview also asks about this blog, so it seems fitting to mention it here.

Corrections: (March 11, 2021) 1. I misread the question “What three results do you consider the most influential in combinatorics during the last thirty years?” as asking about my own three results that are specifically in combinatorics. Ugh. As to the original question – none of my results would go on that list. 2. In the pattern avoidance question, I misstated the last condition: I am asking for ec(Π) to be non-algebraic. Sorry everyone for all the confusion!

How to tell a good mathematical story

March 4, 2021 2 comments

As I mentioned in my previous blog post, I was asked to contribute to the Early Career Collection in the Notices of the AMS. The paper is not up on their website yet, but I have already submitted the proofs. So if you can’t wait — the short article is available here. I admit that it takes a bit of chutzpah to teach people how to write, so take it as you will.

Like my previous “how to write” article (see also my blog post), this article is mildly opinionated, but hopefully not so much so as to stop being useful. It is again aimed at a novice writer. There is a major difference between the way fiction is written and the way math is written, and I am trying to capture it somehow. To give you some flavor, here is a quote:

What kind of a story? Imagine a non-technical and non-detailed version of the abstract of your paper. It should be short, to the point, and straightforward enough to be a tweet, yet interesting enough for one person to want to tell it, and for the listener curious enough to be asking for details. Sounds difficult if not impossible? You are probably thinking that way, because distilled products always lack flavor compared to the real thing. I hear you, but let me give you some examples.

Take Aesop’s fable “The Tortoise and the Hare” written over 2500 years ago. The story would be “A creature born with a gift procrastinated one day, and was overtaken by a very diligent creature born with a severe handicap.” The names of these animals and the manner in which one lost to another are less relevant to the point, so the story is very dry. But there are enough hints to make some readers curious to look up the full story.

Now take “The Terminator”, the original 1984 movie. The story here is (spoiler alert! ) “A man and a machine come from another world to fight in this world over the future of the other world; the man kills the machine but dies at the end.” If you are like me, you probably have many questions about the details, which are in many ways much more exciting than the dry story above. But you see my point – this story is a bit like an extended tag line, yet interesting enough to be discussed even if you know the ending.

What math stories to tell and not to tell?

February 8, 2021 3 comments

Storytelling can be surprisingly powerful. When a story is skillfully told, you get an almost magical feeling of being a part of it, making you care deeply about the protagonists. Even if under ordinary circumstances you have zero empathy for Civil War era outlaws or the emperor penguins of Antarctica, you may suddenly find yourself engrossed with their fortune. This is a difficult skill to master, but the effects are visible even when it is used in earnest by beginners.

Recently I started thinking about the kind of stories mathematicians should be telling. This was triggered by Angela Gibney‘s kind invitation to contribute an article on math writing to the Early Career Collection in the Notices of the AMS. So I looked at a few older articles and found them just wonderful. I am not the target audience for some of them, but I just kept reading them all one after another until I exhausted the whole collection.

My general advice — read the collection! Read a few pieces by some famous people or some people you know. If you like them, keep on reading. As I wrote in this blog post, you rarely get an insight into a mathematician’s thinking unless they happen to have written an autobiography or given an interview. While this is more of a “how to” genre, most pieces are written in the first-person narrative and do tell some interesting stories or have some curious points of view.

It is possible I am the last person to find out about the collection. I am not a member of the AMS, I don’t read the Notices, and it’s been a long time since anyone considered me “early career”. I found a few articles a little self-centered (but who am I to judge), and I would quibble with some advice (see below). But even those articles I found compelling and thought-provoking.

Having read the collection, I decided to write about mathematical storytelling. This is not something that comes naturally to most people in the field. Math stories (as opposed to stories about mathematicians) tend to be rather dry and unexciting, especially in the early years of study. I will blog my own article some other time, but for now let me address the question in the title.

Stories to tell

With a few notable exceptions, just about all stories are worth telling. Whether in your autobiography or in your personal blog, as long as they are interesting to somebody — it’s all good. Given the lack of good stories, or any math stories really, it’s a good bet somebody will find your stories interesting. Let me expound on that.

Basically, anything personal works. To give examples from the collection, see e.g. the stories by Mark Andrea de Cataldo, Alicia Prieto-Langarica, Terry Tao and John Urschel. Most autobiographies are written in this style, but a short blog post is also great. Overcoming the embarrassment caused by such public disclosure can be difficult, which makes it even more valuable to the readers.

Anything historical works, from full-length monographs on the history of math to short point-of-view pieces. Niche and off-the-beaten-path stories are especially valuable. I personally like the classical History of Mathematical Notations by Florian Cajori, and Combinatorics: Ancient & Modern, a nice collection edited by Robin Wilson and John Watkins, with several articles authored by names you will recognize. Note that an oral history can also be very valuable, see the kind of stories discussed by László Lovász and Endre Szemerédi mentioned in this blog post and Dynkin’s interviews I discussed here.

Anything juicy works. I mean, if you have a story of some famous mathematician doing something unusual (good or bad, or just plain weird), that attracts attention. This was the style of Steven Krantz's two Mathematical Apocrypha books, with many revealing and embarrassing anecdotes giving a sense of the bygone era.

Anything inspirational works. A beautiful example of this style is Francis Su's Farewell Address as MAA President and parts of his moving follow-up book (the book has other interesting material as well). From the collection, let me single out Finding Your Reward by Skip Garibaldi which also aims to inspire. Yet another example is Bill Thurston's must-read MO answer "What's a mathematician to do?"

Anything in an off-the-beaten-path math style is great. Think of "The Strong Law of Small Numbers" by Richard Guy, or the many conjectures Terry Tao discusses in his blog. Think of "Missed opportunities" by Freeman Dyson, "Tilings of space by knotted tiles" by Colin Adams, or "One sentence proof… " by Don Zagier (see also a short discussion here) — these are all remarkable and memorable pieces of writing that don't conform to the usual peer review paradigm.

Finally, anything philosophical or metamathematical finds an audience. I am thinking of "Is it plausible?" by Barry Mazur, "Theorems for a Price" by Doron Zeilberger, "You and Your Research" by Richard Hamming, "Mathematics as Metaphor" by Yuri Manin, or even "Prime Numbers and the Search for Extraterrestrial Intelligence" by Carl Pomerance. We are all in search of some kind of answers, I suppose, so reading others think aloud about these deep questions always helps.

Practice makes perfect

Before I move to the other side, here is some simple advice on how to write a good story. Write as much as possible! There is no way around this. Absolutely no substitute, really. I've given this advice plenty of times, and so has everyone else. Let me conclude with this quote by Don Knuth, which is a bit similar to Robert Lazarsfeld's advice. It makes my point much better and with more authority than I can ever provide:

Of equal importance to solving a problem is the communication of that solution to others. The best way to improve your writing skills is to practice, practice, practice.

Seize every opportunity to write mini-essays about the theoretical work you are doing. Compose a blog for your friends, or even for yourself. When you write programs, write literate programs.

One of the best strategies to follow while doing PhD research is to prepare weekly reports of exactly what you are doing. What questions did you pursue that week? What positive answers did you get? What negative answers did you get? What are the major stumbling blocks that seem to be present at the moment? What related work are you reading?

Donald Knuth – On Writing up Research (posted by Omer Reingold), Theory Dish, Feb 26, 2018

Don’t be a journalist

In this interesting article in the same collection, Jordan Ellenberg writes:

Why don’t journalists talk about math as it really is? Because they don’t know how it really is. We do. And if we want the public discourse about math to be richer, broader, and deeper, we need to tell our own stories.

He goes on to suggest that one should start writing a blog and then pitch some articles to real newspapers and news magazines. He gives his own bio as one example (among others) of pitching and publishing in mainstream publications such as Slate and the New York Times. Obviously, I agree with the first (blog) part (duh!), but I am rather negative on the second part. I know, I know, this sounds discouraging, but hear me out.

First, what Jordan is not telling you is how hard he had to work on his craft before getting to the point of being acceptable to a general audience. This started with him getting a Summa Cum Laude A.B. degree from Harvard in both Math and English (if I recall correctly), and then publishing a well-received novel, all before starting his regular Slate column. Very few math people have this kind of background on which they can build popular appeal.

Second, this takes away jobs from real journalists. Like every highly competitive intellectual profession, journalism requires years of study and practice. It has its own principles and traditions, graduate schools, etc. Call it chutzpah or the Dunning–Kruger effect, but just because you are excellent in harmonic analysis doesn't mean you can do even a mediocre job as a writer. Again — some people can do both, but most cannot. If anything, I suspect a negative correlation between math and writing skills.

Here is another way to think about this. Most people do realize that they don't need to email their pretty iPhone pictures of a Machu Picchu sunrise to be published by National Geographic. Or that their family cobbler recipe may not be exactly what Gourmet Magazine is looking for. Why would you think that writing is any easier, then?

Third, this cheapens our profession to some degree. You really don't need a Ph.D. in algebraic number theory and two perfect scores at the IMO to write about Powerball or baseball. You need an M.S. in statistics and really good writing skills. There are plenty of media sites which do that now, such as 538. There is even a whole data-driven journalism (DDJ) specialization with many practitioners and a handful of Pulitzer prizes. Using quantitative methods is now mainstream, so what exactly are you bringing to the table?

Fourth, it helps to be honest. Jordan writes: “Editors like an angle. If there’s a math angle to a story in the news, pitch it! As someone with a degree in math, you have something to offer that most writers don’t.” This is true in the rare instances when, say, a Fields medal in your area is awarded, or something like that. But if it’s in an area far away from yours, then, uhm, you got nothing over many thousands of other people.

Now, please don’t take this as “don’t comment on current affairs” advice. No, no — please do! Comment away on your blog or on your podcast. Just don’t take jobs away from journalists. Help them instead! Write them emails, correct their mistakes. Let them interview you as an “expert”, whatever. Part of the reason the math related articles are so poor is because of mathematicians’ apathy and frequent disdain to the media, not because we don’t write newspaper articles — it’s really not our job.

Let me conclude with an anecdote about me reaching out to a newspaper. Once upon a time, long ago, airlines used to distribute real newspapers to passengers. I was sitting in the back and got a Wall Street Journal, which I read out of boredom during takeoff. There was an article discussing the EU expansion and the fact that, by some EU rules, the headquarters needs a translator from every language to every other language. The article predicted dark days ahead, since it's basically impossible to find people who can translate between some of the smaller languages, say from Maltese to Lithuanian. The article provided a helpful graph showing the number of translators needed as a function of the number of countries and claimed exponential growth.

I was not amused, cut out the article, and emailed the author upon arrival, saying that with all my authority as an assistant professor at MIT, I promise that n(n-1) grows polynomially, not exponentially. I got back a surprisingly apologetic reply. The author confessed he was a math major in college, but was using the word without thinking. I don't know if the WSJ ever published a correction, but I bet the author will not be using this word so casually anymore, and if he ever advances to an editorial position he will propagate this knowledge to others. So there — that's my personal contribution to improving public discourse…
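For the record, here is the back-of-the-envelope count (a small illustration on my part: I write T(n) for the number of translation directions among n languages, and take n = 24 as a stand-in since I don't recall the exact figure from the article):

\[
T(n) \;=\; n(n-1) \;=\; n^2 - n \;=\; \Theta(n^2), \qquad T(24) \;=\; 24\cdot 23 \;=\; 552, \quad \text{whereas} \quad 2^{24} \;=\; 16{,}777{,}216.
\]

Inconvenient for the EU bureaucracy, no doubt, but quadratic growth is a far cry from exponential.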

Don’t be an apologist

In another beautifully written article in the Early Career collection, Izzet Coskun gives “advice on how to communicate mathematics quickly in informal settings”. He writes:

Whether before a promotion committee, at a party where one might meet future politicians or future parents of future colleagues, in the elevator on the way up to tea, or in the dean’s office at a job interview, we often have the opportunity to explain our work to a general audience. The time we have is usually short [..] Our audience will not be familiar with our terminology. Communicating mathematics in such settings is challenging.

He then gives a lot of very useful practical advice on how to prepare for such a "math under a minute" conversation, how to be engaging, accessible, etc. It's all-around good advice. However, I disagree. Here is my simple advice: Don't Do It! If it's a dean and this is a job interview, feel free to use any math jargon you want — it's not your fault your field is technical, and the dean of sciences is used to it anyway. Otherwise, just say NO.

It’s true that sometimes your audience is friendly and is sincere in their interest in your work. In that case no matter what you say will disappoint them. There is a really good chance they can’t understand a word of what you say. They just think they can, and you are about to disillusion them.

But more often than not, the audience is actually not friendly, as was the case at the party Izzet described in his article. Many people harbor either a low regard or an outright resentment towards math stemming from their school years or some kind of "life experience". These folks simply want to reinforce their views, and no matter what you say, it will be taken as "you see, math is hard, boring and useless".

One should not confuse the unfriendlies with stupid or uneducated people. On the contrary, a lot of educated people think this way. A prime example is Amy Wax with her inimitable quote:

If we got rid of ninety percent of the math Ph.D. programs, would we really be worse off in any material respect?  I think that’s a serious question.

I discussed this quote at length in this blog post. There, I tried to answer her question. But after a few back-and-forth emails (which I didn't make public), it became clear that she is completely uninterested in actually learning what math is and what it does. She just wants to have her own answer validated by some area practitioners. Oh, well…

So here is the real reason why I think answering such people is pointless. No matter what you say, you come across as an apologist for the field. If people really want to understand what math is for, there are plenty of sources. In fact, I have several bookshelves of extremely well written book-length answers. But it's not your job to educate them! Worse, it is completely unreasonable to expect you to answer in "under one minute".

Think about how people react when they meet other professionals. Someone says "I develop new DNA based cancer treatments" or "I work on improving VLSI architecture", or "I devise new option pricing strategies". Is there a follow-up request to explain it in "under one minute"? Not really. Let me give you a multiple choice. Is that because people think that:

a) these professions are boring compared to math and they would rather hear about the latter?

b) they know exactly what these professionals do, but math is so darn mysterious?

c) they know these professions are technical and hard to understand, but even children can understand math, so how hard can that be?

d) these professions are clearly useful, but what do math people do — solve quadratic equations all day?

If you answered a) or b) you have more faith in humanity than I do. If you answered c) you never spoke to anyone about math at a party. So d) is the only acceptable answer, even if it’s an exaggeration. Some people (mostly under 7) think that I “add numbers all day”, some people (mostly in social sciences) think that I “take derivatives all day”, etc., you get the point. My advice — don’t correct them. This makes them unhappy. Doesn’t matter if they are 7 or 77 — when you correct them the unhappiness is real and visible…

So here is a summary of how I deal with such questions. If people ask what I do, I answer “I do math research and I teach“. If they ask what kind of research I say “advanced math“. If they ask for details I tell them “it’s complicated“. If they ask why, I tell them “because it takes many years of study to even understand the math lingo, so if I tell you what I do this sounds like I am speaking a foreign language“.

If they ask what are the applications of my research (and they always do), I tell them “teaching graduate classes“. If they ask for “practical” applications, whatever that means, I tell them “this puts money into my Wells Fargo account“. At this point they move on exhausted by the questions. On the one hand I didn’t lie except in the last answer. On the other — nobody cares if I even have a WF account (I don’t, but it’s none of their business either).

One can ask — why do I care so much? What's so special about my work that I am so apprehensive? In truth, nothing really. There are other aspects of my identity I also find difficult to discuss in public. The most relevant is "What is Combinatorics?" which for some reason is asked over and over as if there is a good answer (see this blog post for my own answer and this Wikipedia article I wrote). When I hear people explaining what it is, half the time it sounds like they are apologizing for something they didn't do…

There are other questions relevant to my complex identity that I am completely uninterested in discussing. Like “What do you think of the Russian President?” or “Who is a Jew?“, or “Are you a Zionist?” It’s not that my answers are somehow novel, interesting or controversial (they are not). It’s more like I am afraid to hear responses from the people who asked me these questions. More often than not I find their answers unfortunate or plain offensive, and I would rather not know that.

Let me conclude on a positive note, by telling a party story of my own. Once, during hors d'oeuvres (remember those?), one lady, a well known LA lawyer, walked up to me and said: "I hear you are a math professor at UCLA. This is so fascinating! Can you tell me what you do? Just WOW me!" I politely declined along the lines above. She insisted: "There has to be something that I can understand!" I relented: "Ok, there is one theorem I can tell you. In fact, this result landed me tenure." She was all ears.

I continued: “Do you know what’s a square-root-of-two?” She nodded. “Well, I proved that this number can never be a ratio of two integers, for example it’s not equal to 17/12 or anything like that.” “Oh, shut-the-F-up!” she exclaimed. “Are you serious? You can prove that?” — she was clearly suspicious. “Yes, I can“, I confirmed vigorously, “in fact, two Russian newspapers even printed headlines about that back a few years ago. We love math over there, you know.”

"But of course!", she said, "American media never writes about math. It's such a shame! That's why I never heard of your work. My son is much too young for this, but I must tell my nieces — they love science!" I nodded approvingly. She drifted away very happy, holding a small plate of meat-stuffed potato croquettes, enriched with this newly acquired knowledge. I do hope her nieces liked that theorem — it is cool indeed. And the proof is so super neat…
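For readers who have not seen it, here is a sketch of the classic proof by contradiction, with integers p and q (certainly not how I would phrase it over croquettes):

\[
\sqrt{2} \;=\; \frac{p}{q}, \ \gcd(p,q)=1 \;\;\Longrightarrow\;\; p^2 = 2q^2 \;\;\Longrightarrow\;\; 2 \mid p \;\;\Longrightarrow\;\; 4 \mid p^2 = 2q^2 \;\;\Longrightarrow\;\; 2 \mid q,
\]

contradicting $\gcd(p,q)=1$. So no ratio of two integers will do, 17/12 or otherwise.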

It could have been worse! Academic lessons of 2020

December 20, 2020 4 comments

Well, this year sure was interesting, and not in a good way. Back in 2015, I wrote a blog post discussing how video talks are here to stay, and how we should all agree to start giving them and embrace watching them, whether we like it or not. I was right about that, I suppose. OTOH, I sort of envisioned a gradual acceptance of this practice, not the shock therapy of a phase transition. So, what happened? It’s time to summarize the lessons and roll out some new predictions.

Note: this post is about academic life, which is undergoing some changes. The changes in real life are much more profound, but are well discussed elsewhere.

Teaching

This was probably the bleakest part of academic life, much commented upon by the media. Good thing there is more to academia than teaching, no matter what the ignorant critics think. I personally haven't heard anyone say, post-March 2020, that online education is an improvement. If you are like me, you probably spent much more time preparing and delivering your lectures. The quality probably suffered a little. The students probably didn't learn as much. Neither party probably enjoyed the experience too much. They also probably cheated quite a bit more. Oh, well…

Let's count the silver linings. First, it will all be over some time next year. At UCLA, not before the end of Summer. Maybe in the Fall… Second, it could've been worse. Much worse. Depending on the year, we would have had different issues. Back in 1990, we would all have been furloughed for a year, living off our savings. In 2000, most families had just one personal computer (and no smartphones, obviously). Let the implications of that sink in. But even in 2010 we would have had giant technical issues teaching on Skype (right?) by pointing our laptop cameras at blackboards, with dismal results. The infrastructure which allows good quality streaming was also not widespread (people were still using Redbox, remember?)

Third, the online technology somewhat mitigated the total disaster of studying during the pandemic. Students who are stuck in faraway countries or busy with family life can watch stored videos of lectures at their convenience. Educational and grading software allows students to submit homeworks and exams online, and instructors to grade them. Many other small things not worth listing, but worth being thankful for.

Fourth, the accelerated embrace of educational technology could be a good thing long term, even when things go back to normal. No more emails with scanned late homeworks, no more canceled/moved office hours while away at conferences. This can all help us become better at teaching.

Finally, the long-declared "death of MOOCs" is no longer controversial. As a long time (closeted) opponent of online education, I am overjoyed that MOOCs are no longer viewed as a positive experience for university students, more like something to suffer through. Here in CA we learned this a while ago, as the eagerness of the current Gov. Newsom (back then Lt. Gov.) to embrace online courses did not work out well at all. Back in 2013, he said that the whole UC system needs to embrace online education, pronto: "If this doesn't wake up the U.C. [..] I don't know what will." Well, now you know, Governor! I guess, in 2020, I don't have to hide my feelings on this anymore…

Research

I always thought that mathematicians can work from anywhere with a good WiFi connection. True, but not really – this year was a mixed experience, as lonely introverts largely prospered research-wise, while busy family people and extraverts clearly suffered. Some day we will know how much research suffered in 2020, but for me personally it wasn't bad at all (see e.g. some of my results described in my previous blog post).

Seminars

I am not even sure we should be using the same word to describe research seminars during the pandemic, as the experience of giving and watching math lectures online is so drastically different from what we are used to. Let's count the differences, which are both positive and negative.

  1. The personal interactions suffer. Online, people are much shier about interrupting, following up with questions after the talk, etc. The usual pre- or post-seminar meals allow the speaker to meet the (often junior) colleagues who might be more open to asking questions in an informal setting. This is all bad.
  2. Being online, the seminars have opened up to a worldwide audience. This is just terrific, as people from remote locations across the globe now have the same access to seminars at leading universities. What the arXiv did to math papers, covid did to math seminars.
  3. Again, being online, the seminars are no longer restricted to local speakers, nor do they have to make travel arrangements for out-of-town speakers. Some UCLA seminars this year had many European speakers, something which would have been prohibitively expensive just a year ago.
  4. Many seminars are now recorded with videos and slides posted online, like we do at the UCLA Combinatorics and LA Combinatorics and Complexity seminars I am co-organizing. The viewers can watch them later, can fast forward, come back and re-watch them, etc. All the good features of watching videos I extolled back in 2015. This is all good.
  5. On a minor negative side, the audience is no longer stable, as it varies from seminar to seminar, further diminishing personal interactions and making the level of the audience somewhat unpredictable and hard to aim for.
  6. As a seminar organizer, I make it a personal quest to encourage people to turn on their cameras at the seminars by saying hello only to those whose faces I see. When the speaker doesn't see the faces, nodding or puzzled, they have no idea whether they are being clear, too fast, or too slow, etc. Stopping to ask for questions no longer works well, especially if the seminar is being recorded. This invariably leads to worse presentations, as the speakers can misjudge the audience reactions.
  7. Unfortunately, not everyone is capable of handling technology challenges equally well. I have seen remarkably well presented talks, as well as some talks of extremely poor quality. The ability to mute yourself and hide behind your avatar is the only saving grace in such cases.
  8. Even the true haters of online education are now at least semi-on-board. Back in May, I wrote to Chris Schaberg, dubbed by the insufferable Rebecca Schuman as "vehemently opposed to the practice". He replied that he is no longer that opposed to teaching online, and that he is now in the "it's really complicated!" camp. Small miracles…

Conferences

The changes in conferences are largely positive. Unfortunately, some conferences from the Spring and Summer of 2020 were canceled and moved, somewhat optimistically, to 2021. Looking back, they should all have been held in the online format, which opens them to participants from around the world. Let’s count upsides and downsides:

  1. No need for travel, long time commitments, or financial expenses. Some conferences continue charging fees for online participation. This seems weird to me. I realize that some conferences are vehicles to support various research centers and societies. Whatever, this is unsustainable, as online conferences will likely survive the pandemic. These organizations should figure out some other income sources or die.
  2. The conferences are now truly global, so the emphasis is purely on mathematical areas rather than on geographic proximity. This suggests that the (until recently) very popular AMS meetings should probably die, making the AMS even more of a publisher than it is now. I am especially looking forward to the death of the "joint meetings" in January, which in my opinion have outlived their usefulness as some kind of math extravaganza bringing everyone together. In fact, Zoom simply can't bring five thousand people together, just forget about it…
  3. The conferences are now open to people in other areas. This might seem minor — they were always open. However, given the time/money constraints, a mathematician is likely to go only to conferences in their area. Besides, since they rarely get invited to speak at conferences in other areas, travel to such conferences is even harder to justify. This often leads to groupthink as the same people meet year after year at conferences on narrow subjects. Now that this is no longer an obstacle, we might see more interactions between the fields.
  4. On a negative side, the best kind of conferences are small informal workshops (think of Oberwolfach, AIM, Banff, etc.), where the lectures are advanced and the interactions are intense. I miss those and hope they come back, as they are really irreplaceable in the online setting. If all goes well, these are the only conferences which should definitely survive, and perhaps even expand in numbers.

Books and journals

A short summary is that in math, everything should be electronic, instantly downloadable and completely free. Cut off from libraries, thousands of mathematicians were instantly left to the perils of their university libraries' electronic subscriptions and their personal book collections. Some fared better than others, in part thanks to the arXiv, non-free journals offering old issues free to download, and some ethically dubious foreign websites.

I have been writing about my copyleft views for a long time (see here, there and most recently there). It gets more and more depressing every time. Just when you think there is some hope, the resilience of paid publishing and the community's reluctance to change keep the unfortunate status quo in place. You would think everyone would be screaming about the lack of access to books/journals, but I guess everyone is busy doing something else. Still, there are some lessons worth noting.

  1. You really must have all your papers freely available online. Yes, copyrighted or not, the publishers are ok with authors posting their papers on their personal websites. They are not ok when others post your papers on their websites, so free access to your papers is on you and your coauthors (if any). Unless you have already done so, do this asap! Yes, this applies even to papers accessible online by subscription to selected libraries. For example, many libraries, including the entire UC system, no longer have access to Elsevier journals. Please help both us and yourself! How hard is it to put the paper on the arXiv or your personal website? If people like Noga Alon and Richard Stanley found time to put hundreds of their papers online, so can you. I make a point of emailing people to ask them to do that every time I come across a reference which I cannot access. They rarely do, and usually just email me the paper. Oh, well, at least I tried…
  2. Learn to use databases like MathSciNet and Zentralblatt. Maintain your own website by adding slides, video links, and all your papers. Make sure to clean up your Google Scholar profile and keep it up to date. When left unattended it can get overrun with random papers by other people, random non-research files you authored, separate items for the same paper, etc. Deal with all that – it's easy and takes just a few minutes (also, some people judge you by it). When people are struggling to do research from home, every bit of help counts.
  3. If you are signing a book contract, be nice to online readers. Make sure you keep the right to display a public copy on your website. We all owe a great deal of gratitude to authors who did this. Here is my favorite, now supplemented with high quality free online lectures. Be like that! Don't be like one author (who will remain unnamed) who refused to email me a copy of a short 5-page section from his recent book. I wanted to teach the section in my graduate class on posets this Fall. Instead, the author suggested I buy a paper copy. His loss — I ended up teaching some other material instead. Later on, I discovered that the book is already available on one of those ethically compromised websites. He was fighting a battle he had already lost!

Home computing

Different people can draw different conclusions from 2020, but I don't think anyone would dispute the importance of having a good home computing setup. There is a refreshing variety of ways in which people do this, and it's unclear to me what the optimal setup is. With a vaccine on the horizon, people might be reluctant to invest further in new computing equipment (or video cameras, lights, whiteboards, etc.), but the holiday break is actually a good time to marinate on what worked out well and what didn't.

Read your evaluations and take them to heart. Make changes when you see there are problems. I know, it’s unfair, your department might never compensate you for all this stuff. Still, it’s a small price to pay for having a safe academic job in the time of widespread anxiety.

Predictions for the future

  1. Very briefly: I think online seminars and conferences are here to stay. Local seminars and small workshops will also survive. The enormous AMS meetings and expensive Theory CS meetings will play with the format, but eventually turn online for good or die an untimely death.
  2. Online teaching will continue to be offered by every undergraduate math program, to reach students across the spectrum of personal circumstances. A small minority of courses, but still. Maybe one section each of calculus, linear algebra, intro probability, discrete math, etc. Some faculty might actually prefer this format, to stay away from the office for a semester. Perhaps, in place of a sabbatical, they could ask for permission to spend a semester at some other campus, maybe in another state or country, while they continue teaching, holding seminars, supervising students, etc. This could be a perk of academic life to compete with the "remote work" that many businesses are starting to offer on a permanent basis. Universities would have to redefine what they mean by the "residence" requirement for both faculty and students.
  3. More university libraries will play hardball and unsubscribe from major for-profit publishers. This again sounds hopeful, but it will not snowball for at least the next 10 years.
  4. There will be some standardization of online teaching requirements across the country. Online cheating will remain widespread. Courts will repeatedly rule that businesses and institutions can discount or completely ignore all 2020 grades as unreliable, in large part because of the cheating scandals.

Final recommendations

  1. Be nice to your junior colleagues. In the winner-take-all no-limits online era, the established and well-known mathematicians get invited over and over, while their junior colleagues get overlooked, just when they really need help (the job market might be tough this year). So please go out of your way to invite them to give talks at your seminars. Help them with papers and application materials. At least reply to their emails! Yes, even small things count…
  2. Do more organizing if you are in a position to do so. In the absence of physical contact, many people are too shy and shell-shocked to reach out. Seminars, conferences, workshops, etc. make academic life seem somewhat normal, and the breaks definitely allow for more interactions. Given the apparent abundance of online events, one may be forgiven for thinking that no more are needed. But more locally focused online events are actually important to help your communities. These can prove critical until everything is back to normal.

Good luck everybody! Hope 2021 will be better for us all!