
How to fight the university bureaucracy and survive

June 27, 2021

The sheer size of the university administration can instill fear. How can you possibly fight such a machine? Even if an injustice happened to you, you are just one person with no power, right? Well, I think you can fight. Whether you succeed is another matter. But at least you can try… In this post I will try to give you some advice on how to do this.

Note: Initially I wanted to make this blog post light and fun, but I couldn’t think of a single joke. Somehow, the subject doesn’t inspire… So read this only if it’s relevant to you. Wait for future blog posts otherwise…

Warning: Much of what I say is relevant to big state universities in the US. Some of what I say may also be relevant to other countries and university systems, I wouldn’t know.

Basics

Who am I to write about this? It is reasonable to ask whether any of this is based on my personal experience of fighting university bureaucracies. The answer is yes, but I am not willing to make any public disclosures, to protect the privacy of all parties involved. Let me just say that over the past 20 years I have had several relatively quiet and fairly minor fights with university bureaucracies, some of which I won rather quickly by being right. Once, I bullied my way to victory despite being in the wrong (as I later learned), and once I won on a difficult (non-personal) political issue by being cunning and playing a really long game that took almost 3 years. I didn't lose any, but I did refrain from fighting several times. By contrast, when I tried to fight the federal government a couple of times (on academic matters), I lost quickly and decisively. They are just too powerful…

Should you fight? Maybe. But probably not. Say you complained to the administration about what you perceive to be an injustice to you or to someone else. Your complaint was denied. This is when you need to decide whether you want to start a fight. If you do, you will spend a lot of effort and (on average) probably lose. Administrations are powerful and know what they are doing. You probably don't, otherwise you wouldn't be reading this. This blog post might help you occasionally, but it won't change the big picture.

Can you fight? Yes, you can. You can win by being right and convincing bureaucrats to see it that way. You can win by being persistent when others give up. You can also win by being smart. Big systems have weaknesses you can exploit, see below. Use them.

Is there a downside to winning a fight? Absolutely. In the process you might lose some friends, raise some suspicions among colleagues, and invite retribution. On the positive side, big systems have very little institutional memory — your win and the resulting embarrassment to the administration will be forgotten soon enough.

Is there an upside to losing a fight? Actually, yes. You might gain the respect of some colleagues as someone willing to fight. In fact, people tend to want to be friends/friendly with such people out of self-preservation. And if your cause is righteous, this might help your reputation in and beyond the department.

Why did I fight? Because I just couldn't go on without a fight. The injustice, as I perceived it, was eating me alive, and I had a hunch there was a nonzero chance I would win. There were some cases when I figured the chances were zero, and I didn't need the grief. There were cases when the issue was much too minor to waste my energy on. I don't regret those decisions, but having grown up in this unsavory part of Moscow, I was conditioned to stand up for myself.

Is there a cost to not fighting? Yes, and it goes beyond the obvious. First, fighting bureaucracy is a skill, and every skill takes practice. I remember when I tried to rent an apartment in Cambridge, MA — some real estate agents would immediately ask if I went to Harvard Law School. Apparently it's a common practice for law students to sue their landlords, an "extra credit" homework exercise. Most of these lawsuits would quickly fail, but the legal proceedings were costly to the owners.

Second, there is a societal cost. If you feel confident that your case is strong, your winning might set a precedent which could benefit many others. I wrote on this blog once how I dropped (or never really started) a fight against the NSF, even though they clearly denied me the NSF Graduate Fellowship in a discriminatory manner, or at least that's what I continue to believe. Not fighting was the right thing to do for me personally (I would have lost, 100%), but my case was strong, and the fight itself might have raised some awareness of the issue. It took the NSF almost 25 years to figure out that it was time to drop the discriminatory GRE requirement.

Axioms

  1. If it’s not in writing it never happened.
  2. Everyone has a boss.
  3. Bureaucrats care about themselves first and foremost. Then about people in their research area, department and university, in that order. Then undergraduates. Then graduate students. You are the last person they care about.

How to proceed

Know your adversary. Remember — you are not fighting a mafia, a corrupt regime or the whole society. Don't get angry, fearful or paranoid. Your adversary is a group of good people who are doing their jobs as well as they can. They are not infallible, but they are probably pretty smart and very capable when it comes to bureaucracy, so from a game-theoretic point of view you may as well assume they are perfect. When they are not, you will notice — that's a weakness you can exploit, but don't expect it to happen.

Know your rights. This might seem obvious, but you would be surprised how many academics are not aware they have rights in a university system. In fact, it's a feature of every large bureaucracy — it produces a lot of well-meaning rules. For example, Wikipedia is a large project which has survived for 20 years, so unsurprisingly it has a large set of policies enforced by an army of admins. The same is probably true of your university and your department. Search the web for the faculty handbook, university and department bylaws, etc. If you can't find them anywhere, email the assistant to the Department Chair and ask for them.

Go through the motions. Say you think you were slighted. For example, your salary was not increased (enough), you didn't get a promotion, you got too many committee duties assigned, your sabbatical was not approved, etc. Whatever it is, you are upset, I get it. Your first step is not to complain but to go through the motions and email inquiries. Email the head of the department, the chair of the executive committee, your faculty dean, etc., whoever is the decision maker. Calmly ask them to explain the decision. Sometimes it was an oversight, and it's corrected with a quick apology and "thanks for bringing this up". You win, case closed. Other times you get a convincing explanation with which you might agree — say, the university is on a salary freeze so nobody got a salary increase, see some link. Again, case closed.

But in other cases you receive either an argument with which you disagree (say, "the decision was made based on your performance in the previous year"), a non-answer (say, "I am on sabbatical" or "I will not be discussing personal matters by email"), or no answer at all. These are the cases you need to know how to handle, and each is a little different. I will try to cover as much territory as possible, but will surely miss some cases.

Ask for advice. This is especially important if you are a junior mathematician and feel a little overwhelmed. Find a former department chair, perhaps a professor emeritus, and have a quiet chat. Old-timers know the history of the department, who the university administrators are, what the rules are, what happened to previous complaints, what would fly and what wouldn't, etc. They might also suggest who else you should talk to who would be knowledgeable and helpful in dealing with the issue. With friends like these, you are in good shape.

Scenarios

Come by for a chat. This is a standard move by a capable bureaucrat. They invite you for a quick discussion, maybe sincerely apologize for "what happened" or "if you are upset", and promise something which they may or may not intend to keep. You are supposed to leave grateful that "you are heard", and nothing is really lost from the administration's point of view. You lost.

There is only one way to counter this move. Agree to a meeting — play nice and you might learn something. Don't record in secret — it's against the law in most states. Don't ask if you can record the conversation — even if the bureaucrat agrees, you will hear nothing but platitudes then (like "we in our university strive to make sure everyone is happy and successful, and it is my personal goal to ensure everyone is treated fairly and with respect"). This defeats the purpose of the meeting, moving you back to square one.

At the meeting do not agree with anything; never say yes or no to anything. Not even to the routine "No hard feelings?" Just nod, take careful notes, and say "thank you so much for taking the time to have this meeting" and "this information is very useful, I will need to think it over". Do not sign anything. If offered a document to sign, take it with you. If implicitly threatened, as in "Right now I can offer you this, but once you leave this office I can't promise…" (this is rare but does happen occasionally), ignore the threat. Just keep repeating "Thank you so much for informing me of my options, I will need to think it over." Go home, think it over and talk to somebody.

Get it all in writing. Within a few hours after the meeting, send the bureaucrat an email with your notes. Start this way: "Dear X, this is to follow up on the meeting we had on [date] regarding the [issue]. I am writing this to ensure there is no misunderstanding on my part. At the meeting you [offered/suggested/claimed/threatened] …. Please let me know if this is correct and what the details of … are."

A capable bureaucrat will recognize the move and will never go on record with anything unbecoming. They will accept the out you offered and claim that you indeed misunderstood. Don't argue with that — you have them where you want them. Having claimed a misunderstanding, they will need to give a real answer to your grievance (otherwise what was the point of the meeting?). Sometimes a bureaucrat will still resort to platitudes, but now that they are in writing, that trick is harder to pull off, and it leads us to a completely different scenario (see below).

Accept the win. You might receive something like this: “We sincerely apologize for [mistake]. While nothing can be done about [past decision], we intend to [compensate or rectify] by doing…” If this is a clear unambiguous promise in writing, you might want to accept it. If not, follow up about details. Do not pursue this any further and don’t make it public. You got what you wanted, it’s over.

Accept the defeat. You might learn that the administration acted by the book, exactly the way the rules/bylaws prescribe, and that you were not intentionally discriminated against in any way. Remain calm. Thank the bureaucrat for the "clarification". It's over.

Power of CC. If you receive a non-answer full of platitudes, or no email reply at all (give it exactly one week), then follow up. Write politely: "I am afraid I did not receive an answer to [my questions] in my email from [date]. I would really appreciate your response to [all issues I raised]. P.S. I am CC'ing this email to [your boss, boss of your boss, your assistant, your peers, other fellow bureaucrats, etc.] to let them know of [my grievance] and in case they can be helpful with this situation." They will not "be helpful", of course, but that's not the point. The CC move itself has immense power, driven by bureaucrats' self-preservation. Most likely you will get a reply within hours. Just don't abuse the CC move — use it when you have no other moves to play, as otherwise it loses its power.

Don’t accept a draw. Sometimes a capable bureaucrat might reply to the whole list on CC and write “We are very sorry [your grievance] happened. This is extremely atypical and related to [your unusual circumstances]. While this is normally not appropriate, we are happy to make an exception in your case and [compensate you].” Translation: “it’s your own fault, you brought it on yourself, we admit no wrongdoing, but we are being very nice and will make you happy even though we really don’t have to do anything, not at all.” While other bureaucrats will recognize the move and that there is an implicit admission of fault, they will stay quiet — it’s not their fight.

Now, there is only one way to counter this, as far as I know. If you don't follow up, it's an implicit admission of "own fault", which you don't want, as the same issue might arise again in the future. If you start explaining that it's really the bureaucrat's fault, you seem vindictive (as in "you already got what you wanted, why do you keep pushing this?"), and other bureaucrats will close ranks, leaving you worse off. The only way out is to pretend to be just as illogical as the bureaucrat pretends to be. Reply to the whole CC list with something like "Thank you so much for your apology and understanding of my [issue]. I am very grateful this is resolved to everyone's satisfaction. I gratefully accept your sincere apology and your assurances this will not happen again, to me or to anyone else in the department."

A capable bureaucrat will recognize that you are fighting fire with fire. In your email you sound naïve and sincere — how do you fight that? What are they going to do — reply "actually, I didn't issue any apology, as this was not my fault"? Now that seems overly defensive. And they would have to reply to the whole CC list again, which is not what they want. They are aware that everyone else knows they screwed up, so reminding everyone with a new email is not in their interest. And there is a decent chance you might reply to the whole CC list again with all that sugarcoated unpleasantness. Most likely, you won't hear from them again, or you will get just a personal (non-CC'd) email which you can ignore regardless of its content.

Shifting blame or responsibility. That's another trick bureaucrats employ very successfully. You might get a reply from bureaucrat X to the effect of "don't ask me, these are rules made by [people upstairs]" or "as far as I know, person Y is responsible for all this". This is great news for you — a tacit validation of your cause and an example of a bureaucrat putting their own well-being ahead of the institution's. Remember, your fight is not with X but with the administration. Immediately forward both your grievance and the reply to Y, or to X's boss if no names were offered, and definitely CC X "to keep you in the loop of further developments on this issue". That immediately pushes the bureaucracy into overdrive as it starts playing musical chairs in the game of "whose fault is this and what can be done".

As with musical chairs, you might have to repeat the procedure a few times, but chances are someone will eventually accept responsibility just to stop the embarrassment from going in circles. By then there will be so many people on the CC chain that your issue will be addressed appropriately.

Help them help you. Sometimes a complaint puts the bureaucrat into a stalemate. They want to admit that an injustice happened to you, but numerous university rules forbid them from acting to redress the situation. In order to bend these rules, they would have to take the case upstairs, which brings its own complications for everyone involved. Essentially, you need to throw them a lifeline by suggesting some creative solution to the problem.

Say, you can write "while I realize the deadline for approval of my half-year sabbatical has passed, perhaps the department can buy out one course from my Fall schedule and postpone the other until Spring." This moves the discussion from the "apology" subject to "what can be done", much easier bureaucratic terrain. While the bureaucrat may not agree with your proposed solution, your willingness to make do without an apology will earn you some points and perhaps lead to a resolution favorable to all parties.

Now, don't constrain your creativity when thinking up such a face-saving resolution. It is a common misconception that university administrations are always very slow and rigid. This is true "on average", and holds for all large administrative systems where responsibility is distributed across many departments and individuals. In reality, when they want to, such large systems can turn on a dime by quickly mobilizing their numerous resources (human, financial, legal, etc.). I've seen it in action, it's jaw-dropping, and it takes just one high-ranking person to take up the issue and make it a cause.

Making it public. You shouldn't do that unless you already lost but keep holding a grudge (and have tenure to protect you). Even then, you probably shouldn't do it unless you are really good at PR. Just about every time you make grievances public, you lose some social points with people who will hold it against you, claim you brought it on yourself, etc. In the world of social media your voice will be drowned out, and your case will be either ignored or take on a life of its own, with facts distorted to fit a particular narrative. The administration will close ranks and refuse to comment. You might be worse off than when you started.

The only example I can give is my own combative blog post, which remains by far my most widely read post. Everyone just loves watching a train wreck… Many people asked why I wrote it, since it made me a persona non grata in a whole area of mathematics. I don't have a good answer. In fact, that area may have lost some social capital as a result of my blog post, but it hasn't changed at all. Some people apologized, that's all. There is really nothing I can do, and they know it. The truth is — my upbringing was acting up again, and I just couldn't let it go without saying "Don't F*** with Igor Pak".

But you can very indirectly threaten to make it public. Don't do it unless you are at the endgame, dealing with a high-ranking administrator, and things are not looking good for you. Low-level university bureaucrats are not really afraid for their jobs. For example, the head of the department might not even want to occupy the position, and is fully protected by tenure anyway. But deans, provosts, etc. are often fully invested in their positions, which come with a substantial salary hike. If you have a sympathetic case, they won't want to be featured in a college newspaper as denying you some benefit, regardless of the merits. They won't be bullied into submission either, so some finesse is needed.

In this case I recommend you find the email address of some student editor of the local university newspaper. In your reply to the high-ranking administrator, write something like "Yes, I understand the university's position in regard to this issue. However, perhaps [creative solution]". Then quietly insert the editor's email into the CC. In the reply, the administrator will delete the email from the CC "for privacy reasons", but will google to find out who is being CC'd. Unable to gauge the extent of the newspaper's interest in the story, the administrator might choose to hedge and help you by throwing money at you or mollifying you in some creative way you proposed. Win–win.

Final word

I am confident there will be people on all sides who, collectively, disagree with just about every sentence I wrote. Remember — this blog post is not a recommendation to do anything. It's just my personal point of view on these delicate matters, which tend to go undiscussed, leaving many postdocs and junior faculty to face their grievances alone. If you know a good guide on how to deal with these issues (beyond Rota's advice), please post a link in the comments. Good luck everyone! I hope you never have to deal with any of this!

The Unity of Combinatorics

April 10, 2021

I just finished my very first book review for the Notices of the AMS. The authors are Ezra Brown and Richard Guy, and the book's title is the same as this blog post's. I had mixed feelings when I accepted the assignment. I knew it would take a lot of work (I was wrong — it took a huge amount of work). But I accepted because I strongly suspected that there is no "unity of combinatorics", and I wanted to be proved wrong. Here is how the book begins:

One reason why Combinatorics has been slow to become accepted as part of mainstream Mathematics is the common belief that it consists of a bag of isolated tricks, a number of areas: [very long list – IP] with little or no connection between them. We shall see that they have numerous threads weaving them together into a beautifully patterned tapestry.

Having read the book, I continue to maintain that there is no unity. The book review became a balancing act — how do you write a somewhat positive review if you don't believe in the mission of the book? Here is the first paragraph of the portion of the review where I touch upon themes very familiar to readers of this blog:

As I see it, the whole idea of combinatorics as a "slow to become accepted" field feels like a throwback to a long-forgotten era. This attitude was unfair but reasonably common back in 1970, outright insulting and relatively uncommon in 1995, and utterly preposterous in 2020.

After a lengthy explanation I conclude:

To finish this line of thought, it gives me no pleasure to conclude that the case for the unity of combinatorics is too weak to be taken seriously. Perhaps, the unity of mathematics as a whole is an easier claim to establish, as evident from [Stanley’s] quotes. On the other hand, this lack of unity is not necessarily a bad thing, as we would be amiss without the rich diversity of cultures, languages, open problems, tools and applications of different areas.

Enjoy the full review! And please comment on the post with your own views on this alleged “unity”.

P.S. A large part of the book is freely downloadable. I made this website for the curious reader.

Remark (ADDED April 17, 2021)
Ezra “Bud” Brown gave a talk on the book illustrating many of the connections I discuss in the review. This was at a memorial conference celebrating Richard Guy’s legacy. I was not aware of the video until now. Watch the whole talk.

What math stories to tell and not to tell?

February 8, 2021

Storytelling can be surprisingly powerful. When a story is skillfully told, you get an almost magical feeling of being part of it, making you care deeply about the protagonists. Even if under ordinary circumstances you have zero empathy for Civil War era outlaws or the emperor penguins of Antarctica, you may suddenly find yourself engrossed in their fortunes. This is a difficult skill to master, but the effects are visible even when used in earnest by beginners.

Recently I started thinking about the kind of stories mathematicians should be telling. This was triggered by Angela Gibney's kind invitation to contribute an article on math writing to the Early Career Collection in the Notices of the AMS. So I looked at a few older articles and found them just wonderful. I am not the target audience for some of them, but I just kept reading them all, one after another, until I exhausted the whole collection.

My general advice — read the collection! Read a few pieces by some famous people or some people you know. If you like them, keep reading. As I wrote in this blog post, you rarely get an insight into a mathematician's thinking unless they happen to write an autobiography or give an interview. While this is more of a "how to" genre, most pieces are written in the first-person narrative and do tell some interesting stories or offer some curious points of view.

It is possible I am the last person to find out about the collection. I am not a member of the AMS, I don’t read the Notices, and it’s been a long time since anyone considered me “early career”. I found a few articles a little self-centered (but who am I to judge), and I would quibble with some advice (see below). But even those articles I found compelling and thought-provoking.

Having read the collection, I decided to write about mathematical storytelling. This is not something that comes naturally to most people in the field. Math stories (as opposed to stories about mathematicians) tend to be rather dry and unexciting, especially in the early years of studying. I will blog my own article some other time, but for now let me address the question in the title.

Stories to tell

With a few notable exceptions, just about all stories are worth telling. Whether in your autobiography or in your personal blog, as long as they are interesting to somebody — it’s all good. Given the lack of good stories, or any math stories really, it’s a good bet somebody will find your stories interesting. Let me expound on that.

Basically, anything personal works. To give examples from the collection, see e.g. stories by Mark Andrea de Cataldo, Alicia Prieto-Langarica, Terry Tao and John Urschel. Most autobiographies are written in this style, but a short blog post is also great. Overcoming an embarrassment caused by such public disclosure can be difficult, which makes it even more valuable to the readers.

Anything historical works, from full-length monographs on the history of math to short point-of-view pieces. Niche and off-the-beaten-path stories are especially valuable. I personally like the classic History of Mathematical Notations by Florian Cajori, and Combinatorics: Ancient & Modern, a nice collection edited by Robin Wilson and John Watkins, with several articles authored by names you will recognize. Note that oral history can also be very valuable; see the kinds of stories discussed by László Lovász and Endre Szemerédi mentioned in this blog post, and Dynkin's interviews I discussed here.

Anything juicy works. I mean, if you have a story of some famous mathematician doing something unusual (good or bad, or just plain weird), that attracts attention. This was the style of Steven Krantz's two Mathematical Apocrypha books, with many revealing and embarrassing anecdotes giving a sense of a bygone era.

Anything inspirational works. A beautiful example of this style is Francis Su's Farewell Address as MAA President and part of his moving follow-up book (the book has other interesting material as well). From the collection, let me single out Finding Your Reward by Skip Garibaldi, which also aims to inspire. Yet another example is Bill Thurston's must-read MO answer "What's a mathematician to do?"

Any off the beaten path math style is great. Think of “The Strong Law of Small Numbers” by Richard Guy, or many conjectures Terry Tao discusses in his blog. Think of “Missed opportunities” by Freeman Dyson, “Tilings of space by knotted tiles” by Colin Adams, or “One sentence proof… ” by Don Zagier (see also a short discussion here) — these are all remarkable and memorable pieces of writing that don’t conform to the usual peer review paradigm.

Finally, anything philosophical or metamathematical finds an audience. I am thinking of “Is it plausible?” by Barry Mazur, “Theorems for a Price” by Doron Zeilberger, “You and Your Research” by Richard Hamming, “Mathematics as Metaphor” by Yuri Manin, or even “Prime Numbers and the Search for Extraterrestrial Intelligence” by Carl Pomerance. We are all in search of some kind of answers, I suppose, so reading others thinking aloud about these deep questions always helps.

Practice makes perfect

Before I move to the other side, here is a simple piece of advice on how to write a good story. Write as much as possible! There is no way around this. Absolutely no substitute, really. I've given this advice plenty of times, and so has everyone else. Let me conclude with this quote by Don Knuth, which is a bit similar to Robert Lazarsfeld's advice. It makes my point much better and with more authority than I can ever provide:

Of equal importance to solving a problem is the communication of that solution to others. The best way to improve your writing skills is to practice, practice, practice.

Seize every opportunity to write mini-essays about the theoretical work you are doing. Compose a blog for your friends, or even for yourself. When you write programs, write literate programs.

One of the best strategies to follow while doing PhD research is to prepare weekly reports of exactly what you are doing. What questions did you pursue that week? What positive answers did you get? What negative answers did you get? What are the major stumbling blocks that seem to be present at the moment? What related work are you reading?

Donald Knuth – On Writing up Research (posted by Omer Reingold), Theory Dish, Feb 26, 2018

Don’t be a journalist

In this interesting article in the same collection, Jordan Ellenberg writes:

Why don’t journalists talk about math as it really is? Because they don’t know how it really is. We do. And if we want the public discourse about math to be richer, broader, and deeper, we need to tell our own stories.

He goes on to suggest that one should start writing a blog and then pitch some articles to real newspapers and news magazines. He gives his own bio as one example (among others) of pitching and publishing in mainstream publications such as Slate and the New York Times. Obviously, I agree with the first (blog) part (duh!), but I am rather negative on the second part. I know, I know, this sounds discouraging, but hear me out.

First, what Jordan is not telling you is how hard he had to work on his craft before getting to the point of being acceptable to a general audience. This started with him getting a Summa Cum Laude A.B. degree from Harvard in both Math and English (if I recall correctly), and then publishing a well-received novel, all before starting his regular Slate column. Very few math people have this kind of background on which to build popular appeal.

Second, this takes away jobs from real journalists. Like every highly competitive intellectual profession, journalism requires years of study and practice. It has its own principles and traditions, graduate schools, etc. Call it chutzpah or a Dunning–Kruger effect, but just because you are excellent at harmonic analysis doesn't mean you can do even a mediocre job as a writer. Again — some people can do both, but most cannot. If anything, I suspect a negative correlation between math and writing skills.

Here is another way to think about this. Most people realize that they don't need to email their pretty iPhone pictures of a Machu Picchu sunrise to be published by National Geographic. Or that their family cobbler recipe may not be exactly what Gourmet Magazine is looking for. Why would you think that writing is much easier, then?

Third, this cheapens our profession to some degree. You really don't need a Ph.D. in algebraic number theory and two perfect scores at the IMO to write about Powerball or baseball. You need an M.S. in statistics and really good writing skills. There are plenty of media sites which do that now, such as 538. There is even a whole data-driven journalism (DDJ) specialization, with many practitioners and a handful of Pulitzer Prizes. Using quantitative methods is now mainstream, so what exactly are you bringing to the table?

Fourth, it helps to be honest. Jordan writes: "Editors like an angle. If there's a math angle to a story in the news, pitch it! As someone with a degree in math, you have something to offer that most writers don't." This is true in the rare instances when, say, a Fields Medal in your area is awarded, or something like that. But if it's in an area far from yours, then, uhm, you've got nothing over many thousands of other people.

Now, please don’t take this as “don’t comment on current affairs” advice. No, no — please do! Comment away on your blog or on your podcast. Just don’t take jobs away from journalists. Help them instead! Write them emails, correct their mistakes. Let them interview you as an “expert”, whatever. Part of the reason the math related articles are so poor is because of mathematicians’ apathy and frequent disdain to the media, not because we don’t write newspaper articles — it’s really not our job.

Let me conclude with an anecdote about me reaching out to a newspaper. Once upon a time, long ago, flights used to distribute real newspapers to the passengers. I was sitting in the back and got a Wall Street Journal, which I read out of boredom during takeoff. There was an article discussing the EU expansion and the fact that, by some EU rules, the headquarters needs a translator from every language to every other language. The article predicted dark days ahead, since it's basically impossible to find people who can translate between some smaller languages, such as Maltese to Lithuanian. The article provided a helpful graph showing the number of translators needed as a function of the number of countries and claimed exponential growth.

I was not amused, cut out the article, and emailed the author upon arrival, saying that with all my authority as an assistant professor at MIT, I promise that n(n-1) grows polynomially, not exponentially. I got back a surprisingly apologetic reply. The author confessed he had been a math major in college, but was using the word without thinking. I don't know if the WSJ ever published a correction, but I bet the author will not be using this word so casually anymore, and if he ever advances to an editorial position he will propagate this knowledge to others. So there — that's my personal contribution to improving public discourse…
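The arithmetic behind my complaint is trivial, but for fun, here is a two-line sketch (the helper name is mine, purely for illustration) contrasting the quadratic pair count with genuinely exponential growth:

```python
# With n official languages, the ordered (source, target) translator
# pairs number n(n-1): quadratic, hence polynomial growth.
def translator_pairs(n: int) -> int:
    return n * (n - 1)

# Going from 15 to 30 languages roughly quadruples the pair count,
# while a truly exponential count (say 2^n) would get squared instead.
print(translator_pairs(15), translator_pairs(30))   # → 210 870
print(2**15, 2**30)                                 # → 32768 1073741824
```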

Don’t be an apologist

In another beautifully written article in the Early Career collection, Izzet Coskun gives “advice on how to communicate mathematics quickly in informal settings”. He writes:

Whether before a promotion committee, at a party where one might meet future politicians or future parents of future colleagues, in the elevator on the way up to tea, or in the dean’s office at a job interview, we often have the opportunity to explain our work to a general audience. The time we have is usually short [..] Our audience will not be familiar with our terminology. Communicating mathematics in such settings is challenging.

He then gives a lot of very useful practical advice on how to prepare for such a "math under a minute" conversation, how to be engaging, accessible, etc. It's all-around good advice. However, I disagree. Here is my simple advice: Don't Do It! If it's a dean and this is a job interview, feel free to use any math jargon you want — it's not your fault your field is technical, and the dean of sciences is used to it anyway. Otherwise, just say NO.

It’s true that sometimes your audience is friendly and is sincere in their interest in your work. In that case no matter what you say will disappoint them. There is a really good chance they can’t understand a word of what you say. They just think they can, and you are about to disillusion them.

But more often than not, the audience is actually not friendly, as was the case at the party Izzet described in his article. Many people harbor either low regard for or outright resentment toward math, stemming from their school years or some kind of "life experience". These folks simply want to reinforce their views, and no matter what you say, it will be taken as "you see, math is hard, boring, and useless".

One should not confuse the unfriendlies with stupid or uneducated people. On the contrary, a lot of educated people think this way. A prime example is Amy Wax with her inimitable quote:

If we got rid of ninety percent of the math Ph.D. programs, would we really be worse off in any material respect?  I think that’s a serious question.

I discussed this quote at length in this blog post. There, I tried to answer her question. But after a few back-and-forth emails (which I didn't make public), it became clear that she is completely uninterested in actually learning what math is and what it does. She just wants to have her own answer validated by some area practitioners. Oh, well…

So here is the real reason why I think answering such people is pointless. No matter what you say, you come across as an apologist for the field. If people really want to understand what math is for, there are plenty of sources. In fact, I have several bookshelves with extremely well-written book-length answers. But it's not your job to educate them! Worse, it is completely unreasonable to expect you to answer in "under one minute".

Think about how people react when they meet other professionals. Someone says "I develop new DNA-based cancer treatments" or "I work on improving VLSI architecture", or "I devise new option pricing strategies". Is there a follow-up request to explain it in "under one minute"? Not really. Let me give you a multiple choice. Is that because people think that:

a) these professions are boring compared to math and they would rather hear about the latter?

b) they know exactly what these professionals do, but math is so darn mysterious?

c) they know these professions are technical and hard to understand, but even children can understand math, so how hard can that be?

d) these professions are clearly useful, but what do math people do — solve quadratic equations all day?

If you answered a) or b), you have more faith in humanity than I do. If you answered c), you have never spoken to anyone about math at a party. So d) is the only acceptable answer, even if it's an exaggeration. Some people (mostly under 7) think that I "add numbers all day", some people (mostly in the social sciences) think that I "take derivatives all day", etc.; you get the point. My advice — don't correct them. It makes them unhappy. Doesn't matter if they are 7 or 77 — when you correct them, the unhappiness is real and visible…

So here is a summary of how I deal with such questions. If people ask what I do, I answer "I do math research and I teach". If they ask what kind of research, I say "advanced math". If they ask for details, I tell them "it's complicated". If they ask why, I tell them "because it takes many years of study to even understand the math lingo, so if I tell you what I do it will sound like I am speaking a foreign language".

If they ask what the applications of my research are (and they always do), I tell them "teaching graduate classes". If they ask for "practical" applications, whatever that means, I tell them "this puts money into my Wells Fargo account". At this point they move on, exhausted by the questions. On the one hand, I didn't lie except in the last answer. On the other — nobody cares if I even have a WF account (I don't, but it's none of their business either).

One can ask — why do I care so much? What's so special about my work that I am so apprehensive? In truth, nothing really. There are other aspects of my identity I also find difficult to discuss in public. The most relevant is "What is Combinatorics?", which for some reason is asked over and over as if there is a good answer (see this blog post for my own answer and this Wikipedia article I wrote). When I hear people explaining what it is, half the time it sounds like they are apologizing for something they didn't do…

There are other questions relevant to my complex identity that I am completely uninterested in discussing. Like “What do you think of the Russian President?” or “Who is a Jew?“, or “Are you a Zionist?” It’s not that my answers are somehow novel, interesting or controversial (they are not). It’s more like I am afraid to hear responses from the people who asked me these questions. More often than not I find their answers unfortunate or plain offensive, and I would rather not know that.

Let me conclude on a positive note, by telling a party story of my own. Once, during hors d'oeuvres (remember those?), one lady, a well-known LA lawyer, walked up to me and said: "I hear you are a math professor at UCLA. This is so fascinating! Can you tell me what you do? Just WOW me!" I politely declined along the lines above. She insisted: "There has to be something that I can understand!" I relented: "Ok, there is one theorem I can tell you. In fact, this result landed me tenure." She was all ears.

I continued: "Do you know what the square root of two is?" She nodded. "Well, I proved that this number can never be a ratio of two integers; for example, it's not equal to 17/12 or anything like that." "Oh, shut-the-F-up!" she exclaimed. "Are you serious? You can prove that?" — she was clearly suspicious. "Yes, I can", I confirmed vigorously, "in fact, two Russian newspapers even printed headlines about it back a few years ago. We love math over there, you know."

"But of course!", she said, "American media never writes about math. It's such a shame! That's why I never heard of your work. My son is much too young for this, but I must tell my nieces — they love science!" I nodded approvingly. She drifted away very happy, holding a small plate of meat-stuffed potato croquettes, enriched with this newly acquired knowledge. I do hope her nieces liked that theorem — it is cool indeed. And the proof is so super neat…
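In case you have never seen it, the proof she almost heard is the standard descent argument:

```latex
% Classic proof that \sqrt{2} is irrational.
Suppose toward a contradiction that $\sqrt{2} = p/q$, where $p, q$ are
positive integers with no common factor. Squaring gives $p^2 = 2q^2$,
so $p^2$ is even, and hence $p$ is even: write $p = 2r$. Substituting,
$4r^2 = 2q^2$, that is, $q^2 = 2r^2$, so $q$ is even as well. But then
$p$ and $q$ share the factor $2$, contradicting our choice of $p/q$ in
lowest terms. Hence $\sqrt{2}$ is irrational. $\blacksquare$
```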

What if they are all wrong?

December 10, 2020 5 comments

Conjectures are a staple of mathematics. They are everywhere, permeating every area, subarea and subsubarea. They are diverse enough to defy a single general adjective. They come in all shapes and sizes. Some of them are famous, classical, general, important, inspirational, far-reaching, audacious, exciting or popular, while others are speculative, narrow, technical, imprecise, far-fetched, misleading or recreational. That's a lot of beliefs about unproven claims, yet we persist in dispensing them, inadvertently revealing our experience, intuition and biases.

Conjectures also vary in attitude. Like finish-line ribbons they all appear equally vulnerable to an outsider, but in fact they differ widely from race to race. Some are eminently reachable, the only question being who will get there first (think 100-meter dash). Others are barely on the horizon, requiring great effort, a variety of tools, and an extended time commitment (think Ironman triathlon). The most celebrated, third type are like those sci-fi space expeditions requiring multigenerational commitments spanning hundreds of years, often losing contact with the civilization left behind. And we can't forget the romantic fourth type — like the North Star, no one actually wants to reach them, as they are largely used for navigation, to find a direction in uncharted waters.

Now, conjectures famously provide a foundation of the scientific method, but that's not at all how we actually think of them in mathematics. I argued back in this pointed blog post that citations are crucial for day-to-day math development, so one should take utmost care in making references. While this claim is largely uncontroversial and serves as a raison d'être for most Google Scholar profiles, conjectures provide a convenient idealistic way out. Thus, it's much more noble and virtuous to say "I dedicated my life to the study of the XYZ Conjecture" (even if one never publishes anything) than "I am working hard writing so many papers to gain the respect of my peers, get a promotion, and provide for my family". Right. Obviously…

But given this apparent (true or perceived) importance of conjectures, are you sure you are using them right? What if some or many of these conjectures are actually wrong? What then? Should you be flying that starship if there is no there there? An idealist would argue something like "it's a journey, not a destination", but I strongly disagree. Getting closer to the truth is actually kind of important, both as a matter of public policy and on an individual level. It is thus pretty important to get right where we are going.

What are conjectures in mathematics?

That’s a stupid question, right? Conjectures are mathematical claims whose validity we are trying to ascertain. Is that all? Well, yes, if you don’t care if anyone will actually work on the conjecture. In other words, something about the conjecture needs to interesting and inspiring.

What makes a conjecture interesting?

This is a hard question to answer because it is as much psychological as it is mathematical. A typical answer would be “oh, because it’s old/famous/beautiful/etc.” Uhm, ok, but let’s try to be a little more formal.

One typically argues: "oh, that's because this conjecture would imply [a list of interesting claims and known results]". Well, ok, but this is self-referential. We already know all those "known results", so no need to prove them again. And these "claims" are simply other conjectures, so this is really an argument of the type "this conjecture would imply that conjecture", and thus not universally convincing. One can argue: "look, this conjecture has so many interesting consequences". But this is both subjective and unintuitive. Shouldn't having so many interesting conjectural consequences suggest that perhaps the conjecture is too strong and likely false? And if the conjecture is likely false, shouldn't that make it uninteresting?

Also, wouldn’t it be interesting if you disprove a conjecture everyone believes to be true? In some sense, wouldn’t it be even more interesting if until now everyone one was simply wrong?

None of these are new ideas, of course. For example, faced with the need to justify the "great" BC conjecture, or rather a 123-page survey on the subject (which is quite interesting and doesn't really need to be justified), the authors suddenly turned reflective. Mindful of the self-referential approach, which they quickly discard, they chose a different tactic:

We believe that the interest of a conjecture lies in the feeling of unity of mathematics that it entails. [M.P. Gomez Aparicio, P. Julg and A. Valette, “The Baum-Connes conjecture“, 2019]

Huh? Shouldn’t math be about absolute truths, not feelings? Also, in my previous blog post, I mentioned Noga Alon‘s quote that Mathematics is already “one unit“. If it is, why does it need a new “feeling of unity“? Or is that like one of those new age ideas which stop being true if you don’t reinforce them at every occasion?

If you are confused at this point, welcome to the club! There is no objective way to argue what makes certain conjectures interesting. It's all in our imagination. Nikolay Konstantinov once told me that "mathematics is a boring subject because every statement is equivalent to saying that some set is empty." He meant to be provocative rather than uninspiring. But the problem he is pointing to is quite serious.

What makes us believe a conjecture is true?

We already established that in order to argue that a conjecture is interesting, we need to argue that it's also true, or at least that we want to believe it to be true so as to have all those consequences. Note, however, that we argue that a conjecture is true in exactly the same way we argue it's interesting: by showing that it holds in some special cases, and that it would imply other conjectures which are believed to be true because they too are checked in various special cases. So in essence, this gives "true = interesting" in most cases. Right?

This is where it gets complicated. Say you are working on the "abc conjecture", which may or may not be open. You claim that it has many consequences, which makes it both likely true and interesting. One of them is the negative solution to the Erdős–Ulam problem about the existence of a dense set in the plane with rational pairwise distances. But a positive solution to the E-U problem implies Harborth's conjecture (aka the "integral Fáry problem") that every planar graph can be drawn in the plane with rational edge lengths. So, counterintuitively, if you follow the logic above, shouldn't you be working on a positive solution to Erdős–Ulam, since it would both imply one conjecture and give a counterexample to another? For the record, I wouldn't do that; I am just making a polemical point.

I am really hoping you see where I am going. Since there is no objective way to tell whether a conjecture is true, nor what exactly is so interesting about it, shouldn't we discard our biases and work just as hard towards disproving conjectures as towards proving them?

What do people say?

It’s worth starting with a general (if slightly poetic) modern description:

In mathematics, [..] great conjectures [are] sharply formulated statements that are most likely true but for which no conclusive proof has yet been found. These conjectures have deep roots and wide ramifications. The search for their solution guides a large part of mathematics. Eternal fame awaits those who conquer them first. Remarkably, mathematics has elevated the formulation of a conjecture into high art. [..] A well-chosen but unproven statement can make its author world-famous, sometimes even more so than the person providing the ultimate proof. [Robbert Dijkgraaf, The Subtle Art of the Mathematical Conjecture, 2019]

Karl Popper thought that conjectures are foundational to science, even if he somewhat idealized the efforts to disprove them:

[Great scientists] are men of bold ideas, but highly critical of their own ideas: they try to find whether their ideas are right by trying first to find whether they are not perhaps wrong. They work with bold conjectures and severe attempts at refuting their own conjectures. [Karl Popper, Heroic Science, 1974]

Here is how he somewhat reconciled the apparent contradiction:

On the pre-scientific level we hate the very idea that we may be mistaken. So we cling dogmatically to our conjectures, as long as possible. On the scientific level, we systematically search for our mistakes. [Karl Popper, quoted by Bryan Magee, 1971]

Paul Erdős was, of course, a champion of conjectures and open problems. He joked that the purpose of life is "proof and conjecture", and this theme is repeatedly echoed when people write about him. It is hard to overestimate his output, which included hundreds of talks titled "My favorite problems". He wrote over 180 papers with collections of conjectures and open problems (nicely assembled by Zbl. Math.).

Peter Sarnak has a somewhat opposite point of view, as he believes one should be extremely cautious about stating a conjecture so people don’t waste time working on it. He said once, only half-jokingly:

Since we reward people for making a right conjecture, maybe we should punish those who make a wrong conjecture. Say, cut off their fingers. [Peter Sarnak, UCLA, c. 2012]

This is not an exact quote — I am paraphrasing from memory. Needless to say, I disagree. I don't know how many fingers he wished Erdős to lose, since some of his conjectures were definitely disproved: one, two, three, four, five, and six. This is not me gloating; quite the opposite, in fact. When you state hundreds of conjectures over the span of almost 50 years, having only a handful disproved is an amazing batting average. It would, however, make me happy if Sarnak's conjecture were disproved someday.

Finally, there is a bit of a controversy whether conjectures are worth as much as theorems. This is aptly summarized in this quote about yet another champion of conjectures:

Louis J. Mordell [in his book review] questioned Hardy‘s assessment that Ramanujan was a man whose native talent was equal to that of Euler or Jacobi. Mordell [..] claims that one should judge a mathematician by what he has actually done, by which Mordell seems to mean, the theorems he has proved. Mordell’s assessment seems quite wrong to me. I think that a felicitous but unproved conjecture may be of much more consequence for mathematics than the proof of many a respectable theorem. [Atle Selberg, “Reflections Around the Ramanujan Centenary“, 1988]

So, what’s the problem?

Well, the way I see it, the efforts made towards proving vs. disproving conjectures are greatly out of balance. Despite all of Popper's high-minded claims about "severe attempts at refuting their own conjectures", I don't think there is much truth to that in the modern mathematical sciences. This does not mean that disproofs of famous conjectures aren't celebrated. Sometimes they are, see below. But it's clear to me that proofs are celebrated more frequently, and to a much greater degree. I have only anecdotal evidence to support my claim, but bear with me.

Take prizes. Famously, the Clay Math Institute gives $1 million for a solution of any of these major open problems. But look closely at the rules. According to item 5b, except for the P vs. NP problem and the Navier–Stokes Equation problem, it gives nothing ($0) for a disproof of these problems. Why, oh why?? Let's look into CMI's "primary objectives and purposes":

To recognize extraordinary achievements and advances in mathematical research.

So it sounds like the CMI does not think that disproving the Riemann Hypothesis needs to be rewarded, because this wouldn't "advance mathematical research". Surely you are joking? Whatever happened to "the opposite of a profound truth may well be another profound truth"? Why does the CMI want to put its thumb on the scale and support only one side? Do they not want to find out the solution, whatever it is? Shouldn't they be eager to dispense with a "wrong conjecture" so as to save numerous researchers from "advances to nowhere"?

I am sure you can see that my blood is boiling, but let's proceed to the P vs. NP problem. What if it's independent of ZFC? Clearly, the CMI wouldn't pay for proving that. Why not? It's not like this kind of thing has never happened before (see the obligatory link to CH). Some people believe this (or at least they did in 2012), and some, like Scott Aaronson, take it seriously enough. Wouldn't this be a great result, worthy of an award as much as a proof that P ≠ NP, or at least a nonconstructive proof that P = NP?

If your head is not spinning hard enough, here is another amusing quote:

Of course, it’s possible that P vs. NP is unprovable, but that that fact itself will forever elude proof: indeed, maybe the question of the independence of P vs. NP is itself independent of set theory, and so on ad infinitum! But one can at least say that, if P vs. NP (or for that matter, the Riemann hypothesis, Goldbach’s conjecture, etc.) were proven independent of ZF, it would be an unprecedented development. [Scott Aaronson, P vs. NP, 2016].

Speaking of Goldbach’s Conjecture, the most talked about and the most intuitively correct statement in Number Theory that I know. In a publicity stunt, for two years there was a $1 million prize by a publishing house for the proof of the conjecture. Why just for the proof? I never heard of anyone not believing the conjecture. If I was the insurance underwriter for the prize (I bet they had one), I would allow them to use “for the proof or disproof” for a mere extra $100 in premium. For another $50 I would let them use “or independent of ZF” — it’s a free money, so why not? It’s such a pernicious idea of rewarding only one kind of research outcome!

Curiously, even for Goldbach’s Conjecture, there is a mild divergence of POVs on what the future holds. For example, Popper writes (twice in the same book!) that:

[On whether Goldbach’s Conjecture is ‘demonstrable’] We don’t know: perhaps we may never know, and perhaps we can never know. [Karl Popper, Conjectures and Refutations, 1963]

Ugh. Perhaps. I suppose anything can happen… For example, our civilization can "perhaps" die out in the next 200 years. But is that likely? Shouldn't the gloomy past be a warning, not a prediction of the future? The only thing more outrageously pessimistic is this theological gem of a quote:

Not even God knows the number of permutations of 1000 avoiding the 1324 pattern. [Doron Zeilberger, quoted here, 2005]

Thanks, Doron! What a way to encourage everyone! Since we know from numerical estimates that this number is ≈ 3.7 × 10^1017 (see this paper and this follow-up), Zeilberger is suggesting that large pattern-avoidance numbers are impossibly hard to compute precisely, already in the range of only about 1018 digits. I really hope he is proved wrong in his lifetime.
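For perspective on the contrast: exact counts are trivial to get by brute force for tiny n; it's only at scale that the problem becomes hopeless. A minimal sketch (helper names are mine, not from any standard library):

```python
from itertools import combinations, permutations

def contains_1324(p):
    # p contains the pattern 1324 iff some positions i<j<k<l carry
    # values a, b, c, d with a < c < b < d (relative order 1,3,2,4).
    return any(a < c < b < d for a, b, c, d in combinations(p, 4))

def count_avoiders(n):
    # Number of permutations of {1,...,n} avoiding the pattern 1324.
    return sum(1 for p in permutations(range(1, n + 1))
               if not contains_1324(p))

print([count_avoiders(n) for n in range(1, 8)])
# → [1, 2, 6, 23, 103, 513, 2762]
# Brute force costs roughly n! * C(n,4) checks: hopeless long before n = 1000.
```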

But I digress. What I mean to emphasize is that there are many ways a problem can be resolved. Yet some outcomes are considered more valuable than others. Shouldn't the research achievement be rewarded, not the desired outcome? Here is yet another colorful opinion on this:

Given a conjecture, the best thing is to prove it. The second best thing is to disprove it. The third best thing is to prove that it is not possible to disprove it, since it will tell you not to waste your time trying to disprove it. That's what Gödel did for the Continuum Hypothesis. [Saharon Shelah, Rutgers Univ. Colloquium, 2001]

Why do I care?

For one thing, disproving conjectures is part of what I do. Sometimes people are a little shy to state them unambiguously as formal conjectures, so they phrase them as questions or open problems, but then clarify that they believe the answer is positive. This is a distinction without a difference, or at least I don't see any (maybe they are afraid of Sarnak's wrath?). Regardless, proving their beliefs wrong is still what I do.

For example, here is my old blog post on my disproof of the Noonan–Zeilberger Conjecture (joint with Scott Garrabrant). And in this recent paper (joint with Danny Nguyen), we disprove in one big swoosh Barvinok's Problem, Kannan's Problem, and Woods' Conjecture. Just this year I disproved three conjectures:

  1. The Kirillov–Klyachko Conjecture (2004) that the reduced Kronecker coefficients satisfy the saturation property (this paper, joint with Greta Panova).
  2. The Brandolini et al. Conjecture (2019) that concrete lattice polytopes can multitile the space (this paper, joint with Alexey Garber).
  3. Kenyon’s Problem (c. 2005) that every integral curve in R3 is a boundary of a PL surface comprised of unit triangles (this paper, joint with Alexey Glazyrin).

On top of that, just two months ago in this paper (joint with Han Lyu), we showed that the remarkable independence heuristic by I. J. Good for the number of contingency tables fails badly even for nearly all uniform marginals. This is not exactly the disproof of a conjecture, but it's close, since the heuristic was introduced back in 1950 and continues to work well in practice.

In addition, I am currently working on disproving two more old conjectures, which will remain unnamed until we actually resolve them (which might never happen, of course). In summary, I am deeply invested in disproving conjectures. The reasons why are somewhat complicated (see some of them below). But whatever my reasons, I demand and naively fully expect that my disproofs be treated on par with proofs, regardless of whether this expectation bears any relation to reality.

My favorite disproofs and counterexamples:

There are many. Here are just a few, some famous and some not-so-famous, in historical order:

  1. Fermat‘s conjecture (letter to Pascal, 1640) on primality of Fermat numbers, disproved by Euler (1747)
  2. Tait’s conjecture (1884) on hamiltonicity of graphs of simple 3-polytopes, disproved by W.T. Tutte (1946)
  3. General Burnside Problem (1902) on finiteness of periodic groups, resolved negatively by E.S. Golod (1964)
  4. Keller’s conjecture (1930) on tilings with unit hypercubes, disproved by Jeff Lagarias and Peter Shor (1992)
  5. Borsuk’s Conjecture (1932) on partitions of convex sets into parts of smaller diameter, disproved by Jeff Kahn and Gil Kalai (1993)
  6. Hirsch Conjecture (1957) on the diameter of graphs of convex polytopes, disproved by Paco Santos (2010)
  7. Woods’s conjecture (1972) on the covering radius of certain lattices, disproved by Oded Regev, Uri Shapira and Barak Weiss (2017)
  8. Connes embedding problem (1976), resolved negatively by Zhengfeng Ji, Anand Natarajan, Thomas Vidick, John Wright and Henry Yuen (2020)

In all these cases, the disproofs and counterexamples didn’t stop the research. On the contrary, they gave a push to further (sometimes numerous) developments in the area.

Why should you disprove conjectures?

There are three reasons, of different nature and importance.

First, disproving conjectures is opportunistic. As mentioned above, people seem to try proving much harder than they try disproving. This creates niches of opportunity for an open-minded mathematician.

Second, disproving conjectures is beautiful. Let me explain. Conjectures tend to be rigid, as in “objects of the type pqr satisfy property abc.” People like me believe in the idea of “universality“. Some might call it “completeness” or even “Murphy’s law“, but the general principle is always the same. Namely: it is not sufficient that one wishes that all pqr satisfy abc to actually believe in the implication; rather, there has to be a strong reason why abc should hold. Barring that, pqr can possibly be almost anything, so in particular non-abc. While some would argue that non-abc objects are “ugly” or at least “not as nice” as abc, the idea of universality means that your objects can be of every color of the rainbow — nice color, ugly color, startling color, quiet color, etc. That kind of palette has its own sense of beauty, but it’s an acquired taste I suppose.

Third, disproving conjectures is constructive. It depends on the nature of the conjecture, of course, but one is often faced with the necessity of constructing a counterexample. Think of this as an engineering problem: building some pqr which at the same time is not abc. Such a construction, if at all possible, might be difficult, time-consuming and computer-assisted. But so what? What would you rather do: build a mile-high skyscraper (none exist yet) or prove that this is impossible? Curiously, in CS Theory both algorithms and (many) complexity results are constructive (you need gadgets). Even GCT is partially constructive, although explaining that would take us a while.

What should the institutions do?

If you are an institution which awards prizes, stop with the legal nonsense: "We award […] only for a publication of a proof in a top journal". You need to set up a scientific committee anyway, since otherwise it's sometimes hard to tell if someone deserves a prize. With mathematicians you can expect anything anyway. Some will post two arXiv preprints, give a few lectures and then stop answering emails. Others will publish only in a journal where they are Editor-in-Chief. It's stranger than fiction, really.

What you should do is say in the official rules: "We have [this much money] and an independent scientific committee which will award any progress on [this problem] partially or in full as they see fit." Then a disproof or an independence result will receive just as much as a proof (what's done is done, what else are you going to do with the money?). This would also allow some flexibility for partial solutions. Say somebody proves Goldbach's Conjecture for all integers > exp(exp(10^100000)), way, way beyond the computational power needed to check the remaining integers. I would give this person at least 50% of the prize money, leaving the rest for future developments by possibly many people improving on the bound. However, under the old prize rules such a person gets bupkes for their breakthrough.

What should the journals do?

In short, become more open to results of a computational and experimental nature. If this sounds familiar, that's because it's a summary of Zeilberger's Opinions, viewed charitably. He is correct on this. This includes publishing results of the type "Based on computational evidence we believe in the following UVW conjecture" or "We develop a new algorithm which confirms the UVW conjecture for n < 13". These are still contributions to mathematics, and the journals should learn to recognize them as such.

To put this in the context of our theme, it is clear that a lot more effort has been placed on proofs than on finding counterexamples. However, in many areas of mathematics there are no small counterexamples, so a heavy computational effort is crucial for any hope of finding one. Such work is not as glamorous as traditional papers. But really, when it comes to standards, if a journal is willing to publish the study of something like the "null graphs", the ship has sailed for you…

Let me give you a concrete example where a computational effort is indispensable. The curious Lovász conjecture states that every finite connected vertex-transitive graph contains a Hamiltonian path. This conjecture has got to be false. It hits every red flag — there is really no reason why pqr = "vertex transitive" should imply abc = "Hamiltonian". The best lower bound for the length of the longest (self-avoiding) path is only about the square root of the number of vertices. In fact, even the original wording by Lovász shows he didn't believe the conjecture is true (also, I asked him and he confirmed).

Unfortunately, proving that some potential counterexample is not Hamiltonian is computationally difficult. I once had an idea of one (a nice cubic Cayley graph on “only” 3600 vertices), but Bill Cook quickly found a Hamiltonian cycle dashing my hopes (it was kind of him to look into this problem). Maybe someday, when the TSP solvers are fast enough on much larger graphs, it will be time to return to this problem and thoroughly test it on large Cayley graphs. But say, despite long odds, I succeed and find a counterexample. Would a top journal publish such a paper?
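For the curious, here is what such an exhaustive check looks like in miniature. The sketch below is a toy of my own (plain Python, nothing like Bill Cook's TSP machinery): a backtracking search for a Hamiltonian cycle in the Petersen graph, a classic vertex-transitive graph that provably has no Hamiltonian cycle (though it does have a Hamiltonian path, so it is not a counterexample to the conjecture). On 10 vertices the search is instant; on thousands of vertices this naive approach is hopeless, which is exactly the difficulty described above.

```python
# Backtracking search for a Hamiltonian cycle. The Petersen graph is used
# as a small test case: it is vertex-transitive and famously has no
# Hamiltonian cycle, which this exhaustive search confirms.

def petersen_graph():
    """Adjacency sets of the Petersen graph: outer 5-cycle on 0..4,
    inner pentagram on 5..9, and spokes i -- i+5."""
    adj = {v: set() for v in range(10)}
    def edge(a, b):
        adj[a].add(b); adj[b].add(a)
    for i in range(5):
        edge(i, (i + 1) % 5)          # outer cycle
        edge(5 + i, 5 + (i + 2) % 5)  # inner pentagram
        edge(i, 5 + i)                # spokes
    return adj

def has_hamiltonian_cycle(adj):
    """Exhaustive backtracking: extend a path from vertex 0 until all
    vertices are used, then try to close it back to the start."""
    n = len(adj)
    start = 0
    path, used = [start], {start}
    def extend():
        if len(path) == n:
            return start in adj[path[-1]]  # can we close the cycle?
        for w in adj[path[-1]]:
            if w not in used:
                used.add(w); path.append(w)
                if extend():
                    return True
                path.pop(); used.remove(w)
        return False
    return extend()

print(has_hamiltonian_cycle(petersen_graph()))  # False: no Hamiltonian cycle
```

The same routine answers instantly on toy graphs; the exponential blowup only shows up at scale, which is why serious attempts on large Cayley graphs need TSP-solver technology rather than raw backtracking.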

Editor’s dilemma

There are three real criteria for evaluating a solution of an open problem by a journal:

  1. Is this an old, famous, or well-studied problem?
  2. Are the tools interesting or innovative enough to be helpful in future studies?
  3. Are the implications of the solution to other problems important enough?

Now let’s make a hypothetical experiment. Let’s say a paper is submitted to a top math journal which solves a famous open problem in Combinatorics. Further, let’s say somebody already proved it is equivalent to a major problem in TCS. This checks criteria 1 and 3. Until not long ago it would be rejected regardless, so let’s assume this is happening relatively recently.

Now imagine two parallel worlds, where in the first world the conjecture is proved on 2 pages using beautiful but elementary linear algebra, and in the second world the conjecture is disproved in a 2-page summary of a detailed computational search. So in neither world do we have much to satisfy criterion 2. Now, a quiz: in which world will the paper be published?

You may have recognized that the first world is the story of Hao Huang's elegant proof of the induced subgraphs of hypercubes conjecture, which implies the sensitivity conjecture. The Annals published it, I am happy to learn, in a welcome break with the past. But unless we are talking about some 200-year-old famous conjecture, I can't imagine the Annals accepting a short computational paper in the second world. Indeed, it took a bit of a scandal to accept even the 400-year-old Kepler conjecture, which was proved in a remarkable computational work.

Now think about this. Is any of that fair? Shouldn’t we do better as a community on this issue?

What do other people do?

Over the years I asked a number of people about the uncertainty created by conjectures and what they do about it. The answers surprised me. Here I am paraphrasing them:

Some were dumbfounded: "What do you mean this conjecture could be false? It has to be true, otherwise nothing I am doing makes much sense."

Others were simplistic: “It’s an important conjecture. Famous people said it’s true. It’s my job to prove it.”

Third were defensive: “Do you really think this conjecture could be wrong? Why don’t you try to disprove it then? We’ll see who is right.”

Fourth were biblical: “I tend to work 6 days a week towards the proof and one day towards the disproof.”

Fifth were practical: “I work on the proof until I hit a wall. I use the idea of this obstacle to try constructing potential counterexamples. When I find an approach to discard such counterexamples, I try to generalize the approach to continue working on the proof. Continue until either side wins.”

If the last two seem sensible to you, that's because they are. However, I bet the fourth are just grandstanding — no way they actually do that. The fifth approach sounds great when it is possible, but that's exceedingly rare, in my opinion. We live in a technical age when proving new results often requires a great deal of effort and technology. You likely have the tools and intuition to work in only one direction. Why would you want to waste time working in the other?

What should you do?

First, remember to make conjectures. Every time you write a paper, tell a story of what you proved. Then tell a story of what you wanted to prove but couldn't. State it in the form of a conjecture. Don't be afraid of being wrong, or of being right but oversharing your ideas. That's a downside, sure. But the upside is that your conjecture might prove very useful to others, especially young researchers. It might advance the area, or help you find a collaborator to resolve it.

Second, learn to check your conjectures computationally in many small cases. It’s important to give supporting evidence so that others take your conjectures seriously.

Third, learn to run experiments and explore the area computationally. That's how you make new conjectures.

Fourth, understand yourself. Your skills, your tools. Your abilities, like problem solving, absorbing information from the literature, or making bridges to other fields. Faced with a conjecture, use this knowledge to understand whether, at least in principle, you might be able to prove or disprove it.

Fifth, actively look for collaborators. Those who have skills, tools, or abilities you are missing. More importantly, they might have a different POV on the validity of the conjecture and how one might want to attack it. Argue with them and learn from them.

Sixth, be brave and optimistic! Whether you decide to prove or disprove a conjecture, or simply state a new one, go for it! Ignore the judgements by the likes of Sarnak and Zeilberger. Trust me — they don't really mean it.
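To make the second and third points above concrete, here is a minimal sketch (a toy of my own, in plain Python; the choice of Goldbach's conjecture and the bound of 10,000 are purely for illustration) of checking a conjecture in many small cases. It either reports the first counterexample or reports that none exists below the bound — supporting evidence, not a proof.

```python
# Toy illustration of computationally checking a conjecture in small cases:
# Goldbach's conjecture says every even n > 2 is a sum of two primes.

def primes_up_to(n):
    """Sieve of Eratosthenes: returns a boolean list, True at prime indices."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sieve

def first_goldbach_counterexample(bound):
    """Return the smallest even n in [4, bound] that is NOT a sum of
    two primes, or None if the conjecture holds up to the bound."""
    is_prime = primes_up_to(bound)
    for n in range(4, bound + 1, 2):
        if not any(is_prime[p] and is_prime[n - p]
                   for p in range(2, n // 2 + 1)):
            return n  # first counterexample, if any
    return None

print(first_goldbach_counterexample(10_000))  # None: holds below 10,000
```

A few dozen lines like these are often enough to either kill a freshly minted conjecture or give others a reason to take it seriously.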

Take an interview!

October 29, 2020 3 comments

We all agree that Math is a human endeavor, yet we know precious little about mathematicians as humans working in mathematics. Our papers tend to have precious few quotable sentences outside of the dry mathematical context. In fact, most introductions are filled with passages of the form "X introduced the celebrated tool pqr, which over the next 20 years was refined by A, B and C, and most recently was used by D to prove Z's conjecture". It is such weak tea to convey the contributions of six people in one short sentence, yet we all do this nonetheless.

In my “How to write a clear math paper” article accompanying this blog post, I argue that at least the first paragraph or the first subsection of a long paper can be human and aimed at humans. That is the place where one has freedom to be eloquent, inspiring, congratulatory, prescient, revelatory and quotable. I still believe that, but now I have a new suggestion, see the title of this blog post.

The art of autobiographies

These days many great scientists remain active into very old age, and rarely want or have time to write an autobiography. Good for them, bad for us. Psychologically this is understandable — it feels a little epitaphish, so they would much rather have someone else do that. But then their real voice and honest thoughts on life and math are lost, and can never be recorded. There is blogging, of course, but that’s clearly not for everyone.

There are some notable exceptions to this, of course. When I was in high school, reading the autobiographies of Richard Feynman, Stan Ulam and Norbert Wiener was a pure joy, a window into a new world. The autobiography by Sofya Kovalevskaya was short on mathematical stories, but was so well written I think I finished the whole thing in one sitting. G.H. Hardy's "Apology" is written in a different style, but is clearly self-revealing; while I personally disagree with much of his general point, I can see why the book continues to be read and to inspire passionate debates.

More recently, I read William Tutte's "Graph Theory As I Have Known It", which is mostly mathematical, but with a lot of personal stories delivered in an authoritative voice. It's a remarkable book; I can't praise it enough. Another one of my favorites is Steven Krantz's "Mathematical Apocrypha" and its followup, which are written in the first person, in a pleasant, light rumor-mill style. Many stories in these near-autobiographies were common knowledge decades ago (even if some were urban legends), but are often the only way for us to learn now how it was back then.

On the opposite end of the spectrum there is L.S. Pontryagin's autobiography (in Russian), which is full of wild rumors, vile accusations, and banal antisemitism. The book is a giant self-own, yet I couldn't stop myself from hate-reading the whole thing just so I could hear all these interesting old stories from the horse's mouth.

Lately, the autobiographies I’ve been reading are getting less and less personal, with little more than background blurbs about each paper. Here are those by George Lusztig and Richard Stanley. It’s an unusual genre, and I applaud the authors for taking time to write these. But these condensed CV-like auto-bios clearly leave a lot of room for stories and details.

Why an interview?

Because a skillful interviewer can help a mathematician reveal personal stories, mathematical and metamathematical beliefs, and even general views (including controversial ones). Basically, reveal the humanity of a person that otherwise remains guarded behind endless Definition-Lemma-Theorem constructions.

Another reason to interview a person is to honor her or his contributions to mathematics. In the aftermath of my previous blog post, I got a lot of contradictory push-back. Some would say “I am shocked, shocked, to find that there is corruption going on. I have submitted to many invited issues, served as a guest editor for others and saw none of that! So you must be wrong, wrong, wrong.” Obviously, I am combining several POVs, satirizing and paraphrasing for the effect.

Others would say "Yes, you are right, some journals are not great so my junior coauthors do suffer, the refereeing is not always rigorous, and the invited authors are often not selected very broadly, but what can I do? The only way I can imagine to honor a person is by a math article in an invited issue of a peer-reviewed journal, so we must continue this practice" (same disclaimer as above). Yeah, ok, imaginary dude, that's just self-serving with a pretense of being generous and self-sacrificing. (Yes, my straw-man-fighting skills are unparalleled.)

In fact, there are many ways to honor a person. You can give a talk about that person's contributions, write a survey or a biographical article, organize a celebratory conference, or, if you don't want to be bothered, simply add a dedication at the beginning of the next article you publish. Or, better yet, interview the honoree. Obviously, do this some time soon, while the person is alive, and make sure to put the interview online for everyone to read or hear.

How to do an interview?

Oh, you know, via Zoom, for example. The technical aspects are really trivial these days. With permission, you can record the audio/video by pushing one button. The very same Zoom (or Apple, Google, Amazon, Microsoft, etc.) has good speech-to-text programs which will transcribe the whole interview for you, modulo some light editing (especially of math terminology). Again, with a couple of clicks, you can publish the video or the audio on YouTube, and the text on your own website or any social media. Done. Really, it's that easy!

Examples

I have many favorites, in fact. One superb video collection is done by the Simons Institute. I already blogged here about the terrific interviews with László Lovász and Endre Szemerédi. The interviewer for both is Avi Wigderson, who is obviously extremely knowledgeable about the subject. He asked many pointed and interesting questions, yet left the interviewees plenty of space to develop and expand on their answers. The videos are then well edited and broken into short, watchable pieces.

Another interesting collection of video interviews is made by CIRM (in both English and French). See also general video collections, some of which have rather extensive and professionally made interviews with a number of notable mathematicians and scientists. Let me single out the Web of Stories, which includes lengthy, fascinating interviews with Michael Atiyah, Freeman Dyson, Don Knuth, Marvin Minsky, and many others.

I already wrote about how to watch a math video talk (some of that advice may be dated). Here it's even easier. In this time of pandemic, when you are Zoom-fatigued — put these on your big-screen TV and watch them as documentaries, with as much or as little attention as you like. I bet you will find them more enlightening than the news, Netflix or the other alternatives.

Authorized biography books are less frequent, obviously, but they do exist. One notable recent example is "Genius At Play: The Curious Mind of John Horton Conway" by Siobhan Roberts, which is based on many direct conversations. Let me also single out the perhaps lesser-known "Creative Minds, Charmed Lives" by Yu Kiang Leong, which has a number of interesting interviews with excellent mathematicians, many of them not on other lists. For example, on my "What is Combinatorics" page, I quote extensively from his interview with Béla Bollobás, but in fact the whole interview is worth reading.

Finally, there is a truly remarkable collection of audio interviews by Eugene Dynkin with leading mathematicians of his era, spanning from the 1970s to the 2010s (some in English, some in Russian). The collection was digitized using Flash, which died about five years ago, rendering the collection unusable. When preparing this post I was going to use this example as a cautionary tale, but to my surprise someone has made it possible to download the interviews in .mp3. Enjoy! Listening to these conversations is just delightful.

Final thoughts

Remember, you don’t have to be a professional interviewer to do a good job. Consider two most recent interviews with Noga Alon and Richard Stanley by Toufik Mansour, both published at ECA. By employing a simple trick of asking the same well prepared questions, he allows the reader to compare and contrast the answers, and make their own judgement on which ones they like or agree with the most. Some answers are also quite revealing, e.g. Stanley saying he occasionally thinks about the RH (who knew?), or Alon’s strong belief that “mathematics should be considered as one unit” (i.e. without the area divisions). The problems they consider to be important are also rather telling.

Let me mention that in the digital era, even amateur, long-forgotten interviews can later be found and prove useful. For example, I concluded my "History of Catalan numbers" with a quote from an obscure interview Richard Stanley gave to the MIT undergraduate newspaper. There, he was discussing the origins of his Catalan numbers exercise which is now a book. Richard later wrote to me in astonishment, as he had actually completely forgotten he gave that interview.

So, happy watching, listening, and reading all the interviews! Hope you take some interviews yourself for all of us to enjoy!

P.S. (Added Dec 3, 2020) At my urging, Bruce Rothschild has typed up a brief “History of Combinatorics at UCLA“. I only added hyperlinks to it, to clarify the personalities Bruce is talking about (thus, all link mistakes are mine).

P.P.S. (Added Feb 6, 2021) At my request, the editors of ECA clarified their interview process (as of today, they have posted nine of them). Their interviews are conducted over email and are essentially replies to nearly identical sets of questions. The responses are edited for clarity and undergo several rounds of approval by the interviewee. This practice falls short of what one would traditionally describe as a journalistic interview (e.g., there are no uncomfortable questions), and is more akin to writing a puff piece. Still, I strongly support this initiative by the ECA as the first systematic effort to put combinatorialists on record. Hopefully, with the passage of time, other types of interviews will also emerge from various sources.

How Combinatorics became legitimate (according to László Lovász and Endre Szemerédi)

April 26, 2019 3 comments

The Simons Foundation has a series of fantastic interviews with leading mathematicians (ht Federico Ardila).  Let me single out the interviews with László Lovász and Endre Szemerédi.  Avi Wigderson asked both of them about the history of combinatorics and how it came into prominence.  Watch parts 8-9 of Lovász's interview and 10-11 of Szemerédi's interview to hear their fascinating answers.

P.S.  See also my old blog posts on what is combinatorics, how it became legitimate, and how to watch math videos.

What if math dies?

April 7, 2019 2 comments

Over the years I’ve heard a lot about the apparent complete uselessness and inapplicability of modern mathematics, about how I should always look for applications since without them all I am doing is a pointless intellectual pursuit, blah, blah, blah.  I had strangers on the plane telling me this (without prompting), first dates (never to become second dates) wondering if “any formulas changed over the last 100 years, and if not what’s the point“, relatives asking me if I ever “invented a new theorem“, etc.

For whatever reason, everyone always has an opinion about math.  Having never been accused of excessive politeness I would always abruptly change the subject or punt by saying that the point is “money in my Wells Fargo account“.  I don’t even have a Wells Fargo account (and wouldn’t want one), but what’s a small lie when you are telling a big lie, right?

Eventually, you do develop a thicker skin, I suppose.  You learn to excuse your friends as well-meaning but uneducated, journalists as maliciously ignorant, and strangers as bitter over some old math learning experience (which they also feel obliged to inform you about).  However, you do expect some understanding and respect from fellow academics.  "Never compare fields," Gian-Carlo Rota teaches, and it's good advice you expect sensible people to adhere to.  Which brings me to this:

The worst idea I’ve heard in a while

In a recent interview with Glenn Loury, the controversial UPenn law professor Amy Wax proposed reducing current mathematics graduate programs to one-tenth or one-fifteenth of their current size (start at 54.30, see also the partial transcript).  Now, I get it.  He is a proud member of the "intellectual dark web", while she apparently hates the liberal education establishment and wants to rant about it.  And for some reason math got lumped into this discussion.  To be precise, Loury provoked Wax without offering his views, but she was happy to opine in response.  I will not quote the discussion in full, but the following single sentence is revealing and worth addressing:

If we got rid of ninety percent of the math Ph.D. programs, would we really be worse off in any material respect?  I think that’s a serious question.

She followed this up with “I am not advocating of getting rid of a hundred percent of them.”  Uhm, thanks, I guess…

The inanity of it all

One is tempted to close ranks and ridicule this by appealing to authority or common sense.  In fact, just about everyone — from Hilbert to Gowers — commented on the importance of mathematics both as an intellectual endeavor and the source of applications.  In the US, we have about 1500-2000 new math Ph.D.’s every year, and according to the AMS survey, nearly all of them find jobs within a year (over 50% in academia, some in the industry, some abroad).

In fact, our math Ph.D. programs are the envy of the world.  For example, between 12 and 15 of the top 20 spots worldwide are occupied by leading US programs, depending on the ranking (see e.g. here or there for recent examples, or more elsewhere).  Think about it: math requires no capital investment or infrastructure at all, so with the advent of personal computing, the internet and the arXiv, there are few or no entry barriers to the field.  Any university in the world can compete with the US schools, yet we are still at the top of the rankings.  It is bewildering, then, why you would even want to kill these super successful Ph.D. programs.

More infrastructurally, if there are drastic cuts to the Ph.D. programs in the US, who would be the people hired to teach mathematics at the thousands of colleges whose students want to be math majors?  The number of US math majors is already over 40,000 a year and keeps growing at over 5% a year, driven in part by the higher salary offerings and lifetime income (over those of other majors).  Don't you think that the existing healthy supply and demand in the market for college math educators has already determined the number of math Ph.D.'s we need to produce?

Well, apparently Wax doesn't need convincing of the importance of math.  "I am the last person to denigrate pure mathematics.  It is a glory of mankind…"  She just doesn't want people doing new research.  Or something.  As in "enough already."  Think about it and transfer this thought to other areas.  Say — no new music is necessary — Bach and Drake said it all.  Or — no new art is necessary — Monet and Warhol were so prolific, museums don't really have space for new works.  Right…

Economics matters

Let’s ask a different question: why would you want to close Ph.D. programs when they actually make money?  Take UCLA.  We are a service department, which makes a lot of money from teaching all kinds of undergraduate math courses + research grants both federal, state and industrial.  Annually, we graduate over 600 students with different types of math/stat majors, which constitutes about 1.6% of national output, the most of all universities.

Let’s say our budget is $25 mil (I don’t recall the figures), all paid for.  That would be out of UCLA budget of $7.5 billion of which less than 7% are state contributions.  Now compare these with football stadiums costs which are heavily subsidized and run into hundreds of millions of dollars.  If you had to cut the budget, is math where you start?

Can’t we just ignore these people?

Well, yes we can.  I am super happy to dismiss hurried paid-by-the-word know-nothing journalists or some anonymous YouTube comments.  But Amy Wax is neither.  She is smart and very accomplished:  summa cum laude from Yale, M.D. cum laude from Harvard Medical School, J.D. from Columbia Law School where she was an editor of Columbia Law Review, argued 15 cases in the US Supreme Court, is a named professor at UPenn Law School, has dozens of published research papers in welfare, labor and family law and economics.  Yep.

One can then argue — she knows a lot of other stuff, but nothing about math.  She is clearly controversial, and others don't say anything of that nature, so who cares.  That sounds right, but so what?  Being known as controversial is like a license to tell "the truth"…  er… what they really think.  Which can include silly things based on no research into our world.  This means there are numerous other people who probably also think that way but are wise enough or polite enough not to say it.  We need to fight this perception!

And yes, sometimes these people get into positions of power and decide to implement the changes.  Two cases are worth mentioning: the University of Rochester's failed attempt to close its math Ph.D. program, and the Brown University fiasco.  The latter is well explained in "Mathematical Apocrypha Redux" (see the relevant section here) by the inimitable Steven Krantz.  Rating-wise, this was a disaster for Brown — just read Krantz's description.

The Rochester story is rather well documented and is a good case study for those feeling too comfortable.  Start with this Notices article, proceed to the NY Times, then to the protest description, and this followup in the Notices again.  Good news, right?  Well, I know for a fact that other administrators are also making occasional (largely unsuccessful) moves to do this, but I can't name them, I am afraid.

Predictable apocalypse

Let’s take Amy Wax’s proposal seriously, and play out what would happen if 90-93% of US graduate programs in mathematics are closed on January 1, 2020.  By law.  Say, the US Congress votes to deny all federal funds to universities if they maintain a math Ph.D. program, except for the top 15 out of about 180 graduate programs according to US News.  Let’s ignore the legal issues this poses.  Just note that there are various recent and older precedents of federal government interfering with state and private schools (sometimes for a good cause).

Let’s just try to quickly game out what would happen.  As with any post-apocalyptic fiction, I will not provide any proofs or reasoning.  But it’s all “reality based”, as two such events did happened to mathematicians in the last century, one of them deeply affecting me: the German “academic reforms” in late 1930s (see e.g. here or there), and the Russian exodus in early 1990s (see e.g. here or there, or there).  Another personally familiar story is an implosion of mathematics at Bell Labs in late 1990s.  Although notable, it’s on a much smaller scale and to my knowledge has not been written about (see the discussion here, part 6).

First, there will be a huge exodus of distinguished mathematics faculty from schools outside the top 15.  These include members of the National Academy of Sciences, numerous ICM speakers, other award winners, etc.  Some will move overseas (to Canada, Europe, Japan, China, etc.), some will retire, some will leave academia.  Some will simply stop doing research, given the lack of mathematical activity at their departments and no reward for doing research.

Second, outside of the top 15, graduate programs in other subjects notice falling applications, resulting in their slide in the world rankings.  These include the other physical sciences, economics and computer science.  Then the biological and social sciences start suffering.  These programs start having their own exodus to the top 15 schools and abroad.

Third, given the sliding of graduate programs across the board, undergraduate education goes into decline across the country.  Top US high school students start applying to schools abroad.  Many eventually choose to stay in those countries, which welcome their STEM excellence.

Fourth, the high-tech, fintech and other science-heavy industries move abroad, closer to educated employees.  The United States loses its labor market dominance and starts bleeding jobs across all industries.  The stock and housing markets dip.

Fifth, under strong public pressure the apocalyptic law is repealed and all 180 Ph.D. programs are reinstated with both state and federal financial support.  To everyone’s surprise, nobody is moving back.  Turns out, destroying is much faster and easier than rebuilding, as both Germany and Russia discovered back in the 20th century.  From that point on, January 1, 2020 became known as the day the math died.

Final message:

Dear Amy Wax and Glenn Loury!  Please admit that you are wrong.  Or at least plead ignorance and ask for forgiveness.  I don’t know if you will ever see this post or have any interest in debating the proposition I quoted, but I am happy to do this with you.  Any time, any place, any style.  Because the future of academia is important to all of us.

Just combinatorics matters

March 29, 2019 3 comments

I would really like everyone to know that every time you say or write that something is “just combinatorics” somebody rolls his eyes.  Guess who?

Here is a short collection of “just combinatorics” quotes.  It’s a followup on my “What is Combinatorics?” quotes page inspired by the “What is Combinatorics?” blog post.

You should watch combinatorics videos!

May 2, 2015 4 comments

Here is my collection of links to Combinatorics videos, which I assembled over the years, and recently decided to publish.  In the past few years the number of videos just exploded.  We clearly live in a new era.  This post is about how to handle the transition.

What is this new collection?

I selected over 400 videos of lectures and seminars in Combinatorics, which I thought might be of interest to a general audience.  I tried to cover a large number of areas both within Combinatorics and related fields.  I have seen many (but not all!) of the talks, and think highly of them.  Sometimes I haven’t seen the video, but have heard this talk “live” at the same or a different venue, or read the paper, etc.  I tried to be impartial in my selection, but I am sure there is some bias towards some of my favorite speakers.

The collection includes multiple lectures by Noga Alon, Persi Diaconis, Gil Kalai, Don Knuth, László Lovász, János Pach, Vic Reiner, Paul Seymour, Richard Stanley, Terry Tao, Xavier Viennot, Avi Wigderson, Doron Zeilberger, and many many others. Occasionally the speakers were filmed giving similar talks at different institutions, so I included quick links to those as well so the viewer can choose.

Typically, these videos are from some workshops or public lecture series.  Most are hosted on the institution websites, but a few are on YouTube or Vimeo (some of these are broken into several parts).  The earliest video is from 1992 and the most recent video was made a few days ago.   Almost all videos are from the US or Canada, with a few recent additions from Europe.  I also added links to a few introductory lectures and graduate courses on the bottom of the page.

Why now?

Until a couple of years ago, the videos were made only at a few conference centers such as Banff, MSRI and IAS.  The choice was sparse and the videos were easy to find.  The opposite is true now, on both counts.  The number of recorded lectures in all areas is in tens of thousands, they are spread across the globe, and navigating is near impossible unless you know exactly what you are looking for.  In fact, there are so many videos I really struggled with the choice of which to include (and also with which of them qualify as Combinatorics).  I am not sure I can maintain the collection in the future – it’s already getting too big.  Hopefully, some new technology will come along (see below), but for now this will do.

Why Combinatorics?

That’s what I do.  I try to think of the area as broad as possible, and apologize in advance if I omitted a few things.  For the subarea division, I used as a basis my own Wikipedia entry for Combinatorics (weirdly, you can listen to it now in a robotic voice).  The content and the historical approach within sub-areas is motivated by my views here on what exactly is Combinatorics.

Why should you start watching videos now?

First, because you can.  One of the best things about being in academia is the ability (in fact, the necessity) to learn.  You can't possibly follow everything that happens in all fields of mathematics, or even in all areas of combinatorics.  Many conferences are specialized and the same people tend to meet year after year, with few opportunities for outsiders to learn what's new over there.  Well, now you can.  Just scroll down the list and (hopefully) be amazed at the number of classical works (i.e. over 5 y.o.) you never heard of, the variety of recent developments and the connections to other fields.  So don't just watch people in your area from workshops you missed for some reason.  Explore other areas!  You might be surprised to see some new ideas even on your favorite combinatorial objects.  And if you like what you see, you can follow the links to see other videos from the same workshops, or search for more videos by the same speaker…

Second, you should start watching because it's a very different experience.  You already know this, of course.  One can pause videos, go back and forward, save a video to watch again, or stop watching right at the beginning.  This ability is so popular that Adam Sandler even made an awful movie about it…  On the other hand, in the traditional model of lecture attendance you either listen intently, trying to understand in real time and take notes, or are bored out of your mind but can't really leave.  It still has its advantages, but it is clearly not always superior.  Let me elaborate on this below.

How to watch videos?

This might seem like a silly question, but give me a chance to suggest a few ideas…

0) Prepare for the lecture.  Make sure to have enough uninterrupted time.  Lock the door.  Turn off the cell phone.  Download and save the video (see below).  Download and save the slides.  Search for them if they are not on the lecture website (some people put them on their home pages).  Never delete anything – store the video on an external hard drive if you are running out of space.  Trust me, you never know when you might need it again, and the space is cheap anyway…

Some years ago I made the mistake of not saving Gil Kalai's video of a talk titled "Results and Problems around Borsuk's Conjecture".  I found it very inspiring — it's the only talk I referenced in my book.  Well, apparently, in its infinite wisdom, PIMS lost the video and now only the audio is available, which is nearly useless for a blackboard talk.  What a shame!

1) Use 2 devices.  Have the video on a big screen, say, a large laptop or a TV hooked up to your laptop.  If the TV is too far away, use a wireless mouse to operate the laptop from across the room, or something like a Google Chromecast to project from afar.  Then have the slides of the talk open on your tablet if you like taking computer notes or just like scrolling by hand gestures, or on a second laptop if you don't.  The slides are almost universally in .pdf, and most software, including Adobe Reader, allows you to take notes straight in the file.

Another reason to have the slides open is the inability of some camera operators to understand what needs to be filmed.  This is especially severe if they just love to show off unusual academic personalities, or are used to filming humanities lectures where people read at the podium.  As a result, you occasionally see them pointing the camera at a slide full of formulas for 2 seconds (and out of focus), and then going back to filming, for 2 minutes, a speaker who is animatedly pointing at the (now invisible) screen, explaining the math.  Ugh…

2) If the subject is familiar and you feel bored by the lengthy introduction, scroll through the slides until you see something new.  This will give you a hint as to where to skip forward in the video.  And if you did miss some definition, you can pause the video and scroll back through the slides to read it.

3) If there are no slides, or you want to know some details which the speaker is purposefully omitting, pause the video and download the paper.  I do this routinely while listening to talks, but many people are too shy to do this out of misplaced fear that others might think they are not paying attention.  Well, there is no one to judge you now.

4) If you are the kind of person who likes to ask clarifying questions, you still can.  Pause the video and search the web for the answer.  If you don't find it, ask a colleague by Skype, SMS, chat, email, whatever.  If everything fails – write to the speaker.  She or he might just tell you, but don't be surprised if they ignore your email…

5) If you know others who might be interested in the video lecture, just make it happen.  For example, you can organize a weekly seminar where you and your graduate students watch the lectures you choose (when you have no other speakers).  If students have questions, pause the video and try to answer them.  In principle, if you have a good audience, the speaker may agree to answer questions for 5-10 minutes over Skype after you are done watching.  Obviously, I've never seen this happen (too much coordination?).  But why not try – I bet many speakers would agree if you ask nicely.

6) If you already know a lot about the subject, haven't been following it recently, but want to get an update, consider binge watching.  Pick the most recent lecture series and just let it run while you do house chores or ride the subway.  When things get interesting, you will know to drop everything and start paying attention.

Why should you agree to be videotaped?

Because the audience is ready to see your talks now.  Think of this as another way of reaching out with your math to a suddenly much broader mathematical community (remember the “broad impact” section on your NSF grant proposal?).  Let me just say that there is nothing to fear – nobody is expecting you to have acting skills, or cares that you have a terrible haircut.  But if you make a little effort towards giving a good talk, your math will get across and you might make new friends.

Personally, I am extremely uncomfortable being videotaped – the mere knowledge that the camera is filming makes me very nervous.  However, I gradually (and grudgingly) concluded that this is now part of the job, and I have to learn how to do it well.  Unfortunately, I am not there yet…

Yes, I realize that many traditionalists will object that "something will be missing" when you start aiming at giving good video talks at the expense of the local audience.  But the world is changing, if it hasn't changed already, and you can't stop the tide.  This has happened before, many times.  For example, at some point all the big Hollywood studios discovered that they could make simpler movies and earn a great deal more money overseas to compensate for losses in the US market.  They are completely hooked now, and no matter what the critics say, this global strategy is likely irreversible.  Of course, this leaves room for a niche market (say, low-budget art-house movies), but let's not push this analogy further.

How to give video lectures?

Most people do nothing special.  Just business as usual: hook up the mike and hope it doesn't distort your voice too badly.  That's a mistake.  Let me give a number of suggestions, based mostly on having watched many bad talks.  Of course, the usual advice for giving regular talks applies here as well.

0) Find out ahead of time whether you will be filmed and where the camera is.  During the lecture, don't run around; try to stand still in full view of the camera and point to the screen with your hands.  Be animated, but without sudden moves.  Don't use a laser pointer.  Don't suddenly raise your voice.  Don't refer to previous talks at the same workshop.  Don't appeal to people in the audience – the camera can rarely capture what they say or do.  If you are asked a question, quickly summarize it so the viewer knows what question you are answering.  Don't make silly off-the-cuff jokes (this is a hard one).

1) Think carefully about whether you want to give a blackboard or a computer talk.  This is crucial.  If it's a blackboard talk, make sure your handwriting is clear and, most importantly, BIG.  The cameras are usually in the very back, and your handwriting won't be legible otherwise.  Unless you are speaking at the Fields Institute, whose technology allows one to zoom into the high-resolution video, nobody may be able to see what you write.  The same goes for handwritten slides, unless they are very neat, done on a laptop, and the program allows you to increase their size.  Also, blackboard management becomes a difficult issue.  You should think through which results/definitions should stay on the blackboard, visible to the camera at all times, and what can be safely erased or lifted up if the blackboard allows that.

2) If it’s a computer talk, stick to your decision and make a lot of effort to have the slides look good.  Remember, people will be downloading them…  Also, make every effort NOT to answer questions on a blackboard next to the screen.  The lightning never works – the rooms are usually dimmed for a computer talk and no one ever thinks of turning the lights on just for 30 seconds when you explain something.  So make sure to include all your definition, examples, etc, in the slides.  If you don’t want to show some of them – in PowerPoint there is a way to hide them and pull them up only if someone asks to clarify something.  I often prepare the answers to some standard questions in the invisible part of my slides (such as “What happens for other root systems?” or “Do your results generalize to higher dimensions?”), sometimes to unintended comedic effect.  Anyhow, think this through.

3) Don’t give the same videotaped talk twice.  If you do give two or more talks on the same paper, make some substantial changes.  Take Rota’s advice: “Relate to your audience”…  If it’s a colloquium talk, make a broad accessible survey and include your results at the end.  Or, if it’s a workshop talk, try to make an effort to explain most proof ideas, etc.  Make sure to have long self-explanatory talk titles to indicate which talk is which.  Follow the book industry lead for creating subtitles.  For example use “My most recent solution of the Riemann hypothesis, an introduction for graduate students” or “The Pythagorean theorem: How to apply it to the Langlands Program and Quantum Field Theory”.

4) Download and host your own videos on your website, alongside your slides and the relevant paper(s), or at least make clear links to them from your website.  You can't trust anyone to keep your files.  Some would argue that re-posting them on YouTube will then suffice.  There are two issues here.  First, this is rarely legal (see below).  Second, as I mentioned above, many viewers would want to have a copy of the file.  Hopefully, in the future there will be a copyright-free, arXiv-style video hosting site for academics (see my predictions below).

5) In the future, we will probably need a general rule about adding a file with errata and clarifications to your talk, especially if something you said is not exactly correct, or even just to indicate, post factum, whether all those conjectures you mentioned have been resolved, and which way.  The viewers will want to know.

For example, my student pointed out to me that in my recent Banff talk, one of my lemmas is imprecise.  Since the paper is already available, this is not a problem, but if it weren't, this could lead to serious confusion.

6) Watch other people’s videos.  Pay attention to what they do best.  Then watch your own videos.  I know, it’s painful.  Turn off the sound perhaps.  Still, this might help you to correct the worst errors.

7) For advanced lecturers – try to play with the format.  Of course, the videos allow you to do things you couldn’t do before (like embedding links to papers and other talks, inserting some Java demonstration clips, etc.), but I am talking about something different.  You can turn the lecture into an artistic performance, like this amazing lecture by Xavier Viennot.  Not everyone has the ability or can afford to do this, but having it recorded can make it worthwhile, perhaps.

Know your rights

There are some delicate legal issues when dealing with videos, with laws varying in different states in the US (and in other countries, of course).  I am not an expert on any of this and will write only as I understand them in the US.  Please add a comment on this post if you think I got any of this wrong.

1) Some YouTube videos of math lectures look like they have been shot on a phone.  I usually don't link to those.  As I understand the law on this, anyone can film a public event for his/her own consumption.  However, you and the institution own the copyright, so the YouTube posting is illegal without explicit (written and signed) permission from both of you.  You can fight this by sending a "cease and desist" letter to the person who posted the video, but contacting YouTube directly might be more efficient – they have a large legal department to sort out these issues.

2) You are typically asked to sign away your rights before your talk.  If an institution forgot to do this, you can ask them to take your talk down for whatever reason.  However, even if you did sign the paper, you can still do this – I doubt the institution will fight you on it, just to avoid bad publicity.  A single email to the IT department should suffice.

3) If the file with your talk is posted, it is (obviously) legal for you to download it, but not to post it on your website or repost it elsewhere, such as YouTube or WordPress.  As far as I am concerned, you should go ahead and post it on your university website anyway (but not on YT or WP!).  Many authors post all their papers on their websites even when they don't own the copyright (which is the case for virtually all papers before 2000).  I am one of them.  The publishers have simply concluded that this is the cost of doing business – if they start going after people like us, the authors can revolt.  The same goes for math videos.  The institutions probably won't have a problem with posting on your university website as long as you acknowledge the source.  But involving a third party creates a host of legal problems, since these internet companies are making money off videos they don't own the copyright for.  Stay away from this.

4) You can edit the video using any of numerous software tools, some of which are free to download.  You can remove outside noise, make the slides sharper, everything brighter, etc.  I wouldn't post a heavily edited video when someone else owns the copyright, but minor editing as above is OK, I think.

5) If the institution’s website does not allow to download the video but has a streaming option (typically, the Adobe Flash or HTML5), you can still legally save it on your computer, but this depends on what software you choose.  There are plenty of software which capture the video being played on your computer and save it in a file.  These are 100% legal.  Other websites play the videos on their computers and allow you to download afterwards.  This is probably legal at the institutions, but a gray area at YouTube or Vimeo which have terms of service these companies may be violating.  Just remember – such videos can only be legal for personal consumption.  Also, the quality of such recording is typically poor – having the original file is much better.

What will happen in the future?

Yes, I will be making some predictions.  Not anything interesting like Gian-Carlo Rota’s effort I recently analyzed, but still…

1) Watching and giving video lectures will become the norm for everyone.  Ethical standards will develop so that everyone gets to keep the files of the videos they made.  Soon enough, some large, well-organized, searchable (and not-for-profit!) arXiv-style math video depositories will be established, where you can submit your video, link to it from your website, and from which others can download it.  Right now, companies like Dropbox allow you to do this, but they are for-profit (you have to pay extra for space), and this obviously needs a front end like the arXiv.  That would quickly make my collection a thing of the past.

2) Good math videos will become a "work product", just like papers and books.  They are just another venue to communicate your results and ideas.  People will start working harder on them.  They will become a standard item on CVs, grant applications, job promotions, etc.  More and more people will start referencing them, just like I've done with Kalai's talk.  Hopefully, part 1) will happen soon enough that all talks get standard, stable links.

3) The video services will become ubiquitous.  First, all conference centers will acquire advanced equipment in the style of the Banff Centre, which is voice-directed and requires no professional involvement, except perhaps at the editing stage.  Yes, I am thinking of you, MFO.  A new library is great, but the talks you could have recorded there are priceless – it's time to embrace the 21st century…

Second, more and more university rooms will be equipped with cameras, etc.  UCLA already has a few large rooms like that (which is how we make the lamely named BruinCasts), but in time many departments will have several such rooms to hold seminars.  Storage space is not an issue, but labor costs, equipment, and broadband are.  Still, I give it a decade or two…

4) Watching and showing math videos will become a standard part of research and graduate education.  Ignore the doomsayers who proclaim that this will supplant traditional teaching (hopefully not in our lifetime); it is already clear that there are unexplored educational benefits here.  This should be of great benefit especially to people in remote locations who don't have access to such lectures otherwise.  Just like Wikipedia before it, this will level the playing field and help talent emerge from unlikely places.  If all goes well, maybe mathematics will survive after all…

Happy watching everyone! 

Grading Gian-Carlo Rota’s predictions

November 27, 2014 4 comments

In this post I will try to evaluate Gian-Carlo Rota‘s predictions on the future of Combinatorics that he made in this 1969 article.  He did surprisingly well, but I am a tough grader and possibly biased about some of the predictions.  Judge for yourself…

It’s tough to make predictions, especially about the future

It is a truth universally acknowledged that humans are very interested in predicting the future. They do this incessantly, compiling the lists of the best and the worst, and in general can’t get enough of them. People tend to forget wrong predictions (unless they are outrageously wrong).  This allows a person to make the same improbable predictions over and over and over and over again, making news every time.  There are even professional prognosticators who make a living writing about the future of life and technology.  Sometimes these predictions are rather interesting (see here and there), but even the best ones are more often wrong than right…

Although rarely done, analyzing past predictions is a useful exercise, for example as a clue to the truthiness of the modern day oracles.  Of course, one can argue that many of the political or technology predictions linked above are either random or self-serving, and thus not worth careful investigation. On the other hand, as we will see below, Rota’s predictions are remarkably earnest and sometimes even brave.  And the fact that they were made so long ago makes them uniquely attractive, practically begging to be studied.

Now, it has been 45 years since Rota's article, basically an eternity in the life span of Combinatorics. There, Rota describes Combinatorics as "the least developed branches of mathematics". A quick review of the last few quotes on this list I assembled shows how much things have changed. Basically, the area moved from an ad hoc collection of problems to a 360-degree panorama of rapidly growing subareas, each with its own deep theoretical results, classical benchmarks, advanced tools and exciting open problems. This makes grading rather difficult, as it suggests that even random old predictions are likely to be true – just about anything people worked on back in the 1960s has been advanced by now. Thus, before turning to Rota, let's agree on the grading scale.

Grading on a curve

To give you the feel for my curve, I will use the celebrated example of the 1899-1901 postcards in the En L’An 2000 French series, which range from insightful to utter nonsense (click on the titles to view the postcards, all available from Wikimedia).

•  Electric train.  Absolutely.  These were introduced only in the 1940s and have been further developed in France, among other countries.  Note the aerodynamic shape of the engine.  Grade:  A.

•  Correspondance cinema.  Both the (silent) cinema and the phonograph had been invented by 1900; sound came to movie theaters only in 1927.  So the invention here is a home theater for movies with sound.  A great prediction, although not overly ambitious.  Grade:  A-.

•  Military cyclists.  While bicycle infantry had already been introduced in France by 1900, military use of motorcycles came much later.  The idea is natural, and some designs of bikes from WW2 are remarkably similar.  Some points are lost due to the lack of widespread popularity in 2000.  Grade: B+.

•  Electric scrubbing.  This is an electric appliance for floor cleaning.  Well, they do exist, sort of, though obviously based on different principles.  In part due to their modern-day popularity, this is a solid prediction anyway.  Grade:  B.

•  Auto-rollers.  Roller skates were invented in the 18th century and became popular by 1900.  So no credit for the design, but extra credit for believing in the future of a mode of transportation now dominated by rollerblades.  Thus the author's invention falls in the category of "motorized personal footwear".  In that case, the corresponding modern invention is the electric skateboard, which became available only post-2000 and has yet to become very popular.  Grade: B-.

•  Barber.  The author imagines a barber operating machinery which shaves and cuts the customer's hair.  The design is so ridiculous (and awfully dangerous) that it's a good thing it never came about.  There are, however, electric shavers and hair clippers, which are designed very differently.  Grade:  C.

•  Air cup.  The Wright brothers’ planes had similar designs, so no credit again.  The author assumes that personal air travel will become commonplace, and at low speeds and heights.  This is almost completely false.  However, unfortunately, and hopefully only very occasionally, some pilots do enjoy one for the road.  Grade:  D.

•  Race in Pacific.  The author imagines that the public spectacle of horse racing will move underwater and become some kind of fish racing.  Ridiculous.  Also a complete failure to envision the modern popularity of auto racing, which had already begun in Paris in 1887.  Grade:  F.

Rota’s predictions on combinatorial problems:

In his paper, Rota writes:

Fortunately, most combinatorial problems can be stated in everyday language. To give an idea of the present state of the field, we have selected a few of the many problems that are now being actively worked upon.

We take each of these "problems" as a kind of prediction of where the field was going.  Here are my (biased and possibly uninformed) grades for each problem he mentions.

1)  The Ising Problem.  I think it is fair to say that since 1969 combinatorics made no contribution in this direction.  While physicists and probabilists continue studying this problem, there is no exact solution in dimension 3 and higher.  Grade: F.

2)  Percolation Theory.  The study of percolation has completely exploded since 1969 and is now the subject of numerous articles in probability, statistical physics, and combinatorics, as well as several research monographs.  One connection is given by the observation that p-percolation on the complete graph K_n gives the Erdős–Rényi random graph model G(n, p). Even I accidentally wrote a few papers on the subject some years ago (see one, two and three).  Grade: A.
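
To make the observation concrete, here is a minimal Python sketch (an illustration of mine only, assuming nothing beyond the standard library): keeping each edge of K_n independently with probability p is, by definition, a sample of the Erdős–Rényi graph G(n, p).

```python
import random
from itertools import combinations

def percolate_Kn(n, p, rng=random):
    """p-percolation on the complete graph K_n: keep each edge
    independently with probability p.  The resulting edge set is
    distributed exactly as the Erdos-Renyi random graph G(n, p)."""
    return {e for e in combinations(range(n), 2) if rng.random() < p}

# Sanity checks: p = 1 keeps all C(10,2) = 45 edges, p = 0 keeps none.
assert len(percolate_Kn(10, 1.0)) == 45
assert len(percolate_Kn(10, 0.0)) == 0

# The fraction of surviving edges concentrates around p.
random.seed(2014)
sample = percolate_Kn(200, 0.3)
print(len(sample) / (200 * 199 / 2))  # close to 0.3
```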

3)  The Number of Necklaces, and Polya's Problem.  Taken literally, necklaces do come up in the combinatorics of words and free Lie algebras, but this context was already mentioned by Rota. As far as I can tell, there are various natural (and interesting) generalizations of necklaces, but none surprising.  Of course, if the cyclic/dihedral group action here is replaced by other actions, e.g. of the symmetric group, then modern developments are abundant.  But I think that is too far a reach, since Rota knew the works of Young, MacMahon, Schur and others but does not mention any of them.  Similarly, Polya's theorem used to be included in all major combinatorics textbooks (and still is, occasionally), but is rarely taught these days.  Simply put, the g.f. implications haven't proved useful.  Grade: D.
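
For readers who have not seen it, the necklace count is a textbook application of Burnside's lemma / Polya theory: the number of k-ary necklaces of length n (strings up to rotation) is (1/n) Σ_{d|n} φ(d) k^{n/d}.  A quick sketch (my own illustration, not anything from Rota's paper), with a brute-force check:

```python
from math import gcd
from itertools import product

def phi(m):
    """Euler's totient function (naive)."""
    return sum(1 for i in range(1, m + 1) if gcd(i, m) == 1)

def necklaces(n, k=2):
    """Number of k-ary necklaces of length n, via Burnside's lemma:
    (1/n) * sum over d | n of phi(d) * k^(n/d)."""
    return sum(phi(d) * k ** (n // d) for d in range(1, n + 1) if n % d == 0) // n

def necklaces_brute(n, k=2):
    """Brute-force check: count the orbits of the rotation action directly."""
    seen, count = set(), 0
    for s in product(range(k), repeat=n):
        if s not in seen:
            count += 1
            seen.update(s[r:] + s[:r] for r in range(n))
    return count

assert all(necklaces(n) == necklaces_brute(n) for n in range(1, 11))
print(necklaces(6))  # 14 binary necklaces of length 6
```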

4)  Self-avoiding Walk. Despite strong interest, until recently there were very few results in the two-dimensional case (some remarkable results were obtained in higher dimensions). While the recent breakthrough results (see here and there) do use some interesting combinatorics, the authors' motivation comes from probability. Combinatorialists did of course contribute to the study of somewhat related questions on the enumeration of various classes of polyominoes (which can be viewed as self-avoiding cycles in the grid, see e.g. here).  Grade: C.

5)  The Traveling Salesman Problem. This is a fundamental problem in optimization theory, connected to the study of Hamiltonian cycles in Graph Theory and numerous other areas. It is also one of the earliest NP-hard problems, still playing a benchmark role in Theoretical Computer Science. No quick summary of the progress in the past 45 years would do it justice. Note that Rota's paper was written before the notion of NP-completeness was introduced. In this light, his emphasis on algorithmic complexity and allusions to Computability Theory (e.g. unsolvable problems) are most prescient.  So are his briefly mentioned connections to topology, which are currently a popular topic.  Well done!  Grade: A+.
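
To illustrate the benchmark role concretely, here is a sketch (my own illustration, not part of Rota's discussion) of the classical Held-Karp dynamic program, still the best known general exact algorithm for TSP, with running time O(n² 2ⁿ) — exponential, as one expects for an NP-hard problem.

```python
from itertools import combinations

def held_karp(dist):
    """Length of the shortest tour visiting all cities, given a symmetric
    distance matrix.  Held-Karp DP over (subset, endpoint) pairs."""
    n = len(dist)
    # best[(S, j)]: shortest path from city 0 through the cities in S, ending at j.
    best = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            fs = frozenset(S)
            for j in S:
                best[(fs, j)] = min(best[(fs - {j}, k)] + dist[k][j]
                                    for k in S if k != j)
    full = frozenset(range(1, n))
    return min(best[(full, j)] + dist[j][0] for j in range(1, n))

# Four cities arranged in a cycle: the optimal tour 0-1-2-3-0 has length 4.
dist = [[0, 1, 2, 1],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [1, 2, 1, 0]]
print(held_karp(dist))  # 4
```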

6)  The Coloring Problem.  This was a popular topic well before Rota's article (inspired by the Four Color Theorem, the chromatic polynomial, etc.), and continues to be even more so, with truly remarkable advances in multiple directions.  Note Rota's mention of matroids, which may seem extraneous here at first but is in fact absolutely relevant (in part due to Rota's own then-ongoing effort).  A very good but unsurprising prediction.  Grade: A-.

7)  The Pigeonhole Principle and Ramsey's Theorem. Extremal Graph Theory was about to explode in many directions, with the Erdős-Stone-Simonovits theorem proved just a few years earlier and the Szemerédi regularity lemma a few years later.  Still, Rota never mentions Paul Erdős and his collaborators, nor any of these results, nor potential directions.  What a missed opportunity!  Grade: B+.
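
The flavor of Ramsey's theorem is easy to demonstrate by exhaustive search.  A small sketch (my own illustration only) verifying the classic fact R(3,3) = 6: every 2-coloring of the edges of K_6 contains a monochromatic triangle, while K_5 admits a coloring with none.

```python
from itertools import combinations

def has_mono_triangle(n, coloring):
    """coloring maps each edge (i, j) with i < j of K_n to a color 0 or 1."""
    return any(coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]
               for a, b, c in combinations(range(n), 3))

def ramsey_33_holds(n):
    """Brute force over all 2^C(n,2) edge 2-colorings of K_n:
    does every one of them contain a monochromatic triangle?"""
    edges = list(combinations(range(n), 2))
    return all(has_mono_triangle(n, {e: (mask >> i) & 1 for i, e in enumerate(edges)})
               for mask in range(2 ** len(edges)))

print(ramsey_33_holds(5), ramsey_33_holds(6))  # False True
```

(The pentagon coloring of K_5 — the 5-cycle in one color, its complement in the other — is the witness for n = 5.)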

Rota’s predictions on combinatorial areas:

In the concluding section “The Coming Explosion”, Rota sets this up as follows:

Before concluding this brief survey, we shall list the main subjects in which current work in combinatorial theory is being done.

Here is a list and more of my comments.

1)  Enumerative Analysis.  Sure.  But this was an easy prediction to make, given the ongoing effort by Carlitz, Polya, Riordan, Rota himself and many other people.  Grade: A-.

2)  Finite Geometries and Block Designs.  The subject was already popular, and it did continue to develop, but perhaps at a different pace and in different directions than Rota anticipated (Hadamard matrices, tools from Number Theory).  In fact, a lot of later work was in connection with Group Theory (including some applications of CFSG, which was then an ongoing project) and with Coding Theory (as Rota predicted).  Grade: B-.

3)  Applications to Logic.  Rota gives a one-sentence description:

The development of decision theory has forced logicians to make wide use of combinatorial methods.

There are various important connections between Logic and Combinatorics, for example in Descriptive Set Theory (see e.g. here or more recent work by my future UCLA colleague there).  Note, however, that Infinitary Combinatorics was already under development after the Erdős-Rado theorem (1956).  Another very interesting and more recent connection is to Model Theory (see e.g. here).  But the best interpretation I can think of here is the numerous applications to Game Theory, which already existed (Nash's equilibrium theorem is from 1950) and was under rapid development.  Either way, Rota was too vague in this case to be given much credit.  Grade: C.

4)  Statistical Mechanics.   He mentions the Ising model again and insists on "close connections with number theory".  One can argue this is all too vague or misdirected, but the area did indeed explode, in part in the directions of the problems Rota mentions earlier. So I am inclined to give him the benefit of the doubt on this one. Grade: A-.

The final grade

In total, Rota clearly got more things right than wrong.  He displayed occasional clairvoyance and had some very clever insights into the future, but also a few flops.  Note also the near-complete lack of self-serving predictions, such as the Umbral Calculus that Rota was very fond of.  Since predictions are hard, successes carry greater weight than failures.  I would give a final grade somewhere between A- and B+, depending on how far into the future we think he was making the predictions.  Overall, good job, Gian-Carlo!

P.S.  Full disclosure:  I took a few advanced classes with Gian-Carlo Rota as a graduate student, cross-registering from Harvard to MIT, and he may have graded my homework (this was in the 1994-1996 academic years).  I don't recall the final grades, but I think they were good.  Eventually Rota wrote me a letter of recommendation for a postdoc position.

UPDATE (October 16, 2019)

I would still give a failing grade for Race in Pacific.   But having seen Aquaman, the similarities are too eerie to ignore, so this prediction needs an upgrade.  Say, D-.