Sad but true, Mike Israetel is a sham

I watched this video breaking down Mike’s PhD thesis. The thesis is riddled with failures on every page: the research was shoddily done, the statistics are worthless, and technical errors litter every single paragraph he wrote. The thesis proves that he cannot do research, cannot write research, and probably cannot *read* research either, since he misunderstands many of the papers and articles he actually cites.

This is sad to me, because as long-time readers know, I followed Israetel and took his advice seriously.

Why it matters: You may say that a thesis is just some bs he did in college, with no bearing on his current position. But Mike Israetel’s entire brand is built around his PhD: that he is a sport *scientist* and not just a jacked dude. He mentions his PhD in every ad and video because he wants viewers and customers to believe that he’s giving them *scientific advice*, based on *research and testing* and not just vibes.

Yet Mike’s thesis is proof that not only can he not do research, nor write a research paper, he can’t even *read* a research paper as he misunderstands and misrepresents the papers he cites. He tells his readers that the science that *other people do* is saying something completely different than what it actually says, and that’s a big problem.

So Mike’s advice and supplements and apps aren’t actually based on science; they’re based on vibes just like every other gym bro on YouTube.

Why else it matters: Some have said that Mike’s PhD program wasn’t like a “normal” program, and shouldn’t be held to the same standards. His program works closely with many US Olympic athletes, and it wasn’t focused on research that would help the broader public, but on learning the specific techniques to help the specific elite athletes that Mike worked with.

But if that’s the case, then Mike has no business claiming that his PhD gives him knowledge applicable to anyone in his general audience. He isn’t giving advice that you, the listener, should actually take; his supplements and programs won’t help you, specifically. Instead they are tailored toward the special subset of people who are genuine Olympic athletes, and who require a very different program to succeed than what an average 9-to-5er needs.

Likewise, if Mike’s program wasn’t held to the standards of a “normal” PhD, then it should not have *awarded* him a PhD and he shouldn’t call himself doctor. A PhD confers the title of “Doctor” because it is supposed to prove you have met the highest standards for science and scientific communication: that you are not only knowledgeable, but able to use and communicate your knowledge effectively to help the scientific community and educate the non-scientific community at large. But Mike’s thesis proves he just can’t do that.

He has not met the highest standards for science, and he has not even met a *high school level* standard for scientific communication. And yet he still trades on his title of “PhD,” using it as a crutch to gain legitimacy and as a shield to deflect criticism. It matters that his thesis is worthless and that his PhD was substandard, because it means his crutch should be kicked out from under him, and his shield should be broken like the trash it is.

Finally, some have said that many of these criticisms are “nitpicks.” But it matters because PhD-level research is supposed to be held to the highest standard of quality. You aren’t supposed to publish something without feeling certain that you can defend its integrity and its conclusions, and yet it is clear Mike’s thesis was written without any thought whatsoever. If he had re-read his thesis even once, he would not have typos and data errors across whole swaths of it.

I have had many typos in my own blog, but these are stream-of-consciousness posts that I usually type up and publish without a second read; I’m not acting like they are high-quality research publications. Mike *is* claiming that his thesis is high quality, that’s the whole reason he got a PhD for it, so it being as shoddily researched, as shoddily written, and as completely absent of a point as it is really proves that he should never have been given a PhD in the first place.

So is that the end for my following of Mike Israetel? Will I stop doing weight workouts and go back to running, since everything he says about “how to lose weight” is clearly wrong?

No.

Mike’s research is crap, and it always did skeeve me out that he leaned so hard on his “Dr” label. I’ve never bought his app or his supplements, but that doesn’t mean I can’t take his advice. Most of what he says is the same as what my *real* doctor has told me with regards to losing weight. And while his PhD is bogus it’s clear he’s taken a few undergraduate level science classes and is more knowledgeable than most of the gym bros with a youtube channel.

Ultimately his advice is probably fine on the whole. The low-level advice he gives is mostly the same as what you’ll hear from non-cranks, and the high-level advice he gives is mostly his personal opinions like any other influencer. He’s probably correct in the broad strokes that weight-lifting and caloric deficits are the best way to lose weight. And he’s probably correct that you should focus on exercises that improve your “strength” and ignore exercises that improve your “balance” unless you have an inherent balancing issue you need to improve on. He’s probably also right that the hysteria around Ozempic and other GLP-1 drugs is overblown, and that if they help you lose weight you should go ahead and use them. As he says: it’s ok to save your willpower for other parts of your life.

But I have no reason to believe his specific advice around high level concepts like training to failure, periodization, muscle group activation, etc. If you don’t know what those are, then it’s a good idea to ignore what he says about them and just focus on lifting and (if you’re overweight), cutting calories.

I don’t think Mike is a complete idiot who should be ignored entirely. I think he’s a hustler like any other influencer, and if the things he says work for you, then do them. But he’s not backed by science like he claims he is, so ignore any of his ramblings if they don’t work for you. Talk to your doctor instead, or an *actual* exercise scientist, although if Mike’s PhD thesis is “the norm” for that discipline, then most exercise scientists aren’t really scientists at all.

I’ve long lamented that the fitness and sports landscape is overrun by bro-science and dude-logic. It’s ruled by the kinds of shoddy science and appeals to tradition that we would normally call “old wives’ tales.” But when a jacked dude says something crazy, like “you should lie upside down to regain your breath so that your blood rushes to your lungs,” a lot of people might say “well he’s jacked, he must know *something*.”

I had thought Mike Israetel was an escape from the wider landscape, and that he was perhaps a trendsetter for actual science to creep into this mess. But it seems he’s just another grifter trying to get rich. Ah well, such is life.

Where is college affordable?

I was talking to a friend recently about a conversation I overheard between two co-workers. One of them had attended Georgia Tech (in Georgia, natch); the other had attended Caltech (although they were from NYC). The Georgia Tech alum said “even though I’m from California, Georgia Tech was the more affordable option, so I went there.”


It’s an amazing statement, because many people who’ve gone to college would assume the opposite. Most states make a distinction between “in-state” students and “out-of-state,” with out-of-state students paying a premium on tuition. Many states also have special scholarships and funds only available for their in-state students.

You’d therefore think that a California student would find Caltech the more affordable option, since both are top-ranked schools and California should provide in-state benefits that Georgia Tech doesn’t. But it wasn’t so for this student.


I told this to a California-based friend of mine who replied “I’m surprised they thought California schooling was more expensive, California community college is free.”


But I think there’s a disconnect between students and those outside the University system as to what constitutes “affordability,” and community college is one part of this disconnect. I won’t bury the lede too much: students don’t want community college because it doesn’t offer them what a proper college does. Making community college affordable doesn’t make college itself affordable.


There are a lot of reasons to go to college:

  • Gain skills
  • Gain experience
  • Build networks
  • Signal value

None of these are well-served by community college.


Gaining skills is the place where a community college should have the greatest value: a calculus class there should teach you the same things as a calculus class at MIT. Math is math, right? But the specific in-class material is only one part of the skill-building that students seek.


The “quality” colleges that students want to attend will be staffed by eminent professors who are on the cutting edge of research in their field. And yes, there is “research” in mathematics. Students don’t just want to learn the facts that were written down decades ago, they want exposure to the unexplored parts of their discipline so they can learn how to contribute themselves. So for any STEM student, just taking community college courses doesn’t really give you much “skill” in your field, because while you may have the book-learning, you don’t have any exposure or experience working on the cutting edge.


At a research university students can work in labs, attend conferences, hear seminars by famous professors, all things you can’t do at a community college. So really a community college cannot give you the skills that a “real” university can, it can only give you the knowledge base that you can then use later to build those skills.


Gaining experience is somewhat different from gaining skills. In both STEM and the humanities, students want some experience in their discipline of choice, both so they can make sure they actually like it and so they can better get a job right after graduation. Community colleges are usually overlooked by companies canvassing for interns, and for good reason, since many of the students at a community college already have jobs. Likewise a community college will have far fewer clubs or professional guilds through which a student can find new opportunities for leadership or work.


So everything from internships, to student leadership positions, to funded fellowships is a lot harder to find at a community college vs a university. And for those reasons students who want experience will shy away from community colleges even if they are free.


Networks are again hard to build at community colleges. Harvard isn’t Harvard just because the students are smart; it has an enormous pull factor, in part from the children of the rich and famous who attend. At a tier-1 university you can rub shoulders with future CEOs and politicians. Even at a tier-2 or -3 university you can find future managers and leaders. Community colleges are largely attended by people who already have a job and are trying to upskill into a better one, and who therefore don’t have the time or energy to network.


Community colleges also are largely attended by students who couldn’t get accepted anywhere else. And it’s sad but true that most students who didn’t have the grades to get into a university likely don’t have the wherewithal to be future movers and shakers. So your networking opportunities are not great at a community college, and students who want to network will avoid them.


Finally, signaling value. It has become cliche to say that college doesn’t teach you anything and the only reason to attend is as a signal. I disagree with this, but it is true that college does act like a signal to future employers. By attending a “good” university, you show that you are skilled and capable, at the very least you were skilled enough to get in and graduate, so you can probably be trusted with an entry-level position.


Yet again this is something that community college just doesn’t provide. Community college is required to take everyone, so employers are wary about the actual quality of its graduates. Fairly or not, it is not a strong signal, and students who believe in the signaling theory will not want to attend.


I think everything I’ve written here is mostly a “duh,” but it still needs to be said. Because when the debate about the cost of university turns to “free community college,” we need to recognize that “there is a price cheaper than free.” Just because something is free doesn’t mean it’s the best option or truly the cheapest option. What you save in tuition, you lose in opportunities to network, to gain skills, to gain experience, and to signal your value.


Now you can still do all these things outside of a community college. You can network outside of college while taking free classes, find internships, become a local leader, etc etc. But all those things require a lot of time, effort, and frankly even money (travel for networking isn’t cheap). A university provides them along with your tuition. So this is what I mean about “a price cheaper than free,” community college classes may be “free,” but they don’t provide you with any of the benefits that people actually want from a university. And trying to match a university’s benefits alongside the free classes from a community college will usually end up costing more time and money than just attending the university instead.

Free community college was pushed as a solution to the college affordability crisis, but it really isn’t one at all.

Community college has a place, but it doesn’t provide what university students want at all. And that’s part of why California schools remain unaffordable, even for California natives, because the state is subsidizing what students don’t want while allowing costs to grow out of control on what students actually need. If California at least built a hell of a lot more housing, then cost of living would go down and the universities would become cheaper to attend by default. But I suspect more and more that the “quality” California universities will become a playground for the rich or connected, with much of the middle class going out-of-state instead.

When will the glaciers all melt?

Glacier National Park in Montana [has] fewer than 30 glaciers remaining, [it] will be entirely free of perennial ice by 2030, prompting speculation that the park will have to change its name – The Ravaging Tide, Mike Tidwell

Americans should plan on the 2004 hurricane season, with its four super-hurricanes (category 4 or stronger) becoming the norm […] we should not be surprised if as many as a quarter of the hurricane seasons have five super-hurricanes – Hell and High Water, Joseph Romm

Two points of order:

  • In 2006, when Mike Tidwell wrote about glaciers, Glacier National Park had 27 glaciers. It now has 26 glaciers, and isn’t expected to suddenly lose them all in 5 years.
  • Since 2007, when Joseph Romm wrote about hurricanes, just four hurricane seasons have had four so-called “super-hurricanes,” and just one season has had five. The 2004 season has not become the norm, and we are averaging less than 6% of seasons having five super-hurricanes.

I do not write this to dunk on climate science, I write only to dunk on the popular press. The science of global warming is fact, it is not a myth or fake news. But the popular press has routinely misused and abused the science, taking extreme predictions as certainties and downplaying the confidence interval.

What do I mean by that? Think of a roulette wheel, where a ball spins on a wheel and you place a bet as to where it will land. If you place a bet, what is the maximum amount of money you can win (aka the “maximum return”)? In a standard American game the maximum amount you can win is 36 times what you bet, should you pick the exact number the ball lands on. But remember that in casinos, the House Always Wins. Your *expected* return is only about 95/100 of your bet. You’re more likely to lose than to win, and if you play the game over and over, the many, many losses wipe out your unlikely gains.

So how should we describe the statistical possibilities of betting on a roulette wheel? We should give the expected return (a mean value of how much money you might win), we should give the *most likely* return (the mode), and we should give the minimum and maximum returns, as well as their likelihoods of happening. So if you bet $1 on a roulette wheel:

  • Your expected return is $0.95
  • Your most likely return is $0 (more than half of the time you win nothing, even if betting on red or black; if you bet on numbers, you win nothing even more often)
  • Your minimum return is $0 (at least you can’t owe more money than you bet); this happens just over half the time if you bet on red/black, and more often if you bet on numbers
  • Your maximum return is $36. This happens 1 time in 38, or about 2.6% of the time.
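Those four bullets can be sanity-checked with a few lines of Python — a minimal sketch, assuming a standard American wheel (38 pockets: 1–36, 0, and 00) and the usual 35-to-1 straight-up payout:

```python
# Returns on an American roulette wheel, per $1 bet.
# A winning straight-up bet pays 35-to-1, so you get $36 back total.
POCKETS = 38

# Straight-up bet on a single number: win probability 1/38.
straight_win_prob = 1 / POCKETS
straight_expected = straight_win_prob * 36   # ~0.947, the "95/100" above

# Red/black bet: 18 winning pockets, payout 1-to-1 ($2 back per $1 bet).
redblack_win_prob = 18 / POCKETS
redblack_expected = redblack_win_prob * 2    # also ~0.947; the house edge is the same

print(f"straight-up: win {straight_win_prob:.1%} of the time, "
      f"expected return ${straight_expected:.3f}")
print(f"red/black:   win {redblack_win_prob:.1%} of the time, "
      f"expected return ${redblack_expected:.3f}")
```

Note that the expected return is the same for both bets; what changes is the shape of the distribution, which is exactly why the mean alone is a misleading summary.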

But would I be lying to you if I said “hey, you *could* win $36”?

By some standards no, this isn’t lying. But most people would acknowledge the hiding of information as a lie of omission. If someone tried to entice someone else to play roulette by telling them only that they could win $36 for every $1 they put down, I would definitely consider that lying.

So too does the popular press lie. Climate science is a science of statistics and of predictions. Like Nate Silver’s election forecasting, climate models don’t just give you a single forecast; they tell you what range of possibilities you should expect and how often you should expect them. For instance, Nate Silver made a point in 2024 that while his forecast showed Harris and Trump with about even odds to win, you shouldn’t have expected them to split the swing states evenly and have the election come down to the wire. The most common result (the mode) was for one candidate to win *all* the swing states together, which is indeed what happened.

Bad statisticians and prediction modelers misstate the range of possible outcomes. They heavily overstate their certainty, understate the variance, and pretend that some singular outcome is so likely as to be guaranteed.

This kind of bad statistics was central to Sam Wang of the Princeton Election Consortium’s 2016 prediction, which gave Hillary Clinton a greater than 99% chance of victory. Sam *massively* overstated the election’s certainty, and frequently attacked anyone who dared to caution that Clinton wasn’t guaranteed to win.

Nate Silver meanwhile was widely criticized for giving Hillary such a *low* chance of victory, at around 70%. He was “buying into GOP propaganda” so Sam said. Then after the election Silver was attacked by others for giving Clinton such a *high* chance, since by that point we knew she had lost. But 30% chance events happen 30% of the time. Nate has routinely been more right than anyone else in forecasting elections.

I don’t doubt that some people read and believed Sam Wang’s predictions, and even believed (wrongly) that he was the best in the business. When he was proven utterly, completely wrong, how many of his readers decided forecasting would never be accurate again? How much damage did Sam Wang do to the popular credibility of election modeling?

However much damage Sam did, the popular press has done even more to damage the statistical credibility of science, and here we return to climate change. Climate change is happening and will continue to accelerate for the foreseeable future until drastic measures are taken. But how much the earth will warm, and what effects this will have, have to be modeled in detail and there are large statistical uncertainties, much like Silver’s prediction of the 2016 election.

Yet I have been angry for the last 20 years as the popular press continues to pretend long-shot possibilities are dead certainties, and to understate the range of possibilities. Most of the popular press follows the Sam Wang school.

At the roulette table, you might win $36, but that’s a long-shot possibility. And in 2006 and 2007, we might have predicted that all the glaciers would melt and super-hurricanes would become common. But those were always long-shot possibilities, and indeed these possibilities *have not happened*.

The climate has been changing, the earth has been warming, but you don’t have to go back far to see people making predictions so horrendously inaccurate that they destroy the trust of the entire field. If I told you that you were dead certain to win $36 when putting $1 on the roulette wheel, you might never trust me again after you learned how wrong I was. Is it any wonder so many people aren’t trusting the science these days, when this is how it’s presented? When we were told 20 years ago that all the glaciers in America would have melted by now? Or that every hurricane season would be as bad as 2004?

And it isn’t hard either to find numerous even more dire predictions couched in weasel words like “may” and “possibly.” The oceans “may” rise by a foot, such and such city “may” be under water. It’s insidious, because while it isn’t *technically* wrong (“I only said may!”) it makes a long-shot possibility seem far more likely than it really is. Again, it’s a clear lie of omission, and it’s absolutely everywhere in the popular press.

We have to be accurate when modelling our uncertainty. We have to discuss the *full range of possibilities*, not just the possibility we *want* to use for fear-mongering. And we have to accurately state the likelihoods for our possibilities, not just declare the long-shot to be a certainty.

Because the earth *has* warmed. A glacier has disappeared from Glacier National Park and the rest are shrinking. Hurricane seasons are more powerful than they were last century. But writers weren’t content to write those predictions, and instead filled books with nonsense overstatements that were not borne out by the data and are easily disproven with a 2025 Google search. When it’s so easy to prove you wrong, people stop listening. And they definitely won’t listen when you “update” your predictions to match the far less eye-catching trend you should have written about all along. Lying loses you trust, even if you tell the truth later.

I think Nate Silver should be taken as the gold standard for modelers, statisticians, and more importantly *the popular press*. You *need* to model the uncertainties, and more importantly you need to *tell people* about those uncertainties. You need to tell them about the long shots, but also about *how* long-shot they are. You need to tell them about the most likely possibility too, even if it isn’t as flashy. And you need to tell them about the range of possibilities along the bell curve, and accurately represent how likely they all are.

Nate Silver did just this. In 2016 he accurately reported that Trump was still well within normal bounds of winning; an average-sized polling error in his favor was all it would take. He also pointed out that Clinton was a polling error away from an utter landslide (which played much better among the twitterati), and that she was the favorite (but not enough of the favorite to appease the most innumerate writers).

In *every* election Silver has covered, he has been the primary modeler accurately measuring the range of possibilities and preparing his readers for every eventuality. That gets him dogpiled when he says things people don’t like, but it means he’s accurate, and accuracy is supposed to be more important than popularity in science.

So my demand to the popular press is to be more like Nate Silver and less like Sam Wang. Don’t overstate your predictions, don’t downplay uncertainties, don’t make extreme predictions to appeal to your readers. Nate Silver has lost a lot of credibility for his temerity to continue forecasting accurately even in elections that Democrats don’t win, but Sam Wang destroyed his credibility in 2016 and has been an utter joke ever since. If science is to remain a force for informing policy, it needs to be credible. And that means making accurate predictions even if they aren’t scary enough to grab headlines, or aren’t what the twitterati would prefer.

Lying only works until people find you out.

If the government doesn’t do this, no one will

I’m not exactly happy about the recent NIH news. For reference, the NIH has decided to change how it pays for the indirect costs of research. When the NIH gives a 1 million dollar grant, the university that receives the grant is allowed to charge a number of “indirect costs” to support the research.

These add up to a certain percentage tacked onto the price of the grant. For a Harvard grant this was about 65%; for a smaller college it could be 40%. What it meant was that a 1 million dollar grant to Harvard was actually 1.65 million, while a smaller college got 1.4 million: 1 million was always for the research, but the extra 0.65 or 0.4 million was for the “indirect costs” that made the research possible.

The NIH has just slashed those costs to the bone, saying it will pay no more than 15% in indirect costs. A 1 million dollar grant will now pay out no more than 1.15 million.
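The before-and-after arithmetic is simple enough to put in a few lines of Python (a sketch using the rates quoted above; the function name is just for illustration):

```python
def total_award(direct: float, indirect_rate: float) -> float:
    """Total payout: direct costs plus the indirect-cost percentage on top."""
    return direct * (1 + indirect_rate)

direct = 1_000_000  # a 1 million dollar grant

for label, rate in [("old Harvard rate", 0.65),
                    ("old smaller-college rate", 0.40),
                    ("new NIH cap", 0.15)]:
    print(f"{label} ({rate:.0%}): ${total_award(direct, rate):,.0f}")
```

The same direct costs flow to the researcher in every case; only the university's cut on top changes.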

There’s a lot going on here so let me try to take it step by step. First, some indirect costs are absolutely necessary. The “direct costs” of a grant *may not* pay for certain things like building maintenance, legal aid (to comply with research regulations), and certain research services. Those services are still needed to run the research though, and have to be paid for somehow, thus indirect costs were the way to pay them.

Also some research costs are hard to itemize. Exactly how much should each lab pay for the HVAC that heats and cools their building? Hard to calculate, but the building must be at a livable temperature or no researcher will ever work in it, and any biological experiment will fail as well. Indirect costs were a way to pay for all the building expenses that researchers didn’t want to itemize.

So indirect costs were necessary, but were also abused.

See, unlike what I wrote above, a *university* almost never receives a government grant; a *principal investigator* (called a PI) does instead. The PI gets the direct grant money (the 1 million dollars), but the university gets the indirect costs (the 0.4 to 0.65 million). The PI gets no say over how the university spends that money, and many have complained that far from supporting research, universities use indirect costs to subsidize their own largesse, beautifying buildings, erecting statues, and creating ever more useless administrative positions, rather than spending the money on what it’s supposed to support: research.

So it’s clear something had to be done about indirect costs. They were definitely necessary, if there were no indirect costs most researchers would not be able to research as Universities won’t allow you to use their space for free, and direct costs don’t always allow you to rent out lab space. But they were abused in that Universities used them for a whole host of non-research purposes.

There was also what I feel is a moral hazard in indirect costs. More prestigious universities, like Harvard, were able to demand the highest indirect costs, while less prestigious universities were not. Why? It’s not like research costs more just because you have a Harvard name tag. It’s just because Harvard has the power to demand more money, so demand they shall. Of course Harvard would use that extra money they demanded on whatever extravagance they wanted.

The only defense of Harvard’s higher costs is that it’s doing research in a higher cost of living environment. Boston is one of the most expensive cities in America, maybe the world. But Social Security doesn’t pay you more if you live in Boston or in Kalamazoo. Other government programs hand you a set amount of cash and demand you make ends meet with it. So too could Harvard. They could have used their size and prestige to find economies of scale that would give them *less* proportional indirect costs than could a smaller university. But they didn’t, they demanded more.

So indirect costs have been slashed. If this announcement holds (and that’s never certain with this administration; it’s equally likely to be walked back or undone by a lawsuit), it will lead to some major changes.

Some universities will demand researchers pay a surcharge for using facilities, and that charge will be paid out of direct costs instead. The end result is that the university still gets money, but we can hope that money will have a bit more oversight. If a researcher balks at a surcharge, they can always threaten to leave and move their lab.

Researchers as a whole can likely unionize in some states. And researchers, being closer to the university than the government, can more easily demand that this surcharge *actually* support research instead of going to the University’s slush fund.

Or perhaps it will just mean more paperwork for researchers with no benefit.

At the same time some universities might stop offering certain services for research in general, since they can no longer finance that through indirect costs. Again we can hope that direct costs can at least pay for those, so that the services which were useful stay solvent and the services which were useless go away. This could be a net gain. Or perhaps none will stay solvent and this will be a net loss.

And importantly, for now, the NIH budget has not changed. They have a certain amount of money they can spend, and will still spend all of it. If they used to give out grants that were 1.65 million and now give out grants that are 1.15 million, that just means more individual grants, not less money. Or perhaps this is the first step toward slashing the NIH budget. That would be terrible, but no evidence of it yet.

What I want to push back on though, is this idea I’ve seen floating around that this will be the death of research, the end of PhDs, or the end of American tech dominance. Arguments like this are rooted in a fallacy I named in the title: “if the government doesn’t do this, no one will.”

These grants fund PhDs who then work in industry. Some have claimed that this change will mean there won’t be bright PhDs to go into industry and work on the future of American tech. But to be honest, this was always privatizing profit and socializing cost. All Americans pay taxes that support these PhDs, but the benefits overwhelmingly go to the PhD holder and the company they work for, neither of whom had to pay for it.

“Yes but we all benefit from their technology!” We benefit from a lot of things. We benefit from Microsoft’s suite of software and cloud services. We benefit from Amazon’s logistics network. We benefit from Tesla’s EV charging infrastructure. *But should we tax every citizen to directly subsidize Microsoft, Amazon, and Tesla?* Most would say no. The marginal benefits to society are not worth the direct costs to the taxpayer. So why subsidize the companies hiring PhDs?

Because people will still do things even if the government doesn’t pay them. Tesla built a nationwide network of EV chargers while the American government couldn’t even build 10 of them. Federal money was not necessary for Tesla to build EV chargers; they built them of their own free will. And before you falsely claim how much Tesla is government subsidized, an EV tax credit benefits the *EV buyer*, not the EV seller. And besides, if EV tax credits are such a boon to Tesla, then why not own the fascists by having the Feds and California cut them completely? Take the EV tax credits to zero, that will really show Tesla. But of course no one will, because we all know who the tax credits really support: the buyers. We want to keep the credits to make sure people switch from ICE cars to EVs.

Diatribe aside, Tesla, Amazon, and Microsoft have all built critical American infrastructure without a dime of government investment. If PhDs are so necessary (and they probably are), then I don’t doubt the market will rise to meet the need. I suspect more companies will be willing to sponsor PhDs and university research. I suspect more professors will become knowledgeable about IP and will attempt to take their research into the market. I suspect more companies will offer scholarships where, after earning your PhD, you promise to work for the company on X project for Y years. Companies won’t just shrug and go out of business if they can’t find workers; they will, in fact, work to make them.

I do suspect there will be *less* money for PhDs in this case, however. As I said before, the PhD pipeline in America has been to privatize profits and socialize costs. All American taxpayers pay billions towards the universities and researchers that produce PhD candidates, but only the candidates and the companies they work for really see the gain. Perhaps this can realign the PhD pipeline with what the market wants and needs: fewer PhDs of dubious quality and job prospects, more with necessary and marketable skills.

I just want to push back on the idea that the end of government money is a death knell for industry. If an industry is profitable, and if it sees an avenue for growth, it will reinvest profits in pursuit of growth. If the government subsidizes the training needed for that industry to grow, industry will spend its money on infrastructure, marketing, IP, and everything else instead. If training is no longer subsidized, then industry will subsidize it itself. If PhDs are really needed for American tech dominance, then I absolutely assure you that even the complete end of the NIH will not end the PhD pipeline; it will simply shift it towards company-sponsored or (for the rich) self-sponsored research.

Besides, the research funding provided by the NIH is still absolutely *dwarfed* by what pharma companies collectively spend, and there are hundreds of pharma companies *and many many other types of health companies* out there doing research. The end of government-funded research is *not* the end of research.

Now just to end on this note: I want to be clear that I do not support the end of the NIH. I want the NIH to continue, I’d be happier if its budget increased. I think indirect costs were a problem but I think this slash-down-to-15% was a mistake. But I think too many people are locked into a “government-only” mindset and cannot see what’s really out there.

If the worst comes to pass, and you cannot find NIH funding, go to the private sector, go to the non-profits. They already paid lower indirect-cost rates than the NIH, but they still funded a lot of research, and they will continue to do so for the foreseeable future. Open your mind, expand your horizons, and find out how you can get non-governmental funding, because if the worst happens, that may be your only option.

But don’t lie and whine that if the government doesn’t do something, then nobody will. That wasn’t true with EV chargers, it isn’t true with biomedical research, and it is a lesson we all must learn if the worst does start to happen.

So just how *do* you get good at teaching?

As a scientist with dreams of becoming a professor, I know teaching is part of the package. Whether it’s a class of undergraduates or a single student in a lab, your knowledge isn’t worth anything if you cannot teach it to others. I always say: no one would have cared about Einstein if he couldn’t accurately explain his theories. It doesn’t matter how right you are, science demands you explain your reasoning, and if you can’t explain it in a way that convinces others, you still have a ways to go as a scientist.

Einstein was a teacher. After discovering the Theory of Relativity, he wrote and lectured so as to teach his theory to everyone. Likewise I must be a teacher, whether teaching basic concepts to a class of dozens, or teaching high-level concepts to an individual or a small group, teaching is part of science, and mandatory for a professor.

But how do I get good at it?

The first problem is public speaking. I don’t think I get nervous speaking in public, but I do have a tendency to go too fast, so that my words fail to articulate what I’m actually thinking. It’s hard to realize that the concepts you know in your head will be novel to the whole world that lives *outside* your head. When teaching these concepts to someone else, you need to go step by step so that they understand the logical progression; you can’t just make a logical leap because you already know the intervening steps.

So OK, I need to practice speaking more, but beside that, what’s the best method for teaching? And here we get to the heart of why I’m writing this post, *I don’t know and I don’t think anyone does*.

Every decade, it seems, sociologists find One Weird Trick to make students learn, and every decade it seems that trick is still leaving many students behind. When I went to school, teaching was someone standing at the front of the class giving a lecture, after which students would go home and do practice problems. This “classic” style of teaching is now seen as passé at best, outright harmful at worst, and while it’s still the norm, it’s actively shunned by most newer teachers.

Instead, teachers now have a battery of One Weird Tricks to get students to *really* learn. “ACTIVE learning” is the word of the day, the teacher shouldn’t just lecture but should involve the students in the learning process.

For instance, the students could each hold remote controls (clickers) with the numbers 1 through 4 on them. Then the teacher will put up a multiple-choice question at random points during class, and the students will use their clicker to give the answer they think is correct. There’s no grade for this except participation, and the students’ answers are anonymized, but the teacher will give the correct answer after all the students answer, and a pie chart will show the students how most of their classmates answered. So the theory is that this will massively improve student learning in the following ways:

  • Students will have a low-stakes way to test their knowledge and see if they’re right or wrong, rather than the high-stakes tests and homework that they’re graded on. They may be more willing to approach the problem with an open mind, rather than being stressed about how it will affect their grade.
  • The teacher will know what concepts the students are having trouble with, and can give more time to those prior to the test.
  • Students stay more engaged in class, rather than falling asleep, and likewise teachers feel more validated with an attentive class.

The only problem is that the use of clickers has been studied, and it has failed to improve student outcomes. Massive studies and meta-analyses, spanning dozens of classes and thousands of students, show that clickers don’t improve students’ learning at all over boring old lectures.

Ok, how about this One Weird Trick: “flipped classrooms.” The idea is that normally the teacher lectures in class and the students do practice problems at home. What if instead the students’ homework is to watch the lecture as a video, then in class students work on problems and the teacher goes around giving them immediate and personalized feedback on what they’re doing right or wrong?

In theory this again keeps students far more active, they’re less likely to sleep through class and the immediate feedback they receive while working through the problem sets helps the teachers and students know what they need to work more on. Even better, this One Weird Trick was claimed to narrow the achievement gap in STEM classes.

But another large meta-analysis showed that flipped classrooms *again* don’t improve student learning, and in fact *widen* the achievement gap between minority and white students. Not at all what we wanted!

In theory, science teaches us the way to find the truth. Our methods of storing information have gotten better and better as we’ve used science to improve data handling, data acquisition, and data transmission. I read both of those meta-analyses on my phone, whereas even just 30 years ago I would have had to physically go to a university library and check out one of their (limited) physical journals if I wanted to read the articles and learn whether Active Learning is even worth it.

But while we’ve gotten so much better at storing information, have we gotten any better at teaching it? We’ve come up with One Weird Trick after One Weird Trick, and yet the most successful (and common) form of teaching is a single person standing in front of 20-30 students, just talking their ears off. A style of teaching not too far removed from Plato and Aristotle, more than 2,000 years ago.

I want to get better at teaching, and I think public speaking is part of that. But beyond just speaking gooder, does anyone even know what good teaching *is*?