Where is college affordable?

I was talking to a friend recently about a conversation I overheard between two co-workers. One of them had attended Georgia Tech (in Georgia, natch), the other had attended Caltech (although they were from NYC). The Georgia Tech alum said “even though I’m from California, Georgia Tech was the more affordable option, so I went there.”


It’s an amazing statement, because many people who’ve gone to college would assume the opposite. Most states make a distinction between “in-state” students and “out-of-state,” with out-of-state students paying a premium on tuition. Many states also have special scholarships and funds only available for their in-state students.

You’d therefore think that a California student would find Caltech the more affordable option, since they’re both top-ranked schools and California should be providing in-state benefits that Georgia Tech doesn’t. But it wasn’t so for this student.


I told this to a California-based friend of mine who replied “I’m surprised they thought California schooling was more expensive, California community college is free.”


But I think there’s a disconnect between students and those outside the University system as to what constitutes “affordability,” and community college is one part of this disconnect. I won’t bury the lede too much: students don’t want community college because it doesn’t offer them what a proper college does. Making community college affordable doesn’t make college itself affordable.


There are a lot of reasons to go to college:

  • Gain skills
  • Gain experience
  • Build networks
  • Signal value

None of these are well-served by community college.


Gaining skills is the place where a community college should have the greatest value: a calculus class there should teach you the same things as a calculus class at MIT. Math is math, right? But the specific in-class material is only one part of the skill-building that students seek.


The “quality” colleges that students want to attend will be staffed by eminent professors who are on the cutting edge of research in their field. And yes, there is “research” in mathematics. Students don’t just want to learn the facts that were written down decades ago, they want exposure to the unexplored parts of their discipline so they can learn how to contribute themselves. So for any STEM student, just taking community college courses doesn’t really give you much “skill” in your field, because while you may have the book-learning, you don’t have any exposure or experience working on the cutting edge.


At a research university students can work in labs, attend conferences, and hear seminars by famous professors, none of which you can do at a community college. So really a community college cannot give you the skills that a “real” university can; it can only give you the knowledge base that you can later use to build those skills.


Gaining experience is somewhat different from gaining skills. In both STEM and the humanities, students want some experience in their discipline of choice, both to make sure they actually like it and to better land a job right after graduation. Community colleges are usually overlooked by companies canvassing for interns, and for good reason, since many of the students at a community college already have jobs. Likewise, a community college will have far fewer clubs or professional guilds through which a student can find new opportunities for leadership or work.


So everything from internships to student leadership positions to funded fellowships is a lot harder to find at a community college than at a university. And for those reasons, students who want experience will shy away from community colleges even if they are free.


Networks are again hard to build at community colleges. Harvard isn’t Harvard just because the students are smart; it has an enormous pull factor in part from the children of the rich and famous who attend. At a tier-1 university you can rub shoulders with future CEOs and politicians. Even at a tier-2 or -3 university you can find future managers and leaders. Community colleges are largely attended by people who already have a job and are trying to upskill into a better one, and who therefore don’t have the time or energy to network.


Community colleges are also largely attended by students who couldn’t get accepted anywhere else. It’s sad but true that most students who didn’t have the grades to go to a university likely don’t have the wherewithal to be future movers and shakers. So your networking opportunities are not great at a community college, and students who want to network will avoid them.


Finally, signaling value. It has become cliché to say that college doesn’t teach you anything and the only reason to attend is as a signal. I disagree with this, but it is true that college acts as a signal to future employers. By attending a “good” university, you show that you are skilled and capable; at the very least you were skilled enough to get in and graduate, so you can probably be trusted with an entry-level position.


Yet again, this is something that community college just doesn’t provide. Because community colleges are required to take everyone, employers are wary about the actual quality of the graduating students. Fairly or not, community college is not a strong signal, and so students who believe in the signaling theory will not want to attend.


I think everything I’ve written here is mostly a “duh,” but it still needs to be said. Because when the debate about the cost of university turns to “free community college,” we need to recognize that “there is a price cheaper than free.” Just because something is free doesn’t mean it’s the best option or truly the cheapest option. What you save in tuition, you lose in opportunities to network, to gain skills, to gain experience, and to signal your value.


Now, you can still do all these things outside of a community college. You can network outside of college while taking free classes, find internships, become a local leader, and so on. But all those things require a lot of time, effort, and frankly even money (travel for networking isn’t cheap). A university provides them along with your tuition. So this is what I mean by “a price cheaper than free”: community college classes may be “free,” but they don’t provide you with any of the benefits that people actually want from a university. And trying to match a university’s benefits alongside the free classes of a community college will usually end up costing more time and money than just attending the university instead.
Free community college was pushed as a solution for the college affordability crisis, but it really isn’t one at all.

Community college has a place, but it doesn’t provide what university students want at all. And that’s part of why California schools remain unaffordable, even for California natives, because the state is subsidizing what students don’t want while allowing costs to grow out of control on what students actually need. If California at least built a hell of a lot more housing, then cost of living would go down and the universities would become cheaper to attend by default. But I suspect more and more that the “quality” California universities will become a playground for the rich or connected, with much of the middle class going out-of-state instead.

Declaring victory on my Twitter prediction, conceding defeat on self-driving cars

I’ve made a few predictions over the years here, and I want to talk about two of them.

I’m declaring victory in saying that 2022 was *not* the Year Twitter Died. It was a widely held opinion in left-of-center spaces that Musk was a terrible CEO, that firing so much Twitter staff would destroy the company, and that it would soon be dead and overtaken. I can concede the first point; the latter two are clearly false.

The evidence from history has shown that firing most of Twitter’s staff has *not* led to mass outages, mass hacks, or the death of Twitter’s infrastructure. It may seem like I’m debating a strawman, but it’s difficult to really convey the ridiculous hysteria I saw, with some claiming that Twitter would soon be dead and abandoned as newer versions of most popular browsers wouldn’t be able to access it. Likewise it was claimed that the servers would be insecure and claimed by botnets, and would thus get blocked by any sane browser protection. None of that has happened; Twitter runs just as it did in 2021. It is no less secure and is not blocked by most browsers.

Nor has the mass exodus of users really occurred. Some people think it has because they live in a bubble, but Mastodon was never going to replace Twitter and Bluesky is losing users. And regardless of your opinions on that, the numbers don’t lie.

I’ve said before that I used to be part of a community that routinely thought Musk’s sky was falling. Every Tesla delay would be the moment that *finally* killed the company, every year would be when NASA *finally* kicked SpaceX to the curb, every failed Musk promise would *finally* make people stop listening to him. You’ve heard of fandoms; I was in a hatedom.

But I learned that all of that was motivated reasoning. EVs aren’t actually super easy, and that’s the reason Ford and GM utterly failed to build any. It’s not that Musk was lucky and would soon be steamrolled by the Big Boys; Musk was smart (and lucky), and the Big Boys wet their Big Boy pants and have still utterly failed in the EV market despite billions of dollars in free government money.

Did Musk receive free government money? Not targeted money no, any car company on earth could have benefited from the USA/California EV tax credits, it’s just that the Detroit automakers didn’t make EVs. Then they got handed targeted free money, and they still failed to make EVs.

NASA (and the ESA, and JAXA, and CNSA) haven’t managed to replicate SpaceX’s success in low-cost re-usable rockets sending thousands of satellites into orbit. So now *another* Musk property, Starlink, is the primary way that rural folk can get broadband, because Biden’s billions utterly failed to build any rural broadband.

And of course while Musk has turned most of the left against him, he has turned much of the right for him, which is generally what happens when you switch parties. And now that he’s left Trump, some of the left want to coax him back. Clearly people still listen to him even if you and I do not.

So I was very wrong 10 years ago about Elon Musk being the anti-Midas, but I learned my lesson and started stepping out of my bubble. I was right 3 years ago when I said Twitter isn’t dying, and everything I said still rings true. Big companies still use Twitter because it’s their best way to mass-blast their message to everyone in an age when TV is dying and more people block ads with their browser. The reason people prefer Bluesky (curate your feed, never see what you don’t want to see) is the same reason Wendy’s, Barstool Sports, and Kendrick Lamar prefer Twitter. They want their message, their brand, to show up in your feed even if you don’t want to see it. It’s advertising that isn’t labeled as an ad.

So that’s what I was right about, now I’m going to write a lot *less* about what I was wrong about, because I hate being wrong.

I was wrong about how difficult it would be to get self-driving cars on all roads. In 2022 I clowned on a 2015 prediction that said self-driving cars would be on every road by 2020. Well it’s 2025, and I’ll be honest 5 years late isn’t that terrible.

At the time I thought that there was a *political-legal* barrier that would need to be overcome: how do you handle insurance of a self-driving car? No system is perfect and if there’s a defect in the LIDAR detector or just a bug in the system, a car *can* cause damage. And if it does, does Google pay the victim, or the passenger, or what? Insurance is a messy, expensive system, split into 50 different systems here in America, and I thought without some new insurance legislation (such as unifying the insurance systems or just creating more clarity regarding self-driving cars), that the companies would realize they couldn’t roll these out without massive risk and headaches.

I was wrong; I’ve now seen Waymos in every city I’ve been to.

So it seems the insurance problems weren’t insurmountable, and the problem was less hard than I thought. You can read my thoughts about how hard I *thought* those problems were, but to be honest I was wrong.

Assuming your political opponents are just “misinformed” only guarantees that you won’t win them over

A bit more stream-of-consciousness than other posts, because I’m writing late at night. But here goes:

I don’t know much about right-of-center political shibboleths, but it’s been a shibboleth on the left that people only vote conservative because they “don’t know any better.” They’re “misinformed,” they’re “voting against their own interests,” they’re “low-information voters”: these, supposedly, are the only reasons anyone votes for the GOP. Never mind that the “low-information voter” tag was first (accurately) applied to the *Obama* coalition before Trump upset the political balance of power.

Remember that in the 2012 matchup, Obama voters consumed less news than Romney voters, and were less informed on the issues at large. But in those days calling someone a low-information voter was nothing less than a racist dog-whistle (at least among the left-of-center). By 2016, Trump had upended American politics by appealing to many voters of the Obama coalition, and now this racist dog-whistle was an accurate statement of fact on the left.

“Yes, some voters just don’t know any better. They don’t know the facts, they don’t know right from wrong, they just don’t know. And if they don’t know, the quickest solution is to teach them, because once we give them the knowledge that ‘we’ (the right-thinking people) have, they’ll vote just like we do.”

But attacking liberals (in 2012) and conservatives (in 2016 and 2024) as “low-information” is old hat, what about attacking leftists?

That’s what the Atlantic’s Jonathan Chait has done in a recent article. Now, he doesn’t directly state “leftists are misinformed” like he would say about conservatives. It’s obvious Chait still wants leftists in his coalition and doesn’t want to insult them too badly. But he’s laying out the well-worn left-of-center narrative that his political opponents do not understand things, and that he needs to teach them how the government actually works so they can agree with his positions and support his favorite policies.

In Chait’s view, leftists just don’t get that the government is too restrictive, and that these restrictions are the cause of the housing crisis. They don’t realize it’s too regulatory, and those regulations harm growth. And they don’t get that government red tape is the reason all our infrastructure is dying and nothing new can be built. Chait attacks California High Speed Rail and Biden’s Infrastructure bill as hallmarks of this red tape. California HSR is 10 times over budget and still not a single foot of track laid down, while Biden signed the Infrastructure bill in 2021 and wrongly believed that he could have photo-ops in front of new bridges, factories, and ports in time for 2024.

The fruits of Biden’s infrastructure bills are still almost entirely unbuilt, their money still mostly unspent. And this lets Republicans make calls to overturn those bills and zero-out Biden’s spending. If his projects were actually finished on-time and during his presidency, Biden’s enemies could never attack his legacy like that. But government red tape stood in the way.

See, with claims like these, Chait is arguing in favor of the Abundance Agenda. I’m not entirely opposed to it. See my many posts on de-regulation.

But Chait is once again missing the mark here. He claims that Leftists don’t *understand* abundance, and that’s half of why they oppose it. He claims the other half is that they’ve built their power base as being the people who “hold government accountable” and oppose its over-reach. But Chait is mostly arguing that Leftists don’t realize that their crusade against Big Government is a “bad thing” that has made our economy worse. And I don’t think Leftists are misinformed at all, I think they just have different priorities than me and Jonathan Chait.

Let me explain through a specific example: Josh Shapiro is well-loved for repairing an I-95 overpass in rapid time. He did so by suspending all the red tape that usually slows down such infrastructure projects. Chait then argues: if we know we need to suspend the rules to get things done quickly, then why do we have these rules in the first place? They’re slowing us down and preventing us from building what’s needed, so shouldn’t we just remove some of them?

But here’s the red tape that Shapiro suspended:

  • There was no bidding process for procurement; contractors were selected quickly based on the Governor’s office’s recommendations
  • There were no impact studies for the building process
  • On-site managers were empowered to make decisions without consulting their superiors or headquarters
  • Pennsylvania waived detailed financial reporting processes
  • Pennsylvania waived most environmental reviews
  • Pennsylvania waived the requirement to notify locals of the construction, and to gain local approval for that construction

I don’t exactly have a problem with these ideas, and if Chait wants to make these de-regulations a central part of the Democratic brand, more power to him. But Chait is wrong that leftists are simply misinformed, I think many leftists would say that while these waivers are fine in an emergency, we should not support this deregulation for all projects, even if it saves us time and money. The reasons (for a leftist) are obvious.

  • Deregulating procurement is central to the Trump/DOGE agenda, and opponents say this opens the door to government graft as those in power can dole out contracts to their favorites.
  • Impact studies were also deregulated under Trump in two different executive orders. Biden revoked both orders at the start of his term because of his focus on health and the environment. I think most leftists would assert that protecting the environment and health is more important than other government priorities.
  • On-site vs HQ is less of an emotive topic, but the need for “oversight” is still a driving idea any time the government Does Stuff
  • Waiving of financial reporting opens up accusations of fraud
  • Waiving environmental reviews, see point 2
  • Waiving local notification and buy-in. You can probably get away with this when “re-”building, but will ANY Democrat stick their neck out and say locals shouldn’t have a say in new highway construction? I doubt it. Highways change communities, and any change needs community buy-in (so they say). This focus on localism is very popular on the right, left, and center, no matter how much I and the Abundance-crats may oppose it.

So Chait, do the leftists not understand Abundance? Or do they have strongly-held beliefs which are incompatible with Abundance?

This whole theory of “low-information voters” is always appealing to democracy’s biggest losers. It’s why the GOP liked it in 2012, and it’s why Democrats like it in 2024. The idea cocoons us in a comforting lie: that we alone have Truth and Knowledge, and that if only everyone were As Smart As Me, everyone would Vote Like Me.

It also seems Obviously True on the face of it. “The best argument against Democracy is a conversation with the average voter,” so the saying goes. And when you see any of your opponent’s voters interviewed directly, you can’t help but notice how much information they are *lacking*. And it’s obviously true, most people don’t know how government works, they don’t understand permitting, they don’t get that environmental impact reviews cost so much money and time. So obviously if we gave them that knowledge, they’d start voting “correctly,” right?

This misses an important point about political coalitions and humans in general: the wisdom of the crowds. Most people don’t know most things, but we all (mostly) take our cues from those who do know.

Think about the leftist coalition in America, the Berniecrats, the AOC stans, the DSA and the WFP. Most of the voters in this coalition don’t have a clue how environmental review works. But there are some in the coalition (probably including Bernie and AOC) who do know how it works, and the rest of the coalition takes its cues from those people.

There are certainly some people who have looked long and hard at the Abundance Agenda, and they have concluded that (for instance) removing environmental reviews would lead to Americans being exposed to more pollution and harmful chemicals. It was only because of environmental reviews that the EPA took action against PFAS, for instance.

So Chait is arguing that we need to reduce regulatory burden and reduce the ability of locals and activists to halt projects with their red tape and environmental reviews. I agree with this.

But Chait then argues that the only reason leftists don’t agree with us is that they don’t understand how harmful red tape and reviews are, and that leftists have thus led a wrong-headed campaign of being the people who say “no” to new buildings. I disagree with this.

I think the evidence shows that leftists simply have different beliefs than me and Chait. Leftists believe that red tape and reviews are necessary to protect the environment. And a leftist might argue that Chait complaining about environmental reviews is like a conservative complaining that “cars would be cheaper if they weren’t forced to have seatbelts and useless safety stuff.” Chait says environmental review doesn’t help us. Well, I’ve never needed my seatbelt either, because I’ve never crashed.

I’m sure you can see how stupid the seatbelt argument is, well that’s probably how stupid leftists would see Chait. Yes 99% of the time an environmental review finds nothing objectionable about a project, but what about those few times when they do? Do we scrap the whole system because it’s usually a waste of time? I say again: without environmental review, the EPA would not yet have taken action on PFAS. A leftist could seriously say to Chait: do you support allowing PFAS in the water? Because it might still be allowed without environmental review.

I don’t know what Chait’s response would be, I’m sure he’d try to say “well that’s different,” because any review that *found* something was clearly a good review. But you don’t know beforehand which reviews will find something dangerous and which won’t. To a leftist, that means you have to do them all.

Now, most leftists *do not understand environmental review*, just like most liberals, moderates, conservatives, and reactionaries. Most people don’t understand most things. But the leftist coalition includes people who *do* understand it, and they’ve weighed the costs and benefits and come out with a different stance than Chait has. The rest of the coalition takes its cues from the understanders, just like every other coalition does.

But Chait’s thesis is built on a lie that because most leftists don’t understand, they’ll side with him and Abundance once they *do* understand. I disagree strongly. Most leftists will continue taking their cues from the informed leftists, and Chait is not saying anything new to inform those informed leftists. The coalition will only modify its position on this issue once the majority loses faith in the understanders (and thus seeks new ones with new positions), or when enough of the current understanders retire and are replaced by new ones. Coalitions, like science, advance one funeral at a time.

But this idea that people are misinformed and just need a smart guy like *me* to set them straight, this is a central tenet of politics that I think needs to die. You shouldn’t assume your opponents are just misinformed; you need to understand that they *actually have different ideas than you do*, and try to win them over by finding common ground. Otherwise you’ll continue to be the Loser Coalition, just like Rush Limbaugh and the Romney-ites of 2012.

Research labs are literally sucking the blood from their graduate students

I’m going for a “clickbait” vibe with this one, is it working?

When I was getting my degree, I heard a story that seemed too creepy to be real. There was a research lab studying the physiology of white blood cells, and as such they always needed new white blood cells to do experiments on. For most lab supplies, you buy from a company. But when you’re doing this many experiments, using this many white blood cells, that kind of purchasing will quickly break the bank. This lab didn’t buy blood, it took it.

The blood drives were done willingly, of course. Each grad student was studying white blood cells in their own way, and each one needed a plethora of cells to do their experiment. Each student was very willing to donate for the cause, if only because their own research would be impossible otherwise.

And it wasn’t even like this was dangerous. The lab was connected to a hospital, the blood draws were done by trained nurses, and charts were maintained so no one gave more blood than they should. Everything was supposedly safe, sound, by the book.

But still it never seemed enough. The story I got told was that *everyone* was being asked to give blood to the lab, pretty much nonstop. Spouses/SOs of the grad students, friends from other labs, undergrads interning over the summer, visiting professors who wanted to collaborate. The first thing this lab would ask when you stepped inside was “would you like to donate some blood?”

This kind of thing quickly can become coercive even if it’s theoretically all voluntary. Are you not a “team player” if you don’t donate as much as everyone else? Are interns warned about this part of the lab “culture” when interviewing? Does the professor donate just like the students?

Still, when this was told to me it seemed too strange to be true. I was certain the storyteller was making it up, or at the very least exaggerating heavily. The feeling was exacerbated since this was told to me at a bar, and it was a “friend of a friend” story, the teller didn’t see it for themself.

But I recently heard of this same kind of thing, in a different context. My co-worker studied convalescent plasma treatments during the COVID pandemic. For those who don’t know, people who recover from a viral infection have lots of antibodies in their blood that fight off the virus. You can take samples of their blood and give those antibodies to other patients, and the antibodies will help fight the infection. Early in the pandemic, this kind of treatment was all we had. But it wasn’t very effective and my co-worker was trying to study why.

When the vaccine came out, all the lab members got the vaccine and then immediately started donating blood. After vaccination, they had plenty of anti-COVID antibodies in their blood, and they could extract all those antibodies to study them. My co-worker said that his name and a few others were attached to a published paper, in part because of their work but also in part as thanks for their generous donations of blood. He pointed to a figure in the paper and named the exact person whose antibodies were used to make it.

I was kind of shocked.

Now, this all seems like it could be a breach of ethics, but I do know that there are some surprisingly lax restrictions on doing research so long as you’re doing research on yourself. There’s a famous story of a scientist drinking water infected with a specific bacterium in order to prove that it was that bacterium which caused ulcers. This would have been illegal had he wanted to infect *other people* for science, but it was legal to infect himself.

There’s another story of someone who tried to give themselves bone cancer for science. This person also believed that a certain bone cancer was caused by infectious organisms, and he willingly injected himself with a potentially fatal disease to prove it. Fortunately he lived (bone cancer is NOT infectious), but this is again something that was only legal because he experimented on himself.

But still, those studies were all done half a century ago. In the 21st century, experimenting with your own body seems… unusual at the very least. I know blood can be safely extracted without issue, but like I said above I worry about the incentive structure of a lab where taking students’ blood for science is “normal.” You can quickly create a toxic culture of “give us your blood,” pressuring people to do things that they may not want to do, and perhaps making them give more than they really should.

So I’m quite of two minds about the idea of “research scientists giving blood for the lab’s research projects.” All for the cause of science, yes, but is this really ethical? And how much more work would it really have been to get other people’s blood instead? I just don’t think I could work in a lab like that; I’m not good with giving blood, I get terrible headaches after most blood draws, and I wouldn’t enjoy feeling pressured to give even more.

Is there any industry besides science where near-mandatory blood donations would even happen? MAYBE healthcare? But blood draws can cause lethargy, and we don’t want the EMTs or nurses to be tired on the job. Either way, it’s all a bit creepy, innit?

“No more austerity! The Government needs to invest!”

“Government” is capitalized here because we’re talking about the UK today. I meant to write about this earlier: Keir Starmer and Rachel Reeves have announced that benefit cuts will hit the UK this year. On top of last year’s tax hikes, this has raised the specter of Austerity, and fears of another Lost Decade in the UK, only this time with Labour at the helm.

Critics of the cuts abound, bringing complaints and counsel:

“What happened to the tax rises from last year?!?”

“Austerity failed already! We can’t keep cutting!”

“Tax the rich! Don’t cut off the poor!”

And finally: “We should invest, not cut!”

Let me address these one by one. First, as much as the left-of-center despises the Laffer Curve, it is still an accurate reflection of reality. Raising taxes raises prices and reduces demand, which nearly always means a tax rise brings in less money than the government predicts. Governments may claim to be modelling the demand reduction, but a government raising taxes is heavily incentivized to make broad claims about bringing in lots of money to balance the books. Accurate modeling plays second fiddle.
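A toy sketch of the point, with made-up numbers of my own (this is not a model of the UK or of any real tax): if the taxable base shrinks as the rate rises, then a “static” forecast that multiplies the new rate by the old base will overstate the take.

```python
# Toy Laffer-curve sketch with invented numbers (not a model of any real tax).
# Assume the taxable base shrinks linearly as the rate rises:
#   base(rate) = 100 * (1 - rate), so revenue = rate * base(rate).

def revenue(rate: float, base_at_zero: float = 100.0) -> float:
    """Revenue when the base shrinks with the rate; peaks at rate = 0.5."""
    return rate * base_at_zero * (1.0 - rate)

old_rate, new_rate = 0.4, 0.5
old_base = 100.0 * (1.0 - old_rate)    # base observed at the old rate

static_forecast = new_rate * old_base  # assumes the base won't move
actual = revenue(new_rate)             # base shrinks at the new rate

print(f"static forecast: {static_forecast:.1f}")  # 30.0
print(f"actual take:     {actual:.1f}")           # 25.0
```

The shape and elasticity of the curve here are invented; the only point is that any forecast which ignores the behavioral response will come in high.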

And this has been the case in the UK, the 40 billion pound tax rise announced last year isn’t expected to bring in quite that much. For instance, a tax on private school education was expected to raise money while affecting a minimal number of pupils. But the government underestimated how many families would be unable to afford the tax, pushing those kids back into the public schools, where they aren’t paying the tax and the government will have to pay for their education.

So the government’s tax rise didn’t bring in nearly enough, and they even raised spending on top of it. The UK now faces a yawning deficit, nearly 5% of GDP. With debt-to-GDP already over 100%, the government is finding borrowing unaffordable. The cost of financing all that debt is soaring; it’s 25% higher than it was a year ago, at more than 100 billion pounds a year. Remember, that 100 billion pounds is *just the cost of the interest payments*, assuming no money is spent actually paying down the debt. Labour is then adding that 5% deficit on top, which will need even more borrowing.

So borrowing is going to cost way more than Labour expected. If they don’t want to enter a debt spiral, they need to manage that deficit.
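A back-of-the-envelope sketch shows why. The round numbers here (debt at roughly 100% of GDP, a 5% deficit, about 4% interest on the stock, 1.5% growth) are my own illustrative assumptions, not official figures or forecasts:

```python
# Back-of-the-envelope debt-spiral sketch. All parameters are illustrative
# round numbers, not official data: debt ~100% of GDP, 5% deficit,
# ~4% interest on the stock, 1.5% GDP growth.

def project_debt(debt, gdp, deficit_share=0.05, interest_rate=0.04,
                 gdp_growth=0.015, years=5):
    """Each year the debt grows by the deficit plus interest on the stock,
    while GDP grows more slowly; returns the final debt-to-GDP ratio."""
    for _ in range(years):
        debt += deficit_share * gdp + interest_rate * debt
        gdp *= 1 + gdp_growth
    return debt / gdp

# Starting from debt equal to GDP (figures in billions of pounds):
print(f"debt/GDP after 5 years: {project_debt(2_700, 2_700):.0%}")
```

Under these assumptions debt-to-GDP climbs from 100% toward roughly 140% in five years. The exact figure doesn’t matter; the direction does, because borrowing plus interest compounds faster than the economy grows.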

“But Austerity failed already!” When did the UK ever implement austerity? It was the word of the decade under the coalition government, but despite the tough talk and tax rises, total spending increased every single year of the coalition and never went down. And these weren’t “cuts in real terms” either: *real spending*, i.e. inflation-adjusted spending, never went down during the Coalition government. It grew more slowly than under Blair/Brown, but it never went down. Boris Johnson has the (dis)honor of overseeing the only year-on-year reduction in real Government expenses, thanks to the massive pandemic spending that then petered out.

The UK hasn’t done austerity, and it isn’t doing austerity now. The announced cuts aren’t actual reductions in spending; they really just slow the rate of spending *increase*. Labour promised massive spending increases last year, and a few of those are being pared back into smaller increases. This is still an increase in real spending, just less of one than what was promised. This isn’t austerity.

And what of taxing the rich? They already pay most of the tax. The top 10% of UK earners pay 60% of all taxes, and the top 1% pay half of that (i.e. 30% of the total). The bottom 50% of earners pay 17% of tax. About a third of working-age Britons pay no tax at all.

And that is significantly more progressive than on the Continent: the German top 10% pay a little over half of their country’s taxes, and the German top 1% pay a little under a quarter. By and large, the UK taxes the rich more and the poor less than the rest of Europe does.

Of course, the real definition of “rich” is “1 standard deviation above my personal income.” Everyone agrees that someone *else* must pay more, but will the British economy really be improved by chasing off its last remaining high earners to America? Europeans have boasted that Trump will set off a “brain drain” of wealthy Americans, but the difference in after-tax earnings means that historically the brain drain has only ever run in the America-ward direction. Further tax hikes will only reinforce that paradigm.

Finally, shouldn’t the Government *invest* rather than *cut*? The private sector does it all the time! They take out eye-watering amounts of debt and yet somehow come out on top, the public sector should too!

But the Government doesn’t really invest. It spends money, and it uses the language of the private sector to claim that the money is spent well. But the Government doesn’t have the profit incentive that the private sector does; its overwhelming incentive is optics and votes. So as Biden showed us, Government “investment” never really generates a return.

Labour is right to cut spending. They’ve already hiked taxes, and they need to get borrowing costs under control somehow. Besides, Government spending as a proportion of GDP is already nearly 50% in the UK, about 17,000 pounds per person. Just over 10% of the population (people making more than 50,000 pounds) are putting in more money than they’re getting out. The Government already spends a lot of money, and not well. Throwing more money on the fire won’t necessarily help.

But like Nigeria’s president Tinubu, Keir Starmer has talked a big game on growth without having the stomach to follow through with it. So again, here’s my unsolicited policy advice:

Keir Starmer should liberalize (liberalise?) the UK’s labor (labour?) laws. UK companies are significantly constrained in their ability to fire, and this generates a reluctance to hire. The UK has stiff requirements on minimum notice before firing and minimum compensation when you get fired, and once you’ve worked somewhere for 2 years, a company needs to jump through significant regulatory hoops to be allowed to fire you. These laws should be liberalized to make it easier to fire, and therefore incentivize companies to hire.

I know this proposal doesn’t sit well with any of my readers. We’re all workers, I doubt any of us is an owner. But here’s the rule of labor markets: easy go, easy come. The easier it is to fire a worker, the more willing a company will be to hire, and the more nimble a company will be at navigating a changing market.

If a UK company wants to expand, it has to do so very slowly and carefully, because any new hire becomes a big liability after 2 years. UK companies can’t downsize to adjust to market conditions, and so they are hesitant to upsize even during the good times. That makes them grow more slowly, and believe it or not, it reduces wages.

Let’s look at Meta as an example: they laid off tens of thousands of employees when the “metaverse” proved to be a bust. They were able to lay people off quickly and refocus the company because those metaverse employees weren’t guaranteed a golden parachute. If firing were harder, they might have held on to their losing bet on the metaverse much longer, because the cost of firing would have offset the upside of changing tactics. Then again, if firing were harder, Meta might never have made a big, expensive bet on the metaverse to begin with.

See, the metaverse was a big, expensive failure, but US companies have to expect that most of their bets will fail. Some bets will succeed and wipe out all the losses from the failures, and so US companies are very quick to hire when they’re chasing a big bet.

The ballooning wages in Tech are a symptom of this. Companies like Google and Amazon have made big bet after big bet over the last 20 years, and when those bets pay off, the company starts offering higher and higher wages to expand on the success. Sometimes those bets go bad and you get layoffs like Meta’s. But many of those bets go well, and you find that starting salaries in America are higher than mid-tier salaries in most of Europe.

And while Tech is the most famous example, this is endemic in every American industry from energy to pharma and beyond. Liberalized labor markets mean companies are willing to make big bets, meaning some of those bets pay off and workers get chased with higher salaries. The workers are ultimately the ones who benefit here; that’s why America is such a magnet for high-skilled immigration (on top of its attractiveness for all immigration). Even with Trump in power, tens of thousands of highly skilled immigrants will continue to come to America every year he’s in office; the salaries are just too good to pass up.

That was a lot more than I expected to write on labor markets, but I’ve got more if you’re interested. Stay tuned for the next exciting installment of “if I ruled the world.”

If the government doesn’t do this, no one will

I’m not exactly happy about the recent NIH news. For reference, the NIH has decided to change how it pays for the indirect costs of research. When the NIH gives a 1 million dollar grant, the university which receives the grant is allowed to demand a number of “indirect costs” to support the research.

These add up to a percentage tacked onto the price of the grant. For Harvard, that rate was about 65%; for a smaller college it could be 40%. What this meant was that a 1 million dollar grant to Harvard was actually 1.65 million, while the same grant to a smaller college was 1.4 million: 1 million was always for the research, but the extra 0.65 or 0.4 was for the “indirect costs” that made the research possible.

The NIH has just slashed those costs to the bone, saying it will pay no more than 15% in indirect costs. A 1 million dollar grant will now give no more than 1.15 million.
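The arithmetic here is simple enough to write down. A minimal sketch of the old versus new awards (the rates are the ones quoted above; the function name is mine):

```python
# Indirect costs are applied as a percentage on top of the direct award,
# so the total outlay for a grant is direct * (1 + indirect_rate).

def total_award(direct: float, indirect_rate: float) -> float:
    """Total outlay for a grant at the given indirect-cost rate."""
    return direct * (1 + indirect_rate)

print(round(total_award(1_000_000, 0.65)))  # old Harvard-level rate: 1,650,000
print(round(total_award(1_000_000, 0.40)))  # old smaller-college rate: 1,400,000
print(round(total_award(1_000_000, 0.15)))  # new 15% cap: 1,150,000
```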

There’s a lot going on here, so let me try to take it step by step. First, some indirect costs are absolutely necessary. The “direct costs” of a grant *may not* pay for certain things like building maintenance, legal aid (to comply with research regulations), and certain research services. Those services are still needed to run the research, though, and have to be paid for somehow; indirect costs were the way to pay for them.

Also some research costs are hard to itemize. Exactly how much should each lab pay for the HVAC that heats and cools their building? Hard to calculate, but the building must be at a livable temperature or no researcher will ever work in it, and any biological experiment will fail as well. Indirect costs were a way to pay for all the building expenses that researchers didn’t want to itemize.

So indirect costs were necessary, but were also abused.

See, contrary to what I wrote above, a *university* almost never receives a government grant; a *principal investigator* (called a PI) does instead. The PI gets the direct grant money (the 1 million dollars), but the university gets the indirect costs (the 0.4 to 0.65 million). The PI gets no say over how the university spends that money, and many have complained that far from supporting research, universities use indirect costs to subsidize their own largesse, beautifying buildings, erecting statues, and creating ever more useless administrative positions, all without using that money for its intended purpose: supporting research.

So it’s clear something had to be done about indirect costs. They were definitely necessary: without them, most researchers would not be able to do research at all, since universities won’t let you use their space for free and direct costs don’t always allow you to rent lab space. But they were abused, in that universities used them for a whole host of non-research purposes.

There was also what I feel is a moral hazard in indirect costs. More prestigious universities, like Harvard, were able to demand the highest indirect rates, while less prestigious universities were not. Why? It’s not as if research costs more just because it carries a Harvard name tag. It’s just that Harvard has the power to demand more money, so demand it shall. Of course Harvard would spend that extra money on whatever extravagance it wanted.

The only defense of Harvard’s higher costs is that it’s doing research in a higher cost-of-living environment. Boston is one of the most expensive cities in America, maybe the world. But Social Security doesn’t pay you more for living in Boston rather than Kalamazoo. Other government programs hand you a set amount of cash and demand you make ends meet with it. So too could Harvard. They could have used their size and prestige to find economies of scale that would give them proportionally *lower* indirect costs than a smaller university. But they didn’t; they demanded more.

So indirect costs have been slashed. If this announcement holds (never certain with this administration; walking it back and being sued into undoing it are both equally likely), it will lead to some major changes.

Some universities will demand researchers pay a surcharge for using facilities, and that charge will be paid out of direct costs instead. The end result is the university still gets money, but we can hope that money will have a bit more oversight. If a researcher balks at a surcharge, they can always threaten to leave and move their lab.

Researchers as a whole can likely unionize in some states. And researchers, being closer to the university than the government, can more easily demand that this surcharge *actually* support research instead of going to the University’s slush fund.

Or perhaps it will just mean more paperwork for researchers with no benefit.

At the same time some universities might stop offering certain services for research in general, since they can no longer finance that through indirect costs. Again we can hope that direct costs can at least pay for those, so that the services which were useful stay solvent and the services which were useless go away. This could be a net gain. Or perhaps none will stay solvent and this will be a net loss.

And importantly, for now, the NIH budget has not changed. They have a certain amount of money they can spend, and will still spend all of it. If they used to give out grants that were 1.65 million and now give out grants that are 1.15 million, that just means more individual grants, not less money. Or perhaps this is the first step toward slashing the NIH budget. That would be terrible, but no evidence of it yet.
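To see why a fixed budget plus cheaper grants means more grants, a quick back-of-the-envelope calculation (the 40 billion dollar budget below is purely illustrative, not the actual NIH figure):

```python
# With a fixed budget, lowering the per-grant cost raises the number of
# grants the budget can fund. The budget figure is illustrative only.

def grants_fundable(budget: float, direct: float, indirect_rate: float) -> int:
    """How many grants of a given size a fixed budget covers."""
    return int(budget // (direct * (1 + indirect_rate)))

budget = 40_000_000_000  # hypothetical round number, not the real NIH budget
print(grants_fundable(budget, 1_000_000, 0.65))  # at 65% indirect: 24,242 grants
print(grants_fundable(budget, 1_000_000, 0.15))  # at the 15% cap: 34,782 grants
```

Same pot of money, about ten thousand more grants: the cut reallocates rather than reduces, at least so long as the budget itself is untouched.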

What I want to push back on though, is this idea I’ve seen floating around that this will be the death of research, the end of PhDs, or the end of American tech dominance. Arguments like this are rooted in a fallacy I named in the title: “if the government doesn’t do this, no one will.”

These grants fund PhDs who then work in industry. Some have tried to claim that this change will mean there won’t be bright PhDs to go into industry and work on the future of American tech. But to be honest, this was always privatizing profit and socializing cost. All Americans pay taxes that support these PhDs, but overwhelmingly the benefits are gained by the PhD holder and the company they work for, neither of whom had to pay for it.

“Yes, but we all benefit from their technology!” We benefit from a lot of things. We benefit from Microsoft’s suite of software and cloud services. We benefit from Amazon’s logistics network. We benefit from Tesla’s EV charging infrastructure. *But should we tax every citizen to directly subsidize Microsoft, Amazon, and Tesla?* Most would say no. The marginal benefits to society are not worth the direct costs to the taxpayer. So why subsidize the companies hiring PhDs?

Because people will still do things even if the government doesn’t pay them. Tesla built a nationwide network of EV chargers while the American government couldn’t even build 10 of them. Federal money was not necessary for Tesla to build EV chargers; they built them of their own free will. And before you claim that Tesla is heavily government-subsidized, note that an EV tax credit benefits the *EV buyer*, not the EV seller. Besides, if EV tax credits are such a boon to Tesla, why not own the fascists by having the Feds and California cut them completely? Take the EV tax credits to 0; that will really show Tesla. But of course no one will, because we all know who the tax credits really support: they support the buyers, and we want to keep them to make sure people switch from ICE cars to EVs.

Diatribe aside, Tesla, Amazon, and Microsoft have all built critical American infrastructure without a dime of government investment. If PhDs are so necessary (and they probably are), then I don’t doubt the market will rise to meet the need. I suspect more companies will be willing to sponsor PhDs and University research. I suspect more professors will become knowledgeable about IP and will attempt to take their research into the market. I suspect more companies will offer scholarships where after achieving a PhD, you promise to work for the company on X project for Y amount of years. Companies won’t just shrug and go out of business if they can’t find workers, they will in fact work to make them.

I do suspect there will be *less* money for PhDs in this case, however. As I said before, the PhD pipeline in America has been to privatize profits and socialize costs. All American taxpayers pay billions toward the universities and researchers that produce PhD candidates, but only the candidates and the companies they work for really see the gain. Perhaps this can realign the PhD pipeline with what the market wants and needs: fewer PhDs of dubious quality and job prospects, more with necessary and marketable skills.

I just want to push back on the idea that the end of government money is a death knell for industry. If an industry is profitable, and if it sees an avenue for growth, it will reinvest profits in pursuit of growth. If the government subsidizes the training needed for that industry to grow, then industry will instead invest in infrastructure, marketing, IP, and everything else. If training is no longer subsidized, then industry will subsidize it itself. If PhDs are really needed for American tech dominance, then I absolutely assure you that even the complete end of the NIH would not end the PhD pipeline; it would simply shift toward company-sponsored or (for the rich) self-sponsored research.

Besides, the funding for research provided by the NIH is still absolutely *dwarfed* by what a *single* pharma company can spend, and there are hundreds of pharma companies *and many many other types of health companies* out there doing research. The end of government-funded research is *not* the end of research.

Now just to end on this note: I want to be clear that I do not support the end of the NIH. I want the NIH to continue, I’d be happier if its budget increased. I think indirect costs were a problem but I think this slash-down-to-15% was a mistake. But I think too many people are locked into a “government-only” mindset and cannot see what’s really out there.

If the worst comes to pass, and if you cannot find NIH funding, go to the private sector, go to the non-profits. They already provided less than the NIH in indirect costs but they still funded a lot of research, and will continue to do so for the foreseeable future. Open your mind, expand your horizons, try to find out how you can get non-governmental funding, because if the worst happens that may be your only option.

But don’t lie and whine that if the government doesn’t do something, then nobody will. That wasn’t true with EV chargers, it isn’t true with biomedical research, and it is a lesson we all must learn if the worst does start to happen.

Perception and Reality

Well, the three and a half weeks since the June debate between Biden and Trump have been some of the most tumultuous in politics, ever. Since that debate:

  • The media perception of Biden has degraded from “frail but sharp old man” to “doesn’t always know what’s happening around him”
  • The Democratic Party line has gone from “Biden is the nominee, we can’t change him or it will cause chaos” to “Harris is the nominee”
  • Every Democrat in Congress seemed to be calling for Biden to step down, and
  • Biden has stepped down as candidate, endorsing Harris

Some Democrats have (as they have all year) said that this was nothing more than an overblown media circus that would never have caught fire if the lyin’ press hadn’t been so desperate for clicks that they cooked up a scandal. There’s a strong current in the Stancilite wing of the party claiming that every voter is an automaton who believes nothing except what the media says. So if the media says Biden is old, that’s what they believe. But the media *should* have said Biden was sharp as a tack and steering the ship of state, because then everyone would have believed that.

The idea that “The Media” (capital T capital M) is always against the Democrats is part and parcel of liberal mythmaking. Nevermind that it’s also part and parcel of *conservative* mythmaking, I encountered this liberal mythmaking first-hand in the aftermath of the Howard Dean campaign.

The liberal myth goes something like this: Howard Dean was a threat to the Establishment, with a powerful grassroots organization and nationwide appeal. But one night, when trying to give a triumphant yell, he instead gave a weird-sounding scream. The Media repeated the “Dean Scream” endlessly, making a mockery of him to the voters and torching his campaign. In his stead, the underwhelming, flip-flopping John Kerry was sent to lose against George W Bush. If *only* we’d stuck with Dean!

The problem with the “Dean Scream” myth is that it reverses cause and effect: it says that The Media used the Dean Scream to discredit him in the eyes of the voters. Yet looking at the record, the Dean Scream happened as he was trying to gin up his supporters after a dismal showing in the Iowa caucus, in which he vastly underperformed expectations and got just 18% of the vote, less than half of front-runner John Kerry and a very distant third behind the ascendant John Edwards.

Taken in context, The Media didn’t discredit Dean, the voters had already turned their backs on him. Dean was supposed to be a front-runner going into the caucus but his very poor showing put paid to that idea hours before his historic scream.

Kerry and Edwards would go on to be presidential and vice presidential nominees for that year.

Yet the idea that the Media creates perception (and therefore reality) still has power among the twitterati. When Biden was dealing with the fallout of the debate, many liberal commentators tore into The Media, claiming that if anyone was suffering from dementia it was rambly, half-awake Donald Trump. And since Biden has now dropped out, liberal commentators are trying to will a “Trump has dementia” angle into existence.

This seems like an insane take to me because *we all saw the debate*. No matter how much Trump lied and deflected, he said real words and you could understand them; Biden sounded like he was barely awake! The line of the night was Trump’s terrifyingly accurate quip: “I don’t know what he just said, and I don’t think he does either.”

And we can all see that Trump has done rally after rally after rally while Biden really *hasn’t*, and team Biden did everything in their power to prevent even a single off-script moment from ever being seen. All the while reports are coming in from allies all across congress and *across the Atlantic* that Biden hasn’t been all there for a really long time, and is confusing people and places left and right.

Meanwhile the curious voter can tune into any one of the many rallies that Trump holds, or just watch Fox News, and see a man doing twice as many rallies, interviews, and the like as Biden, as well as infinitely more unscripted spots, since Biden didn’t seem to do any.

Saying Trump is too old will certainly resonate: half the country already thought he was, while 80% of the country thought Biden was. But trying to tar Trump with the same brush Biden got will not work, I think, because the reality doesn’t look like what the Democrats want out of a narrative. Like the Dean Scream myth, Democrats have taken away the idea that The Media creates reality, and that if they can just *will* a narrative into existence, they can say anything about their opponents that their opponents say about them. I don’t think that works, any more than Republicans calling Democrats election deniers works, because people have eyes.

At the end of the day The Media can certainly amplify stories and let narratives run away with things, but the idea that they can create something out of nothing is a myth. Democrats demanding “the media needs to be saying this” (i.e. “Trump has dementia, Trump can’t speak straight”), trying to get The Media to simply reverse the story and put all of Biden’s flaws on Trump, well, that isn’t going to work. They’d do a lot better hammering on things which are real instead of trying to create something out of nothing.

That may have been part of the problem for Democrats these past 3 weeks. While they were doing damage control for Biden, the most common rejoinder I saw was “Trump is just as old and just as senile!” The first is false but at least close to true: Trump is very old. The second is an outright lie; 50 million people saw the debate, and you can’t lie to their faces like that.

If Democrats lose, I think The Debate will enter the hall of myths alongside the Dean Scream, as a moment when The Media sharpened their knives and took out the strongest Democratic candidate available because (laughably) they were in the tank for Republicans. And I think myths like that will make the party far weaker than it would otherwise be.

Chickenhawks

Jingoism is a hell of a drug.

Nearly 20 years ago, in the latter years of Bush’s presidency, military intervention was anathema to most of the Democratic party. New interventions were treated with suspicion, and getting out of current wars was seen as paramount.

5 years ago, during Trump’s presidency, military intervention was again evil and bad. Trump’s assassination of an Iranian general was yet another reckless decision that would lead us to world war for little to no gain.

Yet today, the Democratic party is again making common cause with many of the foreign policy “hawks” that drove support for the wars in Iraq and Afghanistan. And somehow no one sees what’s wrong with this.

In 2023, the Houthis in Yemen began attacking ships transiting through the Red Sea on their way to the Suez Canal. The Red Sea and Suez Canal carry an enormous volume of trade to Europe, Africa, and Asia. Shutting off this passage means ships have to take the long way around Africa, which greatly raises prices and increases shortages.

Then in January of 2024, Biden put the Houthis back on the Global Terrorism list (he’d removed them from the list as one of his first acts as president), and announced the USA would begin bombing Yemen to stop the Houthi attacks.

Social media lit up with stupid talking points about America’s military might, and how “the Houthis are going to learn why America doesn’t have free healthcare.” Social media is overwhelmingly populated by the young and left-leaning, so seeing the same demographic group that protested the Iraq War now beating their chests over a bombing campaign was jarring to say the least.

And what happened? After months of bombing, the Houthis are still attacking ships. Shipping companies are still avoiding the Red Sea. Transit through the Suez is still down and prices due to circumnavigating Africa are still up.

And America still doesn’t have free healthcare.

The bombing campaign has clearly failed at its goal of ensuring safe traffic through the Red Sea. So much so that Biden has now offered a ceasefire where he will again remove the Houthis from the global terror list if they will stop attacking ships. America’s military might could not silence the enemy guns or enforce America’s will, and so we are once again forced to negotiate with terrorists.

To be fair to Biden, this may be the right move. He openly stated that he was only placing them on the global terrorism list because of their attacks against ships, removing them from that list if they stop attacking ships is only natural. It is a low-cost concession to the Houthis, as removing them from the list makes it easier for them to access international markets, but doesn’t do much to harm America directly.

But it’s still obvious that this was a failed bombing campaign, and it raises the question: if we’re negotiating with terrorists now, why didn’t we *start* with negotiations *before* bombing them? The bombing does not seem to have reduced the frequency or intensity of Houthi attacks; if anything it has only given the Houthis greater credibility in Yemen, as it has galvanized the populace to “rally ’round the flag.”

Hawks will complain that I’m being unfair: the bombing campaign was *not* a failure, America just wasn’t even trying to win. And it’s true, America has the capacity to conduct Dresden-level bombing campaigns and Desert Storm-level ground campaigns nearly at will. Neither of those happened, so America clearly wasn’t using its full might.

But was there any political will for carpet bombing or a ground invasion? Absolutely not: a tepid bombing campaign was all that would have been acceptable in an election year. And so if you take America as both a military and political entity, then yes, this bombing campaign was about all America was capable of.

But none of the chickenhawks who beat their chests in January will ever admit that the campaign was a failure, ever admit that we are negotiating with terrorists, ever admit that there were other options or other solutions. Thousands of politicians and military aficionados went to their graves believing that the War in Vietnam could have, should have been won, and that if we’d just stayed in a little longer (or nuked Hanoi), we would have won it. I have no doubt this campaign (much, much smaller as it is) will also be remembered thus by many.

But the fact is that there are not always military solutions. It’s a classic slogan to say that “we don’t negotiate with terrorists,” but it’s just not true; we negotiate with terrorists all the time.

An FBI negotiator brings a suitcase full of cash to a terrorist who has hijacked a plane.

There are times when terrorists have leverage over you, and the problem with leverage is that it exists whether you want it to or not. Whether that leverage is hostages, military might, or geographic position, you can’t just wish it away and pretend it doesn’t exist. Nations also have constraints: budgetary, political, logistic, which can constrain their military response significantly.

So while it’s true that in an open field with no holding back the American military would destroy the Houthi military without a single casualty, that’s not the war that Biden fought. Trying to remove terrorists from their own country that supports them without a ground invasion or naval blockade will always be a challenge. And if a nation is politically, economically, or logistically incapable of doing that, then they need to look hard at what they are *actually trying to accomplish*.

I have seen precious few cases in my adult life of military intervention leading to a lasting improvement in the situation. The best example would be the bombing campaign in Yugoslavia from nearly 3 decades ago. The second best example would be the few years of near-normality that the American military gave to Afghanistan, prior to the Taliban returning.

But one success and one partial success is a terrible track record for the number of military campaigns we’ve been engaged in. And it seems the Houthi campaign will be yet another mark in the failure column, as it has done nothing to eliminate Red Sea attacks which will almost certainly be ended only by negotiations if they are even ended at all.

So the next time social media lights up with chest-thumping about how American military might should be directed at a problem, think for more than a few seconds about whether a military solution is even possible.

Sorry for no posts, I was watching the eclipse

Sorry I haven’t posted in a while; I drove halfway across the continent to see the eclipse, and as soon as it finished I immediately drove right back home. After more than 24 hours of driving I was beat, and this week was kind of a wash for me after that.

But the eclipse itself was beautiful, and I encourage everyone to look for images of it online. NASA threw an entire party for the eclipse; I don’t know if they did that in 2017, but maybe with how popular the 2017 eclipse was, they felt they needed to.

There was also some real science being done during this eclipse, with telescopes trained on the sun to study its corona in great detail as the moon passed in front. A longstanding humorous story in the scientific community comes from an eclipse observed not long after Albert Einstein published his theory of general relativity. The theory predicted that light should bend when passing by massive objects, so scientists used a solar eclipse to observe stars hiding near the sun. As predicted by Einstein, their light appeared to be “bent” because it had passed so close to the sun on its way to us.

The newspapers published this with a somewhat hilarious line:

Stars not where they seemed or were calculated to be, but nobody need worry.

New York Times

The “but nobody need worry” always gets to me.

Regardless, eclipses are fun both for scientists and non-scientists alike. I hope if you missed this one, you’ll get to see one soon!

Surge Pricing and Dirty Deals

I’m sorry I haven’t been posting weekly like I promised to. February has not been kind to me. But I wanted to quickly fire off a post relating to two topics I’ve recently seen in the news.

The first has to do with the infamous Wendy’s “surge pricing” announcement, which the company has already walked back. As not all my readers are American, I’ll explain both Wendy’s and surge pricing.

Wendy’s is an American fast food burger chain like any other. Surge pricing, meanwhile, is what Uber and Lyft do when demand suddenly spikes: prices shoot up during that time, leaving customers to balk at paying $50 for a ride home from a baseball game when the ride into downtown may have cost just $30. Many Wendy’s customers were likewise furious at the idea of the price of a burger going up and down during the day, possibly meaning they’d pay more for their food than someone who’d walked in just a few minutes earlier.

The story got so much traction that Senator Elizabeth Warren even tweeted about it, trying to play up her corporate greed narrative. Little does Warren know that we’re now living in the era of Corporate Generosity.

Nevertheless, I’m always surprised that someone with Warren’s credentials is so economically illiterate. Surge pricing has been going on for decades, perhaps even centuries. The earliest examples I can think of are matinees: theatre productions (or movies) shown during the daytime for less than the evening showing. It costs exactly the same to run the show at either time, so why is the daytime show cheaper? And if you’ve ever seen a bar with a “happy hour,” a restaurant with an “early bird special,” or Halloween candy sold half-off in November, you’ve also seen surge pricing in action.

What’s going on here is simple supply and demand. The price of a good or service is *not* based on the cost to make it; the price comes from the interplay of supply and demand. The price fluctuates even when the cost does not, because sellers are trying to clear the market. Lower demand? Lower price.
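To make the mechanism concrete, here’s a toy sketch (all numbers are invented for illustration, not real restaurant data) of a seller with fixed capacity finding the price that clears the market. When demand drops, the clearing price drops with it, even though the seller’s costs never changed:

```python
# Toy illustration with made-up numbers: a seller with fixed serving
# capacity adjusts price until demand matches what it can serve.

def demand(price, base):
    """Hypothetical linear demand curve: customers per hour at a given price."""
    return max(0, base - 10 * price)

def clearing_price(capacity, base):
    """Bisect to the price where demand equals capacity."""
    low, high = 0.0, 100.0
    for _ in range(60):
        mid = (low + high) / 2
        if demand(mid, base) > capacity:
            low = mid   # too many customers: price must rise
        else:
            high = mid  # spare capacity: price can fall
    return round(high, 2)

# Dinner rush (high base demand) vs. mid-afternoon lull (low base demand):
print(clearing_price(capacity=50, base=200))  # higher price at the rush
print(clearing_price(capacity=50, base=80))   # lower price off-peak
```

Same kitchen, same costs, two different prices: the price moves to keep the number of would-be customers in line with what the restaurant can actually serve.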

But a restaurant also has service and shifts. Any server serving one customer is necessarily not serving another. At the same time, servers are paid for 8-hour shifts, and few people would take a job where they’re only paid minimum wage for 2 hours; the cost of transport alone would eat into the wage. This means that if everyone only comes to eat during dinner (say a 2-hour window from 4-6pm), the servers sit around for 6 hours doing nothing, then madly scramble for 2. During those 2 hours, many customers might come in only to find the line too long, or get seated but find the service poor because the servers are overworked.

Thus, for decades restaurants have lowered prices during the “slow” parts of the day to entice people to eat at those times instead of during the rush. This is exactly the same mechanism as Wendy’s “surge pricing,” only it’s framed differently. But it’s still the case that they’re charging more at dinnertime even though their costs are the same.

Surge pricing like this is actually a very good thing. It evens out demand in service industries, allowing more people to be served during a day while still letting the wait staff work full 8-hour jobs. And certain customers can take advantage of this, getting a lower price at the cost of not eating during a “normal” time. Warren (and other outraged twitterati) are simply jumping on a poorly framed policy to score very stupid political points. In fact, Burger King decided to dunk on Wendy’s poorly framed surge pricing policy by highlighting their own better-framed surge pricing policy. Every restaurant is like this, and it’s actually A Good Thing.

Speaking of restaurants but not of Good Things, Gavin Newsom is quite nakedly corrupt. I had only heard mild criticisms of Gavin before, but some Democrats I know claimed he was basically the candidate-in-waiting should Biden not run. He is Governor of America’s largest and wealthiest state, and would surely win election because the only thing Republicans could ever say against him were tired tropes about “Commiefornia.” But it turns out he’s actually corrupt.

I know this because he handed a political kickback to a buddy of his who owns at least two dozen Panera Bread restaurants. California is set to raise the minimum wage to $20/hr, except at restaurants that serve freshly baked bread. No, bagels and pastries do not count as “bread.” Panera is one of the very few restaurants that does, and so it will still be allowed to pay its employees just $16/hr.

You might think this would cause many restaurants to start opening up bakeries, but it gets even more corrupt: the restaurant must have been serving freshly baked bread in September 2023 to qualify. So only Panera is grandfathered in. Essentially, Gavin Newsom decided to directly use a government law to enrich his friend and confidant, and no one seems to really care.

Now of course he wasn’t handing his friend state money. But he was writing legislation that imposes costs on every single one of his friend’s rival businesses while shielding his friend. That will allow his friend (whose name, I just looked up, is “Greg Flynn”) to profit much more than anyone else in fast food, since he can keep the same prices while paying his staff 20% less than the competition.

Some of the twitterati have tried to defend Gavin indirectly, saying that it’s obviously corrupt but that this carve-out won’t actually do anything. They say that since every other restaurant will have to abide by the $20/hr minimum wage, no one will ever work for Panera for less than $20/hr either. But that ignores that people take jobs based on more than just the wage. Maybe the Panera is closer to you than the Taco Bell, maybe you hate the smell of fried foods and are loath to work at McDonald’s, maybe you don’t own a car and the Panera is the only restaurant in walking distance. Or maybe you have classes and Panera can offer you hours that better fit your schedule.

And Greg Flynn knows this. He knows he will likely be able to find at least some workers willing to work for just $16/hr; that’s why he asked Gavin to put the carve-out in the bill. But corruption and friend-dealing have never been punished too strongly in America, no matter how much partisans rage about how “the other side” is corrupt. Still, the naked corruption on display might have hurt Gavin in a national election, so Democrats are probably happier he didn’t decide to challenge Biden.