Neil deGrasse Tyson—director of the Hayden Planetarium in New York City—has turned his childlike awe at the mysteries of the universe into a blockbuster career as a highly regarded astrophysicist, pop culture icon, and Twitter provocateur.


On Season 3 of StarTalk, his beloved radio program turned Emmy-nominated late-night talk show, Tyson brings together movie stars, astronauts, high-wire walkers, athletes, comedians, and other public figures to get nerdy about the cosmos. This week, he also released a companion book called StarTalk: Everything You Ever Need to Know About Space Travel, Sci-Fi, the Human Race, the Universe, and Beyond, essentially a textbook for adults digging into topics like “Could Bigfoot Be A Space Alien?” and “Are Humans Monogamous—or ‘Monogamish’?”

The show kicks off Monday, September 19 on the National Geographic Channel with a talk about medical marijuana and Star Trek featuring actress Whoopi Goldberg. In a conversation with GOOD, Tyson shares a little about what we can expect this season, whether time travel to the past will ever be possible, and arguably his biggest social media controversy of the summer, in which he dared to make a case for a society founded on evidence and logic (instead of religion).

[youtube ratio=”0.5625″ position=”standard” caption=”A clip exclusive to GOOD from StarTalk season 3.”]

One of my favorite moments of the season happens in the very first episode, when Whoopi Goldberg tells you that she’s OK with being “dumb as hell”—essentially admitting that there are plenty of times she doesn’t know something. As a defender of all things rational, I’m curious about your take on that.

It’s interesting, because we spend many years in school with the expectation that we will learn something every day that will enlighten us or increase our base of knowledge. Somehow people think that when you leave school, you’re done—you’ve learned all you need to learn and now you just have your views. You become ossified in the same state of mind you were in when you last opened a textbook. But we should view school not as a place to learn, but as a place to learn how to learn. And then the rest of your life, you keep learning.

[quote position=”right” is_quote=”true”]Scientists are just kids. ‘I don’t know what that is. Let me go find out.’[/quote]

You can only keep learning if you recognize the things that you don’t yet know and have the curiosity to explore them. This is the same curiosity that we all wielded as children, when you turned over rocks and plucked petals off of flowers to find out what happens. But we forget about that kind of curiosity as adults. You know who hasn’t forgotten? Scientists. Scientists are just kids. “I don’t know what that is. Let me go find out.”

For Whoopi to have that candor means she’s a lifelong learner. I think we need more of that in this world. Especially when you have people in charge who think they simply know everything they need to know to make an informed decision. That’s just outright dangerous.

So, about getting ossified in the state we were in when we read our last textbook—I’m wondering if you were thinking about that when you were putting the StarTalk book together.

The StarTalk book has that textbook-y feel because modern textbooks tend to have a lot of boxes of content set apart from the running narrative of the chapter. The StarTalk book is entirely that.

Just the good stuff.

Sure, with topics inspired by guests that we’ve had on the show. So, “Could You Have Sex in Space?” Right? What would that be like? And the answer is you would need a lot of Velcro and straps and things—otherwise, what you would do would send you recoiling into the walls.

I’d like to think that the book is precisely for people who forgot what it was like to be curious, or thought they never really liked science. People come to [the book with a pop culture] scaffolding and we clad the science onto that scaffolding. And then they walk away with a deeper sense of how or why things work.

You’ve done StarTalk as a podcast and a book, you’ve obviously worked on other types of TV shows, you tweet. Why explore so many different kinds of media? And why a late-night talk show in particular?

All too often, people think of various media formats as ways to reach different demographics, or to show up in multiple places for publicity’s sake. But I think it’s all just different modes of communicating. Not everyone is as fluent in one mode or another when it comes time to learn. So, I find it a fascinating challenge to try to convey information in one medium relative to another. On Twitter, of course, it’s a single sentence, at most two sentences, conveying some morsel of knowledge or insight or wisdom about science. And that helps me to hone my communication skills, to create short sound bites that I might give on camera to the evening news.

There’s this moment in the show when you’re talking to high-wire walker Philippe Petit, who had to become an engineer to do his act, about artists doing science and scientists doing art. Do you like to talk to people about getting out of their comfort zones, or who get you out of your own comfort zone?

Just to be clear, I think the intersection between art and science is sometimes overstated. But when it’s done well, I think magic can happen on both sides. And I don’t think of comfort zones. The entire concept of a comfort zone implies that there is a conversational place you won’t go because you’re not an expert at it, or because you don’t know how to go out and learn.

For me, places where I don’t know things are my comfort zone. I am most comfortable knowing less in the company of someone who knows more. My most fun interviews are those with people who have expertise in something that I know little or nothing about. And then I’m like a kid in a candy shop: “Tell me about this! And how does that work? Tell me more! And how does that fit back into here? Or there? Or everywhere?”

[quote position=”full” is_quote=”true”]I don’t ever want to tell people what to think. I will alert people to the consequences of their thinking. Then I go home.[/quote]

I like probing the creativity of highly accomplished people who are accomplished because of that creativity. It was fun speaking to actor Jeremy Irons. I loved hearing what he did to get inside the head of a mathematician he played in a recent film called The Man Who Knew Infinity, [which allowed us to dig into the actual math done by that real person, G.H. Hardy]. We try to get a consistent level of celebrity conversation as well as actual scientific content.

Another actor you’ll be interviewing this season is Christopher Lloyd. I’d love to know if you think the kind of time travel in Lloyd’s film Back To The Future is possible—backward instead of forward, which you’ve said could happen. Are any legitimate scientists out there coming up with ways to do it?

It turns out there is a way to do it. You just have to go faster than light to make that happen. There’s a colleague of mine named J. Richard Gott, a professor of astrophysics at Princeton who wrote a book called Time Travel in Einstein’s Universe, which has a whole chapter on backwards time travel. Yes, there are serious scientists looking into this, but it requires a level of energy, a manipulation of energy that we don’t have access to.

So you’re saying, maybe one day?

I don’t think it’s going to happen any time soon.

I want to go back to your point about provoking conversation in different forms, including in a sentence or two on Twitter. Can you tell me about that tweet this summer about living in a virtual country based on reason? Obviously it generated a lot of backlash. Do you still believe it’s a good idea?

At the time, I was at a conference where many of us in the room arrived at a conclusion—wouldn’t it be cool if there were a virtual country called “Rationalia” that had only one amendment to its constitution: “There shall be no policy created unless it is based on the weight of evidence.” That’s it. That’s the constitution. So, it is a rationally conceived country.

[quote position=”right” is_quote=”true”]I don’t want to lead movements, by the way. Do I want to be a politician? Never.[/quote]

Many people misunderstood, and said, “Oh, that’s called communism.” Or “scientists would be running all over the world and you would squash religion.” And then I found myself having to create a Facebook post, where you can put more characters than a tweet, explaining what “Rationalia” would be like.

You ready? This is how it would work. You can believe in anything you want at all. But unless it is based in objective truth, you cannot make policy on it. That’s all. It’s very simple. It’s only about policy.

So, yes, if you are Christian, you cannot legislate things that have come out of your Christian traditions because many of those are not derived from objective truths. If we’re all going to live peacefully together, you can’t have one personal truth being imposed upon another person’s personal truth.

People wanted to not like the idea. They didn’t want our systems to behave rationally. Like—what? You want a country where all the systems behave irrationally? What are you even thinking? By the way, you would only become a citizen if you wanted to be. If you didn’t want to, you could just leave.

I think what happened was, the idea got people talking. And that could only be a good thing, whether or not it ever gets implemented. I don’t want to lead movements, by the way. Do I want to be a politician? Never. I don’t ever want to tell people what to think. I will alert people to the consequences of their thinking one way or another, train them to evaluate evidence and information, and then I go home and you do what you want.

I don’t lobby politicians, because they represent people who voted them into office. [It varies, but around] eighty-eight percent of Congress stands for election every two years. If I go to Congress to change things, I’ve got to do it again every two years. That’s why the education system matters. Otherwise the education system would only have politicians in it.

Look, you train an electorate that understands the meaning and value of exploration and innovation and discovery and how that can pump our economy. And what role innovation has in health, to channel wellbeing, and what it means to have a healthy, wealthy country. You train people to evaluate information that way, and then they vote for members of Congress who serve those interests. Right now they’re serving interests that are not based in any objective reality.

Any other advice about how to make the world a better (or at least more objective) place? Maybe from your StarTalk colleague Bill Nye, your dad, or someone else?

My parents were active in the Civil Rights Movement in my early, formative years. So, the idea that some of your energies should be used to help others is very deep within me. I recognize not everyone feels that way. Libertarians in particular are a community of people who favor more of a “pick yourself up by your own bootstraps” way of life.

[quote position=”right” is_quote=”true”]If I don’t learn something new, that’s a wasted day. [/quote]

I have found that because of my number of Twitter followers [as of publication, nearly 6 million] and my access to media, that people want to think of me as some kind of a pundit. But, when you listen to pundits, what they do most of the time is tell you how they want you to agree with them and all of their views. I don’t need you to agree with anything I’m saying. But if you’re voting and you are underinformed, that is not the richest democracy that we can make for ourselves.

In the sense of helping others, every day I try to do something that improves that day for at least one other person. It could be helping someone cross the street, or teaching them something. And I also do it for myself in the sense that I want to learn something every day. If I don’t learn something new, that’s a wasted day.

[youtube ratio=”0.5625″ position=”standard” ]

  • Despite all the likes, literallys and dropped g’s, English isn’t decaying before our eyes
    Photo credit: LisaStrachan/iStock via Getty Images

    Fear not: There isn’t anything that needs saving.

    As a linguistics professor, I’m often asked why English is decaying before our eyes, whether it’s “like” being used promiscuously, t’s being dropped deleteriously or “literally” being deployed nonliterally.

    While these common gripes point to eccentric speech patterns, they don’t point to grammatical annihilation. English has weathered far worse.

    Let’s start with something we can all agree on: Old English, spoken from approximately A.D. 450 to 1100, is pretty unintelligible to us today. Anyone who’s had the pleasure of reading “Beowulf” in high school knows how different English sounded back then. Word endings did a lot more grammatical work, and verbs followed more complicated patterns. Remnants of those rules fuel lingering debates today, such as when to use “whom” over “who,” and whether the past tense of “sneak” is “snuck” or “sneaked.”

    The language went on to experience centuries of tumult: Viking invasions, which introduced Old Norse influence; Anglo-Norman French rule, which shifted the language of the elite to French; and 18th-century grammarians, who dictated norms with their elocution and grammar guides.

    In that time, English has lost almost all of the more complex linguistic trappings it was born with to become the language we know and – at least, sometimes – love today. And as I explain in my new book, “Why We Talk Funny: The Real Story Behind Our Accents,” it was all thanks to the way that language naturally evolves to meet the social needs of its speakers.

    From dropping the ‘l’ to dropping the ‘g’

    The things we tend to label as “bad” or sloppy English – for instance, the “g” that gets lost from our -ing endings or the deletion of a “t” when we say a word like “innernet” – actually reflect speech habits that are centuries old.

    Take, for example, “often.” Originally spoken with the “t,” that pronunciation gradually became less favored around the 15th century, alongside the “l” in “talk” and the “k” in “know.” Meanwhile, the “s” now stuck on the back of verbs like “does” and “makes” began as a dialectal variant that only became popular in 16th-century London. It gradually replaced “th” whenever third persons were involved, as in “The lady doth protest too much.”

    While dropping the “l” in “talk” may have been initially frowned upon, today it would be strange if you pronounced the letter. And the shift makes sense: It smoothed out some linguistic awkwardness for the sake of efficiency.

    If people learned to look at language more like linguists, they might come around to seeing that there is more than one perspective on what good speech consists of.

    And yes, that absolutely is a sentence ending with a preposition – something many modern grammar guides discourage, even though the idea only took hold after 18th-century grammarian Robert Lowth intimated it was a less elegant choice based on the model of Latin.

    Though Lowth voiced no hard and fast rule against it, many a grammar maven later misconstrued his advice as an admonition. Just like that, a mere suggestion became grammatical law.

    The rise of the grammar sticklers

    Many of today’s ideas about what constitutes correct English are based on a singular – often mistaken – 19th-century view of the forces that govern our language.

    In the late 18th century, the English-speaking world began experiencing class restructuring and higher literacy rates. As greater class mobility became possible, accent differences became class markers that separated new money from old money.

    Emulation of upper-crust speech norms became popular among the nouveau riche. With literacy also on the rise, grammarians and elocutionists raced to dictate the terms of “proper” English on and off the page, which led to the rise of usage guides and dictionaries that were eager to sell a certain brand of speech.

    Another example of grammarian angst reconfiguring the view of an otherwise perfectly fine form is the droppin’ of the “g.” It became so tied to slovenly speech that it was branded with an apostrophe in the 19th century to make sure no one missed its lackadaisical and nonstandard nature.

    Up until the 19th century, however, no one seemed to care whether one pronounced it as “-in” or “-ing.”

    Evidence suggests that -ing wasn’t even heard as the correct form. Many elocution guides from the 18th century provide rhyming word pairs like “herring/heron,” “coughing/coffin” and “jerking/jerkin,” which suggest that “-in” may have been the preferred pronunciation of words ending with “-ing.” Even writer and satirist Jonathan Swift – a frequent lobbyist for “proper” English – rhymes “brewing” with “ruin” in his 1731 poem “Verses on the Death of Dr. Swift, D.S.P.D.”

    Embrace the change

    Language has always shifted and evolved. People often bristle at changes from what they’ve known to what is new. And maybe that’s because this process often begins with speakers that society usually looks less favorably on: the young, the female, the poor, the nonwhite.

    But it’s important to remember that being disliked and bad are not the same thing – that today’s speech pariahs are driven by the same linguistic and social needs as the Londoners who started going with “does” instead of “doth” or dropped the “t” in “often.”

    So if you think the speech that comes from your lips is the “correct” version, think again. Thou, like every other English speaker, art literally the product of centuries of linguistic reinvention.

    This article originally appeared on The Conversation. You can read it here.

  • Placebo effect can work as well as real medicine – but your body may need permission to use it
    Photo credit: Irina Marwan/Moment via Getty Images

    From empty pills to homeopathy to sham surgery, placebos have powerful effects on the body.

    The first time the placebo effect really got under my skin was when I read that roughly one-third of people with irritable bowel syndrome improve on placebo treatments alone. Usually this statistic is presented as a fascinating quirk of medicine. My reaction was anger.

    Humanity possesses an extremely effective treatment, with essentially zero side effects – and patients need someone else’s permission to use it.

    The placebo effect refers to the improvements in symptoms that patients experience after they’re given an inert treatment like a sugar pill. Driven by expectation, context and social cues rather than pharmacology, the placebo effect is often dismissed as all in the mind. But decades of research have shown it is anything but imaginary.

    Placebo treatments can trigger measurable changes in the brain, immune system and hormone function. In studies on pain, placebos cause the brain to release endorphins, the body’s natural opioids. In Parkinson’s disease, placebo injections increase dopamine activity in the brain. The placebo effect isn’t magic. It’s biology.

    Having spent nearly a quarter-century teaching evolutionary medicine, I’ve come to see placebos not as curiosities of clinical trials but as windows into how human biology responds to social signals. And that relationship is exactly what makes the placebo effect unsettling.

    Medicine works, even when it isn’t medicine

    The placebo effect is so reliable that researchers must account for it in nearly every clinical trial.

    When testing a new drug, scientists compare its effects to what patients experience on a placebo treatment like sugar pills, saline injections or sham surgery. If the drug doesn’t outperform the placebo, it rarely reaches the public. Placebo responses are common and powerful enough to rival active treatments.

    Even surgery isn’t immune to the placebo effect. In several well-documented studies of knee procedures, patients who received sham operations – incisions without the full surgical repair – improved almost as much as those who received the real procedure.

    The experience of going under the knife can itself be healing. Photo credit: Jacob Wackerhausen/iStock via Getty Images Plus

    Clearly something real is happening inside the body. But the strangest part of the placebo effect is not that it works. It’s what makes it work.

    The prescription of belief

    Placebo treatments tend to be more effective when delivered by credible authorities. Pills work better when prescribed by doctors wearing white coats. Expensive pills outperform cheap ones. Injections produce stronger responses than tablets.

    Some researchers have even removed the deception from placebo experiments entirely. In open-label placebo studies, patients are directly told they are receiving a placebo, and yet many still report significant improvement.

    But look more closely at how these studies are run. Patients are not simply handed a sugar pill and sent home. They receive an explanation from a clinician, in a medical setting, within a structured ritual of care: a context that may be doing much of the biological work.

    Even when the deception disappears, the social scaffolding remains. The permission to heal is still being granted by someone else.

    The placebo effect extends beyond the patient

    The placebo effect is often framed as something happening inside an individual. But it does not operate in isolation.

    Consider what happens in veterinary medicine. Dogs and cats cannot believe a treatment they’re given will work; they have no concept of receiving medication. Yet when owners and vets believe an animal is being treated, they consistently report improvements in pain and mobility that medical tests do not confirm.

    In one study of dogs with osteoarthritis, owners reported improvement roughly 57% of the time for animals receiving only a placebo.

    Is Fido feeling better, or is the placebo effect working on you? Photo credit: Chalabala/iStock via Getty Images Plus

    The animals themselves may not have improved. But the humans caring for them perceived they had. The healing signal, it turns out, travels through the humans in the room.

    When healing makes things worse

    There have been times when going to the doctor made you less likely to survive. In the 19th century, mainstream medicine was built on bloodletting, purging and doses of mercury and arsenic – treatments that killed as often as they cured.

    Homeopathy emerged in the late 18th century precisely in this context. Its founder, Samuel Hahnemann, was a physician horrified by the harm the conventional medicine of his time was causing. His highly diluted versions of contemporary remedies did nothing pharmacologically. But they also did not kill people, which put them decisively ahead of the competition.

    Homeopathic patients not only survived but also reported dramatic recoveries from chronic ailments and acute infections alike. During the cholera epidemics of the mid-1800s, patients at homeopathic hospitals had lower death rates than those receiving standard care. Why was that?

    The standard cholera treatment of the era was aggressive and exhausting; for a disease that already caused massive fluid loss, doctors often prescribed further bloodletting, along with toxic purgatives such as calomel – a form of mercury – to “flush” the system. In contrast, homeopathic care involved extreme dilutions of substances in water or alcohol, effectively providing hydration and a calm, structured environment without the physiological assault.

    Death rates were lower not because homeopathy worked but because the placebo effect – combined with not poisoning patients – was more effective than the medicine of the day.

    Healing is not free

    The body needs resources to heal from injury and disease. Activating systems such as immune responses, tissue repair and inflammation at the wrong time can be dangerous.

    A full-scale immune response is metabolically expensive, with fever increasing metabolic rate by roughly 10% per degree Celsius rise in body temperature. Triggered at the wrong time, this can deplete critical energy reserves needed for immediate survival, such as escaping a predator. Furthermore, misplaced or overzealous inflammation causes collateral damage to healthy tissues, potentially leading to chronic dysfunction.

    Some researchers have proposed that placebo responses reflect a kind of biological health governor: a system that regulates when the body invests heavily in recovery. Cues from trusted individuals may be exactly the signal the body waits for before committing resources to recovery. A caregiver’s reassurance, a physician’s authority and the rituals of medicine may tell the body that conditions are finally stable enough to devote energy to healing.

    If that interpretation is correct, the placebo effect is not a trick of the mind. It is an ancient biological system responding to social information.

    Body under stress

    The placebo effect resembles another system people struggle with today: the stress response.

    Stress evolved to keep you alive in the face of acute danger – predators, famine, immediate physical threat. These days, this useful piece of biological engineering might fire when someone hasn’t replied to your email. The system that once saved people’s lives now makes many miserable over things that would have been unimaginable to their ancestors.

    You can talk back to the stress response, consciously reappraising the threat – in other words, reframing a looming deadline not as a catastrophe but as a manageable challenge – to help quiet it. But notice what you cannot do: You cannot simply decide to activate your placebo response. You cannot will yourself to release pain-relieving endorphins by believing hard enough in a sugar pill. For that, you still need the ritual, the white coat, the authority figure. You need someone else.

    The stress response, misfiring as it is, remains yours. The placebo response has been outsourced: not because it wasn’t always social, but because even now, people still can’t seem to access it on their own.

    The uncomfortable implication

    The placebo effect is not a trick of the mind. It is a feature of human biology that people have largely surrendered to whoever performs authority most convincingly.

    If belief can activate biological healing pathways, belief can also be manipulated. Charismatic figures, elaborate medical rituals and expensive treatments may produce real improvement in symptoms even when the underlying treatment is physiologically inert. That is how wellness culture works. It leverages the same social scaffolding of care to trigger the body’s internal pharmacy, regardless of whether the treatment itself does anything.

    The placebo effect is often celebrated as proof that the mind can heal the body. But I believe that may not be its most interesting lesson. It also reveals that human physiology evolved to take its cues from other people. Your brain, immune system and pain response are not isolated machines. They are deeply intertwined with social signals, expectations and trust.

    In a world filled with doctors, advertisements, wellness influencers and elaborate medical rituals, that insight is both fascinating and profoundly maddening. People are walking around with one of the most powerful healing systems ever documented locked inside them, and they can reliably access it only when someone in a position of authority gives them permission.

    This article originally appeared on The Conversation. You can read it here.

  • Gen Z was asked if $75,000 a year counts as poor. Their answers say a lot about America right now.
    Photo credit: Canva. A woman comforts her friend.

    “$75k is a fantasy amount of money to me. I can’t even imagine what it’s like to make that much.”

    Someone on social media posed a simple question to Gen Z: Do you consider $75,000 a year to be poor? The answers that came back weren’t simple at all and, taken together, they’re a pretty honest portrait of what it costs to exist in America right now.

    The question came from u/NoHousing11 on r/GenZ, along with a screenshot of an X post that had already made the rounds. In it, an MSNBC commentator suggested that young people fresh out of college, earning $75k or $80k, would naturally be drawn to policies like student loan forgiveness and free healthcare. Another commenter fired back at the framing: “Imagine being so rich that $75k is what you think poor people earn.” Then the kicker: “$75k is a fantasy amount of money to me. I can’t even imagine what it’s like to make that much money a year.”

    That response alone split the thread.

    Some Gen Zers pushed back, pointing out that $75k in NYC with a roommate and no car is actually workable if you’re disciplined about it. “There are people making $20 or even less living in Manhattan,” one commenter noted. The city, for all its expense, at least gives you options for getting by without a car, which in a lot of American suburbs isn’t remotely possible. Another commenter made a sharper point: people who claim to live paycheck to paycheck on $100k in NYC are usually doing it because they’re trying to keep up with wealthier friends: dinners out, Broadway shows, and cabs everywhere.

    But others had a different read entirely. In some high cost-of-living areas, a single person earning less than $80k is already classified as low income by local standards. “If you live in an HCOL area and have to pay every single one of your bills,” one commenter wrote, “then yes, you might be considered struggling.”

    The geographic reality of American wages is the whole story here. “Depends on the area. NYC? Yes. Nebraska? No.” That two-sentence comment got a lot of upvotes because it’s basically correct, and also kind of depressing that it needs to be said at all. The number that represents a comfortable life in one zip code represents genuine hardship in another, and the policies, conversations, and assumptions that get built around a single national salary figure tend to miss that entirely.

    What the thread really exposed wasn’t a generation with distorted expectations. It was a country where “how much is enough” doesn’t have a single answer anymore.
