Your Undivided Attention

By: Tristan Harris and Aza Raskin, The Center for Humane Technology
  • Summary

  • In our podcast, Your Undivided Attention, co-hosts Tristan Harris, Aza Raskin and Daniel Barcay explore the unprecedented power of emerging technologies: how they fit into our lives, and how they fit into a humane future. Join us every other Thursday as we confront challenges and explore solutions with a wide range of thought leaders and change-makers — like Audrey Tang on digital democracy, neurotechnology with Nita Farahany, getting beyond dystopia with Yuval Noah Harari, and Esther Perel on Artificial Intimacy: the other AI. Your Undivided Attention is produced by Executive Editor Sasha Fegan and Senior Producer Julia Scott. Our Researcher/Producer is Joshua Lash. We are a top tech podcast worldwide with more than 20 million downloads and a member of the TED Audio Collective.
    © 2019-2024 Center for Humane Technology
Episodes
  • The Tech-God Complex: Why We Need to be Skeptics
    Nov 21 2024

    Silicon Valley's interest in AI is driven by more than just profit and innovation. There’s an unmistakable mystical quality to it as well. In this episode, Daniel and Aza sit down with humanist chaplain Greg Epstein to explore the fascinating parallels between technology and religion. From AI being treated as a godlike force to tech leaders' promises of digital salvation, religious thinking is shaping the future of technology and humanity. Epstein breaks down why he believes technology has become our era's most influential religion and what we can learn from these parallels to better understand where we're heading.

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X.

    If you like the show and want to support CHT's mission, please consider donating to the organization this giving season: https://www.humanetech.com/donate. Any amount helps support our goal to bring about a more humane future.

    RECOMMENDED MEDIA

    “Tech Agnostic” by Greg Epstein

    Further reading on Avi Schiffmann’s “Friend” AI necklace

    Further reading on Blake Lemoine and LaMDA

    Blake Lemoine’s conversation with Greg at MIT

    Further reading on the Sewell Setzer case

    Further reading on Terminal of Truths

    Further reading on Ray Kurzweil’s attempt to create a digital recreation of his dad with AI

    The Drama of the Gifted Child by Alice Miller

    RECOMMENDED YUA EPISODES

    ’A Turning Point in History’: Yuval Noah Harari on AI’s Cultural Takeover

    How to Think About AI Consciousness with Anil Seth

    Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei

    How To Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan

    47 mins
  • What Can We Do About Abusive Chatbots? With Meetali Jain and Camille Carlton
    Nov 7 2024

    CW: This episode features discussion of suicide and sexual abuse.

    In the last episode, we had the journalist Laurie Segall on to talk about the tragic story of Sewell Setzer, a 14-year-old boy who took his own life after months of abuse and manipulation by an AI companion from the company Character.ai. The question now is: what's next?

    Megan Garcia, Sewell's mother, has filed a major new lawsuit against Character.ai in Florida, one that could force the company, and potentially the entire AI industry, to change its harmful business practices. So today on the show, we have Meetali Jain, director of the Tech Justice Law Project and one of the lead lawyers in Megan's case against Character.ai. Meetali breaks down the details of the case, the complex legal questions under consideration, and how this could be the first step toward systemic change. Also joining is Camille Carlton, CHT’s Policy Director.

    RECOMMENDED MEDIA

    Further reading on Sewell’s story

    Laurie Segall’s interview with Megan Garcia

    The full complaint filed by Megan against Character.AI

    Further reading on suicide bots

    Further reading on Noam Shazeer and Daniel De Freitas’ relationship with Google

    The CHT Framework for Incentivizing Responsible Artificial Intelligence Development and Use

    Organizations mentioned:

    The Tech Justice Law Project

    The Social Media Victims Law Center

    Mothers Against Media Addiction

    Parents SOS

    Parents Together

    Common Sense Media

    RECOMMENDED YUA EPISODES

    When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer

    Jonathan Haidt On How to Solve the Teen Mental Health Crisis

    AI Is Moving Fast. We Need Laws that Will Too.

    Corrections:

    Meetali referred to certain chatbot apps as banning users under 18; however, the settings for the major app stores ban users under 17, not under 18.

    Meetali referred to Section 230 as providing “full scope immunity” to internet companies; however, Congress has since passed laws that carve out exceptions to that immunity for criminal acts such as sex trafficking and intellectual property theft.

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X.

    49 mins
  • When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer
    Oct 24 2024

    Content Warning: This episode contains references to suicide, self-harm, and sexual abuse.

    Megan Garcia lost her son Sewell to suicide after he was abused and manipulated by AI chatbots for months. Now, she’s suing the company that made those chatbots. On today’s episode of Your Undivided Attention, Aza sits down with journalist Laurie Segall, who's been following this case for months. Plus, Laurie’s full interview with Megan on her new show, Dear Tomorrow.

    Aza and Laurie discuss the profound implications of Sewell’s story for the rollout of AI. Social media began the race to the bottom of the brain stem and left our society addicted, distracted, and polarized. Generative AI is set to supercharge that race, taking advantage of the human need for intimacy and connection amid a widespread loneliness epidemic. Unless we put guardrails on this technology now, Sewell’s story may be a tragic sign of things to come. But it also presents an opportunity to prevent further harms moving forward.

    If you or someone you know is struggling with mental health, you can reach out to the 988 Suicide and Crisis Lifeline by calling or texting 988; this connects you to trained crisis counselors 24/7 who can provide support and referrals to further assistance.

    Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_

    RECOMMENDED MEDIA

    The first episode of Dear Tomorrow, from Mostly Human Media

    The CHT Framework for Incentivizing Responsible AI Development

    Further reading on Sewell’s case

    Character.ai’s “About Us” page

    Further reading on the addictive properties of AI

    RECOMMENDED YUA EPISODES

    AI Is Moving Fast. We Need Laws that Will Too.

    This Moment in AI: How We Got Here and Where We’re Going

    Jonathan Haidt On How to Solve the Teen Mental Health Crisis

    The AI Dilemma

    49 mins

What listeners say about Your Undivided Attention

Average customer ratings (3 ratings)
  • Overall: 5 out of 5 stars
  • Performance: 5 out of 5 stars
  • Story: 5 out of 5 stars

Reviews
  • Overall: 5 out of 5 stars
  • Performance: 5 out of 5 stars
  • Story: 5 out of 5 stars

Tackling the worst problem of our time.

Tristan and Aza delve deeper into the attention economy, discussing its risks with experts and providing solutions.
Enjoyable and deep. Really loved it.
