Adam Becker gets Palestinians and Israelis to talk Head On: Part 1

Oh man, this one was fun. Adam is an old friend – we were in an accelerator for our two startups back in 2018 – and he's been doing amazing things ever since. In this interview he and I talk about his company HeadOn, which is a platform designed to get people to have difficult, but critical, conversations. As someone who works on important problems, and as someone born and raised in Israel, of course he wanted to start with the easiest application: Israelis and Palestinians.
Our conversation dovetails with my recent series on creating American relationships at scale, but adds many different and incredibly rich dimensions. One part that I keep coming back to is his stance that a smooth conversation is not necessarily a good conversation, if there's real tension underneath that isn't getting addressed – and that verbal violence is always better than physical violence. So if he can get people to argue productively – and sometimes that means really leaning into the conflict – that can still be a huge win.
But there's a ton more as well.
If you're reading this on your phone, you can either click here for a link, or come to The Plan website and use the embedded player.
And the (buggy, AI-produced) transcript is below. Enjoy!
Alex Loewi (00:00)
Before... Before we go any further, let me just...
I feel like this is a conversation that could go a million different ways, and I'll be happy with, like, any of them. So I'm just going to do the introduction so we have it, and then, I don't know, we can figure it out live, and I'm perfectly happy with that.
Adam (00:22)
Let's do it.
Alex Loewi (00:26)
Welcome back to The Plan, the show about saving the world, because we have to, and rock and roll, because I want to. Today we're talking about saving the world with my close friend Adam Boaz Becker. Adam has founded companies, communities, and projects around everything from machine learning to political fundraising, but his most recent project is called Head On, and I'm very excited to talk about it today. Adam, thanks so much for being here.
Adam (00:48)
Alex, thanks for inviting me.
Alex Loewi (00:51)
Okay. Intro out of the way. Now what are we doing? I want to know about Head On. I want to talk about it more because it's been feeding my own thinking about, you know, sort of WPA-scale, not like tree-planting, but relationship-building kinds of efforts in the United States right now. So that's where my mind is at, but you've been going through a lot of interesting stuff recently too. So if anything is top of mind for you, start there.
Adam (00:53)
Boom, did it, nice.
You call that The Plan?
Alex Loewi (01:19)
That's right.
Adam (01:21)
We changed it from "most important podcast"?
Alex Loewi (01:25)
That's a good question. It'll probably be something different next week, but I'll tell you the quick version, 'cause maybe you have good thoughts about this. Once upon a time, this was the story about me trying to find and solve the most important problem that I could find. I looked at it that way, I thought about it that way. "Most important podcast" was just, like, a riff on that. And then I finally realized nobody likes that framing. Everybody hated that story. Nobody got it, 'cause they were like,
Adam (01:27)
Okay, I got it.
Alex Loewi (01:55)
But who are you to say what the most important problem is? I already know what the most important problem is. I chose it for myself. It's different from yours. What are you talking about?
Adam (02:03)
Hmm.
Yeah.
Alex Loewi (02:05)
So I stopped trying to use that. The Plan came because, as soon as the election happened, everybody was like, what do we do? And I was like, well, here's a plan. That's what people are asking for. You want instructions? Here they are.
Adam (02:18)
Hmm.
Got it, got it. I like it. I like it.
Alex Loewi (02:26)
Okay, cool.
Adam (02:31)
Yeah, I think it connects nicely with your... you do have an opinion about what we should do and that is rare.
Alex Loewi (02:36)
I do. One of my qualities.
Yeah. There you go.
Adam (02:42)
You know what I mean? Like, most people don't have opinions. Like, look, they might have opinions in, like, a local sense, but you have an opinion in a global sense.
Alex Loewi (02:50)
Great, yes.
Adam (02:52)
And I think that's helpful at times like this, especially, right? Everybody's just like trying to grab onto anything to provide some structure and some optimism because things are looking pretty bad.
Alex Loewi (03:06)
Yeah, they are. I thought that would be helpful. I thought it was like, you know, we're all drowning, here's a raft.
I also realized that that was, like, Elizabeth Warren's tagline. You remember this? "She's got a plan for that." And I love Elizabeth Warren, but she's probably the last person on earth you want to take, like, branding advice from.
Adam (03:22)
Right.
Yeah.
Alex Loewi (03:33)
So now I'm just afraid that it seems too bloodless, too bureaucratic, too administrative. Like nobody is excited about planning. People are excited about heroism.
Adam (03:39)
Hmm.
Alex Loewi (03:46)
And I'm, you know, trying to go in a direction that like raises the heart rate a little bit more.
Adam (03:51)
Well, think about, like... so Trump has, what did he say? He has the contours of the beginnings of a plan? Remember, somebody asked him that; I don't remember the context. Yeah, it's like, yeah, but Trump, do you have a plan for X? And he's like, we were starting to put together the framework for the plan. He doesn't have a plan. Nobody's interested, but he does have a heroic photo of him raising his fist after the assassination attempt.
Alex Loewi (03:59)
Great, yep, uh-huh. That sounds familiar, yeah, right.
There you go.
There you go. So I'm trying to walk that tightrope of like actually having a plan, but not making the story be about the plan.
Adam (04:26)
Yeah, it's also the case that... I wonder if people are very policy-driven. I never got the impression that, like, the vast majority of people are interested in policy.
Alex Loewi (04:36)
That's right. Also, no. This is, this is me.
Adam (04:38)
Yeah. Which is something that we're seeing...
Yeah, yeah, yeah. I wonder whether meditation can come in handy here. By the way, I lit this candle. I think this candle is going to explode. I got to...
Alex Loewi (04:48)
That's a candle.
Yeah, check on the candle.
Adam (04:55)
Yeah, I think the glass was about to break, or might still break. Talk about harm reduction. I had to turn this off. I think it's... there's a long history.
Yeah, the whole glass broke. All right, good thing that I stopped it when I did. Let's go.
Alex Loewi (05:16)
Okay, cool. You just have
like a pile of wax melting into your electrical sockets right now.
Adam (05:21)
Yeah, I think that's it. I think we'll know it's bad when the internet crashes. You know what I mean? I do wonder whether meditation can help here, because, like, one thing that I'm trying to do now is... so, I'm not on social media much, but the one thing that is
Alex Loewi (05:22)
Okay, cool.
Perfect. It'll be a really helpful signal.
Adam (05:44)
addicting for me is YouTube shorts. And I hate shorts. I hate it because it's not as good as TikTok, which is why it's a little bit worse. 'Cause TikTok, you know, you could just avoid it. It's so bad for you, it's like meth or whatever. You're like, okay, I'm not going to do that. So then, you know, at night or whatever, you just, like, start to browse: let's see what's happening on YouTube. And I don't want to see the shorts. I just want to see, like,
Alex Loewi (05:48)
They're terrible. Whoever. Yeah.
Uh-huh. Uh-huh.
Right.
Right. Yeah.
Adam (06:13)
actual, you know, long-form content on YouTube, but they just, they smuggle in the shorts constantly. And then you catch yourself with them, like, I don't want this. Okay, just one more. And then just one more. And then a few minutes later you're like, what did I just waste my time on? And I wonder what it takes to just say, you know what? I'm not going to consume this garbage. This is whatever, this is, like...
Alex Loewi (06:17)
Yeah.
Huh.
Yeah.
Adam (06:41)
This is trash, right? This is, this is fast food. I'll just step away and really just consume healthier content. Even on the spot, right now, when I don't want to, when I feel like I should: I just don't want to give in to this. No, no, no. Right now you've got to step out. And I wonder whether meditation can help with just training that muscle.
Alex Loewi (06:42)
Right. Yeah.
Yeah, no, I think that's, like, the whole appeal of it to me: that very much, at any moment, being intentional, having enough presence of mind to be intentional about what you choose or not.
Adam (07:14)
Yeah. Yeah.
Alex Loewi (07:14)
So I... I am, I don't know, somebody would probably disagree. I don't like to think I'm, like, an inherently vindictive person, but I wish, like, a fast and nasty death on whoever invented infinite scroll. I swear to God, like, with the shorts and the endless tweets and the whole thing, like...
Adam (07:31)
Yeah. Yeah.
Yeah, I think one thing that I'm trying to figure out is: given these components, can you stitch them together in a way that's actually healthy and enriching? And I think we still need to figure that out. Who knows, right? It's not clear if it's even possible, but I do wonder whether the fact of AI, and the fact that now we'll be reacting and building on top of totally novel technology, can open new doors,
Alex Loewi (07:46)
Huh.
Mm-hmm.
Adam (08:09)
and that some of those doors lead to good places. That, I think, is left to figure out. Do you have, like, a suspicion? Do you have some, like, intuition about the net positive and negative that it would do?
Alex Loewi (08:20)
Personally.
Yeah, God.
I mean, it's similar to the thing you were just talking about. I don't think... AI in particular is, like, minorly tied to my identity, both because I have, like, degrees in this stuff, but also in the sense of, like, so much trash is being done with it. Whenever I look at it, I'm just like, I choose not to care, because that's not what I'm interested in right now. But, you know, I mean, to the extent that we all use our identities to be able to make these kinds of decisions, it's the kind of thing you were just talking about. Like, I choose my, like,
story about myself right now is that I ignore AI content. I just don't think about it, because I've got more important things to do. But if you were to stick me in a room and say, like, you must use this, you must use YouTube shorts, you must use infinite scroll, and you must use large language models, like, what's the best thing you could possibly do for the world? Spectacular outcomes, like huge, huge projects. I have no doubt. I just don't put any thought into it.
Adam (09:20)
Yeah. And do you think that it's... okay, well, that's the potential. So you think the potential itself is massive. In actuality, what do you think is the likelihood that we realize that potential, more so than we realize the destructive potential?
Alex Loewi (09:25)
Yeah, right. I think so. Yeah.
I think the way I'm thinking about it at the moment is that it's...
It's zero-sum in terms of time spent. By which I mean, if you spend an hour on YouTube, then you only have so much time in your day. And I'm afraid that, whether it's our attention span that's zero-sum, or whether just, like, the size of the market is zero-sum, something like this, I'm afraid that...
Like, slow and interesting will never be able to innately compete with fast and cheap in a context that's, like, optimized for fast and cheap.
Adam (10:24)
Hmm. But is that context optimized for fast and cheap because of the very specific monetization models that we have picked thus far?
Alex Loewi (10:32)
This is, yeah,
that's the kind of thing. That's what I mean by the... if I even use the word, but yeah, that's the space that I'm talking about. Like, for the monetization models, YouTube is doing an unbelievable job, and it makes me so angry to literally be on the site. But something has got to...
Adam (10:46)
Yeah.
Alex Loewi (10:53)
Something structural, like monetization models, has got to change for there just to be that space, I think. 'Cause otherwise it's just too powerful. Even for me, it's too powerful.
Adam (11:05)
And one thing that I'm seeing, and still just trying to wrap my head around, is the sheer number of hours that people in our community are spending having these conversations. Like, we finally have some data around this, and we see people spending sometimes, like, 15 hours a week, and sometimes more than 10 hours a day. How? Where do they get the time for this? I don't understand.
Alex Loewi (11:20)
Yeah, yeah.
Mm-hmm.
Adam (11:34)
But I get it. The other day we didn't have any conversations going on, that one day, and I just kept checking it, and I was missing it. I was like, man, something was missing from my life. And those are conversations that I actually feel are productive and healthy. And can you do it at scale? Right now we have hundreds of people. Can you do it at thousands? Tens of thousands? Millions, billions? Who knows, right? Maybe we hit some phase transition and, actually, at that point the whole thing breaks down.
Alex Loewi (11:51)
Yeah, right.
Adam (12:04)
Totally possible. Could it be that it's not monetarily and financially viable at scale? That's totally possible. Maybe the AI that we're running right now is the only thing that can really run it, and there's no monetization model that can actually support such a thing. All totally possible. But there's something here already. When you see people that are spending, you know, dozens of hours a month in actual conversation with people who they disagree with,
Alex Loewi (12:31)
Yep.
Yep.
Adam (12:35)
I'm starting to be hopeful.
Alex Loewi (12:37)
Cool. That's very cool. Well, let's back up, because I know what you're talking about, but: you're talking about Head On, your most recent project, which is basically a platform for getting people together to have, in video... so face-to-face, no, not in person, but still face-to-face conversations about hard things. And you're starting with Israel-Palestine very specifically, 'cause you have lots of personal reasons to do that, obviously lots of global reasons as well, but
you're starting with that.
Yeah, set it up.
Adam (13:12)
Yeah. Yeah. Yeah. I think that that's it. It's video-optional. We've seen from a lot of Palestinians in particular that maybe they're a little bit camera-shy right now because of, you know, security risks that they're incurring. And so some people choose to have video off. That's, that's fine too. But I'm very skeptical about text. I think text has a radicalizing effect, and
Alex Loewi (13:24)
huh. Yep.
Yep.
Adam (13:42)
But having conversations with another person, where you're speaking and they're speaking, I think it's, in many ways, I think it's going to be the only thing that sets us apart from just interacting with AI. One of the things that's very interesting to me is how, and whether, AI can help us to be more human, and to connect more humanly and more deeply and more profoundly with one another. And it's a bit ironic, right? Artificial intelligence is going to help us to be much more naturally human.
And to what extent can we do that? And I think that there's massive potential in figuring that out, in cracking this and actually doing social properly. So, like, the long-term vision is, I think, to do what the social media companies were at least saying that they wanted to do, which is to deeply connect us. Not just to communicate, but to deeply connect. And they, I think, have failed to do that miserably, right? With
Alex Loewi (14:35)
That was the premise.
Yeah, yeah, yeah.
Adam (14:41)
destructive consequences, catastrophic consequences. But I think that there is still something left of that dream. And I don't think that dream is out of reach. And I want to take a totally fresh perspective on it, given new technology. And actually, given the fact of conflict... I don't think conflict is bad. There's nothing wrong with conflict. Conflict gives us good energy
to actually be able to dig in and have more and more conversations with people. That's fine. It's how you handle conflict that might be good or bad. So the question that we began to ponder is: to what extent can you leverage the tensions that exist in society, the fissures that are already here, right? The political polarization that this society, that many societies, are totally submerged in. To what extent can you leverage those to allow people to connect more deeply?
Alex Loewi (15:22)
Mm-hmm. Yep.
Adam (15:38)
And I think the key here might be that an AI could help with that. An AI could also make things much worse, but we're trying to figure out to what extent an AI can help.
Alex Loewi (15:49)
Awesome. That's super cool. So you've got, and what we were talking about just now was, you've got conversations going on in these rooms about lots of different subjects, some of which are AI and its societal implications.
Adam (16:04)
Yeah, exactly. Well, these topics are picked by an AI as well. But humans can also submit. So we have hosts that choose to discuss a very specific topic, say. They're going on the platform, they're scheduling a room, and then they're like, all right, Thursday... So for example, Tuesdays at 8pm we have conversations about socialism. This one person came up with a topic, and he wants to discuss socialism. And now the AI has been listening in to all of those Tuesday-at-8pm conversations,
Alex Loewi (16:11)
huh. Yup.
Adam (16:33)
and it's starting to tease out a bunch of other ideas for future conversations that are either follow-ups to previous things, or maybe open threads that hadn't yet been closed. So that's kind of how the AI is engaging. And yeah, so it's a combination of humans and AI. And some of those conversations are about AI's role in society, and the future of AI, and whether or not there's an existential crisis for us,
Alex Loewi (17:01)
Very cool.
Adam (17:02)
and that sort of thing. So yeah, ultimately the vision here is, you go on this... by the way, there's a... do you hear that sound? I don't know how to turn it off.
Alex Loewi (17:11)
I just
heard a ding. I don't know what it indicates.
Adam (17:13)
I don't know what it is.
Maybe I'll.
Now that's a product: what is doing all the dings and the sounds on your computer? I don't know. You know what I mean? So the, the vision here is that at some point, in a year, two years, whatever, you go on this site and you just see millions of conversations, tens of millions of people engaging in discussion about any topic under the sun. Ideally talking about things with people they disagree with, ideally breaking these echo chambers,
Alex Loewi (17:29)
That'd be great. Yeah, that'd be nice.
Adam (17:54)
but nevertheless doing it in a way that is engaging. And I think that the AI here is... whatever that scrolling was that you resented: if the algorithms could actually get you to talk to people who you disagree with, and to talk for longer, in a way that is even more profound and personally fulfilling? I'll say that might be a good use of AI.
Alex Loewi (18:04)
Yep, right.
Yes. No, that would be awesome. That would be pointing me towards... how would, how would I even conceptualize this? I have thought about everything in terms of, like, information for a very long time. And I feel like... you know, the information that I'm getting from YouTube right now? Trivial, frivolous, basically all of it. Not, not all of it, but when it isn't, it's almost always by accident. Like, I come across something that's profound
only because it was, like, sandwiched in between the skateboarding bulldogs, who I love. Like, I'm not even mad at the skateboarding bulldogs, but that's, that's the information that I really want, that you're talking about. Like, how do...
Step back for a moment. I've gone in entirely the wrong direction. The more I think about this, the more I like your framing of, like, we're not trying to get together to have a peaceful sit-down. That's just not even the premise, 'cause basically nobody wants to do that. I was about to say I want to learn from the people who, like, are causing the most damage. That's a hundred percent not the case. I want to flip and/or crush the people who I see as causing
permanent, generational damage to this country and the world. So I have absolutely no interest in going in hoping to make friends. But your point, it seems like, is: everybody is there also. Let's not ignore it. Let's actually work with it. We're going to harness that energy. We're going to say, you want to fight? Great, here you go. But that's actually where the best hope for progress is right now.
Adam (19:56)
Yeah. Yeah. No, I think, I think that's right. It's one thing that I used to say, I don't say it quite as often now because there's more nuance to it, but: the one thing that unites Israelis and Palestinians is that they both hate peace activists. You say the "P" word... I never use the word "peace." You know what I mean? Like, our company never uses the word "peace." That will trigger everybody in that region. You know, just imagine right now for us to, you know...
Alex Loewi (20:12)
Perfect.
Uh-huh.
Uh-huh.
Adam (20:26)
I'll tell you, Alex: do you want to go and talk to a bunch of, like, Trump supporters? You're like, I don't need to learn. There's nothing I want to talk to Trump supporters about at this point. They're way beyond the pale, you know? And I'm sure that Trump supporters would say the exact same thing about "libtards," you know? And just imagine, then, what it must be like for Israelis and Palestinians. You know, it's like: you want me to sit down and talk to people that are trying to kill me? You're crazy. I need to ethnically cleanse them, or I need to, you know, to kill them. That kind of, like...
Alex Loewi (20:34)
Exactly.
Adam (20:54)
And certainly I don't want to hear anything from them or about them. Just, like, get me as far away as possible. That's the mentality. I say: okay, cool. That's what we're working with. Let's start there. Not, you know, "I want to really, sympathetically, try to understand the other side." No one wants to understand the other side. We need to destroy the other side, right? So now...
Alex Loewi (21:02)
Yep. Yep.
Yeah. Nice.
Nobody's there. Yeah, right. Yep.
Adam (21:20)
Are there some people that want to understand the other side? Yes, of course. Then they are excellent early adopters. They should come onto the platform. That's fine. But we certainly can't brand ourselves, and position ourselves, and build a product that caters to that minuscule number of people. They're not there. Most people are not there. And so we have to build towards the conflict, lean into the conflict. And sometimes, what the AI does is, it seems to be...
Alex Loewi (21:38)
Yep. Yep.
Adam (21:50)
I forget the word. I'm going to have to repeat this one. Sometimes it seems like the AI is even just, like, stoking the flames. That's okay too. That's okay too. We speak to a lot of, like, mediators and facilitators, and they say, especially with Israelis and Palestinians, there's a lot under the surface. And when you come in, you know, on the surface they're having a good time and they're enjoying the discussion or whatever. You need to excavate, and then you need to actually find the sources of disagreement. And that's how you make progress.
Alex Loewi (21:59)
yeah, well, okay.
⁓ interesting.
Interesting. It's like saying... it's like: we know. It doesn't matter if you're having a nice conversation now, we know you disagree. And what we're trying to do is resolve that, not make you get along right now.
Adam (22:32)
Exactly, exactly. Actually trying to make progress. One thing that I feel... and this is, like, as an entrepreneur, I keep feeling this tension, but I feel like at this point I've been decent at resisting it. The tension is: everybody says just expand to all the other topics. We do, right now, maybe 80% Israel-Palestine, 20% American politics. And everybody says you should just do all the topics right now.
I don't think that's right. I think we really need to understand one market, one group, one conflict really, really well. Do it excellently, and then we can open it up. And the nuances of Israel-Palestine, I think, are quite revealing about lots of different conflicts. And so yeah, we're deciding to start there. We're actually going in a couple of weeks to record a lot of content, to do what we were just saying should be done online, offline.
And we're going to the West Bank, and we're going to lots of other places in Israel and East Jerusalem. And we hope to survive this. Yeah.
Alex Loewi (23:37)
Yeah.