Adam Becker gets Palestinians and Israelis to talk HeadOn: Part 2

Part 1 of our conversation is here.

Part 2 is directly linked here, or can be found in the header of this post on the website.

Here's the next piece of my conversation with Adam, about intentional relationship building around hard conversations – specifically, through the lens of his videochat platform HeadOn, and how he's using it to facilitate conversations around Israel and Palestine. In this installment we get into things like having Jerusalem Days on Wednesdays, how the most productive role of AI is definitely not as an arbiter of truth, and more.

It's only 20 minutes, even though we talked for almost two hours. I want these to be bite-sized, and I personally feel like 40 minutes is a lot of podcast to listen to. But if you feel like you're eager for more and this is just dragging things out, let me know.

Lightly Edited Transcript

Alex Loewi (00:00)
I'm with you. That sounds incredibly cool. I feel like, just in terms of product development, it makes sense to me to really clear out this one problem before you move on to another. It also makes sense from the perspective of, well, if you can solve this one, you can solve practically anything. But I'm also...

champing at the bit to try to get this as big as possible, as fast as possible. So I'm curious, what kinds of things are you coming across? What issues are you having with scaling? What are you learning in this process that makes you glad that you didn't just release it to the world all at once?

Adam (00:43)
A billion, a billion things. And it's very interesting, because virtually every VC, most startups, almost everybody tells me to focus on growth. You're doing consumer, you're doing social, you should focus on growth. And I resist it. We'll see whether that's to my detriment. But I've been resisting it, and I feel increasingly confident that this is the right decision to make.

Alex Loewi (01:00)
Yeah.

Awesome.

Yep.

Adam (01:11)
And I just feel like I have to trust my gut here. I'll tell you another thing that we did the other day: even our homepage was kind of conflict-agnostic. Now we've made it very West Bank oriented. It looks like it's the West Bank. I just need to get everybody who's interested in this, and we'll grow within that space. You know what I mean? And I tell the team: we're not in growth, we're in depth.

Alex Loewi (01:27)
Yeah.

Cool. Yep.

Adam (01:40)
And we really are trying to solve the problem. If I got a million people tomorrow and then 10 million people the next day, it would mean absolutely nothing to me about whether or not I'm closer to solving the problem, to getting people to have conversations, to making sure those conversations are productive and fulfilling, and that they keep coming back to those conversations. It tells me absolutely nothing. What it does tell me is that I can fundraise and get a lot of money and put my name on the... it's all vanity. It's all vanity.

Alex Loewi (02:06)
Yeah, yeah.

Adam (02:10)
What would it have looked like had the social media companies started by saying, I want to make sure that we're doing the right thing and we're building it properly? Would they have been as big? I don't know. I don't care. I just want to make sure that they would have fulfilled the mission they promised us they're oriented towards. You know what I mean? So, along the same lines: I was very resistant to not requiring everybody to always have their camera on.

Alex Loewi (02:26)
Yeah.

Adam (02:37)
I just wanted cameras on. I thought this makes for the best conversations. I was talked out of that by users, especially by Palestinians. They're like, this is not going to happen. Absolutely not going to happen. All right, I guess I've got to... If I'd gone wide instead of going deep, I would have kept video.

Alex Loewi (02:37)
Uh-huh.

Mm-hmm. Yep.

Yep. Got it. Cool. Yeah.

Adam (02:58)
You know what I mean? One of the biggest challenges we have right now is that we don't jump around between lots of topics. We keep developing an existing topic, but it forks into lots of different threads. So for example, we talk about Jerusalem, right? Right now, Wednesdays are Jerusalem Day. Why? Because we had one conversation. It was explosive, absolutely explosive. And we just made a mess, and everybody was there: Muslims and Jews and Arabs and Israelis and everybody.

Alex Loewi (03:20)
Yeah. Yeah.

Adam (03:27)
and Americans, and honestly, not enough Christians. I wonder if the Christian element would have had some interesting impact on that one. But it's like we touched on: I think one of the reasons that conversations tend to be difficult is you don't have the space to actually make progress. We live in totally different realities. And when those realities clash, we can't zoom in on every possible area of clash

Alex Loewi (03:33)
Huh. Huh.

Adam (03:54)
and iron it out, because we don't have time. We've got 30 minutes right now, and we just happen to be erupting into a conversation, and that's it. So this is what happens. But what if you know that you're coming back the next day, or the day after, or the week after, and the AI is understanding every possible thread of the conversation, and we're going to come back to it, don't worry. And now we have a bank of all the different threads that have remained open, that we've never actually gotten any type of resolution on. Just something like this.
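An aside for technically minded readers: the "bank of open threads" Adam describes is easy to picture in code. Below is a minimal, speculative sketch in Python. HeadOn's actual implementation isn't public, so every name here is a hypothetical stand-in.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Thread:
    """One unresolved line of argument forked off a larger topic."""
    topic: str      # parent topic, e.g. "Jerusalem"
    question: str   # the specific point of clash left open
    opened_on: date
    resolved: bool = False

@dataclass
class ThreadBank:
    """Registry of every thread a recurring group has left open."""
    threads: list[Thread] = field(default_factory=list)

    def open(self, topic: str, question: str) -> Thread:
        thread = Thread(topic, question, date.today())
        self.threads.append(thread)
        return thread

    def agenda(self, topic: str) -> list[Thread]:
        """Oldest unresolved threads first: what to pick back up next session."""
        pending = [t for t in self.threads if t.topic == topic and not t.resolved]
        return sorted(pending, key=lambda t: t.opened_on)

# A Jerusalem Day session logs the clashes it didn't have time to iron out,
# and next Wednesday's session starts from this list instead of from zero.
bank = ThreadBank()
bank.open("Jerusalem", "What would shared stewardship of holy sites look like?")
bank.open("Jerusalem", "Which historical claims do both sides actually accept?")
for t in bank.agenda("Jerusalem"):
    print(f"[open since {t.opened_on}] {t.question}")
```

The data structure is trivial on purpose; the point is that persistence is what turns a one-off eruption into months of conversation.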

Adam (04:23)
It clarifies and cleans up and makes a conversation about something like Jerusalem so much more fun, because now you have months and months of conversations to have. And the thing is, I think a lot of the progress is made not even in the semantic arguing over the facts. People come because they think that's what's driving the conflict. It's not that.

Alex Loewi (04:33)
Right. Yeah, right.

Mm-hmm. Yeah.

Adam (04:51)
Just the fact that now I can meet them on a daily basis, on a weekly basis. Now I start to build familiarity: yeah, of course this person's going to say this; now I'm going to say this. Now you're starting to build a relationship with people you would have disagreed with, and that you do disagree with, and that changes everything.

Alex Loewi (05:07)
Mm-hmm. That's extremely cool. Yeah, like even still in the conflict mode, you can be like, God, I was trying to get that guy yesterday, and I've got such a good response now, and I've got to come back and give it to him tomorrow. I can see the appeal. That's super interesting.

Adam (05:19)
Yeah.

Dude, this guy came up to me the other day, and I think he literally took time to really reflect on my position on capitalism. And this is on the socialism calls that we have. He said, Adam, I think that what you like about capitalism is actually just liberalism. I don't think that you like capitalism. He took the time to think about it and he came back, and I thought it was one of the nicest things anybody's ever done.

Alex Loewi (05:53)
Sure, yep.

Adam (05:54)
I think I might have disagreed, I liked, I didn't like, whatever. It was just so thoughtful, you know what I mean?

Alex Loewi (05:59)
Yeah.

Yeah, sure. Yeah. Who ever pays that much attention to the things you're saying? Like, that's really cool.

Adam (06:07)
Yeah, and another thing that's fascinating about the socialism calls: with Israel-Palestine, you see the Israelis and Palestinians on opposite sides. The Jews and the Muslims tend to be on opposite sides. But now we move them to a slightly different conflict, which is about capitalism versus communism or socialism, and you see the lines being redrawn. Now I'm arguing alongside my Palestinian brother.

Alex Loewi (06:25)
Interesting. Super interesting.

Super interesting.

Adam (06:33)
And another Palestinian is arguing alongside another Jew. And now we're talking about economic policy.

Alex Loewi (06:37)
Yep.

Adam (06:42)
And the next day it's combustible again. But the day after, now we talk about sports, and so on. So we're expanding it very, very slowly, just to actually see what emergent behaviors we get as soon as you introduce something slightly new.

Alex Loewi (06:42)
Very, very cool. Yeah, right.

Godspeed. That all makes sense to me. Like... my mind is blown whenever I'm talking about this. It's just a cool subject.

Adam (07:07)
I have a lot more. We could go in any direction, or we can just reel it back in and discuss some other philosophical thing, whatever you think would be interesting.

Alex Loewi (07:14)
Yeah,

Let me think,

Adam (07:15)
I also don't want to be selling. You know what I mean? Yeah.

Alex Loewi (07:18)
No,

Adam (07:21)
Yeah, I think another interesting thing would be to talk about the risks of things like this, just so that maybe I could address that. I don't know if I've sounded naive about the powers of AI for doing this, because I'm mostly being conjectural. You know what I mean?

Alex Loewi (07:28)
Uh-huh. Yep.

Adam (07:46)
But I'm not sure how. That's really fun, though. I do think that you might actually get a kick out of talking about the role of truth in this, because

Alex Loewi (07:54)
Uh-huh. Yep. Go for it.

Adam (07:59)
One of the things that we've learned pretty quickly and pretty early on is that people want the AI to tell them the truth. That is, the AI should be the one that steps in and says who's right and who's wrong. How come the AI didn't step in and say that what that person was saying was not factual, but what I was saying is factual? Everybody wants the AI to do the heavy lifting for them. And I think especially younger people expect that the AI will come in and do the epistemic work.

Alex Loewi (08:29)
Interesting.

Adam (08:30)
And I feel very, very reluctant to do that. This, in my mind, is as bright a line as I could draw, right? I really don't want to go there because, first of all, the whole world is going to go there. That is part of the problem. We should not be expecting that the AI will tell us what is true.

Alex Loewi (08:34)
Yeah, yeah, yeah. For sure.

Yep. Uh-huh.

Adam (08:58)
I think the AI, if anything, should be investing in creating the kind of infrastructure and the kind of environment where truth can emerge between people. It shouldn't be the one that comes in and says, this is what is true, this is what is false. I wouldn't trust the team to be able to do this. I wouldn't trust an AI to be able to

Alex Loewi (09:05)
Yeah, that makes more sense.

Adam (09:14)
have that type of power. I wouldn't trust a team that thinks they can install this in an AI. I wouldn't trust me to do it. You know what I mean? I'm sure that I'm right about, you know, a hundred percent of the things that I believe, but there's probably a percent that I'm wrong about. Maybe it's 1%, maybe it's 99, who knows? It shouldn't be up to me to choose what is right and wrong, and to then forever fix an AI

Alex Loewi (09:37)
Yeah.

Adam (09:44)
to believe it and then to propagate my nonsense. And so the AI should be as agnostic about truth as possible, but it should be extremely methodologically driven, not ideologically driven. The method should be: get people to have conversations and to go deeper into the sources of their own knowledge. Right now we've been living in this world, and I think it's still because AI is so young, where we are prompting the AI. The AI should be prompting us.

Alex Loewi (09:48)
Right. Yeah.

Mm-hmm.

Adam (10:14)
It should ask us questions. We should then be able to excavate more deeply into our souls and into our psyches, to figure out what we believe in and why, and then to question the reasons why we believe certain things. It shouldn't just be offering us the solutions right away. I think that's cheating. And it looks like a lot of people are willing to do that kind of work. It could just be the case that AI needs to... I think about it a little bit as creating the conditions for science.

What's most interesting for me isn't scientific discoveries. It's the process and the environment that even allowed us to have scientific reasoning in the first place. One way I think about it: take Galileo, right? When Galileo looks through the telescope, as I see it, what he sees is a whole new way of knowing things. It's not just that he sees the craters on the moon.

Alex Loewi (11:12)
Yeah.

Adam (11:12)
It's that he introduces a whole new radical concept of what it means to know something. To know something doesn't mean you read it in a book, or that somebody told you, or that you got it from some source of authority, or that it's just tradition because that's how we've always believed these things and that's what we'll continue to believe. It's: just take a telescope and look. He can take his telescope everywhere in the world. Everybody will look. They will see the same moon. So this is the condition that allows for

Alex Loewi (11:21)
Great. Yeah.

Adam (11:41)
the rise of these instruments, which let us have this intersubjective way of knowing. I know a thing that you can also know. We need to come back to some type of environment where that's possible. The problem today, to extend the example, is that I have one telescope and you have a different telescope. When I look through my telescope, I'm seeing a different feed. It's almost like an AI-generated, infinite feed.

We're not even agreeing on the same sky. Of course we're not understanding one another. What can AI do here? It can very slowly start to do what is almost a translation job: translate from my sky to your sky. I don't know, but I feel like the solution needs to be more infrastructural and more foundational. It should be at that level, and the AI should be involved

Alex Loewi (12:14)
huh, right.

Mm-hmm. Yep.

Adam (12:41)
in creating that common tapestry that allows us to reason across these different skies. And I think that's much more interesting and important work than for the AI to just tell us who actually started the Six Day War. I don't know. Whatever.
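To make the "methodologically driven, not ideologically driven" line concrete, here's a toy sketch of the distinction in Python. This is my illustration of the design principle, not HeadOn's actual code; the probing questions are invented examples.

```python
# Two possible jobs for an AI moderator. Adam's bright line: never the first.
#
# Arbiter (rejected):  read both claims, rule on which is true.
# Facilitator (kept):  rule on nothing; prompt each side to excavate the
#                      sources of their own belief, and log the clash so
#                      the group can return to it later.

PROBES = [
    "Where did you first learn that, and how would you check it?",
    "What evidence, if you saw it, would change your mind?",
    "Can you restate the other side's view in words they'd accept?",
]

def facilitate(claim_a: str, claim_b: str) -> dict:
    """Respond to a factual clash without ever issuing a verdict."""
    return {
        "verdict": None,  # deliberately absent: the AI stays truth-agnostic
        "prompt_for_a": PROBES[0],
        "prompt_for_b": PROBES[1],
        "open_thread": f"Unresolved: {claim_a!r} vs. {claim_b!r}",
    }

print(facilitate("Side X started the war.", "No, side Y started it."))
```

The method is fixed (ask, excavate, log); what's true is left to emerge between the participants.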

Alex Loewi (13:03)
Totally fascinating. Certainly makes sense to me. I feel like I get there myself by a different road. Different conclusions... sorry, same conclusion, different road. I think of everything as, well, what are we trying to accomplish in the first place? Because I feel like if we're not talking about what the goal even is, then we have no hope of ever coming to the same conclusion, because we're just trying to accomplish different goals. If we have something like, we're trying to,

you know, not have a war. We're trying to be coherent and coexistent as a country. We're trying to, you know, understand or persuade, however you want to put it, the other side with whom we disagree. Then what you're talking about becomes a necessary step, even an obvious step, on that path. But if we don't make it clear that that's where we want to go eventually, then

I don't know, it never even comes up most of the time.

Adam (14:03)
That's right. That's right. And I feel like we've been asleep at the wheel here, right? We've simply not articulated, for ourselves and for others, what the plan is, what the values are, and where we're trying to go. Which is why I really like what you're doing, right? We just need to do this. This is intellectual history in a sense, right? This is the cultural work that we have to do. And

Alex Loewi (14:08)
Yeah. Yeah.

Awesome. Yeah.

Adam (14:33)
if we're not doing it, somebody else is doing it, and you're not going to like the results.

Alex Loewi (14:38)
Right. Yeah, that's...

Adam (14:40)
No, and I think, just to make it concrete: we had a discussion in the Jerusalem room the other day about, okay, who gets to build which temple and why, because if you build a temple, then you'll be able to bring about the end of days more quickly and the right Messiah is going to come. How do you convey this to a group of people who are totally antagonistic to that concept? Right? And at some point, they agree: well, let's just try to be pragmatic about it.

Alex Loewi (14:46)
No. Yeah.

Adam (15:09)
The end of days will come. That's fine. Whatever happens, the end of days will happen. You know, it's okay. We'll figure it out. But until then, how about we try to invest in maximizing the wellbeing of everybody? What if we all try to live our best lives in this life? It's almost like it took so much conflict, and so much conversation, to arrive at these truths. And you're trying to get to those truths. You're like, all right, let's just imagine that we agree on this. They're not yet agreeing on this, right? But at least you're putting together the framework that they can agree or disagree on. And so that's why I think that what you're doing is absolutely crucial. How do you get people on board? I think there's just a lot of work to do.

Alex Loewi (16:00)
That's the question. That's one of my big questions for either HeadOn right now, or like HeadOn USA, whatever the future generation is. Because it seems to me that one of the major problems is recruitment. I think you've got a key insight into how to get obstreperous people into these conversations, but there are so many people who, I'm afraid, just fundamentally won't care. So I'm thinking about, like, we've got...

77 million people who voted for Donald Trump.

How do we engage them at all? Like, what's the closest path from like us to those people?

Adam (16:45)
I'm not yet under the impression that we'll be able to get a billion people on HeadOn. Certainly not in its current manifestation. I don't believe that most people want to spend so many hours in political debate or political discussion or conversation of any kind. But I think as soon as you broaden the appeal a little bit, now we're talking about sports, and we're talking about pop culture, and we're talking about a lot of different things that do matter to people. You talk about death and how to cope with it. You talk about relationships and sex

Alex Loewi (17:00)
Right. Yep.

Adam (17:15)
and dating. I think people do want to have these conversations, and they're craving spaces in which to have them in a productive manner. And at that point, you get to meet a lot of different people who might disagree with you about a bunch of different things. And the AI can even help to funnel you into those rooms where you're going to be confronting people who are not going to agree with you on things. That's the whole purpose. But ideally we get to do it in a way that is fun and contagious and addictive and engaging.
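How might an AI "funnel you into rooms" with people who disagree with you? One naive approach, sketched below under invented assumptions (self-reported stances on a -1 to 1 scale), is to pair the users whose positions are furthest apart.

```python
from itertools import combinations

# Hypothetical: each user self-reports a stance on a topic, from -1.0 to 1.0.
stances = {
    "user_a": -0.8,
    "user_b":  0.7,
    "user_c":  0.1,
    "user_d": -0.2,
}

def most_contrastive_pairs(stances: dict[str, float], k: int = 2) -> list[tuple[str, str]]:
    """Return the k pairings with the largest stance gap, so that rooms
    maximize exposure to disagreement instead of becoming echo chambers."""
    pairs = combinations(stances, 2)
    return sorted(pairs,
                  key=lambda p: abs(stances[p[0]] - stances[p[1]]),
                  reverse=True)[:k]

for a, b in most_contrastive_pairs(stances):
    print(f"room: {a} ({stances[a]:+.1f}) vs. {b} ({stances[b]:+.1f})")
```

A real system would presumably balance contrast against safety and engagement, but the principle, optimizing for productive disagreement rather than similarity, fits what Adam describes.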

Alex Loewi (17:21)
Yep. Uh-huh.

Yep.

Adam (17:45)
Let's imagine we don't solve that. Let's imagine that that's too ambitious. Fine. I don't think you need to get the 77 million people onto one platform. Look, I don't know the impact we've already had by just having, what, like a thousand people having these conversations. Those thousand people could be the next Ben Shapiro. They could be the next successful New York Times bestselling author. They could be a podcast host, and they themselves can then go and influence a bunch of

people. So for the people who find it very important to have these conversations, especially at very young and formative ages, this already makes an impact. We see the impact that this makes on them. Are they going to be the next President of Palestine, the next Prime Minister of Israel? They might be.

Alex Loewi (18:33)
Yeah, yeah. Because it's exactly the kind of person who wants to engage in this stuff at a young age who's probably going to be going in that kind of direction later in life anyway.

Adam (18:45)
Yeah. So I don't think that you need... we need good leaders, you know, in our politics, and we need good, for lack of a better word, poets. We need people to do the culture work, and we need pundits. We need all of these different people. We need commentators and critics. And I feel like they're looking for this type of thing. Now we have a bunch of people who are already running podcasts

Alex Loewi (18:57)
Yep. Yep.

Adam (19:13)
or writing blogs or writing articles and submitting them. We have a very high-quality group. We have people who are bestselling authors, and they're all coming to have these discussions, sometimes to sharpen their arguments, sometimes to test out different approaches, and sometimes just to open their minds and listen to people from the other side. We've had activists who have been engaged in this for decades who are emailing us and messaging us and telling us they

Alex Loewi (19:17)
Awesome.

Yeah, yeah. Awesome.

Uh-huh.

Yeah

Mm-hmm.

Adam (19:43)
actually learned a lot from this last conversation. So all these things make a difference. To what extent is that going to envelop everybody? I think we'll see, but we just need to pull on any lever we can.

Alex Loewi (19:46)
Super cool. That's awesome.