AGI – When Did You Cry About The End?


Let us travel back in time, starting right now and walking backwards.

Today Elon announced on Twitter: “Having a bit of existential AI angst today.”

A couple of days ago, Sam Altman (of OpenAI, the makers of ChatGPT) published a piece on the OpenAI website about AGI and taking it seriously. In the acknowledgements he thanks a few of our folks: the reasonable Paul Christiano, Nate Soares, and Allan Dafoe, all of whom I know in person and vouch for as sincere EAs; Helen Toner, whom I also know; and Holden Karnofsky, of whom I have always been suspicious and continue to be, though many other people think he is a reasonable EA.

A day before that, Sam Altman showed up with Eliezer Yudkowsky and Grimes (Elon’s ex) in a picture that went viral.

The day before that, a podcast came out in which Yudkowsky says we are all going to die.

Now, you may see coincidence, but I see causation.

In that interview, Eliezer mentions when he cried.

2015. In 2015, Elon, I, and many other people gave presentations at EA Global, hosted at Google. Elon’s condition for participating was that on top of the public-facing panel on AI, there would also be a private one.

They later came to the hotel where, a few months before, we had held a rationalist conference with all the top AI people and some of the smartest people on Earth. (I’ll post pictures of these events below.) At that time it was already obvious to the best of us that Paul was the best of the young people. And Paul went on to help build OpenAI with some other people.

Yudkowsky was close to the negotiating tables and all that. But at some point, be it by Elon’s will or by OpenAI policy, they decided to go open. People like me, from the Nick Bostrom school of Superintelligence, were scared by the name being OpenAI.

The story I told myself so I could sleep at night was that they were pretending, using the name “Open” so that anyone interested in openness would go there, and we could neutralize them all if they were likely to destroy the world elsewhere. A small, very small part of me still believes this.

In the interview, Yudkowsky says that when that funding round happened and they were moved more by likes, dislikes, and primate stuff, that is when he realized we were not going to make it as a species. That is when he cried. That is when he realized that we were not going to make it. 2015.

Now, if you’re new here: no one fought for this harder than Yudkowsky (not even Nick). Yudkowsky is a genius and one of the best people in history. Not only did he try to save us by writing things unimaginably ahead of their time, like LOGI, but he more or less invented LessWrong. He wrote the Sequences to train all of us mere mortals with 140-160 IQs to think better. Then, not satisfied, he wrote Harry Potter and the Methods of Rationality to get the new generation to come play. And he founded the Singularity Institute, which became MIRI. It is no overstatement that if we had pulled this off, Eliezer could have been THE most important person in the history of the universe.

Once someone asked me if I disliked Eliezer (I guess it was because he didn’t talk to me much despite us working in the same offices). My response was something like: “I love Eliezer. There is nothing Eliezer could do to possibly cause me to dislike him. The only scenario where I can possibly imagine Eliezer doing something that would make me personally dislike him is if he were holding a blade to the neck of my family members. And even then, once the situation was over, I would still be extremely thankful to him for all he has done.” You get the idea. I like the guy.

Anyway, so Elon cried in 2023. Eliezer cried in 2015.

Back when I was leaving Oxford, right before Nick finished writing Superintelligence, on my last day, right after taking our picture together, I thanked Nick Bostrom on behalf of the 10^52 people who will never have a voice to thank him for all he has done for the world. Before I turned back and left, Nick, who has a knack for humour, made a gesture like Atlas holding the world on his shoulders, almost letting the world fall, then readjusting. While funny, I also understood the obvious connotation: how tough it must be to carry the weight of the world like that. He didn’t cry then, in 2013, but that gesture was kind of an emotional cry in its own way. So let us say Nick cried in 2013.

I met Geoff Anders in 2012, while I was a visiting scholar at the Singularity Institute. He had arrived two weeks before I did. I was something like the sixth member of Leverage, about a day into knowing Geoff. I could see that he was there for the same reason I was, with the same intensity. It was pretty cool.

A few days into this trip, Geoff, Justin Shovelain (another character from that world), and I went out for food. We were doing what we all did back in 2012: talking at maximal speed about how to save a world on fire that didn’t know it was on fire. Back then there were very few people who understood what was up. More than 10, fewer than 150.

And at some point in the conversation it just hit me. A jolt of neurotransmitters, feelings, and sensations overcame my body like a torrent, and I stopped listening and speaking for a little bit.

At that moment I understood the magnitude of the responsibility that had been bequeathed to us. I realized how rare I was. How few of us there were.

You may think it is fun to be Neo, to be the One, or one of the Chosen Ones. I can assure you it is not. We had discussions about how we couldn’t take the same plane, despite the odds of a plane crashing being so low, because the risk for the world was so high. The stakes of every single conversation were astronomical. The opportunity cost of every breath was measured in galaxies.

And when my body finally processed that this is real, that it is actually happening, I did what our ape bodies do best. I cried. I cried with my whole heart. I cried in 2012.

I am sure that Nick, Yudkowsky, and others have cried before. Put yourself in that position. Can you imagine the weight of the world on your shoulders like that?

Anyway. So, today, as far as I’m concerned, Elon cried.

Of course the situation isn’t the same anymore, not in the least.

Elon was convinced by Nick Bostrom, same as me. I don’t know exactly when he joined the game, but we started trying to get him in 2013, and he was definitely on board, enough to present at EA Global, meet our folks, and all that jazz in 2015. So at some point in between, Nick broke through to him.

Peter Thiel’s book Zero to One has a final chapter that is literally just Nick Bostrom simplified – not my interpretation, it says so in the chapter – so I’m sure Peter cried at some point before 2011, when I imagine he finished teaching the course whose student notes became the book. That’s when he cried. Peter and Elon are friends-ish. So, you know, maybe that.

2023 is not like 2012, when I cried.

Eliezer and Nick never really believed we had a great chance. But the chance we had in 2012 was so much, so much bigger than the slim chance our actions matter now. Maybe we will all die. Maybe we will all go to AI heaven. But the odds that our actions will be what determines that have diminished by orders of magnitude between then and now.

Another guy whom I saw cry, literally, was Michael Arc, also known as Vassar. It was 2012 and he had just returned from Israel, England, and so on, and he was fucking exhausted like I have seldom seen anyone. He was “me at the end of Burning Man” tired. ‘Burned out’ could have been named after his state. The guy was a wreck. That conversation with a few people was, I think, a couple of days before the day I cried with Geoff and Justin. It was the first time I heard someone state that “it is just us.” He was pointing out that, with all the mechanisms they had put in place to locate people, it looked like there was no one else in other parts of the world, in the top universities, and so on. There was no one else. It was just them (I had just arrived; I don’t think I should count), and there were not many of them. Seeing him say that began the cascade that led me to cry a few days later. That day, Vassar cried.

I also saw Anna Salamon’s day. The memory is blurry, but I recall her having returned from England. She wasn’t sad, though. She was very grounded and rational (this was before the meeting where I saw them found CFAR) and had just met Nick for the first time (I had not met him yet). She said they seemed to be “just like us, except more able to survive the bureaucracy of academia.” At least in England, I concluded, there was hope. We were not the only tribe on the case.

A post-mortem: Did I do enough?

Well. We lost. So, by that obvious metric, I didn’t do enough. That is what Solzhenitsyn would say, anyway.

Even the stuff I did do, like writing a paper on AGI and psychology, I didn’t try to spread to the relevant parties. I accepted being scapegoated relatively easily by the Berkeley world, because of rejection-sensitive dysphoria and other stuff. If we use the Elon or Nate standard of “do whatever regardless of what you feel, so we win no matter how unlikely our victory is,” there is no doubt that I did not do enough. Thank goodness I am not as draconian as they are in my self-judgement. I think that, for a random monkey in Brazil, I did some pretty high-level stuff while the window of opportunity was open.

I helped create EA and gave the first TED talk on it, created a couple of crucial-consideration institutions, and met Nick, Paul, Eliezer, Hanson, and so on. For a period I ran the world’s largest EA house. I did a PhD in Altruism, which is about creating the necessary conditions in the future world for things to go well, both in the case that AGI is possible and in the case that it isn’t.

I met Toby and Will, of EA fame. Toby showed me what math philosophers should know. Will and I were a little more… uh. OK, that’s not relevant for the internet. But what I’m most thankful to Will for is that he saved me an amount of time that is hard to imagine. Pretty much everything I wanted to do between 2010 and 2016, Will did two years before I would have. In so doing, Will relieved me of responsibilities in a way I cannot express enough gratitude for. Will was like the bird at the front of the formation that breaks the air barrier, and he allowed me a much, much, much freer life than I would have had without him. Thank you, William MacAskill, for what you have done for me, personally.

I’m sure Will cried at some point. I hear through the grapevine that both Toby and Will joined our team, team Nick Bostrom, so to speak, over time. Both of them started as EAs, but Toby’s latest book, The Precipice, is on existential risk, and Will’s latest book is What We Owe the Future. Basically, the longer we lived, the more I saw them converge toward the position I held back in the early days. Joao, my friend, colleague, and coauthor from Brazil, edited and did research for both books.

But anyway. All this is just a post-mortem pre-mortem. We haven’t died yet. But if even Elon is already at the crying stage, who is left fighting who understands the real odds at stake? Dr. Strange? The odds are getting close to Avengers odds, which, assuming reality-fluid homogeneity across possible worlds in the MCU (a ridiculous assumption), are about one in 14 million. We are not doing that badly yet, but we’re less than two orders of magnitude away from it.
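For concreteness, here is a rough back-of-the-envelope behind that comparison, assuming the 14,000,605 futures Dr. Strange checks in Infinity War (exactly one of which is a win) and reading “two orders of magnitude” as a factor of 100:

$$ p_{\text{Avengers}} = \frac{1}{14{,}000{,}605} \approx 7.1 \times 10^{-8}, \qquad 100 \cdot p_{\text{Avengers}} \approx 7.1 \times 10^{-6} \approx \frac{1}{140{,}000}. $$

So “less than two orders away” puts our odds somewhere between roughly one in 140,000 and one in 14 million.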

So, that’s when everyone that I know cried.

If you have an idea of what to do to save the world, shoot me a message. Can’t really get much worse than this. And one of you may have a good one.

I should have spread my paper and PhD thesis more widely. But I lack the emotional stamina for it. The social dynamics of the Bay affected me, and being away was kind of a psychological necessity.

Anyway, you don’t have that problem. If you have an idea to save the world, let me know. I can at least help you filter out the bullshit, and refine the decent ones to a level where we could bring them to some of the right people.

Maybe you haven’t cried yet. Maybe you never will.

But most of the rest of us cried.

I cried a little bit writing this text.

It’s ok to cry.

It is a big deal, and we are here for you.

For 22 years, relentlessly, we tried. Not only these characters whose cries I happened to witness, but hundreds of other people.

I thank and love every one of them.

It has been an absolute honor to serve with you all, folks!

I’ll see you on the other side!


1 thought on “AGI – When Did You Cry About The End?”

  1. A bit melodramatic, innit? AI apocalypse is far from an inevitability, and the future has extremely wide error bars. Have you considered that all this crying might at least partially be the playing out of ancient archetypal brain modules, i.e., the eschatological impulse?

    Numerous cults/religions throughout history have proclaimed that humans are on the verge of extinction, and it’s provided their members quite an emotional rollercoaster/served as a bonding gel for their religious community. After all, there’s nothing like the end of the world (and being one of the chosen in-crowd that knows about it) to distract from the banality of day-to-day life.

    https://en.wikipedia.org/wiki/Eschatology

    Granted, I’m not saying alignment research isn’t crucially important – I’m just saying that the Yudkowsky-esque bipolar mental breakdown is entirely unwarranted at this point.
