Hello, I'm Rob Hirschfeld, CEO and co-founder of RackN and your host for the Cloud 2030 podcast. In this episode we dive into Shoshana Zuboff's book, The Age of Surveillance Capitalism. You don't have to have read the book for this episode. It is an amazing tome of insights and predictions, really insightful work; every time I opened it up to any page, I read something fascinating and interesting. And we had a very good conversation as a result of it. But once again, you don't need the book. Before we dive into the book, we start by watching Apple's new iPad ad, and I highly recommend that you actually watch it. It's only a minute long, and the link is in the show notes. Watch it before you listen, because I left in our discussion about the ad; it's a good tie-in to surveillance capitalism and the discussion. So if you've seen it, I think you will enjoy our commentary about it. If you haven't, please pause for a moment, watch the ad, and then enjoy our discussion about The Age of Surveillance Capitalism.
I had a tidbit that I thought y'all would want to weigh in on first, which is the Apple ad. Have you all seen this ad?
No.
I don't want to bias it for you. Okay, so we'll jump to the book, but you need
No, no. Where do I find it? What is the Apple ad?
Apple has a new ad for the new iPad that
they just announced, right? Super
super thin. And they have this one-minute ad that is generating a lot of controversy. I'm going to try it, let me see if I can find it for you. I watched it last night.
I might be able to run it through. This is it, I'd say.
Yep, okay. That's gonna work. Okay, perfect. All right, I'm going to share a screen and hopefully the audio will share too. Let's see, it's running on this screen, good. And share sound.
let me pause my other sounds
and optimize for video clip. All right, we'll see if this works in the background.
I can see it, okay.
Did they issue a trigger warning in advance of showing it off?
Visually disturbing.
It's been described as the first totally AI-generated ad. It's a message to all of us.
The more you think about it, the worse it gets.
Squashed by Apple. I know, this is apple juice.
Oh my god. Yeah, now they're getting pretty soundly beaten up over the ad, and basically, you know, all the creatives and all the arts people and things like that, they're just crushing them.
Oh, it seems a little bit scorched earth on their part.
You know, they love being this creative brand where people are going to do all this stuff on the iPad, and they think it's going to be this amazing new computer. But
You know, I liked them better 20 years ago. Yeah.
You know, when Steve Jobs died, my thought was that eventually Apple is going to be the new IBM.
Listen, do you think this ad has the potential to be as iconic, maybe not in the same way, as the 1984 ad?
The '84 ad. Yeah,
I don't, because I think there's too much fragmentation of media now. In 1984, everybody saw it. Everybody. And it doesn't work like that anymore. But
I think that it was iconic not just for what people saw, but for Apple's sort of ethos, right? And I think this ad is indicative of their ethos even more than it is of the sentiment.
And remember, in '84, you know, they were the upstarts; their adversary was, you know, the dusty, gray-looking, you know, Microsoft types, right? So a little different. But no, this was just dispiriting.
Let me ask y'all a question: who's more innovative right now, Microsoft or Apple? Neither?
Well, no, they're both innovative. I think that they're innovating in very different ways. Very different ways. And
I agree with that.
Surveillance capitalism, by the way,
This is a great, well, actually a pretty interesting use of the whole surveillance capitalism idea. I won't say it's a litmus test, but it's a boundary. Because, you know, what Apple has at least projected out into the rest of the world is: we as a company, our devices, our services, are generally trying to give you a sense of safety and security with respect to your data. They make a big point of that. Now, you may use our devices and go out there and give all of your data to, you know, every data broker on the face of the planet, but we're not going to do it. Microsoft, not quite so much. And remember where Microsoft is spending its money, you know, pushing for innovation. And I'd say
they've done a pretty good job of building out a very robust, and to some degree appealing, cloud service that is starting to put everybody else to shame in some ways.
They're also spending like crazy. You know what they're doing, what they've committed to with AI? Innovative? Well, you know, they're not watching their nickels and dimes, that's for damn sure.
I think the general assumption, to me, with AI too is that there's a "do a whole bunch of stuff and figure out what's going to stick" mentality at the moment, which is actually in the book. So, this book is too big to really review, but one of the things I found is that I can turn to any page, spend ten minutes reading it, and have to put it down, because it is absolutely terrifying on every single page. Two comments. One, going back to the Apple piece: the book talks about Google needing to make money. They had been collecting all this data to improve search results, and all of a sudden they realized they could tune ads better with that same data. And just like that, the light flipped: oh, we need to make money on ads, and we're going to use the same data to target ads. It was just that easy for them to say, oh, we have all this extra data. And I don't know when that moment comes for Apple, where they have all the data. They might not be sharing it as much as some of the other providers are, but it's a monetization opportunity.
It's a monetization opportunity. I think it's an ethos that Apple has cultivated that they won't do that, or at least that they will do it only in consultation with, in collaboration with, their customer.
I don't agree with that, quite exactly. I wouldn't say ethos; I would say image. Yeah, I am convinced that they use and potentially abuse the data.
Yeah. So, 19 years ago, I remember a case study in my MBA program at SMU in Dallas where we talked about Target stores doing analytics on data and being able to predict when customers were likely to be pregnant. There was a case where they sent targeted ads to a 17-year-old girl, and her parents got very upset with Target because it was promoting all this baby stuff. And it turns out that 17-year-old girl was pregnant, unbeknownst to her parents, right? Every person that has gone through any top MBA program has gone through that case study for the last two decades. So if the question is, is Apple doing this? The answer is yes.
Well, I mean, the point of the book is that the economics are overwhelmingly compelling for companies to do this. And there's a step further that we'll get to, which ties back to Microsoft, but Joe, you had your hand up. So
I had my hand up because Apple is doing it. And if you want the case in point, all you have to do is look at their beacon business, which they never talk about. They're a primary supplier to every major retail chain with their beacons. And they do collect the data, and they do turn it into revenue. And I think it's utter, pardon the expression, horseshit that they constantly come out with "we are so private, and we won't sell your data," blah, blah, blah. I do not believe that they are any less monetarily engaged than any of the others. And I also disagree with the premise that these companies are innovative. There's a lot of innovation; there's very little invention. And what we need is invention, not innovation. Anybody can make a better mousetrap. Yeah.
Yeah. But the thing that I found really fascinating in the book, and I'm looking to see if I can find the graph here because it ties back into some of the hunger for AI, is that the book predicts a cycle where we go from data collection to data analysis to data prediction, which is where all this AI work comes in. And prediction is not the end of the story. There's actually a further stage where you're actually causing the behavior.
Right. So it's prescriptive. Prescriptive. So
what you're doing is you're saying, okay, I can predict that if I show you these ads, then you will do these things, or I can show you these things to make you happy or make you sad or whatever.
We've hit prescriptive. I mean, Facebook is probably the biggest, TikTok the earliest, and one of the most heinous examples of prescriptive, because they have turned their platform into exactly that, for, you know, everybody from selling cosmetics or foodstuffs to political ends. Yeah. And the prescriptive aspect, I think it's really that, every time I read the Zuboff book or the articles, that makes me just cringe. The hair on the back of my neck kind of rises.
I just wanted to go back to the previous discussion about the beacon business and tie in the news this week of Google's announcement of using Bluetooth data for their Find My Device network. Which is terrifying, particularly since they also describe how they protect it: encrypted with your PIN or phone unlock credentials. What? Yeah, exactly. Lovely. Yeah.
And Tyler, just to that point, that's why I said to you: machines.
We're gonna do a takeoff on The Graduate, you know?
Yeah, I got one.
I got one word for you, just one word: machines.
Well done. I watched that movie recently. It's worth a rewatch; I hadn't seen it for decades.
So it retains its relevance.
Yeah, let's take this question about, you know, surveillance capitalism; we could call it digital capitalism, which kind of expands it. The part of it that has kind of haunted me, and this is a good group to bring it up with, the thing that haunts me besides the prescriptive aspect on human behaviors, is what I would like to think of as kind of digital colonialism. And what that is, is basically colonizing and holding hostage, if you will, a variety of entities: mostly enterprises, utilities, governments, particularly in edge computing and kind of the distributed, we'll call it distributed data and distributed compute. This is where, you know, Ford or John Deere, you know, basically, you don't buy a tractor or a vehicle anymore; you basically lease it, you rent it, and what you're doing is actually subscribing to their services, only their services. Try to make a change to it, and you're cut off; you have just lost your investment. So in some ways it's a really insidious approach to the control of an enormous amount of infrastructure, including, you know, essential services, everything from electricity, water, hospitals, traffic, you name it.
It's a cyberpunk dystopia. Yeah,
I spent a lot of time doing competitive analysis of the Microsoft Fabric platform, and I'll give you a data point in support of what you just said: if you look at their pricing model, there are literally hundreds of different product components across those categories that are priced out in units that no one can understand. With thousands of SKUs, it is impossible to do a proper pricing analysis of a solution like that.
completely. That's on purpose. And that's on purpose. Yeah,
I would go farther. I wouldn't call it digital colonialism; I would call it digital feudalism. Oh,
Well, all the CSPs are like that, right? Their walled gardens. It is very hard to break through. But one of the points that I took from the book, and, you know, thinking about this conversation and the roles that we all have, I said to myself: am I actually already a product? Yeah, well, I am a product. Anybody who creates is a product, because it comes down to the difference between explicit consent to be used as a product versus uninformed consent. Right? I mean, her whole argument is that if consent is not there, you're being used, you're being taken advantage of, blah, blah, blah. But anytime we post on LinkedIn, anytime we speak publicly, anytime we have a conversation that then gets publicized, we are in effect giving the rest of the world carte blanche to use our collective brainpower.
The consent piece is questionable as well. Like, practically every service that's out there now will ask you to consent to their terms, but those terms are in no way understandable by the common person. No, yeah, no.
Which leads me to, you know, I mean, I sit here and I think, well, what's the counterpoint to her thesis? And I say, well, digital ID, which is a conversation we've already had, so I'm not going to go there again. But that's where I come up with those ideas of why we need to be far more protective of our intellectual capital. I mean, in some respects, you have to give to get, I get that. But giving to force-multiply a flywheel of benefits to a large corporation, without any real remuneration for it, I wouldn't call that colonialism. I'd call that exploitation. And that is
the trickle down economics. Yeah, of course.
I believe firmly that the end state of every economic system is slavery. The end state of surveillance capitalism is slavery.
That's the point, and it's very well articulated in the book, because ultimately the driver here is that the goal is to direct your behavior. So it's shadow plays, right? Yeah. And so if they have enough information to, you know, direct your behavior on a regular basis, then you've given up a degree of, well, the illusion of, free will; you're being directed. I mean, I could see, in the not too distant future, because we're already seeing things about your driving patterns being shared with services, I'm actually surprised that the cars aren't embedding sensors that use or have access to things like beacon networks or video surveillance or things like that.
What I'm talking about is, you drive the car, and there are services embedded in the car that, you know, you can't turn on or off, or you don't know they're there. You're just carrying around a sensor bundle. Well, I got
another car. Yeah, the government can subpoena recordings of people for criminal cases from OnStar, where the customer has not even signed up for the service. Yes. Okay.
So GM just admitted that they have been using the data collected from all their vehicles and selling it to insurance companies to, you know, surreptitiously up the premium on your policy. Right, we
could see it going to a point where, if you're driving faster than average, they're going to infer your emotional state, right? Or if you're doing hard braking,
you don't even need it in the car. You've got the phone on you, with all of these sensors.
Yeah, you're right. No, it's this. Apple and
Google have convinced us to carry our surveillance device with us wherever we go. And
This is now back to where Apple is happily collecting all that data, pretending that they're not doing anything with it. But, to the point of the book, it becomes an irresistible force of the market for them to say, wait a second, if we can actually control this behavior, it's much more valuable for us to create the cycle.
That's why I call it feudalism, because your nobility are the corporations, right? And then the kings are the countries, and everybody's playing games to control the serfs.
On the other hand, the colonialism analogy is appealing because it also provides the image of trying to exploit as many resources from outside of your fiefdom as you can, dragging them in and leaving an empty husk behind, like the colonial powers did.
You guys it's kind of like feudalism from the 13th century combined with mercantilism of the 17th century. Yeah,
Well, that was what colonial powers were all about. Yep. Not only to extract, but the other part of it was to create a market for goods that they were producing with the raw materials they were extracting from their colonies, turning it right back around and making outrageous profits on mostly shoddy goods. I mean,
no competition? Exactly.
I mean, the proverbial cotton gin generated enormous amounts of wealth. You basically upped the production of cotton in North America and in North Africa, and it gets
turned into cloth and woven into clothing in Britain, in factories that were pretty horrendous, and turned back around and sold to a very willing population in all the countries that they were colonizing, at just outrageous profits.
You know, it's truly extractive. Yeah, the fur trade is the same, but even more so with, you know, goods like clothing and so forth. So,
I have seen that happening now with private data, or rather personal data. Not private, no. Exactly,
exactly. The only other thing about colonialism that has some appeal, let's call it a little bit of an optimistic appeal, is the fact that, within reason, sometimes the colonies rebel. The colonies will pull together, and sometimes they'll just substitute one colonial power for another, but still, there is some freedom of action. Though I'm not sure I feel super optimistic about that. And this notion of being, you know, kind of being the product, where personal data is gathered, extracted, compiled, sold: paying me for my personal information on a per capita basis ain't gonna move the needle. I mean, 60 to 80 dollars a year is what they think individuals would be paid, given the amount of money generated. Well, actually, I'll turn it around. Profit sharing, I don't think that's going to work either, because you're still only sharing whatever the profit on that 60 dollars is. However, if you took that 60 dollars a year for all the people on whom data is being gathered and used it for the common weal, and what I mean by that is, it goes after and funds the prosecution of privacy legislation, it goes after and supports a variety of services that are infrastructure. That's a lot of money. Putting it into various kinds of common pools used for that community, for the entire community that's being surveilled: that, in the right hands or under the right controls, has a lot of power.
So I've got one for you. The first slogan for probe ops was control your data, control your future, and no one cared.
And if you changed that to "control your privacy, control your future," people would care.
Not enough.
Maybe not enough, but
I think we're all worried. But I don't think that's, I guess I'm a little cynical. But
Cynical is exactly the word I was going to use. There's so much cynicism in the population. Like, yeah, I've talked with people, and I'm particularly private, so again, I'm an outlier. But I've talked with people who said, when there was the big discussion about Facebook buying WhatsApp, like, they will have your private information. And the answer was: I don't care, they have it already anyway. There's so much cynicism built into society already that in some cases you just cannot convince someone to take their privacy back. And
that's one of the things the book, I think, actually did a good job explaining, because a lot of times I have this conversation where people say, I'm just a regular citizen, nothing I do is interesting. And on the contrary, what the book did a nice job explaining is that the goal here is not to, you know, catch you in a crime or prevent crimes; it's literally just to extract value from you. And you're very interesting from a marginal extracted value perspective. One of the challenges, it makes me think of the Amazon payola schemes, where they're in the middle and extracting their cut: it is their job to maximize data extraction on both sides, from the consumers and from the vendors. And that's what they're using this data for.
Right, ultimately.
I don't think we see it. I don't think we're individually troubled by it. We get, you know, free services and ads, we get good prices.
There have been a few efforts. Mostly, you know, they've been almost, you know, kind of soapbox, almost religious efforts that have happened in governments that I think have made a difference. In the US, for a while at least, you had consumer protection as one of these soapboxes, one of these platforms where a lot of people really did benefit, and they truly believed in the value of it. The program got killed, and there
was a Ralph Nader series that NPR is doing that's really good about it. Yeah.
So, but the whole point, well, it was Elizabeth Warren who pushed for and was part of the whole consumer protection agency. Those are the kinds of things where, were the colonial powers taxed and the money collected and put to a use like that, I can see it making a difference. I think, while it has its flaws, the EU's approach, where you have GDPR and you have a variety of both intellectual property and personal privacy protections, and their approach to basically anti-monopolistic practices, you know, this is a place where things are done for the common good, granted on the back of taxes collected from the population at large, not explicitly from the offending colonial powers here. I think those are important and good indicators that something can be done. And with enough appeal to that effect, and with basically a willingness to sit there and place a tax on the real profitability of these, as you've pointed out, as Zuboff points out, the irresistible profit margins on this, then, you know, I think there's at least some counterpoint to a lot of this.
Do you think that at some point we're gonna break up, and, I'm looking at this calendar and wondering if this is a May 30 topic, you know, going forward, if there's a "how do we break up these big companies" question right there. We have these new
Do you think breaking them up is the answer? I
don't know. This is a little off topic, but maybe I'm trying to figure out what the topic is. This morning I was talking to my team about the white space that's getting left in enterprise software with the HashiCorp acquisition. And none of the cloud people are particularly interested in that white space. Red Hat's in there, but they're doing their things their own way. And it's sort of strange: you know, we've got these giant companies doing very big things, and there's a fair bit of white space emerging in between what they're interested in and what their product suites are interested in. If that makes sense. Maybe I'm not explaining it well, because you're giving me that look.
Rob, can we make that the topic of next week's conversation?
I can't next week, we have, no, but I can move around the calendar, and the next open spot is May 30.
I really would like to dig into that with you guys.
Okay, so
The topic is the whitespace of HashiCorp?
The whitespace of HashiCorp. Okay.
I wanted to make sure. Yeah. Well, actually,
you could look at a couple of white spaces caused by acquisitions. Yeah, there's HashiCorp, there's VMware. How
about the white space more broadly? It's not just HashiCorp; it's bigger. There are white spaces all over in areas adjacent to that. I mean, I've been thinking about that, because in some ways what I'm doing is analogous to what HashiCorp is doing at the infrastructure layer, except I'm doing it at the data layer. And so the term I'm using is "integration as code," for example. And I would really like to dig into that. Not the products, but the whitespace. That thing, right?
I wouldn't call it that, but yes, it's certainly, I think I speak for all of us, something we would be interested in.
I'll be tracking it for as early as May 30, and I'm putting in notes.
But the question that was raised about breaking up the big boys and girls: I don't know that it serves the purpose to break them up, as much as putting,
hey guys, I gotta run
blinders on them to a certain extent: you can only do this, without doing that. Yeah, because the breakup, I mean, that's like breaking up Ma Bell.
Either it's either criminalizing some behavior or taxing it.
Yeah. And I would say taxation is probably the way to go.
Yeah, but then we're getting down the rabbit hole of criminal punishment for companies, and how the fines are essentially a slap on the wrist, just a cost of doing business.
Well, taxation. You know, if you make the argument that these companies are appropriating an asset from individuals, despite their completely incomprehensible user license agreements, if you make that case, then you say: look, you're appropriating something that's owned by somebody else; you have to pay a price for that. And that price goes into a pool that is used for their collective benefit. I say that is an approach that is not criminalizing them, although they probably should be. But what it does say is, there's a notion of governance, and there's a notion here of the common good: you know, what's a common carrier? How does a government spend money, money derived from taxation, for the benefit of the populace?
It also supports the notion that our individuality has value. Which is long gone from the dialogue of social media, that's for sure.
All right, I'll wrap this up. This was good. I wondered how deep we would go on this book, and I feel like we actually covered it pretty well.
That topic sounds really interesting. Okay.
We will definitely get there. Next week is analog computing systems, and the week after that I'm tracking the tongue-in-cheek title "Hello, crypto, my old friend."
And we'll do that. I'll be on the road, but able to join the call on the 16th, so it should be good. I'll be at GlueCon; I'll give you an update on all the AI topics there, and there's bound to be a lot to know.
Everybody, bye bye.
Wow, this was a really deep topic. The book really helped us gel conversations that we've been having for years now with the Cloud 2030 group, and teed up some really interesting conversations that I hope you'll join us for in the future. Our next book after this is going to be The Two But Rule, and we will discuss that, probably in September, if you want to get a jumpstart on reading it. I do highly recommend at least flipping through The Age of Surveillance Capitalism. It is quite a book to read, and one that will likely keep you up at night, in all the ways that some of our conversations should. If you want to find out more or look at our other discussions, you can find everything at the2030.cloud. Looking forward to seeing you in one of our sessions. Thanks. Thank you for listening to the Cloud 2030 podcast. It is sponsored by RackN, where we are really working to build a community of people who are using and thinking about infrastructure differently, because that's what RackN does. We write software that helps put operators back in control of distributed infrastructure, really thinking about how things should be run and building software that makes that possible. If this is interesting to you, please try out the software. We would love to get your opinion and hear how you think this could transform infrastructure more broadly. Or just keep enjoying the podcast, coming to the sessions, asking questions, and laying out your thoughts on how you see the future unfolding, all part of building a better infrastructure operations community. Thank you.