comparison BCS_HST_2024-06-19/audio_3.txt @ 10:707f760a8359

raw from turboscribe
author Henry Thompson <ht@markup.co.uk>
date Wed, 11 Sep 2024 16:25:13 +0100
parents
children
2
3 [Speaker 2] (0:00 - 0:08)
4 It's of my eye, which means that whatever I focus on goes blurry.
5
6 [Speaker 1] (0:11 - 0:16)
7 And my mobile phone camera has that problem.
8
9 [Speaker 2] (0:17 - 0:21)
10 Well, so that might be true of attention, too.
11
12 [Speaker 1] (0:22 - 0:22)
13 Yeah.
14
15 [Speaker 2] (0:23 - 0:23)
16 Anyway.
17
18 [Speaker 1] (0:28 - 0:39)
19 We should probably, in the interests of your stamina, if not mine, bring this to a close for now and let you spend a little more time on reflection before.
20
21 [Speaker 2] (0:40 - 0:41)
22 Before collapsing.
23
24 [Speaker 1] (0:42 - 0:46)
25 Before you, yeah, exactly. Turn to.
26
27 [Speaker 2] (0:46 - 0:47)
28 At any rate, Agre.
29
30 [Speaker 1] (0:48 - 0:52)
31 Yes, there you go, Phil Agre. Well done. You see, it took you 90 seconds.
32
33 [Speaker 2] (0:52 - 1:00)
34 I know. It was obvious.
35
36 [Speaker 1] (1:01 - 1:03)
37 What about him? It was obvious to him, yes.
38
39 [Speaker 2] (1:03 - 1:15)
40 Yeah, I remember saying it to him, and he said, oh, yeah, of course, that's right. And in fact, the model that he takes to underwrite whatever he's famous for, is it Pengi? I can't remember.
41
42 [Speaker 1] (1:16 - 1:27)
43 Yeah, I mean, well, it's the revolution in planning, which I used to have to teach, which I used to be able to teach about, and which people have now lost sight of. I mean, planning isn't a thing anymore.
44
45 [Speaker 2] (1:28 - 1:28)
46 No, that's right.
47
48 [Speaker 1] (1:28 - 1:43)
49 I don't think I could point to a single one of my 120 colleagues at Edinburgh, the largest AI establishment in Europe, and say, since Austin Tate retired, that they work on planning.
50
51 [Speaker 2] (1:43 - 1:43)
52 Yeah.
53
54 [Speaker 1] (1:44 - 1:55)
55 I mean, there's reinforcement learning, but that's only planning by courtesy, or post hoc.
56
57 [Speaker 2] (1:57 - 1:58)
58 It's interesting, Xander is a planner.
59
60 [Speaker 1] (2:01 - 2:04)
61 In terms of his job title, yes.
62
63 [Speaker 2] (2:04 - 2:06)
64 Right, well, in terms of what he does.
65
66 [Speaker 1] (2:07 - 2:07)
67 Yeah, yeah, no, I know.
68
69 [Speaker 2] (2:08 - 2:10)
70 It's urban planning, but hold the urban.
71
72 [Speaker 1] (2:13 - 2:28)
73 Well, I mean, that's what Catherine, who's back there somewhere, I think. Yes. Absolutely.
74
75 The design of the non-built environment.
76
77 [Speaker 2] (2:28 - 2:28)
78 Right.
79
80 [Speaker 1] (2:32 - 2:33)
81 Defined privatively.
82
83 [Speaker 2] (2:34 - 3:01)
84 Here's a question which I thought about once and I was sort of struck by it, which is, if England were to decide one day to switch overnight to driving on the right instead of driving on the left, I believe the following proposition is true, which is the roads would be fine as is. What would have to change is an awful lot of signage.
85
86 [Speaker 1] (3:05 - 3:25)
87 It's an interesting question. At least some traffic lights, though, so some things would have to be physically moved, as well as the content of some changed. Well, indeed, some signs would have to be moved, right? They would be in the wrong place.
88
89 [Speaker 2] (3:26 - 3:27)
90 They'd be in the wrong place.
91
92 [Speaker 1] (3:27 - 3:38)
93 No, the stop signs, you know, a stop sign, which is to your right as you come to a stop would have to be moved. Right. Sorry, to your left.
94
95 [Speaker 2] (3:38 - 3:39)
96 Yeah.
97
98 [Speaker 1] (3:40 - 4:52)
99 I mean, maybe you're just remembering this, because it might well have been you who told me this on the order of 45 or 50 years ago: two particular points about the early days of the systems theory class at MIT, both of them about examination in the systems theory class. One was that the final-year project was to describe, in as much accurate detail as possible, the algorithm implemented by the Tech Square elevators without using an oscilloscope. And the other, whether it was a project or actually an exam question, was to produce the succinct, too-long-didn't-read, three-page outline of what the Swedish government needed to do to do exactly that, because they have.
100
101 [Speaker 2] (4:53 - 4:58)
102 Right, right, right, right. No, these are not, none of this rings a bell.
103
104 [Speaker 1] (4:58 - 5:56)
105 So, okay. Well, yeah, someone else, but exactly. So, so, okay.
106
107 Sorry. But, but so the point of the fact that the, you know, stipulate that we had somehow managed to evolve our traffic system without ever actually asphalting any tarmac. And, and so the paths were just the consequence of our driving.
108
109 That, that none of the, nothing, nothing of the non-human world would need to change. People would find it inconvenient that their steering wheel was on the side that it was on. But the fact that I drove in this country, and so did Catherine drive the Saab that I brought to California for many years is, you know, that's not.
110
111 [Speaker 2] (5:56 - 5:57)
112 Testament to something.
113
114 [Speaker 1] (5:58 - 6:02)
115 Well, it's a testament to something, but it's also, it doesn't, it doesn't falsify the claim.
116
117 [Speaker 2] (6:03 - 6:03)
118 Right.
119
120 [Speaker 1] (6:03 - 6:48)
121 No human artifacts would need to change to adapt to this. But we could articulate this further, right? You have to suppose that, in the absence of any signage or any traffic signals, rotaries had to evolve, because rotaries enable you to have crossroads efficiently without any overt control of flow.
122
123 But no, but hang on a second.
124
125 [Speaker 2] (6:48 - 6:50)
126 The ovaries, I mean, the ovaries, the rotaries.
127
128 [Speaker 1] (6:51 - 6:57)
129 I'm sorry. We won't go there. Strike that from the record, your honor.
130
131 [Speaker 2] (6:59 - 7:04)
132 The rotaries in Sweden have the following property, which is that they're spirals.
133
134 [Speaker 1] (7:07 - 7:17)
135 So you enter and depending on how far you're going, you go further in.
136
137 [Speaker 2] (7:18 - 7:28)
138 That's right. Because that's right. Because if you're going to get off at the farthest exit to the rotary, you move into the center.
139
140 [Speaker 1] (7:29 - 7:36)
141 Yeah. Well, all, all big rotaries in the UK are like that now, but they're, but they're marshaled by lines on the pavement.
142
143 [Speaker 2] (7:37 - 7:48)
144 Right. But you, you, once you got yourself in the right lane, then it dumps you off at a certain exit.
145
146 [Speaker 1] (7:49 - 8:05)
147 No, it doesn't. That's what's interesting because the lines just take you into the center. You have to then cut across the lines that are taking somebody else into the center to get off when you want to get off.
148
149 [Speaker 2] (8:05 - 8:07)
150 Well, that's the opposite of Sweden, I believe.
151
152 [Speaker 1] (8:09 - 8:18)
153 Well, I'm sorry. You have to, you have, you, I, my immediate thought is you can't not have potential crossings.
154
155 [Speaker 2] (8:20 - 8:26)
156 No, you, but, but they happen right when you get in. You, when you get into the rotary, you cross it.
157
158 [Speaker 1] (8:26 - 8:29)
159 As you get in, you're crossing people who are on their way out.
160
161 [Speaker 2] (8:30 - 8:30)
162 Right.
163
164 [Speaker 1] (8:31 - 8:31)
165 You must be.
166
167 [Speaker 2] (8:32 - 8:32)
168 Yes.
169
170 [Speaker 1] (8:32 - 8:55)
171 No, absolutely. So if you're going to put lines on the pavement that indicate priority, they have to give that priority unequivocally at every intersection. And so you have to decide who has priority, people coming in or people going out.
172
173 [Speaker 2] (8:56 - 8:58)
174 Right. And you pretty much have to decide people going out.
175
176 [Speaker 1] (8:59 - 9:03)
177 Because, except I, yeah, I guess. Yes.
178
179 [Speaker 2] (9:03 - 9:07)
180 Otherwise you might end up with a logjam of people in the rotary.
181
182 [Speaker 1] (9:08 - 9:35)
183 There is a rotary that I don't like going on, on the bike, precisely because as a cyclist, when you have to cross the line of priority, you're vulnerable, because people assume you're going slower. They don't even see you. Anyway.
184
185 Anyway. Sorry. I was entertaining ourselves without actually moving ourselves forward.
186
187 [Speaker 2] (9:36 - 10:11)
188 Yeah. So, okay. So I think what I wanted to say by way of summary is, is there a feasible project, God willing and the creek don't rise, to have a shortish version of this story that tells these two stories.
189
190 [Speaker 1] (10:14 - 10:15)
191 We can but try.
192
193 [Speaker 2] (10:16 - 10:52)
194 We can but try. And I think a subsidiary question to that is, can I, and my answer might be no, can I come up with an effable description of why the deixis generates, why physical deixis generates linguistic indexicals?
195
196 That's simple.
197
198 [Speaker 1] (10:53 - 10:55)
199 That's already a better way of putting it. Carry on.
200
201 [Speaker 2] (10:56 - 11:21)
202 Yeah. No, I think, I mean, that's all I would want to say. And is that story essential to the fusion, as it were, that this project is aiming at?
203
204 Those are questions that I'm not going to try to answer now. And I can't do anything until the reflections book is done.
205
206 [Speaker 1] (11:22 - 11:25)
207 Understood. But those are questions we can pick up then.
208
209 [Speaker 2] (11:25 - 11:27)
210 Well, but also there, they can be mulled on.
211
212 [Speaker 1] (11:28 - 11:29)
213 Yeah, absolutely.
214
215 [Speaker 2] (11:30 - 11:30)
216 Before then.
217
218 [Speaker 1] (11:33 - 11:47)
219 And there was something moderately important there that just slipped. What was it? Oh, yes.
220
221 It also occurs to me that there's a, just, I just want to get this on the record and then we can... Yeah, sure.
222
223 [Speaker 2] (11:47 - 11:47)
224 Say goodbye.
225
226 [Speaker 1] (11:48 - 12:48)
227 There's a trick that Feynman pulls in what I think is the best thing that he ever wrote, which is his book about why glass is transparent, which has a title with quantum something in it, but I can't remember now. Doesn't matter. It's only about this.
228
229 It's less than a centimeter thick. If you think of it, give me a note. And what he says in the introduction, a characteristically Feynman introduction, is: the thing I hate about popular science is they don't tell you when they're lying to you.
230
231 [Speaker 2] (12:48 - 12:49)
232 Right.
233
234 [Speaker 1] (12:50 - 13:39)
235 Because they have to lie to you, because to tell you the truth would mean that they'd exceed their word count. Right. Right.
236
237 And newspaper editors are very jealous of, you know, people's word counts. And there is a point in this book at which I'm going to lie to you, and I'm going to tell you when we get there so that you know it. Okay.
238
239 Because the rest of it, I think, people with a reasonable grasp of moderately sophisticated mathematics, to the degree of a high school diploma and maybe a little bit of calculus, can understand. But there's one point at which that's blown. And so I'm going to lie to you, and I'll tell you.
240
241 [Speaker 2] (13:40 - 13:41)
242 That's what Feynman says.
243
244 [Speaker 1] (13:42 - 14:35)
245 That's what Feynman says in the introduction. And he does indeed, at some point in the book, tell you. And the version of that that might apply is to go back to a rhetorical stance which you adopted briefly 20 or 30 years ago.
246
247 And, I think correctly, subsequently abandoned, which is the promissory note, where your version of that was, who needs to accept what promissory notes, you know, when am I uttering promissory notes which I'm not delivering on? A similar phenomenon. I think it's in the preface of the objects book that you articulate a sort of epistemological stance about the necessity of being clear when promissory notes are being uttered.
248
249 [Speaker 2] (14:36 - 14:36)
250 Right.
251
252 [Speaker 1] (14:36 - 15:07)
253 A sort of lighter weight. And even in the sense of Feynman, humor being constantly lurking around the edges. Right.
254
255 That may be the way to get a 50-page draft of the 4,000-page pair of parallel drafts: simply to put square brackets, magic happens here.
256
257 [Speaker 2] (15:08 - 15:08)
258 Right, right.
259
260 [Speaker 1] (15:09 - 15:09)
261 Ad lib.
262
263 [Speaker 2] (15:11 - 15:11)
264 Yeah.
265
266 [Speaker 1] (15:12 - 15:17)
267 And fill them in as time allows.
268
269 [Speaker 2] (15:18 - 15:19)
270 As time allows, yeah.
271
272 [Speaker 1] (15:19 - 15:24)
273 The point, at the very least, is not to run into the sand at those points.
274
275 [Speaker 2] (15:24 - 15:47)
276 Right, right, right. Another way to do it is to put together lecture notes for my class, which I regularly did, and which compressed an enormous amount of whatever. I mean, that was a discipline that came pretty easily, getting in, not bogging down.
277
278 [Speaker 1] (15:48 - 15:49)
279 Yeah. Yeah.
280
281 [Speaker 2] (15:49 - 15:50)
282 So.
283
284 [Speaker 1] (15:51 - 16:01)
285 Yep, we could. Yep. I mean, if, if we have the time to have the lectures to cover the 2,000 pages, then that can be done.
286
287 [Speaker 2] (16:01 - 16:02)
288 That can be done.
289
290 [Speaker 1] (16:04 - 16:21)
291 That can be done. Anyway, best to Jill. And it occurs to me since, since we're, you know, it's, I don't think there's much coming in the way of reasons why this wouldn't work for us.
292
293 Would sometime this weekend for a four-way, half-hour social conversation be possible?
294
295 [Speaker 2] (16:22 - 16:23)
296 Um.
297
298 [Speaker 1] (16:25 - 16:30)
299 After the point at which you're no longer capable of doing useful work on the reflections book one day?
300
301 [Speaker 2] (16:33 - 16:34)
302 Yep. Sorry, I just.
303
304 [Speaker 1] (16:38 - 16:40)
305 And not only possible, but welcome.
306
307 [Speaker 2] (16:41 - 16:41)
308 Yeah.
309
310 [Speaker 1] (16:42 - 16:47)
311 With respect to which you, you, you need to involve Jill and I need to involve Catherine.
312
313 [Speaker 2] (16:50 - 16:52)
314 I think not this weekend.
315
316 [Speaker 1] (16:52 - 16:56)
317 Okay. Well, I can't see further forward than that. So that's okay.
318
319 [Speaker 2] (16:56 - 16:58)
320 Okay. We'll do it soon.
321
322 [Speaker 1] (16:58 - 17:00)
323 Yeah. Let's not lose time. It's interesting.
324
325 [Speaker 2] (17:01 - 17:07)
326 Oh, sorry. No. Yes.
327
328 [Speaker 1] (17:09 - 17:12)
329 Okay. Take care. God bless.
330
331 Best to Jill.
332
333 [Speaker 2] (17:13 - 17:15)
334 And God bless Catherine too.
335
336 [Speaker 1] (17:16 - 17:22)
337 I will. Goodbye. Okay.
338
339 Cheers. I'm going to stop recording and then.
340