
Joshua Horowitz 2025-06-12 09:27:39

Two related things:

  • The LIVE Workshop deadline is July 21st, so there’s still plenty of time to get a submission together! Please let me know if you have any questions, or want to talk about a submission.
  • I’ve been cooking up a LIVE Primer. It’s an overview of LIVE-adjacent research – mapping out some territory, sharing some advice, & curating some citations. It’s rough right now, but I think there’s good stuff in there already. Please take a look and let me know what you think. (Especially if there’s something you wish was in there that isn’t; that would be great to know.)

Thanks!

Kartik Agaram 2025-06-12 09:48:56

This is shaping up wonderfully, thank you. Every field should periodically summarize its major advances and open problems like this.

Joshua Horowitz 2025-06-12 09:56:59

Open problems! That’s a good idea. If there’s any of those in the Primer yet, it’s purely accidental – I should add some deliberately. 🙏

Kartik Agaram 2025-06-13 00:58:01

The Primer crystallized something for me. I hope you won't mind that I linked to it. It seemed more important to give credit than to not send people to unfinished drafts.

Joshua Horowitz 2025-06-13 03:40:13

Oh that’s fine; it’s slowly transitioning into publicly-available unfinished-draft status.

Konrad Hinsen 2025-06-13 08:15:32

Thanks Joshua Horowitz and Kartik Agaram for your writeups. I discovered links there that I hadn't thought of before, in particular on visual programming.

Joshua Horowitz classifies it as a subspecies of Live. That's probably true for most projects in this space, but I don't see it as a necessity. The one counterexample I am aware of is Paul Tarvydas' diagram compiler from draw.io to Python.

Kartik Agaram sees visual coding as a special case of domain-specific notations. If that were the case, I'd probably be more interested in the topic. My impression is that most visual coding projects see moving away from text as their main mission, independently of domain considerations.

Something else I hadn't seen before is the link between Live and overcoming the vendor/owner divide. It's hard to imagine a live system with a vendor/owner divide, but the reverse is quite possible (and again, Paul Tarvydas's t2t work is an example).

Joshua Horowitz 2025-06-13 09:45:11

Hmmm are you saying the primer classifies visual programming as a subset of live programming? That would seem to be contradicted by “Is visual programming live programming? Not automatically, no!”

Konrad Hinsen 2025-06-13 11:29:40

Joshua Horowitz Not a subset, but a strong relation. What I found surprising is to see any link at all between visual and live programming. I have always seen them as orthogonal aspects. Your review points out that there is at least a correlation, if you do statistics across programming systems. So now I am wondering where that correlation comes from.

Konrad Hinsen 2025-06-13 08:44:01

I think that Kartik Agaram’s recent devlog post (mentioned in 💬 #thinking-together@2025-06-13) deserves a thread of its own, outside of the discussion of live programming. The topic that interests me in particular is what he calls the vendor/owner divide, which bothers me as well.

More generally, it's a dependency chain from hardware vendor via OS vendor and programming tool vendor up to the owner, end user, or whatever else we'd call the person or team that wants to use computation as a tool for their own goals. Along this chain, everyone has the power to break the work of the people further down the chain, unless there is some counteracting force such as competition between multiple vendors of fungible products.

As somebody at the end of the chain, if I want to preserve my agency, I have basically two choices (plus hybrids): I can be selective in my dependency chain, only accepting dependencies whose vendors I consider friendly and ethically sound. Or I can restrict myself to dependencies that are fungible because they implement standards for which there are other implementations as well.

Out of the two, my preference is for the latter, which is clearly the more robust strategy. Vendors change over time, and even those that promise not to be evil today can drop this promise tomorrow. Vendors or their products can also disappear for lots of reasons. In fact, a vendor that is serious about being ethically sound should signal this attitude by implementing standards, reducing its own power over its clients. Except of course that there are no standards for most software interface layers, and you cannot create one unilaterally either. Nor quickly, because good standards require many design iterations involving multiple vendors and users. Evolving standards is expensive.

There's a third aspect to consider, which is code complexity. For simple enough software, a vendor provides the convenience of a ready-made and tested implementation, but if the vendor disappears or becomes evil, I can maintain the code myself, or convince someone else to do so. That's what early FLOSS advertised as its strength: you can always fork. Except that today's software stacks have grown too complex for this.

[June 12th, 2025 5:58 PM] ak: The Primer crystallized something for me (https://akkartik.name/post/2025-06-12-devlog). I hope you won't mind that I linked to it. It seemed more important to give credit than to not send people to unfinished drafts.

Tom Larkworthy 2025-06-13 09:30:22

I think github.com/tomlarkworthy/lopecode pretty much addresses all the mentioned issues. Happy to hear where it doesn't so I can fix it.

  • built on standards (the web), with multiple vendors
  • plain text serialization format, offline-first (the web but without the network or DNS)
  • The reactive micro-kernel that enables its architecture is FOSS

Everything but the reactive micro-kernel is implemented in userspace, and it is hot-swappable at runtime. The architecture allows modular literate development, so you can look at the exporter in isolation to understand how it serializes, right alongside its prose explanation. You can live-edit all these things at runtime without an external toolchain. You can write your own serializer if you want, or even have a program with two serializers; the runtime is the source of truth.

Furthermore, the things you write can run both on Observable (the commercial closed-source platform with an owner/vendor divide) and on Lopecode standalone (without the owner/vendor divide). The point is: they share the same micro-kernel, and this is the core substrate that enables the transfer.

I am gonna present at LIVE something along the lines of a micro-kernel architecture for malleable software. This field is littered with awesome ideas that work in a single app at the cost of everything else, because people's innovative ideas are tied to the framework they built for that specific purpose. So it looks cool, but then nothing else you are familiar with can be delivered in any sane timeframe, and it's impossible to transfer one cool programming paradigm to another. I hope the micro-kernel architecture can allow much more parallelization, built on a platform foundation that has all the right things baked in.

The point of Lopecode is not the web interface. It's the reactive micro-kernel design that removes the vendor/owner divide like Smalltalk did, but now with the web-without-the-net as the platform runtime.

Konrad Hinsen 2025-06-13 11:43:47

Tom Larkworthy Thanks for describing your project in terms of the keywords of this thread! That's very helpful.

Not having actually used your project, I can only comment from a somewhat theoretical point of view. My points of criticism are then:

  • The kernel being FOSS is irrelevant. The Python community has given a spectacular demonstration of how a FOSS community can do evil to (a part of) their users, with the violent 2->3 transition. So unless there's an alternative implementation, or the kernel is simple enough for you to maintain it, you and the users of your project are dependent on a vendor.
  • For Web, the number of vendors depends on how far into modern Web technologies your dependencies go. If Lopecode works with Dillo or SeaMonkey, it's OK. If it requires a browser derived from Chromium or Firefox, then it's at a high risk of vendor dependency, given the increasing weakness of Firefox.

Point 2 is a very serious problem for many projects building on the Web, including one of my own. I wish I knew how to escape from it.

J. Ryan Stinnett 2025-06-13 12:00:32

Tom Larkworthy Your Lopecode effort is quite impressive and seems to achieve many desirable features I'd personally like to see in a programming substrate (though I still need to explore it in more detail). 😄

I do want to push back slightly on your framing here though, where you mention many people's ideas are tied to their framework. I agree that does happen and it is a risk, but isn't your own kernel also a new framework things would need to be adapted to...? I am not sure this particular complaint you have about other projects holds up, since it would seem to apply to your project (and probably most others I can imagine), so I would suggest not focusing so much on that point.

Kartik Agaram 2025-06-13 12:48:24

I need to give Lopecode another whirl.. 🤔

Just one point until I do so: I didn't intend to suggest we need to eliminate all vendors. Rather I'm looking for a certain "enlightenment" among both vendors and owners. Owners should want to minimize vendors, but not to 0. The sweet spot is probably 2 or 3 vendors. I consider it a feature that I ask people to install LÖVE for themselves rather than bundling it with my apps. That's my attempt as a vendor to broaden the horizons of computer owners beyond pure short term convenience.

Konrad Hinsen 2025-06-13 15:34:11

Also, you make it clear that LÖVE comes from another vendor, so it's outside your (moral) responsibility.

And now I see why your sweet spot is 2 or 3. LÖVE plus Lua Carousel makes for two 😜

Konrad Hinsen 2025-06-13 15:39:42

Zero-vendor is unrealistic anyway, at least today. It would be interesting to estimate the vendor number for some popular environments. But then, the number is perhaps not the most relevant factor. The risk of the dependency depends on the size and status of the vendor. Google, for example, is a pretty bad one. They are so big that they can safely ignore most of their users, and they have demonstrated repeatedly that they are happy to do so.

Tom Larkworthy 2025-06-13 17:05:04

Thanks for the feedback. I am grateful, as it's been hard getting people to say what is on their minds.

The Python community has given a spectacular demonstration of how a FOSS community can do evil to (a part of) their users, with the violent 2->3 transition

You might like the Lopecode vision. It's half ChatGPT-generated and very cringe because it was written in a hurry, but a massive motivation for bundling all dependencies inside a single file is to prevent breaking working software. The title is "Designing Immortal Software". If it works on your machine today, it will work in 10 years, provided web standards do not break backwards compatibility. The browser is the only runtime dependency, and all assets and code are inline inside that file, so you do not need a local webserver or a network connection to open that file. You do not need 3rd-party software to make code modifications, because the build tooling is inline too. There is no "login". If you have the file, you own the software and have the toolchain to modify it forever. If I break backwards compatibility, it is 100% decoupled from the file you own and therefore cannot affect it. I can still run the files I exported in November despite the system changing massively. I picked the web as a dependency because it was one of the most backwards-compatible technologies with multiple vendors I could think of, outside of the win32 API. I hear what you are saying and 100% agree; Lopecode is my answer to that very thing.
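The bundling idea Tom describes can be sketched in miniature (an illustrative toy, not Lopecode's actual exporter; all names are invented): everything the program needs – code and assets – is inlined into one HTML string, so the exported file opens with nothing but a browser.

```python
import base64

def export_single_file(title, js_source, png_bytes):
    """Return a standalone HTML document with all code and assets inlined."""
    # Binary assets become data URIs, so no file or network fetch is needed.
    icon = base64.b64encode(png_bytes).decode("ascii")
    return f"""<!DOCTYPE html>
<html>
<head><title>{title}</title></head>
<body>
<img src="data:image/png;base64,{icon}">
<script type="module">
{js_source}
</script>
</body>
</html>"""

doc = export_single_file("notebook", "console.log('hello');", b"\x89PNG")
```

If you have the resulting file, you have the whole program; no server, login, or package registry is involved in opening or keeping it.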

Lopecode supports Firefox, Safari and Chrome, and is tested on Zen too. It can't support SeaMonkey, as it requires JavaScript modules. But Lopecode is still standards-based, just on the modern web standards that came after SeaMonkey.

Ryan

but isn't your own kernel also a new framework things would need to be adapted to...?

yeah, I was thinking "am I a hypocrite?" as I was typing it. I think no, because I did not develop the kernel. Observable did; I am not affiliated. I am a fan, and I want to take that venture-funded, MIT-licensed reactive spreadsheet engine and put it in a different context that allows more malleability and userspace interaction. Code written on Observable works on Lopecode, so there is already a ton of code that needs no adaptation.

I am not writing my personal vision of a kernel. Reactive runtimes are hard, and Observable's is battle-tested and made by Mike Bostock, so it's better than anything I could make. What I have done is write a ton of userspace projections of that runtime state that let you interact with it like an IDE and save it as a file, but those are userspace and have no elevated status. They are libraries, not frameworks; I think that's an important distinction. The runtime is the frameworky bit, but if you want to do a reactive programming environment you will need something like that, so I think it's healthier that I outsource it: it means you and I are equals (I can't change the runtime even if I wanted to!), plus it's actually good, and I do not burn out wasting my time writing one. It's MIT-licensed, so nobody has a problem with us using it.

Generally, I think we need to expand our horizons beyond the application to include the toolchain and IDE. That is the programming system, and I don't want any of it breaking unexpectedly; I want all of it malleable and under our control.

Kartik Agaram 2025-06-13 17:55:38

Just one more response, then I'd rather hear more about Lopecode.

Also, you make it clear that LÖVE comes from another vendor, so it's out of your (moral) responsibility.

And now I see why your sweet spot is 2 or 3. LÖVE plus Lua Carousel makes for two 😜

Zero-vendor is unrealistic anyway, at least today.

Yes, I was very much thinking about this when I embarked on my LÖVE adventures. Mu required a single vendor -- me -- and I had this epiphany where I went from absolutely wanting that, to realizing it was in fact a huge problem. The issue is not moral responsibility; I think there's no escaping that if we put out products we want others to use. No, the issue is that the smart computer owners I want to attract -- the ones who care about reliability and security and privacy -- they are not going to be willing to trust just li'l ol' me with their entire computer, when I have no track record. I wouldn't trust me. It just seems good sense. I wouldn't go to zero-vendor even if it was realistic.

Similarly, the fact that Lopecode requires a browser as one additional dependency feels squarely in the bullseye from that perspective. Tom's thinking regarding Mike Bostock exactly mirrors mine regarding the Lua and LÖVE developers. I think I'm still concerned about the exposure to browser vendors. But I'm very happy that we now get to run this very rigorous experiment. We have a single-file stack atop a modern browser. How long can we keep it running, what are the overheads? 🍿

Konrad Hinsen 2025-06-14 08:19:39

Yes Tom Larkworthy, I very much like your vision statement! I wasn't aware that you bundle so much in the exported files. That looks very good indeed for long-term preservation.

As for the browser dependency, what I'd really like to have is a browser one step above Dillo and SeaMonkey: HTML plus a JavaScript engine, but none of the more complex recent additions to the Web stack. That would be a nice target for durable software. Maybe also add Web Assembly, but I already hesitate about that one because its long-term stability cannot be assessed at this time.

Konrad Hinsen 2025-06-14 08:23:19

Kartik Agaram I very much agree that you want to delegate low-level stuff to people who are good at it. So yes, you want some vendor below your work. It's just that in my ideal world, those dependencies would be multi-implementation standards. Which the Web still is, but LÖVE is not (yet?). On the other hand, LÖVE is much simpler, which is also worth something.

Maybe I'll try to put together an evaluation sheet for software dependencies.

Kartik Agaram 2025-06-14 08:31:30

I think I might be closer here to Tom Larkworthy than to you, Konrad Hinsen. The trouble with browsers is not that they're too complex, and I don't consider standards to be particularly helpful. What I try to focus on is the people and entities and incentives rather than the tools or standards. I can do X right now. Do I trust the people involved to permit me to continue doing X indefinitely? My bet is that LÖVE will outlive the browser in terms of not breaking current functionality, but I'm also glad to see someone taking the other side of that bet. It's good for the ecosystem to hedge our bets.

I think SBCL will live as long as LÖVE. Wildly different thrusts and weights, but roughly equivalent thrust/weight ratios. However, I'm not convinced people will care about interop between SBCL and other *CLs over a period of decades even if they care now (something I'm also skeptical of). Standards are susceptible to embrace-extend-extinguish, so I consider them irrelevant. All I care about is concrete counterparties.

Kartik Agaram 2025-06-14 08:33:56

Maybe I'll try to put together an evaluation sheet for software dependencies.

This would be wildly helpful!

Kartik Agaram 2025-06-14 08:39:08

Another way to put it: In the immature tech landscape we find ourselves in today, standards are helpful early in the life cycle, but not once people converge on implementations. I think society (regulation, enforcement, etc.) will have to mature a lot more before we get to standards with the staying power of standards in the world of atoms. Like, maybe in a hundred years. Or a thousand.

Stefan Lesser 2025-06-14 11:37:22

Reading this thread, three words come to (my) mind: standards, protocols, and patterns.

I only have a rough intuition that those three are similar but different and what the differences between them might be. If I had to conjure up some kind of description today, I’d say:

Standards feel like enforced hardness to me; we put into rules what we think is best to create robustness. They are mostly analytic and propositional. Not everyone agrees, and sometimes we need to break rules to get somewhere better. Hard in the heavy and rigid way: if it breaks, it shatters.

Protocols feel very close to standards, until you consider that agreements born out of pragmatism are protocols as well (what we may call de facto standards). Their hardness emerges at least partially out of practice and experience.

Patterns feel more like some abstract ingredient for both standards and protocols: some repeating structure that grabs our attention, perhaps because we feel its significance, even though we can’t necessarily explain it. Our subconscious bubbles them up into our consciousness.

It feels like trust becomes less and less important the deeper you go in this hierarchy.

What does this have to do with this thread?

Not sure. Felt relevant to me. Can’t describe why…

Konrad Hinsen 2025-06-14 13:35:11

Kartik Agaram My view of standards is shaped by my experience with scientific computing in the 1990s. That was a time of significant diversity and innovation in hardware, including processors: MIPS, PowerPC, DEC Alpha, etc. Every machine came with its own compilers (mostly Fortran for us scientists), and language standards were crucial for code portability. Few people were willing to commit to one hardware/compiler platform, as they knew it might disappear a few years later. It was common practice to test one's code on different machines, and platform vendors advertised standards compliance. The narrow waist of the time was standard Fortran, not x86 machine code.

Konrad Hinsen 2025-06-14 13:39:05

With that background, I see standards as a commitment not to lock buyers into a piece of hardware or software. It's a much stronger commitment than saying "don't be evil".

Konrad Hinsen 2025-06-14 13:42:07

Common Lisp is no different; it started as a consensus language to let people port code across the various hardware/software combinations available at the time for running Lisp. Today, SBCL may be the most popular compiler, but there are non-negligible competitors (mostly ECL, but also LispWorks on the commercial side). I test all my published code with both SBCL and ECL, and I know that others do the same. It's my contribution to keeping the standard as the agreed-upon interface for everyone, even though in deployment I use SBCL exclusively.

Konrad Hinsen 2025-06-14 13:46:39

There is an indirect impact of standards that matters perhaps at least as much, though it's hard to formalize. Writing to a standard creates a permanent awareness of the importance of stable interfaces. In my anecdotal experience (I'd love to see a scientific study!), libraries for standardized languages (C, Fortran, Lisp, ...) are much more stable over time than libraries for non-standardized languages (e.g. Python). Library authors (myself included) are aware that their API can be just as critical as the language itself for people working downstream.

Konrad Hinsen 2025-06-14 13:53:18

Standards are susceptible to embrace-extend-extinguish, so I consider them irrelevant.

That's what happens when one implementation gets so big that it can afford to break its original commitment. I guess there is no protection against that, other than antitrust laws.

Kartik Agaram 2025-06-14 13:56:21

Yeah. Your response makes me realize I was over-stating my case. It's not that standards aren't ever useful. It's more a question of what's your threat model. They were very relevant in the 70s and 80s and are less so now, but may become more so in future. It's important to understand when they're relevant to really navigate these choices confidently.

Konrad Hinsen 2025-06-15 07:51:33

Threat model, or perhaps better risk model, is indeed a good keyword here. To keep in mind for my dependency evaluation sheet.
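Konrad's evaluation sheet doesn't exist yet, but one possible shape for it can be sketched from the criteria raised in this thread (the criteria, weights, and both example ratings below are invented for illustration):

```python
# Invented criteria and weights for scoring a software dependency's risk,
# drawn from the thread: fungibility, track record, forkability, incentives.
CRITERIA = {
    "independent_implementations": 3,  # fungibility via standards
    "vendor_track_record": 2,          # history of backwards compatibility
    "simple_enough_to_fork": 2,        # could I maintain it myself?
    "vendor_incentive_alignment": 1,   # does the vendor need me as a user?
}

def safety_score(ratings):
    """Weighted score in [0, 1]; higher means a safer dependency.
    `ratings` maps each criterion to a value in [0, 1]."""
    total = sum(CRITERIA.values())
    return sum(CRITERIA[c] * ratings.get(c, 0.0) for c in CRITERIA) / total

# Two hypothetical ratings, loosely echoing the LÖVE-vs-web discussion above.
love = safety_score({
    "independent_implementations": 0.0,  # single implementation
    "vendor_track_record": 0.8,
    "simple_enough_to_fork": 0.9,        # small enough to maintain
    "vendor_incentive_alignment": 0.9,
})
web = safety_score({
    "independent_implementations": 0.7,  # multiple browser engines
    "vendor_track_record": 0.9,
    "simple_enough_to_fork": 0.1,        # far too complex to fork
    "vendor_incentive_alignment": 0.4,
})
```

The interesting property of any such sheet is that the criteria pull in different directions, as the LÖVE/browser comparison in this thread shows: simplicity and multi-implementation fungibility rarely come together.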

Jason Morris 2025-06-15 20:52:40

Weakly held, very strongly stated, for your consideration: There are no dependencies between software components of note that are not also dependencies between people. Taken in that context, the idea that those people should be obliged to continue doing what they have done before because they did it ever, and/or the idea that the risk that they might change their behaviour justifies interacting with as few people as possible, is absurdly isolationist, and categorically self-defeating. Avoiding software dependencies (dependencies on people who write and maintain that code, OS, language, etc.) because they might disappoint you discards the greatest advantages of being social animals, and is so rarely seen because it is a fundamentally unsustainable approach that will always be outpaced by maximizing cooperation. If you find yourself, e.g., so frustrated by version changes that you would rather not use Python at all, that is a maladaptive trait borne either of (deeply privileged) ideology, or a sort of neuro-divergence. It is also inherently self-contradicting, except for single-user software or software in an unchanging environment, because if by virtue of avoiding dependencies you create something that is meaningfully better, and share it, you have created the next dependency that other people should, by the same logic, avoid.

Kartik Agaram 2025-06-16 01:48:55

Jason Morris That is so far from my belief system that I'm torn between asking a bunch of staccato clarifying questions and speculatively trying to take aim at potential root causes of the divergence. Breadth-first or depth-first? The narrowness of the communication channel has seldom felt so stifling. Let me try depth-first.

I don't think depending on 1 package == connecting with a single person or even a single team/project.

I don't think minimizing dependencies in the technical sense == minimizing the number of connections with people.

The basic problem I see is that in code a single bad line in a computer can destroy the good intentions of a million people. So all software becomes primarily this act of curation. The curator has enormous power/responsibility. So as a shorthand I focus on the top-level curator and ignore the people whose life/work is being curated, who are also all curating in their own right at all levels of the curation tree. The top-level curator has to be extremely coherent. Either a single person or a single point of certification by a larger team that makes atomic go/no-go decisions.

Does this seem like new information at all, or do I seem to just be repeating myself?

Jason Morris 2025-06-16 02:48:40

My thoughts on the matter are shallow, so don't expect any big fish. I understood your second paragraph to be asserting the analogy is poor, and your third to be attempting to show why, but I don't understand the third paragraph at all, so I don't know if you are repeating yourself or not. For just one example, it seems self-evident to me that not all software is an act of curation. Curation is careful selection. Most software is built with what's lying about. I don't know who the top-level curator is, or where all these powers, responsibilities, and duties of coherence come from. Also, I'm not sure it's an analogy, so much as a reframing. A dependency does not have a 1:1 relationship with people, but the relationship is monotonic, so I'm not sure that fact helps explain why something that seems true about people generally would be so untrue about people who write code. If there is a deep disagreement, it might be that I do not believe software == important, necessarily, and I'm disinterested in focusing on where that equivalence holds, because that's not where the learners are? I'm guessing. I genuinely don't know.

Kartik Agaram 2025-06-16 03:24:57

Interesting. Yeah, I think we have some basic disagreement. Agreed, not worth debating but may be worth trying to lay them out:

  • I believe software == important. Eating the world, etc. India where I have roots is quite heavily bureaucratic and extremely mediated by software. We're awash in scams, and scam-resistance imposes UX burdens for the elderly in particular. My medical records require a computer. I'm a minimalist in atoms but a packrat in bytes. Also philosophically, I think I have the privilege to assume my thoughts are much more valuable than my physical possessions. So I'm much more concerned about electronic security than about physical security. More circumstantial evidence: 30 years ago people thought security meant not writing down your password anywhere. Now we know it's far more important to choose a long password even if you then write it down. The critical threat is someone brute-forcing the bits, not getting access to the atoms.
  • A dependency does not have a monotonic relationship with people. Notice that a single line in Gemfile or package.json can expand into 100 or 1000 lines in Gemfile.lock or package-lock.json respectively depending on the precise package you choose. So even if you start with the assumption that they are, you're forced to conclude they are not. Proof by contradiction.
  • Personally I take curation extremely seriously.
  • Even if you don't believe me, or you think I'm a real outlier, what I mean by "curation is important" is that the choices every top-level curator (software package) makes are extremely consequential for computer owners (to the extent software == important above). Perhaps you'd use a different word for it than "curation". I'm not attached to the word. But it seems incontrovertible that this is true of all software.
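Kartik's lockfile observation can be made concrete with a toy resolver (all package names invented): one declared dependency expands into its full transitive closure, which is why the count of declared lines says little about how many parties you actually depend on.

```python
# Hypothetical dependency graph: declaring only "framework" (one Gemfile /
# package.json line) pulls in every transitive package below it, the way
# one declared line expands into many Gemfile.lock entries.
DEPS = {
    "framework": ["http", "templates", "orm"],
    "http": ["sockets", "tls"],
    "templates": ["parser"],
    "orm": ["sql", "parser"],
}

def lock(declared):
    """Return the full transitive-dependency set for one declared package."""
    resolved = set()
    stack = [declared]
    while stack:
        pkg = stack.pop()
        if pkg in resolved:
            continue  # already pinned; shared deps are resolved once
        resolved.add(pkg)
        stack.extend(DEPS.get(pkg, []))
    return resolved

print(sorted(lock("framework")))
```

One declared line here resolves to eight pinned packages, and which eight depends entirely on the package chosen, so the declared-to-resolved mapping is not monotonic in any useful sense.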

As a quick attempt to clarify my third paragraph above, consider the XZ Utils backdoor. Someone tried to change a minuscule number of lines on roughly every computer on the planet in order to flip them from net positive to net negative for their owners. Again, I feel like I'm liable to be saying obvious things, so I'll stop there, but we can chat at greater length about it as needed.

Tom Larkworthy 2025-06-16 06:37:21

I think networked collaboration is extremely important for being able to scale up and supply the world's software needs. I do not think it needs to be realtime, which I think is why curation is important, e.g. package managers, pinned dependencies, git workflow, etc. That said, when we say "end user programming" or "long tail creators", we are now talking about programs that probably are not very collaborative or worth "upstreaming". I imagine very personalized software has less generality in it, inherently. So in this area I think we need more stability (it's extremely frustrating when working personal software breaks), and the need for networked development is less (the software is not intended to supply a solution to the world). So I think considering the audience for the software changes the development tradeoffs a lot.

Paul Tarvydas 2025-06-15 18:48:29

I created a document repo on GitHub and a channel on the programming simplicity Discord for anyone interested in discussing and adding ideas...

From a Substack article brainstorming SCP:

We explored how this principle led us from single-machine programming to *Solution Centric Programming* (SCP), which treats hundreds of small computing devices (Arduinos, sensors, actuators) as *new atomic operations* for automating specific problems, requiring *new recipe techniques* for combining them. Unlike traditional programming that forces all code through one paradigm, SCP enables *computational diversity* by letting each distributed node use the most appropriate programming paradigm (Forth for real-time control, Prolog for logic, FP for data processing, OOP for state management) as specialized atomic operations, while connecting them through pure data flow rather than restrictive function calls that impose control-flow protocols. The key architectural insight is *Solution Centric Program Choreography* – a hierarchical tree structure where parent nodes contain the recipe logic for coordinating child atomic operations, eliminating peer-to-peer coupling that destroys scalability. This creates a new abstraction layer where solutions are choreographed through structured data flow between specialized atomic operations, each autonomous in its execution but coordinated through hierarchical recipes rather than lateral negotiation – representing the next evolutionary step in programming's fundamental cycle of creating atoms and recipes.


📝 Solution Centric Programming

2025-06-12