Oleksandr Kryvonos 2024-06-10 07:00:03

A spreadsheet with the FoC projects

Duncan Cragg 2024-06-10 21:25:26

Go and fill it in with your project, or update one that's there

ender 2024-06-14 21:01:55

I am currently thinking about working on one or more of these ideas:

  • visual pl for creating your own compiler
  • visual pl for spreadsheet creation and customization
  • visual pl for making domain specific languages

I will probably converge on a visual programming language with a spreadsheet GUI that compiles to WebAssembly. Any resources for building intuition about compilers, spreadsheets, or DSLs that would help me understand these topics would be much appreciated.

(think: x explained visually, explorable explanations)

ender 2024-06-14 21:15:46

Also: how does Photoshop work? For example, how would one make a free, open-source Photoshop clone?

I'm particularly thinking of the Smart Object functionality, which is the main reason I use Photoshop

none of the Photoshop clones have this feature, except for Photopea, which is very slow on my computer and not open source

Kartik Agaram 2024-06-15 10:49:53

Some possible goals for the future of software

After some 💬 #linking-together@2024-06-06 discussions, I spent some time searching the archives of this community for the word 'manifesto', then skimming the manifestos I found in search of their goals, phrased as problems they saw in the world. Then I clustered them by these problems. Here's what I ended up with, possible problems we have seen in the past:

  • Programming computers requires a lot of knowledge and effort.
  • Adapting software to ourselves is hard; few people do it.
  • Software is trapped in silos (apps) and can't be recomposed.
  • Software is inefficient and unstable because it's built atop a Jenga-like tower of dependencies.
  • Programmers encourage the world to be profligate with the attention of others.
  • Software has a deep influence on populations without corresponding accountability.
  • Programmers build software atop platforms optimized for consumption rather than creation.
  • Programmers can't build a sustainable living without behaving in anti-social ways hostile to their customers.
  • Programming requires simulating the computer in your head.
  • UIs are poor.
  • It is possible to break a computer's software in such a way that it requires outside intervention (e.g. a rescue disk) to fix.
  • Computers can't model the world.

If the problem you're chasing doesn't quite fit in any of these buckets, please share it in a similar format. (One sentence, not describing a solution.) If it does fit one or more of these buckets, please mention them. (Alternative wordings are also appreciated, but for me the primary goal here is to cluster ourselves.)

Kartik Agaram 2024-06-15 11:13:08

My problem is nothing new, but it seems to span multiple problems on this list:

  • The few can influence the many in anti-social ways, because adapting software to ourselves requires a lot of knowledge and effort, because it's built atop a Jenga-like tower of dependencies.

Writing it out like this exposes assumptions I'm making and points out other possible causes at each point. (Will reducing dependencies really reduce the amount of knowledge programming requires?)

Jase Pellerin 2024-06-15 14:37:57

This is an awesome breakdown, thank you for putting it together!

I'm new to the channel, and it's been tough to get a broad picture of what people are trying to do. This is such an accessible way to dig into the space 👀

Konrad Hinsen 2024-06-16 08:29:02

Thanks Kartik Agaram! This is very helpful.

I recognize my goals as mostly a combination of the points listed, but there is one more aspect which is somewhat related to other points but also distinct: people (users, other developers, ...) have no chance of knowing precisely what a program does if they have not written it themselves. Put differently, expressing software as source code is a form of encryption of the initial developers' intentions.

An obvious illustration is spyware, but the problem exists even in a world without evil-minded players. In my field of work, computational science, it has become impossible to understand someone else's work, because so much of it is documented only by source code.

Maikel van de Lisdonk 2024-06-16 09:15:19

Great effort Kartik Agaram .. I think this list should be put on futureofcoding.org. My goals are mostly related to "Programming requires simulating the computer in your head" and "Programming computers requires a lot of knowledge and effort".

Christopher Shank 2024-06-16 12:23:34

Kartik Agaram love the list! Curious if you could expand on what you mean by “anti-social”?

Kartik Agaram 2024-06-16 12:30:36

Christopher Shank It's kind of a rough attempt at summarizing the contents of the actual manifestos under that bullet. Could you scan them and let me know if it doesn't seem the most apt phrasing?

Christopher Shank 2024-06-16 13:03:37

I first interpreted “anti-social” as saying developers are socially avoidant: the act of programming and distributing software isolates and insulates them from society, so they never understand the impact of the software they are making. But it seems you’re also using “anti-social” to highlight how the incentives of building software largely go against the good of society (e.g. the attention economy), and how developers, the ones making the systems, turn a blind eye in order to make money/a living.

Christopher Shank 2024-06-16 13:07:41

For your own stated goal, I’m curious about some of the ways you’ve seen “the few impact the many”.

Kartik Agaram 2024-06-16 13:19:21

Yes, my reading of the manifestos was more your second interpretation. Going against the good of your users, or of society more broadly. And that fits with my own perspective as well. Examples:

  • When we say, "programming is a priesthood and programmers have power."
  • The new word of 2023, enshittification. For example, how Google search has gotten worse over time.
  • The broadly documented social ills that arise from say Twitter optimizing for engagement.

One additional personal gloss of mine is that such behavior is not just anti-social but also goes against one's own long-term self-interest. I wonder what Larry Page and Sergey Brin use for searching the internet, and how they feel about it. Google's behavior seems anti-long-term to me.

Konrad Hinsen 2024-06-16 15:05:24

Looking at that list again, it's mainly the first four points that overlap significantly with my own itches, in addition to the one I already described and which I'd summarize in a catchy phrase as "It's hard to see what exactly a program does, even given its source code."

These problems do of course overlap with others from the list. For example, the power that software professionals and their employers have over the rest of us derives from their priesthood status, having competences and resources that can only be obtained by becoming a software professional oneself.

Don Abrams 2024-06-16 04:20:02

we keep telling computers how to work instead of what they should do

Personal Dynamic Media 2024-06-16 04:25:40

Prolog and Haskell users might disagree with you. Now if we could only figure out why neither one took over the world...

Konrad Hinsen 2024-06-16 08:39:52

There are quite different expectations about telling computers "what they should do". One is "let me give instructions in plain English, with the computer knowing the context as well as any person I might talk to". The other extreme is "let me write a formal specification, and have the computer derive a provably correct implementation from it".

Prolog and Haskell are in this second camp, but neither has achieved the goal so far. Prolog accepts the formal specification but then solves it for each specific problem instance by trial and error, rather than deriving once and for all a suitable algorithm that works for many inputs. Haskell does nothing more than any other programming language: it cannot do anything with just a specification. But functional code looks more similar to specifications, so there is the illusion of progress.
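
(An illustrative aside, not from the thread: a minimal Haskell sketch of this "specification vs. algorithm" gap. specSort is an executable specification of sorting; Haskell runs it by brute force rather than deriving an efficient algorithm from it, so the "how" still has to be written by hand, as in mergeSort. Both names are made up for the example.)

    import Data.List (permutations)

    -- "What": an executable specification. The result is an ordered
    -- permutation of the input. Haskell runs this exactly as written,
    -- trying permutations until one is ordered (which can take factorial
    -- time); it does not derive an efficient algorithm from the spec.
    specSort :: Ord a => [a] -> [a]
    specSort xs = head [p | p <- permutations xs, ordered p]
      where ordered ys = and (zipWith (<=) ys (drop 1 ys))

    -- "How": the algorithm itself, which the programmer still supplies.
    mergeSort :: Ord a => [a] -> [a]
    mergeSort []  = []
    mergeSort [x] = [x]
    mergeSort xs  = merge (mergeSort front) (mergeSort back)
      where
        (front, back) = splitAt (length xs `div` 2) xs
        merge as [] = as
        merge [] bs = bs
        merge (a:as) (b:bs)
          | a <= b    = a : merge as (b:bs)
          | otherwise = b : merge (a:as) bs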

As for the first version of the goal, it has been the holy grail of AI for a few decades. Recently we have discovered that stating a goal informally entails the risk of the computer filling in the blanks in ways that we don't like.

Alex McLean 2024-06-16 12:14:42

I think using a computer should be more about asking 'what if' than telling it what or how