Welcome to Righteous Wrath Online Community.
 

Programming languages don't need strings... or even numbers

Started by Darren Dirt, August 13, 2019, 12:44:28 PM


Darren Dirt

August 13, 2019, 12:44:28 PM Last Edit: August 13, 2019, 01:09:50 PM by Darren Dirt
The incomparable Paul Graham: "The Hundred-Year Language" (April 2003)

http://www.paulgraham.com/hundred.html https://archive.is/vzYQX

It took me a few re-reads of certain paragraphs to really "get it". (So does that mean I am now ready for Lisp? ;-) )
[LISP="LISt Processor" again aha I get it now!]
(Or maybe Arc? Holy crap!)

"
It's hard to predict what life will be like in a hundred years. There are only a few things we can say with certainty. We know that everyone will drive flying cars, that zoning laws will be relaxed to allow buildings hundreds of stories tall, that it will be dark most of the time, and that women will all be trained in the martial arts.

Here I want to zoom in on one detail of this picture. What kind of programming language will they use to write the software controlling those flying cars?

The reason I want to know what languages will be like in a hundred years is so that I know what branch of the tree to bet on now.

...strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? ...separate the meaning of a program from the implementation details. Instead of having both lists and strings, have just lists, with some way to give the compiler optimization advice that will allow it to lay out strings as contiguous bytes if necessary.

...How far will this flattening of data structures go? I can think of possibilities that shock even me, with my conscientiously broadened mind. Will we get rid of arrays, for example? After all, they're just a subset of hash tables where the keys are vectors of integers. Will we replace hash tables themselves with lists?

...Logically, you don't need to have a separate notion of numbers, because you can represent them as lists: the integer n could be represented as a list of n elements. You can do math this way. It's just unbearably inefficient.

Could a programming language go so far as to get rid of numbers as a fundamental data type? I ask this not so much as a serious question as a way to play chicken with the future. It's like the hypothetical case of an irresistible force meeting an immovable object-- here, an unimaginably inefficient implementation meeting unimaginably great resources. I don't see why not.

The future is pretty long. If there's something we can do to decrease the number of axioms in the core language, that would seem to be the side to bet on as t approaches infinity. If the idea still seems unbearable in a hundred years, maybe it won't in a thousand.

Just to be clear about this, I'm not proposing that all numerical calculations would actually be carried out using lists. I'm proposing that the core language, prior to any additional notations about implementation, be defined this way. In practice any program that wanted to do any amount of math would probably represent numbers in binary, but this would be an optimization, not part of the core language semantics.
"
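To make the "flattening" idea above concrete, here's a little sketch of my own (not from the essay, and Python rather than Lisp): a "string" is just a list whose elements happen to be characters, and an "array" is just a hash table whose keys are integers. All the names here are made up for illustration.

```python
# Sketch of Graham's flattening idea: fewer core data types.

def make_string(chars):
    # a "string" is just a list of single-character elements
    return list(chars)

def concat(a, b):
    # string concatenation is then just list concatenation
    return a + b

def make_array(values):
    # an "array" is just a hash table keyed by integer indices
    return {i: v for i, v in enumerate(values)}

s = concat(make_string("foo"), make_string("bar"))
arr = make_array([10, 20, 30])
```

In a hundred-year language the compiler would presumably get optimization hints to lay `s` out as contiguous bytes -- but that would be an implementation detail, not part of the core semantics.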
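And the numbers-as-lists idea actually works, exactly as described: represent the integer n as a list of n elements, and arithmetic falls out of list operations. A hedged toy version (my own sketch, names invented):

```python
# Numbers as lists: the integer n is a list of n elements.
# Correct, and -- as Graham says -- unbearably inefficient.

def num(n):
    return [None] * n           # encode: 3 -> [None, None, None]

def val(xs):
    return len(xs)              # decode back to an ordinary int

def add(a, b):
    return a + b                # addition is concatenation

def mul(a, b):
    # multiplication: for each element of a, one copy of b
    return [x for _ in a for x in b]
```

So `val(add(num(2), num(3)))` is 5 and `val(mul(num(2), num(3)))` is 6 -- in linear and quadratic space respectively, which is exactly the kind of cost the essay says a real program would optimize away by representing numbers in binary.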
_____________________

Get better at getting better. Daily.
_____________________

Darren Dirt

Finding essays** like this -- especially 16 years after they were written, which really gets me thinking about my current career path -- has to be brought into balance with being wary of "The Blub Paradox"
http://paulgraham.com/avg.html


**thanks to the article linked in the OP, I even have a different way of thinking of that word "essay" now! Ahhh, the deep-yet-not-pretentious wisdom of https://en.wikipedia.org/wiki/Paul_Graham_%28programmer%29
_____________________

Get better at getting better. Daily.
_____________________