Interesting perspective from Peter Clare of Oracle fame
"The problem w/ this approach philosophically (I think) is that most layered abstractions in real life undergo paradigm shifts as you move from one domain to another. Sure, you can describe cooking via atomic reactions, or even chemistry, but who cares – I guarantee you that a chemical model of cooking won’t likely taste as good as a traditional mixing ingredients w/ a pinch here or there approach.
The same is true for computer paradigms. Every (computer) modeling language is designed to solve a particular set of problems, and will likely be miserably inadequate for problems outside the domain for which this language was intended. Sure, pgmrs can do amazing things w/ languages using techniques that the language designers themselves didn’t foresee – but this is just stretching the limits, not making any fundamental paradigm shift.
I have always tho’t that computer weenies want to play at being God, and that our systems reflect this basic arrogance and prejudice.
Somehow, we feel that this modeling Deus ex Machina is going to solve some implausible problem by inserting the genius of Our-New-Computer-Language-in-God’s-Image into the mix. I am skeptical.
The real world seems rather complex to me and we seem to have varying degrees of comfort and discomfort using many different (internal) systems and models to navigate our way through The Maze. Such is the World.
If we look at the evolution of computer-as-language today, we use a wide variety of linguistic mechanisms ranging from Imperative to Dialogue to point-and-click Exploration to You-Name-It. At the end of the day, most of these linguistic mechanisms are basically proxies for communicating our intent to other people, usually communally through space and time, much as literature communicates intent through space and time.
If we look at the chaos that is the current landscape of ways to communicate our intent via these linguistic automatons, it is pretty easy to see that computer language mechanisms are evolving much the way natural languages evolve – via our ad hoc social systems that determine in a willy nilly way what works and what doesn’t over long periods of time."
Maybe the approach is wrong.
Maybe we still do not know the right way.
It happened with the written language.
I have a theory about it.
The first written languages were made with ideograms (as Chinese and Japanese still are), which reproduced the concept instead of the sound.
The alphabet was the biggest improvement in written language, because with 26 letters (22 in Italian) we can express everything we think and say.
Simplifying the way of writing allowed many more people to access culture, and the other big step was printing.
Nowadays, in many countries basic culture is a matter of wanting more than affording.
The real reinvention of computing will come when languages, instead of using a basic data format (strings), use simpler letters with which it will be universally possible to build a common language understood by all computers.
Simplifying computing will make informatics more a matter of wanting than of affording.
I regret one thing: that I was born too soon...