When all you have is a pair of bolt cutters and a bottle of vodka, everything looks like the lock on the door of Wolf Blitzer's boathouse. (Replace that with a hammer and a nail if you don't read xkcd)
I currently program Clojure, Python, Java and PHP, so I am familiar with the C and Lisp syntax as well as the whitespace thing. I know imperative, functional, immutable, OOP and a couple of type systems and other things. Now I want more!
What are languages that take a different approach and would be useful for either practical tool choosing or theoretical understanding?
I don't feel like learning another functional language (Haskell) or another imperative OOP language (Ruby), nor do I want to practice impractical fun languages like Brainfuck.
One very interesting thing I found myself is the family of homoiconic, stack-based languages like Factor.
Only once I feel I understand most of the concepts and have answers to all my questions do I want to start thinking about a toy language of my own that contains all my personal preferences.
Matters of practicality are highly subjective, so I will simply say that learning different language paradigms will only serve to make you a better programmer. What is more practical than that?
Functional, Haskell - I know you said that you didn't want to, but you should really really reconsider. You've gotten some functional exposure with Clojure and even Python, but you've not experienced it to its fullest without Haskell. If you're really against Haskell then good compromises are either ML or OCaml.
Declarative, Datalog - Many people would recommend Prolog in this slot, but I think Datalog is a cleaner example of a declarative language.
Array, J - I've only just discovered J, but I find it to be a stunning language. It will twist your mind into a pretzel. You will thank J for that.
Stack, Factor/Forth - Factor is very powerful and I plan to dig into it ASAP. Forth is the grand-daddy of the Stack languages, and as an added bonus it's simple to implement yourself. There is something to be said about learning through implementation.
Dataflow, Oz - I think the influence of Oz is on the upswing and will only continue to grow in the future.
Prototype-based, JavaScript / Io / Self - Self is the grand-daddy and highly influential on every prototype-based language. This is not the same as class-based OOP and shouldn't be treated as such. Many people come to a prototype language and create an ad-hoc class system, but if your goal is to expand your mind, then I think that is a mistake. Use the language to its full capacity. Read Organizing Programs without Classes for ideas.
Expert System, CLIPS - I always recommend this. If you know Prolog then you will likely have the upper-hand in getting up to speed, but it's a very different language.
Frink - Frink is a general purpose language, but it's famous for its system of unit conversions. I find this language to be very inspiring in its unrelenting drive to be the best at what it does. Plus... it's really fun!
Functional+Optional Types, Qi - You say you have experience with some type systems, but do you have experience with "skinnable" type systems? No one does... but they should. Qi is like Lisp in many ways, but its type system will blow your mind.
Actors+Fault-tolerance, Erlang - Erlang's process model gets a lot of the buzz, but its fault-tolerance and hot-code-swapping mechanisms are game-changing. You will not learn much about FP that you wouldn't learn with Clojure, but its FT features will make you wonder why more languages can't seem to get this right.
Enjoy!
What about Prolog (for unification/backtracking etc), Smalltalk (for "everything's a message"), Forth (reverse polish, threaded interpreters etc), Scheme (continuations)?
Not a language, but the Art of the Metaobject Protocol is mind-bending stuff
I second Haskell. Don't think "I know a Lisp, so I know functional programming". Ever heard of type classes? Algebraic data types? Monads? "Modern" (more or less - at least not 50 years old ;) ) functional languages, especially Haskell, have explored a plethora of very powerful and useful new concepts. Type classes add ad-hoc polymorphism, but type inference (yet another thing the languages you already know don't have) works like a charm. Algebraic data types are simply awesome, especially for modelling tree-like data structures, but work fine for enums or simple records, too. And monads... well, let's just say people use them to build exceptions, I/O, parsers, list comprehensions and much more - in purely functional ways!
Also, the whole topic is deep enough to keep one busy for years ;)
I currently program Clojure, Python, Java and PHP [...] What are languages that take a different approach and would be useful for either practical tool choosing or theoretical understanding?
C
There's a lot of C code lying around; it's definitely practical. If you learn C++ too, there's a lot more code around (and the leap is short once you know C and Java).
It also gives you (or forces you to have) a great understanding of some theoretical issues; for instance, each running program lives, in some sense, in one big byte array (4 GB of address space on a 32-bit system). Pointers in C are really just indices into this array; they're just a different kind of integer. It's no different in Java, Python or PHP, except that it's hidden beneath a surface layer.
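To make the "pointers are just integers" point concrete, here is a minimal sketch in C (a hypothetical toy example, not taken from any particular codebase):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int values[4] = {10, 20, 30, 40};
        int *p = &values[0];

        /* Underneath, a pointer is just an address: an integer-sized
         * index into the process's address space. */
        printf("address as integer: %lu\n", (unsigned long)(uintptr_t)p);

        /* Pointer arithmetic is index arithmetic: p + 2 means "two ints
         * further along", i.e. the address plus 2 * sizeof(int) bytes. */
        printf("*(p + 2) = %d\n", *(p + 2));  /* prints 30 */
        printf("p[2]     = %d\n", p[2]);      /* exactly the same thing */
        return 0;
    }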
Also, you can write object-oriented code in C; you just have to be a bit manual about vtables and such. Simon Tatham's Portable Puzzle Collection is a great example of fairly accessible object-oriented C code; it's also fairly well designed and well worth a read for a beginner/intermediate C programmer. This is what happens in Haskell too: type classes are in some sense "just another vtable".
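To show what "a bit manual about vtables" can look like, here is a minimal, hypothetical sketch (it is not taken from the Puzzle Collection): the shared operations live in a struct of function pointers, and every object carries a pointer to its table.

    #include <stdio.h>

    /* The "vtable": the set of operations every shape must provide. */
    typedef struct Shape Shape;
    typedef struct {
        double (*area)(const Shape *self);
        const char *name;
    } ShapeVTable;

    /* The "base class": just a pointer to its vtable. */
    struct Shape {
        const ShapeVTable *vt;
    };

    /* A "subclass": embeds the base as its first member. */
    typedef struct {
        Shape base;
        double radius;
    } Circle;

    static double circle_area(const Shape *self) {
        const Circle *c = (const Circle *)self;  /* safe: base is the first member */
        return 3.14159265358979 * c->radius * c->radius;
    }

    static const ShapeVTable circle_vt = { circle_area, "circle" };

    int main(void) {
        Circle c = { { &circle_vt }, 2.0 };
        Shape *s = &c.base;
        /* A "virtual call": dispatch through the object's vtable. */
        printf("%s area = %f\n", s->vt->name, s->vt->area(s));
        return 0;
    }

Adding a Square is a matter of writing square_area and a square_vt; the call site doesn't change, which is exactly the property a class-based language gives you for free.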
Another great thing about C: engaging in Q&A with skilled C programmers will get you a lot of answers that explain C in terms of lower-level constructs, which builds your closer-to-the-iron knowledge base.
I may be missing OP's point---I think I am, judging by the other answers---but I think it might be a useful answer to other people who have a similar question and read this thread.
From Peter Norvig's site:
"Learn at least a half dozen programming languages. Include one language that supports class abstractions (like Java or C++), one that supports functional abstraction (like Lisp or ML), one that supports syntactic abstraction (like Lisp), one that supports declarative specifications (like Prolog or C++ templates), one that supports coroutines (like Icon or Scheme), and one that supports parallelism (like Sisal). "
http://norvig.com/21-days.html
I'm amazed that after 6 months and hundreds of votes, no one has mentioned SQL ...
In the types-as-theorems / advanced type systems category: Coq (I think Agda comes in this category too).
Coq is a proof assistant embedded into a functional programming language.
You can write mathematical proofs and Coq helps you build a solution.
You can write functions and prove properties about them.
It has dependent types, that alone blew my mind. A simple example:
concatenate: forall (A:Set)(n m:nat), (array A m)->(array A n)->(array A (n+m))
is the signature of a function that concatenates two arrays of size n and m of elements of A and returns an array of size (n+m). It won't compile if the function doesn't return that!
It is based on the calculus of inductive constructions, and it has a solid theory behind it.
I'm not smart enough to understand it all, but I think it is worth taking a look, especially if you tend towards type theory.
EDIT: I need to mention: you write a function in Coq and then you can PROVE it is correct for any input. That is amazing!
One language I am interested in because it has a very different point of view (including a new vocabulary for defining the language elements and a radically different syntax) is J. Haskell would be the obvious choice for me, even though it is another functional language, because its type system and other unique features open your mind and make you rethink your previous knowledge of (functional) programming.
Just as fogus suggested in his list, I too advise you to look at the language OzML/Mozart.
It supports many paradigms, and is mainly targeted at concurrency/multi-agent programming.
Concerning concurrency and distributed computation, the counterpart of the lambda calculus (which is behind functional programming) is the pi calculus.
I have only just begun to look at some implementations of the pi calculus, but they have already broadened my conception of computing.
Pict
Nomadic Pict
FunLoft. (this one is pretty recent, conceived at INRIA)
Dataflow programming, a.k.a. flow-based programming, is a good next step on the road. Some buzzwords: parallel processing, rapid prototyping, visual programming (not as bad as it sounds at first).
Wikipedia's articles are good:
In computer science, flow-based programming (FBP) is a programming paradigm that defines applications as networks of "black box" processes, which exchange data across predefined connections by message passing, where the connections are specified externally to the processes. These black box processes can be reconnected endlessly to form different applications without having to be changed internally. FBP is thus naturally component-oriented.
http://en.wikipedia.org/wiki/Flow-based_programming
http://en.wikipedia.org/wiki/Dataflow_programming
http://en.wikipedia.org/wiki/Actor_model
Read JPM's book: http://jpaulmorrison.com/fbp/
(We've written a simple implementation in C++ for home automation purposes, and we're very happy with it. Documentation is under construction.)
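To make the quoted definition concrete, here is a minimal, hypothetical sketch in plain C (it is not the home-automation implementation mentioned above, and for simplicity the components run one after another rather than concurrently): the components only read from and write to connections, and the wiring between them lives entirely in main().

    #include <stdio.h>
    #include <stdbool.h>

    #define CAPACITY 8

    typedef struct {              /* a "connection": a bounded FIFO of ints */
        int items[CAPACITY];
        int head, tail, count;
    } Connection;

    static bool conn_put(Connection *c, int v) {
        if (c->count == CAPACITY) return false;
        c->items[c->tail] = v;
        c->tail = (c->tail + 1) % CAPACITY;
        c->count++;
        return true;
    }

    static bool conn_get(Connection *c, int *v) {
        if (c->count == 0) return false;
        *v = c->items[c->head];
        c->head = (c->head + 1) % CAPACITY;
        c->count--;
        return true;
    }

    /* Component 1: emits the numbers 1..n onto its output connection. */
    static void generator(Connection *out, int n) {
        for (int i = 1; i <= n; i++)
            conn_put(out, i);
    }

    /* Component 2: reads values, doubles them, forwards them. */
    static void doubler(Connection *in, Connection *out) {
        int v;
        while (conn_get(in, &v))
            conn_put(out, 2 * v);
    }

    /* Component 3: prints whatever arrives. */
    static void printer(Connection *in) {
        int v;
        while (conn_get(in, &v))
            printf("%d\n", v);
    }

    int main(void) {
        /* The network (the wiring) lives entirely outside the components,
         * so the same black boxes could be reconnected differently. */
        Connection a = {{0}, 0, 0, 0}, b = {{0}, 0, 0, 0};
        generator(&a, 5);
        doubler(&a, &b);
        printer(&b);
        return 0;
    }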
You've learned a lot of languages. Now is the time to focus on one language, and master it.
Perhaps you might want to try LabVIEW for its visual programming, although it's aimed at engineering purposes.
Nevertheless, you seem pretty interested in everything that's out there, hence the suggestion.
Also, you could try the Android App Inventor for visually building stuff.
Bruce A. Tate, taking a page from The Pragmatic Programmer wrote a book on exactly that:
Seven Languages in Seven Weeks: A Pragmatic Guide to Learning Programming Languages
In the book, he covers Clojure, Haskell, Io, Prolog, Scala, Erlang, and Ruby.
Mercury: http://www.mercury.csse.unimelb.edu.au/
It's a typed Prolog, with uniqueness types and modes (i.e. specifying that the predicate append(X,Y,Z) meaning X appended to Y is Z yields one Z given an X and Y, but can yield multiple X/Ys for a given Z). Also, no cut or other extra-logical predicates.
If you will, it's to Prolog as Haskell is to Lisp.
Programming languages alone do not cover everything a programmer has to deal with.
New things are always interesting, but there is some very cool old stuff too.
The first database system for me was dBase III; I spent about a month writing small examples (dBase/FoxPro/Clipper is a table-based DB with indexes). Then, at my first workplace, I met MUMPS, and I got a headache. I was young and fresh-brained, but it still took two weeks to understand the MUMPS database model. Then there was a moment, like in the comics: after two weeks a switch flipped, and the bulb lit up in my mind. MUMPS is natural, low level, and very, very fast. (It's an unbalanced, unformalized B-tree without types.) Today's trends show the way back to it: NoSQL, key-value DBs, multidimensional DBs; only a few steps are left, and we reach MUMPS.
Here's a presentation about MUMPS's advantages: http://www.slideshare.net/george.james/mumps-the-internet-scale-database-presentation
A short doc on hierarchical db: http://www.cs.pitt.edu/~chang/156/14hier.html
An introduction to MUMPS globals (in MUMPS, local variables, "locals" for short, are the in-memory variables, while global variables, "globals" for short, are the "db variables"; setting a global variable goes to disk immediately):
http://gradvs1.mgateway.com/download/extreme1.pdf (PDF)
Say you want to write a love poem...
Instead of using a hammer just because there's one already in your hand, learn the proper tools for the task: learn to speak French.
Once you've reached near-native speaking level, you're ready to start your poem.
While learning new languages at an academic level is an interesting hobby, IMHO you can't really learn to use one until you try to apply it to a real-world problem. So, rather than looking for a new language to learn, in your place I'd first look for new things to build, and only then look for the right language to use for that one specific project. First pick the problem, then the tool, not the other way around.
For anyone who hasn't been around since the mid 80's, I'd suggest learning 8-bit BASIC. It's very low-level, very primitive and it's an interesting exercise to program around its holes.
Along the same lines, I'd pick up an HP-41C series calculator (or emulator, although nothing beats real hardware). It's hard to wrap your brain around it, but well worth it. A TI-57 will do, but will be a completely different experience. If you manage to solve second-degree equations on a TI-55, you'll be considered a master (it had no conditionals and no branches except RST, which jumped the program back to step 0).
And last, I'd pick FORTH (it was mentioned before). It has a nice "build your own language" Lisp-ish quality, but is much closer to the bare metal. It will teach you why Rails is interesting and when DSLs make sense, and you'll get a glimpse of what your non-RPN calculator is thinking while you type.
PostScript. It is a rather interesting language, as it's stack based, and it's quite practical once you want to put things on paper and either get it done or troubleshoot why it isn't getting done.
Erlang. The intrinsic parallelism gives it a rather unusual feel and you can again learn useful things from that. I'm not so sure about practicality, but it can be useful for some fast prototyping tasks and highly redundant systems.
Try programming GPUs - either CUDA or OpenCL. It's just C/C++ extensions, but the mental model of the architecture is again completely different from the classic approach, and it definitely gets practical once you need to get some real number crunching done.
Erlang, Forth and some embedded work with assembly language. Really; buy an Arduino kit or something similar, and create a polyphonic beep in assembly. You'll really learn something.
There's also anic:
https://code.google.com/p/anic/
From its site:
Faster than C, Safer than Java, Simpler than *sh
anic is the reference implementation compiler for the experimental, high-performance, implicitly parallel, deadlock-free general-purpose dataflow programming language ANI.
It doesn't seem to be under active development anymore, but it seems to have some interesting concepts (and that, after all, is what you seem to be after).
While not meeting your requirement of "different", I'd wager that Fantom is a language that a professional programmer should look at. By their own admission, the authors of Fantom call it a boring language. It merely shores up the most common use cases of Java and C#, with some closure syntax borrowed from Ruby and similar newer languages.
And yet it manages to have its own bootstrapped compiler, provide a platform that has a drop in install with no external dependencies, gets packages right - and works on Java, C# and now the Web (via js).
It may not widen your horizons in terms of new ways of programming, but it will certainly show you better ways of programming.
One thing that I see missing from the other answers: languages based on term-rewriting.
You could take a look at Pure - http://code.google.com/p/pure-lang/ .
Mathematica is also rewriting based, although it's not so easy to figure out what's going on, as it's rather closed.
APL, Forth and Assembly.
Have some fun. Pick up a Lego Mindstorm robot kit and CMU's RobotC and write some robotics code. Things happen when you write code that has to "get dirty" and interact with the real world that you cannot possibly learn in any other way. Yes, same language, but a very different perspective.
After competing in and following this year's Google Code Jam competition, I couldn't help but notice the incredible number of [successful] contestants that used C/C++ and Java. The distribution of languages used throughout the competition can be seen here.
After programming in C/C++ for several years, I recently fell in love with Python for its readable/straightforward nature. More recently, I learned functional languages like OCaml, Scheme, and even logic languages like Prolog. These languages certainly have their merits and, in my opinion, can be applied more easily than C++ and Java for certain situations. For example, Scheme's use of call/cc simplifies backtracking (a tool required to answer several problems) and Prolog's logic specification, although inefficient due to its brute-force nature, can drastically simplify (and even automatically solve) certain problems that are difficult to wrap one's brain around.
It is clear that a competition contestant should use the tools that are best suited for the challenge. Even x86 assembly is Turing complete - that doesn't justify solving problems with it. In this case, why are the contestants that use less common languages like Scheme/Lisp, Prolog, and even Python significantly less successful than contestants that use C/C++ and Java? Worded differently, why don't successful contestants use languages that, although may be less mainstream, are arguably better tools for the job?
There are several motivations for my question. Most importantly, I would like to become a better programmer - both in the practical aspect and the competition aspect. After being introduced to such beautiful paradigms like functional and logic programming, it is discouraging to see so many people discard them in favor of C/C++ and Java. It even makes me question my admiration for said paradigms, worrying that I cannot be successful as a Lisp/Scheme/Prolog programmer in a programming competition.
Great question! As someone who has dabbled in programming contests a bit myself, I may have something to say.
[Let's get the standard disclaimer out of the way: contest programming is only loosely related to "programming in the real world", and while it tests algorithmic and problem-solving skills and the ability to come up with fast bug-free working code under time pressure, it does not necessarily correlate with being able to build large software projects, write maintainable code, etc (beyond the fact that well-structured programs are easier to debug).]
Now for some answers:
C++/Java are more common than other languages in the real world as well, so you'd expect to see a higher proportion anywhere. (But it's even higher in the contest population.)
Many of these participants are students, or got into contests as students, and C++/Java are more common "first languages" that students learn. (Undergrad students these days may start with Scheme, Haskell, Python, etc., but high-schoolers (often self-taught) less often.) In fact, many of the Eastern European participants still use Pascal, and are more amazing with it than the rest of us will ever be with any language.
The school- and college-level contests usually use these languages. The International Olympiad in Informatics (IOI) allows only C, C++ and Pascal (or maybe it allows Java now; I haven't kept up), and the ACM International Collegiate Programming Contest (ACM ICPC) allows only C, C++ and Java. TopCoder allows C++, Java, C# and VB (really :p); and recently, Python. So you could say the "contest ecosystem" has more C++/Java programmers in it. Google Code Jam and IPSC are among the few contests that allow code in any language, actually.
Now the question is, in GCJ where the contestants are free to choose a language, why wouldn't they choose Python or Scheme? The most relevant factor is that these languages are slow. Sure, for most real-world programming they are easily fast enough, but for the tight loops that are often involved in getting a program to run under the n-second limit for all test cases, these languages don't cut it for any of the algorithmically more involved problems. (A problem designed to accept O(n log n) solutions but not Θ(n^2) solutions for C/C++ frequently rules out even optimal O(n log n) solutions in slower languages. Even Java used to be given a handicap at USACO; I'm not sure this is still the case.)
Another factor is the libraries: C++ and Java have better libraries for frequently useful algorithms and data structures (e.g. red-black trees, C++'s next_permutation), while Python's libraries (good enough for the real world) are less useful here, and Prolog and Scheme... I don't know about their libraries. This is a relatively minor factor, because these programmers can write their own code when necessary. :-)
General-purpose multi-paradigm languages are more useful for just getting things done within the time constraints of the contest than languages that force a philosophy or way of doing things on you. This is why Prolog will always remain unpopular, for instance. (General philosophy: some languages are "enabling" languages that let you do anything, including shooting yourself in the foot, while some are "directing" languages that force you to do things the right way.) This is also why C++ is three times more popular than Java among the general contest participants, and much more popular among the top contestants. Since the code doesn't have to be read by anyone else, it's OK and even useful to have loop macros like FOR(i,n) (less code to type, and more importantly less chance of making a bug when in a hurry); a minimal sketch of such a macro appears after this list. Nothing against Java, there are a few top programmers who use Java too. :-)
Finally, although many of these top programmers may have C++/Java/Pascal as their "first language", they are not good because of their language, so you don't have to despair about that. Many of these same programmers have won contests like the ICFP contest even with intentionally using crazy languages like shell scripts, m4 (used in autoconf), and assembly (the team named "You Can't Spell Awesome Without ASM").
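For concreteness, a contest-style loop macro of the kind mentioned above might look like this (a hypothetical sketch; every contestant has their own variant):

    #include <stdio.h>

    /* Less to type, and fewer chances to mistype the loop bounds in a hurry. */
    #define FOR(i, n) for (int i = 0; i < (n); ++i)

    int main(void) {
        int sum = 0;
        FOR(i, 10) sum += i;   /* expands to a plain for loop */
        printf("%d\n", sum);   /* prints 45 */
        return 0;
    }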
I liked Jerry Coffin's idea of plotting contestants of the Google AI contest, so I took all of the results and plotted them (calculated mean, standard deviation, and then graphed the normal distribution curves in Excel).
With Lua and JS included, I got this:
Without (there were few contestants, so maybe the results are skewed):
It looks like Java participants did markedly worse than the rest, while Go, Common Lisp, and C are on the better end.
Why do we all speak English and not Esperanto? Well, it just happened that way, even though English is inconsistent and bloated and Esperanto was intentionally designed as a 'better tool'.
Thus, one reason is tradition. In most schools programming is still taught in C/C++, Java, Pascal or even Basic. And those contests are mostly entered by students, who choose the language they know best.
Also, you can notice that most algorithms books feature pseudocode in the style of Pascal or Ada, and very, very rarely Lisp. I don't know why; perhaps also tradition. Or perhaps it's just not so well suited to algorithms.
Another reason would be speed. Although it's not a problem for Google Code Jam, in almost all contests a 2x speed gap is the difference between 'Accepted' and 'Time Limit' verdicts.
In other words, if the optimal algorithm in C++ runs 10 times faster than in Ruby, a sub-optimal algorithm in C++ may still be faster than a good one in Ruby. And contest authors usually don't want to allow O(n^2) submissions if O(n log n) can be achieved.
First, I'd question your premise [edit: or what I take to be a premise -- that contestants using C++ and Java fare about equally well]. For example, here's what languages were used for the entries that came in the first 100 places and the last 100 places in Google's recent AI contest:
Contestants using C++ and Java did not seem to be anywhere close to equally successful in that contest. Contestants using Python didn't seem to fare particularly well either, though there were considerably fewer of them, weakening any conclusion in that regard.
Second, of course, an awful lot of the explanation (as others have pointed out) is undoubtedly just the number of people who are familiar with each language. There are probably more people taking a course in Java right now than the total number of people who've ever written any Lisp, Scheme or Prolog.
Edit: I think a third possibility is simply versatility. To pick an extreme example, Prolog is very well suited to a few problems, but equally poorly suited to many others. Few people can (or at least do) learn more than one or two languages well enough to use them in a contest, so most people who are interested in such things are likely to choose languages that can work reasonably well for almost anything, rather than attempting to learn a specialized language for every problem that might be chosen.
In nearly all Google Code Jam rounds, more of the higher-performing contestants code in C++.
Below are the language stats from Google Code Jam 2012 Round 1A, 1B, and 1C (listed top to bottom).
The number of contestants in each round are 3,686, 3,281, and 3,189 respectively.
Fun question; it probably should be community wiki.
Look at the number of finalists by country: http://www.go-hero.net/jam/10/regions. Notice the number of people from Eastern Europe and Russia; those places have very strong C++ communities, as well as Java, for a number of reasons.
Look at the number of languages in the qualifiers: http://www.go-hero.net/jam/10/languages/0 and in the finals: http://www.go-hero.net/jam/10/languages/6. C++ starts out at less than half and reaches 75 percent in the finals. Either good programmers prefer C++, or C++ makes good programmers. Probably by the time you master C++, other things become trivial.
You are free to draw your own conclusions though.
First of all, as you have pointed out, C++ and Java are mainstream languages. This automatically means that people who start doing programming competitions will be introduced to them first (by the way, who learns Lisp as a first language? :)). I also participate regularly in such competitions. I use C++ to compete, although my favorite language is Java; it is just that I want to practice another language apart from Java, and C++ is a little less verbose and runs faster, which is important for programming competitions.
Now to my point: people become experts first in mainstream languages. To participate in programming competitions you must have quite a good grasp of the language you are using. You don't have time to search the internet for silly things, like a construct you've forgotten; speed is an important factor there. To use Lisp in a competition, you must be fond of it, and I don't think there are that many such people out there. Correct me if I am wrong. And honestly, about the pros you mention, such as simplified backtracking: in pretty much any language backtracking is easy; declare a method and just call it again for every possible outcome. It couldn't be simpler. I have never felt that the language I am using is tripping me up in a programming competition.
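To illustrate how little machinery backtracking needs, here is a minimal sketch in C (a hypothetical toy example: it enumerates all subsets of a small set by recursing on "take it or leave it"):

    #include <stdio.h>

    #define N 3
    static const int items[N] = {1, 2, 3};
    static int chosen[N];

    /* Classic backtracking: at each position try every option (include or
     * exclude the item), recurse, then move on to the next option. */
    static void search(int pos, int count) {
        if (pos == N) {                      /* one complete assignment */
            printf("{");
            for (int i = 0; i < count; i++)
                printf("%s%d", i ? ", " : " ", chosen[i]);
            printf(" }\n");
            return;
        }
        chosen[count] = items[pos];          /* option 1: take items[pos] */
        search(pos + 1, count + 1);
        search(pos + 1, count);              /* option 2: skip it */
    }

    int main(void) {
        search(0, 0);                        /* prints all 8 subsets */
        return 0;
    }

The same shape of code (choose, recurse, move on) handles permutations, N-queens, and most contest search problems; no call/cc or logic engine is required.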
OMG... people are all going through the stats and figures!
Let's not forget the basics: these are (mostly) the only two languages taught to people in colleges and schools!
That might explain the heavy rush!
A vital reason might be that not every contest supports languages like Python or Prolog. In particular, the ACM ICPC World Finals supports only C/C++ and Java, and TopCoder supports only C++, Java, C#, VB, and now Python. It is natural for contestants to choose a language that is available in every contest. Another reason might be execution speed. And yes, another reason is that these are the languages most people learn first.
Big libraries were a selling point for Java in ACM ICPC. It's handy to be able to realize you want some random data structure or algorithm and just pull it out of the standard libraries.
Keep in mind that C++ is not only the majority choice among all contestants, but also that as the rounds progress, its share just keeps on growing.
I'd say it is true that most of the participants are students (however, since it is an open tournament with a chance at a job interview with Google, you have to consider that many who participate have already graduated). But the later rounds are reached only by people with a ton of experience; they are not just students who have just learned to code in C++/Java.
Of course, the student argument also works against languages like Lisp, OCaml or Prolog - that is, languages that are used a lot in AI areas but that, in the mainstream world, students are the people most likely to be learning and using.
Big contests other than Google's support few languages, but that still wouldn't explain why Pascal or .NET is nowhere near the level of Java (as they tend to be equally well supported in the major contest events).
A lot of the best coders in these contests know a lot of languages, yet they still prefer to use C++ during the rounds; it must be for a bigger reason than having learned C++ first.
I would argue against the claim that languages other than C++ or Java are better tools for the job. If the direct data says that the finalists are more likely to use C++ and Java, that is a direct contradiction of the claim.
The Google AI competition data does not actually contradict any premise regarding Code Jam. It does show that top coders are able to use languages like Common Lisp when one is truly the better tool for the job. If we want to use this data to conclude that Common Lisp is a great tool for AI competitions, then we should also conclude that C++ is a great tool for algorithm competitions like GCJ.
Basically, is the only advantage of object-oriented languages the improved understanding of a program's purpose?
Do the compilers of object-oriented languages break the objects apart into structures and function libraries?
Basically, yes. The only advantage is improved understanding of code.
For some languages the OO version is the same as the non-OO version after compilation - Perl, for example. In the majority of cases the OO version is slower than the non-OO version, and with very rare exceptions non-OO languages are faster than OO languages.
But in general, most experienced programmers will tell you not to worry about the performance differences between OO and non-OO languages (just as Lispers will tell you not to worry about the performance difference between procedural and functional languages). This is because you should never, ever underestimate the importance of understanding code.
These days we rarely talk about it anymore because we've gotten used to using very high level languages - be it OO or functional or multi-paradigm or metaprogramming. But back in the 80s and 90s there was what was then known as the software crisis. What was the software crisis? It's basically the fact that most software projects were never completed!
The software crisis affected all sectors of the industry: from military radar systems, to games to commercial operating systems. The consumers called them vaporware. They were projects that were too ambitious.
But these days there are lots of very ambitious and impressive projects that manage to reach at least beta versions (and for web2.0 beta is good enough for public consumption). Part of the reason is that we now understand requirements engineering better and we also understand the process of software development better. But part of it is also because we have better tools to actually understand what we're doing. And OO is part of that toolset.
Yes. Method code lives with the class definition rather than with each object, and each instance method accepts an implicit this pointer to the object's data as its first argument. If you disassemble an instance method call, you will see this.
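As a rough sketch of what that lowering looks like (hypothetical names, not any particular compiler's output), a method call behaves like a call to an ordinary function whose hidden first argument is the object's address:

    #include <stdio.h>

    /* The object's data ends up as a plain struct... */
    typedef struct {
        int balance;
    } Account;

    /* ...and a method such as account.deposit(amount) behaves roughly like a
     * free function whose hidden first parameter is the this/self pointer. */
    static void Account_deposit(Account *self, int amount) {
        self->balance += amount;
    }

    int main(void) {
        Account a = { 0 };
        Account_deposit(&a, 50);    /* conceptually, what a.deposit(50) lowers to */
        printf("%d\n", a.balance);  /* prints 50 */
        return 0;
    }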
Here are a couple of links comparing speed. The first compares C and C++; please read the entire article:
http://unthought.net/c++/c_vs_c++.html
To compare Python, Java, C++, PHP and other languages:
http://blog.dhananjaynene.com/2008/07/performance-comparison-c-java-python-ruby-jython-jruby-groovy/
But, to answer your question, the main advantage of OO is that for many problems it is the best way to model the solution, because the model naturally fits into objects. If you try to force it to work where it is not a good fit, you will end up with code that is harder to understand.
There are various language paradigms because there are many different types of problems, and you should pick the language type that best models the solution. For example, I would not want to write an OS in C++, as an OS doesn't seem to fit well into OO methodologies, but I would also not want to write a car-racing game in C, as there it would make more sense to have objects.
Depending on the language and compiler, you may see the application compiled down to C, but others are not, as some languages are interpreted instead.
For example, C++ was originally compiled to C (by Cfront), but Java is not, and neither are the .NET languages. PHP is generally interpreted, though it is possible to compile it (I have never tried it). One compiler is:
http://www.phpcompiler.org/
What are your criteria or things that you consider when you are an early adopter of a programming language or technology?
Two of the most common explanations I've heard are:
It should be "fun" (what I've heard from technical people).
It should be capable of solving our problem (what I've heard from business people).
So what's yours?
I've made this change several times over my career spanning various companies, moving from C to Java to Ruby to Haskell for the majority of my software development.
In all cases, I've been looking for more expressive power and better abstractions. This is always driven by business needs: how can I develop better software more cheaply? To me, the challenge of this problem is "fun," so fun rather automatically comes along with it. Justifying the business value to managers can be difficult, however; they often don't have the technical skills to understand why one programming language can be better than another, and are worried about moving to technology that they understand even less than the current one. (I solved this problem by taking over the manager's job as well: I started a company.)
It's hard to say what exactly to look for in a new language. You obviously don't have a detailed grasp of the language, or you would already be using it or know why you're not. Vast experience will bring an instinct that will make certain languages "smell" better than others, but—and this can make it especially hard to convince others to look at a new language—you won't know precisely what features give you big advantages. An example would be pattern matching: it's a feature found in relatively few languages, and though I knew about it, I had no idea when I started in with Haskell that this would be a key contributor to productivity improvement.
While it's negative ("avoid this") advice rather than positive ("do this") advice, one fairly easy rule is to avoid spending a lot of time on languages very similar to ones you already know well. If you already know Ruby, learning Python is not likely to teach you much in the way of big new things; C# and Java would be another example. (Although C# is starting to get a few interesting features that Java doesn't have.)
Looking at what the academic community is doing with a language may be helpful. If it's a fertile area of research for academics, there's almost certainly going to be interesting stuff in there, whereas if it's not it's quite possible that there's nothing interesting there to learn.
My criteria are simple:
wow factor
simple
gets things done
quick
I want it to do something easily that is hard to do with the tools I'm used to. So I moved to Python, and then Ruby, over Java because I could build a program incrementally, add functions easily, and express programs more concisely (esp. with Ruby, where I can pass blocks/Procs and have clean closures, plus the ability to define nice DSLs making use of blocks and yield.)
I took up Erlang because it expresses Actor-based concurrency well; this makes for easier network programs.
I took up Haskell because it fit with a number of formal methods tools I wanted to experiment with.
Open source.
Active developer community
Active user community, with a friendly mailing list or forum.
Some examples and documentation, preferably a tutorial
Desirable features (solves problems).
If it's for my personal fun, I need very little excuse, as I do love learning new things, and the best way of learning is by doing. If it's for an employer, customer, or client, the bar is MUCH higher -- I must be convinced that the "new stuff", even after accounting for ramp-up effects and the costs that come with being at the bleeding edge, will do a substantially better job at delivering value to the client (or customer or employer). It's a matter of professional attitude: my job's to deliver top value to the client -- having fun while so doing is auxiliary and secondary. So, in practice, "new" technologies (including languages) that I introduce in a professional setting will generally be ones I've previously grown comfortable and confident with in my own spare time.
Someone once said something to the effect of:
"If learning a programming language doesn't change the way you think about programming, it's not worth learning."
That's one metric (out of many) to judge the value of learning new languages (or other technology) by. Using this, one might suggest learning the following languages:
C, because it makes you understand the Von Neumann architecture better than any other language (and it's sort-of random-access Turing Machine like, sorta'...).
LaTeX (as a programming language, not only as a typesetting system) because it makes you learn about string rewriting systems as a model of computation. Here, sed is similar; learn both, because they're also both useful tools :-)
Haskell, because it teaches you about functional programming, lambda calculus (yet another model of computation), lazy evaluation, type inference, algebraic datatypes (done with ease), decidability of type systems (i.e. learn to fear C++)
Scheme `(or (another) ,Lisp) for its macro system, and dynamic typing, and functional programming done somewhat differently.
Smalltalk, to learn object orientation (so I hear)
Java, to learn what earning money feels like :D
Forth, because wisdom bestowed forth learned implies.
... that doesn't explain why I learn python or shell scripting, though. I think you should take enlightenment with a grain of salt and a shovelful of pragmatism :)
Should be capable of solving the problem
Should be more adequate to solve the problem than other alternatives
Should be fun
Should have prompt support, either from a community or the company promoting it
A language should be:
Easy to use, to learn and to code in.
Consistent. Many languages have 50 legacy ways of doing things; this increases the learning curve and becomes quite annoying. C# for me is one of those languages.
It should provide the most useful solution with the least amount of code. On the other hand sometimes you do need a bit of expressiveness to make sure you're not making a huge mistake.
The right tool for the right job and maybe the right tool for any job
My criteria for what the language should have:
1. New ideas. If the language is just another Scheme variant and you already know one, I don't feel the need to learn the new one. I will learn it if I think I will learn something new.
2. Similar to another language, but better. For example, while Java and C++ have many of the same ideas, Java's automatic garbage collection makes it a better choice in many cases.
Gets the most done with the least amount of effort
Extremely interoperable with different protocols, out of the box
Fast
Has lots of libraries built in for stuff 99% of web developers do (PDFs, emailing, reporting, etc.)
It depends on why I'm learning the new language. If I'm learning it for fun, then it has to meet these criteria:
Is it well supported on my platform? Something that runs only on Linux isn't interesting to a Windows programmer.
Will I learn something new? In other words, does it come up with a new way of doing things?
Does it look fun? I don't want to learn Ada even if it has new ways of doing things.
If I'm learning it for work, the criteria are different:
How mature is it? Has it been proven to work in the real world?
How big is the community?
Will it make my job easier? I.e., is it worth the time investment versus just doing the task with a language I already know?
Over the years we have seen (well, I have :) a number of languages come and go. Some were more accepted, some a little less. So I was wondering, what do you think are the factors that most affect whether a language survives? And whether it will have a future for a number of years (by that I mean several decades or so)?
For example, Fortran and C have survived the test of time. They were popular, yes, but they also had very good corporate backing, financing, and standard specifications (ANSI and ISO).
Some of the modern languages I see today, although they are popular, have none of that (the current implementation is often considered standard). That is all fine for the time being, but what about 10 or 20 years later, when their authors are maybe not here anymore. I very rarely see open source languages which make the transition into corporate financing.
If you could put it in a few words, what, in your opinion, would be the most important factors for the survival of a language, and why?
Ruby is popular, although it has no corporate backing. And it has been here for 14 years already.
Perl already survived 22 years, and probably will survive a few more.
Python has no corporate backing (OK, I don't know if you'd count Google's engagement), yet it has made it into Fortune 500 companies.
On the other hand:
Pascal got corporate backing and died.
Ada has corporate backing and it's practically reduced to a DSL for avionics.
I think the answer depends a lot on the time-frame in which you define survival. This is important because I think there are three factors that have changed over time, and are still changing:
Hardware performance (i.e. speed or memory)
Hardware complexity (i.e. single-core vs. multi-core)
Software complexity
I think the reason C has survived is because, until just the past few years, there was still a very real need for maximum performance in a lot of applications. Perhaps there will always be that kind of need, but I think it has been growing much less relevant in the past few years. I think it's always going to be around, but I'd be surprised if it was widely used 20 years from now; it's already started getting passed up in favor of C#/Java/etc in the past five years.
The recent (by which I mean past five years or so) rise of languages like Python are also a response to the fact that software has grown more complex, while performance has become less of an issue. Because consumers value the 'now', there's a huge incentive to develop quickly, and worry about speed later, if at all. That has a pretty big impact on which language you use for development.
I see clarity, maintainability and ease of use as the most important factor for survival, if you take the future out to 20+ years.
Every future language needs to make an existing problem easy
For example, concurrent programming is not easy in most languages today. This will be solved by a new language, as we cannot easily coax our existing paradigms into the parallel world. Just take a look at Java, which was built from the ground up with threads in mind; it has so many caveats when you even dare to do concurrent programming.
We'll need a system that makes it so easy to do concurrent programming that we won't even need to think about it. We'll need a memory model that protects us from having to think about these problems. For those who can't imagine such a world, you are just stuck in our current paradigm. We will need to change the way we develop software for this to work. Serious problems require change.
Another way for a language to survive is to attach it to an entire system. Just look at Objective C, it is Apple's language for all Apple products. I think this is the way to go. Design a system that is worthy of its own language.
There are many other examples, I've been thinking about this problem for a long time.
As far as I can recall, Fortran had no corporate backing until it was well established. C was backed by AT&T, but they really didn't care whether anyone else adopted it. And both were well established before they had ANSI standards (also, note that ANSI and ISO provide standard specifications, not implementations).
On the other hand, IBM heavily backed and promoted PL/I, and that never really caught on. And the US Government tried to get all of us writing Ada, and that didn't work either.
So, what does work? Good question. Getting schools to teach it is good (Pascal pretty much disappeared when colleges switched to C++ & Java). Lately, having buzz on the 'net is good (see: Java, Ruby).
In order for a language to survive it needs several things:
It needs to solve a problem better than other comparable options. This is the subjective aspect: developers feel it is better and so they adopt it.
It needs to have good tooling. Without good tooling a language will never catch on to the masses.
It needs a strong community to be built around it; a community which provides assistance, help, components, etc.
I don't think corporate backing has a direct impact on these items. I think it can make things such as developing tooling more likely, but there are too many examples where it either did or did not help the adoption of a language.
The open source community has become more like a huge corporation, hasn't it?
Languages survive while they are used, and while people are prepared to maintain them. People are often prepared to maintain the language while it is used. If a language is not used, it dies.
There can be all sorts of things that contribute to, or determine whether, a language dies. Corporate-sponsored languages die if the corporate sponsor ceases to see a benefit (profit) in the language, or they want people to use an alternative, and the corporate sponsor is unwilling to release the code to open source, and there is no open source alternative.
I don't see evidence that corporate backing or standardization is sufficient to determine whether a language survives or not. There are many corporate-backed languages that have failed to gain a strong foothold (Ada comes to mind). There are many standardized languages (Common Lisp) that also failed. On the other hand, there are plenty of non-standard, non-corporate languages that have gained popularity (Perl, PHP, Ruby). There doesn't seem to be causality there.
The viability of a language is really determined by the community around it. There is a positive feedback loop. More users means more support and more libraries which in turn means more users. Popular languages can languish, but they don't totally die out. Not for a long time.
If I were looking for a language to use for something that had to last, the two biggest criteria in my mind would be:
Does it work well for my problem domain?
Is the community strong enough to be self-perpetuating?
If the answers to those two questions are true, use the language. If either answer is false, don't.
While other languages have been almost killed by their corporate backing (Delphi).