This applies to creating computer applications: start with C; it will strengthen your programming basics. You should also study object-oriented programming and data structures using C and C++. From there you can move on to higher-level languages like C#, VB, and Java.
This applies to web applications:
First you need basic knowledge of two presentation languages, HTML and CSS; of the two, HTML is a must. Then start with PHP, and learn its object-oriented features. You can go deeper by learning to use CVS/SVN and unit testing, and after that you can pick up some of the great frameworks designed for PHP, especially Zend Framework and TYPO3. That will make you a pro at PHP programming.
What to learn depends on what you want to do. I'd also recommend learning C, but it's unlikely that you'll ever do much with it in anger (unless you really have a thing for creating Linux desktop apps or server back ends). What learning C will do (and what learning something like Java won't) is teach you about pointers, data structures and memory management. In short, you'll get a better idea of what's going on under the hood, and that will make you a better programmer in the languages that you likely will be using for something more productive.
If you're thinking about web applications, then you have a metric craptonne of choices available, but only a few that will let you operate at low cost (or free). They break down to Perl, PHP, Java, Python and Ruby.
Perl is, well, it will do what you want it to do. You may not be able to figure out how it does it. Let's just say that the syntax is a little opaque, and the war cry of the Perl community (TIMTOWTDI!!!*) means that you'll rarely see two functionally-equivalent snowflakes that even bear a slight family resemblance to one another. If you know what you're doing, you can do wonderful things -- but that won't keep the source code from looking like a cat wandered onto your keyboard and did one of those little cat-circling things. It's a fundamental fact of programming that code you wrote last year will have you muttering "what was I thinking", but Perl is more likely to have you muttering "what the heck was this supposed to do again?"
PHP is something that nearly every web host offers in their base package, whether that be free, low-cost, or even a dedicated server with five-nines uptime at hundreds of dollars a month. There are a billion books and a trillion websites to learn from (or to steal example code from) -- and most of them are about as wrong as you can get. You'll hear a lot of people talking dirt about PHP. The problem isn't the language as such, but the fact that the barrier to entry is low. It's hard to find anybody who can't learn enough of the basics to be dangerous. The problem is that many (okay, most) folks stop learning at the dangerous level. Hey, their websites work. So what if their code is using a hundred times the memory it needs to use and takes so long to run that they can never support more than four users at a time? (If you have a grounding in something like C, you'll develop the discrimination it takes to tell good code from bad. And when in doubt, start with the links you'll find in misson's postings in the Programming Help category here.) You can write good apps in PHP; it's just not a requirement.
Java is great to learn if you want to move into programming as a profession. That's the only good thing you'll ever hear me say about it. It's unnecessarily verbose and it takes the idea of object-oriented programming too far, really. Not in the sense that Smalltalk or Ruby do, where "everything is an object", but in the sense that all of your code needs to be contained in an object/class whether the paradigm is appropriate or not. And yet somehow it leaves some types (scalar values like ints) outside of the object system (in the jargon of the profession, they're primitives). You have to know when you can use a method (variable.doSomething()) and when you have to feed the value to a function (doSomething(variable)), when a value is what it seems to be and when it just represents a place in the memory where the real value lives (again, C can help you understand that). There is a lot of inconsistency in Java, but it is very widely used in corporate development, so it's good to have on a resume. If you're not looking for a box to check on a resume, save yourself the headache.
Python is really straightforward. It's pretty much the polar opposite of Perl, both in terms of the syntax and the philosophy (no TIMTOWTDI -- there should only be one obvious and "Pythonic" way to do something). It will take a while to get used to what is mutable (values you can change) and what is immutable (values that can't be changed; you make an altered copy instead), but the rules are consistent. Python is the language to use if you want to develop applications on the Google App Engine (which you can do for free, and pay to increase capacity if you accidentally create the next Facebook). It also has some functional language features ("functional" meaning that you can do a lot by passing functions into functions**), which can make writing some kinds of code a lot easier. And if you can find a host that offers Django support (Google App Engine supports a subset of Django), you can effectively separate your web page (the HTML) from the code that creates it. Oh -- MIT is using Python now as a teaching language for a lot of the courses that used to be based on the Scheme dialect of Lisp. That doesn't make it "the best", it just indicates that there is enough "bottom up" capability in the language to make it your own.
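The mutable/immutable distinction is easy to see in a few lines (a minimal sketch; the variable names are just for illustration):

```python
# Lists are mutable: changes happen in place, and every name bound to
# the same list sees them.
nums = [1, 2, 3]
same = nums            # a second name for the SAME list, not a copy
nums.append(4)
print(same)            # [1, 2, 3, 4]

# Strings are immutable: "changing" one actually builds a new object.
s = "spam"
t = s.upper()          # s is untouched; t is a fresh string
print(s, t)            # spam SPAM
```

Once you've internalized which built-in types behave which way, the rest of the language follows the same consistent rules.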
Ruby. It's the darling of a lot of people, and I've got to say that I like it. As a curiosity, more than anything else. Don't get sucked in by the apparent ease of Ruby on Rails, though -- RoR makes the simple stupidly simple, but you still have to engage in serious programming when you get beyond what comes in the box. Ruby itself, though, is a wonderful little language. As I said, I like it -- but I've spent most of my time as a professional programmer, and there isn't a whole lot of the world running on Ruby just yet. It is a glorious thing, though, to be able to treat everything consistently in your program -- there's no flip-flopping back and forth between "things you do with primitives" and "things you do with objects", since everything is an object. It'll take a little bit of getting used to (the mathematical notation is very different from what you learned in school), but (like Python) it is a very consistent language that will let you make it your own.
That just leaves the .NET family of languages. Frankly, they're a lot like Java, even if the syntax is different. Again, an obscene amount of corporate work is running on the .NET platform, both on the desktop and on the web/intranet, so if you're looking forward to a comfy job in a cubicle, C# (and, to a lesser degree, VB.NET) are great entries on a resume. If you're running Windows, they're probably the best choice for creating simple desktop applications to help you get things done (you can talk to other programs through COM, grab stuff from the net, and so on). But if you want to create a streamlined, efficient and maintainable web application, well, there are other choices that are better suited to the task. On the other hand, Visual Studio is probably the single best IDE (integrated development environment) out there (which, along with vendor lock-in to the Microsoft stack, accounts for its popularity in the corporate world).
*There is more than one way to do it
** For example, you will find yourself writing a lot of code that says essentially "compare these two things, and do something if this condition is met". In a language that has no functional features, you pretty much need to write a separate version of that code for every occasion. In a functional language, you can write a skeleton function, then pass in not only the two things you want to compare, but the way to compare them AND what to do if the condition is met. You'd probably need to have the comparator function available in more than one place in your program; the same would go for the "what to do" part. In a functional language, you write each of them once, then use them when and where you need them without having to add a lot of conditional code to the functions that may call them.
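That skeleton idea can be sketched in a few lines of Python (which has the functional features mentioned above; the function names here are made up purely for illustration):

```python
def apply_if(a, b, compare, action):
    """Skeleton: run action(a, b) only when compare(a, b) holds."""
    if compare(a, b):
        return action(a, b)
    return None

# Write the comparator and the "what to do" part once each...
def bigger(a, b):
    return a > b

def shout(a, b):
    return f"{a} beats {b}!"

# ...then reuse them wherever you need them, with no extra
# conditional code inside the functions that call them.
print(apply_if(5, 3, bigger, shout))   # 5 beats 3!
print(apply_if(2, 9, bigger, shout))   # None (condition not met)
```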
C++ IMO. It's an old language that has mountains of documentation, as well as good object-oriented programming support. Besides that, many languages borrow heavily from C/C++ (Java, for example, has very similar syntax), which makes it easier to learn new languages afterward.
Depends on the application. If you want cross-platform, Java. Games for Windows, C# (and if you know C# you can make Xbox 360 games too, with some additional learning). Java and C# pretty much share the same syntax, even. For mobile apps, Java for everything except iPhone. Android apps are built with Java -- pretty much the same Java as used in desktop apps.
I learned Java in school, and now I can code Android apps too without needing to learn Java again.
So it all depends on what you want to do.
Also @essellar, what headache are you talking about with Java? I never had any, and I understood everything pretty much instantly. Before that I only knew some basic procedural PHP.
The Java community is big, and there are lots of tutorials and guides on how to do pretty much everything. The IRC support channel, on the other hand, can be a bit of a pain sometimes; they expect you to Google it first, even though I've had a hard time finding solutions to my problems on Google, as I'm unfortunately not good at googling :/
@vigge_sWe: You can code Android apps in languages other than Java. In fact, you can code most "Java" apps in languages other than Java. The JVM (the runtime) is only worried about the compiled bytecode, not the source. The main problem* with Java (as a language) is that it's verbose for no good reason, and so-called "best practices" just increase the verbosity seemingly for the fun of it. (And if you've never run across the DesignPattern Pattern, you are still a n00b. I've waded hip-deep through code that had FactoryFactories sprinkled throughout as a matter of local code policy compliance.)
* There are others, such as static typing. Static typing is safe, but it's also very restricting and puts a hard limit on polymorphism. The introduction of generics helped a little in that regard, but I'll take dynamic typing (and even weak typing) over static typing any day -- it may make writing a module a little harder up front, but once it's written, it's written, and I can quite literally use it anywhere.
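For instance, in a dynamically typed language like Python, a function doesn't care what type its argument is, only that it responds to the right method (the classes here are hypothetical, just to illustrate the point):

```python
class Cat:
    def describe(self):
        return "a cat"

class Rocket:
    def describe(self):
        return "a rocket"

def announce(thing):
    # No declared interface, no generics, no casts: anything that
    # happens to have a .describe() method works here.
    return "This is " + thing.describe()

print(announce(Cat()))     # This is a cat
print(announce(Rocket()))  # This is a rocket
```

In a statically typed language you'd need both classes to share a declared interface before `announce` would accept them; here the module works with any type you hand it, including ones written long after the module was.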
1. Do you know anything about programming?
2. What is your 'target'? Web applications like WordPress? Desktop apps like Word? iPhone apps? Something that will run in a browser?
In general, if you are just starting to learn programming, I would lean toward scripting languages. You get immediate feedback. With compiled languages, the compile phase makes it slower to find errors -- syntactic errors at first, and then logical ones.