
Introduction to programming

         Programming is a fun little hobby that I've kept up since about 1991. Since then I've collected a degree in computer engineering and now it's what I do for a living. (That's probably a bad idea, because if and when I get sick of it I'll lose a hobby and hate my job.) I've learned BASIC, Pascal, C/C++, a little Java, a little Assembly, a little more Perl than I wanted to, and a whole lot about computers. My main interest in programming is in writing games, but I have a fundamental problem with following through and finishing any of them to releasable quality. Like I said, it's a hobby.

         Computer programming, in its strictest sense, is telling the computer what to do. You can tell it to print things to the screen, to print things to the printer, to do all kinds of math, to draw all kinds of things on the screen. You can network computers together and do internet programming, too (I recommend Java for this).

         So what does this mean? Programming isn't so much a geek thing as it is a problem-solving thing. You can take any problem you want and get the computer to try to solve it for you. A word of warning: your programs are only as smart as you make them. One of the major problems with computers is that they do what you tell them to do, not what you want them to do. :)
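         To see what I mean, here's a tiny example in C++ (the language my main tutorial covers; the numbers in it are just mine): you want the average of 3 and 4, but what you told the computer to do was integer math, so it cheerfully throws away the fraction.

    #include <iostream>

    int main() {
        // You WANT the average of 3 and 4 (which is 3.5), but you TOLD the
        // computer to divide two integers, so it drops the fractional part.
        int average = (3 + 4) / 2;
        std::cout << average << std::endl;   // prints 3, not 3.5
        return 0;
    }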

         This set of web pages is supposed to bring you up to the hobbyist level. Believe it or not, programming is fun, and a great way to piss off anyone you live with. :) My main interest is in making games and in making pretty graphics, and it turns out that making games is actually one of the more difficult things to do on computers: graphics is difficult enough in itself; if you're going to get into sound, there are shelves full of (contradictory) books on how to do it; if you're going to have a computer player, you're delving into the world of Artificial Intelligence (AI); and if you're doing a super-hot first-person game, you pretty much have to take over the computer to get the resources you need. Game programming ends up touching on most of what computers can do. Rough stuff. (I have a page for graphics programming, too.) Of course, this depends on which OS you're writing for. I started on DOS, which has no I/O or memory protection, and I ended up on KDE (Linux); these are very different to program for.


         The first thing you need to know is the very basics of computers and how things run on them. On a fundamental level, there is a central processing unit (CPU) buried somewhere in the innards of the box you call your computer. This is one of a whole bunch of microchips that make up your computer system, but this one does all the thinking and hence gets all the press. (Such as the raging AMD versus -- ew! -- intel debate, which isn't so much a debate as a marketing war.) Processors work (very basically) like this (there's a toy code sketch of this loop right after the list):

  • fetch the contents of memory at some address
  • figure out what kind of instruction it is
  • if it's an ALU operation, do the math
  • if it's a memory operation, talk to the main memory
  • store the result wherever it goes
  • repeat for the next instruction
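             To make those stages concrete, here's a toy fetch-decode-execute loop written in C++. The four-instruction machine, its opcodes, and its memory layout are all made up for illustration -- a real processor does these same steps in hardware, with vastly more machinery.

    #include <cstddef>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    // A made-up machine: every instruction is an opcode byte followed by an
    // operand byte. Real instruction sets are much richer than this.
    enum Opcode : std::uint8_t { LOAD = 0, ADD = 1, STORE = 2, HALT = 3 };

    int main() {
        std::vector<std::uint8_t> memory = {
            LOAD,  10,    // load memory[10] into the accumulator
            ADD,   11,    // add memory[11] to the accumulator (an ALU operation)
            STORE, 12,    // store the accumulator into memory[12] (a memory operation)
            HALT,  0,
            0, 0,         // padding
            2, 3, 0,      // data: memory[10] = 2, memory[11] = 3, memory[12] = result
        };

        std::uint8_t accumulator = 0;
        std::size_t pc = 0;   // program counter: address of the next instruction

        for (;;) {
            // fetch the contents of memory at some address (the program counter)
            std::uint8_t opcode  = memory[pc];
            std::uint8_t operand = memory[pc + 1];
            pc += 2;

            // figure out what kind of instruction it is, then do it
            if (opcode == LOAD)        accumulator = memory[operand];
            else if (opcode == ADD)    accumulator += memory[operand];  // ALU math
            else if (opcode == STORE)  memory[operand] = accumulator;   // memory op
            else                       break;                           // HALT

            // repeat for the next instruction
        }

        std::cout << "memory[12] = " << int(memory[12]) << std::endl;   // prints 5
        return 0;
    }

             Compile and run it, and it adds 2 and 3 the long way, printing 5. The comments line up (loosely) with the stages in the list above.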

             The reason for the stage outline here is that when you get to speed optimization, knowing even this basic information about how processors work will really help. Or you could just use an AMD chip, and not have to worry about speed optimizations. :)

             The whole objective of what we're trying to do is to get the processor to execute some particular instructions to accomplish whatever it is we're trying to do. These instructions, however, are stored as binary digits. Programming binary digits into memory sounds like a pain. (Why, when I was your age I was programming with tweezers and a little magnet! Uphill! Both ways!!) Instead of typing 0's and 1's, then, we're going to write a text file with what's called a high-level description of what we want the computer to do, and have a program convert that into the 0's and 1's that the processor understands. This makes life much easier, and makes programming fun! (Or at least much less painful.) This kind of program is called a compiler, and if you have a *nix system ("*nix" means either UNIX or Linux) you already have one: g++.
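             Here's what that looks like in practice. You type a high-level description like this into a text file (I'll call it hello.cpp -- the name is just my example):

    // hello.cpp -- a high-level description of "print a greeting"
    #include <iostream>

    int main() {
        std::cout << "Hello, world!" << std::endl;
        return 0;
    }

             Then you hand it to the compiler, and run the file it spits out:

    g++ hello.cpp -o hello     (translate the text file into 0's and 1's)
    ./hello                    (run the result; it prints "Hello, world!")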

             There's also something called an interpreter, which works like a compiler except it translates and executes the program at the same time. (Compilers translate the whole mess and put it into a file, which you then run well away from the compiler.) Interpreted programs run more slowly, but they're easier to debug and they cross platforms (UNIX to BeOS to DOS) very easily.

             Here is an (extremely short) list of some programming languages:
     Compiled     Interpreted
     --------     -----------
     Pascal       BASIC
     C/C++        Java
     COBOL        Perl
     Fortran      Ruby
     Assembly     Python

             There are many many many more languages out there (in fact, you can make your own up!), but some more that I can think of off the top of my head are: ML, APL, Ada, PL/1, Lisp, etc.

             I have two general types of tutorials on this web page. The first is my C++ tutorial, which is not only a tutorial on C++ but also a tutorial on programming itself. The second type covers all the other languages; those are really just syntax references with some comments on how each language differs from C++.

    SIMPLE

             SIMPLE is another programming language, and the name is an acronym for Sheer Idiot's Monopurpose Programming Language Environment. The developer behind SIMPLE (probably Arvonn Tully) decided to make a programming language that would be easy for beginners. Hence, there are only three statements in the language -- BEGIN, END, and STOP -- and no matter which order you put them in, the program compiles. So you get all the functionality of other languages without all the time wasted trying to find hidden bugs.

    Arvonn's Law of Computer Science

         All programs have at least one extraneous line and at least one bug. By induction, then, all programs can be reduced to one instruction that doesn't work.


    chrisv@aquamentus.com
    January 19, 2002