About programming, a general question

Jerry Feldman gaf at blu.org
Tue Dec 21 18:54:12 UTC 2010


On 12/21/2010 01:16 PM, William Case wrote:
> I am not a programmer, but I wanted the answer to the question you
> seem to be asking.  How does the damn thing work?  More explicitly:
>
> How does human-understandable information get converted by a machine
> into electrical data, which is then stored; possibly transformed,
> compared, and/or relocated; and finally re-presented as information
> meaningful to humans?
>
> I found the answer in "The C Programming Language" by Brian W. Kernighan
> and Dennis M. Ritchie.  The book is so fundamental that it is often
> referred to simply as K&R.  If you try to use it purely as a tutorial
> for the C language, it is too difficult: almost every sentence
> contains a new concept.  But K&R and 'C' are closest to the metal.  Its
> description of the language, and particularly its appendices, are used
> by programmers mainly as a reference.  It really is a text on how best
> to write code so that the compiler can translate your 'C' code into
> machine language.  It is also, therefore, a basic set of instructions
> for compiler writers on how they should write their compilers.
>
> A big however!  I found that when I took my time and worked each new
> concept through, with liberal use of Google and some tutorial sites,
> with some contemplation on my part, and with some reference to the
> basic electrical properties of transistors, capacitors, direct current
> and crystals, I was able to come to a fairly complete (or at least
> useful) understanding of how my computer worked and what the text
> instructions I was inputting were doing.
>
> In addition, I spent a couple of afternoons exploring assembly language
> and the IA-32 instruction set.  Looking at how compilers work, and how
> they translate your text code, together with code from libraries, into
> machine code (the '1's and '0's you mentioned), makes the need for
> precise instructions (text syntax) clear and less of a burden.
>
> I found that by using K&R as a course outline, rather than as a final,
> all-knowing, all-teaching tutorial, I was able to drill down to the
> bottom of everything that was happening inside my computer.
>
> If you, like me, look for those Eureka! moments in life, you will find
> that exploring the capabilities of your computer through the 'C'
> language is a wonderful voyage of discovery.  The ingenuity and
> creativity that have gone, over the last 50 years, into the metal, the
> electricity, and the programming of a computer are truly a marvel.
>
> If this is the kind of approach you are interested in, respond to this
> post and I will give you some hints and tricks about uncovering the
> programming process.  If I happen to steer you wrong, I am sure there
> are lots of people on this list who will jump in with corrections.
>
I learned C from K&R, from the tutorial that was included with Unix, and
from the fact that I was told I had to maintain the Unix C shell.
Fortunately, at the time I knew a number of assembly languages,
including IBM 360/370, Autocoder, Burroughs, PDP-8, PDP-11, and Raytheon
PTS1200/PTS100. My real learning experience was looking at the code
generated by COBOL. You can see how a compiler generates its code, and
you can also see how decisions you make in the high-level language
affect the actual machine code. I was taught that looping through an
array with pointers was much faster than subscripting through the same
array. BUT, when working with the compiler group on Tru64 Unix, I found
that subscripting was actually much faster with an optimizing compiler:
the compiler must dereference every pointer, whereas with subscripting
it can keep the array base in a register. I had some old C code I had
once used to confirm that pointers beat subscripting; when I reran it
with the Tru64 compiler on the Alpha, the subscripted loops were faster.
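For anyone who wants to repeat the experiment, here is a minimal sketch
of the two idioms (the function names are mine, and any C compiler will
do; my own numbers came from the Tru64 compiler):

    #include <stddef.h>

    /* Subscripted loop: an optimizing compiler can keep the array
       base in a register and handle the indexing arithmetic itself. */
    long sum_subscript(const long *a, size_t n)
    {
        long sum = 0;
        size_t i;
        for (i = 0; i < n; i++)
            sum += a[i];
        return sum;
    }

    /* Pointer loop: the classic "faster" idiom, which a modern
       optimizer often compiles to the same code, or worse. */
    long sum_pointer(const long *a, size_t n)
    {
        long sum = 0;
        const long *p;
        for (p = a; p < a + n; p++)
            sum += *p;
        return sum;
    }

Compile each with something like cc -O2 -S and compare the generated
assembly; measuring with your own compiler beats trusting anyone's
folklore, including mine.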

Basically, the first rule I always follow is to write code clearly and
concisely. Remember that someone (you or someone else) may have to debug
it. If you later need to make the code faster, that is the time to
analyze it, and we have some really neat tools, such as graphical
profilers, that can point out the bottlenecks; many times these surprise
even the most experienced programmers.
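As a toy illustration (a sketch assuming gcc and gprof; the program and
its function names are made up), here is how a profile points at a
bottleneck:

    /* prof_demo.c -- build and profile with, e.g.:
         gcc -pg prof_demo.c -o prof_demo
         ./prof_demo
         gprof prof_demo gmon.out
       The flat profile should show nearly all the time in hot(). */
    #include <stdio.h>

    double hot(void)        /* deliberately expensive */
    {
        double s = 0.0;
        long i;
        for (i = 1; i < 50000000L; i++)
            s += 1.0 / (double)i;
        return s;
    }

    double cold(void)       /* cheap by comparison */
    {
        double s = 0.0;
        long i;
        for (i = 1; i < 50000L; i++)
            s += 1.0 / (double)i;
        return s;
    }

    int main(void)
    {
        printf("%f %f\n", hot(), cold());
        return 0;
    }

gprof itself is text-based, but the graphical profilers work on the
same principle: measure first, then optimize what the measurement
actually shows.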

At bottom, the CPU does only a few kinds of things. While there are
major differences between CISC and RISC, in general you load a value
from memory into a register; perform some operation on that register,
such as add, subtract, multiply, divide, OR, NOT, AND, and a few more;
and then store the contents back into memory. There are also branch and
conditional-branch instructions, where execution jumps to a different
part of the code based on a true or false condition. To really
understand computer logic, an interesting exercise is to build a
computer out of plugboard parts. Remember, everything in a computer is
binary: not decimal, not octal (each octal digit is shorthand for 3
bits), and not hexadecimal (each hex digit is shorthand for 4 bits).
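To make the load/operate/store cycle concrete, here is one line of C
with, in the comment, the kind of instruction sequence a compiler might
emit for it (illustrative RISC-flavored pseudo-assembly, not any
particular machine's):

    /* c[i] = a[i] + b[i], for i = 0 .. n-1 */
    void vadd(const int *a, const int *b, int *c, int n)
    {
        int i;
        for (i = 0; i < n; i++) {
            /* Per iteration, roughly:
                 load  r1, a[i]     ; memory -> register
                 load  r2, b[i]
                 add   r3, r1, r2   ; operate on registers
                 store r3, c[i]     ; register -> memory
                 then increment i, compare with n, and
                 conditionally branch back to the loop top */
            c[i] = a[i] + b[i];
        }
    }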

-- 
Jerry Feldman <gaf at blu.org>
Boston Linux and Unix
PGP key id: 537C5846
PGP Key fingerprint: 3D1B 8377 A3C0 A5F2 ECBB  CA3B 4607 4319 537C 5846

