If I wanted to learn code?

Started by Drauger9, December 22, 2013, 05:05:32 AM

Drauger9

I've been thinking about this a lot since I joined the forums. I've often found myself wanting to help out in some way but I don't have any real skills to offer. So I thought maybe I could learn code?

Which could be a great benefit not only to the community but to myself as well (something to do when I get bored. LOL). So I was wondering, if I wanted to learn code, which would be the best one to learn? C++, Java, HTML, etc.?

Which do you think would be more useful to know in the long term, say 10 years from now? Seems C++ has stuck around for a while?

Is there a certain one that's more universal than the others? More widely used, more adaptable than others, etc.?

Not sure if I'd be able to offer any help in the near future but in the distant future if I did learn some code then I could.

Anyways, any suggestions would be great as always. :)

Take care. :)

ROBOKiTTY

It doesn't really matter what language you start out with, since all programming languages are fundamentally the same. Usually by the time you start on your third language, it becomes just a matter of remapping syntactic symbols and learning the libraries.

That said, there are some languages you do not want to tackle as a beginner. They can overwhelm you and introduce bad habits for years to come. I'm looking at you, C and C++. (Also, HTML is not a programming language. It's just markup.)

For a disciplined start, I recommend you start out with the ebook How to Design Programs and learn Scheme. It will teach you a lot of good habits and expose you to the elegance of Lisp early on.

After you're done with HtDP, I suggest moving on to either Java or C# to learn modern object-oriented programming. Since C# started out as a Java-clone, the two languages are almost functionally identical. C# has more bells and whistles, but you won't notice any huge difference until later on.

Once you're comfortable with Java and/or C#, you can consider getting your hands dirty with C (not C++!). You will find all the hard discipline HtDP forced you through with Scheme coming in handy at this stage.

Only after all that will I recommend learning C++.
Have you played with a KiTTY today?

GuyPerfect

I can teach you to program, but my regimen isn't for the faint of heart. (-:

What I do is start with the basics of how a computer architecture works (in very simple terms), then proceed to the best beginner language I've ever come across... QBasic. From there, fundamental concepts are taught, then further academics proceed to C (not C++ or C#). And then, after C concepts are covered, I delve into assembly and describe the low-level functions of the machine code and CPU operations.

I very strongly believe that you can't be a good programmer without understanding what your code ultimately becomes by the time the computer gets to it. Not that you can't use abstract languages like Java or .NET, but I've seen way too many programs that were written poorly for the fact that the programmer didn't know any better. I want to make the world a better place by laying everything out in the open!

ROBOKiTTY

I don't think there's anything wrong with starting low-level, but I think it can be a bit off-putting for people who grew up with GUIs and modern computers. Unless you're programming an Arduino or other microcontrollers, you're going to have a hard time making something shiny and fun by modern standards, which can be frustrating.

What I like about the HtDP curriculum is that it balances discipline with instant gratification. Despite using a dynamically-typed language, it drills a reverence for type safety into your head while teaching some pretty fundamental concepts, notably those you don't typically learn from using C-family languages (e.g. building structures with cons, tail recursion, functional programming, etc).

Codewalker

The biggest problem with using languages like Lisp and Scheme for learning is that they teach you how a theoretical machine functions -- one that works NOTHING like actual computers.

Procedural first, since that's what's going on under the hood, and if you don't know that, you're doomed to think that O(1) is the be-all and end-all of efficiency. Then you can graduate to more advanced concepts like object-oriented and functional programming, as well as theoretical concepts such as lambda calculus that are a useful mode of thinking for specific problems, but certainly not all of them.

ROBOKiTTY

Scheme is a procedural language though. It's functional-friendly, but not quite as extreme as Haskell.

I think HtDP addresses most of the bad habits self-taught hackers tend to pick up. Starting low level without learning the requisite discipline that higher-level thinking instills is IMO pretty dangerous.

GuyPerfect

Would you mind elaborating on these cryptic "bad habits" you're referring to? (-:

In my experience, something low-level like C is perfect for avoiding bad habits. Strings aren't treated like a data type, the only way to pass variable references is with pointers, and you get to do your own memory allocation manually. And all of that is very close to how it all runs on the bare metal, so I definitely hold C in high regard when it comes to learning to program well.

ROBOKiTTY

With a good teacher/curriculum, bad habits can be avoided, but nonetheless...

1. Lack of universal code documentation/commenting standards.

Flower boxes? Wherever you feel like it? Everyone seems to prefer a different style.

Scheme (with HtDP) has its set of commenting conventions that force the beginner to document functions, constants, types, etc. Java has Javadoc. C# has XML documentation comments.

2. Type assumptions

C is weakly typed, and its types and typedef'd types also tend to differ from platform to platform. The compiler will also quite happily cast things silently for you. Is a plain char signed or unsigned? size_t, time_t, ptrdiff_t? Can you safely assume sizeof(size_t) == sizeof(ptrdiff_t) when the standard does not guarantee it? If time_t is a typedef'd long, is it only 32 bits wide and will thus break in 2038?

Should a beginner learn to worry about these things? I think so, if they're going to interact with these types, but that might be overwhelming.

3. Bit twiddling

Bit twiddling seems unavoidable to me when you use C libraries. This requires a beginner to learn and apply boolean algebra in a dry and error-prone manner. HtDP also teaches boolean algebra, but without bit twiddling. Bit twiddling in 2013 just screams premature optimization to me.

I like C and all, but IMO it's a bit much for a beginner.

The Fifth Horseman

Something to consider is that programming demands a strong grasp of logic and at least a decent one of math.
A good imagination coupled with the ability to visualize the way your code works goes a LONG way to being a good developer.
We were heroes. We were villains. At the end of the world we all fought as one. It's what we did that defines us.
The end occurred pretty much as we predicted: all servers redlining until midnight... and then no servers to go around.

Somewhere beyond time and space, if you look hard you might find a flash of silver trailing crimson: a lone lost Spartan on his way home.

GuyPerfect

Quote from: ROBOKiTTY on December 24, 2013, 06:12:29 PM
1. Lack of universal code documentation/commenting standards.

Flower boxes? Wherever you feel like it? Everyone seems to prefer a different style.

The following is my two cents; I don't mean to be contradictory. (-:

I'm in favor of support for willy-nilly comments, personally. Sometimes you want a comment in a place that would be awkward in all other contexts, and sometimes you don't want one where one normally would go.

I feel it's the programmer's responsibility to appropriately comment his code. Sure, everyone has their own style for doing it, but as long as the documentation is there, that's what matters. Besides, the best library designers will document their functions in entirely different places than in the source code. I can't see a compelling reason for some enforced/standardized comment setup built right into the compiler.


Quote from: ROBOKiTTY on December 24, 2013, 06:12:29 PM
2. Type assumptions

C is weakly typed [...]

By definition, you've got it backwards. A strongly-typed language prohibits operations on a value of inappropriate type. While it's true that C will automatically promote types within expressions (where applicable), it's still particularly anal when it comes to things like passing the wrong type to a function (even so far as to complain when you try to pass in a const array to a function that doesn't specify const for its argument).

Quote from: ROBOKiTTY on December 24, 2013, 06:12:29 PM
Is a plain char signed or unsigned? size_t, time_t, ptrdiff_t? Can you safely assume sizeof(size_t) == sizeof(ptrdiff_t) when the standard does not guarantee it? If time_t is a typedef'd long, is it only 32 bits wide and will thus break in 2038?

C makes no guarantees regarding the format or size of any of its data types. It's guaranteed that, for instance, int is at least as large as short, but anything more specific than that isn't in the spec. For this reason, I discourage making any assumptions regarding the nature of C's types, including the assumption that integers are two's-complement. If you need to know the type of your variables, use stdint, which guarantees two's-complement signed or unsigned integers of unambiguous size. Even in my personal programs, I use stdint all the time.

Reals are defined in the C spec as IEEE-754.

Types defined in the standard libraries are at the mercy of the implementation and, to be worth their salt, should be declared as an appropriate type. But that's not really a language thing, per se. Pure C programs don't have "#include <stdio.h>" at the top, since pure C programs are just C code.


Quote from: ROBOKiTTY on December 24, 2013, 06:12:29 PM
3. Bit twiddling

Bit twiddling seems unavoidable to me when you use C libraries. [...] Bit twiddling in 2013 just screams premature optimization to me.

Do you have any examples? I deal with standard libraries a lot and I can't readily think of any examples where I have to bit-pack any fields.

Either way, bitwise operations afford better control of certain data than arithmetic ones. Let's say you need to take variable x and ensure it's divisible by 4, increasing to the next multiple of 4 if necessary (this is required to process City of Heroes's bins, for example). You'd be stuck with something like this in arithmetic:

x += (4 - x % 4) * ((x % 4 + 3) / 4);

I sat here trying to come up with the best way to do that using arithmetic alone, and that's my solution. And it's three divisions and a multiplication, among other things. Yeowch!

I was able to eliminate some of the more expensive operations with some conditional execution:

x += x % 4 ? 4 - x % 4 : 0;

That reduces it to 2 divisions three quarters of the time, and 1 division a quarter of the time. Still not great, but more efficient than the earlier example. Readability, though? Good luck picking that apart if you're not a C-style syntax expert.

On the other hand, there's the bitwise approach:

x = x + 3 & -4;

Why does that matter? Well, let's look at how you'd implement those in machine code. I'll be using the V810 architecture in my example here (which is what the Nintendo Virtual Boy used), since it's fairly easy to understand. It has 32 registers, r0 to r31. Let's say x is in r6.

Using the arithmetic approach, you can crunch it down to this:


MOV  4, r1
MOV  r1, r30
DIVU r6, r30  /* The remainder is stored in r30 by design */
CMP  r30, 0
BZ   +12      /* Skip the next two instructions if x % 4 was zero */
SUB  r30, r1
ADD  r1, r6


Total cycles: 42, and that's reducing it to a single division.

Whereas with the bitwise approach:

ADD 3, r6
MOV -4, r1
AND r1, r6


Total cycles: 3

This is the reason I teach C and machine code after introducing programming concepts with BASIC. Call it premature optimization if you will, but I've seen many programmers do some ludicrously inefficient things just because they weren't fully aware of the impact it would have on program execution.

Drauger9

Thanks for the replies everyone.

ROBOKiTTY, I've saved that ebook and I'll look at it within the next couple of days.

GuyPerfect, I've actually messed with QBasic but that was around 1997ish? So it's been a while, LOL! I had a how-to book that I made it about halfway through in my teenage years. Then summer, friends and parties happened. LOL! So I didn't get beyond that. :P

Thanks for offering to teach me, but my schedule is so chaotic I couldn't dedicate anything more than what little free time I have here and there to it right now..... and I'm scared you might break me :P

The Fifth Horseman. I'm decent at math and modest at logic LOL! I'm not going to lie, I'm a little slow when it comes to learning new things, but when I get it, I really get it, and tend to be very good at it once I've grasped it. It's the dedication that I'm weighing now. Is it something that I'd really stick with? Seems like it could be a huge undertaking just to learn the basics. So I'll read the ebook mentioned, poke around some more, and see if I can snag some cheap used books off Amazon. LOL!

Ty again and take care. :)

ROBOKiTTY

Quote from: GuyPerfect on December 24, 2013, 08:29:46 PM
The following is my two cents; I don't mean to be contradictory. (-:

Not at all. I like the discussion. ;D

Quote from: GuyPerfect on December 24, 2013, 08:29:46 PM
I'm in favor of support for willy-nilly comments, personally. Sometimes you want a comment in a place that would be awkward in all other contexts, and sometimes you don't want one where one normally would go.

I feel it's the programmer's responsibility to appropriately comment his code. Sure, everyone has their own style for doing it, but as long as the documentation is there, that's what matters. Besides, the best library designers will document their functions in entirely different places than in the source code. I can't see a compelling reason for some enforced/standardized comment setup built right into the compiler.

Well, I don't think any of the languages mentioned actually enforces a comment setup, but they do encourage a standard. I don't know if a self-taught programmer will on their own pick up good commenting and naming habits. It's too easy to give in to the temptation to name things "i" and "j" or introduce magic numbers into code without saying what they're for, but a good standard will force you to think about every function parameter and constant you put into the code, even if the language is by itself lax about it.

Quote from: GuyPerfect on December 24, 2013, 08:29:46 PM

By definition, you've got it backwards. A strongly-typed language prohibits operations on a value of inappropriate type. While it's true that C will automatically promote types within expressions (where applicable), it's still particularly anal when it comes to things like passing the wrong type to a function (even so far as to complain when you try to pass in a const array to a function that doesn't specify const for its argument).

This leads me to one of my minor gripes about C. You cannot cast a void * into a function pointer... but you can cast a void ** into a function pointer pointer.

Ruleslawyering has never been so ugly. As they say about C, when in doubt, add another layer of indirection.

Quote from: GuyPerfect on December 24, 2013, 08:29:46 PM
On the other hand, there's the bitwise approach:

x = x + 3 & -4;

<snip>

This is the reason I teach C and machine code after introducing programming concepts with BASIC. Call it premature optimization if you will, but I've seen many programmers do some ludicrously inefficient things just because they weren't fully aware of the impact it would have on program execution.

I do think that's premature optimization in 2013. :o

On the other hand, I think it's a good thing to teach some manual memory management, if only to demonstrate the techniques that have been developed over the years to make it less bug-prone, like scope-bound resource management and reference-counting smart pointers. C isn't so great for teaching those.

The Fifth Horseman

Quote from: Drauger9 on December 25, 2013, 06:25:29 AM
The Fifth Horseman. I'm decent at math and modest at logic LOL! I'm not going to lie, I'm a little slow when it comes to learning new things, but when I get it, I really get it, and tend to be very good at it once I've grasped it. It's the dedication that I'm weighing now. Is it something that I'd really stick with? Seems like it could be a huge undertaking just to learn the basics.
The opposite: It's easy to get your foot in the door, but becoming competent at it takes a while.

BTW, you'll want to grab some literature on algorithms and read a bit about computational complexity. Those are two items that will come in handy no matter what language you use. (CC at least as far as recognizing the complexity class of a given part of your program - trust me that it may matter a lot)

On a side note, the 2nd edition of Numerical Recipes in C is free and might come in handy some time (certainly did for me): http://www.nrbook.com/a/bookcpdf.php
Quote from: ROBOKiTTY on December 25, 2013, 06:52:46 AM
I do think that's premature optimization in 2013. :o
I do think that would depend on how often that particular operation is performed. :p
Also, some of us enjoy tuning our products for top performance. It's an addiction. :)

therain93

My own 2 cents as a non-professional programmer with a CS background who has to work with programmers in different situations -- there's no such thing as learning optimization prematurely. Sloppy, brute-force coding is simply awful to review and modify, and lack of efficiency stacks up -- I see it frequently from some of the old-school mainframe guys who taught themselves to stay "relevant".

And, from the way your conversation has evolved, I think you all have collectively proven the point that (generally speaking) better coders come from having good mentors who can push back and advise on proper conventions and efficiencies that someone teaching him/herself just won't necessarily be aware of.
@Texarkana - March 5, 2004 - December 1, 2012 -- Imageshack |-| Youtube
---------------------------------------------------------------------------------------

You don't know what it's like.... |-| Book One. Chapter one...

GuyPerfect

Quote from: ROBOKiTTY on December 25, 2013, 06:52:46 AM
This leads me to one of my minor gripes about C. You cannot cast a void * into a function pointer... but you can cast a void ** into a function pointer pointer.

Eh? This works just fine:

#include <stdio.h>

int MrFunc(int a, int b) {
    printf("I'm MrFunc with arguments %d and %d\r\n", a, b);
    return a + b;
}

int main() {
    void *MrVoid = (void *) &MrFunc;

    printf("MrVoid got %d and %d, and returned %d\r\n",
        5, 8,
        ((int (*)(int, int)) MrVoid)(5, 8)
    );

    return 0;
}


Quote from: ROBOKiTTY on December 25, 2013, 06:52:46 AM
I do think that's premature optimization in 2013. :o

Really? I'd say one should use the right operator for the job if one's available. I mean, you don't use a for loop with an addition in it to perform multiplication, do you? (-:

Drauger9

Thanks Horseman, I saved that book into my favorites folder as well. :)

Take care. :)

ROBOKiTTY

What I mean is this:


void *someFunc();
//...
typedef int (*FuncPtr)(void);

FuncPtr fp;
fp = someFunc; //nope
*(void **)(&fp) = someFunc; //okay

//or I do a union hack and do my penance
union {
    int (*FuncPtr)(void);
    void *ptr;
} forgiveMeGcc;
_STATIC_ASSERT(sizeof(forgiveMeGcc.FuncPtr) == sizeof(forgiveMeGcc.ptr));
forgiveMeGcc.ptr = someFunc;

GuyPerfect

I think I see where you're going with this. Let's take a look...

void *someFunc();

someFunc() is a function that returns a value of type void *. This is a function that returns a pointer.

typedef int (*FuncPtr)(void);
FuncPtr fp;


This is a type declared as FuncPtr that serves as a function pointer for a function with prototype int (void).

Variable fp is declared as such a pointer.

fp = someFunc; //nope

I'm not sure what the C spec says about attempting to evaluate functions directly by name, but I just tried it in gcc and it used the function's address. So basically, this line is the same thing as saying "fp = &someFunc;"

The reason this doesn't work is because of a type mismatch, what with C being strongly-typed and all. (-:

fp is of type int (*)(void), whereas &someFunc is of type void * (*)(void). These are incompatible types and therefore one cannot be assigned to the other.

*(void **)(&fp) = someFunc; //okay

&fp is of type int (**)(void), meaning a pointer to a function pointer. That's being cast to type void **, which is also a pointer to a pointer.

The expression is further dereferenced with that * on the far left, effectively saying "the value of fp if it were of type void *".

Since someFunc evaluates to a function pointer (again, I'm not sure if that's in the C spec), it can be assigned to a void *.

You'd have an easier time if you did this:

typedef void * (*FuncPtr)(void);
FuncPtr fp = &someFunc;


Since fp and someFunc represent the same prototype this way, the compiler won't get all indignant and ask "What's-a-matta you!?"

Twisted Toon

It has been about 8 years since I did any programming (at college) in C. Forgive me if my eyes glaze over...  :o
Hope never abandons you, you abandon it. - George Weinberg

Hope ... is not a feeling; it is something you do. - Katherine Paterson

Nobody really cares if you're miserable, so you might as well be happy. - Cynthia Nelms

Second Chances

Meanwhile, I have found it very interesting. Two of my kids are getting into programming, so I have been pondering if there is some guidance I can give them. I got into programming in a pretty idiosyncratic way back in the 80's, so duplicating my journey would be weird... hearing the various perspectives here is handy.