I picked up O'Reilly's Mastering Algorithms with C at the library, and I'm already having fun. It will get into recursion, Big O notation, linked lists, quicksort, encryption, and all that jazz, but right off the bat, it starts with pointers.
Immediately, some aspects of C# become clearer once I understand their precursors in C: the ideas behind reference types and passing parameters by reference, and the joys of automatic garbage collection and abstracted memory allocation.
The syntax of pointers in C involves a lot of punctuation, a subtle and nuanced application of asterisks and ampersands, which the book assumes I already know. My favorite search engine (code-named Sweetie, as in, "Hey, Sweetie, can you find...") turned up an online edition of The C Book, containing this excellent explanation of pointers.
Here's my understanding so far. Please suggest corrections if I've gotten it sideways.
| Code | What it means |
| --- | --- |
| `int *mypointer;` | mypointer's type is "pointer that can point to an integer" |
| `mypointer = &myint;` | mypointer is pointing to the location where myint is stored |
| `*mypointer = 7;` | the place where mypointer is pointing now contains a 7, so myint now equals 7 |
| `myotherint = myint;` | myotherint also equals 7 (but just a copy of 7, so changes to myint won't affect myotherint) |
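To check my understanding, here's a tiny complete program walking through those four lines (the final reassignment and the printf are my own additions, just to prove the copy is independent):

```c
#include <stdio.h>

int main(void)
{
    int myint = 0;
    int myotherint;
    int *mypointer;        /* a pointer that can point to an int */

    mypointer = &myint;    /* mypointer now holds the address of myint */
    *mypointer = 7;        /* write a 7 into that address, so myint == 7 */
    myotherint = myint;    /* copy the value 7; the two ints are independent */

    myint = 99;            /* changing myint afterward... */
    printf("myint = %d, myotherint = %d\n", myint, myotherint);
    /* prints: myint = 99, myotherint = 7 */
    return 0;
}
```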
So what? So now I'm clearer about reference types in C#. In C, I can pass a pointer as a parameter to a function, instead of passing a value. That means I'm passing the function a reference to a location in memory. If the function uses the pointer to change the stuff in that location, when someone else accesses that location, they will receive the changed stuff. Contrast this with passing a value, such as an integer, to a function, which actually passes a copy of the value. Any changes to the value are scoped within the function and are not detectable from outside. It's something I've known for a while, but now I know why.
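Here's a little sketch of that difference, using two made-up functions of my own:

```c
#include <stdio.h>

/* receives a copy; the caller's variable is untouched */
void set_by_value(int n)
{
    n = 42;
}

/* receives an address; writing through it changes the caller's variable */
void set_by_pointer(int *n)
{
    *n = 42;
}

int main(void)
{
    int x = 0;

    set_by_value(x);
    printf("after set_by_value:   x = %d\n", x);   /* x = 0  */

    set_by_pointer(&x);
    printf("after set_by_pointer: x = %d\n", x);   /* x = 42 */
    return 0;
}
```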
If you no longer need the piece of memory a pointer points to but you fail to de-allocate it, that memory stays held in reserve until the program exits: voilà, a memory leak. And because different data types take up different amounts of memory, writing a value through a pointer of a differently sized type can overwrite neighboring memory in unpredictable ways. I feel like I've been making peanut-butter sandwiches with a butter knife and just noticed that other people are wielding samurai swords: it looks really powerful and flexible, but I'm scared I'd cut off a finger. Now I get what the big deal is about the CLR's garbage collector (and how in some contexts it would be too restrictive).
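Here's how I picture the leak, as a sketch (the function names and the 100-int buffer are just my invention):

```c
#include <stdlib.h>

/* hypothetical: allocate a buffer and lose track of it */
void leaky(void)
{
    int *p = malloc(100 * sizeof(int));   /* reserve heap memory */
    if (p != NULL)
        p[0] = 7;                         /* ...use it... */
    /* no free(p): once this function returns, the address is gone,
       but the memory stays reserved until the program exits */
}

/* hypothetical: the same allocation, properly released */
void tidy(void)
{
    int *p = malloc(100 * sizeof(int));
    if (p == NULL)
        return;                           /* malloc can fail; check first */
    p[0] = 7;                             /* ...use it... */
    free(p);                              /* hand the memory back */
}
```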
I'm having fun getting "closer to the metal" and realizing the reasons behind some things I've taken for granted. I can't wait to get into the chapters on algorithms.