Books or Tutorials that explain Calculus from a programming perspective - language-agnostic

OK, I'm a self-taught programmer and have been for a couple of years. But in order to go beyond the programming grunt work (entry level: Tester, Web Dev, Commercial App Dev; all of which I am more than grateful for) and beyond conventional programming gigs (i.e. R&D, embedded devices, scientific computing and the like), I'm going for a BS in Computer Engineering. However, I'm not all that great in the calculus department. I've taken Calculus 1 before and had to drop it because of a job opportunity. But while I was in it, I've got to say that stuff like the chain rule for derivatives was really kicking my ass. So it seems I have a long way to go as far as the maths in school goes (although discrete math does look fun). Learning by relating known concepts to unknown ones seems to work well for me, and I was wondering whether there are any recommended materials that would make good self-study supplements for when I have to take calculus classes again.

There is simply no better book to learn calculus from than Stewart's Calculus, period. It is not aimed at programmers, but it will give you a thorough grounding in the principles. [Spivak's Calculus is good, but not as easy to learn from, and it takes a more theoretical standpoint. I wish I'd had Stewart's Calculus when I studied the subject.] I have a copy of Stewart's Calculus to hand; it's always the first place I look things up!
There is an accompanying web site here.
Have you looked at the free offerings at MIT OpenCourseware, for example: 18.085 Computational Science and Engineering I

Calculus is about the continuous world, while programming happens on a digital computer (the discrete world), so I don't think you can seriously learn calculus by considering only the discrete world. Some concepts in calculus, though, can be better understood by considering the discrete case.
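To make that link concrete: the derivative is a continuous-world idea, but the discrete world can approximate it with a difference quotient. A minimal Python sketch (my own illustration, not from any of the books mentioned above), which also shows the chain rule giving the exact answer:

```python
import math

def numerical_derivative(f, x, h=1e-6):
    """Approximate f'(x) with a central difference quotient.
    Shrinking h makes the discrete answer approach the continuous one."""
    return (f(x + h) - f(x - h)) / (2 * h)

# f(x) = sin(x^2); by the chain rule, f'(x) = 2x * cos(x^2)
f = lambda x: math.sin(x ** 2)
exact = lambda x: 2 * x * math.cos(x ** 2)

x = 1.5
print(numerical_derivative(f, x))  # both print approximately -1.8845
print(exact(x))                    # the chain-rule answer, for comparison
```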


Examples of cellular automata that perform real-world tasks

I've been reading about one-dimensional cellular automata and am intrigued with the idea that they can solve real-world problems.
However, I have not found a single example.
I'm not talking about approximations of predator-prey populations, since those just look like what we think a population should look like; I'm unaware of any real basis in reality.
Instead, I'm talking about something measurable. Is there an example that adds? Multiplies? Pattern-matches?
Or do automata that do these sorts of things require scads of special case rules that violate the simplicity of the concept?
This Wireworld computer computes prime numbers:
http://www.quinapalus.com/wi-index.html
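If you want a feel for how such a machine can work at all, the whole of Wireworld is four cell states and one update rule. Here is a minimal Python sketch of that rule (my own illustration, not code from the linked site):

```python
# Wireworld states: "." empty, "H" electron head, "T" electron tail, "C" conductor.
EMPTY, HEAD, TAIL, COND = ".", "H", "T", "C"

def step(grid):
    rows, cols = len(grid), len(grid[0])
    def heads_around(r, c):
        # count electron heads among the 8 (Moore) neighbours
        return sum(
            grid[r + dr][c + dc] == HEAD
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and 0 <= r + dr < rows and 0 <= c + dc < cols
        )
    new = []
    for r in range(rows):
        row = ""
        for c in range(cols):
            cell = grid[r][c]
            if cell == HEAD:
                row += TAIL                      # head becomes tail
            elif cell == TAIL:
                row += COND                      # tail becomes conductor
            elif cell == COND:
                # conductor becomes a head iff exactly 1 or 2 neighbours are heads
                row += HEAD if heads_around(r, c) in (1, 2) else COND
            else:
                row += EMPTY
        new.append(row)
    return new

# A straight wire with one electron travelling along it:
world = ["THCCCC"]
for _ in range(4):
    print(world[0])   # the HT pair marches rightwards each step
    world = step(world)
```

From just these rules, the site's authors build wires, diodes, and logic gates, all the way up to the prime-computing machine.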
I've been using cellular automata to create some very interesting fractal art:
With each iteration, I've magnified the original image and then applied a cellular automaton rule by hand. In theory, at least, it would be possible to write a computer program to replicate these images that I've created by hand.
My own profile picture is another example of a fractal that I have created using cellular automata - this fractal generation technique is very straightforward, but it could possibly be used to create very convincing fractal landscapes for video game terrain generators.
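The answer doesn't say which rule produced the art, but as an illustration of the general idea, elementary Rule 90 grows a Sierpinski-triangle fractal from a single live cell. A minimal Python sketch:

```python
# Rule 90: each cell's next state is the XOR of its two neighbours.
# Started from one live cell, it draws a Sierpinski triangle.
WIDTH, STEPS = 63, 32
row = [0] * WIDTH
row[WIDTH // 2] = 1
for _ in range(STEPS):
    print("".join("#" if c else " " for c in row))
    # wrap-around (periodic) boundary via Python's negative indexing
    row = [row[i - 1] ^ row[(i + 1) % WIDTH] for i in range(WIDTH)]
```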
Check out http://is.ifmo.ru/english/ (Most of the website is in Russian; you can use automated translation, I suppose.)
I know professor Shalyto personally. He has spent many, many years doing research on various applications of finite automata including cellular automata. In particular, he has worked with many very bright computer science students, perhaps some of the brightest in Russia, and they have created lots of various projects using finite automata in various ways to solve real-world tasks.
Professor Shalyto did some other very useful things, including his efforts toward promoting open project documentation as well as his perseverance in supporting computer science education in Russia. However, as far as finite automata go, I became convinced that they are good for nothing practical, except maybe compiler programming, ATM transactions, controlling production processes in large factories, and a number of other niche applications. And as far as cellular automata go, I became convinced, again from observing the efforts of Prof. Shalyto and his many talented students, that they (cellular automata) are basically good for nothing. Except, of course, for their mathematical beauty.
Inspired by Stephen Wolfram's work on cellular automata in the early 1980s, there was a surge of interest in applied use of CA algorithms. Before interest petered out after about a decade, quite a lot of articles were published demonstrating how CA (usually 1-D, binary) could be used for pseudorandom sequence generation, error-correcting codes, cryptography, FSM testing, signal processing, and a bunch of other stuff. These articles were generally just mathematical sketches, though, and there is little code that you could dig up to look at.
If you want examples that do something practical but are still small and easy to understand, I'd suggest random number generators. CA-based crypto systems dropped out of sight because they were found to be insecure and computationally inefficient. The simplicity of implementing RNGs, though, seems to have made them popular for hobby projects and I have seen several.
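As one concrete hobby-scale example (my own sketch, not one of the projects alluded to): Wolfram's Rule 30 is the classic CA pseudorandom source, and reading off its centre column gives passable toy randomness, though, as noted, nothing you should use for crypto:

```python
# A Rule 30 pseudorandom bit generator: iterate the CA from a single
# seed cell and yield the centre cell each step. Toy use only.
def rule30_bits(width=101, steps=64):
    row = [0] * width
    row[width // 2] = 1
    for _ in range(steps):
        yield row[width // 2]
        # Rule 30: new cell = left XOR (centre OR right), wrap-around edges
        row = [row[i - 1] ^ (row[i] | row[(i + 1) % width]) for i in range(width)]

bits = list(rule30_bits())
print("".join(map(str, bits)))
print(int("".join(map(str, bits[:8])), 2))  # pack the first 8 bits into a byte
```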
You said you aren't interested in simulations, but if you want to see CA used in a significant real-world application, look into traffic flow simulation. This is probably the area in which CA methods have come the closest to being accepted as a useful tool. Check out chapter 13 in the recent book, Traffic Flow Dynamics: Data, Models and Simulation.
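The best-known CA traffic model is the Nagel-Schreckenberg model, and it is small enough to sketch in full. A minimal Python version (the road length, car count, and slowdown probability here are illustrative values of my own, not taken from the book):

```python
import random

# Nagel-Schreckenberg traffic CA: cars on a circular road, integer
# speeds 0..VMAX, with a random-slowdown probability P ("human" noise).
ROAD, VMAX, P = 50, 5, 0.3
cars = {i * 5: random.randint(0, VMAX) for i in range(7)}  # position -> speed

def step(cars):
    new = {}
    positions = sorted(cars)
    for idx, pos in enumerate(positions):
        v = cars[pos]
        # empty cells between this car and the one ahead (wraps around)
        gap = (positions[(idx + 1) % len(positions)] - pos - 1) % ROAD
        v = min(v + 1, VMAX)              # 1. accelerate
        v = min(v, gap)                   # 2. brake to avoid the car ahead
        if v > 0 and random.random() < P:
            v -= 1                        # 3. random slowdown
        new[(pos + v) % ROAD] = v         # 4. move
    return new

for _ in range(10):
    road = ["."] * ROAD
    for pos, v in cars.items():
        road[pos] = str(v)
    print("".join(road))                  # watch phantom traffic jams form
    cars = step(cars)
```

Even this toy version reproduces the model's famous result: jams appear out of nowhere from the random slowdowns alone, with no accident or bottleneck.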

What are some highly-regarded books on (modern or historic) programming language design?

I greatly enjoyed Douglas Crockford's recent lecture series, particularly the talk which covered the history of programming languages. I'd like to learn about this subject in more detail.
Consider this question language agnostic. I'm not interested in books that teach programming. I'm interested in books which discuss decisions made during the design of one or more languages.
The following three are, IMO, the must-read books for any programming languages junkie :)
Project Oberon by Niklaus Wirth
Language Implementation Patterns by Terence Parr
Programming Language Pragmatics by Michael Scott
Every 15 years, the ACM puts on a History of Programming Languages conference (affectionately known as HoPL). The proceedings are of exceptionally high quality and are, unfortunately, available only behind the ACM paywall. (However, if you access them from a university, college, or school IP address, you should be able to get at them.)
For HoPL-III (2007), Guido van Rossum wanted to submit a paper about Python, but he wasn't able to meet the review requirements in time, so he published it in the form of a blog instead.
Several presenters also published their papers for free, in addition to the official conference proceedings. Also, several presenters gave the same talk again, at a different venue. For example, Guy L. Steele, Jr. and Richard P. "Dick" Gabriel repeated their "50 in 50" talk (which, as you can imagine if you've ever seen a talk by Guy Steele or Dick Gabriel, is not really a talk, more like multimedia performance art crossed with poetry slam meets Broadway), which presents 50 programming languages in 50 words each.
As @Missing Faktor mentioned above, not only Project Oberon but all of Niklaus Wirth's languages are tremendously well documented: Algol-60, Algol-X, Algol-W, Pascal, Modula-2, and Oberon.
Structure and Interpretation of Computer Programs. I have a print copy, but it's now available online for free:
http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-4.html#%_toc_start
The Design and Evolution of C++
http://www2.research.att.com/~bs/dne.html
Programming Language Essentials
Rationale for the Design of the Ada Programming Language:
http://www.amazon.com/Rationale-Design-Programming-Language-Companion/dp/0521392675
Although the book discusses the original version of the language, it still makes for interesting reading. For each design decision, rationale and discussion are included, both from the point of view of the programmer and that of the compiler implementer.
"Architecture of Concurrent Programs", by the late Per Brinch Hansen, includes a good overview of the design and rationale for his Concurrent Pascal language, which added monitors (and other things) to his Sequential Pascal, a proper subset of Pascal.
The big thing missing from Sequential Pascal is pointers. However, given the restrictions intended to be placed on Sequential Pascal programs, everything you can do with a pointer you can also do with an array index, and in a more secure way: "secure" in the sense that it is impossible (and checked by the compiler!) to do illegal things.
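To make the pointer-versus-index point concrete, here is a sketch of the technique in Python rather than Pascal (the names and fixed capacity are my own illustration): a linked list whose "pointers" are indices into preallocated arrays, so every dereference can be range-checked:

```python
# A pointer-free singly linked list in the Sequential Pascal spirit:
# "pointers" are indices into preallocated arrays, so every dereference
# can be bounds-checked rather than trusted.
CAP = 10
NIL = -1
data = [0] * CAP
next_ = [NIL] * CAP
free = list(range(CAP))        # pool of unused slots
head = NIL

def push_front(value):
    global head
    slot = free.pop()          # "allocate" a cell; fails loudly if the pool is empty
    data[slot] = value
    next_[slot] = head
    head = slot

def walk():
    node = head
    while node != NIL:
        assert 0 <= node < CAP  # the check a compiler could enforce on every access
        yield data[node]
        node = next_[node]

for v in (3, 2, 1):
    push_front(v)
print(list(walk()))             # [1, 2, 3]
```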

Does knowing a Natural Language well help with Programming?

We all hear that math at least helps a little bit with programming. My question though, does English or other natural language skills help with programming? I know it has to help with technical documentation, but what about actual programming? Are certain constructs in a programming language also there in natural languages? Does knowing how to write a 20 page research paper help with writing a 20k loc programming project?
Dijkstra went so far as to say: "Besides a mathematical inclination, an exceptionally good mastery of one's native tongue is the most vital asset of a competent programmer."
Edit: yes, I'm reasonably certain he was talking about the programming part of the job. Here's a more complete quote:
The problems of business administration in general and database management in particular are much too difficult for people who think in IBMerese, compounded by sloppy English.
About the use of language: it is impossible to sharpen a pencil with a blunt axe. It is equally vain to try to do it with ten blunt axes instead.
Besides a mathematical inclination, an exceptionally good mastery of one's native tongue is the most vital asset of a competent programmer.
From EWD498.
I certainly can't speak for Dijkstra, but I think it's impossible to cleanly separate the part where you're doing actual programming from the part where you're interacting with people. Just for example, even when you're working alone, it's crucial that you're able to understand (clearly and unambiguously) notes you wrote down about what to do, the nature of a bug, etc. A good command of English is necessary even when nobody else is involved at all (and, of course, that's unusual except on trivial tasks).
I don't know about causality, but the skill set required to write well overlaps quite a bit with the one required for programming: knowing how to plan, being able to keep a myriad of details consistent, being able to make things clear for a future reader, knowing how to organize your thoughts and the resultant product. That isn't to say that a successful author would make a good programmer, but a programmer with good language skills and the same logic/math/deductive skills is probably a better programmer than one with poor language skills; at least the code has a greater chance of being understandable.
Yes. Strong natural language skills help you to organize your thoughts in a coherent way that can easily be understood by others. That can help improve your code in everything from naming variables, methods, and classes to expressing the contexts of objects in your model. Practices such as pair programming require you to be able to communicate well with your partner in order to write good code. Techniques such as Domain-Driven Design emphasize using the domain language of the business in your code; natural language skills facilitate that. And there is a strong drive in the development industry toward more natural-language-like tools, e.g. many of the newer testing tools like RSpec, Gherkin, etc. are moving toward more natural-language-like syntax. One of the things many people like about dynamic languages like Ruby and Python is that the code tends to read more like a natural language.
Let me state what should be obvious: every healthy person above 12 knows at least one natural language. Moreover, every healthy person above 12 is able to generate and parse natural language, a complex and rich medium, and to express and understand an extremely large set of ideas. In general, people are not limited in their ability to discuss issues by their language, but by the kinds of things they have experienced and learned.
Having said that, there are several language-related skills that you might have thought about.
Writing style. You mentioned this specifically. Written language is different from spoken language, and way less intuitive. This is one reason people have to be coached in writing throughout their years in the education system.
Coding doesn't really involve writing. I mean, there are comments, but they can be rather laconic. Of course, the work of a programmer usually involves at least some writing of documents, and writing abilities do make a difference there.
Analytical skills. Analytical skills are a complicated (not to say fuzzy) concept. They aren't really about language, but insomuch as they are taught and tested at all, it's in the context of writing essays.
Analytical skills are obviously very important in programming. I am not sure that these are exactly the same skills required to write a good essay about Euthanasia or whatever, but as was previously suggested, they may be related.
Foreign language. For people whose native language isn't English, a certain command of English may be needed. Not in the coding itself (knowing what "while" means in English isn't really critical to understanding what it does in Java), but because much training and support material is available mainly in English (did anyone mention Stack Overflow?). The English requirement may differ depending on the country you are in and the company you work for, though.
Communication skills. Ahem. I was never exactly sure what this means. Maybe it's a cultural thing. I do suspect it's less about knowing a language and more about knowing people.
So to sum up: Dijkstra was a venerable computer scientist, but I am not sure he knew that much about language.
Programming isn't just about writing code. On any programming project of any size there will be the need for:
initial project proposal documents
design and architectural documents
programmers manual
users manual
training materials
communication with third party suppliers
etc.
On every big project I've worked on, I'd guess I spent at least 50% of my time on English-language documents. So yes, an ability to explain and express yourself well is extremely important. Does it lead to writing better code? Once again, I would say yes: the need to provide clear documentation spills over into the need to write better code, interfaces, et al.

How large a role does subjectiveness play in programming?

I often read about the importance of readability and maintainability. Or, I read very strong opinions about which syntax features are bad or good. Or discussions about the values of certain paradigms, like OOP.
Aside from that, this same question floats about in my mind whenever I read debates on SO or Meta about subjective questions. Or read questions about best practices and sometimes find myself or others disagreeing.
What role does subjectiveness play within the programming realm?
Sometimes I think it plays a large role. Software developers are engineers in a way, but also people. A large part of programming is dealing with code that's human-readable. This is very different from math or physics or other disciplines with very exact and structured rules. Here the exact structure and rules are largely up in the air, changeable on a whim, hence the number of languages in existence. And one person may find one language very readable, while another person may find their own language the most comfortable.
The same with practices. One person may not like certain accepted practices. I myself find splitting classes into different files very unreadable, for instance.
But, I can't say rules haven't helped in general. Certain practices have and do make life easier. And new languages have given rise to syntax and structure that make life easier. There's certainly been a progression towards code that is easier to read and maintain even given a largely diverse group of people. So maybe these things aren't as subjective as I thought.
It reminds me, in a way, of UI design. Certainly it's subjective, but then there's an entire discipline involved in crafting good UI and it tends to work.
Is there something non-subjective about the ideas behind maintainability, readability, and other best practices? Is there something tangible to grasp when one develops a new language or thinks of new practices?
Arguably your question is really about the distinction between programming, which is mathematical, algorithmic and scientific, and software engineering, which is subjective, variable and human-focused.
Great programmers are not necessarily great software engineers, and vice versa. The two skillsets, while not exclusive by any means, have less overlap than they appear at first. Their relative importance depends a lot on the project: a brilliant programmer working alone can turn out amazing examples of technical genius, and it doesn't matter that nobody else can understand or maintain it, because he's not going to share the code anyway. But move into an enterprise environment -- like corporate in-house software development -- and I'll gladly trade you ten "cave troll" geniuses for a mediocre programmer who understands the importance of readability and documentation.
It's been my experience that the world needs great software engineers more than it needs great programmers. Relatively few people in this day and age are writing software which is truly performance-critical (OS kernels, compilers, graphics engines, realtime embedded systems, etc), and the Internet allows mediocre programmers to quickly grab algorithmic solutions for problems they couldn't solve alone. But nearly everyone writing professional code has to work within a team. And team productivity rises and falls dramatically on the ability of its members to communicate effectively and distribute workload efficiently, two skills which are highly subjective and impossible to prove by rigid formula.
Most software engineering principles are built on experience rather than objective law. Much like the social sciences, we study, learn, adapt and apply -- but with no real guarantees of outcome. All we can say is that some things seem to work better than others in most groups.
I think, a lot of it is necessarily determined by how much our mind is able to process at one time. So it comes down to how much the language and tools enable a team or a developer to break down the problem into chunks that are meaningful by themselves, but not so large that it becomes too hard to grasp them. The common theme is the art of organizing information (in this case, the code, the logic, ...) But that's not so different from Maths or Physics, by the way.
Just as the best authors borrow from many styles, the best programmers keep a huge range of patterns in their mental arsenal. Slavishly following a few patterns and adhering to some absolute truth is both lazy and dangerous.
Put it another way, the day we rely on robots for code review is the day I quit.
It all depends on your point of view :-)
But to answer your questions, I think one way to view subjectivity is to recognize that software languages, tools, and best practices are a shared means of communication among individuals. Yes, a programming language is a formal way of instructing a computer how to behave, but a programming language may also be viewed as a way to define and communicate specifications to a high level of detail (the code is the ultimate spec, is it not?).
So as far as we may want to concern ourselves with the degree of subjectivity in software languages, tools, and best practices, I would say that the lack of subjectivity may indicate how well communication is facilitated.
Yes, individuals have certain proclivities that are expressed in their habits and tendencies, but that should not ultimately matter too much in the perfect platform for development.
Turning to my Maths PhD wife, I asked if there's any subjectivity in mathematics. Her answer was: yes, there is, mainly in the way we as humans achieve the answer.
If a mathematical proof is the result, how you get to that result can vary. If the dataset is large you may need to use a computer, which can introduce errors, and thus there is debate about whether that is the right approach. Or sometimes mathematicians disagree on the theory: one is trying to prove that x is true while the other is trying to prove that x is false.
I think the same thing exists in computer science. A correct answer is a program that runs correctly, but that definition of correct may be different for each project. Sometimes correct means no bugs. Sometimes it means running efficiently.
From here, programmers can argue how best to achieve the "correct" result. A good example of this is the FizzBuzz application. A simple answer would be just a for loop, but Enterprise FizzBuzz is also "correct" in that it produces the correct answer, yet it is generally laughed at as "bad" engineering due to its overcomplication of the idea (it was a joke app, after all).
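For reference, the "simple answer" version looks something like this (a minimal Python sketch):

```python
# The "just a for loop" FizzBuzz that Enterprise FizzBuzz parodies.
for i in range(1, 101):
    if i % 15 == 0:
        print("FizzBuzz")
    elif i % 3 == 0:
        print("Fizz")
    elif i % 5 == 0:
        print("Buzz")
    else:
        print(i)
```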
How large a role does subjectiveness play in programming? I'd say it's a very large part of what we do, simply because we are human and because there are multiple ways of getting the "correct" answer, so there is disagreement over which way is best.
Studies have been done showing that certain practices reduce defect rates in software. For instance, one study found a strong correlation between cyclomatic complexity and the probability of a module being fault-prone. Other studies show that the average effectiveness of design and code inspections is 55 and 60 percent, respectively. So it appears to be in our best interests to favor simplicity, check metrics, and do code reviews.
We're talking probabilities here, though. If I review your code, I'm not guaranteed to find 60% of your bugs. There are also few absolutes in software development; experienced developers know that the correct answer is generally "it depends." That said, there are a number of practices with objective data in their favor.
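As a rough illustration of the cyclomatic-complexity metric mentioned above, here is a crude Python sketch that estimates it as one plus the number of branching nodes in a function; real tools are more careful than this, so treat it as a toy:

```python
import ast

def rough_cyclomatic_complexity(source):
    """Crude estimate: 1 + number of branching nodes in the AST.
    Real metric tools count decision points more precisely; this is a sketch."""
    tree = ast.parse(source)
    branches = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)
    return 1 + sum(isinstance(node, branches) for node in ast.walk(tree))

print(rough_cyclomatic_complexity("""
def classify(n):
    if n % 15 == 0:
        return "FizzBuzz"
    elif n % 3 == 0:
        return "Fizz"
    elif n % 5 == 0:
        return "Buzz"
    return str(n)
"""))  # prints 4: one straight-line path plus three decisions
```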

Entry level computing text books [closed]

I am looking for some good textbooks in the public domain which could be used to teach computing to school kids aged 10-15. I couldn't find any by googling. Can somebody out there point me to good links? If textbooks are not available, any information on what is commonly taught to this age group as part of a computing curriculum would be helpful.
"Computing" is an awfully broad topic. Do you mean teaching them how a computer works (like on the inside), or how to use computer applications (word processing/spreadsheet/internet), or how to program them? I think all three would be good topics for 10-15 year olds.
My dad, a computer engineer, taught me much of the above on my own around that age (of course, it depends upon how motivated your students are). Mostly it was through experimentation and asking questions. I'll point out a few good resources that I went through when I was that age. While these books aren't public domain, they're not that expensive (you can purchase all the books I mention below for under $100 US at the time of this answer, cheaper than a single college textbook; the movies you could try finding at your local library). Note some of these are from the MS-DOS 3.x era of 10-12 years ago, but honestly, the basic concepts haven't changed that much. The IBM PC platform still has the same architecture; it's just been upgraded. Applications have changed, though.
How a computer works
You might want to teach about all of the different parts in a modern computer tower by opening one up and explaining to them what each part is and does, and how they hook together. It doesn't even need to work, you just need to be able to show them stuff. A good, simple book that explains the parts of a computer is How Computers Work by Ron White. I believe there is also a Magic School Bus episode on this ("The Magic School Bus Gets Programmed", Episode 50 From Season 4).
After that you may want to explain about the startup process, and how the computer first turns on, and doesn't know anything. Then you could explain about how the BIOS chip finds all the various things hooked to the motherboard, and then uses the hard-drive to start the Operating System (Windows/OS X).
There are some interactive programs that describe how all of these things work, although I can't think of any off the top of my head. There was an old learning program called "What's in that Box" that I used, but it's so outdated now that it's useless.
If you have really motivated students, you may want to introduce them to the electronic foundations of the computer: binary numbers and gates. A good (teacher-level) introduction to these things is Charles Petzold's Code. You can try explaining how computers talk about things using 1s and 0s. Also, there is a Bill Nye the Science Guy episode on this (Season 4, Episode 78).
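If you do go down the gates-and-binary route, a few lines of Python make a nice hands-on demo to go with the 1s-and-0s discussion (my own sketch, not an exercise from Petzold's book):

```python
# Toy logic gates, wired together into a half adder: the first step
# from 1s and 0s toward real arithmetic hardware.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two bits: the sum bit is XOR, the carry bit is AND."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} = carry {carry}, sum {s}")
```

Kids can check the printed table against adding on their fingers, which makes the point that the machine's arithmetic really is just gates.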
How to use applications
Honestly, I'm really not sure what to tell you here, but I imagine you can find lots of tutorials on this if you google. Explaining Word, Excel, etc. to students is beneficial and I imagine widespread.
NB: Teach students how to touch-type at that age. I didn't learn how to touch-type until I was in the middle of high school, and it was hard. By that time, I had already learned how to do stuff on computers and play games on the keyboard, and so had trained my fingers to go to all the wrong spots. If you catch them early, they'll do well.
How to program
The way I learned how to program at age 10 was this way, using a language called BASIC (BTW, that link gives lots of other ideas for learning how to program at a young age). I'm certain others will have good answers on this too. A book that I used to learn from after that was QBasic by Example, which a computer bookstore owner gave to me when I was 12 from his throw-away pile. That was my programming bible, and I read it cover-to-cover.
Note, though, that some students will want to do "cool" things when learning how to program; admittedly, how I learned would not be classified as cool. My old computer science prof from undergrad is involved in a program with inner-city kids that teaches them how to program robots using a language called LOGO, Sun's SunSPOTs, and LEGO Mindstorms NXT (depending upon the class focus). You basically give commands to the robot and it does them, and you can watch the robot do what you told it to do. Very cool, and interactive. It can get them thinking about programming, and how it's about telling the computer/robot what to do, and how you can be a "robot commander" or "computer commander". This is, essentially, what computer programmers do every day.
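On the LOGO point: Python's built-in turtle module is a direct descendant of LOGO, so kids can get the same give-the-commands-and-watch experience on screen without special hardware. A tiny sketch:

```python
# Python's turtle module is a direct descendant of LOGO: the child
# issues commands and the on-screen "robot" carries them out.
import turtle

t = turtle.Turtle()
for _ in range(4):        # walk a square
    t.forward(100)
    t.right(90)
turtle.done()             # keep the window open until it is closed
```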
The new children's show Cyberchase teaches critical thinking skills that are a foundation to programming skills.
Just some thoughts to get you started. I think many kids would appreciate a hands-on approach; most of those in the industry got started because of hands-on exposure and not rote book-learning. The above early education, along with some books, worked for me, most of that stuff I rattled off from memory.
Also check out Woz.org; Steve Wozniak (programmer/hardware engineer, designer of the Apple ][) now teaches grade-school children about computers.
Try How to Design Programs: An Introduction to Programming and Computing, by Matthias Felleisen, Robert Bruce Findler, Matthew Flatt and Shriram Krishnamurthi. The book uses a language called Scheme, which is freely available and designed to be used by students. The book is available online at www.htdp.org.
The book was designed to be used by high school and university introductory programming classes and is intentionally written to teach how to design a program, not just how to use the syntax of a particular programming language. It stresses things like how to design readable programs, thinking about the structure of your program before typing anything, and general programming concepts such as recursion and encapsulation.
Think Python would be my first choice for teaching programming. The book is free, Python is free, and some really good IDEs are free: Stani's Python Editor or Wing IDE 101. This provides a really good environment and set of resources for teaching programming to kids in that age range. If programming is a bit much for the younger kids, PyGame can be used to get simple graphics on the screen fast, and that is a great way to hold onto their attention span.
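To illustrate the "graphics on the screen fast" point, a complete PyGame program can be this short (assuming pygame is installed; the window size, colors, and speed here are arbitrary choices of mine):

```python
# A minimal PyGame program: open a window and bounce a square across it.
import pygame

pygame.init()
screen = pygame.display.set_mode((400, 300))
clock = pygame.time.Clock()
x, dx = 0, 4
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:   # close button ends the program
            running = False
    x += dx
    if x < 0 or x > 360:                # bounce off the window edges
        dx = -dx
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (0, 200, 0), (x, 130, 40, 40))
    pygame.display.flip()
    clock.tick(60)                      # cap at 60 frames per second
pygame.quit()
```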
EDIT: I just ran across a great book for introducing computer programming to kids (and other beginners): Hello World! Computer Programming for Kids and Other Beginners. This book uses Python as the programming language, and the projects are all simple games. Overall, this is a great book for introducing kids to computer programming.
In the UK schools tend to have some affiliation with Microsoft so:
Word
Excel
What is a network
Dinner-time Java class (if lucky)
What areas are you trying to focus on? MIT OpenCourseWare offers a few good open texts on different programming languages, though they are doubtless intended for an older audience. I think finding books on general computing specific to this age group will be a challenging task, but there are always good websites that the kids can explore addressing a variety of different topics.
As mentioned, unless you know the kids to be at a more advanced level (which many of that age are nowadays), I'd focus on word processing, web use (searching & responsible surfing) and things of that nature. Googling "introduction to for kids" will often yield useful websites on such topics.
Check out Squeakland. It's specifically about computers & teaching.
Not a book, but it looks like a pretty good resource.
KidsDomain.com
The C# Yellow Book by Rob Miles is a good, free resource for teaching students C#. It is used by the Department of Computer Science at the University of Hull as the basis of the first-year programming course.
For younger kids especially, ToonTalk is an awesomely cool introduction to programming. It even does concurrent programming from the very beginning. ToonTalk teaches foundational principles while being extremely engaging, with a 3D interface, animals to carry out computations, and so on. I recommend it very highly!