Why do newbie programmers seem to shy away from libraries? [closed] - language-agnostic

I've noticed many questions on here from new programmers that can be solved using libraries. When a library is suggested, oftentimes they respond, "I don't want to use X library." Is it the learning curve? Something else? Just curious!

A lot of new programmers are still working at a very low level of abstraction, learning the trade. That's something everyone has to go through. It takes a while to "move up the stack" so to speak.
Once programmers realise that they spend most of their time solving the same problems that someone else has already solved, and that the goal is to realise "business value", then they can really appreciate the value a good library brings.

When you're still learning the ins and outs of a new language, also having to learn how to use a 3rd party library can look like too much work. Also, libraries tend to be badly documented - or at least have documentation that seems totally opaque to a new(er) programmer.
So, faced with trying to solve problem X, being told "use a library" can sound a lot like "solve problem Y, THEN problem X".
(Also, their professors told them not to. I managed to get all the way through my undergrad in C++ without learning that the STL existed. Boy, did THAT cook my noodle.)

Some people, when confronted with a problem, think "I know, I'll use a library." Now they have two problems.
Seriously - this is a reasonable way for a newbie, already overwhelmed by new language, programming environment, paradigms, keystrokes, etc. to react to the suggestion to use a library. If you've got a solution, but it's not working, there are many potential sources of error; sorting through them is a challenge. Adding to them can seem irrational.
"Use a library" means find the library, download it, install it in your project, and call the necessary function. Not hard, if you're used to it (and there aren't corporate policies against it, and you have reason to trust the vendor, and the library itself has minimal dependencies, etc.). But if it's all new to you, when you ask a programming question and get back a system configuration answer, it can seem unhelpful (even if it is not, in fact).

Almost always it's because their professor has told them that they can't.
Sometimes it's just because they want to learn it themselves, but I'd say that's rare.

It's the learning curve.

Using libraries is probably one of the worst things a learning programmer can do. Instead of learning how to code, they're learning how to use specific APIs that other people implemented. I'm not saying that every programmer has to understand every single thing that they use, but programmers who know the ins and outs of a computer (digital logic, assembling op-codes, etc) usually have an edge over people who've started with something like Java Swing and are just throwing together libraries.
In production, this is a different matter of course. But I think the best course of education is to 'make everything' once, at least. Writing my own web application framework from the ground up really improved my programming skills and my ability to think in abstractions. That doesn't mean I'll use that framework if someone hires me to build them an application, but I know the strengths, weaknesses, and reasons behind the things that the 'giant' frameworks use, and it can help me choose a particular framework for a particular situation.

I remember shying away from several libraries simply because I wanted to see if I could create my own algorithm. I didn't want to just give up and let someone do the work for me but rather I wanted to learn from my mistakes. Once I had come up with a solution I was happy with, I looked into the libraries.
So for me it was simply wanting to see if I could do it.

I always have this urge to do it myself, but sometimes I can see my own limitations.
I just recently downloaded a library to create PDF documents, but that's pretty much the only time I can remember doing so.
At least for me, (trying to) do things myself is my way of learning.
My impression is that many newbie programmers wouldn't consider it their own work if they were to use someone else's libraries.

I don't think that this is necessarily a bad thing. Using libraries is great; it saves time, effort, bugs, etc. However, you learn very little in the process, and for new programmers, learning is the goal. To answer the question, I think that they tend to shy away from libraries simply because they are not used to using them, and perhaps they don't know that they exist.

For many poorly documented libraries that are either implemented loosely or in languages that don't allow you to control containment and visibility very well, it can be quite difficult to guess just how the library is supposed to be used.
After you've used it for a while, you get used to the quirks, or you read other source code that teaches you the right way; but until then it can be pretty irritating to use a poorly put-together/designed library (or even a well-designed one that isn't terribly well documented).
If you don't have the source code to the library, that's another problem - you have no way to fix it yourself and keep your program working. This is much rarer these days, but it still happens in the case of a purchased library.

Most of the points have been covered already (for me the main one is the learning curve), but one other factor I think plays a part:
Because learning about a library is less exciting than coding the same functionality yourself.

More libraries = fewer billable hours.

I think there's a lot of time that needs to be invested in understanding the library's purpose - yes, a learning curve, but it's more that newbie programmers probably don't know what they need until they have a lot more experience.

Because it's fun.

Because part of maturing as a developer is learning to quickly identify which problems can be solved by a library or existing solution, and which need personal attention.

When you're trying to learn how to do things, anytime something is accomplished "magically" by calling AwesomeClass.doAwesomeStuff(), you end up giving away a portion of control. When you are "new" and don't know what you're giving away or why, it can be unsettling. This was my primary knock against Rails when I was first learning it. So many things just "worked" and I didn't know why without digging through lots of Rails source (which I generally didn't have time to do).
At least, that's my take on it.

The same reason that more experienced developers do -
Because it can often be as difficult to learn how to use a library as to write the part of it you need yourself. And at least then you can understand how it works when it doesn't do what you expect.
An experienced developer simply has practice at understanding how to use libraries, so is more likely to consider one. For an inexperienced developer, it's one more thing to learn...

I'm a programmer, not a psychologist! :)
It was a long, long time ago for me, but it was because I wanted to learn and experience. I didn't want to use something I did not understand, so if I didn't think I understood the library and could program it myself, I tried not to use it. There might have been a bit of fear too; programming gives you a feeling of control, and using a library is like giving away this control.

Answer from a noob -
"I am not sure how to use the libraries or even how to access them or how it works"

Libraries often come with the overhead of learning some API and its paradigm. It can get complex fairly quickly, and I can easily understand that beginners would prefer something a bit more within their comfort zone. From my experience, most libraries & frameworks seem to do a great job of abstracting some tedious routine, but when I need to either extend that functionality or use it in a way that's not intended, it can be a handful.
I think it's one of those things where "practice makes perfect".

Well, the newbie's purpose might be more about solving the problem than about having a solution. Perhaps what they really want to do is figure out how to solve the problem. I mean, if they're still heavily in the learning phase, it's quite possible they don't want easy answers handed to them.

I think the professors want them to stick to the basics. When I graduated from undergrad school, I knew C++, Java and some other languages, but had no clue about the libraries and frameworks being used in companies. It was like: do you know Java? Yes. Can you write a servlet? No.

Speed demons rarely use 3rd-party libraries, and new programmers are usually looking to squeeze every ounce of speed out of their code. I think they feel that if they don't have control over their code, they can't get the performance they are looking for. At least, that's why I avoided libraries when I first started to program.
I remember programming my first DAL and avoiding all the other free libraries out on the web because I wanted my code to perform at top speed. Later, I discovered that usually it's not the code that's the bottleneck - it's actually the database.

Some open source libraries are buggy or not as efficient as others.

In my eyes another factor is that additional libraries add complexity. Programs tend to get harder to understand, harder to maintain, and buggier as they get more complex. I think what makes new programmers in particular shy away from libraries is that adding library code increases complexity more than adding your own code - simply because understanding how the library works is still beyond their grasp. So it seems to be a problem of both skill and psychology.

I think some more fundamental issues can be recognized as deterrents to using existing libraries.
Part of being a "newbie programmer" is a lack of exposure to libraries. If you don't know they exist, how do you know to use them?
The number of options available. Let's say I'm really interested in learning more about MVC, but if I have to choose between CakePHP and Smarty and Zend and ... well, you can quickly see the gears turning to find a way to achieve the goal without investing the time experimenting. Take a look at Freshmeat or SourceForge to get a better sense of the daunting selection of libraries available.
Questionable support combined with sketchy/outdated documentation for the libraries. Do I want to use a tool that may no longer work, or may be abandoned in the future? It is likely that my project will evolve, and the library's project will evolve too. Will its usefulness last the lifetime of my project, or will I be required to re-do this work again?

Using a library requires you to understand the relatively complex design of the library, something that new programmers might not have mastered because all they've ever written is simple/procedural/single-purpose code. For example, to an experienced programmer standard design patterns like template method, observer and command seem pretty obvious, but to a newbie it all just seems like magic and/or unnecessary complexity. For me the turning point was when I got good enough to grok design patterns and write some basic reusable code.
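To make that concrete, here is a minimal observer sketch in Java (every name in it is invented for illustration). Registering a listener once and then having it fire "by magic" on every change is exactly the kind of indirection that looks like unnecessary complexity until the pattern clicks:

    import java.util.ArrayList;
    import java.util.List;

    interface Listener {
        void onChanged(int newValue);
    }

    class Counter {
        private final List<Listener> listeners = new ArrayList<>();
        private int value;

        void addListener(Listener l) {
            listeners.add(l);
        }

        void increment() {
            value++;
            // The "magic": every registered listener is called back here,
            // far away from the code that triggered the change.
            for (Listener l : listeners) {
                l.onChanged(value);
            }
        }
    }

    public class ObserverDemo {
        public static void main(String[] args) {
            Counter counter = new Counter();
            counter.addListener(v -> System.out.println("value is now " + v));
            counter.increment(); // prints: value is now 1
        }
    }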

It's been a long time now, but when I came out of college, I knew nothing of libraries. This was in the days of mainframes and mini-computers. Our college had a VAX and the managers were paranoid about students hacking the system, so didn't allow us to even see the library manuals. So, when I first came out of college, I didn't even think of libraries being available.

I am sure there are a lot of reasons why the newbie doesn't want to use the new library. But wouldn't this be a good opportunity, if you have enough time, to show them what the advantage of using the library is? With the people I work with, I will usually provide an example of why something is better than their approach. It helps them learn and mature as a programmer.

It happens that noobs use libs without knowing it, but when they must import/add one that is poorly documented, there is a fear of the unknown. That happens mostly with compiled langs!
In interpreted languages (or those compiled in real time), mainly when there is a console, such fear is almost non-existent, since you can require the library and see if it fails, or call a method and see what it returns.
Consoles are tools of bravery!

Related

Learn and understand the full stack

I have been struggling with an idea for a few weeks and wanted to see if someone can help me out here.
Programming today is full of abstractions, and people who do not understand the abstractions do not truly understand the reasoning or design that went into building that abstraction/layer/framework, and will struggle as soon as they step outside their comfort zone.
I was wondering if there is a learning resource that goes about teaching programming in an incremental fashion. This will lead to understanding the full stack.
take a small problem
implement a simple solution
talk about the solution and the designs used
convert the solution into a framework or utility of some sort
now extend the problem space and repeat from step 2.
This way when someone then picks up any framework/library, they can easily visualize the problems the framework is trying to solve, the design decisions taken and the reasons thereof.
[Added to clarify the intent]
Based on the answers and comments below, I want to clarify that I want to move further up the stack. Building your own ORM to understand ORM better, same goes for ActiveRecord, IOC container, data binding, templating engine, and the host of other magic/glue/plumbing we use day-to-day.
Thanks.
Here's what I recommend: have a brush with assembly (just one book or one month is enough). Have a good strong review of C++ (hopefully it will teach you some C as well). Now the world is yours. Python is implemented in C/C++, Objective-C is pretty close to C++, .NET is built on C++ and C#/VB.NET, and the Windows API is oriented around C.
I picked C# as my abstract language of choice after this by the way.
Read the source. It is a good idea to build something you want to understand, but you can enhance your understanding of concepts significantly by looking at how something is built. This is especially true for infrastructure pieces (ORM/DI/Templating) which you seem to be interested in.
Get the software to build on your machine, attach a debugger and trace through the code. This is pretty easy for C#/Java with a good IDE. For dynamic languages like Python and Ruby, it takes a good editor and a lot of grepping.
If it is a good software package, it will usually have tests. Tests are a great place to start digging into code. They usually make clear the intent of the code, and also provide you a logical starting point to peel off the layers and actually peek under the hood.
Build a fully functional compiler from scratch in a systems language like C or C++. Maybe it isn't the full stack, but it's a large part of it. This is something I want to do as well. If only I could find the time and space.
The best example of the sort of learning resources I am seeking is the MIX session by Rob Eisenberg on "BUILD YOUR OWN MVVM FRAMEWORK". It goes step by step on explaining the pattern and also implementing it at the same time, attacking one problem area at a time.
http://live.visitmix.com/MIX10/Sessions/EX15
Hope there are more out there.

Help me choose a web development framework/platform that will make me learn something

I'm having a bit of an overload of information these past two days.
I'm planning to start my own website that will allow local businesses to list their items on sale, and then users can come in and search for "Abercrombie t-shirt" and the stores that sell them will be listed.
It's a neat little project I'm really excited for and I'm sure it'll take off, but I'm having problems from the get go.
Sure, I could use ASP.NET for it - I'm a bit familiar with it and the IDE support for ASP.NET pages is second to none - but I feel this is a great chance for me to learn something new, to branch out a bit and not regurgitate .NET like a robot.
I've been looking and asking around but it's all just noise and I can't make an educated decision.
Can you help me choose a framework/platform that will make me learn something that's a nice thing to know in the job market, but also nice for me to grow as a professional?
So far I've looked at:
Ruby on Rails
Kohana
CakePHP
CodeIgniter
Symfony
But they are all very esoteric to me, and I have trouble even finding out which IDE to use that will let me use auto-complete for the proprietary keywords/methods.
Thank you for your time.
Have you considered ASP.NET MVC?
You will learn MVC architecture and client-side programming with javascript and jQuery, in a nimble platform that produces clean markup. If you are used to ASP.NET, you will be surprised at the clarity of design.
The NerdDinner tutorial is the go-to guide for getting started.
I suggest you try the whole Java eco-system.
Both similar and different to the .NET world, a lot of open-source frameworks.
Eclipse is a very good free IDE.
Also, why not go with GWT for the client side?
In any case - enjoy!
Because nobody's mentioned it yet, I'm gonna have to suggest Rails.
I'm sure you're going to hear arguments for each framework you mentioned (and probably some you didn't), but I'll be brief and explain why I got into Rails, having tried several of the others on your list:
It's beautiful. Coding in Ruby actually makes me happy. I find that I'm removed from a lot of the tedium I've come to associate with large projects, and it's totally changed my coding career - I would literally hate to think how much less contented a programmer I'd be had I not taken the plunge. My only suggestion here would be to just try it.
Besides the succinctness of Ruby, Rails got me thinking in a more modular, logical and maintainable manner. As well as the great community, I really subscribe to the principles Rails stands for (especially the emphasis on DRY code), but all of the frameworks you mentioned have their recognized (and controversially disputed) pros and cons.
Each one is both similar and unique in varying degrees and aspects, and ultimately it's up to you to decide - Do some research; deliberate as to whether you're looking for huge libraries, a strong community, cutting-edge updates or assurances in tight documentation. Figuring out what you hope to gain from a framework is a great way of choosing the right one. But you should go for Rails anyway ;P
Good luck!

Beyond simple coding: Where to go from here? [closed]

I've been coding since my early teenager years. I started out with HTML, went on to PHP/MySQL (created my own forums, social networking sites, etc..) and then branched out into more traditional languages such as Java and C++ (also picked up a little VB .NET in high school). I'm most familiar now with C++ as that is the language of choice taught to us in Computer Science II (skipped Comp Sci I) at RPI. I now feel like I have a very good knowledge of how to program from this class (the homework assignments were brutal).
It is now summer and I am interning at a company that is providing me some level of new experiences with programming, but I want more. I want to feel the incredible satisfaction that I got out of my computer classes at RPI whenever I finished a long assignment (generating every possible wordsearch from a set of words to include and exclude comes to mind).
My question is: where do I go from here? I tried following tutorials online, but they were all too simple. I then tried heading over to SourceForge and helping out with various projects. But as soon as I downloaded the code, I got lost in its complexity. I have never created actual software of a real-life, applicable scale, and I don't know where to get started. How do you transition from programming knowledge to the actual creation of software?
I'm also open to learning new languages (javascript/jQuery, ASP .NET, C#, Python), or using new libraries and/or frameworks with c++ (we only used the STL in Comp Sci II) and other languages. Again my question is, where do I go from here? And I am welcome to all possible answers.
Thanks Much,
Michael
You could join a big open source project, you'll learn a lot and accomplish something cool.
EDIT after comment: If this is too complicated for you, try starting a "real" project. I say "real" because it should be something that you want to do, not a "create a blog" or "how to do a loop". For instance last year I created a fully functional project management system that I now use for some of my projects. If this is real, you will be more motivated and you will want to get this done.
Pick the technology you want for it. I'd recommend Ruby on Rails because it's awesome and full of interesting concepts that will improve the way you code over time (DRY, RESTful, MVC...)... but you can pick whatever you want.
Try to create your project using what you know and basic tutorials. You will get stuck and have to learn some more in order to get the features you want going. To me that's the best way to improve the way you code and general programming knowledge.
Since you'll start the project from scratch, you'll see where to start and how it will evolve. Take the project management tool: we started with "a project has todos" and we ended up with all kinds of other features, such as a complex calendar, a full Ajax interface, an embedded chat...
Once you see how you did this, try doing the same with a friend of yours to get a feeling for teamwork in development. Learn how to use SVN, Basecamp... learn about software development processes (Agile!), pair programming...
There's a lot to experience! Then you could give open source another try.
Hope that helps
I'd recommend trying a pure functional language, such as Haskell. It's a completely different way of looking at programming, and I found it very satisfying.
I recommend the book Real World Haskell for learning it.
Edit: In response to comments, my interpretation of the question is where can he go from here as a programmer. Functional programming is a logical direction. For someone learning functional programming for the first time, Haskell is nice because it doesn't really allow for imperative programming practices. Furthermore, it has basically all features you will find in other functional languages, which means it will be easy to pick up other functional languages.
I would suggest looking at Project Euler. It's a great and fun way to learn a new language, and it does provide that level of satisfaction when you solve the problems. They offer a huge range of problems at all levels of difficulty.
I think the best way to improve your skills as a programmer is to do what programming was intended for: solve problems. I'm sure you've been playing around on your machine and at one time thought to yourself "I wish there was a program that did this..." or "I wonder if there are any programs that do X..." Instead of just googling to see if someone else wrote it, write it yourself. Start with something small, and gradually make it more complex. Add features. Allow yourself to fail, and when you do, ask yourself (or us at stackoverflow) how to overcome that obstacle. Once you have the basics of programming, everything else is just making big things out of littler things, and the little things are usually pretty easy.
When you are making something you know you will actually use, it's more fun and more gratifying when you finish.
Do you understand all of the following and how they work?
Linked lists (single and double)
Sequential vs. binary search
Binary trees
Stacks
Red-black trees
Algorithmic complexity and big O notation
Recursion
Hash tables and hashing algorithms
If you don't feel completely conversant in any of these, take a class in data structures and algorithms.
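(As a quick self-test on the first couple of items in that list, here is a bare-bones singly linked list with a sequential search, sketched in Java; everything in it is invented for illustration:)

    class IntList {
        private static final class Node {
            final int value;
            Node next;
            Node(int value, Node next) { this.value = value; this.next = next; }
        }

        private Node head;

        // O(1): new nodes are inserted at the head.
        void push(int value) {
            head = new Node(value, head);
        }

        // O(n): sequential search walks every node in the worst case.
        boolean contains(int target) {
            for (Node n = head; n != null; n = n.next) {
                if (n.value == target) return true;
            }
            return false;
        }
    }

    public class IntListDemo {
        public static void main(String[] args) {
            IntList list = new IntList();
            list.push(3);
            list.push(7);
            System.out.println(list.contains(7)); // true
            System.out.println(list.contains(4)); // false
        }
    }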
It seems like you've experienced many languages, but have not created a real-world application. Creating software in any of those languages will bring you to a new level.
PS: Creating software is much more than just knowing how to mess around with a language.
Check out Design Patterns.
I believe that's beyond mere simple coding.
You could offer your abilities to a non profit or a friend or someone who needs a website or program built.
Tell them you can do it for free and then just go for it. As you start to make it you'll start to see what you need to learn.
For example, a bit ago I had the program Peel (www.getpeel.com) on my Mac, but I sold my Mac for a Windows machine as I am traveling.
Seeing that there was nothing for Windows (or nothing I could find) like Peel, I built my own in PHP, and am up to a fourth rebuilding of it as I learn better ways to do each new thing.
You've got to find a particular type of functionality that interests you. For me it's been basic socket programming and making my own protocols. I got that feeling when I was able to create a functional file uploading control that worked in conjunction with a file upload service on the server which I had created. The protocol handles authentication, chunking and hash comparison. Sometime soon I plan on incorporating file-resume functionality as well.
Well, my advice would be to find a problem/project you are interested in and try to code it. Tackling a real problem is something that drives you. Find something not too complex but not too simple. Something to try when starting a new language is to write some tools. You can, for example, write a Python script that prints metrics on a C++ project, or a tool that extracts some statistics from a website you use, etc. When I was learning programming, game programming was quite fun (I was young :-) ). You can try to program a simple game using a 3D engine like Ogre3D, for example. Participating in an open source project is great too, but as you stated, perhaps a bit overwhelming for now ;-)
Find something you like and that has a REAL, useful goal for you. You will strive to solve the problem and learn a lot along the way!
I think the transition from the kind of finite, tractable problems that are part of class projects (and some internships) to "real world" projects is quite tough. It sounds like language syntax, getting bugs out of code, etc. are things you are comfortable with.
When we come to larger projects - probably with many release cycles, with developers working together - different kinds of problems emerge. I remember being shocked when I saw my first big project plan: what a small proportion of the time was actually writing code!
The thing I found helpful as a junior was "sitting by Nelly": I was lucky enough to work for enlightened employers who put me with very experienced and helpful developers - sometimes to work with, sometimes to work alongside. I reckon that helped me get over the hurdles you describe.
So, I reckon you need to find employment of this kind. Be less concerned about specific technologies, salary etc. Look instead for the culture and level of responsibility you would have.
I think you should really start a "real world application" as suggested here. You will see that a real project requires a bit more than just knowing the language.
I suggest you find some simple problem you had issue with, and make a software that solves that issue.
For example, maybe you want some kind of software that checks RSS feeds every 5 minutes and shows a little pop-up as soon as a new item comes along. Or you want a program that will let you do simple presentations by just dragging a few pictures in.
Then whatever you choose, start by making the simple solution to the problem and extend from there (like do different settings, extra features etc.)
I've looked for the answer to this question for a while now. Almost all of the suggestions are either "help an open source project" or "build something that interests you". I'm starting to realize that those answers are vague, because what counts as 'interesting' or 'valuable' experience, and how to get started on it, is completely subjective. Also, there's only one way to get real experience, which is to work on real projects. It's difficult, but find a project - personal, open source, or otherwise - that will increase your knowledge in a technology or platform that you think will hold your interest. Then just dig in. It doesn't have to be of earth-shattering importance, just valuable in relation to your own goals.
There is no magic bullet transition from academic puzzle solving programs to real world applications. The best way to learn is just to jump in head first. It will take you a very long time to learn what you need to if you only ever look at your own code. You need to be looking at code written by professionals and struggling to understand why it works the way it does until you do understand it.
It seems overwhelming at first, but you will quickly start to see patterns if the application is at all logical. Well written code will be separated in logical ways, so you should be able to pick it apart one layer at a time.
For example, you could try a bottom up approach where you try to understand how the database interactions are handled before looking at the code that uses the database layer. You keep going upward until you get to the GUI event handlers.
Large enterprise applications can be even harder to understand because there might be a lot more than one executable, or component. Try to stay focused and learn what the component is responsible for doing, and then pick it apart a piece at a time.
You will see that there are not just patterns at the function and class level, but at higher levels as well. This makes it simpler to understand what is going on when you understand those patterns.
Try to find an internship or co-op position. I was in a similar situation after my first few courses. I took a co-op position sophomore year and I learned so much more on the job than I did in class. Class is great for teaching you theory and the basics. I learned C# on my first project on my co-op and that got me my second co-op position at another employer (wanted to see how it was working at a small company).
This past spring, I accepted a position at my first co-op employer, reworking my first major project I had started on my co-op. I have a list of side projects I also want to complete, which will help round out my skills, as well as learn some other languages.
So my suggestion is to try finding a job where you can have a great mentor. On my second co-op, I learned a lot of the coding standards that I code against from my supervisor. He was a great teacher, really had some great input, and explained why things should be done certain ways.
You will almost certainly go nowhere unless you find something that interests you. Figure out what is interesting, and then how to write software involving it.
The only way to begin is to begin! There is really no other way... The best answers you get would always tell you this. You have the knowledge, now put it to work!

The "You're Doing It Wrong!!" feeling

NB - This question is not a stab at RoR or at Redmine's plugin system
I have been working on a custom plugin for Redmine, a project manager platform built with Ruby on Rails (RoR). Now I am not really a RoR or for that matter, a Ruby guy. I have managed to write the plugin and everything works, despite being a mess behind the scenes. The "You're Doing It Wrong!!" feeling is coming from the fact that I know RoR is a highly-regarded framework, so there must be simple, better, elegant, [insert best-practice term here], ways of doing what I have done. I am not too concerned with making this code as perfect as possible, for the sole reason that this is only used internally, and really, people only care that it "works" (I am the only one who will look at the code).
What words of wisdom can be passed on to suppress the "You're Doing It Wrong!!" feeling? I am pretty sure if this feeling is present, then it is probably true that the task at hand is being done wrong.
What is the problem? The “You’re Doing It Wrong!!” feeling is the essence of our existence.
I read a piece of advice a while back that helps me in such situations. It can be paralyzing to try to do something the "right" way when you don't know anything about it.
So if I have to build, say, a customer-management application, my first step might be:
Build a really crappy customer-management application.
It's hard to build a good one, but it's easy to build a crappy one that does a thing or two. And while it's hard to know when something is good enough, it's pretty easy to know when something is crappy.
Once I have the crappy one in front of me, I can clearly see what needs improvement. And it's easy to change it, because the crappy one actually does something; so, using the TDD principle, I improve it while keeping it working. Eventually the crappy application evolves into a good one.
Many of the applications I've worked on that ended up crappy started out attempting not to be...and thus were overly complicated and hard to change, debug, and improve.
Don't suppress the feeling! That this is wrong feeling is the key to innovation. Just ask yourself "if this is wrong, then what would be better?"
Don't think of it as "You're doing it Wrong!!" Think of it as "What's a better way to do this?" Best case you end up doing it more efficiently, worst case you discover something that may be useful on your next project.
I have never finished a piece of code that I am 100% happy with. But then, I don't believe there is a single (non-trivial) piece of code in the World that can't be improved upon.
It is the developers who always think they have delivered great code that I worry about.
They say writers should write something every day, whether it's good or not.
Marc Raibert's advice is if you can't write something good, write something bad, and then polish the heck out of it.
My experience with programming is that only when I've done something several times over do I finally start to get good at it.
So don't expect perfection.
RoR, like a lot of other frameworks, makes it really easy to do things that it anticipated, and quite difficult to do things that it didn't. This is true of almost all frameworks that aren't custom frameworks you designed specifically for your project. The fact that you had to do a lot of messy stuff behind the scenes to do something doesn't surprise me at all. So, if you are doing something that wasn't anticipated, then it's perfectly natural to feel like you are doing it wrong, because in the eyes of the framework designers, you shouldn't be doing that at all.
In many cases RoR employs 'syntactic vinegar' precisely to give you this feeling that there is a better way. You should probably look at some other plugins to get a feel for the conventions used. Generally if you follow the convention with RoR, it feels right.
Specifically with Redmine, the plugin system is still new and we are trying to work out the best practices. Since Redmine (and Ruby on Rails) are Open Source, I'd suggest bringing up your feelings to the project's forums and see what people think. Others might agree with you and can help to improve the systems.
Note: I'm one of the core developers of Redmine, helped build the Redmine plugin system, and have about a dozen Redmine plugins released. And even I feel things are backwards from time to time.
If you look back on any piece of code you write today a few years in the future, chances are you'll be shocked at how bad it is. Put another way, it will show how much you've progressed over time. The feeling you are getting is part of continuous learning. Start by getting it right and making it stable. Elegance comes with time, and fashion changes in computer programming style the same as with everything else.
"What is the problem? The “You’re Doing It Wrong!!” feeling is the essence of our existence."
Perhaps not really related, but :
"A programmer is someone who makes a living out of solving the problems that were caused by the solutions he invented before."
The first time you work with a language, framework, etc, you're going to do something "wrong." As you develop other applications, or make tweaks & improvements to your original application, you'll find out what needs fixing, or what you could do better.
On every project, I almost always feel like I've done something sub-optimally, or not perfectly, but when it comes down to it, the code works, allows people to do what they need, and is in a way, a success.
So keep writing and keep learning!

What's your most controversial programming opinion?

This is definitely subjective, but I'd like to try to avoid it becoming argumentative. I think it could be an interesting question if people treat it appropriately.
The idea for this question came from the comment thread from my answer to the "What are five things you hate about your favorite language?" question. I contended that classes in C# should be sealed by default - I won't put my reasoning in the question, but I might write a fuller explanation as an answer to this question. I was surprised at the heat of the discussion in the comments (25 comments currently).
So, what contentious opinions do you hold? I'd rather avoid the kind of thing which ends up being pretty religious with relatively little basis (e.g. brace placing) but examples might include things like "unit testing isn't actually terribly helpful" or "public fields are okay really". The important thing (to me, anyway) is that you've got reasons behind your opinions.
Please present your opinion and reasoning - I would encourage people to vote for opinions which are well-argued and interesting, whether or not you happen to agree with them.
Programmers who don't code in their spare time for fun will never become as good as those that do.
I think even the smartest and most talented people will never become truly good programmers unless they treat it as more than a job. Meaning that they do little projects on the side, or just mess with lots of different languages and ideas in their spare time.
(Note: I'm not saying good programmers do nothing else than programming, but they do more than program from 9 to 5)
The only "best practice" you should be using all the time is "Use Your Brain".
Too many people jumping on too many bandwagons and trying to force methods, patterns, frameworks etc onto things that don't warrant them. Just because something is new, or because someone respected has an opinion, doesn't mean it fits all :)
EDIT:
Just to clarify - I don't think people should ignore best practices, valued opinions etc. Just that people shouldn't just blindly jump on something without thinking about WHY this "thing" is so great, IS it applicable to what I'm doing, and WHAT benefits/drawbacks does it bring?
"Googling it" is okay!
Yes, I know it offends some people out there that their years of intense memorization and/or glorious stacks of programming books are starting to fall by the wayside to a resource that anyone can access within seconds, but you shouldn't hold that against people that use it.
Too often I hear the googling of answers to problems being the target of criticism, and it really makes no sense. First of all, it must be conceded that everyone needs materials to reference. You don't know everything, and you will need to look things up. Conceding that, does it really matter where you got the information? Does it matter if you looked it up in a book, looked it up on Google, or heard it from a talking frog that you hallucinated? No. A right answer is a right answer.
What is important is that you understand the material, use it as the means to an end of a successful programming solution, and the client/your employer is happy with the results.
(although if you are getting answers from hallucinatory talking frogs, you should probably get some help all the same)
Most comments in code are in fact a pernicious form of code duplication.
We spend most of our time maintaining code written by others (or ourselves) and poor, incorrect, outdated, misleading comments must be near the top of the list of most annoying artifacts in code.
I think eventually many people just blank them out, especially those flowerbox monstrosities.
Much better to concentrate on making the code readable, refactoring as necessary, and minimising idioms and quirkiness.
On the other hand, many courses teach that comments are very nearly more important than the code itself, leading to the "this next line adds one to invoiceTotal" style of commenting.
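A contrived Java fragment to show the difference (the variable names are invented here). The first comment merely duplicates the statement and will silently rot; the second carries information the code alone cannot:

    public class CommentStyle {
        public static void main(String[] args) {
            int invoiceTotal = 10;

            // Pernicious duplication: restates the code, adds nothing, goes stale.
            invoiceTotal = invoiceTotal + 1; // this next line adds one to invoiceTotal

            // Worth keeping: records the *why*, which the code can't say by itself.
            invoiceTotal = invoiceTotal + 1; // the courier fee is billed as one extra line item

            System.out.println(invoiceTotal); // 12
        }
    }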
XML is highly overrated
I think too many jump onto the XML bandwagon before using their brains...
XML for web stuff is great, as it's designed for it. Otherwise I think some problem definition and design thought should precede any decision to use it.
My 5 cents
Not all programmers are created equal
Quite often managers think that DeveloperA == DeveloperB simply because they have same level of experience and so on. In actual fact, the performance of one developer can be 10x or even 100x that of another.
It's politically risky to talk about it, but sometimes I feel like pointing out that, even though several team members may appear to be of equal skill, it's not always the case. I have even seen cases where lead developers were 'beyond hope' and junior devs did all the actual work - I made sure they got the credit, though. :)
I fail to understand why people think that Java is absolutely the best "first" programming language to be taught in universities.
For one, I believe that the first programming language should be one that highlights the need to learn control flow and variables, not objects and syntax.
For another, I believe that people who have not had experience in debugging memory leaks in C / C++ cannot fully appreciate what Java brings to the table.
Also the natural progression should be from "how can I do this" to "how can I find the library which does that" and not the other way round.
If you only know one language, no matter how well you know it, you're not a great programmer.
There seems to be an attitude that says once you're really good at C# or Java or whatever other language you started out learning then that's all you need. I don't believe it- every language I have ever learned has taught me something new about programming that I have been able to bring back into my work with all the others. I think that anyone who restricts themselves to one language will never be as good as they could be.
It also indicates to me a certain lack of inquisitiveness and willingness to experiment that doesn't necessarily tally with the qualities I would expect to find in a really good programmer.
Performance does matter.
Print statements are a valid way to debug code
I believe it is perfectly fine to debug your code by littering it with System.out.println (or whatever print statement works for your language). Often, this can be quicker than stepping through with a debugger, and you can compare printed outputs against other runs of the app.
Just make sure to remove the print statements when you go to production (or better, turn them into logging statements)
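As a sketch of that last suggestion (the method and values below are made up), Java's built-in java.util.logging lets the same "print statements" survive into production, switched off by level instead of deleted:

    import java.util.logging.ConsoleHandler;
    import java.util.logging.Handler;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class DebugDemo {
        private static final Logger LOG = Logger.getLogger(DebugDemo.class.getName());

        static int parsePrice(String raw) {
            // Instead of System.out.println("raw = " + raw):
            LOG.fine(() -> "parsePrice raw input: '" + raw + "'");
            return Integer.parseInt(raw.trim());
        }

        public static void main(String[] args) {
            // While debugging, turn the "prints" on; in production, leave the
            // level at INFO (or configure it externally) and they go silent.
            Handler handler = new ConsoleHandler();
            handler.setLevel(Level.FINE);
            LOG.addHandler(handler);
            LOG.setUseParentHandlers(false); // avoid duplicate output via the root handler
            LOG.setLevel(Level.FINE);

            System.out.println(parsePrice(" 42 ")); // 42
        }
    }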
Your job is to put yourself out of work.
When you're writing software for your employer, any software that you create is to be written in such a way that it can be picked up by any developer and understood with a minimal amount of effort. It is well designed, clearly and consistently written, formatted cleanly, documented where it needs to be, builds daily as expected, checked into the repository, and appropriately versioned.
If you get hit by a bus, laid off, fired, or walk off the job, your employer should be able to replace you on a moment's notice, and the next guy could step into your role, pick up your code and be up and running within a week tops. If he or she can't do that, then you've failed miserably.
Interestingly, I've found that having that goal has made me more valuable to my employers. The more I strive to be disposable, the more valuable I become to them.
1) The Business Apps farce:
I think that the whole "Enterprise" frameworks thing is smoke and mirrors. J2EE, .NET, the majority of the Apache frameworks and most abstractions to manage such things create far more complexity than they solve.
Take any regular Java or .NET ORM, or any supposedly modern MVC framework for either, which does "magic" to solve tedious, simple tasks. You end up writing huge amounts of ugly XML boilerplate that is difficult to validate and slow to write. You have massive APIs, half of which exist just to integrate the work of the other APIs, interfaces that are impossible to reuse, and abstract classes that are needed only to overcome the inflexibility of Java and C#. We simply don't need most of that.
How about all the different application servers with their own darned descriptor syntax, the overly complex database and groupware products?
The point of this is not that complexity==bad; it's that unnecessary complexity==bad. I've worked in massive enterprise installations where some of it was necessary, but even there, in most cases a few home-grown scripts and a simple web frontend are all that's needed to solve most use cases.
I'd try to replace all of these enterprisey apps with simple web frameworks, open source DBs, and trivial programming constructs.
2) The n-years-of-experience-required:
Unless you need a consultant or a technician to handle a specific issue related to an application, API or framework, then you don't really need someone with 5 years of experience in that application. What you need is a developer/admin who can read documentation, who has domain knowledge in whatever it is you're doing, and who can learn quickly. If you need to develop in some kind of language, a decent developer will pick it up in less than 2 months. If you need an administrator for X web server, in two days he should have read the man pages and newsgroups and be up to speed. Anything less and that person is not worth what he is paid.
3) The common "computer science" degree curriculum:
The majority of computer science and software engineering degrees are bull. If your first programming language is Java or C#, then you're doing something wrong. If you don't get several courses full of algebra and math, it's wrong. If you don't delve into functional programming, it's incomplete. If you can't apply loop invariants to a trivial for loop, you're not worth your salt as a supposed computer scientist. If you come out with experience in x and y languages and object orientation, it's full of s***. A real computer scientist sees a language in terms of the concepts and syntaxes it uses, and sees programming methodologies as one among many, and has such a good understanding of the underlying philosophies of both that picking new languages, design methods, or specification languages should be trivial.
Getters and Setters are Highly Overused
I've seen millions of people claiming that public fields are evil, so they make them private and provide getters and setters for all of them. I believe this is almost identical to making the fields public; maybe a bit different if you're using threads (though that's generally not the case), or if your accessors have business/presentation logic (something 'strange', at least).
I'm not in favor of public fields, but I am against making a getter/setter (or Property) for every one of them, and then claiming that doing so is encapsulation or information hiding... ha!
UPDATE:
This answer has raised some controversy in its comments, so I'll try to clarify it a bit (I'll leave the original untouched, since that is what many people upvoted).
First of all: anyone who uses public fields deserves jail time
Now, creating private fields and then using the IDE to automatically generate getters and setters for every one of them is nearly as bad as using public fields.
Many people think:
private fields + public accessors == encapsulation
I say that (automatic or not) generating a getter/setter pair for your fields effectively goes against the so-called encapsulation you are trying to achieve.
Lastly, let me quote Uncle Bob on this topic (taken from chapter 6 of "Clean Code"):
There is a reason that we keep our variables private. We don't want anyone else to depend on them. We want the freedom to change their type or implementation on a whim or an impulse. Why, then, do so many programmers automatically add getters and setters to their objects, exposing their private fields as if they were public?
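A contrived Java illustration of the point (both classes are invented here). The first class is a public field in all but syntax; the second hides its field behind behavior, which is closer to what the quote means by encapsulation:

    // Accessor-for-every-field: callers still reach in and manipulate state directly.
    class AccountData {
        private double balance;
        public double getBalance() { return balance; }
        public void setBalance(double balance) { this.balance = balance; }
    }

    // Behavior instead of accessors: the class owns its invariants, and the
    // representation of "balance" can change without touching any caller.
    class Account {
        private double balance;

        public void deposit(double amount) {
            if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
            balance += amount;
        }

        public boolean canCover(double amount) {
            return balance >= amount;
        }
    }

    public class EncapsulationDemo {
        public static void main(String[] args) {
            Account account = new Account();
            account.deposit(50.0);
            System.out.println(account.canCover(20.0)); // true
        }
    }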
UML diagrams are highly overrated
Of course there are useful diagrams e.g. class diagram for the Composite Pattern, but many UML diagrams have absolutely no value.
Opinion: SQL is code. Treat it as such
That is, just like your C#, Java, or other favorite object/procedure language, develop a formatting style that is readable and maintainable.
I hate it when I see sloppy, free-formatted SQL code. If you scream when you see both styles of curly braces on a page, why don't you scream when you see free-formatted SQL, or SQL that obscures or obfuscates the JOIN condition?
Readability is the most important aspect of your code.
Even more so than correctness. If it's readable, it's easy to fix. It's also easy to optimize, easy to change, easy to understand. And hopefully other developers can learn something from it too.
If you're a developer, you should be able to write code
I did quite a bit of interviewing last year, and for my part of the interview I was supposed to test the way people thought, and how they implemented simple-to-moderate algorithms on a white board. I'd initially started out with questions like:
Given that Pi can be estimated using the function 4 * (1 - 1/3 + 1/5 - 1/7 + ...) with more terms giving greater accuracy, write a function that calculates Pi to an accuracy of 5 decimal places.
It's a problem that should make you think, but shouldn't be out of reach to a seasoned developer (it can be answered in about 10 lines of C#). However, many of our (supposedly pre-screened by the agency) candidates couldn't even begin to answer it, or even explain how they might go about answering it. So after a while I started asking simpler questions like:
Given the area of a circle is given by Pi times the radius squared, write a function to calculate the area of a circle.
Amazingly, more than half the candidates couldn't write this function in any language (I can read most popular languages so I let them use any language of their choice, including pseudo-code). We had "C# developers" who could not write this function in C#.
I was surprised by this. I had always thought that developers should be able to write code. It seems that, nowadays, this is a controversial opinion. Certainly it is amongst interview candidates!
Edit:
There's a lot of discussion in the comments about whether the first question is a good or bad one, and whether you should ask questions as complex as this in an interview. I'm not going to delve into this here (that's a whole new question) apart from to say you're largely missing the point of the post.
Yes, I said people couldn't make any headway with this, but the second question is trivial and many people couldn't make any headway with that one either! Anybody who calls themselves a developer should be able to write the answer to the second one in a few seconds without even thinking. And many can't.
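For the curious, here is a sketch of both answers, written in Java rather than the C# the poster mentions. The first uses the Leibniz series from the question and stops once the next term can no longer disturb the fifth decimal place (an alternating series' remaining error is bounded by its next term):

    public class PiInterview {
        // Leibniz series: pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...).
        static double estimatePi() {
            double sum = 0.0;
            double sign = 1.0;
            long k = 0;
            // Stop when the next term is too small to affect the 5th decimal place.
            while (4.0 / (2 * k + 1) >= 0.5e-5) {
                sum += sign * 4.0 / (2 * k + 1);
                sign = -sign;
                k++;
            }
            return sum;
        }

        // The "simpler" question: area = pi * r^2.
        static double circleArea(double radius) {
            return Math.PI * radius * radius;
        }

        public static void main(String[] args) {
            System.out.printf("pi ~= %.5f%n", estimatePi());      // 3.14159
            System.out.printf("area(2) = %.5f%n", circleArea(2)); // 12.56637
        }
    }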
The use of hungarian notation should be punished with death.
That should be controversial enough ;)
Design patterns are hurting good design more than they're helping it.
IMO software design, especially good software design, is far too varied to be meaningfully captured in patterns, especially in the small number of patterns people can actually remember - and they're far too abstract for people to really remember more than a handful. So they're not helping much.
And on the other hand, far too many people become enamoured with the concept and try to apply patterns everywhere - usually, in the resulting code you can't find the actual design between all the (completely meaningless) Singletons and Abstract Factories.
Less code is better than more!
If the users say "that's it?", and your work remains invisible, it's done right. Glory can be found elsewhere.
PHP sucks ;-)
The proof is in the pudding.
Unit Testing won't help you write good code
The only reason to have unit tests is to make sure that code that already works doesn't break. Writing tests first, or writing code to the tests, is ridiculous. If you write to the tests before the code, you won't even know what the edge cases are. You could have code that passes the tests but still fails in unforeseen circumstances.
And furthermore, good developers will keep coupling low, which will make the addition of new code unlikely to cause problems with existing stuff.
In fact, I'll generalize that even further,
Most "Best Practices" in Software Engineering are there to keep bad programmers from doing too much damage.
They're there to hand-hold bad developers and keep them from making dumbass mistakes. Of course, since most developers are bad, this is a good thing, but good developers should get a pass.
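Whatever you make of the opinion, the regression-guard role this answer does grant unit tests looks roughly like this (a JUnit 5 sketch; the class under test is invented for the example):

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    // Code that already works...
    class PriceCalculator {
        static int applyDiscount(int price, int percent) {
            return price - (price * percent) / 100;
        }
    }

    // ...and tests whose only job is to pin that behavior down, so a later
    // "improvement" can't silently break it.
    class PriceCalculatorTest {
        @Test
        void tenPercentOffOneHundredIsNinety() {
            assertEquals(90, PriceCalculator.applyDiscount(100, 10));
        }

        @Test
        void zeroDiscountLeavesPriceUntouched() {
            assertEquals(100, PriceCalculator.applyDiscount(100, 0));
        }
    }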
Write small methods. It seems that programmers love to write loooong methods where they do multiple different things.
I think that a method should be created wherever you can name one.
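A small invented Java sketch of that rule of thumb: as soon as a chunk of a long method has a natural name, it becomes its own method, and the caller reads like a table of contents.

    class Order {
        boolean paid;
    }

    class OrderService {
        // The long method becomes a list of named steps.
        void process(Order order) {
            validate(order);
            markPaid(order);
        }

        private void validate(Order order) {
            if (order == null) throw new IllegalArgumentException("order must not be null");
        }

        private void markPaid(Order order) {
            order.paid = true;
        }
    }

    public class SmallMethodsDemo {
        public static void main(String[] args) {
            Order order = new Order();
            new OrderService().process(order);
            System.out.println(order.paid); // true
        }
    }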
It's ok to write garbage code once in a while
Sometimes a quick and dirty piece of garbage code is all that is needed to fulfill a particular task. Patterns, ORMs, SRP, whatever... Throw up a console or web app, write some inline SQL (feels good), and blast out the requirement.
Code == Design
I'm no fan of sophisticated UML diagrams and endless code documentation. In a high level language, your code should be readable and understandable as is. Complex documentation and diagrams aren't really any more user friendly.
Here's an article on the topic of Code as Design.
Software development is just a job
Don't get me wrong, I enjoy software development a lot. I've written a blog for the last few years on the subject. I've spent enough time on here to have >5000 reputation points. And I work in a start-up doing typically 60 hour weeks for much less money than I could get as a contractor because the team is fantastic and the work is interesting.
But in the grand scheme of things, it is just a job.
It ranks in importance below many things such as family, my girlfriend, friends, happiness etc., and below other things I'd rather be doing if I had an unlimited supply of cash such as riding motorbikes, sailing yachts, or snowboarding.
I think sometimes a lot of developers forget that developing is just something that allows us to have the more important things in life (and to have them by doing something we enjoy) rather than being the end goal in itself.
I also think there's nothing wrong with having binaries in source control... if there is a good reason for it. If I have an assembly I don't have the source for, and it might not necessarily be in the same place on each dev's machine, then I will usually stick it in a "binaries" directory and reference it in a project using a relative path.
Quite a lot of people seem to think I should be burned at the stake for even mentioning "source control" and "binary" in the same sentence. I even know of places that have strict rules saying you can't add them.
Every developer should be familiar with the basic architecture of modern computers. This also applies to developers who target a virtual machine (maybe even more so, because they have been told time and time again that they don't need to worry themselves with memory management etc.)
Software Architects/Designers are Overrated
As a developer, I hate the idea of Software Architects. They are basically people who no longer code full time, read magazines and articles, and then tell you how to design software. Only people who actually write software full time for a living should be doing that. I don't care if you were the world's best coder 5 years ago before you became an Architect; your opinion is useless to me.
How's that for controversial?
Edit (to clarify): I think most Software Architects make great Business Analysts (talking with customers, writing requirements, tests, etc), I simply think they have no place in designing software, high level or otherwise.
There is no "one size fits all" approach to development
I'm surprised that this is a controversial opinion, because it seems to me like common sense. However, there are many entries on popular blogs promoting the "one size fits all" approach to development so I think I may actually be in the minority.
Things I've seen being touted as the correct approach for any project - before any information is known about it - are things like the use of Test Driven Development (TDD), Domain Driven Design (DDD), Object-Relational Mapping (ORM), Agile (capital A), Object Orientation (OO), etc. etc. encompassing everything from methodologies to architectures to components. All with nice marketable acronyms, of course.
People even seem to go as far as putting badges on their blogs such as "I'm Test Driven" or similar, as if their strict adherence to a single approach, whatever the details of the project, is actually a good thing.
It isn't.
Choosing the correct methodologies and architectures and components, etc., is something that should be done on a per-project basis, and depends not only on the type of project you're working on and its unique requirements, but also the size and ability of the team you're working with.