Explaining refactoring [closed] - language-agnostic

Question
My question is how can you teach the methods and importance of tidying-up and refactoring code?
Background
I was recently working on a code review for a colleague. They had made some modifications to a long-gone colleague's work. During the new changes, my colleague had tried to refactor items but gave up as soon as they hit a crash or some other problem (rather than chasing the rabbit down the hole to find the root of the issue), and so reimplemented the problem code and built more on top of that. This left the code in a tangle of workarounds and magic numbers, so I sat down with them to go through refactoring it.
I tried to explain how I was identifying the places we could refactor and how each refactoring can often highlight new areas. For example, there were two variables that stored the same information - why? I guessed it was a workaround for a bigger issue so I took out one variable and chased the rabbit down the hole, discovering other problems as we went. This eventually led to finding a problem where we were looping over the same things several times. This was due in no small part to the use of arrays of magic number sizes that obfuscated what was being done - fixing the initial "double-variable" problem led to this discovery (and others).
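To make that kind of cleanup concrete, here is a minimal sketch (hypothetical names and sizes, not the actual code from the review) of the magic-number-array smell and a named-constant form to refactor towards:

    public class SensorLog {
        // Before (sketch): the code allocated "new double[7]" and kept a
        // separate "int count = 7", duplicating the same fact and hiding
        // what the 7 meant.

        // After: one named constant states the fact once; the duplicate
        // variable disappears, and every loop can share the same bound.
        private static final int DAYS_PER_WEEK = 7;
        private final double[] readings = new double[DAYS_PER_WEEK];

        public double weeklyTotal() {
            double total = 0;
            for (int day = 0; day < DAYS_PER_WEEK; day++) {
                total += readings[day];
            }
            return total;
        }
    }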
As I went on this refactoring journey with my colleague, it was evident that they weren't always able to grasp why we made certain changes or how we could be sure the new functionality matched the original, so I took the time to explain and prove each change by comparing with earlier versions and stepping through the changes on paper. I also explained, through examples, how to tell if a refactoring choice was a bad idea, when to choose comments instead of code changes, and how to select good variable names.
I felt that the process of sitting together to do this was worthwhile for both myself (I got to learn a bit more about how best to explain things to others) and my colleague (they got to understand more of our code and our coding practices), but the experience led me to wonder if there was a better way to teach the refactoring process.
...and finally...
I understand that what does or does not need refactoring, and how to refactor it, are very subjective, so I want to steer clear of that discussion; but I am interested to learn how others would tackle the challenge of teaching this important skill, and whether others here have had similar experiences and what they learned from them (either as the teacher or the student).

Like most programming, refactoring skill comes with practice and experience. It would be nice to think it can be taught, but it has to be learned - and there is a significant difference in the amount of learning that can be accomplished in different environments.
To answer your question, you can teach refactoring methods and good design in a pedagogical fashion, and that's fine. But, ultimately, you and I both know attaining a certain level is only through long hard experience.

I am not 100% sure I understand your question, but I think you can refer to the idea of code smells - signs that code needs to be refactored. There are a lot of examples you could show to others.
Here is a list of situations in which refactoring should be used (a list of code smells).

If you haven't read it, Martin Fowler has an excellent book on the subject called Refactoring: Improving the Design of Existing Code. He goes into substantial detail about how and why a specific piece of code should be refactored.
I hesitated to even mention it for fear that knowledge of this book is assumed by someone asking about refactoring, and that you would think, "Duh, I meant besides the Fowler book." But what the hey, there you go. :-)

You don't mention tests. To 'prove' that a refactoring does not break the existing functionality you need to either have existing tests or write tests before doing the refactoring.
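For example, a short characterization test can pin down current behaviour before you touch anything (a minimal JUnit sketch; the PriceCalculator class and its numbers are made up for illustration):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Hypothetical legacy class about to be refactored.
    class PriceCalculator {
        double totalWithTax(double subtotal) {
            return subtotal * 1.07; // existing behaviour we must not break
        }
    }

    public class PriceCalculatorTest {
        @Test
        public void totalMatchesBehaviourBeforeRefactoring() {
            // 107.0 was recorded from the existing implementation, not a spec.
            assertEquals(107.0, new PriceCalculator().totalWithTax(100.0), 0.001);
        }
    }

Run it green, refactor, run it again; a red bar means the refactoring changed behaviour.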

Pair programming seems to be the best way for me to get this across. That way, as we work on real, production code and encounter code that doesn't smell right, we tackle the refactoring together. The pair acts as the driver's conscience, saying to do the right thing instead of the quick fix, and in turn they both learn what good code looks like in the process.
Refactoring can be an art, and it just takes practice. The more you do it, the better you get at it. Keep studying the methods described in Martin Fowler's Refactoring book, and use your tools (ReSharper, for the Visual Studio folks).

One simple way to conceive of refactoring is right there in the name -- it's just like when you factor a common variable out of an equation:
xy + xz
becomes
x(y + z)
The x has been factored out. Refactoring code is the same thing, in that you're finding duplicate code or logic and factoring it out.
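A rough code analogue of the same move (hypothetical names, sketched in Java):

    public class OrderTotals {
        private final int quantity;

        public OrderTotals(int quantity) {
            this.quantity = quantity;
        }

        // Before: "quantity * price * 0.9" appeared inline in two places,
        // like xy + xz. After: the shared factor lives in one method, x(y + z).
        private double discounted(double price) {
            return quantity * price * 0.9; // the 10% discount is applied once
        }

        public double retailTotal(double unitPrice)    { return discounted(unitPrice); }
        public double wholesaleTotal(double bulkPrice) { return discounted(bulkPrice); }
    }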

It sounds like your approach is a very good one. At the end of the process, you showed how you were able to uncover and fix a lot of problems. For educational purposes, it could then be interesting to invent a new change/enhancement/fix. You could then ask your mentee how they would enact that change in both the old and the new codebase. Hopefully they'll see that it's much easier to make the change with the refactored code (or that doing more refactoring would be the easiest way to prepare for the hypothetical change).

I see a couple of different ways you could try to teach refactoring:
Give textbook-like examples. A downside here is that contrived or simplistic examples rarely show why refactoring is useful as clearly as real cases do.
Refactor existing code. In this case you could take existing legacy code that needs cleaning up, or your own code in development, and show the before and after of various changes to see how much better the after is in terms of readability and ease of maintenance. This may be the better exercise, since it is live code being improved and enhanced.
It isn't something that someone can pick up instantly; it takes time, practice, effort and patience, as some refactorings may be done for personal preference rather than because one version is objectively better than the other.

Teaching someone to refactor when they aren't a natural is a tough job. In my experience your best bet is to sit down with them in an office and refactor some code. While you are doing this keep up a "stream of consciousness" dialog. Talk about what you see, why the code doesn't smell right, options to refactor to, etc. Also you should make sure they're doing the same thing. The most important thing is to impart why, not how, to change the code. Any decent programmer can make a change and have it work, but it takes skill and experience to be able to state why the new solution is better than the previous.


How to tell someone that their mods to my program are not good?

G'day,
This is related to my question on star developers and to this question regarding telling someone that they're writing bad code, but I'm looking at a situation that is more specific.
That is, how do I tell a "star" that their changes to a program that I'd written are poorly made and inconsistently implemented without just sounding like I'm annoyed at someone "playing with my stuff"?
The new functionality added was deliberately left out of the original version of this shell script to keep it as simple as possible until we got an idea of the errors we were going to see with the system under load.
Basically, I'd argued that trying to second-guess all error situations was impossible and in fact could leave us heading down a completely wrong path after having done a lot of work.
After seeing what needed to be added, someone dived in and made the additions but unfortunately:
the logic is not consistent
the variable names no longer describe the data they contain
there are almost no comments at all
the way in which the variables are used is not easy to follow and massively decreases readability and hence maintainability.
I always try to approach coding from the Damian Conway point of view: "Always code as if your system is going to be maintained by a psychopath who knows where you live." That is, I try to make it easy to follow, not an advertisement for my own brilliance. "What does this piece of code do?" exercises are fun and are best left to obfuscation contests, IMHO.
Any suggestions greatly received.
cheers,
I would just be honest about it. You don't necessarily need to point out every little detail that's wrong, but it's worth having a couple of examples for any general points you're going to make. You might want to make notes about other examples that you don't call out in the first brief feedback, in case they challenge your reasoning.
Try to make sure that the feedback is entirely about the code rather than the person. For example:
Good: The argument validation in foo() seems inconsistent with that in bar(). In foo(), a NullPointerException is thrown if the caller passes in null, whereas bar() throws IllegalArgumentException.
Bad: Your argument validation is all over the place. You throw NullPointerException in foo() but IllegalArgumentException in bar(). Please try to be consistent.
Even with a "please," the second form is talking about the developer rather than the code.
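For reference, the inconsistency that feedback points at would look something like this (a made-up sketch, not anyone's real code):

    public class Example {
        public void foo(String value) {
            if (value == null) {
                throw new NullPointerException("value"); // one convention...
            }
            // ...
        }

        public void bar(String value) {
            if (value == null) {
                throw new IllegalArgumentException("value is null"); // ...another
            }
            // ...
        }
    }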
Of course, in many cases you don't need to worry about being so careful, but if you think they're going to be very sensitive about it, it's worth making the effort. (If it's written feedback, read what you've written carefully: I accidentally included a "you" in the first version of the example above. :)
I've found that most developers (superstar or not) are pretty reasonable about accepting, "No, I didn't implement that feature because it has problem X." It's possible that I've been lucky though.
Coming from the other perspective, I would encourage you to think about it in their shoes. I will describe a "hypothetical" experience.
Some things to keep in mind:
The guy was trying to do something good.
Programmers are terrible at mind reading. They tend to only know what they read.
He may not have been given complete guidance as to what needs to be done (or what doesn't need to be done).
He is likely doing the best he knows how to.
Just keep that in mind and talk to them. Teach them. No need for yelling or pissing contests. Just remember that they are not intentionally trying to make your life difficult.
I see that you've asked a lot of questions about how to deal with certain kinds of developers. It seems to be a common thread for you. You keep asking about how to change people around you. If this is a constant problem for you, then perhaps you are the problem.
Now I know you are asking questions to learn how to deal with people you find difficult, and that's good, however, you keep asking (and getting answers) about how to change people.
It seems to me that you need to change. Work with these people to change the code to what you want it to be. With them. Don't try to get them to do it. Just do it, and tell them what you did and why, and ask suggestions for further improvement, and learn from each other. Play off of each other's experience and strengths. Just my 2 cents.
If you have clearly defined coding standards for the project, point out that the code needs to be changed to meet those standards. The list you have there seems like quite reasonable feedback (though #3 is much argued-over; I would only push to document the really confusing parts as fixing the other three points, hopefully, makes the code less confusing).
If there are any other examples in your repository from this developer that are several months old, show one to him and ask him what it is doing (show him this one in a few months). When he has to zip around to find out what is actually in his variables and deconstruct every line of code to figure out what it is doing, break into a code review / pair programming session right there. Refactor and rename together so that he hopefully begins to see for himself exactly why these things are important.
Frankly, I think this is a political problem, not a coding problem. Specifically...
WHO SAID THIS PERSON WAS A "STAR"? If this is the same person you described in your other question, then you already have your answer there: THIS PERSON IS NO "STAR".
So then you get into the other effects of politics...
Who is claiming this person to be a star? Why can you not just tell the person "this is crap code"? Who is protecting them / defending them were you to do that? Can you do that or would you get blasted / demoted / put on the "to be laid off" pile?
You are asking questions that cannot really be answered in isolation. IF the code is crap, then throw it away and do it correctly yourself. IF there are reasons that you cannot do that, then you need to ask yourself if the benefits of this place outweigh the negatives.
Cheers,
-R
Creating a program and then releasing it to be worked on by other developers is tough. You are throwing your code to the mercy of others' development styles, coding conventions, etc.
Telling those developers that their coding is poor, after the code is in, is one of the hardest things that you can do. It is best to address your concerns before they ever start working with your code. This can be done in two ways: maintaining a detailed coding standard and requiring that submitted code adheres to it, and maintaining a development road map, not just to outline when new features will be in, but to capture dependencies and avoid such mishaps.
More to your situation, it is important not to criticize or it could cause hostility and worse code coming in. Maybe you can work with that developer to create standards documentation. You will be able to express your ideas about what the standards should be, and you will get their input, without causing any hard feelings.
Always point out the good things in their code, and be sure when discussing the weaknesses that you frame them pointing out the reasons that it will benefit everyone (the developer included), never criticize.
Good luck.
I would do the following:
Make sure he knows that his hard work is appreciated (preferably, this should be the truth)
Ask him if he would mind making a few changes, making it sound like no big deal and easy to fix
Explain the issues, including why they are issues, and suggest specific changes to set him on the right path.
Hopefully, the exercise will help him integrate into the project culture better.
We try to solve these potential 'issues' proactively:
Every 'bigger' project where people work together gets assigned a project 'codelead' (one of the developers). This rotates every project (based on preferences, experience with the particular task ...) so everyone gets to be in the 'contributing' and 'code-project-lead' roles once in a while.
We explicitly made an agreement that these project 'leads' can decide whatever they want to with the code contributions of the others (sort of like a temporary dictatorship: change it, make suggestions, ask people to redo stuff, etc.). The project 'lead' bears the complete responsibility for the aggregated code working.
With these formalised 'leads' (and the changing roles) I think people have less problems with (constructive) criticism of the parts they contribute.
Yes: keep the feedback as appreciative, professional and technical as possible, and back up your concerns with possible "worst case" scenarios so that the disadvantages of those features and/or this particular implementation become blatantly obvious.
Also, if this is about features/code that are very specific and are not of any use to most users, express your concerns about the code/use ratio - indicating concerns about increased code base complexity etc.
Ideally, present your concerns as open-ended questions - in the sense of: "Though, I am wondering if this way of doing it may work in the long term, due to ...". So that you actually encourage an active dialogue between contributors.
Invite your fellow contributors and users to provide their opinions on these concerns; in fact, ask other people/contributors what they think about this addition (in terms of pros & cons, requirements, code quality), and state that you are willing to reconsider your current position if other contributors/users can provide corresponding insight.
You are basically encouraging an informal review that way, asking your community to also look into the proposed additions, so that the advantages and disadvantages can be discussed.
So, whatever the decision will be, it will be one that is community-backed, and not just simply made by you.
You, being the architect of the original design, are also in an excellent position to provide architectural reasons why something is not (yet) suitable for inclusion/deployment.
If stability, complexity or code quality are a real concern, do illustrate how other contributions also had to go through a certain review process in order to be acceptable.
You can also mention how specific code doesn't really align with your current design, or how it may not scale too well with future extension to your current design, similarly you can highlight why certain stuff was left out explicitly.
If you actually like the features or the core idea, be sure to highlight the excellent addition these features would make if properly implemented and integrated, but do also highlight that the existing implementation isn't really appropriate due to a number of reasons.
If you can, do make specific suggestions for improvements, provide examples of how to do things better, and what to avoid and do express that you hope, this can be reworked to be added with the help of your project's community.
Ideally, present your requirements for actually accepting this contribution and do mention the background for your requirements, you may in fact say that you hate some of these requirements yourself.
Preferably, present and discuss instances where you yourself contributed similar code (or even worse code) and ended up facing huge issues because of it, so that it's clear these policies are now in place to prevent such issues. By talking about your own bad code, you can be very direct without it becoming personal.
Emphasize that you generally appreciate the effort itself, and that you are willing to provide the necessary help and pointers to bring the code in question into a better shape and form. Also, encourage that similar contributions in the future should be properly coordinated within your community, in order to avoid similar issues.
Always think in terms of features and functionality (and remind your contributor to do the same), not code - imagine it like a thorough code review process, where the final code that ends up being committed/accepted, may have hardly anything in common with the original implementation.
This is again a good possibility, to present examples where you yourself developed code that ended up largely reworked, so that much of it is now replaced by a much better implementation.
Similarly, there's always the issue with code that has no active maintainers, so you can just as easily suggest that you feel concerned about code that may end up being unmaintained, you could even ask if the corresponding developer would be willing to help maintain that code, possibly in a separate branch.
In the same sense, always require new code to be accompanied with proper comments, documentation and other updates. In other words, code that adds new or changes existing functionality, should always be accompanied with updates to all relevant documentation.
Ultimately, if you know right away that you cannot and will not accept any of that code in the near future, you can at least invite the developer to branch or even fork your project, possibly in your repository and with your help and guidance, so that you still express your gratitude for their working with your project.

How can I become better at realizing how to solve a particular problem?

I have become pretty fluent in a few different languages now, but I seem to have a hard time actually figuring out the best way to go about solving particular problems. What are some ways to get better at the actual problem-solving part of programming?
Experience. Solving something completely new is hard. The best way to solve problems is to find a problem that you've solved before and that is similar, and adapt your solution to the new problem. So until you have experience with many different kinds of problems, it's hard to solve the new ones that you come across. Visiting sites like this and reading questions and their answers is a great way of learning how others solved the problems that they encountered.
Basically, "just do it". When you have to make a choice, just make any choice (except flipping a coin).
Once you have something that works, then sit back and scratch your head about what you did wrong and how to do it better.
If you have absolutely no clue how to do even that, just solve a part of the problem completely and move on.
I suggest checking out this book. They aren't the best kid on the block, though they want us to think they are...but they did well with Basecamp.
When all you have is a hammer, everything begins to look like a nail.
So, make sure you're well versed in algorithms and data structures. When you study them, think hard about what sort of uses a particular algorithm is good for.
Ask someone else. Someone in your office, on Twitter or SO, or even your wife. People with no technical knowledge often come up with simpler solutions.
If you must solve it on your own, try one of these others:
Do a quick search for another person or project which has tried to solve your problem. If they have a blog, documentation or source code, you might be able to learn from their implementation.
Come up with at least TWO solutions and pick the best one.
Pretend you have 15 minutes to solve the problem before the civilized world is destroyed by Nuclear War / Skynet / Permanent endless re-runs of Seinfeld, you might think of something much simpler which gets 99% of the work done.
The book is a set of heuristics to go through when solving a problem. Read about it on Wikipedia. Buy it on Amazon.
By solving actual problems. Practice makes perfect.
If you have time to become fluent in multiple languages, my guess is that you haven't spent much time doing any actual work. If you have a job, it might be time for a new one. If you're still in school, do you have any interest in starting a project for yourself or contributing to one that you use regularly?
It might help to know what kinds of problems you're having difficulty solving.
Go find an open source or free project you can get excited about and contribute. I learned a lot by signing up to code for my favorite video game modification.
Experience.
Study really only goes so far. Find something fun and small. Do it.
One way that seems to work for a lot of people is to use a book like Programming Challenges as a guide, and focus on solving problems of a particular type. For example, if you're weak in an area like graph problems or dynamic programming, find a set of problems on an online judge and work through them. You'll start to recognize patterns and be able to classify problems.
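As a tiny illustration of the kind of pattern you start to recognize, here is the classic coin-change problem solved bottom-up with dynamic programming (a minimal sketch; the coin values are arbitrary):

    import java.util.Arrays;

    public class CoinChange {
        // Dynamic programming: build a table of answers to smaller subproblems.
        static int minCoins(int[] coins, int amount) {
            int[] best = new int[amount + 1];
            Arrays.fill(best, Integer.MAX_VALUE - 1); // "unreachable" sentinel
            best[0] = 0;
            for (int a = 1; a <= amount; a++) {
                for (int c : coins) {
                    if (c <= a) {
                        best[a] = Math.min(best[a], best[a - c] + 1);
                    }
                }
            }
            return best[amount];
        }

        public static void main(String[] args) {
            System.out.println(minCoins(new int[] {1, 5, 10, 25}, 63)); // prints 6
        }
    }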
Google for an answer. Chances are someone else has solved the same problem or a similar problem before.
Ask on SO. :)
Read some textbooks or online articles about design patterns.
Problems may have many solutions, some simpler and some more complicated. Don't get stuck thinking there is only one solution. Just go with the simplest solution that makes the most sense in the context of your application.
After years of experience you'll be able to think of your own solutions to most problems. :)
Study Algorithms!
Search for and get hold of as many examples and books as you can, on the subject of programming or otherwise.
Problem solving skills can also be improved by playing tactical games.
These made me enjoy problem solving and become better (not necessarily good) at it:
Chess and Go (igo)
I like this general method:
List the possible solutions with their strengths and weaknesses (this will push you to briefly taste all of them)
Choose the best one and base your design on it (if you hit a heavy obstacle, reconsider the other options)
Implement
Most importantly, at every step, learn
The best way is probably to learn from a master, if that's an option, especially if you can find someone familiar with the problems you're addressing.
Generally, the more tools we have at hand, the more options we have for tackling a problem. I agree that it's important to always code and to always deliver something that works (however inelegant it is). But I think we need to increase our skills/knowledge in many directions:
Language skills (know your language(s) in depth)
Programming paradigms (Imperative, Object, Functional)
Framework knowledge
Algorithms
Patterns
Data structures
Methodologies (Agile, DDD, BDD, ?DD)
Tools
etc
You can get a lot of skill through on the job just-in-time-learning, but I usually have a pet subject at any time that I'm trying to get a deeper understanding of, typically this means getting the book and reading it cover to cover.
Work your way through Project Euler, and look at other people's solutions to the problems. Almost every problem will have been solved in a way that wouldn't have occurred to you, and usually with greater efficiency.
I think that there is a lot more than raw experience involved in becoming a good problem solver - because I've seen poor problem solvers with lots of experience.
Here are a few tips, but you can find many more around the web.
Look at a number of problems and figure out what they have in common. The greater the generality with which you understand the solution to a problem, the more you can apply it to other problems.
Try to discover the approaches that good problem solvers use, but don't assume anyone has a monopoly on problem solving.
If you read Richard Feynman's books, you'll notice that he considers many different routes to get to his goal. Don't narrow your approach prematurely.
Be positive. Assume that you can find the solution to anything. Your state of mind matters; enjoying the process of problem solving makes it much easier.
Don't beat your head against the wall. If you don't seem to be making progress with one approach, try another.
Always be looking for more ways of solving problems and more insights into the process of problem solving itself.
Be willing to work. It can still take a lot of effort to solve some problems.
The more different fields of study you know, the more viewpoints you have. I have a strong math background and I find it very useful for many problems. Physics, music or any different viewpoint might be useful.
Practice solving problems.
Take an algorithms or discrete math course.
Here are some tools that I've used in the past to help me understand a particular problem and its solution. I don't always use them today, but they helped me learn how to think about breaking down a problem and coming up with a solution.
Class-Responsibility-Collaboration (CRC) cards
One card per class, details the responsibility of the class and what other classes it collaborates with. Using cards you can layout your design for the solution and see where you have too much coupling or too much responsibility. They allow you to think about the design in a lightweight manner before committing to code.
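The cards don't have to stay on paper; rendered as a comment header, a hypothetical CRC card might look like:

    /*
     * CRC card (hypothetical example):
     *
     * Class:            Order
     * Responsibilities: track line items; compute the order total
     * Collaborators:    LineItem, Customer
     */
    public class Order {
        // ... implementation comes later, once the design settles ...
    }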
Use cases
Either actual structured use cases that describe user interaction with the system or even briefer stories or story cards. I still use stories, though I capture them in a wiki instead. This allows you to capture the interaction with the system in an informal way. Stories are basically placeholders for conversations that you need to have with the customer about what is supposed to be done. Using the collected stories, you start getting a grasp on the overall intention of the code. You can also start seeing how things interact and what works with other things. This is really the beginning of design.
UML Diagrams - particularly interaction diagrams
For a while I used these a lot. It really helped to see how things actually worked together under the hood. I will still diagram, informally, some complex interactions to make sure I don't miss anything important. Going through a lot of these really helped me to think about how my objects interacted, and now it is sort of second nature to think in terms of interactions.
Class diagrams -- a really high level view of the code.
These allow you to see your code structurally, especially if you can break the diagram down into components or layers of architecture. Mostly I use these now to explain the code to other people when necessary. When you're starting out, though, they provide a pretty good visualization if you're struggling with the bird's-eye view of the code.
The best advice I can give you if you try these is to follow the "rules" until you really have a good grasp of what is going on. Once you feel like you have a better understanding of what they provide, you can use them or not, or modify how you use them to keep only what is helpful and let the other stuff go.

Should I prepare my code for future changes? [closed]

Should I prepare my code for possible/predicted future changes so that it's easier to make these changes even if I don't really know if these changes will be required anytime?
I am likely to get lynched for my opinion on this, but here I go.
While I have had this hammered into me over years of reading idealistic articles and sitting through far too many seminars and lectures categorically stating its nirvana-like benefits, I too had similar questions in my mind. This line of thought can lead to massive over-engineering of the code, adding many man-hours to design, development and testing estimates and increasing cost and overheads, when in reality it is often not needed. How many times have you actually reused your code or a library? If it is going to be used in many places, throughout numerous projects, then yes, you should.
However, most of the time this is not the case. You will often find it more economical (in time and money) to only refactor your code for reuse and configurability when you actually know that you are going to use it again. The rest of the time the real benefits are lost.
This is not, I repeat NOT, an excuse to write sloppy, poorly designed, poorly documented code. This should be a fundamental that is so wholly ingrained in you that you could not break it, but writing a class for reuse is a waste most of the time as it will never get reused.
There are obvious exceptions to this. If you are writing third party libraries then obviously this is not the case and reuse and expansion should be key to your design. Certain other types of code should be obvious for reuse (Logging, Configuration etc.)
I asked a similar question here: Code Reusability: Is it worth it? It might help.
Within reason, and certainly if it's not much effort.
I don't think you can always apply this, as it can make you over-engineer everything and then it takes too long and you don't make much money. Consider how likely the client is to implement something, how much extra it will take to prepare for it now and how much time it will save later.
If it requires a lot of work, but makes sense to save money, you could raise it with the client.
I seem to be in disagreement with a lot of people here, who say always - but I've seen a lot of things where effort has been put into make future features easy to implement ... but they've never been implemented. If a client hasn't paid for the time spent making the feature easy to implement, that's money straight off your bottom line.
Edit: I think it's relevant to point out that I'm coming from an agency environment. If you are working on code for yourself, you can probably predict future development with a greater level of certainty, and so it's probably feasible to do this in more cases.
YAGNI (http://en.wikipedia.org/wiki/YAGNI - *inserted by friendly editor :-)*).
Fix the bugs in that horrendous code you're writing today.
Refactor when the time comes.
If you work in a refactoring-friendly language, I'd say NO. (In other languages I'm not sure.)
You should make your code as loosely coupled as possible and keep things as simple as possible. Stay specific, and don't generalize to unknown use cases.
This will make your code base prepared for the things the future will bring.
(And frankly, most anticipations of the future tend to be sufficiently off-mark not to warrant coding for it today)
Edit: It also depends on what you're doing. Designing APIs for external users is not the same as developing a web app for your company.
Yes -- by doing less.
You won't know the future requirements for your code. The best preparation for the future is not to implement anything that's not needed right away, and to have good unit-test coverage of everything you do implement.
Scalability in your code is one thing you should always consider.
The more time you spend today catering for scalable solutions, the less time you will spend in the future when actually expanding.
Predicted or very likely changes - yes, generally it's good to have them in mind.
"Take anything that might ever happen in the universe into account" - no. You don't know what could happen, trying to cover for everything unknown is just over engineering.
Remember that most of your code will be changed/refactored. If you know that you will have to change your code within the next week, prepare it. But don't start making everything exchangeable and modular by default. Don't build a framework for a "maybe in the future" when three lines of code do the job for now.
But think twice if the system behind it makes refactoring difficult (databases).
One thing I learned in my mere year of coding for the company I work for: everything you do, no matter how perfect you think it is, will come back to haunt you for an update, or will need to be altered because client X suddenly decided not to like it.
Now I make my code highly customizable, so when the day comes to do some adjustments, it will be ready in no time and I can continue with my work.
In a word, yes.
In a few more words, you should always make your code as readable as possible, include comments, and always assume that at some time in the future, you will be called upon, or someone else will be, to modify the code.
If that someone in the future comes across a block of code, uncommented, unformatted, with no indication of what it does or should do, then they will curse you forever :)
No, never. Write good code that is easy to reuse/refactor, but preparing for half-thought-out enhancements is, IMO, the brother of premature optimisation; you'll likely end up doing things you don't need, or that push you down a certain (possibly non-optimal) design path at a future date. As mfx says, do the minimum required now and unit test everything; it makes extending code a doddle.
In two words: yes, always.
What you describe is part and parcel of being a good developer.
See On Writing Maintainable Code
The obvious answer is YES. But the more important question is how!
Orthogonality, Reversibility, Flexible architecture, Decoupling, and Metaprogramming are some of the keywords that address this problem. Check out chapters 2 and 5 of "The Pragmatic Programmer"
I find it is generally a better strategy to design a "change-accommodating" architecture than trying to specify specific changes that might (or might not) happen. It is a good exercise, though, to ask "What may change in the future?", and then resist the temptation to prematurely implement potentially unnecessary features, but rather have such possibilities in mind when creating the application architecture.
I find that there is a parallel here to something I heard about test-driven development recently. The person talking about it had observed that while at first it could be a little annoying to always write unit tests and think about how your code can be written to be testable, at some point it just begins to come naturally to write test-friendly code.
The point is that if you always write with modifiability in mind, you might end up doing it more or less by reflex, thereby making the overhead of writing the extra code very small. If you can reach a state where high-quality, extendable code is what comes naturally to you, I think that would definitely be worth the initial cost. I do still believe, though, that you must do it in moderation and that sometimes it's just not the right thing to do for a given customer.
Two simple principles to follow:
KISS
Always avoid dependencies
That's a good start.
Yes, but only by writing maintainable, easily refactored code.
You should definitely NOT try to guess what might be required in the future. Such efforts are not only pointless and time-wasting for your current targets, but are more often than not counterproductive when any future changes become apparent.
This is really important. It needs to be done well, but takes experience.
If I count up the number of edits (via "diff") after implementing a typical requirements change, numbers like 10-50 are common.
If I make a mistake on any of them, I've inserted a bug.
So personally, I always try to design to keep that number down.
I also try to leave instructions in the code for how to make anticipated changes. If I'm maintaining code, I really appreciate it if the prior author also did this.
To balance the effort with the benefits is the skill of design.
Not all our code needs to be flexible. Some things will not be changing.
No wasted effort. Finding the right parts to devote our attention to.
Tricky.
Yes, always think of where your code may need to evolve in the future. In the current project I am working on there are thousands of files, and every single one of them has at least one revision to it. Even leaving aside bug fixes, plenty of those revisions are to make way for additional software features.
I wouldn't change my code to prepare for an unknown future feature.
But I would refactor to get the best solution to the current problem.
You can't design against an unknown (future), and as other people have said, trying to build a flexible design can easily lead to overengineering, so I think that the best pragmatic approach is think in terms of avoiding things that you know will make it harder for you to maintain your code in future. Every time that you make a design decision, just ask yourself whether you are making it harder to change things in future, and if so, what you are going to do to limit the problem.
Obvious things that will always cause problems:
Scattered configuration information - you need to be able to check and change this stuff easily
Untested code - you need tests to make changes with confidence
Mingling of storage and output concerns with the core logic - you will switch database instances and output formats, if only for testing (see the sketch after this list)
Complex architecture - you need to have a clear mental model of the system
Arrangements that require manual intervention or updates to keep them running
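On the storage/output point, here is a minimal sketch (hypothetical names) of keeping the database behind an interface so the core logic never mingles with it:

    import java.util.ArrayList;
    import java.util.List;

    interface OrderStore {
        void save(String orderId);
    }

    class SqlOrderStore implements OrderStore {
        @Override public void save(String orderId) { /* write to the real database */ }
    }

    class InMemoryOrderStore implements OrderStore { // a test double
        final List<String> saved = new ArrayList<>();
        @Override public void save(String orderId) { saved.add(orderId); }
    }

    class OrderService {
        private final OrderStore store;
        OrderService(OrderStore store) { this.store = store; } // storage injected
        void placeOrder(String orderId) {
            // ...core logic, oblivious to which store sits behind the interface...
            store.save(orderId);
        }
    }

Swapping SqlOrderStore for InMemoryOrderStore in a test is then a one-line change.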

Weird coding phase I'm going through [closed]

Lately, I've been going through a pretty weird phase. I feel the need to write/rewrite all the tools I use (text editors, IDEs, libraries/modules) and I don't know why. Even though this can be seen as a good thing (because I learn a lot of things in the process, and my software design skills are improving constantly),
the weirdest part is that I throw away all the code after I'm done, so I kind of work in vain. I'm never satisfied with the code I write. Have you experienced this too? How did you get over it?
It sounds like you're driven by one of a couple different possible "syndromes" (or maybe more than one).
Not invented here syndrome (NIHS). This is where nobody's code is good enough, or does things in the way you want them done. So you think you can do it better.
How Does It Work? Syndrome (HDIWS). This where you are fascinated by a program and want to understand how it works, so you basically get down to writing it yourself.
Biting off more than you can chew syndrome (BOMTYCCS). This is where you think, "That can't be too hard to do, so I'll just write it myself, then I'll know what it's doing." Then you get partway done with it, enough to make it kind of work, and you realize that your code sucks and that you made too many compromises to get it done quickly, so you throw it away.
Too cheap to buy it so you figure you can build it cheaper syndrome (TCTBISYFYCBICS). The name speaks for itself. The problem is that you get partway done, then you realize you've fallen into #3 above and decide your time is worth more than whatever it costs.
I have suffered from all of these, and more... (like the classic "Need to create an acronym for everything syndrome (NTCAAFES)" ;)
I do this all the time (well, except for the throwing-away part). Personally, I started to do this early in my programming learning (when I was a student); it got to the point where I would examine the Java core source code and, if I didn't like it, I would make my own version or extend it. Eventually the running class joke was that I was going to reinvent Java.
Personally, I never got over it... Instead I found new ways to use it to my benefit. For example, rather than simply reinventing things, I found that there were things I wanted that were/are not available anywhere else, so I would make them for myself. My first such project was a chat program. Sure, many existed, but this was the only one my teacher didn't know about (hehe). I sure learned a lot about networking protocols through it, but once it was "done", my first action was to send it to all my buddies, and we would chat while the teacher was lecturing... Soon enough he found out about it and has since forced all his students to turn their monitors off while he is talking, but it was fun while it lasted.
Since then, I have continued to come up with my own projects that are useful to me, that have at least one aspect I will need to learn along the way, and that solve a problem no current program solves for me for free.
So in all, I recommend that rather than trying to reinvent the IDE and then throwing your code away, try to find some project or problem that you would like to solve on your own and use again. I believe that dogfooding (using your own programs) is the best any programmer can hope to achieve.
Put whatever code you write in a website and make it open source.
Let others discuss the code and make it better.
This helps you to improve your coding skills while you are giving back to the community.
Later you will be amazed to find that your code got better than what it was when you left it.
I think the nature of us as programming species is to want to control everything. We're total control freaks. Everyone has tons of reasons not to trust someone else's work. Any given day at the office can be proof of that.
However, in general, you'll be wrong. Chances are that tool/library/IDE/whatever was developed by a team that put a lot more time into it than you'll be able to. Sure, it has its flaws from the outside, but by the time you're done with your version, you'll probably find that yours has a few flaws of its own.
It's just something you have to come to terms with.
In hindsight, it's not an entirely bad habit. It takes you out of your normal programming area and gives you exposure to other problems. Ultimately you'll be more versatile for it. If you continue doing it from that perspective, and mix it in with real productivity, I think it's perfectly healthy.
I commend you on this. You wouldn't be a good coder if you didn't obsess about the perfection of your tools, and also wonder how they were made, so:
Don't waste too much time,
make sure you release whatever you can open source,
but I look at it like a Padawan building his/her own light saber. It has to be done.
If you are just duplicating your existing tools, there is little value. But if you have an idea of how to make one better, even in just one way, it's worth it.
This way at the end of it you'll have a slightly better tool and you won't just throw it away.
I went though a similar phase with tools, but then I got more interested in other topics like graphics and games.
I only write tools when it's going to save me time (or, of course, achieve more in the same time).
There's plenty enough application code to be written without wanting to write tools as well.
If you're doing it in your spare time, then each to his own, I guess, but I find that the need to make money from the code I write kinda focuses my mind on what's important.
We programmers are flow-junkies. We enjoy the mental states associated with problem-solving and progress toward a solution. Sometimes, when the projects at work get bogged down -- for a variety of reasons -- it's tempting to turn to side projects as a way to get the flow fix.
This isn't necessarily bad, as long as you're sharpening your tools or learning something, and it doesn't take time away from other responsibilities (work or relational).
You opted for the red pill didn't you? Somebody should have told you NOT to do that.
The problem is that if you really really like computers, and you spend way too much of your life thinking about them, you get to this point where the entire random/irrational state of software development -- that arbitrarily complex working mess -- seems to be clearly far more convoluted than necessary, even if you don't know what necessary is yet.
There is underneath it all, a fabulously amazing tool just waiting to be set free. It teases you from time to time, hiding in various programs, embedded along with various bits of data. Tantalizing functionality, that if only you could find a real way to join it all together to make one big super-cool system, you could ...
It's that purely mathematical sense that computers could in fact be something wonderful, if only they didn't suck so much.
Now take the blue pill, and get back to work ...
It also sounds like you might be really dissatisfied with the other work you are doing. I find that when I'm not enjoying my job, I tend to find anything else to do, and I get especially fixated on things that I can excuse as not being time-wasters because they are, on the surface, productive: things like going back and cleaning up code that was really a one-time ad-hoc job, rewriting what already works, and making my own "improvements" to tools. If I'm right, it makes sense that you would then throw it out/walk away/find something else, because you are only doing it to avoid doing other work.
I've experienced it sometimes. For example, when preparing HTML documents for publication, I felt compelled to write my own tool to do it just the way I want. It's natural to want to make the world better. If you feel that urge - I suggest you go for it.
I think this is probably a good thing, if it's not getting in the way of getting things done.
But don't chuck it; even if it's rubbish, put it on the net somewhere, and when somebody else tries to do the same thing they can see yours and at the very least avoid making the same mistakes.

What's your most controversial programming opinion?

This is definitely subjective, but I'd like to try to avoid it becoming argumentative. I think it could be an interesting question if people treat it appropriately.
The idea for this question came from the comment thread from my answer to the "What are five things you hate about your favorite language?" question. I contended that classes in C# should be sealed by default - I won't put my reasoning in the question, but I might write a fuller explanation as an answer to this question. I was surprised at the heat of the discussion in the comments (25 comments currently).
So, what contentious opinions do you hold? I'd rather avoid the kind of thing which ends up being pretty religious with relatively little basis (e.g. brace placing) but examples might include things like "unit testing isn't actually terribly helpful" or "public fields are okay really". The important thing (to me, anyway) is that you've got reasons behind your opinions.
Please present your opinion and reasoning - I would encourage people to vote for opinions which are well-argued and interesting, whether or not you happen to agree with them.
Programmers who don't code in their spare time for fun will never become as good as those that do.
I think even the smartest and most talented people will never become truly good programmers unless they treat it as more than a job. Meaning that they do little projects on the side, or just mess with lots of different languages and ideas in their spare time.
(Note: I'm not saying good programmers do nothing else than programming, but they do more than program from 9 to 5)
The only "best practice" you should be using all the time is "Use Your Brain".
Too many people jumping on too many bandwagons and trying to force methods, patterns, frameworks etc onto things that don't warrant them. Just because something is new, or because someone respected has an opinion, doesn't mean it fits all :)
EDIT:
Just to clarify - I don't think people should ignore best practices, valued opinions etc. Just that people shouldn't just blindly jump on something without thinking about WHY this "thing" is so great, IS it applicable to what I'm doing, and WHAT benefits/drawbacks does it bring?
"Googling it" is okay!
Yes, I know it offends some people out there that their years of intense memorization and/or glorious stacks of programming books are starting to fall by the wayside to a resource that anyone can access within seconds, but you shouldn't hold that against people that use it.
Too often I hear of googling answers to problems being made the target of criticism, and it really makes no sense. First of all, it must be conceded that everyone needs materials to reference. You don't know everything and you will need to look things up. Conceding that, does it really matter where you got the information? Does it matter if you looked it up in a book, looked it up on Google, or heard it from a talking frog that you hallucinated? No. A right answer is a right answer.
What is important is that you understand the material, use it as the means to an end of a successful programming solution, and the client/your employer is happy with the results.
(although if you are getting answers from hallucinatory talking frogs, you should probably get some help all the same)
Most comments in code are in fact a pernicious form of code duplication.
We spend most of our time maintaining code written by others (or ourselves) and poor, incorrect, outdated, misleading comments must be near the top of the list of most annoying artifacts in code.
I think eventually many people just blank them out, especially those flowerbox monstrosities.
Much better to concentrate on making the code readable, refactoring as necessary, and minimising idioms and quirkiness.
On the other hand, many courses teach that comments are very nearly more important than the code itself, leading to the "this next line adds one to invoiceTotal" style of commenting.
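To make the contrast concrete, a made-up snippet of the criticised style next to the self-describing alternative:

    public class Invoice {
        private static final int SERVICE_FEE = 1;
        private int invoiceTotal;

        public void addServiceFee() {
            // The criticised style duplicates the code in English:
            //   invoiceTotal = invoiceTotal + 1; // add one to invoiceTotal
            // Naming the constant and the method makes that comment unnecessary:
            invoiceTotal += SERVICE_FEE;
        }
    }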
XML is highly overrated
I think too many jump onto the XML bandwagon before using their brains...
XML for web stuff is great, as it's designed for it. Otherwise I think some problem definition and design thought should precede any decision to use it.
My 5 cents
Not all programmers are created equal
Quite often managers think that DeveloperA == DeveloperB simply because they have same level of experience and so on. In actual fact, the performance of one developer can be 10x or even 100x that of another.
It's politically risky to talk about it, but sometimes I feel like pointing out that, even though several team members may appear to be of equal skill, it's not always the case. I have even seen cases where lead developers were 'beyond hope' and junior devs did all the actual work - I made sure they got the credit, though. :)
I fail to understand why people think that Java is absolutely the best "first" programming language to be taught in universities.
For one, I believe that a first programming language should be one that highlights the need to learn control flow and variables, not objects and syntax.
For another, I believe that people who have not had experience in debugging memory leaks in C / C++ cannot fully appreciate what Java brings to the table.
Also the natural progression should be from "how can I do this" to "how can I find the library which does that" and not the other way round.
If you only know one language, no matter how well you know it, you're not a great programmer.
There seems to be an attitude that says once you're really good at C# or Java or whatever other language you started out learning then that's all you need. I don't believe it- every language I have ever learned has taught me something new about programming that I have been able to bring back into my work with all the others. I think that anyone who restricts themselves to one language will never be as good as they could be.
It also indicates to me a certain lack of inquisitiveness and willingness to experiment that doesn't necessarily tally with the qualities I would expect to find in a really good programmer.
Performance does matter.
Print statements are a valid way to debug code
I believe it is perfectly fine to debug your code by littering it with System.out.println (or whatever print statement works for your language). Often, this can be quicker than stepping through in a debugger, and you can compare printed outputs against other runs of the app.
Just make sure to remove the print statements when you go to production (or better, turn them into logging statements)
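A minimal sketch of that upgrade path using the JDK's built-in logging (the class and values here are made up):

    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class Checkout {
        private static final Logger LOG = Logger.getLogger(Checkout.class.getName());

        public double total(double subtotal, double taxRate) {
            double total = subtotal * (1 + taxRate);
            // Instead of System.out.println("total=" + total), log at FINE so
            // the output can be switched off in production without deleting it.
            LOG.log(Level.FINE, "subtotal={0}, taxRate={1}, total={2}",
                    new Object[] {subtotal, taxRate, total});
            return total;
        }
    }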
Your job is to put yourself out of work.
When you're writing software for your employer, any software that you create is to be written in such a way that it can be picked up by any developer and understood with a minimal amount of effort. It is well designed, clearly and consistently written, formatted cleanly, documented where it needs to be, builds daily as expected, checked into the repository, and appropriately versioned.
If you get hit by a bus, laid off, fired, or walk off the job, your employer should be able to replace you on a moment's notice, and the next guy could step into your role, pick up your code and be up and running within a week tops. If he or she can't do that, then you've failed miserably.
Interestingly, I've found that having that goal has made me more valuable to my employers. The more I strive to be disposable, the more valuable I become to them.
1) The Business Apps farce:
I think that the whole "Enterprise" frameworks thing is smoke and mirrors. J2EE, .NET, the majority of the Apache frameworks and most abstractions to manage such things create far more complexity than they solve.
Take any regular Java or .NET ORM, or any supposedly modern MVC framework for either which does "magic" to solve tedious, simple tasks. You end up writing huge amounts of ugly XML boilerplate that is difficult to validate and write quickly. You have massive APIs where half of those are just to integrate the work of the other APIs, interfaces that are impossible to recycle, and abstract classes that are needed only to overcome the inflexibility of Java and C#. We simply don't need most of that.
How about all the different application servers with their own darned descriptor syntax, the overly complex database and groupware products?
The point of this is not that complexity==bad, it's that unnecessary complexity==bad. I've worked in massive enterprise installations where some of it was necessary, but even in most cases a few home-grown scripts and a simple web frontend is all that's needed to solve most use cases.
I'd try to replace all of these enterprisey apps with simple web frameworks, open source DBs, and trivial programming constructs.
2) The n-years-of-experience-required:
Unless you need a consultant or a technician to handle a specific issue related to an application, API or framework, then you don't really need someone with 5 years of experience in that application. What you need is a developer/admin who can read documentation, who has domain knowledge in whatever it is you're doing, and who can learn quickly. If you need to develop in some kind of language, a decent developer will pick it up in less than 2 months. If you need an administrator for X web server, in two days he should have read the man pages and newsgroups and be up to speed. Anything less and that person is not worth what he is paid.
3) The common "computer science" degree curriculum:
The majority of computer science and software engineering degrees are bull. If your first programming language is Java or C#, then you're doing something wrong. If you don't get several courses full of algebra and math, it's wrong. If you don't delve into functional programming, it's incomplete. If you can't apply loop invariants to a trivial for loop, you're not worth your salt as a supposed computer scientist. If you come out with experience in x and y languages and object orientation, it's full of s***. A real computer scientist sees a language in terms of the concepts and syntaxes it uses, and sees programming methodologies as one among many, and has such a good understanding of the underlying philosophies of both that picking new languages, design methods, or specification languages should be trivial.
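For readers unfamiliar with the loop-invariant remark above, this is the kind of trivial example meant (the code is mine, purely for illustration): a property that holds before the loop, is preserved by every iteration, and yields the postcondition at exit.

```java
// Invariant: at the top of each iteration, sum == a[0] + ... + a[i-1].
// It holds initially (i == 0, sum == 0), each iteration preserves it,
// and at exit (i == a.length) it gives the postcondition: sum is the
// total of the whole array.
static int sum(int[] a) {
    int sum = 0;
    for (int i = 0; i < a.length; i++) {
        sum += a[i]; // re-establishes the invariant for i + 1
    }
    return sum;
}
```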
Getters and Setters are Highly Overused
I've seen millions of people claiming that public fields are evil, so they make them private and provide getters and setters for all of them. I believe this is almost identical to making the fields public; maybe a bit different if you're using threads (though that's generally not the case) or if your accessors contain business/presentation logic (which is 'strange' at the very least).
I'm not in favor of public fields, but I'm against making a getter/setter (or Property) for every one of them, and then claiming that doing so is encapsulation or information hiding... ha!
UPDATE:
This answer has raised some controversy in its comments, so I'll try to clarify it a bit (I'll leave the original untouched since that is what many people upvoted).
First of all: anyone who uses public fields deserves jail time
Now, creating private fields and then using the IDE to automatically generate getters and setters for every one of them is nearly as bad as using public fields.
Many people think:
private fields + public accessors == encapsulation
I say that (automatic or not) generation of getter/setter pairs for your fields effectively goes against the so-called encapsulation you are trying to achieve.
Lastly, let me quote Uncle Bob on this topic (taken from chapter 6 of "Clean Code"):
There is a reason that we keep our variables private. We don't want anyone else to depend on them. We want the freedom to change their type or implementation on a whim or an impulse. Why, then, do so many programmers automatically add getters and setters to their objects, exposing their private fields as if they were public?
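To illustrate the point (my own toy example, not Uncle Bob's), compare a class whose generated accessors leak its representation with one that exposes only behavior:

```java
// Nearly as bad as public fields: every detail of the representation
// is reachable and mutable from outside.
class AnemicAccount {
    private double balance;
    public double getBalance() { return balance; }
    public void setBalance(double balance) { this.balance = balance; }
}

// Genuinely encapsulated: callers see operations, not storage, so the
// representation (a double today, integer cents tomorrow) can change
// without breaking them.
class Account {
    private double balance;

    public void deposit(double amount) {
        if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
        balance += amount;
    }

    public void withdraw(double amount) {
        if (amount <= 0 || amount > balance) throw new IllegalArgumentException("invalid withdrawal");
        balance -= amount;
    }

    public double balance() { return balance; } // a deliberate, read-only query
}
```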
UML diagrams are highly overrated
Of course there are useful diagrams, e.g. a class diagram for the Composite pattern, but many UML diagrams have absolutely no value.
Opinion: SQL is code. Treat it as such
That is, just like your C#, Java, or other favorite object/procedure language, develop a formatting style that is readable and maintainable.
I hate seeing sloppy, free-formatted SQL code. If you scream when you see both styles of curly braces on one page, why don't you scream when you see free-formatted SQL, or SQL that obscures or obfuscates the JOIN condition?
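As a sketch of what I mean (the orders/customers schema is invented for illustration, and the text block needs Java 15+), SQL kept as a formatted constant reads like code because it is code:

```java
// One clause per line, aligned keywords, and the JOIN condition stated
// explicitly rather than buried in the WHERE clause.
static final String FIND_RECENT_ORDERS = """
    SELECT o.id,
           o.placed_at,
           c.name
    FROM   orders o
    JOIN   customers c ON c.id = o.customer_id
    WHERE  o.placed_at >= ?
    ORDER BY o.placed_at DESC
    """;
```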
Readability is the most important aspect of your code.
Even more so than correctness. If it's readable, it's easy to fix. It's also easy to optimize, easy to change, easy to understand. And hopefully other developers can learn something from it too.
If you're a developer, you should be able to write code
I did quite a bit of interviewing last year, and for my part of the interview I was supposed to test the way people thought, and how they implemented simple-to-moderate algorithms on a white board. I'd initially started out with questions like:
Given that Pi can be estimated using the function 4 * (1 - 1/3 + 1/5 - 1/7 + ...) with more terms giving greater accuracy, write a function that calculates Pi to an accuracy of 5 decimal places.
It's a problem that should make you think, but it shouldn't be out of reach of a seasoned developer (it can be answered in about 10 lines of C#; a sketch appears after the edit below). However, many of our candidates (supposedly pre-screened by the agency) couldn't even begin to answer it, or even explain how they might go about answering it. So after a while I started asking simpler questions like:
Given the area of a circle is given by Pi times the radius squared, write a function to calculate the area of a circle.
Amazingly, more than half the candidates couldn't write this function in any language (I can read most popular languages so I let them use any language of their choice, including pseudo-code). We had "C# developers" who could not write this function in C#.
I was surprised by this. I had always thought that developers should be able to write code. It seems that, nowadays, this is a controversial opinion. Certainly it is amongst interview candidates!
Edit:
There's a lot of discussion in the comments about whether the first question is a good or bad one, and whether you should ask questions as complex as this in an interview. I'm not going to delve into this here (that's a whole new question) apart from to say you're largely missing the point of the post.
Yes, I said people couldn't make any headway with this, but the second question is trivial and many people couldn't make any headway with that one either! Anybody who calls themselves a developer should be able to write the answer to the second one in a few seconds without even thinking. And many can't.
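For the curious, here is one possible shape of an answer to both questions (my sketch, not a reference solution; input validation omitted). The alternating-series bound guarantees the estimate is within 4/denominator of Pi, so we can stop once that bound drops below the target precision:

```java
public class InterviewAnswers {

    // Leibniz series: Pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...). For an
    // alternating series the error of the partial sum is at most the
    // magnitude of the next term, so 4.0 / denominator bounds the
    // remaining error of the estimate.
    static double estimatePi(double tolerance) {
        double sum = 0.0;
        double sign = 1.0;
        long denominator = 1;
        while (4.0 / denominator >= tolerance) {
            sum += sign / denominator;
            sign = -sign;
            denominator += 2;
        }
        return 4.0 * sum;
    }

    // The "trivial" second question.
    static double circleArea(double radius) {
        return Math.PI * radius * radius;
    }

    public static void main(String[] args) {
        System.out.printf("%.5f%n", estimatePi(1e-6)); // prints 3.14159
        System.out.println(circleArea(2.0));           // prints 12.566...
    }
}
```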
The use of hungarian notation should be punished with death.
That should be controversial enough ;)
Design patterns are hurting good design more than they're helping it.
IMO software design, especially good software design, is far too varied to be meaningfully captured in patterns - especially in the small number of them people can actually remember, since they're far too abstract for anyone to retain more than a handful. So they're not helping much.
And on the other hand, far too many people become enamoured with the concept and try to apply patterns everywhere - usually, in the resulting code you can't find the actual design between all the (completely meaningless) Singletons and Abstract Factories.
Less code is better than more!
If the users say "that's it?", and your work remains invisible, it's done right. Glory can be found elsewhere.
PHP sucks ;-)
The proof is in the pudding.
Unit Testing won't help you write good code
The only reason to have unit tests is to make sure that code that already works doesn't break. Writing tests first, or writing code to the tests, is ridiculous. If you write the tests before the code, you won't even know what the edge cases are. You could have code that passes the tests but still fails in unforeseen circumstances.
And furthermore, good developers will keep coupling low, which makes the addition of new code unlikely to cause problems with existing code.
In fact, I'll generalize that even further,
Most "Best Practices" in Software Engineering are there to keep bad programmers from doing too much damage.
They're there to hand-hold bad developers and keep them from making dumbass mistakes. Of course, since most developers are bad, this is a good thing, but good developers should get a pass.
Write small methods. It seems that programmers love to write loooong methods where they do multiple different things.
I think that a method should be created wherever you can name one.
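A tiny illustration of that rule (the invoice domain and rates are made up): every step that could be named became a method, and the top-level method now reads like a sentence.

```java
import java.util.List;

class Invoice {
    private static final double TAX_RATE = 0.08;      // assumed flat rate
    private static final double BULK_DISCOUNT = 0.05; // assumed policy
    private final List<Double> lineItems;

    Invoice(List<Double> lineItems) { this.lineItems = lineItems; }

    // Reads as: total is subtotal, plus tax, minus discount.
    double total() {
        double subtotal = subtotal();
        return subtotal + tax(subtotal) - discount(subtotal);
    }

    private double subtotal() {
        return lineItems.stream().mapToDouble(Double::doubleValue).sum();
    }

    private double tax(double subtotal) {
        return subtotal * TAX_RATE;
    }

    private double discount(double subtotal) {
        return lineItems.size() > 10 ? subtotal * BULK_DISCOUNT : 0.0;
    }
}
```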
It's ok to write garbage code once in a while
Sometimes a quick and dirty piece of garbage code is all that is needed to fulfill a particular task. Patterns, ORMs, SRP, whatever... Throw up a console or web app, write some inline SQL (feels good), and blast out the requirement.
Code == Design
I'm no fan of sophisticated UML diagrams and endless code documentation. In a high-level language, your code should be readable and understandable as it is. Complex documentation and diagrams aren't really any more user-friendly.
Here's an article on the topic of Code as Design.
Software development is just a job
Don't get me wrong, I enjoy software development a lot. I've written a blog for the last few years on the subject. I've spent enough time on here to have >5000 reputation points. And I work in a start-up doing typically 60 hour weeks for much less money than I could get as a contractor because the team is fantastic and the work is interesting.
But in the grand scheme of things, it is just a job.
It ranks in importance below many things such as family, my girlfriend, friends, and happiness, and below other things I'd rather be doing if I had an unlimited supply of cash, such as riding motorbikes, sailing yachts, or snowboarding.
I think sometimes a lot of developers forget that developing is just something that allows us to have the more important things in life (and to have them by doing something we enjoy) rather than being the end goal in itself.
I also think there's nothing wrong with having binaries in source control... if there is a good reason for it. If I have an assembly I don't have the source for, and which might not necessarily be in the same place on each dev's machine, then I will usually stick it in a "binaries" directory and reference it in a project using a relative path.
Quite a lot of people seem to think I should be burned at the stake for even mentioning "source control" and "binary" in the same sentence. I even know of places that have strict rules saying you can't add them.
Every developer should be familiar with the basic architecture of modern computers. This also applies to developers who target a virtual machine (maybe even more so, because they have been told time and time again that they don't need to worry themselves with memory management etc.)
Software Architects/Designers are Overrated
As a developer, I hate the idea of Software Architects. They are basically people who no longer code full time, who read magazines and articles, and who then tell you how to design software. Only people who actually write software full time for a living should be doing that. I don't care if you were the world's best coder 5 years ago before you became an Architect; your opinion is useless to me.
How's that for controversial?
Edit (to clarify): I think most Software Architects make great Business Analysts (talking with customers, writing requirements, tests, etc), I simply think they have no place in designing software, high level or otherwise.
There is no "one size fits all" approach to development
I'm surprised that this is a controversial opinion, because it seems to me like common sense. However, there are many entries on popular blogs promoting the "one size fits all" approach to development so I think I may actually be in the minority.
Things I've seen being touted as the correct approach for any project - before any information is known about it - are things like the use of Test Driven Development (TDD), Domain Driven Design (DDD), Object-Relational Mapping (ORM), Agile (capital A), Object Orientation (OO), etc. etc. encompassing everything from methodologies to architectures to components. All with nice marketable acronyms, of course.
People even seem to go as far as putting badges on their blogs, such as "I'm Test Driven" or similar, as if their strict adherence to a single approach, whatever the details of the project, is actually a good thing.
It isn't.
Choosing the correct methodologies, architectures, components, etc., is something that should be done on a per-project basis, and depends not only on the type of project you're working on and its unique requirements, but also on the size and ability of the team you're working with.