What is "overkill"? [closed] - terminology

Programmers often wonder if the use of a certain library or programming style is overkill. They also often claim that this is the case (and they are often believed).
What does "overkill" mean in the context of programming?

"overkill" is typically used to mean deploying overly flexible and/or over-engineered solutions to solve what is ostensibly a simple and highly localized problem. The canonical example is FizzBuzz Enterprise Edition.

The term "Overkill" literarilly (if there was ever a literal use of it) refers to the action of killing something or someone, with more resources than necessary. Something like shooting a deer 50 times to make sure it dies.
In programming it applies for the same principle: making use of more resources than necessary or to find an overly complex solution to a simple problem.
Some simple examples:
for i = 1 to 100
    x[i] = 2^z[i];
    y = x;        // the whole array x is copied on every iteration
end
Copying the entire array x on every iteration achieves the desired result, but copying it element-wise with y[i] = x[i] would save some 9,900 copy operations; the full copy is therefore overkill.
Using the OpenCV library to threshold an image is definitely possible, but it uses many more resources than strictly necessary and is an exaggerated example of overkill.
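For instance, a fixed-level threshold needs little more than a comparison over the pixel array; pulling in OpenCV for only that one operation adds a heavyweight dependency. A minimal sketch, assuming a NumPy uint8 grayscale image:

import numpy as np

def threshold(img, level=128):
    # Plain NumPy: pixels above the level become 255, the rest 0.
    return np.where(img > level, 255, 0).astype(np.uint8)

# The OpenCV equivalent of the same one-liner, for comparison:
#   import cv2
#   _, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)

img = np.array([[10, 200], [130, 90]], dtype=np.uint8)
print(threshold(img))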

Related

Are cloud functions a valid replacement for/implementation of a distributed system? [closed]

I want to process a list of data in parallel; processing one element of the data does not affect any other.
With, for example, Google Pub/Sub + Cloud Functions, I could achieve something scalable and parallel, which looks like a distributed system.
I have little knowledge about distributed programming, and it seems that it takes a lot of time to master.
So I would like to know: is this a replacement for, or a valid implementation of, a distributed system?
For the specific use case you're talking about - dividing work among function invocations to run in parallel - yes, it sounds like that would be adequate.
I would be very hesitant to call it a full "distributed system" (at least not without a very strict definition of what that really is). If you take Wikipedia's explanation of distributed computing, you might have a very basic system in place, but the lack of peer-to-peer direct messaging probably makes it unsuitable for many of the applications listed on that page.
The bottom line, I think, is whether it satisfies the requirements of the problem at hand. Whether or not it's a "distributed system" is mostly irrelevant - either it works for that use case or it doesn't.
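For reference, here is a minimal sketch of the fan-out pattern being discussed, assuming the google-cloud-pubsub client library and a background-triggered Cloud Function; PROJECT_ID and TOPIC are placeholder names:

import base64
import json
from google.cloud import pubsub_v1

# Publisher side: fan the list out, one Pub/Sub message per element.
def publish_items(items, project_id="PROJECT_ID", topic="TOPIC"):
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic)
    for item in items:
        publisher.publish(topic_path, json.dumps(item).encode("utf-8"))

# Cloud Function with a Pub/Sub trigger: each invocation handles exactly
# one element, independently of all the others.
def process_item(event, context):
    item = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    print("processing", item)  # the per-element work goes here

Each message triggers an independent invocation, which is the per-element parallelism described above; durability, retries and ordering are whatever Pub/Sub provides, nothing more.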

Difference between structured programming and structured approach to development? [closed]

I'm looking for clarification -
The terms 'structured programming' and 'structured development approach' refer to different things. Is this correct?
My understanding is that 'structured programming' is programming which is clear and well laid out, using subroutines, modules, etc. to keep 'structure' in the software. This is different from the 'structured software development approach', which refers to the methodology of completing the stages of development (defining the problem, planning, creating, testing and maintenance) sequentially, not starting one stage until the previous one is finished.
I'm a student and need to have these terms clear. Am I on the right track? Are there any important details I'm missing?
Yes, you are on the right track; these are different things.
Both of these terms are very broad. All programming paradigms that have been in use during the last 50 years (procedural, object-oriented, functional, etc.) are structured. And I would also say that every development methodology (waterfall, V-model, agile, etc.) is structured in some way. Of course there are different degrees.
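As a tiny illustration of the programming side (my own example): structured code expresses its flow through subroutines and standard control structures rather than arbitrary jumps.

def average(values):
    # A small subroutine: flow is expressed with a loop and a conditional
    # rather than arbitrary jumps, and it produces a single, well-defined result.
    total = 0
    for v in values:
        total += v
    return total / len(values) if values else 0

print(average([2, 4, 6]))  # 4.0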

Programming: Difference between concept and construct [closed]

When it comes to programming, what is the difference between a concept and a construct for describing "patterns" used in multiple languages?
e.g. are "immutable data structures" or "pattern matching" concepts or constructs?
To start off, concepts are ways to explain something that we observe. A construct is created somewhat in the opposite order: we "construct" it in order to work with a situation and to explain things that are not directly observed or deduced.
"Immutable data structures" is a concept, since it names something that is "tangible" (invisible to the eye, but with precise mechanics inside the computer). "Pattern matching" is a concept covering a family of quite specific procedures.
Now I guess that when you think about "construct" you are also referring to programming "constructs" such as operators (+, -, *). In that case they are also concepts that happen to be called "constructs", but they shouldn't be considered equivalent to the construct/concept distinction above ;).
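To make that concrete: "pattern matching" as a concept shows up in many languages, while each language provides its own construct for it; in Python 3.10+ that construct is the match statement (a small sketch of my own):

def describe(point):
    # match/case is the concrete Python construct; destructuring a value
    # by its shape is the language-independent concept.
    match point:
        case (0, 0):
            return "origin"
        case (x, 0):
            return f"on the x-axis at {x}"
        case (x, y):
            return f"point at ({x}, {y})"
        case _:
            return "not a 2-D point"

print(describe((3, 0)))  # on the x-axis at 3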
Hopefully, this helped you sort those words out.
Dan

Which coding abbreviations are widely used and generally understood? [closed]

Sometimes it is difficult to invent a short and accurate name for a method or variable that is clear at first glance. Abbreviations are not recommended today. But, for instance, if we use
src and dest in function params, it is contextually clear what they mean
i often means index: for(int i=0; i<size; i++).
Are there any other abbreviations like these used in more than one language?
There are a lot of abbreviations that are common and generally understood ("args", "i18n", "HTTP", …).
I'm not sure I get your question right, still less that there is a good answer to it. A good abbreviation is short (duh) and easily understood by its readers. So the choice of an abbreviation depends on the audience; what is obvious to a particle physicist might seem far-fetched to a financial analyst or a games developer.
Maybe the misleading word is abbreviation. Choosing a good abbreviation is the same problem as naming things in general (which is known as one of the two hard things in computer science). Readability is more important than conciseness.
Conclusion: if you know your audience (and your problem domain, of course), you should be able to find understandable (and not too long) names.
[Update]
Robert Martin wrote a whole 14-page chapter on "Meaningful Names" in his book on Clean Code, which for me is a Must-Read for every serious developer.

Story generation [closed]

Upon reading a blog post about a minimalist story-generating Python program, I was asking myself - and you - what are the most successful attempts at such programs. I remember seeing something using generative grammars, for instance. And which are the best attempts that, like this one, are extremely compact, either self-contained or able to read, say, the Web or an independent textual corpus (but not simply a file with a large number of story chunks)?
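For illustration, the generative-grammar approach can indeed be extremely compact; a self-contained sketch (my own, not the program from the blog post) might look like this:

import random

# A tiny generative grammar: each non-terminal (uppercase) expands to one
# randomly chosen alternative until only plain words remain.
GRAMMAR = {
    "STORY": ["Once upon a time, HERO WANTED OBJECT. EVENT, and so HERO ENDING."],
    "HERO": ["a lonely robot", "the youngest princess", "an old fisherman"],
    "WANTED": ["longed for", "set out to find"],
    "OBJECT": ["a forgotten song", "the last map of the sea"],
    "EVENT": ["A storm changed everything", "A stranger offered help"],
    "ENDING": ["never returned", "found something better instead"],
}

def expand(text):
    for symbol, options in GRAMMAR.items():
        while symbol in text:
            text = text.replace(symbol, random.choice(options), 1)
    return text

print(expand("STORY"))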
Search for Talespin for some famous ground breaking work. (Example: Micro-Talespin in Common Lisp by Warren Sack.)
I actually like Turner's "Minstrel: A Computer Model of Creativity and Storytelling" better:
ftp://ftp.cs.ucla.edu/tech-report/1992-reports/920057.pdf
Talespin is, in my opinion, blind in its algorithm to everything but planning, so author goals are given very little consideration (if any at all). Minstrel is better in that respect.