framework for a browser-based MMO? - language-agnostic

I want to create a browser-based MMO similar to "monopoly city streets." Is there a good framework available for this kind of thing?

Generally speaking, browser-based 'MMOs' have little in common at the technical level with traditional MMOs and are usually just websites with a recreational element. As such, your options are much the same as they are for any website, with the added caveat that you probably want a richer client than JavaScript can offer. Flash and Silverlight are your two main contenders there, and there are various libraries and frameworks available for them.
One option I know about that is geared directly towards larger online games is SmartFoxServer, which comes highly recommended. This is better suited to games that require a real-time element, although in practice such games are rarer than you think.

The short answer: no.
The long answer:
Back in 2003 or so, I was using Game Maker extensively. I frequented the Game Maker Community, and every now and then a question would pop up in the Novice Questions & Answers section: "How I make MMORPG?".
There is no framework for making a browser-based MMORPG because the subject is vast. RuneScape is an MMORPG, and it's Java-based. But so is Kingdom of Loathing, and it's based on PHP (turn-based).
Also, you will need a design that is better than "Our game is going to be like X."

You could use MMO.js... it allows you to build great MMORPGs without worrying about sockets, threads, or the server-side handling...

Monopoly City Streets is itself built upon two publicly available APIs [1], one of which is nicely suited to real-time game development, although neither is comprehensive nor designed for 'non-technical' use.
MMO is a catch-all term that can refer to a great many different technical approaches and the differing hazards and skills required to attempt them. Effectively it refers to scale, rather than the actual style of game. Whilst a framework might deal with a very specific type of game concept, it's unlikely to be what you had in mind.
Certainly to my knowledge there is no layman's MMO framework for any of the common mapping APIs.
[1] http://en.wikipedia.org/wiki/Monopoly_City_Streets


How to deal with internal company frameworks and SW factories?

Based on my own experience and that of my friends, I see that many companies have the strange idea of developing their own frameworks and SW factories (which build the skeleton of an application for you). These ideas are usually based on the belief that their own framework will be much better than anything else available. How do you deal with such ideas, and how do you explain that it is not always a good way to go?
Why I think internal frameworks / factories are not good:
Budget & Resources - There is usually only an initial budget to create the framework. Nobody thinks about the budget needed to maintain and support it, and nobody can even estimate the budget and resources maintenance will require. At the beginning, nobody thinks about maintaining multiple versions of the framework to support already existing applications.
Lack of experience - The framework is usually created by people without any such experience, or with the support of "consultants" - generally much more expensive people with a similar skill set.
Architecture / design - any architectural problem in the framework affects all applications built using it. Bad design decisions in the framework force developers to code applications in a bad way.
Technical debt - bad code in the framework is technical debt.
False belief in a silver bullet - managers believe that their own framework / factory is a silver bullet: all applications will be written in the same way and will be easily maintainable. My experience is that this is simply not true. Even with a SW factory, each application is specific.
Insufficient documentation - documentation is the first thing affected by a low budget. A framework without documentation is useless. Reflector (.NET) is my best friend.
Insufficient user group - an internal framework has only a small user group, and a small user group means little collective experience. If I'm using a public tool / framework and I have a problem, I can ask a question on SO (or a similar site) or just try to find the answer on Google. With an internal framework that is not possible.
Policy - company policy forces you to use the framework to justify its costs. This goes so far that the framework is chosen before the first requirement is gathered.
Complaints about the framework are prohibited.
Usage of other frameworks is prohibited.
Why I think companies are doing this:
Arrogance & egoism - somebody in the company believes that he can do it better.
Ignorance - ignoring existing frameworks / solutions and the fact that only good frameworks survive long enough to become popular. Ignoring the user group and the information already available on the Internet.
Management failure and incompetence - not understanding the (especially long-term) impact of this decision. Decisions based on incorrect information. Management without a SW development background.
I understand that sometimes an in-house solution or framework is needed for a specialized scenario, but I'm tired of all these "great internal frameworks" for creating web or desktop applications. Am I wrong? Are these frameworks really needed (.NET and Java world)? Can you give me an example or a reason why it is good to have an internal framework / factory?
Edit:
Thanks for the answers, but I expected some advice on how to deal with the problem as a developer (other than changing jobs), not as a manager.
In my experience the most common cause of excess frameworks is... bored developers! Uninspired developers find that developing frameworks to solve their problems is much more fun than actually solving those problems - the end result is frameworks that suffer from all of the above (because of course the developer only did the fun bits), and possibly don't even solve the actual problem (because the goal was to have fun, not to solve the problem).
The solution is tricky - it's difficult to know what motivates developers, as everyone is motivated by different things; however, motivated developers who are busy doing things they enjoy don't seem to suffer from this ailment!
That said, well thought out frameworks, when properly used, are definitely a good thing - however, if it's only going to be used internally then it might be better to think of it as an extension of refactoring and code reuse rather than as a framework.
A classic sign that someone is suffering from bored developer framework syndrome is when a framework is being developed to solve the general case when there isn't yet a solution for the specific case:
How can you know how to solve the general case until you have solved at least one specific case?
Until you've had to solve the general case a few times you are only guessing that a framework is even needed.
The second case of course leads to the worst sort of framework - the mammoth generic framework that is only ever used once, to solve a problem which is far simpler than the framework itself.
Instead consider these sorts of frameworks more as an exercise in "extensive refactoring" - if the framework is produced as a way of grouping and tidying common code on an as-needed basis then the framework will grow in size and complexity dynamically - having already solved the problem before you start producing frameworks is nice too as it means you are already experts in whatever it is the framework needs to do.
Finally, try to keep developers from being bored (they get up to all sorts of mischief otherwise!)
Generalization is bad, but here is what I've noticed, especially in large enterprise projects:
Such behavior is generally driven by one or a few software engineers (they usually call themselves software architects) who fit the descriptions you wrote in your question. Everybody needs to be proud of something to have the courage to wake up in the morning. I will add that they are usually CV-driven for that reason and try to apply the latest design patterns without thinking about the business implications and ROI. The key is NOT to hire that kind of person (or to try to coach him/her to understand the business consequences of his/her choices). Trying to make them proud of working for the company instead of working on their framework may help too.
Missed deadlines, bugs, and high turnover tend to be addressed by applying fancy methodologies (usually badly implemented) like Scrum, or by hiring highly priced consultants who will make things worse... instead of removing (or coaching) the people who shouldn't have been hired in the first place.
Removing the person in question is in most cases a bad thing, since he OWNS the thing. So teaching him/her to understand the consequences of his/her choices is probably the most appropriate way to solve the problem. But to do that, you need a good manager.
So my only advice would be:
Hire better managers who understand both business and software development very well. They won't hire that kind of person, or they will try to teach them how to consider the business in addition to pure software development. They will also understand that the most powerful motivational fuel for employees is making them proud of working for the company.
I would add some other reasons these things evolve, and I have seen this in more than one place:
- Developer lock-in. Once you have the devs coding in a non-transferable skill set, it's harder for them to leave.
- Author lock-in. Once you have several apps under maintenance dependent on the framework, you have the organisation dependent on the administrative group.
- Political control. The centralisation allows the framework to become a channel for political control.

What are some advanced software development topics every developer should know? [closed]

Let's say your company has given you the time & money to acquire training on as many advanced programming topics as you can eat in a year, carte blanche. What would those topics be, and how would you prefer to acquire them?
Assumptions:
You still have deliverables to bring into existence, but you're allowed one week per month of the year for this training.
The training can come from anywhere, i.e. classroom, on-site instructor, books, subscriptions, podcasts, etc.
Subject matter can cover any platform, technology, language, DBMS, toolset, etc.
Concurrent/parallel programming and multi-threading, especially with respect to memory models and memory coherency. I think every programmer should be aware of the considerations in this arena as we move into a world of multi-core/multi-CPU hardware.
For this I would probably use Internet research most heavily, but an on-campus primer at a good university could be a good way to start off.
Security!
Far too many programmers just build something and think they can add security as an afterthought after finishing the "main" part of the program. You could always benefit from knowing more about how to secure your app, how to design software to be secure from the get-go, how to do intrusion detection, etc.
Advanced Database Development
Things like data warehousing (MDX, OLAP queries, star schemas, fact tables, etc), advanced performance tuning, advanced schema and query patterns, and the like are always useful.
Here are the three that I'm always finding myself explaining to junior developers who didn't get enough CS training. All that other stuff is generally more hype than substance, or can be fairly easily picked up. But if you don't know these three, you can do a great deal of damage:
Algorithm analysis, including Big O notation.
The various levels of cohesion and coupling.
Amdahl's Law, and how it pertains to optimizations.
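For the Amdahl's Law item, a quick back-of-the-envelope calculation makes the point. The sketch below is in Python with made-up numbers, purely for illustration:

    def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
        """Overall speedup when only `parallel_fraction` of the work can be
        spread across `workers`; the rest stays serial."""
        serial_fraction = 1.0 - parallel_fraction
        return 1.0 / (serial_fraction + parallel_fraction / workers)

    # Even with 90% of the work parallelizable, the speedup is capped at 10x.
    for n in (2, 4, 16, 1024):
        print(f"{n:>5} workers -> {amdahl_speedup(0.9, n):.2f}x speedup")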
Internationalization issues, especially since it sounds like it would not be an advanced topic. But it is.
Accessibility
It's ignored by so many organizations but the simple fact of the matter is that there are a huge number of people with low or no vision, color blindness, or other differences that can make navigating the web a very frustrating experience. If everybody had at least a little bit of training in it, we might get some web based UIs that are a little more inclusive.
Object oriented design patterns.
I guess "advanced" is different for everyone, but I'd suggest the following as being things that most decent developers (i.e. ones that don't need to be told about NP-completeness or design patterns) could gain from:
Multithreading techniques that go beyond "lock" and when to apply them (see the sketch after this list).
In-depth training to learn and habitualize themselves with clever features in their toolchain (IDE/text editor, debugger, profiler, shell).
Some cryptography theory and hands-on experience with different common flaws in security schemes that people create.
If they program against a database, learn the internals of their database and advanced query composition and tuning techniques.
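For the first item above, one technique beyond a plain lock is a condition variable. Here is a minimal sketch in Python - a toy bounded buffer invented for illustration - where producers and consumers sleep on a threading.Condition instead of busy-waiting:

    import threading
    from collections import deque

    class BoundedBuffer:
        """Toy bounded buffer: the condition variable lets threads sleep until
        there is room to put or something to get, instead of busy-waiting."""

        def __init__(self, capacity: int):
            self._items = deque()
            self._capacity = capacity
            self._cond = threading.Condition()  # wraps a lock plus wait/notify

        def put(self, item) -> None:
            with self._cond:
                while len(self._items) >= self._capacity:
                    self._cond.wait()            # releases the lock while waiting
                self._items.append(item)
                self._cond.notify_all()          # wake any waiting consumers

        def get(self):
            with self._cond:
                while not self._items:
                    self._cond.wait()
                item = self._items.popleft()
                self._cond.notify_all()          # wake any waiting producers
                return item

    if __name__ == "__main__":
        buf = BoundedBuffer(capacity=2)
        consumer = threading.Thread(target=lambda: [print("got", buf.get()) for _ in range(5)])
        consumer.start()
        for i in range(5):
            buf.put(i)
        consumer.join()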
Developers should know the basics of SQL development and how their decisions impact database performance. It is one thing to write a query; it is another thing to write a query, understand the explain plan, and make design decisions based on that output. I think a good course on PL/SQL development and database performance would be very beneficial.
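The specifics differ per database (the answer above has Oracle's PL/SQL and explain plans in mind), but the habit is the same everywhere. As a rough sketch - using SQLite only because it ships with Python, with an invented table - here is the plan for the same query before and after adding an index:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                     [(i % 100, i * 1.5) for i in range(10000)])

    query = "SELECT total FROM orders WHERE customer_id = ?"

    def show_plan(label):
        print(label)
        for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
            print("   ", row)   # each row describes one step of the plan

    show_plan("Before index:")  # expect a full table scan
    conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
    show_plan("After index:")   # expect an index search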
Unfortunately communication skills seem to fall under the "advanced topics" section for most developers (present company excluded, of course).
Best way to acquire this skill: practice.
Take off the headphones, and talk to someone instead of IM'ing or emailing the guy at the next desk.
Pick up the phone and talk to a client instead of lobbing an email over the fence.
Ask questions at a conference instead of sitting behind your laptop screen twittering.
Actively participate in a non-technical meeting at work.
Present something in public.
Most projects do not fail because of technical reasons. They fail because they could not create a team. Communication is vital to team dynamics.
It will not harm your career either.
One of the best courses I took was a technical writing course. It has served me well in my career.
Additionally: it probably does not matter WHAT the topic is - the fact that the organization is interested in it, is paying for it, and the developers want to go and do go is a better indicator of success/improvement than any one particular topic.
I also don't think it matters that much what the topic is. Dev organizations deal with so many things during a project that training, followed by on-the-job implementation and trial and error, will always get you some better perspective - even if the attempts to try out/use the new stuff fail. That experience will probably help more on the subsequent projects.
I'm a book person, so I wouldn't really bother with instruction.
Not necessarily in this order, and depending on what you know already
OO Programming
Functional Programming
Data structures and algorithms
Parallel processing
Set based logic (essentially the theory behind sql and how to apply it)
Building parsers (I only put this, because it actually came up where I work)
Software development methodologies
NP Completeness. Specifically, how to detect if a problem is NP-Complete, and how to build an approximate solution to the problem.
I see this as important because you don't want a developer to try and solve an NP-complete problem by getting the optimum solution, unless the problem's search space is very small, in which case brute force is acceptable. However, as the search space increases, the time required to solve the problem increases exponentially.
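As a concrete illustration of settling for an approximation, here is a sketch (Python, with a made-up graph) of the classic 2-approximation for minimum vertex cover, an NP-complete problem: repeatedly pick an uncovered edge and take both of its endpoints. The result is never more than twice the optimal size, and it runs in linear time rather than exponential time.

    def approx_vertex_cover(edges):
        """Greedy 2-approximation for minimum vertex cover: take both
        endpoints of any edge that is not yet covered."""
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.add(u)
                cover.add(v)
        return cover

    # Tiny made-up graph: the optimal cover is {b, d} (size 2); the
    # approximation may return up to 4 vertices, never more.
    edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")]
    print(approx_vertex_cover(edges))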
I'd cover new technologies and trends. Some of the new technologies I'm researching/enhancing my skills with include:
Microsoft .NET Framework v3.0/v3.5/v4.0
Cloud Computing Frameworks (Amazon EC2, Windows Azure Services, GoGrid, etc.)
Design Patterns
I am from MS based developer world, so here is my take on this
More about new concepts in Cloud Computing (various APIs, etc.), as the industry has been betting on it for some time.
More about LINQ for the .NET Framework
Distributed databases
Refactoring techniques (which implies also learning to write a good set of unit/functional tests).
Knowing how to refactor is the best way to keep code clean -- it is rare when you get it right the first time (especially in new designs).
A number of refactorings, however, require a decent set of tests to check that the refactoring did not add unexpected behavior.
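A minimal sketch of that point (names and amounts invented for illustration): a characterization test pins the current behaviour, so the tax logic that used to be copy-pasted into both totals could be extracted into one helper and the test rerun to confirm nothing observable changed.

    import unittest

    # The tax calculation used to be duplicated in both functions; extracting
    # it into _with_tax is safe because the test below pins the behaviour.
    def _with_tax(amount: float, rate: float = 0.2) -> float:
        return round(amount * (1 + rate), 2)

    def invoice_total(line_amounts) -> float:
        return _with_tax(sum(line_amounts))

    def refund_total(line_amounts) -> float:
        return _with_tax(sum(line_amounts))

    class TotalsTest(unittest.TestCase):
        def test_invoice_and_refund_agree(self):
            self.assertAlmostEqual(invoice_total([10.0, 5.0]), 18.0, places=2)
            self.assertAlmostEqual(refund_total([10.0, 5.0]), 18.0, places=2)

    if __name__ == "__main__":
        unittest.main()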
Parallel computing- the easiest and best way to learn it
Debugging
Debugging by David J. Agans is a good book on the topic. Debugging can be very complex when you deal with multi-threaded programs, crashes, algorithms that don't work, etc. Everybody would be better off being good at debugging.
I'd vote for real-world battle stories. Have developers from other organizations present their successes and failures. Don't limit the presentations to technologies you're using. With a significantly complex project, this is bound to cut into 'advanced' topics you haven't even considered. Real-world successes (and failures) have a lot to teach.
Go to the Stack Overflow DevDays and the ACCU conferences
Read
Agile Software Development, Principles, Patterns, and Practices (Robert C. Martin)
Clean Code (Robert C. Martin)
The Pragmatic Programmer (Andrew Hunt&David Thomas)
Well if you're here I would hope by now you have the basics down:
OOP Best practices
Design patterns
Application Security
Database Security/Queries/Schemas
Most notably developers should strive to learn multiple programming languages and disciplines, in order for their skill set to be expanded in more than one direction. They don't need to become experts in these other skills but at least have a very acute understanding of integration with their central discipline. This will make them much better developers in the long run, and also let them gain the ability to use all tools at their disposal to create applications that can transcend the limitations of a singular language.
Outside of programming specific topics, you should also learn how to work under Agile, XP, or other team based methodologies in order to be more successful while working in a team environment.
I think an advanced programmer should know how to get your employer to give you the time & money to acquire training on as many advanced programming topics that you can eat in a year. I'm not advanced yet. :)
I'd suggest an Artificial Intelligence class at a college/university. Most of the stuff is fun, easy to grasp (the basics at least), and the solutions to problems are usually creative.
Hitchhikers Guide to the Galaxy.
How would I prefer to acquire the training? I'd love to have a substantial amount of company time dedicated to self-training.
I totally agree on accessibility. I was asked to look into it for the website at work, and there is a real lack of good knowledge on the subject, not least a lack of CSS standards to aid the likes of screen readers.
However, my answer goes to GUI design - it's quite a difficult thing to get right. There are too many awful applications out there that could be prevented just by taking the time to follow HCI (Human-Computer Interaction) advice/designs. Take Google/Apple for inspiration when making a GUI - not your typical hundreds-of-buttons/labels combo that too often gets pushed out.
Automated testing: Unit testing, functional integration testing, non-functional testing
Compiler details (more relevant on some platforms than others): How does the compiler implement certain common constructs in language X? On a byte-code interpreted platform, how does JIT compilation work? What can be JIT-compiled (for example, can virtual calls be JIT compiled?)?
Basic web security
Common design idioms from other problem domains than the one you're working in at the moment.
I'd recommend learning about Refactoring, Test Driven Development, and various unit testing frameworks (NUnit, Visual Test, CppUnit, etc.) I'd also learn how to incorporate automated unit testing into your continuous integration builds.
Ultimately if you can prove your code does what it claims it can do, you don't have to be there to answer questions as to why or how. If a maintainer comes along and tries to "fix" your code, they'll know instantly if they broke it. Tests written around the requirements (use cases) explain to the maintainer what your users wanted it to do, and provide a little working example of how to call it. Think of unit tests as functional documentation.
Test Driven Development (TDD) is a more novel design approach that begins with the requirements, where you start by writing a test before you write the code. You then write exactly enough code required to pass the test. You have to stop before you write extra code (that you may never need), because you will refactor it later if you find that you really needed it.
What makes TDD cool is that a bad interface (such as one with lots of dependencies) is also very hard to write tests for. It's so hard that a coder would rather refactor the interface to make it easier to test. And that refactoring simplifies the code, removing inappropriate dependencies, or grouping related tests together to make it easier to test, thus improving cohesion. By making it immediately apparent to the developer when he's writing a badly interfaced module, the developer sticks to the architecture and gravitates to the principles of tight cohesion and loose coupling. Good interfaces are the natural result. And as a bonus, once you pass all your tests, you know you're done.
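A tiny sketch of the test-first loop described above (the slugify function and its single requirement are invented for illustration): the test exists and fails before the production code is written, and only just enough code is added to make it pass.

    import unittest

    # Step 2: just enough code to turn the failing test green; anything more
    # would be speculative and only gets added when a new test demands it.
    def slugify(title: str) -> str:
        return "-".join(title.lower().split())

    # Step 1: this test was written first and failed before slugify existed.
    class SlugifyTest(unittest.TestCase):
        def test_lowercases_and_joins_words_with_dashes(self):
            self.assertEqual(slugify("Clean Code"), "clean-code")

    if __name__ == "__main__":
        unittest.main()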
On the surface this seems like an easy question to answer, just enter your favorite pet peeve about what other developers can't do correctly. But when I read through the answers and gave it some thought, I realized that every "advanced topic" brought up was covered in my undergraduate computer science curriculum--20 years ago. And I doubt that OO, security, functional programming, etc. concepts have changed in that time. Sure the tools have, but I argue that tools are different than topics.
So what is an "advanced topic" in computer science? Who is the Turing, Knuth, Yourdon of the 21st century?
I don't have a clear answer to this question, though I'd like to see more work on theories for parallel programming that will enable tools to abstract that messy stuff for developers.
Quite funny that no one has mentioned:
debugging,
the tools & IDE you work with,
and the platform you are developing for.
Everyday development is much more fun if you know your tools really well, and you accomplish more and make your life easier if you know how to debug someone else's code with ease.
Source Control

How is web programming different from back-end programming?

I have worked on single threaded business logic/back-end programming for most of my career. I now wish to learn web programming but would like to know how web programming is different from non-GUI programming (e.g. writing an API or a file processing application). I am not talking about the GUI design aspects (someone has already asked that question here) but more about programming complexity.
On the few occasions when I have worked on a web application, I felt that web applications are relatively more non-deterministic and unpredictable (for example, due to the event-driven, multi-threaded model of web applications, there are several permutations and combinations of events and actions one needs to take care of).
What would you say are some of the basic features of web programming that makes it different from non-GUI applications? What are the pitfalls/mistakes a back-end developer might commit while working on web applications?
EDIT
My definition of back-end programming means non-GUI applications like an API or a file-processing batch application that parses a large data file, reads the records, does a lot of number-crunching calculations on the data, and spews out the results into another file or database. Another example could be a library of date and time utilities.
The biggest challenge with web programming is dealing with state. HTTP is a stateless protocol. This can make maintaining state more challenging than in a desktop application. Web applications tend to have a different life cycle due to this. Each web development platform deals with this somewhat differently, but they all need to deal with it in some way.
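As a rough sketch of what "dealing with state" looks like in practice - assuming a Python/Flask stack purely for illustration, not anything prescribed by the answer above - each request arrives with no memory of the previous one, so the application round-trips its state through a signed session cookie (or a database):

    from flask import Flask, session

    app = Flask(__name__)
    app.secret_key = "replace-with-a-real-secret"  # used to sign the session cookie

    @app.route("/add/<item>")
    def add_item(item):
        # HTTP itself remembers nothing between requests; the per-user state
        # lives in the cookie-backed session and is reloaded every time.
        cart = session.get("cart", [])
        cart.append(item)
        session["cart"] = cart
        return f"cart now holds {len(cart)} item(s)"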
Web applications generally feel like single threaded applications, as you - the application developer - rarely create threads of your own. If anything, it's actually a lot easier, because the stateless nature of the web transactions means that you have to load the data for the page each time from the database. Therefore, you don't have to worry about concurrency, since 'whatever is there' is usually good enough.
The biggest problem with Web development is all of the background knowledge that you have to accumulate over time. How do you lay out web pages? How do you style things with CSS? How do you get parameters from the query string? How do you validate a field value in JavaScript? All of those things are actually really easy to learn, but there's just so many of them that it can be a real pain.
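For instance, pulling parameters out of a query string is a one-liner once you know where to look; a small sketch with the Python standard library (the URL is made up):

    from urllib.parse import urlparse, parse_qs

    url = "https://example.com/search?q=monopoly&page=2"
    params = parse_qs(urlparse(url).query)    # values come back as lists
    print(params["q"][0], params["page"][0])  # -> monopoly 2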
The biggest pitfall I've witnessed application developers make when moving to the Web is not considering the cost of their code. Either they abuse MySQL to the point of bogging the RDBMS down, they write code that uses too much memory, or they make front-end pages that are too big to fit down a dial-up/cellphone or low-end broadband/DSL pipe.
Sometimes a heavy-duty page can't be avoided, but considerations can be made to cache as much as possible; too often, when writing a page that will be hit a lot, they make no effort to profile and optimize queries before they go out the door.
It's not that these people are stupid; it's just a lack of experience and awareness that they need to play nice and write code that's somewhat lean.
Back-end programming is infinitely easier than web programming. (You have been warned!) Web programming is the easiest to show off to everybody.
Most web sites have a back end component as well. Typical structure will be something like:
UI - html/css/javascript
Controller - if using MVC
Business Logic/Services - this is backend
Database - this is also backend
So building web sites will still mean a lot of back end work. In regards to the UI, the main difference is that you will need to have a good eye for design and layout to do it well. The html/css technology is pretty simple in itself.
HTML was actually developed to deliver physics papers. You can still see it in some of the old meta tags. At any rate, the difference is that web programming is stateless and thick-client development is not.
As you have adeptly indicated, it's all driven by events. True, JavaScript has mucked up web development a bit by creating the illusion of a stateful environment, but in the end everything comes down to simple HTML.
It's never too late to start learning. I would say start by making some static HTML pages and work your way up to an MVC framework; I suggest the Microsoft MVC Framework. It's pretty fantastic. There are others you could use, like ASP.NET WebForms, but you won't learn anything by dragging and dropping things onto a designer ;).
Web & GUI applications interface with humans; back-end applications interface with services and databases. As such, your specifications need to include significant consideration of your users' mental model - making things behave as people expect them to. And doing that - understanding how users think - is not always easy or logical. You may have elegant algorithmic solutions that simply fail to engage, because people don't always think logically. Many times, quite elegant UIs are extremely twisted coding-wise, which is very contrary to system-to-system programming.
Depending on problem-space, much of this can be more art than science.
One consideration (amongst many) with web programming is that users won't just be stupid (not that they all are, but you always have to factor that in), they will sometimes (assume always) be downright malicious and nasty, and will do everything in their power to destroy your application, your database, your weekends, your sanity...
Be as paranoid as a very small nun at a penguin shoot. Do not trust your users.
Another consideration is that Back End programming as per your definition is easier to test.
Once you begin web programming you're at the mercy of the various browsers' different interpretations of the same code. Plus the user, with inputs of mouse and keyboard, has a variety of ways to break what you produce.
Web programming isn't back-end programming. It shows stuff on the front end, the web.
Are you defining it otherwise?
EDIT
Web programming pulls you into presenting data consistently, visually, to everyone. Back end coding means constructing that data, in the same way for presentation, but not presenting it.
Based on your definition of "back end programming," your question applies not only to web applications, but to any GUI application.
It kind of depends what kind of GUI application we're talking about. For example:
Internal business applications tend to involve lots of business process workflow logic, record keeping, and interoperability between separate systems. No fancy algorithms or number crunching are needed. Your audience is limited, so performance is not a big deal, but cross-platform compatibility is important, so these tend to be web applications. Your main concerns are making it easy to tie business systems together and keeping the API layered to ensure that the GUI code does not have to deal with any of the business logic code.
Public web sites (such as this one) tend to involve less of a formal architecture, and more of a mentality of "just get this cool feature to work so we can get more visitors." Again, no number crunching or algorithms unless performance is an issue. Performance is more of an issue for massively popular sites like Slashdot or Google, so if you anticipate rapid growth it pays to design for scalability in advance.
Public e-commerce web sites are kind of like both of the above: features and performance are important, but equally important is the structured architecture underneath it that ties all of the commerce business systems together (purchasing, supplier, shopping cart, payment gateways, etc.)
For the actual GUI portion, the complexity of the application kind of determines how complex the GUI code will be. For highly complex, nested GUIs where your requirements change often, it's easy to fall into the trap of putting too much GUI stuff into one page. Soon the page exceeds most people's complexity threshold, making the page very difficult to maintain. It pays to think in advance how you can separate different portions of the GUI into separate components, and then tie them together. If you're new to GUI programming, read some articles on the Model-View-Controller (MVC) pattern.
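To make the MVC suggestion slightly more concrete, here is a very small sketch (Python, with invented names) of the separation the pattern buys you: the model knows nothing about display, the view knows nothing about storage or rules, and the controller wires the two together.

    class TaskModel:
        """Model: owns the data and the business rules, knows nothing about the UI."""
        def __init__(self):
            self._tasks = []

        def add(self, title: str) -> None:
            if not title.strip():
                raise ValueError("task title must not be empty")
            self._tasks.append(title.strip())

        def all(self):
            return list(self._tasks)

    class TaskView:
        """View: only renders; it never touches storage or business rules."""
        def render(self, tasks) -> str:
            return "\n".join(f"- {t}" for t in tasks) or "(no tasks)"

    class TaskController:
        """Controller: turns a user action into model calls, then asks the view
        to render the result."""
        def __init__(self, model: TaskModel, view: TaskView):
            self.model, self.view = model, view

        def add_task(self, title: str) -> str:
            self.model.add(title)
            return self.view.render(self.model.all())

    controller = TaskController(TaskModel(), TaskView())
    print(controller.add_task("Review the checkout page"))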
For simple web sites, where most pages are fairly static, this issue doesn't come up so much because each individual page is easy to maintain.
Most web programming is done in the style popular in the early seventies, before Dijkstra's 'goto considered harmful' was well-known.

How do you plan an application's architecture before writing any code? [closed]

One thing I struggle with is planning an application's architecture before writing any code.
I don't mean gathering requirements to narrow in on what the application needs to do, but rather effectively thinking about a good way to lay out the overall class, data and flow structures, and iterating on those thoughts so that I have a credible plan of action in mind before even opening the IDE. At the moment it is all too easy to just open the IDE, create a blank project, start writing bits and bobs and let the design 'grow out' from there.
I gather UML is one way to do this but I have no experience with it so it seems kind of nebulous.
How do you plan an application's architecture before writing any code? If UML is the way to go, can you recommend a concise and practical introduction for a developer of smallish applications?
I appreciate your input.
I consider the following:
what the system is supposed to do, that is, what is the problem that the system is trying to solve
who is the customer and what are their wishes
what the system has to integrate with
are there any legacy aspects that need to be considered
what are the user interactions
etc...
Then I start looking at the system as a black box and:
what are the interactions that need to happen with that black box
what are the behaviours that need to happen inside the black box, i.e. what needs to happen to those interactions for the black box to exhibit the desired behaviour at a higher level, e.g. receive and process incoming messages from a reservation system, update a database etc.
Then this will start to give you a view of the system that consists of various internal black boxes, each of which can be broken down further in the same manner.
UML is very good to represent such behaviour. You can describe most systems just using two of the many components of UML, namely:
class diagrams, and
sequence diagrams.
You may need activity diagrams as well if there is any parallelism in the behaviour that needs to be described.
A good resource for learning UML is Martin Fowler's excellent book "UML Distilled" (Amazon link - sanitised for the script kiddie link nazis out there (-: ). This book gives you a quick look at the essential parts of each of the components of UML.
Oh, and what I've described is pretty much Ivar Jacobson's approach. Jacobson is one of the Three Amigos of OO. In fact, UML was initially developed by the other two people who form the Three Amigos, Grady Booch and Jim Rumbaugh.
I really find that a first-off of writing on paper or whiteboard is really crucial. Then move to UML if you want, but nothing beats the flexibility of just drawing it by hand at first.
You should definitely take a look at Steve McConnell's Code Complete, and especially at his giveaway chapter on "Design in Construction".
You can download it from his website:
http://cc2e.com/File.ashx?cid=336
If you're developing for .NET, Microsoft have just published (as a free e-book!) the Application Architecture Guide 2.0b1. It provides loads of really good information about planning your architecture before writing any code.
If you were desperate I expect you could use large chunks of it for non-.NET-based architectures.
I'll preface this by saying that I do mostly web development where much of the architecture is already decided in advance (WebForms, now MVC) and most of my projects are reasonably small, one-person efforts that take less than a year. I also know going in that I'll have an ORM and DAL to handle my business object and data interaction, respectively. Recently, I've switched to using LINQ for this, so much of the "design" becomes database design and mapping via the DBML designer.
Typically, I work in a TDD (test driven development) manner. I don't spend a lot of time up front working on architectural or design details. I do gather the overall interaction of the user with the application via stories. I use the stories to work out the interaction design and discover the major components of the application. I do a lot of whiteboarding during this process with the customer -- sometimes capturing details with a digital camera if they seem important enough to keep in diagram form. Mainly my stories get captured in story form in a wiki. Eventually, the stories get organized into releases and iterations.
By this time I usually have a pretty good idea of the architecture. If it's complicated or there are unusual bits -- things that differ from my normal practices -- or I'm working with someone else (not typical), I'll diagram things (again on a whiteboard). The same is true of complicated interactions -- I may design the page layout and flow on a whiteboard, keeping it (or capturing via camera) until I'm done with that section. Once I have a general idea of where I'm going and what needs to be done first, I'll start writing tests for the first stories. Usually, this goes like: "Okay, to do that I'll need these classes. I'll start with this one and it needs to do this." Then I start merrily TDDing along and the architecture/design grows from the needs of the application.
Periodically, I'll find myself wanting to write some bits of code over again or think "this really smells" and I'll refactor my design to remove duplication or replace the smelly bits with something more elegant. Mostly, I'm concerned with getting the functionality down while following good design principles. I find that using known patterns and paying attention to good principles as you go along works out pretty well.
http://dn.codegear.com/article/31863
I use UML, and find that guide pretty useful and easy to read. Let me know if you need something different.
UML is a notation. It is a way of recording your design, but not (in my opinion) of doing a design. If you need to write things down, I would recommend UML, though, not because it's the "best" but because it is a standard which others probably already know how to read, and it beats inventing your own "standard".
I think the best introduction to UML is still UML Distilled, by Martin Fowler, because it's concise, gives practical guidance on where to use it, and makes it clear you don't have to buy into the whole UML/RUP story for it to be useful.
Doing design is hard. It can't really be captured in one Stack Overflow answer. Unfortunately, my design skills, such as they are, have evolved over the years, so I don't have one source I can refer you to.
However, one model I have found useful is robustness analysis (google for it, but there's an intro here). If you have your use-cases for what the system should do, a domain model of what things are involved, then I've found robustness analysis a useful tool in connecting the two and working out what the key components of the system need to be.
But the best advice is read widely, think hard, and practice. It's not a purely teachable skill, you've got to actually do it.
I'm not smart enough to plan ahead more than a little. When I do plan ahead, my plans always come out wrong, but now I've spent n days on bad plans. My limit seems to be about 15 minutes on the whiteboard.
Basically, I do as little work as I can to find out whether I'm headed in the right direction.
I look at my design for critical questions: when A does B to C, will it be fast enough for D? If not, we need a different design. Each of these questions can be answered with a spike. If the spikes look good, then we have the design and it's time to expand on it.
I code in the direction of getting some real customer value as soon as possible, so a customer can tell me where I should be going.
Because I always get things wrong, I rely on refactoring to help me get them right. Refactoring is risky, so I have to write unit tests as I go. Writing unit tests after the fact is hard because of coupling, so I write my tests first. Staying disciplined about this stuff is hard, and a different brain sees things differently, so I like to have a buddy coding with me. My coding buddy has a nose, so I shower regularly.
Let's call it "Extreme Programming".
"White boards, sketches and Post-it notes are excellent design
tools. Complicated modeling tools have a tendency to be more
distracting than illuminating." From Practices of an Agile Developer
by Venkat Subramaniam and Andy Hunt.
I'm not convinced anything can be planned in advance before implementation. I've got 10 years' experience, but that's only been at 4 companies (including 2 sites at one company that were almost polar opposites), and almost all of my experience has been in terms of watching colossal cluster********s occur. I'm starting to think that stuff like refactoring is really the best way to do things, but at the same time I realize that my experience is limited, and I might just be reacting to what I've seen. What I'd really like to know is how to gain the best experience so I'm able to arrive at proper conclusions, but it seems like there's no shortcut and it just involves a lot of time seeing people doing things wrong :(. I'd really like to have a go at working at a company where people do things right (as evidenced by successful product deployments), to know whether I'm just a contrarian bastard, or if I'm really as smart as I think I am.
I beg to differ: UML can be used for application architecture, but is more often used for technical architecture (frameworks, class or sequence diagrams, ...), because this is where those diagrams can most easily been kept in sync with the development.
Application Architecture occurs when you take some functional specifications (which describe the nature and flows of operations without making any assumptions about a future implementation), and you transform them into technical specifications.
Those specifications represent the applications you need for implementing some business and functional needs.
So if you need to process several large financial portfolios (functional specification), you may determine that you need to divide that large specification into:
a dispatcher to assign those heavy calculations to different servers
a launcher to make sure all calculation servers are up and running before starting to process those portfolios.
a GUI to be able to show what is going on.
a "common" component to develop the specific portfolio algorithms, independently of the rest of the application architecture, in order to facilitate unit testing, but also some functional and regression testing.
So basically, to think about application architecture is to decide what "group of files" you need to develop in a coherent way (you can not develop in the same group of files a launcher, a GUI, a dispatcher, ...: they would not be able to evolve at the same pace)
When an application architecture is well defined, each of its components is usually a good candidate for a configuration component, that is, a group of files which can be versioned as a whole in a VCS (Version Control System), meaning all its files will be labeled together every time you need to record a snapshot of that application (again, it would be hard to label your whole system, since each of its applications cannot be in a stable state at the same time).
I have been doing architecture for a while. I use BPML to first refine the business process and then use UML to capture various details! Third step generally is ERD! By the time you are done with BPML and UML your ERD will be fairly stable! No plan is perfect and no abstraction is going to be 100%. Plan on refactoring, goal is to minimize refactoring as much as possible!
I try to break my thinking down into two areas: a representation of the things I'm trying to manipulate, and what I intend to do with them.
When I'm trying to model the stuff I'm trying to manipulate, I come up with a series of discrete item definitions- an ecommerce site will have a SKU, a product, a customer, and so forth. I'll also have some non-material things that I'm working with- an order, or a category. Once I have all of the "nouns" in the system, I'll make a domain model that shows how these objects are related to each other- an order has a customer and multiple SKUs, many skus are grouped into a product, and so on.
These domain models can be represented as UML domain models, class diagrams, and SQL ERD's.
Once I have the nouns of the system figured out, I move on to the verbs- for instance, the operations that each of these items go through to commit an order. These usually map pretty well to use cases from my functional requirements- the easiest way to express these that I've found is UML sequence, activity, or collaboration diagrams or swimlane diagrams.
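A sketch of what those "nouns" can look like once they leave the whiteboard - plain Python dataclasses with invented fields, purely to show the relationships (an order has a customer and SKUs, SKUs group into a product):

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Customer:
        name: str
        email: str

    @dataclass
    class Sku:
        code: str
        price: float

    @dataclass
    class Product:
        name: str
        skus: List[Sku] = field(default_factory=list)   # many SKUs per product

    @dataclass
    class Order:
        customer: Customer
        lines: List[Sku] = field(default_factory=list)  # an order holds SKUs

        def total(self) -> float:
            return sum(sku.price for sku in self.lines)

    # The "verbs" (commit an order, and so on) then become the methods and
    # services layered on top of these nouns.
    alice = Customer("Alice", "alice@example.com")
    tee = Product("T-shirt", [Sku("TEE-S", 10.0), Sku("TEE-M", 15.0)])
    order = Order(alice, [tee.skus[0], tee.skus[1]])
    print(order.total())   # 25.0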
It's important to think of this as an iterative process; I'll do a little corner of the domain, and then work on the actions, and then go back. Ideally I'll have time to write code to try stuff out as I'm going along- you never want the design to get too far ahead of the application. This process is usually terrible if you think that you are building the complete and final architecture for everything; really, all you're trying to do is establish the basic foundations that the team will be sharing in common as they move through development. You're mostly creating a shared vocabulary for team members to use as they describe the system, not laying down the law for how it's gotta be done.
I find myself having trouble fully thinking a system out before coding it. It's just too easy to give only a cursory glance to some components which you only later realize are much more complicated than you thought they were.
One solution is to just try really hard. Write UML everywhere. Go through every class. Think how it will interact with your other classes. This is difficult to do.
What I like doing is to make a general overview at first. I don't like UML, but I do like drawing diagrams which get the point across. Then I begin to implement it. Even while I'm just writing out the class structure with empty methods, I often see things that I missed earlier, so then I update my design. As I'm coding, I'll realize I need to do something differently, so I'll update my design. It's an iterative process. The concept of "design everything first, and then implement it all" is known as the waterfall model, and I think others have shown it's a bad way of doing software.
Try Archimate.

How to get started with speech-to-text?

I'm really interested in speech-to-text algorithms, but I'm not sure where to start studying up on them. A bunch of searching around led me to this, but it's from 1996 and I'm fairly certain that there have been improvements since then.
Does anyone who has any experience with this sort of stuff have any recommendations for reading / source code to examine? Or just general advice on what I should be trying to learn about if I want to get into the world of writing speech recognition programs (sometimes it's hard to know what to search for if you don't have much knowledge about the domain).
Edit: I'd like to do something cross-platform, but for the moment I'd be targeting linux.
Edit 2: Thanks csmba for the well-thought-out reply. At this point in time, I'm mainly interested in being able to create applications that allow automation, or the execution of different commands, through voice. So, a limited set of recognizable commands that can be strung together. An example would be a music player that took commands like "Play the album Hello Everything by Squarepusher", or an application launcher that allowed the user to create voice shortcuts to launch specific apps.
I realize that it's a pretty giant problem, and that I have nowhere near the level of knowledge required right now to tackle implementing an entire recognition engine, although the techniques involved with doing so fascinate me, and it is something I'd like to work myself up to doing. In all likelihood, I'll probably end up picking up a book or two on the subject and studying up / playing with "simple" implementations in my free time.
This is a HUGE question; I wouldn't know how to begin... So let me just try giving you the right "terms" so you can refine your quest:
First, understand that Speech Recognition is a diverse and complicated subject, and it has many different applications. People tend to map this domain to the first thing that comes to their head (usually, that would be computers understanding what you are saying, as in IVR systems). So first let's break the concept down into the main categories:
Human-to-Machine: Applications that deal with understanding what a human is saying, but the human knows he is talking to a machine and the grammar is very limited. Examples are
Computer automation
Specialized: pilots automating some controls, for example (noise is a huge problem)
IVR (Interactive Voice Response) systems like Google-411 or when you call the bank and the computer on the other side says "say 'service' to get customer service"
Human-to-human (spontaneous speech): This is a bigger, more complex problem. Here we can also break it down into different applications:
Call Center: conversation between Agent-Customer, phone quality, compressed
Intelligence: radio/phone/live conversations between 2 or more individuals
Now, Speech-To-Text is not what you should be saying you care about. What you care about is solving a problem. Different technologies are used to solve different problems. See an overview here of some of them. To summarize, other approaches are phonetic transcription, LVCSR and direct-based.
Also, are you interested in being the PhD behind the technology? You would need a Master's equivalent involving signal processing, and probably a PhD to be cutting edge. In that case, you will work for a company that develops the actual speech engine. Companies like Nuance and IBM are the big ones, but Philips and other startups exist too.
On the other hand, if you want to be the one implementing applications, you will not be working on the engine, but on building applications that USE the engine. A good analogy, I think, is from the gaming industry:
Are you developing the graphics engine (like the Cry engine), or working on one of several hundred games that all use the same graphics engine?
Don't get me wrong, there is plenty of work to do on quality outside the IBMs/Nuances of the world. The engine is usually very open, and there is a lot of algorithmic tweaking to be done that can dramatically affect performance. Each business application has different constraints and cost/benefit functions, so you can spend many years experimenting and building better voice-recognition-based applications.
One more thing: in general, you would also want to have a good statistics background the lower in the stack you want to be.
At this point in time, I'm mainly interested in being able to create applications that allow automation
Good, we are converging here... Then you have no interest in "Speech-to-Text". That buzzword takes you to the world of full transcription, a place you do not need to go. You should be focusing on some of the more Human-to-Machine technologies like VoiceXML and the ones used in IVR systems (Nuance is the biggest player there).
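In that human-to-machine world, the "grammar" is often just a small set of patterns mapped to actions. Below is a toy sketch in Python (the regexes and actions are invented for illustration) of matching an already-transcribed command - roughly the shape of the music-player example in the question, once a speech engine has handed back text:

    import re

    # Each entry pairs a regex for one recognizable command with the action it triggers.
    GRAMMAR = [
        (re.compile(r"play the album (?P<album>.+) by (?P<artist>.+)", re.I),
         lambda m: f"queueing album '{m['album']}' by {m['artist']}"),
        (re.compile(r"(launch|open) (?P<app>\w+)", re.I),
         lambda m: f"launching {m['app']}"),
    ]

    def handle(transcribed_text: str) -> str:
        """Map a transcription from the speech engine onto one known command."""
        for pattern, action in GRAMMAR:
            match = pattern.fullmatch(transcribed_text.strip())
            if match:
                return action(match)
        return "sorry, I didn't catch that"

    print(handle("Play the album Hello Everything by Squarepusher"))
    print(handle("open firefox"))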
I would definitely recommend picking up a book or two if you are new to the field. I've got no experience in the field, so I can't make a recommendation. If you are still in college (or still have close ties), you should find out if any of your professors can make a recommendation.
The survey you linked is probably an excellent resource, too. I'm sure there have been advancements since 1996, but the basics are unlikely to have fundamentally changed. If the survey is well-written, then it would be well worth your time to read it.
For OS X check out this: OS X Speech Technologies
For Windows check out this: Microsoft Speech API
I have worked with IBM's ViaVoice product. It has a good ASR (automated speech recognition) engine and a nice text-to-speech engine.
The website's not very good, but this is a link for the Embedded version: http://www-01.ibm.com/software/voice/support/
It is platform-agnostic though, and everything works through an MVC architecture using VXML, a variant of XML for voice purposes.
What platform are you targeting? There are the Microsoft Speech APIs that you can use if it's for Windows.
There is also the Speech Recognition Service for Android.