B2B Application Building and Maintenance Cost - integration

I've been considering for some time now getting into the B2B integration business. I've researched the tools available for doing this,
like Oracle's WebLogic Integration, IBM's WebSphere, or Microsoft's BizTalk. They all seem to do the job (each having its ups and downs).
I've also looked at some companies that are already doing this (e.g. www.hubspan.com). It seems that B2B integration is a much-needed service.
Although my background is in integrating commercial products with open source software, I feel that where the B2B integration world is concerned,
I still need to fill in some blanks.
So basically, I'd like to clear up a few things concerning all this:
All the frameworks that I previously mentioned are just that: frameworks. They let you build an application ON TOP of them;
they are not intended to be a final product. I assume that this is because the integration needs of different companies vary so much
that an out-of-the-box solution is just not possible. So my question is: do the applications built with these frameworks vary so much
from business to business that it's not possible to reuse them?
Also, is it possible to build a single framework of Suppliers and Customers (a Core of some kind) and connect new Customers and/or
Suppliers as they come? (This is the way HubSpan did it, not counting the development of custom Connectors to each client's ERP system.)
Or will I have to do a separate integration for each Customer?
How many work hours are required to complete a typical integration project, assuming everything is planned and executed properly?
(For the sake of simplicity, let's say that the integration includes only 'Query Product Price', 'Query Product Availability',
and 'Purchase Order Management'.)
And finally, is this a job for a single person (can I do this myself, assuming I have the knowledge to do it), or is a team required?
Thanks in advance for sharing your thoughts and opinions.
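
To make the "Core plus connectors" idea concrete, here is a minimal sketch (in Python) of what such an abstraction might look like. The interface, the three operations and the demo connector are hypothetical illustrations, not how HubSpan or any of the frameworks above actually do it:

    from abc import ABC, abstractmethod

    class ErpConnector(ABC):
        """Hypothetical per-customer adapter: the shared Core speaks only
        this interface; each new Customer/Supplier gets an implementation."""

        @abstractmethod
        def query_price(self, sku: str) -> float: ...

        @abstractmethod
        def query_availability(self, sku: str) -> int: ...

        @abstractmethod
        def submit_purchase_order(self, order: dict) -> str:
            """Return an order confirmation id."""

    class DemoErpConnector(ErpConnector):
        # Stubbed out; a real connector would call the customer's ERP here.
        def query_price(self, sku: str) -> float:
            return 9.99
        def query_availability(self, sku: str) -> int:
            return 42
        def submit_purchase_order(self, order: dict) -> str:
            return "PO-0001"

    # The Core routes each request to whichever connector a partner registered.
    connectors = {"acme-corp": DemoErpConnector()}
    print(connectors["acme-corp"].query_price("SKU-123"))

In this model the Core and the interface are the reusable parts; the per-customer work is writing and maintaining each connector, which matches how HubSpan's approach is described above.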

Yes, they can vary that much.
It depends on the business. Some will integrate easily, while others will need custom modules and connectors.
There isn't really a "typical" integration project.
It depends on the size of the project. If you're talking Fortune 500 companies, then no. If you're talking a local manufacturer and a (presumably small) local supply house, then maybe.
This is probably a question better asked on programmers.stackexchange.

I think it varies a lot. You should probably define what you mean by B2B, as there are a lot of different types these days. From a BizTalk perspective, it is possible to build an application service provider (ASP) version of B2B, but it is hard to do.
The level of customization is one of the factors that drives up cost and the length of the project. I think it is difficult to do B2B alone; usually there is so much business domain knowledge specific to each company that you need those business people to help explain the existing systems.
Thanks,

Related

Why should we go for one solution for ETL/DWH/BI rather than a specialized solution for each of these?

I'm working on a reporting service application where I use ETL/DWH/BI, with SSIS packages for the ETL and SQL Server for the data warehouse. My client wants to know why they should go for one solution covering ETL/DWH/BI rather than a specialized solution for each of these.
I would be pleased to hear any suggestions.
Thanks
Business needs vary; no one will be able to tell you what you should do for your client without analyzing that client's needs.
However, general reasons for using the SQL Server family of tools include:
All components are supported by the same vendor (Microsoft). This can help prevent the finger-pointing that occurs when you have a problem integrating tools from different vendors.
In general, compatibility issues are reduced by using a single vendor and feature overlap is less common between tools within a suite.
Licensing/acquisition is streamlined. Buying from a single vendor will generally require less overhead than buying from multiple vendors.
Consolidated technical base. In my experience, there is a lot of overlap in the skills of an individual professional. Other than a non-development DBA and entry-level developers, the SQL Server professionals I have worked with generally have at least a functional understanding of the full suite of tools. I've yet to work with someone that can only produce in a single tool (SSIS, SSRS, etc). When you have tools from multiple vendors, you will have a harder time finding candidates that already have knowledge of all of your tools. This may not be an issue if your staffing needs do not force individuals to wear multiple hats or your volume of work in each category (ETL, Reporting, etc) requires dedicated resources.
Some downsides:
Rarely are all of the tools "best of breed". The tools may be very good, but a focused competitor can generally outperform a suite in some areas.
Vendor lock-in is even more likely. If a tool falls behind the curve, it can be much harder to replace when the tool is part of a comprehensive suite.

Is BizTalk The Correct Solution?

We have about six systems (they are all internal systems) that we need to send data between. Currently we do not have a consistent way of doing this. We use SSIS, SQL Server linked servers to directly update databases, ODBC connections to directly update databases, text files, etc.
Our goals are:
1) Have a consistent way of connecting applications.
2) Have a central way of monitoring and logging the connections between applications.
3) For the applications that offer web services, we would like to start using those instead of connecting directly to the database.
Whatever we use will need to be able to connect to web services, databases, flat files, and should also be able to accept data via a tcp connection.
Is BizTalk a good solution for this, or is it overkill?
It really depends. For the architecture you're describing, it would seem a good fit. However, you will need to validate whether BizTalk can communicate with the systems you are trying to integrate. For example, if these systems use web services, message queues or file-based communication, it may be a good fit.
When you start with BizTalk, you have to be willing to invest in hardware, software, and most of all in learning to use it.
Regarding your points:
1) Yes, if you make sure to encapsulate the system connectors correctly.
2) Yes, BizTalk supports this with BAM (Business Activity Monitoring).
3) Yes, that would match perfectly.
From what you've described (six systems), it is definitely a good time to investigate a more formalized approach to integration, as you've no doubt found that a point-to-point / direct integration approach results in a large number of permutations (spaghetti) as each new system is added: fully interconnecting n systems means maintaining up to n(n-1)/2 links, so your six systems could need as many as 15 separate interfaces.
BizTalk supports both hub-and-spoke and bus-type topologies (the latter with the ESB Toolkit), either of which will reduce the number of interconnects between your systems.
To add to oɔɯǝɹ:
Yes - ultimately BizTalk converts everything to XML internally, and you will use either visual maps or XSLT to transform between message types (see the sketch below).
Yes. Out of the box there are a lot of WMI and Perfmon counters you can use, plus BizTalk has a SCOM management pack to monitor BizTalk's health. For your apps there is BAM (TPE for simple monitoring; more advanced tracking can be done with the BAM API).
Yes - BizTalk supports all the common WCF binding types as well as basic SOAP web services, and BizTalk's MessageBox can be used as a pub/sub engine, which allows you to 'hook' other processes into messages at a later stage.
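BizTalk's visual maps compile down to XSLT, so the transformation step above is plain XSLT under the hood. As a rough illustration of that kind of message-type transformation outside BizTalk (using Python's lxml, with a made-up order message):

    from lxml import etree

    # A made-up source message and a stylesheet mapping it to a new shape.
    src = etree.XML("<Order><Qty>3</Qty></Order>")
    stylesheet = etree.XML("""
        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:template match="/Order">
            <PurchaseOrder>
              <Quantity><xsl:value-of select="Qty"/></Quantity>
            </PurchaseOrder>
          </xsl:template>
        </xsl:stylesheet>""")

    transform = etree.XSLT(stylesheet)
    print(transform(src))  # prints the transformed PurchaseOrder document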
Some caveats:
- BizTalk should be used for messages (e.g. electronic documents passing across the organisation), but not for bulk data synchronisation. SSIS is a better bet for really large data transfers / data migration / data synchronisation patterns.
- As David points out, there is a steep learning curve to BizTalk, and the tool itself isn't free (it requires SQL Server and BizTalk licenses, and usually you will want a monitoring tool like SCOM as well). To fast-track this, you would need to send devs on BizTalk training or bring in a BizTalk consultant.
- Microsoft seems to be focusing on Azure Service Bus, and there is speculation that BizTalk will be merged into Azure Service Bus at some point in the future. If your enterprise strategy isn't entirely Microsoft, you might also want to consider products like NServiceBus and FUSE for an ESB.
Your problem is a typical enterprise problem. Companies start off building isolated applications (HR, web, supply chain, inventory, client management, etc.) over a number of years, and once they reach the point where these applications can no longer live in isolation and need to talk to each other, they typically start with some hacked solution like data migration at the database level.
But very soon they realize the problems (no clear visibility, poor management, no standards, etc.) and they create a real spaghetti. The biggest threat is that the applications become dependent on one another and you lose your agility to change anything. Any change to the system then requires heavy testing and a long release cycle.
This is the kind of problem a middleware platform like BizTalk Server will solve for you. A lot of replies in this thread focused on the cost of BizTalk Server (some of the costs mentioned are not correct, BTW). It's not a cheap product, but if you look at the role it plays in your organisation as a central middleware platform connecting all the applications together, and at the non-functional benefits you get out of the box (adapters to most third-party products such as SAP, Oracle, FTP, FILE and web services; the ability to scale your platform easily; performance; long-running workflows; durability; compensation logic for long-running workflows; throttling for your environment), the cost factor will soon diminish.
My recommendation is to take a look at BizTalk. If you are new to it, engage with your local Microsoft office; either they can help, or they can recommend a partner who can come in and analyse your situation.

In which domains are message oriented middleware like AMQP useful?

What problems does MOM (Message Oriented Middleware) solve? Scalability? Integration?
In which domains are they typically used, and in which domains are they typically not used?
For example, say, is Google using such a solution for its main search engine or to power GMail?
What about big websites like Walmart, eBay, FedEx (pretty much a Java shop) and buy.com (pretty much an MS shop)? Does MOM solve a need there?
Does it make any sense when you're writing a webapp where you control the server side and have a homogeneous environment (say tens of Amazon EC2 instances, all running Linux + Java JVMs) and where the clients are, well, web browsers?
Does it make sense for desktop apps that need to communicate with a server?
Or is it 'only' for big enterprise stuff where you typically have a happy mix of countless different systems that need to communicate in one way or another?
I'm a bit confused as to what they're useful for, and I think that with examples of where they're appropriate and where they're not, I could better understand their use.
This is a great question.
The main uses of messaging are: scaling, offloading work, integration, monitoring, event handling, routing, networking, push, mobility, buffering, queueing, task sharing, alerts, management, logging, batch, data delivery, pubsub, multicast, audit, scheduling, ... and more. Basically: anything where you need data but don't want to make a database request. (Caching is another, longer story).
Another way of looking at this is to notice that many applications used to be built by assuming that users (people) would perform actions that would be fulfilled by executing a transaction on a database (including reads, writes). But today, many actions are not user-initiated. Instead they are application-initiated. For example "tell me when the book that I want to buy is in stock". The best way to solve this class of problems is with messaging of some sort. Whether you call it middleware or web push or real time salad dressing does not matter. It's all messaging.
When you enable applications to initiate or react to events, then it is much easier to scale because your architecture can be based on loosely coupled components. It is also much easier to integrate those components if your messaging is based on a stable, scalable, serviceable tool, preferably using open standard APIs and protocols.
I hope this helps. We try to maintain a list of useful links about messaging here
Please get in touch with questions and comments on any of this, we are dead easy to find.
To address your specific questions:
In which domains are they typically used, and in which domains are they typically not used?
Like databases, messaging systems crop up everywhere.
For example, say, is Google using such a solution for its main search engine or to power GMail?
Google uses a lot of home grown technology, but a lot of their open source contributions and known use cases suggest that messaging is (or should be) central to some of the main services.
What about big websites like Walmart, eBay, FedEx (pretty much a Java shop) and buy.com (pretty much an MS shop)? Does MOM solve a need there?
Very much so.
An example use case is scaling web page requests. When the user makes a web request, the web server puts it onto a queue for background processing. This means that the web server can keep working while the request is processed. It also means that the web server does not need to know how the request is handled, making system maintenance, upgrade and rollback much simpler because the main parts are 'decoupled'.
So, anyway, the web request gets processed by a back end service, or possibly by many services, eg 'look up book titles', 'draw shopping cart', 'get advertisement', 'check user account'... Finally all the results get put onto another queue, ready for collection and user response by the web server. Typically the system will include a timeout of around 100ms so that any late requests just get thrown away. The user sees anything that got processed in the time interval. This is one reason why some large ecommerce sites have pages that appear to load in stages.
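As a minimal sketch of this offloading pattern (assuming a local RabbitMQ broker and the Python pika client; the queue name and payload are made up):

    import pika

    # Producer: the web tier drops the request on a queue and moves on.
    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    ch = conn.channel()
    ch.queue_declare(queue="web_requests")
    ch.basic_publish(exchange="", routing_key="web_requests",
                     body="look up book titles for user 42")

    # Consumer: a back-end worker processes requests at its own pace.
    def handle(channel, method, properties, body):
        print("processing:", body.decode())

    ch.basic_consume(queue="web_requests", on_message_callback=handle,
                     auto_ack=True)
    ch.start_consuming()  # blocks; in practice the worker is its own process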
There are many more use cases...
Does it make any sense when you're writing a webapp where you control the server side and have a homogeneous environment (say tens of Amazon EC2 instances, all running Linux + Java JVMs) and where the clients are, well, web browsers?
Definitely. If you have an unknown, or unbounded, number of users, server-side instances, and application latencies, then it makes sense to use messaging, even if just as a scalable substrate for non-blocking RPC.
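That RPC-over-messaging substrate can be sketched with pika too (queue names and payload are made up, and the worker that would service 'rpc_requests' is omitted): the client tags each request with a correlation id and a private reply queue, so no thread ever blocks waiting on the server.

    import uuid
    import pika

    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    ch = conn.channel()
    ch.queue_declare(queue="rpc_requests")

    # Private reply queue; a server would send results back here.
    reply_q = ch.queue_declare(queue="", exclusive=True).method.queue
    corr_id = str(uuid.uuid4())

    # Fire the request and carry on; nothing blocks waiting on the server.
    ch.basic_publish(
        exchange="", routing_key="rpc_requests",
        properties=pika.BasicProperties(reply_to=reply_q,
                                        correlation_id=corr_id),
        body="get_user_profile 42")

    # Later, poll for the matching reply (None until a worker responds).
    method, props, body = ch.basic_get(queue=reply_q, auto_ack=True)
    if props is not None and props.correlation_id == corr_id:
        print("reply:", body.decode())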
Does it make sense for desktop apps that need to communicate with a server?
In lots of cases. One very common case is when the server pushes events to the desktop app, e.g. game events, tweets, price feeds in finance, system alerts...
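A sketch of that push style, again with pika against a local RabbitMQ broker (the 'alerts' fanout exchange is made up): each client binds its own throwaway queue to a fanout exchange, and every published event is copied to all of them.

    import pika

    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    ch = conn.channel()
    ch.exchange_declare(exchange="alerts", exchange_type="fanout")

    # Client side: a private, exclusive queue bound to the exchange.
    q = ch.queue_declare(queue="", exclusive=True).method.queue
    ch.queue_bind(exchange="alerts", queue=q)

    # Server side: one publish reaches every connected client.
    ch.basic_publish(exchange="alerts", routing_key="", body="price update")

    ch.basic_consume(queue=q,
                     on_message_callback=lambda c, m, p, body: print(body.decode()),
                     auto_ack=True)
    ch.start_consuming()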
Or is it 'only' for big enterprise stuff where you typically have a happy mix of countless different systems that need to communicate in one way or another?
Definitely not only for those 'legacy integration' cases, but they are important too. At RabbitMQ, the biggest customers we have in terms of pure scale or message volume are cloud providers and big web application providers.
I will answer only one part, from prior experience: take a look at the middleware that is employed by big companies. Middleware has one purpose: to glue disconnected systems (written in disparate languages) together so that they can interact with one another and streamline the business process. Entera, which I have experience with, creates a middle layer in which a Unix box, using processes written in C, interacts with a mainframe system (DB2, COBOL) via a front-end written in PowerBuilder (I am not naming the company!).
From the description I have given, Entera is middleware which provides a number of things: smooth integration of the flow of data regardless of endian format, and the ability for different languages to talk to the middleware broker (a broker is a CORBA- or DCE-like process, conforming to The Open Group's specifications, that listens on a particular port and is specified by an IDL, which makes a process appear to be local). If you understand the terminology used in Remoting under Microsoft's .NET Framework, you are not far off the mark! The middleware generates stubs which are linked in at compile time, and it manages the creation of the process, hosting it off a port, and multi-threading at run time. Modern front-ends (.NET, Java, PowerBuilder, even the unspeakable VB6... OK, VB.NET for the purists out there) can interact with it by opening a connection to the specified port on a particular IP address and, using the generated stubs, work with it directly.
Obviously, from what was described, you can see how legacy systems can have new life breathed into them, and thus the process gains scalability. The major downside of this is the cost factor, which can run into thousands of dollars. Big companies who use mainframes as their back-end processing systems for billing/invoicing, and who generate huge revenue, can obviously afford such an expensive product; to them it would seem like throwing pennies into a pool of water. Because middleware prolongs the business process and breathes new life into it, it can extend the business a good number of years into the future without the worry of a 'legacy' tag being attached to it.
Incidentally, I carried this out as part of my thesis for my BSc in Information Systems, which covered this commercial front-end. There was an open source version of the middleware available on SourceForge, called FreeDCE, but development efforts on it have declined or stopped.
Edit:
#cocotwo: That is exactly what middleware does; as you said, it is a plumbing tool. Message-oriented middleware is not really heard of, AFAIK, because I would imagine the processes (functions) would need to be called as if they were locally visible within the application domain of the front-end to make them easy to interact with.
Using messages may have advantages over RPC calls in that messages are queued in a safe holding area in the event that a network disconnection occurs; there may be some data caching going on within that aspect to allow the front-end to continue regardless. It would be useful in instances like 'updating the status of a particular billing/invoice number': a one-way write of data to the back-end via the middleware.
OK, big companies would have advanced systems infrastructure, with technicians around the clock to ensure smooth delivery of the data flow, so that would have to be factored in. The company that I worked with had an IBM Global Support contract to fulfil in order to ensure maximum uptime of 99% with six nines after the decimal point, with hot-swapping/balanced clusters/mirroring systems in place.
Whereas with RPC, if a disconnection occurs, the front-end would have to be restarted or would have to handle the disconnection event. It really depends on whether the message-queueing middleware handles each message in real time and passes results back to the front-end immediately.
This is where message-queueing and RPC-related middleware each have their strengths and weaknesses, along with the cost mitigation factors such as support, maximum uptime, development effort and training. The last is a biggie here, as middleware is really proprietary (despite following The Open Group layouts/standards) and complex to set up and to glue together via scripts.
Good answers and discussion here. Our consulting team has two preferred "messaging" solutions: RabbitMQ, and NXTera, a high-speed RPC middleware and the contemporary version of the Entera mentioned above. My partners and I have developed several solutions using RabbitMQ; it is the best tool available in that space right now. Additionally, I happen to work for the company that makes NXTera/Entera.
From experience I can clearly say that both of these products meet the need for reliability and low maintenance discussed above. There are situations where a messaging service like RabbitMQ is the right choice: where publish/subscribe, certified delivery, queuing or store-and-forward are required.
In other cases, RPCs (remote procedure calls) are the best and fastest solution for transactional and distributed processing for enterprise or cloud-based applications. When it is right to use an RPC, but SOAP/.NET (yes, these are RPC implementations) is too slow, expensive or complex, a lightweight, high-speed RPC middleware like NXTera/Entera is the right choice for us.
There is some use-case overlap between RPC middleware and message-oriented middleware, and where there is, you can use either successfully. But both are strong and dependable choices.
The large companies I work with use both RPC and MOM side by side. As far as Internet companies go, Google (Protocol Buffers) and Facebook (Thrift) show that RPCs have a role to play in modern web and cloud-based development.

Pros and cons of building apps with proprietary database systems

I've been interested in 4D SAS' database product for a long time, though I have barely touched it in eons.
In considering what tools to use for application development, especially one that will require a database component, what should be looked for when considering open-source tools like MySQL and PostgreSQL vs proprietary solutions like 4D or Pervasive SQL?
What good (and bad!) experiences has the SO community had with various DB tools like 4D, Pervasive, FilemakerPro, etc?
Any bad experiences?
It is difficult to make a relevant list of pros and cons without a context.
My advice would be the following: when making the decision to use a proprietary database, make sure that this decision is based on strong facts and not merely a technical interest in an exotic tool. Weigh the benefits of using the proprietary database against the advantages of a non-proprietary solution.
The answer is different from system to system.
A prerequisite is that your system is well identified, with a clear scope and a fairly predictable evolution, so that the results of your analysis will be robust. Then, if the proprietary solution brings a real benefit for your system, if you are comfortable with the support, and if you can afford the overall cost, you are a good candidate for the proprietary solution.
4D is a proprietary database system, cross-platform across Mac OS and Windows only, with both stand-alone and client-server varieties. You would do well to compare it to Alphafive.com's software, which is Windows-only. I've worked with it for 17 years and it has served me and my department very well. Off the top of my head ...
Pros:
Interface & code are closely tied to the data engine which makes development of rich, cross-platform user interfaces very fast and easy.
Proprietary relational data engine runs natively on both platforms, along with native client interfaces (but requires licenses for multi-users). Auto-relations are helpful (but sometimes get in the way).
Can access external systems via SOAP and ODBC and SQL drivers (limited).
Can access 4D from external systems via SOAP or http requests & web pages.
Native procedural programming language, based on Pascal, that is EASY to learn.
Excellent tool for small to mid-sized departments.
The latest version accepts a subset of SQL commands AND the original data access, so its backward-compatibility record has been very good.
Security is EASY in 4D.
You can build solutions to deploy through a variety of means, and are not limited by whether or not MS Access is installed.
Cons:
Interface & code are closely tied to the data engine which can lead to limited use of abstraction and "black-box" coding unless you make it a goal of your development.
Compiles to one monolithic structure file, forcing a restart for single fixes.
Language is still only procedural, making it harder for object-oriented programmers to accept. Every method requires a separate "file" in 4D, so you can't include more than one function or procedure in a single routine; it will take some getting used to.
While the company appears to be in good shape, growing and developing, you simply never know, as they keep their condition to themselves.
Company has never really marketed itself--trusting in its developer base to spread the word and grow the product through site deployments and product upgrades. Web site is clearly useful only to developers who already use the product -- it simply fails to attract new users.
Product upgrades have always seemed to focus on how the tool is better for the DEVELOPERS rather than for the CUSTOMERS of those developers.
SQL lacks views, compound indexes, and other common SQL features.
When a user requests a report of specific columns of data, I often have to write yet another program just to provide that specific data -- I can't always just query the data and generate a text file.
Does not handle new OS versions with nearly the ease of web browser based applications. Older version is broken on Mac OS 10.6, and newest version requires the latest Mac OS 10.6. No version is certified yet on Windows 7.
I've spent nearly a year learning ASP.NET and a few weeks on Ruby on Rails. While SQL data stores are EASY, user interfaces are HARD, but worth it when your application still functions through OS upgrades. You can always use an older browser if the latest version breaks something.
I'd recommend you consider either of those, depending on how much funding you have available to implement the project (Rails being the cheaper of the two). Then ANY system with a web browser can access the data, and you can fix interface pages on the fly as needed rather than taking the whole system down for a few minutes for a single, simple update. Those skills might also be more marketable in the future.
I will only say one thing: watch the "actual" cost of your decision. Most proprietary database systems are Windows-only, or sometimes Mac/Windows-only.
This means that along with paying quite a bit of money for the database system, you must also pay a good amount of money for a server operating system to run it.
Also, compare the database system with current open source solutions. Is it really worth it? After moving from Microsoft SQL Server (which has a free edition, but anyway) to PostgreSQL, I was blown away that people pay so much for SQL Server. I mean, Postgres to me is a lot cleaner, most of it works exactly how you'd expect (unlike certain SQL Server syntaxes), and it has more features built in (programming stored procs in Ruby, anyone?).
So basically, compare the proprietary offering with the open source software, and decide which one to take based on total pricing (including the OS) and feature set.
Pro of zeroing in on any DB: it's got good non-portable features that help you get things done
Con of zeroing in on any DB: sometimes a different DB is appropriate (for example, running your tests with in-memory SQLite instances, as sketched after this list), but that option is now closed
Con of a proprietary commercial DB: if you need many instances, licensing costs can kill you
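For what it's worth, the in-memory SQLite trick mentioned above looks like this in Python, using only the standard library (the table and data are made up):

    import sqlite3

    # Each connection to ':memory:' is a fresh, throwaway database,
    # which makes test setup and teardown essentially free.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
    assert conn.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 1
    conn.close()

Code written against portable SQL can swap this in for the production engine during tests; a proprietary engine rarely offers an equivalent.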
Consider the following questions:
How easy (or difficult) is it to make changes in maintenance? Applications are likely to spend far more time in maintenance than they do in development, so if changes are hard, long-term pain is guaranteed.
What is the quality of support? A system that is well-documented, proprietary or otherwise, is going to be easier to work with.
How large (or small) is the user community? Systems with larger user communities mean more people to ask for assistance if and when things go wrong.
How robust are the import/export capabilities of this proprietary database system?
I found the last point particularly useful at my first full-time job. Our client was using CA-Ingres, and no one at the company knew it well enough to write queries to validate the data. So I came up with the idea of exporting the data from Ingres and importing it into MS SQL Server (which I knew from a brief stint at Sybase Professional Services) so we could write our validation queries there. If it had been really hard to export data from Ingres, my idea wouldn't have been an option at all.
From 4D's web page, I gather that we are looking at a complete development-plus-deployment environment, not a standalone database as such. So the alternatives you could be looking at include the likes of django, Ruby on Rails, Hibernate and others. The real question, of course, is whether the proprietary system can save you enough money over the product's lifetime to justify its costs. And that will depend on the type of human resources you have available.
4D is a good option for vertical applications. I worked for a company which used 4D to build a medical records and billing application for general practitioners and specialists. The rapid design and deployment features of 4D enabled the application to move quickly with market desires and with legislated changes to medical record storage. The environment itself was not cutting edge, but it was integrated, cross-platform and very productive.
If you are entering a market with high vendor lock-in and a high barrier to entry, then I think proprietary integrated development environments are a good option.
At various points in my career, I've used and gotten very good at FileMaker Pro, FoxPro, 4D, and a few other commercial products. Now I mainly use PHP/MySQL, and haven't used the latest versions of any of the products.
I've always liked FileMaker because most people who can use a computer can pick up FileMaker and design their own systems. They don't have to know programming or database design. But, you can "program" FileMaker, put a web front end on it, or do other more sophisticated setups if you need to. Many times I was "handed" a system created in FileMaker by a non-technical person that needed to be made into a full fledged data management system. The good part was that all the "specs" and data flow were already designed into a system. The prototype was already created!
4D and FoxPro, I always found, required a certain amount of extra programming and/or database knowledge to really do anything with. 4D and FileMaker are really complete, self-contained systems, not just database systems. Although they all have the ability to hook into other back-end database systems (i.e. MySQL, Oracle), that is not their strong point.
On the downside, building more complex, dynamic systems can be difficult in 4D and FileMaker due to everything being tightly coupled. Because of their cost, you would really want to create multiple systems with them, which means you need to really "buy into them" to get your money's worth.
The key concept is always adherence to standards: if you plan to use 4D's custom and/or specially designed functions (though the discussion could be far more general and cover any other free or commercial tool in the wild), well, just use them and take your advantage.
Not surprisingly, that's why huge DB systems like Oracle or IBM's DB2 were widely accepted in the past for specific business areas, such as commercial transactions.
The other main reason to adopt a very closed solution is legacy support. One of the products you cited (Pervasive SQL) acted as a no-effort port for Btrieve-based applications in the late 90s, and it gained popularity thanks to the huge Btrieve community all over the planet.
Finally, last but not least, you should evaluate the TCO (Total Cost of Ownership) not only in terms of license price (single seat, network environment, site licenses and so on) but also with regard to tech support, updates and availability for your platform. Many business units I know have been obliged to change their base OS because of DB-related problems.
Tip: add a bonus for custom solutions that are proven or supported for use in virtualized environments, if you aren't seeking extreme performance. It will save your DB manager more than a headache.
In all other cases, rely on open source / free software DBs: MySQL and Postgres for big projects, SQLite for a single app's persistence layer. Fairly standard, with very good (community) support. Good value for no price.
I don't have any experiences with the proprietary database products you listed: 4D, Pervasive, FilemakerPro.
I'd be interested in knowing what those products offer that makes them more attractive to you than the open source alternatives you listed: MySQL and PostgreSQL.
I'd be interested in what makes those more attractive to you than the much more popular proprietary alternatives: Oracle, SQL Server, DB2, etc.
Without you providing more specifics, it's hard to advise you.
I personally feel safer using a widely used open source solution than a narrowly used closed source solution. The more widely used, the more battle-tested it's likely to be. The more open, the more control over my own destiny I have in case I do encounter some bug.
I have reported bugs to open source projects and gotten a quick fix. I have reported bugs to companies that make for-profit proprietary software and have gotten nothing.

MS Access as Enterprise Software?

Something that I often run into is that my users' desire to acquire solutions quickly means they sometimes say, "Heck, I'll just roll up my sleeves and do it in Access - it's installed on my desktop".
Sometimes we're lucky, and the person who creates the Access database back-ends it to SQL Server, so at least the mdb file issues that often come up aren't a problem.
However, it is my opinion that rolling out an Access front-end to a SQL Server database as an enterprise solution with thousands of users, and hundreds of thousands of rows is still problematic.
What are your opinions on this? What are some of the potential pitfalls?
OR
Is this a perfectly acceptable, stable, maintainable, and robust solution?
I've worked with this scenario a great deal. In fact as a consultant/developer Access front end SQL Server back end has been a significant part of my bread and butter work over the past 10 years. Which doesn't mean I like Access ;-)
Up until the common adoption of AJAX, it was a perfectly reasonable solution. There are still vast numbers of small to medium-sized applications put together in Access out there that run bespoke business systems perfectly happily, and I doubt that's going to go away for the next 10 or more years; indeed, Access/SQL is probably going to be the COBOL of the 21st century. If you're working on a 'green field' site, then there is now virtually no excuse for deploying Access when building from scratch, but if you inherit an existing application, the costs of a rewrite may not be worthwhile and may be difficult to sell to the users.
Access does have some advantages that are still significant, and which can present problems when proposing to convert to a web app:
It's quick. For simple CRUD work it's as fast to write and deploy as any other realistic solution.
Built-in reporting is easy to get running and remarkably powerful given the system. It's usually pretty easy to create and deploy new reports for users on demand.
It integrates well with Office. This one tends to be the show-stopper when looking to move Access apps to web-apps. It's extremely common for a 'department-size' Access application to tightly integrate with Outlook, Word or Excel - and often all three.
This is the major problem when dealing with real-world situations. It's very easy for coders to underestimate how important this integration is to everyday users of such systems; imposing even a small degree of additional hassle on the users will generally be met with much resistance, often enough to completely scupper the project.
If you're working with a reasonably sized department, a dozen people or so, it's quite common for there to be someone in the office who fancies themselves as a bit of a computer wizard. These people can be a major pain if handled incorrectly, but equally can be a major asset. If I have such a person, I will try to get management to send them on an Access course or two so they can write simple queries and reports, and I'll set up a separate Access application for them, which they own, with appropriate (restricted) access to the SQL database. You can then trust this person to handle producing simple reports and the like for their colleagues. This can be a real win-win: you gain someone who is on your side and will use you as a mentor (a ready-made advocate for you in the department), and they keep the grunt report work out of your hair. They gain a lot of kudos and job satisfaction, and even a potential career path. It's far harder, well-nigh impossible, to do this kind of thing with any system other than Access.
Main practical disadvantages:
Deployment can be a nightmare. Generally, if you have a very tightly defined environment (a small company, a single department, Citrix-based, or distributed with an IT department that closely controls its PCs), then you're fine. Deployment as a commercial app across multiple companies? Only if you can charge significant maintenance (been there).
Code does not scale. Access VBA code, even when written by a professional, has a strong tendency to rot into rancid spaghetti. It's quite common to end up with an Access application that was easy enough to maintain at first but gradually becomes unmaintainable as dependencies multiply.
So I'd say Access still has a place, and its use is defensible in many real-world situations, but increasingly it's better to choose a more modern solution if circumstances permit.
We have built such a solution (Access front-end, SQL back end), now with something like 80 users, millions of rows replicated between different countries, and more than 100,000 updates a month. It works fine. I think the main mistake about Access is to consider it a tool made for amateurs to develop applications. It can work that way, but keep in mind that amateur development will give you amateur applications, while professional development will give you professional results.
A quick list of its advantages, problems and limits:
It's free for the final user, thanks to the MS Access runtime.
It works with the free SQL Server Express, and with the not-so-expensive SQL Server Enterprise.
It's quick, especially when dealing with forms.
It communicates very easily with other Office apps, which are still enterprise standards.
You can manage its interface to be so close to Office standards that using it can be very intuitive, making people happy (I talked a little bit about that on my blog, which needs to be updated!).
On a large scale, you have to think about the best way to distribute it to your users. This issue can turn into a nightmare, as noted by #Cruachan, but it can be solved by building and distributing MSI packages, for example. Such MSI packages can also contain all your external references, such as 'added' dll, ocx and tlb files (report DLLs, ActiveX scanner controls, etc.). We had a few words on this here.
When distributing an updated version of the mdb file, you can have a common network folder holding the new mdb/zipped file that clients will check against and update from at startup. Your clients should have the possibility of reinstalling a previous version of the mdb file. Upgrading then becomes easier than installing a new .exe file.
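As a rough illustration of that startup check (sketched in Python rather than the VBA you would actually use from Access, and with a made-up share path and file naming scheme):

    import shutil
    from pathlib import Path

    SERVER_DIR = Path(r"\\fileserver\app\frontend")   # shared folder (made up)
    LOCAL_MDB = Path.home() / "app" / "frontend.mdb"  # this user's working copy

    def update_front_end() -> None:
        """At startup, pull down the newest published front-end, if any."""
        published = sorted(SERVER_DIR.glob("frontend_v*.mdb"))
        if not published:
            return  # nothing published yet
        latest = published[-1]  # zero-padded names sort by version
        if not LOCAL_MDB.exists() or latest.stat().st_mtime > LOCAL_MDB.stat().st_mtime:
            LOCAL_MDB.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(latest, LOCAL_MDB)  # copy2 keeps the timestamp

    update_front_end()

Rolling back is then just a matter of copying an older frontend_v*.mdb over the local copy.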
You have to set up a version control system. Please check here for details.
You must be very strict about your code organisation. One of our basic rules, for example, is not to have any specific code at the form level. Please check here on this subject.
I didn't find any problem with VBA code scaling, as noted by #Cruachan. If professional coding rules are followed, there won't be any unusual code scaling issues. As an example, our application is now working really well with more than 180 different forms, and it is still growing without any problem.
In conclusion, our main problem with Access is an image problem: Microsoft still lets people think that Access is there to give them the ability to develop real software in 10 lessons, while professionals, who know that this is not possible, view it as an amateur tool for amateur development, looking down on MS Access users as boring, low-IQ rednecks.
I know quite a few professional Access developers who have developed and maintained Enterprise-level apps using Access as the front end (either MDB or ADP) and supporting user populations in the 100s (and even in a few cases, thousands).
Like any Enterprise-level application development, it requires a higher level of programming skill than building a little Access database for your 5-person department.
Oddly enough, the design principles that make for an efficient Enterprise-level app also make for a more efficient workgroup-level Access app.
I think the reason most of the people posting in this thread can't conceive of it as a good solution is simply because they've never seen it done properly, or were themselves not sympathetic to the development model that Access uses.
Yes, it's hard to do properly.
But at that level, so is every other development platform -- all of them require planning, experience and a high-level skillset.
And you can rag on Access apps developed by people without all of that (Enterprise or not), but frankly, I've encountered a boatload of non-Access database apps of all kinds that are incredibly badly implemented.
Sturgeon's law applies everywhere, and there's no reason to assume that Access development would be any different.
I started out doing desktop applications in Access with JET back-ends. I moved up to using SQL Server/MSDE with Access as the front-end and then VB6 and a smattering of classic ASP.
There are many "enterprisey" reasons to go with a "real" development tool like Visual Studio. For the scale you are inquiring about, thousands of users, I think those reasons may apply.
That said, I think there are scenarios where it still works to use Access. In my own experience, I fell back to Access with a SQL database when given a mandate to come up with an enterprise solution, albeit for a much smaller enterprise, in a very short period of time. The main reason driving my decision was time. I can put together a database UI in Access much, much faster than I can in any other tool. Some of that is familiarity with the tool, but a lot of it is that Access just gives you more database purpose-specific bits to work with out of the box. The Access UI can also be tweaked to look and operate very much like a standard WinForms app.
The hitch that many run into in an enterprise scenario is rolling Access and the application MDB/MDE out to the masses. This is easily resolved by setting it up on a Windows Terminal Server, which can also be rigged to operate almost like another app window on the client machine with the right RDP file parameters. But even that approach has its limits. I don't think it would scale into the thousands very well, but for several dozen users, I found that it worked just fine and bought enough time to meet the time constraints I had to work with so a web interface could be implemented when time allowed.
For a professional who knows what they're doing in a SQL database, an Access front end is not necessarily an unpardonable sin, especially when the mandate is cheap and fast and there isn't a religious purism involved.
If you have the choice, no.
That being said, there are situations where it may be all right. One situation is if you never, ever plan on updating the Access application: if it is installed for thousands of users, you may run into problems getting all of the client apps updated.
You are much better off making a web front end, although Access makes multiple master-detail forms easier than anything else I have seen. Even Oracle Application Express, intended to compete with Access, cannot do everything that Access does.
My advice is that if you are a programmer, you can make an ASP.NET app that will do the same job in a much more scalable, maintainable manner.
For a lot of CRUD (Create Read Update Delete) work, MS Access is OK. I'm more confident in it if the data is in another engine (MSSQL/Oracle/MySQL). However, most of the time I have problems with an MS Access database it's because:
It was home-grown by a desktop user (not a programmer/IT professional) who hasn't planned ahead for future development (so additions are often more painful than if a pro had been involved).
It's full of unnormalized tables, inconsistencies, and key-less tables.
My solution: limit MS Access to the pros, and deploy the runtime version to the users' desktops.
For the multiple-user/high-data-volume situation, I use Access front-ends with a MySQL back end. I must say that in the client-server situation, especially on a LAN, MS Access is as good as they come. Personally, I find development in MS Access much faster than, say, Visual Studio, especially when it comes to database-driven apps. And Access reports are as good as, if not better than, the industry standard, Crystal Reports.
The only shortcoming I see with Access is in non-LAN situations, where you have to distribute the application to users spanning a wide geographical area. But then again, web apps themselves have a major shortcoming: handling one-to-many relationships, something Access handles superbly with its sub-form and sub-report features.
And more importantly, Access has a very powerful event model that most applications cannot match.
Personally, I can do literally anything in Access! So my conclusion is that MS Access has many advantages that make it a competent development tool, especially in LAN environments.
Sadly, I have quite a bit of experience with this. We built an entire product around Access forms tying into a SQL database. Honestly, performance wasn't an issue; it really is the normal db-connection-type scenarios that you'd have to be concerned about, as with any client/server app. In our case, the original developer knew tons of "tricks" in Access and did things like data-binding drop-downs to stored procedures. Oh, and the awful triggers. Awful. As in, 45-triggers-firing-per-update awful.
The tables we worked with did indeed have millions of rows of data; however, the typical roll-out was to tens or hundreds of users. I'd imagine that any effort going out to thousands of users would benefit more from custom development, so that you can do things like build the software correctly, support it from a performance and development perspective, and build automated deployment options (MSIs or ClickOnce, for example).
So, I would not say it is a perfectly acceptable, stable, maintainable or robust solution. It worked for us because we were there to support it (and eventually rewrote it in .NET), but I wouldn't recommend it for anyone. I have, however, worked in government, where trying to get anything done through "IT" (which I was part of) was so filled with red tape and paperwork that departments would often just do the Access solution.
Ultimately, if that's the case you're in, where the departments simply can't get access to IT resources, then showing them at least some best practices for how to eventually scale the app would be helpful. As long as right after you show them, you put your resume out to find a better job.
12-15 years ago this might have been an acceptable practice (not really advisable, but acceptable), but nowadays it's unforgivable. There are so many more scalable and distributable solutions that Access should be the last thing to cross anybody's mind.
When you say Access as a solution, what comes to my mind is a simple 2-3 table application that some marketing employee put together, not a real developer. If the marketing guy had a really good idea, then perhaps the development team (I'm assuming there is one, since you indicated there may be thousands of users) should look at it, refactor it to a better platform (an intranet app, or WinForms distributed via ClickOnce, etc.), and then deploy it.
Back in the early 90s I was an Access developer; I even had an MS certification. I built dozens of "Enterprise" apps (meaning 10-15 people used them). Those days are gone, IMO. There are easier solutions to build, deploy, and maintain nowadays.
I've had the misfortune of working on Access front ends like you describe. Here are some non-Enterprise arguments:
Programming is easy! Creating forms in Access is geared toward non-developers. Case in point: if you have multiple columns in a drop-down, do you have list fields and data fields? No way! You just set the width of the things you don't want to see to 0". So you're looking at forms either thrown together by non-developers, or forms that will irk most people who have to work on them.
Versioning? Who needs versioning? Just send out an attachment. If changes need to be made to the front end, re-deployment is time-consuming and fault-prone.
This form? I'm thinking magenta. The front end doesn't lock down well, so end users can get creative.
With Microsoft "giving away" free versions (MSDE, or SQL Express for 2005 onwards) of the SQL Server engine with each release, there is really no need to use Access any more. Although these free versions don't have a visual front end which can make development harder, good knowledge of SQL is all you need.