Pulling data from Sage Line 50 Cloud

It is June 2018. I have been tasked with architecting a solution to integrate data from Sage 50 into a web application. The specific task is to pull account credit position data from Sage into the web app so that users of the latter can see the exposure for each client before processing new orders.
The scope says that we should avoid installing any solution-specific code on the on-premise Sage installation, and avoid doing anything that would limit the customer's ability to keep Sage updated.
The stack on the web-app side is all MS, so SQL Server + C#, etc. The Sage installation is on-premise.
In practical terms I have two options:
Use some kind of standards-based gateway or interface layer that we can talk through to get data out of the Sage DB. This would insulate us from needing to make any low-level changes to the Sage installation. With this option we would query the credit position data as needed.
Have some scheduled job on the Sage box push the credit data out periodically, either to an intermediate file store or directly into the web app. This option obviously has a data latency problem.
Sage hides its SDK documentation inside a developer programme with a yearly price tag of £1500. Before I commit to that, I would like to confirm that there is a solution waiting there.
Some of my research to date:
SO question about using ODBC from 2009
SO question about generic integration from 2009
Sage has changed its product line since 2009, at least in marketing terms.
I realise that this is a broad and imprecise question, but my research so far leads me to no clear conclusion. Sage has many millions of customers, so if I can beg your indulgence in not tagging this as off-topic, I think this question could help many people in the future.

The easiest option for reading data from Sage 50 is to use the ODBC driver. If you need to write data to Sage, you would need the Sage Developer Kit or a commercial solution.
In terms of commercial solutions, there are lots of toolkits and import tools available. The company I work for develops one of the leading Sage integration platforms, and if you do a quick Google you will find our company among others that provide "no code" integration solutions for Sage 50, Sage 200 and others, compatible with historical and future versions of Sage products.
Using ODBC has not changed. One thing to bear in mind is that the driver is a 32-bit driver and, as far as I know, there isn't a 64-bit driver; this may or may not be an issue.
The connection string looks something like this:
string connStr = @"Driver={Sage Line 50 v24};UID=MANAGER;PWD=pass1234;DIR=C:\Accounts\ACCDATA;";
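To flesh that out, here is a minimal C# sketch of polling the credit position over ODBC (option 1 above). Treat the table and column names (SALES_LEDGER, ACCOUNT_REF, CREDIT_LIMIT, BALANCE) as illustrative assumptions; check the field list your driver version actually exposes. And since the driver is 32-bit only, compile the calling process as x86.

using System;
using System.Data.Odbc;

class SageCreditPosition
{
    static void Main()
    {
        // 32-bit ODBC driver: this process must run as x86 to load it.
        string connStr = @"Driver={Sage Line 50 v24};UID=MANAGER;PWD=pass1234;DIR=C:\Accounts\ACCDATA;";

        using (var conn = new OdbcConnection(connStr))
        {
            conn.Open();

            // Table/column names are illustrative; they vary by Sage version.
            var cmd = new OdbcCommand(
                "SELECT ACCOUNT_REF, CREDIT_LIMIT, BALANCE FROM SALES_LEDGER", conn);

            using (OdbcDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: limit={1} balance={2}",
                        reader["ACCOUNT_REF"], reader["CREDIT_LIMIT"], reader["BALANCE"]);
                }
            }
        }
    }
}

Note the verbatim string (@): without it, the \A in the DIR path would be an invalid C# escape sequence.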

Related

Is BizTalk The Correct Solution?

We have about six systems (they are all internal systems) that we need to send data between. Currently we do not have a consistent way of doing this. We use SSIS, SQL Server linked servers to directly update databases, ODBC connections to directly update databases, text files, etc.
Our goals are:
1) Have a consistent way of connecting applications.
2) Have a central way of monitoring and logging the connections between applications.
3) For the applications that offer web services, we would like to start using them instead of connecting directly with the database.
Whatever we use will need to be able to connect to web services, databases, flat files, and should also be able to accept data via a tcp connection.
Is BizTalk a good solution for this, or is it overkill?
It really depends. For the architecture you're describing, it would seem a good fit. However, you will need to validate whether BizTalk can communicate with the systems you are trying to integrate. For example, when these systems use web services, message queues or file-based communication, that may be a good fit.
When you start with BizTalk, you have to be willing to invest in hardware, software, and most of all in learning to use it.
Regarding your points:
1) Yes, if you make sure to encapsulate the system connectors correctly.
2) Yes, BizTalk supports this with BAM.
3) Yes, that would match perfectly.
From what you've described (six systems), it is definitely a good time to investigate a more formalized approach to integration, as you've no doubt found that a point-to-point / direct integration approach results in a large number of permutations / spaghetti as each new system is added.
BizTalk supports both hub-and-spoke and bus-type topologies (with the ESB Toolkit), either of which will reduce the number of interconnects between your systems.
To add to oɔɯǝɹ:
Yes - ultimately BizTalk converts everything to XML internally, and you will use either visual maps or XSLT to transform between message types.
Yes. Out of the box there are a lot of WMI and Perfmon counters you can use, plus BizTalk has a SCOM management pack to monitor BizTalk's health. For your apps, there is BAM (TPE for simple monitoring; more advanced stuff can be done with the BAM API).
Yes - BizTalk supports all the common WCF binding types, and basic SOAP web services. BizTalk's MessageBox can be used as a pub/sub engine, which can allow you to 'hook' other processes into messages at a later stage.
Some caveats:
- BizTalk should be used for messages (e.g. electronic documents across the organisation), but not for bulk data synchronisation. SSIS is a better bet for really large data transfers / data migration / data synchronisation patterns.
- As David points out, there is a steep learning curve to BizTalk and the tool itself isn't free (requiring SQL and BizTalk licenses, and usually you will want a monitoring tool like SCOM as well). To fast-track this, you would need to send devs on BizTalk training, or bring in a BizTalk consultant.
- Microsoft seems to be focusing on Azure Service Bus, and there is speculation that BizTalk will be merged into Azure Service Bus at some point in the future. If your enterprise strategy isn't entirely Microsoft, you might also want to consider products like NServiceBus and FUSE for an ESB.
Your problem is a typical enterprise problem. Companies start off building isolated applications like HR, web, supply chain, inventory, client management, etc. over a number of years, and once they reach the point where these applications cannot live in isolation and need to talk to each other, they typically start with some hacked solution like data migration at the database level.
But very soon they realize the problems: no clear visibility, poor management, no standards, etc., and they create real spaghetti. The biggest threat is that applications become dependent on one another and you lose your agility to change anything. Any change to the system will require heavy testing and a long release cycle.
This is the kind of problem a middleware platform like BizTalk Server will solve for you. A lot of replies in this thread focused on the cost of BizTalk Server (some of the costs mentioned are not correct, by the way). It's not a cheap product, but if you look at the role it plays in your organisation as a central middleware platform connecting all the applications together, and at the non-functional benefits you get out of the box (adapters to most third-party products like SAP, Oracle, FTP, FILE and web services; the ability to scale your platform easily; performance; long-running workflows; durability; compensation logic for long-running workflows; throttling of your environment; and so on), the cost factor will soon diminish.
My recommendation would be to take a look at BizTalk. If you are new to it, engage with your local Microsoft office; they can either help directly or recommend a partner who can come and analyse your situation.

B2B Application Building and Maintenance Cost

I've been considering for some time now getting into the B2B integration business. I've researched the tools available for doing this, like Oracle's WebLogic Integration, IBM's WebSphere, or Microsoft's BizTalk. They all seem to do the job (each having their ups and downs). I've also looked at some companies that are already doing this (ex: www.hubspan.com). It seems that B2B integration is a much-needed service. Although my background is in integration of commercial products with open source software, I feel that concerning the B2B integration world I still need to fill in some blanks.
So basically I'd like to clear up a few things concerning all this:
All the frameworks I previously mentioned are just that: frameworks. They allow you to build an application ON TOP of them; they are not intended to be a final product. I assume this is because the integration needs of different companies vary so much that an out-of-the-box solution is just not possible. So my question is: do the applications built with these frameworks vary so much from business to business that it's not possible to reuse them?
Also, is it possible to build a single framework of Suppliers and Customers (build a Core of some kind), and connect new Customers and/or Suppliers as they come? (This is the way HubSpan did it, not counting the development of custom Connectors to the client ERP systems.) Or will I have to do a separate integration for each Customer?
How many work hours are required to complete a typical integration project, assuming everything is planned and executed properly? (For the sake of simplicity, let's say the integration includes only 'Query Product Price', 'Query Product Availability', and 'Purchase Order Management'.)
And finally, is this a job for a single person (can I do this myself, assuming I have the knowledge to do it) or is a team required?
Thanks in advance for sharing your thoughts and opinions.
Yes they can vary that much.
It depends on the business. Some will integrate easily, while others will need custom modules and connectors.
There isn't really a "typical" integration project.
Depends on the size of the project. If you're talking fortune 500 companies then no. If you're talking a local manufacturer and local supply house (presumably small) then maybe.
This is probably a question better asked on the programmers.stackexchange
I think it varies a lot. You should probably define what you mean by B2B, and there are a lot of different types these days. From a BizTalk perspective, it is possible to build an application service provider (ASP) version of B2B but it is hard to do.
The level of customization is one of the factors that drives up cost and the length of the project. I think it is difficult to do B2B alone, usually there is so much business domain knowledge specific to each company that you need those business people to help explain the existing systems.

In which domains are message oriented middleware like AMQP useful?

What problem do MOM (Message Oriented Middleware) solve? Scalability? Integration?
In which domains are they typically used and in which domains are they typically not used?
For example, say, is Google using such a solution for its main search engine or to power GMail?
What about big websites like Walmart, eBay, FedEx (pretty much a Java shop) and buy.com (pretty much an MS shop)? Does MOM solve a need there?
Does it make any sense when you're writing a webapp where you control the server side and have a homogeneous environment (say tens of Amazon EC2 instances all running Linux + Java JVMs), and where the clients are, well, web browsers?
Does it make sense for desktop apps that need to communicate with a server?
Or is it 'only' for big enterprise stuff where you typically have a happy mix of countless different systems that need to communicate in one way or another?
I'm a bit confused as to what they're useful for and I think that with example of where they're appropriate and where they're not appropriate I could better understand their use.
This is a great question.
The main uses of messaging are: scaling, offloading work, integration, monitoring, event handling, routing, networking, push, mobility, buffering, queueing, task sharing, alerts, management, logging, batch, data delivery, pubsub, multicast, audit, scheduling, ... and more. Basically: anything where you need data but don't want to make a database request. (Caching is another, longer story).
Another way of looking at this is to notice that many applications used to be built by assuming that users (people) would perform actions that would be fulfilled by executing a transaction on a database (including reads, writes). But today, many actions are not user-initiated. Instead they are application-initiated. For example "tell me when the book that I want to buy is in stock". The best way to solve this class of problems is with messaging of some sort. Whether you call it middleware or web push or real time salad dressing does not matter. It's all messaging.
When you enable applications to initiate or react to events, then it is much easier to scale because your architecture can be based on loosely coupled components. It is also much easier to integrate those components if your messaging is based on a stable, scalable, serviceable tool, preferably using open standard APIs and protocols.
I hope this helps. We try to maintain a list of useful links about messaging here
Please get in touch with questions and comments on any of this, we are dead easy to find.
To address your specific questions:
In which domains are they typically used and in which domains are they typically not used?
Like databases, messaging systems crop up everywhere.
For example, say, is Google using such a solution for its main search engine or to power GMail?
Google uses a lot of home grown technology, but a lot of their open source contributions and known use cases suggest that messaging is (or should be) central to some of the main services.
What about big websites like Walmart, eBay, FedEx (pretty much a Java shop) and buy.com (pretty much an MS shop)? Does MOM solve a need there?
Very much so.
An example use case is scaling web page requests. When the user makes a web request, the web server puts it onto a queue for background processing. This means that the web server can keep working while the request is processed. It also means that the web server does not need to know how the request is handled, making system maintenance, upgrade and rollback much simpler because the main parts are 'decoupled'.
So, anyway, the web request gets processed by a back end service, or possibly by many services, eg 'look up book titles', 'draw shopping cart', 'get advertisement', 'check user account'... Finally all the results get put onto another queue, ready for collection and user response by the web server. Typically the system will include a timeout of around 100ms so that any late requests just get thrown away. The user sees anything that got processed in the time interval. This is one reason why some large ecommerce sites have pages that appear to load in stages.
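To make the pattern concrete, here is a minimal sketch using the RabbitMQ .NET client. The broker choice, host name, queue name and message format are my own illustrative assumptions, and the API shown is the classic byte[]-based one; newer client versions differ slightly.

using System;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class WebRequestOffload
{
    static void Main()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var conn = factory.CreateConnection())
        using (var channel = conn.CreateModel())
        {
            // Durable queue + persistent messages: pending work survives a broker restart.
            channel.QueueDeclare(queue: "page-requests", durable: true,
                                 exclusive: false, autoDelete: false, arguments: null);
            var props = channel.CreateBasicProperties();
            props.Persistent = true;

            // Web tier: enqueue the request and return to serving pages immediately.
            var body = Encoding.UTF8.GetBytes("user=42;action=draw-shopping-cart");
            channel.BasicPublish(exchange: "", routingKey: "page-requests",
                                 basicProperties: props, body: body);

            // Back-end worker: pick up requests as they arrive and process them.
            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
                Console.WriteLine("processing: " + Encoding.UTF8.GetString(ea.Body));
            channel.BasicConsume(queue: "page-requests", autoAck: true, consumer: consumer);

            Console.ReadLine(); // keep the demo worker alive
        }
    }
}

The decoupling is the point: the web server only knows the queue name, not which service (or how many of them) will handle the request.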
There are many more use cases...
Does it make any sense when you're writing a webapp where you control the server side and have a homogeneous environment (say tens of Amazon EC2 instances all running Linux + Java JVMs), and where the clients are, well, web browsers?
Definitely. If you have an unknown, or unbounded, number of users, server side instances, and application latencies, then it makes sense to use messaging, even if just as a scalable substrate for non-blocking RPC.
Does it make sense for desktop apps that need to communicate with a server?
In lots of cases. One very common case is when the server pushes events to the desktop app, eg game event, tweets, price feeds in finance, system alerts....
Or is it 'only' for big enterprise stuff where you typically have a happy mix of countless different systems that need to communicate in one way or another?
Definitely not only for those 'legacy integration' cases but they are important too. At RabbitMQ, the biggest customers we have in terms of pure scale or message volume are cloud providers and big web application providers.
I will give just one answer, from prior experience: take a look at the middle-ware that is employed by big companies. Middle-ware has one purpose: to glue disconnected systems (written in disparate languages) together so that they can interact with one another and streamline the business process. Entera, which I have experience with, creates a middle layer in which the Unix box, using processes written in C, interacts with the mainframe system (DB2, COBOL) via a front end written in PowerBuilder (I am not naming the company!).
From the description I have given, Entera is a middle-ware which hosts a number of things: smooth integration of the flow of data regardless of endian format, and the ability for different languages to talk to the middle-ware broker (a broker is a CORBA- or DCE-like process that conforms to The Open Group standards and listens on a particular port), which is specified by an IDL that makes a process appear to be local. If you understand the terminology used in Remoting under Microsoft's .NET Framework, you are not far off the mark! The middle-ware generates stubs which are linked at compile time and manages the creation of the process, hosting it off a port, and multi-threading at run time. Modern front ends (such as .NET, Java, PowerBuilder, even the unspeakable VB6... ok, VB.NET for the purists out there) can interact by opening a connection to the specified port on a particular IP address and, using the generated stubs, can interact with it directly.
Obviously, from what was described, you can see how legacy systems can have new life breathed into them, and thus the process gains scalability. The major downside is the cost factor, which can run into thousands of dollars. Big companies who use mainframes as their back-end processing systems for billing/invoicing, and who generate huge revenue, can obviously afford such an expensive product; to them it would seem like throwing pennies into a pool of water. The use of middle-ware prolongs the business process, breathes new life into it, and can extend the business a good number of years into the future without worrying about the 'legacy' tag attached to it.
Incidentally, I carried this out as part of my thesis for my BSc. in Information Systems which covered this commercial front-end. There was an open source version of the middle-ware available on sourceforge called FreeDCE, but development efforts have declined or stopped.
Edit:
#cocotwo: That is exactly what middle-ware does; as you said, it is a plumbing tool. Message-oriented middle-ware is not really heard of, AFAIK, because I would imagine the processes (functions) need to be called as if they were locally visible within the application domain of the front end, to make them easy to interact with.
Using messages may have advantages over RPC calls in that messages are queued in a safe-holding area in the event that a network disconnection occurs; there may be some data caching going on in that aspect to allow the front end to continue regardless. It would be useful in instances like 'updating the status of a particular billing/invoice number': a one-way data write to the back end via the middle-ware.
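As a toy illustration of that safe-holding idea, here is a store-and-forward sketch in C#. Everything here is hypothetical (TrySend stands in for whatever transport or broker call you actually have); it just shows the front end continuing to enqueue one-way updates while a background flusher retries delivery:

using System;
using System.Collections.Concurrent;
using System.Threading;

class StoreAndForward
{
    static readonly ConcurrentQueue<string> Pending = new ConcurrentQueue<string>();
    static readonly Random Rng = new Random();

    // Hypothetical transport call: returns false while the network is down.
    static bool TrySend(string message)
    {
        return Rng.Next(2) == 0; // simulate a flaky connection for the demo
    }

    static void Main()
    {
        // Front end keeps working regardless of connectivity.
        Pending.Enqueue("invoice 1001: status=PAID");
        Pending.Enqueue("invoice 1002: status=DISPUTED");

        var flusher = new Thread(() =>
        {
            string msg;
            while (true)
            {
                if (Pending.TryPeek(out msg) && TrySend(msg))
                {
                    Pending.TryDequeue(out msg); // remove only after a successful send
                    Console.WriteLine("delivered: " + msg);
                }
                else
                {
                    Thread.Sleep(500); // back off, then retry
                }
            }
        });
        flusher.IsBackground = true;
        flusher.Start();

        Thread.Sleep(5000); // let the demo run briefly
    }
}

A real middle-ware product does this durably (on disk, with acknowledgements); the in-memory queue above only survives as long as the process does.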
Ok, big companies have advanced systems infrastructure, with technicians around the clock to ensure smooth delivery of the data flow, so that has to be factored in. The company I worked with had an IBM Global Support contract to fulfil in order to ensure maximum uptime of 99% with six nines after the decimal point, with hot-swapping / balanced clusters / mirroring systems in place...
Whereas with RPC, if a disconnection occurs, the front end would have to be restarted or would have to handle the disconnection event. It really depends on whether the message-queueing middle-ware handles each message in real time and passes results back to the front end immediately...
This is where message-queueing and RPC-related middle-ware each have their strengths and weaknesses... as do the cost mitigation factors such as support, maximum uptime, development effort and training. That's a biggie here, as middle-ware is really proprietary (despite following The Open Group layout/standards) and complex to set up and glue together via scripts.
Good answers and discussion here. Our consulting team has two preferred "messaging" solutions: RabbitMQ, and NXTera, a high-speed RPC middleware that is the contemporary version of the Entera mentioned above. My partners and I have developed several solutions using RabbitMQ; it is the best tool available in that space right now. Additionally, I happen to work for the company that makes NXTera/Entera.
From experience I can clearly say that both of these products meet the need for reliability and low maintenance as discussed above. There are situations where a messaging service like RabbitMQ is the right choice: where publish/subscribe, certified delivery, queueing or store-and-forward are required.
In other cases, RPCs (remote procedure calls) are the best and fastest solutions for transactional and distributed processing for enterprise or cloud-based applications. When it is right to use an RPC, but SOAP/.NET (yes, these are RPC implementations) is too slow, expensive or complex, a lightweight high-speed RPC middleware like NXTera/Entera is the right choice for us.
There is some use-case overlap between RPC middleware and message-oriented middleware, and where there is, you can use either successfully. But both are strong and dependable choices.
The large companies I work with use both RPC and MoM side by side. As far as Internet companies go, Google (Protocol Buffers) and Facebook (Thrift) show that RPCs have a role to play in modern web and cloud-based development.

Pros and cons of building apps with proprietary database systems

I've been interested in 4D SAS' database product for a long time, though have barely touched it in eons.
In considering what tools to use for application development, especially one that will require a database component, what should be looked for when considering open-source tools like MySQL and PostgreSQL vs proprietary solutions like 4D or Pervasive SQL?
What good (and bad!) experiences has the SO community had with various DB tools like 4D, Pervasive, FilemakerPro, etc?
Any bad experiences?
Difficult to make a relevant list of Pros and Cons without a context.
My advice would be the following: when making the decision of using a proprietary database, make sure that this decision is based on strong facts and not merely a technical interest for an exotic tool. Put into the balance the benefits for using the proprietary database and the advantages of a non-proprietary solution.
The answer is different from system to system.
A prerequisite is that your system is well identified, with a clear scope, a quite predictable evolution, so that the results of your analysis will be robust. Then, if your proprietary solution brings a real benefit for your system, that you are comfortable with the support and that you can afford the overall cost, you should be a good candidate for the proprietary solution.
4D is a MacOS/Windows only cross-platform, proprietary database system with both stand-alone and Client-Server varieties. You would do well to compare it to Alphafive.com software which is Windows only. I've worked with it for 17 years and it has served me and my department very well. Off the top of my head ...
Pros:
Interface & code are closely tied to the data engine which makes development of rich, cross-platform user interfaces very fast and easy.
Proprietary relational data engine runs natively on both platforms, along with native client interfaces (but requires licenses for multi-users). Auto-relations are helpful (but sometimes get in the way).
Can access external systems via SOAP and ODBC and SQL drivers (limited).
Can access 4D from external systems via SOAP or http requests & web pages.
Native procedural programming language based on Pascal and is EASY to learn.
Excellent tool for small to mid-sized departments.
Latest version accepts subset of SQL commands AND original data access, so it's backward compatibility record has been very good.
Security is EASY in 4D.
You can build solutions to deploy through a variety of means, and are not limited by whether or not MS Access is installed.
Cons:
Interface & code are closely tied to the data engine which can lead to limited use of abstraction and "black-box" coding unless you make it a goal of your development.
Compiles to one monolithic structure file forcing restart for single fixes.
The language is still only procedural, making it harder for object-oriented programmers to accept. Every method requires a separate "file" in 4D, so you can't include more than one function or procedure in a single routine; it will take some getting used to.
While the company appears to be in good shape, growing and developing, you simply never know, as they keep their condition to themselves.
Company has never really marketed itself--trusting in its developer base to spread the word and grow the product through site deployments and product upgrades. Web site is clearly useful only to developers who already use the product -- it simply fails to attract new users.
Product upgrades have always seemed to focus on how the tool is better for the DEVELOPERS rather than for the CUSTOMERS of those developers.
SQL lacks views, compound indexes, and other common SQL features.
When a user requests a report of specific columns of data, I often have to write yet another program just to provide that specific data -- I can't always just query the data and generate a text file.
Does not handle new OS versions with nearly the ease of web browser based applications. Older version is broken on Mac OS 10.6, and newest version requires the latest Mac OS 10.6. No version is certified yet on Windows 7.
I've spent nearly a year learning ASP.NET and a few weeks on Ruby on Rails. While SQL data stores are EASY, user interfaces are HARD; but it's worth it when your application still functions through OS upgrades. You can always use an older browser if the latest version breaks something.
I'd recommend you consider either of those, depending on how much funding you have available to implement the project (Rails being the cheaper of the two). Then ANY system with a web browser can access the data, and you can fix interface pages on the fly as needed rather than taking the whole system down for a few minutes for a single, simple update. Those skills might also be more marketable in the future.
I will only say one thing.. Watch the "actual" cost of your decision.. Most proprietary database systems are Windows only.. or sometimes Mac/Windows only.
This means that along with paying quite a bit of money for the database system, you must also pay a good amount of money on a Server operating system to run it...
Also, compare the database system with current open source solutions. Is it really worth it? After moving from Microsoft SQL Server (which has a free edition, but anyway) to PostgreSQL, I was blown away that people pay so much for SQL Server. I mean, Postgres to me is a lot cleaner, most of it works exactly how you'd expect (unlike certain SQL Server syntaxes), and it has more features built in (programming stored procs in Ruby, anyone?).
So basically, compare the proprietary with the open source software and decide which one to take based on total pricing (including the OS) and feature set.
Pro of zeroing in on any DB: it's got good non-portable features that help you get things done
Con of zeroing in on any DB: sometimes a different DB is appropriate (for example running your tests with in-memory SQLite instances; see the sketch after this list), but that option is now closed
Con of a proprietary commercial DB: if you need many instances, licensing costs can kill you
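To illustrate the in-memory-tests point from the list above, here is a minimal sketch with Microsoft.Data.Sqlite (the schema and values are made up for the demo):

using Microsoft.Data.Sqlite;

class InMemoryDbTest
{
    static void Main()
    {
        // Each :memory: connection is a fresh, throwaway database:
        // ideal for fast, isolated test runs.
        using (var conn = new SqliteConnection("Data Source=:memory:"))
        {
            conn.Open();

            var create = conn.CreateCommand();
            create.CommandText = "CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)";
            create.ExecuteNonQuery();

            var insert = conn.CreateCommand();
            insert.CommandText = "INSERT INTO accounts (balance) VALUES (42.0)";
            insert.ExecuteNonQuery();

            var count = conn.CreateCommand();
            count.CommandText = "SELECT COUNT(*) FROM accounts";
            System.Console.WriteLine(count.ExecuteScalar()); // prints 1
        }
        // The database vanishes here; a proprietary engine rarely offers this.
    }
}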
Consider the following questions:
How easy (or difficult) is it to make changes in maintenance? Applications are likely to spend far more time in maintenance than they do in development, so if changes are hard, long-term pain is guaranteed.
What is the quality of support? A system that is well-documented, proprietary or otherwise, is going to be easier to work with.
How large (or small) is the user community? Systems with larger user communities mean more people to ask for assistance if and when things go wrong.
How robust are the import/export capabilities of this proprietary database system?
I found the last point particularly useful at my first full-time job. Our client was using CA-Ingres, and no one at the company knew it well enough to write queries to validate the data. So I came up with the idea of exporting the data from Ingres and importing it into MS SQL Server (which I knew from a brief stint at Sybase Professional Services) so we could write our validation queries there. If it had been really hard to export data from Ingres, my idea wouldn't have been an option at all.
From 4D's web page, I gather that we are looking at a complete development + deployment environment, not a standalone database as such. So the alternatives you could be looking at include things like Django, Ruby on Rails, Hibernate and others. The real question, of course, is whether the proprietary system can save you enough money over the product lifetime to justify its costs. And that depends on the type of human resources you have available.
4D is a good option for vertical applications. I worked for a company which used 4D to build a medical records and billing application for general practitioners and specialists. The rapid design and deployment features of 4D enabled the application to move quickly with market desires and legislated changes to medical record storage. The environment itself was not cutting edge, but it was integrated, cross-platform and very productive.
If you are entering a market with high vendor lock-in and a high barrier to entry, then I think proprietary integrated development environments are a good option.
At various points in my career, I've used and gotten very good at FileMaker Pro, FoxPro, 4D, and a few other commercial products. Now I mainly use PHP/MySQL, and haven't used the latest versions of any of the products.
I've always liked FileMaker because most people who can use a computer can pick up FileMaker and design their own systems. They don't have to know programming or database design. But, you can "program" FileMaker, put a web front end on it, or do other more sophisticated setups if you need to. Many times I was "handed" a system created in FileMaker by a non-technical person that needed to be made into a full fledged data management system. The good part was that all the "specs" and data flow were already designed into a system. The prototype was already created!
4D and FoxPro I always found required a certain amount of extra programming and/or database knowledge to really do anything with. 4D & FileMaker are really complete self-contained systems, not just database systems. Although they all have the ability to hook into other backend databases systems (i.e. MySQL, Oracle), that is not their strong point.
On the downside, doing more complex, dynamic systems can be difficult in 4D and Filemaker due to everything being tightly coupled. Because of their cost, you really would want to create multiple systems with them. Which means you need to really "buy into them" to get your money's worth.
The key concept is always adherence to standards: if you plan to use 4D's custom and/or specially designed functions (though the discussion could be far more general, and cover any other free or commercial tool in the wild), well, just use them and take the advantage they offer.
Not surprisingly, that's why huge DB systems like Oracle or IBM's DB2 were widely accepted in the past for specific business areas, such as commercial transactions.
The other main reason to adopt a very closed solution is legacy support. One of the products you cited (Pervasive SQL) acted as a no-effort port for BTrieve-based applications in the late 90s, and it gained popularity thanks to the huge BTrieve community all over the planet.
Finally, last but not least, you should evaluate the TCO (total cost of ownership) not only in terms of license price (single seat, network environment, site licenses and so on), but also in terms of tech support, updates and availability for your platform. Many business units I know have been obliged to change their base OS because of DB-related problems.
Tip: add a bonus for custom solutions that are proven or supported for usage in virtualized environments, if you aren't seeking extreme performance. It will save your DB manager more than a headache.
In all other cases, rely on open source / free software DBs: MySQL and Postgres for big projects, SQLite for a single app's persistence layer. Fairly standard and very good (community) support. Good value for no price.
I don't have any experiences with the proprietary database products you listed: 4D, Pervasive, FilemakerPro.
I'd be interested in knowing what those products offer that make them more attractive to you than the open source alternatives, you listed: MySQL and PostgreSQL.
I'd be interested in what makes those more attractive to you than the much more popular proprietary alternatives: Oracle, SQL Server, DB2, etc.
Without you providing more specifics, it's hard to advise you.
I personally feel safer using a widely used open source solution than a narrowly used closed source solution. The more widely used, the more battle-tested it's likely to be. The more open, the more control over my own destiny I have in case I do encounter some bug.
I have reported bugs to open source projects and gotten a quick fix. I have reported bugs to companies that make for-profit proprietary software and have gotten nothing.

What will we do after Access? [closed]

Microsoft seems hell-bent on deprecating the swiss-army-knife of database tools. What else comes close for facading/file-swapping/cloning/name-your-acronym-connecting arbitrary database servers/spreadsheets/CSV's/flatfiles?
What weird kinds of functionality have you squeezed out of Access? And what else is there to take its place?
Access is not a DBMS. Or at least it's not just a simple DBMS. It's a very good RAD environment, a simple way to create SQL code graphically, and a regular front end to fully fledged DBMSs.
Neither SQL Server (Express or MSDE) nor Oracle, MySQL, etc. will ever replace it, until they come integrated with a simple programming language, a Crystal Reports like facility and a way for beginners to get around without having to learn SQL.
At my first professional job I developed a very big system completely in Access. Front end for the clients, admin front for me, reports and monitoring for management, permissions per user, automatic tasks run at certain times, etc. I came to learn a lot of its flaws and strengths as a result.
I've seen marvelous apps done with it, as well as pieces of crap. I still use it for personal projects and ain't ashamed of it (for instance, a Sudoku player, or a Karnaugh mapping implementation). There's an MVP who's created a Paint clone completely in Access, though I believe that's extreme.
Access' pearls: It's nice to easily test a database design idea and have sketch forms, reports, etc. created for you. If you change a column's name (or even a table, though that fails sometimes) it's nice to see all references to that have changed to the new name, automatically. The "sub-form" control rocks, I longed for it on VB6. And the "Thunder" button to do repeated filtering on tables is great, I wish I had something like that on SSMS!
The problem with replacing Access (and replacing Access is the problem which stops me, in the vast majority of cases, from recommending a move to Ubuntu or SUSE desktop to my business clients) is not that Access is widely used for its database facilities: it's not, except for the most Mickey Mouse of user-written departmental applications, which are relatively trivial to re-code. The problem is the medium-sized applications where the data was migrated long ago to the corporate SQL Server.
These are a nightmare. They're often badly written (I've acquired a fair few to administer over the years) and encapsulate reams of business logic. Recoding them in anything is generally quoted at a couple of man-months at best, usually twice or three times that, and it's unusual for a department of the size these are found in to have the budget to support that. Moreover, although the arrival of AJAX and good desktop-like controls means that this is at least now possible in theory, in practice these apps are often massively integrated with the rest of the MS Office desktop and virtually impossible to disentangle without users seeing a drop in usability in the short to medium term, which is a show-stopper in itself.
I really do not know what the solution is, apart from slowly creating new systems with other methods and hoping for the gradual demise of existing apps. Trouble is, I think Access could well be the Cobol of the 1990s: it'll be around forever supporting legacy apps because it's too costly to rewrite them from scratch.
As an aside, does anyone else coming from a non-Access traditional Win32 coding background have the experience of finding that the standard of coding in even professionally written Access apps is generally below average? Although superficial (but important) stuff like formatting and variable names are generally fine I find over and over again that program structuring is poor. I know that this may often be because these apps have grown like Topsy, and VBA really isn't conducive to good coding anyway, but even allowing for these factors things generally seem worse than one might expect.
I think the easy answer is nothing... Access is commonly used because it is the only option available and it is extensible. There is simply nothing else out there that is installed on nearly every business machine in the world the way Access is.
If you are looking for an alternative, Oracle Application Express is a fairly powerful web-based tool that can run on Oracle XE. It is a potential alternative to Access, but it does not support master-detail tables as well as Access does.
There is a continuum of developers in the world, rather than hard and fast boundaries. People range from business managers to IT professionals. I consider myself an advanced amateur developer, somewhere between the two. As such, I use MS Access at work to organise a large amount of data in a small architectural office, including timesheets, financials and architectural specifications. Sure, the application now is a mass of stinking p** that has grown over almost five years.
I've been searching for something better than Access for ages. I can create simple apps in VB.NET, but the learning curve from VBA is huge. I've looked at all sorts of options. Often you need Crystal Reports to get any kind of reporting capability, or the IDE is non-intuitive, or linking a field to a data object takes ten minutes each time, or there is no integration with other Office products at all. The boss is not going to pay for something that costs a bomb, either. I'd love to get away from Access, but nothing I've looked at gets anywhere near ticking all the boxes.
The nice thing about Access is that it's an answer to large IT bloat. It comes with MS Office, so it's already approved for use on locked-down computers, and I don't have to struggle for weeks/months to get an application approved through various departments, account for coding hours, and do all the testing for an application I can whip up in an afternoon with Access. Sure, SQL Server would be nice to use, but it's not worth the headache.
I doubt Microsoft will kill off Access. With Access 2007's integration with Sharepoint and the rapid growth of SharePoint, Access may in fact have a resurgence as an off-line and reporting tool for SharePoint web sites.
I don't think MS has any intention whatsoever of getting rid of Access. They may transform it into more of an end-user tool than a programmer's tool, but it is never going away. Witness the forking of the Jet database engine into the traditional Jet 4 version that ships with every copy of Windows (because Active Directory uses Jet 4 as its data store) and the version that is owned by the Access development group (the ACE, with its ACCDB file format, which is, de facto, Jet 4.5 or maybe Jet 5).
Access is a hugely popular and useful application and functions in a whole host of levels within any number of organizations, large and small.
Why is there no open-source alternative to Access?
Because it's way too hard to create such a complex piece of software that does so many different things well.
My cousin is a serious FileMaker guy. He seems to be doing great and has grown a small firm around it. Apparently FileMaker is a cross-platform Mac/PC system for rapid app development...
Maybe something like that will rise up with the business power-user/RAD set?
Microsoft may have a history of intentionally killing off database systems like this. I once listened to a .NET Rocks interview with Les Pinter, where he claimed he had heard a top Microsoft exec say that every copy of FoxPro that sells costs Microsoft thousands in lost SQL royalties. And where is FoxPro today? Officially, it was end-of-lifed in March of 2007. So how did it get from prominence to demise? Well, Les says that Microsoft acquired it and ran it into the ground on purpose.
I am not usually big on conspiracy theories, but this does resonate with Microsoft's track record from that era.
Anyway, trivia aside, I believe there will be more RAD-style database tools... They empower non-developers and allow developers to solve certain types of problems very quickly. I have an aversion to using them for large projects that, unfortunately, cascades: small projects tend to grow over time. So as a result I only use them for the very tiniest things.
As for the long term consequences... Well, I have seen scenarios where they didn't scale well and all those fragmented solutions started to look a lot like technical debt. It is actually possible to hook Access up to a SQL Server back-end, which solves a lot of problems.
Probably the biggest/weirdest thing I did with Access was writing an EDI system from scratch. For those of you who have worked first-hand with EDI, you know what I'm talking about. What a silly idea that was. My problems here had more to do with VBA than Access though -- I remember just really needing interfaces and not having them.
I also used it for code generation back before things like Codesmith were available. It generated business objects (CRUD and some other basics) for ASP Classic. That actually worked awesome.
In my experience Excel is even more widely used inside corporations. We're just now doing a project where we convert ~60,000 Excel documents (with 4-12 sheets each) to SharePoint and InfoPath forms. ;)
Microsoft would like us to move to using Office Business Applications: essentially hooking the Office apps up to databases. Add SharePoint into the mix and there is a lot of possibility. Also plenty of licensing fees for MS as well.
I have seen Access used to integrate and front-end GIS and health data. It blew me away how well this app was coded and documented.
Same as Mark: Access was my first approach to databases, and I found it powerful at the time. It has some nice features, like generating SQL from "query by example". Its form features and its capability to print in various formats (a sheet of labels, for example) were nice too.
On the downside, it is proprietary, and each new version was incompatible with the previous one: if you load a database made with Access 97 into Access 2000, you can no longer load it with the older one...
Although I don't do much personal database work (lists of addresses, mostly), for such work I would use either Open Office's database tool (not tried yet) or a good old open source database (MySQL and SQLite come to mind as lightweight bases) with a GUI front end, for example SQuirreL SQL Client, and probably JasperReports as the report front end.
Not as integrated as Access and with steeper learning curve, but somehow more flexible.
Now, I am sure we can find some simple good old non-relational database for the simplistic uses I had at the time. :-)
I welcome the day when Access breathes its last breath and joins the likes of Clippy.
Access is well-intentioned, but it has become a crutch. Even in large companies with able IT staffs, Access applications can run rampant, creating a pain point for anyone trying to know the global landscape of products to maintain. Linked Access databases that point at other data sources, unmaintained Access applications, and just sheer flexibility are issues, in my opinion.
I think that Access is actually too powerful, too flexible, and too extensible for its own good. In Microsoft's well-intentioned attempt to bring rapid development to the desktop database realm, it really has opened a Pandora's box. Look at it from another perspective, too. Assume that a company has a few applications that are written in Access. The developer who wrote them leaves. These applications are just important enough that they still need to be used, but not important enough that IT gets the approval to port them to a more technologically capable platform.
Now, the situation is that if no one on the team knows Access, it becomes a requirement for the new developer. This means that you might have to pass on a developer who is the most technically well-rounded and the best fit if he does not have legacy chops. I speak from experience on this. We are down to two legacy Access applications, and are trying feverishly to make the case for either incorporating the functionality into related, code-based projects or into new projects of their own. I have one developer with Access "chops", and am not going to base a candidate search on whether someone knows Access or not in the event that he leaves.
As far as the weirdest thing I've seen squeezed into Access...
I am a police dispatcher for a smaller university, and we (like almost every agency) use a CAD (computer aided dispatch) and RMS (record management system) system.
Our previous CAD/RMS software was built ENTIRELY into Access. You opened Access, and through an ugly GUI, entered calls for service, everything. Officers wrote reports through the same interface.
It worked great at first, and then as the database size grew, it became extremely slow and difficult to use. This is what happens when the state makes you go with the lowest bidder on a project...
Now we use a CAD/RMS solution that is browser-based, backed by MS SQL.
I don't think that Access is going away anytime soon. The beta of Office 2010 is out with an updated Access included, and the Microsoft blogs are hyping the features of Access 14 (the version after 2010), which include improved Access Projects (.ADPs) with better support for SQL Server 2005/2008 and better .NET integration.
If I were to look for a new integrated database development system providing front-end and back-end features, Oracle APEX would be the main contender. Front ends are web-based, requiring no runtime on the client; the whole system is free to download and install (Express Edition); and given a few years, the entrance barrier for new users will hopefully be reduced so it is something laymen can dabble in.
Access is just migrating more toward a single user on a desktop, or a few users on a shared database file without much security. If you want to take it to a slightly higher level, use Access as a front end to SQL Server.
Well, now it seems Access 2010 is looking to get its hooks into SharePoint in an attempt to "web-enable" the Access application. There are even hosting sites catering to this technology. Maybe all those who were concerned Access couldn't scale need fear no more?
Access definitely has both pros and cons; it's just another tool to use but not abuse. Every adult job I've ever had ran on Windows, so Access or something like it will exist. I feel sorry for the places that are stuck in Access quicksand or lost in Excel hell. But are we forgetting that all of that can be corrected, and better yet prevented, with a badass BI team and proper training?
PostgreSQL, MySQL, FileMaker, <insert name of database that is not Access here>, Excel, custom parsers, natural language importers, Perl just because it is a swiss army knife, grep awk sed, m4, the old versions of Access before the demise of Access, ...
Weird functionality? Rather than the normal myriad of ways to access Access, I use SQL statements to access Access. The SQL statements that I use work with other databases as well as with Access -- weird, I know.
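In that spirit, a .mdb really can be treated as just another ADO.NET data source from C#; the same SELECT would run against SQL Server with only the connection swapped. The file path and table here are placeholders, and the old Jet provider needs a 32-bit process:

using System;
using System.Data.OleDb;

class PlainSqlAgainstAccess
{
    static void Main()
    {
        // Jet 4 provider for .mdb files; use Microsoft.ACE.OLEDB.12.0 for .accdb.
        string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\orders.mdb;";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();
            var cmd = new OleDbCommand("SELECT TOP 5 CompanyName FROM Customers", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        }
    }
}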
Like many, I have used and abused Access over the years, though I always felt a little dirty doing it... I felt a little better about it when I came across this post by Rob Conery recently:
http://blog.wekeroad.com/blog/hacking-your-vote/
I would never have dreamed of using Access in a voting system. Scary.
FileMaker is a good database for shifting from MS Access.
It is a cross-platform database (mac/PC). It has a Web Viewer, through which you can connect to the web world. For example, charts, maps etc can be shown in this web viewer.
FileMaker is easy to use for beginners. You could also explore the scripting mechanism and achieve data manipulation.
The latest FileMaker 10 has several new interesting features. My vote is for FileMaker.
I believe FileMaker Pro will probably become a new standard if people ever figure out it exists.
FMP has all of the same features / shortcomings of Access, plus you can actually make a real client/server setup if you know what you're doing.
In a single file you can define your forms, reports, tables, etc. It is also cross platform and runs on Windows or Mac, and can be adapted to web based too. All by design.
Coming from the "real" SQL servers to File Maker Pro was really hard mentally but once I got the hang of it I found it was pretty amazing. Now as a database it's nothing special but as a database application development system that "normal" people can use it really shines.
If you PLAN on a network setup, I would suggest taking the time to learn how to separate the storage database from the application database up front. Otherwise upgrades require you to do lots of data export/import, and that can take a while or be almost impossible if your tables change significantly.
I've built a call center application that automatically handled incoming phone number lookup and automatically dialed regular POTS phones using FMP on NT. That was about 6 years ago so I imagine it's improved since then.
I've only used Access when I wished Excel could do a "Left Inner Join". Otherwise, MS has done a fair job making their C#/SQL offering simple (and free) to use for lightweight RDB projects.