Abstractions come in many flavours. At the most general level, abstraction is the single most valuable method of analysis; mathematics is all about it. An abstraction is also the result of that analysis. I will be using the term "abstraction" as if it meant "a concept relevant in a particular context". In simple terms, an abstraction is "something that ..."
In programming, abstractions appear in many places. It's impossible to discuss all of them, so here are a few examples before we proceed to the main points of the discussion.
The simplest abstraction in programming is the byte. The concept of the byte suggests that (1) all data we manipulate can be converted to numbers and (2) those numbers can be encoded using fixed-width integers. Bytes of sizes other than 8 bits have been experimented with, but they were not successful (not that I saw a CDC Cyber myself, but the books say so). Let's think about why all bytes converged to 8 bits. My answer is: because there is no difference. They could have been of any size; 8 bits was simply a good balance of range and processing power against production costs at the time. As the costs kept falling, the bytes actually grew (although now under the name of machine word). Anyhow, the byte became a universally accepted abstraction, reused literally everywhere.
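The two suggestions above can be seen in a couple of lines of Python; a minimal sketch, assuming UTF-8 as the text encoding:

```python
# (1) all data we manipulate (here, a piece of text) can be converted
# to numbers, and (2) those numbers are fixed-width 8-bit integers.

data = "byte".encode("utf-8")        # text -> sequence of bytes
print(list(data))                    # [98, 121, 116, 101] -- one integer per byte
assert all(0 <= b < 256 for b in data)  # each fits in exactly 8 bits

# the same bytes can be reinterpreted as a single number
print(int.from_bytes(data, "big"))
```

The point is not the particular calls but the round trip: whatever the data is, underneath it is numbers in fixed-width cells.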
Abstractions also appear in OOP, where they hide in the shadows of the objects themselves. After all, what is an object if not an abstraction of some real-world entity? OOP is very successful at giving developers abstraction-producing machinery, which they apply to their problems, yielding abstractions directly translatable into programming languages. Not all the abstractions so created are reusable, let alone useful, but OOP itself is.
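That machinery is easy to demonstrate; a minimal sketch, with a hypothetical bank account standing in for the real-world entity:

```python
class Account:
    """An abstraction of a real-world bank account: only the details
    the program cares about survive; everything else is stripped off."""

    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

acct = Account("Alice")
acct.deposit(100.0)
print(acct.balance)  # 100.0
```

The class translates directly into the language, exactly as the paragraph says; whether `Account` is a *useful* abstraction for any particular bank is a separate question.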
As we can see, low-level abstractions can become quite successful and widely adopted. This is because they are so "abstract" that they apply everywhere out of the box, and changing them would make no difference, so nobody even thinks about why they are there and what their purpose is. They are just there, they've always been there (for an eternity of fifty years), they are the truth. Although this approach is far from scientific, it works well in production. After all, why do I need to know the details of the chemical process for producing plastic, when all I need is to wrap my groceries in a plastic bag?
And so, my first point: easily reusable abstractions are invisible. We use them without ever noticing. "Why? Of course our data is encoded in bytes, what a stupid suggestion!" Low-level abstractions are like that, but hardly any others are, at least right now.
As we move farther towards the "real business" end of the programming scale, it becomes increasingly difficult not to notice that the abstractions arise more as obstacles than as helpers. One step up, look at the building blocks we have for building real systems. Whether it's the 1980s or the 2000s, the abstractions your programming environment offers are the same:
- User. Session. Context. Authentication with plaintext username and password (anything else is still out of reach because of interop problems).
- Database (whatever the make, hard to design). Connection. Pool (not always useful). SQL query (hardly anyone knows how to write one that won't bring the server to its knees). Transactions (require even more knowledge).
- Component frameworks, networking, distributed facilities, RPC, messaging (various compatibility, performance, and scalability problems, difficult to address).
- I18N: Encoding (Latin-1 being the most useful one; interop hell again).
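To make one of these building blocks concrete, here is a minimal sketch of the "connection pool" abstraction from the list, using only `sqlite3` and `queue` from the standard library. Real pools add health checks, timeouts and size limits; this only shows the shape of the idea:

```python
import sqlite3
from queue import Queue

class ConnectionPool:
    """A toy pool: pre-opened connections wait in a queue."""

    def __init__(self, database: str, size: int = 4):
        self._idle = Queue()
        for _ in range(size):
            self._idle.put(sqlite3.connect(database, check_same_thread=False))

    def acquire(self) -> sqlite3.Connection:
        return self._idle.get()      # blocks when the pool is exhausted

    def release(self, conn: sqlite3.Connection) -> None:
        self._idle.put(conn)

pool = ConnectionPool(":memory:")
conn = pool.acquire()
try:
    print(conn.execute("SELECT 1 + 1").fetchone()[0])  # 2
finally:
    pool.release(conn)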
Two steps up, when it comes to real business, there are no abstractions. It has to be done in exact particular way. Nobody's going to use an application if it doesn't make precisely what the user wants. That's why all the software is to some degree configurable and adjustable. The rigid abstract part of the software is clearly separated from the flexible configuration part. The abstraction therefore has nothing to do with the business specifics - it's all in the configuration. And if you are lucky, you've got enough flexibility to cover everything.
Now, I'm asking you this - wouldn't then a "reusable business-specific abstraction" be an oxymoron ?
For one, "business-specific abstraction" is already nonsence. Abstract has its details stripped off and replaced with imaginary concepts. But details are the heart of the particular business, no matter if it's called "valuable proprietary know-how" or "ugly poorly understood mess". And it makes difference. And it's highly visible. If you are switching from DCOM to SOAP, nobody cares. But if you mishandle customer's data for the sake of abstraction that you might have - well, they won't approve it. It has to be done their way or no way. If your abstraction is too rigid - too bad for it.
For two, "reusable business-specific" is also nonsence, as soon as you are crossing enterprise boundaries. Unless Big Brother takes over, every single business is going to be to some extent unique and as such will not be covered by the abstraction. Even if you are borrowing a clearly separated network service, not burdened with thousand dependencies, it's likely to be useless without tweaking.
And so my second point is - business specific software can hardly be abstract and hardly reusable. In fact, the most valuable property of such software is flexibility. It has to allow easy changes, no matter what abstraction has been put in it.
That's one of the points behind the Pythomnic project - allowing developers to build network services that can to any extent be changed on the fly, without service restart and as soon as you have read this far I invite you to check it out.