
May 93 - The Future of OOP: One Man's Crystal Ball

Jeff Alger

This year marks a turning point in the object-oriented community. A future that seemed so clear and so certain until recently is now clouded by a haze of conflicting products, markets, and philosophies. Although no one can claim to have perfect vision of the future, I will give it a try, and the right place to start is not by looking forward, but backward at the history of our industry. For, as the saying goes, "those who do not learn the lessons of history are doomed to repeat them."

Lessons from the History of OOP

In the 1970's, a small, talented group of researchers at a place called the Xerox Palo Alto Research Center, or PARC, invented the personal computer and with it, the first true object-oriented programming language, Smalltalk. Computer hardware wasn't far enough along for their invention to be practical. Xerox as a company wasn't ready for it. And there really wasn't much of a perception in the software community that there was a need for it, anyway. Remember that at that time database technology and information engineering were just coming into their own and minicomputers were sprouting like weeds. So the Star workstation and Smalltalk languished, in my opinion more because the market didn't see the need than because of any failure of Xerox to follow through.

Soon thereafter, a brash young entrepreneur named Steve Jobs visited the PARC labs and turned what he saw into the Lisa computer. It flopped, again because it was ahead of the curve on hardware and there was no groundswell of support for new ways of developing software. Even when the most egregious problems of the Lisa were corrected in the Macintosh, it was a hard sell for Apple Computer to convince the public that new techniques were needed, especially after IBM weighed in with its own personal computer. The IBM PC was certainly less imaginative than the Macintosh, but it was closer to traditional architectures both in terms of hardware and software, so it was easier for the systems community to get its hands around. A toy mainframe that sold because it reminded people of the real thing.

About the time of the Macintosh and Lisa, a small group within Apple recognized the potential, not just of the human interface and hardware aspects of what the PARC group had created, but of the software techniques they had used to program their machines. Object-oriented software seemed such a natural way to handle an iconic, event-driven graphical user interface that they created in succession Object Pascal, the Lisa Toolkit, the original MacApp, even an abortive Smalltalk for the Macintosh. There was talk even of object-oriented operating systems, objects from soup to nuts. Their work was only mildly influential in determining the course of Macintosh software development tools, as we saw a steady progression of procedural operating systems, languages, tools and techniques. How many people are aware that as recently as three and a half years ago, at a time when the then-MacApp Developers Association had about a thousand members worldwide, there were only two - count 'em - engineers in all of Apple working on MacApp?

Apple had the option of going object-oriented all the way at any time up to the advent of Multifinder. There are those who disagree with me, but I and many others there at the time feel that had they done so, there would have been no need for a Taligent today and we would perhaps already be programming to the tune of Rhapsody in Pink. But even within Apple, champion of new ideas, counterculture of the computer industry, conservatism won; there were just too many people who did not even see the problems, let alone the value of object-oriented solutions.

But I'm getting ahead of myself. Across the continent, squirreled away in an obscure corner of Bell Telephone Laboratories, a fellow by the name of Bjarne Stroustrup spent his time writing software to simulate… exactly what, I don't know, but the Labs have always done a lot of simulations work. He used a language called Simula, an offshoot of Algol designed specifically to simulate real-time stochastic processes and arguably one of the first true object-oriented languages before that term had been coined. But Bell Labs did other things beyond simulations. And like any large organization, they developed software in a variety of languages: COBOL, Fortran, and lots of obscure languages like Simula and Snobol. The breakup of the phone system was forcing all of AT&T to think about new ways of making money and Unix looked a good bet. But there was a problem: how could AT&T tell everyone else in the world to use Unix and its companion language, C, if they themselves weren't? So, the order came down from on high: henceforth, all software will be written in C. Now, I have heard conflicting stories as to whether this order directly prompted Stroustrup to migrate to C or whether it merely built a critical mass that made C more acceptable as a delivery vehicle. It doesn't matter, because AT&T just wasn't interested in any language that wasn't C. C++, as with most other object-oriented innovations, was ignored by its own company. Today there are many within AT&T that speak of the fish that got away.

Wherever you look, the history of object-oriented technology has not been pretty. Microsoft was smart enough to recognize an object-oriented image problem when they saw it. Windows is as object-oriented as one can get without an object-oriented language, yet nowhere in the early literature of Windows was the term "object" even used. They recognized the merits of the approach but realized that few others did. Operating systems designers have been doing what could arguably be called object-oriented programming - attaching function vectors to packets of data - since the 50's but remain, perhaps for the very reason that they've done so well without, curiously skeptical of the need for object-oriented languages. Computer scientists have been pushing the use of abstract data types - encapsulating data behind functional interfaces - for decades but no one in the commercial arena has been there to listen, with the singular exception of Ada. Even in the rarefied world of databases, where challenges to the data-driven approach are quickly shown the door, the trend has been strongly toward what arguably could be called object-oriented architectures. Call them triggers, stored procedures, or what have you, but the fact remains that modern data modeling requires associating functions with packets of data, the core concept of OOP. Yet, few in the database community are willing to call a spade a spade: most so-called "object-oriented databases" are, in fact, nothing more than glorified network or relational models that support complex data types, and many so-called "object-oriented methodologies" are nothing more than recycled information engineering.
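The "function vectors attached to packets of data" technique mentioned above can be sketched in C-style C++. The names here (Packet, PacketOps) are hypothetical, invented for illustration; no particular operating system is being quoted:

```cpp
// A "packet" of data with an attached function vector, in the style of
// pre-OOP systems programming. Behavior travels with the data, and calls
// dispatch through the vector -- polymorphism without classes.
struct Packet;

struct PacketOps {              // the "function vector"
    int  (*size)(const Packet*);
    void (*setTag)(Packet*, int);
};

struct Packet {
    const PacketOps* ops;       // each packet carries its own operations
    int tag;
    int length;
};

static int rawSize(const Packet* p)  { return p->length; }
static void rawSetTag(Packet* p, int t) { p->tag = t; }

static const PacketOps rawOps = { rawSize, rawSetTag };

// A "polymorphic" call: the caller never knows which function runs.
int packetSize(const Packet* p) { return p->ops->size(p); }
```

A different PacketOps table attached to the same struct layout would change the packet's behavior without changing any call site, which is precisely the dispatch mechanism object-oriented languages later formalized.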

Even today, with magazines, training classes, programming languages, conferences, college curricula and associations devoted to object-oriented technology, it is easy to get fooled into thinking that the war is won and that the ramparts themselves are now object-oriented. The problem is that OOPers tend to talk mostly to other OOPers and forget how much resistance there is to the idea outside our own mutual support group. Well, as one who advises companies on these issues, I can tell you that outside our own ranks OOP is still viewed as either snake oil or a silver bullet, but not as a practical tool for solving everyday problems.

Throughout its history, OOP has been the Rodney Dangerfield of software: it gets no respect.

There are, of course, reasons for this history, and they are important to understand if we are to anticipate the future, for there is little reason to think they are about to change after all this time. Throughout, little emphasis has been laid on solving real problems that translate to real market share. That is, the OOP community has tended to be internally focused, developing great ideas and products and only then trying to convince the world that there is a need for them. Accompanying this has been more than a little arrogance, especially in waving off the very real concerns of managers everywhere: integration with existing systems and techniques; leveraging skill sets already in place; having measurable, controllable and repeatable processes rather than a few smart people locked in a room arguing with each other. In Marketing 101 they teach you what happens when you try to bully the market. I am reminded of the story of the then-Chairman of the Great Atlantic and Pacific Tea Company being approached earlier in this century about sponsoring a Sunday afternoon radio show. He declined, saying that he doubted anyone would listen during that time slot. After all, everyone he knew played polo on Sunday afternoon. Well, everyone I know in the object-oriented community thinks polymorphism is really important.

Wild, unsubstantiated and often patently false claims have been made about the benefits of object orientation. Where, for example, are the case studies of large-scale code reuse to back up all the popular literature on the subject? Why, if this is so "natural," did the organizers of OOPSLA a couple of years ago feel compelled to hold a panel on the subject, "Why Is Object-Oriented So Difficult?" And why, if the payback is so quick and dramatic, is it quietly understood in the OOP community that it takes two years to develop a good object-oriented engineer?

We have done a very poor job of articulating why technology managers should believe us when, like Charlie Brown running up to kick the football every fall only to have Lucy once again yank it away at the last second, they have been consistently let down by other, similar, claims in the past. A good friend of mine, John Brugge of IDS Financial Services, circulated a paper in his company explaining the relative merits of object-oriented technology. It spoke of dramatic increases in productivity, lower maintenance costs, better results, higher quality. Reading the paper at his urging I felt it to be quite mainstream, the sorts of claims to be found throughout the literature on the subject. After five pages, however, the paper broke off in mid-sentence. "…I can't go on with this. This really is not my paper." He explained that the paper was, in fact, from a book by Edward Yourdon from the 1970s that dealt with the structured programming revolution; John had literally done a bulk search-and-replace of "object-oriented" for "structured" and otherwise left the wording unchanged. Little wonder that we are viewed with suspicion.

Another problem has been a lack of focus. 75% of development costs, and an even higher percentage of software lifecycle costs, are tied up in analysis and design but, as Yourdon points out in his new book, "Decline and Fall of the American Programmer," the OOP community seems stuck in the backwaters of code. Even within that limited domain the focus has tended to be more on piling feature after feature into the syntax of languages while giving short shrift to the problems that really consume programmer time: memory management, debugging, object persistence ("you mean you actually want to store your data?") and integration with non-OOP technologies. Even worse, OOP has often been the excuse used for working without any methodology whatever. I call this the "Brilliant Pebbles" approach to software development, named after the Strategic Defense Initiative program of the same name, because it relies on training a few smart people to hurtle themselves at oncoming problems. Little wonder that people don't want to trust us with critical projects.

Finally, and to me this is the most important, the OOP community seems to lurch about from place to place because it is largely blinded to the real reason it exists. Have you ever stopped to wonder why all major components of the software community - systems programming, simulations, theory of algorithms, database, artificial intelligence and human factors - have drifted in the same direction independently of one another? It was not until the advent of C++ that there was even much talk between these groups, and then it was to argue over details. It cannot be random chance. Clearly there is some underlying force guiding this evolution, and it is squarely this: Our systems and systems development efforts are so complex today that they are limited by grayware, the stuff of the human. Put simply, we have reached the point where our own ability to understand what we are doing and what we have done is the principal constraint on software development. To quote Pogo, "We have met the enemy and he is us." Yet, this isn't a very comfortable subject to talk about because it takes us out of the ivory tower world of formal grammars and correctness proofs and into the various "soft" disciplines that come under the general heading "Cognitive Science." And when we look under that rock, we may see something that makes us uncomfortable. Maybe, just maybe, object-orientation isn't really as natural as all that, even if it is an improvement. Maybe, just maybe, the history of computer science did not, after all, stop with the advent of objects. Maybe we've just positioned ourselves for even harder work to come.

It is important to take this look backward at the history of object-oriented technologies to understand where we are likely to go in the future. There is by now enough experience with the technology to see many of its limits. Eventually, market forces will be heard whether we listen actively or cover our ears, and that will force the OOP industry to face a series of problems we've collectively swept under the rug until now.

Problem-solving Focus

First and foremost in the coming years the OOP community will become more focused on customers and problems, not technology. It has to. There are only so many executives and purchasing managers willing to buy technology and stunning leaps of faith, and as one who advises them I can tell you that even they are starting to turn up their radios to mask the noise. There are many ways to segment the potential market for OOP: small vs. large, commercial products vs. in-house, standalone systems vs. networks and groupware, GUI vs. … what is the antonym for GUI, anyway, "slick"? …, leading edge vs. mainstream, data-intensive vs. process-intensive… I could go on for some time without pausing for breath. In each case there are clearly identifiable customers and needs that we can and will address without dwelling so much on what Neal Goldstein and I like to call OOTB: object-oriented technobabble.

There are two specific companies I would like to single out in this regard. There is much that I admire about the way Taligent is approaching the future, but I think their greatest strength is their grasp of these market forces. Ten years from now when the definitive history of Taligent is written, whatever success they achieve will be attributed more to their business acumen than their technology. Tiny Component Software is another company that has done a good job of positioning themselves as solvers of specific business problems, rather than as a vendor of "YAOOL": "yet another object-oriented language." It is companies like these that we should study and emulate.

The Problem With Classes

As the OOP community enters the second phase of its business cycle, there is growing recognition that classes, inheritance and polymorphism may not be the miracle cures many thought them to be. There is increasing talk of the more general mechanisms of delegation and aggregation. Deep class hierarchies are definitely passé. Inheritance brings with it inherent problems with modularity and code reuse when compared to delegation models.
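The contrast between the two mechanisms can be sketched in C++. The Formatter/Logger example is invented for illustration: inheritance wires the relationship into the type at compile time, while delegation forwards to a collaborator that can be chosen, and even replaced, at run time:

```cpp
#include <string>

// Inheritance: the subclass is statically bound to its base; changing the
// behavior means defining (and recompiling against) a new subclass.
struct Formatter {
    virtual ~Formatter() = default;
    virtual std::string format(const std::string& msg) const {
        return "[log] " + msg;
    }
};
struct TimestampFormatter : Formatter {   // relationship fixed at compile time
    std::string format(const std::string& msg) const override {
        return "[12:00] " + msg;
    }
};

// Delegation: the object forwards to a collaborator held by reference,
// keeping the two modules independent and swappable at run time.
class Logger {
    const Formatter* fmt_;                // a delegate, not a base class
public:
    explicit Logger(const Formatter* f) : fmt_(f) {}
    void delegateTo(const Formatter* f) { fmt_ = f; }  // rebind on the fly
    std::string log(const std::string& msg) const { return fmt_->format(msg); }
};
```

The delegation version composes behaviors object by object instead of freezing them into a hierarchy, which is the modularity advantage alluded to above.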

Classification is also losing its luster as a way of building semantic models, not least because the hard evidence in the cognitive science community is that grouping things according to the properties they share - the foundation of classes - is a distinctly unnatural way for humans to describe their world.

At the same time, we are rapidly coming to a good understanding of how humans really group concepts, a subject known as "categorization" in the cognitive science community. Our groupings turn out to be far more than random. They are consistent enough to allow children all over the world to learn languages in pretty much the same ways and to allow for a large number of what the linguists call "universals," grammatical constructs common to all known languages. Their structures are much more complex than simple classification can easily represent, but that should be no surprise; the artificial intelligence community came to that conclusion in the 1970's. The object-oriented community just hasn't bothered, it would seem, to hit the literature before asserting the "naturalness" of classification. But the real point here is that as we learn more about human cognitive processes, we will also find ways to more directly represent them in software. Objects, but not classes, will play an important role in that next step.

Visualization

Another frontier of object-oriented and non-object-oriented software development is to learn how to properly visualize information. I must betray a bias here, for much of my own work has been in this area. To develop our own notational system, the Visual Design Language, Neal Goldstein and I did what we felt was the natural thing: we engaged the help of an expert in graphic communication. The entire emphasis from day one was on communicating. Not painting pretty pictures (though that turns out to be important to comprehension and retention) and not expressing every technical concept that comes along, but real communication. That means minimizing loss and distortion of information. Increasing comprehension and retention rates for project documents. One can easily show by experiment that comprehension and retention of typical software documentation is an abysmal 10%-20% for most forms. We must not just capture information, but also make it understandable and useful. One of my favorite anecdotes is of the kindergartner who came home from school one day and announced excitedly, "Mommy, I learned how to write in school today!" "That's great," replied the proud mother. "What did you write?" "I don't know, I can't read yet." In software development, we are great at accumulating documents we can't read or understand.

This is almost pure grayware. And it is far less than rocket science to trace the sort of path Neal and I took. There are entire industries based on the premise that one can communicate better through visualization.

To give you some idea of how primitive software notational systems generally are by the standards of the experts, let me relate our first meeting with our graphics consultants. We came to the session armed with what we thought were the best features of several leading object-oriented notational systems. I won't name names because my point is not to embarrass particular competing approaches, but you can well imagine what such a short list would look like. Our graphics consultants looked hard at them and… laughed. Really. They accused us of deliberately presenting the worst possible visualizations from the competition and wanted something more representative. We had already chosen what we thought were the best! As we worked with them I came to understand their perspective for myself. I also came to appreciate that they are at least as good at their jobs - visual communication - as we are at ours.

But before we can even get to visualizations, the industry must recognize the need. Isn't it remarkable that browsers for object-oriented development tools - for that matter, most software development tools - are still based on text? We know that people deal poorly with text as a general rule. The time is drawing near when we will no longer consider ourselves exempt from the rules that seem to guide the rest of the human race in this regard. Already there are some noteworthy contributions to visualization of software: Microsoft's Visual Basic and Prograph. There is also no end of notational systems for analysis and design, object-oriented and not. But they have yet to take the one step that really counts: go to the experts in graphic communication and ask their advice.

Death of the Application

One common feature of advanced dynamic languages such as Smalltalk and Lisp is that one does not so much develop an "application" as extend the one, the only, "environment." Take Macintosh Common Lisp as a good example. An "application" is really nothing more than an alternative menu bar within the development environment. For that matter, you can create an arbitrary number of "applications" by this same stratagem. The application as a unit of software engineering disappears in this environment, leaving only user interface paradigms to justify the term.

This is powerful as far as it goes, but imagine now entire machines and even networks in which there are no applications per se, just bundles of objects firing off messages to one another. A world where even menu bars and windows are assembled from components on the fly, discarding the last vestige of applicationness. I call this environment "object soup," a vast sea of memory thinly populated by occasional lumps we call objects.

Well, get ready, because this is coming faster than you think. There are already several companies at work on this and the first generation of real object-oriented operating systems is just over the horizon.

This reveals the application for what it has always been: an anachronism of our operating systems. But before you get too excited, stop and think for a moment about what kind of problems this approach might invite along to the dinner party. Throughout the history of the computer industry, we have set budgets, organized projects and deliverables, trained and rewarded employees, and defined products according to this single concept of the "application." To drop the application like a hot rock just isn't realistic, if for no other reason than that we have nothing to put in its place. Instead, we are in for a prolonged and wrenching period of readjustment, in which we must rethink our concepts of methodology, project management, software economics and the software lifecycle. It is certain that many people will be left behind by this sea change and that huge and powerful entrenched interests will oppose it altogether. (I stole that from a recent Wall Street Journal editorial, but I think it appropriate here.) It is not even clear that object soup can win in the end, though I think it can. This is the classic yin and yang of new technology: the new problems will hurt many but also bring tremendous opportunities for innovation and entrepreneurship to those who can handle the change.

Death of the Linker

Speaking of anachronisms, why do we still have linkers? Especially in the world of objects, where there are so many ways to dynamically link one object or class to the rest. The traditional coding cycle - run/test/change source/compile/link/go for a walk/run again - just doesn't make any sense anymore. The technology of dynamic languages has progressed to the point where traditional objections have been or soon will be silenced: performance, application size, and shrink-wrapping of applications. It will become increasingly difficult for development tools vendors to justify why they, too, haven't caught the dynamic wave. I give the linker only a few more years.
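What replacing link-time binding with run-time binding looks like can be sketched in C++, a minimal sketch under stated assumptions: the registry below is hand-populated, where a real dynamic system would pull implementations in from shared libraries or a live image; the names (bind, call) are invented for illustration:

```cpp
#include <functional>
#include <map>
#include <string>

// Run-time binding: instead of resolving a call at link time, look the
// implementation up by name at the moment of the call.
using Command = std::function<int(int)>;

std::map<std::string, Command>& registry() {
    static std::map<std::string, Command> r;   // the live "environment"
    return r;
}

// Register (or re-register) an implementation under a name.
void bind(const std::string& name, Command c) { registry()[name] = c; }

// Dispatch by name -- resolved now, not by a linker.
int call(const std::string& name, int arg) {
    return registry().at(name)(arg);
}
```

Because a name can be rebound while the program runs, the change-source/compile/link cycle collapses to "rebind and go" - the property that makes the traditional linker look like an anachronism.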

Reuse (of Analysis and Design, That is)

Way too much attention has been directed at the idea of code reuse. Who cares? Coding is a period in the novel of software development, and it's shrinking all the time. The bulk of software development dollars go to analysis, design and maintenance, not code. Even if we completely eliminated the need for code, it would only reduce overall lifecycle costs by perhaps 10%-15%, and that is a pretty generous estimate. I don't know anyone who believes we will obviate the need for programming even in the best of reusable worlds, so invariably the market for software techniques will refocus on the pot of gold: reuse of analysis and design.

There are many reasons to think this is possible, many more than the often flimsy justifications given for investing heavily in code reuse. First, much of analysis and design is independent of particular machines and languages, but code is, at some point, always dependent on both. Second, tremendous industry-specific expertise is already available in academia, management consultancies, and within industries themselves. This expertise is tugging at the chains of current technology, trying to find new delivery vehicles. The same cannot be said of code; we don't do it very well to begin with and that which is done well isn't very well understood. Third, object-oriented analysis and design, despite their stutter steps in infancy, hold the potential to replace textual documents with diagrams. What's the big deal about diagrams? Look beyond the pretty pictures: those are a specialized form of data model, used to modularize analysis and design information. Once information is expressed in this form, it becomes possible to pick and choose pieces and reassemble them into new works, something that does not work well if one has to copy/paste paragraphs from a textual document.

Object Bases

One of the principal contributions of the data modeling era was to make practical a philosophy that data are really corporate assets, not just project assets. In fact, databases have been the only real success story in the world of code reuse and the success has been spectacular. Yet, in all the literature of data modeling one cannot find an adequate answer to a simple question: what's so special about data? What about processes? Can't they and shouldn't they, too, be treated as corporate assets? The answer to these questions is that data is special in only one sense: we've figured out how to consolidate data and the same traditionally hasn't been true of processes. But that is a limitation of our techniques, not anything inherent in data or processes. Object orientation goes a long way toward bringing sharing of processes up to the same level.

Object-oriented approaches have the potential to enable this rethinking of the role of process modeling in an organization. I call this the "corporate assets" model of object-oriented software engineering. Instead of databases we will soon see the emergence of corporate object bases. The problems should not be taken lightly: we still don't know how to optimize queries against a database of objects that may use polymorphism to change the ways values are computed or stored; methodologies have yet to really come to grips with this objective without compromising the benefits of object-orientation by using what has been derisively called "recycled information engineering." I think our own methodology, Solution-Based Modeling, has a good headstart, but there is much work to be done. But the potential is clearly there and it remains only to see more efforts directed at a concept that has enormous potential benefits.
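The query-optimization difficulty mentioned above can be illustrated with a small C++ sketch. The account classes are invented for the example: when the same "balance" is a stored field in one class but computed on demand in another, a query over the object base cannot simply scan a uniform column; it must dispatch per object:

```cpp
#include <memory>
#include <vector>

// Two classes answer the same query, one from storage, one by computation.
// An optimizer that indexes the stored field cannot know what the computed
// override will return without actually invoking it.
struct Account {
    virtual ~Account() = default;
    virtual double balance() const = 0;
};

struct StoredAccount : Account {
    double stored;
    explicit StoredAccount(double b) : stored(b) {}
    double balance() const override { return stored; }
};

struct ComputedAccount : Account {        // value derived, never stored
    double principal, rate;
    ComputedAccount(double p, double r) : principal(p), rate(r) {}
    double balance() const override { return principal * (1.0 + rate); }
};

// A "query" against the object base: polymorphic dispatch on every row.
int countOver(const std::vector<std::unique_ptr<Account>>& db, double min) {
    int n = 0;
    for (const auto& a : db)
        if (a->balance() > min) ++n;
    return n;
}
```

This is exactly the case a conventional query optimizer cannot see through, which is why object-base query optimization remains an open problem in the text above.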

Standards

Finally, I expect the future to bring a new emphasis on standards on several fronts, especially where object technology enables standards in related fields. How much longer can it be before we finally have a single, widely accepted standard for graphics environments? Other than the fact that some are 2D and others 3D, there really isn't enough difference in the major offerings of today to support several competing standards. Object orientation is likely to be a catalyst for this convergence because of its ability to conveniently implement abstractions of interfaces while preserving machine-specific details. It may be a while before similar standards take hold for multimedia software development, but those, too, can be expected to emerge and consolidate over time.

Graphical user interfaces are another area ripe for consolidation. There is a lot of sniping these days over competing GUIs but be honest, now, is there really that much difference, personal and corporate allegiances aside? Certainly not enough to support several competing standards in a world where most shops develop for more than one platform. Again, OOP is uniquely poised to drive competitors either together or out of business.

Object persistence and distribution over networks is another area that will eventually standardize, though progress may be slow until operating systems ship with de facto implementations. At that time, the few major approaches are likely to be made at least interoperable, if not identical.

No, I did not omit languages by mistake. It is the one area in which I do not expect to see standards emerge beyond the dominance of C++, the COBOL of the 90's. Languages have always evolved to address the problems of the day; as the problems change, so do the languages. This will not change. In particular, I expect decreasing emphasis on inheritance and more emphasis on delegation and aggregation, something not very well addressed by any of the major OOP languages of today. And, as I said earlier, visualization will play an increasing role in the way we create programs in the future.

How to Prepare for the 90's

Let me conclude with some observations about who the winners and losers will be should my crystal ball not prove too out of focus. Anytime a major change takes place, there is lots of money to be made in training, consulting, and publications. This will be as true for the coming generation of object-oriented technology as it was for databases in the 70's and 80's. Programming will be quite different, not just in content and process but in perception as well, and organizations will have to spend heavily on technology assimilation. The ones who will make the most money are the ones who figure out earliest how to price and sell products in the age of object soup. There are really no economic models for this market, so a few people will have to blaze the trail while the rest follow. However, as with most innovations, it is the people who follow closely behind the leaders who will gain the most, armed with business acumen and a keen eye for what is catching on. There may well be a couple of Bill Gates or Steve Jobs types waiting to reap their fortunes, but overall those Harvard MBAs always seem to make the most money while technical innovators get the pats on the back. To those of you who are technologists, I say, "Hire those MBAs before they hire you." Fortunes will be made in reusable, shrink-wrapped analysis and design "packages" expressed using the paradigm of objects, especially by those who figure out first how the market for such products will work. But above all, those who accurately read the deep-rooted conservatism of the industry will be far more successful than those who come charging in the front door with guns blazing. OOP will still be the Rodney Dangerfield of the industry; we'd better get used to it and learn to make money, anyway. All in all, it should be a very interesting decade.

Copyright © 1993 by Jeff Alger. All rights reserved.

 


How to get past Vulture Island's tr...
Vulture Island is a colorful and quirky mish-mash of platforming and puzzles. It’s creative and fresh, but sometimes the game can throw a curveball at you, leaving you stuck as to how you should progress. These tips will help you explore smoothly... | Read more »
The new Clash of Kings is just for Weste...
If you’ve played the original Clash of Kings, you’ll probably recognise the city building, alliance forging and strategic battles in Clash of Kings: The West. What sets this version apart is that it’s tailor made for a Western audience and the... | Read more »
Frost - Survival card game (Games)
Frost - Survival card game 1.12.1 Device: iOS Universal Category: Games Price: $3.99, Version: 1.12.1 (iTunes) Description: *Warning: the game will work on iPhone 5C and above and iPad Pro / 4. Other devices are not supported* | Read more »
How to build and care for your team in D...
Before you hit the trail and become a dog sledding legend, there’s actually a fair bit of prep work to be done. In Dog Sled Saga, you’re not only racing, you’re also building and caring for a team of furry friends. There’s a lot to consider—... | Read more »
How to win every race in Dog Sled Saga
If I had to guess, I’d say Dog Sled Saga is the most adorable racing game on the App Store right now. It’s a dog sled racing sim full of adorable, loyal puppies. Just look at those fluffy little tails wagging. Behind that cute, pixelated facade is... | Read more »
Let the war games commence in Gunship Ba...
Buzz Lightyear famously said, “This isn’t flying, this is falling – with style!” In the case of Gunship Battle: Second War, though, this really is flying - with style! The flight simulator app from Joycity puts you in control of 20 faithfully... | Read more »
How to get a high score in Fired Up
Fired Up is Noodlecake Games’ high score chasing, firefighting adventure. You take control of a wayward firefighter who propels himself up the side of a highrise with blasts of water. Sound silly? It is. It’s also pretty difficult. You can’t... | Read more »
NBA 2K17 (Games)
NBA 2K17 1.0 Device: iOS iPhone Category: Games Price: $7.99, Version: 1.0 (iTunes) Description: Following the record-breaking launch of NBA 2K16, the NBA 2K franchise continues to stake its claim as the most authentic sports video... | Read more »
Dog Sled Saga (Games)
Dog Sled Saga 1.0.1 Device: iOS Universal Category: Games Price: $3.99, Version: 1.0.1 (iTunes) Description: A game by Dan + Lisa As a rookie musher, foster a dogsledding team whose skills will grow if they're treated right. Week by... | Read more »
60 Seconds! Atomic Adventure (Games)
60 Seconds! Atomic Adventure 1.2 Device: iOS Universal Category: Games Price: $2.99, Version: 1.2 (iTunes) Description: 60 Seconds! is a dark comedy atomic adventure of scavenge and survival. Collect supplies and rescue your family... | Read more »

Price Scanner via MacPrices.net

21-inch iMacs on sale for up to $120 off MSRP
B&H Photo has 21″ iMacs on sale for up to $120 off MSRP including free shipping plus NY sales tax only: - 21″ 3.1GHz iMac 4K: $1379 $120 off MSRP - 21″ 2.8GHz iMac: $1199.99 $100 off MSRP - 21″ 1... Read more
13-inch 2.7GHz/256GB Retina MacBook Pro on sa...
Amazon.com has the 13″ 2.7GHz/256GB Retina Apple MacBook Pro on sale for $151 off MSRP including free shipping: - 13″ 2.7GHz/256GB Retina MacBook Pro (sku MF840LL/A): $1348 $151 off MSRP Read more
Apple TVs on sale for up to $50 off MSRP
Best Buy has 32GB and 64GB Apple TVs on sale for $40-$50 off MSRP on their online store. Choose free shipping or free local store pickup (if available). Sale prices for online orders only, in-store... Read more
Apple refurbished 13-inch Retina MacBook Pros...
Apple has Certified Refurbished 13″ Retina MacBook Pros available for up to $270 off the cost of new models. An Apple one-year warranty is included with each model, and shipping is free: - 13″ 2.7GHz... Read more
Duplicate Sweeper Free On Mac App Store For O...
To celebrate the launch of Apple’s latest macOS Sierra, Stafford, United Kingdom based Wide Angle Software has announced that its duplicate file finder software, Duplicate Sweeper, is now available... Read more
13-inch Retina MacBook Pros on sale for up to...
B&H Photo has 13″ Retina Apple MacBook Pros on sale for up to $150 off MSRP. Shipping is free, and B&H charges NY tax only: - 13″ 2.7GHz/128GB Retina MacBook Pro: $1174.99 $125 off MSRP - 13... Read more
Evidence Surfaces Pointing To New A10X Chip F...
Citing a job description for a Project Lead position at Apple’s Austin, Texas engineering labs, Motley Fool’s Ashraf Eassa deduces that development is progressing well on Apple’s next-generation in-... Read more
Check Print’R for macOS Allows Anyone to Easi...
Delaware-based Match Software has announced the release and immediate availability of Check Print’R 3.21, an important update to their easy-to-use check printing application for macOS. Check Print’R... Read more
Apple refurbished 11-inch MacBook Airs availa...
Apple has Certified Refurbished 11″ MacBook Airs (the latest models), available for up to $170 off the cost of new models. An Apple one-year warranty is included with each MacBook, and shipping is... Read more
Apple refurbished 15-inch Retina MacBook Pros...
Apple has Certified Refurbished 2015 15″ Retina MacBook Pros available for up to $380 off the cost of new models. An Apple one-year warranty is included with each model, and shipping is free: - 15″ 2... Read more

Jobs Board

Sr. *Apple* Mac Engineer - Net2Source Inc....
…staffing, training and technology. We have following position open with our client. Sr. Apple Mac Engineer6+ Months CTH Start date : 19th Sept Travelling Job If Read more
*Apple* Retail - Multiple Positions-Norfolk,...
Job Description: Sales Specialist - Retail Customer Service and Sales Transform Apple Store visitors into loyal Apple customers. When customers enter the store, Read more
Restaurant Manager (Neighborhood Captain) - A...
…in every aspect of daily operation. WHY YOU'LL LIKE IT: You'll be the Big Apple . You'll solve problems. You'll get to show your ability to handle the stress and Read more
Lead *Apple* Solutions Consultant - Apple (...
# Lead Apple Solutions Consultant Job Number: 51829230 Detroit, Michigan, United States Posted: Sep. 19, 2016 Weekly Hours: 40.00 **Job Summary** The Lead ASC is an Read more
US- *Apple* Store Leader Program - Apple (Un...
…Summary Learn and grow as you explore the art of leadership at the Apple Store. You'll master our retail business inside and out through training, hands-on Read more
All contents are Copyright 1984-2011 by Xplain Corporation. All rights reserved. Theme designed by Icreon.