
The Future of OOP: One Man's Crystal Ball

Jeff Alger

This year marks a turning point in the object-oriented community. A future that seemed so clear and so certain until recently is now clouded by a haze of conflicting products, markets, and philosophies. Although no one can claim to have perfect vision of the future, I will give it a try, and the right place to start is not by looking forward, but backward at the history of our industry. For, as the saying goes, "those who do not learn the lessons of history are doomed to repeat them."

Lessons from the History of OOP

In the 1970's, a small, talented group of researchers at a place called the Xerox Palo Alto Research Center, or PARC, invented the personal computer and with it, the first true object-oriented programming language, Smalltalk. Computer hardware wasn't far enough along for their invention to be practical. Xerox as a company wasn't ready for it. And there really wasn't much of a perception in the software community that there was a need for it, anyway. Remember that at that time database technology and information engineering were just coming into their own and minicomputers were sprouting like weeds. So the Star workstation and Smalltalk languished, in my opinion more because the market didn't see the need than because of any failure of Xerox to follow through.

Soon thereafter, a brash young entrepreneur named Steve Jobs visited the PARC labs and turned what he saw into the Lisa computer. It flopped, again because it was ahead of the curve on hardware and there was no groundswell of support for new ways of developing software. Even when the most egregious problems of the Lisa were corrected in the Macintosh, it was a hard sell for Apple Computer to convince the public that new techniques were needed, especially after IBM weighed in with its own personal computer. The IBM PC was certainly less imaginative than the Macintosh, but it was closer to traditional architectures both in terms of hardware and software, so it was easier for the systems community to get its hands around. A toy mainframe that sold because it reminded people of the real thing.

About the time of the Macintosh and Lisa, a small group within Apple recognized the potential, not just of the human interface and hardware aspects of what the PARC group had created, but of the software techniques they had used to program their machines. Object-oriented software seemed such a natural way to handle an iconic, event-driven graphical user interface that they created in succession Object Pascal, the Lisa Toolkit, the original MacApp, even an abortive Smalltalk for the Macintosh. There was even talk of object-oriented operating systems, objects from soup to nuts. Their work was only mildly influential in determining the course of Macintosh software development tools, as we saw a steady progression of procedural operating systems, languages, tools and techniques. How many people are aware that as recently as three and a half years ago, at a time when the then-MacApp Developers Association had about a thousand members worldwide, there were only two - count 'em - engineers in all of Apple working on MacApp?

Apple had the option of going object-oriented all the way at any time up to the advent of MultiFinder. There are those who disagree with me, but I and many others there at the time feel that had they done so, there would have been no need for a Taligent today and we would perhaps already be programming to the tune of Rhapsody in Pink. But even within Apple, champion of new ideas, counterculture of the computer industry, conservatism won; there were just too many people who did not even see the problems, let alone the value of object-oriented solutions.

But I'm getting ahead of myself. Across the continent, squirreled away in an obscure corner of Bell Telephone Laboratories, a fellow by the name of Bjarne Stroustrup spent his time writing software to simulate… exactly what, I don't know, but the Labs have always done a lot of simulation work. He used a language called Simula, an offshoot of Algol designed specifically to simulate real-time stochastic processes and arguably one of the first true object-oriented languages, before that term had even been coined. But Bell Labs did other things beyond simulations. And like any large organization, they developed software in a variety of languages: COBOL, Fortran, and lots of obscure languages like Simula and Snobol. The breakup of the phone system was forcing all of AT&T to think about new ways of making money, and Unix looked a good bet. But there was a problem: how could AT&T tell everyone else in the world to use Unix and its companion language, C, if they themselves weren't? So the order came down from on high: henceforth, all software will be written in C. Now, I have heard conflicting stories as to whether this order directly prompted Stroustrup to migrate to C or whether it merely built a critical mass that made C more acceptable as a delivery vehicle. It doesn't matter, because AT&T just wasn't interested in any language that wasn't C. C++, like most other object-oriented innovations, was ignored by its own company. Today there are many within AT&T who speak of the fish that got away.

Wherever you look, the history of object-oriented technology has not been pretty. Microsoft was smart enough to recognize an object-oriented image problem when they saw it. Windows is as object-oriented as one can get without an object-oriented language, yet nowhere in the early literature of Windows was the term "object" even used. They recognized the merits of the approach but realized that few others did. Operating system designers have been doing what could arguably be called object-oriented programming - attaching function vectors to packets of data - since the 50's, but they remain, perhaps for the very reason that they've done so well without them, curiously skeptical of the need for object-oriented languages. Computer scientists have been pushing the use of abstract data types - encapsulating data behind functional interfaces - for decades, but no one in the commercial arena has been there to listen, with the singular exception of Ada. Even in the rarefied world of databases, where challenges to the data-driven approach are quickly shown the door, the trend has been strongly toward what arguably could be called object-oriented architectures. Call them triggers, stored procedures, or what have you, but the fact remains that modern data modeling requires associating functions with packets of data, the core concept of OOP. Yet few in the database community are willing to call a spade a spade: most so-called "object-oriented databases" are, in fact, nothing more than glorified network or relational models that support complex data types, and many so-called "object-oriented methodologies" are nothing more than recycled information engineering.

Even today, with magazines, training classes, programming languages, conferences, college curricula and associations devoted to object-oriented technology, it is easy to get fooled into thinking that the war is won and that the ramparts themselves are now object-oriented. The problem is that OOPers tend to talk mostly to other OOPers and forget how much resistance there is to the idea outside our own mutual support group. Well, as one who advises companies on these issues, I can tell you that outside our own ranks OOP is still viewed as either snake oil or a silver bullet, but not as a practical tool for solving everyday problems.

Throughout its history, OOP has been the Rodney Dangerfield of software: it gets no respect.

There are, of course, reasons for this history, and they are important to understand if we are to anticipate the future, for there is little reason to think they are about to change after all this time. Throughout, little emphasis has been placed on solving real problems that translate to real market share. That is, the OOP community has tended to be internally focused, developing great ideas and products and only then trying to convince the world that there is a need for them. Accompanying this has been more than a little arrogance, especially in waving off the very real concerns of managers everywhere: integration with existing systems and techniques; leveraging skill sets already in place; having measurable, controllable and repeatable processes rather than a few smart people locked in a room arguing with each other. In Marketing 101 they teach you what happens when you try to bully the market. I am reminded of the story of the then-Chairman of the Great Atlantic and Pacific Tea Company being approached earlier in this century about sponsoring a Sunday afternoon radio show. He declined, saying that he doubted anyone would listen during that time slot. After all, everyone he knew played polo on Sunday afternoon. Well, everyone I know in the object-oriented community thinks polymorphism is really important.

Wild, unsubstantiated and often patently false claims have been made about the benefits of object orientation. Where, for example, are the case studies of large-scale code reuse to back up all the popular literature on the subject? Why, if this is so "natural," did the organizers of OOPSLA a couple of years ago feel compelled to hold a panel on the subject, "Why Is Object-Oriented So Difficult?" And why, if the payback is so quick and dramatic, is it quietly understood in the OOP community that it takes two years to develop a good object-oriented engineer?

We have done a very poor job of articulating why technology managers should believe us when, like Charlie Brown running up to kick the football every fall only to have Lucy once again yank it away at the last second, they have been consistently let down by other, similar claims in the past. A good friend of mine, John Brugge of IDS Financial Services, circulated a paper in his company explaining the relative merits of object-oriented technology. It spoke of dramatic increases in productivity, lower maintenance costs, better results, higher quality. Reading the paper at his urging, I found it quite mainstream, the sorts of claims to be found throughout the literature on the subject. After five pages, however, the paper broke off in mid-sentence: "…I can't go on with this. This really is not my paper." He explained that the paper was, in fact, from a book by Edward Yourdon from the 1970s that dealt with the structured programming revolution; John had literally done a bulk search-and-replace of "object-oriented" for "structured" and otherwise left the wording unchanged. Little wonder that we are viewed with suspicion.

Another problem has been a lack of focus. 75% of development costs, and an even higher percentage of software lifecycle costs, are tied up in analysis and design, but, as Yourdon points out in his new book, "Decline and Fall of the American Programmer," the OOP community seems stuck in the backwaters of code. Even within that limited domain the focus has tended to be more on piling feature after feature into the syntax of languages while giving short shrift to the problems that really consume programmer time: memory management, debugging, object persistence ("you mean you actually want to store your data?") and integration with non-OOP technologies. Even worse, OOP has often been the excuse used for working without any methodology whatsoever. I call this the "Brilliant Pebbles" approach to software development, named after the Strategic Defense Initiative program of the same name, because it relies on training a few smart people to hurl themselves at oncoming problems. Little wonder that people don't want to trust us with critical projects.

Finally, and to me this is the most important, the OOP community seems to lurch about from place to place because it is largely blind to the real reason it exists. Have you ever stopped to wonder why all major components of the software community - systems programming, simulations, theory of algorithms, database, artificial intelligence and human factors - have drifted in the same direction independently of one another? It was not until the advent of C++ that there was even much talk between these groups, and then it was to argue over details. It cannot be random chance. Clearly there is some underlying force guiding this evolution, and it is squarely this: our systems and systems development efforts are so complex today that they are limited by grayware, the stuff of the human. Put simply, we have reached the point where our own ability to understand what we are doing and what we have done is the principal constraint on software development. To quote Pogo, "We have met the enemy, and he is us." Yet this isn't a very comfortable subject to talk about, because it takes us out of the ivory tower world of formal grammars and correctness proofs and into the various "soft" disciplines that come under the general heading "Cognitive Science." And when we look under that rock, we may see something that makes us uncomfortable. Maybe, just maybe, object-orientation isn't really as natural as all that, even if it is an improvement. Maybe, just maybe, the history of computer science did not, after all, stop with the advent of objects. Maybe we've just positioned ourselves for even harder work to come.

It is important to take this look backward at the history of object-oriented technologies to understand where we are likely to go in the future. There is by now enough experience with the technology to see many of its limits. Eventually, market forces will be heard whether we listen actively or cover our ears, and that will force the OOP industry to face a series of problems we've collectively swept under the rug until now.

Problem-solving Focus

First and foremost in the coming years the OOP community will become more focused on customers and problems, not technology. It has to. There are only so many executives and purchasing managers willing to buy technology and stunning leaps of faith, and as one who advises them I can tell you that even they are starting to turn up their radios to mask the noise. There are many ways to segment the potential market for OOP: small vs. large, commercial products vs. in-house, standalone systems vs. networks and groupware, GUI vs. … what is the antonym for GUI, anyway, "slick"? …, leading edge vs. mainstream, data-intensive vs. process-intensive… I could go on for some time without pausing for breath. In each case there are clearly identifiable customers and needs that we can and will address without dwelling so much on what Neal Goldstein and I like to call OOTB: object-oriented technobabble.

There are two specific companies I would like to single out in this regard. There is much that I admire about the way Taligent is approaching the future, but I think their greatest strength is their grasp of these market forces. Ten years from now when the definitive history of Taligent is written, whatever success they achieve will be attributed more to their business acumen than their technology. Tiny Component Software is another company that has done a good job of positioning themselves as solvers of specific business problems, rather than as a vendor of "YAOOL": "yet another object-oriented language." It is companies like these that we should study and emulate.

The Problem With Classes

As the OOP community enters the second phase of its business cycle, there is growing recognition that classes, inheritance and polymorphism may not be the miracle cures many thought them to be. There is increasing talk of the more general mechanisms of delegation and aggregation. Deep class hierarchies are definitely passé. Inheritance brings with it inherent problems with modularity and code reuse when compared to delegation models.
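To make the tradeoff concrete, here is a minimal C++ sketch of the two styles. All of the class names are invented for illustration; the point is only that the inheriting class is bound at compile time to its parent's implementation, while the delegating class merely forwards to a collaborator that can be replaced at run time.

#include <iostream>

// Inheritance: ScrollingView is welded at compile time to View's
// implementation; any change to View's internals can break it.
class View {
public:
    virtual ~View() {}
    virtual void draw() { std::cout << "plain view\n"; }
};

class ScrollingView : public View {
public:
    void draw() { View::draw(); std::cout << "...with scroll bars\n"; }
};

// Delegation: Scroller forwards draw() to whatever Drawable it holds.
// The collaborator can be swapped while the program runs, and the two
// classes can evolve independently of one another.
class Drawable {
public:
    virtual ~Drawable() {}
    virtual void draw() = 0;
};

class TextPane : public Drawable {
public:
    void draw() { std::cout << "text pane\n"; }
};

class Scroller {
    Drawable* delegate;   // not owned, in this sketch
public:
    Scroller(Drawable* d) : delegate(d) {}
    void setDelegate(Drawable* d) { delegate = d; }   // rebind on the fly
    void draw() { delegate->draw(); }                 // pure forwarding
};

int main() {
    TextPane pane;
    Scroller scroller(&pane);
    scroller.draw();   // prints "text pane" via delegation, not inheritance
    return 0;
}

The delegation version pays for its looser coupling with an extra object and an extra indirection, which is precisely the tradeoff at issue.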

Classification is also losing its luster as a way of building semantic models, not least because the hard evidence in the cognitive science community is that grouping things according to the properties they share - the foundation of classes - is a distinctly unnatural way for humans to describe their world.

At the same time, we are rapidly coming to a good understanding of how humans really group concepts, a subject known as "categorization" in the cognitive science community. Our groupings turn out to be far from random. They are consistent enough to allow children all over the world to learn languages in pretty much the same ways and to allow for a large number of what the linguists call "universals," grammatical constructs common to all known languages. Their structures are much more complex than simple classification can easily represent, but that should be no surprise; the artificial intelligence community came to that conclusion in the 1970's. The object-oriented community just hasn't bothered, it would seem, to hit the literature before asserting the "naturalness" of classification. But the real point here is that as we learn more about human cognitive processes, we will also find ways to more directly represent them in software. Objects, but not classes, will play an important role in that next step.

Visualization

Another frontier of object-oriented and non-object-oriented software development is to learn how to properly visualize information. I must betray a bias here, for much of my own work has been in this area. To develop our own notational system, the Visual Design Language, Neal Goldstein and I did what we felt was the natural thing: we engaged the help of an expert in graphic communication. The entire emphasis from day one was on communicating. Not painting pretty pictures (though that turns out to be important to comprehension and retention) and not expressing every technical concept that comes along, but real communication. That means minimizing loss and distortion of information and increasing comprehension and retention rates for project documents. One can easily show by experiment that comprehension and retention of typical software documentation is an abysmal 10%-20% for most forms. We must not just capture information, but also make it understandable and useful. One of my favorite anecdotes is of the kindergartner who came home from school one day and announced excitedly, "Mommy, I learned how to write in school today!" "That's great," replied the proud mother. "What did you write?" "I don't know, I can't read yet." In software development, we are great at accumulating documents we can't read or understand.

This is almost pure grayware. And it is hardly rocket science to trace the sort of path Neal and I took. There are entire industries based on the premise that one can communicate better through visualization.

To give you some idea of how primitive software notational systems generally are by the standards of the experts, let me relate our first meeting with our graphics consultants. We came to the session armed with what we thought were the best features of several leading object-oriented notational systems. I won't name names because my point is not to embarrass particular competing approaches, but you can well imagine what such a short list would look like. Our graphics consultants looked hard at them and… laughed. Really. They accused us of deliberately presenting the worst possible visualizations from the competition and wanted something more representative. We had already chosen what we thought were the best! As we worked with them I came to understand their perspective for myself. I also came to appreciate that they are at least as good at their job, visual communication, as we are at ours.

But before we can even get to visualizations, the industry must recognize the need. Isn't it remarkable that browsers for object-oriented development tools (for that matter, most software development tools) are still based on text? We know that people deal poorly with text as a general rule. The time is drawing near when we will no longer consider ourselves exempt from the rules that seem to guide the rest of the human race in this regard. Already there are some noteworthy contributions to the visualization of software: Microsoft's Visual Basic and Prograph. There is also no end of notational systems for analysis and design, object-oriented and not. But they have yet to take the one step that really counts: go to the experts in graphic communication and ask their advice.

Death of the Application

One common feature of advanced dynamic languages such as Smalltalk and Lisp is that one does not so much develop an "application" as extend the one, the only, "environment." Take Macintosh Common Lisp as a good example. An "application" is really nothing more than an alternative menu bar within the development environment. For that matter, you can create an arbitrary number of "applications" by this same stratagem. The application as a unit of software engineering disappears in this environment, leaving only user interface paradigms to justify the term.

This is powerful as far as it goes, but imagine now entire machines and even networks in which there are no applications per se, just bundles of objects firing off messages to one another. A world where even menu bars and windows are assembled from components on the fly, discarding the last vestige of applicationness. I call this environment "object soup," a vast sea of memory thinly populated by occasional lumps we call objects.
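To give the flavor of the idea, here is a rough C++ sketch; the names and the registry design are mine, not any shipping system's. The "soup" is nothing but a run-time registry of component factories, and even a "menu bar" is just whatever happens to be in the soup when you ask:

#include <iostream>
#include <map>
#include <string>
#include <vector>

// A hypothetical inhabitant of the soup: anything that answers a message.
class Component {
public:
    virtual ~Component() {}
    virtual void receive(const std::string& msg) = 0;
};

class SpellChecker : public Component {
public:
    void receive(const std::string& msg) { std::cout << "checking: " << msg << "\n"; }
};

// The soup itself: named factories that can be registered, replaced,
// or removed while the system runs. No applications anywhere.
typedef Component* (*Factory)();

std::map<std::string, Factory>& soup() {
    static std::map<std::string, Factory> s;
    return s;
}

Component* makeSpellChecker() { return new SpellChecker; }

int main() {
    soup()["spell"] = &makeSpellChecker;   // poured in at run time

    // A "menu bar" assembled on the fly from whatever is in the soup.
    std::vector<Component*> menuBar;
    for (std::map<std::string, Factory>::iterator it = soup().begin();
         it != soup().end(); ++it)
        menuBar.push_back(it->second());

    for (unsigned i = 0; i < menuBar.size(); ++i) {
        menuBar[i]->receive("hello");
        delete menuBar[i];
    }
    return 0;
}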

Well, get ready, because this is coming faster than you think. There are already several companies at work on this, and the first generation of real object-oriented operating systems is just over the horizon.

This reveals the application for what it has always been: an anachronism of our operating systems. But before you get too excited, stop and think for a moment about what kind of problems this approach might invite along to the dinner party. Throughout the history of the computer industry, we have set budgets, organized projects and deliverables, trained and rewarded employees, and defined products according to this single concept of the "application." To drop the application like a hot rock just isn't realistic, if for no other reason than that we have nothing to put in its place. Instead, we are in for a prolonged and wrenching period of readjustment, in which we must rethink our concepts of methodology, project management, software economics and the software lifecycle. It is certain that many people will be left behind by this sea change and that huge and powerful entrenched interests will oppose it altogether. (I stole that from a recent Wall Street Journal editorial, but I think it appropriate here.) It is not even clear that object soup can win in the end, though I think it can. This is the classic yin and yang of new technology: the new problems will hurt many but also bring tremendous opportunities for innovation and entrepreneurship to those who can handle the change.

Death of the Linker

Speaking of anachronisms, why do we still have linkers? Especially in the world of objects, where there are so many ways to dynamically link one object or class to the rest. The traditional coding cycle - run/test/change source/compile/link/go for a walk/run again - just doesn't make any sense anymore. The technology of dynamic languages has progressed to the point where traditional objections have been or soon will be silenced: performance, application size, and shrink-wrapping of applications. It will become increasingly difficult for development tools vendors to justify why they, too, haven't caught the dynamic wave. I give the linker only a few more years.
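On systems that support it, you can already see where this is heading: code can be bound to a running program long after the traditional link step, with no linker in sight. Here is a sketch using the POSIX dlopen interface; the library name and the factory symbol are hypothetical, and you would link with -ldl on most Unix systems.

#include <dlfcn.h>   // POSIX dynamic loading: dlopen, dlsym, dlclose
#include <cstdio>

// Hypothetical plug-in: a shared library exporting a factory function
// named "create_tool". Nothing about it is known at link time.
typedef void* (*ToolFactory)();

int main() {
    void* lib = dlopen("./libtool.so", RTLD_NOW);   // illustrative path
    if (lib == 0) {
        std::fprintf(stderr, "%s\n", dlerror());
        return 1;
    }
    ToolFactory create = (ToolFactory)dlsym(lib, "create_tool");
    if (create != 0) {
        void* tool = create();   // a live object, linked in after the fact
        (void)tool;              // ...hand it off to the rest of the system
    }
    dlclose(lib);
    return 0;
}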

Reuse (of Analysis and Design, That is)

Way too much attention has been directed at the idea of code reuse. Who cares? Coding is a period in the novel of software development, and it's shrinking all the time. The bulk of software development dollars go to analysis, design and maintenance, not code. Even if we completely eliminated the need for code, it would only reduce overall lifecycle costs by perhaps 10%-15%, and that is a pretty generous estimate. I don't know anyone who believes we will obviate the need for programming even in the best of reusable worlds, so invariably the market for software techniques will refocus on the pot of gold: reuse of analysis and design.

There are many reasons to think this is possible, many more than the often flimsy justifications given for investing heavily in code reuse. First, much of analysis and design is independent of particular machines and languages, but code is, at some point, always dependent on both. Second, tremendous industry-specific expertise is already available in academia, management consultancies, and within industries themselves. This expertise is tugging at the chains of current technology, trying to find new delivery vehicles. The same cannot be said of code; we don't do it very well to begin with and that which is done well isn't very well understood. Third, object-oriented analysis and design, despite their stutter steps in infancy, hold the potential to replace textual documents with diagrams. What's the big deal about diagrams? Look beyond the pretty pictures: those are a specialized form of data model, used to modularize analysis and design information. Once information is expressed in this form, it becomes possible to pick and choose pieces and reassemble them into new works, something that does not work well if one has to copy/paste paragraphs from a textual document.

Object Bases

One of the principal contributions of the data modeling era was to make practical a philosophy that data are really corporate assets, not just project assets. In fact, databases have been the only real success story in the world of code reuse, and the success has been spectacular. Yet in all the literature of data modeling one cannot find an adequate answer to a simple question: what's so special about data? What about processes? Can't they, and shouldn't they, too, be treated as corporate assets? The answer to these questions is that data is special in only one sense: we've figured out how to consolidate data, and the same traditionally hasn't been true of processes. But that is a limitation of our techniques, not anything inherent in data or processes. Object orientation goes a long way toward bringing the sharing of processes up to the same level.

Object-oriented approaches have the potential to enable this rethinking of the role of process modeling in an organization. I call this the "corporate assets" model of object-oriented software engineering. Instead of databases we will soon see the emergence of corporate object bases. The problems should not be taken lightly: we still don't know how to optimize queries against a database of objects that may use polymorphism to change the way values are computed or stored, and methodologies have yet to really come to grips with this objective without compromising the benefits of object orientation by using what has been derisively called "recycled information engineering." I think our own methodology, Solution-Based Modeling, has a good head start, but there is much work to be done. But the potential is clearly there, and it remains only to see more efforts directed at a concept that has enormous potential benefits.
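A tiny sketch shows why the optimizer's job gets hard; the classes here are invented for illustration. When value() is a stored field in one subclass but a computation in another, no conventional column index covers the predicate, and the only honest query plan is a full scan that sends the message to every object:

#include <iostream>
#include <vector>

class Asset {
public:
    virtual ~Asset() {}
    virtual double value() const = 0;   // the "column" the query filters on
};

class StoredAsset : public Asset {
    double v;
public:
    StoredAsset(double v0) : v(v0) {}
    double value() const { return v; }   // indexable, in principle
};

class ComputedAsset : public Asset {
    double principal, rate;
public:
    ComputedAsset(double p, double r) : principal(p), rate(r) {}
    double value() const { return principal * (1.0 + rate); }   // opaque to any optimizer
};

// "SELECT ... WHERE value > threshold" degenerates to a scan plus a
// virtual function call per object.
std::vector<Asset*> query(const std::vector<Asset*>& assets, double threshold) {
    std::vector<Asset*> hits;
    for (unsigned i = 0; i < assets.size(); ++i)
        if (assets[i]->value() > threshold)
            hits.push_back(assets[i]);
    return hits;
}

int main() {
    std::vector<Asset*> db;
    db.push_back(new StoredAsset(100.0));
    db.push_back(new ComputedAsset(80.0, 0.5));   // value() == 120.0
    std::cout << query(db, 90.0).size() << " hits\n";   // prints "2 hits"
    for (unsigned i = 0; i < db.size(); ++i) delete db[i];
    return 0;
}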

Standards

Finally, I expect the future to bring a new emphasis on standards on several fronts, especially where object technology enables standards in related fields. How much longer can it be before we finally have a single, widely accepted standard for graphics environments? Other than the fact that some are 2D and others 3D, there really isn't enough difference in the major offerings of today to support several competing standards. Object orientation is likely to be a catalyst for this convergence because of its ability to conveniently implement abstractions of interfaces while preserving machine-specific details. It may be a while before similar standards take hold for multimedia software development, but those, too, can be expected to emerge and consolidate over time.
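The mechanism is the familiar one: an abstract interface in front of machine-specific detail. A minimal sketch, with all names invented rather than taken from any vendor's actual API:

// An abstract drawing surface: a standard would specify this interface.
class Canvas {
public:
    virtual ~Canvas() {}
    virtual void moveTo(int x, int y) = 0;
    virtual void lineTo(int x, int y) = 0;
};

// Each platform supplies its own subclass behind the common interface.
class MacCanvas : public Canvas {
public:
    void moveTo(int x, int y) { /* call the native toolbox here */ }
    void lineTo(int x, int y) { /* ... */ }
};

class XCanvas : public Canvas {
public:
    void moveTo(int x, int y) { /* call Xlib here */ }
    void lineTo(int x, int y) { /* ... */ }
};

// Portable code sees only the abstraction.
void drawBox(Canvas& c, int w, int h) {
    c.moveTo(0, 0);
    c.lineTo(w, 0);
    c.lineTo(w, h);
    c.lineTo(0, h);
    c.lineTo(0, 0);
}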

Graphical user interfaces are another area ripe for consolidation. There is a lot of sniping these days over competing GUIs but be honest, now, is there really that much difference, personal and corporate allegiances aside? Certainly not enough to support several competing standards in a world where most shops develop for more than one platform. Again, OOP is uniquely poised to drive competitors either together or out of business.

Object persistence and distribution over networks is another area that will eventually standardize, though progress may be slow until operating systems ship with de facto solutions. At that point, the few major approaches are likely to be made at least interoperable, if not identical.

No, I did not omit languages by mistake. It is the one area in which I do not expect to see standards emerge beyond the dominance of C++, the COBOL of the 90's. Languages have always evolved to address the problems of the day; as the problems change, so do the languages. This will not change. In particular, I expect decreasing emphasis on inheritance and more emphasis on delegation and aggregation, something not very well addressed by any of the major OOP languages of today. And, as I said earlier, visualization will play an increasing role in the way we create programs in the future.

How to Prepare for the 90's

Let me conclude with some observations about who the winners and losers will be, should my crystal ball not prove too out of focus. Anytime a major change takes place, there is lots of money to be made in training, consulting, and publications. This will be as true for the coming generation of object-oriented technology as it was for databases in the 70's and 80's. Programming will be quite different, not just in content and process but in perception as well, and organizations will have to spend heavily on technology assimilation. The ones who will make the most money are the ones who figure out earliest how to price and sell products in the age of object soup. There are really no economic models for this market, so a few people will have to blaze the trail while the rest follow. However, as with most innovations, it is the people who follow closely behind the leaders who will gain the most, armed with business acumen and a keen eye for what is catching on. There may well be another Bill Gates or Steve Jobs waiting to reap a fortune, but overall those Harvard MBAs always seem to make the most money while technical innovators get the pats on the back. To those of you who are technologists, I say, "Hire those MBAs before they hire you."

Fortunes will be made in reusable, shrink-wrapped analysis and design "packages" expressed using the paradigm of objects, especially by those who figure out first how the market for such products will work. But above all, those who accurately read the deep-rooted conservatism of the industry will be far more successful than those who come charging in the front door with guns blazing. OOP will still be the Rodney Dangerfield of the industry; we'd better get used to it and learn to make money anyway. All in all, it should be a very interesting decade.

Copyright © 1993 by Jeff Alger. All rights reserved.

 
