Legacy Kills

From iGeek
Revision as of 22:29, 8 May 2018 by Ari (talk | contribs) (1 revision imported)

Ahh Legacy, being haunted by your own past. Sometimes great, always painful, a product's legacy can be both its greatest strength (name recognition, customers, trained users, and so on) and its greatest weakness (cruft, resistance to change, a millstone to drag around, etc.). While it is great to have a customer base paying for your R&D, it can also suck when you're trying to adapt to a changing market or changing technology: legacy can suffocate companies and products under its own weight.

I've seen many people create, and have personally created, systems at smaller companies with smaller staffs that larger companies with far more resources couldn't pull off. That seems to make no sense, and people ask, "how can that happen?" The answer is legacy!

Legacy has some advantages in the technology sector -- but it has some real disadvantages as well. People talk about market share and how good it is, but they forget the downsides. So that's what I'm going to do -- point out the many downsides.

Why does it happen?

Imagine you are a huge company doing something, and you have an installed base and developed technology. Now you are creating the next version of your "solution". All design and development gets focused on your current customer base and on "redoing" what you've done before (only a little better). Internally, companies are always warring (like feudal kingdoms and royal politics), and it isn't long before the group that wants to do the least amount of work wins. So soon your "revolutionary" change becomes "evolutionary enhancements" -- and your redesign becomes a minor feature add. In the end, very little that is significantly "new" can really happen in companies with too much legacy or momentum.

  • Q. Why was God able to create the Universe in only 7 days?
  • A. He didn't have to deal with an installed user base.


This happens all the time -- where the amount of change in a revision slowly gets whittled down until the revolutionary new update is just a minor bug fix, and a couple of dumb features that made it past marketing. The real innovation either never happens, takes decades, or only happens when the big (bloated) companies buy someone else who has done the work for them.

Microsoft

Look at Microsoft and track their "innovations" of the last 10 or 15 years. I bet 95% of everything you know as something "new" from Microsoft actually came from someone else:

  • Internet Explorer is a few features stuffed on top of Spyglass's browser.
  • Most of the features of most of the apps Microsoft puts out were borrowed or bought -- the spellchecker, grammar checker, and thesaurus in Word, for example.
  • Almost every Office App was started somewhere else (and bought or borrowed) -- at least Word, Excel and PowerPoint were.
  • The tools Microsoft used to create those products were the same that they used to create their early languages (which they also got somewhere else).
  • Core parts of Windows were "acquired" technology.
  • Windows 98 was just a bug fix on Windows 95, which borrowed from Windows NT or from Windows 3.1, which borrowed from DOS.
  • Most of DOS was bought (or stolen) from others.
  • NT was part Dave Cutler recreating VMS for Microsoft, flavored with borrowed Windows, DOS and Xenix code.
  • Visual Basic -- bought (and sort of a copy of HyperCard and MacBasic). In fact all of Microsoft's languages were bought (or borrowed).

The bigger companies get, the harder it is to make real changes from within -- and the easier it is just to try to gobble up others that do something for their money.

Intel

If you want to see real bloat and lack of innovation, you need only look at Intel. Those poor saps have had the same pathetic ISA (Instruction Set Architecture) for 20+ years. They can bolt more on (minor changes) -- but their legacy has been holding them back big time. If it weren't for the fact that they were big (and in a monopoly position), they would have been dead long ago. They first talked about a P7 core (with RISC-like instructions) in 1994, targeted for a 1997 release. Whoops -- missed that goal by a little. I remember the 1998 goal, the 1999 goal, the 2000 goal; finally, the P7/Merced shipped as the Itanium (or Itanic, as I like to call it) in late 2001, and the 2002 versions (McKinley) aren't yet delivering on the 1997 promises. Why? Their legacy has eaten them alive.

Intel is big and bloated, and all that most people in the company knew how to do was design x86s -- throw them onto something new and they can't handle it; the whole company goes into chaos and "defensive mode" -- but they are still churning out those old x86s (in 27 flavors). Not only that, I'm sure there is a huge amount of infighting where some groups (the ones making the money right now) don't WANT to be obsoleted by something new -- so they do their best to make sure that doesn't happen (or doesn't happen quickly). Oh, and that x86 instruction set is mostly borrowed from their 8080, which was borrowed from the 8008, which was a fat version of the first microprocessor, the 4004. It isn't like there haven't been things that could have changed for the better, or that others weren't able to do them. Legacy becomes the millstone around their neck -- and no one thinks about solving the problem anew, just minorly improving the solution to the old problem. Don't get me wrong, they've done a good job of bringing that legacy forward, and made money hand over fist. But if you want to innovate, Intel is not the place to do it.

Apple

Apple certainly isn't immune. The legacy problems with Mac OS (and many APIs) have kept some fundamental OS changes from happening as quickly as they should have -- and Apple has its own rat's nest of political problems and groups fighting other groups. Even when the Mac was created, the Apple II people resented the Mac group (because they knew the Mac group was trying to obsolete them) -- in many companies the Mac project would have been killed, and even at Apple it almost was, a few times.

Smaller has advantages

Smaller companies (in general) do better against legacy -- they have less invested in the past, and more to lose if they don't grow and change and adapt. There aren't as many layers and people to go through (so the political mire isn't as deep) -- people communicate faster and see the results (good and bad). So they have more "innovate or die" pressure. Also, they usually just don't have legacy to build on, so they have to create. They can't just eat up competitors, so they have to do it themselves. This is why many small companies stay "hungry" and innovate (it is do or die) -- but the bigger you are, the harder it is to do, since you have an "out" (ye olde acquisition), an excuse (our customers want the same old thing), or politics (action committees and task forces meant to stall for time until it doesn't matter any more and you have to buy out the competition anyway).

Why does it matter?


Imagine you are trying to do something new -- whatever that is. Say you want to make a clean new light OS: you will almost never succeed inside a big company. All the legacy groups inside the company will fight over who gets the biggest piece of the pie -- but they know enough to gang up and kill the new intruder (who threatens to take the whole thing). Pick on the small and weak first! The enemy of my enemy is my friend (or close enough for corporate politics) -- and intruders are eliminated so they can get back to the enemies they know, like the other groups or the customers.

Even in bigger companies where you do get innovation, it is usually some small rogue group (that was able to slip below the corporate radar) that does the real innovating. IBM's PC was a little group in Boca Raton, Florida, far away from the corporate mucky-mucks in New York. The Mac group at Apple was a relatively small team. QuickTime was a small team. Some of the bigger teams (AOCE, QuickDraw GX, Copland/Maxwell) and other high-profile projects flopped -- they had too much interest, and too many goals (and chefs in the kitchen), to actually focus on any one thing. The same happened with most of IBM's PC efforts. If you are working on new things, you need a few smart people; if you are supporting and working with legacy systems, you need a lot more bodies, but they don't have to be as creative and smart.

Of course for the innovation to matter you have to shoot for the right goal -- and that's probably a bigger problem and a different article.

Focus

The real problem is with goals (and focus) -- where is something going, what are you shooting for? Companies only see problems as variants of their old solutions, or as fitting the corporate mantras. Again, Intel sees a way to make and sell more chips. Microsoft sees opportunities as a way to sell more OS licenses. Vertical companies with legacy see ways to add market share to the same old legacy systems that they know. Few know how to separate out a new goal and break off a new market. Most make hybrids, or let those new directions get killed by all the rest. They don't have the focus on the new problem.

Conclusion

The end result is that legacy (and market share) is a good thing, in that it is customers, security, a market, a revenue base, and so on. But it is also a bad thing. Legacy is an investment in the past (and not the present or future). Legacy means that your designs will be designed toward that past -- and there are many sacred cows (wise or not). When something new comes along, you are mucked up in the mire of the past, and can't easily utilize that new innovation -- you must stick with the old way (because of all the investment in that old way inside the company). Right and wrong don't matter -- what matters is the legacy/momentum itself.

Look at USB on the Macintosh. PCs started talking about USB in 1993, got some cards in 1994, and got it on motherboards in 1995 or 1996. In mid-1998 they got some OS support. Apple added USB to all iMacs in 1998 and all Macs a year later, and in two years the Macs did more to get USB really off the ground and accepted than the previous 5 years of the PC had. Look at OS X: Apple is completing a transition from one OS to another in 2-3 years. It has taken Microsoft 9 years from the start of NT, and they still haven't completed the transition as far as Apple has. Apple didn't have the commitment to its legacy that the PC market has. The smaller size of the Mac market enabled Apple (Mac) to be far more adaptive than the PC.

Size and legacy often work against companies -- they not only get the size and momentum of a supertanker, they get the turning radius of one as well. While others are able to zip around, make changes, and adapt to new things and new markets, it takes many miles for these behemoths to slow down -- let alone contemplate a change in direction. The trouble is that many things in technology are a slalom (constantly changing market opportunities). Big companies can't maneuver -- but at least they can keep plowing into (and running over) the smaller companies, or they can keep trying to buy the winners.

But maneuverability only works to a point -- there are many companies that are big today but were small yesterday. The big boys can't eat them all -- and many of the little boys sometimes offer themselves up on a silver platter before they are eaten (for the quick buck). Other companies just get crushed, or they maneuver themselves into oblivion; see Be, Amiga, and many others that not only zipped here and there, but kept seeming to dodge a customer base or viable business plan.

So new companies that have no legacy (or are willing to break with it) are able to do incredible things that "old"-thinking companies cannot; but that doesn't guarantee success. I am not surprised when teams of 10 or 20 programmers produce more than what takes many hundreds of engineers elsewhere. I am not surprised that smaller companies can jump into new markets, or gobble up old markets, better than the big boys. I am more surprised that people don't realize it, don't acknowledge that it happens all the time, or don't know why.


Written 2002.10.29