The Big Picture or Down in the Weeds?

Chris Evans | Enterprise, Opinion

Warning – this is a pure opinion piece, with no discussion on specific technologies!

Over my 35 professional years in IT (over 40 if you count messing about with computers as a teenager), I’ve seen many transitions, big and small.  I like to watch the big-picture changes in our industry and also follow the detail, because both help us predict where we’ll head next.

It’s very easy to be focused on a single topic area in IT these days.  Compared to the 1960s, when it was just about possible to know everything about computers, today’s IT professionals can’t keep up with the speed of innovation across every aspect of technology.  In the last ten years alone, we’ve seen the rise of DevOps (and all the associated “Ops”) as an operational strategy.  The public cloud has started to dominate IT discussions.  Containerisation has appeared and already gone through one iteration (the rise and fall of Docker) – to name but a few obvious technology events. 

It’s not surprising that many IT professionals simply want to focus on their aspect of the industry.  However, as an analyst, I believe we have a duty to look at large-scale trends (the big picture) and how individual technology products and solutions are evolving (the minutiae).

The Big Picture

When we look at IT from a distance, the trends are easy to spot.  Here are some examples.

  • The cyclic nature of IT – in the 1960s & 70s, we had (arguably) mainframe dominance, with many businesses choosing managed services via systems houses.  The diversification in the 1980s & 90s drove the rise of Microsoft, Sun Microsystems and other mid-range platforms.  The data centre further diversified into standardised servers and is once again moving back to centralised computing with the public cloud.
  • The dominance of big vendors – the most obvious example here is IBM.  The mainframe set the standard in the early days of computing.  Remember the mantra “nobody got fired for buying IBM”?  Today we might think AWS dominates, although we can also see Microsoft’s dominance on the desktop, Oracle’s in relational databases, VMware’s in virtualisation, Cisco’s in networking, and Intel’s in processors.  The list goes on.  But these companies don’t reign forever.  At some point in the future, for example, AWS will be just another vendor in the IT ecosystem, while VMware could be transformed into nothing more than a recurring-revenue-generating machine.
  • The relentless evolution of technology – this area covers a huge diversity of activity.  Storage media vendors push the envelope on capacity, cost reduction, miniaturisation and performance.  Processor manufacturers have extended Moore’s Law further than could possibly have been imagined.  Networking has delivered ubiquitous computing from anywhere in the world.  We’ve moved through virtualisation, containerisation and serverless models of computing in less than 20 years.  New computer languages have emerged, while the ability to develop software has become open to everyone.
  • The disconnect for both users and IT professionals – as technology has moved towards a service model over the past five decades, we’ve reached a point where working in IT doesn’t require an understanding of how the underlying hardware or software works.  This is perhaps true in many aspects of technology; modern cars are complex and offer little opportunity for self-servicing, for example.  The inability (or lack of desire) to understand implementation details is both good and bad.  The speed of innovation improves when the learning curve for new technology is flattened.  However, a lack of understanding of the way in which technology is delivered can lead to inefficiency, increased cost, poor service, data loss and more.
  • Diversity matters – as computing becomes an increasingly important part of everyday life, the traditional computer geek has become horribly outdated.  Making technology work for everyone means being inclusive of those who will be affected by the adoption of IT.  With each decade we move further from computing’s origins, the greater the need to ensure technology is representative of all communities, because modern IT is about what we do with the technology, not how it works.

It’s difficult to imagine another industry that moves at such a pace as IT.  I remember someone suggesting to me (many years ago) that we “lose” 25% of our technology knowledge every year.  Effectively we must learn one or two new paradigms a decade.  That’s a lot of change in a career spanning forty years.
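
Taking that 25% figure at face value, a quick back-of-the-envelope calculation shows why.  The short sketch below is purely illustrative: it treats the claim as simple compound decay, and both the 25% rate and the model itself are assumptions, not measured data.

```python
# Illustrative sketch only: models the "lose 25% of our technology
# knowledge every year" claim as simple compound decay.  Both the 25%
# figure and the decay model are assumptions, not measured data.

ANNUAL_LOSS = 0.25  # assumed fraction of current knowledge made obsolete each year

def knowledge_remaining(years: int, annual_loss: float = ANNUAL_LOSS) -> float:
    """Fraction of today's knowledge still relevant after `years` years."""
    return (1 - annual_loss) ** years

for years in (1, 5, 10, 20, 40):
    print(f"After {years:2d} years: {knowledge_remaining(years):6.1%} still relevant")

# After 10 years only ~5.6% of today's knowledge would remain relevant,
# and after a 40-year career, effectively none of it.
```

On that assumption, barely 5% of what you know today would still be relevant a decade from now, which is roughly where the “one or two new paradigms a decade” figure comes from.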

Unfortunately, despite our ability to innovate as an industry, we continue to “re-invent the wheel”.  Some of these changes are absolutely necessary, of course.  New technology improvements demand a rethink or reimagining of how we code and deploy applications, how security is implemented, how data is protected and so on.  But at the same time, standard processes that are technology agnostic (such as capacity/financial planning, data protection and data security) seem to fall into a trap of repeating the mistakes of every previous generation.  As an industry, we develop by failing rather than by learning.  (The reasons for this misstep are a whole separate discussion for another time.)

Minutiae

What about getting down into the details, getting into the weeds about how technology operates and is implemented?

From a purely personal perspective, I’m always fascinated by new technology.  I think most people in the industry probably feel the same, otherwise we wouldn’t be here.  Sometimes the ability of engineers is simply mind-blowing.  A good example is the development of single-board computers like the Raspberry Pi.  I have more MIPS on my desktop in a few Pis than I had in a mainframe in the 1990s (although the I/O subsystem and other components don’t match up).  My 32MB SD card from 2000 has been superseded by micro-SD cards of up to 1TB, which I can buy for less than $200.  Networking speeds are unimaginable.  At university in the 1980s, I used a 300/75 baud modem.  Today I have a Wi-Fi mesh at home with speeds of up to 1Gb/s.

So, technology has an impressive record of improvement.  What can we learn from this?  For me, it’s about identifying trends and projecting them into the future.  In five years’ time, for example, we’ll be dealing with 100TB HDDs and SSDs.  Networking will be at close to terabit levels (for data centre backbones at least).  How will those types of changes affect the way we deliver computing services?

But that’s not everything from a details perspective.  New technologies are constantly being developed and brought to market.  Intel, for example, tried hard with Optane (3D XPoint) but ultimately failed.  Learning why a behemoth like Intel can get things wrong is critical.  Does it mean persistent memory isn’t needed in the data centre?  Are all the other persistent memory technologies doomed to failure?  If not, which will be successful and for what reason?

The lower levels of technology development are like a boiling cauldron.  Some things emerge as the next big thing, and some don’t.  Without watching this market, we can’t predict where the industry is headed next.

Triggers

The big-picture macro transitions are generally triggered by one or more events at the micro level.  Mainframes reduced in importance (arguably) due to anti-trust issues, the cost and closed nature of the architecture, and a slower pace of evolution compared to the rest of the market.  Docker failed to capitalise on the need for detail in the container ecosystem, leaving features like persistent storage until much later in the development cycle.  Flash storage gained a foothold because Apple used the technology in iPods.  The list goes on.  So, in addition to watching the component-level advances of technology, we must watch for trigger points that change adoption for good or bad.

The Architect’s View®

The IT industry has developed to a point where practitioners can specialise in only a small subset of technology and still have a rewarding career.  It’s unlikely, though, that this luxury will extend across an entire 40-year span.  Instead, IT professionals will be required to learn and re-learn, or risk falling behind or dropping out of the industry entirely.

For those of us watching and analysing, it’s essential to see both the big picture and the small details.  Both are equally important in formulating our opinions and views.  Whilst I write a lot about the lower layers of technology, I always keep an eye on the trends.  Expect to see more about how and why we are where we are today, as well as that focus on the detail.

Copyright (c) 2007-2022 – Post #bcd2 – Brookend Ltd, first published on https://www.architecting.it/blog, do not reproduce without permission.