
It wasn’t much more than 10 years ago that Apple was a washed-up computer company, whose disastrous forays into PDAs and other consumer-targeted devices had left it comprehensively outmanoeuvred in the marketplace by a rapidly expanding Microsoft. Nowadays, whilst the reverse is not quite true, it is certainly the case that the release of a new Apple product garners significantly more column inches in the press than anything announced by its Redmond-based competitor, which always seems to be metaphorically tarred with the ‘dull but worthy’ brush.

And so it was last Wednesday, when Apple unveiled its latest product. In the months leading up to the launch, chat rooms and forums had been buzzing with rumours about exactly what this product might be. In fact, such was the level of anticipation, one could be forgiven for thinking that Apple CEO, Steve Jobs, was about to announce that he’d discovered the final resting place of the Ark of the Covenant.

Unfortunately, he hadn’t; what we actually got was a tablet PC, named the iPad (which must have taken the Apple branding department, ooh… minutes to think up). And whilst it certainly looked sleek and user-friendly, it didn’t, at least on first appearance, have any paradigm-shifting features or functionality.

But, then again, neither did the iPod, or, dare I say it, the iPhone; Apple’s success in recent years has been built less on pure innovation than on integrating existing features in a single device, and offering a well-executed means of adding content, be it media or software. In this respect the iPad is no different: it provides wi-fi, 3G (in the more expensive models), accelerometers and a large colour display, and most importantly, will be backed up with a fully featured online media store from which users can download books, magazines, and presumably applications.

I said ‘most importantly’ because content distribution is the area in which Apple has had the biggest impact in changing the way consumers purchase and use media. One only needs to look at the 8.5 billion songs downloaded from iTunes since its launch, or the billion-plus applications downloaded to iPhones and iPod Touches, to realise that what the iPad really represents is an opportunity for the publishing industry to radically re-invent its distribution and sales mechanisms.

Which leads me, in what you may think was a rather long-winded way, to the future of paper and the printed medium in general.

The death knell of the printed word has been sounding from some quarters almost since the first web browser was made commercially available. Online publishing was the way forward, we were told back at the end of the last millennium. The printed word had no chance of competing with a method of distribution where the marginal costs were virtually zero. Web versions of major magazines and newspapers began to spring up on a daily basis. Paper purists wept.

Ten years on, however, the Sunday Times still gives paperboys hernias, and no WH Smith store is complete without a row of middle-aged men staring blankly at the pages of car magazines. Sure, there have been some major closures – The Face and Melody Maker spring to mind – but there has not been the print apocalypse that was forecast. In fact, remarkably little has really changed.

This, I believe, is not a consequence of some deep-seated human need to connect with the physical medium of ink on paper, but of the technical limitations of the online medium. Websites are great for short articles, for videos, interactivity and nuggets of information, but no-one is going to use a desktop or laptop to read a book, or even a magazine, in website format. They may, however, use a thin, light e-book reader to read specially produced e-books and e-magazines, especially if the content is readily available at a cost below that of the physical version. This is where the iPad comes in.

Now, at this point, some of you may be wondering why I haven’t mentioned the plethora of existing e-book readers in the marketplace, all of which, with the exception of Amazon’s Kindle and the very recent lookalikes from Sony et al, have been disappointing flops. Surely, you may argue, if e-books are so great, why have sales of these e-book readers been so poor?

The answer is that until now e-readers have lacked a simple, cohesive platform for the distribution of content, and have not offered sufficient features to make the expense of buying one seem worthwhile. The Kindle has been more successful than its predecessors, mainly because it is fully integrated with Amazon’s own e-book system, but it still lacks a solid selling point, unless you are someone who buys books in sufficient quantities for the lower price of e-books to offset the cost of the Kindle itself.

The difference with the iPad is not that it’s cheaper than the Kindle or other e-readers – in fact the reverse is true. No – the real reason why the iPad will turn the e-reader market on its head, and change the way we buy and consume the printed word, is its desirability and adaptability. Buy a Kindle and you have a dull beige box that looks like it was designed by the same people who designed IBM PCs in the 1980s (i.e. people who wear short-sleeved shirts with ties). Buy an iPad and you have a perfectly executed piece of post-modern industrial design. The Kindle is a one-trick pony. The iPad can play games, sense motion in the x, y and z axes, and can almost be used in place of a laptop if necessary. Combine this with Apple’s content distribution system, the market leader in music and application downloads, and it becomes clear that the iPad may even sell in larger volumes than the iPod or iPhone. It has the ‘want one’ factor missing from previous e-readers, and its multi-functionality will give the average consumer greater justification for purchasing it than the single-use devices from Amazon, Sony and other competitors.

I’m not saying that Apple will have the market entirely to itself over the next few years; it is likely that other companies will follow suit, in the same way that Android-based mobile phones have been developed to compete with the iPhone over the last year or so. What I do believe, though, is that Apple will take the lion’s share of the market, and that competitors’ sales will only serve to increase the size of the total market, rather than steal customers from Apple – in much the same way that Ferrari and Lamborghini sales rose significantly in the last decade despite an increase in the number of companies manufacturing supercars.

The implications of this for the print and media industry are clear: if the growth in sales of online magazines and books for the iPad mirrors that displayed over the last five years by music downloaded from iTunes, then publishers will need to radically re-define their distribution and business models.

Let’s look at the figures in a bit more detail to back up this assertion. Sales of downloaded music, which were virtually zero prior to 2004, grew exponentially after the release of the iPod, and are now predicted to reach $4.3 billion by 2012 (http://www.forrester.com/rb/Research/end_of_music_industry_as_we_know/q/id/43759/t/2) – a figure greater than the forecast sales for CDs. iTunes has a share of around 70% of the global market, and it seems likely that this will remain constant, or increase, between now and 2012. In other words, within the next two years downloading will become the most popular method for consumers to purchase music, and iTunes the dominant retailer from which those purchases are made.

The print market is lagging around six years behind the music market, but if we apply these figures to books and magazines, we can hypothesise that a 50/50 print/download split could be reached as soon as 2016. In fact, provided uptake of the iPad is as high as I’ve predicted, we could see such a shift even sooner, as printed books and magazines have much higher unit costs of production than CDs, and there is thus a stronger financial imperative for publishers to promote the higher-margin (yet, hopefully, cheaper to the consumer) e-versions.
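The arithmetic behind that 2016 estimate is simple enough to sketch. Note that both inputs below are the assumptions made in the paragraphs above (a parity year of around 2010 for music, and a six-year lag for print), not measured market data.

```python
# Back-of-the-envelope version of the extrapolation above.
# Both figures are assumptions from the text, not measured data.

music_parity_year = 2010  # assumed: music downloads reach ~50% of sales
print_lag_years = 6       # assumed: the print market trails music by ~6 years

print_parity_year = music_parity_year + print_lag_years
print(print_parity_year)  # 2016
```

A faster iPad uptake would, in effect, shrink the lag figure and pull the parity year earlier, which is the shift suggested above.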

Whether this happens is almost entirely in your hands.



I thought I’d kick off my futurology blog with a look at my own theory of the inverted pyramid, which examines the underlying assumptions made in any body of knowledge and asks whether they are sufficiently stable to support the structure above. We’ll consider its applications to futurology in a little while, but before we proceed any further, a little more explanation of exactly what I mean is required.

As with most explanations, it is probably easier to demonstrate by example than to delineate the theory in abstract terms, so I’ll start by looking at how the inverted pyramid applies to a practical subject – namely that most contentious of issues, religion – and, more specifically, Christianity.

With over 2000 years of history behind it, and innumerable subtly (and in some cases not so subtly) different denominations, Christianity comprises an extraordinarily broad mix of pseudo-historical facts, articles of faith and much-debated interpretations of scripture. Yet whilst notions such as original sin, or the transubstantiation of the mass, may have resulted in centuries of disagreement between the religion’s sects, the body of knowledge over which they both agree and disagree is, without exception, predicated on a single premise: that God exists. Remove the premise – the inverted point on which the pyramid rests – and the entire body of knowledge (the pyramid itself), with all its attendant mythology and debate, crumbles.

Whilst there have been many attempts to prove the existence of God over the years – ontological and teleological arguments among them (the latter resurrected in recent years by the intelligent design brigade) – none is what we could call a proof in the empirical or logical sense of the word. It is thus a thin and not entirely robust premise that upholds the inverted pyramid of the religion’s belief system. Of course, Christianity is not alone in this; all of the world’s major religions are susceptible to the same flaw in their reasoning.

One could conclude, then, that if a body of knowledge predicated on a single premise is to withstand rigorous scrutiny and still maintain its form, that premise needs to be exceptionally stable: one that, despite its solitary nature, is strong enough to support the broadening weight above it. The problem, of course, is that an inverted pyramid is inherently unstable; no matter how strong the initial premise, the spreading weight above will almost always result in eventual collapse.

You may, by now, have started to see how this theory can be applied to futurology. The futurologist is concerned, more than anything, with the strength of the atoms of fact gleaned from the present, from which he or she will extrapolate future scenarios. Should the futurologist rely too heavily on too small a number of premises, or choose ones that are insufficiently robust, the likelihood of those scenarios being accurate is heavily reduced.

Let’s take a look, then, at the inverted pyramid in action, in the realm of that great exponent of futurology – science fiction – and, more specifically, the 1970s UK TV series Space: 1999. It is, perhaps, a little misleading to choose such a programme to illustrate my point, given that it was produced purely as entertainment and not as a serious vision of mankind’s future development, but I have included it simply because I can think of no other example that demonstrates so succinctly the link between a weak initial premise and poor predictions of future scenarios.

As you would expect, Space: 1999 is indeed set in space, and the year really is 1999. And whilst 1999 in the actual world was the year of the dotcom explosion, the millennium bug and the iBook, in the fictional world of Space: 1999 mankind has established a permanent base and a nuclear waste dump on the moon, and computers are multi-coloured flashing objects with tiny monochrome screens. Had the writers and producers been given marks for the accuracy of their depiction of a world only 25 years forward in time from their own, it is probable that an Iceland-in-the-Eurovision-Song-Contest-like ‘nul points’ would have been the resounding chorus.

It is important to remember, though, that Space: 1999 was produced at a time when manned lunar landings were still fresh in the collective memory, and when space travel seemed to many to be the ‘next big leap for mankind’. Given this background, it is easy to see why the show might extrapolate from the success of the Apollo missions a world, a quarter of a century in the future, in which mankind had begun to colonise the moon. Where the show’s creators went wrong was in not testing this assumption; had they done so, they would have understood that the costs and physical requirements of such a venture would have precluded it from happening, certainly at any point before the middle of the following century. With this key premise removed, much of the imagined world of the series – the inverted pyramid itself – collapses.

So, where does this leave the futurologist? The answer is: with a need to build his or her future scenarios on multiple, tested and robust assumptions. In other words, we must ensure not only that the foundations of our pyramid are stable, but that it is supported at as many points as possible – the pyramid should, in fact, look more like a square. In practice, this means examining each assumption carefully, then cross-checking its effects against the others. Assumptions can be assigned a score based on their robustness, and only the highest-scoring ones, provided they are sufficient in number and in agreement, used as the founding premises of our future scenarios.
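The scoring step can be sketched in a few lines. To be clear, the scale, the threshold and the example assumptions below are all hypothetical illustrations, not part of any established method.

```python
# Hypothetical sketch of scoring assumptions by robustness and keeping
# only the strongest as founding premises. Scores and threshold are invented.

assumptions = {
    "manned space travel will become cheap": 2,          # weak, Space: 1999-style
    "computing power will continue to grow": 9,
    "consumers will pay for convenient digital content": 8,
}

THRESHOLD = 7  # minimum robustness score for a founding premise

# Keep only the assumptions robust enough to build a scenario on.
premises = [claim for claim, score in sorted(assumptions.items())
            if score >= THRESHOLD]

print(premises)
```

A fuller version would also adjust each score after cross-checking it against the others, so that two individually plausible assumptions which contradict one another are both marked down.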

We’ll examine some examples of this technique in operation in a future posting.

