Category Archives: Strategy

Amazon to stop selling books by 2020

Child reading

It seems crazy to suggest that the world’s biggest bookseller would stop selling books, particularly as I’m not someone who believes the book industry is dying. And Amazon is the dominant force in the book retail market, particularly in eBooks, where it has a 65% share.

 

But here’s the thing.

 

Book sales made up only 7% of Amazon’s annual revenue. In the US, Amazon is being outperformed by other retailers, notably Wal-Mart and Google.

 

Meanwhile, the breakdown of Amazon profits is unclear: some might suggest that this is deliberately obfuscated. However, Amazon is by far the global leader in the provision of Cloud Computing services; it has 5 times the capacity of its next 14 competitors combined.

 

So what you’re looking at is, on the one hand, a flat-lining business that contributes just 7% of your revenue and, on the other, a massively growing market in which you’re the dominant player. Gartner predicts that by 2016 the bulk of IT spend will be on cloud computing. McKinsey predicts the economic impact of cloud technology could be $6.2 trillion annually in 2025.

 

Faced with those figures, who would continue to sell books?

 

Realistically, I don’t think Amazon will abandon book selling completely: it will just become even more of a marketplace. Amazon will be the platform on which publishers (and self-publishers) sell their books and it will retain a commission for doing so: an agency pricing model by another name. Importantly, Amazon will retain all the point of sale data about buying habits that it can use for improving upsell.

 

What this means is that Amazon divests itself of all its stock management issues and simply acts as a channel for customer service, sales and returns. Publishers (and indeed other product manufacturers) will use Amazon’s huge online presence to market their products, while Amazon ensures it uses its data to maximise spend. This also means that Amazon can have an even more tax-efficient model in terms of where it chooses to base its services. We know this has been important to Amazon in the past and it’s unlikely to change.

 

Meanwhile Amazon will focus on cloud computing, extending its dominance and broadening its services. And that’s something that’s going to happen fast. I suggest by 2020, but that’s a finger in the air. If I were Jeff Bezos, I’d be doing this now.

 



 

The perils of data phrenology

An illustration of the characteristics applied to the skull in phrenology.

Tom Fishburne – whom you really should read; in a marketing world full of crap he really cuts through it, much as Scott Adams once did for management – has a great analogy comparing Big Data to teenage sex. But for those people who are actually doing Big Data, there’s a Twainian peril: data phrenology.

Phrenology is the study of bumps on the head, used to assess the character of the head’s owner. Unlike its sister study of chiromancy / palmistry, the -ology suffix makes phrenology sound like a science. It isn’t. It’s about coincidence and interpreting those coincidences so that they appear meaningful. See my point about Big Data yet?

Let me elucidate.

The more data you have, the more chance you’ll find coincidences. And the more you invest in Big Data, the greater the pressure for data insight. In other words, not only do you have a lot of patterns, you’re also under pressure to interpret them. That’s a massive potential trap for your business.
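To make that concrete, here is a minimal sketch of the trap (my own illustration in Python, using nothing but invented noise rather than anything from a real project): generate a pile of unrelated metrics, then count how many pairs look “significantly” correlated anyway.

import numpy as np
from scipy.stats import pearsonr

# 50 hypothetical metrics, each with 100 observations of pure noise.
rng = np.random.default_rng(42)
n_metrics, n_observations = 50, 100
noise = rng.normal(size=(n_metrics, n_observations))

# Test every pair and count the ones that pass the usual p < 0.05 threshold.
spurious, tests = 0, 0
for i in range(n_metrics):
    for j in range(i + 1, n_metrics):
        r, p = pearsonr(noise[i], noise[j])
        tests += 1
        if p < 0.05:
            spurious += 1

print(f"{spurious} of {tests} pairs of random metrics look correlated at p < 0.05")

Roughly 5% of the 1,225 pairs (around 60 of them) will pass by chance alone: bumps on the head, ready to be read.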

This is particularly true when analysing social media data. A couple of years ago a statistic went round that people who liked Burger King on Facebook would spend a few dollars more on each visit than people who hadn’t liked the Facebook page. The implication was that if you could only get people to engage with you on social media, they’d buy more of your product. But this was a syllogistic fallacy. The truth was that these social media types weren’t driving through Burger King and saying: “I’ve liked your Facebook page, so you’d better supersize me!” These were people who liked to eat burgers and wanted their friends to know about whole beef patties, but hold the gherkins. N.B. We neither endorse nor censure any food products on this site. There was also the possibility that they liked Burger King because it ran some campaigns to get people to promote its Facebook page…

Similarly, if you sample the psychometrics of people who follow you on Twitter and find they also discuss Breaking Bad, that doesn’t mean you should necessarily go and buy advertising space on AMC. Lots of people tweeted about Breaking Bad, just as lots of people like to watch cat videos on YouTube.

Insight = meaning + hypothesis

Before you even start looking at data, think about what you expect it to show. For example, is data about your product appearing in markets where it’s not sold, geographically or vertically? If so, there may be some data crosstalk. Michael Jackson didn’t just perform in Thriller: he also wrote about beer and commanded the British armed forces.

Why have you bothered to acquire the data in the first place? It’s not just going to turn up results that no one ever imagined; there’s nothing inherently mystical about it. It’s a test bed for your business assumptions: a way to test hypotheses.

If you’re seeing spikes and trends in data that match your hypotheses, they’re probably correct. If you’re seeing those hypotheses fail, they’re probably incorrect. And if you’re seeing something else, you need to question that trend’s validity: what happened to create a spike? Is it significant or just coincidence?
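For that last question, here is a hedged sketch (all figures invented) of the kind of back-of-the-envelope check you might run before declaring a spike meaningful:

import numpy as np

# Ten periods of baseline figures, then one suspicious-looking new observation.
baseline = np.array([102, 98, 105, 97, 101, 99, 103, 100, 96, 104], dtype=float)
spike = 112.0

# How many standard deviations above the baseline mean is the spike?
z = (spike - baseline.mean()) / baseline.std(ddof=1)
print(f"z-score of the spike: {z:.2f}")

A common rule of thumb is that a z-score beyond 2 is worth investigating and anything less is quite plausibly noise; either way the number is a prompt for a hypothesis about what happened, not an answer in itself.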

If you’ve enough relevant data, it will almost certainly beat any gut feeling about business performance. But you can’t expect it to reveal some kind of hidden truth by itself. If you want to get meaningful insight from your data, you need to feel your way past the bumps and recognise that your data is only as useful as the questions you ask it.

 


 

A 2011 retrospective

Eyes looking back

When you reach the end of a sprint, you look back and consider what went well, what went badly and what can be improved. There’s a similar process for waterfall projects when you produce a lessons learned report to share with the rest of the PMO. While I’m sure you floccinaucinihilipilificate about this company’s 12-month performance, allow me to highlight three things I’ve noticed come to the fore in the last 12 months.

You need to demonstrate the tangible benefits your project will deliver as quickly as possible.

Of course, this has always been true. But the pressure to be lean and value-driven is greater than ever, driven I think not just by wider economics but also because the technologies we work with are more mature and with that, so are customer expectations.

Many people are on their third or fourth significant implementation of a content management system, whether for web or across the enterprise. Marketers have already made their initial forays into social media. Not seeing returns on information systems or web engagement simply isn’t good enough. So before putting their hands in their pockets, they’re quite rightly asking what they’re going to get back. As an industry, we need to answer that question quickly and credibly.

Events are being stretched.

People are increasingly participating in events from a distance and after they’ve finished. Television has stretched beyond the screen by broadcasting with hashtags which allow an audience – not all of whom are actually watching – to discuss programme content beyond the control of the programme’s producers. Whether this is music or politics, it’s a long way from the controlled comments policies of newspaper discussion forums. Huge numbers of people are using tablets and smart phones to communicate as they watch TV.

This applies to football matches too, whether from the armchair or the stadium; and very much to music, be it at a festival or on Spotify. The discussion extends way beyond the geography and the duration of the event; supported by the fact that the media doesn’t need to be watched there and then either. There’s gold in those hills, I just haven’t figured out how to extract it yet…

We could understand our market a lot better if we just took the time.

Sales people and analysts have been harping on about big data as the next big thing without too much detail around what it is or why it’s useful. But consider this. People now reveal huge amounts of personal information under highly obfuscated terms and conditions. If you could join up Facebook profiles, Flickr, Amazon, loyalty cards, credit ratings, browser history, and online social interactions, you’d have an incredibly complex and potentially frighteningly accurate picture of your market and how to sell to them.
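Purely as an illustration of what “joining it all up” might look like mechanically (every source, column and value below is invented for the example), here is a minimal sketch in Python:

import pandas as pd

# Hypothetical extracts from three different sources, keyed on a shared customer ID.
loyalty = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "monthly_spend": [120.0, 45.5, 310.0],
})
social = pd.DataFrame({
    "customer_id": [1, 3, 4],
    "interests": [["running", "coffee"], ["gaming"], ["travel"]],
})
browsing = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "pages_viewed_last_month": [85, 12, 240],
})

# Outer joins keep customers who appear in only some of the sources.
profile = (loyalty.merge(social, on="customer_id", how="outer")
                  .merge(browsing, on="customer_id", how="outer"))
print(profile)

The hard part, of course, isn’t the merge: it’s getting the keys to line up across sources and doing so legally.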

If you’re a D2C organisation or want to become one, getting that kind of data and being able to process it in a meaningful way is going to make your current online engagement look… well, pretty poor. Start thinking now about how you can get more data legally and how you might exploit it to reveal business information that will give you a competitive advantage. You can be sure that if you don’t, your competitors will.

 


 

My threepence for 2011

I can’t help myself. It’s New Year and that means some kind of retrospective, and indeed preview. I’ve been working a lot less with off-the-shelf CMS and doing a lot more work involving custom-built web applications. I’ve no idea if this is reflective of a wider market trend but I thought I’d share three things that I’ve seen in the past year which I think will become even more important over the next 12 months.

1953 Thrupenny bit

1. Content management applies to off-site content too

It’s all very well thinking about content as the “stuff” people in your organisation create in repositories that you control. But there’s a really big issue: there’s a whole load of content that’s not in your repositories that you still need to deal with. From an internal operations perspective, this is the tacit knowledge and the documents that people take outside your office when they leave each day and that don’t come back until they return. From an external marketing perspective, this is the content that people outside your organisation are creating on platforms you don’t control: Facebook, Twitter, blog posts. Just getting a handle on what’s going on strikes fear into many. But exploiting this off-site content will bring huge benefits to your organisation.

2. The web is a competition

Look at all the online reputation tools out there like Klout and We Follow. Isn’t online participation just a competition where the brands with the biggest reach have the largest social market capitalisation? It used to be about whether you appeared on the first page of Google’s search results, but now we can measure influence and advocacy in other ways too. The web encourages you to ensure that your online presence exceeds those of your competitors. The services that you offer need to tap into that mindset if they’re going to be successful. But you also need to consider what tangible returns you make on raising your web profile. It’s a competition, there are trophies, but is there a cash prize?

3. Designers need to think a lot harder about multi-platform

While people who’re engaged in heavy content entry will continue to use devices with comfortable physical keyboards, we’re obviously going to see even more use of mobile phones and tablets. This means smaller screens, touch screen controls and often, slower performance. Designers who are constantly trying to cram ever richer user experiences onto a page are going to fail their audiences if they don’t consider how people on slow connections can download media, or interact with fiddly HTML buttons. It’s no good expecting the device browsers to be clever enough to handle your designs well. Test-driven interface design is going to be essential.

 


 

Culling web projects in the age of austerity

Statue of an emaciated Buddha


Austerity is on the agenda. Across Europe, business and governments are making cuts in spending. In the UK, this means a further “clampdown” on the number of central government websites, while many private sector organisations are looking to reduce the total cost of their web presence.
Shareholders and taxpayers will applaud budgetary asceticism, but there has to be a middle way. Cut budgets from the wrong projects and you won’t achieve your online goals. Cut too far and you’ll undermine your ability to respond to new market challenges. Short-term gain will lead to long-term pain.
So what’s the best way to decide which budgets should be protected and which should be culled? I’ve found executive boards typically consider three propositions.

1. Blanket cull

The easiest way for a board to implement austerity is just to withdraw funding for all new projects. This approach may be the simplest in the short term, but it’s also the most destructive. Any board that proposes this kind of cull is tacitly admitting its own incompetence. How can you engage effectively with an online market that requires you to respond to emerging trends without investing? Be under no illusions: cutting projects doesn’t just mean forgoing new revenue streams; it means conceding market share.

2. Stick to your guns

Marginally better than stopping all new projects is permitting just those that conform to your corporate strategy. This is the typical approach of an executive that is totally convinced its strategy is correct. They’ll probably have the market research to support this conviction too. But this kind of tunnel vision is fraught with risks. Who wants to place all their eggs in one basket? You’ve got to have some room to hedge your bets. It’s not just that you need to respond to market trends, it’s also that people in your organisation innovate, and by applying a rigid strategy you effectively block out those innovators. They’ll be demotivated and, if their ideas are any good, guess what: they’ll take them to your competitors and you’ll be in no position to respond.

3. Additional governance

If the board is a touch more enlightened but still clinging to its purse strings, it will introduce extra governance procedures. This is a classic ploy in most large organisations: create some new hoops to jump through and hope that this puts off anyone who hasn’t completely thought their proposition through. But this recreates the same problems: you’re satisfying your existing bean-counting processes rather than trying to discover what potential benefits might emerge as a result of a project.
When an executive introduces more stringent procedures for budget approval, it’s often because it wants to appear strong but is actually completely disengaged. It’s handing over responsibility for key decisions to a set of formulae. When the project goes wrong it will blame a poorly-constructed business case rather than the more obvious cause of project failure: lack of executive involvement.

Budgetary control is not the same as leadership

All projects rely on the executive to be actively engaged: making decisions when they’re required, providing leadership and assuring that the project continues to be aligned with organisational objectives. Fundamentally, if your executive is adopting one of the approaches outlined above, it’s only thinking about the money, not about the project. And that’s likely to mean the project’s going to fail anyway.
So if it’s a bad idea to stop new projects, stick vigorously to your existing strategy or introduce extra governance, what can you do to save money?

You’re in a hole, stop digging

Organisations are littered with zombie projects: those that have been running seemingly forever but that have never delivered benefit. These are the projects that sap the morale of the teams working on them and engender snide remarks from other teams struggling with lesser budgets yet more likely to deliver.
If your project has already been running for more than a year, has missed major milestones (or, worse still, has no major milestones), has had more than one change of project manager or systems integrator, or is just setting a strategy for other projects to follow, you can be pretty sure that it’s not going to deliver against its original brief.
Why force everyone else to tighten their belts when you’re continuing to squander money on a project that has had an opportunity to deliver but failed? It seems so obvious when you’re stood on the outside, but I think most organisations can be honest and say they’ve let some projects go on too long when they should have pursued others.
Good programme management isn’t about relentlessly pursuing the same objectives with an ever-diminishing budget. It’s about the ability to shift focus and point your organisation towards new benefits. Imposing arbitrary rules just gets in the way. So be rigorous in your budget management, but be dynamic in going after new opportunities. Be less fearful of abandoning something that hasn’t worked than missing out on something that might.


See also Alan Pelz-Sharpe’s article on UK budget cuts and public sector IT and re-assessing the value of Enterprise Licence Agreements.

 


 

Does rationalisation reduce cost?

It’s a fair assumption to make that some organisations haven’t procured their content management systems as effectively as they might have done. Poor procurement is particularly frustrating when it’s done with our money, i.e. by government. But government in the UK is steeling itself for a major cost-cutting exercise. The Transformational Government agenda is already well underway, seeking to reduce the number of government websites and streamline online services. Meanwhile the political parties have competing missions to rethink procurement, particularly of technology. You can’t argue with the idea, and as Ian Truscott points out, there are good reasons for reducing the number of websites from a user experience perspective as well as just costs. However, you can certainly question the approach.

Let’s say you try to consolidate to a single content management system. The smaller the user base for that CMS, the more likely you are to meet its users’ requirements. As soon as you extend the CMS to multiple teams with different ways of working, different audiences and different kinds of content, you have a change management programme on your hands. The focus has shifted from where it should be, online engagement, to training existing users in new ways of working.

Over-rationalisation tends to lead to over-generalisation, and that in turn leads to a poor fit to requirements. If you generalise too much, you’ll necessarily have to introduce customisation to your system, which was precisely what you were trying to avoid in the first place.

This isn’t the only area where too much rationalisation fails to reduce costs. While preferred supplier lists bring down the cost of procurement, they’re unlikely to reduce the cost of implementation. Qualification to be a preferred supplier is strenuous, but once you’re on the list there’s very little incentive to control your prices. Preferred supplier lists can make procurement inflexible and frustrating for the business users too. New entrants to the market are seldom present, so it’s nearly impossible for government departments to be early adopters. This makes government look like it’s off-message, when in reality many civil servants are swimming against the tide to provide a good service.

What government and many other large procuring organisations end up with is a possibly cheaper but probably riskier solution: over-ambitious projects that take too long to implement and that can’t meet emerging requirements. The larger the project, the more changes to requirements will emerge and the less rational it will become. These kinds of strategic rationalisations are doomed to failure. To paraphrase John Maynard Keynes, your project’s business case can stay irrational longer than your project can stay solvent.

Rationalising your web presence is a great aspiration to have, but you have to rein in your ambitions. Rationalise a feature, not the whole system, and then you’re more likely to see some cost savings.

 


 

The future of content management

Julian Wraith has started a discussion about the future of content management. There are a variety of responses to this linked to from the comments section, each with their own focus, but I recommend reading Laurence Hart for a longer-term view.

My own, brief take is that content management has to face a number of challenging questions over the next couple of years.

Will content need to be managed?

Content management currently focuses on providing tools for groups to create, review and retrieve content so that an approved version of that content can be made available to predefined audiences. User-generated content and the broadcast models of social networking challenge that focus.

  1. Anyone can view content: most tweets go to everyone rather than direct to individuals.
  2. Anyone can contribute content in a UGC world.
  3. Distinguishing your organisation’s content from an individual’s is becoming increasingly fraught; just take a look at any blogger’s site for disclaimers, even though they’re blogging about their company’s services.

Will content need context?

Even in the least structured repositories (wikis, Flickr, Twitter) content is still tagged so that it can be retrieved. But the onus is on the user to find the right tag and on a search application to enable this. This is quite different from a CMS, where the software provides contextual models like folders and related documents to guide the user through an information architecture. As search interfaces and technology improve, there will be less need to provide those contextual models. I have my doubts that semantic mark-up will help people create more relevant content, but I do think that improvements to search will mean that content will be “find-able” and “relate-able” anywhere, even if it isn’t in the right taxonomical folder.
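As a toy illustration of that shift (the documents and tags below are invented), a simple tag index makes content retrievable without any folder path at all:

from collections import defaultdict

# A handful of hypothetical documents, each carrying free-form tags.
documents = {
    "doc1": {"title": "Q3 campaign review", "tags": {"marketing", "review"}},
    "doc2": {"title": "CMS migration notes", "tags": {"cms", "migration"}},
    "doc3": {"title": "Social media guidelines", "tags": {"marketing", "policy"}},
}

# Build an inverted index from tag to document IDs.
index = defaultdict(set)
for doc_id, doc in documents.items():
    for tag in doc["tags"]:
        index[tag].add(doc_id)

# Retrieval is a set lookup, not a walk down a folder tree.
print(index["marketing"])  # e.g. {'doc1', 'doc3'}

It’s crude, but it’s the same principle search applications rely on: the onus is on the tags and the index, not on the hierarchy.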

Will content need to be deleted?

As volumes of content continue to increase and contextualisation decreases, finding relevant content amid all the dross will become harder. I think this, rather than the cost of storage, will become the bigger business driver for deleting irrelevant content. But because distinguishing “approved” and strategic content will be harder, it will also be hard to identify which content is dross and which might be useful. Socially-driven records management is bound to take a stab at this problem, but whichever content management tool can help people get rid of useless content is going to be a winner in the long term.

 


 

Three little tips to reduce huff and puff

My two-year-old son is pleased to live in a house made of bricks. It affords him protection from the Big Bad Wolf.

But what the books don’t tell you is that while piglets 1 and 2 were sheltered by their less than robust housing, piglet 3 faced rocketing costs, toil, tears and the emergent threat of swine flu.

In the seldom-told sequel, pigs 1 and 2 are forced to vacate the house that was designed for one small piglet rather than three growing hogs. They lack the skill and resources to build their own brick houses and end up destitute and living in fear of Tom the piper’s son.

As an architect, piglet 3’s end vision is certainly the right one — or it would be if he foresaw having to accommodate his two brothers. But in order to fulfil that vision you need the skills, resources and time.

If you’ve an immediate problem finding the right shelter for your content, then long-term strategic planning for a robust future vision is likely to be the wrong approach. You need to find a quick way to protect your resources, assess the situation, then plan your next step. You’re unlikely to face a fatal threat – it’ll just be lupine bluster – and even less likely to have enough time and money to mitigate the problem anyway. Start building, see if it works and, if it doesn’t, tear it down again. Being able to manage even a small amount of your content in a robust way is better than just having a visionary strategy.

Those three tips:

  1. Choose two high-value objectives; one that should be simple to achieve and the other likely to be complicated.
  2. Select a technology to deliver these objectives that is in your existing skill set and technology stack. Only buy licences required to meet the project objectives.
  3. Implement the project as quickly as possible and evaluate the success or otherwise six months later.

ECM doesn’t have to be a swine to implement. As long as you don’t try to go the whole hog from the start you’ll avoid making a pig’s ear of the project and be sure to bring home the bacon. It’s a ham-fisted analogy, but it’s no fairy tale.

Further reading on the failings of web strategy:

  1. Anthony Bradley – Your Web Site Strategy is Destined to Fail
  2. Dennis D. McDonald – How to avoid common strategic planning mistakes
  3. Maish Nichani – Mapping your website redesign strategy
  4. Gerry McGovern – Web redesign is bad strategy

 


 

Crystal balls are there to be broken

Guy Westlake, a senior product marketing manager at Vignette, has gazed into his crystal ball for trends and technologies in 2009. This is certainly worth a read, as Vignette continue to have some excellent product features and are one of the driving forces in both WCM and portal software development.
Of course there’s an element here of Vignette promoting its own product set — a case of gazing at navels rather than crystal balls? — but I hope Guy won’t mind if I contradict some of his predictions. I do agree with quite a few!

1. Enterprise 2.0 takes off

The use of web 2.0-style tools (micro-blogging, RSS, tagging, etc.) as part of daily communication within a business should be a no-brainer, but many organisational cultures are way behind the curve. Early adopters are reaping the rewards of improved knowledge sharing, but the ethos of control, hierarchy and compliance hampers efforts to implement Enterprise 2.0. How do you convince people who send email attachments to half a dozen people for approval that there’s a better way of communicating if they can’t see beyond their clogged-up inboxes?
One compelling case for web 2.0 tools is their use in project management: posting on project status with comments for feedback, using shared calendars and discussion boards for meetings, building networks of friends across departmental and organisational boundaries. But if you’re used to out-of-the-box services, be prepared! Implementing these kinds of tools within the firewall is often considerably more complex: LDAP integration is just the first hurdle you’re likely to face.

2. Life in the cloud

So many cloud-based applications offer real benefits at seemingly ever-falling costs that the cloud appears to be the saviour of the web, particularly when recession hangs over IT budgets. But security questions remain: how sure can you be that information you want to keep in your organisation remains there? Businesses will have to become a lot more savvy about encryption methods before they start to really take advantage of what the cloud has to offer.
Nevertheless, those applications that are external to the firewall — including email — are ripe for cloud computing and I expect we’ll see many organisations taking “a punt” on these services just from a cost perspective.

3. Web 2.0 in the financial services sector

This is a banking compliance officer’s worst nightmare: anyone posting all kinds of comments to a bank’s public website. However, financial services have been the trail-blazers for web 2.0 on internal applications and I think we’ll see them pushing these applications to the public too.
The question is: what is the killer app? Social comparison sites for mortgages, savings and the like, similar to Trip Advisor in the holiday industry, are bound to become more prominent. But retail banking is going to have to think long and hard about the applications it can find for online social media to gain market penetration.

4. Personalisation and the rise of ‘My Web’

Personalisation has not been the trend for web content and I see no evidence that it will become one. Personalisation has proven many times to be both costly and ineffective. The trend has been and will continue to be “our web” rather than “mine”.

Even the oft-cited Amazon example isn’t enclosing the individual in their own world: it makes recommendations based on what other people who bought the same product also bought, and there’s heavy use of communal rating functionality. I expect we’ll see more in the way of sites suggesting links other people followed (even Google is moving this way) rather than offering visitors options to configure the kind of content they want to see.

5. The future of online media is video

This is a marketeer’s dream. Unfortunately, the market is willing but not yet ready. There are significant challenges in engaging users with video based on current browsing habits. If you’re online at work, watching video is still viewed as at best anti-social and at worst as skiving. Watching at home still isn’t the experience that it should be, sat a few inches away from a small monitor displaying an even smaller video. Video on mobile devices is improving significantly however, so if mobile bandwidth prices start to fall, expect to see a rise in video clips for handheld devices. What’s more, these devices are likely to be far less effective at blocking out this content than most PC browsers.

6. The integrated brand experience

There’s a slightly chicken-and-egg situation going on with multi-channel delivery. Sites won’t develop for small audience shares, and those audiences won’t visit sites that don’t cater for them. I expect that we’ll see a few niche players here — probably around news and software sites for mobile devices — before we see any really obvious example (in Europe, at least) of business catering for multiple channels.

7. Social media – what next?

Social media has been about individual sites allowing lots of people to comment and contribute. The next step (which we’re already seeing on many sites, including BBC News) is for the sites themselves to be social and provide links to resources they don’t control. I think this is a really good thing. For too long, organisations have focussed on enclosing themselves in their own “enterprise” models rather than seeing themselves as part of the web. Now they’ll begin sharing content and resources with each other more freely in order to become the “hub” that visitors come to on a regular basis. It’s best to be the daily starting point for browsing rather than the infrequent end point.

8. Semantic Web

Has the semantic web lost all meaning? It’s pushed so heavily by vendors, but how many compelling examples are there of it? Some of the technology is exciting, but let’s see a compelling business proposition for it.

Tidying up your content, organising it better and making it more search-friendly are still more effective ways of improving your website or intranet than the implementation of a semantic engine.

If the crystal ball isn’t right, what is?

I’m not disagreeing out of hand with Guy (apart from on personalisation and possibly the semantic web), but if I disbelieve his predictions, what do my own tarot cards propose?

  1. There will be more opportunities to reach new audiences across multiple channels, but a correspondingly increased need to justify the costs of these new channels.
  2. Intranet projects will struggle for attention. Challenges and costs associated with application integration in comparison to a cloud-based model will cause many internal implementations to be delayed. The focus will turn instead to communication beyond the firewall for market penetration and retention.
  3. Websites will become social, sharing content not just from their own resources but from off-brand and off-message sites too, through the increased use of RSS.

Let’s review next year and see whether tarot is more effective than a crystal ball.

 


 

Information in a bear market

Dennis D. McDonald continues to propose interesting thoughts on information management. This one – on the importance of social media in post-merger organisations – struck a particular chord with me.

A previous project I ran was to implement an internal knowledge management portal for a company that had been through several rapid mergers of some pretty small companies into a pretty large one. The company’s success is based on its staff expertise and wealth of project experience, but the full range and depth of this knowledge lay fragmented across a few people from the various entities that constituted the new whole. As a consequence, the sales team didn’t know that they could use staff who’d already engaged with a particular client, or that there were case studies and lessons learned from similar projects. The wheel was being reinvented. It was obvious that some kind of networking tool that enabled staff to identify expertise in people and projects would lend the business a helping hand, and could be implemented with relative ease.

Instead, the directors decided that a search engine that could span all the company’s file servers would be more cost-effective. But how many useful results did the staff get from keyword searches? For all the typical reasons – little classification, poor naming conventions, poor security, inappropriate technology – close to none. The content was there but the information wasn’t.

Just as art only becomes art once you place it in a gallery, content only becomes information when you identify it as useful. The quality of the information, like art, is debatable, but it has no chance of being used if you don’t suggest to people that it’s useful information. Following a merger, staff need to know: these are the kind of people who work here and this is what they know about. To find out more, ask them.

Yet even in the most obvious of cases for implementing simple information management tools, their raison d’être can be by-passed. The company in question didn’t implement a networking tool and nearly two years later still doesn’t know some of its clients, the skills of many of its staff or the scope of most of its past projects. Many staff have left. Yet is the company bothered? Absolutely not.

The directors simply changed the strategy. If the sales team weren’t paying attention to certain clients or types of projects, it’s because they weren’t important enough. The strategy dictated that employees focus on bigger and better in their portfolio, as befitted the newly-merged company status. Who needs the past when you have the future?

It’s a bullish policy in a bullish market, but when things inevitably turn bearish, there’ll be a scramble to avoid repeating the mistakes of previous engagements, find people with relevant knowledge, return to reliable clients who weren’t in the big league. By then, both employees and clients could be long gone, and gleaning information from fragmentary content may well prove impossible.

While your work is easy, information has little value. As soon as your work gets tough, it’s the people and companies with the information who’ll profit.

Updated: Alan Pelz-Sharpe has also written about ECM technologies and recession.