
Going global – things to think about

I recently finished work on another project, which gave me a chance to consolidate some ideas on how to make implementing software outside your home country more successful.

Obviously planning is key, but often the plan doesn’t follow activities right through to the final outcome. I came across several instances where a planning decision had been made before people had thought through its full impact, resulting in confusion, lost time and wasted effort. Of course you can never know *everything* about possible impact, but you can certainly get some of the bigger things right.

In my experience, these things are:

Localization – will the product be localized for each market? If so, how will training be conducted? How will support be handled on an ongoing basis? How will ongoing changes to the product be handled? If you’re not going to localize at all, think through the impact of asking everyone to use your native language.

Security and Application Management – can people access the product within their current IT structure and who will be managing access? If it’s IT, do they have the desire and resources to do so? If it’s the business side, same question. If you’ve never implemented a product of this type across different markets before, you’ll need to do a lot of testing. This means engagement with local IT teams who may not have your project even on their radar, let alone have resources assigned to support you.

Data Security and Granularity – where will data be stored? (In general, storing European personal data in the US raises data-protection and residency concerns it wouldn’t raise at home.) Does the data need to be encrypted at rest?

What granularity of data will be required for reporting at the most senior level, and at what point will data differences not matter to the higher level reports?

People in constituent markets need detail, and that detail can differ from country to country. You need to understand how data in one country stacks up against data in another, and then tune your high-level reporting accordingly. Many companies implement a global software solution to drive standardization across process, products and pricing, but in some environments that simply isn’t possible. You have to decide where the “break-point” is for reporting and then work to that point in every market.
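To make the “break-point” idea concrete, here’s a minimal sketch (the market names, fields and numbers are all hypothetical, purely for illustration): each country keeps its own local detail, and high-level reporting rolls everything up to the one level of granularity every market can agree on – in this example, revenue per product category.

```python
from collections import defaultdict

# Country-level extracts, each carrying its own local detail:
# Germany tracks SKUs, Canada tracks provinces. Only "category"
# and "revenue" are shared across both -- that's the break-point.
germany = [
    {"category": "hardware", "sku": "DE-001", "revenue": 1200.0},
    {"category": "services", "sku": "DE-XYZ", "revenue": 800.0},
]
canada = [
    {"category": "hardware", "region": "ON", "revenue": 950.0},
    {"category": "hardware", "region": "BC", "revenue": 400.0},
]

def roll_up(*markets):
    """Aggregate every market to the agreed break-point:
    total revenue by product category."""
    totals = defaultdict(float)
    for market in markets:
        for row in market:
            totals[row["category"]] += row["revenue"]
    return dict(totals)

print(roll_up(germany, canada))
# {'hardware': 2550.0, 'services': 800.0}
```

The local fields (`sku`, `region`) never reach the senior-level report; they stay behind for in-country analysis. That’s the whole trick: pick the break-point once, then hold every market to it.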

Communications Alignment – this is a fairly standard principle, but I’ve seen many instances where broad, high-level goals had been communicated in a pretty generic way but not broken down into specific messages. It made things difficult for us: not only did people not know what we were supposed to be doing, they didn’t understand how they needed to participate or contribute.

So, not too many things to think about really, but time spent working through each of these areas will pay dividends, I promise you! I’m happy to answer questions on this topic if you have them.


Let’s think about something cool

One of the challenging things about travelling all of the time is that it can be hard to find time to learn about the cool things people are doing in your space.

Thankfully, now that I’m at home more I have time for actual thinking, so last Friday I went to a great event that was a little bit of a pulse-check on the state of big data and analytics. There were presentations from Hitachi Solutions and from Microsoft – who should be congratulated for finally recognizing and embracing Excel as the day-to-day BI tool for some – amongst others, and various interesting use cases to review and discuss.

Every day I see that most businesses, especially enterprises, are still stuck just trying to find their data and make it reliable and useful. It’s events like these that keep your imagination going, thinking about the different kinds of long-term benefits timely, reliable data can bring – very helpful while you’re slogging through the drudgery of data cleanup!

Thanks Hitachi Solutions Canada (@HitachiSoCa) for setting it up!

Big Data = Big Deal?

Guess you’ve all seen the sudden explosion of articles about big data recently – you can hardly load a webpage without seeing it mentioned – and a number of questions may have come to mind. Things like:

  • How can I help my clients take advantage of it?
  • How can I leverage it in my business?
  • Do I have any of my own?
  • Is it really as big as the media says, or is it all hype?

I’m going to look at each of these questions – and probably some others – over the next few weeks, and share what I believe to be useful and how data can be leveraged for you and your clients. I say data in general because from a business perspective it doesn’t really matter if your data is big or small – it’s part of a total approach.

Let’s look at the last question first – is big data really a big deal? To start answering it, let’s first understand some things about data generally.

Think of data like water. Water is a resource that can be scarce or abundant and historically, we didn’t know what it was made of. We didn’t know its chemical makeup, what properties it held, how pure or impure it was. We only knew it was water. When it was scarce, we moved in deserts from oasis to oasis, getting small amounts out of the ground with great difficulty. In abundance, we crossed oceans made of it, not seeing into its depths and understanding what secrets it held. In essence we held neither a micro nor a macro view – we were at the same level.

Nowadays, we understand water differently. We know what it’s made of chemically and we know that it may contain various different substances while still maintaining the same appearance. We have better ways of detecting it in environments where it’s scarce and better ways of getting it out of the ground. We no longer only float on top of oceans – we build pictures of what lies underneath and detect currents and patterns across large areas.

At one time, a cup of water only quenched a thirst. Now, it holds the answers to many other questions.

With data – particularly big data – we’ve followed a similar path to a greater understanding. In the past, for organizations and for society at large, information could be scarce. We’ve known that events have occurred but we haven’t known all of the details about them. Analysts would traipse from system to system, copying information from one Excel sheet to another, trying to get a picture of what was happening. We’ve floated on top of information that is unknown, un-described, or both; a bit like old-time mariners, sometimes feeling there’s a storm coming with no way to see the currents that really tell the story, not seeing the shark until it attacks.

Now, information generally has become more abundant and available to both individuals and organizations. We can identify and describe each element of a particular event and save that information for later. We have better ways of discovering and aggregating data, saving the time and effort involved in pulling information together into one place. Now we can choose to float on top of the data, look at its currents and patterns from different perspectives, or we can look at the data itself and understand which parts are needed to answer a particular question and which are not.

So is big data a big deal? Fundamentally yes, but this new abundance has its own challenges. Data now comes out of a fire hose and we have to figure out not just how to sip from it, but how to siphon only what we need. How do we know what we need? This is the next question I’ll be looking into.