
Wednesday, September 10, 2014

The "Art" in Artificial Intelligence - Part 1

Today, we are going to launch a problem-solving exercise on what might be the single most complex topic in Information Science - Artificial Intelligence. The goal here is not to provide any sort of comprehensive survey of current theory or practice; rather, our journey begins with the stark realization of how little we've achieved in the field since the term was first coined 58 years ago. This is a problem statement and problem-resolution exercise, and an excellent case study in technology-focused innovation. Let's begin...

We'll start at the beginning with some definitions and a review of key assumptions.

Artificial Intelligence, Defined 
The ability for a machine to consistently demonstrate core cognitive skills generally associated with human intelligence, including learning, problem solving, intuitive reasoning, and contextual memory retention and extraction. These skills generally imply the need to achieve some level of self-awareness.

What We Haven't Achieved, Yet
I said that this problem statement is focused on a lack of success in the AI field to date; let's try to quantify that first.

  • No computer can learn like a human.
  • No computer can speak like a human (this is deceptive; tools like Siri will provide responses back to you, but is that in fact anything like human speech? The processing that goes on within Siri is a relatively primitive form of pattern recognition, as opposed to what even the least capable human mind can produce - see the sketch after this list).
  • No computer can handle complexity in the same manner a human can (this warrants much more explanation and we'll come back to it).
  • No computer can problem-solve the same way humans can (there are types of problem solving where computers are of course far superior to humans, yet even with all that power they still fail to solve relatively simple questions that humans can handle naturally).
  • No computer has achieved anything coming close to consciousness or self-awareness (despite the endless slew of sci-fi stories where this is a common fixture).
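To make the "pattern recognition" point above concrete, here is a deliberately primitive sketch in Python (purely illustrative - it is not meant to describe how Siri actually works). It maps surface text patterns to canned responses, with no model of meaning, context or learning behind any of them:

```python
# A hypothetical, deliberately primitive "assistant": surface patterns mapped
# to canned replies. It recognizes strings; it does not understand anything.
import re

RESPONSES = [
    (re.compile(r"\bweather\b", re.I), "It looks sunny today."),
    (re.compile(r"\bset (an? )?alarm\b", re.I), "Alarm set for 7:00 AM."),
    (re.compile(r"\bwho (is|was)\b", re.I), "Here is what I found on the web."),
]

def respond(utterance: str) -> str:
    """Return the first canned response whose pattern matches the input."""
    for pattern, reply in RESPONSES:
        if pattern.search(utterance):
            return reply
    # No learning, no context, no model of meaning - just a fallback string.
    return "Sorry, I didn't understand that."

if __name__ == "__main__":
    print(respond("What's the weather like?"))  # It looks sunny today.
    print(respond("Why is the sky blue?"))      # Sorry, I didn't understand that.
```

Everything the program "knows" sits in that lookup table; there is no generalization, no memory of the conversation and no way for it to acquire a new response on its own.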

Anybody trying to solve the AI problem is a mad scientist, right? 
Now, there's another ethical or moral side to this topic which we won't jump into until the end - the question of whether we should even try to endow a machine with these traits - but then again, it is likely that someone will do this regardless of the ethical objections. Part of the human learning process seems to require learning through our mistakes - a trait we may eventually end up passing on to artificial entities, someday. But back to the problem.

Challenging Assumptions
As with most problem spaces, the set of initial assumptions associated with it tends to drive all else until or unless those assumptions evolve in some fashion. For Artificial Intelligence, there have been a number of assumptions that have helped define what it's become to date and that also help to explain its limited success - they include the following:
  • The notion that brute force computing power will eventually resolve many issues and push through various AI barriers. This is partially true, but then again we sent Apollo to the Moon with the computing power of a standard calculator (by today's standards) - how much computing power do we really need to mimic human thought? Nature got us here through elegance, not waste.
  • The notion that we fully understand how the human mind creates consciousness or exercises cognitive capability. We don't, yet.
  • The very flawed notion that machine learning can or should have any connection to the current methods we use to teach each other. 
  • The lack of sensory input associated with most AI paradigms. AI is I/O-dependent, and that usually means keyboard and mouse, although images, video and audio have now begun to play an important role. We'll get to this in more detail later.
  • The notion that simulation can perform like the real thing; building to the simulation ensures a thing always remains a simulation (thus never achieving actual reproduction). This has some interesting implications which will eventually get us into a discussion of genetic engineering. 
  • A lack of focus on natural language. This at first proved too difficult, and natural language now does factor into much of the AI research going on. However, natural language hasn't been looked at as the core logic for an AI system - but it should be - instead we tend to view it in terms of how an AI system (or other type of system, for that matter) can interact with humans (or just capture human speech accurately).



Watson actually showed us what he/she/it was thinking - if only we could have seen into the minds of the humans, then we could have seen whether they think the same way...
A Brief Chronology of AI

  • Ancient Greece - those happy go lucky philosophers in togas considered the notion of thinking machines. 
  • About 150 years ago or longer - Artificial Intelligence (although not explicitly identified as such) becomes a popular topic in Science Fiction.
  • 1930s - The golden age of Science Fiction novels includes more than a few stories about artificial brains and super robots.
  • 1955 - The term "Artificial Intelligence" is coined by John McCarthy, and within a year the first conferences on the topic are held.
  • 1968 - 2001: A Space Odyssey (developed by Stanley Kubrick alongside Arthur C. Clarke's novel) becomes a box office hit and HAL, the emotionally unstable AI computer aboard the Jupiter-bound Discovery, becomes a celebrity. Take your stress pill, Dave.
  • Early 1980s - Commercial LISP machines appear (the LISP language itself dates back to John McCarthy's work in the late 1950s).
  • 1983 - Matthew Broderick makes it big playing games with an AI military computer in WarGames - the game is Global Thermonuclear War, but of course you can't win it, right?
  • 1984 - The Terminator hits theaters: Skynet becomes 'self-aware' and Terminators start trying to kill the Connor family.
  • Mid-1980s - Expert Systems become popular.
  • 1987 - The pasty-faced Commander Data steals the show on Star Trek: The Next Generation.
  • 1997 - IBM's Deep Blue wipes the floor with world chess champion Garry Kasparov.
  • 2001 - A.I. Artificial Intelligence, the project Stanley Kubrick developed for years, becomes Steven Spielberg's tribute to him as Haley Joel Osment becomes one of the last humans to play a robot (as opposed to CGI animations with voice actors playing humans, robots and everything else).
  • 2011 - Stanford opens up an online AI course to the general public; roughly 160,000 sign up - relatively few finish the course.
  • 2011 - Siri shows us that smartphones can indeed become somewhat intelligent.
  • 2011 - IBM's Watson wipes the floor with a pantheon of Jeopardy! champions.


Don't imprint unless you're serious...

The expectations for AI have thus far radically outstripped the progress made. Yet in roughly the same 50 years, we've taken the Star Trek communicator and tricorder fictions and made them reality in the form of smartphones (some of which you can now load with apps that, when used with sensors, can measure one's vital functions much like a tricorder).

A lot of smart people have tried for decades to make AI a reality and across the globe billions or hundreds of billions of dollars have been spent on related research. What's going wrong here? It can be only one of two possibilities:

  1. Human-like intelligence cannot be artificially created and maintained, or
  2. We've approached the problem wrong.

I contend that the second possibility is in fact what has happened. As we progress through the series, it will become clear that the "Art" I referred to in the title is the ability to pick the right path for problem resolution. In part two of this series, we will examine the question: how can machines learn?





Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks


Saturday, September 6, 2014

How to Create an Enterprise Data Strategy

I work as an IT Architect. One of the more interesting things I get asked to do on occasion is to create strategies for particular technology areas. This represents the "high-level" side of the typical IT Architecture duties one tends to run into (if you're interested in more IT Architecture related topics, check out my new blog here - The IT Architecture Journal). A very popular strategic focus across industry lately is Data Strategy. In this post, I will try to explain why it has gotten so popular and cover some of the fundamental aspects of actually producing one.

Data Strategy, Defined
Data Strategy is the collection of principles, decisions and expectations, as well as specific goals and objectives, regarding how enterprise data and related data systems and services will be managed or enhanced over a specified future time period. As an actual artifact, a Data Strategy is usually manifested as a document, but is not limited to that format.

A Data Strategy is usually conducted in conjunction with some IT Portfolio planning process. In optimal situations, portfolio decisions and follow-on data-related projects can be mapped directly back to goals and objectives in the Data Strategy.
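As a minimal sketch of that traceability idea (the goals and project names below are invented for illustration, not taken from any real strategy), portfolio projects can be mapped back to named Data Strategy goals and any orphan projects flagged for scrutiny:

```python
# Illustrative only: mapping portfolio projects back to Data Strategy goals.
from dataclasses import dataclass, field

@dataclass
class Goal:
    goal_id: str
    statement: str
    objectives: list[str] = field(default_factory=list)

@dataclass
class PortfolioProject:
    name: str
    supports_goals: list[str]  # goal_ids this project traces back to

strategy_goals = [
    Goal("G1", "Establish a single authoritative source for customer data",
         ["Stand up an MDM capability", "Retire redundant customer stores"]),
    Goal("G2", "Make enterprise data discoverable and governed",
         ["Publish a data catalog", "Assign data stewards by domain"]),
]

portfolio = [
    PortfolioProject("Customer MDM rollout", ["G1"]),
    PortfolioProject("Enterprise data catalog", ["G2"]),
    PortfolioProject("New reporting dashboard", []),  # no traceability to the strategy
]

# Projects with no supporting goal become candidates for scrutiny in portfolio planning.
untraceable = [p.name for p in portfolio if not p.supports_goals]
print("Projects lacking strategy traceability:", untraceable)
```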

Why Data Strategy is so popular
Organizations across the world have become more aware of the need for greater attention to data-related issues in recent years. Some of this has been driven by collaborative industry initiatives through groups like DAMA (the Data Management Association) and the resulting Data Management Body of Knowledge (DMBOK). Other drivers include the near-flood of new data technologies released over the past decade as well as the exponentially growing quantity of data out there.

So, what does having a Strategy actually give you?

What it often provides, if used properly, is both a set of shared expectations and a clear path for actualization of those expectations. The Data Strategy allows organizations to deliberately decide how best to exploit their data and to commit to the major investments which might be necessary to support that. This, when contrasted with an ad hoc and decentralized technology evolution scenario, presents a much easier picture to grasp. It also at least implies a situation that will be easier to predict or otherwise manage. It is that promise of manageability that makes creating a Data Strategy so attractive.

Elements of a Typical Data Strategy
The following mindmap illustrates some of the common elements that you'll find in many data strategies. One noteworthy item in this diagram is the idea of sub-strategies (which can be split off into separate documents / artifacts) ...



The Top 7 Considerations for Data Strategy
While there are many more things to keep in mind, I've tried to distill some of the most important considerations for this post...

  1. The strategy should take into account all data associated with the enterprise. This may sound obvious but in fact it isn't. Many organizations explicitly separate management of dedicated data systems from other systems which may have data in them but aren't strictly DBMSs or reports, etc. For example, there may be state data in small data stores associated with a web-based application that supports an online form - the data structures supporting the completion of the form may be different from the ones which collect the completed form data. However, all data, in all applications, regardless of where it is located or how it is used, must be considered.
  2. There generally needs to be an attempt to define an organizational 'lingua franca' - a common semantic understanding of data. There are many ways this might be achieved, but it is important that this be included within the strategic plan.
  3. The Strategy cannot be entirely generic, even if one of the most vital objectives is some type of industry-driven standardization. Wholly generic plans are usually less than helpful.
  4. The Data Strategy must be presented within a larger context. What this means is that there needs to be an expectation that the Strategy will indeed be the precursor to other activities which ought to be able to map back to it for traceability purposes. 
  5. The Data Strategy needs to have sufficient detail to be meaningful. If it is too high-level it becomes merely an elaborate Mission Statement. The expectation behind any Strategy or plan is that it be actionable. 
  6. The Data Strategy ought to be need- or capability-based - not product- or hype-focused.
  7. There ought to be a way to measure success 'built into' the Data Strategy. This can come in the form of basic service level expectations or business outcomes or both. 

What goes into one Data Strategy versus another can be radically different from group to group. If you run a social media company, your needs will be quite different from those of the US Coast Guard, for example - but both will likely need their own Data Strategy.


Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

How IT Hype Inhibits Technology Innovation

There are many unique characteristics connected to the IT industry; some are more associated with pure-play IT vendors, but others seem to pervade almost all organizations which utilize IT in any significant manner. One of the strangest of these characteristics is the obsession with buzzword Hype. For most people this is symbolized by the development and dissemination of Gartner's famous Hype Cycle diagrams. However, the Hype obsession existed before the Hype Cycle ever did, and the diagram doesn't explain the phenomenon in a satisfactory manner.

So, what's wrong with Hype? Isn't it just a harmless offshoot of public relations or perhaps just some benign manifestation of mass psychology? Perhaps some of our readers here are old enough to remember the endless silly fads of the 1970s, which included Pet Rocks and Mood Rings. Did buying these worthless trinkets in any way negatively impact our later lives? Probably not. What these harmless trends did perhaps achieve, though, was a sort of conditioning that might have predisposed the vast majority of us to assign more attention to trends than they might otherwise merit.


Pet Rocks were definitely low tech - but represented a hype-generated trend nonetheless

Now, without diving into the implications of psychological theory in regards to conditioning and behavior, it does seem as though much of what we see or talk about is driven by various Hype cycles. This occurs in entertainment, in business, in food & dining - in nearly every aspect of popular culture - and remarkably it also affects science and technology. What has represented Buzz in the scientific community over recent years? Well, how about stem cells, the God Particle and fractals (or Fibonacci numbers), to name a few.

Getting back to Information Tech - how are we influenced by fads, trends and Buzzword Hype? Well, let's attempt a definition here first...

Buzzword Hype - This represents a unique form of public relations wherein somewhat complex concepts are crammed into a single buzzword (although Buzzwords can technically include several words in them - Master Data Management has three, Big Data has two). While this phenomenon is not limited to IT - it is the most prevalent form of Hype in the technology arena.

So, if an acronym is a mnemonic for a complex term (MDM for Master Data Management), the term itself is a mnemonic for the complex concept that everyone already understands, right? Wait a minute, perhaps we've discovered the first problem; how many people actually understand these terms? Furthermore, how many people are actually concerned with learning these terms?

Let's home in on one of the biggest Buzzword Hype examples of the last few years - Big Data (we've touched upon this topic before in Technovation Talks). How many people actually have a comprehensive knowledge of what this represents, or even have the same expectations or knowledge about it? Is Big Data just Hadoop's distributed, fault-tolerant file system? Is it the lack of SQL and relational structure? Is it the high capacity or throughput? Or is it some combination of these elements and much, much more? Even more importantly, is Big Data even something that can be neatly defined as one standard solution? All good questions, none of which are typically addressed in the core Hype surrounding the buzzword.

The buzzword hype for Big Data seems to imply more, better, faster, bigger with very little consideration as to how that might happen or what the eventual impacts would be. The term itself becomes its own justification - if everyone is talking about doing it and beginning to do it - why shouldn't we, right? And by the way, what is it again?

Let's step back a moment and try to classify what the core problems associated with the Hype Cycle are:

  • These buzzwords actually drive business decisions, regardless of the level of education or understanding associated with them.
  • There is an undercurrent of peer pressure that tends to 'force' people into making those decisions - decisions they weren't ready to make (because they didn't have time to evaluate the situation properly).
  • The hype tends to drown out most other types of discussions associated either with the technology in question or the real trends of what's happening across most enterprises. And I characterize these as 'real' because they represent common challenges which aren't necessarily product-driven (thus not good candidates for hype).
  • Premature adoption based on hype cycles often has the opposite effect on a particular technology area - it stifles it, as too much bad word-of-mouth feedback circulates and what might otherwise be a promising field of practice languishes as a result (e-learning is the best example of this I can think of).

How does all of this impact or otherwise inhibit Innovation? Well, here are some things to think about:

  • Some Hype Trends are not in fact very innovative, yet if everyone is doing it - then folks trying to introduce truly innovative techniques or products may be drowned out or suppressed. 
  • Most Hype Cycles tend to pose the key buzzword focus area as a "silver bullet" solution - and as those of us who have practiced IT for a while can attest, there are few if any actual silver bullet solutions. Similar to the Heisenberg uncertainty principle (and we're not referring to Breaking Bad here), the introduction of a new element impacts the existing elements in unanticipated ways (well, this isn't an exact analogy, but it's close enough). The whole of IT might be viewed as "Magic Happens Here" by outsiders, but inside we know there is a constant struggle to impose order over chaos - silver bullets are often disruptive. Disruption can be good, but not if you don't understand its nature before you go whole hog (so to speak).
  • Hype & Buzzwords tend to make people think situations are simpler than they really are, and in some senses actually discourage the necessary analysis that should be occurring when adopting new technology. Innovation cannot be sustained when and where it becomes too expensive to manage.   


Will we ever escape the grip of unreasoning Hype in IT? Will our lives be forever ruled by an unrelenting cascade of product-focused buzzwords? Who knows - IT is still in its infancy, and unlike most other professions we have the opportunity to reinvent ourselves on an almost continual basis - so anything is possible.



Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

Monday, August 18, 2014

The Innovation Dilemma

Innovation is perhaps today’s ultimate buzzword and most over-hyped topic. People can’t seem to get enough of articles and dialog on how Innovation is the answer to any number of potential issues – yet in all the countless discussions occurring online and in print about this topic, how well do any of the folks discussing innovation really understand it? That’s part one of the dilemma. Part two of the dilemma is that for all the lip service about how Innovation needs to be fostered, are we actually in fact fostering it in any meaningful ways (or perhaps worse yet, might we in fact be hindering it through current trends)? We will examine both parts of this question in today’s post.

Let’s start by helping to define what Innovation actually represents in a meaningful context. We’ll begin this by explaining first what innovation is not; it is not:

  • A marketing slogan
  • A collection of admirable ideas awaiting exploitation
  • The province of rarefied genius or Silicon Valley risk takers
  • Accidental or otherwise random in nature 
  • And lastly – Innovation is not thought (e.g. Innovative Thinking) – thought without application perhaps qualifies as day-dreaming. Innovation is Actionable Thought embedded within the context of a larger problem solving activity. 

Innovation is a process, not an individual event. That process has a 'macro' or Global perspective as well as a Local perspective. In other words, the Global process of Innovation encompasses all of the Local processes - the smaller efforts impact the cumulative achievements at the Global level. There are also often synergistic inter-relationships between various local innovation "threads."

An example of a complex Innovation process

Definitions (Innovation in Theory):
Innovation – This represents the deliberate (reproducible, consistent) process associated with solving specific problems. The process is evolutionary, incremental and focused on specific, well-defined goals. The key concept here is that Innovation is not an anomalous or ephemeral activity; it is most definitely not “magic happens here.”

Local Innovation – Any individual application of an innovation process within a closed community/entity. This does not imply that the community or entity is somehow cut off from the global community, merely that it has its own unique charter.

Global Innovation – Any number of local communities may be working to solve the same problems. These problems can be referred to as innovation threads. The level of collaboration or cooperation will vary between these communities, yet on the whole there is usually some information exchange that at times will allow individual local innovation to influence or otherwise contribute to global innovation progress (and conversely, progress acknowledged at the Global level will of course influence any number of Local efforts).

Innovation Threads – An Innovation thread is the collective effort towards resolving a unique problem. Obviously, there are cases where one group defines a similar problem somewhat differently, but in general the progress made in one variation of a particular thread may be applicable to another similar one.

Innovation in Practice

One of the best examples of what differentiates innovation in popular mythology from Innovation in practice is the case of the Wright Brothers. Their story is not one of a handful of good ideas punctuated by the glorious realization of their dreams of flight, but rather one of many years of tireless work and a massive amount of invention that had to occur in order to achieve a specific goal associated with one very famous problem – "how to achieve powered flight."

The Wright B Flyer

The Wright Brothers didn’t just look at a bird and shout “Eureka.” They redefined the science of aerodynamics, testing hundreds of airfoil designs. To do this they had to redefine the mathematics of aerodynamics, build their own wind tunnel and more. In short, they had to solve hundreds of related problems in order to resolve the main problem that started their quest. Theirs was an example of local innovation – yet it was so profound that it completely redefined the Global scope of innovation for aerodynamics. And we still fly in planes based on their designs and principles today.

Another important characteristic of what the Wright Brothers did was its entirely practical focus. Everything they did was goal-focused. This differentiates it from many research and development programs that have arisen over the past 50 to 75 years, in that oftentimes research programs do not have specific, tangible goals in mind (in other words, they are not entirely pragmatic in nature).

Dilemma 2
Now that we might have a better idea of what Innovation actually represents, let’s consider whether we as a society are actually encouraging or discouraging it. To do that we need to consider a couple of related questions, including:

  1. Can innovation be taught, and if so how would that happen?
  2. What sort of incentives might help to spur innovation?
  3. What might represent disincentives for innovation?


We will answer these questions one at a time…

Can Innovation be Taught?
Yes, it can (and we will explore that topic in more detail in a future post). But is our current expectation of what constitutes education that fosters innovation accurate? Well, no. In our previous example of practical innovation (the Wright Brothers), the main idea was that the entire exercise was problem-focused. What they learned, invented and achieved was all focused on a central goal. The vast majority of education today is, in contrast, not goal-focused. Moreover, it tends to be highly standardized, and this trend is getting worse every year. There are some exceptions of course, but for the most part our educational systems today judge students based upon conformity of thought as evidenced through an ever-expanding list of assessment tests. This shift towards assessment testing has a chilling overall effect on curriculum, making it more and more abstract and less focused on systematic problem-solving. In the United States, we are now teaching almost entirely to the test.

The way most experts have framed education that might somehow foster innovation is by declaring that education ought to include more science and math (STEM). So, remarkably, all of the focus towards achieving more innovation has been directed at what is being taught as opposed to how it is being taught. There is an obvious flaw in this logic that is borne out in almost every field of practical application. This is a massive and complex topic and we are of course just skimming the surface.

What sort of Incentives might encourage Innovation?
Well, this might include incentives both within education and in industry. Incentives within education might include rewarding problem-solving skills in assessment or college admission, and structuring curriculum to encourage the development of problem-solving skills. These types of skills are in some ways diametrically opposed to the type of assessment testing which is currently so popular. The idea, of course, is to be able to question rather than mimic current thinking in order to develop the types of new perspectives needed to progress beyond current capabilities.

In industry, many assessments are more or less natural – in other words, solutions that effectively solve problems become popular and profitable. However, many such solutions can’t get funding to reach that point – so one area that can be improved is access to capital (in both the government and private sectors; and in the federal sector, R&D can become entirely problem-focused rather than random).

What things tend to discourage Innovation?

A misdirected educational system, as discussed already, represents a serious discouragement to fostering innovation, but it is not the only problem. Other issues include:

  • A misguided dilution of labor incentives. This sounds a little complex, but what it means is that since about the year 2000, a two-tier technology labor scenario has arisen in the United States. The introduction of temporary visas, based upon the mistaken notion that there was a technology labor shortage, has in fact displaced several million technology workers here and resulted in the introduction of millions of IT workers who get paid roughly half of what the standard wage would otherwise be. Add this to off-shoring, and we have created an environment of uncertainty in an area where we should be fostering confidence in terms of securing a large, stable technology workforce.
  • Within individual organizations, despite the hype that seems to imply otherwise, risk-taking and divergent solution approaches are most often discouraged. For organizations to become innovative there generally needs to be some cultural transformation – this is very difficult to achieve. 

The goal with this post was to help refine the dialog about innovation a bit – to get beyond the platitudes and start discussing how it can or should work. We will revisit this topic again in coming months…



Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

Friday, August 8, 2014

Building Effective IT Strategy - part 3

In our last two posts on IT Strategy, we highlighted how Strategy can be structured, how it differs from Tactics (although we will explore that in more depth in this post) and how to employ a consistent process towards developing strategy. The first step involved determining what the core strategic approach might be; the second focused on goal-setting. The third and most difficult part is assigning actions to goals...

So, using our Big Data case study, how would we begin to translate the higher-level goals into definitive actions, and then what types of tactics might be used to carry out those actions?


This illustration highlights where Big Data fits within a larger set of Strategic elements in an overall Transformation initiative. This type of representation helps to define relationships, dependencies and quantifies where work needs to happen once the higher level goal-setting has been defined.  

The types of actions that might be involved with actualizing a Big Data Strategy might include the following:

  • Creation of a team or center of excellence to manage the technology / project
  • Definition and Deployment of a proof of concept 
  • Acquisition of the raw data intended for use in the Big Data solution (for an energy company, say, this might include SmartGrid sensor information).
  • Acquisition and / or development of the Big Data Platform
All of these possible actions of course imply a number of key decisions that must be made; the following are a few examples:
  • Determination of Big Data technology to use (triple store, key value etc.)
  • Determination / selection of a Big Data solution hardware platform
  • Determination of modeling or data profiling approach
  • Choice of BI platform for data visualization

All of this information is going to be necessary in order to complete detailed roadmaps and ensure accurate estimates for those who manage the IT portfolio process in any given organization. Actions can then begin to be translated into milestones with traceable costs. Those Action-Milestones are then mapped specifically to goals/objectives previously identified in the higher level strategy.
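Here is a small, purely illustrative sketch of that action-to-goal mapping (the milestones, goals and cost figures are invented); rolling costs up by goal is what lets portfolio managers compare investment against the objectives set in the higher-level strategy:

```python
# Illustrative only: costed action-milestones traced back to strategy goals.
from collections import defaultdict

milestones = [
    # (milestone, traces_to_goal, estimated_cost_usd)
    ("Stand up Big Data center of excellence", "Improve analytics capability", 250_000),
    ("Deliver proof of concept on SmartGrid sensor data", "Improve analytics capability", 120_000),
    ("Select and deploy Big Data platform", "Modernize data infrastructure", 800_000),
]

# Roll estimated costs up to the goal level for portfolio planning.
cost_by_goal = defaultdict(int)
for milestone, goal, cost in milestones:
    cost_by_goal[goal] += cost

for goal, cost in cost_by_goal.items():
    print(f"{goal}: ${cost:,}")
```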

Now, how does action-to-goal alignment involve Tactics? In the case we've introduced, and in most others, the tactics are the core tools for decision-making. So, for all of the decisions listed above, individual analyses of alternatives might be conducted. For product decisions, run-offs / competitions / evaluations and source selection processes are applied. For design considerations, an architecture approach is applied. All of these activities can also fit within a lifecycle process - all of this represents Tactics. Why? Because we could use roughly the same lifecycle approaches for any type of technology - whether it is UAV development, Quantum Computing or building a SharePoint portal. Tactics are the interchangeable actualization toolset for all strategy.

The hardest part of aligning Strategy, sub-strategies and tactics is when you find yourself in a very large transformation effort (one perhaps dealing with hundreds of systems, dozens of technologies and perhaps thousands of people). There is no single solution, tool or approach for managing that - it represents what computer scientists often refer to as an intractable problem (NP-hard). We will look at IT Transformation and intense complexity in an upcoming post.


Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

Thursday, August 7, 2014

Building Effective IT Strategy - part 2

In yesterday's post, I introduced three elementary categories for IT Strategy:

  1. Product 
  2. Portfolio
  3. Transformation

I then pointed out that each of these follows a basic cycle of:

  1. Determination of Strategic Approach
  2. Goal-Setting
  3. Assigning actions to goals

So, now as promised, I will use this to take a look at a specific scenario. But before we begin, let me mention an important caveat to the three categories listed above. While all strategy tends to fit within these, not all strategy has to exist at the highest level. In other words, there are various levels of Strategy that are still abstract enough to remain differentiated from Tactics.

In this case study example, we are going to look at Big Data. Big Data is something that many organizations might consider important enough and large enough to develop a strategy around. However, just looking at the moniker "Big Data," one might instantly wonder - well, doesn't that belong as part of a larger "Data Strategy"? Yes. And then wouldn't that also imply that the Data Strategy would be part of a larger Strategy? Again the answer is yes - and in this case the whole thing lines up neatly like this:

  • Portfolio Strategy
    •  Data Strategy
      • Big Data Strategy

So, this begs the question: how might we first differentiate the lower-level strategy from the higher-level strategy, and then, perhaps even more importantly, how do you ensure they stay in alignment?

Differentiation:
This tends to be managed something like this - you begin at the top level with the superstructure of where everything is supposed to fit, as well as common capability / design principles and objectives. Then as you move down the Strategy levels, things progress from conceptual expectations to logical descriptions. The top level is the most open or flexible; the bottom level is the closest to expectations regarding solution execution.

Reconciliation:
Side by side with the differentiation activity is integrated road-mapping - each level below fitting neatly into the one above. The other big component here is alignment of Strategy with Architecture which provides the other key reconciliation tool (if used properly).

So, how would a Big Data Strategy fit into an Enterprise Data Strategy? First, in most cases it obviously extends something that already exists. This then implies either the replacement of existing capabilities or the addition of new ones. Now let's jump back to the process we mentioned earlier - step 1, determining the strategic approach (here, a portfolio strategy), is complete. How would we attack goal-setting for Big Data?

This, by the way, is where perhaps more than half of the organizations trying to adopt Big Data solutions are getting tripped up right now. A poor example of goal-setting would be: "let's do a POC without a clear path for how to exploit this technology yet" (mainly because everyone else seems to be doing it). A better approach might be:

  1. Define the set of possible Use Cases associated with your organization (where Big Data might make an impact)
  2. Choose one or two that can be effectively demonstrated and measured - let's say one might be the rapid development of a user driven BI solution based on unstructured (web/social media) data. 
  3. Develop a clear path as to how: a) the initial capability could be rolled into the larger existing ecosystem to avoid silos or solution-fracking, and b) new functionality could be added to the emerging Big Data solution, consistent with overarching organizational goals.
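As a rough illustration of steps 1 and 2 (the use cases, criteria and scores below are invented), candidate use cases can be enumerated and ranked on how demonstrable and measurable they are before any POC is committed to:

```python
# Illustrative only: score candidate Big Data use cases before picking one or two.
use_cases = [
    {"name": "User-driven BI on social media data", "demonstrable": 5, "measurable_impact": 4},
    {"name": "Sensor log archiving",                "demonstrable": 4, "measurable_impact": 2},
    {"name": "Real-time fraud scoring",             "demonstrable": 2, "measurable_impact": 5},
]

def score(uc: dict) -> int:
    # Equal weighting here; a real exercise would weight criteria per organizational goals.
    return uc["demonstrable"] + uc["measurable_impact"]

# Shortlist the one or two use cases that can best be demonstrated and measured.
shortlist = sorted(use_cases, key=score, reverse=True)[:2]
for uc in shortlist:
    print(uc["name"], "->", score(uc))
```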

Step 3 is assigning actions to goals. We'll take a look at that in our next post...




Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

Wednesday, August 6, 2014

Building Effective IT Strategy - part 1

Every great endeavor begins with a strategy - or so the saying goes. That may be true, but what form did the strategy take: an idea, a single spoken command, a drawing on a napkin? How do we quantify precisely what Strategy represents?

In military history, Strategy is the highest level of planning - the combination of complex goal-setting and the definition of an over-arching approach designed to achieve said goals. In the American Civil War, the Union leadership (General Winfield Scott, with Lincoln's backing) decided early on that the North must cut the Confederacy in two by taking the Mississippi River and stifle Southern commerce with a massive naval blockade. This strategy was even given a name - the Anaconda Plan. The rest of what happened in the war was mainly tactical in nature - so in the case of the military analogy, Tactics are the detailed actions necessary to fulfill elements of the larger Strategy.

The interesting aspect of Tactics is that they tend to be "reusable components." In other words, you develop tactics that can be used regardless of the Strategy that may employ them. This metaphor translates well from the military analogy over into real-world IT.

Lucky for us, the world of IT isn't much like war except in the sense that there is quite a lot of chaos and a need for planning to manage complex situations. Let's try then to define IT Strategy...

IT Strategy is the ongoing effort to guide organizational exploitation of technology over a multi-year period. It is ongoing because in IT (which is hopefully not the case for war) there is no definitive end-state goal. In other words, the end state is always moving to the right, reflecting the evolution that has already occurred as well as the oncoming waves of newer disruptive technologies.

That's the high level view, but IT has its own unique spin on Strategy which makes it possibly more divergent from the war analogy. In IT there are several distinct types of Strategy; these include:

  1. Product Strategy - Focused (perhaps analogous to a Theater strategy in war)
  2. Portfolio Strategy (otherwise referred to as Capability Strategy) - Comprehensive
  3. Integration / Transformation Strategy - This is not just focused on solution integration - it is the larger question of how to redefine and reconcile an entire portfolio 
Depending on the organization in question, it might require only one or perhaps all of these types of strategy at any given time. A software company (one that is solely focused on a single software product, let's say) would definitely want to use the first type of Strategy to help define its product roadmap, but might not need the other two.


So, step 1 is determining what type of Strategy is required. Step 2, regardless of the strategy category, is a goal-setting exercise. Step 3 is assigning actions to goals. In our next post, we will use a Case Study to look at Steps 2 and 3 in more detail.


Policy in the context of IT Strategy generally represents tactical guidance given on an organizational level - this tends to fall under either Portfolio or Transformation Strategy (or both)


Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

Tuesday, August 5, 2014

What is Digital Transformation?

It sounds a bit misleading perhaps - maybe one gets the impression that this is something akin to rendering special effects for movies. In mainstream IT however, Digital Transformation has come to mean something quite different.

The shorthand definition is this: "the coordinated integration of all customer-facing digital capabilities." This includes mobile, web and social media. There is perhaps an implied level of underlying integration supporting all of it, but that isn't always required - at least not at first.

We use the word "customer" here a bit loosely, as Customer can refer to any users or members of a certain organization. So, a government entity might consider citizens as customers and the military could consider its rank and file 'customers' in this sense also. Digital Transformation can occur in just about any type of organization specifically because the set of technologies involved is common across industries.

Ten years ago, a 'Digital Transformation' would likely have been referred to as a "Portal Strategy." At the time, it was thought that the portal metaphor (and the associated software products) would suffice to meet all customer-facing needs. A typical portal might have included the following elements in 2005:

  • Extranet
  • Intranet
  • Collaboration tools
  • Content Management tools

What happened since then? Well, at least two and possibly three revolutions in technologies with more on the cusp; the key revolutions include:

  • Mobile - not laptops of course, but smartphones and other mobile devices
  • Social Media - some re-branding of collaborative capabilities, merged onto smartphones as well
  • Cloud Computing - a re-branding of virtualization - and, importantly, opening that up to customer-facing apps (web and mobile)
  • Big Data is also coming on strong (although to be honest, most places still don't know how to use it, so it's an outlier with respect to customer-facing technology).

On the backend of course are all the systems that actually make the organization go - but for the customer this is like the engine within the chassis. Most customers don't care what's under the hood until or unless it fails to perform.

Digital Transformation is a lot like enterprise integration, but on the customer-facing side of things. It is the alignment, coordination and updating, in tandem, of the core mission apps and interfaces. And it can potentially include the Web of Things as well. So, for example, if you have a grocery store, then perhaps there are kiosks inside, gas pumps outside, nifty new cooler video displays and so on. All of that technology is customer-facing, so for a true transformation it all must be managed within a shared context. The transformation has one goal - to go from a heterogeneous mix of random technologies to a holistic solution designed to facilitate strategic goals.

So, how does an organization undergo a Digital Transformation? We will look at this in more detail in one of our upcoming posts.


Some organizations (like larger retailers) face dual Digital Transformations, inasmuch as they must serve both traditional customers and the needs of many thousands of employees (with separate capabilities), as in the example above targeted to employees...


Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas
#TechnovationTalks

Sunday, August 3, 2014

The 5 Rules of IT Architecture


It is perplexing that nearly 20 years after the emergence of IT Architecture as a discipline, there is still so much confusion surrounding what architects do (or more precisely, what they are supposed to do).

Many people in IT refer to themselves as "Architects" now - yet how can employers or colleagues quantify or otherwise validate that they are indeed actually Architects? I will propose five simple rules or tests to help clarify this:

Rule 1 - Architects architect, just as writers write. Architecture is by definition a design process. Any architect who doesn't or can't produce designs is not functioning in the assigned role. An important corollary to this is that you cannot be an effective Architect and focus on only one thing in IT - for example, one tool or product. Architecture is a discipline, not SME knowledge in one particular area. The reason why this is the case is that most of the time, focus on one product leads to a very myopic view of how to solve a problem (every issue begins to look like a nail for your one hammer).

Rule 2 - Architects follow a design process; whether it is geared towards EA frameworks or application design is somewhat irrelevant. What is crucial is that an approach is followed. Not having an approach means not having standard design expectations or deliverables. A lack of design diligence is analogous to a brick and mortar architect using cocktail napkins instead of blueprints to design skyscrapers.

Rule 3 - Architects are honest brokers, not blind followers. Why is this important? Well, it matters because the design process is where most key IT decisions get made. The Architect must take a certain level of responsibility for the outcomes associated with their work - just as brick and mortar architects do. If a client building a Summer house on the beach demands that the structure be built atop sand without concrete or wooden supports, the Architect must let them know the house will likely suffer a structural failure shortly after completion (if not before).

Rule 4 - Architects are problem solvers. Anyone working as an architect who deliberately avoids facing and resolving the tough issues is unlikely to achieve any sort of measurable project success.

Rule 5 - Architects must be able to communicate with the organization they are serving. This applies both to having effective personal communication skills and to the projects themselves. Projects that aren't transparent or collaborative tend to take longer and experience higher failure rates.

IT Architects tend to be placed in high-profile, high-pressure roles. We who serve as IT Architects are constantly being judged, not only personally but also as a profession. Part of the reason for the latter is the inconsistency in outcomes for architecture-related projects.

However, if both Architects and their colleagues used the 5 rules above to define what should be happening, the outcomes for most architecture projects would improve dramatically.


Design requires visualization - effective visualizations can be notation-driven or conceptual (as in the above example) 


Copyright 2014, Stephen Lahanas


#Semantech
#StephenLahanas

Friday, December 13, 2013

The Challenges to Global Collaboration

There has been a lot of discussion lately on the web about how we're beginning to achieve the decades-old goal of consistent global collaboration for innovation and problem-solving. The celebration may be a bit premature, not unlike prior victory dances we engaged in related to:

  • Cancer Research
  • Flying Cars or Electric Cars for that matter
  • Space travel
  • AIDS cures
  • Big Data changing the world
  • Artificial Intelligence and so on...
We have had a bad habit lately of confusing the initiation of a trend or innovation with its mature realization. The technical foundation for global collaboration has been developed and deployed over nearly four decades. That's how long it has taken to define the network communications paradigm and deploy the necessary bandwidth, information resources and devices to make such a lofty goal possible. But then again, perhaps we need to step back and ask ourselves what Global Collaboration really is and identify the remaining barriers that may otherwise hold it back.



Global Collaboration represents the exploitation of both technological assets and cultural predispositions to share knowledge and jointly resolve common challenges. Now, this model isn't entirely new, is it? Academia has been doing this for centuries - sort of. Within higher education there is (and always has been) a tacit expectation for various types of cross-institutional mind-share. However, that expectation has always been restricted by the following factors:
  1. Competition for ownership of ideas.
  2. Limits in the ability to communicate (technology).
  3. Limits in access to the right information (technology and cultural boundaries or secrecy - and this factor is of course related to factor 1).  
  4. A limited working model in how to organize "virtual communities." In the old days, these communities were characterized as scientific societies and much of what they accomplished was based on direct point to point correspondence, journals and meetings. We are still working with all of those metaphors today even though we have witnessed the birth of real-time communities powered by the combination of social media and mobile devices. 
  5. Orthodoxy. This may sound odd, but in fact it is orthodoxy that puts the institution into "institutionalism." In other words, it represents the ultimate barrier to acceptance of or even discussion of unorthodox or disruptive concepts. 

As much as we'd like to think that modern society is open-minded and innovative - and that we're hurtling from one innovation to another at breakneck speed - that's just not the case. The telephone was invented in the 1870s, radio communication before 1920, television around 1930, computers in the 1940s and miniature transistors in the 1950s - yet the journey from those various inventions to a mobile device that combines them took more than 60 years. The story of the electric car is much worse - it was invented at roughly the same time the internal combustion approach was worked out - yet one was promoted and the other neglected. We don't typically move nearly as fast as we think we do, and we often make poor choices along the way.

What does this have to do with Global Collaboration? Well, the premise goes that if dozens or hundreds or even thousands of minds were directed at a problem, then the solutions to that problem would happen faster and be vetted better. It would no longer be Edison versus Tesla, but countless inventors competing and collaborating on a level playing field - or at least that's the idea. The closest thing to this model that we have now is Open Source software.

But does the Open Source software model represent the type of Global Collaboration that futurists have been predicting for the past half century?  The answer is yes and no.

Yes, it represents a prototype of highly specialized working collaboration (on a global scale) - no it doesn't seem to represent the prototype for making truly revolutionary breakthroughs. So, why isn't the "Open Source" model redefining innovation and progress in software or science? Here are some possible explanations:

  1. The (majority of) projects are too narrowly defined to make real breakthroughs.
  2. The context (software development) is too limited. 
  3. Coding is focused on skill in execution - and problem solving at a tactical scale. It doesn't usually require any quantum leaps in conceptual understanding or practical application.
If you think I'm being harsh about the Open Source movement as a force for global innovation - then ask yourselves how much of the core technology that is powering our current infrastructure came out of it:
  • Well, they gave us MySQL - and SQL came from previous working groups - but it was also very similar to existing products.
  • What about Java, Linux, Apache etc. All good stuff, but not truly disruptive - the seeds for all of that had been developed elsewhere. 
  • Big Data, Cloud tech - again, the same as above.

Lots of good stuff is coming out of the Open Source movement - it's just not that revolutionary, and much of what has been revolutionary has come out of the older-style collaborative ecosystems like Academia and DoD (or combinations thereof).

So, now with all of that as preface, here is a list of some of the challenges that I think are holding back global collaboration (at least the way it has been envisioned):

  1. Competition can be both beneficial and destructive. We need to find a better way to manage it and ensure open playing fields for smaller players. In other words, how can we provide tangible rewards for contribution that don't end up excluding the majority of participants?
  2. Secrecy -  One of the lasting holdovers of the Cold War is a subculture of secrecy that seems to be present in almost every major nation on the planet. This represents perhaps the single biggest obstacle to the eventual global cooperation that futurists tend to describe. Solving problems together requires a level of trust that simply doesn't exist yet. Hopefully it will someday.
  3. Orthodoxy - How many times have brilliant ideas been dismissed because the community in which they were introduced rejected them? More times than can be counted, no doubt. The biggest barrier to most innovation is, and always has been, a lack of imagination. When careers depend on defending existing paradigms, newer paradigms take longer to be born - or die in the cradle.
  4. A framework or methodology for global problem solving. There are perhaps thousands of these or similar approaches floating around (from academia to open source software, for example), but a truly effective one remains elusive.

My next post will explore what "a truly effective methodology for global problem solving" might look like.




Copyright 2013, Stephen Lahanas

#StephenLahanas
#TechnovationTalks
#SemantechInc.

Sunday, November 10, 2013

Understanding Data Architecture

Someone asked me what at first sounded like a very straightforward question earlier this week: "what is Data Architecture?" - or more precisely, what does it mean to you? I'm not usually at a loss for words when it comes to expounding upon IT Architecture related topics, but it occurred to me at that moment that my previous understanding of what Data Architecture really represents is, or has been, a little flawed or perhaps just outdated. So I gave a somewhat convoluted and roundabout answer.

Where does Architecture fit within this picture?

The nature of what's occurring in the Data domain within IT is itself changing - very quickly and somewhat radically. The rise of Big Data and the proliferation of user-driven discovery tools represent quite a departure from the previous, more deterministic view of how data ought to be organized, processed and harvested. So how does all of this affect Data Architecture as a practice within IT (or more specifically, within IT Architecture)?

But before we dive into the implications of the current revolution and its subsequent democratizing of data, we need to step back and look again at the more traditional definitions of what Data Architecture represents. I'll start with a high-level summary view:

Traditional Data Architecture can be divided into two main focus areas: 1 - the structure of the data itself, and 2 - the systems view of whatever components are utilized to exploit the data contained within the systems. Data in itself is the semantic representation or shorthand of the processes, functions or activities that an organization is involved with. Data has traditionally been subdivided (at least for the past several decades) into two categories: transactional and knowledge-based or analytic (OLTP vs. OLAP).
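To make that transactional vs. analytic split concrete, here is a minimal sketch (the schema is invented): an OLTP store captures individual order rows, while an OLAP-style view aggregates the same facts along a dimension:

```python
# Illustrative only: one set of facts, two shapes.
orders_oltp = [  # one row per transaction, optimized for inserts and updates
    {"order_id": 1, "customer": "Acme",   "region": "East", "amount": 120.00},
    {"order_id": 2, "customer": "Acme",   "region": "East", "amount": 80.00},
    {"order_id": 3, "customer": "Zenith", "region": "West", "amount": 200.00},
]

# An analytic (OLAP-style) view: aggregate the same facts along the region dimension.
sales_by_region = {}
for row in orders_oltp:
    sales_by_region[row["region"]] = sales_by_region.get(row["region"], 0) + row["amount"]

print(sales_by_region)  # {'East': 200.0, 'West': 200.0}
```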
Now we'll move to a traditional summary definition of Data Architecture practice:

Data Architecture is the practice of managing both the design of data and the design of the systems which house or exploit that data. As such, this practice area revolves around management of data models and architecture models. Unfortunately, the application of Governance within this practice is sporadic, and when it does occur it is often split into two views: governance of the data (models) and governance of systems (patterns and configurations).
So, that seems to be fairly comprehensive; but is it? Where does Business Intelligence fit in - is it part of data management or system management? Is it purely knowledge-focused or does it also include transactional data? For that matter, do Data Warehouses only concern themselves with analytic data, or can they be used to pass transactional data through to other consumers? And isn't Big Data both transactional and analytic in nature? And by the way, how do you model Big Data solutions, either from a systems or a data modeling standpoint? Now we begin to see how things can get confusing.

We also need to take into consideration that there has been an attempt to standardize some of this from an industry perspective - it's referred to as the Data Management Body of Knowledge, or DMBOK. I think in some ways it's been successful in attempting to lay out an industry taxonomy (much like ITIL did) but not as successful in linking that back into the practice of Data Architecture. The following diagram represents an attempt to map the two together...


There isn't a one-to-one mapping between DMBOK and data architecture practice, but it's close
One of the areas where the DMBOK has fallen short is Big Data; my guess is that they will need to rethink their framework once again relatively soon to accommodate what's happening in the real world. In the diagram above, we have a somewhat idealized view in that we've targeted a unified governance approach for both data modeling and data systems.

Let's take a moment and discuss the challenges presented by the advent of new Big Data and BI technology. We'll start with BI - let's say your organization is using Oracle's BI suite, Oracle Business Intelligence Enterprise Edition (OBIEE). Within OBIEE you have a more or less semantic / metadata management tool called the Common Enterprise Information Model (CEIM). It produces a file (or files) that maps out the business functionality of all the reports or dashboards associated with the solution. Where does that fit from an architecture standpoint? It has a modeling-like interface, but it isn't a 3rd normal form model or even a dimensional model. It represents a proprietary Oracle approach (both as an interface and as a modeling approach). It allows you to track dimensions, data hierarchies and data structures - so it is a viable architecture management tool for BI (at least for OBIEE instantiations). But some traditional Data Architecture groups would not view this as something the architects would manage - it might be handed off to OBIEE administrators. This situation is not unique to Oracle, of course; it applies to IBM / Cognos and other BI tools as well, and there's a whole new class of tools that are completely driven by end users (rather than structured in advance by an IT group).

Now let's look at Big Data. Many of the Big Data tools require command-line management and programming in order to create or change core data structures. There is no standard modeling approach for Big Data, as it encompasses at least five different major approaches (as different, say, as 3NF is from Dimensional). How does an architecture group manage this? Right now, in most cases it's not managed as data architecture but more as data systems architecture. The problem here is obvious: just as organizations have finally gained some insight into the data they own or manage, a giant new elephant has entered the room. How is that new capability going to impact the rest of the enterprise - how can it be managed effectively?

Back to the original question - what is Data Architecture? I'd like to suggest that the practice of Data Architecture is more than the sum of its traditional activities. Data Architecture is the practice of understanding, managing and properly exploiting data in the context of the problems any given organization has to solve. It is not limited by prior classifications or practice, but has a consistent mandate to represent, and hopefully govern in some fashion, data as an asset (internally or shared collaboratively). Data Architecture as we know it is going to change quite a bit in the next two years, and that's a very good thing.



Copyright 2013, Stephen Lahanas



#Semantech
#StephenLahanas

Monday, November 4, 2013

The Value of Architecture Assessments

While many people are becoming somewhat familiar with IT or Enterprise Architecture, relatively few know much about Architecture Assessments. This is unfortunate given the significant value proposition such exercises provide. For example, had one or more Architecture Assessments been performed for Healthcare.gov during the course of the project, it is unlikely that the Obama administration would have been surprised by the mess it's now facing.

Problem Space analysis is one of the techniques used with assessments - it can apply both to the business and technical aspects of a project

An Architecture Assessment also differs from other traditional Architecture activities in that the expectation is generally that third-party personnel will perform it. The reasons for this include the following:

  1. Assessments are one of the key tools involved in IT oversight activities (sometimes referred to as Independent Validation and Verification or IV&V).
  2. It is more likely that an accurate assessment can be obtained by architects / investigators without a vested interest in the project. 
  3. The skillset of the person doing the assessment is critical - it needs to be an architect and not merely a technical or product expert. This is the only way to ensure that all options / alternatives are properly considered / assessed. 

So what exactly is an Architecture Assessment? A typical assessment tends to include the following categories of activity:

  • Information Gathering  
  • Design & Project Review
  • Design & Project Recommendations
An assessment also typically includes one or more types of specific analysis as well:
  • Analysis of Alternatives
  • Root Cause Analysis
  • Problem Space Mapping & Resolution 
One way to capture alternatives is through Decision Trees.
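As a simple illustration of capturing alternatives as a decision tree (the questions and alternatives below are invented, not a prescribed method), an assessment team might encode an analysis of alternatives like this:

```python
# Illustrative only: a tiny decision tree for an analysis of alternatives.
decision_tree = {
    "question": "Is the workload primarily analytic?",
    "yes": {
        "question": "Is the data largely unstructured?",
        "yes": "Alternative A: distributed file store + batch processing",
        "no":  "Alternative B: columnar data warehouse",
    },
    "no": "Alternative C: relational OLTP platform",
}

def walk(node, answers):
    """Follow yes/no answers through the tree until an alternative (a leaf) is reached."""
    while isinstance(node, dict):
        node = node["yes" if answers.pop(0) else "no"]
    return node

print(walk(decision_tree, [True, True]))  # Alternative A: distributed file store + batch processing
```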

Perhaps the most important function that an Architecture Assessment can provide is a mechanism to challenge assumptions and combat complacency. One of the main reasons that IT projects fail today is that there is generally no incentive or expectation to raise issues or problems. Rather than being viewed as a healthy activity, identification of problems is feared in itself - which ensures even more pain later on when the issues finally surface (which they always do).


When compared to the cost of the rest of a typical IT project (and any potential loss or cost overruns associated with problems that aren't identified or managed in a timely fashion), a relatively brief assessment exercise generally amounts to less than 1% of total cost, yet it could make the difference between success and failure...



Copyright 2013, Stephen Lahanas


#Semantech
#ITarchitecture