The Dirty Dozen - Avoiding Twelve Common Mistakes in Cognitive System Design
The Dirty Dozen rights are property of MGM - 1967


There is considerable excitement around AI, Machine Learning, and Cognitive. My fellow architects and I advise and assist with the creation of Cognitive- and AI-powered solutions. We have the privilege of working with great people from small startups, systems integrators (SIs), and Fortune 500 companies.

Most projects run very well, and we are amazed at the wealth of knowledge and talent our colleagues, customers, and partners bring to the table. But we also see areas where there is room for improvement. Below is a summary of “The Dirty Dozen” that we hope will help you keep an eye out for risks on your next Cognitive innovation project.

1.    POC – JUST WHAT ARE WE TRYING TO PROVE?  

Proofs of Concept (POCs) are a terrific way to build momentum and advance the cause. One common danger is organizations spending time and money on POCs that do not focus on the most interesting or challenging components of the project. Good examples of POCs might include: establishing whether signal actually exists in the data; checking whether a new technology is compatible with existing systems, data, and stakeholders; or trying new ways to integrate new technologies together. A POC that simply familiarizes an organization with well-established Cognitive patterns (e.g. a lightweight chatbot) may be a wasted opportunity. It’s wise to consider the opportunity cost of any POC. Ask “Exactly what are we trying to prove?” and know why.

2.    ANALYSIS PARALYSIS

Many projects, especially those at larger organizations, have an abundance of enthusiasm, great ideas, and supportive stakeholders – but get caught up in a cycle of business meetings, more calls, and more presentations. Weeks pass. Hundreds of hours are spent mulling over idea candidates and rescheduling the next meetings. In many of these cases, action is the best remedy! The Agile approach can also provide a good backdrop for translating talk into action. In some cases, reducing the number of active stakeholders for a period, and empowering those who remain to act and innovate, can unlock the effort and build positive energy and momentum. And where projects are not destined to happen, the teams can learn fast and fail fast.

3.    TOOL-CENTRIC THINKING

It’s an exciting time to be innovating with a growing array of dazzling tools in the AI, ML, and Cognitive space. A common mistake we see in planning meetings, however, is participants jumping to tool-talk too early in the process. During planning, too much time can be spent focusing on the latest/newest/coolest technology, and not enough time talking to each other (and actively listening) about overarching goals, or about how to integrate with existing technologies that are working well. If someone has a strong affinity for a technology (“I want to add Deep Learning ABC to a new BI stack”) without being able to explain why – caution should be taken. Thar be dragons.

4.    BEWARE OF CRICKETS

A great way to test whether a project is trending too “tool-centric” (as above), or lacks a holistic, strategic perspective, is to ask the stakeholders in the room to take a step back from the topic at hand and ask simply: “What problem are we solving?” Sometimes we hear crickets, or radically different points of view. In either case, the crickets’ song is a strong signal to put the tools down and start talking (and listening) to each other about the fundamentals of the project. Once stakeholders have a clear understanding of the bigger goals, the challenge being solved, and the key questions being asked, and can clearly describe the path to value creation, then it’s safe to start going deep into tool talk and architecting. Another way to test for clarity of purpose is to walk the team through the (tongue in cheek here) ‘Cognitive Arc of Destiny’ – a process that seeks to connect the various dots from data through to shareholder value. Or more simply: can the arc of the project be explained to a non-technical dinner party companion in less than one minute?

5.    DATA NEGLECT

For a sector that talks so much about dark data and data exhaust, teams scoping data-fueled projects sometimes don’t give much thought or respect to the source data! In reviewing project plans, we find teams often have not allotted enough time or manpower for tasks like understanding data sources; getting access to multiple data sources (InfoSec/paywall); understanding regulatory factors (PII) and achieving compliance; cleaning and formatting data; and checking for completeness of (and signal in) the data. In other cases, the POC-scale data has been handled well, but not enough thought has been given to operationalizing the system and handling data at larger orders of magnitude. Data munging is not fun, and often takes more resources than hoped, but it can be a critical success factor for the POC and the rest of the project.
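To make the “checking for completeness” step concrete, a few lines of Python can profile how complete each field of a dataset actually is before any modeling begins. This is a minimal sketch with made-up records – a real project would run the same idea against its actual sources:

```python
# Illustrative records only -- not real project data.
records = [
    {"id": "1", "claim_text": "rear bumper damage", "amount": "1200"},
    {"id": "2", "claim_text": "", "amount": "900"},
    {"id": "3", "claim_text": "windshield crack", "amount": ""},
]

def completeness(rows):
    """Return the fraction of non-empty values for each field."""
    fields = rows[0].keys()
    return {
        field: sum(1 for row in rows if row[field].strip()) / len(rows)
        for field in fields
    }

# claim_text and amount are each only 2/3 complete -- worth flagging
# before training, not after.
print(completeness(records))
```

Even a crude report like this, run on day one, surfaces gaps early enough to budget time for cleanup.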

6.    NO DRUMMERS & NO DRUMBEAT

Many projects have clarity of purpose, plenty of enthusiasm, and talented people ready to engage. But without a degree of organization and a regular project cadence (drumbeat), even the best projects can fail. Whether it’s a Scrum Master (agile) or an old-school, Gantt-chart-toting project manager, the right person is essential to beat the drum and keep the team moving toward incremental goals. If a Proof of Concept (POC) is small and clear, a formal PM role may not be required, or the duties may be covered by another team member – but when projects involve multiple organizations and more complex objectives with checkpoints (e.g. 30-60-90), staffing the role with the right person can be a key driver of success.

7.    IDEA INTOXICATION

When we do a good job of explaining tools and technology and talking about the “Art of the Possible,” great ideas often start to flow like wine. It’s intoxicating. Stakeholders come up with numerous use cases (very good ones) and start to see possibilities everywhere. It’s a fun point in the project – but it can also lead to Analysis Paralysis (see above) or Scope Creep (see below). Having a word for this state, and recognizing both the benefits and risks of having many ideas, can help manage this stage in the process. Another way to manage it is with the methods in the IBM Design Thinking framework, or similar methods from great firms like IDEO.

8.    FORGETTING FEEDBACK

Surprisingly, people who work in the field of machine learning sometimes forget some of the core principles around feedback loops! We know that no system will be perfect at launch. Ensuring projects have enough time to learn and improve, and that architectures have mechanisms for feedback and error correction, is essential. There is no magic here. We simply ask: “At first launch, what is the expected performance?” and “How do we think the system will improve three months later?” KPIs are key. Having yardsticks for success early in the project helps with scoping on the front end, and with performance on the tail end.
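One lightweight way to make the “three months later” question concrete is to record a launch-time baseline KPI and compare each subsequent measurement against it. The sketch below assumes F1 as the KPI; the metric name and figures are illustrative, not from any real project:

```python
def improvement_over_baseline(baseline, measurements):
    """Relative improvement of the latest KPI measurement vs. launch."""
    latest = measurements[-1]
    return (latest - baseline) / baseline

launch_f1 = 0.62                 # KPI measured at first launch
monthly_f1 = [0.64, 0.67, 0.71]  # three months of feedback-driven retraining

print(f"Improvement after 3 months: "
      f"{improvement_over_baseline(launch_f1, monthly_f1):.1%}")
```

Whatever the KPI, the point is the same: agree on the yardstick before launch, so the feedback loop has something to push against.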

9.    SUITS AND HOODIES – WE NEED BOTH!

The person wearing the hoodie, fingers on the keyboard, has a problem to solve, and the tool just needs to work! Twilio does a great job of solving specific problems and creating a smile inside the hoodie. While Cognitive tends to be a bit more complex than integrating a simple SMS service, the goal remains the same – to make sure that the developer can get the job done. Developers need access to good tech, documentation, SDKs, reference architectures, communities, and patterns to follow. But without the Suits, nascent (sometimes skunk-works) projects will never reach their full potential. The Suits, to support the mission, need to see a clear connection between the data, the services, and the time spent by the development team, and to see the cost-benefit translating into value for customers and shareholders. A cool “hoodie” technology without a clear business case or strategic alignment will fail to launch. Similarly, a top-down, management-driven “we’re doing Cognitive now” mandate that doesn’t consult the innovators and technical teams – or that is tone deaf to the culture of the organization – can result in lost quarters and lost opportunities.

10. KNOWLEDGE AND CULTURE

When a large organization is embracing new tools and methods, a clear plan for knowledge gathering and distribution is sensible. What knowledge is required? What knowledge exists today, and where are the gaps? Are there clusters of knowledge and talent already in the organization – and if so, how can the organization best amplify them? Having a single-digit percentage of employees knowledgeable in Cognitive is a good start, but not a recipe for success. Companies need to consider how to encourage visibility. IBM ran a Cognitive Build program in 2016 that engaged not just the IBM Watson business units, but more than a quarter million employees across the world. The program raised awareness, educated, and created new connections inside the organization to enable the transformation of all business units.

On culture – as the old saying goes, “Culture Eats Strategy for Lunch.” When large organizations with decades of success are trying new technologies and looking at innovation in new ways, organizational culture is going to make or break any innovation program. Understanding how stakeholders are going to embrace (or resist) new technologies needs to be considered in the planning, and monitored throughout the program.

11. READY, FIRE, AIM! RE-INVENTING THE WHEEL

Organizations, like people, can sometimes be impulsive and egocentric – looking inward for insights about products and for methods to apply technology to solve problems. While there is often an admirable enthusiasm and a “let’s get started” energy, this can lead to not spending enough time talking to others about their work, and to being unreceptive to alternate (sometimes less fancy) methods. As I seek to improve my own skills, I remind myself of the many times that a few hours spent phoning and asking around (or on Slack) has saved me weeks of effort. Other architects and designers are proud of their work and usually very happy to share what they’ve learned along the way. Time spent doing a bit of research and active listening is usually time well spent. Two things we often learn in doing our homework: (i) there is often a better, simpler, and sometimes non-cognitive technology that can simplify or enrich the overall design; and (ii) it’s rare that a project is truly first-of-a-kind (FOAK), and a 15-minute conversation with someone who has done it before usually pays off 50X.

12. SCOPE

This one is really simple. When scope is too big too early, project risk goes up. Keeping scope tight does not mean the overall program ambitions are any less – it’s just a smart way to stay focused on a win. Ensure all stakeholders have a clear idea of the success metrics and KPIs. When novel cognitive methods are involved, there is all the more reason to keep the scope tight and the success metrics crystal clear.

For more information on nuts and bolts – and patterns – check out https://www.ibm.com/devops/method/category/architectures/

In conclusion...

I’ve had the privilege of working with and learning from great people at my company, and from small startups, systems integrators (SIs), and Fortune 500 companies. I hope some of our “Dirty Dozen” lessons will help you plan and de-risk your cognitive computing journey! Safe travels!

 

These opinions are my own and do not necessarily reflect the views of my employer. Dirty Dozen rights are property of MGM. Road Runner & Wile E. Coyote rights property of Looney Tunes / Merrie Melodies. Drummer image: The Spirit of '76, by A.M. Willard.

Victor Nelson

Product Management and Leadership | Strategy and Hands-on | Entrepreneurship


Several very good points here, and in fact most apply to managing any technology project. One deserves some additional thought: “A POC simply to familiarize an organization with well-established Cognitive patterns (e.g. lightweight chat bot) may be a wasted opportunity.” A POC can legitimately be used to help get stakeholders comfortable with a new approach. AI is still a scary topic to many people, and a POC can be a “safe” next step to allay fears, to enable starting the real project.

Denilson Nastacio

Senior Software Developer and Cloud Architect at IBM


Thanks for all the thoughts on this. Bookmarked for reference and sharing. I would add a couple more:

1. The "best effort" mindset. Widespread deployment of cognitive approaches is a recent trend, but many of the technologies and techniques are hardly new. Team members may see the PoC as a small miracle exercise, but it is still just one or more sets of features for the product. An F score of 60% relative to an expert may feel like breathing life into clay from the perspective of the development/research team, and may elicit positive feedback during early demos. From the perspective of the business, however, it is still just an underperforming intern that is not ready to take on real work.

2. No baseline derived from real use cases from day one. Similar to #1, rolling into the data and task at hand may be hard to avoid, but without constant measurement against a human (or equivalent) benchmark, no one can tell how effective the system will be. An F score of 95% in detecting entities and relations at one point in the pipeline may give the team a sense of invincibility, until everyone realizes that the real use cases require 8 of those entities and relations to align properly. Once you calculate 95%^8 and arrive at an effective score of roughly 66%, you wish you had tackled simpler things along the way instead of pursuing ultimate efficiency in an isolated layer. For instance, if you are trying to draw the relation between makes and models on an insurance claim, once you reach a certain scoring threshold it may be better to leverage an industry dictionary to resolve ambiguous cases than to rely on extensive supervised training by experts. A measure against a baseline, executed daily, would tell the team within days that additional training is not helping, rather than discovering the bad news months down the road.
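[Editor's note: the compounding effect in point 2 is easy to verify in a couple of lines. The per-stage score and stage count below are the figures from the comment; note that 95% compounded over 8 stages actually works out to about 66%, which only strengthens the point.]

```python
# A 95% score at each of 8 independent pipeline stages multiplies out
# to far less end to end, assuming errors compound independently.
per_stage = 0.95
stages = 8

end_to_end = per_stage ** stages
print(f"{per_stage:.0%} per stage over {stages} stages "
      f"-> {end_to_end:.0%} end to end")
# -> 95% per stage over 8 stages -> 66% end to end
```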

Aurelije Zovko

CTO | Chief Architect | Knowledge Graphs | GenAI | ML | IPA Automation


Nice article Ryan.

Daniel Toczala

IBM Watson, Data, and AI Expert - Technical and Business Leader


Nice article Ryan - something I would share with anyone who is starting their journey into Cognitive computing.

Dr John B.

Chief Consultant - Information/Cyber Security


Decent article. It seemed to "touch" on a particular concept but, didn't come right out and say it... Fresh data is essential! It's one thing to have a treasure trove of information but if you let it go stale... ah well, it's like retelling the same joke day in, day out. You need new material. :) Still overall... decent article and worth a read.

