Algorithms and AI — who owns the knowledge

A few months ago, I was doing some research for the team and came across an article about the top algorithms teams use. What was interesting, beyond the use of the algorithms themselves, was the history of some of them and how old they are. For example, the Naïve Bayesian Classifier is based on Bayes' Theorem (https://en.wikipedia.org/wiki/Bayes%27_theorem), named after Thomas Bayes, who lived from 1701 to 1761. Consider that a theorem over 200 years old is now being used in predictive analysis on data produced by technology that did not exist then. I think it's pretty amazing. Of course, there are many examples of solutions and innovations from the past providing answers to today's challenges. But it's easy to get caught up in the pace of today's innovation, the new technologies and new ways of doing things, and forget that innovation didn't start with Uber or Twitter.
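For the curious, the theorem itself fits on one line: P(A|B) = P(B|A) × P(A) / P(B). A toy sketch in Python, with a hypothetical spam-filter scenario and made-up numbers, just to show the kind of update a Naïve Bayes classifier performs:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) via Bayes' Theorem."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: 20% of mail is spam, 60% of spam contains
# the word "free", and 25% of all mail contains "free".
# What is the chance a message containing "free" is spam?
p_spam_given_free = bayes(p_b_given_a=0.6, p_a=0.2, p_b=0.25)
print(round(p_spam_given_free, 2))  # 0.48
```

Two-hundred-odd years later, that same update rule is scoring your inbox.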

This also means that the development and use of new algorithms continues. Of course, companies like Google and Facebook continue to develop and refine their algorithms. But there is also a market for actually selling algorithms, which seems somewhat in defiance of the spirit of the whole idea. But then Google and Netflix have made huge amounts of money from their algorithms, so maybe that's the right way to be thinking about them.

Another niche that took me by surprise, and I don't know why, is the degree to which AI modelling depends on humans tagging and categorizing things. Sometimes AI or machine learning is portrayed as an all-powerful autonomous aggregator, able to identify and categorize huge amounts of data with a single input. In actual fact, there have been hundreds or thousands of people tagging photographs, reviewing legal documents or medical results, and getting paid for their efforts. All of this work they're doing is feeding these vast AI models with the goal of identifying what…I guess everything. Totally unnerving. I really don't think I want my mammograms made anonymous, then reviewed, tagged, given a result and dropped into a system to allow matching of clear scans.

But then maybe I do, if I’m compensated.

I think we, the public, need to start taking ownership of the data we create and disseminate if it's to be reused and leveraged to create something that businesses in turn sell back to us. If we benefited from the public availability of a 200-year-old algorithm, it seems we should not be made to pay for data we in fact created.

Everywhere you go, there's empathy

I just finished reading Satya Nadella's book "Hit Refresh". It was an easy read with lots of stories about his journey from India to CEO of Microsoft, his evolution as a leader and the impact of his life experience on how he is driving change at Microsoft. If one word could sum up the purpose of the book, it would be 'empathy': how empathy can transform how we work so that our best work is driven by our need to help and serve. It's a word I'm hearing more and more.

I'm in the midst of (or, more accurately, lagging behind in) Seth Godin's Marketing seminar, where he spends almost half the course talking about empathy. Seth mentions empathy, then reminds us of the importance of empathy, and follows up with why empathy is key, all within a five-to-ten-minute video.

The impact is that when I look at an ad or watch a video, I wonder who the marketing person had in mind when they created it. Most of the time, I let ads and banners wash over me, almost oblivious to their presence, but now I'm looking at them more critically and considering what I and others put out there.

I'm also hopeful that this may be a turning point in how we do business. You can't create with empathy and still chase more and more at the cost of the environment, communities and people. I mean, you can, but you'd be a bit of a jerk, and I don't know how authentic your marketing can be if your empathy stops at the ad, or the brief.

Although maybe you can take empathy too far. This ad from Timberland may accurately reflect what people fear, but it's not exactly offering a solution.


Please. No more empathy.
