Artificial Intelligence: What will it ever do for us?

There is a danger that going along to a conference in search of a solution will lead to outbreaks of magpie syndrome - with so many vendors offering all kinds of new technologies, it can be easy to get carried away with all of the shiny things on display. I’ve always found that the best approach is to go along with an open mind and look for inspiration, rather than trying to focus on finding a solution to a problem you think you have. Last week I went along to the autumn gathering of the Disruptive Innovators Network. This time, the theme was ‘artificial intelligence: what will it ever do for us?’, and sure enough, thanks to some great speakers and good discussion, I left feeling inspired. I thought that I would jot down a few of the things I was thinking about during my train journey home.

In IDC’s AI-Based Automation Evolution Framework, the final stage of AI evolution is machine-controlled insight, decision-making and action-taking. But the housing sector currently feels like it’s at stage one or two: we’re either 100% human-led or, in some cases, just starting to explore the possibilities of having machines support decision-making as part of what remains a human-driven process. There may be real benefits to stage two. For a start, we’re in control, which makes the whole thing feel a lot less dystopian; but also, humans are notoriously bad at making decisions. Our decisions are largely emotional rather than logical, which can lead to inequity, data bias and bad outcomes driven by irrational attachments. Having a machine help us make decisions more efficiently actually makes a lot of sense and could help us spot things we might otherwise have missed.

Image: Based on the framework created by the International Data Corporation (IDC)


During World War II, researchers at the Center for Naval Analyses faced a critical problem: too many bombers were being shot down on runs over Germany. So, after each mission, the researchers reviewed and recorded the position of every bullet hole on returning aircraft. The data began to show a clear pattern: most of the damage was to the wings and fuselage. The solution they proposed was to increase the armour in these key areas. But there was a critical flaw in their analysis. The researchers had only looked at those bombers that returned to base. In reality, the areas without holes were the most vulnerable; that data was missing from the research because the bombers that had been hit there hadn’t returned. Sometimes, when our emotions cause us to get too embedded within a particular context, we can fail to see what’s in front of us. We can combat this by giving ourselves space to stand back and look for what’s missing.

At Bromford, we’re currently operating within the context of our localities approach. We’re moving away from a traditional housing management approach of paternalistic, rationed, deficit-driven transactions towards one of neighbourhood coaching based around networks, resilience and strengths-based relationships. There is perhaps a danger, therefore, that when our whole business model is based on developing relationships, we may be tempted to de-prioritise technology that appears to work best at streamlining transactional activity. But in doing so, are we failing to look for what’s missing?

A danger for all innovation labs is that, over time, they get drawn into delivering business as usual and incremental change, and start failing to achieve their core purpose of communicating the opportunities that more ‘radical’ innovation can bring. I believe it’s possible for AI to be used in partnership with our relational approach rather than in opposition to it: supporting decision-making and helping us to do things which are innately human, such as developing trusting relationships. Whilst it’s true that our coaches are now able to understand their neighbourhoods in more depth than ever before, that doesn’t mean we can’t use technology to help deepen their knowledge. Arguably, the question we should be asking ourselves is how we want to use AI, rather than whether we should be looking into it in the first place. If data is used as a blunt tool to sanction people, that’s not where we want to be; but if it’s used to enhance relationships, to help us do what is innately human better, then that’s a pretty worthwhile tool for the coaching toolbox.

Perhaps one of the ‘most powerful actions’ I could take over the coming weeks is to work with colleagues to think about our strategy for artificial intelligence and machine learning. The challenge for the housing sector is making sure that implementing AI and ML solutions is part of a robustly evaluated end-to-end design process which offers a great customer experience. Artificial intelligence for the sake of artificial intelligence would just be an emotional response to the latest shiny object; like any solution, it needs to be supported by good-quality human-centred insight rather than come as a result of the magpie effect.

So, what will artificial intelligence and machine learning ever do for us? I think it can free up our time, help us focus on the important things and ultimately help us make better-informed decisions. Importantly, if it’s done right, I believe it can also help support human-to-human relationships, rather than replace them. But we’re at the start of the journey and the road is likely to be bumpy. That means that testing small and failing safely is going to be really important.   

Big inspiration credits must go to Neil Mackin, Simon Devonshire and Sally Caldwell but thanks as well to Ian Wright and all of the other folks I heard from and chatted with on the day.  

---


@simon_penny