Ethical decision-making is vital in local government, where choices have enormous impacts on lives and local communities. As data plays an increasing role in how council services are delivered and which policies take priority, values like fairness, transparency and accountability must be put centre stage if local authorities want to keep the trust of their constituents.
Nathan Makalena - NM
Paul Clough - PC
Lucy Knight - LK
NM - I'd be interested to hear more about data ethics in a local government context, where decisions are made for specific local areas. Do you find that open, inclusive discussions around the use of data are becoming more common in local government, especially compared to the private sector, where the end users, the people affected by these decisions, are usually a little further removed?
LK - Good question. I’m thinking about Devon County Council, my previous employer in local government.
They did something I thought was extremely good, which is that they had an external reference group. That group was made up of representatives from the local retired and older people's association, a local LGBTQIA charity, representatives of people with physical, sensory and learning disabilities, and representatives from the Youth Parliament, so younger people in there too. In all, I think there were sort of 10 or 15 people representing a broad range of differences: different perspectives, different backgrounds, different ethnicities as well.
And on a regular basis, all of those people would be invited to County Hall. And we would sit down and we would say, ‘right, here are the kinds of policy issues under discussion. Here is how we think we're going to approach this. Over to you. What do you have to say about what we're doing and the way that we're doing it? Can you spot anything that we've missed?’
I've only attended a couple of those actual meetings, but it was nice to see it approached in that way. Where a very senior person would say, ‘here's what we're thinking, tell us what we've missed’. And then that person would sit down, shut up, and listen.
I always thought that was really well done. And if I could change one thing about the way they do it, it would be to not wait until the policy was written to call that group together and share the information - so that there'd be more involvement at an earlier stage.
Just saying to those people: we need your insight, but we're not going to ask for it for free; we're going to make it as easy as possible for you to be with us and to give us your views, and we are actually going to listen to you. That was well done.
And the other thing with local government is we're talking to each other now. We used to be potentially quite isolated; there wasn't a lot of budget to go to conferences and that sort of thing. If you wanted to do that, you kind of had to sort yourself out and maybe pay out of your own pocket.
But now you've got Local Gov Digital, where organisations are talking to each other all the time. Asking for advice all the time, saying, ‘Well, we're doing this - has anybody done this? Has anybody used this tool? Has anybody used this supplier? What can you tell me? What should I be worried about?’ And some really good information exchange is going on there. So that's changed even in the last five years; there's much more of that going on.
People go into local government because it's a nice, secure career. You know, indoor work with no heavy lifting. But also because they want to do good; the vast majority go into local government because they want to do good and they want to help. So that does tend to bring in people who do stop to consider the impact of their actions.
PC - Yeah, so I think from my perspective, it does vary by organisation. I've seen plenty of examples where you have a user team or a UX team and the project is driven more from a user perspective. Which, again, is really important, because it comes back to ‘why are we doing this in the first place?’
‘Well, is it likely to impact a person or a group of people? How's it going to impact them? Could that be in a positive way or a negative way, and so on? We need to talk to representatives of those people to find out what they're feeling, what they're thinking and what they're doing as well.’
So I think it's becoming more common that people are thinking about the end user. But not all analytical applications have a direct end user; there are some tools that you build that don't necessarily involve or affect people. So I think it does vary, in my experience.
But I think what Lucy said about sharing - I was at an event last week in London, and it was with public sector people, mainly central government people. And it was fantastic to see the amount of knowledge sharing that was going on.
We learned a lot about areas like levelling up, and all the great work that's going on there. And that's a massive data project, an analytics project. It's really big, but it involves people right across the spectrum: the statisticians, analysts and data people, but also people really seriously thinking about the policies that are being made and the impacts they will have on people. So I think, actually, government is a good example where they do think wider than some organisations, and involve more groups.
So I think it's definitely happening in an ongoing [way] - and the government and other sources have got lots of resources and tools that can help people as well.
Open data, linked data - all of that is going on to help share some of the datasets. Not just within government; some of those datasets we can access as citizens, because it's public data, if you like. And that means that more people can build apps and tools and participate in the whole ecosystem of building applications.
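(As a concrete, hedged illustration of what that citizen participation can look like: the sketch below pulls a published CSV into a script and tallies its rows. The URL and the ‘ward’ column are hypothetical stand-ins for whatever a real council portal or data.gov.uk actually publishes.)

```python
import csv
import io
import urllib.request
from collections import Counter

# Hypothetical open-data endpoint; real councils publish CSVs like this
# on their own portals or via data.gov.uk.
DATASET_URL = "https://example.gov.uk/open-data/road-maintenance.csv"

def load_rows(url: str) -> list[dict]:
    """Download a published CSV and return its rows as dictionaries."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))

if __name__ == "__main__":
    rows = load_rows(DATASET_URL)
    # 'ward' is an illustrative column name; inspect the real schema first.
    print(Counter(row["ward"] for row in rows))
```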
NM - The accessibility and sharing. That was one of the areas I think we highlighted right at the very beginning as due a bit more improvement. Something that we've not really touched on at all is obscuring the results of these engines.
Rather than how they reach the results, just the results themselves. I mean, perhaps in local government, or public-facing organisations generally, there might be more of an urge to obscure those results, if that's the lightest way of putting it. I guess that's another thing about accountability, right? It comes down to being accountable for what these engines expose, sometimes within your own organisation.
LK - Yeah, it's a tricky thing, because certainly in local government, your elected representatives are very much in touch with the people they have been elected by; they live in that community. For many MPs that's also the case, but less so, because their duties take them to Westminster. With local government, you can easily find yourself living in the same small town, next door to your councillor. And they are very, very conscious of what people care about and how people feel.
So even if the data suggests that, for example, fewer school buses should be run because they're costing the council money - which would be a ‘sound’ decision based on budgets and the available data - the reality is that the councillors will be aware that fewer school buses will mean more parents having to cut their hours to get their children to school, which will mean less council tax being paid, because more people will be in arrears because they have lower wages, and so on.
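(To make the shape of that reasoning visible, here is a minimal back-of-the-envelope sketch. Every figure in it is invented purely for illustration, not drawn from any real council budget.)

```python
# All figures below are invented for illustration only.
bus_saving = 200_000          # direct annual saving from cutting routes

parents_cutting_hours = 300   # parents reducing work hours for school runs
arrears_per_household = 500   # extra council tax arrears per household
lost_council_tax = parents_cutting_hours * arrears_per_household  # 150,000

extra_support_costs = 90_000  # hardship support, admin, appeals, etc.

net_effect = bus_saving - lost_council_tax - extra_support_costs
print(f"Net effect of the 'saving': £{net_effect:,}")  # prints £-40,000
# A positive line-item saving can still be a net loss once the knock-on
# effects councillors can see on the ground are priced in.
```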
They have a better sense of: yes, the data says that, but the data is not everything. The data is not infallible. What about context? What about local knowledge? People understand place, and they understand community. So it can be that sometimes a policy is set - we call it policy-based evidence making - ‘This is the policy, because I know this is the right thing to do. Now, where is the data that will support me in making this decision?’
And that may sound wrong. It may sound like misinformation or disinformation. But it's a balance. How much do you trust the person making that decision? If you know them personally, as you might with your councillor, you might say, ‘well, the data kind of says something else. But I can see where they're coming from. I can see why they're pushing for that decision, or that direction, or that steer in policy, because overall it's probably going to work out better in the long run. The data right now says that short term it might be an issue, but I can see where they're going with that.’
So yeah, it's a tricky scenario to be in the middle of. As a data person, you're going, ‘the data kind of says this’. On the other hand, the bigger picture is that fixing that will actually disadvantage a lot more people.
And that's the other thing people need to understand about local government: budgets. It's like whack-a-mole - you push one thing down, something else pops up. So you can't just tweak here and there and everything is perfect. You're constantly worrying: ‘If we slash money from this budget, then the roads won't be fixed. If the roads aren't fixed, there'll be more potholes and accidents. And then we'll have a huge bill further down the line.’ It's a very, very complex machine.
NM - It's intriguing because, I guess, the areas these decisions govern are only going to become wider, with the recent trend towards devolution. Councils will have to be more hands-on, and these discussions will only become more important.
PC - Just in relation to your question there as well, Nathan - I was at a panel event where one of the attendees asked: ‘as an analyst, how do you convey something negative? You know, how do you do that?’
And I think the simple answer is, well, if you're ethical, you just tell it as it is. The data says this; you don't bend it, you don't shape it, and so on. You leave people to then use it.
Which leads me to start thinking: actually, as data analysts, as data scientists, we need some kind of Hippocratic Oath. There should be a set of ethical principles that I work to and abide by.
Because actually, the pressure can get put on: ‘actually, you can't report that, you need to twist it a little bit… we cannot tell senior people this… can you just alter that graphic, you know, just to kind of bend it a bit…’ And again, I think it partly comes back down to the culture in which you're working. Are you comfortable enough to say, ‘No, I don't want to do that. I don't believe that's the right thing to do’? And sometimes that's a really hard thing to do. So, yeah, I think it is a challenge.
LK - It's a very hard thing. The Open Data Institute does training across a range of sectors, and we do training for the public sector. One of the concepts we're trying to get across, for the technical and the non-technical (what we call the geek-wonk interface), is that it's important to be able to have those discussions across the interface.
And it's quite crucial for the technical people to feel they're able to raise a concern. Respectfully. Not ‘I won't, you can't make me’ - not slamming doors and tantrums - but rather, ‘I am a little concerned about the data that I've been given’, or ‘I'm a little worried that it's been presented in this way; can we talk about that?’ And for the policy people to be able to have that discussion going back the other way: ‘I understand a little more about what it is that you're doing. Let's sit down and talk about what we're actually aiming to get done here.’
I think it will help if we equip not just non-technical people with the tools, the vocabulary and the understanding of what works with data to have those conversations, but also technical people with, again, the vocabulary, and also the confidence that yes, they do get a say in this. Just because they're an analyst doesn't mean they can't understand the policy implications. It doesn't mean they don't think about people and human impacts.
They should, at every point, feel they have a right to be part of that conversation as well, and to flag concerns and raise problems. You know, just to get across that this is everybody's conversation.
NM - I guess, then, there's a question there about - we're talking about iterative life cycles and improving things as they go on - at what point does abandoning a project become the best course of action, rather than trying to dip in and change things and adjust things towards more ethical decisions?
LK - It's just case by case, isn't it?
PC - I mean, there are probably obvious cases. I'm thinking there are examples of councils who've used predictive analytics, maybe to make decisions about vulnerable adults, children and/or social welfare. And they have been basically, you know, shut down, because they just haven't been successful and haven't worked. Unfortunately, they often only shut down once they have been exposed in the media, which is not good for anybody. So you kind of think there probably is a point before that at which you would have said, ‘actually, this is not going right, we need to stop’.
But if you weren't doing things responsibly - if you didn't have checks in place, didn't have these safeguards that we're talking about, these guardrails - how would you ever have known?
If you didn't have a culture by which social workers could actually voice concern: ‘I think the algorithms have gone awry. They're not working as they should. These predictions are wrong.’ But yeah, people higher up have just invested a lot of money in this, or whatever; you know, there's a whole load of issues around that. It's very complex.
LK - It's not ‘when is the right time?’ but ‘is it safe?’ Is it even safe for somebody to say, ‘I don't think this model is going to work’? I remember, years and years back, there was something about data matching. About trying to bring together disparate sets of data about individual residents and say, ‘Well, I've got Billy Smith here, and I've got William Smith here, and I've got William Jones Smith here. Are they the same person?’ And precisely what level of examination is required to assert that they are? One's come in from schools, one's come in from social care, one's come in from transport. And whichever adult was filling in that child's details has kind of done it differently each time. But each is valid. How do we match them up?
And there definitely were points at which certain departments within the council would say, ‘I see you have that tool. I see you have the algorithm for matching and blending where it looks like those records belong together. But I will not be using it. My department will not be using it. Because the consequences if I mistakenly mash these two people together and they are not genuinely the same person are too great for me to take that risk. I don't care how good your algorithm is. I don't care how happy ‘highways’ is with it. I will not be using it, because for me the risk far outweighs any possible benefit. I would rather have a human being spend 10 minutes checking than take a chance on that technology.’
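(For readers who haven't met this kind of record matching, here is a minimal sketch using a crude string-similarity score from Python's standard library. The names come from Lucy's example; the threshold is arbitrary, and a real linkage system would weigh far more evidence than this - which is exactly where the risk judgement she describes comes in.)

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude similarity between two names, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Records as they arrived from different services (illustrative only).
records = ["Billy Smith", "William Smith", "William Jones Smith"]

THRESHOLD = 0.75  # arbitrary; where you set this IS the risk decision

for i, a in enumerate(records):
    for b in records[i + 1:]:
        score = name_similarity(a, b)
        verdict = "possible match" if score >= THRESHOLD else "no match"
        print(f"{a!r} vs {b!r}: {score:.2f} -> {verdict}")

# A human knows 'Billy' and 'William' can be the same child; a naive
# score may not, which is why a department might prefer a ten-minute
# human check to trusting the algorithm with high-stakes merges.
```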
So again, it's where people feel safe to stand up and speak out. For some, the impact is low and the risk is very unlikely, so they will continue. I think where the stakes are higher, and where the consequences of getting onto the wrong side of that are high, less so. And sometimes it takes something going very publicly and disastrously wrong for other organisations who are maybe thinking of using that tool or that supplier to go, ‘Oh, I'm not comfortable. I don't want that happening to us. I'll pull back from that’.
I think there's no technical way to prescribe this, other than knowing that your people feel safe speaking up when they see something that makes them uncomfortable. And that they have had the training, the education or the access to literature and materials that give them the vocabulary to articulate why they're uncomfortable. Not just ‘ew, that makes me feel ick’, but rather, ‘it is a problem; it raises significant risk of this, this and this. I think that the decision-making body should be aware of this before we go ahead’. And the culture where the decision-making body will look at that and say, ‘We agree. It's documented now that this is a risk. We're not comfortable. Can we hold off? Can we explore it some more before we deploy?’
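(One hedged sketch of what ‘it's documented now’ might look like if a team wanted concerns captured in a structured way rather than lost in email. The field names are illustrative, not any council's actual process.)

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskEntry:
    """One documented concern, phrased for a decision-making body.
    Fields are illustrative; use whatever your governance process expects."""
    raised_by: str
    summary: str         # 'it raises significant risk of this, this and this'
    impact: str          # who is affected, and how
    recommendation: str  # e.g. 'hold off and explore before we deploy'
    raised_on: date = field(default_factory=date.today)
    acknowledged: bool = False  # set once the decision-making body responds

entry = RiskEntry(
    raised_by="data analyst",
    summary="Matching algorithm may merge records of different residents.",
    impact="Services could be wrongly offered or withdrawn from individuals.",
    recommendation="Pause deployment pending human review of merges.",
)
print(entry)
```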
PC - I think it's partly that education piece, again. To help people understand: here are the potential risks. They may not ever know. Like, most people just aren't aware that an algorithm can go wrong and cause harm to people. But actually, they need to know that. And then they can come back and assess their own project. Because the last thing you want is to shut the thing down when somebody has been killed. You know, unfortunately, that's often what happens; it takes the worst of the worst things to change. Yeah, why can't we be proactive and preventive?
LK - Exactly - envisage that. And one thing I sometimes do with people is exactly that scenario. Imagine the worst possible outcome. Now imagine, if you wanted to cause it, how would you do it?
And that gets them going, ‘Oh, right. Yes. If I did this… and I would do… oh, god. If the director did that - oh, yeah, that would do it…’
And now you have the little actions - not just reacting, but the small things you need to be looking out for within your organisation. The ways people behave, the mistakes you or your colleagues might make, that you can now start heading off.
I worked with a small organisation, and I basically said: this is the contract, this is the project, this is what you want to happen. Now I want you to do some ‘disaster imagining’.
PC - And that's the thing, isn't it? Why are we thinking about a lot of this stuff? Well, part of the reason for being ethical is to build trust. Why would a client deal with us if we're not ethical? I think most companies would run a mile. And certainly in local government, your citizens expect the local council to be trustworthy. So that's part of the reason why we're thinking through it like this. You want to be ethical not just because it's the right thing to do, but because we need to be doing this to build trust…
LK - You can't do your jobs without it. And this is one of the things I was thinking about when I was looking at what we'd be discussing today. The specific thing about working in local government is that the things local government does - there's no profit in them. Because if there were, there would be a wide range of companies queuing up to do those things.
So there's no profit in it, it's expensive to deliver, it's hard to staff and resource, and it's in service of people who have some of the most severe and complex needs outside of the NHS. You're talking about people on low incomes, people in poorly served areas in terms of facilities and transport and so on and so forth. And so, those people don't have a choice about whether or not they're going to do business with the council. They can't just go and get that from someone else. The council is it.
That carries an extra layer of responsibility and accountability, in terms of what you provide and how you provide it. You're dealing with people. They're not customers. They're recipients of whatever you are able to choose to give them. They have no choice. And it's okay to be imperfect in the midst of that journey. It's okay to be imperfect.
And the measure, I suppose, is how you deal with that. If you say, ‘Okay, I looked at that, and I did not like what it showed me. What am I going to do about that?’ Not ‘Oh, that's it, the whole thing is off’, but rather, ‘how am I dealing with that? How am I doing that differently next time?’ And accept that literally nobody is perfect. But you can at least be thinking about how you can be better at that bit.
And then next time you go around that cycle, maybe there'll be a new thing that you need to tackle. That's also okay. Responsibility is, I think, the key to what we've been talking about. Are you being responsible? Are you acting responsibly - not just saying the words, but actually living it and demonstrating it?