TechStuff

Smart Talks with IBM - Responsible AI: Why Businesses Need Reliable AI Governance

2023-10-17

To deploy responsible AI and build trust with customers, businesses need to prioritize AI governance. In this episode of Smart Talks with IBM, Malcolm Gladwell and Laurie Santos discuss AI accountability with Christina Montgomery, Chief Privacy and Trust Officer at IBM. They chat about AI regulation, what compliance means in the AI age, and why transparent AI governance is good for business.

Visit us at: ibm.com/smarttalks

Explore watsonx.governance: https://www.ibm.com/products/watsonx-governance

This is a paid advertisement from IBM.

See omnystudio.com/listener for privacy information.

This is an unofficial transcript meant for reference. Accuracy is not guaranteed.
Welcome to TechStuff, a production from iHeartRadio.

Today, we are witness to one of those rare moments in history: the rise of innovative technology with the potential to radically transform business and society forever. That technology, of course, is artificial intelligence, and it's the central focus for this new season of Smart Talks with IBM. Join hosts from your favorite Pushkin podcasts as they talk with industry experts and leaders to explore how businesses can integrate AI into their workflows and help drive real change in this new era of AI. And, of course, host Malcolm Gladwell will be there to guide you through the season and throw in his two cents as well. Look out for new episodes of Smart Talks with IBM every other week on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at ibm.com/smarttalks.

Hello, hello. Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio, and IBM. I'm Malcolm Gladwell. This season, we're continuing our conversation with new creators: visionaries who are creatively applying technology in business to drive change, but with a focus on the transformative power of artificial intelligence and what it means to leverage AI as a game-changing multiplier for your business. My guest today is Christina Montgomery, IBM's Chief Privacy and Trust Officer. She's also chair of IBM's AI Ethics Board. In addition to overseeing IBM's privacy policy, a core part of Christina's job involves AI governance:
making sure the way AI is used complies with national legal regulations customized for each industry. In today's episode, Christina will explain why businesses need foundational principles when it comes to using technology, why AI regulation should focus on specific use cases over the technology itself, and a little bit about her landmark congressional testimony last May. Christina spoke with Dr. Laurie Santos, host of the Pushkin podcast The Happiness Lab and a cognitive science and psychology professor at Yale University. Laurie is an expert on human happiness and cognition. Okay, let's get to the interview.

So Christina, I'm so excited to talk to you today. Let's start by talking a little bit about your role at IBM. What does a Chief Privacy and Trust Officer actually do?

It's a
dynamic profession, and it's not a new profession, but the role has really changed. My role today is broader than just helping to ensure compliance with data protection laws globally. I'm also responsible for AI governance. I co-chair our AI Ethics Board here at IBM, and for data governance as well for the company. So my role has a compliance aspect, really important on a global basis, but I also help the business to competitively differentiate, because trust is a strategic advantage for IBM and a competitive differentiator, as a company that's been responsibly managing the most sensitive data for clients for more than a century now, and helping to usher new technologies into the world with trust and transparency. So that's also a key aspect of my role.
And so, you joined us here on Smart Talks back in 2021, and you chatted with us about IBM's approach to building trust and transparency with AI. That was only two years ago, but it almost feels like an eternity has happened in the field of AI since then. So I'm curious: how much has changed since you were here last time? Are the things you told us before still true, or are they changing?

You're absolutely right, the world has changed really dramatically in the last two years. But the same fundamental principles and the same overall governance approach for data protection and responsible AI that we talked about two years ago still apply to IBM's program now, as much as things have changed. From our perspective, the good thing is we've put these practices and this governance approach into place, and we have an established way of looking at these emerging technologies as the technology evolves. The tech is more powerful. Foundation models are vastly more capable and are creating, in some respects, new issues, but that just makes it all the more urgent to keep doing what we've been doing and to put trust and transparency into place across the business, and to be accountable to those principles.

So our conversation here is really centered around this need for new AI regulation, and part of that regulation involves the mitigation of bias. This is something I think about a ton as a psychologist, right? You know, my students and everyone who's interacting with AI are assuming that the kind of knowledge they're getting from this kind of learning is accurate, but of course it's only as good as the knowledge that's going in. So talk to me a little bit about why bias occurs in AI and the level of the problem that we're really dealing with.

Yeah. Well, obviously, AI is based on data, right? It's trained with data, and that data could be biased in and of itself. Biases
could come up in the data; they could also come up in the output of the models themselves. So it's really important that you build bias consideration and bias testing into your product development cycle. That's what we've been thinking about here at IBM, and doing: we had some of our research teams deliver some of the very first toolkits to help detect bias, years ago now, and we deployed them to open source. We have also put into place for our developers here at IBM an ethics-by-design playbook, which is a sort of step-by-step approach that also addresses bias considerations very fully. And we not only say, here is the point when you test for it and you consider it in your data; you have to measure it both at the data and the model level and at the outcome level, and we provide guidance with respect to what tools can best be used to accomplish that.
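The measurement Christina describes, checking outcomes across groups rather than just inspecting the training data, can be illustrated with two common group-fairness statistics. This is a minimal sketch with invented loan-approval numbers; real pipelines would typically use a full toolkit such as IBM's open-source AI Fairness 360, which is likely the kind of open-sourced bias-detection tooling she alludes to.

```python
# Two dataset/outcome-level fairness checks: statistical parity difference
# and disparate impact. Group data here is hypothetical, for illustration.

def selection_rate(outcomes):
    """Fraction of favorable (1) outcomes observed in a group."""
    return sum(outcomes) / len(outcomes)

def statistical_parity_difference(privileged, unprivileged):
    """Difference in favorable-outcome rates; 0.0 means parity."""
    return selection_rate(unprivileged) - selection_rate(privileged)

def disparate_impact(privileged, unprivileged):
    """Ratio of favorable-outcome rates; values below ~0.8 are often
    flagged (the 'four-fifths rule' used in employment contexts)."""
    return selection_rate(unprivileged) / selection_rate(privileged)

if __name__ == "__main__":
    # Toy loan-approval outcomes (1 = approved) for two groups.
    privileged = [1, 1, 1, 0, 1, 1, 0, 1]    # 6/8 = 0.75 approval rate
    unprivileged = [1, 0, 0, 1, 0, 1, 0, 0]  # 3/8 = 0.375 approval rate

    print(statistical_parity_difference(privileged, unprivileged))  # -0.375
    print(disparate_impact(privileged, unprivileged))               # 0.5
```

Note that both metrics run on model *outputs*, which is why, as Christina says, testing has to happen at the outcome level and not only on the training data.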
So it's a really important issue, and it's one you can't just talk about; you have to provide, essentially, the technology and the capabilities and the guidance to enable people to test for it.

So recently you had this wonderful opportunity to head to Congress to talk about AI, and in your testimony before Congress you mentioned that often innovation moves too fast for government to keep up. This is something that I also worry about as a psychologist: are our policymakers really understanding the issues that they're dealing with? I'm curious how you're approaching this challenge of adapting our policies to keep up with the sort of rapid pace of all the advancements we're seeing in AI technology.

It's really critically important that you have foundational principles that apply to not only how you use technology, but whether you're going to use it in the first place, and where you're going to use and apply it across your company. And then your approach
from a governance perspective has to be agile. It has to be able to address emerging capabilities, new training methods, et cetera, and part of that involves helping to educate and instill and empower a trustworthy culture at a company, so you can see those issues and ask the right questions at the right time. As we talked about during the Senate hearing, IBM has been talking for years about regulating the use, not the technology itself, because if you try to regulate the technology, you're very quickly going to find out that regulation will absolutely never keep up with it.

So in your testimony to Congress, you also talked about this idea of a precision regulation approach for AI. Tell me more about this precision regulation approach, and why could that be so important?

It's funny, because I was able to share with Congress our precision regulation point of view in 2023, but that
precision regulation point of view was published by IBM in 2020. So we have not changed our position that you should apply the tightest controls, the strictest regulatory requirements, to the technology where the end use and risk of societal harm is the greatest. That's essentially what it is. There's lots of AI technology used today that doesn't touch people, that is very low risk in nature. And even when you think about AI that delivers a movie recommendation versus AI that is used to diagnose cancer, there are very different implications associated with those two uses of the technology. So, essentially, precision regulation means applying different rules to different risks, right? More stringent regulation for the use cases with the greatest risk. And then we also built that out, calling for things like transparency.
You see it today with content, right? Misinformation and the like. We believe that consumers should always know when they're interacting with an AI system, so be transparent; don't hide your AI. Beyond that, define the risks. As a country, we need some clear guidance, written globally as well, in terms of which uses of AI are higher risk and will attract higher and stricter regulation, so we have a common understanding of what those high-risk uses are, and then demonstrate the impact in the cases of those higher-risk uses. So companies who are using AI in spaces where it can impact people's legal rights, for example, should have to conduct an impact assessment that demonstrates that the technology is unbiased. So we've been pretty clear about applying the most stringent regulation to the highest-risk uses of AI.

So far we've been
talking about your congressional testimony in terms of the specific content you covered, but I'm just curious, on a personal level, you know, what was that like? Right now it feels like, at a policy level, there's a kind of fever pitch going on with AI. What did that feel like, to really have the opportunity to talk to policymakers and influence what they're thinking about AI technologies in, like, the coming century, perhaps?

It was really an honor to be able to do that, and to be one of the first set of invitees to the hearing. And what it came down to for me, essentially, is really two things. The first is really the value of authenticity, both as an individual and as a company. I was able to talk about what I do; I didn't need a lot of advance prep, right? I talked about what my team and what IBM has been putting in place for years now, so this wasn't about creating something new.
This was just about showing up and being authentic, and we were invited for a reason. We were invited because we are one of the earliest companies in the AI technology space, one of the oldest technology companies, and we are trusted. And then the second thing I came away with was really how important this issue is to society. I don't think I appreciated it as much until, following that experience, I had outreach from colleagues I hadn't worked with for years. I had outreach from family members who heard me on the radio. You know, my mother and my mother-in-law, and my nieces and nephews, and the friends of my kids were all like, I get it, I see what you do now.

Wow, that's pretty cool.

And that was really, I think, the best and most impactful takeaway that I had.

The mass adoption of generative AI happening at breakneck speed has spurred
industries and governments around the world to get serious about regulating AI. For businesses, compliance is complex enough already, but throw an ever-evolving technology like AI into the mix, and compliance itself becomes an exercise in adaptability. As regulators seek greater stability in how AI is used, businesses need help creating governance processes comprehensive enough to comply with the law but agile enough to keep up with the rapid rate of change in AI development. Regulatory scrutiny isn't the only consideration, either. Responsible AI governance, and a business's ability to prove its models are transparent and explainable, is also key to building trust with customers, regardless of industry. In the next part of their conversation, Laurie
asks Christina what businesses should consider when approaching AI governance. Let's listen.

So what particular role do businesses play in AI governance? Why is it so critical for businesses to be part of this?

I think it's really critically important that businesses understand the impacts that technology can have, both in making them better businesses, but also the impacts those technologies can have on the consumers that they are supporting. Businesses need to be deploying AI technology that is in alignment with the goals that they set for it, and that can be trusted. I think for us and for our clients, a lot of this comes back to trust. If you deploy something that doesn't work, that hallucinates, that discriminates, that isn't transparent, where decisions can't be explained, then you
are going to very rapidly erode the trust of your clients at best, and at worst, you're going to create legal and regulatory issues for yourself as well. So trusted technology is really important. And I think there's a lot of pressure on businesses today to move very rapidly and adopt the technology, but if you do it without having a program of governance in place, you're really risking your road map and your process.

This is really where I think a strong AI governance comes in. Can you talk about, from your perspective, how this really contributes to maintaining the trust that customers and stakeholders have in these technologies?

Yeah, you have to have it. You need to have a governance program, because you need to understand what the technology you are deploying in the AI space is doing. You need to understand why AI tools are making the decisions and recommendations they're making, and you need to be able to explain that to your consumers. I mean, you can't do
that if you don't know where your data is coming from and what data you're using to train those models, if you don't have a program that manages the alignment of your AI models over time, to make sure, as AI learns and evolves over uses, which is in large part what makes it so beneficial, that it stays in alignment with the objectives you set for the technology over time. So you can't do that without a robust governance process in place. We work with clients to share our own story here at IBM in terms of how we put that in place, but also, in our consulting practice, to help clients work with these new generative capabilities and foundation models and the like, in order to put them to work in a way that's going to be impactful to that business, but at the same time be trusted.

So I want
to turn a little bit towards watsonx.governance. IBM recently announced their AI platform watsonx, which will include a governance component. Could you tell us a little more about watsonx.governance?

Yeah, let me just back up and talk about the full platform first, and then lean into watsonx.governance, because I think it's important to understand that it delivers a full suite of capabilities: to get data, to train models, and then to govern them over their lifecycle. All of these things are really important from the standpoint of making sure that you have, you know, for our watsonx.ai, for example, that's the studio for training new foundation models and generative AI and machine learning capabilities, and we are
delivering that studio with some IBM-trained foundation models, which we're curating and tailoring specifically for enterprises. So that's really important; it comes back to the point I made earlier about business trust and the need, you know, to have enterprise-ready technology in the AI space. And then watsonx.data is fit-for-purpose data storage, a data lake. And then there's watsonx.governance. That's the particular component of the platform that my team and the AI Ethics Board have really worked closely with the product team on developing, and we're using it internally here in the Chief Privacy Office as well, to help us govern our own uses of AI technology and our compliance program. And it essentially helps to notify you if a model becomes biased or gets out of alignment as you're using it over time.
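One common way governance tooling flags that a model has drifted "out of alignment" over time is a distribution-shift statistic such as the population stability index (PSI), which compares a baseline histogram of a model's inputs or scores against a current one. The sketch below is a generic illustration with invented numbers, not a description of how watsonx.governance is actually implemented.

```python
import math

def psi(expected_fracs, actual_fracs, eps=1e-6):
    """Population stability index across matching histogram bins.
    A value above ~0.2 is a widely used 'significant drift' alarm."""
    total = 0.0
    for e, a in zip(expected_fracs, actual_fracs):
        e, a = max(e, eps), max(a, eps)  # guard against log(0) / divide-by-zero
        total += (a - e) * math.log(a / e)
    return total

# Bin fractions of a model score at deployment vs. months later (toy numbers).
baseline = [0.25, 0.25, 0.25, 0.25]
current = [0.10, 0.20, 0.30, 0.40]

drift = psi(baseline, current)  # ~0.228, above the common 0.2 threshold
print(drift > 0.2)
```

In a monitoring loop, a check like this would run on a schedule and page the model owner when the threshold is crossed, which is the "notify you" behavior Christina describes.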
So companies are going to need these capabilities. I mean, they need them today to deliver technologies with trust, and they'll need them tomorrow to comply with regulation, which is on the horizon.

And I think compliance becomes even more complex when you consider international data protection laws and regulations. Honestly, I don't know how anyone on any company's legal team is even keeping up with this these days, but my question for you is really: how can businesses develop a strategy to maintain compliance and to deal with this ever-changing landscape?

It is increasingly challenging. In fact, I saw a statistic just this morning that the regulatory obligations on companies have increased something like seven hundred times in the last twenty years or so. So it really is a huge focus area for companies; you have to have a process in place in order to do that, and it's not easy, particularly for a company like IBM that has
a presence in over one hundred and seventy countries around the world. There are more than one hundred and fifty comprehensive privacy regulations. There are regulations of non-personal data. There are AI regulations emerging. So you really need an operational approach in order to stay compliant. One of the things we do, and a lot of companies do this as well, is we set a baseline: we define a privacy baseline, we define an AI baseline, and we ensure, as a result of that, that there are very few deviations, because it incorporates a mapped baseline. That's one of the ways we do it, and other companies, I think, are similarly situated in terms of doing that. But again, it is a real challenge for global companies. It's one of the reasons why we advocate as much alignment as possible
in the international realm, as well as nationally here in the U.S., to make compliance easier. And not just because companies want an easy way to comply, but because the harder it is, the less likely it is there will be compliance, and it's not the objective of anybody, governments, companies, or consumers, to set up obligations that companies simply can't meet.

And so what advice would you give to other companies who are looking to rethink or strengthen their approach to AI governance?

You need to start with, as we did, foundational principles, and you need to start making decisions about what technology you're going to deploy and what technology you're not: what are you going to use AI for, what aren't you going to use it for, and then, when you do use it, align
to those principles. That's really important. Formalize a program. Have someone within the organization, whether it's the chief privacy officer or some other role, a chief AI ethics officer, but have an accountable individual and an accountable organization. Do a maturity assessment: figure out where you are and where you need to be, and really start, you know, putting it into place today. Don't wait for regulation to apply directly to your business, because it'll be too late.

So Smart Talks features new creators, these visionaries, like yourself, who are creatively applying technology in business to drive change. I'm curious, do you see yourself as creative?

I, you know, I definitely do. I mean, you need to be creative when you're working in an industry that evolves so very quickly. So,
you know, I started with IBM when we were primarily a hardware company, right? And we've changed our business so significantly over the years, and in the issues that are raised with respect to each new technology, whether it be cloud, whether it be AI now, where we're seeing a ton of issues, or looking at emerging issues in the space of things like neurotechnologies and quantum computing. You have to be strategic, and you have to be creative in thinking about how you can adapt, and adapt quickly, as a company to an environment that is changing so quickly.

And with AI transformation happening at such a rapid pace, do you think creativity plays a role in how you think about and implement, specifically, a trustworthy AI strategy?

Yeah,
I actually think it does, because again, it comes back to these capabilities, and, I guess, how you define creativity could be different, right? But I'm thinking of creativity in the sense of, sort of, ideation and strategic vision and creative problem solving. I think that's really important in the world that we're in right now, being able to creatively problem-solve with all the new issues that are arising sort of every day.

And how do you see the role of chief privacy officer evolving in the future as AI technology continues to advance? Like, what steps should CPOs take to stay ahead of all these changes that are coming their way?

So the role is evolving in most companies, I would say, pretty rapidly. Many companies are looking to chief privacy officers, who already understand the data that's being used in the organization and have programs to ensure compliance with laws that
require you to manage that data in accordance with data protection law. So it's a natural place and position for, you know, AI responsibility. And so I think what's happening to a lot of chief privacy officers is that they're being asked to take on this AI governance responsibility for their companies, or if not take it on, at least play a very key role, working with other parts of the business, in AI governance. So that role really is changing. And if chief privacy officers are in companies that maybe haven't started thinking about AI yet, they should. I encourage them to look at the different resources that are already available in the governance space. For example, the International Association of Privacy Professionals, which is the seventy-five-thousand-member professional body for the privacy profession, just recently launched an AI governance initiative and an AI
governance certification program. I sit on their board, but that's just emblematic of the fact that the field is changing so rapidly.

And speaking of rapid change, when you were back here on Smart Talks in 2021, you said that the future of AI would be more transparent and more trustworthy within the next five to ten years. You know, when you're back on Smart Talks in the late 2020s, actually, no, 2030, what are we going to be talking about when it comes to AI technology and governance?

So I tend to be an optimist. I said that about two years ago, and I think we're seeing it now come to fruition, in that there will be requirements, whether they're coming from the U.S., whether they're coming from Europe, whether they're just coming from voluntary adoption by clients of things like the NIST risk management framework, important voluntary frameworks. You're going to have to adopt transparent and explainable practices in your uses of AI. So I do see that
happening in the next five to ten years. I think we'll see more research into trust techniques, because we don't really know, for example, how to watermark. We were called on for things like watermarking; there will be more research into how to do that. And I think we'll see, you know, regulation that's specifically going to require those types of things. So I think, again, the regulation is going to drive research into these areas that will help ensure that we can deliver new capabilities, generative capabilities and the like, with trust and explainability.

Thank you so much, Christina, for joining us on Smart Talks to talk about AI and governance.

Well, thank you very much for having me.

To unlock the transformative growth possible with artificial intelligence, businesses need to
know what they wish to grow into first. Like Christina said, the best way forward in the AI future is for businesses to figure out their own foundational principles around using the technology, drawing on those principles to apply it in a way that's ethically consistent with their mission and that complies with the legal frameworks built to hold the technology accountable. As AI adoption grows more and more widespread, so too will the expectation from consumers and regulators that businesses use it responsibly. Investing in dependable AI governance is a way for businesses to lay the foundations for technology that their customers can trust, while rising to the challenge of increasing regulatory complexity. Though the emergence of AI does complicate an already tough compliance landscape, businesses now face a creative opportunity to set a precedent for
what accountability in AI looks like, and to rethink what it means to deploy trustworthy artificial intelligence.

I'm Malcolm Gladwell. This is a paid advertisement from IBM. Smart Talks with IBM will be taking a short hiatus, but look for new episodes in the coming weeks. Smart Talks with IBM is produced by Matt Romano, David Jha, Nisha Venkat, and Royston Beserve, with Jacob Goldstein. We're edited by Lidia Jean Kott. Our engineer is Jason Gambrell. Theme song by Gramoscope. Special thanks to Carly Migliori, Andy Kelly, Kathy Callaghan, and the 8Bar and IBM teams, as well as the Pushkin marketing team. Smart Talks with IBM is a production of Pushkin Industries and Ruby Studio at iHeartMedia. To find more Pushkin podcasts, listen
on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
Transcript generated on 2023-12-12.