Today marks the launch of a new podcast series, The Compass, which will help us navigate, together, the worldwide impact of AI.
Ted Vial and I recently had a conversation about AI and ethics, a fitting topic for the Potthoff Professor of Theology and Modern Western Religious Thought at the Iliff School of Theology in Denver. It was a fascinating discussion that will surely continue. Here are a few highlights from our podcast.
AI ethics is really human ethics – we, as humans, need to guide our future. It is up to us to develop and adjust the best formulas; we should not place that burden on technology.
Math doesn’t eliminate bias – something can be mathematically correct and still be biased. For example, what does fairness mean? It is a question we won’t all agree on, and one that math can’t solve for us, so we have to find ways to work it out. We have to learn how to speak with each other.
Some guidelines make sense – if you are a medical doctor, you have to get a license, take continuing education courses, and, if you do something unethical, you could lose your license. If you are writing code that impacts millions of people, what standards make sense? Section 230 was created to protect companies from liability for conversations that occur on their platforms. Thirty years later, what do we need going forward?
If our personal data is valuable to others, why can’t we access all of it? That question stands on its own.
We have new editors – we are familiar with editors in the news media. In the world of technology, algorithms choose what we will see, so who are these editors, and what standards should they follow? We may not agree on what is “fair,” but we can always be transparent.
Gaming has value well beyond games – it is helping people around the world learn how to play, and learn, together. When people get to know each other, they often find more similarities than differences.
What obligation do we have to prevent people from seeing negative information? – The concept of availability bias tells us that when you are continually exposed to the same type of thinking, you start to believe it, right or wrong. How do we deal with unhelpful information earlier in the process? It’s worth the discussion.
Privacy means different things depending on where you live – in Europe, it may mean the ability to have content about you removed or to ensure you can opt in. In more than 100 countries worldwide, it may be more about your personal security and safety. Technology can help citizens fight for freedom, and it can also be used to repress them. How we address the availability and use of technology matters.
We all generally mean well – we are trying to move in the right direction. We do have to learn how to talk to each other rather than simply react. And it is worth remembering that while the discussions of today may sound unique, they are not.
What’s new is often old – when Mark Zuckerberg talks about consciousness, we can remember that Thomas Aquinas reflected on the meaning of the soul back in the 13th century. It is worth revisiting the ground we have already walked.
We end the podcast with Ted recommending a book: God, Human, Animal, Machine by Meghan O’Gieblyn. We also reflected on his favorite musicians, who include Amy Winehouse and Stevie Ray Vaughan. The book is ordered, and at least for Stevie Ray Vaughan, I just have to walk the streets of Austin to hear the memories.
Thank you, Ted. We will continue the conversation.