<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:admin="http://webns.net/mvcb/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:fireside="http://fireside.fm/modules/rss/fireside">
  <channel>
    <fireside:hostname>web02.fireside.fm</fireside:hostname>
    <fireside:genDate>Thu, 16 Apr 2026 20:11:29 -0500</fireside:genDate>
    <generator>Fireside (https://fireside.fm)</generator>
    <title>The Smith Business Insight Podcast - Episodes Tagged with “Responsible AI”</title>
    <link>https://smithinsight.fireside.fm/tags/responsible%20ai</link>
    <pubDate>Tue, 14 Nov 2023 08:00:00 -0500</pubDate>
    <description>Tune in for a different take on business, with professors, researchers and experts from Smith School of Business. Understand the rapidly evolving corporate world, stay ahead of the curve, and navigate a landscape that is no longer defined by the balance sheet alone.
</description>
    <language>en-us</language>
    <itunes:type>episodic</itunes:type>
    <itunes:subtitle>Fresh ideas from Smith School of Business at Queen’s University</itunes:subtitle>
    <itunes:author>Smith Business Insight</itunes:author>
    <itunes:summary>Tune in for a different take on business, with professors, researchers and experts from Smith School of Business. Understand the rapidly evolving corporate world, stay ahead of the curve, and navigate a landscape that is no longer defined by the balance sheet alone.
</itunes:summary>
    <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/2/248a6c23-a4d6-42d7-9703-916c3caec8e7/cover.jpg?v=3"/>
    <itunes:explicit>no</itunes:explicit>
    <itunes:owner>
      <itunes:name>Smith Business Insight</itunes:name>
      <itunes:email>smithinsight@queensu.ca</itunes:email>
    </itunes:owner>
<itunes:category text="Business">
  <itunes:category text="Careers"/>
</itunes:category>
<itunes:category text="Business">
  <itunes:category text="Marketing"/>
</itunes:category>
<item>
  <title>Episode 18: AI Reality Check: Battling Bias</title>
  <link>https://smithinsight.fireside.fm/18</link>
  <guid isPermaLink="false">7a4c0b6b-8905-4113-9a1f-91bc7c0e9c09</guid>
  <pubDate>Tue, 14 Nov 2023 08:00:00 -0500</pubDate>
  <author>Smith Business Insight</author>
  <enclosure url="https://aphid.fireside.fm/d/1437767933/248a6c23-a4d6-42d7-9703-916c3caec8e7/7a4c0b6b-8905-4113-9a1f-91bc7c0e9c09.mp3" length="48580811" type="audio/mpeg"/>
  <itunes:episodeType>full</itunes:episodeType>
  <itunes:season>4</itunes:season>
  <itunes:author>Smith Business Insight</itunes:author>
  <itunes:subtitle>Algorithms hobbled by human biases are playing havoc with people’s lives. How do we best respond to the challenge?</itunes:subtitle>
  <itunes:duration>33:42</itunes:duration>
  <itunes:explicit>no</itunes:explicit>
  <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/2/248a6c23-a4d6-42d7-9703-916c3caec8e7/episodes/7/7a4c0b6b-8905-4113-9a1f-91bc7c0e9c09/cover.jpg?v=1"/>
  <description>From high school students given the wrong marks just as they’re applying to university to Black defendants misclassified as higher risk for reoffending, AI is driving unfair and damaging outcomes. Technology firms promise what they call Responsible AI. But can they really deliver if they can’t keep up with the speed of change? Can governments impose ethical standards and safe use of AI-based systems and products?
In this episode, Anton Ovchinnikov, Distinguished Professor of Management Analytics at Smith School of Business, discusses his groundbreaking research into the government response to algorithmic bias and what happens when large language models are fed AI-generated synthetic content rather than human-generated content. He is joined in conversation by host Meredith Dault.
Special Guest: Anton Ovchinnikov.
</description>
  <itunes:keywords>smith, smith business, smith insight, AI-driven unfair outcomes, Algorithmic bias, Responsible AI, Ethical standards, Safe use of AI, Government response to bias, AI-generated synthetic content, Technology firms, Speed of change, Distinguished Professor, Management Analytics, Anton Ovchinnikov, Smith School of Business, Large language models, Human-generated content, Meredith Dault</itunes:keywords>
  <content:encoded>
    <![CDATA[<p>From high school students given the wrong marks just as they’re applying to university to Black defendants misclassified as higher risk for reoffending, AI is driving unfair and damaging outcomes. Technology firms promise what they call Responsible AI. But can they really deliver if they can’t keep up with the speed of change? Can governments impose ethical standards and safe use of AI-based systems and products?</p>

<p>In this episode, Anton Ovchinnikov, Distinguished Professor of Management Analytics at Smith School of Business, discusses his groundbreaking research into the government response to algorithmic bias and what happens when large language models are fed AI-generated synthetic content rather than human-generated content. He is joined in conversation by host Meredith Dault.</p><p>Special Guest: Anton Ovchinnikov.</p>]]>
  </content:encoded>
  <itunes:summary>
    <![CDATA[<p>From high school students given the wrong marks just as they’re applying to university to Black defendants misclassified as higher risk for reoffending, AI is driving unfair and damaging outcomes. Technology firms promise what they call Responsible AI. But can they really deliver if they can’t keep up with the speed of change? Can governments impose ethical standards and safe use of AI-based systems and products?</p>

<p>In this episode, Anton Ovchinnikov, Distinguished Professor of Management Analytics at Smith School of Business, discusses his groundbreaking research into the government response to algorithmic bias and what happens when large language models are fed AI-generated synthetic content rather than human-generated content. He is joined in conversation by host Meredith Dault.</p><p>Special Guest: Anton Ovchinnikov.</p>]]>
  </itunes:summary>
</item>
  </channel>
</rss>
