Mirror, mirror, on the wall… …should this guy get bail or not? #46 #cong23 #reality

Synopsis:

My contention is that even though AI (Generative AI) can’t draw a realistic hand to save its life, it is a powerful window into a reality we might otherwise not see.

Total Words

946

Reading Time in Minutes

4

Key Takeaways:

  1. AI is not just a bad renderer of human hands.
  2. AI is a mirror that shows us truths we might not want to see, but should.
  3. The material we use to train AI is a fair representation of ourselves. And the cold, unbiased eye of AI is the perfect way to see the truths contained in it.
  4. AI can show you the truth, but it’s up to you to do something about it.

About Richard Ryan

I have worked in advertising for approximately 30 years. I am a copywriter, which means I wrote the very words that made you choose that specific box of cornflakes, that cellphone plan, or that midrange server.

I work at a small, full-service ad agency in Brooklyn, NY, called Something Different. What actually makes us something different is that we solve your business problems with smart, plain-spoken, deeply human ideas. It's what every agency should do, but sadly doesn't.

I live in New Jersey, where I enjoy having four distinct seasons.

Contacting Richard Ryan

You can check out Richard’s personal site, and the Something Different Agency or send him an email.

By Richard Ryan

We’ve all sniggered at the oddly webbed, six-fingered hands that AI draws for us. Or laughed when Microsoft’s Bing chatbot tried to gaslight a New York Times reporter and convince him to leave his wife for the program. And then there’s the Pepperoni Hug Spot commercial.

But don’t let that sideshow fool you.

I think AI is a powerful window into our reality. Or, to be more precise, a mirror. A mirror that shows us truths we might not want to see, but should.

Consider how Generative or Creative AI works. We feed it a set of things. The more the better. Things we write, draw and create. Images. Books. Letters. Scientific papers. Greek poetry. Whatever we want. And it absorbs them all. Then, using its super complicated algorithms, it “learns” what we’re showing it. It sees the patterns in what we’ve done. And then tries to recreate it. By guessing. Based on what it saw. It’s a hugely powerful trick. This way it can learn to code. Or converse in Chinese. Or, if we give it millions of mammograms and medical data, it can learn to spot breast cancers with uncanny accuracy.

You could argue that it doesn’t actually understand anything. It’s not filtered or underpinned by emotion or beliefs or context. It just spits back the reality of what it sees.

So to my point. What does it see? Well, it was recently reported that when you ask Midjourney (which is a picture-generating AI) to create pictures of doctors, what it sends back are images of white men.

Possibly not what you’d expect, but it’s reflecting back what it has seen. It’s the truth.

What do those images tell us about our reality? Or about opportunity? Or about whether we really value diversity?

Admittedly, although it’s a thought-provoking fact, those are just pictures. No harm done. But that’s not always the case.

I said AI has taught itself to read mammograms. It’s way better and much faster than humans. It’s so good, doctors don’t quite understand what it’s seeing, or how it does it, but it has saved people’s lives. The problem is, while it’s very good at spotting cancers in white women, it’s not so good at spotting breast cancers in people of color.

That also teaches us something about our reality.

Because – just as with the doctor pictures – the data sets we’re using to train it are from real life, taken from a health care system that is biased and skewed.

The reality our AI is reflecting back at us is a reality where we don’t treat people equally. We treat some people worse.

That’s what the mirror is showing us.

In March of this year a judge in India couldn’t decide whether to grant bail to a murder suspect, so he just asked ChatGPT to give him the answer. ChatGPT said the guy didn’t deserve bail because the program considered him “a danger to the community and a flight risk.” So the judge said fair enough and sent him back to jail.

Of course that’s a story of one lazy judge. That behavior would never become institutionalized, right? Wrong. Unfortunately, it could.

Right now, if you’re booked into jail in New Jersey, the judge deciding whether to send you to jail or not has a small black box that uses risk-assessment algorithms to help him make his decision. Not quite autonomous. At least not yet. But when that AI does come online, what data sets will be used to teach it? Whichever they are, they won’t be equitable. The data sets that comprise all the information on the US incarceration system were built up over centuries of hugely racist government policies.

So the decisions that AI will return – either go to jail or go home – will reflect and reinforce a reality that isn’t remotely fair.

That won’t be a few harmless pictures of white doctors, that’ll be someone’s life.

So the next time your AI doesn’t send you back quite what you’re expecting, don’t blame it for not getting reality right. Consider that, in its unvarnished, unemotional way, it may be getting reality exactly right.

Then, once we see that reality, consider what we want to do about it.

Reality has Become Fuzzy #45 #cong23 #reality

Synopsis:

My understanding of what is real and what constitutes reality seems to be constantly shifting these days… reality is getting fuzzy and I’m just learning how to deal.

Total Words

696

Reading Time in Minutes

3

Key Takeaways:

  1. Reality no longer means what I think it means.
  2. I think I’m going to have to deal with life being a little more fuzzy.

About Clare Dillon:

Clare loves Congregation, and the discussions she has there. She is currently researching how developers can better collaborate to create and maintain software. She works with InnerSource and open source communities on the side.  She’s very sorry her submission came so late this year.

Contacting Clare Dillon:

You can connect with Clare on LinkedIn or send her an email.

By Clare Dillon

One of my favourite movies is The Princess Bride. I often find the characters inhabiting my head when their words seem relevant. When I heard of this theme, I could hear Inigo Montoya say, “You keep using that word. I do not think it means what you think it means.”

 

And fundamentally, I agree with Inigo. I used to think I knew what reality meant – but there have been so many times in the last few years where my perception of reality has been challenged, that at the moment, I think my understanding of the word is a little fuzzy.

Here is a list of just some of the times when my understanding of what “reality” is has been challenged…

  • Learning about quantum theory and entanglement and how a quantum system can exist in a combination of two or more possible states at the same time. Mind blowing.
  • Experiencing augmented reality for the first time with a Microsoft HoloLens and “seeing” large mech invaders emerge from the wall of the room. As I started ducking and diving, I had that peculiar experience of knowing they were not real even as my body and mind were reacting as if we were under attack.
  • That time in Cong when I was introduced to the idea from James Joyce of “two thinks at a time” and “twosome twiminds.” That started me thinking about how two seemingly contradictory perspectives can be held as true at the same time.
  • Donald Trump becoming president. Who would’ve thunk that was possible when we saw him in Home Alone 2.
  • Thinking about just how “real” some artificial intelligence output seems.
  • Learning about all the cognitive biases at play in our minds that mean we rarely make decisions based on rational thinking. Realising just how little control we have over what we think.
  • And that time, just a little while ago, when I woke up from a particularly vivid dream of talking with my mother and believed, in that moment, that she was still alive and with us in this world….

My understanding of what is real and what constitutes reality seems to be constantly shifting these days…

After hearing there is a poetry jam once again at this Cong, here is a short poem ChatGPT came up with on the topic, which I quite like:

Reality is fuzzy, we are just going to have to learn how to deal

With the uncertainty and ambiguity that we often feel

Sometimes we think we know the truth, but then we find out we were wrong

Sometimes we see things differently, but then we have to get along

Reality is fuzzy, we are just going to have to learn how to cope

With the complexity and diversity that we often hope

To understand and appreciate, but then we face some challenges

To communicate and collaborate, but then we need some balances

Reality is fuzzy, we are just going to have to learn how to grow

With the creativity and curiosity that we often show

To explore and discover, but then we have to make some choices

To create and innovate, but then we have to use our voices