The thorn on your side
What do you want to get from a community of Data Ethics Leaders and Practitioners?
When Shilpi Agarwal of DataEthics4All posed the above question on her community page, the only answer that popped into my head was “Several copies of Christina Maslach, please!”
For those of you who don’t know who that is: she was the then girlfriend, and now wife, of Professor Zimbardo of the notorious Stanford Prison Experiment of 1971. She is credited with stopping the experiment on its sixth day, after observing the horrific abuse of human dignity firsthand.
Now you may object and say that there is no connection between data ethics and atrocities of that degree.
But I disagree.
Whether it is the direct abuse of humans or the abuse of human data against humanity, to me they are all the same.
The prisoners and the guards in that study (I can’t continue to call it an experiment, as it followed no scientific method, a point Prof. Zimbardo himself has conceded) were victims of data abuse: data about human behavior that the researchers were privy to. The guards were instructed on how to behave based on that data, and the prisoners were manipulated based on previously available data from behavioral and psychological studies. Neither side understood the data about their own behavior prior to the study.
Christina Maslach was by no means a fairy godmother, although I am sure she is a wonderful human being. Nor was Prof. Zimbardo a super-evil human being; I have no reason to suspect he was.
This was no epic battle between good vs evil.
He was trapped in his own filter bubble. And she was the needle that burst it wide open.
Simple as that.
In May 2018, exactly two years ago, I was one of the millions listening to Sundar Pichai demonstrate Google Duplex at the I/O conference: a virtual assistant that makes restaurant bookings over the phone while mimicking human speech. In the demo, the bot calls unsuspecting restaurant and salon reservation agents and books a spot successfully.
Through the deafening rounds of applause, you can hear laughter and giggles from the attendees as the human on the other end of the call fails to recognize that it’s a bot. For the audience, it was a successful demo of advanced NLP technology.
Outside the conference, many sat in utter horror — including myself.
How could Google not recognize the ethical concerns of this application? Did they not realize that humans would want to be informed before conversing with a bot? They did not even try to appear ethically responsible on that day.
There were no gasps from the audience. Only appreciation. When there is not even a pretense of transparency, that is an alarming symptom of bubble blindness.
But that changed swiftly, following the tsunami of backlash. Google corrected course within a week, saying that the technology would have disclosure built in. They even claimed that had been part of the plan all along.
Yet, there was no mention of that during the demo.
Nada.
To me, it was the laughter of the audience during the demo that was most bothersome. Laughing at a fellow human being deceived by AI.
Painful to listen to even now, as it reminds me of how vulnerable we all are to groupthink and situational forces. To laugh at the demise of our own dignity and identity as humans.
A clear indication of a bubble. Everyone in that conference room was in a bubble. No, they were not versions of Dr. Evil from the Austin Powers movies. They were ordinary humans caught up in their scientific achievements. Engrossed in the moment, marveling at their masterpiece, while completely blind to its unethical applications.
But then…
A community of Christina Maslachs shouted from their rooftops — and the bubble burst.
Those were the data ethics leaders.
The needles that prick and burst bubbles. Exposing those inside the bubble to see the perspective of those outside.
And that is the community we need to build.
We must start soon. With the young. And be aggressive.
How soon? Yesterday is already late.
We are in the middle of the worst pandemic of our lifetimes. Not only are lives being lost to a vicious virus; lives are also on the line to another sinister monster: the unethical use of data. This pandemic is turning into fertile ground for the misuse of human data at a scale unheard of in human history.
Contact tracing apps are being developed at a rapid pace by many countries. Many have already launched theirs, and some have already started using them. Countries that believe in democracy, as well as those that don’t.
And that is not even the scary part.
With no oversight in sight. Most of the apps require no consent from users. Many are mandatory and come with fines and imprisonment for failure to comply. Some will come pre-installed on new phones with no option to uninstall. And no one knows how the data will be used, now or later.
Many don’t have a sunset date: a preset period after which the app becomes void. They might live on forever, citing different reasons for their survival.
The risk-to-benefit ratio is not openly debated. We are told these apps are meant to keep us safe from the virus, but who is tracking these apps to keep us safe from future abuses?
Nobody.
Millions of people will be subject to this tracking. Many don’t even understand how their data can be misused. It is up to those of us who are privy to such knowledge to stand up for those who are most vulnerable. To speak up. To shout from our balconies.
Yes, it is great that we all banged our pots and pans to support the front-line workers. They deserve much more than that, but it was what we could give them in the moment. And we did try to do more: we shared tributes, wrote poetry, painted paintings, created murals, printed banners, held drive-by parades, did flyovers, stitched masks, and raised funds for them.
Still, it pales in comparison to what they have done for us. They sacrificed a lot, some even their lives, while we just sat on our couches.
They deserve more. I agree.
But we deserve more too. It is time we banged pots and pans to protest the misuse of our data. Many suggest we should applaud and cheer Google and Apple for leading the way on privacy protections in contact tracing apps. Seriously? Have we come to the point where we praise companies for doing the right thing? Cheer them for protecting my data and yours?
Are we so used to being taken for granted that we now thank companies for not doing it anymore? No, we don’t need to thank them. We need to demand more privacy rights and protections going forward. Privacy cannot be an afterthought. It must be a given.
It is wonderful that Google (and Apple) are setting the bar high for privacy in contact tracing apps. About time, I say! Their creed of “Don’t be evil” was always passive. You can say “Don’t be evil, don’t stand up, don’t cry, don’t shout, don’t be unethical” to someone in the grave, and they will accomplish all of it with 100 percent success!
It takes a lot more effort to do good. To stand up. To cry. To shout. And to be ethical. To do the right thing. Not just once, but always.
Now is the time for today’s Christina Maslachs to show up! And time for us to raise the next batch of lookouts.
How young? Before they are on social media.
One of the biggest filter bubble kingdoms exists on social media platforms. We are all inside our own bubbles there. Many don’t know that news feeds are curated, personalized, and tailored to our beliefs and worldviews. The algorithms look at what we have viewed, liked, shared, and commented on, and then give us more of what we want. We see more of what we have seen before. We assume that is how the world sees things too. We think others think like us.
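To make that feedback loop concrete, here is a minimal sketch of an engagement-weighted feed. The topics, the boost factor, and the user’s behavior are all invented for illustration; real platforms are vastly more complex, but the self-reinforcing dynamic is the same:

```python
import random

# Toy engagement-based feed. Topics and the boost factor are
# assumptions made up for this sketch.
TOPICS = ["politics-left", "politics-right", "sports", "science"]

# Start neutral: every topic is equally likely to be shown.
engagement = {topic: 1.0 for topic in TOPICS}

def next_feed_item():
    """Pick the next post's topic, weighted by past engagement."""
    weights = [engagement[t] for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=1)[0]

def react(topic, liked):
    """Every like boosts that topic, so the loop feeds on itself."""
    if liked:
        engagement[topic] *= 1.5  # assumed boost per interaction

# Simulate a user who only ever likes one topic.
for _ in range(300):
    topic = next_feed_item()
    react(topic, liked=(topic == "politics-left"))

# After a few hundred items, the feed is almost entirely one topic.
total = sum(engagement.values())
for topic, weight in engagement.items():
    print(f"{topic}: {weight / total:.1%} of the feed")
```

Run it and the simulated feed converges on the one topic the user likes; everything else all but disappears. That is the bubble.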
In many ways, this is a regression of Theory of Mind, something we all developed by age four: the ability to understand that others have thoughts and views that differ from ours. Now we have become so egotistic as to think that others believe what we believe. When that view is challenged, we demonize the other side. This has made us more divided.
As adults who grew up before the advent of social media, we know how to seek out unbiased sources of information. But kids growing up now will never know a world before Facebook, Twitter, and Instagram. Their main sources of information are those feeds.
Even someone like me, born before Google and Facebook, has a difficult time finding non-partisan news sources.
Purely fact-based, evidence-backed sources of information on the pandemic are as scarce as toilet paper was a few weeks ago!
Kids stand no chance. We may not be able to change these platforms, but we can educate our children to be aware of the dangers. They must know how news feeds work. How algorithms get trained. How ads are targeted. What goes on behind the scenes. We must pull back the curtain on the Wizard of Oz. Like Dorothy, they must realize that they are not in Kansas anymore!
How aggressive? Get prickly.
I have already used the analogy of a needle. To use biblical terms: be a thorn in their side. The prickly, pain-in-the-neck kind. A thorn on the right side of history. One on your side.
I see many organizations and writers playing softball when it comes to addressing data ethics. The monster is already out. It is getting too big to contain. Now is not the time to be passive. Now is not the time to play nice.
To be diplomatic now is unethical.
Now is the time to rise up and shout. People inside the bubbles need to hear you. They need to hear your voice. If you could step out to your balcony and make such a difference — then imagine what you could do with your agency.
Your perspective matters. You matter. You are the gatekeeper. You are the thorn.