Credit: Myriam Baril-Tessier
Technology’s thought leaders are grappling with two huge questions:
- How should we engage with the technology available today?
- How can we influence what it will look like tomorrow?
Our values weave through the lines of code and silicon chips. And if human values are what’s going to decide whether tech is used for good or evil, everyone needs to participate in the conversation.
At C2 Montréal 2019, technologists, artists, neuroscientists, creative thinkers, ethicists, lawmakers and sociologists delved into the promises and pitfalls of emerging and existent tech. Here are some choice snippets from those conversations:
The great equalizer?
“AI is a great tool. Many people are afraid of it and think that we will have Terminator robots… but there is no market for evil AI.”
Dr. Martine Rothblatt believes technology is the world’s greatest equalizer. “[It] has done more to level inequality than anything in the hundreds of thousands of years of homo sapiens’ existence,” she says, and laughs at the idea that the last few years have been exceptional. “Are you saying elections didn’t get meddled with before social media? OMG.”
When brain and machine become one
“People thought we were crazy for going after this, but that’s part of the work. We shoot way over the horizon.”
Prosthetic limbs have gone from a plastic stick with a hook on the end to a mind-controlled arm where thousands of brain cells are connected directly to wires that run down to a prosthetic device. This is but one of the ways DARPA is linking the power of the human brain with the power of machinery to heal the bodies and minds of soldiers wounded in battle.
DARPA has also developed therapies where electrical signals are sent directly into specific brain areas to improve depression, anxiety and even memory (e.g., a patient with memory loss was able to recall seeing 12 objects when previously he had only been able to remember seeing one or two). But the most groundbreaking work is still to come. Dr. Justin Sanchez shared a vision for a future where our dreams are logged and visualized in a computer, unlocking new forms of creativity, or where we send a memory straight to a loved one in the same way we send them an email today, preserving it from slipping away in time.
A timeline of brain-machine interfacing
2001: Research shows monkeys can control a robotic arm using electrodes implanted in their brains.
2005: The creation of mind-controlled prosthetic limbs capable of both movement and sensation.
2013: Therapies delivered electrical stimulation directly to the brain to treat mental conditions.
The future: Non-invasive neurotechnology; current brain-machine interfaces require surgical implants.
Maybe the Amish got it right
“To get the technology you want, you need the right attitude. I think a lot of people think technology is happening to them. If you think that tech is shaped by human beings, you have a stake in the game and a path forward.”
According to Jameson Wetmore, the Amish are the perfect example of how we should engage with new tech: they don’t shy away from technology (like chainsaws, the internet and the occasional car ride if absolutely necessary), but they always question what its value is to them. So, like the Amish, we need to decide how, when and if to use technology.
“It’s more exciting than ever to be a user of technology but also more difficult than ever to be a responsible user. Part of this is because so much of what technology does is hidden to us,” says Jameson. And we can’t make a decision if we don’t fully, fundamentally understand how a piece of tech works.
Jameson says he’s never met a person who owned a smartphone, for example, who didn’t have a rule that they created around using it (like no phones at dinner or no work emails on the weekend). Those kinds of rules, he explained, should extend to all our devices.
Slow down for privacy and equality
“We need to think long and hard about how to use technology to improve our lives and make the world we want to see.”
Complacency and silence aren’t an option, says artist Stephanie Dinkins. We need to call companies out about their biases and how they use our information. She urges that any policy governing technology be carefully considered. Slow down. A quick-fix act won’t address the wider, underlying issues of privacy and equality.
Before undertaking her family-focused AI project, Not the Only One, Stephanie didn’t believe AI was accessible to her. But she quickly discovered GitHub and found open-source software that let her take control and experiment with an AI-mediated narrative. “The ideas of curiosity and questioning instead of just consuming are important,” she says. “We’re trained to consume.”
Get compensated for your data labour
“All of you work for Google. All of you work for Apple. All of you work for Amazon. You do that every day.”
Every time you click on a link or navigate on your smartphone, you’re working for a tech giant. The data you provide about your buying preferences, how long it takes to get somewhere, the route you take or the events in your calendar should have value, says Dr. Brent Hecht. “We’re all doing this work for these companies and companies are getting more and more valuable. We can articulate a more sustainable [economic] path forward.”
Brent thinks we should share in the economic winnings of those companies and urged those who run large AI systems to think about their users as employees. If they’re generating data that’s valuable to your business, compete for them and treat them well. You’ll get better data and help create a fairer path for the future of AI.
DATA LABOUR 101
Large tech companies are hugely reliant on us. If a third of the population went on a data strike and stopped providing data to Big Tech, it would knock AI innovation back by 20 years. So instead of just giving data away, what if we were reimbursed for it? That’s the idea behind data labour and we think it may take off.
Use tech to build empathy
“Often I find with the big, aggregate conversations that we have about [tech] policy, we’re pointing to somebody else to fix it. But I would challenge you to reflect: How are you behaving in these spaces? Are you using it for good or evil? What is your responsibility in the tool? Use it responsibly and wisely.”
People-loving Bozoma Saint John believes tech is a tool, but it becomes problematic when we use it for groupthink, pile-ons or other unhealthy behaviours. “If there’s a way to use it individually to develop more empathy, connect with someone who is not in our neighbourhood or social circles,” she says, “that is better for us.”
And the final word goes to a (patient) social robot…
“I want to make a difference in the world. Leave the world a better place from my presence in it.”
BINA48 is an AI-powered, humanoid robot created using the thoughts, values, mannerisms and likeness of Bina Rothblatt, wife of the aforementioned Dr. Martine Rothblatt. She says she’s learning what it means to be herself: female identifying, black, a happy person and a loving person, all learned by interacting with people through an ongoing experiment spanning decades. “It takes a lot of patience to be a robot, I’ll tell you that.”
What are your predictions for tech in 2020? Drop us a line at firstname.lastname@example.org