Thinking about the social cost of technology
Every time I call my mum for a chat there's usually a point on the phone call where she'll hesitate and then, apologizing in advance, bring up her latest technological conundrum.
An email she's received from her email provider warning that she needs to upgrade the operating system of her device or lose access to the app. Or messages she's sent via such and such a messaging service that were never received, or only arrived days later. Or she'll ask again how to find a particular photo she'd been sent by email, and how to save and download it so she can take it to a shop for printing.
Why is it that her printer suddenly now only prints text unreadably small, she once asked me. And why had the word processing package locked itself on double spacing? And could I tell her why the cursor kept jumping around when she typed, making her keep losing her place in the document?
Another time she wanted to know why video calling no longer worked after an operating system upgrade. Ever since then, her concern has always been whether she should upgrade to the latest OS at all -- if that means other applications might stop working.
Yet another time she wanted to know why the video app she always used was suddenly asking her to sign into an account she didn't think she had just to view the same content. She hadn't had to do that before.
Other problems she's run into aren't even offered as questions. She'll just say she's forgotten the password to such and such an account and so it's hopeless because it's impossible to access it.
Most of the time it's hard to remote-fix these issues because the specific wrinkle or niggle isn't the real problem anyway. The overarching issue is the growing complexity of technology itself, and the demands this puts on people to understand an ever widening taxonomy of interconnected component parts and processes. To mesh willingly with the system and to absorb its unlovely lexicon.
And then, when things invariably go wrong, to deconstruct its unpleasant, inscrutable missives and make like an engineer and try to fix the stuff yourself.
Technologists apparently feel justified in setting up a deepening fog of user confusion as they shift the upgrade levers up another gear to reconfigure the 'next reality', while their CEOs eye the prize of sucking up more consumer dollars.
Meanwhile, 'users' like my mum are left with another cryptic puzzle of unfamiliar pieces to try to slot back together and -- they hope -- return the tool to the state of utility it was in before everything changed on them again.
These people will increasingly feel left behind and unplugged from a society where technology is playing an ever greater day-to-day role, and also an ever greater, yet largely unseen, role in shaping day-to-day society by controlling so many things we see and do. AI is the silent decision maker that really scales.
The frustration and stress caused by complex technologies that can seem unknowable -- not to mention the time and mindshare that gets wasted trying to make systems work as people want them to work -- doesn't tend to get talked about in the slick presentations of tech firms with their laser pointers fixed on the future and their intent locked on winning the game of the next big thing.
All too often the fact that human lives are increasingly enmeshed with and dependent on ever more complex, and ever more inscrutable, technologies is considered a good thing. Negatives don't generally get dwelled on. And for the most part people are expected to move along, or be moved along by the tech.
That's the price of progress, goes the short sharp shrug. Users are expected to use the tool -- and take responsibility for not being confused by the tool.
But what if the user can't properly use the system because they don't know how to? Are they at fault? Or is it the designers failing to properly articulate what they've built and pushed out at such scale? And failing to layer complexity in a way that does not alienate and exclude?
And what happens when the tool becomes so all consuming of people's attention and so capable of pushing individual buttons it becomes a mainstream source of public opinion? And does so without showing its workings. Without making it clear it's actually presenting a filtered, algorithmically controlled view.
Provided by: https://www.yahoo.com