globinfo
freexchange
“The
Glass Cage [a new book by Nicholas Carr] examines the possibility
that businesses are moving too quickly to automate white collar jobs,
sophisticated tasks and mental work, and are increasingly reliant on
automated decision-making and predictive analytics. It warns of the
potential de-skilling of the workforce, including software
developers, as larger shares of work processes are turned over to
machines.”
"This book
is not a defense of Luddites. It's a well-anchored examination of the
consequences and impact about deploying systems designed to replace
us. Carr's concerns are illustrated and found in, for instance, the
Federal Aviation Administration's warning to airlines about
automation, and how electronic medical records may actually be
raising costs and hurting healthcare.”
“...
until the development of software that can do analysis, make
judgments, sense the environment, we've never had tools, machines
that can take over professional work in the way that we're seeing
today. That doesn't mean take it over necessarily entirely, but
become the means through which professionals do their jobs, do
analytical work, make decisions, and so forth. It's a matter of the
scope of automation being so much broader today and growing ever more
broad with each kind of passing year.”
“A good
example is the self-driving car that Google, and now other car
makers, are manufacturing. We're certainly not to the point where you
can send a fully autonomous vehicle out into real-world traffic
without a backup driver. But it's clear that we're now at the point
where we can begin sending robots out into the world to act
autonomously in a way that was just impossible even 10 years ago.
We're also seeing, with new machine-learning algorithms and
predictive algorithms, the ability to analyze, assess information,
collect that, interpret it automatically and pump out predictions,
decisions and judgments. Really, in the last five years or so, we
have opened up a new era in automation, and you have to assume the
capabilities in those areas are going to continue to grow, and grow
pretty rapidly.”
“We
have to figure out how to best balance the responsibilities between
the human expert or professional and computer. I think we're going
down the wrong path right now. We're too quick to hand over too much
responsibility to the computer and what that ends up doing is leaving
the expert or professional in a kind of a passive role: looking at
monitors, following templates, entering data. The problem, and we see
it with pilots and doctors, is when the computer fails, when either
the technology breaks down, or the computer comes up against some
situation that it hasn't been programmed to handle, then the human
being has to jump back in and take control, and too often we have allowed
the human experts' skills to get rusty and their situational awareness
to fade away and so they make mistakes. At the practical level, we
can be smarter and wiser about how we go about automating and make
sure that we keep the human engaged.”
“Then
we have the philosophical side, what are human beings for? What gives
meaning to our lives and fulfills us? And it turns out that it is
usually doing hard work in the real world, grappling with hard
challenges, overcoming them, expanding our talents, engaging with
difficult situations. Unfortunately, that is the kind of effort that
software programmers, for good reasons of their own, seek to
alleviate today. There is a kind of philosophical tension or even
existential tension between our desire to offload hard challenges
onto computers, and the fact that as human beings, we gain
fulfilment and satisfaction and meaning through struggling with hard
challenges.”
“I
think we have a choice about whether we do this wisely and
humanistically, or we take the road that I think we're on right now,
which is to take a misanthropic view of technological progress and
just say 'give computers everything they can possibly do and give
human beings whatever is left over.' I think that's a recipe for
diminishing the quality of life and ultimately short-circuiting
progress.”
Full
article and interview:
http://www.computerworld.com/article/2845383/how-automation-could-take-your-skills-and-your-job.html
Further
philosophical extensions:
Nevertheless, another question emerges: what if AI is meant to be the
next step of human evolution itself? What difference does it make when
we progressively abolish hard-won human concepts, like morality, from
our culture?
If we truly want to evolve as humanity, we need to bring back morality.
We need to develop concepts like solidarity, altruism and collectivity,
and put them at the core of our civilization. Otherwise, it would make
no difference, and it would probably be better, to be replaced by
super-intelligent machines.
Socio-political consequences, especially in the field of class
conflict:
Hyper-automation is the key for the ruling class to break the social
contract, precisely because it no longer needs human labor. The
culture of extreme individualism serves this plan perfectly, because
for decades generations learned to grow up just to consume and protect
their individual rights without caring for others (this is the general
picture, of course; there are exceptions). The “inflation” of the
middle class based on this one-dimensional culture destroyed any class
consciousness, fed apolitical generations, and is now used as a tool
to trigger conflicts of interest within the shrinking middle class,
always on an economic basis.