The complexity of rapidly evolving technologies deters many users, let alone
human rights advocates, from believing that standards could ever be set with
which the ‘big tech’ companies could be made to comply.
As technology was part of the focus on this second day in Geneva, familiar
stories arose about authoritarian governments using Western-based technologies
to impose censorship or restrict protest, and about tech companies themselves
becoming part of a new surveillance culture.
Barriers to ensuring that technologies respect the UN Guiding Principles on
Business and Human Rights (UNGPs) mirror concerns about the perceived
difficulty, and questionable desirability, of regulating the internet and all
of its uses.
Technology governance
However, governance applies to technology as it does to every other sector:
regulation, standard-setting, industry initiatives and public accountability
all have a role here, too.
Already tech giants including Facebook, Google, LinkedIn,
Microsoft, Nokia, Orange and Vodafone have formed the Global
Network Initiative — which has produced
principles, guidelines and tools for companies to assess human rights risk in
new technologies.
The UN Special Rapporteurs on Freedom of Opinion and Expression and on Privacy
have also issued reports outlining norms for the technology sector.
This was taken a step further yesterday, on day one, when the Office of the UN
High Commissioner for Human Rights unveiled a new draft “UNGPs compass”, which
will give authoritative, UN-level guidance on how technology can and must
respect the Guiding Principles.
Technology companies need to be redefined in three ways, according to Lisa
Hsin of Oxford’s Bonavero Institute, who helped draw up the draft guidance: as
part of infrastructure, similar to railways and supply chains; as regulatory
mechanisms in their own right that govern people’s lives; and as operating in
a context of information asymmetry between companies and citizens. Each of
those, she said, has implications for regulation.
Concrete examples
The Forum heard concrete examples of regulation, including Europe’s Digital
Services Act, in force since earlier this month, which requires IT platforms
to assess risk from the user’s point of view. Also presented was a current
bill in the Brazilian Senate that seeks to address widespread concerns about
the perpetuation of bias and discrimination in decisions taken by artificial
intelligence.
It has already been estimated that 85 percent of customer interactions are
handled by machines rather than humans, using chatbots and self-service
technologies. This is transforming markets, and with them the rules that
markets need.
However, researcher Clara Keller warned that current legislative initiatives
still reflect a principles-based approach, which she argued should be
superseded by a rights-based approach, so that users are informed about the
impacts on them and can access remedy where their rights have been violated.
The furore around the so-called “Facebook Papers” showed the need for more
rigorous protection of whistleblowers, according to Gaya Khandhadai, Head of
Technology and Human Rights at the Business and Human Rights Resource Centre.
Indeed, it’s ironic that companies that invoke freedom of expression to defend
their own actions are all too ready to try to silence those who seek to
criticise them.
Positives
The digital revolution has shown huge positive potential: extending distance
education opportunities across Africa; providing closed blockchain models that
offer assurance against supply chain violations, from illegal fishing to
conflict minerals; and applying big data in ways that are transforming our
capacity to detect and prevent the causes of climate change and the
humanitarian disasters that follow.
The lesson from this week’s debate, however, is that digital empowerment must go
alongside digital innovation; only by providing systems for individuals to hold
providers to account can equality be pursued, not diminished.
The new ‘UNGPs compass’ will be posted online in the next two weeks, with full
details for public comment before it is finalised.
Highlights
Other highlights on Day Two included a session on investor responsibilities,
in which Domini Impact Investments’ Mary Beth Gallagher argued for the finance
sector to meet directly with people whose human rights have been affected by
the companies in which it invests.
The annual session on the long-running consultations towards creating a
Binding Treaty on Business and Human Rights was both better attended and
more constructive in tone this year. The Head of Business and Human Rights for the
German Foreign Office, Wolfgang
Bindseil, even
announced that his Government is organising a conference next year to support
international agreement around access to remedy.
Continuing the whistleblower theme, a session on the role of human rights
defenders demonstrated their importance in protecting workers’ and environmental
rights and in combating corruption.
However, it was the technology session that most caught the spirit of this
week: bringing rights holders to the centre of deliberations. If that can
apply to the Cloud, the Internet of Things and the Metaverse, it really can
apply anywhere.
Richard Howitt is a strategic adviser on Corporate Responsibility and Sustainability, Business and Human Rights. He is also a Board member, lecturer at Audencia Business School and host of the Frank Bold ‘Frankly Speaking’ responsible business podcast. Richard was Member of the European Parliament responsible for the EU’s first rules on corporate sustainability reporting and subsequently Chief Executive Officer of the International Integrated Reporting Council.
Published Nov 30, 2022 1pm EST / 10am PST / 6pm GMT / 7pm CET