Ethical Explorer: for an ethical digital world, today and in the future


Developer ethics has been addressed here before. Avenga Labs is bringing the topic back because there’s an interesting new guide and set of tools that might help all digital solution builders. It is called the Ethical Explorer.

The sphere of ethics in digital transformation, and in system development in particular, is full of good intentions but short on the concrete details and methods needed to achieve the desired state.

“Move fast and break things” used to be the mantra of the wild period of digitalization. Now we all want to move fast without breaking things, including human rights, especially in uncharted territories that are poorly addressed by existing laws and regulations. Building ethical software and data processing might sometimes seem to go against short-term gains, but in the medium and long term it helps to retain customers and earn their loyalty. It should also become an integral part of any customer experience.

How do we build solutions that avoid known technological pitfalls? How do we earn customers’ trust by adding ethics to the equation?

What is the Ethical Explorer?

The Ethical Explorer is a set of organizational tools and techniques that help address the ethical side of digital solutions. Let me briefly explain what it is all about.

Important tech risks

The authors of the Ethical Explorer identified several key technology risk areas.

Surveillance

Digital products gather ever more information about their users, that is, about us. How can this data be used against us? With the rise of biometric technology, the risk is even more pronounced.

Our privacy is already endangered, and we addressed this previously in our Human Digital Twins article.

Even as regulations grow stronger and stronger, at least in EU countries with the incoming AI regulations, it is the people working on particular projects who decide how these threats will be addressed.

Fortunately, it’s relatively easy to justify additional effort in this area. Not conforming to regulations means a high risk of financial loss, as well as security vulnerabilities that may result in data leaks and lost customers.

Disinformation

This applies to more than just social networks or deepfakes created by generative AI.

Even bugs in business applications can leave people frustrated and disoriented, and cause them to make the wrong decisions. For instance, something as simple as an incorrect balance in a virtual banking application can cause overspending, with financial consequences for the client.
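
To make this concrete: one classic source of such “wrong amount” bugs is binary floating-point arithmetic. Below is a minimal TypeScript sketch (our illustration, not something prescribed by the Ethical Explorer) of the common mitigation of storing money as integer minor units:

```typescript
// Floating-point money drifts: 0.1 + 0.2 === 0.30000000000000004, so a
// balance assembled from many small amounts can quietly mislead the customer.
console.log(0.1 + 0.2 === 0.3); // false

// A common mitigation: keep amounts as integer minor units (cents).
type Cents = number;

function addAmounts(a: Cents, b: Cents): Cents {
  return a + b; // exact for integers well below Number.MAX_SAFE_INTEGER
}

function formatCents(amount: Cents, currency = "USD"): string {
  return new Intl.NumberFormat("en-US", { style: "currency", currency })
    .format(amount / 100);
}

const balance = addAmounts(1099, 250); // $10.99 + $2.50
console.log(formatCents(balance));     // "$13.49"
```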

Exclusion

Technology solutions should be readily available to the public, which means to as many people as possible. Digital products should not be designed just for a group of people who resemble the team that created the solution.

This covers many things, including accessibility features; we are proud to deliver them on our Avenga website. Tiny fonts, designs fixed to particular screen sizes, applications that require the fastest computers, and cutting off users with older OS versions are still all too common. With a few smart practices, it’s easy to widen the target group of users by tens of percent or even more.

The goal of a digital product should not be a perfect design, but being accessible to and understandable by as many users as possible. One such practice is sketched below.
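
For example, WCAG 2.x defines a minimum contrast ratio between text and background (4.5:1 for normal text at the AA level), and the check is simple enough to automate. A hedged TypeScript sketch of that check, following the WCAG definition of relative luminance:

```typescript
// Relative luminance of an sRGB color, per WCAG 2.x.
function luminance(r: number, g: number, b: number): number {
  const [R, G, B] = [r, g, b].map((v) => {
    const s = v / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = luminance(...fg);
  const l2 = luminance(...bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// WCAG AA requires at least 4.5:1 for normal body text.
const ratio = contrastRatio([119, 119, 119], [255, 255, 255]); // grey on white
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA");
```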

Algorithmic bias

Bias is a big problem, especially in AI applications. It can affect financial decisions, health-related decisions, and more. Taking existing datasets as the source for machine learning may result in heavily biased models, which further reinforce the discriminatory sides of modern digital solutions.

Dealing with bias should start at the earliest stages of data exploration and model design: the earlier it is caught, the less biased the resulting model can be.
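
As an illustration of what an early check can look like (our sketch, not a tool from the Ethical Explorer), here is a simple disparate-impact test in TypeScript based on the “four-fifths” heuristic: compare positive-outcome rates across groups and flag any group whose rate falls below 80% of the best-treated group’s:

```typescript
// A record of a decision (e.g., loan approved) with a protected attribute.
interface Decision {
  group: string;     // e.g., a demographic group
  positive: boolean; // did this person receive the favorable outcome?
}

// Positive-outcome rate per group.
function positiveRates(data: Decision[]): Map<string, number> {
  const totals = new Map<string, { pos: number; all: number }>();
  for (const d of data) {
    const t = totals.get(d.group) ?? { pos: 0, all: 0 };
    t.all += 1;
    if (d.positive) t.pos += 1;
    totals.set(d.group, t);
  }
  const rates = new Map<string, number>();
  for (const [g, t] of totals) rates.set(g, t.pos / t.all);
  return rates;
}

// "Four-fifths" heuristic: flag groups whose rate is < 80% of the highest.
function disparateImpact(data: Decision[]): string[] {
  const rates = positiveRates(data);
  const max = Math.max(...rates.values());
  return [...rates.entries()]
    .filter(([, r]) => r < 0.8 * max)
    .map(([g]) => g);
}

// Hypothetical data: group B is approved far less often than group A.
const flagged = disparateImpact([
  { group: "A", positive: true }, { group: "A", positive: true },
  { group: "A", positive: true }, { group: "A", positive: false },
  { group: "B", positive: true }, { group: "B", positive: false },
  { group: "B", positive: false }, { group: "B", positive: false },
]);
console.log(flagged); // ["B"]: investigate before training on this data
```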

In case more motivation is needed, there are new AI regulations coming (the EU first, but the rest of the world may follow its lead).

Addiction

Application creators want to build user engagement through gamification and constant attention-grabbing methods, such as notifications. This may create an addiction, especially in younger users, which can lead to problems with attention span and the ability to focus at school and/or work. Screen time keeps increasing, and frequent notifications disrupt workflow and concentration.

CX/UX designers and product owners should weigh the benefits of engaging users against the risk of addiction.

Both Apple iOS and Android now provide options for managing notifications in order to curb this kind of behavior. That is further proof of how serious the problem has become and that it needs to be addressed at the system level.
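
Product teams do not have to wait for the operating system, though. A hypothetical sketch of an application-side notification budget with quiet hours (the policy names and limits are illustrative assumptions, not from the Ethical Explorer):

```typescript
// Illustrative per-user notification policy: a daily cap plus quiet hours.
interface NotificationPolicy {
  maxPerDay: number;      // e.g., at most 5 pushes per day
  quietStartHour: number; // e.g., 21 (9 PM)
  quietEndHour: number;   // e.g., 8 (8 AM)
}

function canNotify(
  sentToday: number,
  now: Date,
  policy: NotificationPolicy
): boolean {
  if (sentToday >= policy.maxPerDay) return false;
  const h = now.getHours();
  const quiet =
    policy.quietStartHour > policy.quietEndHour
      ? h >= policy.quietStartHour || h < policy.quietEndHour // spans midnight
      : h >= policy.quietStartHour && h < policy.quietEndHour;
  return !quiet;
}

const policy: NotificationPolicy = {
  maxPerDay: 5,
  quietStartHour: 21,
  quietEndHour: 8,
};
console.log(canNotify(3, new Date("2024-05-01T15:00:00"), policy)); // true
console.log(canNotify(3, new Date("2024-05-01T23:00:00"), policy)); // false: quiet hours
console.log(canNotify(5, new Date("2024-05-01T15:00:00"), policy)); // false: daily cap
```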

Data control

All digital products gather a lot of information about their users. Unfortunately, users are often not aware of what information is stored about them and what happens to this data. The data can be sold or, even worse, leaked.

Users are entitled to know and control what information is stored about them.

This sounds very much like the EU’s GDPR, but let’s not forget that such rules are not the norm for the rest of the world, especially not for the USA with its global internet giants.

And the GDPR is only one step towards ensuring the privacy rights of citizens.
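
In practice, “know and control” translates into concrete features such as a machine-readable export of everything stored about a user, in the spirit of the GDPR rights of access and data portability. A minimal sketch, assuming a hypothetical user store:

```typescript
// Hypothetical shape of what a product stores about one user.
interface StoredUserData {
  profile: { id: string; email: string; createdAt: string };
  preferences: Record<string, string>;
  activityLog: { timestamp: string; action: string }[];
}

// A GDPR-style "right of access" export: everything held about the user,
// in a machine-readable format they can download or take elsewhere.
function exportUserData(data: StoredUserData): string {
  return JSON.stringify(
    { exportedAt: new Date().toISOString(), data },
    null,
    2
  );
}

const snapshot: StoredUserData = {
  profile: { id: "u-42", email: "jane@example.com", createdAt: "2023-01-15" },
  preferences: { newsletter: "weekly" },
  activityLog: [{ timestamp: "2024-05-01T10:00:00Z", action: "login" }],
};
console.log(exportUserData(snapshot));

// The complementary control, deletion ("right to be forgotten"), would be
// a second operation removing the same records, subject to legal holds.
```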

Bad actors

This applies less to enterprise IT, but social media really should pay more attention to it.

Social media is now one of the easiest ways to spread disinformation, radicalize politics, and so on.

This especially applies to platforms with strong encryption and techniques for avoiding law enforcement. Despite good intentions, they have become rife with illegal activity that harms individuals and entire societies.

Outsized power

Large monopolies or semi-monopolies often close their APIs to lock their clients into so-called walled gardens.

A prime example is, of course, Apple with its ecosystem, and especially iMessage, which is available neither on the web nor on Android. It’s an intentional tactic to keep the user within the Apple bubble.

Another example: Facebook requires all Oculus Quest (VR headset) users to log in with an active Facebook account.

The same principle applies to the API economies of today and the future. Limiting access to functionality by baking it into a default web or mobile UI only is still not a thing of the past. In today’s API culture, sharing features benefits both API providers and consumers, as the sketch below shows.
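
As a minimal illustration of the alternative, here is a sketch using only Node’s built-in http module: the same business capability is exposed as a versioned public endpoint instead of being reachable only through the product’s own UI (the URL, payload, and conversion function are illustrative):

```typescript
import { createServer } from "node:http";

// The business capability itself, independent of any particular UI.
function convertCurrency(amount: number, rate: number): number {
  return Math.round(amount * rate * 100) / 100;
}

// Exposing it as a versioned public endpoint means web, mobile, and
// third-party consumers all get the same feature, not just "our" UI.
const server = createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (req.method === "GET" && url.pathname === "/api/v1/convert") {
    const amount = Number(url.searchParams.get("amount"));
    const rate = Number(url.searchParams.get("rate"));
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ result: convertCurrency(amount, rate) }));
    return;
  }
  res.statusCode = 404;
  res.end();
});

server.listen(8080); // e.g., GET /api/v1/convert?amount=100&rate=0.92
```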

How do we do it?

This is no time to slow down

“OK, the ideas are fine, but we have so much to do and we’re under pressure to deliver faster and better, so… maybe we can do it later?” you might say.

In our time of rapid transformation, later often means never, or too late. The key advice here is to start as early as possible, addressing the most pressing issues before they become hard to retrofit into an already existing system.

This is similar to dealing with technical debt: avoiding it or fixing it in the early stages is orders of magnitude cheaper and less risky than doing so later.

But philosophy is not part of our project

Ethical requirements can and should be translated into practical steps and actions; when it comes to the architecture and implementation of IT systems, there is nothing really philosophical about them.

New structures

There’s an old but true mantra about management: when a manager does not have a clue what to do to achieve their goals, they can create or modify the organizational structure instead. It takes time and creates a lot of noise, but it might buy a lot of time.

The same trick could be applied to ethics, but that’s not how it is supposed to be realized.

Instead, ethics should be treated as part of product quality: responsibility for its different aspects should be assigned to already existing roles across the entire lifecycle of the product.

Organizational buy-in

Selling ‘ethics’ is a new trend. Starting with small steps and involving more and more people, step by step, is the recommended approach.

No one can imagine an entire organization stopping to plan and execute an ‘ethics’ project. It’s a delicate process requiring lots of empathy, but also persistence.

Why not start building IT solutions with ethics?

Ethics in building IT solutions is more important than ever, because we, our businesses, and our private lives are becoming more and more digital.

I really enjoyed reading the Ethical Explorer guidelines, and I encourage anyone working in IT, and even those just using IT technologies, to read and reflect on them. Even small steps can make a big difference.

We should not wait for another set of regulations to enforce these basic principles. Building the trust of customers and doing the right thing will make everyone happier and give us all a more satisfying result.
