Saturday, September 5, 2020

We need decentralized social networks and AI tools to give users more control over them

Ben Goertzel thinks large, centrally controlled social networks such as Facebook are a bad idea. I agree. He thinks that "explainable AI" is one of the technologies we need to work our way out of that trap. I agree with that as well. Here's a blog post he's written setting forth some ideas: The Critical Importance of Decentralized, Explainable AI for Better Social Networks, SingularityNET, July 28, 2020.
What we need to create a better social network, in my view, is relatively straightforward to state:

1. Code should be open source.

2. Ownership and control should be decentralized… so there is no company and no government tasked and burdened with calling all the shots. Embedding blockchain technologies deep in the technical design of the network is the right way to achieve this.

3. Major decisions should be made democratically. If some group systematically doesn’t like the outcomes votes are getting, they can fork the code and make a new version, or copy the code and create a new network using the same software.

4. Recommendations of connections and content should be made using AI that has the ability to explain the reasons behind its judgments — and these explanations need to be shown routinely to users in a way they can understand.
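Point 4 is the most concrete of the four, and it can be illustrated with a toy sketch. The code below is purely hypothetical, not anything from SingularityNET: it scores candidate posts by how their tags overlap with posts the user has already engaged with, and attaches a plain-language reason to each recommendation, which is the kind of routine, understandable explanation Goertzel calls for.

```python
from collections import Counter

def recommend_with_reasons(user_history, candidates, top_n=2):
    """Score candidate posts by tag overlap with the user's history and
    return each recommendation with a plain-language explanation."""
    # Weight each tag by how often it appears in posts the user engaged with.
    interests = Counter(tag for post in user_history for tag in post["tags"])
    results = []
    for post in candidates:
        matched = {t: interests[t] for t in post["tags"] if t in interests}
        score = sum(matched.values())
        if score > 0:
            reason = ("recommended because you engaged with " +
                      ", ".join(f"{n} post(s) tagged '{t}'"
                                for t, n in matched.items()))
            results.append({"title": post["title"],
                            "score": score,
                            "reason": reason})
    return sorted(results, key=lambda r: r["score"], reverse=True)[:top_n]

history = [{"tags": ["ai", "ethics"]}, {"tags": ["ai", "blockchain"]}]
candidates = [
    {"title": "Explainable AI primer", "tags": ["ai", "xai"]},
    {"title": "Cat videos", "tags": ["cats"]},
]
for rec in recommend_with_reasons(history, candidates):
    print(rec["title"], "->", rec["reason"])
```

Real recommendation engines are far more elaborate, but the principle scales: whatever features drive the score, the system can surface the strongest contributors as the reason shown to the user.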

Due to SingularityNET and a variety of other recent developments, we already have the basic technologies needed to support this sort of approach — as far as tech is concerned, we could have decentralized, democratic, transparent and ethical alternatives to Facebook, Twitter, LinkedIn, TikTok and so forth right now. We even have some existing websites — such as minds.com — embodying significant parts of the above.

What we still don’t have — yet! — is the alignment of resources to get alternatives fulfilling all four of the above points fully built-out … and then pushed toward wide adoption.
Goertzel's attention seems centered on the content carried by social networks, which is understandable and which is obviously critical. He doesn't say anything about the issue that's got me up in arms about Facebook, the ability of the network's owner to unilaterally change the interface and thereby mess up your mind. But, given some of the more theoretical material in the post – about Hebbian learning in neural nets – I'm pretty sure he'd recognize the issue.

And then we have things like this:
One problem with the lack of transparency in the operation of modern social networks is that it covers up the way the companies owning the networks are exploiting the data obtained from you, the user, to sell things to you and your friends.

But another, less often discussed problem is: Opaque systems do not provide people with the explicit visibility into their own perceptions, actions and judgments that they need to have — if they want to overcome their more egocentric and shortsighted aspects and place more weight on their longer-term benefit and on broader benefit to the world.

Transparent systems help people who are questing to understand themselves, what they like and dislike, what they are reacting to, what they are attracted to versus repelled from. They are valuable tools for people to use to improve their consciousness and experience and become wiser and happier and more useful to the world.

Opaque systems provide much less aid to self-understanding and self-awareness and push the user to instead behave more mechanistically, following their inclinations without much reflection.
There he's snuggling right up to the user-interface issue. Goertzel goes on:
We also need tools that a user can deploy to understand the judgments they are making themselves. What kind of people am I choosing to follow on Twitter? What kinds of images, when shown to me, am I most likely to look at longest? Which other people tend to read the same documents as I do regarding medicine? Or regarding physics? Or regarding Chinese politics?

There is incredible positive potential to be realized via applying AI and information visualization to allow people to study their own behaviours online. This could be a powerful tool for helping people to achieve greater self-awareness and move toward more positive consciousness — as manifested in their online interactions and more generally. However, this is clearly not how the social network tech industry is oriented.
Frankly, that has interface written all over it. The user needs to have control over the interface, and they need AI tools to help them.
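The self-analysis tools Goertzel imagines can be sketched in miniature. The example below is a hypothetical illustration, not any existing tool: given a list of accounts a user follows, each labeled with a category, it reports the category breakdown as percentages, answering "what kind of people am I choosing to follow?" at a glance.

```python
from collections import Counter

def summarize_follows(follows):
    """Summarize what kinds of accounts a user follows, as percentages,
    so the user can see their own selection patterns at a glance."""
    counts = Counter(account["category"] for account in follows)
    total = sum(counts.values())
    return {cat: round(100 * n / total, 1) for cat, n in counts.most_common()}

# Hypothetical sample data; a real tool would pull this from the network's API.
follows = [
    {"handle": "@physics_daily", "category": "science"},
    {"handle": "@med_updates", "category": "medicine"},
    {"handle": "@quantum_notes", "category": "science"},
    {"handle": "@policy_watch", "category": "politics"},
]
print(summarize_follows(follows))
# e.g. {'science': 50.0, 'medicine': 25.0, 'politics': 25.0}
```

The hard part in practice isn't the arithmetic, it's access: a user can only run this kind of analysis if the network exposes their own data to them, which is exactly what opaque, centrally controlled platforms decline to do.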
