
RE: Steemit Roadmap 2018: Community Input Requested

in #roadmap2018 · 7 years ago

I'm going to go out of the box, way out of the box, and suggest an orthogonal technology:

We really need some sort of "web of trust" system which will give individual users more control over what they view and what they interact with on Steemit.

As an example:

I find someone who is a very good curator. I follow them. My stream fills with their content and the content they think is good, and everything is aces.

I find someone who consistently produces absolute garbage. Maybe they're a bot. Maybe they're completely insane. Maybe our tastes are simply deeply divergent. I want to see less of them, but I don't have any way to tell the system that, as far as I'm concerned, they should appear less often in any stream I'm exposed to.

In a web of trust system, I would invest some "trust" in the first person, which would automatically distribute some amount of trust to the people that they trust, on the assumption that I consider their judgment good. I would invest some "distrust" (negative trust) in the second person, and from my perspective that person and all of the people they trust would become less prevalent in my stream; they would evaluate as "less valuable."

And for the people whom both of my targets have in common? Their scores would be adjusted by the relative strength of the trust and distrust propagating through the network, outward from me.
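To make that concrete, here's a minimal sketch in Python of what I have in mind, assuming an in-memory dict-of-dicts trust graph; the attenuation factor and hop limit are illustrative knobs, not a prescribed design:

```python
from collections import defaultdict

def propagate_trust(trust, viewer, max_depth=3, attenuation=0.5):
    """Compute per-account trust scores from one viewer's perspective.

    `trust` is a sparse dict-of-dicts: trust[a][b] is the signed trust
    (-1.0 .. 1.0) that account a has explicitly assigned to account b.
    Absent edges mean "no opinion", not zero trust.
    """
    scores = defaultdict(float)
    frontier = [(viewer, 1.0)]          # (account, weight carried so far)
    for _ in range(max_depth):
        next_frontier = []
        for source, weight in frontier:
            for target, edge in trust.get(source, {}).items():
                contribution = weight * edge
                if target == viewer or abs(contribution) < 1e-3:
                    continue
                scores[target] += contribution
                # Opinions further from the viewer carry less weight,
                # and distrust flows outward exactly like trust does.
                next_frontier.append((target, contribution * attenuation))
        frontier = next_frontier
    return dict(scores)
```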

This would result in an orthogonal sorting mechanism, informed by chronology (because many people may have very similar trust levels from my perspective), but with content that I am most likely to want to see at the top and content that I am very unlikely to want to see at the bottom.
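Ordering a stream then falls straight out of those scores. A sketch, assuming each post is an (author, timestamp, body) tuple and `scores` comes from the propagation sketch above:

```python
def order_stream(posts, scores):
    """Sort posts by the viewer's trust in each author, newest first
    within equal trust, so chronology still breaks the many ties."""
    return sorted(posts,
                  key=lambda post: (scores.get(post[0], 0.0), post[1]),
                  reverse=True)
```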

If I find a bot that keeps getting into my stream? Distrust it. It and all of the content that it touches become less valuable. I see less of it, naturally. People who trust me, because of the quality of my posts, or comments, or just good curation, likewise see less of it, because I distrusted it.
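Concretely, with the sketch above (all account names hypothetical):

```python
trust = {
    "me":           {"good_curator": 1.0},
    "good_curator": {"writer_a": 0.9, "writer_b": 0.7},
    "spambot":      {"spam_friend": 1.0},
}
scores = propagate_trust(trust, "me")
# writer_a and writer_b score positive via good_curator; spambot is unknown.

trust["me"]["spambot"] = -1.0   # one explicit act of distrust
scores = propagate_trust(trust, "me")
# Now spambot scores -1.0 and spam_friend -0.5: everything it touches sinks.
```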

A good web of trust system would go hand-in-hand with the 2017 roadmap's increased focus on subset community building, because I don't necessarily trust everyone equally on every subject. That, however, is something that can be put in place after the basic handling of trust comes into being.

Maybe this is best handled as some other form of token currency which doesn't touch Steem; I leave that to others who have more experience with the blockchain than myself. (I'm an old-school symbolic AI and computation guy, myself.)

This would solve a lot of the problems we are seeing with bots, spam, and pollution of the streams in general. It's certainly something worth taking a look at.


A web of trust is basically a recommendation system. It could be done on the frontend side.

I'm not sure it's accurate to say that it's a recommendation system. It can function as a recommendation system in part, but it's more accurate to say that it's exactly what it says on the tin: "a system which manages quanta of trust."

Trust can mean a lot of things. Trust can mean that a user is considered "a good part of the community." Trust can mean that a source provides content which is in line with expectation. Trust can mean exactly what it does in common parlance, that you extend a level of predictive expectation to another entity.

It's certainly possible to get into various types of trust which all function simultaneously, but at this point Steemit really needs some sort of lensing system, some way to slice through the piles of content which get dumped on it every day in ways that are meaningful to individual users.

I'm not one of those people who sees a blockchain and thinks that because I have a hammer, every problem is a nail. I'm actually pretty sure that any sort of web of trust system would have to be built on another type of data store, and that could certainly be built into an entirely different platform front end.

But that wouldn't improve Steemit. And that would be a shame.

Well, I see.
As an analyst, I once built a content-rating system for a classified-ads site. It was a hell of a task; I spent nine months on it. And that was with a simple target function (a score for selling an item and bringing revenue to the company) and tons of information at hand.
I'm not sure we need to put that many resources into this.

But someone could experiment with a Steemit sidechain, attaching a content score to each piece of content and each author.

> As an analyst, I once built a content-rating system for a classified-ads site. It was a hell of a task; I spent nine months on it. And that was with a simple target function (a score for selling an item and bringing revenue to the company) and tons of information at hand.

That's a much more complicated problem than we even need to think about in this context.

Essentially, a web of trust is a sparse relationship matrix. You don't adjust trust toward all entities, just those you want to. The results provide an ordering of the content but don't change the content. If there's any innovation required in the process, and given that we were doing this ten years ago with far fewer resources at hand, it's in resolving the matrix for an individual view.
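To make the matrix framing concrete, here's a sketch of resolving an individual view, assuming SciPy's sparse matrices; the account indices, weights, and attenuation are all made up for illustration:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Toy trust matrix for five accounts: T[a, b] is the signed trust that
# account a explicitly assigned to account b. Everything else is absent.
rows = [0, 0, 1, 3]            # who assigned the trust
cols = [1, 3, 2, 4]            # who received it
vals = [1.0, -1.0, 0.8, 0.6]   # signed trust weights
T = csr_matrix((vals, (rows, cols)), shape=(5, 5))

# Resolve the matrix for one viewer (account 0): start from a unit vector
# at the viewer and take a few attenuated hops outward through the graph.
attenuation = 0.5
hop = np.zeros(5)
hop[0] = 1.0
view = np.zeros(5)
for _ in range(3):
    hop = T.T @ hop            # the row-vector product hop @ T: one hop out
    view += hop
    hop = hop * attenuation

print(view)  # per-account scores from account 0's perspective only
```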

The big hook is that we need to stop trying to think of things from a global perspective, because all that does is encourage trying to "game the system" from the top down. That's effectively why we see the undesirable whale and bot behavior that we do: it's incentivized because there is a global view.

My feeling is that a blockchain of any sort is the wrong technology to bring to bear on this. If I'm honest, I have to admit that I'm not really sold on a cryptocoin being a really effective way to reward creators and curators, but since that's the underlying premise that we accept when we deliberately engage with Steemit, there you go.

I will say this: somebody needs to put the resources into doing this well, and doing it soon. If not Steemit, then Busy or someone else who really wants to be a successful social media platform first so that they can reward creators and curators second. If there is no successful social media platform underneath, there are no rewards to hand out. If no one uses the system, nobody gets paid.

This is at least one reasonable approach.

This is high on my list, actually. Communities are the “gateway drug” infrastructure to the grand plan of letting anyone moderate the whole site, and letting anyone subscribe to anyone else’s moderation feeds.

I feel like Communities are actually even bigger than "gateway drug" infrastructure; they are the one thing that lets discovery happen organically and naturally, as long as they're coupled with a decent search engine. They channelize discussion, keeping like things associated, making it easier to find the things one is interested in, and ultimately keeping users engaged with the community much longer than any of the alternatives.

It's one of the things that Google+ has done better than anyone else and one of the reasons that Facebook Pages are one of the only things that Facebook has got right.

"Moderation" might not be the right mindset to bring to driving social network engagement. Moderation implies a deliberate squelching of other people's content – legitimately so. But we know that people feel better about engaging with a system that promotes things that they like more than they enjoy engaging with a system that demotes things that they're not interested in at that moment.

In that sense, one of the strongest words and ideas associated with Steemit right now is "curation." Curation implies that you are helping other people find good stuff. People want to help other people. Let people help each other by working alongside the system to classify things for better discovery and you'll get better, longer term user engagement.

(Anyone that hasn't poked around in Google+ long enough to get past the media narrative that it is a "failed social network" needs to invest that time. Seriously, the Community interface and management system is top-notch. It's the best part of the entire system. Users can create their own Communities, manage and moderate those Communities, and take personal responsibility for the results. Google gets the vast benefit of channelized content that is inherently self-similar. Total win.)