An app that filters Twitter noise? It’s about time:
“‘Our human filters have turned into a great way to discover content that interests us, but the amount of tweets coming in through Twitter today is astounding, and it continues to grow. While new features like grouping and lists have been designed to tame the clutter, at the end of the day, there will be more streams and content to weed through,’ said my6sense founder and chairman Barak Hachamov.”
If ever there was a statement that epitomised the dumb approach to social media, this is it. Filters, by definition, filter things out. If you’re having to filter what your friends pass on, they’re not being filters – or you’re following way too many people you don’t know.
In almost every case of Twitter overload I’ve seen, the latter is true. I’ll say it again: follow 4,000 people you don’t know, and you’re following sources, not filters. Follow 200 people you do know, and they might just be genuine filters.
Another video from The Sun – these ads are really quite good.
Twitter with a Brain — Shotton.com:
“The example he [Dave Winer] used was wanting to un-follow a particular user for a day, then re-follow them automatically. It’s certainly possible to build this sort of functionality into an existing Twitter client, but I’d like to suggest something easier than herding all of the Twitter client authors in this direction. Specifically, rather than having the scripting support built into a Twitter client, why not just ask Twitter client authors to allow their clients to be pointed at alternate hosts that implement the Twitter APIs besides Twitter’s own servers?”
Umm… because that would be a massive, gaping security hole?
Seriously, just because Dave thinks some kind of federated Twitter is the solution to his problem* doesn’t mean that opening up Twitter clients to evil hackery is a good idea. There already is an open source Twitter clone, in the shape of the excellent Identi.ca.
(* Dave’s issues with Twitter only appeared to start when they introduced the suggested user list and he wasn’t on it, leading to him not getting millions of followers who would have been baffled by his tweets.)
Lookie Lou isn’t really a customer | yelvington.com:
“Once upon a time, I blocked Google from being able to index or even access Associated Press stories from our local newspapers’ websites. It was not a stupid thing to do, not at all.
Here’s why. At that time, we were not participating in any national ad networks. Every pageview delivered to anyone outside a newspaper’s geographic market was a net loss in two ways: One, it consumed some server resources – not a huge deal, but servers do have costs. Two, when the ad server delivered a local ad to an out-of-market user, it reduced the effectiveness of that advertising campaign in measurable clickthrough per thousand pageviews.”
Steve gets it. Undifferentiated traffic is, as I’ve argued at length before, the least valuable kind. The “dash for pageviews” has been an utter disaster for newspapers and magazines moving online.
Well, I thought it was funny