Who decides what 2 MB of content I get?

Hi, I like the idea and just got a Lantern.

Who’s going to decide what 2 MB of content I get every day?

I’m wondering if you could use the FOSS decision-making software Loomio (https://www.loomio.org/) so that it becomes a collective decision.

Your current voting system seems to allow multiple votes, so I’m not sure it’s going to select the best content for me.

I think they’re still working out who decides the content, and the current voting system is a trial, since some people will be able to hack whatever protections are put in place against unlimited upvoting or downvoting.

This is an interesting and difficult problem.

I think a possible solution lies in having a diverse range of sources: a bit of news about each country, from that country, in the (main) local language.

So one process could be: google for ‘country name & news’ in the local language, find the leading news source, and include the first five items from its RSS feed.

So for Danish, for example, I google “danske nyheder” (“Danish news”), which leads me here: Seneste nyt | DR (“Latest news”, from the Danish public broadcaster DR).

This would ensure that there is at least some news of interest to each country, in the local language, albeit from the mainstream narrative. It would also begin to give you different perspectives on global stories.
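The per-country RSS process above could be sketched in a few lines of Python. This is only an illustration (the `first_items` helper and the sample feed are my own invention, not anything Outernet runs); in practice the XML would be fetched from each country’s leading news site.

```python
import xml.etree.ElementTree as ET

def first_items(rss_xml, n=5):
    """Return title and link of the first n <item> entries in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
        })
        if len(items) == n:
            break
    return items

# Tiny inline feed standing in for e.g. DR's "Seneste nyt" feed.
SAMPLE = """<rss version="2.0"><channel>
  <item><title>Story A</title><link>http://example.dk/a</link></item>
  <item><title>Story B</title><link>http://example.dk/b</link></item>
</channel></rss>"""

print(first_items(SAMPLE))
```

Running one such fetch per country per day and concatenating the results would give the baseline described above.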

I think this would be an interesting baseline that could just be included by default.

If it’s all curated by those who already have access to technology, you’re never going to get many votes for news about Uganda in Swahili, for example.

I’m not sure how big it would be each day, but probably more than 2 MB… Bandwidth aside, you could then start to add some ‘curated’ news on top to get alternative perspectives.

Maybe this project could be useful: http://globalvoicesonline.org/
What is Global Voices?
“We are a borderless, largely volunteer community of more than 800 writers, analysts, online media experts and translators. Global Voices has been leading the conversation on citizen media reporting since 2005. We curate, verify and translate trending news and stories you might be missing on the Internet, from blogs, independent press and social media in 167 countries.”

It would be good to get an idea of how much space is going to be given to core content / news / adverts.

500 KB for core, 500 KB for news, 1 MB of adverts? What are the suggested proportions?

Currently, due to the manual nature of broadcasting, we choose and upload each piece of content ourselves. This is very slow (as you can see from the amount of new material that appears) and not very democratic. We have two possible routes. One is to automate decision-making based on the voting platform we have now. The other, which I personally think is better (although as a company we don’t yet have a solid unified opinion on it), is to turn Outernet into a completely self-service publishing platform.

The publishing platform would have a free tier and a paid tier. On the free tier, anyone can publish anything, but everything goes to public voting before it gets published (or not, depending on how well it does during voting). The paid tier doesn’t have any kind of voting, but you obviously have to pay to have your content published.
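The two-tier gate described above boils down to a one-function sketch. The vote threshold of 10 below is an arbitrary number picked purely for illustration; Outernet hasn’t specified one.

```python
def is_published(tier, votes_for, votes_against, threshold=10):
    """Hypothetical publishing gate: paid submissions skip voting entirely;
    free submissions need a net vote margin at or above a threshold."""
    if tier == "paid":
        return True
    return (votes_for - votes_against) >= threshold

print(is_published("paid", 0, 0))    # True: paid tier skips voting
print(is_published("free", 12, 1))   # True: net margin 11 clears the threshold
print(is_published("free", 5, 0))    # False: net margin 5 falls short
```

A real gate would also need the manual legal/technical review step mentioned later in the thread, which no vote count can substitute for.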

Also, I forgot to mention this: the current voting system is a bit silly. We have to manually review whether it’s possible to publish something at all (for legal reasons, or because of the technology involved, as in the case of Google Translate). So even though something might get tons of votes, it may not be publishable.

Hi Branko

Thanks for this, it clarifies the current situation.

Have you thought about the proportion of news vs. paid content? Could we end up with a 100% paid stream of data? I’d hope not, but there doesn’t seem to be any kind of policy in place to prevent that.

I’m a bit cautious about ‘everything’ going to public voting. Why should English-speaking Westerners be the only people to have a vote? (I know you mention SMS, but that’s going to be fringe, IMHO.)

I’m not going to vote for Swahili content about Uganda, but if you are serious about this project, then finding a way to include it seems important, particularly for those people who only speak Swahili…

What did you think of the idea of automatically including a small volume of local language news from the most popular news site in that country?

I stuck a few feeds in here to try it: Your Personal Dashboard | NETVIBES

Each of those feeds is about 1,000 bytes; say you doubled the length of the text to 2,000 bytes to make it more useful.

To give 190 countries five items a day in the local language would cost 0.38 MB a day: is that something you could spare? I’d be up for helping to collect the RSS feeds, work out a file format that works, etc.
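The arithmetic behind that 0.38 MB figure, for anyone checking:

```python
countries = 190
bytes_per_feed = 2000   # five items per feed, ~400 bytes each after doubling

daily_bytes = countries * bytes_per_feed
daily_mb = daily_bytes / 1_000_000
print(f"{daily_mb:.2f} MB/day")   # prints "0.38 MB/day"
```

So the whole baseline would use under a fifth of a 2 MB daily budget.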

I include the language map below because I found it interesting. Will you be aiming to reflect this in the content published?


We’ve considered many different scenarios, but before we can start adding a steady stream of content, we need to automate as much as possible. We don’t have enough manpower to handle everything right now. Ideally, we’d put out as much content as we can, in as many languages as we can, with very few restrictions, and we’d even give up the right to determine those restrictions. It’s just a matter of time.

Hi, thanks for replying.

My suggestion to use RSS would automate a lot. I’m curious what you think of the specific suggestion to use an RSS feed for each country, to provide some basic info in an appropriate language. Good or bad?

I’d volunteer to do some of the research on that if I thought Outernet were interested.

I think this would be a departure from your current proposal which is for all content to be voted on by an ad hoc group.

@branko might I suggest a new route to voting? It fits right in with your mission. Develop a WordPress plugin and theme for a timebank. Let organizations and local governments log in. Money flows to you for satellite costs from corporate sponsors via ad placements. People vote with their timebank account.

A remote village could send out a request for a free course on xxx. Dr. XXX creates the content and logs their timebank hours. Dr. XXX now has currency to vote for other content.


That’s interesting. I’ll see how we could incorporate that.


It’s generally a good approach, and we already do that with some of our partners. The only problem is the repackaging of content prior to broadcast and the legal issues surrounding it. That’s why we only do this with content partners and sponsored content. If we can get content authors to submit their work directly, this problem goes away, RSS or not.

I think you are possibly being over cautious.

It would actually be best to use Google News results anyway, as you’d get a wider range of sources. You are allowed to use the feeds from Google News:

“Incorporate Google RSS feeds onto your site: You can make non-commercial use of Google’s RSS feeds on your website, subject to Google’s Terms of Service.” (Google News Help)

As Outernet is not charging users for access, it’s pretty clear that this use would be non-commercial. I understand that you are a for-profit company, but disseminating this news will not generate revenue for you. Your revenue model is completely separate from the non-commercial dissemination of news for the public good…

IANAL so I leave it to my betters. :wink:

Hi Branko. Thinking about this 2 MB content limit a bit more, is my understanding below correct?

  1. At the moment you broadcast about 100 MB/day, over DVB-S? This requires a dish. Ref: https://images.indiegogo.com/file_attachments/795983/files/20140818121118-Screen_Shot_2014-08-18_at_1.59.43_PM.png.jpg?1408389078

  2. Once the Lanterns ship, there is a plan to turn on a new UHF broadcast on an as-yet-unspecified frequency? This will broadcast 2 MB a day (10 MB if funding goes well).

So stream 1) carries things like Linux distros and other big files. Presumably that content won’t be broadcast on the new UHF frequency in stream 2)? A tiny fraction of a Linux distro is no use to me on a Lantern.

So will there, in effect, be two queues? How will you ensure that time-sensitive info (“there is a tsunami coming in one hour”) is not displaced by some random Linux distro? In short: how is the queue going to be prioritised?

@sam_uk most news items on Google News come from third parties. While Google lets you syndicate them (they are provided as RSS, after all), some media companies require third parties to pay if they want to syndicate them (even when they offer the news as RSS). The new copyright code in Spain, created after years of legal battles between Google and newspaper companies, is a good example of that. And I think there are other countries in Europe where laws prohibit commercial use of syndicated sources. I don’t like this, but it is the law, and it could cause trouble for the satellite operators if they are based in Europe.

I forgot to say: Wikipedia’s news projects could be a good alternative.

I am very intrigued by this concept of a time bank. It’s basically like a currency.

You have the gist of it correct, but I have a couple of corrections to make.

  1. We currently broadcast about 200 MB per day. Yes, it does require a dish. Our dish-based stream/data carousel will increase to 10 GB per day, at minimum. This is where heavy content like Linux distros and courseware will be placed.

  2. I think we can hit the target of 10 MB per day. In that case, we would be using L-band rather than UHF for the mobile broadcast. (Please spread the word and help us get there! :smile:)

  3. Correct, the heavy content will not be placed into the mobile carousel. However, all content from lower-bitrate carousels will be available in the faster ones. For example, all HF content/information will be on mobile, and all mobile content will be available through DVB-S.

As time progresses, we will likely have more than even three queues. Our goal is to create a monsoon of free and useful information, which will happen over numerous frequencies and modulation schemes (which is why we are placing a software-defined radio in Lantern).

Anything that is time-sensitive, especially information related to emergencies, will immediately have priority across all channels. On some channels it will entirely consume the capacity of the channel; on others it will only take up a small sliver, though it will have a special place in Librarian.

In the US, all radio operators are required to participate in a system called the Emergency Broadcast System (now the Emergency Alert System). When something serious happens, those notifications are sent over all stations’ transmitters. In many respects, our model will be very similar.
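The “priority across all channels” behaviour can be pictured as a simple priority queue per channel. A toy sketch (the priority tiers and item names here are mine for illustration, not Outernet’s actual scheduler):

```python
import heapq

# Lower number = higher priority; counter is a tie-breaker so
# items of equal priority are sent in arrival (FIFO) order.
EMERGENCY, NEWS, BULK = 0, 1, 2

queue = []
counter = 0

def enqueue(priority, name):
    global counter
    heapq.heappush(queue, (priority, counter, name))
    counter += 1

enqueue(BULK, "linux-distro.iso chunk")
enqueue(NEWS, "daily headlines")
enqueue(EMERGENCY, "tsunami warning")

# Drain the queue: emergencies jump ahead of everything already waiting.
order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(order)
```

With per-channel weights on top (how much of the channel each tier may consume), the same structure covers both the “entirely consume the capacity” and the “small sliver” cases.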


Since Outernet is meant to provide information to places with no internet access, I fail to see how requesting content online, or voting on content online, makes any sense.
In the earlier example of timebanks and “remote villages sending out requests” for data, this would be impossible if they have no other connectivity. The client devices (the Lantern et al.) are receive-only. How would a village request content?

I think there are a large number of thoughts but no specific plan. That is a large hurdle, but there are a huge number of possibilities.

Not requesting, but suggesting. Requesting will come later, when we can actually collect feedback from users on the ground. For requesting there are several options: SMS, USSD, snail mail… We’ll see how it goes.
