Bandwidth issues when selecting content for Outernet

NOTE BY ADMINISTRATOR: This thread was created from off-topic discussion in another thread. The original thread is here.

Interesting idea. I’m not quite sure what it’s for yet. Is this the core archive?

Is it for the Lantern? Will it be broadcast at 2MB/day?

Is it a constantly expanding archive? Or capped at 4GB total?

If we know what the purpose of the archive is, it might be easier to comment on the proposed methodology.

It’s for a community library/archive. It’ll be separate from core, which is curated by Outernet, although if people agree this is a good idea, and it actually turns out to be one, I don’t see why it wouldn’t be applied to core as well.

Think of this as a sketch for the community-managed side of Outernet, all of it. It wouldn’t be hard to differentiate Pillar content from Lantern content.
It’s my understanding that the only reason Outernet would maintain both core content and a community library is if the community library did not already accomplish everything that the core content was attempting to.

I really disagree. I think sending useful data within 2MB/day is a very specific challenge. E.g., the weather file alone looks like 1.9MB.

Is the content in the wikidrive going to be broadcast continually on rotation? Just a few times? This affects how we think of the archive. Is it basically static? Or is it dynamic, looking for fresh content every day?

Let’s say we have a 10MB/day broadcast, with 3MB/day of that available for this community content in the Lantern broadcast, and users submit 500MB/day of new content. What then?
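To put rough numbers on that scenario (a sketch only; the 3MB/day and 500MB/day figures are just the hypotheticals above):

```python
# Backlog arithmetic for the hypothetical figures above.
broadcast_mb_per_day = 3      # community share of the daily Lantern broadcast
submitted_mb_per_day = 500    # hypothetical submission rate

acceptance_ratio = broadcast_mb_per_day / submitted_mb_per_day
backlog_after_year_gb = (submitted_mb_per_day - broadcast_mb_per_day) * 365 / 1024

print(f"Share of submissions that can ever air: {acceptance_ratio:.1%}")     # 0.6%
print(f"Unbroadcast backlog after one year: ~{backlog_after_year_gb:.0f}GB")  # ~177GB
```

At those rates, roughly 199 of every 200 submitted megabytes never get broadcast, which is the selection pressure being pointed at here.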

It seems like this is exactly an exercise in editing and distilling the best content. If the bandwidth were unlimited, we could just send the entire internet…

I think you’re thinking about it too narrowly. This is a place to foster discussion and collaboration. It would be easy to make a subforum or folder or whatever for discussion specific to Lantern devices.

I’d say this probably depends on how it’s used by the community. It will probably resemble git to some extent, since being able to make incremental changes would be huge. There would probably be a “heavy” or “core” archive area which wouldn’t be broadcast as regularly, and a smaller sampling of books and other media people want in there. Honestly, it’s an idea for a website for discussing what data might be useful to Outernet. It doesn’t need to be set in stone.

I think this falls into the Pillar/Lantern issue, because the 10MB/day limit is specifically for Lantern. People would be discouraged from generating 500MB/day of content that was exclusive to Lantern; if you’re generating that much data, you’ve either a) found the holy grail of miniature content and want to share it with everybody, or b) missed the point of the Lantern-specific broadcast.

That’s exactly the intention for this. As for sending the entire internet: that’s silly, and you know it’s not the point; it wouldn’t be functional with the technological model we have even with unlimited bandwidth.

That’s not the way to look at it. When we say 2MB/day, it means Lantern (the device) takes one day to fetch 2MB of content, while the archive itself may be 1GB in size. So, for all practical purposes, and for the sake of discussion in this particular thread, you can simply disregard the bandwidth limitation completely. If you feel strongly about bandwidth, I suggest you start a separate topic.

I do understand that. My point is that at 2MB/day it would take about five and a half years to broadcast a 4GB archive, even assuming it was the entire broadcast; once you allow for the advertising, daily news, core archive, etc., it would be more like 15-20 years.

If you said the maximum size this archive can be is 250MB, then it could be broadcast in a more realistic time frame, and people would better understand the need for making sure content is small and really good.
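The arithmetic behind those figures, for anyone who wants to check (this assumes the 2MB/day rate and the 4GB/250MB cap sizes discussed above, with no other traffic):

```python
# Days needed to broadcast an archive once, assuming it gets the
# entire daily budget (no news, ads, core archive, or retransmissions).
def days_to_broadcast(archive_mb, rate_mb_per_day=2.0):
    return archive_mb / rate_mb_per_day

print(days_to_broadcast(4 * 1024) / 365)  # 4GB cap: ~5.6 years
print(days_to_broadcast(250))             # 250MB cap: 125 days, about 4 months
```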

I really don’t see how this is relevant to the topic.

OK, maybe I should discuss it elsewhere. It seems to me like you are proposing a software solution to a problem. My understanding of the problem you are trying to solve is:

How to select the best content to broadcast over a limited bandwidth service?

It seems entirely relevant to me to discuss the bandwidth/content-size aspect of that, as it appears to be the defining feature of the problem you are trying to solve.

Apologies if I have misunderstood. Quite happy to move this to another thread.

Done. Started a new topic.

When you look at bandwidth from today’s Internet perspective, 2MB/day may not sound like much, but as someone who has used 2400 baud modems (about 2.4kbps) I can tell you that you can transfer quite a lot of useful information at such rates.
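Some rough arithmetic to put that in perspective (ignoring protocol overhead and assuming 8 bits per byte):

```python
# What a 2400bps modem moves in a day, compared with Lantern's 2MB/day budget.
modem_bps = 2400
modem_bytes_per_day = modem_bps / 8 * 86_400           # 300 B/s for 24 hours
print(modem_bytes_per_day / 1e6)                       # ~25.9 MB/day

lantern_bytes_per_day = 2 * 1024 ** 2                  # 2MB/day
print(lantern_bytes_per_day / (modem_bps / 8) / 3600)  # ~1.9 hours of modem time
```

In other words, the 2MB/day budget is roughly what a 2400 baud modem delivers in under two hours.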

To address the issue of low-bandwidth vs. high-bandwidth content, the current plan is to have a tag that allows the receiver to recognize the low- and high-bandwidth versions of the same content. The lo-fi version might be a stripped-down, plain-text version, or even a partial version of the hi-fi content. The receiver would then be able to upgrade to the high-bandwidth version when it is tuned into the high-bandwidth channel.
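A sketch of how that pairing might work on the receiver side. The structure and field names here are invented for illustration; they are not Outernet’s actual broadcast metadata:

```python
# Hypothetical lo-fi/hi-fi pairing: both versions share a content ID,
# and the receiver keeps the highest-fidelity copy it has seen so far.
from dataclasses import dataclass

@dataclass
class ContentItem:
    content_id: str   # shared by the lo-fi and hi-fi versions
    fidelity: str     # "lo" (stripped-down/partial) or "hi" (full)
    payload: bytes

def store(library: dict, item: ContentItem) -> None:
    """Add an item, upgrading a lo-fi copy when the hi-fi one arrives."""
    existing = library.get(item.content_id)
    if existing is None or (existing.fidelity == "lo" and item.fidelity == "hi"):
        library[item.content_id] = item

library = {}
store(library, ContentItem("weather-2015-06-01", "lo", b"plain-text forecast"))
# Later, while tuned to the high-bandwidth channel:
store(library, ContentItem("weather-2015-06-01", "hi", b"full forecast with maps"))
assert library["weather-2015-06-01"].fidelity == "hi"
```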

Also, in reference to the original topic: from my perspective, bandwidth is really one of the least significant problems in the context of content selection. People keep asking for a democratic way of selecting content as if that were a solved problem that we are refusing to implement or don’t know how to, and they keep raising problems that lie outside the selection process itself. I might be the dumbest person on earth, or just misinformed enough to be missing something crucial here, but I don’t think the selection process itself is a solved problem by a long shot.