Does the Lantern Have Built-in Storage for Offline Content?

Does the Lantern have built-in storage, and how does it provide data to devices? It says the Lantern can download 2 MB of data per day, so is each day’s data stored somewhere, or does it just stream to us like a live feed? If it acts as an offline internet, can we search for any content at any time? Can we use Google, or will there be a specific website for us to stream data from? Too many questions in mind :\ help me understand the actual use of the Lantern.

I’m not part of the Outernet project, but I believe that:

The Lantern will support microSD cards up to 64 GB. They will ship with a 4 GB SD card.

You can’t use Google, as that would necessitate two-way communication.

It’s one-way communication: you just get sent the data selected by Outernet and the community. You can participate in the community in various ways to influence what content is sent.

If so few gigabytes of memory are provided, I guess it would be difficult for the Lantern to grow. Accessing the internet in an offline way would take huge amounts of data; their stated plan of making offline Wikipedia accessible and providing everyday educational content for village schools will definitely require tons of data streamed every day, and storing such huge content would not be possible in so little memory. I guess storage works kind of differently in this technology :\

The Lantern is really for text, and you can fit a lot of text in 4 GB.

But I do kind of agree. 8 GB and 4 GB SD cards are more or less the same price, so an 8 GB card would have made sense to me.

In terms of how you access the data: you connect your phone or laptop to the Lantern’s wifi. You can then access an offline webpage that will look like this:

The Outernet website clearly states that the Lantern is for webpages, images, videos, music, or any type of file that could be downloaded to the Lantern. So even 8 GB, or even 64 GB, is too little for an offline internet, don’t you think? Or is it that specific data, chosen per person, will be streamed and can be saved according to their will?

Well, yes it does. I agree with you that it’s a bit misleading. The intention is to send 10 MB per day of pre-selected content to the Lanterns. I think that’s an interesting and useful service, and I’m excited by it.

They additionally intend to send between 200 MB and 1 GB of data per day to people who have set up satellite dishes. This is also an interesting project and would allow video, etc.
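Taking the figures in this thread at face value (10 MB/day to Lanterns, 200 MB to 1 GB/day to dish receivers), a quick back-of-the-envelope sketch shows why the two services feel so different:

```python
# Back-of-the-envelope numbers from this thread (10 MB/day to Lanterns,
# 200 MB-1 GB/day to dish receivers); purely illustrative.

def days_to_fill(card_gb: float, daily_mb: float) -> float:
    """Days of broadcasts needed to fill a card, ignoring overhead."""
    return card_gb * 1024 / daily_mb

# A 4 GB card takes over a year to fill at the Lantern rate...
print(round(days_to_fill(4, 10)))    # ~410 days
# ...but under a month at the low end of the dish rate.
print(round(days_to_fill(4, 200)))   # ~20 days
```

So the 4 GB card is nowhere near the bottleneck for the Lantern service; the daily downlink is.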

It seems to me that they are quite different services and should be talked about separately. The Outernet team don’t seem to agree, so I anticipate this confusion continuing to arise.

So that means there’s a lot more to be taken care of; many more topics arise from a single question. I hope the Outernet team is paying attention to all the scenarios and problems that could come up in the future and finds a robust solution, because people will be expecting huge amounts of free data at a click. They will expect a portal somewhat like Google, in short a web portal where you just type in the information you need and you get it (including video, music, images, etc.).

The purpose of the project isn’t to host the internet. It’s more of a library of preconfigured offline content. From my discussions with Branko, I understand that actually transmitting the full contents of Wikipedia is outside the viability of the Lantern and Pillar technology as currently implemented.

Sites like Google will not exist, but engines like Wolfram Alpha may be implementable.

Keep in mind there are currently two sets of technology being examined: one for mobility (Lantern) and another for more data capacity (Pillar). The Lantern is primarily for news (local, world, weather, technology) and low-volume data like books and articles, plus smaller or compressed pictures. The Pillar is more of a village library, downloading libraries like Wikipedia, RACHEL-Pi, and the Global Village Construction Set. The Pillar can receive about 200 MB/day, I believe, in the current (and possibly permanent) configuration.

Also, the purpose of Outernet, to my understanding, is more about education than entertainment.

That’s our intention anyway. Once the community starts deciding on what should be broadcast, things may change a bit, although Outernet might continue to maintain its core archive and have a community archive in parallel.

Well, great. It’s a big leap for connecting people with knowledge, and I guess the technology will slowly flourish through daily interaction with the public.
Specific and important data could soon be streamed to probably every human population, which would quickly increase the literacy rate, and remote places would benefit in education as well as in many topics connected to it.

Broadcasting proper and useful data should be taken care of; that’s my concern. People will also need a good user interface that they can become familiar with quickly.

Wishing you good luck, Outernet team. I hope you make the best of this technology.

This is my concern too. Unfortunately, Outernet’s future is not financially stable enough to focus heavily on non-revenue goals. At least, this is my understanding.
Though content selection is definitely on the short list of things to do. I’ve been talking with Branko about it regularly, and I think we’re starting to get somewhere in our brainstorming.

This seems like a real shame to me. There is a real opportunity, when the Lanterns are first dispatched, to ship them with a good chunk of Wikipedia. You couldn’t download all that data over a 10 MB/day link, but I think you could send the updates to this core data.

Work has been done on incremental updates for ZIM files, which is probably the most promising file format for this.
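To make the idea concrete, here’s a toy sketch of delta updates. This has nothing to do with the actual ZIM delta work (which I haven’t seen); it just illustrates the general principle: the broadcaster sends opcodes plus only the inserted bytes, and the receiver replays them against the old copy it already holds.

```python
# Toy incremental update: diff on the broadcaster, patch on the receiver.
from difflib import SequenceMatcher

def make_delta(old: str, new: str):
    """Opcodes referencing the old text, plus only the inserted text."""
    delta = []
    for tag, i1, i2, j1, j2 in SequenceMatcher(None, old, new).get_opcodes():
        if tag == "equal":
            delta.append(("copy", i1, i2))        # receiver copies from old
        else:
            delta.append(("insert", new[j1:j2]))  # only new bytes are sent
    return delta

def apply_delta(old: str, delta):
    """Reconstruct the new text from the old copy plus the delta."""
    out = []
    for op in delta:
        if op[0] == "copy":
            out.append(old[op[1]:op[2]])
        else:
            out.append(op[1])
    return "".join(out)

old = "The Lantern receives ten megabytes per day."
new = "The Lantern receives ten megabytes of content per day."
delta = make_delta(old, new)
assert apply_delta(old, delta) == new
```

The payload is only the opcodes and the inserted fragments, which is what makes updating a large preloaded file over a 10 MB/day link plausible at all.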

I see that they offered to work with Outernet here. I’m not sure if it was ever followed up?

You could absolutely send the updates, but then you would be relying on preloaded content, right? Also, what is the rate of edits for Wikipedia? I personally don’t think Wikipedia is a great resource, as it reflects the mentality of the people who use it more than a library does (which is what I like to think of Outernet as :P; I’m biased).

The link you referred to is a little over a year old. I didn’t see any indication that the project was finished or pursued?

Branko said the ZIM files were big enough to be uninteresting at this time. They did communicate and try to make the technologies compatible, but it wasn’t viable.

Yes, absolutely, you’d be relying on preloaded content. When the Lanterns ship, there is a one-off opportunity to send this.

Wikipedia has bias, libraries have bias, Outernet will have bias. It’s just one of those things. Everything has bias. Personally I think Wikipedia is an amazing (yet imperfect) resource.

Not sure if the incremental thing was ever finished. I’m not saying it’s a completely solved problem, but it’s not one that would have to be solved from scratch either…

That’s a vendor-lock-in opportunity. What if someone builds a receiver and doesn’t have the base file? What would they do with the updates?


  1. Download the original file off the net & stick it on the device.


  2. Just ignore the updates and don’t keep them.

I think the main issue is update size. How big would an update be?
As for the content, I’m curious how relevant all of Wikipedia’s articles about people are, and how much similar potential fluff there is on Wikipedia.

I’m not sure all biographical information is ‘fluff’.

You could allocate an amount you were prepared to send and just send whatever fitted within that.
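As a sketch, “send whatever fits” is just greedy packing against a daily byte budget. The file names, sizes, and priorities below are invented for illustration; only the 10 MB/day figure comes from this thread.

```python
# Greedy "send whatever fits" against a daily byte budget (illustrative).

DAILY_BUDGET = 10 * 1024 * 1024  # 10 MB/day, the Lantern figure quoted above

# (priority, name, size_in_bytes) -- a hypothetical content queue;
# lower priority number = more important
queue = [
    (1, "news-digest.html", 300_000),
    (2, "wikipedia-medicine-update.zim", 6_000_000),
    (3, "textbook-ch3.pdf", 5_000_000),
    (4, "weather.json", 20_000),
]

def pack(queue, budget):
    """Take items in priority order, skipping anything that won't fit."""
    sent, used = [], 0
    for _, name, size in sorted(queue):  # priority 1 first
        if used + size <= budget:
            sent.append(name)
            used += size
    return sent, used

sent, used = pack(queue, DAILY_BUDGET)
print(sent)  # the textbook chapter misses the cut today and waits for tomorrow
```

Anything that doesn’t fit simply carries over to the next day’s broadcast, so nothing is lost, just delayed.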

There are some existing curated versions that might be useful, e.g.:

If nothing else, the medicine project seems like good value at 444 MB.


I wasn’t saying all of it, but there is a lot. And fluff is also relative to the person reading it.

Those updates vary quite a lot in size. Wouldn’t it make just as much sense to spend the time gathering or presenting that information in a more compressed and sendable way? I mean, if we can send 40 novels a day, we could send a seriously stripped-down textbook too.
Also, I believe there are a variety of packages for schools. I think a few of them have been considered for broadcast as well.

That’s not really ideal. We’ll probably have an offline version of Wikipedia, but I doubt it’d be in a format where you must send diffs to update it. It will probably be broken down into individual articles, and updates would be on an article-by-article basis. Anyway, we’ll see how it goes. This issue is currently still off my radar.
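A minimal sketch of that article-by-article idea, assuming a hash manifest on the receiver; the article names and the manifest scheme are my invention, not Outernet’s actual design:

```python
# Article-by-article replacement: the receiver keeps a hash per stored
# article, and a broadcast frame carries whole replacement articles.
import hashlib

def digest(text: str) -> str:
    return hashlib.sha256(text.encode()).hexdigest()

# What the receiver already holds
store = {
    "Solar_power": "Old article text...",
    "Rainwater": "Unchanged article text.",
}
manifest = {name: digest(text) for name, text in store.items()}

# One broadcast frame: full articles, not diffs
broadcast = {
    "Solar_power": "New, updated article text...",
    "Rainwater": "Unchanged article text.",
}

for name, text in broadcast.items():
    if manifest.get(name) != digest(text):  # new or changed article only
        store[name] = text
        manifest[name] = digest(text)

print(store["Solar_power"])  # replaced wholesale; Rainwater untouched
```

Whole-article replacement wastes some bandwidth compared to diffs, but it needs no base file on the receiver, which sidesteps the vendor lock-in concern raised earlier in the thread.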

That’s still a bit on the large side at 5 GB, but definitely better than full Wikipedia. I’ll download it and see what it contains.