Request for new content and re-queue of old

Could the relatively recent community files (maybe last 14 or 30 days) please be re-queued for delivery now that Skylark is out of beta?

Also could you please add the following Wikipedia articles?

Thanks!

Oh shoot. I delete the community content every few days. Are you aware of any good online services that turn web pages into single, static files?

Doh! Ok, I resubmitted a few things. Not being able to zip stuff makes it really difficult to find good content.

I have not used any online service that does that, but I’ll keep an eye out for one. I can’t imagine it would be that difficult to do.

@Syed
Not sure if this is exactly what you are looking for, but I think it fits the bill.

http://anh.cs.luc.edu/python/hands-on/3.1/handsonHtml/webtemplates.html

Just tested it out here in the lab.

Not exactly an online service, but Chrome can be automated, so this could be scripted.
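As a rough sketch of what scripting Chrome could look like (assuming a recent Chrome/Chromium with headless mode; the binary name and the example URL are assumptions, and note this captures only the rendered HTML, not a bundled single-file page with assets):

```shell
# Sketch: capture a page's rendered HTML from a script using headless Chrome.
# --dump-dom writes the serialized DOM to stdout; images/CSS are NOT bundled,
# so this is not a true single-file capture on its own.
google-chrome --headless --disable-gpu \
    --dump-dom "https://en.wikipedia.org/wiki/L_band" > L_band.html
```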

@Syed Any decision on the two Wikipedia files?

No decision on how I will do it, but I’ll make sure it gets done by tomorrow. I might just take the archive.org version of the pages, since they include all assets in one location.

Manually uploading old content from a web site is a great idea, particularly if you have external storage plugged into your CHIP and use Skylark.

For new Lantern sales (not the DIY kits), I think you should preload files onto an Outernet-provided external storage device, such as a USB stick or a microSD card in an adapter, that then plugs into the Lantern hub strip.

Sounds like a pain, doesn’t it? But then the delivered device is relatively current and can even have special large databases (such as Rachel) preinstalled. Ken

Oh, I thought you had something for Wikipedia articles, since it seems like random recently updated ones show up. If it is a pain, don’t worry about it :slight_smile:

Not a problem at all. The email I used as a todo simply got buried. I’ll upload something as soon as I get home.

Maybe I am misreading your issue with assets, but can’t wget pull the page with full assets?

Just asking.

-Cecil
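For anyone trying the wget route, here is a minimal sketch that pulls a page together with the assets it needs to render offline (standard GNU wget flags; the Wikipedia URL is just an example):

```shell
# Download one page plus the images/CSS/JS it references, rewriting
# links so the saved copy works offline:
#   --page-requisites   also fetch the assets the page needs to render
#   --convert-links     rewrite links to point at the local copies
#   --adjust-extension  save HTML files with a .html extension
#   --span-hosts        follow assets hosted on other domains
#                       (Wikipedia serves images from upload.wikimedia.org)
wget --page-requisites --convert-links --adjust-extension \
     --span-hosts --no-parent \
     "https://en.wikipedia.org/wiki/L_band"
```

This produces a directory tree rather than a single file, but the page opens and renders without a network connection.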

I just created a directory called Requests. It should be coming down now. Inside it are the two .html files, as well as the folders containing the assets for each page. Let me know how it works.

@Syed
The directory showed up and the files work fine!

Got ’em! That was quick, thank you :slight_smile:

What generates the files in the Wikipedia folder?

FYI…

Jan  7 02:52:30 skylark user.info ondd[417]: [carousel] completed: opaks/2216-16px-Folder_Hexagonal_Icon.svg.png.tbz2
Jan  7 02:52:44 skylark user.info ondd[417]: [carousel] completed: opaks/436d-12px-Commons-logo.svg.png.tbz2
Jan  7 02:53:00 skylark user.info ondd[417]: [carousel] completed: opaks/5c4f-23px-Flag_of_the_United_Kingdom.svg.png.tbz2
Jan  7 02:55:49 skylark user.info ondd[417]: [carousel] completed: opaks/11ad-load(1).php.tbz2
Jan  7 02:56:15 skylark user.info ondd[417]: [carousel] completed: opaks/50ae-wikimedia-button.png.tbz2
Jan  7 02:56:33 skylark user.info ondd[417]: [carousel] completed: opaks/5440-Inmarsat_logo.png.tbz2
Jan  7 02:57:58 skylark user.info ondd[417]: [carousel] completed: opaks/6ab4-240px-Inmarsat-3.gif.tbz2
Jan  7 03:02:42 skylark user.info ondd[417]: [carousel] completed: opaks/714d-Inmarsat - Wikipedia.html.tbz2
Jan  7 03:04:11 skylark user.info ondd[417]: [carousel] completed: opaks/7424-220px-Couverture_satellite_inmarsat.svg.png.tbz2
Jan  7 03:06:16 skylark user.info ondd[417]: [carousel] completed: opaks/742c-L band - Wikipedia.html.tbz2
Jan  7 03:45:01 skylark user.info ondd[417]: [carousel] completed: opaks/a092-240px-Satellite_phone.jpg.tbz2
Jan  7 03:49:33 skylark user.info ondd[417]: [carousel] completed: opaks/a9cf-load(3).php.tbz2
Jan  7 03:51:10 skylark user.info ondd[417]: [carousel] completed: opaks/ac57-240px-Shoreditch_inmarsat_building_1.jpg.tbz2

Chrome, save as complete website.

@Syed, I mentioned using wget to do this conversion earlier. Here is a site with an example for you.

-Cecil

I meant the other Wikipedia pages, the ones that are all one file?

@kf4hzu That is a custom setup that Abhishek made, that pulls in the most popular Wikipedia pages of the day. We’re now using that system for individually requested Wikipedia articles.

Oh ok cool, seems @Abhishek has mad skills :slight_smile:

Crazy mad skills.