2400 bps is enough

Ever since the days of BBSs, I have known that real, important information is very condensed.
Today people spend their 8 Mbit/s on videos and movies…

And websites (Facebook, for example) are (purposefully?) bloating each and every downloaded page with so much nonsense and formatting information that a single page can reach 300 kB of data… this is outrageous…

1,200 characters/bytes is approximately the amount of text on one book page.

That makes roughly 1.2 kB.

And Facebook, for example, uses the data that could have carried roughly 250 pages of text to show about one page of information.

Rewinding to the days of BBSs, I remember using them at 2400 bps initially, then 14400, 28800, 56k, and then ADSL.

2400 bps means 300 characters per second:
300 bytes per second… 0.3 kB per second…

People might laugh at that, but 300 letters per second is far more than most humans can read.

It means reading one page of text in 4 seconds…
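For anyone who wants to check the arithmetic above, here is a quick Python sketch (assuming a plain 8 bits per character and ignoring start/stop bits, which would lower the figure somewhat):

```python
# 2400 bps at 8 bits per character, with a 1200-character "book page"
BITS_PER_SECOND = 2400
BITS_PER_CHAR = 8        # simplification: real modem framing adds start/stop bits
CHARS_PER_PAGE = 1200

chars_per_second = BITS_PER_SECOND // BITS_PER_CHAR
seconds_per_page = CHARS_PER_PAGE / chars_per_second

print(chars_per_second)  # 300 characters per second
print(seconds_per_page)  # 4.0 seconds per page
```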

All this while one side of the world uses speeds like 800,000 letters per second (8 Mbit/s), and in another part of the world people have no access to information at all…

2400 bps is actually more than enough for accessing true, unbloated, real information. Even searching, web surfing in text, accessing information on Wikipedia, and more…

The problem is that pages use much more data for formatting and other things: JavaScript, tracking scripts, etc.

2400 bps could even be provided by GSM operators with minimal modification of the GSM systems…

It could be done terrestrially as well, in my opinion,
but a collaborative effort would be required.

Anyway, 2400 bps transfers information faster than humans can absorb it. You could promote this on the side of your project too, to create awareness.

Thanks for reading.


You could use a more modern analogy than BBSs:

A tweet is 140 characters. Receiving 2 tweets per second is definitely not slow.

2400 bps is a very achievable goal! We’re targeting 100-200 kbps, but this is by no means finalized. The biggest roadblock we are dealing with right now is the 802.11 spec, which requires a minimum receive speed of 1 Mbps. We want to be absolutely wifi-compliant, but these rates make the spacecraft design a very challenging problem. It’s not unsolvable, but nowhere near as simple as providing a 2.4 kbps transmission.

I think you should perhaps consider a different custom protocol,
and if slower speeds are easier to achieve with low-tech hardware, then I would prefer that.

Also, you could make a system for continuously broadcasting all sorts of information, all day long: news, weather, Wikipedia, forums, etc. The receiver devices would have, say, a 1 TB drive built in, storing everything as it arrives, so the user can access whatever he wants in an instant.

Instant access, locally…

And when you want an encyclopedia, it is already there with you…

We need a standard for a more condensed internet, one that works well over slower connections…

PDFs, books, etc.: real information and knowledge at your fingertips…

And it could perhaps be connected directly to TVs, without even a computer, because there are many more TVs (CRT and LCD) than there are computers…

All you need is an RCA video connection.

This has been done with even minimal hardware, even PICs…

If you make the device as well, for direct connection to a TV, you skip the need for a computer purchase entirely.

Please forward this idea as a genuine proposal.

I wouldn’t be surprised if many websites developed a “mobile version” style of alternate viewing for a slow speed such as this: a variation of Wikipedia that lets you view text only, for example, and the same for other popular websites, like news sites.

We have definitely considered something outside of wifi, but if we want to reach billions of people instantaneously, with devices they already have, then we’re basically stuck with either the 802.11-spec or a workaround that conforms to all wifi chipsets.

As an fyi, we’ll eventually be publishing a reference design for homemade antennae that can be used to increase the download speeds. This video shows how a cheap cooking strainer/colander can be easily converted into a satellite tv dish. Some other parts are still needed, but it’s the general idea that is important.

As for using equipment they already have, have you considered using Digital Radio Mondiale (DRM) in the 11, 13 or 15 meter international shortwave bands? Those bands are sparsely used because the critical frequency is not often high enough for earth-based broadcasters to use them for distance. DRM is designed for datacasting, and as such is a more “robust” protocol for the purpose than anything wifi, and it would reduce your transmit power requirements considerably. Also, all that is required to receive DRM is to plug the output of a shortwave receiver into the input jack of a computer’s sound card. Shortwave receivers are fairly common in the same areas you wish to reach, and their quality is far better than the cheap wifi chipsets found in old smartphones.

Alternatively, have you explored the eXtended Range protocol found in Atheros chipsets? It would reduce your data rate, but coupled with a quarter-width channel it could get your receive requirements down to -107 dBm. Even so, I question whether a cubesat can produce a powerful enough signal to be received by common hardware. For example, a wifi signal would require at least 250 times the RMS power of a DRM signal to be copyable on the ground, and shortwave broadcasters generally use power levels in the kilowatt range, whereas a 10 cm cubesat can’t realistically expect to do better than 10 watts on a continuous basis. Granted, these broadcasts might be short duty-cycle bursts, but I’d guess modified receiver gear would be required anyway; in which case, why use wifi when so many of your target audience already have shortwave receivers?


You are absolutely correct about the quality of shortwave DRM versus bending a cheap wifi radio to meet our needs, but if the goal is reach and impact, it’s really hard to argue with the prevalence of the wifi chipset. Tomorrow I’m meeting with some folks in the international aid space who have incredible field knowledge of what kind of equipment is common and what is not. Many of the requirements of the system will be informed by groups who are in the trenches. Will be sure to keep you posted.

eXtended Range is interesting and we’re not familiar with it. Can you provide additional information and links?

As far as free space loss goes, it’s roughly -150 dB from LEO; at 600 km I think it’s -155 dB (but could be off by a dB or two). A -107 dBm sensitivity is not that difficult a link to close. A 3U cubesat can definitely close that link, though it might be hard for a 1U.
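As a sanity check on those numbers, here is the standard free-space path loss formula in Python (distance in km, frequency in GHz; the 92.45 constant is the usual km/GHz form of the equation):

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# 600 km (straight down from LEO) at wifi's 2.4 GHz:
print(round(fspl_db(600, 2.4), 1))  # 155.6 dB, matching the ~-155 dB figure
```

Slant range at the edge of the footprint is longer than 600 km, so the loss there would be a few dB worse.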

The only reason to use wifi over shortwave is that, nowadays, it’s just plain cheaper. Of course, there is also the regulatory environment; 2.4 GHz is unlicensed internationally.

I don’t know your interest in this, but a large-scale mesh network system is sought after by many people in the world today. If wifi is inherently short-range, maybe it is time for a better standard.

Wifi today is not shared in many ways, and ADSL is not designed, nor opened afterwards, to provide sharing of bandwidth, etc.

The standards are a product of a mindset that does not prioritize maximum reach of knowledge.

Building on that standard perhaps means accepting the tradeoffs made by that mindset. If range was sacrificed by the design decisions, perhaps it is proper to design a new standard that can easily be used for long-range data transfer with inexpensive hardware and modest bandwidth, both in mesh networks and satellite communication.

And keep in mind that if you design for wifi devices to receive,
it probably means the user already has access to computers and electronics, and probably telephone/internet.

A low-speed device going directly from satellite to TV, with internal storage, is an alternative. Please forward this idea, I ask you again.


The difference between -150 dB and -107 dB is roughly 20,000 times the power requirement. Considering that the common hotspot is about half a watt RMS, you’re looking at a minimum power floor of 10 kilowatts from a 3U cubesat. And did you consider that most wifi chipsets in cell phones don’t have proper dipole antennas? They don’t fit, so most cell phones use electrically shortened antennas that contribute signal loss in the range of -2.5 to -3 dB.

Maybe this can still be done by taking a cue from the EME ham radio geeks: build a 1U cubesat that simply deploys as large and efficient a mylar balloon reflector as you can fit, plus a tracking beacon, so a ground-based, high-power parabolic transmitter can bounce a signal off of it.

And free space loss doesn’t even consider atmospheric scatter loss. While the ionosphere isn’t going to block most of the signal, it’s not entirely transparent either. It’s like shining a flashlight through a dusty room: there is plenty of light to see the other wall, but some small percentage of it strikes dust particles in the air. Cloud cover would also cause some measurable signal loss, as would the user simply holding his cellphone in the palm of his hand. Using a colander as a mini sat dish isn’t going to cut it. I’m pretty sure DRM is the only way to make this project work at all.

As for the unlicensed wifi thing, that doesn’t apply to non-terrestrial emitters, nor to any emitter over the statutory limit set by each nation, none of which is over 2 watts. Hams can crank up to 50 watts, but I know of no cell phone chipsets that can be unlocked to monitor the ham wifi channels, so it’s not likely common enough to bother with.
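For anyone checking the numbers, the decibel-to-power-ratio conversion behind the "20,000 times" figure is just ratio = 10^(dB/10). A minimal sketch:

```python
def db_to_ratio(db: float) -> float:
    """Convert a decibel difference into a linear power ratio."""
    return 10 ** (db / 10)

# The 43 dB gap between a -107 dB and a -150 dB link:
ratio = db_to_ratio(150 - 107)
print(round(ratio))    # 19953, i.e. roughly 20,000x
print(0.5 * ratio)     # half-watt hotspot scaled up: ~10 kW
```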

Textual information is a narrow format that requires only visual observation.

The volume of data depends on the content type and the protocol overhead.

The data carried by the signal is largely irrelevant; it’s the broadcast protocol that is the base problem here. We’re trying to get the wifi protocol to do something well beyond what it was designed to do, which in the world of radio propagation generally means more power is required. To create a distinctly copyable wifi signal at sea level, across a target area roughly 600 miles in diameter (more isn’t realistic from low earth orbit, as the edges of the signal footprint would be at rather extreme angles, almost at the horizon, and pass through a great deal more atmosphere than can be overcome), the root-mean-square power requirements are huge.

DRM is a narrowband datacasting protocol, in part, for this exact reason. When we cut the protocol bandwidth in half, we double the signal’s power density using the same wattage of amplification. Wifi’s smallest bandwidth is a quarter-width channel, 5 megahertz wide, while DRM can use any bandwidth in 4.5 or 5 kilohertz steps. A single or double channel (4.5 or 9 kHz wide) can fit inside the output bandpass of any standard AM shortwave radio, so a direct audio patch cord from the shortwave radio to the input jack of a computer’s sound card is all the hardware needed to decode the signal.

Wider DRM signals could be broadcast for additional data, which would require accessing and manipulating the radio’s internal frequency, but those DRM “channels” that can’t be heard with an unmodified shortwave radio are not required to be practically related to each other. Each 4.5 kilohertz-wide “channel” can carry a completely independent data stream, permitting the most important information (such as how to modify a shortwave for the internal frequency and acquire or build the correct additional equipment, such as a downconverter) to be streamed in the two most centered channels. For that matter, just one channel can move more than 2.4 kbps, so one might be enough anyway.
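The bandwidth argument above can be put in numbers. Assuming flat thermal noise (noise power proportional to bandwidth, N = kTB), concentrating the same transmit power into a narrower channel raises the SNR by 10*log10(wide/narrow) dB:

```python
import math

def narrowband_gain_db(wide_hz: float, narrow_hz: float) -> float:
    """SNR advantage from squeezing the same transmit power into a
    narrower channel, assuming noise power scales with bandwidth (kTB)."""
    return 10 * math.log10(wide_hz / narrow_hz)

print(round(narrowband_gain_db(2, 1), 1))        # 3.0 dB: halving the bandwidth doubles power density
print(round(narrowband_gain_db(5e6, 4.5e3), 1))  # 30.5 dB: 5 MHz wifi quarter-channel vs 4.5 kHz DRM
```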

For DRM, at what frequency would you suggest operating?

Well, in order to stay within the international shortwave broadcast bands and stay above the critical frequency, you’d have to use the 15 m, 13 m or 11 m bands. However, even the 11 m band would require a huge dipole for a 3U minisat. The 15 meter band is being considered as a DRM broadcasting band for local-to-regional broadcasting, much like FM is used in the US. But right now it’s still an international shortwave band, even though it’s almost unused by broadcasters that depend upon skywave propagation. 15 meters is pretty much always too high a frequency for skywave, but it is covered by just about every shortwave receiver I can think of. I checked mine and it tunes it fine; there’s just nothing to hear. The general idea of broadcasting shortwave from orbit makes so much sense, I’m kind of surprised no one has done it yet. The bandwidth available above the critical frequency is uncluttered compared to below it and in the skywave skip ranges.

However, DRM isn’t designed to deal with significant Doppler effects, so mode D would be required. But DRM is designed with some Doppler in mind, so it might be workable even with low earth orbit and some adjustments in the center frequency and such at the transmitter. The 13 meter band is used as an AM band in Asia, and the 11 meter band overlaps the Citizens Band in the US, which has similar uses in many other nations. DRM still works fine in higher bands; I just don’t know how high most receivers will go.

Another route might be to have some (relatively) cheap USB software-defined receivers mass-produced that are sensitive in your target band. $20 each shouldn’t be that hard, and the device could also carry the apps and ‘get started’ data on itself. There are also lower-frequency ISM bands (license-free like 2.4 GHz) at 433 MHz and 40 MHz that would be worth considering if a special receiver is required.

But if DRM is the mode, and shortwave radios the target equipment, I’d suggest looking into buying a sideband data stream on one of the existing DRM shortwave broadcasters that serve your target area. There is a broadcaster in Sweden (iirc) that prides itself on its ability to penetrate into Burma despite the junta’s efforts to jam it, as an example.


Syed - I like your idea, however the satellites are going to have to broadcast at some serious power levels to be received on the ground if they are going to be running at 2.4 GHz.

As MoonShadow has pointed out - perhaps DRM may be a possibility.

I posted about the amateur radio ‘HamTV’ transmitter in another topic - this will use the 2.4 GHz band, transmitting at 10 watts.

The ground station will require at least a 1m dish pointed accurately at the Space Station as it moves across the sky. Admittedly you will get better range with lower bandwidth requirements.

Link here:-

The ISS orbits around 240 miles or so above the earth, so you will get even more free space loss with a satellite in a 600 mile orbit - I just don’t think it would be possible to pick up 2.4 GHz WiFi from a satellite without very highly specialised equipment.

If DRM can work on VHF/UHF - with Doppler shift from passing satellites taken into account (perhaps even the lower part of standard Band II FM, 88-108 MHz?) - then this could be an option (fewer ionospheric scatter issues than using HF) - however, again, it will require some sort of ‘specialised’ receiver.

This will be a lot cheaper and simpler to use than building a system to accurately track a satellite and accurately point a 1m + sized dish at the satellite.



I don’t see any way you can remain 2.4 GHz 802.11-compliant and legal…
You cannot run 10 watts of power on these things (without a license), and even if you had a license, only ITU Region 2 can utilize it. Let’s not forget most wireless devices cannot run in promiscuous mode, meaning they cannot receive from you unless they can authenticate, i.e. transmit to you.

You are better off using an amateur radio network, think of it like a packet network in space.

That would be solid, work much better, be cheaper, straight to the point, and easily satisfy the ITU Region 1, 2 and 3 requirements.


1 Mbps chipsets. Therein lies the rub. If you could get modern smartphone chipsets to receive data transmissions at only 10 kbps (like 9600 bps), then the SNR would go up by 20 dB. That is the equivalent of 100 times the transmit power. (Technically the noise would go down, but signal-to-noise is a ratio.)
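The 20 dB figure checks out, assuming noise power scales with the receive bandwidth (and hence roughly with the data rate):

```python
import math

def snr_gain_db(old_bps: float, new_bps: float) -> float:
    """Approximate SNR gain from lowering the receive data rate,
    assuming noise power scales with the signal bandwidth."""
    return 10 * math.log10(old_bps / new_bps)

print(snr_gain_db(1_000_000, 10_000))  # 20.0 dB, i.e. 100x effective power
```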

I don’t think a $20 SDR would be tough at all. The RTL-SDR can be found for as little as $8 on Amazon. Depending on the tuner, it can go from 22 MHz to 2200 MHz.

Sure, but then you lose the concept of already having billions of in-place ground stations in the form of smartphones, laptops, tablets, and access points.