With all the jibber-jabber about Starlink being down, I figured it was an appropriate time to remind people this exists. Vint Cerf, one of the founding wizards of the internet, established the IPN SIG in 1998 to cuss and discuss issues related to IP protocols over high-latency, potentially high-loss links. Worth poking around if you've not seen it before, though I sort of wish there were more use cases regarding information security.
jvanderbot · 1d ago
I used to work with some of those board members at JPL!
DTN is cool stuff. We had a few applications built up for distributed "delay aware" computing so that you could, at the network/application boundary, farm out jobs for e.g., an orbiting compute cluster coming over the horizon.
Really fun times.
bigfatkitten · 23h ago
And there are lots of open implementations to play with!
It's also noteworthy that DTN can be used on Earth too, especially in remote places with poor/unreliable data connections. There's some interesting literature about those applications, which was my first approach to DTN when I started working with it.
philipwhiuk · 12h ago
I'm not sure how seriously I take an organisation supposedly focused on interplanetary space where the main advertised event seems to be Raspberry Pi workshops.
There are many conferences and academic discussions that spend a long time bikeshedding while industry actually does stuff.
The lack of involvement from industry in a field where stuff is happening suggests to me this is one of them.
Sanzig · 8h ago
Most of what they do is Layer 2 and above, so it's hardware agnostic - prototyping on a Pi is fine.
Their work is gaining traction. DTN Bundle Protocol has been baselined for the LunaNet specification, which a bunch of private companies are designing to for lunar relay networks. Bundle Protocol is also currently on the CCSDS standards track so it should be formally part of the CCSDS protocol suite soon.
For those unaware: CCSDS is the Consultative Committee for Space Data Systems; they set widely used standards for spacecraft communications protocols. Basically anything beyond Earth orbit flies some variant of a CCSDS protocol stack, and a substantial chunk of missions in Earth orbit do as well, particularly if they are government funded. It's an international effort; China and Russia participate too, so that everyone can communicate if need be.
0points · 11h ago
> The lack of involvement from industry in a field where stuff is happening suggests to me this is one of them.
Remind me again, which companies are going inter-planetary?
dcminter · 1d ago
Off topic, but...
> to cuss and discuss
...is a turn of phrase that's new to me and I love it. Totally stealing that.
OhMeadhbh · 23h ago
It's from my 7th grade history teacher, Mr. Mooneyham. As in "tomorrow we're going to cuss and discuss the Louisiana Purchase. Make sure you read chapter 12." He was also the teacher who had the "Super-Duper Discussion Stick" which he used to hit your desk if you fell asleep in class. And at least once he played the version of the "Devil Went Down to Georgia" w/ the bad words left in.
In the old days, public schools in suburban Texas were quirky, but the quality of education was relatively decent. For instance, I remember that Thomas Jefferson was president in 1803 when the Louisiana Purchase was finalized.
LorenDB · 1d ago
IMO the most likely solution to interplanetary networking is to throw tons of datacenter and compute capacity at anywhere that's more than a few light-seconds from the nearest existing datacenter, then use something along the lines of IPFS to perform data synchronization between planets.
bigfatkitten · 23h ago
Despite the name, IPFS has no properties that make it suitable for this application. It’s very bandwidth intensive and isn’t designed with latency or disruption tolerance in mind.
knome · 23h ago
there's a lot of interesting problems just in the networking.
if it took four years for a message to cross the void from where you are to the recipient, you certainly wouldn't want to wait a full eight years to see they didn't send a receipt message and only then retransmit.
eight years is some awful latency.
you'd probably want to send each message on something like a fibonacci schedule over the months. so, gaps of (1, 2, 3, 5, 8, 13, etc) would mean sending the message in months (1, 2, 4, 7, 12, 20, 33, etc) until you got a confirmation message that they had received it. they would similarly want to send confirmations in the same sort of pattern until they stopped receiving copies of that message.
spreading the resends out over time would ensure not all of your bandwidth was going to retransmissions. you'd want that higher number of initial transmissions in hopes that enough of the message makes it across the void that they would have started sending receipts reasonably close to the four years the initial message would take to get there.
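A quick sketch of that schedule (toy code of my own; the function name and the 40-month horizon are just for illustration):

```python
def fib_send_months(horizon):
    # retransmission months with Fibonacci gaps (1, 2, 3, 5, 8, 13, ...)
    # starting at month 1, up to the horizon or until a receipt arrives
    months, gap, nxt = [1], 1, 2
    while months[-1] + gap <= horizon:
        months.append(months[-1] + gap)
        gap, nxt = nxt, gap + nxt
    return months

print(fib_send_months(40))  # [1, 2, 4, 7, 12, 20, 33]
```

In practice you'd stop resending as soon as a receipt showed up, so the later entries only matter in the pathological no-reply case.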
if you had the equivalent of a galactic fido-net system, it could be decades and lifetimes between messages sent to distant stars and messages sent back.
furyofantares · 23h ago
Wouldn't you want to completely saturate your bandwidth? Just always be transmitting whatever message has been transmitted the least.
knome · 22h ago
that would probably depend on how much power it takes to send the messages, how much actual usable bandwidth you could manage over the distances involved, and how much data you want to send.
if it takes a large amount of energy to send the data, we probably wouldn't want to run the equipment all the time. strong pulses would let the equipment cool down or recharge capacitor banks or whatever during downtime.
interstellar dust and other debris floating through space could cause interference, not to mention radiation from everything else around us, and our own sun shining right next to our little laser.
might want to move the laser out onto pluto or something to avoid having it right up against the sun.
toast0 · 22h ago
You'd want to do a lot of work with erasure codes as well.
Sanzig · 8h ago
It would be a lot more efficient to use erasure coding + heavy interleaving with other traffic so that you can withstand a maximum predicted outage period.
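As a toy illustration of that idea (a single XOR parity chunk stands in for a real erasure code like Reed-Solomon, and all the names here are mine):

```python
def xor_parity(chunks):
    # single-parity erasure code: XOR of equal-length chunks; any ONE
    # missing chunk can be recovered as the XOR of all the others
    out = bytes(len(chunks[0]))
    for c in chunks:
        out = bytes(a ^ b for a, b in zip(out, c))
    return out

def interleave(codewords):
    # send column-by-column: time slot j carries chunk j of every codeword,
    # so a contiguous outage of one slot costs each codeword only one chunk
    return [[cw[j] for cw in codewords] for j in range(len(codewords[0]))]

data = [b"ab", b"cd", b"ef"]
codeword = data + [xor_parity(data)]           # 3 data chunks + 1 parity
survivors = [codeword[0], codeword[2], codeword[3]]  # slot 1 wiped out
assert xor_parity(survivors) == codeword[1]    # recovered from parity
```

A real system would use a code that tolerates more than one loss per codeword, with the interleaving depth sized to the maximum predicted outage.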
scottyah · 23h ago
and you'd probably want to take orbits/vectors into account; a Dijkstra-esque algorithm where the distances keep changing is crazy.
Also, our signals are usually going very short distances very quickly and are very protected from solar/cosmic waves by the ionosphere. What kind of data loss could you get transmitting in open space across vast distances and time?
Sanzig · 8h ago
Interstellar space is pretty empty, and we have good models for it thanks to the radio astronomy community. Dispersion is low enough to be nearly negligible, even over tens of light years.
Determining theoretical interstellar link rates is a fairly straightforward link budgeting exercise, easier in fact than most terrestrial link calculations because you don't have multipath to worry about.
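A minimal version of that exercise, assuming free-space path loss plus the Shannon capacity limit; every parameter value below is made up for illustration, not a real system design:

```python
import math

def link_budget_bps(p_tx_w, gain_tx_db, gain_rx_db, freq_hz, dist_m,
                    sys_temp_k, bandwidth_hz):
    """Shannon-limited rate for an idealized free-space link
    (no multipath, no atmosphere). Illustrative only."""
    k_b = 1.380649e-23                        # Boltzmann constant, J/K
    lam = 299792458.0 / freq_hz               # wavelength, m
    fspl_db = 20 * math.log10(4 * math.pi * dist_m / lam)  # free-space path loss
    p_rx_dbw = 10 * math.log10(p_tx_w) + gain_tx_db + gain_rx_db - fspl_db
    noise_dbw = 10 * math.log10(k_b * sys_temp_k * bandwidth_hz)
    snr = 10 ** ((p_rx_dbw - noise_dbw) / 10)
    return bandwidth_hz * math.log2(1 + snr)

# 1 MW transmitter, big dishes, Ka-band, ~1 light year, cryogenic receiver
rate = link_budget_bps(1e6, 80.0, 80.0, 32e9, 9.46e15, 20.0, 1e6)
```

The interesting part is that the whole calculation is just the Friis equation plus a noise floor; there's no fading or multipath term to model.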
jvanderbot · 21h ago
I agree! This was my obsession when I worked at JPL; unfortunately, the answer was usually "no mission will sacrifice their budget for reusable assets".
You'd need a mission whose purpose is to emplace compute stations.
That's why we can't have nice things.
r14c · 21h ago
you'd probably want a different protocol than IPFS for that application. managing a DHT with extremely high latency isn't going to work very well. something like named-data networking would probably work better, since the transmitter can know:
1. exactly what prefixes need to be buffered based on the received interest messages from deep space
2. exactly which data rate is possible at any given time
3. exactly how much data needs to be sent from the buffer in each transmission
optimizing for high latency really pushes your design choices around compared to our comparatively very low latency uses here on earth. it's pretty interesting to think about.
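Those three points could be sketched like this (a toy buffer of my own invention, not a real NDN/CCN implementation; all names are illustrative):

```python
from collections import defaultdict, deque

class InterestDrivenBuffer:
    """Buffer only prefixes that remote receivers expressed interest in
    (point 1), and drain per the current link budget (points 2 and 3)."""

    def __init__(self):
        self.interests = set()                # prefixes requested from afar
        self.pending = defaultdict(deque)     # prefix -> queued packets

    def on_interest(self, prefix):
        self.interests.add(prefix)

    def publish(self, name, payload):
        for prefix in self.interests:
            if name.startswith(prefix):
                self.pending[prefix].append((name, payload))
                return True
        return False                          # nobody asked; don't buffer

    def drain(self, rate_bps, window_s):
        budget = rate_bps * window_s // 8     # bytes sendable this pass
        sent = []
        for prefix in list(self.pending):
            q = self.pending[prefix]
            while q and len(q[0][1]) <= budget:
                name, payload = q.popleft()
                budget -= len(payload)
                sent.append(name)
        return sent
```

The point of the sketch is that the sender never wastes a transmission window on data nobody downstream has asked for.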
macintux · 23h ago
How would that work to, say, Mars? Have satellites filling many, many orbits between the two planets?
cjtrowbridge · 23h ago
We already have an interplanetary internet called the NASA Deep Space Network. Understanding its limitations and challenges is a good way to start thinking about this.
BizarroLand · 23h ago
Nah, nothing that extreme. The broadcast range and bandwidth of even current technology in space could handle a huge amount of fairly rapid data transfer between the two planets.
It would be more like a handful of satellites, some orbiting earth, some orbiting mars, and then a handful of relay satellites serving as intermediaries.
Don't count on playing e-sports competitively, though.
The lag under ideal conditions would be insane, about 2.2 minutes each way (when the planets are "only" 40 million kilometers apart), but with repeaters and overhead probably closer to twice that.
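Back-of-envelope check of the one-way light time at that closest-approach distance:

```python
C_M_PER_S = 299_792_458  # speed of light

def light_time_minutes(km):
    # one-way light travel time for a distance given in kilometers
    return km * 1000 / C_M_PER_S / 60

print(light_time_minutes(40e6))  # ~2.2 minutes for 40 million km
```

At the far end of the orbit (roughly 400 million km) the same calculation gives about ten times that.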
macintux · 22h ago
The comment said a few light-seconds. That's a lot of hops to fill between here and Mars if you want to sustain that coverage year-round.
snickell · 15h ago
The distance between Earth and Mars varies between roughly 180 and 1,340 light-seconds.
macintux · 10h ago
But carpeting the entire volume of space between the planets with data centers every few light-seconds seems ambitious. A hundred or more data centers in space?
> throw tons of datacenter and compute that's anywhere more than a few light-seconds from the nearest existing datacenter
I think I'm misinterpreting the comment.
rippeltippel · 18h ago
Author of the initial versions of DTNPerf (iperf for DTNs) and some related papers. I moved on to other areas of SW engineering, but glad to know DTN technology is still looked after. I recently learned that ESA are looking into that as well.
No, that does not allow faster-than-light communication (which is impossible).
MadnessASAP · 19h ago
FTL communication is presumed to be impossible; it actually hasn't been proven impossible.
On the other hand, if it were shown to be possible it would be rather disruptive to many other presumptions in physics.
dodobirdlord · 19h ago
People are fairly attached to causality.
MadnessASAP · 15h ago
Well that's just it, my understanding is that FTL hasn't been proven to violate causality, or that causality is inviolable. It's just very strongly hinted at.
jdranczewski · 12h ago
In special relativity at least it's pretty clearly the case that communication outside the light cone (so faster than light) will result in events happening in the wrong order in some frames, violating causality. I will not speak of general relativity, as while I've taken a course in it, years later I have returned to considering it largely dark magic.
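The frame-ordering claim follows directly from the Lorentz transformation: for two events with separation $\Delta t$, $\Delta x$ in one frame, a frame moving at speed $v$ measures

```latex
\Delta t' = \gamma\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right),
\qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

so $\Delta t'$ changes sign once $v > c^{2}\Delta t / \Delta x$. For a spacelike separation ($\Delta x > c\,\Delta t$) that threshold is below $c$, meaning some perfectly ordinary subluminal observer sees the two events in the opposite order.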
bee_rider · 17h ago
Supposing you transmit a message to me at a prearranged time, a number. At that prearranged time I pick a number at random, and act as if it is your message.
When I eventually get your message some time later, if it turns out my random pick was wrong, I kill myself. If the many worlds interpretation is right, I should only observe universes in which I've managed to conjure up your message faster than causality, right?
Ukv · 10h ago
> If the many worlds interpretation is right, I should only observe universes in which I've managed to conjure up your message faster than causality, right?
I feel that's pairing MWI with some non-physical (or at least beyond the wave function) overarching "I" that can see across or jump between branches of the wave function, whereas I'd claim the appeal of embracing MWI is largely that the universe's wave function is all there is and observers/consciousness play no special role (along with not having nonlocal random "collapses"). The experiment would be no different than gathering a bunch of people, assigning each a number, then killing the ones that were assigned the wrong number once the real number arrives.
bee_rider · 6h ago
There isn't any jumping; it's just that, from an individual's point of view, they can't have been somebody who ended up dying.
jdranczewski · 12h ago
Long term, sure. Short term I think an unpleasant number of your parallel universe copies would observe themselves dying.
webdevver · 4h ago
star wars except it's comcast 'accidentally' destroying starlink satellite links with 'debris'
ieee-e · 23h ago
There are too many graphics (>0) and not enough monospaced font for me to take this seriously.
Steve Crocker, Vint Cerf, Jon Postel (RFC editor) and I all worked together at UCLA. I was there the day the IMP arrived. Heady days.
jibal · 21h ago
I was at UCLA with Vint Cerf ... very cool guy.
userulluipeste · 20h ago
"We work to extend terrestrial networking into solar system space..."
Minor nitpick: it's the Solar System - i.e. capitalized (since it's a proper name). The Solar System is the planetary system that we reside in, the one that has the star Sol at its center.
> When not used as a proper noun and written without capitalization, "solar system" may refer to either the Solar System itself or any system reminiscent of the Solar System.[14]
https://github.com/nasa/HDTN
https://github.com/nasa-jpl/ION-DTN
https://gitlab.com/d3tn/ud3tn
https://upcn.eu/
https://www.rfc-editor.org/rfc/rfc1.txt
https://en.wikipedia.org/wiki/Solar_System#Definition
They're solving for other solar systems too!