
A startup wants to reinvent big data, by ditching binary and putting computers in space

There’s a lot to question about the plans of ConnectX, a Los Angeles-based startup that wants to build a new type of big data system in space, running on satellites and using a novel non-binary processor instruction, but founder and CEO Lance Parker is adamant about his vision. “It will work,” he said during an interview about the company’s strategy.

To be more specific, ConnectX wants to make storing data more efficient and to speed up data analysis; space is merely the ideal real estate for doing it. The real secret sauce behind ConnectX is a new computing model that replaces binary code with symbols that Parker says can store data much more efficiently. He makes a comparison to written language, where an English sentence can sometimes be captured in a single Chinese character.
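ConnectX hasn’t published details of its symbolic encoding, but the written-language analogy maps loosely onto dictionary-based compression, where a frequently repeated phrase is replaced by a single short token. Here is a minimal Python sketch of that general idea; the phrase table and messages are invented for illustration and are not ConnectX’s actual method:

```python
# Toy dictionary coder: frequent phrases become single "symbols".
# Purely illustrative of the general idea Parker describes; ConnectX's
# actual encoding has not been made public.

phrase_table = {
    "temperature reading": "\u2460",   # symbol 1
    "battery voltage": "\u2461",       # symbol 2
    "sensor offline": "\u2462",        # symbol 3
}

def encode(text: str) -> str:
    for phrase, symbol in phrase_table.items():
        text = text.replace(phrase, symbol)
    return text

def decode(text: str) -> str:
    for phrase, symbol in phrase_table.items():
        text = text.replace(symbol, phrase)
    return text

message = "temperature reading 21.4; battery voltage 3.7; sensor offline"
packed = encode(message)
print(len(message), len(packed))   # the packed form is markedly shorter
assert decode(packed) == message
```

The principle is the same one behind ordinary compression: the more redundancy you can capture per symbol, the fewer bits you need to store or move.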

ConnectX’s initial satellites will be about the size of a cantaloupe, and Parker isn’t sure quite yet how much data each one will be able to hold. He’s hoping a fleet of them will be able to store many terabytes of data, which doesn’t sound like much but could be, if the company’s symbolic programming structure compresses data as dramatically as it hopes.

The ConnectX mission statement.

The other big technological piece of the puzzle — a really big one, in fact — is the network. Latency matters a lot when it comes to data processing, and connecting users to satellites, and satellites to each other, seems like it would be a lot slower than just connecting to a data center a few hundred miles away and to other servers a few feet away.

Parker hopes ConnectX can overcome this by relying on a new data-transfer technique similar to one that has shown promise in the lab but is far from being commercialized. That research involves twisted radio beams that can transfer data at fairly high speeds across open space and are minimally affected by atmospheric obstacles. Researchers at the University of Southern California recently moved data at 32 gigabits per second across 2.5 meters of space.
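For a sense of scale, here is a rough back-of-envelope on what 32 gigabits per second would mean for bulk data movement, assuming (a big assumption) that the lab rate held up over a real satellite link rather than 2.5 meters of bench space:

```python
# Rough transfer-time arithmetic at the USC lab rate of 32 Gbit/s.
# Assumes the rate survives over orbital distances, which the research
# has not demonstrated.

link_rate_bits_per_s = 32e9   # 32 gigabits per second
payload_bytes = 1e12          # one terabyte

seconds = payload_bytes * 8 / link_rate_bits_per_s
print(f"{seconds:.0f} s (~{seconds / 60:.1f} minutes) per terabyte")
# -> 250 s (~4.2 minutes) per terabyte
```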

Satellites, by comparison, orbit Earth at distances ranging from several hundred kilometers (e.g., the International Space Station) to tens of thousands of kilometers (e.g., communications and weather satellites). The company’s satellites, which will need to communicate with one another, could be very far apart as well.

Even if network speeds never approach those of data center interconnects or research networks, Parker said that the amount of information contained in each symbol transferred should more than make up for technically slower speeds. When it comes to networking architectures, Parker said, “We look at other big data companies as sort of moving furniture around in the same room.”
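Parker’s claim, in effect, is that effective throughput is the raw link rate multiplied by whatever compression the symbolic encoding achieves. A hedged sketch of that arithmetic, using made-up compression ratios since ConnectX has published none:

```python
# Effective throughput = raw link rate x compression ratio.
# The ratios below are hypothetical; ConnectX has not published figures.

def effective_rate_gbps(link_gbps: float, compression_ratio: float) -> float:
    """Gigabits of original (uncompressed) data moved per second."""
    return link_gbps * compression_ratio

for ratio in (1, 10, 100):
    print(f"{ratio}:1 compression -> "
          f"{effective_rate_gbps(32, ratio):.0f} Gbit of original data per second")
```

The claim stands or falls on that ratio: a modest ratio leaves a slow link slow, while a very large one could let a slower link move more useful data than a faster conventional one.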

Lance Parker. Source: ConnectX

Theoretically, customers would interact with the ConnectX service via the web much as they would with any other cloud computing service. They would upload data to the satellites, which would process it and convert it into the ConnectX symbol structure. Any analyses they then ran on the data would be sped up significantly, because the volumes would be so much smaller and the code would process faster, and results would be sent back to Earth via those twisted radio beams.
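As described, the customer-facing flow would look like any object-store-plus-analytics cloud API: upload, let the service re-encode and analyze, pull results back down. The endpoints, names, and query format below are entirely hypothetical; ConnectX has not published an API:

```python
# Hypothetical client flow for the service as described in the article.
# None of these endpoints or names come from ConnectX; they are invented
# to illustrate the workflow.

import requests  # assumes the service would expose a plain HTTPS API

BASE = "https://api.example-connectx.invalid/v1"

def upload(dataset_path: str) -> str:
    with open(dataset_path, "rb") as f:
        resp = requests.post(f"{BASE}/datasets", data=f)
    resp.raise_for_status()
    return resp.json()["dataset_id"]   # re-encoded into symbols server-side

def run_analysis(dataset_id: str, query: str) -> dict:
    resp = requests.post(f"{BASE}/datasets/{dataset_id}/queries",
                         json={"query": query})
    resp.raise_for_status()
    return resp.json()                  # results returned over the downlink

if __name__ == "__main__":
    ds = upload("sensor_logs.csv")
    print(run_analysis(ds, "SELECT avg(temp) FROM readings"))
```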

All of this could technically be done within a data center, Parker acknowledged, but space should offer cheaper real estate and lower operating costs (microsatellites typically piggyback on other space launches and don’t require staff), as well as the benefits of cold temperatures and low environmental friction. He hopes new propellant-free engine technology will help the satellites last longer than current ones do, and said the satellites will operate like a self-healing network. That would be a good thing, because no one is going to space to repair a cantaloupe-sized satellite.

“The entire system is very crucial,” he said.

But companies probably shouldn’t start planning their space-based big data strategies just yet. Parker said ConnectX hopes to launch its first satellite in about a year to test the data storage and transmission technologies. A beta launch with the full suite of services is probably two to three years out and might include 5 to 10 satellites to serve test customers. Ideally, he said, the company will be raising capital along the way so it can meet those milestones and scale its infrastructure as new customers sign up.

Perhaps sensing my skepticism, Parker noted that the folks planning to mine asteroids for natural resources (who, for the record, have yet to mine anything) have a much crazier idea than ConnectX has. “When you look at it in perspective,” he said, “it’s really not this thing that’s way out there.”

10 Responses to “A startup wants to reinvent big data, by ditching binary and putting computers in space”

  1. biggus dickus

    “There’s a lot to question about the plans of ConnectX” — “There’s” is a contraction of “there is”. Now substitute and re-read the ignorant intro sentence. It’s supposed to be “There are questions”. Go back to 3rd grade.

  2. Oh boy. Tip: launch your new binary alternative in a traditional architecture data center. If it works and takes off, this space concept might one day make sense. Doing it all at once makes zero sense and just makes it all far less likely to succeed. If you truly have something that makes big data storage and analysis more efficient, waiting 5 years to offer it to the market is ludicrous.
