ultimape πŸœπŸ’© ❌ @ultimape

The reason I found IPFS interesting is that it has the potential to act as a decentralized git-like repository for signed code. Combined with in-browser P2P systems (like Beaker Browser's DAT), it could act as a robust, de-localized distribution channel for various web initiatives while also subverting the centralized monopoly risk of hosting all of it on GitHub.
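
To make the appeal concrete, here's a toy sketch of why content addressing matters for this (plain Python, not IPFS's actual API): the address *is* the hash of the content, so a hostile mirror can't silently swap out the code it serves.

```python
import hashlib

class ContentStore:
    """Toy content-addressed store: blobs are keyed by their own hash."""

    def __init__(self):
        self._blobs = {}

    def put(self, data: bytes) -> str:
        cid = hashlib.sha256(data).hexdigest()  # stand-in for a real IPFS CID
        self._blobs[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        data = self._blobs[cid]
        # Integrity is checked on read, not assumed: you verify what you
        # fetched against the address itself, no trust in the host needed.
        assert hashlib.sha256(data).hexdigest() == cid, "content mismatch"
        return data

store = ContentStore()
cid = store.put(b"signed-release-v1.2.3.tar.gz")
print(cid, store.get(cid))
```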

I evaluated all the major players at the time. Even did historical analysis of things like SourceForge's formation and the loss of CodePlex and Google Code.

At the time, I was looking for ways to solve the firmware rot that plagued the proto-IoT industry known as networking hardware. I was looking at what could support DD-WRT-like systems at scale, while also getting around all sorts of headaches of distributing signed software to devices.
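
The verification half of that problem is the easy part on paper. A sketch of what a device-side check could look like, using the Python `cryptography` package's Ed25519 support (the key distribution and update pipeline around it are the parts that actually hurt):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def firmware_is_genuine(image: bytes, signature: bytes, vendor_pubkey: bytes) -> bool:
    """Check a detached Ed25519 signature over a firmware image.

    vendor_pubkey is the vendor's raw 32-byte public key, assumed to be
    baked into the device at manufacture time (the hard assumption).
    """
    pubkey = Ed25519PublicKey.from_public_bytes(vendor_pubkey)
    try:
        pubkey.verify(signature, image)  # raises on any mismatch
        return True
    except InvalidSignature:
        return False
```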

One of my favorite teachers had an Aibo. I saw this coming: motherboard.vice.com/en_us/art

Who wants to run a smarthouse where the hardware provider isn't supporting it - or worse, isn't even in business? redmondmag.com/Articles/2012/0

I still don't have a good solution for all of this stuff. And while there are technically options, the amount of effort and friction to get things working is way too much overhead for developers unless you've got institutional inertia behind you (basically: be a big firm).

And then on the user-facing side there are major usability concerns. There's a whole other can of worms to open about how to get trustable systems within reach of people (accessible?) who don't understand stuff like web-of-trust.
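
For anyone who hasn't bumped into it: the core of web-of-trust is just graph reachability - "do I get to this key through people I already trust, within a few hops?" A toy version (names and trust edges invented for illustration):

```python
# Who has vouched for whom (hypothetical graph).
TRUST = {
    "me": {"alice", "bob"},
    "alice": {"carol"},
    "bob": {"carol", "dave"},
    "carol": {"eve"},
}

def trusts(root: str, target: str, max_hops: int = 2) -> bool:
    """Breadth-first search of the trust graph, capped at max_hops."""
    frontier, seen = {root}, {root}
    for _ in range(max_hops):
        frontier = {n for k in frontier for n in TRUST.get(k, set())} - seen
        if target in frontier:
            return True
        seen |= frontier
    return False

print(trusts("me", "carol"))  # True: reachable in two hops
print(trusts("me", "eve"))    # False: three hops away, past the cap
```

Now try explaining to a non-technical user why "eve" came back untrusted. That's the usability wall.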

There are areas where I'm seeing progress, particularly around the fringes of the journalism security field. All the commotion around the Edward Snowden leaks seems to have lit a fire under the kinds of journalists who find themselves in risk-prone situations where privacy can be life-or-death.

They are surfacing interesting solutions like blog.airbornos.com/post/2017/0, but extending these patterns to non-web ecosystems is challenging. It often starts to hit cultural barriers / last-mile problems.

Sadly, the more I look at the situation, the more disillusioned I get. You start to see how almost all layers of our modern-day Internet (and not just the www parts) are built on top of fragile systems that weren't designed with security in mind. Everything seems like an artifact of speed and cost overruling accuracy.

I mean... we're still fighting over things like DNS over HTTPS, which is really just a kludge and a bandaid over a really shit design. en.wikipedia.org/wiki/DNS_over
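
For the curious, a DoH lookup really is just DNS smuggled over an HTTPS request. This sketch hits Cloudflare's public JSON endpoint (browsers actually speak the RFC 8484 binary wire format over POST, but the idea is the same):

```python
import json
import urllib.request

# Resolve an A record over HTTPS instead of plaintext UDP port 53.
url = "https://cloudflare-dns.com/dns-query?name=example.com&type=A"
req = urllib.request.Request(url, headers={"accept": "application/dns-json"})
with urllib.request.urlopen(req) as resp:
    answer = json.load(resp)

for rr in answer.get("Answer", []):
    print(rr["name"], rr["data"])  # the resolved addresses
```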

It's frustrating.

I read stuff like this and it doesn't even faze me anymore.

"CHIPSat was the first US mission to use end-to-end satellite operations with TCP/IP and FTP utilizing SpaceDev-developed Windows NT-based mission control softwareβ€”all running across a secure Internet link."
thespacereview.com/article/563

"As for the other CubeSat's there's a CIA-funded satellite containing a TCP/IP web server that will be testing out orbital data networking"
theregister.co.uk/2015/05/19/u

That isn't to say there aren't initiatives working toward more robust systems. NASA has been experimenting with something more reliable than TCP in high-latency situations: nasa.gov/mission_pages/station

But if you know what to look for, you can see people doing really odd things with TCP.

networkengineering.stackexchan

serverfault.com/questions/4385

networkengineering.stackexchan

And this is a fundamental internet protocol by most measures.

So, NASA's snarf&barf (store & forward) system is basically a federated network using a reliable message-passing architecture. It even runs into the same problems that IPFS faces with regard to bootstrapping its node network: nasa.gov/content/dtn-neighbor-
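
The core idea is simple enough to sketch. This toy (my own simplification, nothing to do with NASA's actual DTN code) shows what store & forward buys you: each hop takes custody of a bundle and holds it until a contact window opens, instead of assuming an end-to-end path exists the way TCP does.

```python
from collections import deque

class Node:
    """A DTN-style relay: queue bundles, forward when a contact appears."""

    def __init__(self, name: str):
        self.name = name
        self.queue = deque()  # bundles we currently have custody of

    def store(self, bundle: str):
        self.queue.append(bundle)

    def contact(self, other: "Node"):
        # A link to `other` is up right now: hand over everything we hold.
        while self.queue:
            bundle = self.queue.popleft()
            print(f"{self.name} -> {other.name}: {bundle}")
            other.store(bundle)

ground, relay, station = Node("ground"), Node("relay"), Node("station")
ground.store("telemetry-request")
ground.contact(relay)       # the relay happens to be reachable now
relay.contact(station)      # minutes later, the relay sees the station
print(list(station.queue))  # ['telemetry-request'] - delivered, eventually
```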

Seen under a different light, this is the Sybil/interloper identity problem - how ants decide who's worth regurgitating food at: mastodon.social/web/statuses/9

Some ants even use a form of TCP's windowing protocol as part of their foraging: priceonomics.com/the-independe
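
The parallel is tighter than it sounds: harvester ants throttle how many foragers go out based on how quickly foragers come back with food, which is the same feedback loop TCP uses to size its congestion window. A toy model of that dynamic (parameters invented for illustration, not fitted to the actual ant data):

```python
import random

random.seed(42)
rate = 4.0  # outgoing foragers per tick - the colony's "window size"
for tick in range(8):
    sent = int(rate)
    # Each forager finds food with some probability (resource availability).
    fed = sum(random.random() < 0.6 for _ in range(sent))
    if fed >= sent / 2:
        rate += 1.0                # returns are fast: additive increase
    else:
        rate = max(1.0, rate / 2)  # returns dried up: multiplicative decrease
    print(f"tick {tick}: sent {sent}, fed {fed}, next rate {rate:.1f}")
```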

M. Dorigo's original ant-based routing optimization designs were for robot systems.

D. Gordon is now working with NASA: "Examining how ants in diverse environments solve the problem of collective search can give insights on how different forms of collective behavior evolve. Solutions to the problem of collective search are currently of much interest in robotics, for example, to design ways that robots can use local information to perform search and rescue operations."
nasa.gov/mission_pages/station

The more I look at the situation, the more disillusioned I get. You start to see how almost all layers of our institutional landscape (and not just the technology parts) are built on top of fragile systems that weren't designed with reliability in mind. Everything seems like an artifact of speed and cost overruling accuracy.

The amount of effort and friction to get things working is too much overhead for individuals unless you've got institutional inertia behind you (basically: be NASA).

It's frustrating.