Author Topic: Can MaidSafe Decentralize the Internet?


Offline CLains

Dirvine made a great post,

Quote from: Dirvine
Interesting read. It may have some outdated info (pay to get etc.) and slight misunderstandings, but he was in contact well before BTC was known of. He is right that it is a challenge, and the reliable UDP protocol is a hard thing to get right (that is why RUDP is becoming a new project, CRUX, and is headed for the Boost libraries). We have Boost engineers on the team working on it, and Chris Kohlhoff, who designed Asio, did the initial RUDP, so it's not us doing it, it's us getting the right people involved as well.

I think the issue of speed etc. can only be measured, but there is a clear case for the approach: even if every tx takes 1 sec (per group) and there are millions of groups, then those millions of txs will happen in close to a second (say txs get close to a million messages per second, per channel). Again, measurement will help. Then if safecoin takes the integer route, it will be one message, so ...

In terms of micro-transactions, the easy solution for us in the short term is as previously outlined: pay in safecoin and the network will consume these in chunks as you put data (so you effectively buy X space at the cost at that time). This allows transactions to look like micro-transactions, as the manager group can say, OK, that is 10 million puts, we have deducted a safecoin. Further testing and algorithm detection will make even micro txs simple; it is one thing I worry less about (we never used to pay a woodsman for every axe swing :) )

So the goal is to provide speed and privacy as well as decentralisation (speed does not need to be there on day 1, but it's a factor, and I believe there is no way a centralised system will keep up as we grow the logic out, not without huge security and privacy issues). The key is to decentralise every single algorithm, and that is extremely hard (we have done this mostly). It is akin to factoring every equation to its base algorithm.

I think it is like back in school: you can have an enormous equation with many variables, and that is generally what a program does, it implements an algorithm. So you can grab nodejs or python and quickly implement the huge algorithm, or take time and do it in c++, and they are both very wrong. First the algorithm needs to be reduced, and you may find you have Z = 5X + 3Y, as an example. People can code a massive version of that equation unfactored. This is the difference between fast to market and right. I prefer to factor as much as possible. No matter what, the number of bugs is 3 per 1000 lines of code, so less code == fewer bugs, but more importantly it means you have found the more core algorithm.

When you do factor down, another thing happens: people can look at the results and say, oh, that is simple, simple code and very easy to work with. Then you win all round. Do not ignore the market though, it is essential; strike a fine balance of correctness versus speed (in our case testnet measurements tell us if we got the algorithm right no matter how un-factored [it's not even a word :) ]), do not code yourselves out of upgrade paths, and you are good to go.

This balance is the key component of a viable product (minimal or not), and I feel we are there now with the network algorithms. We have simplified them down; our remaining complexity is imagining the network running to see how the algorithms interact, and these test-nets are key. Some things cannot be measured, like ripples in the sea: if your system is designed along natural rules then you create immeasurable outputs. So you need to step further back and look at the network as a human sees it when many other humans use it.

In the last 6-7 years I have not been able to get close to the code (running around begging for money to pay for the Engineers' time to implement a dream, without their being able to speak much with me), but since Aug/Sep I have, and it is amazingly good, as there have been some step improvements and reductions in the code, and this week I intend to present even more. This happens in parallel with the testnets though, as we balance launch and correctness.

tl;dr I spoke with Dan years back and more recently in person; he is very sharp and decent. I think he is more focussed on economics whereas I am more focussed on decentralisation, so we will differ in opinion (route to improve our world) but not in respect, and that comes over well, and I hope I re-iterate it here.
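
To make the group-parallelism arithmetic in the quote concrete, here is a minimal back-of-the-envelope sketch in C++. The group count and the one-second-per-group figure are the illustrative numbers from the quote, not measured MaidSafe results.

Code:
#include <cstdint>
#include <iostream>

int main() {
    // Assumed illustration figures only (not MaidSafe measurements).
    const std::uint64_t groups = 1'000'000;   // independent close groups
    const double tx_per_group_per_sec = 1.0;  // ~1 second per tx in each group

    // Groups work in parallel, so aggregate throughput is the product.
    const double aggregate_tx_per_sec = groups * tx_per_group_per_sec;

    std::cout << "Aggregate throughput: " << aggregate_tx_per_sec
              << " tx/s across " << groups << " groups\n";
    // With a million groups this is roughly 1e6 tx/s: a million one-second
    // transactions complete in about one second of wall-clock time.
}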
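
The pre-paid micro-transaction scheme described above can be pictured as a simple counter kept per client: whole safecoin are paid up front, and the manager group debits one coin each time a block of puts is used up. A rough sketch; PutAccount and the 10-million-puts-per-safecoin rate are purely illustrative, taken only from the example in the quote.

Code:
#include <cstdint>
#include <iostream>

// Toy model of the balance a manager group might track (illustrative only).
struct PutAccount {
    std::uint64_t safecoin_balance;    // whole coins paid up front
    std::uint64_t puts_remaining = 0;  // puts already paid for

    static constexpr std::uint64_t kPutsPerSafecoin = 10'000'000;  // example rate

    // Charge one PUT; deduct a whole safecoin only when a new block is needed.
    bool charge_put() {
        if (puts_remaining == 0) {
            if (safecoin_balance == 0) return false;  // out of funds
            --safecoin_balance;                       // deduct one whole coin...
            puts_remaining = kPutsPerSafecoin;        // ...buying a block of puts
        }
        --puts_remaining;
        return true;
    }
};

int main() {
    PutAccount account{2};  // client paid 2 safecoin up front
    for (int i = 0; i < 5; ++i) account.charge_put();
    std::cout << "coins left: " << account.safecoin_balance
              << ", puts left in the current block: " << account.puts_remaining << "\n";
    // Each PUT looks like a micro-transaction to the client, yet the network
    // only ever moves whole safecoin.
}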
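
The factoring analogy can also be read literally: the relationship Z = 5X + 3Y can be coded as a sprawl of repeated steps or as the reduced expression itself. A toy contrast (the function names here are made up for illustration):

Code:
#include <cassert>

// "Unfactored": the same relationship written out long-hand,
// with repeated terms and more lines for bugs to hide in.
int z_unfactored(int x, int y) {
    int z = 0;
    z += x; z += x; z += x; z += x; z += x;  // five separate additions of x
    z += y; z += y; z += y;                  // three separate additions of y
    return z;
}

// "Factored": the same algorithm reduced to its core expression.
int z_factored(int x, int y) {
    return 5 * x + 3 * y;
}

int main() {
    // Both give the same result, but the reduced form is shorter,
    // easier to review, and easier to reason about.
    assert(z_unfactored(4, 7) == z_factored(4, 7));
    return 0;
}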

Offline xeroc

Welcome to the BitShares community and thanks for opening up this discussion with the MaidSafe community! +5%!

Offline dallyshalla

Greetings,

Thanks for a great article regarding MaidSafe's endeavors. I am not MaidSafe staff, though I am a member of the community.

It has been delightful to read. Also, regarding:

" If I have gotten anything wrong in this article I hope those more familiar with the inner workings of MaidSafe will visit bitsharestalk.org and help clarify things. I truly hope we can work together to build a viable system. "

I started a thread on our forum regarding the blog post from Bytemaster, and if you have any questions, there is a whole forum of members who are studying this software in real time: the SAFE Network from MaidSafe.


Sincerely,
Dallyshalla

https://www.maidsafe.org/t/bytemater-bitshares-daniel-larimer-opinion-on-maidsafe/2902