Author Topic: The Future of the BTSX Market Engine...  (Read 12504 times)


Offline starspirit

  • Hero Member
  • *****
  • Posts: 948
  • Financial markets pro over 20 years
    • View Profile
  • BitShares: starspirit
Does the rule that shorts are forced to cover after 30 days introduce the possibility of a short squeeze? Would it be better to set a limit on their purchases at the feed price, rather than being forced to buy at whatever price is available?

Offline starspirit

  • Hero Member
  • *****
  • Posts: 948
  • Financial markets pro over 20 years
    • View Profile
  • BitShares: starspirit
BM - Are the changes proposed here due to be implemented imminently or still being discussed? It should be clear to everybody when these final changes are effective so nobody is surprised.

Offline santaclause102

  • Hero Member
  • *****
  • Posts: 2486
    • View Profile
Quote
2) Implementing a prediction market that trades on the ERROR in the price feed.  This prediction market will then allow continuous, real time, price discovery on the blockchain by continuously discovering the % delta between the price feed and real price.  It can be speculated on without losing any exposure to BTSX and with limits that can be used to halt trading on the USD/BTSX market if the feed error is too great.   The USD/BTSX market can then use FEED_PRICE * PERCENT_ERROR.   In this way delegates are only responsible for "getting close" with their feeds and a free market will continuously update the price at which shorts can execute.
Who determines here what the "real price" is? A feed again? :) Or what the majority thinks?

Quote
  2) the price feed is slow to update and does not allow internal price discovery.
As a short-term but practically effective solution, there could be statistics in the client about how often a delegate updates his/her price feed, and how close the delegate was to the median (this would already include some prediction-market dynamics).
Maybe delegate pay could depend on reliability and price accuracy?
« Last Edit: October 03, 2014, 10:29:04 am by delulo »

Offline arhag

  • Hero Member
  • *****
  • Posts: 1214
    • View Profile
    • My posts on Steem
  • BitShares: arhag
  • GitHub: arhag
You, me and bytemaster are all agreed that linear interest is preferred to exponential continuously compounding interest.  I just have a different reason for reaching the same conclusion.

...
For me, the potential to implement the immediate-payment feature is what really tips the scales in favor of linear interest.

I agree that linear interest computations are slightly simpler to implement, and linear interest is also the approach taken by current (or planned) code.  While certainly valid points, these arguments wouldn't quite be enough to tip the scales for me.

That's cool. I guess if we all come to the same conclusion anyway it doesn't really matter. But I am curious as to why you care so much about immediate payments. BitAssets are fungible so all of these short interest payments get mixed into a buffer fund anyway. I suppose the immediate nature of charging the interest thanks to the linear interest model (based on practical engineering constraints anyway) is useful in increasing the speed at which information propagates in this negative feedback loop (shorts deciding to pay higher interest -> higher yields on BitAssets available -> increased demand of BitAssets causes more bids to go over the price feed -> shorts ease off on their high interest rates and perhaps even set prices above the feed). But I think the speed of propagation is already good enough with the 1 month term limit and practical limits on how quickly humans [1] will change demand for BitAssets based on yield rate changes.  Also, if there was a cap of 5% yearly yield on BitAssets like I hope there will be, and the yield rate was consistently saturated at this cap for a long time, the feedback loop would be temporarily broken anyway so information propagation speed would be meaningless.

[1] Also, if you are considering most of the BitAsset buy demand to be controlled by bots, then the bots have the ability to predict future BitAsset yield rate changes by examining the blockchain/database to see the interest rates set by the shorts and which shorts have yet to cover. The information propagation speed is just as fast for bots in a model where interest owed is only collected at cover time as it would be for them in the immediate-payment models. So in a market where BitAsset buy demand is dominated by bots, what exactly is the benefit of the immediate-payment models (obviously other than the technical implementation simplicity that we have already discussed, although that is a property of the linear interest model, not the immediate-payment model, which technically are orthogonal issues)? Also, the client could always tell the humans an estimated predicted yield rate change expected in the near future by examining the blockchain the same way bots do. So the advantages bots have over humans do not have to be any different in either model. Note that the accuracy of the estimated prediction of yield rate change would depend on the available computing power (in particular when the number of outstanding shorts is large), up to the limit of perfect accuracy if the computer could calculate the interest contributed by each short every block without falling behind the rate at which blocks are produced.
« Last Edit: October 03, 2014, 03:36:52 am by arhag »

Offline theoretical

For me the above is the straightforward way to solve this 'incorrectness', but:
-Is it really that easy to update the initial order on a blockchain? Or will we end up carrying far too much unnecessary info for an average interest of 0.20833%, and a difference of less than 10% of that?

The blockchain itself will just contain two things (well not really, this is just a simplification, see [1] for the truth):

- On November 1, a transaction initiating the initial short.
- On November 16, a transaction executing the partial cover.

It's up to the client to compute the new margin call price from the information in these transactions, and figure out how much BitUSD liability the position has in the future when something else happens to it (which might be another partial cover, a full cover, a margin call, or expiration).

So the November 1 transaction remains the same forever (the blockchain is basically an append-only transaction log, that's kinda the whole point of having a blockchain).  The client's database record showing the open BitUSD short positions, OTOH, is a mutable record which is updated by transactions coming from the blockchain.  So when the November 16 transaction comes in, the client updates the principal amount in the database record for the open short position.

My point was that, if you have the client handle the partial cover transaction by updating its database record using the algorithm I described, you'll cause the existing logic to do exactly the right thing for any further operations on that short position (another partial cover, full cover, margin call, expiration).  Without requiring another column in the database.

[1] It actually doesn't even contain this much.  Basically the blockchain only contains the bare minimum information to tell the client how to update its database.  If the client can figure something out on its own from existing information, then that's what the client does.  It'd just be stupid and wasteful to use precious blockchain storage to store a copy of anything we could just have each client store locally.

In a little more detail:  From the answer vikram (one of the devs) gave to one of my questions on this forum, BTSX does something called "virtual transactions."  Basically, whenever market orders are matched, the resulting transaction is generated locally by the client and never exists on the blockchain itself.  So you'd have blockchain transactions for Alice entering the short order and Bob entering the bid order.  The matching between Alice's and Bob's orders wouldn't show up on the blockchain.  Instead, the client would figure out that the prices overlap, triggering the "virtual transaction" logic, which is basically a bunch of different database updates showing that Alice now has an active short position, Alice's and Bob's BTSX has moved from their orders into the short position's collateral, Bob now has some newly printed BitUSD, and both of the orders are now gone (or maybe just one is gone, if the quantities were unequal and one side got a partial fill).  The database updates caused by virtual transactions are only visible on the blockchain insofar as they affect what transactions are accepted in future blocks; i.e., if Bob tries to send BitUSD to Caitlin, it will now be accepted because every client will have performed the database update and every client's database will show that Bob has enough BitUSD (because of the update that happened when his bid overlapped Alice's short).  Before the match, it might have failed because Bob would have had a smaller (maybe zero) BitUSD balance.
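The matching flow described above can be sketched as a toy database update. This is purely illustrative: all names, amounts, and the price convention (BitUSD per BTSX) are assumptions, not the actual BTSX schema or code, and partial fills are omitted.

```python
# Toy sketch of the "virtual transaction" idea: the blockchain carries only
# the two orders; every client independently derives the same database
# updates when the prices overlap.

db = {
    "orders": {
        "alice_short": {"btsx": 1000, "price": 0.05},  # short sell order
        "bob_bid":     {"btsx": 1000, "price": 0.05},  # bid order
    },
    "bitusd_balances": {},   # spendable BitUSD per account
    "short_positions": {},   # open shorts with their collateral
}

def apply_virtual_match(db, short_id, bid_id, short_owner, bid_owner):
    """Deterministic local database update; never itself a blockchain tx."""
    short = db["orders"].pop(short_id)
    bid = db["orders"].pop(bid_id)
    issued = bid["btsx"] * bid["price"]  # BitUSD printed into existence
    db["short_positions"][short_owner] = {
        "debt_bitusd": issued,
        # both sides' BTSX moves into the position's collateral
        "collateral_btsx": short["btsx"] + bid["btsx"],
    }
    db["bitusd_balances"][bid_owner] = issued

apply_virtual_match(db, "alice_short", "bob_bid", "alice", "bob")
# Bob now holds 50 BitUSD he can spend; Alice's short is collateralized
# with 2000 BTSX; both orders are gone.
```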

Of course there's also logic for fees, partial fills, yield, and lots of rounding going on.  Oh, and this also all has to be undoable to deal with forks, which can happen if a delegate is so late producing a block that the next delegate produces its own block, having decided that the first delegate is offline (but the first delegate did produce a block because it didn't get notified in time that the second delegate had produced a block).

In addition, the production BTSX client has to keep a copy of all previous versions of the market logic to reconstruct the database from blocks produced before the current rules came into effect.  (On testnet, they just reset everything to the genesis state when they change the market engine, because testnet XTS / BitAssets are just "play money" and we don't care about resetting their ownership.)

Knowing about all of the machinery that has to exist explains why I tend to be so patient and understanding when there are frequent updates to deal with various bugs :)
« Last Edit: October 03, 2014, 02:33:06 am by drltc »
BTS- theoretical / PTS- PZxpdC8RqWsdU3pVJeobZY7JFKVPfNpy5z / BTC- 1NfGejohzoVGffAD1CnCRgo9vApjCU2viY / the delegate formerly known as drltc / Nothing said on these forums is intended to be legally binding / All opinions are my own unless otherwise noted / Take action due to my posts at your own risk

Offline theoretical

I also think it is unnecessary to complicate the calculations to implement continuous compounding interest for the shorts.

You, me and bytemaster are all agreed that linear interest is preferred to exponential continuously compounding interest.  I just have a different reason for reaching the same conclusion.

I want to eventually implement interest from shorts being paid immediately to longs (via the yield fund), rather than waiting for shorts to close their positions.  With linear interest it should be fairly simple to do, since the network income from short interest would occur at a constant per-block rate that only changes when a short opens, closes or partially covers.  With continuous compounding interest, this immediate-payment feature is likely technically impossible (given practical engineering constraints).  For me, the potential to implement the immediate-payment feature is what really tips the scales in favor of linear interest.

I agree that linear interest computations are slightly simpler to implement, and linear interest is also the approach taken by current (or planned) code.  While certainly valid points, these arguments wouldn't quite be enough to tip the scales for me.  But they're a nice bonus when the potential for the immediate-payment feature has already tipped the scales in favor of linear interest :)

Offline tonyk

  • Hero Member
  • *****
  • Posts: 3308
    • View Profile
I am aware of the issue and know that it does cost extra for partial covering.   You end up paying interest on your interest... result: partial covering has fees associated with it that are magnified by the insane interest rate used in the example.

The challenge is calculating the result without adding yet another data field to every order... to track the start date and end date. 

I suppose I might as well store extra data rather than using a crude approximation.

I thought about this issue some more.  I think you can solve it by just figuring out what part of the payment goes to principal, and updating the initial principal.

Consider our example, where $100 short has an interest rate of 10% / month and the user covers with $52.50 at 15 days.

We said that a full cover would be $105 at 15 days, $100 principal and $5 interest.  The partial cover payment goes to principal and interest in the same 100 : 5 proportion.  A $52.50 payment would pay $50 to principal and $2.50 to interest.  (A $50 payment would pay $47.62 to principal and $2.38 to interest.)  So if the user pays $52.50, just update the initial principal to $50.  In other words, you just replace the record that says they took out a $100 short at 10% interest 15 days ago, with a "back-dated" record that says they took out a $50 short at 10% interest 15 days ago.

Then they would need $52.50 to do a full cover right now, so the numbers are right for doing a partial cover quickly followed by a full cover.  In 30 days they'll need $55 to do a full cover, and linearly in between.  The back-dated record is an accounting fiction which happens to make the numbers come out exactly right without needing another database field.

For me the above is the straightforward way to solve this 'incorrectness', but:
-Is it really that easy to update the initial order on a blockchain? Or will we end up carrying far too much unnecessary info for an average interest of 0.20833%, and a difference of less than 10% of that?
Lack of arbitrage is the problem, isn't it? And this 'should' solve it.

Offline arhag

  • Hero Member
  • *****
  • Posts: 1214
    • View Profile
    • My posts on Steem
  • BitShares: arhag
  • GitHub: arhag
I thought about this issue some more.  I think you can solve it by just figuring out what part of the payment goes to principal, and updating the initial principal.

Consider our example, where $100 short has an interest rate of 10% / month and the user covers with $52.50 at 15 days.

We said that a full cover would be $105 at 15 days, $100 principal and $5 interest.  The partial cover payment goes to principal and interest in the same 100 : 5 proportion.  A $52.50 payment would pay $50 to principal and $2.50 to interest.  (A $50 payment would pay $47.62 to principal and $2.38 to interest.)  So if the user pays $52.50, just update the initial principal to $50.  In other words, you just replace the record that says they took out a $100 short at 10% interest 15 days ago, with a "back-dated" record that says they took out a $50 short at 10% interest 15 days ago.

Then they would need $52.50 to do a full cover right now, so the numbers are right for doing a partial cover quickly followed by a full cover.  In 30 days they'll need $55 to do a full cover, and linearly in between.  The back-dated record is an accounting fiction which happens to make the numbers come out exactly right without needing another database field.

 +5% Looks good to me.

I also think it is unnecessary to complicate the calculations to implement continuous compounding interest for the shorts.

Offline theoretical

I am aware of the issue and know that it does cost extra for partial covering.   You end up paying interest on your interest... result: partial covering has fees associated with it that are magnified by the insane interest rate used in the example.

The challenge is calculating the result without adding yet another data field to every order... to track the start date and end date. 

I suppose I might as well store extra data rather than using a crude approximation.

I thought about this issue some more.  I think you can solve it by just figuring out what part of the payment goes to principal, and updating the initial principal.

Consider our example, where $100 short has an interest rate of 10% / month and the user covers with $52.50 at 15 days.

We said that a full cover would be $105 at 15 days, $100 principal and $5 interest.  The partial cover payment goes to principal and interest in the same 100 : 5 proportion.  A $52.50 payment would pay $50 to principal and $2.50 to interest.  (A $50 payment would pay $47.62 to principal and $2.38 to interest.)  So if the user pays $52.50, just update the initial principal to $50.  In other words, you just replace the record that says they took out a $100 short at 10% interest 15 days ago, with a "back-dated" record that says they took out a $50 short at 10% interest 15 days ago.

Then they would need $52.50 to do a full cover right now, so the numbers are right for doing a partial cover quickly followed by a full cover.  In 30 days they'll need $55 to do a full cover, and linearly in between.  The back-dated record is an accounting fiction which happens to make the numbers come out exactly right without needing another database field.
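The back-dating trick above is easy to sanity-check in a few lines of code, using the thread's numbers: a $100 short at 10%/month, partially covered with $52.50 at 15 days. The function and variable names here are made up for illustration; this is not client code.

```python
# Verifying the "back-dated principal" bookkeeping for linear interest.

def split_partial_cover(principal, monthly_rate, months_elapsed, payment):
    """Split a payment between principal and interest in the same
    proportion as a full cover at this instant, and return the
    back-dated principal that replaces the original record."""
    interest_due = principal * monthly_rate * months_elapsed   # $5.00
    total_due = principal + interest_due                       # $105.00
    to_principal = payment * principal / total_due             # $50.00
    to_interest = payment * interest_due / total_due           # $2.50
    return to_principal, to_interest, principal - to_principal

to_principal, to_interest, new_principal = split_partial_cover(
    100.0, 0.10, 0.5, 52.50)

# The back-dated $50 record makes further covers come out right:
full_cover_now = new_principal * (1 + 0.10 * 0.5)      # $52.50 at 15 days
full_cover_at_expiry = new_principal * (1 + 0.10 * 1)  # $55.00 at 30 days
```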

Offline arhag

  • Hero Member
  • *****
  • Posts: 1214
    • View Profile
    • My posts on Steem
  • BitShares: arhag
  • GitHub: arhag
I think this is a decent compromise, all things considered. Who knows if the 3x collateral is enough (to protect against 66% flash crashes), but there has to be some balance that we decide on, and going over too much just hurts our ability to use interest for more productive purposes. Also, if a black swan event does occur and there are more BitAssets issued than the BTSX value backing them, the DAC can simply burn the BitAssets collected as interest from shorts from that point forward until balance is restored. During that time, yields would all go to zero, but after balance is restored, yields would resume. With this policy in place, BitAssets might hold their value even through a black swan event. I think it would be even better if a reserve fund of BitAssets, with a sensible dynamic minimum limit, were held so that as much as necessary could immediately be burned to restore balance after a black swan, for the purpose of quickly restoring market confidence.

And I agree with bytemaster here:
Why is this the right amount of collateral?

I have addressed this in other places... there is no clear way to know the right collateral level, and competing for shorts based on collateral doesn't actually establish a market rate for collateral; it just causes the network to buy as much collateral as possible given the short vs USD demand.  I.e., it maximizes collateral.

The shorts don't care about finding the optimal collateral ratio to maximize long-term market confidence/desire in the system (meaning decreasing the risk of black swan events while also increasing the utility/value of BitAssets). They just care about getting to the front of the line, but with enough leverage that it is still individually profitable for them to short. This just maximizes the collateral the shorts are willing to offer under the constraint that they still believe they can profit in the short term. I don't know if we can expect the strong believers in BTSX growth (at least in the short term) to properly assess the risk of black swan failure. If we got rid of the reserve requirements and there was little competition for shorts, they might drive down the collateral ratio to something only a little higher than 134% (above the margin call limit, where 75% of collateral is needed to pay the debt). We, the shareholders of the DAC and long-term holders of BTSX, impose a minimum collateral to protect BitAsset holders against the risk of black swan because of the overeagerness of the shorts. I am not sure what the proper number is, but once we have that minimum limit I see no reason why we need to allow the market to increase it any higher for the purposes of getting shorts matched (letting shorts compete by interest rate instead makes more sense).

I am also not sure if we can devise a market mechanism to dynamically find the new optimal collateral ratio needed. Perhaps as the price (BTSX per BitAsset) of BitAsset sells (not short sells) drops further and further below the price feed, the mandated minimum collateral requirements for new shorts can increase to compensate? The reasoning behind this is that perhaps the discount in the BitAsset price is because of the risk of black swans, and so decreasing that risk through higher collateral will eventually stimulate demand for BitAssets. Anyway, I don't think any of that is really necessary. Allowing the DAC to specify a fixed minimum collateral requirement per BitAsset (and maybe later on we can have the flexibility to adjust it with shareholder approval) is good enough IMO. And the 3x collateral bytemaster proposed seems reasonable to me for now.


-If we switch to 30 day required covering and this 30 day timeframe is deemed important/needed, I really hope we don't grandfather current shorts more than a couple months if at all; the health and liquidity of the whole system is more important than some individuals' desire to keep their leveraged position... everyone else shouldn't have to wait a year to see what the end result looks like.

I'm fine with this.


remaining weaknesses are:  ...
  3) the market is asymmetric (you cannot short BTSX backed by USD)
  4) you cannot short BitGLD backed by BitUSD (or any other combo)
I don't understand the need for these things.
We have a roadmap for addressing ...
2) Implementing a prediction market that trades on the ERROR in the price feed.  This prediction market will then allow continuous, real time, price discovery on the blockchain by continuously discovering the % delta between the price feed and real price.  It can be speculated on without losing any exposure to BTSX and with limits that can be used to halt trading on the USD/BTSX market if the feed error is too great.   The USD/BTSX market can then use FEED_PRICE * PERCENT_ERROR.   In this way delegates are only responsible for "getting close" with their feeds and a free market will continuously update the price at which shorts can execute.
I don't see the need; the central market already accomplishes this in and of itself.  As it is right now, smart traders will make their trades based on the real exchange rate, not the feed, as long as they expect the feed to track toward the right answer over the long run.  If the feed is wrong you shouldn't overpay and buy expensive bitUSD.  By the same token, if the feed is wrong in the other direction you shouldn't sell bitUSD cheap, because you know the feed will eventually correct, the market will eventually correct, and 1 bitUSD will equal $1 in the long run; so if the market is inefficient, that is a profit opportunity.

I also agree with the above. I don't really see the point in shorting BTSX using a BitAsset since the market is always going to be asymmetric because of the collateral ratio requirements and the fact that everything is ultimately backed by BTSX. Although, I do see some use in being able to short one BitAsset backed by another BitAsset (but maybe there should be different minimum collateral requirements depending on which BitAsset you back with since the risk of extreme BitAsset1 / BitAsset2 volatility depends on the chosen BitAsset pairs). Actually, thinking about it more, I would say BitAssets should not be shorted into existence by anything other than BTSX. If you want to do a separate bond market backed by other BitAssets, that is fine with me.


Overall I'm not getting this and I still prefer priority by collateral, and using a bond market to manage interest rates.

A bond market allows the shorts to directly pay the bond holders the interest (which also makes it nonfungible). The DAC doesn't get in the middle of this payment and doesn't get the opportunity to decide the best way to utilize those payments (for example, maybe not all of the interest should go to BitAsset yields; maybe there should be a yield rate cap and the excess should be given to delegates/workers). Prioritizing shorts by collateral kills all of this potential revenue for the DAC for the sake of potentially excessive and unnecessary black swan risk reduction. Using a bond market in its place doesn't allow the DAC to take a cut either, as I already mentioned.

Also, what exactly is the point of the bond market in this case? It doesn't stimulate demand for BitAssets; it stimulates demand for holding bonds. It is true that it locks up extra BTSX as collateral, which can have a positive effect on the price. But I am not convinced that this BTSX wouldn't be effectively locked up anyway, as people hold a lot of their wealth as BTSX for long-term growth. I think it makes more sense to have BitAsset yields paid for by the shorts' interest. BitAssets are fungible and can be used like the money in your checking account (unlike bonds). Plus they receive much better yields than a typical bank, especially with the new market engine that allows shorts to pay interest.
« Last Edit: October 03, 2014, 01:00:53 am by arhag »

Offline zhao150

  • Hero Member
  • *****
  • Posts: 606
  • I stopped wanting to be a delegate a long time ago
    • View Profile
I stopped wanting to be a delegate a long time ago.

Offline bytemaster

With the interest rates we are talking about and the terms we are talking about, this will merely encourage people to "cover all at once" and prior to expiration.

You can "partially cover" by being short and long at the same time....

The extra fee just motivates users to be more economical with their transactions and to avoid waiting until the end of their interest period. 

The complex math involved to do it right has many issues with performance at high transaction volumes, rounding errors, etc. 

Worst case: user pays interest on their interest.  Best case: user pays simple interest.   
For the latest updates checkout my blog: http://bytemaster.bitshares.org
Anything said on these forums does not constitute an intent to create a legal obligation or contract between myself and anyone else.   These are merely my opinions and I reserve the right to change them at any time.

Offline theoretical


Let's walk through the bookkeeping of a partial cover with continuous compounding.  Basically you compute f(t), subtract the payment amount, re-figure the initial principal based on the new amount, and treat it the same as you'd treat a loan with that re-figured initial principal.  Let's imagine the user pays $52.50 on $100 at 10% monthly rate at 15 days, just like before.

- (1.10)^(0.5 months) = 1.048809 (always round up), so $104.8809 is the balance at 15 days.
- Subtract the $52.50 payment, you now have $52.3809 left.
- Solve for what the initial principal would have been on a loan with a current balance of $52.3809 at 15 days left ("implied initial principal").
- You get 52.3809 / 1.048808 (always round down here), giving $49.9433 (rounded up) as the implied initial principal.
- Now compound $49.9433 at 1.10 to get the new value of interest due at maturity (basically we un-counted the first 15 days when we did the implied initial principal computation, so we have to re-count it by doing the whole month).
- $4.9944 interest (always round up) + $49.9433 gives a total of $54.9377 due at maturity.  The collateral ratio and margin call price would then be calculated using $54.9377.
- $54.9377 + $52.50 = $107.4377 total repayment.
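The walkthrough above can be reproduced in floating point for readability (the actual proposal uses fixed-point integers with directed rounding, so the thread's figures of $49.9433 and $54.9377 are rounded up slightly from the values computed here). Names are illustrative, not client code.

```python
# Reproducing the compound-interest partial-cover bookkeeping.

def partial_cover_compound(principal, monthly_rate, months_elapsed, payment):
    growth = (1 + monthly_rate) ** months_elapsed  # 1.10^0.5, about 1.048809
    balance = principal * growth                   # about $104.8809 at 15 days
    remaining = balance - payment                  # about $52.3809
    implied_principal = remaining / growth         # about $49.9432, back-dated
    due_at_maturity = implied_principal * (1 + monthly_rate)  # about $54.9375
    return implied_principal, due_at_maturity

implied, due = partial_cover_compound(100.0, 0.10, 0.5, 52.50)
total_repaid = due + 52.50  # about $107.4375 total repayment
```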

Of course, to do this bookkeeping, you have to compute f(t) for fractional t.  Which is non-trivial with integer arithmetic.  You can make it easier by having all computations internally use a per-block interest rate; per-month or APR is strictly for the convenience of using human-friendly units when displaying values to the user.  Using fixed-point arithmetic with 32-bit fractional part, per-block interest would have a resolution of under 0.08% APR.  Increasing to 48-bit fractional part gives APR resolution of about 1.1 millionths of a percentage point -- more than good enough.  With 64-bit values you have 16 bits before we overflow, if we ever exceed 100% in some computation.
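The stated resolutions check out under the assumption of BTSX's 10-second block interval (about 3.15 million blocks per year); a quick sketch, illustrative only:

```python
# APR granularity of a 1-ulp step in a fixed-point per-block interest rate,
# assuming a 10-second block interval.

BLOCKS_PER_YEAR = 365 * 24 * 3600 // 10  # 3,153,600

def apr_resolution(fractional_bits):
    """Smallest representable change in simple APR for a fixed-point
    per-block rate with the given number of fractional bits."""
    per_block_ulp = 2.0 ** -fractional_bits
    return per_block_ulp * BLOCKS_PER_YEAR

print(apr_resolution(32))  # about 7.3e-4, i.e. under 0.08% APR
print(apr_resolution(48))  # about 1.1e-8, i.e. ~1.1 millionths of a percent
```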

You'd get some full 128-bit products in intermediate computations, but I think you mentioned in another thread you already have library functions for dealing with 128-bit values.

Offline theoretical

I am aware of the issue and know that it does cost extra for partial covering.   You end up paying interest on your interest... result: partial covering has fees associated with it that are magnified by the insane interest rate used in the example.

The challenge is calculating the result without adding yet another data field to every order... to track the start date and end date. 

I suppose I might as well store extra data rather than using a crude approximation.

If you do continuous compounding, then interest and principal dollars generate interest at the same rate and don't need to be tracked separately.  The amount payable is f(t) = initial_principal * (1+r)^t, where t is the time in months and r = 0.10 is the monthly rate (so 1+r = 1.10).

Continuous compounding is technically doable.  I'll post an example of what the numbers would look like in another post.  I would actually recommend doing continuous compounding if the story ended there.  But there's more:  Linear compounding has one highly desirable feature that continuous compounding does not.  With linear interest, you can figure out per-block short interest income for the network as a whole:  Just keep track of the sum of the shorts' per-block interest!  We could (and I think we should) pay the interest income directly to the yield fund as it accrues, rather than waiting for shorts to close.
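A sketch of what that O(1)-per-block tracking could look like: with linear interest each short contributes a constant amount per block, so one running sum, adjusted only when a short opens or (partially) covers, is enough. This is a hypothetical structure for illustration, not actual or planned BTSX code.

```python
# Per-block short interest income with a single running sum.

class ShortInterestAccumulator:
    def __init__(self):
        self.income_per_block = 0.0  # sum over all open shorts

    def on_short_opened(self, principal, rate_per_block):
        self.income_per_block += principal * rate_per_block

    def on_short_covered(self, principal_repaid, rate_per_block):
        self.income_per_block -= principal_repaid * rate_per_block

    def on_block(self, yield_fund):
        # constant work per block, no matter how many shorts are open
        yield_fund.append(self.income_per_block)

acc = ShortInterestAccumulator()
fund = []  # stand-in for the yield fund ledger
acc.on_short_opened(100.0, 0.0001)   # $100 short
acc.on_short_opened(200.0, 0.0002)   # $200 short at a higher rate
acc.on_block(fund)                   # accrues 0.01 + 0.04 = 0.05 this block
acc.on_short_covered(100.0, 0.0001)  # first short fully covers
acc.on_block(fund)                   # accrues 0.04 this block
```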

If you have a bunch of outstanding shorts using exponential compounding with different exponents and expiration dates, I think this immediate payment would be impossible for technical reasons [1] [2].

Therefore, my recommendation is to stick with linear compounding, and consider implementing paying the accrued interest to the yield fund immediately every block (not a super high priority, so we should probably testnet this first).

[1] It would theoretically be possible if we were willing to spend O(N) computation every block, where N is the number of shorts that exist.  We aren't willing to do that, as "not breaking the network" is a higher priority than "paying yield a little sooner" :)

[2] I don't think paying exact interest as it's accrued would be possible with exponential compounding, but you might be able to figure out some approximate lower bound that you only need to update when a short opens or covers.
« Last Edit: October 02, 2014, 10:28:55 pm by drltc »

Offline starspirit

  • Hero Member
  • *****
  • Posts: 948
  • Financial markets pro over 20 years
    • View Profile
  • BitShares: starspirit
Just want to check I understand the interest - so shorts compete on interest, thus have different rates, and the network collects this in the pool that longs earn through their BitYield?

Also is there a fixed date when this will all be locked in stone (for now)?

Thanks