Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - litepresence

Pages: [1] 2 3 4 5 6
General Discussion / Re: API Node Latency Map
« on: December 13, 2018, 01:05:42 am »
updated repo

- removed whitespace from uploaded images for easier incorporation into user websites
- uploaded basemap.png to repo
- added list of seed nodes to jsonbin under key 'SEEDS' with data ('ip', 'ping', 'geolocation')
- plotting seed nodes on top with yellow dots
- added UTC timestamp to images
- removed misleading green dots in Africa caused by lens flare in the base map

new sample:

this should satisfy most requirements of issue 1348

General Discussion / API Node Latency Map
« on: December 07, 2018, 10:30:21 pm »
for the data, go here:

This jsonbin api is at a stationary location.

I update the latency info hourly via python script; I'm on east coast US.

at the link you will find a json dictionary with these keys:


UNIVERSE is all known bitshares public api nodes mentioned in various github projects; I suspect this is the largest known list of bitshares public api domains

URLS are the github urls where node lists can be found; this tool does not contain a hardcoded list of nodes.  Instead it visits many different repos on github (such as bitshares, cryptobridge, rudex, etc.) where people keep configuration files with lists of nodes; if they update their repo with new nodes, the script finds them via web scraping.

LIVE is a list of tested nodes that respond within 10 seconds and have a head block number that is not stale. 

PING is a list of ping times for each LIVE node

GEO is geo location data (country, city, longitude, latitude) for each LIVE node as provided by or

COUNT is the number of live nodes vs the number of nodes in the universe

SOURCE_CODE is my github where you can get this tool and run it yourself to produce your own web-hosted latency map in a distributed fashion.  The latency script will also give you information as to why each down node failed to connect.  A full report looks like this:

You can also pick up a copy of metaNODE and ExtinctionEvent while you're there :D

UNIX and UTC are timestamps of when this jsonbin was last updated

MAP_URL is a link to a visual image of the data on the jsonbin;
the map url will change every hour!

it will be something like:

and look like one frame of this gif
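consuming the bin from your own script is straightforward; for instance, to pick the lowest-latency nodes (a sketch; `BIN_URL` is a placeholder for the jsonbin url above, and I'm assuming PING is aligned index-for-index with LIVE):

```python
import json
from urllib.request import urlopen

BIN_URL = "https://"  # placeholder: substitute the jsonbin url above

def fastest_nodes(data, n=3):
    """Return the n lowest-latency LIVE nodes as (ping, url) pairs."""
    return sorted(zip(data["PING"], data["LIVE"]))[:n]

# snapshot = json.loads(urlopen(BIN_URL).read())
# print(fastest_nodes(snapshot))
```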

source code is here, WTFPL:

todo list... upload 24hr animations instead of still images

thanks to the free and easy-to-use api services that make this project possible:


I've tightened my margins... let the scalp wars begin.

blue is dex bid/ask; tan is dex last

the margins and price line are now very dynamic on the dex, much like external cex prices for bitshares vs btc


same market, matching 24 hour period, 60 days later... bts:open.btc dex prices and book have gone from bar code to liquid

EV6 has been released

this chart is from 30 days ago
magenta/yellow is cex price
blue/white is dex price

Even without my personal EV running, the BTS:OPEN.BTC bid / ask and priceline are no longer stepwise like this image from 30 days ago.     
The margins are tight, dynamic like the cex markets... they look to be EV controlled by some other operator running a different tune from my own.

I've tightened my margins... let the scalp wars begin.


5 and released to github

lots of new niceties related to plotting and logging, improved control over scalping

Live session now rotates through 3 views every 5 minutes.

The EV7/metaNODE11 stack has been demonstrated for a 1,000,000 second (12 day) run without hiccup or user intervention; buy/sell/cancel and replace all scalp and primary orders every 5 minutes, maintaining a perfect count of 5 minute ticks during the ongoing run.

I quite literally locked a linux box in a room running the algo for 2 weeks entirely unattended; it said BTS/BTC bear market when I left.  When I returned it said bull market and had 13% more btc than the starting portfolio.  Log file clean.  8)




it ships with a 120X stock 1000 day BTS/BTC tune; my personal tune backtests over 7000X during the same period


EV6 has been released

improved scalp logic offsets thresholds above or below cex market prices based on primary indicator market cross

previously, when creating simple moving averages from cex data, there was a degree of stepwise high-frequency "wiggle" in the indications: the moving averages would oscillate up and down +/- 1 satoshi per hour, a result of the candle aggregation methods in the data arriving from cryptocompare.  I have resolved this using polynomial regression of the last 24 hours of the plotted moving average.  The bot now plots the moving average from cex data, performs the polynomial fit, then replots the moving average "smoothly"; the state machine is then fed the smoothed indication.
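the smoothing step can be sketched with numpy (a minimal sketch; the polynomial degree is a placeholder choice, not my production tune):

```python
import numpy as np

def smooth_ma(ma, degree=4):
    """Refit a stepwise moving average with a polynomial and return the
    smooth curve; the +/- 1 satoshi aggregation wiggle regresses away."""
    x = np.arange(len(ma))
    coefficients = np.polyfit(x, ma, degree)
    return np.polyval(coefficients, x)
```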

additionally, the new regression allows the moving averages to be extrapolated into the future; these moving average predictions are in yellow and are deleted and redrawn every 5 minutes; the tip of the prediction (24 hours into the future) is plotted as a single magenta dot and it stays on the chart.

finally, in the log some information about primary moving average slope and concavity is provided.
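the prediction, slope, and concavity all fall out of the same fit's derivatives; a sketch (hypothetical function name, degree, and window sizes):

```python
import numpy as np

def ma_forecast(ma, hours_ahead=24, degree=4):
    """Extrapolate a fitted moving average into the future and report
    slope and concavity at the tip via the fit's derivatives."""
    x = np.arange(len(ma))
    poly = np.poly1d(np.polyfit(x, ma, degree))
    future = np.arange(len(ma), len(ma) + hours_ahead)
    prediction = poly(future)              # the yellow curve
    slope = poly.deriv(1)(future[-1])      # rising vs falling
    concavity = poly.deriv(2)(future[-1])  # curling up vs down
    return prediction, slope, concavity
```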

you can see the older stepwise moving average here; the step size is about one satoshi

this shows new poly regressions of moving averages extending into future

this is zoomed in on the 50 day average

zoomed in further

this is data provided in the new log

General Discussion / [ANN] metaNODE = Bitshares_Trustless_Client()
« on: June 06, 2018, 01:52:45 pm »
metaNODE = Bitshares_Trustless_Client()

There were two ways to get data from the Bitshares blockchain:

- a private node that uses lots of RAM, prefers its own machine, and is technical to tend
- a public node that is difficult to stay connected to and may provide rogue data

I've created a 3rd path: connect to several random nodes in the public network, ask each for the latest market data, then move on to other nodes in the network continually.  Finally, perform statistical analysis on the combined feeds and maintain a streaming curated output file: the metaNODE.
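in miniature, the curation idea looks like this (a conceptual sketch only, not the actual metaNODE internals; the tolerance parameter is a made-up illustration):

```python
from statistics import median

def curate(samples, tolerance=0.01):
    """Reduce one feed value reported by several random public nodes to a
    single trusted value: take the median (robust to a rogue minority),
    discard anything further than `tolerance` from it, re-take the median."""
    consensus = median(samples)
    good = [s for s in samples if abs(s - consensus) <= tolerance * consensus]
    return median(good)
```

here a lone rogue node reporting a wild price simply never reaches the output.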

python script and whitepaper, including usage:

Code: [Select]

'litepresence 2018'

def WTFPL_v0_March_1765():
    if any([stamps, licenses, taxation, regulation, fiat, etat]):
        print('no thank you')
        return [tar, feathers]


1/10 the RAM usage of a personal node
99.9999 six sigma uptime
99.9999 six sigma accurate data feed
less than 5000ms latency

metaNODE is a streaming text file that contains statistically curated dex data from all public nodes for a single market on a single account.  metaNODE currently curates the following feeds:

- last
- market history
- open orders
- orderbook
- market-account balance

Run time has been demonstrated in excess of 2 weeks. 
metaNODE.txt is updated about once per second with live data.

metaNODE is entirely independent of pybitshares.
Public database calls are made with websocket-client.

This project has received funding through the DEXbot worker.  Additional funding has been earmarked for incorporation of metaNODE into future release of the DEXbot platform.

metaNODE10 has already been incorporated into the Extinction Event dex algo trading framework; learn more at

New Version of Extinction Event has been released.

The project has progressed beyond the point of "alpha" and is now officially in beta testing stage; it is "feature complete" and appears to be able to run stable long term - days on end - without user input.

From here forward, it will be dependent on the stand alone app

The latest stack is


and is available on my github

metaNODE whitepaper can be found here:

metaNODE app:

EV app:

You'll need to install my virtual environment per the instructions in my extinction-event github repo, then create a folder and put a copy of and into it.  To run live, start the metaNODE first, let it warm up fully, then you can launch an EV session for paper trading, live trading, or test orders.

General Discussion / Bitshares Database Pocket Reference - Python
« on: April 27, 2018, 08:43:56 pm »
Code: [Select]

' Bitshares Database Api Websocket Calls Pocket Reference'


'litepresence 2018'

import websocket  #pip install websocket-client

node = 'wss://' # websocket address
asset = 'BTS' # symbol
currency = 'OPEN.BTC' #symbol
start = '2017-01-01T11:22:33' # iso date
stop = '2018-01-01T11:22:33' # iso date
account_name = 'litepresence1' # string
account_names = '["xeroc", "litepresence1"]' # list of string names
account_id = '1.2.743179' # a.b.c vector
block_num = 26444444 # int block number
block_nums = [26444444, 26444445, 26444446] # list of block numbers
trx_in_block = 4 # int index of item
transaction_id = '1.7.66209049'
transaction_id_type = 'cancel'
skip_order_book = 'false' # bool - non pythonic 'true' or 'false'
limit = 5 # int depth of data called
object_id_type = '' # a.b.c vector
object_ids = []
asset_id = '1.3.0'
asset_id2 = '1.3.861'
subscribe = 'false' # bool - non pythonic 'true' or 'false'
public_key = "BTS7d4MpYzecprWMfso8f6o1Ln8fQyxuQGD5LM83PRTfQBkodk4Ck"
witness_id = '1.6.65'
trx = '{"expiration":"2018-04-27T14:16:35","extensions":[],"operation_results":[[1,"1.7.66407704"]],"operations":[[1,{"amount_to_sell":{"amount":71055254,"asset_id":"1.3.0"},"expiration":"2018-05-04T14:16:04","extensions":[],"fee":{"amount":578,"asset_id":"1.3.0"},"fill_or_kill":false,"min_to_receive":{"amount":1922091,"asset_id":"1.3.121"},"seller":"1.2.879926"}]],"ref_block_num":8413,"ref_block_prefix":4199813419}'

# '{"id":1,"method":"call","params":["database","get_ticker",["OPEN.BTC","BTS"]]}'
Z = '{"id":1,"method":"call","params":["database",'

get_objects                     = Z + ''

set_subscribe_callback          = Z + ''
set_pending_transaction_callback = Z + ''
set_block_applied_callback      = Z + ''
cancel_all_subscriptions        = Z + ''

'Blocks and transactions'
get_block_header                = Z + '"get_block_header",["%s"]]}' % block_num
get_block_header_batch          = Z + '"get_block_header_batch",["%s"]]}' % (block_nums)
get_block                       = Z + '"get_block",["%s"]]}' % block_num
get_transaction                 = Z + '"get_transaction",["%s", "%s"]]}' % (block_num, trx_in_block)
get_recent_transaction_by_id    = Z + ''

'Globals' # DONE
get_chain_properties            = Z + '"get_chain_properties",[]]}'
get_global_properties           = Z + '"get_global_properties",[]]}'
get_config                      = Z + '"get_config",[]]}'
get_chain_id                    = Z + '"get_chain_id",[]]}'
get_dynamic_global_properties   = Z + '"get_dynamic_global_properties",[]]}'

get_key_references              = Z + '"get_key_references",[["%s",]]]}' % public_key
is_public_key_registered        = Z + '"is_public_key_registered",["%s"]]}' % public_key

get_accounts                    = Z + '"get_accounts",[["%s",]]]}' % account_id
get_full_accounts               = Z + '"get_full_accounts",[["%s",],%s]]}' % (account_name, subscribe)
get_account_by_name             = Z + '"get_account_by_name",["%s"]]}' % account_name
get_account_references          = Z + '"get_account_references",["%s",]]}' % account_id
lookup_account_names            = Z + '"lookup_account_names",[%s]]}' % account_names
lookup_accounts                 = Z + '"lookup_accounts",["%s", "%s"]]}' % (account_name, limit)
get_account_count               = Z + '"get_account_count",[]]}'

get_account_balances            = Z + '"get_account_balances",["%s", [] ]]}' % (account_id)
get_named_account_balances      = Z + '"get_named_account_balances",["%s", [] ]]}' % (account_name)
get_balance_objects             = Z + ''
get_vested_balances             = Z + ''
get_vesting_balances            = Z + '"get_vesting_balances",["%s"]]}' % account_id

get_assets                      = Z + '"get_assets",[["%s",]]]}' % asset_id
list_assets                     = Z + '"list_assets",["%s","%s"]]}' % (asset, limit)
lookup_asset_symbols            = Z + '"lookup_asset_symbols",[["%s",]]]}' % asset

'Markets / feeds'
get_order_book                  = Z + '"get_order_book",["%s","%s","%s"]]}' % (currency, asset, limit)
get_limit_orders                = Z + '"get_limit_orders",["%s","%s","%s"]]}' % (asset_id, asset_id2, limit)
get_call_orders                 = Z + '"get_call_orders",["%s","%s"]]}' % (asset_id, limit)
get_settle_orders               = Z + '"get_settle_orders",["%s","%s"]]}' % (asset_id, limit)
get_margin_positions            = Z + '"get_margin_positions",["%s"]]}' % account_id
get_collateral_bids             = Z + ''
subscribe_to_market             = Z + ''
unsubscribe_from_market         = Z + ''
get_ticker                      = Z + '"get_ticker",["%s","%s","%s"]]}' % (currency, asset, skip_order_book)
get_24_volume                   = Z + '"get_24_volume",["%s","%s"]]}' % (currency, asset)
get_top_markets                 = Z + '"get_top_markets",["%s"]]}' % limit
get_trade_history               = Z + '"get_trade_history",["%s","%s","%s","%s","%s"]]}' % (currency, asset, start, stop, limit)
get_trade_history_by_sequence   = Z + ''

get_witnesses                   = Z + ''
get_witness_by_account          = Z + '"get_witness_by_account",["%s"]]}' % witness_id
lookup_witness_accounts         = Z + '"lookup_witness_accounts",["%s","%s"]]}' % (witness_id, limit)
get_witness_count               = Z + '"get_witness_count",[]]}'

'Committee members'
get_committee_members           = Z + ''
get_committee_member_by_account = Z + ''
lookup_committee_member_accounts = Z + ''
get_committee_count             = Z + '"get_committee_count",[]]}'

get_all_workers                 = Z + ''
get_workers_by_account          = Z + ''
get_worker_count                = Z + '"get_worker_count",[]]}'

lookup_vote_ids = Z + ''

'Authority / validation'
get_transaction_hex             = Z + '"get_transaction_hex",[%s]]}' % trx
get_required_signatures         = Z + ''
get_potential_signatures        = Z + '"get_potential_signatures",[%s]]}' % trx
get_potential_address_signatures  = Z + '"get_potential_address_signatures",[%s]]}' % trx
verify_authority                = Z + '"verify_authority",[%s]]}' % trx
verify_account_authority        = Z + ''
validate_transaction            = Z + '"validate_transaction",[%s]]}' % trx
get_required_fees               = Z + ''

'Proposed transactions'
get_proposed_transactions = Z + '"get_proposed_transactions",["%s"]]}' % account_id

'Blinded balances'
get_blinded_balances = Z + ''

get_withdraw_permissions_by_giver = Z + ''
get_withdraw_permissions_by_recipient = Z + ''

all_calls = [get_objects,set_subscribe_callback,set_pending_transaction_callback,set_block_applied_callback,cancel_all_subscriptions,get_block_header,get_block_header_batch,get_block,get_transaction,get_recent_transaction_by_id,get_chain_properties,get_global_properties,get_config,get_chain_id,get_dynamic_global_properties,get_key_references,is_public_key_registered,get_accounts,get_full_accounts,get_account_by_name,get_account_references,lookup_account_names,lookup_accounts,get_account_count,get_account_balances,get_named_account_balances,get_balance_objects,get_vested_balances,get_vesting_balances,get_assets,list_assets,lookup_asset_symbols,get_order_book,get_limit_orders,get_call_orders,get_settle_orders,get_margin_positions,get_collateral_bids,subscribe_to_market,unsubscribe_from_market,get_ticker,get_24_volume,get_top_markets,get_trade_history,get_trade_history_by_sequence,get_witnesses,get_witness_by_account,lookup_witness_accounts,get_witness_count,get_committee_members,get_committee_member_by_account,lookup_committee_member_accounts,get_committee_count,get_all_workers,get_workers_by_account,get_worker_count,lookup_vote_ids,get_transaction_hex,get_required_signatures,get_potential_signatures,get_potential_address_signatures,verify_authority,verify_account_authority,validate_transaction,get_required_fees,get_proposed_transactions,get_blinded_balances,get_withdraw_permissions_by_giver,get_withdraw_permissions_by_recipient,]

object_calls = []

subscription_calls = []

block_calls =      [get_block, ]

global_calls =     [get_chain_properties, ]

key_calls =        [get_key_references, ]

account_calls =    [get_accounts, ]

asset_calls =      [get_assets, ]

balance_calls =    [get_account_balances, ]

market_calls =     [get_order_book,
                    get_top_markets, ]

witness_calls =    [get_witness_by_account, ]

committee_calls =  [get_committee_count]

worker_calls =     [get_worker_count]

vote_calls = []

authority_calls =  [get_transaction_hex,

proposed_calls = []
blinded_calls = []
withdrawal_calls = []

for call in asset_calls:
    try:
        ws = websocket.create_connection(node)
        ws.send(call)
        ret = ws.recv()
        print (ret)
    except Exception as e:
        print (e.args)

# you can also make https requests in this manner
import requests
url = 'https://' # an https rpc endpoint
data = '{"id":1,"method":"call","params":["database","get_ticker",["CNY","USD"]]}'
response =, data=data)

also here:

get_named_account_balances returns an asset id without its human readable name....
and an amount - without the location of its decimal point!

this is the kind of thing that makes a neckbeard itchy

Code: [Select]
["database","get_named_account_balances",["abc123", [] ]]
[{'asset_id': '1.3.0', 'amount': 12019967}, {'asset_id': '1.3.1382', 'amount': 100}, {'asset_id': '1.3.1578', 'amount': 100000000}, {'asset_id': '1.3.2419', 'amount': 500000}, {'asset_id': '1.3.2841', 'amount': 12000}, {'asset_id': '1.3.2931', 'amount': 10000}, {'asset_id': '1.3.3248', 'amount': 2580000}, {'asset_id': '1.3.3261', 'amount': 1000}, {'asset_id': '1.3.3279', 'amount': 10000}, {'asset_id': '1.3.3389', 'amount': 1000000000}, {'asset_id': '1.3.3431', 'amount': 1000}, {'asset_id': '1.3.3530', 'amount': 100000000}, {'asset_id': '1.3.3540', 'amount': 100000000}, {'asset_id': '1.3.3830', 'amount': '25000000000'}]

we can fix this... so get the asset details

which is another api call and gives us "precision" (the location of the decimal) and the asset "symbol", ie OPEN.BTC or BTS

but then we have to merge those dictionaries, and they do not contain a common key; one has "asset_id", the other has "id"
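once merged, the human readable quantity is just the raw amount divided by 10 to the precision; for example the raw BTS entry from the dump above (12019967 at precision 5):

```python
def graphene_amount(amount, precision):
    """Convert a raw integer graphene amount to a human readable quantity."""
    return float(amount) / 10 ** precision

# the raw BTS balance from the dump above, precision 5:
print(graphene_amount(12019967, 5))  # 120.19967
```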


every call for what the asset is called and where its decimal point goes also returns a bit of spam for every shitcoin donated to your account, on every bot tick, which brings in a huge page of asset banner ads like this silliness:

Code: [Select]
'{"main":"WARNING, DANGER, RED-ALERT, PANIC! PANIC! PANIC!\\n\\nYOU HAVE BEEN INFECTED,\\n\\nZOMBIES HAVE BEEN RELEASED! \\n\\nZombies are going to appear from nowhere & infest your portfolio,\\nSome zombies will come from your friends and family,\\nAND YES there are even some sickos out there that will want to buy Zombies too :s\\n\\n\\nThese VILE undead human corpses have risen, The Mission, BURN Zombies :)\\nHelp FILL THE POOL, Zombies are BURNED when there sent to the pool.\\n\\n\\nDisclaimer: NO REAL ZOMBIES WILL BE HURT IN THE CREATION OF THIS TOKEN ;)","short_name":"Zombies",

be thankful you're not on a data plan!

nonetheless, hocus pocus account balances:

Code: [Select]
{'TURION': 0.1, 'FREECOIN': 10000.0, 'SOLOMON': 0.1, 'BIRMINGHAMGOLD': 100.0, 'ZOMBIES': 12.0, 'DECENTRALIZED': 10.0, 'ANONYMOUS': 1.0, 'BTS': 120.19967, 'BTSJON': 1000.0, 'BADCOIN': 10000.0, 'HERTZ': 0.01, 'URTHONA': 50.0, 'SEED': 258.0, 'UNIVERSAL': 25000.0}

Code: [Select]
'litepresence 2018'

import websocket  #pip install websocket-client
from ast import literal_eval as literal

def database_call(node, call):

    while 1:
        try:
            call = call.replace("'",'"') # never use single quotes
            ws = websocket.create_connection(node)
            ws.send(call)
            # 'result' key of literally evaluated
            # string representation of dictionary from websocket
            ret = literal(ws.recv())['result']
            print (ret)
            return ret
        except Exception as e:
            print (e.args)

def account_balances(node, account_name):

    Z = '{"id":1,"method":"call","params":["database",'
    # make call for raw account balances as returned by api
    get_named_account_balances = Z + '"get_named_account_balances",["%s", [] ]]}' % (account_name)
    raw_balances = database_call(node, get_named_account_balances)
    # make list of asset_id's in raw account balances
    asset_ids = []
    for i in range(len(raw_balances)):
        asset_ids.append(raw_balances[i]['asset_id'])
    # make a second api request for additional data about each asset
    get_assets = Z + '"get_assets",[%s]]}' % asset_ids
    raw_assets = database_call(node, get_assets)
    # create a common key "asset_id" for both list of dicts
    # also extract the symbol and precision
    id_sym_prec = []
    for i in range(len(raw_assets)):
        id_sym_prec.append({'asset_id': raw_assets[i]['id'],
                            'symbol': raw_assets[i]['symbol'],
                            'precision': raw_assets[i]['precision'], })
    # merge the two list of dicts with common key "asset_id"
    data = {}
    lists = [raw_balances, id_sym_prec]
    for each_list in lists:
       for each_dict in each_list:
           data.setdefault(each_dict['asset_id'], {}).update(each_dict)
    # convert back to list
    data = list(data.values())
    # create a new dictionary containing only the symbol and quantity
    ret = {}
    for i in range(len(data)):
        qty = float(data[i]['amount'])/10**float(data[i]['precision'])
        ret[data[i]['symbol']] = qty
    return raw_balances, ret

#node = 'wss://' # websocket address
node = 'wss://'
account_name = 'abc123' # string
raw_balances, balances = account_balances(node, account_name)


I'll also host this on my github

General Discussion / Re: [python-bitshares-qt] CITADEL Desktop Wallet
« on: April 14, 2018, 12:54:41 pm »
nice work! stoked to try it out!

General Discussion / Re: [ANN] microDEX - low latency minimalist UI
« on: April 09, 2018, 07:33:17 pm »
Seems very promising, looking forward to the v0.1 :D

moving up one satoshi version every few days, could be about 100,000 years before v0.1

u might want to give install a try now instead


General Discussion / Re: [ANN] microDEX - low latency minimalist UI
« on: April 07, 2018, 08:42:24 pm »
v0.00000013 uploaded

there was a small bug in buy/sell/cancel where
Code: [Select]
attempt=0 on wrong line below
Code: [Select]
