Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - litepresence

Pages: 1 2 3 4 [5] 6 7 8 9 10
Code:
If (current price > two-day moving average price) {
  feed price = current price;
} else {
  feed price = two-day moving average price;
}

this logic is not applicable to the situation at hand

there is no price on bitshares dex; there is only:

Code:
amount_to_sell

Code:
min_to_receive
so for this reason you cannot use a "greater than" operator on price vs a moving average of price to initiate action.  what then is the desired effect if we swap the market?  say BTS:CNY; how do we handle the CNY:BTS market?

what is "price greater than moving average"?

It does not work.
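A tiny numeric illustration of why the comparison flips when the market orientation is inverted (the numbers here are made up):

```python
# a market quoted one way: price sits above its two-day moving average
price, moving_avg = 0.5, 0.4
above = price > moving_avg                       # True in this orientation

# flip the market: the same data now sits BELOW the inverted average,
# so "price > moving average" triggers opposite actions per orientation
inverse_above = (1 / price) > (1 / moving_avg)   # False
print(above, inverse_above)
```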

What you are looking to do is normalize outliers in the dataset to prevent extremes.

What I would suggest in such a case:
Code:
price = qty_A[-1]/qty_B[-1]
inverse = qty_B[-1]/qty_A[-1]

if price > (2*qty_A[-2])/qty_B[-2]:
    B_scale = qty_B[-1]/qty_B[-2]
    qty_A[-1] = B_scale*2*qty_A[-2]

elif inverse > (2*qty_B[-2])/qty_A[-2]:
    A_scale = qty_A[-1]/qty_A[-2]
    qty_B[-1] = A_scale*2*qty_B[-2]

this would be a balanced solution to the issue, favoring neither currency.

It would also allow for price exploration; but not beyond two times the previous price in either direction; up or down.

I use this same mechanism to normalize my candle data when I want to optimize parameters for a finite state machine with machine learning techniques.  Clamping to less than 2X the previous value and greater than 1/2 the previous has always served me well to filter otherwise unrealistic pricing from crypto markets.
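A minimal sketch of that 2X / 1/2X normalization on a plain price series (the function name and data are illustrative, not from my scripts):

```python
import numpy as np

def normalize_outliers(prices):
    """Clamp each price to within 2x / 0.5x of the previous (clamped) price."""
    prices = np.array(prices, dtype=float)
    for i in range(1, len(prices)):
        prices[i] = min(prices[i], 2.0 * prices[i - 1])  # cap spikes
        prices[i] = max(prices[i], 0.5 * prices[i - 1])  # floor crashes
    return prices

# a 5x spike and a 9x crash get pulled back to realistic bounds
print(normalize_outliers([100, 500, 90, 10]))
```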

Technical Support / Re: Account Hijacked
« on: March 28, 2019, 02:17:45 pm »
I am documenting this issue here:

BSIP: Proposals Scam Prevention #154

General Discussion / Re: microDEX <-> metaNODE <-> manualSIGNING
« on: February 23, 2019, 10:32:53 pm »

1 week run time achieved and UI is still 100ms responsive to buy/sell/cancel requests.

I forgot to mention that all auth'd ops occur in parallel processes; so you can order several buy/sell operations prior to the first actually executing (which takes a few seconds).
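A sketch of that pattern on linux/python3 (fork start method); the broker() stand-in here is hypothetical, not the actual manualSIGNING code:

```python
from multiprocessing import Process
import time

def broker(order):
    # stand-in for an authenticated buy/sell/cancel op (hypothetical);
    # signing and broadcasting takes a few seconds in practice
    time.sleep(0.2)

orders = [{'op': 'buy'}, {'op': 'sell'}, {'op': 'cancel'}]
children = []
for order in orders:
    child = Process(target=broker, args=(order,))
    child.start()  # returns immediately, so the UI never blocks
    children.append(child)
# join with a timeout so a hung op can never freeze the interface
for child in children:
    child.join(timeout=5)
```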

Technical Support / 1000 Candles of Historical HLOC from the DEX
« on: February 17, 2019, 05:00:46 pm »

a list of dicts of human readable candles, converted to numpy arrays

Code:
from websocket import create_connection as wss  # handshake to node
from json import dumps as json_dumps
from json import loads as json_loads
import matplotlib.pyplot as plt
from datetime import datetime
from pprint import pprint
import numpy as np
import time

def public_nodes():
    # public api endpoints; this url is a placeholder, substitute your own list
    return [
        "wss://node.example.com/wss",
    ]

def wss_handshake(node):
    global ws
    ws = wss(node, timeout=5)

def wss_query(params):
    query = json_dumps({"method": "call",
                        "params": params,
                        "jsonrpc": "2.0",
                        "id": 1})
    ws.send(query)
    ret = json_loads(ws.recv())
    try:
        return ret['result']  # if there is result key take it
    except KeyError:
        return ret

def rpc_market_history(currency_id, asset_id, period, start, stop):

    ret = wss_query(["history",
                     "get_market_history",
                     [currency_id, asset_id, period,
                      to_iso_date(start), to_iso_date(stop)]])
    return ret

def chartdata(pair, start, stop, period):
    pass  # as per extinctionEVENT cryptocompare call

def rpc_lookup_asset_symbols(asset, currency):
    ret = wss_query(['database',
                     'lookup_asset_symbols',
                     [[asset, currency]]])
    asset_id = ret[0]['id']
    asset_precision = ret[0]['precision']
    currency_id = ret[1]['id']
    currency_precision = ret[1]['precision']

    return asset_id, asset_precision, currency_id, currency_precision

def backtest_candles(raw):  # HLOCV numpy arrays

    # gather complete dataset so only one API call is required
    d = {}
    d['unix'] = []
    d['high'] = []
    d['low'] = []
    d['open'] = []
    d['close'] = []
    for i in range(len(raw)):
        d['unix'].append(raw[i]['time'])
        d['high'].append(raw[i]['high'])
        d['low'].append(raw[i]['low'])
        d['open'].append(raw[i]['open'])
        d['close'].append(raw[i]['close'])
    del raw
    d['unix'] = np.array(d['unix'])
    d['high'] = np.array(d['high'])
    d['low'] = np.array(d['low'])
    d['open'] = np.array(d['open'])
    d['close'] = np.array(d['close'])

    # normalize high and low data
    for i in range(len(d['close'])):
        if d['high'][i] > 2 * d['close'][i]:
            d['high'][i] = 2 * d['close'][i]
        if d['low'][i] < 0.5 * d['close'][i]:
            d['low'][i] = 0.5 * d['close'][i]

    return d

def from_iso_date(date):  # returns unix epoch given iso8601 datetime
    return int(time.mktime(time.strptime(str(date),
                                         '%Y-%m-%dT%H:%M:%S')))

def to_iso_date(unix):  # returns iso8601 datetime given unix epoch
    return datetime.utcfromtimestamp(int(unix)).isoformat()

def parse_market_history():

    ap = asset_precision  # quote
    cp = currency_precision  # base
    history = []
    for i in range(len(g_history)):
        h = ((float(int(g_history[i]['high_quote'])) / 10 ** cp) /
            (float(int(g_history[i]['high_base'])) / 10 ** ap))
        l = ((float(int(g_history[i]['low_quote'])) / 10 ** cp) /
            (float(int(g_history[i]['low_base'])) / 10 ** ap))
        o = ((float(int(g_history[i]['open_quote'])) / 10 ** cp) /
            (float(int(g_history[i]['open_base'])) / 10 ** ap))
        c = ((float(int(g_history[i]['close_quote'])) / 10 ** cp) /
            (float(int(g_history[i]['close_base'])) / 10 ** ap))
        cv = (float(int(g_history[i]['quote_volume'])) / 10 ** cp)
        av = (float(int(g_history[i]['base_volume'])) / 10 ** ap)
        vwap = cv / av
        t = int(min(time.time(),
               (from_iso_date(g_history[i]['key']['open']) + 86400)))
        history.append({'high': h,
                        'low': l,
                        'open': o,
                        'close': c,
                        'vwap': vwap,
                        'currency_v': cv,
                        'asset_v': av,
                        'time': t})
    return history


node_id = 2
calls = 5  # number of requests
candles = 200  # candles per call
period = 86400  # data resolution
asset = 'BTS'
currency = 'USD'

# fetch node list
nodes = public_nodes()
# select one node from list and open a websocket to it
wss_handshake(nodes[node_id])
# gather cache data to describe asset and currency
asset_id, asset_precision, currency_id, currency_precision = (
    rpc_lookup_asset_symbols(asset, currency))
print(asset_id, asset_precision, currency_id, currency_precision)

full_history = []
now = time.time()
window = period * candles
for i in range((calls - 1), -1, -1):
    print('i', i)
    currency_id = '1.3.121'
    asset_id = '1.3.0'
    start = now - (i + 1) * window
    stop = now - i * window
    g_history = rpc_market_history(currency_id,
                                   asset_id,
                                   period,
                                   start,
                                   stop)
    history = parse_market_history()
    full_history += history


data = backtest_candles(full_history)

fig = plt.figure()
ax = plt.axes()
ax.tick_params(axis='x', colors='0.7', which='both')
ax.tick_params(axis='y', colors='0.7', which='both')

x = data['unix']
plt.plot(x, data['high'], color='white')
plt.plot(x, data['low'], color='white')
plt.plot(x, data['open'], color='aqua')
plt.plot(x, data['close'], color='blue')
plt.show()

The RPC call to the public node returns data in graphene format; ie no decimal places, no human readable pricing

(min_to_receive/10^receive_precision) / (amount_to_sell/10^sell_precision)

gives you human readable pricing.

crypto long, moar coinz short!

- uncle lp

Technical Support / Re: Assert Exception: min_to_receive.amount > 0:
« on: February 14, 2019, 08:53:42 pm »
on the backend there are no decimals in prices OR amounts

everything is integers in this format:

(amount_to_sell / 10^precision_of_asset1)
(min_to_receive / 10^precision_of_asset2)

each asset has a precision... if you get too close to it, ie by trying to sell dust, the amount gets rounded to zero and the order is rejected.
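A quick sketch of that integer math (the helper name and amounts here are illustrative; BTS does have precision 5):

```python
def to_graphene(amount, precision):
    # the backend stores amounts as scaled integers: amount * 10^precision
    return int(amount * 10 ** precision)

# a normal amount converts cleanly at precision 5
print(to_graphene(1.5, 5))       # 150000

# dust below the precision rounds to zero, which the chain rejects
# with "Assert Exception: min_to_receive.amount > 0"
print(to_graphene(0.000004, 5))  # 0
```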



General Discussion / Re: API Node Latency Map
« on: February 14, 2019, 07:08:50 pm »
You can still use this script on your own machine, but I am no longer keeping this live updated; my internet bandwidth is very limited.  Please contact me on telegram @litepresence if you would like to keep a public version alive and I'll point this thread at it.

my github has recently been updated to a newer version; you no longer need the basemap, the script will acquire it externally from imgur

General Discussion / microDEX <-> metaNODE <-> manualSIGNING
« on: February 14, 2019, 05:29:58 pm »

I have set out to ensure UI buy/sell/cancel ALWAYS works when I push buttons, so I never have to think about which node to pick, slow connectivity, idiosyncrasies of graphene speak, or the bugs of "full featured software".

Just on demand buy/sell/cancel.

Go to my github repo...


You get a copy of each of these and plop them in one folder

each script is about 50kb; 1500 lines; very small, quick/easy to read

all dependencies are `pip3 install` the stack does NOT require pybitshares

Code:
linux / python3


metaNODE is statistically curated public api data
you run this in a separate terminal before launching microDEX

manualSIGNING is your private authenticated ops api
microDEX imports the broker(order) method from manualSIGNING to sign transactions with a WIF key.  manualSIGNING is a purpose built fork of pybitshares for signing limit orders, about 1/40th the size of the full package.

microDEX is your buy/sell/cancel user interface
built on tkinter, it communicates with metaNODE at nanosecond speed and places orders via manualSIGNING to the fastest nodes in the network.

it runs 24/7 maintaining 100ms response time on button clicks
all buy/sell/cancel ops occur in timed-out parallel processes alongside the UI process; nothing can get hung
extremely reactive, 99.99% uptime, never rogue or stale; statistically curated public api data
when a new version is available microDEX will inform you at startup and allow an in-script update

I maintain a "heavily commented", "minimal dependencies", "minimal viable product", "pep8", and "procedural-style" script.

you can demo the platform without providing any acct details

there are also metaNODEwhitepaper and manualSIGNINGwhitepaper in my repo

enjoy, pro bono

crypto long, moar coinz short!

-uncle lp


YES! this stack does take your wif key, and performs ECDSA. 
Scammers can use these methods to take all ur cryptoz!!!
You should NOT blindly trust ANYONE to write a GUI wallet for you. 
You should read ANY script with access to your funds before giving it your keys!

General Discussion / Re: API Node Latency Map
« on: February 14, 2019, 05:17:34 pm »
What does the large pulsating circle around geographical points indicate? Activity?

the bigger the circle the faster the response time to USA east coast

General Discussion / Re: API Node Latency Map
« on: December 13, 2018, 01:05:42 am »
updated repo

- removed whitespace from uploaded images for easier incorporation into user websites
- uploaded basemap.png to repo
- added list of seed nodes to jsonbin under key 'SEEDS' with data ('ip', 'ping', 'geolocation')
- plotting seed nodes on top with yellow dots
- added UTC timestamp to images
- removed misleading green dots in africa from the lens flare in the base map

new sample:

this should satisfy most requirements of issue 1348

General Discussion / API Node Latency Map
« on: December 07, 2018, 10:30:21 pm »
for data you go here:

This jsonbin api is at a stationary location.

I update the latency info hourly via python script; I'm on east coast US.

at the link you will find json dictionary with keys:


UNIVERSE is all known bitshares public api nodes mentioned in various github projects; I suspect I have the largest known list of bitshares public api domains

URLS are github urls where node lists can be found; this tool does not contain a hardcoded list of nodes.  Instead it goes to many different repos on github (such as bitshares, cryptobridge, rudex, etc.) where people have configuration files with lists of nodes; if they update their repo with new nodes, the script finds them with web scraping methods
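The scraping step can be sketched like this; the config text and node urls below are made up for illustration, not real endpoints:

```python
import re

# sample of a raw config file as fetched from some github repo (made up)
raw_config = '''
nodes = ["wss://alpha.example.org/ws",
         "wss://beta.example.net/wss"]
api = "wss://gamma.example.com"
'''

# pull every wss:// endpoint out of the text, whatever the file format
found = sorted(set(re.findall(r'wss://[A-Za-z0-9./_-]+', raw_config)))
print(found)
```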

LIVE is a list of tested nodes that respond within 10 seconds and have a head block number that is not stale. 

PING is list of tuples of ping time for each LIVE node

GEO is geo location data (country, city, longitude, latitude) for each LIVE node, as provided by free geolocation api services

COUNT is number of live nodes vs number of nodes in universe

SOURCE_CODE is my github where you can get this tool and run it yourself to produce your own web hosted latency map in distributed fashion.  The latency script will also give you information as to why each down node failed to connect.

You can also pick up a copy of metaNODE and ExtinctionEvent while you're there :D

UNIX and UTC are time stamp when this jsonbin was last updated

MAP_URL is a link to a visual image of the data on the jsonbin;
the map url will change every hour!

it will be something like:

and look like one frame of this gif

source code is here, WTFPL:

todo list... upload 24hr animations instead of still images

thanks to the free and easy to use api services that make this project possible.


I've tightened my margins... let the scalp wars begin.

blue is dex bid/ask; tan is dex last

the margins and price line are now very dynamic on the dex, much like external cex prices for bitshares vs btc


same market, matching 24 hour period, 60 days later... bts:open.btc dex prices and book have gone from bar code to liquid

EV6 has been released

this chart is 30 days ago
magenta/yellow is cex price
blue/white is dex price

Even without my personal EV running, the BTS:OPEN.BTC bid / ask and priceline are no longer stepwise like this image from 30 days ago.     
The margins are tight, dynamic like the cex markets... they look to be EV controlled by some other operator running a different tune from my own.

I've tightened my margins... let the scalp wars begin.


74 and released to github

lots of new niceties related to plotting and logging, improved control over scalping

Live session now rotates through 3 views every 5 minutes.

The EV7/metaNODE11 stack has been running for a 1,000,000 second (12 day) stretch without hiccup or user intervention; buy/sell/cancel and replace all scalp and primary orders every 5 minutes; maintaining a perfect count of 5 minute ticks during the ongoing run.

I quite literally locked a linux box in a room running the algo for 2 weeks entirely unattended; it said BTS/BTC bear market when I left.  When I returned it said bull market and had 13% more btc than the starting portfolio.  Log file clean.  8)




it ships with a 120X stock 1000 day BTS/BTC tune; my personal tune backtests over 7000X during the same period


EV6 has been released

improved scalp logic offsets thresholds above or below cex market prices based on primary indicator market cross

previously, when creating simple moving averages from cex data, there was a degree of stepwise high frequency "wiggle" in the indications: the moving averages would oscillate up and down +/- 1 satoshi per hour, a result of candle aggregation methods in the data arriving from cryptocompare.  I have resolved this using polynomial regression of the last 24 hours of plotted moving average.  So now the bot plots the moving average from cex data, performs the poly fit, then replots the moving average "smoothly"; the state machine is then fed the smoothed indication.

additionally, the new regression allows for moving averages to be extrapolated into the future; these moving average predictions are in yellow and are deleted and redrawn every 5 minutes; the tip of the prediction (24 hours into the future) is plotted as a single magenta dot and stays on the chart.

finally, in the log some information about primary moving average slope and concavity is provided.
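A sketch of that smoothing step with numpy; the polynomial degree, window, and wiggle data below are illustrative, not my actual tune:

```python
import numpy as np

# hours 0..23: a moving average with +/- 1 unit stepwise "wiggle" (made up data)
hours = np.arange(24, dtype=float)
wiggly_ma = 100.0 + 0.5 * hours + np.where(hours % 2 == 0, 1.0, -1.0)

# fit a low-degree polynomial over the last 24 hours of the plotted average
coefs = np.polyfit(hours, wiggly_ma, deg=2)
poly = np.poly1d(coefs)

smooth_ma = poly(hours)               # replot the average "smoothly"
forecast = poly(hours[-1] + 24)       # extrapolate the tip 24 hours ahead
slope = poly.deriv(1)(hours[-1])      # slope of the primary moving average
concavity = poly.deriv(2)(hours[-1])  # and its concavity, for the log
print(round(forecast, 2), round(slope, 2))
```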

you can see the older stepwise moving average here; the step size is about one satoshi

this shows new poly regressions of moving averages extending into future

this is zoomed in on the 50 day average

zoomed in further

this is data provided in the new log
