Exporting Nexo Pro Data

Some notes on exporting Nexo Pro data.

Table of Contents

  • Official Activity Export
    • Workaround for to date bugs
  • Using the JSON API
    • Converting JSON -> CSV for CoinTracking import
  • See Also

Official Activity Export

Firstly, I would recommend using the official Activity -> Export page if possible.

Though it currently has an off-by-1 bug on the to date (see the workaround section below), and it also doesn't give good options for exporting our full 'all time' data (even using the workaround below, I was getting a 500 - Internal Server Error when trying).

Though there is another way, where we can make use of the same JSON API powering the website's main activity pages (see the section below).

Workaround for to date bugs

Currently there is a bug in the export page that means if you're trying to export data for today, you may not get all of that day's transactions, due to how it sets the end date. You can work around this with the Chrome DevTools debugger. Since this is in webpacked code, the exact references and variable names will likely change, but this should still help you find what you need.

You want to find code that looks somewhat like this, which currently lives within one of the Webpack chunk files (searching for /api/v1/export in the DevTools 'Search' tab is a quick way to locate it):

// ..snip..
  , ka = __webpack_require__(35662).Z.create({ // axios-like client for the export API
  baseURL: "/api/v1/export",
  withCredentials: !0
})
  , Oa = function(e) { // list existing exports (paged)
  return ka.get("/", {
    params: {
      page: e.current,
      size: e.pageSize
    }
  })
}
  , Pa = function(e) { // create a new export (this is the one we want)
  return ka.post("/", e)
}
  , Ta = function(e) { // likely: re-run/refresh an existing export
  return ka.put("/".concat(e.id))
}
  , Aa = function(e) { // likely: delete an existing export
  return ka.delete("/".concat(e.id))
}
// ..snip..

Set a breakpoint on the return ka.post("/", e) line, then select your export 'Type' and 'Date Range' as normal from the Nexo Pro Activity Export page, and click 'Submit'.

The code should stop at this breakpoint, where you have access to the current scope, including the e variable, which looks somewhat like this:

  • Spot
    • {
        "type": "Spot",
        "from": "2025-01-13T13:00:00.000Z",
        "to": "2025-01-14T13:00:00.000Z"
      }
  • Futures
    • {
        "type": "Futures",
        "from": "2025-01-13T13:00:00.000Z",
        "to": "2025-01-14T13:00:00.000Z"
      }
  • Interest
    • {
        "type": "Interest",
        "from": "2025-01-13T13:00:00.000Z",
        "to": "2025-01-14T13:00:00.000Z"
      }
  • Deposits & Withdrawals
    • {
        "type": "DepositsWithdrawals",
        "from": "2025-01-13T13:00:00.000Z",
        "to": "2025-01-14T13:00:00.000Z"
      }

I'm in Australia, where we are currently on AEDT (UTC+11). I selected my end date as 2025-01-15 (today). Looking at the to date above, notice how it is 2025-01-14T13:00:00.000Z, which isn't going to include any of today's data. This is because it is set to the equivalent of new Date('2025-01-15T00:00:00+11:00'), as you can see here:

new Date('2025-01-15').toJSON()
// '2025-01-15T00:00:00.000Z'

new Date('2025-01-15T00:00:00+11:00').toJSON()
// '2025-01-14T13:00:00.000Z'

Even if this timezone bug didn't exist, there would still be an off-by-1 error: if I select an end date of 2025-01-15, I usually want to include all of the transactions on that day as well. To fix it, we actually need to do something more like this:

new Date('2025-01-15T23:59:59.999+11:00').toJSON()
// '2025-01-15T12:59:59.999Z'

Or to make it simpler:

new Date('2025-01-16T00:00:00+11:00').toJSON()
// '2025-01-15T13:00:00.000Z'
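
If you wanted to compute that exclusive end-of-day boundary generically (rather than hardcoding the +11:00 offset), a small helper like this works in the DevTools console (a sketch; this isn't from the page's own code):

// Returns local midnight at the start of the *next* day; the exclusive
// upper bound that includes all of the selected day's transactions:
const endOfDayExclusive = (date) => {
  const next = new Date(date);
  next.setHours(24, 0, 0, 0); // setHours(24) rolls over to the next local midnight
  return next;
};

endOfDayExclusive(new Date('2025-01-15T00:00:00+11:00')).toJSON()
// '2025-01-15T13:00:00.000Z' (when run on a system in UTC+11)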

Though since new Date() with no arguments just returns the current moment, and because presumably any transactions I am trying to export would have happened before I run the export, I can just do:

new Date().toJSON()
// '2025-01-15T02:37:31.337Z'

Getting back to our debugger session, this means we can just set e.to to new Date():

e.to = new Date()

Then resume the script execution to allow the export request to proceed, after which you should be able to download the export from the website as normal once it completes.
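
If you also wanted to widen the start of the range while paused there, the same trick works on e.from (the specific date below is just a placeholder); though as noted earlier, very wide 'all time' ranges were giving me 500 errors:

// While still paused at the `return ka.post("/", e)` breakpoint:
e.from = new Date('2024-01-01') // placeholder start date; use whatever you need
e.to = new Date()               // up to 'now', so today's transactions are included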

Using the JSON API

If the official export doesn't work for your needs, there is another way that may help, where we can make use of the same JSON API powering the website's main activity pages.

You could manually click through each of the activity pages and sniff the API requests from the Chrome DevTools Network tab, or a capturing proxy or similar; but if you want a semi-automated way to do this outside of a browser, you can use my paginate-fetch helper script, which wraps curl / restish and adds extra functionality for fetching paged content.

First you will need to browse one of the activity pages with Chrome DevTools open, then go to the 'Network' tab and find the relevant request, which may look something like:

GET https://pro.nexo.com/api/v1/order/depositAndWithdraw?page=1&size=10

Then you can right-click on that entry, and select 'Copy' -> 'Copy as cURL', to get something like this:

curl 'https://pro.nexo.com/api/v1/order/depositAndWithdraw?page=1&size=10' \
  -H 'accept: application/json, text/plain, */*' \
  -H 'accept-language: en-AU,en-GB;q=0.9,en;q=0.8' \
  -H 'cache-control: no-cache' \
  -H 'cookie: ajs_anonymous_id=REDACTED; referer=; OptanonConsent=REDACTED; OptanonAlertBoxClosed=REDACTED; nsi=REDACTED; esi=REDACTED; connect.sid=REDACTED; __cf_bm=REDACTED' \
  -H 'pragma: no-cache' \
  -H 'priority: u=1, i' \
  -H 'referer: https://pro.nexo.com/activity/deposit-and-withdraw' \
  -H 'sec-ch-ua: "Google Chrome";v="131", "Chromium";v="131", "Not_A Brand";v="24"' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'sec-ch-ua-platform: "macOS"' \
  -H 'sec-fetch-dest: empty' \
  -H 'sec-fetch-mode: cors' \
  -H 'sec-fetch-site: same-origin' \
  -H 'user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36'

We don't actually need to pass most of those headers, nor most of the cookies. The relevant cookies seemed to be connect.sid and __cf_bm, along with the referer header.
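
Stripped down to just those pieces, a minimal curl request looks something like this (substitute your own REDACTED cookie values):

curl --silent \
  'https://pro.nexo.com/api/v1/order/depositAndWithdraw?page=1&size=10' \
  -H 'referer: https://pro.nexo.com/activity/deposit-and-withdraw' \
  -H 'cookie: connect.sid=REDACTED; __cf_bm=REDACTED'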

Knowing that (and some details I inferred from looking at the shape of the URL + response JSON), we can now construct a more minimal command using my paginate-fetch helper. We then pipe the result into jq to convert each timestamp in the response JSON from something like 1736908651337 into something far more human friendly like 2025-01-15T02:37:31Z (note that todate drops the milliseconds; see the note in the CSV section below), and finally output it all to a .json file:

WARNING: I'm not sure if you will be able to access the full history by default, just by paging through these URLs as normal. You may need to add parameters like &from=1577836800000&to=1767225600000 (which is from: new Date('2020-01-01').valueOf(), to: new Date('2026-01-01').valueOf()) to set the desired range; see the example after the command below.

paginate-fetch \
  --debug \
  --page-param='page' \
  --count-param='size' \
  --total-key='.totalOrders' \
  --array-key='.orders' \
  'https://pro.nexo.com/api/v1/order/depositAndWithdraw' -- -H 'referer: https://pro.nexo.com/activity/deposit-and-withdraw' -H 'cookie: __cf_bm=REDACTED; connect.sid=REDACTED' \
  | jq '[.[] | .timestamp |= (./1000 | todate)]' \
  > 20250115-nexo-pro-deposits-and-withdrawals-all-time.json
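
For example, applying the WARNING above, the from/to parameters can just be appended to the URL argument (this assumes paginate-fetch merges its page/size parameters into an existing query string; adjust if not):

paginate-fetch \
  --debug \
  --page-param='page' \
  --count-param='size' \
  --total-key='.totalOrders' \
  --array-key='.orders' \
  'https://pro.nexo.com/api/v1/order/depositAndWithdraw?from=1577836800000&to=1767225600000' -- -H 'referer: https://pro.nexo.com/activity/deposit-and-withdraw' -H 'cookie: __cf_bm=REDACTED; connect.sid=REDACTED' \
  | jq '[.[] | .timestamp |= (./1000 | todate)]' \
  > 20250115-nexo-pro-deposits-and-withdrawals-all-time.json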

We could then repeat a similar process for any of the other API endpoints we want to get data from, e.g.:

  • Spot
    • paginate-fetch \
        --debug \
        --page-param='page' \
        --count-param='size' \
        --total-key='.totalOrders' \
        --array-key='.orders' \
        'https://pro.nexo.com/api/v1/order/history/spot' -- -H 'referer: https://pro.nexo.com/activity/spot' -H 'cookie: connect.sid=REDACTED; __cf_bm=REDACTED' \
        | jq '[.[] | .timestamp |= (./1000 | todate)]' \
        > 20250115-nexo-pro-spot-all-time.json
  • Dust
    • paginate-fetch \
        --debug \
        --page-param='page' \
        --count-param='size' \
        --total-key='.totalDustConversions' \
        --array-key='.dustConversions' \
        --start-page=0 \
        'https://pro.nexo.com/api/v1/order/dust-conversions-history' -- -H 'referer: https://pro.nexo.com/activity/dust-convert' -H 'cookie: connect.sid=REDACTED; __cf_bm=REDACTED' \
        > 20250115-nexo-pro-dust-conversions-all-time.json
    • curl --silent \
        'https://pro.nexo.com/api/v1/order/sub-dusts?batchId=00000000-0000-0000-0000-000000000000' -H 'referer: https://pro.nexo.com/activity/dust-convert' -H 'cookie: connect.sid=REDACTED; __cf_bm=REDACTED' \
        | jq \
        > 20250115-nexo-pro-dust-conversions-batch-00000000-0000-0000-0000-000000000000-all-time.json
  • Etc.
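
If you had multiple dust conversion batches, you could also loop the sub-dusts call from the Dust example above over each batch id extracted from the dust conversions export. A rough sketch (the .id field name is an assumption; check the shape of your actual JSON first):

# The batch id field name (.id) is an assumption; inspect your
# dust conversions JSON to confirm what it's actually called.
for batch in $(jq -r '.[].id' 20250115-nexo-pro-dust-conversions-all-time.json); do
  curl --silent \
    "https://pro.nexo.com/api/v1/order/sub-dusts?batchId=${batch}" \
    -H 'referer: https://pro.nexo.com/activity/dust-convert' \
    -H 'cookie: connect.sid=REDACTED; __cf_bm=REDACTED' \
    | jq \
    > "20250115-nexo-pro-dust-conversions-batch-${batch}.json"
done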

Another way to find potentially relevant API paths (but not which part of the site triggers them) is by using the Chrome DevTools 'Search' tab and searching for something like /api/v1/order, which gave the following results:

  • /api/v1/order/trading-fees
  • /api/v1/order/open
  • /api/v1/order/closed
  • /api/v1/order/history/spot
  • /api/v1/order/history/twaps
  • /api/v1/order/closed/twaps
  • /api/v1/order/open/twaps
  • /api/v1/order/closed/twap-by-deal-id
  • /api/v1/order/depositAndWithdraw
  • /api/v1/order/history/interest
  • /api/v1/order/closed/twap/orders
  • /api/v1/order/closed/twap/summary
  • /api/v1/order/dust-conversions-history
  • /api/v1/order/sub-dusts
  • /api/v1/order/limits
  • /api/v1/order/pass-through-limits
  • /api/v1/order/futures-limits
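
If you wanted to script fetching several of those endpoints in one go, a small bash loop over endpoint/key tuples works. A rough sketch (the total/array keys and referer path for history/interest are assumptions based on the naming patterns above; substitute your own cookies):

#!/usr/bin/env bash
# Fetch several endpoints in one go.
# Each tuple is: path|total-key|array-key|referer-path
# (the history/interest keys are assumptions based on the pattern above)
COOKIE='connect.sid=REDACTED; __cf_bm=REDACTED'

while IFS='|' read -r path total array referer; do
  paginate-fetch \
    --page-param='page' \
    --count-param='size' \
    --total-key="$total" \
    --array-key="$array" \
    "https://pro.nexo.com/api/v1/order/${path}" -- \
      -H "referer: https://pro.nexo.com/activity/${referer}" \
      -H "cookie: ${COOKIE}" \
    | jq '[.[] | .timestamp |= (./1000 | todate)]' \
    > "20250115-nexo-pro-${path//\//-}.json"
done <<'EOF'
depositAndWithdraw|.totalOrders|.orders|deposit-and-withdraw
history/spot|.totalOrders|.orders|spot
history/interest|.totalOrders|.orders|interest
EOF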

Converting JSON -> CSV for CoinTracking import

If you want to convert the spot history JSON -> CSV to import into CoinTracking or similar, you can use the following script:

  • WARNING: The JSON doesn't contain the same integer id that the official export uses, so it's not possible to construct an id that will perfectly match the default CSV entries; you may get duplicates on import if you use this method.
  • NOTE: The timestamp in the JSON doesn't have millisecond precision like the CSV export does. This probably won't make a difference, but I figured it's worth noting in case it does.
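
For reference, here's where that precision loss comes from: the todate conversion in the earlier pipelines drops the milliseconds (shown here against the example timestamp from earlier):

jq -n '1736908651337 / 1000 | todate'
# "2025-01-15T02:37:31Z"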

This version of the script matches the original CSV more closely, in that it doesn't quote the cells:

jq -r '
  # Convert an ISO timestamp like "2025-01-15T02:37:31Z" into "2025-01-15 02:37:31"
  def formatTimestamp($ts): ($ts | sub("T"; " ") | sub("Z"; ""));

  # Format $num with exactly $digits digits after the decimal point,
  # right-padding with zeros (and truncating any extra decimals); the "."
  # is added separately in both branches so the zero count stays correct
  def toPrecision($num; $digits):
    ($num | tostring |
    if contains(".") then
      (.[0:index(".")] + "." + (.[index(".")+1:] + "00000000000000000")[0:$digits])
    else
      . + "." + ("00000000000000000"[0:$digits])
    end);

  (
    ["id","timestamp","pair","side","type","price","executedPrice","triggerPrice","requestedAmount","filledAmount","tradingFee","feeCurrency","status","orderId"]
  ),
  (
    .[] | [
      .id,
      formatTimestamp(.timestamp),
      .pair,
      .side,
      .type,
      toPrecision(.price; 8),
      toPrecision(.price; 8), # executedPrice: filled from .price, as the JSON does not appear to have a separate field
      .triggerPrice,
      .requestedAmount,
      toPrecision(.filledAmount; 8),
      toPrecision(.tradingFee; 16),
      .feeCurrency,
      .status,
      .orderId
    ]
  ) | join(",")
' 20250115-nexo-pro-spot-all-time.json > 20250115-nexo-pro-spot-all-time.csv

Whereas if we want something more robust, we can swap | join(",") on the second-to-last line for | @csv, as in this version; the string cells will then all be quoted (and any embedded quotes escaped):

jq -r '
  # Convert an ISO timestamp like "2025-01-15T02:37:31Z" into "2025-01-15 02:37:31"
  def formatTimestamp($ts): ($ts | sub("T"; " ") | sub("Z"; ""));

  # Format $num with exactly $digits digits after the decimal point,
  # right-padding with zeros (and truncating any extra decimals); the "."
  # is added separately in both branches so the zero count stays correct
  def toPrecision($num; $digits):
    ($num | tostring |
    if contains(".") then
      (.[0:index(".")] + "." + (.[index(".")+1:] + "00000000000000000")[0:$digits])
    else
      . + "." + ("00000000000000000"[0:$digits])
    end);

  (
    ["id","timestamp","pair","side","type","price","executedPrice","triggerPrice","requestedAmount","filledAmount","tradingFee","feeCurrency","status","orderId"]
  ),
  (
    .[] | [
      .id,
      formatTimestamp(.timestamp),
      .pair,
      .side,
      .type,
      toPrecision(.price; 8),
      toPrecision(.price; 8), # executedPrice: filled from .price, as the JSON does not appear to have a separate field
      .triggerPrice,
      .requestedAmount,
      toPrecision(.filledAmount; 8),
      toPrecision(.tradingFee; 16),
      .feeCurrency,
      .status,
      .orderId
    ]
  ) | @csv
' 20250115-nexo-pro-spot-all-time.json > 20250115-nexo-pro-spot-all-time.csv
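
As a quick sanity check on the output, the header row is emitted by the script itself, so for the join(",") version the first line should be exactly:

head -1 20250115-nexo-pro-spot-all-time.csv
# id,timestamp,pair,side,type,price,executedPrice,triggerPrice,requestedAmount,filledAmount,tradingFee,feeCurrency,status,orderId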

See Also

My Other Related Deepdive Gists and Projects
