It streams all tables (excluding archived ones) into Google BigQuery.
This will start a server that streams data from sources such as MongoDB Atlas, S3, and Azure Blob Storage into Google BigQuery on a schedule, e.g. every day at midnight or every hour. You don't need anything other than this server. When the ETL process completes, the server shuts itself down, so if you want to add more tables you'll need to start it back up. Note that if the server is already running when the ETL process is scheduled to begin, it will restart.
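As a concrete illustration, the two example schedules mentioned above look like this in standard cron syntax (hypothetical: the command path is a placeholder, and the server may use its own schedule format rather than cron):

```shell
# Run the ETL at midnight every day (hypothetical command path)
0 0 * * * /path/to/etl-server

# ...or alternatively, at the top of every hour
0 * * * * /path/to/etl-server
```

The five fields are minute, hour, day of month, month, and day of week; `*` means "every".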
- Create a fast, high-memory Google Cloud VM instance (https://console.cloud.google.com/compute/instances) named `slamdata-bigquery`, with access scopes set to "Allow full access to all Cloud APIs". In the "Management, security, disks, networking, sole tenancy" section, under "Networking", add the network tag `my-ip-web`, then edit the network interface and create an external IP address named `external`. Make a note of this IP address.
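The console steps above can also be sketched with the `gcloud` CLI. The machine type, region, and zone below are assumptions; substitute whatever suits your workload:

```shell
# Reserve a static external IP address named "external"
# (region is an assumption)
gcloud compute addresses create external --region=us-central1

# Create a high-memory instance with full Cloud API access and the
# my-ip-web network tag, attaching the reserved address
# (zone and machine type are assumptions)
gcloud compute instances create slamdata-bigquery \
  --zone=us-central1-a \
  --machine-type=n1-highmem-8 \
  --scopes=cloud-platform \
  --tags=my-ip-web \
  --address=external

# Print the external IP so you can make a note of it
gcloud compute instances describe slamdata-bigquery \
  --zone=us-central1-a \
  --format='get(networkInterfaces[0].accessConfigs[0].natIP)'
```

`--scopes=cloud-platform` is the CLI equivalent of the "Allow full access to all Cloud APIs" console option.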