Here is the problem I'm trying to solve. You have some code running in AWS or on some far-away server rack, and your only access to the machine in question is something like SSH or telnet. The code is running a bunch of jobs as part of a CI/CD system. You would like to kick off a job where the code runs under RemotePdb so that you can step through it, examine variables, set breakpoints, and so on, without disturbing any other jobs running at the same time.
You can't change the code: you don't have time to get a merge request approved, and it doesn't make sense to open one just to facilitate what might be a very brief, one-time debugging session. You need some kind of hooks already in your production code that make this remote debugging feasible without a fresh push to your CI/CD stack.
I had the foresight to add a "back channel" JSON parameter to the test parameters in the CI/CD system, which goes to all the pieces of code I care about. This metadata is on a per-job basis: it only affects a targeted job while all the other simultaneously running jobs are unaffected.
Given a piece of back channel JSON like this:
{"debug": "scan", "target": "/path/to/foo.py:bar"}
the goal is that when we get to the "scan" step in the CI/CD job, we should set a breakpoint in file "foo.py" at the bar function. As we enter that function, we should kick off a RemotePdb session so that I can telnet in and run a debugging session with my code.
If there is no back channel JSON, or it is asking for something else, the code should run without a debugger to avoid any performance hit.
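As a rough sketch of how a step might consume that back channel, assuming the JSON arrives as a string and that the current step name and a set_up_remote_pdb helper (described below) are available; the function and parameter names here are illustrative, not the system's real API:

```python
import json


def maybe_enable_debugging(back_channel_json, current_step):
    """Turn on remote debugging only when the back channel asks for it."""
    if not back_channel_json:
        # No back channel at all: run exactly as before, with no debugger overhead.
        return

    back_channel = json.loads(back_channel_json)

    # Only act when the back channel names the step we are about to run,
    # e.g. {"debug": "scan", "target": "/path/to/foo.py:bar"}.
    if back_channel.get("debug") == current_step:
        set_up_remote_pdb(back_channel["target"])
```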
In the system, the set_up_remote_pdb function is passed the target information from the JSON back channel.
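Here is a minimal sketch of what set_up_remote_pdb could look like, assuming the remote-pdb package is installed and the target string has the "/path/to/foo.py:bar" form; the host, port, and the module-patching approach are my assumptions, not necessarily how the real system hooks the function:

```python
import functools
import sys

from remote_pdb import RemotePdb  # pip install remote-pdb


def set_up_remote_pdb(target, host="0.0.0.0", port=4444):
    """Wrap the function named by target ("/path/to/foo.py:bar") so that
    entering it opens a RemotePdb session listening on host:port."""
    path, _, func_name = target.partition(":")

    # Look for an already-imported module whose source file matches the
    # target path, then swap the named function for a wrapper that starts
    # the remote debugger on entry.
    for module in list(sys.modules.values()):
        if getattr(module, "__file__", None) == path and hasattr(module, func_name):
            original = getattr(module, func_name)

            @functools.wraps(original)
            def wrapper(*args, **kwargs):
                # Execution blocks here until a client telnets to host:port.
                RemotePdb(host, port).set_trace()
                return original(*args, **kwargs)

            setattr(module, func_name, wrapper)
            return
```

Once the wrapped function is hit, telnet to the job's host on that port and you get an ordinary pdb prompt, where you can step into the original function and poke at its variables.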