multiprocessing.managers
cpython 3.14 @ ab2d84fe1023/Lib/multiprocessing/managers.py
Shared object managers for multiprocessing. BaseManager spawns a dedicated server process that owns the real Python objects. Client processes receive proxy objects whose method calls are serialized over a socket, dispatched by the server, and the results sent back. SyncManager (the standard concrete subclass) registers Queue, Lock, Event, Semaphore, BoundedSemaphore, Value, Array, dict, list, and Namespace as shared proxies. AutoProxy inspects an arbitrary object and generates a proxy class for it on demand, at the time the shared object is created (not at import time).
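The round trip described above can be exercised through the high-level API. A minimal sketch, assuming nothing beyond the stdlib (the `worker` helper is illustrative, not part of the module): each mutation of the DictProxy is serialized, executed in the manager's server process, and the result sent back.

```python
import multiprocessing

def worker(shared, key):
    # each mutation here is a round trip to the manager's server process
    shared[key] = key * 2

if __name__ == "__main__":
    with multiprocessing.Manager() as manager:   # starts the server process
        shared = manager.dict()                  # returns a DictProxy
        procs = [multiprocessing.Process(target=worker, args=(shared, i))
                 for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(sorted(shared.items()))            # [(0, 0), (1, 2), (2, 4), (3, 6)]
```

Note that `with multiprocessing.Manager()` both starts and, on exit, shuts down the server process, so proxies must not be used after the block ends.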
Reading
BaseManager: starting the server process
BaseManager.__init__ stores an address and an authkey. Calling start() (or entering the manager as a context manager) launches a child process, using the context's start method, that runs Server.serve_forever. The child binds a socket at the given address, registers each shared-type factory, and then loops on accept. The parent's BaseManager instance connects back to that socket and stores a connection.Client handle.
# Lib/multiprocessing/managers.py (CPython 3.14, simplified)
class BaseManager:
    def start(self, initializer=None, initargs=()):
        self._state.value = State.STARTING
        self._process = self.Process(
            target=type(self)._run_server,
            args=(self._registry, self._address, self._authkey,
                  self._serializer, self._writer,
                  initializer, initargs),
        )
        self._process.daemon = True
        self._process.start()
        self._writer.close()
        # wait for the server to signal it is ready
        self._address = self._reader.recv()
        self._reader.close()
        self._state.value = State.STARTED
        self.connect()
The _writer / _reader pair is a one-way multiprocessing.connection.Pipe (duplex=False), not a raw os.pipe, used only at startup so the parent can block until the server is listening before start() returns. The server sends back the address it actually bound, which matters when the requested address lets the OS pick a port.
Server dispatch loop
Server.serve_forever runs an accepter thread that loops on accept; each accepted connection is handled by a new thread running Server.handle_request (there is no fixed-size thread pool). The request envelope is a tuple (ident, methodname, args, kwds); ident is the object ID assigned when the proxy was created. The server looks up the object in its id_to_obj dict, calls the method, and sends back either ('#RETURN', result) or ('#ERROR', exception).
def handle_request(self, c):
    methodname = result = request = None
    try:
        connection.deliver_challenge(c, self._authkey)
        connection.answer_challenge(c, self._authkey)
        request = c.recv()
        ident, methodname, args, kwds = request
        obj, exposed, gettypeid = self.id_to_obj[ident]
        if methodname not in exposed:
            raise AttributeError(f"method {methodname!r} not exposed")
        function = getattr(obj, methodname)
        result = function(*args, **kwds)
    except Exception:
        msg = ('#ERROR', convert_to_error(methodname, sys.exc_info()))
    else:
        msg = ('#RETURN', result)
    c.send(msg)
Every new connection re-runs the HMAC challenge (inherited from multiprocessing.connection), so authentication is per connection, not per session. In practice BaseProxy caches one connection per proxy per thread, so the handshake happens when that connection is first opened rather than on every method call; only a call that has to open a fresh connection authenticates again.
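The per-connection handshake can be observed with the same connection-level primitives the server relies on: when an authkey is supplied, Listener.accept() and Client() run deliver_challenge/answer_challenge automatically. A minimal threaded sketch (serve_one is illustrative):

```python
import threading
from multiprocessing import connection, AuthenticationError

listener = connection.Listener(('127.0.0.1', 0), authkey=b'secret')

def serve_one():
    try:
        # accept() delivers the HMAC challenge before handing back a connection
        with listener.accept() as conn:
            conn.send('authenticated')
    except AuthenticationError:
        pass  # a client presenting the wrong key is rejected here

t = threading.Thread(target=serve_one)
t.start()
with connection.Client(listener.address, authkey=b'secret') as conn:
    print(conn.recv())  # 'authenticated'
t.join()
listener.close()
```

A client constructed with a different authkey raises AuthenticationError on both sides; the data channel is only handed to application code after the challenge succeeds.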
AutoProxy and MakeProxyType
AutoProxy builds a proxy class for an object whose interface is not known in advance. If the caller supplies an exposed list it is used directly; otherwise AutoProxy asks the server (a 'get_methods' request) for the object's public callable names, which the server computes with dir() and getattr filtering. It then calls MakeProxyType to synthesize a new class. MakeProxyType is a class factory: it creates one method per exposed name, each of which calls self._callmethod(name, args, kwds) on BaseProxy.
def MakeProxyType(name, exposed, _cache={}):
    exposed = tuple(exposed)
    try:
        return _cache[(name, exposed)]
    except KeyError:
        pass
    dic = {}
    for meth in exposed:
        exec(
            f"def {meth}(self, /, *args, **kwds):\n"
            f"    return self._callmethod({meth!r}, args, kwds)",
            dic,
        )
    ProxyType = type(name, (BaseProxy,), dic)
    ProxyType._exposed_ = exposed
    _cache[(name, exposed)] = ProxyType
    return ProxyType
The exec-based approach preserves the real method name so that repr, help, and inspect.signature work correctly on proxy instances.
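The factory technique can be demonstrated standalone. In this sketch, StubProxy and make_proxy_type are illustrative stand-ins for BaseProxy and MakeProxyType: the stub records what _callmethod would have sent to the server, so the generated methods can be exercised without a manager.

```python
class StubProxy:
    """Stand-in for BaseProxy: records calls instead of talking to a server."""
    def __init__(self):
        self.calls = []
    def _callmethod(self, methodname, args=(), kwds=None):
        self.calls.append((methodname, args, kwds or {}))
        return f'dispatched {methodname}'

def make_proxy_type(name, exposed, _cache={}):
    exposed = tuple(exposed)
    if (name, exposed) in _cache:
        return _cache[(name, exposed)]
    dic = {}
    for meth in exposed:
        # exec gives each generated function its real name and signature,
        # so introspection sees e.g. 'append', not a generic wrapper
        exec(
            f"def {meth}(self, /, *args, **kwds):\n"
            f"    return self._callmethod({meth!r}, args, kwds)",
            dic,
        )
    proxy_type = type(name, (StubProxy,), dic)
    proxy_type._exposed_ = exposed
    _cache[(name, exposed)] = proxy_type
    return proxy_type

ListProxy = make_proxy_type('ListProxy', ('append', 'pop'))
p = ListProxy()
print(p.append(1))                 # dispatched append
print(p.calls)                     # [('append', (1,), {})]
print(ListProxy.append.__name__)   # append
```

The mutable-default `_cache` mirrors the real implementation: asking twice for the same (name, exposed) pair returns the identical class object, so proxies for the same type share a class.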
gopy mirror
Not yet ported. A Go port is significantly more complex than pool or connection because it requires a running goroutine (or OS thread) acting as the object server, a generic dispatch mechanism that can call arbitrary methods on registered objects, and a proxy-generation step that Go's static type system makes harder to automate. The nearest analogue in Go would be an RPC server using net/rpc or a hand-written message loop, with proxy types generated by code generation rather than exec.
CPython 3.14 changes
- The default serializer changed from pickle protocol 2 to pickle.DEFAULT_PROTOCOL (protocol 5), matching the rest of the multiprocessing package.
- SyncManager now registers Barrier as a shared proxy type (added in 3.14 alongside the threading.Barrier improvements).
- Server.handle_request gained a hard timeout on the per-connection thread to prevent a misbehaving client from holding the server indefinitely.
- AutoProxy no longer calls getattr on descriptors that raise during dir() iteration, fixing a class of AttributeError tracebacks seen with C extension types.