This repository has been archived by the owner on Apr 26, 2024. It is now read-only.
A fast postgres adapter for pypy #11756
Labels
T-Enhancement
New features, changes in functionality, improvements in performance, or user-facing enhancements.
(Mostly a derivative of #9294, but meant to be an idea-collecting issue)
One of the largest roadblocks I've encountered while working on PyPy support (#8888) is that PyPy interfaces poorly with CPython C extensions: it has to emulate the CPython C API (see this article), which slows down execution considerably around those calls.
Currently, Synapse uses psycopg2 for its database operations, a driver implemented as a C extension against the CPython C API for speed.
On PyPy, this is self-defeating if we'd want to utilise it for speedups: the gains from PyPy's JIT are offset by the overhead of emulating the CPython C API.
Luckily, HPy can offer a solution: it is a redesigned C extension API that, by design, can be implemented with low overhead on any Python implementation.
I've asked psycopg3 to include support for HPy, which would make working with psycopg as fast on PyPy as it is on CPython. (psycopg/psycopg#154)
In the meantime, however, my hypothesis is that a pure-Python postgres adapter would be about as fast on PyPy as any C-extension adapter, because PyPy's JIT compiles that Python code down to machine code. I suggested pg8000, which is still maintained.
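Since both psycopg2 and pg8000 expose the standard DB-API 2.0 interface, one way to get the best of both worlds would be to pick the driver based on the running interpreter. A minimal sketch of that idea (the `preferred_pg_drivers` helper is hypothetical, not part of Synapse):

```python
import platform
from typing import List, Optional


def preferred_pg_drivers(implementation: Optional[str] = None) -> List[str]:
    """Return DB-API 2.0 postgres driver module names in preference order.

    Hypothetical helper illustrating the idea: on PyPy, a pure-Python
    driver (pg8000) avoids the C-API emulation overhead, while on
    CPython the C extension (psycopg2) is the faster choice.
    """
    impl = implementation or platform.python_implementation()
    if impl == "PyPy":
        return ["pg8000.dbapi", "psycopg2"]
    return ["psycopg2", "pg8000.dbapi"]
```

A caller could then try `importlib.import_module()` on each name in order and use the first driver that is installed, since both modules provide the same `connect()`/`cursor()` surface.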
I understand, however, that adding a second database driver next to a battle-hardened one, just for a speedup use case, is not exactly appealing from a maintenance perspective. Still, for PyPy support to be realistic, a fast database adapter has to exist, whether through psycopg3's HPy support, through a pure-Python adapter such as pg8000, or via some other method.