🤖 AI Summary
Wasmer today announced full Python support in Wasmer Edge (Beta), bringing unmodified CPython to WebAssembly via WASIX. To get there, they added dynamic linking (dlopen/dlsym) and libffi (ctypes) support to WASIX, polished its sockets and threading, published a package index of popular native wheels compiled for WASIX, and built an automated deploy pipeline (their alternative to buildpacks) that detects and runs projects. The platform already runs FastAPI, Streamlit, Django, LangChain and more, with templates and examples for quick deployment; PyTorch, polars, and greenlet/gevent support are coming soon.
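To make the "unmodified code" claim concrete, here is a minimal sketch of the kind of FastAPI app the templates target. It is illustrative only, not code from the announcement; the deploy step mentioned afterward assumes the standard Wasmer CLI flow (`wasmer login`, then `wasmer deploy`), which the summary itself does not spell out.

```python
# A minimal, ordinary FastAPI app -- nothing WASIX-specific in the code itself.
# Hypothetical example; the announcement only claims apps like this run unmodified.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Echo(BaseModel):
    message: str


@app.get("/")
def root() -> dict:
    return {"status": "ok"}


@app.post("/echo")
def echo(body: Echo) -> Echo:
    # pydantic v2's compiled core is a native extension; per the announcement,
    # modules like this load via the new dynamic-linking support rather than
    # an adapter layer or a pure-Python fallback.
    return body
```

Locally this runs under any ASGI server (e.g. `uvicorn main:app`); on Wasmer Edge the automated pipeline is described as detecting such a project and running it without a buildpack, though the exact commands and config live in Wasmer's docs rather than this summary.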
This matters for AI/ML teams because it finally enables real server-side Python workloads at the edge with near-native compatibility: native C extensions (numpy, pandas, pydantic), multithreading/multiprocessing, raw sockets, WebSockets, and subprocess-heavy tools (ffmpeg, pandoc) "just work" without adapter layers. Performance is promising: Wasmer reports pystones benchmarks approaching native Python (~534k vs ~604k pystones/sec native, roughly 88%) and a ~6x improvement over older Wasmer builds, with optimizations underway to reach ~95% of native speed. Compared to Pyodide-based Cloudflare Workers and adapter-heavy AWS Lambda, Wasmer Edge offers broader native-module compatibility, faster cold starts, sandboxing, and simpler deployment, albeit still in Beta.
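As a rough sketch (assumed for illustration, not taken from the article) of the compatibility being claimed, the snippet below exercises exactly the features called out above: ctypes going through libffi, ordinary threads, and a subprocess call. The `echo` command is a placeholder for heavier tools like ffmpeg or pandoc.

```python
# Hypothetical workload touching the features the post highlights:
# ctypes (libffi), real threads, and subprocesses -- all standard CPython code.
import ctypes
import ctypes.util
import subprocess
import threading


def native_sqrt(x: float) -> float:
    # ctypes dispatches through libffi; per the announcement, this path now
    # works under WASIX via the added dlopen/libffi support.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.sqrt.restype = ctypes.c_double
    libm.sqrt.argtypes = [ctypes.c_double]
    return libm.sqrt(x)


def worker(results: list, i: int) -> None:
    # Ordinary threading code is claimed to run as-is under WASIX's threading support.
    results.append((i, native_sqrt(float(i))))


results: list = []
threads = [threading.Thread(target=worker, args=(results, i)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Subprocess-heavy tools are driven the same way as on a normal server;
# `echo` stands in here as a placeholder command.
print(subprocess.run(["echo", "done"], capture_output=True, text=True).stdout)
print(sorted(results))
```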