When I was gaming on Windows, the DirectX 12 implementation in every game I tried was kinda garbage.
It would usually either perform badly across the board or have really bad input lag.
The first thing I’d try whenever I had problems was switching the renderer to DirectX 11, and that would often fix things.
In fairness, Vulkan implementations have been pretty hit-and-miss too. I think developers still just need to get used to the new execution model.
This was also on Nvidia graphics, which may or may not have had something to do with it.
This is the whole idea behind Turing-completeness, isn’t it? Any Turing-complete architecture can simulate any other.
Reminds me of https://xkcd.com/505/
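To make the simulation point concrete: one way to show a system is Turing-complete is to write a universal-style interpreter for Turing machine rule tables in it. Here's a toy sketch in Python (the rule format and the `flip` example machine are my own illustration, not from any standard):

```python
from collections import defaultdict

def run_tm(rules, tape, state="start", steps=100):
    """Simulate a Turing machine.

    rules maps (state, symbol) -> (new_state, write_symbol, move),
    where move is "L" or "R" and "_" is the blank symbol.
    """
    cells = defaultdict(lambda: "_", enumerate(tape))
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        state, write, move = rules[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
    # Read the tape back out, dropping surrounding blanks.
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A toy machine that flips bits left-to-right until it hits a blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm(flip, "0110"))  # -> 1001
```

The host language (Python here) is simulating an arbitrary machine described purely as data, which is the same trick the xkcd rocks are pulling off, just with a faster substrate.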