// d3d12umddi.h
typedef enum D3D12DDI_SHADER_MIN_PRECISION {
    D3D12DDI_SHADER_MIN_PRECISION_NONE   = 0x0,
    D3D12DDI_SHADER_MIN_PRECISION_10_BIT = 0x1,
    D3D12DDI_SHADER_MIN_PRECISION_16_BIT = 0x2
} D3D12DDI_SHADER_MIN_PRECISION;
The D3D12DDI_SHADER_MIN_PRECISION enumeration describes the driver's minimum precision support options for shaders.
D3D12DDI_SHADER_MIN_PRECISION_NONE (0x0)
The driver supports only full 32-bit precision for all shader stages.
D3D12DDI_SHADER_MIN_PRECISION_10_BIT (0x1)
The driver supports 10-bit minimum precision.
D3D12DDI_SHADER_MIN_PRECISION_16_BIT (0x2)
The driver supports 16-bit minimum precision.
The returned information indicates only that the graphics hardware can perform HLSL operations at a precision lower than the standard 32-bit float; it does not guarantee that the hardware will actually run at that lower precision.