Note that I am not looking for something opinion-based or a third-party library - I merely want confirmation that nothing is planned (or pointers to some discussion by the powers that be). I tried Google and failed to find anything, so it looks like I am heading towards writing my own implementation in C++/CLI using the Intel library.
Like many, I am working with financial numbers, and that means floats are terrifically problematic. At the same time, the .NET decimal is a beast - slow, but also large (128 bits large), which makes it inefficient to use when you amass hundreds of thousands of them and want them in a struct with some more information.
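To make the problem concrete, here is a minimal sketch (names are just for illustration) showing why binary floats are unusable for money and what the 128-bit decimal costs in space:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Add one cent 10,000 times. In binary floating point the
        // value 0.01 has no exact representation, so error accumulates.
        double d = 0.0;
        for (int i = 0; i < 10_000; i++) d += 0.01;
        Console.WriteLine(d == 100.0);   // False - the sum has drifted

        // System.Decimal represents 0.01 exactly, so the sum is exact.
        decimal m = 0.0m;
        for (int i = 0; i < 10_000; i++) m += 0.01m;
        Console.WriteLine(m == 100.0m);  // True

        // The price: 16 bytes (128 bits) per value - twice a double
        // and four times an IEEE 754-2008 decimal32.
        Console.WriteLine(sizeof(decimal)); // 16
    }
}
```

A decimal32 or decimal64 field would keep the exactness for typical monetary scales while halving or quartering the per-value footprint in those large structs.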
IEEE 754 defines 3 decimal types - 32, 64 and 128 bit - that are likely to get hardware support in mainstream processors (they already have it in less common ones, IBM's POWER series for example). There is an optimized Intel library for decimal mathematics, and it is quite likely that at some point at least the simpler operations will be done in hardware.
All I could find is an old discussion in the Annotated C# Standard about interop between the .NET decimal and the then-proposed IEEE standard: the proposal was rejected, but the layout was chosen such that it can be identified whether a bit field is a .NET decimal or a bit-inverted IEEE decimal128.
Since then, many years have passed. IEEE 754-2008 is now finished, and I wonder whether anything official has been published about how this is to go on. As I said, the .NET decimal is slow, has zero chance of ever getting hardware acceleration - and it is unwieldy in size.
So, does anyone know anything in a blog or similar? Note - it has to be official or refer to something official; I am not here for opinions from people unrelated to the .NET language or BCL teams. This is about a canonical resource on whether additional data types are being considered for the future... possibly for a .NET 5.0 / 6.0 timeframe.