I've just purchased a Samsung Odyssey HMD and am very impressed with the work that Microsoft has put into Windows Mixed Reality. Navigation and use in the Cliff House are relatively intuitive and remarkably quick to learn, and the immersion is outstanding. I look forward to working with Mixed Reality as this technology matures and improves.
It's somewhat ironic, however, that the quality and totality of the immersive experience yields what for me is the single biggest obstacle to comfortably working in WMR... the complete separation from one's environment. As nice as this immersive interface is, there is still value in being able to track key items in your real surroundings, such as your keyboard and mouse, the corners of your desk, or that piping-hot extra-large "double-half-caff mocha latte" that's sitting nearby.
Given the inside-out design of WMR devices, I am wondering if it would be possible, at a software level, to pass through at least a small amount of key 'reference' information from the real environment into the virtual one. This could be a 'ghosted' (e.g. 90% transparent) view from the HMD cameras, which could be faded in and out as needed, or possibly printable (or purchased) coded stickers which could be displayed much like boundary dots are now. (For coded stickers, perhaps they could even be mapped to virtual counterparts, so that a virtual copy of your real keyboard is visible.)
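To be clear, I don't know of any WMR API that exposes the headset cameras for this today, so the following is purely a hypothetical sketch of the 'coded sticker' half of the idea, using OpenCV's ArUco fiducial markers and an ordinary webcam as stand-ins for whatever marker format and camera feed Microsoft might actually expose. Each detected marker ID could be mapped to a virtual stand-in object (keyboard, mug, desk corner) anchored at the marker's position.

```python
import cv2

# Hypothetical illustration only: assumes OpenCV >= 4.7 with the aruco module
# (opencv-contrib-python) and a generic webcam standing in for the HMD cameras.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

capture = cv2.VideoCapture(0)  # stand-in for an inside-out headset camera feed
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # In the imagined WMR feature, each marker ID would map to a virtual
        # counterpart (e.g. ID 7 -> "keyboard") rendered at the marker's pose.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("markers", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```

Since the inside-out cameras are already tracking the room for positional tracking, something along these lines seems like it should be feasible without extra hardware, which is what makes the idea appealing to me.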
It seems to me that an enhancement such as this would go a long way towards bringing the non-HoloLens HMDs closer to Augmented Reality, and towards addressing one of the more common complaints I hear about VR HMDs.