A force map function is one where a neural force map is built and refined in a second data set, so that it can be used during certain neural training stages for better training efficiency.
Analogue photonic computing could allow a better force map function fit, since the dataset compiles down to a small subset when it affects a given zone of the neural network you are training.
This would improve overall speed and dynamic efficiency in training the neural network.
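The idea above is high level, so here is a minimal sketch of one way a force map could be read: as a per-parameter weighting derived from a second data set and then used to scale gradients in certain training stages. Everything in the snippet (build_force_map, the gradient-magnitude weighting, the PyTorch setup) is an illustrative assumption, not an established method or the author's exact scheme.

```python
# Hypothetical sketch: a "force map" as a per-parameter weighting built from a
# second data set, then used to emphasise certain zones of the network during
# selected training stages. All names and the weighting rule are assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

def build_force_map(model, second_dataset):
    """Derive a per-parameter weighting from the second data set.
    Here: average gradient magnitude, normalised so each tensor's mean scale is ~1."""
    fmap = {name: torch.zeros_like(p) for name, p in model.named_parameters()}
    for x, y in second_dataset:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for name, p in model.named_parameters():
            fmap[name] += p.grad.abs()
    for name in fmap:
        fmap[name] /= (fmap[name].mean() + 1e-8)
    return fmap

def train_step(batch, force_map=None):
    """One ordinary training step; if a force map is given, scale the gradients by it."""
    x, y = batch
    model.zero_grad()
    loss_fn(model(x), y).backward()
    if force_map is not None:
        for name, p in model.named_parameters():
            p.grad *= force_map[name]  # emphasise zones the force map marks as important
    opt.step()

# Usage sketch with random stand-in data.
second_dataset = [(torch.randn(8, 16), torch.randn(8, 1)) for _ in range(4)]
train_data = [(torch.randn(8, 16), torch.randn(8, 1)) for _ in range(10)]
fmap = build_force_map(model, second_dataset)
for batch in train_data:
    train_step(batch, force_map=fmap)
```

Because the map only needs to be rebuilt occasionally and only touches a small subset of the network at a time, it is the kind of side computation that cheap analogue or photonic hardware could plausibly take over.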
With reasonable quantum computing, say 256 to 10,000 good qubits, perhaps backed by something like 8,000,000 error-correction qubits, you would not have enough to do much factorisation, but a photonic system running in parallel could do lots of these types of operations per second, making good use of quantum communication.
You could have more dynamic optimisation control relative to the force map dataset; you might also have a quantum dataset enhancing the force map dataset, and quantum functionality in the neural net dataset you are training.
This would mean the resulting AI mind could be very robust and could do quantum AI functions, which would help with better modelling of molecular down to sub-atomic physics. It would also help with things like more robust and dynamic modelling of both physics and scene dynamics in games, and with modelling better engineering design and product testing analysis.
This is what people don't realise: given what photonic digital and analogue AI acceleration is already capable of, and the maturing of EUV fabrication, in ten years AI could be quite capable, still human-driven but with a lot less mundane work. If quantum computing moves from 10-20 good qubits up to a photonic-equivalent system with at least 256 good qubits over the next 20 years, we would be able to do a lot of more optimal computation with less effort. So even if you take the lower possible outcome, by 2042 you would have at least 80 good qubits with some level of photonic ability, giving you some degree of quantum optimisation capability.
You would still have over 0.6 zettaflops of cloud compute resources, so animation, game making and much modelling and simulation would be a lot easier to do a lot with.
TB bot making would be more resource intensive and less capable than if we had got further by 2040, but the bots would still enable people and companies to do a lot with less human effort, though compute resources would be tighter for big projects.
A top mini PC would still be able to do at least 0.8 petaflops, ideal for many applications but less dynamic than if such a system had reached 50+ petaflops by this point in time.
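For scale, taking those figures at face value: 0.6 zettaflops is about 6 x 10^20 FLOP/s and 0.8 petaflops is 8 x 10^14 FLOP/s, so the cloud figure is roughly 750,000 times the top mini PC figure.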
You would still have analogue computing being used to some degree to enhance training.
Computing after this point would grow and prosper nicely for decades to come, allowing ever better models and capabilities, and ever better production range, ability and efficiency.