I personally like the modern, clean look of ArcGIS Pro. It is beautifully styled and very impressive. From a purely stylistic viewpoint it makes Quantum and any other GIS look like Windows XP. It looks like high-powered web design from an elite San Francisco graphics company, a good match for ESRI's beautiful websites. That matters when selling look and feel to upper management.
I also like that ESRI is willing to make a fresh start. 64-bit is long overdue, and it is good they are finally trying. But as you write, the actual functionality and reliability of Pro is poor. It just crashed on me again when I did nothing but open a new project. Not good!
What is worse is that such a big effort went into implementing an architecture that was technologically obsolete before they finished it. ArcGIS Pro is non-parallel software in an era when modern software is parallel. It uses the GPU as GPUs were used 15 years ago, for rendering, not as they are used today, for massively parallel GPGPU computation.
Pro is not multi-threaded. At best it can run a single background geoprocessing thread without locking up the GUI. That is terrible in an era when people routinely write software that parallelizes tasks across all 16 hyperthreads of an 8-core CPU. Other companies can do GIS geoprocessing on the desktop using 32 or 64 CPU cores and thousands of GPU cores. Why not ESRI? It took ESRI over ten years longer to go 64-bit than smaller GIS companies and open source, and parallel is much harder than 64-bit. Will it take ESRI another 20 years to go parallel, as other companies already have? There are new packages that can do in two or three seconds what Pro takes hours to do. To match that, Pro will either have to be rewritten completely or ESRI will have to buy the new technology from somebody who has it.
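To make concrete what "parallel" means here, a minimal sketch in plain Python, nothing to do with arcpy or any real ArcGIS API; the names `process_tile` and `process_all` are hypothetical stand-ins for a per-tile geoprocessing step fanned out across all CPU cores instead of one background thread:

```python
# Hypothetical sketch of multi-core geoprocessing. The names here are
# illustrative only, not any real ArcGIS or arcpy API.
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def process_tile(tile):
    # Stand-in for one unit of real geoprocessing work
    # (e.g. resampling or reprojecting a single raster tile).
    return sum(v * v for v in tile)

def process_all(tiles, workers=None):
    # Fan the tiles out across every available CPU core instead of
    # running them one after another on a single background thread.
    ctx = multiprocessing.get_context("fork")  # assumes a Unix-like OS
    with ProcessPoolExecutor(max_workers=workers, mp_context=ctx) as pool:
        return list(pool.map(process_tile, tiles))
```

The same pattern scales to GPU cores with a GPGPU framework; the point is that each tile is independent work, which is exactly what a single-threaded geoprocessing engine leaves on the table.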
ESRI also needs to make Pro easier to use. Today it is too hard to do simple tasks that ordinary workflow in GIS requires. An example:
I read documentation diligently. I work through all the tutorials. Today, I spent five hours trying different ways to import a shapefile as a layer into ArcGIS Pro. Five hours! In real life you meet shapefiles all the time that have no .prj but are in lat/lon, WGS 84. In any other GIS in the world that takes five minutes: import the shapefile and done. ArcGIS Pro? It is the most convoluted, opaque process I have ever seen. I read dozens of web pages and blogs and documentation pages. Finally, I wrote a Python script using arcpy.SpatialReference(4326) to specify the projection and build a .prj for the shapefile. Now when I add the shapefile as a layer it appears with the correct Spatial Reference properties, but the extent is wrong and it does not appear in the map. I am not asking for advice here. Sooner or later I will figure it out using published documentation and web searches. I am just giving an example: if such a simple task is this difficult for a diligent beginner, ESRI should consider that there may be a basic problem with Pro. Such things must be made simpler.
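For what it is worth, the workaround amounts to writing a WGS 84 .prj sidecar next to the shapefile. Here is a minimal sketch that does not even need arcpy; the WKT string below is the standard ESRI well-known text for EPSG:4326, and `write_prj` is my own hypothetical helper name (with arcpy installed, `arcpy.SpatialReference(4326).exportToString()` produces the equivalent string):

```python
from pathlib import Path

# Standard ESRI WKT for geographic WGS 84 (EPSG:4326).
WGS84_WKT = (
    'GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",'
    'SPHEROID["WGS_1984",6378137.0,298.257223563]],'
    'PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]]'
)

def write_prj(shapefile_path):
    # Write a .prj sidecar next to the shapefile so GIS software
    # recognizes the coordinate system on import.
    prj = Path(shapefile_path).with_suffix(".prj")
    prj.write_text(WGS84_WKT)
    return prj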
I would like to see ESRI be successful with Pro. They need to be less inbred and think outside the captive market. For example, the world uses EPSG codes, not ESRI factory codes, for coordinate systems; adding EPSG support should be easy. See what can be done with a few clicks in competitive systems, like importing a shapefile, and make sure Pro is just as easy. Read and write many different formats. Buy parallel technology from smaller companies to make Pro fast. Loosen up. A product with the depth of GIS expertise ESRI has, and the money ESRI has to style a software experience, but built on modern internals instead of obsolete ones, would be wonderful.
ESRI has a window of opportunity because the open source crowd does not have the organization and discipline to build parallel end-user products, and smaller companies with modern technology do not have the money to gain traction against ESRI marketing. But nothing stops a big company like Oracle or Google from wrapping a modern, parallel core in hundreds of features, geoprocessing tools, and format readers/writers taken from open source. Then ESRI will be up against a "quantum" of sorts, but one sold by a big commercial vendor without the stigma of open source, and that big new quantum will do in seconds what ESRI takes hours to do. Any of the big database companies could do it just out of frustration at being held back by backward GIS technology in big-data markets that involve location data. They are doing it already: look at how every new GPU DBMS seems to include spatial visualization capability. Nothing to laugh at! ESRI should be first in that game, because being second will be game over.