I personally was a desktop supporter for a very long time. Even after the majority of casual users had long since switched to laptops or even smartphones, I still clung to a desktop.
There were three reasons for that: video games, upgradeability, and price.
Video games are the simplest of these: put simply, modern graphics require a lot of computational power. If you were not happy with the cubist primitivism of Minecraft and wanted the photorealistic look of The Witcher III, you needed a discrete graphics card.
There were early gaming laptops back in the 2000s, but they were bulky, heavy, expensive, and not that powerful. At that time, they were more an aspiration to make gaming portable than a real solution that could actually run the newest games.
Finally, the often-overlooked advantage of desktops is upgradeability and the price reduction that comes with it. A gaming desktop used to cost less than a gaming laptop, but it was only by building your own rig that you could get decent gaming performance for under $1,000. Compare that to roughly $2,500 for a pre-built gaming desktop that would be only somewhat faster, and around $5,000 for a gaming laptop that would lose to your sub-$1,000 build in performance.
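To make that comparison concrete, here is a minimal sketch of the price-to-performance math. The prices are the rough figures from the paragraph above; the performance indices are made-up placeholders for illustration, not benchmark results:

```python
# Illustrative cost-per-performance comparison. Prices follow the rough
# figures quoted above; the performance indices are invented placeholders,
# not real benchmark data.
builds = {
    "self-built desktop": {"price": 1000, "performance": 100},
    "pre-built desktop": {"price": 2500, "performance": 120},  # "somewhat faster"
    "gaming laptop": {"price": 5000, "performance": 90},       # loses to the self-built rig
}

for name, spec in builds.items():
    dollars_per_point = spec["price"] / spec["performance"]
    print(f"{name}: ${dollars_per_point:.0f} per performance point")
```

Even with generous assumptions for the pre-built options, the self-built rig wins by a wide margin on dollars per unit of performance.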
There were a handful of part-selection strategies that made such outcomes possible. Take the mid-range graphics card: cards numbered something like xx60 or xx50 in NVIDIA's nomenclature performed well enough to run all the games without breaking the budget. Go below that, towards xx30 or xx10, and they would not run games at all. In contrast, xx70 and above cost astronomical sums and also required an expensive power supply to run.
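As a minimal sketch of that rule of thumb (the tier suffixes follow NVIDIA's GeForce numbering; the suitability labels and the helper function are my own illustrative shorthand, not anything official):

```python
# Rough mapping of NVIDIA GeForce tier suffixes to gaming suitability,
# as described above. The labels are a personal rule of thumb,
# not an official classification.
def gaming_suitability(tier_suffix: int) -> str:
    if tier_suffix >= 70:   # xx70 and up: fast, but card and PSU get expensive
        return "overkill: astronomical price, needs a beefy power supply"
    if tier_suffix >= 50:   # xx50/xx60: the budget sweet spot
        return "sweet spot: runs everything without breaking the budget"
    return "too weak: xx30/xx10 cards will not run modern games"

for suffix in (10, 30, 50, 60, 70, 80):
    print(f"xx{suffix}: {gaming_suitability(suffix)}")
```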
There was more. AMD, for example, keeps the same CPU socket for several years straight; the AM4 socket carried everything from the first Ryzen generation through Ryzen 5000. That way, when you build a new rig, you can buy a motherboard with the newest socket, and several years later you only need to swap the processor and graphics card and add some RAM to get a performance boost comparable to buying a new PC. Such an upgrade path saves you further money.
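A back-of-the-envelope sketch of those savings, assuming illustrative part prices (none of these figures are real quotes):

```python
# Back-of-the-envelope comparison: full rebuild vs. in-place upgrade
# on a long-lived socket. All prices are illustrative assumptions.
full_rebuild = {
    "motherboard": 150, "cpu": 300, "gpu": 400,
    "ram": 100, "psu": 100, "case": 80, "storage": 100,
}
in_place_upgrade = {
    "cpu": 300, "gpu": 400, "extra_ram": 50,  # board, case, PSU, storage reused
}

print(f"full rebuild:     ${sum(full_rebuild.values())}")
print(f"in-place upgrade: ${sum(in_place_upgrade.values())}")
print(f"savings:          ${sum(full_rebuild.values()) - sum(in_place_upgrade.values())}")
```

The exact numbers do not matter; the point is that the motherboard, case, power supply, and storage survive the upgrade, so you only pay for the parts that actually make games run faster.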
All of these considerations kept me on a desktop for a long time.
Recently, however, things started to change.
Gaming laptops became increasingly fast, but also smaller and cheaper. By the early 2020s, progress in gaming laptop development had finally begun to threaten the gaming desktop market.
However, the biggest nail in the coffin came from a different direction. It began when AMD finally released its Ryzen processors. They were slow to make it into the laptop market, but now you can find plenty of cheap laptops with Ryzen chips inside.
Ryzen chips have a reasonably good integrated graphics core that can actually run modern games. Their release made Intel up its game and dramatically increase the number of shader units in its processors, starting in the early 2020s.
Before that, Intel processors could only run older games, made before 2010 or so. Intel even had an agreement with NVIDIA that, I suspect, included a clause about keeping shader unit counts low in order not to destroy NVIDIA's discrete graphics card business.
From my personal experience, The Witcher III refused to run on my GeForce GTS 250 with its 128 shader units. However, it ran rather well on a Ryzen 3 2200G with 512 shader units.
Now both Intel and AMD offer laptop processors that can actually play modern games, with caveats.
The future looks even better. Competition between Intel and AMD will likely push each side to keep increasing the shader unit count in their processors. Eventually the count will get so high that it makes discrete graphics cards completely obsolete; a 2,000-plus shader count would satisfy even the most demanding gamers.
On a separate note, we are finally seeing SSDs catch up to hard drives in total capacity. Soon another piece of legacy tech will become obsolete.
I wonder if we will see modular laptop designs, or maybe even something more revolutionary.