Yet more worrying for manufacturers, the costs of preparing chips for new nodes aren't just rising but rising quickly: mask sets already run over a million dollars, and they are expected to grow even more expensive thanks to the high cost of developing masks for today's multi-patterning technologies.
The Apple paradigm, selling only to the top X% (thought to be 20-ish), can only work for a slim minority of vendors. Moreover, it won't even work for that slim minority if they become the *sole* customers of some external production process. In simpler words: the Apple paradigm works, for Apple, only if there are other vendors to support the capex of said external production process. "We lose money on each widget, but make up for it with volume." Without all those alleged money-losing smartphone vendors, Apple wouldn't have suppliers to eviscerate.
By splitting up a chip in this fashion, the number of transistors laid down on the leading-edge node would be held to a minimum, resulting in a smaller module that is cheaper to design and cheaper to produce than a full SoC. Meanwhile, the other modules would be relatively cheap to design and produce, if not largely carried over from existing designs to begin with.
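The cost argument can be sketched with the standard dies-per-wafer approximation and the Poisson yield model (a smaller die is exponentially more likely to escape killer defects). The wafer costs and defect densities below are purely illustrative assumptions, not real foundry figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies per wafer, standard edge-loss approximation."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2, defects_per_mm2):
    """Poisson yield: probability a die contains zero killer defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost, defects_per_mm2):
    """Wafer cost amortized over the yielded (good) dies."""
    good = dies_per_wafer(die_area_mm2) * die_yield(die_area_mm2, defects_per_mm2)
    return wafer_cost / good

# Hypothetical comparison: a monolithic 600 mm^2 SoC on a leading-edge
# node vs. a 200 mm^2 leading-edge compute die plus a 400 mm^2 die on a
# cheaper mature node. All dollar figures and defect rates are made up.
mono = cost_per_good_die(600, wafer_cost=10_000, defects_per_mm2=0.002)
split = (cost_per_good_die(200, wafer_cost=10_000, defects_per_mm2=0.002)
         + cost_per_good_die(400, wafer_cost=4_000, defects_per_mm2=0.0005))
print(f"monolithic: ${mono:.0f} per good die, split: ${split:.0f} per good pair")
```

Even ignoring packaging costs (which the split approach adds), shrinking the leading-edge die wins twice: more dies fit per expensive wafer, and each one yields better.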
There was a time, back in the late 70s and early 80s, when IBM held a massive tech lead over the seven dwarves (the field may have shrunk by one or two by then, I don't recall exactly) with its "thermal conduction module". This was before VLSI and chips as we know them today. It appears that Marvell has rediscovered old wine and fashioned a new bottle for it. Smaller, yes, but exactly the same idea.
There will always be some new way to spread total cost, esp. capex. But the bottom line, so to speak, is finding expanding demand for cycles. WinTel did that for decades. Now it's the smartphone vendors and telecoms forming the yin-yang of sink and supply. Life changes less than it appears to.