Snowflake, an optimiser or an innovator?

In the latest line-up of surprising mid-pandemic public listings, Snowflake, a data-warehousing-as-a-service platform, has also filed for an IPO. Snowflake launched in 2014 as a data warehousing solution and was valued at $12.4 billion during its last round of fundraising in February. Its success hinged, among other factors, on its ability to decouple compute power from data storage, creating a modular solution that could be made efficient for any data warehousing configuration. Snowflake also allows customers to work across multiple cloud platforms (Google, AWS etc.) and provides the much-desired data portability that lets them shift between cloud solutions.
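
To make the compute-storage separation concrete, here is a minimal sketch using Snowflake's Python connector (snowflake-connector-python) to provision and resize a virtual warehouse (compute) independently of the data it queries (storage). The account details, warehouse and table names are hypothetical placeholders; the point is simply that compute can be created, resized and suspended without moving the underlying data.

  # A minimal sketch of compute/storage separation via snowflake-connector-python.
  # Account, credentials, warehouse and table names are hypothetical placeholders.
  import snowflake.connector

  conn = snowflake.connector.connect(
      account="my_account",   # placeholder account identifier
      user="analyst",         # placeholder user
      password="***",         # placeholder credential
      database="SALES_DB",    # storage: the data lives here regardless of compute
      schema="PUBLIC",
  )
  cur = conn.cursor()

  # Compute is a separately provisioned, independently sized "virtual warehouse".
  cur.execute(
      "CREATE WAREHOUSE IF NOT EXISTS REPORTING_WH "
      "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE"
  )
  cur.execute("USE WAREHOUSE REPORTING_WH")

  # Query the stored data with the small warehouse...
  cur.execute("SELECT COUNT(*) FROM ORDERS")
  print(cur.fetchone())

  # ...then scale compute up without copying or moving any of the stored data.
  cur.execute("ALTER WAREHOUSE REPORTING_WH SET WAREHOUSE_SIZE = 'LARGE'")

  cur.close()
  conn.close()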

At the time of launch, this seemingly simple solution offered an efficiency that the big cloud players, whose focus tended to be broader and less specialised, did not. This allowed Snowflake to carve out a niche of enterprise customers who brought with them a lucrative revenue stream. But in a saturated cloud market, pointing to Snowflake's modularity as the source of its success oversimplifies the strategy that allowed it to build its multi-billion-dollar valuation. To add depth, we can apply a model that better describes how it was able to enter the market and capture value.

Optimisers: Positioning of new entrants in saturated markets

We can think of any technology that underpins a transformation as a system. These systems, such as cloud computing, always seek to optimise the end-user experience. In the process of optimisation, the system becomes more complex, innovating across multiple vectors so that its original purpose fulfils its maximum potential (i.e. servicing the most valuable and demanding use cases). The market leaders find the greatest value in being able to serve the broadest customer base across these vectors. The bottlenecks and inefficiencies that emerge in niche pockets are not immediately addressed by these providers, at least not until demand in that niche grows enough to serve a broader customer base.

In an effort to enter a saturated market and succeed in capturing value, one could pursue one of four avenues:

  • Direct competitor – offers what is already available (storage and compute power)
  • Augmentor – offers augmentations to current solutions (using the cloud to optimise other services)
  • Optimiser – optimises an inefficiency that exists in the current landscape (Snowflake)
  • Innovator – provides new technology or processes that achieve the same end goal 

Snowflake focused on becoming an optimiser, where the ideal solutions to pursue are the ones that encapsulate future states (standards) for the industry as a whole. That is to say, when Snowflake launched, the industry's base state did not offer, at scale, the efficiency of multi-cloud or customisable, modular compute and storage solutions. The inability to unbundle compute power from storage was inconsistent with the flexibility that cloud had become known for.

These gaps in the product offering (despite being obvious inhibitors of the end-user experience) would not be solved at the rate one might expect, but rather in a future state once the existing base state had been fully optimised, at which point closing them would serve as an additional edge over competitors. This is not to mention the incentive the big cloud players (AWS, Google etc.) have to deprioritise cross-cloud compatibility in order to retain users within their own ecosystems.

In this interim period between an inefficiency and the industry standard that eventually resolves it, there is a lucrative window of opportunity for optimisers like Snowflake to carve out a niche within what most would consider an oversaturated market.

Though this seems straightforward and easily accessible to new entrants, it is plausible that an apparent solution misses the mark entirely and exacerbates an existing inefficiency rather than solving it. Some ‘discordant’ solutions make the system harder to use in a misguided attempt to improve the user experience. A less detrimental, merely ‘redundant’ solution adds complexity without changing the user experience for better or worse.

But how do solutions avoid this and identify a future industry standard? The simplest method lies in identifying points of customisation that can be introduced to an overly complex system. In an industry’s early stages, it is near impossible to provide customisability to individual users without trading off the cost and scale that broader, less nuanced solutions have been able to achieve. When a solution addresses both customisability and cost efficiency in these early stages, it maps the trajectory to a future industry standard whose efficiencies the big players will eventually absorb as well. Snowflake did exactly this: it was the ‘efficient’ solution that made the net user experience of the system simpler and more coherent, setting the standard that the entire industry would eventually rise to meet.

There is, however, a caveat to pursuing such a path. Whilst being an optimiser is a lucrative opportunity, it will only create defensible value in the interim. As the industry absorbs the efficiencies as its standard, an optimiser transitions to becoming a direct competitor – a more difficult position to defend in a saturated market like cloud computing. The optimiser will need to continue identifying future industry standards to remain an optimiser. 

“Snowflake needs the funding as it needs to expand its product footprint to encompass more than just data warehousing. It should be focused less on niches and more on the entire data lifecycle including data ingest, engineering, database and AI,”

– Patrick Moorhead, founder and principal analyst at Moor Insights & Strategy, in analysis provided to TechCrunch on what Snowflake should do with its IPO fundraising.

Without identifying new optimisations that anticipate future industry standards, Snowflake quickly becomes dependent on a sales model more in line with that of direct competitors. This exposes Snowflake to the industry’s margin-squeeze risks, making it harder to sustain and grow the valuation established at its IPO. Where Snowflake built its current valuation off the back of its disruptive value as an optimiser over the last six years, maintaining that valuation will depend heavily on its sales model and continued evolution.
