Melanie Strait; Head of Investor Relations; DigitalOcean Holdings Inc
Paddy Srinivasan; Chief Executive Officer, Director; DigitalOcean Holdings Inc
Matt Steinfort; Chief Financial Officer; DigitalOcean LLC
Ladies and gentlemen, thank you for standing by. My name is Krista and I will be your conference operator today. At this time, I would like to welcome everyone to DigitalOcean's Fourth Quarter 2024 Earnings Conference Call. (Operator instructions)
I would now like to turn the conference over to Melanie Strait, Head of Investor Relations.
Melanie Strait
Good Morning and thank you all for joining us today to review DigitalOcean's fourth quarter and full year 2024 financial results. Joining me on the call today are Paddy Srinivasan, our Chief Executive Officer, and Matt Steinfort, our Chief Financial Officer.
Before we begin, let me remind you that certain statements made on the call today may be considered forward-looking statements which reflect management's best judgment based on currently available information. Our actual results may differ materially from those projected in these forward-looking statements, including our financial outlook.
I direct your attention to the risk factors contained in our filings with the SEC, including our most recent annual report on Form 10-K filed today, as well as those referenced in today's press release that is posted on our website. DigitalOcean expressly disclaims any obligation or undertaking to release publicly any updates or revisions to any forward-looking statements made today.
Additionally, non-GAAP financial measures will be discussed on this conference call, and reconciliations to the most directly comparable GAAP financial measures can be found in today's earnings press release, as well as in our investor presentation that outlines the financial discussion on today's call. A webcast of today's call is also available in the IR section of our website, and with that, I will turn the call over to Paddy.
Paddy Srinivasan
Thank you, Melanie. Good morning, everyone, and thank you for joining us today as we review our fourth quarter and full year 2024 results. We concluded the year with strong momentum and continued to successfully execute on the initiatives we laid out at the beginning of the year: strengthening our senior leadership team, significantly improving the pace of product innovation, augmenting our product-led sales motion with new strategic go-to-market enhancements, and continuing to accelerate the early success of our AI/ML platform, all of which together positioned us with momentum heading into 2025.
In my comments today, I will briefly recap our fourth quarter and full year results, reiterate our strategy and priorities, and share several product innovation and customer use cases across both core cloud and AI/ML that demonstrate the progress we are making against our priorities.
First, I will briefly summarize the fourth quarter and full year 2024 financial results. Revenue growth accelerated in the fourth quarter to 13% year-over-year, with revenue reaching $205 million, and one of our biggest growth levers, net dollar retention, improved to 99% from 96% in Q4 of the prior year.
Our efforts to improve growth and NDR in 2024 are evident in our Q4 results, as NDR within our traditional cloud services reached 100% in Q4 for the first time since June of 2023, on the back of rapid product roadmap execution and our investments in several strategic go-to-market motions. From these efforts, we saw increased expansion from our higher spend customers as we continue to focus both our product and go-to-market efforts on these top customers.
Our higher spend customers, which have traditionally included our Builder and Scaler cohorts, now represent 88% of total revenue and grew 16% year-over-year in Q4. We have now further disaggregated our Scalers and are disclosing our highest spend customer cohort, Scalers+, which are customers who were at a $100,000+ annual run rate during the quarter.
These Scalers+ customers, who are critical to our growth trajectory, increased in count by 17% year-over-year and represented 22% of total company revenue in Q4. We surpassed 500 of these customers for the first time in the company's history, and, more importantly, we saw a 37% year-over-year increase in revenue from Scalers+ customers, which is clear evidence of both the wallet share opportunity we have with these customers and our demonstrated ability to scale with them.
We also made material progress on our other major growth lever, our AI/ML platform, and closed the year with continued momentum, exceeding the 3 points of overall growth contribution from the AI/ML platform that we had guided to for 2024, with Q4 ARR growth just north of 160%, while staying true to our AI strategy and pursuing durable AI revenue.
We are very encouraged by the rapid growth and customer adoption of our newly launched AI products, and I'll talk about them later in my comments. On top of these encouraging growth signals, profitability remains strong, as we delivered healthy 42% adjusted EBITDA margins both in Q4 and for the full year, maintaining our cost discipline while we continue to invest to fuel future growth.
Looking forward, our 2025 guidance reflects this ongoing momentum, with full year revenue growth in the low to mid-teens and high-teens free cash flow margins, above our preliminary 15% to 17% indication. We continue to prioritize and rebalance our investments, driving improved operational efficiencies while shifting resources towards our top growth initiatives.
Our upcoming Atlanta data center is a good example of both of these priorities: the upfront investment in that facility, which will come online in Q1, not only provides us with incremental capacity for both AI and our core cloud offerings, but also gives us a lower cost facility as part of our longer-term data center optimization strategy.
Matt will walk you through more detail on our financial results and guidance later in the call. In my first year at DigitalOcean, we had several very clear priorities as we sought to accelerate growth: we needed to double down on product innovation to address key gaps in our core cloud platform, better address the needs of our larger customers, return net dollar retention to a tailwind rather than a headwind, and build the foundation for our longer-term AI growth strategy.
We made material progress on each of these objectives and continued to deliver on our promise of making complex cloud and AI technologies simple. We also made substantial progress on making our platform even more scalable, enhancing our ability to meet the needs of larger customers. And finally, we doubled down on our heritage of being the most approachable public cloud provider by continuing to invest in support of open source AI models and by hosting our developer conference, Deploy, in January.
Let me now give you some updates on the core cloud computing platform. In Q4, we continued to accelerate the pace of innovation as we released 49 new products and features throughout the quarter, which is more than 4 times what we released in Q4 of the previous year.
Most of these products and feature enhancements directly address the needs of our largest spend customers as we continue to remove blockers and implement the capabilities that they need to further scale on our platform. I will highlight several of these product releases that we have made to help our customers grow their businesses using DigitalOcean.
Given that our larger customers run significant global workloads, they need the ability to securely connect different parts of their network so that systems and applications in separate environments, across data centers in different countries, can communicate without using the public internet, improving speed and efficiency while keeping their data secure. To address this need, in Q4 we announced Virtual Private Cloud Peering, or VPC Peering for short, which is now generally available to all our customers.
VPC peering enables customers to connect their separate private clouds and establish seamless communication between resources hosted in those clouds using private IP addresses, keeping their information safe by traversing the DigitalOcean backbone rather than the public internet.
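For illustration, a peering between two VPCs can be requested through DigitalOcean's public REST API. This is a minimal sketch, not an excerpt from the call: the API token and VPC IDs are placeholders, and the assumed /v2/vpc_peerings endpoint shape should be checked against the current API reference.

```python
# Minimal sketch: creating a peering between two VPCs via the DigitalOcean REST API.
# Assumes a POST /v2/vpc_peerings endpoint; the token and VPC IDs are placeholders.
import os
import requests

API_TOKEN = os.environ["DIGITALOCEAN_TOKEN"]  # personal access token with write scope
headers = {"Authorization": f"Bearer {API_TOKEN}", "Content-Type": "application/json"}

payload = {
    "name": "prod-to-analytics",                 # friendly name for the peering
    "vpc_ids": [
        "c140286f-e6ce-4131-8b7b-df4590ce8d6a",  # placeholder VPC ID (region A)
        "994a2735-dc84-11e8-80bc-3cfdfea9fba1",  # placeholder VPC ID (region B)
    ],
}

resp = requests.post("https://api.digitalocean.com/v2/vpc_peerings",
                     headers=headers, json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # new peering's ID and status (e.g., provisioning/active)
```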
Our larger customers also need the ability to distribute traffic across resources while still keeping it within a secure private network. To support this, we introduced a new feature called Internal Load Balancer, which enhances security by ensuring that internal workloads remain isolated from the public internet, making it ideal for applications that require highly scalable private communications.
We also have several large customers with volatile traffic patterns that need mechanisms to handle massive spikes in volume smoothly while still optimizing costs by scaling down when demand is lower. To address this, we announced the general availability of Droplet autoscale pools, which ensure that the right resources are available to handle application workloads during surges in traffic, scaling up automatically to meet demand while also helping minimize costs by scaling back down when the traffic surge is over. We also introduced flexible management capabilities in our App Platform, which is our platform-as-a-service offering, for more granular lifecycle management, including archive and restore functionality and a maintenance mode across the application's full lifecycle.
Next, customers of Spaces, our fast-growing S3-compatible object storage service, have long asked for the ability to grant granular permissions to different users or teams without exposing full account-wide credentials. In response, we launched per-bucket access keys for Spaces.
This highly sought after feature provides customers with identity-based, bucket-level control over access permissions, helping enhance their data security and ultimately simplifying management overhead.
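For illustration, because Spaces speaks the S3 protocol, a per-bucket key can be used from standard S3 tooling such as boto3. This is a minimal sketch, not from the call itself: the key pair and bucket name are placeholders for a key scoped to a single bucket.

```python
# Minimal sketch: using a Spaces per-bucket access key through the S3-compatible API.
# boto3 works against Spaces by pointing endpoint_url at a Spaces region.
import boto3

session = boto3.session.Session()
client = session.client(
    "s3",
    region_name="nyc3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id="DO_SCOPED_KEY_ID",         # placeholder per-bucket key
    aws_secret_access_key="DO_SCOPED_KEY_SECRET",  # placeholder secret
)

# A scoped key can only touch the bucket it was granted; other buckets are denied.
client.put_object(Bucket="team-a-assets", Key="reports/q4.pdf", Body=b"...")
objects = client.list_objects_v2(Bucket="team-a-assets").get("Contents", [])
print([o["Key"] for o in objects])
```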
Complementing this accelerated delivery of sophisticated product capabilities was one of our new go-to-market motions, in which we bolstered our engagement with our top 1,500 customers. By taking these new innovations to our customers and tightly orchestrating a closed loop between the various DigitalOcean teams and our top customers, this motion increased awareness and adoption of our new product capabilities, facilitated the migration of cloud workloads from other clouds to DO, and served as a catalyst for both our improved NDR and the faster growth rate of our Scalers+ customers.
Our higher spend customers have quickly started adopting many of the features I just described that we released over the back half of 2024. Over 50% of our top 100 customers have adopted at least one of the features we released in Q3, and we anticipate similar adoption levels for our Q4 features over time.
Together, the breadth of these new features and the pace at which we are executing our product roadmap are enabling our higher spend customers to grow on DigitalOcean and enabling us to win more of their workloads that today reside on other hyperscaler clouds.
As we discussed last quarter, we've been focused on helping our customers seamlessly migrate more workloads to us and scale efficiently on DO. One example is a customer called Digital Platform, a strategic software solutions development company that was experiencing high cost and latency issues with the cloud they were leveraging, which was impacting their application performance.
Through our customer facing teams, we were able to fully migrate their workloads to DigitalOcean, leveraging our optimized database infrastructure to improve their performance while providing them with substantial cost savings.
Another example is Hoodoo, a provider of enhanced IT documentation with features and tools made for assisting managed service providers and IT departments. Hoodoo has been a DO customer since 2019, and they continue to scale and grow on our platform.
Hoodoo was an early adopter of our Kubernetes, managed databases, SnapShooter, and premium support products. As a result of the ease of use of our platform, they've been able to focus on their own scalability and have grown as a business over 870% since 2021.
Another example of our customers' ability to scale with this motion is Moments, a fitness and wellness platform that manages bookings, communications, and memberships. Moments needed a larger instance to house their database to meet the requirements of their rapidly growing customer base and continue leveraging our platform.
The installation team was able to provide architectural guidance, crafting a solution with existing DO products while delivering a new product they requested: a 48 vCPU storage-optimized Droplet with scalable storage. Let me now switch gears and give you a quick update on our AI initiatives. We remain committed to, and are executing well against, the AI strategy we articulated last year.
In that context, I'm very encouraged by the emerging innovations in this space, like DeepSeek, that drive down the cost of AI adoption and improve the quality of open source models, which will ultimately enable more customers to use AI. We see innovations such as DeepSeek, and even reports of some hyperscalers potentially moderating their data center commitments, as reinforcing our conviction that, while a lot of the action to date has been at the infrastructure layer, innovation and value creation will occur at the platform and application layers, where we are highly differentiated and well positioned to grow as we democratize AI for our customers. Our prudent approach to AI investment also allows us to ramp investment where we see customer demand; as a case in point, we are increasing the allocation of our GPU capacity to GPU Droplets, where we quickly ran out of capacity after launching at the beginning of Q4.
Each of these three layers (infrastructure, platform, and AI applications) has its own purpose and very distinct customer targets. And although most of the action is still at the infrastructure layer, we are now starting to see more narratives in the market around the higher layers of the stack, in platforms and agentic applications, which is good validation of the AI strategy we laid out last year.
We've been making excellent progress enhancing our AI infrastructure offerings, as well as innovating at the GenAI platform and agentic application layers, as we build towards our goal of democratizing AI by enabling our customers to quickly experiment and build AI into their real world applications.
On the infrastructure side, we are seeing strong adoption of GPU Droplets, which we made generally available to all our customers in October. As a reminder, GPU Droplets allow DigitalOcean customers to get on-demand and fractional access to GPUs in a self-service way in just a few minutes, vastly simplifying a very complex series of steps.
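For illustration, a GPU Droplet can be provisioned on demand through the same Droplet create endpoint used for regular Droplets. This is a minimal sketch under assumptions: the region, size, and image slugs below are placeholders to check against the current size and image lists.

```python
# Minimal sketch: provisioning a GPU Droplet via the standard POST /v2/droplets call.
# The size and image slugs are illustrative assumptions, not confirmed values.
import os
import requests

headers = {"Authorization": f"Bearer {os.environ['DIGITALOCEAN_TOKEN']}"}

droplet = {
    "name": "inference-worker-01",
    "region": "tor1",              # placeholder region with GPU capacity
    "size": "gpu-h100x1-80gb",     # assumed single-GPU Droplet size slug
    "image": "gpu-h100x1-base",    # assumed GPU-ready base image slug
    "ssh_keys": [],                # add SSH key IDs or fingerprints here
}

resp = requests.post("https://api.digitalocean.com/v2/droplets",
                     headers=headers, json=droplet, timeout=30)
resp.raise_for_status()
created = resp.json()["droplet"]
print(created["id"], created["status"])  # Droplet ID and provisioning status
```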
Let me now give you a couple of real world examples. Prodia, a company specializing in integrating generative AI into their own applications, leverages DigitalOcean's GPU infrastructure globally to efficiently manage their own products. Prodia accelerates generation speeds, offering an easy-to-use API for AI-powered image generation.
Another example of an AI/ML infrastructure company is Commodity Weather Group, a company that provides advanced weather model forecasts to their clients and runs AI-based weather models to enhance decision making with additional insights. They also leverage DigitalOcean's AI infrastructure for its scalability and ease of use.
These are just a few examples of how our customers are leveraging our infrastructure to develop and sustain data-intensive software that meets the needs of their own customers, all while leveraging the simplicity of DigitalOcean's AI/ML infrastructure. Moving up the stack, I'm very excited about our GenAI platform, which is now in public beta. The DigitalOcean GenAI platform is one of the simplest platforms to create, deploy, and integrate AI agents into real world applications.
Stepping back for a second, AI agents are software applications designed to autonomously perform multi-step tasks that involve reasoning and decision making, leveraging AI and ML. Our new GenAI platform gives customers everything they need to build AI into their own applications without the need for advanced expertise in AI or machine learning.
Customers can easily and quickly build AI agents on DO's infrastructure by adding their data to pre-trained third party GenAI models, and can seamlessly integrate those agents into their own applications via secure endpoints or chatbot plugins.
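For illustration, once an agent is deployed it can be called from application code over its secure endpoint. This minimal sketch assumes an OpenAI-style chat completions route behind that endpoint; the endpoint URL and access key are placeholders, not real values.

```python
# Minimal sketch: calling a deployed GenAI platform agent from an application.
# The route shape is an assumption; the endpoint URL and key are placeholders.
import os
import requests

AGENT_ENDPOINT = "https://my-support-agent.example.agents.do-ai.run"  # placeholder URL
AGENT_KEY = os.environ["AGENT_ACCESS_KEY"]                            # placeholder key

resp = requests.post(
    f"{AGENT_ENDPOINT}/api/v1/chat/completions",  # assumed OpenAI-compatible route
    headers={"Authorization": f"Bearer {AGENT_KEY}"},
    json={"messages": [{"role": "user", "content": "Where is my January invoice?"}]},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])  # the agent's answer
```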
In the four weeks since we made the GenAI platform beta public, we have seen well over 1,000 agents created on the platform, with the most encouraging fact being that roughly 90% of these agents were created by existing DO customers. This is further validation of our belief that our typical DO customers want to build AI into their software stack and are willing to do so if we make it very simple and integrated with the rest of our cloud platform.
In the latest edition of DigitalOcean Currents, a customer trends report that we published earlier this month, we found that almost 80% of our target customers are interested in leveraging AI, but over 70% of them said that cost and a lack of expertise are the two major impediments to AI adoption.
Our GenAI platform makes it very simple by abstracting away most of the complexity associated with creating AI agents, offering templatized agents, click-through wizards, and so on, and by providing easy access to a variety of open source models, including Llama, DeepSeek, and Mistral.
At the application layer, we introduced Cloudways Copilot in public beta, a suite of AI solutions designed to bring intelligent managed hosting to small and medium businesses, starting with AI-powered diagnostics that give customers recommendations and alerts to fix issues before they become problems.
This helps our customers automate tasks, monitor performance, and gain insights to keep their websites up and running smoothly. One such example is ClickIT, a web design and development agency that is already leveraging the newly announced Cloudways Copilot. ClickIT is seeing a 4x reduction in the time spent manually handling issues and taking care of web servers.
We also started using GenAI agents in our own internal DigitalOcean cloud operations: for a variety of operational incidents, GenAI agents are invoked to analyze our service logs and determine root causes. This has resulted in a 39% improvement in our time to resolution and is one of the most sophisticated uses of GenAI agents in the industry today.
We're building these agents and using them not just to improve our own operations, but also to deeply understand the pain points and complexities of building AI agents so that we can incorporate these learnings into our GenAI platform and make it even simpler for our customers to use. Beyond our product and customer progress, another recent highlight was the Deploy conference we hosted in January in Austin, Texas.
This event brought together customers, partners, and DigitalOcean employees, amplifying our presence in the developer and AI/ML space and building on our strong position as the most approachable public cloud. At Deploy, we introduced a slew of new product capabilities, launched an AI variant of our popular startup incubator program, Hatch, and hosted many of our technology and channel partners.
At Deploy, we also launched a new migrations program designed to seamlessly transition cloud workloads from the hyperscalers to DigitalOcean. This program will eliminate migration-related complexity, deliver lower operational costs, and provide seamless technology assistance through our partner ecosystem and a newly formed DigitalOcean team of solution architects skilled at migrating cloud workloads.
To recap, as I close my prepared remarks: we entered 2025 with increasing momentum. In Q4 alone, we released more than 4 times as many products and features as we did in the same quarter of the previous year, increased net dollar retention to 99%, grew revenue 13% year-over-year, and delivered an 18% adjusted free cash flow margin. Our focused efforts on our higher-end customers and our continued traction in AI drove quarterly revenue from our 500-plus Scalers+ customers, representing 22% of our total revenue, to grow 37% year-over-year.
This shows clear progress on our strategy and builds on our leading position as the simple, scalable, and approachable cloud. Before I turn the call over to Matt, I'm very excited for our upcoming Investor Day, which we will be hosting at the New York Stock Exchange on April fourth starting at 9 AM Eastern time.
During this investor day, we will share more on our longer term strategy, including more details on our progress and key metrics, and we'll share a view of our long term financial outlook. I will now hand it over to Matt, who will provide some additional details on our recent financial results and our outlook for Q1 and full year 2025.