Why AI Enterprise Leaders Must Trade Margin For Moat Through Services-Led Growth

By Staff Writer | Published: June 17, 2025 | Category: Innovation

In the AI revolution, winning companies are prioritizing services-led growth over pure product-led strategies to capture the enterprise market.

Joe Schmidt's recent article "Trading Margin for Moat: Why the Forward Deployed Engineer Is the Hottest Job in Startups" presents a provocative thesis for today's AI startup landscape: the most successful enterprise AI companies will embrace services-led growth rather than pursuing pure product-led growth (PLG) strategies. This represents a significant departure from the conventional Silicon Valley wisdom of the past decade, which has elevated PLG as the superior model due to its scalability and higher margins.

Schmidt's argument is both timely and insightful, offering a roadmap for AI startups navigating the unique challenges of enterprise adoption. However, the framework deserves deeper examination, particularly regarding its implications for different market segments, the evolving nature of AI implementation work, and how companies can effectively transition from services-heavy to product-dominant over time.

The False Promise of Pure PLG for Complex AI Applications

The allure of product-led growth is undeniable. Companies like Atlassian, Slack, and Figma achieved remarkable success by creating intuitive products that users could adopt with minimal friction. These PLG darlings have driven a decade-long obsession with self-service models that maximize gross margins and minimize customer acquisition costs.

However, Schmidt correctly identifies that during major platform shifts—like today's AI revolution—the playbook changes dramatically. The most valuable enterprises born during the cloud transition (Salesforce, ServiceNow, Workday) succeeded not through PLG but by investing heavily in implementation services. Despite their initially lower margins, these companies have achieved combined market capitalizations that dwarf their PLG counterparts.

This pattern appears to be repeating with enterprise AI. As Gartner noted in their 2023 "Market Guide for AI TRiSM," implementing AI in enterprise environments introduces unique challenges around trust, risk, and security management that cannot be addressed through self-service alone. Organizations require specialized expertise to integrate AI systems with existing workflows, connect to appropriate data sources, and ensure governance standards are met.

The complexity stems from a fundamental shift in what software is doing. As Schmidt puts it: "Software is no longer aiding the worker—software is the worker." When AI agents take on complex, end-to-end workflows, the implementation requirements become significantly more demanding than traditional SaaS tools.

The Hidden Value of Services-Led AI Implementation

While services-led growth initially results in lower gross margins, Schmidt makes a compelling case that this approach creates more sustainable competitive advantages for AI startups. By doing the hard work of implementation, companies build deeper customer relationships, embed their products in mission-critical workflows, and raise switching costs in ways that pure self-service rivals struggle to match.

This approach has precedent beyond just the enterprise software giants Schmidt cites. McKinsey's 2023 report on "The State of AI" found that organizations achieving the highest ROI from generative AI were those investing in customized implementations rather than off-the-shelf solutions. The report noted that 79% of organizations realizing substantial value from AI had dedicated implementation teams working alongside vendors.

However, Schmidt's argument overlooks some important nuances. The Salesforce, ServiceNow, and Workday examples all come from an era when cloud computing was far less mature than AI is today. These companies were replacing existing systems of record (like Siebel CRM) rather than creating entirely new categories. Today's AI startups often face the dual challenge of creating new solution categories while also handling complex implementations.

The Forward Deployed Engineer: A New Type of Services Professional

What makes Schmidt's analysis particularly valuable is his focus on the emerging role of the "forward deployed engineer" (FDE) in AI startups. Unlike traditional professional services roles, these hybrid positions combine technical implementation expertise with strategic advisory capabilities.

The FDE role represents an evolution from previous enterprise implementation models. Rather than simply configuring existing products to meet customer requirements, FDEs are actively involved in redesigning customer business processes, building custom integrations, and feeding what they learn in the field back into the core product.

This evolution addresses one of the key counterarguments to services-led growth: that it limits scalability. By developing reusable components, automated integration tools, and standardized implementation methodologies, forward deployed teams can achieve greater efficiency than traditional professional services.

Deloitte's 2023 report on "The AI Services Surge" supports this trend, noting that AI implementation services are growing at a 32% CAGR, significantly outpacing the broader IT services market. The report attributes this growth partly to the unique skill sets required for AI implementation, which combine technical expertise with deep domain knowledge.

A compelling example is Anthropic's enterprise strategy. Despite having one of the most advanced foundation models available (Claude), the company has invested heavily in building an enterprise services team to drive adoption. Their job listings for implementation engineers and solutions architects emphasize not just technical skills but also experience redesigning business processes—exactly the hybrid capability Schmidt describes.

The Implementation Paradox: Why Services Can Actually Increase Scale

One of Schmidt's most counterintuitive insights is that services-led growth can actually enable faster scaling for AI startups. By fielding dedicated implementation teams, companies can land complex enterprise accounts faster, surface the high-value use cases that should shape the product roadmap, and turn repeated implementation work into reusable components and tooling.

This approach has proven successful for companies like Scale AI, which built a substantial services organization to help enterprises implement computer vision solutions. Their hands-on implementation approach not only drove initial revenue but also helped them identify high-value use cases that informed product development.

However, Schmidt's analysis doesn't fully address the capital efficiency challenges of services-led growth. The approach requires significantly more upfront investment than pure PLG strategies, potentially limiting it to well-funded startups or those with strategic enterprise relationships. Forrester's research on AI implementation costs suggests that enterprises typically spend 2-3x the cost of AI software licenses on implementation services—a substantial amount that startups must factor into their pricing strategies.
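To make the pricing implication concrete, the sketch below works through a first-year cost estimate using the 2-3x services multiplier cited above. The $100k license fee and the midpoint 2.5x multiplier are illustrative assumptions, not figures from Forrester or Schmidt.

```python
# Toy first-year cost model for the implementation-services multiplier.
# All figures are hypothetical placeholders for illustration.

def first_year_cost(license_fee: float, services_multiplier: float) -> dict:
    """Estimate first-year spend given an annual license fee and the
    ratio of implementation-services cost to license cost."""
    services = license_fee * services_multiplier
    return {
        "license": license_fee,
        "services": services,
        "total": license_fee + services,
    }

# A $100k license at the midpoint 2.5x multiplier implies $250k of
# services and $350k of total first-year customer spend.
print(first_year_cost(100_000, 2.5))
```

Even this toy arithmetic shows why a startup that absorbs implementation work in-house must either price it in or accept a multi-hundred-thousand-dollar subsidy per enterprise account.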

Best Practices for Building Elite Services Teams

Schmidt provides valuable practical advice for AI startups building their first services teams. His eight recommendations form a solid foundation, but three deserve particular emphasis:

1. Automate Implementation Where Possible

Unlike previous generations of enterprise software, AI can actually help streamline its own implementation. Schmidt notes that best-in-class companies are "automating as much of the process as possible, including tasks like process mining, data pipeline automation, system integrations, and combing through API documentation."

This represents a significant advantage over the services-led approaches of Salesforce and ServiceNow, potentially allowing AI startups to achieve higher margins earlier in their lifecycle. For example, companies like Moveworks have developed automated discovery tools that can map customer support workflows without manual intervention, dramatically reducing implementation time.
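As a minimal sketch of what automating the "combing through API documentation" step might look like, the snippet below walks an OpenAPI-style spec and inventories candidate integration endpoints by tag. The inline spec is a hypothetical example, and real discovery tooling of the kind Moveworks builds is far more involved; this only illustrates the shape of the idea.

```python
# Minimal sketch: inventory integration endpoints from an OpenAPI-style
# spec. The spec below is a hypothetical inline example, not a real API.
from collections import defaultdict

spec = {
    "paths": {
        "/tickets": {
            "get": {"tags": ["support"], "summary": "List tickets"},
            "post": {"tags": ["support"], "summary": "Create a ticket"},
        },
        "/users/{id}": {
            "get": {"tags": ["identity"], "summary": "Fetch a user"},
        },
    }
}

def inventory_endpoints(spec: dict) -> dict:
    """Group endpoints by tag so an implementation team can see at a
    glance which integration surfaces a customer system exposes."""
    by_tag = defaultdict(list)
    for path, methods in spec["paths"].items():
        for method, operation in methods.items():
            for tag in operation.get("tags", ["untagged"]):
                by_tag[tag].append(f"{method.upper()} {path}")
    return dict(by_tag)

print(inventory_endpoints(spec))
# {'support': ['GET /tickets', 'POST /tickets'], 'identity': ['GET /users/{id}']}
```

The point is that machine-readable interface descriptions let forward deployed teams replace days of manual documentation review with a script, which is exactly the kind of leverage Schmidt argues separates modern services teams from their predecessors.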

2. Create Strong Product-Services Feedback Loops

Schmidt rightly emphasizes that "forward deployed teams—technical or not—are closest to the customer." The most successful AI startups have established formal mechanisms for channeling implementation insights back to product teams.

This tight feedback loop addresses one of the common criticisms of services-led growth: that it diverts resources from product development. When properly structured, implementation experiences can actually accelerate product evolution by revealing real-world usage patterns and pain points.

3. Design for Future Ecosystem Partnerships

While Schmidt advocates for startups to initially handle implementation themselves, he acknowledges the importance of eventually developing partner ecosystems. His advice to "leave a trail" through documentation is critical for this transition.

Companies that successfully navigate this path typically follow a three-phase approach:

  1. Direct implementation: Handle all services internally to build expertise and refine methodologies
  2. Partner enablement: Develop training programs, certification paths, and co-delivery models with selected partners
  3. Ecosystem scale: Shift to primarily partner-led implementations with internal teams focused on complex or strategic accounts

This progression allows companies to maintain quality control during critical early adopter phases while building toward more scalable models as products mature.

The Critical Nuances Schmidt Overlooks

While Schmidt's analysis is largely persuasive, several important considerations deserve deeper exploration:

1. The Market Segmentation Question

Schmidt presents services-led growth as a universal approach for AI startups, but the optimal strategy likely varies by market segment. Enterprise AI solutions targeting C-suite initiatives or mission-critical workflows clearly benefit from high-touch implementation approaches. However, departmental solutions or tools aimed at individual knowledge workers might still succeed with more PLG-oriented strategies.

A hybrid approach may be most effective for many AI startups—using PLG tactics to drive initial adoption and awareness while offering implementation services for customers seeking deeper integration. Companies like Jasper AI have followed this path, starting with a self-service product for individual content creators before building enterprise capabilities with implementation support.

2. The Cost Structure Challenge

Services-led growth requires significantly different cost structures and fundraising strategies than PLG approaches. Startups pursuing this path need to communicate clearly with investors about gross margin expectations and capital requirements. Schmidt touches on this briefly but doesn't fully address how startups should structure their financial models to support services investments.

Research from PitchBook indicates that AI startups with substantial services components typically raise 30-40% more capital than pure-product companies at similar revenue levels. This reflects both the higher operating costs and the longer path to profitability associated with services-led approaches.

3. The Talent Acquisition Imperative

Schmidt recommends hiring "curious hustlers" for implementation roles, but building a high-performing services organization requires more specialized talent strategies. Forward deployed engineers need a rare combination of technical depth, business acumen, and customer relationship skills.

Successful AI startups are addressing this challenge through targeted recruiting from management consulting firms, specialized training programs, and partnerships with universities to develop talent pipelines. Some are even creating rotational programs where early-career engineers split time between product development and implementation roles.

The Long-Term Evolution: From Services to Platform

Perhaps the most important aspect of Schmidt's thesis is how services-led growth eventually transitions to more scalable models. The path from Salesforce's early days (when services represented over 30% of revenue) to today (when services are less than 10% of revenue) provides a valuable roadmap.

AI startups should view implementation services not as a permanent business model but as an investment in building category leadership. The ultimate goal remains building highly scalable software platforms with strong gross margins. The services approach simply recognizes that during platform shifts, the fastest path to that destination often requires significant implementation support.

This transition typically follows predictable phases:

  1. Direct implementation: The company handles all implementation work directly
  2. Productized services: Standard implementation packages with fixed scopes and timelines
  3. Partner enablement: Building and certifying an ecosystem of implementation partners
  4. Self-service components: Developing tools that automate aspects of implementation
  5. Platform maturity: Implementation becomes primarily partner-led or self-service

Schmidt's analysis would benefit from more explicit discussion of this evolution and how companies can manage the transition without disrupting customer relationships or revenue growth.

Conclusion: A Necessary But Temporary Trade-Off

Joe Schmidt's analysis offers a valuable counterpoint to Silicon Valley's PLG orthodoxy. His central insight—that AI startups should trade some margin for moat through services-led growth—is well-supported by historical precedents and current market dynamics.

The forward deployed engineer is indeed becoming a critical role in enterprise AI startups, combining technical implementation expertise with strategic advisory capabilities. Companies that invest in building elite services organizations are likely to capture larger market share during this platform shift, even at the cost of lower initial margins.

However, this approach should be viewed as a strategic investment rather than a permanent business model. The ultimate goal remains building highly scalable software platforms with strong network effects and high gross margins. Services-led growth simply recognizes that during major platform shifts like the AI revolution, the fastest path to category leadership often requires significant implementation support.

For founders and investors in enterprise AI, Schmidt's advice provides a valuable roadmap: embrace the implementation complexity, hire forward deployed engineers who can bridge technical and business domains, build systems that automate services work where possible, and maintain strong feedback loops between implementation and product teams.

By following this path, today's AI startups have the opportunity to become tomorrow's Salesforce, ServiceNow, or Workday—category-defining platforms that achieve both massive scale and exceptional margins. The key is recognizing that getting there requires trading some margin for moat in the critical early stages of market development.

As the AI revolution continues to unfold, we can expect to see more startups embracing this services-led approach. Those that resist it in pursuit of PLG purity may find themselves with elegant products that lack the deep customer relationships and workflow integration necessary to become systems of work. In the battle for AI dominance, the forward deployed engineer may indeed be the secret weapon that separates the winners from the also-rans.
