The enterprise-level Web Agent is here to let AI "do the heavy lifting" and earn big money for businesses.

Have you ever thought that the internet has actually exceeded human processing capabilities? Imagine a company needing to track price changes, inventory updates, and competitor dynamics across thousands of websites every day; this data changes every minute, and traditional browsers and manual operations simply cannot cope. When I saw the news that TinyFish had just completed a $47 million Series A funding round, I realized that this is not just another financing round, but the beginning of a whole new era: the era of enterprise-level Web Agents. I have long been thinking about the commercialization of AI agents, but TinyFish's approach shows me a more realistic and disruptive direction: AI agents that do not merely simulate a human browsing the web, but execute complex business workflows at the scale, reliability, and compliance level enterprises require.

The financing round led by ICONIQ Capital attracted participation from well-known investment institutions such as USVP, Mango Capital, MongoDB Ventures, and Sandberg Bernthal Venture Partners. Notably, Sandberg Bernthal Venture Partners is a fund co-founded by former Meta executive Sheryl Sandberg, and her involvement adds significant strategic value to the project. However, what truly interests me is that TinyFish has already been deployed at scale in the production environments of companies like Google, DoorDash, and ClassPass, running millions of operations each month. This means they have crossed the chasm from demo to real commercial value, which is an extremely rare achievement in the AI agent space.

The background of the founding team is also noteworthy, reflecting a combination of technical depth and business insight. CEO Sudheesh Nair was the president of Nutanix and has extensive experience in enterprise product development and go-to-market. Co-founder Shuhao Zhang is a former Meta engineer who worked on GraphQL and has a strong technical foundation in building large-scale systems. Another co-founder, Keith Zhai, is a former senior journalist at The Wall Street Journal, whose media background brings a unique perspective on information acquisition and analysis to the team. This combination of technology, business, and media experience gives them a rare vantage point from which to understand and address the real needs of enterprises in web automation. As Shuhao mentioned in an interview, "Marketing and positioning are indeed the most difficult parts" when building a company, and Keith's media background fills exactly that gap.

Technical Evolution from AgentQL to Enterprise-level Web Agent

Understanding the development history of TinyFish has deepened my appreciation of their technological accumulation. The company has actually been working quietly for 20 months and has only just emerged from stealth mode. Their first product, AgentQL, laid an important technical foundation for enterprise-level Web Agents. AgentQL addresses a problem that has long troubled developers: how to enable AI agents to accurately identify and manipulate web elements.

Shuhao Zhang observed an important trend while developing AgentQL: "Twenty months ago, I did see a trend and shift towards a more agentic world. At that time, it was still GPT-3.5, but we really saw the reasoning ability and the ability to handle complex tasks. So what was really missing was a more AI-native way for AI agents to access the web." This insight is very crucial. Traditional web automation tools rely on CSS selectors or XPath, which often fail when faced with dynamically generated class names and constantly changing page structures. AgentQL allows developers to describe page elements using natural language, such as "red submit button" or "card with a specific content title."
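To see why a selector tied to markup is so fragile, here is a toy sketch using only the Python standard library. This is not AgentQL's implementation, just an illustration of the failure mode: a lookup keyed to a framework-generated class name breaks between two builds of the same page, while matching on what a human would describe survives.

```python
# Toy sketch: why generated class names break selectors, while a
# description-based lookup survives. NOT AgentQL's implementation.
from html.parser import HTMLParser

class ButtonCollector(HTMLParser):
    """Collects (attributes, inner text) for every <button> on a page."""
    def __init__(self):
        super().__init__()
        self.buttons = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag == "button":
            self._current = (dict(attrs), [])

    def handle_data(self, data):
        if self._current is not None:
            self._current[1].append(data)

    def handle_endtag(self, tag):
        if tag == "button" and self._current is not None:
            attrs, chunks = self._current
            self.buttons.append((attrs, "".join(chunks).strip()))
            self._current = None

def find_by_class(html, class_name):
    """Brittle: keyed to a framework-generated class name."""
    parser = ButtonCollector()
    parser.feed(html)
    return [text for attrs, text in parser.buttons
            if attrs.get("class") == class_name]

def find_by_description(html, description):
    """Robust: matches the element a human would describe ("submit button")."""
    parser = ButtonCollector()
    parser.feed(html)
    return [text for attrs, text in parser.buttons
            if description.lower() in text.lower()]

# The same page rendered by two builds: the hashed class name changed.
build_a = '<button class="btn-x7f2a">Submit order</button>'
build_b = '<button class="btn-q91cd">Submit order</button>'

print(find_by_class(build_a, "btn-x7f2a"))     # ['Submit order']
print(find_by_class(build_b, "btn-x7f2a"))     # [] -- the selector silently broke
print(find_by_description(build_b, "submit"))  # ['Submit order']
```

In practice AgentQL performs this matching with a language model over the DOM rather than simple text containment; the point is only that the query is anchored to meaning instead of volatile markup.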

I particularly appreciate AgentQL's design decisions in its technical architecture. It chose to analyze pages based on the DOM rather than screenshots, a decision backed by profound technical considerations. Shuhao explained: "The training dataset for language models contains far more data on HTML and the DOM than on images. Moreover, screenshots have physical limitations; when you have long pages, horizontal and vertical scrolling, and content hidden behind collapsible panels, the screenshot approach has a whole set of constraints." This technical choice reflects the team's deep understanding of the boundaries of AI model capabilities.

The success of AgentQL has laid an important foundation for TinyFish. It has been integrated into major AI frameworks such as LangChain, LlamaIndex, and LFlow, serving hundreds of millions of API calls. More importantly, its innovations in Chrome extensions and developer tools allow developers to validate the accuracy of queries before actual deployment. This "what you see is what you get" development experience significantly lowers the usage threshold and enhances the reliability of the final deployment. From AgentQL to enterprise-level Web Agent, TinyFish demonstrates the technological evolution path from underlying tools to complete solutions.

Why Traditional Web Automation Methods Are Outdated

I have observed a significant trend over the past few years: the internet is becoming increasingly complex, while our methods of accessing and processing internet data have remained stagnant for a decade. TinyFish made a profound observation in its latest blog: "The story of the internet has always been a story of scale." From the initial few static pages to millions of websites searchable through Yahoo and Google, to e-commerce and social platforms, ultimately the entire enterprise has migrated online. But the problem is that the rate of growth of the internet far exceeds our ability to handle it.

The current internet has become a maze. Information is hidden behind login screens, content changes with scripts and personalization, and prices are adjusted every minute. As TinyFish said: "The internet has become a maze that cannot be grasped on a human scale." Traditional web scraping tools and automated scripts are struggling to cope with the modern web environment. Websites use dynamic loading, anti-scraping measures, and personalized content, all of which frequently render traditional methods ineffective.

Moreover, the requirements for accuracy, stability, and compliance in enterprise-level applications far exceed those of individual users. TinyFish founder Sudheesh Nair has a deep understanding of this issue. He pointed out: "Today's web spans thousands of platforms and billions of pages, but companies cannot fully tap into its potential because the work required to create business value at scale is complex, manual, and limited by human capability." This observation is accurate. I have seen too many companies spend enormous manpower manually gathering competitor information, tracking market prices, and monitoring inventory changes; these tasks are not only inefficient but also prone to errors.

TinyFish emphasized the complexity of the modern network environment in its technical sharing. Shuhao Zhang mentioned: "The class names in modern network frameworks are dynamically generated. If you refresh the page, some websites change everything. The content is also dynamic. So nth-child will change the order when you have new banners, and the carousel will change." This technical detail explains why traditional CSS selectors and XPath methods are no longer reliable.
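The fragility is easy to reproduce. A minimal sketch (not TinyFish code), with the page structure reduced to a list of labeled children: inserting a promo banner shifts every positional index, so an nth-child-style selector now targets the wrong element, while a semantic lookup still finds the right one.

```python
# Toy demonstration of the nth-child failure mode Shuhao describes.
def nth_child(children, n):
    """Positional lookup, 1-indexed like CSS :nth-child()."""
    return children[n - 1]

def by_meaning(children, predicate):
    """Semantic lookup: find the element by what it is, not where it sits."""
    return next(child for child in children if predicate(child))

page_v1 = ["header", "search box", "product list", "footer"]
page_v2 = ["header", "promo banner", "search box", "product list", "footer"]

print(nth_child(page_v1, 2))  # 'search box'
print(nth_child(page_v2, 2))  # 'promo banner' -- the script acts on the wrong element
print(by_meaning(page_v2, lambda c: "search" in c))  # 'search box'
```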

I particularly agree with a point made by TinyFish: the internet has surpassed the capabilities of browsers. They wrote in their blog: "Creating new opportunities and revenue depends on thousands of workflows running across thousands of websites, with billions of changes every day. No human analyst can keep up. Consumer tools, those agents built for individuals one browser at a time, were never designed to bear this weight." What modern businesses need is not better browsers, but intelligent systems that can understand and adapt to the complexities of the web.

The Enterprise-level Web Agent Revolution of TinyFish

The reason TinyFish's approach caught my attention is that they drew a clear line between consumer-grade and enterprise-grade Web Agents from the very beginning. As they pointed out in their analysis: "Enterprise-grade Web Agents are fundamentally different from consumer-grade browser agents." Consumer-grade agents excel at individual tasks, such as arranging travel itineraries or providing personalized recommendations based on browsing history; these are simple one-to-one tasks. Enterprise-grade Web Agents, by contrast, need to automate complex business workflows that must be executed thousands or even millions of times, and they cannot fail.

Their enterprise-level Web Agents possess several key features that have shown me real technological breakthroughs. First is the results-oriented design; these agents are not meant to showcase technical capabilities but to achieve measurable business outcomes such as revenue growth, cost savings, or market share increase. Second is the complete workflow coverage; they can handle every stage of the entire process, not just isolated tasks. Third is enterprise-level reliability and compliance, which means they can meet the requirements of large global organizations in terms of security, governance, and uptime.

What impressed me the most is their "planetary-scale" capability. TinyFish's Web Agent can coordinate actions across thousands of platforms simultaneously, a scale that traditional automation tools cannot achieve. Imagine an agent being able to monitor price changes on thousands of e-commerce websites worldwide at the same time, analyze competitors' promotional strategies in real-time, and integrate this information into actionable business insights. This is not just a technological advancement; it is a fundamental transformation in the way business intelligence is collected.

From a technical implementation perspective, TinyFish employs advanced reasoning models to understand and adapt to changes in the web environment. Their system uses state-of-the-art AI models for reasoning and exploration, and then encodes this knowledge for high-speed, deterministic execution at scale. This approach combines the flexibility of AI with the reliability of traditional automation. More importantly, their infrastructure is capable of learning, adapting, and scaling, which means the system becomes smarter and more reliable as it is used.

I particularly appreciate TinyFish's attention to security and compliance. Enterprise applications cannot shrug off the risk of data breaches or compliance violations the way consumer products sometimes can. TinyFish's Web Agent comes with an enterprise-grade security posture and governance framework, ensuring that all operations have complete logging and audit trails. As they emphasize: "The TinyFish agent is specifically designed to operate at the scale, reliability, and compliance required by enterprises." This deep understanding of enterprise needs is a key reason for their successful deployment at Fortune 500 companies.

In the design of the technical architecture, TinyFish demonstrates a deep understanding of modern AI technology. Shuhao Zhang noted in a technical talk: "The advancements in generative AI and newly released reasoning models have made the web more complex, making it harder for traditional tools to access." However, it is precisely these reasoning models that give TinyFish's enterprise Web Agent the ability to understand and tackle today's web complexity, allowing companies to scale their operations securely and turn complexity into a business advantage.

Real Business Cases Prove Value

No matter how good the theory is, it needs practical validation, and TinyFish's performance in this regard has impressed me. They have already achieved large-scale deployments among leading enterprises in multiple industries, and these cases showcase the real commercial value of enterprise-level Web Agents. Currently, TinyFish operates hundreds of thousands of enterprise-level Web Agents, executing millions of operations each month for Fortune 500 companies and high-growth businesses. This scale itself indicates the maturity of the technology and the authenticity of market demand.

In the hotel industry, TinyFish's Web Agent developed for Google addresses a long-standing technical challenge. Thousands of hotels in Japan use outdated booking systems that cannot directly integrate with Google's search aggregator. Traditional solutions require these hotels to upgrade their entire IT systems, which is costly and difficult to implement. TinyFish's Web Agent can automatically aggregate the inventory information of these hotels, allowing consumers to find and book these rooms through Google Hotel Search without the hotels needing to make any infrastructure updates. This case perfectly demonstrates how an enterprise-level Web Agent can create new business value without disrupting existing systems.

In the transportation sector, a leading ride-hailing company uses TinyFish to collect millions of pricing variables every month, achieving near real-time dynamic market adjustments. This capability allows them to quickly respond to competitors' pricing strategies, optimize their own pricing models, and ultimately enhance market competitiveness and profitability. Imagine how much human resources would be needed to collect and analyze this data manually, and how difficult it would be to ensure the timeliness and accuracy of the data.

The application in the e-commerce sector further demonstrates the powerful capabilities of Web Agent. Global brands can simultaneously track competitor pricing across thousands of retail websites, monitor inventory changes, and capture promotional data. This real-time market intelligence enables businesses to quickly adjust their pricing strategies, uncover new market opportunities, and avoid missing critical business information. More importantly, the collection and analysis of this data are fully automated, significantly reducing operational costs.

The customer coverage of TinyFish is also continuously expanding. In addition to tech giants like Google and DoorDash, growth companies such as ClassPass are also using their services. This indicates that the value of enterprise-level Web Agents is not limited to large corporations; medium-sized companies can also benefit from it. Especially in the retail and tourism industries, TinyFish focuses on dynamic price monitoring as a core application scenario, helping businesses track competitors' prices, promotions, delivery times, and inventory levels in real-time.

The evaluation by Abhi Shah, Director of Data Science at DoorDash, is particularly persuasive: "TinyFish's platform manages the complexity of web interactions at scale. In addition to DoorDash, TinyFish powers high-stakes workflows for hotels, e-commerce, and marketplace platforms, helping businesses capture changing web data, act faster, and turn continuous change into measurable results." Such recognition from actual users is more convincing than any technical demonstration.

From a business model perspective, TinyFish's success lies in their focus on addressing the actual pain points of enterprises, rather than pursuing the novelty of technology. Traditionally, these tasks were handled by large offshore teams performing manual data entry, or by custom software scripts, which often failed when website design changes occurred. TinyFish offers a more robust and scalable solution, leveraging an AI-driven approach to cope with the rapid changes in the online environment.

Why Investors Are Optimistic About This Direction

The decision by ICONIQ Capital to lead this round of financing has made me think a lot. As a top-tier VC focused on growth-stage investments, ICONIQ's investments often have deep strategic considerations. Their partner Amit Agarwal mentioned a key point when explaining the investment decision: TinyFish has already achieved product deployment among large-scale customers, who themselves have sufficient development resources to build similar systems. "They have operationalized and productized it, deploying it on a large scale for two large clients who have all the internal development resources to build such things themselves," Agarwal said.

This observation is very important. Technology companies like Google and DoorDash are fully capable of developing their own network automation tools, but they choose to use TinyFish's solutions, which indicates that the value provided by TinyFish goes beyond simple technical implementation. I believe this value is mainly reflected in three aspects: specialization, economies of scale, and continuous innovation capability. The specialization is reflected in TinyFish's deep focus on the enterprise Web Agent field. They are not trying to create a general AI platform, but rather specifically address the particular issues enterprises face in network automation.

ICONIQ's investment team has highly praised TinyFish's technological capabilities. Amit Agarwal stated: "TinyFish's innovative enterprise Web Agent can replicate human behavior on the internet at scale, possessing the resilience and reliability required by businesses. This is laying the foundation for a significant transformation in how enterprises and applications interact with the web, collect intelligence, and automate workflows. No one else has addressed this issue; TinyFish has delivered results in today's customer production environments."

The scale effect comes from their infrastructure investment. Building infrastructure that can support hundreds of thousands of Web Agents running simultaneously requires a huge technical investment, which is uneconomical for most businesses. TinyFish has built such infrastructure and possesses "planetary-scale" processing capabilities, creating significant competitive barriers for them.

Continuous innovation capability may be the most important factor. The network environment is constantly changing, with new anti-crawling technologies, new website architectures, and new security measures emerging one after another. The TinyFish team specializes in tackling these challenges, and their solutions evolve as the network environment changes. For enterprise clients, this means they can focus on their core business without worrying about the maintenance and updates of network automation tools.

From a market timing perspective, investors believe that now is the critical moment for the explosion of enterprise-grade Web Agents. The AI agent sector is experiencing a gold rush, with large tech companies and startups competing to leverage the shift from static large language models to dynamic agents capable of executing complex multi-step tasks. TinyFish has already established a technological lead and customer base at this crucial moment, creating favorable conditions for them to occupy a strong position in the rapidly growing market.

The new financing gives TinyFish three to four years of runway, allowing them to continue investing in product development and expand go-to-market operations. CEO Sudheesh Nair has stated clearly that the goal is not just to help businesses save costs, but to "help businesses make more money." This philosophy of creating incremental value rather than merely optimizing costs is a key reason investors are optimistic.

Key Breakthroughs and Future Challenges of AI Agent Technology

From a technical perspective, the success of TinyFish is inseparable from the recent breakthroughs in large language models and reasoning capabilities. Past automation tools relied on hard-coded rules and scripts, which could not adapt to the dynamic changes of the online environment. However, current AI models possess human-like reasoning abilities, allowing them to understand web structures, adapt to interface changes, and handle exceptions. But as TinyFish has observed: "The advancements in generative AI and newly released reasoning models have made the web more complex, making it harder for traditional tools to access."

I am particularly interested in how TinyFish addresses the core challenges AI agents face in enterprise environments. The first is accuracy. Consumer-grade applications can tolerate occasional errors, but enterprise-grade applications have very high accuracy requirements: a pricing error or data omission could lead to significant business losses. TinyFish ensures precise and consistent operations through its proprietary infrastructure, which can learn and adapt while maintaining enterprise-level reliability standards.

The issue of scalability is equally critical. Individual users may only need to manage a few websites at a time, but enterprise clients need to monitor thousands of platforms simultaneously. This is not just an increase in quantity, but a qualitative change. Large-scale deployment requires consideration of complex issues such as resource management, error handling, and load balancing. TinyFish's "planetary-scale" capability demonstrates their technical accumulation in this area. Their system uses advanced AI models for reasoning and exploration, and then encodes this knowledge to achieve high-speed, deterministic large-scale execution.
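The "reason and explore, then encode for deterministic execution" approach described above can be sketched as a plan cache: an expensive reasoning step resolves a workflow against a fingerprint of the page once, and subsequent runs with the same fingerprint replay the cached steps. This is purely an illustrative sketch under my own assumptions; `plan_with_model` is a stand-in for a real reasoning-model call, and none of these names reflect TinyFish's actual system.

```python
# Hedged sketch of "reason once, replay deterministically" (illustrative only).
import hashlib

def page_signature(dom):
    """Naive structural fingerprint of a page, used as a cache key."""
    return hashlib.sha256(dom.encode()).hexdigest()[:16]

def plan_with_model(dom, goal):
    """Stand-in for a slow reasoning-model call that explores the page."""
    return [{"action": "click", "target": f"element for: {goal}"}]

class AgentRuntime:
    def __init__(self):
        self.plan_cache = {}
        self.model_calls = 0

    def run(self, dom, goal):
        key = page_signature(dom) + ":" + goal
        if key not in self.plan_cache:
            # Slow path: reason about the page once, then encode the result.
            self.model_calls += 1
            self.plan_cache[key] = plan_with_model(dom, goal)
        # Fast path: deterministic replay of the encoded plan.
        return self.plan_cache[key]

rt = AgentRuntime()
dom = "<html>...checkout page...</html>"
rt.run(dom, "submit order")  # first run invokes the planner
rt.run(dom, "submit order")  # same page signature: replayed from cache
print(rt.model_calls)        # 1
```

A production system would fingerprint page structure rather than raw bytes, so cosmetic changes do not invalidate plans, and would fall back to re-planning whenever replay fails.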

From the perspective of the implementation details of the technical architecture, Shuhao Zhang faced many interesting technical decisions during the development process. For example, in the development of AgentQL, they chose to use the DOM instead of screenshots to analyze pages, which was based on a deep understanding of the AI model training data and technical limitations. They also developed a complex preprocessing system to handle the intricate structures of modern web pages, including nested iframes, shadow DOM, and other technical details.

Security and compliance are another key challenge. Corporate behavior on the internet must comply with various laws and regulations, including data protection laws, antitrust laws, and so on. TinyFish's Web Agent comes with an enterprise-grade security posture and governance framework, ensuring that all operations meet compliance requirements. Particularly when dealing with user identity and authentication status, Shuhao Zhang emphasized security risks during the interview: "I absolutely do not recommend users share their sessions with remote browsers. This is a very gray area." He suggested that companies should create independent identity and authentication systems for AI agents.

I also noticed TinyFish's innovations in handling network complexity. Modern websites use various technologies to prevent automated access, including CAPTCHA, behavioral analysis, IP restrictions, and more. TinyFish's Web Agent is able to adapt to these measures, maintaining stable access capabilities. This adaptability is not a one-time process but a continuous learning and improvement process. They even developed a "stealth mode" to cope with anti-scraping detection, bypassing these restrictions by simulating the fingerprint characteristics of a real browser.

But challenges still exist. Shuhao Zhang admitted that for complex scenarios like infinite scrolling, they have not yet found a perfect solution: "By definition, it is infinite. You always need to slice it to fit the context window, and you need to remember where you stopped and then start again." This honest acknowledgment of the technology reflects their clear understanding of the boundaries of technology and points the way for future technological development.
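The slicing-and-resuming approach Shuhao describes can be sketched as a cursor-based iterator: consume an unbounded feed in context-window-sized slices, remembering the last item processed so the agent can pick up where it stopped. Everything here (the feed, the window size, the cursor convention) is illustrative, not TinyFish internals.

```python
# Sketch of slicing an unbounded feed into context-window-sized chunks
# with a resume cursor (illustrative only).
def sliced_feed(feed, window, resume_after=None):
    """Yield lists of at most `window` items, skipping past the resume cursor."""
    skipping = resume_after is not None
    batch = []
    for item in feed:
        if skipping:
            if item == resume_after:
                skipping = False  # cursor found; resume from the next item
            continue
        batch.append(item)
        if len(batch) == window:
            yield batch  # caller records batch[-1] as the new cursor
            batch = []
    if batch:
        yield batch

items = (f"post-{i}" for i in range(7))
slices = list(sliced_feed(items, window=3))
print(slices)
# [['post-0', 'post-1', 'post-2'], ['post-3', 'post-4', 'post-5'], ['post-6']]

# Resume after an interruption: continue from the last recorded cursor.
items = (f"post-{i}" for i in range(7))
print(next(sliced_feed(items, window=3, resume_after="post-2")))
# ['post-3', 'post-4', 'post-5']
```

For a truly infinite feed the open problem Shuhao points to remains: the cursor tells you where you stopped, but not when you are done.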

The Profound Impact on Enterprise Digital Transformation

I believe that the enterprise-level Web Agent trend represented by TinyFish will have a profound impact on the digital transformation of enterprises. As they stated in their company blog: "If you can transform the internet into analyzable data, it will fundamentally provide enterprises with advantages that others do not have." Traditional enterprise information systems mainly rely on structured data and API interfaces, but a large amount of valuable information on the web still exists in an unstructured manner. Enterprise-level Web Agents provide a new way to acquire and utilize this information.

The significance of this change is not only at the technical level but also at the strategic level. A company's competitive advantage increasingly relies on the speed of information acquisition and analysis capabilities. Companies that can obtain market information faster and more accurately can gain an advantage in competition. TinyFish's Web Agent enables businesses to monitor the entire market environment in real time, and this capability has immense value in a rapidly changing business environment. As Sudheesh Nair said, their goal is to help companies "make more money" rather than just save costs.

From a cost perspective, Web Agents also bring significant benefits. Traditional market research and competitive analysis require a large amount of human resources and often cannot be updated in real-time. Enterprise-level Web Agents can work continuously 24 hours a day, with costs far lower than manual methods, and with higher accuracy and consistency. This efficiency improvement allows companies to invest more resources into core business and innovation activities.

I am particularly optimistic about the application prospects of Web Agent in supply chain management, risk control, market forecasting, and other fields. Supply chain management requires real-time monitoring of suppliers' conditions, price changes, inventory levels, and other information. Risk control needs to promptly identify external factors that may affect the business. Market forecasting requires analyzing large amounts of market data and trend information. These are all areas where Web Agent can play an important role. TinyFish is currently focused on the retail and tourism industries, but their technology can be fully extended to other industries.

More importantly, Web Agents may change the way businesses obtain external information. Traditionally, companies mainly relied on purchasing third-party data services or commissioning research firms. However, Web Agents allow businesses to obtain the latest and most accurate information directly from the source, reducing intermediaries and improving the timeliness and reliability of information. This ability to directly access first-hand information will become an important source of competitive advantage for businesses.

TinyFish mentioned an important point in its technological vision: "Technology at its best does not demand your attention. It fades into the background, making way for the importance of human work." This philosophy reflects their profound understanding of the value of technology. The best enterprise-level technology should be intangible, allowing users to focus on business goals rather than technical details. This is precisely the core value of the enterprise-level Web Agent.

Challenges and Future Development

Although I am very optimistic about the prospects of enterprise-level Web Agents, the field still faces significant challenges. The first is technical: the web environment is constantly changing, with new anti-scraping techniques and security measures emerging all the time, and Web Agents must continuously evolve to keep up. TinyFish has made significant progress here, but it is an endless technological race; as they put it, the goal is to "transform the complexity of the web from a barrier into an opportunity."

Legal and ethical issues are another significant challenge. Although most online information is public, automated access can still raise legal and ethical disputes. Different countries and regions regulate web crawlers differently, and companies need to ensure that their actions comply with all relevant laws. TinyFish needs to find a balance between technical capability and compliance requirements, particularly around data privacy and user identity protection, where industry standards and best practices have yet to be established.

Intensifying competition is also a real challenge. With the rapid growth of the enterprise-level Web Agent market, more and more companies will enter this field. Large tech companies may develop their own solutions, and specialized software companies may also launch competing products. TinyFish needs to continue innovating to maintain its competitive edge. However, based on the current situation, they have already established significant first-mover advantages and technological barriers.

From a team building perspective, TinyFish faces the typical challenges of a tech startup. Shuhao Zhang mentioned in an interview, "The hardest part for founders is definitely positioning and business," which reflects the common challenges that tech founders face in marketing. However, their co-founder Keith Zhai's media background provides important reinforcement for the team in this regard.

I believe that TinyFish's successful strategy should focus on several key areas. First is to continue deepening the technological moat, especially in capabilities related to handling complex network environments and large-scale deployments. They need to maintain a technological lead in AI reasoning capabilities, network adaptability, and enterprise-level reliability. Second is to expand the customer base, moving from the current large enterprises to the medium-sized enterprise market. Third is to build an ecosystem by establishing partnerships with other enterprise software vendors, making Web Agent a part of a larger digital solution.

From the perspective of product development, TinyFish is evolving from underlying tools like AgentQL to a complete enterprise Web Agent solution. They plan to officially launch the company in the next month or two, at which point more product details may be announced. From a technical architecture standpoint, they are building the entire tech stack, including the infrastructure for the runtime environment, the business logic processing at the application layer, as well as observation, monitoring, and authentication systems.

From an industry development perspective, I predict that enterprise-level Web Agents will become a standard component of enterprise tech stacks. Just as companies today commonly use systems like CRM and ERP, future enterprises will also widely use Web Agents to obtain and analyze external information. The market size could reach hundreds of billions of dollars, providing enormous growth opportunities for early players like TinyFish.

Ultimately, I believe that TinyFish represents not just a new technological solution, but a fundamental transformation in the way businesses interact with the online world. In an era where information is a competitive advantage, companies that can better understand and utilize online information will gain sustained competitive advantage. As TinyFish puts it: "Focus on what matters to you. For everything else, there's TinyFish." Their $47 million funding is just the beginning of this transformation, and the real value creation lies ahead. Transforming the complexity of the web into business opportunities is at the core of the enterprise-level Web Agent era.
