Evolution Of Work

September 18, 2025

Pre-1960 - The Early Days: 

Before the 1960s, much of organizational work revolved around the written document—physical files stored in cabinets, often stacked floor to ceiling. Companies were typically structured along rigid functional lines, and the "file" itself acted as the carrier of work. A request would originate in one department, say sales or procurement, and then physically travel—sometimes literally in a manila folder—from one office to the next. Each department along the chain would add its comments, approvals, or required actions. For example, a purchase request might begin with the operations team, then move to finance for budget approval, onward to legal for contract vetting, and finally to the executive office for sign-off.

The process could take weeks, even months, with clerks serving as the human conveyors of progress. Anecdotes from large corporations of that era describe “file runners”—junior employees whose primary job was to hand-carry documents across buildings or even across town, ensuring the workflow didn’t stall. In government offices, a file’s journey could be tracked by the colorful notations and stamps on its cover, often referred to jokingly as the "life story of the document." In essence, the file was the organizing principle, defining not only the flow of information but also the rhythm of work itself.

It wasn’t until the arrival of computers, databases, and electronic communication in the 1960s and beyond that this paper-bound, sequential method of organizing began to give way to more dynamic, parallel, and integrated systems.

Digitization, Information Processing & Evolution of Work/Labour

Monoliths & the Mainframe (1960s and 70s): 

When computers first entered the corporate world in the 1960s and 1970s, large organizations established dedicated IT departments to manage these massive mainframe systems. These weren’t the nimble laptops or cloud servers we know today - they were room-sized machines that required specialized environments, raised floors, and entire teams of operators to keep them running. The computers processed business requests—such as payroll, billing, or inventory management - and generated outputs that employees and customers relied on.

This era also gave rise to the dominance of hardware giants like IBM, whose System/360 line (launched in 1964) became the backbone of many Fortune 500 companies, and Digital Equipment Corporation (DEC) with its PDP and later VAX minicomputers, which democratized computing by making it more accessible to departments rather than just centralized corporate IT.

Information processing initially relied on punch cards, where each card represented a set of instructions or data. Anecdotes from that time highlight how employees would carry stacks of punch cards - sometimes hundreds at a time - to the computer center, only to discover that a single misplaced hole or missing card meant rerunning the entire batch. Over time, technology advanced from punch cards to magnetic tapes and then magnetic disks, which enabled faster and more flexible data storage. With each innovation, machines shrank in physical size but grew exponentially in processing power, setting the stage for the personal computer revolution of the 1980s.

Distributed Systems (1980s): 

The 1980s marked a fundamental shift in computing architecture - the move from centralized monolithic mainframes to distributed systems built on the client-server model. Until then, most organizations relied on large, expensive mainframes that tightly controlled both data and applications. These systems were powerful but also inflexible, costly to maintain, and largely the domain of specialists.

The advent of affordable personal computers (PCs) and local area networking technologies democratized computing power. Companies like Microsoft, Dell, and Compaq were the young challengers of the decade, introducing cost-effective machines that could sit on every employee’s desk. Microsoft, with its MS-DOS and later Windows operating system, became the standard software layer, while Dell and Compaq revolutionized hardware manufacturing and distribution by building modular, lower-cost machines compared to IBM’s proprietary systems.

This client-server architecture allowed businesses to distribute workloads: data could be stored and processed on centralized servers, while applications and interfaces ran on networked client PCs. For example, corporate email systems such as Lotus Notes or early Microsoft Exchange leveraged this model, enabling collaboration in ways mainframes never could. Financial institutions adopted client-server trading systems that replaced slower, batch-oriented mainframe processes with near real-time applications, a leap that transformed Wall Street in the 1980s and 1990s. This shift also eroded the monopoly of companies like IBM, whose dominance in mainframes was challenged by a new wave of competitors. Apple, with the introduction of the Macintosh in 1984, showcased a user-friendly graphical interface that emphasized accessibility, while Compaq became the first company to successfully reverse-engineer IBM’s PC BIOS—ushering in the era of “IBM-compatible” PCs, which ultimately undercut IBM’s market control.

By the end of the 1980s, distributed client-server computing had become the organizing principle for corporate IT, laying the groundwork for internet-based systems and cloud architectures in later decades. This was the era when nimble, affordable, and networked machines replaced the centralized, monolithic model - and when companies like Microsoft and Apple transformed from upstarts into industry-defining behemoths.

Labour Arbitrage (1990s through Present): 

Another transformative trend that accelerated in the 1990s was labour arbitrage, enabled by advances in networking and globalization. As corporations adopted distributed computing and became comfortable moving data and processes across networks, they realized that work no longer needed to be performed in the same geography as the customer. This gave rise to the practice of “outsourcing” and later “offshoring”, where non-core but essential functions - such as customer service, IT support, and back-office operations - were moved to lower-cost locations with a large, skilled workforce.

This shift was the foundation of the modern Business Process Outsourcing (BPO) sector. India, with its large English-speaking talent pool and improving telecom infrastructure, quickly became the global hub. Companies like Tata Consultancy Services (TCS), Infosys, and Wipro transitioned from being primarily IT services vendors into major outsourcing players, building large delivery centers in Bangalore, Hyderabad, and Pune.

Even multinational corporations embraced this model. General Electric (GE), under Jack Welch’s leadership, was an early pioneer in leveraging Indian talent for back-office and process work. In 1997, GE established what would later become Genpact, one of the first large-scale captive BPOs, which eventually spun off into a standalone global outsourcing powerhouse. Similarly, firms like Accenture and IBM Global Services expanded aggressively into offshore service delivery, blending global consulting with scalable, cost-effective labour models.

The BPO wave fundamentally changed the cost structures of large corporations. For example, banks shifted call centers and transaction processing to India and the Philippines, reducing costs while scaling operations. Insurance firms offshored claims processing. Tech companies outsourced help desks and infrastructure support. By the late 1990s and early 2000s, labour arbitrage was no longer just a cost-cutting measure but a strategic advantage, fueling the rise of a multi-billion-dollar global services industry.

As the sector matured, however, global clients began demanding more than just cost savings - they wanted expertise and value creation. This evolution led to the rise of Knowledge Process Outsourcing (KPO) in the 2000s. Unlike BPO, which focused on standardized processes, KPO involved specialized, high-value work such as equity research, legal document review, medical transcription, and advanced analytics.

For instance, firms like Evalueserve and Copal Partners (later acquired by Moody’s) offered financial research and analytics services to global banks. Infosys and Wipro began offering business intelligence and data analytics solutions. Law firms experimented with legal process outsourcing to India for document discovery. Even healthcare organizations leveraged offshore teams for medical coding and clinical trial data analysis.

This move up the value chain demonstrated how labour arbitrage had evolved - from handling simple, repetitive tasks to providing strategic insights and domain expertise. It also reshaped global perceptions of emerging market talent. India, the Philippines, and later Eastern Europe were no longer seen just as back-office hubs, but as centers of innovation, analytics, and domain knowledge.

As we entered the 2010s, the outsourcing industry faced both new pressures and new opportunities. Rising wages in India and other offshore markets began to erode the simple labour arbitrage advantage, while global clients increasingly demanded faster turnaround, deeper insights, and innovation - not just cost savings. At the same time, advances in automation, data analytics, and artificial intelligence opened up the possibility of reshaping outsourcing itself.

This marked the beginning of what some call “KPO 2.0” - a convergence of knowledge work and digital technologies. Instead of relying solely on large teams of skilled labour, companies began combining human expertise with automation platforms to deliver higher productivity, lower costs, and deeper insights.

For example:

  • Infosys launched its AI platform Nia, designed to automate repetitive IT and business processes while also extracting predictive insights from enterprise data.
  • TCS built Ignio, a cognitive automation system that uses AI to autonomously manage IT operations and business processes.
  • Genpact, the BPO pioneer spun out of GE, repositioned itself as a “digital transformation partner”, acquiring AI and analytics startups to enhance its service offerings.
  • Accenture embedded machine learning and automation into its outsourcing contracts, offering clients not just cheaper services but “intelligent operations.”

Even traditional BPO tasks like call centers underwent a transformation. With natural language processing (NLP) and chatbots, many routine customer queries are now handled automatically, while human agents focus on complex or high-empathy interactions. In financial services, back-office reconciliation once done by armies of clerks is increasingly automated, with human oversight limited to exceptions. In healthcare, AI helps parse clinical trial data or assist radiologists, offloading repetitive work to algorithms.

This blending of automation + human expertise is reshaping the very nature of outsourcing. Instead of a cost-arbitrage model, global service providers are now selling outcome-based solutions - reducing fraud in banking, increasing customer retention in telecom, and accelerating drug development in pharma. Labour arbitrage is no longer the core story; rather, the narrative is about scaling expertise through digital platforms.

Interestingly, this shift also mirrors earlier disruptions. Just as client-server PCs in the 1980s challenged mainframes, today’s AI-driven distributed services are challenging the legacy outsourcing model. The winners are no longer just the lowest-cost providers, but those who can integrate technology, talent, and domain knowledge to deliver measurable business value.

Looking forward, we are entering an era where the next wave of arbitrage is not geographic but cognitive - companies will arbitrage between human expertise and machine intelligence, seamlessly orchestrating the two across global delivery models.

Technology Transformations (1990 – 2020s): 

BPM/BRE as an Architectural Innovation

Through the 1990s, one of the hardest challenges in enterprise software was that workflow logic was embedded directly in application code. While this gave tight integration, it made systems brittle - every small process change required new code, testing across countless edge cases, and costly deployments. For industries like banking, insurance, and telecom - where compliance rules and customer processes evolved constantly - this rigidity became a major bottleneck.

To solve this, Business Process Management Systems (BPMS) emerged. These tools separated process logic from core code, representing workflows graphically so that business analysts, not just programmers, could model and adapt them. Similarly, Business Rules Engines (BREs) let firms externalize rules (e.g., “loan applicants under 25 need a guarantor”) from core applications, so rules could be updated without rewriting software.

  • Pegasystems (Pega) became a category leader, helping banks like Citibank and insurers like AIG streamline claims, underwriting, and customer service workflows.
  • iLOG, a pioneer in BREs, was widely adopted in credit risk and supply chain optimization before being acquired by IBM in 2008 for $340M to strengthen its middleware offerings.
  • GE Capital used BREs to manage complex credit approval rules across global operations, reducing turnaround times and regulatory risk.

These innovations marked a turning point: organizations began decoupling the "what" (business logic and rules) from the "how" (infrastructure and application code) - a foundational step toward today's agile enterprises.
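
To make that decoupling concrete, here is a minimal, illustrative sketch - not any particular vendor's rules engine - of externalizing a rule such as "loan applicants under 25 need a guarantor" into data that can change without touching application code. All names are hypothetical; a real BRE adds a rule language, authoring tools, and versioning on top of this idea.

```python
# Minimal illustration of the business-rules-engine idea: rules live in data
# (here, a list of dicts), not in application code, so they can be changed
# without redeploying the application. All names are hypothetical.

RULES = [
    {
        "name": "young_applicant_needs_guarantor",
        "condition": lambda applicant: applicant["age"] < 25,
        "action": "require_guarantor",
    },
    {
        "name": "large_loan_needs_manual_review",
        "condition": lambda applicant: applicant["amount"] > 100_000,
        "action": "route_to_underwriter",
    },
]

def evaluate(applicant):
    """Return the actions triggered by the externalized rules."""
    return [rule["action"] for rule in RULES if rule["condition"](applicant)]

if __name__ == "__main__":
    print(evaluate({"age": 23, "amount": 50_000}))   # ['require_guarantor']
    print(evaluate({"age": 40, "amount": 250_000}))  # ['route_to_underwriter']
```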

Architectural Pattern Evolution: Client-Server → SOA → Cloud & Serverless: 

Client-Server (1990s):

The shift from mainframes to client-server changed enterprise IT economics. Suddenly, companies could run core applications on commodity hardware from Dell or Compaq, instead of multimillion-dollar IBM mainframes. For example, Walmart built its inventory systems on client-server architecture, which allowed it to process transactions more cheaply and scale globally. Databases like Oracle and SQL Server became household names in IT departments, powering everything from payroll to e-commerce.

Service-Oriented Architecture (SOA, 2000s):

As businesses grew more complex, SOA emerged to modularize applications into independent services connected by standardized interfaces. This enabled upgrades, scaling, and vendor flexibility. A bank could, for example, swap its payment gateway service without rewriting its customer onboarding system (a minimal sketch of this idea follows the list below).

  • WebLogic (BEA, later Oracle) and WebSphere (IBM) became staples for enterprise middleware.
  • Oracle dominated RDBMS, while new entrants like MongoDB and Cassandra popularized NoSQL for web-scale applications.
  • SAP and Oracle ERP leveraged SOA principles to integrate finance, supply chain, and HR into unified systems.
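
The swap-without-rewrite idea mentioned above can be sketched in a few lines. This is a hypothetical illustration, not any real bank's system: consumers depend on a payment-gateway contract, so the provider behind it can be replaced without changing them.

```python
# Illustrative sketch of the SOA idea: callers program against a stable
# service contract, so the concrete provider can be swapped without
# rewriting consumers. All class and method names are hypothetical.
from abc import ABC, abstractmethod

class PaymentGateway(ABC):
    """Stable contract exposed to the rest of the enterprise."""

    @abstractmethod
    def charge(self, account_id, amount_cents):
        """Charge an account and return a transaction reference."""

class LegacyGateway(PaymentGateway):
    def charge(self, account_id, amount_cents):
        return f"LEGACY-{account_id}-{amount_cents}"

class NewVendorGateway(PaymentGateway):
    def charge(self, account_id, amount_cents):
        return f"NEWVENDOR-{account_id}-{amount_cents}"

def onboard_customer(gateway, account_id):
    # The onboarding system only knows the contract, not the vendor.
    return gateway.charge(account_id, 999)

if __name__ == "__main__":
    print(onboard_customer(LegacyGateway(), "ACME-42"))
    print(onboard_customer(NewVendorGateway(), "ACME-42"))
```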

Cloud, Microservices & Serverless (2006 onward):

Cloud computing changed the game by offering elastic infrastructure on demand. Amazon Web Services (AWS), launched in 2006, became the backbone of startups and enterprises alike.

  • Netflix famously migrated to AWS after a massive database failure in 2008, pioneering microservices architecture to scale video streaming for hundreds of millions of users worldwide.
  • Spotify used microservices to manage its recommendation engine and user playlists, allowing rapid innovation at scale.
  • Serverless platforms like AWS Lambda and Google Cloud Functions pushed abstraction further, letting developers run business logic without managing servers at all - ideal for event-driven use cases like payment notifications or IoT.
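
To make the serverless point concrete, here is a minimal sketch in the AWS Lambda Python style: the developer writes only the business logic for a payment notification, while provisioning and scaling are the platform's job. The event shape and the notification step are placeholders, not a production design.

```python
# Minimal sketch of an event-driven serverless function in the AWS Lambda
# Python style: the platform invokes the handler once per event, and no
# server is managed by the developer. The event shape below is hypothetical.
import json

def lambda_handler(event, context):
    # e.g., an event emitted when a payment settles
    payment = json.loads(event.get("body", "{}"))
    customer = payment.get("customer_id", "unknown")
    amount = payment.get("amount_cents", 0)

    # Business logic only: notify the customer (placeholder print here;
    # in practice this might publish to a queue or an email service).
    print(f"Payment of {amount / 100:.2f} received for {customer}")

    return {"statusCode": 200, "body": json.dumps({"notified": customer})}
```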

These transitions - from client-server to SOA to cloud-native - represent the unbundling of the monolith. Each era expanded modularity, scalability, and resilience, making enterprise IT more adaptable and business-aligned.

Client-server architecture let companies operate on cheaper commodity hardware and made it easier to replace or upgrade individual parts of a system.

Moving to SOA decoupled many of these components, making them even easier to replace and swap. Each component - application servers, databases, data warehouses, web servers - could evolve on its own path and timeline, and each specialty produced its own champions: Oracle for RDBMS, a number of players for NoSQL, the WebLogic and WebSphere lines for application servers, and so on.

Finally, the move to the cloud, serverless architectures, and microservices further enabled dynamic scaling, redundancy, and availability for these applications. This transformation drove the emergence of large cloud providers like AWS, GCP, Azure, and OCI.

Evolution of RPA:

Pre-RPA Foundations (1990s – early 2000s): Workflow & Screen Scraping

Before RPA became a term, companies were already automating tasks - but in primitive ways:

  • Workflow automation: Tools like Pega and FileNet helped orchestrate business processes by routing tasks between humans and systems. However, they required heavy IT projects and deep integration with back-end systems.
  • Screen scraping & macros: Early automation often used scripts (e.g., Excel macros, Visual Basic scripts) or “screen scraping” tools that mimicked user inputs to copy/paste data between legacy apps. These were fragile, breaking when UIs changed.

Example: Banks used screen scrapers to extract data from mainframe terminals into spreadsheets for risk reports, saving analysts hours of manual data entry.
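
A caricature of that era as a small sketch: the automation replays mouse clicks and keystrokes at fixed screen coordinates (using the pyautogui and pyperclip libraries purely for illustration), which is exactly why a minor UI change could break the whole flow. The coordinates and target applications are hypothetical.

```python
# Illustrative sketch of the old "screen scraping / macro" approach:
# the script blindly replays clicks and keystrokes at hard-coded positions,
# which is why such automations broke whenever the UI changed.
import pyautogui   # simulates mouse/keyboard input
import pyperclip   # reads the system clipboard

def copy_balance_from_legacy_app():
    pyautogui.click(412, 263)            # click the "balance" field (fixed position!)
    pyautogui.hotkey("ctrl", "a")        # select the field's contents
    pyautogui.hotkey("ctrl", "c")        # copy to clipboard
    return pyperclip.paste()

def paste_into_spreadsheet(value):
    pyautogui.click(910, 540)            # click the target spreadsheet cell
    pyautogui.write(value, interval=0.05)
    pyautogui.press("enter")

if __name__ == "__main__":
    paste_into_spreadsheet(copy_balance_from_legacy_app())
```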

Emergence of RPA as a Category (mid-2000s – early 2010s):

Around the mid-2000s, the term Robotic Process Automation (RPA) began gaining traction. Unlike workflow automation, which required system integration, RPA focused on mimicking human actions at the UI level - logging into apps, moving files, filling forms, generating reports.

  • Key players emerged: Blue Prism (2001, UK) pioneered the concept, followed by UiPath (founded 2005 in Romania) and Automation Anywhere (2003, US).
  • Low-code platforms: RPA vendors offered visual “drag-and-drop” studios for designing bots, making automation accessible to business users, not just IT.
  • Scalability: Enterprises started deploying RPA in back-office operations - finance, HR, procurement - where repetitive, rules-based tasks were abundant.

Example: A telecom provider automated SIM card activation through RPA bots that logged into multiple legacy systems, cutting provisioning time from 48 hours to a few minutes.

Hypergrowth & Enterprise Adoption (2015 – 2020):

RPA became a mainstream enterprise strategy as companies looked for efficiency, especially in financial services, insurance, and healthcare.

  • Funding boom: UiPath, Automation Anywhere, and Blue Prism became “unicorns,” attracting billions in VC funding. UiPath IPO’d in 2021.
  • Citizen development: RPA tools expanded to “citizen developers” in business units, enabling non-technical staff to build and deploy bots.
  • Integration with BPM & ERP: Vendors integrated RPA with BPM, ERP (e.g., SAP, Oracle), and ITSM systems to support end-to-end processes.
  • Digital workforce narrative: RPA vendors positioned bots as a virtual workforce handling high-volume, rules-based work.

Example: GE Capital and Deutsche Bank automated thousands of finance and compliance tasks - like reconciliations and regulatory reporting - reducing error rates while freeing employees for higher-value work.

Convergence with AI → Intelligent Automation (2020 onward):

Pure RPA had limits: it worked best for structured, rules-based tasks. To handle unstructured data (emails, PDFs, documents, images), RPA converged with AI, creating Intelligent Automation.

  • AI + RPA: Integrating NLP, OCR, computer vision, and ML lets bots “read” documents, classify emails, or extract data from invoices.
  • Process mining & discovery: Tools like Celonis and UiPath Process Mining analyze logs to discover automation opportunities (a toy sketch of the core idea follows this list).
  • Hyperautomation (Gartner’s term): Combining RPA with AI, process mining, low-code platforms, and orchestration tools helps automate complex, end-to-end business processes.
  • Generative AI (2023 onward): RPA platforms are embedding LLMs (like GPT) to enable natural language-driven automation. Instead of building bots manually, users can now describe processes in plain English and let AI generate automation flows.
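
The toy sketch below (not Celonis or UiPath code) shows the core idea behind process mining: reconstruct each case's trace from an event log and count activity-to-activity transitions to surface the dominant paths. Real tools do this at enterprise scale with far richer statistics.

```python
# Toy illustration of process mining: rebuild traces per case from an event
# log and count activity-to-activity transitions to surface dominant paths
# (and candidate automation opportunities). The log below is made up.
from collections import Counter

# (case_id, activity) events, already ordered in time within each case
EVENT_LOG = [
    ("c1", "receive_invoice"), ("c1", "validate"), ("c1", "approve"), ("c1", "pay"),
    ("c2", "receive_invoice"), ("c2", "validate"), ("c2", "reject"),
    ("c3", "receive_invoice"), ("c3", "validate"), ("c3", "approve"), ("c3", "pay"),
]

def transition_counts(log):
    """Count how often each activity directly follows another, per case."""
    traces = {}
    for case_id, activity in log:
        traces.setdefault(case_id, []).append(activity)
    counts = Counter()
    for trace in traces.values():
        counts.update(zip(trace, trace[1:]))
    return counts

if __name__ == "__main__":
    for (src, dst), n in transition_counts(EVENT_LOG).most_common():
        print(f"{src} -> {dst}: {n}")
```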

Examples:

  • Coca-Cola uses intelligent automation to process invoices from suppliers worldwide, combining RPA with OCR and ML for accuracy.
  • UnitedHealth Group uses RPA with NLP to process millions of medical claims, reducing cycle time and improving compliance.
  • UiPath recently integrated GPT-style copilots into its platform, allowing business users to design bots conversationally.

Moving from Intelligent Automation to Agentic AI 

Intelligent Automation (IA)

Intelligent Automation is the fusion of Robotic Process Automation (RPA) with Artificial Intelligence (AI) technologies (e.g., NLP, OCR, ML, computer vision) to handle end-to-end business processes, including those involving unstructured data, judgment, and learning.

Whereas RPA automates structured, rules-based, repetitive tasks, Intelligent Automation extends this to more complex processes by adding intelligence.

Key Components:

  • RPA (Robotic Process Automation): Automates routine, rules-based workflows.
  • AI/ML: Adds capabilities like pattern recognition, predictive analytics, and decision-making.
  • NLP (Natural Language Processing): Understands and processes human language (emails, chat, documents).
  • OCR/Computer Vision: Reads and extracts data from scanned documents, PDFs, or images.
  • Process Mining: Identifies automation opportunities by analyzing event logs.

Examples:

  • Banking: Automating loan processing by combining RPA (data gathering from legacy systems) + OCR (reading income documents) + ML (credit risk scoring) - a simplified sketch follows this list.
  • Healthcare: Automating claims processing where RPA extracts claim data, OCR digitizes medical forms, and ML flags anomalies for fraud detection.
  • Insurance: End-to-end automation of policy renewals - bots gather data, NLP reads customer emails, and ML models suggest pricing.
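
Below is a heavily simplified sketch of the loan-processing example above, with hypothetical function names and the OCR and risk-model steps stubbed out: RPA-style steps gather data, OCR digitizes a document, and a model score drives the decision.

```python
# Heavily simplified sketch of an intelligent-automation pipeline:
# bot-style data gathering + OCR + an ML score feeding one decision.
# All functions are stubs with hypothetical names; a real pipeline would
# call legacy systems, an OCR engine, and a trained model here.

def fetch_applicant_record(applicant_id):
    # RPA step: pull structured data out of a legacy system (stubbed).
    return {"id": applicant_id, "age": 31, "declared_income": 58_000}

def ocr_income_document(path):
    # OCR step: extract the income figure from a scanned payslip (stubbed).
    return 57_500

def credit_risk_score(record, verified_income):
    # ML step: a trained model would go here; this is a placeholder heuristic.
    return 0.2 if verified_income >= 0.9 * record["declared_income"] else 0.8

def process_loan_application(applicant_id, payslip_path):
    record = fetch_applicant_record(applicant_id)
    income = ocr_income_document(payslip_path)
    risk = credit_risk_score(record, income)
    return "approve" if risk < 0.5 else "route_to_underwriter"

if __name__ == "__main__":
    print(process_loan_application("A-1001", "payslips/a-1001.pdf"))
```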

Business Value:

  • Cost savings: Labour arbitrage without offshoring.
  • Speed & accuracy: 24/7 bots with low error rates.
  • Scalability: Easier to roll out automation across geographies.
  • Employee augmentation: Freeing humans for higher-value tasks.

Agentic AI

Agentic AI refers to AI systems that act as autonomous agents, capable of planning, reasoning, and executing tasks proactively - often without human intervention. Unlike IA, which is still orchestrated by predefined workflows, Agentic AI can decide what to do next to achieve a goal.

This is powered by Large Language Models (LLMs) and multi-agent systems where AI agents can:

  • Understand goals in natural language.
  • Break them down into subtasks.
  • Interact with software/tools via APIs or UI automation.
  • Collaborate with other agents or humans.
  • Learn and adapt from outcomes.
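
A deliberately toy sketch of that loop is shown below - no real LLM or agent framework, just stubs - to show the goal, plan, act, observe, adapt shape that separates an agent from a scripted bot.

```python
# Toy sketch of an agentic loop: given a goal, a planner proposes the next
# action, a tool executes it, the result is observed, and the loop repeats
# until the planner decides the goal is met. The planner and tools are stubs
# standing in for an LLM and real integrations (APIs, UI automation, etc.).

TOOLS = {
    "lookup_account": lambda args: {"balance": 120, "status": "active"},
    "issue_refund":   lambda args: {"refunded": args["amount"]},
}

def plan_next_action(goal, history):
    # Stub planner: a real system would prompt an LLM with the goal, the
    # available tools, and the history of observations so far.
    if not history:
        return {"tool": "lookup_account", "args": {"customer": "C-7"}}
    if len(history) == 1:
        return {"tool": "issue_refund", "args": {"amount": 30}}
    return None  # planner decides the goal has been achieved

def run_agent(goal):
    history = []
    action = plan_next_action(goal, history)
    while action is not None:
        observation = TOOLS[action["tool"]](action["args"])  # act, then observe
        history.append((action, observation))
        action = plan_next_action(goal, history)
    return history

if __name__ == "__main__":
    for step in run_agent("refund the duplicate charge for customer C-7"):
        print(step)
```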

Key Characteristics:

  • Autonomy: Can operate without step-by-step instructions.
  • Goal-oriented: Works toward objectives, not just scripts.
  • Reasoning: Uses planning and decision-making (e.g., chain-of-thought, tool use).
  • Multi-agent collaboration: Multiple AI agents work together (e.g., one for research, another for execution, another for validation).
  • Adaptivity: Can adjust strategy if initial plan fails.

Examples:

  • Customer Service: Instead of a bot that only responds to FAQs, an agentic AI system can:
    • Understand a customer’s intent.
    • Look up account details.
    • Negotiate a solution.
    • Escalate to a human if necessary.
  • Financial Advisory: An AI agent could take “optimize my portfolio for retirement” as a goal, then research markets, run simulations, and recommend allocations dynamically.
  • Enterprise IT: AI agents can monitor systems, detect anomalies, suggest patches, and even trigger automated remediation - akin to a “self-healing” IT environment.
  • Personal Productivity: Tools like AutoGPT or LangChain-based agents can book travel, draft reports, summarize research, and integrate with SaaS tools.

Business Value:

  • Handles unstructured, dynamic tasks beyond IA’s reach.
  • Reduces human-in-the-loop overheads.
  • Enables continuous improvement and learning of processes.
  • Moves toward autonomous enterprises where digital agents work alongside humans.

Key Difference Between IA and Agentic AI

A key catalyst driving the adoption of Agentic AI is its ability to deliver transformation through operational expenditure (Opex) rather than capital expenditure (Capex). Traditionally, large-scale transformation initiatives - whether in technology, process re-engineering, or automation - required heavy upfront Capex investments in infrastructure, platforms, and long multi-year programs. This often limited adoption to organizations with significant budgets and long investment horizons.

With Agentic AI, however, transformation can be achieved in a pay-as-you-go model leveraging Opex budgets. AI agents can be deployed rapidly via cloud-based platforms, consumed as services, and scaled up or down based on demand. Crucially, once transformation is in place, operational costs are significantly reduced - for example, through automated customer support, streamlined finance operations, or intelligent supply chain orchestration. This means that the cost savings begin to fund the transformation itself, reducing the need for large upfront commitments.

This shift represents more than just a funding model - it signals a business model transformation. By removing the Capex barrier, organizations of all sizes can now experiment, adopt, and scale Agentic AI in a way that was previously only available to large enterprises. Just as cloud computing democratized access to advanced IT infrastructure without requiring servers and data centers, Agentic AI is democratizing access to enterprise-grade automation and intelligence through Opex-based consumption models.

In short, Agentic AI is not just reshaping workflows; it is redefining how transformation is financed and delivered, creating a structural shift in the way businesses approach technology adoption.

Summary

Summarizing the trends in the table below, we classify each of these as a business model innovation or an incremental change.

Future of Work: 

The Future: Autonomous Enterprises (2025 and beyond)

We’re moving toward autonomous operations, where bots/agents don’t just execute pre-defined scripts but:

  • Learn from human decisions.
  • Optimize workflows automatically using AI.
  • Orchestrate across hybrid environments (legacy, cloud, SaaS).
  • Provide predictive analytics to suggest what should be automated next.

In this vision, we aren’t just talking about “bots doing tasks” but about building a self-healing, AI-driven digital workforce that augments humans in decision-making.

While many organizations today are experimenting with building and deploying their own agentic AI solutions, we expect this trend to evolve rapidly. Over time, the focus will shift from fragmented, in-house experimentation toward consolidation, with a handful of specialized agentic AI companies emerging as scaled providers. These companies will offer suites of industry-specific agents - for example, financial services agents for compliance monitoring, healthcare agents for prior authorization and claims, or retail agents for supply chain orchestration - covering the majority of functional tasks within end-to-end workflows.

In the early stages, enterprises will still need to manage their own agents - handling quality control, exception management, and human-in-the-loop interventions where an agent “drops down” to a human for oversight. But as adoption matures, we anticipate a second wave of consolidation: the management of agent ecosystems itself will be outsourced. Just as cloud infrastructure moved from on-premise IT to centralized hyperscalers, the management of agent suites is likely to move to a few large agent operations providers, who can deliver economies of scale, standardized governance, and continuous improvement.

In this model, organizations will consume agents much like SaaS or cloud services today - provisioning them on-demand, integrating them into workflows, and relying on specialized providers for monitoring, compliance, and optimization. This represents not just a shift in technology, but a restructuring of the value chain, where agent development, deployment, and management become concentrated in a few scaled players rather than dispersed across thousands of enterprises.

Conclusion:

To understand how these trends are shaping our investment thesis and strategy, we invite you to read the next post in this series. There, we outline our perspective on the AI stack, and highlight the key strategies and assets we look for in companies - particularly those building durable moats to sustain long-term competitive advantage.