Robots here, there and everywhere, and now in the office

In the past two to three years, Robotic Process Automation (RPA) software has taken hold of the enterprise, introducing a new wave of virtual robots to better assist office workers with labor-intensive information tasks. Learn the basics of this new technology and why it has become such a hot topic in the world of enterprise software.
By Joel Mazza


Excuse me for one moment while I move away from my laptop to let the eufy RoboVac work its magic around my desk. In 2019, it’s hardly novel that little robot vacuums wander around hundreds of thousands, if not millions, of homes, sucking up dirt and dust, saving time, and making indoor air and surfaces cleaner. Over the past forty years, robots have migrated from science fiction books, movies, and television shows to the real world of automated manufacturing, toys, autonomous vehicles, and now consumer “tools” like vacuums. We are surrounded by robots and have been for a while now.

Each year, robots become more versatile, more useful, and less expensive, enabling people to work smarter, stay safer, and get more done in less time. Robots are good at vacuuming, completing precision manufacturing tasks, and driving trucks and cars, but can robots help with something like office work?

You may not see them at the coffee station or water cooler, but a growing number of enterprises are rapidly deploying robots in the form of software that automates an array of digital tasks. This category of software tools is referred to as Robotic Process Automation, or more commonly RPA. RPA is one of the fastest-growing segments of the software industry, with some analyst estimates showing greater than 100% annual growth over the past two years. If you are not familiar with RPA software, this post will break it down, including why it’s important for office work.

 

A Brief RPA History

While it seems to have come out of nowhere, RPA software has roots that extend back to the 1990s. Software developers used “automation” tools that could be programmed to manipulate another application. A programmer could feed the tool a long list of instructions, and the automation software would follow each step, testing the new application the way an end user would operate it. The automation software would detect bugs and log them for the programmer to fix. Using automation, programmers could let the “bot” test their software while they moved on to another task in parallel. It was not as thorough or reliable as a human quality assurance tester, but it helped speed up repetitive testing tasks.
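
For readers who want a concrete picture, the sketch below shows what this kind of scripted UI testing looks like with a modern open-source tool, Selenium; the URL, element IDs, and credentials are hypothetical placeholders, not drawn from any specific product.

```python
# A minimal sketch of scripted UI testing, using the open-source Selenium
# library. The URL, element IDs, and credentials below are hypothetical
# placeholders for an application under test.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.internal/app/login")  # hypothetical app under test

    # Follow scripted steps the way an end user would
    driver.find_element(By.ID, "username").send_keys("test_user")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "submit").click()

    # Check the result and log a defect if the expected page never appears
    if "Dashboard" not in driver.title:
        print("BUG: login did not reach the dashboard page")
finally:
    driver.quit()
```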

As automation tools evolved, they gained different capabilities to support a wider array of tasks. One of those capabilities is referred to as “screen scraping”. As the name implies, the software could detect fields and values on a computer screen by tracking and measuring pixels. Once a required field was identified, the software could copy (“scrape”) the field data and move it to another location, such as a spreadsheet or database. The uses for this technology were mostly narrow, but for critical tasks it removed the tedious, error-prone work of moving small amounts of data from one application to another.
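
As a rough illustration, a pixel-based screen scrape might look something like the sketch below, using the open-source pyautogui and pyperclip libraries; the saved screenshot of the target field is a hypothetical placeholder.

```python
# A rough sketch of pixel-level "screen scraping" with the open-source
# pyautogui and pyperclip libraries. The reference screenshot of the
# target field ("invoice_total_field.png") is a hypothetical placeholder.
import pyautogui
import pyperclip

# Locate the on-screen field by matching a saved screenshot of it
try:
    location = pyautogui.locateCenterOnScreen("invoice_total_field.png")
except pyautogui.ImageNotFoundException:
    location = None

if location is not None:
    # Select the field's contents and copy ("scrape") them via the clipboard
    pyautogui.tripleClick(location.x, location.y)
    pyautogui.hotkey("ctrl", "c")
    scraped_value = pyperclip.paste()
    print("Scraped value:", scraped_value)
else:
    print("Field not found on screen")
```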

In the late 1990s, the arrival of the internet, the browser, and HTML (Hypertext Markup Language) created a universal platform for efficiently sharing vast amounts of new data. As web applications exploded in number and usefulness, a massive number of data sources came online, literally. While those web apps, including sites like Amazon.com, eBay and many more, published a lot of data to the browser, there was no direct way to access it without manually copying and pasting, or manually keying it into a spreadsheet or another database. This is where the concept of RPA really took hold.

As HTML standards matured, RPA tools could more accurately and dynamically map the user interfaces of an almost unlimited number of web applications. Once an automation routine was mapped to the coded tags that defined the page, the software could replicate human tasks, entering search queries and then exporting the resulting text, links, images, and whatever else the app returned into a spreadsheet or database. This approach came to be described as “virtual” or “synthetic” APIs (application programming interfaces). Whatever the name, it served a critical purpose: programmatically and automatically capturing data from one system and accurately replicating it in another.
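
A simplified sketch of the idea follows, assuming the open-source requests and BeautifulSoup libraries; the URL and the CSS selectors that stand in for the page’s tag mapping are hypothetical.

```python
# A simplified sketch of a "virtual API": mapping a page's HTML tags to
# fields and replicating the captured data elsewhere. Uses the open-source
# requests and BeautifulSoup libraries; the URL and CSS selectors are
# hypothetical stand-ins for a real page's tag structure.
import csv
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com/products?q=laptops")
soup = BeautifulSoup(response.text, "html.parser")

rows = []
for item in soup.select("div.result"):  # hypothetical tag mapping for one search result
    name = item.select_one("h2.title").get_text(strip=True)
    price = item.select_one("span.price").get_text(strip=True)
    rows.append([name, price])

# Replicate the captured data in another "system" -- here, a simple CSV file
with open("search_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "price"])
    writer.writerows(rows)
```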

The inability to synchronize data between different systems has been a major information technology problem for decades across most major industries. As the number of applications has exploded, so has the interoperability problem, driving up the cost of maintaining systems and making it harder to improve operational efficiency as information becomes locked in system silos. By combining automation, screen scraping, virtual APIs, and other capabilities, RPA provides a dynamic middleware platform that effectively unlocks information from virtually any human-accessible source.

 

A Breakthrough

Suddenly, a whole world of previously hard-to-collect web and system data was accessible at a dramatically lower cost and in near real time. RPA made it easier to:

  • Safely and efficiently replace or upgrade critical business applications, such as swapping one brand of ERP or CRM system for another, without disrupting the business
  • Collect valuable data from small niche sources in real time, such as traffic information, retail prices or weather data
  • Synchronize data between disparate systems quickly and accurately without writing dedicated integration code
  • Provide office workers with more timely and accurate information in the format that works best for them, such as an Excel report rather than a static PDF document (a minimal sketch follows this list)
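
To make the last two bullets concrete, here is a minimal sketch assuming a hypothetical CSV export from a source system and the open-source openpyxl library; the file and column names are illustrative only.

```python
# A minimal sketch of the last two bullets: pulling rows from one system's
# export and delivering them as an Excel report. Uses the open-source
# openpyxl library; the source file and column names are hypothetical.
import csv
from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws.append(["Order ID", "Customer", "Amount"])  # report header row

# Read a hypothetical CSV export from "system A" and copy each row across
with open("system_a_orders.csv", newline="") as f:
    for row in csv.DictReader(f):
        ws.append([row["order_id"], row["customer"], row["amount"]])

# Save in the format office workers asked for
wb.save("daily_orders_report.xlsx")
```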

With a relatively low cost profile (several products are available as free community editions), enterprises have quickly adopted RPA technology, deploying bots for a wide range of critical applications. Because RPA adapts to virtually any interface, enterprises can creatively tap a wide range of data sources without making any structural changes to those sources. RPA bots will not transfer data as quickly as two systems connected by a dedicated bidirectional integration, but they can be run in parallel to increase overall throughput. In some cases, banks run automated transaction volumes in the billions using RPA bots.
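
As a toy illustration of that parallelism point, the sketch below fans a batch of hypothetical bot tasks out across worker threads; process_record is a stand-in for whatever a single bot run would actually do.

```python
# A toy sketch of the parallelism point above: fanning bot tasks out across
# worker threads to raise overall throughput. process_record is a hypothetical
# stand-in for whatever one bot run would actually do.
from concurrent.futures import ThreadPoolExecutor

def process_record(record_id: int) -> str:
    # Placeholder for a single bot run (open app, copy fields, paste elsewhere)
    return f"record {record_id} processed"

with ThreadPoolExecutor(max_workers=8) as pool:  # eight "bots" working in parallel
    for result in pool.map(process_record, range(100)):
        print(result)
```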

The combination of flexibility, relatively low cost, a rapid rise in structured, web-accessible data sources, and a never-ending need for accurate, dynamic data has fueled the recent boom in Robotic Process Automation software. You won’t see them sitting in the cubicle near you, but the robots are in the office, helping workers perform many data-centered tasks that were previously out of reach.

In the next post of this two-part series, learn how RPA capabilities differ from and complement Document Capture software to deliver the most complete information capture solution for structured, semi-structured and unstructured information.

Learn how with Kofax

Visit Kofax to learn about the options for combining RPA and Capture to transform and automate tasks in the digital workspace.




About Joel Mazza

Joel Mazza is the Senior Product Marketing Manager responsible for Capture solutions at Nuance. In that role, he translates end-customer business problems into capture capabilities within the Nuance solutions portfolio. Joel brings more than 20 years of software product management, product marketing and sales experience, working with a wide range of end-customer organizations across many industries, including Finance, Insurance, Manufacturing, Engineering & Construction, State and Federal Government and Healthcare. For fun, Joel enjoys time with his family, Boston-area professional sports, and recently took up kiteboarding while visiting North Carolina’s Outer Banks.