I didn’t know Excel could do this with live data

Excel is great for organizing data. But if you need information that changes constantly, like stock prices, weather updates, or cryptocurrency rates, you don’t have to copy numbers from websites every time they change.
Luckily, Excel has two built-in functions, WEBSERVICE and FILTERXML, that can pull live data directly from the web and keep it updated automatically. They join the list of underrated Excel functions that can save you significant time. You don’t need any special add-ins or subscriptions, just these two functions and a basic understanding of how websites structure their data.
You can pull raw data from a website with WEBSERVICE
The syntax is dead simple but the results are raw

The WEBSERVICE function is Excel’s built-in tool for fetching data directly from the internet. It sends a request to a web address (URL) and returns whatever the server sends back as text, often XML, but sometimes JSON or plain text. Keep in mind that FILTERXML, the companion function covered below, only works with XML, so JSON responses can’t be parsed directly without conversion.
It uses the following syntax:
=WEBSERVICE(url)
WEBSERVICE has only one parameter: the URL, the complete web address you want to pull data from. This must be a valid URL that returns data, not just a regular webpage. Most APIs (Application Programming Interfaces) provide specific URLs that deliver data in structured formats such as XML or JSON. The URL must be enclosed in double quotes, or you can reference an Excel cell that contains it.
Here's a simple example. If you want to pull XML data from an API that provides currency exchange rates, you'd use something like:
=WEBSERVICE("https://api.example.com/exchange-rates")
The function returns everything the server sends back, which is usually a long string of text with tags, brackets, or other formatting. It's raw and unfiltered, which is why you'll need FILTERXML to make sense of it later.
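For instance, pointing the function at ESPN’s NFL RSS feed (which this article uses again later) dumps the entire feed into a cell as one long run of tagged text:
=WEBSERVICE("https://www.espn.com/espn/rss/nfl/news")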
WEBSERVICE only works with URLs that don't require authentication. If a website requires a login or an API key embedded in the headers, this function won't work. You'll need Power Query or VBA for those scenarios.
You need FILTERXML to make the web data usable
XPath queries let you pick out exactly what you want

The raw output from WEBSERVICE is usually a mess of tags and text that Excel can't interpret on its own. That's where FILTERXML comes in. It extracts specific pieces of information from XML data using XPath queries.
The syntax looks like this:
=FILTERXML(xml, xpath)
Here's what each parameter does:
- xml: The XML data you want to filter. This is typically the output from a WEBSERVICE function, but it can also be a cell reference containing XML text or a manually entered XML string enclosed in double quotes.
- xpath: An XPath query that tells Excel exactly which part of the XML structure to extract. XPath uses a path-like syntax (similar to file directories) to navigate through XML tags and attributes. For example, //price would extract all elements with the tag "price" from the XML data.
Let’s say the WEBSERVICE function in cell A1 returns XML and you need to extract just the USD exchange rate. You’d use:
=FILTERXML(A1, "//currency[@name='USD']")
This formula tells Excel to look inside the XML in cell A1, find the "currency" element where the "name" attribute equals "USD," and return its value.
If you want to extract multiple values at once, FILTERXML returns them as an array. Using //currency instead would pull all the exchange rates in the XML, and they spill across multiple cells if you’re using Excel 365 or Excel 2021.
The tricky part is writing the correct XPath query. XML data has a hierarchy—parent tags contain child tags—and your XPath needs to follow that structure. If the data has namespaces (prefixes like ns:currency), you'll need to account for those too, though most simple APIs don't use them.
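To make that hierarchy concrete, here’s a hypothetical XML snippet in the shape the earlier examples assume, along with the queries that target it:
<rates>
  <currency name="USD">1.09</currency>
  <currency name="GBP">0.85</currency>
</rates>
Against this data, //currency returns both values as an array, while //currency[@name='USD'] returns only 1.09.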
Here are some practical ways you can use live data in Excel
Pull top NFL headlines from ESPN

Once you've got the hang of WEBSERVICE and FILTERXML, you can start pulling live information into your spreadsheets. The key is finding sources that offer proper XML or RSS feeds rather than trying to scrape regular websites.
If you follow sports, you can pull the latest NFL news, or headlines for any other league, directly into Excel using ESPN’s RSS feeds. The formula looks like this:
=FILTERXML(WEBSERVICE("https://www.espn.com/espn/rss/nfl/news"),"(//item/title)[1]")
This grabs the most recent headline from ESPN's NFL news feed. The (//item/title)[1] part tells Excel to extract the first item's title from the RSS feed. If you want multiple headlines, you can change [1] to [2], [3], and so on.
You can swap out the URL for other ESPN RSS feeds—MLB, NBA, NHL, or general sports news. Just replace /nfl/ in the URL with the sport you want to track. Every time you refresh the workbook with Ctrl + Alt + F9, it pulls the latest headlines.
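For example, the NBA version of the same formula looks like this:
=FILTERXML(WEBSERVICE("https://www.espn.com/espn/rss/nba/news"),"(//item/title)[1]")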
Other data you can pull
Beyond sports news, there are plenty of XML feeds that work with these functions. The European Central Bank offers daily currency exchange rates in a clean XML format, which is useful if you're tracking international transactions or planning travel.
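One wrinkle worth knowing: the ECB’s daily feed declares an XML namespace, so a naive query like //Cube can come back empty. A common workaround is matching on local-name() instead. As a sketch, assuming the feed keeps its current structure, this pulls the USD rate:
=FILTERXML(WEBSERVICE("https://www.ecb.europa.eu/stats/eurofxref/eurofxref-daily.xml"),"//*[local-name()='Cube'][@currency='USD']/@rate")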

Government agencies like USGS provide RSS feeds for recent earthquake data, and NASA has feeds for their astronomy picture of the day. The World Bank also offers XML-based data feeds for economic indicators like population, GDP, and inflation rates.
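The USGS earthquake feeds are Atom documents, which also use a namespace, so the same local-name() trick applies. As a sketch, assuming USGS still serves its daily summary at this address, this spills the titles of the past day’s quakes:
=FILTERXML(WEBSERVICE("https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.atom"),"//*[local-name()='entry']/*[local-name()='title']")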
The common thread across all of these is that they're official data feeds designed to be machine-readable. Trying to scrape regular websites—like e-commerce product pages or cryptocurrency exchanges—usually doesn't work because they return HTML or JSON that FILTERXML can't parse.
Consider these important API limitations
Some modern websites won't work with this method
While these functions are powerful, they come with some restrictions that can limit what you're able to do. WEBSERVICE can't handle authentication. If an API requires OAuth tokens, login credentials, or authorization headers, this method won't work. Some APIs let you append a key directly to the URL—like ?api_key=12345—which does work, but anything more complex is off the table. Additionally, these functions are Windows-only, so Mac users can't use them at all.
Free APIs often have rate limits that restrict how many requests you can make per hour. If you’re pulling data for multiple items and refreshing frequently, you’ll hit that limit quickly and either get error messages or a temporary IP block. Also, if a feed changes its XML structure or goes offline, your formulas break, and you usually won’t find out until they start returning errors.
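One defensive habit is to wrap the whole call in IFERROR so a dead feed shows a readable message instead of a #VALUE! error. Using the ESPN example from earlier:
=IFERROR(FILTERXML(WEBSERVICE("https://www.espn.com/espn/rss/nfl/news"),"(//item/title)[1]"),"Feed unavailable")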
Understanding the basics opens up live data
XPath takes some getting used to, but it's worth learning
Writing XPath queries isn't intuitive if you've never worked with XML before. You need to understand the data structure, how attributes work, and how to navigate parent-child relationships between tags.
For simple data, it's manageable—but if you're dealing with deeply nested XML or namespaces, the syntax gets complicated fast. There are online XPath testers such as XPather that can help, but they still require trial and error to get it right.
Despite these limitations, WEBSERVICE and FILTERXML are useful for straightforward tasks that require live data from XML-based sources. If you're pulling currency rates, tracking a few stock tickers, or monitoring weather conditions, these functions handle the job without needing Power Query or external add-ins. Just know what you're getting into before building something that depends on them.