In the previous post, I showed how Opalytics lets you ask human questions and get analytics answers. This time we're going full dev-mode: OAuth tokens, discovery docs, tool calling, and a small detour where I didn't use the official SDK and built my own attribute-based version instead.
Obviously, first of all I had to set up Piwik PRO tracking on my website. Luckily this is easily done, and everyone can sign up for a free trial. Once that's in place, it's a simple matter of copying and pasting some JavaScript into your layout - or injecting it via your tag manager. This will at least get the basic tracking going; there are many more things you can (and should) track, but I'll leave those for a future blog post.
When you set up a trial you also get access to a set of sample data - which I used for the demo in my previous blog post.
Now that it was all set up, development could begin.
Step 1: Getting data out of Piwik PRO (OAuth without tears)
Piwik PRO exposes a clean OAuth2 client credentials flow. You create an API client, exchange client/secret for a token, and then call endpoints like the “execute query” API to fetch exactly the data you need. You can read all about it here.
To get the ClientID and ClientSecret you basically just go to the API Keys part of your profile and create a new key - it will give you everything you need.
Now, with those in hand I was ready to start calling the API.
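For reference, the token exchange is a single POST. The shape below follows the Piwik PRO docs as I remember them (swap in your own account domain and credentials, and double-check against the documentation):

```http
POST https://your-account.piwik.pro/auth/token
Content-Type: application/json

{
  "grant_type": "client_credentials",
  "client_id": "YOUR_CLIENT_ID",
  "client_secret": "YOUR_CLIENT_SECRET"
}
```

The response contains an access_token (plus an expiry), which you then pass as a Bearer token on every subsequent API call.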
Hand-rolling curl requests was fun for about… three minutes. So I started a tiny .NET SDK that wraps auth plus a few common calls, e.g. "Execute Query", "List goals" and so on. The goal: turn scary JSON into pleasant C# methods. Luckily, a lot of the analytics report data can be retrieved using the main Execute Query method: you provide one or more dimensions and the metrics you want back, along with a time scope, and you'll get exactly what you ask for!
There is fairly decent documentation: Piwik PRO’s developer portal is solid and shows the exact token and query shapes.
Here is how you execute a query, and here is a really useful list of metrics and dimensions you can use.
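To make that concrete, a query for the ten most viewed URLs in a month looks roughly like this, POSTed to the query endpoint. The field names are from my reading of the Execute Query docs (dimensions and metrics are both expressed as columns), so verify them against the documentation before relying on this:

```json
{
  "website_id": "00000000-0000-0000-0000-000000000000",
  "columns": [
    { "column_id": "page_url" },
    { "column_id": "page_views" }
  ],
  "date_from": "2025-01-01",
  "date_to": "2025-01-31",
  "order_by": [[1, "desc"]],
  "limit": 10
}
```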
Now, executing _any_ query is a bit abstract, so to make it easier for the LLMs, I made a collection of LLM-friendly methods for the most typical queries - like the most popular pages, the key metrics and so on. I currently have around 22 of those in place - but the list keeps growing.
Step 2: Let the LLM press the buttons (aka Tool Calling)
With friendly methods in place, I tested local LLMs (Qwen, DeepSeek) with tool calling.
Side-note: after an unfortunate experience with Azure OpenAI's API a year ago, where a few test calls resulted in a $100k usage invoice, I've started testing more and more on locally running LLMs - but that's a story for another time. Fact of the matter: if you have a decent GPU, it's really easy to get set up with Ollama and run your own local LLM - which works great for development and testing.
Not all LLMs support the same advanced method signatures as Opal, so I made a wrapper for them that simplified things a bit - basically making sure to return a plain text string with the result data to the local LLMs (LLMs are very good at understanding text and extracting structured data from it). Opal, on the other hand, just wants the structured data back.
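To make the wrapper idea concrete, here is a minimal sketch. The `PopularUrlsResult` shape and the method name are my own inventions for illustration, not from any SDK: Opal gets the list back as structured data, while local models get it rendered as plain text.

```csharp
using System.Collections.Generic;
using System.Text;

// Hypothetical result type, for illustration only.
public record PopularUrlsResult(string Url, int PageViews);

public static class LlmFormatting
{
    // Local LLMs get a readable text table; Opal gets the structured list directly.
    public static string ToLlmText(IReadOnlyList<PopularUrlsResult> rows)
    {
        var sb = new StringBuilder();
        sb.AppendLine("Most popular URLs (url | page views):");
        foreach (var row in rows)
            sb.AppendLine($"{row.Url} | {row.PageViews}");
        return sb.ToString();
    }
}
```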
The advantage of debugging and testing against local LLMs was that I could easily set breakpoints and step through the calls - and it could all run in the console.
Here you can see how it looked - and Qwen has even been kind enough to share its thought process.
So, basically the LLM read the provided function descriptions and chose the right method surprisingly well. Next stop: Opal, Optimizely’s AI platform for wiring tools to models.
Opal uses a discovery document—a JSON list of functions, descriptions, and parameter schemas. When Opal sees that doc, it knows what your tool can do and when to call it.
However, both the discovery document and the tools/functions need to be on a public website for Opal to be able to reach them. And that means that live debugging can get a little bit tricky...
Step 3: “Why not the official Opal Tools SDK?” (I tried.)
Optimizely provides an Opal Tools SDK for .NET that can auto-register tools and emit discovery for you. It’s neat and evolving—but in my case I hit bumps and needed a very specific shape for routing and auth. Rather than fight the problems, I simply asked my trusted army of AI coding agents to scaffold a super lightweight alternative: attribute-driven API controllers that generate the discovery doc on startup.
(If you’re building your first tool, do try the SDK first; it reduces boilerplate a lot. But if you need odd routing or custom policies, rolling your own can be faster.)
Anyway - to get going with Opal I added an MVC web app to my project, with a ToolsController containing a bunch of actions wrapping the tools I previously prepared.
I then added my custom attributes to both the actions and their parameters in order for it to be able to generate the Opal discovery document. If we just take a look at a simple method like "GetMostPopularUrls" the discovery signature looks like this:
{
  "functions": [
    {
      "name": "GetMostPopularUrls",
      "description": "Lookup the most visited URLs on the website.",
      "parameters": [
        {
          "name": "DateFrom",
          "type": "string",
          "description": "Start date in ISO (YYYY-MM-DD) or relative range: last_7_days, last_30_days, last_90_days, this_month, last_month, this_year, last_year, yesterday, today. Leave null for default.",
          "required": false
        },
        {
          "name": "DateTo",
          "type": "string",
          "description": "End date in ISO (YYYY-MM-DD). Only used if DateFrom is not a relative range.",
          "required": false
        },
        {
          "name": "Limit",
          "type": "number",
          "description": "Number of URLs to retrieve (default 10).",
          "required": false
        }
      ],
      "endpoint": "/tools/GetMostPopularUrls",
      "http_method": "POST"
    }
  ]
}
With this in place, Opal automatically knows which tools exist, how to call them, and what parameters they expect.
Step 4: Code example
Here’s how my controller action and parameters look:
[OpalTool("GetMostPopularUrls", "Lookup the most visited URLs on the website.")]
[HttpPost("tools/GetMostPopularUrls")]
public async Task<List<PopularUrlsResult>> GetMostPopularUrls(GetMostPopularUrlsParameters parameters)
{
    return await _piwikTools.GetMostPopularUrlsAsync(
        dateFrom: parameters.DateFrom ?? GetDefaultDateFrom(),
        dateTo: parameters.DateTo ?? GetDefaultDateTo(),
        limit: parameters.Limit);
}

public class GetMostPopularUrlsParameters
{
    public const string DateFromDescription =
        "Start date in ISO (YYYY-MM-DD) or relative range: last_7_days, last_30_days, last_90_days, this_month, last_month, this_year, last_year, yesterday, today";
    public const string DateToDescription =
        "End date in ISO (YYYY-MM-DD). Only used if DateFrom is not a relative range.";

    [OpalParameter(DateFromDescription, required: false)]
    public string? DateFrom { get; set; }

    [OpalParameter(DateToDescription, required: false)]
    public string? DateTo { get; set; }

    [OpalParameter("The number of URLs to retrieve", required: false)]
    public int Limit { get; set; } = 10;
}
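For completeness: [OpalTool] and [OpalParameter] are my own attributes, not part of the official SDK. A stripped-down sketch of how they, and the reflection pass that builds the discovery document at startup, could look (my class names, and the real discovery entries carry more fields than this):

```csharp
using System;
using System.Linq;
using System.Reflection;
using System.Text.Json;

[AttributeUsage(AttributeTargets.Method)]
public class OpalToolAttribute : Attribute
{
    public string Name { get; }
    public string Description { get; }
    public OpalToolAttribute(string name, string description)
        => (Name, Description) = (name, description);
}

[AttributeUsage(AttributeTargets.Property)]
public class OpalParameterAttribute : Attribute
{
    public string Description { get; }
    public bool Required { get; }
    public OpalParameterAttribute(string description, bool required = true)
        => (Description, Required) = (description, required);
}

public static class DiscoveryBuilder
{
    // Reflect over every [OpalTool] method in the assembly and emit the discovery JSON.
    public static string Build(Assembly assembly)
    {
        var functions = assembly.GetTypes()
            .SelectMany(t => t.GetMethods())
            .Select(m => (Method: m, Tool: m.GetCustomAttribute<OpalToolAttribute>()))
            .Where(x => x.Tool != null)
            .Select(x => new
            {
                name = x.Tool!.Name,
                description = x.Tool.Description,
                parameters = x.Method.GetParameters()
                    .SelectMany(p => p.ParameterType.GetProperties())
                    .Select(prop => (Prop: prop, Attr: prop.GetCustomAttribute<OpalParameterAttribute>()))
                    .Where(p => p.Attr != null)
                    .Select(p => new
                    {
                        name = p.Prop.Name,
                        type = p.Prop.PropertyType == typeof(string) ? "string" : "number",
                        description = p.Attr!.Description,
                        required = p.Attr.Required
                    })
                    .ToList(),
                endpoint = "/tools/" + x.Tool.Name,
                http_method = "POST"
            })
            .ToList();
        return JsonSerializer.Serialize(new { functions });
    }
}

// A dummy tool so the builder has something to find when run standalone.
public class DemoTools
{
    public class DemoParams
    {
        [OpalParameter("Max items to return", required: false)]
        public int Limit { get; set; } = 10;
    }

    [OpalTool("GetDemo", "A demo tool.")]
    public string GetDemo(DemoParams p) => "ok";
}
```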
Step 5: Registering in Opal
Once the code was ready, I published it to a public website, and then it was time to log into Opal and register the tool provider.
Basically, you register tools in Opal by providing a name and the public URL of the discovery JSON document. If you want, you can also provide a Bearer token that Opal will pass along in the HTTP headers on every call - this way you can tie it together with other accounts. For my PoC I took the easy way out and just hardcoded access to my demo data.
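If you do go the Bearer token route instead of hardcoding, the server-side check can be as simple as this helper (my own sketch, not an SDK feature; wire it into middleware or an action filter on the /tools routes):

```csharp
public static class ToolAuth
{
    // True when the incoming Authorization header matches the configured Bearer token.
    public static bool IsAuthorized(string? authorizationHeader, string expectedToken)
        => authorizationHeader == $"Bearer {expectedToken}";
}
```

For anything beyond a PoC you'd want a constant-time comparison and a token pulled from configuration, but the principle is the same: reject any tool call that doesn't carry the token you handed to Opal.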
If you have access to Opal, but don't see the Tools menu - then the ability to register custom tools might not be available to you yet. But I'm sure it's coming soon :-)
Step 6: Building with AI in the loop
I didn’t do this all by hand:
- GitHub Copilot → created PRs for new endpoints + discovery entries.
- ChatGPT → sanity-checked JSON and descriptions.
- Local LLMs → helped test tool calling before wiring into Opal.
The result: faster development, less boilerplate, and fewer mistakes.
Step 7: The architecture flow
Here’s how the pieces fit together:
Wrap-up
So that’s how Opalytics works under the hood:
- OAuth to access Piwik PRO
- A small .NET SDK
- Attribute-based controllers that generate a discovery document
- Opal consuming that document to let LLMs call analytics tools automatically
The end result: analytics you can talk to, instead of dashboards you have to dig through.
As always, I'm left with a long to-do list of things I really want to do to improve this project. I'll see how quickly I can get stuff done and share it with you here - but feel free to reach out if you want to learn more.
This project also sparked some even crazier ideas on which tools you could provide - if I have time I'll experiment a bit with those.