Blog

Thoughts on product management, AI, compliance, and language.

Building a Cross-Platform DSA View: What the Data Doesn't Tell You

The VLOP dashboard I published brings 30 services into one view, but its limitations are as important as the data. The most significant: category definitions are not standardized across platforms. What TikTok labels "Hate Speech" and what Meta labels "Hate Speech" are defined by each platform's own policies, not by the DSA — so volume comparisons between platforms are comparing apples to internally defined fruit. The data is also self-reported, with no third-party audit requirement under the current framework. Some platforms aggregate differently too: Google reports its 6 services separately while others report at the platform level, which affects how per-service numbers read. Worth noting: the DSA Observatory argued in January that removal counts and appeals figures can't tell you whether moderation is accurate or proportionate, and that Article 15(1)(e)'s "indicators of accuracy" requirement had produced meaningless data — but that post predated the first harmonized reports. The new template, which these H2 2025 reports follow, does include precision and recall indicators, so that specific gap has been addressed. The structural problems of self-reported data and inconsistent category definitions remain. The dashboard is most reliable for tracking volume trends within a single platform over time; cross-platform normative comparisons should be treated with caution.

EU DSA VLOP Data Analysis

New: Cross-Platform EU DSA Transparency Dashboard for VLOPs

I've published an interactive dashboard covering H2 2025 EU Digital Services Act transparency reports across Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). The dashboard combines data from 30 services — including Google (6 products), X, TikTok, Meta (Facebook and Instagram), Pinterest, AliExpress, Amazon, LinkedIn, Snapchat, Booking.com, SHEIN, Temu, Zalando, Wikimedia (6 services), and others — into a single cross-platform view. It covers the four main DSA reporting categories: Notices, Own-initiative actions (illegal content and policy violations shown separately), Government Orders, and Appeals. Filter by platform, service, content category, and keyword; all data is processed in your browser.

EU DSA VLOP Dashboard Data

How I Use LLMs in Compliance PM Work

The use cases that have actually stuck are fairly specific. Requirement extraction: paste a statute or guidance document into the model and ask it to enumerate the reportable data elements, then diff that list against what current data infrastructure produces — especially useful when a new law (California AB 587) has overlapping but non-identical scope to an existing one (EU DSA Article 15). Gap analysis: maintain a structured cross-jurisdiction requirements table and use the model to flag where new guidance shifts the picture. Draft review: run a near-final report draft against a statutory checklist before it reaches Legal, to catch omissions early. None of these replace domain judgment, but they compress the time between "new regulatory requirement" and "here is what we need to build."
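The extraction-and-diff step can be sketched in a few lines. This is a minimal illustration, not the actual workflow: every element name and both sets below are invented for the example.

```python
# Hypothetical sketch of the requirement-extraction diff: compare the data
# elements an LLM enumerated from a statute against what the current
# reporting pipeline already produces. All names here are invented.

# Elements the model extracted from the new statute's reporting section
extracted = {"content_category", "action_type", "appeal_outcome", "language"}

# Elements the existing pipeline already emits
current = {"content_category", "action_type", "notice_source"}

gaps = sorted(extracted - current)    # required but not yet produced
unused = sorted(current - extracted)  # produced but not required by this law

print("to build:", gaps)
print("not required here:", unused)
```

The same set-difference pattern extends naturally to the cross-jurisdiction table: one set per statute, diffed pairwise to see where scopes overlap and where they diverge.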

AI/LLM Compliance PM Product Management

Three Things Worth Noticing in the Google Removal Request Data

Now that the dashboard has been up for a day, here are the findings most worth exploring. The country distribution for defamation requests looks very different from government criticism requests — defamation dominates in large democracies like France, Germany, and India (where courts actively issue removal orders), while government criticism concentrates in a narrower set of countries with different press freedom profiles. Removal rate isn't a fixed property of any country: it shifts across the 13 reporting periods, and you can watch compliance posture tighten or loosen over time for a given country or product. And requestor type matters — a Court Order carries different operational weight than a Police request, and the mix has changed year over year. All of this is one or two filter changes away in the dashboard.

Google Transparency Data Analysis

New: Google Government Removal Requests Dashboard (2019–2025)

I've built an interactive dashboard on top of Google's public government content removal data — turning the per-period snapshots Google publishes into a multi-year trend explorer. The dataset spans 13 reporting periods from 2019 to mid-2025, covering 160 countries, 42 Google products, and 22 stated removal reasons. I was a primary author of the Greater China section of the Google Transparency Report from 2014 to 2024, so this dataset is one I know well. Every filter combination — country, requestor, product, reason — generates a time series, and a separate chart tracks removal rate over time. All data loads directly in your browser.
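The removal-rate chart reduces to a simple per-period ratio over whatever rows the filters select. A minimal sketch of that computation — the field layout and figures below are invented for illustration, not Google's actual schema:

```python
# Hypothetical filtered rows: (reporting_period, items_requested, items_removed)
rows = [
    ("2019-H1", 120, 84),
    ("2019-H2", 150, 90),
    ("2020-H1", 200, 160),
]

# Removal rate per period: items removed / items requested
rates = {period: removed / requested for period, requested, removed in rows}

for period, rate in rates.items():
    print(f"{period}: {rate:.0%}")
```

Each filter combination in the dashboard just changes which rows feed this ratio; the time-series chart is the same computation plotted per period.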

Google Transparency Dashboard Data

California AB 587 vs. New York S895: What's Different and Why It Matters

Both laws require social media companies to publish transparency reports on content moderation, but their scope and data requirements diverge in ways that matter for reporting infrastructure. AB 587 focuses on policy disclosure: platforms must file their terms of service and describe how they enforce them across 10 content categories (including hate speech, harassment, and foreign political interference), with semi-annual reports to the California AG. S895 is more operationally specific: quarterly reports to the New York AG covering the number of pieces of content actioned, accounts suspended, and appeals processed — broken down by content category. In practice, AB 587 requires narrative and policy mapping while S895 requires structured operational data. The overlap is substantial enough to share review cycles, but the underlying data requirements are different enough to maintain separate pipelines.
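One way to see why the pipelines stay separate is to sketch the two record shapes side by side. The field names here are illustrative shorthand, not the statutes' actual defined terms:

```python
from dataclasses import dataclass

@dataclass
class AB587Disclosure:
    # AB 587-style record: narrative policy mapping per content
    # category, filed semi-annually with the California AG.
    category: str          # e.g. "hate_speech" (one of the 10 categories)
    policy_text: str       # the relevant terms-of-service language
    enforcement_note: str  # how enforcement works for this category

@dataclass
class S895Metrics:
    # S895-style record: structured operational counts per content
    # category, reported quarterly to the New York AG.
    category: str
    content_actioned: int
    accounts_suspended: int
    appeals_processed: int
```

The shared `category` field is what lets the two filings share review cycles; everything else — free text versus counts, semi-annual versus quarterly — is why the data pipelines diverge.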

Regulatory Compliance PM Product Management

Roblox's California and New York Transparency Reports Are Live

Two state social media transparency reports are now filed — Roblox's California AB 587 (H2 2025) and New York S895 (Q4 2025). The two statutes overlap in scope but diverge on specifics: AB 587 focuses on content moderation policies and their enforcement, while S895 requires more granular data on actions taken against specific accounts and content. Running both filings in parallel meant maintaining separate data tracks while sharing cross-functional review capacity across Legal, Policy, and Engineering — a useful forcing function for building more durable reporting infrastructure.

Roblox Compliance PM Transparency Regulatory

What EU DSA Transparency Reporting Actually Requires

The EU Digital Services Act's transparency reporting obligations are often described in shorthand — "publish a report on content moderation" — but the actual statutory requirements are more specific. Article 15 requires all intermediary services to publish annual reports covering orders received from member state authorities, notices received under Article 16, own-initiative moderation actions, and use of automated tools. For VLOPs and VLOSEs, Article 42 adds substantially more: bi-annual reporting (every six months instead of annually), detailed breakdowns by content category and language, decision-making timeline data, appeals information, and figures on government orders. The practical gap between what a notice-only platform must report and what a VLOP must report is significant: VLOP obligations require purpose-built reporting systems and structured data pipelines, not ad-hoc exports.
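The jump from annual aggregates to Article 42-style breakdowns is what forces purpose-built pipelines: every action has to carry its category and language through to reporting. A toy sketch of the grouped shape — categories, languages, and counts are all invented:

```python
from collections import Counter

# Hypothetical per-action log entries: (content_category, language)
actions = [
    ("hate_speech", "de"),
    ("hate_speech", "fr"),
    ("spam", "de"),
    ("hate_speech", "de"),
]

# Article 42-style breakdown: action counts by category x language
breakdown = Counter(actions)

for (category, language), count in sorted(breakdown.items()):
    print(category, language, count)
```

An ad-hoc export can produce a single total; it cannot retroactively produce this grouping unless the underlying events were logged with those dimensions in the first place — which is the real infrastructure requirement.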

EU DSA Compliance PM Regulatory Product Management

Roblox's EU Digital Services Act Transparency Report (H2 2025) Is Live

Roblox has published its EU Digital Services Act transparency report for the second half of 2025. I owned end-to-end delivery — scoping data requirements against Article 15 obligations, building the cross-functional production workflow across Legal, Policy, and Engineering, and driving on-time publication. The report covers Roblox's content moderation activity for 1 July – 31 December 2025, including notices received and actions taken across EU services. Alongside it, Roblox also published its Terrorist Content Online (TCO) transparency report for the same period.

Roblox EU DSA Compliance PM Transparency