Content Moderation Specialist
Full-time
Senior
Posted 1 week ago
About Anthropic
Anthropic’s mission is to create reliable, interpretable, and steerable AI systems. We want AI to be safe and beneficial for our users and for society as a whole. Our team is a quickly growing group of committed researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.
About the role
Anthropic's Integrity & Compliance (I&C) function is building the systems that let us scale responsibly as our products reach more people, more enterprises, and more regulated industries. Our global compliance program is bespoke, reflecting our unique mission and position as one of the leading AI labs operating on the frontier.
Regulatory Programs is a key pillar of our overall Integrity & Compliance function, covering a range of compliance domains including economic sanctions, US export controls, and regulatory compliance programs stemming from global AI safety regulation.
As a Content Moderation Specialist, you'll own day-to-day program management of Anthropic's global content moderation and online safety regulatory compliance program. Online safety regulation is one of the fastest-moving areas of technology law, and AI sits squarely in its sights. Regimes including the EU Digital Services Act, the UK Online Safety Act, Australia's Online Safety Act, and a growing set of emerging frameworks worldwide create novel obligations for how AI products are built, deployed, and governed. You will be at the forefront of translating those obligations into a defensible, well-documented compliance program, with regulatory risk assessments at the core of the work.
This is a deeply cross-functional role. You'll partner closely with internal counsel, Safeguards, and operations teams across Anthropic to build the compliance program and frameworks that demonstrate that Anthropic meets its obligations under content regulation. This is a builder's role at a company that takes integrity seriously and moves fast: you'll exercise independent judgment on issues without clear precedent and help build durable programs that let Anthropic move quickly while honoring its obligations to regulators, customers, and the public.
Key responsibilities
Own the global content regulation risk assessment program, including the roadmap of required assessments across jurisdictions, a consistent and repeatable risk assessment methodology and framework, and the coordination of inputs, consultation, and approvals for each assessment
Build and maintain systems and trackers to assess, operationalize, and report on relevant regulatory requirements across Anthropic's products and jurisdictions
Partner with internal counsel, Safeguards, Policy, engineering, and operations teams to align internal practices with external commitments and legal obligations
Maintain a controls inventory and the compliance documentation library for content regulation, ensuring documentation is drafted, reviewed by the right stakeholders, and kept current
Conduct gap analysis when new or amended content regulations come into scope, and stand up the compliance readiness plan and workback for each
Provide regular written program status reporting to stakeholders and leadership, proactively surfacing stalled or at-risk items with a proposed path to unblock
Take on additional related work as the program evolves; job duties and responsibilities may change from time to time at Anthropic's discretion or as required by applicable law
Minimum qualifications
Experience managing regulatory or compliance programs at a technology company or in a regulated industry
Hands-on experience conducting or program-managing regulatory risk assessments, including coordinating inputs across multiple functions
Demonstrated ability to build and maintain compliance program artifacts, including policies, risk assessment documentation, controls inventories, program trackers, and readiness plans
A track record of executing cross-functionally, driving outcomes across legal, product, policy, and operations partners without direct authority
Excellent written and verbal communication skills, including producing clear program documentation and status reporting for senior stakeholders
Sound judgment and the ability to make decisions and move work forward with incomplete information in an evolving regulatory environment
Preferred qualifications
5+ years of relevant experience in regulatory program management or content moderation compliance
Direct experience with online safety or content moderation regulation, such as the EU Digital Services Act, UK Online Safety Act, Australia Online Safety Act, or comparable regimes (strongly preferred)
Experience in trust and safety, online safety, or regulatory compliance at a large consumer technology platform
Prior experience at a Big 4 or other professional services firm advising on content regulation, online safety, or platform compliance