🤖 AI Summary
Apple updated its App Review Guidelines to require apps to explicitly disclose and obtain user permission before sharing personal data with “third‑party AI.” The tweak — an addition to rule 5.1.2(i) — names AI providers alongside other third parties, and can lead to app removal if developers fail to comply. The timing is notable: Apple is rolling out an upgraded Siri in 2026 (reportedly using Google’s Gemini), and this change signals a tighter posture on how apps route user data to external AI systems as the company builds its own AI features.
For developers and the AI/ML community, the change carries practical and technical implications. Any app that sends user data to external ML models or LLMs (or otherwise delegates inference or processing to AI vendors) will need clear consent flows, updated privacy disclosures, and likely new contract and data‑processing agreements naming AI subprocessors. The broad wording — “third‑party AI” — leaves the enforcement scope ambiguous (potentially covering everything from cloud LLMs to on‑device ML pipelines), so teams should err on the side of transparency, minimize the data shared with external models, and consider on‑device processing or anonymization. The change is one of several guideline updates (including a Mini Apps Program and rules for creator, loan, and crypto exchange apps) that together tighten App Store governance around regulated data use.
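As a concrete illustration of the consent-and-minimization advice above, a minimal Swift sketch might gate any outbound request to a third‑party AI service behind an explicit user decision and strip direct identifiers before sharing. All names here (`AIConsent`, `ThirdPartyAIClient`, the identifier list) are hypothetical, not Apple or vendor APIs:

```swift
import Foundation

// Hypothetical consent state for sharing personal data with a
// third-party AI provider (cf. App Review Guideline 5.1.2(i)).
enum AIConsent {
    case granted, denied, undetermined
}

// Illustrative client wrapper: refuses to build a payload without
// explicit consent, and drops direct identifiers as a minimization step.
struct ThirdPartyAIClient {
    var consent: AIConsent = .undetermined

    // Returns the fields that may be sent to the external model,
    // or nil if the user has not granted consent.
    func payloadForExternalModel(_ fields: [String: String]) -> [String: String]? {
        guard consent == .granted else { return nil }
        let directIdentifiers: Set<String> = ["email", "name", "phone"]
        return fields.filter { !directIdentifiers.contains($0.key) }
    }
}
```

In a real app the consent state would be collected through a disclosure UI and persisted, and the identifier list would come from your privacy review rather than a hard-coded set; the point is only that the network call is unreachable until the user has opted in.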