1 minute read
Reposted from Taylor English Insights

Google To Require Disclosure of Use of AI in Campaign Ads

Google will require campaigns to disclose certain types of digitally altered or generated content. The new rules are part of Google's broader standards against deceptive, fraudulent, and misleading political ads. "Synthetic content" that has been altered or generated to depict "real or realistic-looking" people or events must be "prominently disclosed." De minimis edits (such as resizing an image) and certain other digital changes do not have to be disclosed.

Why It Matters

Although US regulators have expressed interest in how AI can be used to generate misleading or harmful campaign ads, they have not yet taken action against it. Having private actors, especially ones with reach as broad and deep as Google's, set standards in the marketplace may be a useful stop-gap until officials decide what to do. However, Google is a private and largely unregulated business, and (like all private companies) is run for shareholder benefit rather than for the public good. Its rules may therefore permit activity that a regulator would eventually deem harmful or misleading. Citizens' and consumer groups will likely applaud the transparency Google requires, but they can be expected to continue pressing for official rules and regulations as well.

All verified election advertisers are required to “prominently disclose” if an ad contains synthetic content that has been digitally altered or generated and “depicts real or realistic-looking people or events,” according to Google.
