
By: Phelan H.B. No. 366
A BILL TO BE ENTITLED
AN ACT relating to required disclosures on certain political advertising that contains altered media; creating a criminal offense.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:

SECTION 1. Chapter 255, Election Code, is amended by adding Section 255.0015 to read as follows:

Sec. 255.0015. REQUIRED DISCLOSURE ON CERTAIN POLITICAL ADVERTISING CONTAINING ALTERED MEDIA; CRIMINAL PENALTY.

(a) This section applies only to a person who:
(1) is an officeholder, candidate, or political committee;
(2) makes expenditures during a reporting period that in the aggregate exceed $100 for political advertising, other than an expense to cover the basic cost of hardware, messaging software, and bandwidth; or
(3) publishes, distributes, or broadcasts political advertising described by Subsection (b) in return for consideration.

(b) A person may not, with the intent to influence an election, knowingly cause to be published, distributed, or broadcast political advertising that includes an image, audio recording, or video recording of an officeholder's or candidate's appearance, speech, or conduct that did not occur in reality, including an image, audio recording, or video recording that has been altered using generative artificial intelligence technology, unless:
(1) the image or video recording has only been altered to change the saturation, brightness, contrast, color, or any other superficial quality of the image or video; or
(2) the political advertising includes a disclosure from the person or another person on whose behalf the political advertising is published, distributed, or broadcast indicating that the image, audio recording, or video recording did not occur in reality.

(c) The commission by rule shall prescribe the form of the disclosure required by Subsection (b), including the font, size, and color of the disclosure. The commission shall ensure that the form of the disclosure is consistent with other required disclosures on political advertising.

(d) A person commits an offense if the person violates this section. An offense under this section is a Class A misdemeanor.

(e) This section does not impose liability on any of the following persons for political advertising published, distributed, or broadcast by or at the direction of another person:
(1) an interactive computer service, as defined by 47 U.S.C. Section 230(f);
(2) an Internet service provider, cloud service provider, cybersecurity service provider, communication service provider, or telecommunications network;
(3) a radio or television broadcaster, including a cable or satellite television network operator, programmer, or producer; or
(4) the owner or operator of a commercial sign, as defined by Section 391.001, Transportation Code.

SECTION 2. This Act takes effect September 1, 2025.
Sorry for the formatting, link here
Including the full body shows that the law only applies to people engaged in a campaign, and it sets specific boundaries around that.
I think these types of laws are important. Misinformation, including AI-generated images used to soil a candidate's reputation, is a clear problem that needs to be addressed. I'm not saying the language of this specific law is ideal, or even adequate, but it is a start, and other jurisdictions need to be working toward this to protect democracy in this new world of endless content.
We Canadians just had our election, and there were plenty of fake AI images making the rounds trying to link our new Prime Minister to Epstein. I don't want that sort of nonsense to continue into future elections.
I'm not speaking from experience on the firefighter side here, but I do think it comes down to the "ick" factor of smell being so much stronger than the "yum" factor.
Smell is how we know whether something is safe to eat, so if it's off even a bit, that jumps to the top of our attention.
Usually if you burn something a little bit, that’s the only smell you notice.