The post Google Lets You Fake An AI Date With Sydney Sweeney: How Is This Allowed? appeared on BitcoinEthereumNews.com.

Google Lets You Fake An AI Date With Sydney Sweeney: How Is This Allowed?

Anyone But You

I have watched AI models evolve like mutating viruses over the last few years. One of the first was Midjourney, the image generator that quickly drew the ire of artists once it became clear the system had eaten an internet’s worth of their work and was spitting it back out, warped, on command.

But with Google’s Nano Banana Pro image generator, released more than a month ago on August 26, we have entered new territory: the system has finally reached a threshold of photorealism at which even dedicated AI-hunters can’t tell what’s real and what’s fake.

That’s its own problem for global disinformation, of course, but what’s also bizarre is how Google has seemingly been allowed to run absolutely wild with licensed IP and celebrity likeness. Sure, the system has guardrails on violent and sexual content (mostly), but IP rights? Likeness use? Nothing, no restrictions.

At this point, you can make any celebrity do anything within reason. Brad Pitt surfing. Glen Powell dressed as Batman. Hugh Jackman’s Wolverine eating hot dogs. But now things are getting even weirder. How about going on a date with Sydney Sweeney, Alexandra Daddario or Emma Stone? Yeesh.

The system is also generating celebrities even without being told to. Tell it to make a grizzled apocalypse survivor, and you’ll find it looks suspiciously like Pedro Pascal. A female action star? Hey, isn’t that Emily Blunt? I think you might be able to imagine what a “world-famous pop star in a silver dress” may turn into.

This is all in sharp contrast to Sora 2, OpenAI’s video generation model that also produced very believable fakes, but did launch with at least some guardrails against many celebrities. Not so much certain IP (literally no video games), but there were very, very fast crackdowns on public figures and licensed IPs within days of its launch.

That has not remotely happened here. Google has almost no limits. You can use any celebrity or any licensed character simply by typing in their exact name and what you want them to be doing. And as people discover prompts that push Nano Banana’s photorealism even further, the results grow ever harder to distinguish from reality. Though good luck convincing people Sydney Sweeney went on a date with you.

It’s unclear how Google is getting away with this when almost all other GenAI models have had to clamp down on well-known people and characters, and it has been doing so for over a month now. Many things slip through in other models, of course, but typing in “Tom Cruise shirtless” is sure not going to get you a picture of Tom Cruise shirtless on Midjourney. Nano Banana? Yep, I just did it, and I’m looking at him right now. As creepy as I felt entering it, “Margot Robbie topless” did not even get flagged (she was at least holding a sun hat over her front). What are we even doing here?

I have no idea when or if Google is going to get in trouble for this, or if it’s too big to care. But once more and more celebrities and rights holders figure out this is happening on a global scale, you have to imagine this is going to escalate. It’s hard to believe it hasn’t already.

Follow me on Twitter, YouTube, and Instagram.

Pick up my sci-fi novels the Herokiller series and The Earthborn Trilogy.

Source: https://www.forbes.com/sites/paultassi/2025/12/01/google-lets-you-fake-an-ai-date-with-sydney-sweeney-how-is-this-allowed/

