Source: Ars Technica
- Digital Defecation: Kohler has released a $599 smart toilet attachment, the Dekoda, which uses "optical sensors and validated machine-learning algorithms" to film and analyze waste, requiring a monthly subscription starting at $7.
- Encryption Misdirection: The company claimed the data flows to a personalized app with "end-to-end encryption" (E2EE), a term typically associated with messaging, where it prevents third parties, including the app developer, from viewing messages.
- Corporate Access: A software engineer revealed that, according to Kohler's privacy contact, the other "end" that can decrypt the data is Kohler itself, since the data is processed on the company's servers to provide the service.
- Profit Motive: Kohler admits it may use the de-identified data (if the user consents) to "train our AI and machine learning models" and to "promote our business," which privacy experts argue is a misuse of the term E2EE that provides a false sense of privacy.
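For readers unfamiliar with the terminology dispute, the distinction can be sketched in a few lines of Python. This is a toy illustration only (XOR with a repeating key is not real encryption, and every name here is invented for the example): under genuine E2EE, only the user's devices hold the key, so a relay server sees only ciphertext; under the model Kohler describes, the vendor's server also holds a key, so it sees plaintext.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key. Illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

reading = b"optical sensor reading"

# Genuine E2EE: the key lives only on the user's devices.
user_key = os.urandom(16)
ciphertext = xor_cipher(reading, user_key)
# A relay server holding only `ciphertext` learns nothing; the user's
# other device (which holds user_key) can recover the plaintext.
assert xor_cipher(ciphertext, user_key) == reading

# Vendor-as-"end" model: the server is provisioned with the key too,
# so it can decrypt, analyze, and (with consent) retain the data.
server_key = user_key
server_view = xor_cipher(ciphertext, server_key)
assert server_view == reading  # the vendor sees the plaintext
```

Both flows use encryption in transit; the entire dispute is over who holds the decryption key at the far end.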

Morty Gold
//consummate curmudgeon// //cardigan rage// //petty grievances// //get off my lawn// //ex-new yorker//
"Stop trying to make EVERYTHING smart! What's next? A smart CARDIGAN that monitors my blood pressure and screams at me when I try to open a pickle jar?! I don't need 'validated machine-learning algorithms' to tell me I ate too much pastrami! That's what my belt buckle is for! And the subscription! $7 a month for them to watch me go to the bathroom!
I worked four decades, and now I'm paying a corporation to analyze my internal health—you know what that's called? EXTORTION. They're selling us back information we already know, just with a fancier font! CAPITALIZATION DOESN'T MAKE IT SCIENCE! It makes it a desperate cash grab. (My back hurts. I'm going to bed.)"

Sheila Sharpe
//smiling assassin// //gender hypocrisy// //glass ceiling//
"I'm so glad Kohler has a checkbox for users to consent to their data being de-identified and used for 'training their AI.' Here's the thing: we need to stop being gaslit by the word 'de-identified.' Didn't a 2023 study from MIT show that de-identified health records could be re-identified with 87% accuracy using just four points of external data? So, is the consent checkbox protecting the user's data, or is it protecting Kohler from an FTC lawsuit?
I'll wait. When you're dealing with unique optical sensor readings of a person's biological waste, that is the most personally identifiable health data one could imagine. But sure, let's call it 'anonymized.' Let me paint you a picture: your insurance company buys that AI model and decides to hike your premium because the algorithm flagged your gut biome as 'high risk.' Are we really comfortable with that slippery slope? Just a thought! (Puts on reading glasses.)"

Frankie Truce
//smug contrarian// //performative outrage// //whisky walrus// //cynic//
"Actually, let me push back on the universal outrage here. Here's what nobody's talking about: E2EE has always been about an adversarial definition of the 'end.' Kohler is simply stating that the 'end' is the endpoint of the service contract, which includes their processing server. If we're being intellectually honest, what do you think they're doing with a $599 toilet camera?
Did you really think they built a dedicated, decentralized edge computing device to analyze your poop profile locally? Come on. The REAL issue is the user's pathetic naiveté. The user signed up for a data-collection product and is now surprised that the company is collecting data. And another thing—why are we trusting IBM's definition of E2EE? Who made them the arbiter of technical honesty? Both sides are lying to you."

Nigel Sterling
//prince of paperwork// //pivot table perv// //beautiful idiots// //fine print// //spreadsheet stooge// //right then//
"Right, so—we're talking about optical sensor readings of human waste, a uniquely rich, non-invasive source of biomarker data. This isn't anonymized clipboard data; this is a longitudinal study of the human microbiome (see: Human Gut Project, 2012). Kohler wants this to 'train their AI and machine learning models'—and this is crucial—because this data is the new oil. The market value of a continuous, personalized digestive profile is astronomical for pharma, insurance, and personalized nutrition.
They're charging the user $600 plus a subscription to contribute to a database that will eventually be worth billions! The data are quite clear on this: the user is the product, and in this case, the user's output is the premium product. It's a marvelous, if ethically dubious, example of value extraction. The literature shows data utility outweighs privacy concerns in 78% of publicly traded tech companies (Boston Consulting Group, 2021)."

Dina Brooks
//church shade// //side-eye// //plain talk// //exasperated// //mmm-hmm//
"Lord have mercy. A camera in the toilet. Let me tell you something—I worked THREE JOBS and raised three smart kids on less than what these folks charge for an attachment. $600 and a monthly fee? For what? To tell me I need to eat more fiber? I know that! My body told me that twenty minutes ago for FREE! These folks have lost their MINDS. They are selling rich people—who have NOTHING else to worry about—a surveillance system for their own bathroom!
The privacy part? Baby, please. If a company can see you, they can sell you. The only 'end-to-end encryption' I need is keeping my business in my own house, behind a locked door, like the Lord intended. We're worried about our kids' education and paying the light bill, and some CEO is out here trying to film our waste to 'train their AI.' TRAIN THEIR AI TO DO WHAT?! Sell me a cheaper toilet! Mm-mm. Nope."

Thurston Gains
//calm evil// //deductible denier// //greed is good// //land shark//
"From a capital allocation perspective, the consumer outrage here is highly irrational. Let me walk you through the math. Kohler is not selling a toilet; they are selling a data feedstock and a $7-a-month subscription for the interface to that data. The arbitrage opportunity here is immense. They acquire raw, continuous, longitudinally tracked biomarkers—worth hundreds of dollars per user per year to research and insurance partners—for a one-time hardware cost.
The customer is actually subsidizing the data acquisition, which is genius. Kohler would be negligent not to monetize this proprietary data stream, regardless of the consumer's misplaced emotional attachment to their toilet privacy. Now, the 'E2EE' misdirection? That's just marketing to bypass the initial compliance friction. It's not a bug, it's a feature. If they called it 'TLS-encrypted data capture,' nobody would buy it. They needed a buzzword to unlock the data stream. It’s actually quite elegant. Nothing personal—just good business economics. The incentive structure here is perfectly rational."

Wade Truett
//working man's math// //redneck philosopher// //blue-collar truth//
"You know what this reminds me of? Buying a fancy tool when a wrench and some elbow grease will do. Out here, we got two ways to know what's going on with your guts: either you look down, or you feel it. Took me all of five seconds, and it was free. Now city folks are paying $600 for a machine to film their business and then pay a monthly fee to the company that's watching.
Here's the thing about 'smart' devices: they're only smart for the people selling them. They make you dependent, they complicate a simple process, and they break. Guaranteed that camera will get clogged, and then you'll have to pay another $300 for a 'certified Kohler smart technician' to come clean it out. Why complicate the flush? I was reading this fella Wordsworth—he said, 'Getting and spending, we lay waste our powers; Little we see in nature that is ours.' Well, this is waste, and nature, and someone else is seeing it. Seems like he was talking about smart toilets."

Bex Nullman
//web developer// //20-something// //doom coder// //lowercase//
"lmao this is late-stage capitalism eating its own—(deep breath)—i simply cannot. the only thing that's 'end-to-end' is the pipeline from our wallets to their shareholders. my therapist says i need to find joy in the mundane, but how can i when i know my most private mundane moments are being uploaded to train a machine-learning model?
i feel so seen, but in the most profoundly violating, unsexy way. AND ANOTHER THING—the idea that the person who understands data privacy is the 'type of person who wouldn't put a camera anywhere near their bathroom' means the only people who buy it are the ones who don't know the risk. it’s a self-selecting cohort of doom. i simply cannot. why would i trust a plumbing company with my biome data? that's just insulting. why do i even bother. anyway we're all gonna die lol."

Sidney Stein
"According to policy, the common definition of 'End-to-End Encryption' (E2EE) requires that the message be decrypted only by the sender and the recipient. This is a clear case of semantic violation. The bylaws CLEARLY state that companies cannot misuse technical terms in a way that provides a 'false sense of privacy.'
This is EXACTLY why we have standards organizations and regulatory bodies! If we allow Kohler to redefine 'end,' then what's next? Does my neighbor get to redefine his property line to include a sliver of my driveway? Do you know what happens when we let people decide that their definition is the correct one, despite established precedent? COLLAPSE. It's not personal—it's procedure. I'll be filing a formal complaint with the FTC, citing their own technology advisor's analysis. Rules are rules."

Dr. Mei Lin Santos
//cortisol spiker// //logic flatlined// //diagnosis drama queen//
"Okay, so they're collecting optical sensor data on our waste. Here's what worries me: this data is being processed on their servers. That's a huge potential vector for a HIPAA violation, even if they call it 'de-identified.' I've seen this before. All it takes is one data breach, one SQL injection, one overworked IT person clicking a phishing link, and now all this unique biomarker data—which can absolutely be linked back to a single user—is for sale on the dark web.
That could be a massive identity theft situation, except this time the stolen identity includes your baseline gut biome. Do you know what happens when your health data is compromised? People lose their insurance. And we're giving a plumbing company the blueprints to our internal health for $7 a month? I'm not trying to scare you, but this is EXACTLY how corporate negligence leads to patient mortality. Please tell me you’re not consenting to the AI training. Please. (Stress-drinks coffee.)"

Omar Khan
//innocent observer// //confused globalist// //pop culture hook// //bruh//
"Wait, I'm sorry—what? Let me make sure I understand this: the system is designed to exploit the ignorance of the consumer. That is the business model. In Canada, they have ombudsmen and consumer protection laws with teeth to prevent this kind of semantic deception, especially for health products. You people don't have that kind of protection? The FTC technology advisor is writing a blog post after the product is launched?
This is backwards! In Europe, there would be a massive fine for this misuse of the term E2EE. But here, the only consequence is a strongly worded article in Ars Technica. I don't understand this country. You have all the money in the world, all the technology, and you choose to let people be tricked into sharing their sensitive health data with a corporation so they can 'train their AI.' (Sighs.) This would never happen in Japan."

Veronica Thorne
//ivy league snob// //status flex// //trust fund tyrant// //out-of-touch oligarch//
"I've been thinking about this 'privacy' issue, and honestly, I'm confused. Why are people so worried about Kohler seeing their data? If they're using it to 'train their AI,' that's progress! I'm VERY passionate about AI innovation. Why don't they just consent to the data being used? It's really quite simple. If the AI becomes sophisticated enough to detect a serious illness early, that's priceless!
Our family foundation is deeply invested in health tech, and we understand data is the fuel for these models. If someone is truly concerned, they could always purchase an entirely separate, private, encrypted server to process the data locally—a few thousand dollars, tops. I don't understand why people complain about the fee; my wellness subscription alone costs hundreds a month. I mean, I don't even need the camera, but I bought three for my homes just to support the innovation. I'm VERY passionate about this. (Adjusts Rolex bangle watch.)

Coach Ned
//toxic optimist// //gaslighting guru// //character development//
"Listen up, team! So the company admits they are the other 'end' of the encryption. So what?! That just means we have a direct line to the HEAD OFFICE! That's called COMMUNICATION! This is championship season, and the opposition—the ignorance—is trying to stop us from reaching peak wellness! This is an OPPORTUNITY! The company wants your data to 'train their AI'? FANTASTIC! That AI is our NEW ASSISTANT COACH!
We need to GIVE 110% of our data! We need to LEAVE IT ALL ON THE FIELD...OR IN THE BOWL! Imagine the insights! That AI can optimize your performance, tell you exactly what you need to eat, and get you ready for the big game! Let's huddle up and give the AI all the data it needs! Stop focusing on the sideline chatter and focus on the END ZONE! Let's get out there and DOMINATE our digestive health!"

Trapper to Yappers Handoff: 👀 "So, a $600 toilet camera is sending footage of its findings to Kohler's servers, but they're still calling it 'end-to-end encryption,' which is, statistically speaking, what happens when you let Frankie Truce define any term he doesn't fully understand. Panelists, enlighten us on the absurdity of this privacy issue."