All Watched Over
By Machines of Loving Grace
12 min read
The vision of cybernetic utopia invoked in Richard Brautigan’s classic poem is being attempted at Ahimsa, a new co-living community.
I am planning to relocate to Seattle, so I reached out to a friend of mine whom I’ll call Kyle. Kyle and I had met in a co-living community in Nashville a few years ago, and now he lives in Seattle. I didn’t know anybody else in the city, and I wanted some advice on what the housing situation was like. Kyle is living in another co-living community (his fourth), but the stories he told me about this one led me down a deep rabbit hole into a world of cutting-edge technology, social dynamics, and the ethics of combining the two. The community in question is called Ahimsa; it has existed for about a year and is about to open two more locations. If you haven’t heard of them, check out their website. I’ll wait.
The mention of the company sparked a tickle of recognition at the back of my brain. I first came across Ahimsa in an Augur article last year. Despite the CEO’s relative tech-industry fame and Cirrus AI being basically a household name at this point, the announcement largely flew under the radar. I didn’t think much of it at the time, and I’m not sure many other people did either. Tech companies regularly make larger-than-life claims about what their technology is capable of, and apart from the application of Cirrus, this co-living community didn’t sound much different from the myriad others that dot the fabric of cities all over the world. Roommates and AI? If you say so.
As a veteran of just one previous co-living community, I wanted to know what made Ahimsa different. I expected the usual: mostly people in their twenties and thirties, yoga groups, bar crawls, pottery classes. But what was life like living with an AI always up in your business? Kyle’s stories painted a picture that intrigued me, and he kindly introduced me to a number of other residents (current and former) who also wanted to talk about their experiences, but everyone was very concerned about their anonymity. So, wherever you see names, know they’ve been changed. At Ahimsa, your neighbors know who you are. For some, that’s great. For others, not so much.
Aasiya is a freelance writer, designer, and illustrator. You can find her here, writing about her fascination with design, technology, and the future.
Published on June 5th
It’s All Fun and Games Until Someone Loses a Point
To a person, they all started off by telling me about HP, or House Points. They seem to govern every aspect of life at Ahimsa. Ahimsa claims their core philosophy is compassion and a “commitment to non-violence” (more on that later), but to the residents I spoke with, it seemed like their core philosophy was based around giving everyone grades. Every apartment at Ahimsa comes with a device called AtHome that always shows you not only your HP score, but how you’re doing in three difficult-to-define metrics: cleanliness, sustainability, and apparently, harmony. For an explanation of all of this from the CEO himself, a resident directed me to this episode of Anastasia Lee’s podcast, recorded shortly after Ahimsa opened last year. I think it provides some useful context for understanding their intent, and will help measure that intent against reality. Get up and stretch while you give it a listen.
May 10, 2030: Ahimsa
Cirrus AI co-founder Adam Rutledge-Pyke talks about his new venture
Most of the residents I spoke with have experienced a shift in attitude toward points over the last year. “At first, it was great. Everybody was sort of on their best behavior. I’ve lived in a lot of roommate situations, and it can really suck when someone’s a slob or when people have very different ideas of what ‘clean’ means,” Kyle said. “But the AI keeps everybody accountable, makes sure we all clean up our messes, do our chores, that sort of thing. It did a lot to help people get to know each other, too. It was cool to feel like you were being rewarded for doing fun things, so people were always in the lounges, at the bar, even kind of wandering in and out of each other’s apartments like it was an episode of Friends or something.”
“It’s really ironic, then, that this thing that was supposed to bring us together ended up tearing us apart.” When I asked him what he meant, he was quick to point out that not everyone shared his opinion, and some people still seemed to like it, but for him, a big issue was the desirability of the prizes. “Including a financial component was, in my opinion, a huge mistake. People got really, really competitive over it. If they’d just kept it at free drinks or a free month of Netflix, it would’ve been less intense. But giving people free rent? That’s thousands of dollars.”
“The level of competition created this weird social taboo around talking about your score. Asking someone their total was like asking them their salary. It got to the point where people would hide or cover their AtHome when they had folks over,” he told me. Socializing became an arms race, with people forming tribes and alliances, and actively trying to avoid others with known high scores.
“Nobody understands how points are scored, but everybody has theories,” Kyle told me, which created anxiety for some. According to several residents I spoke with, there were a lot of accusations, blaming, and finger-pointing. Alyssa, a 26-year-old retail worker, told me her roommate would get angry whenever chores weren’t done before the AI prompted them, because she was convinced they lost points every time it had to remind someone. Never mind, Alyssa says, that the disagreement affected their harmony score. Dominique, 34, told me, “Some people figured out you could earn points just by keeping your door open, since apparently the AI thinks it’s an attempt to socialize. But as a single woman in the city, I don’t feel comfortable doing that. And I don’t like being penalized for prioritizing my safety.”
Dominique went on to recount one of her more bizarre experiences. “I came into the kitchen to find two of my roommates arguing vehemently, but in these really bright, friendly tones of voice. They thought that a disagreement would lower their harmony score and cost us points, so they figured they could trick the AI by sounding happy and upbeat. It was profoundly surreal.”
Product design sketch of Ahimsa’s AtHome device
Harmony might be poorly understood, but Dominique told me that it’s never underestimated. “Sometimes if the environment in the apartment is a little icy, for whatever reason, it can just...tell. It’s pretty uncanny.” She said that even if there had been no outward disagreements, the harmony levels were often an accurate reflection of how she and others were feeling about their living situations. While the total HP score updates every night at midnight with no explanation or breakdown, the levels of cleanliness, sustainability, and harmony are always in flux. “It’s really jarring to be having a deteriorating conversation with your roommate and catch the harmony score incrementally decreasing on the AtHome out of the corner of your eye,” which she says has happened to her on more than one occasion.
The real stress, Alyssa said, was in wondering what people actually thought of you. “It’s often really hard to tell if someone is being nice to me because they genuinely like me, or because they just want to earn points. The building is always watching, and every interaction counts. I started getting really paranoid about it.” Eventually, she stopped going to larger gatherings and just stuck with a handful of people she trusted. She said this strained her relationship with her roommates, who cared more about earning points than she did. “That just made it worse, though,” she said. “It made me feel like they only wanted me to go out with them for the points.”
Rohan, a 27-year-old former resident originally from India, felt that the point system discouraged people from making friends outside the building. “Non-residents aren’t allowed at a lot of events, and you don’t gain HP for socializing with them.” To him, it seemed like he was being forced into a social group with only his neighbors, some of whom he didn’t really care for. “Adam [Rutledge-Pyke, CEO] is a decent guy, but his friends are idiots,” he said. “There’s like a dozen of them spread across a few apartments, and a lot of people find them...hard to like.” Rohan eventually moved out because he found the insular environment suffocating.
The CEO’s friends won several of the early building-wide tournaments Ahimsa calls (somewhat ominously, in my opinion) “The Games.” According to Rohan and Kyle, the prevailing opinion is that they won because The Games are largely designed for people like them. In addition to the normal HP scoring across the three metrics, the AI creates games and scenarios for people to earn extra points. Despite the guiding light of ahimsa, a lot of the early games were competitions that, while perhaps not outright violent, could get uncomfortably physical. “Really bro-y stuff,” Rohan said. He paused a moment. “This whole thing would never work in India. People are more attached to their families back home. This place feels like Neverland, where nobody ever grows up.” Even the not-insignificant minority of senior citizens, he said, sometimes seemed to have reverted to adolescence.
Courtesy of Ahimsa promotional materials
Some of The Games were truly ridiculous. Kyle introduced me to a man named Rudy, who told me about a particularly absurd incident. “So sometimes the AI will challenge a group of people out of nowhere to complete some random task. Sometimes it’s a puzzle or a riddle. Other times, it ain’t that. Once, I was in the lounge with some folks and the AI just started talking, telling us that the first person to race up five flights of stairs would win some points. I’m 68 years old. My knees are shot. I was talking to some kids in their twenties. I’m not going to do that. What if I was in a wheelchair, or a pregnant lady? I don’t feel like the AI was ever programmed to think about this sort of stuff.”
Apparently, even Ahimsa isn’t really sure why or how their AI does what it does. When residents began asking questions, Ahimsa’s official position was that they could not divulge anything that could be considered a proprietary trade secret. But an Ahimsa employee who lives in the building let it slip to Kyle that they were just as much in the dark as everyone else. The system is based on Cirrus tech, but this is the first time it’s been applied to something as murky as managing social interactions. They can make tweaks to it, but overall, it remains something of a mystery.
Community of Lost (Adult) Children
A number of residents mentioned viewing the AI as a surrogate parent—with all the complications that entails. Kyle showed me a social media post from a fellow resident.
For Alyssa, it’s nice to offload the burden of confrontation that often comes with living with roommates. “I actually really like the AI being sort of the parent in the house. I’m an introvert and I don’t really handle conflict well. But when someone isn’t pulling their weight, the AI lets them know. It takes a lot of the pressure off,” she told me.
Kyle, however, pointed out that it has not exactly contributed to everyone’s maturity. “One of my roommates won’t do anything or clean anything until the AI reminds him to. If we ask him to do something, he gives us a sort of ‘you’re not my real dad’ retort. His philosophy is that if the AI hasn’t asked for it, it doesn’t matter.” Kyle also said that the roommate in question often claims that they shouldn’t be unhappy with him, because it would lower their harmony score.
Social Media post from an Ahimsa resident
It might be even more parental than everyone initially realized. According to Dominique, mutual exasperation with the AI brought her and her roommates closer together. “It reminds me of bonding with my siblings over our parents being annoying,” she laughs. “It is a little ironic, though. At the end of the day, the damn thing is still working as intended. It’s building community.”
Stop Fighting Or I Will Turn This Car Around
One area that received a lot of praise from most of the residents I interviewed was the AI’s conflict mediation feature. When roommates have a dispute, sometimes (it’s not always predictable when) the AI will jump in and act as a moderator. To me, this sounds eerily dystopian and a little paternalistic, but even Kyle, who was largely critical of Ahimsa, seemed to be on board. “So it employs a conflict resolution method called NVC, or non-violent communication. That’s what ahimsa means, right? It’s a creed of non-violence. This is one place where they really follow through.”
Non-violent communication is about re-framing the things that upset you. Instead of criticizing the other person, you talk about your own emotional needs. Kyle provides an example: “So, instead of saying, ‘You’re an asshole cause you always leave your dirty pan on the stove,’ the AI will suggest we reframe it to something like, ‘When you don’t clean your pan, I feel frustrated and ignored and need to feel like you respect me enough to listen to my requests. Therefore, I would like you to clean your pan.’ There’s a formula to it, like observation, feeling, need, request. There’s also a big component of listening and reframing what the other person said.” He said that it’s had a pervasive impact on how he handles conflict not just at home, but in other relationships and situations. “My girlfriend gets exasperated with me sometimes,” he jokes, “because she’ll be like ‘you don’t listen to me’ and I’ll be like ‘what I’m hearing is that you feel upset because you want a greater sense of connection, and therefore would appreciate it if I listened more actively.’ It even affects how I talk to people at work. It’s surprisingly useful.”
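The four-part formula Kyle describes is mechanical enough to sketch in code. Here’s a purely illustrative Python template (nothing here reflects how Ahimsa or Cirrus actually implements it; the function and phrasing are my own invention) that strings the parts together in order:

```python
def nvc_reframe(observation: str, feeling: str, need: str, request: str) -> str:
    """Compose a statement following the four-part NVC formula:
    observation -> feeling -> need -> request."""
    return (
        f"When {observation}, I feel {feeling} "
        f"because I need {need}. "
        f"Would you be willing to {request}?"
    )

# Kyle's dirty-pan example, restated through the template
message = nvc_reframe(
    observation="you leave your dirty pan on the stove",
    feeling="frustrated and ignored",
    need="to feel that my requests are heard",
    request="clean your pan after cooking",
)
print(message)
```

The point isn’t the code, of course; it’s that the formula is rigid enough that an AI can coach people through it step by step.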
There are some drawbacks, though. Another resident I spoke with, Jeremy, told me that, while he felt some of the NVC skills he’d learned were helpful, sometimes the AI chose the wrong time to butt in. “I live with my wife, and there was a period of time where we were really struggling and having a lot of serious talks. The AI kept trying to get involved, and it was extremely off-putting. It has this genderless voice that sort of emanates from the walls. And it’s fine when I’m asking a roommate to do more vacuuming or something, but I don’t need it involved in my marriage.” After pondering a moment, he reflected, “That said, I think it did actually help us communicate better. Maybe it even saved my marriage? Who knows?” Dominique speculates that it had helped everyone build a shared language of conflict resolution, and that had proven helpful, especially when things got dark with The Games.
The Bad and the Ugly
Unfortunately, there are very few safeguards against bad actors. People exploit the system in different ways, especially when it comes to HP. Early hacks included two people just constantly holding the door open for each other and walking in and out. Others have been caught cheating at The Games. One set of folks would record themselves talking to each other and just play it on speakers while they worked from home with their headphones in to make the AI think they were socializing. Ahimsa corrected for these exploits (or claimed to), but people are still always trying to find ways to beat the system.
Alyssa told me that freeloaders were occasionally an issue. The AI has a feature where it will suggest meals using ingredients found in the apartments of various residents, and have them meet up in either one unit’s kitchen or one of the communal kitchens, depending on the size of the group. “This one guy would always show up to these big dinners that everyone had contributed to and just eat the food. Some people would bring an entire pot roast or a vegan lasagna, but he would bring a water bottle filled with tequila and margarita mix, and just offer it to people. Everyone always refused, so he’d just shrug and chug it himself while eating the food we made. Unfortunately, he probably got the same amount of points as the rest of us.” Even though the system is designed for the equitable sharing of resources, Ahimsa faces an ancient problem, with people taking more than they provide.
The most sinister, though, are those who engage in predatory behavior under the guise of building community. Alyssa’s roommate Grace told me about another resident, one of [CEO] Adam’s inner circle. “We had a big problem with this creepy guy who came to everything. He was at every event, always making uncomfortable comments and trying to get the women drunk. Some people were worried that if they told him off, they’d lose points. We complained several times, but it’s really hard to evict someone for being a creep, especially when they’re college buds with the owner of the building. We really felt like we couldn’t get away from him.” Ahimsa only took action and evicted him after he was accused of stalking and harassing one of the female residents, repeatedly entering her apartment without permission.
Concept layout of Ahimsa lobby, featuring HP leaderboard
Should I Stay or Should I Go?
I asked all of my interviewees if they planned to move out. About half of them said yes. Dominique told me that when things first started taking a turn, the residents felt like their concerns and feedback were completely discounted, and that Ahimsa never followed through on promises of transparency. However, higher-than-average turnover started to shift things, and matters apparently came to a head recently, as the first year drew to a close and residents were asked if they wanted to renew. “A lot of people told them they would not be renewing, and that scared their investors at a time they were planning to open two more locations,” Kyle said. “So now, I think they’re listening to us a little more. We’ve seen some changes recently, and I’ve heard that they’re completely revamping the HP system to be less competitive.”
The word is that HP will be switched to a form of currency that can be traded for much smaller prizes, to incentivize good behavior and create more ways for people to feel like earning points matters, especially when they aren’t near the top of the competitive pack. Positive change might be on the horizon. Ahimsa has, albeit vaguely, acknowledged they need to provide more accessible experiences and take greater steps to ensure that all residents feel safe and supported.
Despite everything he told me, Kyle is staying. “It’s weird, you know? I rag on this place a lot, but living here can just be...really interesting. There’s always something unusual happening. And in a lot of ways, I feel that I’m getting better at understanding the whole thing.” He also mentioned that his immature roommate is moving out, and there was about to be a spot open in his apartment, if I wanted it.
So, readers, what do you think? Should I move in? Leave a comment below.