“We’ve got a real opportunity here,” Rio tells Beth in the present. He wants her to “keep them busy” - the agents living outside her house in a surveillance van, that is. She’s worried she’ll end up in jail, but Rio says he’s not asking her.

While out for a bike ride, Dean’s friends tell him about “The System,” a new line of men’s skin care they’re hawking. But it’s not a pyramid scheme, it’s a “multi-level opportunity.” You don’t just get a bunch of products, they say. You get a “support network.” (OK, so definitely a pyramid scheme then! Got it.) He tells them he’ll think about it.

At the club, Stan tells the dancers that Gene doesn’t respect them. Where’s their healthcare and paid sick leave? Despite the fact that he does a bang-up job riling the ladies up, Ruby tells him his attempt to turn the dancers away from Gene is half-cooked and pretty dumb. (My words, not hers.) At least the guy’s thinking!

Watching her kids fight over a toy gives Beth an idea. She tries to explain her thinking to Annie and Ruby, but neither they nor I follow what she’s selling. “So basically, we’re framing someone,” says Annie. The scene cuts before we can find out who.

Nick visits Beth to return a sweater she left at dinner. He’s curious about the deal between Beth and Rio.

I'm Saron Yitbarek and this is Command Line Heroes, an original podcast from Red Hat. All season, we've been tracking the fast-evolving field of robotics. And this time we're asking, what happens when good robots go bad? Who's responsible for their actions and who do we blame when a Paperclip Maximizer Bot 3000 decides to destroy the city? We'll come back to that disaster scenario, an interesting thought experiment by philosopher Nick Bostrom. But first we need to grapple with some immediate worries because questions about robotic responsibility are already here - and the stakes are high.

When it comes to robots, even the most innocent of intentions can go awry. They obey the letter of the law, but not the spirit. A Roomba might try to vacuum up your cat, for example. Making sure robots don't cause harm has become a crucial field of research, and figuring out who is responsible for what as robots become more a part of our lives is more difficult than you might imagine. When a machine has some measure of autonomy, like a lot of robots do, is the manufacturer responsible for its actions? Is the user? Could a robot be held responsible?

You might've noticed in this episode, there are a lot of stakeholders. You've got the manufacturer, you've got the user, you've got the robot itself, and that's the point, really. Designing a robot future that works for everybody means bringing everybody to the table. The more that robots move through our lives, the higher the stakes get. We're forced to think about who gets to offer input and who gets to help design our robotic future. And it means in addition to making manufacturers and users responsible for robots' behavior, we need to start giving robots a sense of responsibility that's all their own. Can we, in other words, give robots a sense of right and wrong instead of uploading every single example of what is considered right and wrong? It's a problem as sprawling as the field of robotics itself. But that's the next level of robot responsibility - translating our real complex, messy values and desires.

We're looking at the robot revolution that's been rolling toward us for over a century: the self-driving car. I'm Saron Yitbarek, and this is Command Line Heroes, an original podcast from Red Hat. Keep on coding.