What rights would you give up to end gun violence?

Indefensible.

No jokes or clever anecdotes on this one.

Just another day where people fight over how to end rampant mass shootings.

A fact, not an opinion: we cannot reduce the number of guns in the United States until at least some people are willing to give up at least some of the rights they currently enjoy.

An opinion/hypothesis: maybe it isn’t 100% fair that only gun owners should sacrifice something for a safer country. (I’m not 100% sure I agree with this myself, but it is a popular argument with some merit.)

So.

What rights would you give up — whether you’re a gun owner, a 2nd Amendment supporter, or neither of those — if it meant fewer gun fatalities? 

What good is an app that simply reminds us we’ll die someday?

[Image: phone headstone]

All those moments will be lost in time… like tweets in rain.

 

There is a constant tension between our desire to live every day like it’s our last — to maximize our impact on this world and the joy we find in it — and our tendency to do the opposite, by frittering away precious time doing mundane, pointless, unfulfilling things. Well, there’s an app for that.

“Five times a day for the past three months, an app called WeCroak has been telling me I’m going to die. It does not mince words. It surprises me at unpredictable intervals, always with the same blunt message: “Don’t forget, you’re going to die.”

As I scroll through Instagram or refresh Twitter, WeCroak interrupts with the sobering reminder that it is not just my attention these other apps are consuming, but chunks of my life.”

The simplicity is beautiful, if potentially morbid. And don’t rule out the possibility that it has the opposite effect on the more jaded among us, who find the certainty of an inevitable end a source of relief.

 

Would you get anything out of an app like this?

 

How might these reminders affect your daily behavior?

 

What other “tech” with such a clear and simple purpose do you wish existed?

How should self-driving cars handle potentially fatal accidents?

Turns out your answer depends a lot on whether you’re the car or the pedestrian.

 

Self-driving cars sound awesome. Less traffic, fewer accidents, more free mental bandwidth while commuting. But nothing is perfect, and some scientists are beginning to examine how automated cars should handle accidents:

Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?

One way to approach this kind of problem is to act in a way that minimizes the loss of life. By this way of thinking, killing one person is better than killing 10.

But that approach may have other consequences. If fewer people buy self-driving cars because they are programmed to sacrifice their owners, then more people are likely to die because ordinary cars are involved in so many more accidents. The result is a Catch-22 situation.

So you could argue abstractly all day about what’s right, and if you’re able to take yourself out of the equation, the math is what it is.
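And the math really is simple. Here’s a minimal sketch in Python of that “minimize the loss of life” rule, purely illustrative: the action names and fatality counts are assumptions lifted from the quoted scenario, not anything a real autonomous-driving system would actually have.

    # A strict "minimize loss of life" rule for the dilemma above.
    # Purely illustrative: real systems never get clean numbers like these.
    def choose_action(expected_fatalities):
        """Pick the action expected to kill the fewest people.

        expected_fatalities: dict mapping an action name to the number of
        deaths expected if the car takes that action (assumed known here,
        which it never is in practice).
        """
        return min(expected_fatalities, key=expected_fatalities.get)

    # The scenario from the quoted passage: plow into the crowd of 10,
    # or steer into the wall and kill the one occupant.
    print(choose_action({"stay_course": 10, "steer_into_wall": 1}))
    # -> steer_into_wall: the utilitarian math sacrifices the occupant.

The rule itself is trivial to write down; the hard part, as the excerpt points out, is whether anyone would buy a car that runs it.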

 

If it were up to you to decide how autonomous cars handle accidents, what would you program them to do?

 

How does your answer change if:
a) you’re the first one driving one?
b) you’re also in charge of convincing other people to buy one?
c) everyone is required to drive one (and is that worth doing)?