Sprout
I'm not sure there is a better answer though....
Wed Nov 29, 2017 1:35pm

I think the transition TO a post-scarcity economy is probably one of the most socially dangerous times...

Because from the time automation starts to the time it is TOTALLY complete, there WILL be inherent and unavoidable costs of producing and delivering goods. Yet at the same time, the means to earn the money to pay those costs will be shrinking.

Let's take the hypothetical where MOST of the supply chain is automated and the only job still done manually is driving trucks (just an example)... Everyone BUT the truck drivers is unemployed and has no money to buy the goods, even though the goods are FAIRLY cheap, because the only labor left to pay for is the drivers'...

Can we simply expect everyone with a CDL to work the same hours and days for no more compensation than the person who sits on his back porch all day and admires his pet cats? Why would those few keep working?

I know that it is popular to portray capitalism as some sort of evil, but the problem with Marxism is that everyone beavering away for nothing SOUNDS unbelievably great on paper yet rarely works in the real world. Human beings are animals, and it is only natural for animals to take the path of least resistance. If I could achieve my desired standard of living either by sitting on the porch or by driving a truck 5 days a week, I know which one I would choose.

IMO the key to TRUE automation is robots that can make robots. Once that process is FULLY automated, the rest will probably move pretty quickly, because once ROBOTS themselves cost nothing to produce, a wide range of automated supply chains will follow.

  • The Culture!Poppet, Wed Nov 29 1:26pm
    That's the best-case outcome as regards AI and automation, and one I embrace, as well. The big stumbling block, of course, is capitalism. As it stands, the benefits of these advances look likely (in... more
    • I Would Like To Think...Amadeus, Wed Nov 29 1:38pm
      ...that an ascendant AI unburdened by emotion would find the unnecessary destruction of life wasteful, and therefore wouldn't go out of its way to destroy us. Emotions only help us, I think, as I... more
      • killing us could be FAR more efficient than continuing to spend resources on feeding/sheltering/clothing/pampering billions of human beings... Yes, killing us off might require a significant... more
      • I'm undecided on that.Poppet, Wed Nov 29 3:11pm
        The possibility of AI becoming an existential threat to humanity wasn't what I was referring to, though. I was talking about the likelihood that the potentially liberating effect of increasingly... more