<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Growing Children For Bostrom&#8217;s Disneyland</title>
	<atom:link href="http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/feed/" rel="self" type="application/rss+xml" />
	<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/</link>
	<description>In a mad world, all blogging is psychiatry blogging</description>
	<lastBuildDate>Fri, 24 Jul 2015 06:54:29 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.2.3</generator>
	<item>
		<title>By: &#187; Superintelligence Bayesian Investor Blog</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-127249</link>
		<dc:creator><![CDATA[&#187; Superintelligence Bayesian Investor Blog]]></dc:creator>
		<pubDate>Mon, 28 Jul 2014 20:12:17 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-127249</guid>
		<description><![CDATA[[&#8230;] the risks of competitive pressures driving out human traits (discussed more fully/verbosely at Slate Star Codex)? If WBE and AGI happen close enough together in time that we can plausibly influence which comes [&#8230;]]]></description>
		<content:encoded><![CDATA[<p>[&#8230;] the risks of competitive pressures driving out human traits (discussed more fully/verbosely at Slate Star Codex)? If WBE and AGI happen close enough together in time that we can plausibly influence which comes [&#8230;]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Data and Philosophy</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-122666</link>
		<dc:creator><![CDATA[Data and Philosophy]]></dc:creator>
		<pubDate>Wed, 16 Jul 2014 17:54:01 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-122666</guid>
		<description><![CDATA[Why do you follow Bostrom&#039;s Malthusian assumption? That to me was the biggest flaw in the piece, and I don&#039;t think it is necessarily true that engineered beings would have to try to reproduce as much as possible.]]></description>
		<content:encoded><![CDATA[<p>Why do you follow Bostrom&#8217;s Malthusian assumption? That to me was the biggest flaw in the piece, and I don&#8217;t think it is necessarily true that engineered beings would have to try to reproduce as much as possible.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Paul Torek</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-122082</link>
		<dc:creator><![CDATA[Paul Torek]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 16:59:43 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-122082</guid>
		<description><![CDATA[Alternative suggestion: I found Jenann Ismael&#039;s papers (e.g. &lt;a href=&quot;http://www.jenanni.com/papers/Causation,%20Free%20Will,%20and%20Naturalism.pdf&quot; rel=&quot;nofollow&quot;&gt;here&lt;/a&gt; and &lt;a href=&quot;http://www.jenanni.com/papers/Decision%20and%20the%20Open%20Future.pdf&quot; rel=&quot;nofollow&quot;&gt;here&lt;/a&gt;) to contain much of the same wonderful stuff (and more) as the presentation I linked to above, without the annoying video jitter that one might get with typical internet connections.]]></description>
		<content:encoded><![CDATA[<p>Alternative suggestion: I found Jenann Ismael&#8217;s papers (e.g. <a href="http://www.jenanni.com/papers/Causation,%20Free%20Will,%20and%20Naturalism.pdf" rel="nofollow">here</a> and <a href="http://www.jenanni.com/papers/Decision%20and%20the%20Open%20Future.pdf" rel="nofollow">here</a>) to contain much of the same wonderful stuff (and more) as the presentation I linked to above, without the annoying video jitter that one might get with typical internet connections.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Noah</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-122075</link>
		<dc:creator><![CDATA[Noah]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 16:48:44 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-122075</guid>
		<description><![CDATA[Agreed - the obvious response to this all-against-all scenario...
&lt;blockquote&gt;Any agent that doesn’t always take the path that maximizes its utility (defined in objective economic terms) will be outcompeted by another that does.&lt;/blockquote&gt;
...is for the agents to unionize.  If we have superintelligent agents, surely coordination will sometimes be economically superior to all-against-all competition.  If the advantages of coordination are sufficiently large, then the society of agents could adopt a futuristic communism/anarchism where economic pressures face the society as a collective whole but individual members of the society are relatively free from economic pressure.

&lt;blockquote&gt;it’s worth asking how doomed we are when we come to this point. Likely we are pretty doomed, but I want to bring up a very faint glimmer of hope in an unexpected place.&lt;/blockquote&gt;

Incidentally, this lends some intuitive plausibility to primitivism.  If humanity were almost certainly doomed by following a technological path, then conceivably our long-term values could be best served by destroying technological civilization and any capacity to rebuild technological civilization.

&lt;blockquote&gt;It would just be a really weird volume of space that seemed to follow different rules than our own.&lt;/blockquote&gt;

In any case, I don&#039;t buy that this scenario makes sense.  The goo would need a power source, and as soon as the power source was unavailable, it would return to normal physics.]]></description>
		<content:encoded><![CDATA[<p>Agreed &#8211; the obvious response to this all-against-all scenario&#8230;</p>
<blockquote><p>Any agent that doesn’t always take the path that maximizes its utility (defined in objective economic terms) will be outcompeted by another that does.</p></blockquote>
<p>&#8230;is for the agents to unionize.  If we have superintelligent agents, surely coordination will sometimes be economically superior to all-against-all competition.  If the advantages of coordination are sufficiently large, then the society of agents could adopt a futuristic communism/anarchism where economic pressures face the society as a collective whole but individual members of the society are relatively free from economic pressure.</p>
<blockquote><p>it’s worth asking how doomed we are when we come to this point. Likely we are pretty doomed, but I want to bring up a very faint glimmer of hope in an unexpected place.</p></blockquote>
<p>Incidentally, this lends some intuitive plausibility to primitivism.  If humanity were almost certainly doomed by following a technological path, then conceivably our long-term values could be best served by destroying technological civilization and any capacity to rebuild technological civilization.</p>
<blockquote><p>It would just be a really weird volume of space that seemed to follow different rules than our own.</p></blockquote>
<p>In any case, I don&#8217;t buy that this scenario makes sense.  The goo would need a power source, and as soon as the power source was unavailable, it would return to normal physics.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: suntzuanime</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-121911</link>
		<dc:creator><![CDATA[suntzuanime]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 08:44:21 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-121911</guid>
		<description><![CDATA[I agree that it&#039;s a conflict of interest - confiscating your savings as punishment is also a conflict of interest. But if the conflict of interest is a problem, it&#039;s because it causes people to be punished for illegitimate crimes, not because the method of punishment is brutal.]]></description>
		<content:encoded><![CDATA[<p>I agree that it&#8217;s a conflict of interest &#8211; confiscating your savings as punishment is also a conflict of interest. But if the conflict of interest is a problem, it&#8217;s because it causes people to be punished for illegitimate crimes, not because the method of punishment is brutal.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Alrenous</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-121848</link>
		<dc:creator><![CDATA[Alrenous]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 05:34:27 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-121848</guid>
		<description><![CDATA[Consciousness is ontological subjectivity. For example, this means it really is inherently private. Sharing a thought with someone directly means being that person directly. As another example, you can&#039;t be mistaken about first-order mental entities. That you think you&#039;re seeing blue is what causes you to be perceiving blue.]]></description>
		<content:encoded><![CDATA[<p>Consciousness is ontological subjectivity. For example, this means it really is inherently private. Sharing a thought with someone directly means being that person directly. As another example, you can&#8217;t be mistaken about first-order mental entities. That you think you&#8217;re seeing blue is what causes you to be perceiving blue.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Ken Arromdee</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-121832</link>
		<dc:creator><![CDATA[Ken Arromdee]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 04:56:11 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-121832</guid>
		<description><![CDATA[suntzu: It&#039;s a conflict of interest to both punish prisoners and benefit from the punishment in ways unrelated to discouraging crime.]]></description>
		<content:encoded><![CDATA[<p>suntzu: It&#8217;s a conflict of interest to both punish prisoners and benefit from the punishment in ways unrelated to discouraging crime.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: anon1</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-121792</link>
		<dc:creator><![CDATA[anon1]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 03:34:09 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-121792</guid>
		<description><![CDATA[It&#039;s a problem because making prisoners profitable creates an incentive to punish people for illegitimate crimes.]]></description>
		<content:encoded><![CDATA[<p>It&#8217;s a problem because making prisoners profitable creates an incentive to punish people for illegitimate crimes.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Thomas Eliot</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-121776</link>
		<dc:creator><![CDATA[Thomas Eliot]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 03:13:08 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-121776</guid>
		<description><![CDATA[That is in fact what Eldritch said, yes.]]></description>
		<content:encoded><![CDATA[<p>That is in fact what Eldritch said, yes.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Scott Alexander</title>
		<link>http://slatestarcodex.com/2014/07/13/growing-children-for-bostroms-disneyland/#comment-121767</link>
		<dc:creator><![CDATA[Scott Alexander]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 02:57:48 +0000</pubDate>
		<guid isPermaLink="false">http://slatestarcodex.com/?p=2408#comment-121767</guid>
		<description><![CDATA[Very degenerate self-replicating patterns do arise in current economic activity - for example, the weird feedback loops among stock-trading robots that occasionally shut down a market or two. Make the economic activity a zillion times more complex, and maybe we&#039;ll get better ones.

&quot;Also how would an economy like that be stable for so long? Eventually the best agent would outcompete all the others, or have enough resources to be self-sufficient and do whatever it wants without interacting with the others.&quot;

Why doesn&#039;t this happen to biological life forms? Insects have been competing against each other for millions of years, and while there was some driving-to-extinction, mostly they just stayed in their own niches and formed a nice stable system?

(this question does bother me, but given that it happened, there must be some answer)]]></description>
		<content:encoded><![CDATA[<p>Very degenerate self-replicating patterns do arise in current economic activity &#8211; for example, the weird feedback loops among stock-trading robots that occasionally shut down a market or two. Make the economic activity a zillion times more complex, and maybe we&#8217;ll get better ones.</p>
<p>&#8220;Also how would an economy like that be stable for so long? Eventually the best agent would outcompete all the others, or have enough resources to be self-sufficient and do whatever it wants without interacting with the others.&#8221;</p>
<p>Why doesn&#8217;t this happen to biological life forms? Insects have been competing against each other for millions of years, and while there was some driving-to-extinction, mostly they just stayed in their own niches and formed a nice stable system?</p>
<p>(this question does bother me, but given that it happened, there must be some answer)</p>
]]></content:encoded>
	</item>
</channel>
</rss>
