techhub.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A hub primarily for passionate technologists, but everyone is welcome.

Server stats: 4.6K active users

#thealignmentproblem

Matthew Thompson:
“But who was learning, you or the machine?”
“Well, I suppose we both were.”
Amazing book 🔥🤓
#TheAlignmentProblem #Learning #ResponsibleAI
The Alignment Problem by Brian Christian 📚 https://micro.blog/books/9781786494320

jack the nonabrasive:
#TheAlignmentProblem is with the folks creating so-called “general” AI: their values are misaligned with the rest of humanity.

jack the nonabrasive:
One of the maddening things about reading #BrianChristian’s #TheAlignmentProblem is that it assumes creating systems without firm use cases (“general” AI) is inevitable and desirable. It never questions who is doing this work, why, and for whose benefit.

A book that talks in a positive way about narrowly scoped systems that multiply human potential while preserving values is needed. A kind of “Understanding Computers and Cognition” focused on today’s technology.

jack the nonabrasive:
So much for “transcendence”. #BrianChristian #TheAlignmentProblem
https://www.ft.com/content/175e5314-a7f7-4741-a786-273219f433a1

jack the nonabrasive:
Oh man, #BrianChristian’s #TheAlignmentProblem is treating #WilliamMacAskill and the whole EA/longtermist cult seriously on pp. 235-36. This is after opening shots with a #NickBostrom quote on p. 223.

Sigh.

Let’s see if he ever gets to the eugenics & racism.