#Sparsity

JMLR: 'A minimax optimal approach to high-dimensional double sparse linear regression', by Yanhang Zhang, Zhifan Li, Shixiang Liu, Jianxin Yin. http://jmlr.org/papers/v25/23-0653.html #sparse #thresholding #sparsity

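The #thresholding tag points at the primitive such estimators share: keeping only the largest coefficients. A minimal NumPy sketch of plain iterative hard thresholding (IHT) for sparse least squares, the textbook baseline, not the paper's double-sparse estimator:

```python
import numpy as np

def hard_threshold(beta, k):
    """Keep the k largest-magnitude entries of beta; zero out the rest."""
    out = np.zeros_like(beta)
    idx = np.argsort(np.abs(beta))[-k:]
    out[idx] = beta[idx]
    return out

def iht(X, y, k, n_iter=200):
    """Iterative hard thresholding for min ||y - X b||^2 s.t. ||b||_0 <= k."""
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # conservative step size
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)          # gradient of the squared loss
        beta = hard_threshold(beta - step * grad, k)
    return beta

# Toy example: recover a 5-sparse signal from noisy measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 400))
beta_true = np.zeros(400)
beta_true[:5] = 3.0
y = X @ beta_true + 0.1 * rng.standard_normal(100)
print(np.nonzero(iht(X, y, k=5))[0])  # should report indices 0..4
```
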
JMLR: 'skscope: Fast Sparsity-Constrained Optimization in Python', by Zezhi Wang, Junxian Zhu, Xueqin Wang, Jin Zhu, Huiyang Peng, Peng Chen, Anran Wang, Xiaoke Zhang. http://jmlr.org/papers/v25/23-1574.html #sparse #optimization #sparsity

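A rough usage sketch, assuming the interface described in skscope's documentation (a `ScopeSolver(dimensionality, sparsity)` class whose `solve` accepts a JAX-differentiable loss); verify against the project docs before relying on it:

```python
# Assumed interface: ScopeSolver(dimensionality=p, sparsity=k) and
# solver.solve(loss), with the loss differentiated via JAX. pip install skscope
import jax.numpy as jnp
import numpy as np
from skscope import ScopeSolver

rng = np.random.default_rng(0)
n, p, k = 100, 20, 3
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:k] = 1.0
y = X @ beta_true

def loss(params):
    # Least-squares objective; skscope handles the sparsity constraint.
    return jnp.sum((y - X @ params) ** 2)

solver = ScopeSolver(dimensionality=p, sparsity=k)
beta_hat = solver.solve(loss)
print(np.nonzero(beta_hat)[0])  # ideally the first three indices
```
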
Low Rank Jack: Let's start designing a new course for applied mathematics students in #UCLouvain #EPL on high-dimensional data analysis, with 3 wonderful reference books. #inverseproblem #highDimensional #statistics #optimization #Sparsity #teaching

Javed A. Butt: #mistral's 8x22B is ~260GB. The trend is to get models smaller, not bigger: #pruning, #sparsity, #quantization, #distillation. So why such a huge model? Does Mistral have no other models?

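Of the compression techniques named here, magnitude pruning is the simplest: zero out the weights with the smallest absolute values. A minimal NumPy sketch, purely illustrative and unrelated to Mistral's actual pipeline:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the `sparsity` fraction of weights with smallest magnitude."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
W_pruned, mask = magnitude_prune(W, sparsity=0.75)
print(f"{mask.mean():.0%} of weights kept")  # ~25%
```
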
Laurent Duval: Yasuhisa Kuroda released a spectral data processing program for chemical analysis called SPANA: https://www.eonet.ne.jp/~spana-lsq/index-e.html. He has been kind enough to incorporate our BEADS algorithm (baseline estimation & denoising w/ #sparsity) to separate peaks, baseline and noise using sparsity priors! https://doi.org/10.1016/j.chemolab.2014.09.014 #analyticalchemistry

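BEADS itself is more involved (asymmetric penalties plus a filter-based baseline model), but the sparsity prior at its core builds on the familiar l1 soft-thresholding step. A toy sketch of that step on a sparse peak signal, with no baseline term, so not BEADS itself:

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: shrink each entry toward zero by lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Toy example: sparse peaks buried in noise.
rng = np.random.default_rng(1)
signal = np.zeros(200)
signal[[40, 90, 150]] = [5.0, 3.0, 4.0]
noisy = signal + 0.3 * rng.standard_normal(200)
denoised = soft_threshold(noisy, lam=1.0)
print(np.nonzero(denoised)[0])  # mostly the true peak locations
```
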
Ryan Daws 🤓: Damian Bogunowicz, Neural Magic: On revolutionising deep learning with CPUs. https://www.artificialintelligence-news.com/2023/07/24/damian-bogunowicz-neural-magic-revolutionising-deep-learning-cpus/ #ai #sparsity #llm #tech #technology

New Submissions to TMLR: 'Revisiting Sparsity Hunting in Federated Learning: Why the Sparsity Consensus Matters?' https://openreview.net/forum?id=iHyhdpsnyi #sparse #sparsity #distributed

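"Sparsity consensus" here refers, as far as the title suggests, to federated clients agreeing on which parameters stay nonzero. One natural rule is majority voting over client masks; the sketch below is my illustration of that idea, not the paper's algorithm:

```python
import numpy as np

def consensus_mask(client_masks, quorum=0.5):
    """Keep a parameter iff at least `quorum` fraction of clients keep it."""
    votes = np.mean(client_masks, axis=0)  # fraction of clients keeping each weight
    return votes >= quorum

rng = np.random.default_rng(2)
# 5 clients, 10 parameters, each client keeps roughly half its weights.
client_masks = rng.random((5, 10)) < 0.5
print(consensus_mask(client_masks).astype(int))
```
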
JMLR: 'Fundamental limits and algorithms for sparse linear regression with sublinear sparsity', by Lan V. Truong. http://jmlr.org/papers/v24/21-0543.html #sparse #sparsity #interpolation

marco: New podcast from @thegradient with Hattie Zhou (twitter: https://twitter.com/oh_that_hat): 'Lottery Tickets and Algorithmic Reasoning in LLMs'. https://thegradientpub.substack.com/p/hattie-zhou-lottery-tickets-and-algorithmic The first half is focused on the lottery ticket hypothesis, which is a favorite topic of mine. #ML #sparsity #LLMs

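The lottery ticket procedure the episode discusses is easy to state in code: train, prune the smallest weights, rewind the survivors to their initial values, retrain. A schematic sketch; the `train` callable is a hypothetical stand-in for a full training loop, not Zhou's experimental setup:

```python
import numpy as np

def find_lottery_ticket(w_init, train, prune_frac=0.8):
    """One round of iterative magnitude pruning with weight rewinding.

    w_init: initial weights. train: hypothetical callable mapping weights
    to trained weights, standing in for a full training loop.
    """
    w_trained = train(w_init)                      # 1. train to convergence
    threshold = np.quantile(np.abs(w_trained), prune_frac)
    mask = np.abs(w_trained) >= threshold          # 2. prune smallest weights
    w_ticket = w_init * mask                       # 3. rewind survivors to init
    return train(w_ticket) * mask, mask            # 4. retrain the sparse subnet

# Dummy usage with an identity "trainer" just to exercise the function:
w0 = np.random.default_rng(3).standard_normal(100)
w_sparse, mask = find_lottery_ticket(w0, train=lambda w: w, prune_frac=0.8)
print(f"{mask.mean():.0%} of weights survive")     # ~20%
```
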
Tech News Worldwide: Inside APFS: from containers to clones. https://eclecticlight.co/2023/01/02/inside-apfs-from-containers-to-clones/ #GUIDPartitionTable #Technology #filesystem #sparsefile #container #partition #Sparsity #clone #Macs #HFS+ #GPT

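The "sparse file" tag refers to a file whose unwritten ranges occupy no disk blocks. On APFS, as on most Unix filesystems, you can make one by seeking past the end and writing a single byte, as in this small sketch (exact allocation behavior varies by filesystem):

```python
import os

# Create a 1 GiB logical file that allocates almost no physical space:
with open("sparse.bin", "wb") as f:
    f.seek(1024 ** 3 - 1)  # jump ~1 GiB forward; the gap becomes a "hole"
    f.write(b"\0")         # one real byte at the end

st = os.stat("sparse.bin")
print(f"logical size: {st.st_size} bytes")
print(f"blocks allocated: {st.st_blocks} x 512 bytes")  # far less than 1 GiB
```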