#embeddings

Mathieu Jacomy<p>Ah, my latest tool, just out of the oven! Just in time for my Summer break... It's called *Vandolie*. It's for high school students, but it may work for you as well. I will let you discover it by yourself.</p><p>👉 <a href="https://jacomyma.github.io/vandolie/en/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">jacomyma.github.io/vandolie/en/</span><span class="invisible"></span></a></p><p>It's like a mini CorTexT for teenagers, if you know that tool. But it runs entirely in the browser.</p><p>Entirely localized in Danish.</p><p>Consider it a beta version. Usable, but feel free to file GitHub issues for feedback &amp; bugs.</p><p><a href="https://mas.to/tags/CSSH" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CSSH</span></a> <a href="https://mas.to/tags/DistantReading" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DistantReading</span></a> <a href="https://mas.to/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a></p>
Christian Drumm 🇪🇺🧗🚵<p>Playing with <span class="h-card" translate="no"><a href="https://mastodon.social/@duckdb" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>duckdb</span></a></span>, <a href="https://mastodon.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> and <a href="https://mastodon.social/tags/skiplists" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>skiplists</span></a>. A good exercise to understand how <a href="https://mastodon.social/tags/RAG" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>RAG</span></a> works under the hood.</p>
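The "under the hood" part of a RAG pipeline boils down to: embed the documents, embed the query, rank by similarity. A minimal sketch, using a toy bag-of-words counter as a stand-in for a real embedding model (no DuckDB involved; purely illustrative):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: a bag-of-words count vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank all documents by similarity to the query, return the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "ducks swim in the pond",
    "duckdb is an in-process analytical database",
    "skip lists make ordered lookups fast",
]
print(retrieve("what is duckdb", docs, k=1))
```

A real pipeline swaps `embed` for a model and `retrieve` for a vector index, then feeds the top-k documents to the LLM as context.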
gary<p>with ai now you set up a box and run deepseek local - 98% of frontier models 20x cheaper <a href="https://infosec.exchange/tags/token" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>token</span></a> gen <a href="https://infosec.exchange/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://infosec.exchange/tags/rag" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>rag</span></a> <a href="https://infosec.exchange/tags/license" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>license</span></a> <a href="https://infosec.exchange/tags/innovation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>innovation</span></a> <a href="https://infosec.exchange/tags/open" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>open</span></a> weights <a href="https://infosec.exchange/tags/comm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>comm</span></a> use ok</p>
gary<p><span class="h-card" translate="no"><a href="https://mastodon.social/@nickbearded" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>nickbearded</span></a></span> have to build it, there are a few layers here and there, kind of started out as generic portal quest...we will see if i can get anything built - i may have to just sell clusters first and then hire a coding genius or two. I do not think this or the 90 pt plan are particularly unique but just have to try and follow through - get a working prototype for the rag pipelines - make it a cohesive dashboard <a href="https://infosec.exchange/tags/osint" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>osint</span></a> <a href="https://infosec.exchange/tags/comp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>comp</span></a> intel <a href="https://infosec.exchange/tags/realtime" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>realtime</span></a> info <a href="https://infosec.exchange/tags/portal" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>portal</span></a> <a href="https://infosec.exchange/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a>/token/s <a href="https://infosec.exchange/tags/federated" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>federated</span></a> distributed inference <a href="https://infosec.exchange/tags/semantic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semantic</span></a> search</p>
➴➴➴Æ🜔Ɲ.Ƈꭚ⍴𝔥єɼ👩🏻‍💻<p>Okay, Back of the napkin math:<br> - There are probably 100 million sites and 1.5 billion pages worth indexing in a <a href="https://lgbtqia.space/tags/search" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>search</span></a> engine<br> - It takes about 1TB to <a href="https://lgbtqia.space/tags/index" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>index</span></a> 30 million pages.<br> - We only care about text on a page.</p><p>I define a page as worth indexing if:<br> - It is not a FAANG site<br> - It has at least one referrer (no DD Web)<br> - It's active</p><p>So, this means we need 40TB of fast data to make a good index for the internet. That's not "runs locally" sized, but it is nonprofit sized.</p><p>My size assumptions are basically as follows:<br> - <a href="https://lgbtqia.space/tags/URL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>URL</span></a><br> - <a href="https://lgbtqia.space/tags/TFIDF" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TFIDF</span></a> information<br> - Text <a href="https://lgbtqia.space/tags/Embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Embeddings</span></a><br> - Snippet </p><p>We can store a page's index entry in about 30 kB. So, in 40TB we can store a full index of the internet. That's about $500 in storage.</p><p>Access time becomes a problem. TFIDF for the whole internet can easily fit in ram. Even with <a href="https://lgbtqia.space/tags/quantized" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>quantized</span></a> embeddings, you can only fit 2 million per GB in ram. 
</p><p>Assuming you had enough RAM it could be fast: TF-IDF to get 100 million candidates, <a href="https://lgbtqia.space/tags/FAISS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FAISS</span></a> to sort those, load snippets dynamically, potentially modify rank by referrers etc.</p><p>Six 128 GB <a href="https://lgbtqia.space/tags/Framework" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Framework</span></a> <a href="https://lgbtqia.space/tags/desktops" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>desktops</span></a>, each with 5 TB HDs (plus one Raspberry Pi to sort the final candidates from the six machines), is enough to replace <a href="https://lgbtqia.space/tags/Google" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Google</span></a>. That's about $15k. </p><p>In two to three years this will be doable on a single machine for around $3k.</p><p>By the end of the decade it should be able to be run as an app on a powerful desktop.</p><p>Three years after that it can run on a <a href="https://lgbtqia.space/tags/laptop" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>laptop</span></a>.</p><p>Three years after that it can run on a <a href="https://lgbtqia.space/tags/cellphone" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>cellphone</span></a>.</p><p>By #2040 it's a background process on your cellphone.</p>
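The napkin math above checks out roughly; re-running the post's own assumed figures (30 kB per page, 1.5 billion pages, 2 million quantized embeddings per GB of RAM):

```python
# All constants below are the post's assumptions, not measured figures.
PAGES = 1_500_000_000          # pages worth indexing
BYTES_PER_PAGE = 30_000        # URL + TF-IDF stats + embedding + snippet

total_tb = PAGES * BYTES_PER_PAGE / 1e12
print(f"index size: {total_tb:.0f} TB")        # ~45 TB, i.e. the post's ~40 TB ballpark

# 2 million quantized embeddings per GB implies ~500 bytes per embedding.
per_embedding = 1e9 / 2_000_000
print(f"bytes per quantized embedding: {per_embedding:.0f}")

# Holding all 1.5B embeddings in RAM at that density:
ram_gb = PAGES / 2_000_000
print(f"RAM for all embeddings: {ram_gb:.0f} GB")  # 750 GB -> six 128 GB machines
```

The 750 GB RAM figure is what motivates the six-machine cluster: 750 / 6 ≈ 125 GB per box, just under 128 GB each.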
Nebraska.Code<p>Adam Barney is 'Demystifying LLMs: How They Work and Why It Matters' July 24th at Nebraska.Code().</p><p><a href="https://nebraskacode.amegala.com/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">nebraskacode.amegala.com/</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/Travefy" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Travefy</span></a> <a href="https://mastodon.social/tags/LargeLanguageModels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LargeLanguageModels</span></a> <a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.social/tags/languagemodels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>languagemodels</span></a> <a href="https://mastodon.social/tags/TechTalk" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechTalk</span></a> <a href="https://mastodon.social/tags/tokenization" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>tokenization</span></a> <a href="https://mastodon.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://mastodon.social/tags/attentionmechanisms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>attentionmechanisms</span></a> <a href="https://mastodon.social/tags/finetuning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>finetuning</span></a> <a href="https://mastodon.social/tags/engineering" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>engineering</span></a> <a href="https://mastodon.social/tags/softwareengineering" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>softwareengineering</span></a> <a 
href="https://mastodon.social/tags/softwareengineer" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>softwareengineer</span></a> <a href="https://mastodon.social/tags/Nebraska" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Nebraska</span></a> <a href="https://mastodon.social/tags/TechnologyConference" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechnologyConference</span></a> <a href="https://mastodon.social/tags/lincolnnebraska" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lincolnnebraska</span></a> <a href="https://mastodon.social/tags/devconference" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>devconference</span></a></p>
N-gated Hacker News<p>🚨 BREAKING NEWS: <a href="https://mastodon.social/tags/Embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Embeddings</span></a>, that ancient artifact from the dusty corners of ML, are apparently the unsung heroes of technical writing! 🤯 Who knew that connecting texts like a glorified game of connect-the-dots could revolutionize your mundane documentation? 💡 Thank you, Captain Obvious, for this revelation! 😂<br><a href="https://technicalwriting.dev/ml/embeddings/overview.html" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">technicalwriting.dev/ml/embedd</span><span class="invisible">ings/overview.html</span></a> <a href="https://mastodon.social/tags/TechnicalWriting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechnicalWriting</span></a> <a href="https://mastodon.social/tags/MLRevolution" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MLRevolution</span></a> <a href="https://mastodon.social/tags/ConnectTheDots" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ConnectTheDots</span></a> <a href="https://mastodon.social/tags/DocumentationInsights" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DocumentationInsights</span></a> <a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/ngated" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ngated</span></a></p>
Hacker News<p>Embeddings Are Underrated</p><p><a href="https://technicalwriting.dev/ml/embeddings/overview.html" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">technicalwriting.dev/ml/embedd</span><span class="invisible">ings/overview.html</span></a></p><p><a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/Embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Embeddings</span></a> <a href="https://mastodon.social/tags/Underrated" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Underrated</span></a> <a href="https://mastodon.social/tags/MachineLearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MachineLearning</span></a> <a href="https://mastodon.social/tags/DataScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DataScience</span></a> <a href="https://mastodon.social/tags/AIInsights" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIInsights</span></a></p>
Markus Eisele<p>From Strings to Semantics: Comparing Text with Java, Quarkus, and Embeddings<br>Learn how to build an AI-powered text similarity service using Quarkus, LangChain4j, and local embedding models. <br><a href="https://myfear.substack.com/p/java-quarkus-text-embeddings-similarity" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">myfear.substack.com/p/java-qua</span><span class="invisible">rkus-text-embeddings-similarity</span></a><br><a href="https://mastodon.online/tags/Java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Java</span></a> <a href="https://mastodon.online/tags/Quarkus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Quarkus</span></a> <a href="https://mastodon.online/tags/Embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Embeddings</span></a> <a href="https://mastodon.online/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.online/tags/LangChain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LangChain4j</span></a></p>
JMLR<p>'Variance-Aware Estimation of Kernel Mean Embedding', by Geoffrey Wolfer, Pierre Alquier.</p><p><a href="http://jmlr.org/papers/v26/23-0161.html" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">http://</span><span class="ellipsis">jmlr.org/papers/v26/23-0161.ht</span><span class="invisible">ml</span></a> <br> <br><a href="https://sigmoid.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://sigmoid.social/tags/embedding" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embedding</span></a> <a href="https://sigmoid.social/tags/empirical" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>empirical</span></a></p>
Hacker News<p>HNSW index for vector embeddings in approx 500 LOC</p><p><a href="https://github.com/dicroce/hnsw" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">github.com/dicroce/hnsw</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/HNSW" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HNSW</span></a> <a href="https://mastodon.social/tags/vector" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>vector</span></a> <a href="https://mastodon.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://mastodon.social/tags/machinelearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>machinelearning</span></a> <a href="https://mastodon.social/tags/GitHub" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GitHub</span></a> <a href="https://mastodon.social/tags/500LOC" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>500LOC</span></a></p>
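The heart of HNSW is greedy nearest-neighbour search over a proximity graph (the full algorithm stacks a hierarchy of such layers). A single-layer sketch of that walk, unrelated to the linked repo's actual code:

```python
import math

# A tiny hand-built proximity graph: node id -> coordinates, and id -> neighbour ids.
points = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (2.0, 0.0), 3: (2.0, 1.0), 4: (3.0, 1.0)}
graph  = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}

def greedy_search(query, entry=0):
    """Walk to whichever neighbour is closest to the query; stop when no
    neighbour improves on the current node (a local minimum of distance)."""
    current = entry
    while True:
        best = min(graph[current], key=lambda n: math.dist(points[n], query))
        if math.dist(points[best], query) < math.dist(points[current], query):
            current = best
        else:
            return current

print(greedy_search((2.9, 0.9)))  # walks 0 -> 1 -> 2 -> 3 -> 4
```

HNSW's upper layers exist to give this greedy walk long-range shortcuts, so the number of hops stays logarithmic in the dataset size.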
MSvana<p>Big update to my Embeddings Playground. I added support for the first free-to-use embedding model: "all-MiniLM-L6-v2" from Sentence transformers (<a href="https://www.sbert.net/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">sbert.net/</span><span class="invisible"></span></a>).</p><p>Try the Embeddings playground here: <a href="https://embeddings.svana.name" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">embeddings.svana.name</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://mastodon.social/tags/ml" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ml</span></a> <a href="https://mastodon.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://mastodon.social/tags/programming" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>programming</span></a></p>
^.^<p>Nomic Embed Code. <a href="https://mastodon.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> specifically for code from Nomic.</p><p><a href="https://www.nomic.ai/blog/posts/introducing-state-of-the-art-nomic-embed-code" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">nomic.ai/blog/posts/introducin</span><span class="invisible">g-state-of-the-art-nomic-embed-code</span></a></p>
FIZ ISE Research Group<p>We are very happy that our colleague <span class="h-card" translate="no"><a href="https://sigmoid.social/@GenAsefa" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>GenAsefa</span></a></span> has contributed the chapter on "Neurosymbolic Methods for Dynamic Knowledge Graphs" for the newly published Handbook on Neurosymbolic AI and Knowledge Graphs together with Mehwish Alam and Pierre-Henri Paris.</p><p>Handbook: <a href="https://ebooks.iospress.nl/doi/10.3233/FAIA400" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">ebooks.iospress.nl/doi/10.3233</span><span class="invisible">/FAIA400</span></a><br>our own chapter on arXiv: <a href="https://arxiv.org/abs/2409.04572" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/2409.04572</span><span class="invisible"></span></a></p><p><a href="https://sigmoid.social/tags/neurosymbolicAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neurosymbolicAI</span></a> <a href="https://sigmoid.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://sigmoid.social/tags/generativeAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>generativeAI</span></a> <a href="https://sigmoid.social/tags/LLMs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLMs</span></a> <a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a 
href="https://sigmoid.social/tags/graphembeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>graphembeddings</span></a></p>
Judith van Stegeren<p>Should you use OpenAI (or other closed-source) embeddings?</p><p>1. Try the lightest embedding model first<br>2. If it doesn’t work, try a beefier model and do a blind comparison<br>3. If you are already using a relatively large model, only then run a blind test against a proprietary model. If you really find that the closed-source model is better for your application, then go for it.</p><p>Paraphrased from <a href="https://iamnotarobot.substack.com/p/should-you-use-openais-embeddings" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">iamnotarobot.substack.com/p/sh</span><span class="invisible">ould-you-use-openais-embeddings</span></a></p><p><a href="https://fosstodon.org/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://fosstodon.org/tags/genai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>genai</span></a> <a href="https://fosstodon.org/tags/openai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>openai</span></a> <a href="https://fosstodon.org/tags/ada" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ada</span></a></p>
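The "blind comparison" in steps 2 and 3 can be as simple as hiding which model produced which output until after judging. A sketch with stubbed-out models (the stub names and outputs are hypothetical, standing in for a light vs. a beefy embedding pipeline):

```python
import random

def blind_pairs(queries, model_a, model_b, seed=0):
    """For each query, yield the two models' outputs in a shuffled order,
    so the judge cannot tell which output came from which model."""
    rng = random.Random(seed)
    for q in queries:
        pair = [("A", model_a(q)), ("B", model_b(q))]
        rng.shuffle(pair)              # hide the provenance
        labels, outputs = zip(*pair)
        yield q, list(outputs), list(labels)

# Stub "models": in practice these would be two retrieval pipelines.
light = lambda q: f"light::{q}"
beefy = lambda q: f"beefy::{q}"

for query, outputs, labels in blind_pairs(["best laptop"], light, beefy):
    # A judge scores outputs[0] vs outputs[1]; labels are revealed afterwards.
    print(query, outputs, labels)
```

Recording the judge's picks and unblinding afterwards tells you whether the bigger (or proprietary) model actually wins on your data.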
FIZ ISE Research Group<p>Poster from our colleague <span class="h-card" translate="no"><a href="https://blog.epoz.org/" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>epoz</span></a></span> from UGent-IMEC Linked Data &amp; Solid course. "Exploding Mittens - Getting to grips with huge SKOS datasets" on semantic embeddings enhanced SPARQL queries for ICONCLASS data.<br>Congrats for the 'best poster' award ;-) </p><p>poster: <a href="https://zenodo.org/records/14887544" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">zenodo.org/records/14887544</span><span class="invisible"></span></a><br>iconclass on GitHub: <a href="https://github.com/iconclass" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">github.com/iconclass</span><span class="invisible"></span></a></p><p><a href="https://sigmoid.social/tags/rdf2vec" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>rdf2vec</span></a> <a href="https://sigmoid.social/tags/bert" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>bert</span></a> <a href="https://sigmoid.social/tags/llm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llm</span></a> <a href="https://sigmoid.social/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://sigmoid.social/tags/iconclass" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>iconclass</span></a> <a href="https://sigmoid.social/tags/semanticweb" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>semanticweb</span></a> <a href="https://sigmoid.social/tags/lod" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lod</span></a> <a href="https://sigmoid.social/tags/linkeddata" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>linkeddata</span></a> <a href="https://sigmoid.social/tags/knowledgegraphs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>knowledgegraphs</span></a> <a href="https://sigmoid.social/tags/dh" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>dh</span></a> <span class="h-card" translate="no"><a href="https://nfdi.social/@nfdi4culture" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>nfdi4culture</span></a></span> <span class="h-card" translate="no"><a href="https://wisskomm.social/@fiz_karlsruhe" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>fiz_karlsruhe</span></a></span> <a href="https://sigmoid.social/tags/iconclass" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>iconclass</span></a></p>
lqdev<p>Nomic Embed Text V2: An Open Source, Multilingual, Mixture-of-Experts Embedding Model <a href="https://toot.lqdev.tech/tags/nomic" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>nomic</span></a> <a href="https://toot.lqdev.tech/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://toot.lqdev.tech/tags/embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>embeddings</span></a> <a href="https://www.luisquintanilla.me/feed/nomic-embed-text-v2?utm_medium=feed" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">luisquintanilla.me/feed/nomic-</span><span class="invisible">embed-text-v2?utm_medium=feed</span></a></p>
PyData Madrid<p>We have a date on February 20 🔥 See you at BBVA AI Factory to talk about embeddings for financial contracting and about graph neural networks. We're trying out <a class="mention" href="https://bsky.app/profile/guild.host" rel="nofollow noopener" target="_blank">@guild.host</a>, reserve your spot here! 👇 <a href="https://guild.host/events/embeddings-para-contratacin-306046050" rel="nofollow noopener" target="_blank">guild.host/events/embed...</a> <a class="hashtag" href="https://bsky.app/search?q=%23PyData" rel="nofollow noopener" target="_blank">#PyData</a> <a class="hashtag" href="https://bsky.app/search?q=%23PyDataMadrid" rel="nofollow noopener" target="_blank">#PyDataMadrid</a> <a class="hashtag" href="https://bsky.app/search?q=%23python" rel="nofollow noopener" target="_blank">#python</a> <a class="hashtag" href="https://bsky.app/search?q=%23embeddings" rel="nofollow noopener" target="_blank">#embeddings</a> <a class="hashtag" href="https://bsky.app/search?q=%23GraphNeuralNetworks" rel="nofollow noopener" target="_blank">#GraphNeuralNetworks</a><br><br><a href="https://guild.host/events/embeddings-para-contratacin-306046050" rel="nofollow noopener" target="_blank">📄 Embeddings para contratación...</a></p>
Raphiki<p>🚀 New Video Alert! <br>Explore advanced image generation with Stable Diffusion in our latest "GenAI's Lamp" tutorial. Learn how to use <a href="https://mastodon.social/tags/Embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Embeddings</span></a> and <a href="https://mastodon.social/tags/LoRAs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LoRAs</span></a> to create stunning visuals. <br>Watch now! 🎨<br>✨ <a href="https://youtu.be/mZ6eVw8-MM8" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">youtu.be/mZ6eVw8-MM8</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/StableDiffusion" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>StableDiffusion</span></a> <a href="https://mastodon.social/tags/ComfyUI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ComfyUI</span></a> <a href="https://mastodon.social/tags/TechAtWorldline" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechAtWorldline</span></a></p>
Microsoft DevBlogs<p>Leveraging embedding models for deep understanding was another breakthrough moment. OpenAI’s text-embedding-3 allowed us to grasp the nuances of user queries, going beyond mere keyword matches. This capability was instrumental in returning more relevant results and understanding user intent better. <a href="https://dotnet.social/tags/OpenAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenAI</span></a> <a href="https://dotnet.social/tags/Embeddings" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Embeddings</span></a></p>
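The "beyond mere keyword matches" point can be shown with a contrived example: hand-assigned 2-d topic vectors (purely illustrative, not the output of text-embedding-3 or any real model) let a query with zero word overlap still find the right document.

```python
import math

# Hypothetical word vectors: axis 0 ~ "computing", axis 1 ~ "cooking".
VECS = {
    "laptop":   (0.9, 0.1),
    "notebook": (0.85, 0.15),
    "computer": (0.8, 0.2),
    "banana":   (0.05, 0.95),
    "recipe":   (0.1, 0.9),
}

def embed(text):
    # Average the known words' vectors; words outside VECS are skipped.
    ws = [VECS[w] for w in text.lower().split() if w in VECS]
    return tuple(sum(c) / len(ws) for c in zip(*ws))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

docs = ["notebook computer deals", "banana bread recipe"]
query = "cheap laptop"

# The query shares no keyword with either doc, yet the right one wins on cosine:
best = max(docs, key=lambda d: cosine(embed(query), embed(d)))
print(best)  # notebook computer deals
```

A keyword matcher scores both documents zero here; the embedding similarity is what encodes that "laptop" and "notebook computer" mean the same thing.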