techhub.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A hub primarily for passionate technologists, but everyone is welcome

Server stats: 4.6K active users

#textgeneration

N-gated Hacker News<p>A paper on AR-Diffusion pretends to revolutionize text generation with a model so "advanced" it can't even generate its own job listing. 🔄🤖 Meanwhile, arXiv's relentless quest for a <a href="https://mastodon.social/tags/DevOps" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DevOps</span></a> <a href="https://mastodon.social/tags/engineer" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>engineer</span></a> continues as they remind us that "open science" is important—especially if you can keep their site from crashing. 💻🛠️<br><a href="https://arxiv.org/abs/2305.09515" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/2305.09515</span><span class="invisible"></span></a> <a href="https://mastodon.social/tags/ARDiffusion" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ARDiffusion</span></a> <a href="https://mastodon.social/tags/OpenScience" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenScience</span></a> <a href="https://mastodon.social/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.social/tags/Innovation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Innovation</span></a> <a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/ngated" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ngated</span></a></p>
Hacker News<p>AR-Diffusion: Auto-Regressive Diffusion Model for Text Generation</p><p><a href="https://arxiv.org/abs/2305.09515" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/2305.09515</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/ARDiffusion" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ARDiffusion</span></a> <a href="https://mastodon.social/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.social/tags/AutoRegressive" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AutoRegressive</span></a> <a href="https://mastodon.social/tags/AIResearch" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIResearch</span></a> <a href="https://mastodon.social/tags/MachineLearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MachineLearning</span></a></p>
Damon L. Wakes<p>I thought my Markov chain text generator had come up with a novel and amusing phrase - "The dog wasn’t terribly happy about being swept up in a sheet and rocket-booted way out of town" - but now I realise that it appears word-for-word in one of the stories it was trained on: <a href="https://damonwakes.wordpress.com/2017/07/07/isnt-it-bionic/" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">damonwakes.wordpress.com/2017/</span><span class="invisible">07/07/isnt-it-bionic/</span></a><br><a href="https://mastodon.sdf.org/tags/MarkovChain" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MarkovChain</span></a> <a href="https://mastodon.sdf.org/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.sdf.org/tags/IronyMan" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>IronyMan</span></a> <a href="https://mastodon.sdf.org/tags/FlashFiction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FlashFiction</span></a></p>
Damon L. Wakes<p>Some more interesting output from my Markov text generator. It apparently got locked into that repetitive string of all caps phrases because the "STOP" at the end of each line gives it little to work with that isn't from the story that originally prompted it to start doing that. Still, I quite like "It is like dogs’ tongues flick and ask the smell," and also the thing about the space lizard with a bomb-ring in its mouth water.<br><a href="https://mastodon.sdf.org/tags/MarkovChain" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MarkovChain</span></a> <a href="https://mastodon.sdf.org/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.sdf.org/tags/FlashFiction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FlashFiction</span></a> <a href="https://mastodon.sdf.org/tags/SpaceLizard" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SpaceLizard</span></a></p>
Damon L. Wakes<p>It finally happened. The Markov chain text generator trained on my flash fiction finally wrote one.<br><a href="https://mastodon.sdf.org/tags/MarkovChain" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>MarkovChain</span></a> <a href="https://mastodon.sdf.org/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.sdf.org/tags/FlashFiction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FlashFiction</span></a> <a href="https://mastodon.sdf.org/tags/banana" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>banana</span></a></p>
Damon L. Wakes<p>For a while now I've been thinking "Large language models could be interesting, if only they were trained ethically and used for fun stuff instead of enshittification." Well, today I've finally had a little go at producing text based on my own quarter-million words of flash fiction using this public domain Markov text generator: <a href="https://eli.thegreenplace.net/2018/elegant-python-code-for-a-markov-chain-text-generator/" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">eli.thegreenplace.net/2018/ele</span><span class="invisible">gant-python-code-for-a-markov-chain-text-generator/</span></a>. I'm not done yet, but the results are already promising!<br><a href="https://mastodon.sdf.org/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.sdf.org/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.sdf.org/tags/FlashFiction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FlashFiction</span></a></p>
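The Markov-chain approach from the linked tutorial can be sketched in a few lines. This is an illustrative toy in TypeScript, not the linked (Python) code: map each word of the training text to the words observed to follow it, then walk that map from a start word, picking a random successor at each step.

```typescript
// Minimal word-level Markov chain text generator (illustrative sketch).
// buildChain records, for every word, the words seen immediately after it.
function buildChain(text: string): Map<string, string[]> {
  const words = text.trim().split(/\s+/);
  const chain = new Map<string, string[]>();
  for (let i = 0; i < words.length - 1; i++) {
    const successors = chain.get(words[i]) ?? [];
    successors.push(words[i + 1]);
    chain.set(words[i], successors);
  }
  return chain;
}

// generate walks the chain from a start word, choosing successors with a
// caller-supplied RNG (inject a deterministic one for reproducible output).
function generate(
  chain: Map<string, string[]>,
  start: string,
  maxWords: number,
  rnd: () => number = Math.random
): string {
  const out = [start];
  let current = start;
  while (out.length < maxWords) {
    const successors = chain.get(current);
    if (!successors) break; // dead end: word only appeared at the end
    current = successors[Math.floor(rnd() * successors.length)];
    out.push(current);
  }
  return out.join(" ");
}

// Toy corpus; a real run would train on e.g. an archive of flash fiction.
const chain = buildChain("the dog ran and the cat sat and the dog sat");
console.log(generate(chain, "the", 5, () => 0));
// → "the dog ran and the"
```

Because the chain only knows word-to-word transitions from the training text, short corpora tend to reproduce whole training phrases verbatim, which is exactly the effect described in the posts above.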
Hacker News<p>New ChatGPT Models Seem to Leave Watermarks on Text</p><p><a href="https://www.rumidocs.com/newsroom/new-chatgpt-models-seem-to-leave-watermarks-on-text" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">rumidocs.com/newsroom/new-chat</span><span class="invisible">gpt-models-seem-to-leave-watermarks-on-text</span></a></p><p><a href="https://mastodon.social/tags/HackerNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HackerNews</span></a> <a href="https://mastodon.social/tags/NewChatGPTModels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NewChatGPTModels</span></a> <a href="https://mastodon.social/tags/Watermarks" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Watermarks</span></a> <a href="https://mastodon.social/tags/AIResearch" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIResearch</span></a> <a href="https://mastodon.social/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://mastodon.social/tags/TechNews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechNews</span></a></p>
PPC Land<p>Amazon expands access to Nova AI models with new developer portal: New platform grants wider access to foundation models for text, image, and video generation. <a href="https://ppc.land/amazon-expands-access-to-nova-ai-models-with-new-developer-portal/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">ppc.land/amazon-expands-access</span><span class="invisible">-to-nova-ai-models-with-new-developer-portal/</span></a> <a href="https://mastodon.social/tags/Amazon" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Amazon</span></a> <a href="https://mastodon.social/tags/NovaAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NovaAI</span></a> <a href="https://mastodon.social/tags/AIModels" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIModels</span></a> <a href="https://mastodon.social/tags/DeveloperPortal" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DeveloperPortal</span></a> <a href="https://mastodon.social/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a></p>
Kolibri<p>Lucrezia Maria Romola de’ Medici on X: "seed-poem-tool <a href="https://t.co/KzqaFhOzbz" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">t.co/KzqaFhOzbz</span><span class="invisible"></span></a>" / X<br><a href="https://x.com/Ole_Lukoile777/status/18695048390190…" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">x.com/Ole_Lukoile777/status/18</span><span class="invisible">695048390190…</span></a></p><p>Matt DesLauriers on X: "I've just published an open source tool for crafting "seed poems"— this is an exercise in constrained poetry that produces a valid BIP-39 mnemonic seed phrase, giving the reader full access to a cryptocurrency wallet. 🌱 <a href="https://t.co/aZruiOCkiG" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">t.co/aZruiOCkiG</span><span class="invisible"></span></a> <a href="https://t.co/SG9ddxPjDf" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="">t.co/SG9ddxPjDf</span><span class="invisible"></span></a>" / X<br><a href="https://x.com/mattdesl/status/1543958392984723456" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">x.com/mattdesl/status/15439583</span><span class="invisible">92984723456</span></a></p><p>### Description:<br>**Lucrezia Maria Romola de’ Medici** shared a link to the **seed-poem-tool** utility on the social network X (formerly known as Twitter). 
The utility is presumably intended for generating texts based on a semantic core (seed).</p><p>The original post is available at:<br>[seed-poem-tool](<a href="https://x.com/Ole_Lukoile777/status/18695048390190…" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">x.com/Ole_Lukoile777/status/18</span><span class="invisible">695048390190…</span></a>)</p><p>Tools of this kind can be useful for linguists, poets, and writers, as well as for software development in the field of natural language processing (NLP).</p><p>### Hashtags:<br><a href="https://qoto.org/tags/SeedPoemTool" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SeedPoemTool</span></a> <a href="https://qoto.org/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://qoto.org/tags/CreativeTools" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CreativeTools</span></a> <a href="https://qoto.org/tags/PoetryTool" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>PoetryTool</span></a> <a href="https://qoto.org/tags/LucreziaMedici" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LucreziaMedici</span></a> <a href="https://qoto.org/tags/AIWriting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AIWriting</span></a> <a href="https://qoto.org/tags/NLP" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NLP</span></a> <a href="https://qoto.org/tags/DigitalCreativity" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>DigitalCreativity</span></a> <a href="https://qoto.org/tags/WritingSoftware" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>WritingSoftware</span></a> <a href="https://qoto.org/tags/SemanticTool" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>SemanticTool</span></a></p>
David M. Schmidt<p>We currently have two fully-funded open PhD positions in our group with a focus on <a href="https://mastodon.social/tags/NLProc" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>NLProc</span></a>, <a href="https://mastodon.social/tags/InformationExtraction" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>InformationExtraction</span></a> and <a href="https://mastodon.social/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a>. I can really recommend both the group as well as Philipp Cimiano as a supervisor, so take this opportunity!</p><p>NLP/Text Generation<br>EN: <a href="https://uni-bielefeld.hr4you.org/job/view/4054" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">uni-bielefeld.hr4you.org/job/v</span><span class="invisible">iew/4054</span></a><br>DE: <a href="https://uni-bielefeld.hr4you.org/job/view/4053" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">uni-bielefeld.hr4you.org/job/v</span><span class="invisible">iew/4053</span></a></p><p>NLP/Information Extraction<br>EN: <a href="https://uni-bielefeld.hr4you.org/job/view/4059" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">uni-bielefeld.hr4you.org/job/v</span><span class="invisible">iew/4059</span></a><br>DE: <a href="https://uni-bielefeld.hr4you.org/job/view/4057" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">uni-bielefeld.hr4you.org/job/v</span><span class="invisible">iew/4057</span></a></p><p>If you have any questions, do not hesitate to contact me or Philipp directly!</p>
Paolo Amoroso<p>Artyom Bologov <span class="h-card" translate="no"><a href="https://merveilles.town/@aartaka" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>aartaka</span></a></span> posted Common Lisp code and examples of traditional text generation algorithms.</p><p><a href="https://aartaka.me/generated" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">aartaka.me/generated</span><span class="invisible"></span></a></p><p><a href="https://fosstodon.org/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://fosstodon.org/tags/CommonLisp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CommonLisp</span></a> <a href="https://fosstodon.org/tags/lisp" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>lisp</span></a></p>
Debby<p>Just came across this insightful article on watermarking for large language models! 📝 It explores how watermarking can help distinguish between human and AI-generated text, ensuring responsible usage. A great read for anyone interested in the responsible use of AI technology!<a href="https://go.nature.com/3Abhjt1" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="">go.nature.com/3Abhjt1</span><span class="invisible"></span></a> <br><a href="https://esperanto.masto.host/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> <a href="https://esperanto.masto.host/tags/technology" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>technology</span></a> <a href="https://esperanto.masto.host/tags/technews" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>technews</span></a> <a href="https://esperanto.masto.host/tags/TextGeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TextGeneration</span></a> <a href="https://esperanto.masto.host/tags/Watermarking" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Watermarking</span></a> <a href="https://esperanto.masto.host/tags/EthicsInAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>EthicsInAI</span></a> <a href="https://esperanto.masto.host/tags/ResponsibleAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ResponsibleAI</span></a> <a href="https://esperanto.masto.host/tags/TechForGood" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechForGood</span></a> <a href="https://esperanto.masto.host/tags/Innovation" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Innovation</span></a> <a href="https://esperanto.masto.host/tags/NaturalLanguageProcessing" class="mention hashtag" rel="nofollow noopener" 
target="_blank">#<span>NaturalLanguageProcessing</span></a> <a href="https://esperanto.masto.host/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://esperanto.masto.host/tags/SyntheticText" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SyntheticText</span></a></p>
Judith van Stegeren<p>I'm evaluating a gpt-4o-mini pipeline today, and the LLM consistently classifies The Netherlands as "outside of the EU". 🤦‍♀️ </p><p><a href="https://fosstodon.org/tags/llms" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llms</span></a> <a href="https://fosstodon.org/tags/openai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>openai</span></a> <a href="https://fosstodon.org/tags/gpt4omini" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>gpt4omini</span></a> <a href="https://fosstodon.org/tags/genai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>genai</span></a> <a href="https://fosstodon.org/tags/textgeneration" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>textgeneration</span></a> <a href="https://fosstodon.org/tags/classification" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>classification</span></a></p>

Over the past few days I've been extending and re-packaging the procedural text generation engine from one of the old examples into a new package, and I've also written/updated documentation for its various features:

- variable definitions, optionally with multiple value choices
- cyclic & recursive variable references/expansion
- variable assignments
- dynamic, indirect variable lookups (for context specific situations)
- optional preset & custom modifiers (i.e. pointfree/concatenative application of modifier sequences)
- controlled randomness during var expansion
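In spirit, the variable-expansion features above can be sketched like this. This is a hypothetical toy, NOT the actual thi.ng/proctext API or text format (the real engine uses a parser grammar and supports much more, e.g. modifiers, indirection, and stateful reuse); it only illustrates variable definitions with multiple value choices, recursive expansion, and controlled randomness via an injectable RNG:

```typescript
// Toy sketch of recursive variable expansion (not thi.ng/proctext itself).
// Each variable maps to one or more value choices; values may reference
// other variables, which are expanded recursively. Assumes acyclic
// definitions; the real engine also handles cyclic references.
type Vars = Record<string, string[]>;

function expand(template: string, vars: Vars, rnd: () => number): string {
  return template.replace(/\{(\w+)\}/g, (match, name: string) => {
    const choices = vars[name];
    if (!choices) return match; // unknown variable: leave untouched
    const choice = choices[Math.floor(rnd() * choices.length)];
    return expand(choice, vars, rnd); // values may reference other vars
  });
}

// variable definitions, some with multiple value choices
const vars: Vars = {
  greeting: ["Hello", "Hi"],
  subject: ["{size} world", "{size} fediverse"],
  size: ["tiny", "vast"],
};

// a deterministic "RNG" (always the first choice) makes output reproducible
console.log(expand("{greeting}, {subject}!", vars, () => 0));
// → "Hello, tiny world!"
```

Passing the RNG in explicitly is one simple way to get the "controlled randomness" listed above: seeding or stubbing it makes generated text reproducible.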

The new package is called thi.ng/proctext (6.5KB incl. all deps). The text format relies on a simple parser grammar defined and processed via thi.ng/parse; the resulting document AST is then interpreted via thi.ng/defmulti.

Please see the readme for notes/examples, as well as the refactored example project below. The tool is very useful for complex source code generation, but could also be useful for bots, interactive fiction, etc. The generator is stateful, and variable state can optionally be retained/re-used over multiple invocations. Making all modifiers async also provides a lot of flexibility (e.g. loading external data sources, generating secondary/expanded descriptions, etc.).

Demo (incl. 5 examples; can also be used as a playground):
demo.thi.ng/umbrella/procedura

thi.ng/proctext: Extensible procedural text generation engine with dynamic, mutable state, indirection, randomizable & recursive variable expansions

#INLG2024 submission deadline is in 4 weeks! How are your papers coming together?

If you work on #NaturalLanguageGeneration or #TextGeneration (with or w/o #LLMs), consider submitting!

All deadlines are Anywhere on Earth (UTC-12)
• START system regular paper submission deadline: May 31, 2024
• ARR commitment to INLG deadline via START system: June 24, 2024
• START system demo paper submission deadline: June 24, 2024
• Notification: July 15, 2024

More info: inlg2024.github.io/calls.html

inlg2024.github.io: INLG2024. The 17th International Natural Language Generation Conference is scheduled to be held in Tokyo, Japan, from September 23rd to 27th, 2024.

The first #CallForPapers for #INLG2024 is now out!

If you work on #NaturalLanguageGeneration or #TextGeneration (with or w/o #LLMs), consider submitting!

All deadlines are Anywhere on Earth (UTC-12)
• START system regular paper submission deadline: May 31, 2024
• ARR commitment to INLG deadline via START system: June 24, 2024
• START system demo paper submission deadline: June 24, 2024
• Notification: July 15, 2024
• Camera ready: August 16, 2024
• Conference: 23-27 September 2024

More info: jiscmail.ac.uk/cgi-bin/wa-jisc

www.jiscmail.ac.uk: JISCMail - SIGGEN Archives
Replied in thread

@golovlev perplexity.ai and phind.com: as a replacement for search engines, when you need a non-fabricated description or explanation of a phenomenon or event, with proof links (though even they still sometimes manage to make things up). In principle, #Copilot and #Geminy can be added here too; I usually "ask" all of them at once: one of them is bound to handle the question.

A very frequent (pardon me) use case is writing comments on Mastodon 😉 I think that generating an answer to a question (with proof links) in a couple of seconds is far more "humane" than replying "go google it".

#Mistral and #Клавдия (Claude): for literary translation and stylistic text editing. I trust their hallucinations 0.0%.

theb.ai writes code for simple everyday scripts somewhat more verbosely than the others, but in a way that's easier for a novice to understand.

Oh, and these: kagi.com/summarizer/ produces a short summary of a text (or of a page by link). 300.ya.ru extracts key points from #YouTube videos with timestamps. It really helps to decide whether a 90-minute podcast is worth watching (as a rule, it isn't).