Hello Lumin,
On Sat, 9 Nov 2024 at 10:27, DebGPT (<lumin@debian.org>) wrote:
This is an experiment: letting an LLM go through all 369 emails from
debian-devel in October. The command for producing the news report
is included below. Use debgpt's git HEAD if you want to try.
This is the first time I see this kind of email. I thought some time ago
that this would be a really cool use of AI - producing a summary of
mailing lists - since I struggle to read everything.
I just want to thank you for putting this together and, at least from
my side, this is very much appreciated.
Regards
--
Héctor Orón -.. . -... .. .- -. -.. . ...- . .-.. --- .--. . .-.
This is an experiment: letting an LLM go through all 369 emails from
debian-devel in October. The command for producing the news report
is included below. Use debgpt's git HEAD if you want to try.
Is it via ChatGPT or a self-hosted LLM?
Can we imagine having a Debian-hosted computer with an AMD GPU dedicated to this use case?
We should provide these summary letters for most of our mailing lists :)
cheers
Fred
----- On 9 Nov 24, at 14:09, Hector Oron zumbi@debian.org wrote:
The LLM I used to produce that exact news report was gpt-4o-mini,
from OpenAI. ChatGPT is the name of OpenAI's LLM web interface, and
its underlying model can change over time. It took roughly 3
minutes to perform the bulk API calls.
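The "bulk API call" here is just many independent requests issued concurrently. A minimal sketch of the pattern with Python's standard library (the `summarize` stub is only illustrative and does not reflect debgpt's actual internals):

```python
from concurrent.futures import ThreadPoolExecutor

def summarize(email_body: str) -> str:
    # Stand-in for one LLM API request; a real version would POST
    # the email text to the inference service and return its reply.
    return email_body.splitlines()[0]

def bulk_summarize(emails, max_workers=8):
    # Issue the independent requests concurrently: the calls are
    # I/O-bound, so overlapping them cuts total wall-clock time.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(summarize, emails))
```

`pool.map` preserves input order, so the summaries line up with the original emails.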
That said, I basically implemented support for all commonly seen
LLM inference services:
(4 commercial ones)
openai, anthropic, google, xai,
(4 self-hosted)
llamafile, ollama, vllm, zmq (built-in but kind of outdated).
Other services missing from the list are also supported, as long
as they provide a compatibility mode for the OpenAI API.
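OpenAI-API compatibility just means the backend exposes the same HTTP surface, so the same client code works against any of them by switching the base URL. A stdlib sketch under that assumption (the port and model name are only examples, e.g. a hypothetical local ollama server on its default port):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    # Any OpenAI-compatible backend (openai, vllm, ollama, llamafile...)
    # accepts the same request shape at the same path.
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},  # POST, since data is set
    )

# Swapping backends is just a different base URL:
req = build_chat_request("http://localhost:11434", "llama3", "Summarize this thread.")
```

The response parsing is likewise identical across backends, which is why a single compatibility mode covers so many services.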
For a particular use case like summarizing a mailing list, a self-hosted
one will be much slower to respond to the bulk API calls unless it is
hosted on a GPU cluster :-)
Small LLMs are not necessarily smart enough. The Open LLM Leaderboard[3]
is a good reference for figuring out the best open-access LLM for self-hosting.
In terms of "Debian hosted computer with AMD GPU for LLM inference" --
that is exactly one of the long-term goals of the Debian Deep Learning
team (debian-ai@l.d.o). Team members are working to prepare the ROCm
packages and the ROCm version of PyTorch.
I find ollama[1] and llamafile[2] quite handy to use locally with a
spare GPU, if you do not mind using software from outside of the Debian archive.
[1] https://github.com/ollama/ollama
[2] https://github.com/Mozilla-Ocho/llamafile
[3] https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard
On 11/9/24 05:19, PICCA Frederic-Emmanuel wrote:
Is it via ChatGPT or a self-hosted LLM?
I just realized that the news report could be more useful if it cites
the information source. Here we go:
debgpt -Hx ldo:debian-devel/2024/10 -a 'write a news report based on the provided information. Cover as many topics as possible. You may expand a little bit on important matter. include links to the report.' --no-render
BTW, which mailing lists should I cover with those monthly reports?
So far I have tried debian-ai, debian-devel, and debian-science.
At 2024-11-09T21:44:40+0000, Steve McIntyre wrote:
Please, no further. We don't need hallucinated summaries on our
lists. If you want to publish them, publish them somewhere separately
IMHO.

Oh, good -- since it's not a CoC violation to express an unflattering
opinion of this experiment, did anyone notice how deadly dull the prose
style is? It's like the LLM had been trained solely on corporate press releases.
I miss the Joeys.
Regards,
Branden
Thanks a lot Mo for this exciting experiment! Those new technologies
are sure to make more impact in the future.

And having two ex-DPLs pressing the big red stop button is not
necessarily a bad sign in an ageing project. Often you will see ideas
rejected in a very dismissive if not insulting way (for example
source-only uploads or HTTPS URLs in /etc/apt/sources.list), and a
couple of years later they are mainstream!
What do people do when there is a long thread on debian-devel,
debian-project, debian-private or the like? Well, what I do is that I
check the first ~6 messages and then cherry-pick 3 or 4 answers deeper
in the thread.

I do more or less the same, or even worse -- keep them marked as unread.
I see a big transformative potential for our future discussions: even if
a crowd is shouting circular arguments around, we can use AI to reassure
participants that original points of view have a good chance to be
part of a summary. Taking the effort to contribute is rewarded. This
can change Debian considerably. So please, more DebGPT summaries!

DebGPT is now more about a general terminal LLM tool which I develop.
Our mailing lists were a ground-breaking technological advance in the
past that would open Debian to the whole world, but now are they not
working exactly against that?
On Sun, Nov 10, 2024 at 08:48:21AM +0900, Charles Plessy wrote:
Our mailing lists were a ground-breaking technological advance in the
past that would open Debian to the whole world, but now are they not
working exactly against that?
first: citation needed.
second: summaries written by applied statistics systems will not help.
(also citation needed I guess.)
please post these summaries to a dedicated applied statistics summary
mailinglist, but please don't spam the original lists with this bot content.
btw, the signature below was chosen by an 'artificial intelligence'
system called fortune. it's so amazing and wise!!1
Even though they hallucinate a lot, LLMs can still correctly
teach me how to use urwid (which I could never understand by going
through its tutorial many times...).
That led to the `debgpt config` TUI configuration wizard.
Any suggestions on a place where I can safely direct this generated
content, with an audience comfortable with LLM noise? As noted
by the ex-DPLs, it should happen on a dedicated experiment ground.