#Datasette

2025-05-20

I should be able to generate a #Datasette Lite URL and a ChatGPT "GPT" bot real soon now :-)

Jack Linke 🦄 jack@social.jacklinke.com
2025-05-17

Not all the #PyConUS posters are up yet, but the ones that are look amazing 🤩

I especially like...

@pamelafox - explains vector embedding visually in a way that finally helped tie everything together in my head. I feel like I actually *understand* the core concepts 1000% better now.

@simon - I love the dozens of little photos demonstrating real uses of #datasette. Now I want to explore all the niche museums within driving distance of home!

#Python is applied in a mind-boggling variety of ways!

Paul Förster :verified_blue: paulfoerster@swiss.social
2025-04-15

I had no idea the 64'er magazine was on #Mastodon. Super cool! 😎 Thanks. 🙏 Following right away… 🤣

@64er

64er-magazin.de/

#Commodore #VC20 #C64 #Commodore64 #C16 #C116 #Plus4 #Datasette #CBM #64er #64erMagazin #MEGA65

Meshuggah Mischell ✅ meshuggahmischell@metalhead.club
2025-02-16

Ha, how cool. My father really wanted to watch this back when he bought us a #c64 with a #datasette. I was too young at the time to understand why ... let's have a look ...

archive.org/details/d64_Scharf

Vingt Trois Seize 💎 vingtroiseize@mastodon.world
2025-02-12

Computer illustration crops-wips
Posted by smugcomputer

#commodore #cbm #C64 #8bit #retrocomputer #retrogaming #videogames #Tape #CBM1530 #datasette #80s #90s #Geek

2025-01-28

I aggregated reptile observation data from #inaturalist, #gbif, #observation and #naturgucker into a SQLite DB. Then I used #Datasette with the cluster-map plugin to present the data to the working group.
That worked quite well for very little effort. 😎
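That aggregation step can be sketched with Python's stdlib sqlite3 module. The table and column names below are hypothetical stand-ins for the real export format; the one practical constraint worth knowing is that the datasette-cluster-map plugin looks for columns named latitude and longitude to place points on the map:

```python
import csv
import io
import sqlite3

# Hypothetical CSV export from one of the observation platforms
SAMPLE = """species,latitude,longitude
Lacerta agilis,52.52,13.40
Natrix natrix,48.14,11.58
"""

# In practice connect to a file like "reptiles.db";
# :memory: keeps this sketch self-contained
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE observations (species TEXT, latitude REAL, longitude REAL)"
)
# DictReader yields dicts, which map onto the named placeholders
db.executemany(
    "INSERT INTO observations VALUES (:species, :latitude, :longitude)",
    csv.DictReader(io.StringIO(SAMPLE)),
)
db.commit()
count = db.execute("SELECT COUNT(*) FROM observations").fetchone()[0]
print(count)
```

With the rows in a real file, serving the map should be as simple as installing the plugin and running datasette reptiles.db.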

R.J. Gillis rj@dice.camp
2025-01-23

Real nerd hours on my lunch break today. I need to pull a bunch of cards from my #MagicTheGathering collection for this deck I'm working on, and I wanted a way to know which box a particular set is stored in, so obviously I found a place I could download a CSV with information about all the sets…and then I loaded that into SQLite…and now I’m querying it with Datasette... #mtg #datasette

Screenshot of a database tool with a long SQL query related to Magic: The Gathering
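The box-lookup idea presumably joins the downloaded set data against some record of which box holds which set. A toy version of that query, with an entirely made-up schema (only the set names and codes are real Magic sets):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Made-up schema: a sets table (as loaded from the downloaded CSV)
# plus a hand-maintained mapping of set codes to storage boxes
db.executescript("""
CREATE TABLE sets (code TEXT PRIMARY KEY, name TEXT, released TEXT);
CREATE TABLE boxes (set_code TEXT, box TEXT);
INSERT INTO sets VALUES ('MH2', 'Modern Horizons 2', '2021-06-18');
INSERT INTO sets VALUES ('NEO', 'Kamigawa: Neon Dynasty', '2022-02-18');
INSERT INTO boxes VALUES ('MH2', 'Box 3');
INSERT INTO boxes VALUES ('NEO', 'Box 7');
""")

# Which box is a given set stored in?
row = db.execute("""
    SELECT sets.name, boxes.box
    FROM sets JOIN boxes ON boxes.set_code = sets.code
    WHERE sets.code = ?
""", ("MH2",)).fetchone()
print(row)
```

The same JOIN works unchanged as a custom SQL query inside Datasette once both tables are in the database.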
Dendrobatus Azureus Dendrobatus_Azureus@bsd.cafe
2025-01-18

@rickt137 @RadioAzureus @mms

You must have a very sturdy unit there, because the ones we had here in the tropics, where temperatures can reach 38 to 41°C, did not survive for long before their heads went out of alignment, and they became magnetized quite quickly too.

In the tropics the belts of such a recorder were destroyed in about eight years: they simply dried up and snapped.

#RetroComputing #Commodore #VIC20 #Datasette #Cassette #Tapes #FlightSimulator

How I use LLMs – neat tricks with Simon’s `llm` tool

Earlier this year I co-authored a report about the direct environmental impact of AI, which might give the impression I’m massively anti-AI, because it talks about the significant social and environmental costs of using it. I’m not. I’m (still, slowly) working through the content of the Climate Change AI Summer School, and I use it a fair amount in my job. This post shows some examples of how I use it.

I’ve got into the habit of running an LLM locally on my machine in the background, having it sit there so I can pipe text or quick local queries into it.

I’m using Ollama, mostly with the small Llama 3.2 3B model, and Simon Willison’s wonderful llm tool. I use it like this:

llm "My query goes here"

I’m able to continue discussions using the -c flag like so:

llm -c "continue discussion in an existing conversation"

It’s very handy, and because it’s on the command line, I can pipe text into and out of it.

Doing this with multi-line queries

Of course, you don’t want to write every query on the command line.

If I have a more complicated query, I now do this:

cat my-longer-query.txt | llm

Or, if I want the LLM to respond in a specific way, I can send a system prompt like so:

cat my-longer-query.txt | llm -s "Reply angrily in ALL CAPS"

Because llm can use multiple models, if I find that the default local model (currently Llama 3.2) is giving me poor results, I can sub in a different model.

So, let’s say I have my query, and I’m not happy with the response from the local llama 3.2 model.

I could then pipe the same query into the beefier set of Claude models instead:

cat my-longer-query.txt | llm -m claude-3.5-sonnet

I’d need an API key and the rest set up, obvs, but that’s an exercise left to the reader, as the llm docs are fantastic and easy to follow.
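For completeness, the one-off setup looks roughly like this. The plugin and key names below are from memory of the llm plugin ecosystem, so check the llm plugins directory before trusting them:

```shell
# Install the Anthropic plugin that provides the claude-* models
llm install llm-claude-3

# Paste in your Anthropic API key when prompted; llm stores it for you
llm keys set claude

# Confirm the new models show up in the list
llm models
```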

Getting the last conversation

Sometimes you want to fetch the last thing you asked an llm, and the response.

llm logs -r

Or maybe the entire conversation:

llm logs -c

In both cases I usually either pipe it into my editor, which has handy markdown preview:

llm logs -c | code -

Or, if I want to make the conversation visible to others, GitHub’s gh command has a handy way to create a gist in a single CLI invocation.

llm logs -c | gh gist create --filename chat-log.md -

This will return a URL for a publicly accessible secret gist that I can share with others.

Addendum – putting a handy wrapper around these commands

I have a very simple shell function, ve, that opens a temporary file for me to jot stuff into and, upon save, echoes the content to STDOUT using cat.

(If these examples look different from regular bash / zsh, it’s because I use the fish shell).

This then lets me write queries in an editor, which I usually have open, without needing to worry about cleaning up the file I was writing in. Because llm stores every request and response in a local sqlite database, I’m not worried about needing to keep these files around.

function ve --description "Open temp file in VSCode and output contents when closed"
    # Create a temporary file
    set tempfile (mktemp)
    # Open VSCode and wait for it to close
    code --wait $tempfile
    # If the file has content, output it and then remove the file
    if test -s $tempfile
        cat $tempfile
        rm $tempfile
    else
        rm $tempfile
        return 1
    end
end

This lets me do this now for queries:

ve | llm

One liner queries

I’ve also since set up another shortcut like this for quick questions I’d like to see the output from, like so:

function ask-llm --description "Pipe a question into llm and display the output in VS Code"
    set -l question $argv
    llm $question | code -
end

This lets me do this now:

ask-llm "My question that I'd like to ask"

Do you use this all the time?

Not really.

I started using Perplexity last year as my way into experimenting with Gen AI, after hearing friends explain it was a significant improvement on regular web search services like Google, which get worse over time. I also sometimes use Claude, because Artifacts are such a neat feature.

I also experimented with Hugging Face’s Hugging Chat thing, but over time, I’ve got more comfortable using llm.

If I wanted a richer interface than what I use now, I’d probably spend some time using Open Web UI. If I were to strategically invest in building a more diverse ecosystem for Gen AI, that’s where I would spend some time. Mozilla, or anyone interested in less consolidation: this is where you should be investing time and money if you insist on jamming AI into things.

In my dream world, almost every Gen AI query I make is piped through llm, because that means all the conversations are stored in a local sqlite database that I can do what I like with.
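One nice consequence of that local database: you can point plain SQL (or Datasette itself) at it. The real file lives wherever `llm logs path` reports; I haven’t reproduced its exact schema here, so the table below is a simplified stand-in just to show the kind of query you can run over your history:

```python
import sqlite3

# Simplified stand-in for llm's logs database; the real schema differs.
# Find the real file with: llm logs path
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE responses (prompt TEXT, response TEXT, model TEXT, datetime_utc TEXT);
INSERT INTO responses VALUES
  ('What is SQLite?', 'A small embedded database.', 'llama3.2', '2025-01-01T10:00:00'),
  ('Explain pipes', 'They connect stdout to stdin.', 'llama3.2', '2025-01-02T09:30:00');
""")

# e.g. count prompts per model, most recently used first
rows = db.execute("""
    SELECT model, COUNT(*) AS prompts, MAX(datetime_utc) AS last_used
    FROM responses GROUP BY model ORDER BY last_used DESC
""").fetchall()
print(rows)
```

With Datasette installed, something like datasette "$(llm logs path)" should open the real thing in the browser.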

In fact, I’d probably pay an annual fee (preferably to Simon!) to have my llm sqlite database backed up somewhere safe, or accessible from multiple computers, because as I use llm more, it becomes more valuable to me, and the consequences of losing it, or corrupting it in some way become greater.

If you have had success using llm that way, I’d love to hear from you.

#AI #datasette #generativeAi #llm

2024-12-24

#ICYMI I've loaded the Ontario anti-bicycling Bill 212 comments (most of which are against the bill) CSV file into SQLite using the amazing #datasette! This means you can use SQL to make queries if that's your jam. Hope it's helpful! Check it out at: lite.datasette.io/?csv=https%3

2024-12-24

@PapyrusBrigade I've loaded the Bill 212 comments CSV file into SQLite using the amazing #datasette! This means you can use SQL to make queries if that's your jam. Hope it's helpful! Check it out at: lite.datasette.io/?csv=https%3
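For anyone wanting to build links like these themselves: Datasette Lite accepts a URL-encoded link to a CSV file in its ?csv= query parameter, so generating one is a one-liner (the example CSV URL below is a placeholder):

```python
from urllib.parse import quote

def lite_url(csv_url: str) -> str:
    # The nested URL must be percent-encoded (including ':' and '/')
    # to survive being embedded inside another URL's query string
    return "https://lite.datasette.io/?csv=" + quote(csv_url, safe="")

url = lite_url("https://example.com/bill-212-comments.csv")
print(url)
# → https://lite.datasette.io/?csv=https%3A%2F%2Fexample.com%2Fbill-212-comments.csv
```

The CSV needs to be served with CORS headers that allow cross-origin fetches, since Datasette Lite loads it client-side in the browser.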

Rusty Invader Rusty_invader
2024-11-27

@iieksi Soft Aid was relatively unknown in Germany too. I didn't have this Datasette back in the day. But here are two photos from the inlay that are worth a read — note Side 2, first entry 🥹

Soft Aid Datasette Inlay 1
Soft Aid Datasette Inlay 2

Next steps:
- More data from the party websites
- Candidate statements (and embeddings which is where the real fun begins)
- Show candidates "neighbors" based on geodata

If I get help I'd love to get data from social media, find more open datasets to join in, and anything else you can think of. I need help! If you're a developer interested in #civics #civichacking and #irishpolitics, or just want experience with #nushell, #datasette, or #sqlite - please reply to me here, there's plenty to do.

irish-election-2024.vercel.app

I've been working on this candidate explorer for the upcoming Irish general election. At the moment it contains basic info about who is running where, candidate/constituency/party search and some stats at the party and constituency level. I'm hoping that by election day I'll be able to add a lot more info, and bring in additional datasets.

#irishge
#ireland
#civics
#civichacker
#irishelection
#daileireann
#ge2024
#ge24
#irishpolitics
#datasette

Any civics hackers working on open data or tools for the upcoming Irish general election? (Or otherwise Ireland civics related things)

Let me know, I'd love to see what's out there and potentially contribute.

I'm also working on a candidate directory to try and provide as much info on candidates as possible, so if you could let me know if that already exists that'd be great.

#irishge #ireland #civics #civichacker #irishelection #daileireann #ge2024 #irishpolitics #datasette

2024-10-05

#Beziehung (relationship) next level is,

when your #Herzensmensch (favourite person) tells you at #Deadline2024 that she used to have a #Datasette and can program #Fortran.

<3
