Post AmcuoH3HXh5gt4kqY4 by j@bae.st
Post #AljXbZcfbquuLeDfvM by j@bae.st
2024-09-05T11:04:29.276032Z
0 likes, 0 repeats
@lanodan @hj what would be the postgres query to export a user's followers?
Post #AljXbaUuMCIn3razSq by i@declin.eu
2024-09-06T17:30:01.291270Z
0 likes, 0 repeats
@j @lanodan @hj this:

select ap_id
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGE ME')) as _user
    on users.id = _user.following_id;
Post #AljhhbDXanPoZayfUO by j@bae.st
2024-09-06T19:20:45.869507Z
0 likes, 0 repeats
@i @lanodan @hj I just got ERROR: column "nepiant" does not exist
Post #Aljhhc54NmEXFc1PvM by i@declin.eu
2024-09-06T19:23:09.812394Z
0 likes, 0 repeats
@j @lanodan @hj single quotes (') not double quotes (") for SQL
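For context: in PostgreSQL, single quotes delimit string literals while double quotes delimit identifiers such as column names, which is why the double-quoted nickname was parsed as a column reference. A minimal illustration, reusing the nickname from the error above:

select id from users where nickname = 'nepiant';  -- string literal: compares against the nickname
select id from users where nickname = "nepiant";  -- ERROR: column "nepiant" does not exist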
Post #AljhzHSJnwSaUoJwga by j@bae.st
2024-09-06T19:25:14.428590Z
0 likes, 0 repeats
@i @lanodan @hj it was because I was using single quotes for the SQL command
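As an aside (not from the thread): when the clash is between shell quoting and SQL quoting, PostgreSQL's dollar quoting avoids single quotes inside the SQL entirely, so the whole statement can sit in a single-quoted psql -c '...' without escaping ($ is not special inside shell single quotes):

select id from users where nickname = $$CHANGE ME$$;  -- $$...$$ behaves like '...'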
Post #Alji2rv2gvLSSBT1HM by j@bae.st
2024-09-06T19:25:38.696897Z
0 likes, 0 repeats
@i @lanodan @hj thank you
Post #AljiQGgKn2jrukkGIq by j@bae.st
2024-09-06T19:30:00.329956Z
0 likes, 0 repeats
@i @hj @lanodan wait, can the resulting list be imported in the Pleroma settings? That's what I need. Pleroma can't export more than 1 page, so I have to pull it from the database directly for people so they can import it on other servers.
Post #Aljkv6xdBMj6eOfd4q by i@declin.eu
2024-09-06T19:59:14.395968Z
1 likes, 0 repeats
@j @lanodan @hj select nickname instead of ap_id and the CSV import will work
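Applied to the earlier query, the change is only in the select list; a sketch, with 'CHANGE ME' again standing for the exporting user's nickname:

select nickname
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGE ME')) as _user
    on users.id = _user.following_id;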
Post #AljmDOPj89Rlwn5qLo by j@bae.st
2024-09-06T20:13:43.480782Z
1 likes, 0 repeats
@i @lanodan @hj how can I make it add the @domain.tld to local accounts so it can be imported on other servers?
Post #AljnOdYbGjjA3JAY88 by i@declin.eu
2024-09-06T20:26:58.759715Z
1 likes, 0 repeats
@j @lanodan @hj probably easier to run two queries than complicate the sql further:

select concat(nickname, '@bae.st')
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGEME')) as _user
    on users.id = _user.following_id
where users.local = 't';

select nickname
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGEME')) as _user
    on users.id = _user.following_id
where users.local = 'f';
Post #AljthtP7ApoqQbIVCS by shitpisscum@mrhands.horse
2024-09-06T21:37:40.044129Z
0 likes, 0 repeats
@i @j @lanodan @hj >easier to run two queries
Why not just union these queries?

select concat(nickname, '@bae.st')
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGEME')) as _user
    on users.id = _user.following_id
where users.local = 't'
union
select nickname
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGEME')) as _user
    on users.id = _user.following_id
where users.local = 'f';
Post #Aljw0iX616go7c8PWS by i@declin.eu
2024-09-06T22:03:29.492861Z
1 likes, 0 repeats
@shitpisscum @j @lanodan @hj for the same reason I didn't case when it: laziness
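For completeness, the case when variant being dodged here would fold the two branches into a single pass over the join; a sketch built from the queries above, assuming users.local is a boolean (which the = 't' / = 'f' comparisons suggest):

select case when users.local then concat(nickname, '@bae.st') else nickname end
from users
  join (select following_id
        from following_relationships
        where follower_id = (select id from users where nickname = 'CHANGEME')) as _user
    on users.id = _user.following_id;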
Post #Alk9173M9LF7Zfk7zU by j@bae.st
2024-09-06T23:58:30.922328Z
0 likes, 0 repeats
@shitpisscum @i @lanodan @hj I like how complex of a query such a simple task requires
Post #AluSbjvyuRiRIS4UQS by j@bae.st
2024-09-11T23:55:54.142361Z
1 likes, 0 repeats
@shitpisscum @i @lanodan @hj what about getting a list of all the files uploaded by a user, so I can hard-link them into a directory and then tar them up?
Post #Alv2TeP9BI7KBE9hho by i@declin.eu
2024-09-12T06:37:46.760577Z
0 likes, 0 repeats
@j @shitpisscum @lanodan @hj select split_part(upload->>'href', '/', 5)
from (select jsonb_array_elements(data->'url') as upload
      from objects
      where data->>'actor' = (select ap_id from users where nickname = 'CHANGEME')) as _uploads  -- alias required before PostgreSQL 16
where upload->>'type' = 'Link';
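To make the split_part(..., '/', 5) step concrete: splitting the href on '/' yields 'https:' as field 1, an empty field 2, the host as field 3, and 'media' as field 4 under the usual /media/ upload path, so field 5 is the first segment after that. A standalone check (the URL here is made up; the real layout depends on the instance's uploader config):

select split_part('https://bae.st/media/deadbeef1234/image.png', '/', 5);  -- returns 'deadbeef1234'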
Post #AlvnpDkne0chjz4RaC by j@bae.st
2024-09-12T15:28:21.944375Z
1 likes, 0 repeats
@i @shitpisscum @lanodan @hj now isn't this some shit? How can I tie split into this to make it work?
[attachment: Screenshot_20240912-102611~2.png]
Post #AlvoATuclLc9w7ntse by i@declin.eu
2024-09-12T15:32:08.745103Z
0 likes, 0 repeats
@j @shitpisscum @lanodan @hj psql | xargs -n 16 cp -t dest ... to do it in smaller batches, or psql | xargs -I AAA cp AAA dest/ to do one file at a time, or readline, or any other for loop; anything but bash substitution
Post #AlwDJQHE97oI7bjj3A by lanodan@queer.hacktivis.me
2024-09-12T20:13:45.929389Z
1 likes, 0 repeats
@i @j @shitpisscum @hj Shouldn't even need -n 16 for xargs, as it should automatically split to never exceed ARG_MAX (per the POSIX standard), which is 131072 (characters) on Linux.
Post #AmcuoH3HXh5gt4kqY4 by j@bae.st
2024-09-25T03:34:51.207646Z
0 likes, 0 repeats
@lanodan @i @shitpisscum @hj I copied and pasted that but it gave an error
Post #AmcuoHS63QwM82MfE8 by j@shitposter.world
2024-10-03T10:39:12.974604Z
2 likes, 1 repeats
@j @i @shitpisscum @lanodan @hj cc @teto I'll let @p handle this since I wasn't able to
Post #Amd9V7bZwLCPOn7dHk by p
2024-10-03T13:23:50.692413Z
0 likes, 0 repeats
@j @j @hj @i @lanodan @shitpisscum @teto What are we doin'?
Post #Amd9oZJX8vLgq3xFaq by p
2024-10-03T13:27:21.566430Z
1 likes, 0 repeats
@i @j @hj @lanodan What I do is just pipe it through awk:

psql -q -A -t -f query | awk '$0 !~ /@/ { print $0 "@bae.st"; next } 1' > x.csv
Post #AmdBpjwVJNNU05rwh6 by teto@cawfee.club
2024-10-03T13:49:57.878547Z
1 likes, 0 repeats
@p @j @i @shitpisscum @lanodan @hj @j he's talking about the data of individuals being packaged and sent. In my case... well you know which one
Post #AmdD4WAatsxPaD4PVg by p
2024-10-03T14:03:51.416754Z
0 likes, 0 repeats
@j @shitpisscum @hj @i @lanodan Oh, that's a huge pain. You're better off just going through the API to get a list of attachments. You can grab a list of the objects attributed to a given user and then turn the URLs into paths. I'd just go through the API.
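For the "turn the URLs into paths" step, one could also stay in SQL by rewriting the href prefix on top of the earlier objects query; a sketch assuming media is served under https://bae.st/media/ and stored in /var/lib/pleroma/uploads/ (both assumptions; adjust to the instance's uploader config):

select replace(upload->>'href', 'https://bae.st/media/', '/var/lib/pleroma/uploads/')
from (select jsonb_array_elements(data->'url') as upload
      from objects
      where data->>'actor' = (select ap_id from users where nickname = 'CHANGEME')) as _uploads
where upload->>'type' = 'Link';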
Post #AmdDl3ZF7SX7KHspXM by i@declin.eu
2024-10-03T14:11:30.969580Z
1 likes, 0 repeats
@p @j @shitpisscum @lanodan @hj not really, unless you try to bash substitute the entire igel coom at once, instead of iterating it any which way
Post #AmdFlNoK3UYQ1ADiBE by p
2024-10-03T14:34:00.827097Z
4 likes, 1 repeats
@teto @hj @i @j @j @lanodan @shitpisscum Yes, indeed. Definitely worth preserving. :housewant: The import is slow going (I have to rework a piece that has been too slow anyway; it's at 15k files of 1.9m, and the bottleneck is the same one I stopped hacking on to get the media import started, so I get to kill two bottlenecks at once) but it'll all be infinitely durable when the process is done. There's a static UI that is fast enough to just scrape without sleeping between reqs.
Post #AmdG4MoLzmvqQ6Hr8a by p
2024-10-03T14:37:26.681796Z
1 likes, 0 repeats
@i @hj @j @lanodan @shitpisscum Oh, shit, the uploads table, right. It is *still* probably easier to just go through the API. The jsonb shit is ugly as hell even if it is reasonably fast to query the uploads table.
Post #AmdMQfD4ry5tZvrmd6 by p
2024-10-03T15:48:42.267756Z
0 likes, 0 repeats
@teto @hj @i @j @j @lanodan @shitpisscum Actually, if you have Tor then it's viable on the grafana dashboard: http://pbuhwjjhrzcvrghtfqlwqlgabuzc7jnkqv4swekxjvv7pgubd7jjoiqd.onion/public-dashboards/8643a3beee524385bd0ef6a304dbbd65?orgId=0&refresh=6s
[attachment: baest_media_ingestion.png]
Post #AmdMRib5lEvQNEbNaK by p
2024-10-03T15:48:53.705655Z
2 likes, 0 repeats
@hj @i @j @j @lanodan @shitpisscum @teto Or viewable. Whichever.
Post #An88UFS3ZD9o2sRmb2 by teto@cawfee.club
2024-10-18T12:08:30.106084Z
1 likes, 0 repeats
@i @j @p @shitpisscum @lanodan @hj so what's the current situation on this?
Post #An89LZlWlp6EOcz0vw by i@declin.eu
2024-10-18T12:18:09.199902Z
2 likes, 0 repeats
@teto @j @p @shitpisscum @lanodan @hj peen is holding your coom vault hostage
Post #An8eu4pGtNDL9YfKLY by p
2024-10-18T18:11:45.662920Z
0 likes, 0 repeats
@teto @i @hj @j @lanodan @shitpisscum Well, I am speeding up the one thing; as noted, I'm skeptical that the reason it's slow is sshfs, but we'll see. I have to do that part anyway so it won't hurt to do it now. I do see a directory, so maybe sjw did his hard-links thing, but I don't know if it has completed.