00:00:18 would make life simpler for people wanting to do their own stuff with the data 00:03:20 currently for Postquell I have something set up at crawlapi.mooo.com, which is probably not stable enough for general use yet (running on home server) 00:04:22 API endpoint is http://crawlapi.mooo.com/event 00:07:17 is there a way for crashed services to catch up on missed events? 00:08:06 yes it's a database, you can read through the history of all events with the /event endpoint 00:09:08 e.g. first request http://crawlapi.mooo.com/event?limit=100, then you get "next_offset":101 in the results so you do http://crawlapi.mooo.com/event?limit=100&offset=101 00:09:47 etc until you have parsed everything 00:10:02 and if you want realtime updates without polling you can switch to the socket 00:10:24 ah, so there's a separate paginated listing api as well as pubsub, good 00:10:24 I'm using socket.io but would want to change that I think 00:11:33 I've used zeromq a tiny bit, might be an option 00:11:57 what would be good is having the crawl servers push the updates instead of having to poll 00:12:37 why not raw websockets? 00:12:43 and yeah, pushing is better 00:12:54 the event database is very basic, just data in json form in no particular order 00:12:54 what writes the milestone files? 
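The paging scheme just described (pass `limit`, then follow `next_offset` until the history is exhausted) can be sketched in Python. The endpoint URL and the `next_offset` field come straight from the messages above; the name of the event list in the response (`events` here) is an assumption, since the exact response shape isn't shown in the log.

```python
import json
import urllib.request

API = "http://crawlapi.mooo.com/event"  # endpoint mentioned above; not yet stable

def fetch_page(limit=100, offset=0, opener=None):
    """Fetch one page of events; `opener` is injectable so tests can fake the network."""
    url = f"{API}?limit={limit}" + (f"&offset={offset}" if offset else "")
    opener = opener or (lambda u: urllib.request.urlopen(u).read())
    return json.loads(opener(url))

def all_events(limit=100, opener=None):
    """Yield every stored event by following next_offset, as described above."""
    offset = 0
    while True:
        page = fetch_page(limit, offset, opener)
        events = page.get("events", [])  # key name is an assumption
        yield from events
        if not events or page.get("next_offset") is None:
            break
        offset = page["next_offset"]
```

A crashed consumer can restart from its last saved offset instead of zero, which is exactly the catch-up behaviour asked about at 00:07:17.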
00:13:07 the crawl server 00:13:20 haven't looked into that 00:13:49 the webserver could post them over a websocket 00:14:07 yup that could work 00:15:15 have to think about what happens if the connection drops and the pubsub has to catch up with the crawl server 00:15:15 crawl server needs to store a log 00:15:19 ideally you could share code for this bit 00:15:24 yeah can just keep the current log system 00:15:50 this isn't quite up to date https://github.com/Kramin42/Crawl-Log-Api-Nodejs 00:16:53 documentation for the api is in the python version https://github.com/Kramin42/Crawl-Log-Api 00:17:14 thanks, I'll take a look; this all seems like a step in the right direction 00:18:08 as mentioned above, servers are currently a bottleneck for some areas of development and I'd strongly prefer not to add to that 00:18:31 how would you feel if the dev team managed this service? 00:18:38 fine by me 00:19:09 preferable even heh 00:21:32 most of this was written while learning nodejs and Sequelize so bear with :) 00:21:48 to be clear, that'd probably include importing the code repository into the crawl org 00:22:17 consider it public domain 00:22:55 great 00:23:38 fwiw, I would prefer crawl servers to push events one by one using an http api over streaming via websockets/other 00:24:03 the volume of events is extremely low, and streaming is more complex 00:25:23 yeah can just do it with http POST 00:26:12 from crawl-servers to central server 00:26:32 sure 00:26:57 from central server to subscribers not sure 00:27:38 if they don't care about realtime they can just use the event endpoint 00:27:47 that also fixes another issue: half the servers don't have https, so having the scoring server connect to unencrypted websockets isn't great 00:30:33 for people that do want real-time (announcers) websockets is probably the easiest 00:30:48 yeah, from central servers to subscribers you probably want a streaming feed. 
Unless there is some sort of callback webhook registration system 00:31:40 what will consume scoring events? 00:31:59 what do you mean? 00:32:31 by events I'm meaning milestones/wins/deaths 00:32:44 from the crawl-servers 00:32:59 right 00:33:17 what bits of crawl infra will read the shared stream of all events? 00:33:22 a scoring server will consume those from the central pubsub 00:34:34 basically eliminating the whole fetching and parsing logfiles from X different servers aspect that everyone has to set up to do any scoring/custom tournaments 00:34:51 why not have the crawl servers just post events to the scoring server instead? 00:35:49 could do, but then each scoring server still has to know where each crawl-server is 00:35:59 and you have an N-N relationship 00:36:13 ah; how many scoring servers are there 00:36:13 introducing the central server just makes it N-1-N 00:36:34 at the moment, maybe 2 haha 00:37:08 <|amethyst> aidanh: there are at least three official services: tourney, scoring, and sequell 00:37:08 <|amethyst> though tourney is only around for a few weeks out of every year 00:37:45 ok, in that case a merging server could make sense 00:38:16 and you have custom tourneys like CSDC 00:40:35 ah, right 00:40:53 there have also been several people who download all logfiles/milestones from every server. It's hard to find out all the servers to download them, if you don't hang out in here 00:41:17 look at the servers overview page :) 00:41:35 (also, a central place will mean that as servers come and go their data isn't lost. But I don't want to emphasize that as I think making old data less available is not a bad thing) 00:41:44 not to mention half the servers have a different url scheme 00:42:36 a minor gripe haha 00:42:36 ok, yeah, a centralised server seems reasonable 00:42:57 I don't think the resource requirements are very high 00:43:17 how much disk space do all the logfiles/milestones take up Kramin ? 
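The push model being converged on here — each crawl server POSTs events one by one over HTTP to the central server, turning the N-N relationship into N-1-N — is simple enough to sketch. Everything below (the URL, the JSON field names) is illustrative, not an agreed schema.

```python
import json
import urllib.request

def make_event(server, kind, fields):
    """Serialise one milestone/logfile event as JSON (field names are illustrative)."""
    return json.dumps({"src": server, "type": kind, "data": fields}).encode()

def post_event(url, body, send=None):
    """POST a single event to the central server; `send` is injectable for testing."""
    req = urllib.request.Request(
        url, data=body, method="POST",
        headers={"Content-Type": "application/json"})
    send = send or (lambda r: urllib.request.urlopen(r).status)
    return send(req)
```

Since the volume of events is extremely low, one plain POST per milestone is plenty; the server just needs to keep its local log so failed posts can be retried.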
00:43:33 and how much when converted to the storage format your server uses 00:43:55 one big problem: this requires a restart of all crawl webservers 00:44:46 files are 35GB for all of CAO CBRO CDO CJR CKO CPO CUE CWZ CXC LLD 00:44:53 database is 57GB 00:45:09 this is just milestones/logfiles 00:45:12 no morgues 00:45:28 morgues will add a bit 00:46:04 cjr is dead btw 00:46:50 i think it depends how you implement this. You could implement it as a crawl option "run this command every time you generate a new milestone", then it would be orthogonal to webtiles/dgl 00:46:57 yeah, I still have the files from when it was alive though 00:47:20 (eg crawl runs 'handle-event.sh milestone "name=chequers:......"') 00:48:04 .. yuck :) 00:48:04 <|amethyst> Or you could have a separate process tailing the milestone file 00:48:04 yeah! it is yuck. At least dcss could be writing json to that process 00:48:04 milestones on a single server come slowly so it doesn't have to be super efficient 00:48:16 tailing will inevitably break when the log needs to be truncated 00:48:23 |amethyst: well, it would be nice to remove that file as part of this change 00:48:36 <|amethyst> aidanh: existing services also break when that happens 00:49:09 so let's not repeat that mistake :) 00:49:11 <|amethyst> scoring lost several CKR games because of that 00:49:27 yeah once this is all up and running servers might not even need any permanent storage 00:49:45 though there's still ttyrecs that I haven't considered 00:50:28 there was an unrelated discussion recently about just not storing ttyrecs forever 00:50:34 yeah 00:50:57 I feel like if players care maybe they get a window where they can download it 00:51:10 and then delete it after 3 months or so 00:52:00 then someone can make a service where you can register interesting ttyrecs to store them longterm :) 00:52:07 servers will still need to store scores to disk in case the post fails 00:52:17 true 00:53:33 I wonder if it's possible to store milestones in 
the crawl save 00:53:35 then the window for losing milestones due to crashing is minimized 00:53:46 if you build the master service right, it will only go down once every couple of years. Data loss is acceptable in that case if it lets you avoid the complexity of handling that case IMHO 00:54:36 actually, it'll get restarted every time we push an update 00:54:57 so there will be a few seconds of downtime ~daily 00:55:12 the central server? deploying an update needn't cause downtime 00:56:10 eh, true, but the simplest deployment schemes have this issue 00:56:33 and network partitions do happen 00:57:44 alexjurkiewicz: you're more up-to-date with cloud providers; do any provide automatic deployments of docker images? 00:58:11 by watching the git repo 00:58:22 well, you could do that with github actions and any cloud provider 00:59:01 ah, good 00:59:01 I think for a service like this which is called very infrequently, you don't even need container compute 00:59:01 function compute would be even cheaper and more reliable 00:59:10 won't we require storage? 00:59:21 public clouds all provide serverless nosql databases 00:59:57 can we export that data if needed? 01:00:22 and how do we develop locally? 01:00:34 i was thinking of something like sqlite 01:01:07 the two big providers don't provide an "export to file" functionality, so you'd write a script to loop over all data in the DB to export the data. Are you thinking about backup, or downloading bulk data for some other purpose? 01:01:31 local development can use docker images which provide the same API 01:02:05 backup of the data could be done by anyone just using the API itself 01:02:18 the event endpoint 01:02:30 backup and anti-lockin 01:02:33 to be honest, you might not even want a database. 
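One way to make |amethyst's "separate process tailing the milestone file" idea survive the truncation problem raised earlier (the one that lost scoring several CKR games) is to track the read offset explicitly and reset it whenever the file shrinks or is replaced. A minimal sketch, assuming POSIX inode semantics:

```python
import os

def follow(path, state=None):
    """Read newly appended lines from a milestone file, tolerating truncation.

    `state` is (inode, offset) from the previous call; pass the returned state
    back in. Unlike plain tail, rotation or truncation just restarts the read
    from the top instead of silently losing events.
    """
    st = os.stat(path)
    inode, offset = state or (st.st_ino, 0)
    if st.st_ino != inode or st.st_size < offset:
        inode, offset = st.st_ino, 0  # file was rotated or truncated: start over
    with open(path, "r") as f:
        f.seek(offset)
        lines = f.readlines()
        offset = f.tell()
    return lines, (inode, offset)
```

The caller would persist the returned state alongside its last posted event, so a crash never re-reads more than one window of milestones.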
Now that I am sketching out the requirements, it might be better to write JSON data to object storage 01:03:26 I was providing the option to filter by server or milestone/game, I don't know if that is super important 01:03:32 actually, I guess a database is useful because Kramin's service does provide limited filtering 01:03:41 another factor is for crawl code to standardise on a few common tools 01:03:47 we already use sqlite 01:03:47 and date/time 01:03:58 though I haven't implemented that yet 01:04:29 but it's probably the most useful filter given that the events are unordered 01:04:30 for scoring/sequell, you don't need any filtering. For tournament, you want date and/or version. For CSDC you want the same 01:04:56 (if you just have one or the other of date/version, it's no big deal to get irrelevant events and skip them) 01:05:09 <|amethyst> scoring would want version 01:05:18 <|amethyst> so that it doesn't pull experimentals or forks 01:05:52 you can check for that in what you get and throw it out, as long as it's not so high volume that it's super inefficient 01:05:53 yeah, that would be a nice to have 01:06:51 being able to just get events from a date range is critical I think 01:07:00 Kramin: what's your repo again? 01:07:22 https://github.com/Kramin42/Crawl-Log-Api-Nodejs and docs at https://github.com/Kramin42/Crawl-Log-Api 01:07:58 and server filter is good for single-server scoring sites/tourneys 01:08:19 though still somewhat nice-to-have 01:09:14 filtering by game/milestone is pretty good for people just wanting to aggregate game stats 01:10:13 if all the above filters are worth supporting, you would need to go sql for the storage backend rather than nosql. nosql would let you have a couple of queries only 01:11:10 I think date filter is the only really critical one, just due to the ever-growing number of events 01:11:42 yeah. 
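Since the consensus above is that the date-range filter is the critical one (server and milestone/game filters being nice-to-haves), here is what that looks like against sqlite, which the discussion notes crawl already uses. The table layout and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event (src TEXT, kind TEXT, time TEXT, data TEXT)")
conn.executemany("INSERT INTO event VALUES (?, ?, ?, ?)", [
    ("cao", "milestone", "2020-05-01T00:00:00", "{}"),
    ("cwz", "death",     "2020-05-18T05:15:17", "{}"),
    ("cpo", "milestone", "2020-06-02T12:00:00", "{}"),
])

def events_between(conn, start, end, src=None):
    """Date-range filter (the critical one per the discussion); server filter optional."""
    q = "SELECT src, kind, time FROM event WHERE time >= ? AND time < ?"
    args = [start, end]
    if src:
        q += " AND src = ?"
        args.append(src)
    return conn.execute(q + " ORDER BY time", args).fetchall()
```

ISO-8601 timestamps compare correctly as strings, so a plain index on `time` serves the one query every consumer needs; the extra filters can be bolted on the same way if they prove worth supporting.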
if you are running csdc and lose your place, you don't want to download every event since the beginning of time 01:11:54 exactly 01:15:49 !lg * x=gid 01:15:53 11797300. [game_key=DolRinEe:cwz:20200411093750S] DolRinEe the Bringer of Light (L27 FoFi of The Shining One), slain by an electric golem on Zig:16 on 2020-05-18 05:15:17, with 1683760 points after 173147 turns and 10:50:53. 01:22:43 javascript? in my project? hell no 01:23:35 who said in your project 01:24:00 :P 01:24:24 this is a python shop 01:24:32 we don't allow your kind here 01:24:44 (ignore the perl) 01:29:02 could probably rewrite in python using the python async stuff now 01:29:34 well I'm purely joking, obviously we use javascript for the client. python does seem a bit more natural for our infrastructure stuff though 01:30:18 not worth not moving ahead if we have a clear path for improvement; I haven't read all of the scrollback 01:30:44 yeah I'm musing since I tend to go for python where possible 01:35:40 I like js promises though 01:36:12 python has futures now which is similar 01:37:59 I agree that we should use python (3 only) 01:38:48 3.8+ only :) 01:40:01 Kramin: yeah, asyncio is using Futures under the hood now 01:40:35 although you don't really use Tasks (the subclass of Futures that async wants you to use) directly a lot these days, you just use async/await most of the time 01:40:53 in terms of needing to work directly with the Task objects themselves 01:41:25 maybe advil could give this thing a scoring.dcss.io subdomain 01:41:29 I say that having only made various relay bots; maybe it comes up a lot more in practice for some kinds of projects 01:42:05 oh sometimes you just want to fire-and-forget a task 01:42:48 well, could do, but we could also get a hostname setup at develz.org; one was actually made for the new scoring project alex and kramin worked on for a while 01:43:20 as long as that doesn't imply deployment on the same server (which it shouldn't) 01:43:25 
http://scoring.crawl.develz.org/ 01:43:38 no, this was a DNS A record 01:43:54 I think to just an instance alex ran 01:44:13 yeah it was 01:45:37 could have api.crawl.develz.org maybe 01:46:09 and then use the scoring one for if that project gets picked up by someone 03:32:14 Fork (bcrawl) on crawl.kelbi.org updated to: 0.23-a0-3079-g67c3e499a2 03:45:56 Fork (bcadrencrawl) on crawl.kelbi.org updated to: 0.22.1-2695-g0cbfb1426b 06:59:34 Potion of stabbing + Gozaq gold distraction not working correctly https://crawl.develz.org/mantis/view.php?id=12265 by Le_Nerd 08:22:53 Aidan Holm * 0.25-a0-1120-gd744090: Remove unused member (4 days ago, 3 files, 0+ 5-) https://github.com/crawl/crawl/commit/d7440908826e 08:46:35 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1120-gd744090882 (34) 09:12:59 here's a very quick poc sketch of a serverless central event storage service https://github.com/alexjurkiewicz/dcss-central-event-store 09:13:57 it lets servers submit events to it, and gives users access to read events. There's no filtering of events or pagination (both would be required in a real solution as discussed) 09:21:59 this looks nice 09:25:55 dynamodb is a bit of an odd choice though 09:39:28 alexjurkiewicz btw did you remember that you have write access to dcss_tourney? 09:40:34 not that there's anything wrong with PRs 12:17:15 |amethyst looks like when running multiple instances of the webserver it does prevent you from playing multiple copies of the same save at once, and the stale pid killing stuff works as expected. 
The lobby game list is not detecting games from the other instance though (which is probably ok) 12:18:31 <|amethyst> huh 12:19:05 oh, maybe that needs some extra config I don't have locally 12:19:05 that's probably it 12:19:07 I also had to override the pidfile check 12:20:51 advil * 0.25-a0-1121-g0f0e9d8: Add some server.py command line options, refactor (10 minutes ago, 1 file, 104+ 39-) https://github.com/crawl/crawl/commit/0f0e9d80d6c4 12:31:27 advil * 0.25-a0-1122-gd323cdc: Let --port disable config.py SSL for now (3 minutes ago, 1 file, 5+ 1-) https://github.com/crawl/crawl/commit/d323cdc54f78 12:32:13 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1121-g0f0e9d80d6 (34) 12:47:04 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1122-gd323cdc54f (34) 13:00:28 Unstable branch on crawl.akrasiac.org updated to: 0.25-a0-1122-gd323cdc (34) 13:35:16 advil * 0.25-a0-1123-gc510c91: Fix some issues with lobby template timing (2 minutes ago, 1 file, 19+ 5-) https://github.com/crawl/crawl/commit/c510c915c70c 13:38:56 ugh I just realized that mysql 5.5 means no window functions :< 13:39:46 Unstable branch on crawl.akrasiac.org updated to: 0.25-a0-1123-gc510c91 (34) 13:42:25 what's the version of sqlite on cdo 13:44:35 3.8.7.1 2014-10-29 is the only thing that autocompletes 13:44:53 seems lightly aged 13:45:47 also no window functions 13:47:22 is support for ansi sql 2003 too much to ask 13:47:22 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1123-gc510c915c7 (34) 13:47:24 it appears so 13:47:35 we have c++11 on cdo! 13:48:28 you could ask n.apkin if it's possible to upgrade? 
may depend on if anything else is using mysql there 13:48:38 such as the (lol) minecraft server that is running 13:49:27 (he is running like 10 domains of this server, I guess) 13:50:34 |amethyst ah so what happens in practice seems to be that webtiles can't connect to a socket that is being used by another webtiles server 13:52:03 File "/chroot/crawl-master/webserver/connection.py", line 109, in send_message error: [Errno 111] Connection refused 13:53:14 can't tell if that is ignorable (I killed the process before much happened) 13:54:22 <|amethyst> aha, yeah, I guess that would be a problem 13:55:57 I'm a little surprised it didn't crash on self.socket.bind(self.socketpath) 13:56:22 !tell Napkin would it be possible to upgrade either mysql or sqlite on cdo to a version that supports window functions (mysql>=8.0, sqlite >= 3.25)? 13:56:22 ebering: OK, I'll let napkin know. 14:30:09 RojjaCebolla https://github.com/crawl/crawl/pull/1414 * 0.25-a0-1123-g3eb8617: Further randart name tweaks (33 hours ago, 3 files, 23+ 7-) https://github.com/crawl/crawl/commit/3eb8617f3e59 14:51:41 advil * 0.25-a0-1124-g8f32fae: Fix a typo (34 seconds ago, 1 file, 1+ 1-) https://github.com/crawl/crawl/commit/8f32faedd04a 14:55:08 this doesn't seem to actually be making xom act, but it's compiling and running just fine: xom_acts(10, MB_TRUE); 15:00:37 <|amethyst> cebolla: might be the tension check, see xom_choose_action (the "Make good acts at zero tension less likely") 15:01:00 <|amethyst> cebolla: you could pass in a positive value for the tension parameter perhaps 15:01:49 if you're running under very controlled circumstances it's also possible that the rng is doing the same thing each time 15:02:04 <|amethyst> the thing is, a lot of good Xom effects don't do much if there are no monsters around 15:02:13 ah that's a good point 15:02:28 ah, that worked 15:02:30 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1124-g8f32faedd0 (34) 15:03:00 i tried (200, 
MB_TRUE, 200) and it teleported the centaur to a random spot 15:03:05 thanks xom! 15:03:40 heh 15:03:55 now that's the kind of xom flavor my note was looking for :-P 15:04:12 (@rawlins = me) 15:04:59 apparently! 15:07:28 not sure that makes faded xom feel a bit more like other faded god results, but could try three times, or pick from a tighter list of effects 15:07:48 yeah, if you did go with a custom event list it would be fair to exclude that one 15:08:45 in fact, it probably should be excluded 15:15:55 i guess it was a "good" teleport in that it deposits player in a safe area, increasing map knowledge 15:16:04 but now ya gotta find the stairs 15:16:14 *get back to 15:30:33 advil * 0.25-a0-1125-gc162733: Add a debug mode for live-testing server updates (2 minutes ago, 1 file, 18+ 1-) https://github.com/crawl/crawl/commit/c1627330ba83 15:30:53 ah, even so, yeah, I don't think that's something that should be a possibility for taking a faded altar 15:32:20 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1125-gc1627330ba (34) 15:35:05 Unstable branch on crawl.akrasiac.org updated to: 0.25-a0-1125-gc162733 (34) 15:41:13 adding a debug mode but no wizard mode smh 15:43:22 this is weirdly flaky 15:44:11 oh, real webtiles is probably seeing these games 15:45:01 I wonder how I could prevent that 15:49:25 also, why is running this messing up my terminal 16:31:55 advil * 0.25-a0-1126-g5fd7b56: Fixes for live-debug mode (66 seconds ago, 3 files, 21+ 11-) https://github.com/crawl/crawl/commit/5fd7b56c6021 16:39:15 Unstable branch on crawl.akrasiac.org updated to: 0.25-a0-1126-g5fd7b56 (34) 16:47:19 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1126-g5fd7b56c60 (34) 16:47:28 I think I got it working with now only light weirdness 17:05:59 advil: if you're happy for me to push to master directly I will, but I prefer to work with submitting PRs for any multi-person project 17:07:10 +aidanh | dynamodb is a bit of an odd choice though <-- it's 
basically free. There aren't many alternatives for storing data like this 17:52:30 alexjurkiewicz is the cpo webtiles config in git somewhere that I could look at? 17:53:02 I'm just hitting all these little ways my assumptions were wrong under dgamelaunch-config, so wanted to also eyeball yours... 17:53:15 nope, want me to pastebin it? 17:53:44 if you don't mind sharing (would also be useful to see your yaml assuming you are using that) 17:54:25 https://crawl.project357.org/static/config.py 17:55:19 and https://crawl.project357.org/static/10-trunk.yaml 17:56:59 thanks 17:57:08 is `/var/dcss/builds/dcss-trunk/latest/bin/dcss` an actual binary? 17:57:36 yes 17:57:54 $ ls -ld /var/dcss/builds/dcss-trunk/latest 17:57:57 lrwxrwxrwx 1 root root 21 May 18 20:45 /var/dcss/builds/dcss-trunk/latest -> 0.25-a0-1126-g5fd7b56 17:59:53 ok thanks 18:00:10 my #1 wrong assumption at like every step of the way has been that any server except yours ever calls a binary directly 18:02:11 i thought none of the rest did 18:02:32 correct 18:03:10 wow naming the binary with just the version string, bold 18:03:20 i can put in a no-op bash script if it helps 8) 18:03:31 guess cpo can never host nethack 18:03:33 pls don't 18:03:49 this script crashes if it doesn't have at least 2 arguments 18:04:10 I also think under some circumstances I haven't diagnosed it is actually running crawl with old binaries and bad arguments 18:04:45 on the tourney website front, I've begun the visual rework. here's old & wip. Uses bootstrap of course https://imgur.com/a/cxSpJ0Y 18:05:22 looks good! 18:05:22 can I hire you for scoring 18:05:27 ask me in 2 weeks 18:05:32 haha 18:06:31 though I've grown fond of the pale yellow in a kind of stockhausen syndrome way 18:06:43 cebolla is now a bloatcrawl 2 dev 18:12:02 did we ever figure out how to display all those banners? 
18:12:20 I seem to recall the old scoring rework going with some sort of "tiling" approach where it just showed them in a large block 18:12:24 but visually it didn't look great 18:12:34 maybe we'd need to redo the banner images 18:16:58 yes, the last attempt just laid them out horizontally, as wide as the page would allow. It looked alright, but image reworks would help 18:23:51 Unstable branch on underhound.eu updated to: 0.25-a0-1126-g5fd7b56c60 (34) 19:00:53 advil * 0.25-a0-1127-gb10990e: Correctly handle pre_options for json calls (2 minutes ago, 2 files, 36+ 13-) https://github.com/crawl/crawl/commit/b10990eb69c9 19:06:43 Unstable branch on crawl.akrasiac.org updated to: 0.25-a0-1127-gb10990e (34) 19:09:44 advil * 0.25-a0-1128-gfd937d7: Obvious bugfix (84 seconds ago, 1 file, 1+ 1-) https://github.com/crawl/crawl/commit/fd937d71834d 19:12:49 Unstable branch on crawl.akrasiac.org updated to: 0.25-a0-1128-gfd937d7 (34) 19:12:54 Proposal for centralised event tracking service https://github.com/crawl/crawl/issues/1415 19:13:37 I wrote this down here so we can come back to it in a month. Since I think a lot of people are busy atm. cc Kramin aidanh |amethyst gammafunk 19:17:09 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1128-gfd937d7183 (34) 19:17:34 yeah, I think we're all going to be pretty focused on the tournament and release over the next two weeks 19:20:01 good progress though, I'm almost able to give a complete ranking based on the new scoring system for individual players 19:25:51 how are you testing btw? Are you running a test environment on CDO? 19:26:59 "underlyingly" good word advil 19:27:39 heh did that slip out? 19:27:46 that is basically linguistics terminology 19:28:20 I usually try to delete that when I write it outside of that context (https://en.wikipedia.org/wiki/Underlying_representation) 19:29:29 is 'def binary_key(g):' meant to return something? 
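The `binary_key` question above is pointing at a classic Python footgun: a function that falls off the end returns `None` silently, so the bug only surfaces wherever the key is later used. The real function isn't shown in the log; this pair is a purely hypothetical reconstruction of the shape of the mistake.

```python
def binary_key_broken(g):
    # computes a value but falls off the end -- every call silently returns None
    tuple(g["version"].split(".")[:2])

def binary_key(g):
    # the presumable intent: actually return the computed key
    return tuple(g["version"].split(".")[:2])
```

This is the kind of bug a quick `assert result is not None` (or a type checker) catches immediately, versus "I did find it, after quite some time" below.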
19:35:55 thx alexjurkiewicz 19:38:03 wrt pale yellow, i find stark white a little intense on my eyes. pale yellow is reminiscent of spreadsheets, but not awful 19:45:26 only advil and I (and I think neil?) have access to CDO, so he's probably doing a local test 19:47:17 alexjurkiewicz yeah that is a mistake 19:47:54 I did find it, after quite some time 19:48:26 there are other bugs in that commit too if you can spot them ;-) 19:49:21 |amethyst yet another server question: if hypothetically (*cough*) cao were using the same inprogress_path for both 0.23 and 0.24, do you know what would happen? 19:50:08 it looks like you'd have to work at it to get something bad to happen? 19:50:57 it looks like dgl is probably using the correct directory, just not webtiles 19:56:56 i suppose if a player started a game in 0.23 and 0.24 from two different windows at the same second, things would get bad 19:57:06 advil * 0.25-a0-1129-ga7b400a: Even more save info fixes / tweaks (51 seconds ago, 2 files, 5+ 2-) https://github.com/crawl/crawl/commit/a7b400a13557 19:57:28 yeah, it would have to be at the same time I think? 19:57:28 though maybe I don't know how ttyrecs are named 19:57:41 whew, I think that is finally enough to get this fully working on cao 19:57:47 once the server restarts 19:58:59 so that --live-debug mode I added is pretty handy! 19:59:28 although I still never figured out why it was doubling my logs 20:00:22 iirc ttyrecs don't encode anything about the version played in the filename 20:00:30 !lg . -ttyrec 20:00:33 834. 
gammafunk, XL5 BaFE, T:3020: http://crawl.akrasiac.org/rawdata/gammafunk/2020-04-13.00:32:22.ttyrec.bz2 20:01:29 yeah it's just time 20:01:59 I guess it's been like this since the 0.24 release and I'm not aware of any problems, so it can wait 20:02:14 Unstable branch on crawl.kelbi.org updated to: 0.25-a0-1129-ga7b400a135 (34) 21:05:57 alexjurkiewicz: I'm testing on a local copy 21:06:13 where I have sql that supports ansi sql 2003 21:06:34 in the hopes that n'apkin can provide an updated mysql or sqlite 21:11:10 don't think it'll be easy to upgrade CDO mysql as napkin is using it for other purposes 21:11:24 sqlite might be more possible 21:11:32 I can ask about both though 21:11:38 https://i.redd.it/lbsdjdz09kz41.jpg 21:12:34 CDO sqlite3 version is 3.8.7.1 21:24:14 gammafunk: I sent 'kin a tell, the needed feature is window functions 21:25:01 ebering: recommend you ask 'kin on discord, he's on the roguelikes server as Napkin 21:25:01 he's more likely to see your message that way 21:25:13 a PM should be fine; that's how we chatted last 23:02:13 advil: does this look more appropriate? https://paste.ubuntu.com/p/9Qwh4vv3Km/ 23:06:39 I wonder what the difference in items is like between XOM_GOOD_RANDOM_ITEM and XOM_GOOD_ACQUIREMENT 23:07:17 good random item seems to give mostly things like coins, ammunition, rations 23:08:07 a lot of the effects in that list are ones of questionable utility (detect items, potion, mapping) while others are more strategic (items, ally). That feels inconsistent -- not much different to a completely random action 23:10:45 hmm. complete random didn't work out because xom might whisk you to a "safe" area that is nonetheless dangerous because you need to explore/fight your way back to known territory 23:15:48 why is that bad? 
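The window-function requirement from the !tell to Napkin earlier (mysql >= 8.0 or sqlite >= 3.25) is easy to demonstrate concretely. The snippet assumes the local sqlite is new enough; the ranking query itself is only an illustration of the SQL:2003 feature that CDO's mysql 5.5 and sqlite 3.8.7.1 lack.

```python
import sqlite3

# window functions arrived in sqlite 3.25; CDO's 3.8.7.1 predates them
assert tuple(map(int, sqlite3.sqlite_version.split("."))) >= (3, 25)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE games (player TEXT, score INTEGER)")
conn.executemany("INSERT INTO games VALUES (?, ?)",
                 [("a", 100), ("a", 300), ("b", 200)])
# RANK() OVER (...) computes a per-row ranking in one pass, no self-join needed
ranked = conn.execute("""
    SELECT player, score, RANK() OVER (ORDER BY score DESC) AS rnk
    FROM games ORDER BY rnk
""").fetchall()
```

Without window functions, the same ranking needs a correlated subquery or a self-join per row, which is exactly the kind of workaround the scoring rewrite was trying to avoid.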
23:17:52 that just makes faded-xom even more frustrating 23:20:43 a suggestion 23:20:55 it's a bit more subtle, but you could have faded-xom set xom's mood meter to maximum 23:25:29 admittedly i don't understand xom's mood too well... got a TrCK win, but all i was really sure about is that if i run around for awhile at high tension there's a good chance xom will help out. he's got 3 different mood meters, right? how bored he is, how much he likes you, and tension? 23:25:55 maybe setting them all high is enough. tho that means he'll just do a random good action later? but that one might prove to be really timely, especially if players like, go back and fight sigmund while xom's in the mood 23:26:02 tension isn't a mood meter so much as a variable that influences xom actions 23:26:16 it's computed dynamically based on what's visible so you can't set it 23:27:17 every time xom acts, mood and interest are randomised iirc, so setting a high mood/interest would only result in one good action 23:35:11 <|amethyst> ebering: xom_acts does let you override tension through a parameter, though 23:36:11 New branch created: pull/1416 (1 commit) https://github.com/crawl/crawl/pull/1416 23:36:11 Quipyowert2 https://github.com/crawl/crawl/pull/1416 * 0.25-a0-1129-g0a8e893: Pass some parameters by reference (15 minutes ago, 11 files, 46+ 46-) https://github.com/crawl/crawl/commit/0a8e893a68a0 23:47:51 <|amethyst> hm, with a C++11 compiler I expect several of those changes result in copying when call-by-value would be able to use a move. Specifically for the cases where the function or constructor is called with an rvalue 23:49:07 <|amethyst> (and where it uses the parameter to initialize something else, e.g. the constructors) 23:52:59 kate- * 0.25-a0-1130-g3926558: Fix a broken alphashops entry (#12262) (10 minutes ago, 1 file, 1+ 1-) https://github.com/crawl/crawl/commit/3926558f3a78