00:01:13 -!- Zaba has joined ##crawl-dev 00:03:00 Unstable branch on crawl.develz.org updated to: 0.9.0-a0-344-g1e30ecf (32) 00:17:58 Windows builds of master branch on crawl.develz.org updated to: 0.9.0-a0-344-g1e30ecf 00:40:38 -!- gvdm has quit [Ping timeout: 250 seconds] 01:14:07 -!- gvdm has joined ##crawl-dev 01:20:35 -!- Zaba has quit [Ping timeout: 264 seconds] 01:26:38 -!- Zaba has joined ##crawl-dev 01:35:11 moin 01:51:46 -!- monqy has quit [Read error: Connection reset by peer] 01:51:57 -!- monqy has joined ##crawl-dev 01:58:14 -!- gvdm has quit [Remote host closed the connection] 01:58:27 -!- gvdm has joined ##crawl-dev 02:10:45 -!- eith has quit [Ping timeout: 260 seconds] 02:14:30 -!- OG17 has quit [Read error: Connection reset by peer] 02:29:12 -!- Zaba has quit [Quit: maintenance] 02:49:52 -!- desciero has joined ##crawl-dev 02:50:00 How can I view my local machine's high scores? 02:56:07 -!- ortoslon has joined ##crawl-dev 02:59:21 -!- ortoslon has left ##crawl-dev 03:05:59 desciero: I can't find a command to simply display your high scores from the main menu, so aside from starting and ending a new game your best bet is to look at the "scores" file in the saves folder 03:08:32 -!- syllogism has joined ##crawl-dev 03:50:58 desciero: run the console version with --help 03:51:17 there are 3 options to show the scores with different verbosity 03:51:58 the tiles version can probably do that too - not sure it handles --help properly though 04:26:46 uh oh, the 0.9-a1 tag wasn't pushed 04:32:26 let me know when you do, because we need to adjust the dev-builds/trunk website then 04:35:11 and dzien dobry, kilobyte! 04:38:24 Napkin: guten morgen! 04:38:40 :D 04:38:52 (nitpick: it's "dzień", with a ń :p) 04:39:08 yeah..
sucky keyboard here ;) 04:39:31 can't compose it 04:39:52 it's a pity the compose key is not bound by default 04:40:36 like, on regular keyboards, there's that "Menu" key that I haven't used ever save for trying to see if it works 04:43:20 indeed 04:43:42 i'm using the windows flag for my wm shortcuts, but not the menu key.. should actually bind that 04:44:17 omg, xfce has bound it to to show right-click menu in terminals :D 04:44:35 compiz has some nifty functions that use the Win key, but they're still quite rare 04:44:49 well, isn't that the intended purpose? 04:45:17 probably - but who cares! ;D 04:45:24 indeed 04:45:33 i'm having a go at the webtiles finally, btw 04:45:46 the ones from Florian Diebold 04:46:19 now I need to be careful with DESTDIR & prefix - since it's not chrooting the crawl binaries 04:46:22 unless he updated it a lot, they're really, really primitive and non-functional yet 04:46:36 or did he without making much noise? 04:46:44 his posting suggested something totally different 04:47:00 using websockets 04:47:34 using tileweb.cc to output javascript 04:47:42 -!- monqy has quit [Quit: hello] 04:47:50 and a python server to serve that via those websockets 04:48:37 yeah... the last announced version just showed a tiny map and two critters in the left upper corner 04:50:27 kilobyte? do you think leaving DESTDIR empty and setting prefix=/srv/webtiles bin_prefix=/bin SAVEDIR=/${GAME}-${REVISION}/saves DATADIR=/${GAME}-${REVISION} will work? 04:51:41 sounds good... not sure if you want the games in / and binaries in /bin, though 04:52:09 prefix is not added in front of bin_prefix, SAVEDIR and DATADIR? 04:53:28 lemme check 04:54:05 thanks 04:54:18 03kilobyte * rac4ce1cfc265 10/crawl-ref/source/tilepick.cc: Use the hill giant tile for detected giants. 04:54:24 03kilobyte * rd804caca3cd2 10/crawl-ref/source/files.cc: Declare deletion of save files as "safe". 
04:54:25 it uses the regular path resolution -- like if you were in a directory 04:54:35 ok 04:55:22 with prefix=~/ignacio, "/bin" means real /bin but "bin" is ~/ignacio/bin 04:55:31 yeah, i understood 04:55:50 added the full patch to prefix_bin, SAVEDIR & DATADIR 04:55:56 *path 04:56:56 just tried out the new knifeless butchering, and though I like it it's slightly disorienting 04:56:57 just don't start them with a slash 04:57:01 heh, you were wrong, kilobyte 04:57:02 cp crawl-web-97e0f46 /srv/webtiles/srv/webtiles//srv/webtiles/bin/ 04:57:10 er, wut? 04:57:29 ifeq ($(filter /%,$(DATADIR)),) 04:57:29 #relative DATADIR 04:57:29 ifneq ($(prefix),) 04:57:29 override DATADIR := $(strip $(prefix))/$(strip $(DATADIR)) 04:57:29 endif 04:57:31 endif 04:58:27 interesting behaviour 05:00:16 with prefix=${DESTDIR} bin_prefix=${DESTDIR}/bin SAVEDIR=${DESTDIR}/${GAME}-${REVISION}/saves DATADIR=${DESTDIR}/${GAME}-${REVISION} 05:01:06 with DESTDIR (shell variable only, not defined in make), the bin landed in /srv/webtiles/srv/webtiles/srv/webtiles/bin 05:01:15 hrm, sister's cat is here (stored with me Easter..May 1-3 holidays), climbed the keyboard and demanded a long petting session 05:01:17 and saves/data in /srv/webtiles/srv/webtiles/ 05:01:22 fun unless you want to do something :p 05:01:54 yes :D 05:02:19 make didn't read the environment variable DESTDIR.. did it? i didn't use make -e 05:04:43 s/DESTDIR/PREFIX/ inside the update scripts 05:05:03 wow, it did! 
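[editor's note] The Makefile fragment quoted above only prepends `prefix` when the path is relative: `$(filter /%,$(DATADIR))` matches absolute paths, which are then left alone. A minimal Python mirror of that resolution rule, purely for illustration (the function name is invented here):

```python
def resolve(prefix, path):
    """Mirror of the quoted Makefile rule: a path starting with '/'
    is taken as-is; a relative path gets a non-empty prefix
    prepended."""
    if path.startswith("/") or not prefix:
        return path
    return f"{prefix.strip()}/{path.strip()}"

# Examples matching the session's observations:
# prefix=/a/b/c DATADIR=/moo -> /moo
# prefix=/a/b/c DATADIR=moo  -> /a/b/c/moo
```

This is also why `bin_prefix=${DESTDIR}/bin` together with `prefix=${DESTDIR}` doubles up the directory: the absolute-path exception only fires for paths that begin with a slash at the point the Makefile resolves them.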
05:05:11 cp crawl-web-97e0f46 /srv/webtiles//srv/webtiles/bin/ 05:05:20 prefix=/a/b/c DATADIR=/moo makes it /moo 05:05:36 prefix=/a/b/c DATADIR=moo makes it /a/b/c/moo 05:05:43 so it seems to work 05:05:44 that's not the point, kilobyte 05:05:51 ok, sorry, confused 05:06:03 if you have a clue, don't lose time on me right now 05:06:05 make read the DESTDIR environment variable without using make -e 05:06:10 -!- Zaba has joined ##crawl-dev 05:06:38 and, "prefix" is added to the front of "prefix_bin", even if paths are specified as absolute 05:06:55 but ok, figured a way to do it now :) 05:07:53 I see what's wrong 05:08:01 don't touch it! ;> 05:08:54 mmmkay, it's only 12 at the crack of dawn, not fully conscious yet :p 05:09:04 hehe 05:09:25 cp crawl-web-97e0f46 /srv/webtiles//bin/ 05:09:34 mkdir -p /srv/webtiles/crawl-web-97e0f46/dat/des 05:09:43 mkdir -p /srv/webtiles/crawl-web-97e0f46/saves 05:09:48 tadaaaa! 05:10:15 -!- Zaba has quit [Client Quit] 05:11:50 -!- Zaba has joined ##crawl-dev 05:13:31 ok, crawl finally installed the way it's required 05:13:33 moin Zaba 05:13:42 python webserver in place too 05:13:50 let's check this tornado framework 05:16:03 what's up 05:16:19 coffee! 05:16:34 and i'm checking out Florian Diebold's "WebTiles" 05:23:14 heh, python-tornado version 1.0.1 from squeeze doesn't have websocket support 05:23:21 when you're done, should I start testing with Fennec 4 first, or directly with elinks? 05:23:32 nor do stable browsers 05:24:24 Firefox/Fennec 4 have it but only if you set a config flag that enables insecure stuff 05:24:26 some browsers implement websocket but disable it because it's insecure by design 05:29:12 is there a ps or pstree command, that shows me all child processes (and their children) of a certain pid?
05:29:23 *ps or pstree parameter 05:39:37 -!- gvdm has quit [Remote host closed the connection] 05:40:03 -!- gvdm has joined ##crawl-dev 05:42:25 -!- edlothiol has joined ##crawl-dev 05:43:49 what would you want from the processes? 05:44:41 /proc/*/stat has the pid and ppid, it'd be enough to follow the tree 05:44:55 so the question is, what else do you want to display 05:44:59 using pstree now 05:45:22 i want to see children of children 05:45:32 without writing a script, kilobyte ;) 05:47:21 I do suck at perl, needing several lines to do what could be done in one, but I don't get why would you hate scripts if it's unobvious (or impossible) to do something without 05:47:39 far from it 05:47:42 i love scripts 05:47:47 i just have no time at the moment 05:48:14 using 05:48:15 watch -n1 'pstree -a -p -n 3686' 05:48:16 for now 05:50:47 you know.. focussed work trying to get webtiles working - without trailing off for 10 minutes writing a script just because i'm curious what processes and how many will be spawned ;) 05:53:48 oh, it indeed works 05:54:18 so does the webtiles 05:54:21 *do 05:54:40 o rly? What's the URL, let us prove you wrong! 05:55:08 will send in query, because it's a full-fledged un-dgl'ed version 05:55:35 ah 05:55:46 ok, it can be secured later 05:56:22 got the address? 05:56:31 i'm running chromium 05:56:39 map is not showing 05:56:48 but everything else looks very good 05:56:57 no inventory 05:57:00 no mouse-support 05:57:42 but even mini-healthbars work fine 05:57:51 and have patience, takes a while until it starts 05:57:59 holy crap... 
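[editor's note] kilobyte's suggestion above (pid and ppid from /proc/*/stat are enough to follow the tree) can be sketched in a few lines of Python. The function names are invented for this sketch; note that the reader is Linux-only, and that the parenthesised comm field may itself contain spaces, hence the `rsplit(")")`:

```python
import os

def read_ppid_map():
    """Build {pid: ppid} from /proc/*/stat (Linux only)."""
    ppid_of = {}
    for entry in os.listdir("/proc"):
        if not entry.isdigit():
            continue
        try:
            with open(f"/proc/{entry}/stat") as f:
                stat = f.read()
        except OSError:
            continue  # process exited while we were scanning
        # Split after the closing ')' of the comm field; the ppid is
        # then the second remaining whitespace-separated field.
        fields = stat.rsplit(")", 1)[1].split()
        ppid_of[int(entry)] = int(fields[1])
    return ppid_of

def descendants(ppid_of, root):
    """All children, children of children, etc. of `root`."""
    found, frontier = [], [root]
    while frontier:
        pid = frontier.pop()
        kids = sorted(p for p, pp in ppid_of.items() if pp == pid)
        found.extend(kids)
        frontier.extend(kids)
    return found
```

As the log notes, `watch -n1 'pstree -a -p -n <pid>'` gives the same view interactively without any scripting.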
05:58:02 5s black screen at the beginning 05:58:09 it works 05:58:13 no delay for me 05:58:19 very fast, actually 05:58:44 works FASTER than native tiles 05:58:48 hahaha 05:59:07 ok, i have to hurry with shopping 05:59:16 autofight on Tab doesn't work and the keyboard focus gets stolen constantly, but that's fixable 05:59:43 source is in ~crawl/source/crawl-webtiles 05:59:49 I’d love tiles that are just the same textual symbols as in the terminal version. The benefit would be that they are perfectly square, and actual text can be rendered with another font. 06:02:32 link? 06:03:05 don't make it public yet though, due 06:03:10 sure 06:03:13 no security yet 06:03:14 PM me? :) 06:03:18 i did 06:03:20 oh 06:03:20 via notice 06:03:30 "feh notices" 06:03:52 ok, 4 players already 06:04:02 hogging one cpu up to 80% 06:04:14 jesus christ 06:04:16 but seldomly 06:04:25 ok, gotta go 06:04:26 o/ 06:04:31 too huge for my screen 06:04:34 but *god* that is nice 06:05:35 where's the source code for it? 06:08:03 -!- edlothiol has left ##crawl-dev 06:08:15 in Firefox 6 trunk, it is noticeably faster than Chromium (which already was much faster than native), but there's a delay upon loading it the first time. Just once, though. 06:08:32 chromium dev is quite fast 06:08:52 -!- edlothiol has joined ##crawl-dev 06:09:07 -!- edlothiol has left ##crawl-dev 06:09:15 lemme try it in Fennec 4 06:10:55 lol 06:11:06 Napkin: i thought you left :) 06:11:07 anyone tried to save their game yet? 06:11:13 out of shower 06:11:50 -!- edlothiol has joined ##crawl-dev 06:12:20 Ohhh. It's just reading the screen? 06:12:37 Pity. 06:12:50 Cython would probably make it a hell of a lot easier to interact with the game state. 
06:12:57 no, it creates the screen in javascript 06:13:12 it's kinda like a new frontend 06:13:22 s/screen/it uses subprocess to talk to ./crawl/ 06:13:38 that's just the python part 06:13:50 well, yeah 06:14:59 direct implementation of websockets would probably be very cool, indeed :D 06:15:03 shopping! 06:15:05 o/ 06:16:48 I'm assuming you can't view other games that are in progress? 06:17:06 due: it's just normal non-DGLized crawl 06:17:16 ah 06:17:44 so it's basically a faster nettiles -- it's just reading the ascii and printing the relevant info as tiles? 06:17:46 you guys can probably make crawl write a ttyrec at the same time.. can't you? 06:18:14 Napkin: no, it has no console output 06:18:20 not yet 06:18:22 it can 06:18:42 it's just using subprocess to talk 06:18:48 and, it's also showing healthbars of monsters - so it's more than "just" the ascii version 06:18:52 ah 06:18:59 ah 06:19:11 yeah, i haven't examined the source to determine how it works 06:19:17 The code reuses quite a bit of the normal tiles code. The main part of 06:19:17 the new code is in tileweb.cc, which replaces tilesdl.cc to send tile 06:19:17 data (and the text area contents) to the client. This is simply done by 06:19:17 writing javascript to stdout, where a small python server reads the data 06:19:17 and sends it to the client via a websocket. 06:19:18 well, it uses map_knowledge rather than trying to parse the screen 06:19:19 --- 06:19:30 Ahah! 06:19:32 quote from his email "Web Tiles prototype" to CRD 06:20:01 black screen on Fennec 4 even after overriding the websockets security block 06:20:07 An even easier solution is to have the *standard* crawl interface optionally output this info into a pipe. 06:20:15 try again, kilobyte 06:20:20 i'm watching the processes 06:20:44 unfortunately that would require ... maybe not 06:20:55 well, it would require *bits* of the tiles code get merged into mainline trunk 06:21:06 i'm not sure that would be a bad thing, though.
06:21:24 no crawl-web process yet, kilobyte 06:21:49 due: it would have side effects like the fixing loss of all tile data when reading a save into console 06:22:00 s/the fixing/fixing the/ 06:22:20 btw 06:22:37 duplicating the output for watching should be trivial.. right? 06:22:59 okokok, really got to go ;) 06:25:04 installed ancient Firefox (4), it works there too 06:26:11 Fennec 4 not working is a bummer, but on an 800x480 screen it would have problems anyway 06:27:41 in general, though: adding support for transports other than websockets would be trivial 06:27:52 -!- OG17 has joined ##crawl-dev 06:28:50 and since our OpenGL code is a filthy piece of crap that is crashy, buggy and fails to work on many setups, we can even think of replacing it locally 06:30:27 Well, we needn't need all the tiles code -- but having the capacity, regardless of interface, would be excellent. 06:31:48 yeah 06:31:48 -!- Twilight-1 has quit [Read error: Connection reset by peer] 06:31:49 -!- Twilight13 has joined ##crawl-dev 06:40:18 -!- gvdm has quit [Ping timeout: 250 seconds] 06:44:38 -!- Textmode has quit [Ping timeout: 250 seconds] 06:50:37 -!- galehar has joined ##crawl-dev 07:14:57 -!- gvdm has joined ##crawl-dev 07:39:02 -!- ortoslon has joined ##crawl-dev 08:20:39 re! 08:20:52 so, anyone want to work on these webtiles? 08:24:28 yes! 08:24:33 need some help with it? 
08:24:48 -!- ZorbaBeta has quit [Read error: Connection reset by peer] 08:25:05 -!- ZorbaBeta has joined ##crawl-dev 08:27:33 -!- edlothiol has quit [Quit: edlothiol] 08:27:53 -!- edlothiol has joined ##crawl-dev 08:28:51 hi 08:28:59 i'm the author of the prototype 08:29:16 nice to see that it works on a machine other than mine :) 08:30:18 i'd of course continue to work on it if the approach is deemed worthwhile ;) 08:45:35 -!- gvdm has quit [Remote host closed the connection] 08:45:48 -!- gvdm has joined ##crawl-dev 08:46:41 -!- Jordan7hm has quit [Ping timeout: 240 seconds] 08:47:54 -!- ais523 has joined ##crawl-dev 09:24:44 -!- valrus_ has joined ##crawl-dev 09:33:37 -!- Henzell has quit [Remote host closed the connection] 09:36:39 -!- Henzell has joined ##crawl-dev 09:46:29 -!- eith has joined ##crawl-dev 10:26:21 03dolorous * rc8f860887b2d 10/crawl-ref/source/mon-data.h: Add formatting fixes. 10:46:14 -!- upsy has joined ##crawl-dev 11:17:12 hi edlothiol 11:21:02 I think the webtiles is awesome! 11:21:26 i'm getting a log or ERR though 11:23:56 oh yeah, that's just some debug logging that I forgot to remove, doesn't mean there's really an error 11:24:12 Crawl has webtiles now, just like NetHack does? 11:24:14 awesome 11:25:14 nethack is way too old to support websockets, ais523 ;-P 11:26:02 nethack has xml saves! 11:26:09 well, the japanese variant 11:26:12 edlothiol: I would need to make it more secure 11:27:11 abyssal knight trapped in D:1 (https://crawl.develz.org/mantis/view.php?id=3902) by Saegor 11:27:30 with authorization 11:28:27 yes, of course... I wasn't sure how to handle that correctly 11:28:47 we could work on that together, if you are interested? 
11:28:55 yes 11:29:46 what i'd like the requirements to be is: 11:30:00 authenticate user against sqlitedb 11:30:19 ability to specify -name, -morgue & -rc as parameter when starting crawl 11:31:52 not sure how reasonable it is to do the auth with tornado framework 11:32:52 but doing it via apache first, I wouldn't know how to safely tell the webtiles who the user is 11:33:47 due & kilobyte voiced interest too, so did galehar 11:33:53 galehar, have you tried it yet? 11:34:35 -!- bf has quit [] 11:36:26 I actually read Learning Perl from the beginning 11:36:49 until I understood how the variables work and how the syntax is to be used 11:37:20 oops, wrong channel :) 11:39:24 hm ok, I'm not sure how to transmit the user safely either 11:39:53 starting crawl with the right parameters when the username is known should be easy 11:39:55 so, I think it must be done after the websocket is opened, right? 11:42:50 I think the authentication itself could be done before. then we could set some kind of security token that the javascript can send on the websocket to show which user it is 11:44:13 oh, ok 11:44:30 for a while, i've been considering setting up a crawl site that permits user logins 11:44:38 if that token can't be faked, I think it's ok 11:45:11 it would be pretty easy to make this integrate with webtiles, so that you log in, and then launching webtiles logs you in automatically too 11:45:33 for reference, i am speaking of something similar to tome4's site, here 11:45:53 where you can view your dumps/etc. while logged out but get more features when logged in 11:46:13 (e.g., ability to add inline comments to dumps, like "i used this weapon most of the game" or whatever 11:46:23 well, as a side hobby, go ahead 11:46:35 i don't see much advantage of that 11:46:51 Napkin: have you looked at tome's site? 
11:47:15 edlothiol: problem is, Javascript and me aren't exactly close friends :D 11:47:37 for example, it can make graphs of winners and such - the kinds of queries people do on henzell, except that data then accessible to a much better viewer than IRC can ever be (and the queries can be cached!) 11:48:15 what has that to do with the "logging in" you spoke about? 11:49:07 Napkin: the way their site works is that you need to make an account to play tome online - but you don't have to log in to use the website. but, if you do log in, you have access to some features on your account 11:49:09 well the javascript side won't be very complicated, I think 11:49:16 they don't have nettiles, though 11:49:24 whereas if we have that, logging in also -> able to actually play online 11:49:37 if someone is willing to sit down and write a webinterface for shiny sql-queries and graphic stats 11:50:22 great, edlothiol 11:50:47 Napkin: basically, i could make a site to do that, the issue would be getting it integrated with other logins :) 11:51:20 i think it could be using CAO login, or using wiki/mantis login + some way to associate your game account with wiki/mantis 11:51:29 how about this: the authentication page (on apache) does its thing, then generates a random token and associates it with the username in a db; 11:51:34 the former probably less preferable, because dgl has crappy password, doesn't it? 11:52:03 then redirects to the webtiles page, sending the random token in the page, which the javascript then sends over the websocket to connect; then the python code can check the database for the username 11:52:30 edlothiol: what does webtiles do when you die, currently? does it go to a dgl-style page? 11:53:47 depends on what crawl does when you die, which i'm not sure of currently -- if crawl quits, the websocket is closed, which is currently not handled in any way 11:54:15 hmm.. 
we would authenticate against the dgl-db, create the token and save it in the dgl-db as well, then start websockets and make it search for the token to figure out the username? is that what you mean, edlothiol? 11:54:37 yes, basically 11:54:53 sounds somehow overloaded - but not sure how else to do it 11:55:05 yeah ;) 11:55:08 :D 11:55:14 edlothiol: my thought would be that you could tie an auth token to the current character, to help against stuff like session fixation 11:55:48 'Regenerate SID on each request 11:55:48 well, edlothiol - instead of executing crawl directly when someone connects to the websocket - can we start something else instead? 11:55:49 A countermeasure against session fixation is to generate a new session identifier (SID) on each request. If this is done, then even though an attacker may trick a user into accepting a known SID, the SID will be invalid when the attacker attempts to re-use the SID. ' 11:56:07 one could consider a game of crawl as similar to one 'request' 11:56:16 yes, of course 11:56:30 what are the options? 11:56:51 as long as its stdin/stdout is then redirected to crawl (or it may get a bit more complicated) 11:56:57 for example - could we maybe start a console application, which then in turn (after auth) starts the crawl game? 11:58:12 i mean - with DGL (ncurses console application), we have everything we need already. authentication, config management, savegame backup, etc 11:58:19 Napkin: it would be neat to have it go: login -> brought to logged in dgl window (which also includes spectating and such) -> when you select a branch, you get "Play in web console" and "Play in nettiles" 11:58:23 but of course, could also be something simpler 11:59:08 -!- monqy has joined ##crawl-dev 11:59:20 that would be rather complicated, since the ncurses output would be hard to handle 11:59:29 hmm... edlothiol: how does nettiles actually 'start up'? it runs on JS, but - if you have an existing websocket connection, can you transfer it? 
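[editor's note] The handshake sketched here (the auth page verifies the user against the dgl db, stores a random token for the username, and the javascript later redeems that token over the websocket) only needs a small one-time token store. An illustrative sketch; the class and all names are invented for this example, and a real implementation would persist tokens in the dgl database rather than a dict:

```python
import secrets
import time

class TokenStore:
    """One-shot login tokens: issued at auth time, redeemed exactly
    once when the websocket connects, expired after `ttl` seconds.
    One-time use is also the session-fixation countermeasure quoted
    above: a token never survives its first use."""

    def __init__(self, ttl=60.0):
        self.ttl = ttl
        self._tokens = {}  # token -> (username, expiry)

    def issue(self, username):
        token = secrets.token_urlsafe(32)  # unguessable, per `secrets`
        self._tokens[token] = (username, time.monotonic() + self.ttl)
        return token

    def redeem(self, token):
        # pop() makes redemption destructive, so a replayed or fixed
        # token is worthless after the first connect.
        username, expiry = self._tokens.pop(token, (None, 0.0))
        return username if time.monotonic() < expiry else None
```

The auth page would call `issue()` and embed the token in the redirect to the webtiles page; the python server calls `redeem()` on the first websocket message and closes the socket if it returns None.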
11:59:51 geez 12:00:15 what exactly do you mean by transfer? 12:00:22 edlothiol: are you familiar with shellinabox? 12:00:53 looked it up just now 12:01:42 i set it up earlier for multirobin (it was pointed at a multi-input screen session running ssh->CDO) 12:02:11 it was reasonably usable for some people, even though it was them->myserver->cdo 12:02:31 it doesn't currently use websockets but some people are working on having it use it instead 12:03:00 if you could use shellinabox to handle the console input, then gracefully switch over to nettiles for the nettiles part still using websockets, that would be very cool 12:03:14 er, console output... well, input too i guess :) 12:04:27 oh, neat, someone actually has a DCSS shellinabox up live: http://amuletofyendor.com/nethack/dungeon-crawl-stone-soup/ 12:04:32 I guess it would be possible 12:07:59 it'd essentially then go something like: ssh into cao, cao hits dgl, dgl can send either console output (->shellinabox) or nettiles output (->nettiles), dying in nettiles -> dgl (so back to shellinabox). 
the problem with this would be figuring out when to switch 12:08:19 but if you could figure that out, it would make setting up other kinds of authentication irrelevant 12:13:31 hmm, here's one thought: log into SIAB dgl, get the nettiles option 12:14:16 if you select it, it gives you a one-time login link :) 12:14:40 then you wouldn't even need to try and integrate them 12:14:42 well it could send some kind of magical control sequence that makes the client switch 12:15:28 yeah, i thought of that, but that could get annoying - because you have to both be able to switch, and switch back 12:15:47 but I don't really like having to login in a simulated console to play crawl in a web browser ;) 12:16:10 whereas if you output a link, this can open in a new tab - or even in the same window via js (since SIAB outputs all HTML, it can just be hidden) 12:16:28 i.e., load them both on the same page, and click -> switch which one shows 12:17:47 edlothiol: keep in mind that dgl also has spectating, RC file management, etc. 12:20:21 can someone spectate a game being played on nettiles? 12:20:32 not yet 12:33:14 -!- valrus_ has quit [Remote host closed the connection] 12:38:02 -!- gvdm has quit [Read error: Connection reset by peer] 13:22:43 -!- Mu_ has joined ##crawl-dev 13:25:59 Napkin: it seems the crawl rc files are missing on cdo? several people complained in ##crawl there and i checked myself and the file is blank 13:26:21 yeah, seems like all versions 13:26:27 some permission problem which I just saw myself 13:26:34 the files are there, just not read 13:26:36 cool 13:27:23 fixed? 13:27:23 Napkin: You have 5 messages. Use !messages to read them. 13:28:10 all good now 13:28:25 yea sorted, thanks 13:28:31 sorry for the fuss 13:28:36 i blame paxed ;> 13:47:49 -!- valrus_ has joined ##crawl-dev 13:59:11 wow, surprisingly many zot traps in swamp 13:59:11 Napkin: You have 1 message. Use !messages to read it. 13:59:13 is that new? 
13:59:16 !messages 13:59:16 (1/1) Elynae said (30m 38s ago): all okay again, thanks a lot :) 14:01:50 -!- gvdm has joined ##crawl-dev 14:02:43 in swamp's code, no, but the new rng bmh wrote is specially rigged to detect that it's you who is playing and hate you even more than the old one 14:11:54 kilobyte: i'm not sure why the webtiles don't work in fennec -- it seems that it can't connect to the websocket. my guess is (after googling a bit) that fennec still has an older websockets implementation 14:17:44 edlothiol: ah, possible 14:17:51 the standard is not even finalized 14:18:27 yes 14:18:27 nor usable -- a new standard that is a gaping security hole is no good 14:18:48 kilobyte: http? :D 14:18:51 but, we can add as many alternate transports as we want 14:19:12 indeed... although as I understand it, the security hole is not actually in the websockets protocol, but in some proxy servers 14:20:22 Eronarn: http can be MITMed by a hostile proxy, but websockets allow a page to MITM everyone who uses the same proxy as you 14:21:34 and avoiding this particular problem is not rocket surgery 14:28:35 -!- valrus_ has quit [Remote host closed the connection] 14:30:41 -!- gvdm has quit [Remote host closed the connection] 14:30:54 -!- gvdm has joined ##crawl-dev 14:57:22 -!- Zaba has quit [Ping timeout: 260 seconds] 14:58:51 -!- st_ has quit [] 15:02:41 Hm. Still can't figure out how to view highscores without starting a new game and dying. There is no saves folder to look in. Anyone know how to do this? 15:02:50 scores folder* 15:04:20 -!- st_ has joined ##crawl-dev 15:08:06 -!- Zaba has joined ##crawl-dev 15:09:26 -!- Cryp71c has joined ##crawl-dev 15:12:41 desciero: run crawl with --help 15:12:56 there are 3 options to show the scores 15:14:14 -!- Cryp71c has quit [Ping timeout: 250 seconds] 15:14:16 kilobyte? 15:15:00 oh, wait 15:17:48 kilobyte? could you please make crawl-tiles save its game when it receives a kill -HUP?
:) 15:18:23 Some monster spells don't have messages (https://crawl.develz.org/mantis/view.php?id=3903) by 78291 15:40:33 -!- valrus_ has joined ##crawl-dev 16:00:49 -!- ortoslon has quit [Ping timeout: 252 seconds] 16:31:32 -!- ZorbaBeta has quit [Read error: Connection reset by peer] 16:31:47 -!- ZorbaBeta has joined ##crawl-dev 16:36:38 Napkin: it is handled only on Unix console. Adding support to Unix tiles is trivial, but I have no clue how to do that on Windows. 16:36:54 and it's a nasty bug that aborts the game without saving 16:37:00 unix tiles is enough 16:37:14 webtiles need to be HUP'ed when the connection breaks 16:38:52 -!- Kurper has joined ##crawl-dev 16:51:41 -!- casmith789 is now known as megabat 16:59:15 edlothiol: verynice work! 16:59:34 thanks :) 16:59:58 -!- gvdm has quit [Ping timeout: 250 seconds] 17:11:32 -!- valrus_ has quit [Remote host closed the connection] 17:23:30 moin due! 17:24:17 -!- megabat is now known as casmith789 17:24:25 hi Napkin 17:26:01 how's it going? :) 17:26:57 Not bad :) 17:30:24 -!- Textmode has joined ##crawl-dev 17:38:46 -!- Textmode has quit [Quit: Ex-Chat] 17:40:04 -!- Textmode has joined ##crawl-dev 17:44:58 -!- galehar has quit [Remote host closed the connection] 18:01:08 geez 18:01:15 water moccasin & iguana in d4 18:03:12 ouch 18:03:30 lucky webtiles troll berserker ;D 18:03:45 (: 18:03:49 -!- neunon_ has joined ##crawl-dev 18:04:31 -!- neunon has quit [Quit: ZNC - http://znc.sourceforge.net] 18:04:37 -!- neunon_ is now known as neunon 18:24:22 -!- ais523 has quit [Remote host closed the connection] 18:27:33 -!- edlothiol has left ##crawl-dev 18:31:25 -!- Twilight13 has quit [Read error: Connection reset by peer] 18:31:49 -!- Twilight13 has joined ##crawl-dev 18:42:25 -!- valrus has joined ##crawl-dev 18:57:14 -!- gvdm has joined ##crawl-dev 19:05:21 -!- desciero has quit [Quit: ChatZilla 0.9.86.1 [Firefox 4.0.1/20110413222027]] 19:11:16 so.. 
I am impressed - edlothiol reduced the server-load of webtiles server.py from 0-100% down to 0-1%! 19:12:01 biggest bug left before i'll make it public: allow interruption of travel & auto-explore 19:12:21 login works via CDO's DGL db 19:12:32 nothing fancy yet - just login 19:14:53 -!- Gretell has quit [Remote host closed the connection] 19:15:01 -!- Gretell has joined ##crawl-dev 19:33:51 -!- desciero has joined ##crawl-dev 19:34:44 -!- upsy has quit [Read error: Connection reset by peer] 19:36:52 -!- upsy has joined ##crawl-dev 19:50:10 -!- Mu_ has quit [Quit: Defecator, may everything turn out okay so that you can leave this place.] 20:39:15 -!- desciero has quit [Quit: ChatZilla 0.9.86.1 [Firefox 4.0.1/20110413222027]] 20:41:26 -!- Hehfiel has quit [Ping timeout: 240 seconds] 20:43:19 -!- Hehfiel has joined ##crawl-dev 20:55:20 -!- upsy has quit [Quit: Leaving] 20:55:48 -!- syllogism has quit [] 22:08:51 -!- eith has quit [Ping timeout: 246 seconds] 22:28:39 -!- OG17 has quit [Read error: Connection reset by peer] 22:40:58 -!- OG17 has joined ##crawl-dev 22:44:27 -!- valrus_ has joined ##crawl-dev 23:00:04 -!- gvdm has quit [Ping timeout: 250 seconds] 23:10:12 -!- unexpected has joined ##crawl-dev 23:10:39 Napkin, nice! 23:12:20 !tell edlothiol Have you considered using Cython to compile Crawl as a Python "extension"? You could use C++, etc, to call the relevant set-up and then provide a series of pipes to feed data through. Subprocess is excellent, but having everything run through child processes is a pain -- and the majority of the program would still be written in and run as C++, thus not encumbering any greater speed malus than using subprocess. 23:12:21 Maximum message length is 300 characters. Eschew verbosity, Gladys! 23:12:28 Eschew annoyance, Henzell! 23:12:57 !tell edlothiol Have you considered using Cython to compile Crawl as a Python "extension"? 
You could use C++, etc, to call the relevant set-up and then provide a series of pipes to feed data through. Subprocess is excellent, but having everything run through child processes is a pain. The speed should also be excellent, and possibly better than subprocess! 23:12:57 Maximum message length is 300 characters. Eschew verbosity, Gladys! 23:13:02 Oh ffs. 23:13:08 !tell edlothiol Have you considered using Cython to compile Crawl as a Python "extension"? You could use C++, etc, to call the relevant set-up and then provide a series of pipes to feed data through. Subprocess is excellent, but having everything run through child processes is a pain. 23:13:32 !tell edlothiol The speed should also be excellent, and possibly better than subprocess! 23:13:45 due: OK, I'll let edlothiol know. 23:13:45 due: OK, I'll let edlothiol know. 23:30:48 -!- valrus_ has quit [Remote host closed the connection]