Talk:Monster (Slash'EM Extended)

Page size

<!-- 
NewPP limit report
Cached time: 20170708210703
Cache expiry: 604800
Dynamic content: false
CPU time usage: 56.440 seconds
Real time usage: 61.075 seconds
Preprocessor visited node count: 104435/1000000
Preprocessor generated node count: 281016/1000000
Post‐expand include size: 624570/2097152 bytes
Template argument size: 16276/2097152 bytes
Highest expansion depth: 4/40
Expensive parser function count: 0/100
-->

That's on accessing the cached, anonymous version of the page. When I try to access it (or any diffs) from my logged-in session, the wiki returns an HTTP 500 after about 50 seconds.

As it is, a 1.6 MiB page like this refuses to work in any sensible way. I think the better option would be to split it into multiple smaller pages.

I've reverted the page to the last revision that seems reasonably workable. Feel free to access the previous revision (if you can) and split up the information, although I'd strongly advise laying out your plans on this talk page first to avoid further friction like this incident.

Thank you! —bcode talk | mail 00:58, 9 July 2017 (UTC)

Update – it looks like all the calls to the color templates ({{black}}, {{white}} etc.) are at fault. A quick fix would be subst:ing them, but it's probably a better idea to just split up the page anyway for viewability purposes. —bcode talk | mail 22:31, 10 July 2017 (UTC)

Ideas on restructuring monsters

It looks like turning all the monsters into a single big wikitable didn't work too well, so we'll need to pick another way to list them all. Here are a few ideas I've thought of:

  • Create a page for each symbol and put a wikitable on each of them (see the sketch after this list)
  • Create a page for every 5 levels of monster, plus one page for all the level 50+ monsters, and put a wikitable on each of them
  • Combine the ideas above, as well as any other attributes we want to sort by
  • Create an offsite database and JavaScript viewer
  • Create an offsite downloadable CSV or ODF spreadsheet
  • Create an offsite CSV and JavaScript viewer
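
As a rough illustration of the split-by-symbol idea, here is a minimal Python sketch that turns a monster dump into one wikitable page per symbol. The input file monsters.csv and its columns (symbol, name, level, difficulty) are assumptions for illustration only; a real dump extracted from the game source would define its own format.

    import csv
    from collections import defaultdict

    # Hypothetical input: one row per monster, e.g. exported from the
    # game source. The file name and the columns (symbol, name, level,
    # difficulty) are assumptions for illustration.
    with open("monsters.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Group the monsters by their display symbol.
    by_symbol = defaultdict(list)
    for row in rows:
        by_symbol[row["symbol"]].append(row)

    # Emit one wikitable per symbol, sorted by monster level.
    for symbol, monsters in sorted(by_symbol.items()):
        lines = ['{| class="wikitable"', "! Name !! Level !! Difficulty"]
        for m in sorted(monsters, key=lambda m: int(m["level"])):
            lines.append("|-")
            lines.append("| {name} || {level} || {difficulty}".format(**m))
        lines.append("|}")
        # Files are named by codepoint so symbols like ':' or '&'
        # stay filename-safe.
        with open("monsters_{}.txt".format(ord(symbol)), "w") as out:
            out.write("\n".join(lines))

The output of each file could then be pasted (or bot-uploaded) into its own wiki page, keeping every individual page well under the size that caused trouble here.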

--Bug sniper (talk) 04:06, 10 July 2017 (UTC)

Splitting by glyph seems like a good idea. Having some sort of external database that can be queried for monsters would be an interesting thought as well; I used to keep one for vanilla NetHack in SQLite, which let me run queries such as "all monsters with a difficulty above 12 that don't need to breathe" (an example query, but you get the idea). You should be able to hack the source into providing the required data automatically. —bcode talk | mail 22:38, 10 July 2017 (UTC)
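
For the curious, here is a minimal sketch of such a database using Python's built-in sqlite3 module. The table layout (a monsters table with name, difficulty and breathless columns) is an assumption for illustration, not the real schema bcode used.

    import sqlite3

    con = sqlite3.connect("monsters.db")
    # Assumed schema for illustration; a dump generated from the game
    # source would define its own columns.
    con.execute("""CREATE TABLE IF NOT EXISTS monsters
                   (name TEXT, difficulty INTEGER, breathless INTEGER)""")

    # The example query from above: all monsters with a difficulty
    # above 12 that don't need to breathe.
    query = ("SELECT name FROM monsters "
             "WHERE difficulty > 12 AND breathless = 1")
    for (name,) in con.execute(query):
        print(name)

Once the data is in SQLite, any attribute the dump includes becomes queryable the same way, which sidesteps the wiki rendering limits entirely.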

Guys, get faster PCs please :P

It worked just fine on mine. The machines tested were a 2.7 GHz box running Windows XP and Mozilla Firefox, and a 3 GHz box running Windows 7 and Internet Exploder. Keep in mind that the latter frequently hangs for me when trying to view large diffs on GitHub, yet it had zero trouble with the 1.6 MB page here. --Bluescreenofdeath (talk) 14:40, 10 July 2017 (UTC)

The issue isn't on the client side; it's that generating the page on the wiki side takes forever (see the snippet I extracted above) and (at least in my case) likes to 500 out on me. If it works for you repeatedly, chances are the generated page has been cached for you on the server side. (It can't serve the same cached page to other logged-in users for various technical reasons; I'm pretty sure the anonymous view of the page has been cached, as that one worked for me in fractions of a second too, even though it took over a minute to generate.)
You'd have to ask User:dtype for the specifics, but I'm pretty sure the AWS EC2 instance running the wiki is nothing to be sneezed at (in fact, I think it was upgraded a while ago).
(Even so, I don't think we'd want to exclude people with weaker clients, but this isn't even the issue here.)
If you believe this to be an issue with MediaWiki or its specific configuration on this wiki, we could look into getting a MediaWiki person to check it out, but I suspect the wikitext engine simply isn't designed for pages of this size. In any case, the megabyte page works badly (if at all) on the server side; I prefer a reliably working page to a page people might not be able to view.
This issue applies to page diffs as well. (There's a user option to not show page content after diffs; I don't know whether this would help, but I'd rather not have everyone adjust their preferences just to be able to review edits to a single page.) That makes it hard for editors to review edits for possible vandalism or other issues, which is not much of a problem when you edit the page yourself, but is kind of bad when anonymous or newly registered editors make edits. —bcode talk | mail 21:48, 10 July 2017 (UTC)
Update: see my note above about template calls being the issue. I still recommend splitting the page anyway. —bcode talk | mail 22:33, 10 July 2017 (UTC)

Scroller master

This has been bugging me ever since I read the source code, but I'll ask it now. What exactly does the scroller master do? --Kahran042 (talk) 16:34, 11 August 2017 (UTC)

It's a dummy monster for when the player triggers a superscroller trap: wherever it spawns, an active superscroller trap is created underneath, and then the scroller master is killed off. One way of stopping the superscroller effect is to find and untrap that active superscroller trap. --Bluescreenofdeath (talk) 17:46, 11 August 2017 (UTC)
Thanks for the info and for the rapid response. --Kahran042 (talk) 17:51, 11 August 2017 (UTC)
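
For anyone who wants the mechanic in code form, below is a tiny Python sketch of the behaviour as Bluescreenofdeath describes it above. Every class, method and name here is a hypothetical stand-in for illustration; the actual implementation lives in SLASH'EM Extended's C source.

    # Hypothetical sketch of the described behaviour; every class and
    # method name is a stand-in, not the game's actual C code.

    class Monster:
        def __init__(self, name, x, y):
            self.name, self.x, self.y = name, x, y

    class Level:
        def __init__(self):
            self.monsters, self.traps = [], []

        def spawn_monster(self, name, x, y):
            m = Monster(name, x, y)
            self.monsters.append(m)
            return m

        def place_trap(self, kind, x, y):
            self.traps.append((kind, x, y))

        def kill_monster(self, m):
            self.monsters.remove(m)

    def trigger_superscroller(level, x, y):
        # The scroller master is a dummy monster used only as a marker:
        master = level.spawn_monster("scroller master", x, y)
        # an active superscroller trap appears wherever it spawned...
        level.place_trap("active superscroller", master.x, master.y)
        # ...and the dummy is then killed off immediately, so only
        # the trap remains for the player to find and untrap.
        level.kill_monster(master)

The point is simply that the monster exists only to mark a tile; the lasting effect is the trap it leaves behind.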