Welcome to Open Carnage

A resource for Halo Custom Edition and MCC modding, with unique means of rewarding content creation and support. Have a wander to see why we're worth the time! - EST. 2012



About Kavawuvi

  • Birthday April 10

Extra Information

  • Gender
  • Contributed
    $100 (US) to Open Carnage
  • Raffle Victor

Computer Details

  • Name
    Dark Citadel
  • Central Processor
    AMD Ryzen 5 2600
  • Motherboard
  • Graphics
    MSI GeForce GTX 1070 Gaming 8G
  • Memory
    32 GB [2x 16 GB] G.Skill Ripjaws V Series
  • Storage
    500 GB Samsung 970 EVO SSD (NVMe) + 240 GB ADATA XPG SX930 SSD (SATA III)
  • Power Supply
    EVGA SuperNOVA 650 G3
  • Case
    Fractal Design Node 804
  • Display
    LG 27GL850-B 27" 2560x1440 144 Hz IPS
  • Keyboard
    MAX Keyboard Nighthawk X9
  • Mouse
    Logitech M510 Wireless Mouse
  • Operating System
    Arch Linux

Recent Profile Visitors

136,365 profile views
  1. Oh, yeah, this is primarily intended for custom maps first and foremost. This is never going to produce Halo-like lightmaps. That said, I DO intend to reverse engineer tool.exe's radiosity and make my own lightmapping tool, possibly with modern features such as threading.
  2. For your first point, the built-in errors for the toolset are often highly cryptic. You MIGHT get a useful error message sometimes, but oftentimes you just get a generic exception error which CAN'T be searched, making them very un-user-friendly. When it does have errors, many people have to resort to messing with their tags until the tools are finally happy with them. And when it doesn't report errors where it should, the game crashes.
     Invader, on the other hand, is very impressive and well-polished for the feature set it provides. Error handling is also way better, often telling you exactly what is wrong with your tags and even providing an automated means to fix them if it's too much to fix everything yourself by hand. Not only that, but Invader supports a variety of inputs besides .tif and .wav for creating assets. Tool's LibTIFF implementation may not work with some TIFFs made by newer image editors; Invader has a newer LibTIFF implementation and also lets you use .png, .tga, and .bmp. Invader also supports FLACs for audio input, allowing you to keep sizes down. Unlike tool.exe, it does not require you to violate the law to make Xbox ADPCM sound tags, and the Xbox ADPCM sound is a bit better, too. Invader also has really good performance with large tags. The tag editor, for example, takes less than half a second to load d40_b. Guerilla, on the other hand, takes about a million years to open it.
     For your second point, the documentation may not be that much yet, but that's because I value quality over quantity and I've had to write it myself. Here are some examples. Want to make a bitmap? I have one of the best tutorials for that. It not only explains how color plates work and their many quirks, but it covers all of the different formats you can use for making bitmaps, including their pros and cons. https://github.com/SnowyMouse/invader/wiki/Creating-a-bitmap Want to make a sound? There are hardly any tutorials on that, but the Invader tutorial not only tells you how to make a sound tag, it tells you how pitch ranges and permutations work and also explains the various formats. https://github.com/SnowyMouse/invader/wiki/Creating-a-sound
     Invader also does a number of things the base tools simply do not do, it is way more reliable at finding errors the base tools miss, and it does not introduce a number of errors the base tools do. I and many others consider this "very impressive", but the fact of the matter is that people use the stock tools because they're the stock tools. Not because they're actually any better to use, and certainly not because they have more documentation available. Such is the way of things here.
     And honestly, it's feedback like this that makes me wonder if making a reimplementation of the Halo Editing Kit is even worth it (not blaming you, of course - this is just what people tell me all the time). Not a lot of people test it, and no one actually lends a hand. I am lucky if someone even says "I'd help if I could program" here. I have had to pull so many hours and sleepless nights just to get Invader where it is by myself because, let's face it, no one cares. Almost everyone is perfectly happy using a broken, buggy, closed-source toolkit for making their maps.
     Again, my concerns with this are that no one will use it, especially with MCC's modding support potentially being right around the corner, thus most people aren't going to use this format, and it wouldn't be worth the added confusion. If MCC CEA had never been ported to PC, or I was certain it wouldn't go anywhere modding-wise, I would be fully on board with making this new format. And, of course, MCC CEA's new format is still .map, an extension all seven games already use.
  3. The problem is that .map is such a generic file extension. Tons of programs use files with .map extensions, and the game even has bitmaps.map/sounds.map/loc.map, which aren't cache files but a completely different format from the other map files. Yes, changing the header will prevent the stock game from loading it. This is something Chimera does. However, people still want the extension changed - something I'm implementing in an upcoming Chimera update. It's worth noting that MCC no longer compresses the header, but these maps are still incompatible, as they use a different base memory address (thus they will crash the game) despite using version 7.
     We did give this consideration a while ago. The problem isn't so much implementing it on a client but moreso having tools that build maps with these tags. While Invader can build maps, people prefer to use the stock tools to build maps. So to get more people to use this convention, you would have to write a program that patches a .map file with this metadata.
     Anyway, right now, the map downloader isn't going to download things that are incompatible with the client you're using. HAC2 users aren't going to get a map that requires Chimera. Rather, each mod gets its own type of file. As for getting the client to only download just the header, this is something we'd have to figure out with msalerno1965, as he's the guy who runs the repo. He'd have to modify his repo.
     This is a really good writeup and some really good ideas for a new format. However, now there's yet ANOTHER format coming, this time for MCC (version 13). I have no doubt that this will create even more confusion. Not only that, but it has more tag space and support for things that Halo PC doesn't support, like monochrome textures. This, too, will be .map. I'm starting to wonder if the new format I was considering will even be worth it at this point.
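     The kind of header check described above (a patcher or downloader deciding whether a .map is something the client can load) can be sketched in a few lines. This is a hypothetical illustration, not Chimera's or Invader's actual code, and it assumes the commonly documented Gearbox-era cache layout: the literal bytes "daeh" (the "head" magic stored little-endian) at offset 0, followed by a 32-bit little-endian engine version (7 for Halo PC, 609 for Custom Edition; the MCC CEA version 13 entry reflects the post above).

```python
import struct

# Hypothetical sketch of a .map header check, assuming the commonly
# documented layout: b"daeh" magic at offset 0, then a little-endian
# uint32 engine version. Version numbers per the discussion above.
KNOWN_VERSIONS = {7: "Halo PC retail", 609: "Halo Custom Edition", 13: "MCC CEA"}

def identify_map(header: bytes) -> str:
    """Return a human-readable guess at what kind of .map this is."""
    if len(header) < 8:
        return "not a cache file (too short)"
    magic, version = struct.unpack_from("<4sI", header, 0)
    if magic != b"daeh":  # "head" stored little-endian
        return "not a cache file (bad magic)"
    return KNOWN_VERSIONS.get(version, f"unknown engine version {version}")
```

This is also why only the first few bytes would need to be fetched for the "download just the header" idea: the magic and version alone are enough to reject a map the client can't use.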
  4. This update contains bug fixes and improvements to map downloading, courtesy of pR0Ps (https://github.com/SnowyMouse/chimera/pull/75). This fixes an issue where the game may try to download the same map multiple times. You can now specify mirror(s) and the preferred order in chimera.ini for downloading maps. Some additional features of the HaloNet Map Download Protocol are now supported.
  5. I've rarely had it happen to me on Windows 10 and Windows 7. I'm not sure if I've seen it happen on Windows XP or older. It typically only happens when it crashes. Huh. I actually might recommend ReShade since it's open source.
  6. Chimera blocks the gamma setting from working. This is intentional. Halo's gamma setting needlessly changes your operating system's gamma setting, and this is prone to issues such as gamma being retained temporarily (or even permanently sometimes, resulting in you having to manually change it back). I am not changing this behavior. Instead, I present a better way to set your brightness: dgVoodoo2.
     Step 1. Install dgVoodoo2. Download it from http://dege.freeweb.hu/dgVoodoo2/dgVoodoo2/ and go to the latest stable release. Copy MS\x86\D3D9.dll, dgVoodoo.conf, and dgVoodooCpl.exe to your Halo directory.
     Step 2. Open Halo. Look for the dgVoodoo watermark on the bottom right corner to verify dgVoodoo is properly installed.
     Step 3. Open dgVoodooCpl.exe. Go to DirectX and uncheck dgVoodoo watermark.
     Step 4. Go to General. Change the Brightness setting on the color adjustments until satisfied. Click Apply and restart Halo to see the changes.
     Step 5. Play the game. (This is with 291% brightness - Have fun)
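     For anyone curious why an in-renderer brightness adjustment behaves differently from the OS gamma ramp Halo pokes at, here is a generic illustration of the two kinds of curve (this is standard image math, not dgVoodoo's actual implementation, and the numbers are only for the mid-grey example):

```python
def apply_brightness(value: float, brightness: float) -> float:
    """Linear brightness: scale a 0..1 channel value and clamp at white."""
    return min(1.0, value * brightness)

def apply_gamma(value: float, gamma: float) -> float:
    """Gamma: a power curve that lifts midtones while leaving white at 1.0."""
    return value ** (1.0 / gamma)

# A mid-grey pixel (0.5) at the 291% brightness setting mentioned above:
print(apply_brightness(0.5, 2.91))  # clamps to 1.0 - midtones clip to white
print(apply_gamma(0.5, 2.2))        # ~0.73 - lifted, but white stays white
```

The key difference: the multiply-and-clamp applies only inside the game's output, while a gamma-ramp change like Halo's is system-wide, which is exactly why it can get "stuck" after a crash.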
  7. I'm working on a new Blender plugin for generating lightmaps, Shadowmouse! This uses Cycles to bake lightmaps and Blender to unwrap UVs. It uses the radiosity parameters set in your tags to set up Cycles materials and sunlight. With this, you can bake high quality lightmaps for a number of maps with no manual work necessary (besides clicking a few buttons and running a couple commands), and you do not need to worry about tool's really bad UVs. Here are some examples.
     Blood Gulch (Original Halo PC lightmaps)
     Blood Gulch (Cycles lightmaps [WIP] - 2048x2048, 1024 samples, ~8 hours)
     camera.txt: 23.231102 -60.409565 10.358509 0.345733 -0.919217 -0.188493 0.066352 -0.176431 0.982086 1.221730
     Things to note: Shadow definition is much higher on the Cycles lightmaps. UVs aren't broken on the Cycles lightmaps (look at the cliff on the right - the rock strangely has two shadows on the Halo PC lightmaps, an issue NOT present on the Xbox lightmaps). Sunlight strength is a bit lower on the Cycles lightmaps. Honestly, I'm not really happy with that, so I'm going to try increasing the sunlight intensity by 50%. Unfortunately, because it took 8 hours to bake these, I'll have to rebake them later.
     Longest (Original Halo PC lightmaps)
     Longest (Cycles lightmaps [WIP] - 2048x2048, 16384 samples, ~10 hours)
     camera.txt: -15.382186 -15.644083 0.837852 0.762132 0.529893 0.372231 -0.305612 -0.212480 0.928195 1.221730
     Things to note: The actual sunlight is properly shown now on the Cycles lightmaps. The map is way too bright now, however. This may have to do with how Cycles does materials, or maybe Halo doesn't do indirect lighting very well. Either way, even if it's technically more realistic, it also completely changes the appearance of the map. This is because Cycles calculates lighting for 4 bounces by default. Setting the number of indirect light bounces to 1 gets a more desirable result, although not 100% exact (see below). This may also be why Blood Gulch's shadows don't contrast as much. Anyway, I'm considering having the script set this by default for all of the materials.
     Longest (Cycles lightmap [WIP] - 2048x2048, 512 samples, 1 indirect light bounce, ~12 minutes)
     Ignore the noise here. This is 512 samples, so it's really rough with indirect lighting.
     This is definitely not a replacement for tool.exe lightmaps, but rather an alternative. Not everyone will want to use this. Here are a few reasons why:
     It uses Blender, not 3ds Max (or other programs). Most people use 3ds Max, not Blender. Blender compatibility is only recent, thus most Halo tutorials are for 3ds Max, so I'm sure a lot of people would prefer something like Aether, which uses 3ds Max. While you are not required to use Blender to make your model to use Shadowmouse (and, by the way, you should always use a separate project for baking lightmaps!), Shadowmouse is a plugin for Blender, NOT 3ds Max. I have no plans to make a 3ds Max-compatible version of this tool. 3ds Max is prohibitively expensive for me ($1530/year), and since I don't use 3ds Max, I do not know how to write in MAXScript.
     The file size is huge. The file size of large lightmaps is utterly titanic, especially with 32-bit color. Blood Gulch's lightmaps at 2048x2048 end up being over 200 MiB! You can use 16-bit color like the original lightmaps used. This will halve the file size (and if you're making Xbox lightmaps, you should do this!). However, at these resolutions, banding is going to be more prevalent. You can also use DXT1, reducing the size to 12.5% of the original, but block artifacting is going to utterly destroy some lightmaps.
     Time and energy. Time is probably the biggest con of using Cycles. Cycles is very CPU demanding and time consuming! Higher resolutions and higher sample counts both result in longer baking times, and some BSPs will take longer than others. For example, Blood Gulch at 2048x2048 with 1024 samples took well over 8 hours to bake on a Ryzen 5 2600. Sample counts less than 8192 have noticeable noise in dim areas, but 8192+ samples takes an extremely long amount of time. Longest, at 16384 samples, took 10 hours, and it's way smaller than Blood Gulch and has only 1 lightmap texture to bake, not 15! Either way, while Cycles was baking lightmaps, the PC was not very usable. Seriously, if your CPU doesn't have very many cores or has a low clock speed or IPC, then generating a bunch of 2048x2048 lightmaps with these settings is NOT going to be something you can just do overnight. And, of course, a Sandy Bridge-era Intel Core i5-2500K or AMD FX 6300 is going to absolutely chug compared to newer, better CPUs like the Intel Core i7-10700K or AMD Ryzen 7 5800X.
     It's not going to look like Halo. I've done a lot of work in figuring out how to make materials that match Halo's radiosity parameters. However, the fact of the matter is that it's not possible to make Blender output a lightmap that perfectly matches the Halo aesthetic - and if you're going for a higher resolution, it just isn't going to happen anyway. Not only is it not possible to match the original Halo aesthetic, but if you use scenery that has random permutations (e.g. rocks), it's ESPECIALLY not going to look right. These will cast shadows that won't be correct for different permutations, resulting in the chance of a very dark shadow surrounding the scenery every time you load the map. Some manual work can make things look more like Halo-style lighting, but it will ultimately be up to your expectations.
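     The file-size figures above can be sanity-checked with quick arithmetic. A small sketch, assuming uncompressed pages and the 15 Blood Gulch lightmap textures mentioned above (DXT1 is 4 bits per pixel versus 32, hence the 12.5% figure):

```python
def lightmap_size_mib(width: int, height: int, pages: int, bytes_per_pixel: float) -> float:
    """Total uncompressed lightmap size in MiB for a set of pages."""
    return width * height * bytes_per_pixel * pages / (1024 ** 2)

# Blood Gulch: 15 pages at 2048x2048 with 32-bit (4-byte) color.
full = lightmap_size_mib(2048, 2048, 15, 4)
print(full)          # 240.0 MiB - matching the "over 200 MiB" figure
print(full / 2)      # 120.0 MiB with 16-bit color (half the size)
print(full * 0.125)  # 30.0 MiB with DXT1 (12.5% of the 32-bit size)
```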
  8. I'd recommend Invader or even Mozzarilla to make sound tags. They both have the best Xbox ADPCM codec available, and they're both free software. You COULD use the Xbox ADPCM codec posted by Giraffe, but it's not quite as good in terms of audio quality, and it might possibly be totally illegal, since it appears it came from the original Xbox SDK (which almost certainly required violating an NDA to redistribute, and if so, it's probably copyrighted as heck). Invader also has a much newer Ogg Vorbis codec than tool.exe, which has a number of vulnerabilities fixed compared to the original one tool.exe used, and it's just better overall. Ogg Vorbis sounds better, but it's a little bit slower to decode. For music and dialogue, this is fine, but for sound effects, use Xbox ADPCM.
     For making sound tags with Invader, I wrote a tutorial here: https://github.com/SnowyMouse/invader/wiki/Creating-a-sound. Oh, also make sure to split it with the -s option! If you don't, then Halo may reset the sound abruptly if it's especially long. Also, Halo will only play 22.05 kHz mono, 22.05 kHz stereo, or 44.1 kHz stereo. Invader can resample (i.e. -r 22050 or -r 44100) or mix down/up (i.e. -C mono or -C stereo). If it's audio intended to be played in 3D space, you will need to use 22.05 kHz mono, or you may hear it at the same volume no matter where you are. For stuff like music, it doesn't really matter.
     Next, make your sound_looping tag. If it's music or ambient audio that only has one sound tag, create a track and set the loop to your sound tag.
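     The playback constraints above are easy to check up front before running any tool. A minimal sketch (my own illustration, not part of Invader), encoding the rule that Halo only plays 22.05 kHz mono, 22.05 kHz stereo, or 44.1 kHz stereo, and that 3D-positioned audio should be 22.05 kHz mono:

```python
# (sample rate in Hz, channel count) combinations Halo will play,
# per the post above; anything else needs -r / -C conversion first.
SUPPORTED = {(22050, 1), (22050, 2), (44100, 2)}

def needs_conversion(sample_rate: int, channels: int, positional_3d: bool) -> bool:
    """True if the input must be resampled/remixed before it will play correctly."""
    if positional_3d:
        # 3D-positioned audio should be 22.05 kHz mono, or it plays at the
        # same volume no matter where you are.
        return (sample_rate, channels) != (22050, 1)
    return (sample_rate, channels) not in SUPPORTED

print(needs_conversion(44100, 2, positional_3d=False))  # False: fine for music
print(needs_conversion(44100, 2, positional_3d=True))   # True: mix down to 22.05 kHz mono
print(needs_conversion(48000, 2, positional_3d=False))  # True: resample first
```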
  9. As of the last update, MCC now supports Custom Edition maps, but it appears to have checks in place to prevent protected maps from loading. Apparently this was done to prevent people from complaining about these maps not working. This basically means that a lot of those 2000+ maps I mentioned in my previous post are left in the dust in terms of working on the latest version of the game - especially a version you can actually purchase for cheap ($10, sometimes even $5 on Steam). I don't think it is unreasonable to expect a newer version of the engine to NOT support maps that were deliberately corrupted. And, of course, I love being proven right!
  10. AAAAH!!!! Wow it's been that long???
  11. Oh, wow. This is actually really good and useful. Nice!
  12. This is a small fix. Basically, every April 1st, it'd add a cartridge tilting effect. This originally did it on all maps. However, I've changed it so it only does it on the main menu. Sorry, that was an oversight on my part.
  13. I updated the review a little bit to better distinguish Variable Refresh Rate and FreeSync. Basically, FreeSync is AMD's proprietary standard used on both HDMI and DisplayPort. Adaptive-Sync, however, is a DisplayPort standard. Not everything supports FreeSync, but some things support DisplayPort's VESA Adaptive-Sync (for example, non-AMD GPUs). I've also noted that while it claims to support HDR, it isn't really HDR (the contrast ratio, while fine, isn't good enough). It also supports 10-bit color.
      I found that pretty much all games benefit from it in some way. Obviously genres can play a role, too. An FPS or an RTS is going to benefit a lot more than, say, Tetris. Halo: Combat Evolved looks fantastic. Killing Floor 2 looks really good, too, though it's pretty CPU heavy and may dip below 120 FPS sometimes, even when lowered to 1080p. Adaptive-Sync is extremely useful here. Goodness. I haven't tried Minecraft with this, but this is definitely a game I thought would benefit a LOT from the high refresh rate. The Bedrock version (Windows 10, mobile, and console) probably benefits the most from it, as that version gets a very good frame rate even on toasters.
  14. Holy shit. This is amazing.
  15. I've been using this monitor for about a week now, so it's time to write a review on it! For reference, this monitor replaces my older Acer G257HU smidpx 25" monitor which I reviewed here:
      What's good about it?
      This monitor has a 2560x1440 (quad HD) 144 Hz nano IPS display. Nano IPS is a better version of IPS which covers more than the sRGB color space. You can read a whole article on it on DisplayNinja or ViewSonic, but in a nutshell, it's IPS but better. Anyway, like most IPS displays, the viewing angles are really good. The grey-to-grey (G2G) response time is also pretty low. I wouldn't say it's 1 ms as advertised, though. Sure, you can set the Response Time to "Faster" and you'll technically get an average of 1 ms response time, but the result is tons of inverse ghosting artifacts and trailing due to overshoot, which kind of defeats the point of getting a low response time (less ghosting!). Thankfully, this is not the default setting. In fact, the default settings are pretty good - MUCH better than my previous Acer monitor, which maxed out the overdrive setting by default, resulting in LOTS of the aforementioned artifacts.
      I primarily do programming, so 1440p, while not necessarily required, is extremely useful for what I do, and going back to 1080p would've been a far bigger downgrade than the upgrade of going to 144 Hz. I also play video games, though, and I wanted to try something better than 60 Hz. It also supports variable refresh rate (VRR). This results in the monitor adapting to the frame rate of the game - basically a reverse form of vSync with none of the input lag.
      The on-screen display controls for the monitor are also pretty good (it uses just one button under the display!). It lets you configure a wide variety of settings, including two custom profiles and a few built-in profiles for various use cases (FPS, RTS, sRGB, "reading") as well as two other settings, "HDR Effect" and "Vivid".
      The custom profiles let you configure most of the settings, while the other ones only let you configure brightness and contrast. sRGB mode is nice for viewing content, and the two custom profiles are really useful, but I'm not sure what the point of the other settings is.
      The stand is pretty good, too. You can pivot, tilt, and raise/lower the monitor (but not rotate - you have to rotate the stand itself). And if you don't care, you can use your own VESA mount instead. Lastly, the monitor is not terribly expensive. I got mine for about $480 (after tax), which really isn't too bad considering it's a 144 Hz, 1440p, IPS display.
      Can I use Adaptive-Sync or FreeSync with my PC or console?
      To use VRR with this monitor on a PC, you need an AMD (GCN 2.0 or newer), Nvidia (1000 series or newer), or Intel (11th gen or newer) GPU to take advantage of this feature. Older Intel, Nvidia, and AMD GPUs do not support the standard. If you're on a console, currently only the Xbox One X/S and Xbox Series X/S will work, via FreeSync. The Nintendo Switch does not support either standard (it does not use DisplayPort and it uses an Nvidia chipset), but the PlayStation 5 is planned to get support in a future update.
      Also, you cannot use VESA Adaptive-Sync over HDMI. Only DisplayPort officially supports variable refresh rate; with HDMI 2.0, you have to use the proprietary FreeSync standard (only HDMI 2.1 or newer natively supports Variable Refresh Rate - this monitor only has 2.0 ports, though!). So, depending on your device (basically any device with a non-AMD GPU), you may have to use DisplayPort to get this feature. And, of course, the game needs to actually run below the refresh rate of the monitor. If it runs at or above the refresh rate, you'll get tearing as usual. At 144 Hz, it's probably not going to be that bad (assuming you even notice it), but it is something to watch out for.
      Is 144 Hz so much better than 60 Hz?
      Let me start by saying the difference between 60 Hz and 144 Hz was immediately noticeable. Merely moving the cursor on my desktop was immediately better. Scrolling on pages looks really smooth, too. And, of course, gaming itself looked so much better. Of course, I wanted to know if this was placebo! After playing on 144 Hz for a while, I did a blind test where I did not know the refresh rate. To do this, I wrote a Bash script that changed my refresh rate randomly to either 144 Hz or 60 Hz without telling me which one, and then asked me what I thought the refresh rate was. I found I could, indeed, reliably tell the difference between 144 Hz and 60 Hz, even though I had not used this monitor for more than a few hours.
      But does 144 Hz look twice as good as 60 Hz? After all, 144 is more than twice as high as 60. Personally, I think it's a matter of opinion depending on what you play. First, let's invert the numbers to convert refreshes per second to seconds per refresh.
      144 Hz is 6.944 milliseconds per refresh.
      75 Hz is 13.333 milliseconds per refresh, or 6.389 ms worse than 144 Hz.
      60 Hz is 16.667 milliseconds per refresh, or 9.722 ms worse than 144 Hz.
      30 Hz is 33.333 milliseconds per refresh, or 16.667 ms worse than 60 Hz.
      While it refreshes more than twice as fast, going from 60 Hz to 144 Hz is really only 58% as much of an improvement as going from 30 Hz to 60 Hz - a jump mostly seen on consoles. However, smoothness isn't the only improvement of going to a higher refresh rate. Because more information is being presented to you every second, the time between button input and response is effectively improved, too. Therefore, 144 Hz not only looks better, but games play better, too.
      What's not so good?
      My main gripe is that there is only one DisplayPort port while there are two HDMI ports. This doesn't seem like a very bad thing until you consider the following: You can only get DisplayPort Adaptive-Sync if you use... well... DisplayPort.
      As stated earlier, HDMI 2.0 does not natively support this technology, thus you have to use a proprietary standard like FreeSync if you want VRR over HDMI, where anything else would have to use DisplayPort. Also, if you use the HDMI port and have FreeSync enabled, the refresh rate is capped at 100 Hz at 2560x1440 and 120 Hz at 1920x1080. If I can only fully take advantage of the monitor with ONE of the ports, then the other ports are mostly pointless if you care about variable refresh rate. A second DisplayPort port would've been really nice. So, yeah, I ended up disabling FreeSync on the HDMI ports and just used Adaptive-Sync on the DisplayPort input.
      The monitor itself has no speakers, which isn't a problem, since most monitors' built-in speakers are terrible anyway. However, the audio quality from the headphone jack on this monitor isn't great. It's noticeably worse than motherboard audio.
      Also, the black aesthetic of the monitor is really good, but the slight "gamer" red accents on the stand and back of the monitor, while not garish, feel a bit unnecessary. Maybe I'm just being very picky? At least they're using a matte black for everything facing the front, but the red accents would look weird anywhere except in a home environment.
      Lastly, do not bother with HDR on this monitor. The contrast ratio is okay, but not really good enough to be called HDR. Forget that it says HDR anywhere on the box.
      Other notes
      This monitor has a crosshair you can turn on. If you're playing something like a first-person shooter that has a shoddy reticle, you can use this to compensate for it. You can also use the crosshair in games that intentionally do not show you a crosshair without anyone knowing. However, note that doing so makes you a gigantic tool at the same time, so use with caution!!
      Conclusion
      This is a pretty good monitor. In fact, it's certainly the best monitor I've ever personally owned. The display looks nice.
      The colors and viewing angles are good. The refresh rate provides a really good experience in both gaming and regular desktop usage, and I've finally been able to properly watch the 120 FPS version of that one somewhat popular MCC video I made. The only real drawbacks are that there's only one DisplayPort connector and the HDMI inputs only support up to 1440p @ 100 Hz or 1080p @ 120 Hz if you have Adaptive-Sync turned on. It doesn't matter for stuff like my Nintendo Switch, which would only ever support 60 Hz anyway, but if I connect a second PC (which I sometimes do), it actually does matter. And, of course, if your input device doesn't have a built-in audio jack, the monitor's 3.5 mm audio jack doesn't have very good audio quality.
      Pros
        • IPS display (good viewing angles and color accuracy)
        • 2560x1440 (plenty of pixels for regular usage)
        • 144 Hz (smooth, responsive)
        • 10-bit color support
        • Adaptive-Sync / FreeSync Premium support (with compatible systems)
        • Good grey-to-grey response times
        • Good on-screen display
        • Good stand (and VESA mount support)
      Cons
        • HDMI ports do not support 144 Hz with FreeSync turned on, only 100 Hz (1440p) or 120 Hz (1080p)
        • Only one DisplayPort input, which makes the above con an even bigger issue
        • Poor audio quality via 3.5 mm headphone jack
        • Does not actually hit 1 ms response time without significant overshoot (but the default setting is pretty good)
        • Not actually HDR
      If I had to rate this out of five stars, I'd say... it depends. If you only have one DisplayPort-compatible device you plug into your monitor and you have your audio coming from your motherboard, as is the case with most PC gamers, this monitor is easily a five star experience. Otherwise, I'd give it three stars. If you need to use the audio from the video cable (i.e. no direct audio from the system you're plugging in), the audio is terrible. You can probably split the audio from the video feed, but you'll have to buy a separate device for that, and they're usually around $20 for HDMI.
      I can't find anything on DisplayPort online except using a series of adapters. And if you need Adaptive-Sync support from multiple inputs, switching from input to input is a pain. Sure, you can buy a $30 DisplayPort switcher, but it's not going to switch the audio (unless you're okay with the monitor's really bad audio). All in all, it's a good monitor, but having only one DisplayPort input, the 100 Hz cap on the HDMI ports with FreeSync on, and, of course, the poor audio quality from the headphone jack all hold it back.
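      The refresh-interval comparison earlier in the review reduces to a one-line conversion (1000 ms divided by the refresh rate), which also reproduces the "58% as much of an improvement" figure:

```python
def refresh_ms(hz: float) -> float:
    """Milliseconds per refresh at a given refresh rate."""
    return 1000.0 / hz

print(round(refresh_ms(144), 3))  # 6.944 ms per refresh
print(round(refresh_ms(60), 3))   # 16.667 ms per refresh
# 60 -> 144 Hz saves 9.722 ms per refresh; 30 -> 60 Hz saved 16.667 ms,
# so the former is about 58% as large an absolute improvement:
print(round((refresh_ms(60) - refresh_ms(144)) / (refresh_ms(30) - refresh_ms(60)), 2))  # 0.58
```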