Everything posted by Zatarita
-
I’m impressed! I know this has been out a while, but I’ve been nose to the grindstone programming. Just took some time out of my day to play this, and it’s glorious. I LOVE the glass effects you used, and the last blockout was my favorite. I’m glad you chose that one. Overall, this will be in my mental “top ten.” You always do awesome work! What software do you use? Max, Blender?
-
Never thought of that! That's an easy fix too; I can definitely add that. Sadly, specialty stuff like this will likely need to be addressed manually. I will bring that up in the manual conversion tutorial, though. Also, aren't the new shader settings for the multipurpose off by default? So classic tags that use the old system should default to the CE style; Xbox requires setting it manually. Edit: I understand what you mean now. I keep messing things up with my edits x.x gotta get used to this moderator thing. Sorry lol. The bitmaps will be mismatched. That might require a more nuanced approach, but I'll have to run my brain on it for a bit. This is a good point too; I didn't know invader could extract scripts. That's an easy fix too! The forbidden script commands might be a bit more difficult; those could go into the manual conversion tutorial as well. I'll update the changes when I have a free night.
-
Conservancy is a bat file I threw together to streamline the steps required to port a classic map to Anniversary using invader and the Anniversary HEK. This requires invader: https://invader.opencarnage.net/builds/nightly/download-latest.html

Usage is simple. First, extract invader and Conservancy into your Chelan_1 folder. Then place all the maps you want converted into the "classic maps" folder. The bat file will process each entry inside the classic maps folder and use invader-extract to extract the data from the map. After each map has been extracted, it will run invader-bludgeon and invader-strip on all the tags. Once this is done, it will extract the scenario name from the map using invader-info and rebuild the cache file using tool. Once all of this has finished, all the successful conversions will be built into your maps folder.

This is practically the same process as the tutorial I made in the tutorials subforum, just automated with a bat file. Do note it is slightly inefficient to bludgeon and strip EVERY tag every time; however, for the sake of automation the process has been simplified; that is a sacrifice that needed to be made for ease of use. Also note that if there were any issues with extraction/stripping/bludgeoning, a manual approach may be required, for which I recommend following the tutorial here:

Please post any issues. I have tested on multiple maps, and it seems to run as expected c: Happy modding! Conservancy.zip
-
Yep c: I think it comes down to how each generation views computers. I was born on the cusp of DOS, but as I was getting my experience in my field I also used Linux. I think a lot of people only know graphical interfaces. They don't realize how powerful scripting can be. In fact, modern Windows is shite compared to Linux in that regard. Since Windows dominates, a lot of people never learned how to leverage scripting. Not to mention, if you make a strong backend, and it's useful enough, people will create a frontend. Like tool+ or Osoyoos.
-
Yes, but doing that with a bat file alone is tough. I could just write a program; that wouldn't be difficult. Edit: Something like this
-
My coworker had an interesting stance on the matter that might help. We were discussing the "I could have a server automate extracting and rebuilding a map" idea. His thought was: if it's that simple to convert a map, why not make the conversion tool easily available and easily usable, then have the individual convert the map on their own? This would completely pull the need for a porter out of the equation. Someone could say "yo, let's play hugeass," and you could download the classic map and just run a bat to rebuild it. I actually made a bat file that streams the process into sub-folders and rebuilds the maps. One major issue I have, though, is knowing the scenario file name from the map name. I just search the whole levels folder for a scenario. (Since they're sub-folders, the extracted scenario should be the only scenario, but I feel it's a dirty solution.)
-
That could work, or maybe we could talk to hosting services like CE3 about having a specific "license" (if you will) for ported maps. (Or maybe I could set up a hosting service with preservation in mind as the mission statement.) Heck, I could create a bat file to automate this process and remove the need for a "porter" altogether. Someone would just upload the CE map, and the server could pull it apart and rebuild it for MCC, only needing attention when the automated system failed.
-
heck yea, I'm honored c:
-
Recent San Francisco trip to see Porter Robinson. Twas lit. Florida boi enjoyed the 68 degrees.
-
I do agree maintaining is a bit much to ask; that's why I think it would be good practice, though I understand it's not "required." And I never thought about spam. Do you think there could be a better solution?
-
ALRIGHT! So I've been working silently. I've been touching up a lot of the underlying systems to get them to the point of functionality I'm happy with.

I have completely removed the need for a custom endian class. Instead, I have created endian-aware variables that will cast for me when needed regardless of the host system. This utilizes C++20's std::endian definitions, which may limit usage; I'm unsure how this will interact with Linux's endian library. This will require testing. However, by doing this I can create a custom std::istream >> operator overload, DRASTICALLY simplifying the need to create my own parser.

I have just finished patching up the decompression object. It is now 100% a template class, which allows for easy extension. I figured I'd leave out H2AM compression though, as that should be associated with Blam-style compression. This also cleaned up the H2AM fringe-case code that bloated everything up. I'm not a big fan of Boost; however, I am now utilizing the Boost logging library for tracing and debugging. I plan to keep Boost libraries to a minimum. I also went ahead and borrowed a thread-pooling library from GitHub (https://github.com/bshoshany/thread-pool). This has cut decompression time down even further, showing (on my 8-core machine) roughly a 6.5x reduction in decompression time. This compounds, as I don't need to decompress the entire file, making "random" data access damn near as efficient as possible. Already-decompressed chunks won't be re-decompressed. I also changed the way things are set up with the decompression object: I allocate an array for the entire decompressed file, and each thread writes to that preallocated memory, reducing the number of runtime allocations needed. The data is read straight from the file, through the decompressor, into the preallocated memory.

This leaves cleaning up the actual Saber definitions for the s3dpak, imeta, and such. From there I WILL be done with libSaber whether I like it or not. I can't fall into the trap of trying to make it "perfect," especially if I keep pushing the bar for what I want. Threading is there, random access is there; I don't need to over-engineer anymore.
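The preallocated-buffer scheme is easier to see in miniature. Here's a toy Python version (zlib chunks standing in for the real Saber chunk format, and ThreadPoolExecutor standing in for the C++ thread pool):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def decompress_chunks(chunks, sizes):
    """Inflate every chunk into one preallocated buffer, one worker per chunk."""
    out = bytearray(sum(sizes))                      # single upfront allocation
    offsets = [sum(sizes[:i]) for i in range(len(sizes))]
    def inflate(i):
        # Each worker writes only to its own slice, so no locking is needed.
        out[offsets[i]:offsets[i] + sizes[i]] = zlib.decompress(chunks[i])
    with ThreadPoolExecutor() as pool:
        list(pool.map(inflate, range(len(chunks))))  # list() surfaces worker exceptions
    return bytes(out)
```

Because chunk offsets are known up front, any subset of chunks can be inflated on demand, which is what makes the "random access" behavior cheap.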
-
So, things have been a bit quiet from my front this past month, but I figured I'd post an update on some of the progress that's been getting made. When I say "team" I currently mean myself; however, I am looking for people to join. If you're interested, shoot me a DM. Onto the updates!

SeT had a major revision. I refactored everything to utilize polymorphism a bit more, hopefully to keep things a bit more organized. There have been 2 major updates to SeT. One is SEK, which is a Saber Editing Kit. It's a HEK-style set of tools that will be used to modify Saber files; this is going to replace SuP. Planned tools are:
S-edit - a Guerilla-esque program used to edit the files inside the Saber paks.
S-tool - a Tool-esque program used to compile new files.
P-arc - this will let us pack ALL the mod files into one file (.s3dpak, .ipak, .imeta, .fmeta, .map). It also lets us send only the modded files, greatly reducing file size. File size comparison (desert hog mod):
My hope is this will simplify modding CEA and make it more user friendly.

Besides planned programs, my hope is to create an exporter/importer with templates for Blender. Coordinates, texture coordinates, and faces have been mapped out for CEA. This means I can extract most of the model data; however, this is only for "simple" meshes. I have not found rigging data. Research is still being done. I'm going to attempt to manually build a model into a template soon to test building.

Lastly, while the primary focus is H1, I have been poking around with H2A as well. I managed to extract the contents of the H2A pck files. I have a lot of templates from there as well. I figured out some of the model data for them too; however, I wasn't able to figure out texture coordinates or rigging information for that either.

If you're interested in helping out, I could really use a hand. I only have ~2 days a week that I can dedicate to this.
-
It's likely you've reached the edge of the map, and the remainder is the skybox.
-
Alright, I just want to update the thread 'cause it has been a while. I've been (attempting) to get a working LZX implementation to use for Xbox maps. I gotta say, though, the book I'm reading is a bit dense. It's rather hard to absorb what is said in a meaningful way. Because of this, I feel I need to read a book on information theory before I continue with attempting to understand compression. I've read the same chapter like 10 times, and I just don't think the way this author explains things works well for me. Parts of the book make sense though, so I feel it's my comprehension at fault and not the author's diction. I'm sure there will be one piece that will fall into place, and I'll finally have a cascade of understanding.

Besides this, I've been mildly burnt out in general. Last month I had facial surgery, right before my vacation (which I'm currently on, greetings from San Fran). So I've been dedicating this time to honing my skills and gaining knowledge. A less active role; however, I'll label it "Research and Development" and stop beating myself up over it. I finished my books on software architecture and C++20. I may reapproach some things I have identified will come back and bite me later.

The EndianStream library may require an update. After thinking about the pros and cons of the current system, I think I may now understand why others take the approach they do with endian-sensitive data. From what I've seen, they make the variable aware of the endianness, not the stream. After reading the book on software architecture, this actually satisfies more quality restrictions than the prior approach. For example, having the stream aware of the endianness means the stream is doing two things: reading the data, and (potentially) swapping the endianness. Comparatively, if we make the variable aware of its own endianness, and the variable is responsible for how it is interpreted, endianness becomes an extension of the interpretation. This reduces coupling. Not only this, but if I ran into the issue of mixed-endian files I would have to do some sloppy stream-enforced override, which requires a lot more overhead than just declaring the variable as the correct type. This makes the stream simpler and the variables less dependent on prior knowledge of anticipated data. This also means I could invert the variables, if you will, meaning I don't need two different classes for files produced by different architectures; say, if an Xbox file was big endian but the PC one is little, or w/e. This means I'll need to make some slight adjustments to libSaber; however, I don't foresee this impacting the implementation. The fix is only really a refactor.

Also, libMccCompress has turned into libSaberCompress. The decompression algorithm is becoming a bit more modular with this cleanup, hopefully allowing for generalizing any Saber-compressed file with a high-level implementation of the lower-level systems. Beyond that, I think I'm pretty happy with this implementation. It's been a while since I've looked at the UI elements, and I've also learned a lot in the meanwhile. There may be changes to that once I try to mesh the two systems together. Things have been hectic on my end, though. I'm seeing Jacob Collier, Madeon, and Porter Robinson though :DDD So give it like a week and I should be able to start dedicating more time to the project.
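The "variable knows its own endianness" idea, sketched in Python with the struct module (the real library is C++ built on std::endian, so this is only the shape of the design, not the implementation):

```python
import struct

class EndianU32:
    """A 32-bit unsigned value that knows its own byte order; the stream stays dumb."""
    def __init__(self, byteorder):                   # "little" or "big"
        self._fmt = ("<" if byteorder == "little" else ">") + "I"
    def read(self, buf, offset=0):
        # Interpretation (including byte order) lives with the variable,
        # so the reader never needs to know what architecture wrote the file.
        return struct.unpack_from(self._fmt, buf, offset)[0]
    def write(self, value):
        return struct.pack(self._fmt, value)
```

Mixed-endian files then cost nothing extra: each field is simply declared with the order it was written in, instead of forcing a mode switch on the stream.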
-
Hello! I am unaware of the pck file. Is this for Halo MCC? I have worked with some other Saber-esque formats; however, it is worth noting that there are variations between versions. "Stream" makes me feel like it's an Xbox file. The issue is likely due to the stream expecting an offset/chunk, but it isn't in the right location/reading the right data, and the requested offsets are bigger than the file itself. I dunno. If you're able to reach out to me on discord (zatarita#3266) I'd be happy to take a look at it. Though I don't think this tool directly will be able to assist you with what you're trying to do. Edit: OH! Groundhog! That is unfortunately not a file compressed the same way. Sadly I don't have much knowledge of these files :c If you like, you can still reach out to me on discord, and I will try to take a look at the files in my free time. It seems the file is pretty straightforward; it looks like it contains wave files, though they seem to be compressed.
-
H2A-inflate is a quick python script I threw together to decompress Halo 2 Anniversary .pck files. CLI-style interface (time to embrace my inner kava):

h2a-inflate <file> <output>

Quick and simple; source might be available if I can get compressing back working in the future. This might also mean I can add H2A support into SuP at some point c:

For batch processing, I recommend copying the compressed files out of the directory and creating a text document named "inflate.bat"; copy this into it. Make sure h2a-inflate is in the same directory as the compressed files.

mkdir inflated
FOR %%i IN (*.pck) DO h2a-inflate "%%i" "inflated\%%~ni.pck"

h2a-inflate.rar
-
Yo. So I've been having a very eventful month. Had to get two teeth extracted, been on pain pills. Then had my battery die out in my car. Work's letting me get some OT in, and I'm trying to take as much of that as I can to catch up on some finances. I'm still working away though. The pain meds make it difficult to focus on in-depth things, so I shifted my focus a bit to some tpl reverse engineering. It's frustrating trying to remember what I was thinking. Like trying to solve one of those slide puzzles with a missing piece. Hate them things.

I've made massive strides on a Blender importer though!: There are just a few issues. 1) I cheated a bit in that screenshot. The object is translated using a 4x4 transform matrix. Me, being the astute student I was, categorized matrix multiplication as a "thing I will likely not need in the foreseeable future." Past Andy was wrong. 2) There are a few unknowns still that prevent "creation" of a template. In due time things will get easier.

I believe the reason the matrix is being used is because the values are unsigned shorts. If we look at an example of a pelican, we see each item is designed to take up as much space as possible within the signed short range. The matrix is supposed to resize and relocate these parts into the right area. This is the pelican with each "part" spread from the others. You can see the scale, location, rotation, and skews are partially off. The tpl format is also similar to the gbxmodel in the sense that it's not a model but a collection of models: different LODs, permutations, muzzle flare, even "sfx geometry." It also seems like there is a reflection of the node parent/child structure. I often see "frame _______" and a hierarchy linking parents to children. I'm curious if the Anniversary engine uses this to "copy" animations. I haven't seen much info in regards to animations. If the engine links the nodes to the same nodes between engines, it would explain how they could "translate" the info. This would also keep things synced between versions. This is speculative. There are some noticeable outliers, like Keyes walking up the stairs in a10. In classic he just ascends the stairs; however, in Anniversary you can see him actually throwing his weight up the stairs. It appears as though he is actually a physical entity with weight inside of a 3D space. In cinematic situations I do feel there is something that must contain "extra" data.

Also, for some extra fun, here is a bonus picture. Might be kinda hard to make out; however, these are BSP vertices. I have made some progress reverse engineering the .lg file as well, managing to extract some useful data. In fact, the format is extremely similar. There is one major difference: these faces are broken into streams of indices. I can't seem to find anywhere in the file that specifies the size of these streams, or any form of delimiter telling me when one stream ends and another begins. Essentially, if it was a sentence, I can't find the period at the end, and it's written in another language.
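For anyone else staring down the same matrix, the transform step itself is small. A hedged sketch (row-major layout and the exact tpl conventions are my assumptions, not confirmed against the format):

```python
def apply_transform(matrix, vertex):
    """Push a quantized (x, y, z) vertex through a 4x4 transform as a homogeneous vector."""
    x, y, z = vertex
    v = (x, y, z, 1.0)          # w = 1 lets the fourth column act as translation
    # Each output component is the dot product of one matrix row with the vector.
    return tuple(sum(matrix[r][c] * v[c] for c in range(4)) for r in range(3))
```

So a part stored at maximum quantized extent gets scaled down and relocated by its own matrix; for example, a uniform scale of 0.5 with a translation of (10, 0, 0) maps (2, 4, 6) to (11.0, 2.0, 3.0).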
-
I honestly feel like consistency is more important. I have zero trouble navigating the forum at all. They're broken down logically: each game has its own category, and each category has subcategories. They're the same for each game. I feel this setup is better than one subforum being packed with everything. If someone wants to just look through what tutorials are available, they can do that. No problem. Compare that to most modern platforms, like say Reddit, where it relies on the user to tag the information accordingly. You can't trust individuals to add the right tags, but you can easily move a post from one subforum to the correct location. We do have to keep in mind the sanity of management as well. I feel they've made a good system. Modhalo used to be chaotic, Halomaps is ugly, Reddit is too generalized. This is my favorite forum that has existed for Halo in my 16 years of modding.
-
Got a wicked tooth infection. Almost landed me in the ER. Terrifying; this is the first time someone's told me "it's not a matter of if it'll kill you, it's a matter of when." Mildly spooky to say the least ಠ_ಠ I have surgery today ;-; I have a feeling this will be a fun one.

That aside, I've been trying to do what I can when I can. My book on compression algorithms came in. I'm writing my own LZX implementation for Xbox. Turns out I got the wrong thing for modding my Xbox ʕ•ᴥ•ʔ Once I recover from surgery expenses, I think I may just purchase an already-modded Xbox 360 instead.

Started rewriting the compression algorithm to accommodate new formats. The current generalized algorithm is built to support zlib. I think I'm going to pass the compression implementation as a template parameter to keep the compression class as generalized as possible. I'm going to rewrite the decompression algorithm as well to utilize virtual functions a bit better. Coming back to it with fresh eyes has highlighted a few shortcomings I wish to tackle. Beyond this, the h1a library is done; it just needs polishing. I can extract and rebuild imeta, ipak, and s3dpak (fmeta has been antiquated and removed from the standard). The h2a library will be much easier, coming in with the one pck file. Beyond this, I have a feeling I might have to update the UI portion of the program. I have a feeling my programming style has changed so drastically at this point that some portions may be programmed "wrong."

Started working on a tool wrapper that enables queuing commands. This is a proof of concept for the "project file," which will allow you to build a map from a build file for both Saber and Halo. Also did some more research into template (the 3D model format used by Saber). I have a solid enough understanding, I feel, to extract data programmatically, but reinjection (or exporting) is not possible yet. One major reason for this is I need to read up on my math: the models use what's called a homogeneous vector to calculate scale and rotation based off of a 4x4 matrix. Forgive a fool, but I've forgotten matrix math, and thusly can't adjust for the size variations stored locally in the matrix in the file (or calculate them).
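The "compression implementation as a template parameter" idea maps to Python as simply passing the codec in (zlib here as a stand-in backend; the real class is a C++ template, so this only shows the shape of the design):

```python
import zlib

class Compressor:
    """Generalized container logic written once; the codec backend is swappable."""
    def __init__(self, codec=zlib):
        self._codec = codec      # any object exposing compress()/decompress()
    def pack(self, data):
        return self._codec.compress(data)
    def unpack(self, blob):
        return self._codec.decompress(blob)
```

The payoff is that adding LZX later means supplying one new codec object, not rewriting the container logic.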
-
Just a mild update. There has been a temporary delay for a few reasons. One: due to personal issues I'm migrating my setup to a new location. Not really "moving" per se, but I'm relocating my primary living location. Two: I'm waiting on two new things to push this library to the level I'm hoping for. One of them is a book on compression algorithms. I haven't found a very good LZX library that cooperates with my progressive decompression algorithm, so I decided I'm going to write my own implementation. Xbox uses LZX compression instead of zlib (if I'm not mistaken; though the book will definitely clear up which compression is used if not). I have also decided to mod my 360 so I can try to mod the Xbox version (as emulation doesn't seem feasible for 360 CEA yet). These are both taking some time. I have not abandoned the project by any means! Don't let the silence be spoopy. I have accumulated a bookshelf for this project and I've learned a lot. In fact, it's inspired me to go back to school! So maybe in the future things will be even better.
-
Yo. I'll keep it sweet and simple. Download (if it asks for a key: ysqtyy6YGoSdEyxqlfqrvRt8pOMsSOCyc5r2nWwIDfw). I feel having assets available makes creating custom maps a lot easier. Level designers don't always want to have to stop and make scenery. I see no shame in using perfectly good assets that already have the "Halo" vibe. With them recently giving us permission to use assets across MCC, I decided to backport some foliage. These tags are formatted to work with MCC; however, they will work in your Custom Edition maps as well. Keep in mind they have no collision; they are strictly for decoration. Just drag them into your tags folder. All of the scenery objects can be found in scenery/halo_3/foliage. I take ZERO credit for anything besides ripping the models/textures and creating the (basic) tags from them. I tried upscaling the textures, but I feel like high-quality things look strange in the Halo 1 aesthetic. I plan on creating more scenery backports. Hopefully this will allow people to design "Halo"-feeling levels a bit quicker.
-
Introduction
Hello! My name is Zatarita! Today we look at Texture files found inside of s3dpaks. On the Xbox release of CEA, the s3dpak is the primary file type used for storing game asset data. This means that for the Xbox version the s3dpaks had to contain the textures used by the assets in the scene. There are a few different files that work together to allow the engine to load and index textures when needed, for example TexturesMips64 and TexturesInfo. For more information on these, see the appendix. Depending on when you read this, there may be dedicated pages explaining them and how they work together. By the time you are done reading this tutorial, you should feel confident in your ability to navigate through a Texture file. I will also show you a few techniques to extract the textures manually from the file, should you ever find yourself with the need to do so.

One thing to keep in mind is that there are two different flavors of Texture. The ones in the pac_stream.s3dpak are different from the ones inside of the level s3dpaks. I'm assuming it was done this way so that the s3dpak could stream the texture pre-formatted right into memory without any other processing. These streamed textures use the same format number as the non-streamed variants (0x6) inside the s3dpaks, so there is no easy way to tell which texture type you have without inspecting the data. The streamed textures are not going to be covered here, as I sadly don't have enough time at the moment, and they require a more nuanced explanation. I will cover them in their own post to give them the attention they require.

DISCLAIMER
I will periodically review this post to add updated information as new discoveries are made; however, all information present here is correct to my current understanding. Some things may be proven incorrect.

Texture
The Texture file utilizes a common system used by a lot of different Saber files. Each 'structure' of data is assigned a sentinel value. Following this sentinel value is the end-of-block pointer. This end-of-block pointer points to the start of the next sentinel value. Everything in between is data that belongs to the structure. The "structure" can be anything, sometimes even just one property of the data. For example, we can see in the picture above the sentinel "0F 00" in blue, with the end-of-block value "0A 00 00 00" following it in pink. The value "0F 00" in this context denotes that the data is a signature. In green we can see the value of the data, "TCIP," which denotes that the following data should be interpreted as texture data. Following this we have "02 01" and another end-of-block value. With an understanding of the sentinel system, I can explain the data in relationship to that system. It will make more sense.

F0 00 - Signature ("TCIP")
02 01 - Dimensions: Width, Height, Depth, Faces (0x400, 0x400, 0x1, 0x1)
F2 00 - Format (0xC)
F9 00 - Mipmap Count (0x1)
FF 00 - Pixel Data ( ... )
01 00 - End Of Data ( )

Additional Reading

Format
Formatting follows a similar system to the other files like TexturesInfo, TexturesMips64, and the ipak. I have laid out in a table the supported textures for the Saber engine. Out of all of the options, only a handful are used by MCC, and they are highlighted in green. This table can be used to reference which format number is which DDS format. As seen above in pink, the format of that texture is 0xC, which corresponds to the value OXT1 on the table. For more information on the table, see TexturesInfo in the appendix.

Extraction
Now we get to the fun part. We're going to use the information we extracted from the data above to extract a DDS file from the Texture. For this we're going to need a few prerequisites. RawTex will help us generate a DDS header for our data. Also, in order for RawTex to show us a preview, it requires texconv to be installed in the same directory as RawTex. Once we've done this, we can open RawTex and drag a texture into the window. Briefly, I will go over the UI. On the top left there is a textbox labeled 0xOFFSET, which is how we tell RawTex where to find the pixel data. On the column on the left and the row on the top, there are height and width selection options, plus text boxes where you can manually type them in. Also on the column on the left, we have a few options for format. Following this is the "Try and Convert" button. There are a few other things; however, for our purposes this should be enough. The best way to figure it out is to play around with it. It's one of my go-to "Reverse Engineer's Toolkit" items.

Moving on, we're going to extract a texture using the above information. I grabbed a random texture and dragged it into a newly opened RawTex window. When we do that, we see this beautiful creation. We have to update the information to the correct properties using the data from the hex. The important data is the height, width, and format. In my case the data is 256, 256, OXT1. Once the correct values are inputted and you click "Try and Convert," we are shown the texture, and a .png copy of the texture is saved to the same directory as the original file.

Conclusion
The Xbox version of the game utilizes s3dpaks as its primary format for storing data. Because of this, they need to be able to support textures which can be utilized by assets in the scene. They manage this with the "Texture" format. There are a few variants of the Texture format, and in this tutorial I introduced you to the non-streamed variant. In this brief spec I attempted to present enough information about the format to be able to navigate its contents with some level of confidence. Thank you again for stopping in.

PS
Currently I'm unsure if there is a tool out there that can extract from Xbox s3dpaks; however, in the s3dpak specification I go over how you can manually extract data if needed. It's not too complicated.

Appendix
s3dpak
TexturesInfo
ipak / imeta
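Assuming the end-of-block values are absolute offsets (which the worked example supports: 2-byte sentinel + 4-byte pointer + 4-byte "TCIP" lands the next sentinel at 0x0A), walking the structure looks roughly like this:

```python
import struct

def read_blocks(data):
    """Split a sentinel/end-of-block stream into (sentinel, payload) pairs."""
    blocks, pos = [], 0
    while pos + 6 <= len(data):
        # 2-byte sentinel, then a 4-byte pointer to the start of the next sentinel.
        sentinel, next_off = struct.unpack_from("<HI", data, pos)
        blocks.append((sentinel, data[pos + 6:next_off]))
        if next_off <= pos:              # offsets must move forward; bail on garbage
            break
        pos = next_off
    return blocks
```

This is a sketch against synthetic data, not a full Texture reader; a real one would then dispatch on each sentinel value (signature, dimensions, format, and so on) as listed above.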
-
I have done the thing. It is messy and clunky; however, everything works. I can load and save all the h1a file types using this method; however, I have to go through and fill out all the placeholder functions I made. Nothing too interesting, but progress. A hex compare shows files created from scratch are identical to the ones supplied. My brain hurts though; I think I'ma take a break for the night.
-
Introduction Hello! My name is Zatarita. I'm sure you've seen me around. I may have made one, or two, posts in this subforum. I had some free time today, so I felt I would delve a bit into some of the sub-files inside of s3dpaks. For those who are unaware. S3dpaks are archive style files that contain game files inside of them. The TexturesInfo file ( if you want to call it a file ( its definitely easier to consider it a file... ) ) is one of these sub-files. For more information on S3dpaks see my write up in the appendix. DISCLAIMER I will periodically review this post to add updated information as new discoveries are made; however, all information present here is correct to my current understanding. Some things may be proven incorrect. All that aside, I'm fairly confident in my findings; I have developed working mod tools using this understanding. The TexturesInfo data is fairly straight forward, and I'm certain the information here is complete. TexturesInfo Starting off it might be important to understand what the TexturesInfo data actually does. It's functionality it drastically different for the pc version to the xbox version. In fact, the PC version doesn't even use it at all as far as I can tell. You can just delete it from the s3dpak and move on with your day no problem; however, for the Xbox it's a different story. One of the major differences between PC s3dpaks and Xbox s3dpaks is that the Xbox variant contains the textures needed for the level and all the objects contained with-in. Meaning that each level has it's own copy of each texture. This is rather inefficient which is why, I'm sure, they switched to the Ipak/Imeta system used by PC. Fig 1. TexturesInfo Entry list (Left) And the corresponding textures inside the s3dpak ( right ) The TexturesInfo is the glossary containing the meta data about the textures contained within the s3dpak. These textures are also files inside the s3dpak. 
The TexturesInfo files the tells the s3dpak the name, format, dimensions, face count, and mip map count of each of its entries. This again is rather redundant as the texture entry inside of the s3dpak also contains all this information except for the name of the texture. You can see an example s3dpak below with the TexturesInfo data exposed. Notice the file count at the top is significantly different between versions as well. This reflects the presence of textures in the s3dpak ( as well as other file types not present in the pc version, see appendix ) The PC version just contains default values in everything, but still has the texture name present. Fig 2. Xbox (Left) Compared to PC (Right) TexturesInfo. Format The texture format is a DDS format type. In fact the textures are all just dds textures with the header replaced with a saber specific one. There are a handful of supported formats with the saber engine. In fact the list is quite long. Most of them are completely unused by the halo engine. Some of them may be wrong; however, for the means of MCC; all the ones supported by the engine are represented accurately. Also do note that H2A is in the table as well. This is to reuse the table at a later time. You can ignore that column all-together. One special note about the table is OXT1/AXT1. These are both DXT1 textures; however, it is believed that the "O" and the "A" stands for Opaque and Alpha respectively. We're not quite sure why they split them up. I believe it's an optimization thing rather than a system limitation; because, regardless of system I still see the split between the two. Something to keep in mind though, as they are both technically just DXT1 blocked compressed textures as far as DDS is concerned. The Raw Data The raw data for the textures info is pretty straight forward. One interesting note about this compared to other Saber formats is there is no child count present. 
Since the string length can vary, there is actually no way to access the children randomly; the file must be processed as a stream of data. The TexturesInfo follows the string-length-followed-by-string-text pattern seen in other Saber formats. Following the string name is the signature "TCIP", which is "pict" backwards. This is seen again in the H2A pict format (see appendix). After the signature, in order (all 32-bit unsigned integers), are width (purple), height (cyan), depth (green), face count (yellow), mipmap count (brown), and format (orange; see the table above, first item = 0).

Fig 4. TexturesInfo (left) mapped to colors (right)

Conclusion

Inside s3dpaks are many different file types, and the TexturesInfo is one of them. It contains high-level metadata about the textures that the s3dpak houses. It is important to note that the TexturesInfo file is Xbox-only; on PC there is only placeholder data present. Many formats are available for the Saber engine, although only a handful are supported by MCC. When modding Xbox files it may be necessary to edit this file manually, along with importing the textures that correspond to the entry you're importing.

In future updates I will add a reference to SceneData, and to the s3dpak Texture and TexturesMip64 entries, once the pages are made for those formats. Thank you all so much for reading c: and I hope you learned something. If anything seems unclear, do feel free to drop a comment and I will gladly reply when possible.

Appendix

S3dpaks:

H2A Pict format
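For anyone who wants to poke at the data themselves, the entry layout described above can be sketched as a small stream parser. The field order comes straight from this post; treating the string-length prefix and all fields as little-endian unsigned 32-bit integers is my assumption for the PC-side byte order.

```python
import struct

def parse_textures_info(data: bytes):
    """Parse a TexturesInfo blob into a list of entry dicts.

    Sketch of my current understanding: the file is a stream of
    [u32 name length][name bytes]["TCIP"][six u32 fields], with no
    child count, so it must be walked sequentially.
    Little-endian field order is an assumption.
    """
    entries = []
    pos = 0
    while pos + 4 <= len(data):
        (name_len,) = struct.unpack_from("<I", data, pos)
        pos += 4
        name = data[pos:pos + name_len].decode("ascii")
        pos += name_len
        signature = data[pos:pos + 4]
        if signature != b"TCIP":  # "pict" backwards
            raise ValueError("bad entry signature: %r" % signature)
        pos += 4
        # width, height, depth, face count, mipmap count, format index
        width, height, depth, faces, mipmaps, fmt = struct.unpack_from("<6I", data, pos)
        pos += 24
        entries.append({
            "name": name, "width": width, "height": height, "depth": depth,
            "faces": faces, "mipmaps": mipmaps, "format": fmt,
        })
    return entries
```

For example, a hand-built single-entry blob (`struct.pack("<I", 4) + b"test" + b"TCIP" + struct.pack("<6I", 256, 128, 1, 6, 5, 0)`) parses into one entry named "test" with a width of 256 and six faces.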
-
OKAY

So after screwing up my vtables and spending an entire day trying to pinpoint the issue, I have finally taken the time to organize a few things.

CMake is now pretty, and does its job. Targets are no longer compiled independently, which used to cause a cascade of rebuilding whenever a low-level library got edited -shudder- It also made debugging much easier, as Visual Studio can debug trace. Everything works as the CMake gods intended.

S3dpaks were implemented using the new generic file system. A few changes were made:
- Format is built into the generic file as a template parameter. This is abstracted from the end user through the file interface anyway.
- A new virtual function has been defined, "getFileExtention", which must be defined for each file type and takes a format. This will be helpful for differentiating between file types:
  - Files with extensions use their existing extensions
  - Other files follow a halo-esque file format, barring the standard three-character abbreviations (e.g. ".wavebanks_strm_file", ".wavebanks_mem", ".cacheblock")
- File format types are now stored inside a Format namespace instead of the file interface.

These changes were deemed necessary due to the way I organized my files. This is not an issue by any means; in fact, I think I like this implementation better, since it means I can write more generalized code and repeat myself less. I have successfully managed to extract data from all s3dpaks (PC and Xbox) using this method. Implementing ipaks and imetas should be as simple as defining the format and the "parse/serialize" header function, and the same code should work. I feel I'm finally leveraging c++ a bit better c:<

Also made some minor tweaks to the decompression object classes. I moved the static wrapper functions to the global namespace in the main include header; each one just returns a decompression object specialized for that file's offset types, chunk sizes, and format, hiding the clunky template syntax from the higher level.
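For anyone curious why chunked decompression makes random access so much faster: you only have to inflate the chunk(s) that cover the byte range you want, instead of the whole archive. A rough sketch of the idea, not the actual libSaber code; the chunk size and offset-table shape here are illustrative, not the real s3dpak layout:

```python
import zlib

CHUNK_SIZE = 0x8000  # illustrative uncompressed chunk size, not the real value

def read_range(chunk_offsets, compressed, start, length):
    """Read an uncompressed byte range by inflating only the chunks it spans.

    chunk_offsets[i] is the offset of compressed chunk i within the blob;
    a sentinel end offset is appended, so chunk i occupies
    compressed[chunk_offsets[i]:chunk_offsets[i + 1]].
    """
    out = bytearray()
    first = start // CHUNK_SIZE
    last = (start + length - 1) // CHUNK_SIZE
    for i in range(first, last + 1):
        raw = compressed[chunk_offsets[i]:chunk_offsets[i + 1]]
        out += zlib.decompress(raw)
    skip = start - first * CHUNK_SIZE  # trim the partial first chunk
    return bytes(out[skip:skip + length])
```

With this scheme, pulling one file out of a compressed archive only touches the handful of chunks it lives in, which is why the per-file extraction time drops so dramatically.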
Made it so that when the decompression object attempts to decompress a chunk and fails, it sets the uncompressed flag and parses the data as raw uncompressed data. I felt this made the most sense: it's possible to try to open any file as an s3dpak anyway, so I would need the parsing code to determine whether the data is valid regardless. Since the file needs to be validated either way, I feel this is a safe assumption to make.

Currently the zlib decompression is not threaded, so there is no discernible time difference when extracting an entire file; however, random access to individual files is now significantly faster. The difference is insane: I can extract a 68 MB file from a compressed s3dpak in 1.6 seconds. Of course, if I'm extracting all of the s3dpak, I still have to decompress the entire file and write all the data to disk. With SuP, a10.s3dpak takes me ~8 seconds, which is only slightly improved with libSaber. This will change once I thread decompression.

Also, a small side note: it's versatile enough to extract from Inversion s3dpaks (another game released by Saber around the same time) with minor tweaks. I might generalize the decompression object a bit further to allow for variants in both compression format and file format, which will be needed to decompress Xbox s3dpaks anyway.
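The inflate-or-fall-back-to-raw behavior is simple enough to sketch in a few lines. This is just the idea in Python, not the libSaber implementation; zlib rejects data without a valid stream header, which is what makes the fallback safe:

```python
import zlib

def decompress_chunk(chunk: bytes):
    """Inflate a chunk; if zlib rejects it, assume it was stored uncompressed.

    Returns (data, was_compressed), mirroring the "set the uncompressed
    flag and parse as raw data" behavior described above.
    """
    try:
        return zlib.decompress(chunk), True
    except zlib.error:
        return chunk, False
```

Either way the caller gets bytes back, and the higher-level format parser is what ultimately decides whether the result is a valid s3dpak.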